I have a sh*tload of reports on my report server. Most of them have a Cache Refresh Plan using a shared schedule. Is it programmatically possible to set a Cache Refresh Plan on a report?
Enabling caching, setting a cache expiration using a shared schedule, and running snapshots according to a shared schedule all work fine using the SetExecutionOptions and SetCacheOptions methods.
Setting a Cache Refresh Plan for a report, however, does NOT work fine. Suggestions?
Edit: I would like to do the same for all datasets: set them to refresh on a shared schedule.
Below is the code I am using (PowerShell v3):
$reportServerURI = "http://localhost/Reportserver"
$ReportPathWildCard = "/SOME/FOLDER/ON/SERVER";
$NameSharedSchedule="NAMEOFSCHEDULE";
# init WS proxy
$reportServerURI2010 = "$reportServerURI/ReportService2010.asmx?WSDL"
$RS = New-WebServiceProxy -Uri $reportServerURI2010 -UseDefaultCredential
$proxyNamespace = $RS.GetType().Namespace
#Get Schedule Reference
$NeverExpireSchedule= $RS.ListSchedules([System.Management.Automation.Language.NullString]::Value) | where {$_.Name -eq "$NameSharedSchedule"}
$NeverExpireScheduleID = $NeverExpireSchedule.scheduleid;
$NeverExpireDescription = $NeverExpireSchedule.Description;
$NeverExpireDefinition = $NeverExpireSchedule.Definition;
Write-Host "Found Shared Schedule: '$NameSharedSchedule' with id $NeverExpireScheduleID and definition $NeverExpireDescription";
$NeverExpireScheduleRef =New-Object("$proxyNamespace.ScheduleReference");
$NeverExpireScheduleRef.ScheduleID=$NeverExpireScheduleID;
#get all needed items
$items = $RS.ListChildren($ReportPathWildCard, $true) | Where-Object { $_.TypeName -eq "Report" }
#process all items
foreach ($item in $items) {
$xpath = $item.path
$xtype = $item.TypeName
Write-Host "Processing $xtype $xpath"
##SET Refresh
$r= $RS.SetExecutionOptions( $xpath,"Snapshot",$NeverExpireDefinition)
}
Actually found it. Somehow I and several colleagues of mine overlooked it:
the CreateCacheRefreshPlan method is the solution...
It does not look nice (but hey, I am not a developer) and half of it is shamelessly ripped off, but it DOES do the trick... :)
Thanks and kudos to all the people who posted the tidbits I needed...
$reportServerURI = "http://localhost/Reportserver"
$ReportPathWildCard = "/";
$NameSharedSchedule="NAME OF SCHEDULE";
# init WS proxy
$reportServerURI2010 = "$reportServerURI/ReportService2010.asmx?WSDL"
$RS = New-WebServiceProxy -Uri $reportServerURI2010 -UseDefaultCredential
$proxyNamespace = $RS.GetType().Namespace
# Get Schedule Reference
$NeverExpireSchedule= $RS.ListSchedules([System.Management.Automation.Language.NullString]::Value) | where {$_.Name -eq "$NameSharedSchedule"}
$NeverExpireScheduleID = $NeverExpireSchedule.scheduleid;
$NeverExpireDescription = $NeverExpireSchedule.Description;
$NeverExpireDefinition = $NeverExpireSchedule.Definition;
#Write-Host "Found Shared Schedule: '$NameSharedSchedule' with id $NeverExpireScheduleID and definition $NeverExpireDescription";
$NeverExpireScheduleRef =New-Object("$proxyNamespace.ScheduleReference");
$NeverExpireScheduleRef.ScheduleID=$NeverExpireScheduleID;
# Prepare a few things
#delivery Extension
#$setting = "Report Server Email"
$matchdata = $NeverExpireScheduleID
$description = "Automatisch ingesteld op " + $NameSharedSchedule
$eventtype = "RefreshCache"
$parameters = $null # the refresh plan needs no report parameters here
#get all needed items
$items = $RS.ListChildren($ReportPathWildCard, $true) | Where-Object { $_.TypeName -eq "DataSet" }
#process all items
foreach ($item in $items) {
$xpath = $item.path
$xtype = $item.TypeName
Write-Host "Processing $xtype $xpath"
$report = $xpath
##SET Cache: enable caching with an expiration based on the shared schedule
$expiration = New-Object ("$proxyNamespace.ScheduleExpiration")
$expiration.Item = $NeverExpireScheduleRef
$r = $RS.SetCacheOptions($xpath, $true, $expiration)
$r= $RS.CreateCacheRefreshPlan( $report,
$description,
$eventtype,
$matchdata,
$parameters
)
}
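To verify the result afterwards, the same ReportService2010 endpoint exposes ListCacheRefreshPlans. A quick hedged check (the item path below is hypothetical; substitute one of your datasets):
$RS.ListCacheRefreshPlans("/SOME/FOLDER/ON/SERVER/SomeDataset") | Select-Object CacheRefreshPlanID, Description, State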
Related
I have some experience in PowerShell, but I don't have experience using it to automate SQL Server Reporting Services. Basically I want to assign a user a role on a particular report object in SSRS. I have found the following code in
SSRS: How to assign multiple users a role to a report quickly?
It seems a good start for creating my script.
function Add-SSRSUserRole
(
[string]$reportServerUrl,[string]$userGroup,[string]$requiredRole,[string]$folder,[bool]$inheritFromParent
)
{
#Ensure we stop on errors
$ErrorActionPreference = "Stop";
#Connect to the SSRS webservice
$ssrs = New-WebServiceProxy -Uri "$reportServerUrl" -UseDefaultCredential;
$namespace = $ssrs.GetType().Namespace;
$changesMade = $false;
#Look for a matching policy
$policies = $ssrs.GetPolicies($folder, [ref]$inheritFromParent);
if ($policies.GroupUserName -contains $userGroup)
{
Write-Host "User/Group already exists. Using existing policy.";
$policy = $policies | where {$_.GroupUserName -eq $userGroup} | Select -First 1 ;
}
else
{
#A policy for the User/Group needs to be created
Write-Host "User/Group was not found. Creating new policy.";
$policy = New-Object -TypeName ($namespace + '.Policy');
$policy.GroupUserName = $userGroup;
$policy.Roles = @();
$policies += $policy;
$changesMade = $true;
}
#Now we have the policy, look for a matching role
$roles = $policy.Roles;
if (($roles.Name -contains $requiredRole) -eq $false)
{
#A role for the policy needs to be added
Write-Host "Policy doesn't contain specified role. Adding.";
$role = New-Object -TypeName ($namespace + '.Role');
$role.Name = $requiredRole;
$policy.Roles += $role;
$changesMade = $true;
}
else
{
Write-Host "Policy already contains specified role. No changes required.";
}
#If changes were made...
if ($changesMade)
{
#...save them to SSRS
Write-Host "Saving changes to SSRS.";
$ssrs.SetPolicies($folder, $policies);
}
Write-Host "Complete.";
}
[string]$url = "http://localhost/ReportServer/ReportService2006.asmx?wsdl";
Add-SSRSUserRole $url "Everyone" "Browser" "/MyReportFolder" $true;
Add-SSRSUserRole $url "Domain\User" "Browser" "/MyReportFolder" $true;
Now I have two elementary questions:
Do I need any SSRS modules installed in my PowerShell in order to run the above script?
The sample code above assigns a permission to a folder. What changes are required if I want to assign permissions to a report object directly instead?
Thanks for your response in advance,
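(For what it's worth: on question 1, New-WebServiceProxy is a built-in Windows PowerShell cmdlet, so no extra SSRS module should be needed. On question 2, GetPolicies/SetPolicies accept any catalog item path, so passing the report's full path where the folder path went should be enough. A minimal sketch, with a hypothetical report path:)
Add-SSRSUserRole $url "Domain\User" "Browser" "/MyReportFolder/MyReport" $true;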
Essentially what I'm after is the result of the REST API call Gateways - Get Datasource Users, but retaining the ID (in this example $Line.id from my imported CSV file).
The end result should be a CSV with the following fields -
ID, emailAddress, datasourceAccessRight, displayName, identifier, principalType
I'm new to PowerShell and surprised I got this far but can't figure out this final bit.
Cheers
$webclient=New-Object System.Net.WebClient
$webclient.Proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
$Dir = "C:\pbi_pro_user_logs\"
Login-PowerBI
$GateWayFile = Import-CSV -Path "C:\pbi_pro_user_logs\Gateway_Detail.csv"
$Output = @()
foreach ($Line in $GateWayFile){
$Item = $Line.id
$url = "https://api.powerbi.com/v1.0/myorg/gateways/HIDDEN/datasources/"+$Item+"/users"
$Output += (Invoke-PowerBIRestMethod -Url $url -Method Get | ConvertFrom-Json)
}
$Result = $Output.value
$Result | Export-Csv $Dir"GateWay_users.csv" -NoTypeInformation
Try this, using a calculated property from Select-Object:
$GateWayFile = Import-CSV -Path "C:\pbi_pro_user_logs\Gateway_Detail.csv"
$Output = Foreach ($Line in $GateWayFile){
$url = "https://api.powerbi.com/v1.0/myorg/gateways/HIDDEN/datasources/"+$Line.id+"/users"
$Item = (Invoke-PowerBIRestMethod -Url $url -Method Get | ConvertFrom-Json)
# output all properties of the item, plus the ID:
$ItemWithID = $Item | Select *,@{l='Id';e={$line.id}}
Write-Output $ItemWithID
}
# This depends on how you want your csv structured, but for example:
$Result = $Output | Select Id,Value
Or, if Value is a whole object that ID should be assigned inside of, then change the selection lines:
$ItemWithID = $Item.Value | Select *,@{l='Id';e={$line.id}}
$Result = $Output
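If the goal is the exact CSV columns listed in the question, the nested value entries can be flattened in the same pass; a sketch, assuming the API really returns properties with those names:
$Result = foreach ($Line in $GateWayFile) {
    $url = "https://api.powerbi.com/v1.0/myorg/gateways/HIDDEN/datasources/" + $Line.id + "/users"
    # one output object per user, carrying the datasource ID from the CSV
    (Invoke-PowerBIRestMethod -Url $url -Method Get | ConvertFrom-Json).value |
        Select-Object @{l='ID';e={$Line.id}}, emailAddress, datasourceAccessRight, displayName, identifier, principalType
}
$Result | Export-Csv "C:\pbi_pro_user_logs\GateWay_users.csv" -NoTypeInformation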
I'm doing a BITS transfer of daily imagery from a web server and I keep getting random drops during the transfer.
As it's cycling through the downloads I get the occasional "The connection was closed prematurely" or "An error occurred in the secure channel support". There are about 180 images in each folder and this happens for maybe 5-10% of them. I need to retry the download for those that didn't complete.
My code to do so follows; my imperfect work-around is to run the loop twice, but I'm hoping to find a better solution.
# Set the URL where the images are located
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
# Set the local path where the images will be stored
$path = 'C:\images\Wind_Waves\latest\'
# Create a list of all assets returned from $url
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links | Where-Object{ $_.tagName -eq 'A' -and $_.href.ToLower().EndsWith("jpg") }
# Create a list of all href items from the table & call it $images
$images = $table.href
# Enumerate all of the images - for troubleshooting purposes - can be removed
$images
# Check to make sure there are images available for download - arbitrarily picked more than 2 $images
if($images.count -gt 2){
# Delete all of the files in the "latest" folder
Remove-Item ($path + "*.*") -Force
# For loop to check to see if we already have the image and, if not, download it
ForEach ($image in $images)
{
if(![System.IO.File]::Exists($path + $image)){
Write-Output "Downloading: " $image
Start-BitsTransfer -Source ($url + $image) -Destination $path -TransferType Download -RetryInterval 60
Start-Sleep 2
}
}
Get-BitsTransfer | Where-Object {$_.JobState -eq "Transferred"} | Complete-BitsTransfer
} else {
Write-Output "No images to download"}
I don't see any error handling in your code to resume/retry/restart on fail.
Meaning, why is there no try/catch in the loop or around the Get?
If the Get is meant to run once per download job in the loop, why is it outside the loop?
Download is the default for TransferType, so there is no need to specify it; it will normally generate an error if you do.
So, something like this. I did test this, but never got a fail; then again, I have a very high-speed internet connection. If you are doing this inside an enterprise, edge devices (filters, proxies, etc.) could also be slowing things down, potentially forcing timeouts.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest\'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object{
$_.tagName -eq 'A' -and
$_.href.ToLower().EndsWith('jpg')
}
<#
# Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)
<#
Check to make sure there are images available for download - arbitrarily
picked more than 2 $images
#>
if($images.count -gt 2)
{
Remove-Item ($path + '*.*') -Force
ForEach ($image in $images)
{
Try
{
Write-Verbose -Message "Downloading: $image" -Verbose
if(![System.IO.File]::Exists($path + $image))
{
$StartBitsTransferSplat = @{
Source = ($url + $image)
Destination = $path
RetryInterval = 60
}
Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
Start-Sleep 2
}
Get-BitsTransfer |
Where-Object {$PSItem.JobState -eq 'Transferred'} |
Complete-BitsTransfer
}
Catch
{
$PSItem.Exception.Message
Write-Warning -Message "Download of $image not complete or failed. Attempting a resume/retry" -Verbose
Get-BitsTransfer -Name $image | Resume-BitsTransfer
}
}
}
else
{
Write-Warning -Message 'No images to download'
}
See the help files for Resume-BitsTransfer (Module: BitsTransfer), which resumes a BITS transfer job:
# Example 1: Resume all BITS transfer jobs owned by the current user
Get-BitsTransfer | Resume-BitsTransfer
# Example 2: Resume a new BITS transfer job that was initially suspended
$Bits = Start-BitsTransfer -DisplayName "MyJob" -Suspended
Add-BitsTransfer -BitsJob $Bits -ClientFileName C:\myFile -ServerFileName http://www.SomeSiteName.com/file1
Resume-BitsTransfer -BitsJob $Bits -Asynchronous
# Example 3: Resume the BITS transfer by the specified display name
Get-BitsTransfer -Name "TestJob01" | Resume-BitsTransfer
Here's a somewhat modified version of the above code. It appears the BITS transfer job object goes away when the error occurs, so there is no use trying to find and resume that job. Instead, I wrapped the entire Try/Catch block in a while loop that exits once the file is downloaded.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest\'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
$MaxRetries = 3 # Initialize the maximum number of retry attempts.
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object {
$_.tagName -eq 'A' -and
$_.href.ToLower().EndsWith('jpg')
}
<#
# Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)
<#
Check to make sure there are images available for download - arbitrarily
picked more than 2 $images
#>
if ($images.count -gt 2) {
Remove-Item ($path + '*.*') -Force
ForEach ($image in $images) {
# Due to occasional failures to transfer, wrap the BITS transfer in a while loop
# re-initialize the exit counter for each new image
$retryCount = 0
while ($retryCount -le $MaxRetries){
Try {
Write-Verbose -Message "Downloading: $image" -Verbose
if (![System.IO.File]::Exists($path + $image)) {
$StartBitsTransferSplat = @{
Source = ($url + $image)
Destination = $path
RetryInterval = 60
}
Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
Start-Sleep 2
}
# To get here, the transfer must have finished, so set the counter
# greater than the max value to exit the loop
$retryCount = $MaxRetries + 1
} # End Try block
Catch {
$PSItem.Exception.Message
$retryCount += 1
Write-Warning -Message "Download of $image not complete or failed. Attempting retry #: $retryCount" -Verbose
} # End Catch Block
} # End While loop for retries
} # End of loop over images
} # End of test for new images
else {
Write-Warning -Message 'No images to download'
} # End of result for no new images
Here is a combination of the code that postanote provided and a Do-While loop to retry the download up to 5x if an error is thrown.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest\'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object{
$_.tagName -eq 'A' -and
$_.href.ToLower().EndsWith('jpg')
}
<#
# Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)
<# Check to make sure there are images available for download - arbitrarily
picked more than 2 $images #>
if($images.count -gt 2)
{
Remove-Item ($path + '*.*') -Force
ForEach ($image in $images)
{
# Create a Do-While loop to retry downloads up to 5 times if they fail
$Stoploop = $false
[int]$Retrycount = "0"
do{
Try
{
Write-Verbose -Message "Downloading: $image" -Verbose
if(![System.IO.File]::Exists($path + $image))
{
$StartBitsTransferSplat = @{
Source = ($url + $image)
Destination = $path
RetryInterval = 60
}
Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
Start-Sleep 10
}
# Reaching this line means the file already existed or the transfer succeeded,
# so stop retrying this image
$Stoploop = $true
Get-BitsTransfer |
Where-Object {$PSItem.JobState -eq 'Transferred'} |
Complete-BitsTransfer
}
Catch
{
if ($Retrycount -gt 5){
$PSItem.Exception.Message
Write-Warning -Message "Download of $image not complete or failed." -Verbose
$Stoploop = $true
}
else {
Write-Host "Could not download the image, retrying..."
Start-Sleep 10
$Retrycount = $Retrycount + 1
}
}
}
While ($Stoploop -eq $false)
}
}
else
{
Write-Warning -Message 'No images to download'
}
My company wants me to grab data from their internal website, organize it, and send it to a database. The data is displayed on tables that you navigate to within the site. I'm wanting to pull the fields into a file or memory for further processing.
So far, I can log into the site in PowerShell by getting the submit login button's ID and passing my username/password. I'm able to use the Navigate method to change to the appropriate page within the site. However, running an Invoke-WebRequest on the new page, as well as using Net.WebClient on the new page, returns the information found on the original site's login screen (I know, because nothing from the table makes it into the returned values, regardless of the commands I use). The commented code is what I've tried previously.
Here is the code, minus the values of my ID/password/site link:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$ie = New-Object -ComObject 'internetExplorer.Application'
$ie.Visible= $true # Make it visible
$username="myid"
$password="mypw"
$ie.Navigate("https://webpage.com/index.jsp")
While ($ie.Busy -eq $true) {Start-Sleep -Seconds 3;}
$usernamefield = $ie.document.getElementByID('login')
$usernamefield.value = "$username"
$passwordfield = $ie.document.getElementByID('password')
$passwordfield.value = "$password"
$Link = $ie.document.getElementByID('SubmitLogin')
$Link.click()
$url = "https://webpage.com/home.pa#%5BT1%2CM181%5D"
$ie.Navigate($url)
While ($ie.Busy -eq $true) {Start-Sleep -Seconds 3;}
$doc = $ie.document
$web = New-Object Net.WebClient
$web.DownloadString($url)
#$r = Invoke-WebRequest $url
#$r.Forms.fields | get-member
#$InnerText = $r.AllElements |
# Where-Object {$_.tagName -ne "TD" -and $_.innerText -ne $null} |
# Select -ExpandProperty innerText
#write-host $InnerText
#$r.AllElements|Where-Object {$_.InnerHtml -like "*=*"}
#$doc = $ie.Document
#$doc.getElementByID("ext-element-7") | % {
# if ($_.id -ne $null){
# write-host $_.id
# }
#}
$ie.Quit()
I obviously don't have your page and can't ensure that the body of the POST from signing in contains the fields login and password so that will require some trial & error from you. As a mini-example, if you open up your console dev tools network tab and filter by POST, you can observe how your login page signs you in. When I open reddit to sign in, it sends a POST to https://www.reddit.com/login with a body containing a username and password key/value (both plaintext). This action sets up my browser session to persist my login.
Here's a code example that uses the HtmlAgilityPack library to interact with the resulting page as if it were XML.
Enabling TLS1.2:
[System.Net.ServicePointManager]::SecurityProtocol =
[System.Net.ServicePointManager]::SecurityProtocol -bor [System.Net.SecurityProtocolType]::Tls12
Setting up your web session:
$iwrParams = @{
'Uri' = 'https://webpage.com/index.jsp'
'Method' = 'POST'
'Body' = @{
'login' = $username
'password' = $password
}
'SessionVariable' = 'session'
# avoids cases where IE has not been opened
'UseBasicParsing' = $true
}
# don't care about response - only here to initialize the session
$null = Invoke-WebRequest @iwrParams
Getting the protected page content:
$iwrParams = @{
'Uri' = 'https://webpage.com/home.pa#%5BT1%2CM181%5D'
'WebSession' = $session
'UseBasicParsing' = $true
}
$output = (Invoke-WebRequest @iwrParams).Content
Downloading/adding HtmlAgility:
if (-not (Test-Path -Path "$PSScriptRoot\HtmlAgilityPack.dll" -PathType Leaf))
{
Invoke-WebRequest -Uri https://www.nuget.org/api/v2/package/HtmlAgilityPack -OutFile "$PSScriptRoot\html.zip"
Expand-Archive -Path "$PSScriptRoot\html.zip" -DestinationPath "$PSScriptRoot\html" -Force
Copy-Item -Path "$PSScriptRoot\html\lib\netstandard2.0\HtmlAgilityPack.dll" -Destination "$PSScriptRoot\"
Remove-Item -Path "$PSScriptRoot\html", "$PSScriptRoot\html.zip" -Recurse -Force
}
Add-Type -Path "$PSScriptRoot\HtmlAgilityPack.dll"
$html = [HtmlAgilityPack.HtmlDocument]::new()
Loading/parsing your page content:
$html.LoadHtml($output)
# do stuff with output.
$html.DocumentNode.SelectNodes('//*/text()').Text.Where{$PSItem -like '*=*'}
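If the protected page renders a real HTML table, the same DocumentNode can be queried for rows and cells; a sketch under that assumption (the XPath expects a plain <table>, which your page may not have):
$rows = $html.DocumentNode.SelectNodes('//table//tr')
foreach ($row in $rows) {
    # join each row's cell text for a quick look at the data
    ($row.SelectNodes('td') | ForEach-Object { $PSItem.InnerText.Trim() }) -join ' | '
}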
Footnote
I made the assumption in the code that you were executing from a script, where $PSScriptRoot will be populated. If it's being run interactively, you can use the $pwd automatic variable instead (a carry-over from *nix, print working directory). This code requires PSv5+.
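For example, when pasting the snippets into an interactive console instead of running a script:
Add-Type -Path "$pwd\HtmlAgilityPack.dll"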
After some serious effort, I managed to get the pages to work correctly. It turns out I wasn't waiting for everything to load, but once I had that, I eventually found the correct tag/name to make everything work.
Assuming the code in the original post is correct up to $ie.Navigate($url):
$ie.Navigate($url)
While ($ie.Busy -eq $true) {Start-Sleep -Seconds 3;}
$r = Invoke-WebRequest $url
$doc = $ie.document
$j = ($doc.getElementsByTagName("body") | Where {$_.className -eq 'thefullclassname found in the quotes of <body class="" of the area you wanted'}).innerText
write-host $j
This gave me the output of a very annoyingly built table that isn't a real "table", and has the first row/column on its own, so formatting the output into an easy-to-use version will be the new hassle. At least I got everything on the page that had the text I needed... so progress!
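As a first cleanup pass, a rough sketch that assumes the rows in $j are newline-separated (your markup may differ):
$rows = $j -split "`r?`n" | ForEach-Object { $_.Trim() } | Where-Object { $_ }
$rows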
I have a report that is copied to a number of different servers. It is imported manually, and the data source properties are altered to match the current server's specs. I would like to automate the process by enabling users to open the SSRS report and dynamically alter its shared data source properties through PowerShell. I hope you can help. See the reference below.
The script should accept input parameters for server name, username and password. Also, the "save my password" option must be ticked.
I couldn't believe I managed to create a script for this. You may use the script below as a future reference. Comments are available for each part, and anything that needs to be altered is marked with a "here" keyword, e.g. Your_database_name_here.
Import-Module SqlPs
#Input parameter to get Server\Instancename of your Datasource
$Servername = Read-Host "Please enter your Servername"
$Instancename = Read-Host "Please enter your Instancename. For default instance please press enter"
Write-host ""
if ($Instancename -eq ""){
$ServerInstance = $Servername
}
Else {
$ServerInstance = $Servername +"\"+ $InstanceName
}
#Setting up SSRS Target URL. This is the location where your reports would be deployed.
if ($Instancename -eq ""){
$ReportServerUri = "http://$Servername/ReportServer//ReportService2010.asmx?wsdl"
$TargetURL = "http://$Servername/Reports"
}
Else {
$ReportServerUri = "http://$Servername/ReportServer_$Instancename//ReportService2010.asmx?wsdl"
$TargetURL = "http://$Servername/Reports_$Instancename"
}
$global:proxy = New-WebServiceProxy -Uri $ReportServerUri -UseDefaultCredential
#We will use SQL Server Authentication for the report's shared data source, so you need to supply a username and password.
Write-Host " SQL Server Authentication:"
$Username = Read-Host " Username"
$Password = Read-Host -AsSecureString "Password"
$type = $Proxy.GetType().Namespace
$datatype = ($type + '.Property')
$property =New-Object ($datatype);
$property.Name = "NewFolder"
$property.Value = "NewFolder"
$numproperties = 1
$properties = New-Object ($datatype + '[]') $numproperties
$properties[0] = $property;
$newFolder = $proxy.CreateFolder("Reports", "/", $properties);
$newFolder = $proxy.CreateFolder("Data Sources", "/", $properties);
$Children =$proxy.ListChildren("/",$false)
$DBname = 'Your_Database_Name_Here'
# Creating Datasource through powershell
Write-Host " Creating Datasource ..."
$Name = "Name_Your_Datasource_here"
$Parent = "/Data Sources"
$ConnectString = "data source=$Servername\$Instancename;initial catalog=$DBname"
$type = $Proxy.GetType().Namespace
$DSDdatatype = ($type + '.DataSourceDefinition')
$DSD = new-object ($DSDdatatype)
if($DSD -eq $null){
Write-Error "Failed to create data source definition object"
}
$CredentialDataType = ($type + '.CredentialRetrievalEnum')
$Cred = New-Object ($CredentialDataType)
# 1 = Store: the report server stores the SQL credentials supplied above
$Cred.value__ = 1
$DSD.CredentialRetrieval = $Cred
$DSD.ConnectString = $ConnectString
$DSD.Enabled = $true
$DSD.EnabledSpecified = $false
$DSD.Extension = "SQL"
$DSD.ImpersonateUserSpecified = $false
$DSD.Prompt = $null
$DSD.WindowsCredentials = $false
$DSD.UserName = $Username
$DSD.Password = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($Password))
$newDSD = $proxy.CreateDataSource($Name,$Parent,$true,$DSD,$null)
#Deploying RDL files to Target URL
Write-Host " Deploying RDL files ..."
$stream = Get-Content 'D:\Your_RDL_path_here.rdl' -Encoding byte
$warnings = @();
$proxy.CreateCatalogItem("Report","Report_Name_here","/Reports",$true,$stream,$null,[ref]$warnings)
#Let's make use of the datasource we just created for your RDL files.
$Items = $global:proxy.listchildren("/Data Sources", $true)
foreach ($item in $items)
{
$DatasourceName = $item.Name
$DatasourcePath = $item.Path
}
$RDLS = $global:proxy.listchildren("/Reports", $true)
foreach ($rdl in $rdls)
{
$report = $rdl.path
$rep = $global:proxy.GetItemDataSources($report)
$rep | ForEach-Object {
$proxyNamespace = $_.GetType().Namespace
$constDatasource = New-Object ("$proxyNamespace.DataSource")
$constDatasource.Name = $DataSourceName
$constDatasource.Item = New-Object ("$proxyNamespace.DataSourceReference")
$constDatasource.Item.Reference = $DataSourcePath
$_.item = $constDatasource.Item
$global:proxy.SetItemDataSources($report, $_)
Write-Host "Changing datasource `"$($_.Name)`" to $($_.Item.Reference)"
}
}
#Open a IE browser to view the report.
$IE=new-object -com internetexplorer.application
$IE.navigate2($TargetURL)
$IE.visible=$true
Write-Host ""
Write-Host "You may now view the Reports through the open IE browser."
Write-Host -ForegroundColor Green "**STEP COMPLETED!"