When I execute an SSIS package from PowerShell, the returned data only contains the SSIS process result, not the actual result
I'm trying to get the result set from the SSIS process when it is done executing.
I'm executing the packages from PowerShell using the package's Execute method:
#connecting to sql server
$sqlConnStr = "Data Source=" + $targetServer + ";Initial Catalog=master;Integrated Security=SSPI;"
$sqlConn = New-Object System.Data.SqlClient.SqlConnection $sqlConnStr
#create new SSIS object
$ssisService = New-Object "$ssisNameSpace.IntegrationServices" $sqlConn
#select SSIS catalog
$cat = $ssisService.Catalogs["SSISDB"]
#select SSIS folder
$folder = $cat.Folders[$targetFolder]
#select target project
$project = $folder.Projects[$projectName]
#select target package
$targetPackage = $project.Packages[$package.PackageName]
#execute package and get the result
$actualVal = $targetPackage.Execute($false, $null)
Expected value:
the dataset from the SSIS process
Actual value:
the SSIS process result code only
In the end I dump the result into a table and then select it again after my SSIS process is done, just like Jacob said; roughly like the sketch below.
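For reference, a minimal sketch of that approach (dbo.PackageResults and MyDatabase are made-up names, and Invoke-Sqlcmd from the SQLPS/SqlServer module is assumed to be available):
# $actualVal returned by Execute() above is the SSISDB execution id; wait for the execution
# to finish (1/2/5/8 = created/running/pending/stopping in catalog.executions), then read
# back the rows the package dumped into the staging table.
do {
    Start-Sleep -Seconds 5
    $status = Invoke-Sqlcmd -ServerInstance $targetServer -Database "SSISDB" `
        -Query "SELECT status FROM catalog.executions WHERE execution_id = $actualVal"
} while ($status.status -in 1,2,5,8)
$resultSet = Invoke-Sqlcmd -ServerInstance $targetServer -Database "MyDatabase" `
    -Query "SELECT * FROM dbo.PackageResults"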
Thank you for your input.
I don't think you can get the result set using PowerShell or other languages and tools, since it is internal to the SSIS process; you can only retrieve the execution result and the package log (errors, warnings, information).
dtexec Utility
DTEXEC Command Line Parameters Using Command Files
As a workaround, you can export the result set to a flat file and read it with PowerShell.
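A minimal sketch of that workaround, assuming the package has already exported the result set to a CSV file (the path and the Status column are made-up names):
# Hedged sketch: after the execution has finished, read the flat file the package exported
$resultSet = Import-Csv -Path "C:\Exports\package_result.csv"
$resultSet | Where-Object { $_.Status -eq "OK" } | Format-Table -AutoSize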
I am trying to deploy SQL files to an Azure Synapse Analytics dedicated SQL pool using a PowerShell script in an Azure DevOps pipeline.
I have a folder with SQL files; after defining an array of files I run a foreach loop over the array and call Invoke-Sqlcmd to deploy them, but only the first SQL file gets deployed (the object is created) and then I get this error:
Msg 104309, Level 16, State 1, Line 1 There are no batches in the
input script.
Below is my piece of code:
$files='$(Build.SourcesDirectory)\folder1\'
foreach ($s in $files)
{
    Invoke-Sqlcmd -ServerInstance $(server) -Database $(db) -InputFile $s -Credential $(cred)
}
Azure Synapse Analytics dedicated SQL pool scripts are sensitive to empty batches, e.g. this script would generate the same error:
Msg 104309, Level 16, State 1, Line 1 There are no batches in the
input script.
-- xx
GO
However, the truth is you get the same error in Synapse Studio and SQL Server Management Studio (SSMS), so I suggest you run through your scripts and use the Parse button (Ctrl+F5) in SSMS, which parses the script but does not execute it. This will help you track down the error.
In summary don't have batches that don't do anything.
I was able to get a simple local example working by using dir (Get-ChildItem) and the FullName property to get the full path:
$files = dir "D:\temp\Powershell\*.sql"
foreach ($s in $files) {
#$s.Name
Invoke-Sqlcmd -ServerInstance '.\sql2019x' -Database 'tempdb' -InputFile $s.FullName
}
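Applied to the original pipeline scenario, the fix might look roughly like this (a sketch, assuming the .sql files sit directly under folder1 and that $(Build.SourcesDirectory), $(server), $(db) and $(cred) expand as in the question):
# Enumerate the actual .sql files instead of passing a single folder path string
$files = Get-ChildItem "$(Build.SourcesDirectory)\folder1" -Filter *.sql
foreach ($s in $files) {
    Invoke-Sqlcmd -ServerInstance $(server) -Database $(db) -Credential $(cred) -InputFile $s.FullName
}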
I have to implement a solution where I have to deploy an SSIS project (xy.ispac) from one machine to another. So far I've managed to copy-paste the following stuff from all around the internet:
# Variables
$ServerName = "target"
$SSISCatalog = "SSISDB" # sort of constant
$CatalogPwd = "catalog_password"
$ProjectFilePath = "D:\Projects_to_depoly\Project_1.ispac"
$ProjectName = "Project_name"
$FolderName = "Data_collector"
# Load the IntegrationServices Assembly
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")
# Store the IntegrationServices Assembly namespace to avoid typing it every time
$ISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
Write-Host "Connecting to server ..."
# Create a connection to the server
$sqlConnectionString = "Data Source=$ServerName;Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
$integrationServices = New-Object "$ISNamespace.IntegrationServices" $sqlConnection
$catalog = $integrationServices.Catalogs[$SSISCatalog]
# Create the Integration Services object if it does not exist
if (!$catalog) {
# Provision a new SSIS Catalog
Write-Host "Creating SSIS Catalog ..."
$catalog = New-Object "$ISNamespace.Catalog" ($integrationServices, $SSISCatalog, $CatalogPwd)
$catalog.Create()
}
$folder = $catalog.Folders[$FolderName]
if (!$folder)
{
#Create a folder in SSISDB
Write-Host "Creating Folder ..."
$folder = New-Object "$ISNamespace.CatalogFolder" ($catalog, $FolderName, $FolderName)
$folder.Create()
}
# Read the project file, and deploy it to the folder
Write-Host "Deploying Project ..."
[byte[]] $projectFile = [System.IO.File]::ReadAllBytes($ProjectFilePath)
$folder.DeployProject($ProjectName, $projectFile)
This seemed to work surprisingly well on the development machine and test server pair. However, the live environment will be a bit different: the machine doing the deployment job (deployment server, or DS from now on) and the SQL Server (DB for short) the project is to be deployed to are in different domains, and since SSIS requires Windows authentication, I'm going to need to run the above code locally on DS but using the credentials of a user on DB.
And that's the point where I fail. The only thing that worked was to start the PowerShell command line interface using runas /netonly /user:thatdomain\anuserthere powershell, enter the password, and paste the script unaltered into it. Alas, this is not an option, since there's no way to pass the password to runas (except entering it once with /savecred) and user interactivity is not possible anyway (the whole thing has to be automated).
I've tried the following:
Simply running the script on DS: the line $sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString would use the credentials from DS, which are not recognized by DB, and New-Object does not have a -Credential argument I could pass them to.
Putting everything into an Invoke-Command with -Credential requires using -ComputerName as well. I guess it would be possible to use the local machine as 'remote' (using . as ComputerName), but it still complains about access being denied. I'm scanning through about_Remote_Troubleshooting, so far without any success.
Any hints on how to overcome this issue?
A solution might be to use a SQL user (with the right access rights) instead of an AD user.
Something like the sketch below should work.
(Check also the answer on how to correct the connection string.)
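For illustration, a sketch of what that could look like (the SQL login name is made up, and the password should come from a secure store rather than being hard-coded):
# Hedged sketch: SQL authentication instead of Integrated Security in the connection string
$sqlUser = "ssis_deployer"                 # hypothetical SQL login with rights on SSISDB
$sqlPassword = "from-a-secure-store"       # do not hard-code this in a real script
$sqlConnectionString = "Data Source=$ServerName;Initial Catalog=master;User ID=$sqlUser;Password=$sqlPassword;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
$integrationServices = New-Object "$ISNamespace.IntegrationServices" $sqlConnection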
I have a server health check script which I'm trying to get working as a scheduled task.
The scheduled task has the following set for 'Add arguments':
Add Arguments: -NoLogo -ExecutionPolicy Bypass -File "C:\HealthCheck.ps1"
Everything in the server health portion of the full script works fine and creates the .csv report, except the last part, which does the CSV to Excel conversion/save/close. I've not included the preceding code as it contains some confidential stuff, and I don't believe it's relevant.
When I run the script under the same ID, but from the GUI (not as a scheduled task), it works fine.
Note: the last part of the script definitely does launch Excel briefly, perform the functions, and save/close it - I'm thinking the scheduled task isn't doing this because it's not supported by Microsoft?
I did find the following SpiceWorks post, but the solution noted there didn't resolve the issue for me in this case. That's where you create a DESKTOP folder under these paths, depending on your version of Office (I'm using Office 2010 32-bit on Windows 7 x64 Pro):
C:\windows\system32\config\systemprofile
C:\windows\syswow64\config\systemprofile
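For reference, that workaround amounts to creating the missing Desktop folders, roughly like this (a sketch of the SpiceWorks suggestion; it didn't help in my case):
# Create the Desktop folders that Office COM automation expects under the system profile
New-Item -ItemType Directory -Force -Path "C:\Windows\System32\config\systemprofile\Desktop"
New-Item -ItemType Directory -Force -Path "C:\Windows\SysWOW64\config\systemprofile\Desktop"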
Anyway, here's the code - Any help appreciated!
#Convert CSV to EXCEL, format, and save
#Create excel object
$xl = new-object -comobject excel.application
$xl.visible = $true
#Input
$Workbook = $xl.workbooks.open("$Dir\Reports\SeverHealth-Results-$CurrentDate.csv")
$worksheet = $workbook.worksheets.Item(1)
$xl.Rows.Item("2:2").Select()
$xl.ActiveWindow.FreezePanes = $true
$HeaderRow = $Worksheet.Range("A1:L1")
$HeaderRow.Font.Bold = $True
$HeaderRow.Font.Underline = $True
$range = $worksheet.UsedRange
$range.AutoFilter() | Out-Null
$range.EntireColumn.AutoFit() | Out-Null
$rowc = $WorkSheet.UsedRange.Rows.Count
$colc = $WorkSheet.UsedRange.Columns.Count
#Coloring
for ($z = 1; $z -le $rowc; $z++) {
    $ActionReqCol = $worksheet.cells.item($z,7)
    $ServerCol = $worksheet.cells.item($z,1)
    if ($ActionReqCol.text -eq "YES") {
        $ActionReqCol.interior.colorindex = 3
        $ActionReqCol.font.colorindex = 2
        $ServerCol.interior.colorindex = 3
        $ServerCol.font.colorindex = 2
    }
}
#Save and close!
$EndDate = Get-Date
$EndDate = $EndDate.ToString('MM-dd-yyyy_hhmm')
$Worksheets = $Workbook.worksheets
$xlFixedFormat = [Microsoft.Office.Interop.Excel.XlFileFormat]::xlWorkbookDefault
$Workbook.SaveAs($Dir + "\Reports\SeverHealth-Results-$EndDate.xls", $XLFixedFormat)
$Workbook.Saved = $True
$xl.Quit()
Write the command that invokes PowerShell with its arguments into a batch file. I believe from the comments that you are already able to do this successfully. Then configure Task Scheduler to execute the batch file, for example:
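A minimal sketch of such a batch file, simply wrapping the PowerShell invocation already shown in the question:
@echo off
powershell.exe -NoLogo -ExecutionPolicy Bypass -File "C:\HealthCheck.ps1"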
Another advantage of this is reduced dependency: if you later want to modify your command or alter its arguments, you can do so without altering or even opening Task Scheduler.
Update: @Kenny reported that running the scheduled task with highest privileges resolved this. The script required elevated access, which was provided by ticking the 'Run with highest privileges' checkbox in Task Scheduler.
I have a requirement to write some metrics to Application Insights for monitoring a service at a regular interval.
I thought I would write this PowerShell script and schedule it accordingly.
Write-Output "Script Start"
$PSScriptRoot = Get-Location
$AI = "$PSScriptRoot\Microsoft.ApplicationInsights.dll"
[Reflection.Assembly]::LoadFile("$AI")
$InstrumentationKey = ""
$TelClient = New-Object "Microsoft.ApplicationInsights.TelemetryClient"
$TelClient.InstrumentationKey = $InstrumentationKey
$TrackMetric = New-Object "Microsoft.ApplicationInsights.DataContracts.MetricTelemetry"
$TrackMetric.Name = "PowershellTest"
$TrackMetric.Value = Get-Random -Minimum:1 -Maximum:100
$TelClient.TrackMetric($TrackMetric)
$TelClient.Flush()
Write-Output "Script End $($TrackMetric.Value)"
This PowerShell script works, but after moving it to a Runbook it no longer works.
So, here is the issue.
I am not able to load the Application Insights DLL inside the Runbook.
Any idea how to do that?
Exception Details
Exception calling "LoadFile" with "1" argument(s): "The system cannot find the file specified. (Exception from HRESULT:
0x80070002)"
Thanks
Siraj
Try the following path for the assembly:
"C:\Modules\Global\Azure\Compute\Microsoft.ApplicationInsights.dll"
The issue is in loading the DLL file. The Runbook is not able to find the file in this line:
$AI = "$PSScriptRoot\Microsoft.ApplicationInsights.dll"
[Reflection.Assembly]::LoadFile("$AI")
When you run a Runbook via Azure Automation, you don't have access to the local path as you normally do on a local machine or on premise. In Azure Automation, modules are placed in "C:\Modules".
Instead, use the code snippet below after you have uploaded the DLL file:
[System.Reflection.Assembly]::LoadFrom("C:\Modules\Azure\Microsoft.ApplicationInsights.dll")
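If the module folder layout differs in your Automation account, a quick way to locate the uploaded DLL from inside the Runbook is a sketch like this:
# Search the sandbox's module root for the uploaded assembly and load the first match
$ai = Get-ChildItem -Path "C:\Modules" -Filter "Microsoft.ApplicationInsights.dll" -Recurse |
      Select-Object -First 1
[System.Reflection.Assembly]::LoadFrom($ai.FullName)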
Closest Related Reference: Referencing DLL
Can I use PowerShell to generate and print an SSRS report (with parameters) to a network printer?
I will use a SQL Agent job to poll a table for new order entries. When an order comes in, I want to generate an SSRS report and print it to one of many remote printers (the printer will be one of the input parameters) on our network, e.g. print a pick ticket in the appropriate warehouse when an order is placed.
We currently use batch files to print to local printers, but the program hangs often and does not scale well.
In this case:
Report Name: Ticket System
Parameters are: Date and Department
Report Format: PDF
#Place Adobe Reader in your path
$env:Path = $env:Path + ";C:\Program Files (x86)\Adobe\Reader 11.0\Reader"
#Specify variables and pass parameters and specify format of report PDF (to keep things simple)
$url = "http://$serverName/ReportServer?/$reportFolder/Ticket+System&Date=3-31-2014&Department=Finance&rs:Format=PDF"
#Use alternative credentials as needed to access report server
$webclient = New-Object System.Net.WebClient
$webclient.UseDefaultCredentials = $TRUE
$file = "C:\temp\report.pdf"
$webclient.DownloadFile($url,$file)
#Specify printer \\server\name
$printer = "\\NorthSide\SharpPrinter"
#The /s /o switch may not be necessary. You can test it out.
AcroRd32.exe /s /o /t $file $printer
Get-Process AcroRd32 | Stop-Process
References:
This link shows how to pass parameters to a report via URL:
http://msdn.microsoft.com/en-us/library/ms155391.aspx
This link shows how to access a report in a printable format via URL:
http://msdn.microsoft.com/en-us/library/ms154040.aspx
This link shows how to use PowerShell to write the file to the filesystem via URL:
http://teusje.wordpress.com/2011/02/19/download-file-with-powershell/
This link shows how to push the job to the spooler (using PowerShell version 4.0):
http://technet.microsoft.com/en-us/library/hh849886.aspx
If you are having trouble using PowerShell in a SQL Agent job, I would alternatively do the polling with a Task Scheduler job and query your database using the SQLPS module, along the lines of the sketch below.
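A rough sketch of that alternative (the server, table, and column names are made up; $serverName and $reportFolder are the same values used in the URL above; assumes the SQLPS or SqlServer module is installed):
# Hedged sketch: poll for new orders and print a pick ticket for each
Import-Module SQLPS -DisableNameChecking
$newOrders = Invoke-Sqlcmd -ServerInstance "MyServer" -Database "Orders" `
    -Query "SELECT OrderId, Department, Printer FROM dbo.NewOrderQueue WHERE Printed = 0"
$webclient = New-Object System.Net.WebClient
$webclient.UseDefaultCredentials = $true
foreach ($order in $newOrders) {
    # Reuse the URL/report pattern from the answer above
    $url  = "http://$serverName/ReportServer?/$reportFolder/Ticket+System&Date=$(Get-Date -Format 'M-d-yyyy')&Department=$($order.Department)&rs:Format=PDF"
    $file = "C:\temp\order_$($order.OrderId).pdf"
    $webclient.DownloadFile($url, $file)
    AcroRd32.exe /t $file $($order.Printer)
}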
Hope that helps.