Set-SqlColumnEncryption throws error - PowerShell

I am encrypting several columns in an existing table using the Encrypt Columns feature in SSMS. I have chosen to generate a PowerShell script instead of encrypting the columns in the wizard, so I can encrypt the columns at a later point in time. The script is below:
# Generated by SQL Server Management Studio at 3:03 PM on 4/05/2018
Import-Module SqlServer
# Set up connection and database SMO objects
$sqlConnectionString = "Data Source=.;Initial Catalog=MyDatabase;Integrated Security=True;MultipleActiveResultSets=False;Connect Timeout=30;Encrypt=False;TrustServerCertificate=False;Packet Size=4096;Application Name=`"Microsoft SQL Server Management Studio`""
$smoDatabase = Get-SqlDatabase -ConnectionString $sqlConnectionString
# If your encryption changes involve keys in Azure Key Vault, uncomment one of the lines below in order to authenticate:
# * Prompt for a username and password:
#Add-SqlAzureAuthenticationContext -Interactive
# * Enter a Client ID, Secret, and Tenant ID:
#Add-SqlAzureAuthenticationContext -ClientID '<Client ID>' -Secret '<Secret>' -Tenant '<Tenant ID>'
# Change encryption schema
$encryptionChanges = @()
# Add changes for table [dbo].[Voucher]
$encryptionChanges += New-SqlColumnEncryptionSettings -ColumnName dbo.Voucher.Code -EncryptionType Randomized -EncryptionKey "cek"
Set-SqlColumnEncryption -ColumnEncryptionSettings $encryptionChanges -InputObject $smoDatabase
However, when I run the script, I get the below exception from the Set-SqlColumnEncryption cmdlet:
System.Reflection.TargetInvocationException: Exception has been thrown by the target of an
invocation. ---> System.TypeInitializationException: The type initializer for
'Microsoft.SqlServer.Management.AlwaysEncrypted.Management.AlwaysEncryptedManagement' threw an
exception. ---> System.IO.FileNotFoundException: Could not load file or assembly 'Newtonsoft.Json,
Version=6.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed' or one of its dependencies. The
system cannot find the file specified.
I updated the SqlServer module as well. Surely I don't have to manually drop the Newtonsoft.Json.dll file into the SqlServer module directory. Any ideas?

You're using 'Always Encrypted'. I'm assuming your column master key is stored in the Windows Certificate Store on the machine where your script is running, and I'm assuming your keys are set up correctly.
Let's say I have a table with schema:
CREATE TABLE dbo.property_bag (
    id int IDENTITY(1,1) PRIMARY KEY
    , insert_user sysname NOT NULL
    , insert_date datetime2 NOT NULL
    , insert_source sysname NOT NULL
    , [stuff] varchar(max) ENCRYPTED WITH (ENCRYPTION_TYPE = RANDOMIZED
        , ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
        , COLUMN_ENCRYPTION_KEY = [dbo-property_bag]) NULL
);
GO
Here is a script to insert data into my table:
$insert_count = 10
$words = @('bobby','mikey','billy','suzie','kenny','narav','navneet','rachel','jose','juan')
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Server='<db-server>';Database='<db>';Column Encryption Setting=enabled;Integrated Security=True;"
$hostname = $env:computername
try {
    $conn.Open()

    for ($i = 1; $i -le $insert_count; $i++) {
        $val = Get-Random -Maximum 10
        $word_to_insert = $words[$val]
        $sqlcmd = New-Object System.Data.SqlClient.SqlCommand
        $sqlcmd.Connection = $conn
        $sqlcmd.CommandText = "INSERT INTO dbo.property_bag ([insert_user], [insert_date], [insert_source], [stuff]) VALUES (SUSER_SNAME(), GETDATE(), '${hostname}', @value)"
        $sqlcmd.Parameters.Add((New-Object Data.SqlClient.SqlParameter("@value", [Data.SqlDbType]::VarChar, 4000))) | Out-Null
        $sqlcmd.Parameters[0].Value = $word_to_insert
        $sqlcmd.ExecuteNonQuery() | Out-Null
        # PercentComplete expects a value from 0 to 100
        Write-Progress -Activity "Inserting Records..." -Status "Progress $($i / $insert_count * 100)%" -PercentComplete ($i / $insert_count * 100)
    }
} finally {
    $conn.Close()
    $conn.Dispose()
}
When I registered the column master key with the database (through Object Explorer), I set the path to "CurrentUser/My/". SQL Server will pass that location back to the driver, which will search that location in the local key store for the certificate matching the thumbprint provided. That key (which in my case is on the app server where the script is running) will decrypt the column encryption key. None of this is detailed in the script; it's all set up in the database.
In your example, the connection string doesn't contain "Column Encryption Setting=enabled". I also don't believe the Get-SqlDatabase cmdlet takes a connection string parameter; I believe you'd pass it an instance object. There are just a bunch of things that don't look quite right.
If your keys aren't set up correctly, I'd start here.
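For reference, a minimal sketch of the connection-string change described above (the server and database names are placeholders, not from the original question):

```powershell
# Sketch only: <db-server> and <db> are placeholders.
# The client driver only encrypts/decrypts Always Encrypted columns when
# "Column Encryption Setting=enabled" appears in the connection string.
$connStr = "Data Source=<db-server>;Initial Catalog=<db>;Integrated Security=True;Column Encryption Setting=enabled"
$connStr
```

If your SqlServer module version supports it, `Get-SqlDatabase -ServerInstance "<db-server>" -Name "<db>"` is one way to obtain the SMO database object without a raw connection string.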

I was having the exact same issue, and I believe this is some kind of component integration issue with your particular mix of OS and SQL Server.
Case in point: I received this error when trying to run that PowerShell script on my laptop. My laptop happens to use Windows 10 and SQL Server Management Studio 17.2 (which is what ultimately generates the .ps1). Furthermore, my laptop does contain Newtonsoft.Json.dll in the correct directory.
However, I hopped onto a server which was running Windows Server 2012 R2 and SSMS 17.2, and the script DOES work!
Ultimately, it's as if some assembly binding redirect is missing in the Windows 10/SSMS 17.2 installation which the Windows 2012 R2/SSMS 17.2 installation resolves correctly.

Related

Using Powershell to update On Premise PowerBI Datasource causes Connection Test to fail

I have a script which updates Power BI data sources on an on-premises Power BI Report Server (the script below is abridged for brevity). The script updates the connection string in all SQL data sources from $OldServerName to $NewServerName.
The script below filters to just one report for the sake of testing; the real script loops through all reports from the root folder.
param ([Parameter(Mandatory=$true)]$ReportServerURI,
[Parameter(Mandatory=$true)]$OldServerName,
[Parameter(Mandatory=$true)]$NewServerName
);
$session = New-RsRestSession -ReportPortalUri $ReportServerURI;
# get all PowerBI reports
$powerBIs = Get-RsFolderContent -RsFolder '/MyFolder1/MyFolder2' -ReportServerUri $ReportServerURI -Recurse | Where-Object -Property "TypeName" -EQ "PowerBIReport"; #the real script starts at the root folder. I just restrict here to target one report for testing
foreach ($pbi in $powerBIs | Where-Object {$_.Name -eq "MyReport"}) # again, this restriction to one report is just for testing
{
    # get all the datasources in the report
    $rds = Get-RsRestItemDataSource -WebSession $session -RsItem $pbi.Path;
    # if data sources have been found
    if ($rds -ne $null)
    {
        # loop through all the datasources
        foreach ($d in $rds)
        {
            if ($d.ConnectionString.ToUpper().Contains($OldServerName.ToUpper()) -and $d.DataModelDataSource.Kind -eq "SQL")
            {
                $d.ConnectionString = $d.ConnectionString -replace $OldServerName, $NewServerName;
                Write-Host ("$($d.id) updated") -ForegroundColor Green;
            };
        };
    };
    Set-RsRestItemDataSource -WebSession $session -RsItem $pbi.Path -DataSources $rds -RsItemType PowerBIReport;
};
The script works, and when I browse to /MyFolder1/MyFolder2/ in the web report manager, click Manage for MyReport, and then go to the data sources tab, I can see that the data sources are there, the SQL data source connection strings have been updated as hoped, and the credentials are as they were before the update. However, when I click "Test Connection" I get the error
Log on failed. Ensure the user name and password are correct.
I can confirm that the connection succeeds before the update (although this is against $oldServerName).
The credentials for the SQL data sources are for a Windows user, and that Windows login exists on the SQL Server $NewServerName and is a user in the database that the data source points to.
There are also some Excel data sources for the Power BI report which use the same Windows user's credentials and which, whilst not updated by the script, display the same behaviour (the connection test succeeds before the script update but fails after).
If I re-enter the credentials manually, the test then succeeds. However, when I refresh the shared schedule via Manage --> Scheduled Refresh --> Refresh Now, the refresh fails and I get the following error
SessionID: 45944afc-c53c-4cca-a571-673c45775eab [0] -1055784932:
Credentials are required to connect to the SQL source. (Source at
OldServerName;Database.). The exception was raised by the IDbCommand
interface.
[1] -1055129594: The current operation was cancelled
because another operation in the transaction failed.
[2] -1055784932:
The command has been canceled.. The exception was raised by the
IDbCommand interface.
What am I missing? Is there something else I need to do?
I am using PowerBI Report Server version October 2020
I think the moment you execute Set-RsRestItemDataSource to modify the SQL data source connection strings, the password becomes invalid. This seems normal to me, as you don't want someone to modify the connection string and use it with someone else's credentials. So in a way this looks like a security feature, and it is behaving as designed.
A possible workaround you could try is to set the credentials again:
Create a credential object with New-RsRestCredentialsByUserObject
From the docs:
This script creates a new CredentialsByUser object which can be used when updating shared/embedded data sources.
Update the SQL DataSource connection with the new credentials object
Something like this in your case might work:
$newCredentialsParams = @{
    Username = "domain\username"
    Password = "userPassword"
    WindowsCredentials = $true
}
$rds = Get-RsRestItemDataSource -WebSession $session -RsItem $pbi.Path
$rds[0].CredentialRetrieval = 'Store'
$rds[0].CredentialsByUser = New-RsRestCredentialsByUserObject @newCredentialsParams
$setDataSourceParams = @{
    WebSession = $session
    RsItem = $pbi.Path
    DataSources = $rds
    RsItemType = 'PowerBIReport'
}
Set-RsRestItemDataSource @setDataSourceParams

File opened that is not a database file (but it is)

I'm trying to pull information from a SQLite database using a PowerShell script from a Redgate article.
#I've installed the 64 bit System.Data.SQLite ADO.NET data provider in this directory
Add-Type -Path "C:\Program Files (x86)\SQLite.NET\bin\x64\System.Data.SQLite.dll"
#I just create a connection to my existing database
$con = New-Object -TypeName System.Data.SQLite.SQLiteConnection
# I then give it a simple connection string
$con.ConnectionString = "Data Source=C:\Users\name\Desktop\fax.sqlite"# CHANGE THIS
#and open the connection
$con.Open()
#We'll just start by creating a SQL statement in a command
$sql = $con.CreateCommand()
$sql.CommandText = "select column from table;"
# we now execute the SQL and return the results in a dataset
$adapter = New-Object -TypeName System.Data.SQLite.SQLiteDataAdapter $sql
#we create the dataset
$data = New-Object System.Data.DataSet
#and then fill the dataset
[void]$adapter.Fill($data)
#we can, of course, then display the first one hundred rows in a grid
#(1..100)|foreach{$data.tables[0].Rows[$_]}|out-gridview #-Title 'Authors and books'
I viewed the database using SQLite DB Browser (didn't need a password to open). However, whenever I use this script I get the below error:
Exception calling "Open" with "0" argument(s): "File opened that is
not a database file: file is encrypted or is not a database"
At C:\Users\name\Desktop\query sqlite db.ps1:9 char:1
+ $con.Open()
+ CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SQLiteException
Assuming you have all the libraries in place, to import the assemblies you need to use the Add-Type command:
Add-Type -Path "C:\Program Files\System.Data.SQLite\2010\bin\System.Data.SQLite.dll"
To connect to the database using the ADO.NET protocol, you need to create a SQLiteConnection object with the proper connection string:
$connection_details = New-Object -TypeName System.Data.SQLite.SQLiteConnection
$connection_details.ConnectionString = "Data Source=C:\database\test.db"
There is an issue with your $con.ConnectionString = "Data Source=C:\Users\name\Desktop\fax.sqlite" line: the provider is expecting a proper DB connection, but instead it is getting a string it cannot interpret as one.
I got the same error when running a 32-bit dll on a 64-bit system.
If you are running a 64-bit Windows system then download sqlite-netFx46-static-binary-x64-2015-1.0.116.0.zip from https://system.data.sqlite.org/index.html/doc/trunk/www/downloads.wiki
and extract the contents to C:\Program Files\System.Data.SQLite\2010\bin\
Then run this command once or place it in your PowerShell script before making your connection object:
Add-Type -Path "C:\Program Files\System.Data.SQLite\2010\bin\System.Data.SQLite.dll"
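Before loading the DLL, it may help to confirm the bitness of the PowerShell process itself, since it must match the bitness of the System.Data.SQLite binary you downloaded:

```powershell
# Diagnostic: a 32-bit PowerShell host cannot load the x64 SQLite interop DLL.
$is64 = [Environment]::Is64BitProcess
"64-bit process: $is64"
```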

Automatic configuration of SSRS with PowerShell

I have already installed the Report Server database. I know how to configure the report server through RS Configuration Manager, but I want to do this automatically with PowerShell.
So how do I change these things:
Change the data source connection string
Back up and restore the encryption key (I will have Report Server on two instances and will keep RS in sync)
Change the rsconfig file (modify authentication types, add 2 more for Kerberos)
You can do it by getting a WMI object with the configuration settings for the SSRS 2017 instance and then setting the required configuration:
$configset = Get-WmiObject -Namespace "root\Microsoft\SqlServer\ReportServer\RS_SSRS\v14\Admin" -Class MSReportServer_ConfigurationSetting -ComputerName localhost
Connect to the machine:
$conn = New-Object Microsoft.SqlServer.Management.Common.ServerConnection -ArgumentList $env:ComputerName
$conn.ApplicationName = "SSRS Configuration Script"
$conn.StatementTimeout = 0
$conn.Connect()
$smo = New-Object Microsoft.SqlServer.Management.Smo.Server -ArgumentList $conn
and change the configuration as required
## Create the ReportServer and ReportServerTempDB databases
## (the script-generation step was missing from the original snippet)
$dbscript = $configset.GenerateDatabaseCreationScript("ReportServer", 1033, $false).Script
$db = $smo.Databases["master"]
$db.ExecuteNonQuery($dbscript)
## Set permissions for the databases
$dbscript = $configset.GenerateDatabaseRightsScript($configset.WindowsServiceIdentityConfigured, "ReportServer", $false, $true).Script
$db.ExecuteNonQuery($dbscript)
## Set the database connection info
$configset.SetDatabaseConnection("(local)", "ReportServer", 2, "", "")
You can modify the script for the other two items (#2 and #3). Refer to this article for more details; hope that helps.
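For item #2, the same WMI object exposes BackupEncryptionKey and RestoreEncryptionKey methods. A hedged sketch (the file path and password are placeholders, and the exact method signatures should be verified against the MSReportServer_ConfigurationSetting documentation):

```powershell
# Sketch only: requires a live SSRS instance; path and password are placeholders.
$configset = Get-WmiObject -Namespace "root\Microsoft\SqlServer\ReportServer\RS_SSRS\v14\Admin" `
    -Class MSReportServer_ConfigurationSetting -ComputerName localhost

# Back up the encryption key on the first instance
$backup = $configset.BackupEncryptionKey("Str0ngP@ssw0rd")
[System.IO.File]::WriteAllBytes("C:\temp\ssrs.snk", $backup.KeyFile)

# Restore the same key on the second instance
$key = [System.IO.File]::ReadAllBytes("C:\temp\ssrs.snk")
$configset.RestoreEncryptionKey($key, $key.Length, "Str0ngP@ssw0rd") | Out-Null
```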

Programmatically Deploying Power BI Reports to Power BI Report Server and change Connection String

Is there any method to deploy Power BI reports to Power BI Report Server without having to manually copy the files, upload them to the server, and finally change the data source connection information on a report-by-report basis, which is not practical across customer sites?
E.g. the Power BI report file 'Report_1' needs to be deployed to customer servers S1, S2, S3, and so on.
Currently we do this manually: copy the files, upload them to the server, and change the data source connection information for each report individually, which is not practical at each customer site.
How can we automate the deployment of PBIX reports to Power BI Report Server and change the data source connection string programmatically?
Microsoft is releasing a feature in January 2020 to update the connection string using the API. But is there any way to do it in 2019? Any other way to update the connection string?
Microsoft Link
I finally came up with a trick to update the connection string in Power BI Report Server.
First, install the Power BI API module in PowerShell.
The Microsoft API doesn't give you the ability to update the connection string, but it does allow updating the username.
Both username and connection string are stored in encrypted form in the database.
So the logic is: pass the connection string as the username, then copy the encrypted string into the connection string column in the database.
Just check the example below, where I use this trick. Thank you.
# Code by SB, 2019
$ReportServerURI = 'http://localhost/PowerBIReports'  # Report portal URI
$filePath = "C:\12.pbix"                              # Local path of the Power BI file
$PBIxfileName = "12"                                  # Power BI file name
$FolderName = 'NewDataset'                            # Report server folder to deploy to
$Username = 'admin'
$password = 'password'
$ReportServerName = 'localhost\SQl2017'               # SQL Server instance hosting the Power BI report server database
$ReportServerDatabase = 'ReportServerPowerBI'         # Power BI report server database name
$ConnectionString = 'data source=Client01\SQL2019;initial catalog=Client_SB_1'  # New (client) connection string
$FolderLocation = '/'
$FolderPath = $FolderLocation + $FolderName
write-host "Deployment Started ..." -ForeGroundColor Yellow
$session = New-RsRestSession -ReportPortalUri $ReportServerURI
Write-RsRestCatalogItem -WebSession $session -Path $filePath -RsFolder $folderPath -Description $Description -Overwrite
$datasources = Get-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIxfileName"
$dataSources[0].DataModelDataSource.AuthType = 'Windows'
$dataSources[0].DataModelDataSource.Username = $ConnectionString
$dataSources[0].DataModelDataSource.Secret = $password
Set-RsRestItemDataSource -WebSession $session -RsItem "$folderPath/$PBIxfileName" -RsItemType PowerBIReport -DataSources $datasources
$ID = $dataSources[0].Id
$Query = " Update [DataModelDataSource] SET ConnectionString = Username From [dbo].[DataModelDataSource] Where DataSourceID ='" + $ID + "' "
Invoke-Sqlcmd -Query $Query -ServerInstance $ReportServerName -Database $ReportServerDatabase
$datasources = Get-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIxfileName"
$dataSources[0].DataModelDataSource.Username = $Username
$dataSources[0].DataModelDataSource.Secret = $password
Set-RsRestItemDataSource -WebSession $session -RsItem "$folderPath/$PBIxfileName" -RsItemType PowerBIReport -DataSources $datasources
write-host "Deployment Done . . ." -ForeGroundColor Green
This would only work if the change you need can be driven by a parameter; e.g. for a SQL Server source you can set the database, schema, or table name (but not the server name).
First I would set up the query definitions to use query parameter(s) and test. The specifics of this would depend on your data sources and scenario - you have not provided any info on that.
Then I would call the appropriate REST API Update Parameters method - probably the Group version.
https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/updateparametersingroup
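A hedged sketch of what that call might look like; the workspace and dataset IDs, the token, and the parameter name are all placeholders, and the exact request shape should be checked against the linked documentation:

```powershell
# Sketch only: IDs, token, and parameter name are placeholders.
$groupId   = "<workspace-id>"
$datasetId = "<dataset-id>"
$token     = "<access-token>"

# UpdateParameters takes a list of name/newValue pairs
$body = @{
    updateDetails = @(
        @{ name = "ServerName"; newValue = "NewServer01" }
    )
} | ConvertTo-Json -Depth 3

# Invoke-RestMethod -Method Post `
#     -Uri "https://api.powerbi.com/v1.0/myorg/groups/$groupId/datasets/$datasetId/Default.UpdateParameters" `
#     -Headers @{ Authorization = "Bearer $token" } `
#     -ContentType "application/json" -Body $body
```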
You can deploy to Power BI Report Server, and change connections and other settings, with PowerShell using the ReportingServicesTools library. As Power BI Report Server is based on SSRS, you can use the same tools to load reports, change data connections, etc.
Example of deploying a file and here
You can also change the connection settings directly in the PBIX file. If you change the extension from .pbix to .zip, you can take a look inside.
If you open the 'Connections' file, it contains the settings in a JSON-structured file:
{"Version":1,"Connections":[{"Name":"EntityDataSource","ConnectionString":"Data Source=asazure://region.asazure.windows.net/somecubegoeshere;Initial Catalog=SmartSpacesAnalysis;Cube=SmartSpacesModel","ConnectionType":"analysisServicesDatabaseLive"}]}
That can be read and changed if needed
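Since a .pbix is just a zip archive, the 'Connections' entry can be read and rewritten programmatically. A hedged sketch (the path in the usage comment is hypothetical, and this assumes the file actually contains a 'Connections' entry, as live-connection reports do):

```powershell
# Sketch: rewrite the server name inside a .pbix 'Connections' entry.
Add-Type -AssemblyName System.IO.Compression.FileSystem

function Update-PbixConnection {
    param([string]$PbixPath, [string]$OldServer, [string]$NewServer)

    $zip = [System.IO.Compression.ZipFile]::Open($PbixPath, 'Update')
    try {
        # Read the current Connections JSON
        $entry  = $zip.GetEntry('Connections')
        $reader = New-Object System.IO.StreamReader($entry.Open())
        $json   = $reader.ReadToEnd()
        $reader.Dispose()

        # Swap the server name
        $json = $json -replace [regex]::Escape($OldServer), $NewServer

        # Replace the entry with the updated JSON
        $entry.Delete()
        $writer = New-Object System.IO.StreamWriter($zip.CreateEntry('Connections').Open())
        $writer.Write($json)
        $writer.Dispose()
    } finally {
        $zip.Dispose()
    }
}

# Usage (hypothetical path):
# Update-PbixConnection -PbixPath 'C:\reports\Report_1.pbix' -OldServer 'OldServer' -NewServer 'NewServer'
```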

How to run Powershell script on local computer but with credentials of a domain user

I have to implement a solution where I have to deploy an SSIS project (xy.ispac) from one machine to another. So far I've managed to copy-cut-paste the following stuff from all around the internet:
# Variables
$ServerName = "target"
$SSISCatalog = "SSISDB" # sort of constant
$CatalogPwd = "catalog_password"
$ProjectFilePath = "D:\Projects_to_depoly\Project_1.ispac"
$ProjectName = "Project_name"
$FolderName = "Data_collector"
# Load the IntegrationServices Assembly
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")
# Store the IntegrationServices Assembly namespace to avoid typing it every time
$ISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
Write-Host "Connecting to server ..."
# Create a connection to the server
$sqlConnectionString = "Data Source=$ServerName;Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
$integrationServices = New-Object "$ISNamespace.IntegrationServices" $sqlConnection
$catalog = $integrationServices.Catalogs[$SSISCatalog]
# Create the Integration Services object if it does not exist
if (!$catalog) {
# Provision a new SSIS Catalog
Write-Host "Creating SSIS Catalog ..."
$catalog = New-Object "$ISNamespace.Catalog" ($integrationServices, $SSISCatalog, $CatalogPwd)
$catalog.Create()
}
$folder = $catalog.Folders[$FolderName]
if (!$folder)
{
#Create a folder in SSISDB
Write-Host "Creating Folder ..."
$folder = New-Object "$ISNamespace.CatalogFolder" ($catalog, $FolderName, $FolderName)
$folder.Create()
}
# Read the project file, and deploy it to the folder
Write-Host "Deploying Project ..."
[byte[]] $projectFile = [System.IO.File]::ReadAllBytes($ProjectFilePath)
$folder.DeployProject($ProjectName, $projectFile)
This seemed to be working surprisingly well on the development machine / test server pair. However, the live environment will be a bit different: the machine doing the deployment job (deployment server, or DS from now on) and the SQL Server (DB for short) the project is to be deployed to are in different domains, and since SSIS requires Windows authentication, I'm going to need to run the above code locally on DS but using the credentials of a user on the DB.
And that's the point where I fail. The only thing that worked was to start the PowerShell command line interface using runas /netonly /user:thatdomain\anuserthere powershell, enter the password, and paste the script unaltered into it. Alas, this is not an option, since there's no way to pass the password to runas (at least once with /savecred) and user interactivity is not possible anyway (the whole thing has to be automated).
I've tried the following:
Simply running the script on DS: the line $sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString would use the credentials from DS, which are not recognized by DB, and New-Object does not have a -Credential argument that I could pass credentials to.
Putting everything into an Invoke-Command with -Credential requires using -ComputerName as well. I guess it would be possible to use the local machine as 'remote' (using . as ComputerName), but it still complains about access being denied. I'm scanning through about_Remote_Troubleshooting, so far without any success.
Any hints on how to overcome this issue?
A solution might be to use a SQL user (with the right access rights) instead of an AD user.
Something like this should work.
(Check also the answer on correcting the connection string.)
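A minimal sketch of that suggestion (the login name and password are placeholders; the login would need the relevant permissions in SSISDB):

```powershell
# Sketch only: replace the placeholders with a real SQL login.
$ServerName  = "target"
$SqlUser     = "ssis_deploy"   # hypothetical SQL login
$SqlPassword = "<password>"

# SQL authentication instead of Integrated Security=SSPI
$sqlConnectionString = "Data Source=$ServerName;Initial Catalog=master;User ID=$SqlUser;Password=$SqlPassword;"
# $sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
```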