DSC problems with Credentials and build 10586 - powershell

The latest Windows 10 build pushed out an updated version of PowerShell (10586).
In addition to the change required for the certificate documented at https://dscottraynsford.wordpress.com/2015/11/15/windows-10-build-10586-powershell-problems/ I seem to have an additional problem while trying to apply the configuration:
WarningMessage An error occured while applying the partial configuration [PartialConfiguration]ExternalIntegrationConfiguration. The error message is :
Decryption failed..
Using the same certificate I can successfully create a MOF with build 10.0.10240.16384, and successfully apply it. Looking at the difference between the two MOFs, I see that the MOF created by build 10586 looks like:
instance of MSFT_Credential as $MSFT_Credential6ref
{
Password = "-----BEGIN CMS-----
\nBase64 encrypted
\n-----END CMS-----";
UserName = "SomeUser";
};
instead of what it used to look like in build 10.0.10240.16384:
instance of MSFT_Credential as $MSFT_Credential6ref
{
Password = "Base64 encrypted";
UserName = "SomeUser";
};
So the content is different. I did check whether I could decrypt the credential using Get-CmsMessage and Unprotect-CmsMessage, and I could. So the public/private key setup appears to be good.
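(For reference, the round trip I checked looked roughly like this; the thumbprint is a placeholder for the DSC encryption certificate:)
# Encrypt a test string to the DSC encryption certificate (placeholder thumbprint)
$protected = Protect-CmsMessage -To '<certificate thumbprint>' -Content 'test secret'
# Parse the CMS envelope
Get-CmsMessage -Content $protected
# Decrypt it again; this works wherever the matching private key is installed
Unprotect-CmsMessage -Content $protected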
Should there be an update to the machine that the configuration is being applied to? I don't see any new powershell build.
Any ideas would be appreciated.

Update 2015-12-18: Installing Windows Management Framework (WMF) 5.0 RTM edition that was released 2015-12-17 on the nodes being configured will resolve this error. WMF 5.0 can be downloaded here.
MS has changed the Get-EncryptedPassword function in the PSDesiredStateConfiguration module to generate the new format for the Password field in a MOF. This prevents DSC nodes from decrypting the password if WMF has not been upgraded to support it. But as MS has not released an update that allows WMF to read this new format, this should be considered a completely broken release.
I have managed to find a workaround (scripted below):
Copy the PSDesiredStateConfiguration module from a pre-10586 machine (e.g. Windows Server 2012 R2 with the latest WMF 5.0) to the PowerShell modules folder on the build 10586 machine.
E.g.
Replace the C:\Windows\System32\WindowsPowerShell\v1.0\Modules\PSDesiredStateConfiguration folder with an older version
Note: You'll need to take ownership of this folder and give yourself permission to write into it because by default only TrustedInstaller can write to this folder.
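Roughly, those steps can be scripted like this from an elevated PowerShell session (the source path is only a placeholder for wherever you copied the older module from):
# Placeholder source: an older copy of the module taken from a pre-10586 machine
$source = '\\server2012r2\c$\Windows\System32\WindowsPowerShell\v1.0\Modules\PSDesiredStateConfiguration'
$target = 'C:\Windows\System32\WindowsPowerShell\v1.0\Modules\PSDesiredStateConfiguration'
# Take ownership and grant Administrators full control (TrustedInstaller owns the folder by default)
takeown.exe /F $target /R /D Y
icacls.exe $target /grant 'Administrators:(OI)(CI)F' /T
# Swap in the older module
Remove-Item -Path "$target\*" -Recurse -Force
Copy-Item -Path "$source\*" -Destination $target -Recurse -Force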
As far as I'm concerned this version of PSDesiredStateConfiguration is completely broken and you're better off rolling it back until MS can fix it. This will also fix some other reported problems (module versions, new certificate Policy requirements).
FYI, here is the code change that alters the credential encryption:
Old code in PSDesiredStateConfiguration.psm1:
# Cast the public key correctly
$rsaProvider = [System.Security.Cryptography.RSACryptoServiceProvider]$cert.PublicKey.Key
# Convert to a byte array
$keybytes = [System.Text.Encoding]::UNICODE.GetBytes($Value)
# Add a null terminator to the byte array
$keybytes += 0
$keybytes += 0
try
{
    # Encrypt using the public key
    $encbytes = $rsaProvider.Encrypt($keybytes, $false)
    # Reverse bytes for unmanaged decryption
    [Array]::Reverse($encbytes)
    # Return a string
    [Convert]::ToBase64String($encbytes)
}
catch
{
    if($node)
    {
        $errorMessage = $LocalizedData.PasswordTooLong -f $node
    }
    else
    {
        $errorMessage = $LocalizedData.PasswordTooLong -f 'localhost'
    }
    $exception = New-Object -TypeName System.InvalidOperationException -ArgumentList $errorMessage
    Write-Error -Exception $exception -Message $errorMessage -Category InvalidOperation -ErrorId PasswordTooLong
    Update-ConfigurationErrorCount
}
New code in PSDesiredStateConfiguration.psm1:
# Encrypt using the public key
$encMsg = Protect-CmsMessage -To $CmsMessageRecipient -Content $Value
# Reverse bytes for unmanaged decryption
#[Array]::Reverse($encbytes)
#$encMsg = $encMsg -replace '-----BEGIN CMS-----',''
#$encMsg = $encMsg -replace "`n",''
#$encMsg = $encMsg -replace '-----END CMS-----',''
return $encMsg

Related

Using Powershell to update On Premise PowerBI Datasource causes Connection Test to fail

I have a script which updates PowerBI data sources on an on-premises PowerBI Report Server (the script below is abridged for brevity). The script updates the connection string in all SQL data sources from $OldServerName to $NewServerName.
The script below filters to just one report for the sake of testing. The real script loops through all reports from the root folder.
param ([Parameter(Mandatory=$true)]$ReportServerURI,
[Parameter(Mandatory=$true)]$OldServerName,
[Parameter(Mandatory=$true)]$NewServerName
);
$session = New-RsRestSession -ReportPortalUri $ReportServerURI;
# get all PowerBI reports
$powerBIs = Get-RsFolderContent -RsFolder '/MyFolder1/MyFolder2' -ReportServerUri $ReportServerURI -Recurse | Where-Object -Property "TypeName" -EQ "PowerBIReport"; #the real script starts at the root folder. I just restrict here to target one report for testing
foreach ($pbi In $powerBIs | Where-Object {$_.Name -eq "MyReport"}) #again, this restriction to one report is just for testing
{
    # get all the datasources in the report
    $rds = Get-RsRestItemDataSource -WebSession $session -RsItem $pbi.Path;
    # if data sources have been found
    if ($rds -ne $null)
    {
        # loop through all the datasources
        foreach ($d in $rds)
        {
            if ($d.ConnectionString.ToUpper().Contains($OldServerName.ToUpper()) -and $d.DataModelDataSource.Kind -eq "SQL")
            {
                $d.ConnectionString = $d.ConnectionString -replace $OldServerName, $NewServerName;
                Write-Host ("$($d.id) updated") -ForegroundColor Green;
            };
        };
    };
    Set-RsRestItemDataSource -WebSession $session -RsItem $pbi.Path -DataSources $rds -RsItemType PowerBIReport;
};
The script works: when I browse to /MyFolder1/MyFolder2/ in the web report manager, click Manage for MyReport and then go to the data sources tab, I can see that the data sources are there, the SQL data source connection strings have been updated as hoped, and the credentials are as they were before the update. However, when I click "Test Connection" I get the error
Log on failed. Ensure the user name and password are correct.
I can confirm that the connection succeeds before the update (although this is against $oldServerName).
The credentials for the SQL data sources are for a Windows user, and that Windows login exists on the SQL Server $NewServerName and is a user in the database that the data source points to.
There are also some Excel data sources for the PowerBI report which use the same Windows user's credentials and which, whilst not updated by the script, display the same behaviour (the Connection Test succeeds before the script update but fails after).
If I re-enter the credentials manually, the test then succeeds. However, when I refresh the shared schedule via Manage --> Scheduled Refresh --> Refresh Now, the refresh fails and I get the following error:
SessionID: 45944afc-c53c-4cca-a571-673c45775eab [0] -1055784932:
Credentials are required to connect to the SQL source. (Source at
OldServerName;Database.). The exception was raised by the IDbCommand
interface.
[1] -1055129594: The current operation was cancelled
because another operation in the transaction failed.
[2] -1055784932:
The command has been canceled.. The exception was raised by the
IDbCommand interface.
What am I missing? Is there something else I need to do?
I am using PowerBI Report Server version October 2020
I think the moment you execute Set-RsRestItemDataSource to modify the SQL data source connection strings, the password becomes invalid. This seems normal to me, as you don't want someone to modify the connection string and use it with someone else's credentials. So in a way this looks like a security feature and it is behaving as designed.
A possible workaround you could try is to set the credentials again:
Create a credential object with New-RsRestCredentialsByUserObject
From the docs:
This script creates a new CredentialsByUser object which can be used when updating shared/embedded data sources.
Update the SQL DataSource connection with the new credentials object
Something like this in your case might work:
$newCredentialsParams = @{
    Username = "domain\username"
    Password = "userPassword"
    WindowsCredentials = $true
}
$rds = Get-RsRestItemDataSource -WebSession $session -RsItem $pbi.Path
$rds[0].CredentialRetrieval = 'Store'
$rds[0].CredentialsByUser = New-RsRestCredentialsByUserObject @newCredentialsParams
$setDataSourceParams = @{
    WebSession = $session
    RsItem = $pbi.Path
    DataSources = $rds
    RsItemType = 'PowerBIReport'
}
Set-RsRestItemDataSource @setDataSourceParams
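To double-check that the stored credentials were applied, you could read the data source back afterwards (same cmdlet and properties as above):
# Re-read the data sources and confirm CredentialRetrieval is now 'Store'
$check = Get-RsRestItemDataSource -WebSession $session -RsItem $pbi.Path
$check | Select-Object Id, ConnectionString, CredentialRetrieval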

IIS 10 - Import SSL certificate using Powershell - "A specified logon session does not exist"

Importing a .pfx file to IIS using PowerShell is pretty straightforward thanks to guides such as this one: Use PowerShell to install SSL certificate on IIS. But I do run into an issue when trying to bind port 443 using the imported certificate:
Error: "A specified logon session does not exist. It may already have been terminated. (Exception from HRESULT: 0x80070520)".
This is due to "...If you don't already have a cer version, or you do but it includes the private key, enable Allow this certificate to be exported..." (ref. Setup of SharePoint 2013 High-Trust On-premise Add-In Developer / Production environment).
This is how it is set in the GUI
But, looking at the following line in the code which I got from dejanstojanovic.net.
pfx.Import($certPath,$certPass,"Exportable,PersistKeySet")
it is set to Exportable. Removing PersistKeySet does not make a difference. So what could be causing this?
The script does not seem to set it to Exportable the way the GUI's "Allow this certificate to be exported" option does.
...I'm all out of options...
Update
I did tweak the code a bit, using constants and such, but still same issue
$certPath = "D:\ssl\cert-export-to-iis-10.pfx"
$certPass = "password"
$pfx = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$KeyStorageFlags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable -bor [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::PersistKeySet
$pfx.Import($certPath,$certPass,$KeyStorageFlags)
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store("WebHosting","LocalMachine")
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite)
$store.Add($pfx)
$store.Close()
$store.Dispose()
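(For reference, the equivalent import using the built-in Import-PfxCertificate cmdlet, available since PowerShell 4.0, would look roughly like the following. This is only a sketch of an alternative import step, not a confirmed fix for the binding error:)
# Import the PFX into the machine's WebHosting store with the key marked exportable
$securePass = ConvertTo-SecureString -String $certPass -AsPlainText -Force
Import-PfxCertificate -FilePath $certPath -CertStoreLocation 'Cert:\LocalMachine\WebHosting' -Password $securePass -Exportable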

WriteAllBytes OutOfMemory in PowerShell

I have an API which returns a file as byte[].
I am trying to download this file onto a local machine using PowerShell (needs to be PowerShell for other automation reasons).
I am using WriteAllBytes; however, it throws an error for files larger than 100 MB (I think; the threshold might be different).
Are there any other ways to download these files and convert byte[] into an actual file?
Here is what I have at the moment:
$fileInfo = New-Object ($namespace + ".fileInfoRequest")
$fileInfo.Filename = "$($File)"
$fileInfo.Hash = "e0d123e5f316bef78bfdf5a008837577" #random hash so ignore this.
$FileDetails = $WebService.GetFileInfo($fileInfo)
if ($FileDetails.Exists -eq "True") {
[IO.File]::WriteAllBytes("$($InstallPath)\$($File)", $WebService.GetFileData($FileDetails))
} else {
Write-Host -ForegroundColor Red "File $($File.FileName) could not be found in the system"
}
$WebService.GetFileData($FileDetails) returns the file data in byte[] so this is the one that I need to manipulate somehow.
I faced the same message just this morning.
Weirdly, in my case the problem was triggered only when using a remote PowerShell session, so I can see the affinity with an API call, which also passes through the network.
In my case, the same command run from a "standard" PowerShell session opened directly on the server console did not raise the error.
I was able to avoid it by running the following in an admin PowerShell session on the server console:
set-item wsman:localhost\Shell\MaxMemoryPerShellMB 2048
After that, the remote PowerShell sessions stopped throwing OutOfMemory errors.
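A quick way to inspect the quota before and after changing it is shown below; whether the WinRM restart is strictly required for new sessions to pick up the value may vary, so it is included just in case:
# Check the current per-shell memory quota (value is in MB)
Get-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB
# Raise it, then restart WinRM so new remote sessions pick up the new value
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 2048
Restart-Service -Name WinRM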

How can I generate the same checksum as artifactory?

As art.exe (the Artifactory CLI interface) hangs whenever I call it from my build script, I am rewriting the bash script from their page on the topic in PowerShell 2.0. I'm using this code to generate my checksum:
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
$path = Resolve-Path "CatVideos.zip"
$bytes = [System.IO.File]::ReadAllBytes($path.Path)
$sha1.ComputeHash($bytes) | foreach { $hash = $hash + $_.ToString("X2") }
$wc=new-object System.Net.WebClient
$wc.Credentials= new-object System.Net.NetworkCredential("svnintegration","orange#5")
$wc.Headers.Add("X-Checksum-Deploy", "true")
$wc.Headers.Add("X-Checksum-Sha1", $hash)
The problem is it consistently produces a different checksum on my local than artifactory generates. It's not a 'corruption during transmission' error because I'm generating the checksum on my local and I've manually deployed several times.
How can I generate a checksum in the same manner as artifactory (2.6.5) so our checksums will match when I send mine and the deploy won't fail? Am I doing something obviously wrong when generating my checksum?
Thanks!
You have to use
$wc.Headers.Add("X-Checksum-Sha1", $hash.ToLower())

Private keys get deleted unexpectedly in Windows Server 2008 R2

I am facing a strange problem in developing an installer that should, as one of its steps, install a certificate.
The problem has to do with granting an account (e.g. IIS_IUSRS) access to the certificate's private key on Windows Server 2008 R2. The private keys are stored in C:\Users\All Users\Microsoft\Crypto\RSA\MachineKeys.
A custom C# Setup Project imports a certificate and grants an account access to the certificate's private key during the installation process. After some time (2-3 sec) the private key file is automatically deleted from the MachineKeys folder. Thus the installed web application cannot access the specific certificate and displays the following error message:
"System.Security.Cryptography.CryptographicException: Keyset does not exist". This error occurs only on Windows Server 2008 R2, while on Windows Server 2003 everything works correctly.
My question is: why does the private key get deleted, and which process does this?
Thx
UPDATE 17/05/2012
I have not yet found a solution to the described problem, and no response has been posted on the other forums where I asked (forums.asp.net, social.msdn.microsoft.com). So, can anyone suggest any other resources or advice for further troubleshooting this issue?
Thanks again
This was happening to me too - my setup script would add the cert and grant access to the PK file fine, and the app would work. Then later, after I had closed the PowerShell editor, I re-launched the app and it failed with a "keyset does not exist" error.
Adding the PersistKeySet flag when importing the cert fixed the problem. Here's the PowerShell code for adding the cert and private key with persistence:
param(
[string]$certStore = "LocalMachine\TrustedPeople",
[string]$filename = "sp.pfx",
[string]$password = "password",
[string]$username = "$Env:COMPUTERNAME\WebSiteUser"
)
function getKeyUniqueName($cert) {
    return $cert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
}
function getKeyFilePath($cert) {
    return "$ENV:ProgramData\Microsoft\Crypto\RSA\MachineKeys\$(getKeyUniqueName($cert))"
}
$certFromFile = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($filename, $password)
$certFromStore = Get-ChildItem "Cert:\$certStore" | Where-Object {$_.Thumbprint -eq $certFromFile.Thumbprint}
$certExistsInStore = $certFromStore.Count -gt 0
$keyExists = $certExistsInStore -and ($certFromStore.PrivateKey -ne $null) -and ((getKeyUniqueName($certFromStore)) -ne $null) -and (Test-Path(getKeyFilePath($certFromStore)))
if ((!$certExistsInStore) -or (!$keyExists)) {
    $keyFlags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::MachineKeySet
    $keyFlags = $keyFlags -bor [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::PersistKeySet
    $certFromFile.Import($filename, $password, $keyFlags)
    $store = Get-Item "Cert:\$certStore"
    $store.Open("ReadWrite")
    if ($certExistsInStore) {
        #Cert is in the store, but we have no persisted private key
        #Remove it so we can add the one we just imported with the key file
        $store.Remove($certFromStore)
    }
    $store.Add($certFromFile)
    $store.Close()
    $certFromStore = $certFromFile
    "Installed x509 certificate"
}
$pkFile = Get-Item(getKeyFilePath($certFromStore))
$pkAcl = $pkFile.GetAccessControl("Access")
$readPermission = $username,"Read","Allow"
$readAccessRule = new-object System.Security.AccessControl.FileSystemAccessRule $readPermission
$pkAcl.AddAccessRule($readAccessRule)
Set-Acl $pkFile.FullName $pkAcl
"Granted read permission on private key to web user"
It is clearly a security issue ("System.Security..."): the account performing the installation does not have permission, so you need to set the permissions on the private key to allow that service account to access it.
Edit later: Go to Start -> Run -> cmd -> type mmc -> select File -> Add/Remove Snap-in -> Certificates -> Add -> Computer Account -> Local Computer. (I attach a screenshot; it is in Spanish, but I have indicated the fields.)
Open -> Certificates -> Personal -> Certificates -> right-click the certificate -> All Tasks -> Manage Private Keys -> add Network Service.
Also check this entry to see how this feature works in Windows Server 2008. After you try it, please come back and say whether what I have told you solved the issue.
http://referencesource.microsoft.com/#System/security/system/security/cryptography/x509/x509certificate2collection.cs,256 shows where the PersistKeySet flag is tested. The PersistKeySet flag is documented at https://msdn.microsoft.com/en-us/library/system.security.cryptography.x509certificates.x509keystorageflags%28v=vs.110%29.aspx with the phrase "The key associated with a PFX file is persisted when importing a certificate." My techno-babble to English translator tells me this means "You must include the PersistKeySet flag if you call the X509Certificate2 constructor and the certificate might already be installed on the machine." This probably applies to the .Import calls too. It's likely the PowerShell Import-PfxCertificate cmdlet already does this, but if you are doing what the accepted answer shows or what the OP asked, you need to include this flag. We used a variation of ejegg's script in our solution. We have a process that runs every 3 minutes to check that all configured certs are installed, and this seems to work fine now.
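For illustration, passing the flags explicitly when loading the PFX looks roughly like this (the path and password are placeholders):
# Load the PFX so the private key is persisted into the machine key store
$flags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::MachineKeySet -bor `
         [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::PersistKeySet
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\certs\site.pfx", "password", $flags)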
The symptom we saw in PowerShell is that the HasPrivateKey property is true but the PrivateKey value itself is null, and the key file for the cert in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys was deleted. The FindPrivateKey utility at https://msdn.microsoft.com/en-us/library/aa717039(v=vs.110).aspx helped us watch the file get deleted.
So happy 4th birthday to the question with this very late response.