Read encrypted connection string in Machine.config from PowerShell

Is it possible to read an encrypted connection string in Machine.config from a PowerShell script?
For security reasons, we are trying to move a hardcoded connection string out of the PowerShell script and into Machine.config.
Update: the PowerShell script is supposed to read the connection string from Machine.config (encrypted through aspnet_regiis) and connect to the DB.

I'm not sure where PowerShell figures into your question. If you use a built-in tool like aspnet_regiis.exe to encrypt a configuration section, then your IIS site is unaware that the section is encrypted.
I'm not sure how putting something in machine.config is going to make an app more secure, since anything that can read it transparently can do so for any app on that machine.
The way .NET configs work is you have some.exe and it has a some.exe.config file. When you deploy some.exe you can (should?) include a step to encrypt sensitive sections. The app would still just read the data, unaware that it's been encrypted.
But again, it is the app, not PowerShell, that reads the encrypted value.
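That said, PowerShell can use the same .NET configuration API, which decrypts a protected section transparently when it is read. A minimal sketch, assuming the connectionStrings section of machine.config was encrypted with aspnet_regiis on this same machine and that an entry named 'MyDb' (a placeholder) exists:
# Reading machine.config; decryption happens inside the API, so the script
# never handles the encrypted form. Must run on the machine that encrypted it.
Add-Type -AssemblyName System.Configuration
$machineConfig = [System.Configuration.ConfigurationManager]::OpenMachineConfiguration()
$section = $machineConfig.GetSection('connectionStrings')
# 'MyDb' is a placeholder; use the name attribute of your own entry
$connectionString = $section.ConnectionStrings['MyDb'].ConnectionString
Write-Host $connectionString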
Here's a section of a script I use that encrypts/decrypts sections. Use it carefully, since data encrypted this way can only be decrypted on the same machine (and likely the same OS). YMMV
# Note: $path points at the target executable; it is defined elsewhere in the
# full script (the next answer shows one way to set it).
$appConfig = "your.exe"
$sectionName = "appSettings"
$dataProtectionProvider = "DataProtectionConfigurationProvider"
if (-not (Test-Path $path)) { throw "Unable to find $($appConfig) $($path)" }
$configuration = [System.Configuration.ConfigurationManager]::OpenExeConfiguration($path)
$section = $configuration.GetSection($sectionName)
if (-not $section.SectionInformation.IsProtected) {
    $section.SectionInformation.ProtectSection($dataProtectionProvider)
    $section.SectionInformation.ForceSave = $true
    $configuration.Save([System.Configuration.ConfigurationSaveMode]::Full)
    Write-Information -Message "$($sectionName) in $($appConfig) has been protected from casual users"
}
else {
    Write-Information -Message "$($sectionName) in $($appConfig) is already protected"
    $section.SectionInformation.UnprotectSection()
    $section.SectionInformation.ForceSave = $true
    $configuration.Save([System.Configuration.ConfigurationSaveMode]::Full)
    Write-Information -Message "$($sectionName) in $($appConfig) protection removed"
}
This is a snippet of a full script to encrypt/decrypt config sections.

The previous answer was largely correct; I did have to make a couple of minor changes to get it working on my system with respect to the 'path' and 'appConfig' variables. For this script to work, I am placing the PowerShell script in the same folder as the exe and config file I'm working with.
#This script will encrypt/decrypt the section specified in the sectionName variable of a .net configuration file
#using the DataProtectionConfigurationProvider method of encryption/decryption.
#Place this file in the same folder as the executable/config file and run it, be sure to update the 'appConfig'
#and 'sectionName' variables accordingly.
$path = Get-Location
$appConfig = "ServionDataRecovery.exe"
$sectionName = "connectionStrings"
$dataProtectionProvider = "DataProtectionConfigurationProvider"
if (-not (Test-Path $appConfig) ) { throw "Unable to find $($appConfig) $($path)" }
$configuration = [System.Configuration.ConfigurationManager]::OpenExeConfiguration("$path\$appConfig")
$section = $configuration.GetSection($sectionName)
if (-not $section.SectionInformation.IsProtected) {
    $section.SectionInformation.ProtectSection($dataProtectionProvider)
    $section.SectionInformation.ForceSave = $true
    $configuration.Save([System.Configuration.ConfigurationSaveMode]::Full)
    Write-Host "$($sectionName) in $($appConfig) has been protected from casual users"
}
else {
    Write-Host "$($sectionName) in $($appConfig) is already protected"
    $section.SectionInformation.UnprotectSection()
    $section.SectionInformation.ForceSave = $true
    $configuration.Save([System.Configuration.ConfigurationSaveMode]::Full)
    Write-Host "$($sectionName) in $($appConfig) protection removed"
}

Related

Provider cannot be found when trying to open/access mdb

I'm writing a PowerShell 5.1 script to query info from an .mdb file, based on this article. I get this error message when I try to open/access an .mdb file:
MDB Open next
Provider cannot be found. It may not be properly installed.
I'm pretty sure I need to adjust my connection info according to what I have installed. This is what I'm doing:
$pathViewBase = 'C:\Data\EndToEnd_view\' # View dir.
$XML_MDB_Dirs = @('\AppText.mdb') # more files later
foreach ($mdbFile in $XML_MDB_Dirs)
{
    $pathToMdb = Join-Path -Path $pathViewBase -ChildPath $mdbFile
    if (Test-Path $pathToMdb)
    {
        $cn = New-Object -ComObject ADODB.Connection
        $rs = New-Object -ComObject ADODB.Recordset
        Write-Host "MDB Open next" -ForegroundColor DarkCyan
        $cn.Open("Provider = Microsoft.Jet.OLEDB.4.0;Data Source = $pathToMdb") ### error on this line
        Write-Host "Open done" -ForegroundColor DarkCyan
        # ADO constants; these must be defined earlier ($adOpenStatic = 3, $adLockOptimistic = 3)
        $rs.Open("SELECT TOP 1 [TableName].[Message Description],
                  [TableName].[Column1]
                  FROM [TableName]
                  WHERE [TableName].[Message Description] = 'ERROR'",
                 $cn, $adOpenStatic, $adLockOptimistic)
        $rs.MoveFirst()
        Write-Host "Message value obtained for ERROR: " $rs.Fields.Item("Name").Value
        Break
    } # Test-Path
} # foreach
I found this regarding ODBC connections, and it seems to say I need to adjust my connection info. Looking at what's installed on my computer, I see this, but I'm a little unclear on what to change my open code to use. Would I need to replace 'Microsoft.Jet.OLEDB.4.0' with 'SQLNCLI11.dll'?
Update:
I checked,
(Get-WmiObject Win32_OperatingSystem).OSArchitecture
and I am running 64-bit PowerShell, so since I installed the accessdatabaseengine_x64.exe Access Database Engine per @Mathias R. Jessen below (and rebooted), that should be correct. But I still get the same error. I'm not sure if there's something I could check to see if it's installed correctly, or if I need to use SQLNCLI11.0 as the provider instead of Microsoft.Jet.OLEDB.4.0, or if I need to add "using" at the top? Or do I need to check if a PowerShell module is installed?
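One avenue worth checking (a hedged suggestion, not a confirmed fix): the Jet 4.0 OLE DB provider only ships as 32-bit, so a 64-bit PowerShell process cannot load it. With the 64-bit Access Database Engine installed, the matching provider name is Microsoft.ACE.OLEDB.12.0. A sketch that lists the providers the current process can actually see and then tries the ACE provider (reusing $pathToMdb from the question):
# List the OLE DB providers visible to this (64-bit) process
(New-Object System.Data.OleDb.OleDbEnumerator).GetElements() |
    Select-Object SOURCES_NAME, SOURCES_DESCRIPTION
# If Microsoft.ACE.OLEDB.12.0 appears in that list, try it in place of Jet
$cn = New-Object -ComObject ADODB.Connection
$cn.Open("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=$pathToMdb")
(SQLNCLI11 is the SQL Server Native Client OLE DB provider, so it would not help with an .mdb file.)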

PowerShell - Download the latest FTP files from FTP server [duplicate]

I am working on a PowerShell script which will pull files from an FTP site. The files are uploaded to the FTP site every hour, so I need to download the most recent one. The code I currently have downloads all of today's files instead of just one. How do I make it download only the most recent file?
Here is the code that I am currently using:
$ftpPath = 'ftp://***.***.*.*'
$ftpUser = '******'
$ftpPass = '******'
$localPath = 'C:\Temp'
$Date = get-date -Format "ddMMyyyy"
$Files = 'File1', 'File2'
function Get-FtpDir ($url, $credentials)
{
    $request = [Net.FtpWebRequest]::Create($url)
    if ($credentials) { $request.Credentials = $credentials }
    $request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
    (New-Object IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd() -split "`r`n"
}
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($ftpUser, $ftpPass)
$webclient.BaseAddress = $ftpPath
foreach ($item in $Files)
{
    Get-FtpDir $ftpPath $webclient.Credentials |
        ? { $_ -like ($item + $Date + '*') } |
        % {
            $webClient.DownloadFile($_, (Join-Path $localPath $_))
        }
}
It's not easy with the FtpWebRequest. For your task, you need to know file timestamps.
Unfortunately, there's no really reliable and efficient way to retrieve timestamps using the features offered by FtpWebRequest/.NET Framework/PowerShell, as they do not support the FTP MLSD command. The MLSD command provides a listing of a remote directory in a standardized machine-readable format. Both the command and the format are standardized by RFC 3659.
Alternatives supported by the .NET Framework:
ListDirectoryDetails method (an FTP LIST command) to retrieve details of all files in a directory, and then deal with the FTP-server-specific format of the details (a *nix format similar to the *nix ls command is the most common; a drawback is that the format may change over time, as for newer files the "May 8 17:48" format is used and for older files the "Oct 18 2009" format is used)
GetDateTimestamp method (an FTP MDTM command) to individually retrieve timestamps for each file. The advantage is that the response is standardized by RFC 3659 to YYYYMMDDHHMMSS[.sss]. The disadvantage is that you have to send a separate request for each file, which can be quite inefficient (a minimal sketch follows the references below).
Some references:
C# class to parse WebRequestMethods.Ftp.ListDirectoryDetails FTP response
Parsing FtpWebRequest ListDirectoryDetails line
Retrieving creation date of file (FTP)
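As promised above, a minimal sketch of the MDTM approach with FtpWebRequest; the host, credentials, and file name are placeholders:
# Retrieve one file's timestamp via the FTP MDTM command
$request = [System.Net.FtpWebRequest]::Create("ftp://example.com/remote/path/file1.txt")
$request.Credentials = New-Object System.Net.NetworkCredential("user", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $request.GetResponse()
$response.LastModified   # parsed from the standardized YYYYMMDDHHMMSS reply
$response.Close()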
Alternatively, use a 3rd-party FTP library that supports the MLSD command and/or parsing of the proprietary listing format.
For example, the WinSCP .NET assembly supports both.
Example code:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "example.com"
    UserName = "user"
    Password = "mypassword"
}
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
# Remote source directory and local target (adjust to your environment)
$remotePath = "/remote/path/"
$localPath = "C:\local\path\"
# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)
# Select the most recent file
$latest =
    $directoryInfo.Files |
    Where-Object { -not $_.IsDirectory } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
# Any file at all?
if ($null -eq $latest)
{
    Write-Host "No file found"
    exit 1
}
# Download the selected file
$sourcePath = [WinSCP.RemotePath]::EscapeFileMask($remotePath + $latest.Name)
$session.GetFiles($sourcePath, $localPath).Check()
For the full code, see Downloading the most recent file (PowerShell).
(I'm the author of WinSCP)
I tried this, but I get an error:
Error: Exception calling "ListDirectory" with "1" argument(s): "Error listing directory '/path/'.
Could not retrieve directory listing
Can't open data connection for transfer of "/path/"
I read a lot about this problem on the internet, but could not find a solution which seemed fairly simple, and I am not a network setup wizard. So I chose a different approach. In our case, the filename of the file I want to automate the download for has the date specified in it: backup_2018_08_03_020003_1048387.bak
So we can get the file by using mget *2018_08_03* in a command-line ftp session.
Our backup procedure runs every morning at 01:00 AM, so we have a backup each day that we can fetch.
Of course it would have been prettier and nicer to have a script that fetched the latest backup file based on the backup file timestamps, just in case something went wrong with the latest backup or the backup file naming format changes. The script is just a script to fetch the backup for internal development purposes, so it's not a big deal if it breaks. I will look into this later and check whether I can make a cleaner solution.
I made a batch script which just asks for today's backup file with ordinary ftp command prompt scripting.
It is important to get the formatting of today's date right. It must match the formatting of the date in the filename correctly.
If you want to use the script, you should replace the variables with your own information. You should also have write access to the directory you run it from.
This is the script that I made:
@Echo Off
Set _FTPServerName=xxx.xxx.xx.xxx
Set _UserName=Username
Set _Password=Password
Set _LocalFolder=C:\Temp
Set _RemoteFolder="/path/"
:: Note: the %date% substring offsets below depend on the system's locale date format
Set _Filename=*%date:~-4,4%_%date:~-7,2%_%date:~-10,2%*
Set _ScriptFile=ftptempscript
:: Create script
>"%_ScriptFile%" Echo open %_FTPServerName%
>>"%_ScriptFile%" Echo %_UserName%
>>"%_ScriptFile%" Echo %_Password%
>>"%_ScriptFile%" Echo lcd %_LocalFolder%
>>"%_ScriptFile%" Echo cd %_RemoteFolder%
>>"%_ScriptFile%" Echo binary
>>"%_ScriptFile%" Echo mget -i %_Filename%
>>"%_ScriptFile%" Echo quit
:: Run script
ftp -s:"%_ScriptFile%"
del "%_ScriptFile%"

Powershell: Brute-forcing password-protected .zip file (speeding up the process)

First-time questioner, so here's hoping I'm doing it right. :)
A co-worker and I have been playing around with PowerShell, getting the lay of the land and seeing what you can do with it. Using info we found online (mostly here), we've managed to whip together a script to brute-force a password-protected .zip file using a .txt file containing a list of passwords:
# Stopwatch for measurement
$stopWatch = [System.Diagnostics.Stopwatch]::StartNew()
$7zipExec = """-7z.exe (7zip) location-"""
# Renamed from $input, which is a reserved automatic variable in PowerShell
$zipFile = """-.zip location-"""
$output = """-where to drop contents of .zip file-"""
$passwordfile = "-location of .txt file containing passwords-"
$windowStyle = "Hidden"
[long] $counter = 0
# Correct password is 12341234
foreach ($password in (Get-Content $passwordfile)) {
    $counter++
    Write-Host -NoNewline "Attempt #$($counter): $password"
    $arguments = "x -o$output -p$password -aoa $zipFile"
    $p = Start-Process $7zipExec -ArgumentList $arguments -Wait -PassThru -WindowStyle $windowStyle
    if ($p.ExitCode -eq 0) {
        # Password OK
        Write-Host " ...OK!"
        Write-Host ""
        Write-Host "Password is $password, found it after $counter tries."
        break
    }
    elseif ($p.ExitCode -eq 2) {
        # Wrong password
        Write-Host " ...wrong"
    }
    else {
        # Unknown
        Write-Host " ...ERROR"
    }
}
# Halt the stopwatch and display the time spent for this process
$stopWatch.Stop()
Write-Host
Write-Host "Done in $($stopWatch.Elapsed.Hours) hour(s), $($stopWatch.Elapsed.Minutes) minute(s) and $($stopWatch.Elapsed.Seconds) second(s)"
Read-Host -Prompt "Press Enter to exit"
It actually works! Probably not as clean as it could be, but we've managed to reach our goal of making a functioning script.
However! It takes about 1 second for each password try, and if you have a file with, say, the 10,000 most common passwords...that could take a while.
So now we're trying to figure out how to speed up the process, but we've hit a wall and need help. I'm not asking for someone to get 'er done, but I would really appreciate some tips/tricks/hints for someone who has only recently started getting into PowerShell (and loving it so far!).
Took a while to get back to this, real life and all that, but while I did not manage to speed up the script, I did manage to speed up the process.
What I do now is run 4 instances of the script simultaneously (using an extra PS script to start them, which itself can be started with a batch file).
All of them have their own lists of passwords, and their own output directory (I found that when they use the same location, the file extracted by the script that found the password becomes unusable).
This way, it takes about 7-8 hours to attempt 100,000 of the most commonly used passwords! While I'm sure there are quicker scripts/programs out there, I'm pretty happy with the result.
Thanks all for the input!
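For anyone who wants to reproduce that setup, here is a rough sketch of the launcher idea under stated assumptions: BruteForce.ps1 stands in for a parameterized version of the script above, and its parameters and all paths are hypothetical placeholders.
# Hypothetical launcher: split one password list into four chunks and start the
# brute-force script once per chunk, each with its own list and output directory.
$passwords = Get-Content 'C:\lists\passwords.txt'
$chunkSize = [math]::Ceiling($passwords.Count / 4)
for ($i = 0; $i -lt 4; $i++) {
    $chunkFile = "C:\lists\chunk$i.txt"
    $passwords | Select-Object -Skip ($i * $chunkSize) -First $chunkSize |
        Set-Content $chunkFile
    # -PasswordFile and -OutputDir are assumed parameters of the adapted script
    Start-Process powershell -ArgumentList @(
        '-File', 'C:\scripts\BruteForce.ps1',
        '-PasswordFile', $chunkFile,
        '-OutputDir', "C:\out\run$i"
    )
}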

Powershell Script to Install Certificate Into Active Directory Store

I'm trying to write a PowerShell script to install a certificate into the Active Directory certificate store.
Here are the steps to do this manually; any help would be greatly appreciated.
On a Windows 2008R2 domain controller,
Click Start -> Run
type MMC
click ok
Click File -> Add/Remove Snap-In
Select "Certificates" -> Add
Select "Service Account"
Click Next
Select "Local Computer"
Click Next
Select "Active Directory Domain Services"
Click Finish
Click Ok
I want the script to install the certificate into:
NTDS\Personal
I would post an image but I don't have enough "reputation" apparently, so I can only provide text instructions.
So basically what I've tried is this: I've used the PowerShell function below to import a certificate into the Local Machine -> Personal store, which is where most certificates go, and the code works.
But I need to install the certificate into the "NTDS\Personal" store on a domain controller, and $certRootStore only accepts LocalMachine or CurrentUser, so I'm stuck : /
function Import-PfxCertificate
{
    param
    (
        [String]$certPath,
        [String]$certRootStore = "localmachine",
        [String]$certStore = "My",
        $pfxPass = $null
    )
    $pfx = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
    if ($null -eq $pfxPass)
    {
        $pfxPass = Read-Host "Password" -AsSecureString
    }
    $pfx.Import($certPath, $pfxPass, "Exportable,PersistKeySet")
    $store = New-Object System.Security.Cryptography.X509Certificates.X509Store($certStore, $certRootStore)
    $store.Open("MaxAllowed")
    $store.Add($pfx)
    $store.Close()
}
Import-PfxCertificate -certPath "d:\Certificate.pfx"
Regards Alex
Using a combination of what you already had above and the registry keys for the two certificate stores, this works.
The only other thing is that I don't know how NTDS determines which certificate to use when there are multiple in the certificate store.
function Import-NTDSCertificate {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [string]$PFXFile,
        [Parameter(Mandatory)]
        [string]$PFXPassword,
        # Remove certificate from LocalMachine\Personal certificate store
        [switch]$Cleanup
    )
    begin {
        Write-Verbose -Message "Importing PFX file."
        $PFXObject = New-Object -TypeName System.Security.Cryptography.X509Certificates.X509Certificate2
        $PFXObject.Import($PFXFile, $PFXPassword, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
        $thumbprint = $PFXObject.Thumbprint
    }
    process {
        Write-Verbose -Message "Importing certificate into LocalMachine\Personal"
        $certificateStore = New-Object -TypeName System.Security.Cryptography.X509Certificates.X509Store('My','LocalMachine')
        $certificateStore.Open('MaxAllowed')
        $certificateStore.Add($PFXObject)
        $certificateStore.Close()
        Write-Verbose -Message "Copying certificate from LocalMachine\Personal to NTDS\Personal"
        $copyParameters = @{
            'Path'        = "HKLM:\SOFTWARE\Microsoft\SystemCertificates\MY\Certificates\$thumbprint"
            'Destination' = "HKLM:\SOFTWARE\Microsoft\Cryptography\Services\NTDS\SystemCertificates\MY\Certificates\$thumbprint"
            'Recurse'     = $true
        }
        Copy-Item @copyParameters
    }
    end {
        if ($Cleanup) {
            Write-Verbose -Message "Removing certificate from LocalMachine\Personal"
            $removalParameters = @{
                'Path'    = "HKLM:\SOFTWARE\Microsoft\SystemCertificates\MY\Certificates\$thumbprint"
                'Recurse' = $true
            }
            Remove-Item @removalParameters
        }
    }
}
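A usage sketch (the PFX path and password here are placeholders):
Import-NTDSCertificate -PFXFile 'D:\Certificate.pfx' -PFXPassword 'placeholder' -Cleanup -Verbose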
Alright, first the bad news. The only managed certificate stores are LocalMachine and CurrentUser, as we have all seen in PowerShell.
Now, the not-so-bad news. We know that the 'physical' location store (physical is MS' word, not mine) exists in the registry on the ADDS server, HKLM\Software\Microsoft\Cryptography\Services\NTDS\SystemCertificates. This was verified in two ways:
1. Using procmon while importing a certificate into the store using the MMC snap-in
2. Scavenging MSDN for this nugget
The link in #2 shows that all physical stores for services are stored in the path mentioned above, substituting the real service name (not the display name) for NTDS.
However, because of the bad news, trying to map it in PowerShell with that registry key as the root and -PSProvider Certificate will prove disappointing; it was the first thing I tried.
What one can try is using the X509Store constructor that takes an IntPtr to a SystemStore, as described here. Yes, that involves some unmanaged code, and mixing the two is something I rarely do, but this and googling for HCERTSTORE C# should get you there.
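A hedged sketch of that unmanaged route, under stated assumptions: the constants follow wincrypt.h conventions and 'NTDS\MY' is the service store name implied above; verify both against the HCERTSTORE documentation before relying on this.
# Sketch: open the NTDS\MY service store via CertOpenStore, then wrap the raw
# handle in the X509Store(IntPtr) constructor. Constants per wincrypt.h.
Add-Type -Namespace Win32 -Name Crypt32 -MemberDefinition @'
[DllImport("crypt32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
public static extern IntPtr CertOpenStore(IntPtr lpszStoreProvider,
    uint dwMsgAndCertEncodingType, IntPtr hCryptProv, uint dwFlags, string pvPara);
'@
$CERT_STORE_PROV_SYSTEM_W   = [IntPtr]10   # open a system store by name (Unicode)
$CERT_SYSTEM_STORE_SERVICES = 0x00050000   # 5 shifted left 16: the Services location
$handle = [Win32.Crypt32]::CertOpenStore($CERT_STORE_PROV_SYSTEM_W, 0,
    [IntPtr]::Zero, $CERT_SYSTEM_STORE_SERVICES, 'NTDS\MY')
if ($handle -eq [IntPtr]::Zero) { throw "CertOpenStore failed" }
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store($handle)
$store.Certificates | Select-Object Subject, Thumbprint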
Even though this post is years old, it is still helpful and turns up in searches, so to address the question of "I don't know how NTDS determines which certificate to use when there are multiple in the certificate store": the answer is that you will get unreliable results when two or more valid certificates that meet the requested criteria are installed, so it is recommended to remove the old/unneeded certificate(s) and leave just the newest/best one for server auth.