If I needed to give a certificate an additional permission as described below, where do I even begin in PowerShell? There must be a Windows command-line executable that I have to start with from the command line, correct? Only then could I think about writing it in a PowerShell script.
1. Click Start, type mmc in the Search programs and files box, and then press ENTER.
2. On the File menu, click Add/Remove Snap-in.
3. Under Available snap-ins, double-click Certificates.
4. Select Computer account, and then click Next.
5. Click Local computer, and then click Finish. Click OK.
6. Under Console Root > Certificates (Local Computer) > Personal, click Certificates.
7. Right-click the xxAzurehost1 certificate that was created earlier. Choose All Tasks > Manage Private Keys. Click Add and then Advanced.
8. Click Locations and choose your local computer. Click Find Now. Select NETWORK SERVICE in the search results and click OK. Click OK.
9. In the Permissions for xxxazurehost1 private keys window, select NETWORK SERVICE and grant Read permission. Click OK.
To control the ACL for the private key, all you have to do is edit the ACL of a file. The trick is finding which file.
Private keys are stored in:
%ProgramData%\Microsoft\Crypto
On XP:
C:\Documents and Settings\All Users\Application Data\Microsoft\Crypto
Under here you'll see keys organized by algorithm e.g. DSS, RSA.
The User Store will be a subfolder named with the user's SID. The Local Machine store will be in the subfolder:
MachineKeys
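For a quick look on a Vista-or-later system, you can list the machine key files directly (reading this folder may require an elevated prompt):

Get-ChildItem "$env:ProgramData\Microsoft\Crypto\RSA\MachineKeys"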
Here's a way to get the file which stores the private key information for your cert.
First go into the local machine personal certificate store:
PS> cd cert:\LocalMachine\My
Now you have to get a handle to your cert. There's more than one way to do this, here's one using the thumbprint:
$cert = dir | ? {$_.Thumbprint -eq "232820EEBF7DBFA01EE68A28BA0450671F862AE1"}
Now you can find the private key file name like this:
$fileName = $cert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
$keyFile = dir -Path "C:\Documents and Settings\All Users\Application Data\Microsoft\Crypto" -Recurse | ? {$_.Name -eq $fileName}
$keyFile will be the FileInfo object you can change the ACL on with either Set-ACL or icacls.exe
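From here, granting the read permission from the question is just a few more lines; a minimal sketch using Set-Acl, assuming the NETWORK SERVICE account from the MMC steps above and an elevated prompt:

# Build a read-access rule for NETWORK SERVICE
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("NETWORK SERVICE","Read","Allow")
# Apply it to the private key file found above
$acl = Get-Acl -Path $keyFile.FullName
$acl.AddAccessRule($rule)
Set-Acl -Path $keyFile.FullName -AclObject $acl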
I've been searching all over the internet and Stack Overflow trying to resolve this issue.
I am trying to automate a DB restore using SQL Server Agent. The SQL Server Agent job comprises four steps, three of which are T-SQL and one of which is a PowerShell script.
I have created a proxy with admin credentials so that the script can be run as admin.
cd c:;
$backuppath="Microsoft.PowerShell.Core\FileSystem::\\sharedcomputer\backup";
$destpath="c:\tmp\";
get-childitem -path $backuppath | where-object { -not $_.PSIsContainer } |
sort-object -Property CreationTime |
select-object -last 1 | copy-item -Destination (join-path $destpath "byte.BAK");
It copies the .bak file from the source shared folder and places it in to tmp folder on the target.
Whenever I run this through regular Powershell it works fine.
Whenever I try to run this from SQL server agent I get an error stating that it cannot find path.
I even tried using net use to pass credentials for the shared folder. I am thinking it has to do with the fact that the folder requires credentials.
I have also turned off password-protected sharing on the source server, but for some reason when I use Windows Explorer to locate the shared file it still asks for credentials initially. Once it's saved and cached I can then use PowerShell to cd into that folder. But none of this works when it's executed from SQL Server Agent.
I was able to finally figure this out with a little help from a Windows Server guy...
Going back to answering the question: when I created a proxy agent I used the credentials that were associated with the current domain account, i.e. Domain\Administrator.
In order for the proxy to connect to the remote server it needs to have credentials on that domain.
So what I did was create another account on both my target and source servers using the same name and password, and gave it permissions to the folders I needed.
That account was used in the proxy, and the credential was set up as .\AccountName; because the .\ prefix resolves to the local machine on each server, the proxy was able to jump back and forth between the two servers and successfully transfer the files.
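As an alternative to mirrored accounts, the script itself can authenticate to the share by mapping it with explicit credentials. A minimal sketch, assuming a hypothetical service account with rights on the share; in an unattended Agent step the password would need to come from somewhere secure rather than be hard-coded:

# Hypothetical account with rights on \\sharedcomputer\backup
$pass = ConvertTo-SecureString "P@ssw0rd" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential("DOMAIN\svcRestore", $pass)

# Map the UNC path with those credentials for the duration of the script
New-PSDrive -Name Backup -PSProvider FileSystem -Root "\\sharedcomputer\backup" -Credential $cred | Out-Null

# Copy the newest .bak to the local tmp folder
Get-ChildItem Backup:\ | Where-Object { -not $_.PSIsContainer } |
    Sort-Object -Property CreationTime |
    Select-Object -Last 1 |
    Copy-Item -Destination (Join-Path "c:\tmp" "byte.BAK")

Remove-PSDrive -Name Backup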
Hope this helps
I'm fairly new to PowerShell and am running into a problem.
I want to do the following:
Get list of permissions/users on a single folder on a different server than where I am running my PowerShell window from.
Current command failing:
Get-acl -path "\\servername\folder"
Error Message:
Get-acl : Cannot find path '\\servername\folder' because it does not exist
Does this command only work on the local machine?
It turns out that the way permissions/authentication are set up in my environment prevented my code from working.
Here are the steps I took to verify if I could connect to the server:
Test-Path \\server\folder
This returned "False", which is why my code was breaking.
The work around I used was this:
#Step 1: remotely connect to server
Enter-PSSession -ComputerName servernamegoeshere
#Step 2: get list of permissions on folder and save to csv
get-acl E:\foldernamehere |
select -expand access |
export-csv C:\Users\usernamegoeshere\Documents\listofperms.csv
#Step 3: close remote connection
Exit-PSSession
I still had to remote into the server and copy the csv to the location I wanted because again, any copy command to another server/share in PowerShell would not work due to permission/authentication issues.
This article explains authentication/permissions a bit better than I can:
http://blogs.technet.com/b/heyscriptingguy/archive/2012/11/14/enable-powershell-quot-second-hop-quot-functionality-with-credssp.aspx
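The short version of that article: the second hop can be enabled with CredSSP, which lets the remote session delegate your credentials onward; only enable it where that delegation is acceptable. A sketch:

# On the machine you run PowerShell from (client role)
Enable-WSManCredSSP -Role Client -DelegateComputer "servernamegoeshere" -Force
# On the target server
Enable-WSManCredSSP -Role Server -Force
# Then authenticate the remote session with CredSSP
Enter-PSSession -ComputerName servernamegoeshere -Authentication CredSSP -Credential (Get-Credential)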
Second way to do this with less code and not having to create a remote session thanks to user Ansgar Wiechers:
Invoke-Command -ComputerName server -ScriptBlock {get-acl E:\folder |
select -expand access } |
export-csv \\server\folder\accesslist.csv
With PowerShell there are many ways to do one thing... I think this way is the simplest! Thanks!
The command works on UNC paths as well, but UNC paths are slightly different from local paths. You need an access point to enter the file system of a remote host. For SMB/CIFS access (via UNC paths) that access point is a shared folder, so you need a path \\server\share or \\server\share\path\to\subfolder.
With an admin account you could use the administrative shares (e.g. \\server\C$\Users\Administrator), otherwise you need to create a share first.
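So, assuming an account with admin rights on the remote machine, the original one-liner works against an administrative share (server name and path are placeholders):

Get-Acl -Path "\\servername\C$\path\to\folder" | Select-Object -ExpandProperty Access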
I'm trying to develop my own Boxstarter script for spinning up new machines. I just realized that I'd really like to add a line that will change the default applications used to open certain file types. For example, I want to open .txt files with Notepad++. I know how to do this by right-clicking the file and checking its properties, but is there a line I can add to my Boxstarter script that will do it? Or, since Boxstarter is basically a special set of PowerShell commands, is there a PowerShell command I can invoke directly to change the "opens with" property? I did some searching, and most of the results were about how to get PowerShell to open something, not change the "opens with" property. The rest were all about how to open PowerShell.
Another similar, but not quite the same, way to go about this is to change the file association you want to associate with a particular application. Chocolatey includes some helper commands to do this, and they are therefore available to your Boxstarter package. Here is an excerpt from one of my Boxstarter packages:
Install-ChocolateyFileAssociation ".txt" "$env:programfiles\Sublime Text 2\sublime_text.exe"
Install-ChocolateyFileAssociation ".dll" "$($Boxstarter.programFiles86)\jetbrains\dotpeek\v1.1\Bin\dotpeek32.exe"
So now double-clicking any text file opens Sublime, and any DLL opens dotPeek.
But I agree. It's still helpful to be able to add to the "Open With..." list.
Thanks to @Raf for pointing me in the right direction. Here's the code to change the "opens with" property of .txt files:
# Current user, e.g. COMPUTER\username
$principal = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
# Open the UserChoice key with the right to change its permissions
$key = [Microsoft.Win32.Registry]::CurrentUser.OpenSubKey("Software\Microsoft\Windows\CurrentVersion\Explorer\FileExts\.txt\UserChoice",[Microsoft.Win32.RegistryKeyPermissionCheck]::ReadWriteSubTree,[System.Security.AccessControl.RegistryRights]::ChangePermissions)
$acl = $key.GetAccessControl()
# Remove the Deny-SetValue rule that Windows places on this key for the current user
$right = "SetValue"
$denyrule = New-Object System.Security.AccessControl.RegistryAccessRule($principal,$right,"Deny")
$ret = $acl.RemoveAccessRule($denyrule)
$ret = $key.SetAccessControl($acl)
# With the deny rule gone, point .txt at Notepad++
Set-ItemProperty -Path HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\FileExts\.txt\UserChoice -Name ProgId -Value Applications\notepad++.exe
Slightly modified from an answer in the TechNet forums.
I haven't figured out whether there's a Boxstarter shortcut for this, but changing the ACL rules was the key. Without that, you don't have the access needed to change this particular registry item. Even when I ran PowerShell as Admin and made sure I had all the right permissions on the UserChoice key (both the administrator account and my user account had Full Control), I kept getting an error that "Requested registry access is not allowed".
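For what it's worth, you can verify the change took effect by reading the value back (a quick check, not part of the original TechNet answer):

(Get-ItemProperty HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\FileExts\.txt\UserChoice).ProgId
# should print Applications\notepad++.exe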
I am writing a Powershell script to automate the setting up of a Windows 2008 R2 server and one thing that is required is the importing of several certificates into different stores. After doing some research on how best to achieve this, I found that Importpfx.exe was the best choice for what I am aiming to do, which is import one .pfx file into the Trusted People store and another .pfx file into the Personal store, both for the Computer account. I then also need to Manage Private keys on the certificate imported into the Personal store once it has been imported.
At first, I thought that Importpfx.exe was doing this correctly, but after researching how to manage the private keys via PowerShell, I learned that this can be done by editing the ACL for the file that corresponds to the imported certificate, which should be found in "C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys". This is where I started to notice that something wasn't quite right with the imported certificate. After searching this folder for a new file after importing the certificates, I noticed that no new files had been added to this folder.
I searched the entire C drive for all files sorted by date modified and found that new files had been added to the folder "C:\Users\'user'\AppData\Roaming\Microsoft\Crypto\RSA\S-1-5-21-2545654756-3424728124-1046164030-4917" instead of the expected folder. While I was able to manually manage private keys for the certificate via the certificate store (as I was the user who imported it), no other users were able to log onto the machine and manage the private keys, getting the error message "Cannot find the certificate and private key for decryption" (which would make sense given the folder that the corresponding file exists in).
I use a function to get the thumbprint of the certificates before trying to import the .pfx file. The code I have used to run is:
function GetCertificateThumbprint ( [string]$certPreFix, [string]$certPassword, [string]$certFolder, [string]$domain, [bool]$addIfNotFound, [hashtable]$return)
{
    $storePath = "cert:\LocalMachine"
    $storeDir = "My"
    $storeName = [System.Security.Cryptography.X509Certificates.StoreName]::My
    if($certPreFix -eq "XXX")
    {
        $storeDir = "TrustedPeople"
        $storeName = [System.Security.Cryptography.X509Certificates.StoreName]::TrustedPeople
    }
    $storePath = [System.IO.Path]::Combine($storePath, $storeDir)

    # Build the certificate file name and get the file
    $certFileName = $certPreFix + "." + $domain + ".*"
    $certFile = Get-ChildItem -Path $certFolder -Include $certFileName -Recurse
    if ($certFile)
    {
        # The certificate file exists so get the thumbprint
        $Certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($certFile, $certPassword)
        $certThumbprint = $Certificate.Thumbprint
        if($addIfNotFound)
        {
            # Check for the certificate's thumbprint in store and add if it does not exist already
            if(-not(Get-ChildItem $storePath | Where-Object {$_.Thumbprint -eq $certThumbprint}))
            {
                Set-Location "$Env:windir\Tools"
                .\importpfx.exe -f $certFile -p $certPassword -t MACHINE -s $storeDir
            }
        }
    }
}
Can anyone see if I have done anything wrong? Has anyone come across this issue and got around it somehow? This is causing me issues as I cannot automate the Manage Private keys task properly!
I just ran into the same problem. You must specify the MachineKeySet X509KeyStorageFlag when creating the certificate object:
New-Object system.Security.Cryptography.X509Certificates.X509Certificate2($certFile, $certPassword, "PersistKeySet,MachineKeySet")
Hopefully that helps someone.
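As an aside, once the flag is in place the certificate can also be added to the machine store straight from .NET, skipping importpfx.exe entirely; a minimal sketch (variable names as in the question):

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($certFile, $certPassword, "PersistKeySet,MachineKeySet")
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store("My","LocalMachine")
$store.Open("ReadWrite")   # requires an elevated prompt
$store.Add($cert)
$store.Close()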
I am using Windows 7 and want to run signed scripts from PowerShell. The security settings of PowerShell are set to "AllSigned", and my scripts are signed with a valid certificate from my company. I have also added the .pfx file to my local certificate store (right-clicked the pfx file and installed it).
However, when I start a signed script, I get a message that says:
"Do you want to run software from this untrusted publisher?
File Z:\Powershell Signed Scripts\signed.ps1 is published by CN=[MyCompanyName] and is not trusted on your system. Only run scripts from trusted publishers.
[V] Never run [D] Do not run [R] Run once [A] Always run [?] Help (default is "D"):"
Since I want to automatically call these scripts on my systems, I would like to add my imported certificate to the trusted list on my system, so that I do not get a message anymore when I run a signed script for the first time. How can I make my certificate a trusted one?
How to trust a certificate in Windows Powershell
Indeed, you can do this without any mmc :)
First, check the location of your personal certificate, named for example "Power":
Get-ChildItem -Recurse cert:\CurrentUser\ | where {$_ -match "Power"} | Select PSParentPath,Subject,Issuer,HasPrivateKey | ft -AutoSize
(This one should be empty:)
gci cert:\CurrentUser\TrustedPublisher
Build the command with the path to your certificate:
$cert = Get-ChildItem Certificate::CurrentUser\My\ABLALAH
Next, work on the certificate store (here I work with two certificate stores: user & computer):
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store "TrustedPublisher","LocalMachine"
$store.Open("ReadWrite")
$store.Add($cert)
$store.Close()
Check, you should find your certificate:
ls cert:\CurrentUser\TrustedPublisher
Sounds like you need to verify that the script is signed properly and that you have the correct certificate installed in the correct certificate store.
Use the Get-AuthenticodeSignature cmdlet to get information about the signed script.
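For example, against the script from the question:

Get-AuthenticodeSignature "Z:\Powershell Signed Scripts\signed.ps1" | Format-List
# Status should read Valid; UnknownError usually means the signing certificate is not trusted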
Also review Scott's guide for signing certificates.