Update multiple Certificate friendly names using PowerShell

I am fairly new to PowerShell and I am currently updating a large list of certificate friendly names remotely using PowerShell.
The script below works fine when there is a single certificate in the store, but it fails when there are multiple certificates, because I need to add a loop. When I try to add a loop it doesn't seem to work. Could someone help or point me in the right direction, please?
Enter-PSSession -ComputerName Servername
Get-ChildItem -Path Cert:\LocalMachine\My
$CertStore = "Cert:\LocalMachine\My\"
$FriendlyName = 'Examplename'
$cert = Get-ChildItem $CertStore
$cert.FriendlyName = $FriendlyName
Thanks for any help.

Just add a ForEach-Object loop to the script.
Something like below:
$CertStore = "Cert:\LocalMachine\My\"
$FriendlyName = 'Examplename'
Get-ChildItem $CertStore | ForEach-Object { $_.FriendlyName = $FriendlyName }
This updates the friendly name on every certificate in the store.
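Note that Enter-PSSession is designed for interactive use and won't do what you expect inside a script. For a non-interactive remote run, a minimal sketch using Invoke-Command could look like this (Servername and the friendly name are placeholders from the question):
Invoke-Command -ComputerName Servername -ScriptBlock {
    # Placeholder value taken from the question
    $FriendlyName = 'Examplename'
    # Rename every certificate in the remote machine's personal store
    Get-ChildItem "Cert:\LocalMachine\My\" | ForEach-Object {
        $_.FriendlyName = $FriendlyName
    }
}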

Related

Remove Expired Certificates with PowerShell

I have a simple script to show all certificates on a server, and I would like to expand it to remove all expired certificates.
I have tried several scripts from MS and third parties to find and remove certs, but have had no luck getting them to work properly.
The first piece of code I am using is:
Get-ChildItem Cert:\ -Recurse
This PowerShell command shows all certificates on a server.
Example output is below for each certificate. I want to target the NotAfter field and have the script remove the certificate if it is older than today's date.
Subject:
Issuer:
Thumbprint:
FriendlyName:
NotBefore:
NotAfter:
Extensions:
I would also like to do this for a list of servers: have the script run on each server in a text document, query all certificates, remove the expired certs, and move on to the next server.
I have seen some code targeting the date, like the following:
ForEach-Object -begin { $now = get-date } -process { if ($PSItem.NotAfter -lt $now ) { $PSItem } } | Remove-Item
I would like the script to go out and query a server's certificates, then delete the expired ones.
What you are after is this; you were close in your logic, the execution was just a bit off.
$ListOfServers = Get-Content "c:\temp\serv.txt"
Foreach ($Server in $ListOfServers) {
    Invoke-Command -ComputerName $Server -ScriptBlock {
        # Get the certificate list and assign it to a variable
        $Certs = Get-ChildItem "Cert:\LocalMachine\My" -Recurse
        # Loop through each object in $Certs
        Foreach ($Cert in $Certs) {
            # If the object's "NotAfter" property is older than the current time, delete it
            If ($Cert.NotAfter -lt (Get-Date)) {
                $Cert | Remove-Item
            }
        }
    }
}
Edited based on a comment, to prevent accidental destruction of all certs.
To get a list of all certificate store locations:
(Get-ChildItem -Path "Cert:" -Recurse | Where-Object { $_.GetType().Name -eq 'X509Store' }).Name
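Before deleting anything for real, it may be worth a dry run; Remove-Item supports the -WhatIf switch, so a cautious version of the inner loop (same store path as above) could look like this sketch:
# Dry run: shows what would be removed without deleting anything.
Get-ChildItem "Cert:\LocalMachine\My" |
    Where-Object { $_.NotAfter -lt (Get-Date) } |
    Remove-Item -WhatIf
# Once the output looks right, drop -WhatIf to actually delete.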

Remotely access \\server\c$\users\user\My Documents

I'm trying to remotely get the size of a user's 'My Documents' folder using the built-in C$ share.
I can browse the share, and I can Set-Location to it, but as soon as I try Get-ChildItem I get a permission-denied error.
I can't figure out whether this is some built-in limitation of PowerShell.
Currently tried on PS2 and PS3, same result.
(The user has full access on both the share and NTFS.)
I've tried providing credentials using Get-Credential, and I have also tried New-PSDrive mappings; same issue, the location is fine but as soon as I GCI it, it spits out 'PermissionDenied'.
$compList = [LIST OF COMPUTERS]
$exclude = [LIST OF EXCLUDED USERS]
$userSizes = @()
foreach ($computer in $compList) {
    gci "\\$computer\c$\users\" | where { $exclude -notcontains $_.Name } | ForEach-Object {
        $curUser = $_.Name
        New-PSDrive -Name "Map" -PSProvider FileSystem -Root "\\$computer\c$\users\$curUser\My Documents"
        $size = "{0:N2}" -f ((gci "Map:\" -Recurse | Measure-Object -Property Length -Sum).Sum / 1MB)
        $properties = @{'Computer'=$computer; 'User'=$curUser; 'Size (MB)'=$size}
        $curObject = New-Object -TypeName PSObject -Property $properties
        $userSizes += $curObject
        Remove-PSDrive -Name "Map"
    }
}
$userSizes | Out-GridView
$userSizes = $null
Keep in mind that GCI in PS2 doesn't allow providing credentials, and neither does the 'FileSystem' provider!
You might need credentials to use Get-ChildItem on a remote share. I've had it happen that I had full access to my NAS but PowerShell gave me the same "Permission Denied" error; it seems weird, and I can't say why it failed when I had full permissions, but it worked once I gave PowerShell my credentials.
Try declaring credentials first:
$creds = Get-Credential
then using the credentials like so:
Get-ChildItem "\\server\c$\users\user\My Documents" -Credential $creds
Ugh, what a disgrace.
The reason was that the path is actually
\\[server]\c$\users\[user]\documents
For some unknown, god-forsaken reason, Windows Explorer displays the path as 'My Documents' but the actual folder name is 'Documents'.
I have no idea why they would do this, but there it is. Working fine now; another few hours wasted...
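For reference, with the path corrected, the per-user size lookup from the original script reduces to something like this sketch (server and user names are placeholders):
# Corrected folder name: "Documents", not "My Documents".
$path = "\\server\c$\users\user\Documents"
# Total size of the folder in MB, formatted to two decimal places
"{0:N2}" -f ((Get-ChildItem $path -Recurse | Measure-Object -Property Length -Sum).Sum / 1MB)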

AWS Tools: Copy-S3Object script fails in 2.x with Error "Bucket does not exist"

I am attempting to figure out why a script that works in AWS Tools 1.x (I think 1.1.16?) is no longer working after upgrading to the latest AWS Tools (2.0.3).
The Script
Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
$creds = New-AWSCredentials -AccessKey [REDACTED] -SecretKey [REDACTED]
Set-AWSCredentials -Credentials $creds
$a = Get-Content C:\users\killeens\desktop\temp\AmazonKeysToDownload.txt
$startingpath = "G:\TheFiles\"
$a | ForEach-Object {
    $keyname = $_
    $fullpath = $startingpath + $keyname
    Write-Host "fullpath: $fullpath"
    Get-S3Bucket -BucketName OURBUCKETNAME | Get-S3Object -Key $_ | Copy-S3Object -Key $keyname -LocalFile $fullpath
}
The Problem
In 1.1.16, this works fine.
Now, under deadline in 2.0.3, I get the following error:
Copy-S3Object : The specified bucket does not exist
These details might be important
For what it's worth, our bucket name is in all capital letters ("COMPANYCLIENT").
This literally worked on my machine an hour or so ago. I then wanted to do something in parallel, so I downloaded PowerShell v4 and the latest AWS Tools. The problem kept happening. I have since reverted to PowerShell 3, but the issue remains.
I have not been able to find an old version of the 1.x tools to test against.
Troubleshooting so far
If I only execute Get-S3Bucket OURBUCKETNAME, it works.
If I execute the script, leaving off the piped Copy-S3Object command, it works, outputting all of the objects listed in my file.
I checked, and according to IntelliSense there is no BucketName parameter on the Copy-S3Object command; if I try to specify one, I get an error.
It turns out there is also a cmdlet called Read-S3Object that ends up with the same result, and I had to use that instead.
I didn't see anything about Copy-S3Object being deprecated or having its functionality changed, so that's unfortunate.
Assuming you have:
PowerShell v3
AWS Tools for PowerShell v2.x
Appropriate Amazon credentials
Then the following script should work:
Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
### SET ONLY THE VARIABLES BELOW ###
$accessKey = "" # Amazon access key.
$secretKey = "" # Amazon secret key.
$fileContainingAmazonKeysSeparatedByNewLine = "" # Full path to a file, e.g. "C:\users\killeens\desktop\myfile.txt"
$existingFolderToPlaceDownloadedFilesIn = "" # Path to a folder, including a trailing slash, such as "C:\MyDownloadedFiles\" NOTE: This folder must already exist.
$amazonBucketName = "" # the name of the Amazon bucket you'll be retrieving the keys for.
### SET ONLY THE VARIABLES ABOVE ###
$creds = New-AWSCredentials -AccessKey $accessKey -SecretKey $secretKey
Set-AWSCredentials -Credentials $creds
$amazonKeysToDownload = Get-Content $fileContainingAmazonKeysSeparatedByNewLine
$uniqueAmazonKeys = $amazonKeysToDownload | Sort-Object | Get-Unique
$startingpath = $existingFolderToPlaceDownloadedFilesIn
$uniqueAmazonKeys | ForEach-Object {
    $keyname = $_
    $fullpath = $startingpath + $keyname
    Read-S3Object -BucketName $amazonBucketName -Key $keyname -File $fullpath
}
Obviously there are better ways to build this (as a function that accepts parameters, in a PowerShell v4 workflow with parallel loops and a throttle count, better handling of credentials, etc.), but this gets it done in its most basic form.
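As an illustration of the first suggestion, a minimal sketch wrapping the same download loop in a parameterized function might look like the following (the function name and parameter names are invented for the example):
# Hypothetical helper; wraps the Read-S3Object loop behind parameters.
function Get-S3KeysFromFile {
    param(
        [string]$BucketName,
        [string]$KeyListFile,       # text file, one key per line
        [string]$DestinationFolder  # existing folder, including trailing slash
    )
    # De-duplicate the key list, then download each key to the destination folder
    Get-Content $KeyListFile | Sort-Object | Get-Unique | ForEach-Object {
        Read-S3Object -BucketName $BucketName -Key $_ -File ($DestinationFolder + $_)
    }
}
# Example call (bucket name and paths are placeholders):
# Get-S3KeysFromFile -BucketName "mybucket" -KeyListFile "C:\keys.txt" -DestinationFolder "G:\TheFiles\"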

Comparing certificates using PowerShell

I'm working on a disaster recovery project, and as part of the plan I am recommending regular audits of the primary and secondary sites. One of the audit tasks is to make sure that the secondary site has the same certificates installed as the primary site. I think I can accomplish this using PowerShell.
Get-ChildItem -Path Cert:\LocalMachine\My
Get-ChildItem -Path Cert:\LocalMachine\Root
I know I can use the above commands to get a list of certs, but what I'm having trouble with is doing this all in one script. I would like to get the list of certs on one server, then the list on another server, and compare the two lists. I'm very new to PowerShell, so I'm not too sure where to start.
To retrieve the certificates you would use the underlying .NET classes, since the certificate provider does not expose remote-machine connectivity by default. PS remoting may offer another possibility as well. Here is the function:
function Get-Certificates {
    Param(
        $Computer = $env:COMPUTERNAME,
        [System.Security.Cryptography.X509Certificates.StoreLocation]$StoreLocation,
        [System.Security.Cryptography.X509Certificates.StoreName]$StoreName
    )
    # Open the named store on the target machine read-only and return its certificates
    $Store = New-Object System.Security.Cryptography.X509Certificates.X509Store("\\$Computer\$StoreName", $StoreLocation)
    $Store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]"ReadOnly")
    $Store.Certificates
}
And here is how you would use it to compare two lists:
$Left = Get-Certificates -StoreLocation LocalMachine -StoreName Root
$Right = Get-Certificates -StoreLocation LocalMachine -StoreName Root -Computer "REMOTE-PC"
# Dump to console
Compare-Object $Left $Right -property Thumbprint, FriendlyName, Subject, NotAfter | Format-Table
# Export results to file
Compare-Object $Left $Right -property Thumbprint, FriendlyName, Subject, NotAfter | Export-Csv Comparison.csv
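If PS remoting is enabled, an alternative sketch that avoids the .NET store class entirely is to pull the certificates over Invoke-Command and diff them the same way (REMOTE-PC is the same placeholder as above):
# Remoting-based alternative: compare local and remote Root stores.
$local  = Get-ChildItem Cert:\LocalMachine\Root
$remote = Invoke-Command -ComputerName "REMOTE-PC" -ScriptBlock {
    Get-ChildItem Cert:\LocalMachine\Root
}
Compare-Object $local $remote -Property Thumbprint, Subject, NotAfter | Format-Table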

PowerShell, FTP, Get-ChildItem

I'm a little new to PowerShell. I am trying to locate a Get-ChildItem-like command that will work on an FTP site.
Here is some pseudo-code:
$target = "c:\file.txt"
$username = "username"
$password = "password"
$ftp = "ftp://${username}:${password}@myftpsite"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
# Below is the code that does not work; Get-ChildItem needs a local path
$name = get-childitem -path $ftp
Get-ChildItem only works with a local path. Does anyone know how I could access the filenames in this manner on an FTP site?
Thanks
What you would need is a PowerShell provider for FTP if you want Get-ChildItem to work on a remote filesystem accessed over FTP. This forum post mentions work being done by Nick Howell on an FTP provider. Other than that, I haven't heard of any other FTP providers for PowerShell.
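Without a provider, you can still list directory contents using .NET's FTP support directly. A minimal sketch, using the same placeholder site and credentials as the question:
# Directory listing over FTP using System.Net.FtpWebRequest (no provider needed).
$request = [System.Net.WebRequest]::Create("ftp://myftpsite/")
$request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$request.Credentials = New-Object System.Net.NetworkCredential("username", "password")
$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$reader.ReadToEnd()   # one file name per line
$reader.Close(); $response.Close()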