Adding If / If Not statement into PowerShell - powershell

We are moving to JumpCloud AD services, and with that comes automated deployment commands from the JumpCloud console. I've created a script that works with Chocolatey to install some apps; the ones not on Chocolatey I keep in an S3 bucket on AWS, which I've tied into an Invoke-WebRequest -Uri command to pull the package and copy it to the destination folder.
The problem I'm running into is that I want the command to check whether the install files are already there: if they are, move on to the next item; if they aren't, copy the file over.
Anyone willing to give me a few pointers? I currently have 7 packages being copied over, so I assume I'll need 7 if statements.
Here is the code of what I have attempted so far:
if (-not (Test-Path -Path "C:\Windows\Temp\JC_ScheduledTasks")) {
    New-Item -Path "C:\Windows\Temp\JC_ScheduledTasks" -ItemType Directory
}
But I'm not sure how to tweak that for items pulled from AWS:
Invoke-WebRequest -Uri "https://cavo-deploy-virginia.s3.amazonaws.com/QualysCloudAgent.exe" -OutFile "c:\jumpcloud\QualysCloudAgent.exe"

Since you have an array of files that you want to run the same set of commands for, use a simple loop with either foreach or ForEach-Object.
$DestinationFolder = 'c:\jumpcloud\'
# Define the array here, or use Get-Content for a list from a text file
$Files = @('QualysCloudAgent.exe','example1.exe')
foreach ($File in $Files) {
    # Determine the destination for the file
    $DestinationFile = Join-Path $DestinationFolder $File
    # Download the file only if it does not already exist
    if (-not (Test-Path $DestinationFile)) {
        Invoke-WebRequest -Uri "https://cavo-deploy-virginia.s3.amazonaws.com/$File" -OutFile $DestinationFile
    }
}

Related

How to create a powershell script to move specific files to a different location?

So I have been tasked to write a script that will move files from one folder to another folder, which is easy enough. The problem I am having is that the files are for accounts, so there will be a file called DEA05292020.pdf and another file called TENSJ05292020, and each file needs to go to a specific folder (e.g. the DEA05292020.pdf file needs to be moved to a folder called DEA, and the TENSJ05292020 will move to the TENSJ folder). There are over a hundred different accounts that have their own specific folder. The files all start off in our Recon folder and need to be moved at the end of each month to their respective account's folder. So my question is how I could go about creating a PowerShell script to make that happen. I am very new to PowerShell and have been studying "Learn PowerShell in a Month of Lunches," so I have a basic grasp of it. What I have so far is very simple, where I can copy the file over to the new folder:
Copy-Item -Path "\\Sageshare\share\Reconciliation\PDF Recon Center\DEA RECON 05292020" -Destination "\\Sageshare\share\Account Rec. Sheets\Seperate Accounts\DEA"
This works, but I need a lot more automation in regard to separating all the different account names in the PDF Recon Center folder. How do I make a script that can filter the account name (i.e. DEA) and also the month and year from the name of the file (i.e. 052020 pulled out of the 05292020 part of the filename)?
Thanks!
If @Lee_Dailey wants to write the code and post it here, I'll delete my answer. He solved the problem, I just code-monkeyed it.
Please don't test on everything at once; run it in batches so you can monitor its behavior and not mess up your environment. It moves files in ways you may not want, i.e. if there is a folder named "a", it'll move everything that matches that folder name into it. If you want to prevent this, you can add a pre-scan that checks whether a folder more closely matching the name exists before it actually creates the folder itself. I'm pretty sure it does everything you want, however, in the simplest way to understand. :)
$names = $(gci -af).Name |
    ForEach-Object {
        if (-not ($_.Contains(".git"))) {
            $_
        }
    }
if ($null -eq $names) {
    Write-Host "No files to move!"
    Start-Sleep 5
    Exit
}
$removedNames = $names |
    ForEach-Object {
        $_ = $_.Substring(0, $_.IndexOf('.')) # Remove the extension
        $_ -replace '[^a-zA-Z-]','' # Strip everything except letters and hyphens
    }
$removedNames = $removedNames |
    Sort-Object |
    Get-Unique # Get unique folder names (Get-Unique requires sorted input)
$names |
    ForEach-Object {
        $name = $_
        $removedNames |
            ForEach-Object {
                if ($name.Contains($_)) # If the file matches a folder name
                {
                    if (-not (Test-Path ".\$_")) { # Create the folder if it doesn't exist
                        New-Item -Path ".\" `
                            -Name "$_" `
                            -ItemType "directory"
                    }
                    Move-Item -Path ".\$name" `
                        -Destination ".\$_" # Move the file into the folder
                }
            }
    }

Create PS script to find files

I want to start by saying coding is a bit outside of my skill set, but because a certain problem keeps appearing at work, I'm trying to automate a solution.
I use the below script to read an input file for a list of names, search C:\ for those files, then write the path to an output file if any are found.
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    if ($line -match $regex) {
        gci -Path "C:\" -Recurse -Filter $line -ErrorAction SilentlyContinue |
            Out-File -Append c:\temp\ResultsFindFile.txt
    }
}
I would like to make two modifications to this. First, search all drives connected to the computer, not just C:\. Second, be able to delete any found files. I'm using the Remove-Item -Confirm command, but so far I can't make it delete the file it just found.
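This question has no answer in the thread, so here is a minimal sketch of one way to do both modifications, assuming $regex and the input file are the same as in the snippet above: Get-PSDrive -PSProvider FileSystem enumerates every attached filesystem drive, and each match is logged before being deleted.

```powershell
# Sketch only: enumerate all filesystem drives, then search and delete.
# Assumes $regex is defined as in the original snippet.
$drives = Get-PSDrive -PSProvider FileSystem | Select-Object -ExpandProperty Root
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    if ($line -match $regex) {
        foreach ($drive in $drives) {
            Get-ChildItem -Path $drive -Recurse -Filter $line -ErrorAction SilentlyContinue |
                ForEach-Object {
                    $_.FullName | Out-File -Append C:\temp\ResultsFindFile.txt # log first
                    Remove-Item $_.FullName -Force                             # then delete
                }
        }
    }
}
```

Logging the full path before calling Remove-Item gives you a record even if the delete itself fails; drop -Force if you want read-only files to be skipped.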

How to run a powershell script that is based online

I have a Powershell script that is stored here:
https://gitlab.example.example.co.uk/example/example/raw/master/shrink-diskpart.ps1
I would like to run this on many servers through a scheduled task pulling from this GitLab, so I can make a single change to the script and have the most up-to-date version run on all the servers.
Can anyone advise if this is possible and, if it is, how it can be done?
Thanks
You can use Invoke-WebRequest with -OutFile to download a file, and then just execute it. You might store the file on the web server as a .txt so that you don't have to add a MIME type.
$scriptUrl = "http://localhost/test.txt"
$destination = "c:\temp\test.txt"
$scriptName = "c:\temp\test.ps1"
Invoke-WebRequest $scriptUrl -OutFile $destination
# If the file was downloaded, delete the old script file and rename the new one
if (Test-Path $destination) {
    Remove-Item $scriptName -ErrorAction SilentlyContinue
    Rename-Item $destination $scriptName
}
& $scriptName
Props to http://www.powershellatoms.com/basic/download-file-website-powershell/
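As an aside not covered by the answer above: if you'd rather not write the script to disk at all, PowerShell can execute the downloaded content directly in memory. This skips the keep-the-last-good-copy fallback of the rename approach, so it is only a sketch of the alternative (the URL is the same placeholder as above):

```powershell
# Download the script body and run it in memory (no temp file).
# Replace the URL with your GitLab raw URL.
$scriptUrl = "http://localhost/test.txt"
Invoke-Expression (Invoke-WebRequest -Uri $scriptUrl -UseBasicParsing).Content
```

Note that if the download fails, nothing runs at all, whereas the file-based version above keeps executing the previously downloaded copy.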

PowerShell to delete Desktop Items from a remote PC

I have 200 PCs that need to have some specific icons removed.
I created a CSV file with the ComputerName (1 name per row).
I have another file with the file names of the icons that need to be removed from the desktops (Shortcut1.lnk, etc.). This other file is also a CSV (1 file name per row).
How can I run a PowerShell script to remove those icons? (Please note that not all computers in my CSV file may be turned on. Some may be off or have network issues.)
$SOURCE = "C:\powershell\shortcuts"
$DESTINATION = "c$\Documents and Settings\All Users\Desktop"
$LOG = "C:\powershell\logs\logsremote_copy.log"
$REMOVE = Get-Content C:\powershell\shortcuts-removal.csv
Remove-Item $LOG -ErrorAction SilentlyContinue
$computerlist = Get-Content C:\powershell\computer-list.csv
foreach ($computer in $computerlist) {
    foreach ($file in $REMOVE) {
        Remove-Item "\\$computer\$DESTINATION\$file" -Recurse
    }
}
This is my code so far, but it doesn't appear to delete the files from
\\computername\c$\Documents and Settings\All Users\Desktop
I am getting errors and warnings, and the log file also doesn't seem to be created.
Is there any way to get a report of what was deleted and what was not?
Change this: you already specify a slash in your $DESTINATION variable, so you are doubling it up (\\c$).
Remove-Item "\\$computer$DESTINATION\$file" -Recurse
Otherwise, you are trying to delete this path and failing:
\\computername\\c$\Documents and Settings\All Users\Desktop\$file
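The question also asks about machines that are off and about a report of what was deleted, neither of which the path fix addresses. A minimal sketch of both (the Test-Connection ping check and the log line format are my additions, not part of the original answer):

```powershell
$DESTINATION = "c$\Documents and Settings\All Users\Desktop"
$LOG = "C:\powershell\logs\logsremote_copy.log"
$REMOVE = Get-Content C:\powershell\shortcuts-removal.csv
$computerlist = Get-Content C:\powershell\computer-list.csv

foreach ($computer in $computerlist) {
    # Skip machines that are off or unreachable
    if (-not (Test-Connection -ComputerName $computer -Count 1 -Quiet)) {
        Add-Content $LOG "$computer : unreachable, skipped"
        continue
    }
    foreach ($file in $REMOVE) {
        $target = "\\$computer\$DESTINATION\$file"
        if (Test-Path $target) {
            Remove-Item $target -Recurse
            Add-Content $LOG "$computer : deleted $file"
        } else {
            Add-Content $LOG "$computer : $file not found"
        }
    }
}
```

Add-Content creates the log file on first write, so the earlier Remove-Item $LOG line is enough to start each run with a fresh report.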

PowerShell script to echo files from folders and subfolders and then delete files over X days old

I am curious to understand the possible ways to echo the files in folders and subfolders, and to generate output stating which filenames have been picked up for deletion as X days old.
I want to write this script at two different levels.
Level 1:
A PowerShell script that only echoes filenames and gives me output of the files which have been identified for deletion. This should include files in folders and subfolders.
Level 2:
Combine the Level 1 script with added delete functionality, which would delete the files in folders and subfolders.
I have a move script and a direct delete script, but I want to ensure the correct files are picked, and I want to know the names of the files being deleted.
Any help is highly appreciated.
EDIT Added from comment
I have been trying something like this in a very simple fashion
Get-ChildItem -Path c:\test | where {$_.lastWriteTime -lt (Get-Date).addDays(-60)}
I would like to add some parameter, which would generate an output of filenames in a different folder location.
I think this is something along the lines of what you need. I have introduced you to a few concepts which you might not be aware of, such as CmdletBinding, which allows you to dry-run your script using the -WhatIf parameter. You can also supply -Verbose to see what is happening along the way; you could additionally append to a log at that point using the Add-Content cmdlet.
So you might run it like this:
.\DeleteOldFiles.ps1 -Path c:\test -Age 50 -WhatIf -Verbose
Then when you are ready to delete the files you can run it without the -WhatIf parameter:
.\DeleteOldFiles.ps1 -Path c:\test -Age 50 -Verbose
This doesn't answer all your questions, but it should help you get started. I've put plenty of comments in the code, so you should be able to follow it all.
# Add CmdletBinding to support -Verbose and -WhatIf
[CmdletBinding(SupportsShouldProcess=$True)]
param
(
    # Mandatory parameter, including a test that the folder exists
    [Parameter(Mandatory=$true)]
    [ValidateScript({Test-Path $_ -PathType 'Container'})]
    [string]
    $Path,

    # Optional parameter with a default of 60
    [int]
    $Age = 60
)

# Identify the items (including subfolders) and loop around each one
Get-ChildItem -Path $Path -Recurse | Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-$Age)} | ForEach-Object {
    # Display what is happening
    Write-Verbose "Deleting $_ [$($_.LastWriteTime)]"
    # Delete the item (-WhatIf will do a dry run)
    $_ | Remove-Item
}
The question is a little vague, but I assume this is something like what you want.
I like David Martin's answer, but it may be a little too complex depending on your skill level and needs.
param(
    [string]$Path,
    [switch]$LogDeletions
)
foreach ($Item in $(Get-ChildItem -Path $Path | Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-60)}))
{
    if ($LogDeletions)
    {
        $Item.FullName | Out-File "C:\Deleted.Log" -Append
    }
    rm $Item.FullName
}