PowerShell: using Compress-Archive with Start-Job won't work

I'm trying to use PowerShell to compress a bunch of video files on my H:\ drive. However, running this serially would take a long time as the drive is quite large. Here is a short snippet of the code that I'm using. Some parts have been withheld.
$shows = Get-ChildItem H:\
foreach ($show in $shows) {
    Start-Job -ArgumentList $show -ScriptBlock {
        param($show)
        $destPath = "$($show.DirectoryName)\$($show.BaseName).zip"
        Compress-Archive -Path $show.FullName -DestinationPath $destPath
    }
}
When I run Get-Job, the job shows up as Completed with no reason in JobStateInfo, but no .zip is created. I've run some tests by replacing the Compress-Archive command with an Out-File of the $destPath variable, again using Start-Job.
Start-Job -ArgumentList $shows[0] -ScriptBlock {
    param($show)
    $show = [System.IO.FileInfo]$show
    $destPath = "$($show.DirectoryName)\$($show.BaseName).zip"
    $destPath | Out-File "$($show.DirectoryName)\test.txt"
}
A text file IS created and it shows the correct destination path. I've run PowerShell as an Administrator and tried again but that doesn't appear to work either. Not sure if it matters, but I'm running on Windows 10 (latest).
Any help would be appreciated. Thanks!

For some reason, inside the job, the serialized FileInfo object has no BaseName (it's a ScriptProperty, which doesn't survive serialization into the background job). If you have thread jobs (Start-ThreadJob), that approach works, because thread jobs don't serialize their input. With a regular Start-Job you can see the BaseName go missing:
dir | start-job { $input } | receive-job -wait -auto |
select name,basename,fullname
Name      basename FullName
----      -------- --------
file1.txt          C:\Users\js\foo\file1.txt
file2.txt          C:\Users\js\foo\file2.txt
file3.txt          C:\Users\js\foo\file3.txt
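A minimal sketch of that thread-job approach applied to the original loop, assuming the ThreadJob module is available (it ships with PowerShell 7 and can be installed on Windows PowerShell 5.1 with Install-Module ThreadJob); thread jobs run in-process, so the FileInfo objects are not serialized and BaseName survives:
$shows = Get-ChildItem H:\
foreach ($show in $shows) {
    Start-ThreadJob -ArgumentList $show -ScriptBlock {
        param($show)
        # BaseName and DirectoryName are intact because no serialization occurred
        $destPath = "$($show.DirectoryName)\$($show.BaseName).zip"
        Compress-Archive -Path $show.FullName -DestinationPath $destPath
    }
}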

I'm not sure if that's what you want, but I think you first need to create an archive and then update that archive with the shows, so I created a zip called archive and looped through, adding the files.
$shows = Get-ChildItem H:\
foreach ($show in $shows) {
    Compress-Archive -Path $show.FullName -Update -DestinationPath "C:\archive"
}

Related

Powershell script searching files on domain

Very new to PowerShell and AD, so apologies if this post has an obvious answer. I have done some research and I am still not finding the answers I am looking for. My script is below for reference.
I have created a simple PowerShell script that will run on an admin VM I have set up on my domain. I have a separate SQL VM running a backup job that consumes a lot of storage over time. I am trying to run this very simple script. My question is: do I need to modify this script in order to store it on my admin VM but have it run on my SQL VM? Or can I leave the path as is and just set it up via Task Scheduler? I have tried targeting the FQDN and the IP, but it doesn't seem to be working either way.
$backups_file = 'E:\blahBlahBla\SQL\Backups'
# or
$backups_file = '<IP_ADDRESS>\E:\blahBlahBla\SQL\Backups'
# or
$backups_file = '<FQDN>E:\blahBlahBla\SQL\Backups'
$backup_file_exist = (Test-Path -Path $backups_file)
if ($backup_file_exist) {
    # Verifies the folder exists
    Write-Output -InputObject "This folder exists"
    # Returns all the files in the folder.
    Get-ChildItem -Path $backups_file
    # Deletes all files in the folder that are older than 7 days.
    Get-ChildItem -Path $backups_file -Recurse | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } | Remove-Item
}
else {
    Write-Output -InputObject "Unable to access this directory."
}
Thanks.
Well, all of your $backups_file attempts look wrong to me.
If you want to access a directory on a remote system, the path has to be at least a file share or an administrative share, like \\computer\e$\folder\folder\.
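A minimal sketch of that file-share approach, assuming a hypothetical server name SQLVM01 and reusing the cleanup logic from the question:
# SQLVM01 is a hypothetical name; substitute your SQL VM's hostname
$backups_file = '\\SQLVM01\e$\blahBlahBla\SQL\Backups'
if (Test-Path -Path $backups_file) {
    Get-ChildItem -Path $backups_file -Recurse |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
        Remove-Item
}
else {
    Write-Output -InputObject "Unable to access this directory."
}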
But why use file shares at all when you can simply connect to a PowerShell session on the remote host? Here is an example:
$mySQLServer = "Server1.domain.name", "server2.domain.name"
$backupFolder = "E:\blahBlahBla\SQL\Backups"
foreach ($server in $mySQLServer)
{
    $session = New-PSSession -ComputerName $server # add -Credential if needed
    Invoke-Command -Session $session -ArgumentList $backupFolder -ScriptBlock {
        param(
            $directory
        )
        if (Test-Path -Path $directory)
        {
            # Verifies the folder exists
            Write-Output -InputObject "This folder exists"
            # Returns all the files in the folder.
            Get-ChildItem -Path $directory
            # Deletes all files in the folder that are older than 7 days.
            Get-ChildItem -Path $directory -Recurse |
                Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
                Remove-Item
        }
    }
    Remove-PSSession -Session $session
}
Good Luck!

How to make a script run only while another one is running

I need your help resolving an exercise in Powershell.
I have to make a script which runs only when another one is running.
For example, I have to run a script which deletes files older than 1 day while a script which restarts a process runs.
I tried to use jobs to make the scripts run in parallel, but I haven't had any success.
# The script to delete files
Get-ChildItem -Path "a path" -Include *.txt -Recurse | Where-Object { $_.LastWriteTime -lt $DateToDelete } | Remove-Item -Force
# The script to restart a process
Get-Process notepad | Stop-Process | Start-Process
I think your problem is with the second script; you can't restart a process like that.
If you try the line Get-Process notepad | Stop-Process | Start-Process in the console, it will prompt you for the FilePath of the process you want to start. That's because Stop-Process does not return any result to the pipeline, so Start-Process receives nothing from the pipeline.
Look here to see how to restart a process using PowerShell.
And take a look at this MS module: Restart-Process.
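A minimal sketch of a restart that works, assuming a single Notepad instance; the executable path is captured before stopping so Start-Process has something to launch:
$proc = Get-Process notepad -ErrorAction SilentlyContinue | Select-Object -First 1
if ($proc) {
    $exePath = $proc.Path             # remember where the binary lives
    $proc | Stop-Process -Force       # stop the running instance
    Start-Process -FilePath $exePath  # start a fresh one
}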
Use this code to run the scripts as jobs:
$days = -1
$currentDate = Get-Date
$DateToDelete = $currentDate.AddDays($days)
Start-Job -ScriptBlock {
    # $using: passes the caller's $DateToDelete into the job's separate process
    Get-ChildItem -Path "a path" -Include *.txt -Recurse | Where-Object { $_.LastWriteTime -lt $using:DateToDelete } | Remove-Item -Force
}
Start-Job -ScriptBlock {
    Get-Process notepad | Stop-Process
}
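To make the cleanup run only while the restart script is running, one minimal sketch (reusing "a path" and $DateToDelete from above) is to keep the first job's object, poll its State, and stop the cleanup job as soon as it is no longer Running:
$restartJob = Start-Job -ScriptBlock { Get-Process notepad | Stop-Process }
$cleanupJob = Start-Job -ScriptBlock {
    Get-ChildItem -Path "a path" -Include *.txt -Recurse |
        Where-Object { $_.LastWriteTime -lt $using:DateToDelete } |
        Remove-Item -Force
}
while ($restartJob.State -eq 'Running') { Start-Sleep -Seconds 1 }
Stop-Job $cleanupJob    # the cleanup only ran while the restart job was alive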

Compress-Archive Error: Cannot access the file because it is being used by another process

I would like to zip a path (with a Windows service running inside).
When the service is stopped, it works perfectly; when the service is running, I get the exception:
The process cannot access the file because it is being used by another
process.
However, when I zip with 7-zip, I don't have any exception.
My command:
Compress-Archive [PATH] -CompressionLevel Optimal -DestinationPath("[DEST_PATH]") -Force
Do you have any idea to perform the task without this exception?
Copy-Item allows you to access files that are being used in another process.
This is the solution I ended up using in my code:
Copy-Item -Path "C:\Temp\somefolder" -Force -PassThru |
Get-ChildItem |
Compress-Archive -DestinationPath "C:\Temp\somefolder.zip"
The idea is that you pass all the copied items through the pipeline instead of having to copy them to a specific destination first before compressing.
I like to zip up a folder's content rather than the folder itself, therefore I'm using Get-ChildItem before compressing in the last line.
Sub-folders are already included, so there is no need to use -Recurse on the first line.
A good method to access files that are being used by another process is to create a snapshot using the Volume Shadow Copy Service.
To do so, one can simply use PowerShell's WMI cmdlets:
$Path = "C:/my/used/folder"
$directoryRoot = [System.IO.Directory]::GetDirectoryRoot($Path).ToString()
$shadow = (Get-WmiObject -List Win32_ShadowCopy).Create($directoryRoot, "ClientAccessible")
$shadowCopy = Get-WmiObject Win32_ShadowCopy | ? { $_.ID -eq $shadow.ShadowID }
$snapshotPath = $shadowCopy.DeviceObject + "\" + $Path.Replace($directoryRoot, "")
Now you can use the $snapshotPath as -Path for your Compress-Archive call.
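For example, a minimal sketch (the destination path is hypothetical) that compresses from the snapshot and then removes the shadow copy:
Compress-Archive -Path $snapshotPath -DestinationPath "C:\Backup\used-folder.zip" -Force
$shadowCopy | Remove-WmiObject   # clean up the snapshot once the archive exists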
This method can also be used to create backups with symlinks.
From there on you can use the linked folders to copy backed up files, or to compress them without those Access exceptions.
I created a similar function and a small cmdlet in this Gist: Backup.ps1
I had a similar requirement where only a few file extensions needed to be added to the zip.
With this approach, we can copy all the files, including locked ones, to a temp location, zip them, and then delete the logs.
It is a bit of a lengthy process, but it made my day!
$filedate = Get-Date -Format yyyyMMddHHmmss
$zipfile = 'C:\Logs\logfiles' + $filedate + '.zip'
New-Item -Path "C:\" -Name "Logs" -ItemType "directory" -ErrorAction SilentlyContinue
Robocopy "<Log Location>" "C:\Logs\" *.txt *.csv *.log /s
Get-ChildItem -Path "C:\Logs\" -Recurse | Compress-Archive -DestinationPath $zipfile -Force -ErrorAction Continue
Remove-Item -Path "C:\Logs\" -Exclude *.zip -Recurse -Force

Confusion on Start-Job for File Share

So, this is absolutely whipping me. I have created a script that moves data from one file share to another based on a user's responses to a number of questions. What I would like to do is have a background job running that provides a report of all the files being moved prior to the move taking place. As a result, I added this little bit of code, but it absolutely doesn't gather info from the source file share; it simply provides data from my particular machine. What am I doing wrong?
While ($sourcepath -eq $null) {
    $sourcepath = Read-Host "Enter source file path"
}
Set-Location $sourcepath
Start-Job -ScriptBlock { Get-ChildItem -Recurse | Out-File c:\users\john.smith\desktop\shareonfile.txt }
Jobs run in a different process, with their own scope. The working directory won't be inherited. To demonstrate this:
Set-Location $sourcepath
Start-Job -ScriptBlock {
Get-Location
} | Wait-Job | Receive-Job
Get-Job | Remove-Job
You should avoid setting the location anyway, and just pass the path to Get-ChildItem. To do that in a job, define a parameter and pass its value like so:
Start-Job -ScriptBlock {
    param($thePath)
    Get-ChildItem -Path $thePath -Recurse |
        Out-File c:\users\john.smith\desktop\shareonfile.txt
} -ArgumentList $sourcepath

Robocopy commands to copy a file to over 50 remote machines

I started looking at Robocopy yesterday to try to copy and overwrite a file from one location to many remote computers. I've tried Robocopy to copy files to a remote machine, but it doesn't work; I get the same error as the person in the link. Does anybody have any suggestions, or can you point me in the right direction? Thank you so much!
You could just use PowerShell for this. It has an inefficiency issue in that it copies to one machine at a time, but that shouldn't be a problem for 50-ish machines. This could help if you made a PowerShell script:
$computers = Get-Content "C:\filewithcomputers.txt"
$fileToCopy = "C:\filetocopy.txt"
ForEach ($computer in $computers) {
    Copy-Item -Path $fileToCopy -Destination "\\$computer\C`$\Temp"
}
This would copy the file $fileToCopy to each server in the file C:\filewithcomputers.txt, assuming that the file contains a list of computers with each one on its own line. The file would be copied to the Temp folder on each machine. Update the paths as required for your scenario. I only suggest this since you tagged powershell-remoting. If you are not adept with PowerShell, maybe someone else can give you a better answer that is closer to what you are looking for. Using Robocopy for one file seemed tedious.
If you wanted to check whether a folder exists and is accessible, you could do something like this:
$computers = Get-Content "C:\filewithcomputers.txt"
$fileToCopy = "C:\filetocopy.txt"
ForEach ($computer in $computers) {
    $destinationx86 = "\\$computer\C`$\Program Files (x86)"
    $destination = "\\$computer\C`$\Program Files"
    If (Test-Path $destinationx86) {
        # Copy this to Program Files (x86)
        Copy-Item -Path $fileToCopy -Destination $destinationx86
    } Else {
        # Copy this to Program Files
        Copy-Item -Path $fileToCopy -Destination $destination
    }
}
If you need to connect with different credentials, you can use
$credential = Get-Credential
New-PSDrive -Name "Computer01" -PSProvider FileSystem -Root "\\Computer01\Share" -Credential $credential -Scope global
Now you can copy to e.g. Computer01:\Folder01\
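For example, assuming $fileToCopy from the earlier snippet:
Copy-Item -Path $fileToCopy -Destination "Computer01:\Folder01\"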
If you have set your environment up to support PSRemoting and have placed the file in a file share, you can use PowerShell Remoting to instruct many computers to retrieve the file themselves nearly simultaneously with Invoke-Command. You can limit the number of simultaneous actions using -ThrottleLimit, depending on the size of the source file and how robust the network and server are:
$computers = Get-Content "C:\filewithcomputers.txt"
$originalsource = "\\fileserver\shared\payload.exe"
$originaldestination = "c:\"
$scriptblockcontent = {
    param($source, $destination)
    Copy-Item -Path $source -Destination $destination
}
Invoke-Command -ComputerName $Computers -ScriptBlock $scriptblockcontent `
    -ThrottleLimit 50 -ArgumentList $originalsource, $originaldestination