Slight modification to powershell command

This PowerShell command works perfectly to copy and extract a zip file from one directory to another:
$shell = New-Object -COM Shell.Application
$target = $shell.NameSpace('D:\destination\')
$zip = $shell.NameSpace('D:\source\version_*.zip')
$target.CopyHere($zip.Items(), 16)
However, I am struggling with a modification to make it select only the latest zip file from the source.

Get the zip file with the most recent modification date in a given directory:
$source = "C:\temp"
$destination = "C:\temp\output"
$zipFile = Get-ChildItem -Path $source -Filter "*.zip" |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
Expand-Archive -Path $zipFile.FullName -DestinationPath $destination
This works by searching for all zip files, sorting them by the modified date descending and then grabbing the "first" one (according to the defined sort order).
I've also used the Expand-Archive command to extract the zip to a specified destination. If you need to copy the zip then that's easy enough using the Copy-Item cmdlet first.
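For example, a minimal sketch reusing the variables above, if you also want a copy of the zip itself in the output folder:
Copy-Item -Path $zipFile.FullName -Destination $destination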
As some commenters have pointed out: Expand-Archive was introduced in version 5 of PowerShell.
However the logic of getting the "latest" file is unchanged and can easily be plumbed in to your existing script:
$zip = $shell.NameSpace($zipFile.FullName)
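Put together, a sketch of the original Shell.Application approach with the latest-zip selection plumbed in (paths taken from the snippets above) might look like this:
$source = 'D:\source'
$destination = 'D:\destination\'
$zipFile = Get-ChildItem -Path $source -Filter "*.zip" |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
$shell = New-Object -COM Shell.Application
$target = $shell.NameSpace($destination)
$zip = $shell.NameSpace($zipFile.FullName)
$target.CopyHere($zip.Items(), 16)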

Powershell: Find Folders with (Name) and Foreach Copy to Location Preserve Directory Structure

Got another multi-step process I'm looking to streamline. Basically, I'm looking to build a Powershell script to do three things:
Get-Childitem to look for folders with a specific name (we'll call it NAME1 as a placeholder)
For each folder it finds with that name, I want it to output the full directory path to a TXT file (so that in the end I wind up with a text file listing the results it found, with their full paths; if it finds folders named "NAME1" in five different subdirectories of the folder I give it, I want each full path, beginning with the drive letter and ending with "NAME1")
Then I want it to take the list from the TXT file and copy each path to another drive, preserving the directory structure
So basically, if it searches and finds this:
D:\TEST1\NAME1
D:\TEST7\NAME1
D:\TEST8\NAME1\
That's what I want to appear in the text file.
Then what I want it to do is to go through each line in the text file and plug the value into a Copy-Item (I'm thinking the source directory would get assigned to a variable), so that when it's all said and done, on the second drive I wind up with this:
E:\BACKUP\TEST1\NAME1
E:\BACKUP\TEST7\NAME1
E:\BACKUP\TEST8\NAME1\
So in short, I'm looking for a Get-Childitem that can define a series of paths, which Copy-Item can then use to back them up elsewhere.
I already have one way to do this, but the problem is it seems to copy everything every time, and since one of these drives is an SSD I only want to copy what's new/changed each time (not to mention that would save time when I need to run a backup):
$source = "C:\"
$target = "E:\BACKUP\"
$search = "NAME1"
$source_regex = [regex]::escape($source)
(gci $source -recurse | where {-not ($_.psiscontainer)} | select -expand fullname) -match "\\$search\\" |
foreach {
    $file_dest = ($_ | split-path -parent) -replace $source_regex,$target
    if (-not (test-path $file_dest)){mkdir $file_dest}
    copy-item $_ -Destination $file_dest -force -verbose
}
If there's a way to do this that wouldn't require writing out a TXT file each time I'd be all for that, but I don't know a way to do this the way I'm looking for except a Copy-Item.
I'd be very grateful for any help I can get with this. Thanks all!
If I understand correctly, you want to copy all folders with a certain name, keeping the original folder structure in the destination path, and copy only files that are newer than what is already in the destination.
Try
$source = 'C:\'
$target = 'E:\BACKUP\'
$search = 'NAME1'
# -ErrorAction SilentlyContinue because in the C:\ disk you are bound to get Access Denied on some paths
Get-ChildItem -Path $source -Directory -Recurse -Filter $search -ErrorAction SilentlyContinue | ForEach-Object {
# construct the destination folder path
$dest = Join-Path -Path $target -ChildPath $_.FullName.Substring($source.Length)
# copy the folder including its files and subfolders (but not empty subfolders)
# for more switches see https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
robocopy $_.FullName $dest  /XO /S /R:0
}
If you don't want the console output of robocopy, you can silence it by redirecting all of its output streams to $null, so neither stdout nor stderr is echoed.
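For example (just one way of doing it), the robocopy line inside the loop could become:
robocopy $_.FullName $dest /XO /S /R:0 *> $null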
If you want to keep a file after this with both the source paths and the destinations, I'd suggest doing
$source = 'C:\'
$target = 'E:\BACKUP\'
$search = 'NAME1'
$output = [System.Collections.Generic.List[object]]::new()
# -ErrorAction SilentlyContinue because in the C:\ disk you are bound to get Access Denied on some paths
Get-ChildItem -Path $source -Directory -Recurse -Filter $search -ErrorAction SilentlyContinue | ForEach-Object {
# construct the destination folder path
$dest = Join-Path -Path $target -ChildPath $_.FullName.Substring($source.Length)
# add an object to the output list
$output.Add([PsCustomObject]@{Source = $_.FullName; Destination = $dest })
# copy the folder including its files and subfolders (but not empty subfolders)
# for more switches see https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
robocopy $_.FullName $dest  /XO /S /R:0
}
# write the output to csv file
$output | Export-Csv -Path 'E:\backup.csv' -NoTypeInformation

Powershell List Excel Files and Copy

I apologize for the naivety of this post, please forgive my newness.
I have approximately 20,000 network files to filter through and copy certain ones to a local drive.
File List Requirements:
Excel files of various type (.xls, .xlsx, .xlsm)
Only files modified after 4/1/2022
Only files that contain "2022" in the filename
If the file meets those requirements then:
Copy the file to a local folder (original folder path structure doesn't matter, all files can go in one folder)
Output the original path and filename to a txt file, along with the lastwritedate
I have created the following code, which successfully obtains all Excel files and creates the filename list:
Get-ChildItem "D:\network_folder\" -Filter *.xls -Recurse | Select-Object -Property FullName, LastWriteTime |
Export-Csv -Path "C:\local_folder\file_list.csv" -Force -NoTypeInformation
However I cannot figure out the following issues:
how and where to filter for the lastwritetime
how and where to filter for the "2022" in the name
how and where to copy the files to the local folder
Right now I'm just putting this all in the command line; do I need to make some file to run this process?
Thank you for any assistance you can provide!
I guess you want something like this.
It searches for files in the source folder with 2022 in the name and with .xls (or anything following xls) as the extension.
It then loops over these items, creates the subfolder structure where they were found in the destination folder, copies the files and finally writes out a CSV file with information of the original file.
$sourcePath = 'D:\network_folder'
$destination = 'D:\dest_folder'
$refDate = [datetime]::new(2022,4,2) # --> next day date as of midnight
Get-ChildItem -Path $sourcePath -Filter '*2022*.xls*' -File -Recurse |
Where-Object {$_.LastWriteTime -ge $refDate} | ForEach-Object {
# create the destination folder if it does not already exist
$target = Join-Path -Path $destination -ChildPath $_.DirectoryName.Substring($sourcePath.Length)
$null = New-Item -Path $target -ItemType Directory -Force
# copy the file
$_ | Copy-Item -Destination $target
# output the wanted properties from the original file
$_ | Select-Object Name, FullName, LastWriteTime
} | Export-Csv -Path "C:\local_folder\file_list.csv" -Force -NoTypeInformation

using 7zip to zip in powershell 5.0

I have customized one powershell code to zip files older than 7 days from a source folder to a subfolder and then delete the original files from source after zipping is complete. The code is working fine with inbuilt Compress-Archive and Remove-Item cmdlets with less volume of files, but takes more time and system memory for a large volume of files. So, I'm working on a solution using 7zip instead as it's faster.
The script below does the zipping correctly, but it does not follow the condition of only including files older than 7 days, and it deletes all the files from the source folder. It should zip and delete only files older than 7 days.
I have tried all possible ways to troubleshoot, but no luck. Can anybody suggest a possible solution?
if (-not (test-path "$env:ProgramFiles\7-Zip\7z.exe")) {throw "$env:ProgramFiles\7-Zip\7z.exe needed"}
set-alias sz "$env:ProgramFiles\7-Zip\7z.exe"
$Date = Get-Date -format yyyy-MM-dd_HH-mm
$Source = "C:\Users\529817\New folder1\New folder_2\"
$Target = "C:\Users\529817\New folder1\New folder_2\ARCHIVE\"
Get-ChildItem -path $Source | sz a -mx=9 -sdel $Target\$Date.7z $Source
There are several problems here. The first is that 7-Zip doesn't accept a list of files from the pipeline; furthermore, even if it did, your GCI is selecting every file rather than selecting by date. The reason that it works at all is that you are passing the source folder as a parameter to 7-Zip.
7-Zip accepts the list of files to zip as a command line argument:
Usage: 7z <command> [<switches>...] <archive_name> [<file_names>...] [@listfile]
And you can select the files you want by filter the output from GCI by LastWriteTime.
Try changing your last line to this
sz a -mx=9 -sdel $Target\$Date.7z (gci -Path $Source |? LastWriteTime -lt (Get-Date).AddDays(-7) | select -expandproperty FullName)
If you have hundreds of files and long paths then you may run into problems with the length of the command line in which case you might do this instead:
gci -Path $Source |? LastWriteTime -lt (Get-Date).AddDays(-7) |% { sz a -mx=9 -sdel $Target\$Date.7z $_.FullName }
Consider a temporary file with a list of those files which need to be compressed:-
$tmp = "$($(New-Guid).guid).tmp"
set-content $tmp (gci -Path $Source |? LastWriteTime -lt (Get-Date).AddDays(-7)).FullName
sz a -mmt=8 out.7z "@$tmp"
Remove-Item $tmp
Also, looking at the parameters to 7-Zip: -mx=9 will be the slowest for potentially only a small size gain. Perhaps leave that parameter out, take the default, and consider adding -mmt=8 to use multiple threads.
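For example, the earlier one-liner with those suggestions applied (default compression level, multi-threaded) might look like this:
sz a -mmt=8 -sdel $Target\$Date.7z (gci -Path $Source |? LastWriteTime -lt (Get-Date).AddDays(-7) | select -expandproperty FullName)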

How to rename large number of files using Powershell and a CSV

Ultimately, I need a solid PowerShell script that will take a folder with several hundred video files, import the existing file names, look up the new file name in a CSV, and rename each file. The old filenames are simple (i.e. File1.mp4, File2.mp4, etc.). I would like to prepend a date to the front of each filename in the format (YYYY-MM-DD).
For testing, I created a folder on my desktop with (10) text files, each with a unique file name.
My CSV file appears as follows:
Image of CSV
The "newfilename" column, was created by using the Concatenate command in Excel.
`=CONCATENATE(TEXT(A2, "yyyy-mm-dd")," ", B2)`
As much as I would just like PowerShell to handle everything, I feel using Excel for most of this might be the best way.
In my testing, everything was in one folder. However, at work, I will have video files on one drive, and the script will have to be in a folder on my desktop. Because I am in a corporate network, I need a special batch file to run my scripts, which is nothing new. I just modify the script name, and away it goes!
So what commands do I need in order to keep the script separate from the video files AND the CSV file?
Here is the code that I have so far. Everything works when it's in one folder.
PS C:\Users\ceran\Desktop\Rename Project> Import-Csv -Path .\MyFileList.csv | ForEach-Object {
>> $Src = Join-Path -Path $TargetDir -ChildPath $_.filename
>> $Dst = Join-Path -Path $TargetDir -ChildPath $_.newfilename
>> Rename-Item -Path $Src -NewName $Dst
>> }
Thanks in advance for the help!
Chris
I'm not sure what the date column is in your Excel file and if you want to rename all files in the folder, but if that is the case, you don't need a csv file at all and can do this:
$sourceFolder = 'X:\Path\to\the\video\files' # change this to the real path
Get-ChildItem -Path $sourceFolder -Filter '*.mp4' -File | # iterate through the files in the folder
Where-Object {$_.Name -notmatch '^\d{4}-\d{2}-\d{2}'} | # don't rename files that already start with the date
Rename-Item -NewName { '{0:yyyy-MM-dd} {1}' -f $_.LastWriteTime, $_.Name } -WhatIf
This uses the parameter -Filter '*.mp4' to get only files with an .mp4 extension. For the files in your test folder (Desktop\Rename Project), change this to -Filter '*.txt'.
If you want all files renamed, no matter what the extension, simply remove the Filter from the cmdlet.
Because of the -WhatIf switch, no file is actually renamed and the code just shows in the console what would happen. Once satisfied that this is OK, remove the -WhatIf
Hope that helps.
$targetdir="C:\path\to\where\our\file\directory\is"
$pathtocsv="c:\path\to\csv.csv"
Import-Csv -Path $pathtocsv | ForEach-Object {
$Src = Join-Path -Path $TargetDir -ChildPath $_.filename
$Dst = Join-Path -Path $TargetDir -ChildPath $_.newfilename
Rename-Item -Path $Src -NewName $Dst
}
Why would this not work in any situation? The paths are absolute and stored in variables, so it doesn't matter where the script itself lives.
By the way, if the csv had the columns path and newname, it could be piped directly to rename-item:
path,newname
file.txt,file2.txt
import-csv ren.csv | Rename-Item -whatif
What if: Performing the operation "Rename File" on target "Item: /Users/js/foo/file.txt Destination: /Users/js/foo/file2.txt".

Compress-Archive Error: Cannot access the file because it is being used by another process

I would like to zip a path (with a Windows service running inside).
When the service is stopped, it works perfectly; when the service is running, I get the exception:
The process cannot access the file because it is being used by another
process.
However, when I zip with 7-zip, I don't have any exception.
My command:
Compress-Archive [PATH] -CompressionLevel Optimal -DestinationPath("[DEST_PATH]") -Force
Do you have any idea to perform the task without this exception?
Copy-Item allows you to access files that are being used in another process.
This is the solution I ended up using in my code:
Copy-Item -Path "C:\Temp\somefolder" -Force -PassThru |
Get-ChildItem |
Compress-Archive -DestinationPath "C:\Temp\somefolder.zip"
The idea is that you pass all the copied items through the pipeline instead of having to copy them to a specific destination first before compressing.
I like to zip up a folder's content rather than the folder itself; therefore I'm using Get-ChildItem before compressing in the last line.
Sub-folders are already included, so there is no need to use -Recurse in the first line.
A good method to access files that are being used by another process is to create a snapshot using the Volume Shadow Copy Service.
To do so, one can simply use PowerShell's WMI cmdlets:
$Path = "C:\my\used\folder"
$directoryRoot = [System.IO.Directory]::GetDirectoryRoot($Path).ToString()
$shadow = (Get-WmiObject -List Win32_ShadowCopy).Create($directoryRoot, "ClientAccessible")
$shadowCopy = Get-WmiObject Win32_ShadowCopy | ? { $_.ID -eq $shadow.ShadowID }
$snapshotPath = $shadowCopy.DeviceObject + "\" + $Path.Replace($directoryRoot, "")
Now you can use the $snapshotPath as -Path for your Compress-Archive call.
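For example, a minimal sketch (the destination zip path here is just illustrative; deleting the WMI instance afterwards removes the snapshot again):
Compress-Archive -Path $snapshotPath -CompressionLevel Optimal -DestinationPath "C:\Temp\snapshot_backup.zip" -Force
# clean up the shadow copy once the archive has been created
$shadowCopy.Delete()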
This method can also be used to create backups with symlinks.
From there on, you can use the linked folders to copy the backed-up files, or to compress them without those access exceptions.
I created a similar function and a small cmdlet in this Gist: Backup.ps1
There was a similar requirement where only a few extensions needed to be added to the zip.
With this approach, we can copy all the files, including locked ones, to a temp location, zip them up, and then delete the copied logs.
It is a bit of a lengthy process, but it made my day!
$filedate = Get-Date -Format yyyyMMddHHmmss
$zipfile = 'C:\Logs\logfiles'+ $filedate +'.zip'
New-Item -Path "c:\" -Name "Logs" -ItemType "directory" -ErrorAction SilentlyContinue
Robocopy "<Log Location>" "C:\Logs\" *.txt *.csv *.log /s
Get-ChildItem -Path "C:\Logs\" -Recurse | Compress-Archive -DestinationPath $zipfile -Force -ErrorAction Continue
Remove-Item -Path "C:\Logs\" -Exclude *.zip -Recurse -Force