I am using a PowerShell script to zip up files that are older than 60 days. Some of these files have really long filenames, so I get a "filename or extension is too long" error.
I would rather not go into each file and change the names, so I need a way to apply something to all the files at once or have my script bypass the error somehow. This script is also going to be run on several computers, so I would prefer not to download something onto each one.
This is the script:
#Set source and target
$Source = "D:\Testing\"
$Target = "$ENV:USERPROFILE\Desktop\TEST.zip"
#Set time parameters
$Days = 60
$LastWrite = (Get-Date).Date.AddDays(-$Days)
#Invoke 7-zip
if (-not (Test-Path "$env:ProgramFiles\7-Zip\7z.exe")) { throw "$env:ProgramFiles\7-Zip\7z.exe needed" }
set-alias zip "$env:ProgramFiles\7-Zip\7z.exe"
$Target = Get-ChildItem $Source -Recurse | Where-Object -FilterScript {($_.LastWriteTime -ge $LastWrite)}
zip a -mx=9 $Target $Source
I am using 7-Zip to zip up the files, and I have PowerShell version 5.1.
As mentioned in the comments, one way around long file names is to store relative paths. 7-Zip will store relative paths if you specify an input file containing the relative paths, and they resolve to the files you want to archive, as described in this answer.
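For illustration, a rough sketch of that list-file route (the folder, cutoff, and zip path are taken from the question; filelist.txt is just an example name) could look like this:
# Build a list of relative paths for files older than 60 days, then hand the
# list to 7-Zip with its @listfile syntax so only relative paths are stored.
Push-Location 'D:\Testing'
Get-ChildItem -Recurse -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).Date.AddDays(-60) } |
    ForEach-Object { Resolve-Path -Relative $_.FullName } |
    Set-Content 'filelist.txt'
& "$env:ProgramFiles\7-Zip\7z.exe" a -mx=9 "$env:USERPROFILE\Desktop\TEST.zip" '@filelist.txt'
Pop-Location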
Intermediate files can be messy, so I've written a script that uses the ZipFileExtensions class's CreateEntryFromFile method to store a relative path in a zip file.
You can specify -ParentFolder on the command line to store paths relative to the parent, including a UNC path if you want to archive files on another computer. If -ParentFolder is not specified it will choose the script's folder as the parent and store paths relative to the script.
Copy the code to a new script named ArchiveOldLogs.ps1 and run it with this command line:
.\ArchiveOldLogs.ps1 -ParentFolder "D:\Testing\" -FileSpecs @("*.*") -Filter { $_.LastWriteTime -lt (Get-Date).AddDays(-60)} -DeleteAfterArchiving:$false
That will get you 11 more characters at the end of the path to store, which should be enough to get around the 10-character difference between the Windows and Zip path length limits. Try a deeper folder if you still get errors. Files that can't be archived, or are already archived, will be skipped.
Remove -DeleteAfterArchiving:$false from the command line when you're comfortable that it's archiving only what you want.
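For reference, the core of that approach boils down to something like the sketch below. This is not the full ArchiveOldLogs.ps1 script; the folder, cutoff, and zip name are simply the values from the question:
# Store each old file in the zip under a path relative to the parent folder,
# using ZipFileExtensions.CreateEntryFromFile (available in PowerShell 5.1).
Add-Type -AssemblyName System.IO.Compression.FileSystem

$ParentFolder = 'D:\Testing'
$ZipPath      = "$env:USERPROFILE\Desktop\TEST.zip"
$Cutoff       = (Get-Date).Date.AddDays(-60)

$zip = [System.IO.Compression.ZipFile]::Open($ZipPath, 'Create')
try {
    Get-ChildItem -LiteralPath $ParentFolder -Recurse -File |
        Where-Object { $_.LastWriteTime -lt $Cutoff } |
        ForEach-Object {
            # Entry name = path relative to $ParentFolder, with forward slashes
            $relative = $_.FullName.Substring($ParentFolder.Length).TrimStart('\') -replace '\\', '/'
            [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($zip, $_.FullName, $relative, 'Optimal') | Out-Null
        }
}
finally {
    $zip.Dispose()
}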
Back with another request to try and make my life a little easier. The problem: one of the programs I use deposits BMPs (yes, bitmaps; this is an ancient app, and no, I can't configure it not to make BMPs) where I don't need them. I've got a BAT file that can sweep a folder and remove them, but what I'd really like to do is put a copy of said BAT file in each folder where it leaves them, and then every time I run a backup cycle, have it search for those BAT files and, wherever it finds one, run it. (I'd also need to know how to tell it "look in the same folder you're in" -- I think I can do that with something like $searchfolder = ".", but please correct me if I'm wrong.)
I'm guessing this is a Get-Childitem and ForEach, but I've taken a few stabs at it and it won't work. Does anyone have an idea how to go about it?
This is what I've got so far for the parent script to find all instances of "Clear_BMPs.bat":
Get-ChildItem $sourceDir -Include Clear_BMPs.bat -Recurse | ForEach-Object { call "Clear_BMPs.bat" }
And this is what I've got in the child script, to get rid of the BMPs themselves (the filename for it is "Clear_BMPs.bat"):
$searchfile = "*.bmp"
$targetdir = ".\"
Get-ChildItem $targetdir -Include $searchfile | foreach{ "Removing file $($_.FullName)"; Remove-Item -force $_}
I'm still trying to get the Clear_BMPs.bat files to work properly but in my vision it will only search the root of the folder it's in, and not recurse through subdirectories.
Since you're calling from PowerShell, there's no reason to involve batch files, given that the code is under your control.
Indeed, what you show as the content of a Clear_BMPs.bat batch file is PowerShell code, which means you need to store it in a .ps1 file, not a .bat file.
Therefore, your recursive invocation that executes all .ps1 files should look like this:
# Find all 'Clear_BMPs.ps1' scripts in the subdir. tree of $sourceDir
# and invoke them.
Get-ChildItem -Recurse -LiteralPath $sourceDir -Filter Clear_BMPs.ps1 |
ForEach-Object { & $_.FullName }
And the Clear_BMPs.ps1 files in the various directories should contain:
# Remove all *.bmp files from the same dir. in which this .ps1 script is located.
Remove-Item -Path "$PSScriptRoot/*.bmp"
Note the use of the automatic $PSScriptRoot variable, which refers to the directory in which the enclosing .ps1 file is located.
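If you'd like to keep the per-file output from your original attempt, a slightly longer variant of Clear_BMPs.ps1 (same behavior, just with a message per deleted file) could look like this:
# Remove all *.bmp files next to this script, reporting each one as it goes.
Get-ChildItem -LiteralPath $PSScriptRoot -Filter *.bmp -File |
    ForEach-Object {
        "Removing file $($_.FullName)"
        Remove-Item -LiteralPath $_.FullName -Force
    }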
I am trying to zip files that are in directories that have subdirectories and I can't figure out how to zip the files and not the subdirectories.
Here is the current setup:
C:\users\user\appdata\local\folder\
Inside of this folder, I need 3 out of the 20 or so folders that are in there, so I used Get-ChildItem to accomplish this:
GCI C:\users\user\appdata\local\folder | ? {$_.name -like "*folder*"}
Now that I have that, I don't want the subdirectories and just want the files that are sitting in the folder itself. I have not found a way to do this, but I have gotten close with using this:
& "C:\program files\7-zip\7z.exe" "a" "D:\TestBackup\Zipfile.7z" (GCI C:\users\user\appdata\local\folder | ? {$_.name -like "*folder*} | select -expandproperty FullName)
But this gives me the entire contents of the folder. I want to keep the structure so that it looks like this:
folder 1\files
folder 2\files
folder 3\files
I hope I am explaining myself well. The files are all different types of extensions so I was wanting a blanket way to do this or to exclude the subdirectories when zipping.
I had to consult the FAQ to get this right:
7-Zip stores only relative paths of files (without drive letter prefix). You can change the current folder to a folder that is common for all files that you want to compress, and then you can use relative paths:
cd /D C:\dir1\
7z.exe a c:\a.7z file1.txt dir2\file2.txt
Solution:
# Set base directory that is common for all files
Push-Location 'C:\users\user\appdata\local\folder'
# Get names of directories that match filter
$folderNames = (Get-ChildItem -Directory -Filter '*folder*').Name
# Call 7-zip, passing the list of directory names.
& 'C:\Program Files\7-Zip\7z.exe' a 'D:\TestBackup\Zipfile.7z' $folderNames
# Restore current directory
Pop-Location
Remarks:
Push-Location sets the current directory, while Pop-Location restores the previous current directory. Changing the current directory is crucial for this solution, as explained by the 7-zip FAQ. It is also the only way to set the base directory for Resolve-Path -Relative.
Pass -Directory or -File to Get-ChildItem if you are only interested in either directories or files.
Use -Filter instead of Where-Object (alias ?) if you only need simple wildcard filtering. -Filter is faster because the filtering happens in the FileSystem provider at a lower API level, avoiding PowerShell overhead.
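If you also want to leave out the subdirectories inside each matched folder, as the question mentions, one possible variant (sketched with the same paths) is to pass 7-Zip the relative paths of just the top-level files:
# Only the files sitting directly in each matched folder are listed, because
# the inner Get-ChildItem has no -Recurse; relative paths keep 'folder\file'.
Push-Location 'C:\users\user\appdata\local\folder'
$relativeFiles = Get-ChildItem -Directory -Filter '*folder*' |
    Get-ChildItem -File |
    ForEach-Object { Resolve-Path -Relative $_.FullName }
& 'C:\Program Files\7-Zip\7z.exe' a 'D:\TestBackup\Zipfile.7z' $relativeFiles
Pop-Location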
I have multiple files, and I want to copy each file into a folder whose name matches part of the filename.
For example, the files
orange_file100 , orange_file200 , orange_file300 , apple_file120 , apple_file150
I want to move each file to a folder named after part of the filename, say orange and apple, so the result will be:
orange\orange_file100
orange\orange_file200
orange\orange_file300
apple\apple_file120
apple\apple_file150
How can I do that through PowerShell? Should I use Get-ChildItem and then ForEach { Copy-Item }?
You can use Get-ChildItem with -File or -Directory to grab only the files or folders in a folder; that way you won't grab a folder and try to place it in itself.
For example, the code below will only grab the files in the current directory
Get-Childitem -File
You can then split the names to get the fruit name, e.g.
$String.split('_')[0]
Store the results in a list or array of some kind, so that you end up with a list of files and fruit names.
Now you can loop over the list and start to move or copy the files into the right folder structure
Foreach ($file in $FileList) {
    if ($file.Name -match $FruitName) {
        if ($file.Name -notmatch $pwd.Path) {
            mkdir $file.Name
            cd $file.Name
            Move-Item $file.FullName $pwd
        }
    }
}
The code above is just a quick attempt. It probably won't work the first time, and you should make adjustments so you understand what you are doing.
A few notes
$pwd gets the current directory. I'm assuming Get-ChildItem returns the list of files in the expected order, so you will get orange_file100, then orange_file200, and so on.
Get-ChildItem returns PowerShell objects. The file name can be accessed using $_.Name and the full path using $_.FullName.
If -match doesn't work, you can also try -like or -in.
I didn't add creation of the first fruit folder to the code above, but it won't be hard to do.
Remember to play around and find what's best for you.
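To tie the pieces together, here is a hedged end-to-end sketch (the folder layout and file names are assumed from the question); swap Move-Item for Copy-Item if you want copies instead:
# For each file like 'orange_file100', take the text before the first
# underscore, create a folder with that name if needed, and move the file in.
Get-ChildItem -File |
    Where-Object { $_.Name -match '_' } |
    ForEach-Object {
        $fruit  = $_.Name.Split('_')[0]
        $target = Join-Path -Path $_.DirectoryName -ChildPath $fruit
        if (-not (Test-Path -LiteralPath $target)) {
            New-Item -ItemType Directory -Path $target | Out-Null
        }
        Move-Item -LiteralPath $_.FullName -Destination $target
    }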
Long story short, I have a powershell script which compresses several folders into zip-files.
In order to compress a single directory into a zip file, I use this command:
Compress-Archive -Path $SourcePath -DestinationPath $OutputPath -CompressionLevel Optimal
Where $SourcePath is an absolute path ending on *, e.g. C:\Build\Output*, and $OutputPath is an absolute path ending on .zip, e.g. C:\Build\Debug.zip.
There are a lot of files and folders in the source path.
The issue I experience is that, scattered around the zip file, folders have a duplicate empty file. This causes problems when trying to unzip the archive with e.g. 7-Zip.
Interestingly enough, I do not see this issue with the built-in unzip in Total Commander.
I am wondering if this is an issue with the PowerShell command or with 7-Zip?
I'm trying to feed the results of a Get-ChildItem call through io.compression.zipfile to create a zip file of "E:\Applications_Server_Test", excluding two folders "BACKUP" and "BACKUP2".
However, PowerShell seems to be interpreting this as "$items = a string of directories and file names" instead of the recursive collection of directories and files I want to zip. I can find tutorials on using Get-ChildItem to exclude directories, and I can find tutorials on how to zip a full directory or zip multiple directories, but I can't find anything on zipping directories with exclusions. Can somebody tell me where I'm going wrong?
$source = "E:\Applications_Server_Test"
$destination = "E:\AST_Dump.zip"
$items = Get-ChildItem $source -Recurse | ?{ $_.fullname -notmatch "\\backup\\?" }
Add-Type -assembly "system.io.compression.filesystem"
[io.compression.zipfile]::CreateFromDirectory($items, $destination)
Thanks!
At the moment you are trying to archive the object $items, not the folder. Unfortunately, this is not going to work.
One option is to move the backup folders elsewhere on the same drive (this only changes directory records on the drive and doesn't move any data), then archive the whole "E:\Applications_Server_Test" folder, and finally move the backup folders back.
Another option is to use the ZipArchive.CreateEntry method and add the files to the archive one by one, but this is not as elegant ;)
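For what it's worth, a rough sketch of that file-by-file route, using the ZipFileExtensions.CreateEntryFromFile convenience wrapper around CreateEntry and the paths from the question, might look like this:
# Add every file under $source to the zip, skipping anything inside the
# BACKUP or BACKUP2 folders, and store paths relative to $source.
Add-Type -AssemblyName System.IO.Compression.FileSystem

$source      = 'E:\Applications_Server_Test'
$destination = 'E:\AST_Dump.zip'

$zip = [System.IO.Compression.ZipFile]::Open($destination, 'Create')
try {
    Get-ChildItem -LiteralPath $source -Recurse -File |
        Where-Object { $_.FullName -notmatch '\\BACKUP2?\\' } |
        ForEach-Object {
            $entryName = $_.FullName.Substring($source.Length + 1) -replace '\\', '/'
            [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($zip, $_.FullName, $entryName) | Out-Null
        }
}
finally {
    $zip.Dispose()
}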