Long story short, I have a PowerShell script which compresses several folders into zip files.
In order to compress a single directory into a zip file, I use this command:
Compress-Archive -Path $SourcePath -DestinationPath $OutputPath -CompressionLevel Optimal
Here $SourcePath is an absolute path ending in *, e.g. C:\Build\Output\*, and $OutputPath is an absolute path ending in .zip, e.g. C:\Build\Debug.zip.
There are a lot of files and folders in the source path.
The issue I experience is that, scattered around the zip file, folders have a duplicate empty file. This causes problems when trying to unzip the archive with e.g. 7-Zip.
Interestingly enough, I do not see this issue with the built-in unzip in Total Commander.
I am wondering: is this an issue with the PowerShell command, or with 7-Zip?
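One way to tell whether the duplicate entries are really stored in the archive (pointing at Compress-Archive) or are only an artifact of how 7-Zip reads it is to list the raw entries with .NET's ZipFile class. This is just a diagnostic sketch; the archive path is taken from the example above:

```powershell
# List every entry stored in the archive, including zero-byte entries
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip = [System.IO.Compression.ZipFile]::OpenRead('C:\Build\Debug.zip')
try {
    $zip.Entries | ForEach-Object {
        # An entry with Length 0 whose name does not end in a slash is a
        # zero-byte file entry - the kind that shows up as a "duplicate empty file"
        '{0,10}  {1}' -f $_.Length, $_.FullName
    }
} finally {
    $zip.Dispose()
}
```

If the same entry name appears twice (once as a folder, once as a zero-length file), the duplication is in the archive itself rather than in 7-Zip's reading of it.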
Related
I'm working on an Alteryx workflow which creates multiple PDFs and outputs them to a directory. I need to call a PowerShell script at the end of the workflow to compress the PDFs into a zip file and save it in the same location.
I found this:
PowerShell Compress-Archive By File Extension, but it requires copying the files to another location and then compressing them. Is there a way I can just compress all PDFs in a certain folder and output the zip file to the same location? Due to restrictions in Alteryx, I might not be able to work with the output if it is in a different location.
My current PowerShell script:
Compress-Archive -LiteralPath 'D:\temp\test zipping flow\' -DestinationPath 'D:\temp\final.zip' -Force
This works, but it zips files with other extensions as well, whereas I only want .pdf files.
Any suggestions are greatly appreciated.
Instead of -LiteralPath, use -Path so you can add a wildcard character (*):
Compress-Archive -Path 'D:\temp\test zipping flow\*.pdf' -DestinationPath 'D:\temp\final.zip' -Force
As an alternative, for when you need to use -LiteralPath (perhaps because the path contains characters that -Path would interpret, such as square brackets), you could do this instead:
Get-ChildItem -LiteralPath 'D:\temp\test zipping flow' -Filter '*.pdf' -File |
Compress-Archive -DestinationPath 'D:\temp\final.zip' -Force
I have a requirement as below.
Source : C:\s
Destination: C:\d
The file paths are more than 255 characters long.
Files should be moved based on their last modified (written) date, using a 10-day cutoff, and the move should preserve the complete folder structure. Any file modified within the last 10 days should stay at the source in its original folder, while older files should be moved to the destination, recreated under the same folder structure and relative path as at the source.
I have tried a PowerShell script using days; however, the files are moved but the folder structure is lost, and the folders stay at the source.
Get-ChildItem -Path C:\s -Recurse | Where-Object {$_.LastWriteTime -lt (Get-date).AddDays(0)} | Move-Item -destination C:\d
So far the output gives only files, not the folder structure. Empty folders should also be moved to the destination.
Thanks
Suman
The issue is that you are getting all the files with Get-ChildItem -Recurse and filtering them, but when those files are piped to Move-Item -Destination you are essentially saying to take every file (regardless of where it came from) and put it directly into the single folder C:\d. If you want to preserve the directory structure, you have to compute and specify that structure yourself in the Move-Item -Destination parameter.
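As a sketch of what that looks like, using the paths and the 10-day cutoff from the question (the relative-path handling here is one possible approach, not the only one):

```powershell
# Move files older than 10 days from C:\s to C:\d, recreating the folder structure
$src = 'C:\s'
$dst = 'C:\d'
Get-ChildItem -Path $src -Recurse -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-10) } |
    ForEach-Object {
        # Path of the file relative to the source root, e.g. 'a\b\file.txt'
        $relative  = $_.FullName.Substring($src.Length).TrimStart('\')
        $targetDir = Join-Path $dst (Split-Path $relative -Parent)
        if (-not (Test-Path $targetDir)) {
            New-Item -ItemType Directory -Path $targetDir -Force | Out-Null
        }
        Move-Item -LiteralPath $_.FullName -Destination (Join-Path $dst $relative)
    }
```

Note that this only moves files; empty folders would need a separate pass, which is one reason the Robocopy approach below is simpler.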
@AdminOfThings is correct; an easier way to do the move is to use Robocopy. Experimenting with the various switches should get you what you need, e.g.:
Robocopy.exe C:\s C:\d /move /e /minage:10
Where:
/move moves the files instead of copying them
/e copies subdirectories, including empty ones
/minage:10 moves only files older than 10 days
I am using a PowerShell script to zip up files that are older than 60 days. Some of these files have really long filenames, so I get the "filename or extension is too long" error.
I would rather not go into each file and change the names so I need a way to be able to apply something to all the files at the same time or have my script bypass the error somehow. This script is also going to be run on several computers so I would prefer not to download something on to each one.
This is the script:
#Set source and target
$Source = "D:\Testing\"
$Target = "$ENV:USERPROFILE\Desktop\TEST.zip"

#Set time parameters
$Days = 60
$LastWrite = (Get-Date).Date.AddDays(-$Days)

#Invoke 7-Zip
if (-not (Test-Path "$env:ProgramFiles\7-Zip\7z.exe")) {
    throw "$env:ProgramFiles\7-Zip\7z.exe needed"
}
Set-Alias zip "$env:ProgramFiles\7-Zip\7z.exe"

$Files = Get-ChildItem $Source -Recurse |
    Where-Object -FilterScript { $_.LastWriteTime -lt $LastWrite }

zip a -mx=9 $Target $Files
I am using 7-zip to zip up the files and I have PS version 5.1.
As mentioned in the comments, one way around long file names is to store relative paths. 7-Zip will store relative paths if you give it an input file listing the relative paths, and they resolve to the files you want to archive, as described in this answer.
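A minimal sketch of that list-file approach, assuming the D:\Testing source and the 60-day cutoff from the question (files.txt is a temporary file name I've made up):

```powershell
# Build a list of paths relative to the source folder, then hand it to 7-Zip.
# 7-Zip reads file lists passed with the @listfile syntax.
Set-Location 'D:\Testing'
Get-ChildItem -Recurse -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-60) } |
    ForEach-Object { Resolve-Path -Relative -LiteralPath $_.FullName } |
    Set-Content 'files.txt'
& "$env:ProgramFiles\7-Zip\7z.exe" a -tzip "$env:USERPROFILE\Desktop\TEST.zip" '@files.txt'
```

Because the stored paths are relative to D:\Testing rather than absolute, each entry is shorter, which is what buys back room under the path length limit.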
Intermediate files can be messy, so I've written a script that uses the ZipFileExtensions' CreateEntryFromFile method to store a relative path in a zip file.
You can specify -ParentFolder on the command line to store paths relative to the parent, including a UNC path if you want to archive files on another computer. If -ParentFolder is not specified it will choose the script's folder as the parent and store paths relative to the script.
Copy the code to a new script named ArchiveOldLogs.ps1 and run it with this command line:
.\ArchiveOldLogs.ps1 -ParentFolder "D:\Testing\" -FileSpecs @("*.*") -Filter { $_.LastWriteTime -lt (Get-Date).AddDays(-60)} -DeleteAfterArchiving:$false
That will give you 11 more characters at the end of the path to store, which should be enough to cover the 10-character difference between the Windows and zip path length limits. Try a deeper parent folder if you still get errors. Files that can't be archived, or are already archived, will be skipped.
Remove -DeleteAfterArchiving:$false from the command line when you're comfortable that it's archiving only what you want.
I'm currently trying to streamline the installation process of a few products that each require a PowerShell script to be run. I've run into trouble writing a script to search a directory for these files.
Example: in the directory 'Install' I have four subfolders, 'product1' through 'product4', and each of these folders contains a file 'Script.ps1'.
The first issue is that the install scripts are all named the same ('Script.ps1'), which complicates my first idea of pulling all the files from the subfolders into a centralised location and running them sequentially.
I feel like I'm making this more complicated than it needs to be. Any advice?
Get-ChildItem $installFolder -Include *.ps1 -Recurse
That will list the .ps1 files. You could also do the following to then address the files as a single variable:
$ps1Files = Get-ChildItem $installFolder -Include *.ps1 -Recurse
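Since the scripts all share the same name, one option is to skip the centralised copy entirely and run each script from the folder it lives in, so the identical names never collide. A sketch ($installFolder is assumed to point at the 'Install' directory):

```powershell
# Run every discovered install script sequentially, each from its own folder
$installFolder = 'C:\Install'   # hypothetical path
Get-ChildItem $installFolder -Include *.ps1 -Recurse | ForEach-Object {
    Push-Location $_.DirectoryName   # in case a script assumes its own folder as CWD
    try {
        & $_.FullName
    } finally {
        Pop-Location
    }
}
```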
I'm trying to create a workflow for designing multiple Shopify themes based on the same theme. I have written a simple PowerShell script that collects the necessary files and compresses them into a zip file.
Here is a simplified version of my script:
# Copy all files from the base theme to my temporary folder
Copy-Item $baseTheme"*" $tempFolder -recurse -force -exclude ".git"
# Include the files specific to the current theme
Copy-Item $specificTheme"assets" $tempFolder -recurse -force
Copy-Item $specificTheme"config" $tempFolder -recurse -force
Copy-Item $specificTheme"layout" $tempFolder -recurse -force
Copy-Item $specificTheme"snippets" $tempFolder -recurse -force
Copy-Item $specificTheme"templates" $tempFolder -recurse -force
# Compress the temporary folder
Compress-Archive $tempFolder $zipFileName
When I manually perform these steps, and create the zip file in Windows using Send To > Compressed (zipped) folder, Shopify is completely happy with the zip file.
However, when I upload the result of this script it gives me the following error:
There was 1 error:
zip does not contain a valid theme: missing template "layout/theme.liquid", missing template "templates/index.liquid", missing template "templates/collection.liquid", missing template "templates/product.liquid", missing template "templates/page.liquid", missing template "templates/cart.liquid", and missing template "templates/blog.liquid"
I've double and triple checked the zip file, and all of the required files exist. I've messed around with the -CompressionLevel of Compress-Archive. I've tried other methods of zipping folders within PowerShell. All with no luck.
I really can't see any difference between the results of the script and compressing manually. Any ideas?
I am answering my own question, but I'm not the one that solved it. I posted a link to this question on the shopify forum and someone named Adrian posted the following answer.
Short version: download 7-Zip and use the test archive facility. Zips created with Compress-Archive show no folders, whereas those created with the Send To zip in Windows do. The folders do exist inside the actual archive though, so I'd say the header info is malformed.
Looks like there is a structural error within the files that Compress-Archive creates, which Windows is happy with but the (probably Unix-based) Shopify servers don't accept.
I installed 7-Zip and used the Test Archive feature to evaluate both folders. Here is the (truncated) output:
Compressed with Compress-Archive
Archives: 1
Files: 77
There are no errors.
Compressed with Send-To Zip context menu
Archives: 1
Folders: 6
Files: 76
There are no errors.
At this point, I'm just happy knowing what's going on. But in order to really call this problem solved, I updated my PowerShell script to use 7-Zip for compressing the folder.
I replaced this:
# Compress the temporary folder
Compress-Archive $tempFolder $zipFileName
With this:
# Compress the temporary folder
& "c:\Program Files\7-Zip\7z.exe" a -tzip $zipFileName $tempFolder
I uploaded the archive to Shopify and everything worked just fine.
Now, I'm really not that excited about installing and running third-party software to do what (I believe) should be a basic OS task. I've reported this issue to Microsoft here. Who knows, maybe they'll fix it.
This looks like an issue with the folder structure inside the archive file.
If you inspect the archive file created via PowerShell, does it show the same structure as the file created manually?
This link shows the expected folder structure for a Shopify Theme:
index.liquid
product.liquid
collection.liquid
cart.liquid
blog.liquid
article.liquid
page.liquid
list_collections.liquid
search.liquid
404.liquid
gift_cards.liquid
customers/account.liquid
customers/activate.liquid
customers/addresses.liquid
customers/login.liquid
customers/order.liquid
customers/register.liquid
customers/reset_password.liquid
password.liquid
settings_schema.json
theme.liquid (layout file)