Using 7-Zip from PowerShell without compression

I am using 7-Zip from PowerShell because I need to zip some folders. This is the code I am using:
Set-StrictMode -Version "2.0"
Clear-Host
$7Zip_ProgramPath = "C:\Program Files\7-zip\7z"
$Destination = "c:\temp\7ztest41.zip"
$Source = "c:\temp\homes\homeuser002"
$Option = "a"
# Quote the executable path inside the command line, since "Program Files" contains a space
$Command = "`"$7Zip_ProgramPath`" $Option $Destination $Source"
#$Command="C:\Program Files\7-zip\7z a c:\temp\7ztest.zip c:\temp\homes\homeuser002"
$ManagementClass = [System.Management.ManagementClass] "\\.\ROOT\cimv2:Win32_Process"
# Shorter: $ManagementClass = [WmiClass] "Win32_Process"
$StartupOptions = [WmiClass] "Win32_ProcessStartup"
$StartupOptions.PsBase.Properties["ShowWindow"].Value = 1
$null = $ManagementClass.Create($Command, $Null, $StartupOptions)
I got this code from this page here: http://www.powershellpraxis.de/index.php/ntfs-filesystem/laufwerke-ordner-dateien-und-freigaben#2.1.2.5.2%20Packen%20mit%207-Zip
Everything is working quite well, except that I do not want my files to be compressed when using this method. I have a folder that is 67 MB, but after zipping it the archive is only 55 MB. Maybe I do not fully understand this code, but I want to change the compression option and do not know where and how. Does anybody know?

Your command line is only specifying an 'Add' operation.
$Option = "a"
The 'Add' option does not disable compression in 7z.exe. You need to explicitly configure the compression option to specify 'no compression'.
$Option = 'a -mx=0'
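For reference, here is the same operation invoked directly with PowerShell's call operator, so you can see where the switch fits; a minimal sketch assuming the default install path:
# -mx=0 sets the compression level to 0 (store only, no compression)
& "C:\Program Files\7-zip\7z.exe" a -mx=0 "c:\temp\7ztest41.zip" "c:\temp\homes\homeuser002"
If you keep the WMI approach from the question, the only change needed is the $Option line above.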


How do I run a PowerShell script on a file from the context menu?

I have written a PS script that replaces a specific string at the beginning of a file, appends another string to the end of the file, and finally writes the result out as XML.
My code might be ugly (I am not a programmer/engineer or anything, just trying to make life easier for some family members who are running a small business), but it works:
$content = Get-Content -Path 'C:\Users\blabla\Desktop\4440341930.txt'
$newContent = $content -replace 'text to be replaced','this is going to replace stuff'
$newContent | Set-Content -Path 'C:\Users\blabla\Desktop\4440341930.txt'
Add-Content C:\Users\blabla\Desktop\4440341930.txt '</Items>'
$x = [xml](Get-Content "C:\Users\blabla\Desktop\4440341930.txt")
$x.Save("C:\Users\blabla\Desktop\4440341930.xml")
I would like them to be able to run this script from the context menu, by right-clicking on a txt file. I did a little research and I kind of get what I have to add to the Registry; however, I'm not sure how to make it work. Since the path of each file they right-click on will be different, the hard-coded path I'm specifying in $content is not going to work.
What do I have to modify in my code to be able to add it to the Registry?
To accomplish this you need to:
1. Create a shortcut in the SendTo folder: "$DestinationPath\AppData\Roaming\Microsoft\Windows\SendTo"
2. Set the shortcut's target to: "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe"
3. Set the arguments to: -File "d:\path\your PS1 file"
In your program, read the file name passed by Explorer as:
Param
(
    [Parameter(Mandatory=$false)]
    [String] $FilePath
)
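With that parameter in place, the hard-coded path in your script becomes the parameter; a minimal sketch using the logic from the question:
# Operate on whatever file Explorer passed in, instead of a fixed path
$content = Get-Content -Path $FilePath
$content -replace 'text to be replaced','this is going to replace stuff' |
    Set-Content -Path $FilePath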
I've written a Setup function that accomplishes steps 1-3, which I include in all my programs that I want on the context menu; then I just run the program with the -Setup switch. We're not supposed to post developed code here, but if you can't figure it out, let me know and I'll post it and hope I don't get killed for it. LOL!
UPDATE:
If you want to pass more than one file, you need to process the files a little differently. Delete the Param block above and use this type of code to retrieve the files:
# Assumes Windows Forms types were loaded earlier, e.g.:
#   Add-Type -AssemblyName System.Windows.Forms
#   $MsgBox  = [System.Windows.Forms.MessageBox]
#   $Buttons = [System.Windows.Forms.MessageBoxButtons]
#   $MBIcons = [System.Windows.Forms.MessageBoxIcon]
If ($Args.Count -eq 0) {
    $Message = "No Files were passed from File Explorer."
    [Void]$MsgBox::Show(
        "$Message", "System Exit", $Buttons::OK, $MBIcons::Stop)
    Show-PowerShell   # author's helper that shows the console window
    Exit              # Comment out for testing from ISE!
}
Else {
    $FilesToCopy = $Args
}
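From there, each file can be run through the question's processing; a sketch (logic borrowed from the question, untested):
foreach ($FilePath in $FilesToCopy) {
    $content = Get-Content -Path $FilePath
    ($content -replace 'text to be replaced','this is going to replace stuff') |
        Set-Content -Path $FilePath
    Add-Content $FilePath '</Items>'
    $x = [xml](Get-Content $FilePath)
    # Save next to the original, swapping the extension to .xml
    $x.Save([System.IO.Path]::ChangeExtension($FilePath, 'xml'))
}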

Running a command with arguments assistance

I have a command that runs a program in silent mode; it uses an XML file as the data repository and a Word template to create multiple Word documents based on a filter XML file.
The command I use is:
"P:\ath to\executable" -Username:Admin -Password:Pa55w0rd -Datadefinition:"C:\Data.xml" -Datafilter:"C:\Filter.xml" -wordtemplate:"C:\Batch\Paul1.dotx" -Targetdocument:="C:\Batch\Paul1.pdf" -filetype:PDF -Log:"C:\Logs\error.log" -Usage:DOCGENSILENT
I need to run this as a PowerShell script which I have mostly managed:
set-executionpolicy unrestricted
$datadefinition = Get-Content "C:\Data file.xml"
$datafilter = Get-Content "C:\Filter for data file.xml"
$wordTemplate = Get-Content "C:\"C:\Template\Paul1.dotx"
$targetFolder = Get-Content "C:\"C:\Paul\Paul.pdf"
Stop-Job = "Executable path" -Username:Admin -Password:Pa55w0rd -Datadefinition:%dataDefinition% -Datafilter:%dataFilter% -wordtemplate:%wordTemplate% -Targetdocument:%targetFolder% -filetype:docx -Log:%logPath% -Usage:DOCGENSILENT
Stop-Job 1
set-executionpolicy restricted
Write-Host -NoNewLine "Press any key to continue..."
$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
My issue is that the script starts the executable but then doesn't pass the variables. Can anyone point me in the right direction to fix this?
Getting this working depends on the behavior of your executable. Some things I noticed:
Shouldn't this:
$wordTemplate = Get-Content "C:\"C:\Template\Paul1.dotx"
be this:
$wordTemplate = "C:\Template\Paul1.dotx"
Are you sure you need Get-Content? (Aside from that, the path and quoting in your sample are not correct.)
Shouldn't this:
$targetFolder = Get-Content "C:\"C:\Paul\Paul.pdf"
be this:
$targetDocument = "C:\Paul\Paul.pdf"
I doubt Get-Content is correct here, since presumably your output file doesn't exist yet? I also renamed the variable so it makes more sense in your command.
In fact, are you sure you need Get-Content for any of those? Aren't you specifying filenames, not the content of the files?
In PowerShell, variables are prefixed with $ rather than being surrounded by %.
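For instance, a trivial sketch of the difference (variable name borrowed from your command):
$logPath = "C:\Logs\error.log"
# cmd.exe-style, NOT expanded by PowerShell:  -Log:%logPath%
# PowerShell-style, expanded as expected:     -Log:$logPath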
Using Set-ExecutionPolicy within a script to enable scripts to run is pointless, because the script is already running. (That is, if execution policy prevented script execution, PowerShell wouldn't let you run the script in the first place.)
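If you do need a relaxed policy, set it when launching the script instead; a sketch with a placeholder path:
powershell.exe -ExecutionPolicy Bypass -File "C:\path\to\script.ps1"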
If my guesses regarding your variables are correct, I think your script should look something like this (note also that I specified a $logFile variable, which I didn't see in your script):
$datadefinition = "C:\Users\Administrator\data\Sample Model_146_object type(s).xml"
$datafilter = "C:\Users\Administrator\data\Sample Model_146_object type(s).xml"
$wordtemplate = "C:\Users\Administrator\Templates\Base object.docx"
$targetdocument = "C:\Users\Administrator\Result\sample test15"
$logfile = "C:\Users\Administrator\Logs\C4W Error.log"
& "C:\Program Files (x86)\Communicator4Word.exe" -Username:Admin -Password: -Datadefinition:$datadefinition -Datafilter:$datafilter -wordtemplate:$wordtemplate -Targetdocument:$targetdocument -filetype:docx -Log:$logfile -Usage:DOCGENSILENT
I don't know the behavior of Communicator4Word.exe when you use -Password: with no password after it. (Is that a syntax error, or should you just omit -Password: altogether?)
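If the bare -Password: turns out to be a syntax error, one workaround is to build the argument list conditionally and splat it; a sketch, untested against Communicator4Word.exe:
# Collect the arguments in an array; each element becomes one argument
$argList = @(
    "-Username:Admin"
    "-Datadefinition:$datadefinition"
    "-Datafilter:$datafilter"
    "-wordtemplate:$wordtemplate"
    "-Targetdocument:$targetdocument"
    "-filetype:docx"
    "-Log:$logfile"
    "-Usage:DOCGENSILENT"
)
# Only pass -Password: when a password is actually set (hypothetical $password)
if ($password) { $argList += "-Password:$password" }
& "C:\Program Files (x86)\Communicator4Word.exe" @argList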

Error in PowerShell due to copying the content of an S3 bucket

I copy the content of an S3 bucket to a local directory; however, I get an error output from PowerShell:
Copy-S3Object : The requested range is not satisfiable
It is pointing to this command:
Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
Why do I get this error? Note that the desired files do indeed get copied locally.
I can't say why you are getting that error returned from S3, but I can tell you that if you are copying multiple objects you probably want to use the -LocalFolder parameter, not -LocalFile. -LocalFolder will preserve the prefixes as subpaths.
When downloading one or more objects from S3, the Read-S3Object cmdlet works the same as Copy-S3Object, but uses -KeyPrefix to specify the common prefix the objects share, and -Folder to indicate the folder they should be downloaded to.
This also reminds me I need to check why we used -LocalFolder on Copy-, and -Folder on Read- although I suspect aliases may also be available to make them consistent.
HTH
(Edit): I spent some time this morning reviewing the cmdlet code and it doesn't appear to me the cmdlet would work as-is on a multi-object download, even though it has a -LocalFolder parameter. If you have a single object to download, then using -Key/-LocalFile is the correct parameter combination. If -LocalFolder is passed, the cmdlet sets up internally to do a single file download instead of treating -Key as a common key prefix to a set of objects. So, I think we have a bug here that I'm looking into.
In the meantime, I would use Read-S3Object to do your downloads. It supports both single (-Key) or multi-object download (-KeyPrefix) modes. https://docs.aws.amazon.com/powershell/latest/reference/index.html?page=Read-S3Object.html&tocid=Read-S3Object
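For example, a minimal sketch of a multi-object download with Read-S3Object (the key prefix and local folder are placeholders):
# Downloads every object under the prefix, preserving prefixes as subfolders
Read-S3Object -BucketName $bucket -KeyPrefix "some/prefix/" -Folder "C:\temp\s3-download" -Region $region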
This seems to occur with folders that do not contain files, since the copy wants to copy files.
I accepted this error and trapped it:
# The copy call from the question, wrapped in try/catch so the error can be trapped
try {
    Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
}
catch [Amazon.S3.AmazonS3Exception]
{
    # Get the error record
    [Management.Automation.ErrorRecord]$e = $_
    # Retrieve information about the runtime error
    $info = [PSCustomObject]@{
        Exception = $e.Exception.Message
        Reason    = $e.CategoryInfo.Reason
        Target    = $e.CategoryInfo.TargetName
        Script    = $e.InvocationInfo.ScriptName
        Line      = $e.InvocationInfo.ScriptLineNumber
        Column    = $e.InvocationInfo.OffsetInLine
        ErrorCode = $e.Exception.ErrorCode
    }
    # Use -eq for the comparison (= would be an assignment)
    if ($info.ErrorCode -eq "InvalidRange") {
        # Do nothing; this is the expected error for folder-only keys
    }
    else {
        # Output information. Post-process collected info, and log it (optional)
        Write-Host $info -ForegroundColor Red
    }
}
This happened to me when I tried to download a file that had more than one dot in its name. Simplifying the file name fixed the error.
File name that gave me error: myfile-18.10.exe
File name that worked: myfile-1810.exe

Moving (not copying) remote files after download with WinSCP .NET assembly

I have this script that downloads all .txt and .log files. But I need to move them to another directory on the server after the download.
So far I just keep getting errors like "cannot move 'file' to '/file'".
try
{
    # Load WinSCP .NET assembly
    Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

    # Setup session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.Protocol = [WinSCP.Protocol]::ftp
    $sessionOptions.HostName = "host"
    $sessionOptions.PortNumber = "port"
    $sessionOptions.UserName = "user"
    $sessionOptions.Password = "pass"

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.DisableVersionCheck = "true"
        $session.Open($sessionOptions)

        $localPath = "C:\users\user\desktop\file"
        $remotePath = "/"
        $fileName = "*.txt"
        $fileNamee = "*.log"
        $remotePath2 = "/completed"

        $directoryInfo = $session.ListDirectory($remotePath)
        $directoryInfo = $session.ListDirectory($remotePath2)

        # Download the file
        $session.GetFiles(($remotePath + $fileName), $localPath).Check()
        $session.GetFiles(($remotePath + $fileNamee), $localPath).Check()
        $session.MoveFile(($remotePath + $fileName, $remotePath2)).Check()
        $session.MoveFile(($remotePath + $fileNamee, $remotePath2)).Check()
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }

    exit 0
}
catch [Exception]
{
    Write-Host $_.Exception.Message
    exit 1
}
You have many problems in your code:
The targetPath argument of the Session.MoveFile method is a path to move/rename the file to.
So if you use the target path /completed, you are trying to move the file to the root folder and rename it to completed, while you probably want to move the file into the folder /completed and keep its name. For that, use the target path /completed/ (or /completed/* to make it more obvious).
Your current code fails because you are renaming the file to the name of an already existing folder.
You actually have the same bug in the .GetFiles calls: you are downloading all files (both *.txt and *.log) to the folder C:\users\user\desktop and saving them all to the same file named file, overwriting one another.
You have parentheses incorrectly placed around both arguments, instead of around the first argument only. While I'm no PowerShell expert, I'd say you are actually omitting the second argument of the method completely this way.
Further, note that the MoveFile method does not return anything (contrary to GetFiles), so there's no object to call the .Check() method on.
MoveFile (note the singular, compared to GetFiles) moves only a single file, so you should not use a file mask. (The present implementation happens to accept a file mask, but this use is undocumented and may be deprecated in future versions.)
Anyway, the best solution is to iterate over the list of actually downloaded files, as returned by GetFiles, and move the files one by one.
This way you avoid a race condition where, between the download and the move, new files are added (which you did not download) and you incorrectly move them to the "completed" folder.
The code should look like (for the first set of files only, i.e. the *.txt):
$remotePath2 = "/completed/"
...
$transferResult = $session.GetFiles(($remotePath + $fileName), $localPath)
$transferResult.Check()
foreach ($transfer in $transferResult.Transfers)
{
    $session.MoveFile($transfer.FileName, $remotePath2)
}
Note that this does not include a fix for the $localPath, as I'm not sure what the path C:\users\user\desktop\file is actually meant to be.
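To cover both sets of files, the same pattern can be wrapped in a loop over the two masks; a sketch reusing the variables from the question:
foreach ($mask in $fileName, $fileNamee)
{
    $transferResult = $session.GetFiles(($remotePath + $mask), $localPath)
    $transferResult.Check()
    foreach ($transfer in $transferResult.Transfers)
    {
        # Move each successfully downloaded file to /completed/
        $session.MoveFile($transfer.FileName, $remotePath2)
    }
}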
There's actually a very similar sample code available:
Moving local files to different location after successful upload
Have you checked to make sure your process has rights to move files to the new directory?
I am doing what Martin suggests here, with success, but I was stuck for some time:
after running $session.MoveFile(), the file was gone from the origin folder, but it did not show up in the destination folder.
The files only appeared in the destination after the session was disposed automatically, some time later (around 30 minutes, I guess).
To avoid this confusion, dispose the session explicitly, like this:
$session.Dispose()
I know this is trivial, but I hope you don't run into the same problem.

Powershell running under a service hangs on *.zip CopyHere

I'm running a Windows Service (Hudson) which in turn spawns a PowerShell process to run my custom PowerShell commands. Part of my script is to unzip a file using CopyHere. When I run this script locally, I see a progress dialog pop up as the files are extracted and copied. However, when this runs under the service, it hangs at the point where a dialog would otherwise appear.
Here's the unzip portion of my script.
# Extract the contents of a zip file to a folder
function Extract-Zip {
    param([string]$zipFilePath, [string]$destination)

    if (test-path($zipFilePath)) {
        $shellApplication = new-object -com shell.application
        $zipFile = get-item $zipFilePath
        $zipFolder = $shellApplication.NameSpace($zipFile.fullname)
        $destinationFile = get-item $destination
        $destinationFolder = $shellApplication.NameSpace($destinationFile.fullname)
        $destinationFolder.CopyHere($zipFolder.Items())
    }
}
I suspect that because it's running under a service process, which is headless (no interaction with the desktop), it's somehow stuck trying to display a dialog.
Is there a way around this?
If it's still relevant: I managed to fix this by passing 1564 as the CopyHere flags parameter.
So in my case the extract-zip function looks like:
function Expand-ZIPFile {
    param(
        $file, $destination
    )
    $shell = new-object -com shell.application
    $zip = $shell.NameSpace($file)
    foreach ($item in $zip.items())
    {
        # 1564 = 4 + 8 + 16 + 512 + 1024 (see flag descriptions below)
        $shell.Namespace($destination).copyhere($item, 1564)
        "$($item.path) extracted"
    }
}
1564 is the sum of the following flags, described here - http://msdn.microsoft.com/en-us/library/windows/desktop/bb787866(v=vs.85).aspx:
(4) Do not display a progress dialog box.
(8) Give the file being operated on a new name in a move, copy, or rename operation if a file with the target name already exists.
(16) Respond with "Yes to All" for any dialog box that is displayed.
(512) Do not confirm the creation of a new directory if the operation requires one to be created.
(1024) Do not display a user interface if an error occurs.
If this is running on Vista or Windows 7, popping up UI from a service isn't going to be seen by the end user as you suspected. See this paper on Session 0 Isolation. However, does the progress dialog require user input? If not, I wouldn't think that would cause the service to hang. I would look for an option to disable the progress display. If you can't find that, then try switching to another ZIP extractor. PSCX 1.2 comes with an Expand-Archive cmdlet. I'm sure there are also others available.
Looking at the documentation for PowerShell, it looks like the -NonInteractive option may help here.
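For example, a minimal sketch of how the service could launch the script (the script path is a placeholder):
powershell.exe -NonInteractive -File "C:\scripts\extract-zip.ps1"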