I am using Windows Server 2012 R2 (64-bit) with PowerShell version 4 available. I am trying to zip and unzip files. When I try the Write-Zip command, it throws the following error:
Write-Zip : The term 'Write-Zip' is not recognized as the name of a cmdlet, function, script file, or operable
program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
What should I do to fix it? Do I need to install zip/WinRAR on the server? Or is there any other command to zip/unzip files?
Write-Zip seems to be part of the PowerShell Community Extensions (http://pscx.codeplex.com/), which require a separate installation before you can use them.
However, if you just want to create a Zip archive from a folder, you could just run
$source = "c:\temp\source"
$archive = "c:\temp\archive.zip"
Add-Type -assembly "system.io.compression.filesystem"
[io.compression.zipfile]::CreateFromDirectory($source, $archive)
This utilizes the CreateFromDirectory method of the .NET Framework ZipFile class. It creates a zip archive from the files located inside the $source folder and writes it to the path defined in the $archive variable. Note that the ZipFile class was introduced in .NET Framework 4.5.
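For the unzip direction, the same class offers ExtractToDirectory; a minimal sketch along the same lines (the paths are placeholders, and the destination folder must not already contain files with the same names):
$archive = "c:\temp\archive.zip"
$destination = "c:\temp\extracted"
Add-Type -assembly "system.io.compression.filesystem"
[io.compression.zipfile]::ExtractToDirectory($archive, $destination)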
You can use the Shell.Application COM object (New-Object -ComObject Shell.Application) and copy the zip's items with flags to unzip.
$filePath = "foo.zip"
$shell = New-Object -ComObject Shell.Application
$zipFile = $shell.NameSpace($filePath)
$destinationFolder = $shell.NameSpace("C:\Program Files\WindowsPowerShell\Modules")
$copyFlags = 0x00
$copyFlags += 0x04 # Hide progress dialogs
$copyFlags += 0x10 # Overwrite existing files
$destinationFolder.CopyHere($zipFile.Items(), $copyFlags)
Credit source https://github.com/hashicorp/best-practices/blob/master/packer/scripts/windows/install_windows_updates.ps1#L12-L22
This does not work on the Windows 'Core' edition. If possible, upgrade to PowerShell 5 and use Expand-Archive, since it is much simpler.
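For reference, on PowerShell 5 both directions are one-liners with the built-in cmdlets (the paths are placeholders):
Compress-Archive -Path C:\temp\source -DestinationPath C:\temp\archive.zip
Expand-Archive -Path C:\temp\archive.zip -DestinationPath C:\temp\extracted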
This should work under PS4. See the Add-Zip and New-Zip functions below.
[CmdletBinding()]
Param(
    [Parameter(Mandatory=$True)]
    [ValidateScript({Test-Path -Path $_ })]
    [string]$sourceDirectory,

    [Parameter(Mandatory=$True)]
    [ValidateScript({-not (Test-Path -Path $_ -PathType Leaf)})]
    [string]$destinationFile,

    [Parameter(Mandatory=$True)]
    [int]$noOlderThanHours
)

function New-Zip
{
    param([string]$zipfilename)
    # Write an empty zip header so the shell treats the file as a zip archive
    Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
    (dir $zipfilename).IsReadOnly = $false
}

function Add-Zip
{
    param([string]$zipfilename)

    if (-not (Test-Path $zipfilename))
    {
        Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        (dir $zipfilename).IsReadOnly = $false
    }

    $shellApplication = New-Object -ComObject Shell.Application
    $zipPackage = $shellApplication.NameSpace($zipfilename)

    foreach ($file in $input)
    {
        $zipPackage.CopyHere($file.FullName)
        Start-Sleep -Milliseconds 500
    }
}

$oldest = (Get-Date) - (New-TimeSpan -Hours $noOlderThanHours)
$filesToZip = dir $sourceDirectory -Recurse | Where-Object { $_.LastWriteTime -gt $oldest }

Write-Host "Going to zip the following files:"
$filesToZip | foreach { Write-Host $_.FullName }

$filesToZip | Add-Zip $destinationFile
If you can upgrade to PowerShell v5 (https://www.microsoft.com/en-us/download/details.aspx?id=50395), it has Compress-Archive and Expand-Archive natively: https://richardspowershellblog.wordpress.com/2014/10/25/powershell-5-zip-and-unzip/
For PowerShell version 4, you may be able to use this search http://www.powershellgallery.com/items?q=zip&x=0&y=0. This also looks to do what you are looking for: https://www.powershellgallery.com/packages/Microsoft.PowerShell.Archive/1.0.1.0
To install the modules, you need to type:
install-module -name <module name>
powershellgallery.com is a site anyone can upload to, so please check and understand a module before running it.
Hope this helps.
Thanks, Tim.
The Write-Zip installation could have been performed incorrectly. An incorrect manual edit of the PSModulePath environment variable may cause this:
Bad (original) value:
PSModulePath = %SystemRoot%\system32\WindowsPowerShell\v1.0\Modules\;C:\Program Files (x86)\PowerShell Community Extensions\Pscx3\;C:\Program Files\Intel\
Good value (which fixed the problem):
PSModulePath = C:\Program Files (x86)\PowerShell Community Extensions\Pscx3\;%SystemRoot%\system32\WindowsPowerShell\v1.0\Modules\;C:\Program Files\Intel\
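To verify what PowerShell actually picks up after the change, you can inspect the variable from a new console and check that the module is discoverable (Pscx is the module name the Community Extensions install):
$env:PSModulePath -split ';'
Get-Module -ListAvailable Pscx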
Good day, I would like to ask for help in finding a solution for copying each MSI package to a remote machine, using a link to NAS storage.
# Get list of servers
param(
    [ValidateSet('STUDENT_LAB', 'LIBRARY_LAB', 'TEACHER_LAB')]
    [Parameter(Mandatory = $true,
        HelpMessage = 'Select one of the valid servers by typing one of these names: STUDENT_LAB, LIBRARY_LAB, TEACHER_LAB')]
    [string]$ServerGroup
)

$servers = @{
    STUDENT_LAB = ('192.168.1.1','192.168.1.2','192.168.1.3')
    LIBRARY_LAB = ('192.168.10.1','192.168.10.2','192.168.10.3')
    TEACHER_LAB = ('192.168.15.1','192.168.15.2','192.168.15.3')
}[$ServerGroup]

Write-Output "The user chose $ServerGroup"

# This is what I don't know how to implement - download file from NAS storage on the remote machine
$sourcefiles = @(
    '\\NASSTORAGE\MSI\MICROSOFT\Microsoft-ODBCDriver-11-SQLServer-x64\msodbcsql.msi'
    '\\NASSTORAGE\MSI\MICROSOFT\Microsoft-ODBCDriver-17-SQLServr-x64\msodbcsql_17.2.0.1_x64.msi'
    '\\NASSTORAGE\MSI\MICROSOFT\Microsoft-OLEDBDriver-SQL Server-x64\msoledbsql_18.1.0.0_x64.msi'
)

foreach ($server in $servers) {
    # Destination UNC path changes based on server name
    $destinationPath = "\\$server\D$\tmp\"

    # Check that the full folder structure exists and create it if it doesn't
    if (!(Test-Path $destinationPath)) {
        New-Item -ItemType Directory -Force -Path $destinationPath
    }

    # Copy the files across
    Copy-Item $sourcefiles $destinationPath

    # List of packages to install
    $msiList = @(
        'Microsoft-ODBCDriver-11-SQLServer-x64\msodbcsql.msi'
        'Microsoft-ODBCDriver-17-SQLServr-x64\msodbcsql_17.2.0.1_x64.msi'
        'Microsoft-OLEDBDriver-SQL Server-x64\msoledbsql_18.1.0.0_x64.msi'
    )

    # Now I'm trying to install on the remote machine
    foreach ($msi in $msiList) {
        $install = Join-Path -Path $destinationPath -ChildPath $msi
        Start-Process "msiexec.exe" -ArgumentList "/I $install",'/qn' -Wait
    }
}
And is there any way to check whether the MSI was installed properly?
Thank you for your time.
You can add this to the installation section:
$LaunchMsi = Start-Process "msiexec.exe" -ArgumentList "/I $install",'/qn' -Wait -PassThru
$ReturnCode = $LaunchMsi.ExitCode
if (($ReturnCode -eq "0") -OR ($ReturnCode -eq "3010")) {Write-Host "installation OK, return code : $ReturnCode"}
Else {Write-host "installation KO, return code : $ReturnCode"}
I have a .zip containing an installer (setup.exe and associated files).
How can I run setup.exe in a PowerShell script without extracting the zip?
Also, I need to pass command line parameters to setup.exe.
I tried
& 'C:\myzip.zip\setup.exe'
but I get an error
... not recognized as the name of a cmdlet, function, script file, or operable program.
This opens the exe:
explorer 'C:\myzip.zip\setup.exe'
but I cannot pass parameters.
What you're asking is not possible. You must extract the zip file in order to be able to run the executable. The explorer statement only works because the Windows Explorer does the extraction transparently in the background.
What you can do is write a custom function to encapsulate extraction, invocation, and cleanup.
function Invoke-Installer {
    Param(
        [Parameter(Mandatory=$true)]
        [ValidateScript({Test-Path -LiteralPath $_})]
        [string[]]$Path,

        [Parameter(Mandatory=$false)]
        [string[]]$ArgumentList = @()
    )

    Begin {
        Add-Type -Assembly System.IO.Compression.FileSystem
    }

    Process {
        $Path | ForEach-Object {
            # Split "C:\myzip.zip\setup.exe" into the zip path and the relative path inside the zip
            $zip, $exe = $_ -split '(?<=\.zip)\\+', 2
            if (-not $exe) { throw "Invalid installer path: ${_}" }

            # Temp folder named after the zip
            $tempdir = Join-Path $env:TEMP ([IO.Path]::GetFileNameWithoutExtension($zip))
            [IO.Compression.ZipFile]::ExtractToDirectory($zip, $tempdir)

            $installer = Join-Path $tempdir $exe
            & $installer @ArgumentList

            Remove-Item $tempdir -Recurse -Force
        }
    }
}
Invoke-Installer 'C:\myzip.zip\setup.exe' 'arg1', 'arg2', ...
Note that this requires .Net Framework v4.5 or newer.
I'm in the process of learning Powershell, and am working on a little script that will upload a group of files to an FTPS server nightly. The files are located on a network share in a sub-directory containing the date in the name. The files themselves will all begin with the same string, let's say "JONES_". I have this script working for FTP, but I don't quite get what I need to do to get it to work for FTPS:
# Set yesterday's date (since uploads will happen at 2am)
$YDate = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')
#Create Log File
$Logfile = "C:\powershell\$YDate.log"
Function LogWrite
{
    Param ([string]$logstring)
    Add-Content $Logfile -Value $logstring
}
# Find Directory w/ Yesterday's Date in name
$YesterdayFolder = Get-ChildItem -Path "\\network\storage\location" | Where-Object {$_.FullName.contains($YDate)}
If ($YesterdayFolder) {
    # We specify the directory where all files that we want to upload are contained
    $Dir = $YesterdayFolder

    # FTP server
    $ftp = "ftp://ftps.site.com"
    $user = "USERNAME"
    $pass = "PASSWORD"

    $webclient = New-Object System.Net.WebClient
    $webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)

    $FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") | Where-Object {$_.Name.StartsWith("JONES","CurrentCultureIgnoreCase")}

    foreach ($item in ($FilesToUpload))
    {
        LogWrite "Uploading file: $YesterdayFolder\Report\$item"
        $uri = New-Object System.Uri($ftp + $item.Name)
        $webclient.UploadFile($uri, $item.FullName)
    }
} Else {
    LogWrite "No files to upload"
}
I'd rather not have to deal with a 3rd party software solution, if at all possible.
Using psftp didn't work for me; I couldn't get it to connect to the FTP server over SSL. I ended up (reluctantly) using WinSCP with this code:
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USER:PASS#ftps.hostname.com:21/directory/ -explicitssl" "put """"' + $Item.FullName + '""""" "exit"'
Invoke-Expression $PutCommand
In the foreach loop.
I'm not sure if you would consider this as "3rd party software" or not, but you can run PSFTP from within Powershell. Here is an example of how you could do that (source):
$outfile = "$YesterdayFolder\Report\$($item.Name)"
"rm $outfile`nput $outfile`nbye" | out-file batch.psftp -force -Encoding ASCII
$user = "USERNAME"
$pass = "PASSWORD"
&.\psftp.exe -l $user -pw $pass $ftp -b batch.psftp -be
I want to update 2 workspaces from two different TFS servers in one script using PowerShell.
The first workspace updates without any problems. After the update is finished, PowerShell connects to the second workspace, but doesn't update the local data like it did the first time.
I guess the old connection might still block the pipe or something like that, but I haven't found any cmdlet to clear the pipeline. My code looks like this:
param(
    [string]$TestTFS = "http://TestTFS",
    [string]$ProdTFS = "http://ProdTFS",
    [string]$Teamproject = "$/TeamprojectPath",
    [string]$LocalTestWorkspace = "C:\LocalTestWorkspacePath",
    [string]$LocalProdWorkspace = "C:\LocalProdWorkspacePath"
)

# Import Microsoft.TeamFoundation.PowerShell snap-in
Add-PSSnapin Microsoft.TeamFoundation.PowerShell

# Connect to production TFS
$ProdEnvServer = Get-TfsServer -Name $ProdTFS
Write-Host "tfsConnect =" $ProdEnvServer

# Get prod team project
Get-TfsChildItem $Teamproject -Server $ProdEnvServer

# Update files in local prod workspace
Update-TfsWorkspace -Force -Recurse $LocalProdWorkspace

# Connect to test TFS
$TestEnvServer = Get-TfsServer -Name $TestTFS
Write-Host "tfsConnect =" $TestEnvServer

# Get test team project
Get-TfsChildItem $Teamproject -Server $TestEnvServer

# Update files in local test workspace
Update-TfsWorkspace -Force -Recurse $LocalTestWorkspace
After 3 months and no one coming up with an answer, I just assume that the cmdlets don't work as they should. The only option here seems to be a workaround.
# Copy Team Project from Prod to Test TFS
param(
    [string]$TestTFS = "http://TestTFS",
    [string]$ProdTFS = "http://ProdTFS",
    [String]$Teamproject = "$/Teamproject",
    [String]$LocalTestWorkspace = "C:\LocalTestWorkspacePath",
    [String]$LocalProdWorkspace = "C:\LocalProdWorkspacePath"
)

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.VersionControl.Client")

try
{
    clear

    $LocalTestProjectPath = $LocalTestWorkspace + $Teamproject.Substring(1)
    $LocalProdProjectPath = $LocalProdWorkspace + $Teamproject.Substring(1)

    # Connect to production-TFS
    Write-Host "Getting latest of $ProdTFS"
    $tfsColProd = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($ProdTFS)
    [Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer] $vcsProd = $tfsColProd.GetService([type] "Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer")

    # TryGetWorkspace is sometimes buggy and doesn't return an existing workspace
    # Delete the existing workspace manually beforehand if that happens
    $workspaceProd = $vcsProd.TryGetWorkspace($LocalProdWorkspace)
    $isProdTempWorkspace = $false

    # Create workspace if it doesn't exist
    if (-not $workspaceProd) {
        Write-Host "No workspace found, creating temporary for prod"
        $workspaceProd = $vcsProd.CreateWorkspace("Temp_" + [System.Guid]::NewGuid().ToString())
        $workspaceProd.Map($Teamproject, $LocalProdProjectPath)
        $isProdTempWorkspace = $true
    }

    $itemSpecFullTeamProj = New-Object Microsoft.TeamFoundation.VersionControl.Client.ItemSpec($Teamproject, "Full")
    $fileRequest = New-Object Microsoft.TeamFoundation.VersionControl.Client.GetRequest($itemSpecFullTeamProj,
        [Microsoft.TeamFoundation.VersionControl.Client.VersionSpec]::Latest)

    $workspaceProd.Get($fileRequest, [Microsoft.TeamFoundation.VersionControl.Client.GetOptions]::GetAll)

    if ($isProdTempWorkspace) {
        Write-Host "Deleting temporary workspace for prod"
        $workspaceProd.Delete()
    }

    Write-Host "Getting latest of $TestTFS"
    $tfsColTest = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($TestTFS)
    $vcsTest = $tfsColTest.GetService([type] "Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer")

    # TryGetWorkspace is sometimes buggy and doesn't return an existing workspace
    # Delete the existing workspace manually beforehand if that happens
    [Microsoft.TeamFoundation.VersionControl.Client.Workspace] $workspaceTest = $vcsTest.TryGetWorkspace($LocalTestWorkspace)
    $isTestTempWorkspace = $false

    # Create workspace if it doesn't exist
    if (-not $workspaceTest) {
        Write-Host "No workspace found, creating temporary for test"
        $workspaceTest = $vcsTest.CreateWorkspace("Temp_" + [System.Guid]::NewGuid().ToString())
        $workspaceTest.Map($Teamproject, $LocalTestProjectPath)
        $isTestTempWorkspace = $true
    }

    $workspaceTest.Get($fileRequest, [Microsoft.TeamFoundation.VersionControl.Client.GetOptions]::GetAll)

    # Remove local test folder and copy prod folder into test workspace
    Write-Host "Copying over Prod to Test"
    # Delete updated test project folder
    Remove-Item -Path $LocalTestProjectPath -Force -Recurse
    # Copy prod folder to test workspace
    Copy-Item -Path $LocalProdProjectPath -Destination $LocalTestProjectPath -Force -Recurse

    # Calling tfpt is the only thing that works
    Write-Host "Comparing for changes"
    $ps = New-Object System.Diagnostics.Process
    $ps.StartInfo.Filename = $env:TFSPowerToolDir + "tfpt.exe"
    $ps.StartInfo.Arguments = "online /adds /deletes /diff /noprompt /recursive $LocalTestProjectPath"
    $ps.StartInfo.RedirectStandardOutput = $false # careful, only output works, has hanging problems (2k buffer limit)
    $ps.StartInfo.RedirectStandardError = $false
    $ps.StartInfo.UseShellExecute = $false
    $ps.Start()
    $ps.WaitForExit()

    # Check in new test project folder into test environment
    $wsCheckinParams = New-Object Microsoft.TeamFoundation.VersionControl.Client.WorkspaceCheckInParameters(
        @($itemSpecFullTeamProj), "Update project to production environment version")
    # CheckIn better manually to check for errors
    $workspaceTest.CheckIn($wsCheckinParams)

    if ($isTestTempWorkspace) {
        Write-Host "Deleting temporary workspace for test"
        $workspaceTest.Delete()
        Remove-Item -Path D:\Development -Force -Recurse
    }
}
catch [System.Exception]
{
    Write-Host "Exception: " ($Error[0]).Exception
    EXIT $LASTEXITCODE
}
My approach is very similar to Zittelrittel's. Just send the path and it will automatically figure out the workspace.
This did not work in PowerShell ISE (x86); I had to use the 64-bit version!
Add-PSSnapin Microsoft.TeamFoundation.PowerShell
Write-Host "Updating Workspace1, please wait..."
Update-TfsWorkspace -item C:\dev\Workspace1\code -Recurse | Format-Table
Write-Host "Updating Workspace2, please wait..."
Update-TfsWorkspace -item C:\dev\Workspace2\code -Recurse | Format-Table
In your calls to update TFS workspace, pipe the result to out-null. This should effectively remove any data that would otherwise be stored in the pipeline.
Update-TfsWorkspace -Force -Recurse $LocalProdWorkspace | Out-Null
Update-TfsWorkspace -Force -Recurse $LocalTestWorkspace | Out-Null
I am trying to use PowerShell to do a batch conversion of Word Docx to PDF - using a script found on this site:
http://blogs.technet.com/b/heyscriptingguy/archive/2013/03/24/weekend-scripter-convert-word-documents-to-pdf-files-with-powershell.aspx
# Acquire a list of DOCX files in a folder
$Files=GET-CHILDITEM "C:\docx2pdf\*.DOCX"
$Word=NEW-OBJECT -COMOBJECT WORD.APPLICATION
Foreach ($File in $Files) {
# open a Word document, filename from the directory
$Doc=$Word.Documents.Open($File.fullname)
# Swap out DOCX with PDF in the Filename
$Name=($Doc.Fullname).replace("docx","pdf")
# Save this File as a PDF in Word 2010/2013
$Doc.saveas([ref] $Name, [ref] 17)
$Doc.close()
}
And I keep on getting this error and can't figure out why:
PS C:\docx2pdf> .\docx2pdf.ps1
Exception calling "SaveAs" with "16" argument(s): "Command failed"
At C:\docx2pdf\docx2pdf.ps1:13 char:13
+ $Doc.saveas <<<< ([ref] $Name, [ref] 17)
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
Any ideas?
Also - how would I need to change it to also convert doc (not docX) files, as well as use the local files (files in same location as the script location)?
Sorry - never done PowerShell scripting...
This will work for doc as well as docx files.
$documents_path = 'c:\doc2pdf'
$word_app = New-Object -ComObject Word.Application
# This filter will find .doc as well as .docx documents
Get-ChildItem -Path $documents_path -Filter *.doc? | ForEach-Object {
$document = $word_app.Documents.Open($_.FullName)
$pdf_filename = "$($_.DirectoryName)\$($_.BaseName).pdf"
$document.SaveAs([ref] $pdf_filename, [ref] 17)
$document.Close()
}
$word_app.Quit()
The above answers all fell short for me, as I was doing a batch job converting around 70,000 Word documents this way. As it turns out, doing this repeatedly eventually leads to Word crashing, presumably due to memory issues (the error was some COMException that I didn't know how to parse). So, my hack to get it to proceed was to kill and restart Word every 100 docs (an arbitrarily chosen number).
Additionally, when it did crash occasionally, there would be resulting malformed PDFs, each of which was generally 1-2 KB in size. So, when skipping already generated PDFs, I make sure they are at least 3 KB in size. If you don't want to skip already generated PDFs, you can delete that if statement.
Excuse me if my code doesn't look good, I don't generally use Windows and this was a one-off hack. So, here's the resulting code:
$Files = Get-ChildItem -Path '.\path\to\docs' -Recurse -Include "*.doc*"
$counter = 0
$filesProcessed = 0
$Word = New-Object -ComObject Word.Application

Foreach ($File in $Files) {
    $Name = "$(($File.FullName).Substring(0, $File.FullName.LastIndexOf("."))).pdf"
    if ((Test-Path $Name) -And (Get-Item $Name).Length -gt 3kb) {
        echo "skipping $($Name), already exists"
        continue
    }
    echo "$($filesProcessed): processing $($File.FullName)"

    $Doc = $Word.Documents.Open($File.FullName)
    $Doc.SaveAs($Name, 17)
    $Doc.Close()

    # Restart Word every ~100 documents to work around crashes/memory issues
    if ($counter -gt 100) {
        $counter = 0
        $Word.Quit()
        [System.Runtime.Interopservices.Marshal]::ReleaseComObject($Word)
        $Word = New-Object -ComObject Word.Application
    }
    $counter = $counter + 1
    $filesProcessed = $filesProcessed + 1
}
This works for me (Word 2007):
$wdFormatPDF = 17
$word = New-Object -ComObject Word.Application
$word.visible = $false
$folderpath = Split-Path -parent $MyInvocation.MyCommand.Path
Get-ChildItem -path $folderpath -recurse -include "*.doc" | % {
    $path = ($_.fullname).substring(0,($_.FullName).lastindexOf("."))
    $doc = $word.documents.open($_.fullname)
    $doc.saveas($path, $wdFormatPDF)
    $doc.close()
}
$word.Quit()
None of the solutions posted here worked for me on Windows 8.1 (by the way, I'm using Office 365). My PowerShell somehow does not like the [ref] arguments (I don't know why; I use PowerShell very rarely).
This is the solution that worked for me:
$Files=Get-ChildItem 'C:\path\to\files\*.docx'
$Word = New-Object -ComObject Word.Application
Foreach ($File in $Files) {
    $Doc = $Word.Documents.Open($File.FullName)
    $Name = ($Doc.FullName).replace('docx', 'pdf')
    $Doc.SaveAs($Name, 17)
    $Doc.Close()
}
I've updated this one to work with the latest Office:
# Get invocation path
$curr_path = Split-Path -parent $MyInvocation.MyCommand.Path
# Create a PowerPoint object
$ppt_app = New-Object -ComObject PowerPoint.Application
#$ppt.visible = $false
# Get all objects of type .ppt? in $curr_path and its subfolders
Get-ChildItem -Path $curr_path -Recurse -Filter *.ppt? | ForEach-Object {
    Write-Host "Processing" $_.FullName "..."

    # Open it in PowerPoint
    $document = $ppt_app.Presentations.Open($_.FullName, 0, 0, 0)

    # Create a name for the PDF document; they are stored in the invocation folder!
    # If you want them to be created locally in the folders containing the source PowerPoint file, replace $curr_path with $_.DirectoryName
    $pdf_filename = "$($curr_path)\$($_.BaseName).pdf"

    # Save as PDF -- 32 is the literal value of ppSaveAsPDF
    #$opt = [Microsoft.Office.Interop.PowerPoint.PpSaveAsFileType]::ppSaveAsPDF
    $document.SaveAs($pdf_filename, 32)

    # Close the PowerPoint file
    $document.Close()
}
# Exit and release the PowerPoint object
$ppt_app.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($ppt_app)