ls into an extracted directory - PowerShell

Using PowerShell, I am downloading and extracting a file that has a directory and another file in it (it's basically from https://aka.ms/downloadazcopy-v10-windows). I'd like to be able to get into the directory after extraction.
So, in PS, I am at c:\AzCopyTest while downloading the file. It is being extracted at the same location. Here's the code for it:
$URL = "https://aka.ms/downloadazcopy-v10-windows"
New-Item -ItemType Directory -Path c:\AzCopyTest
$Destination = "c:\AzCopyTest\zzz.zip"
$WebClient = New-Object -TypeName System.Net.WebClient
$WebClient.DownloadFile($URL, $Destination)
$ExtractLocation = "c:\AzCopyTest"
$ExtractShell = New-Object -ComObject Shell.Application
$file = $ExtractShell.NameSpace($Destination).Items()
$ExtractShell.NameSpace($ExtractLocation).CopyHere($file)
How can I get into the folder after the extraction is done? FYI, I don't want to ls into it directly (or do it manually). I'd like to be able to list out the items in that directory and get the first directory. azcopy_windows_amd64_10.3.4 is what the directory is called, BTW. Now, when MS releases a new version (say 10.3.5), the directory will be renamed, and I do not want to go back in and manually change it. You get where I am going with this.
I know Get-ChildItem -Path $ExtractLocation -Recurse will list the items in the directory and its sub-directories, but unfortunately it does not serve my purpose.
Any help is greatly appreciated!

I tried this:
(Get-ChildItem -Path $ExtractLocation -Recurse -Directory -Force -ErrorAction SilentlyContinue | Select-Object).Name and it worked.
FYI, at a later time, if MS decides to add another directory to the zipped file, you can simply do this:
(Get-ChildItem -Path $ExtractLocation -Recurse -Directory -Force -ErrorAction SilentlyContinue | Select-Object).Name[<position_of_the_directory_starting_from_0>]
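Putting it together, a minimal sketch (assuming, as today, that the archive extracts to exactly one top-level directory) that changes into whatever directory the extraction created, regardless of its version-stamped name:
$extractedDir = Get-ChildItem -Path $ExtractLocation -Directory |
    Select-Object -First 1
# change into the extracted directory, e.g. c:\AzCopyTest\azcopy_windows_amd64_10.3.4
Set-Location -Path $extractedDir.FullName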

Related

Powershell: Find Folders with (Name) and Foreach Copy to Location Preserve Directory Structure

Got another multi-step process I'm looking to streamline. Basically, I'm looking to build a Powershell script to do three things:
Get-Childitem to look for folders with a specific name (we'll call it NAME1 as a placeholder)
For each folder it finds with that name, I want it to output the full path to a TXT file (so that in the end I wind up with a text file listing the results it found with their full paths; if it finds folders named "NAME1" in five different subdirectories of the folder I give it, I want each full path beginning with the drive letter and ending with "NAME1")
Then I want it to take the list from the TXT file, and copy each file path to another drive and preserve directory structure
So basically, if it searches and finds this:
D:\TEST1\NAME1
D:\TEST7\NAME1
D:\TEST8\NAME1\
That's what I want to appear in the text file.
Then what I want it to do is to go through each line in the text file and plug the value into a Copy-Item (I'm thinking the source directory would get assigned to a variable), so that when it's all said and done, on the second drive I wind up with this:
E:\BACKUP\TEST1\NAME1
E:\BACKUP\TEST7\NAME1
E:\BACKUP\TEST8\NAME1\
So in short, I'm looking for a Get-Childitem that can define a series of paths, which Copy-Item can then use to back them up elsewhere.
I already have one way to do this, but the problem is it seems to copy everything every time, and since one of these drives is an SSD I only want to copy what's new/changed each time (not to mention that would save time when I need to run a backup):
$source = "C:\"
$target = "E:\BACKUP\"
$search = "NAME1"
$source_regex = [regex]::escape($source)
(gci $source -recurse | where {-not ($_.psiscontainer)} | select -expand fullname) -match "\\$search\\" |
    foreach {
        $file_dest = ($_ | split-path -parent) -replace $source_regex,$target
        if (-not (test-path $file_dest)) {mkdir $file_dest}
        copy-item $_ -Destination $file_dest -force -verbose
    }
If there's a way to do this that wouldn't require writing out a TXT file each time I'd be all for that, but I don't know a way to do this the way I'm looking for except a Copy-Item.
I'd be very grateful for any help I can get with this. Thanks all!
If I understand correctly, you want to copy all folders with a certain name, keeping the original folder structure in the destination path and copy only files that are newer than what is in the destination already.
Try
$source = 'C:\'
$target = 'E:\BACKUP\'
$search = 'NAME1'
# -ErrorAction SilentlyContinue because in the C:\ disk you are bound to get Access Denied on some paths
Get-ChildItem -Path $source -Directory -Recurse -Filter $search -ErrorAction SilentlyContinue | ForEach-Object {
    # construct the destination folder path
    $dest = Join-Path -Path $target -ChildPath $_.FullName.Substring($source.Length)
    # copy the folder including its files and subfolders (but not empty subfolders)
    # for more switches see https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
    robocopy $_.FullName $dest /XO /S /R:0
}
If you don't want the console output of robocopy, you can silence it by redirecting all streams to $null (appending *> $null), so neither stdout nor stderr is echoed.
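Note that robocopy reports success through exit codes below 8, so if you silence the output you may still want to surface real failures; a sketch, using the same variables as above:
robocopy $_.FullName $dest /XO /S /R:0 *> $null
if ($LASTEXITCODE -ge 8) {
    # exit codes 8 and above mean one or more files could not be copied
    Write-Warning "robocopy failed for '$($_.FullName)' (exit code $LASTEXITCODE)"
}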
If you want to keep a file after this with both the source paths and the destinations, I'd suggest doing
$source = 'C:\'
$target = 'E:\BACKUP\'
$search = 'NAME1'
$output = [System.Collections.Generic.List[object]]::new()
# -ErrorAction SilentlyContinue because in the C:\ disk you are bound to get Access Denied on some paths
Get-ChildItem -Path $source -Directory -Recurse -Filter $search -ErrorAction SilentlyContinue | ForEach-Object {
    # construct the destination folder path
    $dest = Join-Path -Path $target -ChildPath $_.FullName.Substring($source.Length)
    # add an object to the output list
    $output.Add([PsCustomObject]@{ Source = $_.FullName; Destination = $dest })
    # copy the folder including its files and subfolders (but not empty subfolders)
    # for more switches see https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
    robocopy $_.FullName $dest /XO /S /R:0
}
# write the output to csv file
$output | Export-Csv -Path 'E:\backup.csv' -NoTypeInformation
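If you later want to re-run the same backup from that file, the CSV can be read straight back in; a sketch under the same assumptions:
Import-Csv -Path 'E:\backup.csv' | ForEach-Object {
    # replay each recorded source/destination pair
    robocopy $_.Source $_.Destination /XO /S /R:0
}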

PowerShell script isn't copying like I want

Right at the beginning I should note that I am a bloody beginner, because I can't attend IT classes in my grade.
I want to create a PowerShell script which will copy everything from
C:\Users\Robert\Desktop\test(lots of folders)
to
C:\Users\Robert\Desktop\neu(lots of folders with the exact same names as above)\price
As an absolute beginner, I thought it would be OK to replace the variable folder name with $_Name, because it is the same name in both, but I am obviously wrong and don't know why.
Here is my attempt
Copy-Item "C:\Users\Robert\Desktop\test\$_name\*" -Destination "C:\Users\Robert\Desktop\neu\$_Name\price" -Recurse
It is copying something, but everything ends up in one lump in a new folder in "neu".
I can't avoid creating this script, because doing it by hand would take me at least two or three days.
I am also sorry for my poor English skills.
Thank you
The $_ represents the current pipeline item. I don't see a pipeline in there ... [grin]
The following works by grabbing every file in the source dir & its subdirs, and copying that structure to the destination dir. It uses splatting to structure the parameters neatly.
$SourceDir = "$env:TEMP\Apps - Copy"
$DestDir = "$env:TEMP\Apps - Copy - Two"
$CI_Params = @{
    LiteralPath = $SourceDir
    Destination = $DestDir
    Force = $True
    Recurse = $True
}
Copy-Item @CI_Params
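When run, Copy-Item @CI_Params expands the hashtable keys into parameters, so it is equivalent to Copy-Item -LiteralPath $SourceDir -Destination $DestDir -Force -Recurse.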
If my understanding is correct:
$src = 'C:\Users\Robert\Desktop\test'
$dst = 'C:\Users\Robert\Desktop\neu\{0}\price'
Get-ChildItem $src -Directory | ForEach-Object {
    Copy-Item -Path "$($_.FullName)\*" -Destination ($dst -f $_.BaseName) -Recurse -Force -WhatIf
}
Remove -WhatIf to actually do it.
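For reference, -f is PowerShell's string format operator; it substitutes each folder's BaseName into the {0} placeholder, e.g.:
# hypothetical folder name, for illustration only
'C:\Users\Robert\Desktop\neu\{0}\price' -f 'FolderA'
# -> C:\Users\Robert\Desktop\neu\FolderA\price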

Slight modification to PowerShell command

This PowerShell command works perfectly to copy and extract a zip file across two directories:
$shell = New-Object -COM Shell.Application
$target = $shell.NameSpace('D:\destination\')
$zip = $shell.NameSpace('D:\source\version_*.zip')
$target.CopyHere($zip.Items(), 16)
However, I am struggling with modifying it to select only the latest zip file from the source.
Get the zip file with the most recent modification date in a given directory:
$source = "C:\temp"
$destination = "C:\temp\output"
$zipFile = Get-ChildItem -Path $source -Filter "*.zip" |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
Expand-Archive -Path $zipFile.FullName -DestinationPath $destination
This works by searching for all zip files, sorting them by the modified date descending and then grabbing the "first" one (according to the defined sort order).
I've also used the Expand-Archive command to extract the zip to a specified destination. If you need to copy the zip then that's easy enough using the Copy-Item cmdlet first.
As some commenters have pointed out: Expand-Archive was introduced in version 5 of PowerShell.
However, the logic for getting the "latest" file is unchanged and can easily be plumbed into your existing script:
$zip = $shell.NameSpace($zipFile.FullName)
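Putting the two together, a sketch of the combined COM-based script (paths assumed from your question):
$shell = New-Object -COM Shell.Application
$target = $shell.NameSpace('D:\destination\')
# pick the most recently modified zip in the source directory
$zipFile = Get-ChildItem -Path 'D:\source' -Filter '*.zip' |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
$zip = $shell.NameSpace($zipFile.FullName)
$target.CopyHere($zip.Items(), 16)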

Compress-Archive Error: Cannot access the file because it is being used by another process

I would like to zip a path (with a service windows running inside).
When the service is stopped it works perfectly, but when the service is running I get this exception:
The process cannot access the file because it is being used by another process.
However, when I zip with 7-zip, I don't have any exception.
My command:
Compress-Archive [PATH] -CompressionLevel Optimal -DestinationPath "[DEST_PATH]" -Force
Do you have any idea how to perform the task without this exception?
Copy-Item allows you to access files that are being used in another process.
This is the solution I ended up using in my code:
Copy-Item -Path "C:\Temp\somefolder" -Force -PassThru |
    Get-ChildItem |
    Compress-Archive -DestinationPath "C:\Temp\somefolder.zip"
The idea is that you pass through all the copied items through the pipeline instead of having to copy them to a specific destination first before compressing.
I like to zip up a folder's content rather than the folder itself, therefore I'm using Get-ChildItem before compressing in the last line.
Sub-folders are already included, so there is no need to use -Recurse in the first line to do this.
A good method to access files being used by another process is by creating snapshots using Volume Shadow Copy Service.
To do so, one can simply use PowerShell's WMI cmdlets:
$Path = "C:/my/used/folder"
$directoryRoot = [System.IO.Directory]::GetDirectoryRoot($Path).ToString()
$shadow = (Get-WmiObject -List Win32_ShadowCopy).Create($directoryRoot, "ClientAccessible")
$shadowCopy = Get-WmiObject Win32_ShadowCopy | ? { $_.ID -eq $shadow.ShadowID }
$snapshotPath = $shadowCopy.DeviceObject + "\" + $Path.Replace($directoryRoot, "")
Now you can use the $snapshotPath as -Path for your Compress-Archive call.
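For example (a sketch; the destination path is assumed), followed by cleaning up the snapshot:
Compress-Archive -Path $snapshotPath -DestinationPath 'C:\backup\used-folder.zip' -Force
# remove the shadow copy once the archive has been created
$shadowCopy.Delete()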
This method can also be used to create backups with symlinks.
From there on you can use the linked folders to copy backed up files, or to compress them without those Access exceptions.
I created a similar function and a small cmdlet in this Gist: Backup.ps1
There was a similar requirement where only a few extensions needed to be added to the zip.
With this approach, we can copy all files, including locked ones, to a temp location, zip the files, and then delete the logs.
This is a bit of a lengthy process, but it made my day!
$filedate = Get-Date -Format yyyyMMddHHmmss
$zipfile = 'C:\Logs\logfiles' + $filedate + '.zip'
New-Item -Path "c:\" -Name "Logs" -ItemType "directory" -ErrorAction SilentlyContinue
Robocopy "<Log Location>" "C:\Logs\" *.txt *.csv *.log /s
Get-ChildItem -Path "C:\Logs\" -Recurse | Compress-Archive -DestinationPath $zipfile -Force -ErrorAction Continue
Remove-Item -Path "C:\Logs\" -Exclude *.zip -Recurse -Force

Need a script to publish build output to a staging server

I am trying to write a PowerShell script that will copy a subset of files from a source folder and place them into a target folder. I've been playing with "copy-item" and "remove-item" for half a day and cannot get the desired or consistent results.
For example, when I run the following cmdlet multiple times, the files end up in different locations?!?!:
copy-item -Path $sourcePath -Destination $destinationPath -Include *.dll -Container -Force -Recurse
I've been trying every combination of options and commands I can think of but can't find the right solution. Since I'm sure that I'm not doing anything atypical, I'm hoping someone can ease my pain and provide me with the proper syntax to use.
The source folder will contain a large number of files with various extensions. For example, all of the following are possible:
.dll
.dll.config
.exe
.exe.config
.lastcodeanalysisissucceeded
.pdb
.Test.dll
.vshost.exe
.xml
and so on
The script needs to only copy .exe, .dll and .exe.config files excluding any .test.dll and .vshost.exe files. I also need the script to create the target folders if they don't already exist.
Any help getting me going is appreciated.
try:
$source = "C:\a\*"
$dest = "C:\b"
dir $source -Include *.exe, *.dll, *.exe.config -Exclude *.test.dll, *.vshost.exe -Recurse |
    % {
        $sp = $_.FullName.Replace($source.Replace('\*',''), $dest)
        if (!(Test-Path -Path (Split-Path $sp)))
        {
            New-Item (Split-Path $sp) -Type Directory
        }
        Copy-Item $_.FullName $sp -Force
    }
As long as the files are in one directory, the following should work fine. It might be a bit more verbose than needed, but it should be a good starting point.
$sourcePath = "c:\sourcePath"
$destPath = "c:\destPath"
$items = Get-ChildItem $sourcePath | Where-Object { ($_.FullName -like "*.exe") -or ($_.FullName -like "*.exe.config") -or ($_.FullName -like "*.dll") }
$items | % {
    Copy-Item $_.FullName ($_.FullName.Replace($sourcePath, $destPath))
}
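Note that this Where-Object will also match the .Test.dll and .vshost.exe files; a minimal sketch of the same filter with the exclusions added:
$items = Get-ChildItem $sourcePath | Where-Object {
    (($_.Name -like "*.exe") -or ($_.Name -like "*.exe.config") -or ($_.Name -like "*.dll")) -and
    ($_.Name -notlike "*.test.dll") -and ($_.Name -notlike "*.vshost.exe")
}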