I need to create an archive file of the files in C:\src\csv without recursion. It appears that Compress-Archive will always recurse subdirectories.
The following code fails. It appears to only be using the file Name and not FullName in the list of files. In what ways could I overcome this?
Compress-Archive -Path (Get-ChildItem -File -Path 'C:\src\csv') -DestinationPath $Env:TEMP\t.zip
Specify the files' FullName and indicate an array of values with the @() construct.
Compress-Archive -Path @((Get-ChildItem -File -Path 'C:\src\csv').FullName) -DestinationPath $Env:TEMP\t.zip
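The same fix, split into two steps for readability (a sketch; the -Force switch is an assumption that overwriting an existing t.zip is acceptable):
$files = (Get-ChildItem -File -Path 'C:\src\csv').FullName
Compress-Archive -Path $files -DestinationPath "$Env:TEMP\t.zip" -Force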
I'm using the Get-ChildItem command in a script. I just noticed that it will return file names beginning with bernie3_first or bernie3_second, but not folders. How can this be modified to return folders as well?
$FileNames = Get-ChildItem -Path $FilePath -Include bernie3_first*,bernie3_second* -File -Recurse | select BaseName
Your code includes a parameter that filters the results to show only files and not folders: the -File switch.
Here is an example:
# This would get content of C:\Test, files and folders
Get-ChildItem -Path C:\Test
# This would get content of C:\Test, only folders
Get-ChildItem -Path C:\Test -Directory
# This would get content of C:\Test, only files
Get-ChildItem -Path C:\Test -File
If you want to read more about each of these parameters, you can check the documentation.
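So for your original command, a sketch of the fix is simply to drop -File, which returns matching files and folders alike:
$FileNames = Get-ChildItem -Path $FilePath -Include bernie3_first*, bernie3_second* -Recurse |
    Select-Object BaseName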
I have a folder that contains several thousand files. I would like to write a PowerShell script that loops through the files and copies each file whose filename contains a specific keyword. In pseudocode:
For each file in C:\[Directory]
If filename contains "Presentation" Then
copy file in C:\[Directory 2]
Simply like this?
copy-item "C:\SourceDir\*Presentation*" "C:\DestinationDir"
or like this:
copy-item "C:\SourceDir\*" "C:\DestinationDir" -Filter "*Presentation*"
But there is a risk if a directory inside the source directory has "Presentation" in its name. In that case, take any of the methods proposed here and add -File to the Get-ChildItem command.
Like in this short version of Robdy's code:
gci "C:\SourceDir" -file | ? Name -like "*Presentation*" | cpi -d "C:\DestinationDir"
That code should do the trick:
$files = Get-ChildItem -Path "C:\path\to\source\folder"
$files | Where-Object Name -Like "*Presentation*" | Copy-Item -Destination "C:\path\to\destination\folder"
Of course, it can be written in one line, but I put it in two for visibility.
Edit: as Esperento57 pointed out, you might want to add -File to the Get-ChildItem cmdlet to not include folders with 'Presentation' in their name. Also, depending on your needs, you might want to use the -Recurse parameter to include files in subfolders.
If you have files in subfolders and you want to keep their relative paths in the destination folder, you'll have to change the Copy-Item stage of the script a bit, to something like:
Copy-Item -Destination $_.FullName.Replace('C:\path\to\source\folder','C:\path\to\destination\folder')
And for the above you'll have to make sure that the destination folders actually exist; Copy-Item on its own will not create missing intermediate directories (the next answer works around that with New-Item -Force).
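Putting it together, a sketch of the full pipeline (assuming the destination folder tree already exists; the $_ variable only resolves inside ForEach-Object here):
Get-ChildItem -Path 'C:\path\to\source\folder' -File -Recurse |
    Where-Object Name -Like "*Presentation*" |
    ForEach-Object {
        Copy-Item -Path $_.FullName -Destination $_.FullName.Replace('C:\path\to\source\folder', 'C:\path\to\destination\folder')
    }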
This seems to work:
$src = "Dir1"
$dst = "Dir2"
Get-ChildItem $src -Filter "*Presentation*" -Recurse | % {
    # New-Item -Force creates the target file and any missing parent folders
    New-Item -Path $_.FullName.Replace($src,$dst) -ItemType File -Force
    # then overwrite that placeholder with the actual file
    Copy-Item -Path $_.FullName -Destination $_.FullName.Replace($src,$dst) -Force
}
Try something like this:
Get-ChildItem "C:\Your\Directory" -File -Filter *YourKeyWordToIsolate* |
Foreach-Object { Copy-Item $_.FullName -Destination "C:\Your\New\Directory" }
... but, of course, you'll need to fill in some of the blanks left open by your pseudocode example.
Also, that's a one-liner, but I inserted a carriage return for easier readability.
I want to remove the following files from the source folder; however, the source contains a sub-directory with similarly named files. When I run the following command, it also deletes the files in the sub-directory. Is there a way to delete just the files in the source folder and not in the sub-directory?
Example: test_1_file, test_2_file, and test_3_file exist in each directory, TestFolder and TestFolder/sub.
$source = "TestFolder"
remove-item -Path $source -filter test_*_file -recurse -force
It's usually easiest to pipe the output of the Get-ChildItem cmdlet into Remove-Item. You can then use the better filtering of Get-ChildItem, as I think -Recurse in Remove-Item has some issues. You can even use Where-Object to filter further before passing to Remove-Item:
$source = "TestFolder"
Get-ChildItem -Path $source -Filter test_*_file -Recurse |
Where-Object {$_.Fullname -notlike "$source\sub\*"} |
Remove-Item -Force
If the files to delete:
- are all located directly in $source
- and no other files / directories must be deleted:
Remove-Item -Path $source/test_*_file -Force
No need for -Recurse (as @Bill_Stewart notes).
Note: For conceptual clarity I've appended the wildcard pattern (test_*_file) directly to the $source path.
Using a wildcard expression separately with -Filter is generally faster (probably won't matter here), but it has its quirks and pitfalls.
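For comparison, a -Filter-based sketch of the same top-level deletion (no -Recurse, so the sub-directory is never touched):
Get-ChildItem -Path $source -Filter test_*_file -File | Remove-Item -Force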
Can I somehow exclude a folder when I compress an archive like this?
$compress = Compress-Archive $DestinationPath $DestinationPath\ARCHIVE\archiv-$DateTime.zip -CompressionLevel Fastest
Right now it saves the whole folder structure of $DestinationPath to the archive, but since the archive sits in the same folder, the previous archive gets zipped into the new one, roughly doubling the archive's size every time I run the command.
Get all the files you want to compress, excluding the files and folders you don't want compressed, and then pass that to the cmdlet:
# target path
$path = "C:\temp"
# construct archive path
$DateTime = (Get-Date -Format "yyyyMMddHHmmss")
$destination = Join-Path $path "ARCHIVE\archive-$DateTime.zip"
# exclusion rules. Can use wild cards (*)
$exclude = @("_*.config","ARCHIVE","*.zip")
# get files to compress using the exclusion filter
$files = Get-ChildItem -Path $path -Exclude $exclude
# compress
Compress-Archive -Path $files -DestinationPath $destination -CompressionLevel Fastest
You can use the -Update option of Compress-Archive. Select your subdirectories with Get-ChildItem and Where-Object, like this:
$YourDirToCompress="c:\temp"
$ZipFileResult="C:\temp10\result.zip"
$DirToExclude=@("test", "test1", "test2")
Get-ChildItem $YourDirToCompress -Directory |
where { $_.Name -notin $DirToExclude} |
Compress-Archive -DestinationPath $ZipFileResult -Update
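Note that the pipeline above only feeds the subdirectories to Compress-Archive; if the loose files in the root should be archived too, a follow-up sketch using the same -Update trick:
Get-ChildItem $YourDirToCompress -File |
    Compress-Archive -DestinationPath $ZipFileResult -Update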
I know this question is rather old, but I wanted to post my solution here. This solution has worked for me, and I hope it may help someone else having the same issue.
I took the ideas from the previous answers and developed them a bit.
So generally speaking, what you need to do is create two lists: one for the files in the root directory and another one for the directories (excluding the directory you want to omit). Then you concatenate these two lists and pass them to the -Path parameter of the Compress-Archive cmdlet.
Voila! It will create a .zip archive with all files and directories we need, preserving the directory structure.
$files = Get-ChildItem -Path /RootDir -File
$directories = Get-ChildItem -Path /RootDir -Recurse -Directory -Exclude DirToExclude
# .FullName avoids the Name-only string conversion issue mentioned in the first question above
Compress-Archive -Path ($files + $directories).FullName -DestinationPath Archive.zip
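To double-check which entries landed in the archive, a quick sketch using .NET's ZipFile class (assumes .NET 4.5+, as shipped with PowerShell 5):
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip = [System.IO.Compression.ZipFile]::OpenRead((Resolve-Path 'Archive.zip'))
$zip.Entries.FullName   # list every entry, with its relative path inside the zip
$zip.Dispose()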
I am new to PowerShell. I use the following PowerShell script to copy files from a network share, but it takes ridiculously long compared to a traditional Windows batch file. What could be the cause?
$dlls=get-childitem -path "\\myShare\myBinFolder" -include *.dll -recurse
copy-item $dlls -destination c:\bins
Thanks
Update (1:38 PM, 1/13/2011): Why is Get-ChildItem so slow?
http://blogs.msdn.com/b/powershell/archive/2009/11/04/why-is-get-childitem-so-slow.aspx
Do not use the Include parameter. Use the Filter parameter instead. Include will require every file to be returned from the share and filtered locally. Using Filter should allow the filtering to happen on the remote end.
$dlls = Get-ChildItem -Path "\\myShare\myBinFolder" -Filter *.dll -recurse
or, using the positional feature of these parameters:
$dlls = Get-ChildItem \\myShare\myBinFolder *.dll -r
In fact, the only time I would ever use Include over Filter is if I needed to specify multiple filter terms (Include takes a string array) e.g.:
Get-ChildItem . -inc *.h,*.cpp,*.rc -r
One way to optimize this is to avoid collecting everything into a variable first, so files are copied as they stream out of Get-ChildItem. Try this:
Get-ChildItem *.dll -Path \\Myshare\Folder -recurse | % { Copy-item $_.FullName -destination C:\bins }
You can use Measure-Command to measure how much time these two methods are taking. You can do that by:
(Measure-Command { Get-ChildItem *.dll -Path \\Myshare\Folder -recurse | % { Copy-item $_.FullName -destination C:\bins } }).TotalMilliseconds
and
(Measure-Command {$dlls = Get-ChildItem *.dll -Path \\Myshare\Folder -recurse; copy-item $dlls -Destination C:\bins}).TotalMilliseconds
All you really need from the remote system is a list of the full paths to the .dll files in that share. Get-ChildItem is overkill for that and has known issues working with large directory structures remotely.
See if this isn't a lot quicker:
cmd /c dir \\Myshare\Folder\*.dll /b /s |% {Copy-Item $_ -destination C:\bins}
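A PowerShell-native sketch in the same spirit (assumes .NET 4+): .NET's enumerator streams plain path strings instead of building FileInfo objects for every file:
[System.IO.Directory]::EnumerateFiles('\\Myshare\Folder', '*.dll', 'AllDirectories') |
    ForEach-Object { Copy-Item -Path $_ -Destination 'C:\bins' }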