I have a PowerShell script that searches through the subfolders of a directory, copies any files that contain a specific string in the name, and then moves those into a different folder that it creates. The problem I am having is that the script is not creating a new folder, just a new file without an extension. Here is my script:
Get-ChildItem -Path 'C:\users\user1\Documents\q3\' -Recurse | Where-Object { $_.Name -like "*test2*" -and $_.FullName -notmatch 'newfolder' } | Copy-Item -Destination 'C:\Users\user1\Documents\Q3\test'
There is no parameter with a behavior like "create if not exist" for directories on PowerShell's Copy-Item cmdlet. (I'm using PowerShell 5.1 and could not find one.)
This is a way to achieve your goal with a one-liner:
New-Item -ItemType Directory -Name "test" -Path 'C:\Users\user1\Documents\Q3' -Force; Get-ChildItem -Path 'C:\users\user1\Documents\q3\' -Recurse | Where-Object { $_.Name -like "*test2*" -and $_.FullName -notmatch 'newfolder' } | Copy-Item -Destination 'C:\Users\user1\Documents\Q3\test\'
With -Force, New-Item will not fail if the folder already exists; it simply returns the existing directory.
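Putting the two pieces together, a minimal sketch (paths taken from the question; the folder is created first, then everything matching is piped into Copy-Item):

```powershell
# Create the destination folder up front; -Force makes this a no-op if it already exists
$dest = 'C:\Users\user1\Documents\Q3\test'
$null = New-Item -ItemType Directory -Path $dest -Force

# Copy every matching file into the pre-created folder
Get-ChildItem -Path 'C:\Users\user1\Documents\Q3\' -Recurse |
    Where-Object { $_.Name -like '*test2*' -and $_.FullName -notmatch 'newfolder' } |
    Copy-Item -Destination $dest
```

Assigning New-Item's output to $null keeps the created DirectoryInfo object from cluttering the console.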
Here is my current script and it works fine. Not efficient running same code twice but I don't know how to combine the wildcards... anyway on to the bigger issue.
The below code searches through my $sourceDir, excludes the files listed in $ExclusionFiles, copies all folders and structure as well as any .jpg or any .csv files, then puts them into the $targetDir.
$sourceDir = 'c:\sectionOne\Graphics\Data'
$targetDir = 'C:\Test\'
$ExclusionFiles = @("InProgress.jpg", "input.csv", "PCMCSV2.csv")
# Get .jpg files
Get-ChildItem $sourceDir -Filter "*.jpg" -Recurse -Exclude $ExclusionFiles |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        New-Item -ItemType File -Path $targetFile -Force
        Copy-Item $_.FullName -Destination $targetFile
    }
# Get .csv files
Get-ChildItem $sourceDir -Filter "*.csv" -Recurse -Exclude $ExclusionFiles |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        New-Item -ItemType File -Path $targetFile -Force
        Copy-Item $_.FullName -Destination $targetFile
    }
My list of files in the main $sourceDir that I need to exclude is getting longer, and there are folders I want to exclude as well. Can someone tell me how to:
1. Copy only a list of specific files in the $sourceDir
2. Exclude certain folders in $sourceDir from copying
3. Combine the wildcard search for .jpg and .csv into one statement
I'm still learning so any help would be greatly appreciated!
This is a case where a little bit of Regex will go a long way:
You can filter multiple extensions by using a pretty basic match:
$extensions = 'jpg', 'csv'
$endsWithExtension = "\.(?>$($extensions -join '|'))$"
Get-ChildItem -Recurse |
Where-Object Name -Match $endsWithExtension
You can exclude a list of specific files with one more Where-Object and the -In parameter:
$extensions = 'jpg', 'csv'
$endsWithExtension = "\.(?>$($extensions -join '|'))$"
$ExcludeFileNames = @("InProgress.jpg", "input.csv", "PCMCSV2.csv")
Get-ChildItem -Recurse |
Where-Object Name -Match $endsWithExtension |
Where-Object Name -NotIn $ExcludeFileNames
From there on in, your ForEach-Object is basically correct (nice touch making sure the file exists by using New-Item, though I'd personally assign its output to null and -PassThru the Copy-Item).
Get-ChildItem $sourceDir -Recurse |
    Where-Object Name -Match $endsWithExtension |
    Where-Object Name -NotIn $ExcludeFileNames |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        New-Item -ItemType File -Path $targetFile -Force
        Copy-Item $_.FullName -Destination $targetFile
    }
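A sketch of the null-assignment / -PassThru variant mentioned a moment ago, using the same variables ($sourceDir, $targetDir, $endsWithExtension, $ExcludeFileNames):

```powershell
Get-ChildItem $sourceDir -Recurse |
    Where-Object Name -Match $endsWithExtension |
    Where-Object Name -NotIn $ExcludeFileNames |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        # Discard New-Item's output; let Copy-Item -PassThru emit the copied file instead
        $null = New-Item -ItemType File -Path $targetFile -Force
        Copy-Item $_.FullName -Destination $targetFile -PassThru
    }
```

This way the pipeline emits one object per file actually copied, which is handy if you later want to log or count them.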
I tried to copy several files (a newer version) from one location to several different locations (an older version) based on a search result. The search and rename worked fine, but the copy didn't work. I just need a way to point the current search directory as the -Destination folder.
1. Search for older files (DB2JCC) in all locations on the C: & D: drives
2. Rename files from DB2JCC* to OLD_DB2JCC*.*
3. Copy the new-version DB2JCC* files to all step 2 (above step) locations
Script used:
# Search for file names older than '2018-05-02' and replace with newer version files
Get-ChildItem C:\, D:\ -include '*db2jcc*' -Recurse -ErrorAction SilentlyContinue |
Where-Object { $_.LastWriteTime -lt '2018-05-02' } |
Rename-Item -NewName { $_.name -replace 'DB2JCC','Old_DB2JCC' }
ForEach-Object { Copy-Item 'C:\Program Files\IBM\SQLLIB\java\DB2JCC*.*' -Destination $ENV:Temp }
Any help or suggestion will be appreciated.
Thanks, Johnson
Based off your answer, you can try something along the lines of this:
[System.Collections.ArrayList]$Paths = @()
Get-ChildItem C:\, D:\ -Filter "*db2jcc*" -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.LastWriteTime -lt '2018-05-02' } |
    ForEach-Object -Process {
        $null = $Paths.Add($_.FullName)
        Rename-Item -Path $_.FullName -NewName $_.Name.Replace('DB2JCC','Old_DB2JCC') -ErrorAction SilentlyContinue
    }
Foreach ($Path in $Paths) {
    $Path = $Path -replace '[^\\]+$'
    Copy-Item "C:\Program Files\IBM\SQLLIB\java\" -Filter "DB2JCC*" -Recurse -Destination $Path
}
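As an aside, the `-replace '[^\\]+$'` above strips the file name to leave the parent directory; Split-Path expresses the same idea more readably (a small sketch with a made-up path, not required for the answer above):

```powershell
# Both expressions reduce a full file path to its parent directory
$parent  = 'C:\data\logs\db2jcc.jar' -replace '[^\\]+$'    # 'C:\data\logs\' (trailing slash kept)
$parent2 = Split-Path 'C:\data\logs\db2jcc.jar' -Parent    # 'C:\data\logs'  (no trailing slash)
```

The only behavioral difference is the trailing backslash, which Copy-Item accepts either way.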
I would appreciate some help here.
The PowerShell script should:
Close the Outlook process, which works.
As well as scan the C: disk for .pst files, which works.
Copy these files to "\\fileserver01\temp\test\".
Export to a csv/excel list where these files were located, including last write time.
Possibly hide error messages from the user when running the script, since it complains about not having full access to a few folders during the scan.
Code:
Get-Process outlook | Foreach-Object { $_.CloseMainWindow() }
Get-ChildItem -path c:\ -recurse -include *.pst | `
Copy-Item -destination "\\fileserver01\temp\test\" | `
Select-object fullname,lastwritetime|export-csv "\\fileserver01\temp\test\"
How should I fix the last couple of things on my list?
Thanks
First, you have to use double backslashes for UNC paths.
Second, Copy-Item does not output anything to the pipeline; you have to use the -PassThru parameter.
Get-ChildItem -path z:\ -recurse -include *.pst -PipelineVariable source |
Copy-Item -Destination "\\path\temp" -Verbose -PassThru |
Select-Object @{n="Source";e={$source.VersionInfo.FileName}}, FullName, LastWriteTime | Export-Csv "\\path\temp\copy_result.csv" -Append -Verbose
I believe the issue is that after the files are copied, the object is gone from the pipeline.
This works:
Get-ChildItem -Path C:\ -Include *.pst -Recurse -ErrorAction SilentlyContinue | Select-Object FullName, LastWriteTime | Export-Csv -Path "\\fileserver01\temp\test\MyCSV.csv"
This doesn't directly answer the question you've asked, as @Adamar's answer appears to do just that.
However, your issue could also be resolved by querying ost/pst files from the registry using a snippet like this:
(Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Search).Property | ? {$_ -match "(o|p)st$"}
which will return all of the ost/pst files the logged in user has open in outlook.
A snippet like this will then copy them all to a network share and print the logs to a file:
$FilesToCopy = (Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Search).Property | ? {$_ -match "(o|p)st$"}
$FilesToCopy | ForEach { Copy-Item -Path $_ -Destination "\\network\share" ; $_ | Out-File "\\network\share\log.txt" -Append }
This saves a LOT of time over indexing through all of the C: drive. There's also an issue where very long directory paths (greater than 260 characters) are not indexed properly by Get-ChildItem, making this method a bit more reliable and appealing for a production script.
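One caveat: the `16.0` segment in that registry path is Office-version specific (16.0 covers Office 2016/365; older installs use 15.0 or lower). A hedged sketch that probes whichever version hives happen to exist under HKCU, rather than hard-coding 16.0:

```powershell
# Collect ost/pst paths from every Office version hive present for the logged-in user
$pstFiles = Get-ChildItem 'HKCU:\Software\Microsoft\Office\*\Outlook\Search' -ErrorAction SilentlyContinue |
    ForEach-Object { $_.Property } |                # property names are the file paths
    Where-Object { $_ -match '(o|p)st$' }           # keep only .ost / .pst entries
```

This is an assumption-based variant: it relies on the same Search key layout existing under each version number, which is worth verifying on the Office versions you actually support.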
This is the final code.
Thanks everyone for your support.
#Kill Outlook process
Get-Process outlook -ErrorAction SilentlyContinue | Foreach-Object { $_.CloseMainWindow() }
#Scan for .pst files on the C disk
Get-ChildItem -path c:\ -recurse -include *.pst -ErrorAction SilentlyContinue |
#Copy located .pst files to the destination
Copy-Item -Destination "\\networkpath\home\$env:username\ComputerChange\" -Verbose -PassThru -ErrorAction SilentlyContinue |
#Log where files were located and when they were last written to.
Select-Object fullname,lastwritetime | export-csv \\networkpath\home\$env:username\ComputerChange\PSTlog.csv -Verbose
Write-Host "PST Files have successfully been copied, press any key to close" -ErrorAction SilentlyContinue
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
So I have created a much faster script, as I have excluded some system folders and program files folders where .PST files aren't saved.
I bet some of you experts can find out why this code doesn't work?
#Exclude systemfolders etc
$folders=get-childitem c:\ | where-object{$_.mode -like "d-*" -AND $_.name -notlike "windows" -AND $_.name -notlike "drivers" -AND $_.name -notlike "program files*"}
#Search thru each root folder for PST-files
$allfiles=$folders | ForEach-Object{get-childitem -Path $_.fullname -include "*.pst" -recurse -ErrorAction silentlycontinue};
$env:username
$foldertocreate="\\destination\$env:username\"
#Check if folder with username exists in the \\destination folder otherwise create folder with username.
if((Test-Path -Path $foldertocreate -PathType Container)) {write-host "Folder already created"}
else {write-host "Creating Folder", New-Item -ItemType Directory -Force -Path $foldertocreate }
#Copy .PST files which is in $allfiles to the folder created in fileshare> $foldertocreate.
#Copy .PST files in $allfiles to the destination folder created.
robocopy $allfiles $foldertocreate
Write-Host "Press any key to close" -ErrorAction SilentlyContinue
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
Probably very easy for you guys.
I want to copy all the folders and files from this path c:\default\
to this destination c:\environment\customfolder\
In the folder customfolder are other folders with different names.
It should only copy the files and folders to the destination where the customfolder contains the name DEMO_test.
What is the best and easiest way to do that?
should I use for-each?
thanks for your help.
Sorry I should be more clear. ;-)
I have a folder c:\default
All the files and sub-folders in that folder c:\default
should be copied to these folders
c:\environment\customfolder\demo_test
c:\environment\customfolder\demo_test01
c:\environment\customfolder\demo_test02
c:\environment\customfolder\demo_test03
I know it should be possible to copy all files and sub-folders from this path (source) c:\default\
to this path (destination) c:\environment\customfolder\
and only copy them to the folders whose names match demo_test*.
Is that question better?
thanks.
Get a list of files:
Get-ChildItem -Path "C:\default\" -Recurse
The -Recurse parameter searches subfolders.
Now filter the list to show only files that fit a certain pattern
Get-ChildItem -Path "C:\default\" -Recurse |
Where-Object Name -like "*test*"
Note that the pipe | is effectively chaining these commands together.
Now copy the filtered list of items to a destination folder:
Get-ChildItem -Path "C:\default\" -Recurse |
Where-Object Name -like "*test*" |
Copy-Item -Destination "C:\destination\"
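Since the asker's goal runs the other direction (one source copied into every demo_test* folder), a sketch under that reading, using the folder names from the question:

```powershell
# Find every destination folder under customfolder whose name starts with demo_test
Get-ChildItem 'C:\environment\customfolder' -Directory -Filter 'demo_test*' |
    ForEach-Object {
        # Copy the whole contents of c:\default into each matching folder
        Copy-Item -Path 'C:\default\*' -Destination $_.FullName -Recurse -Force
    }
```

The -Directory switch (PowerShell 3.0+) limits the listing to folders, and the loop repeats the same copy once per matching destination.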
If I understood you correctly, you have a flat structure of catalogs:
SOURCE:
C:\Catalog\[lookupCatalogs]\files
DEST:
C:\Catalog\SomeCatalogs\[lookupCatalogs]\files
If yes, this function should be OK:
function copy-TestdemoFolders
{
    param (
        $source,
        $destination,
        $filter,
        $recursive = $false
    )
    $folders = Get-ChildItem $source -Filter $filter -Directory
    $folders | ForEach-Object {
        $copyPath = Join-Path $destination $_.Name
        if (!(Test-Path $copyPath))
        {
            New-Item $copyPath -ItemType Directory | Out-Null
            "Created New Folder: $($_.Name)"
        }
        $scriptBlock = { Get-ChildItem $_.FullName }
        if ($recursive -eq $true)
        {
            $scriptBlock = { Get-ChildItem $_.FullName -Recurse }
        }
        Invoke-Command -ScriptBlock $scriptBlock | ForEach-Object {
            Copy-Item $_.FullName $copyPath -ErrorAction SilentlyContinue
            if (!($?)) { $error[0].Exception.Message }
        }
    }
}
copy-TestdemoFolders -source 'C:\Source' -filter "*demo_test*" -destination D:\TEST
You can recursively copy files from subfolders to [lookupCatalog] with the -recursive switch: copy-TestdemoFolders -source 'C:\Source' -filter "*demo_test*" -destination D:\TEST -recursive:$true
OK, trying to copy folders and contents from a UNC path (shared drive) to another UNC path (NAS) based on date (Before 01 Jan 2015). Yes I know the code says 2017 but once I get it working on test then I'll change the date and run on prod.
#Original file path
$path = "UNC Path"
#Destination file path
$destination = "Different UNC Path"
# Build a list of what's inside $path
Foreach ($file in (Get-ChildItem $path)) {
    # If the last write time is before the given date
    If ($file.LastWriteTime -lt "01/01/2017") {
        # Copy the file to the destination
        Copy-Item -Path $file.FullName -Destination $destination -Force
    }
}
It copies the contents of folders fine but not the folders. I think I'm missing a -recurse but putting it after Get-ChildItem $path didn't work.
I plan to get this working then add a Remove-Item line to remove all the old items from the file server.
Thoughts? Suggestions of better ways to accomplish this?
Thanks,
I think you're just missing the -Recurse from Get-ChildItem, but I would do it like so:
Get-ChildItem -Path $path -Recurse |
    Where-Object { $_.LastWriteTime -lt '2017-01-01' } |
    ForEach-Object {
        Copy-Item -Path $_.FullName -Destination ($_.FullName.Replace($path, $destination)) -Force
    }
If you've got hidden or system files to copy, you'll also want the -Force parameter on Get-ChildItem.
Actually, you might need to do this:
Get-ChildItem -Path $path -Recurse |
    Where-Object { $_.LastWriteTime -lt '2017-01-01' } |
    ForEach-Object {
        if ($_.PSIsContainer -and !(Test-Path ($_.FullName.Replace($path, $destination)))) {
            mkdir ($_.FullName.Replace($path, $destination))
        }
        else {
            Copy-Item -Path $_.FullName -Destination ($_.FullName.Replace($path, $destination)) -Force
        }
    }