I have a folder called LogFolder on the C: drive:
C:\LogFolder
It has multiple logs with names as follows:
errorLogs.log
errorLogs.log.1
errorLogs.log.2
errorLogs.log.3
Transmitlogs.log
Transmitlogs.log.1
Transmitlogs.log.2
Transmitlogs.log.3
Transmitlogs.log.4
Transmitlogs.log.5
Receivelogs.log
Receivelogs.log.1
Receivelogs.log.2
Receivelogs.log.3
Receivelogs.log.4
Dataexchange.log
Dataexchange.log.1
and many others with different names but the same extension pattern: .log, .log.1, and so on.
I am only interested in the logs mentioned above.
My goal is to copy these logs starting from .log.1 up to .log.10 or .log.20 (all that exist) and then
delete the originals, with the exception of .log and .log.1.
I have achieved the following so far.
$logLocation = "C:LogFolder"
$tempLocation = "C:\Temp\Logs\"
$LogfileName = "errorLogs.log.", "Transmitlogs.log.","Receivelogs.log.","Dataexchange.log."
foreach ($element in $LogfileName)
{
$NewLogFileName = -join($element,"*")
Copy-Item -Path "$logLocation\$NewLogFileName" -Destination $tempLocation
}
I am able to copy all logs starting from .log.1 and all others that exist.
My problem is: how can I delete those logs from the original folder without deleting .log and .log.1?
I have tried the following, but it is not working.
foreach ($element in $LogfileName)
{
$deleteLogFileName = -join($element,"*")
Remove-Item –path "$logLocation\$deleteLogFileName" -exclude *.log, *.log.1
}
You can do that by selectively copying only the *.log.1 files to the destination folder and moving the others. That saves you from having to remove files from the source location afterwards.
The thing that matters most here is to get a list of files that
have a numeric extension
have a basename like 'errorLogs.log', 'Transmitlogs.log', 'Receivelogs.log' or 'Dataexchange.log'
Try
$logLocation = "C:\LogFolder"
$tempLocation = "C:\Temp\Logs"
# if the destination folder does not exist yet, create it first
if (!(Test-Path -Path $tempLocation -PathType Container)) {
$null = New-Item -Path $tempLocation -ItemType Directory
}
# get an array of objects of the files where the extension ends in a numeric value
# and where the basename is either 'errorLogs.log', 'Transmitlogs.log', 'Receivelogs.log'
# or 'Dataexchange.log'.
$files = Get-ChildItem -Path $logLocation -Filter '*.log*' -File |
Where-Object {$_.Name -match '^(errorLogs|Transmitlogs|Receivelogs|Dataexchange)\.log\.\d+$' } |
Select-Object FullName, @{Name = 'Number'; Expression = {[int]($_.Name.Split(".")[-1])}}
foreach ($file in $files ) {
if ($file.Number -eq 1) {
# this file should be copied
Copy-Item -Path $file.FullName -Destination $tempLocation -Force
}
else {
# the others are to be moved
Move-Item -Path $file.FullName -Destination $tempLocation -Force
}
}
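If you would still rather keep your own copy-everything-first approach and delete the rotated files from the source afterwards, a minimal sketch (my own suggestion, reusing $logLocation and the same four basenames as above) could filter the files with Where-Object before removing them:
# delete only the rotated logs (.log.2 and higher), keeping *.log and *.log.1
Get-ChildItem -Path $logLocation -Filter '*.log*' -File |
Where-Object { $_.Name -match '^(errorLogs|Transmitlogs|Receivelogs|Dataexchange)\.log\.\d+$' -and
               [int]$_.Name.Split('.')[-1] -gt 1 } |
Remove-Item -Force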
I am trying to create a PowerShell script to copy new and modified files from the source folder to the destination folder. I am able to copy new files with the given script, but I also want to add a condition for modified files. Can anyone help me achieve this?
$Sourcefolder = "C:\Users\parveen.kumar\Downloads\Source"
$Desifolder = "C:\Users\parveen.kumar\Downloads\desi"
$GetFiles = Get-ChildItem -Path $Sourcefolder
$BackUpImagesFiles = (Get-ChildItem -Path $Desifolder).Name
foreach($image in $GetFiles)
{
$fileName = $image.Name;
if($BackUpImagesFiles -notcontains $fileName)
{
Copy-Item $image.FullName -Destination $Desifolder
}
}
You can use Get-Item to find out whether there is already a file with that name in the destination folder.
If there is not, OR the file you found is older than the one in the source folder, copy the file.
Something like this:
$Sourcefolder = "C:\Users\parveen.kumar\Downloads\Source"
$Destfolder = "C:\Users\parveen.kumar\Downloads\desi"
Get-ChildItem -Path $Sourcefolder -File | ForEach-Object {
# test if there already is a file with that name in the destination folder
$existingFile = Get-Item -Path (Join-Path -Path $Destfolder -ChildPath $_.Name) -ErrorAction SilentlyContinue
# if not existing or the existing file is older than the one in the source folder, do the copy
if (!$existingFile -or $existingFile.LastWriteTime -lt $_.LastWriteTime) {
$_ | Copy-Item -Destination $Destfolder -Force
}
}
Based on your comment, if you want to keep a copy of the file that was already in the destination folder, you can change to:
$Sourcefolder = "C:\Users\parveen.kumar\Downloads\Source"
$Destfolder = "C:\Users\parveen.kumar\Downloads\desi"
Get-ChildItem -Path $Sourcefolder -File | ForEach-Object {
# test if there already is a file with that name in the destination folder
$existingFile = Get-Item -Path (Join-Path -Path $Destfolder -ChildPath $_.Name) -ErrorAction SilentlyContinue
# if a file already exists AND is older than the one in the source folder, do the copy
if ($existingFile -and $existingFile.LastWriteTime -lt $_.LastWriteTime) {
# rename the existing file first before you overwrite with a newer file from the source folder
# for demo, add the file's last modified date to its name
$newName = '{0}_{1:yyyy-MM-dd HHmmss}{2}' -f $existingFile.BaseName,
$existingFile.LastWriteTime,
$existingFile.Extension
$existingFile | Rename-Item -NewName $newName -Force
$_ | Copy-Item -Destination $Destfolder -Force
}
elseif (!$existingFile) {
$_ | Copy-Item -Destination $Destfolder -Force
}
}
Another way, as you suggested, is to move the existing files into a backup folder instead of renaming them first:
$Sourcefolder = "C:\Users\parveen.kumar\Downloads\Source"
$Destfolder = "C:\Users\parveen.kumar\Downloads\desi"
$BackupofDestfolder = "C:\Users\parveen.kumar\Downloads\just"
# make sure the destination and backup folders exist before trying to copy or move files there
$null = New-Item -Path $Destfolder -ItemType Directory -Force
$null = New-Item -Path $BackupofDestfolder -ItemType Directory -Force
Get-ChildItem -Path $Sourcefolder -File | ForEach-Object {
# test if there already is a file with that name in the destination folder
$existingFile = Get-Item -Path (Join-Path -Path $Destfolder -ChildPath $_.Name) -ErrorAction SilentlyContinue
# if a file already exists AND is older than the one in the source folder, do the copy
if ($existingFile -and $existingFile.LastWriteTime -lt $_.LastWriteTime) {
# move the existing file first before you overwrite with a newer file from the source folder
$existingFile | Move-Item -Destination $BackupofDestfolder -Force
$_ | Copy-Item -Destination $Destfolder -Force
}
elseif (!$existingFile) {
$_ | Copy-Item -Destination $Destfolder -Force
}
}
The code below has been wonderful so far for organising my hard-drives.
I do face this error when I transfer large amounts of data:
Move-Item : Cannot create a file when that file already exists.
This happens when I move a file that is a duplicate. Is there a way to rename the duplicate file with some sort of sequence number?
That would be much appreciated :))
# Get all files
Get-ChildItem "C:\zAa" -File -Recurse | ForEach-Object {
# Get the modified date
$dt = Get-Date $_.LastWriteTime
$year = $dt.Year
$month = $dt.Month
# This adds "0" in front of the 1-9 months
if($dt.Month -lt 10) {
$month = "0" + $dt.Month.ToString()
} else {
$month = $dt.Month
}
# Remove leading '.' from the extension
$extension = $_.Extension.Replace(".", "")
# Where we want to move the file
$destinationFolder = "C:\zBb\$extension\$year\$month\"
# Ensure full folder path exists
if(!(Test-Path $destinationFolder)) {
New-Item -ItemType Directory -Force -Path $destinationFolder
}
# Copy/Move the item to its new home
Move-Item $_.FullName $destinationFolder
}
I haven't been able to do much; I normally go and find the duplicates and rename them manually.
Probably the easiest way to move a file with a unique name is to use a Hashtable that stores the filenames already present.
Then a simple loop can add a sequence number to its file name until it is no longer found in the Hashtable.
Next simply move the file under that new name.
Your code modified:
# Get all source files
Get-ChildItem "C:\zAa" -File -Recurse | ForEach-Object {
# Where we want to move the file
$destinationFolder = 'C:\zBb\{0}\{1:yyyy}\{1:MM}' -f $_.Extension.TrimStart("."), $_.LastWriteTime
# Ensure full folder path exists
$null = New-Item -Path $destinationFolder -ItemType Directory -Force
# create a Hashtable and store the filenames already present in the destination folder
$existing = @{}
Get-ChildItem -Path $destinationFolder -File | ForEach-Object { $existing[$_.Name] = $true }
# Move the item to its new home:
# construct the new filename by appending an index number in between brackets
$newName = $_.Name
$count = 1
while ($existing.ContainsKey($newName)) {
$newName = "{0}({1}){2}" -f $_.BaseName, $count++, $_.Extension
}
# add this new name to the Hashtable so it also counts as existing
$existing[$newName] = $true
# use Join-Path to create a FullName for the file
$newFile = Join-Path -Path $destinationFolder -ChildPath $newName
Write-Verbose "Moving '$($_.FullName)' as '$newFile'"
$_ | Move-Item -Destination $newFile -Force
}
I have a directory of information separated by document number, so each folder that contains documents starts with DOC-######-NameOfDocument. I am trying to create a PowerShell script that searches a directory for any folders with a specified document number, moves the contents of each such folder up one level, and then deletes the original folder (which should now be empty).
Below is the closest I have gotten to my intended result.
$Path = "filepath"
$Folders = Get-ChildItem -Filter "DOC-#####*" -Recurse -Name -Path $Path
$companyID = "######"
foreach ($Folder in $Folders){
$filepath = $Path + $Folder
$Files = Get-ChildItem -Path $filepath
$imagesourc = $filepath + $companyID
$imageDest = $filepath.Substring(0, $filepath.LastIndexOf('\'))
if (Test-Path -Path $imagesourc){
Copy-Item -Path $imagesourc -Destination $imageDest -Recurse
}
foreach ($File in $Files){
$Parent_Directory = Split-Path -Path $File.FullName
$Destination_Path = $filepath.Substring(0, $filepath.LastIndexOf('\'))
Copy-Item -Path $File.FullName -Destination $Destination_Path -Recurse
if ($null -eq (Get-ChildItem -Path $Parent_Directory)) {
}
}
Remove-Item $filepath -Recurse
}
This does what I need, but for a reason I can't divine, it will not work on .HTM files. Most of the files I am moving are .html and .htm files, so I need it to work with .htm as well. The files with .HTM will not move, and the folder won't be deleted either (which is good, at least).
Try using this:
$ErrorActionPreference = 'Stop'
$fileNumber = '1234'
$initialFolder = 'X:\path\to\folders'
$folders = Get-ChildItem -Path $initialFolder -Filter DOC-$fileNumber* -Force -Directory -Recurse
foreach($folder in $folders)
{
try
{
Move-Item $folder\* -Destination $folder.Parent.FullName
Remove-Item $folder
}
catch [System.IO.IOException]
{
@(
"$_".Trim()
"File FullName: {0}" -f $_.TargetObject
"Destination Folder: {0}" -f $folder.Parent.FullName
) | Out-String | Write-Warning
}
catch
{
Write-Warning $_
}
}
Important Notes:
Move-Item $folder\* will move all folder contents recursively. If there are folders inside $folder, those will be moved too; if you only want to target folders that contain nothing but files, an if condition should be added before this cmdlet (see the sketch after these notes).
Try {...} Catch {...} is there mainly to handle file collisions: if a file with the same name already exists in the parent folder, it will let you know, and that file will not be moved nor will the folder be deleted.
-Filter DOC-$fileNumber* will capture all the folders named with the numbers in $fileNumber; however, be careful, because it may also capture folders you do not intend to remove.
Example: if you want to get all folders containing the number 1234 (DOC-12345-NameOfDocument, DOC-12346-NameOfDocument, ...) but you don't want to capture DOC-12347-NameOfDocument, then you should fine-tune the filter, or add the -Exclude parameter.
-Force & -Directory to get hidden folders and to target only folders.
Our Git repo blew up and we ended up losing it, so now all of our users' code is only on local workstations. For temporary storage we are going to have all of them put their local repos on a network share. I am currently trying to write a PowerShell script to allow users to select all their repos with GridView and then copy them to the network share. This will cause a lot of overlap, so when there are duplicate files I only want the file with the latest modified date (commit) to overwrite.
For example,
User 1 has repo\file.txt last modified 8/10 and uploads it to the network share.
User 2 also has repo\file.txt, last modified 8/12. When User 2 copies to the share, it should overwrite User 1's file because it is the newer file.
I am new to PowerShell so I am not sure which direction to take.
As of right now I figured out how to copy over all files, but can't figure out the last modified piece. Any help would be greatly appreciated.
$destination = '\\remote\IT\server'
$filesToMove = get-childitem -Recurse | Out-GridView -OutputMode Multiple
$filesToMove | % { copy-item $_.FullName $destination -Recurse }
If your users have permission to write/delete files in the remote destination path, this should do it:
$destination = '\\remote\IT\server\folder'
# create the destination folder if it does not already exist
if (!(Test-Path -Path $destination -PathType Container)) {
Write-Verbose "Creating folder '$destination'"
New-Item -Path $destination -ItemType Directory | Out-Null
}
Get-ChildItem -Path 'D:\test' -File -Recurse |
Out-GridView -OutputMode Multiple -Title 'Select one or more files to copy' | ForEach-Object {
# since we're piping the results of the Get-ChildItem into the GridView,
# every '$_' is a FileInfo object you can pipe through to the Copy-Item cmdlet.
$skipFile = $false
# create the filename for a possible duplicate in the destination
$dupeFile = Join-Path -Path $destination -ChildPath $_.Name
if (Test-Path -Path $dupeFile) {
# if a file already exists AND is newer than the selected file, do not copy
if ((Get-Item -Path $dupeFile).LastWriteTime -gt $_.LastWriteTime ) {
Write-Host "Destination file '$dupeFile' is newer. Skipping."
$skipFile = $true
}
}
if (!$skipFile) {
$_ | Copy-Item -Destination $destination -Force
}
}
This is my first post here, so please be forgiving. I'm browsing Reddit/Stack Overflow looking for cases to practice my PowerShell skills. I tried creating a script like you asked for on my local home PC; let me know if it helps you:
$selectedFiles = Get-ChildItem -Path "C:\Users\steven\Desktop" -File -Recurse | Out-GridView -OutputMode Multiple
$destPath = "D:\"
foreach ($selectedFile in $selectedFiles) {
$destFileCheck = Join-Path -Path $destPath -ChildPath $selectedFile.Name
if (Test-Path -Path $destFileCheck) {
$destFileCheck = Get-ChildItem -Path $destFileCheck
if ((Get-Date $selectedFile.LastWriteTime) -gt (Get-Date $destFileCheck.LastWriteTime)) {
Copy-Item -Path $selectedFile.FullName -Destination $destFileCheck.FullName
}
else {
Write-Host "Source file is older than destination file, skipping copy."
}
}
}
I've finally have given up googling and come here out of desperation. Go easy on me I'm fairly new to Powershell.
So, the objective of the code below was to first look through the source folder, then read through each .zip file and move it to the directory specified by the value in the hashtable. Unfortunately, this is not how they want it to work anymore.
Now I need to retain the parent folder from source (for example "DAL"), then create the subsequent folders based on the file names, and finally move each .zip to its file-specific folder. Also, it needs to go through each folder under source, which will be at least 20 other folders with unique 3-character names.
$srcRoot = "C:\Cloud\source\dal"
$dstRoot = "C:\Cloud\Destination"
##$map = #{}; dir -recurse | ? { !$_.psiscontainer} | % { ##$map.add($_.name,$_.PSChildName) }
# DAT and DEV will have to be excluded from folder creation
$map = {
#AEODDAT_201901 = "AEOD\2019\01"
#AEOMDEV_201902 = "AEOM\2019\01"
#AEOYDAT_201902 = "AEOY\2019\01"
}
$fileList = Get-ChildItem -Path $srcRoot -Filter "*.zip*" -File -Force -Recurse
foreach ($file in $fileList)
{
#Go through each file up to mapped string
$key = $file.BaseName.Substring(0,14)
if ($key -in $map.Keys)
{
$fileName = $file.Name
$dstDir = Join-Path -Path $dstRoot -ChildPath $map[$key]
#create direcotory if not in path
if (-not (Test-Path -Path $dstDir))
{
mkdir -Path $dstDir
}
Write-Verbose "Moving $($file.FullName)"
if (Test-Path -Path (Join-Path -Path $dstDir -ChildPath $fileName))
{
#Write error if name exists
Write-Error -Message "File $fileName already exists at $dstDir"
#move path
} else {
Move-Item -Path $($file.FullName) -Destination $dstDir
}
}
}
So C:\Cloud\source\DAL\AEODDAT20190101.zip should create folders in C:\Cloud\Destination\DAL\AEOD\2019\01\AEODDAT20190101.zip would be my desired output.
Welcome, Matt! (no pun intended) One of the habits I have in similar situations with destination folders is to Set-Location $dstRoot and create folders from the relative path. You can execute New-Item with the relative path and the syntax is simpler. For example, your If statement could look like this and it would work the same way (with a slightly different error message):
if ($key -in $map.Keys){
Set-Location $dstRoot
New-Item -ItemType Directory $map[$key] -ErrorAction Ignore #won't raise an error if it exists
Write-Verbose "Moving $($file.FullName)"
#this will raise an error if the file already exists, unless you specify -Force
Move-Item "$($file.FullName)" $map[$key]
}
EDIT: Found 2 issues.
$map is a Hashtable literal that should be preceded with @:
$map = @{
AEODDAT20190101 = "AEOD\2019\01"
You were missing the last character of the base file name by taking only the first 14 characters. AEODDAT2019010 didn't match AEODDAT20190101. This should fix it:
$key = $file.BaseName.Substring(0,15)
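One part of the question that the answer above does not cover is keeping the immediate parent folder (for example DAL) under the destination root. A minimal sketch of one way to do that, assuming $srcRoot is changed to point at C:\Cloud\source so all the 3-character folders are scanned, that the .zip files sit directly inside those folders, and that $dstRoot and the corrected $map hashtable are defined as above, could be:
$fileList = Get-ChildItem -Path $srcRoot -Filter '*.zip' -File -Force -Recurse
foreach ($file in $fileList) {
# key such as 'AEODDAT20190101', as in the corrected answer
$key = $file.BaseName.Substring(0,15)
if ($key -in $map.Keys) {
# immediate parent folder name, e.g. 'DAL'
$parentName = $file.Directory.Name
$dstDir = Join-Path -Path $dstRoot -ChildPath (Join-Path -Path $parentName -ChildPath $map[$key])
$null = New-Item -Path $dstDir -ItemType Directory -Force
# the already-exists check from the original code is omitted here for brevity
Move-Item -Path $file.FullName -Destination $dstDir
}
}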