Rename a batch of files based on a txt file - PowerShell

I am trying to rename some configuration files that reside in a folder. Some of them have the ".disabled" extension, some don't.
My intention: for each file listed in files-to-change.txt (one relative path to a .config file per line), if the file has the ".disabled" extension, remove it; if it doesn't, add it. This needs to apply only to the files in the .txt source file.
Basically the files
app_config\file1.config
app_config\CBS\file2.config.disabled
app_config\file3.config
app_config\CBD\Testing\file4.config.disabled
are listed in the txt file, and they need to be matched against the files in the destination folder, where I need to change the extension.
I'm missing the logic to create a proper script that gets this done.

You can use the Rename-Item cmdlet; here is a simple script I created. Where it says -Path, you will need to change this to reflect the location of your files.
$name = (Get-ChildItem -Path 'C:\testing\New folder\').FullName
foreach ($item in $name) {
    if ($item.Contains("disabled")) {
        # Strip the .disabled suffix to enable the file
        Rename-Item -Path $item -NewName $item.Replace(".disabled", "") -ErrorAction SilentlyContinue
    }
    else {
        # Append the .disabled suffix to disable the file
        Rename-Item -Path $item -NewName ($item + ".disabled") -ErrorAction SilentlyContinue
    }
}
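Note that the script above toggles every file in the folder, not only the ones listed in files-to-change.txt. A minimal sketch of the list-driven variant, assuming the txt file holds one relative path per line and $baseDir points at the destination folder (both names are placeholders to adjust):

$baseDir  = 'C:\testing\New folder'
$listFile = Join-Path $baseDir 'files-to-change.txt'
Get-Content $listFile | Where-Object { $_.Trim() } | ForEach-Object {
    $fullPath = Join-Path $baseDir $_.Trim()
    if (-not (Test-Path $fullPath)) { Write-Warning "Not found: $fullPath"; return }
    $leaf = Split-Path $fullPath -Leaf
    if ($leaf.EndsWith('.disabled')) {
        # Listed with .disabled: remove the suffix to enable the config
        Rename-Item -Path $fullPath -NewName $leaf.Replace('.disabled', '')
    }
    else {
        # Listed without it: append the suffix to disable the config
        Rename-Item -Path $fullPath -NewName ($leaf + '.disabled')
    }
}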

Related

Verify file copied using Copy-Item

I'm trying to copy files from a source to a destination and verify whether they were copied or not.
The problem is that if I make changes inside a source file that was copied earlier, the destination file does not get overwritten when I execute the code again. I also want a log file written each time files are copied.
Files in folder: .csv, .log, .png, .html
$source="C:\52DWM93"
$destination="C:\Temp\"
Copy-Item -Path $source -Destination $destination -Force
$ver=(Get-ChildItem -file -path $destination -Recurse).FullName | foreach {get-filehash $_ -Algorithm md5}
If($ver)
{Write-Host "ALL file copied"}
else
{Write-Host "ALL file not copied"}
If you copy directories like this you need the -Recurse switch for Copy-Item. Without it you're not going to copy anything except the directory itself.
You can of course also use Get-ChildItem with whatever filter or Recurse flag you care about and pipe that into Copy-Item.
Use the *-FileCatalog cmdlets for verification.
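A sketch of that combination, assuming Windows PowerShell 5.1 or later for the *-FileCatalog cmdlets; the catalog path and the copied folder name are illustrative:

$source = "C:\52DWM93"
$destination = "C:\Temp\"
# -Recurse is required to copy the folder's contents, not just the folder itself
Copy-Item -Path $source -Destination $destination -Recurse -Force
# Build a catalog of the source, then validate the copied tree against it
New-FileCatalog -Path $source -CatalogFilePath "C:\source.cat" -CatalogVersion 2
$status = Test-FileCatalog -Path (Join-Path $destination "52DWM93") -CatalogFilePath "C:\source.cat"
if ($status -eq 'Valid') { Write-Host "All files copied" }
else { Write-Host "Verification failed: $status" }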

Copy-Item with overwrite?

Here is a section of code from a larger script. The goal is to recurse through a source directory, then copy all the files it finds into a destination directory, sorted into subdirectories by file extension. It works great the first time I run it. If I run it again, instead of overwriting existing files, it fails with this error on each file that already exists in the destination:
Copy-Item : Cannot overwrite the item with itself
I try, whenever possible, to write scripts that are idempotent, but I haven't been able to figure this one out. I would prefer not to add a timestamp to the destination file's name; I'd hate to end up with thirty versions of the exact same file. Is there a way to do this without extra logic that checks for a file's existence and deletes it if it's already there?
## Parameters for source and destination directories.
$Source = "C:\Temp"
$Destination = "C:\Temp\Sorted"

# Build list of files to sort.
$Files = Get-ChildItem -Path $Source -Recurse | Where-Object { !$_.PSIsContainer }

# Copy the files in the list to the destination folder, sorted into subfolders by extension.
foreach ($File in $Files) {
    $Extension = $File.Extension.Replace(".", "")
    $ExtDestDir = "$Destination\$Extension"

    # Check to see if the folder exists, if not create it
    $Exists = Test-Path $ExtDestDir
    if (!$Exists) {
        # Create the directory because it doesn't exist
        New-Item -Path $ExtDestDir -ItemType "Directory" | Out-Null
    }

    # Copy the file
    Write-Host "Copying $File to $ExtDestDir"
    Copy-Item -Path $File.FullName -Destination $ExtDestDir -Force
}
$Source = "C:\Temp"
$Destination = "C:\Temp\Sorted"
You are trying to copy files from a source directory to a sub directory of that source directory. The first time it works because that directory is empty. The second time it doesn't because you are enumerating files of that sub directory too and thus attempt to copy files over themselves.
If you really need to copy the files into a sub directory of the source directory, you have to exclude the destination directory from enumeration like this:
$Files = Get-ChildItem -Path $Source -Directory |
    Where-Object { $_.FullName -ne $Destination } |
    Get-ChildItem -File -Recurse
Using a second Get-ChildItem call at the beginning, which only enumerates first-level directories, is much faster than filtering the output of the Get-ChildItem -Recurse call, which would needlessly process each file of the destination directory.
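Note that this enumeration starts at the first-level directories, so files sitting directly in $Source are not included; if you need those as well, one way (a sketch, not part of the original answer) is to add them separately:

$Files = @(Get-ChildItem -Path $Source -File) +
         @(Get-ChildItem -Path $Source -Directory |
             Where-Object { $_.FullName -ne $Destination } |
             Get-ChildItem -File -Recurse)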

Merge CSV files in PowerShell, traversing subfolders, archiving & deleting old files, using the folder name for the target CSV

I want to merge many CSV files into one (a few hundred files), removing the header row of the added CSVs.
As the files sit in several subfolders, I need to start from the root, traverse all the subfolders, and process all CSVs in there. Before merging, I want to archive them with zip, deleting the old CSVs. The new merged CSV file and the zip archive should be named after their parent folder.
In case the script is started again for the same folder, none of the already processed files should be damaged or removed accidentally.
I am not a PowerShell guy, so I have been copy-pasting from several resources on the web and came up with the following solution. (Sorry, I don't remember the resources; feel free to put references in the comments if you know them.)
This patchwork code does the job, but it doesn't feel very bulletproof. For now it processes the CSV files in the subfolders only; processing the files within the given $targDir as well would also be nice.
I am wondering if it could be more compact. Suggestions for improvement are appreciated.
$targDir = "\\Servername\folder\" # path
Get-ChildItem "$targDir" -Recurse -Directory |
    ForEach-Object { # walk through all subfolder paths
        Set-Location -Path $_.FullName
        # Remove an existing AllInOne.csv (target name for the merged file) in case it was left over from a previous execution.
        $FileName = ".\AllInOne.csv"
        if (Test-Path $FileName) {
            Remove-Item $FileName
        }
        # Remove an existing AllInOne.zip (target name for the archived files) in case it was left over from a previous execution.
        $FileName = ".\AllInOne.zip"
        if (Test-Path $FileName) {
            Remove-Item $FileName
        }
        # Compress all CSV files in the current path into a temporary AllInOne.zip, adding them one by one (with -Update).
        # I wonder if there is a more efficient way to do that.
        Get-ChildItem $_.FullName | Where-Object { $_.Extension -eq ".csv" } |
            ForEach-Object { Compress-Archive $_.FullName -DestinationPath "AllInOne.zip" -Update }
        ##########################################################
        # This code merges all the CSV files,
        # skipping the header row of every file after the first
        ##########################################################
        $getFirstLine = $true
        Get-ChildItem ".\*.csv" | ForEach-Object {
            $filePath = $_
            $lines = Get-Content $filePath
            $linesToWrite = switch ($getFirstLine) {
                $true  { $lines }
                $false { $lines | Select-Object -Skip 1 }
            }
            $getFirstLine = $false
            Add-Content ".\AllInOne.csv" $linesToWrite
            # The output file is named AllInOne.csv temporarily - this is not a requirement.
            # It was simply easier for me to come up with this temp file in the first place (symptomatic for copy & paste).
        }
        ##########################################################
        # Delete the old CSV files.
        Get-ChildItem $_.FullName | Where-Object { $_.Extension -eq ".csv" -and $_.Name -ne "AllInOne.csv" } |
            ForEach-Object { Remove-Item $_.FullName }
        # Rename the AllInOne files with the parent folder name.
        Get-ChildItem -Path $_.FullName -Filter *.csv | Rename-Item -NewName { $_.BaseName.Replace("AllInOne", $_.Directory.Name) + $_.Extension }
        Get-ChildItem -Path $_.FullName -Filter *.zip | Rename-Item -NewName { $_.BaseName.Replace("AllInOne", $_.Directory.Name) + $_.Extension }
    }
I have been executing it in the PowerShell ISE. The script is for housekeeping only, executed casually and not on a regular basis, so performance doesn't matter much.
I prefer to stick with a script that doesn't depend on additional libraries if possible (e.g. for zip).
It may not be bulletproof, but I have seen worse cobbled-together scripts. It'll definitely do the job you want it to, but here are some small changes that will make it a bit shorter and harder to break.
Since all your files are CSVs and all have the same headers, you can use Import-Csv to compile all of the files into an array. You won't have to worry about stripping the headers or accidentally removing a row.
Get-ChildItem "*.csv" | Foreach-Object {
$csvArray += Import-CSV $_
}
Then you can just pipe the array to Export-Csv -NoTypeInformation to output it all into a new CSV file.
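Putting that together, a minimal sketch (merged.csv is just an illustrative name; note that a rerun would pick the merged file up again unless you exclude or remove it first):

$csvArray = @()
Get-ChildItem "*.csv" | ForEach-Object { $csvArray += Import-Csv $_.FullName }
$csvArray | Export-Csv -Path ".\merged.csv" -NoTypeInformation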
To have it check the root folder and all the subfolders, I would throw all of the lines in the main ForEach loop into a function and then call it once for the root folder and keep the existing loop for all the subfolders.
function CompileCompressCSV {
    param (
        [string] $Path
    )
    # Code from inside the ForEach loop
}

# Main Script
CompileCompressCSV -Path $targetDir
Get-ChildItem -Path $targetDir -Recurse -Directory | ForEach-Object {
    CompileCompressCSV -Path $_.FullName
}
This is more of a stylistic choice, but I would do the steps of this script in a slightly different order:
Get Parent Folder Name
Remove old compiled CSVs and ZIPs
Compile CSVs into an array and output with Parent Folder Name
ZIP together CSVs into a file with the Parent Folder Name
Remove all CSV files
Personally, I'd rather name the created files properly the first time instead of having to go back and rename them, unless there is absolutely no way around it. That doesn't seem to be the case in your situation, so you should be able to create them with the right name on the first go; a sketch of that reordering follows.
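A rough sketch of the reordered approach, using Import-Csv/Export-Csv for the merge; the function name Merge-FolderCsv is illustrative, and $targDir is the root path from the question:

function Merge-FolderCsv {
    param ([string] $Path)
    $folderName = Split-Path $Path -Leaf
    $csvOut = Join-Path $Path "$folderName.csv"
    $zipOut = Join-Path $Path "$folderName.zip"
    # Remove leftovers from a previous run so a rerun is safe
    Remove-Item $csvOut, $zipOut -ErrorAction SilentlyContinue
    # Enumerate the source CSVs before the merged file is created
    $csvFiles = Get-ChildItem -Path $Path -Filter *.csv -File
    if (-not $csvFiles) { return }
    # Merge with Import-Csv/Export-Csv, so headers are handled automatically
    $csvFiles | ForEach-Object { Import-Csv $_.FullName } |
        Export-Csv -Path $csvOut -NoTypeInformation
    # Archive the originals in one call, then delete them
    Compress-Archive -Path $csvFiles.FullName -DestinationPath $zipOut
    $csvFiles | Remove-Item
}

# Process the root folder itself, then every subfolder
Merge-FolderCsv -Path $targDir
Get-ChildItem -Path $targDir -Recurse -Directory | ForEach-Object { Merge-FolderCsv -Path $_.FullName }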

PowerShell, remove folder with the same name as a .zip

I must delete files which have been extracted from a zip file into a folder named after the zip file, i.e.:
\test1.zip -> \test1
My script must find the folder which has the same name as the zip file and delete this folder.
Get a list of all of the Zip files in the directory, then loop over the results and delete any folder with the same name minus the extension, also known as the BaseName.
Get-ChildItem -Filter *.zip |
    ForEach-Object {
        if (Test-Path $_.BaseName) {
            Remove-Item -Recurse -Force $_.BaseName
        }
    }
You can enter the entire command on one line, I have split it up so that it is easy to read on here. I used the following commands in this example:
Get-ChildItem - Creates an object in the pipeline for each file with a .zip extension
ForEach-Object - Simply allows you to perform an action for each object in the pipeline.
Remove-Item - Note that the use of -Recurse and -Force ensures the folder is removed even if it contains files; you will not be asked to confirm.
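One caveat worth noting: Test-Path $_.BaseName resolves relative to the current location, so if the script isn't run from the folder that contains the zips, the check can miss. A sketch that anchors the check to each zip's own directory ($zipDir is a placeholder for wherever your archives live):

$zipDir = 'C:\archives'
Get-ChildItem -Path $zipDir -Filter *.zip | ForEach-Object {
    $folder = Join-Path $_.DirectoryName $_.BaseName
    if (Test-Path -LiteralPath $folder) {
        Remove-Item -LiteralPath $folder -Recurse -Force
    }
}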

Copying subfolders and files, checking "last modified time"

I have made a backup script that:
Reads source file paths and destination folder path from an XML file
Checks if source file and destination path exist (for each file)
Checks if source file (same name) exists in the target folder
Checks the last modified date of every source and destination file, if the file exists in the target folder
Copies source files to the target folder if the file does not already exist, or if the source file is newer than the existing file in the destination folder, otherwise does nothing
This only works on source files; if a source folder is specified in the XML file, only the folder itself will be copied, and none of its content.
I don't want to use Copy-Item -Recurse because I want to check the last modified date of every item, and if it fails the above conditions I don't want to copy it at all.
This brings me to Get-ChildItem -Recurse to list everything, but I'm having trouble coming up with something that works for this example:
C:\powershell\test\ (XML specified source)
Underlying structure:
C:\powershell\test\xmltest2.xml
C:\powershell\test\test2\xmltest.xml
C:\powershell\test\test3\test4\xmltest3.xml
etc.
i.e. I want to check every file before copying it, but if, say, a folder has not been modified while a file inside it should be copied, it should still work AND retain the same folder structure.
Any ideas? :)
As Ansgar Wiechers says, you are reinventing the wheel; RoboCopy will do it much more easily. RoboCopy can also copy the security permissions and the created/modified dates as well, which is great. Relevant RoboCopy discussion: https://superuser.com/a/445137/67909
Still, it's not as fun as writing it yourself, eh? I've come up with this:
# Assuming these two come from your XML config, somehow
$xmlSrc = "c:\users\test\Documents\test1"
$xmlDestPath = "c:\users\test\Documents\test2"
#==========
# Functions
#==========
function process-file ($item) {
    # $item should be a string: the full path to a file,
    # e.g. 'c:\users\test\Documents\file.txt'

    # Build the destination file's full path
    $destItem = $item.ToLower().Replace($xmlSrc, $xmlDestPath)

    if (-not (Test-Path $destItem)) { # File doesn't exist in destination
        # Is there a folder to put it in? If not, make one
        $destParentFolder = Split-Path $destItem -Parent
        if (-not (Test-Path $destParentFolder)) { mkdir $destParentFolder }
        # Copy file
        Copy-Item $item -Destination $destParentFolder -WhatIf
    } else { # File does exist
        # Compare the last modified timestamps (LastWriteTimeUtc;
        # LastAccessTimeUtc would be the last read, not the last change)
        if ((Get-Item $item).LastWriteTimeUtc -gt (Get-Item $destItem).LastWriteTimeUtc) {
            # Source file is newer, copy it
            $destParentFolder = Split-Path $destItem -Parent
            Copy-Item $item -Destination $destParentFolder -Force -WhatIf
        }
    }
}

function process-directory ($dir) {
    # This function mostly handles "copying" empty directories;
    # otherwise it's not really needed

    # Build the destination folder path
    $destDir = $dir.ToLower().Replace($xmlSrc, $xmlDestPath)
    # If that doesn't exist, make it
    if (-not (Test-Path $destDir)) { mkdir $destDir -WhatIf }
}

#==========
# Main code
#==========
if ((Get-Item $xmlSrc).PsIsContainer) {
    # You specified a folder
    Get-ChildItem $xmlSrc -Recurse | ForEach-Object {
        if ($_.PsIsContainer) {
            process-directory $_.FullName
        } else {
            process-file $_.FullName
        }
    } | Out-Null
} else {
    # You specified a file
    process-file $xmlSrc
}
NB. The copies are -WhatIf so it won't do anything drastic. And it has two immediate problems:
It makes everything lowercase. Otherwise you have to match the case properly, because .Replace() is case sensitive. I used .Replace() because -replace treats the \ in the file path as part of a regular expression and doesn't work; escaping the pattern (e.g. with [regex]::Escape()) would fix that, but I haven't tried it here.
If you put \ at the end of $xmlSrc or $xmlDestPath, it will fall over.
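For comparison, a rough RoboCopy equivalent of the whole script, shown with /L (list-only) as a dry run to mirror the -WhatIf behaviour; remove /L to actually copy:

robocopy $xmlSrc $xmlDestPath /E /XO /L

/E copies subdirectories including empty ones, and /XO skips files that are older in the source than in the destination, which matches the last-modified comparison above.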