Remove everything before \ - PowerShell

I need to copy a lot of files and use the same sort of folder structure where the files need to go.
So for instance if I have the following two documents:
\\Server1\Projects\OldProject\English\Text_EN.docx
\\Server1\Projects\OldProject\English\Danish\Text_DA.docx
I would need to move them to a new place on the server, but they need to be in the same "language folder". So I need to move them like this:
\\Server1\Projects\OldProject\English\Text_EN.docx -> \\Server1\Projects\NewProject\English\Text_EN.docx
\\Server1\Projects\OldProject\English\Danish\Text_DA.docx -> \\Server1\Projects\NewProject\English\Danish\Text_DA.docx
The issue here is that I would need to take the names of the "language" folders and create them in the NewProject folder.
How would I be able to remove everything before the \, so I end up with only the "language" folders, like English\ and English\Danish?

If the goal is just to replace the 'OldProject' folder with 'NewProject' in the file path, you could use -replace to make the change:
$filePath = Get-ChildItem \\Server1\Projects\OldProject\English\Text_EN.docx
Copy-Item $filePath.FullName -Destination ($filepath.FullName -replace "\bOldProject\b", "NewProject")
The '\b' is a regex word-boundary anchor, so only the exact name OldProject is matched, not names that merely contain it.
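For example (using a made-up OldProjectArchive folder name to illustrate), a name that merely contains OldProject is left untouched:
# Matches: OldProject is delimited by \ (a non-word character) on both sides
'\\Server1\Projects\OldProject\English\Text_EN.docx' -replace '\bOldProject\b', 'NewProject'
# No match: there is no word boundary between 'OldProject' and 'Archive'
'\\Server1\Projects\OldProjectArchive\Text_EN.docx' -replace '\bOldProject\b', 'NewProject'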

Try the following, which, for each input file:
constructs the target directory path by replacing the old project's root path with the new one's, thereby effectively replicating the old directory's subdirectory structure,
makes sure that the target directory exists,
and then copies the input file to the target directory.
$oldProjectRoot = '\\Server1\Projects\OldProject'
$newProjectRoot = '\\Server1\Projects\NewProject'

Get-ChildItem -Recurse -Filter *.docx -LiteralPath $oldProjectRoot |
  ForEach-Object {
    # Construct the target dir. path, with the same relative path
    # as the input dir. path relative to the old project root.
    $targetDir =
      $newProjectRoot + $_.Directory.FullName.Substring($oldProjectRoot.Length)
    # Create the target dir., if necessary (-Force returns any preexisting dir.)
    $null = New-Item -Force -Type Directory $targetDir
    $_ # Pass the input file through.
  } |
    Copy-Item -Destination { $targetDir } -WhatIf
Note: The -WhatIf common parameter in the command above previews the operation. Remove -WhatIf once you're sure the operation will do what you want.

Copying files to directory whilst retaining directory structure from list

Good afternoon all,
I'm guessing this is super easy, but it's really stumping me: I have a text file with a list of files; the same folders contain LOTS of other files, but I only need specific ones.
$Filelocs = get-content "C:\Users\me\Desktop\tomove\Code\locations.txt"
Foreach ($Loc in $Filelocs) { xcopy.exe $Loc C:\Redacted\output /s }
I figured this would go through the list which is like
"C:\redacted\Policies\IT\Retracted Documents\Policy_Control0.docx"
and then recreate the folder structure in a new place and copy the file, but it doesn't.
Any help would be appreciated.
Thanks
RGE
xcopy can't know the folder structure when you explicitly pass a source file path instead of a source directory. In a path like C:\foo\bar\baz.txt the base directory could be any of C:\, C:\foo\ or C:\foo\bar\.
When working with a path list, you have to build the destination directory structure yourself: resolve the paths from the text file to relative paths, join them with the destination directory, create the parent directory of each file, and finally use PowerShell's own Copy-Item command to copy the file.
$Filelocs = Get-Content 'locations.txt'

# Base directory common to all paths specified in "locations.txt"
$CommonInputDir = 'C:\redacted\Policies'

# Where files shall be copied to
$Destination = 'C:\Redacted\output'

# Temporarily change the current directory -> base directory, for Resolve-Path -Relative
Push-Location $CommonInputDir

Foreach ($Loc in $Filelocs) {
    # Resolve the input path relative to $CommonInputDir (the current directory)
    $RelativePath = Resolve-Path $Loc -Relative

    # Resolve the full target file path and directory
    $TargetPath = Join-Path $Destination $RelativePath
    $TargetDir  = Split-Path $TargetPath -Parent

    # Create the target dir if it does not already exist (-Force), because
    # Copy-Item fails if the directory does not exist.
    $null = New-Item $TargetDir -ItemType Directory -Force

    # Well, copy the file
    Copy-Item -Path $Loc -Destination $TargetPath
}

# Restore the current directory that was changed by Push-Location
Pop-Location
Possible improvements, left as an exercise:
Automatically determine the common base directory of the files specified in "locations.txt". Not trivial, but not too difficult.
Make the code exception-safe: wrap everything between Push-Location and Pop-Location in a try{} block and move Pop-Location into the finally{} block, so the current directory is restored even when a script-terminating error occurs (see the sketch below). See also about_Try_Catch_Finally.
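A minimal sketch of that second improvement, reusing the variables from the script above:
Push-Location $CommonInputDir
try {
    Foreach ($Loc in $Filelocs) {
        $RelativePath = Resolve-Path $Loc -Relative
        $TargetPath   = Join-Path $Destination $RelativePath
        $null = New-Item (Split-Path $TargetPath -Parent) -ItemType Directory -Force
        Copy-Item -Path $Loc -Destination $TargetPath
    }
}
finally {
    # Runs even when a script-terminating error occurs in the try block
    Pop-Location
}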

PowerShell, remove folder with the same name as a .zip

I must delete files which have been extracted from a zip file into a folder named after the zip file, i.e.:
\test1.zip -> \test1
My script must find the folder which has the same name as the zip file and delete this folder.
Get a list of all of the Zip files in the directory, then loop over the results and delete any folder with the same name minus the extension, also known as the BaseName.
Get-ChildItem -Filter *.zip |
    ForEach-Object {
        if (Test-Path $_.BaseName) {
            Remove-Item -Recurse -Force $_.BaseName
        }
    }
You can enter the entire command on one line; I have split it up so that it is easy to read here. I used the following commands in this example:
Get-ChildItem - creates an object in the pipeline for each file with a .zip extension
ForEach-Object - simply allows you to perform an action for each object in the pipeline
Remove-Item - note that the use of -Recurse and -Force ensures the folder is removed even if it contains files; you will not be asked to confirm
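A slightly more defensive variant of the same idea (a sketch, not part of the original answer): -PathType Container restricts the Test-Path check to folders, and -WhatIf previews the deletions before you run them for real:
Get-ChildItem -Filter *.zip | ForEach-Object {
    # Only proceed if a *folder* (not a file) with the zip's base name exists
    if (Test-Path -LiteralPath $_.BaseName -PathType Container) {
        Remove-Item -LiteralPath $_.BaseName -Recurse -Force -WhatIf
    }
}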

How can I use PowerShell to keep two directories updated with the latest files

I have two directories:
C:\G\admin\less
C:\G\user\less
Inside of those directories I have multiple less and css files. I know all the names of the files, so I would like to hardcode a list of them into the script. Perhaps this could be in an array or something, but my knowledge of PowerShell is not even enough to know if there are arrays in the scripting language.
C:\G\admin\less
html-light.less
html-light.css
html-dark.less
html-dark.css
do-not-track-me.less
C:\G\user\less
html-light.less
html-light.css
html-dark.less
html-dark.css
do-not-track-me.less
Is there a way I can use PowerShell to check each of these files (that I want to hardcode in my program) one by one and copy the last modified file from its directory to the other directory so that both directories will contain the same latest versions of these files?
Note that I would need to evaluate the predefined list of files one by one. Comparing modified date in one directory with the other and copying over as needed.
Again, assume that this isn't the best solution or approach.
This solution assumes the following:
- When the LastWriteTime of a file in one folder is newer than that of its counterpart, the newer file is copied to the other folder.
- I'm not doing path validation, out of laziness, but if you want path validation just ask.
- I'm assuming that all the files in those folders must be tracked; otherwise, read the comments in the code.
- I suggest you back up your folders before you run the script.
# If there are files you don't want to track in those folders (for example txt files),
# just write: $userFiles = dir C:\G\user\less\ -Exclude "*.txt"
# If you don't want to track txt files, but one of them should be tracked among the
# other file formats:
# $userFiles = dir C:\G\user\less\ -Exclude "*.txt" -Include "C:\G\user\less\Txtadditionaltotrack.txt"
$userFiles  = dir C:\G\user\less\
$adminfiles = dir C:\G\admin\less\

foreach ($userfile in $userFiles)
{
    $exactadminfile = $adminfiles | ? { $_.Name -eq $userfile.Name } | Select -First 1
    # My suggestion is to validate that it actually got the file.
    # Out of laziness I will not call Test-Path here; I'm assuming the directories
    # are exact copies of each other, so it will find the file.
    if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
    {
        Write-Verbose "Copying $($exactadminfile.FullName) to $($userfile.FullName)"
        Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
    }
    else
    {
        Write-Verbose "Copying $($userfile.FullName) to $($exactadminfile.FullName)"
        Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
    }
}
You can improve it: the way the code stands, it always copies a file from one directory to the other. Inside the else you can add a check so that when the LastWriteTime is equal on both sides nothing is copied (see the sketch below). You can improve it in many ways; I hope you get the idea.
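For example, the if/else inside the loop above could become (a hypothetical tweak):
if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
{
    Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
}
elseif ($exactadminfile.LastWriteTime -lt $userfile.LastWriteTime)
{
    Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
}
# When the timestamps are equal, neither branch runs and nothing is copied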
Below is the modification to the code so that it can achieve this requirement.
PLEASE READ THE COMMENTS IN THE CODE.
NOTE THAT I'M NOT FOLLOWING BEST PRACTICES (avoiding unexpected errors, naming all variables correctly, ...).
# To make it more dynamic, you can keep all the file names (including extension)
# in one file, on separate lines, e.g. at C:\FilesToWatch\watcher.txt, and load them:
# $filestowatch = Get-Content C:\FilesToWatch\watcher.txt
$filestowatch = "felicio.txt", "marcos.txt"

$userFiles  = dir C:\G\user\less\
$adminfiles = dir C:\G\admin\less\

# Optionally, instead of the if test below, you can filter up front:
# $userFiles  = dir C:\G\user\less\  | ? { $filestowatch -contains $_.Name }
# $adminfiles = dir C:\G\admin\less\ | ? { $filestowatch -contains $_.Name }
# Loading them this way, the first if statement in the code below can be removed,
# because we make sure $userFiles and $adminfiles only contain the files to monitor.
foreach ($userfile in $userFiles)
{
    if ($filestowatch -contains $userfile.Name)
    {
        $exactadminfile = $adminfiles | ? { $_.Name -eq $userfile.Name } | Select -First 1
        # My suggestion is to validate that it actually got the file.
        # Out of laziness I will not call Test-Path here; I'm assuming the directories
        # are exact copies of each other, so it will find the file.
        if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
        {
            Write-Verbose "Copying $($exactadminfile.FullName) to $($userfile.FullName)"
            Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
        }
        else
        {
            Write-Verbose "Copying $($userfile.FullName) to $($exactadminfile.FullName)"
            Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
        }
    }
}
I think what you need is symbolic links, aka symlinks.
With symlinks, you can define files and folders that will be always in-sync, where the target file is updated automatically when the original is modified.
To create a symbolic link, enter the following in the console/command prompt:
mklink /[command] [link path] [file or folder path]
Mklink can create several types of links, according to these switches:
/D – creates a directory symbolic link (a soft link, loosely similar to a folder shortcut in Windows). Without any switch, mklink creates a file symbolic link.
/H – creates a hard link to a file.
/J – creates a directory junction (a link to a folder).
The syntax is simple: choose your option, define the path you want for the link, and finally the path of the original file/folder.
For example, imagine I'm developing a new project and I want to share it with my client via a Dropbox shared folder. I don't want to move my whole workspace to Dropbox; I just want to share that specific folder with them:
mklink /J C:\Dropbox\clients_shared_folders\project_x C:\my_dev_rootfolder\project_x
Note that the first path is the symbolic folder I want to create, while the second path is the existing directory.
In your case, I'll assume you're working in the admin folder and want a synced copy in the user folder:
mklink /J C:\G\user\less C:\G\admin\less
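If you'd rather stay within PowerShell, New-Item can create the same kind of link natively (assuming PowerShell 5 or later; as with mklink, the link path must not already exist):
# PowerShell-native equivalent of the mklink /J example above
New-Item -ItemType Junction -Path C:\G\user\less -Target C:\G\admin\less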
Here's a nice article for more info:
http://www.howtogeek.com/howto/16226/complete-guide-to-symbolic-links-symlinks-on-windows-or-linux/

Copying subfolders and files, checking "last modified time"

I have made a backup script that:
Reads source file paths and destination folder path from an XML file
Checks if source file and destination path exist (for each file)
Checks if source file (same name) exists in the target folder
Checks the last modified date of every source and destination file, if the file exists in the target folder
Copies source files to the target folder if the file does not already exist, or if the source file is newer than the existing file in the destination folder, otherwise does nothing
This only works on source files; if a source folder is specified in the XML file, only the folder itself is copied, and none of its content.
I don't want to use Copy-Item -Recurse because I want to check the last modified date of every item, and if it fails the above conditions I don't want to copy it at all.
This brings me to Get-ChildItem -Recurse to list everything, but I'm having trouble coming up with something that works for this example:
C:\powershell\test\ (XML specified source)
Underlying structure:
C:\powershell\test\xmltest2.xml
C:\powershell\test\test2\xmltest.xml
C:\powershell\test\test3\test4\xmltest3.xml
etc.
i.e. I want to check every file before copying it, but if, say, a folder has not been modified while a file inside it should be copied, it should still work, AND retain the same folder structure.
Any ideas? :)
As Ansgar Wiechers says, you are reinventing the wheel; RoboCopy will do it much more easily. RoboCopy can also copy the security permissions and the created/modified dates, which is great. Relevant RoboCopy discussion: https://superuser.com/a/445137/67909
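For reference, a RoboCopy invocation in the same spirit might look like this (a sketch with a hypothetical destination path; check robocopy /? for the full switch list). /E copies subdirectories including empty ones, and /XO excludes files older than their destination counterparts, so only newer files are transferred:
robocopy C:\powershell\test C:\backup\test /E /XO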
Still, it's not as fun as writing it yourself, eh? I've come up with this:
# Assuming these two come from your XML config, somehow
$xmlSrc = "c:\users\test\Documents\test1"
$xmlDestPath = "c:\users\test\Documents\test2"

#==========
# Functions
#==========
function process-file ($item) {
    # $item should be a string: the full path to a file,
    # e.g. 'c:\users\test\Documents\file.txt'

    # Make the destination file's full path
    $destItem = $item.ToLower().Replace($xmlSrc, $xmlDestPath)

    if (-not (Test-Path $destItem)) { # File doesn't exist in the destination
        # Is there a folder to put it in? If not, make one
        $destParentFolder = Split-Path $destItem -Parent
        if (-not (Test-Path $destParentFolder)) { mkdir $destParentFolder }

        # Copy the file
        Copy-Item $item -Destination $destParentFolder -WhatIf
    } else { # File does exist
        if ((Get-Item $item).LastWriteTimeUtc -gt (Get-Item $destItem).LastWriteTimeUtc) {
            # Source file is newer, copy it
            $destParentFolder = Split-Path $destItem -Parent
            Copy-Item $item -Destination $destParentFolder -Force -WhatIf
        }
    }
}

function process-directory ($dir) {
    # This function mostly handles "copying" empty directories;
    # otherwise it's not really needed.

    # Make the destination folder path
    $destDir = $dir.ToLower().Replace($xmlSrc, $xmlDestPath)

    # If that doesn't exist, make it
    if (-not (Test-Path $destDir)) { mkdir $destDir -WhatIf }
}

#==========
# Main code
#==========
if ((Get-Item $xmlSrc).PsIsContainer) {
    # You specified a folder
    Get-ChildItem $xmlSrc -Recurse | ForEach {
        if ($_.PsIsContainer) {
            process-directory $_.FullName
        } else {
            process-file $_.FullName
        }
    } | Out-Null
} else {
    # You specified a file
    process-file $xmlSrc
}
NB. The copies use -WhatIf, so the script won't do anything drastic yet. And it has two immediate problems:
It makes everything lowercase. Otherwise you have to match the case properly, because .Replace() is case-sensitive. I used .Replace() because -replace treats the \ in the file path as part of a regular expression and doesn't work as-is; [regex]::Escape() can fix that (see the sketch below).
If you put \ at the end of $xmlSrc or $xmlDestPath, it will fall over.
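For instance, a case-preserving variant of the path rewrite in process-file (a sketch, not part of the original answer):
# [regex]::Escape neutralizes regex metacharacters such as \ in the source path;
# -replace is case-insensitive by default, so the original casing is preserved
$destItem = $item -replace [regex]::Escape($xmlSrc), $xmlDestPath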

Delete items at given path, and then all parents recursively

I have a path. It could be a path to a file, or a path to a directory.
If it is a path to a file, I need to delete the file, then check whether there are any files left in the same folder; if not, delete that folder as well, then check its parent folder, and so on.
If it is a path to a directory, delete the directory, then check if the parent is empty; if so, delete it as well, then its parent, and so on.
This script will remove the top folder in the path, including everything under it. The $path variable can point to either a file or a directory.
$path = "D:\temp\temp2\file.txt"
$parts = $path.Split([System.IO.Path]::DirectorySeparatorChar)
# The following will remove D:\temp and everything in it
Remove-Item (Join-Path $parts[0] $parts[1]) -Recurse
I guess by combining these it should be possible to build something:
Get-ChildItem
Split-Path $path -parent
Remove-Item
If you haven't already done the job, this might help you:
You can use this to find out if the child item is a folder
| ? {$_.PSIsContainer}
and combined with this you can see if it is an empty folder
| ? {$_.GetFiles().Count -eq 0}
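Putting those pieces together, a minimal sketch (assuming $path may point to either a file or a directory):
$path = 'D:\temp\temp2\file.txt'   # file or directory

# Remove the item itself (-Recurse in case it is a non-empty directory)
Remove-Item -LiteralPath $path -Recurse -Force

# Walk up the tree, removing each parent while it is empty
$parent = Split-Path $path -Parent
while ($parent -and -not (Get-ChildItem -LiteralPath $parent -Force)) {
    Remove-Item -LiteralPath $parent
    $parent = Split-Path $parent -Parent
}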
Good luck!