I have 20 reports that come in weekly. I have a PowerShell script that will create a folder with the date on it. I would like the script to create a new folder each time it runs and then move all the .xlsx files into that new folder.
For the folder creation I am using
New-Item -ItemType Directory -Path "S:\***\***\Reporting\Export\$((Get-Date).ToString('yyyy-MM-dd'))"
The roadblock I am running into is finding a way to specify the new folder as the target path when moving the files.
Store the newly created directory in a variable, then use that variable for the -Destination parameter when moving:
$destination = New-Item -ItemType Directory -Path "S:\SOC\Reports\Reporting\Export\$((Get-Date).ToString('yyyy-MM-dd'))"
Move-Item -Path "$SourcePath\*" -Destination $destination
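Since the question is specifically about the weekly .xlsx reports, you can narrow the wildcard so only those files are moved:
Move-Item -Path "$SourcePath\*.xlsx" -Destination $destination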
Good afternoon all,
I'm guessing this is super easy, but it's really annoying me: I have a text file with a list of files. The same folders contain LOTS of other files, but I only need the specific ones.
$Filelocs = get-content "C:\Users\me\Desktop\tomove\Code\locations.txt"
Foreach ($Loc in $Filelocs){xcopy.exe $loc C:\Redacted\output /s }
I figured this would go through the list, which contains entries like
"C:\redacted\Policies\IT\Retracted Documents\Policy_Control0.docx"
and then recreate the folder structure in a new place and copy each file. It doesn't.
Any help would be appreciated.
Thanks
RGE
xcopy can't know the folder structure when you explicitly pass a source file path instead of a source directory. In a path like C:\foo\bar\baz.txt, the base directory could be any of C:\, C:\foo\ or C:\foo\bar\.
When working with a path list, you have to build the destination directory structure yourself: resolve the paths from the text file to relative paths, join them with the destination directory, create the parent directory of each file, and finally use PowerShell's own Copy-Item command to copy the file.
$Filelocs = Get-Content 'locations.txt'
# Base directory common to all paths specified in "locations.txt"
$CommonInputDir = 'C:\redacted\Policies'
# Where files shall be copied to
$Destination = 'C:\Redacted\output'
# Temporarily change current directory -> base directory for Resolve-Path -Relative
Push-Location $CommonInputDir
Foreach ($Loc in $Filelocs) {
    # Resolve input path relative to $CommonInputDir (current directory)
    $RelativePath = Resolve-Path $Loc -Relative

    # Resolve full target file path and directory
    $TargetPath = Join-Path $Destination $RelativePath
    $TargetDir  = Split-Path $TargetPath -Parent

    # Create the target directory if it doesn't already exist (-Force),
    # because Copy-Item fails if the directory does not exist.
    $null = New-Item $TargetDir -ItemType Directory -Force

    # Finally, copy the file
    Copy-Item -Path $Loc -Destination $TargetPath
}
# Restore current directory that has been changed by Push-Location
Pop-Location
Possible improvements, left as an exercise:
Automatically determine common base directory of files specified in "locations.txt". Not trivial but not too difficult.
Make the code exception-safe. Wrap everything between Push-Location and Pop-Location in a try{} block and move Pop-Location into the finally{} block so the current directory is restored even when a script-terminating error occurs. See about_Try_Catch_Finally.
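For the second point, a minimal sketch of that try/finally structure (the loop body stays exactly as above):
Push-Location $CommonInputDir
try {
    Foreach ($Loc in $Filelocs) {
        # ... same loop body as above ...
    }
}
finally {
    # Runs even if a script-terminating error occurs inside the loop
    Pop-Location
}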
I am using the following code to create a folder named with the current date:
$Path = 'C:\temp'
$folderName = (Get-Date).ToString("ddMMyyyy")
New-Item -ItemType Directory -Path $Path -Name $folderName
Get-ChildItem -Path $Path
After this code, I want to add more code that copies files from a certain directory and puts them into this newly created folder named with the current date.
Thanks
Check this
cp C:\temp\filename.txt "$path\$foldername\"
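If you want to copy everything from a source directory rather than a single file, something like this should work (here 'C:\source' is just a placeholder for your actual directory):
Copy-Item -Path 'C:\source\*' -Destination "$Path\$folderName" -Recurse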
I have a simple folder structure that I need to zip up to upload to an AWS Lambda function:
node-modules/
index.js
package.json
The files above are in the root directory. Basically, the end goal is a zip of all of these files and their sub-files/directories, with the structure as it is.
When I try to run the command below, it says it can't access E://apps/myapp/release.zip because it's being used by another process. However, I can see it starts to create release.zip, but it does not have all the contents.
Add-Type -Assembly "System.IO.Compression.FileSystem";
[System.IO.Compression.ZipFile]::CreateFromDirectory("E://apps/myapp", "E://apps/myapp/release.zip");
So I tried another approach. Take the folder and two files and copy them into a temporary folder, then try to zip it back into the root.
Copy-Item E://apps/myapp E://apps/myapp/temp -recurse
I do see a temp/ folder, but within it there's an inception of never-ending copies until the file path gets too long.
Any tips would be much appreciated.
The issue could be that you are creating the zip file in the same folder that you are trying to compress. You basically did the same thing when you tried using the temporary folder, copying the folder into itself, hence the inception.
Try creating the destination outside of the source folder being compressed.
$source = "E://apps/myapp"
$destination = "E://apps/myapp.release.zip"

# If the archive already exists, CreateFromDirectory throws an IOException.
if (Test-Path $destination) {
    Remove-Item -Path $destination -Force
}

Add-Type -AssemblyName "System.IO.Compression.FileSystem"
[System.IO.Compression.ZipFile]::CreateFromDirectory($source, $destination)

# Once the archive has been created, delete the source folder
if (Test-Path $destination) {
    Remove-Item -Path $source -Force -Recurse -Confirm:$false
    Write-Host "directory removed: $source"
}
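As an aside, on PowerShell 5.0 or later the built-in Compress-Archive cmdlet avoids loading the assembly manually; the same caveat about keeping the destination outside the source folder applies:
Compress-Archive -Path "$source\*" -DestinationPath $destination -Force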
I'm having a few issues while copying desktop files as part of the backup process from a local folder to a server.
My script automatically creates a backup folder on the server, and I just realised that when it creates the backup folder, Windows automatically creates a desktop.ini file in it. As desktop.ini is a system file, the script fails to copy another desktop.ini from the desktop to the backup folder on the server.
My initial code was:
Copy-Item ("$Global:localBackupRestoreLocation\Desktop\$file") ("$Global:remoteBackupRestorePath\$nameOfbackupDirectory") ($recursive="true")
But I want to use -Exclude in that line, and I know -Exclude does not work recursively, so I have to use Get-ChildItem.
Try something like this
$Exclude = @('desktop.ini')
Copy-Item -Path $Source -Destination $Dest -Exclude $Exclude
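Note that -Exclude only filters at the top level. If the exclusion also needs to apply recursively, here is a sketch of one approach, reusing the variable names from the question: enumerate the files with Get-ChildItem and rebuild each relative path under the destination.
$Source = "$Global:localBackupRestoreLocation\Desktop"
$Dest   = "$Global:remoteBackupRestorePath\$nameOfbackupDirectory"
Get-ChildItem -Path $Source -Recurse -File |
    Where-Object { $_.Name -ne 'desktop.ini' } |
    ForEach-Object {
        # Rebuild the file's relative path under the destination
        $target = Join-Path $Dest $_.FullName.Substring($Source.Length).TrimStart('\')
        # Create the parent directory first; Copy-Item fails if it is missing
        $null = New-Item -ItemType Directory -Path (Split-Path $target -Parent) -Force
        Copy-Item -Path $_.FullName -Destination $target
    }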
I know there has to be a simple answer, but how can I run both of my commands at the same time? I can run one line to create the new directory and name it, and another to copy items from one folder to another. But how do I create the folder and copy the files into that newly created folder?
#Creates new directory and names it current date and time.
New-Item C:\ -type directory -Name ("$(Get-Date -f ddMMyyyy_hhmm)")
#Copies Data from one directory to another, works good.
Copy-Item C:\test\* C:\test2
Thanks for your help.
You could use the FullName property of the object that New-Item returns as the destination for another command, i.e.:
Copy-Item C:\somefolder\* -Destination (New-Item -type Directory -name (Get-Date -f ddMMyyyy_hhmmss)).FullName
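Equivalently, you can capture the new folder in a variable first, which makes it easier to reuse later in the script (the same pattern as the one-liner, just split in two):
$newFolder = New-Item C:\ -Type Directory -Name ("$(Get-Date -f ddMMyyyy_hhmm)")
Copy-Item C:\test\* -Destination $newFolder.FullName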