PowerShell: Rename Files in a Directory Synchronously

I have a script that renames all the files in a directory from fields/columns after importing a .CSV. My problem is that PS is renaming the files asynchronously and not synchronously. Is there a better way to get the result I want?
Current file name = 123456789.pdf
New File Name = $documentID_$fileID
I need the new file name to be applied to the files in order for the script to be viable.
Here's my code (I'm new at this):
$csvPath = "C:\Users\dougadmin28\Desktop\Node Modify File Name App\test.csv"
$filePath = "C:\Users\dougadmin28\Desktop\Node Modify File Name App\pdfs"

$csv = Import-Csv $csvPath | Select-Object -Skip 0
$files = Get-ChildItem $filePath

foreach ($item in $csv) {
    foreach ($file in $files) {
        Rename-Item $file.FullName -NewName "$($item.DocumentID + "_" + ($item.FileID) + ($file.Extension))" -Verbose
    }
}

You may try using workflows, which would allow you to execute tasks in parallel:
https://learn.microsoft.com/en-us/powershell/module/psworkflow/about/about_foreach-parallel?view=powershell-5.1
Keep in mind that PowerShell Workflows have some limitations:
https://devblogs.microsoft.com/scripting/powershell-workflows-restrictions/
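A very rough, untested sketch of what that could look like for your rename loop (the CurrentName column is an assumption about your CSV, and workflows require named parameters, hence -Path everywhere):
workflow Rename-PdfFiles
{
    param([string]$CsvPath, [string]$PdfFolder)

    $rows = Import-Csv -Path $CsvPath

    foreach -parallel ($row in $rows)
    {
        # Assumes the CSV has a column holding each file's current name
        # (e.g. 123456789.pdf); change 'CurrentName' to your actual column.
        Rename-Item -Path "$PdfFolder\$($row.CurrentName)" -NewName "$($row.DocumentID)_$($row.FileID).pdf"
    }
}

Rename-PdfFiles -CsvPath $csvPath -PdfFolder $filePath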
Hope it helps!

I thought synchronous meant sequential, as in 'one after the other', which is what your script is doing now.
If you mean 'in parallel', as in asynchronously or independent of each other, you can look at using:
Background Jobs. They include Start-Job, Wait-Job and Receive-Job. Easiest to work with but not efficient in terms of performance. Also available on some cmdlets as an -AsJob switch. (See the sketch below.)
PowerShell Runspaces. [Most efficient but hard to code for]
PowerShell Workflows. [Balanced but has limitations]
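For the Background Jobs option, a bare-bones sketch (untested; the rename logic inside the script block is just a placeholder):
# Start one background job per CSV row (placeholder rename logic inside the script block)
$jobs = foreach ($item in $csv) {
    Start-Job -ScriptBlock {
        param($documentId, $fileId, $folder)
        # ... rename the matching file in $folder here ...
    } -ArgumentList $item.DocumentID, $item.FileID, $filePath
}

# Wait for all jobs to finish, collect their output, then clean up
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job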

Related

Copy files after time x based on modification date

I need a script that only copies files older than 5 minutes based on the modification date. Does anyone have a solution for this?
I couldn't find any script online.
The answer from jdweng is a good solution to identify the files in scope.
You could structure your script something like this to easily reuse it with other paths or file ages.
# Customizable variables
$Source = 'C:\Temp\Input'
$Destination = 'C:\Temp\Output'
[int32]$FileAgeInMinutes = 5
# Script Execution
Get-ChildItem -Path $Source | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMinutes(-$FileAgeInMinutes) } | Copy-Item -Destination $Destination
You could then run this script as a scheduled task and schedule it to run periodically, depending on your need.
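If you go the scheduled task route, something along these lines could register it (the script path is hypothetical, and on older Windows versions you may also need -RepetitionDuration):
# Assumes the copy script above was saved as C:\Scripts\Copy-AgedFiles.ps1 (hypothetical path)
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File "C:\Scripts\Copy-AgedFiles.ps1"'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5)
Register-ScheduledTask -TaskName 'Copy aged files' -Action $action -Trigger $trigger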

powershell - read all .sql files in a folder and save them all into a single .sql file without changing line endings or line feeds

I manage database servers and often have to apply scripts to different servers or databases.
Sometimes these scripts are all saved in a directory and need to be opened and run in the target server\database.
As I have been looking at automating this task, I came across Run All PowerShell Scripts In A Directory and also How can I execute a set of .SQL files from within SSMS?, and that is exactly what I needed; however, I stumbled over a few issues:
I don't know the file names, so I can't list them out in a SQLCMD script like this:
:setvar path "c:\Path_to_scripts\"
:r $(path)\file1.sql
:r $(path)\file2.sql
I tried to combine all the .sql files into one big file, but when I copied from PowerShell into SQL, the lines got messed up in many of the procedures that had long lines:
cls
$Radhe = Get-Content 'D:\apply all scripts to SQLPRODUCTION\*.sql' -Raw
$Radhe.Count
$Radhe.LongLength
$Radhe
If I could read all the files in that specific folder and save them all into a single the_scripts_to_run.sql file, without changing the line endings, that would be perfect.
I don't need to use get-content or any command in particular, I just would like to get all my scripts into a big single script with everything in it, without changes.
How can I achieve that?
I even found Merge multiple SQL files into a single SQL file but I want to get it done via powershell.
This should work fine. I'm not sure what you mean by not needing to use Get-Content; you could use [System.IO.File]::ReadAllLines( ) or [System.IO.File]::ReadAllText( ), but this should work fine too. Try it and let me know if it works.
$path = "c:\Path_to_scripts"
$scripts = (Get-ChildItem "$path\*.sql" -Recurse -File).FullName

# Collect each script's lines, then write them all to one merged file
$merged = [System.Collections.Generic.List[string[]]]::new()
foreach ($script in $scripts)
{
    $merged.Add((Get-Content $script))
}
$merged | Out-File "$path\mergedscripts.sql"
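If you did want to try the [System.IO.File] route mentioned above, a rough equivalent would be (same assumed paths; ReadAllText keeps each file's original line endings):
$path = "c:\Path_to_scripts"
$target = "$path\mergedscripts.sql"

foreach ($script in (Get-ChildItem "$path\*.sql" -Recurse -File).FullName) {
    # Append each script's raw text to the merged file
    [System.IO.File]::AppendAllText($target, [System.IO.File]::ReadAllText($script))
}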
This is actually much simpler than the proposed solutions. Get-Content takes a list of paths and supports wildcards, so no loop is required, and -Raw reads each file as a single string, keeping its original line endings intact.
$path = 'c:\temp\sql'
Set-Content -Path "$path\the_scripts_to_run.sql" -Value (Get-Content -Path "$path\*.sql" -Raw)
Looks like @Santiago and I had the same idea:
Get-ChildItem -Path $path -Filter "*.sql" | ForEach-Object -Process {
    Get-Content $_.FullName | Out-File "$path\stuff.txt" -Append -Encoding utf8
}

Powershell 5.1: How to iterate over files in parallel

I need to copy files depending on their content. So I get all the files, read the content and check whether it matches a regex. After that I want to copy the file to a certain directory.
My problem is, that there are a lot of source files, so I need to execute this in parallel.
I cannot use PowerShell ForEach-Object Parallel Feature because we are using Powershell Version < 7.0.
Using a workflow is way too slow.
$folder = "C:\InputFiles"

workflow CopyFiles
{
    foreach -parallel ($file in gci $folder *.* -rec | where { ! $_.PSIsContainer })
    {
        # Get content and compare against a regex
        # Copy if regex matches
    }
}

CopyFiles
Any ideas how to run this in a parallel manner with Powershell?
Another option is using jobs. You'd have to define a ScriptBlock accepting a path and a regex as parameters, then run it in parallel in the background. Read about the Start-Job, Receive-Job, Get-Job and Remove-Job cmdlets; a rough sketch follows the list below.
But I don't think it's really going to help:
I don't expect it to be much faster than workflows
You'd have to throttle and control execution of jobs yourself, adding complexity to the script
There's substantial overhead to running jobs
Most probably the file system is the bottleneck of this task, so any approach accessing files in parallel isn't really going to help here
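That said, the job-based approach described above would look roughly like this (untested sketch; the regex, paths and one-job-per-file layout are assumptions, and in practice you would batch files per job and throttle how many run at once, which is exactly the added complexity mentioned above):
$regex = 'some-pattern'           # placeholder regex
$destination = 'C:\OutputFiles'   # placeholder destination
$files = Get-ChildItem 'C:\InputFiles' -Recurse -File

# One job per file, purely to illustrate the shape of the solution
$jobs = foreach ($file in $files) {
    Start-Job -ScriptBlock {
        param($path, $pattern, $dest)
        # Copy the file only if its content matches the pattern
        if ((Get-Content -Path $path -Raw) -match $pattern) {
            Copy-Item -Path $path -Destination $dest
        }
    } -ArgumentList $file.FullName, $regex, $destination
}

$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job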
Can you run the following script with your configuration and see how much time it takes with this method? It takes 100 ms for me to find around 2000 occurrences of the text 'PowerShell' in it.
$starttime = Get-Date;
$RegEx = 'Powershell'
$FilesFound = Get-ChildItem -Path "$PSHOME\en-US\*.txt" | Select-String -Pattern $RegEx
Write-Host "Total occurence found: $($FilesFound.Count)"
$endtime = Get-Date;
Write-Host "Time of execution:" ($endtime - $starttime).TotalMilliseconds "milliseconds";
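If you prefer, Measure-Command wraps the same timing without the manual date math (same paths as above):
$elapsed = Measure-Command {
    Get-ChildItem -Path "$PSHOME\en-US\*.txt" | Select-String -Pattern 'PowerShell'
}
Write-Host "Time of execution: $($elapsed.TotalMilliseconds) milliseconds"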

Using a filename for a variable and move similar files with Powershell

I am trying to write a PowerShell script that will look at a directory (C:\Temp) and look for files with extensions of .fin.
If it finds a file with the .fin extension, I need it to move all files that have the same first seven characters as that .fin file to another directory (C:\Temp\fin).
There could be multiple .fin files at a time in that directory with files that need to be moved.
I have tried a few different things but I am new to anything like this. I have only used very basic PowerShell scripts or commands. While this might be basic (not sure), I am lost as to where to start.
You'll need to call Get-ChildItem twice: the first time to get the pattern you want to match the file names against, and the second time to get the files. You can then pipe the results of the second command to Move-Item. Here's an example of how to do that:
[CmdletBinding()]
param(
    $DirectoryToScan,
    $ExtensionToMatch = ".fin",
    $TargetDirectory
)
$Files = Get-ChildItem -Path $DirectoryToScan -File
foreach ($File in $Files) {
    if ($File.Extension -eq $ExtensionToMatch) {
        # Assumes base names are at least seven characters long
        $PatternToMatch = "$($File.BaseName.Substring(0, 7))*$ExtensionToMatch"
        Write-Verbose "PatternToMatch: $PatternToMatch"
        $MatchedFiles = Get-ChildItem -Path $DirectoryToScan -Filter $PatternToMatch
        $MatchedFiles | Move-Item -Destination $TargetDirectory
    }
}
If you save the above into a file called Move-MatchingFiles.ps1, you can pass in your parameters: .\Move-MatchingFiles.ps1 -DirectoryToScan C:\Temp -TargetDirectory C:\Temp\fin. The ExtensionToMatch parameter is optional and only needed if you want to move a different type of file.

Create PS script to find files

I want to start by saying coding is a bit outside of my skill set, but because a certain problem keeps appearing at work, I'm trying to automate a solution.
I use the below script to read an input file for a list of names, search C:\ for those files, then write the path to an output file if any are found.
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    if ($line -match $regex) {
        gci -Path "C:\" -recurse -Filter $line -ErrorAction SilentlyContinue |
            Out-File -Append c:\temp\ResultsFindFile.txt
    }
}
I would like to make two modifications to this. First, to search all drives connected to the computer, not just C:\. Next, to be able to delete any found files. I'm using the Remove-Item -Confirm command but so far can't make it delete the file it just found.
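One way to cover both changes, sketched as an untested example: Get-PSDrive -PSProvider FileSystem lists every file-system drive, and piping each hit to Remove-Item handles the deletion (the -WhatIf switch is my addition as a safety net; swap it for -Confirm or drop it once the output looks right, and your $regex filter on each line is left out here for brevity):
$names = Get-Content 'C:\temp\InPutfile.txt'
$drives = Get-PSDrive -PSProvider FileSystem

foreach ($name in $names) {
    foreach ($drive in $drives) {
        Get-ChildItem -Path $drive.Root -Recurse -Filter $name -File -ErrorAction SilentlyContinue |
            ForEach-Object {
                $_.FullName | Out-File -Append 'C:\temp\ResultsFindFile.txt'
                Remove-Item -LiteralPath $_.FullName -WhatIf   # safety net while testing
            }
    }
}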