Report generation using a for loop in PowerShell

I have a program which generates reports by reading an .XML file, and I have to generate reports for multiple files.
The problem I am facing is that the program reads only one file per run, so I have to run it once for each file.
Is there any way to generate reports for multiple files in one click?
So far I have tried the code below:
$a = Get-ChildItem "D:\Directory1\Files\*.xml"
foreach ($i in $a)
{
    Move-Item $i "D:\Directory1\"
    if ($a) {
        D:\Directory1\Program1.exe /run /exit /SilentMode
    }
}
With the above code I am trying to read the files from "D:\Directory1\Files\", move one file at a time (not all files) to "D:\Directory1\", start "Program1.exe" to generate the report, and repeat this as long as .xml files exist in "D:\Directory1\Files\".

Is your goal to copy all files from D:\Directory1\Files\ to D:\Directory1\ in one step and then run D:\Directory1\Program1.exe /run /exit /SilentMode?
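If so, a minimal sketch of that one-step variant (assuming Program1.exe picks up every .xml sitting in D:\Directory1\ on its own) would be:

# Copy all the XML files over at once, then run the program a single time
Copy-Item -Path "D:\Directory1\Files\*.xml" -Destination "D:\Directory1\"
Start-Process -FilePath "D:\Directory1\Program1.exe" -ArgumentList "/run /exit /SilentMode" -Wait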
EDIT:
Does this work for you?
0. Set the location your program works in
1. Get all the files
2. For each file:
3. Move the file to the new location
4. Start your program
5. Remove the moved file
Set-Location -Path "D:\Directory1\"
$arrFiles = Get-ChildItem -Path "D:\Directory1\Files\*.xml"
foreach ($objFile in $arrFiles) {
    # Move the file next to the program, run the program, then clean up
    Move-Item -Path $objFile.FullName -Destination "D:\Directory1\$($objFile.Name)"
    Start-Process -FilePath "D:\Directory1\Program1.exe" -ArgumentList "/run /exit /SilentMode" -Wait
    Remove-Item -Path "D:\Directory1\$($objFile.Name)"
}

Your logic here was sound. However, one issue is that the script would continue processing even while Program1.exe is running, making it possible for it to seemingly skip files. Also, your if statement just checks whether $a contains data, which it always will in your example, so the condition check is moot.
What you can do is something like this:
$moveLocation = "D:\Directory1\"
Get-ChildItem "D:\Directory1\Files\*.xml" | ForEach-Object {
    # Move the file to its new location
    Move-Item -Path $_.FullName -Destination $moveLocation
    # -Wait blocks until Program1.exe exits before the next file is processed
    Start-Process -FilePath "D:\Directory1\Program1.exe" -ArgumentList "/run /exit /SilentMode" -Wait
}
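Note that this version leaves each processed file behind in D:\Directory1\. If Program1.exe expects only one .xml there at a time, you could remove the file after each run, as the first answer does; a minimal sketch of that extra line inside the loop:

Remove-Item -Path (Join-Path $moveLocation $_.Name)  # clean up the file the program just consumed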

Related

Using a filename for a variable and moving similar files with PowerShell

I am trying to write a PowerShell script that will look at a directory (C:\Temp) for files with the extension .fin.
If it finds a file with the .fin extension, I need it to move all files that share the first seven characters of that .fin file's name to another directory (C:\Temp\fin).
There can be multiple .fin files in the directory at a time, each with files that need to be moved.
I have tried a few different things, but I am new to this and have only used very basic PowerShell scripts and commands. While this might be basic (not sure), I am lost as to where to start.
You'll need to call Get-ChildItem twice: once to get the pattern you want to match the file names against, and a second time to get the files themselves. You can then pipe the results of the second command to Move-Item. Here's an example of how to do that:
[CmdletBinding()]
param(
    $DirectoryToScan,
    $ExtensionToMatch = ".fin",
    $TargetDirectory
)

$Files = Get-ChildItem -Path $DirectoryToScan -File
foreach ($File in $Files) {
    if ($File.Extension -eq $ExtensionToMatch) {
        # Match every file whose name starts with the same seven characters
        $PatternToMatch = "$($File.BaseName.Substring(0, 7))*$ExtensionToMatch"
        Write-Verbose "PatternToMatch: $PatternToMatch"
        $MatchedFiles = Get-ChildItem -Path $DirectoryToScan -Filter $PatternToMatch
        $MatchedFiles | Move-Item -Destination $TargetDirectory
    }
}
If you save the above into a file called Move-MatchingFiles.ps1, you can pass in your parameters: Move-MatchingFiles.ps1 -DirectoryToScan C:\Temp -TargetDirectory C:\Temp\fin. The ExtensionToMatch parameter is optional and only needed if you want to move a different type of file.
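Since the script declares [CmdletBinding()] and uses Write-Verbose, you can also pass the common -Verbose switch to watch each pattern being built, for example:

.\Move-MatchingFiles.ps1 -DirectoryToScan C:\Temp -TargetDirectory C:\Temp\fin -Verbose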

Sync Folders with PowerShell - New and Edited Files

I am trying to sync two folders with PowerShell.
Comparing and copying any new files works just fine, but I also want to copy all files that were modified in the reference folder.
The following code works and copies all new files created in the reference folder:
$folderReference = 'C:\Users\Administrator\Desktop\TestA'
$folderToSync = 'C:\Users\Administrator\Desktop\TestB'

$referenceFiles = Get-ChildItem -Recurse -Path $folderReference
$FolderSyncFiles = Get-ChildItem -Recurse -Path $folderToSync

$fileDiffs = Compare-Object -ReferenceObject $referenceFiles -DifferenceObject $FolderSyncFiles

foreach ($File in $fileDiffs) {
    try {
        if ($File.SideIndicator -eq "<=") {
            $FullSourceObject = $File.InputObject.FullName
            $FullTargetObject = $File.InputObject.FullName.Replace($folderReference, $folderToSync)
            Write-Host "copy File: " $FullSourceObject
            Copy-Item -Path $FullSourceObject -Destination $FullTargetObject
        }
    }
    catch {
        Write-Error -Message "Something went wrong!" -ErrorAction Stop
    }
}
Now I also want to copy the modified files.
I tried adding -Property LastWriteTime to the Compare-Object call, but I get a WriteErrorException when running the code.
Do you have any tips on how to get this code to run properly?
Thanks in advance.
I'd just use robocopy; it's built specifically for this type of task and is included by default in most modern versions of Windows:
robocopy C:\Source C:\Destination /Z /XA:H /W:5
/Z - resumes copy if interrupted
/XA:H - ignores hidden files
/W:5 - shortens wait for failures to 5 sec (default 30)
It's worth taking a look through the documentation, as there are many different options for practically every situation you can think of.
For example, add /MIR and it will remove any files from the destination when they are deleted from the source.
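If you'd rather stay in pure PowerShell, one likely cause of the WriteErrorException is that Compare-Object -Property Name, LastWriteTime emits stripped-down objects that no longer have a FullName. A minimal sketch (reusing $referenceFiles, $FolderSyncFiles, and the folder variables from the question) that keeps the original FileInfo objects via -PassThru:

# Compare on Name and LastWriteTime so modified files also show up as differences;
# -PassThru returns the original FileInfo objects with a SideIndicator attached
$fileDiffs = Compare-Object -ReferenceObject $referenceFiles -DifferenceObject $FolderSyncFiles -Property Name, LastWriteTime -PassThru

foreach ($file in $fileDiffs) {
    if ($file.SideIndicator -eq "<=") {
        $target = $file.FullName.Replace($folderReference, $folderToSync)
        Copy-Item -Path $file.FullName -Destination $target -Force
    }
}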

Create PS script to find files

I want to start by saying coding is a bit outside of my skill set, but because a certain problem keeps appearing at work, I'm trying to automate a solution.
I use the script below to read an input file for a list of names, search C:\ for those files, and then write the path to an output file if any are found.
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    if ($line -match $regex) {
        gci -Path "C:\" -Recurse -Filter $line -ErrorAction SilentlyContinue |
            Out-File -Append c:\temp\ResultsFindFile.txt
    }
}
I would like to make two modifications to this. First, search all drives connected to the computer, not just C:\. Second, delete any found files. I'm using the Remove-Item -Confirm command, but so far I can't make it delete the file it just found.
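A minimal sketch of both changes (keeping the same input and output file paths as the snippet above) could enumerate the filesystem drives with Get-PSDrive and pipe each match to Remove-Item:

# Search every filesystem drive for each name, log the hit, then delete it
$drives = Get-PSDrive -PSProvider FileSystem
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    foreach ($drive in $drives) {
        Get-ChildItem -Path $drive.Root -Recurse -Filter $line -ErrorAction SilentlyContinue |
            ForEach-Object {
                $_.FullName | Out-File -Append c:\temp\ResultsFindFile.txt
                Remove-Item -Path $_.FullName -Force   # drop -Force to be prompted per file
            }
    }
}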

Copy files in PowerShell too slow

Good day, all. New member here and relatively new to PowerShell, so I'm having trouble figuring this one out. I have searched for two days now but haven't found anything that quite suits my needs.
I need to copy folders created on the current date to another location using mapped drives. These folders live under five other folders, based on language:
Folder1\Folder2\Folder3\Folder4\chs, enu, jpn, kor, tha
The folders to be copied all start with the same letters followed by numbers - abc123456789_111. With the following script, I don't need to worry about folder names, because only the folder I need will have the current date.
The folders that the abc* folders live in each have about 35k files and over 1,500 folders.
I have gotten all of this to work using Get-ChildItem, but it is so slow that the developer could manually copy the files by the time the script completes. Here is my script:
GCI -Path $SrcPath -Recurse |
    Where {$_.LastWriteTime -ge (Get-Date).Date} |
    Copy -Destination {
        if ($_.PSIsContainer) {
            Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.Length)
        } else {
            Join-Path $DestPath $_.FullName.Substring($SrcPath.Length)
        }
    } -Force -Recurse
(This only copies to one destination folder at the moment.)
I have also been looking into using cmd /c dir and cmd /c forfiles but haven't been able to work it out. dir will list the folders, but not by date. forfiles has turned out to be pretty slow, too.
I'm not a developer, but I'm trying to learn as much as possible. Any help/suggestions are greatly appreciated.
@BaconBits is right: you have -Recurse on your Copy-Item as well as your Get-ChildItem. This will cause a lot of extra, pointless copies that are just overwrites due to your -Force parameter. Change your script to use a ForEach-Object loop and drop the -Recurse parameter from Copy-Item:
GCI -Path $SrcPath -Recurse |
    Where {$_.LastWriteTime -ge (Get-Date).Date} | % {
        # Work out the mirrored destination path for the current item
        if ($_.PSIsContainer) {
            $dest = Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.Length)
        } else {
            $dest = Join-Path $DestPath $_.FullName.Substring($SrcPath.Length)
        }
        # Inside ForEach-Object, Copy-Item needs an explicit -Path
        Copy -Path $_.FullName -Destination $dest -Force
    }
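If walking ~35k files with Get-ChildItem is the real bottleneck, it may also be worth trying robocopy here, since its directory traversal is much faster. A rough sketch, with the caveat that /MAXAGE (exclude files older than the given age or YYYYMMDD date) only approximates the LastWriteTime filter above:

robocopy $SrcPath $DestPath /E /MAXAGE:$(Get-Date -Format yyyyMMdd)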

PowerShell - restore previous versions of files

We got hit by a virus. It changed every common file extension to .kmybamf (.txt >> .txt.kmybamf), and if I just delete the .kmybamf, the file is still damaged...
So I made a list of all the files that got damaged, and now I'm trying to overwrite them with their previous versions. Does anyone know how to do this in PowerShell?
I can do it in cmd with something similar to this:
subst X: \\localhost\D$\#GMT-2011.09.20-06.00.04_Data
robocopy Z: D:\Folder\ /E /COPYALL
But I want to do it in one shot in PowerShell. It has to be "if .kmybamf is found, then restore the previous version", and PowerShell seems to have no cmdlet for restoring previous versions of files or folders.
$fileList = Get-Content -Path "\\localhost\D$\#GMT-2011.09.20-06.00.04_Data"
$destinationFolder = "D:\Folder\"

foreach ($file in $fileList)
{
    Copy-Item -Path $file -Destination $destinationFolder -Force
}
This will also work, but I find it less readable:
Get-Content -Path "\\localhost\D$\#GMT-2011.09.20-06.00.04_Data" | ForEach-Object { Copy-Item -Path $_ -Destination "D:\Folder" -Force }
Note that Get-Content is for reading the text from a file; to read the files from a folder, you would have to use Get-ChildItem.
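For the "if .kmybamf is found, restore it" part of the question, a minimal sketch (the snapshot path, and the assumption that it mirrors D:\Folder, are carried over from the question) could locate the damaged files with Get-ChildItem and pull the matching originals from the snapshot:

# Find every damaged file and copy its pre-infection version from the snapshot
$snapshot = '\\localhost\D$\#GMT-2011.09.20-06.00.04_Data'
$root = 'D:\Folder'

Get-ChildItem -Path $root -Recurse -Filter *.kmybamf | ForEach-Object {
    # Strip the appended .kmybamf to recover the original file name
    $original = $_.FullName.Substring(0, $_.FullName.Length - ".kmybamf".Length)
    # Same relative path inside the snapshot folder
    $source = Join-Path $snapshot ($original.Substring($root.Length).TrimStart('\'))
    if (Test-Path $source) {
        Copy-Item -Path $source -Destination $original -Force
        Remove-Item -Path $_.FullName   # remove the damaged copy
    }
}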