Appending outputs to an existing text file - PowerShell

I have the piece of PowerShell code below, which creates an individual text file in the folder C:\Users\XX\Desktop\info\ for each zip file in the folder C:\Users\XX\Desktop\Powershell\Zip, with each text file named after its zip file.
Get-ChildItem -Path "C:\Users\XX\Desktop\Powershell\Zip" -Recurse -exclude '*.info' | ForEach { [System.IO.File]::WriteAllText("C:\Users\XX\Desktop\info\"+ $_.Name + ".txt", $_.FullName)}
On top of that, I have the script below, which gets the last modified date for the zip files:
$path = 'C:\Users\XX\Desktop\Powershell\Zip'
$files = Get-ChildItem $path -Recurse -Exclude '*.info'
foreach($file in $files){
    $file.lastwritetime
}
and also this command that gets the computer name:
(Get-WmiObject Win32_Computersystem).name
All of these will be in one script, but I need the outputs of the 2nd and 3rd sections of the script to be appended to the text files created in the first section, each output going to the appropriate file.
I have tried a couple of commands, the main one being [System.IO.File]::AppendAllText, but I can't seem to get anywhere with this.
Any ideas on the right way I should be doing this?
Thank you.

You can try this:
$path = 'C:\Users\XX\Desktop\Powershell\Zip'
$files = Get-ChildItem $path -Recurse -Exclude '*.info'
$ComputerName = (Get-WmiObject Win32_Computersystem).name
foreach ($file in $files) {
    $OutputFilePath = "C:\Users\XX\Desktop\info\" + $file.Name + ".txt"
    [System.IO.File]::WriteAllText($OutputFilePath, $file.FullName)
    $file.LastWriteTime | Add-Content $OutputFilePath
    $ComputerName | Add-Content $OutputFilePath
}
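If you would rather stay with the .NET method mentioned in the question, a minimal sketch using [System.IO.File]::AppendAllText (same paths as above assumed) could look like this:
$path = 'C:\Users\XX\Desktop\Powershell\Zip'
$ComputerName = (Get-WmiObject Win32_ComputerSystem).Name
foreach ($file in Get-ChildItem $path -Recurse -Exclude '*.info') {
    $OutputFilePath = "C:\Users\XX\Desktop\info\" + $file.Name + ".txt"
    # WriteAllText creates (or overwrites) the file, AppendAllText adds to it
    [System.IO.File]::WriteAllText($OutputFilePath, $file.FullName + [Environment]::NewLine)
    [System.IO.File]::AppendAllText($OutputFilePath, $file.LastWriteTime.ToString() + [Environment]::NewLine)
    [System.IO.File]::AppendAllText($OutputFilePath, $ComputerName + [Environment]::NewLine)
}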

Related

Powershell: Moving named files to their corresponding folder

I have a folder which has a bunch of files named WBF123456, WBF135464, etc. These files need to be moved to the corresponding folder. At the moment I am using the command line to manually enter the numbers of each file so they get moved, using this code:
$files = $args[0]
mv O:\SCAN\SecSur\*$files.pdf O:\SPG\G*\*\*$files
How can I automate this process?
It needs to identify the number in the filename, then move it to the folder containing the same number.
Any help would be great. Thanks.
I need to get the files on the left, inside the corresponding folders on the right.
Maybe the solution below will help you. You should change $origin_path and $destination_path to match your folders.
$origin_path= "C:\Users\geralexgr\Desktop\kati\files"
$destination_path = "C:\Users\geralexgr\Desktop\kati\folders"
Get-ChildItem $origin_path -Recurse -Include *.txt | ForEach-Object {
    $folder = [regex]::Matches($_.Name, "\d+(?!.*\d+)").Value
    Move-Item $_.FullName $destination_path\$folder
}
The example will place files under the folders that match the numeric part of their name. After the PowerShell execution, the file WBF12 ends up inside the folder named 12.
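As a quick illustration of that regex, the lookahead in \d+(?!.*\d+) keeps only the last run of digits in the name; assuming a hypothetical file called WBF123456.pdf:
[regex]::Matches('WBF123456.pdf', '\d+(?!.*\d+)').Value   # returns 123456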
Apparently the files to move are .pdf files, so what you can do is get a list of those files in the source folder and then loop over that list to create (if needed) the destination subfolder and move the file there.
Try:
$destinationRoot = 'O:\SPG\G\SomeWhere' # enter the root folder destination path here
$filesToMove = Get-ChildItem -Path 'O:\SCAN\SecSur' -Filter '*.pdf' -File
foreach ($file in $filesToMove) {
    $numName = $file.BaseName -replace '\D+' # leaving only the numbers
    # create the target path for the file
    $targetFolder = Join-Path -Path $destinationRoot -ChildPath $numName
    # create that subfolder if it does not already exist
    $null = New-Item -Path $targetFolder -ItemType Directory -Force
    # now, move the file
    $file | Move-Item -Destination $targetFolder
}
Seeing your screenshots, this might be a better approach for you.
$destinationRoot = 'O:\SPG\G\SomeWhere' # enter the root folder destination path here
# get a list of target folders for the files to be moved to and create a lookup Hashtable from their names
$targets = @{}
Get-ChildItem -Path $destinationRoot -Directory | Where-Object { $_.Name -match '(\d+)' } | ForEach-Object {
    $targets[$matches[1]] = $_.FullName # key is the number, value is the directory fullname
}
# get a list of files to move
$filesToMove = Get-ChildItem -Path 'O:\SCAN\SecSur' -Filter '*.pdf' -File | Where-Object { $_.Name -match '\d+' }
foreach ($file in $filesToMove) {
    $numName = $file.BaseName -replace '\D+' # leaving only the numbers
    # see if we have a target folder with that same number in its name
    if ($targets.ContainsKey($numName)) {
        $targetFolder = $targets[$numName]
        Write-Host "Moving file $($file.Name) to $targetFolder"
        $file | Move-Item -Destination $targetFolder
    }
    else {
        Write-Warning "Could not find a destination folder for file $($file.Name).."
    }
}
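If you want to preview the result before anything actually moves, Move-Item supports the standard -WhatIf switch, so an optional dry run (not part of the original answer) could temporarily change the move line to:
# reports what would be moved without touching any files
$file | Move-Item -Destination $targetFolder -WhatIf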

Powershell Script that references a text doc and moves everything NOT in that doc

I created a PowerShell script that copies the name of every file in a given folder (just filename.txt, not C:\path\filename.txt) and places them all in a text doc.
I'm now trying to write a powershell script that ONLY copies items to a different directory that are not contained in that text doc.
Any ideas? Thank you all.
The following code will do exactly what you're after in a more PowerShell-like way. It's better to use an array to hold the names to be ignored. If you need to use the .txt, just use Get-Content to load the .txt into the $ignoreFiles variable.
#-file gets files only (not folders)
$ignoreFiles = (Get-ChildItem -Path "C:\Users\YOURS" -file).name
$checkFolder = Get-ChildItem -Path "C:\Users\YOURS" -file
$targetFolder = "C:\Users\YOURS"
 
foreach ($t in $checkFolder){
    if($t.name -notin $ignoreFiles){
        Copy-Item $t.fullname $targetFolder
    }
}
#-file gets files only (not folders)
$listPath = "\\server\C$\exclusionpath\list.txt"
$downloadPath = "\\adifferentserver\download\"
$targetFolder = "\\adifferentserver\Processing\"
$ignoreFiles = Get-Content -Path $listPath
$checkFolder = Get-ChildItem -Path $downloadPath -File -Exclude $ignoreFiles -Name
foreach ($t in $checkFolder){
    #Copy-Item $t.fullname $targetFolder
    Copy-Item (Get-ChildItem -Path $downloadPath -File | Where-Object {$_.Name -eq $t}).FullName $targetFolder
}
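Because -Name makes $checkFolder a list of plain file names, the second Get-ChildItem lookup could arguably be avoided by building the full path directly; a minimal sketch assuming the same variables:
foreach ($t in $checkFolder){
    # join the download folder and the file name instead of querying the folder again
    Copy-Item -Path (Join-Path -Path $downloadPath -ChildPath $t) -Destination $targetFolder
}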

Merge CSV files in subfolders

I have found a script which does everything that I need, but it's only useful if you run it in a single folder. What I'd like is:
The script is located in C:/temp/. Upon running the script, it should go into each subfolder and execute, so that each subfolder ends up with its own Final.csv.
Somebody mentioned just adding -Recurse, but that doesn't complete the job as described. With -Recurse added, it goes into each subfolder but creates a single Final.csv file in the root dir (C:/temp/) instead of creating a Final.csv in each subfolder.
$getFirstLine = $true
Get-ChildItem *.csv | ForEach-Object {
    $filePath = $_
    $lines = Get-Content $filePath
    $linesToWrite = switch($getFirstLine) {
        $true {$lines}
        $false {$lines | Select -Skip 2}
    }
    $getFirstLine = $false
    Add-Content Final.csv $linesToWrite
}
If you are certain the csv files combined this way will leave you a valid 'Final.csv', you need to use Group-Object in order to create a combined file in each of the directories where the csv files to combine are found.
Suppose you have a folder with subfolders 'Folder1' and 'Folder2', both having csv files in them like these:
first.csv
Lorem,Ipsum,Dolor,Sic,Amet
data1-1,data1-2,data1-3,data1-4,data1-5
data2-1,data2-2,data2-3,data2-4,data2-5
second.csv
Lorem,Ipsum,Dolor,Sic,Amet
something,blah,whatever,very important,here's more..
Then this should do it for you:
$targetFileName = 'Final.csv'
# loop over the CSV files, but exclude the Final.csv file
# Group the files by their DirectoryNames
Get-ChildItem -Path 'D:\Test' -Filter '*.csv' -File -Recurse -Exclude $targetFileName | Group-Object DirectoryName | ForEach-Object {
    # reset the $getFirstLine variable for each group
    $getFirstLine = $true
    # create the target path for the combined csv inside this folder.
    # ($_.Name is the name of the group, which is the Directory name of the files inside the group)
    $target = Join-Path -Path $_.Name -ChildPath $targetFileName
    foreach ($file in $_.Group) {
        if ($getFirstLine) {
            # copy the first CSV as a whole
            Get-Content -Path $file.FullName | Set-Content -Path $target
            $getFirstLine = $false
        }
        else {
            # add the content of the next file(s) without the header line
            Get-Content -Path $file.FullName | Select-Object -Skip 1 | Add-Content -Path $target
        }
    }
}
The end result is that each subfolder will have a new 'Final.csv' file containing
Lorem,Ipsum,Dolor,Sic,Amet
data1-1,data1-2,data1-3,data1-4,data1-5
data2-1,data2-2,data2-3,data2-4,data2-5
something,blah,whatever,very important,here's more..
Of course, I'm just showing an example for one of the subfolders. Other subfolders will contain different 'Final.csv' content.
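If the files in every folder share identical headers, a shorter alternative (not the original answer's approach; it re-parses the data instead of copying raw lines) would be to combine them per group with Import-Csv and Export-Csv:
Get-ChildItem -Path 'D:\Test' -Filter '*.csv' -File -Recurse -Exclude 'Final.csv' |
    Group-Object DirectoryName | ForEach-Object {
        # the group name is the directory path; build the output file path once
        $target = Join-Path -Path $_.Name -ChildPath 'Final.csv'
        # Import-Csv parses every file, Export-Csv writes one combined file per folder
        $_.Group | Import-Csv | Export-Csv -Path $target -NoTypeInformation
    }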

How to copy files using a txt list to define beginning of file names

Hello awesome community :)
I have a list containing a bunch of SKU's. All the filenames of the files that I need to copy to a new location start with the corresponding SKU, like so:
B6BC004-022_10_300_f.jpg
In this case "B6BC004" is the SKU and my txt list contains "B6BC004" along with many other SKU's.
Somewhere in the code below I know I have to define that it should search for files beginning with the SKU's from the txt file but I have no idea how to define it.
Get-Content .\photostocopy.txt | Foreach-Object { copy-item -Path $_ -Destination "Z:\Photosdestination\"}
Thanks in advance :)
If all files start with one of the SKU's, followed by a dash like in your example, this should work:
$sourceFolder = 'ENTER THE PATH WHERE THE FILES TO COPY ARE'
$destination = 'Z:\Photosdestination'
# get an array of all SKU's
$sku = Get-Content .\photostocopy.txt | Select-Object -Unique
# loop through the list of files in the source folder and copy all that have a name beginning with one of the SKU's
Get-ChildItem -Path $sourceFolder -File -Recurse |
    Where-Object { $sku -contains ($_.Name -split '\s*-')[0] } |
    ForEach-Object { $_ | Copy-Item -Destination $destination }
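To make the Where-Object filter above a little clearer: -split '\s*-' cuts the name at the first dash (allowing optional whitespace before it) and [0] keeps the leading SKU, e.g. with the filename from the question:
('B6BC004-022_10_300_f.jpg' -split '\s*-')[0]   # returns B6BC004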
I haven't tested this so please proceed with caution!
What it does is loop through all the items in your photostocopy.txt file and search the $source location for file(s) with a name like the current item. It then checks whether any were found before writing a message to the console and, if so, copying the file(s).
$source = '#PATH_TO_SOURCE'
$destination = '#PATH_TO_DESTINATION'
$photosToCopy = Get-Content -Path '#PATH_TO_TXT_FILE'
$photosToCopy | ForEach-Object{
    # capture the current SKU; inside Where-Object, $_ refers to the file being tested
    $currentSku = $_
    $filesToCopy = Get-ChildItem -Path $source -File | Where-Object {$_.Name -like "$currentSku*"}
    if ($filesToCopy.Count -le 0){
        Write-Host "No files could be found for: " $currentSku
    }else{
        $filesToCopy | ForEach-Object{
            Write-Host "Copying: " $_.Name
            Copy-Item -Path $_.FullName -Destination $destination
        }
    }
}
Let me know if this helps you :)

foreach copy-item not working

My Copy-Item doesn't work when included in a foreach loop.
Much like Powershell: Copy-Item not working when in ForEach loop
Only my destination folder is not set to the same as the originating folder, which seemed to be the problem there.
This is the very basic function. My objective is to grab the latest log files from a directory containing log files for lots of stuff. I'm only interested in a few defined in $servers. A line in Servers.txt looks like this: \\clientname\d$\logdirectory\processlog\
When I Set-Location to a path in servers.txt and run Get-ChildItem it works as expected.
I also need to generate a new folder for each object in \Logs\ but one thing at a time.
$servers = @()
$servers = Get-Content c:\Test\Servers.txt
$destServer = @()
$destServer = ( "clientname")
$destinationFolder = "\\" + $destServer + "\d$\Logs\"
foreach ($serverpath in $servers) {
    Write-Host " Copying from $serverpath "
    Set-Location -literalpath $serverpath |
        Get-ChildItem |
        Sort-Object -Descending LastWriteTime |
        Select -First 2 |
        Copy-Item -Destination $destinationFolder -Recurse -Force
}
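For what it's worth, Set-Location writes nothing to the pipeline unless -PassThru is used, so the chain above only works (if at all) through the side effect of the changed location. A sketch that passes the path straight to Get-ChildItem, which may be closer to the intent, would be:
foreach ($serverpath in $servers) {
    Write-Host " Copying from $serverpath "
    # list the remote log folder directly instead of piping Set-Location
    Get-ChildItem -LiteralPath $serverpath |
        Sort-Object -Descending LastWriteTime |
        Select-Object -First 2 |
        Copy-Item -Destination $destinationFolder -Recurse -Force
}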