PowerShell - Copying files in random folders to other folders - powershell

I have the following text file (tab-delimited) that maps a specific file to a folder. I start by importing this CSV:
SourcePathFile DestinationPath
C:\Test\Source\SourceDir 1\pic1.jpg C:\Test\Destination\Folder, 1
C:\Test\Source\SourceDir 1\Pic 2.jpg C:\Test\Destination\Folder 2
By using:
Import-csv -Delimiter `t "C:\Test\FileMapping.csv"
This gives me the array I want, so I figured it would be a simple ForEach-Object to go through each line using
Copy-Item SourcePathFile DestinationPath
I'm clearly missing some general concepts.

Assuming that your input file is valid TSV (is the comma at the end of line 2 intended?), you can do this using the pipeline:
Import-Csv -Path 'C:\Test\FileMapping.csv' -Delimiter "`t" |
ForEach-Object {Copy-Item -Path $_.SourcePathFile -Destination $_.DestinationPath}
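One caveat: if a destination folder does not exist, Copy-Item creates a file with the folder's name instead of copying into a folder. A minimal sketch of the same pipeline that creates each missing folder first (assuming the same SourcePathFile/DestinationPath headers as above):

```powershell
# Sketch: same pipeline, but create any missing destination folder first.
# Assumes the SourcePathFile/DestinationPath headers shown above.
Import-Csv -Path 'C:\Test\FileMapping.csv' -Delimiter "`t" | ForEach-Object {
    if (-not (Test-Path -LiteralPath $_.DestinationPath)) {
        New-Item -ItemType Directory -Path $_.DestinationPath | Out-Null
    }
    Copy-Item -LiteralPath $_.SourcePathFile -Destination $_.DestinationPath
}
```

Using -LiteralPath also avoids surprises with names that contain wildcard characters.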

Related

PowerShell: Export Data from Specific Row Number from multiple files and export to one csv

Beginner user...
I have 30+ .dat files in one folder and need to export row 10 from each file and compile into one csv file.
I know I am on the right lines but am not sure of the middle section - this is where I'm at...
Get-ChildItem -Path C:\Users\pitters\Folder\* -Include *.dat -Recurse
ForEach-Object {
Select-Object -Skip 9 -First 1 }
Export-CSV -Path Users\pitters\Folder\output.csv
Am I missing Get-Content and can anyone help with what needs correcting?
Thanks in advance.
Matt
As you mentioned yourself, you need to invoke Get-Content to actually read the file contents. In addition, you also need to construct an object with appropriate properties corresponding to the columns you want in your CSV file - something we can do with Select-Object directly:
Get-ChildItem -Path C:\Users\pitters\Folder\* -Include *.dat -Recurse | ForEach-Object {
$file = $_
$file | Get-Content | Select-Object @{Name='Line10';Expression={$_}},@{Name='File';Expression={$file.FullName}} -Skip 9 -First 1
} | Export-Csv -Path C:\Users\pitters\Folder\output.csv

Powershell Import CSV

I am just starting to learn PowerShell and have run into a hurdle trying to use gci and Import-Csv. My goal is to run a script from a directory whose numerous subfolders each contain a specific CSV file that I would like to import and consolidate. These subfolders have further subfolders holding other file types, including CSVs that I have no use for; I am only interested in the specific path below. The CSV files have a specific header called location that I care about and have to parse out into a string.
Folder directory example
This is my code so far:
$files = gci .\ -Recurse
foreach($file in $files) {
$foldername = $file.Name
$csv = Import-Csv -Path ".\$foldername\$foldername.csv"
foreach ($line in $csv) {
$outputlines = $line.location -split '/'
Export-csv -Path .\Test.csv -NoTypeInformation -Append
}
}
This is the message I get when I run it:
cmdlet Export-Csv at command pipeline position 1
Supply values for the following parameters:
InputObject:
Can someone please guide me in the right direction on what I'm doing wrong?
As others have commented, strictly speaking, Export-Csv needs something to export. However, that's not the only issue. Import-Csv returns objects based on the data in the CSV file, so -split is likely to produce strange results: it's designed to run against strings and/or arrays of strings.
Setting aside the -split, which is unlikely to work, you can address the consolidation more simply by feeding the results of many Import-Csv commands into a single Export-Csv command:
$OutputCsv = 'c:\temp\Output.csv'
Get-ChildItem .\ -Directory |
ForEach-Object{
$FolderName = $_.Name
".\$FolderName\$FolderName.csv"
} |
Import-Csv |
Export-Csv -Path $OutputCsv -NoTypeInformation -Append
The loop will output a series of strings that are piped to Import-Csv. So, all the files will get imported and the resulting objects will be streamed to Export-Csv consolidating everything into $Outputcsv which is c:\temp\Output.csv.
Note the use of the -Directory parameter: since you are only leveraging the folder names, it should prevent a few errors, particularly for file names that may not exist.
If you want to clarify the question with an example of the CSV contents and the desired output we can take this further.
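If the intent of the original -split was to break the location column into separate columns, that can be folded into the same pipeline with calculated properties. A sketch under assumptions: the 'Part1'/'Part2' names and the '/'-separated layout are placeholders, not taken from the question.

```powershell
# Sketch: split the 'location' column into separate columns while consolidating.
# 'Part1'/'Part2' are placeholder names; adjust them to the real layout.
$OutputCsv = 'c:\temp\Output.csv'
Get-ChildItem .\ -Directory |
ForEach-Object { ".\$($_.Name)\$($_.Name).csv" } |
Import-Csv |
Select-Object @{Name='Part1';Expression={($_.location -split '/')[0]}},
              @{Name='Part2';Expression={($_.location -split '/')[1]}} |
Export-Csv -Path $OutputCsv -NoTypeInformation -Append
```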

Copying files to specific folder declared in a CSV file using Powershell Script

I am quite new to PowerShell and I am trying to make a script that copies files to certain folders declared in a CSV file. But until now I've been getting errors from everywhere and can't find anything to resolve this issue.
I have these folders and .txt files created in the same folder as the script.
Until now I could only do this:
$files = Import-Csv .\files.csv
$files
foreach ($file in $files) {
$name = $file.name
$final = $file.destination
Copy-Item $name -Destination $final
}
This is my CSV
name;destination
file1.txt;folderX
file2.txt;folderY
file3.txt;folderZ
As the comments indicate, if you are not using default system delimiters, you should make sure to specify them.
I also generally recommend quoting the values in your CSV, to avoid problems when an entry accidentally includes the delimiter in its name.
@"
"taco1.txt";"C:\temp\taco2;.txt"
"@ | ConvertFrom-Csv -Delimiter ';' -Header @('file','destination')
will output
file destination
---- -----------
taco1.txt C:\temp\taco2;.txt
The quotes make sure the values are correctly interpreted. And yes... you can name a file foobar;test..txt. Never underestimate what users might do. 😁
If you take the command Get-ChildItem | Select-Object BaseName,Directory | ConvertTo-CSV -NoTypeInformation and review the output, you should see it quoted like this.
Sourcing Your File List
One last tip. Most of the time I've come across a CSV used as a file input list, a CSV hasn't actually been needed. Consider grabbing the files in your script itself.
For example, if you have a folder and need to filter the list down, you can do this on the fly very easily in PowerShell by using Get-ChildItem.
For example:
$Directory = 'C:\temp'
$Destination = $ENV:TEMP
Get-ChildItem -Path $Directory -Filter *.txt -Recurse | Copy-Item -Destination $Destination
If you need to have more granular matching control, consider using the Where-Object cmdlet and doing something like this:
Get-ChildItem -Path $Directory -Filter *.txt -Recurse | Where-Object Name -match '(taco)|(burrito)' | Copy-Item -Destination $Destination
Often you'll find that you can easily use this type of filtering to keep CSV and input files out of the solution.
Using techniques like this, you might be able to get files from 2 directories, filter the match, and copy all in a short statement like this:
Get-ChildItem -Path 'C:\temp' -Filter '*.xlsx' -Recurse | Where-Object Name -match 'taco' | Copy-Item -Destination $ENV:TEMP -Verbose
Hope that gives you some other ideas! Welcome to Stack Overflow. 👋

PowerShell copy and rename multiple .csv files from 10+ subfolders

I'm searching for a way to copy multiple .csv files, all named exactly the same, located in different folders (all of them in the same directory), and merge them into one .csv file. I would like to skip copying the first line (the header) of every file except the first, and there is no rule for how many lines are written in each .csv file, so the script should detect the written lines to know how many, and which ones, to merge (to avoid blank lines).
This is what I tried so far:
$src = "C:\Users\E\Desktop\Merge\Input\Files*.csv"
$dst = "C:\Users\E\Desktop\Merge\Output"
Get-ChildItem -Path $src -Recurse -File | Copy-Item -Destination $dst
and this one:
Get-ChildItem -Path $src -Recurse -File | Copy-Item -Destination $dst |
ForEach-Object {
$NewName = $_.Name
$Destination = Join-Path -Path $_.Directory.FullName -ChildPath $NewName
Move-Item -Path $_.FullName -Destination $Destination -Force
}
any help please? :)
Since you are looking to merge the files you may as well read them all into PowerShell, and then output the whole thing at once. You could do something like:
$Data = Get-ChildItem -Path $src -Recurse -File | Import-Csv
$Data | Export-Csv $dst\Output.csv -NoTypeInformation
That may not be feasible if your CSV files are extremely large, but it is a simple way to merge CSV files if the header row is the same in all files.
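For large files, one option is to drop the intermediate $Data variable and let objects stream straight from Import-Csv into Export-Csv, so only one row at a time is held in memory (a sketch under the same $src/$dst assumptions as above):

```powershell
# Sketch: stream objects from Import-Csv straight into Export-Csv
# instead of buffering everything in $Data first.
Get-ChildItem -Path $src -Recurse -File |
Import-Csv |
Export-Csv -Path (Join-Path $dst 'Output.csv') -NoTypeInformation
```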
Another method would be to just treat it as text, which is much less memory intensive. For that you would want to get a list of files, copy the first one intact, and then copy the rest of them skipping the header row.
$Files = Get-ChildItem $src -Recurse
$TargetFile = Join-Path $dst $Files[0].Name
$Files[0] | Copy-Item -Dest $TargetFile
#Skip the first file, and loop through the rest
$Files | Select -Skip 1 | ForEach-Object{
#Get the contents of the file, and skip the header row, then append the rest to the target
Get-Content $_ | Select -Skip 1 | Add-Content $TargetFile
}
Edit: Ok, I wanted to replicate the process so that I could figure out what was giving you errors. To do that I created 3 folders, and copied a .csv file with 4 entries into each folder, with all of the files named 'Files 06202018.csv'. I ran my code above, and it did what it should, but there was some file corruption where the second file would be appended directly to the end of the first file without a new line being created for it, so I changed things from just copying the first file, to reading it and creating a new file in the destination. The below code worked flawlessly for me:
$src = "C:\Temp\Test\Files*.csv"
$dst = "C:\Temp\Test\Output"
$Files = Get-ChildItem $src -Recurse
$TargetFile = Join-Path $dst $Files[0].Name
GC $Files[0] | Set-Content $TargetFile
#Skip the first file, and loop through the rest
$Files | Select -Skip 1 | ForEach-Object{
#Get the contents of the file, and skip the header row, then append the rest to the target
Get-Content $_ | Select -Skip 1 | Add-Content $TargetFile
}
That took the files:
C:\Temp\Test\Lapis\Files 06202018.csv
C:\Temp\Test\Malachite\Files 06202018.csv
C:\Temp\Test\Opal\Files 06202018.csv
And it combined those three files into a correctly merged file at:
C:\Temp\Test\Output\Files 06202018.csv
The only time that I had any issues is if I forgot to delete the target file before running this. Depending on how large these files are, and how much memory you have available, you could probably speed this up by changing the last two lines to this:
Get-Content $_ | Select -Skip 1
} | Add-Content $TargetFile
That would read all of the files in (other than the first one) and only write to the destination once, instead of having to get file lock, open the file for writing, write, and close the destination for each file.

Delete first n lines in file (.zip file with many "strange" symbols)

This PowerShell code deletes the first 4 lines of a file:
(gc "old.zip" | select -Skip 4) | sc "new.zip"
But old.zip has Unix (LF) line endings,
and this code also converts the file's line endings to Windows (CR LF).
How can I delete the first 4 lines without converting them?
Because of the presence of many "strange" symbols in a .zip, other ways to remove the first n lines of a .zip file do not work. For example, more +4 "old.zip" > "new.zip" in cmd does not work, etc.
Through PowerShell something like this can be removed, but not without problems.
Do you know other ways to remove the first n lines of a .zip file?
Something like this?
$PathZipFile = "c:\temp\File_1.zip"
$NewPathZipFile = "c:\temp\NewFile_1.zip"
# Create a temporary directory name
$TemporaryDir = [System.IO.Path]::Combine("c:\temp", [System.IO.Path]::GetRandomFileName())
# Extract the archive into the temporary directory
Expand-Archive $PathZipFile -DestinationPath $TemporaryDir
# Loop over the files, remove the first 4 lines of each, and compress them into the new archive
Get-ChildItem $TemporaryDir -File | ForEach-Object {
(Get-Content $_.FullName | Select-Object -Skip 4) | Set-Content -Path $_.FullName
Compress-Archive $_.FullName $NewPathZipFile -Update
}
Remove-Item $TemporaryDir -Recurse -Force
PowerShell:
(gc "old.txt" | select -Skip 4 | Out-String) -replace "`r`n", "`n" | Out-File "new.txt"
C#:
File.WriteAllText("new.txt", string.Join("\n", File.ReadLines("old.txt").Skip(4)));
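Another option is to sidestep text decoding entirely and work on raw bytes, so neither the line endings nor any "strange" symbols get touched. A sketch that drops everything up to and including the 4th LF (0x0A) byte:

```powershell
# Sketch: skip the first 4 LF-terminated lines at the byte level,
# so no encoding or newline conversion can occur.
$bytes = [System.IO.File]::ReadAllBytes('old.zip')
$i = 0
$lf = 0
while ($i -lt $bytes.Length -and $lf -lt 4) {
    if ($bytes[$i] -eq 0x0A) { $lf++ }
    $i++
}
if ($i -lt $bytes.Length) {
    [System.IO.File]::WriteAllBytes('new.zip', $bytes[$i..($bytes.Length - 1)])
}
```

This keeps the remainder byte-for-byte identical, which matters for binary data like a .zip.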