I have an array of custom PS objects that I want to upload to SharePoint Online as a .csv file, without saving the array to storage first.
I am using the following code:
$memStream = [System.IO.MemoryStream]([System.Text.Encoding]::UTF8.GetBytes(($devices | ConvertTo-Csv -NoTypeInformation)))
$null = Add-PnPFile -FileName $fileName -Folder $folderPath -Stream $memStream -ErrorAction Stop
When I open the file in SharePoint, or when I download it to my PC and open it in Excel, I see only the header row. When I do a regular Export-Csv, I see all of the expected rows:
$devices | Export-Csv -Path <path> -NoTypeInformation
How do I get the entire array (as a csv) into the memory stream?
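A likely culprit, hedged since I can't test it here: ConvertTo-Csv returns an array of strings, one per row, and when PowerShell coerces that array into the single string GetBytes expects, the rows get joined with spaces rather than newlines, so the whole CSV lands on one line. Joining the rows explicitly before encoding should produce a proper multi-line file (same variable names as above):
# Join the CSV rows with explicit newlines before encoding; the implicit
# array-to-string coercion would otherwise collapse them onto one line.
$csvText   = ($devices | ConvertTo-Csv -NoTypeInformation) -join [Environment]::NewLine
$memStream = [System.IO.MemoryStream]([System.Text.Encoding]::UTF8.GetBytes($csvText))
$null = Add-PnPFile -FileName $fileName -Folder $folderPath -Stream $memStream -ErrorAction Stop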
I am just starting to learn PowerShell and have run into a hurdle while trying to use gci and Import-Csv. My goal is to run a script against a folder directory that has numerous subfolders, each containing a specific csv file that I would like to import and consolidate. These subfolders have additional subfolders with other file types, including csv files that I have no use for; I am only interested in the specific path below. The csv files have a specific header called location that I care about and have to parse out into a string.
Folder directory example
This is my code so far:
$files = gci .\ -Recurse
foreach ($file in $files) {
    $foldername = $file.Name
    $csv = Import-Csv -Path ".\$foldername\$foldername.csv"
    foreach ($line in $csv) {
        $outputlines = $line.location -split '/'
        Export-Csv -Path .\Test.csv -NoTypeInformation -Append
    }
}
This is the message I get when I run it:
cmdlet Export-Csv at command pipeline position 1
Supply values for the following parameters:
InputObject:
Can someone please guide me in the right direction on what I'm doing wrong?
As others have commented, and strictly speaking, Export-Csv needs something to export. However, that's not the only issue. Import-Csv returns objects based on the data in the csv file, so -split is likely to produce strange results: it's designed to run against strings and/or arrays of strings.
Setting aside the -split, which is unlikely to work as written, you can address the consolidation more simply by feeding the results of many Import-Csv commands into a single Export-Csv command:
$OutputCsv = 'c:\temp\Output.csv'

Get-ChildItem .\ -Directory |
    ForEach-Object {
        $FolderName = $_.Name
        ".\$FolderName\$FolderName.csv"
    } |
    Import-Csv |
    Export-Csv -Path $OutputCsv -NoTypeInformation -Append
The loop outputs a series of strings that are piped to Import-Csv. So all the files get imported, and the resulting objects are streamed to Export-Csv, consolidating everything into $OutputCsv, which is c:\temp\Output.csv.
Note the use of the -Directory parameter: since you are only leveraging the folder names, restricting the listing to directories should prevent a few errors, particularly for csv files that may not exist.
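If you also need the -split pieces of the location column, here is a hypothetical sketch; the Part1/Part2 property names are invented for illustration, since the question doesn't say what the segments mean:
Get-ChildItem .\ -Directory | ForEach-Object {
    Import-Csv ".\$($_.Name)\$($_.Name).csv"
} | ForEach-Object {
    # Split the location column and surface the pieces as properties.
    $parts = $_.location -split '/'
    [pscustomobject]@{
        location = $_.location
        Part1    = $parts[0]   # first '/'-delimited segment (hypothetical name)
        Part2    = $parts[1]   # second segment; $null when absent
    }
} | Export-Csv -Path .\Test.csv -NoTypeInformation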
If you want to clarify the question with an example of the CSV contents and the desired output, we can take this further.
I open a PowerShell prompt in a folder and then use:
dir -Name -Recurse > asd.xls
How can I modify this so it doesn't include folders in the filenames?
Instead of using -name, try using
(Get-ChildItem -Recurse).Name > asd.xls
and be aware that you won’t get a valid Excel workbook that way. You can get a valid CSV that can be loaded into Excel with
(Get-ChildItem -Recurse) | Select-Object -Property Name | Export-CSV -Path asd.csv -NoTypeInformation
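If the goal is instead to leave out the directory entries themselves, not just their path prefixes, the -File switch restricts the listing to files (available since PowerShell 3.0):
Get-ChildItem -Recurse -File | Select-Object -Property Name | Export-CSV -Path asd.csv -NoTypeInformation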
I have a folder containing 30+ folders, each of which has a .txt file that I can find using:
Get-ChildItem -Filter *.txt -Recurse
I want to read the contents of each .txt file discovered and output the contents into a new .csv file on my desktop that also includes the directory of each .txt file whose contents are displayed.
The question is twofold:
how to use the pipeline and PowerShell commands to read/show all the words in the files, and
how to create the csv data that will output both the directory name and the contents of the .txt files.
I can already pipe results to:
Export-Csv c:\desktop\test.csv -Encoding ascii -NoTypeInformation
Any help greatly appreciated.
The following script reads all .txt files within a specific directory, stores the full filename and path as well as each file's content into an array, and saves it as a csv.
$csvOut = @()

Get-ChildItem -LiteralPath C:\temp -Filter *.txt -File -Recurse | foreach {
    $fileData = @{
        "File"    = $_.FullName
        "Content" = (Get-Content -LiteralPath $_.FullName -Raw)
    }
    $csvOut += (New-Object psobject -Property $fileData)
}

$csvOut | Export-Csv -LiteralPath "C:\temp\csvout.csv" -NoTypeInformation
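A streaming variant of the same idea (same paths and property names as above) avoids rebuilding the $csvOut array on every +=, which matters once there are many files:
Get-ChildItem -LiteralPath C:\temp -Filter *.txt -File -Recurse | ForEach-Object {
    # Emit one object per file straight into Export-Csv; no accumulator needed.
    [pscustomobject]@{
        File    = $_.FullName
        Content = Get-Content -LiteralPath $_.FullName -Raw
    }
} | Export-Csv -LiteralPath "C:\temp\csvout.csv" -NoTypeInformation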
My old BAT file:
Copy F:\File.hdr+F:*.csv F:\FinalOutput.csv
The HDR file is a single-entry file that contains only the header row for the CSV files.
Is there a way to perform this in PowerShell (to combine all the CSV files into a single file)?
Here is my PowerShell script, which doesn't work:
$CSVFolder  = 'F:\Input\'
$OutputFile = 'F:\Output\NewOutput.csv'
$CSV = @()

Get-ChildItem -Path $CSVFolder -Filter *.inv | ForEach-Object {
    $CSV += @(Import-Csv -Path $CSVFolder\$_)
}

$CSVHeader = Import-Csv 'F:\Input\Headings.hdr'
$CSV = $CSVHeader + $CSV
$CSV | Export-Csv -Path $OutputFile -NoTypeInformation -Force
I get a list of the file names in the output rather than the contents of the files. The script is also modifying the date/time stamp on my INV files, which it shouldn't be doing.
You can skip the whole CSV bit if you just append the files as you would before.
Something like this should work:
# First we create the new file and add the header.
get-content $headerfile | set-content $outputfile
# Then we get the input files, read them out with get-content
# and append them to the output file (add-content).
get-childitem -path $csvfolder *.inv | get-content | add-content $outputfile
The CSV cmdlets are handy if you want to process the CSV data in your script, but in your case simply appending the files will do the trick. Skipping the CSV conversion will also be a lot faster, as PowerShell doesn't have to parse the CSV lines and create PS objects; it's really fast with pure text.
Another trick here is how get-content and add-content are used in the pipeline. Since they are pipeline-aware, you can pass in file objects without having to use a foreach loop, which makes your statements a lot shorter.
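If the .inv files are large (an assumption; the question doesn't say), adding -ReadCount 0 makes get-content hand each file over as a single array of lines instead of one line at a time, which cuts pipeline overhead:
get-childitem -path $csvfolder *.inv | get-content -ReadCount 0 | add-content $outputfile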
How about:
get-childitem *.inv | foreach-object {
    # -Header expects an array of column names, so split the header line.
    import-csv $_ -Header ((get-content Headings.hdr) -split ',')
} | export-csv NewOutput.csv -NoTypeInformation
I have a list of share paths in a text file. I try to read the files and folders in each path and export them to a csv file using a PowerShell script, but some of the csv files come out at 0 KB.
So I tried to test the existence of each network path using the Test-Path command. A few paths show as existing, but when I try to list the directories of an existing path using dir \\sharepath I get an error like "The specified network name is no longer available". Why?
Sharing the code below.
foreach ($dir in (Get-Content $infile)) {
    $outfilecsv = 'jerin-Download' + '.csv'
    Get-ChildItem -Path $dir -Filter *.* -Recurse |
        Select-Object Name,
            @{Name="Owner";Expression={(Get-Acl $_.FullName).Owner}},
            CreationTime,
            @{Name="FileModifiedDate";Expression={$_.LastWriteTime}},
            @{Name="FileAccessedDate";Expression={$_.LastAccessTime}},
            @{Name="Attributes";Expression={$_.Attributes}},
            @{l='ParentPath';e={Split-Path $_.FullName}},
            @{Name="DormantFor(days)";Expression={[int]((Get-Date) - $_.LastWriteTime).TotalDays}},
            @{N="FileCategory";E={Get-FileSizeCategory($_)}},
            @{Name="Size";Expression={if ($_.PSIsContainer -eq $true) {(New-Object -Com Scripting.FileSystemObject).GetFolder($_.FullName).Size} else {$_.Length}}} |
        Export-Csv -Path $outfilecsv -Encoding ascii -NoTypeInformation
}
Can anyone suggest what I am doing wrong?
Thanks,
Jerin
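One possible direction, hedged since the root cause may be on the network side: check each path immediately before listing it and catch per-path failures, so unreachable shares get reported instead of silently producing 0 KB files. A sketch with the property list trimmed for brevity:
foreach ($dir in (Get-Content $infile)) {
    # Re-check reachability right before enumerating; shares can drop
    # between an earlier Test-Path and the actual listing.
    if (-not (Test-Path -LiteralPath $dir)) {
        Write-Warning "Unreachable: $dir"
        continue
    }
    try {
        Get-ChildItem -Path $dir -Recurse -ErrorAction Stop |
            Select-Object Name, FullName, LastWriteTime |
            Export-Csv -Path 'jerin-Download.csv' -Encoding ascii -NoTypeInformation -Append
    }
    catch {
        Write-Warning "Failed on $dir : $($_.Exception.Message)"
    }
}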