I am trying to report on the number of PDF files in a directory. The code below works fine on its own, but when I add Export-Csv to it, the output is not what I expect. The file is created, but the count is wrong: I get "#TYPE System.Int32" in cell A1 of the output file instead of the file count, and I'm not sure why.
(get-ChildItem C:\Test\* -Filter *.pdf -Recurse).Count | Export-Csv C:\TEMP\Test.csv
Export-Csv works best when you have an object (or a hashtable you convert to one) with named properties and values. All you have in this case is a bare number, so it has no idea what the column heading should be. If all you want is a number in a file, try this:
(get-ChildItem C:\Test\* -Filter *.pdf -Recurse).Count | Set-Content C:\TEMP\Test.csv
But if you really want a CSV file, or an example to reuse in other projects, try this (casting the hashtable to [PSCustomObject] gives Export-Csv a property name to use as the column header):
$HashTable = @{NumberOfPDFFiles = (Get-ChildItem C:\Test\* -Filter *.pdf -Recurse).Count}
[PSCustomObject]$HashTable | Export-Csv C:\TEMP\Test.csv -NoTypeInformation
Or something like this, to stick with the one-liner idea:
(get-ChildItem C:\Test\* -Filter *.pdf -Recurse).Count |
Select-Object @{n='PdfCount';e={$_}} |
Export-CSV C:\TEMP\Test.csv -NoTypeInformation
To complement kevmar's answer, since it addresses the issue but doesn't explain why:
From TechNet
By default, the first line of the CSV file contains "#TYPE " followed by the fully-qualified name of the type of the object.
That is why your first line is #TYPE System.Int32, and why -NoTypeInformation removes it. If all you are doing is outputting a count, then Set-Content makes more sense.
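If you ever need to report more than one number, the object approach pays off: each object becomes a row and its property names become the headers. A rough sketch along the same lines, assuming (purely as an example) that you also wanted to count .docx files:
'*.pdf', '*.docx' | ForEach-Object {
    [PSCustomObject]@{
        Filter = $_
        Count  = (Get-ChildItem C:\Test\* -Filter $_ -Recurse).Count
    }
} | Export-Csv C:\TEMP\Test.csv -NoTypeInformation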
I am trying to count the rows containing values in a bunch of CSV files in a folder. I managed to get the code to count them, but I can't seem to find a way to export the results to a CSV. All I get is a blank CSV.
What am I missing here?
$FOLDER_ROOT = "C:\Test\2019"
$OUTPUT_CSV = "C:\Test\2019\Count.csv"
Get-ChildItem $FOLDER_ROOT -re -in "*.csv" | ForEach-Object {
$filestats = Get-Content $_.Fullname | Measure-Object -Line
$linesInFile = $filestats.Lines - 1
Write-Host "$_,$linesInFile"
} | Export-Csv -Path $OUTPUT_CSV -NoType
There are several issues with your code:
Use Get-ChildItem -Filter '*.csv' instead of Get-ChildItem -Include '*.csv'. The former is faster, because the filter is applied by the filesystem provider rather than by PowerShell after the files have been enumerated.
Write-Host sends output directly to the host console, so it never reaches the pipeline, and Export-Csv receives nothing, which is why your CSV is blank. (In PowerShell 5.0 and later, host output goes to the new information stream, but it still isn't passed down the pipeline to Export-Csv.)
Export-Csv expects object input, since it outputs the properties of the objects as the fields of the CSV (taking the column titles from the property names of the first object). Feeding it strings ("$_,$linesInFile") will result in a CSV that contains only a column "Length", since that is the only property of the string objects.
Use a calculated property for creating a CSV with the filename and line count of the input files:
Get-ChildItem $FOLDER_ROOT -Recurse -Filter '*.csv' |
Select-Object Name, @{n='LineCount';e={(Get-Content $_.FullName | Measure-Object -Line).Lines - 1}} |
Export-Csv $OUTPUT_CSV -NoType
Write-Host writes only to the host! Most probably you see the output in the PowerShell console instead?
Use Write-Output, which can be piped to Export-CSV.
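For Export-CSV to produce useful columns, though, the loop needs to emit objects rather than plain strings. A minimal sketch using the same variables as above (my variation, not the original poster's code):
Get-ChildItem $FOLDER_ROOT -Recurse -Filter '*.csv' | ForEach-Object {
    # emit one object per file; the property names become the CSV headers
    [PSCustomObject]@{
        Name      = $_.Name
        LineCount = (Get-Content $_.FullName | Measure-Object -Line).Lines - 1
    }
} | Export-Csv -Path $OUTPUT_CSV -NoTypeInformation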
I am attempting to output the full directory path and LastAccessTime on one line.
Needed --
R:\Directory1\Directory2\Directory3, March 10, 2015
What I am getting --
R:\Directory1\Directory2\Directory3
March 10, 2015
Here is my code. It isn't that complicated, but it is beyond me.
Get-ChildItem -Path "R:\" -Directory | foreach-object -process{$_.FullName, $_.LastAccessTime} | Where{ $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) } | Out-File c:\temp\test.csv
I have used ForEach-Object in the past to ensure I do not truncate excessively long directory names and paths, but I have never used it when pulling two properties. I would like the information to be all on one line, but I haven't been successful. Thanks in advance for the assist.
I recommend filtering (Where-Object) before selecting the properties you want. Also I think you want to replace ForEach-Object with Select-Object, and lastly I think you want Export-Csv rather than Out-File. Example:
Get-ChildItem -Path "R:\" -Directory |
Where-Object { $_.LastAccessTime -lt [DateTime]::Today.AddYears(-2) } |
Select-Object FullName,LastAccessTime |
Export-Csv C:\temp\test.csv -NoTypeInformation
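If you also want the date rendered like "March 10, 2015" rather than the default format, a calculated property can reshape it on the way out; a possible variation (the exact format string is an assumption on my part):
Get-ChildItem -Path "R:\" -Directory |
    Where-Object { $_.LastAccessTime -lt [DateTime]::Today.AddYears(-2) } |
    Select-Object FullName, @{n='LastAccessTime';e={$_.LastAccessTime.ToString('MMMM d, yyyy')}} |
    Export-Csv C:\temp\test.csv -NoTypeInformation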
We can get your output on one line pretty easily, but to make it easy to read we may have to split your script out over multiple lines. I'd recommend saving the script below as a ".ps1" file, which would allow you to right-click it and select "Run with PowerShell" to make it easier in the future. This script could be modified to take more inputs and variables to make it more modular and work in more situations, but for now we'll work with the constants you provided.
$dirs = Get-ChildItem -Path "R:\" -Directory
We'll keep the first line you made, since that is solid and there's nothing to change.
$arr = $dirs | Where-Object { $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) } | Select-Object FullName, LastAccessTime
For the second line, we'll filter with Where-Object first and then use Select-Object to pick out just the FullName and LastAccessTime properties. Filtering before selecting keeps things simple: Where-Object still sees the full DirectoryInfo objects, and Select-Object then trims each one down to the two properties we want as CSV columns. Passing the property names (rather than a script block) to Select-Object keeps them as named columns. I've also expanded "Where" to "Where-Object", since it's best practice to use the full cmdlet name instead of the alias.
Lastly, we'll want to convert our "$arr" array to CSV before writing it out to the file.
$arr | ConvertTo-Csv -NoTypeInformation | Out-File "C:\Temp\test.csv"
Putting it all together, your final script will look like this:
$dirs = Get-ChildItem -Path "C:\git" -Directory
$arr = $dirs | Select-Object {$_.FullName, $_.LastAccessTime} | Where{ $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) }
ConvertTo-CSV $arr | Out-File "C:\Temp\test.csv"
Again, you can take this further by wrapping it in a function with parameters for your path, output file, and all that fun stuff.
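For instance, a rough sketch of that idea (the function name and parameter defaults are just placeholders):
function Export-StaleDirectoryReport {
    param(
        [string]$Path = 'R:\',
        [string]$OutputFile = 'C:\Temp\test.csv'
    )
    $dirs = Get-ChildItem -Path $Path -Directory
    $arr = $dirs |
        Where-Object { $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) } |
        Select-Object FullName, LastAccessTime
    $arr | ConvertTo-Csv -NoTypeInformation | Out-File $OutputFile
}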
Let me know if this helps!
I have a lot of csv files stored in a directory.
In each CSV file I need to add the file name as a column, using PowerShell.
Example
File location: <SERVERNAME>\Export\FILENAME1.CSV
Contents:
1232;Description;a1
1232;Description;a2
The result must be:
1232;Description;a1;FILENAME1.CSV
1232;Description;a2;FILENAME1.CSV
Can someone help me with this?
The following will append a Filename column to each .CSV file in a directory:
Get-ChildItem *.csv | ForEach-Object {
$CSV = Import-CSV -Path $_.FullName -Delimiter ";"
$FileName = $_.Name
$CSV | Select-Object *,@{N='Filename';E={$FileName}} | Export-CSV $_.FullName -NoTypeInformation -Delimiter ";"
}
Explanation:
Uses Get-ChildItem to get all files named *.csv
Iterates through each file with ForEach-Object and uses Import-CSV to load their contents as a PowerShell object
Records the name of the file in $FileName
Uses Select-Object to add a calculated property with the name Filename and the value of the $FileName variable
Uses Export-CSV to write back over the original file. The -NoTypeInformation switch ensures the PowerShell #TYPE header line is not included.
Mark Wragg's PowerShell code works, but I had to change the delimiter to , instead of ; so that it opens in Excel. I'm now trying to figure out how to add the filename as the first column instead of the last, because my files don't all have the same number of fields, so the columns don't line up.
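Listing the calculated property before the * in Select-Object should move the filename to the first column; a minimal tweak of the loop above (swap the delimiter to , if that is what your files need):
Get-ChildItem *.csv | ForEach-Object {
    $CSV = Import-CSV -Path $_.FullName -Delimiter ";"
    $FileName = $_.Name
    # calculated Filename property listed first, then the original columns
    $CSV | Select-Object @{N='Filename';E={$FileName}},* | Export-CSV $_.FullName -NoTypeInformation -Delimiter ";"
}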
I have a script that searches for the latest modified log file. It is then supposed to read that text file, pick up a key phrase, and display the line after it.
So far I have this:
$logfile = get-childitem 'C:\logs' | sort {$_.lastwritetime} | where {$_ -notmatch "X|Zr" }| select -last 1
$error = get-content $logfile | select-string -pattern "Failed to Modify"
An example line it reads is this:
20150721 12:46:26 398fbb92 To CV Failed to Modify
CN=ROLE-x-USERS,OU=Role Groups,OU=Groups,DC=gyp,DC=gypuy,DC=net
MDS_E_BAD_MEMBERSHIP One or more members do not exist in the directory
The key bit of information I'm trying to get here is the line that follows the match, i.e. the CN=ROLE-x-USERS,... line.
Can anyone help?
Thanks
Try this:
$error = get-content $logfile |
Where-Object { $_ -like "*Failed to Modify*" } |
Select-Object -First 1
This is provided you are looking for the first match in the file. The Select-String cmdlet returns a MatchInfo object. Depending on your requirements there might be no reason to add that level of complexity if you're just looking to pull the first occurrence of this error in the file.
Failing this, my recommendation would be to debug this and step through it. Break on the Get-Content call and see what $logfile is. Run Get-Content $logfile and see what that content looks like. Then do your Select-String on that output. See what MatchInfo.ToString() looks like. Maybe you'll see some disconnect.
Again, my recommendation would be to just parse manually through the file and work with the Where-Object cmdlet at this point.
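If what you actually need is the line after the match (as the question describes), Select-String's -Context parameter may be the simpler route; a rough sketch, reusing the $logfile variable from the question:
# keep one line of context after the first match; PostContext holds those lines
$match = Get-Content $logfile | Select-String -Pattern "Failed to Modify" -Context 0,1 | Select-Object -First 1
$nextLine = $match.Context.PostContext[0]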
This should work:
get-childitem 'c:\logs' | where {$_.Name -notmatch "X|Zr" } | sort {$_.lastwritetime} | select -last 1 | select-string "Failed to Modify"
But I don't like the "X|Zr" part. If your log files have a .txt extension, this won't list any of them, because you're saying you don't want any file whose name contains "x" or "zr" anywhere, and the "x" in ".txt" matches. Use $_.BaseName (the name without the extension), or adjust the regular expression.
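In other words, something along these lines, matching against BaseName so the extension is ignored (the same one-liner, just reshuffled):
Get-ChildItem 'C:\logs' |
    Where-Object { $_.BaseName -notmatch "X|Zr" } |
    Sort-Object LastWriteTime |
    Select-Object -Last 1 |
    Select-String -Pattern "Failed to Modify"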
I have 3 .csv files that I am combining into one. This bit of code works:
Get-ChildItem 'C:\Scripts\testing\csvStuffer\temp\Individual.*.csv' |
ForEach-Object {Import-Csv $_} |
Export-Csv -NoTypeInformation 'C:\Scripts\testing\csvStuffer\temp\MergedCsvFiles.csv'
The problem is that each .csv file has a header and a footer.
I do not want to keep the header or footer from any of the files.
Any suggestions on what I need to add to the above code to remove the headers and footers?
Thanks!
This is not the most elegant solution but it worked for my test files.
Get-ChildItem 'C:\Scripts\testing\csvStuffer\temp\Individual.*.csv' |
ForEach-Object {
$filecontent = Get-Content $_.FullName | Select-Object -Skip 1;
$filecontent | Select-Object -First ($filecontent.Length - 1) | Set-Content -Path $_.FullName;
};
Skipping the first line is easy with Select-Object -Skip. Dropping the last line requires a bit more work, but since Get-Content returns an array of lines, you can just grab all but the last element of that array.
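If you're on a newer PowerShell (5.0 or later, I believe), Select-Object also has a -SkipLast parameter, which makes the footer removal simpler. A minimal sketch that strips both lines and merges everything in one pass (assuming, as above, that the merged file doesn't need a header row):
Get-ChildItem 'C:\Scripts\testing\csvStuffer\temp\Individual.*.csv' | ForEach-Object {
    # drop the header (first line) and footer (last line) of each file
    Get-Content $_.FullName | Select-Object -Skip 1 | Select-Object -SkipLast 1
} | Set-Content 'C:\Scripts\testing\csvStuffer\temp\MergedCsvFiles.csv'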
Looks like alroc already gave an answer, but since I already had it written up I figured I'd post this too. It doesn't load everything into a variable; it just reads each file, strips the first and last lines of the current file, and then pipes to Out-File with -Append on it.
gci 'C:\Scripts\testing\csvStuffer\temp\Individual.*.csv' | %{
$(gc $_.FullName | select -Skip 1) | select -First ($(gc $_.FullName | select -Skip 1).Count - 1)
}|Out-File -Append 'C:\Scripts\testing\csvStuffer\temp\MergedCsvFiles.csv'