PowerShell code to export to CSV

I have two PowerShell scripts below. They do different things, but I need them to behave roughly the same when they export to CSV.
Get-ChildItem C:\Users\user\Desktop\here -Recurse |
where {!$_.PSIsContainer} |
Select-Object * |
Export-Csv -NoTypeInformation -Path C:\Users\user\Desktop\Output\output.csv |
% {$_.Replace('"','')}
Gets me all the detailed info about a directory and when I open the CSV in Excel everything is in separate columns - Perfect.
plink.exe -ssh root@user -pw password -m C:\Users\user\Desktop\Scrips\LinuxScriptCommand.txt > C:\Users\user\Desktop\Output\Linux.csv
Runs df -h on my cloud server and returns the space left on my drives, which is what I want, but when I open this CSV in Excel it makes me go through the Text Import Wizard, which the other doesn't.
I can't figure out a way to automate the text wizard part, can anyone provide me some insight? Also if you have a way to optimize it please let me know too.

Change your df -h command to output with comma separated values:
df -h | awk '{print $1","$2","$3","$4","$5","$6" "$7}'
The issue you were having with it saving to a single cell is the UNIX line endings not displaying correctly in Windows. This can be fixed by replacing them:
plink.exe -ssh root@user -pw password -m C:\Users\user\Desktop\Scrips\LinuxScriptCommand.txt | foreach {
if (([regex] "`r`n$").IsMatch($_) -eq $false) { $_ -replace "`n", "`r`n" } else { $_ }
} | Set-Content "C:\Users\user\Desktop\Output\Linux.csv"
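The same normalization can also be done with a single regex replacement once the output is captured as a string; a minimal sketch with a hypothetical LF-only input (the lookbehind adds a CR only where one is missing):

```powershell
# Hypothetical LF-only text, as a Unix command would produce
$unixText = "Filesystem,Size,Used`nudev,3.9G,0`n"
# Insert a CR before every LF that doesn't already have one
$fixed = $unixText -replace "(?<!`r)`n", "`r`n"
```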
If you're using a newer version of PowerShell, you can use the -File parameter to return only files, which makes your other command considerably shorter once you remove the unnecessary parts:
Get-ChildItem C:\Users\user\Desktop\here -Recurse -File | Export-CSV C:\Users\user\Desktop\Output\output.csv -NoTypeInformation

Related

Add eighth and ninth lines to all *.txt files

I have more than 100 .txt files in C:\myfolder\*.txt.
When I run this script from C:\myfolder I can add eighth and ninth lines to somename.txt:
@echo off
powershell "$f=(Get-Content somename.txt);$f[8]='heretext1';$f | set-content somename.txt"
powershell "$f=(Get-Content somename.txt);$f[9]='heretext2';$f | set-content somename.txt"
But how can I add eighth and ninth lines to all *.txt files located in C:\myfolder\*.txt?
Can someone explain how to do it, please...
Sorry for my English, and sorry if I didn't explain my problem well. I will try again now:
I actually use *.uci files instead of *.txt files; I wrote .txt because the .uci extension is unknown to most people. These *.uci files are settings for chess engines that use the UCI protocol.
So when you use the ChessBase program you have a lot of chess engines, and each engine creates its own "enginename.uci" file.
If you want to change the number of cores used on your PC from 1 to 16, you need to do it manually by adding the following information to the *.uci file, like this:
[OPTIONS]
Threads=1
That's why it's better to make a small batch or .ps1 script that changes the settings for all engines by adding these two lines with one click.
Perhaps something like this PowerShell script would suit your task:
Get-ChildItem -Path 'C:\myfolder' -Filter '*.txt' | ForEach-Object {
    $LineIndex = 0
    $FileContent = Switch -File $_.FullName {Default {
        $LineIndex++
        If ($LineIndex -Eq 8) {@'
heretext1
heretext2
'@}
        $_}}
    Set-Content -Path $_.FullName -Value $FileContent}
Note:
Your code isn't adding lines, it is modifying existing lines. The solution below does the same.
Indices [8] and [9] access the 9th and 10th lines, not the 8th and 9th, given that array indexing is 0-based.
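To illustrate the 0-based indexing, here is a quick sketch with a hypothetical three-element array:

```powershell
# Index 0 is the 1st line, index 2 is the 3rd line
$lines = 'first', 'second', 'third'
$lines[0]   # the 1st line
$lines[2]   # the 3rd line
```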
You need to call Get-ChildItem with your file-name pattern, C:\myfolder\*.txt, and process each matching file via ForEach-Object:
@echo off
powershell "Get-ChildItem C:\myfolder\*.txt | ForEach-Object { $f=$_ | Get-Content -ReadCount 0; $f[8]='heretext1'; $f[9]='heretext2'; Set-Content $_.FullName $f }"
Due to calling from a batch file (cmd.exe), the PowerShell command is specified on a single line; here's the readable version:
Get-ChildItem C:\myfolder\*.txt | # get all matching files
ForEach-Object { # process each
$f = $_ | Get-Content -ReadCount 0 # read all lines
$f[8] = 'heretext1'; $f[9] = 'heretext2' # update the 9th and 10th line
Set-Content $_.FullName $f # save result back to input file
}
Note:
Consider adding -noprofile after powershell, so as to suppress potentially unnecessary loading of profile files - see the documentation of the Windows PowerShell CLI, powershell.exe.
Using -ReadCount 0 with Get-Content greatly speeds up processing, because all lines are then read into a single array, instead of streaming the lines one by one, which requires collecting them in an array, which is much slower.
Note: If a given file has fewer than 10 lines, the above solution won't work, because you can only assign to existing elements of an array (an array is a fixed-size data structure). If you need to deal with this case, insert the following after the $f = $_ | Get-Content -ReadCount 0 line; it inserts empty lines as needed to ensure that at least 10 lines are present:
if ($f.Count -lt 10) { $f += @('') * (10 - $f.Count) }
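To see why the padding is needed: PowerShell arrays are fixed-size, so += actually builds a new, larger array. A quick sketch with a hypothetical two-line file's content:

```powershell
$f = 'line1', 'line2'                                  # only 2 lines
if ($f.Count -lt 10) { $f += @('') * (10 - $f.Count) } # pad with empty lines
$f.Count                                               # now 10, so $f[8] and $f[9] are assignable
```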
The easiest solution I can think of is to use the -Index parameter of Select-Object:
Get-ChildItem -Path .\Desktop\*.txt | % { Get-Content $_.FullName | Select-Object -Index 7,8 } |
Out-File -FilePath .\Desktop\index.txt
Edit: based on your post.

File Size with Powershell

What I am trying to do is create a PS script to see when a certain folder has a file over 1GB. If it found a file over 1GB, I want it to write a log file with info saying the name of the certain file and its size.
This works, but not fully: if no file is over 1GB I don't want a log file. (Right now this displays the file info for files over 1GB, but if everything is less than 1GB it still creates a log file with no data.) I don't want it to create a log for anything less than 1GB.
Any idea on how to do that?
Thanks!
Ryan
Get-ChildItem -Path C:\Tomcat6.0.20\logs -File -Recurse -ErrorAction SilentlyContinue |
Where-Object {$_.Length -gt 1GB} |
Sort-Object length -Descending |
Select-Object Name,@{n='GB';e={"{0:N2}" -F ($_.Length / 1GB)}} |
Format-List Name,Directory,GB > C:\Users\jensen\Desktop\FolderSize\filesize.log
First, set a variable with the term/filter you're after and store the results
$items = Get-ChildItem -Path C:\Tomcat6.0.20\logs -File -Recurse -ErrorAction SilentlyContinue |
Where-Object {$_.Length -gt 1GB} |
Sort-Object Length -Descending |
Select-Object Name,@{n='GB';e={"{0:N2}" -F ($_.Length / 1GB)}}
Then pipe that to Out-File to your desired output path. This example will output a file to the Desktop of the user running the script, change as needed:
$items | Out-File -FilePath $ENV:USERPROFILE\Desktop\filesize.log -Force
The -Force parameter will overwrite an existing filesize.log file if one already exists.
To make sure you don't write blank files, you should collect the minimal starting results that match your filter and test whether they contain anything at all.
If they don't, you can end the script; but if they do, you can go on to sort and select the data and output it to a log file.
# Collect Matching Files
$Matched = GCI -Path "C:\Tomcat6.0.20\logs" -File -R -ErrorA "SilentlyContinue" | ? {
$_.Length -gt 1GB
}
# Check if $Matched contains results before further processing; otherwise, we're done!
IF ([bool]$Matched) {
# If here, we have Data so select what we need and output to the log file:
$Matched | Sort Length -D | FT Name,Directory,@{
n="GB";e={"{0:N2}" -F ($_.Length / 1GB)}
} -Auto | Out-File "C:\Users\jensen\Desktop\FolderSize\filesize_$(Get-Date -F "yyyy-MM-dd_HH-mm-ss").log"
}
In the above script, I fixed the $. to be $_., and separated Matching the 1GB files from Manipulating them, and Outputting them to a file.
We simply test whether any files over 1 GB were matched by checking whether the variable has any results or is $NULL/undefined.
If so, there is no need to take any further action.
Only when 1 GB files are matched do we quickly sort them and select the details you wanted; instead of Select-Object, though, we'll just use Format-Table (FT) with -AutoSize to get nice-looking output that is much easier to review for this sort of data.
(Note that Format-Table selects and formats the info into a table in one step, saving the redundant step of using Select-Object to get the data and then piping (|) it to Format-List or Format-Table. Select-Object is best used when later steps need to perform "object-oriented" operations on the data.)
Then we pipe that output to Out-File to save it all to a log file. I also changed the log file name to contain the current date and time in ISO format, filesize_$(Get-Date -F "yyyy-MM-dd_HH-mm-ss").log, so you can save each run and review it later, instead of having one gigantic file and no history of runs.
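The distinction in practice looks something like this (a sketch; file names are illustrative):

```powershell
# Format-Table emits formatting objects - fine when the destination is a log file:
Get-ChildItem | Format-Table Name, Length -AutoSize | Out-File files.log

# Select-Object emits real objects - needed when later steps process the data:
Get-ChildItem | Select-Object Name, Length | Export-Csv files.csv -NoTypeInformation
```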

How to get the first word of output from a PowerShell command

I am trying to get the first word from the output of this PowerShell command:
Get-ChildItem -Path Cert:\Certificate::LocalMachine\My | findstr -i ecimas
Which is returning output like:
ffdrggjjhj ecims.example.com
How can I return the string "ffdrggjjhj" only?
You should just be able to split the output like so:
(Get-ChildItem -Path Cert:\Certificate::LocalMachine\My | findstr -i ecimas).split()[0]
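If the line might start with whitespace, PowerShell's unary -split operator (which splits on runs of whitespace and ignores leading whitespace) is slightly more robust; a sketch using the sample line from the question:

```powershell
# Unary -split breaks a string on whitespace; index 0 is the first word
$line = ' ffdrggjjhj ecims.example.com'
(-split $line)[0]   # 'ffdrggjjhj'
```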
Usually PowerShell looks more like this. Since the command returns objects, parsing isn't needed:
get-childitem Cert:\LocalMachine\TrustedPublisher | where subject -match wireless |
select -expand thumbprint
ABCDEFABCDEFABCDEFABCDEFABCDEFABCDEFABCD

How do I remove carriage returns from text file using Powershell?

I'm outputting the contents of a directory to a txt file using the following command:
$SearchPath="c:\searchpath"
$Outpath="c:\outpath"
Get-ChildItem "$SearchPath" -Recurse | where {!$_.psiscontainer} | Format-Wide -Column 1 |
Out-File "$OutPath\Contents.txt" -Encoding ASCII -Width 200
What I end up with when I do this is a txt file with the information I need, but it adds numerous carriage returns I don't need, making the output harder to read.
This is what it looks like:
c:\searchpath\directory
name of file.txt
name of another file.txt
c:\searchpath\another directory
name of some file.txt
That makes a txt file that requires a lot of scrolling, but the actual information isn't that much, usually a lot less than a hundred lines.
I would like for it to look like:
c:\searchpath\directory
nameoffile.txt
c:\searchpath\another directory
another file.txt
This is what I've tried so far, not working
$configFiles=get-childitem "c:\outpath\*.txt" -rec
foreach ($file in $configFiles)
{
(Get-Content $file.PSPath) |
Foreach-Object {$_ -replace "'n", ""} |
Set-Content $file.PSPath
}
I've also tried 'r but both options leave the file unchanged.
Another attempt:
Select-String -Pattern "\w" -Path 'c:\outpath\contents.txt' | foreach {$_.line} |
Set-Content -Path c:\outpath\contents2.txt
When I run that string without the Set-Content at the end, it appears exactly as I need it in the ISE, but as soon as I add Set-Content at the end, it once again adds carriage returns where I don't need them.
Here's something interesting: if I create a text file with a few carriage returns and a few tabs, then use the same -replace script I've been using, but with `t to replace the tabs, it works perfectly. But `r and `n do not work. It's almost as though it doesn't recognize them as escape characters. And if I add `r and `n to the txt file and then run the script, it still doesn't replace anything. It doesn't seem to know what to do with them.
Set-Content adds newlines by default. Replacing Set-Content by Out-File in your last attempt in your question will give you the file you want:
Select-String -Pattern "\w" -Path 'c:\outpath\contents.txt' | foreach {$_.line} |
Out-File -FilePath c:\outpath\contents2.txt
It's not 'r (apostrophe), it's a back tick: `r. That's the key above the tab key on the US keyboard layout. :)
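For reference, the backtick escape sequences only expand inside double-quoted strings; a few hypothetical examples:

```powershell
"col1`tcol2"                 # `t is a tab
"line1`r`nline2"             # `r`n is a Windows line ending (CR+LF)
"a`nb" -replace "`n", ' '    # replaces the LF, giving 'a b'
'a`nb' -replace "`n", ' '    # single quotes: `n is literal text, nothing is replaced
```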
You can simply avoid all those empty lines by using Select-Object -ExpandProperty Name:
Get-ChildItem "$SearchPath" -Recurse |
Where { !$_.PSIsContainer } |
Select-Object -ExpandProperty Name |
Out-File "$OutPath\Contents.txt" -Encoding ASCII -Width 200
... if you don't need the folder names.

Using PowerShell, read multiple known file names, append text of all files, create and write to one output file

I have five .sql files and know the name of each file. For this example, call them one.sql, two.sql, three.sql, four.sql and five.sql. I want to append the text of all files and create one file called master.sql. How do I do this in PowerShell? Feel free to post multiple answers to this problem because I am sure there are several ways to do this.
My attempt does not work and creates a file with several hundred thousand lines.
PS C:\sql> get-content '.\one.sql' | get-content '.\two.sql' | get-content '.\three.sql' | get-content '.\four.sql' | get-content '.\five.sql' | out-file -encoding UNICODE master.sql
Get-Content one.sql,two.sql,three.sql,four.sql,five.sql > master.sql
Note that > is equivalent to Out-File -Encoding Unicode. I only tend to use Out-File when I need to specify a different encoding.
There are some good answers here but if you have a whole lot of files and maybe you don't know all of the names this is what I came up with:
$vara = get-childitem -name "path"
$varb = foreach ($a in $vara) {gc "path\$a"}
example
$vara = get-childitem -name "c:\users\test"
$varb = foreach ($a in $vara) {gc "c:\users\test\$a"}
You can obviously pipe this directly into | add-content or whatever but I like to capture in variables so I can manipulate later on.
See if this works better
get-childitem "one.sql","two.sql","three.sql","four.sql","five.sql" | get-content | out-file -encoding UNICODE master.sql
I needed something similar, Chris Berry's post helped, but I think this is more efficient:
gci -name "*PathToFiles*" | gc > master.sql
The first part gci -name "*PathToFiles*" gets you your file list. This can be done with wildcards to just get your .sql files i.e. gci -name "\\share\folder\*.sql"
Then it pipes to Get-Content and redirects the output to your master.sql file. As noted by Keith Hill, you can use Out-File in place of > to better control your output if needed.
I think the logical way of solving this is to use Add-Content:
$files = Get-ChildItem '.\one.sql', '.\two.sql', '.\three.sql', '.\four.sql', '.\five.sql'
$files | foreach { Get-Content $_ | Add-Content '.\master.sql' -encoding UNICODE }
However, Get-Content is usually very slow when reading multiple very large files. If that's your case, this article could help: http://keithhill.spaces.live.com/blog/cns!5A8D2641E0963A97!756.entry
What about:
Get-Content .\one.sql,.\two.sql,.\three.sql,.\four.sql,.\five.sql | Set-Content .\master.sql
Here is how I concatenate the sql files from the Sql folder:
# Set the current location of the script to use relative path
Set-Location $PSScriptRoot
# Concatenate all the sql files
$concatSql = Get-Content -Path .\Sql\*.sql
# Append the combined sql to a single file
Add-Content -Path concatFile.sql -Value $concatSql