PowerShell 2.0 Out-File formatting nightmare

How can I get Powershell to output a file IDENTICAL to the file produced by the following command?
dir /s /b /a-d *.* > C:\files.txt
should be easy, right?!!
EDIT:
I found PowerShell was truncating the output based on the screen buffer width. Fix that with Format-Table and it pads with spaces... try Format-List and you get property headings... you get the idea.

Does it have to be exact? Why? As the saying goes, if you're parsing strings in Powershell, you're probably doing something wrong...anyway...
1) Just call into cmd.exe.
PS> cmd /c "dir /s /b /a-d *.* > c:\files.txt"
2) I believe you can get the same results from native Powershell. But I can't be responsible for testing every edge case with NTFS junctions, hidden files, etc.
PS> gci -r | ?{ !$_.psiscontainer } | %{ $_.fullname } | out-file c:\files.txt
I personally hate the fact you can't use "select" to retrieve the FullName property without weird side effects on downstream cmdlets. If the pointlessness of the foreach loop bothers you as much as it does me, use Get-PropertyValue from PSCX or Linq-Select from Josh Einstein.
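If you do want select, the -ExpandProperty parameter makes Select-Object emit the bare property values instead of wrapper objects, which should avoid those downstream side effects; a minimal sketch:
PS> gci -r | ?{ !$_.psiscontainer } | Select-Object -ExpandProperty FullName | Out-File c:\files.txt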

There are many ways to write information to a file. One is Set-Content, which does not have the width problem. Also, converting a FileInfo object to a string results in the FullName.
dir * -r | ?{!$_.PSIsContainer} | Set-Content C:\files.txt
The reason you have a problem with Out-File is that all the Out-* cmdlets use the automatic formatting views. Those views are created with the console in mind.
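You can see the distinction directly: file objects are rendered through the automatic table view (truncated or padded to the console width), while plain strings bypass the formatting system and are written verbatim. A quick sketch of the two paths:
# Objects: routed through the table view, subject to console width
dir -Recurse | Out-File C:\files.txt
# Strings: no view applied, written as-is
dir -Recurse | ForEach-Object { $_.FullName } | Out-File C:\files.txt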

Powershell looping through files to pass to a program as a parameter

I have the following two cmd.exe commands, but I need to convert them to PowerShell, and I've failed miserably trying to figure it out. Line 1 finds a DLL, but only when it is in a bin folder; line 2 then takes all the entries found and runs a command with each, e.g. bin\Debug\file, bin\Release\file.
Can anyone help? The only limitation is that this is inside a YAML runner file, so I don't think I can split lines for each part, e.g. I don't think a Foreach-Object will work.
dir /s /b RunnerUnitTest.dll | findstr /r bin\\ > tests_list.txt
for /f %f in (tests_list.txt) do vstest.console.exe "%f"
I got as far as this
(gci -r RunnerUnitTest.dll).FullName | select-string bin
thanks.
Write a multi-line PowerShell script to do the work and then call that script from your YAML runner.
powershell -file "c:\myscripts\runtests.ps1" "c:\mydlls\RunnerUnitTest.dll" "c:\mytests\tests_list.txt"
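A minimal sketch of what runtests.ps1 might contain (the parameter name below is hypothetical; adapt it to the arguments you pass in):
# runtests.ps1 -- illustrative sketch, not the poster's actual script
param(
    [string]$DllName = 'RunnerUnitTest.dll'  # hypothetical parameter name
)
Get-ChildItem -Recurse -Filter $DllName |
    Where-Object { $_.FullName -match 'bin\\' } |
    ForEach-Object { vstest.console.exe $_.FullName }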
A single command (pipeline), spread across 3 lines for readability, using built-in command aliases for brevity (but the parameter names are spelled out, for long-term robustness):
gci -Recurse -Filter RunnerUnitTest.dll |
? FullName -match bin\\ |
% { vstest.console.exe $_.FullName }
gci -Recurse -Filter RunnerUnitTest.dll finds all RunnerUnitTest.dll in the current directory's subtree; -Filter makes for faster matching than using the (positionally implied) -Path parameter.
? FullName -match bin\\ uses ? (Where-Object) to test the .FullName (full path) property values of the input file-info objects for matching regex bin\\, i.e. for a literal bin\ substring, and only passes matching file-info objects on.
% { vstest.console.exe $_.FullName } uses % (ForEach-Object) to invoke vstest.console.exe with each matching file's full path.
Note that no intermediate file with a list of DLLs to process is created, because it isn't necessary.
If you need to pass the above to an explicit invocation of the PowerShell CLI, you'd do:
powershell -noprofile -command "gci -Recurse -Filter RunnerUnitTest.dll | ? FullName -match bin\\ | % { vstest.console.exe $_.FullName }"
If you're using PowerShell [Core] 6+, substitute pwsh for powershell.

Create text file containing a list of all files of a certain type with their filesize

I want to create a text file with all filenames of a certain filetype plus the filesize, recursively from a specified directory and all subdirectories.
For example: Listing all .jpg files plus their sizes from a huge photo-collection.
I have found several similar questions, but not this specific listing.
One did this with the full path name, but I don't need this and it would become very long.
Another lists all files, but without size.
Another lists all filenames with size, but I can't specify a filetype.
This PowerShell command creates the desired list, but I don't know how to limit it to a certain filetype (e.g. .jpg)
gci -rec|?{!$_.PSIsContainer}|%{"$($_.Fullname) $($_.Length)"} >filelist.txt
This batch file lists all .jpg's, but without showing the filesize.
dir /b /s z:\Filme\*.jpg > list1.txt
for /f "tokens=*" %%A in (list1.txt) do echo %%~nxA >> list.txt
del list1.txt
Could anyone edit one of these so I get the desired list, or come up with a different solution?
You are almost there with the batch script.
The %%~zA modifier will display the file size (in bytes).
You can also get rid of the temporary file by using a slightly different version of the for command.
Use the following batch file:
@echo off
setlocal
(for /f "tokens=*" %%A in ('dir /b /s z:\Filme\*.jpg') do (
    if /i "%%~xA" equ ".jpg" echo %%~nxA %%~zA
)) > list.txt
endlocal
Further Reading
An A-Z Index of the Windows CMD command line | SS64.com
Windows CMD Commands (categorized) - Windows CMD - SS64.com
Command Redirection, Pipes - Windows CMD - SS64.com
Dir - list files and folders - Windows CMD - SS64.com
For - Loop through command output - Windows CMD - SS64.com
If - Conditionally perform command - Windows CMD - SS64.com
Parameters / Arguments - Windows CMD - SS64.com
You know about the %%~nxA modifier, so I'm a bit surprised you didn't notice the %%~zA modifier.
To simplify it even more, use a for /R loop and don't use a temp file:
(for /R %%A in (*.jpg) do echo %%~nxA %%~zA)>list.txt
or if you need the full path\name, use %%~fA (explicit) or even just %%A
Text output:
Get-ChildItem -Path 'X:\PHOTO' -Filter '*.jp*g' -Recurse |
Where-Object {-not $_.PsIsContainer} |
Select-Object Name, Length |
Out-File -FilePath '.\FileList.txt'
CSV output:
Get-ChildItem -Path 'X:\PHOTO' -Filter '*.jp*g' -Recurse |
Where-Object {-not $_.PsIsContainer} |
Select-Object Name, Length |
Export-Csv -Path '.\FileList.csv' -NoTypeInformation
P.S. I've used the *.jp*g wildcard, which will also match *.jpeg files. Unfortunately, the * wildcard matches zero or more symbols, so you can get files like zzz.jpXXXg in your list. There are other ways to filter Get-ChildItem output that don't suffer from this issue, such as filtering in the pipeline with a regex, but they're slower: Where-Object {$_.Extension -match '^\.jpe?g$'}
Another option would be to not use the -Filter parameter, but the -Include instead where the wildcard pattern works as expected, like this:
PowerShell version 3.0 and up
Get-ChildItem 'z:\Filme' -File -Include '*.jpg' -Recurse |
Select FullName, Length |
Export-Csv '.\FileList.csv' -NoTypeInformation
PowerShell version below 3.0
Get-ChildItem 'z:\Filme' -Include '*.jpg' -Recurse |
Where-Object { !$_.PsIsContainer} |
Select FullName, Length |
Export-Csv '.\FileList.csv' -NoTypeInformation
Note that -Include only works if you also specify -Recurse or if you have the path end in \* like in Get-Childitem 'z:\Filme\*'.
Also, -Filter works faster than -Include (or -Exclude) parameters.
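You can check that claim on your own tree with Measure-Command (a rough sketch; absolute times will vary):
# Provider-side filtering: applied while the provider enumerates
Measure-Command { Get-ChildItem 'z:\Filme' -Filter '*.jpg' -Recurse }
# PowerShell-side filtering: objects retrieved first, filtered afterwards
Measure-Command { Get-ChildItem 'z:\Filme' -Include '*.jpg' -Recurse }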
As stated in the docs:
"Filters are more efficient than other parameters, because the provider applies them when the cmdlet gets the objects. Otherwise, PowerShell filters the objects after they are retrieved."
I have never looked into whether the layout of the Where command's output changes between languages/locales, but if your layout is not too dissimilar to that of my test system, you could do it on your machine like this:
From the Command Prompt:
(For /F "Tokens=1,3*" %A In ('Where/T /R . *.jpg 2^>Nul')Do @Echo("%C","%A")>"list.txt"
From a batch file:
@(For /F "Tokens=1,3*" %%A In ('Where/T /R . *.jpg 2^>Nul')Do @Echo("%%C","%%A")>"list.txt"
Obviously, if the layout of your Where command's output differs, you can still adjust the Tokens and/or add delimiters to suit your target system.
In the examples above, I've used . to represent the current directory, you could of course change that to another relative path, e.g. ..\Pictures, or full path, e.g. C:\Users\Patrick\Pictures as necessary.
And a powershell option:
Ls -Filt '*.jpg' -Fo -Rec -EA SilentlyContinue|?{!$_.PSIsContainer -And $_.Extension -Eq '.jpg'}|Select FullName,Length|ConvertTo-CSV -NoT|Select -Skip 1|SC '.\list.txt'
This will also include e.g. system and hidden files, will not include files with extensions other than .jpg and will not include an unrequested header with that listing.
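For readability, here is the same pipeline with the aliases and abbreviated parameter names spelled out (the behavior should be identical):
Get-ChildItem -Filter '*.jpg' -Force -Recurse -ErrorAction SilentlyContinue |
    Where-Object { !$_.PSIsContainer -and $_.Extension -eq '.jpg' } |
    Select-Object FullName, Length |
    ConvertTo-Csv -NoTypeInformation |
    Select-Object -Skip 1 |
    Set-Content '.\list.txt'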
Try this:
Get-ChildItem "yourdir" -File -Filter '*.jpg' -Recurse |
Select FullName, Length |
Export-Csv '.\FileList.csv' -NoType

Ignore first line in FINDSTR search

I am searching an object-oriented Modelica library for a certain string using the following command in the Windows 7 PowerShell:
findstr /s /m /i "Searchstring.*" *.*
The library consists of several folders containing text files with the actual code in them. To reduce the number of (unwanted) results, I have to ignore the first line of every text file.
Unfortunately, I cannot work out how to do this with the findstr command.
You can use Select-String instead of findstr
To get all matches excluding the ones on the first line try something like this:
Select-String -Path C:\dir\*.* -pattern "Searchstring*" | where {$_.LineNumber -gt 1}
If you have to search subdirectories you can pair it with Get-Childitem:
Get-Childitem C:\dir\*.* -recurse | Select-String -pattern "Searchstring*" | where {$_.LineNumber -gt 1}
If you want to keep using findstr, you could pipe its output into Select-Object, but note that this skips the first line of the overall output (with /m, the first matching file), not the first line of each file:
findstr /s /m /i "Searchstring.*" *.* | select -Skip 1

Enterprise find files (via powershell) exclude Tempory Internet Files

I am trying to run this command via powershell on every computer on my network.
I am running into a problem with Temporary Internet Files, providing too many false positives.
Can anyone suggest a way to improve this command?
dir C:\Users\ /S /B | findstr /i ".t.st." > "C:\test.txt"
One suggestion, was
dir C:\Users\ /S /B | findstr /i ".t.st." > "C:\test.txt"
dir C:\Users\ /S /B | findstr /i ".t.st." | findstr "Temporary Internet Files" > "C:\test2.txt"
fc C:\test.txt C:\test2.txt > C:\results.txt
But running through tests, it didn't give me the results I was looking for. I still had duplications. Or it would say the files are too different.
Thanks!
If you are running this through PowerShell, why not use PowerShell commands? Your examples use the legacy DIR command. You could use Get-ChildItem instead: -Exclude can skip a folder, and -Filter or -Include can find the files you want. Worst case, get the files and then pipe to Where-Object, perhaps using a regex pattern to filter out the files you want. I can't tell from your commands which files you are looking for.
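A hedged sketch of that approach, reusing the .t.st. pattern and folder name from the question (adjust both as needed; -File requires PowerShell 3.0+):
Get-ChildItem C:\Users -Recurse -File -ErrorAction SilentlyContinue |
    Where-Object { $_.Name -match '.t.st.' -and $_.FullName -notmatch 'Temporary Internet Files' } |
    ForEach-Object { $_.FullName } |
    Set-Content C:\test.txt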

How do I concatenate two text files in PowerShell?

I am trying to replicate the functionality of the cat command in Unix.
I would like to avoid solutions where I explicitly read both files into variables, concatenate the variables together, and then write out the concatenated variable.
Simply use the Get-Content and Set-Content cmdlets:
Get-Content inputFile1.txt, inputFile2.txt | Set-Content joinedFile.txt
You can concatenate more than two files with this style, too.
If the source files are named similarly, you can use wildcards:
Get-Content inputFile*.txt | Set-Content joinedFile.txt
Note 1: PowerShell 5 and older versions allowed this to be done more concisely using the aliases cat and sc for Get-Content and Set-Content respectively. However, these aliases are problematic because cat is a system command in *nix systems, and sc is a system command in Windows systems - therefore using them is not recommended, and in fact sc is no longer even defined as of PowerShell Core (v7). The PowerShell team recommends against using aliases in general.
Note 2: Be careful with wildcards - if you try to output to inputFiles.txt (or similar that matches the pattern), PowerShell will get into an infinite loop! (I just tested this.)
Note 3: Outputting to a file with > does not preserve character encoding! This is why using Set-Content is recommended.
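Going back to Note 2: if the output name could match the input wildcard, one workaround (a sketch, assuming everything sits in one directory) is Get-Content's -Exclude parameter, which takes effect when the path contains a wildcard:
# inputFilesJoined.txt matches inputFile*.txt, so exclude it from the input set
Get-Content inputFile*.txt -Exclude inputFilesJoined.txt | Set-Content inputFilesJoined.txt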
Do not use >; it messes up the character encoding. Use:
Get-Content files.* | Set-Content newfile.file
In cmd, you can do this:
copy one.txt+two.txt+three.txt four.txt
In PowerShell this would be:
cmd /c copy one.txt+two.txt+three.txt four.txt
While the PowerShell way would be to use gc, the above will be pretty fast, especially for large files. And it can be used on non-ASCII files too using the /B switch.
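With the /B (binary) switch that would be, for example:
cmd /c copy /b one.txt+two.txt+three.txt four.txt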
You could use the Add-Content cmdlet. Maybe it is a little faster than the other solutions, because I don't retrieve the content of the first file.
gc .\file2.txt| Add-Content -Path .\file1.txt
To concat files in command prompt it would be
type file1.txt file2.txt file3.txt > files.txt
PowerShell resolves type as an alias of Get-Content, which means you will get an error when using the type command this way in PowerShell, because Get-Content requires a comma separating the files. The same command in PowerShell would be
Get-Content file1.txt,file2.txt,file3.txt | Set-Content files.txt
I used:
Get-Content c:\FileToAppend_*.log | Out-File -FilePath C:\DestinationFile.log -Encoding ASCII -Append
This appended fine. I added the ASCII encoding to remove the nul characters Notepad++ was showing without the explicit encoding.
If you need to order the files by specific parameter (e.g. date time):
gci *.log | sort LastWriteTime | % {$(Get-Content $_)} | Set-Content result.log
You can do something like:
get-content input_file1 > output_file
get-content input_file2 >> output_file
Here, > behaves like Out-File, and >> like Out-File -Append.
Since most of the other replies often get the formatting wrong (due to the piping), the safest thing to do is as follows:
add-content $YourMasterFile -value (get-content $SomeAdditionalFile)
I know you wanted to avoid reading the content of $SomeAdditionalFile into a variable, but in order to preserve, for example, your newline formatting, I do not think there is a proper way to do it without.
A workaround would be to loop through $SomeAdditionalFile line by line and pipe each line into $YourMasterFile. However, this is overly resource-intensive.
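For completeness, the line-by-line workaround mentioned above would look something like this sketch (slow for large files, as noted):
Get-Content $SomeAdditionalFile | ForEach-Object { Add-Content $YourMasterFile -Value $_ }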
To keep encoding and line endings:
Get-Content files.* -Raw | Set-Content newfile.file -NoNewline
Note: AFAIR, these parameters aren't supported by old PowerShell versions (<3? <4?)
I think the "PowerShell way" could be:
set-content destination.log -value (get-content c:\FileToAppend_*.log )