I'm trying to translate a Linux command so it is easily usable for Windows users on a project, but I'm not having any luck finding a comparable command in PowerShell.
I have two paths with some SQL and CSV files. What I need is this command:
cat ./* ../path/* > new_file.sql
This takes all content from all files in path1 and then all content from all files in path2 and writes it to a file.
I assumed I could do something similar in PowerShell, but apparently the behaviour is wildly different. Here is what I have tried:
cat ./*, ../path/* > new_file.sql
Get-Content ./*, ../path2/* | Out-File new_file.sql
They both do the same thing, which seems to be... I'm not sure, taking the entirety of path2/* for every file in path1? The output quickly balloons to tens of megabytes, while the combined content of both directories is perhaps 40 kilobytes.
Anyone know? I cannot find a proper answer to this. Thanks!
EDIT: I think I figured out what the problem is. I guess I should've just used the actual paths in the example. The first path is ./*, and it seems like Get-Content keeps looping over the output file it creates itself. I have updated the title and examples to reflect this.
Enumerate the files as a separate step before concatenating their contents (this way Get-Content won't accidentally discover the new file halfway through):
$files = Get-ChildItem ./, ../path2/ -File
$files | Get-Content | Out-File newfile.txt
You can combine these statements in a single pipeline if you wish:
(Get-ChildItem ./, ../path2/ -File) | Get-Content | Out-File newfile.txt
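If you'd rather keep the single Get-Content call, excluding the output file by name also works (a sketch, assuming new_file.sql lands in one of the globbed directories):
Get-Content ./*, ../path2/* -Exclude new_file.sql | Out-File new_file.sql
Enumerating first is still the safer habit, since it works no matter what the output file is named.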
Need help with a command line, one-liner, PowerShell to remove folders
I'm trying to find an elegant way to remove folders by folder name, where the name reflects the date, because I cannot rely on the file/folder date metadata attributes.
Here's the problem I'm trying to solve:
I have a folder in which there are archived call recordings. Each day the recording system creates a folder and fills it with call recordings, one folder per day, named in the format MM_dd_yyyy.
I need to remove all but the last 7 folders. But I cannot rely on the creation/modified date on the files; that would be much easier with plain PowerShell. So I MUST, unfortunately, remove the folders by testing the folder name against the dates of the folders that need to be retained, in the same format (MM_dd_yyyy).
I can get the list of folder names that are to be retained, based on today plus the previous 6 days, with the following Windows command line:
c:\>powershell $d0=(Get-Date).ToString('MM_dd_yyyy'); $d1=(Get-Date).AddDays(-1).ToString('MM_dd_yyyy'); $d2=(Get-Date).AddDays(-2).ToString('MM_dd_yyyy'); $d3=(Get-Date).AddDays(-3).ToString('MM_dd_yyyy'); $d4=(Get-Date).AddDays(-4).ToString('MM_dd_yyyy'); $d5=(Get-Date).AddDays(-5).ToString('MM_dd_yyyy'); $d6=(Get-Date).AddDays(-6).ToString('MM_dd_yyyy'); $d0; $d1; $d2; $d3; $d4; $d5; $d6
NOTE: I need to keep this as a command one-liner and cannot use a .ps1 PowerShell script because of corporate and domain enforced security limitations.
This produces the folder names to be retained as listed below (run on 20 Nov 2021 to retain the last 7 days).
11_20_2021
11_19_2021
11_18_2021
11_17_2021
11_16_2021
11_15_2021
11_14_2021
The intention would be to remove any folder names that were like 11_13_2021, 11_12_2021... etc.
I can get away with running nested FOR loops in a Windows bat file to try and hack this together, but I'm trying to find a simpler, more readable, and elegant one-liner that will let me do something like the following:
powershell $d=(Get-Date).AddDays(-7).ToString('MM_dd_yyyy'); followed by some magic PowerShell stuff to remove any folder that doesn't match any of those to be retained.
If I had a way to provide the folder name (MM_dd_yyyy) to the (Get-Date).AddDays(-6) PowerShell command and have it return a boolean yes or no, that would be something closer to what I'm looking for.
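For illustration, a hypothetical version of that boolean check, with the folder name hard-coded:
powershell "$cutoff=(Get-Date).Date.AddDays(-6); [datetime]::ParseExact('11_14_2021','MM_dd_yyyy',$null) -ge $cutoff"
This prints True when the folder falls within the last 7 days and False otherwise.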
I've been reading and YouTubing and pulling my hair out, but so far I'm learning and mostly making a mess of it. Any ideas are most welcome.
I'm likely approaching this all wrong. The constraints are:
Given a list of folders with naming format MM_dd_yyyy, I need to remove/delete all that are not within the last week of days.
I cannot run PowerShell scripts (.ps1)
I can run Windows bat or cmd files with for loops and such
I cannot rely on the folders' or files' date/time metadata attributes; some data in the folders may have create/write/modified dates that are not in line with the folder name. I must rely on the folder name (MM_dd_yyyy) to remove the folders.
UPDATED with resolution:
powershell "($f=Get-ChildItem -Path 'D:\PosConvSav' -Filter '*_*_*' -Directory | Where-Object { $_.Name -match '\d{2}_\d{2}_\d{4}' } | sort-object -desc)[14..($_.count)] | remove-item -recurse"
The PowerShell code for this would be:
Get-ChildItem -Path 'RootPath\Where\The\Folders\To\Delete\Are\Found' -Filter '*_*_*' -Directory |
Where-Object { $_.Name -match '\d{2}_\d{2}_\d{4}' } | # filter some more using regex -match
Sort-Object { [datetime]::ParseExact($_.Name, 'MM_dd_yyyy', $null) } | # sort by date
Select-Object -SkipLast 7 | # skip the newest 7 folders
Remove-Item -Recurse -Force # remove the rest
To play it safe, add -WhatIf to the final Remove-Item command. That way the code does not actually delete anything, but shows in the console what would be deleted. If you are satisfied that it is correct, remove -WhatIf to actually remove those folders.
As Olaf already commented, I don't think using one-line code would be best here, because what you end up with is code that isn't readable anymore and where mistakes are extremely hard to find.
There is no penalty whatsoever for multiline code; in fact, it is THE way to go!
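That said, since the hard constraint here is a cmd one-liner, the same pipeline can be collapsed and handed to powershell.exe in one line (a sketch; adjust the root path, and keep -WhatIf on until you trust the output):
powershell -NoProfile -Command "Get-ChildItem -Path 'D:\PosConvSav' -Filter '*_*_*' -Directory | Where-Object { $_.Name -match '\d{2}_\d{2}_\d{4}' } | Sort-Object { [datetime]::ParseExact($_.Name, 'MM_dd_yyyy', $null) } | Select-Object -SkipLast 7 | Remove-Item -Recurse -WhatIf"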
I manage database servers and often have to apply scripts to different servers or databases.
Sometimes these scripts are all saved in a directory and need to be opened and run on the target server\database.
As I have been looking at automating this task, I came across Run All PowerShell Scripts In A Directory and also How can I execute a set of .SQL files from within SSMS?, and that is exactly what I needed; however, I stumbled over a few issues:
I don't know the file names
:setvar path "c:\Path_to_scripts\"
:r $(path)\file1.sql
:r $(path)\file2.sql
I tried to gather all the .sql files into one big string, but when I copied it from PowerShell into SQL, the lines got messed up in many of the procedures that had long lines:
cls
$Radhe = Get-Content 'D:\apply all scripts to SQLPRODUCTION\*.sql' -Raw
$Radhe.Count
$Radhe.LongLength
$Radhe
If I could read all the files in that specific folder and save them all into a single the_scripts_to_run.sql file, without changing the line endings, that would be perfect.
I don't need to use get-content or any command in particular, I just would like to get all my scripts into a big single script with everything in it, without changes.
How can I achieve that?
I even found Merge multiple SQL files into a single SQL file, but I want to get it done via PowerShell.
This should work fine. I'm not sure what you mean by not needing to use Get-Content; you could use [System.IO.File]::ReadAllLines() or [System.IO.File]::ReadAllText() instead, but this should work fine too. Try it and let me know if it works.
$path = "c:\Path_to_scripts"
$scripts = (Get-ChildItem "$path\*.sql" -Recurse -File).FullName
$merged = [system.collections.generic.list[string[]]]::new()
foreach($script in $scripts)
{
$merged.Add((Get-Content $script))
}
$merged | Out-File "$path\mergedscripts.sql"
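If preserving each file's original line endings matters (Get-Content splits files into lines and Out-File re-joins them with the default newline), the .NET methods mentioned above can do the merge without touching them. A sketch, reusing the same hypothetical paths:
$path = 'c:\Path_to_scripts'
$sb = [System.Text.StringBuilder]::new()
foreach ($script in (Get-ChildItem "$path\*.sql" -File | Sort-Object Name)) {
    # ReadAllText returns the file contents verbatim, line endings included
    [void]$sb.Append([System.IO.File]::ReadAllText($script.FullName))
}
[System.IO.File]::WriteAllText("$path\mergedscripts.sql", $sb.ToString())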
This is actually much simpler than the proposed solutions. Get-Content takes a list of paths and supports wildcards, so no loop is required.
$path = 'c:\temp\sql'
Set-Content -Path "$path\the_scripts_to_run.sql" -Value (Get-Content -Path "$path\*.sql" -Raw)
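One caveat (my note, not part of the original answer): on a second run the output file itself matches the *.sql wildcard and would get folded into the merge. Excluding it by name avoids that:
Set-Content -Path "$path\the_scripts_to_run.sql" -Value (Get-Content -Path "$path\*.sql" -Exclude 'the_scripts_to_run.sql' -Raw)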
Looks like @Santiago and I had the same idea:
Get-ChildItem -Path "$path" -Filter "*.sql" | ForEach-Object -Process {
Get-Content $_.FullName | Out-File $Path\stuff.txt -Append utf8
}
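One thing to watch with the -Append approach (my observation): the target file grows on every run, so clear it first if you re-run the merge:
Remove-Item "$path\stuff.txt" -ErrorAction Ignore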
I am attempting to search a directory of Perl scripts and compile a list of all the other Perl scripts executed from those files (intentionally trying to do this through PowerShell). A simplistic dependency mapper, more or less.
With the line of code below I get output of every line where a reference to a Perl file is found, but what I really need is the same output AND the file in which each match was found.
Get-Content -Path "*.pl" | Select-String -Pattern '\w+\.pl' | foreach {Write-Host "$_"}
I have succeeded using some more complicated code, but I think I can simplify it and accomplish most of the work in a couple of lines (the code above accomplishes half of that).
Running this on a Windows 10 machine, PowerShell v5.1.
I do things like this all the time. You don't need to use get-content.
ls -r *.pl | Select-String \w+\.pl
file.pl:1:file2.pl
You don't need to use ls or Get-ChildItem either; Select-String can take a path parameter:
Select-String -Pattern '\w+\.pl' -Path *.pl
which shortens to this in the shell:
sls \w+\.pl *.pl
(if your regex is more complex it might need quotes around it).
For the foreach {write-host part, you're writing a lot of code to turn useful objects back into less-useful strings, and forcibly writing them to the host instead of the standard output stream. You can pick out the data you want with:
sls \w+\.pl *.pl | select filename, {$_.matches[0]}
which will keep them as objects with properties, but render by default as a table.
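To get closer to a dependency map, the matches can also be grouped per file (a sketch building on the same command; Filename and Matches are properties of Select-String's MatchInfo objects):
sls '\w+\.pl' *.pl | Group-Object Filename | Select-Object Name, {$_.Group.Matches.Value | Sort-Object -Unique}
Each row is then a script plus the unique .pl names referenced inside it.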
I am trying to configure my dotnet core project (in Windows) as "case sensitive", so it behaves as it does on my production server (Linux).
I have found this way of doing it:
fsutil.exe file setCaseSensitiveInfo "C:\my folder" enable
The problem is that this function is not recursive:
The case sensitivity flag only affects the specific folder to which you apply it. It isn’t automatically inherited by that folder’s subfolders.
So I am trying to build a powershell script that applies this to all folders and subfolders, recursively.
I have tried googling something similar and just modifying the command line, but I don't seem to find the correct keywords. This is the closest that I've gotten to this sort of example.
Correct code:
(Get-ChildItem -Recurse -Directory).FullName | ForEach-Object {fsutil.exe file setCaseSensitiveInfo $_ enable}
Explanation:
NOTE: The code in the answer assumes you're in the root of the directory tree and want to run fsutil.exe against all the folders inside, as has been pointed out in the comments (thanks @Abhishek Anand!)
Get-ChildItem -Recurse -Directory will give you list of all folders (recursively).
As you want to pass their full paths, you can access them using .FullName[1] (or the more self-explanatory | Select-Object -ExpandProperty FullName).
Then you use ForEach-Object to run fsutil.exe multiple times. The current folder's FullName can be accessed using $_ (this represents the current object in ForEach-Object)[2].
Hint:
If you want more tracking of what's currently being processed, you can append ; Write-Host $_ to write the path of the currently processed folder to the console (the semicolon ; separates it from the fsutil invocation), as was pointed out in the comments (thanks @Fund Monica's Lawsuit!)
[1] .FullName notation works for PowerShell 3.0 and greater, Select-Object -ExpandProperty FullName is preferred if there's a chance that lower version will be used.
[2] $_ is an alias for $PSItem
(Get-ChildItem -Recurse -Directory).FullName | ForEach-Object {if (-Not ($_ -like '*node_modules*')) { fsutil.exe file setCaseSensitiveInfo $_ enable } }
I modified @robdy's code to allow excluding node_modules. You can replace the 'node_modules' bit in the above with anything else, to exclude file paths containing it.
If you're working with npm, you probably want to exclude node_modules. @robdy's answer is great, but it was taking minutes at a time, iterating over every single node package folder even if the package wasn't installed. Given that this is something one might want to run fairly often (since directories might be added all the time), and since you probably aren't modifying anything in node_modules, excluding it seems reasonable.
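If the slow part is descending into node_modules at all (the -like filter above still walks the whole tree), a manual recursion that prunes the subtree may be faster. A sketch; the function name is made up:
function Set-CaseSensitiveTree($dir) {
    # skip the excluded subtree entirely instead of filtering afterwards
    if ($dir.Name -eq 'node_modules') { return }
    fsutil.exe file setCaseSensitiveInfo $dir.FullName enable
    Get-ChildItem -LiteralPath $dir.FullName -Directory | ForEach-Object { Set-CaseSensitiveTree $_ }
}
Get-ChildItem -Directory | ForEach-Object { Set-CaseSensitiveTree $_ }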
With Cygwin and bash shell, you can do this:
$ find $THEDIR -type d -exec fsutil file setCaseSensitiveInfo "{}" enable \;
It appears that Windows handles the '/' characters output by the find command just fine.
In my case I had to first enable the Windows Subsystem for Linux before using the fsutil tool. So my steps were:
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux
then restart, and then @robdy's solution:
(Get-ChildItem -Recurse -Directory).FullName | ForEach-Object {fsutil.exe file setCaseSensitiveInfo $_ enable}
On Windows 11, the other answers did not work for me, as fsutil requires that the directory be empty. To overcome this, I created a NEW empty directory, used fsutil file setCaseSensitiveInfo to set the case-sensitive flag on it, then MOVED the files from the other directory inside the new one. This works, as the directories are re-created when moved, and new directories inherit the case-sensitive flag.
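A sketch of that workaround (the paths are placeholders):
$old = 'C:\src\project'      # existing, populated directory
$new = 'C:\src\project_cs'   # new, empty directory
New-Item -ItemType Directory -Path $new | Out-Null
fsutil.exe file setCaseSensitiveInfo $new enable
# items moved under a case-sensitive root; new subdirectories inherit the flag
Get-ChildItem -LiteralPath $old -Force | Move-Item -Destination $new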
Okay, I'm at a loss understanding something. It might be in here somewhere already, but I'm having some trouble explaining it to myself, so I'm also having trouble searching for it.
The thing isn't really complicated, just illogical to me. I'm trying to set up a script in PowerShell that mainly goes into a folder and gets the versions of all the *.exe files. And that works quite well, like this:
dir 'C:\Program Files (x86)\SPN' | Get-ChildItem -Recurse -Include '*.exe' | %{ $_.VersionInfo } | select FileName, FileVersion | Format-Table -AutoSize | Out-File c:\test.txt
It's pretty simple: all it does is go into the folder, recursively search for the exe files, take out the VersionInfo, and then select the FileNames and Versions before putting it all into a file, to make a report of the versions installed. And it works too!
The problem is a bit more complicated. We use an htm\vbs application to run scripts. Mainly it means that all our scripts, be they VBS, batch, or PowerShell, are run through this platform. Sometimes we also run a PowerShell script through a batch file started by a VBS script.
When I start PowerShell and run the script above, a file is output with the information I'm after; when I start a cmd and run the PowerShell script from there, I get a list of the applications without the versions. Same goes for VBS.
So I thought I'd dig deeper. I started a cmd and typed start powershell to get a new window that way, but alas, I still only get the file names and no versions! The only way I get the versions is to run the script through PowerShell itself (meaning I have to manually open PowerShell, as right-clicking and selecting Run with PowerShell gives names and no versions).
Any suggestions and ideas would be welcome.
Thanks.
Some columns might not be displayed if you specify the widest columns first, so it is safest to specify the smallest data elements first. In the following example, we specify the extremely wide path element first, and even with wrapping, we still lose the final column.
Specifying fileversion first fixes the problem.
dir 'C:\.....' | Get-ChildItem -Recurse -Include '*.exe' | %{ $_.VersionInfo } | select FileName, FileVersion | Format-Table -Wrap -AutoSize -Property fileversion, filename | Out-File "C:\....\testing.txt"
http://technet.microsoft.com/en-us/library/dd347677.aspx
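A related knob (my addition, not from the linked page): Out-File wraps and truncates at the host's line width, which is what bites when the script is launched from cmd or a VBS host rather than an interactive PowerShell window. Passing an explicit width to Out-File sidesteps that:
dir 'C:\Program Files (x86)\SPN' | Get-ChildItem -Recurse -Include '*.exe' | %{ $_.VersionInfo } | Format-Table -Property fileversion, filename -AutoSize | Out-File c:\test.txt -Width 500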