Powershell through CMD or VBS getting FileVersion

Okay, I'm at a loss understanding something. It might be in here somewhere already, but I'm having some trouble explaining it to myself, so I'm also having trouble searching for it.
The thing isn't really complicated, just illogical to me. I'm trying to set up a script in PowerShell that mainly goes into a folder and gets the versions of all the *.exe files. And that works quite well, like this:
dir 'C:\Program Files (x86)\SPN' | Get-ChildItem -Recurse -Include '*.exe' | %{$_.VersionInfo} | select FileName, FileVersion | Format-Table -AutoSize | Out-File c:\test.txt
It's pretty simple: all it does is go into the folder, recursively search for the exe files, take out the VersionInfo, and then select the FileNames and Versions before putting it all into a file, so as to make a report of the installed versions. And it works too!
The problem is a bit more complicated. We use an htm\vbs application to run scripts. Mainly it means that all our scripts, be they VBS, batch, or PowerShell, are run through this platform. Sometimes we also run a PowerShell script through a batch file started by a VBS script.
When I start PowerShell and run the script above, a file is output with the information I'm after; when I start a cmd and run the PowerShell script, I get a list of the applications without the versions. Same goes for VBS.
So I thought I'd dig deeper. I started a cmd and typed in start powershell to get a new window that way, but alas, I still only get the filenames and no versions! The only way I get the versions is to run the script through PowerShell itself. (Meaning I have to manually open PowerShell, as right-clicking and selecting Run with Powershell gives names and no versions.)
Any suggestions and ideas would be welcome.
Thanks.

Some columns might not be displayed if you specify the widest columns first, so it is safest to specify the smallest data elements first. In the following example, we specify the extremely wide path element first, and even with wrapping, we still lose the final column.
Specifying FileVersion first fixes the problem:
dir 'C:\.....' | Get-ChildItem -Recurse -Include '*.exe' | %{$_.VersionInfo} | select FileName, FileVersion | Format-Table -Wrap -AutoSize -Property FileVersion, FileName | Out-File "C:\....\testing.txt"
http://technet.microsoft.com/en-us/library/dd347677.aspx
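For what it's worth, the truncation can be sidestepped entirely by not formatting for the console at all. A minimal sketch (the folder and output paths are just the ones from the question; Export-Csv and Out-File -Width are standard cmdlets):
# CSV output is immune to console width, no matter which host runs the script
Get-ChildItem 'C:\Program Files (x86)\SPN' -Recurse -Include '*.exe' |
    ForEach-Object { $_.VersionInfo } |
    Select-Object FileName, FileVersion |
    Export-Csv 'C:\test.csv' -NoTypeInformation
# Or keep the table layout but force a wide output buffer so FileVersion survives
Get-ChildItem 'C:\Program Files (x86)\SPN' -Recurse -Include '*.exe' |
    ForEach-Object { $_.VersionInfo } |
    Select-Object FileName, FileVersion |
    Format-Table -AutoSize | Out-File 'C:\test.txt' -Width 300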

Related

How to make a script to take newly updated files from a folder and transfer them to another by name

I'm trying to extract html files that automatically update from one folder to another. I am searching for them by an auto-generated code on every file. The files are named like: Diagnostics (2022'12'02 #1403'01''918''').html
where the code after the # changes on every app start and close. Sometimes there is more than one file, and they have the suffix "p1", "p2", "p3", ... "pn", and I want all the files with the # code to be taken and moved to another folder. My strategy was to take the most recently created file (which is most of the time Diagnostics (xxxxxx)pn.html), have a script copy out the part from Diagnostics up to the closing bracket, and then find the other files. This is what I came up with:
$a = Get-ChildItem -Path 'C:\Users\something\Temporary Files\' | Sort-Object -Descending -Property LastWriteTime | Select-Object -First 1
$b = Get-Content "C:\Users\something\Desktop\test.txt"
Echo $a
$a | Out-File -FilePath .\test.txt
Get-Content -Path .\test.txt
This creates a new file with the name of the file in it, but I don't know how to extract the useful part as a string, then search for it in the directory and copy the matches to another folder on the desktop. This is done with PowerShell, as I want to make it a .bat file that, when executed, does the whole process; I couldn't do it with bat files only.
Thanks for the help in advance, and I'm happy to join the community :)
Note: this is what the file test.txt contains:
    Directory: C:\Users\something\Temporary Files
Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a---l         12/2/2022   2:09 PM         706568 Diagnostics (2022'12'02 #1403'01''918''').html
Note: this is my first time playing with .bat files and with PowerShell, so I understand it's not really much but... :D
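For the extraction step itself, here is one possible sketch (the source path comes from the question; the destination folder and the regex for the #-code are my assumptions about the naming scheme), shown with -WhatIf so it only reports what it would move:
$source = 'C:\Users\something\Temporary Files'
$target = 'C:\Users\something\Desktop\Extracted'   # hypothetical destination
New-Item -ItemType Directory -Path $target -Force | Out-Null
# Newest Diagnostics file first, then pull out the code between '#' and ')'
$newest = Get-ChildItem -Path $source -Filter 'Diagnostics*.html' |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
if ($newest.Name -match '#(?<code>[^)]+)\)') {
    $code = [regex]::Escape($Matches['code'])
    # Move every file carrying the same #-code (covers the p1/p2/... siblings)
    Get-ChildItem -Path $source -Filter 'Diagnostics*.html' |
        Where-Object { $_.Name -match "#$code\)" } |
        Move-Item -Destination $target -WhatIf
}
Drop the -WhatIf once the reported matches look right.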

Need windows batch command one-liner to remove folders by name, not by date.time using powershell if applicable

I need help with a command-line one-liner, PowerShell if applicable, to remove folders.
I'm trying to find an elegant way to remove folders by folder name, which reflects the date, because I cannot rely on the file/folder date metadata attributes.
Here's the problem I'm trying to solve:
I have a folder containing archived call recordings. Each day, the recording system creates a folder named in the format MM_dd_yyyy and fills it with that day's call recordings.
I need to remove all but the last 7 folders. But I cannot rely on the creation/modified date on the files; that would be much easier with just PowerShell. So I MUST, unfortunately, remove the folders by testing each folder name against the dates of the folders that I need to retain, in the same format (MM_dd_yyyy).
I can get the list of folder names to be retained, based on the current day and the previous 6 days, with the following Windows command line:
c:\>powershell $d0=(Get-Date).ToString('MM_dd_yyyy'); $d1=(Get-Date).AddDays(-1).ToString('MM_dd_yyyy'); $d2=(Get-Date).AddDays(-2).ToString('MM_dd_yyyy'); $d3=(Get-Date).AddDays(-3).ToString('MM_dd_yyyy'); $d4=(Get-Date).AddDays(-4).ToString('MM_dd_yyyy'); $d5=(Get-Date).AddDays(-5).ToString('MM_dd_yyyy'); $d6=(Get-Date).AddDays(-6).ToString('MM_dd_yyyy'); $d0; $d1; $d2; $d3; $d4; $d5; $d6
NOTE: I need to keep this as a command one-liner and cannot use a .ps1 PowerShell script because of corporate and domain enforced security limitations.
This produces the folder names to be retained as listed below (run on 20 NOV 2021 to retain the last 7 days).
11_20_2021
11_19_2021
11_18_2021
11_17_2021
11_16_2021
11_15_2021
11_14_2021
The intention would be to remove any folder names that were like 11_13_2021, 11_12_2021... etc.
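Incidentally, the seven $d variables can be generated in one short pipeline instead of being spelled out, still as a cmd one-liner (a sketch, using underscores to match the folder names):
powershell "0..6 | ForEach-Object { (Get-Date).AddDays(-$_).ToString('MM_dd_yyyy') }"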
I can get away with running nested FOR loops in a Windows bat file to try and hack this together, but I'm trying to find a simpler, more readable, and elegant one-liner that will let me do something like the following:
powershell $d=(Get-Date).AddDays(-7).ToString('MM_dd_yyyy'); and then some magic PowerShell stuff to remove any folder that doesn't match any of those that are to be retained.
If I had a way to provide the folder name (MM_dd_yyyy) to the (Get-Date).AddDays(-6) PowerShell command and have it return a boolean yes or no, that would be something closer to what I'm looking for.
I've been reading and YouTubing and pulling my hair out, but so far I'm learning while mostly making a mess of it. Any ideas are most welcome.
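For the record, the boolean test described above is a one-liner (a sketch; 11_13_2021 is just an example folder name). It prints True when the name falls outside the 7-day retention window:
powershell "[datetime]::ParseExact('11_13_2021', 'MM_dd_yyyy', $null) -lt (Get-Date).Date.AddDays(-6)"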
I'm likely approaching this all wrong. The constraints are:
Given a list of folders with naming format MM_dd_yyyy, I need to remove/delete all that are not within the last week.
I cannot run PowerShell .ps1 script files.
I can run Windows bat or cmd files with for loops and such.
I cannot rely on the folders' or files' date/time metadata attributes; some data in the folders may have create/write/modified dates that are not in line with the folder name. I must rely on the folder name (MM_dd_yyyy) to remove the folders.
UPDATED with resolution:
powershell "($f=Get-ChildItem -Path 'D:\PosConvSav' -Filter '*_*_*' -Directory | Where-Object { $_.Name -match '\d{2}_\d{2}_\d{4}' } | sort-object -desc)[14..($_.count)] | remove-item -recurse"
The PowerShell code for this would be:
Get-ChildItem -Path 'RootPath\Where\The\Folders\To\Delete\Are\Found' -Filter '*_*_*' -Directory |
Where-Object { $_.Name -match '\d{2}_\d{2}_\d{4}' } | # filter some more using regex -match
Sort-Object { [datetime]::ParseExact($_.Name, 'MM_dd_yyyy', $null) } | # sort by date
Select-Object -SkipLast 7 | # skip the newest 7 folders
Remove-Item -Recurse -Force # remove the rest
To play it safe, add -WhatIf to the final Remove-Item command. With that, the code does not actually delete anything, but shows in the console what would be deleted. If you are satisfied that this is correct, remove -WhatIf to actually remove those folders.
As Olaf already commented, don't assume one-line code is best, because what you'll end up with is code that isn't readable anymore and where mistakes are extremely hard to find.
There is no penalty whatsoever for multiline code; in fact, it is THE way to go!
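That said, if the locked-down environment truly forces a single cmd.exe line, the multiline pipeline above collapses mechanically into one command (a sketch; 'D:\PosConvSav' is the path from the asker's resolution, and Select-Object -SkipLast needs PowerShell 5.0 or later):
powershell -NoProfile -Command "Get-ChildItem -Path 'D:\PosConvSav' -Filter '*_*_*' -Directory | Where-Object { $_.Name -match '\d{2}_\d{2}_\d{4}' } | Sort-Object { [datetime]::ParseExact($_.Name, 'MM_dd_yyyy', $null) } | Select-Object -SkipLast 7 | Remove-Item -Recurse -WhatIf"
Remove the -WhatIf once the dry run shows the right folders.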

Powershell equivalent of Cat ./* ../path2/* > file.sql

I'm trying to translate a Linux command so it is easily usable for Windows users on a project, but I'm not having any luck finding comparable commands in PowerShell.
I have two paths with some SQL and CSV files. What I need is this command:
cat ./* ../path/* > new_file.sql
This takes all content from all files in path1 and then all content from all files in path2 and writes it to a file.
I assumed I could do something similar in Powershell, but apparently the behaviour is wildly different.
What I have tried are:
cat ./*, ../path/* > new_file.sql
Get-Content ./*, ../path2/* | Out-File new_file.sql
They both do the same thing, which seems to be... I'm not sure, taking the entirety of path2/* for every file in path1? The output quickly balloons to tens of megabytes, while the combined content of both directories is perhaps 40 kilobytes.
Anyone know? I cannot find a proper answer to this. Thanks!
EDIT: I think I figured out what the problem is. I guess I should've just used the actual paths in the example. The first path is ./*, and it seems the command keeps reading the very Out-File it is creating. I have updated the title and examples to reflect this.
Enumerate the files as a separate step before concatenating their contents (this way Get-Content won't accidentally discover the new file halfway through):
$files = Get-ChildItem ./, ../path2/ -File
$files | Get-Content | Out-File newfile.txt
You can combine these statements in a single pipeline if you wish:
(Get-ChildItem ./, ../path2/ -File) | Get-Content | Out-File newfile.txt
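Another way to avoid the self-read, if you prefer the original one-liner shape, is simply to write the output somewhere neither wildcard can see it (a sketch):
Get-Content ./*, ../path2/* | Out-File ../new_file.sql   # output file sits outside both globs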

Code working a bit differently between Powershell.exe and Powershell ISE - Sort-Object behavior

I've got a bit of code to get hyperlinks from an http webpage: find all items matching the criteria, then find the newest by sorting them, and use its innerText as the source for a download. All hyperlinks get a 7-digit number assigned at the end of their name, with a greater number for newer files.
Code:
$Web = ((Invoke-WebRequest "http://serveraddress/Export/").Links |
Where-Object {$_.href -like "xx_yyyyyy_auto_standard*"} |
Sort Name -Desc | Select -Last 1).innertext
Start-BitsTransfer -Source http://serveraddress/Export/$Web -Destination C:\autoreports\
Now, when I run the above in Powershell.exe (e.g. launching the .ps1 file by right-click and "Run with Powershell"), I get the oldest file downloaded. When I run it in Powershell ISE, I get the newest file. When I changed -Last to -First, the Powershell.exe code worked as expected.
I can easily change this, but since I'm very new to PowerShell: why is there such a difference between Powershell ISE and "standard"? Thanks in advance! While this might be a rookie question, I did not find, or understand, the reason for the difference.
To complement Jordan's helpful answer, which solves your specific problem:
As stated, Sort-Object quietly accepts nonexistent properties to sort by.
Unfortunately, as of Windows PowerShell v5.1, attempting to sort by a nonexistent property results in seemingly random output order:
WinPS> 1, 2, 3, 4 | Sort-Object -Property Foo
3
4
1
2
Fortunately, this problem has been fixed in PowerShell Core, where the input order is preserved in that case.
The link objects don't have a "Name" property, so sorting by Name won't work.
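Putting both answers together, a corrected version of the download snippet might look like this (a sketch: it assumes the only digits in the link text are the 7-digit counter, so they can drive a numeric sort; innerText is the property the question already reads):
$Web = ((Invoke-WebRequest "http://serveraddress/Export/").Links |
    Where-Object { $_.href -like "xx_yyyyyy_auto_standard*" } |
    Sort-Object { [int]($_.innerText -replace '\D') } |   # numeric, not lexical, sort
    Select-Object -Last 1).innerText
Start-BitsTransfer -Source "http://serveraddress/Export/$Web" -Destination C:\autoreports\
Sorting on a computed numeric key is deterministic in both hosts, unlike sorting on a property the objects don't have.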

Clean up a remote machine's temporary directory: How do I identify files that are in use? (Powershell)

I have a number of remote machines whose temporary directories get full. (They are Selenium / WebDriver grid remotes.) I have a PowerShell script that identifies the files and directories that need to be cleaned. The command in use looks something like this (excluding the complexities of the various machines and directories):
gci $env:TEMP -Recurse | Remove-Item -ErrorAction Continue -Recurse
The problem is that this takes far too long when some files are in use. Locally, I could join against the output of handle.exe (parsing would be a little ugly), but that would be more complicated on a remote machine. Among other things, I'd need to verify that WinRM was configured correctly, that handle.exe was on the path, etc.
Is there a simpler way to identify that a file is in use? Ideally one that can be filtered on via PowerShell (which includes .NET). I'm familiar with a variety of other scripting languages (Ruby, Python, Perl).
The best tool I've found for listing open files is the SysInternals tool handle.exe, e.g.:
$openFiles = @(handle $env:TEMP | Foreach {($_ -split ": ")[3]} | Select -Unique)
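If shelling out to handle.exe on every remote is too much plumbing, a handle-free sketch is to probe each file for an exclusive lock via .NET and treat a failure as "in use" (an approximation: it also skips files you lack permission to open, and it races with whatever is creating temp files):
function Test-FileInUse {
    param([string]$Path)
    try {
        # Open with FileShare 'None'; succeeds only if nothing else has the file open
        $fs = [System.IO.File]::Open($Path, 'Open', 'ReadWrite', 'None')
        $fs.Close()
        $false
    } catch {
        $true   # sharing violation, access denied, etc. -> leave the file alone
    }
}
Get-ChildItem $env:TEMP -Recurse -File |
    Where-Object { -not (Test-FileInUse $_.FullName) } |
    Remove-Item -ErrorAction SilentlyContinue
Wrapped in Invoke-Command, the same function runs on the remote machines over WinRM.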