How to detect if a special file is running or not with powershell? - powershell

Get-Process only gives a result if a notepad or exe file is running, but I want to know whether a specific file (index.txt), which is in some folder, is running or not in PowerShell.

You can use the MainWindowTitle property and then select the names of the running processes. Something like this -
Get-Process notepad | Where-Object {$_.MainWindowTitle} | Select-Object Id, Name, MainWindowTitle
This will give you the list of running Notepad processes, and if you find your file index.txt under the MainWindowTitle header, you can confirm that your file is indeed open.
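A minimal sketch of that check, assuming the file was opened in Notepad (which shows the file name in its window title):

```powershell
# Sketch: $true if any running Notepad window title mentions index.txt.
# Assumes the file is open in Notepad; other editors title their windows differently.
$isOpen = [bool](Get-Process notepad -ErrorAction SilentlyContinue |
    Where-Object { $_.MainWindowTitle -like '*index.txt*' })
$isOpen
```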

Get-Process gets all running processes.
A text file is not a process; it is, of course, an object opened by / in a process (whether PS started it or not): notepad, winword, etc.
PS can be used to start a process, say notepad, but PS does not own it, the exe does.
So, in the context you are asking about, a file is never "running" in PS. The process (running on your system) can be looked up using Get-Process (or the old Tasklist tool, which Get-Process replaces), along with the path information of the running process.
Start notepad manually and open a text file.
Run Get-Process and ask for all values of the notepad process.
You will see the Get-Process brings back a whole lot of info for you to select from.
Note that it is the MainWindowTitle that shows what file the Notepad process has open, but nowhere in these results does it say what path that file is opened from.
Get-Process Notepad | Select-Object *
Name : notepad
Id : 20516
...
Path : C:\WINDOWS\system32\notepad.exe
Company : Microsoft Corporation
CPU : 2.515625
ProductVersion : 10.0.17134.1
Description : Notepad
Product : Microsoft® Windows® Operating System
__NounName : Process
...
MainWindowTitle : news-stuff.txt - Notepad
MainModule : System.Diagnostics.ProcessModule (notepad.exe)
...

Note:
This answer tells you if a given file is currently held open by someone.
If you also need to know who (what process) has it open, see the answers to this related question, but note that they require either installation of a utility (handle.exe) or prior configuration of the system with administrative privileges (openfiles).
If you want a conveniently packaged form of the technique presented in this answer, you can download function Get-OpenFiles from this Gist, which supports finding all open files in a given directory [subtree].
Files, unlike processes, aren't running, so I assume that you meant to test if a file is currently open (has been opened, possibly by another process, for reading and/or writing, and hasn't been closed yet).
The following snippet detects if file someFile.txt in the current dir. is currently open elsewhere:
$isOpen = try {
    [IO.File]::Open("$PWD/someFile.txt", 'Open', 'Read', 'None').Close()
    $false # file NOT open elsewhere
}
catch {
    # See if the exception is a sharing / locking error, which indicates that the file is open.
    if (
        $_.Exception.InnerException -is [System.IO.IOException] -and
        ($_.Exception.InnerException.HResult -band 0x21) -in 0x21, 0x20
    ) {
        $true # file IS open elsewhere
    }
    else {
        Throw # unexpected error, relay the exception
    }
}
$isOpen # echo the result
Note the $PWD/ before someFile.txt, which explicitly prepends the path to the current directory so as to pass a full filename. This is necessary, because the .NET framework typically has a different current directory. Prepending $PWD/ doesn't work in all situations, however; you can read more about it and find a fully robust solution here.
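As an alternative sketch, you can resolve the full path on the PowerShell side before handing it to .NET (note that Convert-Path requires the item to already exist):

```powershell
# Resolve the PowerShell-relative path to a full filesystem path first,
# so the .NET call is not affected by .NET's own current directory.
$fullPath = Convert-Path -LiteralPath .\someFile.txt
[IO.File]::Open($fullPath, 'Open', 'Read', 'None').Close()
```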
The code tries to open the file for reading with an exclusive lock (a sharing mode of None), which fails if the file is currently open.
Note, however, that this only works if you have permission to at least read the file.
If you don't, the test cannot be performed, and the code relays the [System.UnauthorizedAccessException] that occurred; similarly, exceptions from other unexpected conditions are relayed, such as the specified file not existing ([System.IO.FileNotFoundException]).
[System.IO.IOException] can indicate a range of error conditions (and operator -is also matches derived classes such as [System.IO.FileNotFoundException]), so in order to specifically detect sharing/locking errors, the code must test the .HResult property of the exception for containing Win32 API error codes ERROR_SHARING_VIOLATION (0x20) or ERROR_LOCK_VIOLATION (0x21).
Taking a step back: If the intent is ultimately to process the file [content] yourself, it's better not to perform a separate test beforehand, because the file may get opened again between performing your test and your subsequent attempt to open it; instead, simply try to open the file yourself and handle any failure - see this (C#-based) answer.
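That open-and-handle-failure approach might look like the following sketch (someFile.txt is a hypothetical name; the processing step is left as a placeholder):

```powershell
# Sketch: open the file once, process it, and handle a locking failure directly,
# instead of testing first and opening later (which is racy).
try {
    $stream = [IO.File]::Open("$PWD/someFile.txt", 'Open', 'Read', 'None')
    try {
        # ... read from $stream here ...
    }
    finally {
        $stream.Close()
    }
}
catch [System.IO.IOException] {
    Write-Warning "Could not open the file (possibly locked): $($_.Exception.Message)"
}
```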

Related

Powershell compare command for 2 folders having corrupt folders

I am using the command below to compare 2 paths, and I get an error msg. when it gets to a folder that ends with a period in the name, i.e. "Folder123."
When I manually try to open those folders I get an error, so I think they are corrupt. How can I skip all folders that end with a period or at least ignore the errors so that my processing can finish?
Compare (Get-ChildItem -r Y:\Ftp\BFold\Final) (Get-ChildItem -r Y:\Dest\TFold\Temp)
You're getting that error because it's part of the Naming Files, Paths, and Namespaces limitations in Windows. One or several of the tools you're using are not able to handle this special case.
Do not end a file or directory name with a space or a period. Although the underlying file system may support such names, the Windows shell and user interface does not. However, it is acceptable to specify a period as the first character of a name. For example, ".temp".
You could either filter the list of folders or use the -ErrorAction parameter to change what happens on an error. Depending on what you're seeing, the error might already be purely cosmetic.
For filtering you could use Where-Object, for example with -notmatch ".*\.$".
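A minimal sketch of that filtering approach, reusing the paths from the question:

```powershell
# Sketch: drop items whose names end in a period before comparing,
# and silence errors from the unreadable folders.
$left  = Get-ChildItem -Recurse Y:\Ftp\BFold\Final -ErrorAction SilentlyContinue |
    Where-Object { $_.Name -notmatch '\.$' }
$right = Get-ChildItem -Recurse Y:\Dest\TFold\Temp -ErrorAction SilentlyContinue |
    Where-Object { $_.Name -notmatch '\.$' }
Compare-Object $left $right
```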

How should I write a Powershell script to execute a single program on multiple files?

I'm using Kalles' Fraktaler on Windows 10 to render images of the Mandelbrot set. Bundled with KF is a program to take a single parameter file and break it into multiple tiles for easier rendering.
The output for the tiling program is multiple files with the following naming scheme: name-0000-0000.kfr, name-0000-0000.kfs, where the name can be anything and the numbers increment as needed.
The .kfr files are the parameter files.
The .kfs files are the settings files.
After I have these generated parameter and setting files, I can execute KF on the command line with the following arguments:
kf.exe -s name-0000-0000.kfs -l name-0000-0000.kfr -p name-0000-0000.png
Doing this for every pair of parameter and setting files works perfectly fine, taking the input files and saving the render to name-0000-0000.png
I asked the developer for an example PowerShell script to automate the process for when there are dozens or more of the files that need to be rendered, and this is what he gave me. The script needs to be run from the same directory as the files are stored.
Get-ChildItem "." -Filter *.kfr |
Foreach-Object {
$kfr = $_.FullName
$kfs = $kfr.replace("kfr", "kfs")
$png = $kfr.replace("kfr", "png")
C:/path/to/kf.exe -s $kfs -l $kfr -p $png
}
Unfortunately, I've tried every variation of this script that I could think of, and nothing gives me any results. I have already allowed unsigned scripts to be run on my computer. I would greatly appreciate some help on this.
(PowerShell is nice and flexible - but only when you use it to invoke only PowerShell commands rather than running native executables. For example, to run a program in the current directory you need to prefix the program's name with ./ - ostensibly this is done for safety and I assume for similarity to Unix shells, but it's the first in a long list of gotchas for anyone wanting to use PowerShell for tasks that would be trivial in old-school batch files)
Anyway, you need to use Invoke-Command or Start-Process.
I've changed your script from using a piped expression into an easier-to-digest loop (and invoking .NET's Path.ChangeExtension directly because PowerShell's built-in string match-and-replace syntax is too arcane for me):
$kfrFiles = Get-ChildItem "." -Filter "*.kfr"
foreach ( $kfrFile in $kfrFiles ) {
$kfr = $kfrFile.Name
$kfs = [System.IO.Path]::ChangeExtension( $kfrFile.Name, "kfs" )
$png = [System.IO.Path]::ChangeExtension( $kfrFile.Name, "png" )
Start-Process -FilePath "C:\path\to\kf.exe" -ArgumentList "-s $kfs", "-l $kfr", "-p $png" -Wait
}
The -Wait option will wait for the kf.exe program to finish before starting the next instance - otherwise, if you have hundreds of .kfr files, you'll end up with hundreds of kf.exe processes running concurrently.
I don't know how to allow concurrent processes but impose a limit on the maximum-number of concurrent processes in PowerShell. It is possible, just complicated.
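One simple (if crude) sketch of such a throttle: poll the process list and only launch another instance when fewer than some limit are running. The limit of 4 and the process name "kf" are assumptions for illustration:

```powershell
# Sketch: allow up to 4 concurrent kf.exe instances (the limit is arbitrary).
$maxConcurrent = 4
Get-ChildItem "." -Filter *.kfr | ForEach-Object {
    # Wait until a slot frees up before launching the next render.
    while (@(Get-Process kf -ErrorAction SilentlyContinue).Count -ge $maxConcurrent) {
        Start-Sleep -Milliseconds 500
    }
    $kfr = $_.FullName
    $kfs = [System.IO.Path]::ChangeExtension($kfr, "kfs")
    $png = [System.IO.Path]::ChangeExtension($kfr, "png")
    Start-Process -FilePath "C:\path\to\kf.exe" -ArgumentList "-s $kfs", "-l $kfr", "-p $png"
}
```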

How to read a text file to a variable in batch and pass it as a parameter to a powershell script

I have a powershell script that generates a report, and I have connected it to an io.filesystemwatcher. I am trying to improve the error handling capability. I already have the report generation function (which only takes in a filepath) within a try-catch block that basically kills Word, Excel and PowerPoint and tries again if it fails. This seems to work well, but I want to embed in that another try-catch block that will restart the computer and generate the report after reboot if it fails a second consecutive time.
I decided to try and modify the registry after reading this article: https://cmatskas.com/configure-a-runonce-task-on-windows/
My plan would be, within the second try-catch block, to create a text file called RecoveredPath.txt with the file path as its only contents, and then add something like:
Set-ItemProperty "HKLM:\Software\Microsoft\Windows\CurrentVersion\RunOnce" -Name '!RecoverReport' -Value "C:\...EmergencyRecovery.bat"
Before rebooting. Within the batch file I have:
set /p RecoveredDir=<RecoveredPath.txt
powershell.exe -File C:\...Report.ps1 %RecoveredDir%
When I try to run the batch script, it doesn't yield any errors but doesn't seem to do anything. I tried adding in an echo statement and it is storing the value of the text file as a variable but doesn't seem to be passing it to powershell correctly. I also tried adding -Path %RecoveredDir% but that yielded an error (the param in report.ps1 is named $Path).
What am I doing incorrectly?
One potential problem is that not enclosing %RecoveredDir% in "..." would break with paths containing spaces and other special chars.
However, the bigger problem is that using mere file name RecoveredPath.txt means that the file is looked for in whatever the current directory happens to be.
In a comment you state that both the batch file and input file RecoveredPath.txt are located in your desktop folder.
However, it is not the batch file's location that matters, it's the process' current directory - and that is most likely not your desktop when your batch file auto-runs on startup.
Given that the batch file and the input file are in the same folder and that you can refer to a batch file's full folder path with %~dp0 (which includes a trailing \), modify your batch file to look as follows:
set /p RecoveredDir=<"%~dp0RecoveredPath.txt"
powershell.exe -File C:\...Report.ps1 "%RecoveredDir%"

checking to see if files are executable perl

I have a program that checks to see if the files in my directory are readable,writeable, and executable.
i have it set up so it looks like
if (-e $file){
print "exists";
}
if (-x $file){
print "executable";
}
and so on
but my issue is that when I run it, it shows that the text files are executable too. Plain text files with 1 word in them. I feel like there is an error. What did I do wrong? I am a complete Perl noob, so forgive me.
It is quite possible for a text file to be executable. It might not be particularly useful in many cases, but it's certainly possible.
In Unix (and your Mac is running a Unix-like operating system) the "executable" setting is just a flag that is set in the directory entry for a file. That flag can be set on or off for any file.
There are actually three of these permissions, which record whether you can read, write or execute a file. You can see these permissions by using the ls -l command in a terminal window (see man ls for more details of what the various ls options mean). There are probably ways to view these permissions in the Finder too (perhaps a "properties" menu item or something like that - I don't have a Mac handy to check).
You can change these permissions with the chmod ("change mode") command. See man chmod for details.
For more information about Unix file modes, see this Wikipedia article.
But whether or not a file is executable has nothing at all to do with its contents.
The statement if (-x $file) does not check whether a file is an executable, but whether your user has execute privileges on it.
For checking whether a file is executable or not, I'm afraid there isn't a magic method. You may try to use:
if (-T $file) for checking if the file has an ASCII or UTF-8 encoding.
if (-B $file) for checking if the file is binary.
If this is unsuitable for your case, consider the following:
Assuming you are on a Linux environment, note that every file can be executed. The question here is: will the execution of, e.g., test.txt throw a standard error (STDERR)?
Most likely, it will.
If test.txt file contains:
Some text
And you launch it in your Perl script with system("./test.txt");, this will display a STDERR like:
./test.txt: line 1: Some: command not found
If for some reason you are looking to run all the files in your directory (in a for loop, for instance), be warned that this is pretty dangerous, since you will launch all your files and you may not be willing to do so. Especially if the Perl script is in the same directory that you are checking (this will lead to undesirable script behaviour).
Hope it helps ;)

AgeStore Fails to Remove Expired Debug Symbol Files

I’m trying to use AgeStore to remove some expired symbol files. I’ve written a Powershell script in which the AgeStore command works sometimes, but, not always.
For example, my symbol store contains symbol files dating back to 2010. I’d like to clean out the “expired” symbols because they are no longer needed. To that end, I use the -date command line argument to specify “-date=10-01-2010”. Additionally, I use the “-l” switch to force AgeStore to
Causes AgeStore not to delete any files, but merely to list all the
files that would be deleted if this same command were run without the
-l option.
Here’s a snippet of the script code that runs…
$AgeStore = "$DebuggingToolsPath\AgeStore"
$asArgs = "`"$SymbolStorePath`" -date=$CutoffDate -s -y "
if ($WhatIf.IsPresent) { $asArgs += "-l" }
# determine size of the symbol store before delete operation.
Write-Verbose ">> Calculating current size of $SymbolStorePath before deletion.`n" -Verbose
">> $SymbolStorePath currently uses {0:0,0.00} GB`n" -f (((Get-ChildItem -R $SymbolStorePath | measure-object length -Sum ).Sum / 1GB))
Write-Verbose ">> Please wait...processing`n`n" -Verbose
& $AgeStore $asArgs
When the above code runs, it returns the following output…
processing all files last accessed before 10-01-2010 12:00 AM
0 bytes would be deleted
The program 'RemoveOldDebugSymbols.ps1: PowerShell Script' has exited
with code 0 (0x0).
I have verified that there are symbol files with dates earlier than “10-01-2010” in the symbol store. I’ve subsequently tried the same experiment with a different cutoff date, “11-01-2015” and the output indicates that there are several files it would have deleted, but, not those that are from 2010. I’m at a loss as to what may cause the discrepancy.
Has anyone tried to delete symbol files from a symbol store using AgeStore? If so, have you run into this problem? How did you resolve it?
I’ve tried to resolve this many different ways using AgeStore. For the sake of moving forward with a project, I’ve decided to rewrite the script to use the SymStore command with a delete transaction. Basically, I created a list of the debug symbol transactions that should be removed and wrote a loop that iterates over the list and deletes each entry one at a time.
Hope this is helpful for anyone who runs into the same problems.
EDIT: Per request....I cannot post the entire script, but, I used the following code in a loop as a replacement for the AgeStore command.
$ssArgs = ".\symstore.exe del /i $SymbolEntryTransactionID /s `"$SymbolStorePath`""
Invoke-Expression $ssArgs
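For reference, a hedged sketch of the loop described above, using the & call operator instead of Invoke-Expression (the $transactionIDs list and $SymbolStorePath are assumed to be built earlier in the script):

```powershell
# Sketch: delete each recorded symbol-store transaction one at a time.
# & avoids the string-building and re-parsing that Invoke-Expression requires.
foreach ($SymbolEntryTransactionID in $transactionIDs) {
    & .\symstore.exe del /i $SymbolEntryTransactionID /s "$SymbolStorePath"
}
```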