Creating a CSV file as an object - PowerShell

I have a robust script which gets, parses, and uses some data from a .csv file. To run the script I can use
.\script.ps1 -d data_file.csv
The thing is, I cannot modify the script itself, which is why I need to create some kind of wrapper that will create a new CSV file and run script.ps1 with the newly made file. I am wondering if it is possible to create a CSV file as an object that can be passed directly to the command, like this:
.\script.ps1 -d csv_file_as_object.csv
without creating a file in some directory on disk.

What you'd need in this case is the equivalent of Bash's process substitution (<(...)), which, in a nutshell, presents a command's output as the content of a temporary file and expands to that file's path:
.\script.ps1 -d <(... | ConvertTo-Csv) # !! does NOT work in PowerShell
Note: ... | ConvertTo-Csv stands for whatever command is needed to transform the original CSV in memory.
No such feature exists in PowerShell as of Windows PowerShell v5.1 / PowerShell Core v6.1, but it has been proposed.
If .\script.ps1 happens to also accept stdin input (via the pseudo-path -, which by convention indicates stdin), you could try:
... | ConvertTo-Csv | .\script.ps1 -d -
Otherwise, your only option is to:
save your modified CSV data to a temporary file
pass that temporary file's path to .\script.ps1
remove the temporary file.
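The three steps above can be sketched as follows. Everything here is a stand-in so the sketch is runnable: the sample data_file.csv, the stub script.ps1 (which just echoes the path it was given), and the ForEach-Object pass-through that takes the place of the real in-memory transformation:

```powershell
# Stand-ins so the sketch is runnable: a sample input CSV and a stub script.ps1.
Set-Content data_file.csv "Name,Value`na,1"
Set-Content script.ps1 'param($d) "script.ps1 received: $d"'

$tmp = New-TemporaryFile
try {
    Import-Csv data_file.csv |
        ForEach-Object { $_ } |             # stand-in for the real in-memory transformation
        Export-Csv -NoTypeInformation -LiteralPath $tmp.FullName
    .\script.ps1 -d $tmp.FullName           # the unmodifiable script sees a real file path
}
finally {
    Remove-Item -LiteralPath $tmp.FullName  # clean up the temporary file
}
```

The try / finally ensures the temporary file is removed even if script.ps1 throws.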

Related

What does "Supply values for the following parameters" mean in the command line?

So inside of my terminal, I created a text file inside a directory (cat > fnames.txt). My initial goal was to write some data into said file. After creating fnames.txt, the following information showed up after trying to append data to the file using (cat >> fnames.txt):
cmdlet Get-Content at command pipeline position 1
Supply values for the following parameters:
Path[0]:
Does anyone know the reason for this and what it means?
In PowerShell, cat is an alias for Get-Content, a cmdlet that reads data from a location. The prompt means that Get-Content does not know where you want it to read data from, so it asks you for a path.
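To actually write to a file in PowerShell, the analog of the intended cat > / cat >> is Set-Content / Add-Content (or the > and >> redirection operators); a minimal sketch:

```powershell
# cat (Get-Content) only reads; use Set-Content / Add-Content to write.
Set-Content fnames.txt 'first line'    # create (or overwrite) the file
Add-Content fnames.txt 'second line'   # append, like >> in Bash
Get-Content fnames.txt                 # this is what cat fnames.txt actually runs
```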

Powershell ps2exe config variable

I would like to store a variable in a config file for a .ps1 (PowerShell script) converted to a .exe using ps2exe:
$LnkPath="...\" #Directory Path of .lnk
$LnkFile="\nnnn.lnk" #name of .lnk file
Invoke-Item ($LnkPath + $LnkFile)
I was hoping to have $LnkFile and $LnkPath as config file variables so if the version of the lnk stops working, i can just point to a new lnk.
There is a reason why the version of the .lnk file stops working, but it is complicated, and not worth anyone's time.
edit:
The config file created with the optional -configFile switch isn't meant for use by the wrapped script - it merely contains runtime metadata for the generated executable (in the form of an XML file placed alongside the executable with additional extension .config).
However, you can create your own config file.
While PowerShell has a configuration-data format that uses hashtable-literal syntax, which can be read with Import-PowerShellDataFile, as of PowerShell 7.2.x there is no way to create this format programmatically.
A simple alternative that supports both reading and programmatic creation is to use a JSON file:
The following assumes that your script file is foo.ps1, to be converted to foo.exe, with a configuration file foo.json located in the same directory (which you'll have to bundle with your .exe file when deploying it):
First, create your JSON config file:
@{ LnkPath = '...\'; LnkFile = 'nnnn.lnk' } | ConvertTo-Json > foo.json
Now you can read this file from foo.ps1 / foo.exe as follows:
# Determine this script's / executable's full path.
$scriptOrExePath =
    if ($PSCommandPath) { # Running as .ps1
        $PSCommandPath
    } else {              # Running as .exe
        Convert-Path ([Environment]::GetCommandLineArgs()[0])
    }

# Look for the JSON config file in the same directory as this script /
# executable, load it, and parse it into an object.
$config =
    Get-Content -Raw ([IO.Path]::ChangeExtension($scriptOrExePath, '.json')) |
        ConvertFrom-Json

# $config is now an object with .LnkPath and .LnkFile properties.
$config # Output for diagnostic purposes.
Note the need to use [Environment]::GetCommandLineArgs() to determine the executable path when running as an .exe file, because the usual automatic variables indicating the script path ($PSCommandPath) and script directory ($PSScriptRoot) aren't available then.
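As a usage sketch, the loaded $config can then drive the Invoke-Item call from the question. The inline JSON and the C:\Shortcuts path below are made-up stand-ins for the real foo.json contents:

```powershell
# $config stands in for the object parsed from foo.json; the values are made up.
$config = '{ "LnkPath": "C:\\Shortcuts", "LnkFile": "nnnn.lnk" }' | ConvertFrom-Json
$lnkFullPath = Join-Path $config.LnkPath $config.LnkFile
$lnkFullPath
# Invoke-Item $lnkFullPath   # commented out: the shortcut path is hypothetical
```

Join-Path combines the directory and file name, so the config no longer needs a leading backslash baked into LnkFile.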

How to read a text file to a variable in batch and pass it as a parameter to a powershell script

I have a PowerShell script that generates a report, and I have connected it to an IO.FileSystemWatcher. I am trying to improve the error-handling capability. I already have the report-generation function (which only takes in a file path) within a try-catch block that basically kills Word, Excel, and PowerPoint and tries again if it fails. This seems to work well, but I want to embed within that another try-catch block that will restart the computer and generate the report after reboot if it fails a second consecutive time.
I decided to try and modify the registry after reading this article: https://cmatskas.com/configure-a-runonce-task-on-windows/
My plan is that, within the second try-catch block, I will create a text file called RecoveredPath.txt with the file path as its only contents, and then add something like:
Set-ItemProperty "HKLM:\Software\Microsoft\Windows\CurrentVersion\RunOnce" -Name '!RecoverReport' -Value "C:\...EmergencyRecovery.bat"
Before rebooting. Within the batch file I have:
set /p RecoveredDir=<RecoveredPath.txt
powershell.exe -File C:\...Report.ps1 %RecoveredDir%
When I try to run the batch script, it doesn't yield any errors but doesn't seem to do anything. I tried adding in an echo statement and it is storing the value of the text file as a variable but doesn't seem to be passing it to powershell correctly. I also tried adding -Path %RecoveredDir% but that yielded an error (the param in report.ps1 is named $Path).
What am I doing incorrectly?
One potential problem is that not enclosing %RecoveredDir% in "..." breaks with paths containing spaces and other special characters.
However, the bigger problem is that using the mere file name RecoveredPath.txt means that the file is looked for in whatever the current directory happens to be.
In a comment you state that both the batch file and the input file RecoveredPath.txt are located in your desktop folder.
However, it is not the batch file's location that matters, it is the process's current directory - and that is most likely not your desktop when your batch file auto-runs on startup.
Given that the batch file and the input file are in the same folder and that you can refer to a batch file's full folder path with %~dp0 (which includes a trailing \), modify your batch file to look as follows:
set /p RecoveredDir=<"%~dp0RecoveredPath.txt"
powershell.exe -File C:\...Report.ps1 "%RecoveredDir%"
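For that call to work, Report.ps1's $Path parameter has to accept the bare argument positionally. A minimal sketch of what the question implies the script looks like (only the parameter name $Path is known from the question; the script body and file location here are stand-ins, written to a local file so the binding can be exercised):

```powershell
# Stand-in Report.ps1: a mandatory, positional $Path parameter.
Set-Content Report.ps1 @'
param(
    [Parameter(Mandatory, Position = 0)]
    [string] $Path
)
"Generating report for: $Path"
'@

# A bare argument binds to -Path positionally, as the batch file passes it:
.\Report.ps1 'C:\Reports\input.xlsx'
```

Without Position = 0 (or CmdletBinding's default positional binding), a bare argument would fail to bind and -Path would have to be spelled out.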

How can we use the "ls" command to append a file list from FTP to a local file?

I'm using the command ls *Pattern*Date* Files.txt to get a list of files from FTP into my local text file.
I'm now required to get multiple patterned files (the Date and Pattern may come in a different order). So when I try to add another line, ls *Date*Pattern* Files.txt, this clears Files.txt and I'm not able to get the first set of files.
Is there any command that can append the list of files rather than creating new files list?
You cannot append the listing to a file with ftp.
But you can merge multiple listings in PowerShell. I assume that you run ftp from PowerShell, based on your use of the powershell tag.
In ftp script do:
ls *Pattern*Date* files1.txt
ls *Date*Pattern* files2.txt
And then in PowerShell do:
Get-Content files1.txt,files2.txt | Set-Content files.txt
(based on How do I concatenate two text files in PowerShell?)
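If the two patterns can match the same file, duplicates can also be dropped while merging; a sketch with stand-in listing files (the file names below are made up, taking the place of the two ftp listings):

```powershell
# files1.txt / files2.txt stand in for the two ftp listings from the script above.
Set-Content files1.txt 'a.dat', 'b.dat'
Set-Content files2.txt 'b.dat', 'c.dat'

# Merge both listings, keeping each file name only once.
Get-Content files1.txt, files2.txt | Sort-Object -Unique | Set-Content files.txt
Get-Content files.txt   # a.dat, b.dat, c.dat - each listed once
```

Note that Sort-Object -Unique also reorders the list; if the original order matters, use Select-Object -Unique instead.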

Compressing to tar.xz using 7-zip through a pipe on windows

My command line is this (powershell):
$7z ="`"c:\Program Files\7-Zip\7z.exe`""
&$7z a -r -ttar -bd -so . | &$7z a -r -txz -bd $archive -si
The produced archive file indeed contains a tar file, but that tar file is corrupt.
Note, that breaking the pipe into two commands works correctly:
&$7z a -r -ttar -bd ${archive}.tmp .
&$7z a -r -txz -bd $archive ${archive}.tmp
The produced archive is perfectly valid.
So, what is wrong with my pipeline?
(I am using Powershell)
Nothing is wrong with your pipeline; it is the way the pipeline works that is causing the error.
The PowerShell pipeline works asynchronously: the output of the first command is available to the second command immediately, one object at a time, even if the first has not finished executing. See here.
Unix and PowerShell pipes operate in the same way in that respect; the difference is in what they pass between commands.
Unix passes strings between the commands, whereas a PowerShell pipe passes full-fledged .NET objects between commands. This difference in the data type being passed is why it works on Unix and not in PowerShell: if 7z.exe cannot handle these .NET objects correctly, the files become corrupt. See here.
Try adding | %{ "$_" } in between the pipes like
&$7z a -r -ttar -bd -so . | %{ "$_" } | &$7z a -r -txz -bd $archive -si
The point is that the second call to 7z expects unmodified data on STDIN, but PowerShell is converting the output from the first call to 7z to (multiple) (string) objects. % is an alias for foreach-object, so what the additional command does is to loop over each object and convert it to a plain string before passing it on to the second call to 7z.
Edit: Reading through PowerShell’s Object Pipeline Corrupts Piped Binary Data it looks to me now as if my suggestion would not work, and there's also no way to fix it. Well, other than wrapping the whole pipeline into a cmd /c "..." call to make cmd and not PowerShell handle the pipeline.
Edit 2: I also tried this solution from the PowerShell Cookbook, but it was very slow.
In the end, I created a .cmd script with the 7z pipes that I'm calling from my PowerShell script.
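That final workaround can be sketched like this. The helper file name make-archive.cmd and the 7-Zip install path are assumptions, and the actual invocation is commented out since it requires 7-Zip to be installed:

```powershell
# Write a small .cmd helper so cmd.exe, not PowerShell, owns the binary pipe.
@'
@echo off
"c:\Program Files\7-Zip\7z.exe" a -r -ttar -bd -so . | "c:\Program Files\7-Zip\7z.exe" a -r -txz -bd %1 -si
'@ | Set-Content make-archive.cmd

# Then, from PowerShell (requires 7-Zip; %1 is the target archive name):
# .\make-archive.cmd out.tar.xz
```

Because the pipe lives inside the .cmd file, the raw byte stream between the two 7z calls never passes through PowerShell's object pipeline.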