I have a script, let's call it A.ps1, that uses a workflow called Copy-Parallel (the workflow is defined at the beginning of the script). I created a GUI.ps1 script that lets me easily select between multiple scripts; it calls the desired script with Start-Process. Finally, I created a shortcut so I could run the GUI by double-clicking it.
All resources are located on a server, and the two scripts are in the same folder, called Res. The shortcut is located in the same folder as Res. When we use the shortcut, the script works fine for me, but other users get the error: "The term 'Copy-Parallel' is not recognized as the name of a cmdlet, function...".
We tried running A.ps1 directly, and it works fine for me and for the other users. We also tried running GUI.ps1 via a .bat file; the results were the same as with the shortcut.
A.ps1:
workflow Copy-Parallel {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        [array]
        $Files,
        [Parameter(Mandatory)]
        [string]
        $Destination
    )
    foreach -parallel -throttlelimit 20 ($File in $Files) {
        Copy-Item -LiteralPath $File -Destination $Destination -Force
    }
}
Copy-Parallel -Files $SourceFiles.FullName -Destination $DestinationPath
GUI.ps1 calls the script as follows:
$ScriptToRun = "$PSScriptRoot\A.ps1"
Start-Process PowerShell.exe -ArgumentList $ScriptToRun -NoNewWindow
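For comparison, a more explicit form of that call (a sketch under my own assumptions; the -NoProfile/-ExecutionPolicy/-File switches and the extra quoting are my additions, not part of the original GUI.ps1) would be:

# Hypothetical variant: pass the script via -File and quote the path, so a
# path that contains spaces is not split into separate arguments.
$ScriptToRun = Join-Path $PSScriptRoot 'A.ps1'
Start-Process -FilePath PowerShell.exe -ArgumentList '-NoProfile', '-ExecutionPolicy', 'Bypass', '-File', "`"$ScriptToRun`"" -NoNewWindow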
The target for the shortcut that starts the GUI is: %windir%\System32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -WindowStyle Hidden -NonInteractive -ExecutionPolicy bypass -File "Res\GUI.ps1"
Related
I have to find and then execute a .exe file from a script deployed by our asset management software. Currently it looks like this:
Set-Location $PSScriptRoot
$proc = (Start-Process -FilePath "C:\Program Files (x86)\software\software name\Uninstall.exe" -ArgumentList "/S /qn" -Wait -PassThru)
$proc.WaitForExit()
$ExitCode = $proc.ExitCode
Exit($ExitCode)
As far as I understand, the location of the file is hardcoded, and some users do not have it there, which is why it fails.
So I understand that you can search for a program with
Get-ChildItem "C:\Program Files (x86)\software\"
And execute it with Start-Process -FilePath.
But do I simply combine the two with a pipe (|), or is there an easier way, and will it even work?
As a commenter suggested, you can use Test-Path to test whether a path exists:
$uninstallPath = Join-Path ${env:ProgramFiles(x86)} 'software\software name\Uninstall.exe'
if( Test-Path $uninstallPath ) {
    $proc = Start-Process -FilePath $uninstallPath -ArgumentList '/S /qn' -Wait -PassThru
    $proc.WaitForExit()
    $ExitCode = $proc.ExitCode
    Exit $ExitCode
}
I've also made the code more robust by avoiding the hardcoded "Program Files (x86)" directory, using an environment variable. Because of the parentheses in the name of the env var, it must be enclosed in curly braces.
For added robustness, you may read the path of the uninstall program from the registry, as detailed by this Q&A. If you are lucky, the program even stores a QuietUninstallString in the registry, which gives you the full command line for silent uninstall.
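As a rough sketch of that registry approach (the 'software name' pattern below is a placeholder, and whether your product actually writes a QuietUninstallString is an assumption):

# Look up the product's uninstall entry under the standard uninstall keys
# (64-bit and 32-bit views); '*software name*' is a placeholder DisplayName filter.
$uninstallKeys = @(
    'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*'
    'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'
)
$entry = Get-ItemProperty -Path $uninstallKeys -ErrorAction SilentlyContinue |
    Where-Object DisplayName -Like '*software name*' |
    Select-Object -First 1

if( $entry -and $entry.QuietUninstallString ) {
    # QuietUninstallString already holds the complete silent uninstall command line
    Start-Process cmd.exe -ArgumentList '/c', $entry.QuietUninstallString -Wait
}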
I wrote a simple script to move files that contain a specific substring in the file name from one folder to another.
$Source = Get-ChildItem -Path "C:\source" -Filter *.xlsm
$Target = "C:\target"
$substring = 'copy'
foreach ($file in $Source) {
    if ($file.Name -match $substring) {
        Move-Item -Path $file.FullName -Destination $Target -Force
    }
}
I want to automate it on a VM. It works fine when I run it manually, and via Task Scheduler while I'm logged in to the VM; however, when I switch to 'Run whether user is logged on or not' in Task Scheduler (task properties), it won't work. I run it with the following parameters:
-noprofile -executionpolicy unrestricted -noninteractive -file "path to my script"
Any ideas?
When the option "Run whether user is logged on or not" is selected, the scheduled task runs in a different session.
This means it does not have access to your mapped network drives.
So you need to either map the drives in your script or use fully qualified UNC paths (i.e., \\server name\share name), as in the sketch below.
More details here as well
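As a sketch of the UNC-path option applied to the script above (\\server\share is a placeholder for your real server and share names):

# Hypothetical example: use UNC paths instead of mapped drive letters,
# so the task also works in the non-interactive scheduler session.
$Source    = Get-ChildItem -Path '\\server\share\source' -Filter *.xlsm
$Target    = '\\server\share\target'
$substring = 'copy'

foreach ($file in $Source) {
    if ($file.Name -match $substring) {
        Move-Item -Path $file.FullName -Destination $Target -Force
    }
}

Alternatively, the script could map the drive itself with New-PSDrive -Name X -PSProvider FileSystem -Root '\\server\share' before using X:\ paths. Either way, the account the task runs as needs permission on the share.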
I would like to copy the license folder and overwrite the existing folder. Since the destination is under Program Files (x86), I have to run an elevated PowerShell. I am able to copy it when I launch it manually; I'm just wondering whether it is possible to get it all to run in one line (all at once)? Really appreciated.
$net = new-object -ComObject WScript.Network
$net.MapNetworkDrive("R:", "\\roa\smdd\Software\Mest", $false)
Start-process Powershell.exe -ArgumentList " Copy-Item "R:\Licenses\" "C:\Program Files `(x86`)\Mest Research S.L\Mest\licenses"" -force -recurse -wait
You don't need to map a drive or invoke powershell.exe. Your code is already PowerShell, so you don't need to spin up a new copy of PowerShell just to run the Copy-Item cmdlet to copy files. You only need one PowerShell command:
Copy-Item "\\roa\smdd\Software\Mest\Licenses\*" "${Env:ProgramFiles(x86)}\Mest Research S.L\Mest\licenses" -Force -Recurse
Note that you will likely need to open PowerShell as administrator (elevated) to be able to copy items into that directory.
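If you want the command to fail fast when the session is not elevated, a small guard like this could be added first (a sketch, not part of the original answer):

# Hypothetical guard: verify the current session is elevated before copying.
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Write-Warning 'Please run this from an elevated PowerShell session.'
    return
}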
What is the best way to execute a .ps1 script?
Suppose I'm a user who has no idea about PowerShell, but I want to just double-click the script and let it run.
Could you please advise on the best practice?
Thank you
[CmdletBinding()]
Param(
    [Parameter(Mandatory = $False,
        ValueFromPipeline = $True,
        ValueFromPipelineByPropertyName = $True,
        HelpMessage = "Provide Files Path.")]
    [string]$FilesPath = 'C:\Users\myusername\OneDrive - mycompany\mycompany\Sales & Marketing\Sales Reports',

    [Parameter(Mandatory = $False,
        ValueFromPipeline = $True,
        ValueFromPipelineByPropertyName = $True,
        HelpMessage = "Provide Sheet Name.")]
    [string]$SheetName = 'Expenses'
)

Try
{
    $Files = Get-ChildItem -Path $FilesPath -Include *.xlsx, *.xls, *.xlsm -Recurse
    $Counter = $Files.Count
    $Array = @()
    $OutPutFilePath = (Join-Path $FilesPath -ChildPath "Exported-ExcelData.csv")
    Remove-Item -Path $OutPutFilePath -Force -ErrorAction SilentlyContinue
    ForEach ($File In $Files)
    {
        Write-Verbose -Message "Accessing File $($File.Name) and Exporting Data from Sheet $SheetName. Remaining $Counter Files." -Verbose
        $Counter -= 1
        $AllData = Import-Excel -Path $File.FullName -WorksheetName $SheetName -NoHeader
        $i = 0
        ForEach ($Data In $AllData)
        {
            $ArrayData = "" | Select-Object "P1", "P2", "P3", "P4", "P5", "P6"
            $ArrayData.P1 = $Data[0].P1
            $ArrayData.P2 = $Data[0].P2
            $ArrayData.P3 = $Data[0].P3
            $ArrayData.P4 = $File.Name
            $ArrayData.P5 = $File.FullName
            $ArrayData.P6 = ($i += 1)
            $Array += $ArrayData
        }
    }
    $Array | Export-Csv -Path $OutPutFilePath -Append -NoTypeInformation
}
Catch
{
    $ErrorLog = "Error On " + (Get-Date) + ";$($_.Exception.Message) - Line Number: $($_.InvocationInfo.ScriptLineNumber)"
    Write-Error "$($_.Exception.Message) - Line Number: $($_.InvocationInfo.ScriptLineNumber)"
}
Finally
{
    Write-Host "Process has been completed!" -ForegroundColor Green
    Read-Host "Press any key to continue..."
}
Create a batch file.
Launching your PS1 file directly can be problematic because of the PowerShell execution policy, and .ps1 scripts won't launch on double-click by default on a Windows machine.
However, if you create a batch file that references your PS1 script, then you're all set.
The -ExecutionPolicy Bypass switch ignores the current system policy, so your script can run without issues.
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& 'C:\Users\SO\Desktop\YourScript.ps1'"
Below are two batch commands using a relative path (%mypath% holds the batch file's own folder, taken from %~dp0; the :~0,-1 part strips the trailing backslash):

set mypath=%~dp0

:: Launch script
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& '%mypath:~0,-1%\data\install.ps1'"

:: Same thing but force the script to run as admin
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& {Start-Process PowerShell -ArgumentList '-NoProfile -ExecutionPolicy Bypass -NoExit -File ""%mypath:~0,-1%\data\install.ps1""' -Verb RunAs}"
In these two variants, the PS1 I am launching is in a subfolder called data, and the script is called install.ps1.
Therefore, I do not have to hardcode the script path in the batch file (you could also put the PS1 at the same level as the batch file). In my example, I actually wanted only the batch file at the first folder level and everything else "hidden from view" in a subfolder, so the user does not have to think about which file to execute.
I would recommend compiling the script; I believe that is the only way to reliably double-click a script. There are other ways, but they require weakening security and, partly as a result of this, are hard to run across different systems.
The easiest way I've found to compile PowerShell is by using PowerGUI. You can paste your script and compile a portable .exe file that can be double-clicked. It doesn't require elevated permissions. Their website no longer hosts it, so you will need to find a third-party download.
You can also find a Visual Studio extension or one of Sapien's products; however, both cost money and aren't any better for just creating a basic double-clickable script.
Additionally, you will need to modify your script to prompt for its input; assuming you wrote the code, that should be easy to do. Basically, the script should never require you to type anything it hasn't directly asked for, as if you were clicking Run in the ISE.
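As a rough illustration of that kind of prompting (a sketch with made-up prompts, not from the original answer):

# Hypothetical sketch: ask interactively for everything the script needs,
# so a double-clicked (or compiled) copy never expects command-line arguments.
$FilesPath = Read-Host 'Folder that contains the Excel files'
$SheetName = Read-Host 'Worksheet name (for example, Expenses)'
Write-Host "Processing '$FilesPath' (sheet '$SheetName')..." -ForegroundColor Green
# ...the rest of the script would use $FilesPath and $SheetName as before...
Read-Host 'Press Enter to exit'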
We have field devices, and we decided to use a PowerShell script to help us handle updates in the future. It runs every 5 minutes and executes rsync to see if it should download any new files. If the script sees any file of type .ps1, .exe, .bat, etc., it attempts to execute it using the & operator. At the conclusion of execution, the script writes the executed file's name to an excludes file (so that rsync will not download it again) and removes the file. My problem is that the return from the executed code (called by &) behaves differently depending on how the main script is called.
This is the main 'guts' of the script:
Get-ChildItem -Path $ScriptDir\Installs\* -Include @("*.ps1","*.exe","*.cmd","*.VBS","*.MSI") | ForEach {
    Write-Verbose "Executing: $_"
    & $_
    $CommandName = Split-Path -Leaf $_
    Write-Verbose "Adding $CommandName to rsync excludes"
    Write-Output "$CommandName" | Out-File -FilePath $ScriptDir\excludes -Append -Encoding ASCII
    Write-Verbose "Deleting '$_'"
    Remove-Item $_
}
When invoking powershell (powershell.exe -ExecutionPolicy bypass) and then executing the script (.\Update.ps1 -Verbose), the script runs perfectly (i.e. the file is written to excludes and deleted) and you can see the verbose output (writing and deleting).
If you instead run the following (similar to how Task Scheduler launches it): powershell.exe -ExecutionPolicy bypass -NoProfile -File "C:\Update.ps1" -Verbose, you can see the new script get executed, but none of the steps afterwards run (i.e. no adding to excludes, no removing of the file, and no verbose output).
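One way to make the loop independent of whatever the child code does (a sketch based on my own assumptions, not taken from the post) is to run each install in its own child process instead of invoking it with &:

# Hypothetical sketch: run each installer in a separate process so an 'exit'
# or failure inside it cannot stop the surrounding update loop.
Get-ChildItem -Path $ScriptDir\Installs\* -Include @("*.ps1","*.exe","*.cmd","*.VBS","*.MSI") | ForEach-Object {
    $file = $_
    Write-Verbose "Executing: $file"
    try {
        if ($file.Extension -eq '.ps1') {
            Start-Process powershell.exe -Wait -NoNewWindow -ArgumentList '-ExecutionPolicy', 'Bypass', '-File', "`"$($file.FullName)`""
        }
        else {
            Start-Process -FilePath $file.FullName -Wait
        }
    }
    catch {
        Write-Warning "Failed to run $($file.Name): $_"
    }
    $CommandName = $file.Name
    Write-Verbose "Adding $CommandName to rsync excludes"
    Add-Content -Path "$ScriptDir\excludes" -Value $CommandName -Encoding ASCII
    Write-Verbose "Deleting '$file'"
    Remove-Item $file.FullName
}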