Pass a TeamCity Parameter to a PowerShell file - powershell

I have a version parameter defined in TeamCity.
I want to pass this parameter into a PowerShell script I have (which will update an XML file with the version number).
But this inserts the literal text %version% into the script; no substitution is made with the actual value of the parameter.
However, I know the script itself works, because it runs correctly if I hardcode the values.
Is there a way to get %version% to convert to the actual value when used as a PowerShell script argument?

If you put the parameter in quotes, "%version%", and change the script execution mode to Execute ps1 script with "-File" argument, then the parameter should resolve and be injected correctly, e.g.:
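A rough sketch of the idea (the script name Update-Version.ps1, the -Version parameter and the XML path are illustrative, not from the question). In the build step, point Script file to the .ps1 and set Script arguments to -Version "%version%":

# Update-Version.ps1 - illustrative script invoked by the TeamCity build step
param(
    [Parameter(Mandatory = $true)]
    [string]$Version
)

# Update the version attribute in an XML file (path is an assumption)
$xmlPath = 'C:\Build\version.xml'
[xml]$xml = Get-Content -Path $xmlPath -Raw
$xml.DocumentElement.SetAttribute('version', $Version)
$xml.Save($xmlPath)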
Hope this helps

You need to define the parameter as an environment variable (the env. prefix); that works for me.
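A minimal sketch of how the script can then read the value (assuming the parameter is defined as env.version, so TeamCity exposes it to the build step as an environment variable):

# Read the TeamCity env.version parameter from the environment
$version = $env:version
Write-Host "Updating files to version $version"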

Related

How to pass parameters to a Powershell script with UiPath

I am trying to run a PowerShell script with UiPath. I have found two approaches to run it:
Read the script as text and pass it to the Invoke Power Shell activity
Use the Run PowerShell Script activity from the Script Activities package
Now I need to pass arguments to the PowerShell script from UiPath. Some have mentioned formatting the string with parameters and then invoking the script.
But rather than that, I think we can pass parameters directly from UiPath.
Run PowerShell Script has Parameters as an input
Invoke Power Shell has Parameters under the Input section and PowerShellVariables under Misc
I have been searching for a while, but I am still unable to figure out how to pass parameters using the above activities.
I am trying to send Outlook mails using PowerShell, and I am still learning how to work with it. Please help.
EDIT
Found solution for one approach. Added it as an answer.
Found an answer from this link. Of the two approaches, it provides a solution for passing parameters using the Invoke Power Shell activity.
Read the .ps1/.txt file containing the script with the Read Text File activity.
Call the Invoke Power Shell activity, passing the file content as CommandText and the parameters in the Parameters collection (see the sketch below for what the script side might look like).
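A minimal sketch of the script side under these assumptions (the file name SendMail.ps1 and the parameter names To, Subject and Body are illustrative, not from the question); the names you add to the activity's Parameters collection should match the names declared in the script's param block:

# SendMail.ps1 - illustrative script read with Read Text File and passed as CommandText
param(
    [string]$To,
    [string]$Subject,
    [string]$Body
)

# Send a mail through the local Outlook application via COM
$outlook = New-Object -ComObject Outlook.Application
$mail = $outlook.CreateItem(0)   # 0 = olMailItem
$mail.To = $To
$mail.Subject = $Subject
$mail.Body = $Body
$mail.Send()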
But it would be great if I could also find out how to pass parameters using the Run PowerShell Script activity.

How to pass arguments from UFT to command line

I am trying to run tests in UFT by running a .vbs file, and I am also passing arguments through the command line. The .vbs file reads the arguments and sets UFT environment variables, so I can read them inside UFT.
qtApp.Test.Environment.Value("First_Argument") = WScript.Arguments.Item(0)
qtApp.Test.Environment.Value("Second_Argument") = WScript.Arguments.Item(1)
After that, I want to get a number back from UFT as an output, because I will pass that output to the next command on the command line.
The TestParameters object can be one way; it is covered in more detail in the Automation Object Model documentation.
You will have to define the test parameters of the test case from the UFT IDE (manually); there is no way to define them automatically. If you declare them as in/out parameters and change their value as part of the test case, you will be able to read them afterwards from the .vbs script (do not open a new test case until you have read out the values you need).
Although this is a working (and standard) way of exchanging parameters between the driver script and the test automation tool (UFT), I would advise you to use a simple file-based approach instead; managing test parameters can be very time consuming.
Tell the test via an environment variable the path of the XML/JSON or plain text file where you expect the results to be written, and when the test is done, read the content of that file (assuming the test writes into it).
The plain old file-based approach should not be underestimated, especially in such circumstances.

How would I run a Lua script with user specified parameters from inside another Lua script?

How would I run a Lua script with user specified parameters from inside another Lua script?
Would the code below work? Here "content_image" is my specified input image (either saved to an image file or still in the script) for the "deepdream.lua" script, and "output_image" is the output from "deepdream.lua" that I want to use in my Lua script.
dofile("deepdream.lua -content_image content_image -output_image output_image")
The script I am seeking to run within another Lua script can be found here: https://github.com/bamos/dream-art/blob/master/deepdream.lua
If you want to load and execute a script by passing it a number of parameters, you have to do this by... loading the script and executing it by passing it a number of parameters:
local chunk = loadfile("deepdream.lua")
chunk("-content_image", "content_image", "-output_image", "output_image")
Note that this will not fill in the arg table the way lua.exe does; it will pass the parameters as variadic parameters, just like any other Lua function. Also, unlike running lua.exe, the script will be executed in the current process, so it can mess with your globals and so forth, and if it errors out, the error will have to be handled by you.
If you want, it wouldn't be difficult at all to write a function that takes the string you provided, uses Lua patterns to parse parameters and so forth, and then loads the script with those parameters.
If you want to execute a script exactly as if you had used lua.exe on it, then you would just use os.execute:
os.execute("lua.exe deepdream.lua -content_image content_image -output_image output_image")
You can use loadfile and pass the parameters as a table, which the script can collect from its variadic arguments:
loadfile("deepdream.lua")({content_image="content_image",output_image="output_image"})
in deepdream.lua:
local arg={...}
local content_image = arg[1].content_image
local output_image = arg[1].output_image

Run a PowerShell script from another one

What is the best and correct way to run a PowerShell script from another one?
I have a script a.ps1 from which I want to call b.ps1 which does different task.
Let me know your suggestions. Is dot sourcing the best option here?
Dot sourcing will run the second script as if it were part of the caller: all script-scope changes will affect the caller. If this is what you want, then dot source.
However, it is more usual to call the other script as if it were a function (a script can use param and function-level attributes just like a function). In many ways a script is a PowerShell function, with the name of the file taking the place of the function name; see the sketch below.
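A minimal sketch of both styles, using the a.ps1/b.ps1 names from the question (the -Environment parameter is illustrative):

# b.ps1 - the script that does the other task
param(
    [string]$Environment = 'Dev'
)
Write-Host "Doing the other task for $Environment"

# a.ps1
# Call b.ps1 like a function: it runs in its own scope and arguments bind to its param block
& "$PSScriptRoot\b.ps1" -Environment 'Prod'

# Dot source b.ps1: it runs in the caller's scope, so its variables and functions remain visible in a.ps1
. "$PSScriptRoot\b.ps1"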
Dot sourcing makes it easier to convert your script(s) into a module at a later stage; you won't have to change the script(s) into functions.
Another advantage of dot sourcing is that you can add the functions to your shell by adding the file that holds them to Microsoft.PowerShell_profile.ps1, meaning you have them available at all times (eliminating the need to worry about paths etc.).
I have a short Write-Host at the top of my dot-sourced files with the name of the function and its common parameters, and I dot source the functions in my profile. Each time I open PowerShell, the list of functions in my profile scrolls by (if, like me, you frequently forget the exact names of your functions/files, you'll appreciate this as the number of functions starts to pile up over time).
Old but still relevant.
I work with modules using Import-Module; this imports the module into the current PowerShell session.
To avoid caching and to always pick up the latest changes to the module, I first run Get-Module | Remove-Module, which clears all modules loaded in the current session.
Get-Module | Remove-Module
Import-Module '.\IIS\Functions.psm1'

How to add a script to the standard path in PowerShell?

How can I add a custom function/object to the standard set of recognized functions in PowerShell so that I can call it from the PowerShell shell?
Thanks
You can put the function into your profile script. You can find out where this is by looking at the variable $profile. That script will be run automatically when PowerShell starts (if you are allowed to run scripts), and functions declared in it will be available in every session.
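A minimal sketch (the function name Get-Greeting is just an example, not a built-in):

# Create the profile file if it does not exist yet
if (-not (Test-Path $profile)) { New-Item -ItemType File -Path $profile -Force }

# Append a function to the profile; it will be defined in every new session
Add-Content -Path $profile -Value @'
function Get-Greeting {
    param([string]$Name = 'world')
    "Hello, $Name!"
}
'@

After restarting PowerShell (or dot sourcing $profile in the current session), Get-Greeting can be called directly from the shell.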