I wrote a PowerShell module in C# which allows me to call some web services.
Before I can use them, I have to specify where the service is located. For example:
Set-ServiceUrl 'http://myService.cloudapp.net/'
Create-Something ...
Get-Something ...
The test functions Create-Something and Get-Something access a static variable "serviceurl" which is set within the Set-ServiceUrl function.
This is similar to the Azure module where you have to call Set-AzureSubscription before you can query the services.
The difference between my module and the Azure module is that I have to specify the service URL each time I start PowerShell, whereas I have to set the Azure subscription only once.
Is there any mechanism in PowerShell to store such information "forever"? Or do I have to use a file-based, registry, or environment-variable solution? What would you suggest?
PowerShell has no built-in method for persisting data across sessions; you must come up with a way to store & retrieve it yourself.
One way would be to store it in a global variable and check it from other functions or even assign the value of the variable to the required parameter.
There's no "forever" mechanism, even the Azure team saves their information to disk.
Related
This is a question about using PowerShell with Custom Commands (or scheduled tasks) in the Adaxes Active Directory management software by Softerra.
I am trying to accept a parameter from a user when using a custom command, then I need to take that value and modify it for use in a future action of the custom command.
A "for example" use-case would be creating a script that sets a user's out of office, where the custom command takes a target user reference in the out of office message. The first action in the custom command would find the email address of the provided user, then the second action would set the out-of-office with a message telling recipients for immediate assistance to email the provided user's email address. I realize there may be ways to solve this with one PowerShell script, but there are MANY scenarios where it would be beneficial to process provided information with a script action for use with MULTIPLE future actions in the custom command.
I already know how to access parameter values in custom commands for Softerra Adaxes, but I can't figure out how to WRITE to parameter values.
Accessing values:
$context.GetParameterValue('param-Example')
Does anyone know how to write TO parameter values? $context.SetParameterValue() does not work. This would be extremely useful for being able to store and manipulate values between actions in custom commands in Adaxes.
If anyone is looking for something similar, the answer I got from Adaxes support was that there is no means to do this currently with their software.
The only work-around would be writing to a property of the object being modified, then referencing that property later.
For instance, writing a value to the extensionAttribute1 property of a user, then referencing that later in the script in a different action.
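For example, a minimal sketch using the standard ActiveDirectory cmdlets (the identity, attribute, and value are placeholders; an Adaxes script action would typically work against its own context object instead):

# First action: stash the computed value on the object being modified.
Set-ADUser -Identity 'jdoe' -Replace @{ extensionAttribute1 = 'assistant@example.com' }

# Later action: read the stashed value back and use it.
$stashed = (Get-ADUser -Identity 'jdoe' -Properties extensionAttribute1).extensionAttribute1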
If anyone comes up with a better solution or Adaxes changes this, please feel free to suggest a better solution!
I'm currently working on a module in PowerShell which uses a standard REST API in the background. For that, I wrote a Connect-Server cmdlet that retrieves an auth key for later calls.
My question is: Is there any best practice regarding sharing the data with other cmdlets? I know I could easily just return it from the Connect function and pass it to the following cmdlet, but that's not what I'm looking for.
Until now, I've been using global variables for that exchange of data. But as I've read in some best practice guidelines you should try not to pollute the global scope.
Other solutions I've seen use Get and Set cmdlets, but I don't think that's the best PowerShell way of doing it.
So are there any other ways of solving that?
The normal way is to return data from one cmdlet and store it in a variable or forward it to the pipeline. Another way of sharing data might be serializing it (ConvertTo-Json, ConvertTo-Csv, ...) to a file (located e.g. in $env:TEMP, or created via New-TemporaryFile) and deserializing it back again in another cmdlet (at the cost of disk I/O). Personally, I always store the result in a variable for later usage and inject it into the next cmdlet (or use the pipeline).
Using global variables is not the best idea, since it hides which values your cmdlet/function actually depends on.
So, as the guys at PoshCode stated, the best way to do such a thing is using a variable in script scope, as it is available to all cmdlets in the module but not visible to users.
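A minimal sketch of that pattern, assuming a Connect-Server/Get-Something pair (the endpoint paths and auth header are illustrative):

# Inside the module's .psm1 -- $script: scope is shared by every function in the module,
# but is not visible to callers.
$script:AuthKey = $null
$script:Server  = $null

function Connect-Server {
    param(
        [Parameter(Mandatory)][string]$Server,
        [Parameter(Mandatory)][pscredential]$Credential
    )
    $script:Server  = $Server
    # Illustrative auth call; replace with the real login endpoint.
    $script:AuthKey = Invoke-RestMethod -Uri "$Server/api/login" -Method Post -Credential $Credential
}

function Get-Something {
    if (-not $script:AuthKey) { throw 'Call Connect-Server first.' }
    Invoke-RestMethod -Uri "$script:Server/api/something" -Headers @{ Authorization = "Bearer $script:AuthKey" }
}

Export-ModuleMember -Function Connect-Server, Get-Something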
I have a PowerShell module that contains a number of common management and deployment functions. This is installed on all our client workstations. This module is called from a large number of scripts that get executed at login, via scheduled tasks or during deployments.
From within the module, it is possible to get the name of the calling script:
function Get-CallingScript {
    return ($script:MyInvocation.ScriptName)
}
However, from within the module, I have not found any way of accessing the parameters originally passed to the calling script. For my purposes, I'd prefer to access them in the form of a dictionary object, but even the original command line would do. Unfortunately, given my use case, accessing the parameters from within the script and passing them to the module is not an option.
Any ideas? Thank you.
From about_Scopes:
Sessions, modules, and nested prompts are self-contained environments,
but they are not child scopes of the global scope in the session.
That being said, this worked for me from within a module:
$Global:MyInvocation.UnboundArguments
Note that I was calling my script with an unnamed parameter when the script was defined without parameters, so UnboundArguments makes sense. You might need this instead if you have defined parameters:
$Global:MyInvocation.BoundParameters
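For example, a small sketch of a module function built on that (the function name is illustrative):

function Get-CallingScriptParameters {
    # $Global:MyInvocation describes the top-level invocation (the calling script),
    # not the current module function.
    [pscustomobject]@{
        Script           = $Global:MyInvocation.ScriptName
        BoundParameters  = $Global:MyInvocation.BoundParameters
        UnboundArguments = $Global:MyInvocation.UnboundArguments
    }
}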
I can see how this in general would be a security concern. For instance, if there was a credential passed to the function up the stack from you, you would be able to access that credential.
The arguments passed to the current function can be accessed via $PSBoundParameters, but there isn't a mechanism to look at the call stack function's parameters.
Periodically we run a build configuration that, among other things, recreates tokens, logins, etc. We want to save these back to TeamCity as environment variables. Builds that we subsequently run will look at this environment variable store and do a string replacement within our configurations as required.
I've taken a look at :
##teamcity[setParameter name='env.TEST' value='test']
But from reading the documentation, this is only used to pass variables between build steps within the same build. It doesn't actually save the variable back to TeamCity.
Is there any way (preferably from a PowerShell script) to call TeamCity and tell it to add an environment variable (or any other variable)?
In order to persist a value back to a parameter you have to call the REST API.
I use a PowerShell script that acts as a wrapper around the Invoke-RestMethod cmdlet in PowerShell 3+ and that can be reused in a build step to achieve what you want.
Step 1.
Save the script to a PowerShell file (e.g. rest-api-wrapper.ps1) and add it to your source control.
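The original wrapper isn't reproduced here, but a minimal sketch of such a rest-api-wrapper.ps1 might look like this (it assumes the TeamCity REST API accepts the parameter value as a plain-text PUT body, matching the arguments listed in Step 2):

param(
    [Parameter(Mandatory)][string]$Url,       # e.g. %teamcity.serverUrl%/httpAuth/app/rest/projects/<id>/parameters/<name>
    [Parameter(Mandatory)][string]$Username,
    [Parameter(Mandatory)][string]$Password,
    [Parameter(Mandatory)][string]$Method,    # e.g. PUT
    [string]$Body                             # the value to save
)

# Build a credential for TeamCity's httpAuth endpoint.
$securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($Username, $securePassword)

# TeamCity takes the new parameter value as a plain-text request body.
Invoke-RestMethod -Uri $Url -Method $Method -Credential $credential -ContentType 'text/plain' -Body $Body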
Step 2.
Create a PowerShell build step referencing the script and pass in the following arguments, tailored for your situation
%teamcity.serverUrl%/httpAuth/app/rest/projects/project_id/parameters/parameter_name
"Username"
"Password"
"PUT"
"TheValueToSave"
More details can be found here - TeamCity Documentation
Hope this helps
I want to create a PowerShell function/cmdlet which installs (and one that uninstalls) a web application: copies files, creates an app pool, creates the web application, sets up all kinds of IIS properties, does some web.config modifications, etc. I'm confused about how I should name it. PowerShell has this verb-object naming convention, and it's all nice, but the names I want to use (New-WebApplication, etc.) are already taken by the WebAdministration module (which this new function will use internally). Is there a nice way to scope my functions to make it clear that it's a different module? Like mymodule.New-WebApplication, My-New-WebApplication, New-MyWebApplication? Or I could call it Install-WebApplication, but that could lead to confusion because of reusing the same name.
I just ran into this recently for a similar issue. This could have many opinionated answers, but here is one way to handle the "scope my functions to make it clear that it's a different module" part.
You could use the -Prefix parameter of Import-Module
Import-Module mymodule -Prefix Super
So when you go to use your cmdlet you would call it with
New-SuperWebApplication
Alternatively, you can also explicitly call the cmdlet with the module path
mymodule\New-WebApplication
I agree with Matt's answer, but I wanted to offer another perspective.
I wrote a module where the intention was specifically to recreate the functionality of an existing cmdlet. I named my function differently, but I also exported functions from the module that allow the caller to override the existing cmdlet with mine (using an alias, which is resolved first), and then to undo that process again.
This allowed someone to explicitly call the function without needing to use -Prefix nor use the \ syntax, using the new name with new code, but it also allowed one to use my function as a drop-in replacement for existing code by calling a single new command.
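A minimal sketch of that pattern, with hypothetical names (the real module exports different functions):

function New-MyWebApplication { <# replacement implementation #> }

function Enable-WebApplicationOverride {
    # Aliases are resolved before functions and cmdlets, so existing scripts that call
    # New-WebApplication will now hit the replacement.
    Set-Alias -Name New-WebApplication -Value New-MyWebApplication -Scope Global
}

function Disable-WebApplicationOverride {
    # Remove the override so the original WebAdministration cmdlet is used again.
    Remove-Item -Path Alias:\New-WebApplication -Force -ErrorAction SilentlyContinue
}

Export-ModuleMember -Function New-MyWebApplication, Enable-WebApplicationOverride, Disable-WebApplicationOverride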
Here's that module if you want to take a look:
DnsCmdletFixes