Is it possible to execute some code (e.g. a function, script block, cmdlet, etc.) whenever the current path changes in the PowerShell console or ISE?
The scenario I am thinking of is to modify some environment variables and dot source some location-specific PowerShell functions depending on the current folder.
You have several options. You can remove the cd alias and then write a cd function that adds the desired logic. The downside to this approach is that if someone uses Set-Location directly, your function is bypassed. Another option is to create a proxy command for Set-Location. Shay (and Kirk) have a video on how to do this here. The nice thing about this approach is that both built-in ways to change directory (cd and Set-Location) will go through your proxy command.
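A minimal sketch of the first option, replacing the cd alias with a wrapper around Set-Location. The per-folder file name .psprofile.ps1 is purely illustrative; substitute whatever convention you like:

if (Test-Path Alias:cd) { Remove-Item Alias:cd }
function cd {
    param([string]$Path = $HOME)
    Set-Location -Path $Path
    # Location-specific setup: tweak env vars, dot-source local functions, etc.
    if (Test-Path .\.psprofile.ps1) { . .\.psprofile.ps1 }
}

As noted above, this only catches cd; anyone calling Set-Location directly bypasses it, which is why the proxy-command approach is more robust.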
I want to create a task in .vscode/tasks.json where one of the args should be the name of the directory that contains the current file. For example, if I have the file folder1/folder2/myFile.txt open, I want to get the string folder2. As far as I can tell, none of the predefined variables gives me this. The closest is probably ${relativeFileDirname}, but that gives you the full directory path from the workspace folder, so it does not work for files deeper than one level in the file hierarchy.
If VSCode supported something like shell parameter expansion I could do it with that, but since it does not, I thought maybe I could use either a command variable or an input variable with "type": "command" in order to run a terminal command that gives me this (for example, in PowerShell it could be something like (Get-Item ${fileDirname}).Name). But I don't know how to do this, or whether it is possible at all. It seems like something minor enough that it should be possible without extensions, but maybe it's not.
I don't believe you can modify the built-in variables in a task, only use them as-is or as part of a string. But you can get other similar path variables through an extension called Command Variable, which has many custom variables of the type you are looking for.
You indicated that extension.commandvariable.file.fileDirBasename will work for you.
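For reference, a sketch of how that variable might be used in .vscode/tasks.json once the Command Variable extension is installed (the task itself is illustrative; VS Code parses tasks.json as JSON with comments, so the inline comment is allowed):

{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "use-dir-basename",
      "type": "shell",
      // Expands to the name of the directory containing the current file,
      // e.g. "folder2" for folder1/folder2/myFile.txt
      "command": "echo ${command:extension.commandvariable.file.fileDirBasename}"
    }
  ]
}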
I have the code below, which executes an exe file. I then want the path assigned to a variable to be copied into the text box of that exe, and once it is copied, the next button in the exe should be clicked/executed automatically by PowerShell.
& "D:\SOFTWARE\notepad ++.exe"
$Path = "C:\Program Files"
Basically I will be using this code for some other exe file, but the process would be the same. So is there any way I can do this using PowerShell?
Consider the snapshot below of the installed application's UI. I want to pass the product key, which I have declared in a variable inside the PowerShell script, to the application, and then have its Get Product Details button clicked/run.
What you actually want is to pass the variables into the MSI by running it quietly, without UI. Try the following first (assuming your exe is actually just an MSI wrapper):
msiexec.exe /i $PathToExe /q /l*vx log.txt
This should quietly launch the MSI and produce a log.txt file. Wait a few moments for it to complete, since it is installing in the background, then check whether it installed correctly. Then check the log file to see all the variables that were set at the end. Guessing by their names or values, you could then pass them in next time simply by adding variablename=variablevalue to the command-line call. You might need to experiment and learn a bit about MSI and msiexec as part of this.
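For example, a follow-up run might look like the sketch below. The PRODUCTKEY property name and the installer path are placeholders, not real names; replace them with whatever the log actually shows was set:

# PRODUCTKEY and the path below are assumptions for illustration only
$PathToMsi = 'C:\Installers\product.msi'
Start-Process msiexec.exe -ArgumentList "/i `"$PathToMsi`" /q /l*vx log.txt PRODUCTKEY=XXXXX" -Wait

The -Wait switch keeps the script blocked until msiexec finishes, so you don't have to guess when the background install is done.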
If we're talking about invoking UI actions in general, then you would need to look at the AutomationElement class, but things can get very complex from here on...
You should look at using SendKeys.
https://blogs.technet.microsoft.com/heyscriptingguy/2011/01/10/provide-input-to-applications-with-powershell/
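A minimal sketch of the SendKeys approach from that article, assuming the target window title matches 'Notepad++' (adjust the title, the delay, and the keystrokes for your application):

Add-Type -AssemblyName System.Windows.Forms

$Path = "C:\Program Files"
$wshell = New-Object -ComObject WScript.Shell
# Bring the target window to the foreground (the title here is an assumption)
$wshell.AppActivate('Notepad++') | Out-Null
Start-Sleep -Milliseconds 500
# Type the path into the focused control, then press Tab and Enter
[System.Windows.Forms.SendKeys]::SendWait($Path)
[System.Windows.Forms.SendKeys]::SendWait('{TAB}{ENTER}')

Note that SendKeys is inherently fragile: it types into whatever window currently has focus, so treat it as a last resort compared to the silent-MSI route above.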
What is the best and correct way to run a PowerShell script from another one?
I have a script a.ps1 from which I want to call b.ps1, which does a different task.
Let me know your suggestions. Is dot sourcing the best option here?
Dot sourcing will run the second script as if it were part of the caller: all script-scope changes will affect the caller. If this is what you want, then dot-source it.
However, it is more usual to call the other script as if it were a function (a script can use param and function-level attributes just like a function). In many ways a script is a PowerShell function, with the name of the file replacing the name of the function.
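A minimal sketch of both styles, assuming b.ps1 sits next to a.ps1 and takes a -Name parameter (the parameter is illustrative):

# Inside a.ps1

# Call b.ps1 like a function: it runs in a child scope, so variables
# and functions it defines do not leak back into a.ps1
& "$PSScriptRoot\b.ps1" -Name 'example'

# Dot-source b.ps1: it runs in the caller's scope, so everything it
# defines becomes visible in a.ps1 afterwards
. "$PSScriptRoot\b.ps1"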
Dot sourcing also makes it easier to convert your script(s) into a module at a later stage, since you won't have to change the script(s) into functions.
Another advantage of dot sourcing is that you can add the function to your shell by dot-sourcing the file that holds it from Microsoft.PowerShell_profile.ps1, meaning it is available at all times (eliminating the need to worry about paths, etc.).
I have a short Write-Host at the top of each dot-sourced file with the name of the function and its common parameters, and I dot-source those files in my profile. Each time I open PowerShell, the list of functions scrolls by. (If, like me, you frequently forget the exact names of your functions/files, you'll appreciate this as the number of functions piles up over time.)
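For example (the file and function names here are illustrative):

# In Microsoft.PowerShell_profile.ps1
. "$HOME\Scripts\Get-Thing.ps1"

# At the top of Get-Thing.ps1, so a usage reminder scrolls by at startup
Write-Host 'Get-Thing [-Name <string>] [-Force]'
function Get-Thing { param([string]$Name, [switch]$Force) <# ... #> }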
Old but still relevant.
I work with modules via Import-Module, which imports the module into the current PowerShell session.
To avoid stale cached copies and to always pick up the latest changes to the module, I first run Get-Module | Remove-Module, which unloads all modules from the current session:
Get-Module | Remove-Module
Import-Module '.\IIS\Functions.psm1'
I have both Sybase and Microsoft SQL Server installed. At times Sybase interferes with MS SQL because they have some overlapping commands.
So, I need two scripts:
A) When run, script A backs up the current path, grabs all paths that contain sybase or SYBASE or SyBASE (you get the point), and moves them all to the very end of the path, preserving their order.
B) When run, script B restores the path from the backup.
Both scripts should affect the path immediately. So an a.bat that calls patha.ps1 and pathb.ps1 would look like this:
REM Old path here
call patha.ps1
REM At this point the effective path should be different.
call pathb.ps1
REM Effective old path again
Please let me know if this does not make sense. I am not sure if the call command is the best one to use.
I have never used PowerShell before. I could try to formulate the same thing in Python (I know Stack Overflow users tend to ask "what have you tried so far?"), but at this point I am very slow at writing anything in the PowerShell language.
Please help.
First of all: call will be of no use here, since you are apparently writing a batch file and PowerShell scripts have no file association that runs them by default; call is for batch files or subroutines.
Secondly, any PowerShell script you call from a batch file cannot change the environment variables of the caller's environment. That's a fundamental property of how processes behave, and since you are calling another process, this is never going to work.
I'm not so sure why you are even using a batch file here in the first place if you have PowerShell. You might just as well solve this in PowerShell completely.
However, from your problem description, the best way to resolve this is probably the following: create two batch files that each set PATH appropriately. You can probably leave both the MSSQL and Sybase paths out of your usual PATH and add them solely in the batch files. Then create shortcuts to
cmd /k set_mssql_path.cmd
and
cmd /k set_sybase_path.cmd
each of which is now a shortcut to a shell set up for the appropriate database's tools. This is how the Visual Studio Command Prompt works, and it's probably the cleanest solution you have. You can use the color and prompt commands in those batch files to make the two shells visually distinct, so you always know which environment you are in. For example, the following two lines will color the console white on blue and set a prompt indicating MSSQL:
color 1f
prompt MSSQL$S$P$G
This can be quite handy, actually.
Generally, rearranging the PATH environment variable isn't exactly easy. While you could trivially split at a ;, that will fail for paths that themselves contain a semicolon (and which then need to be quoted). Even in PowerShell this will take a while to get right, so I think creating tool-specific shortcuts is probably the nicest way to deal with this.
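That said, if you do want the PowerShell route, here is a minimal sketch that moves Sybase entries to the end of PATH for the current process only. Per the caveat above, it assumes no entry contains a quoted semicolon:

# Affects only this process; the persisted machine/user PATH is untouched
$parts = @($env:Path -split ';' | Where-Object { $_ })
# Partition into Sybase and non-Sybase entries (-match is case-insensitive,
# so sybase/SYBASE/SyBASE are all covered); order within each group is kept
$sybase, $rest = $parts.Where({ $_ -match 'sybase' }, 'Split')
# Rebuild PATH with the Sybase entries moved to the very end
$env:Path = (@($rest) + @($sybase)) -join ';'

To restore, save $env:Path to a variable (or file) before the change and assign it back afterwards.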
How can I add a custom function/object to the standard set of commands recognized by PowerShell, so that I can call it from the PowerShell shell?
Thanks
You can put the function into your profile script. You can find out where it is by looking at the variable $profile. That script runs automatically when PowerShell starts (if you are allowed to run scripts), and functions declared in it are available in every session.
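For example (the function name is purely illustrative):

# Create the profile file if it doesn't exist yet, then open it for editing
if (-not (Test-Path $profile)) { New-Item -ItemType File -Path $profile -Force }
notepad $profile

# Add something like this to the profile; it will then be callable in every new session
function Get-Greeting {
    param([string]$Name = 'world')
    "Hello, $Name!"
}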