For some reason I simply can't understand most of the sites that explain this question, so I'll try asking here; if I'm in the wrong place, just tell me in the comments and I'll post this in another forum and delete this question.
Let's say that I have 2 files, Batch.bat and PowerShell.ps1.
Batch.bat:
set A="ThisIsSuchVar!"
PowerShell.ps1:
$B = "Well, i don't know what to do here"
What can I do so that the B variable gets the same value as the A variable?
Remember: I want the batch variable to go to the PowerShell file; it's a one-way script. I want to use only built-in Windows tools. And please consider that I am a complete newbie in programming and don't speak English very well, so please keep it as simple as possible.
In your batch file run.bat, set the environment variable A and run the PowerShell script:
set A=8
PowerShell.exe -File .\script.ps1
pause
In script.ps1, get the environment variable A, and assign its value to B:
$B=$Env:A
echo $B
When you run run.bat you get:
C:\Temp\try>set A=8
C:\Temp\try>PowerShell.exe -File .\script.ps1
8
C:\Temp\try>pause
Press any key to continue . . .
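If you prefer not to rely on an environment variable, a variant that should also work (just a sketch, reusing the file names from above) is to pass the value as a script argument. In run.bat:

set A=8
PowerShell.exe -File .\script.ps1 -A %A%

And in script.ps1, declare a matching parameter:

param($A)   # bound from the -A argument supplied by the batch file
$B = $A
echo $B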
I know I can dot source a file:
. .\MyFunctions.ps1
But, I would like to dot source the commands in a string variable:
. $myFunctions
I see that this is possible:
.{$x=2}
And $x equals 2 after the script block is sourced.
But... .{$myFunctions} does not work.
I tried $myFunctions | Invoke-Expression, but it doesn't keep the sourced functions in the current scope. The closest I have been able to come up with is to write the variable to a temporary file, dot source the file, and then remove the file.
Inevitably, someone will ask: "What are you trying to do?" So here is my use case:
I want to obfuscate some functions I intend to call from another script. I don't want to obfuscate the master script, just my additional functions. I have a user base that will need to adjust the master script to their network, directory structure and other local factors, but I don't want certain functions modified. I would also like to protect the source code. So, an alternate question would be: What are some good ways to protect PowerShell script code?
I started with the idea that PowerShell will execute a Base64-encoded string, but only when passed on the command line with -EncodedCommand.
I first wanted to dot source an encoded command, but I couldn't figure that out. I then decided that it would be "obfuscated" enough for my purposes if I converted my Base64 file into a decoded string and dot sourced the value of the string variable. However, without writing the decoded source to a file, I cannot figure out how to dot source it.
It would satisfy my needs if I could Import-Module -EncodedCommand .\MyEncodedFile.dat
Actually, there is a way to achieve that and you were almost there.
First, as you already stated, the source or dot operator works either by providing a path (as string) or a script block. See also: . (source or dot operator).
So, when trying to dot-source a string variable, PowerShell thinks it is a path. But, thanks to the possibility of dot-sourcing script blocks, you could do the following:
# Make sure everything is properly escaped.
$MyFunctions = "function Test-DotSourcing { Write-Host `"Worked`" }"
. { Invoke-Expression $MyFunctions }
Test-DotSourcing
And you successfully dot-sourced your functions from a string variable!
Explanation:
With Invoke-Expression the string is evaluated and run in the child scope (script block).
Then with . the evaluated expressions are added to the current scope.
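Applied to the encoded-file idea from the question, a minimal sketch (MyEncodedFile.dat is the hypothetical file name from the question; this assumes the functions were Base64-encoded from UTF-8 text):

# Read the Base64 text and decode it back to PowerShell source (assumes UTF-8 was used when encoding).
$encoded = Get-Content .\MyEncodedFile.dat -Raw
$decoded = [Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($encoded))
# Dot-source the decoded definitions into the current scope, as shown above.
. { Invoke-Expression $decoded }
Test-DotSourcing   # works if the encoded file defined this function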
See also:
Invoke-Expression
About scopes
While @dwettstein's answer is a viable approach, using Invoke-Expression to handle the fact that the function is stored as a string, there are other approaches below that achieve the same result.
One thing I'm not crystal clear on is the scoping itself; Invoke-Expression doesn't create a new scope, so there isn't exactly a need to dot source at that point...
#Define your function as a string
PS> $MyUselessFunction = "function Test-WriteSomething { 'It works!' }"
#Invoke-Expression would let you use the function
PS> Invoke-Expression $MyUselessFunction
PS> Test-WriteSomething
It works!
#Dot sourcing works fine if you use a script block
PS> $ScriptBlock = [ScriptBlock]::Create($MyUselessFunction)
PS> . $ScriptBlock
PS> Test-WriteSomething
It works!
#Or just create the function as a script block initially
PS> $MyUselessFunction = {function Test-WriteSomething { 'It works!' }}
PS> . $MyUselessFunction
PS> Test-WriteSomething
It works!
In other words, there are probably a myriad of ways to get something similar to what you want - some of them documented, and some of them divined from the existing documentation. If your functions are defined as strings, then Invoke-Expression might be needed, or you can convert them into script blocks and dot source them.
At this time it is not possible to dot source a string variable.
I stand corrected! . { Invoke-Expression $MyFunctions } definitely works!
I'm trying to make a system which turns the contents of a .txt file into a variable. That part isn't my problem, though; for some reason, my files end up containing characters I didn't enter and can't use.
Please note: What I'm doing is in no way efficient, and I'm positive there are other ways to go about this, but this is the best way for me. Also, I'm not amazingly intelligent when it comes to coding. Just thought I'd throw that out there.
First, let me show you the system I have in place.
value.txt
4
This file has the contents which I'd like to make into a variable.
Batch Files
setcmdvar.bat
set cmdvar=
I leave this empty so that I can put the contents of value.txt at the end (more on this later).
start.bat
@echo off
call PowerShell.exe cd "C:\Users\%username%\Desktop\Folder"; $PSvar = Get-Content value.txt; $PSvar >> setcmdvar.bat
pause
call setcmdvar.bat
pause
echo The variable equals %cmdvar%.
pause
exit
The second line from start.bat creates this script in PowerShell:
PowerShell script
cd "C:\Users\%username%\Desktop\Folder\"
$PSvar = Get-Content value.txt; $PSvar >> setcmdvar.bat
This creates a variable in PowerShell, $PSvar, which equals the contents of value.txt; in our case, 4. Then, it puts $PSvar (4) at the end of setcmdvar.bat, using >>, which changes it to:
setcmdvar.bat (changed)
set cmdvar=4
Or, at least, it should, to my knowledge. Instead, it changes the file to this:
set Items=桔瑳楲杮椠業獳湩桴整浲湩瑡牯›⸢ †⬠䌠瑡来牯䥹普††††㨠倠牡敳䕲牲牯›㨨
嵛慐敲瑮潃瑮楡獮牅潲割捥牯䕤捸灥楴湯 †⬠䘠汵祬畑污晩敩䕤牲牯摉㨠吠牥業慮潴䕲灸捥整䅤䕴摮晏瑓楲杮
Or some other strange combination of characters. I looked one up, and it was Chinese. There are also some other characters like †, ⬠, ›, ⸢, and a couple that don't even display. I have no idea why these are being typed. Along with this, start.bat displays the following:
Press any key to continue . . .
(PowerShell script runs here)
'■s' is not recognized as an internal or external command,
operable program or batch file.
Press any key to continue . . .
The variable equals .
Press any key to continue . . .
(exit)
I did not type "■s," and I assume this may be the problem. Whatever it is, does anyone have any ideas?
P.S. I'm sorry if my code is complicated, or if it looks bad, but I have it this way for a reason. Mostly my incompetence, actually. But I think it's better that way.
Also, I know there are commands like for /f "delims=" %a in ('ver') do @set foobar=%a (I just took this off the internet), but I've tried commands like those, and I suppose I just don't understand them all that well, because they didn't work.
I appreciate the help!
It would probably be better to avoid the static setcmdvar.bat script and write it all from the PowerShell command. Using -Encoding ascii is what keeps the output from being Unicode (UTF-16) with a BOM (Byte Order Mark) at the beginning; the file has nothing to do with Chinese characters, that is just how the Unicode bytes get misread.
ECHO>"%USERPROFILE%\value.txt" 4
SET "CMDFILE=%USERPROFILE%\setcmdvar.bat"
powershell -NoLogo -NoProfile -Command ^
"'SET ""cmdvar=' + (Get-Content $Env:USERPROFILE\value.txt) + '""""' |" ^
"Out-File -FilePath %CMDFILE% -Encoding ascii"
pause
call "%CMDFILE%"
pause
echo The variable equals %cmdvar%
pause
EXIT /B 0
In PowerShell, >> has its problems with mixing encodings, and it defaults to UTF-16. I recommend changing
$PSvar >> setcmdvar.bat
to
add-content setcmdvar.bat $PSvar
I'm not sure how to keep everything on one line. You can make setcmdvar.bat like this:
set cmdvar=^
So that it continues.
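Putting those pieces together, the PowerShell line in start.bat could look roughly like this (a sketch that assumes setcmdvar.bat ends with the set cmdvar=^ continuation shown above, and that forces ASCII output to avoid the UTF-16 issue):

call PowerShell.exe -Command "cd 'C:\Users\%username%\Desktop\Folder'; $PSvar = Get-Content value.txt; Add-Content setcmdvar.bat $PSvar -Encoding Ascii"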
First, I would like to apologize in case the title is not descriptive enough; I'm having a hard time dealing with this problem. I'm trying to build an automation for an svn merge using a PowerShell script that will be executed by another process. The function that I'm using looks like this:
function Invoke-SvnMerge($target) {
    svn merge $target
}
Now, my problem occurs when there are conflicts in the merge. The default behavior of the command is to request input from the user and proceed accordingly. I would like to automate this process using predefined values (show the differences and then postpone the merge), but I haven't found a way to do it. In summary, the workflow that I am looking to accomplish is the following:
Detect whether the command execution requires any input to proceed
Provide default inputs (in my particular case "df" and then "p")
Is there any way to do this in PowerShell? Thank you so much in advance for any help/clue that you can provide.
Edit:
To clarify my question: I would like to automatically provide a value when a command executed within a PowerShell script requires it, like in the following example:
(screenshot: requesting user input)
Edit 2:
Here is a test using the snippet provided by @mklement0. Unfortunately, it didn't work as expected, but I thought it was worth adding this edit to clarify the question completely.
Expected behavior: (screenshot)
Actual result: (screenshot)
Note:
This answer does not solve the OP's problem, because the specific target utility, svn, apparently suppresses prompts when the process' stdin input isn't coming from a terminal (console).
For utilities that do still prompt, however, the solution below should work, within the constraints stated.
Generally, before attempting to simulate user input, it's worth investigating whether the target utility offers programmatic control over the behavior, via its command-line options, which is both simpler and more robust.
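In the case of svn specifically, that kind of control does exist; a sketch (check svn help merge for the options your client version supports):

function Invoke-SvnMerge($target) {
    # --non-interactive suppresses all prompting;
    # --accept postpone leaves conflicts in the working copy to resolve later,
    # which corresponds to choosing "p" (postpone) at the interactive prompt.
    svn merge --non-interactive --accept postpone $target
}

Note that this does not reproduce the "df" (show diff) step; you would have to inspect the conflicts afterwards, e.g. with svn diff.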
While it would be far from trivial to detect whether a given external command is prompting for user input:
you can blindly send the presumptive responses,
which assumes that no situational variations are needed (except if a particular call happens not to prompt at all, in which case the input is ignored).
Let's assume the following batch file, foo.cmd, which puts up 2 prompts and echoes the input:
@echo off
echo begin
set /p "input1=prompt 1: "
echo [%input1%]
set /p "input2=prompt 2: "
echo [%input2%]
echo end
Now let's send responses one and two to that batch file:
PS C:\> Set-Content tmp.txt -Value 'one', 'two'; ./foo.cmd '<' tmp.txt; Remove-Item tmp.txt
begin
prompt 1: one
[one]
prompt 2: two
[two]
end
Note:
For reasons unknown to me, the use of an intermediate file is necessary for this approach to work on Windows - 'one', 'two' | ./foo.cmd does not work.
Note how the < must be represented as '<' to ensure that it is passed through to cmd.exe and not interpreted by PowerShell up front (where < isn't supported).
By contrast, 'one', 'two' | ./foo does work on Unix platforms (PowerShell Core).
You can store the svn command-line output in a variable, then parse through it and branch as you desire. Each line of output is stored as a separate element (CLI output captured in a PowerShell variable is an array), so you can loop over it, as sketched below the code.
$var = & svn merge $target
$var
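From there you can loop over the captured lines and branch; a rough sketch (the 'conflict' pattern below is only an illustrative assumption, not the exact text svn prints):

$var = & svn merge $target
foreach ($line in $var) {
    if ($line -match 'conflict') {
        # branch here as needed, e.g. log the line or mark the merge for manual review
        Write-Host "Possible conflict: $line"
    }
}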
Suppose I have a function named show in test.ps1; the code is like below:
. "some_path\some_other_script.ps1"
function show {
write-host "Hello World"
#some other function call from some_other_script.ps1
...
}
How can I call show in the following format (with the & call operator)?
powershell.exe "& '%test_script_path%\test.ps1'\show"
I think I need to dot source test.ps1 first in order to pick up the dependencies the show function has on some_other_script.ps1.
I know I could create a new script and put the code from show in it instead of in a function; that way, when I do &..., the script will be invoked. But I don't want to create a new script just for a very simple function.
Thanks for any suggestions.
You could dot source it and then call it like this.
powershell -Command ". '%test_script_path%\test.ps1'; show"
The ; is used as a separator between commands.
I really appreciate the help and time from @Patrick Meinecke; the following command works.
powershell -command ". .\test.ps1;show"
As he mentioned in the chat:
"So to explain that: the .\ just indicates it's in the current directory; it still needed the other . to tell PowerShell to load the script into the current context."
I have a variable called "emails" that is common to most of my app. I also want to use "emails" as the name of a parameter in one of the scripts. I need to refer to the value of both variables in the same script. Ideally there would be a way to refer to them by module/namespace or something, and perhaps there is, but I don't know it. You can see below how I hack around this, but it is ugly and prone to error. Is there a better way?
# PowerShell v1
# Some variable names are very common.
param ($emails)
# My Hack
# We need to save the current value so we have it after we source in the variables below.
$emails0=$emails
# Below is going to load a variable called "emails" which will overwrite the param above.
. C:\load_a_bunch_of_global_variables.ps1
This is because, as the documentation says, the dot sourcing operator "Runs a script so that the items in the script are part of the calling scope."
In this case I would convert C:\load_a_bunch_of_global_variables.ps1 to a module and pass $emails as a parameter, or export a function that sets the $script:emails variable in the module. Then the variable will not conflict with the variable in the parent script; a sketch of this approach follows below.
For more information about modules you can use get-help about_modules.
I would avoid using global variables in my scripts if possible.
Why? Because it is a code smell (as programmers say). With one script there is no problem. If two scripts use the same global variable and only read it, it is perhaps acceptable. But if any of them changes the value, then there might be unpleasant conflicts.
In some cases Get-Variable -scope 1 -name myvariable would help, but I would use it only in closed pieces of code like modules or in short scripts (the same reason as with global variables).
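As a rough sketch of the module idea above (CommonVariables.psm1 is a hypothetical name standing in for load_a_bunch_of_global_variables.ps1):

# CommonVariables.psm1
$script:emails = 'someone@example.com'        # lives in the module's own scope

function Get-CommonEmails { $script:emails }  # read access for callers

Export-ModuleMember -Function Get-CommonEmails

And in the calling script:

param ($emails)
Import-Module .\CommonVariables.psm1
$emails             # still the parameter's value
Get-CommonEmails    # the module's value, no conflict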
While you can use Get-Variable -scope to get access to variables at arbitrary levels of the call stack, it is easier in this case to grab the top-level (script) variable using the script: modifier, e.g.
$script:emails
rerun and stej both helped me out.
I still want to source in the file using ". file.ps1", but changing "$emails=foo@yahoo.com" in my load_a_bunch_of...ps1 file to "$global:emails=foo@yahoo.com" solved the problem. I can now refer to the variable using the global keyword when I have both a local and a global variable, and when there is only one variable to deal with I can leave out the global keyword.
You can always access your global variables from a script using $global:varname. Inside your script you have local scope, so you won't get collisions. If you dot source your script, though, you will override the global variable.
For example, if I have a script test.ps1 containing:
$Crap ="test"
$Crap
And if you run the following commands you get what you want. In line 2 we run the script and the variable doesn't conflict, but if you run the script as in line 4 with a dot source, you get what you are discovering, which is due to the way the . operator works.
1:PS C:\Users\Adam> $crap = "hi"
2:PS C:\Users\Adam> .\test.ps1
test
3:PS C:\Users\Adam> $crap
hi
4:PS C:\Users\Adam> . .\test.ps1
test
5:PS C:\Users\Adam> $crap
test
6:PS C:\Users\Adam>
If you add the following line to the script and run it:
$global:crap;
you will get
PS C:\Users\Adam> .\test.ps1
test
hi