Trying to make a timer-delayed script for mIRC for a scenario-based game. Everything is put together and should run, but when I actually go to start the script it just says unknown command.
SCRIPT {
/TIMER1 1 1 /query #mircroom THIS IS WHAT I WANT IT TO SAY
/TIMER2 1 4 /query #mircroom2 THIS IS WHAT I WANT IT TO SAY
}
I put it into aliases, so in theory I should just have to type /SCRIPT and it would run, right? I also threw it into its own .txt file and it did not recognize starting the script.
Any and all help would be greatly appreciated.
$script is a built-in identifier
$script can be used to return the filename of the Nth loaded script file. If you specify a filename, it returns $null if the file is not loaded; otherwise it returns the file's name.
Try giving your alias another name (anything other than SCRIPT) and it should work.
Say I have a script to be executed in a single call, how do I do it?
Like, say I have a powershell script saved at E:\Fldr\scrpt.ps1.
Now if I want to execute that script normally in PowerShell ISE, I have to use:
& "E:\Fldr\scrpt.ps1"
and the scrpt.ps1 gets executed.
But what I want is, when I type a single word, say "exeScrpt", instead of & "E:\Fldr\scrpt.ps1", I want scrpt.ps1 to get executed.
Is there a way to do this?
Thank you for checking in..
You can wrap your call to the script in a function:
function Invoke-Script
{
    # Calling the script by its full path runs it; use & "E:\Fldr\scrpt.ps1" if the path ever contains spaces.
    E:\Fldr\scrpt.ps1
}
Then you can run your script by executing a single command anywhere after the definition:
Invoke-Script
Note that it is good practice to name your functions according to the Verb-Noun cmdlet naming standard, so something like Invoke-Script instead of exeScrpt. If you really want a single word as the name, then you can additionally create an alias for your function:
New-Alias -Name exeScrpt -Value Invoke-Script
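If scrpt.ps1 itself takes parameters, a small variation on the same idea forwards whatever you pass to the function (or its alias) on to the script. This is only a sketch and assumes nothing about the script's actual parameters:
function Invoke-Script
{
    # $args holds any arguments given to the function; @args splats them onto the script call.
    & "E:\Fldr\scrpt.ps1" @args
}
New-Alias -Name exeScrpt -Value Invoke-Script
# Usage (parameter names are whatever scrpt.ps1 actually defines): exeScrpt -SomeParameter SomeValue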
I think we all know the PSIsContainer property for checking whether an item is a folder or not. But in my project I need a way to quickly get the number of folders inside a folder. All I need is to quickly get their number. I want to write lines to a .txt that look like C:\folder;12, which would mean that the folder, counted with the -Recurse argument, contains 12 folders.
To explain why: I need to save the progress of my work when I shut down the program that analyses the folders. When a folder is analysed, the result is written to a second .txt. For example, if a folder is called C:\folder\folder1, folder will be analysed and then folder1 will be too, which makes folder appear twice in the file because the full name is always written. What I want to do is count the number of lines where C:\folder is written. If it equals the number next to its path in the first .txt, it means the folder has already been analysed and the function doesn't need to do it again.
Does someone have a solution? Or maybe another idea for saving the progress? Because I really have the feeling this is taking too long.
Thank you for your help.
Another approach, which I find much faster, is using the cmd built-in 'dir' command.
Of course, this only counts the immediate subfolders; if you need to go deeper, you can run the function in a foreach loop, or change the function accordingly.
Function Get-FolderCount($path)
{
    # Let cmd's dir list only directories (/a:d); the last line of the output is the "n Dir(s) ..." summary.
    $Dir = cmd /c dir $path /a:d
    # Split the summary line on 'Dir' and strip the whitespace to keep just the count.
    Return ($Dir[-1] -csplit 'Dir' -replace '\s')[0]
}
I use this as well for measuring folder size with the /s switch, taking the total size, which is much faster than PowerShell and also much faster than running it in an interactive shell...
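For comparison, here is a plain PowerShell sketch that produces the C:\folder;12 style line from the question. It assumes PowerShell 3.0+ for the -Directory switch (on older versions, filter Get-ChildItem on PSIsContainer instead), and the progress file name is only an example:
$path = 'C:\folder'
# Count all subfolders recursively; slower than the cmd dir trick, but no output parsing needed.
$count = (Get-ChildItem -Path $path -Recurse -Directory | Measure-Object).Count
# Append the progress line to the first .txt (example path).
"$path;$count" | Add-Content 'C:\progress.txt'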
Is there any way to get the script name of the calling script into a dot-sourced .ps1 script?
This would be awesome for logging.
E.g.: script test1.ps1 calls a function from the dot-sourced log.ps1.
The string "test1.ps1" is needed in log.ps1. Is this possible?
Thanks in advance
If you are dot sourcing log.ps1 then the execution is still occurring within Test.ps1.
To get the name of the executing script use:
$ExecutingScript = $MyInvocation.MyCommand.Name
# Test.ps1
You can then use $ExecutingScript in whatever logging functions are provided by the dot-sourced script.
If you need the entire path to the executing script, you would use:
$MyInvocation.InvocationName
# C:\Whereever\Test.ps1
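For example (Write-Log stands in for whatever function log.ps1 actually provides; it is not from the original question), you could grab the name in Test.ps1 and hand it to the dot-sourced logger:
# Test.ps1
. .\log.ps1
# At the top level of Test.ps1 this resolves to Test.ps1.
$ExecutingScript = $MyInvocation.MyCommand.Name
# Write-Log is a hypothetical function exposed by log.ps1.
Write-Log -Source $ExecutingScript -Message "script started"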
You can try:
$a = Split-Path $PSCommandPath -Leaf
I was trying to solve a similar, or maybe the same, problem: how to generate a log file name based on the script name. The solution from @SomeShinyObject is nice, but when your logging script is sourced from a file that is itself sourced from another file, you have a problem. I arrived at the following solution, in case anyone is interested.
$scriptFileName = Get-Item (Get-PSCallStack)[(Get-PSCallStack).length-1].ScriptName
$log = "$($scriptFileName.DirectoryName)\log\$($scriptFileName.Basename).log"
It basically gets the highest frame from the call stack: it first gets the stack depth using (Get-PSCallStack).Length, then uses that value to obtain the frame itself and reads its ScriptName, which is the uppermost script in the calling hierarchy.
I've got a wrapper PowerShell script that I'm hoping to use to automate a few things. It's pretty basic, and accepts a parameter that I want the script to execute as if it were a line in the script itself. I absolutely cannot get it to work.
example:
param( [string[]] $p)
echo $p
# Adds the base cmdlets
Add-PSSnapin VMware.VimAutomation.Core
# Add the following if you want to do things with Update Manager
Add-PSSnapin VMware.VumAutomation
# This script adds some helper functions and sets the appearance. You can pick and choose parts of this file for a fully custom appearance.
. "C:\Program Files (x86)\VMware\Infrastructure\vSphere PowerCLI\Scripts\Initialize-VIToolkitEnvironment.ps1"
$p
In the example above, I want $p to execute as if it were a line in the script. I know this isn't secure, and that's probably where the problem lies.
Here is how I try running the script and passing in a parameter for $p:
D:\ps\test>powershell -command "D:\ps\test\powershell_wrapper.ps1" 'Suspend-VM servername -Verbose -Confirm:$False'
How can I get my parameter of 'Suspend-VM servername -Verbose -Confirm:$False' to run inside my script? If I just include the value in the script instead of passing it in as a parameter, it runs without any issues...
You can basically approach this two ways, depending on what your needs really are and how you want to structure your code.
Approach #1 - Invoke-Expression
Invoke-Expression basically allows you to treat a string like an expression and evaluate it. Consider the following trivial example:
Invoke-Expression '"Hello World"'
That will evaluate the string as if it were an expression typed in directly, and place the string "Hello World" on the pipeline. You could use that to take your string parameter and run it on-the-fly in your script.
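Applied to the wrapper from the question, that could look roughly like this sketch (keeping the original parameter name; the snap-in and dot-sourcing lines are unchanged and elided here):
param( [string[]] $p)
# ... Add-PSSnapin and dot-sourcing lines as in the question ...
# Evaluate the passed-in string as if it had been typed directly into the script.
Invoke-Expression "$p"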
Approach #2 - Using a ScriptBlock
PowerShell has a special data type called a ScriptBlock, where you can bind a script to a variable, and then invoke that script as part of your code. Again, here is a trivial example:
function Test-SB([ScriptBlock]$sb) {
$sb.Invoke()
}
Test-SB -sb {"Hello World"}
This example creates a function with a single parameter $sb that is of type ScriptBlock. Notice the parameter is bound to the actual chunk of code {"Hello World"}? That code is assigned to the $sb parameter, and then a call to the .Invoke method actually executes the code. You could adapt your code to take in a ScriptBlock and invoke it as part of your script.
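Adapting the wrapper to this approach might look something like the sketch below. This is easiest when the wrapper is invoked from an existing PowerShell session; passing a script block in from cmd.exe via -Command takes some extra quoting care:
param( [ScriptBlock] $p)
# ... Add-PSSnapin and dot-sourcing lines as in the question ...
# Run the passed-in script block.
& $p
# Called from PowerShell:
# .\powershell_wrapper.ps1 -p { Suspend-VM servername -Verbose -Confirm:$False }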
Approach #3 - Updating your profile
OK, so I said there were two ways to approach it. There is actually a third... sort of... You could add the VMware cmdlets to your $profile so they are always present and you don't need your wrapper to load those libraries. Granted, this is a pretty big hammer, but it might make sense if this is the environment you are constantly working in. You could also just create a shortcut to PowerShell that runs a .ps1 on startup that loads those libraries and hangs around (this is what MS did with the SharePoint admin shell and several others). Take a look at this MSDN page for more info on $profile and whether it can help you out:
http://msdn.microsoft.com/en-us/library/windows/desktop/bb613488.aspx
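A sketch of what that profile addition could look like, reusing the setup lines from the question's wrapper (create the profile file first if it does not exist):
# New-Item -Path $profile -ItemType File -Force   # only needed if $profile does not exist yet
# Then add these lines to the profile script:
Add-PSSnapin VMware.VimAutomation.Core
Add-PSSnapin VMware.VumAutomation
. "C:\Program Files (x86)\VMware\Infrastructure\vSphere PowerCLI\Scripts\Initialize-VIToolkitEnvironment.ps1"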
I've written a PowerShell script that I run several times a day. It's getting to be somewhat of a chore to execute the script manually (from within PowerGUI or the shell), so I'd like to create a front end that prompts me for the variables. I've found that PrimalForms can supply me with pre-populated fields that can be adjusted if needed.
My problem is that I would like to create a GUI and pass ALL the variables to my external script (this script is already written and will not be part of the PrimalForms project).
How would I do this? Or should I pass the variables manually? How would I do that?
(I do not think this is specific to PrimalForms; I'm really just executing a script, using variables from another script as input.)
Any help would be greatly appreciated!
Use splatting. Collect all the values for the parameters in a hashtable (key names match parameter names) and assign each key the value from the corresponding text field in your form. Then pass the hashtable to script B. The following assumes that you have two text fields with the names filter and path.
## scriptA ##
$params = @{
    path = $path.Text
    filter = $filter.Text
}
D:\Scripts\scriptB.ps1 @params
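For the splatting to bind, scriptB.ps1 needs parameters whose names match the hashtable keys. A sketch of what that might look like (the body is only illustrative):
## scriptB ##
param(
    [string]$path,
    [string]$filter
)
Get-ChildItem -Path $path -Filter $filter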