We are using install4j version 6.0.4. When we use compiler variables in the helper script of an installer application, like
options.add(new String[] {"-VmyOption=[Value]", "Set the custom option (default=" + context.getCompilerVariable("myoption") + ")"});
we get a NullPointerException for the context variable. Is it a bug, or is the context not available at the time the helper script is called?
"...or is the context not available at the time the helper script will be called?"
That is indeed the case. The script is called before the environment is initialized, so the context parameter is null.
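If a default is still needed for the option description, one way is to guard against the missing context. A minimal sketch, assuming a hard-coded fallback is acceptable while the context is unavailable:
String defaultValue = "fallback"; // hypothetical fallback used while the context is null
if (context != null) {
    defaultValue = context.getCompilerVariable("myoption");
}
options.add(new String[] {"-VmyOption=[Value]", "Set the custom option (default=" + defaultValue + ")"});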
For debugging I need to see the actual module path. How can I print it out at runtime?
The problem: using ServiceLoader to load a provider module against a defined API works fine in the normal runtime environment but not under test, and I have to find out why.
As I am new to ServiceLoader, it may not be enough that a provider module can be found on the module path. However, my first question is: is it on the module path in the test environment at all?
When you want to check the presence of a module, you can do so straightforwardly, e.g.
String moduleName = "java.compiler";
ModuleLayer.boot().findModule(moduleName)
    .ifPresentOrElse(
        m  -> System.out.println(m + " loaded via " + m.getClassLoader()),
        () -> System.out.println(moduleName + " not found"));
When the module is not present, you can check the module path using the non-standard system property:
System.out.println(System.getProperty("jdk.module.path"));
Besides the possibility of a mismatching path, it's worth checking the context class loader of each environment if loading a service fails in one environment, as the method ServiceLoader.load(Class) uses the context class loader for its search.
Also check that the module’s classes are not accidentally provided on the old class path.
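For example, a quick diagnostic that could be printed in both environments to compare them (plain standard API; the labels are just for readability):
System.out.println("context loader: " + Thread.currentThread().getContextClassLoader());
System.out.println("module path:    " + System.getProperty("jdk.module.path"));
System.out.println("class path:     " + System.getProperty("java.class.path"));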
I'm going to start by saying I'm still pretty much a rookie at PowerShell, and I'm hoping there is a way to do this.
We have a utils.ps1 script that contains just functions that we dot-source in other scripts. One of the functions returns a default value if a value is not passed in. I know I could check $args and such, but what I wanted was to use the function for the default values in the param block.
param(
    [string]$dbServer = $(Get-DefaultParam "dbServer"),
    [string]$appServer = $(Get-DefaultParam "appServer")
)
This doesn't work since the Utils script hasn't been dot-sourced yet. I can't put the dot-source first, because then param doesn't work as it's no longer the first statement. Utils.ps1 isn't a module, and I can't use #Requires.
What I got working was this
param(
    [ValidateScript({ return $false; })]
    [bool]$loadScript = $(. ./Utils.ps1; $true),
    [string]$dbServer = $(Get-DefaultParam "dbServer"),
    [string]$appServer = $(Get-DefaultParam "appServer")
)
I create a parameter that loads the script and prevent a value from being passed into it. This loads the script in the correct scope; if I load it inside the ValidateScript block, it is not in the correct scope. The rest of the parameters then have access to the functions in Utils.ps1. This is probably an unsupported side effect, i.e. a hack: if I move $loadScript below the others, those parameters fail since the script hasn't been loaded yet.
Does PowerShell guarantee that parameters will always be evaluated sequentially?
Should we instead put all the functions in Utils.ps1 into the global scope? That would require running Utils.ps1 before the other scripts, which seems fine for automated scripting but less than ideal when running the scripts by hand.
Is there a more supported way of doing this besides modules and #Requires?
Or is it better not to use default values for the parameters at all, and instead code all the checks after dot-sourcing, checking $args if we need to run the function?
It would be beneficial to instead turn that script into a PowerShell module, despite your statement that you want to avoid one. This way, your functions are always available for use as long as the module is installed. Also, despite not wanting to use it, the #Requires statement is how you put execution constraints on your script, such as the PowerShell version or the modules that must be installed for the script to function.
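For illustration, here is a minimal sketch of what that conversion could look like; the folder name Utils and the lookup table inside Get-DefaultParam are assumptions standing in for your real logic:
# Utils.psm1 - the same functions that lived in Utils.ps1, saved in a folder
# named 'Utils' somewhere on $env:PSModulePath
function Get-DefaultParam {
    param([Parameter(Mandatory)][string]$Name)
    # hypothetical lookup standing in for the real default-value logic
    $defaults = @{ dbServer = 'DBSERVER01'; appServer = 'APPSERVER01' }
    return $defaults[$Name]
}
Export-ModuleMember -Function Get-DefaultParam

# Calling script: #Requires imports the module before the script runs,
# so Get-DefaultParam is resolvable even inside the param block.
#Requires -Modules Utils
param(
    [string]$dbServer = $(Get-DefaultParam "dbServer"),
    [string]$appServer = $(Get-DefaultParam "appServer")
)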
If you really don't want to put this into a module, you can dot-source utils.ps1 from the executing user's $profile. As long as you don't run powershell.exe with the -NoProfile parameter, the profile loads with each session and your functions will be available for use.
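For example, a single line appended to the profile is enough (the path to Utils.ps1 here is only an assumption):
# in $profile, e.g. Microsoft.PowerShell_profile.ps1
. 'C:\Scripts\Utils.ps1'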
I'm using Pester with Selenium WebDriver.
The WebDriver is initialized in a 'BeforeAll' block within the corresponding 'Describe' block, and the resulting instance is assigned to the $driver variable. Then, in the 'Describe' and 'It' blocks, I call my custom functions that reside in an external PowerShell module that is auto-loaded by PowerShell. I expect these functions to have access to the $driver variable defined in the 'BeforeAll' block, but that does not happen and I get the following error message:
RuntimeException: You cannot call a method on a null-valued expression.
Here is the code from the Search.Tests.ps1 Pester script:
Describe "Search for something" -Tag something {
BeforeAll {
$driver = New-WebDriver
$driver.Navigate().GoToUrl('http://example.com')
}
AfterAll {
$driver.Close()
$driver.Dispose()
$driver.Quit()
}
Find-WebElement -Selector ('some_selector')
It "Something is found in search results" {
GetTextFrom-WebElement -Selector ('some_selector') | Should Be 'something'
}
}
Find-WebElement and GetTextFrom-WebElement are helper functions that use $driver to get an element by CSS selector and to extract the element's inner text.
I investigated the issue and found a workaround, but I don't think it's an elegant way to go. The workaround is to redefine $driver in each helper function in the external PowerShell module right after the param block like this:
$driver = $PSCmdlet.GetVariableValue('driver')
This way the functions can see $driver and everything works.
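For context, with this workaround each helper in the module ends up looking roughly like the sketch below (assuming the Selenium .NET bindings; the body is illustrative):
function Find-WebElement {
    [CmdletBinding()]   # needed so $PSCmdlet is available
    param([string]$Selector)
    # pull $driver from outside the module, as described above
    $driver = $PSCmdlet.GetVariableValue('driver')
    $driver.FindElement([OpenQA.Selenium.By]::CssSelector($Selector))
}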
My question: is it possible to do something so that the functions always have access to $driver, without needing to redefine it in each of them?
"I expect that these functions [defined in a PowerShell module] have access to $driver variable defined in 'BeforeAll' block..."
They don't, and you probably shouldn't rely on that behaviour even if they did.
Variables Defined in Pester Scriptblocks Aren't Accessible from Modules
Variables defined in BeforeAll{}, BeforeEach{}, Context{}, and It{} blocks are not accessible from a module under test when the x.Tests.ps1 file is invoked by Invoke-Pester (reference). If the x.Tests.ps1 file happens to be invoked directly (i.e. by pressing F5 in the ISE), then variables defined in BeforeAll{} are accessible from a module under test. Relying on that behavior precludes the test from running in bigger batches, so it should be avoided.
Reliance on Implicit Accessibility of External Variables Should Be Avoided
It seems like your custom module expects that $driver is defined somewhere outside the module and is implicitly accessible from inside the module. That raises the following question: Where did the author of the custom module intend $driver to be defined? As a script variable in the module? As a global variable? Both of those are pretty awkward public interfaces for a module because it is difficult to control whether the correct value for $driver is indeed available to the module. If the module does rely on such behavior, I suggest changing the custom module to explicitly accept your $driver object
(or at least the information required to create that object).
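As an illustration, here is a minimal sketch of such an explicit interface, assuming the Selenium .NET bindings; the -Driver parameter name is illustrative, not part of your module:
function Find-WebElement {
    param(
        [Parameter(Mandatory)] $Driver,      # the WebDriver instance, passed in explicitly
        [Parameter(Mandatory)] [string]$Selector
    )
    $Driver.FindElement([OpenQA.Selenium.By]::CssSelector($Selector))
}
# in the test:
It "Something is found in search results" {
    (Find-WebElement -Driver $driver -Selector 'some_selector').Text | Should Be 'something'
}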
If you cannot change the custom module, you might be able to get by with changing your variable references from $driver to $global:driver. You should really try to avoid that though, because using global variables in that way will likely result in any of a variety of problems at some point.
I have a PowerShell module that contains a number of common management and deployment functions. This is installed on all our client workstations. This module is called from a large number of scripts that get executed at login, via scheduled tasks or during deployments.
From within the module, it is possible to get the name of the calling script:
function Get-CallingScript {
    return ($script:MyInvocation.ScriptName)
}
However, from within the module, I have not found any way of accessing the parameters originally passed to the calling script. For my purposes, I'd prefer to access them in the form of a dictionary object, but even the original command line would do. Unfortunately, given my use case, accessing the parameters from within the script and passing them to the module is not an option.
Any ideas? Thank you.
From about_Scopes:
Sessions, modules, and nested prompts are self-contained environments,
but they are not child scopes of the global scope in the session.
That being said, this worked for me from within a module:
$Global:MyInvocation.UnboundArguments
Note that I was calling my script with an unnamed parameter when the script was defined without parameters, so UnboundArguments makes sense. You might need this instead if you have defined parameters:
$Global:MyInvocation.BoundParameters
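For example, a companion to the Get-CallingScript function above could look like this; the function name is illustrative and it relies on the behaviour described here:
function Get-CallingScriptParameters {
    # named parameters bound on the original invocation, per the note above
    return $Global:MyInvocation.BoundParameters
}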
I can see how this in general would be a security concern. For instance, if there was a credential passed to the function up the stack from you, you would be able to access that credential.
The arguments passed to the current function can be accessed via $PSBoundParameters, but there isn't a mechanism to look at the call stack function's parameters.
I'm using Eclipse and want to have a "macro" that the preprocessor will replace with the name of the current method before compilation.
I have an error-reporting function that is called as reportthis(String errormessage). Different functions throughout the application have try/catch blocks that call reportthis(...) from the catch block upon errors.
I'd like to be able to specify something like reportthis(MACRO_CURRENT_METHOD_NAME + ":" + e.toString()); where MACRO_CURRENT_METHOD_NAME would be replaced by Eclipse before compilation with the name of the method whose catch {} block calls reportthis().
So if the catch{} block happens in main(), the macro should return the string "main" (or "main()", etc.).
Is this possible? How do I go about achieving my goal?
Thank you!
Edit
I want to get this done with a preprocessor in Eclipse. Is that impossible? Would it perhaps be possible to write a plugin for Eclipse that replaces all occurrences of "MACRO_CURRENT_METHOD_NAME" with the current method name?
I've not found an automated way of doing this, so I have manually added a string literal indicating the name of the caller at each invocation of the logging code.
Nokia's S40 platform is also based on Java ME, and I know some Nokia S40 developers have made good use of Jarrut, which is available on SourceForge, to produce stack traces by modifying the program to track the stack. You could leverage this functionality to get the calling method name in your logging code, but you may need to modify Jarrut a bit to make that work.
Java does not support macros.
But what you can do to determine the current method is something like this:
final StackTraceElement aTop = Thread.currentThread().getStackTrace()[1];
System.out.println(aTop.getMethodName());
By using the element at index [1] you get the method that called getStackTrace(), i.e. the current method, because the element at [0] is the getStackTrace() call itself.
If you wrap this code in an additional method, you must adjust the array index accordingly, e.g. to 2, depending on the number of wrapping methods you use.
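A sketch of such a wrapper (the method name is illustrative):
// index 2 skips getStackTrace() itself and this wrapper, yielding the caller's name
static String currentMethodName() {
    return Thread.currentThread().getStackTrace()[2].getMethodName();
}
// usage inside a catch block:
reportthis(currentMethodName() + ": " + e.toString());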
There is no preprocessor in Java, and no macro language either.
While there are situations where either could be useful, if I understand your problem correctly it is entirely unnecessary here, since the stack trace of the exception will already contain the class and method of the place where the exception occurred.
Instead of passing a String to your reportthis(), give it a signature that just takes the exception and prints it (or simply call e.printStackTrace()).
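A minimal sketch of that suggestion; only the method name is taken from the question, the rest is illustrative:
static void reportthis(Throwable t) {
    // the exception's stack trace already names the class and method where it occurred
    t.printStackTrace();
}
// in a catch block: reportthis(e);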