Assigning commands to a Procedure-type variable in the ATEasy environment - function-pointers

I am coding in the ATEasy environment, which uses a BASIC-like language.
Assume I have a variable of the Procedure type, named pFunc.
I also have a driver (DRV) that exposes the command: MYDRV EXECUTE MYCMD(iCnt, dResult).
The function I want to use "sits" in another driver and is not public.
So I only have access to the COMMAND that calls this function.
How do I assign the command to the Procedure variable?
I tried doing this:
pFunc = MYDRV EXECUTE MYCMD
But it does not compile; the compiler thinks I want to call the MYCMD command and asks for its parameters.

I got an answer on the ATEasy support forum on the Marvin Test Solutions site:
https://www.marvintest.com/forums/Thread.aspx?ID=392#bottom
Apparently it is an issue that was fixed in ATEasy 8.
So this:
pFunc = MYDRV EXECUTE MYCMD
actually works in ATEasy 8 and above

Related

calling source from within a function

I'm currently moving from zsh to fish. Most of my aliases etc. are pretty easy to port, but I'm running into problems with the following one:
alias idf.py='unalias idf.py; source $IDF_PATH/export.sh; idf.py'
I know that in fish, aliases are just syntactic sugar for functions, so I started writing the following function:
function "idf.py"
source $IDF_PATH/export.fish
functions --erase idf.py
idf.py $argv
end
The export.fish script is from the ESP-IDF, and it starts complaining about needing to be sourced. After that the function gets removed correctly, but idf.py cannot be found:
This script should be sourced, not executed:
realpath: /home/robert/Programs/esp-idf/export.fish/..: Not a directory
.
fish: Unknown command: idf.py
fish:
idf.py $argv
^
in function 'idf.py'
Do you have any idea how I can source to the global scope from within this function?
Going by the commit history of the project you linked to: the ESP-IDF script you linked is the latest version, but the error you are receiving came from a bug in an earlier version of the script that has since been fixed.
The script attempts to detect whether it is being called with the source command by testing status current-command, but since you are calling it from within the function idf.py, that's the name returned, rather than source.
It was a bug, and it was fixed about two years ago by simply removing the source check.
Upgrading to the latest ESP-IDF release should take care of it.
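For illustration, here is a minimal fish sketch of the behavior described above; the function name is arbitrary, and the comment restates the explanation in this answer rather than the actual export.fish code:

function wrapper
    # While this function runs, (status current-command) reports "wrapper",
    # not "source"; per the explanation above, that is what tripped up the
    # old source-detection check when export.fish was sourced from inside
    # a function.
    echo (status current-command)
end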

In PowerShell, how can I define all env vars for the next command, similar to Python's Popen abilities?

In PowerShell, how can I explicitly define all the env vars for the next command?
I don't want any system env vars if possible, and after this command runs I don't want anything we have done to affect further processes in the shell.
As an example, Python gives us the equivalent ability in Popen, where you can pass a dictionary containing the full environment to the subprocess, and I'm hoping there might be something similar in PowerShell.
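For reference, this is roughly what the Popen approach looks like in Python; the command name and variable here are placeholders:

import subprocess

# The child process sees only the variables passed in env, and nothing set
# here affects the parent process afterwards. (On most systems you will
# still want to include PATH in the dictionary.)
subprocess.Popen(["mycommand"], env={"ONLY_VAR": "value"}).wait()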
I think this link explains what you need: Windows user environment variable vs. system environment variable
[Environment]::GetEnvironmentVariable("TEMP", "Machine")
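The snippet above only reads a variable from a specific scope. For launching a process with a fully specified environment, one possible approach, not covered by the linked page and offered here only as a sketch, is to go through .NET's System.Diagnostics.ProcessStartInfo, which lets you clear the inherited environment and add only the variables you want. The file name, arguments, and variable below are examples:

$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = "cmd.exe"
$psi.Arguments = "/c set"
$psi.UseShellExecute = $false           # required for EnvironmentVariables to take effect
$psi.EnvironmentVariables.Clear()       # drop everything inherited from the shell
$psi.EnvironmentVariables["ONLY_VAR"] = "value"
$p = [System.Diagnostics.Process]::Start($psi)
$p.WaitForExit()

Because the child gets its own copy of the environment, nothing done here changes the calling shell or later processes.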

Want to activate a virtual environment from terminal with Octave/Matlab

I would like to execute a bash command to activate a virtual environment with Octave on Linux. What I actually want to do is run DeepSpeech from Octave/Matlab.
The command I want to use is
source $HOME/tmp/deepspeech-venv/bin/activate
The line of code I tried on my own is system("source $HOME/tmp/deepspeech-venv/bin/activate")
And the output I'm getting is sh: 1: source: not found
I saw this answer on a post and tried the command setenv('PATH', ['/source $HOME/tmp/deepspeech-venv/bin/activate', pathsep, getenv('PATH')]), but it did not help; it returned the same error.
It's not completely clear from your question, but I'm assuming what you're trying to do is run Python commands within Octave/MATLAB, and you'd like to use a Python virtual environment for that.
Unfortunately, when you run a system command from within Octave, what most likely happens is that a subshell is created to execute your command, and that subshell is discarded once the command has finished.
You have several options to rectify this, but I think the easiest one would be to activate the Python virtual environment first and run your Octave instance from within that environment. Octave then inherits all environment variables as they existed when it was started. You can confirm this by doing getenv( 'VIRTUAL_ENV' ).
If that's not an option, then you could make sure that all system commands intended to run Python scripts are prefixed with a call that activates the virtual environment first (e.g. something like system( 'source ./my/venv/activate; python3 ./myscript.py' )).
Alternatively, you can try to recreate the virtual environment's exported variables manually using the setenv command; a sketch of this follows.
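A rough sketch of that last option, using the venv path from the question (adjust it to your setup; activate scripts may set more than these two variables):

% Recreate (approximately) what bin/activate would do, from within Octave/MATLAB
venv = [getenv('HOME') '/tmp/deepspeech-venv'];
setenv('VIRTUAL_ENV', venv);
setenv('PATH', [venv '/bin' ':' getenv('PATH')]);   % prepend the venv's bin directory
system('which python3');                            % should now resolve inside the venv, if it provides python3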

Eval in fish shell function behaving strangely

I tried to make a wrapper for the docker-machine command to change the Docker machine context. Here's the command and its output:
docker-machine env dev
set -x DOCKER_TLS_VERIFY "1";
set -x DOCKER_HOST "tcp://192.168.99.101:2376";
set -x DOCKER_CERT_PATH "/Users/sandric/.docker/machine/machines/dev";
set -x DOCKER_MACHINE_NAME "dev";
# Run this command to configure your shell:
# eval (docker-machine env dev)
Here's the code for the function, which I placed in .config/fish/config.fish:
function cdm
    eval (docker-machine env $argv)
end
So, when I try to run cdm in a new session, the function is evaluated, but the context doesn't change. If, however, I run:
eval (docker-machine env default)
from the command prompt and then run cdm with different arguments, everything works fine. So I suspect it has something to do with whether the environment variables this command tries to set from the fish function already exist. I then experimented and changed that function to an alias (which, as I understand it, is also just a fish function) with a constant instead of a parameter:
alias cdm "eval (docker-machine env dev)"
And it behaved the same way: it didn't change the environment variables if I ran the alias first in a newly opened session, but if I ran the eval code from the command prompt first, the alias worked as expected after that.
So, what is this all about? Does anyone have any ideas?
It looks like Docker's output does not specify an explicit scope, so when you run it inside your function and those variables have not been defined elsewhere, they will end up in the function's scope.
However, if you run the same code from the command prompt, you will end up with variables defined in the global scope, which are then updated by the set in the function.
See the documentation for set:
The scoping rules when creating or updating a variable are:
If a variable is explicitly set to either universal, global or local,
that setting will be honored. If a variable of the same name exists in
a different scope, that variable will not be changed.
If a variable is not explicitly set to be either universal, global or
local, but has been previously defined, the previous variable scope is
used.
If a variable is not explicitly set to be either universal, global or
local and has never before been defined, the variable will be local to
the currently executing function. Note that this is different from
using the -l or --local flag. If one of those flags is used, the
variable will be local to the most inner currently executing block,
while without these the variable will be local to the function. If no
function is executing, the variable will be global.
To fix this problem, try defining your cdm function with --no-scope-shadowing (although this seems to work, I'm not sure that it should).
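A sketch of the suggested definition, which is the same function as in the question with the flag added:

function cdm --no-scope-shadowing
    eval (docker-machine env $argv)
end

With --no-scope-shadowing the function body no longer gets its own shadowing scope, which appears to be why the variables stop ending up function-local.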

Lua: command line parameters are nil on Windows 8.1

I run the Lua script from the command line:
script.lua arg
But when I try to print the value of arg[1] in the script:
print(arg[1])
the result is nil.
When I try to run it like this:
lua script.lua arg
Windows says the command is not recognized.
What am I doing wrong? How can I get the parameters from the command line?
I don't see any problem with your example. Since you are able to run the command but do not get any arguments passed, it's possible that whatever registered the .lua file association didn't use the syntax for passing arguments. You can find the registered association and check the command to make sure it includes %* to pass all the parameters to the script.
You can find where the executable is with the where lua.exe command, and then call that executable directly from the command line to see whether it works.
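For example, from a cmd prompt you can inspect the association like this; the file type name and path shown are only examples, and yours will differ:

C:\> assoc .lua
.lua=Lua.Script
C:\> ftype Lua.Script
Lua.Script="C:\Program Files (x86)\Lua\5.1\lua.exe" "%1" %*

If the command shown by ftype ends with only "%1" and no %*, the arguments are never forwarded to the script, which would explain arg[1] being nil.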