I've been using some very nice code from this example to run one ipython notebook from another, which I (basically) copy below. This turns out to be a very nice way to organize my code.
But now, I want to compare some sympy expressions that I've coded up with roughly equivalent sympy expressions that someone else has coded up. And since there are some name clashes, I'd like to be able to execute the two notebooks in their own namespaces, so that if Bob and I both define a sympy expression x, I can just evaluate
Bob.x - Me.x
to see if they are the same (or find their differences). [Note that it's easy to change a namespace dictionary into a "dottable" namespace using something like this Bunch object.]
Here's the function:
def exec_nb(nbfile):
    """Run all code cells of another notebook in the current IPython session."""
    from io import open
    from IPython.nbformat import current
    with open(nbfile) as f:
        nb = current.read(f, 'json')
    ip = get_ipython()
    for cell in nb.worksheets[0].cells:
        if cell.cell_type != 'code':
            continue
        ip.run_cell(cell.input)
The basic problem is that get_ipython returns the currently running IPython instance, and run_cell then executes the other notebook's cells in that instance's current namespace.
I can't figure out how to change this. For example, wrapping the whole call in exec with a different namespace dictionary still finds the current IPython instance and uses its namespace.
Also, both notebooks actually need to be run in IPython; I can't export them to scripts and execute the scripts in a namespace.
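For concreteness, that attempt looks roughly like this (a sketch; 'bob.ipynb' is a placeholder filename):
ns = {'exec_nb': exec_nb}
# Hoped-for isolation: run the call under exec with a separate globals dict.
# It doesn't help: get_ipython() inside exec_nb still returns the live shell,
# so run_cell drops Bob's definitions into the current namespace, not into ns.
exec("exec_nb('bob.ipynb')", ns)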
For the record, the link Jakob pointed to has now moved here, and answered my question perfectly.
So I'm writing Python with IDLE and my usual workflow is to have two windows (an editor and a console), and run my script with F5 to quickly test it.
My script has some non-optional command line parameters, and I have two conflicting desires:
If someone launches my script without passing any parameters, I'd like him to get an error telling him to do so (argparse does this well)
When I hit F5 in IDLE, I'd like to have my script run with dummy parameters (and I don't want to have more keystrokes than F5 to have this work, and I don't want to have a piece of code I have to remember to remove when I'm not debugging any more)
So far my solution has been that if I get no parameters, I look for a params.debug file (which is not under source control), and if it exists, take its contents as the default params. But that's a bit ugly... so is there a cleaner, more "standard" solution to this? Do other IDEs offer easier ways of doing this?
Other solutions I can think of: environment variables, having a separate "launcher" script that's the one taking the "official" parameters.
(I'm likely to try out another IDE anyway)
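For what it's worth, the environment-variable idea could look something like this (just a sketch; DEBUG_ARGS and the infile argument are made-up names):
import argparse
import os
import sys

parser = argparse.ArgumentParser()
parser.add_argument('infile')  # placeholder required argument

# With no real arguments (e.g. an F5 run), fall back to a whitespace-separated
# list taken from an environment variable; if that is unset too, argparse
# still produces the usual "missing argument" error.
args = parser.parse_args(sys.argv[1:] or os.environ.get('DEBUG_ARGS', '').split())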
With some editors you can define the 'execute' command.
For example, in Geany, F5 for Python files runs python2.7 %f. That could be changed to something like python2.7 %f dummy parameters. But I use an attached terminal window and its line history more than F5-like commands.
I'm an IPython user, so I don't remember much about the IDLE configuration. In IPython I usually use the %run magic, which is more like invoking the script from a shell than from an IDE. IPython also has a better previous-line history than the shell.
For larger scripts I like to put the guts of the code (classes, functions) in one file, and test code in the if __name__ block. The user interface is in another file that imports this core module.
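A bare-bones sketch of that layout (file and function names are just placeholders):
# core.py -- the guts: classes, functions, plus quick tests under the guard
def process(path, count=1):
    return [path] * count

if __name__ == '__main__':
    # dummy-argument test run, convenient for F5 or %run core.py
    print(process('dummy.txt', count=2))

# cli.py -- the user interface; the only file that parses a real command line
import argparse
import core

parser = argparse.ArgumentParser()
parser.add_argument('path')
parser.add_argument('--count', type=int, default=1)
args = parser.parse_args()
print(core.process(args.path, count=args.count))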
Thanks for your question. I also searched for a way to do this. I found that Spyder which is the IDE I use has an option under Run/Configure to enter the command line parameters for running the program. You can even configure different ones for the different editors you have open.
Python tracker issue 5680 is a proposal to add to IDLE a way to set command line args for F5 Run Module. You are free to test any of the proposed patches if you are able to apply them.
In the meantime, conditionally extending sys.argv as done below should fulfill your requirements.
import sys
if __name__ == '__main__':
    if 'idlelib.PyShell' in sys.modules:
        sys.argv.extend(('a', '-2'))  # add your arguments here
    print(sys.argv)  # in use, parse sys.argv after extending it
    # ['C:\\Programs\\python34\\tem.py', 'a', '-2']
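In actual use you would hand the extended sys.argv to argparse, e.g. (a sketch; infile and --verbose are placeholder arguments):
import argparse
import sys

if __name__ == '__main__':
    if 'idlelib.PyShell' in sys.modules:
        sys.argv.extend(('dummy.txt', '--verbose'))  # dummy arguments for F5 runs

    parser = argparse.ArgumentParser()
    parser.add_argument('infile')
    parser.add_argument('--verbose', action='store_true')
    args = parser.parse_args()  # sees the dummy values under IDLE, real ones otherwise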
I am new to Python. I generated a macro, which is a .py script, using the Abaqus Macro Manager. I realised that this script works only when run from the Macro Manager and does not run by itself.
Does anyone know how to modify this script so that I can run it without using Abaqus? Thank you in advance for your help.
Adroit
To run a Python script that relies on Abaqus CAE from the command line, without opening the GUI window, you do:
abaqus cae noGUI=script.py
As mentioned, if all the script does is define a macro, then that is all running it will do: define the macro and quit. Typically you need to add code to open an odb, do something, write output, etc.
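For instance, a script run via abaqus cae noGUI= usually does something end to end. A minimal sketch using the odbAccess interface (the file name Job-1.odb is an assumption, and note that Abaqus embeds Python 2):
from odbAccess import openOdb

# open an existing results database, inspect it, then close it
odb = openOdb(path='Job-1.odb')
print odb.steps.keys()   # names of the analysis steps in the results file
odb.close()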
In general, Python scripts can be run in Abaqus via 'File > Run Script'. However, as is the case for all Python scripts, if all your code is contained inside a function (and in the case of an Abaqus macro, it is), and that function is never called explicitly inside the script, the code will not be executed.
Your file probably looks something like this:
from abaqus import *
# some other imports, if any

def macro_function():
    # code defining the macro's behavior
You should edit the script so that the function is called at the end.
If you want some more concrete help, post your actual code.
EDIT: To call the defined function, you just write macro_function() at the end of the file, so that the script looks something like this:
from abaqus import *
# some other imports, if any

def macro_function():
    # code defining the macro's behavior

macro_function()
Maybe it would be easier if you just had the code outside of the function and removed the function completely. For anything more than this, you really should learn some Python.
In my moderate experience, if you need loop computations you have to launch the script inside CAE, since when it is started from the command line only one cycle is computed. You can find an example of a script intended for loop computations and visualisation on ResearchGate; search for "How to write scripts for Abaqus".
When using IPython, it's often convenient to see how long your commands take to run by using the %time magic function. When you use this often enough, you start to wish that you could just toggle a setting to get this metadata by default whenever you enter a query. Psql lets you do this with \timing. GHCi lets you do this with :set +s. Does IPython let you do this? And if not, why not?
The "ipythonic" way of timing code is using the %timeit or %%timeit magic function (respectively for online, and multi-line code).
These functions provide quite accurate results by running the code multiple times (the exact number is adaptive if not specified).
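For example (sum(range(1000)) is just a stand-in expression):
# one-line form:
%timeit sum(range(1000))

# cell form (%%timeit must be the first line of the cell; names assigned
# inside the timed cell do not leak back into the notebook's namespace):
%%timeit
total = 0
for i in range(1000):
    total += i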
The global flag you are asking about does not exist in IPython. Furthermore, just adding %%timeit to all the cells will not work, because global variables are not modified when calling %%timeit. This "feature" is directly inherited from the timeit module.
For more info see this ipython issue.
According to the docs, I should be able to define a macro and store it. Then, the macro will be available the next time I start the IPython shell. But, it doesn't work:
In [4]: print "Foobarbatbizbuzzbonk"
Foobarbatbizbuzzbonk
In [5]: %macro foo 4
Macro `foo` created. To execute, type its name (without quotes).
=== Macro contents: ===
print "Foobarbatbizbuzzbonk"
In [6]: %store foo
Stored 'foo' (Macro)
In [7]: quit()
When I start the IPython shell again, no macros:
In [1]: foo
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
<ipython-input-1-d3b07384d113> in <module>()
----> 1 foo
NameError: name 'foo' is not defined
In [2]: %macro
Out[2]: []
Does anyone know why this doesn't work?
I found the answer to this in a couple of obscure places.
First, in the README found in $HOME/.ipython, it says, "For more information on configuring IPython, do: ipython config -h"
Doing that produces a bunch of help including the following advice:
To initialize a profile with the default configuration file, do::
$> ipython profile create
and start editing `IPYTHONDIR/profile_default/ipython_config.py`
The old docs for this configuration file are here:
Configuring the ipython command line application. The latest (as of Jan. 2020) are in the much improved section on Configuration and customization.
Finally, I found my answer in the docs for storemagic [Link updated, Jan. 2020]:
%store magic for lightweight persistence. Stores variables, aliases
and macros in IPython’s database. To automatically restore stored
variables at startup, add this to your ipython_config.py file:
c.StoreMagics.autorestore = True
Add that, restart IPython and bang! there are my macros. Cool!
As an alternative to defining a macro in the shell and then storing it, you could just define the macro in a startup file. For example, you could put the following line into a file in IPYTHONDIR/profile_default/startup/ (startup files run after the shell is created, so get_ipython() is available there):
get_ipython().define_macro('foo','print "Foobarbatbizbuzzbonk"')
If you want to define a macro for a magic command, you could use something like this:
get_ipython().define_macro('clr',"""get_ipython().magic(u'%reset -sf')""");
I haven't figured out how to define a macro for a magic command with options that will accept arguments (e.g., typing 'rn tmp' instead of '%run -i tmp' to run the file tmp.py in ipython's namespace).
I must admit that sometimes finding good documentation for IPython (for the basics) is a nightmare. It feels as though this interactive console was created for real programming gurus. At least, that is how I feel when trying to solve a simple issue like saving a variable for later use, something that is more than simple in MATLAB.
But to answer your question: try opening IPython normally by just typing ipython, and then run %store -r
It should recover your data with the original name(s). I still do not know the point of storing variables into files, or of recovering only the variables that I want. By the way, to see the names of your stored variables, type %store.
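For example, in a fresh session (reusing the foo macro from the question), it looks like this:
In [1]: %store -r

In [2]: foo
Foobarbatbizbuzzbonk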
Hope it works for you.
Is there a way for "ipython3 notebook" to receive command line arguments with its 'run' button?
Thank you very much.
I just encountered this problem when I tried to run some scripts that use the argparse library in Jupyter Notebook. Since I didn't want to modify the code much, I needed to provide sys.argv to the parser myself.
In my case, code like this solves the problem:
import sys
sys.argv = ['self.py', 'arg1', 'arg2']
On a real command line, the first element of sys.argv is the name of the script, so you have to provide a dummy script name (or a name suitable for your application), and the following elements are the arguments exactly as they would be passed on a real command line.
This has worked for me so far, but I don't know whether it is safe or elegant; I hope it helps you.
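Putting it together with argparse, a notebook cell could look roughly like this (a sketch; the infile and --count arguments are placeholders):
import argparse
import sys

# pretend the notebook was launched as: python self.py input.txt --count 3
sys.argv = ['self.py', 'input.txt', '--count', '3']

parser = argparse.ArgumentParser()
parser.add_argument('infile')
parser.add_argument('--count', type=int, default=1)
args = parser.parse_args()   # parses the fake argv instead of the notebook server's flags
print(args.infile, args.count)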
Do you mean the Cell -> Run menu item? If so, no. The notebook's not really designed to be used like that. What are you trying to do?