How to mark a JupyterLab document as trusted when building through mybinder.org so that all code cells run automatically - jupyter

I need all code cells in a JupyterLab document to run when it is opened in Binder. I've tried adding jupyter trust <filename> to the postBuild file, but it doesn't work. Any tips are welcome.

When you say that it doesn't work, does it at least show as trusted when you open the notebook? That approach is (or was) the recommended way to set a notebook to trusted. Until the extension-enabling example notebook was streamlined to only include an example that works in both the classic and JupyterLab interfaces, that was the way documented here. There's also a discussion here of how to check whether a notebook is trusted.
However, just setting trusted does not trigger the code to run when the notebook is launched.
Aside from writing an extension that runs code, like python-markdown, there is no simple way to trigger that in the BinderHub system at this time. When this question came up the other day in the Binder Gitter channel, I suggested a person could most likely make an empty notebook appear to have just been run by using a start config file to trigger execution of the notebook via nbconvert or papermill.
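To illustrate roughly what that could look like (just a sketch, not an official Binder recipe; the script and notebook names are made up, and papermill would need to be listed in the repo's requirements), a start config file could call a small Python helper that executes a notebook in place with papermill, so its output cells are already populated when the user opens it:
# run_notebooks.py -- hypothetical helper a Binder start file could invoke.
# Executes the notebook in place so its output cells are filled in before
# the user opens it. Assumes papermill is installed via requirements.txt.
import papermill as pm

pm.execute_notebook(
    "index.ipynb",   # hypothetical input notebook
    "index.ipynb",   # write the executed copy back over the original
    kernel_name="python3",
)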
Alternatively, using Appmode as discussed here will execute all cells when run in app mode; however, the interface won’t be a notebook. Similarly, installing Voila and linking to the Voila render will run code in a notebook at launch. It will look like a web app interface and not a notebook though.

Related

command not found: flow

I followed the Flow installation guide for npm & Babel, and when I get to the second stage where you run flow init I keep getting the error message zsh: command not found: flow. I installed Flow into my project (a branch of my Gatsby blog) for testing/debugging purposes. It is not installed globally, which is what the Flow docs state is the best practice:
Flow works best when installed per-project with explicit versioning rather than globally.
I have been having a similar issue with Lume, which returns zsh: command not found: lume.
If I enter echo $PATH, the colon-delimited list should include user/local/.deno/bin, but it is not there. If I add it by running:
export PATH="/Users/yourUserName/.deno/bin:$PATH"
Then I am able to run lume commands. However, when I try to run lume commands the next day, I have to go through the whole process once more as the error crops up again...
My question here today is about the Flow error and getting it sorted. I only mention the Lume error because it makes me fairly certain something is messed up in $PATH or my zsh config; I am just not sure what. The only caveat to that hunch, though, is that Deno is a global install, whereas Flow is installed directly into my project...
So maybe the two errors, while they have the same syntax, are totally separate?
Thank you in advance for any guidance/suggestions. Cheers!
I came across a video from 2017, no less, in which the host had issues with Flow not working within the project, so he installed it globally. I gave it a shot, and the zsh: command not found: flow error has been resolved...

TwinCAT Activate Configuration error 0x1028 - unable to activate

I am a beginner in TwinCAT and am trying to run a sample program on my system (not on a target).
I did all the steps mentioned here and did get the system up and running my sample code once. However, when I tried to run it again after a system restart, I got an error message.
I tried creating a new empty solution and another one with test code; both of them throw the same error code. Also, when I click the green Restart TwinCAT System button, I get the following error.
How do I solve this?
You need to enable AMD-V in your BIOS.
The procedure is explained here:
https://youtu.be/P9uUgT8EhUM?t=1029
(This is done for an Intel system, while you seem to have an AMD system, but it's the same procedure).
Also make sure to run this file in CMD (as Administrator):
C:\TwinCAT\3.1\System\win8settick.bat
And then reboot your computer.
If this doesn't do the trick, then do core isolation and run your TwinCAT task on an isolated core. Maybe that was what you were doing before but not now? It's described quite well in this video: https://youtube.com/watch?v=q7iRvDuAOFQ

Running Alteryx flows from command line

I'm trying to figure out if I can launch a pre-built Alteryx workflow without launching the Designer - and without having Alteryx Server.
I came across a helpful post on Alteryx uses by #Runonthespot that, among other things, addresses running workflows from the command line but doesn't go into detail. That discussion is here: https://stackoverflow.com/a/30469848/4313331. I don't have the rep to comment on his post, and the question is closed.
He writes:
"Flows are runnable from the commandline on a server, and easiest way I've found (besides using Alteryx's own scheduler) is to save as an "App", and then run from the command line using the Alteryx engine executable, passing it parameters via xml file. You can save a sample xml parameter file from your flow by hitting the magic wand button (after saving the flow as a .yxwz (app)) This brings up a panel that lets you set the variables, and that panel has a handy "save" button which generates an xml file in the right format."
So, I'm looking for more info on this process. Is it simply a question of using Alteryx Server? Or is this a more interesting workaround?
Thanks.
Yes, you can run a workflow (used generally to refer to a workflow, macro, or analytical app) without launching the Designer. You'll first need to understand how to run the workflow from the command line. The AlteryxEngineCmd.exe executable runs a workflow. It is located in the Alteryx install path in the bin subfolder. Here is where mine is located:
C:\Program Files\Alteryx\bin
It accepts an additional parameter: an XML file with interface values. This is documented for analytical apps ONLY, though in my extensive experience it works for macros as well, even though that use is undocumented.
Below are two examples:
AlteryxEngineCmd.exe MyWorkflow.yxmd
AlteryxEngineCmd.exe MyAnalyticApp.yxwz AppValues.xml
You can see a post here:
Alteryx Command Line Help
I prefer to wrap the command in a batch file and execute that for more control.
Now that you understand how to run the workflow from the command line, you can execute it anytime you want without launching Designer. Furthermore, you can use Windows Scheduler or a third-party tool to run the command or the batch file on a schedule.
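For illustration only (this is just a sketch, not Alteryx tooling; the install path and file names are simply the examples from above and would need adjusting), the same command-line call can also be wrapped in a small Python script instead of a batch file:
# Hypothetical Python wrapper around the command-line calls shown above.
# Adjust the engine path and workflow names to your own installation.
import subprocess

ENGINE = r"C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe"

def run_workflow(workflow, app_values=None):
    # Run a .yxmd workflow, or a .yxwz app with an optional XML values file.
    cmd = [ENGINE, workflow]
    if app_values:
        cmd.append(app_values)
    subprocess.run(cmd, check=True)  # raises if the engine exits with an error

if __name__ == "__main__":
    run_workflow("MyWorkflow.yxmd")
    run_workflow("MyAnalyticApp.yxwz", "AppValues.xml")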
Finally, you do need a license which enables API & Command Line w/ Scheduler. This is less expensive than Alteryx Server.
Have you tried C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe? It doesn't require server.
https://help.alteryx.com/2019.1/Command_Line.htm
If you're constrained by budget, you don't need the Scheduler license (which is what enables AlteryxEngineCmd.exe); you can use a Windows mouse-clicker tool or even PowerShell to drive the Designer itself without manual intervention.

VSCode remote editing, download-on-open option?

I am mostly editing files remotely in VSCode, and have tried several sftp extensions. ftp-sync has been the best so far, but there is one nagging problem that hopefully someone has solved: Upload-on-save is great and works perfectly, but I'd like to Download-on-open also (with bonus points for warning if the file is different). I sometimes edit the remote files on the remote server, and because there's no check in vscode on open, it's easy to lose those changes. Anyone run into this and have suggestions for a different extension that works this way?
A recent release on March 19th of https://github.com/liximomo/vscode-sftp has added support for this functionality with downloadOnOpen. It works perfectly for my use case (if there's an updated version of the file on the server, download and use that). The UX is a little rough still, but will surely improve over time.
If you're using git locally, there's very little chance of losing local changes, so this works perfectly for the case where you want to edit and manage files locally, but stay in sync per-file with a remote ssh/sftp server.
Look into the Remote VSCode plugin. It doesn't do FTP-like navigation, but if you use SSH, you can tunnel an editing session over the connection into VSCode pretty easily. It felt a little wonky at first, but I use this plugin constantly. As I work across a fleet of a few hundred servers, this option made a lot more sense than trying to set up some of those "deploy" plugins for each host.
Check out this extension on VSCode. It's really awesome:
Remote WorkSpace

How can I get an IPython NotebookApp started and still manage it from outside of a browser?

It is possible to start a notebook app within an IPython console by running:
from IPython.html import notebookapp
nbapp = notebookapp.NotebookApp()
nbapp.initialize()
nbapp.start()
This simply opens a browser with a dashboard from which it is possible to create/delete and start/shut down notebooks. However, nbapp.start() hooks into the Tornado HTTP server, and it is then not possible to use the nbapp instance to manage the notebooks from the console.
Some level of management can be done before calling nbapp.start(), but I couldn't find a way to start a proper notebook page (linked to a new IPython kernel) that can be read and edited from a web browser.
Taking a look at the start method in the NotebookApp class, it seems that all the magic is done by the following method call:
ioloop.IOLoop.instance().start()
This is the link for what I am referring to:
https://github.com/ipython/ipython/blob/master/IPython/html/notebookapp.py#L824
ioloop is imported at the beginning of the file, and I didn't quite understand what this actually does:
# Install the pyzmq ioloop. This has to be done before anything else from
# tornado is imported.
from zmq.eventloop import ioloop
ioloop.install()
I was wondering if there was a way to start the server just the way it works now, and send requests to it the way a mouse click on the dashboard does.
Or even better, have full access to the nbapp instance to create and start notebooks in the server.
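Something like the rough, untested sketch below is what I have in mind: keep the same old IPython.html API, but run the blocking start() call on a background thread so the console still has hold of the nbapp instance. I'm not sure whether this is safe or the right approach, though.
import threading
from IPython.html import notebookapp

nbapp = notebookapp.NotebookApp()
nbapp.initialize([])  # empty argv so the console's own command-line options are ignored

# start() blocks on the Tornado IOLoop, so run it on a background thread
server_thread = threading.Thread(target=nbapp.start)
server_thread.daemon = True
server_thread.start()

# nbapp is still reachable from the console here, e.g. its port and notebook directory
print(nbapp.port)
print(nbapp.notebook_dir)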
Hope someone can help me with that; I would love to better understand how the IPython notebook works behind the scenes.
Cheers