How do I run a sister script on a Raspberry Pi Pico using code inside the main.py script? - MicroPython

So I have a Raspberry Pi Pico which, on boot, runs main.py. I have a sister script on the Pico, and I was wondering if there is a function or way to run one of these sister scripts using code in main.py. I would also like it to run only the sister script, and not go back to main.py until the Pico reboots.
Something like this:
if variable1.value == 1:
    run sisterScript.py
I have tried putting the entire sister script into a function (imports and all), which I can then import and call from main.py. While this does run the sister script, execution returns to main.py afterwards. I would like it to stay in the sister script once main.py has issued the command to switch.

To stay on the side-loaded script, make it so it never finishes its job. One option is to create an endless loop.
In sisterScript.py:
# do stuff
while True:
    pass  # do more stuff here, and do not leave this loop
In main.py:
if variable1.value == 1:
    import sisterScript
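A fuller sketch of the same pattern, assuming variable1 is a machine.Pin input (the pin number and pull-down are illustrative, and note that Pin.value() is a method call in MicroPython):
# main.py -- minimal sketch; GP15 and PULL_DOWN are assumptions
from machine import Pin

variable1 = Pin(15, Pin.IN, Pin.PULL_DOWN)

if variable1.value() == 1:
    import sisterScript  # enters sisterScript's endless loop and never returns

# normal main.py work continues here only when the pin reads 0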

Related

How to run a MicroPython host script file on the Raspberry Pi Pico from the host command line and receive UART output back on terminal?

I know how to do something similar with Thonny: I could paste my code into the editor and press the green "Run" button. This would run the program and give me output. But it would require copy-pasting my file into Thonny (I want to code in Vim or run pre-existing examples) and pressing GUI buttons, which I don't want to do.
Another related approach would be to copy the program as main.py to the Pico, e.g. with rshell: How can you make a micropython program on a raspberry pi pico autorun? But this requires plugging and unplugging the USB, and then reconnecting to the UART every time to see the output.
Is it possible to send file contents to a GNU Screen session? would likely also solve or almost solve my problem, but:
I don't want to start a named server and then run another command; it's messy, I just want to run!
I would need to figure out how to send Ctrl+D to soft restart. It shouldn't be hard, but I haven't bothered to learn it.
The first way I got it to work was with https://github.com/scientifichackers/ampy. That tool is designed for exactly this job, and does it perfectly with the run command:
python3 -m pip install --user adafruit-ampy
ampy --port /dev/ttyACM0 run blink.py
Outcome:
stops execution of current program
starts execution of blink.py
shows UART output on my shell
I can then quit ampy with Ctrl + C to get back to my shell, and the program continues to run.
Tested on adafruit-ampy==1.1.0, Ubuntu 22.04 host, Raspberry Pi Pico W, MicroPython rp2-pico-w-20221014-unstable-v1.19.1-544-g89b320737.uf2.
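For reference, a minimal blink.py along the lines of what the command above runs (a sketch; "LED" is the name MicroPython uses for the Pico W's onboard LED):
# blink.py -- minimal MicroPython sketch for a Pico W
from machine import Pin
import time

led = Pin("LED", Pin.OUT)  # onboard LED on the Pico W

while True:
    led.toggle()     # flip the LED state
    time.sleep(0.5)  # half-second blink interval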

Python Windows how to get STDOUT data in real time?

I have a Windows executable that I want to run over and over. The problem is that sometimes there's an error about 1 second in, but the program doesn't exit. So what I would like to do is grab the contents of stdout, recognize there is an error, and then kill the subprocess and start it over.
When I run this executable, stuff prints to the screen just fine. But when I wrap it in a subprocess from Python, the stdout doesn't show up until the program terminates.
I've tried basically everything posted here with no luck:
Constantly print Subprocess output while process is running
Here's my current code, I replaced the executable with a second python program just to remove any other weird variables:
parent_program.py:
import subprocess, os, sys

program = "python " + os.path.dirname(os.path.abspath(__file__)) + "/child_program.py"
with subprocess.Popen(program, shell=True, stdout=subprocess.PIPE, bufsize=1, universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='')
child_program.py:
from time import sleep

for i in range(0, 10):
    print(i)
    sleep(1)
What I would expect is to see 0, 1, 2, 3... printed one second at a time, as if I had just run python child_program.py, but instead I get nothing for 10 seconds and then get all the output at once.
I also thought about trying to run the program from the CMD prompt and piping the stdout to a file (python child_program.py 2>&1 > output.txt) and then having Python read that file, but it's the same problem: the file doesn't get written until the program terminates.
Is there any way to fix this on Windows?
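For reference: the usual cause of behaviour like this is that the child buffers its stdout when it isn't attached to a console (an assumption here, not something stated in the post). A sketch of the two common workarounds:
child_program.py, flushing variant:
from time import sleep

for i in range(0, 10):
    print(i, flush=True)  # push each line out of the child's buffer immediately
    sleep(1)
Alternatively, launch the child interpreter unbuffered from parent_program.py:
program = "python -u " + os.path.dirname(os.path.abspath(__file__)) + "/child_program.py"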

Interacting with IPython kernel from a Bash script

Is it possible to interact with an IPython interactive session (or with a kernel) from a Bash script? Ideally, I'd like to do something like this, within a shell script (I'm aware that the send subcommand probably doesn't exist like this):
# do stuff in Bash ...
# start a kernel and get its Id
KERNEL=`ipython init --command="print(__KERNELID__)"`
# do something inside the kernel
ipython send --kernel=KERNELID --command="mylist = [0,1,2]"
Then, ideally, the command
ipython send --kernel=KERNELID --command="print(mylist)"
would output
[0, 1, 2]
In the end, I would need to destroy the kernel somehow:
ipython --kernel=KERNELID --command="sys.exit()"
Probably there is already a mechanism to do what I'd like, right? Unfortunately, I wasn't able to find it...
There are quite a number of ways around this problem. Since you are already using Python, you might as well use Python for the whole thing: Python programs can take command-line arguments like mylist and do whatever you want with them.
Since you are sending commands to be evaluated, make sure you are the one controlling the inputs. Don't let someone start typing "import os" and "os.unlink([your hard drive here])", for example.
For other options, check out expect for your interactive needs (http://expect.sourceforge.net/), or for the Python version, check out the pexpect module (http://pexpect.sourceforge.net/pexpect.html).
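A minimal pexpect sketch of the idea, driving a plain python3 session for simplicity (the prompt pattern and session command are assumptions, not the asker's hypothetical ipython send API):
import pexpect

child = pexpect.spawn("python3", encoding="utf-8")
child.expect(">>> ")                  # wait for the interactive prompt
child.sendline("mylist = [0, 1, 2]")  # run a command inside the session
child.expect(">>> ")
child.sendline("print(mylist)")
child.expect(">>> ")
print(child.before)                   # text between prompts: the echoed command plus [0, 1, 2]
child.sendline("exit()")              # tear the session down
child.close()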

starting C executable in raspbian on startup

I'm using Raspbian on a Raspberry Pi and I need to start a program on startup. What is the easiest way to do this? A bash script?
Normally I run the following command in a terminal:
../simple/./simple_run 12345
The executable takes 12345 as an input argument.
Can someone step me through how to do this?
You could call your script from /etc/rc.local.
If this file doesn't exist, create it:
#!/bin/sh -e
#
/.../myScript.sh
Replace /.../myScript.sh with your script call, using the full path.
One caveat: your script will be executed as the "root" user... take care!
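Applied to the question's executable, an rc.local could look like this (the /home/pi/simple path is a hypothetical placeholder; use the real absolute path, and the trailing & keeps a long-running program from blocking boot):
#!/bin/sh -e
#
# hypothetical location of the binary; adjust to your own absolute path
/home/pi/simple/simple_run 12345 &
exit 0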

python subprocess communicate hangs calling shell script

Using Python 3.2 and the following code snippet:
import subprocess

p = subprocess.Popen(['../start_server.sh'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
if out is not None:
    out = out.decode('utf-8')
if err is not None:
    err = err.decode('utf-8')
print('out ', out)
print('err ', err)
On some shell scripts it works just fine and I get my output; on others it just hangs. But in every case the shell script runs from the command line with no errors. The only commonality I can see is that (usually) the ones that hang have zero output. When it fails, I check the running processes and I see my shell script is not listed while the Python script is still running.
What's a reliable way to call a shell script and always return control to my Python program?
Edit:
Using pipes, Popen and such is not a requirement; the only requirement is that control is returned to my Python script when the shell script exits. If the shell script never returns to the command prompt, then my Python script will also never return.
So assuming the shell script(s) I am calling always return to the command prompt, how can I get control back to my Python program?
If there's a better way than what I've listed above, please enlighten me.
One additional bit I've found: the shell scripts that "hang" seem to end with a call to 'nohup'. Yet they return to the command prompt with no issues.
What's a reliable way to call a shell script and always return control to my Python program?
If you are using pipes, this will depend on your scripts; a more general answer is essentially the halting problem and even the mighty StackOverflow can't help you with that.
I would encourage you to dig deeper and try to create a reproducible case so that we can help you solve the particular problem you're seeing.
Edit
If you don't need pipes, then just omit the stdout and stderr parameters (or set them to something other than PIPE). See python subprocess management.
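A sketch of that pipe-free variant (subprocess.DEVNULL is Python 3.3+; on 3.2 you would open os.devnull yourself. The nohup detail above suggests a background child may be holding an inherited PIPE open, which would keep communicate() waiting for EOF; that is an assumption, but consistent with the symptoms):
import subprocess

# no pipes: a lingering background child has nothing of ours to hold open
p = subprocess.Popen(['../start_server.sh'],
                     stdout=subprocess.DEVNULL,
                     stderr=subprocess.DEVNULL)
p.wait()  # returns as soon as start_server.sh itself exits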