Jupyter notebook kernel dies when loading .mat files using scipy

I am using Jupyter notebook for my Python work. I have a large .mat dataset file which I am trying to load with the scipy package's scipy.io.loadmat function. When I try to load the file, my Jupyter kernel dies and restarts without any error in the console.
The plain Python console works fine and loads the data properly.
I am on an Ubuntu machine.
Can someone please let me know what the issue is with the Jupyter notebook?
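
For reference, a minimal sketch of the load in question (the file name data.mat and the variable key X are placeholders). A kernel that dies with no traceback is often being killed by the operating system for running out of memory, so comparing the file size against available RAM is a reasonable first check:

import scipy.io

# Load the MATLAB file into a dict mapping variable names to numpy arrays
mat = scipy.io.loadmat("data.mat")
print(mat.keys())           # inspect which variables the file contains
X = mat["X"]                # placeholder variable name
print(X.shape, X.dtype)     # rough sense of how much memory it needs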

Related

Unable to view MATLAB notebook in Jupyter

I am trying to run MATLAB code from Jupyter notebooks.
I am following the link below for guidance:
https://www.youtube.com/watch?v=WufMGW5Bv4g
I have installed matlab_kernel, and I can see that it is installed on the machine when I run pip list.
However, when I open Jupyter notebook and click the New notebook dropdown, I do not see a MATLAB option.
Also, at 3:22 in the video, the instructor installs Python support from Program Files/Matlab.
However, I do not see a MATLAB folder in my Program Files.
What am I missing here? Any guidance would be very helpful.
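
It may help to check whether the pip package actually registered a kernelspec with Jupyter, since the New dropdown only lists registered kernelspecs. A minimal check using jupyter_client (the kernel name "matlab" is an assumption; the actual registered name may differ):

from jupyter_client.kernelspec import KernelSpecManager

# List every kernelspec Jupyter knows about (name -> install directory)
specs = KernelSpecManager().find_kernel_specs()
print(specs)
# If "matlab" (assumed name) is missing here, the kernel package was
# installed via pip but never registered, so it cannot appear in the dropdown.

Also note that matlab_kernel requires the MATLAB Engine API for Python, which ships inside the MATLAB installation itself; if there is no MATLAB folder under Program Files, MATLAB may not be installed on that machine at all.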

How to set up custom Jupyter kernels on WSL2?

I've been following https://queirozf.com/entries/jupyter-kernels-how-to-add-change-remove to create a Jupyter kernel.
This procedure worked when I was on Ubuntu or WSL1.
However, I'm unable to change kernels on WSL2. Whatever custom kernel I select, python and the pip packages seem to point to the environment from which the Jupyter notebook was launched, not the virtualenv associated with the kernel.
Does anyone know how to set up custom Jupyter kernels on WSL2?
Update: a recent update on my computer seems to have resolved the problem.
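
When a kernel silently falls back like this, it can help to confirm which interpreter the kernel is really running. A minimal diagnostic to run in a notebook cell:

import sys

# The interpreter the kernel process is actually using; if this is not the
# virtualenv's python, the kernelspec's argv points at the wrong binary.
print(sys.executable)

If it prints the wrong interpreter, re-registering the kernel from inside the activated virtualenv (python -m ipykernel install --user --name myenv, where myenv is a placeholder name) rewrites the kernelspec with that environment's absolute python path.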

Start notebook from other notebook

Using jupyter-lab
%run otherNotebook.ipynb
gives the following error message
Error: file not found otherNotebook.ipynb.py
How can I use the magic command and prevent it from appending .py to the file name?
As described here, %run is for running a named file inside IPython as a program; Jupyter notebooks are not Python programs.
Notebooks can be converted to Python programs/scripts using Jupytext. Following that conversion you could then use %run.
Alternatively, you can use nbconvert to execute a notebook, or use Papermill, which additionally lets you pass in parameters at run time. I have an example of both commented out in code under 'Step #5' here and 'Step #2' here.
If you are actually trying to bring the code into your present notebook, then you may want to explore importing Jupyter notebooks as modules. importnb is recommended here for making importing notebooks more convenient; a sketch of that approach follows below. I also just came across the subnotebook project, which lets you run a notebook the way you would call a Python function: pass parameters in and get results back, including output contents.
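
A minimal sketch of the importnb route, assuming otherNotebook.ipynb sits in the current working directory and defines a function helper() (both names are placeholders):

from importnb import Notebook

# Inside this context manager the import machinery can load .ipynb files,
# so the notebook is imported like an ordinary module and its cells execute.
with Notebook():
    import otherNotebook

otherNotebook.helper()  # call a function (placeholder name) defined in the notebook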

Jupyter Notebook is showing No pyspark kernel upon startup

I am running pyspark scripts in Jupyter notebook, but the kernel is not starting. Upon selecting pyspark from the dropdown, the kernel loads, remains busy for some time, and then shows "No kernel".
Can someone help me?
Note: upon running jupyter kernelspec list, I can see the pyspark kernel in the list.
The issue was resolved only by reconfiguring the Jupyter notebook.
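
If a dedicated pyspark kernelspec keeps failing, a common workaround is to start Spark from a regular Python kernel instead, for example with the findspark package. A minimal sketch, assuming SPARK_HOME is set (or Spark is in a default location findspark can discover):

import findspark
findspark.init()            # adds pyspark to sys.path using SPARK_HOME

from pyspark.sql import SparkSession

# Start a local Spark session from an ordinary Python kernel
spark = SparkSession.builder.master("local[*]").appName("test").getOrCreate()
print(spark.version)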

Can't launch PySpark in browser (Windows 10)

I'm trying to launch a PySpark notebook in my browser by typing pyspark at the console, but I get the following error:
c:\Spark\bin>pyspark
python: can't open file 'notebook': [Errno 2] No such file or directory
What am I doing wrong here?
Please help.
Sounds like Jupyter notebook is either not installed or not on your PATH.
I prefer to use Anaconda as my Python distribution; Jupyter comes standard with it and the installer sets up all necessary path information as well.
After that, as long as you have set PYSPARK_DRIVER_PYTHON=jupyter and PYSPARK_DRIVER_PYTHON_OPTS='notebook' correctly, you are good to go.
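
A quick way to test the "not on PATH" theory from any Python prompt, using only the standard library:

import shutil

# Returns the full path of the executable, or None if it is not on PATH
print(shutil.which("jupyter"))
print(shutil.which("jupyter-notebook"))

If both print None, installing Jupyter (e.g. pip install notebook) or fixing PATH comes before any PySpark configuration.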
You want to launch the Jupyter notebook when you invoke the pyspark command. To do that, add the following to your ~/.bash_profile or ~/.zshrc (on Windows, set the same variables with set in the console or via the System Environment Variables dialog), then run pyspark:
export PYSPARK_SUBMIT_ARGS="pyspark-shell"
export PYSPARK_DRIVER_PYTHON=ipython
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
pyspark