Problems with Hail 0.2 on Azure Databricks (Python 3.7)

Hello, can anyone help with Hail 0.2 on Azure Databricks?
After pip install, lots of problems came up: the Java package can't be found, and import hail.plot and hl.init() both fail.
Following the documentation at
https://docs.azuredatabricks.net/applications/genomics/tertiary/hail.html#create-a-hail-cluster
I ran pip install hail and set ENABLE_HAIL=true in the cluster's environment settings.
However, running:
import hail as hl
hl.init(sc, idempotent=True)
AttributeError: module 'hail' has no attribute 'init'
I also tried the code from another document,
https://docs.azuredatabricks.net/applications/genomics/tertiary/hail.html
import hail as hl
import hail.expr.aggregators as agg
hl.init(sc, idempotent=True)
ModuleNotFoundError: No module named 'hail.expr'
Can anyone give a solution?
Thanks a lot!

Are you using the genomics runtime? See https://learn.microsoft.com/en-us/azure/databricks/runtime/genomicsruntime#dbr-genomics to launch a cluster with the genomics runtime. Then Hail will be installed if you set the ENABLE_HAIL=true environment variable.
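As a quick sanity check on a genomics-runtime cluster (a minimal sketch: the hl.init(sc, idempotent=True) call is taken from the question, and hl.version() is a standard Hail function):

import hail as hl

# sc is the SparkContext that Databricks provides in the notebook.
hl.init(sc, idempotent=True)
print(hl.version())  # prints a 0.2.x version string once Hail is correctly installed

If import hail itself already fails, the cluster is not picking up the Hail installation at all.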

You may want to move to Azure HDInsight, and follow the instructions on the following page, under the Microsoft Azure section:
https://hail.is/docs/0.2/cloud/other_cloud_providers.html
This should get you up and running!

Related

Importing a module in Databricks

I'm not sure how importing a module in Python really works in Azure Databricks. At first, I was able to make it work using the following:
import sys
sys.path.append("/Workspace/Repos/Github Repo/sparkling-to-databricks/src")
from utils.some_util import *
I was able to use the imported function. But after I restarted the cluster, this no longer worked, even though the path is still in sys.path.
I also tried the following:
spark.sparkContext.addPyFile("/Workspace/Repos/Github Repo/my-repo/src/utils/some_util.py")
This did not work either. Can someone please tell me what I'm doing wrong here and suggest a solution? Thanks.
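One thing to keep in mind (an observation about Python itself, not a verified fix for this exact setup): sys.path lives in the Python process, so anything appended to it is lost when the cluster or the notebook session restarts and has to be re-applied in every new session, for example at the top of the notebook:

import sys

# Path copied from the question; adjust to your repo layout.
repo_src = "/Workspace/Repos/Github Repo/sparkling-to-databricks/src"
if repo_src not in sys.path:
    sys.path.insert(0, repo_src)

from utils.some_util import *  # should now resolve in this session

Alternatively, a cluster init script or a cluster-scoped library can make the path available without per-notebook code.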

Jupyterlab-dash needs to be included in build on Binder but then throws an error

I am trying to launch a Binder repo and import JupyterDash into it.
The Binder build worked.
My environment.yml is:
name: plotly_dash
channels:
  - defaults
dependencies:
  - python
  - ipykernel
  - seaborn
  - pandas
  - matplotlib
  - numpy
  - plotly
  - jupyter-dash
  - dash
  - chart-studio
  - nbformat
  - ipywidgets
  - openpyxl
  - jupyter_server_proxy
However, I get the message:
jupyterlab-dash needs to be included in build
Locally on my machine I also get this message, but it's not a problem and everything works.
In Binder it throws an error:
If you are experiencing the build failure after installing an extension (or trying to include previously installed extension after updating JupyterLab) please check the extension repository for new installation instructions as many extensions migrated to the prebuilt extensions system which no longer requires rebuilding JupyterLab (but uses a different installation procedure, typically involving a package manager such as 'pip' or 'conda').
If you specifically intended to install a source extension, please run 'jupyter lab build' on the server for full output.
As I see it, JupyterDash is not prebuilt.
On
github.com/plotly/jupyter-dash
they say to include:
JupyterDash.infer_jupyter_proxy_config()
This does not work either, and making it work would require some effort on the server side, I guess. However, I don't think this would solve the problem anyway.
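For reference, the usage shown in the plotly/jupyter-dash README runs that call once near the top of the notebook, before the app is constructed (the app name below is illustrative):

from jupyter_dash import JupyterDash

# Ask jupyter-dash to detect the proxy configuration (relies on
# jupyter_server_proxy, which is already in the environment.yml above).
JupyterDash.infer_jupyter_proxy_config()

app = JupyterDash(__name__)  # build the Dash app as usual from here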
What can I do to make the build work?
Thank you for any suggestions.
I posted this on the Jupyter Discourse forum too.

Unable to import 'azure.functions' pylint(import-error) [3,1] and Unable to import '__app__.modules.library_finder' pylint(import-error) [4,1]

I'm trying to test out a serverless Python chatbot API in Microsoft Azure, but when I follow the online guide
https://towardsdatascience.com/creating-a-serverless-python-chatbot-api-in-microsoft-azure-from-scratch-in-9-easy-steps-2f1913fc9581
it gives these errors:
Unable to import 'azure.functions' pylint(import-error) [3,1]
Unable to import '__app__.modules.library_finder' pylint(import-error) [4,1]
Any idea how to resolve this?
Regards
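For context, the imports pylint is flagging would look roughly like this at the top of the function's __init__.py (the module names come from the error messages; the handler body is an illustrative sketch, not code from the guide):

import azure.functions as func              # line 3, per the first error
from __app__.modules import library_finder  # line 4, per the second error

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Minimal handler; the guide wires this up to the chatbot logic,
    # which is where library_finder would actually be used.
    return func.HttpResponse("OK")

Both imports resolve at runtime on Azure; the errors are purely about the linter's environment, as the answer below explains.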
This error is coming from pylint. It seems the linter is not pointed at your virtual environment (.venv) and therefore cannot resolve the azure package. To solve it, you can try this:
In Visual Studio Code:
Locate the Python version in the status bar and click on it
Select the Azure workspace where your project resides
A list of Python versions shows up; pick the one that starts with ./.venv/ (in my case: ./.venv/bin/python)
You might then get a popup saying the linter pylint is not installed
Click the Install button to install it and you should be good to go
Hope this helps

Swift 4.0 No such module 'libxmlKanna'?

https://github.com/tid-kijyun/Kanna
I used the manual installation.
I've tried multiple variations of this, but none of them seem to work. Any ideas?
Thanks in advance.
I was facing the same problem, but I followed the instructions given here:
https://github.com/tid-kijyun/Kanna#manual-installation
And it works for me!
I am on Xcode 9.4.1 and my project uses Swift 4.1.
Make sure you set the proper path,
$(SRCROOT)/Modules
in the target's Build Settings, under Swift Compiler - Search Paths > Import Paths.
Once this is done, remove the import libxmlKanna statement from your file and use:
import Kanna
From the GitHub link you shared, it seems the easiest way is to add pod 'Kanna' to your Podfile, run pod install in your terminal, build your project (Cmd+B), and then add import Kanna in your project. You said import libxmlKanna? Just try import Kanna.
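A minimal Podfile for that route might look like this (the target name is a placeholder):

target 'YourApp' do
  use_frameworks!  # Kanna is a Swift framework, so frameworks must be enabled
  pod 'Kanna'
end

After pod install, open the generated .xcworkspace rather than the .xcodeproj.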

"Attempted relative import in non-package" on Google Cloud ML

Since yesterday or the day before, I get a
ValueError: Attempted relative import in non-package
for an import in my main trainer file like
from . import mobilenet_v1 as mobilenet
when running the exact same trainer code with the exact same parameters on Cloud ML, using the exact same training job. I'm bundling my trainer using the gcloud tool. I tried rolling my own setup.py instead, without luck. Any pointers as to what this could be caused by?
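For reference, a minimal setup.py for a gcloud-packaged trainer is usually along these lines (a sketch; the package name is illustrative, not taken from the question):

from setuptools import find_packages, setup

setup(
    name="trainer",            # illustrative package name
    version="0.1",
    packages=find_packages(),  # picks up the trainer directory, which needs an __init__.py
)

The from . import ... form only works when the trainer module is imported as part of a package, which is why the __init__.py matters.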
Looks like this was actually a Cloud ML bug. It has been fixed! Thank you for the super fast turnaround.