Since yesterday or the day before, I get a
ValueError: Attempted relative import in non-package
for an import in my main trainer file like
from . import mobilenet_v1 as mobilenet
when running the exact same trainer code with the exact same parameters on Cloud ML, using the exact same training job. I'm bundling my trainer with the gcloud tool. I tried rolling my own setup.py instead, without luck. Any pointers as to what could be causing this?
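For context, that error just means Python executed the trainer file as a top-level script rather than as a module inside a package, so the relative import has no parent to resolve against. A minimal local reproduction (package and file names are made up for the demo):

```python
import os
import subprocess
import sys
import tempfile

# "from . import sibling" only works when the file runs as a module inside a
# package ("python -m pkg.module"), not when executed as a top-level script.
with tempfile.TemporaryDirectory() as root:
    pkg = os.path.join(root, "trainer")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()
    with open(os.path.join(pkg, "sibling.py"), "w") as f:
        f.write("VALUE = 42\n")
    with open(os.path.join(pkg, "task.py"), "w") as f:
        f.write("from . import sibling\nprint(sibling.VALUE)\n")

    # Executed directly, the file has no package context: the import fails.
    direct = subprocess.run([sys.executable, os.path.join(pkg, "task.py")],
                            capture_output=True, text=True)
    # Run with -m from the package's parent directory, it succeeds.
    as_module = subprocess.run([sys.executable, "-m", "trainer.task"],
                               capture_output=True, text=True, cwd=root)

print(direct.returncode != 0)    # True: the relative import errored out
print(as_module.stdout.strip())  # 42
```

So a sudden appearance of this error, with unchanged code, points at something changing in how the platform invokes the entry point.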
Looks like this was actually a Cloud ML bug. It has been fixed! Thank you for the super fast turnaround.
I'm not sure how importing modules in Python really works in Azure Databricks. First, I was able to make it work using the following:
import sys
sys.path.append("/Workspace/Repos/Github Repo/sparkling-to-databricks/src")
from utils.some_util import *
I was able to use the imported function. But then I restarted the cluster, and this no longer worked even though the path is still in sys.path.
I also tried the following:
spark.sparkContext.addPyFile("/Workspace/Repos/Github Repo/my-repo/src/utils/some_util.py")
This did not work either. Can someone please tell me what I'm doing wrong here and suggest a solution? Thanks.
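For what it's worth, the sys.path mechanism itself can be checked outside Databricks. Two things to note: the appended path must be the parent directory of the package you import, and sys.path edits live only for the current Python session, so after a cluster restart they have to be re-run (or moved into an init script) before the import. A local sketch with made-up names:

```python
import os
import sys
import tempfile

# Local illustration of the sys.path mechanism; package and function names are
# invented for the demo. In Databricks the appended path would be the repo's
# src directory, i.e. the PARENT of the package being imported.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "repo_utils")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "some_util.py"), "w") as f:
    f.write("def greet():\n    return 'hello'\n")

sys.path.append(root)                   # append the package's parent directory
from repo_utils.some_util import greet  # import path mirrors the layout on disk
print(greet())                          # hello
```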
Trying to deploy a real-time emotion detection model and getting this error from Streamlit. I tried installing all the libraries, but it keeps throwing this error.
Looks like another versioning issue with Streamlit components. The response from Ben in the thread below should work. Let me know if this helps.
https://discuss.streamlit.io/t/module-not-found-reportthread/5657/2
I had some code for training and then using XGBoost models in a Databricks environment. When my runtime version got deprecated, I upgraded it, but I quickly noticed I could no longer load my trained models. The reason seems to be a change in module naming in Sparkdl:
Error loading metadata: Expected class name sparkdl.xgboost.xgboost_core.XgboostClassifierModel but found class name sparkdl.xgboost.xgboost.XgboostClassifierModel
Would anyone have advice on how to fix this issue? Maybe by modifying the metadata?
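Editing the metadata is indeed a plausible route: Spark ML model directories keep a JSON metadata file (typically metadata/part-00000) whose "class" field records the class that wrote the model. The sketch below operates on a fake metadata file; the real file's path and whether the new Sparkdl version can actually read the old model once the name matches are assumptions, so back the model up first:

```python
import json
import os
import tempfile

# Hedged sketch: rewrite the "class" field in a Spark ML metadata JSON so the
# renamed loader class accepts it. A fake metadata file stands in for the real
# metadata/part-00000 here.
old = "sparkdl.xgboost.xgboost.XgboostClassifierModel"
new = "sparkdl.xgboost.xgboost_core.XgboostClassifierModel"

model_dir = tempfile.mkdtemp()                    # stand-in for the saved model
meta_path = os.path.join(model_dir, "part-00000")
with open(meta_path, "w") as f:
    json.dump({"class": old, "sparkVersion": "3.0.0"}, f)  # fake metadata

with open(meta_path) as f:
    meta = json.load(f)
meta["class"] = new                               # point at the renamed class
with open(meta_path, "w") as f:
    json.dump(meta, f)

with open(meta_path) as f:
    print(json.load(f)["class"])
```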
Hello, can anyone help with Hail 0.2 on Azure Databricks?
After pip install, lots of problems came up:
can't find the Java package, import hail.plot, hl.init()
According to this document:
https://docs.azuredatabricks.net/applications/genomics/tertiary/hail.html#create-a-hail-cluster
I ran pip install hail
and set ENABLE_HAIL=true in the cluster environment settings.
However:
import hail as hl
hl.init(sc, idempotent=True)
AttributeError: module 'hail' has no attribute 'init'
Also, following another document:
https://docs.azuredatabricks.net/applications/genomics/tertiary/hail.html
import hail as hl
import hail.expr.aggregators as agg
hl.init(sc, idempotent=True)
ModuleNotFoundError: No module named 'hail.expr'
Can anyone suggest a solution?
Thanks a lot!
Are you using the genomics runtime? See https://learn.microsoft.com/en-us/azure/databricks/runtime/genomicsruntime#dbr-genomics to launch a cluster with the genomics runtime. Then Hail will be installed if you set the ENABLE_HAIL=true environment variable.
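For concreteness, the variable goes into the cluster's environment-variable settings (on Databricks, under the cluster's Advanced Options; the exact UI path may vary by version), and only takes effect on a Genomics Runtime cluster:

```
ENABLE_HAIL=true
```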
You may want to move to Azure HDInsight, and follow the instructions on the following page, under the Microsoft Azure section:
https://hail.is/docs/0.2/cloud/other_cloud_providers.html
This should get you up and running!
Trying to import a viewport, more specifically a ScreenViewport, into a libGDX project, but the package doesn't seem to exist on my machine. According to the documentation and code other people have posted, I'm using the right import path, but I'm getting an error.
import com.badlogic.gdx.utils.viewport.ScreenViewport;
"The import com.badlogic.gdx.util.viewport cannot be resolved."
Any idea what might be happening?
The most likely cause is that you are using an old version of libGDX.
Try updating the jar files in your project (manually or via the UI).
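If the project is Gradle-based, updating usually means bumping the libGDX version in build.gradle and refreshing dependencies. A hedged sketch; the version number and exact placement are illustrative, not a recommendation for a specific release:

```groovy
// root build.gradle (version number is illustrative)
ext {
    gdxVersion = '1.9.10'
}

// core project dependencies
dependencies {
    implementation "com.badlogicgames.gdx:gdx:$gdxVersion"
}
```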
Hope this helps.
Good luck.