Importing a module in Databricks - pyspark

I'm not sure how importing a module in Python really works in Azure Databricks. At first, I was able to make it work using the following:
import sys
sys.path.append("/Workspace/Repos/Github Repo/sparkling-to-databricks/src")
from utils.some_util import *
I was able to use the imported function. But after I restarted the cluster this no longer worked, even though the path is still in sys.path.
I also tried the following:
spark.sparkContext.addPyFile("/Workspace/Repos/Github Repo/my-repo/src/utils/some_util.py")
This did not work either. Can someone please tell me what I'm doing wrong here and suggest a solution? Thanks.
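For what it's worth, the first approach is session-scoped: anything appended to sys.path lives only in the current Python interpreter, so it is lost when the cluster restarts and has to be re-run in each new session. A minimal sketch of that pattern, reusing the repo path from the question (the REPO_SRC variable name is just for illustration):

import sys

REPO_SRC = "/Workspace/Repos/Github Repo/sparkling-to-databricks/src"  # path from the question
if REPO_SRC not in sys.path:
    # re-add on every run; sys.path edits do not survive a cluster restart
    sys.path.insert(0, REPO_SRC)

from utils.some_util import *  # same import as before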

Related

Import a dataflow automatically into a workspace

I am looking for a way to automatically import a dataflow directly into a workspace.
Does anyone have a solution?
And has anyone ever managed to set up a PowerShell script with this API: https://learn.microsoft.com/en-us/rest/api/power-bi/imports/post-import-in-group
And if so, could they tell me how?
Thanks in advance.
I have tried testing this API in Postman and PowerShell.
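A hedged sketch of what that REST call looks like, written in Python with the requests library rather than PowerShell just to show the request shape; the workspace id, the token acquisition, and the model.json display name are placeholders and assumptions, not details taken from the linked docs verbatim:

import requests

GROUP_ID = "<workspace-id>"          # target workspace (group) id
ACCESS_TOKEN = "<aad-bearer-token>"  # Azure AD token with Power BI API permissions

# Post Import In Group endpoint from the linked documentation; for a dataflow the
# display name is assumed here to be model.json
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/imports"
    "?datasetDisplayName=model.json"
)

with open("model.json", "rb") as f:
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        files={"file": f},  # multipart/form-data upload
    )

response.raise_for_status()
print(response.json())  # import id and status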

Is it possible to see the script generated from a pgAdmin import job?

I was having trouble importing a CSV, so I created a table without loading and went through the pgAdmin 4 importer, which worked! But now I want to see what it did differently to make it work. Is there a way to see the script behind the import job?
Thanks!
Check the pgAdmin4.log; you might find the complete command executed by pgAdmin4 in the log file itself.
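If the log does not show it, the dialog's effect can also be reproduced by hand, since it is essentially a front end for a COPY-style load. A hedged Python sketch of an equivalent client-side load, assuming psycopg2, a my_data.csv with a header row, and a target table my_table (all placeholder names):

import psycopg2

# Placeholder connection string, file name, and table name
conn = psycopg2.connect("dbname=mydb user=postgres password=secret host=localhost")
with conn, conn.cursor() as cur, open("my_data.csv", "r") as f:
    # client-side CSV load, roughly what the import dialog drives for you
    cur.copy_expert("COPY my_table FROM STDIN WITH (FORMAT csv, HEADER true)", f)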

"Attempted relative import in non-package" on Google Cloud ML

Since yesterday or the day before, I get a
ValueError: Attempted relative import in non-package
for an import in my main trainer file like
from . import mobilenet_v1 as mobilenet
when running the exact same trainer code with the exact same parameters on Cloud ML using the exact same training job. I'm bundling my trainer using the gcloud tool. I tried rolling my own setup.py instead, without luck. Any pointers as to what could be causing this?
Looks like this was actually a Cloud ML bug. It has been fixed! Thank you for the super fast turnaround.
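For anyone who hits the same error outside of that bug, here is a hedged sketch of the usual workaround: prefer an absolute import so the module still resolves when the file is run as a top-level script. The package name trainer and the layout below are assumptions, not taken from the question:

# Assumed layout:
#   trainer/
#       __init__.py
#       task.py          <- main trainer file containing this import
#       mobilenet_v1.py
try:
    from trainer import mobilenet_v1 as mobilenet  # absolute import
except ImportError:
    import mobilenet_v1 as mobilenet  # fallback when run from inside trainer/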

PostgreSQL Import CSV Issue

When importing a CSV file into PostgreSQL using pgAdmin III, it says the import was successful and I am able to hit Done. However, when I look into the table, no data is there. Has anyone experienced a similar issue? I have imported successfully in the past, with no change in environment, but I do not know what is happening now.

com.badlogic.gdx.utils.viewport Doesn't Exist

I'm trying to import a viewport, more specifically a ScreenViewport, into a libgdx project, but the package doesn't seem to exist on my machine. According to the documentation and code other people have posted, I'm using the right package path, but I'm getting an error:
import com.badlogic.gdx.utils.viewport.ScreenViewport;
"The import com.badlogic.gdx.util.viewport cannot be resolved."
Any idea what might be happening?
The most likely cause is that you are using an old version of libgdx.
Try updating the jar files in your project (manually or through the UI).
Hope this helps. Good luck.