How to let the IDE know that a certain folder should be mapped to a certain package path?

I use PyCharm and Eclipse with PyDev.
To be specific, I am using Odoo and setting up a project.
https://github.com/odoo/odoo
Here is the folder structure.
odoo-12
|-addons
| '-web
| '-...
|-odoo
'-addons
'-...
In the source code, for example:
addons/purchase/controllers/portal.py
# Unresolved, yet this is the official source code
from odoo.addons.web.controllers.main import Binary
# Resolved perfectly
from addons.web.controllers.main import Binary
I understand the reason why this one works:
from addons.web.controllers.main import Binary
but how can I make this work instead?
from odoo.addons.web.controllers.main import Binary
I cannot and should not modify any Odoo source code to make the IDE resolve the path correctly.

In Eclipse/PyDev you should be able to set your source folder to the folder containing odoo and it should work...
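As background on why static analysis struggles here: at runtime Odoo extends the search path of the odoo.addons package to cover the top-level addons/ directory, which no static analyzer sees. Below is a minimal sketch of the idea, with an illustrative path; it approximates what odoo.modules.module.initialize_sys_path does and is not the actual Odoo source.

# Hedged sketch: how Odoo makes odoo.addons.web importable at runtime.
import odoo.addons

# Odoo appends each configured addons directory to the package's __path__,
# so "odoo.addons.web" resolves even though web/ lives in odoo-12/addons/.
odoo.addons.__path__.append('/path/to/odoo-12/addons')

from odoo.addons.web.controllers.main import Binary  # now importable at runtime

Because this happens only at runtime, the IDE needs to be told about the extra path (e.g. via a source folder, as suggested above) rather than discovering it from the code.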

Related

Can't import the Svg library in elm?

Trying to use Svg and Svg.Attributes. Getting the error message
I cannot find module 'Svg'.
Module 'Main' is trying to import it.
Potential problems could be:
* Mispelled the module name
* Need to add a source directory or new dependency to elm-package.json
I'm certain that there aren't any spelling errors because I copy and pasted the imports from a tutorial. Where do I install this library?
The tutorial I'm going through is the one on elm-lang.org, specifically the section on time.
You need the elm-lang/svg package as a dependency in your elm-package.json. Run elm package install elm-lang/svg in the project directory.
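After the install, elm-package.json should list the package under "dependencies", roughly like this (version bounds are illustrative for Elm 0.18-era packages; use whatever the installer writes):

"dependencies": {
    "elm-lang/core": "5.0.0 <= v < 6.0.0",
    "elm-lang/svg": "2.0.0 <= v < 3.0.0"
}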

How to import libraries in Spark Notebook

I'm having trouble importing magellan-1.0.4-s_2.11 in Spark Notebook. I've downloaded the JAR from https://spark-packages.org/package/harsha2010/magellan and have tried placing SPARK_HOME/bin/spark-shell --packages harsha2010:magellan:1.0.4-s_2.11 in the "Start of Customized Settings" section of the spark-notebook file in the bin folder.
Here are my imports
import magellan.{Point, Polygon, PolyLine}
import magellan.coord.NAD83
import org.apache.spark.sql.magellan.MagellanContext
import org.apache.spark.sql.magellan.dsl.expressions._
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._
And my errors...
<console>:71: error: object Point is not a member of package org.apache.spark.sql.magellan
import magellan.{Point, Polygon, PolyLine}
^
<console>:72: error: object coord is not a member of package org.apache.spark.sql.magellan
import magellan.coord.NAD83
^
<console>:73: error: object MagellanContext is not a member of package org.apache.spark.sql.magellan
import org.apache.spark.sql.magellan.MagellanContext
I then tried to import the new library like any other library by placing it into the main script like so:
$lib_dir/magellan-1.0.4-s_2.11.jar
This didn't work and I'm left scratching my head wondering what I've done wrong. How do I import libraries such as magellan into spark notebook?
Try evaluating something like
:dp "harsha2010" % "magellan" % "1.0.4-s_2.11"
It will load the library into Spark, allowing it to be imported, assuming it can be obtained through the Maven repo. In my case it failed with a message:
failed to load 'harsha2010:magellan:jar:1.0.4-s_2.11 (runtime)' from ["Maven2 local (file:/home/dev/.m2/repository/, releases+snapshots) without authentication", "maven-central (http://repo1.maven.org/maven2/, releases+snapshots) without authentication", "spark-packages (http://dl.bintray.com/spark-packages/maven/, releases+snapshots) without authentication", "oss-sonatype (https://oss.sonatype.org/content/repositories/releases/, releases+snapshots) without authentication"] into /tmp/spark-notebook/aether/b2c7d8c5-1f56-4460-ad39-24c4e93a9786
I think the file was too big and the connection was interrupted before the whole file could be downloaded.
Workaround
So I downloaded the JAR manually from:
http://dl.bintray.com/spark-packages/maven/harsha2010/magellan/1.0.4-s_2.11/
and copied it into:
/tmp/spark-notebook/aether/b2c7d8c5-1f56-4460-ad39-24c4e93a9786/harsha2010/magellan/1.0.4-s_2.11
And then the :dp command worked. Try calling it first, and if it fails, copy the JAR into the right path to make things work.
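Expressed as shell commands, the workaround looks roughly like this (the aether session directory will differ on your machine; take it from the error message):

wget http://dl.bintray.com/spark-packages/maven/harsha2010/magellan/1.0.4-s_2.11/magellan-1.0.4-s_2.11.jar
mkdir -p /tmp/spark-notebook/aether/<session-id>/harsha2010/magellan/1.0.4-s_2.11
cp magellan-1.0.4-s_2.11.jar /tmp/spark-notebook/aether/<session-id>/harsha2010/magellan/1.0.4-s_2.11/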
Better solution
I should investigate why the download failed and fix it in the first place... or put that library in my local M2 repository. But that should get you going.
I would suggest checking this:
https://github.com/spark-notebook/spark-notebook/blob/master/docs/metadata.md#import-download-dependencies
and
https://github.com/spark-notebook/spark-notebook/blob/master/docs/metadata.md#add-spark-packages
I think the :dp magic command is deprecated; instead you should add your custom dependencies in the notebook metadata. Go to the menu Edit > Edit notebook metadata, and there add something like:
"customDeps": [
"harsha2010 % magellan % 1.0.4-s_2.11"
]
Once done, you will need to restart the kernel. You can check in the browser console whether the package is being downloaded properly.
The easy way: set or add the EXTRA_CLASSPATH environment variable to point to the .jar file you downloaded:
export EXTRA_CLASSPATH=</link/to/your.jar>
or, on Windows:
set EXTRA_CLASSPATH=</link/to/your.jar>

pyinstaller Adafruit DHT library raspberry_Pi_Driver.so: cannot open shared object

I'm using the Adafruit_DHT library in a file, and when I try to use pyinstaller to compile it, I get an error that Raspberry_Pi_Driver.so: cannot open shared object file.
I'm using a normal RPi (not model 2) and Raspbian. The file that I'm trying to compile works fine by itself. The Raspberry_Pi_Driver.so is there; I ran find ./ | grep Raspberry_Pi_Driver.so and it exists. I also tried --hidden-import=Adafruit_DHT when running pyinstaller, and that did not work either. Compiling does not give any error.
I noticed that after compilation, the build folder contains an Adafruit_DHT folder that has the driver, etc.
Any idea what's going on? Could it be that the library has been recreated under the build folder and this is causing confusion when executing the file?
I found a resolution for this (thanks to k4ml.me/posts/pyinstaller.html): I just added '-p /path/to/mylib' when creating the executable, where mylib is the directory containing the Adafruit_DHT folder (the one with the Raspberry_Pi_Driver.so file).
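In command form, the fix looks something like this (paths and the script name are illustrative; point -p at the directory that contains the Adafruit_DHT package):

pyinstaller -p /path/to/mylib your_script.py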

golang using functions of imported subdirectories

I can't use functions from custom subdirectories.
My Code Organization
I have under "src" a path hierarchy like
a/b
with all my directories and Go files (it is the "root" of my project). The directories contain no subdirectories and everything works fine. So the deepest path is "a/b/c". E.g. I have
a/b/c
and
a/b/d
with some Go files. Importing "a/b/d" and calling a function with "d.DoSomething()" from a file in "a/b/c" works fine.
Problem description
Now I want to reorganize "a/b/d". I move some files from "a/b/d" to
a/b/d/e
and the rest of the files to
a/b/d/f
If I try to import "a/b/d/e" with the import statement
import ( "a/b/d/e" )
from the same file in "a/b/c" and call "e.DoSomething()" (this is where the file with the DoSomething function moved to), I get an error at the line where I call "e.DoSomething()": "undefined: e".
While searching for a solution, I've seen no examples with deeper path hierarchies. Is it generally not possible to use/import subdirectories, or what's the problem?
Go version used: go1.2.2 linux/amd64
Thanks for any advice!
Your approach is completely wrong. Go has absolutely no concept of importing files or directories; all you can import in Go are packages. It so happens that the name of a package is its path relative to GOPATH, and you import packages by that name. But the identifier under which an imported package is available in the importing code depends on the package declaration of that package. You cannot simply "move" files between directories (each directory, for the go tool, is a single package) without changing the package declaration.
You can have package x under path a/b/c. When you import package x with import ( "a/b/c" ), all the exported stuff from package x is available to you as x.ExportedName.
Please read http://blog.golang.org/organizing-go-code carefully.
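To make this concrete, here is a minimal sketch (package and file names are invented for illustration, assuming a GOPATH-based layout as in go1.2):

// File a/b/d/e/e.go: the directory is a/b/d/e, but what you import
// is the package compiled from it, named by its declaration below.
package e

func DoSomething() string { return "did something" }

// File a/b/c/main.go
package main

import (
    "fmt"

    "a/b/d/e" // import path relative to GOPATH/src
)

func main() {
    fmt.Println(e.DoSomething()) // identifier "e" comes from the package declaration
}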
Try doing a go build in a/b/d/e first, before trying to build in a/b: that will generate the compiled package you want to import.

Can't compile AST parser code in Java

I have some AST parser code in Eclipse, but I am unable to import the following packages:
import org.eclipse.jdt.core.dom.AST;
import org.eclipse.jdt.core.dom.ASTParser;
Could somebody tell me which JAR file to download, and where to add that JAR file in the Eclipse folder?
You probably already have those files in the eclipse\plugins folder, e.g. on Windows: C:\Program Files\eclipse\plugins\org.eclipse.jdt.core_3.9.1.v20130905-0837.jar
Of course, you can download it from the internet as well:
http://central.maven.org/maven2/org/eclipse/tycho/org.eclipse.jdt.core/
Note that in order to run it as a standalone application you will have to import the following libraries (where xx stands for the version; again, they can be found in the eclipse\plugins folder):
org.eclipse.core.contenttype_xx.jar
org.eclipse.core.jobs_xx.jar
org.eclipse.core.resources_xx.jar
org.eclipse.core.runtime_xx.jar
org.eclipse.equinox.common_xx.jar
org.eclipse.equinox.preferences_xx.jar
org.eclipse.jdt.core_xx.jar
org.eclipse.osgi_xx.jar
Usually people add other libraries in a folder called lib, but you still have to register it in Eclipse. To do that, right-click on your project, then Build Path -> Configure Build Path -> Libraries, and select Add JARs.
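Once the JARs are on the build path, a minimal standalone sketch like the following should compile and run (a hedged example: the class name ParserCheck, the JLS4 source level, and the input string are all illustrative):

import org.eclipse.jdt.core.dom.AST;
import org.eclipse.jdt.core.dom.ASTNode;
import org.eclipse.jdt.core.dom.ASTParser;

public class ParserCheck {
    public static void main(String[] args) {
        // Create a parser at the Java 7 (JLS4) source level and hand it
        // a trivial compilation unit as a char array.
        ASTParser parser = ASTParser.newParser(AST.JLS4);
        parser.setSource("class A {}".toCharArray());
        ASTNode ast = parser.createAST(null); // null: no progress monitor
        System.out.println(ast); // prints the parsed source back out
    }
}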