Import user-defined modules into PyDev

I am a beginner to the Eclipse Neon + PyDev combo.
I am trying to use Python modules I created inside other modules I will be creating.
For a start, I was going to use the Tkinter tutorial program outlined here:
http://effbot.org/tkinterbook/tkinter-hello-again.htm
In addition to printing a statement in response to a mouse click, I want to run a small module, fibo.py.
Here's my code:
# import the library
from tkinter import *
import fibo

class App:
    def __init__(self, master):
        frame = Frame(master)
        frame.pack()
        self.button = Button(
            frame, text="QUIT", fg="red", command=frame.quit
        )
        self.button.pack(side=LEFT)
        self.hi_there = Button(frame, text="Hello", command=self.say_hi)
        self.hi_there.pack(side=LEFT)

    def say_hi(self):
        fibo.fib(100)  # call fib through the module it was imported from
        print("hi there, everyone!")

root = Tk()
app = App(root)
root.mainloop()
root.destroy()  # optional; see description below
Here's fibo.py:
def fib(n):  # write Fibonacci series up to n
    a, b = 0, 1
    while b < n:
        print(b, end=" ")
        a, b = b, a + b

def fib2(n):  # return Fibonacci series up to n
    result = []
    a, b = 0, 1
    while b < n:
        result.append(b)
        a, b = b, a + b
    return result
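As a quick check of the second helper (repeating its definition here so the snippet stands alone), fib2 returns the series as a list:

```python
def fib2(n):  # same definition as in fibo.py above
    result = []
    a, b = 0, 1
    while b < n:
        result.append(b)
        a, b = b, a + b
    return result

print(fib2(100))  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
```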
Both modules are in the same project and workspace.
The editor says, "unresolved import fibo".
Why is the module fibo not recognized by PyDev/Eclipse?
My ultimate goal is to run a module upon button click. If there's a more direct way to accomplish this, I would like to know.

Ok, so, based on your screenshot, the structure you have is actually:
/project (this is the PYTHONPATH root and marked as source folder)
/project/root
/project/root/__init__.py
/project/root/nested
/project/root/nested/__init__.py
/project/root/nested/example.py
/project/root/nested/fibo.py
In this case, your import should be: from root.nested import fibo. Your code may work on the command line, but that's because you added an entry to sys.path only at runtime (so PyDev can't follow that).
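The runtime-vs-static distinction can be seen directly; a small sketch, assuming the hypothetical root/nested layout above:

```python
import os
import sys

# Appending to sys.path at runtime makes "import fibo" resolvable when the
# script actually executes, but PyDev analyzes the file statically, without
# running it, so the editor still reports "unresolved import fibo".
extra = os.path.join(os.getcwd(), "root", "nested")
sys.path.append(extra)
print(extra in sys.path)  # True while running; invisible to the editor
```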
The other option would be moving both example.py and fibo.py to /project.
You can also use a relative import as from . import fibo, but then, to run the module as a __main__ module, you'll have to run modules by the module name (with the -m flag) -- you can configure PyDev to do that at the preferences > PyDev > Run > Launch modules with "python -m mod.name".
As a note, if you just write fibo in your code and wait for the undefined-variable error to be shown, you can press Ctrl+1 on that same line to get a suggestion that will write the import for you (or you can use code completion, which will also write the import automatically).

Related

Why are Julia benchmarks not shown in VSCode?

Hi. Below is some simple code that displays the benchmark results from a Julia REPL. In VSCode I have tried:
launching the Julia REPL from the Command Palette and running the file without debugging
Execute active File in REPL from the drop-down menu, top right
In both cases the println statements are displayed but not the benchmark results. Is this to be expected, or have I messed up?
using DifferentialEquations, BenchmarkTools
A = rand(1000,1000); B = rand(1000,1000); C = rand(1000,1000)
println("TX 1 and 2")
test(A,B,C) = A + B + C
@benchmark test(A,B,C)
println("T 1 End")
t(A,B,C) = A .+ B .+ C
@benchmark t(A,B,C)
println("TX 2 End")
readline()
println("After read")
I found a workaround: remove or comment out the @benchmark lines from the file and run them directly in the REPL.
This should depend on which setting you have for the result type (Julia > Execution: Result Type in the Settings GUI, or julia.execution.resultType in settings.json).
With the "inline" result type, I get the result displayed inline next to each @benchmark line; hovering over the BenchmarkTools.Trial box shows the full benchmark report (the original answer included screenshots here).
Note that the println line just shows a tick, as it has been executed but didn't return anything; it printed to the REPL instead, so its output appears in the terminal at the bottom.

Why can't I import a let definition from a package, in a SystemVerilog module?

I would like to put the following definitions in a default package, which I include in all my other SystemVerilog modules:
let max(a,b) = a > b ? a : b;
But when I try to use the imported let definition in a module, I'm told that I'm attempting to use a non-local function definition, and VCS errors out.
Why?
The simple example below works with no issues. Make sure that the package is always compiled before it is imported. Import from the package correctly, either as pkg::* or pkg::max; alternatively, use it directly as pkg::max(a,b) without an import. And, of course, use a compiler that supports this syntax.
package pkg;
  let max(a,b) = a > b ? a : b;
endpackage:pkg

module top();
  import pkg::*;
  int a = 1, b = 2;
  initial begin
    $display("max of %d and %d is %d", a, b, max(a,b));
  end
endmodule

auto-save-visited-interval does not save automatically

I enabled auto-save-visited-mode in the global scope and wrote this script:
~/D/O/ORG/pySrc λ cat sicp.py
#!/usr/bin/env python
def remainder(x, y):
    return x % y

def gcd(a, b):
    if b == 0:
        retunr a
    else:
        return gcd(b, remainder(a, b))

print(gcd(30, 15))
I ran it, found the typo retunr, and corrected it immediately.
auto-save-visited-interval is set to its default of 5 (seconds), so I counted to 10 and ran it again, and got this error:
File "sicp.py", line 9
retunr a
^
SyntaxError: invalid syntax
The file was not saved automatically.
I consulted the auto-save documentation, which states that files will be saved in place.
What's the problem with my usage?
doom-emacs issue
To enable a minor mode you must call its function: (auto-save-visited-mode +1). Setting the auto-save-visited-mode variable is not enough.
Try adding this to your config.el:
(auto-save-visited-mode +1)

PySpark: Error "Cannot pickle standard input" on function map

I'm trying to learn to use PySpark.
I'm using Spark 2.2.0 with Python 3.
I'm facing a problem now and I can't find where it comes from.
My project is to adapt an algorithm written by a data scientist so that it runs distributed. The code below is what I have to use to extract the features from images, and I have to adapt it to extract features with PySpark.
import json
import sys

# Dependencies can be installed by running:
# pip install keras tensorflow h5py pillow
# Run script as:
# ./extract-features.py images/*.jpg

from keras.applications.vgg16 import VGG16
from keras.models import Model
from keras.preprocessing import image
from keras.applications.vgg16 import preprocess_input
import numpy as np

def main():
    # Load model VGG16 as described in https://arxiv.org/abs/1409.1556
    # This is going to take some time...
    base_model = VGG16(weights='imagenet')
    # Model will produce the output of the 'fc2' layer, which is the
    # penultimate neural network layer (see the paper above for more details)
    model = Model(input=base_model.input, output=base_model.get_layer('fc2').output)
    # For each image, extract the representation
    for image_path in sys.argv[1:]:
        features = extract_features(model, image_path)
        with open(image_path + ".json", "w") as out:
            json.dump(features, out)

def extract_features(model, image_path):
    img = image.load_img(image_path, target_size=(224, 224))
    x = image.img_to_array(img)
    x = np.expand_dims(x, axis=0)
    x = preprocess_input(x)
    features = model.predict(x)
    return features.tolist()[0]

if __name__ == "__main__":
    main()
I have written the beginning of the code:
rdd = sc.binaryFiles(PathImages)
base_model = VGG16(weights='imagenet')
model = Model(input=base_model.input, output=base_model.get_layer('fc2').output)
rdd2 = rdd.map(lambda x : (x[0], extract_features(model, x[0][5:])))
rdd2.collect()[0]
When I try to extract the features, there is an error:
~/Code/spark-2.2.0-bin-hadoop2.7/python/pyspark/cloudpickle.py in
save_file(self, obj)
623 return self.save_reduce(getattr, (sys,'stderr'), obj=obj)
624 if obj is sys.stdin:
--> 625 raise pickle.PicklingError("Cannot pickle standard input")
626 if hasattr(obj, 'isatty') and obj.isatty():
627 raise pickle.PicklingError("Cannot pickle files that map to tty objects")
PicklingError: Cannot pickle standard input
I tried multiple things, and here is my first result. I know that the error comes from the line below in the extract_features method:
features = model.predict(x)
When I run this line outside of a map function or PySpark, it works fine.
I think the problem comes from the "model" object and its serialization with PySpark.
Maybe I'm not using a good way to distribute this with PySpark; if you have any clue to help me, I'll take it.
Thanks in advance.
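The usual cause of this error is that the lambda passed to map captures the Keras model, and Spark's cloudpickle then tries to serialize the model's internals (which reference things like standard streams). A common workaround is to construct the model on the workers rather than on the driver, e.g. with mapPartitions so it is built once per partition. A minimal runnable sketch of the pattern, with a hypothetical load_model standing in for the real Keras calls:

```python
def load_model():
    # Hypothetical stand-in for VGG16(weights='imagenet'); in the real job
    # this import and construction would happen on each worker.
    return lambda path: [float(len(path))]  # fake "features"

def extract_partition(image_paths):
    # The model is built inside the function that runs on the worker, so the
    # driver never has to pickle it (avoiding the PicklingError).
    model = load_model()
    for path in image_paths:
        yield (path, model(path))

# With Spark this would be:  rdd2 = rdd.mapPartitions(extract_partition)
# Simulated here on a plain list standing in for one partition:
print(list(extract_partition(["img1.jpg", "img2.jpg"])))
```

The key design point is that only the (picklable) function travels over the wire, not the model object itself.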

How to use the output of one file as input to another file in Python

Say I have this code:
def poly(x, a):
    j = 0
    ans = [0]*len(x)
    while (j < len(x)):
        i = 0
        while (i < len(a)):
            ans[j] = ans[j] + a[i]*((x[j])**i)
            i = i+1
        else:
            j = j+1
    else:
        print ans

a = [1,1,1]
x = [1,1,1]
print poly(x,a)
Now I want x in this file to come from another file containing a program like:
def mult(x, q):
    i = 0
    while (i < len(x)):
        x[i] = x[i]*q
        i = i+1
    else:
        print x

x = [1,1,1]
q = 2
print mult(x,q)
Maybe this is a possible solution.
First: in your function definitions, where you have print ans and print x, I think you want return statements, i.e. return ans and return x.
e.g.
def poly(x, a):
    j = 0
    ans = [0]*len(x)
    while (j < len(x)):
        i = 0
        while (i < len(a)):
            ans[j] = ans[j] + a[i]*((x[j])**i)
            i = i+1
        else:
            j = j+1
    else:
        return ans  # changed print to return
Then, assuming you have two separate program files (let's call them poly.py and mult.py) in which the given functions are separately defined you could write a new program file (e.g. polymult.py) which looks something like this:
#start code
# import programs poly.py and mult.py
# this will execute commands in these files
# but gives access to functions defined within
from poly import *
from mult import *
# Main
# define/reset the variables we need/want to use
a=[1,1,1]
x=[1,1,1]
q=2
# pass mult which returns new x into poly
print poly(mult(x,q),a)
#end code
A caveat to the way I wrote the above is that the files poly.py and mult.py have to be in the same directory/folder as the program importing them. (A simple alternative would be to define the functions directly, pasting them in place of the import statements.)
Also, if you just want to use them as imported functions, as in my example, you could remove all the code other than the actual function defs. Otherwise, when you execute polymult.py, you will see the extra screen prints executed during the import of poly.py and mult.py.
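For comparison, here is a compact Python 3 sketch of the same composition, with both functions returning values instead of printing (my rewrite, not the asker's original code):

```python
def mult(x, q):
    # scale every element of x by q and return the new list
    return [v * q for v in x]

def poly(x, a):
    # evaluate a[0] + a[1]*v + a[2]*v**2 + ... at each point v in x
    return [sum(coef * (v ** i) for i, coef in enumerate(a)) for v in x]

# pass the result of mult straight into poly, as in polymult.py above
print(poly(mult([1, 1, 1], 2), [1, 1, 1]))  # [7, 7, 7]
```

Because each function returns its result rather than printing it, the output of one can feed directly into the other without any file-level coupling.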