What I am trying to achieve:
Run a Python script saved on a PythonAnywhere host from Google Sheets on a button press.
See the answer by Dustin Michels.
Task of each file:
app.py: contains the code of a REST API made using Flask.
runMe.py: gets the values from the Google Sheet cells A1:A2, sums them, and writes the sum back to A3.
main.py: sends a GET request whose argument is the file name (runMe.py); the file name may change if the user wants to run another file.
I made the API using Flask. It works perfectly both online and offline, but I'd still welcome any recommendations about app.py.
Code review of app.py:
from flask import Flask, jsonify
from flask_restful import Api, Resource
import os
app = Flask(__name__)
api = Api(app)
class callApi(Resource):
    def get(self, file_name):
        my_dir = os.path.dirname(__file__)
        file_path = os.path.join(my_dir, file_name)
        file = open(file_path)
        getvalues = {}
        exec(file.read(), getvalues)
        return jsonify({'data': getvalues['total']})

api.add_resource(callApi, "/callApi/<string:file_name>")

if __name__ == '__main__':
    app.run()
Here is the code of runMe2.py:
import gspread
from oauth2client.service_account import ServiceAccountCredentials
# use creds to create a client to interact with the Google Drive API
scopes = ['https://www.googleapis.com/auth/spreadsheets', 'https://www.googleapis.com/auth/drive.file', 'https://www.googleapis.com/auth/drive']
creds = ServiceAccountCredentials.from_json_keyfile_name('service_account.json', scopes)
client = gspread.authorize(creds)
# Find a workbook by name and open the first sheet
# Make sure you use the right name here.
sheet = client.open("Demosheet").sheet1
# Extract and print all of the values
list_of_hashes = sheet.get_all_records()
print(list_of_hashes)
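The sum-and-write-back step described earlier (read A1 and A2, write the total to A3) is not shown in runMe2.py. A minimal sketch of that step, reusing the sheet object above, might look like this (the cell addresses come from the description, and the variable name total is assumed so that app.py can return it):

# Sketch: read A1 and A2, sum them, write the result back to A3
a1 = int(sheet.acell('A1').value)
a2 = int(sheet.acell('A2').value)
total = a1 + a2
sheet.update_acell('A3', total)
print(total)

Defining total also matters because app.py returns getvalues['total'] after exec-ing the file.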
Below is the main.py code:
import requests
BASE = 'https://username.pythonanywhere.com/callApi/test.py'
response = requests.get(BASE)
print(response.json())
main.py output
{'data': 54}
test.py code:
a = 20
b = 34
total = a+b
print(total)
THE PROBLEM
If I request runMe2.py (see the runMe2.py code above), I get this error.
app.py is hosted on https://www.pythonanywhere.com/
ModuleNotFoundError: No module named 'gspread'
However, I installed gspread on PythonAnywhere from the command line, but it's still not working.
You either haven't installed the gspread package in your current Python environment, or it is installed somewhere else (e.g. in a different virtual environment) and your script can't find it.
Try installing the package inside the environment you're running your script in, using pip3:
pip3 install gspread
You can try something like this on Windows
pip install gspread
or on Mac
pip3 install gspread
If you're running on Docker, or building with a requirements.txt, you can try adding this line to your requirements.txt file:
gspread==3.7.0
Any other instructions for this package can be found here => https://github.com/burnash/gspread
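A quick way to check which interpreter your code is actually running under, and where gspread (if any) is coming from, is a small diagnostic like this (not part of the original files, just a sketch):

import sys
print(sys.executable)   # the interpreter that is actually running
print(sys.path)         # where it looks for packages
import gspread          # raises ModuleNotFoundError if this environment lacks gspread
print(gspread.__file__) # shows which installation was picked up

If the path printed for gspread is not inside the environment that serves app.py on PythonAnywhere, install the package into that environment.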
Download gspread here
Download the tar file gspread-3.7.0.tar.gz from the link above.
Extract the file, compress the extracted folder as a zip, then upload it back to the server.
Open a Bash console and run:
$ unzip gspread-3.7.0
$ cd gspread-3.7.0
$ python3.7 setup.py install --user
Is it possible to add a bash script as an entrypoint (console script) to a Python package via Poetry? It looks like it only accepts Python files (see code here).
I want entry.sh to be the entry script:
#!/usr/bin/env bash
set -e
echo "Running entrypoint"
via setup.py
entry_points={
    "console_scripts": [
        "entry=entry.sh",
    ],
},
On the other hand, setuptools seems to support shell scripts (see code here).
Is it possible to include a shell script in a package and add it to the entrypoints after installation when working with Poetry?
UPD: setuptools does not support this either (it generates the code below):
def importlib_load_entry_point(spec, group, name):
    dist_name, _, _ = spec.partition('==')
    matches = (
        entry_point
        for entry_point in distribution(dist_name).entry_points
        if entry_point.group == group and entry_point.name == name
    )
    return next(matches).load()

globals().setdefault('load_entry_point', importlib_load_entry_point)
Is this a design decision? It looks to me that packaging should provide such a feature to deliver complex applications as a single bundle.
So I ended up using this workaround: keep my script in place, add it to the bundle via package_data, and call it from Python code, which I made the entrypoint.
import subprocess

def _run(bash_script):
    return subprocess.call(bash_script, shell=True)

def entrypoint():
    return _run("./scripts/my_entrypoint.sh")

def another_entrypoint_if_needed():
    return _run("./scripts/some_other_script.sh")
and pyproject.toml:
[tool.poetry.scripts]
entrypoint = 'bash_runner:entrypoint'
another = 'bash_runner:another_entrypoint_if_needed'
The same works for console_scripts in a setup.py file.
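Note that the relative paths ("./scripts/...") in the workaround above only resolve when the process is started from the project root. A variant that locates the scripts relative to the installed package might look like this (a sketch; it assumes the scripts directory is shipped inside the bash_runner package, e.g. via package_data or Poetry's include):

import subprocess
from pathlib import Path

# Resolve the scripts directory relative to this module rather than the CWD
SCRIPTS_DIR = Path(__file__).resolve().parent / "scripts"

def _run(script_name):
    return subprocess.call(str(SCRIPTS_DIR / script_name), shell=True)

def entrypoint():
    return _run("my_entrypoint.sh")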
Assume the folder structure is
src
  - __init__.py
  - A
      - __init__.py
      - lib.py
  - B
      - __init__.py
      - script.py
I am in script.py, trying to import the function fkt() from A.lib. The file script.py contains:
import os
print(os.listdir())

from A.lib import fkt

if __name__ == "__main__":
    fkt()
I am calling os.listdir() to show that my current working directory is src/, so the package A should be within reach. The output of the print statement is:
['A', '.vscode', 'B', '__init__.py']
Is there a "correct" way to do this? I figure adding the path via sys.path isn't "correct", so I'd like to avoid it.
Note: I have tried almost every variant of from src.A.lib, from .A.lib, and from A.lib, and all of these report either that src is not a package or that there is no module named A.
OS: I am running on ubuntu 18.04 LTS with python 3.6.9 using vscode with "cwd" set to "${workspaceFolder}".
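For reference, the sys.path workaround mentioned above (the approach the question would like to avoid) would look roughly like this at the top of script.py:

import os
import sys

# Make src/ importable even when script.py is run directly from inside B/
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from A.lib import fkt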
So this seems to be a really common problem with this setup, but I can't find any solutions that work on SO. I've set up a brand-new Ubuntu 15.04 server, then installed nginx, virtualenv (and -wrapper), and uWSGI (via apt-get, so globally, not inside the virtualenv).
My virtualenv is located at /root/Env/example. Inside the virtualenv I installed Django, then at /srv/www/example/app I ran Django's startproject command with the project name example, so I have roughly this structure:
-root
    -Env
        -example
            -bin
            -lib
-srv
    -www
        -example
            -app
                -example
                    manage.py
                    -example
                        wsgi.py
                        ...
My example.ini file for uWSGI looks like this:
[uwsgi]
project = example
plugin = python
chdir = /srv/www/example/app/example
home = /root/Env/example
module = example.wsgi:application
master = true
processes = 5
socket = /run/uwsgi/app/example/example.socket
chmod-socket = 664
uid = www-data
gid = www-data
vacuum = true
But no matter whether I run this via uwsgi --ini /etc/uwsgi/apps-enabled/example.ini or via daemon, I get the exact same error:
Python version: 2.7.9 (default, Apr 2 2015, 15:37:21) [GCC 4.9.2]
Set PythonHome to /root/Env/example
ImportError: No module named site
I should note that the Django project works via the built-in development server ./manage.py runserver, and that when I remove home = /root/Env/example the thing works (but is obviously using the global Python and Django rather than the virtualenv versions, which means it's useless for a proper virtualenv setup).
Can anyone see some obvious path error that I'm not seeing? As far as I can tell, home is entirely correct based on my directory structure, and everything else in the ini too, so why is it not working with this ImportError?
In my case, I was seeing this issue because the Django app I was trying to run was written in Python 3, whereas uWSGI was configured for Python 2. I fixed the problem by:
recompiling uwsgi to support both python 2 and python 3 apps
(I followed this guide)
adding this to my mydjangoproject_uwsgi.ini:
plugins = python35 # or whatever you specified while compiling uwsgi
For other folks using Django, you should also make sure you are correctly specifying the following:
# Django dir that contains manage.py
chdir = /var/www/project/myprojectname
# Django wsgi (myprojectname is the name of your top-level project)
module = myprojectname.wsgi:application
# the virtualenv you are using (full path)
home = /home/ubuntu/Env/mydjangovenv
plugins = python35
As @Freek said, site refers to a Python module.
The error claims that Python cannot find that module, which is because you have pointed the Python home (the home directive) at the wrong location.
I encountered the same problem, and my uwsgi.ini was like this:
[uwsgi]
# variable
base = /home/xx/
# project settings
chdir = %(base)/
module = botservice.uwsgi:application
home = %(base)/env/bin
With this configuration, uWSGI can find the Python executable in env/bin, but no packages can be found under that folder. So I changed home to:
home = %(base)/env/
and it worked for me.
In your case, I suggest digging into the home directive and pointing it to a location which contains both the Python executable and the packages.
The site module is in the root of django.
The first check is to activate the virtualenv manually (source /root/Env/example/bin/activate), start python, and import site. If that fails, pip install django.
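A quick way to run that check non-interactively (a sketch using the paths from the question; run it with the virtualenv's own interpreter, /root/Env/example/bin/python) is:

import site
import django

print(site.__file__)         # the stdlib site module of this interpreter
print(django.get_version())  # proves django is importable in this virtualenv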
Assuming that django is correctly installed in the virtualenv, make sure that uWSGI activates the virtualenv. Relevant uWSGI configuration directives:
plugins = python
virtualenv = /root/Env/example
and in case you get an error importing example.wsgi:
pythonpath = /srv/www/example/app/example
I am trying to import some of my personal modules into my IPython clusters. I am using Anaconda on Windows Vista 64-bit.
from IPython.parallel import Client
rc = Client()
dview = rc[:]
with dview.sync_imports():
    import lib.rf
It is giving me this error:
No module named 'lib.rf'
I can import the module in the rest of my IPython notebook, as I have this .bat file to start ipython notebook:
cd C:\Users\Jon\workspace\bf
set PYTHONPATH=%PYTHONPATH%;C:\Users\Jon\workspace\bf
C:\Anaconda\envs\p33\scripts\ipython notebook
I am using this similar .bat file to start my IPython clusters:
cd C:\Users\Jon\workspace\bf
set PYTHONPATH=%PYTHONPATH%;C:\Users\Jon\workspace\bf
C:\Anaconda\envs\p33\Scripts\ipcluster start --n=7
Why is this not working?
More info:
If I print out sys.path, I get a list that contains C:\Users\Jon\workspace\bf
If I print out the paths of my clusters, I get the same list:
%px sys.path
['',
'',
'',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\distribute-0.6.28-py3.3.egg',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\pykalman-0.9.5-py3.3.egg',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\patsy-0.2.1-py3.3.egg',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\joblib-0.8.3_r1-py3.3.egg',
'C:\\Users\\Jon\\workspace\\bf',
'C:\\Users\\Jon\\workspace\\bf\\my_numba',
'C:\\Anaconda\\envs\\p33\\python33.zip',
'C:\\Anaconda\\envs\\p33\\DLLs',
'C:\\Anaconda\\envs\\p33\\lib',
'C:\\Anaconda\\envs\\p33',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\Sphinx-1.2.3-py3.3.egg',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\win32',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\win32\\lib',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\Pythonwin',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\runipy-0.1.1-py3.3.egg',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\setuptools-7.0-py3.3.egg',
'C:\\Anaconda\\envs\\p33\\lib\\site-packages\\IPython\\extensions']
Further analysis:
%px lib.__path__
Out[0:11]: _NamespacePath(['C:\\Anaconda\\envs\\p33\\lib\\site-packages\\win32\\lib'])
lib.__path__
Out[57]: ['.\\lib']
It looks like ipcluster and the notebook are looking for lib in different places. I have tried renaming lib to mylib; it has not helped.
It seems that with dview.sync_imports() is being run someplace other than your IPython Notebook environment and is therefore relying on a different PYTHONPATH. It is definitely not being run on one of the cluster engines, so I wouldn't expect it to pick up the PYTHONPATH you set when starting the cluster.
I'm thinking you'll need to have that directory in your PYTHONPATH (not your PATH) for the calling python environment because that is the location from which you are importing the modules.
The impact of the bit about setting PYTHONPATH in the DOS shell from which you invoke ipcluster isn't clear to me. I can see that one might expect this to let the engines know about your directory, but I'm wondering whether that PYTHONPATH gets initialized to the environment from which you call IPython.parallel.Client.
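One way to make the directory visible on the engines themselves, regardless of how ipcluster was started, is to insert it into sys.path on every engine before calling sync_imports (a sketch, reusing the path from the question):

from IPython.parallel import Client

rc = Client()
dview = rc[:]

# Add the project directory to sys.path on every engine
dview.execute(r"import sys; sys.path.insert(0, r'C:\Users\Jon\workspace\bf')")

with dview.sync_imports():
    import lib.rf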
I designed a GUI application using wxPython that communicates with a local database (MongoDB) located in the same folder. My main application uses a relative path to the database daemon to start it every time the GUI is launched.
This is the main.py:
import wx
import mongodb

class EVA(wx.App):
    # wxPython GUI here
    pass

if __name__ == "__main__":
    myMongodb = mongodb.Mongodb()
    myMongodb.start()
    myMongodb.connect()
    app = EVA(0)
    app.MainLoop()
This is the mongodb.py module:
from pymongo import Connection
import subprocess, os, signal

class Mongodb():
    pid = 0

    def start(self):
        path = "/mongodb-osx-x86_64-1.6.5/bin/mongod"
        data = "/data/db/"
        cmd = path + " --dbpath " + data
        MyCMD = subprocess.Popen([cmd], shell=True)
        self.pid = MyCMD.pid

    def connect(self):
        try:
            connection = Connection(host="localhost", port=27017)
            db = connection['Example_db']
            return db
        except Exception as inst:
            print "Database connection error: ", inst

    def stop(self):
        os.kill(self.pid, signal.SIGTERM)
Everything works fine from the terminal. However, when I use py2app to make a standalone version of my program on Mac OS (OS v10.6.5, Python v2.7), I am able to launch the GUI but can't start the database. It seems py2app changed the location of the MongoDB executable folder and broke my code.
I use the following parameters with py2app:
$ py2applet --make-setup main.py
$ rm -rf build dist
$ python setup.py py2app --iconfile /icons/main_icon.icns -r /mongodb-osx-x86_64-1.6.5
How to force py2app to leave my application structure intact?
Thanks.
Py2app changes the current working directory to the foo.app/Content/Resources folder within the app bundle when it starts up. It doesn't seem to be the case from the code you show above, but if you have any paths that are dependent on the CWD (including relative pathnames) then you'll have to deal with that somehow. One common way to deal with it is to also copy the other stuff you need into that folder within the application bundle, so it will then truly be a standalone bundle that is not dependent on its location in the filesystem and hopefully also not dependent on the machine it is running upon.
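If the mongod binary is copied into the bundle's Resources folder as suggested, start() could build its path relative to that folder instead of using a hard-coded absolute path. A sketch (the folder name is taken from the question; the exact layout inside the bundle is an assumption):

import os
import sys

def mongod_path():
    # py2app sets the CWD to <App>.app/Contents/Resources and sets sys.frozen,
    # so resolve the bundled binary relative to that folder when frozen;
    # otherwise fall back to the directory of this source file.
    if getattr(sys, 'frozen', False):
        base = os.getcwd()
    else:
        base = os.path.dirname(os.path.abspath(__file__))
    return os.path.join(base, "mongodb-osx-x86_64-1.6.5", "bin", "mongod")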