I created a Django app locally and would like to host it on PythonAnywhere.com. I've followed the directions at https://help.pythonanywhere.com/pages/VirtualEnvForNewerDjango and created a virtualenv with Django 1.9 installed. However, when I try to run my app, I get the error ImportError: No module named myapp.settings
Here is my username_pythonanywhere_com_wsgi.py
import os
import sys
# add your project directory to the sys.path
project_home = u'/home/rhpt'
if project_home not in sys.path:
    sys.path.append(project_home)
# set environment variable to tell django where your settings.py is
os.environ['DJANGO_SETTINGS_MODULE'] = 'myapp.settings'
# serve django via WSGI
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()
I also tried myapp.my_app.settings without success.
My tree
myapp
├── my_app
│   ├── __init__.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
├── get_data
│   ├── __init__.py
│   ├── admin.py
│   ├── models.py
│   ├── tests.py
│   ├── urls.py
│   └── views.py
└── manage.py
If your settings.py file is at /home/rhpt/myapp/my_app/settings.py, then this part
# add your project directory to the sys.path
project_home = u'/home/rhpt'
needs to be
# add your project directory to the sys.path
project_home = u'/home/rhpt/myapp'
and DJANGO_SETTINGS_MODULE needs to point at the inner package:
os.environ['DJANGO_SETTINGS_MODULE'] = 'my_app.settings'
Also, putting an empty __init__.py in the same folder as settings.py may help; that is what worked for me.
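Putting the two changes together, the WSGI file would look roughly like this (a sketch based on the tree above, assuming the project lives at /home/rhpt/myapp and the settings package is my_app):
import os
import sys

# add the project directory (the folder that contains manage.py) to sys.path
project_home = u'/home/rhpt/myapp'
if project_home not in sys.path:
    sys.path.append(project_home)

# tell Django where settings.py lives: the my_app package inside the project
os.environ['DJANGO_SETTINGS_MODULE'] = 'my_app.settings'

# serve Django via WSGI
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()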
Behold, my tree:
src/
├── bin
│   ├── cli.py
├── mypackage
│   ├── __init__.py
│   ├── api.py
│   ├── models
│   └── utils.py
Now, observe, my setup.cfg:
[options]
zip_safe = False
packages = find:
package_dir =
    =src

[options.package_data]
src/bin = *

[options.packages.find]
where =
    src

[options.entry_points]
console_scripts =
    mycommand = bin.cli:main
That seemed fairly reasonable to me. Yet when I install this package in a fresh virtual environment (pip install /path/to/package) and run mycommand, I get a very unreasonable answer: ModuleNotFoundError: No module named 'bin'.
NOTE: this does work if I do an editable install!
What am I doing wrong?
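A likely explanation: packages = find: only discovers directories that contain an __init__.py, so bin is never installed into site-packages, while an editable install leaves src/ on the import path and hides the problem. One common way this gets resolved (a sketch only, assuming cli.py can be moved to src/mypackage/cli.py; not necessarily the right fix for this repo) is to keep the entry point inside the installed package:
[options]
zip_safe = False
packages = find:
package_dir =
    =src

[options.packages.find]
where =
    src

[options.entry_points]
console_scripts =
    mycommand = mypackage.cli:main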
I have been having a lot of trouble trying to run pytest. Basically, importing was not working properly. Let me explain. The folder structure of my project is the following:
src
└── api
    ├── __init__.py
    ├── app.py
    ├── document.py
    └── tests
        ├── __init__.py
        └── test_app.py
And app.py has the following import statement:
from document import Document
and obviously test_app.py imports app
Whenever I ran pytest from the src folder, it would throw the following error:
ModuleNotFoundError: No module named 'document'
But once I removed the __init__.py file from the api directory, it worked out of the box.
src
└── api
    ├── __init__.py   <--- REMOVED
    ├── app.py
    ├── document.py
    └── tests
        ├── __init__.py
        └── test_app.py
I have been reading many different threads, but none of them explains why this is so.
Does anyone understand why?
Sources I have been reading through:
pytest cannot import module while python can
PATH issue with pytest 'ImportError: No module named YadaYadaYada'
https://docs.pytest.org/en/6.2.x/pythonpath.html
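For what it's worth, the difference is most likely pytest's default prepend import mode: starting at the test file, it walks up through every directory that contains an __init__.py and inserts the first directory without one at the front of sys.path. With api/__init__.py present, that directory is src, so a bare document is not importable; without it, src/api itself is inserted and from document import Document resolves. A quick diagnostic (a hypothetical extra test, not a fix) is to print sys.path from inside a test run:
# tests/test_syspath.py -- temporary diagnostic, uses nothing beyond the stdlib
import sys
import pprint

def test_show_sys_path():
    # run `pytest -s` with and without api/__init__.py and compare:
    # one of the leading entries switches between .../src and .../src/api
    pprint.pprint(sys.path)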
I would like to use the nx affected:build command to only build the apps that are affected by the last commit in an nx/xplat workspace.
Let's say I have 2 Angular web apps, web-app1 and web-app2.
.
├── apps
│   ├── web-app1
│   └── web-app2
├── libs
│   ├── core
│   └── features
│       ├── app1
│       └── app2
└── xplat
    └── web
        └── features
            ├── app1
            └── app2
When I make a change to /xplat/web/app1, I would expect the nx affected:apps command to only show the web-app1 app. However, nx thinks that the commit affects both apps (web-app2 does not import the app1 feature).
nx affected:apps --base=HEAD~1 --head=HEAD
I've created a sample repo to demonstrate: https://github.com/barryajones/workspace-affected-test
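One way to see why both apps are being pulled in (assuming an Nx version from the same era as affected:apps) is to render the dependency graph restricted to the affected projects and look for a shared library on the path to web-app2:
nx affected:dep-graph --base=HEAD~1 --head=HEAD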
I have two SBT projects as outlined below.
├── Project 1
│   └── src
│       └── main
│           ├── scala
│           │   └── com
│           │       └── xyz
│           │           └── <*.scala>
│           └── resources
│               └── <Typesafe & Log4J config files>
│
└── Project 2
    ├── src
    │   └── main
    │       ├── scala
    │       │   └── com
    │       │       └── xyz
    │       │           └── <*.scala>
    │       └── resources
    │           └── <Typesafe & Log4J config files>
    ├── resources
    │   └── <JS, HTML, Image files etc.>
    ├── other-dir-1
    ├── other-dir-2
    └── other-dir-3
Compiling Project 1 (actually, running the SBT exportedProducts task) produces the following directory structure. unmanagedResourceDirectories points to Project1/src/main/resources, which I believe is the default resourceDirectory (as mentioned in Customizing Paths). In other words, files in the default resource directory are automatically added by exportedProducts.
Project 1
└── target
    └── scala-2.10
        └── classes
            ├── com
            │   └── xyz
            │       └── <*.class>
            └── <Typesafe & Log4J config files>
For Project 2, I want the following directory structure to be produced by exportedProducts.
Project 2
└── target
    └── scala-2.10
        └── classes
            ├── com
            │   └── xyz
            │       └── <*.class>
            ├── <Typesafe & Log4J config files>
            └── resources
                └── <JS, HTML, Image files etc.>
To do this, I added the following to the SBT build file in the appropriate project definition.
unmanagedResourceDirectories in Compile += baseDirectory.value
excludeFilter in unmanagedResources := HiddenFileFilter || "other-dir-*"
includeFilter in unmanagedResources :=
  new SimpleFileFilter(_.getCanonicalPath.startsWith((baseDirectory.value / "resources").getCanonicalPath))
This correctly includes the resources directory but doesn't include the files from Project2/src/main/resources. The target directory looks like the following:
Project 2
└── target
    └── scala-2.10
        └── classes
            ├── com
            │   └── xyz
            │       └── <*.class>
            └── resources
                └── <JS, HTML, Image files etc.>
Adding a custom resource directory in some way masks the content of the default resource directory. I tried something along the lines of what was mentioned in this SO post but wasn't successful.
The other thing I tried was to set unmanagedResourceDirectories in Compile += baseDirectory.value / "resources" and remove both includeFilter and excludeFilter. This adds the files from Project2/src/main/resources correctly, but adds the files and directories from Project2/resources directly to Project2/target/scala-2.10/classes. The target directory looks like the following:
Project 2
└── target
    └── scala-2.10
        └── classes
            ├── com
            │   └── xyz
            │       └── <*.class>
            ├── <Typesafe & Log4J config files>
            └── <JS, HTML, Image files etc.>
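For reference, one direction that might keep the first approach working (only a sketch in the same sbt 0.13-style syntax used above, not verified against this exact build) is to widen the include filter so it accepts files from the default src/main/resources directory as well as the custom resources directory:
unmanagedResourceDirectories in Compile += baseDirectory.value

excludeFilter in unmanagedResources := HiddenFileFilter || "other-dir-*"

// accept files under Project2/resources and under the default src/main/resources
includeFilter in unmanagedResources := new SimpleFileFilter(f =>
  f.getCanonicalPath.startsWith((baseDirectory.value / "resources").getCanonicalPath) ||
  f.getCanonicalPath.startsWith((resourceDirectory in Compile).value.getCanonicalPath)
)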
I want to create my own Perl module, but the problem is that it contains multiple .pm files. The structure is:
lib
├── A_Z.pm
└── T_test
    ├── A.pm
    ├── B.pm
    ├── C.pm
    ├── D.pm
    └── E.pm
I used h2xs -XA -n T_test::A T_test::B T_test::C T_test::D T_test::E. It only handled A.pm; the other files (B.pm, C.pm, D.pm, E.pm) were not considered. Is there any way to process all of the .pm files at the same time?
Use Module::Starter::PBP instead.
$ module-starter --builder=Module::Build --module=A_Z,T_test::{A,B,C,D,E}
Added to MANIFEST: Build.PL
Added to MANIFEST: Changes
Added to MANIFEST: lib/A_Z.pm
Added to MANIFEST: lib/T_test/A.pm
Added to MANIFEST: lib/T_test/B.pm
Added to MANIFEST: lib/T_test/C.pm
Added to MANIFEST: lib/T_test/D.pm
Added to MANIFEST: lib/T_test/E.pm
Added to MANIFEST: MANIFEST
Added to MANIFEST: README
Added to MANIFEST: t/00.load.t
Created starter directories and files
$ tree A_Z
A_Z
├── Build.PL
├── Changes
├── lib
│   ├── A_Z.pm
│   └── T_test
│       ├── A.pm
│       ├── B.pm
│       ├── C.pm
│       ├── D.pm
│       └── E.pm
├── MANIFEST
├── README
└── t
    └── 00.load.t
3 directories, 11 files
You don't have to do anything special. Just make sure all the files are listed in MANIFEST as usual. Both ExtUtils::MakeMaker and Module::Build consider all .pm files to be modules to install.
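For illustration, a minimal Build.PL along the lines of what module-starter generates might look like the following (a sketch; Module::Build finds every .pm under lib/ on its own, so nothing has to list B.pm through E.pm explicitly):
# Build.PL
use strict;
use warnings;
use Module::Build;

my $builder = Module::Build->new(
    module_name       => 'A_Z',
    license           => 'perl',
    dist_version_from => 'lib/A_Z.pm',
);

$builder->create_build_script;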