sys.path incorrect in Pytest - pytest

I have been having a lot of trouble trying to run pytest. Basically, the imports were not working properly. Let me explain. The folder structure of my project is the following:
src
└── api
    ├── __init__.py
    ├── app.py
    ├── document.py
    └── tests
        ├── __init__.py
        └── test_app.py
And app.py has the following import statement:
from document import Document
and obviously test_app.py imports app
Whenever I ran pytest from the src folder, it threw the following error:
ModuleNotFoundError: No module named 'document'
But once I removed the __init__.py file from the api directory, it worked out of the box:
src
└── api
    ├── __init__.py   <--- REMOVED
    ├── app.py
    ├── document.py
    └── tests
        ├── __init__.py
        └── test_app.py
I have been reading many different threads, but none of them explains why this is so.
Does anyone understand why?
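For what it's worth, the difference can be observed from inside a test. This is only a diagnostic sketch (run it with pytest -s so the print shows up); the comments describe my understanding of pytest's default rootdir-based "prepend" import mode:

# tests/test_app.py -- diagnostic sketch
import sys

def test_show_sys_path():
    # In the default "prepend" import mode, pytest walks up from the test file
    # past every directory that contains an __init__.py and inserts the first
    # directory without one into sys.path.
    # With api/__init__.py present, that directory is src, so "document"
    # (which lives in src/api) is not importable.
    # With api/__init__.py removed, the walk stops at src/api, src/api is
    # inserted into sys.path, and "from document import Document" works.
    print("\n".join(sys.path))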
Sources I have been reading through:
pytest cannot import module while python can
PATH issue with pytest 'ImportError: No module named YadaYadaYada'
https://docs.pytest.org/en/6.2.x/pythonpath.html

Related

Bitbake hello world example fails

I'm following this tutorial:
https://docs.yoctoproject.org/bitbake/bitbake-user-manual/bitbake-user-manual-hello.html
My bitbake version is 1.50.0 and BBPATH is the following:
export BBPATH="/home/testusr/projects/bitbake_example/bitbake_sample_project/hello"
and I'm at the last step of the tutorial, with the following file structure:
/home/testusr/projects/bitbake_example/bitbake_sample_project $ tree
.
├── hello
│   ├── classes
│   │   └── base.bbclass
│   └── conf
│       ├── bblayers.conf
│       └── bitbake.conf
└── mylayer
    ├── conf
    │   └── layer.conf
    └── printhello.bb
The problem is when I try to build that:
/home/testusr/projects/bitbake_example/bitbake_sample_project/hello $ bitbake printhello
WARNING: Layer mylayer should set LAYERSERIES_COMPAT_mylayer in its conf/layer.conf file to list the core layer names it is compatible with.
Loading cache: 100% | | ETA: --:--:--
Loaded 0 entries from dependency cache.
WARNING: No bb files in default matched BBFILE_PATTERN_mylayer '^\/home\/testusr\/projects\/bitbake_example\/bitbake_sample_project\/mylayer/'
ERROR: Nothing PROVIDES 'printhello'
The warning says that there are no bb files under:
/home/testusr/projects/bitbake_example/bitbake_sample_project/mylayer/
but printhello.bb is clearly there, as can be seen in the tree output, so I'm totally confused.
The problem was with the layer.conf file from the tutorial. It had:
BBFILES += "${LAYERDIR}/\*.bb"
instead of:
BBFILES += "${LAYERDIR}/*.bb"
The stray backslash breaks the glob, which is why the recipe could not be found.
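For reference, the corrected conf/layer.conf would look roughly like this. It is only a sketch based on the tutorial; the LAYERSERIES_COMPAT value is an assumption (it would also address the first warning in the output above):

# conf/layer.conf (sketch; values follow the bitbake hello-world tutorial)
BBPATH .= ":${LAYERDIR}"

# the corrected glob -- no backslash before the asterisk
BBFILES += "${LAYERDIR}/*.bb"

BBFILE_COLLECTIONS += "mylayer"
BBFILE_PATTERN_mylayer := "^${LAYERDIR_RE}/"

# assumption: bitbake 1.50 corresponds to the hardknott series
LAYERSERIES_COMPAT_mylayer = "hardknott"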

How can I make test data files accessible to pytest tests when run with tox?

I want to run, via tox, a test for a function which accepts a Path to a file as an argument: function(some_path_to_file). The file I want to pass to the function cannot be created temporarily during test setup (which is what I usually do via pytest's built-in tmpdir fixture); it resides in the package's <package>/data directory, next to the test directory <package>/tests (the location <package>/tests/data would probably be better). Because tox runs the tests in a virtualenv, it's not clear to me how to make the test data file available to the test. I know that I can define the base temporary directory of pytest with the --basedir option, but I did not get it working with tox yet.
tl;dr
The problem was the conversion of some_path_to_file from Path to str (to pass it to sqlite3.connect() inside the function) using Path.resolve(). There is no need to configure pytest's --basedir option or tox in any way.
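A minimal sketch of what that fix looks like (the function body here is an illustrative assumption, not the original code):

import sqlite3
from pathlib import Path

def function(some_path_to_file: Path):
    # sqlite3.connect() gets a plain string; resolving first makes the path
    # absolute, so it no longer depends on whichever working directory
    # tox/pytest happens to run from
    return sqlite3.connect(str(some_path_to_file.resolve()))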
This tripped me up as well. The way I was able to solve it was to specify the full path of the text file I wanted the testing function to read relative to the base directory.
So for example, my directory tree looks like this:
.
├── __init__.py
├── my_package
│   ├── __init__.py
│   └── calculate_stats.py
├── my_package.egg-info
│   ├── PKG-INFO
│   ├── SOURCES.txt
│   ├── dependency_links.txt
│   ├── requires.txt
│   └── top_level.txt
├── bin
│   └── calculate_stats
├── requirements
│   ├── default.txt
│   └── development.txt
├── setup.py
├── test
│   ├── __init__.py
│   ├── test_calculate_stats.csv
│   ├── test_calculate_stats.txt
│   └── test_calculate_stats.py
└── tox.ini
In the file test_calculate_stats.py I have the following line:
assert (calculate_stats.calculate_stats_to_csv("test/test_calculate_stats.txt", "test/test_calculate_stats.csv") == 60)
The calculate_stats_to_csv function reads in the test/test_calculate_stats.txt file, calculates some stats, and outputs them to test/test_calculate_stats.csv
Initially I had just specified the input file as test_calculate_stats.txt, because it's in the same directory as the file containing the testing function; that's when I ran into the error, since relative paths are resolved against the directory pytest is invoked from, not against the test module's location.
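An alternative (just a sketch, not what the answer above does) is to anchor the data files to the test module itself, which works no matter where pytest or tox is started from:

# test/test_calculate_stats.py (sketch; assumes calculate_stats is importable
# from my_package as in the tree above)
import pathlib
from my_package import calculate_stats

DATA_DIR = pathlib.Path(__file__).resolve().parent

def test_calculate_stats_to_csv():
    input_file = DATA_DIR / "test_calculate_stats.txt"
    output_file = DATA_DIR / "test_calculate_stats.csv"
    assert calculate_stats.calculate_stats_to_csv(str(input_file), str(output_file)) == 60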
tox predefines a number of substitutions. The directory of the virtualenv is {envdir}, and site-packages is at {envsitepackagesdir}. Pass a value from the command line to your test script like this:
[testenv]
commands = pytest --basedir={envsitepackagesdir}/mypackage
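Note that --basedir is not one of pytest's built-in options, so it has to be registered before pytest will accept it; a minimal conftest.py sketch (the option and fixture names are assumptions):

# conftest.py (sketch)
import pathlib
import pytest

def pytest_addoption(parser):
    parser.addoption("--basedir", action="store", default=".",
                     help="directory that contains the test data files")

@pytest.fixture
def basedir(request):
    # resolve once so tests can build absolute paths to their data files
    return pathlib.Path(request.config.getoption("--basedir")).resolve()

A test can then take basedir as a fixture argument and open files below it.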

If I have a local rpm in my ansible-playbook can I do yum install in one step?

I have downloaded an rpm into my ansible-playbook:
(djangoenv)~/P/c/apache-installer ❯❯❯ tree .
.
├── defaults
│   └── main.yml
├── files
│   ├── apache2latest.tar
│   ├── httpd_final.conf
│   ├── httpd_temp.conf
│   └── sshpass-1.05-9.1.i686.rpm
├── handlers
│   └── main.yml
├── hosts
├── meta
│   └── main.yml
├── README.md
├── tasks
│   └── main.yml
├── templates
├── tests
│   ├── inventory
│   └── test.yml
└── vars
    └── main.yml
My question is why can't I just install it using:
- yum: name=files/sshpass-1.05-9.1.i686.rpm
? It complains that files/sshpass-1.05-9.1.i686.rpm is not found on the system. Now I am doing it in two steps:
- copy: src=files/sshpass-1.05-9.1.i686.rpm dest=/tmp/sshpass-1.05-9.1.i686.rpm force=no
- yum: name=/tmp/sshpass-1.05-9.1.i686.rpm state=present
No, there is no simple way around copying the package to the remote host before installing it. The Ansible yum module expects the file given in the name parameter to be present on the managed host.
IMHO it is not a good idea to keep packages inside the Ansible code base, because they are binary and not really part of the actual Ansible code. It would be cleaner to set up a private repository and store those files there. That is the only way around copying the package in this situation that I'm aware of.
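A sketch of that private-repository approach (the repository URL and names are made up for illustration):

# tasks/main.yml (sketch)
- name: Add the private yum repository
  yum_repository:
    name: internal
    description: Internal package repository
    baseurl: http://repo.example.com/centos/$releasever/$basearch/
    gpgcheck: no

- name: Install sshpass from that repository
  yum:
    name: sshpass
    state: present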

Python 3.5 Parent Module Not Found

In Python 3.5, for a directory structure that looks like this:
.
└── project
    ├── __init__.py
    ├── project.py
    ├── module1
    │   ├── __init__.py
    │   └── module1.py
    └── module2
        ├── __init__.py
        └── module2.py
Why do I receive a "No module named 'project'" error when calling
import project.module1
from module2.py?
As far as I can tell, the docs say this should work.
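As a quick diagnostic (a sketch, assuming the tree above): absolute imports like import project.module1 only resolve if the directory that contains project/ is on sys.path, which depends on how module2.py is launched.

# placed at the top of project/module2/module2.py (sketch)
import sys

# When module2.py is run directly (python project/module2/module2.py),
# sys.path[0] is .../project/module2, so the top-level package "project"
# cannot be found. Running it as a module from the directory that contains
# project/ (python -m project.module2.module2) puts that directory on
# sys.path instead, and the import below succeeds.
print(sys.path[0])

import project.module1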

cucumber-scala and Play Framework integration testing

I've managed to get cucumber-scala up and running on a Play/Scala project. Now I want to run the entire Play application so that I can use something like Selenium to test my application.
My current attempts have led me to:
val app = new FakeApplication()
val port = 3333
lazy val browser: TestBrowser = TestBrowser.of(webDriverClass, Some("http://localhost:" + port))
lazy val server = TestServer(port, app)
Of course, this FakeApplication() is not configured in any way... Am I approaching this incorrectly? This application is also multi-module, and ideally I would like to have the feature tests run per module (see the output from tree below; a sketch of a configured test application follows the tree).
├── README.md
├── build.sbt
├── conf
│   ├── application.conf
│   └── routes
├── logs
│   └── application.log
├── modules
│   ├── module1
│   │   ├── app
│   │   ├── conf
│   │   ├── target
│   │   └── test
│   └── module2
│       ├── app
│       ├── conf
│       └── target
└── project
    ├── build.properties
    ├── plugins.sbt
    ├── project
    │   └── target
    └── target
        ├── config-classes
        ├── resolution-cache
        ├── scala-2.10
        └── streams
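As a sketch of what a configured test application could look like instead of a bare FakeApplication() (GuiceApplicationBuilder is the Play 2.4+ builder; the router class name in the configuration is an assumption about what your build generates):

import play.api.inject.guice.GuiceApplicationBuilder
import play.api.test.TestServer

// build an application with explicit configuration instead of the bare defaults
val app = new GuiceApplicationBuilder()
  .configure("play.http.router" -> "router.Routes") // assumption: your generated router
  .build()

val port = 3333
lazy val server = TestServer(port, app)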
I am aware that Play has a Selenium integration which can be used to drive my tests. However, I have a business requirement for feature files, as they are used as a reporting mechanism. I am not absolutely tied to Cucumber, so if anyone is aware of a way of driving browser-based tests using feature files, that would also be acceptable to me.
Thanks,
Ben
Update:
I was running through IntelliJ, which causes a server to run with no routes or anything else provided. I assume this is because it runs with a default blank application.
However, when running through sbt test I get the following output:
Caused by: com.google.inject.ProvisionException: Unable to provision, see the following errors:
1) Error in custom provider, Configuration error: Configuration error[Router not found: admin.Routes]
while locating play.api.inject.guice.FakeRouterProvider
while locating play.api.routing.Router
for parameter 0 at play.api.http.JavaCompatibleHttpRequestHandler.<init>(HttpRequestHandler.scala:200)
while locating play.api.http.JavaCompatibleHttpRequestHandler
while locating play.api.http.HttpRequestHandler
for parameter 4 at play.api.DefaultApplication.<init>(Application.scala:221)
at play.api.DefaultApplication.class(Application.scala:221)
while locating play.api.DefaultApplication
while locating play.api.Application
If you're using the default settings for sbt's IntegrationTest configuration together with ScalaTestPlusPlay, here are the directories to use:
src/it/scala – Scala IT code
src/it/java – Java IT code
src/it/resources – resources, including configuration
Place your application.conf and routes files in the resources directory and FakeApplication will pick it up. You can also set the paths in your sbt script to some other place.
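If the IntegrationTest configuration is not already enabled in the build, a build.sbt sketch along these lines should do it (the scalatestplus-play version and the PlayScala plugin line are assumptions and should match your setup):

// build.sbt (sketch)
lazy val root = (project in file("."))
  .enablePlugins(PlayScala)
  .configs(IntegrationTest)
  .settings(
    Defaults.itSettings,
    libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "2.0.1" % "it,test"
  )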
If all of your development/integration testing machines are Unix-like and you use git for version control, you can use a relative symlink¹ to your real routes file.
For Windows or other VCSs, you'll likely have to copy the routes file and do double maintenance.
¹ A symlink using a relative path to the original, e.g. ../../../conf/routes.