Load base class for tests from conftest in pytest

We have a lot of repeated tests for many applications (testinfra). I want to create a BaseClass to be inherited by app-specific test classes, to reduce the boilerplate of repeated checks (is this service running? is this port listening?).
Our tests are laid out as /tests/{app-name}/test_{app_name}.py, and there is a /tests/conftest.py file with pytest fixtures.
I'm trying to find a way to put BaseClass somewhere in conftest.py, or near it, and to be able to import it.
The problems:
Tests are run in CI and I don't want to install any modules/packages (meaning I can't just pip install a module containing BaseClass and then do from thismodule import BaseClass).
By convention we must use dashes in {app-name}. That breaks all relative imports, since app-name is not a valid Python package name, even if __init__.py is present.
Is there any reasonable way to load a class from a specific file, or to get it for free (magic at collection time) from pytest?
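For concreteness, the kind of base class I have in mind is just a plain pytest-style class that each app-specific test class would subclass, roughly like this (a sketch; the attribute names are placeholders and it relies on the host fixture that the testinfra plugin already provides):

# Sketch of the shared base class (names are placeholders).
# The `host` fixture comes from the testinfra pytest plugin we already use.
class ServiceTestBase:
    service_name = None  # overridden by each app-specific subclass
    port = None

    def test_service_is_running(self, host):
        service = host.service(self.service_name)
        assert service.is_running
        assert service.is_enabled

    def test_port_is_listening(self, host):
        assert host.socket(f"tcp://0.0.0.0:{self.port}").is_listening

An app-specific file such as /tests/some-app/test_some_app.py would then only need something like:

# hypothetical subclass showing the intended usage
class TestSomeApp(ServiceTestBase):
    service_name = "nginx"
    port = 80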

Related

pytest.ini doesn't take effect when calling pytest vs. pytest <test_name>

I am working on creating some testing infrastructure and struggling to take care of all the dependencies correctly.
The directory structure I have looks like:
conftest.py
kernels/
|-kernel_1/
|---<kernel_src1>
|---__init__.py
|---options.json
|---test/
|-----test_func1.py
|-kernel_2/
|---<kernel_src2>
|---__init__.py
|---pytest.ini
|---options.json
|---scripts/
|-----__init__.py
|-----some_module.py
|---test/
|-----test_func2.py
When I call pytest on any of these tests, the test first compiles and simulates the kernel source code (C++) and compares the output against a golden reference generated in Python. Since each kernel is compiled individually, I create an output directory inside the kernel's directory to store compile/simulation logs along with some generated header files.
For example, pytest kernel_1/test/test_func1.py will create a directory in kernel_1/build_test_func1/<compile/sim logs>.
I use a conftest.py that changes the cwd to the test directory, based on the accepted answer here:
Change pytest working directory to test case directory
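That conftest.py boils down to an autouse fixture along these lines (my paraphrase of the linked answer, not the exact code):

import pytest

@pytest.fixture(autouse=True)
def change_test_dir(request, monkeypatch):
    # chdir into the directory that contains the currently running test module
    monkeypatch.chdir(request.fspath.dirname)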
I also added a pytest.ini to add kernel_2 to the pythonpath when running test_func2, so modules in the scripts folder can be found:
[pytest]
pythonpath=.
Tests run correctly when invoking pytest from any of these locations:
cd kernel_2/; pytest
cd kernel_2/test; pytest
cd kernel_2; pytest test/test_func2.py
cd kernel_2/test; pytest test_func2.py
The test also runs correctly when calling it from the top level with the test specified: pytest kernel_2/test/test_func2.py
But I start seeing a ModuleNotFoundError when calling plain pytest from the top level without specifying the test:
pytest
ImportError while importing test module '<FULL_PATH>/kernel_2/test/test_func2.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
<FULL_PATH>miniconda3/envs/pytest/lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
kernel_2/test/test_func2.py:8: in <module>
from scripts.some_module import some_func
E ModuleNotFoundError: No module named 'scripts'
It looks like the pytest.ini inside a specific kernel doesn't take effect during collection when calling plain pytest, but I haven't been able to find a way to fix this. Any comments or concerns are appreciated!
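One thing worth noting: pytest resolves a single rootdir per invocation and applies only that rootdir's ini file, which would explain why kernel_2/pytest.ini is ignored when collection starts from the top level. A possible workaround (a sketch only, not tested against this setup) is to do the path setup in the top-level conftest.py instead:

# top-level conftest.py (sketch): make each kernel directory importable so
# `from scripts.some_module import some_func` resolves regardless of where
# pytest is invoked from
import sys
from pathlib import Path

ROOT = Path(__file__).parent
for kernel_dir in ROOT.glob("kernel_*"):
    sys.path.insert(0, str(kernel_dir))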

How to debug unit tests while developing a package in Julia

Say I develop a package with a limited set of dependencies (for example, LinearAlgebra).
For the unit-testing part, I might need additional dependencies (for instance, CSV to load a file). I can configure that in the Project.toml, all good.
Now, from there and in VS Code, how can I debug the unit tests? I tried running runtests.jl in the debugger; however, it unsurprisingly complains that the CSV package is unavailable.
I could add CSV as a regular dependency (as a temporary solution), but I would prefer that the debugger run with the unit-testing configuration; how can I achieve that?
As requested, here is how it can be reproduced (it is not quite minimal; instead I used a commonly used package, as that gives confidence that the package itself is not the problem). We will use DataFrames and try to run the debugger on its unit tests.
Make a local version of DataFrames for the purpose of developing a feature in it. I execute dev DataFrames in a new REPL.
Select the correct environment (in .julia/dev/DataFrames) through the VS-code user interface.
Execute the "proper" unit testing by executing test DataFrames at the pkg prompt. Everything should go smoothly.
Try to execute the tests directly (open runtests.jl and use the "Run" button in VS Code). I see some errors of the type:
LoadError: ArgumentError: Package CategoricalArrays not found in current path:
- Run `import Pkg; Pkg.add("CategoricalArrays")` to install the CategoricalArrays package.
which is consistent with CategoricalArrays being present in the [extras] section of the Project.toml but not present in the [deps].
Finally, instead of the "Run" command, execute "Run and Debug". I encounter similar errors; here is the first one:
Test Summary: | Pass Total
merge | 19 19
PASSED: index.jl
FAILED: dataframe.jl
LoadError: ArgumentError: Package DataStructures not found in current path:
- Run `import Pkg; Pkg.add("DataStructures")` to install the DataStructures package.
So I can't debug the code after the part requiring the extras packages.
After all that, I release the development copy with the command free DataFrames at the pkg prompt.
I see the same behavior in my package.
I'm not certain I understand your question, but I think you might be looking for the TestEnv package. It allows you to activate a temporary environment containing the [extras] dependencies. The discourse announcement contains a good description of the use cases.
Your runtests.jl file should contain all the imports necessary to run the tests.
Hence you are expected to have in your runtests.jl file lines such as:
using YourPackageName
using CSV
# the lines with tests now go here.
This is standard Julia package layout. For an example, have a look at any mature Julia package, such as DataFrames.jl (https://github.com/JuliaData/DataFrames.jl/blob/main/test/runtests.jl).

Pytest can't find files/modules

I have had a look at several different topics on this matter but can't work out how to apply them to my situation. I don't have an __init__.py in my test folder, and I have tried using a conftest.py. I have a directory structure like this:
app/
  app.py
  src/
    __init__.py
    module1.py
    module2.py
    module3.py
  configs/
    config.json
    non-default-config.json
  tests/
    test1.py
    conftest.py
where app.py imports module1, which then imports modules 2 and 3 (using import src.module2). I load config.json in all the module files (and in app.py) using:
with open('configs/config.json') as f:
    CFG = json.load(f)
This works when I run app.py from the app directory. However, when I run pytest (which I believe should also be resolving paths from the app directory, since conftest.py is inside the app directory) and it imports module1 (using import src.module1), it cannot find configs/config.json, but it will find app/configs/config.json. I can't use that path, as it would break the app when I run app.py. Yet pytest can find the imports from within the src folder, even though it is at the same level as the configs folder.
If I move conftest.py outside of the app directory and import module1 using import app.src.module1, then that import succeeds, but the import of module2 inside module1 then fails.
How can I resolve this issue? And is there a better way of structuring my project?
Solved this by running pytest from inside the app folder instead of from the base directory.
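For what it's worth, an alternative sketch that removes the dependence on the working directory entirely is to anchor the config path to the source file (shown here assuming the snippet lives in app/src/module1.py):

import json
from pathlib import Path

# parent.parent is the app/ directory when this file is app/src/module1.py,
# so the lookup works no matter which directory the process starts from
CONFIG_PATH = Path(__file__).resolve().parent.parent / "configs" / "config.json"
with CONFIG_PATH.open() as f:
    CFG = json.load(f)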

Setting module name to be different from directory name in SwiftPM

I have a Swift library with a core module plus optional bonus modules. I would like to use the following directory layout, mapping to exported Swift package names as shown:
Taco/
Source/
Core/ → import Taco
Toppings/ → import TacoToppings
SideDishes/ → import TacoSideDishes
To my eyes, that’s a sensible-looking project layout. However, if I’m reading the docs right, this will pollute the global module namespace with unhelpful names like “Core”. It seems that SwiftPM will only export a module whose name is identical to the directory name, and thus I have to do this:
Taco/
Source/
Taco/
TacoToppings/
TacoSideDishes/
Is there a way to configure Package.swift to use the tidier directory layout above and still export the desired module names?
Alternatively, is it possible to make the Core, Toppings, and SideDishes modules internal to the project, and export them all to the world as one big Taco module?
There is not currently a clean way to do this, but it seems like a reasonable request. I recommend filing an enhancement request at http://bugs.swift.org for it.
There is one "hacky" way you can do this:
Create your sources in the desired internal layout:
Sources/Core
Sources/Toppings
Add additional symbolic links for the desired module names:
ln -s Core Sources/Taco
ln -s Toppings Sources/TacoToppings
Add an exclude directive to the manifest to ignore the non-desired module names:
let package = Package(
    name: "Taco",
    exclude: ["Sources/Core", "Sources/Toppings"]
)
is it possible to make the Core, Toppings, and SideDishes modules internal to the project, and export them all to the world as one big Taco module?
No, unfortunately there is currently no way to do this, and supporting it would require substantial compiler work.

Why does the scala-ide not allow multiple package definitions at the top of a file?

In Scala it is common practice to stack package statements to allow shorter imports, but when I load a file that uses stacked packages into the Scala IDE and attempt an import starting with the same organization, I get a compiler error from what appears to be the presentation compiler. The code compiles fine in sbt outside the IDE.
An example code snippet is as follows:
package com.coltfred
package util
package time
import com.github.nscala_time.time.Imports._
On the import I get the error object github is not a member of package com.coltfred.util.com.
If I collapse the packages into a single package statement the error goes away, but we've used this practice frequently in our code base, so changing them all to single-line package statements would be a pain.
Why is this happening and is there anything I can do to fix it?
Edit:
I used the eclipse-sbt plugin to generate the eclipse project file for this. The directory structure is what it should be and all of the dependencies are in the classpath.
Edit 2:
It turns out there was a file in the test tree of the util package (which should have been in the same package) that had a duplicate package statement at the top. I didn't check the test tree because it shouldn't have affected the compilation of the main tree, but apparently I was wrong.
Not sure why the Scala IDE doesn't like this, but you can force the import to start at the top level using _root_:
import _root_.com.github.nscala_time.time.Imports._
See if that avoids irritating the IDE.
This is a common annoyance, one that annoyed paulp into attempting a fix. His idea was that a directory that doesn't contribute class files shouldn't be taken as a package: if you can take util as scala.util, you should do so in preference to foo.util where that util is empty.
The util dir is the usual suspect, because who doesn't have a util dir lying around, and in particular, ./util?
apm@mara:~/tmp/coltfred$ mkdir -p com/coltfred/util/time
apm@mara:~/tmp/coltfred$ mkdir -p com/coltfred/util/com
apm@mara:~/tmp/coltfred$ vi com/coltfred/util/time/test.scala
apm@mara:~/tmp/coltfred$ scalac com/coltfred/util/time/test.scala
./com/coltfred/util/time/test.scala:5: error: object github is not a member of package com.coltfred.util.com
import com.github.nscala_time.time._
^
one error found
apm@mara:~/tmp/coltfred$ cat com/coltfred/util/time/test.scala
package com.coltfred
package util
package time
import com.github.nscala_time.time._
class Test
apm@mara:~/tmp/coltfred$
To debug, find out where the offending package is getting loaded from.