Allure report using Newman - REST

I use "newman" to run API tests on Jenkins server. It's very easy for me, I write test scripts in "Postman" and run my collection in "newman" but I can't provide good reports for my manager. I found "allure report" and I like it. Is there any chance to create allure report if I use "Newman". Does allure support newman?

It looks like it isn't supported out of the box.
Why are you looking for a reporting tool only now, after the tests are written? That analysis usually happens at the start of the automation process, when the QA team evaluates which tools could be used.
I had a look at https://github.com/postmanlabs/newman; could you try parsing the command-line output into a text file and using that output to generate a simple report for your manager?

Yes, you can. Newman can generate a nice and clean report via the Allure-JS reporter. Follow the steps below:
1. Installation
$ npm install -g newman-reporter-allure
2. Run the Newman CLI command to generate Allure results, specifying allure in Newman's -r or --reporters option.
$ newman run <Collection> -e <Environment> -r allure
3. Allure results will be generated in the "allure-results" folder at the project root. Use allure-commandline to serve the report locally:
$ allure serve
4. To generate the static report web application folder, use allure-commandline:
$ allure generate --clean
The report will be generated in the "allure-report" folder at the project root.
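Putting the steps together, a Jenkins shell step could look roughly like this (a sketch; the collection and environment file names are placeholders, and a global npm install of the tools is assumed):
$ npm install -g newman newman-reporter-allure allure-commandline
$ newman run collection.json -e environment.json -r cli,allure
$ allure generate --clean
The resulting "allure-report" folder can then be archived or published from the Jenkins job.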

Try my repository; it is a ready-made solution that covers everything. Then add a couple more things:
Add two files: the collection *.json and the environment *.json
Add execute permission to the start.sh file with chmod, e.g. chmod +x start.sh
Run the script: ./start.sh your_collection.json your_env.json (a sketch of what such a script might contain is shown below)
Finally, you will get 2 reports:
HTML report
Allure report
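The repository link itself is not included above, but a start.sh that produces both reports could look roughly like the sketch below; the htmlextra reporter and the folder names are assumptions made here for illustration:
#!/bin/bash
# Usage: ./start.sh your_collection.json your_env.json
COLLECTION="$1"
ENVIRONMENT="$2"
# Run the collection with an HTML reporter and the Allure reporter
newman run "$COLLECTION" -e "$ENVIRONMENT" -r cli,htmlextra,allure
# Turn ./allure-results into a static report in ./allure-report
allure generate allure-results --clean -o allure-report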

Related

How to define the rules file in Debian packaging of a project which has a Makefile to build from source?

I'm new to Stack Overflow, so correct me if I made any mistakes in providing the details.
I'm trying to make a .deb file for Apache AGE. Going by the documentation, installing AGE from source is as simple as:
make install
I have set up the basic directory structure with dh_make and written the control file with the proper dependencies; next comes the rules file.
So I went through two different PostgreSQL extensions:
postgresql-q3c
PostGIS
I tried to replicate the same for apache-age and build it with the following commands:
dpkg-buildpackage -rfakeroot -b
dpkg-buildpackage -nc -i
The build gave some errors and warnings, but a .deb file was generated.
The .deb file installed properly, but the AGE extension was not installed in PostgreSQL.
This is probably because AGE was not built properly from source with the make command, as specified in the rules file.
Is there any good resource on how to write the rules file?
I tried following this answer, but got stuck here.
I found a PDF but didn't understand the build process.
This might be a naive way, but it works for me:
Clone the repo and cd into it.
Run the dh_make_pgxs command to create the Debian build directory structure.
Then make changes to the pgversion, control/control.in, changelog, copyright, and rules files.
If you are just using the Makefile to build the package, the rules file can be as simple as:
#!/usr/bin/make -f
%:
	dh $@
Then simply run the build command as before.
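As a usage sketch, the package is then built from the source root with the same command as in the question; the install line and package file name are only a guess for illustration:
$ dpkg-buildpackage -rfakeroot -b     # binary-only build; the .deb is written to the parent directory
$ sudo dpkg -i ../apache-age*.deb     # install the result to verify (the file name is a guess)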

Dealing with the No source for code error in Python coverage

I have a new variant of this old "No source for code" issue.
"No source for code" message in Coverage.py
Here is the error I see:
$ coverage report
No source for code: '/home/pauljohn/GIT/projects/ml_grb/grb/packages/grb/tests/C:\Users\G33987\.conda\envs\grb\lib\site-packages\grb\review\review.py'.
Aborting report output, consider using -i.
I'm on Linux, working in a Git repository with a Windows teammate ("G33987"). My current working directory is /home/pauljohn/GIT/projects/ml_grb/grb/packages/grb/tests, as you can see, and the virtual environment I'm using is in ~/venv-grb. The super weird thing is that it is looking for a file in my tests folder with the full path of my teammate's installed "grb" package appended: "C:\Users\G33987\.conda..."
I can add the "-i" flag to ignore the problem, but I want to understand and fix it.
In other posts about this coverage.py issue, the problem was linked to an old copy of .coverage in the tests folder or to leftover *.pyc files. I've checked, and our Git repository does not track any .pyc files. By mistake it was tracking the original .coverage file, but we don't track that anymore, and I've manually deleted it between runs.
So far, I have this workflow
coverage erase
find . -name "*.pyc" -exec rm {} \;
coverage run --source=grb -m pytest .
coverage report
I can run coverage report -i to ignore the issue, and the output does give line-by-line reports on the files in my own virtual environment. But I hate just ignoring an error. Where does the reference to my teammate's virtual environment come from? I'm not using a conda environment; I'm using a plain Python virtual environment.
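As an aside, one way to check whether that Windows path is recorded in the data file itself is coverage.py's built-in debug commands:
$ coverage debug data      # summarizes the .coverage data file and the source files it recorded
$ coverage debug config    # shows the effective configuration (source, paths, data file location)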
Python 3.8.5
coverage 5.5
pytest 6.2.2
py 1.10.0
pluggy 0.13.1

Codecov: error processing coverage reports

I want to add Codecov to this project. Yet Codecov says here that it cannot process my coverage.xml file, which I created with this command in the Travis CI script: pytest python/tests -v --junitxml=coverage.xml
Everything prior to that, like providing my token, seems to work, as suggested by the Travis CI build here.
I thought this could perhaps be a problem with the paths, but I included a potential fix in codecov.yml and nothing changed.
Therefore, I do not think that codecov.yml, travis.yml, or utils/travis_runner.py are part of the problem.
The --junitxml option generates reports in JUnit format, not coverage reports. Use the --cov-report option to generate coverage reports; pytest-cov allows passing --cov-report multiple times to produce reports in different formats. Example:
$ pip install pytest pytest-cov
$ pytest --cov=mypkg --cov-report term --cov-report xml:coverage.xml
This will print the coverage table and generate a Cobertura XML report, which Codecov can process.
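For context, the corresponding Travis job steps might look roughly like this; mypkg is a placeholder and the bash uploader is assumed as the upload mechanism:
$ pip install pytest pytest-cov
$ pytest --cov=mypkg --cov-report term --cov-report xml:coverage.xml python/tests
$ bash <(curl -s https://codecov.io/bash)   # upload coverage.xml to Codecov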

Katalon in Docker Test Suite Pathing

I have a PowerShell script which executes the following, but it is struggling to find the Test Suite path:
docker run -t -v ${pwd}:/katalon/katalon/source katalonstudio/katalon katalon-execute.sh -browserType="Chrome" --privileged -retry=0 -statusDelay=15 -testSuiteCollectionPath='/katalon/katalon/source/Test Suites/'
I have also tried to reference the Test Suite explicitly (as in the official docs):
docker run -t -v $(pwd):/katalon/katalon/source katalonstudio/katalon katalon-execute.sh -browserType="Chrome" -retry=0 -statusDelay=15 -testSuitePath="Test Suites/Fund Fact Details"
This gives an error.
If I go into the container and cd into the Test Suite directory, I can see the files (screenshot: test suite directory inside the container), but the arguments fail when passed in via the PowerShell script.
Any path tips or tricks for Bash/PowerShell would be greatly appreciated.
After much frustration, it turns out the shell scripts and .bat files were using the wrong project file in the bin folder. Once I deleted the bin folder, it used the correct project in the root Katalon folder and the Test Suite paths were found :)
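In other words, the fix amounts to removing the generated bin folder before running the container, e.g. with the same docker command as above:
$ rm -rf bin    # drop the stale bin/ copy so the project file in the Katalon project root is used
$ docker run -t -v $(pwd):/katalon/katalon/source katalonstudio/katalon katalon-execute.sh -browserType="Chrome" -retry=0 -statusDelay=15 -testSuitePath="Test Suites/Fund Fact Details"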

Run all files in a directory to measure coverage

I want to run coverage for all files in a directory.
For instance, I have the following directory structure:
root_dir/
    tests/
        test1.py
        test2.py
    code_dir/
There are some Python files in the tests directory. I want to run them together using coverage run and generate a report.
Individually, I can do it like this:
coverage run tests/test1.py
coverage run tests/test2.py
and generate a report.
How can I do this with a single command?
Thanks.
You should use a test runner to find and run those tests. Either pytest, or python -m unittest discover will do that for you.
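For example, assuming the code to measure lives in code_dir, either runner can be driven through coverage:
$ coverage run --source=code_dir -m pytest tests/
$ coverage report
or, with the standard-library runner:
$ coverage run --source=code_dir -m unittest discover -s tests
$ coverage report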