PyTest Logging Capture & Output Control - pytest

I am using the JSON report plugin (--json-report) to produce a JSON report file that I post-process programmatically at the end of a test session. I want only the post-processed report to be displayed on stdout/stderr. How do I programmatically disable pytest's capturing and output to the screen, re-enable it after post-processing completes, and show only the post-processed report?
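One common approach (a minimal sketch, not the only way) is to drive the session through pytest.main() and redirect its stdout into a buffer, so nothing reaches the console until you print your own summary. The run_suite() stand-in and the report shape below are assumptions for illustration; in a real session you would call pytest.main(["--json-report", ...]) inside the redirect and then load the JSON report file.

```python
import contextlib
import io

def run_suite():
    # Stand-in for the real call (hypothetical report path):
    #   pytest.main(["--json-report", "--json-report-file=report.json"])
    #   report = json.load(open("report.json"))
    # Here we just mimic pytest writing to stdout and returning a report dict.
    print("collected 3 items ...")
    return {"summary": {"passed": 2, "failed": 1}}

buf = io.StringIO()
with contextlib.redirect_stdout(buf):   # swallow everything pytest prints
    report = run_suite()

# Only the post-processed summary reaches the real stdout.
print(f"passed={report['summary']['passed']} failed={report['summary']['failed']}")
```

Note that redirect_stdout only captures Python-level writes; if plugins write to the low-level file descriptor you may additionally need pytest's own `-q`/`--capture` options to quiet the terminal reporter.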

Related

Jmeter - Set custom text on request tab using JSR223 sampler

I am writing a Groovy script to send HTTP requests.
To call this script I'm using the JSR223 Sampler.
The thing is that I would like to reproduce the behaviour of an HTTP Sampler as closely as possible, meaning that I want to perform the request and also fill in the sampler's info (Response data, Request, and Response).
Although I'm able to obtain the SampleResult and set the Response data and Response, there does not seem to be a method to set our own request string:
https://jmeter.apache.org/api/org/apache/jmeter/samplers/SampleResult.html
Following the docs, the closest method to what I want is setRequestHeaders().
If I call that method like SampleResult.setRequestHeaders("My custom text"), something like this appears on the Request tab:
File C:\Users\UserName\groovy_file.groovy
Request Headers:
My custom text
Is there any way to print only the string My custom text on the Request tab?
EDIT
Sampler must use a script file instead of the script field
The easiest way would be just overwriting the data using the prev.samplerData property shorthand from a JSR223 PostProcessor:
prev.samplerData = 'put the desired request data here'
where prev stands for the parent SampleResult class instance; check out the Top 8 JMeter Java Classes You Should Be Using with Groovy article for more information on the JMeter API shorthands available for the JSR223 Test Elements.
If you don't want the PostProcessor you can still call the same function from your Groovy script like:
SampleResult.setSamplerData('put the desired request data here')

How to create a composed test fragment

Let's suppose that to make a REST request C, I need to make a request A and a request B as set-up for the business case.
I know how to run the 3 requests sequentially in JMeter, but I want only C to be measured by the JMeter stats, so I can see TPS and response time. Is there a way to do that?
Let's say that in a real case A and B will not necessarily be executed near C in time, but they need to be requested before C.
There are 2 options:
Add JSR223 PostProcessors as children of requests A and B and use the following code:
prev.setIgnore()
this line invokes the SampleResult.setIgnore() function, suppressing the output of the sampler(s) in the JSR223 PostProcessor's scope. Check out the Top 8 JMeter Java Classes You Should Be Using with Groovy article for more information on the JMeter API shortcuts available for JSR223 Test Elements.
Another option is the Filter Results Tool, which allows removing "not interesting" entries from the .jtl results file. It can be installed using the JMeter Plugins Manager (you will need Merge Results as well); an example command line would be something like:
FilterResults.bat --output-file onlyrequestc.jtl --input-file result.jtl --include-labels "Request C"

Receive Prompts from a report using REST services for Pentaho

I was wondering if it is possible to retrieve the prompts used (with, if possible, all their options) in a report using REST services.
What I would like to achieve is receiving the prompts, and if possible all the options for those prompts, in XML format from any given Pentaho report. I know there are REST calls for basic repository listings etc., but I can't seem to find this specific call.
It is possible to get the full parameter XML (which includes the parameters, their values, and their attributes, i.e. the info used to build the report prompts). You need the BI Server and the Reporting plugin. The URL is:
http://localhost:8080/pentaho/api/repos/%3Apublic%3ASteel%20Wheels%3AInventory%20List%20(report).prpt/parameter
And you have to pass the parameter renderMode with the value PARAMETER.
Here we call the report under /public/Steel Wheels/Inventory List (report).prpt
or simplifying -
"http://localhost:8080/pentaho/api/repos/<path_to_report>.prpt/parameter"
You can also open the browser's developer tools and inspect the requests and responses on the fly; the parameter request you are looking for will show up there.
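To show how the repository path turns into that percent-encoded URL, here is a small Python sketch. The parameter_url() helper is hypothetical (not part of any Pentaho client library), and it assumes the standard BI Server layout where path separators become ':' and the path is then percent-encoded:

```python
from urllib.parse import quote

def parameter_url(base, report_path):
    # Pentaho's /api/repos endpoint expects the repository path with ':'
    # separators, percent-encoded into a single URL segment (assumption:
    # standard BI Server layout, no authentication shown).
    encoded = quote(report_path.replace("/", ":"), safe="")
    return f"{base}/api/repos/{encoded}/parameter?renderMode=PARAMETER"

url = parameter_url("http://localhost:8080/pentaho",
                    "/public/Steel Wheels/Inventory List (report).prpt")
# The parameter XML could then be fetched with any HTTP client, e.g.:
#   xml = urllib.request.urlopen(url).read()
```

Note that quote() also encodes the parentheses ("%28"/"%29"), which the server accepts just as it accepts the literal characters.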

GoodData Export Reports API Call results in incomplete file

I've developed a method that does the following steps, in this order:
1) Get a report's metadata via /gdc/md//obj/
2) From that, get the report definition and use that as payload for a call to /gdc/xtab2/executor3
3) Use the result from that call as payload for a call to /gdc/exporter/executor
4) Perform a GET on the returned URI to download the generated CSV
So this all works fine, but the problem is that I often get back a blank or incomplete CSV. My workaround has been to put a sleep() between getting the URI back and actually calling GET on it. However, as our data grows, I have to keep increasing the delay, and even then there is no guarantee that I get complete data. Is there a way to make sure the report has finished exporting data to the file before calling the URI?
The problem is that the export runs as an asynchronous task - the result at the URL returned in the payload of the POST to /gdc/exporter/executor (in the form /gdc/exporter/result/{project-id}/{result-id}) is only available after the exporter task finishes its job.
If the task is not done yet, a GET to /gdc/exporter/result/{project-id}/{result-id} should return status code 202, which means "we are still exporting, please wait".
So you should periodically poll the result URL until it returns status 200 with a payload (or 40x/50x if something went wrong).
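The polling loop described above can be sketched in Python as follows. The poll_export() helper is hypothetical: fetch is a stand-in for whatever HTTP client call you use (with your GoodData auth headers) that GETs the result URL and returns (status_code, body):

```python
import time

def poll_export(fetch, interval=2.0, timeout=300.0):
    """Poll a GoodData export result URL until the CSV is ready.

    fetch() performs the GET and returns (status_code, body);
    202 means the exporter is still running, 200 means the CSV is ready.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status, body = fetch()
        if status == 200:
            return body                 # finished CSV payload
        if status != 202:
            raise RuntimeError(f"export failed with HTTP {status}")
        time.sleep(interval)            # still exporting; wait and retry
    raise TimeoutError("export did not finish before the timeout")
```

This replaces the fixed sleep() with a loop bounded by a timeout, so the delay grows only as large as the export actually needs.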

Get result of REST sampler in JMeter

Please show me how I can get the result of a REST sampler in JMeter, because I need to do it to check whether my sampler is right or wrong.
Thanks,
You can use a View Results Tree listener and uncheck the Errors and Success check boxes in order to show all the responses you receive. The Response data tab will show the response data, which you can format using the drop-down list in the lower left part of the View Results Tree component, below the request tree.
When you run a test with a large number of requests I suggest you check Errors (to record only the requests that failed) to avoid filling up the RAM.
Or even better (for advanced test verification), you can use Assertions to mark the requests/responses that failed (which doesn't have to mean only "response code != 200"; you may want to include your business logic and check an arbitrary response header/param).
Add a 'View Results Tree' listener to your test plan to see how the tests run.
When clicking on a test you can view the request/response data.
Next you might want to validate that response; add a 'Response Assertion' to your test, where you can match the response against anything you want.