How to generate test overview in Frisby - jasmine-node

As mentioned in the official Frisby documentation (http://frisbyjs.com/), I am using --junitreport, something like the following:
jasmine-node ./demo/validation_spec.js/ --junitreport --output
C:\Users\Administrator\Documents\script/Reports
This generates 15 XML files, since I am calling POST 15 times in a for loop. Each file contains data like the following:
<?xml version="1.0" encoding="UTF-8" ?>
<testsuites>
<testsuite name="Frisby Test:blank action " errors="0" tests="1" failures="0" time="0.284" timestamp="2017-03-21T13:55:15">
<testcase classname="Frisby Test: blank action " name="
[ POST https://localhost:8443/api/v2/settings/pointwise ]" time="0.282"></testcase>
</testsuite>
</testsuites>
My difficulty is that I need to read each and every file to check whether the tests passed or not. I want some kind of overview page, like http://maven.apache.org/surefire/maven-surefire-report-plugin/surefire-report.html,
which contains information on how many tests passed, failed, etc. Is it possible to get that?
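One low-tech option is to aggregate the counts yourself: each report is plain JUnit XML, so a small Node.js script can sum the tests/failures/errors attributes across the files and print a single summary. A minimal sketch, assuming Node 12+ and that the reports land in ./Reports (adjust the directory to wherever --output points):
// summarize-junit.js - sum tests/failures/errors across JUnit XML reports
const fs = require('fs');
const path = require('path');
const reportDir = './Reports'; // placeholder: the --output directory
const totals = { tests: 0, failures: 0, errors: 0 };
for (const file of fs.readdirSync(reportDir)) {
  if (!file.endsWith('.xml')) continue;
  const xml = fs.readFileSync(path.join(reportDir, file), 'utf8');
  // crude attribute scrape; a real XML parser would be more robust
  for (const m of xml.matchAll(/<testsuite\b[^>]*>/g)) {
    for (const key of Object.keys(totals)) {
      const attr = m[0].match(new RegExp(key + '="(\\d+)"'));
      if (attr) totals[key] += Number(attr[1]);
    }
  }
}
console.log(`tests: ${totals.tests}, failures: ${totals.failures}, errors: ${totals.errors}, passed: ${totals.tests - totals.failures - totals.errors}`);
For a nicer HTML overview there are also npm tools that render JUnit XML (xunit-viewer is one I've seen mentioned), though I haven't verified them against Frisby's output.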

Related

SonarQube cannot parse TEST-report.xml which contains any failures

How do I send the XML report to SonarQube? When the TEST-report.xml file contains any failures, the import operation fails. I get an error:
Running SonarQube using SonarQube Runner.
17:18:18.452 ERROR: Error during SonarScanner execution
java.lang.NullPointerException
at java.text.DecimalFormat.parse(DecimalFormat.java:2030)
at java.text.NumberFormat.parse(NumberFormat.java:383)
...
The TEST-report.xml file (JUnit) contains something like:
<?xml version='1.0' encoding='UTF-8'?>
<testsuites name='X UITests.xctest' tests='12' failures='3'>
<testsuite name='X.MoreViewTests' tests='1' failures='1'>
<testcase classname='X.MoreViewTests' name='testSucceed_allAction'>
<failure message='XCTAssertTrue failed'>X UITests/Cases/Bottom Navigation/MoreViewTests.swift:212</failure>
</testcase>
</testsuite>
<testsuite name='X.OrderViewTests' tests='1' failures='0'>
<testcase classname='X.OrderViewTests' name='testSucceed_allAction' time='68.570'/>
</testsuite>
</testsuites>
But when I remove the failure lines, so the file becomes:
<?xml version='1.0' encoding='UTF-8'?>
<testsuites name='X UITests.xctest' tests='12' failures='3'>
<testsuite name='X.OrderViewTests' tests='1' failures='0'>
<testcase classname='X.OrderViewTests' name='testSucceed_allAction' time='68.570'/>
</testsuite>
</testsuites>
It works. Any idea? Thanks.
Your XML report is invalid (doesn't meet the Surefire XML format requirements).
testsuite required parameters:
name - Full class name of the test for non-aggregated testsuite documents. Class name without the package for aggregated testsuites documents
tests - The total number of tests in the suite
failures - The total number of tests in the suite that failed. A failure is a test which the code has explicitly failed by using the mechanisms for that purpose. e.g., via an assertEquals
errors - The total number of tests in the suite that errored. An errored test is one that had an unanticipated problem. e.g., an unchecked throwable; or a problem with the implementation of the test
skipped - The total number of ignored or skipped tests in the suite
The exception starts with DecimalFormat.parse
java.lang.NullPointerException
at java.text.DecimalFormat.parse(DecimalFormat.java:2030)
at java.text.NumberFormat.parse(NumberFormat.java:383)
which means that the XML report parser couldn't parse a number. You have two missing parameters: errors and skipped. I don't know how you generate these reports (which JUnit version, etc.), but the tool you use is probably outdated or misconfigured.
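For illustration (not your exact report, just a sketch of the shape the parser expects), a testsuite element carrying all of the counts listed above might look like this, with errors and skipped added and a time value on the testcase:
<testsuite name='X.MoreViewTests' tests='1' failures='1' errors='0' skipped='0' time='0.212'>
<testcase classname='X.MoreViewTests' name='testSucceed_allAction' time='0.212'>
<failure message='XCTAssertTrue failed'>X UITests/Cases/Bottom Navigation/MoreViewTests.swift:212</failure>
</testcase>
</testsuite>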

Kafka connect - file pulse - 'xml attribute' extraction

I am trying to use the File Pulse connector to read XML files.
I am new to Kafka/Kafka Connect/XML processing.
For a file like the one below, I'd like to keep the "unit" attribute and its values "string1" and "string2".
Currently, by default, the processed payload drops them.
<?xml version="1.0" encoding="UTF-8"?>
<data>
<someField>someValue</someField>
<anotherField-I-Forced-the-type-to-Array>
<value unit="string1">123</value>
<value unit="string2">456</value>
</anotherField-I-Forced-the-type-to-Array>
<lastField>lastValue</lastField>
</data>
Does some kind of configuration for this already exist?
I have not found it in the docs: https://streamthoughts.github.io/kafka-connect-file-pulse/docs/developer-guide/file-readers/
Please help, and maybe give some examples if a solution already exists.
Currently I get the payload below; you can see that unit and its values string1 and string2 are gone.
"anotherField-I-Forced-the-type-to-Array": [
{
"value": [
"123",
"456"
]
}
],
P.S. The version I used is 1.5.2, from the zip downloaded here: https://github.com/streamthoughts/kafka-connect-file-pulse/releases
Curiously, based on this article: https://medium.com/streamthoughts/streaming-data-into-kafka-s01-e02-loading-xml-file-21b5e69c645
the playlist element does have a 'name' attribute, and it was not lost:
<playlist name="BestOfStarWars">
FYI, this has now been fixed, very quickly, in version 1.5.3.

Azure DevOps not showing the testsuite name under a testsuite in the report (XML) section of the pipeline

I am generating XML reports via Postman in the Azure DevOps pipeline. My Postman collection structure is:
Collection name -> Folder name -> Requests (containing test cases)
When the XML report gets generated, it looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="Suite1" tests="2" time="1.317">
<testsuite name="Folder1 / Request1" id="abc" timestamp="time/date" tests="1" failures="1" errors="0" time="0.658">
<testcase name="Verify that the success status code is 200" time="0.179" classname="ABC">
<failure type="AssertionFailure" message="expected response to have status code 200 but got 401">
I expect Azure DevOps to show the generated XML report categorized as:
Suite1 -> Folder1 -> Test cases inside Request1
but it actually shows it like this:
Suite1 -> Test cases inside Request1
I expect all the test cases to be inside "Folder1" rather than directly under Suite1; somehow it is skipping "Folder1" inside "Suite1". Let me know if more explanation is needed.

XSLT streaming not streaming

I'm using Saxon-EE for streaming XSLT transformation of large XML files. The transformation works fine, but it seems it's not really streaming, since the java.exe process keeps inflating: for a 100 MB XML file, process memory grows by roughly 1 GB. This is the XSLT:
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="3.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:fo="http://www.w3.org/1999/XSL/Format"
xmlns:bb="urn:xx-zz-1.1"
xmlns:aa="urn:xx-yy-1.1">
<xsl:mode streamable="yes"/>
<xsl:output method="text" omit-xml-declaration="yes" indent="no"/>
<xsl:template match="/">
<xsl:for-each select="aa:LevelOne/aa:LevelTwo">
<xsl:iterate select="bb:LevelThree! copy-of(.)">
<xsl:value-of select="concat(bb:fieldOne,',',bb:fieldTwo,'
')"/>
</xsl:iterate>
</xsl:for-each>
</xsl:template>
</xsl:stylesheet>
This is the XML:
<?xml version="1.0" encoding="utf-8"?>
<aa:LevelOne xmlns="urn:xx-zz-1.1" xmlns:aa="urn:xx-yy-1.1">
<aa:LevelTwo xmlns="urn:xx-zz-1.1" xmlns:aa="urn:xx-yy-1.1">
<LevelThree xmlns="urn:xx-zz-1.1">
<fieldOne>f1</fieldOne>
<fieldTwo>f2</fieldTwo>
</LevelThree>
<!-- Level three is repeated many times -->
</aa:LevelTwo>
</aa:LevelOne>
I would like to know whether (and what) the problem is with the XSLT above.
The code I use:
net.sf.saxon.s9api.Processor processor = new net.sf.saxon.s9api.Processor(true);
processor.setConfigurationProperty(Feature.STREAMABILITY, "standard");
XsltCompiler compiler = processor.newXsltCompiler();
XsltExecutable stylesheet = compiler.compile(new StreamSource(stylesheetFile));
Serializer out = processor.newSerializer(outputCsvFile);
Xslt30Transformer transformer = stylesheet.load30();
transformer.applyTemplates(new StreamSource(xmlFile), out);
EDIT: Fixed the XSLT so it compiles & added XML example.
Remark: running java -cp "<path>\test;<path>\saxon9ee.jar" com.example.test.Test -t does not output any additional info (only the printlns in the code). Running java -cp "<path>\test;<path>\saxon9ee.jar" -t com.example.test.Test outputs: Unrecognized option: -Xt Error: Could not create the Java Virtual Machine. If I change the XSLT to a non-streamable rule, e.g. remove the iterate line, the program outputs Template rule is not streamable, also without the -t option. In that case, if I remove the streamability requirement from the code/XSLT, the error goes away.
Thanks.
The most likely reason Saxon would fall back to non-streaming mode is that it hasn't located a Saxon-EE license. The easiest way to test that is (unintuitively!) by calling processor.isSchemaAware() - that will only be true if you're running Saxon-EE code with a recognized license, which is exactly the same condition required to enable streaming.
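For example, a quick check from the code in the question (using the same Processor instance; the println is purely illustrative):
// true only when Saxon-EE is running with a recognized license -
// the same condition needed for streaming
System.out.println("Schema aware: " + processor.isSchemaAware());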
If it hasn't found a license, the Saxon documentation includes a section on troubleshooting license problems at http://www.saxonica.com/documentation/index.html#!about/license
Also, try it from the command line with option -t; that will give you more information (a) about streaming, and (b) about loading of license files.
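For illustration, a command-line run goes through Saxon's net.sf.saxon.Transform entry point, and -t is a Saxon option, so it belongs after the class name rather than before it (file names here are placeholders):
java -cp "<path>\saxon9ee.jar" net.sf.saxon.Transform -t -s:input.xml -xsl:stylesheet.xsl -o:output.csv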
I think, if the data is as simple and regular as shown in the question, you can avoid the use of copy-of() and simply use
<xsl:iterate select="bb:LevelThree">
<xsl:value-of select="bb:*" separator=","/>
<xsl:text>
</xsl:text>
</xsl:iterate>
That, in a quick test, shows reduced memory consumption compared to your posted approach.
As for your posted approach not streaming with Saxon EE 9.9: I have tested the posted XSLT and the input sample with Saxon 9.9 EE from the command line with the -t option, and it shows that the input is streamed.
I also think the Java code shown is fine for processing the file with streaming under Saxon EE.
For a detailed analysis of the memory consumption and any problems you encounter with it, it might be better to raise an issue with all the details on the Saxonica support site. I am not sure how the memory info Saxon outputs relates exactly to what you see for java.exe.

using curl to interact with REST API - error: "Content not allowed in prolog"

I am sending the following curl command from the terminal in Mac OS X:
curl -d "OPERATION_NAME=ADD_REQUEST&TECHNICIAN_KEY=1AD….4&INPUT_DATA=TestData.xml" http://xxx.xx.xx.xx/sdpapi/request/
I am getting back the response:
FailedError when performing - ADD_REQUEST - Content is not allowed in prolog.
Here is my xml file:
<?xml version="1.0" encoding="UTF-8"?>
<Operation>
<Details>
<requester>Me</requester>
<subject>Test</subject>
<description>Testing curl input</description>
</Details>
</Operation>
I have checked, and my XML file is indeed a UTF-8 file. I can tell from Google searches that this most likely has to do with my encoding; however, I can't find how to fix it.
I've also tried saving the XML file as ANSI on my PC; the XML file then looks like this:
<?xml version="1.0"?>
<Operation>
<Details>
<requester>Me</requester>
<subject>Test</subject>
<description>Testing curl input</description>
</Details>
</Operation>
I downloaded Notepad++ and checked the encoding, which is UTF-8 with no BOM.
I am still getting the same error. Can anyone see what I am doing wrong?
Updated 9/19 to add:
In addition to everything I've tried in the comments below, I've also tried this:
curl -d "OPERATION_NAME=ADD_REQUEST&TECHNICIAN_KEY=xxxxxxxxxxxxxxxxx&INPUT_DATA=<?xml version="1.0" encoding="utf-8"?><Operation><Details><requester>Me</requester><subject>Test</subject><description>Testing curl input</description></Details></Operation>" http://xxx.xx.xx.xx/sdpapi/request/
The error I'm getting now is: "Error when performing - ADD_REQUEST - The value following "version" in the XML declaration must be a quoted string."
Anyone have any thoughts?
I replaced <?xml version="1.0" encoding="utf-8"?> with <?xml version=%221.0%22 encoding=%22utf-8%22?>
It is now working.
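As a possible alternative to escaping the quotes by hand, curl can read the XML from the file and URL-encode it for you via --data-urlencode with the name@file form; a sketch, assuming the document is still in TestData.xml and the same endpoint and key as above:
curl -d "OPERATION_NAME=ADD_REQUEST" -d "TECHNICIAN_KEY=xxxxxxxxxxxxxxxxx" --data-urlencode "INPUT_DATA@TestData.xml" http://xxx.xx.xx.xx/sdpapi/request/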