I'm considering using FitNesse to write some acceptance tests for some extensions to a RESTful API. The GET response includes XML in a default (unprefixed) namespace, e.g.
<?xml version="1.0" encoding="utf-8"?>
<things xmlns="http://example.com/ns/">
<thing id="1"/>
<thing id="2"/>
</things>
The FitNesse fixture RestFixture seems a good candidate for this. It should allow me to run an XPath against the response to verify it, but this does not appear to play nicely with default namespaces. The following test fails because the XPath needs the namespace specified:
|!-smartrics.rest.fitnesse.fixture.RestFixture-!|http://example.com/v1.0/inbox |
|GET | /things | 200 | | //thing |
I can find no way of expressing the XPath such that RestFixture will parse it successfully.
A couple of notes:
(a) You can query attributes because they're not in a namespace. The following passes:
|GET | /things | 200 | | //@id |
(b) An example elsewhere suggested using string matching. This is wrong - the following passes too!
|GET | /things | 200 | | 'complete and utter nonsense' |
RestFixture now supports namespaces.
You need to define the namespace context as a key/value map of alias to namespace URI using the RestFixtureConfig (this must include an alias for the default namespace too).
You can then use the aliases defined there in the XPaths that match the response body, or in the let() command, to extract data from the response.
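For example, something along these lines binds an alias to the default namespace and uses it in the expectation (a sketch; check the exact property name, restfixture.xml.namespace.context, and the let() row layout against the bundled documentation):
|!-smartrics.rest.fitnesse.fixture.RestFixtureConfig-!|
|restfixture.xml.namespace.context|ns=http://example.com/ns/|

|!-smartrics.rest.fitnesse.fixture.RestFixture-!|http://example.com/v1.0/inbox |
|GET | /things | 200 | | //ns:thing |
|let | firstId | body | //ns:thing[1]/@id | |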
An example is included in the live documentation of the rest-fixture:
https://github.com/smartrics/RestFixture/downloads (check the downloadable RestFixture-<ver>.html)
I am able to execute a WebUI feature file against a single browser (Zalenium) using the parallel runner, with the driver defined in karate-config.js. How can we execute a WebUI feature file against multiple browsers (Zalenium) using the parallel runner or distributed testing?
Use a Scenario Outline and the parallel runner. Karate will run each row of an Examples table in parallel. But you will have to move the driver config into the Feature.
Just add a parallel runner to this sample project and try: https://github.com/intuit/karate/tree/master/examples/ui-test
Scenario Outline: <type>
* def webUrlBase = karate.properties['web.url.base']
* configure driver = { type: '#(type)', showDriverLog: true }
* driver webUrlBase + '/page-01'
* match text('#placeholder') == 'Before'
* click('{}Click Me')
* match text('#placeholder') == 'After'
Examples:
| type |
| chrome |
| geckodriver |
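For reference, a minimal JUnit 5 parallel runner for such a project might look like this (a sketch; the classpath:ui package and the thread count are assumptions):
import com.intuit.karate.Results;
import com.intuit.karate.Runner;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class UiTestRunner {
    @Test
    void testParallel() {
        // each Examples row of the Scenario Outline above runs on its own thread
        Results results = Runner.path("classpath:ui").parallel(2);
        assertEquals(0, results.getFailCount(), results.getErrorMessages());
    }
}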
There are other ways you can experiment with. Here is another pattern: keep a normal Scenario in main.feature, and call it from a Scenario Outline in a separate "special" feature that is used only when you want this kind of parallelization of UI tests.
Scenario Outline: <config>
* configure driver = config
* call read('main.feature')
Examples:
| config! |
| { type: 'chromedriver' } |
| { type: 'geckodriver' } |
| { type: 'safaridriver' } |
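Here main.feature itself needs no driver config of its own, e.g. (a sketch reusing the steps from the first example above):
Feature: main

Scenario: re-usable UI flow, the driver type is configured by the caller
* def webUrlBase = karate.properties['web.url.base']
* driver webUrlBase + '/page-01'
* match text('#placeholder') == 'Before'
* click('{}Click Me')
* match text('#placeholder') == 'After'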
EDIT - also see this answer: https://stackoverflow.com/a/62325328/143475
And for other ideas: https://stackoverflow.com/a/61685169/143475
EDIT - it is possible to re-use the same browser instance for all tests and the Karate CI regression test does this, which is worth studying for ideas: https://stackoverflow.com/a/66762430/143475
I'm building a Gauge automation project with Selenium, Maven and Java. When executing a specification with an included table data like
# Specification
| name |
| A |
| B |
| C |
## Scenario 1
* User logs in application
## Scenario 2
* User does something for product <name>
When run in a single thread:
mvn clean install
Output:
Scenario 1
Scenario 2 for name A
Scenario 2 for name B
Scenario 2 for name C
And then it moves to the next specification.
However, Gauge behaves differently when running the same spec in parallel on 2 nodes:
mvn clean install -DinParallel=true -Dnodes=2
Output:
Browser 1: Scenario 1
Browser 2: Scenario 2 for name A
Browser 1: Scenario 2 for name B
Browser 2: Scenario 2 for name C
You can immediately see that the scenarios from Browser 2 will not succeed, as the "precondition" from Scenario 1 was never run.
Is there a way to parallelize Gauge at specification level?
Note: I know that rewriting the scenarios to be self-contained is one way to go, but these tests get really long, really fast and increase the running time.
After some experimenting, it turns out that Gauge has two different parallelization behaviors, depending on how you write the spec.
With a spec with test data like
# Specification
| name |
| A |
| B |
| C |
## Scenario 1
* User logs in application
## Scenario 2
* User does something for product <name>
the parallelization is done at scenario level, as described in the original question:
mvn clean install -DinParallel=true -Dnodes=2
Output:
Browser 1: Scenario 1
Browser 2: Scenario 2 for name A
Browser 1: Scenario 2 for name B
Browser 2: Scenario 2 for name C
However, when rewriting the specification to incorporate the test data into the steps
# Specification
## Scenario 1
* User logs in application
## Scenario 2 for A
* User does something for product "A"
## Scenario 2 for B
* User does something for product "B"
## Scenario 2 for C
* User does something for product "C"
the output looks something like
mvn clean install -DinParallel=true -Dnodes=2
Output:
Browser 1: Scenario 1
Browser 1: Scenario 2 for name A
Browser 1: Scenario 2 for name B
Browser 1: Scenario 2 for name C
which effectively applies parallelization at spec level rather than scenario level.
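If the login precondition still has to hold for every scenario no matter which node picks it up, another option (at the cost of repeated logins) is to move it into a gauge-java execution hook; a sketch, where LoginHelper is a hypothetical wrapper around the Selenium login steps:
import com.thoughtworks.gauge.BeforeScenario;

public class ExecutionHooks {
    // Runs before every scenario on whichever node executes it,
    // so no scenario depends on Scenario 1 having run first
    @BeforeScenario
    public void loginBeforeEachScenario() {
        LoginHelper.loginToApplication(); // hypothetical helper
    }
}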
Let's say I have a scenario in my demo.feature file:
Scenario Outline: Gather and load all submenus
Given I will login using <username> and <password>
When I will click all links
Examples:
| username | password |
| user1 | pass1 |
| user2 | pass2 |
Let's say I have a file called users.json.
How can I get those usernames and passwords from that external file into my demo.feature?
Can I pick the file up by passing parameters to my npm script, like below?
npm run cucumber -- --params.environment.file=usernames.json
I recommend having the login step access that JSON file within the step definition (see the sketch after the list below). Just make sure not to check it into the repository; instead, expect it to exist at a known location, locally only.
Doing the above is useful for a couple of reasons:
- An engineer running your tests does not need to know that a param must be passed in from the command line
- The code is self-descriptive in that step as to how it logs in
- You can add better error handling
- You can use multiple user files if need be, by having hooks define paths etc. based on tags
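A sketch of such a step definition with the current @cucumber/cucumber package (the step text, file location and loginPage helper are assumptions):
const { Given } = require('@cucumber/cucumber');
const fs = require('fs');

// expected to exist locally, never checked into the repository
const USERS_FILE = process.env.USERS_FILE || 'users.json';

Given('I will login as {string}', async function (userKey) {
  const users = JSON.parse(fs.readFileSync(USERS_FILE, 'utf8'));
  const { username, password } = users[userKey];
  await this.loginPage.login(username, password); // hypothetical page-object helper
});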
I want to inspect a list of all the routes that a Vapor app is serving. Is there a script or a run-time command that will generate the list for me?
I'm looking for something similar to rake routes in Ruby on Rails.
Run vapor run routes from the command line after executing vapor build
Another alternative to get the routes is the following command. It will build your project and display all routes registered to the Application's Router in an ASCII-formatted table.
$ swift run Run routes
+------+------------------+
| GET | /search |
+------+------------------+
| GET | /hash/:string |
+------+------------------+
A colon preceding a path component indicates a variable parameter. A colon with no text following is a parameter whose result will be discarded.
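For context, route registrations that would produce the table above could look like this (a sketch using the Vapor 3-era Router API; the handler bodies are assumptions):
router.get("search") { req -> String in
    return "search results"
}
router.get("hash", String.parameter) { req -> String in
    // the String.parameter component is what prints as :string above
    let input = try req.parameters.next(String.self)
    return String(input.hashValue)
}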
Continuing from this SO question.
When following the OpenDDS install guide, I attempt to run configure from within the command prompt but receive this error output:
C:\Users\Supervisor\Desktop\opendds>C:\Users\Supervisor\Desktop\opendds\configure.cmd
Options:
'compiler' => 'gcc'
'verbose' => 1
host system is: win32
compiler is: gcc
Using ace_src: C:/Users/Supervisor/Desktop/opendds/ACE_wrappers
Using tao_src: C:/Users/Supervisor/Desktop/opendds/ACE_wrappers/TAO
ACE_ROOT/ace/config.h exists, skipping configuration of ACE+TAO
ENV: saving current environment
ENV: Appending ;C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\bin;C:\Users\Supervisor\Desktop\opendds\bin;C:\Users\Supervisor\Desktop\opend
ds\ACE_wrappers\lib;C:\Users\Supervisor\Desktop\opendds\lib to PATH
ENV: Setting ACE_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers
ENV: Setting MPC_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\MPC
ENV: Setting CIAO_ROOT to unused
ENV: Setting TAO_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\TAO
ENV: Setting DDS_ROOT to C:\Users\Supervisor\Desktop\opendds
ENV: Setting DANCE_ROOT to unused
Use of uninitialized value $mpctype in concatenation (.) or string at configure line 1028.
OpenDDS mwc command line: -type C:\Users\Supervisor\Desktop\opendds\DDS_TAOv2_all.mwc
Use of uninitialized value $mpctype in string eq at configure line 1031.
Running MPC to generate project files.
MPC_ROOT was set to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\MPC.
Using .../opendds/ACE_wrappers/bin/MakeProjectCreator/config/MPC.cfg
ERROR: Invalid type: C:\Users\Supervisor\Desktop\opendds\DDS_TAOv2_all.mwc
mwc.pl v4.1.8
Usage: mwc.pl [-global <file>] [-include <directory>] [-recurse]
[-ti <dll | lib | dll_exe | lib_exe>:<file>] [-hierarchy]
[-template <file>] [-relative NAME=VAL] [-base <project>]
[-noreldefs] [-notoplevel] [-static] [-genins] [-use_env]
[-value_template <NAME+=VAL | NAME=VAL | NAME-=VAL>]
[-value_project <NAME+=VAL | NAME=VAL | NAME-=VAL>]
[-make_coexistence] [-feature_file <file name>] [-gendot]
[-expand_vars] [-features <feature definitions>]
[-exclude <directories>] [-name_modifier <pattern>]
[-apply_project] [-version] [-into <directory>]
[-gfeature_file <file name>] [-nocomments]
[-relative_file <file name>] [-for_eclipse]
[-workers <#>] [-workers_dir <dir> | -workers_port <#>]
[-language <cplusplus | csharp | java | vb>]
[-type <automake | bcb2007 | bcb2009 | bds4 | bmake | cc | cdt6 |
cdt7 | em3 | ghs | gnuace | gnuautobuild | html | make |
nmake | rpmspec | sle | vc6 | vc7 | vc8 | vc10 | vc11 |
vc12 | vc14 | vc71 | vc9 | vxtest | wb26 | wb30 | wix>]
[files]
-base Add <project> as a base project to each generated
project file. Do not provide a file extension, the
.mpb extension will be tried first; if that fails the
.mpc extension will be tried.
-exclude Use this option to exclude directories or files when
searching for input files.
-expand_vars Perform direct expansion, instead of performing relative
replacement with either -use_env or -relative options.
-feature_file Specifies the feature file to read before processing.
The default feature file is default.features under the
config directory.
-features Specifies the feature list to set before processing.
-for_eclipse Generate files for use with eclipse. This is only
useful for make based project types.
-gendot Generate .dot files for use with Graphviz.
-genins Generate .ins files for use with prj_install.pl.
-gfeature_file Specifies the global feature file. The
default value is global.features under the
config directory.
-global Specifies the global input file. Values stored
within this file are applied to all projects.
-hierarchy Generate a workspace in a hierarchical fashion.
-include Specifies a directory to search when looking for base
projects, template input files and templates. This
option can be used multiple times to add directories.
-into Place all output files in a mirrored directory
structure starting at <directory>. This should be a
full path. If any project within the workspace is
referenced via a full path, use of this option is
likely to cause problems.
-language Specify the language preference; possible values are
[cplusplus, csharp, java, vb]. The default is
cplusplus.
-make_coexistence If multiple 'make' based project types are
generated, they will be named such that they can coexist.
-name_modifier Modify output names. The pattern passed to this
parameter will have the '*' portion replaced with the
actual output name. Ex. *_Static
-apply_project When used in conjunction with -name_modifier, it applies
the name modifier to the project name also.
-nocomments Do not place comments in the generated files.
-noreldefs Do not try to generate default relative definitions.
-notoplevel Do not generate the top level target file. Files
are still processed, but no top level file is created.
-recurse Recurse from the current directory and generate from
all found input files.
-relative Any $() variable in an mpc file that is matched to NAME
is replaced by VAL only if VAL can be made into a
relative path based on the current working directory.
This option can be used multiple times to add multiple
variables.
-relative_file Specifies the relative file to read before processing.
The default relative file is default.rel under the
config directory.
-static Specifies that only static projects will be generated.
By default, only dynamic projects are generated.
-template Specifies the template name (with no extension).
-workers Specifies number of child processes to use to generate
projects.
-workers_dir The directory for storing temporary output files
from the child processes. The default is '/tmp/mpc'
If neither -workers_dir nor -workers_port is used,
-workers_dir is assumed.
-workers_port The port number for the parent listener. If neither
-workers_dir nor -workers_port is used, -workers_dir
is assumed.
-ti Specifies the template input file (with no extension)
for the specific type (ex. -ti dll_exe:vc8exe).
-type Specifies the type of project file to generate. This
option can be used multiple times to generate multiple
types. There is no longer a default.
-use_env Use environment variables for all uses of $() instead
of the relative replacement values.
-value_project This option allows modification of a project variable
assignment. Use += to add VAL to the NAME's value.
Use -= to subtract and = to override the value.
This can be used to introduce new name value pairs to
a project. However, it must be a valid project
assignment.
-value_template This option allows modification of a template input
name value pair. Use += to add VAL to the NAME's
value. Use -= to subtract and = to override the value.
-version Print the MPC version and exit.
Error from MPC, stopped at configure line 1035.
The cmd script being run is:
@echo off
:: Win32 configure script wrapper for OpenDDS
:: Distributed under the OpenDDS License.
:: See: http://www.opendds.org/license.html
for %%x in (perl.exe) do set PERLPATH=%%~dp$PATH:x
if x%PERLPATH%==x (
echo ERROR: perl.exe was not found. This script requires ActiveState Perl.
exit /b 1
)
set PERLPATH=
perl configure -verbose --compiler=gcc %*
if exist setenv.cmd call setenv.cmd
And the section of configure that generates the error is:
my $mwcargs = "-type $mpctype $buildEnv->{'DDS_ROOT'}$slash$ws $static";
$mwcargs .= ' ' . $opts{'mpcopts'} if defined $opts{'mpcopts'};
print "OpenDDS mwc command line: $mwcargs\n" if $opts{'verbose'};
print 'Running MPC to generate ', ($mpctype eq 'gnuace' ? 'makefiles' :
'project files'), ".\n";
if (!$opts{'dry-run'}) {
if (system("perl $ENV{'ACE_ROOT'}/bin/mwc.pl $mwcargs") != 0) {
die "Error from MPC, stopped";
}
}
And here is where the uninitialized variable is set (on Windows $slash is not '/', so $mpctype falls back to $opts{'compiler_version'}, which is evidently never set):
my $mpctype = ($slash eq '/') ? 'gnuace' : $opts{'compiler_version'};
I have both Perl and Visual Studio installed. Looking up MPC, I can only find a multi-precision library. Could this be because I am using gcc? I have to use GCC because I eventually need to build a library out of this code for use with JNI...
You need to make sure that you are using ActiveState Perl on Windows; other Perl variants seem not to work 100%.
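Since configure.cmd picks whichever perl.exe appears first on PATH (that is what the for %%x loop in the wrapper does), you can verify which interpreter will be used from the same command prompt:
where perl
perl -v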