Display passed tests while they run with latest NUnit3TestAdapter version 4+ - nunit

I'm having a silly problem. With NUnit3TestAdapter version 3 under .NET 5, I could see tests as they passed, with their execution times detailed (the "Passed Test1" lines in the following transcript), as long as verbosity was set to at least normal:
$ dotnet test -v normal
[...]
NUnit Adapter 3.17.0.0: Test execution complete
Passed Test1 [21 ms]
Passed Test2 [< 1 ms]
Test Run Successful.
Total tests: 2
Passed: 2
I recently upgraded to .NET 6 and NUnit adapter 4.2.0, and now I'm unable to get the detailed output, even at the highest (detailed) verbosity:
$ dotnet test -v detailed
[...]
Test run for /tmp/nunit-repro/bin/Debug/net6.0/nunit-repro.dll (.NETCoreApp,Version=v6.0)
Microsoft (R) Test Execution Command Line Tool Version 17.0.0
Copyright (c) Microsoft Corporation. All rights reserved.
Starting test execution, please wait...
A total of 1 test files matched the specified pattern.
Passed! - Failed: 0, Passed: 2, Skipped: 0, Total: 2, Duration: 24 ms - /tmp/nunit-repro/bin/Debug/net6.0/nunit-repro.dll (net6.0)
I've been looking around for some time now and cannot find a relevant configuration option. Am I missing something?
With integration test suites made up of hundreds of tests and taking several minutes to run, it's quite frustrating to have no visual progress whatsoever and not know whether things are running or hanging.

Someone found the solution for me: add -l "console;verbosity=detailed" to the command:
dotnet test -l "console;verbosity=detailed"
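For anyone reproducing this, the project behind the transcripts above only needs a couple of trivial NUnit tests. Here is a minimal sketch (the test names are simply chosen to match the output shown):

using NUnit.Framework;

namespace NunitRepro
{
    public class Tests
    {
        [Test]
        public void Test1() => Assert.Pass();

        [Test]
        public void Test2() => Assert.Pass();
    }
}

Running dotnet test -l "console;verbosity=detailed" against a project like this prints the per-test "Passed Test1 [...]" lines again.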

Related

NUnit tests that fail on SetUp are not displayed in TeamCity

I use NUnit and TeamCity to run my tests.
Some tests (not all) have actions performed in the test class constructor. I call these actions "pre-actions" for validation. So in one test class I have, for example, 5 validations (tests) and a set of pre-actions.
I noticed that if a suite of tests fails at the pre-action stage, TeamCity doesn't display these tests in its report at all (not under any status).
In the build log I see an error like:
SetUp Error : {test_name} + error code.
What I expect from TeamCity is to report these tests at least as Ignored.
To compare running the tests in TeamCity with running them in Visual Studio: in Visual Studio, the same failure condition results in a failure for the whole test suite, and the failure error is the same for all the tests.
So what I want is just to know whether some of my tests were not run at all, because if TeamCity doesn't include them in the test results, I don't even know about the problem!
Configs: TeamCity 10.0, NUnit 3.0.
Command line params: --result=TestResult.xml --workers=4 --teamcity
Update: the test execution results in the log look like:
[13:03:48][Step 1/1] Test Run Summary
[13:03:48][Step 1/1] Overall result: Failed
[13:03:48][Step 1/1] Tests run: 82, Passed: 0, Errors: 82, Failures: 0, Inconclusive: 0
[13:03:48][Step 1/1] Not run: 0, Invalid: 0, Ignored: 0, Explicit: 0, Skipped: 0
[13:03:48][Step 1/1] Start time: 2016-09-08 09:56:33Z
[13:03:48][Step 1/1] End time: 2016-09-08 10:03:48Z
[13:03:48][Step 1/1] Duration: 434,948 seconds
So NUnit marks such tests not even as failed but as "errors". I still want them in the test results.
Your tests are errors because you are throwing an exception in the constructor. Since the test fixture can't be constructed, the test is not really being run as far as NUnit is concerned. The fact that it's an NUnit assertion failure causing the exception is irrelevant in the context of constructing the object.
We have always advised people to keep their constructors very simple because NUnit makes no guarantees about when and how often your object will be constructed. Using assertions in the constructor is an extreme violation of that principle and, in fact, I've never seen anyone do it before.
The OneTimeSetUp attribute is there if you want something to happen every time your test is run, as opposed to constructed. NUnit does make guarantees about when that method will be executed. :-)
None of this tells me for sure why TC is not recognizing the error but I'm guessing it's because once the constructor fails, the tests are never actually run. NUnit itself compensates for that by reporting the tests as errors but TC would not necessarily do the same.
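To illustrate that advice, here is a minimal sketch (the fixture and helper names are hypothetical) of moving the pre-action assertions out of the constructor and into a [OneTimeSetUp] method, where NUnit guarantees when they run:

using NUnit.Framework;

[TestFixture]
public class MyValidationTests          // hypothetical fixture name
{
    // Keep the constructor trivial: NUnit gives no guarantees about
    // when or how often the fixture object is constructed.
    public MyValidationTests()
    {
    }

    [OneTimeSetUp]
    public void RunPreActions()
    {
        // Hypothetical pre-actions: this runs exactly once before any
        // test in the fixture, and a failing assertion here is reported
        // against the fixture's tests rather than as a construction error.
        Assert.That(PrepareTestData(), Is.Not.Null);
    }

    [Test]
    public void Validation1()
    {
        Assert.Pass();
    }

    private object PrepareTestData() => new object();   // hypothetical helper
}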

nunit-console hangs in a remote PowerShell session after finishing the test run

I'm running NUnit tests using the following PS line:
& 'D:\tools\nunit-console.exe' D:\MyApplication.Specs.dll /run=MyApplication.MyTest /framework=net-4.5 /nothread
This line is working fine.
After that, I run the same test in a remote session on the same PC with the following line:
Invoke-Command -ComputerName MyHost -ScriptBlock { & 'D:\tools\nunit-console.exe' D:\MyApplication.Specs.dll /run=MyApplication.MyTest /framework=net-4.5 /nothread } -credential MyUser
After the test execution finishes, nunit-console hangs with the following output:
NUnit-Console version 2.6.3.13283
Copyright (C) 2002-2012 Charlie Poole.
Copyright (C) 2002-2004 James W. Newkirk, Michael C. Two, Alexei A. Vorontsov.
Copyright (C) 2000-2002 Philip Craig.
All Rights Reserved.
Runtime Environment -
OS Version: Microsoft Windows NT 6.1.7601 Service Pack 1
CLR Version: 2.0.50727.5477 ( Net 3.5 )
ProcessModel: Default DomainUsage: Single
Execution Runtime: net-4.5
Selected test(s): MyApplication.MyTest
.
Tests run: 1, Errors: 0, Failures: 0, Inconclusive: 0, Time: 6.96245315227513 seconds
Not run: 0, Invalid: 0, Ignored: 0, Skipped: 0
I noticed an nunit-agent process in Task Manager. If I kill this process, the test execution finishes successfully.
The tests were running in a separate, manually created ASP.NET application domain via Remoting. The solution is to unload this app domain once all tests have run. A global NUnit TearDown method is useful in this case :)
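A minimal sketch of that teardown, assuming NUnit 2.x (where a SetUpFixture placed outside any namespace runs its [SetUp]/[TearDown] methods once per assembly) and a hypothetical static field holding the manually created AppDomain:

using System;
using NUnit.Framework;

[SetUpFixture]
public class AssemblyLevelSetup
{
    // Hypothetical holder for the manually created ASP.NET AppDomain
    // that the tests talk to via Remoting.
    public static AppDomain TestHostDomain;

    [TearDown]   // in an NUnit 2.x SetUpFixture this runs once, after all tests
    public void UnloadTestHostDomain()
    {
        if (TestHostDomain != null)
        {
            // Unloading the extra domain lets nunit-agent shut down,
            // so the remote PowerShell invocation no longer hangs.
            AppDomain.Unload(TestHostDomain);
            TestHostDomain = null;
        }
    }
}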

Perl 5.16.3.1 win32 Devel::REPL module-build-tiny fails to install

I am using Strawberry Perl win32 version 5.16.3.1 on Windows 7 and am trying to install the Devel::REPL module, which seems to use Module::Build::Tiny, and both fail. From the install I get:
---- ETHER/Devel-REPL-1.003025.tar.gz ---
Module::Build::Tiny [build_requires]
Running make test
Make had some problems, won't test
Delayed until after prerequisites
Running make install
Make had some problems, won't install
Delayed until after prerequisites
Running install for module 'Module::Build::Tiny'
...
Test Summary Report
-------------------
t/simple.t (Wstat: 1280 Tests: 21 Failed: 0)
Non-zero exit status: 5
Parse errors: No plan found in TAP output
Files=3, Tests=22, 14 wallclock secs ( 0.13 usr + 0.06 sys = 0.19 CPU)
Result: FAIL
LEONT/Module-Build-Tiny-0.028.tar.gz
C:\strawberry\perl\bin\perl.exe ./Build test -- NOT OK
//hint// to see the cpan-testers results for installing this module, try:
reports LEONT/Module-Build-Tiny-0.028.tar.gz
Running Build install
make test had returned bad status, won't install without force
Stopping: 'install' failed for 'Module::Build::Tiny'.
Failed during this command:
LEONT/Module-Build-Tiny-0.028.tar.gz : make_test NO
I tried running force, but it still fails at the first simple.t test; a dialog opens to say that Perl has stopped running.
Thoughts on what I need to do to get this to work? Thanks.
Problem solved. There was a conflict with a MinGW installation that appeared in the PATH before Strawberry's own MinGW environment. I changed the PATH order and it works fine now.

WWW::Mechanize Perl module install errors

I am attempting to install the WWW::Mechanize module on my XAMPP server. I have copied the test results that were displayed at the end of the install here:
Test Summary Report
-------------------
t\local\back.t (Wstat: 256 Tests: 47 Failed: 1)
Failed test: 33
Non-zero exit status: 1
t\local\click_button.t (Wstat: 0 Tests: 19 Failed: 0)
TODO passed: 15-17, 19
Files=51, Tests=554, 203 wallclock secs ( 0.51 usr + 0.11 sys = 0.62 CPU)
Result: FAIL
Failed 1/51 test programs. 1/554 subtests failed.
NMAKE : fatal error U1077: 'C:\Windows\system32\cmd.exe' : return code '0xff'
Stop.
PETDANCE/WWW-Mechanize-1.66.tar.gz
nmake.exe test -- NOT OK
//hint// to see the cpan-testers results for installing this module, try:
reports PETDANCE/WWW-Mechanize-1.66.tar.gz
Running make install
make test had returned bad status, won't install without force
Failed during this command:
PETDANCE/WWW-Mechanize-1.66.tar.gz : make_test NO
Can anyone tell me why there are so many errors, and has WWW::Mechanize installed in spite of them?
Test 33 in t/local/back.t (in WWW::Mechanize 1.66 anyway) appears to test for 404s on a local loopback HTTP server, created just for testing. It apparently received a different message than expected for the test.
I'd say you'd be fine installing it and ignoring that one test, since the other 553 seem to have succeeded. You can do it by hand by downloading WWW::Mechanize (or going to your CPAN build directory if you know where it is), and running
perl Makefile.PL
make
make test # just try it again, it might have been transient
make install
You may also wish to visit the WWW::Mechanize bug tracker and report this as a bug, if it persists; it would be nice.
EDIT: More on this issue in the bug tracker, so apparently no need to report it. No fix yet, though.
Some time ago I analyzed this failure. At least to me, it does not seem to be a bug in WWW::Mechanize.
Have a look at:
1/ my LWP (HTTP::Daemon) bug report - https://rt.cpan.org/Public/Bug/Display.html?id=62354
2/ my Socket (gethostbyaddr) bug report - http://rt.perl.org/rt3/Ticket/Display.html?id=78364
--
kmx
P.S. Sorry, I can post just one link.

How can I install HTML-Parser-3.64 on Perl 5.6?

I am trying to install HTML-Parser-3.64 and I get the following report while running make test:
PERL_DL_NONLAZY=1 /home/Perl/bin/perl -Iblib/arch -Iblib/lib -I/home/Perl/5.6.1-nothread/lib/perl5/5.6.1/sun4.SVR4 -I/home/Perl/5.6.1-nothread/lib/perl5/5.6.1 -e 'use Test::Harness qw(&runtests $verbose); $verbose=0; runtests @ARGV;' t/*.t
t/api_version........ok
t/argspec-bad........ok
t/argspec............ok
t/argspec2...........ok
t/attr-encoded.......ok
t/callback...........ok
t/case-sensitive.....ok
t/cases..............ok
t/comment............ok
t/crashme............ok
t/declaration........ok
t/default............ok
t/document...........ok
t/dtext..............ok
t/entities...........ok 1/17
t/entities...........NOK 17# Failed test (t/entities.t at line 74)
# got: 'Attention Home&#959&#969n&#1257rs...1&#1109t T&#1110&#1084e E&#957&#1257&#1075'
# expected: 'Attention HomeοÏnÓ©rs...1Ñt TÑмe Eνөг'
# Looks like you failed 1 test of 17.
t/entities...........dubious
Test returned status 1 (wstat 256, 0x100)
DIED. FAILED test 17
Failed 1/17 tests, 94.12% okay
t/entities2..........ok
t/filter-methods.....ok
t/filter.............ok
t/handler-eof........ok
t/handler............ok
t/headparser-http....ok
t/headparser.........ok
4/15 skipped: Need Unicode support
t/ignore.............ok
t/largetags..........ok
t/linkextor-base.....ok
t/linkextor-rel......ok
t/magic..............ok
t/marked-sect........ok
t/msie-compat........ok
t/offset.............ok
t/options............ok
t/parsefile..........ok
t/parser.............ok
t/plaintext..........ok
t/pod................skipped
all skipped: Test::Pod 1.00 required for testing POD
t/process............ok
t/pullparser.........ok
t/script.............ok
t/skipped-text.......ok
t/stack-realloc......ok
t/textarea...........ok
t/threads............skipped
all skipped: Not configured for threads
t/tokeparser.........ok
t/uentities..........ok 1/26# Looks like you planned 26 tests but ran 1 extra.
t/uentities..........dubious
Test returned status 1 (wstat 256, 0x100)
DIED. FAILED test 27
Failed 1/26 tests, 96.15% okay (less 27 skipped tests: -2 okay, -7.69%)
t/unbroken-text......ok
t/unicode-bom........ok
2/2 skipped: This perl does not support Unicode
t/unicode............skipped
all skipped: This perl does not support Unicode
t/xml-mode...........ok
Failed Test Stat Wstat Total Fail Failed List of Failed
-------------------------------------------------------------------------------
t/entities.t 1 256 17 1 5.88% 17
t/uentities.t 1 256 26 1 3.85% 27
3 tests and 33 subtests skipped.
Failed 2/48 test scripts, 95.83% okay. 0/338 subtests failed, 100.00% okay.
*** Error code 11
make: Fatal error: Command failed for target `test_dynamic'
I have Perl version 5.6.1 and version 3.64 of HTML::Parser, on Solaris 8.
I have seen a similar error reported by another person on the internet at http://www.cpantesters.org/cpan/report/6653478, but could not find any reply or solution for it.
If you are comfortable with the other test results, and you don't expect to use HTML::Parser for pages which require extended character sets, then you can force an install.
From command-line:
$ perl -MCPAN -e 'force install HTML::Parser'
From CPAN command line:
cpan> force install HTML::Parser
From your build directory (if you'd rather not use CPAN):
mv t/entities.t t/entities.tt
mv t/uentities.t t/uentities.tt
make install
You need to report it to the HTML::Parser bug tracker. CPAN Testers results are only reports, in many cases automatic, so to get the author's attention you need to use the bug tracker.
P.S. Upgrading is also a good idea (if you have no problems doing so); Perl has changed a lot in the 9 years since 5.6.1.