How to prevent pytest from capturing log messages when the -m flag is used

I am using pytest -s to display log messages on the console. It works fine for an individual pytest run, but when used with the -m option it does not:
pytest -s -m sdk
does not output to the console. How do I get it to work?
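No answer is recorded here, but assuming the messages come from Python's logging module, one approach worth trying is pytest's live-log feature (available since pytest 3.4), which writes log records straight to the console independently of capture settings; a minimal sketch:

```ini
# pytest.ini -- enable live logging to the console
[pytest]
log_cli = true
log_cli_level = INFO
```

The same settings can also be passed ad hoc with -o, e.g. pytest -s -m sdk -o log_cli=true.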


Get all dependencies for Par::Packer?

I'm using Par::Packer on OS X and Linux to create binaries for some applications that use Paws.
I'm calling pp like so:
pp -o build_cluster -x -c -I lib/ @ppdeps bin/build_cluster
ppdeps contains a list of modules that caused pp to fail because they weren't detected at build time. The only way that I'm aware of to make this work is to run pp repeatedly, wait for it to fail, then add the module it complains about to ppdeps.
My question is whether there's a better way to do this, or a way to get a full list of dependencies beforehand. I could write a script to automate what I've been doing manually, but if there's a better way, I'd be glad to know it.
ppdeps contents:
-I local/lib/perl5/
-a fakename.conf
-M Paws::Net::Caller
-M Paws::Net::RetryCallerRole
-M Paws::Net::APIResponse
-M Paws::Net::S3Signature
-M Paws::Net::RestXmlCaller
-M Paws::Net::RestXMLResponse
-M Paws::API::Caller
-M Paws::API::EndpointResolver
-M Paws::S3
-M Paws::S3::ListBuckets
-M Paws::S3::ListBucketsOutput
-M Paws::Route53
-M Method::Generate::BuildAll
-M IO::Socket::SSL
-M Net::SSLeay
-M Archive::Zip::ZipFileMember
This list is incomplete; I'm still building it up for this app. The more Paws modules the app includes, the more hidden dependencies pop up.
This has only been an issue with apps that use Paws and, to a lesser extent, Moose.
Here is how I make a build:
pp -o build/run -B -I local/lib/perl5/ script/app.pl
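On getting the full list beforehand: pp is built on Module::ScanDeps, which installs a scandeps.pl helper that reports a script's dependencies without packing it. A sketch, assuming the layout from the build command above (flags per Module::ScanDeps' documentation); note that runtime require() calls, common in Paws and Moose code, can still evade a static scan, which is why the -x variant or a manual -M list may remain necessary:

```shell
# Compile the script (-c) and report every module it pulls in;
# replace -c with -x to actually execute it for a fuller picture.
scandeps.pl -c -V script/app.pl
```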

Missing 'libexpat-1_.dll' error for executable made with pp

I made an exe file with pp under Strawberry Perl, but when I run it on another machine, I get the following error:
The program can't start because libexpat-1__.dll is missing from your computer. Try reinstalling the program to fix this problem.
I make the executable with this command:
pp -M FindBin -M DateTime -M DateTime::Format::JSON::MicrosoftDateFormat -M DateTime::Format::DateParse -M REST::Client -M JSON::XS -M Spreadsheet::ParseExcel -M Spreadsheet::ParseXLSX -M Log::Log4perl::Tiny -o test.exe test.pl
I tried using -a "c:\strawberry\c\bin\libexpat-1_.dll" (didn't help) and -l "c:\strawberry\c\bin\libexpat-1_.dll" ("Can't find shared library.." error).
How can I resolve this issue?
It turned out I had a typo in the DLL name, and using the -l option resolved the issue. Specifying the modules on the command line wasn't necessary either, as pp scans the script for used modules and includes them automatically. Built it with:
pp -l "libexpat-1__.dll" -o test.exe test.pl

"gsutil -m mv" not running parallel transfers

I am moving a large number of files from a local drive into a bucket using "gsutil -m mv", but during the transfer it appears to run only one transfer at a time. I checked top and see only one python process running the command. I have modified both "parallel_process_count" and "parallel_thread_count" in the boto config file and observe no change in transfer behavior. Even when running gsutil with the -m option, I still receive the message below:
"==> NOTE: You are performing a sequence of gsutil operations that may
run significantly faster if you instead use gsutil -m mv ... Please
see the -m section under "gsutil help options" for further information
about when gsutil -m can be advantageous."
Has anyone else run into this issue before?
OS: Centos 6.6
gsutil version: 4.15
python version: 2.6.6
This is a bug in gsutil 4.14-4.15, where the -m flag was not getting propagated correctly for the mv command. It is fixed in gsutil 4.16.
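For reference, the two parallelism settings mentioned in the question live in the [GSUtil] section of the boto config file (typically ~/.boto); the values below are illustrative, not recommendations:

```ini
[GSUtil]
parallel_process_count = 4
parallel_thread_count = 5
```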

Can't capture output and errors from isql in a shell script

I am trying to capture the output and any errors thrown by a SQL script executed via isql. I've tried the following command, but it doesn't work:
isql -q -i e:\test\script.sql e:\db\testdb.fdb -m -u sysuser -p xxx
I am looking for an example of how to use the -m[erge_stderr] option.
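No answer is recorded here, but for reference, Firebird's isql merges its error messages into the normal output stream with -m, and -o sends that single stream to a file. A sketch using the paths from the question (untested); the second half demonstrates the same merge with plain shell redirection, which also works if you'd rather keep isql's defaults:

```shell
# Hypothetical invocation: -m merges error messages into standard output,
# -o writes that single stream to a log file.
#   isql -q -m -o e:\test\run.log -i e:\test\script.sql e:\db\testdb.fdb -u sysuser -p xxx
# The same effect with POSIX redirection, using a stand-in for isql:
isql_standin() { echo "RECORD 1"; echo "SQL ERROR" >&2; }
isql_standin > run.log 2>&1   # both streams land in run.log
cat run.log
```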

How to perform logging with gsutil rsync

What's the proper way to log any errors or warnings when performing a quiet rsync?
This is what I currently run from my crontab:
gsutil -m -q rsync -r -C /mount1/share/folder gs://my-bucket-1/folder/ > /mount2/share/folder/gsutil.log
Since the log file is always completely empty and I'm uploading terabytes of data, I'm starting to think that even errors and warnings are being suppressed.
After realizing that this comes down to how stdout and stderr are redirected to files in general, the answer really lies in this existing thread: How to redirect both stdout and stderr to a file
So a simple solution to log as much as possible into one single log file could be something like:
gsutil -d rsync [src] [dst] &> [logfile]
...where -d enables debug output. I found this to be the only way to show files which were affected by an error such as CommandException: 3 files/objects could not be copied. Please note that -d exposes authentication credentials.
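One caveat worth adding, since the command runs from crontab: '&>' is a bash-ism, and cron typically invokes /bin/sh, where it silently misparses. A sketch of the portable form, using a stand-in command in place of gsutil:

```shell
# In POSIX sh, 'cmd &> file' parses as 'cmd &' then '> file', so the
# output is discarded and the file is merely truncated. Portable form:
sh -c '{ echo "copied file"; echo "CommandException: demo" >&2; } > gsutil_demo.log 2>&1'
cat gsutil_demo.log
```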