How to pass multiple arguments to dotCover merge - powershell

I am writing a PowerShell command to merge two snapshots as follows:
&$coveragTool merge /Source= $TestResult1;$TestResult2 /Output= TestMergeOutput.dcvr
It gives this error:
Parameter 'Source' has invalid value.
Invalid volume separator char ':' (0x3A) in path at index 67.
whereas the documentation says the two files should be separated by a semicolon (;), like this:
merge: Merge several coverage snapshots
usage: dotCover merge|m <parameters>
Valid parameters:
--Source=ARG : (Required) List of snapshots separated with semicolon (;)
--Output=ARG : (Required) File name for the merged snapshot
--TempDir=ARG : (Optional) Directory for the auxiliary files. Set to system temp by default
Global parameters:
--LogFile=ARG : (Optional) Enables logging and allows specifying a log file name
--UseEnvVarsInPaths=ARG : (Optional) [True|False] Allows using environment variables (for example, %TEMP%) in paths. True
by default
How do I make this work?

You cannot pass an unquoted ; as part of an argument, because PowerShell interprets it as a statement separator.
Either enclose the argument in "...", or escape the ; character selectively with a backtick (`); also, the space after = may or may not be a problem, so the calls below remove it as well.
To make the call (at least syntactically) succeed, use the following:
& $coveragTool merge /Source="$TestResult1;$TestResult2" /Output=TestMergeOutput.dcvr
Alternatively, escaping just the ; with a backtick:
& $coveragTool merge /Source=$TestResult1`;$TestResult2 /Output=TestMergeOutput.dcvr
PowerShell has more so-called metacharacters than cmd.exe, notably ( ) , { } ; # $ @ in addition to & | < > - see this answer for additional information.
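If the quoting at the call site gets hard to read, another option (a sketch reusing the variables above) is to build the semicolon-separated value in a variable first and pass each token as its own quoted argument:
# Build the value up front so the quoting stays explicit
$sources = "$TestResult1;$TestResult2"
& $coveragTool merge "/Source=$sources" "/Output=TestMergeOutput.dcvr"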

Related

How to Abort Informatica workflow if source file has only header row

I have a requirement to check whether the source file, which is in CSV format, contains only the header row (i.e. exactly 1 row); if so, I need to fail the Informatica workflow. Informatica is installed on a Windows server, so only the Command task is supported, not Unix or bash.
I am using the code below in a Command task in the workflow to count the lines in the source file.
for /f "usebackq" %%b in (`type $$outputfile ^| find "" /v /c`) do (
    echo line count is %%b> $$count_file.txt
)
Here the $$outfile and $$count_file paths and file names are picked up from param files.
There is an ABORT() function that you can use in an Expression transformation.
Create a dummy column and put a Sorter and an Aggregator right after the Source Qualifier. In the Aggregator, get a count of all rows and then join it back to the main flow. After the Joiner, put an Expression transformation with the condition below:
IIF( cnt_all > 1, NULL, ABORT( 'Only header exists in the input file! Session will be aborted.'))
The whole mapping should look like this:
SQ -- EXP(add dummy_col) -->SRT on dummy_col -->AGG on dummy_col, calculate Count(*)->|
|--------------------------------------------> JNR on dummy_col -->EXP (abort if count <=1) --> existing mapping logic...
EDIT:
From the Command task, you can call pmcmd abortworkflow when your condition is met. The normal syntax is:
pmcmd abortworkflow -service service -user username -password password -f folder workflow
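If PowerShell is available on that Windows server, here is a minimal sketch of what the Command task could run; the file paths, folder, service name, and credentials are placeholders, and the pmcmd syntax is the one shown above:
# Placeholder paths; in practice they come from the $$outputfile and $$count_file parameters.
$lines = (Get-Content 'C:\infa\srcfile.csv' | Measure-Object -Line).Lines
Set-Content 'C:\infa\count_file.txt' "line count is $lines"
if ($lines -le 1) {
    # Only the header row exists: abort the workflow via pmcmd.
    & pmcmd abortworkflow -service service -user username -password password -f folder workflow
}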

Executing sql file with sqlplus in windows 10 powershell

I have created a .bat file to export a CSV file regularly through Windows Task Scheduler, which works fine.
But it does not work when I switch to PowerShell. It returns (both in the ISE and when right-clicking the .ps1 and choosing "Run with PowerShell"):
SQL*Plus: Release 19.0.0.0.0 - Production on Sun May 2 14:05:52 2021
Version 19.10.0.0.0
Copyright (c) 1982, 2020, Oracle. All rights reserved.
ERROR: ORA-12154: TNS:could not resolve the connect identifier specified
So I'm not sure what I'm doing wrong. The variable inputs are dummies.
My .bat contains:
SET NLS_LANG=.AL32UTF8
SET hostIp="123.123.1.12"
SET username="user1"
SET password="pass1"
SET port="1521"
SET service="myDBname"
SET sqlPath="C:\My script\TEST_EXPORT.sql"
sqlplus %username%/%password%@%hostIp%:%port%/%service% @"%sqlPath%"
My .ps1 contains:
cls
$hostIp="123.123.1.12"
$username="user1"
$password="pass1"
$port="1521"
$service="myDBname"
$sqlPath="C:\My script\TEST_EXPORT.sql"
echo "$username/$password@$hostIp`:$port/$service @$sqlPath"
sqlplus "$username/$password@$hostIp`:$port/$service @$sqlPath"
Try using composite formatting to build the parameter string. The upside is that you can build the string without worrying about quoting issues. Note that there is no need to escape the colon in the format string, as it is not interpreted as a scope qualifier there.
# A variable that contains a double quote
$quote = '"'
$("{0}/{1}@{2}:{3}/{4} @{5}{6}{5}" -f $username, $password, $hostIp, $port, $service, $quote, $sqlPath, $quote)
user1/pass1@123.123.1.12:1521/myDBname @"C:\My script\TEST_EXPORT.sql"
Another alternative for building complex strings is string interpolation. Here are three versions that use different techniques to include double quotes. The same works in composite formatting too.
# Doubled-double-quote version. I'd avoid this, as multiple double quotes are hard to read
"${username}/${password}@${hostIp}:${port}/${service} @""${sqlPath}"""
user1/pass1@123.123.1.12:1521/myDBname @"C:\My script\TEST_EXPORT.sql"
# Backtick-escape version
"${username}/${password}@${hostIp}:${port}/${service} @`"${sqlPath}`""
user1/pass1@123.123.1.12:1521/myDBname @"C:\My script\TEST_EXPORT.sql"
# Quote-in-a-variable version
"${username}/${password}@${hostIp}:${port}/${service} @${quote}${sqlPath}${quote}"
user1/pass1@123.123.1.12:1521/myDBname @"C:\My script\TEST_EXPORT.sql"
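Finally, a minimal sketch of the actual sqlplus call, assuming the variables above; the connect identifier and the script reference are passed as two separate arguments, and PowerShell quotes the script argument automatically because the path contains a space:
# Build the connect identifier with -f, then pass the script as its own @-prefixed argument.
$connect = '{0}/{1}@{2}:{3}/{4}' -f $username, $password, $hostIp, $port, $service
& sqlplus $connect "@$sqlPath"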

How to replace variables of JSON file in Team Services?

I'm stuck on release variable substitution for an Angular project. I have a settings.json file in which I would like to replace some variables:
{
"test": "variable to replace"
}
I tried to find a custom task on the Marketplace, but all of the tasks seem to work only with XML files such as web.config.
I use the "Replace Tokens" task from the Marketplace: https://marketplace.visualstudio.com/items?itemName=qetza.replacetokens
You define the desired values as variables in the release definition, then add the Replace Tokens task and configure a wildcard path for all target text files in your repository where you want values replaced (for example: **/*.json). The token that gets replaced has a configurable prefix and suffix (the defaults are '#{' and '}#'). So if you have a variable named constr, you can put this in your config.json:
{
"connectionstring": "#{constr}#"
}
and it will deploy the file like
{
"connectionstring": "server=localhost,user id=admin,password=secret"
}
The IIS Web App Deploy task in VSTS releases has JSON variable substitution under File Transforms & Variable Substitution Options.
Provide a list of JSON files and JSONPath expressions for the variables that need replacing.
For example, to replace the value of 'ConnectionString' in the sample below, define a variable named 'Data.DefaultConnection.ConnectionString' in the build/release definition (or in the release definition's environment).
{
  "Data": {
    "DefaultConnection": {
      "ConnectionString": "Server=(localdb)\SQLEXPRESS;Database=MyDB;Trusted_Connection=True"
    }
  }
}
You can add a variable on the release Variables tab, and then use a PowerShell task to update the content of your settings.json.
Assume the original content is
{
"test": "old"
}
And you want to change it to
{
"test": "new"
}
You can replace the variable in the JSON file with the steps below:
1. Add a variable
Define a variable on the release Variables tab with the value you want to replace with (variable test with value new).
2. Add a PowerShell task
Settings for the PowerShell task:
Type: Inline Script
Inline Script:
# System.DefaultWorkingDirectory is a path like C:\_work\r1\a, so specify where your appsettings.json lives.
$path = "$(System.DefaultWorkingDirectory)\buildName\drop\WebApplication1\src\WebApplication1\appsettings.json"
(Get-Content $path) -replace "old", "$(test)" | Out-File $path
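If the file is valid JSON, a more robust sketch is to parse it instead of doing a plain text replace; this assumes the same placeholder path and a top-level test property, with $(test) still being the release variable from step 1:
# Parse the JSON, update the property, and write it back.
$path = "$(System.DefaultWorkingDirectory)\buildName\drop\WebApplication1\src\WebApplication1\appsettings.json"
$json = Get-Content $path -Raw | ConvertFrom-Json
$json.test = "$(test)"   # release variable injected by the task before the script runs
$json | ConvertTo-Json -Depth 10 | Set-Content $path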

OpenDDS Perl Script Setup Throwing Error

Continuing from this SO question.
When following the OpenDDS install guide I attempt to run configure from within the command prompt, but I receive this error output:
C:\Users\Supervisor\Desktop\opendds>C:\Users\Supervisor\Desktop\opendds\configure.cmd
Options:
'compiler' => 'gcc'
'verbose' => 1
host system is: win32
compiler is: gcc
Using ace_src: C:/Users/Supervisor/Desktop/opendds/ACE_wrappers
Using tao_src: C:/Users/Supervisor/Desktop/opendds/ACE_wrappers/TAO
ACE_ROOT/ace/config.h exists, skipping configuration of ACE+TAO
ENV: saving current environment
ENV: Appending ;C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\bin;C:\Users\Supervisor\Desktop\opendds\bin;C:\Users\Supervisor\Desktop\opend
ds\ACE_wrappers\lib;C:\Users\Supervisor\Desktop\opendds\lib to PATH
ENV: Setting ACE_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers
ENV: Setting MPC_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\MPC
ENV: Setting CIAO_ROOT to unused
ENV: Setting TAO_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\TAO
ENV: Setting DDS_ROOT to C:\Users\Supervisor\Desktop\opendds
ENV: Setting DANCE_ROOT to unused
Use of uninitialized value $mpctype in concatenation (.) or string at configure line 1028.
OpenDDS mwc command line: -type C:\Users\Supervisor\Desktop\opendds\DDS_TAOv2_all.mwc
Use of uninitialized value $mpctype in string eq at configure line 1031.
Running MPC to generate project files.
MPC_ROOT was set to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\MPC.
Using .../opendds/ACE_wrappers/bin/MakeProjectCreator/config/MPC.cfg
ERROR: Invalid type: C:\Users\Supervisor\Desktop\opendds\DDS_TAOv2_all.mwc
mwc.pl v4.1.8
Usage: mwc.pl [-global <file>] [-include <directory>] [-recurse]
[-ti <dll | lib | dll_exe | lib_exe>:<file>] [-hierarchy]
[-template <file>] [-relative NAME=VAL] [-base <project>]
[-noreldefs] [-notoplevel] [-static] [-genins] [-use_env]
[-value_template <NAME+=VAL | NAME=VAL | NAME-=VAL>]
[-value_project <NAME+=VAL | NAME=VAL | NAME-=VAL>]
[-make_coexistence] [-feature_file <file name>] [-gendot]
[-expand_vars] [-features <feature definitions>]
[-exclude <directories>] [-name_modifier <pattern>]
[-apply_project] [-version] [-into <directory>]
[-gfeature_file <file name>] [-nocomments]
[-relative_file <file name>] [-for_eclipse]
[-workers <#>] [-workers_dir <dir> | -workers_port <#>]
[-language <cplusplus | csharp | java | vb>]
[-type <automake | bcb2007 | bcb2009 | bds4 | bmake | cc | cdt6 |
cdt7 | em3 | ghs | gnuace | gnuautobuild | html | make |
nmake | rpmspec | sle | vc6 | vc7 | vc8 | vc10 | vc11 |
vc12 | vc14 | vc71 | vc9 | vxtest | wb26 | wb30 | wix>]
[files]
-base Add <project> as a base project to each generated
project file. Do not provide a file extension, the
.mpb extension will be tried first; if that fails the
.mpc extension will be tried.
-exclude Use this option to exclude directories or files when
searching for input files.
-expand_vars Perform direct expansion, instead of performing relative
replacement with either -use_env or -relative options.
-feature_file Specifies the feature file to read before processing.
The default feature file is default.features under the
config directory.
-features Specifies the feature list to set before processing.
-for_eclipse Generate files for use with eclipse. This is only
useful for make based project types.
-gendot Generate .dot files for use with Graphviz.
-genins Generate .ins files for use with prj_install.pl.
-gfeature_file Specifies the global feature file. The
default value is global.features under the
config directory.
-global Specifies the global input file. Values stored
within this file are applied to all projects.
-hierarchy Generate a workspace in a hierarchical fashion.
-include Specifies a directory to search when looking for base
projects, template input files and templates. This
option can be used multiple times to add directories.
-into Place all output files in a mirrored directory
structure starting at <directory>. This should be a
full path. If any project within the workspace is
referenced via a full path, use of this option is
likely to cause problems.
-language Specify the language preference; possible values are
[cplusplus, csharp, java, vb]. The default is
cplusplus.
-make_coexistence If multiple 'make' based project types are
generated, they will be named such that they can coexist.
-name_modifier Modify output names. The pattern passed to this
parameter will have the '*' portion replaced with the
actual output name. Ex. *_Static
-apply_project When used in conjunction with -name_modifier, it applies
the name modifier to the project name also.
-nocomments Do not place comments in the generated files.
-noreldefs Do not try to generate default relative definitions.
-notoplevel Do not generate the top level target file. Files
are still processed, but no top level file is created.
-recurse Recurse from the current directory and generate from
all found input files.
-relative Any $() variable in an mpc file that is matched to NAME
is replaced by VAL only if VAL can be made into a
relative path based on the current working directory.
This option can be used multiple times to add multiple
variables.
-relative_file Specifies the relative file to read before processing.
The default relative file is default.rel under the
config directory.
-static Specifies that only static projects will be generated.
By default, only dynamic projects are generated.
-template Specifies the template name (with no extension).
-workers Specifies number of child processes to use to generate
projects.
-workers_dir The directory for storing temporary output files
from the child processes. The default is '/tmp/mpc'
If neither -workers_dir nor -workers_port is used,
-workers_dir is assumed.
-workers_port The port number for the parent listener. If neither
-workers_dir nor -workers_port is used, -workers_dir
is assumed.
-ti Specifies the template input file (with no extension)
for the specific type (ex. -ti dll_exe:vc8exe).
-type Specifies the type of project file to generate. This
option can be used multiple times to generate multiple
types. There is no longer a default.
-use_env Use environment variables for all uses of $() instead
of the relative replacement values.
-value_project This option allows modification of a project variable
assignment. Use += to add VAL to the NAME's value.
Use -= to subtract and = to override the value.
This can be used to introduce new name value pairs to
a project. However, it must be a valid project
assignment.
-value_template This option allows modification of a template input
name value pair. Use += to add VAL to the NAME's
value. Use -= to subtract and = to override the value.
-version Print the MPC version and exit.
Error from MPC, stopped at configure line 1035.
The cmd script being run is:
@echo off
:: Win32 configure script wrapper for OpenDDS
:: Distributed under the OpenDDS License.
:: See: http://www.opendds.org/license.html
for %%x in (perl.exe) do set PERLPATH=%%~dp$PATH:x
if x%PERLPATH%==x (
echo ERROR: perl.exe was not found. This script requires ActiveState Perl.
exit /b 1
)
set PERLPATH=
perl configure -verbose --compiler=gcc %*
if exist setenv.cmd call setenv.cmd
And the section of configure that generates the error is:
my $mwcargs = "-type $mpctype $buildEnv->{'DDS_ROOT'}$slash$ws $static";
$mwcargs .= ' ' . $opts{'mpcopts'} if defined $opts{'mpcopts'};
print "OpenDDS mwc command line: $mwcargs\n" if $opts{'verbose'};
print 'Running MPC to generate ', ($mpctype eq 'gnuace' ? 'makefiles' :
'project files'), ".\n";
if (!$opts{'dry-run'}) {
if (system("perl $ENV{'ACE_ROOT'}/bin/mwc.pl $mwcargs") != 0) {
die "Error from MPC, stopped";
}
}
Where initial unset variable is set:
my $mpctype = ($slash eq '/') ? 'gnuace' : $opts{'compiler_version'};
I have both Perl and Visual Studio installed. Looking up MPC, I can only find a multi-precision library. Could this be because I am using gcc? I have to use gcc in order to eventually create a library from this code to use with JNI...
You need to make sure that you are using ActiveState Perl on Windows; other Perl variants do not seem to work 100%.
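To check which perl.exe the configure.cmd wrapper will actually pick up from PATH, here is a quick check with standard PowerShell (nothing OpenDDS-specific is assumed):
# Lists every perl.exe on PATH in resolution order; the first entry is what the wrapper finds.
Get-Command perl.exe -All | Select-Object -ExpandProperty Source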

copy task in Cakefile

I am trying to copy all the files in a list of directories into an output directory. The problem is that whenever I use an *, the output says that no file or directory by that name exists. Here is the specific error output:
cp: cannot stat `tagbox/images/*': No such file or directory
cp: cannot stat `votebox/images/*': No such file or directory
If I just put the name of a specific file instead of *, it works.
Here is my Cakefile:
fs = require 'fs'
util = require 'util'
{spawn} = require 'child_process'
outputImageFolder = 'static'
imageSrcFolders = [
'tagbox/images/*'
'votebox/images/*'
]
task 'cpimgs', 'Copy all images from the respective images folders in tagbox, votebox, and omnipost into static folder', ->
for imgSrcFolder in imageSrcFolders
cp = spawn 'cp', [imgSrcFolder, outputImageFolder]
cp.stderr.on 'data', (data) ->
process.stderr.write data.toString()
cp.stdout.on 'data', (data) ->
util.log data.toString()
You are using the * character, probably because that works for you in your shell. Using * and other wildcard characters that expand to match multiple paths is called "globbing", and while your shell does it automatically, most other programs, including node/JavaScript/CoffeeScript, will not do it by default. The cp binary itself doesn't do globbing either, as you are discovering: the shell does the globbing and then passes the list of matching files/directories as arguments to cp.
Look into the node module node-glob to do the globbing and give you back a list of matching files/directories, which you can then pass to cp as arguments. You could also use a filesystem module that has this kind of functionality built in. Note, however, that putting async code directly into a Cakefile can be problematic, as documented here.
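A sketch of how the Cakefile task could look with the node-glob package (assuming its classic callback API; run npm install glob first):
# Expand the glob in the Cakefile, then hand the matching files to cp as separate arguments.
glob = require 'glob'
{spawn} = require 'child_process'
outputImageFolder = 'static'
task 'cpimgs', 'Copy images, expanding the glob ourselves', ->
  glob 'tagbox/images/*', (err, files) ->
    throw err if err
    cp = spawn 'cp', files.concat [outputImageFolder]
    cp.stderr.on 'data', (data) -> process.stderr.write data.toString()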