How to Abort Informatica workflow if source file has only header row - command-line

I have a requirement wherein I need to check whether the source file, which is in CSV format, contains only the header row, i.e. exactly 1 row; if so, I need to fail the Informatica workflow. Informatica is installed on a Windows server, so only the Command task is supported, not Unix/bash scripts.
I am using the below code in a Command task in the workflow to count the lines in the source file.
for /f "usebackq" %%b in (type $$outputfile ^| find "" /v /c)do (
echo line count is %%b> $$count_file.txt
)
)
Here the $$outputfile and $$count_file paths and file names are picked up from the parameter file.

There is an ABORT() function that you can use in expression transformation.
Create a dummy column and put a Sorter and an Aggregator right after the Source Qualifier. In the Aggregator, get a count of all the data and then join it back to the main flow. After the Joiner, put an Expression transformation with the below condition:
IIF( cnt_all > 1, NULL, ABORT( 'Only header exists in the input file! Session will be aborted.'))
The whole mapping should look like this:
SQ -- EXP(add dummy_col) -->SRT on dummy_col -->AGG on dummy_col, calculate Count(*)->|
|--------------------------------------------> JNR on dummy_col -->EXP (abort if count <=1) --> existing mapping logic...
EDIT:
From the Command task, you can call pmcmd abortworkflow when your condition is satisfied. The normal syntax is below:
pmcmd abortworkflow -service service -user username -password password -f folder workflow
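For example, a Command task script along these lines could tie the line count to the abort (the service, user, folder, and workflow names here are placeholders, and the exact pmcmd connection options depend on your environment):
@echo off
:: Count the lines in the source file (path resolved from the parameter file).
for /f "usebackq" %%b in (`type $$outputfile ^| find /v /c ""`) do set LINECOUNT=%%b
:: Only the header row present: abort the workflow.
if %LINECOUNT% LEQ 1 (
    pmcmd abortworkflow -service MyIntegrationService -user Administrator -password secret -f MyFolder wf_my_workflow
)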

Related

db2 how to configure external tables using extbl_location, extbl_strict_io

Could you please give an example of how to set up the extbl_location and extbl_strict_io configuration parameters, and an INSERT example? I need to create an external table and upload data to it.
I created the table like this:
CREATE EXTERNAL TABLE textteacher(ID int, Name char(50), email varchar(255)) USING ( DATAOBJECT 'teacher.csv' FORMAT TEXT CCSID 1208 DELIMITER '|' REMOTESOURCE 'LOCAL' SOCKETBUFSIZE 30000 LOGDIR '/tmp/logs' );
and tried to upload data to it.
insert into textteacher (ID,Name,email) select id,name,email from teacher;
and got this exception: [428IB][-20569] The external table operation failed due to a problem with the corresponding data file or diagnostic files. File name: "teacher.csv". Reason code: "1".. SQLCODE=-20569, SQLSTATE=428IB, DRIVER=4.26.14
If I understand the documentation correctly, the extbl_location parameter should point to the directory where the data will be saved. I suppose the full path would look like:
$extbl_location+'/'+teacher.csv
I found some documentation about the error:
https://www.ibm.com/support/pages/how-resolve-sql20569n-error-external-table-operation
I tried to run this command in the Docker command line:
/opt/ibm/db2/V11.5/bin/db2 get db cfg | grep -i external
but it returns no information about any external table parameters.
CREATE EXTERNAL TABLE statement:
file-name
...
When both the REMOTESOURCE option is set to LOCAL (this is its default value) and the extbl_strict_io configuration parameter is set
to NO, the path to the external table file is an absolute path and
must be one of the paths specified by the extbl_location configuration
parameter. Otherwise, the path to the external table file is relative
to the path that is specified by the extbl_location configuration
parameter followed by the authorization ID of the table definer. For
example, if extbl_location is set to /home/xyz and the authorization
ID of the table definer is user1, the path to the external table file
is relative to /home/xyz/user1/.
So, if you use a relative path to a file such as teacher.csv, you must set extbl_strict_io to YES.
For an unload operation, the following conditions apply:
If the file exists, it is overwritten.
Required permissions:
If the external table is a named external table, the owner must have read and write permission for the directory of this file.
If the external table is transient, the authorization ID of the statement must have read and write permission for the directory of this file.
Moreover, inside the directory specified by extbl_location you must create a sub-directory named after the username (in lowercase) that owns the table, and ensure that this user (not the instance owner) has read/write permission on that sub-directory.
Update:
To set this up, presuming that user1 runs this INSERT statement:
sudo mkdir -p /home/xyz/user1
# user1 must have an ability to cd to this directory
sudo chown user1:$(id -gn user1) /home/xyz/user1
db2 connect to mydb
db2 update db cfg using extbl_location /home/xyz extbl_strict_io YES
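Once the new configuration values are in effect (you may need to reconnect, or deactivate and reactivate the database, for them to apply), the original DDL and INSERT should work as-is. A sketch, assuming the relative path resolves to /home/xyz/user1/teacher.csv as described above:
-- relative DATAOBJECT path, resolved under extbl_location/<definer> for user1
CREATE EXTERNAL TABLE textteacher(ID int, Name char(50), email varchar(255))
  USING (DATAOBJECT 'teacher.csv' FORMAT TEXT CCSID 1208 DELIMITER '|'
         REMOTESOURCE 'LOCAL' SOCKETBUFSIZE 30000 LOGDIR '/tmp/logs');
-- the unload writes /home/xyz/user1/teacher.csv (overwritten if it already exists)
INSERT INTO textteacher (ID, Name, email) SELECT id, name, email FROM teacher;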

How to pass multiple arguments to dotCover merge

I am writing a PowerShell command to merge two snapshots as follows:
&$coveragTool merge /Source= $TestResult1;$TestResult2 /Output= TestMergeOutput.dcvr
It is giving this error:
Parameter 'Source' has invalid value.
Invalid volume separator char ':' (0x3A) in path at index 67.
whereas the documentation says the two files should be separated by a semicolon (;), like this:
merge: Merge several coverage snapshots
usage: dotCover merge|m <parameters>
Valid parameters:
--Source=ARG : (Required) List of snapshots separated with semicolon (;)
--Output=ARG : (Required) File name for the merged snapshot
--TempDir=ARG : (Optional) Directory for the auxiliary files. Set to system temp by default
Global parameters:
--LogFile=ARG : (Optional) Enables logging and allows specifying a log file name
--UseEnvVarsInPaths=ARG : (Optional) [True|False] Allows using environment variables (for example, %TEMP%) in paths. True
by default
How do I make it correct?
You cannot pass an unquoted ; as part of an argument, because PowerShell interprets it as a statement separator.
Either enclose the argument in "...", or `-escape the ; character selectively; also, the space after = may or may not be a problem.
To make the call (at least syntactically) succeed, use the following:
& $coveragTool merge /Source="$TestResult1;$TestResult2" /Output=TestMergeOutput.dcvr
Alternatively (note the `, ignore the broken syntax highlighting):
& $coveragTool merge /Source=$TestResult1`;$TestResult2 /Output=TestMergeOutput.dcvr
PowerShell has more so-called metacharacters than cmd.exe, notably ( ) , { } ; # $ in addition to & | < > - see this answer for additional information.
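A quick way to see what actually gets passed as the argument (the paths below are made up):
$TestResult1 = 'C:\cover\run1.dcvr'
$TestResult2 = 'C:\cover\run2.dcvr'
# With the quotes, both paths travel inside a single /Source= argument:
Write-Output /Source="$TestResult1;$TestResult2"
# -> /Source=C:\cover\run1.dcvr;C:\cover\run2.dcvr
# Without the quotes, PowerShell would treat everything after the ';' as a new statement.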

CMake collect sources from folders using batch script (without GLOB)

I want to collect all the source or header files from a specified folder, also matching a certain naming convention. I don't want to use globbing, and I also couldn't find any examples of an approach using only CMake.
One answer to this question suggests using ls *.cpp in CMakeLists.txt. So I thought of getting a list of sources by invoking a batch script from CMakeLists.
But something is wrong. Though the output seems to be totally correct, CMake cannot find those files. The path is (visually) correct: if I manually type it into add_executable, generation succeeds.
While I still want to know how to achieve the initial intent, I am extremely confused as to why seemingly identical strings compare as not equal:
CMake log:
-- Manually-typed: C:/Repos/cmake-scanner/src/main.cpp
-- Recieved-batch: C:/Repos/cmake-scanner/src/main.cpp
-- Path strings not identical
CollectSources.bat
@echo off
set arg1=%1
set arg2=%2
powershell -Command "$path = '%1'.Replace('\','/'); $headers = New-Object Collections.Generic.List[string]; ls -Name $path/*.%2 | foreach-object{ $headers.Add($path + '/' + $_)}; $headers"
CMakeLists.txt
cmake_minimum_required(VERSION 3.12 FATAL_ERROR)
project(Auto-scanner)
set(HEADERS)
set(SOURCES)
if(WIN32)
    execute_process(
        COMMAND CMD /c ${CMAKE_CURRENT_SOURCE_DIR}/CollectSources.bat ${CMAKE_CURRENT_SOURCE_DIR}/include h
        OUTPUT_VARIABLE res
    )
    message(STATUS "Found headers: ${res}")
    execute_process(
        COMMAND CMD /c ${CMAKE_CURRENT_SOURCE_DIR}/CollectSources.bat ${CMAKE_CURRENT_SOURCE_DIR}/src cpp
        OUTPUT_VARIABLE res2
    )
    message(STATUS "Found sources: ${res2}")
    set(${HEADERS} ${res})
endif(WIN32)
message(STATUS "Collected headers: ${HEADERS}")
message(STATUS "Manually-typed: C:/Repos/cmake-scanner/src/main.cpp")
message(STATUS "Recieved-batch: ${res2}")
if(NOT "C:/Repos/cmake-scanner/src/main.cpp" STREQUAL "${res2}")
    message(STATUS "Path strings not identical")
else()
    message(STATUS "Path strings are identical")
endif()
add_executable(${PROJECT_NAME}
    ${res}
    ${res2}
)
target_include_directories(${PROJECT_NAME}
    PRIVATE
        ${CMAKE_CURRENT_SOURCE_DIR}/include
        ${CMAKE_CURRENT_SOURCE_DIR}/src
)
and project tree:
cmake-scanner
|-include
| |-IPublicA.h
| |-IPublicB.h
| |-IPublicC.h
| |-IPublicD.h
|-src
|-main.cpp
https://github.com/ElDesalmado/cmake-scanner.git
UPDATE
Comparing the strings by length yields different results, so I thought maybe there are some trailing characters in the output of execute_process.
So I removed the trailing newlines that might actually prevent CMake from finding the source files:
string(REGEX REPLACE "\n$" "" ...)
Now they compare equal; however, the files still could not be located by CMake.
I had some luck using OUTPUT_STRIP_TRAILING_WHITESPACE in execute_process: main.cpp was finally located and the project generated. But when there are 2 or more sources this doesn't help.
I am going to try outputting the sources' names on a single line and see what happens...
I have solved the issue.
CMake accepts lists of sources formatted such that the source paths are separated by semicolons.
So the solution was to modify the batch script to output a single line of semicolon-separated file names. Later I will update the repo and provide the batch code.
In order for CMake to recognize the output from the batch script as a list of source/header files, it must not contain any trailing characters such as whitespace or newlines, and the file paths must be separated with semicolons:
path-to-headerA.h;path-to-headerB.h;path-to-headerC.h;
(It is OK if there is a semicolon at the end of the line - CMake accepts that.)
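To see why this works: in CMake a semicolon-separated string already is a list, so once the trailing whitespace is stripped the variable can be iterated or passed to add_executable directly. A tiny illustration (the paths are invented):
set(res "C:/proj/src/a.cpp;C:/proj/src/b.cpp")
list(LENGTH res n)            # n is 2: each path is a separate list element
foreach(src ${res})
    message(STATUS "${src}")  # prints each path on its own line
endforeach()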
Working solution
CollectSources.bat
@echo off
set arg1=%1
set arg2=%2
powershell -Command "$path = '%1'.Replace('\','/'); $headers = ''; get-childitem $path/*.%2 | select-object -expandProperty Name | foreach-object{ $headers += ($path + '/' + $_ + ';')}; Write-output $headers"
CollectSources.cmake
#Collect source files from a given folder
set(DIR_OF_CollectSources_CMAKE ${CMAKE_CURRENT_LIST_DIR})
function(CollectSources path ext ret)
    message(STATUS "Collecting sources *.${ext} from ${path}")
    execute_process(
        COMMAND CMD /c ${DIR_OF_CollectSources_CMAKE}/CollectSources.bat ${path} ${ext}
        OUTPUT_VARIABLE res
        OUTPUT_STRIP_TRAILING_WHITESPACE
    )
    message(STATUS "Sources collected:")
    foreach(src ${res})
        message(${src})
    endforeach()
    set(${ret} "${res}" PARENT_SCOPE)
endfunction()
usage in CMakeLists.txt:
include(CollectSources)
CollectSources(${CMAKE_CURRENT_SOURCE_DIR}/include h HEADERS)
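The collected lists can then be fed to the target in the usual way, for example (collecting the .cpp files the same way and using the target name from the question):
CollectSources(${CMAKE_CURRENT_SOURCE_DIR}/src cpp SOURCES)
add_executable(${PROJECT_NAME} ${HEADERS} ${SOURCES})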
Example:
https://github.com/ElDesalmado/cmake-scanner.git
CMake output:
-- Collecting sources *.h from C:/Repos/cmake-scanner/include
-- Sources collected:
C:/Repos/cmake-scanner/include/IPublicA.h
C:/Repos/cmake-scanner/include/IPublicB.h
C:/Repos/cmake-scanner/include/IPublicC.h
C:/Repos/cmake-scanner/include/IPublicD.h
-- Collecting sources *.cpp from C:/Repos/cmake-scanner/src
-- Sources collected:
C:/Repos/cmake-scanner/src/lib.cpp
C:/Repos/cmake-scanner/src/main.cpp

Multi line command (to export .csv) not working in Apache Drill (web interface)

I am trying to use Apache Drill to export a .csv file. This other question indicated that this is achieved by:
use dfs.tmp;
alter session set `store.format`='csv';
create table dfs.tmp.my_output as select * from cp.`employee.json`;
I tried running this block (of three commands) at once in the Apache Drill web interface but got the error below. It somehow is not recognizing the ; or not accepting multiple commands.
I also tried running each line separately, without the ;, but the changes made by the first two commands did not persist (and the export command (3rd command) defaulted back to exporting a Parquet file, the configured default).
How can I run this in Drill?
Query Failed: An Error Occurred
org.apache.drill.common.exceptions.UserRemoteException: PARSE ERROR: Encountered ";" at line 1, column 12. Was expecting one of: <EOF> "." ... "[" ... SQL Query use dfs.tmp; ^ alter session set `store.format`='csv'; create table dfs.tmp.`elos_cnis` as select * from dfs.tmp.`/bases_parquet/elos_cnis` [Error Id: 00493fbe-924e-43e9-a684-f7d1abfed04e on sbsb35.ipea.gov.br:31010] (org.apache.calcite.sql.parser.SqlParseException) Encountered ";" at line 1, column 12. Was expecting one of: <EOF> "." ... "[" ... org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.convertException():391 org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.normalizeException():121 org.apache.calcite.sql.parser.SqlParser.parseStmt():149 org.apache.drill.exec.planner.sql.SqlConverter.parse():157 org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():104 org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():79 org.apache.drill.exec.work.foreman.Foreman.runSQL():1017 org.apache.drill.exec.work.foreman.Foreman.run():289 java.util.concurrent.ThreadPoolExecutor.runWorker():1142 java.util.concurrent.ThreadPoolExecutor$Worker.run():617 java.lang.Thread.run():748 Caused By (org.apache.drill.exec.planner.sql.parser.impl.ParseException) Encountered ";" at line 1, column 12. Was expecting one of: <EOF> "." ... "[" ... org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.generateParseException():17963 org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.jj_consume_token():17792 org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.SqlStmtEof():861 org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.parseSqlStmtEof():180 org.apache.drill.exec.planner.sql.parser.impl.DrillParserWithCompoundIdConverter.parseSqlStmtEof():59 org.apache.calcite.sql.parser.SqlParser.parseStmt():142 org.apache.drill.exec.planner.sql.SqlConverter.parse():157 org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():104 org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():79 org.apache.drill.exec.work.foreman.Foreman.runSQL():1017 org.apache.drill.exec.work.foreman.Foreman.run():289 java.util.concurrent.ThreadPoolExecutor.runWorker():1142 java.util.concurrent.ThreadPoolExecutor$Worker.run():617 java.lang.Thread.run():748
The Drill Web-UI does not support submitting multiple queries within the same query page. Please try using SQLLine, or submit them in the Web-UI one by one:
alter system set `store.format`='csv';
This query sets store.format at the system level, since the Web-UI does not store session options by default. After that, submit the following query:
create table dfs.tmp.my_output as select * from cp.`employee.json`;
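Alternatively, if you have shell access to a Drillbit, the original three-statement script can be run in one go through SQLLine. A sketch, assuming embedded mode (zk=local), a script file of your own, and that your SQLLine build accepts --run (on Windows use sqlline.bat):
-- export_csv.sql
use dfs.tmp;
alter session set `store.format`='csv';
create table dfs.tmp.my_output as select * from cp.`employee.json`;
Then, from the Drill installation directory:
bin/sqlline -u "jdbc:drill:zk=local" --run=export_csv.sql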

OpenDDS Perl Script Setup Throwing Error

Continuing from this SO question.
When following the OpenDDS install guide I attempt to run configure from within the command prompt, but receive this output and error:
C:\Users\Supervisor\Desktop\opendds>C:\Users\Supervisor\Desktop\opendds\configure.cmd
Options:
'compiler' => 'gcc'
'verbose' => 1
host system is: win32
compiler is: gcc
Using ace_src: C:/Users/Supervisor/Desktop/opendds/ACE_wrappers
Using tao_src: C:/Users/Supervisor/Desktop/opendds/ACE_wrappers/TAO
ACE_ROOT/ace/config.h exists, skipping configuration of ACE+TAO
ENV: saving current environment
ENV: Appending ;C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\bin;C:\Users\Supervisor\Desktop\opendds\bin;C:\Users\Supervisor\Desktop\opend
ds\ACE_wrappers\lib;C:\Users\Supervisor\Desktop\opendds\lib to PATH
ENV: Setting ACE_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers
ENV: Setting MPC_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\MPC
ENV: Setting CIAO_ROOT to unused
ENV: Setting TAO_ROOT to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\TAO
ENV: Setting DDS_ROOT to C:\Users\Supervisor\Desktop\opendds
ENV: Setting DANCE_ROOT to unused
Use of uninitialized value $mpctype in concatenation (.) or string at configure line 1028.
OpenDDS mwc command line: -type C:\Users\Supervisor\Desktop\opendds\DDS_TAOv2_all.mwc
Use of uninitialized value $mpctype in string eq at configure line 1031.
Running MPC to generate project files.
MPC_ROOT was set to C:\Users\Supervisor\Desktop\opendds\ACE_wrappers\MPC.
Using .../opendds/ACE_wrappers/bin/MakeProjectCreator/config/MPC.cfg
ERROR: Invalid type: C:\Users\Supervisor\Desktop\opendds\DDS_TAOv2_all.mwc
mwc.pl v4.1.8
Usage: mwc.pl [-global <file>] [-include <directory>] [-recurse]
[-ti <dll | lib | dll_exe | lib_exe>:<file>] [-hierarchy]
[-template <file>] [-relative NAME=VAL] [-base <project>]
[-noreldefs] [-notoplevel] [-static] [-genins] [-use_env]
[-value_template <NAME+=VAL | NAME=VAL | NAME-=VAL>]
[-value_project <NAME+=VAL | NAME=VAL | NAME-=VAL>]
[-make_coexistence] [-feature_file <file name>] [-gendot]
[-expand_vars] [-features <feature definitions>]
[-exclude <directories>] [-name_modifier <pattern>]
[-apply_project] [-version] [-into <directory>]
[-gfeature_file <file name>] [-nocomments]
[-relative_file <file name>] [-for_eclipse]
[-workers <#>] [-workers_dir <dir> | -workers_port <#>]
[-language <cplusplus | csharp | java | vb>]
[-type <automake | bcb2007 | bcb2009 | bds4 | bmake | cc | cdt6 |
cdt7 | em3 | ghs | gnuace | gnuautobuild | html | make |
nmake | rpmspec | sle | vc6 | vc7 | vc8 | vc10 | vc11 |
vc12 | vc14 | vc71 | vc9 | vxtest | wb26 | wb30 | wix>]
[files]
-base Add <project> as a base project to each generated
project file. Do not provide a file extension, the
.mpb extension will be tried first; if that fails the
.mpc extension will be tried.
-exclude Use this option to exclude directories or files when
searching for input files.
-expand_vars Perform direct expansion, instead of performing relative
replacement with either -use_env or -relative options.
-feature_file Specifies the feature file to read before processing.
The default feature file is default.features under the
config directory.
-features Specifies the feature list to set before processing.
-for_eclipse Generate files for use with eclipse. This is only
useful for make based project types.
-gendot Generate .dot files for use with Graphviz.
-genins Generate .ins files for use with prj_install.pl.
-gfeature_file Specifies the global feature file. The
default value is global.features under the
config directory.
-global Specifies the global input file. Values stored
within this file are applied to all projects.
-hierarchy Generate a workspace in a hierarchical fashion.
-include Specifies a directory to search when looking for base
projects, template input files and templates. This
option can be used multiple times to add directories.
-into Place all output files in a mirrored directory
structure starting at <directory>. This should be a
full path. If any project within the workspace is
referenced via a full path, use of this option is
likely to cause problems.
-language Specify the language preference; possible values are
[cplusplus, csharp, java, vb]. The default is
cplusplus.
-make_coexistence If multiple 'make' based project types are
generated, they will be named such that they can coexist.
-name_modifier Modify output names. The pattern passed to this
parameter will have the '*' portion replaced with the
actual output name. Ex. *_Static
-apply_project When used in conjunction with -name_modifier, it applies
the name modifier to the project name also.
-nocomments Do not place comments in the generated files.
-noreldefs Do not try to generate default relative definitions.
-notoplevel Do not generate the top level target file. Files
are still processed, but no top level file is created.
-recurse Recurse from the current directory and generate from
all found input files.
-relative Any $() variable in an mpc file that is matched to NAME
is replaced by VAL only if VAL can be made into a
relative path based on the current working directory.
This option can be used multiple times to add multiple
variables.
-relative_file Specifies the relative file to read before processing.
The default relative file is default.rel under the
config directory.
-static Specifies that only static projects will be generated.
By default, only dynamic projects are generated.
-template Specifies the template name (with no extension).
-workers Specifies number of child processes to use to generate
projects.
-workers_dir The directory for storing temporary output files
from the child processes. The default is '/tmp/mpc'
If neither -workers_dir nor -workers_port is used,
-workers_dir is assumed.
-workers_port The port number for the parent listener. If neither
-workers_dir nor -workers_port is used, -workers_dir
is assumed.
-ti Specifies the template input file (with no extension)
for the specific type (ex. -ti dll_exe:vc8exe).
-type Specifies the type of project file to generate. This
option can be used multiple times to generate multiple
types. There is no longer a default.
-use_env Use environment variables for all uses of $() instead
of the relative replacement values.
-value_project This option allows modification of a project variable
assignment. Use += to add VAL to the NAME's value.
Use -= to subtract and = to override the value.
This can be used to introduce new name value pairs to
a project. However, it must be a valid project
assignment.
-value_template This option allows modification of a template input
name value pair. Use += to add VAL to the NAME's
value. Use -= to subtract and = to override the value.
-version Print the MPC version and exit.
Error from MPC, stopped at configure line 1035.
The cmd script being run is:
@echo off
:: Win32 configure script wrapper for OpenDDS
:: Distributed under the OpenDDS License.
:: See: http://www.opendds.org/license.html
for %%x in (perl.exe) do set PERLPATH=%%~dp$PATH:x
if x%PERLPATH%==x (
echo ERROR: perl.exe was not found. This script requires ActiveState Perl.
exit /b 1
)
set PERLPATH=
perl configure -verbose --compiler=gcc %*
if exist setenv.cmd call setenv.cmd
And the section of configure that generates the error is:
my $mwcargs = "-type $mpctype $buildEnv->{'DDS_ROOT'}$slash$ws $static";
$mwcargs .= ' ' . $opts{'mpcopts'} if defined $opts{'mpcopts'};
print "OpenDDS mwc command line: $mwcargs\n" if $opts{'verbose'};
print 'Running MPC to generate ', ($mpctype eq 'gnuace' ? 'makefiles' :
'project files'), ".\n";
if (!$opts{'dry-run'}) {
  if (system("perl $ENV{'ACE_ROOT'}/bin/mwc.pl $mwcargs") != 0) {
    die "Error from MPC, stopped";
  }
}
And the place where the initially unset variable is set:
my $mpctype = ($slash eq '/') ? 'gnuace' : $opts{'compiler_version'};
I have both Perl and Visual Studio installed. Looking up MPC, I can only find a multi-precision library. Could this be because I am using gcc? I have to use GCC because I eventually need to create a library from this code to use with JNI...
You need to make sure that you are using ActiveState Perl on Windows; other Perl variants seem not to work 100%.
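As a quick sanity check before re-running configure, you can confirm which perl.exe is first on the PATH (that is the one the script's for %%x in (perl.exe) lookup will pick up):
where perl
perl -v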