AEM: Issue using Command Line DAM Workflow

I would like to execute a command line program as a DAM workflow. I tried to implement the ImageMagick example from here: Best Practices for Configuring ImageMagick:
I added a new Workflow Model,
added "command line" from the "DAM Workflow" list.
In the Arguments tab I set the Mime type to "image/jpeg" (I even tried without a Mime type),
and in Commands: "C:\Program Files\ImageMagick-7.0.7-Q16\magick.exe" convert ${file} -flip ${file}-flipped.jpg (instead of magick convert ..., because in another discussion using an absolute path instead of the global name helped people: Re: CommmandLineProcess : ImageMagick).
I then added a launcher and uploaded an image to the DAM.
In the Workflow > Instances overview, I see that the workflow was started, it is running, and the command line job is set to active.
Unfortunately this state never changes, and no new asset is generated via ImageMagick.
I even tried replacing the command with something simple like "ren C:\test\foo.txt bar.txt", which renames a local file. The change never happened either.
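A side note on that test, assuming the command was entered exactly as above: ren is a cmd.exe built-in rather than a standalone executable, so an external process launcher can typically only reach it through the shell, along the lines of:
cmd /c ren C:\test\foo.txt bar.txt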
My question is: what am I doing wrong, and how can I debug this / find the command output? In \crx-quickstart\logs I couldn't find any logs regarding CommandLineProcess.
Thanks

Related

Renaming objects in Google Cloud Storage

I am backing up footage for my video company in Google Cloud Storage. I did a dumb, and my object name has spaces in it and an &. So I need to download this 700 GB project, and PowerShell won't do it, because it treats the text after the & as a command and the whole command fails. What are my options for renaming this object? Or if there is something else I am missing, please let me know. My problem is the & in my naming structure, so if anyone knows of any way to rename the object, that would be fantastic.
Other info: I am downloading using the command that the download button gives me. I just pasted that into my command prompt and got the error "'Joe' is not recognized as an internal or external command, operable program or batch file."
I have tried "gsutil -m mv gs://my_bucket/oldprefix gs://my_bucket/newprefix" with the paths changed to rename the object, but this also fails because of the spaces in the file path.
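As a sketch of both operations (the object name "Joe & Jane Footage" below is a stand-in for the real one): quoting the gs:// URLs keeps the shell from interpreting the spaces and the &, for renaming as well as for downloading:
gsutil -m mv "gs://my_bucket/Joe & Jane Footage" "gs://my_bucket/joe_jane_footage"
gsutil -m cp -r "gs://my_bucket/joe_jane_footage" D:\footage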

AzCopy ignore if source file is older

Is there an option to handle the following situation:
I have a pipeline with a Copy Files task in it, used to upload a static html file from git to a blob. Everything works perfectly. But sometimes I need this file to be changed in blob storage (using hosted application tools). So the question is: can I "detect" whether my git file is older than the target blob file and have the copy task ignore it, leaving it untouched? My initial idea was to use Azure File Copy and its "Optional Arguments" textbox. However, I couldn't find the required option in the documentation. Does it allow such things? Or should this case be handled some other way?
I think you're looking for the ifSourceNewer value for the --overwrite option.
--overwrite string Overwrite the conflicting files and blobs at the destination if this flag is set to true. (default true) Possible values include true, false, prompt, and ifSourceNewer.
More info: azcopy copy - Options
Agree with ickvdbosch. The ifSourceNewer value for the --overwrite option could meet your requirements.
error: couldn't parse "ifSourceNewer" into a "OverwriteOption"
Based on my test, I could reproduce this issue in the Azure File Copy task.
It seems that the ifSourceNewer value cannot be passed to the Overwrite option in the Azure File Copy task.
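If the AzCopy binary bundled with the task predates this value, that would explain the parse error. You can check which AzCopy version a given agent runs with the standard version flag:
azcopy --version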
Workaround: you could use a PowerShell task to run an azcopy script that uploads the files with --overwrite=ifSourceNewer
For example:
azcopy copy "filepath" "BlobURLwithSASToken" --overwrite=ifSourceNewer --recursive
For more detailed info, you could refer to this doc.
For the issue with the Azure File Copy task, I suggest you submit a feedback ticket at the following link: Report task issues.

Export DataStage Job Designs to .dsx file

I am trying to export DataStage job designs with executables. The screenshots below show the options I use to export from the GUI.
These are the two commands I use:
dsexport.exe /h=XX /U=XX /p=XX projectXXX /job=XXX jobname.dsx
dsexport.exe /h=XX /U=XX /p=XX projectXXX /job=XXX /EXEC /APPEND jobname.dsx
The file generated by the commands is bigger than the one from the GUI. Does anyone know how to use the dsexport command to export jobs with the same options as in the GUI screenshots? Much appreciated. I am using Designer V8.5.
JS
C:\IBM\InformationServer\Clients\Classic>dsexport /d={ip address of server} /u={user id} /p={password} /job={job to export} {Project where job is located in} {FileName.dsx}
Try this; it will export a single .dsx file with all the information.
P.S. I am using version 11.3.
As you can see, the GUI excludes some read-only files that are not excluded on the command line; this is why there is a file size difference.
You have "Include Dependent Items" unchecked in the GUI. The command line will include dependent items by default (i.e. shared containers or routines). You can disable this behaviour on the command line by using the /NODEPENDENTS command switch.

Ctools do not show up in pentaho UI

I am using Pentaho CE 5 on Windows. I would like to use CTools, but I can't make them show up in the File -> New menu.
Being behind a proxy, I cannot use the Marketplace plugin, so I have tried a manual installation.
First, I tried to use ctools-installer.sh. I ran the following command in Cygwin (wget and unzip are installed):
./ctools-installer.sh -s /cygdrive/d/Users/[user]/Mes\ Programmes/pentaho/biserver-ce/pentaho-solutions/ -w /cygdrive/d/Users/[user]/Mes\ programmes/pentaho/biserver-ce/tomcat/webapps/pentaho/
The script starts, asks me what module I want to install, and begins the downloads.
For each module, I get output like this (set -x was added to the script):
+ echo -n 'Downloading CDF...'
Downloading CDF...
+ wget -q --no-check-certificate 'http://ci.analytical-labs.com/job/Webdetails-CDF-5-Release/lastSuccessfulBuild/artifact/bi-platform-v2-plugin/dist/zip/dist.zip' -O .tmp/cdf/dist.zip
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
+ '[' '!' -z '' ']'
+ rm -f .tmp/dist/marketplace.xml
+ unzip -o .tmp/cdf/dist.zip -d .tmp
End-of-central-directory signature not found. Either this file is not a zipfile, or it constitutes one disk of a multi-part archive. In the latter case the central directory and zipfile comment will be found on the last disk(s) of this archive.
unzip: cannot find zipfile directory in .tmp/cdf/dist.zip, and cannot find .tmp/cdf/dist.zip.zip, period.
+ chmod -R u+rwx .tmp
+ echo Done
Done
Then the script ends. I have seen on this page (pentaho-bi-suite) that this is the normal output. Nevertheless, it seems a bit strange to me, and when I start my Pentaho server (login: admin/password), I cannot see any new tools in the menus.
After a look at a few other tutorials and at the script itself, I downloaded the .zip snapshots for every tool and unzipped them into the system directory of my Pentaho server. Same result.
I would like to make the .sh script work; what can I try or adjust?
Thanks
EDIT 05/06/2014
I checked the dist.zip files downloaded by the script and they are all empty. It seems that wget cannot fetch the zip files, and therefore the installation fails.
When I try to get any webpage through wget, it fails. I think it is because of the proxy.
Here is my .wgetrc file, located in my user's Cygwin home folder:
use_proxy=on
http_proxy=http://[url]:[port]
https_proxy=http://[url]:[port]
proxy_user=[user]
proxy_password=[password]
How could I make this work?
EDIT 10/06/2014
In the end, I changed my network connection settings to bypass the proxy. It seems that there is an offline mode for the installer, so one can download all the needed files in a proxy-free environment and then run the script offline.
I guess this is related to the -r option.
I consider this post solved, since it is not a CTools issue anymore.
It is difficult to identify the issue in the above procedure, but you can refer to this blog; he is a key member of Pentaho itself.
You can manually install the components from http://www.webdetails.pt/ctools/ or, if you have Pentaho 5.1 or above, you can add the following parameters to the CATALINA_OPTS option (in start-pentaho.bat or start-pentaho.sh):
-Dhttp.proxyHost= -Dhttp.proxyPort= -Dhttp.nonProxyHosts="localhost|127.0.0.1|10.*.*.*"
http://docs.treasuredata.com/articles/pentaho-dataintegration#tips-how-can-i-use-pentaho-through-a-proxy
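As a sketch of what that could look like in start-pentaho.bat (the proxy host and port here are made up for illustration):
set CATALINA_OPTS=%CATALINA_OPTS% -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttp.nonProxyHosts="localhost|127.0.0.1|10.*.*.*"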

Update ClearCase view config spec with command line with changed load rules

I have a base ClearCase snapshot view that is updated automatically overnight from a config spec file, using this command:
cleartool setcs -overwrite -ptime d:\CS.cs
The problem is that the config spec load rules change over time, and when I run the command it asks for confirmation before updating the load rules:
R:\>cleartool setcs -overwrite -ptime d:\CS.cs
cleartool: Warning: 1 objects were eliminated from the new config spec's load rules:
"\QA\QTP"
Continue, and unload these objects? [no]
So, is there a way to tell ClearCase, from the command line, to continue with the update automatically, without asking for confirmation?
As mentioned in "Batch Script to Automate a DOS Program with Options", you could write the expected answer in a file and redirect it to your command:
cleartool setcs -overwrite -ptime d:\CS.cs < yes.txt
That way, if the command stops to wait for input, it gets it immediately.
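For instance, the answer file can be generated on the fly in the same nightly script (the yes.txt name is arbitrary; the prompt above expects "yes" to continue):
echo yes> yes.txt
cleartool setcs -overwrite -ptime d:\CS.cs < yes.txt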
You can find a similar approach in "how to userinput without typing to a batch file".
You should use the "-force" option.
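Assuming setcs accepts a -force option in your ClearCase release, as this answer suggests, the nightly command would then become:
cleartool setcs -force -overwrite -ptime d:\CS.cs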