Rundeck - Optionally execute step depending on SQL query result

I'm very new to Rundeck and I was asked to add something (a step, or I really don't know what) that checks whether the data for a job is up to date (a SQL script can give me a date or a parameter). Depending on this condition, the job should run or not.
I've been searching the web for days but haven't found an answer. Could you please help me?

On Rundeck Enterprise you can use the Ruleset Workflow Strategy to decide whether a job or step runs based on an option or data value (check this example related to data values) generated from your SQL query.
On the Community version, the best approach is to use an inline-script step that calls the job which contains your step, using the Rundeck CLI (rd) or the Rundeck API (via cURL). Check this example about how to call a job from an inline-script step using the Rundeck CLI.
You can write a script like:
# Rundeck expands @data.your_data_value@ before the script runs
if [ "@data.your_data_value@" = "available" ]; then
  # the data value says the data is current: trigger the downstream job
  rd run -p YourProjectName -j YourJobToCall
else
  echo "Not available, nothing to do"
fi
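If you prefer the Rundeck API route over the CLI, the same check can trigger the job over HTTP with cURL. This is only a minimal sketch: the URL, API version, job UUID and the RD_TOKEN environment variable are placeholders for your own instance.
# All values below are placeholders; export RD_TOKEN with a valid API token first
RUNDECK_URL="https://rundeck.example.com"
JOB_ID="aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
if [ "@data.your_data_value@" = "available" ]; then
  # POST /api/<version>/job/<id>/run starts a new execution of the job
  curl -s -X POST -H "X-Rundeck-Auth-Token: $RD_TOKEN" \
    "$RUNDECK_URL/api/40/job/$JOB_ID/run"
else
  echo "Not available, nothing to do"
fi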
Of course, if you don't want to call another job, you can create an inline-script step with all the logic inside, something like:
if [ "@data.your_data_value@" = "available" ]; then
  : # put your step logic here
else
  echo "Not available, nothing to do"
fi
So, the idea is to create a first step with your SQL call which generates the data (check this too; a sketch of such a step follows), and then a second step that reads that data and makes the decision in an inline-script step.
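For reference, a minimal sketch of what that first SQL step could look like as an inline script, assuming a PostgreSQL database queried with psql (swap in your own SQL client, connection settings and query) and a Key Value Data log filter attached to the step, configured to capture the echoed key=value line so it becomes @data.your_data_value@ in later steps:
# Inline script for the first step: run the query, then echo a key=value line
# that the step's Key Value Data log filter captures as @data.your_data_value@.
# psql, the connection details, table and query below are placeholders.
result=$(psql -h db.example.com -U reporter -d mydb -t -A \
  -c "SELECT CASE WHEN max(load_date) >= current_date THEN 'available' ELSE 'stale' END FROM daily_load")
echo "your_data_value=${result}"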

Related

IBM RQM: execute/run the test case execution record (TCER) via POST operation using REST API

We are planning to create the test cases drafted in an Excel file using the RQMExcelWordExport tool. It is a tedious task to manually update the test case execution result status, and it is also quite difficult to review the execution history of the same test case.
Can anyone help us with the following:
How do we get all the test case execution records (TCERs) for a given test plan id via a REST GET?
How do we execute/run a test case execution record (TCER) via a POST operation in the REST API,
or can we update the TCER status through the RQMExcelWordExport tool?
Regards,
Sujata

Azure Logic Apps error: The response is not in a JSON format

I'm trying to execute a simple step from Azure Logic Apps to get the pipeline run statistics; the pipeline in question calls Logic Apps from a Web activity:
However, I'm receiving an error and I don't understand what exactly the step expects as input here:
Could you please assist in resolving the above?
You should not pass your Run ID in via an HTTP request, because the Run ID changes every time you run the pipeline.
Use the Create a pipeline run action first, then pass the run ID from that action's output to the Get a pipeline run action, as sketched below.
You can refer to this question.
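As a rough illustration, assuming the default action name and that the connector exposes the run identifier as runId in its output body (check the run history for the exact property name), the Run ID field of the Get a pipeline run action could reference the earlier action like this:
@body('Create_a_pipeline_run')?['runId']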
It seems that some file-identification logic needs to be added in your case:
you need to take the output body of the JSON file in the next block.

How to fail Azure Data Factory pipeline based on IF Task

I have a pipeline built in Azure Data Factory. It has:
a "Lookup" activity with a SQL query that returns a column [CountRecs]. This column holds a value of 0 or more.
an "If Condition" activity to check the returned value. I want to fail the pipeline when [CountRecs] > 0.
Is this possible?
You could achieve this by adding a Web activity that runs when your If Condition is true ([CountRecs] > 0); the Web activity should call the REST API below to cancel the pipeline run, using the pipeline run ID (you can get this value with the dynamic expression @pipeline().RunId).
Sample dynamic expression for the condition: @greater(activity('LookupTableRecordCount').output.firstRow.COUNTRECS, 0)
REST API to Cancel the Pipeline Run: POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelineruns/{runId}/cancel?api-version=2018-06-01
MS Doc related to Rest API: ADF Pipeline Runs - Cancel
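A hedged sketch of the Web activity URL as a dynamic expression: the subscription ID and resource group are placeholders, while pipeline().DataFactory and pipeline().RunId are ADF system variables. The call is typically authenticated with the factory's managed identity (granted rights on the factory) against the https://management.azure.com/ resource.
@concat('https://management.azure.com/subscriptions/<subscriptionId>',
        '/resourceGroups/<resourceGroupName>',
        '/providers/Microsoft.DataFactory/factories/', pipeline().DataFactory,
        '/pipelineruns/', pipeline().RunId,
        '/cancel?api-version=2018-06-01')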
Another possible way is to put an invalid URL in your Web activity, which will fail the Web activity; that in turn fails the If Condition activity, which in turn causes your pipeline to fail.
There is an existing feature request for this requirement in the ADF user voice forum, suggested by other ADF users. I would recommend you up-vote and/or comment on that feedback, which will help raise the priority of the feature request.
ADF User voice feedback related to this requirement: https://feedback.azure.com/forums/270578-data-factory/suggestions/38143873-a-new-activity-for-cancelling-the-pipeline-executi
Hope this helps.
As a sort-of hack solution, you can create a "Set variable" activity whose value expression performs a division by zero when a certain condition is met. I don't like it, but it works.
@string(
    div(
        1,
        if(
            greater(int(variables('date_diff')), 100),
            0,
            1
        )
    )
)

Trigger update does not work when creating a new request

I am new to Oracle DB. When I create a new request in Clarity (a project & portfolio management application), or when I change the status of a request, I would like to update the field status to the new value of mb_status_idea.
The following trigger code works well in the update case, but if I create a new request it does not update the status (so status is not equal to Status MB).
IF (:old.mb_status_idea != :new.mb_status_idea)
THEN
  UPDATE inv_investments a
     SET a.status = stat
   WHERE a.id = :new.id;
END IF;
I think the problem is that when creating a new request, :OLD contains no values (the trigger fires for an INSERT), so the condition is false and the status is not updated.
Note: the field status is in the table INV_INVESTMENTS (stat := :new.mb_status_idea), and the database column for Status MB is mb_status_idea.
I also added the condition or (:old.mb_status_idea is null), but again, when I create a new request, the values of "Status" and "Status MB" are different (the status is not updated).
I would appreciate it if someone could help me overcome this problem.
All ideas are highly appreciated,
Mona
With Clarity it is recommended not to use triggers, for a couple of reasons: jobs and processes may change the values of some fields at times other than when edits happen through the application, and you can't control that; triggers can't be used if you use CA hosting services; and triggers have to be removed for upgrades, because the upgrade process breaks them.
For this type of action I would recommend using the process engine. You can set up a process to run any time the field is updated. The update can be performed by a custom script or a system action. The system action is fairly straightforward to configure. If you use a custom script, there are examples in the admin bookshelf documentation: you would write a SQL update statement and put it in a GEL script.
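That said, if a trigger is kept anyway (for example in a non-hosted development environment), the reason the original condition never fires on INSERT is that :old.mb_status_idea is NULL there. A minimal sketch of a null-safe condition, reusing the question's column and table names inside a row-level INSERT OR UPDATE trigger, could look like:
-- Inside a row-level AFTER INSERT OR UPDATE trigger on the custom request table
IF INSERTING
   OR (UPDATING AND (:old.mb_status_idea IS NULL
                     OR :old.mb_status_idea != :new.mb_status_idea)) THEN
  UPDATE inv_investments a
     SET a.status = :new.mb_status_idea
   WHERE a.id = :new.id;
END IF;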

Can't create user with name like "DOMAIN\name" in Transact-SQL scenario

I have an MS SQL Server instance with some jobs running on it every day. I can't get one statement to work correctly in these jobs:
CREATE USER "DOMAIN\username"
The job fails at this step with a "syntax error" message. But if I run the same code manually in a query window, everything works perfectly. The job also works fine when executing something like:
CREATE USER "username"
without the backslash.
The specified domain and user names are correct and exist. A login for "DOMAIN\username" has already been created and appears in the list of server logins.
All manipulations were performed via SQL Server Management Studio 2008 R2.
Please help me solve this issue.
You need square brackets:
CREATE USER [DOMAIN\username]
In practice it's safer to use [ and ] (rather than ") for all identifiers because of the vagaries of SET QUOTED_IDENTIFIER: double quotes only delimit identifiers when that setting is ON, and when it is OFF (as it apparently is in the job's session) "DOMAIN\username" is parsed as a string literal, which produces the syntax error.
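For example, assuming the Windows login already exists (as the question states; the CREATE LOGIN line is shown only for completeness), the job step can use bracketed identifiers throughout:
-- Only needed if the Windows login does not exist yet
CREATE LOGIN [DOMAIN\username] FROM WINDOWS;
-- Bracketed identifiers are parsed the same way regardless of SET QUOTED_IDENTIFIER
CREATE USER [DOMAIN\username] FOR LOGIN [DOMAIN\username];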