Determining whether a DB/server is down before I kick off a pipeline - azure-data-factory

I want to check whether the database/server is online before I kick off a pipeline. If the database is down, I want to cancel the pipeline processing. I also would like to log the results in a table.
Format (columns): DBName Status Date
If a DB/server is down, I then want to send an email to the concerned team with a formatted table showing which DBs/servers are down.
Approach:
Run a query on each of the servers. If there is a result, format the output as shown above. I am using an ADF pipeline to achieve this. My issue is how to combine the various outputs from the different servers.
For e.g.
Server1:
DBName: A Status: ONLINE runDate:xx/xx/xxxx
Server2:
DBName: B Status: ONLINE runDate:xx/xx/xxxx
I would like to combine them as follows:
Server DBName Status runDate
1 A ONLINE xx/xx/xxxx
2 B ONLINE xx/xx/xxxx
I would use this to update the logging table, as well as in the email if I were to send one out.
Is this possible using the Pipeline activities or do I have to use mapping dataflows?

I did similar work a few weeks ago. We built an API that holds all the server-related settings and the URL endpoints we need to ping.
You don't need to store the SQL Server username/password at all. When you ping the SQL Server, the request will time out if it isn't online; if it is online, you will get a password-related error instead. Either way you can easily figure out whether it's up and running.
AFAIK, if you are using Azure DevOps you can use your service account to log in to the SQL Server. If you have set up AD login for DevOps, this can be done in the build script.
Both ways you will be able to tell whether the SQL Server is up and running.
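As a rough illustration of that idea, here is a minimal sketch that assumes a plain TCP probe of the SQL port is a good-enough health signal; the host names, port and output format are placeholders, and this only checks that the server is reachable, not the state of an individual database:
import socket
from datetime import date

# Hypothetical server list; replace with your own host names and ports.
SERVERS = {
    "Server1": ("server1.example.com", 1433),
    "Server2": ("server2.example.com", 1433),
}

def check(host, port, timeout=5):
    # A plain TCP connect: success means the server is reachable,
    # a timeout or refusal means it is (probably) down.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "ONLINE"
    except OSError:
        return "OFFLINE"

# Combine the per-server results into the Server / Status / runDate shape;
# the same rows can feed the logging table or the e-mail body.
rows = [(name, check(host, port), date.today().strftime("%m/%d/%Y"))
        for name, (host, port) in SERVERS.items()]
for row in rows:
    print("\t".join(row))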

You can have all the actions as tasks in a YAML pipeline.
You need something like the following:
steps:
  - task: Check database status
    register: result
  - task: Add results to a file
    shell: "echo text >> filename"
  - task: send e-mail
    when: some condition is met
There are several modules that can achieve what you need; you just have to find the right ones. You can control the flow of tasks by registering results and using the when clause.
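If the last two tasks end up in a single script step, the "append to a file, then e-mail only when some condition is met" flow could be sketched roughly like this; the sample results, file path, addresses and SMTP host are all placeholders:
import smtplib
from email.message import EmailMessage

# Placeholder results from the "Check database status" step.
down_servers = [("Server2", "B", "OFFLINE", "01/01/2024")]

# Append the results to a file, mirroring the "Add results to a file" task.
with open("db_status.log", "a") as f:
    for row in down_servers:
        f.write("\t".join(row) + "\n")

# Send an e-mail only when the condition is met (here: something is down).
if down_servers:
    msg = EmailMessage()
    msg["Subject"] = "DB/Server down report"
    msg["From"] = "alerts@example.com"       # placeholder sender
    msg["To"] = "dba-team@example.com"       # placeholder recipient
    msg.set_content("Server\tDBName\tStatus\trunDate\n" +
                    "\n".join("\t".join(row) for row in down_servers))
    with smtplib.SMTP("smtp.example.com") as smtp:  # placeholder SMTP host
        smtp.send_message(msg)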

Related

Overriding JMeter property in Run Taurus task in Azure pipeline is not working

I am running JMeter from Taurus and I need an output kpi.jtl file with the URL listed.
I have tried passing the parameters -o modules.jmeter.properties.save.saveservice.url='true' and
-o modules.jmeter.properties="{'jmeter.save.saveservice.url':'true'}". The pipeline runs successfully, but the kpi.jtl doesn't have the URL. Please help.
I have tried a few more options, like editing jmeter.properties via the pipeline (which broke the pipeline and left it expecting input from the user) and
user.properties, which was ineffective.
I am expecting a kpi.jtl file with all the possible fields logged, especially the URL.
I believe you're using the wrong property; you should pass this one instead:
modules.jmeter.csv-jtl-flags.url=true
More information: CSV file content configuration
However, be aware that having a .jtl file "with all possible logs" is something of a performance anti-pattern, as it creates massive disk I/O and may ruin your test. More information: 9 Easy Solutions for a JMeter Load Test “Out of Memory” Failure

Rundeck - Get in the script the username who executed the job

I would like to know if there is a way to get the username of whoever executed the job and use it inside the job that was executed.
If my question isn't clear, here is an example:
A user 'bobby' connects to Rundeck and executes the job 'lets_go_to_the_moon'.
In the job 'lets_go_to_the_moon' I want to get 'bobby' into a variable and use it, for example to store this information in a database.
PS: I know that this information can be retrieved from the database dedicated to Rundeck, but is there a native variable for getting this kind of information? And if yes, how?
Thank you guys,
Happy day!
You can use the job.username context variable.
Command step format: ${job.username}
Inline-script format: @job.username@
"External" script format: $RD_JOB_USERNAME
If you are dispatching an "external" script to a remote node, take a look at this.
Here you can see all context variables available.
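As a small illustration of the "external" script form, a sketch like the following would capture the user who ran the job; it assumes Python is available on the node and uses a throwaway SQLite file in place of your real database:
import os
import sqlite3
from datetime import datetime

# Rundeck exposes context variables to external scripts as RD_* environment variables.
username = os.environ.get("RD_JOB_USERNAME", "unknown")

# Illustrative audit table; replace with your actual database.
conn = sqlite3.connect("/tmp/job_audit.db")
conn.execute("CREATE TABLE IF NOT EXISTS job_audit (username TEXT, executed_at TEXT)")
conn.execute("INSERT INTO job_audit VALUES (?, ?)",
             (username, datetime.now().isoformat()))
conn.commit()
conn.close()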

Printing the Console output in the Azure DevOps Test Run task

I am doing some initial one-off setup using the [BeforeTestRun] hook for my SpecFlow tests. It checks some users to make sure they exist, and creates them with specific roles and permissions if they don't, so the automated tests can use them. The function that does this prints a lot of useful information via Console.WriteLine.
When I run the tests on my local system I can see the output of this hook function on the main feature file, and the output of each scenario under it. But when I run the tests via an Azure DevOps pipeline, I am not sure where to find the output of [BeforeTestRun], because it is not bound to a particular test scenario. The console of the Run Tests task has no information about this.
Can someone please help me get this output shown somewhere so I can act accordingly?
I tried System.Diagnostics.Debug.Print, System.Diagnostics.Debug.WriteLine and System.Diagnostics.Trace.WriteLine, but nothing seems to show up in the pipeline console.
[BeforeTestRun]
public static void BeforeRun()
{
    Console.WriteLine(
        "Before Test run analyzing the users and their needed properties for performing automation run");
}
I want my output to be visible somewhere so I can act on that information if needed.
It's not possible for the console logs.
The product currently does not support printing console logs for passing tests and we do not currently have plans to support this in the near future.
(Source: https://developercommunity.visualstudio.com/content/problem/631082/printing-the-console-output-in-the-azure-devops-te.html)
However, there's another way:
Your build will have an attachment with the file extension .trx. This is an XML file and contains an Output element for each test (see also https://stackoverflow.com/a/55452011):
<TestRun id="[omitted]" name="[omitted] 2020-01-10 17:59:35" runUser="[omitted]" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
<Times creation="2020-01-10T17:59:35.8919298+01:00" queuing="2020-01-10T17:59:35.8919298+01:00" start="2020-01-10T17:59:26.5626373+01:00" finish="2020-01-10T17:59:35.9209479+01:00" />
<Results>
<UnitTestResult testName="TestMethod1">
<Output>
<StdOut>Test</StdOut>
</Output>
</UnitTestResult>
</Results>
</TestRun>
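If you want to pull that output out programmatically rather than by opening the file, a short sketch along these lines should work; the file name is a placeholder for the downloaded .trx attachment:
import xml.etree.ElementTree as ET

# Namespace used by the VSTest .trx result format shown above.
NS = {"t": "http://microsoft.com/schemas/VisualStudio/TeamTest/2010"}

tree = ET.parse("results.trx")  # path to the downloaded .trx attachment
for result in tree.getroot().findall(".//t:UnitTestResult", NS):
    stdout = result.find("t:Output/t:StdOut", NS)
    if stdout is not None and stdout.text:
        print(result.get("testName"), "->", stdout.text)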

MDS import data queue

I am following this guidance: https://www.mssqltips.com/sqlservertutorial/3806/sql-server-master-data-services-importing-data/
The instructions say after we load data into the staging tables, we go into the MDS integration screen and select "START BATCHES".
Is this a manual step to begin the process, or is there a way to automatically queue up a batch to begin?
Thanks!
Alternative way to run the staging process
After you load the staging table with the required data, call/execute the staging UDP.
Basically, staging UDPs are stored procedures, one for every entity in the MDS database (automatically created by MDS), that follow the naming convention:
stg.udp_<EntityName>_Leaf
You have to provide values for some parameters. Here is sample code showing how to call one:
USE [MDS_DATABASE_NAME]
GO
EXEC [stg].[udp_entityname_Leaf]
    @VersionName = N'VERSION_1',
    @LogFlag = 1,
    @BatchTag = N'batch1',
    @UserName = N'domain\user'
GO
For more details look at:
Staging Stored Procedure (Master Data Services).
Do remember that the @BatchTag value has to match the value that you initially populated in the staging table.
Automating the Staging process
The simplest way to do that would be to schedule a SQL Agent job which executes something like the code above to call the staging UDP.
Please note that you would need to get creative about figuring out how the job will know the correct batch tag.
That said, a lot of developers just create a single SSIS package which loads the data into the staging table (as step 1) and then executes the staging UDP (as the final step).
This SSIS package is then executed through a scheduled SQL Agent job.
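If you would rather drive the staging call from a script instead of (or alongside) SQL Agent/SSIS, the same UDP call can be sketched like this; pyodbc, the connection string, the entity name and the batch tag are assumptions you would adapt:
import pyodbc

# Illustrative connection string; point it at your MDS database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=MDS_DATABASE_NAME;Trusted_Connection=yes;",
    autocommit=True,
)
cursor = conn.cursor()

batch_tag = "batch1"  # must match the @BatchTag written to the staging table
cursor.execute(
    "EXEC [stg].[udp_entityname_Leaf] @VersionName = ?, @LogFlag = ?, @BatchTag = ?",
    ("VERSION_1", 1, batch_tag),
)
conn.close()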

For a NetSuite Map/Reduce script - Why is the map stage failing when called from a Restlet?

In NetSuite, I have a Restlet script that calls a deployed map/reduce script, but the map stage shows as Failed when looking at the details of the status page (the getInputData stage does run and shows as Complete).
However, if I do a "Save and Execute" from the deployment of the map/reduce script, it works fine (the map stage does run).
Note that:
There is no error in the execution log of either the Restlet or the map/reduce script.
I have 'N/task' in the define section of the Restlet, as well as task in the function parameters.
The map/reduce script has the Map checkbox checked. The map/reduce script deployment is not scheduled and has default values for other fields.
Using NetSuite help "See the quick brown fox jump" sample map/reduce script
Using Sandbox account
Using a custom role to post to restlet
Below is the task.create code snippet from my Restlet. I don't know what is wrong; any help is appreciated.
// Create and submit a map/reduce task, then check its status.
var mrTask = task.create({
    taskType: task.TaskType.MAP_REDUCE,
    scriptId: 'customscript_test',
    deploymentId: 'customdeploy_test'
});
var mrTaskId = mrTask.submit();
var taskStatus = task.checkStatus(mrTaskId);
log.debug('taskStatus', taskStatus);
You also need the Documents and Files - View permission, along with SuiteScript - View and SuiteScript Scheduling - Full permissions, to access the map/reduce script.
The help for MapReduceScriptTask.submit() does not mention this but the help for ScheduledScriptTask.submit() does:
Only administrators can run scheduled scripts. If a user event script calls ScheduledScriptTask.submit(), the user event script has to be deployed with admin permissions.
I did a test of posting to my Restlet using Administrator role credentials and it works fine, as opposed to using my other custom role. Maybe, just like ScheduledScriptTask, MapReduceScriptTask can only be called by the Administrator role? My custom role does have SuiteScript - View and SuiteScript Scheduling - Full permissions; I thought that would do the trick, but apparently not.
Does anyone have any further insight on this?