Enterprise Autoscale Application Block (WASABi): <scale> up by a variable amount

I was looking at the WASABi documentation and I am confused about a particular aspect of this library.
I need to create a custom reactive rule. Say this rule runs every minute, and the "scale" action of the rule should scale up by "x" amount. It seems as though I can set the "scale" action to a particular number (say 1 or 2), but not pass in a variable computed by, say, my custom operand.
I understand that I can create a custom operand to check my condition, but I want the custom operand to compute how much the "scale" action should scale the target Worker Role by and then pass this value to the "scale" action.
Is there some way to define these rules outside the XML to achieve this?
Any help would be greatly appreciated!

Actions can increment or decrement the instance count by a fixed number or by a proportion. So if you want a dynamic increment or decrement, I think you will need to create a custom action. I think you can pull the information you need out of the IRuleEvaluationContext.
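As a rough sketch of that idea (hedged: the base class and override below are my assumptions about the WASABi extensibility points, so verify them against the Autoscaling Application Block documentation before relying on them):
// Hedged sketch only: ReactiveRuleAction and the GetResults signature are
// assumptions, not verified API; the intent is just to show where "x" would
// be computed.
public class ScaleByComputedAmountAction : ReactiveRuleAction
{
    public override IEnumerable<RuleEvaluationResult> GetResults(
        ReactiveRule rule, IRuleEvaluationContext context) // assumed signature
    {
        // Pull what you need from the evaluation context to compute "x"...
        int x = ComputeDelta(context); // ComputeDelta: your own logic
        // ...then apply it, e.g. by updating the deployment configuration
        // with the management-API snippet shown below.
        ApplyScaleChange("MyWorkerRole", x); // hypothetical helper
        return Enumerable.Empty<RuleEvaluationResult>();
    }
}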
To change the instance count you will need to change the deployment configuration. See https://social.msdn.microsoft.com/forums/azure/en-US/dbbf14d1-fd40-4aa3-8c65-a2424702816b/few-question-regarding-changing-instance-count-programmatically?forum=windowsazuredevelopment&prof=required for some discussion.
You should be able to do that using the Azure Management Libraries for .NET and the ComputeManagementClient. Something like:
using (ComputeManagementClient client = new ComputeManagementClient(credentials))
{
    // Fetch the current deployment configuration (.cscfg) for the slot
    var response = await client.Deployments.GetBySlotAsync(serviceName, slot);
    XDocument config = XDocument.Parse(response.Configuration);

    // Change the config: for example, bump the Instances count for a role
    // ("MyWorkerRole" and newInstanceCount are illustrative)
    XNamespace ns = "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration";
    var instances = config.Root.Elements(ns + "Role")
        .Where(r => (string)r.Attribute("name") == "MyWorkerRole")
        .Select(r => r.Element(ns + "Instances"))
        .Single();
    instances.SetAttributeValue("count", newInstanceCount);

    // Serialize the modified configuration back to a string
    StringBuilder builder = new StringBuilder();
    using (TextWriter writer = new StringWriter(builder))
    {
        config.Save(writer);
    }
    string newConfig = builder.ToString();

    await client.Deployments.BeginChangingConfigurationBySlotAsync(
        serviceName, slot, new DeploymentChangeConfigurationParameters(newConfig));
}
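One caveat, as far as I know: BeginChangingConfigurationBySlotAsync only kicks off the long-running configuration change. If you need to block until it completes, the management libraries also expose a corresponding ChangeConfigurationBySlotAsync method that tracks the operation for you; verify the exact name against the library version you are using.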

mirth connect Database Reader automatic column mapping

Could somebody please confirm the following:
I am using Mirth Connect 3.5.08232.
My Source Connector is a Database Reader.
Say I am using a query that returns multiple rows, and I return the result (via JavaScript) as the documentation suggests, so that Mirth treats each row as a separate message. I also use a couple of Mapper steps as source transformers and save the mapped fields in my channel map (which ends up containing only those fields that I define in the transformers).
In the destination, and specifically in the destination response transformer (or the destination body, if it is a JavaScript Writer), how do I access the source fields?
The only way I found, by trial and error, is:
var rawMsg = connectorMessage.getRawData();
var xmlMsg = new XML(rawMsg);
logger.info(xmlMsg.some_field); // ignore the root element of rawMsg
Is this the right way to do this? I thought that maybe the fields that were so nicely auto-detected would be put in some kind of map, like the sourceMap, but that doesn't seem to be the case, right?
Thank you
If you are using Mapper steps in your transformer to extract the data and put it into a variable map (like the channel map), then you can use any of the following methods to retrieve it from a subsequent JavaScript context (including a JavaScript Writer, and your response transformer):
var value = channelMap.get('key');
var value = $c('key');
var value = $('key');
Look at the Variable Maps section of the User Guide for more information.
So to recap, say you're selecting a column "mycolumn" with a Database Reader. The XML sent to the channel will be something like this:
<result>
    <mycolumn>value</mycolumn>
</result>
Then you can choose to extract pieces of that message into specific variables for later use. The transformer allows you to easily drag-and-drop pieces of the sample inbound message.
Finally, in your JavaScript Writer (or in any subsequent filter, transformer, or response transformer), you can drag the mapped variable into the field you want, and the corresponding JavaScript code is inserted automatically.
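For the "mycolumn" example above, the inserted code would be a variable-map lookup along these lines (a sketch; the exact snippet Mirth generates may differ by version):
var mycolumn = $('mycolumn');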
One last note: if you are selecting a lot of columns and don't want to make a Mapper step for each one individually, you can use a JavaScript step to iterate through the message and put each column into a separate map variable:
for each (child in msg.children()) {
channelMap.put(child.localName(), child.toString());
}
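Note that the for each syntax is E4X, which the Rhino engine Mirth uses for its JavaScript contexts supports; it iterates over the child elements of the inbound XML message, so each column becomes one channel map entry.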
Or, you can just reference the columns directly from within the JavaScript Writer:
var msg = new XML(connectorMessage.getEncodedData());
var column1 = msg.column1.toString();
var column2 = msg.column2.toString();
...

Does model.getProperty() return a live object for objects that are members of an array?

I get an object from within an array in my model (a JSONModel), which is:
{
    "task": [
        {
            "dbid": 465,
            "bk_cnt": 11
        }, {
            "dbid": 472,
            "bk_cnt": 16
        }
    ]
}
I bind this model to a table and connect the bk_cnt up to an objectNumber in a cell. No problem so far.
In code I want to change the value of the first bk_cnt value from 11 to 20 on press of a button. Inside the event I have:
var model = this.getView().getModel(); // get the model
var tasks = model.getProperty("/task"); // get as a JS object
tasks[0].bk_cnt = 20; // update the model... will it update the view?
// model.refresh(); // it will if this is uncommented
Problem: though it is bound to the view, the displayed value of bk_cnt does not change. If I add model.refresh(), it does. This code is extracted from a larger section, and one of that section's features is sorting by column click. When I click a column to re-sort (no change to the model), the value 20 appears.
What gives?
Musings: I have read that model.getProperty() returns a JavaScript object with a live reference back to the model, and that a change to the value of the object will automatically be reflected in the view for any bound controls. Does this statement fall down on array attributes?
EDIT: Still feeling around the issue I find that
model.setProperty("/task/0/bk_cnt", 20)
does not require a model.refresh() to update the view. Not a total surprise, as this command acts directly through the model. This leaves me thinking that the 'live' object returned by getProperty() is only live when it is a primitive datatype like a string or integer, but not for a JS object. Or am I missing something?
EDIT 2: @Ash points out in his answer that there is a further approach: access the JS object from the model property, set whatever attributes need to be updated, then replace it in the model, e.g.
var tasks = model.getProperty("/task");
tasks[0].bk_cnt = 20;
model.setProperty("/task", tasks);
Second edit done to complete the trio of approaches for future readers.
The Model object is an abstraction layer ON TOP of a JavaScript object. There is no way a change made inside the object itself gets notified anywhere; you need to explicitly trigger the notifications through model.refresh() or model.setProperty().
So both of your solutions are valid; another one (which I favor) would be:
var tasks = model.getProperty("/task");
tasks[0].bk_cnt = 20;
model.setProperty("/task", tasks);
But this actually depends on how you bind your model to your UI objects :)
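For example (an illustrative sketch, not from the question): if the cell's control is bound like this, model.setProperty("/task/0/bk_cnt", 20) notifies it immediately, while mutating the array object alone does not:
// Hypothetical ObjectNumber from the question's table cell: the "number"
// property is bound to the model path, so it re-renders on change events.
new sap.m.ObjectNumber({
    number: "{/task/0/bk_cnt}"
});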

Spark: How to structure a series of side effect actions inside mapping transformation to avoid repetition?

I have a Spark Streaming application that needs to take these steps:
Take a string, apply some map transformations to it
Map again: if this string (now an array) has a specific value in it, immediately send an email (or do something OUTSIDE the Spark environment)
collect() and save in a specific directory
apply some other transformation/enrichment
collect() and save in another directory.
As you can see, this implies lazily activated calculations that would perform the OUTSIDE action twice. I am trying to avoid caching, as at some hundreds of lines per second this would kill my server.
I am also trying to maintain the order of operations, though this is not as important: is there a solution I do not know of?
EDIT: my program as of now:
kafkaStream;
lines = take the value, discard the topic;
lines.foreachRDD {
    splittedRDD = arg.map { split the string };
    assRDD = splittedRDD.map { associate to a table };
    flaggedRDD = assRDD.map { add a boolean parameter under an if condition + send mail };
    externalClass.saveStaticMethod( flaggedRDD.collect() and save in file );
    enrichRDD = flaggedRDD.map { enrich with external data };
    externalClass.saveStaticMethod( enrichRDD.collect() and save in file );
}
I put the saving part after the email so that if something goes wrong with it at least the mail has been sent.
In the end, the methods I found were these:
In the DStream transformation before the side-effecting one, make a copy of the DStream: one copy goes on with the transformations, the other gets the .foreachRDD{ outside action }. There is no major downside to this, as it is just one more RDD on a worker node (see the sketch after this list).
Extracting the {outside action} from the transformation and filtering on the already-sent mails: filter out elements whose mail has already been sent. This is almost a superfluous operation, as it will filter out all of the RDD's elements.
Caching before going on (although I was trying to avoid it, there was not much else to do).
If you are trying to avoid caching, solution 1 is the way to go.
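A minimal sketch of solution 1 in Scala, assuming lines is a DStream[String]; needsMail, sendMail, enrich and the save helper are hypothetical placeholders standing in for the program above:
// Fork the stream: the same DStream feeds two branches, one for the
// outside action and one that continues the transformation chain.
val flagged = lines.map(line => (line, needsMail(line))) // needsMail: your condition

// Branch 1: side effect only, executed on the workers.
flagged.filter { case (_, hit) => hit }
       .foreachRDD(rdd => rdd.foreach { case (line, _) => sendMail(line) })

// Branch 2: carry on with the transformations and the saves.
val enriched = flagged.map { case (line, _) => enrich(line) }
enriched.foreachRDD(rdd => externalClass.saveStaticMethod(rdd.collect()))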

Can I start a Service Now workflow via an external SOAP call?

I would like to make a call into the ServiceNow SOAP web service to start an instance of a specific workflow.
I can find the WSDL for functions like incident.do, but I seem to be missing the step needed to find the proper table/endpoint for starting workflows.
If you want to start a workflow via SOAP, I think the only way to do it is to create a Scripted Web Service or a Custom Processor.
In there you have to define a script which starts your workflow:
var w = new Workflow();
var context = w.startFlow(id, current, current.operation(), getVars());
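Here id is the sys_id of the workflow to start, and current is the GlideRecord the workflow should run against; the longer example below shows where those values come from.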
The ServiceNow wiki lists the API methods for workflows.
The tricky bit is getting the variables into the Workflow.
While this sounds easy, in fact it isn't.
If your workflow runs on the sc_req_item table (which is likely if you are dealing with Request Fulfillment), you first need to set the property (sys_properties) glide.workflow.enable_input_variables to true; otherwise you will not be able to add normal input variables to your workflow.
Then add the input variables to the workflow. Note that you have some nifty datatypes available there, for example the "Data Structure" type.
All input variables are treated like custom columns (in fact, they are columns of a workflow-specific table); that is why their names start with u_.
Let's say you define an input variable called u_dynamic_vars (datatype "Data Structure").
Here is how to call the workflow:
var wf_name = "Name of your workflow";

// Instantiate the JSON machinery
var parser = new JSON();

// Declare an instance of workflow.js
var wf = new Workflow();

// Get the workflow id
var wfId = wf.getWorkflowFromName(wf_name);

// Object containing name/value pairs mapping to the inputs the workflow expects
var vars = {};

// Prepare the JSON data structure
var obj = {
    "name": "George",
    "lastname": "Washington"
};

// Encode the data
vars.u_dynamic_vars = parser.encode(obj);
vars.u_new_email = "inject@new.com";

// Get a specific RITM
var gr = new GlideRecord("sc_req_item");
gr.get("18d8e9740f4013002f504c6be1050e48");
gs.print(gr.number);

// Start the workflow with a "current" record
wf.startFlow(wfId, gr, "update", vars);

// You may also pass null; then current is null
wf.startFlow(wfId, null, "update", vars);
In the workflow, you then unpack the data like so:
// Let's unpack it. For some reason, instantiating the parser won't work here...
var payload = JSON.parse(workflow.variables.u_dynamic_vars);
gs.print("payload.name: " + payload.name);
Also note that a workflow does not necessarily need to run on a table.
To achieve this, choose "global" as the table name when defining the workflow.

Using joblets in Talend with tMemorizeRows and tJavaFlex

I am trying to create some joblets in Talend that will speed up some processes.
I have an input from an MSSQLInput component; the results are then sorted and filtered a little. Then I have a tMemorizeRows and a tJavaFlex; the purpose of these is to memorize the rows in a column to perform a count. The count is based on a customer ID: once the ID changes, the count starts back at 1 and the process begins again, continuing to the end. I have refactored this as a joblet, but it does not work; the error is:
ID_tMemorizeRows_1 cannot be resolved to a variable
I have a tJavaFlex whose Start code is
int counte = 1;
The Main code is
// Compare the current row's ID with the previous (memorized) row's ID
if (ID_tMemorizeRows_1[0].equals(ID_tMemorizeRows_1[1]))
{
    counte = counte + 1;
}
else
{
    counte = 1;
}
context.Enqnum = counte;
The Enqnum context variable is created correctly and added into a tMap component.
Does anyone know why this is happening? One person told me it is because when you move something into a joblet it gets a new/different name, so it has to be referenced by that specific name in the Java. If this is the case, how do I find that name out?
Thank you
Rich
I do have a resolution.
When using joblets, we know that Talend essentially recycles the code used in the joblet by inserting it into the code for the main job.
The joblet I created works; I know this because I refactored it from a working job rather than building it from scratch. What it does is simply memorise row 0 and row 1 in an ordered data set; the tJavaFlex performs the count and the tMap appends the result to the flow (as mentioned above).
When the job is run, it runs fine. Problems occur when I want to reuse the same joblet in another part of the job, because Talend assigns names within the source code to each component depending on the name of the joblet.
For example, if the joblet was called ThisJob, then tMemorizeRows_1 would be called ThisJob_1_tMemorizeRows_1.
The row within the component (in this example ReferenceID) would be renamed as ReferenceID_ThisJob_1_tMemorizeRows_1.
But when you add a second instance of the joblet to your job, it gets a new name, e.g. ThisJob_2. The exact number depends on how much you have altered the job before adding the second joblet.
If you add the second joblet immediately, it would be called ThisJob_2; if you added five other components first, it is likely to be called ThisJob_6, etc. (I'm not 100% sure how Talend numbers components.)
When you add a joblet you can see its name on the joblet component; this reverts back to the original joblet name when you create any links/joins to other components.
It's also important to know that, within the generated code, each component is assigned to a variable called currentComponent.
Resolution
What I did was use Java code to split that name, as shown below. This way I can get the current name of the joblet and use it in my Java.
// currentComponent holds the generated name of the running component,
// e.g. "ThisJob_2_tMemorizeRows_1", so its first two parts give the
// joblet's runtime name.
String string = currentComponent;
String[] parts = string.split("_");
String part1 = parts[0];
String part2 = parts[1];
String joblet = part1 + "_" + part2;
String newrow = "ReferenceID_" + joblet + "_tMemorizeRows_1";
I hope this makes sense.
Thanks