I want to develop a Talend parent job which will read a file. This file will have all the child job names.
So when run, the parent job must go through each entry in the file (i.e. each child job name) and execute that child job.
Can anyone please guide me on this?
You can use the tRunJob dynamic job feature. I executed 3 sub jobs called a, b and c, and the order is specified in the file; see below:
The first block is to load your file into the context; you can use a properties file or a delimited file.
I used a properties file with this content:
jobs=a;c;b;a
You have to add a String-typed context variable named jobs to put the property value into.
Now our job list is stored in a context variable; we need to iterate through it using tJavaFlex and update a context variable called currentjob:
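As a rough, hypothetical sketch of what the tJavaFlex code could look like (it assumes the loaded list is in context.jobs and the current entry goes into a String context variable context.currentjob; the connection to tRunJob stays as shown in the job design):

// tJavaFlex - Start code: split the semicolon-separated list and open a loop
String[] jobList = context.jobs.split(";");
for (int i = 0; i < jobList.length; i++) {

// tJavaFlex - Main code: expose the current entry for the following tRunJob
    context.currentjob = jobList[i];

// tJavaFlex - End code: close the loop
}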
And finally, this is how to set up the tRunJob component:
Here you specify the jobs to be executed, but the execution order is specified by the file!
You can do it like the following scenario.
This is the sample job which I tried out.
I have assumed your input comes from a CSV file, and the input file contains three job names, which are:
ChildJob1
ChildJob2
ChildJob3
I am getting the job name and assigning it to a context variable like below:
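A minimal sketch of that assignment (hypothetical names: the CSV column is jobname and the context variable is jobName; here done in a tJavaRow right after the CSV input):

// tJavaRow: copy the job name from the current CSV row into the context
// variable that the tRunJob component (dynamic job mode) will use
context.jobName = input_row.jobname;
// pass the row through unchanged
output_row.jobname = input_row.jobname;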
Then that context variable is used in the tRunJob component.
Each child job has a single tJava component which displays the job name.
Hope this may help you out.
I am trying to check if any zip file exists in my SFTP folder. The GetMetadata activity works fine if I explicitly provide the file name, but I can't know the file name here, as it is embedded with a timestamp and sequence number which are dynamic.
I tried specifying *.zip but that never works, and the GetMetadata activity always returns false even though the zip file actually exists. Is there any way to get this to work? Suggestions please.
Sample file name as below; the last part, 0000000004_20210907080426, is dynamic and will change every time:
TEST_TEST_9999_OK_TT_ENTITY_0000000004_20210907080426
You could possibly do a Get Metadata on the folder and include the Child items under the Field List.
You'll have to iterate with a ForEach using the expression
@activity('Get Folder Files').output.childItems
and then check if item().name (within the ForEach) ends with '.zip'.
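For example, a hedged sketch of that check inside the ForEach (say, as the expression of an If Condition activity, reusing the hypothetical activity name 'Get Folder Files' from above):

@endswith(item().name, '.zip')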
I know it's a pain when the wildcard stuff doesn't work for a given dataset, but this alternative ought to work for you.
If you are using exists in the Get Metadata activity, you need to provide the file name in it.
As a workaround, you can get the child items (with filename *.zip) using the Get Metadata activity.
Output:
Pass the output to If Condition activity, to check if the required file exists.
@contains(string(json(string(activity('Get Metadata1').output.childItems))),'.zip')
You can use other activities inside True and False activities based on If Condition.
If no file exists or no child items are found in the Get Metadata activity:
If condition output:
For an SFTP dataset, if you want to use a wildcard to filter files under the folderPath specified in the dataset, you have to skip this setting and specify the file name in the activity source settings (Get Metadata activity).
But a wildcard filter on folders/files is not supported for the Get Metadata activity.
I'm new to AnyLogic; I'm a university student working on my degree project, and I still can't manage to save the coordinates of my agents over time. I've been working with this tool for nearly 3 months and have just started looking for help.
I tried with traceln(getX() and getY()), properly written, but it only shows me the entrance position. How could I create a function to store this information in a txt file? Or how could I create a new agent type to apply the steps described above?
Please help.
Here is a simple example to show you how to do this.
I have a custom pedestrian agent that gets created in the PedSource block, and a population, which starts off empty, to which all the created pedestrians get added.
This is just so that I can have the event that records the location inside the agent; the file object to write to is on Main, so we need to access Main from inside the agent.
In the file object, I just set it to write and selected a blank .txt file that I created.
You can read more about using the text file from the help - https://anylogic.help/anylogic/connectivity/text-file.html
Inside the pedestrian agent, I have an event that simply saves the location to the file in a new line.
I make use of the "\t" so that the txt file is tab-separated. You can use whatever separator you prefer.
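As a hedged sketch, the event's action could look roughly like this (it assumes the Text File object on Main is named file, and that Main is reachable from the agent, e.g. via the generated main reference or ((Main) getOwner())):

// Event action inside the pedestrian agent: append one tab-separated line
// with the current model time and the agent's coordinates
main.file.println(time() + "\t" + getX() + "\t" + getY());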
You must remember to call close() on the text file when you have finished writing to it. I simply called it in the On destroy code of Main, so it only "writes" the data to the text file when you close the model.
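So the On destroy code of Main would just contain something like this (again assuming the Text File object is named file):

file.close();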
Here is the output from this very simple example
Calling getX() and getY() on agents does definitely give their current coordinates (assuming they are in a continuous space and not GIS/discrete space) so it's likely you're calling this on the wrong agents (e.g., calling it for your instance of Main instead of for some agents living in a population inside it).
Or you're not actually calling it at different simulation times (via, say, a cyclic event), so you're only getting the initial values.
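For instance, a minimal sketch of a cyclic event on Main (hypothetical names: a population pedestrians of a custom agent type MyPedestrian) that traces every agent's position each time it fires:

// Cyclic event action on Main: log time and coordinates for every agent
for (MyPedestrian p : pedestrians) {
    traceln(time() + "\t" + p.getX() + "\t" + p.getY());
}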
I've configured a trigger on a blob container change which should invoke a Function activity (HTTP POST), and I want to use the container name in the function itself (the trigger container could change frequently, and I don't want to change the function code). How can I pass it?
I found only an example for the Copy activity:
https://learn.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger
Based on the link you posted in your question, you could pass the values of the folder path and file name to the pipeline as parameters. @triggerBody().folderPath and @triggerBody().fileName can be configured in the parameters of the pipeline.
For example:
Then, if you want to get the container name, you just need to split the folder path on / so that you get the root path, which is exactly the container name.
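A hedged sketch of that split, assuming the pipeline parameter receiving @triggerBody().folderPath is named folderPath:

@first(split(pipeline().parameters.folderPath, '/'))

This should return the first path segment, i.e. the container name.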
So I created this quick job and, strangely, even after getting confirmation that the variables have been loaded, I can't seem to read the variables using context. What am I doing wrong?
The input file has just one variable 'test_var' with its value set to 'passed' (test_var=passed). While running the job, I also see the info in the Talend log window (tContextLoad_2 set key "test_var" with value "passed"). Further, I can use the context.containsKey("test_var") construct, and it yields 'true' in the Java node. But the moment I try to run the following command, it fails:
System.out.println("Value of var test_var read from context " + context.test_var);
Kindly help.
So I looked at the source code of another job with context variables and found that Talend internally manages the context variables using the getProperty method (variables that are only loaded at runtime by tContextLoad are not generated as fields of the context object, which is presumably why context.test_var fails while getProperty works). Using that information, it all worked fine:
System.out.println("Value of var test_var read from context " + (String) context.getProperty("test_var"));
I have a txt file that holds a string, and I want to be able to use this string in one of my scripts. So I'm wondering if there is a way to set the content of the file as one of the build properties or parameters, which I'll then be able to use in my scripts; it should work the same as using one of the build environment properties.
For example: ${JOB_NAME}, which holds the job name. In the same way, I want to access the content of the file, which holds some value inside.
Is it possible?
You can upload a file from your computer to the workspace through the File parameter of the job.
You can use the Extended Choice Parameter plugin to read value(s) from a file and display them in a dropdown/radio-button/checkbox for the user to select, dynamically, every time the build is triggered.
You can use the EnvInject plugin to read value(s) from a file and inject them into the build as environment variables, so that they can be used by the rest of the build steps/scripts.
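For the EnvInject route, a hedged sketch: if the file can be kept in KEY=VALUE form, for example a my_values.properties file (hypothetical name) containing

MY_STRING=the value you want to use

then pointing the "Properties File Path" of the Inject environment variables step at that file lets later build steps reference it like any other environment variable, e.g. ${MY_STRING} in a shell step.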
Your question is very unclear on what you are trying to do. Pick one of the 3 methods above based on what you need, or clarify your question.