How to pass binary data to Mojolicious Minion? - perl

I am using Minion (docs), a great tool for long-running tasks.
For a queued task I can provide a path to a file.
This works fine if the workers are running on the same host machine.
But how do I create a task and pass binary data if the workers are running on a different host?

The best approach for this would be to:
1. Store the file in a dedicated database table.
2. Fetch the id of this record.
3. Pass this id as a parameter to the Minion task instead of the file path.
In the example above it would look like: --allowed=12345
The task sub can then connect to the database and fetch the content of your file by the provided id.
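For illustration, a minimal Mojolicious::Lite sketch of this pattern (the connection strings and table layout are assumptions: a files table with id and content columns, with the content base64-encoded to sidestep driver-specific binary binding):

    use Mojolicious::Lite -signatures;
    use Mojo::Pg;
    use Mojo::File qw(path);
    use MIME::Base64 qw(encode_base64 decode_base64);

    helper pg => sub { state $pg = Mojo::Pg->new('postgresql://app@/files_db') };
    plugin Minion => {Pg => 'postgresql://app@/minion_db'};

    # the task receives only the id and fetches the content itself
    app->minion->add_task(process_file => sub ($job, $file_id) {
      my $row   = $job->app->pg->db->select('files', ['content'], {id => $file_id})->hash;
      my $bytes = decode_base64($row->{content});
      # ... process $bytes ...
    });

    # enqueue: store the file first, then pass only the id
    my $bytes = path('/tmp/upload.bin')->slurp;
    my $id    = app->pg->db->insert(
      'files', {content => encode_base64($bytes, '')}, {returning => 'id'}
    )->hash->{id};
    app->minion->enqueue(process_file => [$id]);

    app->start;

This way the workers on other hosts only need access to the shared database, not to the original file system.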

Related

Overriding Jmeter property in Run Taurus task Azure pipeline is not working

I am running JMeter from Taurus and I need an output kpi.jtl file with a URL listing.
I have tried passing the parameters -o modules.jmeter.properties.save.saveservice.url='true' and
-o modules.jmeter.properties="{'jmeter.save.saveservice.url':'true'}". The pipeline runs successfully, but the kpi.jtl doesn't have the URL.
I have tried a few more options, like editing jmeter.properties via the pipeline (which broke the pipeline and prompted for user input) and user.properties (which was ineffective).
I am expecting a kpi.jtl file with all possible fields logged, especially the URL. Please help.
I believe you're using the wrong property; you should pass this one instead:
modules.jmeter.csv-jtl-flags.url=true
More information: CSV file content configuration
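For example, in the Taurus YAML config that property sits under the jmeter module (a minimal sketch):

    modules:
      jmeter:
        csv-jtl-flags:
          url: true

or as a command-line override matching the -o style from the question: bzt test.yaml -o modules.jmeter.csv-jtl-flags.url=true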
However, be aware that having a .jtl file "with all possible logs" is a form of performance anti-pattern, as it creates massive disk IO and may ruin your test. More information: 9 Easy Solutions for a JMeter Load Test “Out of Memory” Failure

How to use multiple client certificates stored in keystore.p12 and set jmeter's system.properties while setting test in Azure Load Testing service?

I am trying to set up a test (manual/YAML) in the Azure Load Testing service. My test uses client certificates, so I uploaded the JMX file, the keystore (.p12), and a CSV file (containing the aliases of the certificates in the keystore) to the test plan.
In Azure Load Testing, where can I set the javax.net.ssl.keyStoreType, javax.net.ssl.keyStore, javax.net.ssl.keyStorePassword, https.use.cached.ssl.context, https.keyStoreStartIndex and https.keyStoreEndIndex properties?
With plain JMeter, I would set the above properties in JMeter's system.properties file, but with Azure Load Testing I am not sure how to get this working.
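For reference, the entries in JMeter's system.properties would look something like this (the values are just examples):

    javax.net.ssl.keyStoreType=pkcs12
    javax.net.ssl.keyStore=keystore.p12
    javax.net.ssl.keyStorePassword=changeit
    https.use.cached.ssl.context=false
    https.keyStoreStartIndex=0
    https.keyStoreEndIndex=4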
Please suggest. Thanks!
As per Configure a load test in YAML
configurationFiles List of relevant configuration files or other files that you reference in the Apache JMeter script. For example, a CSV data set file, images, or any other data file. These files will be uploaded to the Azure Load Testing resource alongside the test script. If the files are in a subfolder on your local machine, use file paths that are relative to the location of the test script.
So my expectation is that if you upload the system.properties file along with the .jmx script and the CSV file with the certificate aliases, the Azure Load Testing engine should pick it up and apply it.
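A sketch of the test YAML under that assumption (the file names are placeholders):

    version: v0.1
    testName: client-cert-test
    testPlan: test.jmx
    engineInstances: 1
    configurationFiles:
      - aliases.csv
      - system.properties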
It should also be possible to do this via the GUI.
More information: How to Use Multiple Certificates When Load Testing Secure Websites

How to make Snakemake recognize Globus remote files using Globus CLI?

I am working in a high performance computing grid environment, where large-scale data transfers are done via Globus. I would like to use Snakemake to pull data from a Globus path, process the data, and then push the processed data to a different Globus path. Globus has a command-line interface.
Pulling the data is no problem, for I'd just create a rule that would run globus transfer to create the requisite local file. But for pushing the data back to Globus, I think I'll need a rule that can "see" that the file is missing at the remote location, and then work backwards to determine what needs to happen to create the file.
I could create local "proxy" files that represent the remote files. For example I could make a rule for creating 'processed_data_1234.tar.gz' output files in a directory. These files would just be created using touch (thus empty), and the same rule will run globus transfer to push the files remotely. But then there's the overhead of making sure that the proxy files don't get out of sync with the real Globus-hosted files.
Is there a more elegant way to do this akin to the Remote File capability? Is it difficult to add a Globus CLI support for Snakemake? Thanks in advance for any advice!
Would it help to create a utility function that generates a list of all desired files and compares it against the list of files available on Globus? Something like this (a sketch):
from subprocess import run

def return_needed_files(wildcards):
    list_needed_files = []  # either hard-coded or specified with some logic
    # list what is already available remotely, e.g. with `globus ls`
    # (ENDPOINT_ID:/path is a placeholder)
    result = run(["globus", "ls", "ENDPOINT_ID:/path"], capture_output=True, text=True)
    list_available = result.stdout.splitlines()
    return [i for i in list_needed_files if i not in list_available]

# include all the needed files in the all rule
rule all:
    input: return_needed_files
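For the push direction, the proxy-file idea from the question could be written as a rule whose output is a touch()-ed marker file, so the marker only appears when the transfer command succeeded (a sketch; LOCAL_EP and REMOTE_EP are placeholder endpoint IDs, and note that globus transfer submits an asynchronous task, so you may want to follow it with globus task wait):

rule push_processed:
    input: "results/{name}.tar.gz"
    # the local marker file stands in for the remote copy
    output: touch("pushed/{name}.tar.gz.done")
    shell: "globus transfer LOCAL_EP:{input} REMOTE_EP:/data/{wildcards.name}.tar.gz"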

How to read a local csv file using Azure Data Factory and a self-hosted runtime?

I have a Windows Server VM with the ADF Integration Runtime installed, running under a local account called deploy. This account is a member of the local admins group. The server is not domain-joined.
I created a new linked service (File System) and pointed it to a CSV file in the root of the C: drive as a test. When I test the connection, I get Connection failed.
Error occurred when trying to access the file in Folder 'C:\etr.csv', File filter: ''. The directory name is invalid. Activity ID: 1b892702-7cc3-48d5-83c7-c680d6d15afd.
Any ideas on a fix?
The linked service needs to point to a folder on the target machine, not a file. Change C:\etr.csv to C:\ and then define a new dataset that uses the linked service and selects etr.csv.
The dataset represents the structure of the data within the linked data store, and the linked service defines the connection to the data source, so the linked service should point to the folder rather than the file: C:\ instead of C:\etr.csv.
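For illustration, the File System linked service in ADF's JSON view would then look roughly like this (a sketch; the names and the integration runtime reference are placeholders):

    {
      "name": "LocalFileSystem",
      "properties": {
        "type": "FileServer",
        "typeProperties": {
          "host": "C:\\",
          "userId": "deploy",
          "password": { "type": "SecureString", "value": "<password>" }
        },
        "connectVia": { "referenceName": "SelfHostedIR", "type": "IntegrationRuntimeReference" }
      }
    }

The dataset then references this linked service and sets its file name to etr.csv.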

How to copy a local directory with files to a remote server in Talend

In Talend (Data Integration) I am trying to copy a local directory to a remote directory, but when I run the job I can only copy files, not folders, from the directory. Please help me with this job.
In my Talend job I am using local connection and remote connection components:
tFileList -> tFileProperties (to store the path and name in one table) -> tMSSqlInput (extracting the path from that table) -> iteration -> tSSH (to create the directory if it does not exist) -> finally tFTPPut to connect and copy to the remote directory.
When I store the properties in a table using tFileProperties, files get a non-zero size but folders come through with a size of zero. I use this condition to create the directory with the tSSH component, but I am unable to create the folders. Please help me.
Do you get an error message?
I believe the output of the tMSSqlInput should be row-based rather than an iteration. That might be the source of the problem.
From the tMSSqlInput docs:
tMSSqlInput executes a DB query with a strictly defined order which must correspond to the schema definition. Then it passes on the field list to the next component via a Main row link.