Data Factory V2 Copy Activity JSON - azure-data-factory

I am trying to copy a JSON file from one blob to another using the Data Factory copy activity, and I need to set a custom value for one of the properties in the destination JSON during the copy. I was trying to set the jsonPathDefinition, as shown in the link below, so I could then use an expression like "#{parameters('myNumber')}", but it seems this is not available in the C# library. Is it not going to be available the way it is in the V1 library? Is it deprecated in V2, has it been missed, or are there other ways to achieve this in V2? Please suggest.
https://learn.microsoft.com/en-au/azure/data-factory/supported-file-formats-and-compression-codecs#json-format

In V2, this is a known defect; the fix is on the way and should be available in about one week. The temporary workaround is to use the REST API.
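Until the fix lands, a hedged sketch of that REST workaround in PowerShell might look like the following. The dataset shape and jsonPathDefinition syntax follow the linked docs; the subscription, resource group, factory, dataset, and linked service names are placeholders, and $token must already hold a valid ARM access token:
# Placeholders: <subscription-id>, <resource-group>, <factory-name>; OutputJsonDataset
# and MyBlobLinkedService are invented names for illustration.
$uri = 'https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group>' +
       '/providers/Microsoft.DataFactory/factories/<factory-name>/datasets/OutputJsonDataset' +
       '?api-version=2018-06-01'
$body = @'
{
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": { "referenceName": "MyBlobLinkedService", "type": "LinkedServiceReference" },
    "typeProperties": {
      "folderPath": "output",
      "format": {
        "type": "JsonFormat",
        "filePattern": "setOfObjects",
        "jsonPathDefinition": { "myNumber": "$.myNumber" }
      }
    }
  }
}
'@
# PUT the dataset definition directly against the ARM endpoint, bypassing the C# SDK
Invoke-RestMethod -Method Put -Uri $uri -ContentType 'application/json' -Headers @{ Authorization = "Bearer $token" } -Body $body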


ADF Copy only when a new CSV file is placed in the source and copy to the Container

I want to copy a file from the source container to the target container, but only when the source file is new (i.e. the latest file has been placed in the source). I am not sure how to proceed, and I am not sure about the syntax to check whether the source file is newer than the target. Should I use two Get Metadata activities to check the source and target last-modified dates and use an If Condition? I tried a few ways but it didn't work.
Any help will be handy.
The syntax I used for the condition gives me this error:
#if(greaterOrEquals(ticks(activity('Get Metadata_File').output.lastModified),activity('Get Metadata_File2')),True,False)
Error message:
The function 'greaterOrEquals' expects all of its parameters to be either integer or decimal numbers. Found invalid parameter types: 'Object'
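For what it's worth, that error comes from the second argument: activity('Get Metadata_File2') is the whole activity object, not a tick count. Assuming both Get Metadata activities expose a lastModified field, the condition would need the second argument wrapped the same way as the first:
@greaterOrEquals(ticks(activity('Get Metadata_File').output.lastModified), ticks(activity('Get Metadata_File2').output.lastModified))
(In an If Condition activity the expression itself must evaluate to a boolean, so the surrounding if(..., True, False) is unnecessary, and ADF expressions start with @ rather than #.)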
You can try one of the Pipeline Templates that ADF offers.
Use this template to copy new and changed files only by using LastModifiedDate. This template first selects the new and changed files only by their attribute "LastModifiedDate", and then copies them from the data source store to the data destination store. You can also go to "Copy Data Tool" to get the pipeline for the same scenario with more connectors.
View documentation
OR...
You can use a Storage Event Trigger to run a pipeline containing a copy activity whenever a new file is written to storage.
Follow the detailed example here: Create a trigger that runs a pipeline in response to a storage event

VSTS: Built-in variable for organization name?

In many of the calls described in the Azure DevOps REST API documentation, I need to supply the name of the organization, e.g.:
https://vsrm.dev.azure.com/{organization}/{project}/_apis/release/releases?api-version=5.0-preview.8
The project I can get from System.TeamProject. I would have expected something similar for organization name, something like:
System.TeamFoundationCollectionName
This does not seem to be available. I've even printed out all of my environment variables on the agent and don't see anything that fits the need exactly. Sure, I can parse it out of one of the other values, but this seems fragile since MS seems to like to change the format of URLs.
I also can't hard code the organization name because this release definition will live in multiple organizations and we don't want to have to manually update it for each. How are others solving this problem?
Try using System.TeamFoundationServerUri and System.TeamFoundationCollectionUri to build your API requests. They have the organization included in them.
https://learn.microsoft.com/en-us/azure/devops/pipelines/release/variables?view=vsts&tabs=batch
edit: SYSTEM_TEAMFOUNDATIONSERVERURI/BUILD_PROJECTNAME/_apis/release/releases?api-version=5.0-preview.8
It looks like there is currently no such variable for the organization. Also, these variables return the old URL format (xxx.visualstudio.com) and not the new one (dev.azure.com/xxx), so if you use System.TeamFoundationCollectionUri the API should work without the {organization} segment:
{System.TeamFoundationCollectionUri}{project}/_apis/release/releases?api-version=5.0-preview.8
In Powershell, do this:
# Where SYSTEM_TEAMFOUNDATIONCOLLECTIONURI=https://some_org_name.visualstudio.com/
([System.Uri]$Env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI).Host.split('.')[-3] # returns 'some_org_name'
Now, just assign that to a variable and use it anywhere you like. "SYSTEM_TEAMPROJECT" is the Project Name, so no need to do any parsing there. It is already available.
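If you want something a bit more robust against Microsoft changing the URL format, here is a sketch that handles both shapes. SYSTEM_TEAMFOUNDATIONCOLLECTIONURI is documented; the handling of the two URL shapes is my assumption, not an official API:
$uri = [System.Uri]$Env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI
if ($uri.Host -eq 'dev.azure.com') {
    # New format: https://dev.azure.com/{organization}/
    $org = $uri.AbsolutePath.Trim('/').Split('/')[0]
} else {
    # Legacy format: https://{organization}.visualstudio.com/
    $org = $uri.Host.Split('.')[0]
}
Write-Host "Organization: $org"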

Unable to run experiment on Azure ML Studio after copying from different workspace

My simple experiment reads from an Azure Storage Table, selects a few columns, and writes to another Azure Storage Table. This experiment runs fine in its workspace (let's call it Workspace1).
Now I need to move this experiment as-is to another workspace (call it Workspace2) using PowerShell and need to be able to run the experiment there.
I am currently using this Library - https://github.com/hning86/azuremlps
Problem:
When I copy the experiment using Copy-AmlExperiment from Workspace1 to Workspace2, the experiment and all its properties get copied except the Azure Table account key.
The experiment runs fine if I manually enter the account key for the Import/Export modules on studio.azureml.net.
But I am unable to do this via PowerShell. If I export (Export-AmlExperimentGraph) the copied experiment from Workspace2 as JSON, insert the account key into the JSON file, and import (Import-AmlExperiment) it back into Workspace2, the experiment fails to run.
In PowerShell I get an "Internal Server Error : 500".
When running on studio.azureml.net, I get the notification "Your experiment cannot be run because it has been updated in another session. Please re-open this experiment to see the latest version."
Is there any way to move an experiment with external dependencies to another workspace and run it?
Edit: I think the problem has something to do with how the experiment handles the account key. When I enter it manually, it is converted into a JSON array comprising RecordKey and IndexInRecord. But when I upload the JSON experiment with the account key inserted, it remains as-is and does not get resolved into RecordKey and IndexInRecord.
For me, publishing the experiment as a private experiment to the Cortana gallery is one of the most useful options. Only people with the link can see the experiment and add it from the gallery. In the link below I've explained the steps I followed.
https://naadispeaks.wordpress.com/2017/08/14/copying-migrating-azureml-experiments/
When the experiment is copied, the password is wiped for security reasons. If you want to programmatically inject it back, you have to set another metadata field to signal that you are setting a plain-text password, not an encrypted one. If you export the experiment in JSON format, you can easily figure this out.
I think I found why you are unable to import the credentials back.
Export the JSON graph to your local disk, then update whatever parameter has to be updated.
You will also notice that the credentials are stored as 'Placeholders' instead of 'Literals'; hence it makes sense to change them to Literals.
You can do this by traversing the JSON to find the relevant parameters you need to update.
Here is a brief illustration of changing the Placeholder to a Literal:
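As a purely hypothetical PowerShell sketch of that patch step (the graph property names — Graph.ModuleNodes, ParameterValues, State, Value, 'Account Key' — are guesses based on the description above, not a documented schema; inspect your own exported JSON for the real names):
$graph = Get-Content 'experiment.json' -Raw | ConvertFrom-Json
foreach ($node in $graph.Graph.ModuleNodes) {      # hypothetical path to the module nodes
    foreach ($p in $node.ParameterValues) {        # hypothetical parameter list
        if ($p.Name -eq 'Account Key') {
            $p.Value = $env:STORAGE_ACCOUNT_KEY    # inject the plain-text key
            $p.State = 'Literal'                   # was 'Placeholder'
        }
    }
}
$graph | ConvertTo-Json -Depth 32 | Set-Content 'experiment.json'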

Custom Search Results in REST MarkLogic

I'm new to MarkLogic, am stuck, and am not finding the documentation much help. I know what I need to do; I just do not know how to do it.
I have a key-value search on my REST server which returns MarkLogic's standard search results and XML snippet. I want to create my own custom search result which will output a title element for my XML files.
I am aware that I need to create an XSLT transformation document and upload it to the server, but I do not know how to target MarkLogic's search function or how to write this out.
I have basic knowledge of XSLT. If I just created something that targets each file's title using XPath, would this work, or does MarkLogic require the use of its custom functions?
I know it's a bit broad, but hopefully someone can point me in the right direction.
Sounds like you are talking about the GET /v1/keyvalue endpoint of the MarkLogic REST API. Unfortunately, that does not allow you to choose a transform. You can probably use GET /v1/search with a transform param instead, using a structured query for an element value query. The docs contain a good syntax reference for that.
Docs on creating and managing transforms can be found here:
http://docs.marklogic.com/guide/rest-dev/transforms#chapter
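As a hedged sketch of that suggestion — a structured query with an element value query on title, POSTed to /v1/search with a transform parameter (the host, port, credentials, and transform name are placeholders, not values from your setup):
$structuredQuery = @'
<query xmlns="http://marklogic.com/appservices/search">
  <value-query>
    <element ns="" name="title"/>
    <text>Some Title</text>
  </value-query>
</query>
'@
# POST the structured query; 'my-title-transform' must already be installed on the server
Invoke-RestMethod -Method Post -Uri 'http://localhost:8011/v1/search?transform=my-title-transform' -Credential (Get-Credential) -ContentType 'application/xml' -Body $structuredQuery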
HTH!
You can use extract-metadata in your search options with search:search or the /v1/search/ REST API endpoint to include the title element in a metadata element or JSON property in your results:
import module namespace search = "http://marklogic.com/appservices/search"
  at "/MarkLogic/appservices/search/search.xqy";
search:search(
  "my query string",
  <options xmlns="http://marklogic.com/appservices/search">
    <extract-metadata>
      <qname elem-ns="" elem-name="title"/>
    </extract-metadata>
  </options>)
If you need more flexibility, you can specify a custom snippet implementation or a results decorator function in your search options.
Is this key-value or full text? For key-value you could use XPath. Any XPath that starts with / or // or fn:collection() or fn:doc() will search the entire database. You can search specific document(s) or collection(s) too.
For full text you'd probably want to use https://docs.marklogic.com/search:search - or possibly https://docs.marklogic.com/cts:search for really low-level control.
There's some example code using search:search from XSL at https://github.com/marklogic/RunDMC which might help. It doesn't use the REST API: it's a traditional form-submit web page. But the view/search.xsl code might give you some idea how to call the search API from XSLT.
That RunDMC code might also help you if you need to call XSL from XQuery: take a look at controller/transform.xqy.

Updating a value in an XML file located on the web

I have a file on the web that looks like the XML below. I would like to update the value of one tag,
let's say "tempset", and change 150 to another number. How can I do this? NSURLConnection? NSMutableURLRequest? NSURLRequest? If possible keep it to iOS 4! Thanks!
<Courbe>
<age>45</age>
<tempdesi>150</tempdesi>
<vmininit>35</vmininit>
<tempinit>220</tempinit>
<unittemp>0</unittemp>
<te_fin_c>220,700,700,700,700,700,700,700,700,700</te_fin_c>
<vm_fin_c>50,50,50,50,50,50,50,50,50,50</vm_fin_c>
<grfan_a>1,1,1,1,1,1,1,1,1,1</grfan_a>
<ecarnuit>0</ecarnuit>
<tempset>150</tempset>
<tempsetp>700</tempsetp>
<jo_cou_t>1</jo_cou_t>
<ty_stcha>1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1</ty_stcha>
</Courbe>
You have several options:
- Create a "service" for this, a kind of API, so you can call the service from your client in different languages, including Objective-C, e.g. http://myserver.com/myobject/set?tempset=1 (in the real world, use POST rather than GET for this). Of course, to do this you need to write some server-side code in your favorite language; a quick way to exercise such an endpoint is sketched below.
- Provide a way to upload the file and replace it completely, a kind of "upload.php".
Which solution is best depends on your problem: how this file is generated and maintained.
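If you go the service route, you can smoke-test such a hypothetical endpoint from any machine before writing the iOS client (the URL and parameter below are the placeholders from the example above, not a real service):
# POST a new tempset value as form data to the hypothetical endpoint
Invoke-RestMethod -Method Post -Uri 'http://myserver.com/myobject/set' -Body @{ tempset = 160 }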