Use Azure ML methods as an API - frameworks

Is it possible to use the machine learning methods from Microsoft Azure Machine Learning as an API from my own code (without ML Studio), with everything being computed on their side?

You can publish an experiment (the machine learning functions you hooked together in Azure ML Studio) as an API. When you call that API from your custom code, you pass it your data and all of the computation runs in the cloud in Azure ML.

I am reasonably new to Azure Machine Learning, but I do not believe it is possible to use the API without using ML Studio at all. For example, you need an API key to call the API and authenticate with it, and the only way I am aware of to obtain that key is via ML Studio (after you have published a trained experiment).
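To make the flow concrete, here is a minimal TypeScript sketch of calling a published experiment's request/response endpoint from your own code. The endpoint URL, API key, and input schema below are placeholders; you would copy the real values from the API help page ML Studio generates when you publish the experiment.

```typescript
// Minimal sketch: calling a published Azure ML Studio web service from your own code.
// Endpoint URL, API key, and input columns are placeholders taken from the
// "Request/Response" API help page of your published experiment.

const endpoint =
  "https://<region>.services.azureml.net/workspaces/<ws-id>/services/<service-id>/execute?api-version=2.0";
const apiKey = "<your-api-key>"; // obtained from ML Studio after publishing

async function score(): Promise<void> {
  const body = {
    Inputs: {
      // "input1" and the column names depend on how your experiment is wired up.
      input1: {
        ColumnNames: ["feature1", "feature2"],
        Values: [["1.0", "2.0"]],
      },
    },
    GlobalParameters: {},
  };

  const response = await fetch(endpoint, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(body),
  });

  if (!response.ok) {
    throw new Error(`Scoring request failed: ${response.status} ${await response.text()}`);
  }

  // All computation happened in Azure ML; only the scored result comes back.
  console.log(await response.json());
}

score().catch(console.error);
```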

How to connect Azure Data Factory to Salesforce Commerce Cloud?

Is there a way to connect Azure Data Factory to Salesforce Commerce Cloud?
In Data Factory I only see connectors for Salesforce Service Cloud and Marketing Cloud.
If it's possible, I'd appreciate it if someone could show me an example.
Thank you!
Actually, from the Azure Data Factory connector overview (https://docs.microsoft.com/en-us/azure/data-factory/connector-overview), we can see that Salesforce Commerce Cloud is not supported.
The only way is to achieve it at the code level, and then call an Azure Function, Python, or Notebook activity to run that code from Data Factory.
There isn't an existing code example we can provide for you; you will need to design it yourself.
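For a rough idea of what that code-level approach could look like, here is a hypothetical TypeScript sketch of a function (for example, hosted as an Azure Function that a Data Factory pipeline invokes) that pulls data from a Salesforce Commerce Cloud OCAPI endpoint and stages it in Blob Storage for a downstream copy activity. The OCAPI URL, token, and container/blob names are assumptions, not working values.

```typescript
// Hypothetical sketch of the "code level" approach: fetch data from a Salesforce
// Commerce Cloud (OCAPI) endpoint and stage it in Azure Blob Storage, so a Data
// Factory pipeline can trigger this code and then copy the staged blob onward.
// The OCAPI URL, token, and names below are placeholders.
import { BlobServiceClient } from "@azure/storage-blob";

const OCAPI_URL =
  "https://<instance>.demandware.net/s/-/dw/data/<version>/products/<product-id>"; // placeholder
const OCAPI_TOKEN = process.env.SFCC_ACCESS_TOKEN ?? ""; // obtained via SFCC OAuth beforehand
const STORAGE_CONNECTION = process.env.AZURE_STORAGE_CONNECTION_STRING ?? "";

export async function stageCommerceCloudData(): Promise<void> {
  // 1. Pull the data from Salesforce Commerce Cloud.
  const response = await fetch(OCAPI_URL, {
    headers: { Authorization: `Bearer ${OCAPI_TOKEN}` },
  });
  if (!response.ok) {
    throw new Error(`SFCC request failed: ${response.status}`);
  }
  const payload = JSON.stringify(await response.json());

  // 2. Stage it as a blob that a Data Factory copy activity can pick up.
  const blobService = BlobServiceClient.fromConnectionString(STORAGE_CONNECTION);
  const container = blobService.getContainerClient("sfcc-staging");
  await container.createIfNotExists();
  const blob = container.getBlockBlobClient(`extract-${Date.now()}.json`);
  await blob.upload(payload, Buffer.byteLength(payload));
}
```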

Azure DevOps extensions - what language can I use?

I want to develop a DevOps hub plugin that gets work item details and generates a PDF.
I have viewed the samples, which all seem to use TypeScript: https://learn.microsoft.com/en-us/azure/devops/extend/develop/samples-overview?view=azure-devops.
I'm having trouble understanding the context the code runs in: does it run on the server or in the browser? I know I need a web server, as I have made a test hub plug-in and it is running from my local web server.
I would prefer to be able to use server-side C# ASP.NET. Is this possible, or do we have to use a client-side language?
It depends on how you want to do it.
The DevOps extension itself is client-side only and is JavaScript/TypeScript. So if you are developing, say, a custom control for the work item form, that's all you can use. But it is hosted within Azure DevOps itself.
If your hub shows an external page, you can do whatever you want, but you have to host all of that content yourself.
For your example I would not use an external site; it can all be done in the browser. In fact, I made a (private) extension that does something very similar.
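To make the client-side model concrete, here is a rough TypeScript sketch of hub code that reads a work item entirely in the browser, using the azure-devops-extension-sdk and azure-devops-extension-api packages; the work item id and field names are placeholders.

```typescript
// Rough sketch of browser-side extension code: everything below runs in the user's
// browser inside the Azure DevOps hub iframe, not on a server.
import * as SDK from "azure-devops-extension-sdk";
import { getClient } from "azure-devops-extension-api";
import { WorkItemTrackingRestClient } from "azure-devops-extension-api/WorkItemTracking";

async function run(): Promise<void> {
  SDK.init();
  await SDK.ready();

  // Fetch details of a work item (id 123 is a placeholder) from the client.
  const witClient = getClient(WorkItemTrackingRestClient);
  const workItem = await witClient.getWorkItem(123, undefined, [
    "System.Title",
    "System.State",
    "System.Description",
  ]);

  // From here you could feed workItem.fields into a client-side PDF library
  // running in the browser and offer the result as a download.
  console.log(workItem.fields["System.Title"]);
}

run().catch(console.error);
```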

Is there any way to call Bing-ads api through a pipeline and load the data into Bigquery through Google Data Fusion?

I'm creating a pipeline in Google Data Fusion that lets me export my Bing Ads data into BigQuery using my Bing Ads developer token. I couldn't find any suitable data source to add to my pipeline in Data Fusion. Is fetching data from API calls even supported in Google Data Fusion, and if it is, how can it be done?
HTTP based sources for Cloud Data Fusion are currently in development and will be released by Q3. Could you elaborate on your use case a little more, so we can make sure that your requirements will be covered by those plugins? For example, are you looking to build a batch or real-time pipeline?
In the meantime, you have the following two, more immediate options/workarounds:
If you are OK with storing the data in a staging area in GCS before loading it into BigQuery, you can use the HTTPToHDFS plugin that is available in the Hub. Use a path that starts with gs:///path/to/file.
Alternatively, we also welcome contributions, so you can also build the plugin using the Cloud Data Fusion APIs. We are happy to guide you, and can point you to documentation and samples.
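If you end up scripting the final GCS-to-BigQuery load yourself instead of using a pipeline sink, the idea looks roughly like the following TypeScript sketch; the bucket, object, dataset, and table names are placeholders.

```typescript
// Sketch: load a file staged in GCS into a BigQuery table.
// Bucket/object and dataset/table names below are placeholders.
import { BigQuery } from "@google-cloud/bigquery";
import { Storage } from "@google-cloud/storage";

const bigquery = new BigQuery();
const storage = new Storage();

async function loadStagedFile(): Promise<void> {
  const file = storage.bucket("my-staging-bucket").file("bing-ads/export.json");

  // Run a load job that appends the staged newline-delimited JSON to the table.
  const [job] = await bigquery
    .dataset("ads_data")
    .table("bing_ads_raw")
    .load(file, {
      sourceFormat: "NEWLINE_DELIMITED_JSON",
      autodetect: true,
      writeDisposition: "WRITE_APPEND",
    });

  console.log(`Load job ${job.id} completed`);
}

loadStagedFile().catch(console.error);
```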

Organizing Microsoft Azure DevOps Projects

I have a question about Azure DevOps (formerly Visual Studio Team Services, or VSTS). I have multiple applications that are set up as separate projects, but we have essentially one team of devs. Some of the older projects are TFS-based, some are Git.
Ideally I would like to create a board based on the team and 'attach' projects to that board, or something that ends up being roughly the equivalent of this.
I can't seem to find anything close to this. Does anyone have any ideas or suggestions?
Thanks for your help!
As I mentioned in my comment, you can use the Azure DevOps REST API.
Representational State Transfer (REST) APIs are service endpoints that support sets of HTTP operations (methods), which provide create, retrieve, update, or delete access to the service's resources.
Most REST APIs are accessible through our client libraries, which can be used to greatly simplify your client code.
Once you have created your own board, you can fill in the details using the REST API response.
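As a rough illustration (organization URL, PAT, and the WIQL query are placeholders), a cross-project query against the REST API could look like the TypeScript sketch below, which runs a WIQL query at the organization level and then expands the returned ids into work item details:

```typescript
// Rough sketch: query work items across all projects in the organization with the
// Azure DevOps REST API (WIQL), so the results can feed a single team-wide view.
// The organization name, PAT, and WIQL query are placeholders.
const ORG = "https://dev.azure.com/<your-organization>";
const PAT = process.env.AZURE_DEVOPS_PAT ?? "";
const auth = "Basic " + Buffer.from(":" + PAT).toString("base64");

async function queryAcrossProjects(): Promise<void> {
  // A WIQL query at the organization level (no project in the URL) searches
  // work items in every project the PAT can see.
  const wiqlResponse = await fetch(`${ORG}/_apis/wit/wiql?api-version=7.0`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: auth },
    body: JSON.stringify({
      query:
        "SELECT [System.Id] FROM WorkItems WHERE [System.State] <> 'Closed' ORDER BY [System.ChangedDate] DESC",
    }),
  });
  const wiql = await wiqlResponse.json();
  // The batch endpoint accepts at most 200 ids per request.
  const ids = wiql.workItems.slice(0, 200).map((wi: { id: number }) => wi.id);

  // Expand the ids into full work item details.
  const detailsResponse = await fetch(
    `${ORG}/_apis/wit/workitems?ids=${ids.join(",")}&api-version=7.0`,
    { headers: { Authorization: auth } }
  );
  const details = await detailsResponse.json();
  console.log(details.value.map((wi: any) => wi.fields["System.Title"]));
}

queryAcrossProjects().catch(console.error);
```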

Where to host Smartsheets API code

I am interested in learning to use the Smartsheets API. In the past I created workflows in Google Apps Script, which has a built in IDE that houses the script. Does Smartsheets have something similar? If not, where is a common place to keep your code and have it react to webhooks/events?
Regards,
Shawn
The API is just a way to communicate between your application and Smartsheet - there is no hosting for your executable code. Smartsheet provides a number of SDKs to make the calls easier to perform, but in theory you could use any language to issue the REST calls. So pretty much any service that allows executable code would work, such as Amazon AWS, Google Cloud, Microsoft Azure, or others. Here's a brief comparison of services.
You can start developing on your own computer before you worry about cloud deployment. See the getting started guide and samples here: https://github.com/smartsheet-platform/getting-started
If you really need to respond to webhooks, your code will have to run somewhere that accepts incoming HTTP calls from the Internet without being blocked by a firewall. This could be in your own data center, any of the cloud services, or via a tunnel such as https://ngrok.com/
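As a minimal sketch of that self-hosted approach, assuming an Express app and a Smartsheet API token (both placeholders), the following TypeScript shows a webhook endpoint that answers Smartsheet's verification challenge and then re-reads the affected sheet through the REST API when a callback arrives:

```typescript
// Minimal sketch of self-hosted code that reacts to Smartsheet webhooks.
// Smartsheet has no built-in script hosting, so an app like this would run on
// whatever service you choose (App Service, Cloud Run, a VM, or ngrok during
// development). Token and port values are placeholders.
import express from "express";

const app = express();
app.use(express.json());

const SMARTSHEET_TOKEN = process.env.SMARTSHEET_TOKEN ?? "";

app.post("/smartsheet-webhook", async (req, res) => {
  // When a webhook is enabled, Smartsheet sends a verification challenge that
  // must be echoed back before it will deliver real events.
  if (req.body.challenge) {
    res.json({ smartsheetHookResponse: req.body.challenge });
    return;
  }

  // A real callback: react to the change, e.g. re-read the affected sheet
  // through the REST API.
  const sheetId = req.body.scopeObjectId;
  const sheet = await fetch(`https://api.smartsheet.com/2.0/sheets/${sheetId}`, {
    headers: { Authorization: `Bearer ${SMARTSHEET_TOKEN}` },
  }).then((r) => r.json());
  console.log(`Sheet "${sheet.name}" changed`);

  res.sendStatus(200);
});

app.listen(3000, () => console.log("Listening for Smartsheet callbacks"));
```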