We have scheduler jobs that transfer data from source to destination (employee data to systems or file movers, configured in property files).
We have a REST API that exposes data related to an entity (for example, Employee or Organisation).
We have WildFly on Linux (clustered), Oracle, and Java integrations deployed.
Our manager asked us to provide IaaS, i.e. Integration as a Service. As we are already exposing a REST API, what more does IaaS comprise?
In our project we need to work on multiple services, so we started implementing Spring Cloud Contract testing between two internal services (e.g. BookService and EmployeeService).
Our requirement:
How can I implement Spring Cloud Contract for a third-party service (e.g. AccessService) where we can only access the APIs and have no control over the dev code base?
The base idea of contracts is to ensure that the producer API matches the requirements codified in the contract files. If that is not possible (you don't own the producer code base and you cannot open contract PRs against it), this approach will not be helpful for you. So use contracts for the services whose producer code you can verify against them, and use a different approach for external services.
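One possible "different approach" for an external service you don't control is a consumer-side schema test: instead of a contract the producer verifies, the consumer keeps its own lightweight check of the response shape it depends on, run against the real API or a recorded fixture. A minimal sketch (the field names are illustrative, not from any real AccessService):

```python
# Consumer-side schema check for a third-party API response.
# Field names and types below are illustrative assumptions only.

EXPECTED_SCHEMA = {
    "employeeId": int,
    "name": str,
    "active": bool,
}

def matches_schema(payload: dict, schema: dict) -> bool:
    """Return True if every expected field is present with the expected type."""
    return all(
        field in payload and isinstance(payload[field], expected_type)
        for field, expected_type in schema.items()
    )

# In a real test this payload would come from the third-party API (or a
# recorded fixture), and the build fails when the shape drifts.
sample = {"employeeId": 42, "name": "Ada", "active": True}
assert matches_schema(sample, EXPECTED_SCHEMA)
assert not matches_schema({"name": "Ada"}, EXPECTED_SCHEMA)
```

This doesn't give you the producer-side guarantees of a real contract, but it at least alerts you when the external API changes shape underneath you.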
I'm trying to create a VM using the vSphere Automation SDK REST API and vRealize Orchestrator, but the Orchestrator workflow I'm using needs the host and datastore on which to create the VM (I'm cloning a VM from a template).
My problem is that my datastores are not shared by all clusters (and hosts), so I need to be careful to create the VM with a matching host and datastore.
With the vSphere Automation SDK REST API I can easily get the list of hosts and datastores (here's the doc I found: https://code.vmware.com/doc/preview?id=4645), but none of the "list" or "get" requests give me links between hosts and datastores.
How can I get the relations between my datastores and my hosts so that I can call Orchestrator with the correct parameters?
Thank you.
Check out https://github.com/doublecloudinc/vim-rest-api
The REST API supports the full feature set of the vSphere APIs, and scales across many vCenter/ESXi servers. You can list all the hosts and pick one to get its datastore property with a couple of REST calls.
Also, vBrownBag tech talk: https://youtu.be/EpMlP27gEEM
Disclaimer: I am the creator of the REST APIs.
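As an alternative, the host-to-datastore relation is available in the core vSphere object model: each `HostSystem` carries a `datastore` property listing the datastores that host can reach (pyVmomi exposes this if you can use the SOAP API alongside the Automation REST API). A sketch of building the mapping, shown with plain stand-in objects rather than a live vCenter connection:

```python
# Build a host -> datastores map from vSphere-style inventory objects.
# Assumes each host object has .name and .datastore (a list of objects with
# .name), which is the shape pyVmomi's HostSystem exposes; the SimpleNamespace
# fakes below stand in for a live vCenter connection.
from types import SimpleNamespace

def map_hosts_to_datastores(hosts):
    """Return {host name: [datastore names]} so a VM can be placed on a
    host/datastore pair that actually belong together."""
    return {h.name: sorted(ds.name for ds in h.datastore) for h in hosts}

# Stand-in inventory: two hosts; only 'ds-shared' is visible to both.
ds_local = SimpleNamespace(name="ds-local-01")
ds_shared = SimpleNamespace(name="ds-shared")
hosts = [
    SimpleNamespace(name="esx-01", datastore=[ds_local, ds_shared]),
    SimpleNamespace(name="esx-02", datastore=[ds_shared]),
]

mapping = map_hosts_to_datastores(hosts)
assert mapping["esx-02"] == ["ds-shared"]
```

With that mapping in hand, the Orchestrator workflow can be called with a host/datastore pair that is guaranteed to match.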
I have a REST service that I call from my client app. The service uses JSON to manage data.
I want to insert SAP NetWeaver Gateway between the client and the REST server to expose the REST data as OData.
The REST service has GET and PUT methods to read and write data from/to the DB.
Now I have to decide which way to start:
1. Translate data from/to the REST server using ABAP code to serialize data in both directions (read the REST data and build OData in response to an OData GET call; build the REST call in response to an OData POST call).
2. Use SMP - SAP Mobile Platform (Eclipse plugin) to write integration code (in JavaScript) that consumes the REST service for reading and writing and exposes it as OData. http://scn.sap.com/community/developer-center/mobility-platform/blog/2015/04/08/integration-gateway-rest-data-source-overview-of-blogs
The second solution seems to be the best way (no ABAP programming required, JavaScript is a high-level language, and there are parser libs to manage OData and JSON), but I don't know if SMP was created for this kind of work.
And what is the result of the second method? It seems to be a zip file (similar to a WAR) that I can push to SAP Gateway to deploy the integration logic. Right? And how can I test my code without deploying the zip to SAP Gateway every time?
The recommended approach is to create an OData service in SAP Gateway and consume it in your app using SMP 3 or HCPms (SAP's mobile solution on the cloud).
Use the SMP SDK to consume the OData service.
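Whichever route you pick, much of the translation work concerns payload shape: OData v2 JSON wraps responses in a `d` envelope (with collections under `d.results`), while a plain REST service typically returns a bare object or array. A rough sketch of that mapping (field names are made up for illustration; this is the envelope shape only, not any SAP API):

```python
# Sketch of converting between a plain REST JSON payload and an
# OData v2-style JSON envelope. OData v2 nests everything under "d",
# and collections under "d"."results". Field names are illustrative.

def to_odata_v2(rest_payload):
    """Wrap a dict (single entity) or list (collection) in the OData v2 envelope."""
    if isinstance(rest_payload, list):
        return {"d": {"results": rest_payload}}
    return {"d": rest_payload}

def from_odata_v2(odata_payload):
    """Unwrap an OData v2 envelope back to the plain REST shape.
    (Naive: a single entity with a literal 'results' field would be
    misread as a collection; fine for a sketch.)"""
    body = odata_payload["d"]
    return body["results"] if "results" in body else body

rows = [{"Id": 1, "Name": "Widget"}]
assert from_odata_v2(to_odata_v2(rows)) == rows
```

The real Gateway/SMP tooling handles metadata, `$filter`, paging etc. on top of this, but the envelope above is the core of what "exposing REST data as OData" means at the payload level.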
I'm a student and we (the team) are trying to host our graduation project on Azure. We mainly have these components:
1- A crawler that crawls data from websites, written in C#.
2- A MySQL database that stores the crawled data and the users' interactions from the ASP.NET website.
3- A Java RESTful web service that processes the collected and crawled data and sends results to the website.
4- In addition, the ASP.NET website that displays the data processed by the web service, plus other data from the DB.
What are the best options to host all these components on Windows Azure? Should we use Azure Cloud Services and the Web Sites service, or just normal virtual machines?
We don't have much knowledge about the cloud, so if you could also provide us with some resources that would help us deploy these components, we would appreciate it. Thanks in advance :)
1- A crawler that crawls data from websites, written in C#.
R: Worker Role (Cloud Services).
2- A MySQL database that stores the crawled data and the users' interactions from the ASP.NET website.
R: Virtual Machine / ClearDB (MySQL from the Azure Store).
3- A Java RESTful web service that processes the collected and crawled data and sends results to the website.
R: If you chose a Virtual Machine for item 2, you can use the same VM for this.
4- In addition, the ASP.NET website that displays the data processed by the web service, plus other data from the DB.
R: Azure Web Sites / Web Role (Cloud Services) / the same VM as items 2 and 3.
If it's a simple web site, go with WAWS (Windows Azure Web Sites); if you need more control, go with a Web Role.
More info about execution models, it will clarify each one:
http://www.windowsazure.com/en-us/documentation/articles/choose-web-site-cloud-service-vm/
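The Worker Role suggested for the crawler (item 1) is essentially a long-running loop: poll for work, process it, store the result, sleep, repeat. The pattern, sketched in Python for brevity (the real crawler would be C#), with the fetch/crawl/store steps injected so the loop itself stays testable:

```python
# Worker-role-style processing loop. The fetch_urls/crawl/store callables
# are stand-ins for the real crawler pieces; iterations is bounded here,
# whereas a real worker role loops until the role is stopped.
import time

def run_crawler(fetch_urls, crawl, store, iterations, delay=0.0):
    """Run the poll -> crawl -> store cycle a fixed number of times."""
    for _ in range(iterations):
        for url in fetch_urls():
            store(url, crawl(url))
        time.sleep(delay)

# Stand-ins for illustration:
seen = {}
run_crawler(
    fetch_urls=lambda: ["http://example.com/a"],
    crawl=lambda url: f"<html>{url}</html>",
    store=seen.__setitem__,
    iterations=1,
)
assert seen == {"http://example.com/a": "<html>http://example.com/a</html>"}
```

The same loop body is what you would put in a Worker Role's `Run()` method; the role infrastructure handles restarts and scaling around it.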
As I want to set up a BAM (Business Activity Monitoring) solution, I'm wondering whether there's any open-source BAM solution that accepts REST requests as events and provides a REST interface to query for events.
I found an open-source BAM solution created by Jos Dirksen, author of SOA Governance in Action, but I couldn't get the Maven dependencies from the http://mvn.riptano.com/content/repositories/public/ repository (mvn.riptano.com refuses unauthorized connections; the connection requires authentication).
In [1] you can find how to publish data to WSO2 BAM 2.4.0. First you create a stream (a predefined set of fields to be published) using the REST API; then you can send data to BAM via the REST API. The data is saved in Cassandra, where you can analyze and summarize it using a Hadoop cluster. In most cases the summarized data is stored in an RDBMS, from which it can be visualized in a dashboard or used to generate a report.
[1] http://docs.wso2.org/display/BAM240/Sending+Events+through+the+REST+API
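A rough sketch of the two payloads that flow involves: first a stream definition (the schema of the fields you will publish), then events carrying values for those fields. The metaData/correlationData/payloadData grouping follows the WSO2 BAM 2.x event-stream model, but the stream name, field names, and types below are illustrative; see [1] for the actual endpoints and payload details:

```python
import json

# Build the two JSON payloads a WSO2 BAM 2.x-style REST publisher sends.
# Names and types here are illustrative assumptions, not a real stream.

def stream_definition(name, version, payload_fields):
    """Build a stream definition dict: name, version, typed payload fields."""
    return {
        "name": name,
        "version": version,
        "payloadData": [{"name": f, "type": t} for f, t in payload_fields],
    }

def event(payload_values):
    """Build one event carrying values for the stream's payload fields."""
    return {"payloadData": payload_values}

defn = stream_definition("org.example.orders", "1.0.0",
                         [("orderId", "STRING"), ("amount", "DOUBLE")])
evt = event(["ORD-1001", 99.5])

# Both would be serialized and POSTed to BAM's data-receiver REST
# endpoints (stream creation first, then events); see [1] for the URLs.
assert json.loads(json.dumps(defn))["payloadData"][0]["name"] == "orderId"
```

Once the stream exists, every event only needs to supply values in the order the definition declared the fields.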