Customizing the Hyperledger Composer REST Server - ibm-cloud

I have a test Hyperledger Fabric network running in the IBM Cloud, provisioned with the IBM Blockchain Application Service. I also have a Kubernetes cluster running the Hyperledger Composer REST Server. Everything works great, but how do I extend the REST API with some custom APIs?
The documentation mentions being able to use the Swagger definition (YAML file) with the IBM API Connect or StrongLoop products... but how do I do that, given that I don't see any way to export the Swagger definition?

I don't have a tutorial, but as I see it you would minimally have to do three tasks:
1. Write your REST APIs (to do what you want to do). These resources may help: LoopBack -> https://developer.ibm.com/code/patterns/create-rest-apis-using-loopback/ and, on REST architecture in general, https://www.ibm.com/developerworks/library/ws-restful/
2. Build the routes for your custom APIs (a minimal sketch follows this list).
3. Customise your Swagger definitions to document your REST APIs. This may help: https://www.ibm.com/developerworks/library/wa-use-swagger-to-document-and-define-restful-apis/index.html and Swagger itself -> http://swagger.io/
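
As a rough illustration of step 2, here is a minimal Express sketch of a custom route that sits alongside the generated Composer REST API and proxies a query to it. The base URL, namespace and asset name (org.example.Trade) are placeholders I have invented for the example, not values from your network.

```typescript
// Minimal sketch of a custom REST route that calls the Composer REST server.
// COMPOSER_REST_URL and org.example.Trade are placeholders.
import express from "express";
import fetch from "node-fetch";

const app = express();
const COMPOSER_REST = process.env.COMPOSER_REST_URL ?? "http://localhost:3000";

// Custom endpoint that aggregates data on top of the generated CRUD API.
app.get("/custom/trades/summary", async (_req, res) => {
  try {
    const response = await fetch(`${COMPOSER_REST}/api/org.example.Trade`);
    if (!response.ok) {
      res.status(response.status).json({ error: response.statusText });
      return;
    }
    const trades = (await response.json()) as Array<{ amount: number }>;
    const total = trades.reduce((sum, t) => sum + (t.amount ?? 0), 0);
    res.json({ count: trades.length, totalAmount: total });
  } catch (err) {
    res.status(500).json({ error: (err as Error).message });
  }
});

app.listen(4000, () => console.log("Custom API listening on :4000"));
```

You would then document this extra route in your Swagger definition (step 3) so that API Connect or StrongLoop can pick it up.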

If you write a custom API and call it as a third-party API (an approach IBM already mentions), it will get you a response, but it has a very limited scope. I believe you should use Swagger to check that its endpoints are properly configured.


How to mock the Kubernetes cluster/server?

Kubernetes OpenAPI specification is hosted here.
https://github.com/kubernetes/kubernetes/tree/master/api/openapi-spec
Additionally, various client APIs for Kubernetes are provided here:
https://kubernetes.io/docs/reference/using-api/client-libraries/
Using the OpenAPI specification, I am able to generate the server code, which provides the REST services. However, applications using the K8s client APIs (written in any language - Go, Java, etc.) do not use these REST APIs directly.
My objective is to mock the K8s server to use in the test automation and build a controlled environment to create various test scenarios.
Is there any ready-to-use Kubernetes mock available? If not, how can we interface the client APIs with the REST server generated from the OpenAPI specification above? That way, the applications would continue to use the client APIs but would internally be communicating with the mocked K8s server and not the real one.
Please help with the options.
Not really a direct answer to your question, but most solutions I have seen implemented do not try to mock the k8s API; they really use it, through either k3s (from Rancher Labs) or the KinD project (the official way).
You then connect to it like a normal Kubernetes cluster (see the sketch below).
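
To make that concrete, here is a hedged sketch of test code pointing at a disposable KinD or k3s cluster through the standard Kubernetes JavaScript client (@kubernetes/client-node). It assumes a cluster such as one created with `kind create cluster` is the current context in your kubeconfig; nothing here is specific to KinD.

```typescript
// Sketch: point a test at a real but disposable cluster instead of a mock.
// Assumes the kind/k3s context is the current one in ~/.kube/config.
import * as k8s from "@kubernetes/client-node";

async function listPods(): Promise<void> {
  const kc = new k8s.KubeConfig();
  kc.loadFromDefault(); // picks up the kind/k3s kubeconfig like any other cluster

  const core = kc.makeApiClient(k8s.CoreV1Api);
  // Note: depending on the client-node version, the result may or may not be
  // wrapped in a `body` field, so handle both shapes here.
  const pods = await core.listPodForAllNamespaces();
  const items = (pods as any).body?.items ?? (pods as any).items ?? [];
  for (const pod of items) {
    console.log(pod.metadata?.namespace, pod.metadata?.name);
  }
}

listPods().catch((err) => {
  console.error("Could not reach the test cluster:", err);
  process.exit(1);
});
```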

Hyperledger REST Server or SDK

With Hyperledger Composer deprecated and fabric-sdk-rest (https://github.com/hyperledger-archives/fabric-sdk-rest) archived, what is the future roadmap of a REST API for Hyperledger Fabric? What is the recommended way to expose REST APIs with Hyperledger Fabric 1.4.x? The documentation (https://hyperledger-fabric.readthedocs.io/en/release-1.4/fabric-sdks.html) mentions that a REST SDK will be provided in subsequent releases, but I cannot find a roadmap.
When you say REST API, I assume you mean a REST API to access your Hyperledger Fabric application? If so, then you would need to implement that yourself. As far as I know there is no plan to provide any sort of generic REST server that can access smart contracts hosted in Fabric.
There were plans to build a Hyperledger Fabric REST SDK on top of the Node.js SDK, and work was started on this. The FABR key on the Hyperledger Jira tracker tracks all the related stories (see https://jira.hyperledger.org/projects/FABR).
The related repository is https://github.com/hyperledger-archives/fabric-sdk-rest
However, as you can see, the project has been archived. Many people have made their own REST APIs which expose the internal Fabric functionality through REST, myself included. Another example is the one made by Altoros (see https://github.com/Altoros/fabric-rest). The problem is that these are not always kept up to date and have questionable authentication (if any).
It's sad to say but the first thing you would have to develop to work with Hyperledger Fabric is your own set of tools.
For a very basic example of a REST API implementation for Hyperledger Fabric with the Node.js SDK, I would refer you to https://medium.com/@kctheservant/an-implementation-of-api-server-for-hyperledger-fabric-network-8764c79f1a87.
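
In the same spirit, here is a hedged, minimal sketch of such a self-built REST wrapper using fabric-network (as shipped with Fabric 1.4.x). The channel name, chaincode name, transaction name, wallet directory and connection.json path are placeholders; substitute the values from your own network setup.

```typescript
// Minimal sketch of a REST wrapper around the Fabric Node SDK (fabric-network 1.4.x).
// "mychannel", "mycontract", "queryAllAssets", "appUser", ./wallet and
// ./connection.json are placeholders from your own network configuration.
import express from "express";
import * as fs from "fs";
import { FileSystemWallet, Gateway } from "fabric-network";

const app = express();

app.get("/api/assets", async (_req, res) => {
  const gateway = new Gateway();
  try {
    const ccp = JSON.parse(fs.readFileSync("./connection.json", "utf8"));
    const wallet = new FileSystemWallet("./wallet");
    await gateway.connect(ccp, {
      wallet,
      identity: "appUser",
      discovery: { enabled: true, asLocalhost: true },
    });
    const network = await gateway.getNetwork("mychannel");
    const contract = network.getContract("mycontract");
    const result = await contract.evaluateTransaction("queryAllAssets");
    res.json(JSON.parse(result.toString()));
  } catch (err) {
    res.status(500).json({ error: (err as Error).message });
  } finally {
    gateway.disconnect();
  }
});

app.listen(3000, () => console.log("Fabric REST wrapper on :3000"));
```

As the answer above notes, authentication is the part these home-grown wrappers usually skimp on, so put something like a token check in front of routes like this before exposing them.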

REST endpoint registration and bootstrap (creating range indexes) using UDeploy

I have my code in a Git repository. I am using UDeploy to deploy my code into a MarkLogic environment. I am able to move all my modules successfully, but I am facing two problems:
1. Creating new indexes
2. REST endpoint creation
Please let me know if there is any way to implement these two.
For creating indexes, I have tried doing it with the Admin API functions (admin:database-range-element-index()) and was successful with that part. But is there any way to do it from UDeploy or a DevOps pipeline?
For registering the REST endpoint, I couldn't find any way at all.
Have you looked at MarkLogic's REST Management API (https://docs.marklogic.com/REST/management)? In particular, see whether https://docs.marklogic.com/REST/POST/manage/v2/databases will help you create indexes via the Management API.
The most common way to deploy MarkLogic code and configuration is ml-gradle, a plugin for the widely used Gradle build tool. ml-gradle uses MarkLogic's Management API, mentioned by Ganesh, and is scriptable (a rough sketch of a direct Management API call follows below).
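
If you want to call the Management API yourself from a deployment script rather than through ml-gradle, the sketch below shows the general shape of a range element index update. The host, port, credentials, database name and element name are placeholders; note also that the Manage app server uses digest authentication by default, so the basic-auth header here is an assumption, and that the property PUT generally expects the complete desired index list rather than a single addition (ml-gradle handles that merging for you).

```typescript
// Rough sketch: adding a range element index via the Management API
// (PUT /manage/v2/databases/{db}/properties). All values are placeholders.
import fetch from "node-fetch";

async function addRangeElementIndex(): Promise<void> {
  const url = "http://localhost:8002/manage/v2/databases/Documents/properties";
  const auth = Buffer.from("admin:admin").toString("base64"); // assumes basic auth is enabled

  const body = {
    "range-element-index": [
      {
        "scalar-type": "string",
        "namespace-uri": "",
        "localname": "myElement", // placeholder element name
        "collation": "http://marklogic.com/collation/",
        "range-value-positions": false,
      },
    ],
  };

  const res = await fetch(url, {
    method: "PUT",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Basic ${auth}`,
    },
    body: JSON.stringify(body),
  });
  if (!res.ok) {
    throw new Error(`Index update failed: ${res.status} ${await res.text()}`);
  }
  console.log("Range element index submitted");
}

addRangeElementIndex().catch(console.error);
```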

How to use an SAP ABAP RFC in SAPUI5 without using NetWeaver Gateway?

There is an SAP ABAP standard table that I'm trying to access in SAPUI5.
I have created an RFC. How do I use this RFC in SAPUI5 to get the data there without using NetWeaver Gateway?
You can use a REST service or web service on the SAP system and consume its URL in your SAPUI5 application. You can create an SICF service and develop a handler for it. In the handler you can fetch the content from the table and return it (see the sketch after this answer).
Thanks and Regards,
Veera
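
As a rough sketch of the consuming side of Veera's suggestion: assuming you have created an SICF service at a path such as /sap/bc/zmy_table_service (a placeholder name) whose ABAP handler serializes the table rows to JSON, the UI5 application could fetch it like this.

```typescript
// Sketch of consuming a custom SICF service from the UI5 side.
// /sap/bc/zmy_table_service is a placeholder path; the ABAP handler is
// assumed to return the table content as JSON.
interface TableRow {
  [field: string]: string | number;
}

async function loadTableData(): Promise<TableRow[]> {
  const response = await fetch("/sap/bc/zmy_table_service", {
    headers: { Accept: "application/json" },
    credentials: "include", // reuse the SAP logon session / SSO cookie
  });
  if (!response.ok) {
    throw new Error(`SICF service returned ${response.status}`);
  }
  return (await response.json()) as TableRow[];
}

// e.g. in a controller's onInit():
//   this.getView().setModel(new sap.ui.model.json.JSONModel(await loadTableData()));
```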
I agree with Veera's answer and think that this is the best way of doing it without the Gateway when your application is deployed on the ABAP system (i.e. in the BSP repository). For completeness' sake, I will also describe an alternative way of doing it if you are interested in exposing the application on the HANA Cloud Platform (HCP).
ABAP RFCs can be consumed through the HANA Cloud Connector (HCC) by HCP applications. So, if you wanted to put your application in HCP, an idea would be to expose the RFC through the HCC, consume it e.g. in a Java application, and then expose it to a UI5 app through that Java app (e.g. with a Servlet or a JAX-RS service). You can find an example of such a scenario in this repository, and here you can find the SAP documentation about this.
Another HCP example is SFlight Sample Application.
Open-source Python and Node.js RFC connectors are also available if an RFC connection is required (a hedged example follows below).
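
For the Node.js route, here is a hedged sketch using the open-source node-rfc connector (promise API of node-rfc 2.x; it requires the SAP NW RFC SDK installed on the machine). The connection parameters are placeholders, and STFC_CONNECTION is just the standard test RFC; you would call your own RFC instead. Note this runs server-side, e.g. behind your own REST endpoint that the UI5 app calls, not in the browser.

```typescript
// Hedged sketch: calling an RFC from Node.js with node-rfc.
// All connection parameters below are placeholders.
import { Client } from "node-rfc";

async function callRfc(): Promise<void> {
  const client = new Client({
    ashost: "sap-host.example.com",
    sysnr: "00",
    client: "100",
    user: "RFC_USER",
    passwd: "secret",
    lang: "EN",
  });

  await client.open();
  // STFC_CONNECTION is a standard test RFC that echoes the request text;
  // replace it with your own RFC and parameters.
  const result = await client.call("STFC_CONNECTION", { REQUTEXT: "Hello from the backend" });
  console.log(result.ECHOTEXT);
  await client.close();
}

callRfc().catch(console.error);
```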

Azure Service Fabric and API Management Integration

I have provisioned a service fabric cluster on azure. It has two node types: one for the frontend and the other one for the backend.
I have deployed a stateless self-hosted API on the frontend node type. Now what I'd like to do is expose that service through Azure API Management. I've been trying to import the API without success. I have also tried to use Swagger to generate the service specification, but it seems that Swagger does not work: I can access the Swagger URL, but it loads a blank page.
Any suggestions on how I can integrate my stateless service with APIM, or how Swagger works here?
Thanks a lot.
There are a few different ways you can produce your Swagger document. If you are using Web API, you can use Swashbuckle to generate the Swagger for you. To validate your Swagger file, you can use the Swagger Editor. Finally, if you still have problems once you have validated your Swagger document, share the document and the error here and we will see what we can do to assist (a quick diagnostic sketch follows below).
Many thanks
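
Not an APIM-specific fix, but one quick way to rule out a routing or middleware problem behind the blank page is to fetch the swagger document directly and check that it parses. The URL, port and path below are placeholders for wherever your service exposes its swagger.json; this is only a diagnostic sketch under that assumption.

```typescript
// Diagnostic sketch: fetch the swagger document and check it is non-empty,
// valid JSON before trying to import it into APIM. URL is a placeholder.
import fetch from "node-fetch";

async function checkSwagger(): Promise<void> {
  const url = "http://my-cluster.region.cloudapp.azure.com:8080/swagger/v1/swagger.json";
  const res = await fetch(url);
  console.log("HTTP status:", res.status, res.headers.get("content-type"));

  const text = await res.text();
  if (!text.trim()) {
    console.error("Empty response - the swagger middleware may not be wired up.");
    return;
  }
  const doc = JSON.parse(text); // throws if the document is not valid JSON
  console.log("Swagger/OpenAPI version:", doc.swagger ?? doc.openapi);
  console.log("Paths exposed:", Object.keys(doc.paths ?? {}).length);
}

checkSwagger().catch(console.error);
```

If this returns a valid document, the same URL (or the downloaded file, after validating it in the Swagger Editor) is what you would import into APIM.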