How to generate a swagger file for the header "Content-Type: application/x-www-form-urlencoded" in Informatica cloud?

I want to generate a Swagger file for the REST V2 connector in Informatica Cloud with these details.
POST CALL:
Accept: application/json
Content-Type: application/x-www-form-urlencoded
Raw Body:
token=XXXXXXX&content=record&format=csv
But Informatica Cloud does not have an option for application/x-www-form-urlencoded.
I am able to make the same request in Postman, as Postman has all the functionality.
I even tried putting the Content-Type separately in the headers section while generating the Swagger file in Informatica Cloud, but that still didn't work.
Someone told me to use this website for creating the Swagger file: http://specgen.apistudio.io. However, the site does not seem secure, so I cannot enter any sensitive data.
Is there any way I could generate the file through a website or through Informatica itself?

A Swagger file cannot be generated for the header “Content-Type: application/x-www-form-urlencoded” in Informatica Cloud.
What you can do instead is use curl for the REST API call in the pre/post-processing command of a Mapping Task or Data Synchronization Task. You can take a look at curl commands here:
https://www.baeldung.com/curl-rest
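For the POST call described in the question, a pre/post-processing command could look like the sketch below. The endpoint URL is a placeholder (the real URL is not given in the question), and the token value is kept as the masked placeholder from the question. Note that curl sends Content-Type: application/x-www-form-urlencoded by default whenever --data is used, so only the Accept header needs to be set explicitly.

```shell
#!/bin/sh
# Placeholder endpoint and masked token -- substitute your real values.
API_URL="https://api.example.com/export"
TOKEN="XXXXXXX"

# The raw form-urlencoded body from the question.
BODY="token=${TOKEN}&content=record&format=csv"

# --data implies a POST with Content-Type: application/x-www-form-urlencoded.
curl -s -H "Accept: application/json" --data "$BODY" "$API_URL" || true  # placeholder host will not resolve
```

Put the real endpoint and token in place before using this as the pre/post-processing command.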
Another way, if you want to avoid curl, is to create a 'service connector' for the REST call in Application Integration.
It is also possible to run Data Integration tasks from Application Integration if you want to run them after using the service connector.
The way it works is:
1. Create a service connector.
2. Create the connection for the service connector.
3. Create a process.
4. Inside the process, use various services. The first service can run the API connection you just made; then you can use another service to run a Data Integration task, which is available under 'System service -> Run cloud task'.
This way you can get the work done without creating a Swagger file, since Informatica Cloud does not accept “Content-Type: application/x-www-form-urlencoded”.

Related

How to download data from a streaming endpoint using InvokeHTTP (NiFi 1.20)?

I'm currently trying to download data from an infinite streaming API endpoint (one that never closes) using the InvokeHTTP processor. I'm using NiFi 1.20. I'm able to connect to the API, but I'm not able to download any data. However, when I connect to other endpoints in the same API whose HTTP responses end the connection, data streams through the processor.
Am I missing some config parameters, or do I have to use another approach?
I've tested most of the settings in the processor. In addition, I have verified that the OAuth2 token is valid by downloading from the streaming endpoint using curl.
PS: I'm trying to download data from the BarentsWatch API.

How to copy a file from AWS rest API gateway to s3 bucket?

Using API Gateway, I created an S3 bucket to which I want to copy an image (image/jpg). This page describes how I can upload images using Amazon's API Gateway: https://aws.amazon.com/premiumsupport/knowledge-center/api-gateway-upload-image-s3/.
When I type the URL by adding the bucket and object name, I get the following error:
{"message":"Missing Authentication Token"}
I would like to know, first of all, where the API can find my image. How can I verify that the image is a local file, as stated in the introduction? In addition to that, should I use the curl command to complete the transformation? I am unable to use Postman.
I have included a note from the link, how can I change the type of header in a put request?
What is the best way to copy files from the API gateway to S3 without using the Lambda function?
Assuming you set up your API and bucket following the instructions in the link, you can upload a local file to your bucket using a curl command like the one below.
curl -X PUT -H "Accept: image/jpeg" https://my-api-id.execute-api.us-east-2.amazonaws.com/my-stage-name/my-bucket-name/my-local-file.jpg
Note that the header indicates it will accept JPEG files only. Depending on the file you are uploading to S3, you will change/add this header.
To answer the questions directly:
What is the best way to copy files from API Gateway to S3 without using a Lambda function? - follow the steps in this link: https://aws.amazon.com/premiumsupport/knowledge-center/api-gateway-upload-image-s3/
Where can the API find my image? - your image is located on your local host/computer. You use the curl command to upload it via the API that you created.
Should I use the curl command to complete the transformation? - not sure what you meant by transformation, but you use the curl command to upload the image file to your S3 bucket via API Gateway.
How can I change the type of header in a PUT request? - you use -H to add headers in your curl command.
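As a concrete illustration of the -H point, here is a hedged sketch of the same PUT with the header changed to match a PNG file instead of a JPEG. The api-id, region, stage, bucket, and file name are all placeholders, and the stand-in file is created locally just so the command is self-contained.

```shell
#!/bin/sh
# Hypothetical values -- substitute your api-id, region, stage, bucket, and file.
URL="https://my-api-id.execute-api.us-east-2.amazonaws.com/my-stage-name/my-bucket-name/photo.png"
HEADER="Accept: image/png"

# Create a stand-in file so this sketch is self-contained.
printf 'not-really-a-png' > photo.png

# -H sets or overrides a header on the request; repeat -H for each extra header.
curl -X PUT -H "$HEADER" --data-binary "@photo.png" "$URL" || true  # placeholder URL will not resolve
```

The same pattern applies to any content type: change the header value and the uploaded file together.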

Send REST request to get data obtained by Postman

I'm in the process of getting some data from Salesforce in order to store it in GCP. It seems that there doesn't exist any tool that directly connects both sides, so the way I'm doing it is by using Postman to send a REST request to Salesforce and thereby getting the data. Now, my question is how I should proceed in order to store that data in Cloud Storage or BigQuery, as I can't find a way to create a channel between GCP and Postman (if that is the right thing to do). Any advice would be much appreciated.
I think it would be best to at least code a prototype or a Python script for doing this. But you could probably use curl to hit the Salesforce API, push the response to a local file, and use the Cloud SDK CLI (see the example from the docs) to then send it to Cloud Storage, bearing in mind that the results from the API call to Salesforce would be in raw JSON format. You can probably combine the different commands into a single bash script to make running end to end repeatable once you have the individual commands working correctly.
curl https://***instance_name***.salesforce.com/services/data/v42.0/ -H "Authorization: Bearer access_token_from_auth_call" > response.txt
gsutil cp ./response.txt gs://your-gs-bucket
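The two commands above can be combined into one repeatable script along these lines. The instance name, access token, and bucket are placeholders standing in for the masked values above, and the gsutil step assumes an installed and authenticated Cloud SDK.

```shell
#!/bin/sh
# Placeholders -- substitute your Salesforce instance, access token, and bucket.
SF_URL="https://my-instance.salesforce.com/services/data/v42.0/"
TOKEN="access_token_from_auth_call"
BUCKET="gs://your-gs-bucket"
OUT="response.txt"

# Step 1: pull the raw JSON from the Salesforce REST API into a local file.
curl -s -H "Authorization: Bearer ${TOKEN}" "$SF_URL" > "$OUT" || true  # placeholder host will not resolve

# Step 2: copy the file to Cloud Storage (requires an authenticated gsutil).
gsutil cp "./${OUT}" "$BUCKET" || true  # needs the Cloud SDK installed and authenticated
```

Running the script on a schedule (e.g. via cron) would make the end-to-end transfer repeatable.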

Patching Wildfly using native or HTTP management API

I need to patch WildFly 10 using the HTTP or native management API. I know how to do it with the CLI using the command
patch apply /home/user/patch.zip
but is it possible to apply a patch using the HTTP or native management API?
Also, will that patch be applied to all servers in the targeted server group?
It is possible; it can be done using the native management API.
https://github.com/wildfly/wildfly-core/blob/master/patching/src/main/java/org/jboss/as/patching/tool/PatchOperationTarget.java#L384
It can also be done by calling the /management-upload URL using a multipart request. One part of the multipart is the patch zip file, and the second part is JSON holding the request for the CLI command /core-service=patching:patch(input-stream-index=0).
You can view this behavior by patching the server through the admin console and watching the network tab of your web browser's developer console.
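A rough curl sketch of that multipart call might look like the following. The host, credentials, part names, and the exact JSON shape of the operation are assumptions based on the description above, not a verified recipe; confirm them against your WildFly version by watching the admin console's network traffic as suggested.

```shell
#!/bin/sh
# Assumed host, credentials, and patch path -- substitute your real values.
HOST="http://localhost:9990"
PATCH="/home/user/patch.zip"

# Assumed DMR-as-JSON form of /core-service=patching:patch(input-stream-index=0);
# verify the exact shape in your browser's network tab.
OPERATION='{"operation":"patch","address":[{"core-service":"patching"}],"input-stream-index":0}'

# One multipart part carries the patch zip, the other the JSON operation.
curl --digest -u admin:password \
  -F "patch=@${PATCH}" \
  -F "operation=${OPERATION};type=application/json" \
  "${HOST}/management-upload" || true  # placeholder host and credentials
```

input-stream-index=0 tells the patching operation to read the zip from the first attached stream.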

REST: soapUI groovy script not running on tomcat server

I have a problem executing the Groovy script of my REST mock service project on a Tomcat server.
I'm using soapUI Pro 5.1.0 to create a complete REST web service with the mock service.
After creating the resource (/api/1/mobile/users), I create a POST method that contains several requests, and I generate the corresponding REST responses under the (/api/1/mobile/users) resource. Then, to return the right response for each request, I develop a Groovy script that checks the request content and sends the right response. The script runs fine locally in the soapUI tool, but when I deploy the project on the Tomcat server and try to send a request with the client, the default response is sent for all requests instead of the targeted response. It seems that the script is not executed, and no log info is received.