I found many similar questions related to this, but not the particular answer I am looking for. My requirement is a little different, so I ended up posting the following question.
I want to automate REST APIs, and I have two options for the same case.
The first one is Rest Assured and the second one is the Play Framework.
For example, to test this RestAPI:
http://servername:9000/dbs/all/list/m1/p1/sch1
(↑ This returns an XML response.)
So, I have written code in Java with Rest Assured, and it is working fine. I integrated this into a Maven project so that it can be integrated with Jenkins.
Sample code:
import static com.jayway.restassured.RestAssured.expect;
import static org.hamcrest.Matchers.equalTo;

import org.testng.annotations.Test;

public class TestNGSimpleTest2 {

    @Test
    public void testApi() {
        expect().
                statusCode(200).
                body("Status", equalTo("Success")).
        when().
                get("http://localhost:9000/dbs/all/list/m1/p1/sch1");
    }
}
So my first question is:
Is Rest Assured the best tool to use?
Is the Play Framework better?
I found many other tools like JMeter, RightAPI, etc. for testing REST APIs, but I don't think those can be automated. Am I right?
For automating REST API testing, as a starting point I recommend using Postman and newman.
Postman provides an excellent UI for building requests, and newman is its command-line counterpart. After you create a set of requests and corresponding tests within the Postman UI, you can run the entire collection from Jenkins via newman, preventing a deployment if tests fail.
The RestAssured code you have posted will work just fine for basic cases. It's not necessarily the "right tool" if you want to:
continuously add new test cases and don't have many resources
propagate alerts with well-formed error messages (especially to places like Slack or GitHub)
reduce false positives (see the sketch after this list)
re-use the same tests for monitoring
Building these features takes time and resources, which, depending on the size of your team, may or may not be a good call.
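For example, here is a minimal sketch of what "reducing false positives" and "well-formed error messages" can involve when you build them yourself on top of Rest Assured: a hand-rolled retry around the call, ending in a message an alerting integration could post to Slack or GitHub. The retry budget, the sleep interval, and the reuse of the question's endpoint are illustrative assumptions, not recommendations.

import static com.jayway.restassured.RestAssured.get;

public class RetryingStatusCheck {

    public static void main(String[] args) throws InterruptedException {
        // Endpoint reused from the question above; swap in your own.
        String url = "http://localhost:9000/dbs/all/list/m1/p1/sch1";
        int attempts = 3;        // assumed retry budget to absorb transient blips
        int lastStatus = -1;

        for (int i = 1; i <= attempts; i++) {
            lastStatus = get(url).statusCode();
            if (lastStatus == 200) {
                return;          // healthy, nothing to alert on
            }
            Thread.sleep(2000);  // back off before retrying
        }
        // A well-formed message that an alerting integration could forward as-is.
        throw new AssertionError("GET " + url + " returned " + lastStatus
                + " after " + attempts + " attempts");
    }
}

Every extra behaviour like this is more code to own, which is exactly the trade-off described above.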
Some of the commercial solutions you posted can solve some of these problems for you.
Assertible is a codeless solution that supports the workflow you described directly: https://assertible.com/blog/automated-api-testing-with-jenkins
We can integrate Jenkins and JMeter to automate REST API testing.
The reasons for that are:
In Jenkins we can schedule our tests/builds however we want (every minute/hour/day/month, ...) or trigger them based on commits, etc.
We can bundle any number of APIs together in JMeter and execute them in a single test (maintenance is easy).
There is a Jenkins plugin, "Performance", which can be used to check the response time of each API call and compare it against previous response times.
JMeter has a built-in threading feature, which helps execute tests much faster than single-threaded tests.
Steps
We can prepare our APIs in JMeter
Configure the tests to run in non-GUI mode from Jenkins (see the sketch below for the equivalent invocation).
Install and configure the Performance plugin in Jenkins.
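As a rough illustration of the non-GUI step, this is the kind of invocation Jenkins ends up running; in practice you would put the equivalent shell command straight into a build step rather than wrap it in Java, and the test plan and results file names here are placeholders. It assumes the jmeter binary is on the PATH of the Jenkins agent.

import java.io.IOException;

public class RunJmeterNonGui {

    public static void main(String[] args) throws IOException, InterruptedException {
        // Equivalent shell build step: jmeter -n -t api-tests.jmx -l results.jtl
        // -n = non-GUI mode, -t = test plan, -l = results file (picked up by the Performance plugin)
        Process jmeter = new ProcessBuilder(
                "jmeter", "-n", "-t", "api-tests.jmx", "-l", "results.jtl")
                .inheritIO()   // stream JMeter's output into the build log
                .start();
        int exitCode = jmeter.waitFor();
        System.out.println("JMeter finished with exit code " + exitCode);
    }
}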
I want to mock API calls from my application and host the mock, so my tests can work without calls to the real API. There is a service called restbird which does exactly that, but it is far from ideal for me. If you want to collaborate, you have to host the service yourself. It also has some bugs, like not displaying the history of calls, or sending server errors for no reason. I want a service more robust than this one.
The only service that I think might be a good fit is SwaggerHub; it seems robust, it has virtual servers, and overall it is very popular. The only problem is that I cannot find a way to record API calls from my application. So how can I record API calls for SwaggerHub?
There does not currently exist any functionality within SwaggerHub itself to record API calls made from the Swagger UI module within the tool. This is a limitation of the open-source Swagger UI tool.
What I can recommend is that you use the Swagger Inspector tool. Swagger Inspector can be used to make API calls from a client, save both the request and the response, and even generate an OpenAPI file for you based on the requests/responses. If you create an account and sign in, you can even save your API calls to a collection to use later.
Swagger Inspector: https://inspector.swagger.io/builder
It may also be worth considering using ReadyAPI's Virtualization module to handle this use case. With ReadyAPI Virtualization you can record transactions from a browser, build mock services from the recorded transaction or an existing API definition, and then host the mock service using VirtServer.
ReadyAPI is part of SmartBear's API lifecycle products, so there are integrations between the two tools. For instance, you can port APIs from SwaggerHub into ReadyAPI directly, and you can use mock services built in ReadyAPI to do dynamic mocking in SwaggerHub.
You can find more information about ReadyAPI Virtualization here: https://smartbear.com/product/ready-api/api-virtualization/
I realise this is a very late response to this thread, but hopefully this information comes in handy.
I have a SOAP endpoint and will have more than 1000 request messages which have different values for the request parameters but use the same SOAP operation. I want to execute them in sequence, continuing only if the previous request returned 200 OK.
Is there any way to do this without a Java program? Is there any client that will help me?
I assume you already have some sort of loop in your test case that reads your variable properties from a file or perhaps Excel and feeds them into your SOAP request. Ready API/soapUI Pro gives you this functionality, but for open source soapUI you'll have to write your own Groovy test steps.
Then, you can use a soapUI Compliance, Status and Standards assertion to check you've received a valid or invalid HTTP status code and react accordingly.
Is there any way to do this without a Java program? Is there any client that will help me?
After re-reading the question, it seems to me you're not yet using SoapUI, even though it has been tagged as a SoapUI question. It happens quite a lot on here that people ask general SOAP questions but tag SoapUI. BTW, Craig's answer should be accepted if you are using SoapUI.
In terms of options, you have lots....
Code. You can use Python, C#, Java, JavaScript, etc. to create a program that will call your endpoint. Any programming language will have libraries for calling web services. So, if you know a language, you could use that (see the sketch after this list).
SoapUI. There is a free version, which will allow you to call web services. In your question, you want to call the same service over and over with different parameters. In testing speak, this is a data-driven test. These can be achieved in the free SoapUI, but it is a fiddle. However, the fully-licensed version offers data-driven tests out of the box. I use these all the time. Very easy to set up. If you use SoapUI, then Craig's answer about using assertions would stop the test if you got a status code other than a 200.
Postman. This is another free tool, which I have used a little. I haven't tried data-driven tests, but I'm sure the docs will tell you if they're supported. If you try Postman, then you ought to look at Danny Dainton's excellent tutorial on GitHub.
JMeter. Another free tool. This is primarily used for performance and load testing, but would still meet your needs.
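To make the "Code" option concrete, here is a minimal Java sketch that sends the same SOAP operation in sequence with different parameter values and stops as soon as a response is not 200 OK. The endpoint URL, the envelope contents, and the buildEnvelope helper are placeholders for illustration, not details taken from the question.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

public class SequentialSoapCalls {

    // Placeholder: wraps one set of parameter values in the (always identical) SOAP envelope.
    static String buildEnvelope(String paramValue) {
        return "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                + "<soapenv:Body><param>" + paramValue + "</param></soapenv:Body>"
                + "</soapenv:Envelope>";
    }

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // In reality the 1000+ value sets would come from a file or an Excel sheet.
        List<String> paramValues = List.of("value1", "value2", "value3");

        for (String value : paramValues) {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://example.com/soap-endpoint")) // placeholder URL
                    .header("Content-Type", "text/xml; charset=utf-8")
                    .POST(HttpRequest.BodyPublishers.ofString(buildEnvelope(value)))
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            if (response.statusCode() != 200) {
                System.out.println("Stopping: request for " + value
                        + " returned " + response.statusCode());
                break; // only continue while the previous request was 200 OK
            }
        }
    }
}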
I know what Rest Assured is and for what purpose it is used, and the same for Cucumber.
But the thing is, whatever we can achieve with Rest Assured for testing, we can also do with Cucumber.
Rest Assured simply calls the web service and validates the response. We can't use Rest Assured during the Maven build because the service needs to be up and running.
But with Cucumber I can directly call the web service's business service layer and DAO layer and validate the response. Cucumber can invoke these at Maven build time.
So the question is: which one is better? I know we can use/integrate Cucumber with Rest Assured, but...
Cucumber is a BDD tool and can be used to describe the expected behavior and use those descriptions as the basis for test automation.
RestAssured is a tool to test APIs/HTTP calls.
They do different things.
You could use them both: Cucumber to describe your functionality, and RestAssured to do the HTTP calls.
But with Cucumber I can directly call the web service's business service layer and DAO layer and validate the response.
This is not necessarily true, and it has to do with the test level you want to achieve.
So if you just want to do tests at a unit level, then yes, you don't need to use REST Assured; you can perfectly well specify your tests with Cucumber feature files, and in the step definitions you can test the service layer and the DAO layer directly, like you mentioned.
If you want to test a running instance of the webservice then you can use REST Assured or REST Assured plus Cucumber. REST Assured will only help you simplify the actual definition of each part of the test and the interactions with endpoints and its expectations, and Cucumber will allow you to define the high-level scenarios made of those steps.
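As a rough sketch of that combination (using the current io.restassured and io.cucumber packages; the Gherkin wording, the endpoint, and the expected body are made up for illustration), a Cucumber step definition can delegate the HTTP work to REST Assured like this:

import static io.restassured.RestAssured.get;
import static org.hamcrest.Matchers.equalTo;

import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import io.restassured.response.Response;

public class UserApiSteps {

    private Response response;

    // Matches a Gherkin step such as: When I request the list of users
    @When("I request the list of users")
    public void iRequestTheListOfUsers() {
        response = get("http://localhost:8080/users"); // placeholder endpoint
    }

    // Matches: Then the response status is 200
    @Then("the response status is {int}")
    public void theResponseStatusIs(int expectedStatus) {
        response.then()
                .statusCode(expectedStatus)
                .body("status", equalTo("Success")); // placeholder body assertion
    }
}

Cucumber owns the readable scenario; REST Assured owns the request and the assertions.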
So the question is: which one is better? I know we can use/integrate Cucumber with Rest Assured, but...
All in all, it's not a matter of which one is better, but what level of testing you're trying to achieve and how you want to achieve it. At a unit level you might not need REST Assured. At an integration/live-execution level, then yes, you can use that library. At both levels, you can specify your tests using Cucumber.
Rest Assured is a Java API library to automate REST web services. We can automate REST APIs by using the BDD method; BDD is a methodology and Cucumber is a leading free tool for it.
Rest Assured is not a tool, it is a Java library which we can use to test RESTful web services, and yes, Cucumber is recommended because Cucumber gives better reporting, which Rest Assured does not support.
So I recommend using the Cucumber framework to test APIs.
We are currently in the process of identifying a suitable automation framework using JMeter for our RESTful APIs. A typical POST request in our suite would be something like the one shown below:
URL : https://host123.com/createuser
Message body(JSON) :
{ "UserName" ,"Password","FirstName","LastName","PhoneNumber" }
There is an equivalent message body for XML as well
One framework we are interested in is described below:
The JSON/XML Repository would contain all the XML/JSON message bodies of every unique API Endpoint(We have close to 350 such unique API URLs).
The Test case repository would contain all the relevant tests containing the data to be passed into the JSON/XML bodies.
JMeter would run these tests and export the response to a file which would be parsed and presented graphically by another reporting plugin/utility.
Could you please let me know if the above data-driven framework is suited to automating RESTful services? Also, is JMeter the ideal tool for performing these tests?
Not sure about "ideal", but JMeter is definitely capable of helping you automate your scenario.
Some references:
CSV Data Set Config or JDBC Request Sampler (depending on what format your data is in) - to read from the Test Case Repository
Beanshell PreProcessor or Sampler combined with e.g. GSON to dynamically construct JSON or XML test structures (see the sketch after this list)
XPath Assertion - to validate XML responses from the server
JSON Path Extractor and Assertion (available through JMeter Plugins)
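For the Beanshell/GSON point, here is a minimal sketch of what a Beanshell (or JSR223, Java syntax) PreProcessor body could look like; vars is the JMeterVariables object JMeter injects into the script, GSON must be on JMeter's classpath, the field names come from your example, and the CSV variable names are placeholders:

// Beanshell / JSR223 PreProcessor body (raw types, since Beanshell has no generics)
import com.google.gson.Gson;
import java.util.LinkedHashMap;
import java.util.Map;

Map body = new LinkedHashMap();
// Values previously read by a CSV Data Set Config (placeholder variable names)
body.put("UserName",    vars.get("userName"));
body.put("Password",    vars.get("password"));
body.put("FirstName",   vars.get("firstName"));
body.put("LastName",    vars.get("lastName"));
body.put("PhoneNumber", vars.get("phoneNumber"));

// Expose the generated JSON to the HTTP sampler as ${payload}
vars.put("payload", new Gson().toJson(body));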
Hope this helps
D.
JMeter is a perfect solution for this.
If you want to automate running JMeter and graphing, here are some solutions using Jenkins and the CLI:
https://blog.codecentric.de/en/2014/01/automating-jmeter-tests-maven-jenkins/
Need Step by Step Guide to execute the Jmeter Scripts in Jenkins (with Hudson build) over Ubuntu
Another option is a paid solution with http://BlazeMeter.com which is basically JMeter as a service. They have an API and a Jenkins Plugin as well. A lot simpler but not free.
Lastly, also look at the JMeter plugin project which has some great JMeter extras
http://jmeter-plugins.org/
I would use Staf/Stax to call XML test cases, run JMeter and collect the results.
Here is a very good article about this.
I'd like to use Gatling for REST performance and scalability web service testing. I'm currently using JMeter for this, as I wasn't aware of Gatling when I started this project. Gatling would integrate better and would be better for the project for a number of reasons.
I'd like to ask one main question:
Obviously, there's a lot of overhead in configuring Gatling with the correct web service information. I've already done this in JMeter and I'm not keen to do it again. For one of the sub-projects, we have a WADL but we have no such thing for the other. Is it possible, out of the box, to import:
a. JMeter test plans and
b. WADL documents
into Gatling?
I've looked through the docs but unfortunately I can't find anything that references these.
No, Gatling has neither.
Building a jmx converter is something we might investigate in 2013, as you're not the first one to ask for it. At this point, I'm a bit skeptical, as the logic and the configuration of the two tools (JMeter and Gatling) are quite different, so the features and the way to use them don't map 1:1.
The easiest way to work with REST APIs is to use the recorder, so you'd dump request bodies as template and then inject data into them. See http://gatling.io/docs/2.1.6/http/http_request.html#request-body
If you work with JSON, you can use our JsonPath (or standard regex) checks in order to make assertions on the response body, or even capture data. See http://gatling.io/docs/2.1.6/http/http_check.html#defining-the-check-type
Using the HTTP Sampler with a raw POST body and the latest 2.8 version is the right way to test web services.
Is that the way you are doing it?
The upcoming 2.9 has new performance improvements related to the memory and CPU consumed by post-processors.
Regarding (a), I don't think so.