I'd like to use Gatling for REST performance and scalability web service testing. I'm currently using JMeter for this, as I wasn't aware of Gatling when I started this project. Gatling would integrate better and would be better for the project for a number of reasons.
I'd like to ask one main question:
Obviously, there's a lot of overhead in configuring Gatling with the correct web service information. I've already done this in JMeter and I'm not keen to do it again. For one of the sub-projects, we have a WADL but we have no such thing for the other. Is it possible, out of the box, to import:
a. JMeter test plans and
b. WADL documents
into Gatling?
I've looked through the docs but unfortunately I can't find anything that references these.
No, Gatling has neither.
Building a jmx converter is something we might investigate in 2013, as you're not the first one to ask for it. At this point, I'm a bit skeptical, as the logic and configuration of JMeter and Gatling are quite different, so the features and the way to use them don't map 1:1.
The easiest way to work with REST APIs is to use the recorder, so you'd dump request bodies as templates and then inject data into them. See http://gatling.io/docs/2.1.6/http/http_request.html#request-body
If you work with JSON, you can use our JsonPath (or standard regex) checks in order to make assertions on the response body, or even capture data. See http://gatling.io/docs/2.1.6/http/http_check.html#defining-the-check-type
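To make those two steps concrete, here is a minimal sketch with the Gatling 2.x Scala DSL. It uses an inline StringBody template rather than a dumped request file for brevity, and the base URL, the records.csv feeder file and the JSON fields are invented for the example:

    import io.gatling.core.Predef._
    import io.gatling.http.Predef._

    class RecordsSimulation extends Simulation {

      // Hypothetical base URL of the service under test
      val httpProtocol = http.baseURL("http://localhost:8080")

      val scn = scenario("Create record")
        // records.csv is a hypothetical feeder file with the header line: name,value
        .feed(csv("records.csv"))
        .exec(
          http("create record")
            .post("/records")
            // Gatling EL injects the feeder columns into the body template
            .body(StringBody("""{"name":"${name}","value":"${value}"}""")).asJSON
            .check(status.is(200))
            // capture the generated id from the JSON response for later requests
            .check(jsonPath("$.id").saveAs("recordId"))
        )

      setUp(scn.inject(atOnceUsers(10))).protocols(httpProtocol)
    }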
Using an HttpSampler with a raw POST body and the latest 2.8 version is the right way to test web services.
Is that the way you are doing it?
The upcoming 2.9 version has further performance improvements related to the memory and CPU consumed by post-processors.
Regarding a, I don't think so.
I am quite new to JGiven, and currently I have a set of REST API tests automated using Rest Assured and the TestNG framework. I am also exploring JGiven as a framework to run the API tests, for the advantages it gives with the human-readable given-when-thens and the reports it generates. Rest Assured as a library lets us inject the URLs and actually make the REST calls. I want to understand whether we have such capabilities within JGiven to actually make the REST calls. If so, I'd like to see an example and understand how I can do that. If not, can someone kindly advise and suggest the best way to achieve it with JGiven? I've been trying to search for this information but have struggled to find it so far.
Thanks in advance.
JGiven is useful for creating test scenarios that are understandable by domain experts. It is a general tool that can be used for any kind of testing, including testing REST APIs. JGiven adds an understandable layer on top of your underlying test infrastructure. However, you will typically need tools in addition to JGiven to implement the underlying layer. So for testing REST APIs you will use a tool like Rest Assured in combination with JGiven. With JGiven you describe your scenario in the domain language, with Rest Assured you will execute the REST calls.
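To make that layering concrete, here is a rough sketch of a JGiven stage that delegates the actual HTTP call to Rest Assured. It is written in Scala to match the other sketches in this thread, though JGiven is more commonly used from Java; the base URI, the /users/{id} path and the expected status are invented, and the imports assume Rest Assured 3+ and JGiven's JUnit 4 support:

    import com.tngtech.jgiven.Stage
    import com.tngtech.jgiven.junit.SimpleScenarioTest
    import io.restassured.RestAssured
    import io.restassured.response.Response
    import org.junit.Test

    // The JGiven stage: readable given/when/then steps that delegate to Rest Assured.
    class UserApiStage extends Stage[UserApiStage] {

      private var response: Response = _

      def the_user_service_at(baseUri: String): UserApiStage = {
        RestAssured.baseURI = baseUri // hypothetical service location
        self()
      }

      def the_user_with_id_is_requested(id: Int): UserApiStage = {
        response = RestAssured.get("/users/{id}", Int.box(id)) // Rest Assured makes the actual call
        self()
      }

      def the_response_status_is(expected: Int): UserApiStage = {
        assert(response.statusCode() == expected)
        self()
      }
    }

    class UserApiTest extends SimpleScenarioTest[UserApiStage] {

      @Test
      def an_existing_user_can_be_fetched(): Unit = {
        given().the_user_service_at("http://localhost:8080")
        when().the_user_with_id_is_requested(42)
        `then`().the_response_status_is(200) // `then` needs backticks in newer Scala versions
      }
    }

The stage methods carry the readable scenario wording for the report, while the bodies are ordinary Rest Assured calls.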
I have a SOAP endpoint and will have more than 1000 request messages that use different values for the request parameters but the same SOAP operation. I want to execute them in sequence, sending each request only if the previous one returned 200 OK.
Is there any way to do this without a Java program? Is there any client that will help me?
I assume you already have some sort of loop in your test case that reads your variable properties from a file, or perhaps Excel, and feeds them into your SOAP request. Ready API/SoapUI Pro gives you this functionality, but for the open-source SoapUI you'll have to write your own Groovy test steps.
Then, you can use a SoapUI "Compliance, Status and Standards" assertion to check that you've received a valid or invalid HTTP status code and react accordingly.
"Is there any way to do this without a Java program? Is there any client that will help me?"
After re-reading the question, it seems to me you're not yet using SoapUI, though it has been tagged as a SoapUI question. It happens quite a lot on here that people ask general SOAP questions but tag SoapUI. BTW, Craig's answer should be accepted if you are using SoapUI.
In terms of options, you have lots:
Code. You can use Python, C#, Java, JavaScript, etc. to create a program that will call your endpoint; any programming language will have libraries for calling web services. So if you do know a language, you could use that (there is a short sketch after this list).
SoapUI. There is a free version, which will allow you to call web services. In your question, you want to call the same service over and over with different parameters; in testing speak, this is a data-driven test. These can be achieved in the free SoapUI, but it is a fiddle. However, the fully licensed version offers data-driven tests out of the box. I use these all the time; they are very easy to set up. If you use SoapUI, then Craig's answer about using assertions would stop the test if you got a status code other than 200.
Postman. This is another free tool, which I have used a little. I haven't tried data-driven tests, but I'm sure the docs will tell you if they're supported. If you try Postman, then you ought to look at Danny Dainton's excellent tutorial on GitHub.
JMeter. Another free tool. This is primarily used for performance and load testing, but would still meet your needs.
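If the Code option above turns out to be acceptable after all, the core loop is small. Below is a rough Scala sketch: the endpoint URL, the SOAP operation in the template and the params.txt file (one parameter value per line) are all placeholders:

    import java.io.OutputStreamWriter
    import java.net.{HttpURLConnection, URL}
    import scala.io.Source
    import scala.util.control.Breaks._

    object SequentialSoapCalls extends App {

      val endpoint = new URL("http://localhost:8080/ws/orders") // placeholder endpoint
      val template =
        """<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
          |  <soapenv:Body>
          |    <getOrder><orderId>%s</orderId></getOrder>
          |  </soapenv:Body>
          |</soapenv:Envelope>""".stripMargin // placeholder operation, identical for every request

      breakable {
        for (param <- Source.fromFile("params.txt").getLines()) {
          val conn = endpoint.openConnection().asInstanceOf[HttpURLConnection]
          conn.setRequestMethod("POST")
          conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8")
          conn.setDoOutput(true)
          val writer = new OutputStreamWriter(conn.getOutputStream, "UTF-8")
          writer.write(template.format(param)) // inject this iteration's parameter value
          writer.close()

          val status = conn.getResponseCode
          println(s"$param -> HTTP $status")
          conn.disconnect()
          if (status != 200) break() // stop the sequence at the first non-200 response
        }
      }
    }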
I like the TDD approach to documenting your RESTful API with spring-rest-docs. However, I love the "API Playground" feature enabled by the Swagger specification. I wish there were a way to get the best of both worlds.
Is there a way to build Swagger 2 specs from Spring REST Docs, maybe via custom request/response preprocessors?
Do you have any thoughts or recommendations?
There's no out-of-the-box support for this in Spring REST Docs at the moment. The issue that you opened will track the possibility of adding such functionality. In the meantime, your best bet would be to look at writing a custom Snippet implementation that generates (part of) a Swagger specification.
Typically, a Spring REST Docs snippet deals with documenting a single resource, whereas a Swagger specification describes an entire service. This means that the Swagger specification Snippet implementation will need to accumulate state somehow before producing a complete specification at the end. There are lots of ways to do that (in memory, multiple files that are combined in a post-processing step, etc.). It's not clear to me that one approach is obviously the right one, so some experimentation would be useful. If you do some experimentation, please comment on the issue that you opened with your findings.
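As a starting point for experimentation, a very rough in-memory variant might look like the Scala sketch below. It implements the Snippet interface, accumulates one entry per documented operation, and rewrites a deliberately minimal Swagger 2.0 file after every test; a real implementation would also emit parameters, schemas and so on, and the target/swagger.json output path is arbitrary:

    import java.nio.charset.StandardCharsets
    import java.nio.file.{Files, Paths}
    import scala.collection.mutable

    import org.springframework.restdocs.operation.Operation
    import org.springframework.restdocs.snippet.Snippet

    // Accumulates one entry per documented operation and rewrites a (very
    // simplified) Swagger 2.0 document after every test.
    object SwaggerPathsSnippet extends Snippet {

      // path -> (lower-case HTTP method -> observed response status)
      private val paths = mutable.LinkedHashMap.empty[String, mutable.LinkedHashMap[String, Int]]

      override def document(operation: Operation): Unit = {
        val method = operation.getRequest.getMethod.name().toLowerCase
        val path   = operation.getRequest.getUri.getPath
        val status = operation.getResponse.getStatus.value()

        paths.synchronized {
          paths.getOrElseUpdate(path, mutable.LinkedHashMap.empty[String, Int])(method) = status

          val pathsJson = paths.map { case (p, ops) =>
            val opsJson = ops.map { case (m, s) =>
              s""""$m": { "responses": { "$s": { "description": "documented by Spring REST Docs" } } }"""
            }.mkString(", ")
            s"""    "$p": { $opsJson }"""
          }.mkString(",\n")

          val spec =
            s"""{
               |  "swagger": "2.0",
               |  "info": { "title": "generated", "version": "0.0.1" },
               |  "paths": {
               |$pathsJson
               |  }
               |}""".stripMargin

          // Example output location; combine or post-process files as needed.
          Files.write(Paths.get("target/swagger.json"), spec.getBytes(StandardCharsets.UTF_8))
        }
      }
    }

The snippet would then be passed to document(...) alongside the built-in snippets, so every documented call also contributes to the accumulated specification.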
This is a general "what technologies are available?" question.
My company provides a web application with a RESTful API. However, it is too slow for my needs and some of the results are in an awkward format.
I want to wrap their RESTful server with a proxy/adapter server, so that when you connect to the proxy you get the RESTful API I wish the real one provided.
So it needs to do a few things:
passthrough most requests
cache some requests
do some extra requests on the original server to detect if a request is cacheable
for instance: there is a request for a field in a record, GET /records/id/field, which might be slow, but there is a fingerprint request, GET /records/id/fingerprint, which is always fast. If there is a cached copy of GET /records/1/field2 for the fingerprint feedbeef, then I need to check that the original server still reports the fingerprint feedbeef before serving the cached version (see the sketch after this list)
fix headers for some responses - e.g. content-type, based upon the path
do stream processing on some large content, for instance
GET /records/id/attachments/1234
returns a 100 MB log file in text format
remove null characters from files
optionally recode the log to filter out irrelevant lines, reducing the load on the client
cache the filtered version for later requests.
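To make the fingerprint-based caching above concrete, the decision logic could be sketched like this in Scala, where fetchField and fetchFingerprint stand in for whatever HTTP client the proxy ends up using:

    import scala.collection.concurrent.TrieMap

    // A cache entry remembers which upstream fingerprint it was computed for.
    final case class CachedField(fingerprint: String, body: String)

    // fetchField and fetchFingerprint are placeholders for the slow
    // GET /records/<id>/<field> call and the fast GET /records/<id>/fingerprint call.
    final class FieldCache(fetchField: String => String, fetchFingerprint: String => String) {

      private val cache = TrieMap.empty[String, CachedField] // key: "/records/<id>/<field>"

      def get(recordId: String, field: String): String = {
        val path    = s"/records/$recordId/$field"
        val current = fetchFingerprint(recordId) // always cheap, per the original API

        cache.get(path) match {
          case Some(CachedField(fp, body)) if fp == current =>
            body // upstream record unchanged, serve the cached copy
          case _ =>
            val body = fetchField(path) // slow call, only made when the fingerprint has moved
            cache.put(path, CachedField(current, body))
            body
        }
      }
    }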
While I could modify the client to achieve this functionality, such code would not be reusable for other clients (in different languages) and would complicate the client logic.
I had a look at whether Clojure/Ring could do it, and while there is a nice little proxy middleware for it, it doesn't handle streaming content as far as I can tell: the whole 100 MB would have to be downloaded. Also, it doesn't include any cache logic yet.
I took a look at whether Squid could do it, but I'm not familiar with the technology, and it seems mostly concerned with passing requests through rather than modifying them on the fly.
I'm looking for hints where I might find the correct technology to implement this. I'm mostly language agnostic if learning a new language gets me access to a really simple way to do it.
I believe you should choose a platform that is easier for you to implement your custom business logic on. The following web application frameworks provide easy connectivity with REST APIs, and allow you to create a web application that could work as a REST proxy:
Play framework (Java + Scala)
Express + Node.js (JavaScript)
Sinatra (Ruby)
I'm more familiar with Play, which I know provides caching utilities you could find useful and is also extensible via a number of plugins.
If you are familiar with Scala, you could also have a look at Finagle. It is a framework built by Twitter's infrastructure team to provide protocol-agnostic connectivity. It might be overkill for a REST-to-REST proxy, but it provides abstractions you might find useful.
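For a sense of what that looks like, here is a bare-bones Finagle sketch of a passthrough proxy with naive caching of successful GET responses. The upstream host is a placeholder, and a real version would add the fingerprint check, header rewriting and streaming for the large attachments:

    import com.twitter.finagle.{Http, Service, http}
    import com.twitter.util.{Await, Future}
    import scala.collection.concurrent.TrieMap

    object RestProxy extends App {

      // Upstream service being wrapped (placeholder host).
      val upstream: Service[http.Request, http.Response] =
        Http.client.newService("api.example.com:80")

      // Naive in-memory cache keyed by request URI; a real proxy would
      // validate entries against the fingerprint resource before serving them.
      val cache = TrieMap.empty[String, http.Response]

      val proxy = Service.mk[http.Request, http.Response] { req =>
        cache.get(req.uri) match {
          case Some(cached) => Future.value(cached) // serve from cache, skip the upstream call
          case None =>
            upstream(req).onSuccess { rep =>
              if (req.method == http.Method.Get && rep.status == http.Status.Ok)
                cache.put(req.uri, rep)
            }
        }
      }

      Await.ready(Http.server.serve(":8080", proxy))
    }

Note that re-serving the same cached Response object is only reasonable for fully buffered responses; streamed bodies would need to be cached differently.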
You could also look at some third-party services like Apitools, which allows you to create a proxy programmatically (in Lua). Apirise is a similar service (of which I'm a co-founder) that aims to provide similar functionality with a user-friendly UI.
Beeceptor does exactly what you want. It plugs in between your web app and the original API to route requests.
For your use case of caching a few responses, you can create a rule. That way it won't hit the original endpoint.
The requests to the original APIs can be mocked, and you can inspect responses.
You can simulate delays.
(Note: it is a shameless plug, I am the author of Beeceptor and thought it should help you and other developers.)
https://github.com/nodejitsu/node-http-proxy looks useful, although I don't yet know if it can do stream processing for transcoding.
Is there a way to use a web service (REST in this case) as the data source for a Lift application? I can find a number of tutorials/examples of using Lift to provide the REST API, but in my case the data is hosted elsewhere and exposed as a REST web service. Pointers to docs are greatly appreciated.
Thanks,
Jeff
In fact, this is not related to Lift. There is already a lot of information on the different pieces:
the HttpClient library, as was suggested already,
or the Dispatch Scala library for accessing HTTP services,
information on how to cache data in Scala in various ways, in case you need it.
Think about caching thoroughly; it is generally a good choice if your application generates a lot of requests and you can afford caching. Caching will let you achieve many goals:
decrease response time, as you do not depend on the remote service (if you do synchronous data processing)
avoid Denial of Service in case the remote service dies. Otherwise your application will generate many sockets to read data and exhaust resources (either sockets or threads or something else)
do not exceed the SLA of the remote service, as many services constrain the number of requests you are allowed to perform per unit of time.
So you can just put these pieces together, and that's it.
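Putting the HttpClient and caching pieces together, a thin Scala wrapper over Apache HttpClient 4.x with a simple time-based cache might look roughly like this; the TTL and the example URL are arbitrary:

    import org.apache.http.client.methods.HttpGet
    import org.apache.http.impl.client.HttpClients
    import org.apache.http.util.EntityUtils
    import scala.collection.concurrent.TrieMap

    // Thin Scala wrapper over Apache HttpClient with a time-based cache,
    // so the Lift snippets never hit the remote REST service directly.
    class CachedRestClient(ttlMillis: Long = 60000L) {

      private val client = HttpClients.createDefault()
      private val cache  = TrieMap.empty[String, (Long, String)] // url -> (fetchedAt, body)

      def get(url: String): String = {
        val now = System.currentTimeMillis()
        cache.get(url) match {
          case Some((at, body)) if now - at < ttlMillis =>
            body // fresh enough, skip the remote call
          case _ =>
            val response = client.execute(new HttpGet(url))
            try {
              val body = EntityUtils.toString(response.getEntity)
              cache.put(url, (now, body))
              body
            } finally response.close()
        }
      }
    }

    // Usage from a Lift snippet (URL is a placeholder):
    // val records = new CachedRestClient().get("http://api.example.com/records")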
If you really want to be fancy, you can create a Record implementation for a REST-based data source. There's already one of these in existence that works with CouchDB. Using the lift-couchdb module, the interactions with CouchDB are abstracted away and all you deal with is the Scala code. There is a short wiki page with instructions on how to get started with lift-couchdb here:
http://www.assembla.com/wiki/show/liftweb/CouchDB
The pertinent source code files are available here:
http://github.com/lift/lift/tree/master/framework/lift-persistence/lift-couchdb/src/main/scala/net/liftweb/couchdb/
Using the Record interface gives you access to lots of traits that provide functionality with minimal code-writing, such as creating HTML forms, providing lifecycle-based calls, and easy hooks for validation.
I've put a Scala layer over HttpClient and use that. I've been meaning to put it on GitHub for some time.
I use Dispatch (which is a wrapper around HttpClient) for making REST calls. It looks nice and simple.