Moving a single feature to production! - version-control

I am working on an API that is triggered by a custom job scheduler. The API takes the form of a .NET DLL and currently contains 10-15 methods, most of which are still in the testing phase. The client wants to push a single method out to production. Yes, I mean that if there are 10 methods in the DLL, the client wants to push just one method, like FetchOrders(orderId), into production.
How can I go about doing that?

Publish that one method if it's tested and accepted.
Branch the version (when using TFS) in case work needs to be done on the production version while work continues on the full version.
Make sure the other functions aren't available in production.


Is it possible to make a REST call to a webscript from my own Java-backed webscript?

I'm writing a Java-backed webscript to deploy in Alfresco and call via REST. This webscript must perform a set of 3 operations (find a path, create a folder and upload a document).
I read about this and found similar examples that do these operations through the native Alfresco API, with methods like getFileFolderService, getContentService, etc. of the Repository or ServiceRegistry classes. All in Java, without JavaScript.
But I would rather use REST calls instead of the Alfresco API inside my webscript. I think that if webscripts already exist to do these operations, it is easier to call them than to try to do the same with Alfresco API methods. And if the API changes in future versions, the REST calls would remain the same. But I'm new here and I don't know if I'm wrong.
In summary: to do these 3 operations, one after another, in my backed webscript, which is better and why? Using the native API methods, or making REST calls to the existing webscripts?
And if I go for the second option, is it even possible? Would the HttpClient class with GetMethod/PostMethod be the best option for the REST calls inside my Java webscript, or could this give me problems? Because then a REST call to my backed webscript would trigger further REST calls to other webscripts.
Thanks a lot!
I think it's bad practice to do it like this. In a lot of Alfresco versions the default services didn't change a bit, and even when they changed they still kept deprecated methods around.
The REST API has changed as well. If you want to make an upgrade-proof system, I guess it's better to stick with the web services (which haven't changed since version 2.x) or go with CMIS.
But then it doesn't make much sense to have your code inside Alfresco at all, so putting it behind an interface is better.
I'd personally just stick with the JavaScript API, which didn't change a lot. Yes, more functions were added to it, but the default actions for search & CRUD remained the same.
You could even do a duo: have your Java-backed script do whatever fancy stuff you need and send the result to the JavaScript controller to do the default stuff.
Executing HTTP calls against the process you are already in is a very very bad idea in general. It is slower, much more complex and error-prone, hogs more resources (two threads), and in your case, you will even lose transaction safety. Just imagine the last call fails for some reason. Besides you will most likely have to handle security context propagation yourself. Use the native public API and it will be easy, safe and stable.
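To make the native-API route concrete, here is a minimal sketch of a Java-backed webscript that performs the three operations in sequence. It is only an outline under assumptions: the Spring bean wiring is omitted, and the "Sites/my-site/documentLibrary" path, folder name, file name and mimetype are hypothetical placeholders.

import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.alfresco.model.ContentModel;
import org.alfresco.repo.model.Repository;
import org.alfresco.service.ServiceRegistry;
import org.alfresco.service.cmr.model.FileFolderService;
import org.alfresco.service.cmr.model.FileInfo;
import org.alfresco.service.cmr.model.FileNotFoundException;
import org.alfresco.service.cmr.repository.ContentWriter;
import org.springframework.extensions.webscripts.Cache;
import org.springframework.extensions.webscripts.DeclarativeWebScript;
import org.springframework.extensions.webscripts.Status;
import org.springframework.extensions.webscripts.WebScriptException;
import org.springframework.extensions.webscripts.WebScriptRequest;

public class FolderUploadWebScript extends DeclarativeWebScript {

    private ServiceRegistry serviceRegistry; // injected via Spring
    private Repository repository;           // injected via Spring

    public void setServiceRegistry(ServiceRegistry serviceRegistry) { this.serviceRegistry = serviceRegistry; }
    public void setRepository(Repository repository) { this.repository = repository; }

    @Override
    protected Map<String, Object> executeImpl(WebScriptRequest req, Status status, Cache cache) {
        FileFolderService fileFolderService = serviceRegistry.getFileFolderService();
        try {
            // 1. find the target path below Company Home (path is a hypothetical example)
            List<String> path = Arrays.asList("Sites", "my-site", "documentLibrary");
            FileInfo parent = fileFolderService.resolveNamePath(repository.getCompanyHome(), path);

            // 2. create a folder under it
            FileInfo folder = fileFolderService.create(parent.getNodeRef(), "uploads", ContentModel.TYPE_FOLDER);

            // 3. create a document in the folder and stream the request body into it
            FileInfo doc = fileFolderService.create(folder.getNodeRef(), "report.txt", ContentModel.TYPE_CONTENT);
            ContentWriter writer = serviceRegistry.getContentService()
                    .getWriter(doc.getNodeRef(), ContentModel.PROP_CONTENT, true);
            writer.setMimetype("text/plain");
            writer.putContent(req.getContent().getInputStream());

            Map<String, Object> model = new HashMap<String, Object>();
            model.put("nodeRef", doc.getNodeRef().toString());
            return model;
        } catch (FileNotFoundException e) {
            throw new WebScriptException(Status.STATUS_NOT_FOUND, "Target path not found");
        }
    }
}

Because all three calls run inside the webscript's single transaction, a failure in the upload rolls back the folder creation as well, which is exactly the safety that is lost when each step becomes a separate HTTP call.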

Automate REST API testing and integrate it with Continuous Integration (CI - Jenkins)

I found many similar questions related to this, but not the particular answer I am looking for. My requirement is a little different, so I ended up posting the following question.
I want to automate REST API testing, and I have two options for it.
The first one is REST Assured and the second one is the Play Framework.
For example, to test this REST API:
http://servername:9000/dbs/all/list/m1/p1/sch1
(this returns an XML response)
I have written code in Java with REST Assured, and it is working fine. I integrated it into a Maven project so that it can be run from Jenkins.
Sample code:
import static com.jayway.restassured.RestAssured.expect;
import static org.hamcrest.Matchers.equalTo;

import org.testng.annotations.Test;

public class TestNGSimpleTest2 {
    @Test
    public void testApi() {
        // expect an HTTP 200 and a Status element of "Success" in the XML response
        expect().
            statusCode(200).
            body("Status", equalTo("Success")).
        when().
            get("http://localhost:9000/dbs/all/list/m1/p1/sch1");
    }
}
So my first question is:
Is REST Assured the best tool to use?
Is the Play Framework better?
I found many other tools like JMeter, RightAPI, etc. to test REST APIs, but I don't think those are automatable. Am I right?
For automating REST API testing, as a starting point I recommend using Postman and newman.
Postman provides an excellent UI for building requests, and newman is its command-line counterpart. After you create a set of requests and corresponding tests within the Postman UI, you can run the entire collection from Jenkins via newman, preventing a deployment if tests fail.
The RestAssured code you have posted will work just fine for basic cases. It's not necessarily the "right tool" if you want to:
continuously add new test cases without having many resources
propagate alerts with well-formed error messages (especially to places like Slack or GitHub)
reduce false-positives
re-use the same tests for monitoring
Building these features takes time and resources which, depending on the size of your team, may or may not be a good call.
Some of the commercial solutions you posted can solve some of these problems for you.
Assertible is a codeless solution that supports the workflow you described directly: https://assertible.com/blog/automated-api-testing-with-jenkins
We can integrate Jenkins and JMeter to automate REST API testing.
The reasons for that are:
In Jenkins we can schedule our tests/builds however we like (every minute/hour/day/month, ...) or trigger them on commits, etc.
We can bundle any number of API calls together in JMeter and execute them in a single test (maintenance is easy).
There is a Jenkins plugin, "Performance", which can be used to check the response time of each API call and compare it against previous runs.
JMeter has a built-in threading feature, which helps execute tests much faster than any single-threaded test.
Steps
Prepare the API test plan in JMeter.
Configure the tests to run in non-GUI mode from Jenkins.
Install and configure the Performance plugin in Jenkins.

Lifecycle management of a Rails application server

We are developing an application that has an iPhone client, and a Rails server. We have released a first version, and are now starting to work on 1.1 version.
We were wondering if there are any tools (external or provided by hostingrails) to address these two basic requirements:
- development / production versions of a Rails application
- simultaneous live versions of the application (versioned APIs), for example to keep supporting older versions of the iPhone client application.
A first approach we are thinking of right now would be to duplicate the application for each version of the API we want to have, each of them reachable at a specific URL, for example: myapp.com/v1, myapp.com/v2 ...
This entire stack would itself be duplicated in order to have a live/production version and a development one. Once tested, the development version would be swapped in for the production version.
What do you think of this approach? Are there any tools that allow managing the lifecycle of the application?
Does Rails have built-in features facilitating this?
Thanks
The simplest thing would just be to keep your API backward-compatible, thus obviating the need to maintain two versions of the API, and if you must evolve it in a way that breaks backwards compatibility, deprecate the old API and give it a real termination date so that you don't support it ad infinitum.
If you absolutely have to go down this road, read Fowler's blog post on the topic first (http://martinfowler.com/bliki/TolerantReader.html) and then look at namespacing your API routes. In Rails you could accomplish this using namespaced routes and controllers, so you might have your original API at /application/endpoint and your new version at /application/v2/endpoint (I'm assuming that you can't easily change the endpoints for old clients).
I'm not aware of any tools that explicitly claim to solve the problems you're saying you want to solve, but I think that has more to do with developers working hard not to need them than the idea that they're not solvable in Rails.
Did you consider using subdomains? http://railscasts.com/episodes/221-subdomains-in-rails-3

Async requests in Vaadin

I can't find any documentation on how to update objects in Vaadin asynchronously. Can anyone help me? What I need is to render a table and then update the values of one column with a rather slow call, so I want to make that call asynchronous.
This has been discussed a lot on this thread on the Vaadin forum. You might want to read it, it contains a lot of useful information.
Just do the updates in another thread. UI modifications from background threads must be synchronized on the application object. Add ICEPush, Refresher or a ProgressIndicator to get the changes from the server to the client.
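As an illustration, here is a minimal sketch of that pattern for the Vaadin 6 era this question dates from. The table layout, the "price" column and slowPriceLookup() are hypothetical placeholders, and a Refresher or ProgressIndicator on the page is still needed to pull the result to the browser:

import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.Collection;

import com.vaadin.ui.CustomComponent;
import com.vaadin.ui.Table;

public class SlowColumnView extends CustomComponent {

    private final Table table = new Table("Orders");

    public SlowColumnView() {
        table.addContainerProperty("name", String.class, "");
        table.addContainerProperty("price", BigDecimal.class, null); // filled in later
        setCompositionRoot(table);
    }

    // Call after the component is attached, so getApplication() is non-null.
    public void loadPricesAsync() {
        new Thread(new Runnable() {
            public void run() {
                // snapshot the row ids under the lock, then do the slow work outside it
                Collection<Object> ids;
                synchronized (getApplication()) {
                    ids = new ArrayList<Object>(table.getItemIds());
                }
                for (Object itemId : ids) {
                    BigDecimal price = slowPriceLookup(itemId); // the slow backend call
                    // Vaadin 6: hold the Application lock for every UI change from a background thread
                    synchronized (getApplication()) {
                        table.getItem(itemId).getItemProperty("price").setValue(price);
                    }
                }
            }
        }).start();
    }

    private BigDecimal slowPriceLookup(Object itemId) {
        return BigDecimal.ONE; // hypothetical placeholder for the real slow call
    }
}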
As far as I know, Vaadin provides two add-ons for solving this problem: ServerPush and DontPush. Both add-ons can be imported via Maven and both support WebSockets as well as fallback solutions for browsers without WebSocket support. Although ServerPush seemingly provides more features than DontPush, it is rated lower, probably because it is more complicated.
For pushing updates to the client DontPush provides a very simple solution that does not require any changes to the web application. Only the servlet-class in web.xml needs to be replaced by org.vaadin.dontpush.server.impl.jetty.DontPushServlet and the widget set has to be updated afterwards via mvn vaadin:update-widgetset. That's all. Any changes on the server will be automatically pushed to the client. I successfully tested this add-on with Chrome 14. Unfortunately, I could not get it working with Firefox 7.
According to its web page, the ServerPush add-on should provide this functionality too. However, I could not figure out how to set up ServerPush to work with Jetty. Moreover, it seems to be more complicated to use. It requires several changes to web.xml as well as additional configuration files for the Atmosphere server.
In contrast to DontPush, ServerPush also provides an explicit pushing mechanism, which allows updating the GUI manually by calling the push() method of a pusher component that needs to be added to the main window beforehand. However, I also failed to get this working.

WF4 workflow versions where service contract changes

I just successfully implemented a WF4 "versioning" system using WCF's Routing Service. I had a version1 workflow service to which I added a new Decision activity and saved it as a version2 service. So now I have 2 endpoints (with identical service contracts, i.e. all Receive activities are the same for both service) and a router that checks the content of a message (a "versionId" string on the object that all of my Receive's accept as an argument) to decide which endpoint to hit.
My question is: while this works fine when no changes are made to the service contract, how do I handle the need to add or remove methods from my service contract and create a version3 service? My original thought was that when I add the service reference to my client, I use the latest workflow service's endpoint to get the latest service contract. Then, in the config file, I change the endpoint I connect to so that it points at the router's endpoint. But this won't work if v1 and v2 have a different contract than v3. My proxy will have v3's methods and forget all about v1 and v2.
Any ideas of how to handle this? Should I create an actual service contract interface in my workflow solution (instead of just supplying a ServiceContractName in my Receive activities)?
If the WCF contract changes, your client will need to be aware of the additional operations and when to call them. In some applications I have used the active bookmarks (each of which contains the WCF operation name) from the persistence store to have the client app adapt to the workflow dynamically, checking the enabled bookmarks and enabling/disabling UI controls based on that. The client will still have to be updated when new operations are added in a new version of the workflow.
While WCF was young I heard a few voices arguing that endpoint versioning (for web services, that is) should be accomplished by using a folder structure. I never got to the point of trying it out myself, but just analyzing the consequences of such a strategy, it seems to me a splendid solution. I have no production experience with WCF, but I am about to launch a rather comprehensive solution using version 4.0 of .NET (ASP.NET, WCF, WF...), and at this stage I would argue that using a folder structure to separate versions of endpoints is a good solution.
The essence of such a strategy is to never change or remove the contract of an endpoint (a specific version) until you are 100% sure that it is not used any more. As your services evolve, you just add new contracts and endpoints. This could lead to code duplication if one is not as structured a developer as one should be, but by introducing a service facade the duplication becomes insignificant. A sketch of that facade idea follows below.
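To illustrate the facade arrangement (in plain Java rather than WCF, since the idea is language-agnostic), here is a minimal sketch; all type and operation names are hypothetical:

// v1 contract: frozen once published, served e.g. at /application/v1/orders
interface OrderServiceV1 {
    Order getOrder(String orderId);
}

// v2 contract: adds an operation, served side by side at /application/v2/orders
interface OrderServiceV2 {
    Order getOrder(String orderId);
    void cancelOrder(String orderId);
}

// a trivial data holder for the example
class Order {
    final String id;
    Order(String id) { this.id = id; }
}

// the facade owns the real business logic exactly once
class OrderFacade {
    Order findOrder(String orderId) {
        return new Order(orderId); // real lookup goes here
    }
    void cancel(String orderId) {
        // real cancellation logic goes here
    }
}

// the versioned endpoints are thin adapters over the shared facade
class OrderServiceV1Impl implements OrderServiceV1 {
    private final OrderFacade facade;
    OrderServiceV1Impl(OrderFacade facade) { this.facade = facade; }
    public Order getOrder(String orderId) { return facade.findOrder(orderId); }
}

class OrderServiceV2Impl implements OrderServiceV2 {
    private final OrderFacade facade;
    OrderServiceV2Impl(OrderFacade facade) { this.facade = facade; }
    public Order getOrder(String orderId) { return facade.findOrder(orderId); }
    public void cancelOrder(String orderId) { facade.cancel(orderId); }
}

Retiring v1 then means deleting one thin adapter, not untangling shared logic.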
I have been through the same situation. You can maintain the versions with the help of a custom implementation: save the workflow service URLs in a database and invoke whichever one you need.
The link below shows how the client can call a WF service by its URL:
http://rajeevkumarsrivastava.blogspot.in/2014/08/consume-workflow-service-45-from-client.html
Hope this helps