I would like to design a client that talks to a REST API. I have implemented the part that actually calls the HTTP methods on the server; I call this layer the API layer. Each operation the server exposes is encapsulated as one method in this layer. That method takes as input a ClientContext, which contains all the information needed to make the HTTP call to the server.
I'm now trying to set up the interface to this layer; let's call it ClientLayer. This interface is the one any user of my client library should use to consume the services. When calling the interface, the user should create the ClientContext and set up the request parameters depending on the operation they want to invoke. With the traditional Java approach, I would keep the ClientContext as state on my ClientLayer object:
For example:
public class ClientLayer {
private final ClientContext clientContext;
...
}
I would then have some constructors that set up my ClientContext. A sample call would look like this:
ClientLayer client = ClientLayer.getDefaultClient();
client.executeMyMethod(client.getClientContext(), new MyMethodParameters(...));
Coming to Scala, are there any suggestions on how to keep the same level of simplicity for ClientContext instantiation while avoiding having it as state on the ClientLayer?
I would use the factory pattern here:
object RestClient {
class ClientContext
class MyMethodParameters
trait Client {
def operation1(params: MyMethodParameters)
}
class MyClient(val context: ClientContext) extends Client {
def operation1(params: MyMethodParameters) = {
// do something here based on the context
}
}
object ClientFactory {
val defaultContext: ClientContext = new ClientContext // set it up here
def build(context: ClientContext): Client = {
// builder logic here
// object caching can be used to avoid instantiation of duplicate objects
context match {
case _ => new MyClient(context)
}
}
def getDefaultClient = build(defaultContext)
}
def main(args: Array[String]): Unit = {
val client = ClientFactory.getDefaultClient
client.operation1(new MyMethodParameters())
}
}
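The comment in build mentions object caching; a minimal sketch of one way to do that (assuming ClientContext has sensible value equality, for example by making it a case class) memoizes one Client per context with a concurrent map:

import scala.collection.concurrent.TrieMap

object CachingClientFactory {
  import RestClient._

  // cache one Client per ClientContext so repeated builds reuse the same instance
  private val cache = TrieMap.empty[ClientContext, Client]

  def build(context: ClientContext): Client =
    cache.getOrElseUpdate(context, new MyClient(context))

  def getDefaultClient: Client = build(ClientFactory.defaultContext)
}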
Is there a way to reconfigure the Grails 3 Link Generator to create Restful links, i.e. localhost:8080/book/{id} rather than the old style that includes the action in the URL, localhost:8080/book/show/{id}?
I'd like to have RESTful URLs in the Location headers of the responses to save actions.
I've been using this Grails Restful Link Generator as a workaround. I'm not perfectly happy with it, but it's the best I've been able to come up with thus far.
1. Create a trait in src/main/groovy that removes the superfluous action from the URL
import grails.web.mapping.LinkGenerator
trait RestfulLinkGeneratorTrait {
LinkGenerator grailsLinkGenerator
String generateLink(Map map) {
map.controller = map.controller ?: this.controllerName
map.absolute = map.absolute ?: true
map.action = map.action ?: "show"
grailsLinkGenerator.link(map).replace("/$map.action", "")
}
}
2. Implement RestfulLinkGeneratorTrait on your controller(s) and call generateLink(id: obj.id) to generate links.
@Secured('ROLE_USER')
class BookController extends RestfulController implements RestfulLinkGeneratorTrait {
//... other methods ...//
@Transactional
def save() {
// ... save your resource ... //
response.addHeader(HttpHeaders.LOCATION, generateLink(id: book.id))
respond book, [status: CREATED, view: 'show']
}
//... other methods ...//
}
I have an object like this:
import java.io.IOException

import org.apache.http.client.HttpRequestRetryHandler
import org.apache.http.client.methods.HttpPost
import org.apache.http.client.protocol.HttpClientContext
import org.apache.http.impl.client.{CloseableHttpClient, HttpClients}
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager
import org.apache.http.protocol.HttpContext

// I want to test this Object
object MyObject {
protected val retryHandler: HttpRequestRetryHandler = new HttpRequestRetryHandler {
def retryRequest(exception: IOException, executionCount: Int, context: HttpContext): Boolean = {
true // implementation
}
}
private val connectionManager: PoolingHttpClientConnectionManager = new PoolingHttpClientConnectionManager
val httpClient: CloseableHttpClient = HttpClients.custom
.setConnectionManager(connectionManager)
.setRetryHandler(retryHandler)
.build
def methodPost = {
//create new context and new Post instance
val post = new HttpPost("url")
val res = httpClient.execute(post, HttpClientContext.create)
// check response code and then take action based on response code
}
def methodPut = {
// same as methodPost except use HttpPut instead HttpPost
}
}
I want to test this object by mocking its dependencies, such as httpClient. How can I achieve this? Can I do it using Mockito, or is there a better way? If yes, how? Is there a better design for this class?
Your problem is: you created hard-to-test code. You can turn here to watch some videos to understand why that is.
The short answer: directly calling new in your production code always makes testing harder. You could be using Mockito spies (see here on how that works).
But: the better answer would be to rework your production code; for example to use dependency injection. Meaning: instead of creating the objects your class needs itself (by using new) ... your class receives those objects from somewhere.
The typical (Java) approach would be something like:
public MyClass() { this(new SomethingINeed()); }
MyClass(SomethingINeed incoming) { this.something = incoming; }
In other words: the normal usage path still calls new directly; but for unit testing you provide an alternative constructor that you can use to inject the thing(s) your class under test depends on.
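Applied to the Scala object from the question, a minimal sketch of the same idea (the class name MyService and the wiring are illustrative, not from the original code) turns the object into a class that receives its CloseableHttpClient:

import org.apache.http.client.methods.HttpPost
import org.apache.http.client.protocol.HttpClientContext
import org.apache.http.impl.client.{CloseableHttpClient, HttpClients}

// the class receives its collaborator instead of constructing it internally
class MyService(httpClient: CloseableHttpClient) {

  def methodPost(): Int = {
    val post = new HttpPost("url")
    val response = httpClient.execute(post, HttpClientContext.create())
    try response.getStatusLine.getStatusCode
    finally response.close()
  }
}

object MyService {
  // the normal usage path still builds a real client
  def default: MyService = new MyService(HttpClients.createDefault())
}

In a test you can then pass a Mockito mock or a hand-rolled fake of CloseableHttpClient into the constructor instead of the real client.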
In Cucumber, how do I go about passing variables between step definition classes? I'm trying to implement this in Scala.
Looking around I have seen people suggest using Guice or PicoContainer or some other DI framework, but I have not really come across an example in Scala.
For instance, in the example below, how do I pass the variable using DI?
Provider.scala:
class Provider extends ScalaDsl with EN with Matchers with WebBrowser {
......
When("""I click the Done button$""") {
val doneButton = getElement(By.id(providerConnectionButton))
doneButton.click()
}
Then("""a new object should be created successfully""") {
// Pass the provider ID created in this step to Consumer definition
}
}
Consumer.scala:
class Consumer extends ScalaDsl with EN with Matchers with WebBrowser {
......
When("""^I navigate to Consumer page$""") { () =>
// providerId is the id from Provider above
webDriver.navigate().to(s"${configureUrl}${providerId}")
}
}
You can use a ThreadLocal to solve your problem.
Here's a code snippet for the solution:
object IDProvider {
  private val providerId = new ThreadLocal[String]

  def getProviderId: String = {
    providerId.get()
  }

  def setProviderId(id: String): Unit = {
    providerId.set(id)
  }
}
To access the provider ID across different step definitions, you can simply call IDProvider.getProviderId.
And to set the value of the provider ID, simply call IDProvider.setProviderId(PROVIDER_ID).
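Applied to the step definitions from the question, a rough sketch of the usage looks like this (webDriver and configureUrl come from the question's classes, and readCreatedId is just a placeholder for however you obtain the new object's id):

// in Provider.scala, once the object has been created
Then("""^a new object should be created successfully$""") { () =>
  val createdId = readCreatedId() // placeholder: read the id from the page or the API response
  IDProvider.setProviderId(createdId)
}

// in Consumer.scala, a later step reads it back
When("""^I navigate to Consumer page$""") { () =>
  webDriver.navigate().to(s"${configureUrl}${IDProvider.getProviderId}")
}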
I am trying to use the Restful API plugin (using Grails 2.3.8, Groovy 2.1).
As stated in the documentation, I created a Grails service that implements RestfulServiceAdapter.
import net.hedtech.restfulapi.RestfulServiceAdapter
import com.game.trivia.Question
@Transactional
class QuestionService implements RestfulServiceAdapter {
@Override
public Object list(def service, Map params) throws Throwable {
List Q = Question.list(params)
return Q;
}
.
.
.
When trying to access the service: http://localhost:8080/test_triv/api/questions
I received the following exception:
{"errors":[{"type":"general",
"errorMessage":"No signature of method: test_triv.QuestionService.list() is applicable for argument types:
(org.codehaus.groovy.grails.web.servlet.mvc.GrailsParameterMap) values:
[[pluralizedResourceName:questions, action:[...], ...]]\nPossible solutions: list(java.lang.Object, java.util.Map),
is(java.lang.Object), wait(), find(), wait(long), with(groovy.lang.Closure)"}]}
So I implemented another list method (which is not part of the interface):
public Object list(Map params) throws Throwable {
List Q = Question.list(params)
return Q;
}
Which works ok.
Am I doing something wrong?
Do I implement the correct interface?
Do I have to expose a service for each domain, or is there any way to use an existing controller instead of a service?
Creating a new service is a big overhead! I already have controllers for all domains.
I just got a reply on this issue from Charlie (the plugin developer):
Our documentation should be clearer in this area, so I'll take an action to look at improving it.
You should not implement the RestfulServiceAdapter within a service, but implement and register an adapter that implements this interface if you need to adapt an existing service that does not provide the expected methods.
Since you are writing a new service, you can just expose the required methods (you don't need to implement any interface). Note the contract is essentially the same as the adapter interface, without the 'service' argument that represents the service to which the adapter would delegate.
To avoid needing an adapter, a service should expose these methods:
def list( Map params ) throws Throwable { ... }
def count( Map params ) throws Throwable { ... }
def show( Map params ) throws Throwable { ... }
def create( Map content, Map params ) throws Throwable { ... }
def update( def id, Map content, Map params ) throws Throwable { ... }
void delete( def id, Map content, Map params ) throws Throwable { ... }
The controller is intended to delegate to a service that contains business logic, and it cannot delegate to another controller. Our expectation is the RestfulApiController and other controllers within an application would share services (e.g., a ThingController and the RESTfulApiController could both use the same ThingService so that business logic is not duplicated).
I have read dozens of posts about the pros and cons of trying to mock / fake EF in the business logic.
I have not yet decided what to do, but one thing I know is that I have to separate the queries from the business logic.
In this post I saw that Ladislav answered that there are two good ways:
Let them be where they are and use custom extension methods, query views, mapped database views or custom defining queries to define reusable parts.
Expose every single query as a method on some separate class. The method mustn't expose IQueryable and mustn't accept Expression as a parameter = the whole query logic must be wrapped in the method. But this will make your class covering related methods much like a repository (the only one which can be mocked or faked). This implementation is close to an implementation used with stored procedures.
Which method do you think is better, and why?
Are there ANY downsides to putting the queries in their own place? (Maybe losing some functionality from EF, or something like that.)
Do I have to encapsulate even the simplest queries like:
using (MyDbContext entities = new MyDbContext())
{
User user = entities.Users.Find(userId); // ENCAPSULATE THIS ?
// Some BL Code here
}
So I guess your main point is testability of your code, isn't it? In that case you should start by counting the responsibilities of the method you want to test and then refactor your code following the single responsibility principle.
Your example code has at least three responsibilities:
Creating an object is a responsibility - the context is an object. Moreover, it is an object you don't want to use in your unit test, so you must move its creation elsewhere.
Executing a query is a responsibility. Moreover, it is a responsibility you would like to avoid in your unit test.
Doing some business logic is a responsibility.
To simplify testing you should refactor your code and divide those responsibilities into separate methods.
public class MyBLClass
{
public void MyBLMethod(int userId)
{
using (IMyContext entities = GetContext())
{
User user = GetUserFromDb(entities, userId);
// Some BL Code here
}
}
protected virtual IMyContext GetContext()
{
return new MyDbContext();
}
protected virtual User GetUserFromDb(IMyContext entities, int userId)
{
return entities.Users.Find(userId);
}
}
Now unit testing the business logic should be a piece of cake, because your unit test can inherit your class, fake the context factory method and the query execution method, and become fully independent of EF.
// NUnit unit test
[TestFixture]
public class MyBLClassTest : MyBLClass
{
private class FakeContext : IMyContext
{
// Create just empty implementation of context interface
}
private User _testUser;
[Test]
public void MyBLMethod_DoSomething()
{
// Test setup
int id = 10;
_testUser = new User
{
Id = id,
// rest is your expected test data - that is what faking is about
// faked method returns simply data your test method expects
};
// Execution of method under test
MyBLMethod(id);
// Test validation
// Assert something you expect to happen on _testUser instance
// inside MyBLMethod
}
protected override IMyContext GetContext()
{
return new FakeContext();
}
protected override User GetUserFromDb(IMyContext context, int userId)
{
return _testUser.Id == userId ? _testUser : null;
}
}
As you add more methods and your application grows, you will refactor those query execution methods and the context factory method into separate classes to follow single responsibility at the class level as well: you will get a context factory and either some query provider or, in some cases, a repository (but that repository will never return IQueryable or take an Expression as a parameter in any of its methods). This will also let you follow the DRY principle, where your context creation and most commonly used queries are defined only once, in one central place.
So at the end you can have something like this:
public class MyBLClass
{
private IContextFactory _contextFactory;
private IUserQueryProvider _userProvider;
public MyBLClass(IContextFactory contextFactory, IUserQueryProvider userProvider)
{
_contextFactory = contextFactory;
_userProvider = userProvider;
}
public void MyBLMethod(int userId)
{
using (IMyContext entities = _contextFactory.GetContext())
{
User user = _userProvider.GetUser(entities, userId);
// Some BL Code here
}
}
}
Where those interfaces will look like:
public interface IContextFactory
{
IMyContext GetContext();
}
public class MyContextFactory : IContextFactory
{
public IMyContext GetContext()
{
// Here belongs any logic necessary to create context
// If you for example want to cache context per HTTP request
// you can implement logic here.
return new MyDbContext();
}
}
and
public interface IUserQueryProvider
{
User GetUser(IMyContext context, int userId);
// Any other reusable queries for user entities
// None of the queries returns IQueryable or accepts Expression as a parameter
// For example: IEnumerable<User> GetActiveUsers();
}
public class MyUserQueryProvider : IUserQueryProvider
{
public User GetUser(IMyContext context, int userId)
{
return context.Users.Find(userId);
}
// Implementation of other queries
// Only inside query implementations you can use extension methods on IQueryable
}
Your test will now only use fakes for context factory and query provider.
// NUnit + Moq unit test
[TestFixture]
public class MyBLClassTest
{
private class FakeContext : IMyContext
{
// Create just empty implementation of context interface
}
[Test]
public void MyBLMethod_DoSomething()
{
// Test setup
int id = 10;
var user = new User
{
Id = id,
// rest is your expected test data - that is what faking is about
// faked method returns simply data your test method expects
};
var contextFactory = new Mock<IContextFactory>();
contextFactory.Setup(f => f.GetContext()).Returns(new FakeContext());
var queryProvider = new Mock<IUserQueryProvider>();
queryProvider.Setup(f => f.GetUser(It.IsAny<IMyContext>(), id)).Returns(user);
// Execution of method under test
var myBLClass = new MyBLClass(contextFactory.Object, queryProvider.Object);
myBLClass.MyBLMethod(id);
// Test validation
// Assert something you expect to happen on user instance
// inside MyBLMethod
}
}
It would be a little bit different in the case of a repository, which should have a reference to the context passed to its constructor prior to being injected into your business class.
Your business class can still define some queries which are never used in any other classes - those queries are most probably part of its logic. You can also use extension methods to define some reusable parts of queries, but you must always use those extension methods outside of the core business logic which you want to unit test (either in query execution methods or in the query provider / repository). That will allow you to easily fake the query provider or the query execution methods.
I saw your previous question and thought about writing a blog post about that topic but the core of my opinion about testing with EF is in this answer.
Edit:
The repository is a different topic which doesn't relate to your original question. A specific repository is still a valid pattern. We are not against repositories; we are against generic repositories, because they don't provide any additional features and don't solve any problem.
The problem is that a repository alone doesn't solve anything. There are three patterns which have to be used together to form a proper abstraction: Repository, Unit of Work and Specifications. All three are already available in EF: DbSet / ObjectSet as repositories, DbContext / ObjectContext as units of work and Linq to Entities as specifications. The main problem with the custom implementations of generic repositories mentioned everywhere is that they replace only the repository and the unit of work with custom implementations but still depend on the original specifications => the abstraction is incomplete, and it leaks in tests where the faked repository behaves in the same way as a faked set / context.
The main disadvantage of my query provider is the explicit method for every query you need to execute. In the case of a repository you will not have such methods; you will have just a few methods accepting a specification (but again, those specifications should be defined following the DRY principle) which will build query filtering conditions, ordering, etc.
public interface IUserRepository
{
User Find(int userId);
IEnumerable<User> FindAll(ISpecification spec);
}
The discussion of this topic is far beyond the scope of this question and it requires you to do some self study.
Btw. mocking and faking have different purposes: you fake a call if you need to get test data from a method on the dependency, and you mock the call if you need to assert that a method on the dependency was called with the expected arguments.
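For illustration, here is a minimal Scala/Mockito sketch of that distinction (the UserQueryProvider trait is made up for the example, not part of the answer above):

import org.mockito.Mockito.{mock, verify, when}

trait UserQueryProvider {
  def getUser(id: Int): String
}

object StubVsMockExample extends App {
  val provider = mock(classOf[UserQueryProvider])

  // faking/stubbing: the dependency returns canned data for the code under test to consume
  when(provider.getUser(10)).thenReturn("test-user")
  val result = provider.getUser(10) // the code under test would make this call
  assert(result == "test-user")

  // mocking: assert that the method on the dependency was called with the expected arguments
  verify(provider).getUser(10)
}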