Cannot expose service method in ADO.NET data service

I am trying to create a method that can be exposed through an ADO.NET data service. No matter what I do, the client cannot see the method that I am exposing. I am out of ideas. Please help:
[WebGet]
public ObjectResult<Product> GetAllProducts()
{
    ProductOrdersEntities entities = new ProductOrdersEntities();
    return entities.GetAllProducts();
}
I have also kept access open to the methods:
public static void InitializeService(IDataServiceConfiguration config)
{
    config.SetEntitySetAccessRule("*", EntitySetRights.All);
    config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);
}
Still, when I create a client proxy, it cannot see the method GetAllProducts().

I was told by a developer on the Astoria team that the current code-generation tool does not support generating code for service operations. By that time I had already started using the .Execute method to make an explicit HTTP request to invoke the operation, and this strategy works fine; it is just not elegant or type-safe.
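For reference, the Execute workaround looks roughly like the sketch below; the service URL is hypothetical, and DataServiceContext comes from System.Data.Services.Client. The operation name is passed as a raw string, which is exactly the lack of type safety mentioned above:

var context = new DataServiceContext(new Uri("http://localhost/ProductService.svc"));
// A typo in the operation name only surfaces at runtime.
IEnumerable<Product> products =
    context.Execute<Product>(new Uri("GetAllProducts", UriKind.Relative));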

Related

Benefit of applying Unit-of-Work on Service-layer instead of Controller

When implementing an MVC project, I usually add a service layer to perform the actual work. Sometimes, though, a single web request has to be handled by several AppService methods, and then the location of the Unit of Work (UoW) affects how the code is structured.
Whether in C# with EF or Java with Spring, there is a transaction annotation on service-layer methods, so the transaction is per service method (i.e. UoW on the service layer). Let's take the Java version as the example here:
@Transactional(propagation = Propagation.REQUIRED, isolation = Isolation.READ_COMMITTED)
public class UserAppService {
    public UserDTO createUser() {
        // Do something to create a new user
        userRepository.save(userBean);
        // Convert userBean to userDTO
        return userDTO;
    }

    public void doSth() {
        // Break the operation here
        throw new RuntimeException("Whatever");
        // sthRepository.save(someBean); // never executed
    }
}
Then in the controller:
public class SomeController : Controller {
    public xxx DoSth() {
        UserAppService service = new UserAppService();
        service.CreateUser(); // DB committed
        service.DoSth();      // exception thrown
    }
}
With this structure, if an exception is thrown in the second service call, the first service call has already committed the user to the DB. If I want all-or-nothing handling, this structure doesn't work unless I wrap those service calls in another wrapper service method with a single transaction, but that is extra work.
Another version is to use a transaction at the controller-action level instead (i.e. UoW on the controller action). Let's take C# code as the example:
Remarks: the AppService in this second version uses the DbContext (which serves as the transaction/unit of work) defined in the controller, and does not commit anything itself.
public class SomeController : Controller {
    public ActionResult DoSth() {
        using (var db = new DbContext()) {
            var userAppService = new UserAppService(db);
            var userEntity = userAppService.GetUser(userId);
            userAppService.DoSth(userEntity);

            var anotherAppService = new AnotherAppService(db);
            anotherAppService.DoSthElse(userEntity);

            // Throw exception here
            throw new Exception("Whatever");
            // db.SaveChanges(); // commit -- never reached
        }
    }
}
In this example, there will be no partial commit to the DB.
Is applying UoW on service-layer really better?
IMO no, and you've just figured out why. If the service methods are discrete and reusable, they are also not suitable to be atomic transactions.
In .NET the controller should control the transaction lifecycle, and enlist service methods in the transaction.
Note that this also implies that the service methods should be local method calls, not remote or web service calls.
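As a minimal sketch (assuming the service methods use connections that enlist in the ambient System.Transactions transaction; the names are illustrative):

public class SomeController : Controller {
    public ActionResult DoSth() {
        using (var scope = new TransactionScope()) {
            userAppService.CreateUser();
            anotherAppService.DoSthElse(); // if this throws, nothing commits
            scope.Complete(); // marks the ambient transaction for commit
        }
        return View();
    }
}

Disposing the scope without Complete() having been called rolls everything back, so the controller gets all-or-nothing semantics without the services knowing about the transaction.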
It is better because you're following a main principle of object-oriented programming: separation of concerns. What if you made another controller and wanted to do more database processing using the same object? You don't want to instantiate a controller in which you're doing something completely different. By the way, check out the Service Facade pattern (http://soapatterns.org/design_patterns/service_facade); it may help you understand why it's so sexy. Basically, you wrap your database access objects with transactions at the service layer, so a CustomerService object can wrap 1, 2, ... n operations that either all fail or all succeed.

How to run a function that returns void with an Open RIA Domain Service

I have a few functions and stored procedures to run from a Silverlight project. I have mapped them in EF6, and now I want to run these functions using an Open RIA service. I tried annotations like Query, Update, Insert, and Delete; each function performs some of these operations. A few return nothing and a few return data.
I wrote a method in the RIA domain service to call one of the functions:
public void BuildRoute4Rdinv(decimal? hpmsyear, string errorout)
{
    this.ObjectContext.BuildRoute4Rdinv(hpmsyear, ref errorout);
}
What annotation should I provide in the above case? This function doesn't return anything and has no complex type associated with it. [Query] requires returning IEnumerable, so I can't use that.
I think the one you are after is [Invoke].
Annotate your function with [Invoke] and a corresponding method will be generated on your client side.
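Applied to the method from the question, that would look like the following sketch (InvokeAttribute comes from the Open RIA Services server library):

[Invoke]
public void BuildRoute4Rdinv(decimal? hpmsyear, string errorout)
{
    this.ObjectContext.BuildRoute4Rdinv(hpmsyear, ref errorout);
}

The generated client-side DomainContext should then expose a corresponding BuildRoute4Rdinv operation you can call directly.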

Workflow: Creating Dependency Chain with Service Locator Pattern

I'm trying to get dependencies set up correctly in my Workflow application. It seems the best way to do this is using the Service Locator pattern that is provided by Workflow's WorkflowExtensions.
My workflow uses two repositories: IAssetRepository and ISenderRepository. Both have implementations using Entity Framework: EFAssetRepository, and EFSenderRepository, but I'd like both to use the same DbContext.
I'm having trouble getting both to use the same DbContext. I'm used to using IoC for dependency injection, so I thought I'd have to inject the DbContext into the EF repositories via their constructors, but that would mix the service locator and IoC patterns. I couldn't find an easy way to achieve it either, so I don't think this is the way forward.
I guess I need to chain the service locator calls, so that the constructors of my EF repositories do something like this:
public class EFAssetRepository
{
    private MyEntities entities;

    public EFAssetRepository()
    {
        this.entities = ActivityContext.GetExtension<MyEntities>();
    }
}
Obviously the above won't work because the reference to ActivityContext is made up.
How can I achieve some form of dependency chain using the service locator pattern provided for WF?
Thanks,
Nick
EDIT
I've posted a workaround for my issue below, but I'm still not happy with it. I want the code activity to be able to call metadata.RequireExtension<>(), because it should be ignorant of how extensions are loaded; it should just expect that they are. As it is, my metadata.RequireExtension<> call will stop the workflow because the extension appears not to be loaded.
It seems one way to do this is by implementing IWorkflowInstanceExtension on an extension class, to turn it into a sort of composite extension. Using this method, I can solve my problem thus:
public class UnitOfWorkExtension : IWorkflowInstanceExtension, IUnitOfWork
{
    private MyEntities entities = new MyEntities();

    // Hand the host the repository extensions, all sharing one context.
    IEnumerable<object> IWorkflowInstanceExtension.GetAdditionalExtensions()
    {
        return new object[] { new EFAssetRepository(this.entities), new EFSenderRepository(this.entities) };
    }

    void IWorkflowInstanceExtension.SetInstance(WorkflowInstanceProxy instance) { }

    public void SaveChanges()
    {
        this.entities.SaveChanges();
    }
}
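For completeness, here is how the extension gets hosted; a minimal sketch where the `activity` variable holds the workflow definition:

var invoker = new WorkflowInvoker(activity);
// Register as a factory so each workflow instance gets its own
// UnitOfWorkExtension, and therefore its own MyEntities context.
invoker.Extensions.Add(() => new UnitOfWorkExtension());
invoker.Invoke();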
The biggest downside to doing it this way is that you can't call metadata.RequireExtension<IAssetRepository>() or metadata.RequireExtension<ISenderRepository>() in the CacheMetadata method of a CodeActivity, which is common practice. Instead, you must call metadata.RequireExtension<IUnitOfWork>(), but it is still fine to call context.GetExtension<IAssetRepository>() in the Execute() method of the CodeActivity. I imagine this is because CacheMetadata is called before any workflow instances are created; until an instance exists, the extension factory hasn't run, so the additional extensions haven't been loaded into the WorkflowInstanceExtensionManager, and it simply doesn't know about them until a workflow instance is created.
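To make that concrete, a sketch of what an activity looks like under this scheme (ProcessAssetActivity is a made-up example):

public sealed class ProcessAssetActivity : CodeActivity
{
    protected override void CacheMetadata(CodeActivityMetadata metadata)
    {
        base.CacheMetadata(metadata);
        // Requiring IAssetRepository here would fail: before an instance
        // exists, only the composite extension has been registered.
        metadata.RequireExtension<IUnitOfWork>();
    }

    protected override void Execute(CodeActivityContext context)
    {
        // By Execute time the additional extensions have been loaded.
        var assets = context.GetExtension<IAssetRepository>();
        // ... work with the repository ...
        context.GetExtension<IUnitOfWork>().SaveChanges();
    }
}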

How do I inject Db into Service classes when unit testing ServiceStack.OrmLite with NUnit?

I'm interested in writing unit tests (using NUnit) for some service classes created using ServiceStack, using the "New API" (inheriting from ServiceStack.ServiceInterface.Service). The services' Db property is properly auto-wired when hosted in an ASP.NET application using AppHost, but I can't seem to figure out the proper technique when running outside of that environment. I see a variety of testing-related namespaces and classes in ServiceStack but can't find a clear example where a service's Db property is getting injected, as opposed to simply setting up a connection factory directly and then calling the various IDbConnection extension methods (Insert, Select, etc.).
I have tried having my test class inherit from ServiceStack.ServiceInterface.Testing.TestBase and overriding its Configure method to register an IDbConnectionFactory (using ":memory:"), as well as setting OrmLiteConfig.DialectProvider = SqliteDialect.Provider; in my TestFixtureSetUp, but I continue to get a NullReferenceException when calling my service methods (at ServiceStack.ServiceInterface.Service.get_Db()). It seems that the Funq container is not auto-wiring anything.
SQLite itself is properly set up, which I'm able to confirm with simpler unit tests that bypass my service classes and simply do direct IDbConnection calls.
What am I missing?
Edit
It appears that unit-testing ServiceStack services requires the existence of a host and a client, although there are ways to set this up that avoid the serialization cost (using the DirectServiceClient, as shown here), though I haven't succeeded in getting that to work in my case. I managed to get this working using an AppHostHttpListenerBase approach (see here), although it's more of an integration test than a unit test (and is accordingly slower).
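For what it's worth, the AppHostHttpListenerBase approach looked roughly like this sketch (TestAppHost, MyService, and the port are my own names; the in-memory SQLite registration is one way to wire up Db):

public class TestAppHost : AppHostHttpListenerBase
{
    public TestAppHost() : base("Test AppHost", typeof(MyService).Assembly) { }

    public override void Configure(Container container)
    {
        container.Register<IDbConnectionFactory>(
            new OrmLiteConnectionFactory(":memory:", SqliteDialect.Provider));
    }
}

// In the fixture setup, then exercise the service over HTTP:
var appHost = new TestAppHost();
appHost.Init();
appHost.Start("http://localhost:1337/");
var client = new JsonServiceClient("http://localhost:1337/");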
The docs on Testing show a couple of different approaches for injecting dependencies.
If you look at the implementation of the base Service, it just creates Db from the IDbConnectionFactory:
private IDbConnection db;
public virtual IDbConnection Db
{
    get { return db ?? (db = TryResolve<IDbConnectionFactory>().OpenDbConnection()); }
}
which it resolves from the local or global IResolver IOC container:
public static IResolver GlobalResolver { get; set; }

private IResolver resolver;

public virtual IResolver GetResolver()
{
    return resolver ?? GlobalResolver;
}

public virtual T TryResolve<T>()
{
    return this.GetResolver() == null
        ? default(T)
        : this.GetResolver().TryResolve<T>();
}
So to inject your own dependencies (when using the Service base class), you just need to configure an IAppHost with the dependencies your services need, which you can do with:
using (var appHost = new BasicAppHost {
    ConfigureContainer = c => {
        c.Register<IDbConnectionFactory>(new ...);
    }
}.Init())
{
    //...
}
You can then resolve an autowired instance of your service from the AppHost, e.g.:
var service = appHost.ResolveService<MyService>();
This will autowire all the dependencies configured in your AppHost. You can also add your own ad hoc test-specific dependencies via normal property access, e.g.:
service.MyDependency = new Mock<IMyDependency>().Object;
From then on you can just call and test your C# class methods as usual:
var response = service.Get(new RequestDto { ... });
Assert.That(response.Result, Is.EqualTo("Expected Result from DB"));
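Putting the pieces together, a complete fixture might look like this sketch, where MyService and RequestDto stand in for your own service and DTO and the in-memory SQLite registration is an assumption:

[TestFixture]
public class MyServiceTests
{
    private ServiceStackHost appHost;

    [TestFixtureSetUp]
    public void FixtureSetUp()
    {
        appHost = new BasicAppHost {
            ConfigureContainer = c => c.Register<IDbConnectionFactory>(
                new OrmLiteConnectionFactory(":memory:", SqliteDialect.Provider))
        }.Init();
    }

    [TestFixtureTearDown]
    public void FixtureTearDown()
    {
        appHost.Dispose();
    }

    [Test]
    public void Service_method_runs_with_autowired_Db()
    {
        var service = appHost.ResolveService<MyService>();
        var response = service.Get(new RequestDto());
        Assert.That(response, Is.Not.Null);
    }
}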

Dynamic configs with StructureMap

Here's what I am trying to accomplish with StructureMap.
On each web request, the database connection strings and web service URLs used by our clients vary based on some business logic. Currently, our SQL and web service client implementations receive the configs in their constructors.
I wanted to use profiles, only to discover that it is not possible to use them per request.
In our team, we're having a debate over two solutions:
1- Pass a config factory into the registry that can resolve which configuration to use when the container needs to instantiate something.
The problem I see is that we might have to use HttpContext.Items, since most of the app objects are not instantiated by StructureMap and it seems hard to get the current request context from within the factory.
2- Instantiate a container for each different configuration and decide which container to use depending on the business logic.
The problems I see are the load time, the memory consumption, and maybe the lifecycles of objects. Otherwise I don't see any real problem here; it just feels wrong to me to have multiple containers (sketched below).
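For concreteness, a sketch of option 2; the config keys and the IConfig/Config/ISqlClient types below are made up for illustration:

public static class ConfiguredContainers
{
    private static readonly IDictionary<string, Container> PerConfig =
        new Dictionary<string, Container>
        {
            { "configA", new Container(x => x.For<IConfig>().Use(new Config("connStrA", "http://serviceA"))) },
            { "configB", new Container(x => x.For<IConfig>().Use(new Config("connStrB", "http://serviceB"))) },
        };

    public static Container For(string configKey)
    {
        return PerConfig[configKey];
    }
}

// Business logic picks the key, then everything resolves from that container:
var client = ConfiguredContainers.For("configA").GetInstance<ISqlClient>();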
1- Do you see other problems?
2- Any better idea?
3- Which one would you choose?
Thank you
EDIT
and it seems hard to get the current request context from within the factory.
I don't mean HttpContext, I mean the request data. For this app, it is a WCF request object.
it seems hard to get the current request context from within the factory.
Not sure why it seems that way. Wouldn't the following do the trick?
ObjectFactory.Configure(config => {
    config.For<HttpContextBase>()
          .Use(() => new HttpContextWrapper(HttpContext.Current));
    config.For<Service>().Use<Service>();
});

var service = ObjectFactory.GetInstance<Service>();

public class ConfigurationFactory
{
    public ConfigurationFactory(System.Web.HttpContextBase context)
    {
    }
}

public class Service
{
    public Service(ConfigurationFactory configuration)
    {
    }
}