ServiceStack (REST) incorrect WSDL with Mono - SOAP

I've written a simple self-hosted (in a console application) REST service with ServiceStack 3.9.70.
using System;
using System.Runtime.Serialization;
// ServiceStack support
using ServiceStack.ServiceHost;
using ServiceStack.WebHost.Endpoints;

namespace HelloWorldConsole
{
    namespace DTO
    {
        [DataContract(Namespace = "http://localhost:8080/types")]
        [Route("/hello/{Name}")]
        class Hello : IReturn<HelloResponse>
        {
            [DataMember]
            public string Name { get; set; }
        }

        [DataContract(Namespace = "http://localhost:8080/types")]
        class HelloResponse
        {
            [DataMember]
            public string Response { get; set; }
        }
    }

    class HelloService : IService
    {
        public Object Any(DTO.Hello request)
        {
            return new DTO.HelloResponse { Response = "Hello " + request.Name };
        }
    }

    public class HelloHost : AppHostHttpListenerBase
    {
        public HelloHost()
            : base("Hello Service Self-Host", typeof(HelloService).Assembly)
        { }

        public override void Configure(Funq.Container container)
        {
            SetConfig(new EndpointHostConfig
            {
                DebugMode = true,
                WsdlServiceNamespace = "http://localhost:8080/",
                WsdlSoapActionNamespace = "http://localhost:8080/",
                SoapServiceName = "HelloService"
            });
        }
    }

    class MainClass
    {
        public static void Main(string[] args)
        {
            string listenOn = "http://localhost:8080/";
            HelloHost host = new HelloHost();
            host.Init();
            host.Start(listenOn);
            Console.WriteLine("AppHost created at {0} on {1}", DateTime.Now, listenOn);
            Console.ReadKey();
        }
    }
}
Under Windows the generated WSDL is good, and if I create a client application and add a web reference to the SOAP service on localhost, I'm able to call Hello.
If I run the same code under Linux using Mono, the generated WSDL does not contain the types defined inside the DTO namespace, and if I add a web service reference in a client, I'm not able to call the Hello method.
At this link I've read that by default the same ServiceStack console app binary runs on both Windows/.NET and Mono/Linux as-is. I've tried launching the binary under Windows; the service runs, but the generated WSDL is incorrect (without the types defined in the DTO namespace).
I use Mono 2.10.8.1.
Does anyone have any suggestions?
I also have another question. If I use the latest ServiceStack release (4.0.33), I'm not able to use the SOAP endpoint.
At this link I've read that SOAP endpoints are not available when hosted on an HttpListener host. Is this a restriction introduced with the new version 4.0?
Is there no possibility of using SOAP endpoints with ServiceStack releases higher than 3.9?
Any help is appreciated.

Mono has weak and partial WCF/SOAP support, which will fail to generate WSDLs for many non-trivial Service definitions. This situation may improve in the near future now that Microsoft has open-sourced its .NET server libraries, but in the interim I recommend avoiding Mono if you want to use SOAP.
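If SOAP isn't a hard requirement, one possible workaround on Mono is to turn the SOAP endpoints off entirely so Mono's WCF stack is never exercised and clients use the REST/JSON routes instead. A rough sketch against the v3 configuration shown above (the exact Feature flag names may vary slightly between releases):

public override void Configure(Funq.Container container)
{
    SetConfig(new EndpointHostConfig
    {
        DebugMode = true,
        // Drop the SOAP endpoints; REST/JSON/XML endpoints remain available
        EnableFeatures = Feature.All.Remove(Feature.Soap)
    });
}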

Related

Resource not exported in OSGi container

I'm trying to expose a REST service through OSGi (using Apache Felix). I'm using the osgi-jax-rs-connector to publish the resource. Here is the resource interface:
@Path("/bingo")
public interface BingoService {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    @Path("/lottery")
    List<Integer> getLottery();
}
The implementation uses DS annotations to obtain a reference to a service provided in the container:
@Component(
    immediate = true,
    service = BingoService.class,
    properties = "jersey.properties")
public class Bingo implements BingoService {

    @Reference
    private RandomNumberGenerator rng;

    @Activate
    public void activateBingo() {
        System.out.println("Bingo Service activated using " +
            rng.getClass().getSimpleName());
    }

    @Override
    public List<Integer> getLottery() {
        List<Integer> result = new ArrayList<>();
        for (int i = 5; i > 0; i--) {
            result.add(rng.nextInt());
        }
        return result;
    }
}
jersey.properties simply contains this line
service.exported.interfaces=*
When I deploy the bundle it starts and registers the service correctly, but if I go to http://localhost:8181/services/bingo/lottery I get a 404.
Could someone point me to the issue or give me some advice on where to look?
Reading the documentation for the OSGi - JAX-RS Connector, it expects to find the @Path or @Provider annotations on the service instance object. You have placed them instead on an interface implemented by the component.
I'm not sure what the purpose of the BingoService interface is. This is not required for JAX-RS services. Normally you would register the resource class using its own type (e.g. service=Bingo.class) or simply java.lang.Object.

Is it possible to use one database to dynamically define the ConnectionString of another?

I've reached a bit of a brick-wall with my current project.
I have three normalised databases, one of which I want to dynamically connect to; these are:
Accounts: For secure account information, spanning clients
Configuration: For managing our clients
Client: Which will be atomic for each of our clients & hold all of their information
I need to use data stored in the "Configuration" database to modify the ConnectionString that will be used to connect to the "Client" database, but this is the bit I'm getting stuck on.
So far I've generated the entities from the databases into a project by hooking up the EntityFrameworkCore tools and using the "Scaffold-DbContext" command, and I can do simple look-ups to make sure that the databases are being connected to okay.
Now I'm trying to register the databases by adding them to the ServiceCollection; I have them added in the StartUp class as follows:
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();
    services.Configure<MvcOptions>(options =>
    {
        options.Filters.Add(new RequireHttpsAttribute());
    });
    services.AddDbContext<Accounts>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("Accounts"))
    );
    services.AddDbContext<Support>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("Configuration"))
    );
    // Erm?
    SelectClientDatabase(services);
}
Obviously the next stage is to dip into the "Configuration" database, so I've been trying to keep that contained in "SelectClientDatabase()", which just takes the IServiceCollection as a parameter and is for all intents and purposes empty for now. Over the last few days I've found some excellent write-ups on EFC and I'm currently exploring a CustomConfigurationProvider as a possible route, but I must admit I'm a little lost on starting out in ASP.Net Core.
Is it possible to hook into the freshly added DbContext within the ConfigureServices method? Or can/must I add this database to the service collection at a later point?
Thanks!
Edit 1:
I just found this post, which mentions that a DbContext cannot be used within OnConfiguring as it's still being configured, which makes a lot of sense. I'm now wondering if I can push all three DbContexts into a custom middleware to encapsulate, configure and make the connections available; something new to research.
Edit 2:
I've found another post, describing how to "Inject DbContext when database name is only known when the controller action is called", which looks like a promising starting point; however, this is for an older version of ASP.Net Core, and according to https://learn.microsoft.com "DbContextFactory" has been renamed, so I'm now working to update the example given into a possible solution.
So, I've finally worked it all out. I gave up on the factory idea as I'm not comfortable enough with asp.net-core-2.0 to spend time working it out, and I'm rushing headlong into a deadline, so the faster options are the better ones for now; I can always find time to refactor the code later (lol).
My appsettings.json file currently just contains the following (the relevant bit of appsettings.Development.json is identical):
{
    "ConnectionStrings": {
        "Accounts": "Server=testserver;Database=Accounts;Trusted_Connection=True;",
        "Client": "Server=testserver;Database={CLIENT_DB};Trusted_Connection=True;",
        "Configuration": "Server=testserver;Database=Configuration;Trusted_Connection=True;"
    },
    "Logging": {
        "IncludeScopes": false,
        "Debug": {
            "LogLevel": {
                "Default": "Warning"
            }
        },
        "Console": {
            "LogLevel": {
                "Default": "Warning"
            }
        }
    }
}
I've opted to configure the two static databases in the ConfigureServices method of StartUp; these should be configured and ready to use by the time the application gets around to having to do anything. The code there is nice and clean.
public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();
    services.Configure<MvcOptions>(options =>
    {
        //options.Filters.Add(new RequireHttpsAttribute());
    });
    services.AddDbContext<AccountsContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("Accounts"))
    );
    services.AddDbContext<ConfigContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("Configuration"))
    );
    services.AddSingleton(
        Configuration.GetSection("ConnectionStrings").Get<ConnectionStrings>()
    );
}
It turns out that one can be spoilt for choice in how to go about accessing configuration options set in appsettings.json. I'm currently trying to work out how I've managed to get it to switch to the release version instead of the development one; I can't think what I've done to toggle that...
To get the placeholder config setting I'm using a singleton to hold the string value. This is just dipping into the "ConnectionStrings" group and stuffing that Json into the "ClientConnection" object (detailed below).
services.AddSingleton(
    Configuration.GetSection("ConnectionStrings").Get<ClientConnection>()
);
Which populates the following structure (that I've just bunged off in its own file):
[DataContract(Name = "ConnectionStrings")]
public class ClientConnection
{
    [DataMember]
    public string Client { get; set; }
}
I only want this holding the connection string for the dynamically assigned database, so it's not too jazzy. The "Client" DataMember is what selects the correct key in the JSON; if I wanted a differently named node in the JSON I'd rename it to "Accounts", for instance.
Another couple of options I tested, before settling on the Singleton option, are:
services.Configure<ConnectionStrings>(Configuration.GetSection("ConnectionStrings"));
and
var derp = Configuration.GetSection("ConnectionStrings:Client");
Which I discounted, but it's worth knowing other options (they'll probably be useful for loading other configuration options later).
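For reference, if the Configure<T> route above were used instead, the bound section would typically be consumed through IOptions<T>. A minimal sketch, where ConnectionStrings is an assumed POCO with Accounts, Client and Configuration string properties, and ExampleController is purely illustrative:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Options;

public class ExampleController : Controller
{
    private readonly ConnectionStrings _connectionStrings;

    // IOptions<ConnectionStrings> is populated from the "ConnectionStrings" section
    public ExampleController(IOptions<ConnectionStrings> options)
    {
        _connectionStrings = options.Value;
    }
}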
I'm not keen on the way Controller dependencies work in ASP.Net Core 2. I was hoping I'd be able to hide them in a BaseController so they wouldn't have to be specified in every single Controller I knock out, but I've not found a way to do this yet. The dependencies needed in the Controllers are passed in the constructor; these weirded me out for a while because they're auto-magically injected.
My BaseController is set up as follows:
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;
using Microsoft.EntityFrameworkCore.Internal;
using ServiceLayer.Entities;
using System;
using System.Collections.Generic;
using System.Linq;

namespace ServiceLayer.Controllers
{
    public class BaseController : Controller
    {
        private readonly ClientConnection connectionStrings;
        private readonly AccountsContext accountsContext;
        private readonly ConfigurationContext configContext;
        public ClientTemplateContext clientContext;
        private DbContextServices DbContextServices { get; set; }

        public BaseController(AccountsContext accounts, ConfigurationContext config, ClientConnection connection) : base()
        {
            accountsContext = accounts;
            configContext = config;
            connectionStrings = connection;
        }

        public override void OnActionExecuting(ActionExecutingContext context)
        {
            base.OnActionExecuting(context);
        }
    }
}
The code for selecting the database then goes in the "OnActionExecuting()" method; this proved to be a bit of a pain as well, trying to ensure that the DbContext was set up properly. In the end I settled on:
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;
using ServiceLayer.Entities;
using System;
using System.Collections.Generic;
using System.Linq;

namespace ServiceLayer.Controllers
{
    public class BaseController : Controller
    {
        private readonly ClientConnection connectionStrings;
        private readonly AccountsContext accountsContext;
        private readonly ConfigurationContext configContext;
        public ClientTemplateContext clientContext;
        private DbContextServices DbContextServices { get; set; }

        public BaseController(AccountsContext accounts, ConfigurationContext config, ClientConnection connection) : base()
        {
            accountsContext = accounts;
            configContext = config;
            connectionStrings = connection;
        }

        public override void OnActionExecuting(ActionExecutingContext context)
        {
            // Temporary selection identifier for the company
            Guid cack = Guid.Parse("827F79C5-821B-4819-ABB8-819CBD76372F");
            var dataSource = (from c in configContext.Clients
                              where c.Cack == cack
                              join ds in configContext.DataStorage on c.CompanyId equals ds.CompanyId
                              select ds.Name).FirstOrDefault();
            // Proto-connection string
            var cs = connectionStrings.Client;
            if (!string.IsNullOrEmpty(cs) && !string.IsNullOrEmpty(dataSource))
            {
                // Populated ConnectionString
                cs = cs.Replace("{CLIENT_DB}", dataSource);
                clientContext = new ClientTemplateContext().Initialise(cs);
            }
            base.OnActionExecuting(context);
        }
    }
}
new ClientTemplateContext().Initialise() is a bit messy, but I'll clean it up when I refactor everything else. "ClientTemplateContext" is the entity-framework-core generated class that ties together all the entities it generated; I've added the following code to that class (I did try putting it in a separate file but couldn't get that working, so it's staying in there for the moment)...
public ClientTemplateContext() { }

private ClientTemplateContext(DbContextOptions options) : base(options) { }

public ClientTemplateContext Initialise(string connectionString)
{
    return new ClientTemplateContext().CreateDbContext(new[] { connectionString });
}

public ClientTemplateContext CreateDbContext(string[] args)
{
    if (args == null || !args.Any())
    {
        // Log error.
        return null;
    }
    var optionsBuilder = new DbContextOptionsBuilder<ClientTemplateContext>();
    optionsBuilder.UseSqlServer(args[0]);
    return new ClientTemplateContext(optionsBuilder.Options);
}
I also included using Microsoft.EntityFrameworkCore.Design; and added the IDesignTimeDbContextFactory<ClientTemplateContext> interface to the class. So it looks like this:
public partial class ClientTemplateContext : DbContext, IDesignTimeDbContextFactory<ClientTemplateContext>
This is where CreateDbContext(string[] args) comes from, and it allows us to create a new instance of a derived context at design time.
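For what it's worth, the same factory method can also be exercised directly once a fully populated connection string is in hand; the connection string below is a placeholder, not one from my configuration:

// Placeholder connection string for illustration only
var cs = "Server=testserver;Database=SomeClientDb;Trusted_Connection=True;";
var clientContext = new ClientTemplateContext().Initialise(cs);
// ...or equivalently, via the design-time factory method itself
var sameThing = new ClientTemplateContext().CreateDbContext(new[] { cs });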
Finally, the code for my test controller is as follows:
using Microsoft.AspNetCore.Mvc;
using ServiceLayer.Entities;
using System.Collections.Generic;
using System.Linq;

namespace ServiceLayer.Controllers
{
    [Route("api/[controller]")]
    public class ValuesController : BaseController
    {
        public ValuesController(
            AccountsContext accounts,
            ConfigurationContext config,
            ClientConnection connection
        ) : base(accounts, config, connection) { }

        // GET api/values
        [HttpGet]
        public IEnumerable<string> Get()
        {
            var herp = (from c in clientContext.Usage
                        select c).FirstOrDefault();
            return new string[] {
                herp.TimeStamp.ToString(),
                herp.Request,
                herp.Payload
            };
        }
    }
}
This successfully yields data from the database dynamically selected from the DataSource table within the Configuration database!
["01/01/2017 00:00:00","derp","derp"]
If anyone can suggest improvements to my solution I'd love to see them; it's mashed together as it stands, and I want to refactor it as soon as I feel I'm competent enough to do so.
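One possible refactoring, offered only as an untested sketch against the code above: register ClientTemplateContext as a scoped service inside ConfigureServices, so controllers can take it through their constructors like the other contexts instead of building it in OnActionExecuting. The hard-coded Guid stands in for whatever identifies the current client in a real request, and the null/empty checks from OnActionExecuting are omitted for brevity:

// Requires: using Microsoft.Extensions.DependencyInjection;
services.AddScoped<ClientTemplateContext>(provider =>
{
    var config = provider.GetRequiredService<ConfigContext>();
    var connectionStrings = provider.GetRequiredService<ClientConnection>();

    // Same lookup as in OnActionExecuting above
    var cack = Guid.Parse("827F79C5-821B-4819-ABB8-819CBD76372F");
    var dataSource = (from c in config.Clients
                      where c.Cack == cack
                      join ds in config.DataStorage on c.CompanyId equals ds.CompanyId
                      select ds.Name).FirstOrDefault();

    var cs = connectionStrings.Client.Replace("{CLIENT_DB}", dataSource);
    return new ClientTemplateContext().Initialise(cs);
});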

Calling services from other application in the cluster

Is it possible to call services or actors from one application in a Service Fabric cluster to another? When I tried (using ActorProxy.Create with the proper Uri), I got a "No MethodDispatcher is found for interface" error.
Yes, it is possible. As long as you have the right Uri to the Service (or ActorService) and you have access to the assembly with the interface defining your service or actor, it should not be much different from calling the Service/Actor from within the same application. If you have enabled security for your service then you have to set up the certificates for the exchange as well.
If I have a simple service defined as:
public interface ICalloutService : IService
{
    Task<string> SayHelloAsync();
}

internal sealed class CalloutService : StatelessService, ICalloutService
{
    public CalloutService(StatelessServiceContext context)
        : base(context) { }

    protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
    {
        yield return new ServiceInstanceListener(this.CreateServiceRemotingListener);
    }

    public Task<string> SayHelloAsync()
    {
        return Task.FromResult("hello");
    }
}
and a simple actor:
public interface ICalloutActor : IActor
{
    Task<string> SayHelloAsync();
}

[StatePersistence(StatePersistence.None)]
internal class CalloutActor : Actor, ICalloutActor
{
    public CalloutActor(ActorService actorService, ActorId actorId)
        : base(actorService, actorId) { }

    public Task<string> SayHelloAsync()
    {
        return Task.FromResult("hello");
    }
}
running in an application, then you can call it from another application within the same cluster:
// Call the service
var calloutServiceUri = new Uri(@"fabric:/ServiceFabric.SO.Answer._41655575/CalloutService");
var calloutService = ServiceProxy.Create<ICalloutService>(calloutServiceUri);
var serviceHello = await calloutService.SayHelloAsync();

// Call the actor
var calloutActorServiceUri = new Uri(@"fabric:/ServiceFabric.SO.Answer._41655575/CalloutActorService");
var calloutActor = ActorProxy.Create<ICalloutActor>(new ActorId(DateTime.Now.Millisecond), calloutActorServiceUri);
var actorHello = await calloutActor.SayHelloAsync();
You can find the right Uri in the Service Fabric Explorer if you click the service and look at the name. By default the Uri of a service is: fabric:/{applicationName}/{serviceName}.
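As a small illustration of that convention, the proxy Uris above can equally be built from the application and service names exactly as they appear in Service Fabric Explorer (the names here are just the ones from this sample):

var applicationName = "ServiceFabric.SO.Answer._41655575";   // from Service Fabric Explorer
var serviceUri = new Uri($"fabric:/{applicationName}/CalloutService");
var actorServiceUri = new Uri($"fabric:/{applicationName}/CalloutActorService");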
The only tricky part is: how do you get the interface from the external service to your calling service? You could simply reference the built .exe for the service you wish to call, or you could package the assembly containing the interface as a NuGet package and put it on a private feed.
If you don't do this and instead just share the code between your Visual Studio solutions, Service Fabric will think these are two different interfaces, even if they share the exact same signature. If you do it for a Service you get a NotImplementedException saying "Interface id '{xxxxxxxx}' is not implemented by object '{service}'", and if you do it for an Actor you get a KeyNotFoundException saying "No MethodDispatcher is found for interface id '-{xxxxxxxxxx}'".
So, to fix your problem, make sure the calling application references the same interface assembly that is used by the application you want to call.

Unit tests for simple REST client [duplicate]

This question already has answers here:
Need some advice for trying to mock a .NET WebClient or equivalent (2 answers)
Let's assume I've got a simple method which gets some data from a REST service. The method looks like:
public string GetDataFromRest(string uri)
{
    string result = String.Empty;
    using (WebClient web = new WebClient())
    {
        result = web.DownloadString(uri);
    }
    return result;
}
So now I want to create a unit test for this method. I don't want to use the external REST service; I want to fake the response from any URI without really connecting to the service. Something like: every call to GetDataFromRest(uri) in a unit test always returns some XML.
As the posted answer goes into in some detail, part of your problem is that you have a dependency on the WebClient class.
A sample wrapper for WebClient could look like:
public interface IWebClient
{
    string DownloadString(string address);
}

public class WebClientWrapper : IWebClient
{
    public string DownloadString(string address)
    {
        using (WebClient web = new WebClient())
        {
            return web.DownloadString(address);
        }
    }
}

public class MyClass
{
    private readonly IWebClient _webClient;

    public MyClass(IWebClient webClient)
    {
        _webClient = webClient;
    }

    public string GetDataFromRest(string uri)
    {
        return _webClient.DownloadString(uri);
    }
}
Now of course going this route means code that depends on IWebClient can be unit tested against a "less real" URI, or whatever you specifically control. I've only implemented one method of WebClient, but this externalizes the dependency on a real URI within GetDataFromRest, as you can now mock the return data. It also helps in that anywhere else you need a WebClient you can use the wrapper class and easily mock the returned data, since you are now programming to an interface rather than a concretion.
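To round that off, here is a rough xUnit-style test that substitutes a hand-rolled fake for the real WebClient; the XML payload is invented purely for illustration:

using Xunit;

public class FakeWebClient : IWebClient
{
    // Always returns canned XML and never touches the network
    public string DownloadString(string address) => "<response>ok</response>";
}

public class MyClassTests
{
    [Fact]
    public void GetDataFromRest_ReturnsTheFakedXml()
    {
        var sut = new MyClass(new FakeWebClient());

        var result = sut.GetDataFromRest("http://example.org/api/anything");

        Assert.Equal("<response>ok</response>", result);
    }
}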

Data Driven IoC

I am writing a program that uses IoC (Windsor v3.0) at startup to load all assemblies in a directory that implement an interface/service into a repository for the core of the application. I am, however, a newcomer to Windsor. My app polls a DB table, and when it finds a row that needs to be processed, it checks the name of the service required to process the record and requests it from the repository. I can load all the modules into the dictionary and then into the repository via configuration as in this post. All well and good, but I need it to be more dynamic.
How I envision it (pseudo-code):
List<string> enabledServices = GetServicesFromDb();
IDictionary<string, IModule> dict = new Dictionary<string, IModule>();

// Load the assemblies (this works currently!)
_container.Register(AllTypes
    .FromAssemblyInDirectory(new AssemblyFilter("Modules"))
    .BasedOn<IModule>());

// Build dictionary
foreach (string service in enabledServices)
{
    foreach (?? asmble in _container.assemblies)
    {
        if (asmble.Id == service)
            dict.Add(service, asmble);
    }
}

// Register the repository from constructed dictionary
_container.Register(
    Component
        .For<IModuleRepository>()
        .ImplementedBy<IntegrationRepository>()
        .Parameters(new { modules = dict })
);
The repository:
public class IntegrationRepository : IModuleRepository
{
    private readonly IDictionary<string, IModule> _modules;

    public IntegrationRepository(IDictionary<string, IModule> modules)
    {
        _modules = modules;
    }

    public IModule GetModule(string moduleName)
    {
        return _modules.ContainsKey(moduleName) ? _modules[moduleName] : null;
    }
}
}
IModule looks like this:
public interface IModule : IDisposable
{
    string Id { get; }
    string Description { get; }
    bool Enabled { get; set; }
    bool Validate();
    string EmailSubject { get; }
}
All modules:
Implement "IModule" interface
Reside in the "Modules" subfolder
Share a common namespace
I don't have enough experience with Windsor to know how to iterate through the container, or whether it's even possible, and _container.ResolveAll(); doesn't seem to work... at least not the way I have it in my mind.
My thoughts come from this example, which alludes to passing the object in if it has already been created, and this one, which is similar. I also saw some interesting things on the DictionaryAdapterFactory(), but I'm not confident enough to know how to use it.
Is something like this possible? Any ideas?
Instead of all this, you can just store your container globally and resolve your modules by full name everywhere:
_container.Resolve<IModule>(serviceFullName)
You can register all available services with your container and then create a provider that returns only the services that are enabled in the database. Your components should then, of course, only access these services through the provider.
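A provider along those lines might look roughly like the sketch below. It assumes the modules were registered under their full type names (Windsor's default naming for BasedOn registrations) and that GetServicesFromDb() returned those same names:

using System.Collections.Generic;
using Castle.Windsor;

public class EnabledModuleProvider : IModuleRepository
{
    private readonly IWindsorContainer _container;
    private readonly HashSet<string> _enabledServices;

    public EnabledModuleProvider(IWindsorContainer container, IEnumerable<string> enabledServices)
    {
        _container = container;
        _enabledServices = new HashSet<string>(enabledServices);
    }

    public IModule GetModule(string moduleName)
    {
        // Only hand out modules that are enabled in the database
        return _enabledServices.Contains(moduleName)
            ? _container.Resolve<IModule>(moduleName)
            : null;
    }
}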