I wanted to add a custom servlet extension to Keycloak that installs an HTTP handler invoked on every request sent to Keycloak and sets up some logging MDC context, which our custom SPI code can then use to log incoming request traces correctly.
Following the docs here, I created a custom extension class:
public class UndertowHandlerExtension implements ServletExtension {
    @Override
    public void handleDeployment(DeploymentInfo deploymentInfo, ServletContext servletContext) {
        deploymentInfo.addInnerHandlerChainWrapper(TraceIdCapturingHandler::new);
    }
}
I defined my custom HTTP handler TraceIdCapturingHandler in the same JAR file, added a META-INF/services/io.undertow.servlet.ServletExtension file containing the fully qualified name of the extension class, and updated my deployment's jboss-deployment-structure.xml with the following two module dependencies:
<module name="io.undertow.servlet" />
<module name="javax.servlet.api" />
However, when my deployment is created, the extension is never invoked and my handler does not execute. Is there something I am missing in how WildFly needs to be configured for Keycloak so that my extension and handler are installed and used correctly?
EDIT:
After doing a bit of digging I realized I was headed down the wrong path. I looked at this repository and I think I need a custom RealmResourceProvider as shown here, which in turn can install my filter by obtaining an instance of ResteasyProviderFactory and invoking getContainerRequestFilterRegistry().registerSingleton().
Will try this out and report back.
Please see the edit above for my question. I was able to implement a RealmResourceProviderFactory that initializes the filters I needed on startup in its init() method:
@Override
public void init(Config.Scope scope) {
    log.info("Initializing");
    initializeKeycloakFilters();
}

private void initializeKeycloakFilters() {
    ResteasyProviderFactory providerFactory = ResteasyProviderFactory.getInstance();
    TraceIdCapturingFilter filter = new TraceIdCapturingFilter();
    providerFactory.getContainerRequestFilterRegistry().registerSingleton(filter);
}
I am trying to implement dependency injection for the first time in my new .NET MAUI project. For testing purposes, I want to fetch data from a local source in my ViewModel. The production scenario will fetch data from a remote data source using HttpClient.
Below is my code base structure:
I have an Interface:
public interface IApiService
{
    Task<bool> GetSomething(string parameter);
    Task<string> GetSomethingElse(string parameter);
}
I have two classes that implement it:
public class LocalDataStore: IApiService
public class RemoteDataStore: IApiService
In my MauiProgram.cs, when I want to use Local Data Store:
builder.Services.AddSingleton<LocalDataStore>()
builder.Services.AddSingleton<IApiService>()
And for Remote Data Store
builder.Services.AddSingleton<RemoteDataStore>()
builder.Services.AddSingleton<IApiService>()
In my ViewModel:
public class Page1ViewModel
{
    public Page1ViewModel(IApiService localDataStore)
    {
        var items = Task.Run(async () => await localDataStore.GetSomething("parameter"));
    }
}
While running the app, I get an error:
System.InvalidOperationException: 'Unable to resolve service for type '...IApiService' while attempting to activate 'ViewModels.Page1ViewModel'.'
What am I doing wrong or what else should I be doing?
Kindly help.
Thanks, and regards.
Edit:
Of course, it works if I use LocalDataStore or RemoteDataStore instead of IApiService when I register the services with the builder. But then, if I have to change from one data store to another, won't I have to change that in all the ViewModel classes?
It was trivial.
I needed to register the service like so:
builder.Services.AddSingleton<IApiService, LocalDataStore>();
Thanks to https://youtu.be/paZNvvUNFi0, I realised that.
Hope this helps someone.
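For reference, a minimal sketch of what the MauiProgram.cs registrations could look like with the types from the question (the commented-out line is the only thing to flip when switching data stores; registering Page1ViewModel as well is my own assumption so the container can inject IApiService into it):
builder.Services.AddSingleton<IApiService, LocalDataStore>();      // local/test data store
// builder.Services.AddSingleton<IApiService, RemoteDataStore>();  // production data store
builder.Services.AddSingleton<Page1ViewModel>();                   // ViewModel gets IApiService injected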
I'm trying to port some WebJob code to the new Azure Functions. So far I've managed to import my DLLs and reference them successfully, but when I use the connection string in my code, I get an error saying I have to add the ProviderName:
The connection string 'ConnectionString' in the application's configuration file does not contain the required providerName attribute.
Normally this is not a problem, because in a WebJob (or web app) the connection string lives in the App.config or Web.config and is simply overwritten with whatever I entered in Azure.
With Azure Functions, I don't have a web.config (although I tried adding one, to no avail), so naturally the provider name is missing.
How do I specify that?
EDIT:
After some playing around and some helpful tips by people below, I've almost managed to get it working.
What I do now is the following:
var connString = **MY CONN STRING FROM CONFIG**; // Connection string without metadata etc.
EntityConnectionStringBuilder b = new EntityConnectionStringBuilder();
b.Metadata = "res://*/Entities.MyDB.csdl|res://*/Entities.MyDB.ssdl|res://*/Entities.MyDB.msl";
b.ProviderConnectionString = connString.ConnectionString;
b.Provider = "System.Data.SqlClient";
return new MyDB(b.ConnectionString);
This gives me what I need for calling the database. I use a static method in a partial class to get an instance of the database, which runs the above code, and I decorate my MyDB partial class with [DbConfigurationType(typeof(MyDbConfiguration))].
I define that configuration as:
public class MyDBConfiguration : DbConfiguration
{
    public MyDBConfiguration()
    {
        SetProviderFactory("System.Data.EntityClient", EntityProviderFactory.Instance);
    }
}
My problem remains when I want to actually use the EF entities. Here it tries to initialize the database type using the original configuration, giving me the original error once again, as per this stack trace:
at Void Initialize()
at System.Data.Entity.Internal.EntitySetTypePair GetEntitySetAndBaseTypeForType(System.Type)
at Void InitializeContext()
at System.Data.Entity.Core.Objects.ObjectContext CreateObjectContextFromConnectionModel()
at Void Initialize()
at Boolean TryInitializeFromAppConfig(System.String, System.Data.Entity.Internal.AppConfig)
at Void InitializeFromConnectionStringSetting(System.Configuration.ConnectionStringSettings)
So how do I avoid this? I guess I need a way to hook into everything and run my custom setter..
In the end, Stephen Reindel pushed me in the right direction: code-based configuration for Entity Framework.
[DbConfigurationType(typeof(MyDBConfiguration))]
public partial class MyDB
{
    public static MyDB GetDB()
    {
        var connString = **MY CONN STRING FROM SOMEWHERE**; // Connection string without metadata etc.
        EntityConnectionStringBuilder b = new EntityConnectionStringBuilder();
        b.Metadata = "res://*/Entities.MyDB.csdl|res://*/Entities.MyDB.ssdl|res://*/Entities.MyDB.msl";
        b.ProviderConnectionString = connString.ConnectionString;
        b.Provider = "System.Data.SqlClient";
        return new MyDB(b.ConnectionString);
    }

    public MyDB(string connectionString) : base(connectionString)
    {
    }
}
With MyDbConfiguration like this:
public class MyDBConfiguration : DbConfiguration
{
    public MyDBConfiguration()
    {
        SetProviderServices("System.Data.SqlClient", SqlProviderServices.Instance);
        SetDefaultConnectionFactory(new SqlConnectionFactory());
    }
}
With the above code, EF never asks for AppConfig-related config files. But remember, if you have EF entries in your config file, it will attempt to use them, so make sure they're gone.
In terms of Azure Functions, this means I used the configuration panel in the Azure portal to enter my connection string without the metadata and provider name, and then loaded that value in GetDB.
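For example, a sketch of how the **MY CONN STRING FROM SOMEWHERE** line could be filled in on the v1 (.NET Framework) runtime, assuming the portal entry is named "MyDB" (that name is just a placeholder):
// Hypothetical entry name, configured under Connection strings in the Function App settings.
var connString = ConfigurationManager.ConnectionStrings["MyDB"]; // GetDB then uses connString.ConnectionString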
Edit: As per request, here is some explanatory text of the above:
You can't add EF metadata about the connection in Azure Functions, because there is no app.config in which to specify it. This is not part of the connection string itself; it is metadata about the connection, alongside the connection string, that EF uses to map to a specific C# class, SQL provider, and so on. To avoid needing it, you hardcode it as in the example above: you create a class inheriting from DbConfiguration and mark your EF database class with it, via an attribute on a partial class.
This DbConfiguration gives you a different way to instantiate a new database object, in which the metadata is hardcoded but the connection string is retrieved from your app settings in Azure. In this example I just used a static method, but I guess it could be a new constructor as well.
Once you have this static method in play, you can use it to get a new database instance in your database code, like this:
using (var db = MyDB.GetDB())
{
    // db code here.
}
This allows you to use Entity Framework without an app.config, and you can still change the connection string via the Azure Functions app settings.
Hope that helps
Using this question, you can set your default connection factory before opening the connection by having your own DbConfiguration class (see this link also for usage):
public class MyDbConfiguration : DbConfiguration
{
    public MyDbConfiguration()
    {
        SetDefaultConnectionFactory(new SqlConnectionFactory());
    }
}
Now you need to tell your DbContext to use the new configuration. Since using web.config or app.config is not an option, you can use an attribute to apply the configuration:
[DbConfigurationType(typeof(MyDbConfiguration))]
public class MyContextContext : DbContext
{
}
Now using a connection string on your DbContext will use the SQL provider by default.
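To hand the context such a connection string (one without a providerName), one option is to add a constructor yourself; a minimal sketch, with the constructor being my own addition rather than part of the answer above:
[DbConfigurationType(typeof(MyDbConfiguration))]
public class MyContextContext : DbContext
{
    // Raw connection string (no providerName) taken, for example, from the Function app settings.
    public MyContextContext(string connectionString) : base(connectionString)
    {
    }
}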
The provided answer is perfect and helped me a lot, but it is not dynamic, since I don't want to hardcode my connection string, and if you are working with slots in Azure Functions you may need more than one connection string. I was looking for a solution where I can use more than one connection string. Here is my alternative approach, step by step, for anybody else struggling with this problem.
The most important thing is to understand that the local.settings.json file IS NOT FOR AZURE. As its name clearly says, it is there to run your app locally, so the solution has nothing to do with this file.
App.config or Web.config does not work for Azure Functions connection strings. If you have a database layer library, you can't override its connection string using either of these the way you would in ASP.NET applications.
To make it work, you need to define your connection string in the Azure portal under Application Settings of your Azure Function; there is a Connection strings section. There you should paste the connection string of your DbContext. If it is generated from an EDMX, it will look like the one below. There is also a Connection type setting; I use SQLAzure, but I tested with Custom (somebody claimed it only works with Custom) and it works with both.
metadata=res://*/Models.myDB.csdl|res://*/Models.myDB.ssdl|res://*/Models.myDB.msl;provider=System.Data.SqlClient;provider connection string='data source=[yourdbURL];initial catalog=myDB;persist security info=True;user id=xxxx;password=xxx;MultipleActiveResultSets=True;App=EntityFramework'
After you set this up, you need to read the value in your application and pass it to your DbContext. DbContext has a constructor that takes a connection string parameter; by default only the parameterless constructor is generated, but you can extend this. If you are using POCO classes, you can simply amend your DbContext class. If, like me, you use database-first EDMX-generated classes, you don't want to touch the auto-generated class; instead, create a partial class in the same namespace and extend it as below.
This is the auto-generated DbContext:
namespace myApp.Data.Models
{
    public partial class myDBEntities : DbContext
    {
        public myDBEntities()
            : base("name=myDBEntities")
        {
        }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            throw new UnintentionalCodeFirstException();
        }
    }
}
This is the new partial class that you create:
namespace myApp.Data.Models
{
    [DbConfigurationType(typeof(myDBContextConfig))]
    partial class myDBEntities
    {
        public myDBEntities(string connectionString) : base(connectionString)
        {
        }
    }

    public class myDBContextConfig : DbConfiguration
    {
        public myDBContextConfig()
        {
            SetProviderServices("System.Data.EntityClient", SqlProviderServices.Instance);
            SetDefaultConnectionFactory(new SqlConnectionFactory());
        }
    }
}
After all of this, you can get the connection string from the Azure settings in your Azure Functions project with the code below and pass it to your DbContext. myDBEntities is the name you gave your connection string in the Azure portal.
var connString = ConfigurationManager.ConnectionStrings["myDBEntities"].ConnectionString;

using (var dbContext = new myDBEntities(connString))
{
    //TODO:
}
Adding an answer in case you cannot simply change the way you instantiate your DbContext. This happens if you are calling code that instantiates DbContexts with the parameterless constructor.
It involves using a static constructor to read your connection string from the app settings in the Azure portal and pass it to your DbContext base constructor. This lets you circumvent the need for a providerName and also lets you keep using the portal configuration without hardcoding anything.
Please refer to my accepted answer here: Missing ProviderName when debugging AzureFunction as well as deploying azure function
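The linked answer has the exact code; roughly, the pattern looks something like this sketch (the class name, the "MyDb" setting name, and the parameterless constructor are illustrative assumptions, not the linked answer verbatim):
public class MyDbContext : DbContext
{
    // "MyDb" is a hypothetical connection string name configured in the portal.
    private static readonly string PortalConnectionString;

    // Static constructor: runs once, before the first instance is created.
    static MyDbContext()
    {
        PortalConnectionString = ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;
    }

    // Existing code can keep using the parameterless constructor;
    // it forwards the portal value to the DbContext base constructor.
    public MyDbContext() : base(PortalConnectionString)
    {
    }
}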
Stumbled upon this and solved it like this, inside of the Azure Function.
public static class MyFunction
{
    // Putting this in more than one place in your project will cause an exception,
    // if doing it after the DbConfiguration has been loaded.
    static MyFunction() =>
        DbConfiguration.Loaded += (_, d) =>
            d.AddDefaultResolver(new global::MySql.Data.Entity.MySqlDependencyResolver());

    // The rest of your function...
    //[FunctionName("MyFunction")]
    //public static async Task Run() {}
}
You can access the site's app settings by going to the portal, clicking Function app settings and then Configure app settings. That opens a blade that lets you set all the app settings for your Function App. Just use the same key and value that you'd use in your web.config.
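Inside the function those values can then be read like any other setting; a quick sketch, where "MyKey" stands for whatever key you entered in the blade:
// App settings surface both through ConfigurationManager and as environment variables.
var fromConfig = ConfigurationManager.AppSettings["MyKey"];
var fromEnvironment = Environment.GetEnvironmentVariable("MyKey");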
(Note: This is a simplified example intended to highlight the issue I'm seeing.)
I have a service I'm trying to register as a named service as follows:
builder.Register(c => new MyService()).Named<IMyService>("Test").SingleInstance();
I would have expected to be able to use this service in the constructor of my API Controller:
public TestController([WithKey("Test")] IMyService myService)
{
}
However, an exception gets thrown:
None of the constructors found with
'Autofac.Core.Activators.Reflection.DefaultConstructorFinder' on type
'TestController' can be invoked with the available services and parameters:
Cannot resolve parameter 'IMyService myService' of constructor 'Void
.ctor(IMyService)'.
The same code as above works when I replace the .Named() call with a .As():
builder.Register(c => new MyService()).As<IMyService>().SingleInstance();
public TestController(IMyService myService)
{
}
It also seems to work when I keep the .Named() call, but add the .As() call to it first:
builder.Register(c => new MyService()).As<IMyService>().Named<IMyService>("Test").SingleInstance();
public TestController([WithKey("Test")] IMyService myService)
{
}
Any ideas on why this behaves as it does? Am I doing something wrong in how I register named services?
From the Autofac wiki:
That component will require you to register a keyed service with the specified name. You'll also need to register the component with the filter so the container knows to look for it.
var builder = new ContainerBuilder();
// Register the keyed service to consume
builder.RegisterType<MyArtwork>().Keyed<IArtwork>("Painting");
// Specify WithAttributeFilter for the consumer
builder.RegisterType<ArtDisplay>().As<IDisplay>().WithAttributeFilter();
...
var container = builder.Build();
Notice the WithAttributeFilter(). Try to add this to your RegisterControllers() call.
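For the API controller in the question, that would look something like the following sketch (the assembly reference is an assumption about your setup):
// Web API controllers; for MVC, RegisterControllers works the same way.
builder.RegisterApiControllers(Assembly.GetExecutingAssembly()).WithAttributeFilter();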
You need to use the [WithName] attribute, not [WithKey].
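In other words, keep the .Named<IMyService>("Test") registration and change the constructor to something like:
public TestController([WithName("Test")] IMyService myService)
{
}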
I'm trying to have a SignalR hub as part of a plugin using MEF. But after calling ImportMany on a collection property and then adding the catalog/container/ComposeParts part in the Application_Start() method of the Global.asax file, all I get is:
Uncaught TypeError: Cannot read property 'server' of undefined.
I've got no clue whether the problem comes from my interface, the plugin, the Global.asax file, or the JavaScript.
The interface:
public interface IPlugin
{
}
the plugin:
[Export(typeof(IPlugin))]
[HubName("testHub")]
public class TestHub : Hub, IPlugin
{
    public string Message()
    {
        return "Hello World!";
    }
}
in the Global.asax file:
[ImportMany(typeof(IPlugin))]
private IEnumerable<IPlugin> _plugins { get; set; }

protected void Application_Start()
{
    var catalog = new AggregateCatalog();
    catalog.Catalogs.Add(new DirectoryCatalog(@"./Plugins"));
    var container = new CompositionContainer(catalog);
    container.ComposeParts(this);

    RouteTable.Routes.MapHubs();

    //log4net
    log4net.Config.XmlConfigurator.Configure();

    AreaRegistration.RegisterAllAreas();
    WebApiConfig.Register(GlobalConfiguration.Configuration);
    FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
    RouteConfig.RegisterRoutes(RouteTable.Routes);
}
And finally the JavaScript:
$(document).ready(function () {
    $.connection.hub.url = 'http://127.0.0.1/signalr/';
    var proxy = $.connection.testHub;

    $.connection.hub.start({ transport: ['webSockets', 'serverSentEvents', 'longPolling'] })
        .done(function () {
            proxy.invoke('Message').done(function (res) {
                alert(res);
            });
        })
        .fail(function () { alert("Could not Connect!"); });
});
The only information I've found was this post, but I could not make it work. Everything works fine when I add the reference manually, but when I look at "signalr/hubs" after loading the plugin, there is no reference to my hub's method.
Thanks a lot for your help.
Your problem is that SignalR caches the generated "signalr/hubs" proxy script the first time it is requested. SignalR then serves the cached script in response to every subsequent request to "signalr/hubs".
SignalR not only caches the script itself, but it also caches the collection of Hubs it finds at the start of the process.
You can work around the cached proxy script issue by simply not using the proxy script, but that still won't enable you to actually connect to Hubs defined in assemblies that are loaded after the process starts.
If you want to be able to connect to such Hubs, you will need to implement your own IHubDescriptorProvider that is aware of Hubs defined in plugins loaded at runtime.
You can register your provider with SignalR's DependencyResolver which can be passed into SignalR via the Resolver property of the HubConfiguration object you pass into MapSignalR.
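A rough sketch of that wiring, using the 1.x-style MapHubs call from the question (PluginHubDescriptorProvider is a hypothetical class of your own that implements IHubDescriptorProvider and also reports the MEF-loaded hubs):
var resolver = new DefaultDependencyResolver();
resolver.Register(typeof(IHubDescriptorProvider), () => new PluginHubDescriptorProvider(resolver));
RouteTable.Routes.MapHubs(new HubConfiguration { Resolver = resolver });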
That said, it would probably be easier to restart the app pool/server process whenever a plugin is added to the "./Plugins" directory.