As shown in the runnable code example below, I want to create a named scope in which a certain object instance is resolved, regardless of other unnamed scopes created after that object is created.
With regard to the documentation found here:
// You can't resolve a per-matching-lifetime-scope component
// if there's no matching scope.
using(var noTagScope = container.BeginLifetimeScope())
{
// This throws an exception because this scope doesn't
// have the expected tag and neither does any parent scope!
var fail = noTagScope.Resolve<Worker>();
}
In my case, in the example below, a parent scope DOES have a matching tag, but it still does not work. Should it?
In the following example the scopes are tidy and the parent scopes are known. In my application only the root container object is accessible, so when a scope is created it is always created from the container, not from a parent scope.
public class User
{
public string Name { get; set; }
}
public class SomeService
{
public SomeService(User user)
{
Console.WriteLine($"Injected user is named {user.Name}");
}
}
class Program
{
private static IContainer container;
private const string USER_IDENTITY_SCOPE = "SOME_NAME";
static void Main(string[] args)
{
BuildContainer();
Run();
Console.ReadKey();
}
private static void BuildContainer()
{
ContainerBuilder builder = new ContainerBuilder();
builder.RegisterType<SomeService>();
builder.RegisterType<User>().InstancePerMatchingLifetimeScope(USER_IDENTITY_SCOPE);
container = builder.Build();
}
private static void Run()
{
using (var outerScope = container.BeginLifetimeScope(USER_IDENTITY_SCOPE))
{
User outerUser = outerScope.Resolve<User>();
outerUser.Name = "Alice"; // User Alice lives in this USER_IDENTITY_SCOPE
SomeService someService = outerScope.Resolve<SomeService>(); // Alice
// Now we want to run a "process" under the identity of a different user
// Inside of the following using block, we want all services that
// receive a User object to receive Bob:
using (var innerScope = container.BeginLifetimeScope(USER_IDENTITY_SCOPE))
{
User innerUser = innerScope.Resolve<User>();
innerUser.Name = "Bob"; // We get a new instance of user as expected. User Bob lives in this USER_IDENTITY_SCOPE
// Scopes happen in my app that are unrelated to user identity - how do I retain User object despite this?
// The following is not a USER_IDENTITY_SCOPE -- We still want Bob to be the User object that is resolved:
using (var unnamedScope = container.BeginLifetimeScope())
{
// Crashes. Desired result: User Bob is injected
SomeService anotherSomeService = unnamedScope.Resolve<SomeService>();
}
}
}
}
}
Using Autofac 4.9.2 / .NET Core 2.2
In your example, you're launching the unnamed scope from the container, not from a parent with a name:
using (var unnamedScope = container.BeginLifetimeScope())
Switch that to be a child of a scope with a name and it'll work.
using (var unnamedScope = innerScope.BeginLifetimeScope())
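For illustration, the inner part of the question's Run() method might then look like this (a sketch, assuming the same registrations as in the question):

using (var innerScope = container.BeginLifetimeScope(USER_IDENTITY_SCOPE))
{
    User innerUser = innerScope.Resolve<User>();
    innerUser.Name = "Bob";

    // Begin the unnamed scope from the named scope so a matching
    // USER_IDENTITY_SCOPE parent exists in its chain.
    using (var unnamedScope = innerScope.BeginLifetimeScope())
    {
        // Resolves SomeService with the same User instance (Bob) shared from innerScope.
        SomeService anotherSomeService = unnamedScope.Resolve<SomeService>();
    }
}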
I'd also note that you've named these outerScope and innerScope but innerScope is not actually a child of the outerScope so the names are misleading. Technically the two scopes are peers.
container
    innerScope (named)
    outerScope (named)
    unnamedScope
All three are direct children of the container. If you think about sharing the user in terms of scope hierarchy, you'd need to create the unnamed child scopes from the named parents, like this:
container
    innerScope (named)
        unnamedScope
    outerScope (named)
        unnamedScope
You'll notice inner and outer are still peers - you can't have a parent and a child with the same name, so given inner and outer are both named, they'll never share the same hierarchy except for the container.
I would strongly recommend not trying to bypass a hierarchical model here. For example, say you really are trying to do this:
container
    outerScope (named)
    unnamedScope
Which might look like this:
using(var outerScope = container.BeginLifetimeScope(USER_IDENTITY_SCOPE))
using(var unnamedScope = container.BeginLifetimeScope())
{
//...
}
This is pretty much what you have in the snippet above. The only sharing these scopes have in common is at the container level. If you try to resolve something from the named scope and pass it to a peer scope, you run the risk of things being disposed out from under you, or other weird, hard-to-troubleshoot problems. For example, if outerScope gets disposed but unnamedScope lives on, you can get into trouble.
// PLEASE DO NOT DO THIS. YOU WILL RUN INTO TROUBLE.
using(var outerScope = container.BeginLifetimeScope(USER_IDENTITY_SCOPE))
{
var user = outerScope.Resolve<User>();
using(var unnamedScope = container.BeginLifetimeScope(b => b.RegisterInstance(user)))
{
//...
}
}
That's bad news waiting to happen, from odd disposal problems to things not sharing the same set of dependencies when you think they should. But, you know, we can give you the gun, it's up to you to not shoot yourself in the foot with it.
I'm new to Zenject (Extenject).
My dev environment: Win10, Unity2020, Extenject 9.2.0
Here is my question:
In the installer I bind the class:
Container.Bind<AccountInfo>().AsCached();
I inject it into classA:
private AccountInfo accountInfo;
[Inject]
private void Init(GameSetup _gameSetup, AccountInfo _accountInfo)
{
this.gameSetup = _gameSetup;
this.accountInfo = _accountInfo;
}
accountInfo.address = "xxx'; // works fine
Then I inject AccountInfo into classB:
private AccountInfo accountInfo;
[Inject]
private void Init(AccountInfo _accountInfo)
{
this.accountInfo = _accountInfo;
}
accountInfo.address = "xxx'; //NullReferenceException: Object reference not set to an instance of an object
Why accountInfo changed to null? AsCached() dosen't work? Or something worng else?
Help please~~ Thank you!
Here is my code:
Installer
"ClassA" injects GameSetup and creates an instance; works fine
"ClassB" injects GameSetup; error: null object
"ClassB" creator: I'm trying to use container.Instantiate() to create it
--- update ---
gameSetup is still a null object
There are two cases where injection will not work properly in your code:
1. The code that uses the injected object is executed before Init, for example if that code is placed in the constructor.
2. You create your GameObject/Component at runtime without using IInstantiator. While you use Zenject you should always use IInstantiator to create objects. To do that, inject IInstantiator into the object that creates other objects. IInstantiator is always bound in the container by default, so you don't have to bind it manually. For example:
public class Creator : MonoBehaviour {
[SerializeField]
private GameObject _somePrefab;
private IInstantiator _instantiator;
[Inject]
public void Initialize(IInstantiator instantiator) {
_instantiator = instantiator;
}
private void Start() {
// example of creating components
var gameObj = new GameObject(); // new empty gameobjects can be created without IInstantiator
_instantiator.InstantiateComponent<YourComponentClass>(gameObj);
// example of instantiating prefab
var prefabInstance = _instantiator.InstantiatePrefab(_somePrefab);
}
}
I'm not an expert, but I think that passing IInstantiator or the container around is not good practice. If you need to create injected instances at runtime, then you need a Factory (see the sketch after the quoted points below).
From the documentation:
1.- Best practice with DI is to only reference the container in the composition root "layer"
Note that factories are part of this layer and the container can be referenced there (which is necessary to create objects at runtime).
2.- "When instantiating objects directly, you can either use DiContainer or you can use IInstantiator, which DiContainer inherits from. However, note that injecting the DiContainer is usually a sign of bad practice, since there is almost always a better way to design your code such that you don't need to reference DiContainer directly".
3.- "Once again, best practice with dependency injection is to only reference the DiContainer in the "composition root layer""
The idea is simple and works in other containers, not just in .NET:
A singleton component, referenced from within a request context, references a transient component which in turn references a request-scoped component (some UnitOfWork).
I expected that Autofac would resolve the same scoped component in both cases:
- when I request it directly from the request scope
- when I request it by invoking a Func<>
Unfortunately the reality is quite different: Autofac attaches the SingleInstance component to the root scope and resolves the InstancePerLifetimeScope component against the root scope, introducing a memory leak (!!!) since UnitOfWork is disposable and becomes tracked by the root scope (an attempt to use a matching web request scope would simply fail to find the request scope, which is even more misleading).
Now I'm wondering whether this behavior is by design or a bug. If it is by design, I'm not sure what the use cases are or why it differs from other containers.
The example is as follows (including a working SimpleInjector case):
namespace AutofacTest
{
using System;
using System.Linq;
using System.Linq.Expressions;
using Autofac;
using NUnit.Framework;
using SimpleInjector;
using SimpleInjector.Lifestyles;
public class SingletonComponent
{
public Func<TransientComponent> Transient { get; }
public Func<ScopedComponent> Scoped { get; }
public SingletonComponent(Func<TransientComponent> transient, Func<ScopedComponent> scoped)
{
Transient = transient;
Scoped = scoped;
}
}
public class ScopedComponent : IDisposable
{
public void Dispose()
{
}
}
public class TransientComponent
{
public ScopedComponent Scoped { get; }
public TransientComponent(ScopedComponent scopedComponent)
{
this.Scoped = scopedComponent;
}
}
class Program
{
static void Main(string[] args)
{
try
{
AutofacTest();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
try
{
SimpleInjectorTest();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
}
private static void AutofacTest()
{
var builder = new ContainerBuilder();
builder.RegisterType<ScopedComponent>().InstancePerLifetimeScope();
builder.RegisterType<SingletonComponent>().SingleInstance();
builder.RegisterType<TransientComponent>();
var container = builder.Build();
var outerSingleton = container.Resolve<SingletonComponent>();
using (var scope = container.BeginLifetimeScope())
{
var singleton = scope.Resolve<SingletonComponent>();
Assert.That(outerSingleton, Is.SameAs(singleton));
var transient = scope.Resolve<TransientComponent>();
var scoped = scope.Resolve<ScopedComponent>();
Assert.That(singleton.Transient(), Is.Not.SameAs(transient));
// this fails
Assert.That(singleton.Transient().Scoped, Is.SameAs(scoped));
Assert.That(transient.Scoped, Is.SameAs(scoped));
Assert.That(singleton.Scoped(), Is.SameAs(scoped)); // this fails
Assert.That(singleton.Transient(), Is.Not.SameAs(transient));
}
}
private static void SimpleInjectorTest()
{
var container = new SimpleInjector.Container();
container.Options.AllowResolvingFuncFactories();
container.Options.DefaultScopedLifestyle = new AsyncScopedLifestyle();
container.Register<ScopedComponent>(Lifestyle.Scoped);
container.Register<SingletonComponent>(Lifestyle.Singleton);
container.Register<TransientComponent>(Lifestyle.Transient);
container.Verify();
var outerSingleton = container.GetInstance<SingletonComponent>();
using (var scope = AsyncScopedLifestyle.BeginScope(container))
{
var singleton = container.GetInstance<SingletonComponent>();
Assert.That(outerSingleton, Is.SameAs(singleton));
var transient = container.GetInstance<TransientComponent>();
var scoped = container.GetInstance<ScopedComponent>();
Assert.That(singleton.Transient(), Is.Not.SameAs(transient));
Assert.That(singleton.Transient().Scoped, Is.SameAs(scoped));
Assert.That(transient.Scoped, Is.SameAs(scoped));
Assert.That(singleton.Scoped(), Is.SameAs(scoped));
Assert.That(singleton.Transient(), Is.Not.SameAs(transient));
}
}
}
public static class SimpleInjectorExtensions
{
public static void AllowResolvingFuncFactories(this ContainerOptions options)
{
options.Container.ResolveUnregisteredType += (s, e) =>
{
var type = e.UnregisteredServiceType;
if (!type.IsGenericType || type.GetGenericTypeDefinition() != typeof(Func<>))
{
return;
}
Type serviceType = type.GetGenericArguments().First();
InstanceProducer registration = options.Container.GetRegistration(serviceType, true);
Type funcType = typeof(Func<>).MakeGenericType(serviceType);
var factoryDelegate = Expression.Lambda(funcType, registration.BuildExpression()).Compile();
e.Register(Expression.Constant(factoryDelegate));
};
}
}
}
The short version: what you're seeing is not a bug; you're just misunderstanding some of the finer points of lifetime scopes and captive dependencies.
First, a couple of background references from the Autofac docs:
Controlling Scope and Lifetime explains a lot about how lifetime scopes and their hierarchy work.
Captive Dependencies talks about why you generally shouldn't take an instance-per-lifetime-scope or instance-per-dependency item into a singleton.
Disposal talks about how Autofac auto-disposes IDisposable items and how you can opt out of that.
Implicit Relationship Types describes the Owned<T> relationship type used as part of the IDisposable opt-out.
Some big key takeaways from these docs that directly affect your situation:
Autofac tracks IDisposable components so they can be automatically disposed along with the lifetime scope. That means it will hold references to any resolved IDisposable objects until the owning lifetime scope is disposed.
You can opt out of IDisposable tracking either by registering the component as ExternallyOwned or by using Owned<T> in the constructor parameter being injected. (Instead of taking in an IDependency take in an Owned<IDependency>.)
Singletons live in the root lifetime scope. That means any time you resolve a singleton it will be resolved from the root lifetime scope. If it is IDisposable it will be tracked in the root lifetime scope and not released until that root scope - the container itself - is disposed.
The Func<T> dependency relationship is tied to the same lifetime scope as the object in which it's injected. If you have a singleton, that means the Func<T> will resolve things from the same lifetime scope as the singleton - the root lifetime scope. If you have something that's instance-per-dependency, the Func<T> will be attached to whatever scope the owning component is in.
Knowing that, you can see why your singleton, which takes in a Func<T>, keeps trying to resolve these things from the root lifetime scope. You can also see why you're seeing a memory leak situation - you haven't opted out of the disposal tracking for the things that are being resolved by that Func<T>.
So the question is, how do you fix it?
Option 1: Redesign
Generally speaking, it would be better to invert the relationship between the singleton and the thing you have to resolve via Func<T>; or stop using a singleton altogether and let that be a smaller lifetime scope.
For example, say you have some IDatabase service that needs an IPerformTransaction to get things done. The database connection is expensive to spin up, so you might make that a singleton. You might then have something like this:
public class DatabaseThing : IDatabase
{
public DatabaseThing(Func<IPerformTransaction> factory) { ... }
public void DoWork()
{
var transaction = this.factory();
transaction.DoSomethingWithData(this.Data);
}
}
So, like, the thing that's expensive to spin up uses a Func<T> to generate the cheap thing on the fly and work with it.
Inverting that relationship would look like this:
public class PerformsTransaction : IPerformTransaction
{
public PerformsTransaction(IDatabase database) { ... }
public void DoSomethingWithData()
{
this.DoSomething(this.Database.Data);
}
}
The idea is that you'd resolve the transaction thing and it'd take the singleton in as a dependency. The cheaper item could easily be disposed along with child lifetime scopes (i.e., per request) but the singleton would remain.
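For reference, the registrations for the inverted design might look something like this (a sketch using the interface names from the example above; the container setup itself is assumed):

var builder = new ContainerBuilder();

// The expensive database stays a singleton; the cheap transaction piece lives per scope.
builder.RegisterType<DatabaseThing>().As<IDatabase>().SingleInstance();
builder.RegisterType<PerformsTransaction>().As<IPerformTransaction>().InstancePerLifetimeScope();

var container = builder.Build();

// Per request / per unit of work:
using (var scope = container.BeginLifetimeScope())
{
    // The transaction gets the shared IDatabase injected and is released with the scope.
    var transaction = scope.Resolve<IPerformTransaction>();
    transaction.DoSomethingWithData();
}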
It'd be better to redesign if you can because even with the other options you'll have a rough time getting "instance per request" sorts of things into a singleton. (And that's a bad idea anyway from both a captive dependency and threading standpoint.)
Option 2: Abandon Singleton
If you can't redesign, a good second choice would be to make the lifetime of the singleton... not be a singleton. Let it be instance-per-scope or instance-per-dependency and stop using Func<T>. Let everything get resolved from a child lifetime scope and be disposed when the scope is disposed.
I recognize that's not always possible for a variety of reasons. But if it is possible, that's another way to escape the problem.
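As an illustration, a minimal sketch of this option reusing the components from the test above (in a fuller version the Func<T> parameters could become direct constructor dependencies):

var builder = new ContainerBuilder();
builder.RegisterType<ScopedComponent>().InstancePerLifetimeScope();

// No longer SingleInstance: the former singleton now lives and dies with each scope.
builder.RegisterType<SingletonComponent>().InstancePerLifetimeScope();
builder.RegisterType<TransientComponent>();
var container = builder.Build();

using (var scope = container.BeginLifetimeScope())
{
    // ScopedComponent (disposable) is created per scope and disposed with it;
    // SingletonComponent is likewise shared only within this scope.
    var component = scope.Resolve<SingletonComponent>();
    var scoped = scope.Resolve<ScopedComponent>();
}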
Option 3: Use ExternallyOwned
If you can't redesign, you could register the disposable items consumed by the singleton as ExternallyOwned.
builder.RegisterType<ThingConsumedBySingleton>()
.As<IConsumedBySingleton>()
.ExternallyOwned();
Doing that will tell Autofac to not track the disposable. You won't have the memory leak. You will be responsible for disposing the resolved objects yourself. You will also still be getting them from the root lifetime scope since the singleton is getting a Func<T> injected.
public void MethodInsideSingleton()
{
using(var thing = this.ThingFactory())
{
// Do the work you need to and dispose of the
// resolved item yourself when done.
}
}
Option 4: Owned<T>
If you don't want to always manually dispose of the service you're consuming - you only want to deal with that inside the singleton - you could register it as normal but consume a Func<Owned<T>>. Then the singleton will resolve things as expected but the container won't track it for disposal.
public void MethodInsideSingleton()
{
using(var ownedThing = this.ThingFactory())
{
var thing = ownedThing.Value;
// Do the work you need to and dispose of the
// resolved item yourself when done.
}
}
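For completeness, the consuming singleton for this option might declare its dependency like this (a sketch; SingletonService, ThingFactory and IDependency are illustrative names, and Owned<T> lives in Autofac.Features.OwnedInstances):

public class SingletonService
{
    // Each call to ThingFactory() produces a fresh Owned<IDependency> that the
    // caller owns and disposes; the container does not track it for disposal.
    private readonly Func<Owned<IDependency>> ThingFactory;

    public SingletonService(Func<Owned<IDependency>> thingFactory)
    {
        ThingFactory = thingFactory;
    }
}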
I'm having problems with the NetBeans Nodes API.
I have this line of code:
Node n = (new MyNode(X)).getChildren().getNodeAt(Y);
The call to new MyNode(X) with the same X always initializes a MyNode the same way, independent of the context.
When I place it by itself (say, in a menu action), it successfully gets the Yth child, but if I put it in an event where other Node/Children stuff happens, it returns null.
MyNode's Children implementation is a trivial subclass of Children.Keys, which is approximately:
// Node
import org.openide.nodes.AbstractNode;
class MyNode extends AbstractNode {
MyNode(MyKey key) {
super(new MyNodeChildren(key));
}
}
// Children
import java.util.Collections;
import org.openide.nodes.Children;
import org.openide.nodes.Node;
public class MyNodeChildren extends Children.Keys<MyKey> {
MyKey parentKey;
MyNodeChildren(MyKey parentKey) {
super(true); // use lazy behavior
this.parentKey = parentKey;
}
@Override
protected Node[] createNodes(MyKey key) {
return new Node[] {new MyNode(key)};
}
@Override
protected void addNotify() {
setKeys(this.parentKey.getChildrenKeys());
}
@Override
protected void removeNotify() {
setKeys(Collections.EMPTY_SET);
}
}
// MyKey is trivial.
I assume this has something to do with the lazy behavior of Children.Keys. I have the sources for the API, and I've tried stepping through it, but they're so confusing that I haven't figured anything out yet.
NetBeans IDE 7.0.1 (Build 201107282000) with up-to-date plugins.
Edit: More details
The line with the weird behavior is inside a handler for an ExplorerManager selected-nodes property change. The weird thing is that it still doesn't work when the MyNode instance isn't in the hierarchy that the ExplorerManager is using (it's not even the same class as the nodes in the ExplorerManager), and isn't being used for anything else.
Accessing the nodes instead of the underlying model is actually necessary for my use case (I need to do stuff with the PropertySets), the MyNode example is just a simpler case that still has the problem.
It is recommended to use org.openide.nodes.ChildFactory to create child nodes unless you have a specific need to use one of the Children APIs. But for the common cases the ChildFactory is sufficient.
One thing to keep in mind when using the Nodes API is that it is only a presentation layer that wraps your model and, used in conjunction with the Explorer API, makes it available to the various view components in the NetBeans platform such as org.openide.explorer.view.BeanTreeView.
Using a model called MyModel which may look something like:
public class MyModel {
private String title;
private List<MyChild> children;
public MyModel(List<MyChild> children) {
this.children = children;
}
public String getTitle() {
return title;
}
public List<MyChild> getChildren() {
return Collections.unmodifiableList(children);
}
}
You can create a ChildFactory<MyModel> that will be responsible for creating your nodes:
public class MyChildFactory extends ChildFactory<MyModel> {
private List<MyModel> myModels;
public MyChildFactory(List<MyModel> myModels) {
this.myModels = myModels;
}
protected boolean createKeys(List<MyModel> toPopulate) {
return toPopulate.addAll(myModels);
}
protected Node createNodeForKey(MyModel myModel) {
return new MyNode(myModel);
}
protected void removeNotify() {
this.myModels= null;
}
}
Then, implementing MyNode which is the presentation layer and wraps MyModel:
public class MyNode extends AbstractNode {
public MyNode(MyModel myModel) {
this(myModel, new InstanceContent());
}
private MyNode(MyModel myModel, InstanceContent content) {
super(Children.create(
new MyChildrenChildFactory(myModel.getChildren()), true),
new AbstractLookup(content)); // add a Lookup
// add myModel to the lookup so you can retrieve it later
content.add(myModel);
// set the name used in the presentation
setName(myModel.getTitle());
// set the icon used in the presentation
setIconBaseWithExtension("com/my/resouces/icon.png");
}
}
And now the MyChildrenChildFactory, which is very similar to MyChildFactory except that it takes a List<MyChild> and in turn creates MyChildNode:
public class MyChildrenChildFactory extends ChildFactory<MyChild> {
private List<MyChild> myChildren;
public MyChildrenChildFactory(List<MyChild> myChildren) {
this.myChildren = myChildren;
}
protected boolean createKeys(List<MyChild> toPopulate) {
return toPopulate.addAll(myChildren);
}
protected Node createNodeForKey(MyChild myChild) {
return new MyChildNode(myChild);
}
protected void removeNotify() {
this.myChildren = null;
}
}
Then an implementation of MyChildNode which is very similar to MyNode:
public class MyChildNode extends AbstractNode {
public MyChildNode(MyChild myChild) {
// no children and another way to add a Lookup
super(Children.LEAF, Lookups.singleton(myChild));
// set the name used in the presentation
setName(myChild.getTitle());
// set the icon used in the presentation
setIconBaseWithExtension("com/my/resouces/child_icon.png");
}
}
And we will need the children's model, MyChild which is very similar to MyModel:
public class MyChild {
private String title;
public String getTitle() {
return title;
}
}
Finally to put it all to use, for instance with a BeanTreeView which would reside in a TopComponent that implements org.openide.explorer.ExplorerManager.Provider:
// somewhere in your TopComponent's initialization code:
List<MyModel> myModels = ...
// defined as a property in your TC
explorerManager = new ExplorerManager();
// this is the important bit and we're using true
// to tell it to create the children asynchronously
Children children = Children.create(new MyChildFactory(myModels), true);
explorerManager.setRootContext(new AbstractNode(children));
Notice that you don't need to touch the BeanTreeView and in fact it can be any view component that is included in the platform. This is the recommended way to create nodes and as I've stated, the use of nodes is as a presentation layer to be used in the various components that are included in the platform.
If you then need to get a child you can use the ExplorerManager, which you can retrieve from the TopComponent via ExplorerManager.Provider.getExplorerManager() (available because your TopComponent implements ExplorerManager.Provider); this is in fact the way a view component itself gets the nodes:
ExplorerManager explorerManager = ...
// the AbstractNode from above
Node rootContext = explorerManager.getRootContext();
// the MyNode(s) from above
Node[] nodes = rootContext.getChildren().getNodes(true);
// looking up the MyModel that we added to the lookup in the MyNode
MyModel myModel = nodes[0].getLookup().lookup(MyModel.class);
However, you must be aware that using the Children.getNodes(true) method to get your nodes will cause all of your nodes and their children to be created - they hadn't been created yet because we told the factory to create the children asynchronously. This is not the recommended way to access the data; instead you should keep a reference to the List<MyModel> and use that if at all possible. From the documentation for Children.getNodes(boolean):
...in general if you are trying to get useful data by calling this method, you are probably doing something wrong. Usually you should be asking some underlying model for information, not the nodes for children.
Again, you must remember that the Nodes API is a presentation layer and is used as an adapter between your model and your views.
Where this becomes a powerful technique is when using the same ChildFactory in different and diverse views. You can reuse the above code in many TopComponents without any modifications. You can also use a FilterNode if you need to change only a part of the presentation of a node without having to touch the original node.
Learning the Nodes API is one of the more challenging aspects of learning the NetBeans platform API, as you have undoubtedly discovered. Once you have some mastery of this API you will be able to take advantage of much more of the platform's built-in capabilities.
Please see the following resources for more information on the Nodes API:
NetBeans Nodes API Tutorial
Great introduction to the Nodes API by Antonio Vieiro
Part 5: Nodes API and Explorer & Property Sheet API by Geertjan Wielenga
JavaDocs for the Nodes API
Timon Veenstra on the NetBeans Platform Developers mailing list solved this for me.
Actions on the explorerManager are guarded to ensure consistency. A
node selection listener on an explorer manager for example cannot
manipulate the same explorer manager while handling the selection
changed event because that would require a read to write upgrade. The
change will be vetoed and die a silent death.
Are you adding the MyNode root node to the explorer manager on
initialization, or somewhere else in a listener?
My problem line is in an ExplorerManager selection change listener. I guess the Children.MUTEX lock is getting set by ExplorerManager and preventing the Children.Keys instance from populating its Nodes...?
Anyways, I moved my Node access into an EventQueue.invokeLater(...), so it executes after the selection changed event finishes, and that fixed it.
I have a WPF view that has a corresponding ViewModel. All instances are resolved via a Unity container. Because I'm using Prism I need two independent instances of the view to add to the two different regions the view is registered to. If I try to add one instance to both regions I get an
InvalidOperationException: Specified element is already the logical child of another element. Disconnect it first.
when the view is added into the second region because it is already added to the first region.
This problem can easily be solved by using a TransientLifetimeManager that always returns a new instance so both regions would be filled with an independent instance.
But we have decided to create a child container when a new user logs on. Every session-related view and view model is resolved using this child container. When the user's session ends, the child container is disposed, so every session-related instance is disposed as well. But with a TransientLifetimeManager the Unity container cannot dispose those instances.
What we need is a lifetime manager that always returns a new instance but is also capable of disposing those instances. Is there already such a lifetime manager around? Or is there another way to achieve what I described above?
What you want sounds like a variant of the ContainerControlledLifetimeManager that does not maintain a singleton instance but a collection of instances. Unfortunately this is not one of the built-in lifetime managers.
You can look at the code for the ContainerControlledLifetimeManager and see that it is pretty simple. Your "SynchronizedGetValue" implementation would always return null (signaling to the container that a new instance needs to be instantiated). You could just subclass ContainerControlledLifetimeManager and override that method.
I've pretty much written it. I suppose I could give you the code. :)
public class ContainerTrackedTransientLifetimeManager :
ContainerControlledLifetimeManager
{
protected override object SynchronizedGetValue()
{
return null;
}
}
That should work. I've not tested it... from the interface, it looks like it's designed for a 1-to-1 LifetimeManager-to-object relationship, but if it turns out it is more than that, you might have to override SetValue (to add to a collection of objects) and Dispose (to dispose that collection of objects). Here's that implementation:
public class ContainerTrackedTransientLifetimeManager :
SynchronizedLifetimeManager, IDisposable
{
// Thread-safe collection of every instance the container creates through this manager.
private readonly ConcurrentBag<object> values = new ConcurrentBag<object>();
protected override object SynchronizedGetValue()
{
// Always return null so the container builds a new instance on every resolve.
return null;
}
protected override void SynchronizedSetValue(object newValue)
{
// Track the created instance so it can be disposed with this manager.
values.Add(newValue);
}
public override void RemoveValue()
{
Dispose();
}
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
protected virtual void Dispose(bool disposing)
{
foreach (var disposable in values.OfType<IDisposable>())
{
disposable.Dispose();
}
}
}
I'm not sure which of these is the right answer. Let me know how it goes for you.
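As a hypothetical usage sketch (assuming the collection-tracking variant above; MyView stands in for a session-scoped view type), the per-session child container from the question could be wired like this:

// Created when a user logs on.
var sessionContainer = container.CreateChildContainer();
sessionContainer.RegisterType<MyView>(new ContainerTrackedTransientLifetimeManager());

var viewA = sessionContainer.Resolve<MyView>(); // new instance
var viewB = sessionContainer.Resolve<MyView>(); // another, independent instance

// When the session ends, disposing the child container should also dispose
// the instances tracked by the lifetime manager.
sessionContainer.Dispose();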
When you use the transient lifetime manager (which is the default), Unity does not keep a reference to the created instance.
Thus, when there are no more references to the instance, it will be garbage collected.
Is there any way I can compose (or get an exported value) with a specific instance as one of its dependencies?
I have something like this:
public interface IEntityContext
{
IEntitySet<T> GetEntitySet<T>();
}
[Export(typeof(IEntitySet<MyEntity>))]
class MyEntitySet
{
public MyEntitySet(IEntityContext context)
{
}
}
// then through code
var container = ...;
using (var context = container.GetExportedValue<IEntityContext>())
{
var myEntitySet = context.GetEntitySet<MyEntity>();
// I want myEntitySet to have the above context injected into its constructor
}
I'm trying to mock something like Entity Framework for testability's sake. I'm not sure, though, if I want to go down this road. Anyway, should I be creating a new container for this very purpose? A container specific to mocking this one IEntityContext object?
So, if my understanding is correct, you want to be able to inject whatever IEntityContext is available to your instance of MyEntitySet?
[Export(typeof(IEntitySet<MyEntity>))]
public class MyEntitySet : IEntitySet<MyEntity>
{
[ImportingConstructor]
public MyEntitySet(IEntityContext context)
{
}
}
Given that, you then want to mock the IEntityContext? If so, you could do this:
var contextMock = new Mock<IEntityContext>();
var setMock = new Mock<IEntitySet<MyEntity>>();
contextMock
.Setup(m => m.GetEntitySet<MyEntity>())
.Returns(setMock.Object);
Container.ComposeExportedValue<IEntityContext>(contextMock.Object);
var context = Container.GetExportedValue<IEntityContext>();
var entitySet = context.GetEntitySet<MyEntity>();
(That's using Moq)
You can use your existing CompositionContainer infrastructure by adding an exported value.
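For context, a minimal sketch of how that wiring might look end to end. The AssemblyCatalog choice is an assumption; Mock<T> is Moq, as above, and the calls need System.ComponentModel.Composition (ComposeExportedValue) and System.ComponentModel.Composition.Hosting (AssemblyCatalog, CompositionContainer).

// Compose the parts from the assembly that contains the exports (an assumption for this sketch).
var catalog = new AssemblyCatalog(typeof(MyEntitySet).Assembly);
using (var container = new CompositionContainer(catalog))
{
    var contextMock = new Mock<IEntityContext>();

    // Make the mocked context available to any import of IEntityContext.
    container.ComposeExportedValue<IEntityContext>(contextMock.Object);

    // MyEntitySet's importing constructor now receives the mocked context.
    var entitySet = container.GetExportedValue<IEntitySet<MyEntity>>();
}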
Does that help at all? Sorry it doesn't seem exactly clear what you are trying to do...