MVVM tutorial - 'Wrapping IClientChannel' questions

This is a question on Mark Miller's MVVM tutorial, here.
I have 3 questions regarding 'Wrapping IClientChannel'.
Question 1:
Mark wrote:
And here’s how you instantiate ClientChannelWrapper:
IClientChannelWrapper service = new ClientChannelWrapper("BasicHttpBinding_IMessageEndpoint");
Where “BasicHttpBinding_IMessageEndPoint” is the name of your endpoint configuration in your ClientConfig file.
My question is - when I used 'Add Service Reference...', I could create a client object without having to specify the endpoint string. Mark's method requires me to specify this string, which seems like a limitation.
Is there any way to use his 'IClientChannelWrapper' without passing the endpoint string?
Can I somehow mimic what 'Add Service Reference...' does?
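For context, this is roughly what 'Add Service Reference...' gives you - a generated proxy deriving from ClientBase<T> whose parameterless constructor picks up the endpoint from the config file on its own (the type and operation names below are placeholders, not Mark's):
// Generated proxy (hypothetical): MessageEndpointClient : ClientBase<IMessageEndpoint>, IMessageEndpoint
var client = new MessageEndpointClient(); // no endpoint name needed, as long as the
                                          // config file has exactly one endpoint
                                          // for this contract
client.SomeOperation();
That 'no string' experience is what I'd like to keep with the wrapper.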
Question 2:
I know that most examples of a WPF app calling a WCF service use ASYNC operations.
My question is - why is this?
If I use Mark's method, it means I need to write 2 interfaces (SYNC and ASYNC), which seems like overhead.
Why do people not just call the SYNC operation from a separate thread?
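To make this concrete, here is the sort of thing I mean - a rough sketch assuming .NET 4.5+ and a purely synchronous contract (_service, GetMessages and Messages are placeholder names, not part of Mark's sample):
// using System.Threading.Tasks;
// using System.Windows;
Task.Run(() =>
{
    // Blocking, synchronous WCF call, but off the UI thread.
    var messages = _service.GetMessages();

    // Marshal the result back to the UI thread before touching bound properties.
    Application.Current.Dispatcher.Invoke(() => Messages = messages);
});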
Question 3:
Does Mark's class also work with 'Duplex' services ?
I have a WCF service that I need to connect to in order to receive notifications via a callback method.
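For reference, the service I need to consume has roughly this duplex shape (the names are placeholders, not the real contract):
// using System.ServiceModel;
[ServiceContract(CallbackContract = typeof(INotificationCallback))]
public interface INotificationService
{
    [OperationContract]
    void Subscribe();
}

public interface INotificationCallback
{
    [OperationContract(IsOneWay = true)]
    void OnNotification(string message);
}
// A plain ChannelFactory<T> can't create such a channel; WCF wants
// DuplexChannelFactory<T> (or DuplexClientBase<T>) plus an InstanceContext
// wrapping the callback implementation - hence the question.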

Related

Workflow Foundation: Error when resuming a persistent workflow after activity changes

The context of the problem is this: we create workflows, we save them, and after a while a new implementation request comes in and we change an activity. After that, the workflow instances that were saved can no longer run. We get this error:
StateMachine Error: Cannot convert object 'True' to type 'System.String'.
It seems that the newly added argument breaks the serialization order?
You'll have to implement Dynamic Update in some fashion.
We are currently in the process of getting some infrastructure set up to update existing instances, and having lots of issues. Hopefully your scenario is easier to solve than ours!
Start here: https://msdn.microsoft.com/en-us/library/hh314052(v=vs.110).aspx
Word of caution: I've found various issues with Microsoft's provided code that required a lot of investigation to fix.
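In case it helps, here is a rough sketch of the flow that MSDN article describes (this is .NET 4.5's System.Activities.DynamicUpdate; the file name, instance identifiers, instance store and updated definition are placeholders, and real code needs far more error handling):
// using System.Activities;
// using System.Activities.DynamicUpdate;
// using System.Activities.XamlIntegration;
// using System.Xaml;

// 1. Load the ORIGINAL definition as an ActivityBuilder and mark it for update.
ActivityBuilder builder;
using (var reader = ActivityXamlServices.CreateBuilderReader(new XamlXmlReader("OriginalWorkflow.xaml")))
{
    builder = (ActivityBuilder)XamlServices.Load(reader);
}
DynamicUpdateServices.PrepareForUpdate(builder);

// 2. Apply the changes (the new argument/activity) to 'builder', then create the
//    map that tells WF how old instances line up with the new definition.
DynamicUpdateMap updateMap = DynamicUpdateServices.CreateUpdateMap(builder);

// 3. Load each persisted instance against the NEW definition, passing the map.
var instance = WorkflowApplication.GetInstance(instanceId, instanceStore); // placeholders
var app = new WorkflowApplication(updatedWorkflowDefinition) // the updated Activity
{
    InstanceStore = instanceStore
};
app.Load(instance, updateMap);
app.Unload(); // persists the instance, now bound to the updated definition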

Manipulating path mapping in AWS API gateway integration

I would like to modify a URL parameter /resource/{VaRiAbLe} in an API Gateway to S3 mapping so that it actually points to /my-bucket/{variable}. That is, it should accept mixed-case input and map it to a lower-case name. Mapping path variables to S3 integrations is simple enough, but I can't seem to get a lower-case mapping working.
Reading through the documentation for mapping parameters, it looks like the path parameters are simple string values (and not templated values), so defining a mapping as method.request.path.variable.toLowerCase() won't work.
Does anyone have any ideas how to implement this mapping?
Map path variables to a JSON body, and then call another API method that actually does the S3 call?
Bite the bullet, and implement a Lambda function to do the S3 get for me?
Find another S3 API method that accepts a JSON body that I can use to get the data?
Update using Orchestrated calls
Following the info from Jack, I figured I should try the orchestrated call, since the traffic volume is low enough that I'm sure I won't be able to keep the Lambda warm.
As a proof of concept, I added two methods to my resource (sitting at /resource/{variable}): GET and POST. The GET method chains to the POST, which does the actual retrieval of the data.
POST method configuration
This is a vanilla S3 proxying method, where you set the URL Path parameter for {variable} to be method.request.body.variable.
GET method configuration
This is an HTTPS proxying method. You'll need the URL of the POST method, so you'll need to deploy the API to get it. The only other configuration needed here is a body mapping template with content like:
{
"variable" : "$input.params('variable').toLowerCase()",
"something" : "$input.params('something')"
}
This should be enough to get this working.
The downside looks to be that I'm adding an extra method (POST) to that resource, which could confuse consumers of the API. I think it should be possible to put the POST on the /resource resource instead, which would at least make a bit more sense from an API design standpoint.
Depending on how frequently this API will be called, I'd either go with the Lambda proxy or chaining two API Gateway methods together. If the API is called frequently enough to keep a Lambda function warm (say once a minute), then go with Lambda. If not, go with the orchestrated API call.
The orchestrated API call would be interesting, I'd be happy to help with that if you have questions.
As far as I know the only S3 API for getting object data is the GET that is documented in their API reference.

RequestFactory's Entity Relationships

The details of the Request's with() implementation in GWT's RequestFactory are a bit unclear to me. See here for the official documentation.
Question 1:
When querying the server, RequestFactory does not automatically
populate relations in the object graph. To do this, use the with()
method on a request and specify the related property name as a String.
Does this mean that if the entity on the server uses lazy fetching, the returned EntityProxy will have all the requested objects specified in with()? It seems a bit odd to instantiate the whole object graph on the server side only to send a small piece of it to the client.
Question 2:
Does req.with("foo").with("foo"); do the same as req.with("foo"); ?
Question 3:
Does req.with("foo").with("bar"); do the same as req.with("foo","bar"); ?
NOTE: I'm having a really hard time finding the implementation details of with() in the source code, and the API docs don't help me either.
Question 1:
It probably depends on your server-side implementation.
The with() invocation will only make sure that the corresponding getter (getFoo()) is called shortly before the RF call returns to the client.
That's also the reason why you have to make sure to use an OpenSessionInView pattern; otherwise you might run into NullPointerExceptions.
Question 2:
I guess the Request<T> implements a builder pattern.
The end-result will be the same.
However, I am not sure whether the getter will be called twice or whether the with() method checks if the getter has already been requested.
Question 3:
Yes, it's the same.
As a side note, you can use req.with("foo.bar").
On the backend this will lead to a getFoo().getBar() call.

Utilizing RijndaelManaged, Enterprise Library and Autofac together

I'm newly experimenting with the cryptography application block while using Autofac as the container.
As a result, I'm using the NuGet package EntLibContrib 5.0 - Autofac Configurator.
With the DPAPI Symmetric Crypto Provider, I was able to encrypt/decrypt data just fine.
However, with RijndaelManaged, I receive an ActivationException:
Microsoft.Practices.ServiceLocation.ActivationException: Activation error occured while trying to get instance of type ISymmetricCryptoProvider, key "RijndaelManaged" ---> Autofac.Core.Registration.ComponentNotRegisteredException: The requested service 'RijndaelManaged (Microsoft.Practices.EnterpriseLibrary.Security.Cryptography.ISymmetricCryptoProvider)' has not been registered. To avoid this exception, either register a component to provide the service, check for service registration using IsRegistered(), or use the ResolveOptional() method to resolve an optional dependency.
Per instructions here: http://msdn.microsoft.com/en-us/library/ff664686(v=pandp.50).aspx
I am trying to inject CryptographyManager into MyService.
My bootstrapping code looks like this:
var builder = new ContainerBuilder();
builder.RegisterEnterpriseLibrary();
builder.RegisterType<MyService>().As<IMyService>();
_container = builder.Build();
var autofacLocator = new AutofacServiceLocator(_container);
EnterpriseLibraryContainer.Current = autofacLocator;
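For reference, the consuming class is plain constructor injection of the crypto manager - something like this (everything except CryptographyManager and EncryptSymmetric is a name I made up):
public class MyService : IMyService
{
    private readonly CryptographyManager _crypto;

    public MyService(CryptographyManager crypto)
    {
        _crypto = crypto;
    }

    public byte[] Protect(byte[] plaintext)
    {
        // "RijndaelManaged" must match the provider name configured in App.config.
        return _crypto.EncryptSymmetric("RijndaelManaged", plaintext);
    }
}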
App.config has this info defined for symmetricCryptoProviders:
name: RijndaelManaged
type: Microsoft.Practices.EnterpriseLibrary.Security.Cryptography.HashAlgorithmProvider, Microsoft.Practices.EnterpriseLibrary.Security.Cryptography, Version=5.0.505.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
algorithmType: System.Security.Cryptography.RijndaelManaged
protectedKeyFilename: [path_to_my_key]
protectedKeyProtectionScope: LocalMachine
Does anyone have experience with this combination of technologies?
After some testing, I believe I may go with a Unity container instead, since I have no preference in IoC containers other than that whatever I use should integrate nicely with ASP.NET MVC3 and HTTP-hosted WCF services.
My bootstrapping code then becomes simpler:
var container = new UnityContainer()
.AddNewExtension<EnterpriseLibraryCoreExtension>();
container.RegisterType<IMyService, MyService>();
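Resolving and using it then looks roughly like this (a minimal sketch):
// using System.Text;
// The EnterpriseLibraryCoreExtension wires up EntLib's own abstractions,
// so the abstract CryptographyManager resolves straight from the container.
var crypto = container.Resolve<CryptographyManager>();
byte[] cipher = crypto.EncryptSymmetric("RijndaelManaged", Encoding.UTF8.GetBytes("secret"));
var myService = container.Resolve<IMyService>();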
I actually wrote the Autofac EntLib configurator (with some help from some of the P&P folks). It's been tested with the exception handling block and logging block, but I haven't tried it with the cryptography stuff.
EntLib has an interesting thing where it sometimes requires registered services to be named, and I'm guessing from the exception where it says...
type ISymmetricCryptoProvider, key "RijndaelManaged"
...I'm thinking EntLib wants you to register a named service, like:
builder.Register(c =>
{
// create the HashAlgorithmProvider using
// RijndaelManaged algorithm
})
.Named<ISymmetricCryptoProvider>("RijndaelManaged");
I'm sort of guessing at the exact registration since, again, I haven't got experience with it or tested it, but the idea is that EntLib is trying to resolve a named service whereas the actual service isn't getting registered under that name.
The RegisterEnterpriseLibrary extension basically goes through and tries to use the same algorithm that Unity uses to do the named/unnamed registrations. I'm guessing you've encountered an edge case where something isn't getting handled right. EntLib is pretty tightly tied to Unity, even if they did try to abstract it away.
If you're not tied to Autofac, Unity is going to be your lowest-friction path forward. I like the ease of use and more lightweight nature of Autofac, and my apps are tied to it, so I needed everything to work that way; if you don't have such an affinity, it might be easier to just use Unity.
Sorry that's not a super answer. EntLib wire-up in IoC is a really complex beast.

WCF Data Service with EF fails to expose imported functions

(I am also using .NET 4.0 and VS 2010.)
I created a function import returning a complex type, as explained at http://msdn.microsoft.com/en-us/library/bb896231.aspx. The function import and new complex type appear in my .edmx file and in the Designer.cs file. However, the function does not appear when I view the service in the browser, and when I add or update a service reference in the client project, the function does not appear there either - as is to be expected, given the first result.
Creating an imported function and using it seems conceptually very simple and straightforward, and one would think it would just work, as Microsoft's step-by-step instructions suggest: http://msdn.microsoft.com/en-us/library/cc716672.aspx#Y798 (that article shows the SP returning entity types - I tried this as well, and it doesn't work for me either).
This blog post shows the addition of a method to the DataService class, which Microsoft's instructions omit: http://www.codegain.com/articles/wcf/miscellaneous/how-to-use-stored-procedure-in-wcf-data-service.aspx I tried adding one method returning a list of entity types and another returning a list of complex types, and still had no success; I could not access the functions either directly via the browser or from the client application via a service reference.
Thanks in advance for any help with this.
config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);
MS would do well to add a note to the walkthroughs stating that the above bit of code must be there. (It may be better to enable each operation explicitly than to use "*".)
http://www.codegain.com/articles/wcf/miscellaneous/how-to-use-stored-procedure-in-wcf-data-service.aspx shows that line of code. Also, the line is there in the code, commented out, when one creates the WCF Data Service. Some of us like to delete commented-out code that we aren't using and that seems irrelevant - perhaps doing so a bit prematurely, sometimes.
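For anyone else who hits this, the line lives in the InitializeService method - roughly like this (entity and function names are placeholders for your own model):
// using System.Data.Services;
// using System.Linq;
// using System.ServiceModel.Web;
public class MyDataService : DataService<MyEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);

        // Without this line, function imports / service operations are not exposed at all.
        config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);
    }

    // Service operation wrapping the EF function import so it shows up in the service.
    [WebGet]
    public IQueryable<MyComplexType> GetMyComplexTypes()
    {
        return CurrentDataSource.GetMyComplexTypes().AsQueryable();
    }
}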