I have a problem running multiple impersonations of users in EWS: I want to receive notifications on each impersonated user's calendar (possibly 100 users).
Currently I have an Outlook account that has rights to impersonate all the other users, and all the ExchangeService objects get this account's credentials.
The short version is that when I try to bind to an appointment via its unique ID, it works as long as I only have one thread running. As soon as I start a new thread containing a new ExchangeService with its own subscription, I don't receive any response to the Appointment.Bind() request.
When I run two instances of my program with only one thread in each, it works fine, but as soon as I start a second thread with a new ExchangeService, Appointment.Bind() doesn't give any response.
The weird part is that this worked fine two weeks ago, but it suddenly stopped working even though I didn't change my code.
I have created a quick demo of my problem:
class Program
{
    static void Main(string[] args)
    {
        var x = new OutlookListener("user1@server.com");
        var y = new OutlookListener("user2@server.com");

        new Thread(x.Start).Start();
        new Thread(y.Start).Start();

        while (true)
        {
        }
    }
}
class OutlookListener
{
    private ExchangeService _ExchangeService;
    private AutoResetEvent _Signal;

    public OutlookListener(string emailToImp)
    {
        _ExchangeService = new ExchangeService(ExchangeVersion.Exchange2010_SP1)
        {
            Credentials = new NetworkCredential("superuser@server.com", "password"),
            Url = new Uri("exchangeUrl"),
            ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, emailToImp)
        };
    }

    public void Start()
    {
        var subscription = _ExchangeService.SubscribeToStreamingNotifications(
            new FolderId[] { WellKnownFolderName.Calendar },
            EventType.Created);
        var connection = CreateStreamingSubscription(_ExchangeService, subscription);
        Console.Out.WriteLine("Subscription created.");

        _Signal = new AutoResetEvent(false);
        _Signal.WaitOne();

        subscription.Unsubscribe();
        connection.Close();
    }

    private StreamingSubscriptionConnection CreateStreamingSubscription(ExchangeService service, StreamingSubscription subscription)
    {
        var connection = new StreamingSubscriptionConnection(service, 30);
        connection.AddSubscription(subscription);
        connection.OnNotificationEvent += OnNotificationEvent;
        connection.OnSubscriptionError += OnSubscriptionError;
        connection.OnDisconnect += OnDisconnect;
        connection.Open();
        return connection;
    }

    private void OnNotificationEvent(object sender, NotificationEventArgs args)
    {
        // Extract the item ids for all Created events in the list.
        var newMails = from e in args.Events.OfType<ItemEvent>()
                       where e.EventType == EventType.Created
                       select e.ItemId;

        foreach (var newMail in newMails)
        {
            var appointment = Appointment.Bind(_ExchangeService, newMail); // This is where I don't get a response!
            Console.WriteLine(appointment.Subject);
        }
    }

    private void OnSubscriptionError(object sender, SubscriptionErrorEventArgs args)
    {
    }

    private void OnDisconnect(object sender, SubscriptionErrorEventArgs args)
    {
    }
}
Any suggestions?
I have had the same issue and found that my EWS solution was limited by two factors.
First, System.Net.ServicePointManager.DefaultConnectionLimit defaults to 2; I changed it to 20, which I believe matches the throttling policy of Exchange Online.
Second, the ConnectionGroupName property on the ExchangeService object can be used to pool connections into different groups, each of which is subject to a concurrent-connection limit consistent with the DefaultConnectionLimit property.
One way to work around the limit is to set the ConnectionGroupName property to a unique value for each ExchangeService object you create.
ExchangeService exchangeService = new ExchangeService()
{
    ConnectionGroupName = Guid.NewGuid().ToString()
};
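For the connection limit itself, here is a minimal sketch, assuming it is set once at application startup and that 20 is the right value for your throttling policy:

// Raise the default per-host connection limit before creating any ExchangeService.
// The value 20 is an assumption; align it with your Exchange / Exchange Online throttling policy.
System.Net.ServicePointManager.DefaultConnectionLimit = 20;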
Why do you need multiple threads?
In my case, I created a dictionary of services keyed by the SMTP address of each mailbox I want to impersonate, and I subscribe to them all. Everything can happen in one thread, and notifications from every user are handled in the same OnNotificationEvent.
[THIS CODE IS JUST TO SHOW THE LOGIC AND IS NOT COMPLETE FOR FULL COMPILATION AND RUN]
var service = new ExchangeService(exchangeVersion);
var serviceCred = ((System.Net.NetworkCredential)(((WebCredentials)(Services.First().Value.Credentials)).Credentials));
service.Credentials = new WebCredentials(serviceCred.UserName, serviceCred.Password);
service.AutodiscoverUrl(userSmtp, RedirectionUrlValidationCallback);
service.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, userSmtp);
Services.Add(userSmtp, service);
Note that Services.First().Value is the service that can impersonate all the other users; here its credentials are reused so that one service is created per user.
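For illustration, the block above would typically run inside a loop over every mailbox you want to watch; a rough sketch, where userSmtpAddresses is a hypothetical list of the SMTP addresses to impersonate:

foreach (var userSmtp in userSmtpAddresses) // hypothetical list of mailboxes to impersonate
{
    var service = new ExchangeService(exchangeVersion);
    var serviceCred = ((System.Net.NetworkCredential)((WebCredentials)Services.First().Value.Credentials).Credentials);
    service.Credentials = new WebCredentials(serviceCred.UserName, serviceCred.Password);
    service.AutodiscoverUrl(userSmtp, RedirectionUrlValidationCallback);
    service.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, userSmtp);
    Services.Add(userSmtp, service);
}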
After that, subscriptions are created for all the services (note that each service is now impersonating a different user):
foreach (var service in Services.Values)
{
SubscribeToService(service);
}
The definition of SubscribeToService is as follows:
private void SubscribeToService(ExchangeService service)
{
    if (service.ImpersonatedUserId == null)
        return;
    if (service.Url == null)
        return;

    var serviceName = service.ImpersonatedUserId.Id;
    var streamingSubscription = service.SubscribeToStreamingNotifications(
        new FolderId[] { WellKnownFolderName.DeletedItems, WellKnownFolderName.Calendar },
        EventType.FreeBusyChanged, EventType.Moved, EventType.Created, EventType.Modified);

    if (!Connections.ContainsKey(service.Url))
    {
        Connections.Add(service.Url, new StreamingSubscriptionConnection(service, 30));
    }
    var connection = Connections[service.Url];
    CloseConnection(connection);

    if (!_subscriptions.ContainsKey(serviceName))
    {
        _subscriptions.Add(serviceName, streamingSubscription);
        connection.AddSubscription(streamingSubscription);
    }
}
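A sketch of one shared notification handler is below. It assumes the NotificationEventArgs.Subscription and StreamingSubscription.Service members of the EWS Managed API, so each event can be bound with the service that is impersonating the matching mailbox:

private void OnNotificationEvent(object sender, NotificationEventArgs args)
{
    // The service attached to the subscription is already impersonating the right user.
    var service = args.Subscription.Service;

    var createdItemIds = args.Events.OfType<ItemEvent>()
                             .Where(e => e.EventType == EventType.Created)
                             .Select(e => e.ItemId);

    foreach (var itemId in createdItemIds)
    {
        var appointment = Appointment.Bind(service, itemId);
        Console.WriteLine(appointment.Subject);
    }
}

Hook this handler up with connection.OnNotificationEvent += OnNotificationEvent when each StreamingSubscriptionConnection is created, just like in the question's CreateStreamingSubscription method.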
All of this can happen in a single thread; I hope my answer helps you.
Cheers
I ran into a problem at the last step of a test project using Photon Network. The first time I connect and join a room, everything works without errors. However, after completing the match, exiting the room, and calling LoadScene(), errors appear:
JoinLobby operation (229) not called because client is not connected or not yet ready, client state: JoiningLob <- in OnConnectedToMaster()
Through experimentation I realized that ConnectUsingSettings() and other Photon methods are being called multiple times. The connection to the lobby still happens and I can create a room, but then I immediately get MissingReferenceException errors.
I've seen a solution from people who ran into this very same problem: the issues were caused by event subscriptions. I unsubscribed from events everywhere this could happen, but that doesn't help. What else can cause such problems? I have obviously missed something that prevents the scene from being cleaned up completely during the transition.
Sorry for my language, I used Google Translate.
Code:
LobbyManager.cs
private void StartConnect()
{
PhotonNetwork.NickName = master.GameSettings.NickName;
PhotonNetwork.GameVersion = master.GameSettings.NickName;
PhotonNetwork.ConnectUsingSettings();
PhotonNetwork.AutomaticallySyncScene = true;
}
public override void OnConnectedToMaster()
{
Debug.Log("Connected to server");
if(!PhotonNetwork.InLobby) PhotonNetwork.JoinLobby();
}
public override void OnJoinedLobby()
{
onConnected.Invoke(); // Used to show UI elements on the Canvas
}
JoinRandomRoom class
public void OnClick_JoinRandomRoom()
{
if (!PhotonNetwork.IsConnected) return;
if (GameModeGlobalData.SelectedGameMode != null)
{
SetRoomOptions();
PhotonNetwork.JoinRandomRoom(expectedRoomProperties, GameModeGlobalData.SelectedGameMode.MaxPlayers);
}
}
public override void OnJoinRandomFailed(short returnCode, string message)
{
Debug.Log("Join random failed: " + message + ". Room will be created...");
_createRoomMenu.CreateAndJoinRoom();
}
public void SetRoomOptions()
{
expectedRoomProperties[RoomData.GAME_MODE] = GameModeGlobalData.SelectedGameMode.GameModeName;
}
private void OnDisable()
{
ShowPanels.RemoveAllListeners();
}
And CreateRoom.cs
private ExitGames.Client.Photon.Hashtable _roomCustomProperties = new ExitGames.Client.Photon.Hashtable();
public void CreateAndJoinRoom()
{
if (!PhotonNetwork.IsConnected) return;
if (GameModeGlobalData.SelectedGameMode != null)
{
RoomOptions roomOptions = GetCustomRoomOptions();
roomOptions.CleanupCacheOnLeave = true;
PhotonNetwork.CreateRoom(randomRoomName, roomOptions);
}
}
public RoomOptions GetCustomRoomOptions()
{
RoomOptions options = new RoomOptions();
options.MaxPlayers = _maxPlayer;
options.IsOpen = true;
options.IsVisible = true;
string[] roomProperties = new string[]{ RoomData.GAME_MODE };
_roomCustomProperties[RoomData.GAME_MODE] = GameModeGlobalData.SelectedGameMode.GameModeName;
options.CustomRoomPropertiesForLobby = roomProperties;
options.CustomRoomProperties = _roomCustomProperties;
return options;
}
The project has grown, and I blame myself for not testing this at the very beginning. I didn't think there would be problems at this stage.
Sorry for this post. It's resolved. For those who may encounter this in the future: in addition to unsubscribing from events, check all classes that inherit from MonoBehaviourPunCallbacks for overridden OnDisable() methods.
Like this:
public override void OnDisable()
{
base.OnDisable();
}
This in turn will call:
PhotonNetwork.RemoveCallbackTarget(this);
Also, from the documentation:
Do not add new MonoBehaviour.OnEnable or MonoBehaviour.OnDisable. Instead, you should override those and call base.OnEnable and base.OnDisable.
I forgot about it and used MonoBehaviour.OnDisable.
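A minimal sketch of what each callback class should look like in this respect, reusing the LobbyManager from the question as an example:

using Photon.Pun;

public class LobbyManager : MonoBehaviourPunCallbacks
{
    public override void OnEnable()
    {
        // Registers this instance as a callback target (PhotonNetwork.AddCallbackTarget).
        base.OnEnable();
    }

    public override void OnDisable()
    {
        // Unregisters via PhotonNetwork.RemoveCallbackTarget(this), so Photon stops
        // delivering callbacks to an object that is destroyed on scene change.
        base.OnDisable();
    }
}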
I want my app to be a viewer and send target for PDFs, but I don't want it to create new instances every time. How do I catch the View intent action in my MainActivity? I tried OnNewIntent(), but it doesn't get called. I only get the action in OnCreate(), and only if the app wasn't already running. What am I missing?
[Activity (Theme = "@style/MainTheme", Label = "MyPdfViewer", Icon = "@drawable/icon", /*MainLauncher = true, --> SplashActivity is now the MainLauncher */LaunchMode = LaunchMode.SingleTop, ConfigurationChanges = ConfigChanges.ScreenSize | ConfigChanges.Orientation)]
[IntentFilter(new[] { Intent.ActionSend }, Categories = new[] { Intent.CategoryDefault }, DataMimeType = @"application/pdf")]
[IntentFilter(new[] { Intent.ActionView }, Categories = new[] { Intent.CategoryDefault }, DataMimeType = @"application/pdf")]
public class MainActivity : global::Xamarin.Forms.Platform.Android.FormsAppCompatActivity
{
protected override void OnCreate (Bundle bundle)
{
base.OnCreate (bundle);
LoadApplication (new App ());
// handle clipboard data "send to" or "view document" actions
if (Intent.Type == "application/pdf")
{
HandleSendOrViewAction();
}
}
protected virtual void OnNewIntent()
{
var data = this.Intent.Data; // <-- never called
// do similar thing like in HandleSendOrViewAction()
}
private bool HandleSendOrViewAction()
{
// Get the info from ClipData
var pdf = Intent.ClipData.GetItemAt(0);
// Open a stream from the URI
byte[] bytes;
Stream inputStream;
if (Intent.Action == Intent.ActionSend)
inputStream = ContentResolver.OpenInputStream(pdf.Uri);
else if (Intent.Action == Intent.ActionView)
inputStream = ContentResolver.OpenInputStream(Intent.Data);
else
return false;
using (StreamReader sr = new StreamReader(inputStream))
{
MemoryStream ms = new MemoryStream();
inputStream.CopyTo(ms);
bytes = ms.ToArray();
}
Services.PdfReceiver.Base64Data = Convert.ToBase64String(bytes);
return true;
}
but don't want it to create new instances every time.
The standard and singleTop launch modes can create multiple instances. If you do not want to create a new instance every time, you could use singleTask or singleInstance instead.
For the singleTop launch mode you need to know that, if an instance of the activity already exists at the top of the target task, the system routes the intent to that instance through a call to its onNewIntent() method rather than creating a new instance of the activity. If the existing instance of the activity is not at the top, onNewIntent() will not be called.
That's why I suggest using singleTask. The system creates the activity at the root of a new task and routes the intent to it. However, if an instance of the activity already exists, the system routes the intent to the existing instance through a call to its onNewIntent() method, rather than creating a new one.
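If you go that route, the only change is in the activity attribute; a sketch based on the declaration from the question:

[Activity(Theme = "@style/MainTheme", Label = "MyPdfViewer", Icon = "@drawable/icon",
    LaunchMode = LaunchMode.SingleTask, // was LaunchMode.SingleTop
    ConfigurationChanges = ConfigChanges.ScreenSize | ConfigChanges.Orientation)]
public class MainActivity : global::Xamarin.Forms.Platform.Android.FormsAppCompatActivity
{
    // ...
}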
Using SingleTop launch mode is correct. The reason that OnNewIntent() is not being called is that you have declared it like this:
protected virtual void OnNewIntent()
That isn't correct. The signature is wrong. You need to declare it like this:
protected override void OnNewIntent(Intent intent)
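A sketch of the corrected override, reusing HandleSendOrViewAction() from the question; reassigning the Intent property before calling it is an assumption, made because that helper reads this.Intent rather than taking the intent as a parameter:

protected override void OnNewIntent(Intent intent)
{
    base.OnNewIntent(intent);

    // Point the activity's Intent property at the new intent so the existing
    // handler (which reads this.Intent) sees the latest data.
    Intent = intent;

    if (intent?.Type == "application/pdf")
    {
        HandleSendOrViewAction();
    }
}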
We are using ASP.NET Zero and are running into issues with parallel processing from an AppService. We know requests are meant to be transactional, but unfortunately we need to make numerous calls out to slow-running APIs, so we have to process them in parallel.
As expected, we are running into a DbContext concurrency issue on the second database call we make:
System.InvalidOperationException: A second operation started on this context
before a previous operation completed. This is usually caused by different
threads using the same instance of DbContext, however instance members are
not guaranteed to be thread safe. This could also be caused by a nested query
being evaluated on the client, if this is the case rewrite the query avoiding
nested invocations.
We read that a new UOW is required, so we tried using both the method attribute and the explicit UowManager, but neither of the two worked.
We also tried creating instances of the referenced AppServices using the IocResolver, but we are still not able to get a unique DbContext per thread (please see below).
public List<InvoiceDto> CreateInvoices(List<InvoiceTemplateLineItemDto> templateLineItems)
{
List<InvoiceDto> invoices = new InvoiceDto[templateLineItems.Count].ToList();
ConcurrentQueue<Exception> exceptions = new ConcurrentQueue<Exception>();
Parallel.ForEach(templateLineItems, async (templateLineItem) =>
{
try
{
XAppService xAppService = _iocResolver.Resolve<XAppService>();
InvoiceDto invoice = await xAppService
.CreateInvoiceInvoiceItem();
invoices.Insert(templateLineItems.IndexOf(templateLineItem), invoice);
}
catch (Exception e)
{
exceptions.Enqueue(e);
}
});
if (exceptions.Count > 0) throw new AggregateException(exceptions);
return invoices;
}
How can we ensure that a new DbContext is available per thread?
I was able to replicate and resolve the problem with a generic version of ABP. I'm still experiencing the problem in my original solution, which is far more complex, so I'll have to do some more digging to determine why it fails there.
For others who come across this problem (which is exactly the same issue as the one referenced here), you can simply disable the unit of work through an attribute, as illustrated in the code below.
public class InvoiceAppService : ApplicationService
{
private readonly InvoiceItemAppService _invoiceItemAppService;
public InvoiceAppService(InvoiceItemAppService invoiceItemAppService)
{
_invoiceItemAppService = invoiceItemAppService;
}
// Just add this attribute
[UnitOfWork(IsDisabled = true)]
public InvoiceDto GetInvoice(List<int> invoiceItemIds)
{
_invoiceItemAppService.Initialize();
ConcurrentQueue<InvoiceItemDto> invoiceItems =
new ConcurrentQueue<InvoiceItemDto>();
ConcurrentQueue<Exception> exceptions = new ConcurrentQueue<Exception>();
Parallel.ForEach(invoiceItemIds, (invoiceItemId) =>
{
try
{
InvoiceItemDto invoiceItemDto =
_invoiceItemAppService.CreateAsync(invoiceItemId).Result;
invoiceItems.Enqueue(invoiceItemDto);
}
catch (Exception e)
{
exceptions.Enqueue(e);
}
});
if (exceptions.Count > 0) {
AggregateException ex = new AggregateException(exceptions);
Logger.Error("Unable to get invoice", ex);
throw ex;
}
return new InvoiceDto {
Date = DateTime.Now,
InvoiceItems = invoiceItems.ToArray()
};
}
}
public class InvoiceItemAppService : ApplicationService
{
private readonly IRepository<InvoiceItem> _invoiceItemRepository;
private readonly IRepository<Token> _tokenRepository;
private readonly IRepository<Credential> _credentialRepository;
private Token _token;
private Credential _credential;
public InvoiceItemAppService(IRepository<InvoiceItem> invoiceItemRepository,
IRepository<Token> tokenRepository,
IRepository<Credential> credentialRepository)
{
_invoiceItemRepository = invoiceItemRepository;
_tokenRepository = tokenRepository;
_credentialRepository = credentialRepository;
}
public void Initialize()
{
_token = _tokenRepository.FirstOrDefault(x => x.Id == 1);
_credential = _credentialRepository.FirstOrDefault(x => x.Id == 1);
}
// Create an invoice item using info from an external API and some db records
public async Task<InvoiceItemDto> CreateAsync(int id)
{
// Get db record
InvoiceItem invoiceItem = await _invoiceItemRepository.GetAsync(id);
// Get price
decimal price = await GetPriceAsync(invoiceItem.Description);
return new InvoiceItemDto {
Id = id,
Description = invoiceItem.Description,
Amount = price
};
}
private async Task<decimal> GetPriceAsync(string description)
{
// Simulate a slow API call to get price using description
// We use the token and credentials here in the real deal
await Task.Delay(5000);
return 100.00M;
}
}
I'm trying to optimize my application to perform at maximum speed. I intend to have two threads, each executing a batch request of sales receipt additions, and two more parallel threads, each with a batch request of customer additions. I was wondering whether this is possible, or whether the API would lock the sales receipt/customer table in QuickBooks, thus only allowing one thread to proceed.
From my research I know that there are three types of entities (name list, transaction, and supporting entities). So what causes locks on these entities, i.e. which scenarios will cause a lock? Is there any documentation on this? I couldn't seem to find any.
Thanks
Locking applies to name entities (Vendor, Customer, and Employee). While creating a new name entity, the service ensures that a unique name is being inserted in the cloud, so it takes a lock across all names of these three entities.
You can try this scenario yourself using a decent-sized payload.
public static void main(String args[]) {
    PropertyConfigurator.configure("log4j.properties");
    Config.setProperty(Config.SERIALIZATION_REQUEST_FORMAT, "xml");
    Config.setProperty(Config.SERIALIZATION_RESPONSE_FORMAT, "xml");

    final Context platformContext = getPlatformContext("QBO");
    final QBOV3ProdTest qbov3ProdTest = new QBOV3ProdTest(platformContext);

    Thread customerThread = new Thread(new Runnable() {
        @Override
        public void run() {
            for (int i = 0; i < 15; i++) {
                qbov3ProdTest.addCustomer();
            }
        }
    });
    customerThread.start();

    Thread vendorThread = new Thread(new Runnable() {
        @Override
        public void run() {
            for (int i = 0; i < 15; i++) {
                qbov3ProdTest.addVendor();
            }
        }
    });
    vendorThread.start();
}
private void addCustomer() {
Customer customer = new Customer();
customer.setDisplayName("TestCustomer-" + staticCount++);
try {
this.service.add(customer);
} catch (FMSException e) {
e.printStackTrace();
}
}
private void addVendor() {
Vendor vendor = new Vendor();
vendor.setDisplayName("TestVendor-" + staticCount++);
try {
this.service.add(vendor);
} catch (FMSException e) {
e.printStackTrace();
}
}
The service doesn't return a proper response in this case; wherever it fails, it returns 401. Please let me know if you can reproduce this behaviour by trying this use case in your test QBO account.
Thanks
This is not exactly a DB locking rule; it is a consequence of the way we save data to our cache for name lists.
We do not allow users to update these entities in a multi-threaded manner:
Account,
Department,
Item,
Class,
Customer,
Employee,
Vendor,
PaymentMethod,
Terms.
The above has been confirmed by our engineering team.
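So if you need to add many name-list records, it is safer to funnel those writes through a single worker and keep parallelism for transaction entities. A minimal, hypothetical sketch in C# (the addToQuickBooks delegate stands in for whatever SDK call you use; it is not a specific API signature):

using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical helper: serializes name-list writes (Customer, Vendor, Employee, ...)
// through one gate, while transaction entities can still be added in parallel.
public class NameListWriter
{
    private readonly SemaphoreSlim _gate = new SemaphoreSlim(1, 1);

    public async Task AddAsync<T>(T entity, Action<T> addToQuickBooks)
    {
        await _gate.WaitAsync();
        try
        {
            addToQuickBooks(entity); // placeholder for the actual QuickBooks SDK call
        }
        finally
        {
            _gate.Release();
        }
    }
}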
I am attempting to upload multiple files from a Silverlight client directly to Amazon S3. The user chooses the files from the standard file-open dialog, and I want to chain the uploads so they happen serially, one at a time. This can happen from multiple places in the app, so I was trying to wrap it up in a utility class that accepts an IEnumerable of the chosen files and exposes an IObservable of the files as they are uploaded, so that the UI can respond as each file finishes.
It is fairly complex due to all the security requirements of both Silverlight and AmazonS3. I'll try to briefly explain my whole environment for context, but I have reproduced the issue with a small console application that I will post the code to below.
I have a 3rd party utility that handles uploading to S3 from Silverlight that exposes standard event based async methods. I create one instance of that utility per uploaded file. It creates an unsigned request string that I then post to my server to sign with my private key. That signing request happens through a service proxy class that also uses event based async methods. Once I have the signed request, I add it to the uploader instance and initiate the upload.
I've tried using Concat, but I end up with only the first file going through the process. When I use Merge, all files complete fine, but in parallel rather than serially. When I use Merge(2), all files start the first step, but then only two make their way through and complete.
Obviously I am missing something related to Rx since it isn't behaving like I expect.
namespace RxConcat
{
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reactive.Linq;
using System.Timers;
public class SignCompletedEventArgs : EventArgs
{
public string SignedRequest { get; set; }
}
public class ChainUploader
{
public IObservable<string> StartUploading(IEnumerable<string> files)
{
return files.Select(
file => from signArgs in this.Sign(file + "_request")
from uploadArgs in this.Upload(file, signArgs.EventArgs.SignedRequest)
select file).Concat();
}
private IObservable<System.Reactive.EventPattern<SignCompletedEventArgs>> Sign(string request)
{
Console.WriteLine("Signing request '" + request + "'");
var signer = new Signer();
var source = Observable.FromEventPattern<SignCompletedEventArgs>(ev => signer.SignCompleted += ev, ev => signer.SignCompleted -= ev);
signer.SignAsync(request);
return source;
}
private IObservable<System.Reactive.EventPattern<EventArgs>> Upload(string file, string signedRequest)
{
Console.WriteLine("Uploading file '" + file + "'");
var uploader = new Uploader();
var source = Observable.FromEventPattern<EventArgs>(ev => uploader.UploadCompleted += ev, ev => uploader.UploadCompleted -= ev);
uploader.UploadAsync(file, signedRequest);
return source;
}
}
public class Signer
{
public event EventHandler<SignCompletedEventArgs> SignCompleted;
public void SignAsync(string request)
{
var timer = new Timer(1000);
timer.Elapsed += (sender, args) =>
{
timer.Stop();
if (this.SignCompleted == null)
{
return;
}
this.SignCompleted(this, new SignCompletedEventArgs { SignedRequest = request + "signed" });
};
timer.Start();
}
}
public class Uploader
{
public event EventHandler<EventArgs> UploadCompleted;
public void UploadAsync(string file, string signedRequest)
{
var timer = new Timer(1000);
timer.Elapsed += (sender, args) =>
{
timer.Stop();
if (this.UploadCompleted == null)
{
return;
}
this.UploadCompleted(this, new EventArgs());
};
timer.Start();
}
}
internal class Program
{
private static void Main(string[] args)
{
var files = new[] { "foo", "bar", "baz" };
var uploader = new ChainUploader();
var token = uploader.StartUploading(files).Subscribe(file => Console.WriteLine("Upload completed for '" + file + "'"));
Console.ReadLine();
}
}
}
The base observable that handles the two-step upload for each file never completes, which prevents the next one in the chain from starting. Add a Take(1) to that observable prior to calling Concat() and it will work correctly.
return files.Select(file => (from signArgs in this.Sign(file + "_request")
from uploadArgs in this.Upload(file, signArgs.EventArgs.SignedRequest)
select file).Take(1)).Concat();
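An equivalent approach, if you prefer, is to complete each event-based source where it is created, so callers don't have to remember the Take(1); a sketch of the Sign helper from the question, with the same change applied to Upload:

private IObservable<System.Reactive.EventPattern<SignCompletedEventArgs>> Sign(string request)
{
    Console.WriteLine("Signing request '" + request + "'");
    var signer = new Signer();
    var source = Observable.FromEventPattern<SignCompletedEventArgs>(
            ev => signer.SignCompleted += ev,
            ev => signer.SignCompleted -= ev)
        .Take(1); // complete after the first SignCompleted event so Concat can move on
    signer.SignAsync(request);
    return source;
}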