Detect IsAlive on an IObservable - system.reactive

I'm writing a function IsAlive that takes an IObservable<T> and a TimeSpan and returns an IObservable<bool>. The canonical use case is to detect whether a streaming server is still sending data.
I've come up with the following solution, but I feel it isn't the clearest about how it works.
public static IObservable<bool> IsAlive<T>(this IObservable<T> source,
    TimeSpan timeout,
    IScheduler sched)
{
    return source.Window(timeout, sched)
        .Select(wind => wind.Any())
        .SelectMany(a => a)
        .DistinctUntilChanged();
}
Does anyone have a better approach?
FYI -
Here are the unit tests and existing approaches that I've tried: https://gist.github.com/997003

This should work:
public static IObservable<bool> IsAlive<T>(this IObservable<T> source,
    TimeSpan timeout,
    IScheduler sched)
{
    return source.Buffer(timeout, 1, sched)
        .Select(l => l.Any())
        .DistinctUntilChanged();
}
This approach makes semantic sense, too. Every time an item comes in, it fills the buffer and true is passed along. And whenever a timeout elapses with no items, an empty buffer is produced and false is passed along.
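For instance, a hypothetical usage sketch (ConnectToFeed and Message stand in for whatever your streaming client exposes):
IObservable<Message> messages = ConnectToFeed(); // hypothetical hot stream from the server

IDisposable monitor = messages
    .IsAlive(TimeSpan.FromSeconds(5), Scheduler.Default)
    .Subscribe(alive => Console.WriteLine(alive ? "Server alive" : "Server silent"));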
Edit:
This is why the buffer-1 approach is better than windowing:
var sched = new TestScheduler();
var subj = new Subject<Unit>();
var timeout = TimeSpan.FromTicks(10);

subj
    .Buffer(timeout, 1, sched)
    .Select(Enumerable.Any)
    .Subscribe(x => Console.WriteLine("Buffer(timeout, 1): " + x));

subj
    .Window(timeout, sched)
    .Select(wind => wind.Any())
    .SelectMany(a => a)
    .Subscribe(x => Console.WriteLine("Window(timeout): " + x));

sched.AdvanceTo(5);
subj.OnNext(Unit.Default);
sched.AdvanceTo(16);
yields:
Buffer(timeout, 1): True
Window(timeout): True
Buffer(timeout, 1): False
To be specific, the window is open for the whole timeout and doesn't close and reset as soon as an item comes in. This is where the buffer limit of 1 comes into play. As soon as an item comes in, the buffer and its timer get restarted.
I could re-implement my buffer as a window, since Buffer is implemented on top of Window, but (a) I think Buffer makes better semantic sense and (b) I don't have to SelectMany. Scott's Select and SelectMany could be combined into a single SelectMany(x => x.Any()), but with Buffer I can avoid the lambda entirely and use the Enumerable.Any method group, which binds (trivially) faster anyway.
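For reference, the combined form mentioned above would look like this (a sketch; the behavior is the same as the original Window query):
return source.Window(timeout, sched)
    .SelectMany(wind => wind.Any())
    .DistinctUntilChanged();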

How about:
source.Select(_ => true)
    .Timeout(timeout, sched)
    .DistinctUntilChanged()
    .Catch<bool, TimeoutException>(ex => Observable.Return(false));

Related

Split IObservable<T> exceptions into a separate IObservable<Exception> and continue normally

I have a hot IObservable<T> which may throw an exception. However, I would like to continue with it. I think I could use the Retry operator for that. However, it would be great if I could also listen to any error in the IObservable<T> through a separate IObservable<Exception>. Is that possible?
Your case is considerably simpler because you have a hot observable.
OnError is a notification outside your value stream, so we can materialize the notifications to retrieve the error. This still tears down the stream with an OnCompleted, so you'll need to re-subscribe with Repeat.
var exceptions = source
    .Materialize()
    .Where(notif => notif.Kind == NotificationKind.OnError)
    .Select(notif => notif.Exception)
    .Repeat();
Note
If you're using a Subject<T> for your hot observable, you might run into the usual problem of re-subbing a subject. A subject will replay its OnError or OnCompleted notifications for every new observer.
var source = new Subject<int>();
source.OnNext(1);
source.OnError(new Exception());

source.Subscribe(
    i => Console.WriteLine(i),
    ex => Console.WriteLine("Still got exception after the throw")
);
In this case your exception stream will go into an infinite re-subscription loop.
The premise of your question violates the observable contract:
An Observable may make zero or more OnNext notifications, each representing a single emitted item, and it may then follow those emission notifications by either an OnCompleted or an OnError notification, but not both. Upon issuing an OnCompleted or OnError notification, it may not thereafter issue any further notifications. (emphasis mine)
In other words, after your hot IObservable<T> throws an exception, the observable is ended. The observable of exceptions that comes out of that has a max count of one.
If you want to support a scenario where you re-start an observable after an exception, you're producing a stream of observables, or IObservable<IObservable<T>>. To work with that, here's a code sample:
var source = new Subject<Subject<int>>();

var exceptionStream = source
    .SelectMany(o => o.Materialize())
    .Where(n => n.Kind == NotificationKind.OnError)
    .Select(n => n.Exception);

var itemStream = source
    .SelectMany(o => o.Materialize())
    .Where(n => n.Kind == NotificationKind.OnNext)
    .Select(n => n.Value);

var items = new List<int>();
var exceptions = new List<Exception>();

itemStream.Subscribe(i => items.Add(i));
exceptionStream.Subscribe(e => exceptions.Add(e));

var currentSubject = new Subject<int>();
source.OnNext(currentSubject);
currentSubject.OnNext(1);
currentSubject.OnNext(2);
currentSubject.OnNext(3);
currentSubject.OnError(new Exception("First error"));

var currentSubject2 = new Subject<int>();
source.OnNext(currentSubject2);
currentSubject2.OnNext(4);
currentSubject2.OnNext(5);
currentSubject2.OnNext(6);
currentSubject2.OnError(new Exception("Second error"));

items.Dump();      // LINQPad
exceptions.Dump(); // LINQPad
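For reference, running the sample above should leave the two lists roughly as follows, since the item stream continues across both inner errors and both errors are captured:
// items      -> 1, 2, 3, 4, 5, 6
// exceptions -> "First error", "Second error"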

Rate limiting observable [duplicate]

I would like to set up an Rx subscription that can respond to an event right away, and then ignore subsequent events that happen within a specified "cooldown" period.
The out of the box Throttle/Buffer methods respond only once the timeout has elapsed, which is not quite what I need.
Here is some code that sets up the scenario, and uses a Throttle (which isn't the solution I want):
class Program
{
    static Stopwatch sw = new Stopwatch();

    static void Main(string[] args)
    {
        var subject = new Subject<int>();
        var timeout = TimeSpan.FromMilliseconds(500);

        subject
            .Throttle(timeout)
            .Subscribe(DoStuff);

        var factory = new TaskFactory();

        sw.Start();

        factory.StartNew(() =>
        {
            Console.WriteLine("Batch 1 (no delay)");
            subject.OnNext(1);
        });

        factory.StartNewDelayed(1000, () =>
        {
            Console.WriteLine("Batch 2 (1s delay)");
            subject.OnNext(2);
        });

        factory.StartNewDelayed(1300, () =>
        {
            Console.WriteLine("Batch 3 (1.3s delay)");
            subject.OnNext(3);
        });

        factory.StartNewDelayed(1600, () =>
        {
            Console.WriteLine("Batch 4 (1.6s delay)");
            subject.OnNext(4);
        });

        Console.ReadKey();
        sw.Stop();
    }

    private static void DoStuff(int i)
    {
        Console.WriteLine("Handling {0} at {1}ms", i, sw.ElapsedMilliseconds);
    }
}
The output of running this right now is:
Batch 1 (no delay)
Handling 1 at 508ms
Batch 2 (1s delay)
Batch 3 (1.3s delay)
Batch 4 (1.6s delay)
Handling 4 at 2114ms
Note that batch 2 isn't handled (which is fine!) because we wait for 500ms to elapse between requests due to the nature of Throttle. Batch 3 is also not handled (which is less alright, because it happened more than 500ms after batch 2) due to its proximity to Batch 4.
What I'm looking for is something more like this:
Batch 1 (no delay)
Handling 1 at ~0ms
Batch 2 (1s delay)
Handling 2 at ~1000ms
Batch 3 (1.3s delay)
Batch 4 (1.6s delay)
Handling 4 at ~1600ms
Note that batch 3 wouldn't be handled in this scenario (which is fine!) because it occurs within 500ms of Batch 2.
EDIT:
Here is the implementation for the "StartNewDelayed" extension method that I use:
/// <summary>Creates a Task that will complete after the specified delay.</summary>
/// <param name="factory">The TaskFactory.</param>
/// <param name="millisecondsDelay">The delay after which the Task should transition to RanToCompletion.</param>
/// <returns>A Task that will be completed after the specified duration.</returns>
public static Task StartNewDelayed(
    this TaskFactory factory, int millisecondsDelay)
{
    return StartNewDelayed(factory, millisecondsDelay, CancellationToken.None);
}

/// <summary>Creates a Task that will complete after the specified delay.</summary>
/// <param name="factory">The TaskFactory.</param>
/// <param name="millisecondsDelay">The delay after which the Task should transition to RanToCompletion.</param>
/// <param name="cancellationToken">The cancellation token that can be used to cancel the timed task.</param>
/// <returns>A Task that will be completed after the specified duration and that's cancelable with the specified token.</returns>
public static Task StartNewDelayed(this TaskFactory factory, int millisecondsDelay, CancellationToken cancellationToken)
{
    // Validate arguments
    if (factory == null) throw new ArgumentNullException("factory");
    if (millisecondsDelay < 0) throw new ArgumentOutOfRangeException("millisecondsDelay");

    // Create the timed task
    var tcs = new TaskCompletionSource<object>(factory.CreationOptions);
    var ctr = default(CancellationTokenRegistration);

    // Create the timer but don't start it yet. If we start it now,
    // it might fire before ctr has been set to the right registration.
    var timer = new Timer(self =>
    {
        // Clean up both the cancellation token and the timer, and try to transition to completed
        ctr.Dispose();
        ((Timer)self).Dispose();
        tcs.TrySetResult(null);
    });

    // Register with the cancellation token.
    if (cancellationToken.CanBeCanceled)
    {
        // When cancellation occurs, cancel the timer and try to transition to cancelled.
        // There could be a race, but it's benign.
        ctr = cancellationToken.Register(() =>
        {
            timer.Dispose();
            tcs.TrySetCanceled();
        });
    }

    if (millisecondsDelay > 0)
    {
        // Start the timer and hand back the task...
        timer.Change(millisecondsDelay, Timeout.Infinite);
    }
    else
    {
        // Just complete the task, and keep execution on the current thread.
        ctr.Dispose();
        tcs.TrySetResult(null);
        timer.Dispose();
    }

    return tcs.Task;
}
Here's my approach. It's similar to others that have gone before, but it doesn't suffer from the over-zealous window-production problem.
The desired function works a lot like Observable.Throttle but emits qualifying events as soon as they arrive rather than delaying for the duration of the throttle or sample period. For a given duration after a qualifying event, subsequent events are suppressed.
Given as a testable extension method:
public static class ObservableExtensions
{
    public static IObservable<T> SampleFirst<T>(
        this IObservable<T> source,
        TimeSpan sampleDuration,
        IScheduler scheduler = null)
    {
        scheduler = scheduler ?? Scheduler.Default;
        return source.Publish(ps =>
            ps.Window(() => ps.Delay(sampleDuration, scheduler))
              .SelectMany(x => x.Take(1)));
    }
}
The idea is to use the overload of Window that creates non-overlapping windows using a windowClosingSelector that uses the source time-shifted back by the sampleDuration. Each window will therefore: (a) be closed by the first element in it and (b) remain open until a new element is permitted. We then simply select the first element from each window.
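To illustrate the behavior, here is a minimal TestScheduler sketch (assuming Microsoft.Reactive.Testing is referenced; the tick values are arbitrary):
var scheduler = new TestScheduler();
var source = scheduler.CreateHotObservable(
    ReactiveTest.OnNext(100, 1),   // first item: emitted
    ReactiveTest.OnNext(200, 2),   // within 500 ticks of item 1: suppressed
    ReactiveTest.OnNext(700, 3));  // window has re-opened: emitted

var results = scheduler.Start(
    () => source.SampleFirst(TimeSpan.FromTicks(500), scheduler),
    created: 0, subscribed: 50, disposed: 1000);

// results.Messages should contain OnNext(100, 1) and OnNext(700, 3) only.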
Rx 1.x Version
The Publish extension method used above is not available in Rx 1.x. Here is an alternative:
public static class ObservableExtensions
{
    public static IObservable<T> SampleFirst<T>(
        this IObservable<T> source,
        TimeSpan sampleDuration,
        IScheduler scheduler = null)
    {
        scheduler = scheduler ?? Scheduler.Default;
        var sourcePub = source.Publish().RefCount();
        return sourcePub.Window(() => sourcePub.Delay(sampleDuration, scheduler))
            .SelectMany(x => x.Take(1));
    }
}
The solution I found after a lot of trial and error was to replace the throttled subscription with the following:
subject
    .Window(() => { return Observable.Interval(timeout); })
    .SelectMany(x => x.Take(1))
    .Subscribe(i => DoStuff(i));
Edited to incorporate Paul's clean-up.
Awesome solution Andrew! We can take this a step further though and clean up the inner Subscribe:
subject
    .Window(() => { return Observable.Interval(timeout); })
    .SelectMany(x => x.Take(1))
    .Subscribe(DoStuff);
The initial answer I posted has a flaw: namely that the Window method, when used with an Observable.Interval to denote the end of the window, sets up an infinite series of 500ms windows. What I really need is a window that starts when the first result is pumped into the subject, and ends after the 500ms.
My sample data masked this problem because the data broke down nicely into the windows that were already going to be created. (i.e. 0-500ms, 501-1000ms, 1001-1500ms, etc.)
Consider instead this timing:
factory.StartNewDelayed(300, () =>
{
    Console.WriteLine("Batch 1 (300ms delay)");
    subject.OnNext(1);
});

factory.StartNewDelayed(700, () =>
{
    Console.WriteLine("Batch 2 (700ms delay)");
    subject.OnNext(2);
});

factory.StartNewDelayed(1300, () =>
{
    Console.WriteLine("Batch 3 (1.3s delay)");
    subject.OnNext(3);
});

factory.StartNewDelayed(1600, () =>
{
    Console.WriteLine("Batch 4 (1.6s delay)");
    subject.OnNext(4);
});
What I get is:
Batch 1 (300ms delay)
Handling 1 at 356ms
Batch 2 (700ms delay)
Handling 2 at 750ms
Batch 3 (1.3s delay)
Handling 3 at 1346ms
Batch 4 (1.6s delay)
Handling 4 at 1644ms
This is because the windows begin at 0ms, 500ms, 1000ms, and 1500ms and so each Subject.OnNext fits nicely into its own window.
What I want is:
Batch 1 (300ms delay)
Handling 1 at ~300ms
Batch 2 (700ms delay)
Batch 3 (1.3s delay)
Handling 3 at ~1300ms
Batch 4 (1.6s delay)
After a lot of struggling and an hour banging on it with a co-worker, we arrived at a better solution using pure Rx and a single local variable:
bool isCoolingDown = false;

subject
    .Where(_ => !isCoolingDown)
    .Subscribe(
        i =>
        {
            DoStuff(i);
            isCoolingDown = true;
            Observable
                .Interval(cooldownInterval)
                .Take(1)
                .Subscribe(_ => isCoolingDown = false);
        });
Our assumption is that calls to the subscription method are synchronized. If they are not, then a simple lock could be introduced.
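If OnNext calls can arrive concurrently, a minimal sketch of the same idea with a lock (the gate object and the check-then-set inside it are the only additions):
object gate = new object();
bool isCoolingDown = false;

subject.Subscribe(i =>
{
    lock (gate)
    {
        if (isCoolingDown) return;   // still cooling down: drop the item
        isCoolingDown = true;        // claim the cooldown slot
    }

    DoStuff(i);

    Observable
        .Interval(cooldownInterval)
        .Take(1)
        .Subscribe(_ => { lock (gate) isCoolingDown = false; });
});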
Use .Scan() !
This is what I use for Throttling when I need the first hit (after a certain period) immediately, but delay (and group/ignore) any subsequent hits.
It basically works like Throttle, but fires immediately if the previous OnNext was at least one interval ago; otherwise it schedules the item exactly one interval after the previous hit. And of course, if multiple hits come in within the 'cooling down' period, the additional ones are ignored, just like Throttle does.
The difference with your use case is that if you get an event at 0 ms and 100 ms, they will both be handled (at 0ms and 500ms), which might be what you actually want (otherwise, the accumulator is easy to adapt to ignore ANY hit closer than interval to the previous one).
public static IObservable<T> QuickThrottle<T>(this IObservable<T> src, TimeSpan interval, IScheduler scheduler)
{
    return src
        .Scan(new ValueAndDueTime<T>(), (prev, id) => AccumulateForQuickThrottle(prev, id, interval, scheduler))
        .Where(vd => !vd.Ignore)
        .SelectMany(sc => Observable.Timer(sc.DueTime, scheduler).Select(_ => sc.Value));
}

private static ValueAndDueTime<T> AccumulateForQuickThrottle<T>(ValueAndDueTime<T> prev, T value, TimeSpan interval, IScheduler s)
{
    var now = s.Now;

    // Ignore this completely if there is already a future item scheduled,
    // but do keep the dueTime for accumulation!
    if (prev.DueTime > now) return new ValueAndDueTime<T> { DueTime = prev.DueTime, Ignore = true };

    // Schedule this item at least one interval after the previous one
    var min = prev.DueTime + interval;
    var nextTime = (now < min) ? min : now;
    return new ValueAndDueTime<T> { DueTime = nextTime, Value = value };
}

private class ValueAndDueTime<T>
{
    public DateTimeOffset DueTime;
    public T Value;
    public bool Ignore;
}
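A hypothetical usage sketch, reusing the subject and handler from the question:
subject
    .QuickThrottle(TimeSpan.FromMilliseconds(500), Scheduler.Default)
    .Subscribe(DoStuff);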
Here is another one for you. This one doesn't use Repeat() or Interval(), so it might be what you are after:
subject
    .Window(() => Observable.Timer(TimeSpan.FromMilliseconds(500)))
    .SelectMany(x => x.Take(1));
Well, the most obvious thing would be to use Repeat() here. However, as far as I know, Repeat() can let notifications slip by unobserved in the gap between the moment the stream stops and the moment we subscribe again. In practice this has never been a problem for me.
subject
    .Take(1)
    .Concat(Observable.Empty<long>().Delay(TimeSpan.FromMilliseconds(500)))
    .Repeat();
Remember to replace long with the actual element type of your source.
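To avoid substituting the element type by hand, the same query can be wrapped in a generic extension method. A sketch (the name is made up; the Repeat caveat above still applies):
public static IObservable<T> FirstThenCooldown<T>(this IObservable<T> source, TimeSpan cooldown)
{
    return source
        .Take(1)                                           // let the first item through
        .Concat(Observable.Empty<T>().Delay(cooldown))     // then stay silent for the cooldown
        .Repeat();                                         // and start over
}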
UPDATE:
Updated query to use Concat instead of Merge
I stumbled upon this question while trying to re-implement my own solution to the same (or a similar) problem using .Window.
Take a look; it seems to be the same problem, solved quite elegantly:
https://stackoverflow.com/a/3224723/58463
It's an old post, but no answer really fit my needs, so I'm giving my own solution:
public static IObservable<T> ThrottleOrImmediate<T>(this IObservable<T> source, TimeSpan delay, IScheduler scheduler)
{
    return Observable.Create<T>((obs, token) =>
    {
        // The next item cannot be sent before this time
        DateTime nextItemTime = default;

        return Task.FromResult(source.Subscribe(async item =>
        {
            var currentTime = DateTime.Now;

            // If we have already reached the next item time
            if (currentTime - nextItemTime >= TimeSpan.Zero)
            {
                // The following item will be sent only after the set delay
                nextItemTime = currentTime + delay;
                // Send the current item with the scheduler
                scheduler.Schedule(() => obs.OnNext(item));
            }
            // There is still time before we can send an item
            else
            {
                // Schedule the time for the following item
                nextItemTime = currentTime + delay;
                try
                {
                    await Task.Delay(delay, token);
                }
                catch (TaskCanceledException)
                {
                    return;
                }

                // If the next item schedule was changed by another item, stop here
                if (nextItemTime > currentTime + delay)
                    return;
                else
                {
                    // Set the next possible time for an item and send the item with the scheduler
                    nextItemTime = currentTime + delay;
                    scheduler.Schedule(() => obs.OnNext(item));
                }
            }
        }));
    });
}
The first item is sent immediately; subsequent items are throttled. If a later item arrives after the delay has already elapsed, it is also sent immediately.
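A hypothetical usage sketch, with the subject and handler from the question:
subject
    .ThrottleOrImmediate(TimeSpan.FromMilliseconds(500), Scheduler.Default)
    .Subscribe(DoStuff);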

Throttle observable based on whether handler is still busy [duplicate]

I want to run periodic tasks with the restriction that at most one execution of a method is running at any given time.
I was experimenting with Rx, but I am not sure how to impose at most once concurrency restriction.
var timer = Observable.Interval(TimeSpan.FromMilliseconds(100));
timer.Subscribe(tick => DoSomething());
Additionally, if a task is still running when the next tick is due, I want that tick to be skipped; i.e. I don't want the tasks to queue up and cause problems.
I have 2 such tasks to execute periodically. The tasks being executed are currently synchronous, but I could make them async if there is a necessity.
You are on the right track: you can use Select + Concat to flatten the observable and limit the number of in-flight requests (note: if your task takes longer than the interval, the requests will start to stack up, since they can't execute fast enough):
var source = Observable.Interval(TimeSpan.FromMilliseconds(100))
    // I assume you are doing async work since you want to limit concurrency
    .Select(_ => Observable.FromAsync(() => DoSomethingAsync()))
    // This is equivalent to calling Merge(1)
    .Concat();

source.Subscribe(/* Handle the result of each operation */);
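Since the question mentions that the tasks are currently synchronous, here is a sketch of one way to adapt that, assuming it is acceptable to run DoSomething on the task pool:
var source = Observable.Interval(TimeSpan.FromMilliseconds(100))
    // Wrap the synchronous method so it runs off the timer thread
    .Select(_ => Observable.FromAsync(() => Task.Run(() => DoSomething())))
    // Still at most one execution in flight at a time
    .Concat();

source.Subscribe();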
You should have tested your code as-is, because this is exactly the behavior Rx already imposes.
Try this as a test:
void Main()
{
    var timer = Observable.Interval(TimeSpan.FromMilliseconds(100));
    using (timer.Do(x => Console.WriteLine("!")).Subscribe(tick => DoSomething()))
    {
        Console.ReadLine();
    }
}

private void DoSomething()
{
    Console.Write("<");
    Console.Write(DateTime.Now.ToString("HH:mm:ss.fff"));
    Thread.Sleep(1000);
    Console.WriteLine(">");
}
When you run this you'll get this kind of output:
!
<16:54:57.111>
!
<16:54:58.112>
!
<16:54:59.113>
!
<16:55:00.113>
!
<16:55:01.114>
!
<16:55:02.115>
!
<16:55:03.116>
!
<16:55:04.117>
!
<16:55:05.118>
!
<16:55:06.119>
It is already ensuring that there's no overlap.
Below are two implementations of a PeriodicSequentialExecution method, that creates an observable by executing an asynchronous method in a periodic fashion, enforcing a no-overlapping-execution policy. The interval between subsequent executions can be extended to prevent overlapping, in which case the period is time-shifted accordingly.
The first implementation is purely functional, while the second implementation is mostly imperative. Both implementations are functionally identical. The first one can be supplied with a custom IScheduler. The second one may be slightly more efficient.
The functional implementation:
/// <summary>
/// Creates an observable sequence containing the results of an asynchronous
/// action that is invoked periodically and sequentially (without overlapping).
/// </summary>
public static IObservable<T> PeriodicSequentialExecution<T>(
    Func<CancellationToken, Task<T>> action,
    TimeSpan dueTime, TimeSpan period,
    CancellationToken cancellationToken = default,
    IScheduler scheduler = null)
{
    // Arguments validation omitted
    scheduler ??= DefaultScheduler.Instance;
    return Delay(dueTime) // Initial delay
        .Concat(Observable.Using(() => CancellationTokenSource.CreateLinkedTokenSource(
            cancellationToken), linkedCTS =>
            // Execution loop
            Observable.Publish( // Start a hot delay timer before each operation
                Delay(period), hotTimer => Observable
                    .StartAsync(() => action(linkedCTS.Token)) // Start the operation
                    .Concat(hotTimer) // Await the delay timer
            )
            .Repeat()
            .Finally(() => linkedCTS.Cancel()) // Unsubscription: cancel the operation
        ));

    IObservable<T> Delay(TimeSpan delay)
        => Observable
            .Timer(delay, scheduler)
            .IgnoreElements()
            .Select(_ => default(T))
            .TakeUntil(Observable.Create<Unit>(o => cancellationToken.Register(() =>
                o.OnError(new OperationCanceledException(cancellationToken)))));
}
The imperative implementation:
public static IObservable<T> PeriodicSequentialExecution2<T>(
    Func<CancellationToken, Task<T>> action,
    TimeSpan dueTime, TimeSpan period,
    CancellationToken cancellationToken = default)
{
    // Arguments validation omitted
    return Observable.Create<T>(async (observer, ct) =>
    {
        using (var linkedCTS = CancellationTokenSource.CreateLinkedTokenSource(
            ct, cancellationToken))
        {
            try
            {
                await Task.Delay(dueTime, linkedCTS.Token);
                while (true)
                {
                    var delayTask = Task.Delay(period, linkedCTS.Token);
                    var result = await action(linkedCTS.Token);
                    observer.OnNext(result);
                    await delayTask;
                }
            }
            catch (Exception ex) { observer.OnError(ex); }
        }
    });
}
The cancellationToken parameter can be used for the graceful termination of the resulting observable sequence. This means that the sequence waits for the currently running operation to complete before terminating. If you prefer it to terminate instantaneously, potentially leaving work running unobserved in a fire-and-forget fashion, you can simply dispose the subscription to the observable sequence as always. Canceling the cancellationToken results in the observable sequence completing in a faulted state (OperationCanceledException).
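For illustration, a hypothetical usage sketch (DoSomethingAsync stands in for one of the periodic tasks from the question, returning some result):
var cts = new CancellationTokenSource();

var subscription = PeriodicSequentialExecution(
        ct => DoSomethingAsync(ct),
        dueTime: TimeSpan.Zero,
        period: TimeSpan.FromMilliseconds(100),
        cancellationToken: cts.Token)
    .Subscribe(
        result => Console.WriteLine(result),
        ex => Console.WriteLine("Terminated: " + ex.Message));

// Later: cts.Cancel() for graceful termination, or subscription.Dispose() to stop immediately.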
Here is a factory function that does exactly what you are asking for.
public static IObservable<Unit> Periodic(TimeSpan timeSpan)
{
    return Observable.Return(Unit.Default)
        .Concat(Observable.Return(Unit.Default).Delay(timeSpan).Repeat());
}
Here is an example usage
Periodic(TimeSpan.FromSeconds(1))
    .Subscribe(x =>
    {
        Console.WriteLine(DateTime.Now.ToString("mm:ss:fff"));
        Thread.Sleep(500);
    });
If you run this, each console print will be roughly 1.5 seconds apart.
Note: if you don't want the first tick to run immediately, you could instead use this factory, which won't send the first Unit until after the timespan.
public static IObservable<Unit> DelayedPeriodic(TimeSpan timeSpan)
{
    return Observable.Return(Unit.Default).Delay(timeSpan).Repeat();
}

rx reactive extension: how to have each subscriber get a different value (the next one) from an observable?

Using the Reactive Extensions, it is easy to subscribe twice to the same observable.
When a new value is available in the observable, both subscribers are called with this same value.
Is there a way to have each subscriber get a different value (the next one) from this observable?
Example of what I'm after:
source sequence: [1,2,3,4,5,...] (infinite)
The source is constantly adding new items at an unknown rate.
I'm trying to execute a lengthy async action for each item using N subscribers.
1st subscriber: 1,2,4,...
2nd subscriber: 3,5,...
...
or
1st subscriber: 1,3,...
2nd subscriber: 2,4,5,...
...
or
1st subscriber: 1,3,5,...
2nd subscriber: 2,4,6,...
I would agree with Asti.
You could use Rx to populate a queue (e.g. a BlockingCollection) and then have competing consumers read from the queue. This way, if one consumer happens to be faster, it can pick up the next item while the other is still busy, as sketched below.
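A minimal sketch of that queue-plus-competing-consumers idea, assuming two consumers and a hypothetical DoLengthyWork handler (source is the observable from the question):
var queue = new BlockingCollection<int>();

// Rx side: push items into the queue
var subscription = source.Subscribe(
    x => queue.Add(x),
    () => queue.CompleteAdding());

// Competing consumers: whichever is free takes the next item
var consumers = Enumerable.Range(0, 2)
    .Select(n => Task.Run(() =>
    {
        foreach (var item in queue.GetConsumingEnumerable())
            DoLengthyWork(n, item); // hypothetical per-item handler
    }))
    .ToArray();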
However, if you want to do it against good advice :), you could just use the Select overload that provides the index of each element. You can then pass that down to your subscribers and have them filter on a modulus. (Yuck! Leaky abstractions, magic numbers, potential blocking, potential side effects on the source sequence, etc.)
var source = Observable.Interval(TimeSpan.FromSeconds(1))
    .Select((element, i) => new { Index = i, Element = element });

var subscription1 = source.Where(x => x.Index % 2 == 0).Subscribe(x => DoWithThing1(x.Element));
var subscription2 = source.Where(x => x.Index % 2 == 1).Subscribe(x => DoWithThing2(x.Element));
Also remember that if the work done in the OnNext handler is blocking, it will still block the scheduler it runs on. This could affect the speed of your source/producer. Another reason why Asti's answer is the better option.
Ask if that is not clear :-)
How about:
IObservable<TRet> SomeLengthyOperation(T input)
{
    return Observable.Defer(() => Observable.Start(() => {
        return someCalculatedValueThatTookALongTime;
    }, TaskPoolScheduler.Default));
}
someObservableSource
    .SelectMany(x => SomeLengthyOperation(input))
    .Subscribe(x => Console.WriteLine("The result was {0}", x));
You can even limit the number of concurrent operations:
someObservableSource
    .Select(x => SomeLengthyOperation(input))
    .Merge(4 /* at a time */)
    .Subscribe(x => Console.WriteLine("The result was {0}", x));
For the Merge(4) to work, it's important that the observable returned by SomeLengthyOperation be a cold observable, which is what the Defer provides here: it makes the Observable.Start not happen until someone subscribes.

Rx Breaking Changes

I know that SL5 has a new property to count mouse clicks, but with help, I got this working when SL4 came out. Now I've moved to a new machine, downloaded Rx, and I understand that Rx went through a few changes that have broken this code. I have tried, but I can't seem to navigate the transition away from FastSubject.
I would really like to fully understand the use of Subject here, and how to update the call to work with the current version of Rx.
public static IObservable<TSource> MonitorForDoubleClicks<TSource>(this IObservable<TSource> source, TimeSpan doubleClickSpeed, IScheduler scheduler)
{
    return source.Multicast<TSource, TSource, TSource>(
        () => new FastSubject<TSource>(),
        values =>
        {
            return values
                .TimeInterval(scheduler)  // inject time-interval information into the event arguments
                .Skip(1)                  // we need two events to measure an interval, so skip the first one
                .Where(interval => interval.Interval <= doubleClickSpeed) // the second event has arrived, so test the interval
                .RemoveTimeInterval()     // take the time information back out of the event args
                .Take(1)                  // take one event (the latest) and emit it
                .Repeat();                // keep the observer alive forever
        });
}
FastSubject is just Subject now, all subjects are fast :) However, this is a weird way to check for double clicks.
How about just (warning: Coding via TextArea):
return source.Timestamp(scheduler)
    .Buffer(2 /* buffer of 2 */, 1 /* advance 1 at a time */)
    .Where(x => x[1].Timestamp - x[0].Timestamp < doubleClickSpeed)
    .Select(x => x[1]);
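For completeness, a sketch of that query wrapped back into the original extension-method shape against current Rx (no Multicast or FastSubject needed; the Count check guards against the shorter final buffer that Buffer(2, 1) can emit on completion):
public static IObservable<TSource> MonitorForDoubleClicks<TSource>(
    this IObservable<TSource> source, TimeSpan doubleClickSpeed, IScheduler scheduler)
{
    return source.Timestamp(scheduler)
        .Buffer(2, 1)
        .Where(x => x.Count == 2 && x[1].Timestamp - x[0].Timestamp < doubleClickSpeed)
        .Select(x => x[1].Value); // unwrap the Timestamped<TSource> back to TSource
}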