The following code works as expected but I am puzzled by the way it behaves when I uncomment the line 'o.OnCompleted();'
The code joins all subscribers to the result of a single long operation and caches the result for further subscribers for 2 seconds. Any subscription after this time starts the process again.
Subscriptions will come from other threads (simulated with the thread pool).
var obs = Observable.Create((IObserver<Guid> o) =>
{
    Console.WriteLine("Start");
    Thread.Sleep(1000); // process
    Console.WriteLine("End");
    o.OnNext(Guid.NewGuid());
    //o.OnCompleted(); // <-- uncomment this
    return Disposable.Empty;
})
.Replay(TimeSpan.FromSeconds(2))
.RefCount()
.Take(1);

ThreadPool.QueueUserWorkItem(delegate
{
    // simulate request from threadpool
    obs.Subscribe(x => Console.WriteLine($"1: {x}"), () => Console.WriteLine($"1: complete"));
});

ThreadPool.QueueUserWorkItem(delegate
{
    obs.Subscribe(x => Console.WriteLine($"2: {x}"), () => Console.WriteLine($"2: complete"));
});

Thread.Sleep(4000);

ThreadPool.QueueUserWorkItem(delegate
{
    obs.Subscribe(x => Console.WriteLine($"3: {x}"), () => Console.WriteLine($"3: complete"));
});
Here is the result:
Start
End
1: 255BEFDC-2F14-40AD-AE77-2B005C5A3AA9
2: 255BEFDC-2F14-40AD-AE77-2B005C5A3AA9
1: complete
2: complete
Start
End
3: 1214DC63-F688-475A-9CB7-C3784054A4AC
3: complete
The odd behaviour is that if I uncomment the line 'o.OnCompleted()', the result changes to this:
Start
End
1: 255BEFDC-2F14-40AD-AE77-2B005C5A3AA9
2: 255BEFDC-2F14-40AD-AE77-2B005C5A3AA9
1: complete
2: complete
Start
End
3: complete
The 3rd subscriber causes another subscription to the root observable, but the result is missing. It appears the ReplaySubject remembers that the previous observable completed, yet it still triggers a new subscription to the source. This seems unintuitive. I would like to understand why it doesn't work.
Note: I originally tried this using Defer instead of Create, which had the same result as the second run above (for obvious reasons).
When you use the Replay/RefCount pair you create an observable that shares a common subscription to the source observable.
From the source:
Returns a connectable observable sequence that shares a single subscription to the underlying sequence replaying all notifications.
Now, it's important to remember that an observable produces a series of zero or more values, followed by either a complete or error signal. It cannot produce values after a complete or error is produced.
Since you are sharing a common subscription to the source, once that source signals complete it cannot produce any more values. When you call o.OnCompleted() you are doing exactly that: the shared ReplaySubject terminates, so when the third subscriber forces a reconnection the new Guid is pushed into an already-completed subject and is dropped, and that subscriber only receives the replayed completion.
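To see that rule in isolation, here is a minimal sketch (separate from the question's code) showing that a ReplaySubject which has completed ignores later values and only replays what it already holds plus the termination:

// A completed subject drops any later values.
var subject = new ReplaySubject<int>(TimeSpan.FromSeconds(2));
subject.OnNext(1);
subject.OnCompleted();
subject.OnNext(2); // ignored: the sequence has already terminated

subject.Subscribe(
    x => Console.WriteLine($"value: {x}"),  // prints "value: 1" (still inside the window)
    () => Console.WriteLine("complete"));   // then "complete"
// Had we subscribed after the 2-second window had expired, only "complete"
// would be replayed, which is what the third subscriber in the question observes.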
Also, as a side-note, you should avoid ever writing return Disposable.Empty; inside a Create. It means you're creating an observable that can complete before the subscription has returned, and that can lead to race conditions.
The way to write your code without it is:
var obs =
    Observable
        .Defer(() => Observable.Return(Guid.NewGuid()).Concat(Observable.Never<Guid>()))
        .Replay(TimeSpan.FromSeconds(2.0))
        .RefCount()
        .Take(1);
But this is the same as not calling o.OnCompleted().
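If you do want to stay with Observable.Create, a minimal sketch of one common alternative (assuming the same source as in the question) is to schedule the work and return the scheduled item's disposable instead of Disposable.Empty, so an early unsubscribe can cancel work that has not started yet:

var obs = Observable.Create((IObserver<Guid> o) =>
        // Returning the scheduled item's disposable lets a disposed
        // subscription cancel the pending work.
        Scheduler.Default.Schedule(() =>
        {
            Console.WriteLine("Start");
            Thread.Sleep(1000); // the long operation
            Console.WriteLine("End");
            o.OnNext(Guid.NewGuid());
            // deliberately no OnCompleted, for the reasons described above
        }))
    .Replay(TimeSpan.FromSeconds(2))
    .RefCount()
    .Take(1);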
So here we go: given a Confluent.Kafka IConsumer<>, the code below wraps it into a dedicated async CE and consumes as long as cancellation hasn't been requested. This piece of code also defends itself against the OperationCanceledException and runs a finally block to ensure graceful termination of the consumer.
let private consumeUntilCancelled callback (consumer: IConsumer<'key, 'value>) =
    async {
        let! ct = Async.CancellationToken
        try
            try
                while not ct.IsCancellationRequested do
                    let consumeResult = consumer.Consume(ct)
                    if not consumeResult.IsPartitionEOF then do! (callback consumeResult)
            with
            | :? OperationCanceledException -> return ()
        finally
            consumer.Close()
            consumer.Dispose()
    }
Question #1: is this code correct or am I abusing the async?
So far so good. In my app I have to deal with lots of consumers that must die altogether. So, assuming that consumers: seq<Async<unit>> represents them, the following code is what I came up with:
async {
    for consumer in consumers do
        do! (Async.StartChild consumer |> Async.Ignore)
}
I expect this code to attach the children to the parent's cancellation context, so that once the parent is cancelled, the children are cancelled as well.
Question #2: is my finally block guaranteed to run even when a child gets cancelled?
I have two observations about your code:
Your use of Async.StartChild is correct - all child computations will inherit the same cancellation token and they will all get cancelled when the main token is cancelled.
The async workflow can be cancelled after you call consumer.Consume(ct) and before you call callback. I'm not sure what this means for your specific problem, but if it removes some data from a queue, the data could be lost before it is processed. If that's an issue, then I think you'll need to make callback non-asynchronous, or invoke it differently.
In your consumeUntilCancelled function, you do not explicitly need to check ct.IsCancellationRequested. The async workflow does this automatically in every do! or let!, so you can replace the check with a plain while true loop.
Here is a minimal stand-alone demo:
let consume s = async {
    try
        while true do
            do! Async.Sleep 1000
            printfn "%s did work" s
    finally
        printfn "%s finalized" s }

let work =
    async {
        for c in ["A"; "B"; "C"; "D"] do
            do! Async.StartChild (consume c) |> Async.Ignore }
Now we create the computation with a cancellation token:
// Run this in F# interactive
let ct = new System.Threading.CancellationTokenSource()
Async.Start(work, ct.Token)
// Run this sometime later
ct.Cancel()
Once you call ct.Cancel, all the finally blocks will be called and all the loops will stop.
I'm trying to learn the RxJS library. One of the cases I don't quite understand is described in this jsfiddle (code also below).
var A= new Rx.Subject();
var B= new Rx.Subject();
A.onNext(0);
// '.combineLatest' needs all the dependency Observables to get emitted, before its combined signal is emitted.
//
// How to have a combined signal emitted when any of the dependencies change (using earlier given values for the rest)?
//
A.combineLatest( B, function (a,b) { return a+b; } )
.subscribe( function (v) { console.log( "AB: "+ v ); } );
B.onNext("a");
A.onNext(1);
I'd like to get two emits to the "AB" logging. One from changing B to "a" (A already has the value 0). Another from changing A to 1.
However, only changes that occur after a subscribe seem to matter (even though A has a value and thus the combined result could be computed).
Should I use "hot observables" for this, or some other method than .combineLatest?
My problem in the actual code (bigger than this sample) is that I need to make separate initialisations after the subscribes, which splits the setup across two separate places instead of having the initial values clearly up front.
Thanks
I think you have misunderstood how the Subjects work. Subjects are hot Observables. They do not hold on to values, so if they receive an onNext with no subscribers then that value will be lost to the world.
What you are looking for is either the BehaviorSubject or the ReplaySubject, both of which hold onto past values and re-emit them to new subscribers. In the former case you always construct it with an initial value:
//All subscribers will receive 0
var subject = new Rx.BehaviorSubject(0);
//All subscribers will receive 1
//Including all future subscribers
subject.onNext(1);
In the latter case you set the number of values to be replayed for each subscription:
var subject = new Rx.ReplaySubject(1);
//All new subscribers will receive 0 until the subject receives its
//next onNext call
subject.onNext(0);
Rewriting your example it could be:
var A= new Rx.BehaviorSubject(0);
var B= new Rx.Subject();
// '.combineLatest' needs all the dependency Observables to get emitted, before its combined signal is emitted.
//
// How to have a combined signal emitted when any of the dependencies change (using earlier given values for the rest)?
//
A.combineLatest( B, function (a,b) { return a+b; } )
.subscribe( function (v) { console.log( "AB: "+ v ); } );
B.onNext("a");
A.onNext(1);
//AB: 0a
//AB: 1a
On another note, realizing of course that this is all new to you, in most cases you should not need to use a Subject directly as it generally means that you are trying to wrangle Rx into the safety of your known paradigms. You should ask yourself, where is your data coming from? How is it being created? If you ask those questions enough, following your chain of events back up to the source, 9 out of 10 times you will find that there is probably an Observable wrapper for it.
I have a paged interface. Given a starting point, a request will produce a list of results and a continuation indicator.
I've created an observable that is built by constructing and flat mapping an observable that reads the page. The result of this observable contains both the data for the page and a value to continue with. I pluck the data and flat map it to the subscriber, producing a stream of values.
To handle the paging I've created a subject for the next page values. It's seeded with an initial value; then, each time I receive a response with a valid next page, I push to the pages subject and trigger another read, until there is nothing more to read.
Is there a more idiomatic way of doing this?
function records(start = 'LATEST', limit = 1000) {
    let pages = new rx.Subject();
    this.connect(start)
        .subscribe(page => pages.onNext(page));
    let records = pages
        .flatMap(page => {
            return this.read(page, limit)
                .doOnNext(result => {
                    let next = result.next;
                    if (next === undefined) {
                        pages.onCompleted();
                    } else {
                        pages.onNext(next);
                    }
                });
        })
        .pluck('data')
        .flatMap(data => data);
    return records;
}
That's a reasonable way to do it. It has a couple of potential flaws in it (that may or may not impact you depending upon your use case):
You provide no way to observe any errors that occur in this.connect(start)
Your observable is effectively hot. If the caller does not immediately subscribe to the observable (perhaps they store it and subscribe later), then they'll miss the completion of this.connect(start) and the observable will appear to never produce anything.
You provide no way to unsubscribe from the initial connect call if the caller changes its mind and unsubscribes early. Not a real big deal, but usually when one constructs an observable, one should try to chain the disposables together so it all cleans up properly if the caller unsubscribes.
Here's a modified version:
It passes errors from this.connect to the observer.
It uses Observable.create to create a cold observable that only starts its business when the caller actually subscribes, so there is no chance of missing the initial page value and stalling the stream.
It combines the this.connect subscription disposable with the overall subscription disposable
Code:
function records(start = 'LATEST', limit = 1000) {
    return Rx.Observable.create(observer => {
        let pages = new Rx.Subject();
        let connectSub = new Rx.SingleAssignmentDisposable();
        let resultsSub = new Rx.SingleAssignmentDisposable();
        let sub = new Rx.CompositeDisposable(connectSub, resultsSub);

        // Make sure we subscribe to pages before we issue this.connect()
        // just in case this.connect() finishes synchronously (possible if it caches values or something?)
        let results = pages
            .flatMap(page => this.read(page, limit))
            .doOnNext(r => r.next !== undefined ? pages.onNext(r.next) : pages.onCompleted())
            .flatMap(r => r.data);
        resultsSub.setDisposable(results.subscribe(observer));

        // now query the first page
        connectSub.setDisposable(this.connect(start)
            .subscribe(p => pages.onNext(p), e => observer.onError(e)));

        return sub;
    });
}
Note: I've not used the ES6 syntax before, so hopefully I didn't mess anything up here.
I have a bunch of events coming in and I have to execute ALL of them without a loss, but I want to make sure that they are buffered and consumed at the appropriate time slots. Anyone have a solution?
I can't find any operators in Rx that can do that without the loss of events (Throttle - loses events). I've also considered Buffer, Delay, etc... Can't find a good solution.
I've tried to put a timer in the middle, but somehow it doesn't work at all:
GetInitSequence()
    .IntervalThrottle(TimeSpan.FromSeconds(5))
    .Subscribe(
        item =>
        {
            Console.WriteLine(DateTime.Now);
            // Process item
        }
    );
public static IObservable<T> IntervalThrottle<T>(this IObservable<T> source, TimeSpan dueTime)
{
    return Observable.Create<T>(o =>
    {
        return source.Subscribe(x =>
        {
            new Timer(state =>
                o.OnNext((T)state), x, dueTime, TimeSpan.FromMilliseconds(-1));
        }, o.OnError, o.OnCompleted);
    });
}
The question is not 100% clear so I'm making some presumptions.
Observable.Delay is not what you want because that will create a delay from when each event arrives, rather than creating even time intervals for processing.
Observable.Buffer is not what you want because that will cause all events in each given interval to be passed to you, rather than one at a time.
So I believe you're looking for a solution that creates some sort of metronome that ticks away, and gives you an event per tick. This can be naively constructed using Observable.Interval for the metronome and Zip for connecting it to your source:
var source = GetInitSequence();
var trigger = Observable.Interval(TimeSpan.FromSeconds(5));
var triggeredSource = source.Zip(trigger, (s,_) => s);
triggeredSource.Subscribe(item => Console.WriteLine(DateTime.Now));
This will trigger every 5 seconds (in the example above), and give you the original items in sequence.
The only problem with this solution is that if you don't have any more source elements for (say) 10 seconds, when the source elements arrive they will be immediately sent out since some of the 'trigger' events are sitting there waiting for them. Marble diagram for that scenario:
source:  -a-b-c----------------------d-e-f-g
trigger: ----o----o----o----o----o----o----o
result:  ----a----b----c-------------d-e-f-g
This is a very reasonable issue. There are two questions here already that tackle it:
Rx IObservable buffering to smooth out bursts of events
A way to push buffered events in even intervals
The solution provided there is a main Drain extension method and a secondary Buffered extension. I've modified these to be far simpler (no need for Drain, just use Concat). Usage is:
var bufferedSource = source.StepInterval(TimeSpan.FromSeconds(5));
The extension method StepInterval:
public static IObservable<T> StepInterval<T>(this IObservable<T> source, TimeSpan minDelay)
{
    return source.Select(x =>
        Observable.Empty<T>()
            .Delay(minDelay)
            .StartWith(x)
    ).Concat();
}
I know this could just be too simple, but would this work?
var intervaled = source.Do(x => { Thread.Sleep(100); });
Basically this just puts a minimum delay between values. Too simplistic?
Along the lines of Enigmativity's answer, if all you want to do is just Delay all of the values by a TimeSpan, I can't see why Delay is not the operator you want:
GetInitSequence()
    .Delay(TimeSpan.FromSeconds(5)) //ideally pass an IScheduler here
    .Subscribe(
        item =>
        {
            Console.WriteLine(DateTime.Now);
            // Process item
        }
    );
How about Observable.Buffer? This should return all the events in each time window as a single event.
var xs = Observable.Interval(TimeSpan.FromMilliseconds(100));
var bufferedStream = xs.Buffer(TimeSpan.FromSeconds(5));
bufferedStream.Subscribe(item => { Console.WriteLine("Number of events in window: {0}", item.Count); });
It might be that what you're asking isn't that clear. What is your code supposed to do? It looks like you're just delaying by creating a timer for each event. It also breaks the semantics of the observable, as a later OnNext or the OnCompleted could be delivered before an earlier, still-pending OnNext.
Note this is also only as accurate as the timer used. Typically the timers are accurate to at most 16ms.
Edit:
Your example becomes the following, where item contains all the events in the window:
GetInitSequence()
    .Buffer(TimeSpan.FromSeconds(5))
    .Subscribe(
        item =>
        {
            Console.WriteLine(DateTime.Now);
            // Process item
        }
    );
Using reactive extension, it is easy to subscribe 2 times to the same observable.
When a new value is available in the observable, both subscribers are called with this same value.
Is there a way to have each subscriber get a different value (the next one) from this observable?
Ex of what i'm after:
source sequence: [1,2,3,4,5,...] (infinite)
The source is constantly adding new items at an unknown rate.
I'm trying to execute a lengthy async action for each item using N subscribers.
1st subscriber: 1,2,4,...
2nd subscriber: 3,5,...
...
or
1st subscriber: 1,3,...
2nd subscriber: 2,4,5,...
...
or
1st subscriber: 1,3,5,...
2nd subscriber: 2,4,6,...
I would agree with Asti.
You could use Rx to populate a Queue (Blocking Collection) and then have competing consumers read from the queue. This way if one process was for some reason faster it could pick up the next item potentially before the other consumer if it was still busy.
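For illustration, here is a minimal sketch of that queue-based approach (the names and the interval source are just placeholders): Rx pushes items into a BlockingCollection and two plain consumer tasks compete for them, so a slow consumer never holds up the other.

// Producer side: the observable feeds the queue and completes it when done.
var queue = new BlockingCollection<long>();
var feeder = Observable.Interval(TimeSpan.FromSeconds(1))
    .Subscribe(queue.Add, () => queue.CompleteAdding());

// Consumer side: competing consumers take the next available item.
var consumers = Enumerable.Range(1, 2)
    .Select(id => Task.Run(() =>
    {
        foreach (var item in queue.GetConsumingEnumerable())
            Console.WriteLine($"Consumer {id} handled {item}");
    }))
    .ToArray();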
However, if you want to do it, against good advice :), then you could just use the Select operator that will provide you with the index of each element. You can then pass that down to your subscribers and they can filter on a modulus. (Yuck! Leaky abstractions, magic numbers, potentially blocking, potential side effects to the source sequence etc)
var source = Observable.Interval(TimeSpan.FromSeconds(1))
    .Select((element, i) => new { Index = i, Element = element })
    .Publish()
    .RefCount(); // share one underlying sequence between both subscribers
var subscription1 = source.Where(x => x.Index % 2 == 0).Subscribe(x => DoWithThing1(x.Element));
var subscription2 = source.Where(x => x.Index % 2 == 1).Subscribe(x => DoWithThing2(x.Element));
Also remember that if the work done in the OnNext handler is blocking, it will still block the scheduler that it is on. This could affect the speed of your source/producer. Another reason why Asti's answer is a better option.
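If the per-item work really is heavy, one mitigation (a sketch, not part of the original suggestion) is to move each subscriber onto its own scheduler with ObserveOn, so a blocking handler only ties up that scheduler rather than the shared source:

// Each subscription observes on the task pool, so a blocking DoWithThing1
// does not stall the sequence that both subscribers share.
var subscription1 = source
    .Where(x => x.Index % 2 == 0)
    .ObserveOn(TaskPoolScheduler.Default)
    .Subscribe(x => DoWithThing1(x.Element));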
Ask if that is not clear :-)
How about:
IObservable<TRet> SomeLengthyOperation(T input)
{
    return Observable.Defer(() => Observable.Start(() => {
        return someCalculatedValueThatTookALongTime;
    }, TaskPoolScheduler.Default));
}

someObservableSource
    .SelectMany(x => SomeLengthyOperation(x))
    .Subscribe(x => Console.WriteLine("The result was {0}", x));
You can even limit the number of concurrent operations:
someObservableSource
    .Select(x => SomeLengthyOperation(x))
    .Merge(4 /* at a time */)
    .Subscribe(x => Console.WriteLine("The result was {0}", x));
For the Merge(4) to work, it's important that the Observable returned by SomeLengthyOperation be a Cold Observable, which is what the Defer does here - it makes the Observable.Start not happen until someone Subscribes.
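To illustrate the point, here is a sketch of what the eager version would look like (hypothetical, reusing the same placeholder names): without the Defer, Observable.Start kicks the work off the moment Select produces the inner observable, so every operation would start immediately and Merge(4) could no longer limit the real concurrency.

// Hot/eager variant: the work starts as soon as this method is called,
// i.e. when Select maps the item, not when Merge subscribes to it.
IObservable<TRet> EagerLengthyOperation(T input)
{
    return Observable.Start(() => {
        return someCalculatedValueThatTookALongTime;
    }, TaskPoolScheduler.Default);
}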