Assigning an Observable to another - Swift

I have this object TryOut; when initialized, it executes a private method every 2 seconds. Within that method, func execute(), there is internalStream, a local variable of type Observable<Int> that captures data that I wish to emit to the outside world.
The issue is that even though internalStream is assigned to a member property public var outsideStream: Observable<Int>?, there aren't any events coming from subscribing to outsideStream. Why is that? Is there any reason behind it?
Working Case
The only way it works is by having a closure as a member property, public var broadcast: ((Observable<Int>) -> ())? = nil, and invoking it within the execute method like this: broadcast?(internalStream)
Sample code can be found in this gist. Thank you for your help.

For this kind of situation, when you want to produce events yourself, you are better off using one of the *Subject types provided by RxSwift.
For example:
Change the outsideStream declaration to:
public var outsideStream = PublishSubject<Int>()
Produce events the right way:
@objc private func execute() {
    currentIndex += 1
    if currentIndex < data.count {
        outsideStream.onNext(data[currentIndex])
    }
    guard currentIndex + 1 > data.count && timer.isValid else { return }
    outsideStream.onCompleted()
    timer.invalidate()
}
And the usage:
let participant = TryOut()
participant.outsideStream
    .subscribe(
        onNext: { print("income index:", $0) },
        onCompleted: { print("stream completed") }
    )
    .disposed(by: bag)
gives you the output:
income index: 1
income index: 2
income index: 3
income index: 4
income index: 5
stream completed
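For context, here is a hedged sketch of how the whole TryOut class might look with this change. The gist itself isn't reproduced here, so the timer setup, the sample data, and the starting index are assumptions chosen to match the output above:
import Foundation
import RxSwift

final class TryOut: NSObject {
    public var outsideStream = PublishSubject<Int>()

    private let data = [1, 2, 3, 4, 5]   // assumed sample data
    private var currentIndex = -1        // assumed starting index
    private var timer: Timer!

    override init() {
        super.init()
        // Fire `execute` every 2 seconds, as described in the question.
        timer = Timer.scheduledTimer(timeInterval: 2,
                                     target: self,
                                     selector: #selector(execute),
                                     userInfo: nil,
                                     repeats: true)
    }

    @objc private func execute() {
        currentIndex += 1
        if currentIndex < data.count {
            outsideStream.onNext(data[currentIndex])
        }
        guard currentIndex + 1 > data.count && timer.isValid else { return }
        outsideStream.onCompleted()
        timer.invalidate()
    }
}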
P.S. There is also another way to do this by using (or reproducing) the retry method from the RxSwiftExt library.

Related

How to cancel an Asynchronous function in Swift

In Swift, what is the common practice for cancelling an async execution?
Using this example, which executes the closure asynchronously, what is the way to cancel the async function?
func getSumOf(array: [Int], handler: @escaping ((Int) -> Void)) {
    // step 2
    var sum: Int = 0
    for value in array {
        sum += value
    }
    // step 3
    Globals.delay(0.3, closure: {
        handler(sum)
    })
}
func doSomething() {
    // step 1
    self.getSumOf(array: [16, 756, 442, 6, 23]) { [weak self] (sum) in
        print(sum)
        // step 4, finishing the execution
    }
}
// Here we are calling the closure with a delay of 0.3 seconds.
// It will print the sum of all the passed numbers.
Unfortunately, there is no generalized answer to this question as it depends entirely upon your asynchronous implementation.
Let's imagine that your delay was the typical naive implementation:
static func delay(_ timeInterval: TimeInterval, closure: @escaping () -> Void) {
    DispatchQueue.main.asyncAfter(deadline: .now() + timeInterval) {
        closure()
    }
}
That's not going to be cancelable.
However you can redefine it to use DispatchWorkItem. This is cancelable:
@discardableResult
static func delay(_ timeInterval: TimeInterval, closure: @escaping () -> Void) -> DispatchWorkItem {
    let task = DispatchWorkItem {
        closure()
    }
    DispatchQueue.main.asyncAfter(deadline: .now() + timeInterval, execute: task)
    return task
}
By marking it @discardableResult, you can use it like before, but if you want to cancel it, grab the result and pass it along. E.g., you can define your asynchronous sum routine to use this pattern, too:
@discardableResult
func sum(of array: [Int], handler: @escaping (Int) -> Void) -> DispatchWorkItem {
    let sum = array.reduce(0, +)
    return Globals.delay(3) {
        handler(sum)
    }
}
Now, doSomething can, if it wants, capture the returned value and use it to cancel the asynchronously scheduled task:
func doSomething() {
    let task = sum(of: [16, 756, 442, 6, 23]) { sum in
        print(Date(), sum)
    }
    ...
    task.cancel()
}
You can also implement the delay with a Timer:
@discardableResult
static func delay(_ timeInterval: TimeInterval, closure: @escaping () -> Void) -> Timer {
    Timer.scheduledTimer(withTimeInterval: timeInterval, repeats: false) { _ in
        closure()
    }
}
And
@discardableResult
func sum(of array: [Int], handler: @escaping (Int) -> Void) -> Timer {
    let sum = array.reduce(0, +)
    return Globals.delay(3) {
        handler(sum)
    }
}
But this time, you'd invalidate the timer:
func doSomething() {
    weak var timer = sum(of: [16, 756, 442, 6, 23]) { sum in
        print(Date(), sum)
    }
    ...
    timer?.invalidate()
}
It must be noted that the above applies only to simple “delay” scenarios. It is not a general-purpose solution for stopping asynchronous processes. For example, if the asynchronous task consists of some time-consuming for loop, the above is insufficient.
For example, let's say you are doing some really complicated calculation in a for loop (e.g. processing the pixels of an image, processing the frames of a video, etc.). In that case, because there is no preemptive cancellation, you need to manually check whether the DispatchWorkItem or the Operation has been cancelled by checking its respective isCancelled property.
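Here is a hedged sketch (my own, not from the original answer) of that cooperative check with a DispatchWorkItem; the loop and the numbers are arbitrary:
var workItem: DispatchWorkItem!
workItem = DispatchWorkItem {
    var sum = 0
    for value in 0 ..< 1_000_000 {
        // Cooperative cancellation: bail out if someone called cancel().
        if workItem.isCancelled { return }
        sum += value
    }
    print(sum)
}
DispatchQueue.global().async(execute: workItem)

// Later, from anywhere:
workItem.cancel()
Note that the closure captures the work item itself so it can poll isCancelled; that introduces a small reference cycle, which is acceptable for a throwaway task like this.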
For example, let's consider an operation to sum all primes less than 1 million:
class SumPrimes: Operation {
    override func main() {
        var sum = 0
        for i in 1 ..< 1_000_000 {
            if isPrime(i) {
                sum += i
            }
        }
        print(Date(), sum)
    }

    func isPrime(_ value: Int) -> Bool { ... } // this is slow
}
(Obviously, this isn't an efficient way to solve the “sum of primes less than x” problem, but it's just an example for illustrative purposes.)
And
let queue = OperationQueue()
let operation = SumPrimes()
queue.addOperation(operation)
We're not going to be able to cancel that. Once it starts, there’s no stopping it.
But we can make it cancelable by adding a check for isCancelled in our loop:
class SumPrimes: Operation {
    override func main() {
        var sum = 0
        for i in 1 ..< 1_000_000 {
            if isCancelled { return }
            if isPrime(i) {
                sum += i
            }
        }
        print(Date(), sum)
    }

    func isPrime(_ value: Int) -> Bool { ... }
}
And
let queue = OperationQueue()
let operation = SumPrimes()
queue.addOperation(operation)
...
operation.cancel()
Bottom line, if it's something other than a simple delay and you want it to be cancelable, you have to integrate the cancellation logic into the code that runs asynchronously.
Using this example..., what is the way to cancel the async function?
Using that example, there is no such way. The only way to avoid printing the sum is for self to go out of existence some time in the 0.3 seconds immediately after the call.
(There are ways to make a cancellable timer, but the timer you've made, assuming that it's the delay I think it is, is not cancellable.)
I don't know your algorithm, but first I have a few suggestions.
If you want a delay, do it outside of the getSumOf function, to respect the Single Responsibility principle.
Use the built-in reduce function to sum the items in the array in a cleaner and more efficient way.
You can use DispatchWorkItem to build a cancellable task, so you can remove the getSumOf function and edit the doSomething function as below.
let yourArray = [16, 756, 442, 6, 23]

let workItem = DispatchWorkItem {
    // Your async code goes in here
    let sum = yourArray.reduce(0, +)
    print(sum)
}

// Execute the work item after 0.3 seconds
DispatchQueue.main.asyncAfter(deadline: .now() + 0.3, execute: workItem)

// You can cancel the work item if you no longer need it
workItem.cancel()
You can also look into OperationQueue for advanced use.
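For completeness, a minimal sketch (my own, not from the answer) of the same idea with OperationQueue; the block checks isCancelled before reporting its result:
let queue = OperationQueue()

let operation = BlockOperation()
operation.addExecutionBlock { [weak operation] in
    // Bail out if the operation was cancelled before (or while) running.
    guard operation?.isCancelled == false else { return }
    let sum = [16, 756, 442, 6, 23].reduce(0, +)
    print(sum)
}
queue.addOperation(operation)

// Later, if the result is no longer needed:
operation.cancel()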

Why does `Publishers.Map` consume upstream values eagerly?

Suppose I have a custom subscriber that requests one value on subscription and then an additional value three seconds after it receives the previous value:
class MySubscriber: Subscriber {
    typealias Input = Int
    typealias Failure = Never

    private var subscription: Subscription?

    func receive(subscription: Subscription) {
        print("Subscribed")
        self.subscription = subscription
        subscription.request(.max(1))
    }

    func receive(_ input: Int) -> Subscribers.Demand {
        print("Value: \(input)")
        DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(3)) {
            self.subscription?.request(.max(1))
        }
        return .none
    }

    func receive(completion: Subscribers.Completion<Never>) {
        print("Complete")
        subscription = nil
    }
}
If I use this to subscribe to an infinite range publisher, back pressure is handled gracefully, with the publisher waiting 3 seconds each time until it receives the next demand to send a value:
(1...).publisher.subscribe(MySubscriber())
// Prints values infinitely with ~3 seconds between each:
//
// Subscribed
// Value: 1
// Value: 2
// Value: 3
// ...
But if I add a map operator then MySubscriber never even receives a subscription; map appears to have synchronously requested Demand.Unlimited upon receiving its subscription and the app infinitely spins as map tries to exhaust the infinite range:
(1...).publisher
    .map { value in
        print("Map: \(value)")
        return value * 2
    }
    .subscribe(MySubscriber())

// The `map` transform is executed infinitely with no delay:
//
// Map: 1
// Map: 2
// Map: 3
// ...
My question is, why does map behave this way? I would have expected map to just pass its downstream demand to the upstream. Since map is supposed to be for transformation rather than side effects, I don't understand what the use case is for its current behavior.
EDIT
I implemented a version of map to show how I think it ought to work:
extension Publishers {
    struct MapLazily<Upstream: Publisher, Output>: Publisher {
        typealias Failure = Upstream.Failure

        let upstream: Upstream
        let transform: (Upstream.Output) -> Output

        init(upstream: Upstream, transform: @escaping (Upstream.Output) -> Output) {
            self.upstream = upstream
            self.transform = transform
        }

        public func receive<S: Subscriber>(subscriber: S) where S.Input == Output, S.Failure == Upstream.Failure {
            let mapSubscriber = Subscribers.LazyMapSubscriber(downstream: subscriber, transform: transform)
            upstream.receive(subscriber: mapSubscriber)
        }
    }
}
extension Subscribers {
    class LazyMapSubscriber<Input, DownstreamSubscriber: Subscriber>: Subscriber {
        let downstream: DownstreamSubscriber
        let transform: (Input) -> DownstreamSubscriber.Input

        init(downstream: DownstreamSubscriber, transform: @escaping (Input) -> DownstreamSubscriber.Input) {
            self.downstream = downstream
            self.transform = transform
        }

        func receive(subscription: Subscription) {
            downstream.receive(subscription: subscription)
        }

        func receive(_ input: Input) -> Subscribers.Demand {
            downstream.receive(transform(input))
        }

        func receive(completion: Subscribers.Completion<DownstreamSubscriber.Failure>) {
            downstream.receive(completion: completion)
        }
    }
}
extension Publisher {
    func mapLazily<Transformed>(transform: @escaping (Output) -> Transformed) -> AnyPublisher<Transformed, Failure> {
        Publishers.MapLazily(upstream: self, transform: transform).eraseToAnyPublisher()
    }
}
Using this operator, MySubscriber receives the subscription immediately and the mapLazily transform is only executed when there is demand:
(1...).publisher
    .mapLazily { value in
        print("Map: \(value)")
        return value * 2
    }
    .subscribe(MySubscriber())

// Only transforms the values when they are demanded by the downstream subscriber every 3 seconds:
//
// Subscribed
// Map: 1
// Value: 2
// Map: 2
// Value: 4
// Map: 3
// Value: 6
// Map: 4
// Value: 8
My guess is that the particular overload of map defined for Publishers.Sequence is using some kind of shortcut to enhance performance. This breaks for infinite sequences, but even for finite sequences eagerly exhausting the sequence regardless of the downstream demand messes with my intuition. In my view, the following code:
(1...3).publisher
    .map { value in
        print("Map: \(value)")
        return value * 2
    }
    .subscribe(MySubscriber())
ought to print:
Subscribed
Map: 1
Value: 2
Map: 2
Value: 4
Map: 3
Value: 6
Complete
but instead prints:
Map: 1
Map: 2
Map: 3
Subscribed
Value: 2
Value: 4
Value: 6
Complete
Here's a simpler test that doesn't involve any custom subscribers:
(1...).publisher
    //.map { $0 }
    .flatMap(maxPublishers: .max(1)) { (i: Int) -> AnyPublisher<Int, Never> in
        Just<Int>(i)
            .delay(for: 3, scheduler: DispatchQueue.main)
            .eraseToAnyPublisher()
    }
    .sink { print($0) }
    .store(in: &storage)
It works as expected, but then if you uncomment the .map you get nothing, because the .map operator is accumulating the infinite upstream values without publishing anything.
On the basis of your hypothesis that map is somehow optimizing for a preceding sequence publisher, I tried this workaround:
(1...).publisher.eraseToAnyPublisher()
    .map { $0 }
    // ...
And sure enough, it fixed the problem! By hiding the sequence publisher from the map operator, we prevent the optimization.
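Putting the two snippets together, a hedged, self-contained version of the workaround (assuming the same storage set of AnyCancellables as in the earlier example) looks like this:
(1...).publisher
    .eraseToAnyPublisher()   // hide Publishers.Sequence so map can't take the eager path
    .map { value -> Int in
        print("Map: \(value)")
        return value * 2
    }
    .flatMap(maxPublishers: .max(1)) { (i: Int) -> AnyPublisher<Int, Never> in
        Just(i)
            .delay(for: 3, scheduler: DispatchQueue.main)
            .eraseToAnyPublisher()
    }
    .sink { print($0) }
    .store(in: &storage)

// Prints a doubled value roughly every 3 seconds instead of spinning forever.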

"PassthroughSubject" seems to be thread-unsafe, is this a bug or limitation?

"PassthroughSubject" seems to be thread-unsafe. Please see the code below, I'm sending 100 values concurrently to a subscriber which only request .max(5). Subscriber should only get 5 values I think, but it actually got more. Is this a bug or limitation?
// Xcode 11 beta 2
var count = 0
let q = DispatchQueue(label: UUID().uuidString)
let g = DispatchGroup()

let subject = PassthroughSubject<Int, Never>()

let subscriber = AnySubscriber<Int, Never>(receiveSubscription: { (s) in
    s.request(.max(5))
}, receiveValue: { v in
    q.sync {
        count += 1
    }
    return .none
}, receiveCompletion: { c in
})

subject.subscribe(subscriber)

for i in 0..<100 {
    DispatchQueue.global().async(group: g) {
        subject.send(i)
    }
}

g.wait()
print("receive", count) // expected 5, but got more (7, 9, ...)
I believe the prefix operator can help:
/// Republishes elements up to the specified maximum count.
func prefix(Int) -> Publishers.Output<PassthroughSubject<Output, Failure>>
The max operator is returning the largest value at completion (and it's possible you're triggering completion more than once):
/// Publishes the maximum value received from the upstream publisher, after it finishes.
/// Available when Output conforms to Comparable.
func max() -> Publishers.Comparison<PassthroughSubject<Output, Failure>>
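If you go the prefix route, a minimal sketch of the suggestion would be the following (whether it removes every race depends on the operator's own synchronization):
subject
    .prefix(5)          // republish at most five values, then finish
    .subscribe(subscriber)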

Skip an event from source observable if a new event from a given observable was received in a given time interval

I'm trying to write a method in a UIView extension which will observe a long press on a given view. I know it can be done using UILongPressGestureRecognizer, but I really want to figure out the question and do it this way.
I tried to use the takeUntil operator, but it completes the observable, whereas I need to skip the value and keep receiving further events.
The question can also be rephrased as: how to omit the completed event and keep receiving further events?
func observeLongPress(with minimumPressDuration: Double = 1) -> Observable<Void> {
    let touchesBeganEvent = rx.methodInvoked(#selector(touchesBegan))
    let touchesEndedEvents = [#selector(touchesEnded), #selector(touchesCancelled)]
        .map(rx.methodInvoked)
    let touchesEndedEvent = Observable.merge(touchesEndedEvents)

    return touchesBeganEvent
        .observeOn(MainScheduler.instance)
        .delay(minimumPressDuration, scheduler: MainScheduler.instance)
        .takeUntil(touchesEndedEvent)
        .map { _ in }
}
This will work, but it will complete the whole sequence (as it is intended to do).
The answer is floating around (as it always does), but after a few hours I decided to ask. :)
Update
The floating answer just flew in (~15 minutes to do so), but I'm still interested in an answer, because maybe there's something I'm missing here.
func observeLongPress(with minimumPressDuration: Double = 1) -> Observable<Void> {
    let touchesBeganEvent = rx.methodInvoked(#selector(touchesBegan))
    let touchesEndedEvents = [#selector(touchesEnded), #selector(touchesCancelled)]
        .map(rx.methodInvoked)
    let touchesEndedEvent = Observable.merge(touchesEndedEvents)

    return touchesBeganEvent
        .observeOn(MainScheduler.instance)
        .flatMapLatest { _ -> Observable<Void> in
            return Observable.just(())
                .delay(minimumPressDuration, scheduler: MainScheduler.instance)
                .takeUntil(touchesEndedEvent)
                .void()
        }
}
Your updated code won't work. Even if you don't emit the completed event out of the function, it still gets emitted from the takeUntil, and therefore that operator won't emit any more values.
That said, this idea can be accomplished. Since you said you want to learn, I'll talk through my entire thought process while writing this.
First, let's outline our inputs and outputs. For inputs we have two Observables and a duration, and whenever we are dealing with a duration we need a scheduler. For outputs we only have a single Observable.
So the function declaration looks like this:
func filterDuration(first: Observable<Void>, second: Observable<Void>, duration: TimeInterval, scheduler: SchedulerType) -> Observable<Void> {
    // code goes here.
}
We are going to be comparing the time that the two observables fire so we have to track that:
let firstTime = first.map { scheduler.now }
let secondTime = second.map { scheduler.now }
And since we are comparing them, we have to combine them somehow. We could use merge, combineLatest, or zip...
combineLatest will fire whenever either Observable fires and will give us the latest values from both observables.
zip will pair up, 1 for 1, events from both observables. This sounds intriguing, but it would break down if one of the observables fires more often than the other, so it seems a bit brittle.
merge will fire when either of them fires, so we would need to track which one fired somehow (probably with an enum).
Let's use combineLatest:
Observable.combineLatest(firstTime, secondTime)
That will give us an Observable<(RxTime, RxTime)> so now we can map over our two times and compare them. The goal here is to return a Bool that is true if the second time is greater than the first time by more than duration.
.map { arg -> Bool in
    let (first, second) = arg
    let tickDuration = second.timeIntervalSince(first)
    return duration <= tickDuration
}
Now the above will fire every time either of the two inputs fires, but we only care about the events that emit true. That calls for filter.
.filter { $0 } // since the wrapped value is a Bool, this will accomplish our goal.
Now our chain is emitting Bools which will always be true but we want a Void. How about we just throw away the value.
.map { _ in }
Putting it all together, we get:
func filterDuration(first: Observable<Void>, second: Observable<Void>, duration: TimeInterval, scheduler: SchedulerType) -> Observable<Void> {
    let firstTime = first.map { scheduler.now }
    let secondTime = second.map { scheduler.now }

    return Observable.combineLatest(firstTime, secondTime)
        .map { arg -> Bool in
            let (first, second) = arg
            let tickDuration = second.timeIntervalSince(first)
            return duration <= tickDuration
        }
        .filter { $0 }
        .map { _ in }
}
The above isolates our logic and is, not incidentally, easy to test with RxTest. Now we can wrap it up in a UIView (which would be hard to test):
func observeLongPress(with minimumPressDuration: Double = 1) -> Observable<Void> {
    let touchesBeganEvent = rx.methodInvoked(#selector(touchesBegan)).map { _ in }
    let touchesEndedEvents = [#selector(touchesEnded), #selector(touchesCancelled)]
        .map(rx.methodInvoked)
    let touchesEndedEvent = Observable.merge(touchesEndedEvents).map { _ in }

    return filterDuration(first: touchesBeganEvent, second: touchesEndedEvent, duration: minimumPressDuration, scheduler: MainScheduler.instance)
}
There you go. One custom operator.
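As a rough illustration of the testability point above, here is a hedged sketch of how filterDuration might be exercised with RxTest; the virtual times, the duration, and the default one-second-per-tick TestScheduler resolution are all assumptions:
import XCTest
import RxSwift
import RxTest

final class FilterDurationTests: XCTestCase {
    func testEmitsWhenSecondEventArrivesAfterDuration() {
        let scheduler = TestScheduler(initialClock: 0)

        // First fires at tick 300, second at tick 600: a 300-second gap.
        let first = scheduler.createHotObservable([.next(300, ())]).asObservable()
        let second = scheduler.createHotObservable([.next(600, ())]).asObservable()

        let result = scheduler.start {
            filterDuration(first: first, second: second, duration: 250, scheduler: scheduler)
        }

        // The gap (300 s) exceeds the duration (250 s), so exactly one event is expected.
        XCTAssertEqual(result.events.count, 1)
    }
}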

Swift Parse query limit

I've got a question about Parse query.limit with Swift.
I'm using the settings below and a limit to pull objects from Parse:
func getUsernames() {
    query.limit = self.limit
    query.findObjectsInBackgroundWithBlock { // display on tableView
        self.usernamesArray.append(someuser)
        self.tableView.reloadData()
    }
}
then in tableView:willDisplayCell
if indexPath.row == self.usernamesArray.count - 1 {
    self.limit += 5
    self.getUsernames()
}
and in viewDidAppear
if self.usernames.count == 0 {
    self.limit = self.limit + 10
    self.getUsernames()
}
This works fine. Whenever my table scrolls past the second-to-last cell, another 5 items are ready, which is what I expected.
The problem is that if usernamesArray.count has a total of 50, then when the last cell (the 50th) is reached, tableView:willDisplayCell keeps getting called and self.limit keeps increasing: 55, 60, 65, etc. It doesn't stop when it reaches the last cell or the last item in the array; it keeps using the LTE data and query.limit keeps increasing 5 by 5, even when there aren't any more values available.
Am I doing this right, or should I try a different approach?
Any help from a Swift master will be appreciated! Thanks.
No, you aren't doing it right. First, by just increasing the limit and querying you're always getting another copy of the data you already have and maybe a few additional items. You should be changing the offset (skip) for each new query, not the limit.
You also need to check the response to see how many items were returned. If the number of items returned is less than the request limit then you know you're at the end of the data and you shouldn't make any more requests. Set a flag so that when scrolling you know there isn't anything else you can load.
I found it easier to create a resultSet class that handles all this information.
class ParseResultSet: NSObject {
    var limit = 20
    var skip = 0
    var total = 0
    var limitReached = false
    var orderByAscendingKey: String?
    var orderByDescendingKey: String?
    var searchActive: Bool?

    func increaseSkipByLimit() {
        skip += limit
    }

    func increaseTotal(byNumber: Int) {
        total += byNumber
        if byNumber == 0 {
            self.limitHasBeenReached()
            skip = total
        }
        else {
            self.increaseSkipByLimit()
        }
    }

    func limitHasBeenReached() {
        limitReached = true
    }
}
I then use a method that gets the objects from Parse in a completion block. I check whether the limit has been reached, and increment the total if it hasn't:
func getObjects(classname: String, include: [String], completion: @escaping (_ result: [PFObject]) -> Void) {
    if self.resultSet.limitReached == false || self.resultSet.searchActive == true {
        fetchObjectsFromClass(parseClass: classname, include: include, completion: { [weak self] (objects) in
            self?.resultSet.increaseTotal(byNumber: objects.count)
            completion(objects)
        })
    }
    else {
        print("LIMIT REACHED")
    }
}
In my case, fetchObjectsFromClass is a global function that generates the query and returns an array of PFObjects.
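For illustration, here is a hedged sketch (names and details assumed, not the answer's actual code) of what such a helper could look like with the Parse iOS SDK, assuming it can reach the result set through a shared instance:
import Parse

// Assumed: a globally reachable result set shared with the caller above.
let resultSet = ParseResultSet()

func fetchObjectsFromClass(parseClass: String,
                           include: [String],
                           completion: @escaping ([PFObject]) -> Void) {
    let query = PFQuery(className: parseClass)
    include.forEach { query.includeKey($0) }
    query.limit = resultSet.limit   // page size stays constant
    query.skip = resultSet.skip     // offset advances after each successful page
    query.findObjectsInBackground { objects, _ in
        completion(objects ?? [])
    }
}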
Hopefully this gives you an idea of what you need to do