RxSwift retry full chain - swift

I'm pretty new to RxSwift and I have the following problem.
Given two functions:
struct Checkout { ... }
func getSessionIdOperation() -> Single<UUID>
func getCheckoutForSession(_ sessionId: UUID, asGuestUser: Bool) -> Single<Checkout>
I have a third function that combines the result of the two:
func getCheckout(asGuestUser: Bool) -> Single<Checkout> {
return getSessionIdOperation()
.map { ($0, asGuestUser) }
.flatMap(getCheckoutForSession)
}
Both getSessionIdOperation and getCheckoutForSession can fail, and in case of failure I would like to restart the whole chain just once. I tried retry(2), but only getCheckoutForSession was repeated. :(

Make sure you call retry(2) on the stream that already contains the flatMap:
func getCheckout(asGuestUser: Bool) -> Single<Checkout> {
return getSessionIdOperation()
// .retry(2) here would retry just the first stream
.map { ($0, asGuestUser) }
.flatMap(getCheckoutForSession)
.retry(2) // Here it will retry whole stream
}
If getSessionIdOperation fails, getCheckoutForSession will never be called, since it depends on the output of the first stream.
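If you want to see the whole chain being resubscribed, a quick sketch with a hypothetical counter-backed stub (standing in for the real getSessionIdOperation) makes it visible:
var sessionAttempts = 0
// Hypothetical stub: fails on the first subscription only.
func getSessionIdOperation() -> Single<UUID> {
    return Single.deferred { () -> Single<UUID> in
        sessionAttempts += 1
        if sessionAttempts == 1 {
            return .error(NSError(domain: "session", code: -1, userInfo: nil))
        }
        return .just(UUID())
    }
}
With retry(2) placed after the flatMap, a single subscription to getCheckout(asGuestUser:) ends with sessionAttempts == 2: the failed attempt plus the successful resubscription of the whole chain.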

Empty() publisher does not send completion

In this code I am expecting the Empty() publisher to send completion to the .sink subscriber, but no completion is sent.
func testEmpty () {
let x = XCTestExpectation()
let subject = PassthroughSubject<Int, Never>()
emptyOrSubjectPublisher(subject).sink(receiveCompletion: { completion in
dump(completion)
}, receiveValue: { value in
dump(value)
}).store(in: &cancellables)
subject.send(0)
wait(for: [x], timeout: 10.0)
}
func emptyOrSubjectPublisher (_ subject: PassthroughSubject<Int, Never>) -> AnyPublisher<Int, Never> {
subject
.flatMap { (i: Int) -> AnyPublisher<Int, Never> in
if i == 1 {
return subject.eraseToAnyPublisher()
} else {
return Empty().eraseToAnyPublisher()
}
}
.eraseToAnyPublisher()
}
Why does the emptyOrSubjectPublisher not receive the completion?
The Empty completes, but the overall pipeline does not, because the initial Subject has not completed. The inner pipeline in which the Empty is produced (the flatMap) has "swallowed" the completion. This is the expected behavior.
You can see this more easily by simply producing a Just in the flatMap, e.g. Just(100):
subject
.flatMap {_ in Just(100) }
.sink(receiveCompletion: { completion in
print(completion)
}, receiveValue: { value in
print(value)
}).store(in: &cancellables)
subject.send(1)
You know and I know that a Just emits once and completes. But although the value of the Just arrives down the pipeline, there is no completion.
And you can readily see why it works this way. It would be very wrong if we had a potential sequence of values from our publisher but some intermediate publisher, produced in a flatMap, had the power to complete the whole pipeline and end it prematurely.
(And see my https://www.apeth.com/UnderstandingCombine/operators/operatorsTransformersBlockers/operatorsflatmap.html where I make the same point.)
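Continuing that example, if you then complete the subject itself, the completion does propagate, because the upstream has finished and the inner Just had already finished:
subject.send(completion: .finished) // now the sink prints finished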
If the goal is to send a completion down the pipeline, it's the subject that needs to complete. For example, you could say
func emptyOrSubjectPublisher (_ subject: PassthroughSubject<Int, Never>) -> AnyPublisher<Int, Never> {
subject
.flatMap { (i: Int) -> AnyPublisher<Int, Never> in
if i == 1 {
return subject.eraseToAnyPublisher()
} else {
subject.send(completion: .finished) // <--
return Empty().eraseToAnyPublisher()
}
}
.eraseToAnyPublisher()
}
[Note, however, that your whole emptyOrSubjectPublisher is peculiar; it is unclear what purpose it is intended to serve. Returning subject when i is 1 is kind of pointless too, because subject has already published the 1 by the time we get here, and isn't going to publish anything more right now. Thus, if you send 1 at the start, you won't receive 1 as a value, because your flatMap has swallowed it and has produced a publisher that isn't going to publish.]

ReactiveSwift pipeline flatMap body transform not executed

I have the following pipeline setup, and for some reason I can't understand, the second flatMap is skipped:
func letsDoThis() -> SignalProducer<(), MyError> {
let logError: (MyError) -> Void = { error in
print("Error: \(error); \((error as NSError).userInfo)")
}
return upload(uploads) // returns: SignalProducer<Signal<(), MyError>.Event, Never>
.collect() // SignalProducer<[Signal<(), MyError>.Event], Never>
.flatMap(.merge, { [uploadContext] values -> SignalProducer<[Signal<(), MyError>.Event], MyError> in
return context.saveSignal() // SignalProducer<(), NSError>
.map { values } // SignalProducer<[Signal<(), MyError>.Event], NSError>
.mapError { MyError.saveFailed(error: $0) } // SignalProducer<[Signal<(), MyError>.Event], MyError>
})
.flatMap(.merge, { values -> SignalProducer<(), MyError> in
if let error = values.first(where: { $0.error != nil })?.error {
return SignalProducer(error: error)
} else {
return SignalProducer(value: ())
}
})
.on(failed: logError)
}
See the transformations/signatures starting with the upload method.
When I say skipped I mean even if I add breakpoints or log statements, they are not executed.
Any idea how to debug this or how to fix?
Thanks.
EDIT: it most likely has something to do with the map within the first flatMap, but I'm not sure how to fix it yet.
See this link.
EDIT 2: versions
- ReactiveCocoa (10.1.0):
- ReactiveObjC (3.1.1)
- ReactiveObjCBridge (6.0.0):
- ReactiveSwift (6.1.0)
EDIT 3: I found the problem, which was due to my saveSignal method sending sendCompleted.
extension NSManagedObjectContext {
func saveSignal() -> SignalProducer<(), NSError> {
return SignalProducer { observer, disposable in
self.perform {
do {
try self.save()
observer.sendCompleted()
}
catch {
observer.send(error: error as NSError)
}
}
}
}
}
Sending completed makes sense, so I can't change that. Is there any way to change the flatMap to still do what I intended?
I think the reason your second flatMap is never executed is that saveSignal never sends a value; it just finishes with a completed event or an error event. That means map will never be called, and no values will ever be passed to your second flatMap. You can fix it by doing something like this:
context.saveSignal()
.mapError { MyError.saveFailed(error: $0) }
.then(SignalProducer(value: values))
Instead of using map (which does nothing because there are no values to map), you just create a new producer that sends the values after saveSignal completes successfully.
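Dropped back into the original pipeline, the first flatMap would then look roughly like this (a sketch that keeps the question's [uploadContext] capture and context call as written):
.flatMap(.merge, { [uploadContext] values -> SignalProducer<[Signal<(), MyError>.Event], MyError> in
    return context.saveSignal()
        .mapError { MyError.saveFailed(error: $0) }
        .then(SignalProducer(value: values))
})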

Swift Combine Future with multiple values?

I may be going about this the wrong way, but I have a function with which I want to emit multiple values over time. But I don’t want it to start emitting until something is subscribed to that object. I’m coming to Combine from RxSwift, so I’m basically trying to duplicate Observable.create() from the RxSwift world. The closest I have found is returning a Future, but Futures only succeed or fail (so they are basically like a Single in RxSwift).
Is there some fundamental thing I am missing here? My end goal is to make a function that processes a video file and emits progress events until it completes, then emits a URL for the completed file.
Generally you can use a PassthroughSubject to publish custom outputs. You can wrap a PassthroughSubject (or multiple PassthroughSubjects) in your own implementation of Publisher to ensure that only your process can send events through the subject.
Let's mock a VideoFrame type and some input frames for example purposes:
typealias VideoFrame = String
let inputFrames: [VideoFrame] = ["a", "b", "c"]
Now we want to write a function that synchronously processes these frames. Our function should report progress somehow, and at the end, it should return the output frames. To report progress, our function will take a PassthroughSubject<Double, Never>, and send its progress (as a fraction from 0 to 1) to the subject:
func process(_ inputFrames: [VideoFrame], progress: PassthroughSubject<Double, Never>) -> [VideoFrame] {
var outputFrames: [VideoFrame] = []
for input in inputFrames {
progress.send(Double(outputFrames.count) / Double(inputFrames.count))
outputFrames.append("output for \(input)")
}
return outputFrames
}
Okay, so now we want to turn this into a publisher. The publisher needs to output both progress and a final result. So we'll use this enum as its output:
public enum ProgressEvent<Value> {
case progress(Double)
case done(Value)
}
Now we can define our Publisher type. Let's call it SyncPublisher, because when it receives a Subscriber, it immediately (synchronously) performs its entire computation.
public struct SyncPublisher<Value>: Publisher {
public init(_ run: @escaping (PassthroughSubject<Double, Never>) throws -> Value) {
self.run = run
}
public var run: (PassthroughSubject<Double, Never>) throws -> Value
public typealias Output = ProgressEvent<Value>
public typealias Failure = Error
public func receive<Downstream: Subscriber>(subscriber: Downstream) where Downstream.Input == Output, Downstream.Failure == Failure {
let progressSubject = PassthroughSubject<Double, Never>()
let doneSubject = PassthroughSubject<ProgressEvent<Value>, Error>()
progressSubject
.setFailureType(to: Error.self)
.map { ProgressEvent<Value>.progress($0) }
.append(doneSubject)
.subscribe(subscriber)
do {
let value = try run(progressSubject)
progressSubject.send(completion: .finished)
doneSubject.send(.done(value))
doneSubject.send(completion: .finished)
} catch {
progressSubject.send(completion: .finished)
doneSubject.send(completion: .failure(error))
}
}
}
Now we can turn our process(_:progress:) function into a SyncPublisher like this:
let inputFrames: [VideoFrame] = ["a", "b", "c"]
let pub = SyncPublisher<[VideoFrame]> { process(inputFrames, progress: $0) }
The run closure is { process(inputFrames, progress: $0) }. Remember that $0 here is a PassthroughSubject<Double, Never>, exactly what process(_:progress:) wants as its second argument.
When we subscribe to this pub, it will first create two subjects. One subject is the progress subject and gets passed to the closure. We'll use the other subject to publish either the final result and a .finished completion, or just a .failure completion if the run closure throws an error.
The reason we use two separate subjects is that it ensures our publisher is well-behaved. If the run closure returns normally, the publisher publishes zero or more progress reports, followed by a single result, followed by .finished. If the run closure throws an error, the publisher publishes zero or more progress reports, followed by a .failure completion. There is no way for the run closure to make the publisher emit multiple results, or emit more progress reports after emitting the result.
At last, we can subscribe to pub to see if it works properly:
pub
.sink(
receiveCompletion: { print("completion: \($0)") },
receiveValue: { print("output: \($0)") })
Here's the output:
output: progress(0.0)
output: progress(0.3333333333333333)
output: progress(0.6666666666666666)
output: done(["output for a", "output for b", "output for c"])
completion: finished
The following extension to AnyPublisher replicates Observable.create semantics by composing a PassthroughSubject. This includes cancellation semantics.
AnyPublisher.create() Swift 5.6 Extension
public extension AnyPublisher {
static func create<Output, Failure>(_ subscribe: @escaping (AnySubscriber<Output, Failure>) -> AnyCancellable) -> AnyPublisher<Output, Failure> {
let passthroughSubject = PassthroughSubject<Output, Failure>()
var cancellable: AnyCancellable?
return passthroughSubject
.handleEvents(receiveSubscription: { subscription in
let subscriber = AnySubscriber<Output, Failure> { subscription in
} receiveValue: { input in
passthroughSubject.send(input)
return .unlimited
} receiveCompletion: { completion in
passthroughSubject.send(completion: completion)
}
cancellable = subscribe(subscriber)
}, receiveCompletion: { completion in
}, receiveCancel: {
cancellable?.cancel()
})
.eraseToAnyPublisher()
}
}
Usage
func doSomething() -> AnyPublisher<Int, Error> {
return AnyPublisher<Int, Error>.create { subscriber in
// Imperative implementation of doing something can call subscriber as follows
_ = subscriber.receive(1)
subscriber.receive(completion: .finished)
// subscriber.receive(completion: .failure(myError))
return AnyCancellable {
// Imperative cancellation implementation
}
}
}
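Applied to the original goal (progress events followed by a URL for the finished file), usage of the extension above might look like this sketch; ExportEvent and the processing loop are hypothetical stand-ins for your real video code:
enum ExportEvent {
    case progress(Double)
    case finished(URL)
}
func exportVideo(to output: URL) -> AnyPublisher<ExportEvent, Error> {
    AnyPublisher<ExportEvent, Error>.create { subscriber in
        let work = DispatchWorkItem {
            for step in 1...10 {
                _ = subscriber.receive(.progress(Double(step) / 10)) // progress events
            }
            _ = subscriber.receive(.finished(output)) // final URL
            subscriber.receive(completion: .finished)
        }
        DispatchQueue.global().async(execute: work)
        return AnyCancellable { work.cancel() } // cancellation hook
    }
}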

Swift Combine: How to create a single publisher from a list of publishers?

Using Apple's new Combine framework I want to make multiple requests from each element in a list. Then I want a single result from a reduction of all the the responses. Basically I want to go from list of publishers to a single publisher that holds a list of responses.
I've tried making a list of publishers, but I don't know how to reduce that list into a single publisher. And I've tried making a publisher containing a list but I can't flat map a list of publishers.
Please look at the "createIngredients" function
func createIngredient(ingredient: Ingredient) -> AnyPublisher<CreateIngredientMutation.Data, Error> {
return apollo.performPub(mutation: CreateIngredientMutation(name: ingredient.name, optionalProduct: ingredient.productId, quantity: ingredient.quantity, unit: ingredient.unit))
.eraseToAnyPublisher()
}
func createIngredients(ingredients: [Ingredient]) -> AnyPublisher<[CreateIngredientMutation.Data], Error> {
// first attempt
let results = ingredients
.map(createIngredient)
// results = [AnyPublisher<CreateIngredientMutation.Data, Error>]
// second attempt
return Publishers.Just(ingredients)
.eraseToAnyPublisher()
.flatMap { (list: [Ingredient]) -> Publisher<[CreateIngredientMutation.Data], Error> in
return list.map(createIngredient) // [AnyPublisher<CreateIngredientMutation.Data, Error>]
}
}
I'm not sure how to take an array of publishers and convert that to a publisher containing an array.
Result value of type '[AnyPublisher]' does not conform to closure result type 'Publisher'
Essentially, in your specific situation you're looking at something like this:
func createIngredients(ingredients: [Ingredient]) -> AnyPublisher<[CreateIngredientMutation.Data], Error> {
Publishers.MergeMany(ingredients.map(createIngredient(ingredient:)))
.collect()
.eraseToAnyPublisher()
}
This 'collects' all the elements produced by the upstream publishers and – once they have all completed – produces an array with all the results and finally completes itself.
Bear in mind that if one of the upstream publishers fails, or produces more than one result, the number of elements may not match the number of publishers you started with, so you may need additional operators to mitigate this depending on your situation.
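One way to mitigate the failure case (a sketch, not part of the original answer) is to wrap each element in a Result, so a single failed request doesn't tear down the whole merged stream:
typealias IngredientResult = Result<CreateIngredientMutation.Data, Error>
func createIngredientsKeepingFailures(ingredients: [Ingredient]) -> AnyPublisher<[IngredientResult], Never> {
    let wrapped = ingredients.map { ingredient in
        createIngredient(ingredient: ingredient)
            .map { IngredientResult.success($0) }
            .catch { Just(IngredientResult.failure($0)) } // convert a failure into a value
            .eraseToAnyPublisher()
    }
    return Publishers.MergeMany(wrapped)
        .collect()
        .eraseToAnyPublisher()
}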
The more generic answer, with a way you can test it using the EntwineTest framework:
import XCTest
import Combine
import EntwineTest
final class MyTests: XCTestCase {
func testCreateArrayFromArrayOfPublishers() {
typealias SimplePublisher = Just<Int>
// we'll create our 'list of publishers' here. Each publisher emits a single
// Int and then completes successfully – using the `Just` publisher.
let publishers: [SimplePublisher] = [
SimplePublisher(1),
SimplePublisher(2),
SimplePublisher(3),
]
// we'll turn our array of publishers into a single merged publisher
let publisherOfPublishers = Publishers.MergeMany(publishers)
// Then we `collect` all the individual publisher elements results into
// a single array
let finalPublisher = publisherOfPublishers.collect()
// Let's test what we expect to happen, will happen.
// We'll create a scheduler to run our test on
let testScheduler = TestScheduler()
// Then we'll start a test. Our test will subscribe to our publisher
// at a virtual time of 200, and cancel the subscription at 900
let testableSubscriber = testScheduler.start { finalPublisher }
// we're expecting that, immediately upon subscription, our results will
// arrive. This is because we're using `just` type publishers which
// dispatch their contents as soon as they're subscribed to
XCTAssertEqual(testableSubscriber.recordedOutput, [
(200, .subscription), // we're expecting to subscribe at 200
(200, .input([1, 2, 3])), // then receive an array of results immediately
(200, .completion(.finished)), // the `collect` operator finishes immediately after completion
])
}
}
I think that Publishers.MergeMany could be of help here. In your example, you might use it like so:
func createIngredients(ingredients: [Ingredient]) -> AnyPublisher<CreateIngredientMutation.Data, Error> {
let publishers = ingredients.map(createIngredient(ingredient:))
return Publishers.MergeMany(publishers).eraseToAnyPublisher()
}
That will give you a publisher that sends you single values of the Output.
However, if you specifically want the Output in an array all at once at the end of all your publishers completing, you can use collect() with MergeMany:
func createIngredients(ingredients: [Ingredient]) -> AnyPublisher<[CreateIngredientMutation.Data], Error> {
let publishers = ingredients.map(createIngredient(ingredient:))
return Publishers.MergeMany(publishers).collect().eraseToAnyPublisher()
}
And you could simplify either of the above examples into a single line if you prefer, i.e.:
func createIngredients(ingredients: [Ingredient]) -> AnyPublisher<CreateIngredientMutation.Data, Error> {
Publishers.MergeMany(ingredients.map(createIngredient(ingredient:))).eraseToAnyPublisher()
}
You could also define your own custom merge() extension method on Sequence and use that to simplify the code slightly:
extension Sequence where Element: Publisher {
func merge() -> Publishers.MergeMany<Element> {
Publishers.MergeMany(self)
}
}
func createIngredients(ingredients: [Ingredient]) -> AnyPublisher<CreateIngredientMutation.Data, Error> {
ingredients.map(createIngredient).merge().eraseToAnyPublisher()
}
To add on to the answer by Tricky, here is a solution that retains the order of elements in the array.
It passes an index for each element through the whole chain, and sorts the collected array by the index.
Complexity should be O(n log n) because of the sorting.
import Combine
extension Publishers {
private struct EnumeratedElement<T> {
let index: Int
let element: T
init(index: Int, element: T) {
self.index = index
self.element = element
}
init(_ enumeratedSequence: EnumeratedSequence<[T]>.Iterator.Element) {
index = enumeratedSequence.offset
element = enumeratedSequence.element
}
}
static func mergeMappedRetainingOrder<InputType, OutputType>(
_ inputArray: [InputType],
mapTransform: @escaping (InputType) -> AnyPublisher<OutputType, Error>
) -> AnyPublisher<[OutputType], Error> {
let enumeratedInputArray = inputArray.enumerated().map(EnumeratedElement.init)
let enumeratedMapTransform: (EnumeratedElement<InputType>) -> AnyPublisher<EnumeratedElement<OutputType>, Error> = { enumeratedInput in
mapTransform(enumeratedInput.element)
.map { EnumeratedElement(index: enumeratedInput.index, element: $0)}
.eraseToAnyPublisher()
}
let sortEnumeratedOutputArrayByIndex: ([EnumeratedElement<OutputType>]) -> [EnumeratedElement<OutputType>] = { enumeratedOutputArray in
enumeratedOutputArray.sorted { $0.index < $1.index }
}
let transformToNonEnumeratedArray: ([EnumeratedElement<OutputType>]) -> [OutputType] = {
$0.map { $0.element }
}
return Publishers.MergeMany(enumeratedInputArray.map(enumeratedMapTransform))
.collect()
.map(sortEnumeratedOutputArrayByIndex)
.map(transformToNonEnumeratedArray)
.eraseToAnyPublisher()
}
}
Unit test for the solution:
import XCTest
import Combine
final class PublishersExtensionsTests: XCTestCase {
// MARK: - Private properties
private var cancellables = Set<AnyCancellable>()
// MARK: - Tests
func test_mergeMappedRetainingOrder() {
let expectation = expectation(description: "mergeMappedRetainingOrder publisher")
let numbers = (1...100).map { _ in Int.random(in: 1...3) }
let mapTransform: (Int) -> AnyPublisher<Int, Error> = {
let delayTimeInterval = RunLoop.SchedulerTimeType.Stride(Double($0))
return Just($0)
.delay(for: delayTimeInterval, scheduler: RunLoop.main)
.setFailureType(to: Error.self)
.eraseToAnyPublisher()
}
let resultNumbersPublisher = Publishers.mergeMappedRetainingOrder(numbers, mapTransform: mapTransform)
resultNumbersPublisher.sink(receiveCompletion: { _ in }, receiveValue: { resultNumbers in
XCTAssertTrue(numbers == resultNumbers)
expectation.fulfill()
}).store(in: &cancellables)
waitForExpectations(timeout: 5)
}
}
You can do it in one line:
.flatMap(Publishers.Sequence.init(sequence:))
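Spelled out against the question's code, that might look like the following sketch; it behaves like the MergeMany + collect approach above:
func createIngredients(ingredients: [Ingredient]) -> AnyPublisher<[CreateIngredientMutation.Data], Error> {
    Just(ingredients)
        .setFailureType(to: Error.self)
        .flatMap(Publishers.Sequence.init(sequence:)) // emit each Ingredient individually
        .flatMap(createIngredient) // one request per Ingredient
        .collect() // gather all responses into a single array
        .eraseToAnyPublisher()
}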

How to use pagination using RxSwift and Alamofire?

I am trying to consume an API with Alamofire and RxSwift. I have written the methods, but the onNext of the observer is getting called only once. I am trying to do it with a recursive call. What is wrong with this?
The API returns 10 objects at a time based on the timestamp, so I check whether the array that was just returned contains 10 objects. If it does, there are more; if not, that's the end.
func fetchPersonalization(fromList:[Personalization],timeStamp:Int) -> Observable<PersonalizationContainer>
{
let dictHeader = ["Accept":"application/json","regid" : pushtoken , "os" : "ios" , "token" : token , "App-Version" : "1324" , "Content-Type" : "application/json"]
return fetchPersonalizationUtil(dictHeader: dictHeader, timeStamp: timeStamp)
.flatMap { (perList) -> Observable<PersonalizationContainer> in
let persoList:[Personalization] = perList.list
let finalList = fromList + persoList
if(persoList.count==10){
let newTimeStamp = persoList.last!.lastModifiedAt! - 1
return Observable.merge(Observable.just(PersonalizationContainer(l: finalList, d: perList.data)),
self.fetchPersonalization(fromList:finalList,timeStamp: newTimeStamp)
)
//self.fetchPersonalization(fromList:finalList,timeStamp: newTimeStamp)
}else {
return Observable.just(PersonalizationContainer(l: finalList, d: Data()))
}
}
}
func fetchPersonalizationUtil(dictHeader:[String:String],timeStamp:Int) -> Observable<PersonalizationContainer>
{
return Observable<PersonalizationContainer>.create({ (observer) -> Disposable in
Alamofire.request("https://mranuran.com/api/hubs/personalization/laterthan/\(timeStamp)/limit/10/" ,headers: dictHeader).responseData { response in
if let json = response.result.value {
//print("HUBs JSON: \(json)")
do {
let list = try JSONDecoder().decode([Personalization].self, from: json)
let pContainer = PersonalizationContainer(l: list, d: json)
print("ANURAN \(list[0].name)")
observer.onNext(pContainer)
observer.onCompleted()
}catch {
print(error)
observer.onError(error)
}
}
else{
observer.onError(response.result.error!)
}
}
return Disposables.create()
})
}
I put a breakpoint on the onNext method and it seems to be getting called only once. I've been stuck on this for hours, and even with RxSwift's GithubRepo example in their official GitHub repo, I can't figure out what they are doing. What could be wrong with my approach?
I wrote this up a while back using Promises; here it is using Singles.
You pass in:
the seed which is used to make the first network call.
the pred which will be given the results of the most recent call and produces either an argument to make the next network call or nil if done. (In here is where you would check the count and return the next time stamp if another call is required.)
the producer which makes the network call.
It eventually returns a Single with an array of all the results. It will error if any of the internal network calls error out.
func accumulateWhile<T, U>(seed: U, pred: @escaping (T) -> U?, producer: @escaping (U) -> Single<T>) -> Single<[T]> {
return Single.create { observer in
var disposable = CompositeDisposable()
var accumulator: [T] = []
let lock = NSRecursiveLock()
func loop(_ u: U) {
let product = producer(u)
let subDisposable = product.subscribe { event in
lock.lock(); defer { lock.unlock() }
switch event {
case let .success(value):
accumulator += [value]
if let u = pred(value) {
loop(u)
}
else {
observer(.success(accumulator))
}
case let .error(error):
observer(.error(error))
}
}
_ = disposable.insert(subDisposable)
}
loop(seed)
return disposable
}
}
I don't think the lock is actually necessary, but I put it in just in case.
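Applied to the question, usage could look something like this sketch; initialTimeStamp is hypothetical, and fetchPersonalizationUtil/dictHeader come from the question:
let allPages: Single<[PersonalizationContainer]> = accumulateWhile(
    seed: initialTimeStamp,
    pred: { container in
        // Keep paging only while full pages of 10 keep coming back.
        container.list.count == 10 ? container.list.last!.lastModifiedAt! - 1 : nil
    },
    producer: { timeStamp in
        fetchPersonalizationUtil(dictHeader: dictHeader, timeStamp: timeStamp).asSingle()
    }
)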
I've improved on Daniel T.'s answer by adding a next-page loading trigger. This is useful when the next page should be loaded only when the user scrolls to the bottom of a UITableView, or in similar cases.
The first page is loaded immediately upon subscription, and each subsequent page right after a signal is received on the nextPageTrigger parameter.
Example usage:
let contents = loadPagesLazily(
seed: 1,
requestProducer: { (pageNumber: Int) -> Single<ResponseContainer<[Content]>> in
return dataSource.loadContent(page: pageNumber, pageSize: 20)
},
nextKeySelector: { (responseContainer: ResponseContainer<[Content]>) -> Int? in
let hasMorePages = responseContainer.meta.currentPage < responseContainer.meta.lastPage
return hasMorePages ? responseContainer.meta.currentPage + 1 : nil
},
nextPageTrigger: loadMoreTrigger
)
return contents
.scan([], accumulator: { (accumulator, nextPageContainer) -> SearchResults in
accumulator + nextPageContainer.data
})
Parameters:
seed - loading information (a PageKey) for the first page
requestProducer - transforms each PageKey into a Single that loads that page
nextKeySelector - creates the next page's loading info based on the data retrieved by each requestProducer call. Return nil here if there is no next page.
nextPageTrigger - after the first page is received, each subsequent page is loaded only after a .next signal is received on this observable.
func loadPagesLazily<PageKey, Page>(
seed: PageKey,
requestProducer: @escaping (PageKey) -> Single<Page>,
nextKeySelector: @escaping (Page) -> PageKey?,
nextPageTrigger: Observable<Void>
) -> Observable<Page> {
return requestProducer(seed)
.asObservable()
.flatMap({ (response) -> Observable<Page> in
let nextPageKey = nextKeySelector(response)
let nextPageLoader: Observable<Page> = nextPageKey
.map { (meta) -> Observable<Page> in
nextPageTrigger.take(1)
.flatMap { (_) -> Observable<Page> in
loadPagesLazily(
seed: meta,
requestProducer: requestProducer,
nextKeySelector: nextKeySelector,
nextPageTrigger: nextPageTrigger
)
}
} ?? Observable.empty()
// Concatenate self and next page recursively
return Observable
.just(response)
.concat(nextPageLoader)
})
}
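For the nextPageTrigger itself, something like the table view's scroll position can be used (a sketch assuming RxCocoa; the threshold is arbitrary):
func makeLoadMoreTrigger(for tableView: UITableView) -> Observable<Void> {
    return tableView.rx.contentOffset
        .filter { offset in
            // Fire when the user has scrolled close to the bottom.
            offset.y + tableView.bounds.height >= tableView.contentSize.height - 100
        }
        .map { _ in () }
}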