Swift Combine chain requests

I'm attempting to use Combine to chain two requests together. The code is pretty rough, but I need to make two API requests: one to get the schedule data, then one for the live data. I'm able to get the live data (the second request), but how do I get the schedule data (the first request)? I'm having a hard time understanding how to use Combine to chain two requests together; this is my first time needing Combine, for a widget I'm working on. I'm still fresh to Swift, so my terminology may be lacking.
My last code example wasn't correct and my question was unclear. I have two publishers, and the second one depends on the first. I'm still unclear on how to handle the data from my first publisher, as well as the second publisher's data inside .flatMap. Does it need to be an ObservableObject class with @Published variables for the data? Do I use .assign or .sink to get the data into my Codable types Schedule and Live? Articles on this seem a bit too advanced for me, as they create custom extensions and remodel the API data into nested types.
New example code
import Foundation
import Combine

class DataGroup {
    // How do I get data out of the Schedule and Live Codable types? Do I use a variable and .assign, or .sink?
    // Where do I put the subscriber?
    func requestSchedule(_ teamID: Int) -> AnyPublisher<Schedule, Error> {
        let url = URL(string: "https://statsapi.web.nhl.com/api/v1/schedule?teamId=\(teamID)")!
        return URLSession
            .shared.dataTaskPublisher(for: url)
            .map(\.data)
            .decode(type: Schedule.self, decoder: JSONDecoder())
            .flatMap { self.fetchLiveFeed($0.dates.first?.games.first?.link ?? "") }
            /*
            .flatMap { URLSession.shared.dataTaskPublisher(for: URL(string: $0.dates.first?.games.first?.link ?? "")!) }
            */
            .eraseToAnyPublisher()
    }

    // Remove this and call URLSession.shared.dataTaskPublisher inside the flatMap instead?
    func fetchLiveFeed(_ link: String) -> AnyPublisher<Live, Error> {
        let url = URL(string: "https://statsapi.web.nhl.com\(link)")!
        return URLSession.shared.dataTaskPublisher(for: url)
            .map(\.data)
            .decode(type: Live.self, decoder: JSONDecoder())
            .eraseToAnyPublisher()
    }
}
OLD
import Foundation
import Combine

class CombineData {
    var schedule: Schedule? // Get schedule data alongside live data
    var live: Live?
    private var cancellables = Set<AnyCancellable>()

    func fetchSchedule(_ teamID: Int, _ completion: @escaping (/* Schedule, */ Live) -> Void) {
        let url = URL(string: "https://statsapi.web.nhl.com/api/v1/schedule?teamId=\(teamID)")!
        URLSession.shared.dataTaskPublisher(for: url)
            .map { $0.data }
            .decode(type: Schedule.self, decoder: JSONDecoder())
            .flatMap { self.fetchLiveFeed($0.dates.first?.games.first?.link ?? "") }
            .receive(on: DispatchQueue.main)
            .sink(receiveCompletion: { _ in }) { data in
                // How to get both schedule data and live data here?
                //self.schedule = ?
                self.live = data
                print(data)
                completion(self.schedule!, self.live!)
            }.store(in: &cancellables)
    }

    func fetchLiveFeed(_ link: String) -> AnyPublisher<Live, Error> {
        let url = URL(string: "https://statsapi.web.nhl.com\(link)")!
        return URLSession.shared.dataTaskPublisher(for: url)
            .map(\.data)
            .decode(type: Live.self, decoder: JSONDecoder())
            .eraseToAnyPublisher()
    }
}

The general idea is to use a flatMap for chaining, which is what you did, but if you also need the original value, you can return a Zip publisher (created with the .zip operator) that puts the two results into a tuple.
One of the zipped publishers is the second request, and the other should just emit the original value. You can typically do that with Just(one), but you have to make sure that its failure type (which is Never) matches the other publisher's. You can match the failure type with .setFailureType(to:):
publisher1
    .flatMap { one in
        Just(one).setFailureType(to: Error.self) // error has to match publisher2
            .zip(publisher2(with: one))
    }
    .sink(receiveCompletion: { completion in
        // ...
    }, receiveValue: { (one, two) in
        // ...
    })
Alternatively, you can use Result.Publisher which would infer the error (but might look somewhat odd):
.flatMap { one in
    Result.Publisher(.success(one))
        .zip(publisher2)
}
So, in your case it's going to be something like this:
URLSession.shared.dataTaskPublisher(for: url)
    .map(\.data)
    .decode(type: Schedule.self, decoder: JSONDecoder())
    .flatMap {
        Result.Publisher(.success($0))
            .zip(self.fetchLiveFeed($0.dates.first?.games.first?.link ?? ""))
    }
    .sink(receiveCompletion: { completion in
        // ...
    }, receiveValue: { (schedule, live) in
        // ...
    })
    .store(in: &cancellables)
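As for the side questions about ObservableObject, @Published, .assign and .sink: you don't need an ObservableObject for the chaining itself, but if the widget wants observable state, a minimal sketch could look like the following. It assumes your existing Schedule and Live types, and assumes requestSchedule(_:) has been changed to return the zipped AnyPublisher<(Schedule, Live), Error> from above. Here .sink is the simpler fit, because .assign can only write one value to one key path:
import Foundation
import Combine

final class ScheduleViewModel: ObservableObject {
    @Published var schedule: Schedule?
    @Published var live: Live?
    private var cancellables = Set<AnyCancellable>()
    private let dataGroup = DataGroup()

    func load(teamID: Int) {
        // Assumes requestSchedule(_:) now returns AnyPublisher<(Schedule, Live), Error>
        // by zipping the Schedule with the Live feed, as shown above.
        dataGroup.requestSchedule(teamID)
            .receive(on: DispatchQueue.main)            // update published properties on the main thread
            .sink(receiveCompletion: { completion in
                if case .failure(let error) = completion { print(error) }
            }, receiveValue: { [weak self] schedule, live in
                self?.schedule = schedule               // data from the first request
                self?.live = live                       // data from the second request
            })
            .store(in: &cancellables)
    }
}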

Related

Swift Combine complex api request

I just started learning Combine, and I can't figure out how to make a complex request to the API.
I need to create an application where the user can enter the name of a company's GitHub account in an input field and get a list of its open repositories and their branches.
There are two API methods:
https://api.github.com/orgs/<ORG_NAME>/repos This method returns a list of organization account repositories by name. For example, you can try to request a list of Apple's repositories https://api.github.com/orgs/apple/repos
struct for this method
struct Repository: Decodable {
    let name: String
    let language: String?

    enum Seeds {
        public static let empty = Repository(name: "", language: "")
    }
}
https://api.github.com/repos/<ORG_NAME>/<REPO_NAME>/branches This method will be needed to get the branch names in the specified repository.
struct for this method
struct Branch: Decodable {
    let name: String
}
As a result, I need to get an array of such structures.
struct BranchSectionModel {
    var name: Repository
    var branchs: [Branch]
}
For this I have two functions:
func loadRepositorys(orgName: String) -> AnyPublisher<[Repository], Never> {
    guard let url = URL(string: "https://api.github.com/orgs/\(orgName)/repos") else {
        return Just([])
            .eraseToAnyPublisher()
    }
    return URLSession.shared.dataTaskPublisher(for: url)
        .map { $0.data }
        .decode(type: [Repository].self, decoder: JSONDecoder())
        .replaceError(with: [])
        .receive(on: RunLoop.main)
        .eraseToAnyPublisher()
}
and
func loadBranchs(orgName: String, repoName: String) -> AnyPublisher<[Branch], Never> {
    guard let url = URL(string: "https://api.github.com/repos/\(orgName)/\(repoName)/branches") else {
        return Just([])
            .eraseToAnyPublisher()
    }
    return URLSession.shared.dataTaskPublisher(for: url)
        .map { $0.data }
        .decode(type: [Branch].self, decoder: JSONDecoder())
        .replaceError(with: [])
        .receive(on: RunLoop.main)
        .eraseToAnyPublisher()
}
Both of these functions work separately, but I don't know how to end up with a [BranchSectionModel]. I guess I should use flatMap and sink, but I don't understand how.
I do not understand how to combine these two requests into one stream.
When you're looking to convert one publisher into another, .map and .switchToLatest are the tools to reach for. In this case, since you're also looking to turn one publisher into many (and then back down into one), MergeMany will also be useful:
loadRepositorys(orgName: orgName)
    .map { repos in
        Publishers.MergeMany(repos.map { repo in
            loadBranchs(orgName: orgName, repoName: repo.name)
                .map { branches in
                    BranchSectionModel(name: repo, branchs: branches)
                }
        })
        .collect(repos.count)
    }
    .switchToLatest()
    .sink { result in
        print("---")
        print(result)
    }
    .store(in: &cancellables)
Although I'm a big fan of Combine, I don't think it's particularly well suited to this task compared with async/await, which will probably be a little less confusing and look cleaner. As a learning exercise it's a great one, but if I were tackling this problem in the real world, async/await would likely be my go-to.
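For comparison, here is a rough sketch of what the async/await version could look like, assuming the same Repository, Branch, and BranchSectionModel types and the same two GitHub endpoints (this is not part of the original answer):
// Rough async/await sketch. Error handling is collapsed to empty results,
// mirroring replaceError(with: []) above. A TaskGroup could fetch the
// branches in parallel, which would be closer to what MergeMany does.
func loadBranchSections(orgName: String) async -> [BranchSectionModel] {
    guard let reposURL = URL(string: "https://api.github.com/orgs/\(orgName)/repos"),
          let (reposData, _) = try? await URLSession.shared.data(from: reposURL),
          let repos = try? JSONDecoder().decode([Repository].self, from: reposData) else {
        return []
    }
    var sections: [BranchSectionModel] = []
    for repo in repos {
        guard let branchesURL = URL(string: "https://api.github.com/repos/\(orgName)/\(repo.name)/branches"),
              let (branchData, _) = try? await URLSession.shared.data(from: branchesURL),
              let branches = try? JSONDecoder().decode([Branch].self, from: branchData) else {
            continue
        }
        sections.append(BranchSectionModel(name: repo, branchs: branches))
    }
    return sections
}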

Combine: how to chain and then recombine one-to-many network queries

I'm trying to understand how to chain and then recombine one-to-many network queries using Combine.
I have an initial request that retrieves some JSON data, decodes it and maps that to a list of IDs:
let _ = URLSession.shared
    .dataTaskPublisher(for: url)
    .receive(on: apiQueue)
    .map(\.data)
    .decode(type: MyMainResultType.self, decoder: JSONDecoder())
    .map { $0.results.map { $0.id } } // 'results' is a struct containing 'id', among others
    // .sink() and .store() omitted
This gives me the expected array of ints: [123, 456, ...]
For each of the ints I'd like to start another request that queries another endpoint using that ID as a parameter, retrieves some JSON, extracts an appropriate piece of data and then recombines that with the ID to give me a final list of [(id, otherData), ...].
The second request is working as a function in isolation, with its own sink() and store(), and also as an AnyPublisher<>.
I've tried any number of map { Publishers.Sequence ... }, .flatMap(), .combine(), etc., but I think my mental model of what's happening may be incorrect.
What I think I should be doing is map() each int to the secondary details-request publisher, then do a flatMap() to get back a single publisher, and collect() all the results, possibly with another map to bring the ID back in, but nothing seems to give me the simple list described above at the end.
How can I take my list of ints and spawn a number of further requests, waiting until all of them have completed, and then reassemble the id and the additional info into a single Combine chain?
TIA!
After the existing pipeline, you should first flatMap to Publishers.Sequence:
.flatMap(\.publisher)
This turns your publisher from one that publishes arrays of things into one that publishes those arrays' elements.
Then do another flatMap to the URL session data task publisher, with all the steps to extract otherData attached. Note that at the very end is where we associate id with otherData:
.flatMap { id in
    URLSession.shared.dataTaskPublisher(
        // as an example
        for: URL(string: "https://example.com/?id=\(id)")!
    )
    .receive(on: apiQueue)
    .map(\.data)
    .decode(type: Foo.self, decoder: JSONDecoder())
    .map { (id, $0.otherData) } // associate id with otherData
}
Then you can collect() to turn it into a publisher that publishes an array only.
Full version:
// this is of type AnyPublisher<[(Int, Int)], Error>
let _ = URLSession.shared
    .dataTaskPublisher(for: url)
    .receive(on: apiQueue)
    .map(\.data)
    .decode(type: MyMainResultType.self, decoder: JSONDecoder())
    .map { $0.results.map { $0.id } }
    .flatMap(\.publisher)
    .flatMap { id in
        URLSession.shared.dataTaskPublisher(
            // as an example
            for: URL(string: "https://example.com/?id=\(id)")!
        )
        .receive(on: apiQueue)
        .map(\.data)
        .decode(type: Foo.self, decoder: JSONDecoder())
        .map { (id, $0.otherData) } // associate id with otherData
    }
    .collect()
    .eraseToAnyPublisher()

Generic Func: Key path value type '[T]' cannot be converted

I'm playing with Combine to learn it and improve my reactive programming skills, and I'm trying to create a generic class that converts data to my T type.
I have this error, and I don't understand why:
Key path value type '[T]' cannot be converted to contextual type 'T'
class Fetcher<T: Codable>: ObservableObject {
    private var task: AnyCancellable?
    @Published var result = [T]()

    init<T: Codable>(type: T.Type) {
        guard let url = URL(string: "https://api.example.com") else { return }
        task = URLSession.shared.dataTaskPublisher(for: url)
            .map { $0.data }
            .decode(type: T.self, decoder: JSONDecoder())
            .receive(on: DispatchQueue.global(qos: .background))
            .replaceError(with: T.self as! T)
            .assign(to: \.result, on: self)
    }
}
Since the URL gives you an array of Ts, you should decode an array, rather than a single T, in the decode call. This line:
.decode(type: T.self, decoder: JSONDecoder())
should be:
.decode(type: [T].self, decoder: JSONDecoder())
That replaceError call is going to crash your app, as T.self is not a T (it's a T.Type), and you are force-casting. Since you are receiving an array, a logical choice for a value to replace errors with is the empty array []:
.replaceError(with: [])
Also, remove your generic parameter on init:
init() {
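Putting the three fixes together, the whole class might look something like this (a sketch only; the URL is still the placeholder from the question, the endpoint is assumed to return a JSON array of T, and receive(on:) is moved to the main queue so the @Published property is updated on the main thread, which the answer doesn't cover):
import Foundation
import Combine

class Fetcher<T: Codable>: ObservableObject {
    private var task: AnyCancellable?
    @Published var result = [T]()

    init() {
        // Placeholder endpoint from the question; assumed to return a JSON array of T.
        guard let url = URL(string: "https://api.example.com") else { return }
        task = URLSession.shared.dataTaskPublisher(for: url)
            .map { $0.data }
            .decode(type: [T].self, decoder: JSONDecoder()) // decode [T], not T
            .replaceError(with: [])                         // fall back to an empty array on error
            .receive(on: DispatchQueue.main)                // update the @Published value on the main thread
            .assign(to: \.result, on: self)
    }
}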

Swift Combine - Accessing separate lists of publishers

I have two lists of URLs that return some links to images.
The lists are passed into a Future like this:
static func loadRecentEpisodeImagesFuture(request: [URL]) -> AnyPublisher<[RecentEpisodeImages], Never> {
    return Future { promise in
        print(request)
        networkAPI.recentEpisodeImages(url: request)
            .sink(receiveCompletion: { _ in },
                  receiveValue: { recentEpisodeImages in
                      promise(.success(recentEpisodeImages))
                  })
            .store(in: &recentImagesSubscription)
    }
    .eraseToAnyPublisher()
}
Which calls:
/// Get a list of image sizes associated with a featured episode.
func featuredEpisodeImages(featuredUrl: [URL]) -> AnyPublisher<[FeaturedEpisodeImages], Error> {
    let featuredEpisodesImages = featuredUrl.map { (featuredUrl) -> AnyPublisher<FeaturedEpisodeImages, Error> in
        return URLSession.shared
            .dataTaskPublisher(for: featuredUrl)
            .map(\.data)
            .decode(type: FeaturedEpisodeImages.self, decoder: decoder)
            .receive(on: networkApiQueue)
            .catch { _ in Empty<FeaturedEpisodeImages, Error>() }
            .print("###Featured###")
            .eraseToAnyPublisher()
    }
    return Publishers.MergeMany(featuredEpisodesImages).collect().eraseToAnyPublisher()
}

/// Get a list of image sizes associated with a recent episode.
func recentEpisodeImages(recentUrl: [URL]) -> AnyPublisher<[RecentEpisodeImages], Error> {
    let recentEpisodesImages = recentUrl.map { (recentUrl) -> AnyPublisher<RecentEpisodeImages, Error> in
        return URLSession.shared
            .dataTaskPublisher(for: recentUrl)
            .map(\.data)
            .decode(type: RecentEpisodeImages.self, decoder: decoder)
            .receive(on: networkApiQueue)
            .catch { _ in Empty<RecentEpisodeImages, Error>() }
            .print("###Recent###")
            .eraseToAnyPublisher()
    }
    return Publishers.MergeMany(recentEpisodesImages).collect().eraseToAnyPublisher()
}
and is attached to the app state:
/// Takes an action and returns a future mapped to another action.
static func recentEpisodeImages(action: RequestRecentEpisodeImages) -> AnyPublisher<Action, Never> {
    return loadRecentEpisodeImagesFuture(request: action.request)
        .receive(on: networkApiQueue)
        .map({ images in ResponseRecentEpisodeImages(response: images) })
        .replaceError(with: RequestFailed())
        .eraseToAnyPublisher()
}
It seems that:
return Publishers.MergeMany(recentEpisodes).collect().eraseToAnyPublisher()
doesn't give me a reliable downstream value, as whichever response finishes last overwrites the earlier response.
I am able to log the responses of both series of requests. Both are processing the correct arrays and returning the proper JSON.
I would like something like:
return recentEpisodeImages
but currently this gives me the error
Cannot convert return expression of type '[AnyPublisher<RecentEpisodeImages, Error>]' to return type 'AnyPublisher<[RecentEpisodeImages], Error>'
How can I collect the values of the inner publisher and return them as
AnyPublisher<[RecentEpisodeImages], Error>
Presuming that the question is how to turn an array of URLs into an array of what you get when you download and process the data from those URLs, the answer is: turn the array into a sequence publisher, process each URL by way of flatMap, and collect the result.
Here, for instance, is how to turn an array of URLs representing images into an array of the actual images (not identically what you're trying to do, but probably pretty close):
func publisherOfArrayOfImages(urls: [URL]) -> AnyPublisher<[UIImage], Error> {
    urls.publisher
        .flatMap { (url: URL) -> AnyPublisher<UIImage, Error> in
            return URLSession.shared.dataTaskPublisher(for: url)
                .compactMap { UIImage(data: $0.0) }
                .mapError { $0 as Error }
                .eraseToAnyPublisher()
        }
        .collect()
        .eraseToAnyPublisher()
}
And here's how to test it:
let urls = [
    URL(string: "http://www.apeth.com/pep/moe.jpg")!,
    URL(string: "http://www.apeth.com/pep/manny.jpg")!,
    URL(string: "http://www.apeth.com/pep/jack.jpg")!,
]
let pub = publisherOfArrayOfImages(urls: urls)
pub.sink { print($0) }
    receiveValue: { print($0) }
    .store(in: &storage)
You'll see that what pops out the bottom of the pipeline is an array of three images, corresponding to the array of three URLs we started with.
(Note, please, that the order of the resulting array is random. We fetched the images asynchronously, so the results arrive back at our machine in whatever order they please. There are ways around that problem, but it is not what you asked about.)
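If the order does matter, one possible sketch (my own addition, not part of the original answer) is to tag each download with the index of its source URL and restore the order after collect():
// Sketch: same idea as publisherOfArrayOfImages, but each download carries its
// original index so the collected array can be sorted back into input order.
func publisherOfOrderedImages(urls: [URL]) -> AnyPublisher<[UIImage], Error> {
    urls.enumerated().publisher
        .flatMap { pair -> AnyPublisher<(Int, UIImage), Error> in
            let (index, url) = pair
            return URLSession.shared.dataTaskPublisher(for: url)
                .compactMap { UIImage(data: $0.data) }
                .map { (index, $0) }
                .mapError { $0 as Error }
                .eraseToAnyPublisher()
        }
        .collect()
        .map { pairs in
            pairs.sorted { $0.0 < $1.0 }.map { $0.1 } // restore input order, drop the indices
        }
        .eraseToAnyPublisher()
}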

How to process an array of task asynchronously with swift combine

I have a publisher that makes a network call and returns an array of IDs. I now need to make another network call for each ID to get the rest of my data, and I want the final publisher to emit the resulting object.
First network result:
"user": {
"id": 0,
"items": [1, 2, 3, 4, 5]
}
Final object:
struct User {
    let id: Int
    let items: [Item]
    ... other fields ...
}

struct Item {
    let id: Int
    ... other fields ...
}
Handling multiple network calls:
userPublisher.flatMap { user in
    let itemIDs = user.items
    return Future<[Item], Never>() { fulfill in
        ... OperationQueue of network requests ...
    }
}
I would like to perform the network requests in parallel, since they are not dependent on each other. I'm not sure if Future is right here, but I imagine I would then write code with a DispatchGroup or OperationQueue and fulfill the promise when they're all done. Is there more of a Combine way of doing this?
Does Combine have a concept of splitting one stream into many parallel streams and joining the streams back together?
Combine offers extensions around URLSession to handle network requests; unless you really need to integrate with OperationQueue-based networking, you don't need Future, though it is a fine candidate if you do. You can run multiple Futures and collect them at some point, but I'd really suggest looking at the URLSession extensions for Combine:
struct User: Codable {
    var username: String
}

let requestURL = URL(string: "https://example.com/")!
let publisher = URLSession.shared.dataTaskPublisher(for: requestURL)
    .map { $0.data }
    .decode(type: User.self, decoder: JSONDecoder())
Regarding running a batch of requests, it's possible to use Publishers.MergeMany, e.g.:
struct User: Codable {
    var username: String
}

let userIds = [1, 2, 3]

let subscriber = Just(userIds)
    .setFailureType(to: Error.self)
    .flatMap { (values) -> Publishers.MergeMany<AnyPublisher<User, Error>> in
        let tasks = values.map { (userId) -> AnyPublisher<User, Error> in
            let requestURL = URL(string: "https://jsonplaceholder.typicode.com/users/\(userId)")!
            return URLSession.shared.dataTaskPublisher(for: requestURL)
                .map { $0.data }
                .decode(type: User.self, decoder: JSONDecoder())
                .eraseToAnyPublisher()
        }
        return Publishers.MergeMany(tasks)
    }
    .collect()
    .sink(receiveCompletion: { (completion) in
        if case .failure(let error) = completion {
            print("Got error: \(error.localizedDescription)")
        }
    }) { (allUsers) in
        print("Got users:")
        allUsers.forEach { print("\($0)") }
    }
In the example above I use collect to gather all the results, which postpones emitting the value to the Sink until all of the network requests have finished successfully; however, you can remove the collect and receive each User one by one as the network requests complete.
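For illustration, a sketch of that variation (using the same User type and userIds as above; this is my own addition): without collect(), the sink's receiveValue fires once per completed request, in whatever order the requests happen to finish.
// Same pipeline without .collect(): each decoded User arrives individually.
let streamingSubscriber = Just(userIds)
    .setFailureType(to: Error.self)
    .flatMap { values -> Publishers.MergeMany<AnyPublisher<User, Error>> in
        let tasks = values.map { userId -> AnyPublisher<User, Error> in
            let requestURL = URL(string: "https://jsonplaceholder.typicode.com/users/\(userId)")!
            return URLSession.shared.dataTaskPublisher(for: requestURL)
                .map { $0.data }
                .decode(type: User.self, decoder: JSONDecoder())
                .eraseToAnyPublisher()
        }
        return Publishers.MergeMany(tasks)
    }
    .sink(receiveCompletion: { completion in
        if case .failure(let error) = completion {
            print("Got error: \(error.localizedDescription)")
        }
    }, receiveValue: { user in
        print("Got user: \(user)") // called once per finished request
    })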