Can we define an async/await function that returns one value instantly, as well as another value asynchronously (as you'd normally expect)? - swift

Picture an image loading function with a closure completion. Let's say it returns a token ID that you can use to cancel the asynchronous operation if needed.
@discardableResult
func loadImage(url: URL, completion: @escaping (Result<UIImage, Error>) -> Void) -> UUID? {
    if let image = loadedImages[url] {
        completion(.success(image))
        return nil
    }
    let id = UUID()
    let task = URLSession.shared.dataTask(with: url) { data, response, error in
        defer {
            self.requests.removeValue(forKey: id)
        }
        if let data, let image = UIImage(data: data) {
            DispatchQueue.main.async {
                self.loadedImages[url] = image
                completion(.success(image))
            }
            return
        }
        if let error = error as? NSError, error.code == NSURLErrorCancelled {
            return
        }
        // TODO: Handle response errors
        print(response as Any)
        completion(.failure(.loadingError))
    }
    task.resume()
    requests[id] = task
    return id
}

func cancelRequest(id: UUID) {
    requests[id]?.cancel()
    requests.removeValue(forKey: id)
    print("ImageLoader: cancelling request")
}
How would we accomplish this (elegantly) with Swift concurrency? Is it even possible or practical?

I haven't done much testing on this, but I believe this is what you're looking for. It allows you to simply await an image load, but you can cancel using the URL from somewhere else. It also merges near-simultaneous requests for the same URL so you don't re-download something you're in the middle of.
actor Loader {
    private var tasks: [URL: Task<UIImage, Error>] = [:]

    func loadImage(url: URL) async throws -> UIImage {
        if let imageTask = tasks[url] {
            return try await imageTask.value
        }

        let task = Task {
            // Rather than removing here, you could skip that and this would become a
            // cache of results. Making that correct would take more work than the
            // question asks for, so I won't go into it.
            defer { tasks.removeValue(forKey: url) }

            let data = try await URLSession.shared.data(from: url).0
            guard let image = UIImage(data: data) else {
                throw DecodingError.dataCorrupted(.init(codingPath: [],
                                                        debugDescription: "Invalid image"))
            }
            return image
        }

        tasks[url] = task
        return try await task.value
    }

    func cancelRequest(url: URL) {
        // Remove, and cancel if it's removed
        tasks.removeValue(forKey: url)?.cancel()
        print("ImageLoader: cancelling request")
    }
}
Calling it looks like:
let image = try await loader.loadImage(url: url)
And you can cancel a request if it's still pending using:
loader.cancelRequest(url: url)
A key lesson here is that it is very natural to access task.value multiple times. If the task has already completed, then it will just return immediately.
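For example (a small sketch using the loader above): two awaits on the same task are fine, and the second one comes back immediately once the work is done.
let task = Task { try await loader.loadImage(url: url) }

let first  = try await task.value   // suspends until the image has loaded
let second = try await task.value   // already finished, so this returns right away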

Return a task in a tuple or other structure.
In the cases where you don't care about the ID, do this:
try await imageTask(url: url).task.value
private var requests: [UUID: Task<UIImage, Swift.Error>] = [:]

func imageTask(url: URL) -> (id: UUID?, task: Task<UIImage, Swift.Error>) {
    switch loadedImages[url] {
    case let image?: return (id: nil, task: .init { image })
    case nil:
        let id = UUID()
        let task = Task {
            defer { requests[id] = nil }
            guard let image = UIImage(data: try await URLSession.shared.data(from: url).0)
            else { throw Error.loadingError }
            try Task.checkCancellation()
            Task { @MainActor in loadedImages[url] = image }
            return image
        }
        requests[id] = task
        return (id: id, task: task)
    }
}

Is it even possible or practical?
Yes to both.
As I say in a comment, I think you may be missing the fact that a Task is an object you can retain and later cancel. Thus, if you create an architecture where you apply an ID to a task as you ask for the task to start, you can use that same ID to cancel that task before it has returned.
Here's a simple demonstration. I've deliberately written it as Playground code (though I actually developed it in an iOS project).
First, here is a general TimeConsumer class that wraps a single time-consuming Task. We can ask for the task to be created and started, but because we retain the task, we can also cancel it midstream. It happens that my task doesn't return a value, but that's neither here nor there; it could if we wanted.
class TimeConsumer {
    var current: Task<(), Error>?

    func consume(seconds: Int) async throws {
        let task = Task {
            try await Task.sleep(for: .seconds(seconds))
        }
        current = task
        _ = await task.result
    }

    func cancel() {
        current?.cancel()
    }
}
Now then. In front of my TimeConsumer I'll put a TaskVendor actor. A TimeConsumer represents just one time-consuming task, but a TaskVendor has the ability to maintain multiple time-consuming tasks, identifying each task with an identifier.
actor TaskVendor {
    private var tasks = [UUID: TimeConsumer?]()

    func giveMeATokenPlease() -> UUID {
        let uuid = UUID()
        tasks[uuid] = nil
        return uuid
    }

    func beginTheTask(uuid: UUID) async throws {
        let consumer = TimeConsumer()
        tasks[uuid] = consumer
        try await consumer.consume(seconds: 10)
        tasks[uuid] = nil
    }

    func cancel(uuid: UUID) {
        tasks[uuid]??.cancel()
        tasks[uuid] = nil
    }
}
That's all there is to it! Observe how TaskVendor is configured. I can do three things: I can ask for a token (really my actual TaskVendor needn't bother doing this, but I wanted to centralize everything for generality); I can start the task with that token; and, optionally, I can cancel the task with that token.
So here's a simple test harness. Here we go!
let vendor = TaskVendor()

func test() async throws {
    let uuid = await vendor.giveMeATokenPlease()
    print("start")
    Task {
        try await Task.sleep(for: .seconds(2))
        print("cancel?")
        // await vendor.cancel(uuid: uuid)
    }
    try await vendor.beginTheTask(uuid: uuid)
    print("finish")
}

Task {
    try await test()
}
What you will see in the console is:
start
[two seconds later] cancel?
[eight seconds after that] finish
We didn't cancel anything; the word "cancel?" signals the place where our test might cancel, but we didn't, because I wanted to prove to you that this is working as we expect: it takes a total of 10 seconds between "start" and "finish", so sure enough, we are consuming the expected time fully.
Now uncomment the await vendor.cancel line. What you will see now is:
start
[two seconds later] cancel?
[immediately!] finish
We did it! We made a cancellable task vendor.

I'm including one possible answer to the question, for the benefit of others. I'll leave the question in place in case someone has another take on it.
The only way that I know of to have a 'one-shot' async method that returns a token before returning the async result is by adding an inout argument:
func loadImage(url: URL, token: inout UUID?) async -> Result<UIImage, Error> {
    token = UUID()
    //...
}
Which we may call like this:
var requestToken: UUID? = nil
let result = await imageLoader.loadImage(url: url, token: &requestToken)
However, this approach and the two-shot solution by @matt both seem fussy from the API design standpoint. Of course, as he suggests, this leads to a bigger question: how do we implement cancellation with Swift concurrency (ideally without too much overhead)? And indeed, using tasks and wrapper objects seems unavoidable, but it certainly seems like a lot of legwork for a fairly simple pattern.

Related

Cancelling the task of a URLSession.AsyncBytes doesn't seem to work

I want to download a large file, knowing the number of bytes transferred, and be able to cancel the download if necessary.
I know that this can be done by having a URLSessionDownloadTask and conforming to URLSessionDownloadDelegate, but I wanted to achieve it through an async/await mechanism, so I used URLSession.shared.bytes(from: url) and then a for-await-in loop to handle each byte.
The issue comes when trying to cancel the ongoing task, as even though the URLSession.AsyncBytes's Task has been cancelled, the for-await-in loop keeps processing bytes, so I'm assuming that the download is still ongoing.
I've tested it with this piece of code in a playground.
let url = URL(string: "https://example.com/large_file.zip")!
let (asyncBytes, _) = try await URLSession.shared.bytes(from: url)

DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
    asyncBytes.task.cancel()
}

var data = Data()
for try await byte in asyncBytes {
    data.append(byte)
    print(data.count)
}
I would have expected that, as soon as the task is cancelled, the download would have been stopped and, therefore, the for-await-in would stop processing bytes.
What am I missing here? Can these tasks not be effectively cancelled?
Canceling a URLSessionDataTask works fine with AsyncBytes. That having been said, even if the URLSessionDataTask is canceled, the AsyncBytes will continue to iterate through the bytes received prior to cancelation. But the data task does stop.
Consider experiment1:
@MainActor
class ViewModel: ObservableObject {
    private let url: URL = …
    private let session: URLSession = …
    private var cancelButtonTapped = false
    private var dataTask: URLSessionDataTask?

    @Published var bytesBeforeCancel = 0
    @Published var bytesAfterCancel = 0

    func experiment1() async throws {
        let (asyncBytes, _) = try await session.bytes(from: url)
        dataTask = asyncBytes.task

        var data = Data()
        for try await byte in asyncBytes {
            if cancelButtonTapped {
                bytesAfterCancel += 1
            } else {
                bytesBeforeCancel += 1
            }
            data.append(byte)
        }
    }

    func cancel() {
        dataTask?.cancel()
        cancelButtonTapped = true
    }
}
So, I canceled after 1 second (at which point I had iterated through 2,022 bytes), and it continues to iterate through the remaining 14,204 bytes that had been received prior to the cancelation of the URLSessionDataTask. But the download does stop successfully. (In my example, the actual asset being downloaded was 74mb.) When using URLSession, the data comes in packets, so it takes AsyncBytes a little time to get through everything that was actually received before the URLSession request was canceled.
You might consider canceling the Swift concurrency Task, rather than the URLSessionDataTask. (I really wish they did not use the same word, “task”, to refer to entirely different concepts!)
Consider experiment2:
@MainActor
class ViewModel: ObservableObject {
    private let url: URL = …
    private let session: URLSession = …
    private var cancelButtonTapped = false
    private var task: Task<Void, Error>?

    @Published var bytesBeforeCancel = 0
    @Published var bytesAfterCancel = 0

    func experiment2() async throws {
        task = Task { try await download() }
        try await task?.value
    }

    func cancel() {
        task?.cancel()
        cancelButtonTapped = true
    }

    func download() async throws {
        let (asyncBytes, _) = try await session.bytes(from: url)

        var data = Data()
        for try await byte in asyncBytes {
            try Task.checkCancellation()
            if cancelButtonTapped { // this whole `if` statement is no longer needed, but I've kept it here for comparison to the previous example
                bytesAfterCancel += 1
            } else {
                bytesBeforeCancel += 1
            }
            data.append(byte)
        }
    }
}
Without the try Task.checkCancellation() line, the behavior is almost the same as in experiment1. The cancelation of the Task with the AsyncBytes will result in the cancelation of the underlying URLSessionDataTask (but the sequence will continue to iterate through the bytes in the packets that were successfully received prior to cancelation). But with try Task.checkCancellation(), it will exit as soon as the Task is canceled.
TL;DR Read Rob's answer, but the iterator code and the partial download code are still handy, so I'm leaving this answer with corrections.
Okay, so I spent some time on this because I'm about to try to write my own cancellable URL stream object, and it appears that asyncBytes.task.cancel() is more along the lines of URLSession's finishTasksAndInvalidate() than invalidateAndCancel(). Since you are pointing your streaming task at a file that isn't really that large, the URLSessionDataTask has already gotten the bytes into the buffer.
You can see this when you change up the function a bit (see Rob's example as well):
func test_funcCondition(timeOut: TimeInterval, url: URL, session: URLSession) async throws {
    let (asyncBytes, _) = try await session.bytes(from: url)
    let deadLine = Date.now + timeOut
    var data = Data()

    func someConditionCheck(_ deadline: Date) -> Bool {
        Date.now > deadLine
    }

    for try await byte in asyncBytes {
        if someConditionCheck(deadLine) {
            asyncBytes.task.cancel()
            print("trying to cancel...")
        }
        // Wrong type of task! Should not work. if Task.isCancelled { print("cancelled") }
        data.append(byte)
        // just to reduce the amount of printing
        if data.count % 100 == 0 {
            print(data.count)
        }
    }
}
If you point the URL at "https://example.com/large_file.zip" like your example and make the time interval very short, the function will print "trying to cancel..." between the time your marker hits and the time the file completes. It does NOT, however, ever print "cancelled". (The task being cancelled is a URLSessionDataTask, not a Swift concurrency Task, so that line never would have worked.)
If you point either what you wrote or this function at a Server-Sent-Events stream, it will cancel out just fine. (While true, that's not in contrast to the other behavior, which also works just fine. There are just bigger pauses in SSE data.)
If that isn't what you want, and you want to be able to start-stop streams mid-chunk, maybe explore a custom delegate (something I haven't done yet myself), or go work with AVFoundation if that's an option, because they've thought a lot about working with large streaming files. I did not check making my own session and running session.invalidateAndCancel() on it instead, because that seems kind of extreme, but it may be the way to go if you want to flush the buffer immediately.
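As a rough, untested sketch of that last idea (the function name, the ephemeral configuration, and the one-second trigger are all mine): give the download its own session, so invalidateAndCancel() tears down everything for that session, buffered bytes included.
func hardCancelDownload(url: URL) async throws -> Data {
    // Dedicated session, so invalidating it affects only this download.
    let session = URLSession(configuration: .ephemeral)

    // Placeholder trigger: hard-cancel after one second.
    Task {
        try await Task.sleep(nanoseconds: 1_000_000_000)
        session.invalidateAndCancel() // should make the loop below throw a cancellation error
    }

    let (asyncBytes, _) = try await session.bytes(from: url)
    var data = Data()
    for try await byte in asyncBytes {
        data.append(byte)
    }
    return data
}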
The below will work to stop caring about the buffer immediately. It involves making a custom iterator, but it seems kind of quirky and may not in fact arrest the downloading (still costing users data and power). I haven't looked into how the stream protocol relates to the network protocol at that lower level; if you stop asking, does it stop getting? I don't know. The cancel will arrest the stream, letting through the bytes that are already in the buffer, but your code won't get them. On my to-do list now is to look into how to change buffering policies.
Rob's code seems a nicer way to go and takes advantage of a concurrency Task.
func test_customIterator(timeOut: TimeInterval, url: URL, session: URLSession) async throws {
    let (asyncBytes, _) = try await session.bytes(from: url)
    let deadLine = Date.now + timeOut
    var data = Data()

    func someConditionCheck(_ deadline: Date) -> Bool {
        Date.now > deadLine
    }

    // could also be asyncBytes.lines.makeAsyncIterator(), etc.
    var iterator = asyncBytes.makeAsyncIterator()
    while !someConditionCheck(deadLine) {
        //await Task.yield()
        let byte = try await iterator.next()
        data.append(byte!)
        print(data.count)
    }

    // make sure to still tell URLSession you aren't listening anymore.
    // It may auto-close but that's not how I roll.
    asyncBytes.task.cancel()
}

let tap_out: TimeInterval = 0.0005
try await test_customIterator(timeOut: tap_out, url: URL(string: "https://example.com/large_file.zip")!, session: URLSession.shared)
Interesting flavor of behavior. Thanks for pointing it out. Also, I didn't know that the task was already available (asyncBytes.task). Thanks for that. (Correction: asyncBytes.task is a URLSessionDataTask, not a concurrency Task.)
UPDATED TO ADD:
To get part of the file explicitly
//https://developer.mozilla.org/en-US/docs/Web/HTTP/Range_requests
func requestInChunks(data:inout Data, url:URL, session:URLSession, offset:Int, length:Int) async throws {
var urlRequest = URLRequest(url: url)
urlRequest.addValue("bytes=\(offset)-\(offset + length - 1)", forHTTPHeaderField: "Range")
let (asyncBytes, response) = try await
session.bytes(for: urlRequest, delegate: nil)
guard (response as? HTTPURLResponse)?.statusCode == 206 else { //NOT 200!!
throw APIngError("The server responded with an error.")
}
for try await byte in asyncBytes {
data.append(byte)
if data.count % 100 == 0 {
print(data.count)
}
}
}
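A hypothetical way to drive that helper, pulling a file down one 64 KB range at a time (the chunk size, the loop, and the end-of-file check are my own assumptions; a file whose size is an exact multiple of the chunk size would need one extra guard against a final 416 response):
var assembled = Data()
let chunkSize = 65_536
var offset = 0
var reachedEnd = false

while !reachedEnd {
    let before = assembled.count
    try await requestInChunks(data: &assembled, url: url, session: .shared,
                              offset: offset, length: chunkSize)
    let received = assembled.count - before
    reachedEnd = received < chunkSize   // a short chunk means we hit the end of the file
    offset += received
}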
I still think that if my task at hand were about file downloading, session.download would be my go-to, but then there is file cleanup, etc., so I get why you might not go there.

How to suspend subsequent tasks until first finishes then share its response with tasks that waited?

I have an actor which throttles requests in a way where the first one will suspend subsequent requests until finished, then share its response with them so they don't have to make the same request.
Here's what I'm trying to do:
let cache = Cache()
let operation = OperationStatus()

func execute() async {
    if await operation.isExecuting {
        await operation.waitUntilFinished()
    } else {
        await operation.set(isExecuting: true)
    }

    if let data = await cache.data {
        return data
    }

    let request = myRequest()
    let response = await myService.send(request)

    await cache.set(data: response)
    await operation.set(isExecuting: false)
}

actor Cache {
    var data: myResponse?

    func set(data: myResponse?) {
        self.data = data
    }
}

actor OperationStatus {
    @Published var isExecuting = false
    private var cancellable = Set<AnyCancellable>()

    func set(isExecuting: Bool) {
        self.isExecuting = isExecuting
    }

    func waitUntilFinished() async {
        guard isExecuting else { return }

        return await withCheckedContinuation { continuation in
            $isExecuting
                .first { !$0 } // Wait until execution toggled off
                .sink { _ in continuation.resume() }
                .store(in: &cancellable)
        }
    }
}

// Do something
DispatchQueue.concurrentPerform(iterations: 1_000_000) { _ in execute() }
This ensures one request at a time, and subsequent calls wait until it is finished. It seems to work, but I'm wondering if there's a pure concurrency way instead of mixing Combine in, and how I can test this. Here's a test I started, but I'm confused about how to test it:
final class OperationStatusTests: XCTestCase {
    private let iterations = 10_000 // 1_000_000
    private let outerIterations = 10

    actor Storage {
        var counter: Int = 0

        func increment() {
            counter += 1
        }
    }

    func testConcurrency() {
        // Given
        let storage = Storage()
        let operation = OperationStatus()
        let promise = expectation(description: "testConcurrency")
        promise.expectedFulfillmentCount = outerIterations * iterations

        @Sendable func execute() async {
            guard await !operation.isExecuting else {
                await operation.waitUntilFinished()
                promise.fulfill()
                return
            }

            await operation.set(isExecuting: true)
            try? await Task.sleep(seconds: 8)
            await storage.increment()
            await operation.set(isExecuting: false)
            promise.fulfill()
        }

        waitForExpectations(timeout: 10)

        // When
        DispatchQueue.concurrentPerform(iterations: outerIterations) { _ in
            (0..<iterations).forEach { _ in
                Task { await execute() }
            }
        }

        // Then
        // XCTAssertEqual... how to test?
    }
}
Before I tackle a more general example, let us first dispense with some natural examples of sequential execution of asynchronous tasks, passing the result of one as a parameter of the next. Consider:
func entireProcess() async throws {
    let value = try await first()
    let value2 = try await subsequent(with: value)
    let value3 = try await subsequent(with: value2)
    let value4 = try await subsequent(with: value3)

    // do something with `value4`
}
Or
func entireProcess() async throws {
    var value = try await first()

    for _ in 0 ..< 4 {
        value = try await subsequent(with: value)
    }

    // do something with `value`
}
This is the easiest way to declare a series of async functions, each of which takes the prior result as the input for the next iteration. So, let us expand the above to include some signposts for Instruments’ “Points of Interest” tool:
import os.log

private let log = OSLog(subsystem: "Test", category: .pointsOfInterest)

func entireProcess() async throws {
    let id = OSSignpostID(log: log)
    os_signpost(.begin, log: log, name: #function, signpostID: id, "start")

    var value = try await first()
    for i in 0 ..< 4 {
        os_signpost(.event, log: log, name: #function, "Scheduling: %d with input of %d", i, value)
        value = try await subsequent(with: value)
    }

    os_signpost(.end, log: log, name: #function, signpostID: id, "%d", value)
}

func first() async throws -> Int {
    let id = OSSignpostID(log: log)
    os_signpost(.begin, log: log, name: #function, signpostID: id, "start")

    try await Task.sleep(seconds: 1)

    let value = 42
    os_signpost(.end, log: log, name: #function, signpostID: id, "%d", value)
    return value
}

func subsequent(with value: Int) async throws -> Int {
    let id = OSSignpostID(log: log)
    os_signpost(.begin, log: log, name: #function, signpostID: id, "%d", value)

    try await Task.sleep(seconds: 1)

    let newValue = value + 1
    defer { os_signpost(.end, log: log, name: #function, signpostID: id, "%d", newValue) }
    return newValue
}
So, there you see a series of requests that pass their result to the subsequent request. All of that os_signpost signpost stuff is so we can visually see that they are running sequentially in Instrument’s “Points of Interest” tool:
You can see ⓢ event signposts as each task is scheduled, and the intervals illustrate the sequential execution of these asynchronous tasks.
This is the easiest way to have dependencies between tasks, passing values from one task to another.
Now, that begs the question of how to generalize the above, where we await the prior task before starting the next one.
One pattern is to write an actor that awaits the result of the prior one. Consider:
actor SerialTasks<Success> {
    private var previousTask: Task<Success, Error>?

    func add(block: @Sendable @escaping () async throws -> Success) {
        previousTask = Task { [previousTask] in
            let _ = await previousTask?.result
            return try await block()
        }
    }
}
Unlike the previous example, this does not require that you have a single function from which you initiate the subsequent tasks. E.g., I have used the above when some separate user interaction requires me to add a new task to the end of the list of previously submitted tasks.
There are two subtle, yet critical, aspects of the above actor:
The add method, itself, must not be an asynchronous function. We need to avoid actor reentrancy. If this were an async function (like in your example), we would lose the sequential execution of the tasks.
The Task has a [previousTask] capture list to capture a copy of the prior task. This way, each task will await the prior one, avoiding any races.
The above can be used to make a series of tasks run sequentially. But it is not passing values between tasks, itself. I confess that I have used this pattern where I simply need sequential execution of largely independent tasks (e.g., sending separate commands being sent to some Process). But it can probably be adapted for your scenario, in which you want to “share its response with [subsequent requests]”.
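For instance, here is a sketch of how the actor above might be driven from separate user interactions; send(_:) is just a stand-in for whatever async work you need serialized:
func send(_ command: String) async throws {
    // Stand-in for real asynchronous work.
    try await Task.sleep(nanoseconds: 200_000_000)
    print("sent:", command)
}

let serialTasks = SerialTasks<Void>()

// Each call appends another block; the blocks run strictly one after another,
// even if the user triggers them in rapid succession.
func userTapped(_ command: String) {
    Task {
        await serialTasks.add {
            try await send(command)
        }
    }
}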
I would suggest that you post a separate question with an MCVE with a practical example of precisely what you wanted to pass from one asynchronous function to another. I have, for example, done permutations of the above, passing an integer from one task to another. But in practice, that is not of great utility, as it gets more complicated when you start dealing with the reality of heterogeneous result parsing. In practice, the simple example with which I started this answer is the most practical pattern.
On the broader question of working with/around actor reentrancy, I would advise keeping an eye out on SE-0306 - Future Directions which explicitly contemplates some potential elegant forthcoming alternatives. I would not be surprised to see some refinements, either in the language itself, or in the Swift Async Algorithms library.
tl;dr
I did not want to encumber the above with discussion regarding your code snippets, but there are quite a few issues. So, if you forgive me, here are some observations:
The attempt to use OperationStatus to enforce sequential execution of async calls will not work, because actors feature reentrancy. If you have an async function, every time you hit an await, that is a suspension point at which another call to that async function is allowed to proceed. The integrity of your OperationStatus logic will be violated. You will not experience serial behavior.
If you are interested in suspension points, I might recommend watching the WWDC 2021 video Swift concurrency: Behind the scenes.
The testConcurrency is calling waitForExpectations before it actually starts any tasks that will fulfill any expectations. That will always time out.
The testConcurrency is using GCD concurrentPerform, which, in turn, just schedules an asynchronous task and immediately returns. That defeats the entire purpose of concurrentPerform (which is a throttling mechanism for running a series of synchronous tasks in parallel, but not exceed the maximum number of cores on your CPU). Besides, Swift concurrency features its own analog to concurrentPerform, namely the constrained “cooperative thread pool” (also discussed in that video, IIRC), rendering concurrentPerform obsolete in the world of Swift concurrency.
Bottom line, it doesn't make sense to include concurrentPerform in a Swift concurrency codebase. It also does not make sense to use concurrentPerform to launch asynchronous tasks (whether Swift concurrency or GCD). It is for launching a series of synchronous tasks in parallel.
In execute in your test, you have two paths of execution, one of which awaits some state change and fulfills the expectation without ever incrementing the storage. That means that you will lose some attempts to increment the value. Your total will not match the desired resulting value. Now, if your intent was to drop requests if another was pending, that's fine. But I don't think that was your intent.
In answer to your question about how to test success at the end, you might do something like:
actor Storage {
    private var counter: Int = 0

    func increment() {
        counter += 1
    }

    var value: Int { counter }
}

func testConcurrency() async {
    let storage = Storage()
    let operation = OperationStatus()
    let promise = expectation(description: "testConcurrency")
    let finalCount = outerIterations * iterations
    promise.expectedFulfillmentCount = finalCount

    @Sendable func execute() async {
        guard await !operation.isExecuting else {
            await operation.waitUntilFinished()
            promise.fulfill()
            return
        }

        await operation.set(isExecuting: true)
        try? await Task.sleep(seconds: 1)
        await storage.increment()
        await operation.set(isExecuting: false)
        promise.fulfill()
    }

    // waitForExpectations(timeout: 10)   // this is not where you want to wait; moved below, after the tasks have started

    // DispatchQueue.concurrentPerform(iterations: outerIterations) { _ in   // no point in this
    for _ in 0 ..< outerIterations {
        for _ in 0 ..< iterations {
            Task { await execute() }
        }
    }

    await waitForExpectations(timeout: 10)

    // test the success to see if the stored value was correct
    let value = await storage.value // to test that you got the right count, fetch the value; note `await`, thus we need to make this an `async` test

    // Then
    XCTAssertEqual(finalCount, value, "Count")
}
Now, this test will fail for a variety of reasons, but hopefully this illustrates how you would verify the success or failure of the test. But note that this tests only whether the final result was correct, not whether the requests were executed sequentially. The fact that Storage is an actor will hide the fact that they were not really invoked sequentially. I.e., whether you really needed the results of one request to prepare the next is not tested here.
If, as you go through this, you want to really confirm the behavior of your OperationStatus pattern, I would suggest using os_signpost intervals (or simple logging statements where your tasks start and finish). You will see that the separate invocations of the asynchronous execute method are not running sequentially.

Captured value within completion handler not mutating

So I have here an async method defined on an actor type. The issue I'm having is that people isn't being mutated at all, mainly due to (from what I understand) concurrency issues. From what I think I know, the completion handler is executed concurrently on the same thread and so cannot mutate people.
Nonetheless, my knowledge in this area is pretty foggy, so any suggestions to solve it and maybe a better explanation as to why this is happening would be great! I've thought of a few things, but I'm still new to concurrency.
func getAllPeople() async -> [PersonModelProtocol] {
    let decoder = JSONDecoder()
    var people: [Person] = []

    let dataTask = session.dataTask(with: request!) { data, response, error in
        do {
            let newPeople = try! decoder.decode([Person].self, from: data!)
            people = newPeople
        } catch {
            print(error)
        }
    }
    dataTask.resume()

    return people
}
If you do want to use async/await you have to use the appropriate API.
func getAllPeople() async throws -> [Person] {
    let (data, _) = try await session.data(for: request!)
    return try JSONDecoder().decode([Person].self, from: data)
}
In a synchronous context you cannot return data from a completion handler anyway.
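To call the async version from such a synchronous context, you go through a Task (a small sketch; how you handle the error is up to you):
Task {
    do {
        let people = try await getAllPeople()
        print("loaded \(people.count) people")
    } catch {
        print(error)
    }
}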
vadian is right, but it's better expressed as a computed property nowadays.
var allPeople: [PersonModelProtocol] {
    get async throws {
        try JSONDecoder().decode(
            [Person].self,
            from: await session.data(for: request!).0
        )
    }
}
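Reading it then looks like:
let people = try await allPeople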
And also, your code does mutate people, but the function returns the empty value before the mutation happens.

Swift function doesn't return a value [duplicate]

This question already has answers here:
Returning data from async call in Swift function
(13 answers)
Closed last year.
I'm new at Swift and that's why I need your help. So I have a function which should send a request and return a value:
func getAnswer() -> String? {
    var answer: String?
    guard let url = URL(string: "https://8ball.delegator.com/magic/JSON/_") else { return nil }

    URLSession.shared.dataTask(with: url) { data, response, error in
        guard let data = data, error == nil else {
            return
        }
        guard let response = response as? HTTPURLResponse else { return }
        guard response.statusCode == 200 else { return }
        do {
            let model = try JSONDecoder().decode(Answer.self, from: data)
            DispatchQueue.main.async {
                answer = model.magic.answer
            }
        } catch let error {
            fatalError(error.localizedDescription)
        }
    }.resume()

    return answer
}
but it always returns nil.
I suppose problem is here
DispatchQueue.main.async {
answer = model.magic.answer
}
How can I fix it?
In order to understand what is happening here, you need to learn about @escaping functions in Swift (link1), about passing a function as another function's parameter (link2, in the section "Function Types as Parameter Types"), and about closures in Swift (link3).
Here is what is happening, simplified and explained step by step:
you call getAnswer()
the variable answer gets initialized with the value nil by the declaration answer: String?
URLSession.shared.dataTask is called, taking as an argument another function, a closure of type (Data?, URLResponse?, Error?) -> Void. That closure is executed on a different thread and has not run yet; it will run right after a response is received from the server, which can take any amount of time (but usually milliseconds) and will basically happen after your getAnswer() function has already returned its value.
your getAnswer() immediately returns the value of answer, which is currently nil
if you get any data from the server, or the server could not be reached, your URLSession.shared.dataTask function executes the code in your closure. This is the code it will execute:
guard let data = data, error == nil else {
    return
}
guard let response = response as? HTTPURLResponse else { return }
guard response.statusCode == 200 else { return }
do {
    let model = try JSONDecoder().decode(Answer.self, from: data)
    DispatchQueue.main.async {
        answer = model.magic.answer
    }
} catch let error {
    fatalError(error.localizedDescription)
}
Your problem lies in how Swift executes closures. When you call
URLSession.shared.dataTask(with: url) {
    // Closure code here
}
return answer
Your "Closure code here" doesn't get called until the endpoint "https://8ball.delegator.com/magic/JSON/_" actually gives a response. However, you've promised Swift that your function will return an optional string immediately after the serial code of your function has completed. For this reason, by the time your "Closure code here" has run and your "answer" variable has been updated with the correct value, your function is long gone and has already returned a value (which in this case is whatever you set it to at the beginning: nil).
You can fix this issue in one of two ways:
1. Swift's new concurrency system
2. Defining your own closure
Swift's new concurrency system
You can define your function as async, meaning that the function won't have to return its value synchronously, as follows.
enum GetAnswerError: Error {
    case invalidURL
}

func getAnswer() async throws -> String {
    guard let url = URL(string: "https://8ball.delegator.com/magic/JSON/_") else {
        throw GetAnswerError.invalidURL
    }

    // Your function will suspend here and probably be moved to a different thread.
    // It will resume once a response has been received from the endpoint.
    let (data, _) = try await URLSession.shared.data(from: url)
    let parsedData = try JSONDecoder().decode(Answer.self, from: data)
    return parsedData.magic.answer
}
When you call this function, you'll have to do so from an environment in which Swift can suspend. This means you'll call the function either from another async function, like so
func anotherFunction() async throws -> Bool {
    let answer = try await getAnswer()
    // Run some code here
    return answer == "YES" // Return some useful value
}
or from a Task object like so
Task {
    // Note that because the function getAnswer() can throw errors, you'll have to handle them when you call the function.
    // In this case, I'm handling them by using try?, which will simply set answer to nil if an error is thrown.
    let answer = try? await getAnswer()
}
Note that when you call code in a task, you must use the return values from within the scope of the task. If you try to do something like this
func getAnswerTheSecond() -> String? {
    var answer: String? = nil
    Task {
        let receivedAnswer = try? await getAnswer()
        answer = receivedAnswer
    }
    return answer
}
You'll just end up back where you started, where Swift immediately returns the nil value because your code runs serially. To fix this, run the relevant code on the "answer" wherever it is needed within the task. If you are using the "answer" to update a SwiftUI view, that might look like this:
struct ContentView: View {
    @State var answer: String = ""

    // This is the function that I've written earlier
    func getAnswer() async throws -> String {
        // Make URL Request
        // Return the value
    }

    var body: some View {
        Text(self.answer)
            .onAppear {
                Task {
                    let result = try? await self.getAnswer()
                    self.answer = result ?? ""
                }
            }
    }
}
Defining your own closure
You can define your own closure to handle the URL response; however, because of Swift's new concurrency framework, this is probably not the right way to go.
If you'd like to go this way, do a Google search for "Swift closures", and you'll find what you need.
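For completeness, here is a rough sketch of what that closure-based version could look like, using the same Answer model as above (this is my own illustration, not code from the answer):
func getAnswer(completion: @escaping (String?) -> Void) {
    guard let url = URL(string: "https://8ball.delegator.com/magic/JSON/_") else {
        completion(nil)
        return
    }
    URLSession.shared.dataTask(with: url) { data, _, _ in
        // Decode off the main thread, then hand the result back on the main queue.
        let answer = data.flatMap { try? JSONDecoder().decode(Answer.self, from: $0) }?.magic.answer
        DispatchQueue.main.async {
            completion(answer) // nil if the request or the decoding failed
        }
    }.resume()
}

// The answer is only usable inside the completion closure:
getAnswer { answer in
    print(answer ?? "no answer")
}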

Can't get data returned from dataTask()

For one week I have been trying to get a string returned from dataTask().
I already read a lot here on Stack Overflow and also from several sites where they tackle this topic. For example, this one. So I already understand that the dataTask doesn't directly return values, because it happens on different threads and so on. I also read about closures and completion handlers. I really got the feeling that I actually already have a little clue what this is about. But I can't get it to work.
So this is my code. I just post the whole code so no one needs to worry that the problem sticks in a part which I don't show. Everything is working fine until I try to return a value and save it, for example, in a variable:
func requestOGD(code gtin: String, completion: @escaping (_ result: String) -> String) {
    // MARK: Properties
    var answerList: [String.SubSequence] = []
    var answerDic: [String: String] = [:]
    var product_name = String()
    var producer = String()

    // Set up the URL request
    let ogdAPI = String("http://opengtindb.org/?ean=\(gtin)&cmd=query&queryid=400000000")
    guard let url = URL(string: ogdAPI) else {
        print("Error: cannot create URL")
        return
    }
    let urlRequest = URLRequest(url: url)

    // set up the session
    let config = URLSessionConfiguration.default
    let session = URLSession(configuration: config)

    // make the request
    let task = session.dataTask(with: urlRequest) { (data, response, error) in
        // check for any errors
        guard error == nil else {
            print("error calling GET on /todos/1")
            print(error!)
            return
        }
        // make sure we got data
        guard let responseData = data else {
            print("Error: did not receive data")
            return
        }
        // parse the result, which is a String. It will be split and placed in a dictionary
        do {
            let answer = (String(decoding: responseData, as: UTF8.self))
            answerList = answer.split(separator: "\n")
            for entry in answerList {
                let entry1 = entry.split(separator: "=")
                if entry1.count > 1 {
                    let foo = String(entry1[0])
                    let bar = String(entry1[1])
                    answerDic[foo] = "\(bar)"
                }
            }
            if answerDic["error"] == "0" {
                product_name = answerDic["detailname"]!
                producer = answerDic["vendor"]!
                completion(product_name)
            } else {
                print("Error-Code der Seite lautet: \(String(describing: answerDic["error"]))")
                return
            }
        }
    }
    task.resume()
}
Here I call my function. And no worries, I also tried to directly return it to the var foo; that also doesn't work. The value only exists within the closure:
// Configure the cell...
var foo: String = ""

requestOGD(code: listOfCodes[indexPath.row]) { (result: String) in
    print(result)
    foo = result
    return result
}
print("Foo:", foo)

cell.textLabel?.text = self.listOfCodes[indexPath.row] + ""

return cell
}
So my problem is, I have the feeling that I'm not able to get a value out of an HTTP request.
You used a completion handler in your call to requestOGD:
requestOGD(code: listOfCodes[indexPath.row]) {
(result: String) in
// result comes back here
}
But then you tried to capture and return that result:
foo = result
return result
So you're making the same mistake here that you tried to avoid making by having the completion handler in the first place. The call to that completion handler is itself asynchronous. So you face the same issue again. If you want to extract result at this point, you would need another completion handler.
To put it in simple terms, this is the order of operations:
requestOGD(code: listOfCodes[indexPath.row]) {
(result: String) in
foo = result // 2
}
print("Foo:", foo) // 1
You are printing foo before the asynchronous code runs and has a chance to set foo in the first place.
In the larger context: You cannot use any asynchronously gathered material in cellForRowAt. The cell is returned before the information is gathered. That's what asynchronous means. You can't work around that by piling on further levels of asynchronicity. You have to change your entire strategy.
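One common shape for that change of strategy, sketched here with hypothetical names and two assumptions (requestOGD's completion is changed to a plain (String) -> Void, and this lives in a view controller that has a tableView property): fetch asynchronously, cache the result keyed by the code, and reload the affected row when the data arrives.
var productNames: [String: String] = [:]   // gtin code -> product name, filled in asynchronously

func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
    let code = listOfCodes[indexPath.row]

    if let name = productNames[code] {
        // Already fetched: configure the cell synchronously.
        cell.textLabel?.text = "\(code) \(name)"
    } else {
        cell.textLabel?.text = code
        requestOGD(code: code) { [weak self] name in
            DispatchQueue.main.async {
                self?.productNames[code] = name
                // Reload the row rather than touching the (possibly reused) cell directly.
                self?.tableView.reloadRows(at: [indexPath], with: .none)
            }
        }
    }
    return cell
}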