Is code within DispatchQueue.global.async executed serially? - swift

I have some code that looks like this:
DispatchQueue.global(qos: .userInitiated).async {
    self.fetchProjects()
    DispatchQueue.main.async {
        self.constructMenu()
    }
}
My question is: are the statements within the global block executed serially? When I add print statements, they are always executed in the same order, but I'm not sure whether I'm just getting lucky, because the documentation says:
Tasks submitted to the returned queue are scheduled concurrently with respect to one another.
I wonder if anyone can shed any light on this?
EDIT:
Apologies, I don't think I made the question clear. I would like for the method constructMenu to only be called once fetchProjects has completed. From what I can tell (by logging print statements) this is the case.
But I'm not really sure why that's the case if what Apple's documentation above says (where each task is scheduled concurrently) is true.
Is code within an async block always executed serially, or is the fact that the code seems to execute serially a result of using DispatchQueue.main or is it just 'luck' and at some point constructMenu will actually return before fetchProjects?

I would like for the method constructMenu to only be called once fetchProjects has completed. From what I can tell (by logging print statements) this is the case.
Yes, this is the case.
But I'm not really sure why that's the case if what Apple's documentation above says (where each task is scheduled concurrently) is true.
Apple’s documentation is saying that two separate dispatches may run concurrently with respect to each other.
Consider:
DispatchQueue.global(qos: .userInitiated).async {
    foo()
}
DispatchQueue.global(qos: .userInitiated).async {
    bar()
}
In this case, foo and bar may end up running at the same time. This is what Apple means by “Tasks submitted to the returned queue are scheduled concurrently.”
But consider:
DispatchQueue.global(qos: .userInitiated).async {
    foo()
    bar()
}
In this case, bar will not run until we return from foo.
Is code within an async block always executed serially, or is the fact that the code seems to execute serially a result of using DispatchQueue.main or is it just ‘luck’ and at some point constructMenu will actually return before fetchProjects?
No luck involved. It will never reach the DispatchQueue.main.async line until you return from fetchProjects.
There is one fairly major caveat, though. This assumes that fetchProjects won’t return until the fetch is done. That means that fetchProjects better not be initiating any asynchronous processes of its own (i.e. no network requests). If it does, you probably want to supply it with a completion handler, and put the call to constructMenu in that completion handler.
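For example, a minimal sketch of that completion-handler approach (the URL, the decoding, and the free-function form here are placeholders, not the asker's actual implementation):

import Foundation

// Hypothetical reworking of fetchProjects so callers are told when the
// underlying network request has actually finished.
func fetchProjects(completion: @escaping () -> Void) {
    let url = URL(string: "https://example.com/projects")!   // placeholder URL
    URLSession.shared.dataTask(with: url) { data, _, _ in
        // ... decode `data` and store the projects ...
        completion()
    }.resume()
}

// Call site: constructMenu is only enqueued once the fetch has completed.
fetchProjects {
    DispatchQueue.main.async {
        constructMenu()
    }
}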

Yes, blocks of code are submitted as blocks, and those blocks run sequentially. So for your example:
DispatchQueue.global(qos: .userInitiated).async {
    self.fetchProjects()
    DispatchQueue.main.async {
        self.constructMenu()
    }
}
fetchProjects() must complete before constructMenu is enqueued. There is no magic here. The block between {...} is submitted as a block. At some point in the future it will be executed, but the pieces of the block are not considered in any granular way. Execution starts at the top with fetchProjects, and then the next line of code, DispatchQueue.main.async, is executed, which accepts another block as a parameter. The compiler doesn't know anything about these blocks; it just passes them to functions, and those functions put them on queues.

DispatchQueue.global is a concurrent queue, which means that tasks submitted to it may run at the same time. If you need tasks to run serially, create a custom serial queue:
let serial = DispatchQueue(label: "com.queueName")
serial.sync {
    // work submitted to this queue runs one block at a time
}
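For example (the label and the prints are purely illustrative), two blocks submitted with async to the same serial queue still run strictly one after the other:

import Foundation

let serial = DispatchQueue(label: "com.example.queueName")

serial.async {
    print("first block")    // always finishes before the second block starts
}
serial.async {
    print("second block")   // starts only after the first block has returned
}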

Related

How to make a serial queue of tasks running in main thread with asynchronous callbacks?

In this case I have to consider the asynchronous callback as part of the task. I want only one task running at a time.
Here is boilerplate:
func queueATask() {
    DispatchQueue.main.async {
        doSomeHeavyWorkPartA()
        // tunnelProvider is an instance of NETunnelProviderManager
        tunnelProvider.saveToPreferences { _ in
            continueHeavyWorkPartB()
        }
    }
}
So I can make multiple calls of this function. But before a new task starts running, the previous one must be finished (including part A and part B).
queueATask()
queueATask()
queueATask()
I was considering DispatchGroup and DispatchSemaphore, but those would need to block the main queue, which is not viable. All the tasks must run on the main queue. If there were no part B callback it would be much easier; I think that is the hard part of this situation.
Kindly guide me how to solve this issue.
Thanks
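One possible (untested) sketch, reusing the question's own placeholder names (doSomeHeavyWorkPartA, tunnelProvider, continueHeavyWorkPartB), is to funnel the tasks through a private serial "scheduler" queue and let a semaphore hold up only that queue, never the main one:

import Foundation

// Each scheduled block hops to the main queue to do the work, then blocks the
// scheduler thread (not the main thread) until the asynchronous callback
// reports that parts A and B have both finished.
let scheduler = DispatchQueue(label: "com.example.taskScheduler")

func queueATask() {
    scheduler.async {
        let done = DispatchSemaphore(value: 0)
        DispatchQueue.main.async {
            doSomeHeavyWorkPartA()
            tunnelProvider.saveToPreferences { _ in
                continueHeavyWorkPartB()
                done.signal()          // parts A and B are finished
            }
        }
        done.wait()                    // holds up only the next queued task
    }
}

The trade-off is that one background worker thread sits blocked while a task is in flight; because the scheduler queue is serial, at most one thread is ever waiting. If the saveToPreferences callback is not delivered on the main queue, the call to continueHeavyWorkPartB would also need to be wrapped in DispatchQueue.main.async.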

Swift dispatch queues async order of execution

Considering this trivial piece of code
DispatchQueue.global().async {
    print("2")
}
print("1")
we can say that output will be as following:
1
2
Are there any circumstances under which order of execution will be different (disregarding kind of used queue)? May they be forced to appear manually if any?
You said:
we can say that output will be as following ...
No, at best you can only say that the output will often/frequently be in that order, but is not guaranteed to be so.
The code snippet is dispatching code asynchronously to a global, concurrent queue, which will run it on a separate worker thread, and, in the absence of any synchronization, you have a classic race condition between the current thread and that worker thread. You have no guarantee of the sequence of these print statements, though, in practice, you will frequently see 1 before 2.
This is one of the common questions in tech interviews, and for some reason interviewers expect the conclusion that the order of execution is constant here. So (as that was not aligned with my understanding) I decided to clarify.
Your understanding is correct, that the sequence of these two print statements is definitely not guaranteed.
Are there any circumstances under which order of execution will be different
A couple of thoughts:
By adjusting queue priorities, for example, you can change the likelihood that 1 will appear before 2. But, again, not guaranteed.
There are a variety of mechanisms to guarantee the order.
You can use a serial queue ... I gather you didn’t want to consider using another/different queue, but it is generally the right solution, so any discussion on this topic would be incomplete without that scenario;
You can use a dispatch group ... you can notify on the global queue when the current queue satisfies the group (a minimal sketch follows this list);
You can use dispatch semaphores ... semaphores are a classic answer to the question, but IMHO semaphores should be used sparingly as it’s so easy to make mistakes with them ... plus blocking threads is never a good idea;
For the sake of completeness, we should mention that you really can use any synchronization mechanism, such as locks.
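For instance, a minimal sketch of the dispatch-group variant, which guarantees that "1" is printed before "2" is even scheduled:

import Foundation

let group = DispatchGroup()
group.enter()

// "2" is dispatched to the global queue only once the group is satisfied.
group.notify(queue: .global()) {
    print("2")
}

print("1")
group.leave()   // "1" has been printed, so "2" may now be scheduled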
I did a quick test too. Normally I do a UI update after the code on the global queue completes, so I would usually put the main-queue dispatch at the end of the block (step 2). But today I found that even when I put that main-queue code at the beginning of the global-queue block (step 1), it is still executed after all of the global-queue code has finished.
DispatchQueue.global().async {
    // step 1
    DispatchQueue.main.async {
        print("1. on the main thread")
    }
    // code on the global queue
    print("1. off the main thread")
    print("2. off the main thread")
    // step 2
    DispatchQueue.main.async {
        print("2. on the main thread")
    }
}
Here is the output:
1. off the main thread
2. off the main thread
1. on the main thread
2. on the main thread

Is it possible to specify the `DispatchQueue` for `DispatchQueue.concurrentPerform`?

dispatch_apply takes a dispatch queue as a parameter, which allows you to choose which queue to execute the block on.
My understanding is that DispatchQueue.concurrentPerform in Swift is meant to replace dispatch_apply. But this function does not take a dispatch queue as a parameter. After googling around, I found this GCD tutorial which has this code:
let _ = DispatchQueue.global(qos: .userInitiated)
DispatchQueue.concurrentPerform(iterations: addresses.count) { index in
    // do work here
}
And explains:
This implementation includes a curious line of code: let _ = DispatchQueue.global(qos: .userInitiated). Calling this causes GCD to use a queue with a .userInitiated quality of service for the concurrent calls.
My question is, does this actually work to specify the QoS? If so, how?
It would kind of make sense to me that there is no way to specify the queue for this, because a serial queue makes no sense in this context and only the highest QoS really makes sense given that this is a synchronous blocking function. But I can't find any documentation as to why it is possible to specify a queue with dispatch_apply but impossible(?) with DispatchQueue.concurrentPerform.
The author’s attempt to specify the queue’s quality of service (QoS) is incorrect. concurrentPerform uses the current queue’s QoS if it can. You can confirm this by tracing through the source code:
concurrentPerform calls _swift_dispatch_apply_current.
_swift_dispatch_apply_current calls dispatch_apply with 0, i.e., DISPATCH_APPLY_AUTO, which is defined as a ...
... Constant to pass to dispatch_apply() or dispatch_apply_f() to request that the system automatically use worker threads that match the configuration of the current thread as closely as possible.
When submitting a block for parallel invocation, passing this constant as the queue argument will automatically use the global concurrent queue that matches the Quality of Service of the caller most closely.
This can also be confirmed by following dispatch_apply’s call to dispatch_apply_f, in which using DISPATCH_APPLY_AUTO results in a call to _dispatch_apply_root_queue. If you keep tumbling down the rabbit hole of swift-corelibs-libdispatch, you’ll see that this really does use a global queue with the same QoS as your current thread.
Bottom line, the correct way to specify the QoS is to dispatch the call to concurrentPerform to the desired queue, e.g.:
DispatchQueue.global(qos: .userInitiated).async {
    DispatchQueue.concurrentPerform(iterations: 3) { (i) in
        ...
    }
}
This is easily verified empirically by adding a breakpoint and looking at the queue in the Xcode debugger.
Needless to say, the suggestion of adding the let _ = ... is incorrect. Consider the following:
DispatchQueue.global(qos: .utility).async {
    let _ = DispatchQueue.global(qos: .userInitiated)
    DispatchQueue.concurrentPerform(iterations: 3) { (i) in
        ...
    }
}
This will run with “utility” QoS, not “user initiated”.
Again, this is easily verified empirically.
See WWDC 2017 video Modernizing Grand Central Dispatch for a discussion about DISPATCH_APPLY_AUTO and concurrentPerform.
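Putting that together, here is a minimal runnable sketch of the recommended pattern (the iteration count and the printed diagnostics are illustrative):

import Foundation

DispatchQueue.global(qos: .userInitiated).async {
    // Because of the surrounding dispatch, DISPATCH_APPLY_AUTO selects a
    // global queue matching the caller's QoS (.userInitiated here).
    DispatchQueue.concurrentPerform(iterations: 4) { i in
        print("iteration \(i) on \(Thread.current)")   // iterations run in parallel
    }
    // concurrentPerform returns only after every iteration has finished.
    DispatchQueue.main.async {
        print("all iterations done")
    }
}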

Meaning of async in DispatchQueue

I'm watching the concurrent programming talk from WWDC, found here, and I'm a little confused about the meaning of async. In the JavaScript world, we use "async" to describe processes that happen "out of order" or, practically, processes that are not guaranteed to return before the next line of code is executed. However, in this talk it seems as though what the presenter is demonstrating is in fact a series of tasks being executed one after the other (i.e. synchronous behavior).
Then in this code example:
let queue = DispatchQueue(label: "com.example.imagetransform")
queue.async {
    let smallImage = image.resize(to: rect)
    DispatchQueue.main.async {
        imageView.image = smallImage
    }
}
At first glance, this DispatchQueue doesn't seem to behave like the FIFO data structure it's supposed to be. I see that they are passing a closure into the queue.async method, but I'm not sure how to add more blocks, or tasks, for it to execute.
I'm guessing the async nature of this is similar to JavaScript callbacks in the sense that all the variables being assigned in the closure are only scoped to that closure, and we can only act on that data within the closure.
DispatchQueues are FIFO with regard to the start of tasks. If they are concurrent queues, or if you're submitting async tasks, then there's no FIFO guarantee on when they finish.
Concurrent queues allow multiple tasks to run at the same time. Tasks are guaranteed to start in the order they were added. Tasks can finish in any order and you have no knowledge of the time it will take for the next task to start, nor the number of tasks that are running at any given time.
- https://www.raywenderlich.com/148513/grand-central-dispatch-tutorial-swift-3-part-1
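To the narrower question of how more blocks are added: you simply call async on the same queue again. Each call appends another block, and on a serial queue like the one in the question the blocks start (and, being serial, also finish) in submission order. A small illustrative sketch:

import Foundation

let queue = DispatchQueue(label: "com.example.imagetransform")   // serial by default

queue.async { print("task 1") }
queue.async { print("task 2") }   // enqueued behind task 1; starts after it returns
queue.async { print("task 3") }

// A queue created with .concurrent only guarantees the *start* order;
// its tasks may then overlap and finish in any order.
let concurrent = DispatchQueue(label: "com.example.concurrent", attributes: .concurrent)
concurrent.async { print("may overlap A") }
concurrent.async { print("may overlap B") }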

priority of Dispatch Queues in swift 3

I have read the tutorial about GCD and Dispatch Queue in Swift 3
But I'm really confused about the order of synchronous execution and asynchronous execution and main queue and background queue.
I know that if we use sync then we execute them one after the previous one, and if we use async then we can use QoS to set their priority, but what about this case?
func queuesWithQoS() {
    let queue1 = DispatchQueue(label: "com.appcoda.myqueue1")
    let queue2 = DispatchQueue(label: "com.appcoda.myqueue2")
    for i in 1000..<1005 {
        print(i)
    }
    queue1.async {
        for i in 0..<5 {
            print(i)
        }
    }
    queue2.sync {
        for i in 100..<105 {
            print(i)
        }
    }
}
The outcome shows that we ignore the asynchronous execution. I know queue2 should be completed before queue1 since it's synchronous execution but why we ignore the asynchronous execution and
what is the actual difference between async, sync and so-called main queue?
You say:
The outcome shows that we ignore the asynchronous execution. ...
No, it just means that you didn't give the asynchronously dispatched code enough time to get started.
I know queue2 should be completed before queue1 since it's synchronous execution ...
First, queue2 might not complete before queue1. It just happens to. Make queue2 do something much slower (e.g. loop through a few thousand iterations rather than just five) and you'll see that queue1 can actually run concurrently with respect to what's on queue2. It just takes a few milliseconds to get going and the stuff on your very simple queue2 is finishing before the stuff on queue1 gets a chance to start.
Second, this behavior is not technically because it's synchronous execution. It's just that async takes a few milliseconds to get its work running on some worker thread, whereas the synchronous call, because of optimizations that I won't bore you with, gets started more quickly.
but why we ignore the asynchronous execution ...
We don't "ignore" it. It just takes a few milliseconds to get started.
and what is the actual difference between async, sync and so-called main queue?
"Async" merely means that the current thread may carry on and not wait for the dispatched code to run on some other thread. "Sync" means that the current thread should wait for the dispatched code to finish.
The "main thread" is a different topic and simply refers to the primary thread that is created for driving your UI. In practice, the main thread is where most of your code runs, basically running everything except that which you manually dispatch to some background queue (or code that is dispatched there for you, e.g. URLSession completion handlers).
sync and async describe how you hand work to the same thread/queue, not which queue it runs on. To see the difference, run this code and notice that the first "finished" can print before the async loop has run, whereas the second "finished" prints only after the sync loop completes:
func queuesWithQoS() {
    let queue1 = DispatchQueue(label: "com.appcoda.myqueue1")
    queue1.async {
        for i in 0..<5 {
            print(i)
        }
    }
    print("finished")
    queue1.sync {
        for i in 0..<5 {
            print(i)
        }
    }
    print("finished")
}
The main queue is the serial queue bound to the main thread, on which the entire user interface (UI) runs.
First of all, I prefer the term "delayed" rather than "ignored" for this code execution, because all of the code in your question is executed.
QoS is an enum: the first case has the highest priority and the last one the lowest. When you don't specify any priority you get a queue with default priority, which sits in the middle (a small sketch of specifying QoS explicitly follows the list below):
userInteractive
userInitiated
default
utility
background
unspecified
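For example, here is a small sketch (labels and loop bodies are illustrative) of giving queues an explicit QoS when they are created; the higher-QoS work tends to be scheduled first, though the exact interleaving is still not guaranteed:

import Foundation

let highPriority = DispatchQueue(label: "com.example.high", qos: .userInitiated)
let lowPriority = DispatchQueue(label: "com.example.low", qos: .background)

lowPriority.async {
    for i in 0..<5 { print("background \(i)") }
}
highPriority.async {
    // Usually starts before the background loop finishes, because the
    // system favours the higher quality of service.
    for i in 0..<5 { print("userInitiated \(i)") }
}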
That said, you have two synchronous for-in loops and one asynchronous one, and the order of execution is determined by the position of the loops in the code and by how they are submitted (sync vs. async), because here we have three different queues involved. (Following the instructions in your link, queuesWithQoS() could be launched in viewDidAppear, so we can assume it runs on the main queue.)
The code shows the creation of two queues with default priority, so the sequence of execution will be:
the for-in loop with 1000..<1005 in the main queue
the synchronous queue2 with default priority
the asynchronous queue1 (not ignored, simply delayed) with default priority
The main queue, where all the UI work is executed, always has the highest priority.