Is DispatchSemaphore a good replacement for NSLock? - swift

I've been using NSLocks to synchronize touchy parts of code, but have been running into issues due to the fact that they must be unlocked from the same thread that they were locked from. Then I found that GCD's DispatchSemaphores seem to do the same thing, with the added convenience that they can be signaled from any thread. I was wondering, though, if this convenience comes at the price of thread-safety. Is it advisable to replace
let lock = NSLock()
lock.lock()
// do things...
lock.unlock()
with
let semaphore = DispatchSemaphore(value: 1)
semaphore.wait()
// do things...
semaphore.signal()
or will I run into issues regarding thread-safety anyway?

Yes, they serve the same purpose; both can be used to solve producer-consumer style synchronization problems.
A semaphore can allow more than one thread to access a shared resource if it is configured accordingly (i.e., initialized with a value greater than 1). You can use it to guard the execution of blocks submitted to the same concurrent DispatchQueue:
concurrentQueue.async {  // assuming concurrentQueue is a concurrent DispatchQueue you created
    semaphore.wait()
    // do things...
    semaphore.signal()
}
The same applies to a lock if you only want one thread to touch the resource at a time, even when the blocks run concurrently.
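To illustrate the "more than one thread" case, here is a minimal sketch of a counting semaphore that lets up to three blocks use the shared resource at once (the queue label and the value of 3 are arbitrary choices of mine, not from the question):
let gate = DispatchSemaphore(value: 3) // up to 3 blocks may be inside at once
let concurrentQueue = DispatchQueue(label: "worker", attributes: .concurrent)
for i in 0..<10 {
    concurrentQueue.async {
        gate.wait()    // blocks once three blocks are already inside
        // do things with the shared resource...
        print("block \(i) is inside")
        gate.signal()  // lets the next waiting block in
    }
}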
I found this to be helpful: https://priteshrnandgaonkar.github.io/concurrency-with-swift-3/

Since asking this, I have mostly switched over to another way of locking blocks of code: serial dispatch queues. I use it like this:
let queue = DispatchQueue(label: "<your label here>")
queue.async {
    // do things...
}
The queue is serial by default, meaning it acts as a lock that releases when the block is exited. Therefore, it's not appropriate if you need to lock on an asynchronous operation, but it works a charm in most cases.
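If you also need to read a value back out, the same serial queue can be used with sync, which returns the closure's result. A minimal sketch under that assumption (the Counter type is a hypothetical example of mine, not from the question):
final class Counter {
    private let queue = DispatchQueue(label: "counter.queue") // serial by default
    private var value = 0

    // Writes are funneled through the serial queue, one at a time.
    func increment() {
        queue.async { self.value += 1 }
    }

    // sync blocks the caller briefly and hands back a consistent snapshot.
    var current: Int {
        queue.sync { value }
    }
}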

Related

Swift Concurrency : Why Task is not executed on other background thread

I am trying to learn Swift concurrency, but it brings in a lot of confusion. I understood that a Task {} is an asynchronous unit and will allow us to bridge the async function call from a synchronous context. And it is similar to DispatchQueue.global() which in turn will execute the block on some arbitrary thread.
override func viewDidLoad() {
    super.viewDidLoad()
    Task {
        do {
            let data = try await asychronousApiCall()
            print(data)
        } catch {
            print("Request failed with error: \(error)")
        }
    }
    for i in 1...30000 {
        print("Thread \(Thread.current)")
    }
}
My asychronousApiCall function is below:
func asychronousApiCall() async throws -> Data {
    print("starting with asychronousApiCall")
    print("Thread \(Thread.current)")
    let url = URL(string: "https://www.stackoverflow.com")!
    // Use the async variant of URLSession to fetch data
    // Code might suspend here
    let (data, _) = try await URLSession.shared.data(from: url)
    return data
}
When I try this implementation, I always see that "starting with asychronousApiCall" is printed after the for loop is done and that the thread is the main thread.
like this
Thread <_NSMainThread: 0x600000f10500>{number = 1, name = main}
You said:
I understood that a Task {} is an asynchronous unit and will allow us to bridge the async function call from a synchronous context.
Yes.
You continue:
And it is similar to DispatchQueue.global() which in turn will execute the block on some arbitrary thread.
No, if you call it from the main actor, it is more akin to DispatchQueue.main.async { … }. As the documentation says, it “[r]uns the given nonthrowing operation asynchronously as part of a new top-level task on behalf of the current actor” [emphasis added]. I.e., if you are currently on the main actor, the task will be run on behalf of the main actor, too.
While it is probably a mistake to dwell on direct GCD-to-concurrency mappings, Task.detached { … } is more comparable to DispatchQueue.global().async { … }.
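To make the distinction concrete, a small sketch (the print messages are mine, not from the original answer):
// Called from a main-actor context, e.g. inside viewDidLoad:
Task {
    // Runs on behalf of the current actor (here the main actor),
    // roughly akin to DispatchQueue.main.async { … }.
    print("inherits the main actor")
}

Task.detached {
    // Runs on behalf of no actor, more comparable to
    // DispatchQueue.global().async { … }.
    print("does not inherit the current actor")
}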
You commented:
Please scroll to figure 8 near the end of the article. It has a normal task, and the thread printed is some other thread.
figure 8
In that screen snapshot, they are showing that prior to the suspension point (i.e., before the await) it was on the main thread (which makes sense, because it is running it on behalf of the same actor). But they are also highlighting that after the suspension point, it was on another thread (which might seem counterintuitive, but it is what can happen after a suspension point). This is very common behavior in Swift concurrency, though it can vary.
FWIW, in your example above, you only examine the thread before the suspension point and not after. The take-home message of figure 8 is that the thread used after the suspension point may not be the same one used before the suspension point.
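For illustration only (and keeping in mind the caveat about Thread.current discussed below), the question's asychronousApiCall could print on both sides of the suspension point:
func asychronousApiCall() async throws -> Data {
    print("before await: \(Thread.current)")  // often the main thread when called from the main actor
    let url = URL(string: "https://www.stackoverflow.com")!
    let (data, _) = try await URLSession.shared.data(from: url)  // suspension point
    print("after await: \(Thread.current)")   // may be a different, cooperative-pool thread
    return data
}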
If you are interested in learning more about some of these implementation details, I might suggest watching WWDC 2021 video Swift concurrency: Behind the scenes.
While it is interesting to look at Thread.current, it should be noted that Apple is trying to wean us off of this practice. E.g., in Swift 5.7, if we look at Thread.current from an asynchronous context, we get a warning:
Class property 'current' is unavailable from asynchronous contexts; Thread.current cannot be used from async contexts.; this is an error in Swift 6
The whole idea of Swift concurrency is that we stop thinking in terms of threads and instead let Swift concurrency choose the appropriate thread on our behalf (which cleverly avoids costly context switches where it can, sometimes resulting in code that runs on threads other than what we might otherwise expect).

Swift: DispatchQueue always Serial?

I receive a lot of data and want to process it off the main thread. Most of the things can be asynchronous in nature, however sometimes there is critical data coming in with each update invalidating the previous iteration. At the moment I set up everything like this:
func onReceivedData(d: Data) { //is on main thread unfortunately
    //some work 1
    DispatchQueue.global(qos: DispatchQoS.QoSClass.userInitiated).async {
        //some work 2
        if saveToFile {
            DispatchQueue.global(qos: DispatchQoS.QoSClass.utility).async {
                //UserDefaults write
            }
        }
    }
}
Now I am not sure if I understood the DispatchQueue correctly. Is it possible when data A comes in before data B, that data B for some reason may reach the UserDefaults write earlier than data A? Do the DispatchQueues push operations in a serial manner or can they also operate in parallel, potentially writing data B before data A, leaving the UserDefaults in an invalid state?
From the Apple Developer documentation on DispatchQueue.global():
Tasks submitted to the returned queue are scheduled concurrently with respect to one another.
This means that the global() queues are concurrent dispatch queues.
If you want to ensure that the tasks execute in the order that you schedule them, you should create your own serial DispatchQueue.
let queue = DispatchQueue(label: "my-queue")
queue.async {
    // Work 1...
}
queue.async {
    // Work 2...
}
"Work 1" will always run before "Work 2" in this serial queue (which may not be the case with a concurrent queue).
Alternatively, you could look into using a Swift actor which can more efficiently control access to shared mutable state in a memory-safe way.
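A minimal actor sketch (the DataStore name and its methods are illustrative, not from the question):
actor DataStore {
    private var latest: Data?

    // Only one of these calls runs at a time, so the state stays consistent.
    func update(_ data: Data) {
        latest = data
    }

    func current() -> Data? {
        latest
    }
}

// Usage from an async context:
// let store = DataStore()
// await store.update(newData)
// let value = await store.current()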

Why does DispatchSemaphore.wait() block this completion handler?

So I've been playing about with NetworkExtension to make a toy VPN implementation and I ran into an issue with the completion handlers/asynchronously running code. I'll run you through my train of thought/experiments and would appreciate any pointers at areas where I am mistaken, and how to resolve this issue!
Here's the smallest reproducible bit of code (obviously you will need to import NetworkExtension):
let semaphore = DispatchSemaphore(value: 0)
NETunnelProviderManager.loadAllFromPreferences { managers, error in
    print("2 during")
    semaphore.signal()
}
print("1 before")
semaphore.wait()
print("3 after")
With my understanding of semaphores and asynchronous code I'd expect the printouts to occur in the order:
1 before
2 during
3 after
However the program hangs at "1 before". If I remove the semaphore.wait() line, the printout occurs as expected in the order: 1, 3, 2 (as the closure runs later).
So after a bit of digging around with the debugger, it looks like the semaphore trap loop is blocking up execution. This sparked me to read around a bit into queues, and I discovered that changing it to the following works:
// ... as before
DispatchQueue.global().async {
    semaphore.wait()
    print("3 after")
}
This makes some sense as the blocking .wait() call is now being called asynchronously in a separate thread. However, this solution is not desired for me as in my actual implementation I am actually capturing the results from the closure and returning them later, in something that looks like this:
let semaphore = DispatchSemaphore(value: 0)
var results: [NETunnelProviderManager]? = nil
NETunnelProviderManager.loadAllFromPreferences { managers, error in
    print("2 during")
    results = managers
    semaphore.signal()
}
print("1 before")
// DispatchQueue.global().async {
semaphore.wait()
print("3 after")
// }
return results
Obviously I cannot return data from the async closure, and moving the return out of it would make it defunct. Additionally, adding another semaphore to make things synchronous exhibits the same issue as before, just moving the problem along in a chain.
As a result, I decided to try putting the .loadAllFromPreferences() call and completion handler in an async closure and leave everything else as in the original code snippet:
// ...
DispatchQueue.global().async {
    NETunnelProviderManager.loadAllFromPreferences { loadedManagers, error in
        print("2 during")
        semaphore.signal()
    }
}
// ...
However this does not work and the .wait() call is never passed, as before. I assume that somehow the semaphore is still blocking the thread and not allowing anything to execute, meaning whatever in the system is managing the queue is not running the async block? However, I'm clutching at straws here, and fear my original conclusion may not have been right.
This is where I'm starting to get out of my depth, so I'd like to know what is actually going on, and what resolution would you recommend to get the results from .loadAllFromPreferences() in a synchronous manner?
Thanks!
From the documentation for NETunnelProviderManager loadAllFromPreferences:
This block will be executed on the caller’s main thread after the load operation is complete
So we know that the completion handler is on the main thread.
We also know that the call to DispatchSemaphore wait will block whatever thread it is running on. Given this evidence, you must be calling all of this code from the main thread. Since your call to wait is blocking the main thread, the completion handler can never be called because the main thread is blocked.
This is made clear by your attempt to call wait on some global background queue. That allows the completion block to be called because your use of wait is no longer blocking the main thread.
And your attempt to call loadAllFromPreferences from a global background queue doesn't change anything because its completion block is still called on the main thread and your call to wait is still on the main thread.
It's a bad idea to block the main thread at all. The proper solution is to refactor whatever method this code is in to use its own completion handler instead of trying to use a normal return value.
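A hedged sketch of that refactor (the loadManagers function name is hypothetical, not from the question):
import NetworkExtension

// Instead of blocking with a semaphore and returning a value,
// hand the result back through a completion handler.
func loadManagers(completion: @escaping ([NETunnelProviderManager]?, Error?) -> Void) {
    NETunnelProviderManager.loadAllFromPreferences { managers, error in
        // Already on the caller's main thread, per the documentation.
        completion(managers, error)
    }
}

// Usage:
// loadManagers { managers, error in
//     // use managers here, once they are actually available
// }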

Safely locking variable in Swift 3 using GCD

How do I lock a variable and prevent different threads from changing it at the same time, which leads to errors?
I tried using
func lock(obj: AnyObject, blk: () -> ()) {
    objc_sync_enter(obj)
    blk()
    objc_sync_exit(obj)
}
but I still have multithreading issues.
Shared Value
If you have a shared value that you want to access in a thread-safe way, like this
var list:[Int] = []
DispatchQueue
You can create your own serial DispatchQueue.
let serialQueue = DispatchQueue(label: "SerialQueue")
Dispatch Sync
Now different threads can safely access list; you just need to wrap the code that touches it in a closure dispatched to your serial queue.
serialQueue.sync {
    // update list <---
}
// This line will always run AFTER the closure on the previous line 👆👆👆
Since the serial queue executes the closures one at the time, the access to list will be safe.
Please note that the previous code will block the current thread until the closure is executed.
Dispatch Async
If you don't want to block the current thread until the closure is processed by the serial queue, you can dispatch the closure asynchronously
serialQueue.async {
    // update list <---
}
// This line can run BEFORE the closure on the previous line 👆👆👆
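Putting the two together, a minimal sketch of a wrapper that keeps every access to list on one serial queue (the SafeList name is mine, not from the question):
final class SafeList {
    private var list: [Int] = []
    private let serialQueue = DispatchQueue(label: "SerialQueue")

    // Asynchronous write: returns immediately; the queue applies writes one at a time, in order.
    func append(_ value: Int) {
        serialQueue.async { self.list.append(value) }
    }

    // Synchronous read: blocks the caller until the queue hands back a snapshot.
    func snapshot() -> [Int] {
        serialQueue.sync { list }
    }
}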
Swift's concurrency support isn't there yet. It sounds like it might be developed a bit in Swift 5. An excellent article is Matt Gallagher's Mutexes and Closure Capture in Swift, which looks at various solutions but recommends pthread_mutex_t. The choice of approach depends on other aspects of what you're writing - there's much to consider with threading.
Could you provide a specific simple example that's failing you?

Swift: synchronously perform code in background; queue.sync does not work as I would expect

I would like to perform some code synchronously in the background, I really thought this is the way to go:
let queue = DispatchQueue.global(qos: .default)
queue.sync {
    print("\(Thread.isMainThread)")
}
but this prints true unless I use queue.async. async isn't possible as then the code will be executed in parallel. How can I achieve running multiple blocks synchronously in the background?
What I would like to achieve: synchronize events in my app with the devices calendar, which happens in the background. The method which does this can be called from different places multiple times so I would like to keep this in order and in the background.
Async execution isn't your problem, since you only care about the order of execution of your code blocks relative to each other but not relative to the main thread. You shouldn't block the main thread, which is in fact DispatchQueue.main and not DispatchQueue.global.
What you should do is execute your code on a serial queue asynchronously, so you don't block the main thread, but you still ensure that your code blocks execute sequentially.
You can achieve this using the following piece of code:
let serialQueue = DispatchQueue(label: "serialQueue")
serialQueue.async { // call this whenever you need to add a new work item to your queue
    // call function here
}
DispatchQueue is not the same thing as a Thread. Think of it as a kind of abstraction over a thread pool.
That being said, the main queue is indeed "fixed" on the main thread. And that is why, when you synchronously dispatch a work item from the main queue, you are still on the main thread.
To actually execute sync code in the background, you have to already be in the background:
DispatchQueue.global().async {
    DispatchQueue.global().sync {
        print("\(Thread.isMainThread)")
    }
}
This will print false.
Also, as user #rmaddy correctly pointed out in comments, doing any expensive tasks synchronously from the main queue might result in your program becoming unresponsive, since the main thread is responsible for the UI updates.