A deep dive into Grand Central Dispatch in Swift
Grand Central Dispatch (or GCD for short) is one of those fundamental technologies that most Swift developers have used countless times. It’s primarily known for being able to dispatch work on different concurrent queues, and is very often used to write code like this:
DispatchQueue.main.async {
    // Run async code on the main queue
}
But it turns out that if we dive a little bit deeper, GCD also has a suite of really powerful APIs and features that not everyone knows about. This week, let’s go beyond async {} and take a look at some situations where GCD can be really useful, and how it can provide simpler (and more “Swifty”) options to many other - more common - Foundation APIs.
Delaying a cancellable task with DispatchWorkItem
One common misconception about GCD is that “once you schedule a task it cannot be cancelled, you need to use the Operation API for that”. While that used to be true, iOS 8 & macOS 10.10 introduced DispatchWorkItem, which provides this exact functionality in a very easy-to-use API.
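Before applying it to a real use case, here’s a minimal sketch (with a placeholder print statement and delay) of how a scheduled work item can be cancelled before it gets a chance to run:

// A minimal sketch: schedule a work item, then cancel it before it runs
let workItem = DispatchWorkItem {
    print("This will never be printed")
}

// Schedule the item to run on the main queue after one second
DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(1), execute: workItem)

// Cancelling it before the deadline prevents it from ever executing
workItem.cancel()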
Let’s say our UI has a search bar, and when the user types a character we perform a search by calling our backend. Since the user can type quite rapidly, we don’t want to start our network request right away (that could waste a lot of data and server capacity), and instead we’re going to “debounce” those events and only perform a request once the user hasn’t typed for 0.25 seconds.
This is where DispatchWorkItem comes in. By encapsulating our request code in a work item, we can very easily cancel it whenever it's replaced by a new one, like this:
class SearchViewController: UIViewController, UISearchBarDelegate {
    // We keep track of the pending work item as a property
    private var pendingRequestWorkItem: DispatchWorkItem?

    func searchBar(_ searchBar: UISearchBar, textDidChange searchText: String) {
        // Cancel the currently pending item
        pendingRequestWorkItem?.cancel()

        // Wrap our request in a work item
        let requestWorkItem = DispatchWorkItem { [weak self] in
            self?.resultsLoader.loadResults(forQuery: searchText)
        }

        // Save the new work item and execute it after 250 ms
        pendingRequestWorkItem = requestWorkItem
        DispatchQueue.main.asyncAfter(deadline: .now() + .milliseconds(250),
                                      execute: requestWorkItem)
    }
}
As we can see above, using DispatchWorkItem is actually a lot simpler and nicer in Swift than having to use a Timer or Operation, thanks to trailing closure syntax and how well GCD imports into Swift. We don’t need @objc-marked methods or #selector - it can all be done with closures.
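For comparison, here’s roughly what a Timer-based version of the same debouncing could look like. This is just a hedged sketch for contrast - performPendingRequest is a made-up method name, and resultsLoader is the same (assumed) property as in the example above - but it illustrates the extra @objc and #selector ceremony involved:

// Hypothetical Timer-based equivalent of the debouncing above
private var pendingRequestTimer: Timer?

func searchBar(_ searchBar: UISearchBar, textDidChange searchText: String) {
    // Invalidate any previously scheduled timer
    pendingRequestTimer?.invalidate()

    // Schedule a new one-shot timer that fires after 250 ms
    pendingRequestTimer = Timer.scheduledTimer(
        timeInterval: 0.25,
        target: self,
        selector: #selector(performPendingRequest(_:)),
        userInfo: searchText,
        repeats: false
    )
}

@objc private func performPendingRequest(_ timer: Timer) {
    guard let query = timer.userInfo as? String else {
        return
    }

    resultsLoader.loadResults(forQuery: query)
}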
Grouping and chaining tasks with DispatchGroup
Sometimes we need to perform a group of operations before we can move on with our logic. For example, let’s say we need to load data from a group of data sources before we can create a model. Rather than having to keep track of all the data sources ourselves, we can easily synchronize the work with a DispatchGroup.
Using dispatch groups also gives us a big advantage in that our tasks can run concurrently, in separate queues. That enables us to start off simple, and then easily add concurrency later if needed, without having to rewrite any of our tasks. All we have to do is make balanced calls to enter() and leave() on a dispatch group to have it synchronize our tasks.
Let’s take a look at an example, in which we load notes from local storage, iCloud Drive and a backend system, and then combine all of the results into a NoteCollection:
// First, we create a group to synchronize our tasks
let group = DispatchGroup()

// NoteCollection is a thread-safe collection class for storing notes
let collection = NoteCollection()

// The 'enter' method increments the group's task count…
group.enter()
localDataSource.load { notes in
    collection.add(notes)

    // …while the 'leave' method decrements it
    group.leave()
}

group.enter()
iCloudDataSource.load { notes in
    collection.add(notes)
    group.leave()
}

group.enter()
backendDataSource.load { notes in
    collection.add(notes)
    group.leave()
}

// This closure will be called when the group's task count reaches 0
group.notify(queue: .main) { [weak self] in
    self?.render(collection)
}
The above code works, but it has a lot of duplication in it. Let’s instead refactor it into an extension on Array, using a DataSource protocol as a same-type constraint for its Element type:
extension Array where Element == DataSource {
    func load(completionHandler: @escaping (NoteCollection) -> Void) {
        let group = DispatchGroup()
        let collection = NoteCollection()

        // De-duplicate the synchronization code by using a loop
        for dataSource in self {
            group.enter()
            dataSource.load { notes in
                collection.add(notes)
                group.leave()
            }
        }

        group.notify(queue: .main) {
            completionHandler(collection)
        }
    }
}
With the above extension, we can now reduce our previous code to this:
let dataSources: [DataSource] = [
    localDataSource,
    iCloudDataSource,
    backendDataSource
]

dataSources.load { [weak self] collection in
    self?.render(collection)
}
Very nice and compact! 👍
Waiting for asynchronous tasks with DispatchSemaphore
While DispatchGroup provides a nice and easy way to synchronize a group of asynchronous operations while still remaining asynchronous, DispatchSemaphore provides a way to synchronously wait for a group of asynchronous tasks. This is very useful in command line tools or scripts, where we don’t have an application run loop, and instead just execute synchronously in a global context until done.
Like DispatchGroup, the semaphore API is very simple in that we only increment or decrement an internal counter - by calling signal() or wait(), respectively. Calling wait() before a signal() will block the current thread until a signal is received.
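The basic mechanics can be boiled down to a minimal sketch like this (the async work here is just a placeholder):

// A semaphore created with value 0 starts with an empty counter
let semaphore = DispatchSemaphore(value: 0)

DispatchQueue.global().async {
    // Perform some asynchronous work here…
    // …and then increment the counter once done
    semaphore.signal()
}

// Blocks the current thread until signal() has been called
semaphore.wait()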
Let’s create another overload in our extension on Array from before, that returns a NoteCollection synchronously, or else throws an error. We’ll reuse our DispatchGroup-based code from before, but simply coordinate that task using a semaphore.
extension Array where Element == DataSource {
    func load() throws -> NoteCollection {
        let semaphore = DispatchSemaphore(value: 0)
        var loadedCollection: NoteCollection?

        // We create a new queue to do our work on, since calling wait() on
        // the semaphore will block the thread it's called from
        let loadingQueue = DispatchQueue.global()

        loadingQueue.async {
            // We extend 'load' to perform its work on a specific queue
            self.load(onQueue: loadingQueue) { collection in
                loadedCollection = collection

                // Once we're done, we signal the semaphore to unblock the waiting thread
                semaphore.signal()
            }
        }

        // Wait with a timeout of 5 seconds
        _ = semaphore.wait(timeout: .now() + 5)

        guard let collection = loadedCollection else {
            throw NoteLoadingError.timedOut
        }

        return collection
    }
}
Using the above new method on Array, we can now load notes synchronously in a script or command line tool like this:
let dataSources: [DataSource] = [
    localDataSource,
    iCloudDataSource,
    backendDataSource
]

do {
    let collection = try dataSources.load()
    output(collection)
} catch {
    output(error)
}
Observing changes in a file with DispatchSource
The final “lesser known” feature of GCD that I want to bring up is how it provides a way to observe changes in a file on the file system. Like DispatchSemaphore, this is something which can be super useful in a script or command line tool, if we want to automatically react to a file being edited by the user. This enables us to easily build developer tools that have “live editing” features.
Dispatch sources come in a few different variants, depending on what we want to observe. In this case we’ll use DispatchSourceFileSystemObject, which lets us observe events from the file system.
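Just to give a sense of the wider family, here’s a small sketch of another variant - a timer-based dispatch source - which isn’t needed for file observation, but shows that all sources follow the same set-handler-and-resume pattern:

// A timer dispatch source that fires every second on the main queue
let timerSource = DispatchSource.makeTimerSource(queue: .main)
timerSource.schedule(deadline: .now(), repeating: .seconds(1))

timerSource.setEventHandler {
    print("Timer fired")
}

// Like all dispatch sources, it needs to be resumed before it starts
timerSource.resume()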
Let's take a look at an example implementation of a simple FileObserver, that lets us attach a closure to be run every time a given file is changed. It works by creating a dispatch source using a fileDescriptor and a DispatchQueue to perform the observation on, and uses the Files library's File type to refer to the file to observe:
class FileObserver {
    private let file: File
    private let queue: DispatchQueue
    private var source: DispatchSourceFileSystemObject?

    init(file: File) {
        self.file = file
        self.queue = DispatchQueue(label: "com.myapp.fileObserving")
    }

    func start(closure: @escaping () -> Void) {
        // We can only convert an NSString into a file system representation
        let path = (file.path as NSString)
        let fileSystemRepresentation = path.fileSystemRepresentation

        // Obtain a descriptor from the file system
        let fileDescriptor = open(fileSystemRepresentation, O_EVTONLY)

        // Create our dispatch source
        let source = DispatchSource.makeFileSystemObjectSource(
            fileDescriptor: fileDescriptor,
            eventMask: .write,
            queue: queue
        )

        // Assign the closure to it, and resume it to start observing
        source.setEventHandler(handler: closure)
        source.resume()
        self.source = source
    }
}
We can now use FileObserver like this:
let observer = FileObserver(file: file)

observer.start {
    print("File was changed")
}
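One thing the example above leaves out is cleanup - the source is never cancelled and the file descriptor is never closed. As a hedged sketch, a hypothetical stop() method (added in the same file, so it can reach the private source property) could cancel the observation; closing the underlying descriptor could then be handled by a cancel handler set up in start():

extension FileObserver {
    // Hypothetical addition: stop observing by cancelling the dispatch source
    func stop() {
        source?.cancel()
        source = nil
    }
}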
Imagine all the cool developer tools that could be built using this! 😀
Conclusion
Grand Central Dispatch is a really powerful framework that does so much more than it might first appear. Hopefully this post has sparked your imagination in terms of what you can use it for, and I suggest giving it a try the next time you need to perform one of the tasks we took a look at above.
In my opinion, a lot of Timer- or OperationQueue-based code, as well as usage of 3rd party async frameworks, can actually be made simpler by using GCD directly.
What do you think? Do you know about another GCD feature that you find really useful? Let me know - along with your questions, comments or feedback - on Twitter, Mastodon or Micro.blog.
Thanks for reading! 🚀