Concurrency in Swift 3


Concurrency is vital. Without it, our computers and smartphones wouldn’t provide the seamless user experience we’ve come to rely on. Today’s computers and operating systems can start, execute, and finish multiple tasks within the same time period. This allows us to interact with the UI while the app performs background tasks like networking, file I/O, database queries, or other long-running operations.

While users undoubtedly benefit from concurrent programming, actually implementing the concept can be a challenge. Back in the Computer Stone Age (about forty years ago), personal computers weren’t capable of running multiple tasks at once. Then CPU architectures evolved, and computers started supporting the execution of multiple processes or threads. This was great, but operating systems and applications still lagged behind.

But like anything that seems too good to be true, concurrency comes with several hazards. As a developer, you’ll want to watch out for a series of potential issues (think “can of worms”: once you open it, be prepared for trouble).

 

Multithreading

Of course, programming languages didn’t make it easy to implement multithreading. Most languages simply provided access to the native, low-level threading APIs and constructs of the underlying operating system—and each operating system used a different threading API. 

Standardized solutions soon appeared, including POSIX Threads. PThreads is a C library that defines a set of thread-management functions which OS vendors implement. This approach lets you use the same interface across multiple platforms. PThreads is supported by UNIX-like platforms including iOS, macOS, and Linux.

Thread pools are another concept meant to simplify and abstract threading. At the core of this pattern is the idea of keeping a number of pre-created, idle threads ready to be utilized. Whenever there’s a new task to execute, an idle thread wakes up, performs the task, and then returns to the pool.

So, why would you want to create a bunch of threads to keep around? In one word: performance.

Instead of creating a new thread whenever a task is to be executed (and then destroying it when the task finishes), available threads are taken from the thread pool. Thread creation and destruction is an expensive process, so the thread pool pattern offers considerable performance gains. Letting the library or operating system manage the threads means that you have less to worry about (read: fewer lines of code to write). Besides, the library can optimize the thread management behind the scenes.

Concurrency and parallelism

Concurrent tasks can be executed via parallelism. Parallelism is often confused with concurrency, and while these concepts are related, it’s important to know that they’re different things. Parallelism can only be achieved on multi-core devices: while one core executes one task, another core can run a second task simultaneously.

Parallelism, however, isn’t required for concurrency. With single core devices, concurrency is achieved via context switching: The core runs one task for some time, then switches to the other task or process, runs it, then switches back to the previous task, and so on, until the task is complete.

Concurrency via context switching doesn’t ruin the illusion, because the switching happens quickly. With true parallelism, the execution of concurrent tasks is snappier. Furthermore, a context switch requires storing and restoring the execution state when switching between threads, which means additional overhead.

Grand Central Dispatch (GCD)

Grand Central Dispatch (GCD) is Apple’s concurrency framework, and it relies on the thread-pool pattern. GCD was first made available in 2009 with Mac OS X 10.6 Snow Leopard and iOS 4. At the core of GCD is the idea of work items, which can be dispatched to a queue; the queue hides all thread-management tasks. You can configure the queue, but you won’t interact directly with any threads. This model simplifies the creation and execution of asynchronous and synchronous tasks.

GCD abstracts away the notion of threads and exposes dispatch queues to handle work items (work items are simply the blocks of code you want to execute). These tasks are assigned to a dispatch queue, which processes them in First-In-First-Out (FIFO) order.

Serial and concurrent GCD queues

There are two types of queues in GCD: serial and concurrent.

First, let’s talk about serial queues. If you submit work items to a serial queue, they’ll be executed one after the other in the order they were added. Since it’s a serial queue, no concurrency of any kind is involved. A serial dispatch queue always executes one work item at a time.
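This FIFO behavior can be sketched in a few lines (the queue label below is illustrative):

```swift
import Foundation

// A minimal sketch: work items on a serial queue run one at a time,
// in the exact order they were submitted.
let serialQueue = DispatchQueue(label: "com.example.serial")
var order: [Int] = []

for i in 1...3 {
    serialQueue.async {
        order.append(i) // safe: only one work item runs at a time
    }
}

// A sync call on the same serial queue runs after the items above,
// so by the time it executes, all three appends have finished.
serialQueue.sync {
    print(order) // [1, 2, 3]: strict FIFO execution
}
```

Because the queue serializes access, the shared `order` array needs no locks here.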

Now, let’s take a quick look at concurrent dispatch queues. Work items submitted to a concurrent dispatch queue start in the order they were added to the queue. The number of tasks that run simultaneously, and the time it takes to start the next task, are controlled by the queue. It’s important to note that we can’t influence this behavior.

Also, it’s entirely hidden whether concurrency is achieved via parallelism, or via context switching.

GCD hides these details from us, and the result is a very simple API, which was further refined in Swift 3.0.

The following snippet illustrates the simplicity of concurrently executing work items using a GCD concurrent dispatch queue:

// create the concurrent queue
let asyncQueue = DispatchQueue(label: "asyncQueue", attributes: .concurrent)

// perform the task asynchronously
asyncQueue.async {
    // perform some long-running task here
}

Isn’t it elegant? Creating the concurrent queue is a breeze. First, we pass in a unique label to identify the queue; then we specify its concurrent nature by setting the attributes argument to .concurrent. (Note: if you leave out the attributes argument, the queue will be serial by default.)
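To see that work items on a concurrent queue start in FIFO order but may finish in any order, here’s a short sketch. The labels and sleep intervals are illustrative, and a DispatchGroup (not part of the snippet above) is used to wait for everything to complete:

```swift
import Foundation

// Work items *start* in FIFO order on a concurrent queue,
// but may *finish* in any order.
let workQueue = DispatchQueue(label: "work", attributes: .concurrent)
let resultQueue = DispatchQueue(label: "results") // serializes access to `finished`
let group = DispatchGroup()
var finished: [Int] = []

for i in 1...4 {
    workQueue.async(group: group) {
        // Simulate work; later items sleep less, so they tend to finish first.
        Thread.sleep(forTimeInterval: Double(5 - i) * 0.02)
        resultQueue.async(group: group) {
            finished.append(i)
        }
    }
}

group.wait() // block until every work item has completed
print(finished) // likely [4, 3, 2, 1], but the order isn't guaranteed
```

Appending to `finished` happens on a separate serial queue, which is one simple way to avoid a data race when concurrent work items report their results.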

GCD in Swift 3

GCD was completely revamped in Swift 3. In previous Swift versions, GCD exposed a set of C functions; Swift 3 moved away from the C API, and all global C functions are gone. What we now have are dedicated Swift types, member functions, and properties—all of which provide a more natural Swift interface. You can read more in the Swift Evolution proposal SE-0088, “Modernize libdispatch for Swift 3 naming conventions.”

Now, using the queue is even easier. Just call the queue’s async() method, and pass in the block of code that you wish to execute in the background. Usually, you’ll want to run time-consuming code asynchronously to prevent blocking the main UI thread.
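Here’s a sketch of that pattern. The loadItems function is a hypothetical stand-in for real long-running work, the print call stands in for a UI update, and the run-loop pumping at the end exists only to keep this command-line demo alive (in an app, the main run loop is already running):

```swift
import Foundation

// Hypothetical stand-in for a slow operation (networking, a database query...).
func loadItems(completion: @escaping ([Int]) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let items = (1...5).map { $0 * $0 } // long-running work, off the main thread
        DispatchQueue.main.async {
            completion(items) // back on the main queue: safe to update the UI here
        }
    }
}

var done = false
loadItems { items in
    print("Loaded \(items.count) items") // stands in for a UI update
    done = true
}

// Command-line demo only: pump the main run loop so main-queue blocks execute.
while !done {
    _ = RunLoop.main.run(mode: .default, before: Date().addingTimeInterval(0.05))
}
```

Dispatching the completion back to the main queue is the conventional way to hand results to UI code, since UIKit and AppKit must only be touched from the main thread.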

It may seem like we’ve covered quite a bit here, but we’ve barely scratched the surface. Above all else, remember that Grand Central Dispatch makes it easier to support concurrency in our apps. Instead of managing threads, locks and semaphores, you can now focus on the real implementation task. Running computationally expensive tasks in the background has probably never been easier. Once you completely grasp the concept of thread pools, dispatch queues and work items, adding concurrency to your apps will be almost as easy as any other typical programming task.

Learn more about creational design patterns and concurrency in Swift 3 here.

Contributor

Károly Nyisztor

Károly is a veteran (mobile) developer having built several successful iOS apps and games, most of which were featured by Apple. He is the author of three books on programming and game development, and has worked with companies such as Apple, Siemens - Evosoft, SAP, Zen Studios, and many more. Most of his days are spent as a professional software engineer and IT architect. As an instructor, his aim is to share 20+ years of software development expertise, and change the lives of students throughout the world. His passion is helping people reveal hidden talents, and guide them into the world of startups and programming. He currently teaches object-oriented software design, iOS programming, Objective-C, Swift, and UML.