Swift 6.2 is available and it comes with a number of improvements to Swift Concurrency. One of these features is the @concurrent declaration that we can apply to nonisolated functions. In this post, you'll learn a bit more about what @concurrent is, why it was added to the language, and when you should be using @concurrent.
Before we dig into @concurrent itself, I'd like to provide a little bit of context by exploring another Swift 6.2 feature called nonisolated(nonsending), because without that, @concurrent wouldn't exist at all. And to make sense of nonisolated(nonsending), we'll go back to nonisolated functions.
Exploring nonisolated functions
A nonisolated function is a function that's not isolated to any specific actor. If you're on Swift 6.1, or you're using Swift 6.2 with its default settings, that means that a nonisolated async function will always run on the global executor. In more practical terms, such a function runs its work on a background thread.
For example, the following function would run away from the main actor at all times:
nonisolated
func decode<T: Decodable>(_ data: Data) async throws -> T {
    // ...
}
While it's a convenient way to run code on the global executor, this behavior can be confusing. If we remove the async from that function, it will always run on the caller's actor:
nonisolated
func decode<T: Decodable>(_ data: Data) throws -> T {
    // ...
}
So if we call this version of decode(_:) from the main actor, it will run on the main actor, as sketched below.
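Here's a minimal sketch of that call-site behavior; the Feed type and FeedViewModel are hypothetical stand-ins:
import Foundation

struct Feed: Decodable {}

nonisolated
func decode<T: Decodable>(_ data: Data) throws -> T {
    try JSONDecoder().decode(T.self, from: data)
}

@MainActor
final class FeedViewModel {
    func handle(_ data: Data) throws {
        // decode(_:) is synchronous and nonisolated, so this call runs on
        // the calling actor; in this case that's the main actor.
        let feed: Feed = try decode(data)
        _ = feed
    }
}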
Since that difference in behavior can be unexpected and confusing, the Swift team has added nonisolated(nonsending). So let's see what that does next.
Exploring nonisolated(nonsending) functions
Any function that's marked as nonisolated(nonsending) will always run on the caller's executor. This unifies behavior for async and non-async functions, and it can be applied as follows:
nonisolated(nonsending)
func decode<T: Decodable>(_ data: Data) async throws -> T {
    // ...
}
Whenever you mark a function like this, it no longer automatically offloads to the global executor. Instead, it will run on the caller's actor. This doesn't just unify behavior for async and non-async functions, it also makes our code less concurrent and easier to reason about.
When we offload work to the global executor, we're essentially creating new isolation domains. The result is that any state that's passed to or accessed inside our function is potentially accessed concurrently if we have concurrent calls to that function.
This means that we must make the accessed or passed-in state Sendable, and that can become quite a burden over time. For that reason, making functions nonisolated(nonsending) makes a lot of sense. It runs the function on the caller's actor (if any), so if we pass state from our call site into a nonisolated(nonsending) function, that state doesn't get passed into a new isolation context; we stay in the same context we started out from. This means less concurrency, and less complexity in our code.
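To make that burden concrete, here's a small sketch; the DraftPost, upload, and save names are made up for illustration, and the first call is the one strict concurrency checking pushes back on:
import Foundation

// A non-Sendable reference type.
final class DraftPost {
    var title = ""
}

// With Swift 6.1 semantics (or Swift 6.2 without the feature flag discussed
// below), a nonisolated async function like this runs on the global executor.
nonisolated
func upload(_ draft: DraftPost) async {
    // ...
}

// This variant always runs on the caller's actor.
nonisolated(nonsending)
func save(_ draft: DraftPost) async {
    // ...
}

@MainActor
final class Editor {
    let draft = DraftPost()

    func publish() async {
        // This crosses into a new isolation domain, so the compiler wants
        // DraftPost to be Sendable; strict concurrency checking flags this
        // call as potentially racy.
        await upload(draft)

        // This stays on the main actor, so DraftPost doesn't need to be
        // Sendable here.
        await save(draft)
    }
}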
The benefits of nonisolated(nonsending) can really add up, which is why you can make it the default for your nonisolated functions by opting in to Swift 6.2's NonIsolatedNonSendingByDefault feature flag.
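If you're using Swift Package Manager, opting in could look something like the sketch below; the package and target names are placeholders, and the flag name is used exactly as written above:
// swift-tools-version: 6.2
import PackageDescription

let package = Package(
    name: "App",
    targets: [
        .target(
            name: "App",
            swiftSettings: [
                // Run nonisolated async functions on the caller's actor
                // by default.
                .enableUpcomingFeature("NonIsolatedNonSendingByDefault")
            ]
        )
    ]
)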
When your code is nonisolated(nonsending) by default, every function that's either explicitly or implicitly nonisolated will be considered nonisolated(nonsending). This means that we need a new way to offload work to the global executor.
Enter @concurrent.
Offloading work with @concurrent in Swift 6.2
Now that you know a bit more about nonisolated and nonisolated(nonsending), we can finally understand @concurrent.
Using @concurrent makes the most sense when you're also using the NonIsolatedNonSendingByDefault feature flag. Without that feature flag, you can keep using nonisolated to get the same "offload to the global executor" behavior. That said, marking functions as @concurrent can future-proof your code and make your intent explicit.
With @concurrent we can make sure that a nonisolated function runs on the global executor:
@concurrent
func decode<T: Decodable>(_ data: Data) async throws -> T {
    // ...
}
Marking a function as @concurrent will automatically mark that function as nonisolated, so you don't have to write @concurrent nonisolated. We can apply @concurrent to any function that doesn't have its isolation set explicitly. For example, you can apply @concurrent to a function that's defined on a main actor isolated type:
@MainActor
class DataViewModel {
    @concurrent
    func decode<T: Decodable>(_ data: Data) async throws -> T {
        // ...
    }
}
And even to a function that's defined on an actor:
actor DataViewModel {
    @concurrent
    func decode<T: Decodable>(_ data: Data) async throws -> T {
        // ...
    }
}
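As a hedged usage sketch (Feed stands in for a Decodable model of your own), calling decode(_:) from the main actor hops to the global executor for the decoding work and resumes the caller afterwards:
@MainActor
func render(_ data: Data, using viewModel: DataViewModel) async throws {
    // decode(_:) is @concurrent, and therefore nonisolated, so it runs on
    // the global executor while the main actor is free during the await.
    let feed: Feed = try await viewModel.decode(data)
    print(feed)
}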
You're not allowed to apply @concurrent to functions that have their isolation defined explicitly. Both examples below are invalid because the function would have conflicting isolation settings:
@concurrent @MainActor
func decode<T: Decodable>(_ data: Data) async throws -> T {
    // ...
}

@concurrent nonisolated(nonsending)
func decode<T: Decodable>(_ data: Data) async throws -> T {
    // ...
}
Knowing when to use @concurrent
Using @concurrent is an explicit declaration to offload work to a background thread. Note that doing so introduces a new isolation domain and will require any state involved to be Sendable. That's not always an easy thing to pull off.
In most apps, you only want to introduce @concurrent when you have a real problem to solve where more concurrency helps you.
An example of a case where @concurrent should not be applied is the following:
class Networking {
    func loadData(from url: URL) async throws -> Data {
        let (data, _) = try await URLSession.shared.data(from: url)
        return data
    }
}
The loadData function makes a network call that it awaits with the await keyword. That means that while the network call is active, we suspend loadData. This allows the calling actor to perform other work until loadData is resumed and data is available.
So when we call loadData from the main actor, the main actor would be free to handle user input while we wait for the network call to complete.
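For example, this hedged sketch (with a made-up endpoint URL) calls loadData from the main actor; the function suspends at the await while the request is in flight, leaving the main actor free:
@MainActor
func refreshFeed() async {
    let networking = Networking()
    do {
        // loadData suspends here until URLSession delivers a response, and
        // the main actor can keep handling user input in the meantime.
        let data = try await networking.loadData(from: URL(string: "https://example.com/feed")!)
        print("Loaded \(data.count) bytes")
    } catch {
        print("Failed to load feed: \(error)")
    }
}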
Now let's imagine that you're fetching a large amount of data that you need to decode. You started off using default code for everything:
class Networking {
    func getFeed() async throws -> Feed {
        let data = try await loadData(from: Feed.endpoint)
        let feed: Feed = try await decode(data)
        return feed
    }

    func loadData(from url: URL) async throws -> Data {
        let (data, _) = try await URLSession.shared.data(from: url)
        return data
    }

    func decode<T: Decodable>(_ data: Data) async throws -> T {
        let decoder = JSONDecoder()
        return try decoder.decode(T.self, from: data)
    }
}
In this example, all of our functions would run on the caller's actor, for example the main actor (assuming we've opted in to NonIsolatedNonSendingByDefault as described earlier). When we find that decode takes a lot of time because we fetched a whole bunch of data, we can decide that our code would benefit from some concurrency in the decoding department.
To do this, we can mark decode as @concurrent:
class Networking {
    // ...

    @concurrent
    func decode<T: Decodable>(_ data: Data) async throws -> T {
        let decoder = JSONDecoder()
        return try decoder.decode(T.self, from: data)
    }
}
All of our other code will continue behaving like it did before by running on the caller's actor. Only decode will run on the global executor, ensuring we're not blocking the main actor during our JSON decoding.
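Putting it all together, a hedged sketch of a main-actor call site (assuming the NonIsolatedNonSendingByDefault flag from earlier) would look like this: getFeed and loadData stay on the main actor, and only the decoding hops off of it:
@MainActor
func showFeed() async {
    let networking = Networking()
    do {
        // getFeed and loadData run on the main actor and simply suspend at
        // their awaits, while the @concurrent decode step runs on the
        // global executor so heavy JSON decoding never blocks the UI.
        let feed = try await networking.getFeed()
        print(feed)
    } catch {
        print("Failed to load feed: \(error)")
    }
}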
We made the smallest possible unit of work @concurrent to avoid introducing loads of concurrency where we don't need it. Introducing concurrency with @concurrent isn't a bad thing, but we do want to limit the amount of concurrency in our app. That's because concurrency comes with a pretty high complexity cost, and less complexity in our code typically means that we write code that's less buggy and easier to maintain in the long run.