
What’s @concurrent in Swift 6.2? – Donny Wals


Swift 6.2 is out, and it comes with a number of enhancements to Swift Concurrency. One of these features is the @concurrent declaration that we can apply to nonisolated functions. In this post, you'll learn a bit more about what @concurrent is, why it was added to the language, and when you should be using @concurrent.

Before we dig into @concurrent itself, I'd like to provide a little bit of context by exploring another Swift 6.2 feature called nonisolated(nonsending), because without it, @concurrent wouldn't exist at all.

And to make sense of nonisolated(nonsending), we'll go back to nonisolated functions.

Exploring nonisolated functions

A nonisolated function is a function that's not isolated to any specific actor. If you're on Swift 6.1, or you're using Swift 6.2 with default settings, that means a nonisolated async function will always run on the global executor.

In more practical terms, a nonisolated async function will run its work on a background thread.

For example, the following function will always run away from the main actor:

nonisolated 
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

While this is a convenient way to run code on the global executor, this behavior can be confusing. If we remove the async from that function, it will always run on the caller's actor:

nonisolated 
func decode<T: Decodable>(_ data: Data) throws -> T {
  // ...
}

So if we call this version of decode(_:) from the main actor, it will run on the main actor.

Since that difference in behavior can be unexpected and confusing, the Swift team has added nonisolated(nonsending). So let's see what that does next.

Exploring nonisolated(nonsending) functions

Any function that's marked as nonisolated(nonsending) will always run on the caller's executor. This unifies behavior for async and non-async functions, and it can be applied as follows:

nonisolated(nonsending) 
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

Whenever you mark a function like this, it no longer automatically offloads to the global executor. Instead, it will run on the caller's actor.

This doesn't just unify behavior for async and non-async functions, it also makes our code less concurrent and easier to reason about.

When we offload work to the global executor, we're essentially creating new isolation domains. The result is that any state that's passed to or accessed inside our function is potentially accessed concurrently if we have concurrent calls to that function.

This means that we must make the accessed or passed-in state Sendable, and that can become quite a burden over time. For that reason, making functions nonisolated(nonsending) makes a lot of sense. It runs the function on the caller's actor (if any), so if we pass state from our call site into a nonisolated(nonsending) function, that state doesn't get passed into a new isolation context; we stay in the same context we started out from. This means less concurrency, and less complexity in our code.

The benefits of nonisolated(nonsending) can really add up, which is why you can make it the default for your nonisolated functions by opting in to Swift 6.2's NonisolatedNonsendingByDefault feature flag.

When your code is nonisolated(nonsending) by default, every function that's either explicitly or implicitly nonisolated will be considered nonisolated(nonsending). This means that we need a new way to offload work to the global executor.
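As a sketch of how you might opt in with SwiftPM (the package and target names here are hypothetical), you can enable the upcoming feature flag through a Swift setting in your Package.swift:

```swift
// swift-tools-version: 6.2
import PackageDescription

let package = Package(
    name: "MyApp",
    targets: [
        .target(
            name: "MyApp",
            swiftSettings: [
                // Run nonisolated async functions on the
                // caller's actor by default.
                .enableUpcomingFeature("NonisolatedNonsendingByDefault")
            ]
        )
    ]
)
```

In Xcode projects, the same flag can be passed to the compiler via a build setting instead.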

Enter @concurrent.

Offloading work with @concurrent in Swift 6.2

Now that you know a bit more about nonisolated and nonisolated(nonsending), we can finally understand @concurrent.

Using @concurrent makes the most sense when you're using the NonisolatedNonsendingByDefault feature flag as well. Without that feature flag, you can continue using nonisolated to get the same "offload to the global executor" behavior. That said, marking functions as @concurrent can future-proof your code and make your intent explicit.

With @concurrent, we can ensure that a nonisolated function runs on the global executor:

@concurrent
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

Marking a function as @concurrent will automatically mark that function as nonisolated, so you don't have to write @concurrent nonisolated. We can apply @concurrent to any function that doesn't have its isolation explicitly set. For example, you can apply @concurrent to a function that's defined on a main actor isolated type:

@MainActor
class DataViewModel {
  @concurrent
  func decode<T: Decodable>(_ data: Data) async throws -> T {
    // ...
  }
}

And even to a function that's defined on an actor:

actor DataViewModel {
  @concurrent
  func decode<T: Decodable>(_ data: Data) async throws -> T {
    // ...
  }
}

You're not allowed to apply @concurrent to functions that have their isolation defined explicitly. Both examples below are incorrect, since the function would have conflicting isolation settings:

@concurrent @MainActor
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

@concurrent nonisolated(nonsending)
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

Knowing when to use @concurrent

Using @concurrent is an explicit declaration to offload work to a background thread. Note that doing so introduces a new isolation domain and will require any state involved to be Sendable. That's not always an easy thing to pull off.
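To make the Sendable requirement concrete, here's a minimal sketch (the type and function are hypothetical, not from the article): value types composed of Sendable properties are implicitly safe to move across isolation domains, which is what makes them cheap to pass into work you might offload.

```swift
import Foundation

// A hypothetical value type; structs whose stored properties are
// all Sendable conform to Sendable automatically, so instances can
// safely cross into the new isolation domain that offloading creates.
struct Measurement: Sendable {
    let value: Double
}

// A plain synchronous helper standing in for work you might
// eventually offload with @concurrent.
func total(of measurements: [Measurement]) -> Double {
    measurements.reduce(0) { $0 + $1.value }
}

let sum = total(of: [Measurement(value: 1.5), Measurement(value: 2.5)])
print(sum) // prints 4.0
```

Reference types, by contrast, usually need explicit synchronization (or an actor) before they can be marked Sendable, which is where the burden mentioned above comes from.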

In most apps, you only want to introduce @concurrent when you have a real problem to solve where more concurrency helps you.

An example of a case where @concurrent should not be applied is the following:

class Networking {
  func loadData(from url: URL) async throws -> Data {
    let (data, _) = try await URLSession.shared.data(from: url)
    return data
  }
}

The loadData function makes a network call that it awaits with the await keyword. That means that while the network call is in flight, we suspend loadData. This allows the calling actor to perform other work until loadData is resumed and data is available.

So when we call loadData from the main actor, the main actor is free to handle user input while we wait for the network call to complete.

Now let's imagine that you're fetching a large amount of data that you need to decode. You started off using default code for everything:

class Networking {
  func getFeed() async throws -> Feed {
    let data = try await loadData(from: Feed.endpoint)
    let feed: Feed = try await decode(data)
    return feed
  }

  func loadData(from url: URL) async throws -> Data {
    let (data, _) = try await URLSession.shared.data(from: url)
    return data
  }

  func decode<T: Decodable>(_ data: Data) async throws -> T {
    let decoder = JSONDecoder()
    return try decoder.decode(T.self, from: data)
  }
}

In this example, all of our functions would run on the caller's actor. For example, the main actor. When we find that decode takes a lot of time because we fetched a whole bunch of data, we can decide that our code would benefit from some concurrency in the decoding department.

To do this, we can mark decode as @concurrent:

class Networking {
  // ...

  @concurrent
  func decode<T: Decodable>(_ data: Data) async throws -> T {
    let decoder = JSONDecoder()
    return try decoder.decode(T.self, from: data)
  }
}

All of our other code will continue behaving like it did before by running on the caller's actor. Only decode will run on the global executor, ensuring we're not blocking the main actor during our JSON decoding.
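As a minimal, self-contained sketch of the decoding step itself (the Feed shape and JSON payload are made up for illustration, and the @concurrent attribute is omitted so the snippet runs on any recent toolchain):

```swift
import Foundation

// A hypothetical payload type for illustration.
struct Feed: Decodable {
    let title: String
}

// Sketch of the decode helper from the article, as a plain
// synchronous function so it can run anywhere.
func decode<T: Decodable>(_ data: Data) throws -> T {
    let decoder = JSONDecoder()
    return try decoder.decode(T.self, from: data)
}

let json = Data(#"{"title":"Swift 6.2"}"#.utf8)
let feed: Feed = try! decode(json)
print(feed.title) // prints "Swift 6.2"
```

The decoding logic is identical whether or not the function is marked @concurrent; the attribute only changes where the work runs.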

We made the smallest unit of work possible @concurrent to avoid introducing loads of concurrency where we don't need it. Introducing concurrency with @concurrent is not a bad thing, but we do want to limit the amount of concurrency in our app. That's because concurrency comes with a pretty high complexity cost, and less complexity in our code typically means that we write code that's less buggy, and easier to maintain in the long run.
