Go back to Day 3
Whoops! I think I did too much on Day 3. I didn't do the pre-reading to realise that I didn't need to learn about all the Task stuff. Oh well! Let's get started.
## 1. Concept Overview
### Tasks
__LLM Says:__ The fundamental unit of async work. Created implicitly when you `await`, or explicitly with `Task { ... }`.
- Parent/child hierarchy: an async function is the parent; any work it spawns is a child. The parent waits for children before it returns (unless cancelled), and cancellation flows downward.
- Priorities & actors: a task has a priority; hop to the main actor with `@MainActor`/`MainActor.run` for UI updates.
I already looked up a lot of this on Day 3, so I don't feel like I need to do a whole lot of research. Task definitely is the fundamental unit of work; strictly speaking, `await` creates a suspension point rather than a task, and tasks themselves come from `Task { ... }`, `async let`, and task groups. I get that there's a parent/child hierarchy with tasks, with the function being the parent, and any work it spawns (via `async let` or a task group) being children.
Up until this point, I haven't really looked too much at Swift's MainActor. I know that Actors provide isolation guarantees, without giving much certainty about which threads code will run on. But this is a pretty good overview of the Main Actor.
Specifically, the Main Actor is one instance of a Global Actor, which acts as a singleton actor. The app has only one Main Actor, and the Main Actor always performs its tasks on the main thread. Any type conforming to the GlobalActor protocol has to provide a static property called `shared`, through which access to that actor is synchronized.
⚠️ Marking a function with `@MainActor` only guarantees it runs on the UI thread (main thread) if it's called from an asynchronous context. If you're calling a synchronous method from a non-isolated context, the method will be run on the same thread as the caller, regardless of any actor annotations. (The Swift 6 compiler will catch most of these, thankfully.)
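To make the global-actor idea concrete, here's a minimal sketch. The `DataStoreActor`, `saveRecords`, `refreshUI`, and `ingest` names are all made up for illustration:

```swift
// A custom global actor: conforming to GlobalActor just means providing a
// static `shared` instance that all annotated code is isolated to.
@globalActor
actor DataStoreActor {
    static let shared = DataStoreActor()
}

@DataStoreActor
func saveRecords(_ records: [String]) {
    // Runs isolated to DataStoreActor.shared
}

// The MainActor is the same mechanism, pre-built, and always on the main thread.
@MainActor
func refreshUI(with records: [String]) {
    // A safe place to touch UI state
}

func ingest(_ records: [String]) async {
    await saveRecords(records)        // hop onto the custom global actor
    await refreshUI(with: records)    // hop onto the main actor via the annotation
    await MainActor.run {             // ...or hop explicitly with MainActor.run
        print("Refreshed UI with \(records.count) records")
    }
}
```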
### `async let`
__LLM Says:__ Fire off a fixed, small number of sibling child operations in parallel inside the current scope.
- You must `await` (or otherwise resolve) each `async let` before leaving scope; if you don’t, it’s implicitly cancelled.
- Great for “fetch A, B, C at once,” where the arity is known at compile time.
Antoine van der Lee and Paul Hudson both have good articles about `async let`, and it's not difficult to see the utility.
Essentially, `async let` allows you to kick off an asynchronous task and continue execution to the next line of code. So you could run a few `async let`s in a row to download a number of photos on the screen, then only return from the function once they've all downloaded.
Like so (h/t Paul Hudson, linked above):
```swift
func getAppData() async -> ([String], [String], Bool) {
    async let news = getNews()
    async let weather = getWeather()
    async let hasUpdate = getUpdateAvailable()

    return await (news, weather, hasUpdate)
}
```
⚠️ The `await` at the end of the function is key there; if you don't actually wait for the tasks to complete at some point and the function reaches the end, then it's the same as cancelling the tasks that are still in flight.
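A quick sketch of that implicit cancellation, reusing the hypothetical `getNews()`/`getWeather()` helpers from Paul's example above:

```swift
func loadNewsOnly() async -> [String] {
    async let news = getNews()
    async let weather = getWeather()   // spun up, but never awaited below

    let latestNews = await news

    // Leaving the scope implicitly cancels the un-awaited `weather` child task;
    // the function still waits for that child to wrap up before returning.
    return latestNews
}
```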
### Task Groups (`with(Throwing)TaskGroup`)
For dynamic fan-out (N tasks based on runtime data) with automatic child management.
- Child failures in a `withThrowingTaskGroup` cancel remaining siblings and rethrow to the parent.
- You can `addTask`, optionally `await group.next()` to consume results as they finish (not in submission order).
Task Groups guarantee concurrency, but not parallelism. Thanks Jon Shier! That's a good little nugget to keep in mind.
Basically, you can call the global functions `withTaskGroup` or `withThrowingTaskGroup` from anywhere, and then put a dynamic set of tasks into that group. You then know that these tasks will be executed concurrently. You can also cancel them all, or use sequence operations (like `map` and `reduce`) on them, since `TaskGroup` also conforms to `AsyncSequence`.
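To make that concrete, here's a minimal sketch of a non-throwing group that fans out over runtime data and reduces the results as they finish. The `slowSquare` helper is made up for illustration:

```swift
// Made-up helper standing in for some real async work
func slowSquare(_ n: Int) async -> Int {
    try? await Task.sleep(nanoseconds: 100_000_000)
    return n * n
}

func sumOfSquares(_ numbers: [Int]) async -> Int {
    await withTaskGroup(of: Int.self) { group in
        for n in numbers {
            group.addTask { await slowSquare(n) }   // dynamic fan-out over runtime data
        }
        // TaskGroup conforms to AsyncSequence, so we can reduce over the
        // child results in whatever order they finish.
        return await group.reduce(0) { $0 + $1 }
    }
}
```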
### Cancellation
Cooperative: cancelling a task sets a flag and children are cancelled too. Your code should check and react.
How to observe/respond:
- `try Task.checkCancellation()` – throws `CancellationError` immediately if cancelled.
- `Task.isCancelled` – boolean; you decide how to exit.
- `withTaskCancellationHandler(operation:onCancel:)` – run immediate cleanup when the task is cancelled.
Design tip: make async APIs idempotent/cancellation-tolerant and return quickly on cancellation.
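As a minimal sketch of what "check and react" looks like in practice (the `processAll` function and its uppercasing "work" are made up):

```swift
// A cancellation-tolerant loop: check Task.isCancelled on each pass and
// return quickly with whatever partial work is done, rather than throwing.
func processAll(_ items: [String]) async -> [String] {
    var processed: [String] = []
    for item in items {
        if Task.isCancelled { break }         // react promptly, return partial results
        processed.append(item.uppercased())   // stand-in for real work
        await Task.yield()                    // let other tasks interleave
    }
    return processed
}
```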
## 2. Code Examples
### `async let` for small, fixed fan-out
So, based on what I've learned so far, this seems pretty sensible:
- A few `Decodable` structs that mirror some JSON data
- The `loadDashboard` method that returns a tuple of those structs. I'm not really in love with the tuple, but it's not awful if it's just being used locally.
- As we learned, you have to `await` any task you create with `async let`, eventually. So `loadDashboard` does.
```swift
import Foundation

struct User: Decodable { let id: Int; let name: String }
struct Profile: Decodable { let bio: String }
struct FeedItem: Decodable { let id: Int }

func loadDashboard(userID: Int) async throws -> (User, Profile, [FeedItem]) {
    async let user: User = get("/users/\(userID)")
    async let profile: Profile = get("/users/\(userID)/profile")
    async let feed: [FeedItem] = get("/users/\(userID)/feed")

    // All three requests run in parallel; throws if any throw
    return try await (user, profile, feed)
}

// Minimal typed GET helper
func get<T: Decodable>(_ path: String) async throws -> T {
    let url = URL(string: "https://api.example.com\(path)")!
    let (data, resp) = try await URLSession.shared.data(from: url)
    guard (resp as? HTTPURLResponse)?.statusCode == 200 else { throw URLError(.badServerResponse) }
    return try JSONDecoder().decode(T.self, from: data)
}
```
### Dynamic fan-out with a throwing task group
Here, we use `withThrowingTaskGroup(of:)` to dynamically fire off 'n' number of tasks. In this case, for each URL in an array, one task is created that will download that URL's data. Key points that I see:
- For each element in the `urls` array, we need to run `group.addTask` to actually create the task that will run concurrently.
- Once all the tasks are spun up, we then have to `try await` each one before returning the return value, an Array full of `Data` objects.
- I'm not sure why the LLM decided that it needed to create an Array and put a `Data()` instance in each element of the array. I suppose this creates the array at the correct size in the first place? But surely there's some overhead to creating 'n' `Data()` instances in the array, especially since each one is just going to be overwritten once the data comes in. (I sketch an alternative right after the code below.)
```swift
import Foundation

func fetchImages(_ urls: [URL]) async throws -> [Data] {
    try await withThrowingTaskGroup(of: (Int, Data).self) { group in
        for (i, url) in urls.enumerated() {
            group.addTask {
                let (data, resp) = try await URLSession.shared.data(from: url)
                guard (resp as? HTTPURLResponse)?.statusCode == 200 else { throw URLError(.badServerResponse) }
                return (i, data)
            }
        }

        // Collect as they finish; preserve original order
        var results = Array(repeating: Data(), count: urls.count)
        for try await (i, data) in group {
            results[i] = data
        }
        return results
    }
}
```
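On that last point: my understanding is that an empty `Data()` is cheap (it doesn't allocate a buffer until bytes are appended), so the placeholder approach is probably fine. Still, here's a rough sketch of an alternative that avoids the placeholders by collecting into a dictionary keyed by index, then rebuilding the original order at the end (the `fetchImagesViaDictionary` name is mine):

```swift
func fetchImagesViaDictionary(_ urls: [URL]) async throws -> [Data] {
    try await withThrowingTaskGroup(of: (Int, Data).self) { group in
        for (i, url) in urls.enumerated() {
            group.addTask {
                let (data, _) = try await URLSession.shared.data(from: url)
                return (i, data)
            }
        }

        // No placeholder values: store each result under its original index...
        var byIndex = [Int: Data](minimumCapacity: urls.count)
        for try await (i, data) in group {
            byIndex[i] = data
        }

        // ...then rebuild the original order once every child has finished.
        return urls.indices.map { byIndex[$0]! }
    }
}
```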
### SwiftUI view model that owns a cancellable `Task`
I love View Models. With apologies to Soroush Khanlou, I've found them to be exceptionally useful when I'm treating them essentially like a Presenter pattern.
Although, the class that the LLM has written below does NOT look like a View Model to me! It's got fetch requests, which doesn't jibe with MY concept of a View Model. I know that the term is nebulous at best, so I guess I can forgive the LLM here.
With that said, the point here is to look at how a SwiftUI view model class can own a cancellable task, and it's interesting:
- It's using cooperative cancellation to create a debouncing effect. By holding a reference to `currentTask`, and cancelling it immediately any time the `search()` function is called, we can have only one task in flight at any given time, and swap out that Task as soon as the user changes the search parameters.
- If the task is cancelled, the `try Task.checkCancellation()` will pick up on that and throw a `CancellationError`, which - in this method - is a no-op. The `results` array will stay as it was, and there will be no change in the UI.
- What's interesting is that the method itself sleeps for a little bit (300 million nanoseconds, i.e. 300ms) every time `search()` is called. This provides a bit of a 'debounce' window, so if the user is calling `search()` multiple times in rapid succession, `fetchResults()` will only actually run for the last one.
- The LLM uses `defer` here too, which I remember is akin to `finally` in other languages. Essentially, the `isLoading` boolean will be updated after the `fetchResults` task either finishes its work or is cancelled.
```swift
import SwiftUI

@MainActor
final class SearchViewModel: ObservableObject {
    @Published var query = ""
    @Published var results: [String] = []
    @Published var isLoading = false

    private var currentTask: Task<Void, Never>?

    func search() {
        currentTask?.cancel() // cancel in-flight work
        currentTask = Task { [query] in
            self.isLoading = true
            defer { self.isLoading = false }

            // Debounce-style pause; cancels quickly on new input
            try? await Task.sleep(nanoseconds: 300_000_000)

            do {
                try Task.checkCancellation()
                self.results = try await self.fetchResults(for: query)
            } catch is CancellationError {
                // Swallow cancellation
            } catch {
                self.results = []
            }
        }
    }

    func fetchResults(for query: String) async throws -> [String] {
        // Pretend network
        try Task.checkCancellation()
        return ["\(query) 1", "\(query) 2", "\(query) 3"]
    }

    func onDisappear() { currentTask?.cancel() }
}
```
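For context, here's roughly how I'd wire that up from a view. This is just a sketch; the `SearchView` name and layout are mine:

```swift
import SwiftUI

struct SearchView: View {
    @StateObject private var viewModel = SearchViewModel()

    var body: some View {
        NavigationStack {
            List(viewModel.results, id: \.self) { Text($0) }
                .searchable(text: $viewModel.query)
                // Each keystroke calls search(), which cancels the previous task
                .onChange(of: viewModel.query) { _ in viewModel.search() }
                .onDisappear { viewModel.onDisappear() }
        }
    }
}
```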
### Immediate cleanup with a cancellation handler
This one is for when we want to know IMMEDIATELY that the task has been cancelled. That lets us clean up right away, and set a flag so that the operation closure knows it should stop.
`withTaskCancellationHandler(operation:onCancel:isolation:)` is used here, and the `onCancel` closure details what should happen as soon as cancellation occurs.
It's worth noting here that the LLM didn't try to use the isolation parameter, which allows you to run the task in a different isolation context.
💣 ACTUALLY, looking at the Swift docs for this function, it looks like the LLM got the order of the parameters wrong! It should be `operation` first, THEN `onCancel`. (It also tried to write single bytes from `URLSession.AsyncBytes` straight to the file handle.) I've fixed both in the version below.
```swift
import Foundation

func download(to destination: URL, from url: URL) async throws {
    try await withTaskCancellationHandler {
        let (bytes, response) = try await URLSession.shared.bytes(from: url)
        guard (response as? HTTPURLResponse)?.statusCode == 200 else { throw URLError(.badServerResponse) }

        FileManager.default.createFile(atPath: destination.path, contents: nil)
        let handle = try FileHandle(forWritingTo: destination)
        defer { try? handle.close() }

        // AsyncBytes yields one UInt8 at a time, so buffer into chunks before writing
        var buffer = Data()
        for try await byte in bytes {
            buffer.append(byte)
            if buffer.count >= 64 * 1024 {
                try Task.checkCancellation()
                try handle.write(contentsOf: buffer)
                buffer.removeAll(keepingCapacity: true)
            }
        }
        try handle.write(contentsOf: buffer)
    } onCancel: {
        // Runs immediately on cancellation: remove the partial file
        try? FileManager.default.removeItem(at: destination)
    }
}
```
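And a quick sketch of a call site, just to show where cancellation would come from (the URL and destination here are placeholders):

```swift
import Foundation

let downloadTask = Task {
    try await download(
        to: FileManager.default.temporaryDirectory.appendingPathComponent("image.jpg"),
        from: URL(string: "https://api.example.com/large-image.jpg")!
    )
}

// Later, e.g. when the user leaves the screen:
downloadTask.cancel()   // onCancel fires immediately and removes the partial file
```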
## 3. Hands-On Exercises
### Parallel Profile Screen (`async let`)
In one async function, `async let` fetch User, Settings, and Notifications. Update the UI when all succeed; simulate one failing to confirm the error propagates.
```swift
import UIKit
import SwiftUI
import PlaygroundSupport

struct User: Codable { let name: String }
struct Settings: Codable { let canEdit: Bool }
struct Notifications: Codable { let messages: [String] }

struct AsyncLet: View {
    @State var user: User?
    @State var settings: Settings?
    @State var notifications: Notifications?

    // Called from a .task in the SwiftUI View
    func loadData() async {
        // Async fan-out with async let
        async let fetchUser = try getUser()
        async let fetchSettings = try getSettings()
        async let fetchNotifications = try getNotifications()

        do {
            // We await the results of the 3 `async let` calls here.
            // 🤔 So... what is the ACTUAL type of fetchUser/Settings/Notifications?
            let (serverUser, serverSettings, serverNotifications) = try await (fetchUser, fetchSettings, fetchNotifications)

            // Once we have all the data, we can update the view
            await MainActor.run {
                self.user = serverUser
                self.settings = serverSettings
                self.notifications = serverNotifications
            }
        } catch {
            print(error)
        }
    }

    var body: some View {
        VStack {
            Text(user?.name ?? "No User")
            Text("\(settings?.canEdit ?? false)")
            Spacer()
            VStack {
                if let notifications {
                    ForEach(notifications.messages, id: \.self) { string in
                        Text(string)
                    }
                } else {
                    Text("No Notifications")
                }
            }
        }
        .task {
            await loadData()
        }
    }
}

// MARK: - Fake network requests

func getUser() async throws -> User {
    try await Task.sleep(nanoseconds: 2 * 1_000_000_000)
    return User(name: "Francis Chary")
}

func getSettings() async throws -> Settings {
    try await Task.sleep(nanoseconds: 1 * 1_000_000_000)
    return Settings(canEdit: true)
}

func getNotifications() async throws -> Notifications {
    try await Task.sleep(nanoseconds: 1 * 1_000_000_000)
    return Notifications(messages: [
        "You do not have edit permission",
        "There is a spider on your face"
    ])
}

PlaygroundPage.current.liveView = UIHostingController(rootView: AsyncLet())
```
Ok, this seems to work. It's a bit ugly, but I get the idea. EXCEPT... I'm curious about something. What is the actual type of the `fetchUser`/`fetchSettings`/`fetchNotifications` objects? Xcode reports them as being `User`/`Settings`/`Notifications`, but in most languages, I would expect them to be wrapped by some kind of Future or Promise. But it seems like they're not.
I did some searching, and discovered the answer: the type of the `fetchUser` object IS in fact `User`. There's no wrapping Future there. Doug Gregor explained this on John Sundell's podcast about Swift 5.5's new concurrency features (start listening at 12:20 or so). Essentially, there are a couple of things going on here:
- Swift didn't have an existing 'Future' infrastructure to lean on when creating the `async`/`await` concept, like other languages have had
- Swift already DID have the `try`/`catch`/`defer` structures, which match up VERY well with `async`/`await`
- The Swift team wanted a more lightweight implementation. They handle this efficiently at a very low level, so the compiler simply knows that if you try to access or use a variable created by an `async let`, it's not allowed before you `await` it. So the type is always just the type of the variable, but the compiler will tell you if you try to access it too early.
This is different from Combine; in Combine, you'd get some kind of `AnyCancellable` back. With async/await, you have to use cooperative cancellation (i.e. calling `Task.checkCancellation()` or checking `Task.isCancelled` within your Task).
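A tiny sketch of what that compiler enforcement looks like, reusing the `getUser()` helper from my playground above:

```swift
func example() async throws {
    async let user = getUser()     // `user` is just a User, not some Future<User>

    // print(user.name)            // ❌ won't compile: reading `user` requires `try await`

    let loaded = try await user    // ✅ suspends until the child task finishes
    print(loaded.name)
}
```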
### Thumbnail Prefetcher (`TaskGroup`)
Given a list of 50 image URLs, use `withThrowingTaskGroup` to download thumbnails concurrently (cap with a simple semaphore or `TaskPriority.low`). Preserve original order; show progressive updates as items finish.
💣 Ok, this is annoying from the LLM. It's fine to say "Given a list of 50 image URLs", but it would be nice if it would actually GIVE me some URLs! Ok, I guess I'll let the LLM fix this:
> For the next task (Thumbnail Prefetcher (TaskGroup)) - any idea where I can get a list of 50 image URLs?
And that my friends, is how I learned about [picsum.photos](https://picsum.photos)! Extremely cool.
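For the record, here's roughly how I can build that list with picsum.photos's seeded URLs and feed it to the `fetchImages` function from the Code Examples section. Just a sketch, wrapped in a Task since I need an async context:

```swift
import Foundation

// 50 stable 200x200 thumbnail URLs from picsum.photos
let thumbnailURLs: [URL] = (1...50).map { i in
    URL(string: "https://picsum.photos/seed/\(i)/200/200")!
}

Task {
    let thumbnails = try await fetchImages(thumbnailURLs)
    print("Downloaded \(thumbnails.count) thumbnails")
}
```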
### Cancellable Type-ahead
In a SwiftUI search bar, start a new Task on each keystroke, sleep ~250–400ms, check cancellation, then hit a mock endpoint. Ensure results never “flash back” from stale tasks.
---
## 4. Interview Angle
__Q:__ When would you use async let vs. a Task Group?
__A:__
__Verification:__
---
__Q:__ How does cancellation propagate, and how do you make your code cancellation-friendly?
__A:__
__Verification:__
---
__Q:__ What guarantees does structured concurrency give you compared to ad-hoc DispatchQueue work?
__A:__
__Verification:__
---
## 5. Further Resources