Sponsored Link
Support Not only Swift: Sponsorship Opportunities Now Available
If you love the regular insights I provide on Swift, SwiftUI, Firebase, AI, ML and other tech topics, you can now help sustain and grow this newsletter by becoming a sponsor!
Reach a dedicated community of developers while supporting the creation of high-quality content that helps us all build better software.
Click the link below for details and the booking calendar.
Coming up

Sharing is caring!
I continue my series of (mostly) weekly livestreams on my journey to build an AI-powered read-it-later / second brain app.
One of the key use cases for any "Read it later" app is to make it convenient for users to add new links. At the moment, users have to add new links using an "Add new link" button inside the app, which is mildly inconvenient.
In this week's livestream, I will implement a share extension that allows users to share links right from inside their browser.
Join me live at 19:00 GMT / 20:00 CET / 12:00 PDT / 15:00 EDT!
Swift
iOS Coding Technique: The Hidden Power of Swift Result Type
Working with Result can sometimes feel cumbersome (when trying to remember the syntax for unpacking a Result, I either end up on Swift If Case Let or How Do I Write If Case Let in Swift?...), but reading this article gave me a new appreciation for its power.
I'm currently working on the authentication screens for Sofia, and the different states a Firebase Authentication flow can be in seem to be the perfect use case for Anthony's approach. I'll give it a try and report back!
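Since I always have to look up that unpacking syntax, here's a quick sketch of both ways to handle a Result - an exhaustive switch and the if case let shorthand. The AuthError and User types are made-up stand-ins for illustration, not Anthony's code or Sofia's actual model.

```swift
// Hypothetical types, just to have something to unpack.
enum AuthError: Error {
    case invalidCredentials
    case network(underlying: Error)
}

struct User {
    let displayName: String
}

func handle(_ result: Result<User, AuthError>) {
    // Exhaustive handling with a switch over both cases.
    switch result {
    case .success(let user):
        print("Signed in as \(user.displayName)")
    case .failure(.invalidCredentials):
        print("Wrong email or password")
    case .failure(let error):
        print("Sign-in failed: \(error)")
    }

    // The `if case let` shorthand when you only care about one case.
    if case let .success(user) = result {
        print("Welcome back, \(user.displayName)!")
    }
}
```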
SwiftUI
Taking screenshots from SwiftUI previews
Did you know you can take great screenshots from a SwiftUI preview? Stewart Lynch shows how to make this even more efficient by assigning a keyboard shortcut, and then passing the screenshots to Framous for framing.
The aspectRatio Modifier In SwiftUI
Presenting images so they take up all available space without being distorted is something every developer has to do on a regular basis. SwiftUI's aspectRatio view modifier is a key tool to solve the challenges we face when scaling images proportionally - but you can also use it on other types of views.
Gabriel also shows that you can animate it.
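As a quick refresher (not code from Gabriel's article - the asset name and values are placeholders), this is roughly what both uses look like: scaling an image proportionally into a fixed frame, and constraining a plain color view to a fixed ratio.

```swift
import SwiftUI

struct AspectRatioExample: View {
    var body: some View {
        VStack(spacing: 16) {
            // Fill a fixed frame while preserving the image's proportions,
            // clipping whatever overflows.
            Image("photo") // placeholder asset name
                .resizable()
                .aspectRatio(contentMode: .fill)
                .frame(width: 300, height: 200)
                .clipped()

            // aspectRatio also works on non-image views,
            // here forcing a 16:9 box.
            Color.blue
                .aspectRatio(16.0 / 9.0, contentMode: .fit)
        }
        .padding()
    }
}
```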
AI and ML
Creating App Intents using Assistant Schemas
If you didn't get the chance to attend Apple's Enhance your apps with Apple Intelligence and App Intents event, this blog post is a great alternative.
Antonella covers the basics of App Intents, and there's also a sample app that you can use to follow along.
This is a great way to get some hands-on experience with the framework.
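If you just want to see the basic shape of the framework before diving in, here's a minimal App Intent sketch - a plain intent without an assistant schema, with an intent name and parameter I made up for illustration (they're not from Antonella's post or the sample app).

```swift
import AppIntents

struct OpenReadingListIntent: AppIntent {
    // The title shown in Shortcuts and Spotlight.
    static var title: LocalizedStringResource = "Open Reading List"

    // A simple parameter the system can ask the user for.
    @Parameter(title: "Unread only", default: false)
    var unreadOnly: Bool

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would navigate to the reading list here.
        let dialog: IntentDialog = unreadOnly ? "Showing unread items." : "Showing all items."
        return .result(dialog: dialog)
    }
}
```

Assistant Schemas build on this by tagging intents with predefined schemas so the system knows how to interpret them - that's the part the article and sample app walk you through.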
Model Context Protocol (MCP)
Large language models (LLMs) are pretty powerful, but since they rely on their training data, anything outside of that training data is out of reach for them. They sometimes feel like a giant brain without any means to interact with the world around it.
A typical example is "how is the weather at my location" - an LLM cannot answer this question, as it has access to neither your location nor the weather.
Tool calling is a way to give LLMs the chops to answer questions like these. By registering a getWeather tool and a getLocation tool, you can empower the LLM to retrieve the information it needs to answer the question.
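To make that concrete, here's a rough sketch of what registering such tools could look like. The Tool type and everything about it is hypothetical - it's not the API of any particular SDK (nor of the MCP Swift package mentioned below), just an illustration of the idea that the model calls out to functions the host app provides.

```swift
// A hypothetical tool description: a name and description the LLM sees,
// plus a function the host app runs when the model requests the tool.
struct Tool {
    let name: String
    let description: String
    let run: ([String: String]) async throws -> String
}

let getLocation = Tool(
    name: "getLocation",
    description: "Returns the user's current city."
) { _ in
    "Hamburg" // in a real app, resolved via CoreLocation
}

let getWeather = Tool(
    name: "getWeather",
    description: "Returns the current weather for a given city."
) { arguments in
    let city = arguments["city"] ?? "unknown"
    return "Sunny, 21 °C in \(city)" // in a real app, fetched from a weather API
}

// The tool names and descriptions are advertised to the model; when it decides
// it needs one, the host executes it and feeds the result back into the chat.
let availableTools = [getLocation, getWeather]
```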
The Model Context Protocol (MCP) goes one step further - it establishes a protocol that allows developers to register their apps and services as servers which can then be used by LLMs.
Read Mattt's article to understand why establishing a standard like this is so important, and see some examples. Oh, and of course, Mattt created a Swift package that allows you to build MCP servers in Swift.
Tools
DocC for Multi-Platform Documentation
When I read the title of this blog post, I got really excited. Like many of you, I create code for several platforms, so being able to use DocC to power documentation (and interactive tutorials) for iOS, Android, and other platforms sounded like an amazing idea to me.
However, in my excitement, I forgot that Apple uses the term "multi-platform" when they talk about iOS, iPadOS, watchOS, tvOS, and visionOS - in short, the multiple platforms in their own ecosystem.
That said, this is still a great article! It provided valuable insight into how to set up a DocC build by using the DocC compiler directly (instead of using the SPM plugin).
At the end of the article, Alex casually mentions that he built this setup for a theming library of his own. The library looks quite promising, and it's no surprise its documentation also looks fantastic!
Lights, Camera, Action Button
The iPhone action button has a scalability issue - you can only use it for one thing at a time. When I first got my iPhone 16 Pro, I used the action button to open the camera app. A while ago, I changed it to open the Gemini app. But there is no way I can use the action button for different things depending on the context.
Enter the Get Current App Shortcut block...!
In this article, Joe shows how you can use this to implement context-sensitive behaviour for your phone's action button. What a fantastic idea!
If you've read the introductory comment to the previous issue of Not only Swift, you might have noticed a mistake: I wrote "I've been using Swift since it was originally introduced at WWDC 2024 [...]". However, Swift was introduced at WWDC 2014 - apologies for the typo and thanks to everyone who reached out to let me know about this mistake!
In this issue, we've got a couple of articles that show how to bring AI to Apple's platforms - using both App Intents (which is probably Apple's preferred approach) and MCP (Model Context Protocol), which allows you to connect literally any API to any LLM that supports this protocol.
Is this going to be an Apple Intelligence killer? I don't think so. If anything, Apple could potentially adopt aspects of MCP to enhance their own implementation.
BTW, Genkit also supports MCP (via genkitx-mcp), and I've got a bunch of ideas for using this. Stay tuned!
Peter