At WWDC24, Apple announced Translation, a new ML-powered framework in iOS 18 providing APIs to access the models powering iOS’s Translate app.

One nice surprise: the basic implementation is already available as of iOS 17.4, so developers can easily add a system-provided translation overlay in SwiftUI using the .translationPresentation() modifier. I first tried experimenting with this in June, but Xcode wasn't recognizing the modifier on my SwiftUI view. I poked around a little but decided to move on, figuring it was a version compatibility issue somewhere in my smoothie of Xcode, SDK, and simulator betas.
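Here's a minimal sketch of what that overlay looks like in practice (the view and state names are my own, and note the Translation import, which is required even though the modifier feels like part of SwiftUI):

```swift
import SwiftUI
import Translation  // Required: the modifier lives here, not in SwiftUI itself

struct TitleView: View {
    let title: String
    @State private var showTranslation = false

    var body: some View {
        Text(title)
            .onTapGesture { showTranslation = true }
            // Presents the system translation overlay for the text (iOS 17.4+)
            .translationPresentation(isPresented: $showTranslation, text: title)
    }
}
```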

Last week in Cupertino, I met an ML engineering manager on the Translation team, who unblocked me with a simple question: "Did you import Translation?" This feels obvious in retrospect, but my takeaway from WWDC was that this modifier came with SwiftUI itself, and that iOS 18's Translation framework would only be needed for more custom implementations.

I haven't shipped this in Art Museum yet, but here's my first stab at it. The most relevant use is for artwork titles, which are sometimes provided in other languages. My implementation first uses the Natural Language framework's NLLanguageRecognizer class to detect the probable language code of the title string. If it detects something other than English, it cross-checks against the supported translation languages. Translation's LanguageAvailability class is iOS 18 only, though, so I ran this in a Swift Playground and hardcoded an array of supported language codes to hold me over until the fall. If a title's language is supported, the translate icon is presented next to the title.
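That detection step can be sketched roughly like this; the helper name is mine, and the hardcoded set is a partial, illustrative stand-in for what LanguageAvailability would report on iOS 18:

```swift
import NaturalLanguage

// Partial, illustrative list of supported translation language codes.
// On iOS 18, Translation's LanguageAvailability can replace this hardcoded check.
let supportedTranslationCodes: Set<String> = ["de", "es", "fr", "it", "ja", "ko", "pt", "zh"]

/// Returns the detected language code if the title is non-English
/// and supported for translation; otherwise nil.
func translatableLanguage(for title: String) -> String? {
    let recognizer = NLLanguageRecognizer()
    recognizer.processString(title)
    guard let language = recognizer.dominantLanguage,
          language != .english,
          supportedTranslationCodes.contains(language.rawValue)
    else { return nil }
    return language.rawValue
}
```

In the view, a non-nil result is what drives whether the translate icon appears next to the title.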