-
I started learning ASL this week!
-
Follow thatdeafzinester on IG and explore their zine collection to learn more about audism.
-
Working in the KEXP Gathering Space today. Seeing into the studio, I’m feeling inspired that a single human in front of a microphone is reaching people all over the world and helping make their day a little better.
-
I wish you could share public read-only iCloud links to Freeform boards.
-
📺 (via YouTube): Inside Apple's Audio Labs
ABC News’ Rebecca Jarvis visits Apple’s secret audio labs where the company’s engineers have spent years working on technology that can turn a set of AirPods into an FDA-cleared hearing aid.
-
Could a Mac app with Rogue Amoeba-esque permissions keep a timeline history of your sonic alerts and notifications? I’ll often hear a sound and have no idea where it came from.
-
Been making an effort to see more live music and it’s always so nourishing.
-
new OS?! cloudOS for server hardware/private cloud compute services
no developer story yet, maybe never, but imagine CloudKit offering serverless compute for 3P apps to run private inference jobs on bigger models
-
I think my recent rental car had “next gen” CarPlay. The main display had the usual driving directions, but a secondary route overview was shown in the driver dashboard. Interesting too: based on the screenshot, CarPlay provides a full-width asset and Volvo’s OS blended in a portion of it.
-
Usually a few minutes crawling around on the ground leads to the cat-smacked AirPod, but not this time.
-
🔊 Two Voice Devs Episode 211 – I guested this week to explore the evolving developer canvas across Siri, Shortcuts, App Intents, and Apple Intelligence, with some throwbacks to how music, YouTube, and Alexa led me to now.
-
Such a letdown!
Phil Schiller introducing Siri at the iPhone 4S launch event in October 2011.
For decades, technologists have teased us with this dream that you’re gonna be able to talk to technology and it’ll do things for us. Haven’t we seen this before over and over? But it never comes true. We have very limited capability. We just learn a syntax. Call a name, dial a number, play a song. It is such a letdown! What we really want to do is just talk to our device.
-
They're in the AI area.
Walt Mossberg and Kara Swisher interviewing Steve Jobs at the All Things D conference in June 2010.
Walt: Last year at our conference we had a small search company called Siri.
Steve: Yeah. Well I don’t know if I would describe Siri as a search company.
Walt: Ok but it’s a search related company… you now own them right?
Steve: Yeah. We bought them.
Kara: Why?
Walt: There was a lot of speculation, well this company is kind of in the search area.
Steve: No, they’re not in the search area.
Walt: What are they in? How would you describe it?
Steve: They’re in the AI area.
-
Trying to stay calm.
-
📸 “Macintosh Killed the Public Phone Call” by Michael Leavitt
📍Mini Mart City Park, Georgetown, Seattle
🔗 Trash Talking
Reclaiming the packaging from half an Apple desktop, 8 accessories, 2 iPhones and an iPad made this sentimental lament to the pay phone possible.
-
Can you not rearrange the file order in Xcode anymore? Alphabetical only now? Or did a setting change?
-
import Translation
At WWDC24, Apple announced Translation, a new ML-powered framework in iOS 18 providing APIs to access the models powering iOS’s Translate app.
One nice surprise – the basic implementation is already available as of iOS 17.4 – so developers can easily add a system-provided translation overlay in SwiftUI using the .translationPresentation() modifier. I first tried experimenting with this in June, but Xcode wasn’t recognizing the modifier on my SwiftUI view. I poked around a little but decided to move on, figuring it was a version compatibility issue somewhere in my smoothie of Xcode, SDK, and simulator betas.
Last week in Cupertino, I met an ML engineering manager on the Translation team, who unblocked me with a simple question: “Did you import Translation?” This feels obvious in retrospect, but my takeaway from WWDC was that this modifier came with SwiftUI itself, and that iOS 18’s Translation framework would only be needed for more custom implementations.
I haven’t shipped this in Art Museum yet, but here’s my first stab at it. The most relevant use is for artwork titles, which are sometimes provided in other languages. My implementation first uses the Natural Language framework’s NLLanguageRecognizer class to detect the probable language code of the title string. If it detects something other than English, it cross-checks against the supported translation languages. Translation’s LanguageAvailability class is iOS 18 only, though, so I ran this in a Swift Playground and hardcoded an array of supported language codes to hold me over until the fall. If a title’s language is supported, the translate icon is presented next to the title.
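Here’s a minimal sketch of that flow, assuming a hardcoded stand-in set of supported language codes until LanguageAvailability can be queried at runtime. ArtworkTitleView, supportedLanguageCodes, and the SF Symbol name are illustrative, not the shipping implementation.

```swift
import SwiftUI
import NaturalLanguage
import Translation

struct ArtworkTitleView: View {
    let title: String
    @State private var showTranslation = false

    // Hardcoded stand-in until iOS 18's LanguageAvailability can be
    // queried at runtime; this subset is illustrative, not exhaustive.
    private let supportedLanguageCodes: Set<String> = ["fr", "de", "es", "it", "ja"]

    var body: some View {
        HStack {
            Text(title)
            if isTranslatable {
                // Symbol name is an assumption; any translate glyph works.
                Button { showTranslation = true } label: {
                    Image(systemName: "translate")
                }
            }
        }
        // System-provided translation overlay, available as of iOS 17.4.
        .translationPresentation(isPresented: $showTranslation, text: title)
    }

    // Detects the title's dominant language with NLLanguageRecognizer and
    // offers translation only for supported, non-English titles.
    private var isTranslatable: Bool {
        let recognizer = NLLanguageRecognizer()
        recognizer.processString(title)
        guard let language = recognizer.dominantLanguage, language != .english else {
            return false
        }
        return supportedLanguageCodes.contains(language.rawValue)
    }
}
```
-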
FB14827045
iOS 17 introduced a new synonyms API to the App Intents framework. Developers can provide an array of synonyms on their AppEntity or AppEnum cases, which is supposed to give users more flexibility with their spoken App Shortcut phrases. But in my experience, these synonyms are not actually recognized by Siri or Spotlight.
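For reference, here’s a minimal sketch of the API in question, using a hypothetical MuseumEntity rather than Art Museum’s actual code; in iOS 17, synonyms ride along on DisplayRepresentation.

```swift
import AppIntents

// Hypothetical entity for illustration, not the shipping Art Museum code.
struct MuseumEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Museum")
    static var defaultQuery = MuseumQuery()

    var id: String
    var name: String
    var synonyms: [LocalizedStringResource]

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(
            title: "\(name)",
            // The iOS 17 synonyms API: alternate spoken names that Siri
            // and Spotlight are supposed to match against.
            synonyms: synonyms
        )
    }
}

struct MuseumQuery: EntityQuery {
    // One hardcoded entity keeps the sketch self-contained.
    private let museums = [
        MuseumEntity(
            id: "met",
            name: "The Metropolitan Museum of Art",
            synonyms: ["the Met", "Metropolitan Museum"]
        )
    ]

    func entities(for identifiers: [String]) async throws -> [MuseumEntity] {
        museums.filter { identifiers.contains($0.id) }
    }

    func suggestedEntities() async throws -> [MuseumEntity] {
        museums
    }
}
```
-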
Headed to Cupertino next week for a session on Apple Intelligence, Siri, and App Intents at the new Apple Developer Center!
-
🔊 Seattle Seafair Sounds
Some audio highlights of the Blue Angels' pilots during their Seafair rehearsal yesterday.
Signal path:
- Uniden Bearcat BCD536HP radio scanner with WiFi dongle
- Siren, Uniden’s control app for iPad (last updated 9 years ago but able to install on M1 Mac!)
- Audio Hijack by Rogue Amoeba
- iZotope RX Spectral De-noise
First time trying out Audio Hijack’s new Transcribe block (still in beta) – powered by OpenAI’s Whisper.
inaudible inaudible inaudible inaudible inaudible inaudible (static) (Growl) (Growl) (Growl) (Growl) (Growl) (Growl) (Growl) (Growl) (Growl) (Growl) (Growl) silence
-
Voice is sound.
-
Meet Art Museum, my first iOS app. Explore artwork from The Met, Art Institute of Chicago, and Cleveland Museum of Art. With a swipe. Or with your voice.
It’s powered by SwiftUI, App Intents, and CloudKit from Apple – along with open access data and imagery – published by these museums under Creative Commons Zero (CC0) to encourage download, remix and reuse.
Two years ago at WWDC22, Apple introduced App Intents.
An app intent represents something people can do inside of your app, and it makes it possible to do it from outside of your app.
– WWDC22, Platforms State of the Union, Ari Weinstein, Shortcuts Engineering Manager
I’d been building with Alexa for five years, making conversational sound on smart speakers. The launch of App Intents electrified my focus and creative energy – so I charted a new course. I joined the Apple Developer Program, started learning Swift, and dove into Apple’s APIs across speech, language and sound.
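If you haven’t seen one, here’s a minimal sketch of what an app intent looks like in code; the name and behavior are illustrative, not Art Museum’s actual implementation.

```swift
import AppIntents

// Something a person can do inside the app, exposed to Siri, Shortcuts,
// and Spotlight outside of it. Illustrative, not Art Museum's real code.
struct ShowRandomArtworkIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Random Artwork"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // A real implementation would navigate to a random artwork here.
        return .result()
    }
}
```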
Art Museum is from Just By Speaking – a new imprint in Seattle exploring the canvas outside of your app.
-
Final WWDC Lab today with the CloudKit team! ☁️
CloudKit – Friday, June 14, 2024
-
WWDC Day 3 Labs: App Intents/Siri this morning and Design consult this afternoon. Stoked for both.
-
WWDC Labs are great. I got detailed feedback from the accessibility design team, including from a visually impaired person on my VoiceOver implementation.