What a week it has been!

Apple launched Vision Pro and dropped a whole slew of new and updated APIs on us (Marco has a very useful overview of all the new frameworks presented at WWDC23).

Like many others, I wasn't lucky enough to win a ticket in the lottery, but it seems like those who did had a great time at the main event, trying out Vision Pro, attending the talk show, and enjoying the many community events!

I am in two minds about Apple Vision Pro: from an engineering standpoint, it is a marvel of technology, and the people who worked on it should rightfully be proud of what they've achieved. I very much look forward to trying it out myself once it becomes available in Apple Stores, and I am curious to see all the amazing ideas developers will implement for this new computing platform (I've got some ideas myself).

However, I am very sceptical about the impact a device like this will have on human-to-human interaction.

I agree with Nilay Patel - Vision Pro very much feels like a development platform: a device packed with a maximum of capabilities, so Apple can see what developers build with it and find out what sticks. Once it becomes clearer what the use cases are, the platform will likely evolve in one or more directions. I wouldn't be surprised if the future of spatial computing was not wearing goggles, but rather glasses, which would be a lot less intrusive.

What do you think, and what are you most excited about? Let me know by hitting that reply button!

Peter Friese  

Computer History