
The Talk Show

402: ‘Live From WWDC 2024’, With John Giannandrea, Craig Federighi, and Greg Joswiak

 

00:00:00   Good evening, you beautiful geeks! Welcome to San Jose's historic California Theater.

00:00:13   We are thrilled to once again be live and in person at Apple's Worldwide Developer Conference

00:00:19   2024. Now won't you please silence your devices, then open your ears and put your hands together

00:00:37   for our host, John Gruber.

00:00:37   Hello! Welcome to The Talk Show. I am your host, John Gruber. Most of you here are probably

00:00:52   familiar with me. Welcome! I was told we were backstage and the house sound guys were like,

00:01:01   "This crowd is really loud. What do you do?" It is great. The crowd is always great here.

00:01:11   It does feel a little extra energetic and with good reason. I think we're going to have

00:01:16   a very good show. But before we get to the meat of the show, I have some very special

00:01:23   people to thank. These are our sponsors, without whom we would not be here, trust me. They

00:01:30   are amazing sponsors, three of them. The first amazing sponsor is iMazing. iMazing 3 is the

00:01:40   all new version of the world's best iPhone manager for both Mac and Windows. Since the

00:01:47   iPhone's release way back when, the iMazing team has been delivering the missing features

00:01:53   you wish existed for working with your iPhone, iPad, or if you still have one, an iPod, and

00:01:59   your computer. I don't know, Phil Schiller obviously still has an iPod. I know he flies

00:02:05   with it. Version 3 of iMazing features a brand new user interface. It is so beautiful. It

00:02:12   is just a classic indie Mac app, beautiful, simple, gorgeous, makes you want to use it

00:02:20   just by looking at it. Version 3 has this new user interface. It is easier than ever

00:02:24   to discover, to explore, to get these features. iMazing serves individuals and families --

00:02:31   it looks very consumerish, but it is very powerful -- and small businesses and even large

00:02:36   enterprises can use it to efficiently manage their Apple mobile devices.

00:02:43   It includes powerful local backups with incremental snapshot support, text message extraction

00:02:51   options, whether for compliance or keepsakes. You can tell from that word compliance, probably

00:02:56   not about families, a little bit more for the enterprise. But seriously, it is

00:03:01   one of those tools that scales from you to really big companies. Versatile data extraction,

00:03:09   file transfer, it is natively coded for the Mac by a passionate team in Switzerland for

00:03:14   an unparalleled no compromise experience, trusted by millions of users all over the

00:03:20   world. Check out iMazing 3 today and save 20% by visiting iMazing.com/thetalkshow. That

00:03:31   is iMazing.com/thetalkshow. Second, my good friends at Flexibits, the makers of Fantastical

00:03:46   and Cardhop. These are, you guys probably have heard of them, these are amazing. Fantastical

00:03:54   is a calendar, Cardhop is a contact manager, absolutely gorgeous, previous Apple design

00:04:01   award winner and they are for every platform Apple makes. Of course the Mac, iPhone, iPad,

00:04:10   they even have right on day one a fantastic version of Fantastical for Apple Vision Pro.

00:04:20   Really gorgeous. Now I'm not naming names, but there are other companies that don't have

00:04:25   a calendar app that is native for Vision Pro. Fantastical was native on day one and it is

00:04:33   gorgeous. How Fantastical for Vision did not win an Apple design award this year, I do

00:04:39   not know. I got to find Galenzi in the audience after the show and petition him, I don't know,

00:04:45   must be a back story there, but it is beautiful, beautiful apps. They also have Flexibits Premium

00:04:52   is their subscription service. You pay for Flexibits Premium and you get it all. You

00:04:57   get Cardhop and Fantastical for all the platforms. You don't just pay per platform, you just

00:05:03   pay one thing, you get them all. They are great apps, they work together very well and

00:05:09   they have been using the fact that it is a subscription service to build out an infrastructure

00:05:15   on the web to do things like doing the thing where you pick a time for an event and people

00:05:25   coordinate that sort of thing. All super, super secure, they only send the data that

00:05:30   they need to the cloud, they don't keep anything they don't need, but to coordinate picking

00:05:34   time, stuff like that, you go to the web, you send an invitation, people can say, "Oh,

00:05:39   I'm free at 2, I'm free at 2, he's free at 3, everybody's free at 2:30, then we'll do

00:05:45   it at 2:30." That sort of stuff is all built into the Flexibits Premium service, it is

00:05:51   fantastic. And serious, serious stuff like integration with Microsoft 365 and totally

00:05:57   fun stuff like stickers that you can put in messages. I mean, the whole scale from totally

00:06:02   serious enterprise to putting stickers in messages with the Fantastical mascot. Really,

00:06:09   really great. There is a special deal for this show for new and existing customers: new customers

00:06:15   get 20% off when you start a subscription. Lots of people who watch my show probably

00:06:20   already subscribe, what's in it for you? Well, guess what? Go to this code, go to this URL

00:06:25   and when your renewal comes up, just by having gone to this, you'll get 20% off when you

00:06:31   renew. Everybody gets 20% off, new and existing, by going to flexibits.com/wwdc2024. That's

00:06:39   F-L-E-X-I-B-I-T-S flexibits.com/wwdc2024. Third, Flighty. How many people here already

00:06:58   use Flighty? I use Flighty, it is fantastic. Flighty, another Apple Design Award winner

00:07:10   back in 2023, way back last year. Every download of Flighty, you get the pro features complimentary

00:07:19   for your first flight. You do not need to sign up for a subscription and then cancel

00:07:23   it or anything. Try Flighty next time you're flying. You get all the pro features, you

00:07:29   can see what it's like. I guarantee you, you will sign up after you've flown with Flighty

00:07:34   compared to flying without it. Every year, there's a new record for problems with flights.

00:07:41   I mean, I don't think I'm making this up here. Flying is getting worse. Last minute delays,

00:07:50   more disruptions, plus the airlines are secretive and they don't really want to keep you up

00:07:56   to date. Flighty hooks into all of the data sources and APIs that the airline industry

00:08:01   has and keeps you up to date. I cannot tell you just in the last year how many times Flighty

00:08:05   has told me about gate changes. It's not just like, oh, you go from A-10 to A-8, but when

00:08:10   you go from gate A-23 to gate D-12. It's like, oh, that's 20 minutes away in the airport.

00:08:18   Flighty tells you as soon as it happens. You're not at the wrong end of the airport. It is

00:08:21   absolutely fantastic. They've got great features for sharing your flights with friends or friends

00:08:29   sharing your flights with you. So if somebody in your family or dear friend is traveling,

00:08:34   put their flight into Flighty and you get everything you need to know and of course

00:08:38   hooks up to all of the latest stuff that you would want it to. Live Activities on the

00:08:43   home screen, everything you want. It is so fantastic, so beautiful. Honestly, the airlines

00:08:50   just want you to sit quietly and just be uninformed at the gate, get there three hours early and

00:08:54   just sit there on your iPhone. No, get Flighty. You will be there when you need to be there

00:09:00   without wasting time and you will be informed of everything you need to know. Go to flighty.com/tts.

00:09:10   The talk show, TTS, flighty.com/tts. My thanks to all of those sponsors. And now with no

00:09:26   further ado, I have two very special guests to introduce. Ladies and gentlemen, let's

00:09:34   hear it for Craig Federighi and Greg Joswiak. Craig. >> Fired up? I'm fired up. >> You

00:09:47   shrunk. >> I'm not really Craig Federighi. I can never pull off the real hair. Anyway. All

00:09:55   right. Let's bring up the real Craig. >> I think they got both Craig Federighi and Greg

00:10:07   Joswiak at the same time. >> They did. It's a twofer. >> That was fantastic. >> That's as

00:10:16   close as Craig hair I'm ever going to get to. >> All right. >> Going in the opposite

00:10:21   direction. >> How many people in the audience saw last year's show? How many people remember

00:10:29   the end of last year's show? Mr. Federighi is one hell of a guitar player and did some

00:10:42   shredding. We don't have anything like that today. Now, how many people here were at Apple

00:10:50   Park for the keynote yesterday? And I'm sure for everybody watching in post on YouTube,

00:11:01   I'm sure everybody has watched the keynote. But for those who were in Apple Park the last

00:11:07   couple of years since this new format has been out, the keynote starts right at 10 a.m.

00:11:12   but around 9:55, Tim comes out, welcomes the crowd, has Craig come out to also say hello

00:11:18   to the crowd, and Craig comes out and fibs to the audience and tells them that there

00:11:28   are no gags, there are no -- there's no zany opening movie, and none of that is true. Well,

00:11:38   I'm an honest man, and I'm telling you, I swear to you, there is no stunt at the end

00:11:44   of the show. We spitballed ideas. There was an idea to maybe have Craig parachute from

00:11:58   up there. We have a helmet. >> That helmet is heavy, by the way. >> But I am to understand

00:12:06   there was some concern about the helmet and the hair. >> I mean, not especially. >> But

00:12:12   I thought it looked especially good on Joz. >> Yeah, I had cool Craig hair. Hey, we do

00:12:17   have some gifts for you. You have to declare those on your taxes. They are two patches

00:12:22   for WWDC. >> I have one more comment about the opening. I thought the opening movie of

00:12:35   the keynote was fantastic. I mean, I think you guys always -- [ Applause ] I mean, it

00:12:48   looked realistic. You did not jump out of an airplane. >> Whoa, whoa, whoa, whoa, whoa,

00:12:54   what the hell, Gruber? What are you implying? >> But here's my real question. My real question,

00:13:03   number one, how great was it to see Captain Phil Schiller flying the airplane? [ Applause ]

00:13:12   >> I will tell you that stuff wasn't the only take. >> Well, that was my question. Are there

00:13:25   alternate takes where you didn't say I'm getting too old for this stuff? >> That was the G-rated

00:13:30   version of that take. >> You know Phil, you know there are other takes. >> As usual, I

00:13:38   think you guys did a very good job ordering the keynote, and I think we can stick to roughly

00:13:45   keynote order as we progress through topics tonight. If you guys -- >> We picked that

00:13:51   order for a reason. >> It opened with visionOS 2 and a sort of state of Apple Vision Pro.

00:14:03   Do you find it frustrating that five months in of shipping, four months? >> Four, four,

00:14:11   five months. Don't age us. >> That there's widespread sentiment that, well, it's not

00:14:24   really as popular as the iPhone. This product is a dud. >> John, we've gotten that with

00:14:31   every new product we've ever introduced. Truly, I mean, the iPod came out

00:14:36   and people are like, oh, my God, who's going to buy this, you know, at the time it was

00:14:42   a $399 music player, CD players are $39. There's no future in this. And the reality is it takes

00:14:49   time. It took us about two years, right? And iPod, if you remember that worked out pretty

00:14:54   well. But in the meantime, everybody was calling for its death. Same thing, by the way, for

00:15:02   the iPhone. The iPhone started off and it had a fairly meager start. You may remember

00:15:07   even after we were sometime into it, we said, okay, we'd love to get 1% market share of

00:15:14   the phone business in the first full year, which in 2008, we just barely made it, right?

00:15:19   And that turned out pretty well as well. >> Got past two, I think. >> Got past two. And,

00:15:29   you know, I mean, we hear it all the time. And Watch has gotten it. I mean, oh, what a dud,

00:15:33   who's buying this? It's working out pretty well. So it takes time to build a business.

00:15:38   I think there's a general human tendency. I actually think it, God, I hate to tell you

00:15:42   whose quote it was, but somebody who talked about the fact that people tend to overestimate

00:15:49   things that can happen in the near term and they underestimate things that happen in the

00:15:53   longer term. And I think that product expectations are much like that. It takes a little while

00:15:57   to build a business. >> Well, and isn't platform building almost

00:16:04   the canonical example of that? That what platform ever debuted as a huge hit? Maybe the web,

00:16:15   right? You could sort of say sort of exploded and then you look back to -- >> Yeah, even

00:16:19   that. Started slower than you think. >> Right. And then you look back to like -- >> That's

00:16:22   worked out pretty well as well. >> When Tim Berners-Lee invented it and it's

00:16:27   like, oh, no, there was like a five-year stretch there where I was still using Gopher. >> Originally

00:16:32   had to buy a NeXT machine in order to do the web. So the market was contained for a while

00:16:37   there. >> But I'll tell you what, we're psyched over how many apps we already have for Vision

00:16:40   Pro. We have over 2,000 native apps already. We had over 1,000 just in the first few days,

00:16:46   which was faster start than even we had for the original app store. And let's not forget,

00:16:51   we have a million and a half compatible apps. So there's a lot of things you could do with

00:16:54   it even today, even as we're, as you said, building a platform. So it's pretty exciting.

00:17:07   >> One of the themes I thought for this show and talking about visionOS and the platform

00:17:17   is just the word patience. And I think that Apple as a company has institutional patience

00:17:25   that other companies in the field don't have. For example, I can think of a company that

00:17:35   was also in the AR/VR space and even renamed themselves after, I can't remember the word.

00:17:49   What was the word? >> I don't know. I don't say the word, you

00:17:56   know that. >> That was a trap. Because you told Joanna

00:18:04   Stern when she asked you what you thought of the metaverse, your answer was.

00:18:10   >> I would never, ever use that term. >> A word I would never say.

00:18:13   >> A word I will never say. And you tried to trick me right here.

00:18:19   >> But that, you know, seemingly has lost interest in the metaverse in favor of AI and

00:18:26   it's the new, you know, it's the hot new thing, chasing the hot new thing and then another

00:18:30   hot new thing and then you chase it. Whereas Apple, you know, there was that line sometime in

00:18:39   the last decade, a thousand no's for every yes. But when you guys get to a yes, you stick

00:18:48   with it. And I think, you know, four months into a platform is very early to judge the

00:18:55   long term prospects of it. >> We agree.

00:19:02   >> I thought that the other -- >> That was really well said, John.

00:19:06   >> WWDC is a developers conference, but I thought that some of the news yesterday about

00:19:13   Vision Pro was -- if you expand developers to creative people, it was creating content

00:19:22   for Vision Pro with partnerships, with Canon making a new lens with two elements so that

00:19:29   you can get stereo right from one lens and a camera system. A partnership with Blackmagic.

00:19:35   Again, production tools for creating immersive content. And that to me is where

00:19:44   Apple is. Where Apple has always been is creating these platforms for creative people to make

00:19:51   the things. So I'm curious what else you see coming soon for Vision Pro and visionOS.

00:19:58   >> Well, let me just maybe even kind of build on that because that was probably one of the

00:20:01   biggest questions we got from creatives and filmmakers was, I want to be able to do this.

00:20:07   Because once you see a movie, you know, even a 2D movie, it looks amazing. You see a 3D

00:20:13   movie, it looks even better. It's kind of like, you know, you've heard some of these

00:20:18   directors saying this is the way they always envisioned a 3D movie to be experienced is how

00:20:23   it was in vision pro. And then when you see the immersive video we did, it's like being

00:20:27   there. And of course, the filmmakers realizing this feels like the future, right? This feels

00:20:32   like the way you want to experience those things. They want to go out and create their

00:20:34   own stuff. So it was a big push for us to go out and work with, you know, the Canons,

00:20:40   the Blackmagics, to be able to say let's enable an ecosystem, because also part of building

00:20:44   a platform, you know, is an ecosystem, right? An ecosystem of content providers, app providers,

00:20:50   the whole works. So that was a big deal for us. So it's coming along.

00:20:53   >> All right. Moving on. Time is ticking. iOS opened up with --

00:21:03   [ Laughter ]

00:21:05   >> We agreed. Yeah.

00:21:06   >> Home screen customization. Apparently it takes 18 revisions to get to the point where

00:21:15   you can arrange icons where you want -- can you speak to the engineering challenges?

00:21:24   [ Laughter ]

00:21:25   >> You talked earlier, John, about patience. You got to wait, you know, pick your moments.

00:21:38   We thought 18 years is about the right moment to move an icon from the top down to the bottom.

00:21:48   >> It felt worth it to me. >> We did it.

00:21:53   [ Applause ]

00:21:57   >> There were actually developers that were born. We had to wait until they could mature

00:22:04   and join Apple and finish the project. We got to do it.

00:22:13   >> It's for the kids. >> You need to save a win for the next generation.

00:22:20   >> I think -- and I know we have very serious, very complicated, very serious computer science

00:22:26   topics to talk about tonight. But I do think -- I think it was a very fun and going to

00:22:34   be beloved feature in addition to being able to arrange the icons where you want, but being

00:22:40   able to customize the look and flipping to dark mode, I never thought of it before, but

00:22:46   I thought, boy, these icons -- I know dark mode is super duper popular, but it never

00:22:50   really had occurred to me before that the icons don't change in dark mode, and now they

00:22:54   do, and they do look really cool. But I think even cooler than that is the customization

00:23:00   you can do where you can pick these color tints and you can make it your own. Like the

00:23:04   example in the keynote was very, very vibrant, like a vibrant electric yellow, and it themed

00:23:09   all the things. My two thoughts on that are it takes me back to being when I first got

00:23:16   a Mac, and what did I do in the early '90s when I first got a Mac is I sat there and

00:23:21   dicked around in ResEdit customizing my icons and making -- ResEdit.

00:23:26   >> ResEdit. We loved ResEdit. >> And remember, Folder Icon Maker? Folder

00:23:33   Icon Maker was this great utility where if you wanted to make a folder for all of your

00:23:37   Photoshop files, you would drop Photoshop's icon on Folder Icon Maker, and it would spit

00:23:44   out a folder icon with Photoshop's icon on the folder. Well, I thought my computer was

00:23:50   way better than anybody else's because my files were organized in folders that had the

00:23:55   icon of the app. I spent a lot of time on that. But a couple of years ago with shortcuts,

00:24:02   what I saw -- and my son really got into it and made this customized home screen where

00:24:07   all of his icons -- and it was sort of a three or four step process where you make a shortcut,

00:24:11   and the shortcut just opens the app, but then you can assign a custom icon to the shortcut

00:24:16   so it looks like you're tapping the app and it just jumps you into the app. You spend

00:24:23   some time, make a whole home screen of black and white icons, monochromatic, and then if

00:24:28   the mood hits you like, "I wish they were all blue," well, then you got to start all

00:24:32   over and do them all over again. Now you've made a feature where, you know, I think that's

00:24:37   great that you guys spent time on that, because people love to customize.

00:24:41   >> I am glad, though, that you were able to pass that legacy on to your son. Had we made

00:24:48   it too easy, that critical step in his maturation would have been missed, and now he's stronger

00:24:53   for it.

00:24:56   >> But do you see that as a people pleasing, like, yes, people are going to love this feature?

00:25:01   >> Yeah, 100%. When we shipped iOS 14 with widgets and, you know, things went nuts. I

00:25:08   mean, just even the number of people becoming developers to download the beta on that first

00:25:16   release because --

00:25:17   >> It's true.

00:25:19   >> And I think it had really gone viral on social media with people sharing what they

00:25:24   were doing to their screens and it created this incredible enthusiasm. So, you know,

00:25:31   we waited four more years and then we did something about it.

00:25:35   >> Patience.

00:25:36   >> Patience. That is one of the nice themes I'm told.

00:25:39   >> But, you know, I think that at times, I mean, you guys are -- Apple is very famously

00:25:45   known for its design strength and, you know, you buy Apple products because they are well

00:25:53   designed and they are designed by Apple designers, but it's not, oh, here's how we want your

00:25:58   iPhone to look. Your iPhone is going to look just like the ones in the Apple store with

00:26:03   our wallpaper and our arrangement and, no, I mean, this is -- your iPhone can look like

00:26:09   your iPhone.

00:26:11   >> I think that's true. I think the design team has really wanted to -- I mean, when

00:26:15   you build a new product, I think you do want to establish a kind of shared reality

00:26:20   and identity for that product and we certainly did that pretty thoroughly with iPhone. But

00:26:27   then there becomes a point where you want to let everybody make it their own, but you

00:26:33   still want it to be their own iPhone. And I think we've found the balance between your

00:26:38   -- even as you do all of this customization of your home screen, of your lock screen,

00:26:43   we did a bunch of work there, they are very much your own, but they're still very much

00:26:48   iPhone and I think that's where we've tried to find the balance and I think this year

00:26:52   we pushed that frontier a little further and I think people are really going to enjoy it.

00:26:58   >> I would say the biggest upgrade to Control Center since Control Center was added, where

00:27:03   Control Center now is almost like a mini environment within itself, with third-party APIs for controls,

00:27:18   including all the way back to the lock screen where you can -- previously it's like you

00:27:22   can get the flashlight or the camera or you can get the flashlight and the camera. And

00:27:28   now you can fill those spots with anything you want, including from third party apps

00:27:33   who can make their own controls for those spots.

00:27:37   >> Do you think that's a member of the Control Center engineering team out there?

00:27:44   >> Either way. We absolutely -- we started to see even with the action button on the

00:27:51   iPhone, like a lot of enthusiasm for people to run custom actions, sometimes they were

00:28:01   of course using shortcuts as a means to do that. And so the idea of making controls really

00:28:08   kind of a universal concept in the system, letting you of course tie them to the action

00:28:12   button, but also put them in Control Center, and then that naturally became okay, well,

00:28:16   you're going to want to customize Control Center, and a lot of ways to do that. You might

00:28:19   have more of them, so you might want multiple pages of Control Center. And then the biggest

00:28:25   leap, those two sacred buttons, ever since iPhone X down at the bottom of the screen,

00:28:31   to allow those to be customized was a big step for us to take. But it opens up a lot

00:28:39   of power. And the other thing we did many years ago, you might remember, Control Center

00:28:43   had a brief moment with pagination. There was a time where it was side scrolling and

00:28:50   that design had the property that it was guaranteed that every time you went to Control Center,

00:28:53   it was on the last page you were on, which wasn't the page you wanted to use next. So we backed

00:28:59   off from that and created an all in one design. For this one, of course we preserved the ability

00:29:05   to have a really full featured main page if you want, but also we created a gesture so

00:29:11   you can just pull down and get right to the page you want in one step. And we thought

00:29:16   that was super important to make it browsable, but also essentially instantly navigable as

00:29:21   you build new pages. And so I think people are going to have a lot of fun with that.

00:29:26   And I think developers now just exposing more and more of their capability as actions and

00:29:32   giving users so many different ways to tap into that, whether it's from the Shortcuts

00:29:35   app or from Control Center or the action button, is great. And as we'll talk

00:29:40   about later, the intents framework, which is actually the way they express actions, has

00:29:44   other side benefits in other parts of the system.

00:29:47   >> You say we'll talk later, but I know you haven't looked at my notes here. You said

00:29:52   we were going to talk about the show in order. So I'm just figuring. You're a thorough man.

00:29:57   You'll get us there eventually.

00:29:58   >> All right. Here's a feature that I think kind of flew by in the keynote, but it really

00:30:02   caught my eye, and it's the new setup accessories API.

00:30:06   >> Whoa!

00:30:07   >> I definitely remember the team.

00:30:08   >> We now know where every -- by the end of this we're going to have a map of where everybody

00:30:17   works.

00:30:18   >> We can call out different groups.

00:30:19   >> Webkit.

00:30:20   >> But I would say that this -- you -- in the keynote it was presented as solving the

00:30:30   following problem. It's just when you buy certain third-party peripherals that, say, use

00:30:35   Bluetooth or want to get on Wi-Fi, and when you're setting them up, they would ask to

00:30:41   use Bluetooth, which brings up a system prompt, and it would say this wants to use Bluetooth

00:30:47   and the explanation text would be like it wants to connect to certain things or something

00:30:52   like that. And it's like allow or don't allow. And it's like, well, I don't even -- I have

00:30:57   no idea why this thing I bought wants to connect to Bluetooth. What am I doing? It's very confusing.

00:31:01   I guess allow it. I bought the thing. And this solves it by presenting it in a very,

00:31:09   you know, pretty much using the framework of the way that you pair AirPods.

00:31:15   >> Yeah, I mean, this is one of the cases where providing the better user experience

00:31:21   and a more private one were hand in hand. And, you know, it is the case today or prior

00:31:27   to yesterday, I guess, that when you set up that accessory, if you felt comfortable proceeding

00:31:35   and you set up your drone or whatever it was, that app still had open-ended access to Bluetooth

00:31:41   or Wi-Fi or whatever, and so you don't know what kind of signature it's gathering by discovering

00:31:46   nearby devices and so forth, and you probably didn't go back afterwards and take that permission

00:31:52   away and maybe the app even still needed it. So we've really focused over the years, we've

00:31:57   seen this on feature after feature where we've made the granting of access very clear and

00:32:05   specific and well scoped. And you saw that this year also with contacts because, you

00:32:10   know, today you'd have an app, maybe you want to do some messaging with it or something,

00:32:14   and it says, all right, this app wants access to your contacts in order to give you maybe

00:32:18   convenient autocomplete as you're typing a name or something. Convenient autocomplete,

00:32:26   yes, we found that engineer. So, but what did you do? I mean, you just gave them, like,

00:32:33   the full history of your contacts, present and future, out of which we've seen a history

00:32:38   of people kind of hoovering that up and using it to build a social graph on their servers

00:32:42   and all sorts of things. And so now that's locked down. But one thing we weren't able

00:32:47   to really cover in great detail in the keynote is that we've given apps a great out of process

00:32:54   API for if you're typing in that app, trying to type a completion, there's a way that they

00:32:59   can surface in their own search UI without seeing the contact they're surfacing, surface

00:33:03   a match that we get out of your contacts so then you can grant access one at a time. So

00:33:08   this gives you the same kind of convenience you want without just opening the kimono on

00:33:12   some of your most sensitive data. >> Right. So it's out of process. That was a

00:33:16   graphical image. Kimono closed. And there's also a thing that didn't make the keynote,

00:33:24   which is, to use the new setup accessories API -- it's a new API, so peripheral makers, the

00:33:32   people who make the apps for those peripherals on your phone that asked for the

00:33:36   permission have to opt into this. I think they're going to want to because it's such

00:33:39   a nicer experience. But if they stick with the older one, those older dialogues saying

00:33:45   this app wants to access Bluetooth have been expanded to show you,

00:33:52   oh, well, before you say allow or don't allow, here are all the Bluetooth things that

00:33:59   are in your network right now that you would be telling this app that you have, or if it's

00:34:05   Wi-Fi, here are all the devices, and in my house, an awful lot of devices are on the Wi-

00:34:10   Fi. But that's really, really just fantastic. Because I think those dialogues before

00:34:17   were like, I have no idea what it's going to see, I don't really feel comfortable giving this access

00:34:22   to this app, I don't know why it wants this, but I don't even know what it would see. Now

00:34:25   it shows you what it would see. >> Yeah, these dialogues. It started with location,

00:34:31   where you'd see some text about your location, but you didn't quite get a visceral sense of what

00:34:35   that meant. And so we started showing you on a map what you were giving them. And then

00:34:40   if you gave apps location over the long term, we remind you -- we still do remind you --

00:34:46   and kind of show you where you've been as well, to give this sense just at a glance,

00:34:50   visually like this is what you've been handing over. This is what you're going to be handing

00:34:54   over. And I think that's a critical design problem. How do you convey the concept of

00:34:59   sharing, because a little bit of explanatory text, you know, doesn't

00:35:05   come close to giving like that picture or that real world sample of all your devices

00:35:10   and so forth. And so that's been a big piece of work for us over the last several

00:35:13   years. Now, the other problem I see this solves is the accusation that okay, AirPods get a

00:35:22   really slick pairing UI because Apple makes AirPods and Apple makes the iPhone and you

00:35:30   get this really beautiful UI. What about third parties? Well, sure. And the pattern that

00:35:39   I see, here's the pattern that I see: one, there's a problem. The problem is it's really hard

00:35:43   to pair wireless earbuds with a device. Yeah. Two, Apple solves the problem with a really

00:35:50   clever solution. Three, Apple creates APIs using that same thing so that anybody can

00:35:58   do the same thing that Apple does. Sounds pretty great. But in between

00:36:05   step two and three is patience. That's one of our themes. It is the virtue. But what

00:36:13   do you say to someone who says that it should ship as a public API right away? Like

00:36:17   the day AirPods dropped, that should have been an API that any other company that makes

00:36:23   headphones could have used? Well, I think a lot of where our ability to innovate

00:36:30   comes from our ability to rapidly try things out, some of which don't work, some of which

00:36:37   we need to get out there and learn and perfect before they become API. An API is a contract, virtually

00:36:49   written in blood. People then depend on it. We are then signing up to support it,

00:36:55   keep it working well over time and getting to the right API. I think any developer

00:37:01   here knows, you know, you put something together for use inside your app. That's one

00:37:05   thing. Getting exactly the right abstractions, the right contract that you want to live with

00:37:11   and support for decades is a whole different exercise. And so the practice is, we have teams

00:37:19   internally that, you know, live on it. If you're any team shipping software with the operating

00:37:24   system, you're taking the pain of living on a lot of work in progress that is us, different

00:37:32   teams at Apple trying out new framework APIs and designs and you're riding that crazy wave

00:37:38   and taking on, you know, a lot of the pain of doing that, so that we can then produce

00:37:45   hopefully an API for others that has been perfected through that process. And

00:37:51   I think that's a virtuous cycle for us, to be able to perfect it and then

00:37:56   scale it out. And so do you think people out there need a little more patience? Yes,

00:38:05   absolutely. Speaking of patience, speak to me about the engineering challenges of bringing

00:38:15   color to Tapbacks. It was a lot more than color this time. That may not have been an

00:38:31   engineering issue. That might be the best answer ever. There is a lot of

00:38:49   cool stuff in Messages, but I'm going to move on to Photos, and Photos for both iOS and iPad

00:39:03   OS is very familiar when you launch it, but it's a significant new version.

00:39:12   But when I heard that the tab bar at the bottom was eliminated, I have to admit I wasn't sure.

00:39:18   That seems like the wrong way to go. It seems like, oh, here's a list of photos. And now

00:39:25   a new version. We've added a tab bar at the bottom to organize. But then I saw it and

00:39:30   I realized, I saw iOS 18 Photos and then I took out my phone, still

00:39:36   running iOS 17 and I realized I don't really use that tab bar. I'm always in library. And

00:39:43   what were those tabs for? They were for things like albums and stuff, which

00:39:48   you manually create and put things in or manually defined smart albums on the Mac or something

00:39:53   like that. But those are things that you, the user, sit there and spend time on a Saturday afternoon organizing,

00:39:59   and not necessarily the replacement, but the next step, in iOS 18 and iPad OS, in all the

00:40:06   platforms, is Photos using machine learning even more to automatically categorize your

00:40:13   photos into things like trips or groups or what other ways is machine learning organizing

00:40:22   your tens of thousands of photos for you?

00:40:25   >> Yeah, so well, one of my favorites now, you know, last year we did pets, which was

00:40:32   pretty great. But now we even do like groups of people. Because if you analyze the graph

00:40:38   of photos and who appears with whom, you'll find things like, in my library, you know,

00:40:44   me and my spouse, me and my best friend, me and the whole family together, these like

00:40:50   collections. You're very often looking for exactly those kinds of photos, right?

00:40:55   Not just of a single person, but of groups being together or another big one is trips,

00:41:00   right? Which are not just some day or some moment, but an extended period where you were

00:41:07   away taking a certain kind of photos. And it's amazing, it just turns out we can do an extremely good

00:41:13   job of finding trips, finding the best photos of those trips and it creates a really just

00:41:20   compelling way to look back. But I think that design point about the tabs, this was one

00:41:28   that we worked on a long time. In fact, this was one where we were building it, you know,

00:41:34   design was doing incredible work. We were building prototypes. We were living on them.

00:41:39   We had patience. It was a whole year where we were thinking of shipping it. And then

00:41:43   we said, you know, we don't quite have it right yet. And we kept iterating on

00:41:47   it to get it right. And I think that the team did just amazing work to do

00:41:51   that. But part of what we realized is for many users, this is true across a lot of apps

00:41:57   on the phone. They have tabs, but you only use one of them, you know, and you forget

00:42:03   about the rest. And again, almost like that pagination of the original Control Center.

00:42:13   If you ever left Photos on another tab, the next time you came back, it was probably

00:42:16   in the wrong place. And we had this great area called For You. That was a tab that you

00:42:21   probably rarely went to unless maybe you clicked on a widget or something and it took you in

00:42:25   there. And yet that was where all of this organization actually had been building up

00:42:29   and happening over time. And it was like this hidden gem that we didn't have the right way

00:42:34   to make it accessible. And we finally found the design that said, if you're trying to

00:42:38   get at the grid, your most recent photos, there it is, it's effortless. It's just as

00:42:42   it was before. You want to see these organized collections. Well, they're right there to

00:42:47   each of them just as accessible as the other. I think it really came together. It'll be interesting

00:42:51   to see whether third parties pick up this pattern or not for other kinds of

00:42:57   apps, because I think it's come together really, really nicely. Off the top of my head, I think

00:43:04   Photos is the app that just broadly, almost everybody, it's almost the odd user

00:43:11   who doesn't have a large collection of photos in their library. It's for very typical users.

00:43:18   And at the scale of your platforms, we're talking what, like a billion people? We have

00:43:25   over a billion users. That's correct. Over 1%. Over 1%. It's worked out all right. But

00:43:34   like, I have lots and lots of, I don't know, probably hundreds of thousands of emails, most of them

00:43:40   unread still. But everybody has lots of email, but most people tend to live at the top of

00:43:46   their inbox at the new email. But photos is a thing where the stuff that's in there, you

00:43:52   know, if you have 50,000 photos in your library, if you go back 40,000 photos, if anything,

00:43:59   they might be more valuable to you, right, than the most recent ones. You can take a

00:44:03   new picture. If I lost my last week of photos, I can take a picture of my family from the

00:44:08   last week and they look the same. I can't lose a photo. But you know what I mean? I

00:44:14   mean, I think it's very true. It's the one thing that everybody has in their lives where

00:44:19   it's a massive amount of data and what are you going to do with it? And this is where

00:44:24   machine learning, I think, it's just a step change in exposing this and changing the way

00:44:31   people deal with their photo libraries. Well, and you took the photos for a reason,

00:44:36   right? And you want to go back and enjoy them. And I think the team's done an amazing job

00:44:39   with everything from the photo widget to this redesign to exposing those things. I mean,

00:44:45   how many times do you get those photos and you're like, oh, my God, this is amazing.

00:44:47   And you share it, right? You go share it back with your family. It's like, that's why we

00:44:53   take pictures. We want to enjoy them. Yeah, I don't know about the Gruber household

00:44:58   when you were growing up, but in my household, I don't know, there were maybe four photos

00:45:03   taken of us per year. Like every few years, one of them made it up onto the mantle or

00:45:11   something like that. And growing up, those are like the only photos. My self-conception

00:45:16   is rooted in like a handful of photos. And now, my kids live in an entirely different

00:45:23   world where not only are we taking photos constantly, but now these, like the widgets

00:45:29   on my desktop are constantly picking great photos of them at different times of their

00:45:33   lives. And then I click on them and then I'm texting them every day with some photo of

00:45:37   them. Look at this, you used to look like this. I don't know what I'm doing to them,

00:45:43   but it's different at least. And I'm sure it's good. I'm sure it's all psychologically fine.

00:45:51   But it really couldn't do these things without machine learning.

00:45:54   No, no way. Yeah. I mean, if we were just picking photos at random, you'd get lots of

00:45:59   screenshots, whiteboards, receipts. It'd be great. Look at this. Yeah.
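The trip detection Federighi describes, finding extended periods when you were away, can be sketched in toy form as grouping photos by the gaps between their timestamps. This is purely illustrative: the real Photos feature is ML-based and uses far richer signals like location, and the `group_into_trips` function and the two-day gap threshold are assumptions of this sketch, not Apple's implementation.

```python
from datetime import datetime, timedelta

def group_into_trips(timestamps, gap=timedelta(days=2)):
    """Toy trip grouping: split sorted photo timestamps into runs
    separated from the next run by more than `gap`. A simplified
    stand-in for the ML-based trip detection discussed above."""
    trips = []
    current = []
    for ts in sorted(timestamps):
        # A gap longer than `gap` starts a new trip.
        if current and ts - current[-1] > gap:
            trips.append(current)
            current = []
        current.append(ts)
    if current:
        trips.append(current)
    return trips

photos = [
    datetime(2024, 3, 1), datetime(2024, 3, 2),    # a weekend away
    datetime(2024, 3, 20), datetime(2024, 3, 21),  # another trip
]
print(len(group_into_trips(photos)))  # 2
```

A production system would cluster on location and scene content as well, but the gap-based grouping conveys the basic shape of the problem: a trip is a dense run of photos bounded by quiet periods.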

00:46:04   And I mentioned Mail. Mail very similarly has some very nice machine learning updates this

00:46:10   year with things like categorization of your incoming mail. These are where things like

00:46:16   receipts go. You subscribe to a bunch of newsletters? They go somewhere

00:46:20   else and it leaves your inbox for actual email from actual people trying to get to you. I

00:46:28   thought one of the cleverest little things was the, I think it was almost a throwaway

00:46:32   line in the keynote that, hey, maybe the first two lines of an email are not the best way

00:46:38   to understand what the email is about. Use machine learning to summarize the email. And

00:46:45   if you're only going to have, you know, it's a row of your inbox and you only get two or

00:46:48   three lines of text. Let AI give a summary. It's got to be better than the first two lines

00:46:52   that like PR people send me, right? It's life. Oh, PR people out there. But not your emails.

00:47:04   I'm talking about other PR people. I get these messages and they used to say, you know,

00:47:10   uh, Craig, you know, I love Apple so much and then you, whatever. And then now it's

00:47:15   just for me. Now it just says wants free iPhone. Cuts right through it. That's also for me.
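The inbox triage being described, receipts in one place, newsletters in another, actual mail from actual people left in the inbox, can be illustrated with a deliberately naive keyword-based sketch. Mail's real categorization is on-device machine learning; the `RULES` table and `categorize` function here are hypothetical stand-ins for illustration only.

```python
# Toy keyword-based mail triage, a simplified stand-in for the
# ML-based categorization discussed above (not Apple's algorithm).
RULES = {
    "receipts": ("receipt", "order confirmation", "invoice"),
    "newsletters": ("unsubscribe", "newsletter", "digest"),
}

def categorize(subject, body):
    """Return the first category whose keywords appear in the message,
    else 'primary' -- mail from actual people stays in the inbox."""
    text = f"{subject} {body}".lower()
    for category, keywords in RULES.items():
        if any(k in text for k in keywords):
            return category
    return "primary"

print(categorize("Your order confirmation", "Thanks for shopping"))  # receipts
print(categorize("Hi Craig", "wants free iPhone"))                   # primary
```

The point of the real feature is precisely that keyword lists like this break down at scale, which is where a learned classifier earns its keep.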

00:47:30   But photos, like I said, I think, I don't know, do you guys know like what is the average

00:47:37   photo library size? I don't know if you guys even collect that. I don't know. But we know

00:47:43   more pictures are taken with an iPhone than any other camera in the world. Anecdotally,

00:47:48   I think, you know, very typically tens of thousands. Lots and lots.

00:47:53   Everybody uses email, right? It is, you know, for better or for worse, it is,

00:47:58   you know, it's part of the glue. For lots of people, it's age. Yeah. I was gonna say it's

00:48:02   an age thing. Once people get jobs, I think. Yeah, yeah. For that,

00:48:09   for that, some other apps that I don't, there may be some others. Yes. So you're saying

00:48:13   those young engineers who got the home screen icons to go anywhere for everyone and don't answer

00:48:17   their emails. Not on the home screen. Yes. But these are very practical apps that people

00:48:27   use very widely, and machine learning is making things they actually do better, or making it

00:48:35   so the things they do take way less time, like skipping the wants free iPhone email. I mean,

00:48:44   that is probably a theme that I'm glad you picked up from our event. I

00:48:50   mean, I think it's a theme from Apple over many years now that in many cases you didn't

00:48:58   need to think about how we're doing it. Is it machine learning? Why does this camera

00:49:03   take a good photo? Why is photos building me a nice memory? How is it surfacing a photo

00:49:11   I actually care about? Yeah, there was a lot of powerful machine learning happening behind

00:49:15   that all along, but the key is what is it doing for you? Right. Watch OS. I think a

00:49:20   flagship feature of the new version of watch OS is the machine learning powered update

00:49:26   to the photos face, which is doing similar things like, Hey, not just this is a good

00:49:33   photo of a person I think you would like on your watch, but it's also one that's framed

00:49:40   well for Apple watch and with machine learning. Here's where we think the numerals for the

00:49:46   time could go and would fit and works. And I played with it in a demo and it's like,

00:49:53   yeah, this is like, these things look designed, they look like posters. Yeah. But where does

00:49:59   that are you guys frustrated or you guys just don't care that these features are in the

00:50:04   works with amidst this narrative of Apple is behind on artificial intelligence and all

00:50:11   of this whiz bang new intelligent software features. Well, we've been doing machine learning

00:50:18   and AI. You know, you've heard me say it for so many years. We didn't even call it ML or

00:50:24   AI. We called it proactive. If you remember, we're doing proactive features. We're doing

00:50:29   this stuff for a long time. We've been building neural engines for seven years now, I think,

00:50:33   to help us do AI and ML on devices. So we've been at this a long time. Well, I mean, that

00:50:40   has been the funny bit now with the AI PC. It's like someone discovered the idea

00:50:54   of a neural engine this year. A neural processing unit. Nothing like what we've done.

00:51:02   Yeah. So obviously, you know, from phones for many years,

00:51:10   and you know, the first M1 Mac we introduced in 2020, and every Mac we've introduced since,

00:51:16   I guess we missed the boat to name it an AI PC because we've been making great ones

00:51:21   this whole time. But you know, that's not the point. They're great Macs.

00:51:27   Moving right from iOS into iPad OS. And I think this year it's a particularly blurred

00:51:36   distinction, because maybe there's some minor exceptions, but it seems like with the

00:51:42   tentpole features, it's feature parity on iOS and iPad OS. Like, for example, when lock

00:51:49   screen customization first came to iPhone, it came to iPad one year later. Yeah. The

00:51:56   things we've already been talking about moving icons around, tinting your icons, getting

00:52:01   dark mode icons, iPad OS 18, Control Center. Yeah. Yeah. New Photos app. Yeah. We were,

00:52:07   we were able to do them in sync this year, which is great. And then of course we added

00:52:13   on, you know, one particular app this year.

00:52:20   Patience. Patience. Patience, for your theme. Patience. Finally, Calculator

00:52:31   for iPad. Yes.

00:52:38   Was your strategy to hold back on a calculator for iPad so that when you did it, we would

00:52:45   get this sort of reaction? We wanted you to, really. I mean, could you imagine all

00:52:50   the adulation we would have given up had we actually shipped it right out of the gate?

00:52:54   You know, when it first came out, would anyone have cheered? No, no, no. You hold it and

00:52:59   hold it and hold it like that dramatic pause. Yeah. I forget if I've previously asked you

00:53:07   about a calculator app for iPad. I know in other interviews I've seen you do, you have

00:53:12   been asked about it all the time. I mean, truly all the time. We did the big iPad introduction,

00:53:16   you know, in early May and I was still getting asked, do you guys do a calculator? You know,

00:53:24   I'm like, patience. That must've been hard in May. Oh, it was. It was. But one of your

00:53:31   answers when you would be asked your sort of non-answer answer was, well, what would

00:53:36   we bring to a calculator on a bigger screen? Turned out to be true. You did bring something

00:53:43   with Math Notes. Yeah. I mean, it was a real answer at the time, which is, we really,

00:53:52   I mean, obviously it was straightforward if we wanted to bring a blown up iPhone style

00:53:57   calculator to the iPad. But we really felt that we wanted to do something more and different

00:54:04   and uniquely iPad when we did it. And we, you know, we really found the moment for a

00:54:11   lot of things to come together this year to do that, to do one that made it meaningful

00:54:17   for iPad. And so, and I couldn't be happier, I think, and the team with the reception that

00:54:25   all of you gave it at the event. I mean, it was fantastic. And we're obviously really

00:54:31   happy with the work. I think one of the really cool things about doing it is with the

00:54:36   pencil integration, which is not the only way to do it. I know in the keynote it was,

00:54:40   it came up in the iPad segment, it's all with the pencil, but it's there on the phone. It's

00:54:46   speaking about feature parity. It's on the Mac. And obviously on the Mac there's no pencil.

00:54:52   You type, you know, you can type on the Mac too. But I thought the really interesting

00:54:59   thing about emphasizing with pencil wasn't just finally there's an iPad calculator. It's

00:55:03   that for mathematical notation, the pencil is actually far more natural and expressive

00:55:10   than typing, right? You can't make a square root sign with a keyboard, you know,

00:55:16   you type SQRT with parentheses. And that's not how math students think of it. Math students

00:55:22   think of it with the notation that they're doing in class. And you just draw it. And

00:55:26   then you're like, oh, I need parentheses. Well, it's a pencil. So you just draw

00:55:29   the parentheses around it afterwards. That's

00:55:32   how I learned it. We agree, John. Yep. Sort of along the same lines, Smart Script,

00:55:44   which is the name for the feature where you're writing in your handwriting,

00:55:50   and Smart Script will actually make your handwriting better. Yeah. I mean, step one was to learn

00:55:58   your handwriting so that we could both create writing that was consistent with your handwriting

00:56:07   and that we could even sort of build the neatest version of your handwriting that was consistent

00:56:12   with your handwriting style. You know, going back years now, we really had this goal to

00:56:17   address how there's so many things that are great about writing with a pencil or pen. But

00:56:25   there's also so much that's great, and it's a disjoint set, about typed text.

00:56:30   You know, if you spell check and correct like no problem, it can rewrap that. You want to

00:56:34   insert in the middle? No problem. It can reflow it. You want to copy and paste a section?

00:56:40   You can do that. But of course, when you want to do that with handwriting, those things

00:56:44   aren't really possible unless we can actually generate your handwriting. If you want to

00:56:49   copy paste and move all the words around and change it, or we want to spell check, we have

00:56:52   to rewrite that word for you. And so Smart Script lets us learn your handwriting. And then we're

00:56:59   able to build all these other capabilities. I mean, I don't know if you take notes

00:57:03   with a pencil, I sometimes do. And I always realize like there's a word I want to insert

00:57:09   in the middle there. And then you know, you put an arrow and you scribble some mess up

00:57:12   above it, right? Now you can just move it aside. But of course, that means it has to

00:57:17   rewrap. So now we need to understand the flow of the text and rewrap it. So we're really

00:57:20   trying to bring the best of pencil and the best of typed text together, and Smart Script

00:57:25   lets us do that. So cool. Yeah. Somewhere out there, there's got to be some very

00:57:37   long-term Apple employees who worked on the Newton who were like, yes, this is what we were going

00:57:44   for. Right. And including Scribble to create the gap, to say I don't want that word anymore.

00:57:51   Scribble it out, scribble it out, which is one thing

00:57:55   I can remember missing from my Newton until now. And now I've got it. Yep, you got

00:58:00   it. There you go. Patience. Patience. So a couple of years ago at WWDC, I think maybe

00:58:10   2018, I didn't bother looking it up, it doesn't matter what year, but there was, I think,

00:58:13   sort of an iconic slide in a keynote, you addressing the sort of conjecture out there

00:58:23   that Apple was in the midst of merging Mac OS and iPad OS. And do you remember the slide

00:58:31   that was a very clear slide, very simple: No. Really, with, I believe, an anvil drop, and

00:58:39   it was, I think, the single biggest letters on an Apple slide. I think

00:58:46   at the time, the conjecture that you were addressing was of the mind that the Mac, which

00:59:06   came out in 1984, is old, and the iPad is new. And that if that was what Apple was

00:59:06   thinking, it was going to be the iPad superseding the Mac is what people thought and you're

00:59:13   like, No. I think we've come around in the six years since and you know, with the new

00:59:21   iPads that came out a month ago, there's a re-up of this sort of mentality of, oh, you're calling

00:59:30   these iPad Pros, but I can't do my professional work on them. You should let me boot my

00:59:38   iPad into Mac OS, or why don't you bring AppKit and Mac OS to iPads. And I kind of feel

00:59:48   like we're heading towards another No. I tried to write about it, looking for

00:59:57   the answer. I've pontificated along similar lines, of my not necessarily frustration,

01:00:06   but my personal feeling that I'm far more productive on the Mac than the iPad. But the

01:00:10   light bulb that went off to me is well, the way I deal with that is I just do what I feel

01:00:16   most productive on the Mac on the Mac and let the iPad be the iPad. That's exactly the

01:00:20   right answer. John. They're very different products. They're different design points.

01:00:24   And, you know, iPad's great strength is the fact that, you've heard us say, it's a

01:00:29   cliche at this point, it's a magical sheet of glass that becomes anything you want it to be.

01:00:33   And there are over a million apps designed for the iPad that allow it to do incredible things.

01:00:38   And I wouldn't tell somebody who's a Procreate artist that they're not doing pro things,

01:00:41   right? They're doing amazing things. And now we have Final Cut Pro, we have Logic Pro,

01:00:45   we have all kinds of incredible apps for it. And the Mac is the Mac. And I don't know,

01:00:51   I don't know why people have wanted, for the time that these have existed, to merge

01:00:56   them together. That's not our desire. Still a big fat No. Yeah, I love my

01:01:01   iPad, I probably spend at least as much maybe more time on my iPad doing a whole variety

01:01:08   of things than I do on my Mac. I also love my Mac. I do not want them to become the same

01:01:14   device. I think you've used the phrase in the past, maybe, that the, you know, Mac

01:01:21   is heavy so that iPad can be light or something like that. And I don't think Mac is heavy.

01:01:27   I think the Mac is awesome. But I think you're right. And I think years

01:01:34   ago a wise man talked about kind of cars and trucks. And I think I was honestly

01:01:41   bracing myself for another wave of this, not at this particular forum, but of this sentiment

01:01:48   when we were about to release the iPad with an incredible M4 in it. Because there's

01:01:55   a mindset out there that says, okay, you took this incredibly powerful V8 engine that powers

01:02:03   this truck that I use to tow my boat, and you put it in this sports car. Why can't I

01:02:11   haul lumber and a boat with this sports car? And it's like, no, this engine is awesome

01:02:17   in the truck. And it's awesome in the sports car. And when you use your iPad Pro right

01:02:24   now, it's the best iPad experience you can ever imagine. And that's worth a lot

01:02:31   to a lot of people. I mean, that is a great experience. And we want to keep making

01:02:35   iPad the best iPad it can be. We are not trying to create a Windows 8 PC or whatever.

01:02:50   Moving on to Mac OS Sequoia. I can't, every time I see Sequoia, I cannot help but note

01:02:56   that it has all five vowels. Wow. Wow. An achievement. I think I play too much Wordle.
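The all-five-vowels observation is easy to verify programmatically; a quick check with Python sets (the function name is just for illustration):

```python
def has_all_five_vowels(word):
    """True if the word contains every English vowel (a, e, i, o, u)."""
    return set("aeiou") <= set(word.lower())

print(has_all_five_vowels("Sequoia"))  # True
print(has_all_five_vowels("Sonoma"))   # False
```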

01:03:10   It may look like, well, that's cool, but big deal. I think it's like a game changer.

01:03:16   I'm really, like, a light bulb went off in my head. Like I think I'm going to

01:03:20   use this all the time with the new Continuity feature where you can get iPhone mirroring

01:03:23   on the Mac. Yeah. Is that it? It seems like that's harder than it looks. Great low-latency connectivity

01:03:42   and discovery and all that is the magic behind Continuity, and is always a challenge

01:03:49   to make great. But when it works, it's magic. And that's why Continuity

01:03:54   is magic. But this is one that many of us, and I certainly, wanted for a long, long

01:04:00   time. I use Continuity Camera all the time. And then my phone is up here. And, you know,

01:04:08   I leave it up there sometimes and then I want to get at it. And now, click, boom,

01:04:14   I can, you know, it's right there. But there are just so many times when you want

01:04:18   to access your phone, and it's on the other side of the house or in your bag. And it's

01:04:24   great. And we made the trackpad gesture handling just totally smooth. So manipulating the phone

01:04:31   is really natural. So I think it's great. Apple operates worldwide. And the world is

01:04:41   very wide. And you guys often say Apple complies with the laws around the world as needed on

01:04:54   a per-country basis or, you know, whatever the jurisdiction. Yep. The world's gotten wider in the last

01:05:01   year and flatter. But with the DMA in the EU now in effect. Really. Best stickers out

01:05:19   there. Now we know where the attendees from Belgium are sitting. Russell. In general, as Apple

01:05:32   has adjusted the rules, the nature, whether they're technical rules or they are policy

01:05:39   rules like in the App Store, to adapt to the changing world, to adapt to changing features.

01:05:45   In general, you guys seem to make significant efforts to keep the rules the same everywhere

01:05:55   as much as you can. For example, when you guys came to an agreement with the Japanese

01:06:02   Fair Trade Commission about something, something with reader apps, it was, OK, we'll just make

01:06:08   this the new rules everywhere. With the DMA compliance, you guys have gone a different

01:06:17   way where, OK, this is what we need to do in the EU. Here is an extensive set of frameworks,

01:06:26   technologies, policies. Trust me, I read it all. It is super extensive. But what

01:06:35   is your thinking about when to, OK, let's apply these changes everywhere versus, OK,

01:06:43   we're going to keep this as the default everywhere, but we'll do that only when needed. Well,

01:06:48   John, this has been, first of all, a massive effort. A massive effort by a lot

01:06:55   of people at Apple to work to comply with it. As you said, we have to, right? We do business

01:07:03   in Europe. We have to comply with the laws there. And as the DMA is written, it's been

01:07:08   a heavy lift. And especially to do that while trying to balance the safety, security, privacy

01:07:17   needs of our users because a lot of these things, as you know, run in opposition to

01:07:22   those. So it's been a heavy lift. And it's only been in effect for three months since

01:07:31   early March. So it's still early days, but we're working hard to try to do what we have

01:07:37   to do there. But it's a tough one. Yeah, I don't think, especially among the technically

01:07:44   savvy among us, many of us in this room have an idea that we all used Macs since even the

01:07:52   pre-internet era. We use them now. And we feel like we know how to keep ourselves safe.

01:08:04   We know there is no battle that is more continuously fought and just a pitched battle than the

01:08:15   one we have around keeping our devices secure and creating an ecosystem where my mom, my

01:08:26   kids, I feel like go ahead and download whatever you want and enjoy your phone. That's not

01:08:35   what we say with the Macs, unfortunately. And that has been such a powerful thing for

01:08:43   our users and such an amazing thing, I think, for the overall community, for developers,

01:08:49   the opportunity it creates when everyone feels like, you know, I want to try that. Oh, I'll

01:08:53   just download it. It works. When we see something that has the threat of imperiling our users

01:09:00   and disrupting the ecosystem that has been so valuable to all of the participants in

01:09:06   that ecosystem, we're going to, where we can, protect our customers. And we've done lots

01:09:15   of very, very hard things in the EU to minimize the damage. But it is no panacea what's being

01:09:24   put on users there. And I think it's underappreciated, and I understand on a technical audience,

01:09:30   why they might think that. I can tell you and from the point of view of our security

01:09:35   teams and those that work on protecting our users, this is a serious issue. And so we're

01:09:43   trying to do the best we can for all the users where we can.

01:09:47   Thank you. I think it's technically minded critics who

01:10:03   don't see things your way, who roll their eyes and think it's spin and not true, that

01:10:11   you guys hear an overwhelming amount of feedback from users who say, oh, I like my iPhone and

01:10:16   iPad the way they are. I've decided I trust Apple, and I like the idea that everything

01:10:24   going into my phone comes through the App Store, which is vetted by Apple. And I mean,

01:10:31   is that true that there really are people? That feedback to us by real users is just

01:10:35   overwhelming in that regard. And look, it's totally valid. Like if

01:10:40   you want an Android phone, go buy it. Right? There is choice out there. It's fantastic.

01:10:49   So there are options. And by the way, in Europe, I mean, more people have Android

01:10:55   phones than iPhones. So some people, though, say they like the overall ecosystem that Apple

01:11:04   has offered. And our customers there write us extensively. Like, don't let them screw

01:11:08   this up. And so we don't want to let them screw it up.

01:11:19   There's another big topic I want to talk about. But before we get to it, I think I would be

01:11:23   remiss not to mark that this WWDC is the 10 year anniversary of Swift. I guess it's me

01:11:37   getting older and the way time compresses. But I still think of Swift as new. But I did

01:11:43   the math. And when Mac OS X 10.0 shipped in 2001, do you remember the code name for

01:11:50   10.0? Which cat? Oh, Jaguar. Cheetah. Cheetah. Very good crack marketing team. When Mac OS

01:12:05   X 10.0 shipped, NeXT was only 13 years old. I know Objective-C technically came out like 1984,

01:12:13   something like that. But it was sort of known for only 13 years. And here we are 10 years

01:12:17   into Swift. How is this going? Extraordinarily well, in many dimensions.

01:12:25   I mean, if you go look around the

01:12:31   industry and see different places where companies have tried to create new languages, very,

01:12:36   very few of them kind of ultimately go anywhere. And Swift was, it has been just a runaway

01:12:46   success for app development on our platform. Right? Just, a million apps on the App Store?

01:12:52   I'm not sure what the number is. It's an extraordinary number of apps that use Swift. There's a whole

01:12:58   generation of developers now who've come up programming in it. But also many of us who

01:13:06   love and loved Objective C have, yeah, hell yeah, have learned to love and appreciate

01:13:15   all that Swift brings as well. So it's been phenomenally successful.

01:13:20   When we introduced the language, though, our ambition, step one was clearly a great language

01:13:26   for app development on our platform. Right? That was mission number one. But the ambition

01:13:30   for the language was as a general purpose systems programming language. And what's happened

01:13:37   semi-quietly while Swift has continued to be successful as an app programming language

01:13:44   is Apple's adoption and some others as a systems programming language. I mean, now with embedded

01:13:51   Swift we're running Swift in like the Secure Enclave processor. We're running Swift on

01:13:57   servers. We're running Swift just all over the, yes, all over the system. And these are

01:14:06   places that historically our only alternative really was C or C++. And many of us

01:14:16   grew up programming those languages. They are by design effectively unsecurable. Right?

01:14:24   They have many interesting properties, but they're operating at a pretty low level of

01:14:28   abstraction and are inherently not very safe. And in the world we live in now, having a

01:14:33   programming language be safe as well as ideally expressive, incredibly productive are big

01:14:39   virtues. We've done so much work on Swift to make it great for interoperability, not

01:14:46   just with Objective-C and C, but now also with C++ that we think there's a new era ahead

01:14:54   where the world needs a safe, expressive systems programming language. And we think Swift is

01:15:01   it. The slide I saw in the State of the Union, I wrote it down, Swift is the best choice

01:15:06   to succeed C++. That's the whole slide. And that is the truth. You look at the alternatives

01:15:14   out there, nothing, look, Swift is built by the team that built probably the most popular

01:15:21   C++ compiler in use today in Clang. And Swift is more native to C++ than any of the alternatives

01:15:30   out there. Its ability to call and interoperate with C++ matters, because there are

01:15:35   code bases out there that aren't going away overnight. And people are going to continue

01:15:39   to write a lot of C and C++, but people are also going to add a lot onto it where using

01:15:44   a language that can interoperate seamlessly with it and let you grow a safer and more

01:15:51   productive code base and also create other things is a unique Swift power. And I see

01:15:57   Swift as having an extraordinary future, of course inside of Apple, but across the industry.

01:16:04   And I think that's the next 10 years.

01:16:13   Obviously we want to talk about Apple intelligence, but I feel if we're going to talk about Apple

01:16:18   intelligence, we should have somebody out here who's actually intelligent about it.

01:16:26   Ladies and gentlemen, John Giannandrea.

01:16:32   Hey guys.

01:16:37   Wow.

01:16:39   All right.

01:16:44   JG.

01:16:46   JG.

01:16:47   JG.

01:16:49   GJ.

01:16:50   What's the deal, Craig?

01:16:56   Sorry.

01:16:59   Sorry for keeping you waiting. It's your colleagues fault for having so much to talk about.

01:17:03   It was fun listening to you backstage.

01:17:07   Siri.

01:17:09   Let me give you a slogan. This time we mean it.

01:17:18   I think when I started working with the Siri team, the first instruction I gave them was

01:17:21   failure is not an option because a lot of people use Siri a lot of the time. And as

01:17:29   it's got better over the years, people just, we see in our data that people just use it

01:17:32   more. And the numbers, I'm not going to tell you what the numbers are, but they're huge.

01:17:39   A billion and a half requests a day.

01:17:42   A billion and a half voice requests a day. But within that, like the mix is shifting

01:17:46   because when people use a voice assistant and it works for them, then they use it more.

01:17:50   So it's a very shifting target.

01:17:54   I'm going to give you a recent example from a friend of mine. This is obviously using

01:17:58   iOS 17. What time zone is Las Vegas in? And the answer was I tried it and got the same

01:18:06   answer. Sorry, I can't help with that, but you can ask me the time in a specific city.

01:18:12   Yes.

01:18:14   If you change that from what time zone is Las Vegas in to what time zone is Las Vegas,

01:18:21   then it gets a good answer.

01:18:22   So this is the NLP brittleness problem, which is you don't want to have to teach your system

01:18:27   all of the ways that human beings might ask for something. So one of the things that we

01:18:31   showed at WWDC was an example of you correcting yourself and saying two different things and

01:18:37   the new models figuring out which thing you meant. And so the good news is that the technology

01:18:42   of language models is getting dramatically better and is less likely to make these fragile

01:18:47   mistakes because they drive us nuts too.
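The brittleness being described is easy to reproduce with a toy hand-built parser. Here is a minimal Python sketch, purely illustrative; the patterns and city list are invented, and this is of course not how Siri is implemented:

```python
import re

# A deliberately rigid intent parser: the style of hand-built NLP that
# breaks when users rephrase. Everything here is invented for illustration.
KNOWN_CITIES = {"las vegas", "new york", "tokyo"}

PATTERNS = [
    (re.compile(r"^what time zone is (?P<city>.+)$"), "timezone"),
    (re.compile(r"^what time is it in (?P<city>.+)$"), "local_time"),
]

def parse_intent(utterance):
    u = utterance.lower().strip().rstrip("?")
    for pattern, intent in PATTERNS:
        m = pattern.match(u)
        if m and m.group("city") in KNOWN_CITIES:
            return intent, m.group("city")
    return None, None  # "Sorry, I can't help with that."

# The two phrasings from the anecdote above:
print(parse_intent("What time zone is Las Vegas?"))     # ('timezone', 'las vegas')
print(parse_intent("What time zone is Las Vegas in?"))  # (None, None): one extra word breaks it
```

A model trained on billions of utterances generalizes over phrasings instead of enumerating them, which is why the shift to language models reduces this class of failure.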

01:18:51   It's good to hear that. Let me clarify something. And I don't think it was unclear in the keynote,

01:18:59   but I just think that it's so important that it deserves to be emphasized over and over

01:19:05   again is everything we've spoken about heretofore is not Apple intelligence, right? Scene learning

01:19:11   and photos, the email categorization, all sorts of stuff that would broadly fall under

01:19:19   your domain is not what you're calling Apple intelligence.

01:19:23   Some of it is powered by Apple intelligence.

01:19:25   Well, so, things like, for instance, the smart script, the math, and so forth. Those

01:19:33   aren't built on the foundation model on device or in the cloud. And so

01:19:39   we're able to bring those to the much broader class of devices because they don't require

01:19:47   some of the power that is foundational to running these foundation models. And so the

01:19:55   keynote was organized in part both to get that big idea out of Apple intelligence, but

01:20:00   also to make kind of clear for people what's for all the devices and what's for the class

01:20:06   of devices that can support Apple intelligence.

01:20:08   So let's talk about the class of devices that can support Apple intelligence. The cutoff

01:20:13   is for iPhone is very recent. It is the iPhone 15 Pro with the A17 Pro system on a chip

01:20:22   and then any iPad or Mac with an M series chip. What, what is that cutoff? What were,

01:20:32   why, why is that the cut?

01:20:34   So these models, when you run them at runtime, it's called inference, and the inference of large

01:20:39   language models is incredibly computation expensive. And so it's a combination of bandwidth

01:20:45   in the device. It's the size of the ANE. It's the oomph

01:20:51   in the device to actually do these models fast enough to be useful. You could in theory

01:20:56   run these models on a very old device, but it would be so slow. It would not be useful.

01:21:01   And so it is not a scheme to sell new iPhones.
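The bandwidth point can be made concrete with a back-of-envelope estimate. All the numbers below are hypothetical, not Apple's actual model sizes or device specs; the sketch only shows why decode speed tracks memory bandwidth:

```python
# Autoregressive decoding is roughly memory-bandwidth bound: generating
# each token reads (nearly) all the model weights once. Illustrative only.

def decode_tokens_per_sec(params_billion, bytes_per_param, bandwidth_gb_s):
    weight_bytes = params_billion * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / weight_bytes

# Hypothetical ~3B-parameter on-device model, 4-bit quantized (0.5 bytes/param):
fast = decode_tokens_per_sec(3, 0.5, 100)  # assumed newer-device bandwidth, GB/s
slow = decode_tokens_per_sec(3, 0.5, 25)   # assumed older-device bandwidth, GB/s
print(f"~{fast:.0f} tok/s vs ~{slow:.0f} tok/s")  # ~67 tok/s vs ~17 tok/s
```

On those assumptions the same model runs about four times slower on the older hardware, before even asking whether its 1.5 GB of weights fit in RAM at all.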

01:21:11   No, not at all. Otherwise we'd have been smart enough to just do recent iPads and Macs too.

01:21:15   Wouldn't we?

01:21:16   Yes.

01:21:17   No, you know, we've had so many, I mean, our first move in any case is to figure

01:21:22   out how we can bring features back as far as possible. Absolutely. That's, I think you've seen

01:21:28   that time and time again with us. What JG says is right here. This is what

01:21:36   it takes. This is the hardware it takes. I mean, it's a pretty extraordinary thing

01:21:40   to run models of this power on an iPhone. And it turned out it, it took us building

01:21:46   this iPhone.

01:21:49   You guys have mentioned the neural engine and that the A17 Pro is the first A series

01:21:54   chip with the neural engine. But is RAM a function of that too? Or is it really primarily

01:22:00   neural engine or it's really all of it?

01:22:02   Sure. It's many dimensions of the system. Yeah. RAM is one of the, one of the pieces

01:22:06   of the total.

01:22:07   Yeah. And the A17 Pro is not the first A chip that's got a neural engine, but it's got a

01:22:10   much bigger neural engine than the chip that came before.

01:22:13   Right.

01:22:15   One thing that it doesn't seem, I don't even know if they're capable of it, but like I,

01:22:23   I find with dealing with other human beings.

01:22:27   Difficult, right?

01:22:29   Difficult. I do. Yeah. Obviously. Well, you know, you know how many, how many colleagues

01:22:34   do I have? I got one podcast on the side with one colleague. Let's see if I can keep, keep

01:22:42   that going. No, but I, I find it, or I should say maddening when I meet somebody who never

01:22:52   says I don't know. I think that the humility of somebody who hopefully you know most of

01:23:01   the questions I'm talking to you about, but if you don't know, I appreciate just saying

01:23:06   I don't know or I'll go find out. Seems like LLMs do not have the I don't know.

01:23:13   They also have the feature that they double down on what they said. I think we technically

01:23:24   call this bullshitting, but so what we have done in the features that we announced is

01:23:34   we've been very careful about applying this technology in a very thoughtful way. So we

01:23:39   don't have features that will write, you know, a college essay about something about the

01:23:45   world, right? We've tried to make this, we've tried to corral this technology to do what

01:23:49   it's really good at doing. So a good example of that would be summarization. That used

01:23:53   to be an open research problem in the NLP community and now it's essentially solved

01:23:57   with guardrails and a whole bunch of careful work, but basically you can summarize an email

01:24:01   now and so that that's that for each of the features that we launched, we've had these

01:24:05   internal debates about is this technology ready for real users to use.

01:24:10   Right. So that you're not going to recommend using glue to stick your cheese on a pizza.

01:24:19   We do not have that feature. Just to take an example at random.

01:24:28   How broadly, how are you thinking though about protecting against, let's say offensive, wrong

01:24:36   or just playing goofy like using glue on pizza responses?

01:24:41   Yeah, so there's two problems that you have to address with that alignment. One is the

01:24:47   hallucination problem you've alluded to just saying random stuff that's not true. But the

01:24:52   other is the research community calls the safety problem, which is saying things that

01:24:55   are inappropriate or suggesting things are bad or illegal, for example. And there we

01:25:04   have to tread much more carefully because we think that the user is using our devices

01:25:10   for their purposes and their creative reasons. And so we don't want to be the arbiter. We

01:25:15   don't want to have a huge block list of words you're not allowed to say or ideas that somehow

01:25:20   don't work in the word processor. So we have to find that balance point where we're not

01:25:24   amplifying inappropriate uses. And we do have a huge number of what we call guardrails and

01:25:31   system techniques to make sure that certain topics are off limits. But it's a very fine

01:25:38   balance as we spend a lot of time and a lot of meetings trying to figure out, because

01:25:42   this is new for us, this is new for the whole industry, trying to figure out where that

01:25:45   balance point should be.
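The layered guardrails being described can be sketched roughly as checks on both the request and the response. The policy list and function names below are invented for illustration; Apple's actual system is far more sophisticated than keyword matching:

```python
# Sketch of input- and output-side guardrails around a generative model.
# BLOCKED_TOPICS is a hypothetical policy list, not Apple's.
BLOCKED_TOPICS = {"hotwire a car", "make a weapon"}

def violates_policy(text):
    t = text.lower()
    return any(topic in t for topic in BLOCKED_TOPICS)

def guarded_generate(prompt, model):
    if violates_policy(prompt):        # guardrail on the user's request
        return "Sorry, I can't help with that."
    output = model(prompt)             # the underlying model call
    if violates_policy(output):        # guardrail on the model's response
        return "Sorry, I can't help with that."
    return output

# Stand-in "model" for demonstration:
echo_model = lambda p: f"Here is a rewrite of: {p}"
print(guarded_generate("Make my picnic email more professional", echo_model))
print(guarded_generate("Tell me how to hotwire a car", echo_model))
```

The balance point described here is in how broad that policy set is: too narrow and harms slip through, too broad and the word processor starts refusing legitimate work.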

01:25:47   And it seems you don't really have, like you said, like you're not writing, Apple Intelligence

01:25:54   is not generating a first draft of an essay. So you can't get it, you can't give it a prompt.

01:26:01   There is no interface to give it a prompt to say, write me a story about shoplifting

01:26:07   from my local Apple store. I'm trying to pick something that is funny and wrong, but you

01:26:17   know, would be a headache for Jaws that, oh, everybody's getting the new Apple Intelligence

01:26:24   to give them instructions on how to steal stuff from the Apple store.

01:26:27   Yeah, an example we use internally and researchers use all the time is like, tell me how to hot

01:26:30   wire a car. It's a factual question. Most large language models, go try it, will probably

01:26:37   refuse to answer that question because they've detected that it's illegal activity and not

01:26:41   going to help with that.

01:26:44   So what you could do with Apple Intelligence is you can sit down and write your own story

01:26:51   that might be inappropriate or goofy or committing crimes in the story, select it and then say,

01:27:02   make this more serious. But that's your story that you started with.

01:27:09   That could be a fictional story.

01:27:11   Right. It could be fictional, but that's not being generated. You're the person. You're

01:27:15   the one who wrote the story. So if now it has more professional sentence structure,

01:27:21   that's on you. I feel like this is a very interesting balance you guys have.

01:27:25   It is. And I mean, we literally had ethicists involved in our discussions on this. We do

01:27:33   have our roots as a personal computing company. This is a tool for you. So if you were writing

01:27:40   a story or maybe you are writing something about a very harmful phenomenon for the purposes

01:27:51   of illustrating why it's bad or to protest against it. And if you're not careful, a model

01:28:00   that you say, well, summarize this for me or help proofread it for me, make it more

01:28:04   professional, a model that says, I'm sorry, I can't do that. This seems to be about

01:28:10   harmful topics. Well, now that's not empowering you as the user of the system. We aren't going

01:28:17   to introduce the harm. We're not going to amplify the harm. But we do think it's a tool

01:28:23   for you. And so that's the line we're carefully trying to draw.

01:28:29   Yeah. And we published a blog post yesterday which goes into some depth about how we built

01:28:37   these models and how they work and so on and so forth, some technical behind the scenes

01:28:40   stuff. And in there we actually listed some of our values about how we approach our Apple

01:28:47   intelligence. And one of those is respect the user's agency.

01:28:59   I'm not surprised and I'm glad to hear it, but with rumors that you guys were heading

01:29:08   towards, I mean, Jaws kind of spoiled it with his tweet when he announced.

01:29:13   Absolutely incredible. Absolutely incredible.

01:29:15   Well, you saw right through that one, didn't you?

01:29:20   But I'll even admit knowing Apple's sensitivity towards the brand and the brand promise to

01:29:29   customers that perhaps Apple would shy away from such things. And I think that's great

01:29:36   that you don't want to limit, but you're not going to give them the story, but you'll let

01:29:41   them make the story they want to make. Yeah, it's very early days. This technology

01:29:45   is very, very nascent. It's very powerful, super exciting, the experiences we've been

01:29:49   able to build this year. But it really is the first of a many, many year journey with

01:29:54   its technology. Image playgrounds, I guess, is the closest

01:29:59   or is the part of it where you're generating the most. Right. And there's three styles.

01:30:07   I think you call them animation, illustration and sketch. Sketch. Sketch. But not among

01:30:15   them is photorealistic. Yes. Coincidence?

01:30:22   A notable absence. Why? Well, because you don't want to make it easy

01:30:32   to make deep fakes. Right.

01:30:34   And there's no reason to do that in the first version of the product. And there's lots of

01:30:37   other tools that you can still get. I mean, let's face it, there is an ick factor

01:30:47   to it. Right. It is. This is kind of gross and uncomfortable and worrisome, really. Right.

01:30:54   Yeah, potentially. Yeah. Yeah. I mean, the opportunities for abuse are really high. And

01:31:02   we think a lot about this. We're focused on communication that's fun and expressive.

01:31:09   We're not trying to create an alternate reality at all. It's very clear when you see our output,

01:31:14   no one's going to go like, oh, my God, JG was on the moon or whatever. Right. Is that

01:31:21   alligator really on a surfboard? Is that alligator on a surfboard? We want to create no confusion

01:31:25   on these points. But like with the new feature in photos where

01:31:29   you can remove unwanted either people or objects in the background, that I'm guessing is generative

01:31:36   technology because the fill is not just, you know, checkerboard. It's actually filling

01:31:42   in what it is. So obviously, the capability is there in in the technology you have. But

01:31:48   there you're not generating reality. You're filling in the reality that was there behind

01:31:54   the object or movement. Yeah, it's a super

01:31:58   fine line. And, you know, how aggressively we would pursue, like, decluttering a photo

01:32:09   is also a point of great debate. We make sure to mark up the metadata of the generated image

01:32:13   to indicate that it's been altered in this way. And like you say, we're not. Yeah, it's

01:32:18   good to. But yeah, we're not trying to generate photorealistic images of people

01:32:27   or places or anything like that. And or when you point your iPhone at the moon, for example,

01:32:37   or a good example, another random example, just off the top of my head. One of if I can

01:32:47   pick one word to summarize the entirety of yesterday's keynote, I would pick this word,

01:32:55   trust. And let me give you an example. Hypothetically speaking, let's say there's another company

01:33:05   that makes desktop operating systems and they see AI as, you know, the bright future that

01:33:19   it really has that this is where and we're on the cusp of it. And maybe they want to

01:33:24   power AI PCs and would introduce a feature that could recall everything that was on your

01:33:33   screen every five seconds. And then it turns out that the first version they shipped is

01:33:42   shipping the database of all of the text that was on your screen all the time in a plain

01:33:49   text database on your startup drive. Hypothetically speaking, I think that's the perfect reaction.

01:34:02   But is that frustrating to you that that it sort of plays into the consumer's worst fear

01:34:11   about this stuff? New stuff is scary. Fingerprint sensor on my phone. Whoa, whoa, whoa. Apple's

01:34:17   going to have my fingerprint. That seems scary. Oh, no. You know, and they're just on device

01:34:21   and it's not just on the device, but it's in a secure enclave. And we can explain to

01:34:25   you how the secure enclave is very clear and would never go to the cloud. And if you get

01:34:28   another device, you've got to use the fingerprint sensor again because we never had your fingerprint.

01:34:32   Oh, OK. Oh, now you're going to scan your face. Oh, no, no, no. Not my face. Right.

01:34:37   But, I'm exaggerating. But I think, it's not really normal? No, really, that

01:34:43   was the reaction. It's normal that people are like, whoa. And then, OK, let me listen.

01:34:50   But like with the Windows recall feature, it literally plays into the worst fears people

01:34:55   have. They hear this feature and then hear 10 days later, somebody figures it out. And

01:35:00   it's like it's all in plain text there. Is that frustrating to you as you're building

01:35:04   out features that you're trying to build trust in that overall are outside? Are we frustrated

01:35:10   by the failings of our competitors? The answer is no.

01:35:28   Let's right before you came out, we were talking about Swift. And to go back to that, a big

01:35:41   part of the news didn't really make the keynote because State of the Union is where developer

01:35:46   oriented news goes. That's you know, I always call it the developer keynote. And Xcode has

01:35:53   a tremendous amount of new generative A.I. features. I think the first cut

01:36:02   is completion, code completion. And the next one is Swift Assist. Am I getting the name

01:36:11   right? Because it's Swift only. Right. It's very specific to Swift. Yeah. I hear that,

01:36:20   though. And I think trust. And I think there's two trust factors. The first one is can a

01:36:24   developer trust the code that's being generated? And then the second one is can the developer

01:36:29   trust that their code isn't being misused to train the data for others? Yeah, both are

01:36:36   true. Right. So, I mean, it turns out these large language models are very good at generating

01:36:42   code if you ground them in code that's either been run and tested or code that came, you

01:36:49   know, was legitimate code that was being used. And there's a technique for doing this called

01:36:53   retrieval augmented generation. And so that second feature is anchored in, you know, ground

01:37:00   truth about what is good working code. And yes, certainly we would not use our developers

01:37:07   code to train our models. In fact, we could say something even stronger, which is we don't

01:37:11   use any of our users data to train Apple Foundation models. So what what can you say about the

01:37:25   training data that you did use to train the code generation features? Where did it come

01:37:30   from? If not from users? Well, the foundation models themselves are trained on a wide variety

01:37:41   of data, some of it public web data and some of it licensed data. And so for the public

01:37:47   web data, these large language models learn to generalize by seeing literally, you know,

01:37:55   billions and trillions of tokens. When we do that with Applebot, we let publishers

01:38:01   opt out of having their website be included in that. But the web is a very big place.
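The retrieval-augmented generation technique mentioned earlier for grounding code generation can be sketched like this. The corpus, the toy word-overlap retriever, and the prompt format are all assumptions for illustration, not how Swift Assist is actually built:

```python
# Sketch of retrieval-augmented generation (RAG): fetch known-good snippets,
# then prompt the model grounded in them instead of generating from scratch.

def retrieve(query, corpus, k=2):
    # Toy lexical retrieval: rank snippets by word overlap with the query.
    q = set(query.lower().split())
    ranked = sorted(corpus.items(),
                    key=lambda kv: len(q & set(kv[0].lower().split())),
                    reverse=True)
    return [snippet for _, snippet in ranked[:k]]

def rag_prompt(query, corpus):
    # The model then generates anchored in retrieved, verified code.
    context = "\n".join(retrieve(query, corpus))
    return f"Using only these verified examples:\n{context}\n\nTask: {query}"

corpus = {  # hypothetical index of tested snippets, keyed by description
    "sort an array in swift": "let sortedNumbers = numbers.sorted()",
    "read a file in swift": "let text = try String(contentsOfFile: path)",
}
print(rag_prompt("sort an array in swift", corpus))
```

Because the model's answer is anchored in code that has actually been run and tested, it is much less likely to hallucinate an API that does not exist.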

01:38:07   And then for the Swift code specifically, I actually don't know the answer to the question.

01:38:12   Oh, look at you like that. He said, I don't know. Yeah. It's one of my team's features, but

01:38:19   I do know that it uses RAG. So it's grounded in code that's probably been licensed or otherwise

01:38:24   acquired. Yeah. And then we do a lot of synthetic data generation as well. When it comes to

01:38:28   coding data, we can take things like our own documentation. All right, that

01:38:38   is someone who works on our documentation. But use it to prompt a model to say,

01:38:44   what are the kinds of things people would want to do with this API? Write a program that

01:38:47   does that and then create a pipeline that verifies that that's working code, corrects

01:38:53   it, et cetera. And so through through these kinds of advanced pipelines, you can generate

01:38:58   more and more great examples, which then in turn become great, great training data or

01:39:04   great prompts at runtime to answer questions. So synthetic data, I think, is

01:39:08   going to become more and more of the answer for making these models smarter, especially

01:39:13   if you can build a good evaluation feedback loop to establish like this. This generated

01:39:18   output is actually correct. So a question very specifically for you, Craig, is are the

01:39:25   engineers on your teams using these features only in a pilot sense so far? Because these

01:39:35   things have come together. Well, we've been working on training them and getting them

01:39:39   into a great shape. They've only been we've only achieved the performance and quality

01:39:43   level we wanted very recently. And so I can't claim that, you know, this this release was

01:39:48   X percent written by our models, but people who use them love them. I mean, the acceleration

01:39:59   is, sometimes, you know, kind of a mind-reading experience. You

01:40:03   know what you want. You're like, wow, OK, I did it. It gave me exactly what I wanted.

01:40:07   And so I expect we'll be using it very regularly internally. Well, they will be. So I don't

01:40:12   mean to say present tense in terms of I know that this is new and maybe, you know, very

01:40:17   early release developer tools aren't being used for production releases of software.

01:40:22   But the plan is that this is not just for others, like, oh, OK, you guys want code

01:40:27   generation in Xcode. Internally, it will be dogfooded. Oh, for sure. This is

01:40:31   like a lot of internal demand, a lot of interest in it. And absolutely. And in terms of that

01:40:38   fear or just back of your mind suspicion that, oh, I don't want to use a tools that have

01:40:44   a server component with my data because I don't know what happens with it. Engineers

01:40:50   on your team will be able to use it with Apple intelligence, even when it goes to the private

01:40:57   cloud compute. Knowing the secrecy that your teams work with, even inside Apple. Yeah,

01:41:04   well, and it's, I mean, private cloud compute is a topic I'd love to dig into. The predictive

01:41:11   code completion feature you talked about, though, one of the great things about it is

01:41:14   it's entirely on device. Right. So we really get... Yeah. Which I think is relatively

01:41:20   novel. I think a lot of the tools out there that do that now are involving sending your

01:41:24   code off to somebody servers and not private cloud compute servers, for that matter. But

01:41:30   this is all running on device. It's incredibly fast and it has it's going to work when you're

01:41:34   on the plane or offline, et cetera. So I think that's a fantastic component. But, yeah, I

01:41:40   mean, a huge part. And I think many of us who wanted to use models for a lot a lot of

01:41:50   times what you want to use them for are things that involve personal or confidential information.

01:41:55   You know, if you're going to like revise an email or fix up some code, odds are that's

01:42:02   not code you just are attempting to give away, or confidential information that you

01:42:09   were intending to leak into someone else's training set. And so for us, a prerequisite

01:42:16   to doing personal intelligence that involves data you actually care about is ironclad privacy.

01:42:24   And there. Yes. And on device, of course, a fantastic answer where that works. And we've

01:42:32   really been pushing the limits of what is possible to do on device. And part of this

01:42:36   is specializing these models for certain high value tasks by building adapters that can

01:42:41   achieve really high-level performance. But sometimes it's hard to beat having tons of

01:42:47   compute to throw at the problem in an instant when it comes to these big things. But we

01:42:52   don't want that to mean the compromise of giving away your data or losing control of

01:42:57   your data in any fashion. And so we absolutely moved mountains across like every component

01:43:05   of the stack from having our hardware team build us custom servers to us building a custom

01:43:11   OS and inventing a lot of technology for trusted attestation and server management, a bunch

01:43:19   of code written in Swift to do that, to create, I think, for the first time in the industry,

01:43:26   an at-scale AI inference platform where the operator of the platform and no one else

01:43:35   can have any access to the data that's being used for inference. So now you can use this

01:43:40   thing and really feel like, oh, I'm not I'm not exposing my data to others. It's existing

01:43:46   in that same privacy bubble that has protected my data on my phone. And that's been extended

01:43:52   to the cloud. And we think that's an extraordinary engineering achievement. And I think what

01:43:56   the future of AI should be. Was this table stakes for Apple that, OK, some of these tasks

01:44:14   are going to require cloud compute today. And if they're going to and we want to have

01:44:20   these features, therefore, we need to do what Craig just said and build out Apple's own

01:44:27   servers, Apple's own everything. Yeah. Yep. Private by design. I have to ask, what is

01:44:40   the Apple silicon in the cloud servers? We didn't specify. I said we didn't specify.

01:44:48   Oh, we didn't specify. That's the way we choose not to answer. John, when you say how many

01:44:56   of the servers there are a lot. It's very powerful. I had some value up here. John,

01:45:10   all right. Let me ask about this with private cloud compute. A lot of people are very concerned

01:45:17   with the rise of A.I. in general, which in until yesterday was almost entirely for a

01:45:28   large language models and generative in the servers in the cloud. The environmental impact.

01:45:35   Very, very serious concerns. What is the story with Apple's private cloud compute and its

01:45:43   environmental impact? I mean, there are two reasons why we wanted to build private cloud

01:45:48   compute out of Apple Silicon. One is the privacy architecture it gives us, you know, building

01:45:53   on the Secure Enclave, trusted boot, every other thing about the systems. But the other

01:45:59   is the incredible energy efficiency of Apple Silicon, which, you know, is forged in our

01:46:04   history of building mobile silicon. This is not the history of your typical A.I. inference

01:46:10   silicon, right? But it is ours. And so we just get tremendous efficiency out of our

01:46:18   private cloud compute. And this is going to allow us to continue to have our data centers

01:46:22   be 100 percent carbon neutral, with renewable energy. More than that. It's more than 100?

01:46:28   No, no, it's all renewable energy, right? It's all... Yeah. So it's not just

01:46:33   carbon neutral. It's entirely 100 percent renewable energy. That's right. Yeah.

01:46:40   So to clarify, this is not with carbon offsets. No, this is 100 percent renewable energy.

01:46:47   So it's a phenomenal environmental story. Not to mention that a ton of this is running

01:46:51   on your device, which is inherently very efficient. Right. And not even going to the data centers.

01:46:57   But as you know, a big concern for generative A.I. has been the amount of energy

01:47:01   they're using off the grid. And we have an incredible Apple kind of story on that.

01:47:09   Is it possible that achieving that took a little extra time? Hence the patience. Patience.

01:47:19   But I do think that there was, you know, with the rise of A.I. and, you

01:47:24   know, a couple of years ago, the hot new thing was cryptocurrency. And I don't think people

01:47:30   were waiting for Apple to mint its own cryptocurrency. I think most people expected Apple to let

01:47:37   that one pass. And with A.I., there is undeniably not just a little there there,

01:47:44   a lot of there there. And I think there was a consensus of, oh, that 2030 goal, the whole company

01:47:51   is going to be carbon neutral? Wow, that goes out the door because now there's this monkey

01:47:56   wrench thrown in. But no, no, this does not disrupt it. This doesn't move it back to 2031.

01:48:03   It's day one. Renewable energy. Yeah. I think that that goal is very much alive and well.

01:48:18   Can you speak to me a little bit about this? This idea of verifiable images on the cloud

01:48:22   compute. It's not source code that you're giving to researchers. They're images. But

01:48:28   how does getting an image of the OS let them verify the integrity of the

01:48:35   OS in the cloud? A couple of things. So our benchmark really is iPhone. Right. With

01:48:43   your iPhone, of course, every time we make a change to the behavior of iPhone,

01:48:51   it's because we've shipped a software update with an image of the software, and researchers

01:48:56   are able to debug that system and observe its behavior and check its code and so forth.

01:49:04   And traditionally in the cloud, that's totally not the case. Right. If someone's operating

01:49:10   a server, you're sending your request to their server. You have no idea what's running

01:49:15   on that server. You have no way to inspect what's on that server. Even if at one moment

01:49:20   in time it got audited, you don't know if a minute later it changed its code and a minute

01:49:24   later changed it back. I mean, you just can't have any confidence in what's happening there.

01:49:30   Very much unlike what's happening on your phone. So we want to make sure we could give

01:49:35   that same property. And we do this by saying that the whole trusted boot infrastructure

01:49:45   that the SEP, the Secure Enclave Processor on the server, is measuring, is able to identify

01:49:53   exactly the total image of that software. And this OS is also frozen so that no new

01:49:59   code can be added. No new code can be generated at runtime. Like the OS literally won't map

01:50:04   in any executable code pages that aren't part of the encrypted image. And when you make

01:50:09   a request to that server, there's a handshake where the SEP attests and says this was a

01:50:16   trusted boot, and here are the measurements, the signature of this OS that you would be

01:50:22   talking to if you were to send me the request and here are the keys to have that conversation

01:50:27   with this thing. Well, separately, your phone is going to look in an attestation log, an

01:50:34   append-only log, that kind of crypto technology, that Apple has to post to, that

01:50:42   these are the known, posted, public images. And your phone says, is the image I'm talking

01:50:48   to a publicly posted image? If it's not, I will not talk to that device. So even if an

01:50:55   attacker were to try to sign an image, put it on that server and try to intercept a request,

01:51:03   your phone would refuse to talk to it. So there you've got these images that we have

01:51:08   to publicly post. We are then also equipping researchers with VMs so they can run those

01:51:15   images on a Mac and debug and test them and verify our claims as well, which is even a

01:51:22   step up from what they're offered for iPhone. And do the same thing where they can take

01:51:27   the image on their local VM that they're testing and check that its signature matches. Exactly.

01:51:33   The one that you've posted to this append-only chain, so they can verify that you didn't

01:51:38   ship, you Apple didn't ship the researchers a different image. Exactly. Exactly. And no

01:51:50   one's ever done anything like that. Not that I know of. Yeah. It's not a subtle difference

01:51:57   but a complete inversion: with Apple Intelligence, the starting point is that

01:52:02   because you're running on the customer's devices, you've got all of this personal context

01:52:11   to start with, as opposed to starting with whatever was just typed in a box. How different

01:52:18   is that in terms of are your models along the lines of other models and you just have

01:52:27   a different data source or is it an entirely different approach to creating generative

01:52:33   models? The core models are so-called foundation models. What's new and novel is that they

01:52:40   can operate, the ones on device can operate over their semantic understanding of the user's

01:52:46   data. So we have a semantic index of your messages, things on your device, as Craig was mentioning

01:52:53   earlier about relationships in your photos and so on and so forth. And so the model can

01:52:57   reason over that private data. And what that gives you essentially is a personalized feature

01:53:04   that is personal to you and runs on your device. Whereas these other chatbot like products,

01:53:10   they have tremendous world knowledge, they've ingested all of Wikipedia or whatever, but

01:53:14   they don't know anything about you. And so they're limited to talking in a completely

01:53:17   unpersonalized way. And we think that that's just interesting but less useful than getting things

01:53:23   done with your data like we demonstrated at DubDub.

01:53:29   Right like typing into a chatbot when does my mom's flight arrive?

01:53:34   It won't work, right?

01:53:36   It couldn't.

01:53:37   It'll make something up.

01:53:38   Right, right, right, because it's not going to say I don't know. We can try it right now

01:53:43   and it'll.

01:53:44   But I mean it's even more foundational than that. If you ask a chatbot what time is it?

01:53:47   Right.

01:53:48   It doesn't know. There's no concept that there's an API call that you should be making.
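
The point here, that a bare chatbot has no concept of an API call it should be making, is essentially the tool-calling pattern: route a request to code instead of generating an answer from model weights. A minimal sketch in Python; every name is hypothetical, and the "semantic index" is a toy dictionary, not anything Apple has described:

```python
from datetime import datetime

# Toy stand-in for an on-device semantic index of personal data
# (hypothetical; a real system would index donated app content).
SEMANTIC_INDEX = {("flight", "mom"): "Mom's flight lands at 6:05 PM"}

# Tool registry: intents the assistant satisfies by calling code
# rather than by generating text from the model's weights.
TOOLS = {
    "current_time": lambda: datetime.now().strftime("%H:%M"),
    "flight_lookup": lambda person: SEMANTIC_INDEX.get(("flight", person)),
}

def answer(query: str) -> str:
    """Route a query to a tool when one applies; a bare chatbot
    would have to make something up instead."""
    q = query.lower()
    if "what time is it" in q:
        return TOOLS["current_time"]()
    if "flight" in q and "mom" in q:
        return TOOLS["flight_lookup"]("mom") or "No flight found."
    return "I don't know."  # honest fallback when no tool matches
```

The interesting design point is the fallback: with tools, "I don't know" is a legitimate answer, where a purely generative model tends to invent one.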

01:53:53   Yeah, and I know, John, you sort of track how different parts of our stack develop over

01:53:57   time, and what's interesting about the semantic index is that for many, many years a foundational

01:54:02   part, even going back to Mac OS X, has been Spotlight, and with iPhone, apps contributing

01:54:10   information into Spotlight so that the user could search them, or so that the app could

01:54:14   use powerful search from the system index within the apps.

01:54:20   Well, that same interface, the same way that apps are donating that context to Spotlight,

01:54:25   that is the source: the semantic index is then able to take that data and index it

01:54:29   not just by keyword and so forth but semantically, and that becomes the source for Apple Intelligence

01:54:38   running on device to look things up.

01:54:40   So it's not that there's a whole new, there's no new information being exposed to the system.

01:54:44   It's the same information that's been exposed to Spotlight all along and there's no new

01:54:49   work for developers. Another good reason for developers though to provide information into

01:54:53   Spotlight, because now it's making Apple Intelligence even more able to help the user and to direct

01:55:00   the user back into doing activities inside that app.

01:55:04   Probably worth saying it's stored securely on your device as well.

01:55:07   Yeah that too.

01:55:08   Yeah.

01:55:09   Yeah in the same way the data always has been on your iPhone.
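
The Spotlight-to-semantic-index idea Craig describes can be sketched as one donation store with two lookup paths: exact keywords, plus a "semantic" match. This toy version fakes semantic matching with a small synonym table where a real system would compare embeddings; all names are illustrative, not the Core Spotlight API:

```python
# Items "donated" by apps, as they long have been for Spotlight search.
DONATIONS = [
    {"app": "Mail",     "text": "Your flight arrives at 6:05 PM"},
    {"app": "Messages", "text": "Mom: see you at the airport"},
]

# A real semantic index would embed text with a model; we fake
# semantic similarity with a tiny synonym table so related words
# can match without sharing a literal keyword.
SYNONYMS = {"plane": "flight", "landing": "arrives", "mum": "mom"}

def _terms(text: str) -> set:
    """Lowercase, strip punctuation, and map synonyms to one term."""
    return {SYNONYMS.get(w.strip(".,!?:;"), w.strip(".,!?:;"))
            for w in text.lower().split()}

def search(query: str) -> list:
    """Return donated items sharing at least one normalized term."""
    q = _terms(query)
    return [d for d in DONATIONS if q & _terms(d["text"])]
```

So a query for "plane landing" finds the Mail item even though neither word appears in it literally, which is the "not just by keyword" behavior described above.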

01:55:12   And the difference in context, like I said, it's very useful over here with world knowledge,

01:55:18   but entirely different, and in many more practical ways, with personal context.

01:55:25   But in your career really, do you find yourself thinking about how humans deal with context

01:55:35   naturally? Craig just addressed me as John a minute or two ago.

01:55:39   You knew he was talking to me even though you're John.

01:55:44   Does your mind, as someone whose career has been like this, do you constantly think about

01:55:50   how you knew that that was me John, not you John?

01:55:53   Yeah.

01:55:54   I have spent a considerable part of my career working on this problem.

01:56:01   So I mean the technical answer to this question is the breakthrough with transformer models

01:56:07   and LLMs that has caused all of the excitement of the last few years is around an idea called

01:56:14   attention which is that not all data is equally relevant when you're considering a time sequence

01:56:19   of things.

01:56:20   So this is why we use examples like when do I need to leave to go pick mom up at the airport

01:56:24   because the model really, truly has understood the notion that you have to do one thing before

01:56:28   the other, and therefore that it needs to find the flight for your mom.

01:56:34   So let's go call the semantic index and see where that flight is.

01:56:36   It could be in my email, it could be in messages, whatever.

01:56:39   But that notion that there is the idea of picking somebody up from the airport is actually

01:56:44   really learned by these models.

01:56:48   And that has really been the breakthrough for LLMs and why this technology is so important

01:56:52   to figure out how to use appropriately.
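
The attention idea described here, that not all data in a sequence is equally relevant, is mechanically a weighted average: each value is weighted by how similar its key is to the query. A minimal single-query, pure-Python sketch of scaled dot-product attention (the textbook operation, not Apple's models):

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    es = [math.exp(x - m) for x in xs]
    total = sum(es)
    return [e / total for e in es]

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector: score each
    key against the query, softmax the scores into weights, and
    return the weighted average of the values (plus the weights)."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    out = [sum(w * v[i] for w, v in zip(weights, values))
           for i in range(len(values[0]))]
    return out, weights
```

With a query that lines up with the first key, most of the weight lands on the first value, which is the "relevance" being described: the model learns to attend to the parts of the sequence that matter for the task.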

01:56:55   That brings me to the last topic from the keynote and that is the partnership with OpenAI

01:57:00   and ChatGPT.

01:57:02   Here's my question.

01:57:03   Everything we have spoken about in this very long interview to date has all been Apple

01:57:08   technology.

01:57:09   Yes.

01:57:10   Correct.

01:57:11   Yeah, 100%.

01:57:12   All of it.

01:57:13   All of it.

01:57:14   And this is a lot and it is extremely useful and it is broad.

01:57:17   So why partner with OpenAI?

01:57:21   Well, we think there are use cases for these larger world models.

01:57:28   They're not the use cases we've been talking about.

01:57:31   But we know that people are pretty excited to use these models for all kinds of tasks.

01:57:36   And what we think is going to happen over the coming years is there will be many of

01:57:40   these large models that are specialized for very specific things.

01:57:44   And some of them will have different business models.

01:57:45   Some of them will be free.

01:57:46   Some of them you'll have to pay for.

01:57:48   But there may be legal models or maybe models that understand healthcare really well.

01:57:53   And so we think that baking an entry point into these models into the user interface

01:58:00   is kind of sort of necessary.

01:58:03   Otherwise you're faced with, like, bringing up a chatbot and cutting and pasting stuff.

01:58:08   And we respect what these models can do enough to say we should make it easy to do.

01:58:14   And we should give you a choice ultimately of models to use.

01:58:17   And so we'll see how users like the integration that we've done.

01:58:22   But we think that this technology is not going away anytime soon.

01:58:27   And I think of it a bit like the way Safari deals with search engines.

01:58:31   Search engines, sure.

01:58:32   Right?

01:58:33   I mean, it just became a thing that you would type into the URL bar your search term and

01:58:38   that's just a good way to do that.

01:58:40   Well similarly, if you actually do want to write an essay about blue whales and you want

01:58:43   to use ChatGPT to do it, we want to make it as easy as possible.

01:58:46   And we want to make it very super clear to you that you're using ChatGPT, that the thing

01:58:49   you're getting back might have hallucinations.

01:58:52   And so that's why we've done the integration that way.

01:58:55   Right.

01:58:56   All right.

01:58:57   It's probably worth clarifying.

01:58:58   We talk about it sort of built in, the integration built into the system.

01:59:06   The integration is turned off by default.

01:59:08   So when you first, I don't know if that's good, but it's a thing that when, you know,

01:59:15   if there were ever a time when the system thought that you might want to use ChatGPT,

01:59:21   it's going to say, do you want to set this up?

01:59:23   Right.

01:59:24   So it's not, there's nothing that's happening automatically or whatever.

01:59:27   You have to say, yes, I want to enable this feature.

01:59:31   And you know, if someone for some reason just never wants it to be suggested to them at

01:59:35   all, they also can just say, you know, never ask, I have no interest in this feature.

01:59:40   And so this isn't somehow, you know, insidiously integrated and pre-enabled or anything like

01:59:46   this. It's just available if someone wants to turn it on.

01:59:51   And it's free and it doesn't require a subscription.

01:59:54   And your IP address is obscured and your requests are not logged, you know, so there were a lot

01:59:59   of really cool things.

02:00:00   There's a lot we did to make sure that if you were going to use it, that it was as good

02:00:03   as it could be.

02:00:06   So who's paying whom?

02:00:13   I pay Craig.

02:00:14   Craig pays Sam.

02:00:16   Yeah, sorry, you know, no answer.

02:00:23   You knew the answer.

02:00:26   Can you tell me, is it a good story?

02:00:29   Do I gotta get Eddy up here?

02:00:35   And a bottle of tequila.

02:00:38   All right, last question.

02:00:41   I've seen this floating about on social media, the same sentiment, or something to that effect,

02:00:47   but I think it's everywhere because I think it resonates with people.

02:00:53   I don't want AI to create songs and write poetry and make paintings so that I can wash

02:01:02   my dishes and do my laundry.

02:01:06   I want AI to wash my dishes and do my laundry so that I can make music and I can make paintings

02:01:12   and I can make art.

02:01:16   And I think it speaks to Apple as a tool maker for people who want to do things like that.

02:01:22   It dates to the origins of the founding of the company.

02:01:26   Do you guys see it that way?

02:01:29   That you're making tools for creative people, not tools to replace creative people?

02:01:33   Yeah, 100%.

02:01:34   Absolutely.

02:01:35   But I do think these tools will amplify people's creativity and capability.

02:01:43   How many people now, I mean, just to go to technology from the 80s, could pick up a synthesizer and

02:01:51   make music that maybe previously involved an orchestra, and now they're composing something

02:01:58   so much bigger because they have a powerful instrument at their disposal.

02:02:02   I think these tools are going to enable people to, a single individual to express their creativity

02:02:09   in bigger and bigger ways.

02:02:11   I think that's tremendously empowering, and that's what it's for, for us.

02:02:17   And that's what computing has always really been for: to augment human creativity.

02:02:24   Good answer.

02:02:27   Let me thank my guests: John Giannandrea, JG; Craig Federighi; and Joz.

02:02:34   Let's go with Joz.

02:02:40   I would also like to thank everybody here at the theater.

02:02:44   I would like to thank my friends at Sandwich who shot the video, Spatial Jen who shot this

02:02:52   and hopefully live streamed it.

02:02:56   I would like to thank my wife and my son who are here helping backstage, Amy and Jonas.

02:03:04   I would like to thank my sponsors: iMazing; Flexibits, makers of Fantastical and Cardhop; and of

02:03:11   course Flighty. Thank you to them, and thank you to you for coming.

02:03:17   This is great.

02:03:18   And don't forget the helmet.

02:03:19   Good job.

02:03:19   [Applause]

02:03:37   [End of Audio]