562: Do You Have a Dragon?


00:00:00   If you wanna really screw anything up big time,

00:00:03   really, if you want to just really make some of the biggest

00:00:06   public mistakes of your life, now is the time to do it,

00:00:10   because at least we're not the OpenAI board.

00:00:12   (laughter)

00:00:14   - You're jumping ahead, my friend.

00:00:15   Jumping ahead. - What a mess.

00:00:17   Oh my God.

00:00:18   Wow, I mean, you talk about people having bad weeks.

00:00:24   Like, sometimes, oh yeah, they're having a bad week.

00:00:26   Boy are they having a bad week.

00:00:28   - At least it's relatively low stakes, I feel like.

00:00:31   (laughter)

00:00:32   It's fun because it's only like, whatever,

00:00:34   80 billion dollars on the line, but it's, you know,

00:00:36   in the end, it's people with computers and stuff and things,

00:00:38   and it's not like life or death,

00:00:40   and most of the people involved,

00:00:42   you feel okay laughing at them a little bit

00:00:44   because they're such big, distant figures,

00:00:46   they don't seem like real people to you.

00:00:48   But rest assured, they are,

00:00:49   and that's probably why they're in this mess.

00:00:52   (electronic beeping)

00:00:53   - Final warning.

00:00:55   This is the moment where you, a dear, beloved listener,

00:00:59   who has not yet put in your order,

00:01:01   you sit or stand there and you think,

00:01:04   I've got this, no worries.

00:01:06   I'm gonna take care of that as soon as I get to the office,

00:01:08   or I'll take care of it as soon as I get home.

00:01:11   And then, you, dear, beloved listener, you forget.

00:01:16   And then what happens?

00:01:18   Then the rules state, you have to go tweeting,

00:01:21   even though I won't see it, or tweeting at me,

00:01:23   saying, I'm the one, I'm that person this time.

00:01:28   Don't be that person, don't do that to yourself.

00:01:30   Go to ATP.fm/store.

00:01:32   This is your final warning, this is all we got, this is it.

00:01:35   The sale ends this weekend, Sunday, November 26th.

00:01:40   That's it, that's all the time you get.

00:01:43   So, go now, don't hesitate, pull over the car,

00:01:46   use a signal, because you're an adult and courteous,

00:01:50   use a signal.

00:01:51   If you're walking, do what you need to do

00:01:53   to indicate to those behind you,

00:01:55   you will be pulling to the side of the sidewalk.

00:01:57   If you're biking, God help you,

00:01:59   but do what you gotta do as a bikist, as Marco would say,

00:02:02   to get to the side of the road

00:02:04   and not get run over by a car.

00:02:06   - That makes people so mad, by the way.

00:02:08   - I know, I know it does.

00:02:09   - They don't get that it's a joke, it makes them so mad.

00:02:12   - I know, we know that bikist is not a thing.

00:02:15   Anyways, all kidding aside, go to ATP.fm/store,

00:02:18   check out our merch, check out our wares,

00:02:20   and remember, if you are a member,

00:02:23   you get 15% off the ATP store

00:02:27   during this time-limited sale

00:02:29   and all other time-limited sales, now is the time.

00:02:32   I know we kinda glossed over it last week,

00:02:34   but Jon, would you mind just doing

00:02:36   a brief overview of everything,

00:02:37   just one last time to really sell it,

00:02:40   just nail it and send it home.

00:02:42   Let's get a few more sales, Jon, what do we have on offer?

00:02:45   - We're not gonna go through all the merch again,

00:02:46   they know what's there, people know.

00:02:48   - All right, I'll go through all the merch,

00:02:49   why you gotta be such a party pooper?

00:02:51   All right, I'll do a speed run.

00:02:52   We got ATP Pixels, we've got ATP Space Black,

00:02:56   we've got the M3 shirt, the M3 Pro shirt, the M3 Max shirt,

00:02:59   we've got ATP Six Colors, which is six colors of fabric,

00:03:02   actually more than six colors of fabric,

00:03:03   but the text is always white,

00:03:05   really should've workshopped that name, but here we are.

00:03:07   ATP Logo shirt, the OG, we've got the ATP hoodie,

00:03:10   ATP Polo, don't let me down, Polo ponies, Polo people,

00:03:14   don't let me down, this is my jam,

00:03:16   get an ATP Polo, please and thank you.

00:03:18   We've also got the ATP mug, now in white

00:03:21   and a bluey purpley sort of thing,

00:03:23   Jon will correct the actual--

00:03:24   - It's cobalt, please, Casey, cobalt.

00:03:26   - Of course, I said that, it's definitely cobalt,

00:03:28   definitely not a bluey purpley thing.

00:03:30   - It's recycled cobalt, don't worry.

00:03:31   - Sure, and the ATP pint glass,

00:03:33   so go now, ATP.fm/store, and remember, if you're a member,

00:03:38   go to your member page, landing strip,

00:03:42   whatever you call it thing,

00:03:43   and go get your bespoke discount code for 15% off,

00:03:47   ATP.fm/store, this is it, people,

00:03:50   the 26th, that's all you got.

00:03:52   - Yeah, the only thing I'll add is,

00:03:53   remember that a lot of the products come in

00:03:55   long sleeve T-shirt versions and sweatshirt versions.

00:03:59   I think this is kind of the first time we've done those

00:04:01   in a lot of these products, so check,

00:04:02   even though a lot of times it will show a picture

00:04:04   of a T-shirt, if you watch long enough,

00:04:05   the picture will rotate and you'll see

00:04:06   some long sleeve varieties, and it's the cold winter months

00:04:10   in our parts anyway, so think about those long sleeve

00:04:15   options if you're interested, and this sale does span

00:04:19   Black Friday, we did this intentionally,

00:04:21   so it ends on the 26th, Black Friday is a couple days

00:04:23   before that, you're gonna be thinking about shopping

00:04:27   for the holidays, probably, or you'll be bombarded

00:04:30   by messages about shopping for the holidays,

00:04:33   don't forget, this is your last chance,

00:04:35   I know we did this sale kind of early,

00:04:36   but we're trying as best we can to have a chance

00:04:38   of getting these things to you in time for the holidays,

00:04:41   so, you know, nerdy gifts for yourself or for others,

00:04:44   don't forget, when Black Friday arrives,

00:04:46   that's another reminder for you to hear Casey's voice

00:04:49   in your head, just saying, oh wait, was I supposed

00:04:50   to buy some nerd thing?

00:04:51   Yes, ATP.fm/store.

00:04:53   - Indeed, there's also Ute sizes as well,

00:04:56   which we added a little bit late, apologies about that,

00:04:58   so you can check out for your kids, you can get them stuff

00:05:00   for friends, kids, you know, whatever you wanna do,

00:05:03   ATP.fm/store, I'm telling you, this is good stuff,

00:05:06   and you should definitely check it out,

00:05:08   and time's running out, my friends, time is running out.

00:05:12   I feel like I had something else that I wanted to add,

00:05:13   but I forgot what it was, so, that's all right.

00:05:15   We'll just make me feel, we'll make me sound smart in post,

00:05:17   right Marco, right, right, right, okay, good.

00:05:18   - Garbage in, garbage out.

00:05:20   - Oh, brutal, brutal, well played, but very mean.

00:05:25   Keyboardware, let's talk about,

00:05:28   God, you really derailed me with that.

00:05:29   Keyboardware, let's talk about what happens to keyboards

00:05:32   because of your damn dirty fingers.

00:05:34   Ryan Holmes writes, Casey, I think what you're seeing

00:05:37   as finger grease is likely what keyboard enthusiasts

00:05:40   call quote unquote shine that happens to ABS plastic.

00:05:43   It's relatively soft plastic that is susceptible

00:05:45   to finger oils and heat.

00:05:47   To John's point, some people's finger oils shine them

00:05:49   more than others.

00:05:50   When you use a magic eraser to remove the shine,

00:05:52   you are literally abrading off a layer of plastic.

00:05:55   Okay, so, interrupting briefly, I don't debate

00:05:57   that that's very possibly true, but I'm not like sawing

00:06:01   the keyboard with a magic eraser, like,

00:06:03   just a quick glance with it is more than enough,

00:06:06   and I only do this a handful of times a year, like I think--

00:06:08   - But that is true, though, like that is how the abrasive

00:06:11   in a magic eraser works, you are taking a layer off.

00:06:14   It's just a very small layer

00:06:15   'cause it's a very fine abrasive.

00:06:16   - But you're also leaving bits of the magic eraser

00:06:18   on your keyboard and going down into the,

00:06:20   I'm still against the magic eraser.

00:06:22   - Yeah, it's not a great option to have to use.

00:06:25   Ideally, you don't need it.

00:06:27   'Cause you're right, there is some risk of getting

00:06:30   little dots of it stuck in there and jamming up the keys.

00:06:34   Obviously, now that we have a little more robust keyboards

00:06:36   than we used to, that risk is lower, but it is still a risk.

00:06:40   So yeah, you are better off not doing that

00:06:41   if you can help it.

00:06:42   - That's fair.

00:06:43   Continuing back to what Ryan Holmes was writing,

00:06:45   ABS is nice for keycaps because they are easy to double shot,

00:06:48   meaning the letters can be embedded into the key

00:06:50   and won't wear off.

00:06:51   It's a separate layer of plastic molded inside.

00:06:53   This also enables any color combo.

00:06:56   A more durable option is PBT, polybutylene tere--

00:07:00   Oh my gosh, that's all it costs, that's all it wants.

00:07:02   Terephthalate, which takes much higher temperatures

00:07:06   to deform, it does not quote unquote shine.

00:07:09   It often has a deeper sound and can be double shot,

00:07:10   but double shotting it is not common

00:07:12   and can involve a blend with ABS.

00:07:14   The more common way to print legends on PBT

00:07:17   is dye sublimation, which involves heat and dye.

00:07:21   The dye embeds into the top layer of the key

00:07:23   so the legends will essentially never wear off.

00:07:25   There is so much lingo here, my word.

00:07:28   I know every interest and profession

00:07:31   has their own vocabulary, but my gosh.

00:07:33   Anyways, it's difficult to get them as crisp

00:07:35   as double shot, however, and they will not

00:07:37   let LEDs glow through.

00:07:39   PBT is also difficult for light legends on dark caps

00:07:42   since the whole cap will need to be dyed

00:07:43   except the legend, which is masked out.

00:07:45   This amount of dyeing involves longer exposure

00:07:47   of the plastic to heat and often deforms larger keys,

00:07:49   most notably space bars.

00:07:50   Cheaper OEM keyboards use other kinds of surface printing

00:07:53   that wears off relatively easily.

00:07:55   So that was way more information than certainly I needed

00:07:57   or wanted to know about keyboards.

00:08:00   - Some background on Apple's keyboards,

00:08:01   so Apple's using ABS, which is the softer plastic.

00:08:04   Apple has used PBT in the past on older laptops

00:08:07   before your time, but back in the day,

00:08:10   the PowerBooks used to come with keyboards

00:08:12   that were kind of hilarious.

00:08:13   They looked kind of like, I mean,

00:08:15   you've all seen the Apple Extended Keyboard II, right,

00:08:17   even if you've never actually touched one,

00:08:18   what those key caps look like.

00:08:19   Imagine those, but imagine you took them into a 3D program

00:08:23   and you squished them so they're not as high, right?

00:08:26   And that's what they made the PowerBook key caps look like.

00:08:28   From the top, it looked like,

00:08:29   "Oh, that's just a regular key."

00:08:30   But then you'd look from the side

00:08:31   and you'd see they were squished out.

00:08:32   Like they had the same kind of like slanted edges,

00:08:35   but just much, much smaller.

00:08:36   I'm pretty sure that back in those days,

00:08:38   the key caps were PBT.

00:08:40   Of course, they weren't backlit,

00:08:41   so that's the whole big thing.

00:08:43   I think all of Apple's keyboards now

00:08:44   on their laptops are backlit.

00:08:45   You can't backlight the PBT

00:08:47   'cause it doesn't let light through it.

00:08:49   That's why they were talking about having ABS mixed in

00:08:51   because you could let the light go through the ABS part,

00:08:53   like just the letter that's on them or whatever.

00:08:55   But anyway, Apple's key caps on all their modern keyboards

00:08:58   all appear to be the very much softer ABS.

00:09:03   - Anonymous writes, "The backlit keyboard key caps

00:09:05   "on Apple laptops are injection molded in clear plastic.

00:09:08   "They are painted with white paint, then black paint.

00:09:11   "The glyphs are then laser etched using a method

00:09:13   "that removes the black paint, but not the white paint.

00:09:15   "Lastly, a protective clear coat is applied."

00:09:19   Andrew writes, "Your conversation about wear on keyboards

00:09:21   "reminded me of a recent article

00:09:22   "by a British cycling journalist."

00:09:24   Would that be a bikist, Marco?

00:09:25   Is that correct?

00:09:26   A bikist journalist.

00:09:27   - A bikalist.

00:09:28   - (laughs)

00:09:28   Oh my, we're gonna get so much hate mail.

00:09:30   "On the recent demise of his 10-year-old MacBook,

00:09:32   "which had survived an impressive amount of abuse."

00:09:35   "You might like to see how worn his most used keys were."

00:09:38   We will put a link in the show notes

00:09:39   and maybe Marco will make this chapter art, maybe, maybe.

00:09:43   Let me tell you, actually, you know what?

00:09:44   I would not advise you to make this chapter art

00:09:46   because this keyboard is fricking horrifying.

00:09:50   Not only is this the ridiculous Tetris key return key,

00:09:53   which I know the British people love,

00:09:55   leaving that monstrosity aside,

00:09:57   this is an abomination.

00:09:59   Look, this is disgusting, this thing.

00:10:01   I can't handle it.

00:10:02   I need to scroll down because this is--

00:10:04   - This is one of many photos that we got.

00:10:05   This is the worst one, obviously.

00:10:06   This is one of many, many, many photos that we got,

00:10:09   most of which were not 10 years old, to be clear,

00:10:11   most of which were much younger laptops.

00:10:14   And if you, you know, to visualize what this picture

00:10:16   looked like, picture an Apple keyboard

00:10:18   with all of its black key caps,

00:10:20   and then have little circles worn through them

00:10:22   at various places, seeing the mechanism beneath,

00:10:26   because as the earlier feedback item said,

00:10:29   Apple's keys are made from clear plastic,

00:10:32   and then they have black paint and white paint over them.

00:10:34   And so this, imagine the keys with the black

00:10:37   and white paint rubbed off,

00:10:39   just showing you the clear plastic.

00:10:41   It's like these little window panes of various sizes,

00:10:43   and you can see where they wore through the black first,

00:10:46   and then the white second, right?

00:10:49   And you see kind of like a terrace pattern

00:10:51   where it's like black, then there's a white ring,

00:10:53   and then there's a clear spot.

00:10:54   And you can tell where this person hit the space bar,

00:10:56   like the one very specific spot all the time.

00:11:00   Yeah, tons of people had this, and you know, again,

00:11:03   the leading theory is pH levels,

00:11:07   either more acidic or more basic, you know,

00:11:10   sweat from their fingers or whatever,

00:11:12   essentially wearing through the paint,

00:11:13   because these are painted clear plastic.

00:11:17   And if they were a PBT, where it was the color

00:11:20   of the plastic instead of being painted,

00:11:22   this wouldn't happen,

00:11:23   but then you couldn't have the backlights

00:11:24   unless you double shot them.

00:11:26   But then there's the, you know,

00:11:27   they're saying like to put the dye on the entire key,

00:11:30   something along here, like the space bar might warp.

00:11:33   So I would say, like, as we mentioned in the last show,

00:11:35   keyboard durability on laptops,

00:11:39   aesthetic durability, not like, you know,

00:11:43   making the keys continue to work,

00:11:44   still seems like an area

00:11:45   where Apple could use some improvement.

00:11:48   I think what they have now is a reasonable-ish compromise,

00:11:50   'cause they're lightweight, they're fairly sturdy,

00:11:54   they hold up pretty well most of the time,

00:11:56   but if you're one of those people who wears away key caps,

00:11:59   it's a bummer for you,

00:12:01   because you're going to have a worse experience

00:12:03   with this keyboard.

00:12:04   It'll be nice if the keys still work,

00:12:06   and I guess as long as you remember what those letters were

00:12:08   before you wore them off with your fingers, you're fine.

00:12:10   So functionality is job one,

00:12:12   but aesthetic should be somewhere in the top five,

00:12:14   and the current keyboards are failing that

00:12:17   for some subset of Apple's customers.

00:12:20   - This next section is John tries to make Casey feel bad

00:12:23   about liking 14-inch laptops, so I'm just gonna--

00:12:27   - It's not just John.

00:12:28   I mean, if only one of us would've said last episode,

00:12:32   you know, with this amount of power draw,

00:12:35   so much increase from the M1 generation,

00:12:37   maybe the 14-inch might be a bad idea

00:12:40   if you get the max chip,

00:12:42   because it seems like that's a lot of heat

00:12:44   for the 14-inch to quietly and gracefully get rid of

00:12:48   compared to the 16-inch.

00:12:49   - So John, what happened?

00:12:51   - I mean, to be clear, this was true

00:12:53   of the M2 and M1 generations as well,

00:12:55   which is why Marco gave that advice.

00:12:58   But anyway, we'll put a link in the show notes

00:13:01   to a YouTube video where they tested

00:13:02   the M3 Max 14-inch versus 16-inch.

00:13:06   And I think I said on the last show that like,

00:13:07   well, you know, maybe the cooling

00:13:09   should be roughly equivalent,

00:13:11   and the thing that is stuck on top of the M3 Max,

00:13:13   like the little whatever heat spreader thing

00:13:15   that's pulling the heat off, yeah, that part's the same.

00:13:17   Every other part of it is not the same though.

00:13:20   First of all, the heat pipe itself

00:13:21   that's leading away from the SOC,

00:13:23   that heat pipe is wider on the 16-inch.

00:13:25   So right away, there's a difference in the cooling system.

00:13:27   But the big difference is the fans in the 16-inch

00:13:30   are so much bigger than the fans in the 14-inch.

00:13:33   Like they're really using that extra space to good effect.

00:13:37   It is a dramatic difference in the size of the fans

00:13:39   and obviously the amount of air they can move.

00:13:41   Then the fans are blowing that air over,

00:13:44   you know, these little fin things

00:13:45   that are attached to the heat pipe and there's more of them.

00:13:48   Anyway, in the testing, two big things came out.

00:13:51   First, that 14-inch is going to run the fans

00:13:54   at much higher RPM.

00:13:55   So they did like a Cinebench benchmark

00:13:58   and the 14-inch was running the fans at 7200 RPM

00:14:02   and the 16-inch at 3800.

00:14:03   So it was like double the fan speed.

00:14:05   And obviously, if the fans are running faster,

00:14:07   they're gonna be noisier.

00:14:08   And since the 14-inch fans are smaller,

00:14:10   they're gonna be a little bit more annoying too

00:14:12   because they're higher pitched.

00:14:14   That's not great.

00:14:15   They measure stabilized fan speed.

00:14:17   I love when they fancy up their benchmarks here.

00:14:20   So the average speed in the Cinebench 2024 GPU test

00:14:25   was 7200 RPM for the 14-inch, 1700 RPM for the 16-inch.

00:14:30   That's a big difference.

00:14:32   That's as in like, you can hear the fans screaming

00:14:34   the entire time in the 14-inch and on 16-inch,

00:14:37   you probably wouldn't hear anything.

00:14:39   And now Casey may be saying, I don't care about that.

00:14:41   I don't run 3D benchmarks.

00:14:43   I don't care about the average fan speed

00:14:46   over a 10-minute run of some type of thing.

00:14:49   Then they did an Xcode test

00:14:50   and that's really, really hidden Casey where it hurts

00:14:53   'cause that's the thing that he does do.

00:14:55   And this is not about the fan speed

00:14:57   or about temperatures or anything like that.

00:14:59   This is the fact that all those things

00:15:01   with the fan speed being higher on the 14-inch

00:15:03   translate to the 14-inch being in more thermal distress

00:15:07   and thermal distress means thermal throttling.

00:15:10   And so this is where the rubber meets the road.

00:15:12   The 14-inch is a little bit slower than the 16-inch

00:15:17   because it thermal throttles when doing Xcode compiles.

00:15:20   Now this is probably a big Xcode compile.

00:15:22   Maybe it's bigger than any of Casey's apps.

00:15:24   Maybe it doesn't affect him or whatever.

00:15:25   And the difference isn't huge.

00:15:26   - Yeah, 'cause this is, it took like almost a minute and a half

00:15:28   to do this compile.

00:15:29   So like if you're doing a very long sustained CPU

00:15:33   or GPU drain, you're gonna hit throttling on the 14-inch.

00:15:38   No question.

00:15:39   Like that's what this is clearly showing.

00:15:40   Like if you are pegging the CPU or GPU for a minute or more,

00:15:45   you are almost certainly gonna see throttling on the 14-inch

00:15:48   that is not present on the 16-inch.

00:15:50   - Yeah, and the Xcode benchmark was 82 seconds

00:15:52   on the 14-inch and 72 on the 16-inch.

00:15:55   So not a huge difference.

00:15:55   Not like the fan speed difference.

00:15:57   I feel like the big quality of life difference

00:15:58   is the fan speeds.

00:15:59   - I mean, that's like an M2 to M3 difference

00:16:01   in performance though.

00:16:02   But again, this is a long sustained one.

00:16:05   Like if my build takes 11 seconds,

00:16:08   it's probably in that span,

00:16:10   I probably wouldn't see the difference

00:16:11   between a 14 and a 16.

00:16:13   But if you're doing builds that take a minute,

00:16:14   you probably will.
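[Editor's note: to put the figures quoted above in proportion, here's a quick back-of-the-envelope sketch. The 82 s / 72 s compile times and the 7200 / 1700 RPM fan speeds are the video's numbers as relayed in the discussion; the `pct_slower` helper is just illustrative arithmetic, not anything from the video.]

```python
# Rough arithmetic on the benchmark numbers quoted above (all figures
# are approximate, taken from the linked video as relayed in the show).

def pct_slower(slower_s: float, faster_s: float) -> float:
    """How much slower the first run is than the second, in percent."""
    return (slower_s / faster_s - 1.0) * 100.0

# Xcode compile: 82 s on the 14-inch vs 72 s on the 16-inch.
xcode_gap = pct_slower(82, 72)

# Cinebench 2024 GPU test: 7200 RPM (14-inch) vs 1700 RPM (16-inch).
fan_ratio = 7200 / 1700

print(f"14-inch compile is ~{xcode_gap:.1f}% slower")   # ~13.9%
print(f"14-inch fans spin ~{fan_ratio:.1f}x faster")    # ~4.2x
```

So the throttling cost on a sustained compile is real but modest (low double digits), while the fan-speed gap, the quality-of-life difference John highlights, is several times larger.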

00:16:15   - Yeah, so I mean, as is the case with all these laptops,

00:16:18   right, it's just a question of how much the cooling system

00:16:23   can fight off the inevitable heat saturation

00:16:27   due to long-running jobs.

00:16:28   It's rare that a laptop, a very high-powered,

00:16:31   top-of-the-line, stuffed-to-the-gills laptop,

00:16:34   can sustain its full power for a very long period of time

00:16:36   because the cooling systems usually aren't adequate

00:16:39   to get all that heat out of there to keep up with it.

00:16:41   Especially if you have some contrived,

00:16:43   like CPU maxed out at the same time as GPU maxed out,

00:16:47   the same time as a neural engine maxed out,

00:16:48   the same time as a video encoder maxed out.

00:16:50   Like they're not gonna be able to deal with that.

00:16:51   But almost no real job is like that.

00:16:53   That's why the GPU benchmarks are so brutal here

00:16:55   because this was just a benchmark of the GPU

00:16:58   and the fan speeds were 7,000 RPM versus 1,700.

00:17:02   That's not even close.

00:17:03   That's just, they don't,

00:17:05   even just stressing the GPU is enough

00:17:08   to really, really see a difference.

00:17:09   And that's relevant to people playing games.

00:17:11   If you're gonna play games on your laptop,

00:17:13   you're like, I want one of these M series laptops,

00:17:15   so I don't need to hear the fans screaming when I play games.

00:17:19   Get a 16-inch.

00:17:20   You have a fighting chance of making that real

00:17:22   because if you wanna find something

00:17:25   that's gonna really stress the GPU,

00:17:26   yeah, playing a game for a long period of time will do it.

00:17:29   I don't think this means that, Casey,

00:17:31   you should get a 16-inch.

00:17:33   I think the 16-inch is too big as well.

00:17:34   I think the whole point of being portable

00:17:35   is to have something small.

00:17:37   But if you are concerned about sustained performance,

00:17:39   A, don't get a laptop, but B.

00:17:42   - Hey, the 16-inch sustains it just fine.

00:17:44   - Well, no, but the 16-inch eventually heat soaks as well,

00:17:46   especially if you do a CPU plus GPU combo.

00:17:48   These benchmarks are a little bit shorter.

00:17:51   Yeah, and also the other thing they were measuring,

00:17:53   which isn't in any of these graphs in the shots here,

00:17:55   is clock speed.

00:17:55   Like they're just seeing,

00:17:57   are they hitting their rated top clock speed?

00:17:59   And both of these laptops occasionally,

00:18:02   depending on the benchmark,

00:18:03   would never actually achieve their rated top clock speed

00:18:07   or would do it very briefly in the beginning of the test

00:18:09   and then never see it again, right?

00:18:11   So there's always room for improvement, right?

00:18:13   16-inch versus 14,

00:18:14   doesn't mean the 16-inch never thermal throttles,

00:18:16   it just means it does it less than the 14,

00:18:18   which is why it's coming out ahead in these tests.

00:18:20   - I would say these tests prove my hunch last week

00:18:25   that if you're going for the Max chip,

00:18:28   maybe don't get the 14-inch.

00:18:30   Like if you're getting the 14-inch

00:18:32   for like an all-around balance of portability and size,

00:18:35   maybe you're going for the M3 Pro chip,

00:18:38   that's a different story.

00:18:39   It can probably handle that a lot better.

00:18:40   But if you're getting the M3 Max

00:18:43   and you intend to push it at all,

00:18:45   which you probably are if you're getting the Max,

00:18:48   you should be aware of this.

00:18:49   Like this might not change your mind,

00:18:51   but this is something that you should really be aware of,

00:18:52   that the M3 Max seems to be substantially crushing

00:18:57   the 14-inch's thermal system more than the M1 generation did.

00:19:01   The M2 was probably somewhere in the middle

00:19:02   'cause its power was probably, it was between the two.

00:19:06   But they've ramped up the peak power draw of these chips

00:19:10   compared to previous generations.

00:19:12   The M1 draws less power than the M2.

00:19:15   The M2 draws less power than the M3 Max in all those cases.

00:19:18   And so be aware of that,

00:19:20   that they have raised the power envelope

00:19:23   without increasing the thermal dissipation capacity

00:19:27   significantly.

00:19:28   So if you're coming from a 14-inch M1 Max

00:19:31   and you're thinking, well, it handled my M1 Max

00:19:33   just fine in the 14,

00:19:35   the same can't necessarily apply to the M3 Max.

00:19:38   Like you have to reevaluate

00:19:39   because they're not drop-in replacements

00:19:42   in terms of power and thermals.

00:19:43   The M3 Max uses substantially more power

00:19:46   and makes substantially more heat under load

00:19:49   than the M1 Max did.

00:19:50   So that might change your calculus

00:19:53   of whether you want a 14 or a 16.

00:19:56   - I think the cooling system is upgraded a tiny bit

00:19:59   in the M3 generation versus the M2.

00:20:01   I think the heat spreader thingy on the SOC

00:20:03   is a little bit larger in this generation.

00:20:04   And I'm not sure about the fans

00:20:06   if they are exactly the same,

00:20:08   but I don't think it's entirely true

00:20:12   that they just use exactly the same cooling system.

00:20:14   Otherwise it would never be able to do it

00:20:15   because of the power difference in the 16-inch,

00:20:17   comparing the M2 Max 16-inch versus the M3 Max 16-inch,

00:20:21   it's a pretty big difference.

00:20:22   The M3 Max is pushing like 50 watts

00:20:24   and the M2 Max was only ever getting like 36 or so.

00:20:27   So there's a lot of extra heat.

00:20:29   - Yeah, and the M1 generation I think was in the 20s.

00:20:31   It's substantially more power and heat here.
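[Editor's note: a small sketch of the generation-over-generation power figures discussed above. The ~50 W (M3 Max) and ~36 W (M2 Max) numbers come straight from the conversation; "in the 20s" for the M1 generation is taken here as 27 W purely as an illustrative assumption, not a measured figure.]

```python
# Approximate peak package power for the Max chips, per the discussion.
# The M1 Max value is an assumed illustrative stand-in for "in the 20s".
peak_watts = {"M1 Max": 27, "M2 Max": 36, "M3 Max": 50}

chips = list(peak_watts)
for older, newer in zip(chips, chips[1:]):
    bump = peak_watts[newer] / peak_watts[older] - 1.0
    print(f"{older} -> {newer}: about +{bump:.0%} peak power")

# Each generation's power envelope grows, while (per the discussion)
# the laptops' thermal capacity has changed comparatively little.
```

Under these assumed numbers, each Max generation draws roughly a third more peak power than the last, which is why M1-era conclusions about the 14-inch chassis don't transfer to the M3 Max.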

00:20:34   And that also results in a larger difference

00:20:38   between the two models in battery life

00:20:40   than there used to be.

00:20:41   There was always a decent difference

00:20:42   'cause the 16-inch battery is just much bigger.

00:20:44   And yeah, the screen uses more power

00:20:46   than the 14 'cause it's bigger,

00:20:47   but not that much more power.

00:20:49   So the 16 is still the battery monster.

00:20:51   But in this case, keep in mind,

00:20:54   again, the CPU is drawing more power

00:20:57   than the previous generations

00:20:58   at the same marketing name level.

00:21:00   And also, I don't know how much of a factor this is,

00:21:04   but if those fans are spinning faster,

00:21:07   fans also use power.

00:21:08   So the result after these tests,

00:21:11   the result in this video,

00:21:13   the 14-inch battery was substantially more drained

00:21:17   than the 16-inch.

00:21:17   It was not a small difference.

00:21:19   So again, just don't make assumptions

00:21:22   based on whatever worked for you in the M1 generation.

00:21:25   Don't assume the same thing will work for you

00:21:27   in the M3 generation

00:21:28   because the power levels are very different.

00:21:31   - I understand everything that you guys are saying,

00:21:33   but I don't feel like I'm bumping

00:21:37   into thermal throttling issues anytime

00:21:39   other than when I'm doing transcodes with FFmpeg,

00:21:42   in which case, okay, yeah, I'm reaping what I sowed

00:21:45   or whatever the turn of phrase is.

00:21:46   You know what I'm trying to say.

00:21:48   Okay, that's fine.

00:21:49   The other thing is I didn't want to go down

00:21:52   from 64 gigs of memory.

00:21:53   I probably could have.

00:21:54   And been fine, but I didn't want to.

00:21:57   And in order to do that in this generation,

00:21:59   you have to have an M3 Max.

00:22:00   Like that's the end of the meeting.

00:22:03   So in fact, you not only need an M3 Max,

00:22:05   you need the baller M3 Max.

00:22:05   That's the only, I'm looking at the configurator right now.

00:22:07   It's the only option in the 14.

00:22:09   So if I want 64 gigs, well, guess what?

00:22:13   I'm getting an M3 Max.

00:22:13   In fact, the Pro appears to top out at 36 gigs,

00:22:16   which even for me is probably fine,

00:22:19   but I didn't want that.

00:22:21   I wanted 64.

00:22:22   And honestly, I have zero regrets about this machine.

00:22:25   I've been using it at home.

00:22:26   I've been using it away from home.

00:22:28   I freaking love this computer.

00:22:29   And I mean, I freaking love my M1 max as well,

00:22:31   don't get me wrong, but this computer is great.

00:22:33   I haven't heard the fans except one time

00:22:35   when I was doing a transcode.

00:22:36   I have no reason to believe battery life is suffering

00:22:39   compared to the last one.

00:22:40   It's been flawless so far.

00:22:43   I don't argue any of what you guys are saying.

00:22:45   I'm not trying to say that you're wrong or lying

00:22:46   or anything like that.

00:22:47   It's just in real world use,

00:22:49   I am not personally noticing,

00:22:52   not to say it's not affecting me,

00:22:53   but I'm not noticing the results

00:22:55   of any of these problems in the 14.

00:22:57   And for me, and I totally understand, Marco,

00:23:00   why you reached a different conclusion,

00:23:01   but for me, I don't want a freaking aircraft carrier

00:23:04   in my backpack.

00:23:05   And I travel with this machine often enough

00:23:07   that that matters to me.

00:23:09   And so I made the choice I made,

00:23:11   and I think it's the right one for me.

00:23:12   You made the choice you made.

00:23:13   I think it's the right one for you.

00:23:14   And everyone's happy.

00:23:16   Why won't you just let me be happy?

00:23:17   - No, and look, I think it's important to point out

00:23:21   if the 14 inch still worked for you, that's great,

00:23:24   but people who are buying it should know this

00:23:26   ahead of time, that it's going to have these trade-offs,

00:23:29   that the M1 generation of it had fewer trade-offs

00:23:32   compared to the M3 generation.

00:23:34   And if what you are seeking

00:23:36   is the highest performing laptop,

00:23:40   in the M1 generation, you could say,

00:23:41   well, these are basically the same.

00:23:43   Now you can't really say that.

00:23:45   Now the 16 inch is noticeably higher performing

00:23:48   with a lot of workloads.

00:23:49   So it is worth knowing.

00:23:50   Also, again, if stuff like fan noise

00:23:52   is really critical to you, then again,

00:23:55   that's something that's worth knowing ahead of time.

00:23:57   So that's why.

00:23:58   And for many people, they're gonna still get

00:24:00   the 14 inch like you and be totally happy with it,

00:24:02   and that's great.

00:24:03   But again, you gotta know going into it,

00:24:06   the assumptions that you might have made

00:24:07   from previous generations of the M series laptops

00:24:10   are a little bit different now.

00:24:12   - I do wonder if there's not a low tech solution to this,

00:24:15   like that little RGB fan cooler thing

00:24:18   on the back of your phone to get sustained frame rates.

00:24:21   - Just get some giant plate.

00:24:22   - Yeah, like I know they make laptop cooler type things,

00:24:25   but I do wonder.

00:24:26   - They're really quiet and elegant.

00:24:28   - Yeah, right.

00:24:29   But I do wonder if you say you get a 14 inch

00:24:31   and you wanna use it in desktop mode

00:24:32   and you don't want it to thermal throttle,

00:24:33   just how much extra cooling would it take?

00:24:35   Probably not that much.

00:24:36   Like, and probably could be applied externally.

00:24:38   I don't, you know, again, the caveats about condensation

00:24:40   and don't destroy your computer and blah, blah, blah,

00:24:42   whatever, but like based on how well it works

00:24:44   on the little phone thing to sort of sustain frame rates

00:24:47   when the phone could not do it without the cooler,

00:24:49   I do wonder if just a little bit of extra help

00:24:51   on a 14 inch would help it not thermal throttle at all,

00:24:54   because it's, you know, it's,

00:24:55   the fans are almost doing the job

00:24:57   'cause yeah, the fan speed difference is huge.

00:24:59   Those fans are working overtime, right?

00:25:00   But the benchmark difference is not that big.

00:25:03   So it's like the 14 inch's cooling system is overmatched,

00:25:07   but just by a little bit.

00:25:08   So I wonder if you just gave it a little bit of help,

00:25:10   which is unlike lots of other computer things.

00:25:11   Like, oh, I didn't get enough cores.

00:25:13   Well, there's not really anything you can do

00:25:14   to add more cores to your computer,

00:25:15   but if thermal throttling is your problem,

00:25:18   pointing a fan at the back of your laptop

00:25:20   when it's like sitting upright, you know,

00:25:22   like in a bookshelf type thing might be enough.
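As an aside, anyone trying to tell whether their own machine is actually throttling before rigging up extra cooling can check with tools built into macOS (a sketch; output format varies by model, and powermetrics needs root):

```shell
# Two built-in macOS ways to spot thermal throttling under load:
pmset -g thermlog                          # logs CPU speed-limit changes as thermals shift
sudo powermetrics --samplers thermal -n 1  # one-shot thermal pressure report
```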

00:25:24   - One potential limitation there is,

00:25:27   remember we learned this from people who wrote in

00:25:29   like a couple of years ago, this came up

00:25:32   where we were talking about like the MacBook Air

00:25:33   thermal design without any fan,

00:25:35   that like there are certain standards

00:25:37   that different, you know, governments and things set

00:25:40   about how hot the outside of a laptop can get

00:25:44   for safety reasons.

00:25:45   And so the way these are designed,

00:25:48   usually the processor is not like heat pipe

00:25:53   directly to the exterior case.

00:25:55   Like there's usually some kind of air gap there

00:25:57   so that the processor is intentionally not

00:26:01   transferring its heat much to the metal case.

00:26:05   And because of that, it actually makes it difficult

00:26:08   to make external coolers that are very effective

00:26:11   because you're not getting all that heat away

00:26:14   from the chip through the case very much.

00:26:16   It's mostly dumping it into the air.

00:26:18   So it's a little bit harder.

00:26:20   So you couldn't, for instance, just stick like

00:26:22   a Peltier element cooler on the back of a laptop

00:26:24   and have it be--

00:26:25   - Oh, sure you could.

00:26:26   You just need a temperature differential.

00:26:29   And one exists.

00:26:29   I mean, feel the bottom of a laptop

00:26:31   when it's hard working.

00:26:32   I know what you're saying, like not all the heat

00:26:33   is being dumped out there, but there's still

00:26:34   a temperature differential.

00:26:35   Like that's a test that someone should try.

00:26:37   Like just consider that.

00:26:38   I guess the reverse of it is like,

00:26:41   if you're doing some sustained workload with your laptop,

00:26:43   don't throw it on your bed on top of your comforter.

00:26:45   Like don't, you know what I mean?

00:26:47   Like, you see how kids treat cheap laptops,

00:26:49   they have no idea that these are like

00:26:50   living, breathing things, right?

00:26:51   It's a little empathy for the machine here.

00:26:53   Don't take your laptop and just put it on a big,

00:26:55   soft pillow surrounded by blankets.

00:26:57   Like it needs, you need to get it with the lid closed,

00:27:00   of course, right?

00:27:01   - Naturally.

00:27:02   - Give it a fighting chance.

00:27:04   - Yeah, and just as a final note,

00:27:06   'cause I'd like to move on,

00:27:07   I feel like there's a lot of FUD going around right now.

00:27:11   These, I really don't subscribe,

00:27:15   I don't have any anecdotal data

00:27:17   that indicates that this problem

00:27:18   is any worse with this generation

00:27:20   than it was with my two year old laptop.

00:27:21   Like yes, again, I'm not trying to argue

00:27:23   that the facts that you're presenting to me are wrong.

00:27:26   All I'm trying to say is that in day to day use,

00:27:29   I straight up do not hear the fans.

00:27:31   So yes, in the occasion that I can hear the fans,

00:27:34   yes, they probably are faster,

00:27:36   they probably are more annoying, they're probably louder.

00:27:39   I don't debate that,

00:27:40   but I feel like we're making a problem out of nothing here.

00:27:43   This machine is phenomenal.

00:27:45   If you want a desktop replacement

00:27:48   that you occasionally move, 100% get the 16, no argument.

00:27:51   But if you want a computer that you move occasionally,

00:27:53   and you favor that, you know,

00:27:56   the ease of moving it over anything else,

00:27:58   which is where I am, get the 14.

00:28:00   Don't let these two knuckleheads

00:28:02   try to convince you otherwise.

00:28:03   It's a perfectly good computer.

00:28:05   There's no need to, there's no need to--

00:28:06   - I'm not trying to convince, I also agree.

00:28:07   I think the 16 is just too big for a laptop,

00:28:09   and really you should just get a desktop anyway.

00:28:11   - Oh my God, it's so good, it's so good.

00:28:13   The 16 is so good.

00:28:15   There's basically no trade off, except it is large.

00:28:20   But again, when you look at, I mean, we are so spoiled now.

00:28:24   The 16 inch MacBook Pro is lighter

00:28:28   than most 15 inchers for most of our computing-using careers.

00:28:34   And it is 16 only because they made the bezel smaller.

00:28:38   So the footprint is roughly 15 inch class

00:28:41   that we've had forever.

00:28:42   The weight is similar or lighter

00:28:45   than most 15 inch class laptops we've ever used.

00:28:48   It is glorious.

00:28:49   I'm not gonna argue that the 14 inch

00:28:51   isn't noticeably smaller and lighter, it is.

00:28:54   It is noticeably smaller and lighter.

00:28:56   I would maybe argue that if that's really your top priority,

00:29:01   maybe consider the Air.

00:29:03   But the 16 inch, again, it's all relative.

00:29:07   Yes, it is larger and heavier than yours,

00:29:10   but compared to anything ever in the history of computing,

00:29:14   these are all very thin and light, and it's fine.

00:29:17   And even like, again, when you're looking at,

00:29:19   suppose you're putting it in a backpack

00:29:21   and bringing it to wherever.

00:29:23   The weight of your backpack empty is probably a few pounds.

00:29:28   Like, it depends on what kind of bag you have.

00:29:29   The weight of your total carry

00:29:32   of what most people bring around in their backpacks every day

00:29:35   may be 10 or 15 pounds.

00:29:37   And so the difference in laptop weight

00:29:39   of like a one pound difference between two things--

00:29:42   - That's fair.

00:29:43   - Is not really that significant.

00:29:44   It feels significant when it's in your hand,

00:29:47   but like however you are carrying it,

00:29:49   generally speaking for most people,

00:29:51   the difference between a 16 inch MacBook Pro

00:29:53   and a 14 inch MacBook Pro, being like a pound or whatever,

00:29:56   is not a massive difference

00:29:58   in whatever they're carrying it in total.

00:30:01   So again, if it matters to you

00:30:03   and if you really still wanna minimize it, great.

00:30:05   I get that, please do.

00:30:07   But these are trade-offs that you should be aware of.

00:30:10   Despite what you said, Casey,

00:30:12   they weren't as severe of trade-offs in the M1 generation

00:30:15   because the total power usage of the M1 chips was way lower.

00:30:20   Like, rather the peak power usage, like under load.

00:30:23   The peak power usage was way lower in the M1 generation

00:30:25   than it is in the M3 generation.

00:30:26   So that is a difference that is worth noting.

00:30:29   And before, again, with the M1 generation,

00:30:34   you could say these computers

00:30:35   are basically the exact same computer,

00:30:37   just bigger and smaller.

00:30:38   And now that's a little harder to say.

00:30:40   There's a much bigger asterisk on that now.

00:30:42   - So let's talk about chip packaging.

00:30:44   Johnny Srouji had an interview in July

00:30:47   with his alma mater, which is a college whose

00:30:50   name is escaping me; it was somewhere in Israel.

00:30:53   And he went back there and did an interview

00:30:55   with somebody there and talked a bit about chip packaging.

00:30:59   John, do you wanna either intro this or read it to me

00:31:02   or tell me how you wanna handle this?

00:31:04   - Yeah, so this is the interview

00:31:05   that I referenced on a past show

00:31:07   where I figured we're talking about,

00:31:09   we're talking about three-nanometer processes

00:31:10   or stuff like that.

00:31:11   But anyway, I've said that Apple executives

00:31:13   won't reveal anything about future products, obviously,

00:31:15   but anytime one of them talks,

00:31:16   they'll usually say something that lets you know

00:31:19   some vague future direction.

00:31:21   And in the interview, Johnny Srouji said

00:31:24   that he thought that packaging was a potential area

00:31:27   of innovation, and I wanted to follow up on that today

00:31:30   to first provide the excerpt from the interview

00:31:34   because the whole "blah, blah, blah is interesting" thing,

00:31:36   I think that was like Tim Cook saying

00:31:37   the wrist is interesting or some BS like that.

00:31:39   And whatever the quote from Jobs

00:31:42   when he was talking about cell phones was similar,

00:31:44   that's how we all knew Apple was making a phone.

00:31:46   So I just wanted to get the actual words

00:31:48   that he said about packaging

00:31:50   because there was also recently an interview

00:31:54   of Intel CEO, Pat Gelsinger

00:31:57   with Ben Thompson of Stratechery,

00:32:00   or Strateecery, however you wanna pronounce it,

00:32:02   where the Intel CEO talked about, you guessed it,

00:32:07   packaging and how important it is to Intel

00:32:10   and for the future.

00:32:11   And so I feel like this is a confluence of events

00:32:13   that's worth digging into deeper,

00:32:14   even if this, as we said in the past episode,

00:32:17   of this generation, it just looks like the M3

00:32:18   is gonna be, M3 Ultra is just gonna be two M3 Maxes

00:32:22   stuck together just like it was before,

00:32:23   which is fine and great and it'll be awesome,

00:32:25   but keep your eye on the future for packaging stuff.

00:32:27   So yeah, Casey, you just wanna go through

00:32:29   the interview segment thing.

00:32:30   - Sure, so how about I'll play the role of the interviewer

00:32:35   and you can play the role of Johnny Srouji.

00:32:35   - Sure, give me the longer one.

00:32:36   - Yeah, well, all right, fine, you know what?

00:32:37   I'll take one. - I don't know if we can

00:32:38   play-act it.

00:32:39   If we get Marco to be mixed in, he could be the announcer.

00:32:42   - Oh gosh, all right, so you be the interviewer,

00:32:44   I will be Johnny.

00:32:45   Let's do it that way. - Well, you don't need

00:32:46   two people to read them.

00:32:47   We don't need to, you can just be one person.

00:32:48   - Play with me in the space, John, come on.

00:32:50   Are you really just gonna leave me hanging like this?

00:32:53   - Jesus, wow. - I forgot, is it my line?

00:32:55   - Yes, it's you. - Line?

00:32:56   - Line. - We're actually doing this?

00:32:59   - Yes, yes, you are the interviewer, John.

00:33:01   - All right, this is the interviewer.

00:33:03   What are the next challenges in the design

00:33:05   and architecture of processors that Apple should tackle

00:33:07   to get to the real next generation of processing?

00:33:09   This is where the interviewer starts asking questions

00:33:11   like they shouldn't ask or they know

00:33:12   that I'm gonna get the answer.

00:33:13   What's the next challenge?

00:33:16   What should they tackle to get to the real next generation?

00:33:18   I'm not asking a question about the future,

00:33:20   but I totally am.

00:33:21   Anyway, here's what Johnny Srouji said.

00:33:23   - It is getting more and more challenging.

00:33:24   Those of you who follow CMOS technology,

00:33:26   whether it's five nanometer or beyond,

00:33:28   that's getting harder and harder, which I think is great,

00:33:31   by the way, because if it gets harder,

00:33:33   that means Apple will try better and better, so that's good.

00:33:37   I'm not going to describe our future roadmap,

00:33:40   but there are many challenges.

00:33:41   For example, when you take CMOS technology,

00:33:43   I think one of the things that is going to be important

00:33:44   is packaging without getting into details.

00:33:47   So the way you package the chip is going to be important,

00:33:50   or maybe you architect the chip in a different way.

00:33:52   - And I know you think like, that's saying nothing.

00:33:55   What are you talking about?

00:33:55   This was the quote?

00:33:56   Yeah, they don't say a lot,

00:33:58   but the fact that they say anything about anything,

00:34:01   oh, packaging, just that word, packaging,

00:34:04   what does that mean?

00:34:05   I mean, again, Apple already does interesting things

00:34:07   with packaging.

00:34:08   The silicon interposer is an interesting thing

00:34:09   done with packaging, and we've talked a lot

00:34:11   about packaging technology in the past,

00:34:12   but this is the tiny tidbit they left.

00:34:15   Maybe you architect your chip a different way.

00:34:17   Packaging is interesting.

00:34:19   I do like this typical Apple bravado,

00:34:23   where they said, oh, the world of silicon,

00:34:27   it's getting more and more difficult,

00:34:28   and that's great for Apple,

00:34:29   because when things get hard, we will excel.

00:34:32   That'll make us try harder, 'cause we're the best,

00:34:34   and basically bring it on, because the harder it gets,

00:34:36   the more you'll see that Apple is the best,

00:34:38   which is great for him to say.

00:34:40   We'll see if it actually turns out to be true

00:34:42   given how the cell modem's going, Johnny.

00:34:44   I thought that was some top-tier Apple executive

00:34:51   bragging in this interview, but anyway,

00:34:52   that's what he said about packaging.

00:34:53   He just dropped the word packaging,

00:34:55   so now here is the Intel CEO, Pat Gelsinger,

00:34:58   also talking about packaging.

00:34:59   - So I will be playing the role of Pat Gelsinger

00:35:01   in this one, since that's the longer one.

00:35:04   Gelsinger says, this idea of chiplets, I think,

00:35:07   is the new way that all chips get designed,

00:35:09   the idea of advanced packaging, multiple chips

00:35:11   into the advanced package, and whether that's an MCP,

00:35:14   or multi-chip packaging, whether that's a two-and-a-half

00:35:16   or 3D construction, I do think that becomes the standard.

00:35:19   - And this is Ben Thompson.

00:35:20   The interviewer was saying,

00:35:21   why is advanced packaging the future?

00:35:23   You can see, you can ask Intel questions about the future.

00:35:26   I know this has been a big focus for Intel.

00:35:28   It's something that you wanna talk about,

00:35:29   and from everything I know,

00:35:30   your technology is leading the way.

00:35:31   Why is that so important,

00:35:32   in addition to the traditional Moore's Law?

00:35:34   Why do we need to start stacking these chiplets?

00:35:37   Give me the top reasons.

00:35:38   So this, Ben is asking a good question here,

00:35:40   which is like, of all the things, the problems Intel has,

00:35:43   and you can read the full interview,

00:35:44   he's interviewed the new Intel CEO a few times,

00:35:48   Intel's got some problems.

00:35:49   They fell behind, they're getting beat

00:35:51   in the fab business by TSMC.

00:35:52   Why in the world is Intel caring about packaging?

00:35:55   And more importantly, why is Intel putting tons of money

00:35:58   into innovations in packaging?

00:36:00   And so he's asking like, what's the deal?

00:36:02   Give me the top reasons for the whole focus on packaging.

00:36:06   - One is, now you're able to take

00:36:07   the performance-sensitive transistors,

00:36:09   and move them to the leading edge node,

00:36:11   but leverage some other technologies for other things.

00:36:14   So you get to mix and match technologies more efficiently,

00:36:16   effectively this way.

00:36:18   Jon, would you like to jump in,

00:36:19   and explain what I, Pat Gelsinger, just said?

00:36:21   - Yeah, so this is part of Intel's problem,

00:36:23   is they, you know, their fabs fell behind.

00:36:27   They couldn't get off of whatever,

00:36:28   they were stuck on 14 nanometer for ages, right?

00:36:31   And TSMC went ahead of them,

00:36:33   and has been ahead since then.

00:36:35   And so they're like, look, we know at Intel

00:36:38   we have some fab problems.

00:36:39   In fact, Intel is paying TSMC to fab a bunch of its stuff.

00:36:43   They're saying, one of the things that we can do

00:36:45   by breaking the chips up into smaller individual pieces,

00:36:48   is get the most important parts to be fabbed by TSMC,

00:36:53   not by us, on like the good process node,

00:36:57   and then use other lesser fabs, like Intel's fabs,

00:37:01   to fab the parts of the chip where it doesn't matter as much.

00:37:04   So if you're Intel, you don't wanna pay TSMC to,

00:37:07   and Intel, you can read all about it in Ben's coverage,

00:37:09   but Intel is trying to split where they're like,

00:37:11   we're gonna have a fabbing part of the company,

00:37:13   and then we're gonna have the chip design part

00:37:14   of the company, and we're gonna pretend

00:37:15   like they're separate.

00:37:16   So the fabbing part is way behind and needs to catch up,

00:37:19   but the part that owns like x86 and makes the chips,

00:37:21   we think that's good.

00:37:22   So, you know, they're using TSMC,

00:37:24   but they don't wanna have to use TSMC for everything.

00:37:26   That would get very expensive.

00:37:27   Intel's big advantage is they fab their own chips,

00:37:30   and they design their own chips,

00:37:31   their vertical integration, everything.

00:37:33   So like, this is why, one reason why packaging is important,

00:37:37   Intel can pay the minimum amount required

00:37:39   to the company with the good fabs

00:37:41   to make the part of the chip where it's most important

00:37:44   for it to be on the top process level,

00:37:47   and then they can use cheaper stuff for the other parts.

00:37:50   - Second, we can actually compose the chiplets

00:37:52   to more appropriate die sizes

00:37:54   to maximize defect density as well.

00:37:57   If you have a monster server die,

00:37:58   you're going to be dictated to be N minus two, N minus three,

00:38:02   just because of the monster die size.

00:38:04   I get to carve up that server chip.

00:38:05   I get to move the advanced nodes for computing more rapidly

00:38:09   and not be subject to some of the issues,

00:38:11   defect density early in the life

00:38:12   of a new process technology.

00:38:15   - So this is about Intel, setting aside SoCs,

00:38:18   where it's a whole bunch of stuff on a big giant,

00:38:20   it's a system on a chip.

00:38:21   They're giant server chips,

00:38:23   like the huge, like, 56-core Xeon things,

00:38:25   they're just a single, gigantic chip,

00:38:28   and you have to fab that in a single,

00:38:30   it's fabbed in a single die.

00:38:31   And if some part of that screws up,

00:38:33   like, the more area you start covering on the silicon wafer,

00:38:35   the more stuff you put there, the more chances

00:38:37   there's going to be some error,

00:38:38   and you might have to throw out the whole chip.

00:38:40   And so this is another advantage to packaging,

00:38:43   is like, if you just make it out of smaller parts,

00:38:46   A, if it's bad, you're only throwing away

00:38:48   like a smaller area of your wafer,

00:38:50   you don't have to throw away this huge square,

00:38:52   you throw out the little square,

00:38:52   and B, your defect density can be lower.

00:38:55   He says maximizing defect density,

00:38:56   but whatever, he meant minimizing.

00:38:58   That you'll get more of those littler things that you fab

00:39:02   will come off the line and be completely error-free,

00:39:04   whereas trying to just get a single error-free

00:39:06   humongous Xeon is difficult.
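John's point about die size and defects can be made concrete with a toy Poisson yield model (a sketch for illustration; the defect density and die areas are made-up numbers, not anything Intel or TSMC has published):

```python
import math

def die_yield(defects_per_cm2: float, area_cm2: float) -> float:
    """Simple Poisson yield model: probability a die comes out with zero defects."""
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.2  # hypothetical defect density (defects/cm^2) on a young process node

# One monolithic 6 cm^2 server die vs. four 1.5 cm^2 chiplets.
mono_yield = die_yield(D, 6.0)
chiplet_yield = die_yield(D, 1.5)

# Wafer area scrapped per good system: a bad monolithic die costs you
# the whole 6 cm^2, while a bad chiplet only costs you 1.5 cm^2.
waste_mono = 6.0 / mono_yield - 6.0
waste_chiplet = 4 * (1.5 / chiplet_yield - 1.5)
print(f"monolithic waste: {waste_mono:.1f} cm^2, chiplet waste: {waste_chiplet:.1f} cm^2")
```

Under these made-up numbers the monolithic design scraps roughly 14 cm^2 of wafer per good system versus about 2 cm^2 for chiplets, which is the "throw out the little square, not the huge square" point in numeric form.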

00:39:09   - SRAMs in particular, SRAM scaling will become

00:39:11   a bigger and bigger issue going forward.

00:39:13   So I actually don't get benefit by moving a lot of my cache

00:39:16   to the next generation node,

00:39:18   like I do for logic, power, and performance.

00:39:20   I actually want to have a 3D construct

00:39:21   where I have lots of cache and a base die,

00:39:24   and put the advanced computing on top of it

00:39:25   into a 3D sandwich.

00:39:27   And now you get the best of a cache architecture

00:39:29   and the best of the next generation of Moore's law.

00:39:31   So it actually creates a much more effective

00:39:33   architectural model in the future.

00:39:35   Additionally, generally, you're struggling

00:39:36   with the power, performance, and speed of light

00:39:38   between chips.

00:39:40   - Yes, this whole technique,

00:39:41   this is a particular technology that Intel has advanced,

00:39:43   which lets them take power and feed it

00:39:46   through the layers of the sandwich

00:39:48   in a more efficient way to get to precisely

00:39:50   where they want it.

00:39:51   I think that's what they're referring to here.

00:39:52   The sandwiching isn't just because of that,

00:39:54   but Intel has an innovative way to send power

00:39:57   up to the bottom of the chip

00:39:58   instead of sending it through the top,

00:39:59   because when you send it through the top,

00:40:00   you gotta weave it around a bunch of obstacles

00:40:02   whereas when you send it through the bottom,

00:40:03   There's nothing in the way

00:40:04   because you make these layered sandwiches or whatever.

00:40:06   That's what that's talking about.

00:40:07   And SRAM is like, look,

00:40:08   the SRAM is what we were talking about before

00:40:10   with the GPU RAM that was being shared better

00:40:14   with dynamic caching.

00:40:15   That is, it's faster than the RAM that's out in your RAM chips,

00:40:20   like your main memory,

00:40:21   but it's not the same as registers.

00:40:22   It's kind of in between there.

00:40:23   It's like very expensive RAM

00:40:25   where each little bit of RAM

00:40:27   takes a large number of transistors,

00:40:28   but it's much faster than regular RAM and it's in the CPU.

00:40:32   And he's saying, like, you don't get any benefit

00:40:33   from doing that on three nanometers.

00:40:34   So not only do you not want to waste money doing it

00:40:36   because it would be more expensive,

00:40:37   you don't get benefit from doing it anyway.

00:40:39   So keep that on a different lower process

00:40:41   and then sandwich it all together.

00:40:43   - Any other questions for me, Ben?

00:40:45   - Yeah, I'm Ben again.

00:40:45   All right, so how do you solve that with chiplets

00:40:48   when they're no longer on the same die?

00:40:50   - In the chiplet construct,

00:40:51   we're going to be able to put tens of thousands

00:40:53   of bond connections between different chiplets

00:40:55   inside of an advanced package.

00:40:57   So you're going to be able to have very high bandwidth,

00:40:58   low latency, low power consumption interfaces

00:41:00   between chiplets.

00:41:01   It also becomes very economical for design cycles.

00:41:04   Hey, I can design a chiplet with this IO

00:41:06   and use it for multiple generations.

00:41:08   - And you can see how all the stuff that I just described

00:41:11   would be appealing to Apple.

00:41:13   Apple's already making SoCs,

00:41:15   which are essentially single-die systems on a chip

00:41:19   that have a bunch of stuff in them.

00:41:20   And they're getting pretty big.

00:41:21   That's part of the reason the Ultras have been two Maxes

00:41:25   woven together with the silicon interposer,

00:41:27   rather than just being a single large chip

00:41:29   the size of an ultra,

00:41:31   because that would be even more expensive to manufacture.

00:41:33   It's also probably part of the reason

00:41:35   we didn't have the quad, right?

00:41:36   So Apple would love the idea

00:41:38   that you could take subunits of this,

00:41:41   like the thing that does IO or whatever,

00:41:43   and reuse that between generations,

00:41:46   because between generations, like, yo,

00:41:48   what are we on now,

00:41:49   Thunderbolt 3, Thunderbolt 4,

00:41:51   that lasts a couple of generations.

00:41:52   I think both the M2 and the M3

00:41:54   have the same speed Thunderbolt interface.

00:41:56   So if that was in a separate chiplet,

00:41:57   you can just reuse that one from before on the future chip.

00:42:01   And that could still be seven nanometer, for example.

00:42:03   I don't know if this is the right part of the chip

00:42:05   to be reused, but you can design it once

00:42:07   and reuse that little sub chiplet in future generations,

00:42:11   as long as you can do an interconnect between all of these

00:42:13   that has all the attributes,

00:42:14   they describe: low latency, blah, blah, blah.

00:42:16   So far, Apple's packaging innovation has been the interposer,

00:42:19   which is technically impressive,

00:42:20   and it makes the ultra possible.

00:42:22   But this is more like a more general purpose thing

00:42:24   of like lots of different little, you know,

00:42:27   a city of chips,

00:42:28   lots of different little islands on your chip

00:42:30   to even to just make what Apple does currently

00:42:32   in a single SoC,

00:42:32   and that can make things less expensive

00:42:34   and improve your yields as well.

00:42:37   - And then you want to tell me

00:42:38   about the Universal Chiplet Interconnect Express.

00:42:40   Is that a train, Jon?

00:42:42   - UCI and then lowercase e.

00:42:44   So this is one of these multi company consortiums

00:42:49   to do exactly what we were just describing.

00:42:51   So this is a press release from March 2nd, 2022.

00:42:54   It says Intel,

00:42:55   along with Advanced Semiconductor Engineering,

00:42:57   AMD, ARM, Google Cloud, Meta, Microsoft, Qualcomm,

00:43:00   Samsung, and TSMC have announced the establishment

00:43:03   of an industry consortium

00:43:04   to promote an open die-to-die interconnect standard

00:43:07   called Universal Chiplet Interconnect Express, or UCIe.

00:43:10   The chiplet ecosystem created by UCIe is a critical step

00:43:12   in the creation of a unified standard

00:43:14   for interoperable chiplets,

00:43:16   which will ultimately allow for next generation

00:43:17   of technological innovations.

00:43:19   Apple's not on that list, but TSMC is,

00:43:21   and Apple has a good relationship with them.

00:43:24   So if we're looking towards a future

00:43:26   where it becomes untenable to continue making larger

00:43:30   and larger dies, you know,

00:43:32   with everything on them, these SoCs,

00:43:34   a good industry standard system for making chiplets

00:43:39   at whatever the most economical and useful size is

00:43:43   and weaving them together to essentially form an SOC

00:43:46   out of it seems like a good idea.

00:43:49   Doing it all on one die probably still has advantages,

00:43:52   but price may not be one of those advantages.

00:43:55   And again, Intel is super interested

00:43:57   because they essentially can't do that

00:43:59   without having TSMC do all of their fabbing for them

00:44:01   if they want the top tier technology.

00:44:04   Apple doesn't currently have that problem,

00:44:06   but the fact that Johnny Srouji mentioned stuff

00:44:08   about packaging and the fact that this whole chiplets thing,

00:44:12   like other companies are already doing this,

00:44:13   AMD has been big into the chiplets thing for a while,

00:44:15   Intel is putting a lot of money and research into this

00:44:18   and their chips that they're gonna be coming out with

00:44:21   use some of this technology,

00:44:23   I think it's unavoidable that Apple will start using it too.

00:44:25   Why am I interested in all this?

00:44:27   You know why.

00:44:28   It's to keep the dream alive of a Mac Pro chip

00:44:31   that is not the same as one you can get in a Mac Studio.

00:44:34   And what's gonna make that possible?

00:44:36   Apparently with the approach they use

00:44:37   for the past two generations, not economically feasible.

00:44:41   If it's ever gonna be economically feasible,

00:44:43   we need a change.

00:44:44   This change in packaging could bring that about.

00:44:47   Stay tuned. I don't know, 2027? Who knows?

00:44:50   - Don't mind.

00:44:52   All right, Jon, in keeping with the idea

00:44:56   of keeping the dream alive,

00:44:58   tell me about ERR_NETWORK_CHANGED.

00:45:00   - This is a nightmare, it's not a dream.

00:45:02   - Touche.

00:45:03   - ERR_NETWORK_CHANGED continues.

00:45:05   Some more research from the internet about this.

00:45:09   So here is a Stack Overflow question.

00:45:12   This person asks, whenever my iPhone and macOS

00:45:15   are on the same Wi-Fi, Chrome on macOS

00:45:17   often reports ERR_NETWORK_CHANGED.

00:45:20   I found that whenever my iPhone and macOS

00:45:22   are on the same Wi-Fi, a record will often appear

00:45:24   in the routing table in macOS

00:45:26   and disappear after a few seconds.

00:45:28   When this route record appears,

00:45:29   my Chrome will most likely have

00:45:31   an ERR_NETWORK_CHANGED error.

00:45:32   I turned off the iPhone's Wi-Fi,

00:45:34   the route record in macOS disappeared,

00:45:36   and Chrome no longer had the error.
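As an aside for anyone who wants to reproduce this at home: here's a minimal sketch, purely illustrative and not from the question, that polls the routing table with `netstat -rn` and prints whatever route lines appear or disappear between snapshots:

```python
import subprocess
import time

def route_snapshot():
    """Snapshot the kernel routing table (`netstat -rn`) as a set of lines.

    Works on macOS; Linux prints different columns but the idea is the same.
    """
    out = subprocess.run(["netstat", "-rn"], capture_output=True, text=True)
    return set(out.stdout.splitlines())

def diff_routes(before, after):
    """Return (added, removed) route lines between two snapshots."""
    return sorted(after - before), sorted(before - after)

def watch(interval=2.0):
    """Poll forever, printing routes as they appear (+) and vanish (-)."""
    prev = route_snapshot()
    while True:
        time.sleep(interval)
        cur = route_snapshot()
        added, removed = diff_routes(prev, cur)
        for line in added:
            print("+", line)
        for line in removed:
            print("-", line)
        prev = cur
```

If a route involving your phone's address flickers in and out right when Chrome throws ERR_NETWORK_CHANGED, you've at least reproduced the correlation the questioner describes.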

00:45:38   So this is, everyone is always looking for like,

00:45:40   what causes this?

00:45:41   And so they start looking at anything,

00:45:42   like what changes my network?

00:45:44   What is it about my network that has changed?

00:45:46   And they'll spot something,

00:45:47   and they'll find a thing to attribute it to,

00:45:49   and they'll stop that thing by saying,

00:45:50   it looks like it's happening because my phone is here,

00:45:52   and if I turn off my WiFi, I fix the problem.

00:45:54   Every single one of these that has been reported,

00:45:57   I have experimented with,

00:45:58   and it doesn't actually eliminate the problem,

00:46:02   it just moves it around.

00:46:03   Because the problem is, the network changed

00:46:05   and Chrome flips out about it.

00:46:07   Is the problem that the network is changing?

00:46:08   Is the problem that Chrome's flipping out about it?

00:46:11   Is it both?

00:46:13   Either way, if you stop the thing

00:46:15   that changes in your network,

00:46:16   doesn't mean that something else

00:46:17   won't also change your network and cause the error.

00:46:20   But I'm glad that somebody found something that helps them.

00:46:22   This next one is about Tailscale.

00:46:25   Gil Penderson says, "The Tailscale VPN used to trigger this,

00:46:28   but they fixed it with," and there's a link to a patch.

00:46:30   So this is Tailscale patching their own code

00:46:33   to avoid this error,

00:46:33   because they were seeing this error happening,

00:46:35   and they were saying, hey, this thing that we're doing,

00:46:38   Mac OS is flipping out about it.

00:46:40   So basically it says iOS/macOS,

00:46:44   so it's apparently not just a Mac thing,

00:46:46   will reconfigure their routing

00:46:47   anytime anything minor changes in the netmap.

00:46:50   And so they're changing their code to not do that,

00:46:52   because they don't wanna anger iOS/macOS.

00:46:54   This is a change that was committed in 2020.
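The shape of that fix, as described, can be sketched roughly like this. The field names here are illustrative assumptions, not Tailscale's actual schema; the point is just: only reconfigure OS routing when a routing-relevant part of the netmap actually changed.

```python
# Illustrative sketch of "don't reconfigure routing on every netmap
# change" -- the field names are hypothetical, not Tailscale's real code.
ROUTING_FIELDS = ("addresses", "routes", "dns")

def needs_reconfigure(old_netmap: dict, new_netmap: dict) -> bool:
    # Reconfigure (and risk the OS, and therefore Chrome, noticing a
    # "network change") only when a field that affects routing differs.
    return any(
        old_netmap.get(f) != new_netmap.get(f)
        for f in ROUTING_FIELDS
    )
```

Anything else in the netmap churning (peer lists, keys, whatever) then stops rippling out into the routing table, which is exactly what was tripping the error.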

00:46:57   So, are iOS and macOS changing their routing more?

00:47:02   Is there something that was part of the operating system

00:47:05   that's causing it?

00:47:07   Lots of theories about this.

00:47:08   One that's not even in the notes here is about,

00:47:11   if you have a dev device,

00:47:12   like if you have an iPhone

00:47:13   that is configured as a dev device,

00:47:15   there's apparently a new way

00:47:16   that Macs communicate to the phones

00:47:19   for purposes of development,

00:47:21   that also uses a network connection

00:47:23   that's brought up and torn down,

00:47:24   and apparently causes the network to be changed

00:47:26   as far as Chrome is concerned, and it flips out.

00:47:28   So anyway, here's an anonymous bit

00:47:29   that's getting us closer to the root of the problem.

00:47:32   And this is an explanation of it

00:47:34   from the perspective of Chrome, Chromium, whatever.

00:47:38   Because again, this is not just a Mac or iOS problem.

00:47:40   This happens to be on Linux.

00:47:42   This is a, part of the problem is clearly

00:47:45   with the Chrome/Chromium browser engine,

00:47:50   but the operating system has to participate in it as well,

00:47:52   because they're the ones that control the network

00:47:53   that is changing.

00:47:54   Anyway, anonymous writes,

00:47:56   "The network layer from Chromium is available

00:47:58   as a standalone cross-platform library called Cronet,

00:48:01   which is open source

00:48:02   and used in other non-Google applications.

00:48:04   I work at Google on a major Android app

00:48:05   that uses Cronet extensively,

00:48:07   so I've got experience with ERR_NETWORK_CHANGED."

00:48:09   This issue appears to be with QUIC, Q-U-I-C,

00:48:12   which is the UDP-based web protocol thing.

00:48:15   We'll put a link to the Wikipedia page.

00:48:16   You can read about it.

00:48:17   And/or HTTP/3, which is a new version of HTTP,

00:48:20   which is sensitive to changing networks.

00:48:22   Whenever the networks on the device change,

00:48:24   the connection needs to be reset.

00:48:25   This isn't exactly a bug.

00:48:26   It's a necessary step.

00:48:28   Not doing this would mean connections

00:48:29   wouldn't work properly.

00:48:30   Unfortunately, this causes ERR_NETWORK_CHANGED

00:48:33   when the device's network changes.

00:48:35   As previously mentioned,

00:48:36   this can be for non-obvious reasons,

00:48:37   such as Docker Desktop customizing the networks repeatedly,

00:48:40   VPN apps misbehaving,

00:48:41   cell signal dropping in and out, et cetera.

00:48:44   There are ways in which this can be fixed.

00:48:46   First, not using HTTP/QUIC is the easiest.

00:48:50   Safari and other Apple stuff doesn't use this yet,

00:48:52   so it isn't currently experiencing these issues.

00:48:54   However, this will likely change

00:48:55   as Apple rolls out HTTP/3 support.

00:48:57   Apart from this issue, it's a much better technology,

00:48:59   particularly for mobile devices.

00:49:00   So this is the first offered explanation of,

00:49:03   hey, why doesn't Safari have this problem?

00:49:04   The theory is, oh, Safari doesn't use QUIC or HTTP/3,

00:49:08   and so it's not going to see any of these errors.

00:49:11   Difficult to test because,

00:49:13   as far as I was able to determine

00:49:15   in conversing with this person,

00:49:16   there is no way to disable HTTP/3 and QUIC

00:49:20   in Chrome entirely.

00:49:21   There are some flags in chrome://flags

00:49:22   where you can turn some things off,

00:49:23   but I think you can't turn it off entirely.

00:49:25   So it's hard to do an apples-to-apples comparison.

00:49:28   But you pin your hopes on the smallest thing.

00:49:30   If Apple does plan on rolling out HTTP/3 or QUIC,

00:49:33   their browser is going to break like Chrome is.

00:49:35   And hopefully they'll be like, oh, geez,

00:49:37   this browser has become useless

00:49:39   because half of my HTTP connections fail.

00:49:41   Let's see if we can fix this,

00:49:43   as opposed to now where apparently no one is doing anything.

00:49:45   Next possible fix.

00:49:46   OSes and networking apps will need to be more careful

00:49:49   about network changes.

00:49:50   Apple will probably fix this in a Sonoma point release.

00:49:52   Other platforms I've seen this on are working on fixes.

00:49:56   I wish I had this person's optimism.

00:49:59   Will it be fixed in a Sonoma point release?

00:50:01   Does Apple even know about this?

00:50:03   I haven't bothered reporting this

00:50:05   because they're just going to say it's a Chrome error,

00:50:07   and I'm not sure they're wrong.

00:50:09   Like, I honestly don't know how much of the blame

00:50:11   to apportion to Chrome versus macOS.

00:50:13   Anecdotal evidence is that this is way worse in Sonoma.

00:50:17   There are a bunch of new network-related daemons

00:50:19   for various things that people have assigned blame to.

00:50:21   Oh, it's because of the utun thing.

00:50:23   It's because of talking to a developer iPhone device.

00:50:26   You know, it's because of Docker.

00:50:28   It's because of a VPN.

00:50:29   It's because whenever your iPhone and your Mac

00:50:33   are on the same Wi-Fi network, it happens.

00:50:35   Like, I don't know.

00:50:37   But I don't know how to communicate this

00:50:41   to anybody involved other than to say

00:50:43   Chrome no longer works on Mac in a reliable way,

00:50:45   like in a fundamental, no longer useful

00:50:48   as a browser kind of way because, you know,

00:50:51   some large percentage of requests just fail,

00:50:54   and it's not great.

00:50:55   Finally, the last thing.

00:50:57   There is experimentation going on in Cronet

00:50:59   to improve things by migrating connections

00:51:01   across network interfaces.

00:51:03   This is the migrate-sessions-on-network-change-v2 flag

00:51:06   or feature or whatever.

00:51:07   We'll put a link to the thing in the show notes.

00:51:10   This isn't just a bug fix, though.

00:51:11   It has trade-offs, so it likely needs to be done

00:51:13   in combination with improvements from OSes

00:51:15   and networking apps.

00:51:16   So the idea behind this thing is,

00:51:18   hey, if there's some connection

00:51:20   in this Cronet networking library,

00:51:22   and the network changes, and we have to, you know,

00:51:25   not use that connection anymore because it won't work,

00:51:28   whatever was using that connection,

00:51:30   why don't we smoothly migrate it over

00:51:32   to the new working connection,

00:51:33   and then the application that is using Cronet,

00:51:36   which would be the Chromium browser engine,

00:51:38   doesn't need to worry that that happened.

00:51:40   It will just continue to work.

00:51:41   That would be great, but it's not just like,

00:51:43   it's not just a bug fix

00:51:44   because there are performance implications of that.

00:51:46   Migrating, it is not free.

00:51:47   It takes time, and it's not the same

00:51:50   as just not having the network change.
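For what that means at the application layer, here's a hedged sketch of the blunt alternative to migration: when the transport reports a network change, throw the connection away and retry from scratch. The exception name is made up for illustration; real code would map it from whatever error the platform surfaces (e.g. ERR_NETWORK_CHANGED).

```python
class NetworkChangedError(Exception):
    """Stand-in for a transport-level 'network changed' failure."""

def fetch_with_retry(fetch, url, retries=2):
    # Without transparent migration, the caller's only option is to
    # rebuild the connection on the new network and try again --
    # exactly the teardown-and-rebuild cost that the
    # migrate-sessions-on-network-change experiment tries to avoid.
    for attempt in range(retries + 1):
        try:
            return fetch(url)
        except NetworkChangedError:
            if attempt == retries:
                raise
```

Migration would make that retry loop unnecessary, at the cost of the bookkeeping and latency described above.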

00:51:52   So I don't know what the solution is here,

00:51:54   but it's extremely frustrating.

00:51:56   We did not have an ERR_NETWORK_CHANGED shirt

00:51:58   as part of this sale,

00:51:59   but I'm seriously thinking about it for future ones.

00:52:01   This continues to happen.

00:52:04   And then the final bit, this is really the icing on the cake.

00:52:06   This is not seemingly related to ERR_NETWORK_CHANGED,

00:52:11   but kind of is.

00:52:12   This was a toot by someone named Thomas who said,

00:52:15   "I just learned what the user interface

00:52:17   in the SpaceX capsules runs on.

00:52:19   The capsules that provide life support

00:52:21   for people traveling into space

00:52:22   have to be absolutely reliable.

00:52:24   The user interface that controls an explosion.

00:52:27   It runs a home-compiled version of Chromium,

00:52:29   and the UI is written in JavaScript."

00:52:31   And this person wrote this to say,

00:52:32   "Can you believe they're using web technologies

00:52:34   and a thing that has to be reliable?"

00:52:36   But that's not the thing that sticks out.

00:52:38   The thing that sticks out is the word "chromium."

00:52:40   Do you think if they have ERR_NETWORK_CHANGED

00:52:42   in a SpaceX rocket, that might be a problem?

00:52:47   And who's responsible for fixing that bug?

00:52:49   I'm sure it's running Linux or whatever,

00:52:50   but, again, people running Chrome on Linux

00:52:52   or Chromium-based things on Linux

00:52:54   are also experiencing this problem.

00:52:57   Sometimes I feel like I'm the only person with this problem,

00:52:59   but every time I talk about it on the show,

00:53:01   people send me toots on Mastodon.

00:53:03   Here I am, I'm getting it, I'm getting ERR_NETWORK_CHANGED.

00:53:05   I got people on Linux, people on Macs.

00:53:07   So far no one on iOS,

00:53:08   'cause maybe those errors are hidden from you

00:53:10   on most iOS apps, but...

00:53:12   I mean, I still think my Winter Dragon bug

00:53:15   is more important 'cause I could just use Safari.

00:53:17   But for the people in the SpaceX capsules

00:53:21   that have a UI that's running Chromium,

00:53:22   maybe talk to your bosses about this.

00:53:27   We are brought to you this episode by Squarespace,

00:53:29   the all-in-one website platform

00:53:31   for entrepreneurs to stand out and succeed online.

00:53:34   Whether you're just starting out

00:53:35   or managing a growing brand,

00:53:37   Squarespace makes it easy to create a beautiful website,

00:53:40   engage with your audience, and sell anything,

00:53:42   from your products to your content to your time,

00:53:44   all in one place and all on your terms.

00:53:47   Squarespace has an amazing platform to build websites.

00:53:51   It has never been easier for anyone

00:53:53   to unlock unbreakable creativity

00:53:56   with their awesome all-new Fluid Engine.

00:53:58   You start with a best-in-class website template

00:54:00   and then you customize every design detail you want

00:54:03   with reimagined drag-and-drop tech

00:54:05   that works on desktop and mobile.

00:54:06   You can really stretch your imagination online

00:54:09   with Fluid Engine, built in with all new Squarespace sites.

00:54:12   And Squarespace has amazing online storefront support.

00:54:15   So you can sell your products on an online store.

00:54:18   And not just physical goods,

00:54:20   even digital goods or service products,

00:54:22   or even things like time slots.

00:54:24   Squarespace has all the tools you need

00:54:26   to start selling online,

00:54:28   all with flexible payment options,

00:54:30   seamless checkouts that really convert very well,

00:54:33   all sorts of options for people to pay,

00:54:34   including of course credit cards,

00:54:36   but also PayPal, Apple Pay, Buy Now, Pay Later,

00:54:39   all sorts of great support there.

00:54:41   All of this is backed by powerful analytics,

00:54:43   so you can grow your business with insights

00:54:46   about things like where your site visits

00:54:47   and sales are coming from,

00:54:48   which marketing channels are most effective.

00:54:50   You can improve your website

00:54:51   and build a marketing strategy based on your top keywords,

00:54:54   or most popular products and content,

00:54:55   or whatever you want.

00:54:57   You can try everything on Squarespace at squarespace.com

00:55:00   and start a free trial there.

00:55:03   You can see it all in trial mode,

00:55:04   and you can see how well it's gonna work for you.

00:55:06   When you're ready to launch,

00:55:07   go to squarespace.com/atp for 10% off

00:55:10   your first purchase of a website or domain.

00:55:13   So once again, squarespace.com to start that free trial.

00:55:15   When you're ready to buy,

00:55:16   squarespace.com/atp for 10% off.

00:55:19   Thank you so much to Squarespace for sponsoring our show.

00:55:23   (upbeat music)

00:55:26   As we record this, it is Monday, what's it say,

00:55:28   the 20th, is that right?

00:55:30   Yes, Monday the 20th.

00:55:31   In the last three days, there have been approximately,

00:55:35   carry the seven, 7,304 different CEOs for OpenAI.

00:55:40   - Somehow still only one board.

00:55:42   - Right, yeah.

00:55:43   - Well, we've been recording for a while, so.

00:55:45   - Yeah, right.

00:55:46   - That's also fair.

00:55:47   So I don't really have too much to say about this,

00:55:50   but I guess--

00:55:51   - Just read the headlines in the notes.

00:55:52   - All right, so these are the headlines,

00:55:55   and we'll link these in the show notes.

00:55:56   - I think this is like 48 hours span, 24, 48.

00:56:00   - Yeah, it's been almost no time.

00:56:02   - So these are the headlines as per the Verge.

00:56:05   Sam Altman fired as CEO of OpenAI.

00:56:08   That was on the 17th.

00:56:10   - And by the way, out of nowhere.

00:56:12   - Yeah, so let's get through the headlines,

00:56:14   and then let, yeah, I think it's worth unpacking all this.

00:56:17   Then next, Twitch co-founder, Emmett Shear,

00:56:19   is the new CEO of OpenAI, that was apparently today.

00:56:23   Microsoft hires former OpenAI CEO, Sam Altman, also today.

00:56:27   Hundreds of OpenAI employees threaten to resign

00:56:30   and join Microsoft today.

00:56:32   Is Sam Altman joining Microsoft?

00:56:34   Satya Nadella doesn't seem to know.

00:56:37   - I think there's hours between those things.

00:56:39   - Yeah.

00:56:40   - How can you have, like, yes, it's a comedy of errors,

00:56:44   but then every time there's a headline,

00:56:46   it's like, this is a thing that happened,

00:56:47   and it's like, but has it?

00:56:49   - Yeah, exactly.

00:56:51   - I skip the ones where it's like, oh, he was fired,

00:56:54   but there are negotiations for him to come back.

00:56:55   But no, he's not coming back.

00:56:56   But yes, he is coming back,

00:56:57   but they're talking about coming back.

00:56:58   But now he's at Microsoft,

00:56:59   but has he been hired by Microsoft?

00:57:01   And this is not just like,

00:57:01   oh, we're wondering what's going on.

00:57:04   That's why the headline says,

00:57:05   is Sam Altman joining Microsoft?

00:57:07   Satya Nadella, who by the way is the CEO of Microsoft,

00:57:09   doesn't seem to know.

00:57:10   Satya Nadella announced,

00:57:12   hey, Sam Altman's coming to Microsoft.

00:57:14   But then when asked about it, like, hours later,

00:57:16   it was like, well, not really sure.

00:57:19   I don't know, check back in a few hours.

00:57:22   Boy.

00:57:23   - Yeah, so let me try to give

00:57:24   the quick executive summary of this.

00:57:26   Now, I'm surely gonna get a little bit wrong,

00:57:28   but the general gist of it is,

00:57:30   I think it was late Friday, Eastern time,

00:57:33   there was a blog post from the OpenAI board, basically,

00:57:37   that said that Sam Altman, who was the former CEO,

00:57:42   maybe by the time you're listening to this,

00:57:44   might be the CEO again, who knows?

00:57:45   But former CEO had basically not,

00:57:48   what was the phrase they used?

00:57:50   Not been, forthcoming wasn't the word they used,

00:57:54   but it was something along those lines.

00:57:55   - It was pretty severe.

00:57:58   - It was vague, but like, as, you know,

00:58:00   normally when a CEO is fired,

00:58:04   this is the privilege of being in the executive ranks.

00:58:06   When a regular person is fired, you just get fired.

00:58:09   But if you're a CEO, no matter how terrible

00:58:12   whatever you did was, no matter how badly you screwed up,

00:58:16   you're always like, you know, moving on to spend more time

00:58:19   with your family or want to pursue other interests,

00:58:23   or like they have some euphemism about how you're,

00:58:25   it's not like we're firing him.

00:58:26   It's a mutual decision we've all come to,

00:58:28   and he's decided to leave, and he's, you know,

00:58:31   pursuing his passions in this other realm,

00:58:33   and we thank him for all his blah, blah, blah,

00:58:34   and this wasn't like that.

00:58:36   This was as close as you're ever gonna get

00:58:38   to saying we fired him 'cause we don't like him.

00:58:40   - Yeah, that's how I remember being with Forrestal,

00:58:42   was basically like, it was clear that he got fired,

00:58:45   but everyone put on the happy face.

00:58:47   We would like Scott to spend more time with his family.

00:58:51   Yes, I would like to spend more time with my family.

00:58:55   - And usually, and usually in those type of statements,

00:58:57   there's something from the person who was fired,

00:58:59   like quote, you know, whatever, quotes from them, right?

00:59:02   They'll say like, I'm happy that I spent all my time

00:59:04   with this company, I'm glad to be moving on,

00:59:06   and blah, blah, blah, like it's a participatory process.

00:59:08   But what happened with Sam Altman,

00:59:11   the story I heard on a podcast is that

00:59:13   he was on stage doing something,

00:59:15   like in his official capacity as CEO,

00:59:17   and he said, okay, well, you know, thanks everybody,

00:59:20   I have to go, I have a meeting.

00:59:21   In that meeting, he got on, he got on a call,

00:59:25   this is, I was listening to the Verges podcast about this,

00:59:27   he got onto the call using Google Meet,

00:59:28   and one of the people on the show said,

00:59:30   even for $10 billion, no one will use Teams.

00:59:32   Microsoft invested $10 billion or whatever in OpenAI.

00:59:37   So anyway, he goes to his meeting, he sits down,

00:59:39   and it's, you know, an online meeting, right?

00:59:42   And all they did in the online meeting was read him

00:59:45   the blog post he just referred to, okay,

00:59:46   so they just put in, that's what you do in these meetings,

00:59:48   you have a script, you stick to it so you don't get sued,

00:59:50   you just read the words, so if you're wondering,

00:59:52   like, oh, they must have had an extensive meeting

00:59:53   in which they explained to him why he was fired,

00:59:56   when we say this came out of nowhere,

00:59:57   didn't come out of nowhere to us on the outside,

00:59:59   'cause what do we know?

01:00:00   Came out of nowhere to Sam Altman,

01:00:02   who just got done doing a CEO thing and said,

01:00:04   oh, I gotta go to a meeting,

01:00:05   and they just read him a thing and said, yeah, you're fired.

01:00:07   - And by the way, the thing he was doing,

01:00:09   like having their whole developer conference,

01:00:11   was widely lauded by the industry as like an amazing thing,

01:00:15   everyone thought he did an amazing job,

01:00:16   this is a great path beyond--

01:00:18   - Everyone's excited about OpenAI.

01:00:19   - Yeah, everyone's super excited about OpenAI and him,

01:00:22   like it was, that's why when this news dropped

01:00:25   that the board fired him,

01:00:26   everyone in the entire tech business was like, what?

01:00:30   Are you serious, is this a joke, like what happened?

01:00:32   - Yeah, so here's from the blog post

01:00:34   I was able to dig it up while you guys were talking,

01:00:36   this is the blog post from the OpenAI board,

01:00:38   Mr. Altman's departure follows a deliberate review process

01:00:41   by the board which concluded that he was not consistently

01:00:44   candid in his communications with the board,

01:00:46   hindering its ability to exercise its responsibilities.

01:00:48   The board no longer has confidence in his ability

01:00:50   to continue leading OpenAI.

01:00:52   - Which by the way, like that's a really serious accusation,

01:00:56   like that is like, potentially misleading the board

01:01:00   is in many cases a crime,

01:01:02   like that is a really serious accusation,

01:01:05   that's why like this isn't just,

01:01:07   oh, we had creative differences,

01:01:09   we don't like him anymore,

01:01:10   that's a very serious thing to throw around,

01:01:13   what they threw around.

01:01:14   - And by the way, as far as I understand it,

01:01:17   the board doesn't have to have a reason,

01:01:19   like they have complete control,

01:01:21   there's no like power struggle or whatever,

01:01:23   the structure of this company is such that the board

01:01:26   can decide, oh, we don't want you to be CEO,

01:01:28   I think for pretty much any reason,

01:01:30   like, he doesn't need to have done anything wrong,

01:01:31   like it just happens all the time,

01:01:32   the board's like, we want a different CEO,

01:01:34   we wanna go in a different direction,

01:01:35   that happens all the time,

01:01:36   they don't need to explain,

01:01:38   which is why you can do the,

01:01:39   oh, they're going to spend more time with the family

01:01:41   or whatever, whatever the real reason is,

01:01:43   it's not like they need to make something up,

01:01:44   like we can't fire him unless we say that he lied to us,

01:01:49   they could have just said, see you later,

01:01:49   but they threw that in there to say,

01:01:51   oh, and we have reasons,

01:01:52   we think he has not been quote unquote,

01:01:54   consistently candid,

01:01:55   like they didn't have to put that in there for any reason,

01:01:58   other than to just, I mean, I don't know,

01:02:01   maybe this will give us cover because it's vague enough,

01:02:03   people can imagine some terrible thing happened,

01:02:05   but bottom line is, apparently a majority,

01:02:09   we don't even know how much,

01:02:10   because it's not a public company,

01:02:12   so it's not like they have public records to say,

01:02:13   was this a unanimous vote,

01:02:14   did everybody vote to kick him out or whatever,

01:02:16   but a majority of the people on the board

01:02:19   decided they don't want him to be CEO anymore,

01:02:22   and they made this decision without consulting Sam Altman

01:02:25   at all, I mean, maybe they've had past conversations

01:02:27   about it, maybe he should have known it's coming

01:02:29   because they said in their last meeting,

01:02:30   if you do really well in that dev conference, you're fired,

01:02:33   I don't know what they might have said to him.

01:02:36   - So the impression I get, and I don't have a lot of facts,

01:02:41   nobody has facts here at the moment,

01:02:43   but the impression I got based on the coverage that I've read

01:02:47   is that there were two different factions, or tribes,

01:02:49   I think Sam had called them at one point or another,

01:02:51   two different tribes within OpenAI.

01:02:54   OpenAI was originally founded in 2014, 2015 thereabouts

01:02:58   as a nonprofit, and their theory was,

01:03:01   we wanna make basically artificial general intelligence,

01:03:04   we wanna make AI, but we understand that with this great

01:03:09   power comes great responsibility, and we wanna do this

01:03:12   to improve the lives of people everywhere,

01:03:17   but we're gonna try to do it very methodically,

01:03:19   very deliberately, very safely.

01:03:21   And that was how the company was founded,

01:03:24   and it's not even really a company, I guess, it's--

01:03:28   - It's a nonprofit. - It's a nonprofit, right?

01:03:30   So that's how it was founded.

01:03:32   Well, at some point, you know, Altman swoops in,

01:03:34   I don't recall exactly when it was in the timeline,

01:03:36   but it's clear that Sam Altman is very Silicon Valley VC,

01:03:41   which I personally find to be a very ugly stereotype,

01:03:46   like, I just don't care for that whole like,

01:03:48   grow, grow, grow, take over the world mindset.

01:03:50   - And he is the embodiment of that stereotype,

01:03:52   he used to be with Y Combinator, which is also kind of

01:03:54   a poster child for that type of VC.

01:03:56   - Yeah, and he's not the guy who had the day-of-phone,

01:03:58   night-of-phone, he's the guy who wore the multiple

01:04:00   polo shirts at the same time.

01:04:01   - Yeah, sorry to keep track of all that, Sam, yeah.

01:04:04   - Exactly, now, and I'm trying to make plain,

01:04:07   to quote Merlin, my priors on this,

01:04:09   like, I find that whole Silicon Valley mindset

01:04:11   to be very off-putting, you know, grow,

01:04:14   grow no matter what, we don't care who we run over

01:04:17   in the process, grow, grow, grow, you know,

01:04:19   let's make money, make money, and that's all that matters

01:04:22   is making money, nothing else matters,

01:04:23   grow and make money, grow and make money,

01:04:25   or sometimes you don't even have to make money,

01:04:26   you just grow.

01:04:27   Well, so anyway, so consider that when I give,

01:04:29   one of my, you know, lip turns up,

01:04:31   as I tell you this story, but nevertheless,

01:04:34   so Sam comes in and he's grow, grow, grow,

01:04:36   money, money, money, and everything,

01:04:38   I guess everyone was kind of coexisting okay,

01:04:41   and even though they disagreed internally,

01:04:44   everything was mostly all right.

01:04:45   Well, then they released ChatGPT, what, about a year ago,

01:04:48   and it makes a tremendous splash.

01:04:53   I think a lot of people are saying it's the quickest

01:04:55   adopted consumer product ever, because, you know,

01:04:58   they had something like 100 million signups

01:05:00   in the span of like four minutes,

01:05:01   not literally, but you know what I mean,

01:05:02   and so now money is becoming a thing, like for real,

01:05:07   and suddenly we have to pay the piper on this division

01:05:11   between money, money, money, and let's do this

01:05:14   for the good of all people, and their chief scientist,

01:05:17   Ilya, something or other, I don't have the name

01:05:19   in front of me, I guess was more on the let's play it safe,

01:05:23   let's be deliberate, let's do this for the good

01:05:25   of humanity, and Sam was very much money, money, money,

01:05:28   and at some point, it appears something gave,

01:05:33   and the rumblings that I heard, not from like sources

01:05:37   or anything, just based on the reporting I read,

01:05:41   was that Ilya had convinced the board,

01:05:43   of which I think he is a part,

01:05:45   let's get rid of Sam, and then all hell broke loose,

01:05:49   and now, as we record on Monday night,

01:05:51   we don't really know what the latest and greatest is,

01:05:54   but I understand, you know, both sides of this.

01:05:58   Like on the one side, if the board really did establish,

01:06:00   or if OpenAI really was established as a non-profit,

01:06:02   which I think is factual, I don't think that's up for grabs,

01:06:05   it is within reason for them to say, well, hold on,

01:06:08   suddenly we've taken a turn and pivoted

01:06:10   to money, money, money, we don't like that,

01:06:13   and Sam seems to, by most metrics,

01:06:16   have pivoted them in that direction,

01:06:18   so if we don't like this pivot to money, money, money,

01:06:20   then we probably don't like Sam anymore,

01:06:22   so on the surface, I don't have a problem with that,

01:06:24   and again, my priors tell me, yeah, screw that Silicon Valley

01:06:27   nut job, you know, let's do this for the good of people

01:06:30   rather than the good of the almighty dollar.

01:06:32   - Well, wait till you hear about the other side of that,

01:06:34   because I think that they're both nut jobs.

01:06:36   - Well, that's fair too, that's fair too.

01:06:38   But I'm just trying to make plain, you know,

01:06:40   this is my biases coming to light.

01:06:43   That being said, unquestionably,

01:06:46   ChatGPT and the work of OpenAI and DALL-E,

01:06:49   and you know, all this AI stuff,

01:06:51   whether or not you think it's cool,

01:06:52   whether or not you think it's good,

01:06:55   I think unquestionably, it's important,

01:06:58   and it's been making a big damn splash,

01:07:01   and I think that there's a lot of interesting things here,

01:07:04   and unlike blockchain, I think there's a lot

01:07:07   of fascinating threads that we can pull,

01:07:09   and I think that there's a lot there,

01:07:12   and because of that, I also have sympathy

01:07:16   for the let's grow this product

01:07:18   and see what the world can do with it,

01:07:20   and let's just, let's not slow down,

01:07:22   let's not, you know, be deliberate about it, screw it,

01:07:25   let's just figure it out as we go,

01:07:26   which is when Silicon Valley is at its best.

01:07:29   So I have very mixed feelings about this,

01:07:32   and I mean, as someone who likes drama more than I should,

01:07:36   because I'm a grown-ass man,

01:07:38   I really shouldn't enjoy drama this much,

01:07:41   but oh, this is delicious drama,

01:07:42   and I am here for the drama of it,

01:07:44   but I honestly don't know who's right, who's wrong,

01:07:47   I'm not sure any of us do.

01:07:49   I don't know what to make of this,

01:07:50   but it is a mess, and it's been a mess

01:07:53   every other hour since late Friday evening.

01:07:56   - Yeah, by the time you hear this episode,

01:07:58   who knows what will have happened.

01:07:59   I think this is actually, I mean,

01:08:01   a lot of, this is the most,

01:08:03   the most Silicon Valley thing about this

01:08:05   is that a lot of the companies that, you know,

01:08:08   from my youth that sort of came out of Silicon Valley,

01:08:11   they were at the forefront of some technology that,

01:08:14   they were at the right place at the right time, right?

01:08:16   They decided to make personal computers,

01:08:18   and personal computers were just becoming possible, right?

01:08:20   You know, you had your Apples, you had your Intel,

01:08:22   all right, you know, microchips, memory, you know,

01:08:24   CPUs, x86, Microsoft with the software, like,

01:08:27   they're riding a wave of something.

01:08:31   And those particular companies, especially, you know,

01:08:33   we know a lot of the names of the founders,

01:08:35   Bill Gates, Steve Jobs, a lot of them were younger people,

01:08:38   and pretty much none of them went into it thinking,

01:08:43   I'm going to found what's going to become

01:08:46   what we know today as tech giants,

01:08:48   'cause that's not what, you know,

01:08:49   who's the, if you tried to tell a young Steve Jobs

01:08:53   and Steve Wozniak what Apple will look like today,

01:08:55   they'd be like, yeah, right.

01:08:56   No one thinks that's going to happen.

01:08:57   So what happens with these companies

01:08:58   that are incredibly successful is,

01:09:00   they're made up of people, the founders,

01:09:02   but also usually they bring in other people

01:09:04   to help them run the company,

01:09:06   and especially for companies

01:09:07   that experience explosive growth,

01:09:10   the people who are responsible for running that company,

01:09:12   they're just people.

01:09:13   Sometimes they're people who've never done

01:09:15   anything like this before.

01:09:17   And so if you're wondering how can this company

01:09:19   that's been so financially successful

01:09:21   and apparently has such amazing technology

01:09:24   not have their governance straightened out

01:09:26   and basically, you know, conduct their business

01:09:30   in this ridiculous way with this whipsawing of the CEO

01:09:34   being fired and asking him to come back

01:09:36   and regretting telling him to leave and all that stuff,

01:09:38   it's because they're a young tech company

01:09:41   riding the wave of new technology

01:09:43   run by a bunch of people who are bad at doing this.

01:09:46   And that's, I mean, look, Steve Jobs was fired from Apple

01:09:51   by a bunch of people that were brought on to run the company.

01:09:53   That was probably the wrong decision

01:09:55   for them to do at that time.

01:09:56   If they had, you know, nurtured his talent,

01:09:59   perhaps they would have had a more successful 90s,

01:10:01   but you can also understand why they fired him

01:10:03   because, you know, read the history.

01:10:05   Like, a lot of these companies in their early days

01:10:08   at least run up to the edge of being, you know,

01:10:12   doing something terrible but being run in a way

01:10:15   that is not, I'm not gonna say not sane,

01:10:18   but that is not the way you expect

01:10:21   a very wealthy company to be run.

01:10:23   Even with Google, they have the two founders

01:10:25   and we have to bring in the adults to help run things.

01:10:28   You know, it can go badly in so many different ways.

01:10:32   It's very rare that you get a company

01:10:35   that manages to get through this sort of awkward adolescence

01:10:38   and sustain its success into something that continues

01:10:42   and becomes like a company that, you know,

01:10:45   that is more reliable and steady

01:10:47   and does not have weird boardroom drama like this.

01:10:49   So I'm not saying that this is unique to OpenAI at all.

01:10:52   In fact, like I said, I think this is the most

01:10:54   Silicon Valley thing that they've done.

01:10:56   That said, the people in these two camps here,

01:10:59   the Sam Altman and the OpenAI people

01:11:03   are a little bit weird.

01:11:04   They're, both sides are a little bit weird.

01:11:07   So we didn't talk about the Microsoft stuff,

01:11:08   but like, OpenAI is this nonprofit.

01:11:11   Microsoft wanted to do a thing with them.

01:11:13   So here's the thing with OpenAI.

01:11:14   Their goal is to make artificial general intelligence,

01:11:16   like HAL 9000 or whatever, that doesn't kill you.

01:11:19   - Big asterisk there.

01:11:21   - Yeah, yeah.

01:11:21   They haven't done that, to be clear.

01:11:23   That's like their aspirational goal.

01:11:24   It's like their mission statement.

01:11:26   But what they did do was make ChatGPT.

01:11:28   And it turns out that ChatGPT is a useful thing

01:11:30   that people can do stuff with.

01:11:32   And everyone's interested in it.

01:11:35   And Microsoft was interested in it and they said,

01:11:37   "Hey, we would like to do stuff with your technology too."

01:11:40   The problem OpenAI has is they want to make artificial

01:11:43   general intelligence, but it turns out doing anything

01:11:46   even approaching that costs tons and tons of money.

01:11:49   And so OpenAI's idea was like,

01:11:51   "We'll just, we'll raise money somehow.

01:11:54   And people will give us money because they know

01:11:55   we're doing the good work to make HAL 9000 not kill us."

01:11:58   And they, the amount of money they were able to raise

01:12:00   was a fraction of the amount that they would need.

01:12:02   Microsoft came and said,

01:12:04   "We kind of like the things you're doing over there.

01:12:06   How about we let you use our massive computing resources

01:12:09   and our data centers to the tune of billions of dollars

01:12:12   worth of free credits, like little Chuck E. Cheese tokens?

01:12:15   Now you can run your stuff in Azure." Because OpenAI,

01:12:17   OpenAI can't do anything without large amounts of computers,

01:12:22   which takes large amounts of money.

01:12:23   And they're a nonprofit.

01:12:24   And their idea was like, "We'll raise that money.

01:12:26   People will give us money to pursue this."

01:12:28   They just did not get as much money as they would need.

01:12:30   But ChatGPT, Microsoft's like, "Hmm, kind of like that."

01:12:33   So they did this deal where Microsoft is like

01:12:37   giving them $10 billion,

01:12:39   most of which is in the form of credits

01:12:40   to run stuff in their Azure, cloud computing stuff.

01:12:44   But how can they do that?

01:12:46   Well, if you look at the org chart,

01:12:47   it's like there's a nonprofit

01:12:48   and the nonprofit controls this other for-profit thing

01:12:52   that gets money from Microsoft.

01:12:54   But the for-profit part, it's not,

01:12:58   I think it's, like, what, capped-profit or something like that.

01:13:00   Anyway, it's entirely controlled by the nonprofit,

01:13:03   but still Microsoft has this financial interest.

01:13:05   And by the way, as part of this deal,

01:13:06   Microsoft gets all the rights to the OpenAI IP,

01:13:10   like their intellectual property,

01:13:12   I don't know what their intellectual property is,

01:13:13   but presumably whatever it is they use to make ChatGPT,

01:13:15   Microsoft now has the rights to that.

01:13:17   I think forever, as part of this $10 billion deal.

01:13:21   The only thing Microsoft doesn't have the rights to,

01:13:22   and this is another one of those great deals,

01:13:23   kind of like Microsoft doing the Internet Explorer thing

01:13:27   and saying, "Oh, don't worry.

01:13:28   We'll give you X percent

01:13:30   of all of our Internet Explorer sales,"

01:13:31   and then giving Internet Explorer away for free

01:13:34   to everybody, and so that person got nothing.

01:13:35   Anyway, that company got nothing from that.

01:13:38   OpenAI is like, "Okay, we'll do this $10 billion deal.

01:13:41   We'll get access to your computing resources,

01:13:42   which we kind of need to literally do anything

01:13:44   'cause we don't have enough money to do this AI stuff.

01:13:47   We don't have enough money to pay for the computers to do it

01:13:49   but what we won't give you,

01:13:50   and we'll license you our current technology,

01:13:53   but what we won't give you is anything having to do

01:13:56   with artificial general intelligence.

01:13:57   So if we invent HAL 9000, Microsoft, you don't get it."

01:14:00   And Microsoft's like, "Okay, I guess."

01:14:03   And Microsoft's actually thinking,

01:14:04   "They're never gonna do that.

01:14:05   It doesn't matter."

01:14:06   (laughing)

01:14:07   So they retained the right to the fantasy thing

01:14:10   that they wanna make,

01:14:11   but gave Microsoft the rights to all the other stuff.

01:14:13   And so Microsoft's doing all this stuff

01:14:14   with their co-pilot, ChatGPT, all that stuff.

01:14:19   ChatGPT, the fact that it does useful things,

01:14:22   that is a product that can make money.

01:14:24   Microsoft can incorporate that technology

01:14:26   into its own products, Microsoft Office, GitHub, everything,

01:14:29   and use that to make money.

01:14:31   That's what everyone's doing.

01:14:33   Meanwhile, the nonprofit people over there are going,

01:14:35   "What are they doing making products and making money for?

01:14:38   Don't they understand we're trying to make HAL 9000,

01:14:40   and we're trying to make sure HAL 9000 doesn't kill us?"

01:14:42   And so there's already a disconnect.

01:14:44   And so on the one side of it is like,

01:14:46   you have a technology that people will pay money to use.

01:14:49   I forget what their numbers are,

01:14:50   but like if they got 100 million users,

01:14:52   they're making some huge amount of revenue from it

01:14:54   because they charge for access to this stuff.

01:14:56   And of course, to the extent that Microsoft incorporates

01:14:59   any of this technology into their products

01:15:00   that helps them sell more of their products,

01:15:01   which of course they make money on.

01:15:03   So it's a product business.

01:15:05   Hey, we have a thing we came up with,

01:15:07   whether it was Google search or whatever,

01:15:10   and we can use it to make money.

01:15:11   And that's what Sam Altman's out there doing,

01:15:13   using a product to make money.

01:15:14   And then the OpenAI people who are like,

01:15:16   "AI is gonna kill us.

01:15:17   We need to create it in a...

01:15:18   First, we need to create it,

01:15:19   but we need to create it the right way

01:15:20   so it doesn't kill us."

01:15:21   And everyone else is like,

01:15:22   "Yeah, but you haven't created it,

01:15:23   but we've got this thing over here called ChatGPT

01:15:25   that people wanna pay for it."

01:15:26   It's not AI, even though everyone calls it that,

01:15:29   but it is a product that people wanna use.

01:15:31   So can we have a developer day and make an API

01:15:34   and charge people for API access

01:15:35   and do deals with Microsoft?

01:15:36   And it's like, "Okay, I guess."

01:15:38   But eventually like, "No, we're making HAL 9000.

01:15:42   We don't like that other stuff. Stop it."

01:15:44   And so now you have this disconnect.

01:15:46   And if you think about what OpenAI is,

01:15:49   it's this mission statement.

01:15:51   It's a bunch of employees

01:15:52   that implement this mission statement.

01:15:55   And then it is the output,

01:15:56   the knowledge and the output of those employees.

01:15:58   And so Sam Altman apparently was popular within the company.

01:16:02   So when he left and went to Microsoft, or did he,

01:16:06   of the 700 or so employees at OpenAI,

01:16:10   about 500 signed a letter that said,

01:16:12   "Hey, if you don't bring him back

01:16:14   or if you don't all quit or whatever,

01:16:15   we're all gonna go to Microsoft."

01:16:17   Because Microsoft said they have

01:16:17   an open, a standing offer

01:16:19   to go work for them instead.

01:16:20   So if Microsoft gets Sam Altman

01:16:24   and 500 of the 700 employees,

01:16:27   what is the nonprofit left with?

01:16:28   Microsoft already has the rights

01:16:29   to all the IP of everything they actually made,

01:16:32   'cause they didn't make HAL 9000.

01:16:34   So they got the rights to that.

01:16:35   And if all their employees also go over there

01:16:38   and they have the former CEO,

01:16:39   then open AI has a mission statement

01:16:42   and 200 loyal employees left.

01:16:45   And really, I guess those people can regroup

01:16:48   and use their $10 billion of Azure bucks

01:16:51   to figure out how to make HAL 9000 any day now.

01:16:54   Meanwhile, Sam Altman and Microsoft are over there

01:16:58   continuing to sell access to ChatGPT

01:17:00   to auto-complete stuff when you type reminders in

01:17:03   or whatever the hell they're doing to make money.

01:17:04   The reason I think both camps are a little bit weird

01:17:07   is 'cause Sam Altman's in the, as Casey alluded to before,

01:17:10   the grow, grow, grow, boil the ocean,

01:17:12   kill all the poor people so we can make another buck,

01:17:15   because in the end, what really matters

01:17:17   are the people who are gonna live a trillion years from now.

01:17:20   Lots of interesting philosophical excuses

01:17:22   to be a jerk today,

01:17:23   but really it's for the future anyway.

01:17:25   Money, money, money.

01:17:26   And then the flip side, the people who think,

01:17:28   AI is gonna come and kill us all,

01:17:29   and they need to be really careful about how they create it,

01:17:31   and you're like, what's gonna kill us all?

01:17:33   They're like, oh, the thing no one's been able to invent yet

01:17:35   but when they do, it's gonna be really bad.

01:17:37   It's like, yeah, dragons can kill us all too,

01:17:40   but there aren't any dragons.

01:17:42   Like, but there could be, you're right, there could be.

01:17:43   Someone could genetically engineer a dragon,

01:17:45   but do you have a dragon?

01:17:46   No, but I think I know how to make one.

01:17:49   All right, well get back to me about the dragon thing.

01:17:51   Yeah, so it's kind of like self-driving cars.

01:17:54   It's like, any day now, it'll come.

01:17:56   It's like, well, no one's ever done it,

01:17:57   but all these companies sprung up about the promise,

01:18:00   you know, like based on the promise of like,

01:18:02   well, you know, in five years,

01:18:03   there'll be self-driving cars,

01:18:04   so we need to build business around that reality.

01:18:07   And that reality has not happened,

01:18:09   and so a lot of those companies are having problems.

01:18:10   So OpenAI is trying to pursue HAL 9000,

01:18:15   but I don't think they know how,

01:18:18   like everyone else who has tried before them,

01:18:19   but unlike lots of other companies,

01:18:21   they've made a useful thing

01:18:22   that is a marketable consumer product

01:18:24   in the course of trying to do that research,

01:18:27   and it seems that that useful consumer product

01:18:30   is not compatible with their mission statement,

01:18:33   because that's not what they wanted to do.

01:18:34   Oh, that's cool and all.

01:18:35   That's the stuff along the way,

01:18:37   but let's not get distracted

01:18:38   by trying to make a multi-billion dollar business

01:18:41   out of this useful thing that we've made,

01:18:42   and by the way, OpenAI is not the only company

01:18:45   that's done this.

01:18:45   Lots of other companies have similar things,

01:18:47   so it is kind of a cutthroat business.

01:18:49   OpenAI seems to be the leader in this field,

01:18:52   which is why Microsoft is interested in them

01:18:53   and why they did that deal,

01:18:55   but it's not like this is unknown elsewhere.

01:18:57   What is unknown everywhere is HAL 9000.

01:19:00   That doesn't exist, and so if you're scared of it existing,

01:19:04   okay, it would be scary if it did exist, but it doesn't,

01:19:09   and I'll be more interested

01:19:10   once you think you have a road to making it.

01:19:13   Can you go from ChatGPT to HAL 9000?

01:19:16   Maybe, maybe this is step one of five,

01:19:18   maybe step one of 5000,

01:19:20   or maybe you're just barking up the wrong tree

01:19:22   and this approach isn't gonna work, any more than

01:19:23   the planes that flap their wings, we'll see,

01:19:25   but yeah, that's kind of the state of the world,

01:19:28   and these two camps, at least the Sam Altman thing,

01:19:32   you're like, okay, I've got this guy's number.

01:19:34   I've seen people like this before.

01:19:35   I know what he's doing.

01:19:37   He's making products.

01:19:38   He's selling them.

01:19:39   It makes sense, and the OpenAI people

01:19:41   are more like very confused sort of monks

01:19:45   who have noble, if misguided goals and great ambitions,

01:19:50   but what they increasingly don't have

01:19:53   are people and money to accomplish that,

01:19:55   and it's all through their own kind of,

01:19:58   I'm not gonna say mismanagement,

01:19:59   but through their own apparent misreading of the situation,

01:20:01   which is what did you think you actually have?

01:20:04   Oh, we control the company, and we can fire him,

01:20:06   but if all the employees are loyal to him

01:20:08   and he goes to Microsoft and all your employees leave,

01:20:10   then you've really got nothing left,

01:20:11   so I feel kind of bad for OpenAI

01:20:14   because I admire the nobility of their mission,

01:20:18   even if I think their reasoning

01:20:19   or predictions are a little bit silly.

01:20:22   I don't particularly admire the nobility

01:20:26   or lack thereof of Sam Altman's ambitions,

01:20:28   but at least it's a devil that I know.

01:20:30   - Yeah, and I think what they created

01:20:34   is so incredibly commercially valuable and commercializable

01:20:39   that I don't think there was any chance

01:20:42   of this going any other way.

01:20:45   The idealists in OpenAI who wanted to keep it,

01:20:48   really, nonprofit for the good of everybody,

01:20:50   that is a noble goal, it's an interesting idea,

01:20:53   but there was no way that's gonna happen.

01:20:55   - I mean, the only way it would've worked

01:20:57   if all the employees agreed,

01:20:59   if they said, "We are OpenAI employees

01:21:02   "because we believe in the mission

01:21:04   "of doing this not for profit,"

01:21:06   because if that was true, they got $10 billion

01:21:08   from Microsoft to run all their stuff,

01:21:09   and they've got 700 loyal employees

01:21:11   who want it to be nonprofit,

01:21:12   and they got one obnoxious CEO

01:21:14   who wants to make a product out of it,

01:21:15   but that wasn't the situation.

01:21:17   Turns out they had 500 employees

01:21:18   who wanted to do what the CEO did,

01:21:20   and that is a big miscalculation.

01:21:22   Maybe the OpenAI board thought,

01:21:24   "This is gonna work.

01:21:25   "All our employees believe in our mission."

01:21:27   That's why when you do those net promoter score surveys

01:21:30   at work. Marco doesn't know about these.

01:21:31   - No, yeah. - No, I remember them saying,

01:21:34   that's for our clients, but anyway,

01:21:35   internally it's like, "Do you believe

01:21:36   "in the mission of OpenAI?"

01:21:37   People are like, "Oh, yes, totally."

01:21:39   But when push comes to shove,

01:21:40   they believe in making money and getting rich.

01:21:42   - Yeah, and again, even the reality is just

01:21:45   when you have that many employees,

01:21:47   a lot of them are gonna be paid in stock,

01:21:49   or some kind of stock-based thing,

01:21:50   where the company, if the company does well commercially,

01:21:53   you make more money.

01:21:55   That's really hard for all those employees to say no to.

01:21:58   That's the thing, whatever their ambitions are or were

01:22:03   about being a nonprofit and everything,

01:22:05   again, that is laudable, and the world needs more of that,

01:22:09   but the cards were stacked so hard against them

01:22:12   to achieve that here, because what they were sitting on

01:22:15   is such a goldmine that it's just impossible

01:22:18   to sustain that in the environment around them,

01:22:21   and once they got in with Microsoft,

01:22:24   I think that pretty much was sealing the deal,

01:22:26   "Okay, this is the direction you're gonna go now."

01:22:28   - I mean, but they think they needed to do that.

01:22:31   They couldn't pursue their goal of trying

01:22:32   to make HAL 9000 without tons and tons of money,

01:22:35   and they were not able to raise it on their own.

01:22:36   They couldn't get people to say, you know what I mean?

01:22:39   It's not like a type of thing where you can just do

01:22:40   one quiet person alone at their personal computer

01:22:43   is gonna have this amazing breakthrough.

01:22:45   The current approach to solving this problem

01:22:47   requires massive amounts of computing,

01:22:49   which requires massive amounts of money,

01:22:51   and you're really gonna be stymied in the attempt

01:22:53   to achieve your goal if you just don't have enough.

01:22:55   I think they raised $130 million or something,

01:22:58   and Microsoft gave them 10 billion.

01:23:00   They weren't even close.

01:23:01   - Yeah, there was no way. The nonprofit side

01:23:08   of their leadership was inevitably going to lose,

01:23:13   and the only question now is how and when.

01:23:15   I think they just did, frankly,

01:23:18   and I think it's only a very short matter of time,

01:23:21   possibly tonight, but it's a very short matter of time

01:23:26   before that entire part of the company,

01:23:28   including the board, is cleared out and replaced.

01:23:32   - Yeah, and also, it's not even clear to me.

01:23:34   I know that OpenAI is viewed as the leader in this area,

01:23:37   but they're not the only company doing large language models.

01:23:40   Like, everybody's got one.

01:23:41   Facebook's got one, Apple reportedly has one,

01:23:43   Google's got one.

01:23:44   Everyone is working on this technology.

01:23:47   It is not like a secret sauce that only OpenAI has.

01:23:49   Maybe they're the best at it.

01:23:50   Maybe they were there first.

01:23:51   Maybe they have advancements we don't know about,

01:23:54   but it's a type of thing that, you know,

01:23:57   it's kind of like, you know, so the iPhone is the best phone

01:24:00   but Android phones exist.

01:24:02   It's not like Apple is the only company in the world

01:24:03   to have a touchscreen smartphone.

01:24:05   It is a thing that exists in the industry,

01:24:07   and that's a hard choice

01:24:08   'cause it's a platform type thing, but like anything,

01:24:10   like SSDs, everybody's got SSDs.

01:24:11   It's not one company that says,

01:24:13   "Oh, Apple's laptops are better

01:24:14   "'cause they don't have spinning hard drives

01:24:15   "and everyone else does."

01:24:16   No, everybody has SSDs.

01:24:17   It's just who has the best SSDs.

01:24:18   Like, these large language models are so useful

01:24:21   and so widespread, and the fact that OpenAI,

01:24:25   especially in the beginning of their nonprofit existence,

01:24:29   shared all their technology with the world,

01:24:31   they shared their discoveries,

01:24:32   like that was part of their charter

01:24:33   and their mission statement.

01:24:34   Everyone's more or less on the same page with these things.

01:24:36   They're a useful technology.

01:24:38   Everyone is thinking of new ways that they can use them

01:24:41   in their existing products and make new products

01:24:43   that they weren't able to make before

01:24:46   with this new technology.

01:24:48   Whatever happens to OpenAI, that's gonna happen, right,

01:24:51   which is the other absurd thing about the board saying,

01:24:54   "We wanna keep this purity of nonprofit."

01:24:56   Like, even if all of your employees and Sam Altman

01:24:59   agree with you, the whole rest of the industry

01:25:01   is taking this ball and running with it

01:25:02   because it's a useful technology

01:25:04   that you can make money with,

01:25:05   and so they're sticking it in everything.

01:25:07   Will it go beyond that?

01:25:09   Will it, like, you know, the future of these type of things,

01:25:11   will they just get better and better

01:25:12   until they're practically HAL 9000?

01:25:14   We'll see.

01:25:15   Right now they're not, but they're useful.

01:25:16   They're useful right now.

01:25:17   And so everyone wants to have access to them,

01:25:20   and if OpenAI had disappeared from existence,

01:25:23   the rest of the industry would continue forward

01:25:26   with, you know, making fancy code autocomplete

01:25:28   in Xcode 16, for example.

01:25:30   That's gonna happen with or without OpenAI.

01:25:32   But HAL 9000, I don't know if there are any other companies

01:25:36   with anything close to access to the resources

01:25:39   that OpenAI has that are attempting to do that.

01:25:42   Part of the fear of the OpenAI board

01:25:43   is it's just gonna happen accidentally,

01:25:45   and so we better do it deliberately so we can do it right.

01:25:47   I don't think it's gonna happen accidentally

01:25:49   any more than self-driving cars

01:25:50   are gonna happen accidentally, but we'll see.

01:25:52   - Let's do some Ask ATP.

01:25:54   Tim Schmitz writes,

01:25:55   "What's the story with ECC RAM in the Apple Silicon era?

01:25:58   I remember it being important to some folks on Intel,

01:26:00   but I haven't heard much about it lately.

01:26:02   Is it not applicable or relevant on Apple Silicon?

01:26:04   If so, why?"

01:26:06   John, would you mind giving us a quick rehash as

01:26:08   to what ECC RAM is, and then answer Tim's question, please?

01:26:12   - ECC, that stands for Error Correcting Code.

01:26:15   It's a technology used in RAM chips

01:26:17   that will account for errors in the memory

01:26:21   and be able to correct them if the errors are small enough.

01:26:23   So RAM chips, they're not perfect.

01:26:26   They're tiny little analog electronic devices.

01:26:29   What you would hope is that you would write some bits

01:26:30   to one location, and when you read it later,

01:26:31   you get the exact same bits back that you wrote there.

01:26:33   That's how it's supposed to work.

01:26:35   But sometimes you'll read,

01:26:36   and one of those bits will be wrong,

01:26:38   or maybe two of the bits will be wrong.

01:26:40   And why?

01:26:41   Well, electrical issues with the manufacturing of the chips,

01:26:44   cosmic rays, all sorts of things can go wrong.

01:26:47   And you would think,

01:26:48   who cares if one bit is flipped here or there?

01:26:50   But as RAM sizes have increased,

01:26:53   the odds of finding one of those one or two bit errors

01:26:58   has increased as well, because if you figure,

01:27:00   oh, that's gonna be one in a million,

01:27:02   well, how many bits do you think there are in RAM?

01:27:02   Back when you had eight kilobytes of RAM,

01:27:04   there weren't as many bits

01:27:05   as when you have 128 gigabytes of RAM.

01:27:07   So how many errors are there?

01:27:08   More than you might think.
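To put a rough number on John's point, here is a back-of-the-envelope sketch. The per-bit error rate used below is a made-up illustrative figure (real measured DRAM error rates vary widely between studies), and the function name is just for this example:

```python
# Rough illustration of why bigger RAM means more bit flips.
# The per-bit rate here is hypothetical, chosen only to show the scaling.

def expected_flips_per_hour(ram_bytes: int, per_bit_per_hour: float) -> float:
    """Expected number of single-bit errors per hour across all of RAM."""
    n_bits = ram_bytes * 8
    return n_bits * per_bit_per_hour

KiB = 1024
GiB = 1024 ** 3

# An 8 KB machine vs. a 128 GB machine at the same (hypothetical) per-bit rate
rate = 1e-12  # hypothetical: one error per trillion bit-hours
print(expected_flips_per_hour(8 * KiB, rate))    # vanishingly rare
print(expected_flips_per_hour(128 * GiB, rate))  # on the order of one per hour
```

Same per-bit reliability, but ten billion times more bits turns a once-in-a-blue-moon event into a routine one, which is the whole argument for error correction.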

01:27:09   So as RAM sizes have increased,

01:27:12   one of the technologies that was introduced

01:27:14   on really important computers is ECC,

01:27:17   where they say, okay, we store some extra information,

01:27:20   which costs money,

01:27:21   because you're storing more information.

01:27:23   And with that extra information,

01:27:24   we can tell when one of the bits

01:27:26   is not what it's supposed to be.

01:27:27   And in fact, sometimes we store enough information

01:27:28   that we can tell what it was supposed to be,

01:27:30   and we can fix it and change it back.

01:27:32   And so your RAM gives the correct results,

01:27:34   unlike Casey's third-party RAM in his iMac.
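The scheme John is describing, where a few extra stored check bits let the hardware catch and repair flipped bits, is the classic SECDED ("single error correct, double error detect") idea. Here is a toy sketch using a Hamming(7,4) code plus an overall parity bit; real ECC DIMMs apply the same trick to 64-bit words with 8 check bits, and these function names are illustrative, not from any real memory controller:

```python
# Toy SECDED sketch: 4 data bits protected by 3 Hamming check bits
# plus one overall parity bit, stored together as 8 bits.

def encode(nibble: int) -> int:
    """Encode 4 data bits as 8 stored bits: Hamming(7,4) plus overall parity."""
    d = [(nibble >> i) & 1 for i in range(4)]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    bits = [p1, p2, d[0], p3, d[1], d[2], d[3]]  # Hamming positions 1..7
    parity = 0
    for b in bits:
        parity ^= b                               # overall parity over all 7
    return sum(b << i for i, b in enumerate(bits)) | (parity << 7)

def decode(word: int):
    """Return (data nibble, status): 'ok', 'corrected', or 'uncorrectable'."""
    bits = [(word >> i) & 1 for i in range(7)]
    total = (word >> 7) & 1
    for b in bits:
        total ^= b          # 0 if an even number of stored bits flipped
    # Each syndrome bit re-checks the positions its check bit covers;
    # a nonzero syndrome is the 1-based position of a single flipped bit.
    s1 = bits[0] ^ bits[2] ^ bits[4] ^ bits[6]
    s2 = bits[1] ^ bits[2] ^ bits[5] ^ bits[6]
    s3 = bits[3] ^ bits[4] ^ bits[5] ^ bits[6]
    syndrome = s1 | (s2 << 1) | (s3 << 2)
    if total == 1:          # odd number of flips: treat as one flip and fix it
        if syndrome:
            bits[syndrome - 1] ^= 1
        status = "corrected"  # (syndrome 0 means the parity bit itself flipped)
    elif syndrome:          # even flips but failing checks: a two-bit error
        return 0, "uncorrectable"
    else:
        status = "ok"
    # Data bits live at Hamming positions 3, 5, 6, 7 (indices 2, 4, 5, 6)
    return bits[2] | (bits[4] << 1) | (bits[5] << 2) | (bits[6] << 3), status

word = encode(0b1011)
assert decode(word) == (0b1011, "ok")
assert decode(word ^ (1 << 4)) == (0b1011, "corrected")  # 1 flip: repaired
assert decode(word ^ 0b10001)[1] == "uncorrectable"      # 2 flips: detected
```

Flipping any single stored bit still decodes back to the original nibble, while any two flipped bits are flagged as uncorrectable, which is exactly the "detect and/or correct" behavior John mentions.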

01:27:37   And the kind of computers that would do that

01:27:39   were expensive server computers

01:27:40   that people would run important stuff on,

01:27:43   like financial things or banking transactions,

01:27:45   or basically server hardware.

01:27:47   Server hardware would have ECC RAM,

01:27:48   because it's really important

01:27:50   that the contents of RAM be correct.

01:27:52   The consequences of even a single bit being flipped

01:27:54   could cause a security problem or a crash

01:27:57   or something else that you don't want to happen,

01:27:59   because it's a hardware failure.

01:28:00   Back in the day, when the Mac Pro

01:28:02   was a little bit more pro, it came with ECC RAM,

01:28:05   because Intel Xeon CPUs

01:28:09   were their sort of server-class CPUs,

01:28:11   and they supported ECC RAM, and Apple would buy,

01:28:14   you know, the CPUs from Intel,

01:28:17   and they would get motherboard chipsets from Intel,

01:28:20   and they would support ECC RAM,

01:28:21   and they would put that on their Mac Pro motherboards.

01:28:25   These days, we know what the situation with the Mac Pro is.

01:28:28   It is very much like a Mac Mini

01:28:30   in a much, much, much, much, much, much bigger case

01:28:32   with some extra stuff thrown in there.

01:28:34   These days, Apple does not do as many things

01:28:38   to differentiate the Mac Pro

01:28:40   from the less expensive and vastly smaller computers,

01:28:44   as in they're not designing entirely different RAM,

01:28:48   entirely different SOC, entirely different anything, really,

01:28:52   for the Mac Pro except the big case

01:28:54   and some PCI Express stuff.

01:28:56   The parallel story here is,

01:28:57   and I don't know the details on this,

01:28:58   but as RAM has advanced over the years,

01:29:03   a lot of the new RAM designs

01:29:06   necessarily incorporated a technology much like ECC

01:29:10   just to function correctly,

01:29:11   because they would have various arrangements of bits

01:29:14   or speed enhancements or whatever

01:29:16   such that they would incorporate some kind of error correction

01:29:19   within just the regular functioning of the RAM.

01:29:21   You wouldn't call it ECC RAM.

01:29:22   You'd just call it DDR4 or whatever.

01:29:24   But if you looked at what's inside DDR4 RAM,

01:29:28   you'd find some parts of the circuitry

01:29:30   that are there to try to detect and/or correct errors.

01:29:34   That leads us to the question of ECC RAM in Macs today.

01:29:38   Is this something that Apple should have

01:29:41   in their Mac Pros and don't?

01:29:42   Do all Macs with the unified memory architecture,

01:29:46   where the RAM is part of the ARM chip package, have ECC RAM?

01:29:49   I would say that much like the keyboards

01:29:50   and everything having to do with the Mac Pro,

01:29:52   the situation is this.

01:29:54   Apple could spend more money to essentially make RAM

01:29:58   that is more resilient to errors in the Mac Pro

01:30:02   and only in the Mac Pro,

01:30:03   because that is the model that costs the most,

01:30:05   and it's where it's most important

01:30:07   that you not have any bit flip errors.

01:30:09   Apple does not do that.

01:30:10   So could it benefit from it?

01:30:14   Yes, it could.

01:30:15   It would cost more money and it would be better.

01:30:16   Would it be better in a way that anybody would care about?

01:30:19   I'm not sure, because again,

01:30:21   I think the RAM that they do use,

01:30:23   like what is the reliability of that RAM

01:30:26   compared to the ECC RAM that was in the old Intel Mac Pros?

01:30:29   Like in absolute numbers, not in terms of like,

01:30:31   well, back then ECC RAM was better than regular RAM.

01:30:35   Today, again, more expensive RAM

01:30:37   with more complicated error correcting

01:30:39   would be better than RAM without it.

01:30:41   But in absolute values, are bit flip errors

01:30:45   in the M2 Ultra more of a problem

01:30:48   than they were on a Xeon with ECC RAM?

01:30:50   I don't know.

01:30:51   So I would file this under the category of,

01:30:54   if they offered it, I would like it and think it's good.

01:30:57   And practically speaking, especially back in the old days,

01:30:59   ECC RAM did provide better reliability

01:31:02   than that same amount of RAM that wasn't ECC.

01:31:05   So, you know, if you're wondering

01:31:07   why did my computer have some random crash sometime,

01:31:09   again, ask Casey about this,

01:31:10   was it because you had some kind of problem with the RAM

01:31:12   and would it have been prevented by ECC?

01:31:15   Hard to say, but maybe.

01:31:16   Like there's a reason they sold it

01:31:17   on all those server chips for a long time.

01:31:19   And if you had a server that didn't have ECC

01:31:20   in a data center next to one that did

01:31:22   and you had enough of them,

01:31:23   you could see the difference in reliability,

01:31:26   you know, across hundreds and hundreds of servers,

01:31:28   ECC versus non-ECC.

01:31:29   Like people weren't paying extra for that ECC RAM

01:31:31   just for the hell of it.

01:31:32   It actually did provide a benefit.

01:31:33   So I think it is applicable.

01:31:36   I think it is relevant to Apple Silicon.

01:31:38   I think we're never gonna get it in the Mac Pro.

01:31:40   So don't hold your breath.

01:31:41   - Chris Glime writes,

01:31:42   "I've undertaken the task of photographing my son's art,"

01:31:45   he just turned five,

01:31:46   "to replace my previous strategy,

01:31:48   which was throwing it in a cardboard box

01:31:49   in the storage room and forgetting about it for years.

01:31:52   Ideally, I will only keep a few of these items

01:31:54   and toss the rest.

01:31:55   And I'm planning to make a photo book

01:31:57   for the grandparents for Christmas.

01:31:58   Thankfully, I was able to borrow

01:31:59   some lighting equipment from work

01:32:00   and I have a decent camera.

01:32:01   I was wondering if you all had a strategy for this.

01:32:04   My strategy is generally speaking,

01:32:06   throwing it in a cardboard box in the storage room

01:32:07   and forgetting about it for years."

01:32:09   (laughing)

01:32:10   So no, I have no good answers for this, unfortunately.

01:32:13   Marco, do you have anything?

01:32:15   - No, I mean, for the most part,

01:32:18   most of my kids' art that is really,

01:32:22   that was like when he was old enough to care

01:32:24   and put time into it,

01:32:26   most of that has actually been in the digital realm.

01:32:29   Like, he's had an iPad for a while

01:32:32   and my wife is really artistic

01:32:34   and introduced him early on to Procreate,

01:32:38   which is this iPad drawing app that everybody uses.

01:32:41   And he does a lot in there.

01:32:44   And so most of what he has artistically

01:32:47   is either stuff on pieces of paper

01:32:49   that are easy to store in different places,

01:32:51   not a lot of three-dimensional stuff,

01:32:53   or drawings in Procreate on iOS,

01:32:55   which of course we back up and everything.

01:32:57   So I don't really have this problem

01:33:00   as much as many parents do

01:33:02   just because my kids' art format is more easily storable.

01:33:07   - Jon?

01:33:09   - My strategy is to take photographs of art

01:33:11   so I can get rid of it because we have the same situation,

01:33:13   just boxes and boxes of the stuff.

01:33:15   For the most part, I just take iPhone pictures.

01:33:17   Like, it's really, especially with just like

01:33:19   rumply kid artwork on construction paper with glue

01:33:22   and the corners are curling, like this, you know,

01:33:25   Chris sent in a picture of like,

01:33:29   he had like a lighting setup and it's a big complicated thing,

01:33:29   I'm just like, put it down on someplace with good sunlight

01:33:32   on the floor and take pictures of it with my iPhone.

01:33:34   'Cause that's all I want.

01:33:35   I can see these are not great works of art

01:33:37   that I need to preserve for future generations.

01:33:38   It's just so I remember, oh, that thing that he made.

01:33:40   And I look at the pictures and make sure they're clear

01:33:42   and not blurry and I can see all the things.

01:33:44   And if there's one that I really, really cared about,

01:33:46   I could square up the edges and put it in Photoshop

01:33:48   and, you know, try to fix the shadows of the curling corners

01:33:51   and do all that stuff.

01:33:52   But I think the, you know, the main goal is kind of like,

01:33:57   you know, like scanning old printed photos or negatives:

01:33:59   get it out of the physical realm and into the digital ASAP.

01:34:02   And then you just can't hold onto that stuff.

01:34:04   Kids make a lot of drawings.

01:34:05   So you don't wanna throw it away and never see it again.

01:34:07   But just, you know, use your iPhone,

01:34:09   take a picture and then get rid of it.

01:34:12   - That's also important to kind of distinguish,

01:34:14   like, a lot of kid art that comes home from school,

01:34:19   what, you know, depending on your kid, of course,

01:34:21   you know, in my case, a lot of times it was very easy

01:34:23   to see like, okay, the teacher told him to do this.

01:34:27   And, you know, whatever came home was not really a work

01:34:30   by him, it was a work by the teacher's lesson plan.

01:34:32   And that's fine, there's a place for that.

01:34:34   But it was very obvious when he had made something

01:34:38   that he really cared about, rather than just follow

01:34:41   the template the teacher set out for everybody.

01:34:43   And so for the follow the template stuff,

01:34:46   he didn't care about it and so we didn't have to care

01:34:48   about it, you know, and like, our level of care

01:34:51   would ramp up with his level of care and involvement.

01:34:54   And so when he was doing something like on his own,

01:34:57   you know, of his own accord with his own direction,

01:35:00   that stuff is significantly better and worth keeping

01:35:03   and there's way less of that as a kid goes through school

01:35:06   compared to template stuff that, you know, everyone did.

01:35:10   - Yeah, and there's a hierarchy, like the other thing

01:35:11   you should do with them, aside from taking pictures

01:34:13   and throwing them out, take the really good ones

01:35:15   and frame them.

01:35:16   We have a bunch of select things from various points

01:35:18   in our kids' lives that are in frames, hanging on the wall.

01:35:20   That's another way to preserve them.

01:35:22   Also take digital pictures of those, by the way.

01:35:24   But, you know, you get rid of most of it,

01:35:27   you take a little extra credit, take photos of the gems,

01:35:30   and then the really good ones, put in frames,

01:35:31   put up around your house.

01:35:33   Jimmy West writes, "Do you think the use of technology

01:35:35   "actually makes your lives easier, or if you do it

01:35:38   "because it's more of a hobby?

01:35:40   "These days I try to have as little technology

01:35:41   "in my home as I can.

01:35:43   "I have regular light switches that make the lights

01:35:44   "go on and off, and I don't have any home automations

01:35:47   "of any kind.

01:35:48   "I don't notice my life being appreciably worse

01:35:50   "because of this.

01:35:51   "In fact, since reducing the use of technology,

01:35:54   "my life has become consistently calmer

01:35:56   "and more stress-free.

01:35:57   "The same is true for my computer these days.

01:35:58   "Time was my menu bar would be filled with stats

01:36:00   "and custom functionality."

01:36:02   Are we sure this wasn't John writing this?

01:36:03   (laughing)

01:36:04   - No, his menu bar never had stats in it.

01:36:07   - "Along with custom icons in the Finder

01:36:09   "and automation scripts, all of which required

01:36:10   "ongoing maintenance and updating.

01:36:12   "Now I have a pretty vanilla installation of Mac OS.

01:36:14   "In my second sideline career of being an author,

01:36:16   "I only really hit my stride productivity-wise

01:36:18   "when I got rid of all the complicated software

01:36:20   "and decided to just use a plain and simple text editor.

01:36:23   "Maybe I just like the simple life, and these days

01:36:25   "I only use as much technology as I need.

01:36:27   "But I'm interested to know where the trade-off lies

01:36:30   "for you with the added minor stresses that come

01:36:32   "with incorporating and maintaining technology

01:36:34   "in your home life.

01:36:35   "Do you only use technology which makes your life

01:36:37   "genuinely easier or better?

01:36:38   "Or is there a part of you that convinces yourself

01:36:40   "it's useful because tinkering with tech

01:36:42   "is a bit of a hobby and you find it fun to do?"

01:36:45   I understand where Jamie's coming from here.

01:36:47   I think for me, it's both.

01:36:50   I think unquestionably, it's a hobby.

01:36:53   One does not come up with a three-Raspberry-Pi solution

01:36:57   to knowing whether or not your garage door is open

01:36:59   if it isn't a hobby.

01:37:01   That is just straight up lunacy, right?

01:37:04   But knowing when the garage door is open

01:37:07   does indeed make my life better.

01:37:09   So it's a little of both.

01:37:10   I don't have a lot of automations.

01:37:12   In fact, I just went on Automators just a couple,

01:37:15   I think it was probably a month ago now,

01:37:17   and talked about some of the automations I have.

01:37:20   And I don't really have that much.

01:37:23   But the ones I have, I genuinely do think

01:37:26   make my life better.

01:37:27   Would I be fine without them?

01:37:28   Of course, I'd be just fine without them.

01:37:31   But they do make my life a little bit better,

01:37:32   and I like that.

01:37:33   And again, so much of this though is

01:37:37   I like certain kinds of tinkering.

01:37:40   I like certain kinds of projects and stuff like that.

01:37:44   And yeah, this is certainly a hobby.

01:37:46   And ultimately, when I come up with a solution for something,

01:37:52   even if it's just an out of the box solution,

01:37:55   as I have been espousing for a little over a year now

01:37:58   and Marco is now starting to join me,

01:38:00   one of us, one of us,

01:38:01   when you have your Sonos playing the same music

01:38:04   in an uninterrupted way,

01:38:06   and without any sort of latency across your entire house,

01:38:09   including your backyard, that's pretty rad.

01:38:12   And granted, that's not as much tinkering as it is

01:38:15   spending a copious amount of money,

01:38:17   but nevertheless, that does make my life better.

01:38:19   That is tech that makes my life better.

01:38:21   So I see it definitely as a lot of column A

01:38:25   and a lot of column B.

01:38:26   But that's where I come down on it.

01:38:27   I don't know, Marco, where do you land?

01:38:29   - I think it depends a lot on the role of technology

01:38:34   in your life.

01:38:35   Are you an enthusiast about technology

01:38:37   and you want to explore stuff like that?

01:38:39   Or are you using it more as a tool to get your job done

01:38:44   and your job might be something else

01:38:46   or something that the technology

01:38:48   is only playing an assistive role in?

01:38:50   And this changes at different points in people's lives

01:38:52   and for different reasons.

01:38:54   For me, when I was younger,

01:38:56   especially my high school and college days,

01:38:59   I was much more of a tinkerer.

01:39:02   And I would spend a lot more of my time,

01:39:04   especially on the software side,

01:39:05   a lot more of my time messing with my,

01:39:09   I didn't have a menu bar yet 'cause I was on Windows,

01:39:11   but messing with my Windows setup

01:39:12   and installing a bunch of different system utilities

01:39:14   and doing up a whole bunch of power user stuff

01:39:16   like tweaking things and messing around with the registry

01:39:19   and making my start button say Marco and stuff.

01:39:22   All sorts of that kind of tinkering stuff that you do,

01:39:25   especially as a young person,

01:39:27   because back then my priorities were,

01:39:31   well, try to get a girlfriend,

01:39:33   but that wasn't going so well.

01:39:34   So instead, I spent as much time on my computer as possible.

01:39:37   (laughing)

01:39:38   And so I was using the computer back then

01:39:42   for the sake of using the computer.

01:39:43   That was the activity.

01:39:44   The activity I was spending most of my time on

01:39:46   was messing around on my computer.

01:39:48   - And actually to interrupt quickly,

01:39:49   because I was very much in the same boat,

01:39:52   a lot of the tinkering was finding an excuse.

01:39:55   I think this is what you're driving at,

01:39:56   but finding an excuse to need more time on the computer.

01:40:01   Like, oh, I don't really need my start button to say Marco,

01:40:04   but if I wanted to figure out how to do that,

01:40:07   that's more time I can spend using this thing,

01:40:09   which is both a toy and a hobby and everything to me.

01:40:13   Like, I probably speak for Marco

01:40:16   and actually probably John as well

01:40:18   in saying that the computer was like everything to me

01:40:21   when I was young because there was just limitless opportunity

01:40:25   to do anything there in a way that I think

01:40:27   is both there today and is very, very different today.

01:40:30   But finding all these things to tinker with

01:40:33   was in no small part for me anyway,

01:40:36   giving me reasons to continue to sit at the computer.

01:40:38   - Yeah, oh, totally, me too.

01:40:40   Like, my activity for the day would be

01:40:42   I'm gonna sit at my computer and I'll find stuff to do.

01:40:45   (laughing)

01:40:46   And largely, today, I actually still largely do that.

01:40:51   I just have more things that I need to do

01:40:54   that are at the computer and I have more things in my life

01:40:58   that break up those times so that I have less time

01:41:02   at the computer than I used to.

01:41:04   But as you go through life, your priorities change,

01:41:08   your needs change, and for the most part,

01:41:12   most people do some degree of specialization.

01:41:15   You specialize in some part of your career

01:41:18   or your hobby life or your family life, whatever,

01:41:20   and that necessitates having less time spent

01:41:24   messing around with stuff like this.

01:41:26   And some people, they make the messing around their career.

01:41:28   To some degree, we do that here by being on a tech podcast,

01:41:32   although, you mentioned Automators and other shows

01:41:36   where there are people who go way more in depth

01:41:39   with that side of things than we ever do.

01:41:43   And they've kind of made that a part of their career,

01:41:45   so in some ways, for them, it's both work and play.

01:41:49   The same way, for us, buying new Macs is both work and play.

01:41:54   But for me personally, as I have gone through a career

01:42:01   and adulthood and adult responsibilities

01:42:05   and family responsibilities, my tolerance for tech

01:42:10   that requires a lot of tinkering has gone down over time.

01:42:14   And that happens to a lot of people.

01:42:17   Oftentimes, you find yourself trading money for time,

01:42:22   which is the opposite of when you're in college,

01:42:24   you have no money and tons of time.

01:42:26   And as you get older, a lot of times, you're like,

01:42:29   you know what, rather than figure out how to hack myself

01:42:32   into not needing this extra terabyte of disk space,

01:42:36   I'm just gonna buy the one with a terabyte more

01:42:38   of disk space because your trade-off is different

01:42:42   at that point in your life.

01:42:43   And for me, for what I'm doing this part of my life,

01:42:48   there are some areas that I'm still willing

01:42:51   to do some degree of technological tinkering.

01:42:54   For instance, I mentioned a few episodes ago,

01:42:56   I've been burning these Blu-ray 100 gig M-DISC things

01:43:01   for archiving data.

01:43:03   I'm still doing that.

01:43:03   I've burned 26 of them so far.

01:43:06   I have my entire photo library, Tiff's entire photo library,

01:43:09   and now my entire music library all backed up

01:43:11   on these really weird disks.

01:43:14   It took forever.

01:43:15   So there are certain areas like that that I will try,

01:43:19   partly for show content to talk about here,

01:43:21   partly because I'm just interested in them,

01:43:23   but my tolerance for tech that doesn't work very well

01:43:27   or that requires lots of messing with is down to almost zero.

01:43:32   So for instance, smart home stuff is a great example

01:43:35   that Jamie brought up, and this is such a great example.

01:43:37   Smart home stuff leaves you infinite potential

01:43:41   for tinkering, but most of it really doesn't work very well.

01:43:46   Or it'll work for a few weeks,

01:43:48   and then something will change or break,

01:43:51   and then you gotta redo everything

01:43:54   to make it ever work again,

01:43:56   or you have to add these different bridging

01:43:59   and hacking projects or tricks to get this thing

01:44:04   to talk to that thing and then involve this web service

01:44:07   so that this thing can bounce things off the web service

01:44:08   and send this thing and this other thing.

01:44:11   That's where you lose me.

01:44:12   Once you get into that stuff, that's where you lose me,

01:44:15   because that doesn't make things better for me.

01:44:18   But for a lot of people, they are on a different part

01:44:21   of that trade-off continuum of like,

01:44:25   do you want to spend time tinkering

01:44:27   or do you wanna spend time just having stuff work for you

01:44:30   and not messing with it?

01:44:32   So for everyone, it's different, but as I said,

01:44:35   for me, I'm closer now to I just need things to work

01:44:40   because I need to make time for the parts

01:44:45   of my computing life that are more important to me

01:44:47   so that when I do get that precious time sitting

01:44:50   in front of the computer where I don't have

01:44:52   a different obligation that I must be doing at that moment,

01:44:55   I'm able to do the things I really wanna do,

01:44:58   like work on Overcast, like deep programming work

01:45:01   or really creative work.

01:45:04   That's what I wanna make time to do.

01:45:07   And the more I have to mess with my technology,

01:45:10   the less time I have for those higher priority things.

01:45:14   - Jon?

01:45:16   - Yeah, so asking three programmers on a tech podcast

01:45:20   is probably gonna give you a different answer

01:45:21   than the rest of the population.

01:45:22   But yeah, for me, tech has been my hobby, obviously,

01:45:27   since early childhood.

01:45:29   It's also an area of interest,

01:45:31   which I think is different than a hobby

01:45:32   'cause when people hear hobby, they're like,

01:45:33   oh, it's a thing that you're doing.

01:45:35   It's an interest of mine as well,

01:45:37   kind of in the same way that like cars are.

01:45:40   Like I don't, I spend all this time reading about cars

01:45:43   that I'm never going to own.

01:45:45   A lot of them I'll never even see in person.

01:45:47   It's like, is your hobby cars

01:45:49   or like all these car rebuilding videos?

01:45:50   Are you working on cars?

01:45:51   No, I'm never doing any of that, but I'm interested in it.

01:45:55   I find it interesting.

01:45:56   It is an interest of mine.

01:45:57   So a lot of the tech stuff that I'm into, it's an interest.

01:46:00   Even if I'm never gonna do it, I'm interested in it.

01:46:03   I wanna learn about it.

01:46:04   I wanna know about it.

01:46:05   I find it interesting.

01:46:06   So it is more than just a hobby.

01:46:08   And then of course it was my profession as well

01:46:10   and not in the way that like you use a computer

01:46:13   to help you do your job.

01:46:14   Computers were the job.

01:46:16   It's kind of like being a builder

01:46:17   and you build a hospital.

01:46:18   It's like, oh, are you really super into hospitals?

01:46:20   Like, no, I'm a builder.

01:46:21   I build things.

01:46:22   And then you build a fire station.

01:46:23   Oh, I guess you really love fire stations?

01:46:24   Like, no, I'm a builder.

01:46:26   I'll build anything.

01:46:27   I like the task of building.

01:46:30   Well, I was a computer programmer and yes, I was a web dev.

01:46:33   So it was more narrow than just being a builder,

01:46:35   but like I built so many different things

01:46:37   with web technology,

01:46:39   depending on what company I was working for.

01:46:41   So my job was wrangling the computers, right?

01:46:44   The computers were a profession directly.

01:46:47   Not that I'm using a computer to write a novel

01:46:49   and I'm a novelist.

01:46:51   I'm literally wrangling the computers for you.

01:46:54   And it's my hobby and it's my interest.

01:46:57   So as you can imagine, the balance in my life

01:47:01   about trade-offs for technology was massively tilted

01:47:03   towards the tech side.

01:47:05   And the way that manifests in daily life

01:47:07   and sort of home life is for somebody like me,

01:47:10   and I imagine Marco and Casey as well,

01:47:12   the personal payoff of getting something set up

01:47:17   the way you want it,

01:47:20   finally got my eMac set up the way I want it,

01:47:22   the personal payoff of getting

01:47:23   the three Raspberry Pi solution or whatever

01:47:26   for people like us very often balances

01:47:30   the amount of tinkering required to get there.

01:47:32   Even Marco doesn't want to deal with a lot of stuff.

01:47:35   He does deal with a lot of stuff,

01:47:36   even if it's just buying and returning products

01:47:38   and getting them set up and seeing how they work.

01:47:40   There is some amount of tinkering

01:47:42   and engaging in that hobby.

01:47:44   And the reward is when you come in

01:47:46   and it's finally set up the way you want

01:47:47   and the music plays the way you want

01:47:49   and you use the voice command, the lights go on

01:47:50   and the automatic thermostat does all the things,

01:47:53   you get a level of satisfaction out of that.

01:47:55   And if you are in a house with people

01:47:57   who are not exactly like you in that regard,

01:48:00   you have something to compare it to,

01:48:01   which is, yeah, they like it fine when the lights work

01:48:05   and you can watch stuff on the TV.

01:48:07   But you're over there going,

01:48:10   see, did you see how well that worked?

01:48:12   They didn't do any of the work to set it up.

01:48:14   And this amount of satisfaction they're getting

01:48:17   out of this pristine 4K television signal

01:48:20   going to your fancy TV that you researched,

01:48:21   they're like, oh, I guess TV works.

01:48:23   They didn't even have to do the work.

01:48:25   And their scales, it takes so little

01:48:28   to unbalance their scales.

01:48:29   That TV doesn't work one time, this is garbage.

01:48:32   I hate everything that you've done here,

01:48:33   get rid of it, right?

01:48:35   And we're willing to put in hours and hours

01:48:37   of buying products and hooking them up

01:48:38   and doing all this stuff or whatever,

01:48:39   if we want to get that satisfaction,

01:48:41   now it's finally working the way I want.

01:48:42   Even in Marco's case, it finally works

01:48:45   without me having to deal with it.

01:48:46   You put in effort to get to that goal.

01:48:49   No one else, even though no one else is asked

01:48:52   to put in any of that effort,

01:48:53   so if there's any downside whatsoever,

01:48:55   they're like, oh, scales unbalanced, I don't like this.

01:48:58   I just want the light switches to work.

01:48:59   If the lights don't turn on one time,

01:49:01   even though they put in zero amount of effort

01:49:03   to make this home automation, it's like, nope.

01:49:06   Every time the lights do come on,

01:49:09   do you think they're getting some amazing satisfaction

01:49:11   from knowing about the giant Rube Goldberg machine

01:49:13   that you put together to make that happen?

01:49:14   Nope, nope, they just want the lights to work.

01:49:16   And so everyone has to balance that differently.

01:49:19   And I think maybe the key to at least family happiness

01:49:23   within a household is to realize

01:49:24   that if it is your hobby, your interest,

01:49:27   your profession, or God forbid, all three,

01:49:30   your satisfaction when everything comes together

01:49:36   is not experienced by other people.

01:49:38   And so the second something doesn't work,

01:49:42   they will give it a thumbs down.

01:49:44   And when it does work, they will think nothing of it.

01:49:46   And so keep that in mind when trying to balance,

01:49:50   like, is it a benefit to your life,

01:49:52   or is it like a hindrance?

01:49:54   You're not the only one, if you're living with other people,

01:49:57   you're not the only one who's in play here.

01:49:59   You have to think about how it's affected

01:50:00   everyone else's life.

01:50:01   And by the way, as you know, if they're unhappy,

01:50:04   you're probably also gonna be unhappy.

01:50:06   So it's not as if you can say,

01:50:08   well, I'm satisfied with this trade off or whatever.

01:50:11   If everyone else in the house hates it,

01:50:12   you will eventually not be satisfied with that trade off.

01:50:14   So self reflection is the right approach,

01:50:17   including being true to yourself and knowing,

01:50:20   I do like this stuff.

01:50:21   I am interested in it.

01:50:22   Like, for Casey, part of the project is

01:50:25   he wants to do it for the sake of doing it,

01:50:26   and it's fun, and he feels a sense of competence

01:50:28   when he's done, and he has to realize that's a him thing.

01:50:30   (laughing)

01:50:32   That is not a rest of the house thing.

01:50:33   And let people have hobbies.

01:50:35   Like, let people enjoy their interests.

01:50:37   That's what life is all about.

01:50:38   You know, just like if you have hobbies and interests,

01:50:40   engage in them in a constructive way

01:50:42   and get as much enjoyment out of it as you can.

01:50:44   And hopefully you enjoy your work

01:50:46   and hopefully you're doing something in your profession

01:50:47   that is also something that you enjoy,

01:50:50   or at least have some kind of interest in,

01:50:52   but be aware that other people have different opinions

01:50:54   about the thrill they get when it works

01:50:57   and the feeling they get when it doesn't work.

01:51:00   - Thanks to our sponsor this week, Squarespace.

01:51:03   And thanks to our members who support us directly.

01:51:05   You can join us at ATP.fm/join.

01:51:08   And we will talk to you next week.

01:51:11   (upbeat music)

01:51:13   ♪ Now the show is over ♪

01:51:16   ♪ They didn't even mean to begin ♪

01:51:18   ♪ 'Cause it was accidental ♪

01:51:20   ♪ Accidental ♪

01:51:21   ♪ Oh, it was accidental ♪

01:51:22   ♪ Accidental ♪

01:51:24   ♪ John didn't do any research ♪

01:51:26   ♪ Marco and Casey wouldn't let him ♪

01:51:29   ♪ 'Cause it was accidental ♪

01:51:31   ♪ Accidental ♪

01:51:32   ♪ Oh, it was accidental ♪

01:51:33   ♪ Accidental ♪

01:51:34   ♪ And you can find the show notes at ATP.fm ♪

01:51:39   ♪ And if you're into Twitter ♪

01:51:42   ♪ You can follow them at C-A-S-E-Y-L-I-S-S ♪

01:51:47   ♪ So that's Casey Liss, M-A-R-C-O-A-R-M-E-N-T ♪

01:51:53   ♪ Marco Arment, S-I-R-A-C-U-S-A ♪

01:51:58   ♪ Syracusa ♪

01:52:00   ♪ It's accidental ♪

01:52:02   ♪ Accidental ♪

01:52:04   ♪ They didn't mean to accidental ♪

01:52:07   ♪ Accidental ♪

01:52:09   ♪ Tech podcast ♪

01:52:11   ♪ So long ♪

01:52:13   - So did we order our AI pins yet?

01:52:17   - Yeah, totally.

01:52:19   - Can you order them?

01:52:20   Are they accepting orders?

01:52:21   - Yes, so here's what happened.

01:52:23   So a couple episodes ago,

01:52:25   like right before we recorded,

01:52:28   The Verge got the leak of most of the details

01:52:31   about the Humane AI Pin that was about to be unveiled.

01:52:35   We did our episode, we talked a little bit

01:52:36   about our impressions of just seeing The Verge leak.

01:52:39   And we were like, well, you know,

01:52:40   we don't wanna judge it too much

01:52:42   because we don't know their side of the story yet.

01:52:44   Let's see what happens when they officially unveil it.

01:52:48   Then the next day, they officially unveiled it

01:52:51   with this really odd video and opened up preorders.

01:52:56   And then what happened in the meantime

01:52:59   is preorders opened up, no one noticed,

01:53:02   and then all this stuff blew up with OpenAI

01:53:04   a few days later. (laughs)

01:53:06   And so it seemed to breeze by without much mention.

01:53:10   And I think if I had to guess,

01:53:12   I think Humane's chance in the press is already over.

01:53:17   But I thought it might be interesting

01:53:19   to just talk about it a little more

01:53:20   because we did get more details

01:53:23   about the Humane AI Pin product, their first product.

01:53:26   And this is, it's kind of a, it's a noteworthy thing.

01:53:30   But I think it's interesting,

01:53:31   the era of technology that we are in,

01:53:35   all of this drama around OpenAI

01:53:39   has gotten way more attention

01:53:41   and is way more interesting to the tech press

01:53:43   than a launch of a pretty hyped device

01:53:50   that's a whole new device type

01:53:51   from a whole bunch of ex-Apple people.

01:53:53   That, 10, 15 years ago,

01:53:57   that would have been way more headline-grabbing

01:53:59   than it ends up that it has been

01:54:02   and way more relevant, I think, to the modern tech world

01:54:05   than it will probably end up being.

01:54:08   So I think that's interesting by itself.

01:54:09   But also just kind of looking at the humane product,

01:54:13   now that we know all the details about it,

01:54:16   or at least whatever they have chosen to show so far,

01:54:18   which seems like pretty much everything.

01:54:20   What, if anything, is different,

01:54:26   like now that they've unveiled it,

01:54:27   basically everything that The Verge said was correct,

01:54:29   but now we just have more details

01:54:32   about what some of its features are,

01:54:34   how some of these features work.

01:54:36   We see the laser thing, the don't-call-it-a-screen screen.

01:54:41   We see how Humane is positioning it,

01:54:44   what they want it to be.

01:54:46   It has all these, basically,

01:54:49   it's kind of like wearing an Amazon Echo on your chest

01:54:51   that happens to have a few extra features.

01:54:53   I gotta say, the video,

01:54:58   and again, the video was really weird,

01:55:02   and I'm just gonna set that aside

01:55:04   and just focus on the product,

01:55:05   'cause the video didn't do them any favors, I don't think.

01:55:08   But at least we see what they're going for.

01:55:12   We see, here's this thing that you're gonna wear

01:55:15   as this, basically as this badge on your chest

01:55:18   that you tap and ask it to do things.

01:55:22   It can play music for you,

01:55:24   it can answer questions for you sometimes correctly.

01:55:28   It can take pictures for you with a camera

01:55:32   that faces front, like a body cam if you're a cop,

01:55:35   like it's that kind of perspective.

01:55:36   There's a whole bunch of interesting ideas here.

01:55:43   I don't think this is going to go well.

01:55:47   I would actually, I would honestly kind of be surprised

01:55:52   if they made it to their launch day.

01:55:55   They're taking pre-orders now,

01:55:58   and they say estimated delivery early 2024,

01:56:00   but I would be surprised if they're getting any pre-orders

01:56:06   that aren't just from gadget reviewers,

01:56:09   and even then, I don't think

01:56:10   it's gonna be that big of a number.

01:56:11   But anyway, but I think it's interesting,

01:56:14   when you look at this product,

01:56:16   there are a lot of interesting ideas.

01:56:18   It's definitely going to flop,

01:56:22   and I think it's going to flop because they seem to have,

01:56:27   how do I put it?

01:56:31   I mean, they seem to have gone

01:56:33   for an ideal physical environment

01:56:36   that doesn't really exist for a lot of people

01:56:38   based on tech features that really aren't ready yet.

01:56:42   Or in some cases, they pulled off technology that is cool,

01:56:45   but is not actually better

01:56:47   than what they are trying to replace.

01:56:49   So for instance, the physical challenges.

01:56:52   As discussed last time, this is like a pin

01:56:56   that attaches to your clothing,

01:56:58   has magnetic backs to hold it on and charge it.

01:57:01   I mean, okay, what if I don't want this giant thing

01:57:04   on the front of my shirt?

01:57:06   It's pretty big, it's pretty noticeable.

01:57:08   It's not like a little tiny enamel pin.

01:57:10   It's a big badge-sized pin.

01:57:14   It is not discreet at all.

01:57:16   Anyone would see this,

01:57:18   and anyone who doesn't know what this is

01:57:20   would definitely ask you about it.

01:57:21   What the heck is that?

01:57:22   What is on your shirt?

01:57:24   So there's that angle of it.

01:57:26   There's the reality that it weighs about

01:57:29   as much as two AA batteries,

01:57:31   which is not heavy but not light.

01:57:33   And imagine that just flopping around on your chest all day.

01:57:37   That would actually be noticeable,

01:57:39   and that would be kind of annoying.

01:57:41   If you're wearing any kind of lightweight clothing,

01:57:42   that's gonna be very noticeable.

01:57:45   Suppose it's a season, and you wanna go outside

01:57:50   and put a jacket on or a hoodie or something.

01:57:52   Do you have to not have the features of this thing

01:57:54   during that time, or do you take it off your shirt

01:57:57   and move it to your jacket?

01:57:59   Every time you add or remove a layer of clothing,

01:58:01   it's gonna be a problem with this kind of device?

01:58:04   Like, okay.

01:58:05   - Well, but Marco, I don't understand why you would say

01:58:08   that that would be an issue.

01:58:10   First of all, how could they possibly know,

01:58:11   a company based in the greater San Francisco area,

01:58:14   how could they possibly know anything

01:58:16   about putting on and off layers?

01:58:17   I mean, that's not something

01:58:18   that happens in San Francisco ever.

01:58:19   - I know, it's like,

01:58:20   again, and even just the whole concept of like,

01:58:25   I hate my phone so much that I'm going to wear this thing

01:58:31   and use it instead of a phone in many cases

01:58:33   for things that really a phone would be better at.

01:58:36   But the environment that they wanna use this in,

01:58:40   it's like, I don't wanna be talking to something

01:58:45   constantly out in the world.

01:58:47   I barely even wanna do it in my own house.

01:58:50   Like, the idea that it is a voice-first interface,

01:58:53   I find optimistic, but not super compelling

01:58:57   for most realities that most people live in.

01:59:01   They also try to replace the screen.

01:59:04   This is a screenless device, kind of,

01:59:07   but it does have that laser projecting thing

01:59:12   that can project a screen onto your hand.

01:59:15   You know, they don't call it a screen,

01:59:16   but I got news for you, that's a screen.

01:59:19   Like, that's just a really weird, crappy screen

01:59:23   that is very limited and a little bit difficult to use.

01:59:26   - Excuse me, it is a laser ink display, thank you very much.

01:59:31   - But yeah, and again, that's kind of cool tech

01:59:34   for something, like I'm sure there's good uses for that,

01:59:35   but when it showed it in the video, I was like,

01:59:37   oh, you're just navigating a screen.

01:59:40   It just happens to be projected onto your hand

01:59:42   and you gotta tilt your hand

01:59:43   in weird ways to interact with it,

01:59:45   but that's just a screen.

01:59:46   It's just like what John always brings up,

01:59:49   when I tried to have my magazine app

01:59:51   that didn't have a settings screen,

01:59:53   and it ends up I just kinda had to shove settings everywhere

01:59:55   and it was a worse design

01:59:56   and I ended up just having to make a settings screen.

01:59:58   In this case, navigating this device,

02:00:00   what they were showing in the video,

02:00:03   they're using a device with a screen.

02:00:05   It's just a really unusual kind of screen,

02:00:07   but they were still navigating it like a screen

02:00:10   and using it like a screen,

02:00:11   and it turns out you need a screen

02:00:13   for a lot of these things.

02:00:14   And if you're gonna have a screen, just use your phone.

02:00:18   It's way better. (laughs)

02:00:21   So there's all that to contend with,

02:00:23   and so I think they have a lot of physical challenges

02:00:28   to this, that there is just, physically speaking,

02:00:31   I don't see any reason why this is better

02:00:34   than a phone and a watch.

02:00:36   For all of the ambient availability that they have on it,

02:00:41   it should be a watch, 'cause a watch is physically

02:00:45   a much easier and more versatile thing

02:00:47   that can work for more people in more conditions than this.

02:00:50   And even then, it's like, but I mean,

02:00:53   this is not that different from what a phone does,

02:00:56   and a phone is better at all these things,

02:00:58   and you already have a phone.

02:01:00   But I also think what's interesting is that

02:01:02   the ecosystem realities of the modern technological world

02:01:07   is like, even if this was a great idea

02:01:11   that was very well executed and that everybody would want,

02:01:15   the reality is they're trying to have this

02:01:18   replace your phone in a lot of key roles,

02:01:22   including things like, it has its own phone number,

02:01:26   they want people to call this and to message this, okay.

02:01:31   So you're gonna give people a whole new number

02:01:35   just for your AI pin, or are you gonna move your number

02:01:37   to it and then not have that number on your phone?

02:01:40   Are people supposed to text this,

02:01:43   and then it's just gonna read you a summary

02:01:45   of your friends' text messages, like hey, you know what,

02:01:47   I don't actually wanna read my friends' text messages,

02:01:50   just summarize them for me. (laughs)

02:01:52   Like, there are so many parts about this

02:01:54   that it makes for a cool two-second demo,

02:01:58   but if you think about it for another two seconds,

02:02:00   you're like, oh, wait a minute, though,

02:02:01   but what about this, or wait, wouldn't this fail

02:02:05   in this one way, or wouldn't this have

02:02:06   this major shortcoming?

02:02:08   But I think it's interesting that because

02:02:11   of the ecosystem realities of our tech world today,

02:02:15   because this is trying to replace your phone,

02:02:17   it has no chance whatsoever, because who is gonna buy this

02:02:21   to replace their phone if it's not going to sync

02:02:24   all this stuff back to their phone?

02:02:27   Like, it's not gonna sync with your,

02:02:29   it's not gonna have iMessage support,

02:02:31   so you're gonna become a green bubble friend,

02:02:33   and it's gonna sync to nothing, you're gonna have,

02:02:36   like, none of your messages that come to your AI pin

02:02:38   are gonna appear in your phone's messaging app,

02:02:41   'cause it doesn't work that way.

02:02:43   You have to have this different phone number for it,

02:02:46   maybe you could do some kind of forwarding tricks,

02:02:48   but who's gonna do that?

02:02:49   So, it's interesting that with the modern Apple

02:02:53   and Google duopoly here, there's actually not that much room

02:02:56   for a startup like this to come in and make their own thing,

02:02:59   because unless it works really, really well,

02:03:02   and is very, very integrated with either iOS or Android,

02:03:06   something like this is not gonna get off the ground,

02:03:08   which is kind of a shame.

02:03:09   Like, honestly, again, I don't think they did,

02:03:12   I don't think this is a very compelling product, honestly,

02:03:14   but suppose it was, it still would fail,

02:03:19   because of that massive hardware lock-in environment

02:03:23   that we are in now with these modern platforms,

02:03:25   and that's kind of a shame, but ultimately, honestly,

02:03:29   I don't think this is what Humane originally intended

02:03:33   to launch, the Humane company and the massive talent drain

02:03:38   they did from Apple and all the work they've been doing

02:03:41   greatly predates the rise of what we're calling AI today.

02:03:47   What we're calling AI today is a very recent thing.

02:03:50   Humane's been in development for longer than that.

02:03:52   So I think what probably happened is they were probably

02:03:56   planning something else, or at least a very different focus

02:04:00   for this product, it wasn't working out so well,

02:04:04   they started running out of money, maybe,

02:04:05   or running out of time, and they kind of pivoted recently

02:04:09   to be like, hey, you know what, let's make it really AI

02:04:11   focused, that way that'll get more attention,

02:04:13   it'll fit the current market better,

02:04:15   maybe it'll help us raise more money or whatever.

02:04:17   So I think there's a more complicated story here,

02:04:21   but basically, this product doesn't even seem like

02:04:24   what they originally set out to make,

02:04:26   and also, it doesn't seem like it's going to succeed at all.

02:04:29   So frankly, again, I would be surprised if it ships.

02:04:34   I think the company might go under and get bought

02:04:36   for an acquihire or whatever before this even ships.

02:04:41   And if it does ship, I don't think it's gonna last long.

02:04:46   - I talked about this for a while

01:50:47   in the upcoming episode of Rec Diffs,

02:04:49   so I won't repeat too much of what I said there,

02:04:50   if you wanna hear me talk about it

02:04:51   for a little bit longer with Merlin.

02:04:54   But yeah, this thing, it's kind of hard to tell

02:04:58   when they talk about it, whether they really believe

02:05:03   this wrong-headed idea of replacing people's phones,

02:05:05   as in any moment you spend on your phone is bad,

02:05:07   and so to the degree that we can reduce that or replace it,

02:05:10   our product is good, this kind of value judgment,

02:05:13   everything that we can help you do with your little badge,

02:05:17   that's a win, because it's time you didn't spend

02:05:19   on your phone, and that's why it's good,

02:05:21   because phone is bad and this is good.

02:05:24   I can't tell if they really believe that.

02:05:26   I kind of think maybe they do,

02:05:28   and that's what makes you think this is actually

02:05:29   kind of the product they wanted to make,

02:05:31   which is essentially, get off those screens, kids,

02:05:35   and don't look at screens,

02:05:38   and just have a thing that's a badge,

02:05:39   and I think the AI stuff coming along was like,

02:05:41   "Wow, this is a great boon for us,"

02:05:42   'cause we were already going down this path

02:05:44   of like, screens bad, project light onto your hand instead

02:05:47   and talk to your badge,

02:05:47   and now we can do it even better, right?

02:05:49   But it's hard for me to really,

02:05:52   if they really did believe that, it's like, really?

02:05:54   Did these smart people all really buy into that thing?

02:05:57   Maybe they had a very charismatic leader

02:06:00   who really believed in that or something?

02:06:02   The flip side of that is what you just alluded to,

02:06:03   and what I mostly spent all of Rec Diffs talking about,

02:06:06   the platform problem.

02:06:08   A product like this, essentially an Amazon Echo,

02:06:11   a smarter Amazon Echo that you pin on your clothing

02:06:15   that you can talk to and that has a camera

02:06:17   that faces out or whatever, great idea.

02:06:20   Unfortunately, you cannot make that product

02:06:22   without deep integration with Android or iOS,

02:06:26   and the only companies that can have

02:06:27   that kind of deep integration with Android or iOS

02:06:30   are companies that are currently heavily entrenched in that,

02:06:34   either Google, because they make Android,

02:06:36   or a very big Google phone maker like Samsung or whatever,

02:06:40   or of course, Apple.

02:06:41   Humane out here cannot have the integration they need

02:06:45   with the iPhone and iOS to make a good version

02:06:48   of their little badgy product,

02:06:49   because Apple doesn't allow it,

02:06:51   because their platform does not allow

02:06:53   things like this to happen.

02:06:54   And on Rec Diffs, the example I gave was Quicksilver,

02:06:56   we ended up talking about like, you know,

02:06:58   command space things that you type or whatever.

02:07:00   macOS as a platform was able to,

02:07:04   and continues to be able to, support things like this.

02:07:07   A system integrated extension that starts to become

02:07:10   part of the way you use your computer,

02:07:11   that the person who made that computer

02:07:13   and the operating system didn't foresee,

02:07:15   but that becomes like, that is like deeply entrenched,

02:07:19   hitting command space, like, before Spotlight existed,

02:07:21   there were things like Quicksilver

02:07:23   and LaunchBar and stuff like that, right?

02:07:26   A, what we would call a system extension.

02:07:28   The things that the AI pin would have to do

02:07:31   would have to make it like a system extension.

02:07:33   It's the same reason we can't replace Siri

02:07:34   with something better, the same reason

02:07:36   we can't replace reminders, it's like the iPhone ecosystem

02:07:40   is too closed to support innovation like this

02:07:43   unless it comes from Apple.

02:07:44   That is, when you're saying this is a shame, Marco,

02:07:45   that is the biggest shame of it,

02:07:47   is that Apple's control of this platform

02:07:49   does not allow innovations like this to be successful.

02:07:52   The only company that could do a pin like that well

02:07:55   is Apple on the iPhone.

02:07:57   And on Android, it's a little bit better, but not much,

02:07:59   because Google really controls that platform very well,

02:08:01   and there's a small number of Android phone makers

02:08:04   that have the wherewithal to do their own software stacks

02:08:06   and that kind of integration.

02:08:07   And by the way, if you do something Google doesn't like,

02:08:09   they're not gonna let you in the Play Store,

02:08:11   and then you get, you know, it's,

02:08:12   these platforms are not as open as the personal computer

02:08:15   and the Mac were, right?

02:08:17   Because there are so many things

02:08:19   that we continue to do to this day,

02:08:20   even as locked down as the Mac is today,

02:08:22   there are so many things that you can,

02:08:24   that a third party can do to enhance the way a Mac works

02:08:28   at a system level that feels part of the system

02:08:30   that allows innovation that yes,

02:08:32   eventually the platform owner copies like,

02:08:33   oh, someone made a menu bar, a clock in the menu bar.

02:08:36   Eventually Apple's gonna put that into the operating system,

02:08:38   but that's how things advance.

02:08:39   And iOS, it's like, run your little apps

02:08:42   in your little sandbox in your little world

02:08:44   and get your little, you know, squircle, right?

02:08:46   But don't you dare mess with any other part of the system.

02:08:48   We control notifications, we control the status bar,

02:08:51   we control the default, you know, the web browser,

02:08:55   the AI agent that's installed, the default mapping thing,

02:08:58   and like to the degree that Apple slowly opens that stuff up,

02:09:02   they're so far from allowing something

02:09:05   like the humane pin to work.

02:09:07   So I feel for the company and that's why I think,

02:09:09   did you make this because you think

02:09:11   this is the best thing to do?

02:09:12   Or did you make it like this because you have no choice?

02:09:15   It has to have its own phone number.

02:09:17   It can't sync, it can't use iMessage.

02:09:19   Like it can't share, like you have no choice but to do this.

02:09:23   You have no choice but to at least try to be standalone

02:09:25   because Apple doesn't want you to, it won't let you

02:09:28   integrate in the way that you want, right?

02:09:30   You'd want to be able to just talk to it

02:09:32   and have it do things on your phone when you talk to it

02:09:34   and have a tight connection with,

02:09:35   Apple can barely make a tight connection

02:09:36   between their own watch and their phone for crying out loud,

02:09:39   let alone letting the third parties do it.

02:09:41   So, you know, again, I can't tell if this company thinks

02:09:44   this is actually a good idea and phones are evil

02:09:46   or they just, they had a, you know,

02:09:48   they just see the reality that's like,

02:09:50   if we have no choice, we have to pretend

02:09:53   that we are replacing the phone,

02:09:54   we have to be as standalone as possible,

02:09:56   we have to get our own phone number,

02:09:57   we have to have our own world with our own contacts

02:09:59   and our own messaging and, you know, it's just,

02:10:01   and that's never gonna fly,

02:10:04   even if you did everything great

02:10:05   and it seems like you didn't.

02:10:06   And then there's the problems we talked about last week

02:10:08   of like they're assembling a bunch of disparate parts

02:10:10   into a whole that they hope is greater

02:10:11   than the sum of those parts. I don't think it is,

02:10:13   and I think a lot of those parts don't work right yet.

02:10:16   - I also vehemently agree that it's going to flop,

02:10:19   like I will be flabbergasted

02:10:22   if this really gets any traction whatsoever.

02:10:24   In fact, I would be surprised if it gets as much traction

02:10:26   as Google Glass got and yes, I realize what I just said.

02:10:29   - I hope it does get to the point

02:10:30   where tech reviewers review it

02:10:31   'cause I wanna read those reviews.

02:10:33   - Yeah, agreed.

02:10:34   But I do think there are some very clever

02:10:37   and interesting ideas here.

02:10:38   I just, I really am extremely put off

02:10:42   by voice being the only real interaction model.

02:10:45   Like yes, there's the laser ink display or whatever,

02:10:47   but effectively it's just voice

02:10:49   and I really don't like that at all.

02:10:52   Like I would much rather type

02:10:54   than speak almost any time.

02:10:57   And maybe that makes me an old, I don't know.

02:10:58   I don't think that's unique to old people.

02:11:00   - But what would you type on?

02:11:01   - Well, no, no, I agree, yeah.

02:11:02   - Eventually they end up making a phone

02:11:04   and we know they can't.

02:11:05   Oh, well maybe they just have an app on iOS

02:11:07   and then you get back to the integration problem.

02:11:08   Okay, so you've got a hardware device

02:11:09   and you've got an app on iOS.

02:11:11   How well do they integrate with each other?

02:11:12   Is your app running all the time?

02:11:13   Can they communicate easily all the time or is it just,

02:11:16   you have to fit within these little walls that Apple makes.

02:11:19   Here's what apps are allowed to do

02:11:20   and here's what hardware accessories you're allowed to do

02:11:22   and here's when they're allowed to talk to each other

02:11:24   and how much and through what means and so terrible.

02:11:27   - I agree with you but like as an example,

02:11:29   one of the things they demoed in their very funny video,

02:11:32   which by the way, it's like 10 minutes

02:11:33   and it is worth watching. - Seems longer.

02:11:35   - It's so weird. - First of all,

02:11:36   it does seem longer but it is so weird.

02:11:38   And Imran or whatever his name is, the head,

02:11:42   I think for like the TED Talk he did way back when,

02:11:45   he's like kind of, I don't know if aloof

02:11:48   is the word I'm looking for but he's kind of like chill,

02:11:51   I don't really care what's going on right now vibe.

02:11:53   I think that did work for the TED Talk

02:11:55   but for the product video, it did not land well

02:12:00   and it is worth spending the 10 minutes watching this thing

02:12:03   'cause it's something else.

02:12:04   - He looks a little sleepy, not high energy.

02:12:06   - I understand that's just his vibe

02:12:09   but like the whole video, it's like there is no enthusiasm

02:12:13   about this product being displayed whatsoever.

02:12:15   - Yep, no, it's very true.

02:12:17   So all that aside, there are, I genuinely think

02:12:20   there's some very cool things in this.

02:12:22   So like as an example, I forget exactly what the phrase was

02:12:26   but he talks about how, oh, if you've been not paying

02:12:29   attention to your text messages for a couple hours,

02:12:30   maybe you're at a dinner at a kid's thing,

02:12:33   you can say catch me up and it will use AI

02:12:38   to hopefully do a good job of saying,

02:12:41   of the 907 text messages you've just received,

02:12:44   the key takeaways are you gotta go to dinner

02:12:46   in half an hour with your wife at such and such a location

02:12:49   and your kid wants to know if they can play more Minecraft

02:12:52   or whatever the case may be.

02:12:54   That to me is cool or I think they've shown

02:12:56   other examples of--

02:12:57   - That would be cool if it worked

02:12:59   but with any of these things, you're always like,

02:13:02   okay, so what is the consequence if it doesn't quite work?

02:13:07   Probably not that big for summarizing text messages

02:13:10   but are you gonna rely on, like what if it's something

02:13:13   about when someone needs to be picked up or dropped off

02:13:15   or stop off at the store and get something or whatever

02:13:17   and it either tells you something like that

02:13:20   that didn't exist or doesn't tell you about it

02:13:21   and does exist and getting back to the technology

02:13:23   making your life better and you come home

02:13:25   and your wife's like, what, did you forget to pick up Timmy?

02:13:29   You're like, oh, what do you mean?

02:13:31   I texted you, I said you need to pick him up at the school.

02:13:34   And I was like, oh, well, I asked my AI pin to catch me up

02:13:37   and it didn't mention that.

02:13:38   How do you think that's gonna fly?

02:13:40   - What are you supposed to do

02:13:41   after it summarizes your messages?

02:13:43   Do you just delete them or do you read them later?

02:13:45   - But I'm saying, is the summary useful?

02:13:48   The summary's only useful to you if you believe it.

02:13:50   And if you have any doubt, again, maybe you don't care.

02:13:53   Maybe it's like, oh, I don't care,

02:13:55   but I feel like it'll only take one of those things

02:13:57   where you were supposed to pick up your kid and you didn't

02:13:59   and you have to explain why by saying that you asked

02:14:02   for a summary from your pin and your wife's gonna say,

02:14:04   well, never use that again.

02:14:06   Because obviously it doesn't work every time.

02:14:09   You know what I mean?

02:14:10   And it's not life or death, like, oh, the kid's at school.

02:14:13   But that type of thing, when you get onto stuff

02:14:18   like interpersonal communication and ask me for a summary,

02:14:20   it looks great in a demo, like, look, it's saving me time.

02:14:23   It's like having an executive assistant.

02:14:24   But if your executive assistant forgot to mention

02:14:26   you had to pick up your kid, you fire them.

02:14:28   Like, you know what I mean?

02:14:29   Or you'd have a talk with them at least.

02:14:30   It's like, this is the job.

02:14:32   But when the AI thing does it,

02:14:33   you have no one to blame but yourself

02:14:35   because you believed it.

02:14:36   And it turns out that that was consequential enough

02:14:38   that now your hobby tinkering with this little thing

02:14:42   versus how much benefit is it actually giving you

02:14:45   in your life, like, you love it when it comes together

02:14:47   and you feel like you're in the future when it's summarized.

02:14:48   But that one time you forget to pick up your kid,

02:14:50   you're never gonna use that summary thing again.

02:14:53   Or if you do, you're gonna use a summary

02:14:54   and then take out your phone to confirm

02:14:56   and read every message?

02:14:57   What's the point of that?

02:14:58   - Yeah, and if anything, like, this,

02:15:00   I think, like, the leaps and bounds

02:15:04   they had to jump through to avoid having a screen

02:15:09   just show how good screens are.

02:15:12   All the different ways this doesn't work,

02:15:15   all the different conditions this doesn't work in.

02:15:17   Like, there's, like somebody in the chat,

02:15:20   David Shaw and Repoman27 just posted,

02:15:24   the operating temperature range is only 41

02:15:26   to 95 degrees Fahrenheit.

02:15:27   So, like, if you're in a really cold place

02:15:29   or a really hot place, like, nope, won't work.

02:15:31   - Really cold is lower than 41?

02:15:33   I guess you have to wear it inside your jacket

02:15:35   if you live in any place that has winter.

02:15:37   - Right, so you can't use it when you're outside

02:15:39   wearing a jacket, really.

02:15:40   It's gonna be covered up anyway.

02:15:41   But even if you move it to the outside of your jacket

02:15:43   every time you put your jacket on.

02:15:45   - It's pretty solid.

02:15:45   - Yeah, it's gonna freeze.

02:15:47   On their own site, is AI pin waterproof?

02:15:50   No, basically, it's a long answer.

02:15:51   The summary, no.

02:15:52   - It doesn't rain in California, so you're fine.

02:15:53   - Right, so you can't use it if it's raining.

02:15:56   Like, there are all these different conditions.

02:15:57   It's like, okay, what if it was a screen in your pocket,

02:16:00   like a phone?

02:16:01   Well, it turns out those are really good.

02:16:04   And you can use a screen that lives in your pocket

02:16:06   or purse or bag or whatever.

02:16:08   You can use that in so many more places and contexts

02:16:13   and conditions than you can use something like this.

02:16:16   And that's why it has no chance.

02:16:19   Like, even if people hated their phones,

02:16:21   which they very much don't, but even if, for some reason,

02:16:25   you could buy into the narrative that there's lots of people

02:16:27   who wanna spend less time on their phones out there,

02:16:30   there aren't, but again, even if people hated their phones,

02:16:34   the versatility of screens that are handheld,

02:16:39   that fit in your pocket, and that you can pick up

02:16:41   and take out in a split second whenever you want to,

02:16:45   that versatility overpowers everything else here.

02:16:48   That's like, you can't use the AI pin in the cold,

02:16:52   in the hot, in the rain, when wearing a jacket,

02:16:55   when wearing a light shirt, like, when somewhere

02:16:59   where it would be visibly obtrusive

02:17:01   or draw too much attention to you.

02:17:04   You can't use it in a place where you can't use your voice.

02:17:07   Maybe you are somewhere where you have to be quiet,

02:17:09   you know, in certain contexts where you have to be quiet.

02:17:13   People have those all over their lives.

02:17:15   Screens work in so many more places and contexts,

02:17:20   and so while I understand, like, again,

02:17:25   I'm glad they're trying stuff,

02:17:26   I'm glad people out there are making cool gadgets.

02:17:28   It's fun, but don't bet against the smartphone,

02:17:33   especially so directly against the smartphone.

02:17:36   You will lose every time.

02:17:39   Screens are awesome.

02:17:40   There's a reason we have them.

02:17:42   And projecting a laser screen into your hand

02:17:45   that you have to tilt and gesture in weird ways to use,

02:17:48   that's not better.

02:17:49   Like, when you see the laser screen, they blew it.

02:17:51   Like, that's--

02:17:53   - Even on the rumply demo hand, it's like,

02:17:55   oh, I see something on my hand.

02:17:56   I can barely tell what anything is.

02:17:57   The hand's too wrinkly and rumply, yeah.

02:18:00   - What if you just put a phone into that hand?

02:18:03   Then it's so much more useful!

02:18:04   - That's what you're looking at.

02:18:06   Yeah, the hardware problems, like, could be solved

02:18:08   if the company had a good enough product

02:18:09   that they could iterate on the technology and so on.

02:18:12   But, like, it feels like this entire product hinges on,

02:18:17   as we pointed out last show,

02:18:18   the thing that this company is not actually really making,

02:18:20   which is the quote-unquote AI part of it.

02:18:23   And we all know that that part of it

02:18:26   doesn't work as well as they would like.

02:18:27   Imagine if this thing, you'd buy this thing

02:18:29   for the same price and pay the same monthly fee.

02:18:32   And every time you tapped it,

02:18:33   it was like Amazon Mechanical Turk

02:18:34   and there was a person there

02:18:35   who was your own human personal assistant.

02:18:38   If you could get a human personal assistant

02:18:39   to be on call 24 hours a day for $24 a month,

02:18:43   that would be a great deal.

02:18:44   And that is the dream of this.

02:18:46   And you'd say, "Oh, but the hardware's bad

02:18:47   and it's not waterproof and it's this and it's that."

02:18:49   I'd be like, "No, $24 a month and I tap this thing

02:18:51   and I can literally ask it to do anything

02:18:52   and it does it for me."

02:18:54   Summarize my text messages and it tells me

02:18:56   and it's a person, so it gets it right.

02:18:58   You know what I mean?

02:18:58   Like, that would be a great product.

02:19:00   And if they had that product, people would pay for it.

02:19:03   And then the next version would be better

02:19:04   and the next version would be better.

02:19:05   And you'd be like,

02:19:06   "I don't need to see anything on a screen.

02:19:07   I just ask my little assistant and it tells me everything.

02:19:10   Why do I need to see anything on a screen, right?

02:19:12   I can whisper real low if I'm in a library

02:19:14   and they can hear me because of these amazing microphones."

02:19:16   But they don't have a human at the other end of that thing.

02:19:19   They have whatever quote unquote AI thing they're using,

02:19:24   which is useful in lots of other contexts

02:19:26   and may be useful in some of the contexts

02:19:28   they want to use it in,

02:19:29   but not useful enough to make up for all their other failings.

02:19:32   So they're never gonna get to the point

02:19:33   where they can iterate on this hardware, right?

02:19:36   And, you know, again,

02:19:37   they don't even really make the AI part of that.

02:19:40   They're just piggybacking on a technology

02:19:42   that other people are innovating.

02:19:43   So I wouldn't trust this company

02:19:46   to continue to push to the forefront

02:19:47   of making that part smarter and better,

02:19:50   because that's not their core competency.

02:19:52   Their core competency is apparently this idea of,

02:19:55   you know, let's not be on screens as much

02:19:56   and let's use a badge type thing,

02:19:58   and the hardware that they've made

02:19:59   and, you know, the software stack that it runs on.

02:20:01   And the hubris to believe that,

02:20:04   okay, the phone platforms that dominate our lives

02:20:07   are so closed that we can't build the product we want,

02:20:10   so we're gonna do it all ourselves.

02:20:12   And I admire that.

02:20:14   I admire the attempt to do that.

02:20:16   It's one of the reasons I admire Lucid,

02:20:17   because they did so much stuff themselves,

02:20:19   but Lucid actually pulled it off,

02:20:20   and this company, I think, has not.

02:20:22   - Well, 'cause Lucid was making something so many people want.

02:20:24   (laughing)

02:20:25   - Yeah, but they didn't choose

02:20:26   to do so many things themselves.

02:20:27   Like, oh, you're gonna make your own drivetrain?

02:20:29   You're gonna make your own?

02:20:30   Like, why would you do that?

02:20:31   We can do what all the other EV companies did.

02:20:33   I mean, you're not just gonna buy parts from somebody else

02:20:35   and assemble them?

02:20:36   Like, no, we're gonna build all this stuff ourselves.

02:20:38   It's like, boy, that seems really hard.

02:20:40   It is, and most companies can't do it,

02:20:42   but the ones that do, it's very impressive.

02:20:43   And this company, you know,

02:20:46   Humane was forced to do this,

02:20:47   because they can't build on any of the platforms

02:20:50   that it's natural for them to build on.

02:20:51   So it's sad.

02:20:53   It really, like, as much as I think this product

02:20:55   is not, you know, gonna do anything in the market,

02:20:59   I wish that it was possible to make a good version

02:21:02   of this product on any of the dominant platforms today

02:21:05   that would be appropriate, you know,

02:21:07   iPad OS, iOS, Android, right?

02:21:09   But it's just not.

02:21:10   Those companies, if they don't wanna make it,

02:21:12   they don't wanna let you make it,

02:21:13   and that is the real shame of this.

02:21:15   - Yeah, agreed, and again, like,

02:21:16   I don't, this is a bad idea for so many reasons.

02:21:20   Like, your Lucid analogy, it's like if Lucid was like,

02:21:23   hey, why don't you give up your car

02:21:25   and switch to a hot air balloon?

02:21:27   Like, well, okay, I mean, that's gonna be a tough sell

02:21:31   for most people.

02:21:32   - Yeah, we would've made a car,

02:21:33   but we're not allowed to be on the roads,

02:21:35   because the company that owns the roads won't let us.

02:21:37   But we think hot air balloons

02:21:38   are actually better than cars.

02:21:39   - Yeah, don't you hate your car?

02:21:41   How about something that's less capable and less convenient,

02:21:44   and that you can operate in fewer conditions?

02:21:45   Great, how about a hot air balloon? (laughs)

02:21:48   Oh, God, anyway.

02:21:50   But yeah, you know, you're right,

02:21:50   and it's ultimately like the inability

02:21:53   for products like this to really have a fair shot

02:21:58   is a tragedy of the modern tech lock-in ecosystem.

02:22:03   And there's no telling how much innovation out there

02:22:08   we could have today,

02:22:09   and that could be making our lives better

02:22:11   and could be making the world better,

02:22:13   if not for Apple and Google's massive platform lock-in

02:22:17   on their modern platforms that basically make it impossible.

02:22:20   - Yeah, if anything, like a company that had this idea

02:22:22   could fail much faster,

02:22:24   because they wouldn't have to build everything,

02:22:26   all the operating system and platform stuff.

02:22:29   They could just piggyback, you know:

02:22:31   you make an app on your phone

02:22:32   and build a little bit of the hardware

02:22:34   and have them integrated,

02:22:35   and just do the part that you care about.

02:22:37   They can't do that,

02:22:38   because you can't get that kind of deep integration

02:22:41   with iOS, but if they could,

02:22:42   boy, they would save a lot of time,

02:22:44   and they would be able to come up with their product,

02:22:45   and they would flop in the market,

02:22:46   'cause nobody wants it and the AI systems aren't there yet.

02:22:48   And it would save everybody a lot of money.

02:22:51   But they had to build so much stuff

02:22:53   and go this whole route of like,

02:22:55   we are our own thing with our own phone number

02:22:57   and our own operating system and our own lack of apps

02:23:00   or whatever they were pitching.

02:23:01   Like, we have experiences, not apps.

02:23:02   It's like, you have to build all of that

02:23:04   for what's probably gonna turn out to be,

02:23:06   if not a bad idea, then at the very least,

02:23:08   an idea that is too ahead of its time,

02:23:10   because the tech is just not ready for it yet.

02:23:12   (beeping)