677: I Accept the Battery Cost
00:00:00
◼
►
As I sit here tonight, the kids have a two-hour delay tomorrow, and they're finally going back to school on Thursday. They have not been in school since, what day was that, January 23rd?
00:00:14
◼
►
Wait, really? So this whole ice period, they haven't been in school since then?
00:00:17
◼
►
That's right. That's right. So we got the snow, I think, 24th, 25th weekend. Then we were, well, snow and ice. I shouldn't say the snow; it was really the ice that was the problem.
00:00:24
◼
►
We're off all last week, 26 through 30, we're off Monday, we're off Tuesday. Well, I shouldn't say off.
00:00:31
◼
►
They did asynchronous learning, which means they did a bunch of busy work Monday, Tuesday, and today, which is Wednesday.
00:00:37
◼
►
Tomorrow, Thursday the 5th, they're finally going back after a two-hour delay.
00:00:41
◼
►
And I feel terrible for Erin in particular, because I've been up here working in the office,
00:00:46
◼
►
the kids have been downstairs in the dining room doing their busy work and asking Erin questions incessantly all day long for the last three days.
00:00:54
◼
►
Erin loves her children more than she loves anything in the world, including, I think, me, potentially.
00:00:59
◼
►
And I think she's very ready for them to go back to school tomorrow.
00:01:03
◼
►
Yeah, I think, I don't know, there's days that I do okay that she might like me a smidge more.
00:01:08
◼
►
But don't ruin my moment here, John.
00:01:10
◼
►
But anyways, the point is, Icepocalypse is finally over, and I am very thankful.
00:01:16
◼
►
All right, well, let's do some follow-up, and I would like to say that I've been vindicated,
00:01:22
◼
►
and in congratulations to John, or maybe not congratulations, but in thanks to John for putting this into the show notes,
00:01:29
◼
►
even though my conversions to metric are generally trash, they at least get us in the ballpark,
00:01:33
◼
►
and many people wrote in to say, hey, that's helpful.
00:01:36
◼
►
I mean, I'm not sure if it's vindication, but you have your supporters, let's say.
00:01:39
◼
►
There are people out there who love to hear you stumble through trying to convert in your head between units that you're not familiar with.
00:01:45
◼
►
That's right, vindicated.
00:01:46
◼
►
I feel like I'm okay with you continuing to put in metric units if you use kind of the less common intervals of them.
00:01:53
◼
►
Like, you know, like I'm currently drinking about 25 centiliters of a non-alcoholic Guinness tonight.
00:01:58
◼
►
Yeah, like the only metric unit that I think most Americans are familiar with, and some people pointed this out to us,
00:02:03
◼
►
is I think we know centimeters, because as someone pointed out, like lots of our rulers have inches on one side and centimeters on the other.
00:02:08
◼
►
So I have a kind of intuitive understanding of centimeters, and then a meter, we'd just be like, well, it's like a yard.
00:02:12
◼
►
Right, right.
00:02:13
◼
►
But yeah, kilometers and kilograms, forget it.
00:02:15
◼
►
Nope, I got nothing on that.
00:02:17
◼
►
And then some of the people, I love how some of the people supporting your unit conversions couldn't help but snark about Fahrenheit, too.
00:02:22
◼
►
So they just had to get that jab in, even when they're trying to support you.
00:02:27
◼
►
Well, the thing of it is, I don't plan to relitigate this, because we have a lot of stuff to talk about,
00:02:31
◼
►
but we get everything wrong.
00:02:33
◼
►
Inches are wrong.
00:02:34
◼
►
Ounces are wrong.
00:02:35
◼
►
Pounds are wrong.
00:02:36
◼
►
Yards are wrong.
00:02:38
◼
►
Everything is wrong.
00:02:39
◼
►
Everything is wrong, except Fahrenheit.
00:02:41
◼
►
Fahrenheit for ambient air temperatures, we got right.
00:02:44
◼
►
It's a percentage of hot.
00:02:46
◼
►
And I will tell you, it is a fact that if you have half degrees on your goddamn thermostats, you failed.
00:02:53
◼
►
That is a failure of your whole system.
00:02:57
◼
►
Celsius for ambient air temperatures is trash.
00:03:02
◼
►
You can use it for cooking.
00:03:03
◼
►
I'm good with that.
00:03:03
◼
►
I'll use it for cooking.
00:03:04
◼
►
I mean, I don't because I'm an idiot and I'm an American, but I'm full support for cooking.
00:03:08
◼
►
If you want to take your internal body temperature in Celsius, whatever, that's fine.
00:03:12
◼
►
And anywhere else, fine, other than what is the temperature outside.
00:03:15
◼
►
I don't care if water is cold, y'all.
00:03:18
◼
►
I care if I'm cold.
00:03:20
◼
►
And so I think it's hilarious that all these people that make fun of Americans because of our stupid units and whatnot seem to have this inability to remember the numeral 30 or the number 32.
00:03:34
◼
►
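For reference, the conversion at the center of this argument is simple enough to write down. A minimal sketch, with water freezing at 32 °F / 0 °C (the number 32 mentioned above):

```python
def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

def c_to_f(c: float) -> float:
    """Convert degrees Celsius to Fahrenheit."""
    return c * 9 / 5 + 32
```

So 32 °F is 0 °C, boiling is 212 °F / 100 °C, and normal body temperature, 98.6 °F, is 37 °C.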
Like, all these people who are so smart and look at us with our base 10 units, me, me, me, me, and look at our fancy paper that makes sense, unlike yours, which doesn't.
00:03:41
◼
►
Really making a lot of friends among the metric supporters right now, I guess.
00:03:45
◼
►
I'm just saying, you guys have, you folks have everything right except Celsius.
00:03:50
◼
►
Celsius is trash.
00:03:51
◼
►
Yeah, so much for not going into it.
00:03:53
◼
►
Yeah, well, my bad.
00:03:54
◼
►
I can't help myself.
00:03:54
◼
►
I'm so triggered by this.
00:03:56
◼
►
This is the hill I am dying on.
00:03:57
◼
►
I don't, I know you two know this better than almost anyone.
00:04:00
◼
►
This is your metric supporter, everyone, still on this side.
00:04:03
◼
►
But that's the thing.
00:04:04
◼
►
I'll happily do my best to give you centimeters or meters or whatever the case may be.
00:04:09
◼
►
And kilogram, I'm terrible at kilograms, but I'll do my best.
00:04:11
◼
►
We already do like, don't we do like millimeters?
00:04:13
◼
►
But like when Apple gives the units in metric, we just read them in metric.
00:04:16
◼
►
We don't convert them to, I don't know, 16th of an inch.
00:04:19
◼
►
I mean, again, all our units are trash.
00:04:21
◼
►
Every single one of them with one and only one exception and only when used for ambient air temperatures.
00:04:28
◼
►
Celsius in cooking, I'm good.
00:04:30
◼
►
I'm happy to do it.
00:04:31
◼
►
That makes sense.
00:04:32
◼
►
Not when you're walking outside because I don't give a shit if water is cold.
00:04:35
◼
►
Anyway, other authentication options for John's Cloudflare apps.
00:04:40
◼
►
Tell us, John, what else could you have done?
00:04:41
◼
►
What a transition.
00:04:42
◼
►
Yeah, well, in case he needs some time to calm down, I'll talk about Cloudflare.
00:04:46
◼
►
Cloudflare authentication.
00:04:47
◼
►
The last episode, I was talking about how I factored out the sort of passkey-based account system
00:04:53
◼
►
for all my little personal apps, so I didn't have to, you know, copy and paste that from one app to another.
00:04:59
◼
►
Since then, I think I have like three of them now, three or four little apps.
00:05:04
◼
►
And it's nice to have that all factored out.
00:05:06
◼
►
But a couple of people wrote in for possible alternatives.
00:05:08
◼
►
One of them is another Cloudflare product called Cloudflare Access.
00:05:12
◼
►
I read their web pages a few times to try to figure out what it was.
00:05:15
◼
►
Here's the beginning of their explanation.
00:05:17
◼
►
Cloudflare Access provides visibility and control over who has access to your custom host names.
00:05:22
◼
►
You can allow or block users based on identity, device posture, or other access rules.
00:05:27
◼
►
And then it says the prerequisites, you have to have a Cloudflare Zero Trust plan in your SaaS provider account.
00:05:32
◼
►
And it goes on from there to name a lot of different proper nouns that you need to set up in your Cloudflare account.
00:05:36
◼
►
I'm like, yeah, okay, but what is this?
00:05:38
◼
►
What, you know, and I looked up Zero Trust and it's the, you know, the Wikipedia page says it's a design and implementation strategy for IT systems.
00:05:47
◼
►
The principle is that users and devices are not trusted by default, even if they are connected to a privileged network, such as a corporate LAN or whatever.
00:05:52
◼
►
So in the bad old days of computing and my jobby jobs, once you got on the corporate network,
00:05:57
◼
►
you were kind of in and you were implicitly trusted to do a bunch of stuff.
00:05:59
◼
►
But later in my jobby job, that became, that went out of fashion.
00:06:03
◼
►
And what became in fashion is a Zero Trust architecture, which is even if you're in the network, even if you're on the Wi-Fi,
00:06:09
◼
►
even if you're plugged into Ethernet on the corporate network, still, you don't have implicit access to anything because that's really bad for security.
00:06:15
◼
►
So it sounds to me like Cloudflare Access is kind of like, you know, the, if you do any kind of remote work for a company,
00:06:21
◼
►
you probably have some kind of work VPN that you have to connect to that gets you on the network.
00:06:24
◼
►
And getting you on the network doesn't get you access to anything,
00:06:26
◼
►
but you have to be on the network before you can then authenticate using your ID provider or whatever.
00:06:31
◼
►
So I don't think this is appropriate for my mini apps to have a Zero Trust type thing where I have to install an app to sort of,
00:06:38
◼
►
I'm not sure if it's a VPN, but either way, like it doesn't sound like the right system for me to be able to log into my little personal apps.
00:06:46
◼
►
But be aware that it is available on Cloudflare if you're interested in it.
00:06:49
◼
►
By the way, to quickly interrupt, the whole shtick behind Tailscale, former sponsor, potentially future sponsor, they are very big into Zero Trust architecture.
00:06:58
◼
►
Now, with that said, by default, you know, they kind of allow everything everywhere just to make things easier.
00:07:03
◼
►
But their actual like recommendations and the way it's really built is Zero Trust.
00:07:08
◼
►
And it can be extremely powerful and extremely cool.
00:07:11
◼
►
And the next suggestion was OpenID Connect or OIDC.
00:07:14
◼
►
This one I also used at work.
00:07:16
◼
►
OpenID Connect is a sort of a way, it's an open standard for authentication where you have like an identity providing server.
00:07:24
◼
►
And if you try to log into a service, it will bounce you to, or talk to through an API, the central identity provider.
00:07:32
◼
►
And that way, you can basically essentially make one account and be able to sign into all the corporate things.
00:07:37
◼
►
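As a rough sketch of what that bounce looks like in the OpenID Connect authorization code flow: the service redirects the browser to the identity provider's authorization endpoint with a handful of standard parameters. The issuer URL and client details below are hypothetical, not from any real deployment.

```python
from urllib.parse import urlencode

def build_auth_request(issuer: str, client_id: str, redirect_uri: str, state: str) -> str:
    # Standard parameters for the OIDC authorization code flow; the "openid"
    # scope is what makes this OIDC rather than plain OAuth 2.0.
    params = {
        "response_type": "code",
        "scope": "openid profile email",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,  # anti-CSRF value the service checks on the way back
    }
    return f"{issuer}/authorize?{urlencode(params)}"
```

The user authenticates once at the identity provider, which then sends a code back to the service's redirect URI; the service exchanges that code for tokens, which is how one account signs you into everything.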
Again, it makes a lot of sense for an enterprise.
00:07:39
◼
►
You know, the dream of single sign-on, as they used to call it back in the day; there are whole very large companies built up around providing this.
00:07:47
◼
►
And OpenID is an open standard.
00:07:50
◼
►
A couple of people suggested PocketID, which is a tiny little OpenID Connect identity provider that you can run yourself in a little host or a little Docker container.
00:07:59
◼
►
So you don't have to get one of those big commercial services.
00:08:02
◼
►
You can run this one thing over here, and that does your identity for all your other apps and you're good to go.
00:08:08
◼
►
Maybe it's a little PTSD from my job and working with Okta.
00:08:13
◼
►
I mean, I think Okta is actually pretty good.
00:08:15
◼
►
Okta was pretty good, all things considered.
00:08:17
◼
►
Yeah, it was one of the better vendors, but still, a lot of the stuff I did at work, I have bad associations with.
00:08:23
◼
►
And also, it's so heavyweight.
00:08:24
◼
►
And actually, what I have is kind of what I want.
00:08:27
◼
►
I don't actually want single sign on.
00:08:28
◼
►
I want individual accounts on individual services.
00:08:31
◼
►
Not that it matters.
00:08:32
◼
►
These are my tiny toy things that are going to have just my accounts, or maybe I'll put accounts on for family members or other people who want to take a peek at it.
00:08:39
◼
►
Like, it's just simpler for me to be like, these things are not connected in any way.
00:08:43
◼
►
There's no central ID provider.
00:08:44
◼
►
I don't have to, you know, do authentication over there and then authorization differently depending on, you know, like.
00:08:50
◼
►
It's just, it's too heavyweight for me.
00:08:52
◼
►
But it is cool that there is this pocket ID thing where if you do want to run this, you don't have to run one of these big, you know, giant commercial things.
00:08:59
◼
►
Or one of the free open source ones, you can run this little tiny thing, and it's just a little OpenID Connect provider just for you and your little services.
00:09:05
◼
►
Yeah, and building off of that, there's also TSIDP, which I think is written by Tailscale themselves.
00:09:10
◼
►
And this is very similar in spirit to PocketID from what I gather, but it's a little baby OpenID Connect, or OIDC, provider that you would run on your tailnet if you so choose.
00:09:19
◼
►
I haven't really looked into this, but it does seem like the sort of thing that I would do because I'm a weirdo like that.
00:09:24
◼
►
But anyways, lots of options for you, certainly for the public internet and also if you just wanted to do stuff on your tailnet too.
00:09:30
◼
►
And these work with passkeys and everything too.
00:09:32
◼
►
Like there's no, they're agnostic to the actual method you use to authenticate, whether it's a hardware key or a passkey or a password.
00:09:38
◼
►
They, you know, all these things support, I haven't looked into Pocket ID, but, you know, the standard supports all these things.
00:09:43
◼
►
All right, let's talk about TVs for a while.
00:09:46
◼
►
David Schaub writes with regard to per-pixel lighting control, per-pixel lighting is sufficient, but it isn't necessary.
00:09:52
◼
►
My 14-inch MacBook Pro has almost five dimming zones per inch, which David calls DZPI, question mark?
00:09:59
◼
►
And I find it quite good because my eyes cause blooming and my glasses cause blooming.
00:10:03
◼
►
Tech only needs to meet human perception.
00:10:05
◼
►
There's no reason to assume that per-pixel lighting is needed if laptops get to about 5,000 zones, desktops get to about 15,000 zones, and TVs get to about 20,000 zones.
00:10:14
◼
►
I'm sure my eyes would be perfectly happy.
00:10:16
◼
►
Yeah, I mean, there's the point of diminishing returns, but unfortunately, due to the nature of the LCDs that are in front of those dimming zones, there are still sort of worst-case scenarios that come up surprisingly often.
00:10:28
◼
►
Maybe not to 20,000 zones, but to be clear, these numbers are pushing the high end of what is available commercially.
00:10:35
◼
►
I'm not sure if there are many TVs that are 20,000 zones or higher.
00:10:38
◼
►
Obviously, the MacBooks are at about half of that 5,000, but they're not the highest-density screens.
00:10:43
◼
►
But starfields, if you watch science fiction, it's a black background with a bunch of pinpricks of white light, and it doesn't take that dense of a starfield to essentially require the TV to have every single backlight region turned on because there's at least one star in every region.
00:10:58
◼
►
And then you've just gone back to the worst-case Apple Studio display.
00:11:02
◼
►
The entire backlight is on, and your black levels are raised because the LCDs can't block all the light, right?
00:11:08
◼
►
You know, there are lots of other sources of blooming, but that's an argument against, you know, dynamic backlight regions because, yeah, there are lots of other sources of blooming, and we're talking about that in the next segment as well.
00:11:20
◼
►
You don't want to add to it.
00:11:21
◼
►
It's like, okay, and also the backlight is blooming.
00:11:23
◼
►
And to be clear, backlight blooming is so much better now than it used to be.
00:11:27
◼
►
It used to be incredibly awful, and the zones were big, and now the zones are tiny, and it's much better.
00:11:31
◼
►
So it is definitely getting way, way, way better.
00:11:33
◼
►
But per pixel lighting control, like, why fight this?
00:11:38
◼
►
Each individual pixel can light up.
00:11:39
◼
►
We have a technology that does it.
00:11:40
◼
►
There are tradeoffs, but, you know, the gap is narrowing with tandem OLED and QD OLED and everything.
00:11:46
◼
►
So I get where David Schaub's coming from.
00:11:49
◼
►
And in practice, most people are buying the LCD TVs as well just because they're cheaper.
00:11:52
◼
►
But if and when they sort of meet on price, the answer is clear.
00:11:57
◼
►
Per pixel lighting control, don't worry about dynamic backlights.
00:12:00
◼
►
I mean, even if it's just, like, because correctly using dynamic backlights, deciding which backlight should be on and how bright they should be,
00:12:07
◼
►
it's computationally expensive and complicated.
00:12:09
◼
►
And there can be situations where the backlight lags the action on the screen.
00:12:13
◼
►
And there can be situations where doing the backlight calculations adds a couple frames of lag when playing video games.
00:12:19
◼
►
It's just a complication that is not ideally necessary.
00:12:23
◼
►
But, yes, they are getting a lot better.
00:12:25
◼
►
All right. So speaking of TVs and backlights and things, Marco had made kind of an offhanded question, comment, whatever.
00:12:32
◼
►
It's less of a question, more of a comment.
00:12:34
◼
►
About microLED TVs last episode when we were talking about Sony and TCL.
00:12:39
◼
►
And, John, you had thoughts.
00:12:41
◼
►
Yeah, I tried to give Marco a lay of land, which is basically it's always five years in the future.
00:12:45
◼
►
And they're not affordable yet.
00:12:48
◼
►
And just entirely coincidentally, I didn't look this up, but my YouTube recommendation engine threw this at me shortly after we recorded the program.
00:12:56
◼
►
And it was a video from Robert Tate of the Hookup YouTube channel, which I had never heard of before.
00:13:01
◼
►
But he bought and tested a 157-inch AWOL, all caps, microLED TV, and TV in scare quotes.
00:13:10
◼
►
And he made a video about it.
00:13:11
◼
►
We'll put a link in the show notes.
00:13:12
◼
►
This doesn't really change anything that I told Marco, but it is a much more detailed view of, like, what's the state of things now?
00:13:19
◼
►
So, first of all, this is a $55,000 TV.
00:13:22
◼
►
Well, you know.
00:13:24
◼
►
And it's 157 inches, and it takes up the entire wall.
00:13:29
◼
►
Like all large microLED TVs, it's made up of individual panels, because the whole thing is, like, you have to put these red, green, and blue LEDs on these little boards.
00:13:39
◼
►
I think they have machines doing it still, like, and they're so tiny, and there's so many of them.
00:13:42
◼
►
And it's kind of like silicon chips, where the bigger you make it, if you screw it up, you've got to throw the whole thing out.
00:13:47
◼
►
So they always make these things out of modular panels.
00:13:49
◼
►
Even the ones that are, like, in stadiums and for rock tours, they're all modular panels.
00:13:52
◼
►
That's the deal.
00:13:53
◼
►
Because we can make a small modular panel.
00:13:55
◼
►
They're all interchangeable.
00:13:56
◼
►
If one of them breaks, you just replace the panel.
00:13:58
◼
►
And it's easier to make a small thing perfect than to make a giant thing perfect.
00:14:01
◼
►
These modules are 27 inches.
00:14:04
◼
►
They're 640 by 360 pixels at a 0.9 millimeter pixel pitch, which is how far the pixels are apart.
00:14:10
◼
►
And the general rule is if you multiply the pixel pitch by 10, you get the distance where you can't see the pixels anymore.
00:14:15
◼
►
So 0.9 millimeters means about 9 feet away, and you won't be able to see the pixels anymore.
00:14:22
◼
►
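That rule of thumb as stated on the show mixes units, pitch in millimeters but distance in feet, and it's simple enough to sketch exactly as described:

```python
def min_viewing_distance_feet(pixel_pitch_mm: float) -> float:
    """Rule of thumb from the show: pixel pitch in millimeters times 10
    gives the distance in feet beyond which individual pixels blend together."""
    return pixel_pitch_mm * 10
```

So the 0.9 mm pitch of these panels gives roughly 9 feet, matching the number above.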
It also, the little modules, the 27-inch modules, have magnetically attachable and adjustable screen things.
00:14:28
◼
►
So there's, like, the back part of it that has, like, the power supply and the signal and everything.
00:14:31
◼
►
And then the actual micro LEDs attach in a little panel with magnets.
00:14:36
◼
►
And they're adjustable because you're going to set up these panels.
00:14:40
◼
►
And Robert's setup was six panels wide by five panels tall.
00:14:44
◼
►
And, you know, you put the panel, you put the things in the wall.
00:14:46
◼
►
We'll talk about the installation in a second.
00:14:48
◼
►
But you have to line up all the panels so they're, you know, the gaps between them are as small as possible.
00:14:53
◼
►
And they're all facing the same direction.
00:14:55
◼
►
So one of them can't be, like, tilted, you know.
00:14:57
◼
►
They have to all be smooth and everything.
00:14:58
◼
►
And so they attach with magnets.
00:15:00
◼
►
And there's these tiny little adjustment screws to make, to, like, level them in three dimensions so they all line up.
00:15:06
◼
►
It's very fidgety.
00:15:07
◼
►
Anyway, that six panels by five panels, that's 3840 by 1800.
00:15:12
◼
►
So it's not even full 4K.
00:15:14
◼
►
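The arithmetic on the panel grid checks out: six 640-pixel-wide modules by five 360-pixel-tall modules, compared against 4K UHD.

```python
PANEL_W, PANEL_H = 640, 360   # pixels per 27-inch module
COLS, ROWS = 6, 5             # the six-wide by five-tall wall layout

total = (PANEL_W * COLS, PANEL_H * ROWS)
uhd_4k = (3840, 2160)

print(total)   # (3840, 1800): full 4K width, but 360 rows short of 4K height
```

One more row of panels would have overshot 4K height at 2,160 pixels exactly, but there was no room on the wall.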
And if you look at his setup, that's because there's not any more room in his room.
00:15:17
◼
►
Like, he couldn't, like, it fills the height, you know, there's, he couldn't put another row of panels in.
00:15:22
◼
►
And it's just, they're too big.
00:15:23
◼
►
And the whole thing with these is, like, making them smaller is the hard part.
00:15:26
◼
►
You can put one in a stadium real easy, you know, for a few hundred grand or whatever.
00:15:29
◼
►
But it's making the little micro-LED small.
00:15:31
◼
►
That's why when you see those, a 55-inch micro-LED was causing huge waves when it appeared at CES a couple years ago.
00:15:37
◼
►
It's like, how do they make one that's 55-inch?
00:15:39
◼
►
Because it's so small.
00:15:40
◼
►
God knows how much.
00:15:42
◼
►
No one wants, you can't have in your house a 157-inch TV.
00:15:46
◼
►
It's just too big.
00:15:47
◼
►
So, anyway, this doesn't do full 4K, but it's enough to see all the pixels in, like, a widescreen movie.
00:15:53
◼
►
Because widescreen movies already have the letterbox things above and below.
00:15:56
◼
►
And this just has smaller letterbox things above and below.
00:15:59
◼
►
The installation was fun.
00:16:01
◼
►
So, to get this installed, he needed three power drops to that wall and a dedicated 15-amp circuit.
00:16:07
◼
►
So, it's not even like you just plug it in once.
00:16:09
◼
►
Because, remember, they're 27-inch panels, and they all need to be powered.
00:16:13
◼
►
He needed 18 Cat6 Ethernet wires to run to all the different panels.
00:16:19
◼
►
And that was all run to a controller in a separate room.
00:16:22
◼
►
And then he just put three-quarter-inch plywood over the entire wall mounted to the studs that he's going to now mount all of the panels onto.
00:16:29
◼
►
You should watch the installation.
00:16:32
◼
►
It's quite a thing.
00:16:32
◼
►
It's bananas.
00:16:33
◼
►
This video was relatively long.
00:16:36
◼
►
I forget exactly how long it was, like 20, 30 minutes.
00:16:37
◼
►
But it was bananas watching this installation happen.
00:16:41
◼
►
Because he didn't literally bring that wall down to the studs for the most part, but he kind of sort of did.
00:16:47
◼
►
Like, there was a lot of work.
00:16:48
◼
►
Yeah, because he had to get all the Ethernet cables in there and all the power.
00:16:50
◼
►
And, like, the wall wasn't set up in this way before.
00:16:53
◼
►
And so, it just really needed to be ripped up.
00:16:54
◼
►
So, the performance, if you do that and you pay your $55,000, what do you get?
00:16:57
◼
►
So, the brightness is only 800 nits, but it's 800 nits on any size window.
00:17:03
◼
►
As in, you can light up one pixel at 800 nits or you can light up all the pixels at 800 nits because they're individual panels.
00:17:09
◼
►
And, in fact, when he did a full-screen test, it was actually 840 nits for the whole screen.
00:17:13
◼
►
So, that's the beauty of microLED: it doesn't have these sort of brightness problems of OLED, where they can't drive it that hard because the heat will cause burn-in and degrade the organic components and everything.
00:17:23
◼
►
So, OLEDs are, like, super bright in a tiny region, but when you light up the whole screen with white, it's way dimmer.
00:17:28
◼
►
No, that's not a problem here.
00:17:30
◼
►
Much less blooming because there's just simply less glass and stuff over the pixels.
00:17:37
◼
►
I think there's still something over the pixels, but looking at it, it was hard to see it.
00:17:41
◼
►
And blooming is caused by, you know, as our past thing that we just read from David Schaub was like, your own glasses, the glass on the screen, the glass that's on the front of your television, in front of the pixels, that causes blooming.
00:17:52
◼
►
And, of course, within your eyes.
00:17:54
◼
►
I'm sure we'll get some vision doctor to tell us the details of that in a future episode.
00:17:59
◼
►
But this has much less blooming because there is much less glass over the individual LEDs.
00:18:06
◼
►
I'm not sure if there's any.
00:18:07
◼
►
It just seems like if there is, it's very thin.
00:18:08
◼
►
Color performance, 83% BT-2020.
00:18:12
◼
►
So, it's not 100% BT-2020 like that RGB backlight thing we saw.
00:18:16
◼
►
99% P3, which is still pretty good.
00:18:18
◼
►
Almost perfect accuracy out of the box, which is nice.
00:18:20
◼
►
Didn't need to calibrate it or anything.
00:18:22
◼
►
Input lag is not great.
00:18:23
◼
►
It's like 34 milliseconds with no scaling and no processing and some weird bugs.
00:18:27
◼
►
But realistically, about 78 milliseconds with all the features you'd actually want to use to play video games, which is not a good number for a TV.
00:18:34
◼
►
But, you know, fine for casual play.
00:18:37
◼
►
This is not a TV.
00:18:39
◼
►
This is a display.
00:18:40
◼
►
As I said before, it's a TV in quotes.
00:18:41
◼
►
You buy this for $55,000 and what you get is an amazing screen.
00:18:45
◼
►
But you can't watch TV on it until you add something.
00:18:48
◼
►
It comes with this Novastar MX40 Pro display controller.
00:18:51
◼
►
But that doesn't make it a TV.
00:18:53
◼
►
That's just a big box in the rack that the 18 Ethernet cables run to that makes the display work.
00:18:58
◼
►
And, you know, you calibrate it and tell it where all the panels are.
00:19:01
◼
►
And so, it makes a full image.
00:19:02
◼
►
But if you want it to be like a TV, you have to buy something else.
00:19:05
◼
►
So, he added an HDFury VRROOM HDMI signal processor for an extra $1,000.
00:19:11
◼
►
Because why not?
00:19:12
◼
►
And this thing lets it have HDMI-CEC, ARC and eARC, Dolby Vision, HDR10, Dolby Atmos, DTS:X, a power button, a remote, and also a cool feature where you can do multi-view, where you can divide up multiple signals and put them on the giant screen.
00:19:29
◼
►
And then, finally, you plug this thing in with all its power drops and everything and you run it.
00:19:34
◼
►
What is the power usage like?
00:19:35
◼
►
Well, it's 71 watts for the Novastar controller thing.
00:19:39
◼
►
With the screen at 100% brightness, the display takes 1,200 watts.
00:19:44
◼
►
It's like a space heater, really.
00:19:48
◼
►
No, it's funny you say that.
00:19:50
◼
►
It's funny you say it.
00:19:50
◼
►
Let me interrupt you real quick.
00:19:51
◼
►
So, when we were pre-ice slash snowpocalypse, I was trying to compute, you know, what can I run off of the tailgate battery if we just need it in a pinch?
00:20:00
◼
►
What can I run off our gas generator, which we might talk about later, potentially?
00:20:03
◼
►
And I have, like, a little wattmeter thing where you plug that into the wall, you plug something into the wattmeter, it shows you how much power that thing uses.
00:20:10
◼
►
And we have a little baby space heater.
00:20:11
◼
►
It's a little tiny space heater.
00:20:12
◼
►
And that was something to the order of 1,300 watts.
00:20:15
◼
►
Yeah, what is the maximum?
00:20:16
◼
►
Is it 1,800 or something?
00:20:18
◼
►
Like, the maximum reasonable, like, 80% capacity?
00:20:20
◼
►
Something like that, yeah.
00:20:21
◼
►
Yeah, but generally, like, most space heaters are around 1,300 to 1,500.
00:20:25
◼
►
And it's like, yeah, you put one on a circuit, you don't put much else on that circuit, and that's it.
00:20:29
◼
►
Yeah, so at 100% brightness, it's 1,200 watts.
00:20:31
◼
►
At 30% brightness, it's still 950 watts.
00:20:34
◼
►
So it's not even the brightness it's killing.
00:20:35
◼
►
It is essentially driving all those little computers behind all the 27-inch panels.
00:20:39
◼
►
And this is the best part.
00:20:40
◼
►
When the screen is off, 600 watts.
00:20:46
◼
►
It's 600 watts; it's like a gaming PC at full tilt, when it's off.
00:20:51
◼
►
What is it doing?
00:20:53
◼
►
Well, the displays are off, but the tiny computers that are behind every single one of those little 27-inch panels are still running.
00:21:00
◼
►
They don't sleep them?
00:21:02
◼
►
Yeah, seriously.
00:21:03
◼
►
This is what he measured.
00:21:05
◼
►
600 watts when the screen is off.
00:21:07
◼
►
I mean, maybe they go to sleep eventually.
00:21:09
◼
►
But anyway, like he said, if you have 55 grand for a TV, maybe you don't care about the 20 cents per hour it costs to leave it in its off state.
00:21:17
◼
►
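Working backwards from that 20-cents-per-hour figure at a 600-watt off-state draw implies an electricity rate of about 33 cents per kWh; that implied rate, and the monthly projection, are my inference, not numbers from the video.

```python
OFF_STATE_WATTS = 600
COST_PER_HOUR = 0.20   # dollars, as quoted

# Implied electricity rate in dollars per kWh
implied_rate = COST_PER_HOUR / (OFF_STATE_WATTS / 1000)

# Cost of leaving it "off" around the clock for a 30-day month
monthly_cost = COST_PER_HOUR * 24 * 30

print(round(implied_rate, 2))   # 0.33
print(round(monthly_cost, 2))   # 144.0
```

So the off-state alone runs on the order of $144 a month at that rate, before you ever turn the screen on.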
You should watch the video, though.
00:21:18
◼
►
The picture is amazing.
00:21:20
◼
►
It is sad that he can't get full 4K on a TV screen that covers his entire wall and is 157-inch diagonal.
00:21:28
◼
►
The colors are amazing.
00:21:30
◼
►
The brightness is amazing.
00:21:31
◼
►
It's really cool, but it's 55 grand.
00:21:35
◼
►
And this is not like the highest end or whatever.
00:21:38
◼
►
I think the company's claim to fame is they make affordable ones.
00:21:41
◼
►
They found a good way to make these small components.
00:21:43
◼
►
But this is a really complicated, fancy DIY thing.
00:21:46
◼
►
And I feel like these type of screens are really aimed towards so you're a multi-bazillionaire and you have a giant mansion and you have a theater room and you want to put something the size of a movie screen in it.
00:21:56
◼
►
These look better than a projector.
00:21:58
◼
►
He spent a long time.
00:21:59
◼
►
I guess he does projectors on his channel.
00:22:00
◼
►
He spent a long time saying, look, here's how this compares to a projector.
00:22:04
◼
►
Projectors suck compared to this.
00:22:07
◼
►
There's no planet on which a projector can have the contrast and like bright room viewability because projectors, you need to be in the dark.
00:22:14
◼
►
Like, as he pointed out, a projector will never be darker than the screen you're projecting on.
00:22:19
◼
►
Like, how can it be?
00:22:20
◼
►
You're adding light to light that's already bouncing off a screen.
00:22:23
◼
►
So if all the lights are on in the room and you have a screen in front of you, whatever brightness that screen is, you're only going to add to the brightness by projecting light onto it.
00:22:31
◼
►
You'll never get good blacks, which is why you have to make it dark for projectors.
00:22:34
◼
►
And this doesn't have that problem at all.
00:22:35
◼
►
It's got perfect blacks, doesn't suffer from burn in, has 800 nits of brightness, which doesn't sound like a lot.
00:22:41
◼
►
But when it's 800 nits full field, that's still pretty decent.
00:22:43
◼
►
Yeah, this definitely looks cool.
00:22:45
◼
►
I would love to have one of these if I had a giant mansion and a giant wall where I could fit it on there.
00:22:50
◼
►
But there's no way I would pay 55 grand for something that's not full 4K.
00:22:53
◼
►
So for what it's worth, I have three 5K panels that I'm looking at right now.
00:22:59
◼
►
I have a six-bay Synology.
00:23:00
◼
►
I have some Ubiquiti Network media all hanging off of my whatever it is, tech power, the UPS that Marco likes.
00:23:08
◼
►
Cyber power.
00:23:10
◼
►
Cyber power, yeah.
00:23:11
◼
►
I couldn't remember the name of it.
00:23:12
◼
►
And from what I can tell on the little screen on the front of it, I'm using about 300 watts.
00:23:16
◼
►
And that's with all three of them lit up, the Synology chugging away.
00:23:19
◼
►
And that's half of what this thing uses when it's turned off.
00:23:22
◼
►
It's turned off sitting on there.
00:23:23
◼
►
It's incredible.
00:23:25
◼
►
And don't forget the other room you need to have with the server rack where you run the controller and the thing that turns it into a TV and you run all the cables to it and everything.
00:23:53
◼
►
Gusto is online payroll and benefits software built for small businesses.
00:23:58
◼
►
It's all-in-one, remote-friendly, and incredibly easy to use.
00:24:02
◼
►
So you can pay, hire, onboard, and support your team from anywhere.
00:24:06
◼
►
You can do unlimited payroll runs for just one monthly price with no hidden fees, no surprises.
00:24:12
◼
►
Gusto supports all sorts of things you might need.
00:24:15
◼
►
Automatic payroll tax filing, simple direct deposits, even things like health benefits, commuter benefits, workers' comp, 401k.
00:24:21
◼
►
Whatever it is, Gusto makes it simple and has options to fit nearly every budget.
00:24:26
◼
►
They have all these tools built right in, things like automated offer letters, onboarding materials.
00:24:30
◼
►
You can get direct access to certified HR experts if you need any support with any kind of tough HR situation.
00:24:36
◼
►
So it's a great platform for your payroll and benefits needs.
00:24:39
◼
►
Switching to Gusto is quick and simple.
00:24:41
◼
►
You can just import your existing data, get up and running quickly, and you don't pay a cent until you run your first payroll.
00:24:47
◼
►
Gusto has been trusted by over 400,000 small businesses and was named number one payroll software according to G2 just a few months ago in fall 2025.
00:24:56
◼
►
So try Gusto today at Gusto.com slash ATP and get three months free when you run your first payroll.
00:25:04
◼
►
That's three months of free payroll at Gusto.com slash ATP.
00:25:09
◼
►
That's Gusto spelled G-U-S-T-O dot com slash ATP.
00:25:14
◼
►
Thanks to Gusto for sponsoring our show.
00:25:16
◼
►
Let's talk about Claude Code and AI ethics.
00:25:23
◼
►
And I don't know, John, how you want to introduce this, how you want to talk about it.
00:25:26
◼
►
But what are we talking about here?
00:25:28
◼
►
Well, so I think the past couple episodes I've been talking about my experiments with Claude Code.
00:25:33
◼
►
And if you've listened to the show for a while, you would have heard in the past, I guess, year or so or two maybe.
00:25:38
◼
►
Well, when we first started talking about LLMs and ChatGPT and stuff, a lot of our early episodes about that were like on the topic of how did they get this training data?
00:25:50
◼
►
What is the legal and ethical and moral ramifications of training on the world's data and then charging people to use it?
00:26:00
◼
►
I think we talked about like the Miyazaki like image generation thing in ChatGPT.
00:26:05
◼
►
It's like that that product cannot exist and has no value without first stealing all the works of Hayao Miyazaki.
00:26:12
◼
►
But then it's like, OK, but is it stealing?
00:26:14
◼
►
Is it fair use?
00:26:16
◼
►
What are the you know, what are the courts going to say about it?
00:26:18
◼
►
We spent a long time talking about that.
00:26:20
◼
►
And I think if you've tuned in recently, you might be like, oh, they keep talking about AI, but they never talk about any of these other issues, including like what does it mean for people's jobs?
00:26:27
◼
►
And, you know, worker exploitation and what about, you know, the bubble and possible economic crashes and what about the environment and all the power of these things?
00:26:35
◼
►
We've talked about all these issues before, but it was a little while ago.
00:26:39
◼
►
And that's kind of that's kind of what we mostly focused on before the product before we actually started using the products and before the products sort of came to be broadly useful instead of like a technical curiosity.
00:26:48
◼
►
And then recently I was talking about Claude Code.
00:26:50
◼
►
So a lot of people wrote in to say, hey, what about all these what about all these other factors?
00:26:54
◼
►
Like, yeah, it's great that you enjoy Claude Code, but what about all the data they're still going to train it on?
00:26:58
◼
►
What about the environment?
00:26:59
◼
►
What about eliminating jobs?
00:27:00
◼
►
What about the bubble bursting?
00:27:02
◼
►
You know, what about these tools being used for propaganda and oppression?
00:27:06
◼
►
You know, like you guys should talk about that.
00:27:08
◼
►
And part of my answer is we have talked about that a lot in the past.
00:27:11
◼
►
But the second part is this item here, which is we should continue to talk about it.
00:27:15
◼
►
And what I wanted to say on this topic is that none of those things are resolved.
00:27:18
◼
►
Like, yeah, we did not come to any conclusions and it's not like the world figured it out.
00:27:25
◼
►
And now we have a set of laws and rules and everything is fine.
00:27:28
◼
►
And we're sure it's nope.
00:27:29
◼
►
None of that is resolved.
00:27:31
◼
►
Like there there are various court cases that are happening, but it remains a completely open question.
00:27:37
◼
►
How bad is it going to be for jobs?
00:27:39
◼
►
Are they going to work out legalities?
00:27:41
◼
►
Are the realities going to bankrupt things?
00:27:43
◼
►
Remember, we talked about people getting, like, three thousand dollars per book, but it was only because those books were stolen.
00:27:47
◼
►
But the judge otherwise said the training on books that you didn't steal is fair use.
00:27:50
◼
►
These laws, there's there are many cases still winding their way through the courts about various big companies like the New York Times and Disney or whatever suing the AI companies.
00:27:58
◼
►
We don't know how this is going to turn out.
00:28:00
◼
►
And that's just in the US.
00:28:01
◼
►
So, you know, we are talking about, you know, using these products and trying them out or whatever.
00:28:06
◼
►
But it's not as if all those other issues went away, nor have they been resolved.
00:28:11
◼
►
And by resolved — this is me personally speaking —
00:28:14
◼
►
I mean, like figured out in a way that is sustainable that we say, OK, we have a set of rules and guidelines around this.
00:28:21
◼
►
And we think if we continue along this path, people will be fairly compensated for their work.
00:28:26
◼
►
It won't disincentivize creation.
00:28:28
◼
►
It won't cause the end of humanity or whatever other, you know, like, but no, we haven't figured out the right set of those things.
00:28:33
◼
►
We don't know.
00:28:34
◼
►
It could be that letting all these companies take all the world's work and then sell it and become, you know, they could become huge, powerful companies that dwarf any company before known by man.
00:28:43
◼
►
And we let them steal all of our other content.
00:28:45
◼
►
And it turns out that was a terrible idea.
00:28:46
◼
►
We don't know yet.
00:28:47
◼
►
So I don't have any answers here.
00:28:49
◼
►
And this is not, you know, we're not going to, like, delve into any of these topics.
00:28:52
◼
►
But you'll be hearing more stories about how this court case turned out.
00:28:55
◼
►
And are these people unpaid?
00:28:56
◼
►
And what is happening with open AI?
00:28:57
◼
►
And, you know, like, has the bubble burst?
00:29:00
◼
►
And are they really using all those GPUs?
00:29:01
◼
►
And should they be stealing everything?
00:29:02
◼
►
And how, you know, how bad is it for the environment?
00:29:04
◼
►
And what is XAI doing in Memphis, putting giant gas generators outside their big data centers?
00:29:09
◼
►
Like, that is all still happening.
00:29:11
◼
►
And you may, I know there are a lot of people who are on the extreme side of this to say, okay, that means you should never use AI at all.
00:29:18
◼
►
And you should not use these products.
00:29:19
◼
►
And you shouldn't talk about them.
00:29:20
◼
►
And you should be an abolitionist.
00:29:22
◼
►
And you should throw yourself into the gears of the machine or whatever.
00:29:25
◼
►
Everyone's got to find out where they think is the right place to be along that spectrum.
00:29:29
◼
►
And I think we're all kind of figuring it out, especially since we don't actually know how bad it may be, right?
00:29:36
◼
►
We don't know what the consequences will be of them stealing all this information and selling it as a product.
00:29:44
◼
►
Everyone is kind of used to the set of rules that they were born with.
00:29:47
◼
►
Like, how does copyright work?
00:29:48
◼
►
And how does it work for music and books and movies?
00:29:50
◼
►
And, like, you just accept those as, like, that's just the way things are.
00:29:53
◼
►
But anything new is scary and weird.
00:29:55
◼
►
But if you look into any of the other things, like for television and music and writing and you name it,
00:30:03
◼
►
the systems we have now are very imperfect and are not particularly any more just or well-adjusted, you know —
00:30:10
◼
►
the perfect thing you have in your mind is just what you're used to.
00:30:12
◼
►
So that's my personal answer on this is that I don't know how this is going to turn out.
00:30:16
◼
►
But I am not an abolitionist and I'm not totally against looking into it because I recognize that it's not actually particularly different than all the other forms of technology that have come before it.
00:30:29
◼
►
And I hope we figure it out and I'm fighting for this to be something sustainable.
00:30:34
◼
►
But I don't think saying that we should never, ever use this stuff is the right solution for right position for me personally.
00:30:42
◼
►
I would go a little further and just say, like, we don't have a choice.
00:30:49
◼
►
This is here.
00:30:50
◼
►
This is happening.
00:30:52
◼
►
It's happening with or without us.
00:30:54
◼
►
And it's happening to all of us.
00:30:57
◼
►
You can decide to sit it out.
00:30:59
◼
►
But what that's going to mean is sitting out the technology business for the foreseeable future.
00:31:05
◼
►
And that's a decision you can make.
00:31:07
◼
►
Like, that's entirely up to you as a technology user.
00:31:12
◼
►
As a consumer, like, you can do that.
00:31:14
◼
►
But if you sit out AI, if you condemn the whole thing, and as John said, like, there's lots of tricky, like, moral arguments and legal arguments.
00:31:24
◼
►
Like, I could see why somebody would condemn the whole thing.
00:31:27
◼
►
But this is the tech business for the foreseeable future.
00:31:32
◼
►
You can be a part of it or you can fall behind.
00:31:34
◼
►
Like, if you are in the tech business and you want to continue to be in the tech business, you kind of have to get on board in some capacity.
00:31:43
◼
►
Or at least you have to be okay with it and recognize, like, this is a massive force that is transforming not only our industry, but many industries, you know, soon.
00:31:55
◼
►
Or currently.
00:31:55
◼
►
Like, it's here.
00:31:59
◼
►
It's going to keep being big.
00:32:00
◼
►
And no matter what happens in the short term, like, if, you know, if the bubble pops, you know, people are saying that.
00:32:07
◼
►
And look, there's a bunch of really, you know, massive amounts of money and ordering and speculation and finance tricks going around.
00:32:14
◼
►
But at the end of the day, if what we have right now is all it ever is, this is already incredibly useful and incredibly valuable to solve lots of people's problems, even if it never gets any better than what it is right now today.
00:32:29
◼
►
And that's not the case.
00:32:31
◼
►
Of course, it's going to get better.
00:32:32
◼
►
So this revolution is here.
00:32:35
◼
►
You can choose not to be part of it, but what that will mean is sitting out the technology business for the foreseeable future.
00:32:41
◼
►
Yeah, and you can choose to not be in the technology business because you're against it, just like choosing not to be in, like, the cattle business because you don't want to eat meat, right?
00:32:49
◼
►
That makes perfect sense.
00:32:51
◼
►
And also, I think there's an equally valid position to stay in the tech business and lobby against it or lobby for, you know, better fit, more fair rules, just as people in the technology industry currently lobby for more fair rules for, like, paying artists for their music streams on Spotify or whatever.
00:33:07
◼
►
Like, it's possible to stay in it and fight for a more just rollout of this technology.
00:33:13
◼
►
Like, I was, my parents were asking me about AI recently.
00:33:17
◼
►
I think it's finally penetrated the boomers.
00:33:19
◼
►
And they wanted me to explain the whole situation to them.
00:33:21
◼
►
And the analogy I use for them is one I've also used in a couple of emails that I've responded from ATP.
00:33:25
◼
►
It's like, it's a lot like the Industrial Revolution.
00:33:27
◼
►
When they asked, is it a net good or a net bad?
00:33:30
◼
►
The Industrial Revolution caused incredible short-term harms and long-term harms.
00:33:36
◼
►
Short-term, filled the cities with smog, exploited workers.
00:33:39
◼
►
People got cancer from all the chemicals it was putting into the environment.
00:33:42
◼
►
It was just terrible.
00:33:43
◼
►
Like, people who were out in the fields working — now, like, children are in factories with their faces covered in soot.
00:33:48
◼
►
You know, all of London is in a giant black cloud.
00:33:51
◼
►
People losing their jobs left and right because machines were replacing them, right?
00:33:54
◼
►
And then there were incredible long-term harms from the Industrial Revolution.
00:33:59
◼
►
Climate change.
00:34:00
◼
►
I mean, people were kind of picking up that that was probably going to be a thing very early on.
00:34:05
◼
►
It's like, ah, but it'll be fine.
00:34:07
◼
►
Look, the factories are getting cleaner.
00:34:09
◼
►
We have child labor laws now.
00:34:10
◼
►
I'm sure it's smooth sailing from here.
00:34:12
◼
►
And then, you know, like, ah, it seems like the planet's getting warmer.
00:34:17
◼
►
It's like, no, it's fine.
00:34:17
◼
►
It'll be fine.
00:34:18
◼
►
It wasn't fine, right?
00:34:20
◼
►
But on the flip side, electric lighting, indoor plumbing, like, the ability to manufacture things.
00:34:28
◼
►
Like, the Industrial Revolution had tremendous downsides that are actually very similar to the downsides of the computer revolution,
00:34:36
◼
►
the Internet revolution, and now the AI revolution.
00:34:38
◼
►
And that doesn't mean that it's like, you should just accept it because there's good parts of it, too.
00:34:42
◼
►
No, you have to fight against the bad parts.
00:34:44
◼
►
You have to get the pollution out of the cities.
00:34:46
◼
►
You have to come up with cleaner energy.
00:34:47
◼
►
You have to implement child labor laws.
00:34:49
◼
►
You have to get the lead out of the pipes.
00:34:51
◼
►
You have to not let people build with asbestos.
00:34:53
◼
►
Like, you have to do all of those things.
00:34:56
◼
►
And I feel like a lot of the people who are fighting against AI, quote, unquote, against AI, are simply trying to make it so that the children aren't in the factories.
00:35:04
◼
►
And then we stop putting soot into the air and chemicals into the rivers, right?
00:35:08
◼
►
And the tricky thing with AI is that because it's so new and weird, it's not as clean cut as, like, hey, don't put the runoff from your factory into our river.
00:35:18
◼
►
Like, that was straightforward, and it still took, like, decades to make any appreciable progress on because rich people are mean.
00:35:25
◼
►
And, like, we're in the same situation here, with billionaires controlling this AI, right?
00:35:28
◼
►
But it's not entirely clear, like, is it possible to build a sustainable society where they steal all of the – well, they train on the world's work and then profit from it but don't give anything back?
00:35:40
◼
►
Is that – can we build a sustainable system out of that?
00:35:42
◼
►
Or do we have to, you know, have lawsuits where people get paid for the stuff and they have to work?
00:35:47
◼
►
Like, we don't know.
00:35:48
◼
►
We don't know how it's going to turn out.
00:35:49
◼
►
It's not as straightforward as child labor and pollution, but it is very similar in that it's unclear whether it will be a net good or a net bad at any given point because I would say the Industrial Revolution was a net evil for a long time before it was like, okay, well, I guess having, you know, like, machines and factories and manufactured goods and, you know, all that other stuff is actually pretty good.
00:36:10
◼
►
But for a long time, it was really bad.
00:36:13
◼
►
And then all of a sudden, now with climate change, like, oh, we thought we were over the hill, but we're not.
00:36:18
◼
►
We did and we made some terrible choices and now the chickens are coming home to roost generations later, right?
00:36:22
◼
►
So who knows?
00:36:23
◼
►
AI could be like that as well.
00:36:24
◼
►
But all this is to say is that it's complicated.
00:36:26
◼
►
It's not resolved.
00:36:28
◼
►
And just because we talk about it in a nice way and say we had fun working with Claude Code and we're going to talk more about AI stuff now doesn't mean we think everything's fine.
00:36:34
◼
►
There's no reason to look anywhere else.
00:36:36
◼
►
There are no other issues.
00:36:37
◼
►
That's not true.
00:36:38
◼
►
All the issues remain and are difficult.
00:36:41
◼
►
I was going to say earlier, and then I think, Marco, you beat me to it, that it would be irresponsible for at least the three of us not to explore this because of both of our professions, both the clickety-clacking and the yakety-yacking in that, you know, it is expected that we are going to be more efficient with the work that we do on our apps.
00:37:00
◼
►
And I think that leveraging these tools is kind of table stakes, which is what Marco was saying.
00:37:05
◼
►
And additionally, for us to stick our heads in the sand and not be aware of this and try it at least some is, I think, irresponsible for the show.
00:37:14
◼
►
And I think that it is literally our jobs to at least kick the tires and see what this is all about.
00:37:21
◼
►
And like John said, and like Marco said, it doesn't make any of this, like, a perfect situation.
00:37:27
◼
►
It doesn't mean that we're excusing all of the ills.
00:37:31
◼
►
But, you know, the genie's out of the bottle.
00:37:34
◼
►
And we just got to – at this point, I feel like we have no choice but to at least participate in it to some degree.
00:37:41
◼
►
I don't see it as like we have no choice and there's no fighting it or whatever.
00:37:43
◼
►
I think there is fighting it.
00:37:44
◼
►
And I think it's equally our responsibility to continue to highlight the problems, right?
00:37:48
◼
►
Just as much as it's our responsibility to know about the technology just for our programming jobs and for our tech podcasting job, it is also our responsibility to continue to acknowledge and fight against the worst of the excesses and try to, you know, do the equivalent of stopping the factories from putting their runoff into the river and, you know, get the smog out of the skies and all the other stuff.
00:38:07
◼
►
Because there's so many equivalents to that in AI and there will be – like we don't even know what the long-term harms are going to be.
00:38:13
◼
►
Like we don't know what the climate change equivalent is, but we know the job loss, pollution, exploitation, propaganda – like the printing press.
00:38:20
◼
►
The printing press was a pretty good invention, but boy, did it empower propaganda.
00:38:23
◼
►
So many harms came from the printing press, but also good.
00:38:26
◼
►
But that doesn't mean to say, well, the printing press is here.
00:38:28
◼
►
There's no sense fighting it.
00:38:29
◼
►
Propaganda is going to be everywhere.
00:38:30
◼
►
No, fight it.
00:38:32
◼
►
But you're never – you're not going to get rid of – getting rid of the printing press is not the solution.
00:38:36
◼
►
But you know what I mean?
00:38:37
◼
►
Like anyway.
00:38:37
◼
►
How do you feel about CDRs?
00:38:40
◼
►
Well, luckily we don't have to worry about them because they're gone, baby.
00:38:44
◼
►
That's all I'm saying.
00:38:44
◼
►
No, it's worth us noting.
00:38:47
◼
►
And I think, John, you would put a link – this is kind of tangentially related – but you would put a link to – a friend of the show, even though I don't think he knows any of us.
00:38:55
◼
►
I don't think any of us know him.
00:38:57
◼
►
But Alec from Technology Connections put together a 90-minute video that is ostensibly about renewable energy, and I definitely learned a lot from this.
00:39:06
◼
►
And that's the first 60 minutes.
00:39:08
◼
►
And the last 30 minutes, continue with renewable energy, and then things take a turn.
00:39:15
◼
►
And I beg of you, if you can give 90 minutes to this, I genuinely think it is well worth your time.
00:39:23
◼
►
If you can only give 30, though, start when he's kind of rolling the credits, so to speak, and he shows all the Patreon supporters and all that, and just pick it up right there.
00:39:31
◼
►
And you will – well, I don't think you'll be disappointed.
00:39:35
◼
►
And you will either be shouting, hell yeah, or hopefully you will reconsider some of your priors.
00:39:40
◼
►
But this video is incredible.
00:39:42
◼
►
The only reason I put it in here is because it's a good – it's one of the things that I think about when considering the environmental impact of AI.
00:39:49
◼
►
As I've said on past shows, I'm perfectly happy to expend energy production of humanity on things that are useful for humanity.
00:39:57
◼
►
So I'm good with power plants heating my home, right, and providing lights because I think those are good things, right, keeping lights on the roads at night for safety and stuff.
00:40:09
◼
►
Yeah, crypto, stupid, but AI, valuable.
00:40:12
◼
►
Yeah, we generate power to do things that are useful for us.
00:40:15
◼
►
And it's a question of, okay, but AI is taking too much power and it's not useful enough or whatever.
00:40:19
◼
►
I view it mostly as the energy generation problem is a separate thing.
00:40:26
◼
►
No, we should not be burning coal.
00:40:27
◼
►
Yes, we should be using solar panels and renewables and we're doing a terrible job of that in our country and we apologize.
00:40:32
◼
►
But this video is good at addressing, like, technologically speaking, things actually do look vaguely hopeful.
00:40:39
◼
►
If you don't – not if you don't look – if you look at the U.S., things look miserable.
00:40:42
◼
►
But if you look at the rest of the world, renewables are kicking butt.
00:40:45
◼
►
And if you haven't been keeping up with this, this video from Technology Connections is a great overview of, like, just how good renewables have gotten.
00:40:51
◼
►
It was another one I forgot to get the URL for, but, like, it was a recent story that a wind farm in, like, Ireland or something,
00:40:56
◼
►
was being shut down.
00:40:58
◼
►
They had, like, 21 turbines.
00:40:59
◼
►
Oh, yeah, yeah, yeah.
00:41:00
◼
►
They had 21 turbines that had been put up in the 90s.
00:41:02
◼
►
They were shutting the wind farm down and they were replacing those 21 turbines with new ones.
00:41:06
◼
►
And every single new turbine they put in produces more power than all 21 of the old ones combined.
00:41:13
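As a back-of-the-envelope on that turbine claim: the show only states the 21-to-1 relationship, so the old turbine's 0.5 MW rating below is an assumed illustrative number, not a figure from the story.

```python
# Back-of-the-envelope on the wind farm upgrade described above.
# The 0.5 MW per-turbine rating is an illustrative assumption.
old_turbine_mw = 0.5
old_farm_mw = 21 * old_turbine_mw    # whole old farm: 10.5 MW

# "Every single new turbine produces more power than all 21 old ones combined,"
# so one new turbine is at least the whole old farm's output.
new_turbine_mw = old_farm_mw         # lower bound
per_turbine_gain = new_turbine_mw / old_turbine_mw
print(f"Each new turbine is at least {per_turbine_gain:.0f}x an old one")  # 21x
```

Whatever the real wattage of the 90s turbines, the claim implies at least a 21x improvement per turbine since then.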
◼
►
That's progress.
00:41:14
◼
►
That's incredible.
00:41:14
◼
►
Since the 90s.
00:41:15
◼
►
That is progress.
00:41:16
◼
►
Anyway, watch this video.
00:41:17
◼
►
The political rant at the end, I don't think we'll convince anybody that doesn't already agree with it.
00:41:21
◼
►
But if you do already agree with it, it can be cathartic.
00:41:24
◼
►
We are sponsored this episode by Masterclass.
00:41:27
◼
►
With Masterclass, you can learn from the best to become your best.
00:41:32
◼
►
It is always great to broaden your horizons and to learn new things and to pick up and hone your skills.
00:41:39
◼
►
And Masterclass really is an amazing gift either for yourself or for other people because you can give the gift of knowledge and get better at things.
00:41:47
◼
►
So you can do amazing things at work.
00:41:49
◼
►
For instance, you have a course called Win at Work with Professor Jeffrey Pfeiffer and his Power Playbook.
00:41:54
◼
►
You can apply CIA-tested tactics to everyday life with the art of intelligence.
00:41:59
◼
►
You can negotiate your next raise with lessons from Super Agent Rich Paul or FBI Negotiator Chris Voss.
00:42:04
◼
►
And then you can do other things like design your dream home on your budget with Joanna Gaines.
00:42:08
◼
►
Apply the principles of improv to your life with Amy Poehler.
00:42:11
◼
►
Develop good repeatable habits with Atomic Habits author James Clear and so much more.
00:42:17
◼
►
There's so much in Masterclass and their plans start at just $10 a month billed annually.
00:42:22
◼
►
You get access to over 200 classes taught by the world's best business leaders, writers, chefs, and so much more.
00:42:30
◼
►
So you can turn your commute or your workout into a classroom.
00:42:34
◼
►
There's audio-only modes.
00:42:35
◼
►
You can listen to Masterclass lessons anytime, anywhere.
00:42:38
◼
►
And so no matter what your schedule is, no matter where you are, they have apps to support you on your phone, laptop, your TV.
00:42:45
◼
►
So Masterclass goes with you wherever you can learn.
00:42:47
◼
►
There's no risk.
00:42:49
◼
►
Every new membership comes with a 30-day money-back guarantee.
00:42:51
◼
►
And memberships come with bonus class guides and downloadable content to help you get even more out of each lesson.
00:42:57
◼
►
Right now, our listeners can get an additional 15% off any annual membership at Masterclass.com slash ATP.
00:43:05
◼
►
That's 15% off at Masterclass.com slash ATP.
00:43:09
◼
►
Masterclass.com slash ATP.
00:43:13
◼
►
Thank you so much to Masterclass for sponsoring our show.
00:43:20
◼
►
All right, let's do some topics.
00:43:21
◼
►
And we have to re-enter Vision Pro Corner.
00:43:24
◼
►
And now I have a buddy.
00:43:25
◼
►
And so, John, did you watch both episodes of Top Dog?
00:43:30
◼
►
I watched the dogs.
00:43:31
◼
►
I've watched the dogs as well.
00:43:32
◼
►
That's the whole thing, right?
00:43:33
◼
►
It was just those two episodes?
00:43:34
◼
►
That's right.
00:43:35
◼
►
I thought we were getting like a sample, but it was just two and done, right?
00:43:37
◼
►
Yeah, I think it's like a sum total of around 30 minutes, give or take a little bit, if I recall correctly.
00:43:42
◼
►
But yeah, it's about the, what is it, Crufts or something like that?
00:43:46
◼
►
Yeah, so cruft is like the thing in your code where you've got some old junky code.
00:43:50
◼
►
You said you got a lot of Cruft in this code.
00:43:52
◼
►
It's CRUFTS.
00:43:54
◼
►
It is the world's biggest and oldest dog show.
00:43:56
◼
►
That's right.
00:43:57
◼
►
It takes place in Birmingham, I believe.
00:43:59
◼
►
And this was an immersive two-episode show about that dog show.
00:44:05
◼
►
And I know I've been through this a hundred times.
00:44:07
◼
►
Please just indulge me.
00:44:08
◼
►
2D is, you know, your television.
00:44:10
◼
►
3D is a rectangle that has depth.
00:44:13
◼
►
And immersive means not only is there depth, but you can actually look around and change your perspective, which you can't do in 3D.
00:44:20
◼
►
In 3D, you're just, you're, you're.
00:44:21
◼
►
Every time you say that, like, I bristle, because it's the wrong — it's fine.
00:44:26
◼
►
I am happy, John, for you to give me a better way.
00:44:28
◼
►
I know what you mean.
00:44:29
◼
►
I know what you mean.
00:44:30
◼
►
I would just phrase it in a different way.
00:44:32
◼
►
Well, I'll work on that.
00:44:34
◼
►
I'll take that as a homework assignment.
00:44:35
◼
►
But anyway, this was entertaining.
00:44:38
◼
►
I mean, I am not dog obsessed.
00:44:40
◼
►
I really enjoy my dog.
00:44:42
◼
►
I generally enjoy dogs as a point of order, but nothing about this.
00:44:47
◼
►
Like, I don't watch dog shows.
00:44:49
◼
►
So that I don't really care about.
00:44:51
◼
►
But this was, to my eyes, very clearly either directed or edited or both by someone who has only ever done 2D content and has no freaking clue how to do immersive content.
00:45:03
◼
►
And Marco and Ben Thompson, friend of the show, have banged the drum over and over again that you cannot cut.
00:45:10
◼
►
Cut or if you do, you have to do it extremely sparingly.
00:45:12
◼
►
We also spoke about this with regard to basketball.
00:45:15
◼
►
Generally speaking, I think that Marco and Ben are a little too far in one direction and that I think a cut here and there is fine.
00:45:24
◼
►
This, to me, was brutal.
00:45:27
◼
►
It was all the things that Marco and Ben are always talking about.
00:45:30
◼
►
It was a cut every few seconds.
00:45:31
◼
►
The camera moved a lot more than I think it should have because that's kind of, and I've said this before, I don't really ever get motion sick.
00:45:38
◼
►
But when the camera's moving, it kind of gives you the feeling.
00:45:42
◼
►
I really feel like this was basically 3D — it was created and cut such that it was more 3D than it was immersive.
00:45:52
◼
►
Literally speaking, it was immersive.
00:45:54
◼
►
As I turn my head, I can see different parts of the scene in front of me.
00:45:58
◼
►
But in terms of the way it was presented, it was basically 3D.
00:46:03
◼
►
There was very little that was, like, fun and cool that you could get by just turning your head to one side or the other.
00:46:10
◼
►
And that was really disappointing to me because I really love the immersive stuff.
00:46:14
◼
►
And I keep banging the drum of, like, the Metallica concert was incredible.
00:46:19
◼
►
I thought the NBA stuff was incredible.
00:46:21
◼
►
And that, you can look around and you have the time to look around and change your perspective
00:46:27
◼
►
and look at something that maybe the camera isn't wanting you to focus on, but you find interesting for whatever reason.
00:46:32
◼
►
This was none of that.
00:46:34
◼
►
And honestly, I was pretty disappointed by it.
00:46:36
◼
►
And by the way, the language I would use for this is small FOV versus large, small field of view versus large,
00:46:41
◼
►
but field of view in two dimensions, not just left and right, but also up and down.
00:46:44
◼
►
And yeah, I did notice that on this thing as well because in any Vision Pro, you know, quote-unquote immersive content,
00:46:50
◼
►
you can turn your head far enough where you see the edges of the thing.
00:46:53
◼
►
It's just that often the edges are so far out — the field of view is so large — that you're looking at, like, half a sphere
00:46:57
◼
►
or a quarter of a sphere.
00:46:58
◼
►
And so, really, you can turn your head way to the left, way to the right, way up, way down.
00:47:01
◼
►
You know, you can see your feet.
00:47:03
◼
►
You can see the sky.
00:47:04
◼
►
Sometimes there are even 360 ones, where the field of view is a perfect sphere and there's no edge at all.
00:47:09
◼
►
But usually there is an edge.
00:47:10
◼
►
And this thing, the edges, top and bottom, were real close.
00:47:14
◼
►
Like, you could look up and down a little bit, but you couldn't see the people's shoes.
00:47:18
◼
►
And you couldn't see the boom mic that was inevitably hanging overhead.
00:47:21
◼
►
Maybe that's why they cut the field of view.
00:47:23
◼
►
And in that respect, that's why you're saying, Casey, that felt more like watching a TV show type thing.
00:47:28
◼
►
Because in a TV show type thing, you can't look up and see the boom mic.
00:47:31
◼
►
And you can't look down and see the people's sneakers.
00:47:32
◼
►
You just see what they frame.
00:47:34
◼
►
And so this was much like that top and bottom.
00:47:36
◼
►
Left and right, there was more freedom.
00:47:37
◼
►
I felt like, was it 180 left and right?
00:47:39
◼
►
I don't know if it was 180.
00:47:40
◼
►
It was pretty big.
00:47:41
◼
►
Maybe, you know, I don't know.
00:47:43
◼
►
The field of view was pretty wide, horizontally, but narrow, vertically.
00:47:46
◼
►
The cutting didn't bother me at all.
00:47:49
◼
►
I thought the immersive environment was a good place to see dogs, because dogs are cute when you can see them in 3D.
00:47:54
◼
►
I would agree with that.
00:47:55
◼
►
I think whoever made this is not very good at making a stereotypical reality TV show thing, having watched a ton of reality TV.
00:48:03
◼
►
Like, this is just a gimme, man.
00:48:05
◼
►
It's like a dog show.
00:48:06
◼
►
There are personalities.
00:48:07
◼
►
You got to give people the villain edit and the hero and the underdog.
00:48:13
◼
►
You know, you've got to edit it together, knowing how it ends, to make it work.
00:48:16
◼
►
And they just... they're not very good at their job in terms of making it that sort of chewing-gum-for-your-mind kind of fluff reality show.
00:48:25
◼
►
Like, you don't have to be mean to the participants.
00:48:26
◼
►
You don't have to make fun of them or whatever.
00:48:27
◼
►
Just develop drama.
00:48:29
◼
►
And they tried, but it was so-so.
00:48:31
◼
►
The other thing I'll say is, like, the part I was most interested in, the flyball competition, was hurt so much by their poor filming of it, probably because they didn't have enough cameras.
00:48:43
◼
►
Where the dogs have to run and get a tennis ball and run back.
00:48:45
◼
►
I'd never seen that before.
00:48:46
◼
►
And I wanted to see it, and they wanted to show you the big dramatic finish.
00:48:49
◼
►
And it's like, you didn't, you basically didn't even get it on camera.
00:48:52
◼
►
I mean, you kind of did, but it was so far away.
00:48:54
◼
►
It's such an awkward angle.
00:48:55
◼
►
It's like, this is not how you film a sporting event.
00:48:57
◼
►
They also had a bunch of times where they had straight-up 2D content, and I thought that they handled it well.
00:49:02
◼
►
It was kind of, like, projected onto, like, a screen.
00:49:05
◼
►
I'm not doing it justice.
00:49:07
◼
►
Which parts are you talking about?
00:49:08
◼
►
Like, when they would have just straight-up 2D content that they would show you, you couldn't really move your head and change the perspective.
00:49:15
◼
►
It was just 2D content presented on a screen.
00:49:17
◼
►
You're never changing your perspective.
00:49:18
◼
►
You're just looking at different parts of the image.
00:49:20
◼
►
John, just bear with me here, all right?
00:49:22
◼
►
Changing your perspective means you'd be able to see around the back of something that you couldn't previously see, but you could never do that.
00:49:27
◼
►
All right, you can't change your particular focus or field of view or whatever.
00:49:30
◼
►
You're just looking at different parts of the FOV, yes.
00:49:33
◼
►
Anyway, the point is, there were moments, several moments, where they had just 2D footage, because I guess they didn't set up the immersive cameras or what have you.
00:49:40
◼
►
Like, for past years or just in the show?
00:49:42
◼
►
No, it was in the show.
00:49:42
◼
►
That's how flat it was that I don't think I even noticed this, because there's, like...
00:49:46
◼
►
And that's what I was going to say, is they did it really, really well.
00:49:49
◼
►
Like, it was presented in a way that it wasn't off-putting or anything like that.
00:49:53
◼
►
But ultimately, to your point, and what made me think of this, is that they didn't capture everything in immersive.
00:49:57
◼
►
Like, they whiffed.
00:50:00
◼
►
Oh, it was...
00:50:00
◼
►
It's still very cute, but...
00:50:02
◼
►
Another thing that I think is interesting, I mean, beyond your rant on the cutting.
00:50:05
◼
►
This is one...
00:50:05
◼
►
I'm not sure if this is just something I'm not used to, but just be aware this is a thing if you ever watch immersive video.
00:50:09
◼
►
In regular television or movies, they will do things like...
00:50:14
◼
►
They will have the camera at varying distances to people.
00:50:16
◼
►
So let's say someone is sitting in a chair for an interview in, like, a documentary.
00:50:19
◼
►
You'll see them, like, from the waist up, and they'll be talking and sitting in the chair.
00:50:23
◼
►
And then, like, for some dramatic moments, they'll have a close-up.
00:50:25
◼
►
And you'll be seeing them just, like, their head and shoulders filling the same size frame.
00:50:29
◼
►
That's common.
00:50:30
◼
►
You know, a wide shot, a close-up of, you know, of people or whatever.
00:50:33
◼
►
In immersive, because it's 3D, and because you, like, feel like you're there,
00:50:38
◼
►
like, you could reach out and touch them, it messes with your sense of scale.
00:50:42
◼
►
Like, they had this lady with, like, a dog on her lap.
00:50:45
◼
►
And it was, like, close-up, like, maybe, like, from her chest up with the dog on her lap.
00:50:50
◼
►
And then they cut back to, like, a more sort of normal shot where the person, in the Vision Pro,
00:50:57
◼
►
appeared the size the person really was.
00:50:59
◼
►
So if they're 5 feet tall, they'll look 5 feet tall in the Vision Pro.
00:51:01
◼
►
And suddenly, the dog looked, like, way smaller.
00:51:04
◼
►
Because before, I was closer to the dog, but I just thought it was a giant dog,
00:51:08
◼
►
because I had no way to, like, compare scale.
00:51:10
◼
►
Like, you see this when they get close to people's faces.
00:51:13
◼
►
It's unnerving because it looks like a gigantic dog or head or whatever really close to you.
00:51:20
◼
►
It doesn't, like, you know, it's just that I don't know if they can do the same thing with scale.
00:51:25
◼
►
Because when they go close up, you don't think, that person is huge.
00:51:27
◼
►
You just know that they're taking a tighter shot.
00:51:29
◼
►
But when you can see them 3D, I kept thinking the dogs were bigger than they were
00:51:33
◼
►
until I saw them, like, walking around.
00:51:34
◼
►
I'm like, oh, no, that dog is the size of a rat.
00:51:36
◼
►
It just looked huge because I was three inches away from it.
00:51:39
◼
►
And it was in 3D.
00:51:40
◼
►
And it felt like it was very strange.
00:51:43
◼
►
So maybe that's just me having to get used to stuff like that.
00:51:46
◼
►
But it is a weird experience to not know the size of things
00:51:49
◼
►
because they seem like they're right in front of you.
00:51:51
◼
►
But they're not.
00:51:52
◼
►
It's just a picture on a screen with a big FOV.
00:51:54
◼
►
All right, Marco, you did an experiment recently.
00:51:57
◼
►
Tell me about that, please.
00:51:58
◼
►
It's kind of an experiment in progress, actually.
00:52:02
◼
►
I think this might surprise you.
00:52:07
◼
►
Me, personally, or just both of us?
00:52:09
◼
►
I'll try to make this quick.
00:52:13
◼
►
I decided, you know, kind of in, like, the Cortex yearly theme arena.
00:52:18
◼
►
This year, I wanted to focus generally on efficiency in my life.
00:52:24
◼
►
Like, there's a bunch of, you know, just kind of cruft I've built up over the years,
00:52:29
◼
►
friction I've tolerated over the years,
00:52:32
◼
►
that I'm trying to just kind of find inefficiencies
00:52:36
◼
►
and consider whether I want to keep them that way or not.
00:52:40
◼
►
You know, I'm not, like, being ruthless, you know,
00:52:43
◼
►
trying to cut every single inefficiency,
00:52:45
◼
►
but just kind of be aware of inefficiencies,
00:52:47
◼
►
think about them, consider whether I want to make a change,
00:52:52
◼
►
and then if I want to, then, you know, then make a change.
00:52:54
◼
►
Can I jump in for a second here?
00:52:56
◼
►
I'm thinking in my head,
00:52:57
◼
►
what inefficiencies does Marco have in his life?
00:53:00
◼
►
And I'll be honest, not a lot is jumping to mind.
00:53:02
◼
►
Like, you don't strike me.
00:53:04
◼
►
Like, you're not the kind of person to tolerate inefficiencies,
00:53:08
◼
►
so I can't wait to hear.
00:53:09
◼
►
Do you have any ideas before he tells us what he's talking about?
00:53:11
◼
►
Casey, what inefficiencies are in Marco's life?
00:53:14
◼
►
Like, it seems like there's not a lot.
00:53:15
◼
►
I'm struggling.
00:53:18
◼
►
Like, there's more inefficiency in my life, for sure.
00:53:21
◼
►
I feel like...
00:53:22
◼
►
I mean, like, I have two houses.
00:53:24
◼
►
Like, that's never efficient.
00:53:25
◼
►
But, like, you have so much less stuff
00:53:30
◼
►
than I do, and it's like, you don't...
00:53:31
◼
►
Oh, I don't think that's necessarily the case.
00:53:33
◼
►
But just, it just, like, I don't...
00:53:35
◼
►
Well, I've got nothing.
00:53:37
◼
►
I just, your life always seems like
00:53:39
◼
►
the model of efficiency to me.
00:53:41
◼
►
Then yours must be a disaster.
00:53:44
◼
►
I've seen the basement of the beach house,
00:53:46
◼
►
and I wouldn't say that there's
00:53:49
◼
►
unreasonable things down there,
00:53:50
◼
►
but in the basement,
00:53:51
◼
►
there were a lot of things down there,
00:53:53
◼
►
as an example.
00:53:53
◼
►
You're thinking of things, though.
00:53:55
◼
►
That's not just the...
00:53:55
◼
►
Like, inefficiency is like, you know,
00:53:57
◼
►
every day I come home,
00:53:58
◼
►
I have to do this thing,
00:53:59
◼
►
because the knob sticks,
00:54:01
◼
►
and I just deal with it,
00:54:02
◼
►
and the lights don't turn on, right?
00:54:03
◼
►
And Marco doesn't tolerate that.
00:54:05
◼
►
He gets it fixed.
00:54:05
◼
►
That's the kind of thing
00:54:06
◼
►
I'm thinking about.
00:54:07
◼
►
Well, maybe there's things in your life
00:54:08
◼
►
that we don't know about,
00:54:09
◼
►
but it seems...
00:54:10
◼
►
I mean, your computer stuff
00:54:11
◼
►
and your technology stuff
00:54:11
◼
►
seems pretty efficient.
00:54:12
◼
►
Well, except...
00:54:13
◼
►
The only thing I can think of
00:54:15
◼
►
is something to do with cellular
00:54:16
◼
►
and travel and hotspots
00:54:19
◼
►
and things of that nature.
00:54:20
◼
►
That's fine.
00:54:20
◼
►
Of course, your mind would go there.
00:54:22
◼
►
Well, I mean, Marco and I...
00:54:24
◼
►
I mean, he can get rid of the Vision Pro,
00:54:25
◼
►
but he was the first of us
00:54:27
◼
►
to get rid of his Synology
00:54:28
◼
►
or just essentially stop using it
00:54:29
◼
►
because it didn't fit into his
00:54:30
◼
►
ruthlessly efficient technology life.
00:54:33
◼
►
You're giving me
00:54:33
◼
►
a lot more credit
00:54:34
◼
►
than I deserve for efficiency.
00:54:36
◼
►
Well, go ahead.
00:54:37
◼
►
Let's hear what you got.
00:54:38
◼
►
So, this is...
00:54:40
◼
►
I'm looking at a bunch
00:54:40
◼
►
of different things this year.
00:54:41
◼
►
I mean, I'm looking at Overcast,
00:54:43
◼
►
like, you know,
00:54:44
◼
►
just like, you know,
00:54:45
◼
►
the hosting stack of Overcast
00:54:47
◼
►
is, you know, pretty inefficient.
00:54:49
◼
►
I should have thought of that.
00:54:50
◼
►
You're right.
00:54:51
◼
►
That is inefficient.
00:54:52
◼
►
There's stuff there.
00:54:54
◼
►
You know, I'm looking into,
00:54:54
◼
►
you know, doing more with,
00:54:56
◼
►
you know, like S3 type things
00:54:57
◼
►
with Cloudflare,
00:54:58
◼
►
like, you know,
00:54:58
◼
►
trying, you know,
00:54:59
◼
►
different things there
00:55:00
◼
►
that I'm going to look at
00:55:01
◼
►
throughout the year.
00:55:03
◼
►
I mentioned very briefly
00:55:04
◼
►
last episode
00:55:05
◼
►
that I was looking at
00:55:07
◼
►
some like financial planning stuff
00:55:09
◼
►
and there was some inefficiencies
00:55:10
◼
►
in like how I was managing investments
00:55:12
◼
►
and I've been working on
00:55:13
◼
►
those recently.
00:55:14
◼
►
But one other thing
00:55:18
◼
►
that just has been bothering me
00:55:19
◼
►
a lot recently
00:55:28
◼
►
is how 1Password and Apple Passwords work together.
00:55:29
◼
►
Oh, goodness.
00:55:31
◼
►
You're still using 1Password?
00:55:33
◼
►
And I started looking at like,
00:55:35
◼
►
okay, what...
00:55:36
◼
►
Should I just go all in
00:55:37
◼
►
on 1Password
00:55:38
◼
►
or go all in
00:55:40
◼
►
on Apple Passwords
00:55:41
◼
►
and, you know,
00:55:43
◼
►
because they've been
00:55:43
◼
►
kind of fighting each other
00:55:44
◼
►
for years now.
00:55:45
◼
►
You should do one
00:55:46
◼
►
or the other for sure.
00:55:49
◼
►
But something else that
00:55:49
◼
►
has changed recently
00:55:51
◼
►
is that I've started
00:55:52
◼
►
using AI a lot.
00:55:53
◼
►
I've started using
00:55:55
◼
►
it many times a day for
00:55:58
◼
►
small questions.
00:55:59
◼
►
where I used to do
00:56:01
◼
►
more web searches
00:56:02
◼
►
or asking Siri
00:56:04
◼
►
I've been doing a lot
00:56:05
◼
►
more asking ChatGPT
00:56:06
◼
►
and that's been
00:56:07
◼
►
going very well.
00:56:08
◼
►
Another thing
00:56:09
◼
►
that happened recently
00:56:10
◼
►
is Google Gemini got a lot better.
00:56:13
◼
►
And when I actually
00:56:16
◼
►
compare factual answers
00:56:17
◼
►
where I know
00:56:18
◼
►
the correct factual answer
00:56:19
◼
►
between ChatGPT
00:56:21
◼
►
and Google Gemini,
00:56:22
◼
►
Gemini was getting it
00:56:24
◼
►
right a little bit more often.
00:56:25
◼
►
And there are certain
00:56:27
◼
►
things about Gemini
00:56:27
◼
►
that I don't like
00:56:28
◼
►
as much as ChatGPT
00:56:29
◼
►
but the sophistication
00:56:31
◼
►
of the model
00:56:31
◼
►
is hard to ignore.
00:56:32
◼
►
It is seemingly
00:56:34
◼
►
a little bit ahead
00:56:36
◼
►
in a lot of things.
00:56:36
◼
►
Not in everything
00:56:37
◼
►
and it has its own
00:56:38
◼
►
quirks and annoyances
00:56:40
◼
►
but it has been pretty good.
00:56:42
◼
►
And I also started
00:56:43
◼
►
thinking about AI
00:56:44
◼
►
and I'm like,
00:56:48
◼
►
Google has a really bright future
00:56:51
◼
►
when you look at
00:56:54
◼
►
how these AI
00:56:56
◼
►
things are advancing
00:56:57
◼
►
and what's the next step?
00:56:58
◼
►
the next step
00:57:00
◼
►
is figuring out
00:57:00
◼
►
their business model
00:57:03
◼
►
and their ad system.
00:57:07
◼
►
If you want to integrate more
00:57:07
◼
►
into things,
00:57:09
◼
►
you need data sources,
00:57:10
◼
►
you need deals
00:57:11
◼
►
with other vendors
00:57:13
◼
►
or things like
00:57:14
◼
►
business and directory data,
00:57:17
◼
►
mapping providers,
00:57:18
◼
►
different affiliate
00:57:19
◼
►
deals with everybody
00:57:20
◼
►
to try to monetize
00:57:21
◼
►
through that.
00:57:22
◼
►
If you want to be able
00:57:23
◼
►
to book hotels
00:57:24
◼
►
or whatever,
00:57:24
◼
►
you need that.
00:57:25
◼
►
And I started thinking,
00:57:27
◼
►
Google is in a really good
00:57:28
◼
►
position for this world.
00:57:30
◼
►
they had a bit
00:57:31
◼
►
of a slow start,
00:57:33
◼
►
but they can make the models.
00:57:34
◼
►
They have their own
00:57:35
◼
►
in-house models.
00:57:37
◼
►
OpenAI has that too.
00:57:37
◼
►
That's great.
00:57:38
◼
►
Google has their own
00:57:40
◼
►
cloud infrastructure.
00:57:43
◼
►
OpenAI doesn't really
00:57:44
◼
►
have much of that.
00:57:44
◼
►
Google makes
00:57:46
◼
►
their own chips
00:57:47
◼
►
OpenAI doesn't have that.
00:57:49
◼
►
And Google has
00:57:50
◼
►
all these existing services.
00:57:52
◼
►
They have data sources.
00:57:54
◼
►
They have the entire
00:57:56
◼
►
content of the web
00:57:57
◼
►
that they already have
00:57:58
◼
►
like in their possession
00:57:59
◼
►
through means that most
00:58:01
◼
►
people don't want to think about.
00:58:02
◼
►
And I know they try
00:58:02
◼
►
to separate it,
00:58:03
◼
►
but it's not that separate. And
00:58:04
◼
►
they have Google Maps
00:58:06
◼
►
and they recently
00:58:06
◼
►
just integrated Google Maps
00:58:07
◼
►
pretty well into Gemini.
00:58:09
◼
►
And so I was able
00:58:10
◼
►
to do something like,
00:58:11
◼
►
like I was in the city
00:58:12
◼
►
yesterday and I said,
00:58:13
◼
►
make me a walk.
00:58:14
◼
►
I want to walk
00:58:15
◼
►
about four miles.
00:58:16
◼
►
Get me a walk
00:58:18
◼
►
through Central Park
00:58:19
◼
►
and that's about
00:58:19
◼
►
four miles somewhat scenic.
00:58:21
◼
►
And it generated,
00:58:22
◼
►
Gemini generated a walking route,
00:58:24
◼
►
and it was stupid.
00:58:26
◼
►
I asked for a four mile walk
00:58:27
◼
►
and it said,
00:58:28
◼
►
here's a 3.1 mile walk
00:58:29
◼
►
and I'm like,
00:58:31
◼
►
I asked for a four mile walk,
00:58:33
◼
►
please make it four miles.
00:58:33
◼
►
And it's like,
00:58:34
◼
►
here's a 4.6 mile walk.
00:58:37
◼
►
Numbers are hard for them.
00:58:39
◼
►
But using Google Maps,
00:58:42
◼
►
it showed me the route
00:58:43
◼
►
and then it said,
00:58:45
◼
►
you can send this
00:58:47
◼
►
to your phone,
00:58:47
◼
►
to Google Maps.
00:58:48
◼
►
and it had a link
00:58:50
◼
►
so I could send it
00:58:51
◼
►
to someone else
00:58:51
◼
►
walking with me
00:58:52
◼
►
and I could send it to,
00:58:54
◼
►
I could save it
00:58:54
◼
►
if I wanted to.
00:58:55
◼
►
And that showed me,
00:58:57
◼
►
with this kind of integration,
00:58:58
◼
►
I think Google
00:59:00
◼
►
has a really big edge here
00:59:02
◼
►
over everyone else.
00:59:03
◼
►
That they have,
00:59:04
◼
►
all the integrations
00:59:05
◼
►
that they're going to have
00:59:06
◼
►
with all the data,
00:59:06
◼
►
I think they're going
00:59:08
◼
►
to really build a big edge
00:59:11
◼
►
in the near term
00:59:12
◼
►
and, you know,
00:59:13
◼
►
long term probably as well.
00:59:14
◼
►
One of the reasons
00:59:15
◼
►
why I was interested
00:59:15
◼
►
in sending the link
00:59:18
◼
►
from Google Maps
00:59:19
◼
►
through Gemini
00:59:20
◼
►
is for all these years,
00:59:22
◼
►
I have been bouncing
00:59:24
◼
►
between three different
00:59:25
◼
►
Maps apps on my phone.
00:59:26
◼
►
Google Maps, Waze, and Apple Maps,
00:59:29
◼
►
for different purposes.
00:59:31
◼
►
Waze was my driving directions app
00:59:32
◼
►
because it's good
00:59:32
◼
►
with navigating traffic
00:59:33
◼
►
on Long Island.
00:59:34
◼
►
Google Maps,
00:59:35
◼
►
I would look up business
00:59:36
◼
►
info and stuff like that.
00:59:38
◼
►
And Apple Maps,
00:59:39
◼
►
I would occasionally use
00:59:41
◼
►
for something else,
00:59:42
◼
►
maybe like walking directions.
00:59:44
◼
►
But Apple Maps
00:59:45
◼
►
drives me nuts
00:59:47
◼
►
on the phone
00:59:47
◼
►
in one particular way
00:59:49
◼
►
that when you have
00:59:50
◼
►
active direction following
00:59:51
◼
►
going on in Apple Maps,
00:59:53
◼
►
it takes over
00:59:54
◼
►
the lock screen
00:59:55
◼
►
of the phone
00:59:56
◼
►
in a way that seems to
00:59:57
◼
►
significantly
00:59:59
◼
►
and reliably
01:00:00
◼
►
interfere with unlocking the phone.
01:00:01
◼
►
Because it's like
01:00:02
◼
►
you unlock the phone
01:00:03
◼
►
so you can see,
01:00:04
◼
►
so you can navigate
01:00:05
◼
►
and then you struggle
01:00:07
◼
►
to go to the home screen.
01:00:08
◼
►
I can't find a way
01:00:09
◼
►
to turn that off,
01:00:10
◼
►
like the way Apple Maps
01:00:11
◼
►
takes over the home screen,
01:00:12
◼
►
the lock screen,
01:00:13
◼
►
And so that actually
01:00:14
◼
►
drives me crazy
01:00:15
◼
►
when I'm trying
01:00:16
◼
►
to use Apple Maps
01:00:16
◼
►
for navigation.
01:00:17
◼
►
So that alone
01:00:20
◼
►
is a big part of why I mostly
01:00:20
◼
►
didn't use it for that.
01:00:21
◼
►
And then Google Maps,
01:00:23
◼
►
I would look up businesses
01:00:23
◼
►
and it has by far
01:00:25
◼
►
the best business data
01:00:27
◼
►
Like, find me coffee shops
01:00:28
◼
►
near here.
01:00:29
◼
►
And then are they open
01:00:30
◼
►
What are their hours?
01:00:31
◼
►
Are they good?
01:00:33
◼
►
Give me ratings.
01:00:35
◼
►
Google destroys Apple Maps there.
01:00:36
◼
►
And Google also owns Waze, so
01:00:39
◼
►
they have access
01:00:39
◼
►
to all the same
01:00:39
◼
►
traffic data.
01:00:40
◼
►
And so I decided
01:00:41
◼
►
this is ridiculous
01:00:42
◼
►
to have three
01:00:43
◼
►
different Maps apps
01:00:44
◼
►
in the year of efficiency.
01:00:45
◼
►
I'm going to try
01:00:47
◼
►
to use just one
01:00:48
◼
►
Maps app for everything.
01:00:49
◼
►
And the only one
01:00:51
◼
►
that made sense
01:00:51
◼
►
to do that with
01:00:52
◼
►
is Google Maps.
01:00:52
◼
►
So for the last
01:00:54
◼
►
month or so,
01:00:55
◼
►
I've been using
01:00:56
◼
►
exclusively Google Maps.
01:00:59
◼
►
so when I had
01:01:00
◼
►
Gemini generate me
01:01:02
◼
►
walking directions,
01:01:02
◼
►
it was great.
01:01:03
◼
►
Send it to my phone.
01:01:07
◼
►
It was there.
01:01:08
◼
►
I could pull it up in Google Maps.
01:01:10
◼
►
And it's a walking
01:01:11
◼
►
route through Central
01:01:12
◼
►
Park that had like
01:01:13
◼
►
six different stops
01:01:14
◼
►
and they all
01:01:14
◼
►
transferred perfectly.
01:01:15
◼
►
And I could view it later.
01:01:18
◼
►
Like when I actually
01:01:18
◼
►
went on the walk,
01:01:19
◼
►
like the next day
01:01:20
◼
►
I viewed it,
01:01:21
◼
►
it was still there.
01:01:22
◼
►
It hadn't like
01:01:22
◼
►
fallen out of memory
01:01:23
◼
►
or whatever.
01:01:24
◼
►
Like the link
01:01:25
◼
►
still worked.
01:01:26
◼
►
It was still sent
01:01:26
◼
►
to my phone.
01:01:28
◼
►
It worked really well.
01:01:30
◼
►
And Google Maps
01:01:31
◼
►
for driving directions,
01:01:32
◼
►
there's a lot of
01:01:33
◼
►
small ways in which
01:01:35
◼
►
it works better than Waze.
01:01:38
◼
►
I know this is a
01:01:39
◼
►
confusing thing to say.
01:01:42
◼
►
There's a lot of
01:01:44
◼
►
ways that Google Maps
01:01:45
◼
►
is a little bit
01:01:45
◼
►
better than Waze.
01:01:46
◼
►
Not in every
01:01:47
◼
►
possible way,
01:01:47
◼
►
but in most ways.
01:01:48
◼
►
It's like a little
01:01:50
◼
►
a little bit more polished.
01:01:51
◼
►
It's also a lot faster.
01:01:55
◼
►
Like, one of the
01:01:57
◼
►
downsides of Waze
01:01:59
◼
►
is if you ask for
01:01:59
◼
►
directions somewhere,
01:02:00
◼
►
it kind of spins
01:02:01
◼
►
for a while before
01:02:02
◼
►
it generates them.
01:02:02
◼
►
I don't know why
01:02:04
◼
►
this is different
01:02:05
◼
►
because Google Maps
01:02:05
◼
►
and Waze are both
01:02:07
◼
►
owned by Google
01:02:07
◼
►
and both seem to
01:02:08
◼
►
operate on most of
01:02:08
◼
►
the same data.
01:02:10
◼
►
But asking for directions in Google
01:02:11
◼
►
Maps is almost
01:02:11
◼
►
instantaneous every time.
01:02:13
◼
►
That is not the
01:02:14
◼
►
case with Waze.
01:02:15
◼
►
I've used Waze
01:02:16
◼
►
enough on a good connection that I
01:02:17
◼
►
know Waze is slow
01:02:19
◼
►
to generate directions
01:02:19
◼
►
and Google Maps is not.
01:02:21
◼
►
Google Maps also
01:02:22
◼
►
had better voice guidance.
01:02:24
◼
►
It's better for
01:02:26
◼
►
charging station searches.
01:02:27
◼
►
It's better for
01:02:29
◼
►
all of those things.
01:02:31
◼
►
I'm very happy
01:02:32
◼
►
with Google Maps
01:02:33
◼
►
as my one mapping app.
01:02:34
◼
►
Again, it's not
01:02:35
◼
►
perfect, but I
01:02:36
◼
►
think it's way
01:02:37
◼
►
better than using
01:02:37
◼
►
three different
01:02:38
◼
►
apps for the same purposes.
01:02:39
◼
►
That was not the
01:02:40
◼
►
efficiency I expected
01:02:41
◼
►
That's a minor
01:02:41
◼
►
efficiency, but I'm
01:02:42
◼
►
glad to see you
01:02:43
◼
►
drop down from
01:02:43
◼
►
three to one.
01:02:46
◼
►
I occasionally will
01:02:47
◼
►
use Google Maps,
01:02:48
◼
►
although very
01:02:49
◼
►
typically, if
01:02:50
◼
►
anything, it's
01:02:50
◼
►
for business
01:02:51
◼
►
information, like
01:02:51
◼
►
you had said.
01:02:52
◼
►
I almost never
01:02:53
◼
►
use Waze, but I'm
01:02:54
◼
►
almost never in
01:02:55
◼
►
places that have
01:02:56
◼
►
And I actually
01:02:58
◼
►
find that Apple
01:02:58
◼
►
Maps is good for
01:03:00
◼
►
me for 98% of
01:03:02
◼
►
the things that I do.
01:03:05
◼
►
It wasn't good at first,
01:03:05
◼
►
for sure, but
01:03:06
◼
►
the last few
01:03:07
◼
►
years, it's been good.
01:03:09
◼
►
The issue with
01:03:10
◼
►
Apple Maps, again,
01:03:10
◼
►
besides my irritation
01:03:12
◼
►
with the way it
01:03:12
◼
►
takes over the
01:03:13
◼
►
lock screen and
01:03:14
◼
►
unlocking my
01:03:14
◼
►
phone a lot for
01:03:15
◼
►
some reason, my
01:03:16
◼
►
issue with Apple
01:03:17
◼
►
Maps is the same
01:03:18
◼
►
issue I have with
01:03:18
◼
►
Waze that if I'm
01:03:20
◼
►
going to only have
01:03:21
◼
►
one mapping app,
01:03:22
◼
►
neither of those is
01:03:23
◼
►
good enough to be it.
01:03:25
◼
►
You know, because
01:03:25
◼
►
Apple Maps really
01:03:27
◼
►
falls down with
01:03:28
◼
►
business information.
01:03:29
◼
►
I didn't even think to use it for
01:03:30
◼
►
business information.
01:03:31
◼
►
I always use
01:03:32
◼
►
Google Maps for that.
01:03:33
◼
►
It's not even
01:03:33
◼
►
like, I know
01:03:35
◼
►
that's part of
01:03:39
◼
►
all the food
01:03:40
◼
►
and the menus
01:03:40
◼
►
and that's a
01:03:41
◼
►
whole separate thing.
01:03:42
◼
►
And it's like,
01:03:43
◼
►
by the way, this
01:03:44
◼
►
point on the
01:03:45
◼
►
map brings up
01:03:46
◼
►
that info, but
01:03:46
◼
►
Apple, I mean,
01:03:47
◼
►
I guess Apple has
01:03:48
◼
►
some business data
01:03:48
◼
►
version of it, but it
01:03:49
◼
►
doesn't compare.
01:03:50
◼
►
Between Google Maps and
01:03:50
◼
►
Apple Maps, I use
01:03:51
◼
►
Google for all
01:03:52
◼
►
the reasons that
01:03:53
◼
►
you said, but
01:03:54
◼
►
I use Apple for
01:03:56
◼
►
driving because
01:03:57
◼
►
they're much more
01:03:58
◼
►
attractive and
01:03:59
◼
►
larger for my
01:04:00
◼
►
viewing of the
01:04:02
◼
►
road and the map, which
01:04:03
◼
►
I find really
01:04:04
◼
►
helps, although
01:04:04
◼
►
I will say that
01:04:05
◼
►
sometimes it
01:04:06
◼
►
still makes me do
01:04:07
◼
►
things and I
01:04:08
◼
►
get angry at it
01:04:08
◼
►
and switch back
01:04:09
◼
►
to Google Maps.
01:04:09
◼
►
So I'm down to
01:04:10
◼
►
two, but I'm
01:04:12
◼
►
glad to hear that
01:04:12
◼
►
you're down to one.
01:04:13
◼
►
All right, so
01:04:17
◼
►
again, thinking
01:04:18
◼
►
about the whole
01:04:19
◼
►
picture here,
01:04:23
◼
►
AI keeps getting
01:04:23
◼
►
better and it
01:04:24
◼
►
keeps getting
01:04:25
◼
►
bigger and it's
01:04:26
◼
►
taking over what
01:04:27
◼
►
used to be web
01:04:28
◼
►
search for me.
01:04:28
◼
►
I realized that,
01:04:33
◼
►
I had for years used DuckDuckGo.
01:04:36
◼
►
And one of the
01:04:38
◼
►
things I got
01:04:38
◼
►
hooked on with
01:04:39
◼
►
DuckDuckGo is
01:04:40
◼
►
the bang shortcut
01:04:42
◼
►
syntax, especially
01:04:44
◼
►
for searching Amazon.
01:04:45
◼
►
So when you have
01:04:46
◼
►
DuckDuckGo, you type
01:04:48
◼
►
exclamation point A and
01:04:50
◼
►
then whatever you
01:04:50
◼
►
want and that's a
01:04:51
◼
►
shortcut to Amazon.
01:04:52
◼
►
You can customize them.
01:04:53
◼
►
There's a bunch
01:04:54
◼
►
There's G for
01:04:55
◼
►
Google if you want
01:04:56
◼
►
to quickly jump
01:04:57
◼
►
over to Google
01:04:58
◼
►
There's a bang for
01:04:59
◼
►
eBay search.
01:05:00
◼
►
There's a bunch more.
01:05:02
◼
►
I mostly use the
01:05:03
◼
►
Amazon one a lot
01:05:04
◼
►
and occasionally
01:05:04
◼
►
the eBay one.
01:05:05
◼
►
And by the way,
01:05:06
◼
►
if you use Safari
01:05:06
◼
►
like I do, you
01:05:07
◼
►
can do that with
01:05:08
◼
►
a Safari extension
01:05:10
◼
►
where I just type
01:05:10
◼
►
an az space and
01:05:11
◼
►
then a search in
01:05:12
◼
►
my address bar and
01:05:13
◼
►
does the same thing.
01:05:13
◼
►
Yeah, there's a
01:05:14
◼
►
few different
01:05:14
◼
►
extensions, some of
01:05:15
◼
►
which I found
01:05:16
◼
►
But yeah, there's
01:05:17
◼
►
a few different
01:05:18
◼
►
extensions that do that.
01:05:19
◼
►
But Safari also
01:05:20
◼
►
supports DuckDuckGo
01:05:21
◼
►
as a built-in
01:05:22
◼
►
search engine.
01:05:22
◼
►
So it's very,
01:05:24
◼
►
very easy to
01:05:24
◼
►
switch to it.
01:05:25
◼
►
In the last, I
01:05:27
◼
►
think, two years
01:05:28
◼
►
or so, I've been using Kagi.
01:05:32
◼
►
I heard it on the
01:05:33
◼
►
talk show because
01:05:34
◼
►
the CEO has actually
01:05:35
◼
►
been on the talk
01:05:35
◼
►
show I think twice
01:05:36
◼
►
with John Gruber.
01:05:38
◼
►
And it sounded
01:05:39
◼
►
interesting to me and
01:05:41
◼
►
I've been using
01:05:41
◼
►
Kagi for the last
01:05:42
◼
►
couple of years.
01:05:43
◼
►
And it's been
01:05:44
◼
►
pretty good.
01:05:45
◼
►
Like as a web
01:05:47
◼
►
search, if you
01:05:47
◼
►
actually want web
01:05:49
◼
►
search, you want
01:05:50
◼
►
to like find a
01:05:51
◼
►
review of a new
01:05:51
◼
►
fridge, Google is
01:05:53
◼
►
garbage for that.
01:05:55
◼
►
It's just, it's
01:05:55
◼
►
been gamed like
01:05:56
◼
►
crazy and it's
01:05:58
◼
►
really hard to find
01:05:59
◼
►
good info on Google.
01:06:00
◼
►
And part of that is
01:06:01
◼
►
because everything's been gamed.
01:06:02
◼
►
Part of that's also
01:06:03
◼
►
because there's
01:06:04
◼
►
increasingly less and
01:06:05
◼
►
less good content to
01:06:06
◼
►
find because the web
01:06:08
◼
►
is dying in large
01:06:09
◼
►
part due to AI, but
01:06:10
◼
►
also it was dead
01:06:11
◼
►
before that.
01:06:11
◼
►
It was dead because
01:06:12
◼
►
social networks took over.
01:06:13
◼
►
So the web is
01:06:14
◼
►
really not in a
01:06:15
◼
►
great place right
01:06:16
◼
►
now content-wise,
01:06:17
◼
►
hasn't been for a while.
01:06:18
◼
►
And Google hasn't
01:06:19
◼
►
really seemingly
01:06:21
◼
►
worked very hard on
01:06:21
◼
►
their web search in
01:06:22
◼
►
that time either.
01:06:23
◼
►
So it's not like
01:06:24
◼
►
they're without
01:06:25
◼
►
fault here, but the
01:06:27
◼
►
web's in a tough
01:06:28
◼
►
spot and web search
01:06:29
◼
►
has therefore been declining.
01:06:30
◼
►
And that's part of
01:06:31
◼
►
why I have so much
01:06:32
◼
►
value these days in
01:06:33
◼
►
AI, because it's
01:06:35
◼
►
just, it's better at
01:06:37
◼
►
giving me information
01:06:37
◼
►
than web search has
01:06:39
◼
►
been for many years.
01:06:40
◼
►
So I've been using
01:06:41
◼
►
Kagi and I've been,
01:06:43
◼
►
you know, but I
01:06:43
◼
►
realized I've been
01:06:44
◼
►
less and less actually
01:06:45
◼
►
going to the search
01:06:46
◼
►
results page.
01:06:46
◼
►
Usually I'm using
01:06:47
◼
►
Kagi to just search
01:06:49
◼
►
on. So I thought, you
01:06:52
◼
►
know, maybe in
01:06:54
◼
►
the era of all of
01:06:56
◼
►
this AI stuff, I'm
01:06:57
◼
►
like, I'm trying
01:06:58
◼
►
Google Maps.
01:06:59
◼
►
I'm trying Gemini.
01:07:01
◼
►
Why don't I just
01:07:02
◼
►
try using Google as
01:07:03
◼
►
my search engine?
01:07:04
◼
►
What am I doing
01:07:05
◼
►
fighting all this?
01:07:06
◼
►
Like I fought, I
01:07:07
◼
►
fought against Google.
01:07:08
◼
►
I've raged against
01:07:09
◼
►
the Google machine for
01:07:10
◼
►
years, for decades
01:07:14
◼
►
like, let's give it a shot.
01:07:16
◼
►
So I have switched
01:07:18
◼
►
back to Google as
01:07:20
◼
►
my search engine.
01:07:21
◼
►
Oh, I don't like that at all.
01:07:23
◼
►
I never left.
01:07:24
◼
►
Although the AI
01:07:25
◼
►
search results at the
01:07:26
◼
►
top really annoy me.
01:07:27
◼
►
Yes, I know.
01:07:28
◼
►
I know how to get
01:07:28
◼
►
rid of them with the query parameter.
01:07:29
◼
►
Yeah, to be clear,
01:07:30
◼
►
like the AI summary
01:07:31
◼
►
on top of Google search
01:07:32
◼
►
results is horrendously bad.
01:07:34
◼
►
It's very bad.
01:07:34
◼
►
Sometimes it's...
01:07:35
◼
►
it's so often
01:07:37
◼
►
comically wrong.
01:07:38
◼
►
And when I say
01:07:40
◼
►
Gemini is good, that's
01:07:41
◼
►
not what I'm talking about.
01:07:42
◼
►
The worst part about
01:07:43
◼
►
them, though, is
01:07:43
◼
►
that they are
01:07:44
◼
►
at the top of the
01:07:45
◼
►
Google search results.
01:07:45
◼
►
Everyone in my life,
01:07:47
◼
►
anytime they quote
01:07:48
◼
►
unquote look up
01:07:48
◼
►
something on their
01:07:49
◼
►
phone, I have to ask
01:07:50
◼
►
them to just scroll
01:07:51
◼
►
past the AI search results.
01:07:53
◼
►
I'm like, no, I just
01:07:54
◼
►
want to know the
01:07:55
◼
►
actual answer.
01:07:55
◼
►
The AI search results
01:07:56
◼
►
at the top of Google
01:07:57
◼
►
are hilariously bad.
01:07:59
◼
►
They're like, I think
01:08:00
◼
►
Google is giving AI a
01:08:01
◼
►
bad name with those.
01:08:02
◼
►
I honestly think
01:08:03
◼
►
Google should fix this. They
01:08:04
◼
►
should make
01:08:05
◼
►
them use a better
01:08:05
◼
►
model, though I
01:08:06
◼
►
kind of understand why
01:08:07
◼
►
they're so bad.
01:08:08
◼
►
Yeah, because like the
01:08:09
◼
►
scaling of that would
01:08:10
◼
►
be incredible, right?
01:08:11
◼
►
Like they don't want
01:08:12
◼
►
to get like it's too
01:08:13
◼
►
expensive to give
01:08:13
◼
►
everyone a good
01:08:14
◼
►
AI search result, but
01:08:15
◼
►
like I feel like almost
01:08:15
◼
►
like they should go back
01:08:16
◼
►
to the old days where
01:08:17
◼
►
they just had like canned
01:08:17
◼
►
responses to common
01:08:18
◼
►
queries that they would
01:08:19
◼
►
actually like look up
01:08:20
◼
►
and have like, you know,
01:08:21
◼
►
just the dumb old sort
01:08:22
◼
►
of like hard coded.
01:08:23
◼
►
If someone asked for
01:08:24
◼
►
this thing, run this
01:08:26
◼
►
thing instead of just
01:08:27
◼
►
like, oh, chuck it to
01:08:27
◼
►
Gemini, but one of our
01:08:28
◼
►
tiny little models and
01:08:29
◼
►
it will give total BS.
01:08:30
◼
►
So it just annoys me that
01:08:32
◼
►
it's at the top.
01:08:33
◼
►
You have to scroll past
01:08:33
◼
►
and I forget what the
01:08:34
◼
►
query parameter is that
01:08:35
◼
►
you can put into your
01:08:36
◼
►
Google search query.
01:08:36
◼
►
So it doesn't show
01:08:37
◼
►
that, but it's just
01:08:38
◼
►
such a hassle because
01:08:39
◼
►
it's not so much me.
01:08:40
◼
►
I can scroll past it.
01:08:41
◼
►
It's everyone else in
01:08:42
◼
►
the world who now
01:08:44
◼
►
quote unquote looks
01:08:44
◼
►
something up on their
01:08:45
◼
►
phone and they read the
01:08:47
◼
►
thing at the top of the
01:08:47
◼
►
Google search results.
01:08:48
◼
►
And I say, no, keep
01:08:49
◼
►
scrolling, keep scrolling.
01:08:51
◼
►
I really think Google's
01:08:52
◼
►
doing way more harm than
01:08:53
◼
►
good for both themselves
01:08:55
◼
►
and for the
01:08:56
◼
►
perception of AI with
01:08:57
◼
►
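For reference, the parameter John can't recall here is probably `udm=14`, which is widely reported to make Google return the plain web-results tab with no AI Overview. Treat the parameter name as an assumption: it is undocumented and Google could change or remove it at any time. A minimal sketch of tacking it onto a search URL:

```shell
# Sketch: building a Google search URL with the (undocumented, assumed)
# "udm=14" parameter, reported to request web-only results with no AI
# Overview. Google could change or drop this parameter at any time.
google_web_only() {
  # Crude URL encoding for illustration: spaces become "+".
  q=$(printf '%s' "$1" | tr ' ' '+')
  printf 'https://www.google.com/search?q=%s&udm=14\n' "$q"
}

google_web_only "how tall is everest"
```

Some browsers also let you save this as a custom search-engine template so every query gets the parameter automatically.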
I mean, well, I was
01:08:59
◼
►
going to say the good
01:09:00
◼
►
news was maybe not the
01:09:01
◼
►
good news, but like as
01:09:02
◼
►
the cost of inference goes
01:09:03
◼
►
down, presumably those
01:09:04
◼
►
answers will get better-ish,
01:09:06
◼
►
but I will never
01:09:07
◼
►
accept them as the
01:09:08
◼
►
result of quote unquote
01:09:09
◼
►
looking something up.
01:09:10
◼
►
You didn't look something up.
01:09:11
◼
►
You just took a stew of
01:09:12
◼
►
words that came out of a
01:09:14
◼
►
probability machine and
01:09:15
◼
►
read it back.
01:09:15
◼
►
If you're going to take
01:09:16
◼
►
the time to type
01:09:18
◼
►
something into Google,
01:09:19
◼
►
look it up at a source.
01:09:20
◼
►
So get some kind of
01:09:21
◼
►
source, just source it
01:09:22
◼
►
from somewhere, not the
01:09:24
◼
►
amalgam of all the world's
01:09:25
◼
►
information run through a probability machine.
01:09:29
◼
►
So anyway, tell us how
01:09:31
◼
►
you really feel, John.
01:09:32
◼
►
So, all right, I'm trying
01:09:34
◼
►
out Google as my search engine.
01:09:35
◼
►
And so far,
01:09:38
◼
►
I've said it that way a couple of times,
01:09:39
◼
►
I've barely noticed any
01:09:41
◼
►
difference because I the
01:09:43
◼
►
first thing I did was, oh,
01:09:45
◼
►
let me figure out a way to
01:09:46
◼
►
get my Amazon shortcuts to work.
01:09:49
◼
►
And so, you know, there's
01:09:50
◼
►
these extensions.
01:09:51
◼
►
And then something else
01:09:55
◼
►
happened a few days ago.
01:09:56
◼
►
I saw somebody filling out
01:10:00
◼
►
a form and I noticed
01:10:03
◼
►
their autofill worked a
01:10:05
◼
►
hell of a lot better than mine.
01:10:08
◼
►
Every time I place an
01:10:10
◼
►
order for something, you
01:10:11
◼
►
know, Safari will offer me
01:10:13
◼
►
like, you know, to fill in the
01:10:14
◼
►
information and it's it's
01:10:17
◼
►
Oh, you switched to
01:10:17
◼
►
Chrome, didn't you?
01:10:18
◼
►
Yeah, that's what I was
01:10:20
◼
►
The autofill is a little bit
01:10:22
◼
►
better sometimes.
01:10:22
◼
►
I am trying Chrome.
01:10:26
◼
►
You're going to have an
01:10:26
◼
►
Android phone before the end
01:10:27
◼
►
of the year.
01:10:27
◼
►
No, that's definitely not
01:10:28
◼
►
going to happen.
01:10:29
◼
►
But I'm not so sure.
01:10:30
◼
►
So the
01:10:33
◼
►
final shoe here is that I
01:10:34
◼
►
have tried switching to Chrome.
01:10:35
◼
►
I'm only like a day in.
01:10:37
◼
►
You're becoming like you're
01:10:38
◼
►
becoming more like me.
01:10:39
◼
►
You're using Google Maps,
01:10:40
◼
►
using Chrome, Google Search.
01:10:42
◼
►
I've been using those things
01:10:44
◼
►
So here's here's the thing
01:10:45
◼
►
about Chrome.
01:10:45
◼
►
Chrome is as annoying as
01:10:49
◼
►
Google has always been with
01:10:51
◼
►
Chrome, with, you know, the
01:10:53
◼
►
very heavy-handed pushing of
01:10:54
◼
►
logins for everything, you know.
01:10:56
◼
►
And so, yeah, you know what?
01:10:57
◼
►
I logged in.
01:10:58
◼
►
I'm doing it the way they want
01:11:00
◼
►
me to do it the same way.
01:11:01
◼
►
Like if you want to use Apple
01:11:03
◼
►
products, you can like rage
01:11:04
◼
►
against the machine and like,
01:11:05
◼
►
you know, not sign into an
01:11:06
◼
►
Apple ID and like not use the
01:11:08
◼
►
You can do it.
01:11:09
◼
►
It's just harder.
01:11:10
◼
►
And again, my thing here is
01:11:12
◼
►
like I have been fighting this
01:11:14
◼
►
fight for so long.
01:11:15
◼
►
No one cares.
01:11:17
◼
►
It's not affecting anyone but me.
01:11:19
◼
►
Meanwhile, when I see certain
01:11:22
◼
►
things, the way they work in Chrome...
01:11:24
◼
►
Look, I've been using Safari
01:11:26
◼
►
full time on all my devices for
01:11:28
◼
►
well over 10 years.
01:11:30
◼
►
And I have to bounce over to
01:11:33
◼
►
Chrome pretty often because
01:11:34
◼
►
whatever I'm using doesn't work
01:11:37
◼
►
in like some like, you know,
01:11:38
◼
►
oh, I have to fill out like
01:11:39
◼
►
some state tax website thing.
01:11:41
◼
►
Oh, it doesn't work right in Safari.
01:11:43
◼
►
First, I try.
01:11:45
◼
►
OK, well, what if I disable all
01:11:46
◼
►
my content blocker?
01:11:47
◼
►
Maybe it's my fault.
01:11:49
◼
►
And it still doesn't work
01:11:50
◼
►
because it just doesn't work in Safari.
01:11:51
◼
►
And you can make an argument
01:11:53
◼
►
for like, is this, you know,
01:11:54
◼
►
is this people not following
01:11:56
◼
►
the standards?
01:11:56
◼
►
Is it Safari not following the standards?
01:11:58
◼
►
Is it web developers not
01:11:59
◼
►
testing things right?
01:12:00
◼
►
And the answer is yes, it's
01:12:01
◼
►
all of those things.
01:12:02
◼
►
But my problem is that sites
01:12:05
◼
►
don't work in Safari sometimes
01:12:06
◼
►
and I have to go redo the
01:12:07
◼
►
same action in Chrome.
01:12:08
◼
►
that's Apple's problem to solve.
01:12:10
◼
►
I don't want to make it my problem.
01:12:11
◼
►
And whatever the reason, like
01:12:13
◼
►
you could whatever, whoever's
01:12:15
◼
►
at fault, it's Apple's problem.
01:12:16
◼
►
Why am I continuing to battle
01:12:19
◼
►
this unless there's really,
01:12:21
◼
►
really good value there?
01:12:22
◼
►
OK, so if Safari is way
01:12:24
◼
►
better than Chrome in a bunch
01:12:25
◼
►
of other ways, then I'll take it.
01:12:27
◼
►
I will I will fight.
01:12:28
◼
►
I'll keep fighting that fight.
01:12:29
◼
►
But whenever I pop over to
01:12:32
◼
►
Chrome, I mean, I notice a
01:12:33
◼
►
couple of things.
01:12:33
◼
►
Number one, everything works.
01:12:35
◼
►
Number two, it's so fast at everything.
01:12:40
◼
►
Now, part of that is, you
01:12:43
◼
►
know, I think there's fewer
01:12:44
◼
►
animations and stuff.
01:12:44
◼
►
It also still looks the way web
01:12:46
◼
►
browsers should look, unlike
01:12:47
◼
►
Liquid Glass.
01:12:48
◼
►
But OK, everything in Chrome is fast.
01:12:51
◼
►
Everything opening and closing
01:12:54
◼
►
tabs is faster.
01:12:55
◼
►
Loading websites, for some reason.
01:12:59
◼
►
Everything is faster in
01:13:01
◼
►
Chrome, and not by a little bit.
01:13:02
◼
►
Switching from Safari to
01:13:04
◼
►
Chrome feels like when I
01:13:06
◼
►
switched from Intel to Apple
01:13:08
◼
►
Silicon in terms of
01:13:09
◼
►
performance.
01:13:09
◼
►
It's like everything is just
01:13:11
◼
►
baseline faster by a lot.
01:13:14
◼
►
And also 1Password works a
01:13:20
◼
►
thousand times better in Chrome
01:13:21
◼
►
than in Safari: logging into
01:13:24
◼
►
websites, autofilling forms.
01:13:27
◼
►
It's so good and so fast.
01:13:30
◼
►
I have been wasting years of my
01:13:34
◼
►
life typing my name and address
01:13:36
◼
►
into web forms that didn't
01:13:37
◼
►
autofill correctly and filling
01:13:39
◼
►
out password forms because it
01:13:41
◼
►
didn't autofill my password or
01:13:42
◼
►
having to paste in my 2FA code
01:13:44
◼
►
because that didn't autofill.
01:13:46
◼
►
Chrome just works so much better.
01:13:49
◼
►
And 1Password works so much
01:13:51
◼
►
better in Chrome.
01:13:52
◼
►
And it's all so fast.
01:13:56
◼
►
So that's what I'm trying.
01:13:58
◼
►
I'm trying Chrome as my default
01:14:00
◼
►
browser on my Macs.
01:14:01
◼
►
I don't know.
01:14:02
◼
►
I downloaded the iOS version of Chrome.
01:14:04
◼
►
I don't know if I'm going to stick
01:14:06
◼
►
with that, but I wanted to give it a try.
01:14:07
◼
►
No, don't bother with that.
01:14:08
◼
►
You're not missing anything
01:14:10
◼
►
It's the same engine; that's the thing.
01:14:12
◼
►
Well, but there's, you know,
01:14:13
◼
►
there'll be other benefits if,
01:14:15
◼
►
you know, sync and everything are
01:14:16
◼
►
good, which I don't actually know
01:14:18
◼
►
because so obviously when you
01:14:20
◼
►
switch from Safari to Chrome
01:14:21
◼
►
and you're otherwise an Apple
01:14:22
◼
►
person, you're going to give up
01:14:23
◼
►
certain things.
01:14:24
◼
►
You're going to give up
01:14:25
◼
►
certain integrations.
01:14:27
◼
►
Now, one thing I've noticed is
01:14:28
◼
►
that Chrome on Tahoe has the
01:14:32
◼
►
autofill for messages, which is
01:14:35
◼
►
I don't think I'm going to switch
01:14:36
◼
►
to Tahoe on my main Mac yet
01:14:37
◼
►
because I still hate everything
01:14:38
◼
►
else about it.
01:14:39
◼
►
But I was using it on my laptop
01:14:40
◼
►
that has Tahoe and I noticed
01:14:42
◼
►
that it did the autofill for
01:14:43
◼
►
messages thing because I think
01:14:44
◼
►
Apple added an API for that in Tahoe.
01:14:47
◼
►
So that kind of
01:14:49
◼
►
made me jealous of Tahoe briefly.
01:14:51
◼
►
But the one the one biggest
01:14:54
◼
►
downside to Chrome, which luckily
01:14:55
◼
►
you're avoiding, is if, like
01:14:57
◼
►
me, you are an iCloud Keychain
01:14:59
◼
►
user and not a 1Password user,
01:15:00
◼
►
Apple's Keychain plug-in
01:15:04
◼
►
extension for Chrome.
01:15:05
◼
►
It's it's fine.
01:15:08
◼
►
I understand why it works the way
01:15:10
◼
►
it does, but there's some
01:15:11
◼
►
additional friction because of
01:15:12
◼
►
their, you know, understandably
01:15:14
◼
►
paranoid security where if you
01:15:16
◼
►
don't use it often enough on a
01:15:18
◼
►
particular site, you have to end
01:15:19
◼
►
up unlocking it by typing in a
01:15:21
◼
►
six digit code before you can do
01:15:22
◼
►
the autofill.
01:15:23
◼
►
And that's a step you don't have
01:15:24
◼
►
to take with 1Password.
01:15:26
◼
►
And that's a step you don't have
01:15:27
◼
►
to take when you're using iCloud
01:15:28
◼
►
keychain in Safari.
01:15:29
◼
►
So there's a little bit of
01:15:31
◼
►
friction there.
01:15:31
◼
►
And then the other bit is if
01:15:33
◼
►
you've been using Chrome since the
01:15:35
◼
►
beginning, like I have and never
01:15:36
◼
►
stopped, you end up with tons of
01:15:38
◼
►
like stuff in Google's password
01:15:40
◼
►
manager and migrating off of that
01:15:42
◼
►
because it used to be you
01:15:43
◼
►
couldn't use iCloud keychain at all
01:15:44
◼
►
directly within Chrome.
01:15:45
◼
►
And then they made that possible.
01:15:46
◼
►
So some of my passwords were in
01:15:48
◼
►
Chrome and some of my passwords
01:15:49
◼
►
were in iCloud keychain.
01:15:51
◼
►
And doing that transition has been
01:15:52
◼
►
a pain, especially since basically
01:15:54
◼
►
the most expedient way to do it is
01:15:56
◼
►
to not disable Chrome's autofill.
01:15:57
◼
►
So now I've got two autofills fighting
01:15:58
◼
►
in the same web page.
01:15:59
◼
►
And I'm still, I'm almost done
01:16:02
◼
►
with that transition.
01:16:03
◼
►
But 1Password, this is one of
01:16:05
◼
►
the beauties (and it's a former
01:16:06
◼
►
sponsor of the show, whatever).
01:16:08
◼
►
But one of the things about
01:16:09
◼
►
1Password is that it is, in fact,
01:16:11
◼
►
cross platform, cross everything.
01:16:13
◼
►
So it works in Safari.
01:16:15
◼
►
It works in Chrome.
01:16:16
◼
►
It's the same wherever you put it.
01:16:18
◼
►
It's not like Apple's thing, which
01:16:21
◼
►
works way better in Safari than it
01:16:22
◼
►
does in Chrome.
01:16:23
◼
►
And so, at this point, like, so
01:16:26
◼
►
far, my experiment with Chrome again,
01:16:28
◼
►
it's only been a couple of days.
01:16:30
◼
►
But so far, it feels amazing
01:16:33
◼
►
to be this fast and this automated
01:16:35
◼
►
in so many ways and to just have
01:16:37
◼
►
fewer problems.
01:16:37
◼
►
And this is the
01:16:39
◼
►
kind of thing that I'm that I'm
01:16:41
◼
►
hoping to do more of during my year
01:16:42
◼
►
of efficiency: I've
01:16:44
◼
►
been
01:16:45
◼
►
tolerating certain friction and,
01:16:47
◼
►
you know, doing a lot of manual
01:16:49
◼
►
work and shuffling things back and forth.
01:16:50
◼
►
And I know this isn't going to be perfect.
01:16:52
◼
►
Like, again, like because I'm not
01:16:54
◼
►
going to have Chrome probably on my
01:16:55
◼
►
iPhone, because it is kind of weird
01:16:57
◼
►
there, but because I'm not
01:16:58
◼
►
going to do that, I'll miss out on
01:16:59
◼
►
like, you know, certain shared
01:17:01
◼
►
histories and things like if I'm
01:17:02
◼
►
using Safari on the phone, but
01:17:03
◼
►
Chrome on the desktop.
01:17:04
◼
►
So, you know, I know there's going
01:17:05
◼
►
to be some additional friction
01:17:07
◼
►
there and maybe long term, I will
01:17:09
◼
►
realize that's not worth it.
01:17:11
◼
►
And maybe I'll switch all the way
01:17:12
◼
►
back to Safari again.
01:17:13
◼
►
But right now, like the the initial
01:17:15
◼
►
impression of just having Chrome
01:17:18
◼
►
as default, it's just so fast.
01:17:21
◼
►
And I'm looking forward to things
01:17:23
◼
►
not breaking as much or at all,
01:17:25
◼
►
because I've just been dealing
01:17:26
◼
►
with so much of that.
01:17:27
◼
►
I have a different name for your
01:17:29
◼
►
yearly theme.
01:17:29
◼
►
It's the year of basic.
01:17:31
◼
►
You're becoming basic.
01:17:33
◼
►
You're going to use Google for search.
01:17:35
◼
►
You're going to use Chrome for
01:17:36
◼
►
your browser, which, by the way,
01:17:37
◼
►
go look at any kid and like they
01:17:39
◼
►
want to use Chrome as their browser.
01:17:40
◼
►
My kids refuse to use Safari
01:17:42
◼
►
because they've just grown up on Chrome.
01:17:44
◼
►
Yeah, you're using
01:17:47
◼
►
the things that everybody else uses.
01:17:48
◼
►
You're not using DuckDuckGo and
01:17:50
◼
►
Kagi and sticking with Safari.
01:17:52
◼
►
You're just going to use Google
01:17:54
◼
►
Search and Chrome and Google Maps.
01:17:56
◼
►
It's the year of basic, but in a good
01:17:58
◼
►
way in parentheses.
01:17:59
◼
►
Yeah, because like, again, it's
01:18:01
◼
►
just I'm tired of fighting all
01:18:04
◼
►
these fights that don't matter to
01:18:05
◼
►
anybody but me.
01:18:06
◼
►
Like all these years, I've been
01:18:08
◼
►
raging against this stuff.
01:18:09
◼
►
And you know what?
01:18:09
◼
►
In the meantime, like, you know,
01:18:11
◼
►
Chrome has some downsides and
01:18:13
◼
►
Google's really creepy.
01:18:14
◼
►
And it used to have a pretty big
01:18:16
◼
►
battery cost on Apple's computers.
01:18:18
◼
►
Well, I think Apple's computers are
01:18:20
◼
►
so good now that the battery cost,
01:18:21
◼
►
you know, I'll take it.
01:18:23
◼
►
Yeah, the battery cost is still there.
01:18:24
◼
►
Safari is still better in that regard.
01:18:26
◼
►
But you know what?
01:18:27
◼
►
If it gets me this much speed, fine.
01:18:30
◼
►
I accept the battery cost.
01:18:31
◼
►
You know, these laptops have great
01:18:33
◼
►
battery life these days.
01:18:34
◼
►
I can spare a little bit to have to
01:18:36
◼
►
have some of my time back.
01:18:37
◼
►
And so that I think like that's a
01:18:40
◼
►
major shift here.
01:18:41
◼
►
I think AI is a big shift here and
01:18:42
◼
►
integrating Gemini with all your stuff.
01:18:44
◼
►
I think it's going to be
01:18:45
◼
►
increasingly valuable.
01:18:46
◼
►
I'm not going to switch Gmail or
01:18:47
◼
►
anything, although I did briefly
01:18:49
◼
►
consider like, should I start like
01:18:50
◼
►
forwarding all my email to Gmail
01:18:52
◼
►
so we can maybe integrate stuff
01:18:54
◼
►
You're turning into me.
01:18:55
◼
►
Yeah, you really are.
01:18:56
◼
►
God help us all.
01:18:59
◼
►
You should try using an Intel Mac.
01:19:01
◼
►
That's I'm not going to, I mean,
01:19:03
◼
►
let's not be too ridiculous.
01:19:04
◼
►
So this, this is where I am.
01:19:07
◼
►
Like, and I want to, I think the
01:19:10
◼
►
AI revolution is blowing a lot of
01:19:12
◼
►
stuff wide open in tech and it's
01:19:13
◼
►
going to keep doing that as we were
01:19:14
◼
►
saying earlier.
01:19:14
◼
►
And so I want to actually challenge
01:19:17
◼
►
my assumptions that I've had for
01:19:19
◼
►
like the previous eras of tech.
01:19:20
◼
►
my, you know, my assumptions of
01:19:21
◼
►
like I should avoid Google because
01:19:23
◼
►
they kind of creep me out and I
01:19:23
◼
►
don't like the way they do some things.
01:19:25
◼
►
It's like, okay, well, you know,
01:19:26
◼
►
Apple has their own BS to, you
01:19:28
◼
►
know, none of the tech giants are
01:19:29
◼
►
particularly clean in that way
01:19:31
◼
►
anymore if they ever were.
01:19:32
◼
►
And so there are certain areas that
01:19:35
◼
►
I think, you know, deserve, you
01:19:38
◼
►
know, reevaluation sometimes.
01:19:39
◼
►
And in the wake of
01:19:40
◼
►
like significant disruption, which
01:19:42
◼
►
is what AI is doing and will
01:19:43
◼
►
continue to do, that's a good
01:19:45
◼
►
opportunity to reevaluate some of
01:19:47
◼
►
your assumptions and long-held
01:19:49
◼
►
habits and allegiances.
01:19:51
◼
►
And it's like, is, is this still
01:19:52
◼
►
best for me?
01:19:53
◼
►
Are my assumptions still
01:19:55
◼
►
the case and are the choices I
01:19:56
◼
►
made 15 years ago still the choices
01:19:59
◼
►
I should make today or are certain
01:20:01
◼
►
things worth reevaluating?
01:20:02
◼
►
And so that's what I'm doing.
01:20:03
◼
►
I'm heading into this year doing
01:20:05
◼
►
some of that reevaluation and
01:20:07
◼
►
hopefully saving myself a bunch of
01:20:09
◼
►
time along the way and
01:20:10
◼
►
hopefully, you know, making some
01:20:11
◼
►
good stuff for the show, whether
01:20:12
◼
►
it succeeds or not.
01:20:13
◼
►
Chaos is a ladder.
01:20:18
◼
►
Absolutely not.
01:20:19
◼
►
I'm not asking you, Marco.
01:20:21
◼
►
Anyway, chaos is a ladder and
01:20:22
◼
►
Marco's climbing it.
01:20:23
◼
►
What is that from, John?
01:20:25
◼
►
Game of Thrones.
01:20:26
◼
►
I've never seen it.
01:20:28
◼
►
Jesus Christ.
01:20:29
◼
►
All right, fine.
01:20:29
◼
►
I didn't expect Marco to see
01:20:33
◼
►
it, but all right then.
01:20:35
◼
►
We are sponsored for this
01:20:37
◼
►
episode by Factor.
01:20:39
◼
►
Now, look, everybody tells
01:20:40
◼
►
themselves they want to eat
01:20:41
◼
►
Oh, I, I should eat healthier.
01:20:43
◼
►
I should cook more at home
01:20:44
◼
►
instead of getting takeout so
01:20:45
◼
►
often, et cetera.
01:20:46
◼
►
Look, I've, I've been there.
01:20:47
◼
►
This is what I tell myself all the time.
01:20:49
◼
►
The problem is we don't have
01:20:50
◼
►
time to cook every single
01:20:52
◼
►
night necessarily.
01:20:53
◼
►
Cooking takes a lot of time,
01:20:54
◼
►
you know, prepping and
01:20:55
◼
►
shopping and everything.
01:20:56
◼
►
Factor doesn't ask you to meal
01:20:58
◼
►
prep or follow recipes or
01:21:00
◼
►
anything like that.
01:21:00
◼
►
It just removes that problem
01:21:02
◼
►
from your plate.
01:21:02
◼
►
So you spend two minutes and
01:21:04
◼
►
you get real food, and it's good.
01:21:07
◼
►
Factor is made by chefs,
01:21:09
◼
►
designed by dieticians and
01:21:10
◼
►
delivered to your door and
01:21:12
◼
►
you just heat it for two
01:21:13
◼
►
minutes and eat it.
01:21:15
◼
►
It's nice, fresh food.
01:21:17
◼
►
You get lean proteins,
01:21:18
◼
►
colorful vegetables, whole
01:21:20
◼
►
food ingredients, healthy
01:21:21
◼
►
fats, the stuff you'd
01:21:23
◼
►
make yourself if
01:21:24
◼
►
you had the time.
01:21:24
◼
►
But of course, you know,
01:21:25
◼
►
that isn't always the case.
01:21:26
◼
►
We know that there's no
01:21:27
◼
►
refined sugars, no
01:21:28
◼
►
artificial sweeteners, no
01:21:29
◼
►
refined seed oils.
01:21:30
◼
►
It's just good, healthy
01:21:32
◼
►
I've tried Factor.
01:21:33
◼
►
My family's tried Factor.
01:21:34
◼
►
Everyone likes it.
01:21:36
◼
►
It gets positive reviews from
01:21:37
◼
►
guests, from family members,
01:21:38
◼
►
even, you know, even the
01:21:39
◼
►
kid like he likes it too.
01:21:40
◼
►
Everyone likes Factor
01:21:42
◼
►
Whatever your needs and
01:21:43
◼
►
preferences are, they have
01:21:44
◼
►
things like high protein,
01:21:45
◼
►
calorie smart, Mediterranean,
01:21:46
◼
►
GLP-1 support, ready to eat.
01:21:49
◼
►
There's a new muscle pro
01:21:50
◼
►
collection for strength and
01:21:51
◼
►
So whatever your goals are,
01:21:53
◼
►
whether it's healthy eating,
01:21:54
◼
►
calorie management, more
01:21:55
◼
►
protein, or just easier
01:21:57
◼
►
meals, Factor has you covered.
01:21:59
◼
►
They're always fresh,
01:22:00
◼
►
never frozen, ready in two
01:22:02
◼
►
minutes, no prep, no
01:22:04
◼
►
cleanup, no mental load.
01:22:05
◼
►
So go to factormeals.com
01:22:08
◼
►
slash atp50off.
01:22:10
◼
►
Use code ATP50OFF to get
01:22:12
◼
►
50% off your first Factor
01:22:13
◼
►
box, plus free breakfast for a year.
01:22:16
◼
►
Offer only valid for new
01:22:17
◼
►
Factor customers with code
01:22:18
◼
►
and qualifying auto-renewing
01:22:19
◼
►
subscription purchase.
01:22:20
◼
►
Make healthier eating easy
01:22:22
◼
►
with Factor.
01:22:23
◼
►
Thanks to Factor for
01:22:24
◼
►
sponsoring our show.
01:22:25
◼
►
All right, so breaking news
01:22:30
◼
►
as we're recording.
01:22:31
◼
►
Apple has just released in
01:22:33
◼
►
the last 24 hours.
01:22:33
◼
►
Is it a beta or it's a
01:22:36
◼
►
beta, right?
01:22:37
◼
►
Release candidate for
01:22:38
◼
►
Xcode 26.3, which adds
01:22:41
◼
►
agentic coding.
01:22:42
◼
►
Reading from Apple's newsroom,
01:22:44
◼
►
Xcode 26.3 introduces
01:22:45
◼
►
support for agentic coding, a
01:22:47
◼
►
new way in Xcode for
01:22:48
◼
►
developers to build apps using
01:22:49
◼
►
coding agents such as
01:22:50
◼
►
Anthropic's Claude Agent, and
01:22:51
◼
►
OpenAI's Codex.
01:22:52
◼
►
Agents can search
01:22:53
◼
►
documentation, explore file
01:22:54
◼
►
structures, update project
01:22:56
◼
►
settings, and verify their
01:22:57
◼
►
work visually by capturing
01:22:58
◼
►
Xcode previews and iterating
01:23:00
◼
►
through builds and fixes.
01:23:01
◼
►
In addition to these built-in
01:23:03
◼
►
integrations, Xcode 26.3 makes
01:23:06
◼
►
its capabilities available
01:23:06
◼
►
through the Model Context
01:23:07
◼
►
Protocol, an open standard
01:23:09
◼
►
that gives developers the
01:23:10
◼
►
flexibility to use any
01:23:11
◼
►
compatible agent or tool with Xcode.
01:23:12
◼
►
I have not personally had a
01:23:14
◼
►
chance to try this.
01:23:15
◼
►
I've been busy trying to get
01:23:16
◼
►
the next release of
01:23:18
◼
►
Call Sheet ready, but John, you've
01:23:19
◼
►
dabbled for quite a while,
01:23:21
◼
►
Yeah, well, so first of
01:23:22
◼
►
all, I think it is notable
01:23:23
◼
►
that Apple has rolled this
01:23:25
◼
►
out in 26.3 of Xcode rather
01:23:28
◼
►
than like saving it up for a
01:23:30
◼
►
big reveal of WWDC because
01:23:31
◼
►
they see the writing on the
01:23:33
◼
►
wall, which is everyone is
01:23:33
◼
►
using these things.
01:23:34
◼
►
It's not just because they
01:23:35
◼
►
saw that I was using it
01:23:36
◼
►
last week, but like, you
01:23:37
◼
►
know, like people have been
01:23:38
◼
►
using these things for
01:23:39
◼
►
months and months.
01:23:39
◼
►
It's like, oh crap, they
01:23:40
◼
►
got Siracusa.
01:23:40
◼
►
Yeah, that's right.
01:23:41
◼
►
By the time it's like, it's
01:23:43
◼
►
like AI penetrating to the
01:23:45
◼
►
It's like, by the time I'm
01:23:45
◼
►
using it, really, Apple's
01:23:46
◼
►
behind the times if
01:23:47
◼
►
they're not integrating it.
01:23:48
◼
►
And they had like the AI
01:23:49
◼
►
coding assistant and their
01:23:50
◼
►
models and stuff like that.
01:23:51
◼
►
But everyone is using things
01:23:52
◼
►
like Claude Code and has been for months.
01:23:53
◼
►
And it's like, why is this not
01:23:55
◼
►
integrated with Xcode?
01:23:55
◼
►
And we've heard rumors about
01:23:56
◼
►
that for ages.
01:23:57
◼
►
I mean, one of the original AI
01:23:58
◼
►
rumors was just like that
01:23:59
◼
►
Anthropic, Apple's doing a deal
01:24:01
◼
►
with Anthropic to integrate
01:24:02
◼
►
Claude into Xcode or whatever.
01:24:04
◼
►
Anyway, they released it.
01:24:06
◼
►
I've tried it.
01:24:07
◼
►
It's nice that they are.
01:24:08
◼
►
It makes them seem nimble.
01:24:10
◼
►
It's nimble for Apple.
01:24:11
◼
►
I'm not saying it's nimble
01:24:12
◼
►
because it's not because
01:24:13
◼
►
they're behind.
01:24:13
◼
►
But it's nimble for Apple to
01:24:15
◼
►
actually roll this out.
01:24:16
◼
►
And it more or less works.
01:24:18
◼
►
And I wanted to try it.
01:24:19
◼
►
So I'm still on an Xcode
01:24:22
◼
►
26.0.1 because of the
01:24:23
◼
►
Tahoe icon issue where that's
01:24:25
◼
►
the last version of Xcode
01:24:26
◼
►
that will let me have the new
01:24:28
◼
►
icon, the new OS and the old
01:24:29
◼
►
icon on the old OS before
01:24:30
◼
►
Apple broke it and refused to fix it.
01:24:32
◼
►
So I installed 26.3 on one
01:24:37
◼
►
of my development machines,
01:24:38
◼
►
the one that's running
01:24:38
◼
►
Tahoe, in fact, and I wanted
01:24:41
◼
►
to try it out.
01:24:41
◼
►
And the problem I gave it
01:24:42
◼
►
was I can't use the Tahoe
01:24:45
◼
►
icon on Tahoe and the old
01:24:46
◼
►
icon on old OS's.
01:24:47
◼
►
Fix that for me.
01:24:49
◼
►
This is your hotspot.
01:24:50
◼
►
And so I tried Claude
01:24:53
◼
►
Code on it and Claude Code
01:24:54
◼
►
said, oh, you know, you can't do that.
01:24:56
◼
►
And here's why.
01:24:56
◼
►
I'm like, I know.
01:24:58
◼
►
I know Claude Code, but I
01:25:00
◼
►
want you to try and see what you can do.
01:25:02
◼
►
And I went back and forth
01:25:03
◼
►
for a while.
01:25:04
◼
►
It was heroic.
01:25:06
◼
►
It tried a lot of things.
01:25:07
◼
►
It was like writing Python
01:25:08
◼
►
scripts to parse the
01:25:10
◼
►
Assets.car file and doing all
01:25:12
◼
►
sorts of stuff.
01:25:13
◼
►
And we tried all sorts of things,
01:25:14
◼
►
where by "we," I mean it and
01:25:16
◼
►
me just, like, telling it
01:25:17
◼
►
what to do.
01:25:19
◼
►
The integration with Xcode
01:25:20
◼
►
is pretty good.
01:25:21
◼
►
Like it is, you know, I've
01:25:23
◼
►
only used the command line
01:25:24
◼
►
Claude Code, but I know that
01:25:25
◼
►
there's lots of GUI stuff
01:25:26
◼
►
out there, like the new
01:25:27
◼
►
codex thing from OpenAI.
01:25:28
◼
►
And anyway, the integration
01:25:30
◼
►
Xcode in the sidebar, it
01:25:31
◼
►
feels kind of cramped, but
01:25:32
◼
►
it's like, it's fine.
01:25:33
◼
►
Claude Code can do lots of
01:25:35
◼
►
stuff in Xcode, but not everything.
01:25:37
◼
►
Occasionally it would say,
01:25:38
◼
►
hey, put this as a new
01:25:39
◼
►
build phase, copy this build script.
01:25:41
◼
►
And there was a nice affordance
01:25:42
◼
►
where, like, you just click
01:25:43
◼
►
and it will copy the full thing.
01:25:45
◼
►
You're not like dragging out,
01:25:45
◼
►
you know, but it couldn't
01:25:47
◼
►
add a build phase itself.
01:25:48
◼
►
It needed me to copy and
01:25:50
◼
►
paste the shell script into
01:25:52
◼
►
the build phase and do all that myself.
01:25:53
◼
►
So I was like, oh, that's
01:25:54
◼
►
that's kind of cruddy, but fine.
01:25:55
◼
►
Anyway, it failed.
01:25:56
◼
►
We couldn't get it to work.
01:25:57
◼
►
So I'm like, OK, next
01:25:58
◼
►
up, let's try the open
01:26:01
◼
►
And for Claude, again, I was
01:26:03
◼
►
using whatever that expensive
01:26:05
◼
►
plan I have now.
01:26:05
◼
►
And for open AI, I forget
01:26:07
◼
►
which plan I was using, but
01:26:08
◼
►
you get to use the good models.
01:26:09
◼
►
You get to use 5.2.
01:26:10
◼
►
And I tried with codex and
01:26:13
◼
►
it said the same thing.
01:26:14
◼
►
Oh, I just looked this up and
01:26:14
◼
►
that's not a thing you can do.
01:26:15
◼
►
And I said, I know.
01:26:17
◼
►
I know it seems like you
01:26:18
◼
►
can't do it, but let's find a way.
01:26:19
◼
►
And I went back and forth and
01:26:21
◼
►
basically I rubber ducked with
01:26:22
◼
►
it a lot because, you know,
01:26:23
◼
►
rubber
01:26:24
◼
►
ducking is when you're
01:26:25
◼
►
facing a programming
01:26:26
◼
►
problem and you explain it
01:26:27
◼
►
to a rubber duck. The rubber
01:26:28
◼
►
duck's not going to help you.
01:26:29
◼
►
But the act of you having to
01:26:30
◼
►
explain it makes you, you
01:26:32
◼
►
know, verbalize things that
01:26:33
◼
►
you hadn't thought of before.
01:26:34
◼
►
And so me with rubber
01:26:36
◼
►
ducking, I'm typing into the
01:26:37
◼
►
prompt like, I know it seems
01:26:38
◼
►
like you can't do it, but
01:26:39
◼
►
like, look, Xcode 26.0.1 could do it.
01:26:43
◼
►
The app is on the App Store now.
01:26:45
◼
►
If you download it, it has
01:26:47
◼
►
the Tahoe icon, it has the
01:26:49
◼
►
pre-Tahoe icon, like I know
01:26:50
◼
►
it's possible. It's just that the
01:26:52
◼
►
version of actool, the
01:26:54
◼
►
command-line tool, that came
01:26:55
◼
►
with all versions of Xcode
01:26:56
◼
►
after 26.0.1 can't do it.
01:26:58
◼
►
The old one could.
01:26:59
◼
►
So whatever it's doing, let's
01:27:00
◼
►
do that here.
01:27:01
◼
►
And I'm going back and forth
01:27:02
◼
►
with rubber ducking.
01:27:02
◼
►
I was like, wait a second.
01:27:03
◼
►
Why don't I just take the
01:27:06
◼
►
assets file with the icons
01:27:07
◼
►
in it from the version I
01:27:09
◼
►
built, like from the app
01:27:11
◼
►
store version, from the
01:27:12
◼
►
version I built with 26.0.1
01:27:14
◼
►
and copy it into my Git
01:27:17
◼
►
repository for my app and just
01:27:20
◼
►
say, hey, add a new build phase.
01:27:21
◼
►
When all the building is
01:27:22
◼
►
done, your final step before
01:27:24
◼
►
code signing is to chuck out
01:27:26
◼
►
the assets.car that you just
01:27:27
◼
►
built with Xcode 26.3 and
01:27:30
◼
►
replace it with the one I just
01:27:31
◼
►
copied from my App Store build.
01:27:33
◼
►
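A run-script build phase along those lines might be sketched like this. Everything here is a hypothetical stand-in, not SwitchGlass's actual project layout: the helper name and paths are invented, and in a real Xcode run-script phase the three arguments would come from Xcode's own `SRCROOT`, `TARGET_BUILD_DIR`, and `UNLOCALIZED_RESOURCES_FOLDER_PATH` build settings.

```shell
# Hypothetical sketch of the "swap in a prebuilt Assets.car" build phase
# described above. All names and paths are assumptions for illustration.
replace_assets_car() {
  # $1 = project root, $2 = build products dir, $3 = resources subpath.
  # A known-good Assets.car, built once with Xcode 26.0.1 and checked
  # into the repo alongside the source.
  good_car="$1/PrebuiltAssets/Assets.car"
  dest_dir="$2/$3"
  mkdir -p "$dest_dir"
  # Overwrite the freshly built catalog before code signing runs, so the
  # signature covers the old (working) file.
  cp "$good_car" "$dest_dir/Assets.car"
}

# In an Xcode run-script phase this would be invoked as:
# replace_assets_car "$SRCROOT" "$TARGET_BUILD_DIR" "$UNLOCALIZED_RESOURCES_FOLDER_PATH"
```

Because the phase runs before code signing, the signed app ships the substituted catalog, which is exactly the hack being described.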
And so that's what I told it.
01:27:35
◼
►
I had rubber ducked it and I'm
01:27:36
◼
►
like, this is terrible and
01:27:37
◼
►
stupid, but it should work.
01:27:39
◼
►
And then rather than dirty my
01:27:40
◼
►
hands doing it, I just said to
01:27:41
◼
►
Codex, here, here's
01:27:43
◼
►
it, you know, slash
01:27:44
◼
►
Applications slash SwitchGlass.app
01:27:46
◼
►
slash blah, blah, blah, that
01:27:47
◼
►
Assets.car file, put it in my
01:27:49
◼
►
Git repo, make a build phase
01:27:51
◼
►
and blah, blah, blah.
01:27:51
◼
►
And by the way, Codex can
01:27:53
◼
►
make build phases by itself.
01:27:54
◼
►
Codex didn't ask me to do it.
01:27:56
◼
►
Codex seems to have much
01:27:57
◼
►
deeper hooks into the MCP.
01:27:58
◼
►
I don't know if it's MCP or
01:27:59
◼
►
whatever it is, but it has
01:28:00
◼
►
much deeper hooks into Xcode
01:28:02
◼
►
where it never asked me to do
01:28:03
◼
►
things myself.
01:28:04
◼
►
It's just like I can make a
01:28:04
◼
►
build phase.
01:28:05
◼
►
I can do this.
01:28:05
◼
►
Although Codex was so needy
01:28:07
◼
►
with like asking permissions
01:28:09
◼
►
and no matter how many times
01:28:09
◼
►
I hit always allow, always
01:28:10
◼
►
allow, always allow.
01:28:11
◼
►
There was always another
01:28:12
◼
►
prompt coming.
01:28:12
◼
►
So these things are a
01:28:14
◼
►
little bit fuzzy.
01:28:15
◼
►
But anyway, Codex did this
01:28:17
◼
►
I now have a terrible, hideous hack.
01:28:20
◼
►
And of course, if I ever want
01:28:21
◼
►
to change my app icon, I
01:28:22
◼
►
have to do it in Xcode 26.0.1.
01:28:24
◼
►
And yes, I have filed a
01:28:27
◼
►
feedback against this as an
01:28:28
◼
►
enhancement request asking
01:28:29
◼
►
Apple not to be this dumb.
01:28:30
◼
►
We'll see if it ever gets fixed.
01:28:31
◼
►
But in the meantime, I used
01:28:34
◼
►
the coding agents.
01:28:34
◼
►
I think for a first
01:28:35
◼
►
implementation in an RC, it's pretty good.
01:28:38
◼
►
Lots of people are doing even
01:28:40
◼
►
more impressive things with it.
01:28:41
◼
►
but I was happy to see that
01:28:44
◼
►
it served its function both
01:28:45
◼
►
as, you know, a thing letting
01:28:46
◼
►
me try these integrations, but
01:28:48
◼
►
also letting me rubber duck
01:28:50
◼
►
with it because I think that
01:28:50
◼
►
is actually one of the roles.
01:28:51
◼
►
I mean, Casey, you talked
01:28:52
◼
►
today about talking to it as
01:28:54
◼
►
like kind of a co-worker, but
01:28:55
◼
►
a rubber ducking is another
01:28:56
◼
►
thing that co-workers are for.
01:28:57
◼
►
They ignore you.
01:28:58
◼
►
They have their headphones on.
01:28:59
◼
►
They're just nodding politely.
01:29:00
◼
►
But you explaining the current
01:29:01
◼
►
thing you're fighting with can
01:29:03
◼
►
make you realize, oh, wait.
01:29:05
◼
►
Why don't I just copy the good
01:29:06
◼
►
file out of the app on the
01:29:07
◼
►
app store now and check it
01:29:08
◼
►
into my repo?
01:29:09
◼
►
Done and done.
01:29:11
◼
►
That is a very weird way of
01:29:13
◼
►
going about this.
01:29:13
◼
►
But when you don't have a whole
01:29:14
◼
►
lot of other choice, it makes sense.
01:29:16
◼
►
Yeah, it's terrible.
01:29:17
◼
►
Like, to be clear, it's hideous.
01:29:18
◼
►
Like, this should never have to
01:29:19
◼
►
be done, but you got to do a lot
01:29:20
◼
►
of hideous things in Tahoe.
01:29:24
◼
►
And then the other the other angle
01:29:25
◼
►
on this is, so Kyle Hughes posted a graph.
01:29:30
◼
►
I'm not sure where it came from,
01:29:31
◼
►
but his wording on it was 2025 was
01:29:34
◼
►
indeed the dawn of drop shipping
01:29:35
◼
►
apps as the prophecy foretold.
01:29:37
◼
►
I'll put a link in the show notes.
01:29:39
◼
►
It's like drop shipping.
01:29:39
◼
►
But it's like by making it easier
01:29:41
◼
►
to become like essentially a
01:29:43
◼
►
merchant online where you don't
01:29:44
◼
►
have any inventory and you just
01:29:45
◼
►
take orders and then take the thing
01:29:47
◼
►
from somebody else and have it
01:29:48
◼
►
shipped to the customer.
01:29:49
◼
►
that increased the number of
01:29:50
◼
►
people selling things online.
01:29:51
◼
►
Well, the graph is how many iOS apps
01:29:54
◼
►
have been released per month.
01:29:55
◼
►
And you can put a dotted line on
01:29:57
◼
►
the graph where like agentic
01:29:58
◼
►
coding arrived and the lines going
01:30:00
◼
►
up much faster than it was before.
01:30:02
◼
►
Like before it was basically flat
01:30:03
◼
►
and now it's taken off and it's
01:30:05
◼
►
not shocking because if you put a
01:30:06
◼
►
coding agent into Xcode and you can
01:30:08
◼
►
just type words in the sidebar in
01:30:09
◼
►
English and get an app out of it
01:30:11
◼
►
and you can totally do that.
01:30:12
◼
►
Yeah, there's going to be more
01:30:14
◼
►
stuff submitted to the app store.
01:30:15
◼
►
Probably more junk, but it's a
01:30:16
◼
►
concern for all of us because
01:30:17
◼
►
it's not like app review was
01:30:18
◼
►
particularly slow,
01:30:20
◼
►
it's been reasonably speedy
01:30:23
◼
►
compared to the bad old days, but
01:30:24
◼
►
like this increase is not going to
01:30:26
◼
►
help things.
01:30:26
◼
►
Yeah, I mean, I think this is
01:30:29
◼
►
this is the least of our problems
01:30:31
◼
►
in terms of that's Apple's problem.
01:30:33
◼
►
But yeah, yeah, I mean, I think
01:30:35
◼
►
the area of like how much is
01:30:39
◼
►
agentic coding going to affect
01:30:40
◼
►
both us as developers and also
01:30:44
◼
►
the software environment on the platform.
01:30:48
◼
►
Those are huge questions and I think
01:30:50
◼
►
the answer is going to be it's
01:30:52
◼
►
going to affect us a lot in both ways.
01:30:55
◼
►
It already is affecting us a lot.
01:30:56
◼
►
That's the that's the thing.
01:30:57
◼
►
Yeah, it's already happening.
01:30:58
◼
►
Yeah, this is happening today, you
01:31:00
◼
►
know, and we don't even cover like the
01:31:01
◼
►
whole, like, you know, Clawdbot,
01:31:02
◼
►
Moltbot, OpenClaw thing.
01:31:04
◼
►
Like there's there's so much there's
01:31:05
◼
►
so much happening right now driven
01:31:07
◼
►
by AI that like it's it's hard to
01:31:10
◼
►
even hear about it all, let alone try
01:31:13
◼
►
it all, or become proficient in it.
01:31:16
◼
►
You know, we're in such high motion
01:31:17
◼
►
But will software developers like us,
01:31:20
◼
►
will we even be that necessary
01:31:23
◼
►
going forward or like, you know, what
01:31:26
◼
►
what will junior developers do?
01:31:27
◼
►
Like maybe you have one senior
01:31:29
◼
►
developer, you know, controlling a
01:31:30
◼
►
bunch of agents like that themselves
01:31:32
◼
►
are controlling a bunch of agents.
01:31:34
◼
►
There's so many ways this can go, but
01:31:36
◼
►
it's it's looking like we should not
01:31:38
◼
►
underestimate the capabilities of
01:31:41
◼
►
these systems for the foreseeable
01:31:43
◼
►
future, because there's there's so
01:31:45
◼
►
much that they are already doing
01:31:47
◼
►
This is an area where, personally, I
01:31:49
◼
►
feel like I'm way behind because I
01:31:51
◼
►
have not yet used much much AI
01:31:53
◼
►
coding at all.
01:31:54
◼
►
I haven't done any of the agentic stuff.
01:31:56
◼
►
I've only been like, you know, asking
01:31:57
◼
►
the chatbots for code snippets here
01:31:59
◼
►
and there like, you know, today I was
01:32:00
◼
►
writing something for Overcast today
01:32:02
◼
►
and I was like, oh, I have like
01:32:03
◼
►
this this set of data here.
01:32:05
◼
►
And it's like, okay,
01:32:06
◼
►
write me some Swift code to
01:32:08
◼
►
take this set of data that I have
01:32:10
◼
►
and make it into this other type of thing?
01:32:13
◼
►
And it's like, oh, well, the
01:32:14
◼
►
algorithm you want is called this.
01:32:15
◼
►
And here's an implementation with
01:32:16
◼
►
Swift that uses generics.
01:32:17
◼
►
And I plopped it right in and it worked.
01:32:19
◼
►
But even that, like, is this just like
01:32:21
◼
►
me typing in my name in a web form?
01:32:22
◼
►
Why am I wasting time copying and
01:32:24
◼
►
pasting Swift code that it is
01:32:26
◼
►
generating into my project and
01:32:27
◼
►
integrating it myself when I
01:32:29
◼
►
could just be using these agents to
01:32:30
◼
►
say, just make this feature
01:32:32
◼
►
work and I don't think we're that
01:32:34
◼
►
far from that, even for very
01:32:35
◼
►
complex things like we're already
01:32:36
◼
►
we're there already for a lot of
01:32:38
◼
►
simple stuff.
01:32:39
◼
►
And this stuff is, you know,
01:32:41
◼
►
just been born, basically.
01:32:43
◼
►
Like we're so early into this and
01:32:44
◼
►
it's this good already.
01:32:45
◼
►
This is getting better fast.
01:32:48
◼
►
And I think we are in
01:32:51
◼
►
a real inflection point for many
01:32:54
◼
►
industries, including our own.
01:32:56
◼
►
And I don't know that it's
01:32:58
◼
►
necessarily, you know, super
01:33:01
◼
►
destructive, but it's certainly
01:33:03
◼
►
very different.
01:33:04
◼
►
And we again, like we've got to get
01:33:07
◼
►
on board or, you know, be OK with
01:33:09
◼
►
being abandoned.
01:33:11
◼
►
And comparing this, I mean, I know it's
01:33:13
◼
►
the same underlying tech, but
01:33:14
◼
►
comparing this to the early optimism
01:33:15
◼
►
about how LLMs are going to lead us
01:33:18
◼
►
to AGI, the real AI, the thing that
01:33:21
◼
►
used to be called AI before we took the
01:33:22
◼
►
term and perverted it.
01:33:24
◼
►
You know, like HAL 9000, you know,
01:33:26
◼
►
human level intelligence.
01:33:27
◼
►
Like all we need to do is add more
01:33:28
◼
►
parameters and the experience with
01:33:30
◼
►
like, you know, OpenAI going from
01:33:32
◼
►
GPT, you know, 2 and 3 and
01:33:33
◼
►
four, you know, going up is like,
01:33:36
◼
►
well, they've kind of plateaued in
01:33:39
◼
►
how good those things that you can
01:33:40
◼
►
talk to are.
01:33:40
◼
►
It seems like that's not a direct
01:33:42
◼
►
path to AGI, but the coding agents
01:33:44
◼
►
use the same underlying technology.
01:33:45
◼
►
We're not asking them to be HAL 9000,
01:33:47
◼
►
we're asking them to do one very
01:33:48
◼
►
specific thing, which is write code.
01:33:50
◼
►
And they are getting better at that
01:33:52
◼
►
kind of like how fast LLMs were getting
01:33:54
◼
►
better between GPT-2, 3, and 4.
01:33:57
◼
►
Like they're in that phase.
01:33:58
◼
►
Now, maybe they'll plateau out as
01:34:00
◼
►
well, but we are in the upward curve
01:34:03
◼
►
of coding agents.
01:34:04
◼
►
They are getting rapidly better, much
01:34:08
◼
►
faster than the, oh, ask me a general
01:34:10
◼
►
question and I'll tell you something.
01:34:11
◼
►
And hopefully it's not total BS, right?
01:34:13
◼
►
Because I mean, again, I say it's
01:34:14
◼
►
because code, you know, you can test it.
01:34:18
◼
►
Like you can't do that with facts.
01:34:20
◼
►
You ask something and it gives you, well,
01:34:21
◼
►
here's a plausible answer, but is it true?
01:34:23
◼
►
It can't help you with that, right?
01:34:25
◼
►
But code, you can run it.
01:34:27
◼
►
You can write tests against it.
01:34:28
◼
►
You can iterate on that until the tests pass.
01:34:30
◼
►
You know what I mean?
01:34:31
◼
►
And there's tons of code out there, both
01:34:33
◼
►
open source and whatever, you know,
01:34:34
◼
►
like code is tractable in a way that
01:34:37
◼
►
general intelligence is not because you
01:34:39
◼
►
immediately run it and see if it does
01:34:42
◼
►
something, whether you're running it or
01:34:43
◼
►
the agent is running it, you can tell
01:34:46
◼
►
to a much higher degree whether it is correct.
01:34:49
◼
►
And I think that is helping these
01:34:50
◼
►
agents improve much more rapidly than
01:34:54
◼
►
currently the sort of like, I'm just a
01:34:57
◼
►
chatbot you can talk to. They seem to not
01:34:59
◼
►
be getting better as fast.
01:35:00
◼
►
Certainly they're not getting as
01:35:01
◼
►
better as fast as they were between
01:35:02
◼
►
GPT-2, 3, and 4.
01:35:03
◼
►
I feel like we're in that phase now.
01:35:05
◼
►
So I don't even know how good these
01:35:07
◼
►
things are going to be in like five to
01:35:08
◼
►
ten years unless this curve levels off.
01:35:10
◼
►
But right now it's not leveling off.
01:35:12
◼
►
It's getting better like every day
01:35:14
◼
►
So it's an exciting technology.
01:35:17
◼
►
And again, even if it is already
01:35:20
◼
►
leveled off and even if it never gets
01:35:21
◼
►
significantly better than how it is
01:35:23
◼
►
right now, it's still amazingly useful
01:35:25
◼
►
and is changing everything.
01:35:26
◼
►
Like, then it will probably keep getting better.
01:35:30
◼
►
I mean, I think I would bet that it's
01:35:32
◼
►
continuing to get better.
01:35:33
◼
►
But yes, it already is useful.
01:35:35
◼
►
But like I talked a lot about this in
01:35:36
◼
►
the directives I just recorded
01:35:37
◼
►
Like it is a skill to know how to use
01:35:40
◼
►
this, just like it is a skill to be a
01:35:42
◼
►
senior developer that leads a team
01:35:43
◼
►
and talks to junior developers.
01:35:45
◼
►
It's not a given that people
01:35:47
◼
►
will be good at that.
01:35:48
◼
►
And it is work and it is difficult.
01:35:49
◼
►
It's just a different kind of work.
01:35:51
◼
►
And so there are people out there who
01:35:53
◼
►
are currently way better at using
01:35:55
◼
►
agents to get work done than others.
01:35:57
◼
►
Like I'm just playing with it just
01:35:58
◼
►
because it's fun.
01:35:58
◼
►
Like but I recognize how much of a
01:36:01
◼
►
novice I am at playing with these
01:36:02
◼
►
things and that it is, in fact, a
01:36:04
◼
►
different skill than having an actual
01:36:07
◼
►
And it's definitely a different skill
01:36:08
◼
►
than writing the code yourself.
01:36:10
◼
►
And these are all important skills.
01:36:11
◼
►
And I don't think any of them are going
01:36:13
◼
►
away.
01:36:14
◼
►
But don't think, oh, now anyone can do
01:36:16
◼
►
this because you can talk to it.
01:36:17
◼
►
It does make it open to more people.
01:36:20
◼
►
But the skills required to use an agent
01:36:23
◼
►
well are actually fairly difficult to
01:36:26
◼
►
master and require lots of expertise and
01:36:28
◼
►
are not sort of evenly distributed through
01:36:31
◼
►
the populace.
01:36:31
◼
►
So it's really just more of a change
01:36:34
◼
►
in who might excel in a world where most
01:36:37
◼
►
coding is done by agents.
01:36:38
◼
►
But it's not easy.
01:36:39
◼
►
And I don't think it will actually be the
01:36:41
◼
►
case that it, you know, depresses
01:36:43
◼
►
salaries or whatever.
01:36:44
◼
►
But I do think, you know, again, like the
01:36:46
◼
►
Industrial Revolution, lots of people
01:36:48
◼
►
will lose their jobs, and then, you know,
01:36:50
◼
►
they'll die and new people will get
01:36:52
◼
►
different jobs.
01:36:52
◼
►
And it's going to be disruptive.
01:36:53
◼
►
It's going to be harmful and disruptive.
01:36:55
◼
►
And if done carelessly, it's going to
01:36:58
◼
►
cause a lot of problems.
01:36:59
◼
►
But I think it will eventually
01:37:03
◼
►
be seen as progress, hopefully done
01:37:07
◼
►
in a way that is more humane than the
01:37:09
◼
►
Industrial Revolution and maybe also
01:37:10
◼
►
more humane than the PC Revolution and
01:37:12
◼
►
the Internet Revolution.
01:37:13
◼
►
It's moving so fast, it's hard to tell
01:37:15
◼
►
for sure, like how much of an impact
01:37:17
◼
►
this will have on how many things.
01:37:19
◼
►
But so far, like that estimation, as
01:37:23
◼
►
as we learn more about these and as we
01:37:25
◼
►
use them more, that estimation is going
01:37:27
◼
►
up, not down, like it is very clear
01:37:29
◼
►
this is changing more than we think.
01:37:31
◼
►
This is reshaping a lot of industries, but
01:37:33
◼
►
it definitely is reshaping software
01:37:35
◼
►
development and using it like I'm saying
01:37:38
◼
►
it is its own skill.
01:37:39
◼
►
It's like the birth of Google,
01:37:42
◼
►
speaking of which, like it's the birth of web search.
01:37:44
◼
►
When web search became a thing in the
01:37:47
◼
►
90s, if you were good at using web
01:37:50
◼
►
search, you had like superpowers.
01:37:52
◼
►
And imagine how that translated to
01:37:54
◼
►
almost every part of life when no one
01:37:57
◼
►
else was using web search that much.
01:37:58
◼
►
And you were the nerd who knew who had
01:38:01
◼
►
access to the Internet, who knew how to
01:38:02
◼
►
use web search and maybe even were like
01:38:05
◼
►
the person who got good at using like
01:38:06
◼
►
search operators to actually find better
01:38:09
◼
►
information.
01:38:09
◼
►
Before Google got rid of all those.
01:38:11
◼
►
But like, you know, in the
01:38:13
◼
►
early days, that was a superpower.
01:38:14
◼
►
And eventually then that power became, you
01:38:18
◼
►
know, everyone had it.
01:38:19
◼
►
And if you were like the one, you know,
01:38:22
◼
►
neophyte back then who was, you know,
01:38:25
◼
►
anti-computer.
01:38:27
◼
►
Wait, is that neophyte?
01:38:28
◼
►
What's the other one?
01:38:28
◼
►
No, that's not neophyte.
01:38:29
◼
►
The other the opposite.
01:38:30
◼
►
Yeah, you're trying to think of Luddite, but
01:38:32
◼
►
that's a bad rap.
01:38:33
◼
►
Luddites get a bad rap and they were
01:38:34
◼
►
mostly just arguing to have
01:38:37
◼
►
reasonable labor laws.
01:38:37
◼
►
But anyway, that's how it turns out.
01:38:39
◼
►
Well, OK, well, if you were like, you
01:38:41
◼
►
know, anti technology or ignoring
01:38:44
◼
►
technology throughout the 90s and
01:38:46
◼
►
2000s and you missed the entire
01:38:49
◼
►
Internet revolution, you could have gone
01:38:51
◼
►
on living life just fine.
01:38:52
◼
►
Many millions of people did, but they
01:38:55
◼
►
were increasingly being left behind by it.
01:38:58
◼
►
We are now at the point with AI.
01:39:01
◼
►
We're like around like, you know,
01:39:03
◼
►
1999 in terms of search
01:39:05
◼
►
search engines and the Internet.
01:39:07
◼
►
like this is booming and
01:39:10
◼
►
it's radically changing
01:39:12
◼
►
a lot of things and
01:39:13
◼
►
you can get good at it and you
01:39:15
◼
►
probably should
01:39:16
◼
►
get into this world because
01:39:18
◼
►
it's going to change.
01:39:19
◼
►
It's already changing a lot.
01:39:20
◼
►
It's going to keep changing more and
01:39:21
◼
►
more stuff and
01:39:22
◼
►
you don't have to.
01:39:23
◼
►
You can just ignore this, but
01:39:26
◼
►
you will be left behind.
01:39:28
◼
►
You are being left behind already.
01:39:29
◼
►
Soon, like most of the workforce
01:39:31
◼
►
is going to be young enough
01:39:33
◼
►
that they don't care about whatever
01:39:36
◼
►
copyright things you might be
01:39:37
◼
►
worried about.
01:39:38
◼
►
They don't care about whether AI is
01:39:40
◼
►
sometimes wrong or whether it's
01:39:42
◼
►
making stuff up.
01:39:42
◼
►
I hope they care about that, please.
01:39:44
◼
►
But sorry, like they're not.
01:39:46
◼
►
I think, I think
01:39:48
◼
►
correctness will still always be a
01:39:50
◼
►
measure somewhere.
01:39:51
◼
►
Otherwise, these planes are going to
01:39:52
◼
►
be falling out of the sky and we're
01:39:53
◼
►
going to be in Idiocracy, pouring
01:39:54
◼
►
soda on our plants.
01:39:55
◼
►
We might have some challenges in that area.
01:39:57
◼
►
But look, like,
01:39:59
◼
►
you know, don't we have that?
01:40:00
◼
►
Didn't we have the same challenges
01:40:01
◼
►
with web search?
01:40:02
◼
►
Yeah, I know.
01:40:03
◼
►
I think this is definitely more.
01:40:04
◼
►
There is more of an upside,
01:40:05
◼
►
but there's also more of a downside
01:40:06
◼
►
than web search.
01:40:07
◼
►
Like in terms of a skill, it's not
01:40:09
◼
►
that difficult of a
01:40:11
◼
►
skill, whereas this is more like
01:40:12
◼
►
learning how to use Photoshop
01:40:14
◼
►
where Photoshop is this amazingly
01:40:16
◼
►
powerful tool.
01:40:17
◼
►
And learning how to use
01:40:20
◼
►
Photoshop to its fullest potential
01:40:21
◼
►
makes you extremely valuable,
01:40:22
◼
►
but there's a lot to learn.
01:40:24
◼
►
Like I know like one
01:40:25
◼
►
ten thousandth of Photoshop
01:40:27
◼
►
and already I have more skills than
01:40:29
◼
►
most people I will meet.
01:40:31
◼
►
And someone who really knows how
01:40:32
◼
►
to use Photoshop, Photoshop makes
01:40:33
◼
►
them incredibly powerful.
01:40:34
◼
►
And there's a lot to learn.
01:40:36
◼
►
And I feel like coding agents are
01:40:37
◼
►
like that, where there is a huge
01:40:40
◼
►
depth of stuff to learn.
01:40:41
◼
►
And the problem with coding agents
01:40:42
◼
►
is they're changing every single day.
01:40:43
◼
►
So if you waste your time becoming
01:40:45
◼
►
an expert in some particular
01:40:46
◼
►
coding agent, like all your skills
01:40:47
◼
►
are going to be obsoleted tomorrow.
01:40:48
◼
►
But it's just like learning
01:40:49
◼
►
Photoshop 1.0 and then 2.0.
01:40:51
◼
►
Then they add layers.
01:40:51
◼
►
You're like, what the hell is a
01:40:52
◼
►
layer? Like, in the early phases of
01:40:55
◼
►
powerful tools?
01:40:56
◼
►
There is an advantage to learning
01:40:58
◼
►
to harness them to maximum capacity,
01:41:01
◼
►
but expect your knowledge to be
01:41:02
◼
►
trashed periodically because
01:41:03
◼
►
things are changing so fast.
01:41:06
◼
►
And that's going to be the case
01:41:07
◼
►
for a while.
01:41:08
◼
►
But like this train is moving.
01:41:12
◼
►
Like this is where things are and
01:41:14
◼
►
will be going for the foreseeable future.
01:41:16
◼
►
And so, and like, look, I am very
01:41:20
◼
►
often slow to adopt new technologies.
01:41:22
◼
►
And I recognize that I am behind in
01:41:26
◼
►
adopting AI coding.
01:41:27
◼
►
Even, you know, I'm like, you know,
01:41:30
◼
►
two months behind, but I'm behind.
01:41:32
◼
►
And I need to get on it.
01:41:34
◼
►
Like if you want to keep working in
01:41:35
◼
►
this business, this is it.
01:41:36
◼
►
It is going to be a lot of skills
01:41:38
◼
►
that we're going to have to develop.
01:41:39
◼
►
You know, all of us old people who
01:41:41
◼
►
who grew up without this, we're
01:41:43
◼
►
going to have to do a lot of
01:41:44
◼
►
relearning the same way when the
01:41:45
◼
►
internet came up.
01:41:46
◼
►
A lot of people had to do a lot of
01:41:47
◼
►
relearning who were already doing
01:41:48
◼
►
things a certain way for 10, 20,
01:41:50
◼
►
30, 40 years.
01:41:51
◼
►
You were a graphic artist before all this.
01:41:52
◼
►
You're like, what the hell?
01:41:54
◼
►
None of my skills seem to transfer
01:41:55
◼
►
I know how to use all these, you know,
01:41:56
◼
►
papers and tracing things and X-Acto
01:41:58
◼
►
knives and tape.
01:41:59
◼
►
And like, you want me to use a computer?
01:42:01
◼
►
Yeah, but that is where things are going.
01:42:04
◼
►
Jump in because this, this is it.
01:42:07
◼
►
Like this is not, this is not being
01:42:08
◼
►
put back in the bottle.
01:42:09
◼
►
And if you think copyright or moral
01:42:12
◼
►
arguments are going to save you, I'm
01:42:14
◼
►
sorry, I have bad news for you.
01:42:16
◼
►
I don't think that's going to happen.
01:42:16
◼
►
You know, there were also copyright
01:42:18
◼
►
concerns over web search engines
01:42:20
◼
►
because in order for Google and other
01:42:23
◼
►
search engines to index content, they
01:42:26
◼
►
have to read it.
01:42:27
◼
►
And that's making a copy even
01:42:29
◼
►
temporarily in the server's memory.
01:42:31
◼
►
That's making a copy.
01:42:32
◼
►
And this was litigated whether or not
01:42:34
◼
►
it was fair use to make a copy of
01:42:38
◼
►
copyrighted stuff from a web page to
01:42:40
◼
►
just, like, read it, from a
01:42:43
◼
►
program from either a search crawler or
01:42:45
◼
►
even a web browser, like you have to make
01:42:47
◼
►
copies in memory to do anything with
01:42:50
◼
►
That's copyright violation at the
01:42:52
◼
►
purest form.
01:42:53
◼
►
And that had to be litigated.
01:42:55
◼
►
Technology won.
01:42:57
◼
►
Like the massive, massive utility of
01:43:00
◼
►
technology won out over nitpickiness
01:43:03
◼
►
over copyright law.
01:43:04
◼
►
And I think the same thing ultimately is
01:43:07
◼
►
going to happen here.
01:43:08
◼
►
I mean, there might be a slightly bumpy
01:43:09
◼
►
route to get there, but copyright's not
01:43:12
◼
►
going to kill AI.
01:43:13
◼
►
AI is here and the value, the utility is too great.
01:43:18
◼
►
You're not going to be magically safe.
01:43:19
◼
►
Nothing's putting this genie back in the bottle.
01:43:21
◼
►
Nothing's going to save us from this.
01:43:22
◼
►
Well, you really have a lot more faith in
01:43:24
◼
►
our judicial system than I do.
01:43:25
◼
►
Just because it doesn't make sense to
01:43:28
◼
►
you doesn't mean it won't become law in
01:43:30
◼
►
this country.
01:43:30
◼
►
Have you not learned anything?
01:43:31
◼
►
Well, there's a lot of money behind it.
01:43:33
◼
►
Don't worry.
01:43:34
◼
►
There is, but still, even that's not a guarantee.
01:43:37
◼
►
Anyway, I'm not as, I don't, I don't
01:43:39
◼
►
agree that it's the same situation.
01:43:41
◼
►
I think we do need to make some
01:43:42
◼
►
adjustments for reasons that we've
01:43:43
◼
►
discussed on past episodes, but I just
01:43:46
◼
►
don't know how it's going to turn out
01:43:47
◼
►
because trying to use like logic and
01:43:49
◼
►
reason to predict what laws will be
01:43:51
◼
►
upheld in this country has not worked
01:43:53
◼
►
for decades and it's really depressing.
01:43:56
◼
►
Thank you to our sponsors this week,
01:43:58
◼
►
Gusto, Factor, and Masterclass.
01:44:00
◼
►
And thanks to our members who support us
01:44:02
◼
►
You can join us at atp.fm slash join.
01:44:05
◼
►
One of the many perks of membership is
01:44:08
◼
►
ATP Overtime, our weekly bonus topic.
01:44:10
◼
►
It's usually about 15 to 25 minutes of
01:44:13
◼
►
one more topic for the show that we
01:44:15
◼
►
just kept, it kept falling down the list
01:44:17
◼
►
and we didn't actually get to it.
01:44:18
◼
►
This week on Overtime, we're going to be
01:44:20
◼
►
talking about, apparently Apple is rumored
01:44:22
◼
►
to be making basically an AI pin.
01:44:24
◼
►
We're going to see what that's about.
01:44:27
◼
►
And we're going to talk about that in Overtime.
01:44:29
◼
►
You can join us to listen, atp.fm slash join.
01:44:32
◼
►
Thanks everybody.
01:44:33
◼
►
I'm going to talk to you next week.
01:44:35
◼
►
Now the show is over.
01:44:40
◼
►
They didn't even mean to begin.
01:44:42
◼
►
Because it was accidental.
01:44:45
◼
►
Oh, it was accidental.
01:44:48
◼
►
John didn't do any research.
01:44:51
◼
►
Marco and Casey wouldn't let him.
01:44:53
◼
►
Because it was accidental.
01:44:56
◼
►
And you can find the show notes at atp.fm
01:45:03
◼
►
And if you're into Mastodon,
01:45:07
◼
►
you can follow them at
01:45:09
◼
►
It's accidental.
01:45:27
◼
►
They didn't mean to.
01:45:33
◼
►
Tech Podcast.
01:45:39
◼
►
We have one of our favorite segments,
01:45:42
◼
►
which is let's make fun of Tesla and Elon Musk.
01:45:44
◼
►
Because Tesla is killing off the Model S and the Model X.
01:45:49
◼
►
This is from TechCrunch from a couple of weeks ago.
01:45:53
◼
►
Tesla's ending production of the Model S sedan and Model X SUV.
01:45:56
◼
►
CEO Elon Musk announced Wednesday
01:45:57
◼
►
during the company's quarterly earnings call.
01:45:59
◼
►
The company will make the final versions
01:46:01
◼
►
of both electric vehicles next quarter, he said,
01:46:03
◼
►
adding that his company will offer support
01:46:04
◼
►
for existing Model S and Model X owners,
01:46:06
◼
►
quote, for as long as people have the vehicles.
01:46:10
◼
►
Another quote from Elon,
01:46:11
◼
►
it's time to basically bring the Model S and X programs
01:46:14
◼
►
to an end with an honorable discharge
01:46:15
◼
►
because we're really moving into a future
01:46:17
◼
►
that is based on autonomy, he said.
01:46:18
◼
►
So if you're interested in buying a Model S or X,
01:46:21
◼
►
now would be the time to order it.
01:46:22
◼
►
Sales of both models have flatlined in recent years.
01:46:26
◼
►
Despite interior and exterior refreshes along the way,
01:46:28
◼
►
Tesla has faced increased competition
01:46:30
◼
►
in the luxury EV space from legacy automakers
01:46:32
◼
►
as well as upstarts like Rivian and Lucid Motors.
01:46:33
◼
►
Also, their CEO is a complete piece of shit,
01:46:37
◼
►
which might have something to do with it.
01:46:39
◼
►
A little bit.
01:46:39
◼
►
Well, I mean, in the case of the S and the X, though,
01:46:41
◼
►
like, what is the S from?
01:46:44
◼
►
Was that like the first?
01:46:47
◼
►
Like, in the car industry,
01:46:49
◼
►
there are generations of cars,
01:46:52
◼
►
usually numbered,
01:46:53
◼
►
like whatever generation of Honda Accord they're on.
01:46:55
◼
►
And a generation will last multiple years,
01:46:57
◼
►
but not over a decade.
01:46:59
◼
►
Well, they had like a half generational update
01:47:02
◼
►
or two throughout.
01:47:03
◼
►
But they just like,
01:47:04
◼
►
they were like,
01:47:05
◼
►
I guess we can just keep making the Model S forever.
01:47:07
◼
►
And the answer is no.
01:47:08
◼
►
and I know they did like tweaks
01:47:10
◼
►
and the current Model S is very different
01:47:12
◼
►
from the previous one,
01:47:12
◼
►
but not that,
01:47:13
◼
►
like within the car world,
01:47:14
◼
►
it is the same generation.
01:47:16
◼
►
I know they've changed so,
01:47:17
◼
►
they've changed so much.
01:47:18
◼
►
they changed all these different pieces of the car,
01:47:20
◼
►
but it's like,
01:47:20
◼
►
it's kind of ship of Theseus.
01:47:22
◼
►
I know they're a different kind of car company,
01:47:23
◼
►
but the point is,
01:47:24
◼
►
they didn't keep up with their competition.
01:47:25
◼
►
So the Model S was inarguably
01:47:28
◼
►
the best electric vehicle ever created
01:47:30
◼
►
when it was produced.
01:47:31
◼
►
And for years,
01:47:32
◼
►
it was that.
01:47:33
◼
►
And then it just didn't keep up with everybody else.
01:47:37
◼
►
it was really good for a long time,
01:47:39
◼
►
but it just didn't.
01:47:40
◼
►
And everyone else got better and better
01:47:42
◼
►
and it just stayed the same
01:47:43
◼
►
and got worse in ways
01:47:44
◼
►
when they got rid of the stalks
01:47:45
◼
►
and all that crap or whatever.
01:47:46
◼
►
And so like,
01:47:48
◼
►
now we have to bring it to an end.
01:47:50
◼
►
if you rolled out a new generation of Model S,
01:47:52
◼
►
maybe your sales would increase.
01:47:53
◼
►
But as you pointed out,
01:47:54
◼
►
there are other factors here.
01:47:56
◼
►
The main one being Elon Musk himself.
01:47:58
◼
►
So whatever.
01:48:02
◼
►
I owned two Model S's
01:48:04
◼
►
or Models S?
01:48:06
◼
►
for the time especially,
01:48:08
◼
►
they were amazing cars.
01:48:09
◼
►
But that was,
01:48:11
◼
►
my first one was 2016.
01:48:12
◼
►
It was on a lease.
01:48:13
◼
►
I got another one,
01:48:14
◼
►
2019 or whatever.
01:48:15
◼
►
They were amazing for the time
01:48:17
◼
►
and they really did do what,
01:48:19
◼
►
what Elon said he was setting out to do early on,
01:48:23
◼
►
which was like advance electric vehicles,
01:48:25
◼
►
basically like for,
01:48:25
◼
►
like make a big splash
01:48:27
◼
►
and force the rest of the industry
01:48:28
◼
►
to start electrifying their vehicles.
01:48:29
◼
►
They succeeded.
01:48:30
◼
►
They did that.
01:48:31
◼
►
And that was in very large part due to the Model S.
01:48:35
◼
►
But that was a very different time.
01:48:37
◼
►
This is a very different company now,
01:48:39
◼
►
a very different world,
01:48:41
◼
►
a very different competitive landscape.
01:48:42
◼
►
Elon is a very different person now
01:48:44
◼
►
compared to what,
01:48:45
◼
►
how he was in 2016.
01:48:46
◼
►
He successfully lobbied to get the $7,500 tax credit
01:48:49
◼
►
for electric vehicles eliminated,
01:48:51
◼
►
which I'm sure really helped his company.
01:48:53
◼
►
they were probably already past it
01:48:55
◼
►
with like the allocations.
01:48:57
◼
►
it's like once you sold a certain number of vehicles
01:48:59
◼
►
in total as a company,
01:49:00
◼
►
like yours wouldn't apply anymore.
01:49:01
◼
►
I think they were still getting it in some way.
01:49:04
◼
►
if he cares about advancing electric vehicles,
01:49:06
◼
►
getting rid of that was stupid.
01:49:08
◼
►
God knows what he cares about these days.
01:49:10
◼
►
he's gone a very different direction,
01:49:12
◼
►
but you know,
01:49:15
◼
►
when the Model S came out,
01:49:17
◼
►
we had to justify EVs'
01:49:20
◼
►
high prices somehow.
01:49:21
◼
►
And so EVs were all like these high end luxury cars.
01:49:25
◼
►
Like the whole reason it's called S is he was ripping off the Mercedes S class
01:49:29
◼
►
in terms of like market position.
01:49:30
◼
►
Cause it's like the battery was so expensive.
01:49:33
◼
►
You had to make the car like a hundred grand.
01:49:35
◼
►
And so it's like,
01:49:37
◼
►
well how do we sell a hundred grand car?
01:49:38
◼
►
We'll make it compete with the Mercedes S class,
01:49:40
◼
►
or at least we'll,
01:49:41
◼
►
we'll position it that way.
01:49:42
◼
►
In practice,
01:49:45
◼
►
it was a fairly inexpensive car interior on top of a very expensive battery,
01:49:50
◼
►
with an amazing drive train.
01:49:53
◼
►
they somewhat succeeded with,
01:49:54
◼
►
with the competition.
01:49:55
◼
►
But what really set the company,
01:49:59
◼
►
on fire was,
01:50:02
◼
►
kind of two things,
01:50:03
◼
►
the model three,
01:50:04
◼
►
which radically changed everything.
01:50:07
◼
►
Elon promising all this like self-driving robo taxi stuff that has kind of not quite
01:50:11
◼
►
really ever made it.
01:50:13
◼
►
And may someday possibly.
01:50:14
◼
►
We'll get to that in a little bit.
01:50:15
◼
►
but now EVs are,
01:50:19
◼
►
there's so many EVs in the market now from everybody else.
01:50:22
◼
►
And they're not only the super high end hundred thousand dollar cars anymore.
01:50:26
◼
►
Now they've come down in price,
01:50:27
◼
►
including largely Tesla's own model three and model Y.
01:50:32
◼
►
And where Tesla has driven their
01:50:37
◼
►
product line is towards volume and towards lower prices,
01:50:41
◼
►
which they do.
01:50:43
◼
►
they're very competitive in that area.
01:50:44
◼
►
like with the Model 3 and Y,
01:50:46
◼
►
like compared to the rest of the market,
01:50:47
◼
►
they are very competitive,
01:50:48
◼
►
but what they,
01:50:49
◼
►
what they've done is they've stripped down.
01:50:54
◼
►
they've decontented their cars to bring the price lower and lower and lower.
01:51:00
◼
►
and you know,
01:51:00
◼
►
meanwhile dealing with their own profitability along the way.
01:51:03
◼
►
and that's been kind of a rollercoaster,
01:51:04
◼
►
But the first Model S that I got was fairly luxurious.
01:51:10
◼
►
It had a lot of luxury features.
01:51:13
◼
►
It was like the highest-end Honda interior quality with a really amazing drivetrain that beats supercars.
01:51:22
◼
►
Like it was very lopsided,
01:51:23
◼
►
but it was a very nice car over time.
01:51:26
◼
►
As they've gone with the higher volume models,
01:51:28
◼
►
the Model 3 and Y,
01:51:30
◼
►
those were a lot less nice inside than the S and X are, slash were.
01:51:35
◼
►
And then when they redid the Model S,
01:51:38
◼
►
to add the stupid half steering wheel and other things,
01:51:40
◼
►
they kind of gave it the same treatment.
01:51:42
◼
►
They kind of decontented it.
01:51:44
◼
►
they took a lot of stuff out.
01:51:45
◼
►
they simplified or whatever,
01:51:47
◼
►
but it made it actually feel like an even cheaper car over time.
01:51:51
◼
►
As Tesla has become more about just selling the biggest volume of the cheapest EVs that they can,
01:51:57
◼
►
they've totally ignored the S and the X.
01:52:01
◼
►
And maybe that's because the market has too.
01:52:03
◼
►
I don't know.
01:52:05
◼
►
The current Model S is not nearly as nice as the old ones.
01:52:09
◼
►
Tesla is not nearly as nice of a car company.
01:52:12
◼
►
they're really no longer that luxurious of a car company.
01:52:16
◼
►
And certainly,
01:52:16
◼
►
Elon himself and his politics have done a lot of damage to the company.
01:52:23
◼
►
And so now it makes sense.
01:52:25
◼
►
as much as it pains me to see that the Model S, a car I used to love a lot, is about to be gone.
01:52:32
◼
►
I hardly ever see them on the road anymore.
01:52:36
◼
►
The new generation,
01:52:37
◼
►
I've hardly seen any ever.
01:52:39
◼
►
So it does seem like nobody was buying them.
01:52:41
◼
►
It does seem like most people were just buying the Model 3 and Y.
01:52:44
◼
►
And also Tesla has changed as a company so much that they no longer care to even be a luxury car maker making,
01:52:54
◼
►
a hundred thousand dollar sedan.
01:52:55
◼
►
So it is a shame,
01:52:57
◼
►
but I think that the era for that car has passed and the era for Tesla to make that car has passed.
01:53:04
◼
►
And now other companies have made a bunch of really good EVs.
01:53:10
◼
►
We no longer need Tesla to address that market,
01:53:14
◼
►
which is good because they're not going to.
01:53:15
◼
►
But now if you want a luxury electric sedan,
01:53:20
◼
►
you have lots of options now.
01:53:22
◼
►
And lots of them are very good.
01:53:23
◼
►
So it is the end of an era,
01:53:25
◼
►
but I think that era kind of ended a while ago.
01:53:28
◼
►
I don't think it's a great business plan to just switch to low margin products and volume.
01:53:33
◼
►
That's the reason people have diversified product lines: you want to fleece the rich people with
01:53:36
◼
►
the low-volume,
01:53:38
◼
►
high margin,
01:53:39
◼
►
fancy products,
01:53:40
◼
►
even though most of the cars you sell are the cheaper ones,
01:53:43
◼
►
That's a reasonable model.
01:53:45
◼
►
Nobody likes them.
01:53:46
◼
►
Car companies just stopped making them entirely.
01:53:47
◼
►
The Model X should have been great for them,
01:53:50
◼
►
but they had to do the stupid Falcon wing doors.
01:53:51
◼
►
Like SUVs are popular.
01:53:53
◼
►
Big SUVs are popular.
01:53:54
◼
►
Big electric SUVs are popular,
01:53:56
◼
►
Why is the X not successful?
01:53:58
◼
►
A, they didn't update it, and B,
01:53:59
◼
►
stupid Falcon wing doors.
01:54:01
◼
►
The X was also very expensive.
01:54:03
◼
►
It was always a bit of a range sacrifice compared to the S.
01:54:06
◼
►
but it is baffling to me that Tesla still doesn't really make the most popular kind of car.
01:54:14
◼
►
The Y is as close as they get,
01:54:16
◼
►
but it's just like a small SUV.
01:54:18
◼
►
The Y is in the ballpark,
01:54:20
◼
►
but not that much.
01:54:21
◼
►
But the Y looks like an inflated 3, instead of the 3 looking like a shrunken-down SUV.
01:54:26
◼
►
They're like,
01:54:27
◼
►
they're a little bit old school in that way.
01:54:28
◼
►
And that's one of the reasons I liked the S. The X was just a fumble.
01:54:32
◼
►
that's their own fault.
01:54:33
◼
►
they fumbled that themselves by demanding those stupid doors.
01:54:36
◼
►
And it's just too weird for people.
01:54:38
◼
►
And so even,
01:54:40
◼
►
their entrant in a fairly popular segment,
01:54:42
◼
►
which is like expensive,
01:54:43
◼
►
big SUVs for rich people,
01:54:45
◼
►
they've screwed that one up somehow too.
01:54:48
◼
►
but if you don't update your products for a long time,
01:54:50
◼
►
it's like not updating the Mac Pro and then complaining nobody buys it.
01:54:52
◼
►
Like there are a lot of things conspiring to be a problem with this,
01:54:55
◼
►
but like the idea that we're only going to sell Model 3s and just go cheaper and cheaper.
01:54:59
◼
►
And don't forget about the Roadster.
01:55:01
◼
►
Remember that,
01:55:02
◼
►
which was announced.
01:55:02
◼
►
I think I have a note in here somewhere.
01:55:04
◼
►
about how long ago the Roadster was announced, because I'd forgotten about it.
01:55:08
◼
►
we do have a note on it later,
01:55:10
◼
►
it was eight years ago.
01:55:11
◼
►
That presumably will be a very expensive luxury car as well, if they ever ship it.
01:55:15
◼
►
I don't know what to do,
01:55:16
◼
►
but we should continue with,
01:55:17
◼
►
the story, because there are a few other tidbits to talk about:
01:55:20
◼
►
Tesla's future in a post-S-and-X world.
01:55:24
◼
►
So their most recent earnings,
01:55:26
◼
►
this is reading from The Verge: in the quarter that ended in December,
01:55:29
◼
►
Tesla reported a 3% decrease in revenue and a staggering 61% decrease in profits over the fourth quarter of 2024.
01:55:37
◼
►
So you want to sell more low margin products you say?
01:55:41
◼
►
The earnings report comes a few weeks after Tesla lost its title as the world's bestselling EV company to China's BYD,
01:55:46
◼
►
which sold 2.26 million vehicles last year.
01:55:48
◼
►
Tesla reported selling about 1.6 million vehicles in 2025, an eight-and-a-half-percent decrease year over year.
01:55:55
◼
►
Additionally,
01:55:56
◼
►
from the earnings call transcript,
01:55:58
◼
►
I really think long-term,
01:55:59
◼
►
the only vehicles that we'll make will be autonomous vehicles with the exception of the next generation Roadster,
01:56:04
◼
►
which we're hoping to debut in April.
01:56:06
◼
►
You mentioned this was revealed in November of 2017.
01:56:10
◼
►
And to be clear,
01:56:11
◼
►
I only had one child when the Tesla Roadster was announced.
01:56:14
◼
►
You could have put a deposit down eight years ago on a car that still doesn't exist.
01:56:18
◼
►
Michaela did not exist in November of 2017.
01:56:21
◼
►
She was still cooking.
01:56:24
◼
►
that's when the Roadster,
01:56:26
◼
►
was announced.
01:56:27
◼
►
And to be clear,
01:56:27
◼
►
they didn't just announce it.
01:56:28
◼
►
They showed a car that looked like the Roadster driving away really fast.
01:56:31
◼
►
So they had something on four wheels that moved,
01:56:34
◼
►
but it was obviously not the car that they intended to sell.
01:56:36
◼
►
And I also think it is not going to be inexpensive like the Model 3 if and when they put it out.
01:56:41
◼
►
it's hilarious to hear Elon being like,
01:56:43
◼
►
everything's going to be autonomous because as we're about to find out,
01:56:47
◼
►
they're not doing well in that department either.
01:56:50
◼
►
With Tesla it's always "any day now, it'll be the day."
01:56:52
◼
►
Tesla's own robotaxi data confirms a crash rate about three times worse than humans'.
01:56:56
◼
►
This is Fred Lambert at Electrek, on new NHTSA data.
01:57:00
◼
►
Is that right?
01:57:01
◼
►
Crash data combined with Tesla's new disclosure of robot taxi mileage reveals Tesla's autonomous vehicles are crashing at a rate much higher than human drivers.
01:57:08
◼
►
And that's with a safety monitor in every car.
01:57:11
◼
►
According to a chart in Tesla's Q4 2025 earnings report showing cumulative robot taxi miles,
01:57:17
◼
►
robotaxis experience roughly one crash every 55,000 miles.
01:57:21
◼
►
For comparison,
01:57:22
◼
►
human drivers in the United States average approximately one crash every 500,000 miles,
01:57:26
◼
►
according to NHTSA data.
01:57:28
◼
►
Waymo has logged over 125 million autonomous miles and maintains a crash rate well below human averages.
01:57:34
◼
►
it's worth noting that not everything gets reported.
01:57:37
◼
►
So when I said a moment ago,
01:57:39
◼
►
human drivers do one every 500,000,
01:57:41
◼
►
that's crashes that are reported.
01:57:43
◼
►
And so there's a little bit of like thumb in the wind.
01:57:46
◼
►
Not every robot taxi thing gets reported either.
01:57:49
◼
►
But Fred Lambert did a little thumb-in-the-wind calculation and said,
01:57:54
◼
►
basically, robotaxis are about three times worse than human drivers.
01:57:58
◼
►
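The "thumb in the wind" comparison above can be sketched as a back-of-the-envelope calculation. The 55,000- and 500,000-mile figures are from the discussion; the one-in-three reporting fraction is purely an illustrative assumption to show how a ~9x raw ratio could shrink to roughly the 3x headline figure once human underreporting is accounted for (it is not Fred Lambert's actual methodology):

```python
# Back-of-the-envelope version of the robotaxi-vs-human crash-rate comparison.
# Mileage figures come from the discussion above; the reporting fraction
# below is an illustrative assumption, not a sourced number.

ROBOTAXI_MILES_PER_CRASH = 55_000          # Tesla's cumulative robotaxi miles per crash
HUMAN_MILES_PER_REPORTED_CRASH = 500_000   # NHTSA average, reported crashes only

# Raw ratio: robotaxis crash roughly 9x more often than *reported* human crashes.
raw_ratio = HUMAN_MILES_PER_REPORTED_CRASH / ROBOTAXI_MILES_PER_CRASH
print(f"raw ratio: ~{raw_ratio:.1f}x")

# Many human crashes never get reported. If only ~1 in 3 were reported (an
# assumed figure), the true human rate would be one crash per ~167,000 miles,
# and the robotaxi disadvantage shrinks to roughly 3x.
ASSUMED_REPORTING_FRACTION = 1 / 3
human_miles_per_actual_crash = HUMAN_MILES_PER_REPORTED_CRASH * ASSUMED_REPORTING_FRACTION
adjusted_ratio = human_miles_per_actual_crash / ROBOTAXI_MILES_PER_CRASH
print(f"adjusted ratio: ~{adjusted_ratio:.0f}x worse than humans")
```

The point of the adjustment is only that both fleets underreport to some degree, so the honest claim is a range, not a single number.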
Additionally,
01:57:58
◼
►
and we'll talk about this in a second as well.
01:58:02
◼
►
I don't remember if it was this article or the next one we're about to bring up,
01:58:05
◼
►
but they noted as well that whenever Tesla has a robot taxi accident,
01:58:09
◼
►
they basically redact everything.
01:58:11
◼
►
Whereas Waymo's autonomous vehicle reports will say,
01:58:16
◼
►
the Waymo was traveling at five miles an hour north on First Street and then impacted a pedestrian, or whatever the case may be.
01:58:21
◼
►
And like has extremely detailed reports.
01:58:24
◼
►
Whereas Tesla's are hidden behind,
01:58:26
◼
►
"this is an NDA" or whatever the case,
01:58:28
◼
►
something like that,
01:58:28
◼
►
like corporate secrets,
01:58:30
◼
►
they're very,
01:58:31
◼
►
very shady about it.
01:58:32
◼
►
Super shady.
01:58:34
◼
►
Waymo's safety record, reading from The New York Times in December of last year:
01:58:39
◼
►
Jonathan Slotkin writes: when compared with human drivers on the same roads,
01:58:42
◼
►
Waymo self-driving cars were involved in 91% fewer serious-injury-or-worse crashes and 80% fewer crashes causing any injury.
01:58:50
◼
►
If Waymo's results are indicative of the broader future of autonomous vehicles,
01:58:53
◼
►
we may be on the path to eliminating traffic deaths as a leading cause of mortality in the United States.
01:58:57
◼
►
While many see this as a tech story,
01:59:00
◼
►
I view it as a public health breakthrough.
01:59:02
◼
►
Waymo's approach to this is like the exact opposite of Tesla.
01:59:06
◼
►
Tesla has been promising the moon for years and years and failing to deliver and doing stupid things.
01:59:11
◼
►
And Waymo has been so incredibly cautious and careful over so many years.
01:59:18
◼
►
Everything has to be totally controlled, roads mapped down to the smallest millimeter,
01:59:22
◼
►
just to roll everything out slowly,
01:59:24
◼
►
do everything slowly and carefully,
01:59:26
◼
►
because the most important thing is to not hurt people and to not be unsafe.
01:59:30
◼
►
they clog traffic in San Francisco and stop emergency vehicles.
01:59:32
◼
►
And it hasn't been all perfect,
01:59:34
◼
►
but like there,
01:59:35
◼
►
I always get the impression that Waymo is trying as hard as they can not to screw up.
01:59:38
◼
►
And I get the opposite impression from Tesla,
01:59:41
◼
►
which is they're trying to say,
01:59:42
◼
►
we've done it: full self-driving.
01:59:44
◼
►
Your car will be making money for you while you sleep.
01:59:46
◼
►
And we'll just keep saying that for the next decade.
01:59:48
◼
►
Please give us money.
01:59:49
◼
►
And by the way,
01:59:50
◼
►
we're taking LiDAR out of the cars because I think that's a good idea,
01:59:54
◼
►
but I'm stupid.
01:59:55
◼
►
So it's not shocking with Robotaxi: he just wants there to be a robotaxi, so they put a human safety driver in every car.
02:00:03
◼
►
And they're still three times worse than you.
02:00:05
◼
►
This is one of the main reasons that I will never buy a Tesla: I do not trust a company with that person at the head of it, because every instinct he has and everything he wants his company to do is the opposite of what I want.
02:00:19
◼
►
And this is the thing that I put my life into.
02:00:21
◼
►
I understand like they have at various times been good cars and may be good cars,
02:00:25
◼
►
but like his incredible control of that company,
02:00:28
◼
►
it filters down into all levels.
02:00:31
◼
►
Like every story you hear about the terrible, racist factories
02:00:36
◼
►
in hindsight should have been a big sign about him.
02:00:38
◼
►
and like the things he makes people do on the assembly line,
02:00:40
◼
►
buying a brand-new car and finding parts of it held together with zip ties, and removing LiDAR and saying cameras are good enough, and promising self-driving.
02:00:46
◼
►
And it's just like,
02:00:47
◼
►
I don't want to be anywhere near that with my life or anyone else's life.
02:00:52
◼
►
and then Waymo,
02:00:54
◼
►
I'm not sure what their path to success is there,
02:00:56
◼
►
but everything I read about them is like,
02:00:57
◼
►
they're always doing a little bit better than they did before.
02:01:00
◼
►
And they're trying to be as safe as possible.
02:01:02
◼
►
And I'm glad to hear that their stats show that they are actually exceeding human safety in the scenarios where they choose to drive,
02:01:10
◼
►
which again are still very limited,
02:01:12
◼
►
but like that's appropriate.
02:01:13
◼
►
Like, they're not saying this car will drive you across the country while you sleep.
02:01:18
◼
►
Like Waymo doesn't say that because they can't do that.
02:01:21
◼
►
And they're being careful.
02:01:22
◼
►
And I hate Elon Musk.
02:01:23
◼
►
I will fight you over which one of us hates him more.