PodSearch

ATP

507: It Sucks, Doesn't It?

 

00:00:00   I am here on time in part thanks to a product

00:00:05   that I never thought I would use before,

00:00:07   a rapid tire deflator.

00:00:10   - Isn't that just called a knife?

00:00:11   - Yeah, right. (laughs)

00:00:13   - I think a knife is the most rapid tire deflator,

00:00:15   but unfortunately that makes it difficult

00:00:17   to use the tire afterwards.

00:00:19   I first got this when I first started driving on the beach

00:00:22   because you're supposed to air down your tires

00:00:25   down to like the 20 PSI-ish range

00:00:27   when you're driving on the sand here

00:00:29   to keep things, not only to keep yourself from getting stuck

00:00:31   but to keep yourself from tearing up the beach too much

00:00:34   unnecessarily and tearing up some of the routes.

00:00:37   - Oh, I just realized, now that I've been thinking

00:00:38   about this, it sucks, doesn't it?

00:00:41   (laughing)

00:00:43   - No.

00:00:44   - 'Cause that's the only way it could work, right?

00:00:45   Go on, go on.

00:00:46   - So anyway, so I frequently have to change my tire pressure

00:00:49   in both directions between like 20-ish

00:00:52   and like, you know, 40-ish PSI.

00:00:54   If I'm going on a long highway drive,

00:00:56   I wanna put it up to about 40

00:00:58   just for additional rigidity and safety

00:01:00   and a little bit of efficiency.

00:01:01   Although it's a giant box. - Whoa, slow down.

00:01:03   - Like it's not gonna be that efficient, but you know.

00:01:04   - How did you land on 40?

00:01:07   - It, well the car wants me to do 50,

00:01:08   and I think that's dumb, so I get impatient

00:01:11   and stop after 30. - No way it wants 50 pounds.

00:01:13   - Is that what it says on the door jamb?

00:01:15   - Thank you, Jon.

00:01:17   - I think it says 45 or 48 on the door jamb.

00:01:20   - No way.

00:01:21   - Is this a case where Casey has to Google

00:01:23   for the PDF of your car manual again?

00:01:25   - Yes, either way the-- - Seriously.

00:01:27   - What year is your car?

00:01:28   It's a Land Rover Defender, what year?

00:01:29   - 2021.

00:01:30   - I will solve this.

00:01:31   - Okay.

00:01:32   So, and it has, like, the one that happened to be

00:01:35   on the dealer lot in the color I wanted

00:01:37   has this optional accessory of an onboard

00:01:39   inflator deflator thing, which is really fun.

00:01:41   It's in the trunk and it comes with this big hose,

00:01:43   it can reach all four tires.

00:01:44   So it's fun, it's just slow.

00:01:46   And so I haven't found a rapid tire inflator yet,

00:01:48   besides, you know, I don't know, a bomb,

00:01:50   but a rapid tire deflator is this thing that,

00:01:55   Again, I'll put it in the show notes

00:01:57   when I find the link and dig it up.

00:01:58   But it's this weird little contraption that normally,

00:02:02   if you wanna deflate a tire quickly,

00:02:03   you can do a compressor that just kinda like

00:02:06   holds the valve in and lets air out.

00:02:09   Or you can poke the valve stem with a coin or a key

00:02:12   or something if you wanna do it that way.

00:02:14   But what this thing does is it screws onto the valve

00:02:20   stem thing and then there's an inner thing that you twist

00:02:24   once it's screwed onto the valve,

00:02:27   you twist this inner thing the opposite direction

00:02:29   and it unscrews the inner valve stem

00:02:33   and lets it pop out inside of itself

00:02:36   and then you just pop a thing and it just,

00:02:39   psh, lets huge amounts of air out.

00:02:41   It can go from 40 PSI to 20 PSI in 20 seconds.

00:02:46   It's massively fast.

00:02:47   - I find this very alarming.

00:02:49   So I could be, yeah, it doesn't suck.

00:02:51   - Yes, it just basically disassembles

00:02:53   your tire valve temporarily,

00:02:55   and I was able to deflate all four of my tires

00:02:58   from 40 to 20 in less than five minutes.

00:03:01   It was shockingly fast. (laughs)

00:03:04   So anyway, I don't know who needs this product

00:03:07   besides a very small handful of people,

00:03:09   but if you are one of those handful of people,

00:03:11   this is amazing.

00:03:12   Check it out. (laughs)

00:03:14   - Why is it so difficult for me to find

00:03:16   the owner's manual for this car?

00:03:18   Why is this like state secrets?

00:03:20   I will say this is actually gonna be

00:03:22   the Casey-owes-Marco-an-apology tour episode apparently,

00:03:25   but according to a spot of Googling,

00:03:27   it was, I found a very unofficial looking site

00:03:30   that said it was like 47 or 50 pounds,

00:03:32   which is what you said to your credit,

00:03:33   and I said no frickin' way.

00:03:35   But I'm trying to find the actual owner's manual,

00:03:38   and I can't.

00:03:40   - So one thing, it lets you kind of customize

00:03:43   the different sections of the dashboard

00:03:45   to a few different selections,

00:03:46   and I want a constant display on the left side

00:03:49   of the four tire pressures.

00:03:51   There is no option for that.

00:03:53   However, if it's below the recommended pressure,

00:03:56   it puts it up there by default

00:03:57   until you click a certain button.

00:03:59   And because mine are always below the,

00:04:02   this is why I know the recommendation is higher than 40,

00:04:04   mine are always below the recommendation.

00:04:05   And so because of that,

00:04:08   as long as I don't hit this one button,

00:04:10   it's displaying the four tire pressures all the time,

00:04:12   which is a nice little accidental feature,

00:04:14   but I'm really enjoying it.

00:04:16   - Well, I owe you an apology.

00:04:18   I am apparently very wrong.

00:04:20   Oh wait, okay, oh no wait, I found it, but it's not a PDF.

00:04:23   Just give me a friggin' PDF!

00:04:25   God, I hate everything.

00:04:26   (electronic beeping)

00:04:28   - So, before we get into topics,

00:04:29   I wanna start this episode also

00:04:31   with a public service announcement.

00:04:33   If you know people in your life with Apple Watches,

00:04:38   in the most kind way possible,

00:04:40   if you see them using an Apple Watch face

00:04:44   that has complications, you will blow their mind

00:04:49   if at some opportune time,

00:04:50   if they're open to such suggestions,

00:04:52   you tell them, hey, by the way,

00:04:53   you know you can change what those are.

00:04:56   So in my experience, I have a bunch of friends

00:04:58   who are kind of nerdy,

00:05:00   who they're nerdy enough to get an Apple Watch,

00:05:02   and they're nerdy enough to care about it being good,

00:05:06   but not nerdy enough to be able to know some of that stuff,

00:05:09   like you can customize the face and all the complications.

00:05:13   And I've shown a couple of people this

00:05:15   in the last couple of weeks.

00:05:17   And these are people who've worn an Apple Watch for years.

00:05:20   And every time, I'll see that they're using

00:05:23   the Infograph modular face, it's full of complications.

00:05:26   And the default set is like, it's okay,

00:05:28   you know, you got a weather, okay,

00:05:29   then you got the compass, like eh, it's like how you,

00:05:32   you know, there's a couple things on there

00:05:33   I think most people wouldn't use.

00:05:35   And sometimes they'll see mine, they're like,

00:05:37   oh, what face is that?

00:05:38   'Cause mine's of course all customized.

00:05:39   And when I show them, like hey, you can hold down here,

00:05:42   swipe over, and then you can change every single one

00:05:45   of those to all these different things.

00:05:47   Everyone's like, oh my god, it changes their,

00:05:50   it blows their mind.

00:05:51   They're so amazed.

00:05:53   No one knows that you can customize watch faces,

00:05:56   especially complications.

00:05:58   So just PSA, if someone you know uses a face

00:06:02   with complications and you see that they're all the defaults

00:06:04   'cause they usually are, again, some opportune time,

00:06:07   don't be like the annoying nerd like, you know,

00:06:08   like you know, at some nice time, be like, hey, by the way,

00:06:10   you know you can change that and here's how

00:06:12   and show them how.

00:06:13   You can also show them that they can have multiple faces

00:06:16   they can swipe between really easily.

00:06:18   No one knows that either.

00:06:19   And both of those things I have had

00:06:22   very strong positive reactions to when I have shown people.

00:06:25   - As a non-Apple Watch wearer who does know

00:06:28   that you can change complications,

00:06:29   every time I try to change complications

00:06:31   on someone else's Apple Watch for them,

00:06:32   I have to like refigure out how to do it.

00:06:34   It is not an obvious UI,

00:06:36   so I don't really blame people for not,

00:06:38   I mean, I kinda blame them for not knowing it's possible,

00:06:40   but even when you know it's possible,

00:06:43   it's not obvious how to do it

00:06:45   if you haven't done it recently.

00:06:46   - Yeah, exactly.

00:06:47   Like so many things about touchscreen discoverability,

00:06:50   like it's so much worse on the watch

00:06:51   'cause it's such a small screen.

00:06:52   - It is way worse on the watch 'cause there's just nothing.

00:06:55   There's nothing there for you.

00:06:56   And I think the, sort of the credit

00:06:58   to the original Apple design,

00:07:00   having two things that you can press,

00:07:02   the button and the crown both press in,

00:07:05   does add to the number of possibilities

00:07:07   of things that I have to blindly try

00:07:08   before I figure out how to work things.

00:07:10   And then of course, oh, I forgot,

00:07:11   you can turn the crown too.

00:07:12   And it's like, it's just,

00:07:13   I go through the same little dance every time I do it.

00:07:15   It's ridiculous.

00:07:17   - Do you know Marco, if you have 18, 19, 20 or 22 inch rims?

00:07:20   - 19.

00:07:21   - Then 47 pounds up front, 50 in the back.

00:07:24   - Bam. - I am wrong.

00:07:26   Well done.

00:07:27   I have never seen tire pressures that high.

00:07:29   Goodness.

00:07:30   - It's 'cause they're big tires, right?

00:07:31   It's big.

00:07:32   - I mean, I guess.

00:07:33   - Yeah, they're pretty big.

00:07:34   - Yeah, I did eventually find it because Google,

00:07:36   not because of the Land Rover site,

00:07:37   which as I've heard is trash.

00:07:39   - Yes.

00:07:40   - But hey, you know what is not trash?

00:07:42   What is not trash is ATP merch,

00:07:45   which is available right now, baby, at ATP.fm/store.

00:07:49   So, you know how every time I say,

00:07:52   "Oh, you really gotta order quick

00:07:53   "'cause you never know if something's gonna sell out, huh?

00:07:55   "You never know."

00:07:56   Well, guess what?

00:07:58   Guess what?

00:07:59   It's happened.

00:08:00   Chicken hats, boom, gone.

00:08:02   Second run of chicken hats,

00:08:03   which we did like under duress, boom, gone.

00:08:07   Pint glasses, gone.

00:08:08   Mugs, gone.

00:08:09   You snooze, you lose.

00:08:11   No sympathy from me.

00:08:12   This is what you get.

00:08:13   You had sympathy from the other two, not from me.

00:08:15   I warned you, I warned you.

00:08:17   We did the visualization exercise,

00:08:19   which I'm pretty sure I stole from Mike Hurley.

00:08:20   We did the exercise together.

00:08:22   You still didn't order, and this is on you.

00:08:24   But for those that haven't ordered

00:08:26   and would still like lovely, excellent, delightful merch,

00:08:29   we still have M2 shirts in color and monochrome versions.

00:08:33   We have the classic ATP logo shirt.

00:08:35   We have the utterly delightful ATP hoodie, all available.

00:08:38   So if you're interested, and you should be,

00:08:41   go to ATP.fm/store, and you can buy any of these three,

00:08:46   four items, the two different varieties of shirt,

00:08:49   well, three if you include the ATP one and the hoodie,

00:08:51   all available.

00:08:52   And remember, members, if you are a member,

00:08:55   and you can become a member at ATP.fm/join,

00:08:58   then you get 15% off.

00:08:59   You can get your bespoke discount code on your member page.

00:09:01   It's all linked from ATP.fm/store.

00:09:04   Go get yourself some treats.

00:09:05   Treat yourself, you deserve a treat,

00:09:06   you deserve several treats, treat yourself,

00:09:08   treat your friends, treat your husband,

00:09:09   treat your wife, treat everybody, ATP.fm/store.

00:09:12   - Couple more items on the store.

00:09:15   Reminder about the monochrome M2 shirt,

00:09:18   it just shows like a blue shirt with white,

00:09:20   but it comes in a whole bunch of different colors.

00:09:21   So if you know, the regular M2 one,

00:09:24   the really expensive one with the color stripes

00:09:26   just comes in black, but this other one comes in like red,

00:09:30   purple, green, pink, teal,

00:09:32   so check out the different colors,

00:09:33   think a lot of them look really good.

00:09:34   Second thing, if you, for one of the sold out items,

00:09:37   chicken hat, pint glass, or mug, all three of which,

00:09:40   I specifically warned you about last week

00:09:42   as the ones that would sell out first.

00:09:43   And they did in the order that I said.

00:09:46   - Wait, in all fairness, before we rag on people

00:09:48   too hard on this, which Casey already did,

00:09:50   you do have to have some sympathy for me because, like,

00:09:52   the first batch of the chicken hats,

00:09:54   it was like halfway gone before we even published

00:09:56   the episode, just from bootleg listeners.

00:09:58   - Yeah, that is true, that is true.

00:09:59   - We ordered, we guessed wrong on the demand

00:10:03   for the chicken hat, that's on us.

00:10:04   But we did do a second order,

00:10:06   and we did a second order for more than twice

00:10:08   the amount of the first order, and that sold out too.

00:10:10   - Yeah, as much as I'm snarking earlier,

00:10:12   that is 100% accurate, and that is 100% on us,

00:10:15   that we clearly underestimated, which is lovely,

00:10:17   and actually a heck of a compliment

00:10:18   from our wonderful listeners.

00:10:20   It was a complete miss on our part.

00:10:22   We thought nobody would be interested in a hat

00:10:25   that we presented as something that John bought

00:10:28   20 years ago and really likes, but it turns out--

00:10:30   - And it's weird shaped. - And it's weird shaped.

00:10:32   - But it turns out the pull of John Siracusa

00:10:35   has no bounds, knows no bounds.

00:10:37   - Morbid curiosity is what I would call it.

00:10:39   But anyway, related to the sold out items,

00:10:41   if you go, they're still on the store page,

00:10:43   they just put like sold out text on them,

00:10:45   but you can still click through on them.

00:10:46   Click the little buy link, click the name,

00:10:48   click the image, like anyway, click through to the item.

00:10:51   It has a button on the store that says bring it back.

00:10:53   It's just Cotton Bureau's normal thing,

00:10:54   like hey, I want this even though it's not in stock.

00:10:57   If you still want a hat, a pint glass, or a mug,

00:11:00   click through on it and then click the bring it back thing.

00:11:02   it makes you enter the email address

00:11:04   'cause they wanna email you when it comes back in stock.

00:11:05   And I know that's annoying,

00:11:06   but Cotton Bureau is good with email.

00:11:08   That's how we knew, sort of,

00:11:10   that's how we gauge demand

00:11:11   for the second order of chicken hats

00:11:13   based on how many people click the bring it back link

00:11:15   and enter an email address.

00:11:17   And that gave us a rough estimate

00:11:18   of how many people probably wanted them.

00:11:19   And I think we got it pretty close

00:11:20   'cause towards the end of selling out of the chicken hats,

00:11:23   the sales were really slowing down.

00:11:25   It was like just a real trickle, like.

00:11:27   The number wouldn't move at all for like hours at a time.

00:11:29   So I think we kind of mostly met demand,

00:11:31   but we're not entirely sure.

00:11:32   And the mugs we did a new order of,

00:11:34   and the pint glass we were just selling through an old order.

00:11:36   So in order for us to have a better idea for the next sale,

00:11:41   how much demand there is for this stuff,

00:11:43   you can use that bring it back button,

00:11:45   and it just ticks up a counter

00:11:47   that we can see on our internal dashboards

00:11:48   to know how many people are actually interested

00:11:50   in this again.

00:11:51   So that's my suggestion there.

00:11:53   And then finally,

00:11:54   lots of people are sending pictures of the chicken hats.

00:11:56   If you ordered it, especially if you were in the first batch,

00:11:58   you might get them already.

00:11:59   I already got my chicken hat,

00:12:00   and lots of people are getting them,

00:12:02   'cause the things that are in stock,

00:12:03   I think, I'm not sure about the mugs and the pint glasses,

00:12:05   but certainly the hats,

00:12:06   they just ship them as soon as they get them.

00:12:07   The other ones we have to wait until the campaign is over

00:12:09   so they know how many they have to print

00:12:11   'cause they're gonna make them

00:12:12   based on how many people order them.

00:12:14   Anyway, people are sending me pictures

00:12:15   of them wearing the chicken hat.

00:12:17   Bad on us for not including instructions with the hat

00:12:19   and far be it from me to tell you how to wear hats.

00:12:24   And I was like, yeah, you put it on your head.

00:12:25   People are getting that far.

00:12:27   But part of the reason it's called a chicken hat

00:12:29   is 'cause it looks a little bit like

00:12:31   the little thing on top of a chicken or a rooster's head,

00:12:34   the little floppy thing.

00:12:35   - The little mohawk thing?

00:12:37   - Yeah, and that mohawk thing,

00:12:39   it's like a vertical fin that goes from front to back.

00:12:41   That's how, I'm not gonna say supposed to be warm,

00:12:44   but that's how I wear the chicken hat,

00:12:46   is front to back, like a shark fin, right?

00:12:50   And then second thing to know is,

00:12:52   this hat, although it's pretty uniform,

00:12:54   it does have a seam.

00:12:56   Like there is a seam down one part,

00:12:57   and the seam goes in the back,

00:12:59   like most items of clothing that have a seam.

00:13:01   The seam goes in the back,

00:13:03   which means that the little ATP tag

00:13:05   is also more towards the back.

00:13:07   Originally, we wanted that tag to be in the front,

00:13:09   but they made the first batch of them with them on the back,

00:13:11   and then seeing people put them on their heads,

00:13:15   and especially with glasses,

00:13:16   the tag, when it's on the front,

00:13:17   kind of interferes with the glasses

00:13:19   and messes up the line of the thing.

00:13:21   So we made the second batch.

00:13:22   I said, "Put the tag in the back.

00:13:23   I think it looks better there."

00:13:24   But everybody who I see wearing this online

00:13:26   thinks the tag goes in the front.

00:13:27   And it can, like it's a uniform hat, it's a single color.

00:13:30   No one's gonna see that the seam is in the front.

00:13:32   But technically the seam goes in the back

00:13:34   and the hat goes vertically forward and backwards.

00:13:37   You wanna wear it any other way, feel free.

00:13:39   It's your hat, you can do whatever you want with it,

00:13:41   but I'm just saying for the, if you're wondering,

00:13:43   how does Jon wear his chicken hat?

00:13:45   I wear it chicken style, like a dorsal fin

00:13:49   with the seam in the back.

00:13:51   - All right, we got a lot to talk about.

00:13:53   Let's do some follow up.

00:13:54   I need to continue my apology tour for Marco.

00:13:57   Apparently, based on our feedback, freaking nobody knew how to use the AirPods Pro 2 volume control

00:14:04   Vindicated. Yeah, I assumed that this was obvious. I do not have Pro 2. I only have the

00:14:10   pedestrian original OG AirPods Pro

00:14:13   And I just assumed this was pretty self-explanatory and wow did I assume wrong because we got a lot of very kind feedback saying

00:14:20   Oh my goodness. Thank you so much for telling me how to do this. I had no idea

00:14:24   So Marco, first of all, I'm sorry because I think I was snickering in your general direction

00:14:29   last week and saying, "Oh my gosh, are you serious?"

00:14:31   And it turns out I was wrong.

00:14:33   Second of all, would you mind reiterating, since we brought it up again, can you just

00:14:36   quickly reiterate how is it that you control the volume on an AirPods Pro 2, please?

00:14:42   So you don't use two different fingers in motion, as I was doing, like front and back

00:14:47   kind of, you know, doing something to it.

00:14:51   You don't do that.

00:14:53   Instead, you put the thumb of your hand

00:14:56   on the back of the stem to hold it in place,

00:14:58   and with your index finger, you swipe up or down

00:15:02   on the front flat part of the stem

00:15:05   to change it up and down.

00:15:06   And you know you did it because only that AirPod

00:15:09   will make a little click sound out of that ear

00:15:12   to confirm that you did it.

00:15:14   And so my mistake was moving both fingers.

00:15:16   What you're supposed to do is support it with one finger

00:15:19   and swipe with the other.

00:15:20   - Yep, so my apologies, Marco,

00:15:22   you were right to bring it up and I was wrong.

00:15:25   - I am so happy, so many people wrote in

00:15:27   and were like, yeah, I didn't know how to do it either,

00:15:28   or I've been doing it wrong, I also wrote it off.

00:15:31   I'm so vindicated, I thought I was the only one.

00:15:33   I felt so dumb when I finally figured it out.

00:15:36   That's why I wrote it up on the show,

00:15:38   to be like, hopefully somebody else can feel less dumb

00:15:42   by this, and sure enough, many people now feel less dumb.

00:15:45   So mission accomplished. - Dozens, dozens.

00:15:47   - Yes. - And hearing your

00:15:48   complication story, I wonder if the people

00:15:49   who didn't know that you could adjust the volume

00:15:51   by swiping were just too embarrassed to write in.

00:15:53   (laughing)

00:15:54   - You can change complications?

00:15:56   - What?

00:15:57   - You can change, well, and keep in mind,

00:15:58   this volume thing only works on the AirPods Pro 2.

00:16:02   It doesn't work on the AirPods 3,

00:16:04   it doesn't work on the original AirPods Pro,

00:16:05   even though they all look kinda similar.

00:16:07   - We are sponsored this week by Linode,

00:16:11   my favorite place to run servers.

00:16:12   Visit linode.com/ATP and see why so many people,

00:16:16   including us, use Linode to host our servers.

00:16:20   I run a lot of servers.

00:16:22   (laughs)

00:16:23   And Linode makes it super easy.

00:16:24   They're all at Linode.

00:16:26   I've moved off of all other hosts

00:16:27   over the last few years because Linode is by far the best.

00:16:30   They even have other services now,

00:16:32   things like an S3 compatible object storage,

00:16:35   which I'm also now using.

00:16:36   They have managed load balancers, managed backups.

00:16:39   They also now have managed databases.

00:16:41   I'm not using those yet,

00:16:42   but those are very tempting, honestly.

00:16:43   So Linode is great.

00:16:45   They're an amazing web host to run servers

00:16:47   and all these different managed services they now offer.

00:16:49   And they have amazing support, amazing capabilities,

00:16:52   huge variation in things like resource levels

00:16:55   and what you're concentrating on for your resources

00:16:58   so you can have like high memory or high CPU

00:17:00   or GPU compute plans.

00:17:02   All of that in so much variety.

00:17:03   And what really is great about Linode

00:17:06   is that you get all of that at incredible value.

00:17:09   This is why I got there in the first place

00:17:11   almost a decade ago.

00:17:12   And that has kept me there all this time

00:17:14   because I have not found a better value

00:17:16   in the hosting business.

00:17:18   At all times, I've been with them,

00:17:19   again, for almost a decade now,

00:17:21   they have been either the cheapest

00:17:23   or tied for the cheapest for anything I'm looking to get.

00:17:26   And I have a lot of stuff there,

00:17:28   so that really adds up.

00:17:29   And so I'm just super happy at Linode.

00:17:31   I strongly recommend,

00:17:32   whether you're gonna spend five bucks a month

00:17:34   for a really basic VPS kind of server,

00:17:36   or whether you're really going in

00:17:37   and making some really big stuff,

00:17:39   Linode scales with you.

00:17:40   They offer the same great support,

00:17:42   no matter how much money you're spending with them.

00:17:43   It's a great server host and they have a whole API

00:17:47   and everything, it's amazing being a Linode customer.

00:17:49   See for yourself, linode.com/atp.

00:17:52   Create a free account and you'll get $100 in credit.

00:17:55   Once again, linode.com/atp,

00:17:57   run all your cloud stuff at Linode.

00:17:59   Thank you so much to Linode for hosting all my servers

00:18:02   and sponsoring our show.

00:18:03   - I have a little bit of news about my beloved,

00:18:09   don't call it a park bench picnic table.

00:18:12   Went there to do a little bit of work today because it was very nice out here in Richmond

00:18:15   It's just a touch chilly but not too bad and I was all slick with myself

00:18:18   I was all excited because I've got my fancy pants new iPad Pro

00:18:22   It's got 5G, it can support 5G Ultra Wideband by Verizon, baby. I'm ready

00:18:28   Except I got there and my phone, you know, took a few seconds, and my phone shows 5G UW, 5G Ultra Wideband

00:18:36   I am ready to rock on the phone

00:18:38   The iPad says you got 5G

00:18:41   So I wait a little bit, still got 5G

00:18:43   Wait a little bit kind of diddle around on the iPad a little bit

00:18:46   You know, do a fast.com speed test just to see if that would kick it into, like, you know

00:18:50   high gear or whatever and I look and it's

00:18:53   Still just 5G. Oh, no, did your phone work on it? Yeah. Oh, yeah, my phone was fine. My phone was good to go

00:19:00   So, okay what's going on here? Remember that this has a physical SIM

00:19:05   I was briefly confused by this when I did the transfer because I was waiting for it to transfer the eSIM

00:19:10   that wasn't an eSIM. But anyways, so I get on the Verizon chat, which while annoying,

00:19:15   has actually got a pretty high success rate, their online chat, for getting me an answer

00:19:21   and/or getting me what I want. You know, I got my cable card through the online chat.

00:19:26   That mostly worked. I've done a lot of stuff via the online chat, and so I get on the online

00:19:30   chat and I did that thing you should never do, and I kind of led with, "Do I need a new

00:19:35   SIM card, you know, is the SIM card which was, you know, originally in an iPad that

00:19:39   did not support 5G, it only supported LTE, does the SIM card perhaps not have whatever

00:19:45   magic bit flipped or whatever it may need in order to get 5G Ultra Wideband?

00:19:51   And after a considerable amount of time waiting for the person that was surely helping 35

00:19:56   other people, it turns out that there are multiple iPad plans at Verizon.

00:20:01   The one that I have is normally $20 a month,

00:20:04   but because of my phone's plan,

00:20:06   I only pay $10 a month for it, it's half off.

00:20:08   And that gives me 5G, and it gives me 15 gigs of data,

00:20:12   of quote unquote premium data,

00:20:15   which I guess means basically you get 15 gigs

00:20:17   and they start to throttle you.

00:20:18   But if I want ultra wide band, I need to go to $30 a month,

00:20:23   which I think would only be five bucks more for me,

00:20:25   'cause I think all of the iPad plans are half off,

00:20:27   but I gotta do some research on this.

00:20:28   Anyway, so strictly speaking, $30 a month,

00:20:31   then I can get my 5G Ultra Wideband

00:20:33   and I go from 15 gigs of premium data to 30 gigs.

00:20:37   So I gotta confirm,

00:20:39   'cause this is not worth an additional $10 a month,

00:20:42   'cause remember, my plan is supposed to be 20,

00:20:45   I pay 10 because of the way I have my phone set up.

00:20:48   I would pay probably $5 more a month for this

00:20:50   because darn it, I want my Ultra Wideband,

00:20:52   that's the whole point, man.

00:20:53   But I don't think I would pay 10 more dollars for this,

00:20:55   so I gotta figure out what the story is.

00:20:56   And yes, I know I'm frugal/cheap/whatever,

00:20:59   but I gotta look into that.

00:21:00   But I just thought in case anyone else,

00:21:02   yes, this is kind of another PSA,

00:21:03   another entry in the PSA corner,

00:21:05   in case anyone else was confused,

00:21:06   that apparently is the case for Verizon.

00:21:08   I cannot speak for any other carrier.

00:21:10   And I just, I did not expect that.

00:21:11   So I was disappointed,

00:21:12   but I was at least happy to understand it.

00:21:14   So there is that.

00:21:16   - Is this also another case where if you were to hotspot

00:21:18   to your phone, you'd be throttled

00:21:20   by the Bluetooth bandwidth maybe?

00:21:21   - Yeah, or whether it's using Bluetooth or wifi,

00:21:24   yeah, you're way throttled,

00:21:25   but even by wifi you'd be throttled.

00:21:27   - Yeah, that's correct.

00:21:28   In fact, when I first was piddling with the ultra wideband stuff, I had done it via a

00:21:33   physical USB connection to my computer, and it was dog-slow on the computer end.

00:21:38   And I'm like, "What the heck?

00:21:39   I've got like 2.5 gigabits per second down on the phone.

00:21:43   Why the--" Oh, right.

00:21:44   This is USB 2.0.

00:21:45   It's slower than dirt.

00:21:46   Whoopsie-doopsie.

00:21:47   And so I decided, "Oh, if I'm going to tether, I am not going to tether via the USB connection.

00:21:52   I'm not going to tether, to your point, via Bluetooth.

00:21:54   I will tether via Wi-Fi.

00:21:56   Even if I have the phone connected to the computer for power purposes, I will still

00:22:01   tether via Wi-Fi because that is as close as I can get to full bandwidth from UltraWideband.

00:22:07   So yeah, you're exactly right.

00:22:08   Anyway, I just thought that was fascinating.

00:22:10   We have a lot to talk about with regards to system settings, Jon.

00:22:13   Tell me about this in Ventura, please.

00:22:15   This was something that someone asked about a while ago that I found when I was digging

00:22:20   through some old stuff regarding the switch controls in system settings.

00:22:25   The switches are like the ones on your iPhone.

00:22:26   When you go to settings on your iPhone,

00:22:27   there's a little on/off switch that goes horizontally,

00:22:29   a little circle goes horizontally

00:22:31   in a little channel, right?

00:22:33   And of course, system settings in Ventura

00:22:35   looks a lot like that now.

00:22:36   And lots of people were throwing this entry

00:22:39   from the Apple Human Interface guidelines

00:22:40   back in Apple's face when the betas were out.

00:22:43   To be clear, this is not the item

00:22:45   that was brought to my attention, but it is relevant to it.

00:22:48   So in the Apple HIG,

00:22:49   at a location that we'll link in the show notes,

00:22:52   Apple says, "Avoid using a Switch control

00:22:54   for a single detail or minor setting.

00:22:56   A switch has more visual weight than a checkbox,

00:22:58   so it looks better when it controls more functionality

00:23:00   than a checkbox typically does.

00:23:01   For example, you might use a switch to let people turn on

00:23:04   or off a group of settings instead of just one setting.

00:23:07   In general, don't replace a checkbox with a switch.

00:23:10   If you're already using a checkbox in your interface,

00:23:12   it's probably best to keep using it."

00:23:13   That's what the Apple Human Interface guidelines say.

00:23:16   Arguably, Apple violates that ten ways to Sunday

00:23:18   with the Ventura System Settings,

00:23:20   because pretty much all those things used to be checkboxes

00:23:21   and they didn't leave them alone,

00:23:22   and pretty much all of them are controlling

00:23:25   a single detail or minor setting.

00:23:27   So there is that, right?

00:23:29   But that's not the biggest thing,

00:23:30   'cause you can argue, oh, well, is this a minor setting?

00:23:32   Is it a big setting?

00:23:33   Is it important for it to match the iOS settings

00:23:35   for familiarity?

00:23:36   You can go all around in different directions and that,

00:23:37   although it is kind of weird that it is directly

00:23:40   in violation of what they say

00:23:41   in the Apple Human Interface Guidelines.

00:23:42   But this is inarguably another example of,

00:23:45   I don't know, just sort of not meeting the minimum standard

00:23:49   for a Mac user interface.

00:23:51   And this, kind of like complications

00:23:53   or possibly changing the volume on your AirPods,

00:23:55   it may be a thing that nobody knows about,

00:23:57   but I feel like if anyone should know about it,

00:23:59   it's the people making GUIs for the Mac who work at Apple.

00:24:03   I don't think that's too high of a standard.

00:24:05   And that is this, on the Mac, when you have a checkbox,

00:24:08   and also on the web, by the way, if you do it right.

00:24:10   But anyway, on the Mac, if you have a checkbox

00:24:12   and it says like, you know, turn on turbo lasers

00:24:15   and it's got a checkbox and the text next to the checkbox

00:24:17   says turn on turbo lasers, right?

00:24:19   Or turbo lasers on, I don't know.

00:24:20   I'm doing a bad job with the copy here,

00:24:22   but suffice to say there's a label next to the checkbox.

00:24:24   You can click the checkbox to make the checkbox

00:24:27   be on or off, but you can also click the label text.

00:24:31   And the reason why you might wanna do that

00:24:32   is the label text is way bigger than the checkbox.

00:24:35   Checkboxes are pretty small.

00:24:38   They can be hard to even see.

00:24:39   That's one of the arguments people make for switches.

00:24:41   Oh, the switches are so much bigger than a checkbox.

00:24:42   They should use the switches

00:24:44   because they're just easier to hit.

00:24:45   It's a bigger target, and there is something to

00:24:46   it being a bigger target, right?

00:24:48   Although the switches on Mac

00:24:49   are actually kind of small,

00:24:50   but the label is a way bigger target.

00:24:53   It's very wide and just as high as the checkbox,

00:24:55   if not higher.

00:24:55   That's why on the Mac, you can click the label

00:24:58   anywhere in the label text,

00:25:00   and it will activate and deactivate the checkbox.

00:25:02   And you can do it on the web if you know what you're doing,

00:25:04   and you should, and if you don't do it on the web,

00:25:05   you should feel bad.
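For reference, the web version being alluded to here is the HTML `<label>` element; associating it with the control is what makes the label text clickable. A minimal sketch (the "turbo lasers" copy is just the joke example from above):

```html
<!-- Explicit association: clicking the label text toggles the checkbox -->
<input type="checkbox" id="turbo-lasers">
<label for="turbo-lasers">Turn on turbo lasers</label>

<!-- Implicit association: wrapping the control in the label works too, no id needed -->
<label><input type="checkbox"> Turn on turbo lasers</label>
```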

00:25:06   All right.

00:25:07   (laughs)

00:25:08   So the question was, hey, in Ventura system settings,

00:25:11   when they have these big, long lists of labels and switches,

00:25:16   can you click the label to toggle the switch?

00:25:18   And the answer is, at least in the one that I tested,

00:25:20   no, you can't, and that is bad.

00:25:23   That is very bad, 'cause that is way worse than a checkbox.

00:25:26   And yes, I know the labels are distant from the slider,

00:25:28   from the little switch too,

00:25:29   instead of right next to it like checkboxes,

00:25:31   but I think it's also kind of bad.

00:25:32   But why can't I click the label

00:25:34   to activate and deactivate the slider?

00:25:36   That is a Mac idiom, and this is a replacement for a,

00:25:39   I mean, I don't know, maybe there's somewhere

00:25:40   in the Human Interface Guidelines

00:25:41   where they say explicitly don't do this,

00:25:42   but that makes the targets for every one

00:25:44   of these little things so much smaller than they used to be.

00:25:46   They're bigger than checkboxes,

00:25:47   but they're way smaller than a checkbox and a label combined.

00:25:49   And I just don't understand that.

00:25:51   I don't understand how that comes to be

00:25:54   where they replace all the checkboxes

00:25:56   with this kind of control.

00:25:57   And then, oh, and by the way,

00:25:58   we forgot for this type of control

00:25:59   to make the label clickable.

00:26:02   Upsetting.

00:26:04   - Very upsetting.

00:26:06   And then apparently some things are disappearing

00:26:08   from the settings, system settings, system preferences,

00:26:12   whatever it's called.

00:26:13   - Yeah, we noted like the network locations was gone.

00:26:15   I don't know why that feature disappeared, but it did.

00:26:17   But there's one more feature that disappeared

00:26:19   and it's relevant to my interest because I use this feature.

00:26:21   It's a feature I've talked about in the past

00:26:23   that used to be an energy saver in system preferences.

00:26:26   There was a GUI for setting a time

00:26:28   when you want your computer to wake up

00:26:30   and setting a time when you want your computer

00:26:31   to go to sleep.

00:26:32   And I use these features to have my computer

00:26:33   wake up in the middle of the night when I'm sleeping,

00:26:35   do a bunch of backup stuff and then go to sleep.

00:26:38   So I put my computer to sleep when I go to bed

00:26:40   and then while I'm sleeping it wakes up,

00:26:41   does a bunch of stuff, backup stuff

00:26:43   and then goes back to sleep, right?

00:26:45   Those, the ability to do that, like timed waking up

00:26:48   and going to sleep, still exists in the OS.

00:26:51   They just either got rid of or didn't have time

00:26:54   to reimplement the GUI for that, which is kind of a shame.

00:26:57   We'll put a link in the show notes to Apple's support

00:27:00   document that shows you how to do it from the command line.

00:27:02   So I had to go and read the man page and come up

00:27:04   with the command lines that are equivalent to my settings

00:27:07   in case I ever want to restore them.

00:27:08   It's an easy way to get rid of the schedule

00:27:11   and it's an easy way to add it back.
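For anyone hunting for the same thing, the relevant tool is `pmset`. This is a sketch based on its man page; the days and times here are made-up examples, not the actual schedule discussed:

```shell
# Show the current repeating wake/sleep schedule, if any
pmset -g sched

# Example: wake (or power on) at 3:00 AM and sleep at 4:00 AM, every day.
# Day codes are MTWRFSU; changing the schedule requires admin rights.
sudo pmset repeat wakeorpoweron MTWRFSU 03:00:00 sleep MTWRFSU 04:00:00

# Remove the repeating schedule entirely
sudo pmset repeat cancel
```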

00:27:12   So it's not so bad, but no one's ever gonna find

00:27:15   that document. The GUI was a much friendlier way to let people change this, and since the

00:27:18   feature is still there, I hope the GUI will come back someday. Indeed. A really

00:27:23   quick thing I wanted to note I should have talked about this earlier but it

00:27:26   completely slipped my mind. I am NOT trying to be funny. When did landscape

00:27:31   iPads start treating the physical volume up/down switches as

00:27:37   backwards? Wait, they do? Right, thank you! What do you consider backwards when it's

00:27:43   left and right? Well, so here's the thing. This is like natural scrolling all

00:27:48   over again, which I believe in. I believe in natural scrolling, but last I recall, I don't think

00:27:52   either of you do, but I do not. And I disagree with the terminology too. It's like a death

00:27:57   tax. I don't, I don't accept your framing. I'm just using the Apple terminology. So the

00:28:02   way I'm, the way I'm used to it is that, okay, hold your iPad either mentally or physically

00:28:07   hold your iPad in portrait, you know, tall. So on the right hand side, for larger iPads anyway,

00:28:14   the volume up button is the top button, volume down button is the bottom button, right? So far,

00:28:20   so good. That makes sense. Take that and twist it counterclockwise. So you're taking the top and

00:28:27   sliding it so it's now the left. I am used to, and in every other iPad I've ever used, the left side

00:28:35   is volume up, the right side is volume down,

00:28:38   because it's the same function.

00:28:39   - It's the same button as it was before.

00:28:40   If you had kept your finger on the buttons,

00:28:42   it would be the same button.

00:28:42   - That's what mine's doing.

00:28:43   - So on my brand new iPad,

00:28:45   and I believe I read that the Mini is doing this as well,

00:28:48   but I could have that wrong.

00:28:49   On the brand new iPad,

00:28:50   which actually is not in the room with me,

00:28:52   I wish it was, but if I have a portrait twist 90 degrees

00:28:56   to the left, so now the top is the left,

00:28:58   volume up is the rightmost button,

00:29:01   volume down is the leftmost button.

00:29:03   Now if you think about it for half a second,

00:29:05   that actually does make more sense

00:29:07   because you are looking at the meter

00:29:09   as you're voluming up and down.

00:29:11   And as you increase volume, it goes to the right.

00:29:15   As you decrease volume--

00:29:16   - If you change your language to Arabic, does it change?

00:29:19   - (laughs) I don't know, or Hebrew or something.

00:29:21   - Wait, so Benzi in the chat has,

00:29:24   and Jitengur in the chat also have pointed out,

00:29:28   this is actually now a setting.

00:29:29   - Yeah, I looked for that and I must have missed it

00:29:32   because I looked for it and didn't,

00:29:34   oh no, you know, that's what it was.

00:29:35   I told myself, no, embrace it.

00:29:38   This is backwards.

00:29:39   The way you're used to doing it, Casey, is backwards.

00:29:42   - Just like a quote unquote natural scrolling

00:29:44   advocate would do. - Correct, exactly right.

00:29:46   Exactly right.

00:29:47   - Yeah, you see the error of your ways, don't you?

00:29:49   Now that you know the settings there,

00:29:50   you can restore sanity.

00:29:52   - No, I disagree.

00:29:53   I think this does make more sense.

00:29:54   It's just because it's not what I'm used to,

00:29:56   that I'm flummoxed by it.

00:29:58   But I do think it actually makes more sense

00:29:59   because again, as you're voluming up,

00:30:01   this slider is moving from left to right.

00:30:03   As you're voluming down,

00:30:04   it's moving from right to left.

00:30:05   So it does make sense,

00:30:07   but the fact that the buttons are software controlled,

00:30:11   which obviously they were the whole time,

00:30:12   but like you feel, or I felt at least,

00:30:15   as though they were hardware buttons.

00:30:17   Does that make sense?

00:30:17   You know, like this button will always

00:30:20   and forever be volume up.

00:30:21   This other button will always and forever be volume down.

00:30:25   And it turns out that, well, no,

00:30:27   they aren't quote unquote hardware buttons.

00:30:29   they're just giving a hint to the software to do something.

00:30:33   And yeah, it actually does make sense to me anyway,

00:30:35   to have the right one go up and the left one go down,

00:30:38   but it took me by such surprise.
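The behavior being described can be put in a toy model. All the names here are mine, not Apple's; this just encodes the mapping Casey observed, including the settings toggle mentioned in the chat:

```python
# Buttons are named by their physical position when the iPad is held in
# portrait: "portrait_top" has traditionally meant volume up.

def volume_action(button: str, orientation: str, fixed: bool = False) -> str:
    """Return 'up' or 'down' for a physical button press.

    fixed=True models the setting where the buttons keep one meaning
    regardless of orientation.
    """
    if fixed or orientation == "portrait":
        return "up" if button == "portrait_top" else "down"
    # Rotated counterclockwise to landscape, the buttons sit along the top
    # edge and swap: the rightmost button (formerly "down") now raises the
    # volume, matching the on-screen meter that fills left to right.
    return "down" if button == "portrait_top" else "up"
```

So the same physical button that raised the volume in portrait lowers it in landscape, which is exactly the surprise described here.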

00:30:39   And I forget what I was doing.

00:30:40   I don't think I was like in bed with Aaron

00:30:42   or anything like that, but it was at a time

00:30:44   when I was not expecting a bunch of volume

00:30:46   and I wanted to turn the iPad down

00:30:48   and I mashed down on the right hand button

00:30:52   because I was in landscape,

00:30:53   mashed down the right hand button, it just got louder

00:30:54   and I was like, "Oh my God, what's happening?"

00:30:56   And then it occurred to me, oh, this is reversed.

00:30:59   - Unintended volume acceleration?

00:31:00   - Yeah, exactly right.

00:31:02   Hashtag Toyota, am I right?

00:31:03   But anyway, so yeah.

00:31:04   - Audi, come on, get the fake story right.

00:31:06   - Oh, sorry, my mistake.

00:31:07   Are you sure it wasn't Toyota?

00:31:08   - I guess, was it later Toyota?

00:31:10   The Audi one wasn't real, I think, but I don't remember.

00:31:13   - I thought it was Toyota. - I don't think either of them

00:31:14   was super real.

00:31:16   The big story was about Toyota,

00:31:19   but I think it was a little not super solid of a story.

00:31:23   - Well, anyway, for the volume button thing,

00:31:25   I guess we have to wait for Apple to come out

00:31:27   with the brand new landscape volume buttons to match their landscape camera, which is

00:31:31   what they call it when they move the camera to the other side.

00:31:33   Well, if they move the volume buttons to the other side of the iPad, I guess, like the

00:31:38   short side, then it'll be up and well, it'll be up on landscape.

00:31:42   It'll basically give the landscape one a top.

00:31:43   I guess it already has a top because once you put the camera on the landscape edge,

00:31:46   that's the top in landscape.

00:31:47   So then they can put the volume button somewhere and then I guess, you know, they would be

00:31:51   on the left side and then up would be volume up and down and be volume down.

00:31:54   - And then you can debate how they should work

00:31:56   when it's in portrait.

00:31:57   - Yeah, yeah.

00:31:58   Anyway, this took me by surprise.

00:31:59   Obviously--

00:32:00   - Apparently that's how the mini works, too.

00:32:01   - Yeah, I think that's correct.

00:32:03   - 'Cause they had to have room for the pencil.

00:32:04   That's right, the pencil's along the edge.

00:32:06   - Right, again, I don't have a mini, so I can't verify,

00:32:08   but I believe that's correct.

00:32:09   Anyway, I just thought that was very, very confusing

00:32:11   and surprising, and I don't remember that

00:32:12   having been mentioned anywhere, perhaps it was.

00:32:15   - It was probably mentioned in the keynote,

00:32:15   but it was a long time ago.

00:32:17   - We are brought to you this week by Trade Coffee.

00:32:21   You know, we're getting into holiday shopping season,

00:32:24   And if you're looking for something

00:32:25   that you can get somebody who's really hard to shop for,

00:32:28   take a look at a personalized coffee subscription

00:32:31   for them from Trade Coffee.

00:32:32   This is a coffee subscription service

00:32:35   that makes it so simple to discover new coffees

00:32:37   and make your best cup of coffee at home every day.

00:32:41   Trade partners with the nation's top rated

00:32:42   independent roasters to send you coffee

00:32:44   they know you'll love fresh to your home

00:32:47   on whatever your preferred schedule is.

00:32:49   And Trade makes it super easy and convenient

00:32:52   to discover brand new coffees.

00:32:54   Whether you are a coffee noob,

00:32:56   or whether you're kind of a coffee power user,

00:32:58   like I am, Trade is amazing,

00:33:00   because they aren't just one roaster.

00:33:02   Trade partners with all these roasters all over the country,

00:33:04   and so when you subscribe to Trade,

00:33:06   you get a variety of coffees sent to you,

00:33:08   which is really nice, you know?

00:33:10   I have my mainstays, of course,

00:33:11   but sometimes I just want a variety,

00:33:13   and Trade does that for me.

00:33:15   It's really, really great,

00:33:16   and they have these personalized algorithms

00:33:18   and questionnaires and things,

00:33:19   so that they can really make sure

00:33:20   that you're getting what you like.

00:33:22   And it's a wonderful gift now as well.

00:33:25   They make it super easy with digital gifting,

00:33:27   options for last minute shoppers on both their coffee

00:33:30   and their equipment bundles

00:33:31   for something to put under the tree.

00:33:33   So treat yourself or the coffee lover in your life

00:33:37   with Trade Coffee.

00:33:38   Right now, Trade is offering our listeners

00:33:40   a total of $30 off a subscription

00:33:43   and access to limited time holiday specials

00:33:45   at DrinkTrade.com/ATP.

00:33:49   That's DrinkTrade.com/ATP for $30 off a subscription.

00:33:54   Once again, DrinkTrade.com/ATP.

00:33:58   Thank you so much to Trade for keeping me caffeinated.

00:34:01   I'm caffeinated by them right now as we speak,

00:34:04   and for sponsoring our show.

00:34:05   - Hey, would you tell me about the great news

00:34:12   coming out of The Verge, among other places,

00:34:14   with regard to the new Apple TV 4K,

00:34:15   which arrives Friday, if I'm not mistaken.

00:34:17   Isn't that right?

00:34:18   - I think so, yeah, it's okay news.

00:34:20   We talked about, I should have had the expansion

00:34:24   of the abbreviation in the notes here

00:34:26   with Quick Media Switching, I believe it was: QMS, HDMI QMS.

00:34:29   It was part of the HDMI 2.1 standard

00:34:31   that lets the television change frame rates

00:34:34   without blacking out the screen,

00:34:36   like leveraging the VRR, the variable refresh rate feature

00:34:40   that's already in HDMI that many televisions support;

00:34:42   QMS is the thing on top of that

00:34:43   that would try to avoid some of those black screens.

00:34:46   I talked about the limitations of it last week

00:34:47   that one limitation is that everything except for the frame rate has to stay the same and

00:34:51   I wasn't sure if that included HDR versus SDR, so you still might be getting black screens

00:34:55   when you go from your Apple TV menu screen to a video program if the HDRness doesn't

00:35:01   match but we'll see.

00:35:02   But the real main limitation is both devices have to support HDMI QMS.

00:35:09   And so the news is that Apple says that, and I don't know, this was in the Verge article

00:35:14   and a bunch of other articles, Apple must have just told people reviewing the Apple

00:35:16   TV this information, but Apple basically says that the Apple TV 4K will be getting QMS in

00:35:23   a future software update.

00:35:25   No timelines or whatever, but it's coming, right?

00:35:27   But as the Verge article says, "How many TVs work with QMS VRR, you asked?

00:35:32   Well, zero at the moment, but you'll start seeing them hit the market next year."

00:35:36   I really hope they'll be able to do firmware updates because if you already support VRR,

00:35:39   which most fancy modern TVs do, supporting QMS VRR seems like it's within the realm

00:35:46   of possibility for a firmware update, but you never know with TV makers. Sometimes they

00:35:50   just never update old TVs to be able to do something that a new TV does. So my fingers

00:35:54   are crossed for my television, which doesn't currently support QMS, that someday in the

00:36:00   future it will support QMS. And also someday in the future the Apple TV 4K that's coming

00:36:04   to my house will eventually support QMS.

00:36:06   Yeah, so I was looking at this briefly and I believe that VRR and QMS are part of HDMI

00:36:13   2.1, I think?

00:36:16   but they're optional like every part of HDMI 2.1.

00:36:18   - So I guess the question is, you know,

00:36:20   what version of HDMI do all of our televisions support?

00:36:23   - It's not the version, the standard is so crappy.

00:36:26   You can be compliant with HDMI 2.1

00:36:28   if you only support HDMI 2.0 features.

00:36:30   It's a ridiculous standard.

00:36:31   Like it just, it lets people put the label on.

00:36:33   You have to, that's why you go to these review sites

00:36:35   and they have a giant grid of like,

00:36:36   what features does it actually support?

00:36:38   ALLM, VRR, you know, it's all this acronym

00:36:40   soup or whatever.

00:36:41   But HDMI 2.1 tells you almost nothing.

00:36:44   All it tells you is that it's plausible

00:36:46   that it might support these things.

00:36:47   Whereas if you see HDMI 2.0, you know it absolutely doesn't.

00:36:50   But you have to look at the individual specs

00:36:53   to know which alphabet soup thing

00:36:55   that you're actually interested in.

00:36:56   It's stupid.

00:36:58   - Yeah, well, we'll see what happens.

00:37:00   Talk to me about iCloud shared photo library,

00:37:03   which by the way, due to breaking news,

00:37:05   I finally got the chance to have Erin update her phone

00:37:09   to 16.1 just like an hour and a half ago.

00:37:12   And so as we are recording, I am moving 47,733 items

00:37:16   to the shared photo library,

00:37:17   which, based on my network bandwidth, seems like

00:37:19   it actually is a thing.

00:37:22   Like it is, it almost appears as though things are,

00:37:25   these photos are moving back and forth.

00:37:26   You would think that it would just be like,

00:37:28   oh, take all these on the server side

00:37:30   and glob them into the new shared photo library.

00:37:32   But it does not appear to me that that's what's happening.

00:37:33   I don't know.

00:37:34   - I don't think it's moving your data at all.

00:37:35   Like I said last week,

00:37:36   I think it is purely a metadata based operation.

00:37:38   And I think when you did the move of 40,000 or whatever,

00:37:42   you should have seen within a few seconds all 40,000 disappear.

00:37:45   If you were on the personal library view,

00:37:47   they should have all disappeared from the personal library

00:37:49   view in your local thing.

00:37:50   And then there's the countdown.

00:37:51   Again, scroll down to the bottom for the progress.

00:37:54   But I think it's just a purely metadata move.

00:37:55   I don't think it's moving any of the data at all.

00:37:57   But I don't know.

00:37:57   Someone who works on Apple Photos can come and tell us.

00:38:00   Do you think anybody who works on Apple Photos would actually

00:38:03   be on speaking terms with us?

00:38:04   That is a plausible thing that could happen.

00:38:07   Unrelated to that, here's some news about the shared photo library.

00:38:11   Mikhail Gok told us that in the camera settings on iOS,

00:38:16   there's a toggle to send photos you take

00:38:19   directly to the shared library.

00:38:20   I'd asked about that last week.

00:38:21   What if I just want everything I take,

00:38:22   every photo I take with my phone

00:38:23   to go right into the shared library?

00:38:24   I don't want it to suggest to me

00:38:25   when it thinks they should go there.

00:38:27   I don't want to move them manually.

00:38:29   There is an option for that.

00:38:30   I think during the onboarding process on the phone,

00:38:32   it asks you that and says, "Hey, do you want me to do this?"

00:38:34   But I onboarded on the Mac, and so that's not my phone.

00:38:37   And so either I missed that option

00:38:39   or it didn't even ask or whatever.

00:38:40   But if you want every single photo you take on your phone

00:38:44   to go directly into the shared library, you can turn that on.

00:38:46   It's in the camera settings on your iPhone.

00:38:49   As for my unsaved photos warning,

00:38:51   I'm not sure if Casey saw that when he was trying

00:38:52   to move his 40,000 photos, but if you select a bunch

00:38:54   of photos and try to move them to the shared library,

00:38:56   you might see something that says,

00:38:57   "Hey, you've got unsaved photos.

00:38:59   "Do you wanna save them?"

00:39:00   And I wondered what the heck that was about.

00:39:02   Lots of people gave me the answer

00:39:03   that those are the Shared with You photos.

00:39:05   If you have that feature on, when someone sends you a photo

00:39:09   in messages, like an iMessage.

00:39:11   There's a feature that will surface that photo

00:39:14   in your photos application, and it'll say like,

00:39:16   oh, this was from Casey when you were having a message thread,

00:39:19   and it shows his name and a little message icon

00:39:21   or something or whatever.

00:39:23   Those are the ones that haven't saved.

00:39:24   And what it means by saved is we're showing you this photo

00:39:27   because it's in one of your message threads.

00:39:29   We're showing it to you in the photo grid view

00:39:31   or in the Mac version of Photos app,

00:39:33   but you haven't actually saved this photo

00:39:34   to your photo library.

00:39:37   And by saved, because I did select all

00:39:38   or select a whole bunch,

00:39:40   some of the things I selected were Shared with You photos.

00:39:43   So you have two choices.

00:39:44   You could either save them,

00:39:45   which means basically copy them out of the message thread

00:39:48   and put them into your photo library

00:39:49   where they will live forever.

00:39:50   Or you can go to the, on the Mac anyway,

00:39:52   I think on the iOS one too,

00:39:53   there's like a filters menu option in the upper right.

00:39:57   You can tell it to filter out,

00:39:58   like don't show me the Shared with You photos.

00:40:01   And then it just won't show them.

00:40:02   And then when you select huge swaths

00:40:03   and you don't have to worry about that.

00:40:04   So that's that feature, that's that mystery solved.

00:40:08   Deduping.

00:40:09   So deduping is still definitely a frustrating situation.

00:40:13   And I'm not even, you know,

00:40:14   I'm not even talking about deduping

00:40:17   like with two people's accounts.

00:40:18   Like I said last week, my wife and my account

00:40:20   on the same Mac, I'm talking about deduping

00:40:21   where two people contribute the same photo

00:40:23   to the shared library,

00:40:24   or there's a photo in the shared library

00:40:25   and also in a personal library.

00:40:27   And it's a tricky situation.

00:40:28   So just to give an example, an example that I tested,

00:40:32   if I and my wife both contribute the exact same photo

00:40:35   to the shared photo library,

00:40:36   which might happen if we both got like air dropped

00:40:38   the photo from someone else.

00:40:39   It's literally the same photo down to the byte,

00:40:41   no differences whatsoever.

00:40:43   There'll be two copies in there.

00:40:45   And deduping is a little bit tricky

00:40:47   because if we dedupe that,

00:40:49   I mean, to do it right, what you would have to do

00:40:51   is keep track of the fact that two different people

00:40:55   contribute this photo, because,

00:40:57   and I'm noticing this when I'm hopping back and forth

00:40:58   between accounts, if you're in the shared library,

00:41:01   you can take one of the photos and say,

00:41:02   move this back to my library.

00:41:04   But if you're not the one who contributed

00:41:06   it to the shared library, you can't,

00:41:07   it doesn't say move it back,

00:41:08   'cause you didn't contribute it.

00:41:09   So if two people contributed,

00:41:11   the system has to remember,

00:41:12   hey, both people contributed to this.

00:41:14   And so if you move it back,

00:41:16   A, it has to understand that you're allowed to move it back

00:41:18   because you were one of the two contributors,

00:41:20   or one of the three contributors,

00:41:21   or one of the four people who contributed, or whatever,

00:41:23   it has to understand that you're allowed to do that.

00:41:24   And B, it should probably leave it there

00:41:28   for other people to get back,

00:41:29   because they contributed it, but they didn't pull it back.

00:41:31   So you can pull your copy back.

00:41:35   It's got a lot of weird edge cases.

00:41:36   It's not as simple as just, oh, just delete one of them.

00:41:38   Because if you do that,

00:41:39   then you're left with like the one contributed by one person

00:41:41   but now the other person can't pull it back

00:41:43   even though they think they contributed it.

00:41:44   So it is complicated and confusing,

00:41:47   but I do hope it's on the list.

00:41:48   I don't think I put it in a feedback for that,

00:41:50   but I probably will at some point.

00:41:51   But I hope this is already on the roadmap somewhere

00:41:52   because it would be a useful feature.
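The edge cases being walked through here can be sketched as contributor-aware deduping. This is purely a hypothetical illustration of the bookkeeping, not how Photos actually works; all names are made up:

```python
import hashlib

class SharedLibrary:
    """Toy model: byte-identical photos collapse to one entry, but every
    contributor is remembered so each can later pull their copy back."""

    def __init__(self) -> None:
        # content hash -> set of users who contributed this exact photo
        self.contributors: dict[str, set[str]] = {}

    @staticmethod
    def _key(photo_bytes: bytes) -> str:
        return hashlib.sha256(photo_bytes).hexdigest()

    def contribute(self, user: str, photo_bytes: bytes) -> None:
        self.contributors.setdefault(self._key(photo_bytes), set()).add(user)

    def move_back(self, user: str, photo_bytes: bytes) -> bool:
        """Only a contributor may pull a photo back, and it stays in the
        shared library as long as any other contributor remains."""
        key = self._key(photo_bytes)
        users = self.contributors.get(key)
        if not users or user not in users:
            return False  # you didn't contribute it, so you can't pull it back
        users.remove(user)
        if not users:
            del self.contributors[key]  # last contributor took it back
        return True
```

With that bookkeeping, two people contributing the same bytes yields one visible photo, either of them can move it back without stranding the other, and a non-contributor can't pull it back at all.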

00:41:55   It's surprisingly easy to get dupes.

00:41:57   And the reason it's surprisingly easy to get dupes,

00:41:58   at least for me, is we've had just like

00:42:00   years and years, decades, at least a single decade,

00:42:03   a long time where we've had separate libraries.

00:42:06   And inevitably there are photos that end up

00:42:08   in both of the libraries

00:42:09   because even when one is the real library,

00:42:11   sometimes I wanna have like the good pictures on my phone

00:42:14   so I can make wallpapers, so I can post them to Instagram.

00:42:17   And there are just so many photos that are both on my phone

00:42:20   and also in the quote unquote real library.

00:42:22   And trying to manually sort that out is fraught.

00:42:26   I'd rather just dump everything into the shared photo library

00:42:29   and say, "Now, dedupe," and have it handle that,

00:42:31   but it doesn't do that yet.

00:42:33   and it doesn't have a way for you to ask it to do it;

00:42:34   it just does it on its own sweet time, which is not great.

00:42:37   And finally, people and faces.

00:42:39   So here's how, as far as I can tell,

00:42:41   the people and faces stuff works.

00:42:43   The people and like the names of the people

00:42:46   and all the face data doesn't seem like

00:42:48   it goes with the photo.

00:42:51   So if I contribute a photo and I have all the faces named,

00:42:54   you know, and like identified all the people in them,

00:42:56   that stays, like I don't lose that

00:42:58   by pushing into the shared library.

00:42:59   It's exactly there because I'm contributing the photo.

00:43:01   But other people who see that photo,

00:43:03   they have to then run face recognition

00:43:05   according to their face database,

00:43:07   and they have to, you know,

00:43:08   assign their names or whatever.

00:43:09   And so in theory, if you want to be weird or funny,

00:43:12   you could have totally different names

00:43:13   and totally different face assignments for multiple people

00:43:16   looking at the same photo in the shared library.

00:43:18   But the face data and the name data is not shared.

00:43:21   It is private to each contributor.

00:43:24   There's a privacy angle to that,

00:43:25   or you might not want to know that, you know,

00:43:28   Uncle Bill is named Poopyface

00:43:29   in your library or something.

00:43:32   But for families, it would actually be kind of convenient

00:43:36   if the people data could be shared.

00:43:38   Kind of like the keywords are shared,

00:43:39   'cause that's another privacy thing.

00:43:41   Like I said a couple weeks ago,

00:43:44   when you assign a keyword to a photo and you share it,

00:43:46   other people see that keyword.

00:43:48   So I hope your keywords aren't like, you know,

00:43:50   something you don't want other people to see.

00:43:51   It's just text, you know, it's not a big deal

00:43:53   in terms of the sharing,

00:43:54   but it saves a lot of time and energy

00:43:56   'cause I want the keywords to be shared,

00:43:59   so that I can take advantage of the years

00:44:01   I have spent keyword tagging photos.

00:44:03   - You know, as you were talking

00:44:05   and I was looking at the show notes

00:44:06   and I noticed that I, or I realized

00:44:09   I hadn't looked at the shared library settings

00:44:11   within camera, within settings on my phone.

00:44:14   And it's worth noting there's some clever stuff here.

00:44:16   So, you know, there's a kind of global,

00:44:19   do you wanna share photos directly from your camera?

00:44:21   Yes, no.

00:44:22   Then below that, you have two choices,

00:44:23   share automatically or share manually.

00:44:25   And if I understand this right, basically,

00:44:28   if your phone, or my phone in my case,

00:44:30   sees Aaron's phone nearby, it will automatically,

00:44:33   if I so choose, share photos that I take

00:44:37   while we're near each other to the shared library.

00:44:40   And then additionally,

00:44:41   which I think they had talked about on a keynote somewhere,

00:44:43   but additionally, there's share when at home, yes, no.

00:44:47   And so whenever I'm at home,

00:44:48   irrespective of whether Aaron is physically near me

00:44:50   at the time or not,

00:44:51   it can optionally share directly

00:44:53   to the shared photo library, which I think is very slick.

00:44:55   And we'll see whether or not any of this actually works,

00:44:57   but I'm very impressed that these options are available.

00:45:01   It seems reasonably well thought out.
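
The sharing logic described here can be sketched as a small predicate. This is a hypothetical model of the camera-sharing toggles in Settings; the function and parameter names are invented, and Apple's actual decision logic isn't public:

```python
def should_share_to_library(sharing_enabled, mode, partner_nearby,
                            at_home, share_when_at_home):
    """Hypothetical model of the iOS 16 'share from camera' toggles:
    a global on/off, an automatic/manual mode, nearby-member detection,
    and an optional share-when-at-home override."""
    if not sharing_enabled:
        return False
    if mode == "manual":
        # The user flips a toggle per shot in the Camera app instead.
        return False
    # Automatic mode: share when another library member is nearby,
    # or whenever you're at home, if that extra option is enabled.
    return partner_nearby or (share_when_at_home and at_home)

# Taking photos near another member: shared automatically.
assert should_share_to_library(True, "automatic", True, False, False)
# Home alone with "share when at home" on: still shared.
assert should_share_to_library(True, "automatic", False, True, True)
```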

00:45:03   The one thing that I do think I screwed up though,

00:45:05   is I wanted to capture photos of the four of us

00:45:10   and basically anything since Aaron and I met.

00:45:12   And I think what I accidentally did was an and not an or.

00:45:16   Like it seemed to me like I was doing an or operation.

00:45:19   Like take anything that is, you know, from before

00:45:23   or from the time we met and after, and, or,

00:45:26   I have to choose my words carefully here.

00:45:27   - Did you mean like during the onboarding

00:45:29   when you were at the-- - Correct, sorry, yes.

00:45:30   - prompted you to say like,

00:45:30   hey, what photos do you want me to send over?

00:45:32   Yeah, I didn't understand - Exactly right.

00:45:33   - any of those options, which is why I said,

00:45:35   I'll do this myself later, whatever that option was.

00:45:37   - Yeah, see, that's why I should've learned from your,

00:45:40   I was gonna say mistakes, but not mistakes.

00:45:42   I should've learned from you.

00:45:43   And so I think, what I did was I said,

00:45:45   start from whenever we met and take the four of us.

00:45:48   And it appears to me that what it heard when I said that

00:45:52   is take pictures of the four of us

00:45:54   rather than just glob everything from 2005 onward,

00:45:59   it's actually just, I think it's just taking photos

00:46:03   that it recognizes as either me, Aaron, Declan, or Michaela,

00:46:06   which I'm gonna have to go back and add a whole bunch more,

00:46:08   but what are you gonna do?

00:46:09   Also, I mean, it's still churning, so maybe I'm wrong.

00:46:11   Maybe I'm basing this off of what's in progress,

00:46:14   and once it all settles down, maybe it'll be what I expect.

00:46:17   - You're just reminding me of how much the search

00:46:19   in photos annoys me either because I can't figure it out,

00:46:21   in which case someone, again, from the Apple Photos team,

00:46:23   please tell me how to do this because I want to do it,

00:46:25   but like, photos has had for many years now

00:46:28   a search box in the upper right,

00:46:29   and you can type stuff in there like you can on your phone,

00:46:31   like you can type the word dog

00:46:32   and it will find all the pictures of dogs, right?

00:46:34   That, and it has, it understands a whole bunch of things.

00:46:37   It's very useful, and now it also does text,

00:46:39   so if you take a picture of a street sign,

00:46:40   it'll do OCR on the text, so you can type that text, right?

00:46:42   Powerful search field up there, right?

00:46:44   Also, there is a thing called smart folders

00:46:46   that are in the sidebar, and smart folders

00:46:49   have never heard of that search field in the upper right,

00:46:51   because everything that search field can do,

00:46:53   smart folders are like, I have no idea what that is,

00:46:54   don't even talk to me about it.

00:46:55   So if I wanted to do something like,

00:46:57   show me photos with these people,

00:46:59   show me photos of a dog that were taken

00:47:01   between these years, the smart folder is like,

00:47:02   ha ha ha, I don't know what a photo of a dog is,

00:47:04   what are you even talking about?

00:47:05   Unless you keyword tagged it with dog,

00:47:07   I have no idea about that AI thing.

00:47:08   You can't even, and it doesn't know

00:47:09   about face recognition either.

00:47:10   Smart folders are like, show me photos of these five people,

00:47:13   but also that, like it's just, it's so,

00:47:16   they're so divorced from each other

00:47:18   based on like the year and team that they were created,

00:47:20   and I'm like, look, if you can do these searches,

00:47:21   the smart folders should support everything

00:47:23   the search field can support and vice versa,

00:47:24   but they don't.

00:47:25   There's so many smart folders that I wanted to create

00:47:29   to help me deal with the migration and stuff

00:47:31   that just aren't possible to do.

00:47:32   Or maybe they are, please tell me I'm wrong

00:47:34   and let me know how to do this.
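
The kind of combined query being asked for, spanning attributes the search field understands (ML labels, faces) and ones smart folders understand (dates), is easy to express as a single filter. This is an illustrative sketch, not anything Photos actually exposes:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Photo:
    name: str
    taken: date
    people: set = field(default_factory=set)    # face recognition results
    detected: set = field(default_factory=set)  # ML labels like "dog"

def unified_query(photos, people=(), label=None, start=None, end=None):
    """One query combining criteria that Photos splits between the
    search field and smart folders: people, ML content, date range."""
    hits = []
    for p in photos:
        if any(person not in p.people for person in people):
            continue
        if label is not None and label not in p.detected:
            continue
        if start is not None and p.taken < start:
            continue
        if end is not None and p.taken > end:
            continue
        hits.append(p.name)
    return hits

library = [
    Photo("beach", date(2019, 7, 4), {"Casey", "Aaron"}, {"dog"}),
    Photo("office", date(2012, 3, 1), {"Casey"}, {"desk"}),
]
# "photos of a dog, with this person, taken after this year"
assert unified_query(library, people=["Casey"], label="dog",
                     start=date(2015, 1, 1)) == ["beach"]
```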

00:47:35   - We are brought to you this week by Squarespace,

00:47:40   the all-in-one platform for building your brand

00:47:42   and growing your business online.

00:47:44   Stand out with a beautiful website,

00:47:46   engage with your audience, and sell anything,

00:47:48   your products, your content, even your time.

00:47:51   Squarespace makes it super easy for you to make a website,

00:47:54   especially business websites.

00:47:55   So of course, any site you want to make,

00:47:58   basic hobby content kind of stuff, podcast hosting,

00:48:00   they do all that, whatever you want to make a site for.

00:48:02   But they've really been pushing recently

00:48:04   into lots of great business tools.

00:48:06   So whether you want to sell things like digital products,

00:48:09   physical products, time slots,

00:48:11   if you're like some kind of consultant

00:48:12   or you do online coaching or something like that.

00:48:15   You can even sell member content,

00:48:17   you could have member areas that unlock things

00:48:19   like gated content, videos, online courses, newsletters.

00:48:23   And Squarespace makes it easy for you to run your shop.

00:48:25   So you can use analytics to grow your business,

00:48:28   learn where your site visits and your sales are coming from,

00:48:30   make sure your marketing channels are most effective.

00:48:33   You have all sorts of stats tools and SEO tools

00:48:37   so that you can really be sure that

00:48:39   whatever kind of business you're trying to run on Squarespace

00:48:41   they have the tools that you need and it's so easy to use.

00:48:45   So everything is customizable, as usual with Squarespace.

00:48:49   You can design the whole site yourself.

00:48:50   You can start with their great templates.

00:48:52   You can leave it the way it is.

00:48:53   Or you can throw in your own colors or branding.

00:48:56   Or you can even get in there, really move stuff around,

00:48:58   really customize stuff.

00:48:59   Whatever degree you wanna customize, they offer that.

00:49:02   And all of that's with no coding.

00:49:03   And so it's really super easy.

00:49:05   You don't have to be a web developer to do it.

00:49:07   It's just a great place to run a site, especially a store.

00:49:10   So go to squarespace.com/atp

00:49:13   and you can build your site in free trial mode

00:49:16   without ever giving them a credit card.

00:49:18   When you're ready to launch, use offer code ATP

00:49:20   when you purchase to save 10% off your first purchase

00:49:23   of a website or domain.

00:49:24   Once again, squarespace.com/ATP to start that free trial.

00:49:28   Code ATP at purchase to save 10% off your first purchase

00:49:31   of a website or domain.

00:49:32   Thank you so much to Squarespace for sponsoring our show.

00:49:35   (upbeat music)

00:49:38   - So we had some news that happened this week

00:49:42   and it's one of those times when like the entire internet

00:49:46   points to you and says, did you see this?

00:49:48   Did you see this?

00:49:49   Like there was a,

00:49:50   I probably won't be able to find it for the show notes,

00:49:52   but there was like a satirical thing

00:49:53   where an FFmpeg fan talks about FFmpeg.

00:49:56   It was not directly about me as far as I'm aware,

00:49:58   but like the entire internet sent it to me.

00:50:01   If I remember it, and if I can dig it up,

00:50:03   I'll put it in the show notes.

00:50:04   But in this case, I was not the star.

00:50:06   It was John.

00:50:07   And you were the star because coming out of,

00:50:11   apparently out of the woodwork is rewind.ai.

00:50:14   And rewind.ai is, by another name, a lifestream.

00:50:19   And this is something I've heard you talk about for years.

00:50:22   So John, remind me what a lifestream is and tell me what's going on.

00:50:27   Not to be confused with life day. Um, yeah, I had to dig up where I talked about this.

00:50:31   Most recently back in November of 2021,

00:50:34   I talked about it with Merlin on RecDiffs, uh, Reconcilable Differences.

00:50:37   I keep abbreviating. People don't know what that is. Another podcast I do.

00:50:39   I'll put a link in the show notes, uh,

00:50:41   a timestamp link to 53 minutes and 22 seconds

00:50:44   if you want to hear my most recent conversation about it.

00:50:46   But way before that, back in May of 2014,

00:50:49   I talked about it on this very show, on ATP episode 66,

00:50:52   at 1 hour and 52 seconds into the show.

00:50:55   And it's not a thing that I made up.

00:50:57   As I said in both of the times I talked about it,

00:50:58   it's a thing I saw on TV when I was a kid,

00:51:00   possibly a teenager, but in my advanced age,

00:51:03   it seems like when I was a kid.

00:51:04   And it was like some research thing

00:51:06   that some academic had done and written papers about

00:51:09   or whatever.

00:51:10   I was digging up some more stuff about this.

00:51:11   Someone wrote a paper about implementing this

00:51:13   on the Newton PDA back in the day.

00:51:15   I don't know if anyone's ever implemented it.

00:51:16   A lot of the papers had broken links.

00:51:18   But anyway, as I think I described

00:51:20   in both times I talked about this,

00:51:21   it was back early enough where people were still thinking

00:51:24   about new ways to look at data on computers.

00:51:27   We're all kind of familiar with the files and folder metaphor

00:51:30   of sort of arranging things in a hierarchy

00:51:33   and the desktop metaphor and like, you know,

00:51:35   filing things like that.

00:51:36   We're also kind of aware these days of the search paradigm,

00:51:40   where things, you know, files and objects have attributes,

00:51:44   and you can search based on those attributes,

00:51:45   unless they're Apple photos, in which case,

00:51:47   you can only search on some of them in the sidebar.

00:51:48   But anyway, that's a different way to view your text,

00:51:51   view your content, you could use search,

00:51:52   and then the search doesn't care where the thing is located,

00:51:55   it just cares that they're all pictures of dogs or whatever.

00:51:58   And another way you can think about slicing and dicing data

00:52:01   is purely based on time.

00:52:03   We don't care if it has a picture of a dog in it,

00:52:05   we don't care where it is in the file system,

00:52:07   all we care is when something happened to it.

00:52:09   And so you can imagine every single thing

00:52:12   that you interact with on a computer

00:52:14   being a time-ordered sequence.

00:52:17   I looked at this thing, I clicked this thing,

00:52:19   I made this thing, I wrote this text, I did this,

00:52:21   and it's just a long stream of things.

00:52:23   You can visualize it, I think they even visualized it

00:52:25   on the thing I was watching on PBS,

00:52:26   as just like a, you know, kind of like a rollercoaster ride

00:52:29   where you're flying through a whole bunch of things

00:52:30   and the past is way out there

00:52:32   and the future is in the other direction.

00:52:33   And they called it lifestreams or lifestreaming.

00:52:36   And it was bigger than that.

00:52:37   It was like, what if I wear a camera on my head

00:52:39   and record myself 24 hours a day or whatever,

00:52:40   but just in the realm of computers,

00:52:42   imagine if you could see everything

00:52:46   that you've interacted with on the computer

00:52:48   ordered by time.

00:52:49   And the reason I brought that up probably on both podcasts

00:52:52   is because I always find it frustrating

00:52:54   when I know I saw something somewhere the other day,

00:52:58   but I can't remember where it was.

00:53:00   Was it a tweet?

00:53:01   Was it an email somebody sent me?

00:53:03   Did someone send me a message?

00:53:05   Was it in Slack?

00:53:07   Was it a phone call?

00:53:08   Was it an in-person meeting?

00:53:09   I know this doesn't help with that,

00:53:10   but that is a thing that, again,

00:53:12   at my advanced age, very often,

00:53:14   the lines blur between things that happen on the computer

00:53:16   and things that happen in real life.

00:53:18   And I hate it when I can't, I hate that feeling

00:53:21   where it's like, I know I saw this, I just saw it,

00:53:23   and like, I go through my history and my browsers

00:53:25   and I'm searching the file system

00:53:27   and I'm trying to search Slack

00:53:28   and hoping that like the free Slack, you know,

00:53:30   hasn't scrolled off the end

00:53:31   'cause it doesn't keep everything, you know,

00:53:32   it's just, I hate when I can't find it.

00:53:34   And I always think, if I had lifestreams,

00:53:36   this wouldn't be a problem

00:53:37   because the whole point of lifestreams is just,

00:53:39   it notes everything you do and orders it by time.

00:53:41   And if I can remember it happened sometime yesterday,

00:53:44   then I can find it

00:53:45   because that's the only thing I care about.

00:53:46   I don't care about, did it happen in Slack?

00:53:48   Did it happen in messages?

00:53:49   Was it an email?

00:53:50   I don't care about that.

00:53:51   And I certainly don't care about

00:53:53   where it is in the file system,

00:53:54   where the file was saved, if it is a file or whatever.

00:53:56   I just wanna find the data.
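
The lifestream idea being described boils down to one time-ordered log where the query key is *when* something happened, not which app or folder it lived in. A minimal sketch (all the class, event, and query names here are invented for illustration):

```python
import bisect
from datetime import datetime

class Lifestream:
    """Everything you touched, kept in one time-ordered list.

    Minimal sketch of the lifestreams idea: you search by time range
    first, and the source (Slack, Mail, a file) is just an attribute.
    """
    def __init__(self):
        self._events = []  # (timestamp, source, text), kept sorted

    def record(self, when, source, text):
        bisect.insort(self._events, (when, source, text))

    def around(self, start, end, needle=None):
        """All events in [start, end], optionally matching some text."""
        return [e for e in self._events
                if start <= e[0] <= end
                and (needle is None or needle.lower() in e[2].lower())]

ls = Lifestream()
ls.record(datetime(2022, 11, 8, 9, 0), "slack", "TPS report draft attached")
ls.record(datetime(2022, 11, 8, 14, 0), "mail", "lunch Friday?")

# "I know I saw it sometime yesterday" -- search by time, not by app.
hits = ls.around(datetime(2022, 11, 8), datetime(2022, 11, 9), "tps")
assert [h[1] for h in hits] == ["slack"]
```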

00:53:58   So this startup, rewind.ai,

00:54:01   has basically implemented lifestreams.

00:54:05   When I talked about it on RecDiffs back in November 2021,

00:54:08   I talked about the pitfalls

00:54:11   of potentially implementing this.

00:54:13   And so Rewind.ai has implemented it,

00:54:15   so now we have a concrete implementation to look at.

00:54:17   And what it does is you run it on your Mac,

00:54:19   it's a Mac application,

00:54:20   so cool for people making innovative new Mac applications,

00:54:24   and it tries to implement lifestreams.

00:54:26   And so if you're like,

00:54:27   what was that thing that I saw,

00:54:29   it was something about the TPS report,

00:54:31   I think they used TPS report in their demo video.

00:54:32   - Yeah, they did. - TPS reports,

00:54:34   yesterday where was that?

00:54:36   And you don't remember, and it will find it.

00:54:37   The example they give is like,

00:54:39   what if it was just something that was on somebody else's

00:54:41   screen in a Zoom call, right?

00:54:43   They did screen sharing on a Zoom call

00:54:44   and that's where it was.

00:54:45   You're never gonna find that

00:54:46   'cause it's not even on your disk.

00:54:47   It's not in your browser history.

00:54:49   There's no history in Zoom where you could find

00:54:51   text that was on a document that was being shared over Zoom

00:54:55   for someone who's doing screen sharing, right?

00:54:57   But rewind.ai will find it.

00:54:59   And how does it do this you may ask?

00:55:01   Exactly the way you're thinking.

00:55:03   It records everything that happens on your screen,

00:55:06   and that's where people start to get scared.

00:55:08   But honestly, that's the only way to do it

00:55:10   with current technology is,

00:55:12   what if we just record everything that happens

00:55:14   on your screen, every single window, OCR, all of it,

00:55:17   do speech-to-text on any audio that happens,

00:55:20   and just record it all?

00:55:21   Because that's the only way we're gonna find anything.

00:55:23   So it doesn't really care what it is,

00:55:24   it doesn't need to integrate with your browser,

00:55:26   it doesn't need to do any of that stuff.

00:55:27   It's just basically gonna record everything.

00:55:29   And if you are technically minded,

00:55:31   you're thinking many things.

00:55:32   You're thinking about the privacy implications

00:55:34   and you're thinking about the technical implications.

00:55:35   But if we're going to get into all that

00:55:36   and there's a lot to get into, I just want--

00:55:37   - I think about the legal implications.

00:55:39   - Yeah, yeah.

00:55:40   - Like two-party consent for recording.

00:55:42   - I just want to say that, as always

00:55:44   when I was discussing on rectifs,

00:55:46   that I would love this feature,

00:55:48   but anyone who actually implements it,

00:55:50   it becomes immediately terrifying

00:55:52   and probably also not technically possible.

00:55:53   And I think what I said back on the rectifs,

00:55:55   so that I haven't listened back to the whole thing,

00:55:56   is like the only company that I would even remotely trust

00:55:59   to implement this in some future

00:56:00   where implementing it doesn't bring my system to its knees,

00:56:04   would be a platform owner like Apple that is privacy focused

00:56:07   because anybody else who implements this,

00:56:10   it is so scary that I would,

00:56:15   how could I possibly have trust in them?

00:56:16   And as we'll get to in a little bit,

00:56:19   this is a venture-funded company.

00:56:20   And so I'm like making, like it's like,

00:56:23   do I want a venture-funded company being responsible

00:56:28   for keeping all my data private

00:56:29   as it records everything that happens on my screen?

00:56:32   Probably not, but anyway, that's the product.

00:56:35   I think it actually is cool.

00:56:36   I am certainly a fan of lifestreams.

00:56:38   I think it's technically cool,

00:56:39   but it also scares the heck out of me.

00:56:41   - I'm not sure I would trust even Apple with this,

00:56:43   for a number of reasons.

00:56:45   I mean, number one, I think just the liability

00:56:48   that you're creating by recording everything

00:56:51   that passes through your screen and/or microphone

00:56:54   or whatever else, if somebody had access to that,

00:56:56   even if it's just somebody who sat down at your computer

00:56:59   and you walked away for a minute,

00:57:00   they'd have access to everything

00:57:03   that has shown on your screen.

00:57:04   And granted, this app says they could do things

00:57:06   like exclude your private Safari windows

00:57:09   so when you're looking at stuff

00:57:10   you shouldn't be looking at in polite company.

00:57:14   - Or exclude the Zoom window if your work

00:57:16   doesn't want you to record and stuff like that.

00:57:18   They built the features you would expect

00:57:20   that they would build in, but it's not like,

00:57:23   getting back to the lifestreaming thing

00:57:24   of the guy who has a camera on his head

00:57:26   and records his whole life, it's the same idea.

00:57:28   and this is a topic visited very often in science fiction.

00:57:31   Obviously they're doing it as like an academic,

00:57:33   let's try this thing, but just think about

00:57:34   what it would be like if you recorded

00:57:35   your entire life all day with your AR glasses,

00:57:38   it just recorded everywhere you went.

00:57:41   That is a tremendous, like you said, a tremendous liability.

00:57:44   Nobody wants every moment of their life recorded

00:57:47   and able to be recalled by somebody who's not them.

00:57:49   I don't even want it to be recalled by someone who is me.

00:57:52   I don't want to be able to go back

00:57:53   and see every moment of my past,

00:57:55   but I certainly don't want any arbitrary person

00:57:58   who gets access to my AR goggles recording

00:58:01   to be able to jump back in time and look at what,

00:58:03   that is fodder for dystopian sci-fi entirely.

00:58:07   We're just talking about what happens on a computer,

00:58:09   but that's plenty scary enough.

00:58:11   - Yeah, 'cause your whole life is on your computer,

00:58:13   and the things you do, I mean, there's so much,

00:58:15   there's so much information there

00:58:18   that could be used against you in some way,

00:58:20   whether it's if you happen to flash a view

00:58:23   of your bank account on a browser somewhere,

00:58:25   or just sensitive emails you get,

00:58:27   you know, just sensitive documents you have to deal with,

00:58:30   what you're browsing for legitimate reasons,

00:58:33   like there's just so, there is so much there,

00:58:36   you are creating a massive honeypot reward

00:58:41   for one bad actor in any part of it to be able to access

00:58:46   and do horrendous things with.

00:58:47   And even, again, I wouldn't even trust Apple to do this.

00:58:51   And I trust Apple with a lot.

00:58:52   I trust that even though we will complain about them

00:58:56   when it's warranted, I do trust them a lot

00:59:00   with a lot of data and integrity.

00:59:02   That being said, as they push more into trying

00:59:06   to be an ad company with, I think,

00:59:10   somewhat spotty morals in that department,

00:59:13   I don't know that we could even trust them with that.

00:59:15   'Cause I think what Apple's argument would be is,

00:59:18   well, we will keep it secure,

00:59:20   and we won't sell your data to other companies,

00:59:23   which is what they call tracking,

00:59:25   but Apple itself would probably at some point

00:59:29   use that for advertising targeting purposes.

00:59:31   Again, we're in a different world now.

00:59:34   Now that Apple is not only an ad company

00:59:38   and a rapidly expanding ad company,

00:59:41   but also now that Apple is having some tough quarters

00:59:44   with services growth, they're gonna tighten the screws,

00:59:48   they're gonna keep going.

00:59:49   This past month or quarter, whatever,

00:59:52   we had all of a sudden price hikes for all the services.

00:59:54   That's not a coincidence.

00:59:55   Look at their growth rates, look at their revenue rates.

00:59:58   It's slowing down.

00:59:59   All of a sudden we have lots of ads launching

01:00:01   around the app store.

01:00:02   Again, not a coincidence.

01:00:04   They're gonna keep tightening the screws

01:00:06   because what they call quote services

01:00:09   are all things where they can fairly easily

01:00:12   just like open the spigot a little bit.

01:00:14   Just they can do things here and there

01:00:16   that will degrade things for users or developers

01:00:19   or both or whatever, but we'll give them

01:00:21   a little bit more money in the short term.

01:00:23   This is what companies do, Apple's not immune to it.

01:00:25   They're gonna keep doing this.

01:00:26   And so I can clearly imagine a future,

01:00:29   and I hope this never happens,

01:00:30   but I think it's extremely plausible

01:00:34   where if Apple had this feature of their platforms,

01:00:37   this total recording everything feature,

01:00:39   which frankly I don't think they would ever do

01:00:41   'cause I think it's too problematic

01:00:43   and they would know that,

01:00:44   but if they were to ever launch such a thing,

01:00:47   there is no question in my mind that they would find a way,

01:00:51   once a bad quarter came around,

01:00:52   they would find a way to twist some logic to say,

01:00:55   okay, we're gonna now start using that data

01:00:56   to target ads to you.

01:00:57   And only in our apps though.

01:00:59   We're gonna be super privacy sensitive

01:01:01   and only we can do this creepy thing to your data.

01:01:05   But that's what they would do.

01:01:07   And so again, you look at even Apple,

01:01:10   who we trust so much,

01:01:11   who has a pretty good track record in most of these areas,

01:01:14   I think even they would be a very small step away

01:01:17   from taking advantage of that kind of thing

01:01:18   in a creepy way, and that's Apple.

01:01:20   Imagine any other company with access to that kind of data.

01:01:24   And I mean, especially, I think a VC funded company

01:01:27   is especially scary about that,

01:01:28   because it is really hard to look at that

01:01:31   and not be tempted when times get tough.

01:01:34   So when you're not meeting your numbers,

01:01:36   when you need a little bit more revenue

01:01:38   to hit some goal or to avoid some problem,

01:01:41   it's really, really hard to not tap that resource.

01:01:44   And it's a massive resource that could be used

01:01:46   in immensely creepy and terrible ways.

01:01:49   Not to mention, even going beyond stuff

01:01:52   like legal problems with it, which again,

01:01:54   there are many legal problems with like,

01:01:57   if you are recording things on your computer

01:01:59   without people's consent,

01:02:00   who are on the other side of those things,

01:02:02   that's a bit of a problem in a lot of places.

01:02:04   - I'm not sure if the law has caught up with that,

01:02:06   but certainly NDA law has caught up with that, right?

01:02:08   If you're under NDA, you're not supposed to record

01:02:12   anything that happens in the meetings

01:02:14   that you have at work or whatever.

01:02:16   If people use enterprise software to do their tele-meetings or whatever, they explicitly

01:02:21   disable recording if they don't want you to be able to record it, if that's their corporate

01:02:24   policy.

01:02:25   And of course, if you install a third-party app that also records on the computer you use

01:02:28   for corporate stuff, you are certainly violating your employee agreement or some NDA or whatever,

01:02:32   which is a more well-trodden area of the law than...

01:02:36   The phone system, the law has kind of caught up with that.

01:02:38   So we have all the recording statutes that probably apply to recording someone's

01:02:43   audio, but recording like an image that goes across your screen because someone sent it

01:02:47   to you in like a WhatsApp thing, it's so technically esoteric that I do wonder if the law in various

01:02:53   states would be able to, would be able to grapple with that or if it's just, you know,

01:02:57   something that hasn't ever been tried because this technology is so new.

01:03:01   It's funny also, one of the things that's on the rewind website is, you know, you can

01:03:07   record your meetings and at the bottom of this page it says, oh shoot, where did it

01:03:12   go?

01:03:13   like, "Oh, how does this work?" And then there's like a help page about it. At the bottom of that,

01:03:20   it says, "Make sure to read this article as well, colon, the importance of consent." And they talk

01:03:26   a little bit about it, and they say at the bottom of this article in bold, "We believe all users of

01:03:32   our product should proactively seek consent from everyone they record, even if they are not

01:03:35   legally obligated to do so." And apparently, news to me, because I was looking at this earlier,

01:03:41   There are only 11 American states that are two-party.

01:03:45   Shoot, I lost the list, but somewhere around here.

01:03:47   I believe Massachusetts--

01:03:47   - California's a big one.

01:03:49   - Oh, here it is.

01:03:50   California, Delaware, Florida, Illinois, Maryland,

01:03:51   Massachusetts, Montana, Nevada, New Hampshire,

01:03:53   Pennsylvania, and Washington.

01:03:55   Not New York, apparently.

01:03:56   - Yeah, and again, this is not the first thing

01:03:58   that records your screen.

01:04:00   Screen recorders exist and have existed forever,

01:04:02   and all the legal considerations also apply to that.

01:04:04   And in my time in corporate America in the latter years,

01:04:08   we routinely recorded every meeting.

01:04:10   It's basically like, oh, for people who couldn't show up

01:04:11   to the meeting, make sure we record it,

01:04:13   and it would get shoved into the cloud somewhere,

01:04:15   and some Microsoft thing, and then other people

01:04:16   would be able to view the meeting.

01:04:17   So recording meetings is a thing that really happens.

01:04:20   What didn't happen as often is someone locally recording

01:04:22   using a third-party piece of software that they installed,

01:04:24   like it gets recorded and pushed to the Microsoft OneDrive

01:04:28   through the Microsoft Teams thing or whatever.

01:04:29   That is a feature that most people who work in corporations

01:04:32   want, but they wanna be able to control it.

01:04:34   So it's not like this is letting the recording

01:04:36   or screen-recording genie out of the bottle,

01:04:37   but the idea of it being pervasive,

01:04:39   because to get the value of it,

01:04:40   it kind of has to be pervasive.

01:04:42   And maybe you could just say,

01:04:43   I would just run that on my home computer,

01:04:44   not my work computer, but that also kind of cuts into things

01:04:47   if the thing that was sent to you was like a message,

01:04:49   a non-work related message sent to you in messages,

01:04:52   you're not gonna see it

01:04:53   if you go through your lifestream later at home.

01:04:57   So here's what the CEO had to say about it

01:04:59   in a little Twitter thread

01:05:00   to sort of explain why his product is okay.

01:05:03   Local and private by design.

01:05:06   We record anything you've seen, said, or heard

01:05:08   and make it searchable.

01:05:09   For your privacy, we store all of the recordings

01:05:10   locally on your Mac, only you have access to them.

01:05:12   So as you would imagine, if they're trying to not be

01:05:15   immediately, ridiculously evil,

01:05:16   they do not take these recordings

01:05:18   off your local computer and throw them into the cloud,

01:05:19   it's all just on your Mac.

01:05:21   It's, you know, it's all local, right?

01:05:23   Mind boggling compression.

01:05:25   This gets to the technical feasibility.

01:05:26   How can I record everything that's happening

01:05:28   on my screen all day without filling my disk, right?

01:05:30   Storing all the recordings locally

01:05:31   means compression is very important.

01:05:33   We compress raw data up to 3,750 times.

01:05:37   What an odd number.

01:05:38   Without major loss of quality, for example,

01:05:40   10.5 gigabytes of raw recording data

01:05:42   becomes 2.8 megabytes.
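As an aside, the oddly specific "3,750 times" figure falls straight out of the example they give; here's a quick sanity check using only the numbers quoted above (assuming decimal gigabytes and megabytes):

```python
# Sanity-check Rewind's claimed compression ratio using the figures
# from their own example: 10.5 GB of raw recording data -> 2.8 MB.
raw_bytes = 10.5 * 10**9         # 10.5 gigabytes (decimal units)
compressed_bytes = 2.8 * 10**6   # 2.8 megabytes

ratio = raw_bytes / compressed_bytes
print(round(ratio))  # 3750 -- exactly the "up to 3,750 times" they quote
```

So the number isn't odd at all; it's just their single example divided out.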

01:05:43   And then finally, no cloud integration or IT required.

01:05:46   In order to enable you to search anything you've seen,

01:05:47   we use native macOS APIs with optical character

01:05:50   recognition to recognize and index all the words

01:05:52   that appear on your screen.

01:05:52   So it's all happening locally.

01:05:54   It's all on your device.

01:05:55   And they're able to compress things

01:05:57   so that they get down to size.

01:05:58   And in terms of the-- they also touted their VC funding,

01:06:01   which is why everybody knows about it.

01:06:03   In terms of the business model, here's

01:06:04   something they had to say.

01:06:05   I think this is on their website,

01:06:06   on the "How much does Rewind cost?" page.

01:06:07   Rewind is completely free for now.

01:06:09   Not reassuring, company.

01:06:10   - No.

01:06:11   - Don't put an exclamation on that, not reassuring.

01:06:12   Anyway, we plan to offer a free product indefinitely,

01:06:15   I'll bet, and for users who get a ton of value,

01:06:17   we will charge them a monthly subscription,

01:06:19   AKA freemium.

01:06:20   And then this is, this is not getting better here.

01:06:23   We aren't yet sure what the price will be

01:06:24   or what the features will be in the free product

01:06:26   versus the paid product.

01:06:27   We'll make that decision based on feedback we get.

01:06:29   One thing's for sure,

01:06:30   we will never sell your data or do advertising.

01:06:33   Are you sure about that?

01:06:34   - See, I would like to know what was their pitch

01:06:37   to the VCs, because I bet it contains

01:06:39   much more detail than that.

01:06:41   - I mean, they didn't get a huge amount of VC money,

01:06:43   and the person who, the CEO also had previously,

01:06:46   I think he was from Optimizely, which got--

01:06:48   - I love, hold on, just the ridiculousness

01:06:50   of like $10 million not being a huge,

01:06:52   and you're right, like in terms of,

01:06:54   but it's just, it's ridiculous.

01:06:56   - You know what I mean, but like in the grand scheme,

01:06:57   this is Andreessen Horowitz.

01:07:00   It's a big VC company, and he's had a successful company

01:07:03   that presumably did well for the VCs

01:07:04   when they invested in that company,

01:07:05   and that's why they invested in his new company.

01:07:07   So here's the thing about this feature.

01:07:09   If you had gone back to my childhood

01:07:11   and found some technically minded adults

01:07:13   and said that a company's gonna make a product

01:07:15   that keeps track of our location all day, every day,

01:07:18   and makes that information accessible

01:07:20   to other people potentially,

01:07:22   they would say, "What, someone's gonna spy on me

01:07:24   "and know where I am all the time?

01:07:25   "My device is gonna track my location on the Earth

01:07:28   "24 hours a day?

01:07:29   "This is a dystopian nightmare."

01:07:30   And now we all have phones and we don't care about that.

01:07:32   It's like, well, it's just location data

01:07:33   and we trust Apple not to share with anybody.

01:07:35   And yeah, there was that thing where Google accidentally

01:07:37   let people get at everyone's location and see,

01:07:39   you know, where you go during the day.

01:07:41   Like there is kind of

01:07:43   an acclimation to technology tracking things

01:07:47   that previously seemed like the surveillance state.

01:07:50   So, I mean, how many people use their phones

01:07:52   and turn location off so it's never tracked?

01:07:53   None of us do that because there's just so much utility

01:07:56   for a device that knows where we are.

01:07:58   Whether it's find my friends to find out

01:08:00   where people in your family are,

01:08:01   or just being able to set a reminder

01:08:02   so it knows when you get back home,

01:08:04   or maps features, or keeping track of your runs,

01:08:08   or how many steps you've taken and where you walked.

01:08:10   We have pretty much all accepted

01:08:12   and found a vendor that we can trust well enough

01:08:16   that we're okay essentially carrying a tracker around.

01:08:19   We're carrying a tracker around with us all day.

01:08:21   It's always keeping track of where we are

01:08:22   and it's recording it.

01:08:23   And we're like, yeah, we've worked out the kinks in that.

01:08:26   People are mostly okay with it.

01:08:28   But if you had dumped in somebody from the '70s

01:08:31   and described that, it'd be like, no, I don't want this tracker.

01:08:34   Yes, it's an amazing peak of technology,

01:08:36   but I don't want to be tracked.

01:08:37   And oh, you say it's only on the device,

01:08:39   or it's shared in this cloud.

01:08:40   You describe it, you say it's safe,

01:08:41   and it can't-- like, you have to do so much reassurance

01:08:44   and explaining of the cultural context for that to be OK.

01:08:47   Part of the reason that happened is

01:08:49   because we passed the technical threshold where doing that

01:08:51   is not prohibitive.

01:08:53   I'm not entirely sure that we have passed the threshold

01:08:56   where doing full-screen recording or per-window

01:09:00   recording and OCR is not oppressive, because especially if you're on a laptop,

01:09:03   it's gonna hurt your battery. Like, there's a page on this site that says,

01:09:07   look, we're using the imaging powers of the M1 SoC to do all this compression,

01:09:11   and I bet they are, but every one of those little things takes power,

01:09:16   you know, especially on battery-powered devices like laptops. But even on a

01:09:19   desktop, you're burning CPU cycles doing this thing that presumably will have

01:09:24   value later, but you're burning it all the time. Like, I mean, again, to get value

01:09:29   out of it, you can't just, like, run it for one hour a day.

01:09:31   You have to run it all the time.

01:09:33   So I do feel like the utility of this,

01:09:36   if this could exist immediately and magically work

01:09:40   and like be seamless and not impact your life

01:09:43   and not impact battery life,

01:09:45   I think the utility of it actually is tremendous,

01:09:49   but like getting over the hump from where we are now

01:09:54   where it's not and people are scared of it

01:09:56   to the other side where it's just as scary

01:09:59   in the wrong hands,

01:10:01   but people are now used to the utility of it,

01:10:03   is kind of inevitable, as with everything.

01:10:07   I don't know if this is gonna happen,

01:10:08   but I've talked about this before,

01:10:09   of like technology catching up with our perceptions, right?

01:10:12   So audio, we've got the technology

01:10:14   to make audio good enough for our human ears.

01:10:16   Like we can do it.

01:10:17   Like even if you're a crazy audiophile person

01:10:19   who wants 192 kilohertz, whatever, 48 bit,

01:10:22   like we can do that, it's no problem.

01:10:24   Audio does not need to get better for humans

01:10:26   until and unless we evolve better hearing,

01:10:28   and that's gonna take a long time, right?

01:10:30   We haven't caught up with video

01:10:31   'cause video still is pixelated and it's not 3D

01:10:34   like our eyes see and you know, blah, blah, blah.

01:10:36   But if Moore's law continues long enough,

01:10:39   and eventually Moore's law does stop

01:10:41   'cause you can't make things smaller forever,

01:10:42   see quantum physics, right?

01:10:44   If Moore's law continues long enough

01:10:46   that we max out on audio, vision, smell, touch,

01:10:51   like all our senses basically,

01:10:53   and we still have some more like doublings

01:10:57   of transistor density to go,

01:10:59   we're gonna get lifestreams whether we like it or not

01:11:01   because it'll basically be free

01:11:02   and to not do it would seem wasteful, right?

01:11:06   And hopefully we will have the security things

01:11:07   worked out by then, and hopefully we all didn't die

01:11:09   in the water wars. Like, there are many caveats

01:11:11   to what I'm describing here in this particular

01:11:13   timescale argument.

01:11:14   But the utility of this is tremendous

01:11:17   because if I said, look,

01:11:19   we're not gonna do any privacy things

01:11:20   but now a genie has granted you a wish

01:11:22   and you can go back to any moment and find anything

01:11:26   and where was that thing?

01:11:27   Like you have that magical ability, it's not technology,

01:11:28   it's literally magic.

01:11:30   People would take that in a second.

01:11:31   It's a superpower.

01:11:32   It's the kind of superpower that computers give us.

01:11:35   Things that we can't do easily as humans,

01:11:37   like remembering things or erasing things perfectly

01:11:39   or whatever, computers can do.

01:11:42   They give us the power to do that.

01:11:44   This would be a superpower that computers could give us

01:11:47   if it could be harnessed in a way that we find acceptable

01:11:49   and doesn't totally destroy our lives.

01:11:51   Is 2022 the time and the place for that to happen on Mac OS

01:11:56   from a VC funded company?

01:11:59   Probably not, but I remain fascinated by the idea

01:12:03   and I think the utility of the idea is unavoidable.

01:12:08   Eventually, if we have excess computing capacity,

01:12:11   somebody's gonna do it and some generation of people

01:12:13   are gonna grow up and they're gonna think of it

01:12:14   the way we think about our phone's GPS tracking

01:12:17   or the way we think about Google searches.

01:12:18   Just as part of life, that it would be barbaric

01:12:21   to live without, because what the heck is the point of computers if not to do this thing?

01:12:24   It was interesting watching the intro video as well, which is literally a minute, I think,

01:12:29   to the second. And the CEO, Dan Siroker, I probably pronounced that wrong, but anyways,

01:12:34   he said that he went deaf or, you know, lost a lot of hearing in his 30s and then was able to get it

01:12:41   back by way of a hearing aid. And then he thought to himself, allegedly, you know, well, what else

01:12:45   could we apply this principle to? Like, how else can we like supercharge a human being? And, and

01:12:50   And he eventually landed on, well, what if you had perfect memory?

01:12:53   And that's kind of what this rewind thing is trying to do.

01:12:57   I don't know, I have very mixed feelings about it.

01:13:00   I like the fact that somebody is doing something innovative on Mac OS, because that seems to

01:13:05   be happening so rarely these days.

01:13:08   It is funny, like you had said, John, you know, unleashes Apple Silicon.

01:13:12   Rewind utilizes virtually every part of Apple's silicon system on a chip.

01:13:16   Rewind doesn't tax system resources like CPU and memory while recording, allegedly.

01:13:20   It uses every part of the SoC but doesn't tax the CPU. You know, the CPU is part of the SoC.

01:13:26   And either way, when you're using parts that don't count as the CPU, "I'm not using the CPU,

01:13:30   I'm using the H.265 encoder," that still uses electricity. Like, I know it uses less than doing

01:13:37   it on the CPU, but there's no avoiding it: there is technical overhead for doing this.

01:13:42   There's technical overhead for the compression, there's technical overhead for the I/O for

01:13:45   the storage, and there's overhead of actually doing the screen recording.

01:13:49   It's not huge overhead, and I think our machines can mostly handle it, but it is going to impact

01:13:53   your battery life some amount.

01:13:54   I don't think they throw out any figures of like, "1% worse battery life," or "4% we don't

01:13:59   know what it is," but it's not zero.

01:14:01   Yeah, but I don't know.

01:14:03   As someone who genuinely has a pretty crummy memory, I find this very fascinating.

01:14:10   It is not often that I think to myself, "Oh, where did I see that thing?"

01:14:15   But it definitely happens.

01:14:16   And in a situation where I felt like I could trust

01:14:19   whatever vendor is providing this service,

01:14:23   I would probably be interested in it.

01:14:25   But I echo what both of you, particularly Marco,

01:14:27   were saying about the privacy implications,

01:14:30   about the consent implications.

01:14:33   Like, yeah, I guess every time I speak to anyone

01:14:35   on my computer, I can ask them if I have their consent

01:14:38   to record them.

01:14:39   - Or you could turn it off, like, with a big switch

01:14:41   in the menu bar.

01:14:42   Like if, you know, again, it cuts into the value

01:14:44   if you turn it off, but it's a thing that you could do.

01:14:46   - But nevertheless, like,

01:14:47   I think this is a very fascinating idea.

01:14:49   And the fact that this group of people seems to claim

01:14:53   that they can realize this idea, that's super cool.

01:14:57   Now I'm skeptical that that's reality,

01:14:59   but I dig the demo, if nothing else.

01:15:02   Now maybe it's all smoke and mirrors,

01:15:03   but it looked really slick.

01:15:05   So I don't know, I would definitely be interested in this.

01:15:08   I put my name or my email in the list

01:15:11   just because I'd like to toy with it for a minute

01:15:12   and see what it does.

01:15:14   But a lot of the incentives that I see,

01:15:18   namely the fact that it's VC funded,

01:15:21   and they don't really seem to know

01:15:22   how they're gonna make money,

01:15:23   that does not align well with the incentives

01:15:26   I want them to have,

01:15:27   which is privacy beyond anything else.

01:15:30   Efficiency is a second tier, right behind privacy.

01:15:34   I don't know, I'm very skeptical.

01:15:36   Quick aside, by the way,

01:15:37   I just realized a few minutes ago

01:15:40   where A16Z came from. Did you know this? I think I figured it out.

01:15:43   And I bet, like, everybody knows this but me; it was new to me.

01:15:46   Did you know that the Beatles is a music pun?

01:15:49   I did actually know that. Um, but I had no idea it was, like,

01:15:53   16 characters between the A and the Z. Anyways. Um, I don't know.

01:15:56   Would you use this? Let me start with Marco. Would you use both,

01:15:59   both Rewind specifically, and a fantasy, let's call it Apple, version?

01:16:04   Yes, I agree with all you were saying about Apple and services, but you know,

01:16:08   would you use an Apple-provided version of Rewind?

01:16:11   Would you use the Rewind-provided version of Rewind?

01:16:13   Let me start with Marco and then John,

01:16:14   I'd like to hear your two cents.

01:16:16   - I just, I don't see this ever being a thing

01:16:20   that is worth the risk.

01:16:23   It's almost like, imagine if nuclear power

01:16:26   was far less safe than it really is.

01:16:29   And so imagine, so it'd be like, yeah,

01:16:32   there's value in generating this power,

01:16:33   but every couple of years it's gonna have a giant meltdown.

01:16:36   And I was like, well, maybe that's not worth it.

01:16:38   And I know in reality this is a bad metaphor

01:16:40   because it's way safer, but anyway.

01:16:42   I think you look at this product and like,

01:16:47   first of all, even if we ignore the VC part of it,

01:16:51   just conceptually a product like this,

01:16:54   well, the very first thing it's gonna have to do

01:16:57   is add multi-device sync,

01:16:59   because that's the very first feature request

01:17:01   people are gonna have, because they're gonna go,

01:17:02   oh, I was doing this thing,

01:17:05   But I forget whether it was on my desktop or my laptop

01:17:08   or my phone or my tablet.

01:17:10   So you're gonna have, and I guess you can't really do

01:17:12   any of this on iOS, but who knows how they would

01:17:15   get around that, but anyway.

01:17:17   So like, you can start to see, all right,

01:17:19   the very first thing people are gonna want

01:17:20   is something that's gonna require them to have this data

01:17:23   synced somehow between devices and therefore

01:17:26   probably stored on some kind of server, maybe encrypted.

01:17:30   So there's already like a lot of red flags even there.

01:17:33   But again, even if you just stick with

01:17:35   what they've announced and you assume

01:17:37   they will never need money,

01:17:38   and so all that's off the table,

01:17:40   all those concerns are off the table,

01:17:42   and again, even heck, even assume,

01:17:45   not only Apple did it, but Apple of 15 years ago did it,

01:17:49   before they really got into the ad business.

01:17:51   Even with all those assumptions,

01:17:53   I think this is creating a massive liability,

01:17:57   and the liability is greater

01:18:00   than the value that it creates.

01:18:02   And I cannot see this going well long-term for the people who

01:18:07   use it.

01:18:08   John?

01:18:09   I also signed up for the thing.

01:18:10   I'll definitely try it out of technical curiosity.

01:18:13   When I talked about the way Apple might implement this,

01:18:15   because they're the platform owner,

01:18:16   they have more invasive access to apps running on the system.

01:18:19   And also, they can do a thing that third parties can't,

01:18:22   which is they can vend APIs that third parties can then

01:18:24   adopt to sort of opt into this.

01:18:26   So it can be both more efficient and more privacy preserving

01:18:29   and with sort of like an impartial referee in the middle

01:18:32   being Apple the platform owner,

01:18:34   kind of like how Location Services works.

01:18:36   Like it's an Apple vended API that they control

01:18:38   and Apple is able to control the privacy of it

01:18:41   to some degree through both the App Store

01:18:43   and the way the APIs work.

01:18:44   And same thing with sandboxing and a lot of things like that.

01:18:47   That's why I envisioned Apple

01:18:50   being able to pull this off in a more secure way.

01:18:51   I still think the technical overhead of it

01:18:53   is probably not worth the trade off,

01:18:55   but like I was saying before

01:18:57   with the sort of long-term thinking about this

01:19:00   and how we think about GPS.

01:19:02   I mean, there are tons of technologies

01:19:05   that have terrible privacy implications

01:19:10   that we don't like as tech nerds who are sensitive to this,

01:19:13   but the world disagrees, like Facebook, for example, right?

01:19:17   Or so many things having to do with the web.

01:19:20   - Or even like workplace man-in-the-middle security products.

01:19:23   - Yeah, I mean, the workplace ones, I think, are

01:19:27   kind of more justifiable, because at least you know what you're getting into there, and it's your employer, and you

01:19:31   could separate your personal stuff or whatever. Things like Facebook

01:19:33   were like, oh, just give them access to your whole life and they will sell ads against it. And, like,

01:19:37   you know, people have different thresholds for how much they care about this, and it seems like there are a lot of things

01:19:44   that may be terrible for privacy

01:19:46   but also clear pretty much everybody's threshold, right? And

01:19:49   a product like this is probably only one generation away, setting aside the technical hurdles, from clearing

01:19:56   enough people's threshold that it doesn't matter.

01:20:00   Who cares if everybody listening to ATP

01:20:02   is horrified by it, right?

01:20:03   The masses writ large make different trade-offs;

01:20:07   they assign different weights

01:20:09   and different values to these different things.

01:20:11   When they weigh X against Y, they're like,

01:20:13   "Oh, Facebook, I see cool pictures of my friends

01:20:14   "and have fun,"

01:20:15   versus, you know, privacy-destroying whatever?

01:20:17   I don't care about that, it's weird, esoteric,

01:20:19   I'm not a tech nerd, who cares, right?

01:20:21   I'm not saying they're right, obviously we disagree

01:20:23   with that, but if you think like,

01:20:25   I don't see this ever catching on

01:20:26   'cause I find it distasteful.

01:20:28   I don't think that's, you know,

01:20:31   my pessimistic view is that's not gonna save us.

01:20:34   In terms of like me trying this for real,

01:20:39   like, oh, will you run this on your computer all day?

01:20:42   Right now, I have to say no.

01:20:43   Like I'm interested in seeing it run.

01:20:45   I wanna see how it works,

01:20:46   but my personal trade-off is as annoyed as I am

01:20:50   when I can't find something,

01:20:51   I'm not annoyed enough to give this.

01:20:54   But someday I'm going to get old and die,

01:20:56   and then the next generation of people

01:20:58   may not have the same foibles that I have.

01:21:01   And the thing I mentioned,

01:21:02   dystopian sci-fi keeps reusing this issue a lot.

01:21:05   Someone in the chat room pointed out

01:21:06   that there was an episode of Black Mirror

01:21:08   that touches on this exactly.

01:21:10   I've seen every episode of Black Mirror,

01:21:11   and so I vaguely remembered it,

01:21:13   but they looked it up for me.

01:21:13   It's "The Entire History of You."

01:21:14   It's more about, hey, if you have the ability

01:21:16   to go back and look at any of your memories

01:21:18   'cause your whole life is recorded,

01:21:19   maybe you'll obsessively go back and look at a memory

01:21:21   that's like, you know, that you'll sort of like spiral

01:21:25   on that one memory and keep obsessing over it

01:21:26   and looking at every corner or whatever.

01:21:28   It's kind of like a romance and someone's like, you know,

01:21:30   I think it was like someone was cheating on someone else

01:21:32   and someone's rewinding the memory and looking at it.

01:21:33   - This would be my personal hell.

01:21:35   Like I'd be so tempted to like go back to like middle school

01:21:37   and then I would just torture myself

01:21:39   with like all the dumb crap I did and said and acted like,

01:21:42   oh God.

01:21:43   - Yeah, I mean, like again, Black Mirror is not a,

01:21:46   you know, it's not a happy show.

01:21:49   Although San Junipero is the best episode

01:21:51   of the entire series, and it stands alone.

01:21:53   It is not a series with continuity.

01:21:54   So these are, for the most part,

01:21:57   independent episodes.

01:21:57   So just watch San Junipero, just find that one episode.

01:22:00   I forgot which season and what episode it's from.

01:22:02   It's S-A-N space, J-U-N-I-P-E-R-O.

01:22:05   Put a link in the show notes.

01:22:06   That episode is worth watching

01:22:08   and is one of the best episodes of television ever.

01:22:09   And it is totally standalone.

01:22:11   You need no continuity,

01:22:12   but every other episode of Black Mirror is so grim.

01:22:15   And "The Entire History of You" is also grim.

01:22:17   And I'm sure there have been decades and decades of

01:22:20   sci-fi novels that have examined what it would be like if every moment of our lives was recorded

01:22:25   by us, by a totalitarian state that's ruling over us, by aliens. There are so many things in there

01:22:31   that are scary to us, which is why companies like this are trying to be careful with how

01:22:34   they introduce this technology. But I pretty much do feel that something like this is unavoidable,

01:22:40   as long as Moore's Law continues long enough to get us into a place

01:22:46   where this can be implemented in such a way that the technical trade-off and the battery

01:22:51   life trade-off or whatever are rendered moot. And then it's just a matter of who's bothered by it.

01:22:55   We'll all be gone, so we don't have to worry about it, and I think the generation of people who live

01:22:59   there will probably go, that's useful, I'll do that. And they'll care about privacy; they won't want it to

01:23:03   be super bad. And, you know, if we've done well as a human race, we will have implemented laws that

01:23:09   protect our privacy much more than we have today, but I'm not feeling particularly optimistic

01:23:14   about that at this moment in 2022.

01:23:15   So check again in a hundred years.

01:23:18   - I forgot to mention

01:23:19   the "How does the recording of meetings work?" section

01:23:23   on the Rewind website.

01:23:25   Where's the data sent?

01:23:27   Video data is never sent off your Mac.

01:23:29   The only data sent to a cloud service is the audio.

01:23:32   We send it to a cloud transcription service

01:23:34   in order to generate a transcript of what was said.

01:23:36   This transcript is created so you can read it

01:23:38   and search for specific words that were said.

01:23:41   Yikes.

01:23:42   - Nope, nope, nope.

01:23:43   - It's on device too, but maybe that's the one

01:23:45   that kills the battery, you know,

01:23:46   like maybe that eats up the CPU,

01:23:48   maybe the speech-to-text is not as good on Apple's system.

01:23:51   - It's not.

01:23:52   - As Jason H says in the chat room,

01:23:54   every episode of Black Mirror ends up

01:23:56   on a tech company's roadmap.

01:23:57   (laughing)

01:23:59   - Oh no, this, 'cause that's the thing,

01:24:01   like, it's so easy to see like, oh well,

01:24:05   we're gonna poke this hole in our security model

01:24:07   that we previously established or that you thought you had

01:24:10   in order to provide this great value.

01:24:12   Now there's one hole already.

01:24:13   Oh, we gotta send your audio off to a cloud thing.

01:24:16   Okay, well, what's next?

01:24:18   It's so easy to make a couple little exceptions.

01:24:21   Well, we have to make this exception here

01:24:23   to provide value to you and/or us.

01:24:25   But that becomes so easy to do.

01:24:28   Once there's any holes, oh, we'll just add a little more

01:24:32   here, just a little more.

01:24:33   Oh, one more little exception here.

01:24:35   And again, the liability of this is so high,

01:24:39   it is not worth even creating.

01:24:42   I do think it's a very interesting technical challenge with what appears to be a very interesting

01:24:48   solution, but yeah, the more I think about it—and I was already getting the heebie-jeebies

01:24:53   to begin with, but the more I think about it, the heebie-er my jeebies get, if you will.

01:24:58   It is creepy.

01:24:59   You know what else is creepy?

01:25:02   Elon Musk officially owns Twitter now.

01:25:04   Yay.

01:25:05   I don't even know what to say about this.

01:25:09   People are acting like this is some kind of massive change and huge turn for the worse.

01:25:15   Now I think this is not good in absolute terms.

01:25:20   However, in relative terms to the way Twitter has always been led, I don't know that it's

01:25:27   that much worse.

01:25:29   Twitter has always been terribly led by horrible people.

01:25:33   It's never been well moderated.

01:25:36   It's been terribly led, but has it been terribly led by people who have ideas that are as wrong

01:25:42   as his are?

01:25:43   Yes!

01:25:44   Yes, it has!

01:25:45   I don't think so.

01:25:50   The reason we all think Twitter has been mismanaged is because the crew that was running Twitter

01:25:54   for all these years has just seemed unable to do anything.

01:25:57   So it was like status quo was the safe move.

01:26:00   Because like, "Hey, we've got this thing.

01:26:03   It's kicking off where it's becoming super popular."

01:26:06   becoming part of the culture, it's blah, blah, blah.

01:26:08   Let's just not rock the boat.

01:26:09   So they did nothing for Twitter, or to Twitter, for so long.

01:26:12   They didn't really figure out how to make a lot of money.

01:26:13   They made money, but they didn't make a lot of money.

01:26:16   They could have made potentially more money.

01:26:18   They didn't add features, that's for sure.

01:26:20   They screwed third party developers

01:26:21   and then kinda like backed off a little bit on that.

01:26:23   Like they didn't do a great job,

01:26:25   but it was mostly inaction.

01:26:26   It was people who didn't have any good ideas.

01:26:29   The best idea they had was,

01:26:30   it seemed to me that they understood

01:26:32   the limits of their competence.

01:26:34   They were like, we don't know what to do.

01:26:35   So let's just not do anything,

01:26:37   which is not good leadership, to be clear,

01:26:39   but it is different than, I know what we should do,

01:26:42   we should do bad things.

01:26:44   Now, Elon Musk, to be fair, says lots of crap.

01:26:46   You have no idea what he's gonna do,

01:26:47   it's just kind of like it's hard to talk about

01:26:49   his owning Twitter until he actually does anything.

01:26:51   You can't really pay attention to what he says,

01:26:53   he says a lot of things, right?

01:26:55   But a lot of the things he says make the world think,

01:26:59   I don't like those ideas, and if you were to do them,

01:27:02   I would be sad.

01:27:03   And that's different than the people

01:27:04   who were running Twitter before.

01:27:05   The people who were running Twitter before

01:27:06   weren't constantly saying,

01:27:07   "We wanna let more Nazis back."

01:27:10   They weren't saying that.

01:27:11   What they were saying is,

01:27:11   we're trying to get rid of them,

01:27:12   but it's hard and we're incompetent.

01:27:14   Which is not, again, not good leadership.

01:27:17   The way they handled moderation was not good,

01:27:21   but in general, the noises they made was,

01:27:25   "We agree that this should be better than it is,

01:27:27   "and we're trying to make it better,

01:27:28   "and we're failing at making it better."

01:27:31   That's very different than saying,

01:27:33   "Actually, I think we should make it worse."

01:27:35   And that's what a lot of people hear

01:27:36   when they hear Elon Musk speak,

01:27:38   'cause he has ideas that we say,

01:27:39   "No, that would make everything worse."

01:27:40   And he's like, "Yeah, isn't that great?"

01:27:42   And that's why Elon Musk potentially is way, way worse

01:27:45   than the incredibly incompetent people

01:27:47   who were running Twitter before.

01:27:48   - See, I think people are forgetting some of the details

01:27:51   of how crappily Twitter has been run.

01:27:54   And I'm not just talking about product direction.

01:27:58   We can say things like,

01:27:59   "Oh, they should have had edit tweet," or whatever.

01:28:02   That's one area, and honestly, it's hard to imagine

01:28:05   anybody doing worse than what Twitter has been doing

01:28:07   to date in that area, but you know, I think it's--

01:28:09   - Don't forget killing Vine and stuff like that.

01:28:11   They just generally didn't know what to do.

01:28:12   - But I think it's important to remember

01:28:15   Twitter's past leadership was extremely libertarian,

01:28:20   extremely hands-off with moderation,

01:28:23   and let a whole bunch of hate and Nazis

01:28:27   and horrible stuff on the platform indefinitely

01:28:29   with poor controls and poor enforcement.

01:28:32   - But which direction did they go in?

01:28:33   They went in the direction of improving that

01:28:35   incredibly slowly and not to the level

01:28:38   that we find acceptable,

01:28:39   but it didn't go in the other direction.

01:28:41   There was no sort of consensus with the leadership

01:28:43   that in fact, you know how bad it is now?

01:28:45   We should make it worse.

01:28:46   Instead they said, okay, we kind of grudgingly agreed

01:28:49   that maybe we should make it better,

01:28:50   and they acted too slowly and did a bad job.

01:28:52   But directionally, they were not heading

01:28:54   in the wrong direction, they were heading

01:28:55   in the right direction from a place of terribleness, too slowly.

01:28:58   - Mm, I think they were so far from right before,

01:29:03   and I don't mean right as a conservative,

01:29:04   I mean right as incorrect,

01:29:05   they were so far from good before

01:29:09   that I don't even know if we can judge their micro moves

01:29:12   as being even in the right direction.

01:29:14   - They did move in the right direction

01:27:16   and did make important improvements,

01:29:18   especially within the last few years.

01:29:20   - Here's the thing, everyone points to like,

01:29:24   oh look, Twitter finally kicked off Donald Trump.

01:29:28   You know when Twitter kicked off Donald Trump,

01:29:29   the second his term was clearly done,

01:29:33   like the second he had no more use to them,

01:29:36   they kicked him off.

01:29:36   They kept him on the entire four years before that

01:29:40   and didn't enforce a goddamn thing against him.

01:29:42   When he broke all the rules,

01:29:44   he was himself directly abusive and breaking laws constantly

01:29:48   and they did nothing against him

01:29:49   because he was too convenient to have on the platform

01:29:51   and it would have been too politically bad

01:29:53   for, in their mind, to kick him off.

01:29:55   So they kept him on.

01:29:56   Stupid weasels kept him on the entire term and let him do all the damage he

01:30:00   did, and they only kicked him off, like, after January 6th, when he

01:30:05   was politically tanked and when his term was effectively over. That's when they

01:30:09   kicked him off. They wrung every single bit of value out of having him on that

01:30:13   platform, and then they look like heroes kicking him off. So forgive me if you

01:30:16   think, like, you know, if Elon flirted with the idea of maybe letting him back on, that

01:30:20   doesn't make him any worse than the previous leadership. That makes them

01:30:23   both turds, to be clear, but it's no worse than them.

01:30:27   - They did kick him off and he would be reversing that,

01:30:29   so that's worse.

01:30:30   Like obviously not kicking him off for so long is terrible

01:30:32   and you could, you know, like they waited too long

01:30:34   and they did a bad job or whatever,

01:30:36   but reversing it is worse.

01:30:38   Like, I mean, it's comparing two bad things to be clear.

01:30:40   I don't disagree with anything you said about what they did,

01:30:44   but Twitter is more than just him.

01:30:45   Twitter is how they handled all the moderation and stuff

01:30:47   and there were people in Twitter trying to move things

01:30:51   in a good direction.

01:30:52   And part of what enabled people to try to move things

01:30:56   in a good direction on Twitter

01:30:57   was the ineffectiveness of leadership.

01:31:00   That they were able to try to do good within the org,

01:31:02   despite the fact that the leadership may have disagreed

01:31:04   with the good they were trying to do.

01:31:05   You know what I mean?

01:31:07   Like that they weren't paying attention,

01:31:08   and while they weren't paying attention,

01:31:09   the head of the Trust and Safety Department

01:31:11   was able to try to do good things

01:31:12   and hire some good people briefly, right?

01:31:15   Again, you can't put this at the feet of Elon,

01:31:17   'cause what has he done so far?

01:31:18   Not much, right?

01:31:19   He's fired a bunch of people

01:31:20   that people think he hasn't fired, so.

01:31:22   - Yeah, and that's the thing.

01:31:23   I think the reason why I am slightly optimistic about this

01:31:28   is that, is not necessarily because I hugely believe in him.

01:31:32   It's more that I just had so little faith

01:31:35   in the previous leadership.

01:31:36   And so I think he is, he's showing signs in both directions.

01:31:41   He's showing signs that seem like things

01:31:43   are gonna be terrible, and signs that show

01:31:46   that maybe he's gonna make some refreshing changes.

01:31:48   I don't know yet.

01:31:49   He, you know, as John said a few minutes ago, like, Elon floats a lot

01:31:54   of ideas in public and he's a total troll and he does a lot of it for attention or for

01:32:01   laughs. Sometimes he's actually legitimately floating an idea he thinks actually might

01:32:04   work. You know, he's not like good socially. He's not a serious person. No, he's not a

01:32:10   serious person. He also, you know, he has obviously pretty poor social skills, you know,

01:32:18   possibly for reasons that I can't diagnose

01:32:20   'cause I'm not a professional,

01:32:21   but there might be something there.

01:32:23   I think he clearly,

01:32:25   you can't take everything he says at face value,

01:32:29   but you also can't rule him out

01:32:33   as being totally incompetent

01:32:35   because of all the crap he says,

01:32:39   some of it he actually does and it turns out pretty good.

01:32:41   Not all of it, but some of it.

01:32:43   And so I think it's important when dealing with him,

01:32:46   I think it's important to try to just not feed

01:32:51   into his trolling with your attention

01:32:54   and just focus on the results.

01:32:56   Is he gonna actually do good stuff?

01:32:58   And the answer to that is we don't know yet.

01:33:00   It's too soon, but Twitter's previous leadership was awful

01:33:05   and so I think the bar is low for him to be

01:33:08   at least no worse than them.

01:33:10   - Well, the thing that's attractive

01:33:14   in any sort of situation where you want a strong man

01:33:16   to come in and wipe the slate clean or whatever.

01:33:19   It's like, you want decisive action, right?

01:33:20   I'm so sick of these people doing nothing

01:33:22   or saying they're gonna do something

01:33:25   and never doing anything,

01:33:26   or saying they're moving in the right direction

01:33:28   but they move so slow, it doesn't even matter.

01:33:29   I want decisive action.

01:33:31   And at a certain point,

01:33:32   dissatisfaction with the status quo becomes so pervasive

01:33:37   that decisive action is the most important thing.

01:33:39   Doesn't even matter what that action is,

01:33:41   I just want decisive action.

01:33:42   And very often, the biggest problem big corporations have

01:33:46   is there's no one there who's empowered

01:33:48   to do something decisive.

01:33:50   Because once you have a company that's really, really big,

01:33:53   especially if the people running it

01:33:54   aren't the people who founded it and don't own it all,

01:33:57   you just kinda wanna not screw things up

01:33:59   and you have your options vest

01:34:01   and get your golden parachute.

01:34:02   And there's lots of motivations to not rock the boat.

01:34:05   Because everyone, and so it becomes hard

01:34:08   for big corporations to do anything.

01:34:09   It becomes hard for leadership to do something.

01:34:11   What Elon Musk has going for him

01:34:13   and the things people find attractive is he doesn't care.

01:34:15   He doesn't care because he's already rich.

01:34:17   He doesn't care because he doesn't care, whatever it is.

01:34:19   He has no problem with taking decisive action.

01:34:23   And people find that attractive.

01:34:25   But that is obviously the trap is,

01:34:27   if it's decisive action doing something terrible,

01:34:29   that's not good, right?

01:34:31   And very often people will say,

01:34:34   "Yeah, he did the terrible thing, but it was so decisive."

01:34:36   And that is attractive to people.

01:34:38   Like people can hold that in their head and say,

01:34:41   "I don't agree with what he did,

01:34:43   but at least he did it decisively

01:34:43   and that makes me admire him," right?

01:34:46   And the other thing is, well,

01:34:47   you can't argue with the results.

01:34:48   The results were great.

01:34:49   Very often people are put in a situation where,

01:34:52   you know, not being as incompetent as the last person

01:34:56   is seen as a victory, for example, right?

01:34:58   Or just sort of like, knowing,

01:35:02   the question is, if you had picked a random person

01:35:06   off the street and put them in that same situation

01:35:08   and said, "Consequences are meaningless to you.

01:35:10   No matter what happens, don't worry, you'll be fine.

01:35:13   Do whatever you think is right."

01:35:15   Tons of people would be able to do what Elon Musk did.

01:35:18   The thing that people say, oh, I'll attribute this to,

01:35:20   he's a genius, it's like, he was just uninhibited.

01:35:23   Was he uninhibited because that's the wise thing to do?

01:35:26   I don't know, but he was, he was uninhibited

01:35:28   and he was able to do the thing that other people

01:35:31   didn't have the guts to even try

01:35:33   and it worked out a few times and it also helped

01:35:34   that he was born rich and blah, blah, blah, right?

01:35:36   So I don't, like saying like, oh, he was involved

01:35:40   with things that were a success,

01:35:41   therefore he knows what he's doing.

01:35:43   I don't see that at all

01:35:44   because this is a totally different realm

01:35:45   and we kind of see how he uses Twitter

01:35:49   and if that's what he likes about it,

01:35:51   it's potentially, I potentially don't wanna be

01:35:54   on the Twitter that Elon Musk thinks he would enjoy,

01:35:57   but I'm also not ready to celebrate his decisiveness

01:36:00   until I see what he is decisive about

01:36:02   because I do want to see the decisions

01:36:04   and who knows what they will be

01:36:06   because it's not as if he's immune to feedback and learning.

01:36:10   But he'll try all sorts of things,

01:36:12   and he'll say all sorts of things,

01:36:13   and hopefully he will try something,

01:36:15   it'll be a disaster, he'll try something else.

01:36:16   That's what, you know, then the clock is ticking on his debt

01:36:18   and all his other financial things or whatever.

01:36:20   But that's his MO, and that's what people

01:36:23   who have nothing to lose do.

01:36:24   People who have nothing to lose can be refreshing,

01:36:27   but kind of like rewind.ai, sorry, rewind.ai.

01:36:30   Having nothing to lose can also be terrifying, right?

01:36:33   It really depends, and that's why, in general,

01:36:35   it's not a great idea to have single, unaccountable people

01:36:39   with nothing to lose in control of things that are important.

01:36:41   Just putting that out there.

01:36:42   - Yeah, and I totally agree with that.

01:36:44   But I think that when people judge

01:36:49   whatever he's gonna do with Twitter,

01:36:50   and look, he might completely ruin it for all we know,

01:36:53   but I think it's very important to just always

01:36:55   contextualize like, well, how good was it before?

01:36:58   In the case of things like moderation decisions,

01:37:01   I don't necessarily care if Twitter is filled

01:37:04   with a bunch of people whose stuff I don't wanna see

01:37:07   if I'm not seeing it, right?

01:37:09   And I don't really care if there's people

01:37:13   who try to abuse me or people I care about

01:37:17   if I and they don't see it.

01:37:20   So I think it's important to recognize,

01:37:22   like, you know, when you have a social network

01:37:25   as big as Twitter, it's really hard to say,

01:37:29   like, you know, people of type or belief, X, Y, Z,

01:37:32   should not even be allowed to use this,

01:37:34   because it is really blurring the line

01:37:36   between public infrastructure and a private company.

01:37:39   So that's a tough thing to manage.

01:37:41   - I think the line is very blurry when one person owns it.

01:37:44   And it's a private company.

01:37:45   Literally it's a private company.

01:37:47   Where's the blurred line?

01:37:48   - Well, but again, I mean, look,

01:37:49   I make the argument all the time about the App Store

01:37:52   and about how the App Store has become such a required

01:37:56   and massive part of so much of everyday life and commerce

01:38:00   that it does kind of cross the line

01:38:03   and start to need public style regulation, right?

01:38:06   social network, large social networks,

01:38:08   the handful that there are,

01:38:09   are kind of like that in certain ways.

01:38:11   Like it's tricky, you can't just say,

01:38:14   oh, somebody who's ultra conservative,

01:38:16   who's a huge (bleep)wad, you can't just say like,

01:38:19   oh, they can't use this platform

01:38:21   because of something stupid they said.

01:38:24   But you can say, no one needs to see what they say.

01:38:27   And so it's really, it's so hard.

01:38:29   - I think you can say they can't use the platform.

01:38:30   That's the whole point of having a private company.

01:38:32   I totally disagree with the idea that

01:38:33   Twitter is the public square.

01:38:34   It's like 200 million tech nerds and journalists.

01:38:37   It's a private company.

01:38:38   The public square is the public square.

01:38:40   The internet, you could say, is the public square.

01:38:42   But anybody can start a social network.

01:38:43   You can make your own Mastodon server.

01:38:46   I will never be on board with that.

01:38:47   It is ridiculous.

01:38:48   - Yeah, but you know what?

01:38:49   Look, everyone, look.

01:38:51   This is not the first time that Twitter has made us all mad.

01:38:57   And this is not the first time that lots of us have,

01:39:00   with great principled stands, said we're leaving Twitter.

01:39:04   Go join us somewhere here on this new other thing.

01:39:07   Come join us here because Twitter has made us mad

01:39:09   and all those other things have pretty much gone nowhere

01:39:12   because that's not how social networks work.

01:39:15   And so I think it's, to everyone out there

01:39:17   who's like fleeing Twitter, look, do what you gotta do.

01:39:19   If you feel strongly enough to do that, go for it.

01:39:23   But don't assume that you will never come back.

01:39:27   Don't assume that Twitter is guaranteed to be bad

01:39:30   under this new ridiculous leader.

01:39:33   Keep an open mind that maybe this will be fine,

01:39:36   or at least no worse than it has been,

01:39:38   and maybe leave the door open to come back,

01:39:41   because you're not gonna get anyone else

01:39:43   to go en masse to some new thing that's the same thing,

01:39:46   but just not Twitter.

01:39:47   That's not how anything works,

01:39:49   and we've seen this time and time again.

01:39:51   So I would advise everyone,

01:39:54   before you go fleeing to your alternative

01:39:56   and delete your Twitter account,

01:39:57   maybe don't delete the account,

01:39:58   maybe leave it open,

01:39:59   and don't post saying I'm never coming back.

01:40:01   Maybe just, you know, leave the door open

01:40:03   and don't burn the bridge.

01:40:05   - Also, to go back a little bit, is Elon that decisive?

01:40:09   Like he seems to be waffling about what to do

01:40:12   with Twitter Blue and whether or not--

01:40:14   - He came in and threw his weight around immediately,

01:40:16   making like, add this feature in a week and do this

01:40:18   and firing a bunch of people and telling a bunch of people

01:40:20   to do a thing within a week is probably stupid,

01:40:24   but definitely decisive.

01:40:25   He didn't have a year's worth of meetings with people

01:40:27   before deciding what to do.

01:40:28   He's just doing stuff.

01:40:28   - Sure, sure. - Is he doing things

01:40:30   with forethought or things that are wise to do or things that will be useful, we'll find

01:40:34   out. But he did a bunch of stuff, that's for sure.

01:40:36   Well, but then he's getting in fights with Stephen King, or not even getting in fights,

01:40:39   he's asking Stephen King, "Well, what do you think I should do?"

01:40:41   Oh, no, he does stupid stuff all the time. Like, that's all he does all day is stupid

01:40:44   stuff, right? And the things he did, quote unquote decisively, that was my whole point with the

01:40:47   decisiveness, is firing a bunch of people and trying to make sure they don't get severance,

01:40:51   like, "Oh, it's decisive." But it's bad. You're doing, it's like, you're firing the

01:40:54   wrong people and you're doing it in a dickish way and like, but because it's decisive, like,

01:40:59   something is happening.

01:41:01   So if your thing was like, "I hate stagnation of Twitter, I want to see something happen,"

01:41:05   the finger on the monkey's paw curls and says, "Okay, something's happening now."

01:41:11   And we'll see, we'll see how it turns out.

01:41:13   But things are happening.

01:41:14   I feel for the people who work there, because I'm sure lots of people who work there would

01:41:18   like things to be better and they don't have control over this.

01:41:22   But I'm sure this is not the last time we'll talk about this.

01:41:25   And it is hard to do, because all the things he's doing so far are rumors of what he's

01:41:28   doing internally that are leaking out, and it's like, that's not relevant to us

01:41:31   as users of the thing. We have to see what happens to the Twitter that we use,

01:41:35   and right now there's not much visible, except for tons of things that he's

01:41:39   talking about doing but hasn't actually done yet. So we'll see.

01:41:42   Yeah, I don't know. I just, I've spent an unreasonable amount of time trying to

01:41:49   understand random randos on Twitter, randos in the chat room, people that I am

01:41:56   acquainted with, friends, many of whom seem to worship the ground that Elon walks on.

01:42:01   Because he's decisive and he's rich and might makes right and he's a strong man. Like this

01:42:05   is not a new phenomenon. Thousands of years of human history have shown people love the strong

01:42:09   man. People love the authoritarian. Fascism is popular for a reason. It is explicable. It

01:42:15   should not be shocking. We would hope that people would, you know, learn the lesson of history,

01:42:19   but they don't. Uh, it is attractive. It will always be attractive, probably, unless

01:42:24   we evolve away from it or wipe ourselves out.

01:42:26   - Yeah, but he also, he has a lot of the same appeal

01:42:28   that Trump did to a lot of people of like,

01:42:30   you know, he is like, despite being like, you know,

01:42:34   the richest person in the world most of the time,

01:42:36   he is, he's seen as like the everyman to a lot of people

01:42:40   who feel like society is against them.

01:42:42   - Yeah, the everyman, he came from nothing,

01:42:44   the son of an emerald miner in South Africa.

01:42:47   - Yeah, yeah, just like Trump was self-made

01:42:49   and a good businessman, right, all that BS.

01:42:51   - Self-made multimillionaire.

01:42:54   - Right, started off with nothing but a few hundred million

01:42:56   from his father.

01:42:56   - But like, yeah, but like, what appeals to people

01:43:00   about him, there's so much overlap with Trump,

01:43:03   and it's many of the same people, and so I don't agree

01:43:06   with pretty much any of that, but I understand why

01:43:11   a certain personality type would think he's basically God,

01:43:15   and we have to deal with that, like that's the reality,

01:43:18   and we might as well try to understand it and deal with it.

01:43:21   The project of society is to try to make it so that those notions do not find root and

01:43:28   we are doing a bad job of educating society.

01:43:33   So far we're not burning witches, we got away from that, but barely.

01:43:37   Barely.

01:43:38   And in any second it threatens to come back.

01:43:42   That's the thing, is that like, I've had lengthy conversations with some of these Elon super

01:43:47   fans, it's so troubling because they seem to think a lot of them, "Oh, well just listen

01:43:54   to this interview," or "Oh, did you just know this one tidbit?" or "Oh, what if I told you

01:44:00   this?" Like suddenly, "Oh, you're right. Elon is God. I am a mere peon in the world that

01:44:07   he lives and owns." And I don't get like, I don't understand, and I guess I should just

01:44:14   let it go because it's probably not making for good programming. I just,

01:44:17   I don't understand how people are so enamored with him. I just don't,

01:44:20   I don't think that his cars are very good.

01:44:23   SpaceX I don't know enough about to have an opinion.

01:44:26   Starlink seems like it's clever, but not really going to amount to a whole lot.

01:44:30   I mean, part of it,

01:44:32   part of it was what you're explaining right now that you associate all those

01:44:35   things with a single person, which is a ridiculous thing, right?

01:45:37   But it's a thing that we all do as a shortcut. It's like Steve Jobs is Apple.

01:44:41   Elon Musk is Tesla, and obviously,

01:44:43   those are companies filled with people doing things.

01:44:46   And because he owns them or funded them

01:44:50   or made important decisions that led them to their success,

01:44:54   we attribute it all to that one person.

01:44:56   And that type of thing is the same thing

01:44:57   that leads you to, you know, to like, any,

01:45:00   you assign accomplishments, accrual of wealth,

01:45:05   power, good looks, height, like all these things,

01:45:08   all these attributes that people can have,

01:45:10   we connect them to virtue and say, you know, if you,

01:45:14   if you have a limp that is less virtuous than the person

01:45:17   who walks without a limp, right?

01:45:19   If you get a disease that is like,

01:45:21   and we connect everything with virtue, right?

01:45:23   Your hair color, your skin color, your height,

01:45:26   how much money you have, what family you belong to,

01:45:29   things that are in your control,

01:45:30   things that aren't in your control, we say,

01:45:31   and therefore that is connected to virtue.

01:45:33   What virtue?

01:45:34   Brains, wisdom, strength, you know, leadership qualities,

01:45:40   and that causes people, you know,

01:45:41   why do they worship these people?

01:45:42   Because all of those accomplishments are connected

01:45:44   with merits that they believe in,

01:45:46   and they say, of course I believe in him,

01:45:48   he is a hero, he is amazing, he is brilliant,

01:45:50   he is making the world a better place,

01:45:52   because they, you know, these accomplishments,

01:45:56   which can be accomplished by terrible people,

01:45:58   they say, but no, because he did those things,

01:46:00   because he has those things, because he is those things,

01:46:02   therefore he has these other virtues, right?

01:46:05   and that is, it's an unavoidable trap of human nature

01:46:10   and a shortcut that we take.

01:46:11   So anybody who acquires any of those things,

01:46:15   people start to see all those other virtues in them, right?

01:46:17   And of course, there's the people who see virtue

01:46:20   in the punishing of the people they don't like,

01:46:21   which gets into a bigger problem with fascism and racism

01:46:24   and so on and so forth, but it's depressing,

01:46:26   but it is well-trod territory,

01:46:29   and it is difficult to combat,

01:46:31   especially when other things are going poorly as well.

01:46:34   - Yeah, it's just, I just can't wrap my mind around

01:46:36   how people who I know that strike me

01:46:38   as intelligent human beings look at this

01:46:40   professional internet troll.

01:46:42   He seems like just a dirtbag human,

01:46:44   and yet these people like worship the ground he walks on.

01:46:46   I just don't get it, but we should move on.

01:46:48   - I think it's important too to separate

01:46:51   the personality details of this person from his work.

01:46:56   And I think, you know, there was a good discussion

01:46:59   on the talk show about this this week.

01:47:00   It was actually a great episode with Federico Viticci

01:47:02   and John Gruber, great episode, I recommend it.

01:47:04   There was a great discussion there about kind of

01:47:05   separating the artist from the work

01:47:07   when somebody turns out to be a turd in real life

01:47:10   and that comes to light, but you still enjoy their work

01:47:14   or what they've made before you found out they were a turd.

01:47:16   You can look at the various companies

01:47:18   that he's been involved with so far,

01:47:21   and I think largely they've been pretty good.

01:47:25   You know, they're not all perfect,

01:47:27   and some of his wacky ideas didn't get off the ground

01:47:30   or were too ridiculous to even consider.

01:47:33   But Teslas are pretty great cars

01:47:35   and they are selling an absolute ton of them

01:47:38   and they've done great things for car electrification.

01:47:42   So that's a pretty huge thing.

01:47:44   They did great things in batteries,

01:47:45   they're working on doing great things in solar.

01:47:48   SpaceX is itself a pretty great thing,

01:47:51   doing itself pretty great things.

01:47:53   Starlink is a bunch of asterisks

01:47:56   with the space debris problem,

01:47:58   but that's a pretty amazing idea

01:48:00   as well that from the handful of people I know

01:48:02   who have used it, it's pretty great.

01:48:05   So I think owning Twitter, again,

01:48:08   if we just don't even pay attention to the crap he says,

01:48:13   just look at the work, and I know that's hard

01:48:15   because he says a lot, and most of it's horrendously

01:48:19   inflammatory or ridiculous or whatever,

01:48:22   but just ignore everything he says,

01:48:25   and just look at the work.

01:48:27   Owning Twitter kind of fits in the sense that

01:48:30   it's yet another massive challenge in yet another area

01:48:34   that he has at least started out knowing nothing about.

01:48:37   But do you think he knew how to make cars before Tesla?

01:48:39   Do you think he knew how to make rockets before SpaceX?

01:48:41   Do you think he knew how to launch satellites

01:48:42   and run an ISP before Starlink?

01:48:45   This is actually fitting a pattern that he does

01:48:47   of tackling truly ridiculous, pretty large scale,

01:48:52   pretty difficult problems that he thinks he can do.

01:48:55   Regardless of why he thinks he can do them

01:48:58   or the ridiculous ideas that he gets when he,

01:49:01   that he spouts off on Twitter.

01:49:03   Again, if you ignore all of that,

01:49:06   because he's a massive troll and kind of a dick,

01:49:09   so if you just ignore all of that

01:49:12   and just look at the work,

01:49:14   he does actually achieve some pretty remarkable things,

01:49:18   and so that's why I think, hey, you know what?

01:49:21   I'll give him the benefit of the doubt.

01:49:22   I'll stick around on Twitter

01:49:23   to see what the heck he does with the place,

01:49:24   because again, it wasn't well run before,

01:49:27   And so he's tackling a massive problem again.

01:49:32   And in the past, when he has tackled massive problems,

01:49:35   it's actually worked out pretty well eventually.

01:49:39   Again, if you ignore everything he says.

01:49:41   So I'm willing to apply the same strategy here.

01:49:44   Turns out his car company makes pretty great cars

01:49:47   that a lot of people like.

01:49:48   His rocket company makes pretty good rockets

01:49:51   that are doing important work.

01:49:52   His satellite company is covering space

01:49:55   with these disposable satellites that,

01:49:58   all those issues aside,

01:49:59   are providing a good service to people.

01:50:01   So again, big problems, ignore all the crap

01:50:05   that comes out of the guy's mouth,

01:50:06   and I hope no one has to work for him in the process,

01:50:10   'cause apparently that's not a super fun environment,

01:50:12   or that's not super fun either,

01:50:13   but when you look at the work,

01:50:16   when you look at what comes out of this jerk,

01:50:20   it actually is often pretty impressive.

01:50:23   - Well, it's not a blank slate when it comes to Twitter,

01:50:24   'cause we kind of, like, we may not know

01:50:26   what he thinks he wants to do with Twitter,

01:50:28   but we know how he uses Twitter.

01:50:29   It's not like Twitter, he's not new to Twitter.

01:50:31   He's been on Twitter for years.

01:50:32   He is a very experienced Twitter user.

01:50:35   And if you see the way he uses Twitter and you think,

01:50:38   how would this person make Twitter more to their liking,

01:50:42   it's not optimistic.

01:50:43   That doesn't mean that's what he's gonna do.

01:50:44   It doesn't mean he's gonna make Twitter more to his liking,

01:50:45   but I have a feeling that he--

01:50:46   - Look at how Jack used Twitter.

01:50:48   - All right, well, I know.

01:50:50   But like, I mean, I think it kind of fit.

01:50:52   The way he used it was like,

01:50:53   I kind of like it the way it is,

01:50:54   and so it didn't change that much.

01:50:56   But Elon wants to change Twitter.

01:50:57   I mean, the reason he bought it practically is,

01:51:00   it's like, this is a thing I use all the time,

01:51:02   and I think it should be different than it is,

01:51:04   so I'm just gonna buy it 'cause I'm super rich, right?

01:51:06   So it's not as if we're like,

01:51:08   I'm just gonna learn about social networks now.

01:51:09   I mean, he is gonna learn about how it is to run them,

01:51:11   but we're not starting from zero in terms of,

01:51:13   I have no idea what Elon thinks about Twitter.

01:51:16   We at least know what he thinks about it as a user.

01:51:18   We don't know necessarily that he's going to tailor

01:51:20   the service to his own personal tastes,

01:51:22   but I have a hard time believing he won't.

01:51:25   So I'm going in predisposed to think

01:51:27   that the changes that he is going to make

01:51:29   are not going to appeal to me

01:51:31   and people who are of similar mindset, but we'll see.

01:51:34   Like I totally agree, it's like it's hard to talk about

01:51:37   until he's actually done something.

01:51:38   'Cause at this point, like,

01:51:39   we're not even talking about all the things that he said,

01:51:41   'cause whatever, he says lots of things.

01:51:43   He's changed what he said 17 times.

01:51:45   Let's just wait to see what he actually does.

01:51:47   But I am not particularly optimistic

01:51:49   because pretty much every idea that he's floated

01:51:52   with a few minor exceptions,

01:51:54   does not seem like the way to go to me,

01:51:56   but we'll see.

01:51:58   - Well, but see, even that,

01:52:00   in all the stupid crap he has floated,

01:52:03   there's usually been a little kernel of something

01:52:06   that was actually correct or good about it.

01:52:07   - Well, if you say everything, eventually,

01:52:09   one of the things is gonna be right,

01:52:11   if he covers all the bases.

01:52:12   - But all the stupid crap he said

01:52:14   that everyone jumps down his throat about,

01:52:16   usually it's like, okay, that's 80% a bad idea,

01:52:19   but 20% of it, you were actually on track

01:52:21   for something good, you know?

01:52:22   And I think that's how he does lots of things.

01:52:24   That like, if you look at the way he runs his companies

01:52:26   and some of the ideas he tries,

01:52:28   he does, he tries lots of crazy stuff,

01:52:30   where he floats a lot of crazy ideas,

01:52:31   or he says lots of crazy things.

01:52:32   Some of them never happen, thank God.

01:52:34   - Some of them do, like the Cybertruck.

01:52:35   - Right, yeah, well, that hasn't happened yet.

01:52:37   It's not out yet, you can't buy it yet.

01:52:39   - Oh, it'll arrive just before Linux on the desktop

01:52:43   really arrives, it'll be the same time.

01:52:45   - Yeah, but anyway, like, you know,

01:52:46   he has a bunch of ridiculous ideas.

01:52:48   He tries, like, you know, look at some of the things

01:52:50   he's talked about on Twitter so far.

01:52:52   So, you know, some of the immediate controversies

01:52:54   are that he wanted to lay off a bunch of people.

01:52:57   Well, yeah, Twitter's way too big for what it is.

01:52:59   How he does it--

01:53:00   - Well, but slow down, but he laid off the wrong people.

01:53:03   Right, it was the wrong people.

01:53:04   - So this is the 80% bad part.

01:53:06   But the 20% of it, like, he identified a real problem,

01:53:10   even if he might have solved it poorly.

01:53:12   - If you had picked someone off the street

01:53:13   and put them in charge of Twitter,

01:53:14   they also would have laid off a lot of people,

01:53:16   'cause everyone knows it's overstaffed.

01:53:17   It's like, that's what attributing this to genius is like,

01:53:19   you know, the obvious thing that everybody knew

01:53:22   but no one had the guts to do.

01:53:22   It's a condemnation of previous Twitter management,

01:53:25   obviously.

01:53:25   - Yeah, the previous revolving door of leadership

01:53:27   never touched it, so you know, so that's one thing, okay?

01:53:30   - Oh, they're the ones that hired all those people.

01:53:31   - Right, so that's one thing.

01:53:32   So there's also, you know, look at his dumb idea

01:53:35   about tying verification to Twitter Blue.

01:53:37   Well, some of that is terrible,

01:53:40   but also, Twitter Blue is a premium service

01:53:43   that no one I know bought.

01:53:45   I know so many people who are Twitter power users.

01:53:48   I don't know a single person who buys Twitter Blue.

01:53:51   And I have no use for it,

01:53:52   and I've been using this platform for like a decade, heavily.

01:53:55   And yet, so obviously, they launched a premium product

01:53:58   that most of their premium users don't want.

01:54:01   Secondarily, you know who could pay for verification,

01:54:04   who would love to pay for verification?

01:54:07   Businesses, every single business.

01:54:09   I have a verification mark on Overcast,

01:54:12   we have one on ATP, and in part,

01:54:14   the reason why I sought those

01:54:16   was because it makes us look more legitimate.

01:54:18   And if I could have those on other platforms as easily,

01:54:20   I would get them there too.

01:54:21   Because again, businesses are,

01:54:23   yeah, you don't wanna make every journalist in the world

01:54:27   have to pay for something they might not be able to afford

01:54:29   or whatever, but businesses sure can.

01:54:31   So again, there's a kernel of like, okay,

01:54:34   you have businesses capturing all this value on Twitter,

01:54:40   why not let businesses pay not $20,

01:54:43   make businesses pay $100 a month

01:54:44   for a verification check mark?

01:54:46   Like, these aren't terrible ideas completely.

01:54:50   The details that he has floated so far,

01:54:52   or that have been rumored that he said,

01:54:54   some of the details are awful,

01:54:55   many of the details are awful,

01:54:56   but at the heart, like, you know,

01:54:58   Twitter is a platform that most people have identified

01:55:02   as totally failing as an advertising delivery service.

01:55:05   And everything people want is away from ads.

01:55:09   So what if they launched a premium thing

01:55:11   that people wanted to pay six or eight

01:55:12   or 10 or 20 bucks a month for?

01:55:14   That's not that bad.

01:55:15   - Well, but did you do the math on that, though?

01:55:17   You do the back of the envelope math, it's not great.

01:55:20   That's what you get with just saying things and saying,

01:55:23   oh, well, you know, it'll work this way.

01:55:24   Just do the back of the envelope math.

01:55:25   Like, how much money do they make from ads now?

01:55:27   And everyone agrees they're not doing a good job with ads.

01:55:29   But how much do they make from ads now?

01:55:30   How much would they make if every single person

01:55:32   who's on Twitter pays what he's saying a month?

01:55:34   And you know 100% of people aren't gonna pay for it.

01:55:36   And it's like, it's way less, right?

01:55:38   So what's the plan?

01:55:39   And obviously, that's why it's so hard to engage with this.

01:55:41   Like, who cares?

01:55:42   He hasn't actually done anything yet, we'll see.

01:55:43   And what if it's only additive and blah, blah, blah.

01:55:45   Like there's all these theories about how it could go

01:55:47   or whatever, but back of the envelope,

01:55:49   like being able to pay for verification

01:55:51   and those type of things, you know,

01:55:53   like as most of the articles that talk about this say,

01:55:56   I think it was the Nilay Patel one,

01:55:58   that the product of Twitter is moderation, right?

01:56:00   That that's what they're selling.

01:56:01   - That was a great article, yeah,

01:56:02   The Verge, Welcome to Hell.

01:56:04   - Right, because that's true of anything

01:56:06   that's advertising driven or whatever.

01:56:08   Like you need the people to be there

01:56:09   so you can advertise to them.

01:56:10   So you have to make it a hospitable place.

01:56:12   Twitter has arguably not been great about getting people

01:56:15   to be there, but it's got a lot of the quote unquote

01:56:17   valuable people there who are valuable to advertise to

01:56:20   or whatever.

01:56:21   Like that all makes sense, it just hasn't been

01:56:23   leveraged very well.

01:56:25   And verification is part of the moderation product

01:56:29   because you want to make it so that there's a way

01:56:31   for people who use your service to determine

01:56:33   if this is really McDonald's or not McDonald's,

01:56:34   you know what I mean?

01:56:35   Like that's part of the product that they're selling, right?

01:56:39   You could sell other things on top of that,

01:56:40   like let the people who are willing or able to pay

01:56:42   get better features or whatever,

01:56:44   but that is all in service of making it a place

01:56:47   where it is safe and suitable to advertise.

01:56:52   Because the money you're gonna get

01:56:53   from the people who pay for any of these services

01:56:56   is nothing compared to,

01:56:57   like literally every single person on Twitter pays this,

01:57:00   it's not going to match your advertising income

01:57:02   unless you charge every single person $3,000 a month.

01:57:05   When you do the math, and again,

01:57:06   I say everybody because that's ridiculous,

01:57:08   what percentage of people,

01:57:10   no matter how good you make this product,

01:57:11   what percentage of the people even can pay for it,

01:57:13   let alone will pay for it,

01:57:15   then add up all their money

01:57:16   and weigh it against your potential advertising

01:57:18   to these people who are valuable to advertise to

01:57:20   and advertising keeps winning.
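
The back-of-the-envelope comparison John walks through can be sketched as a tiny calculation. Every number below (the user count, the per-user ad revenue, the conversion rate) is a made-up placeholder chosen only to illustrate the shape of the argument, not an actual Twitter figure:

```python
# Hypothetical back-of-the-envelope comparison: subscription revenue vs. ad revenue.
# All inputs are illustrative assumptions, not real Twitter numbers.

def annual_subscription_revenue(users, conversion_rate, price_per_month):
    """Yearly revenue if `conversion_rate` of `users` pay `price_per_month`."""
    return users * conversion_rate * price_per_month * 12

# Assumed inputs (placeholders for the sketch):
monthly_users = 250_000_000      # assumed user base, order of magnitude only
ad_revenue_per_user_year = 18.0  # assumed ad revenue per user per year
price = 8.0                      # the floated $8/month subscription price
conversion = 0.05                # assume 5% of users would ever pay

ads = monthly_users * ad_revenue_per_user_year
subs = annual_subscription_revenue(monthly_users, conversion, price)

print(f"ads:  ${ads / 1e9:.1f}B / year")   # $4.5B under these assumptions
print(f"subs: ${subs / 1e9:.1f}B / year")  # $1.2B under these assumptions
```

Under any plausible conversion rate well below 100%, the subscription line comes in far under the advertising line, which is the point being made: subscriptions can supplement ads, but not replace them.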

01:57:22   And so in the end,

01:57:23   if you look at this from a business person's perspective,

01:57:26   if you want to make more money than they're making now,

01:57:28   you can cut costs,

01:57:29   which is what everybody does when they buy a company,

01:57:31   lay off a bunch of people, lower your costs,

01:57:32   and you can increase revenue.

01:57:34   And where is that really gonna come from?

01:57:35   You can get some revenue from your users,

01:57:37   but there's not a lot of them.

01:57:39   Twitter is not as big as Facebook, you know?

01:57:42   So they can't get a fraction of a cent from everybody

01:57:45   and be bazillionaires.

01:57:46   They have to get larger money from advertisers

01:57:50   to give access to their relatively small,

01:57:52   mere hundreds of millions or whatever

01:57:54   Twitter's user base is,

01:57:55   of people who are valuable to advertise to.

01:57:58   And then there's the value of the conversation

01:58:00   that happens on Twitter.

01:58:01   Again, makes it a place where people wanna be

01:58:04   so you can advertise to them.

01:58:06   Are there other ways to make money from Twitter?

01:58:08   Fine, but any way you come up with to make money from Twitter, you have to say,

01:58:10   okay, is this in lieu of advertising? Because if it is,

01:58:12   it has to make at least as much money. But it's not in lieu of advertising.

01:58:15   Does this make it an environment where advertisers still want to advertise?

01:58:19   And that's where I feel like he gets into trouble with a lot of his schemes that

01:58:22   he's throwing out there.

01:58:23   Not that they're bad ideas because I've always been a proponent of have people

01:58:26   pay for a service so they can support it with money.

01:58:28   But Twitter is so long past the point. Like it's,

01:58:31   it's too big to be supported by its users because you know,

01:58:35   not enough people will pay to run Twitter.

01:58:38   You can't run Twitter on, you know,

01:58:39   you can't make it like app.net where everyone who's on it

01:58:41   pays something or whatever.

01:58:43   Twitter is too big for that.

01:58:44   But it's also too small to say,

01:58:47   "It's a free for all and I don't care

01:58:49   "because we have literally five billion people

01:58:51   "so you will advertise with us."

01:58:53   It's in that uncomfortable place in between

01:58:55   where it's just barely too big to be supported by its users

01:58:58   so you have to advertise to it,

01:59:00   but it's filled with people who want to make it a place

01:59:02   where advertisers don't wanna advertise.

01:59:05   Well, I'm tired of talking about this.

01:59:07   - You do wonder if Elon Musk watches Black Mirror

01:59:09   and again, like the joke in the chat room,

01:59:11   sees every episode and says, "That would be awesome."

01:59:13   - Don't give him any more ideas.

01:59:15   - Seriously, Jesus, Jesus, John.

01:59:17   - Thanks to our sponsors this week,

01:59:20   Squarespace, Trade Coffee, and Linode.

01:59:23   And thanks to our members who support us directly.

01:59:25   You can join at atp.fm/join.

01:59:28   We will talk to you next week.

01:59:30   (upbeat music)

01:59:33   Now the show is over, they didn't even mean to begin

01:59:37   'Cause it was accidental (accidental)

01:59:40   Oh, it was accidental (accidental)

01:59:43   John didn't do any research, Marco and Casey wouldn't let him

01:59:48   'Cause it was accidental (accidental)

01:59:51   Oh, it was accidental (accidental)

01:59:54   And you can find the show notes at ATP.fm

01:59:59   And if you're into Twitter, you can follow them @C-A-S-E-Y-L-I-S-S

02:00:08   So that's Casey Liss, M-A-R-C-O-A-R-M

02:00:12   A-N-T, Marco Arment, S-I-R-A-C-U-S-A, Syracusa

02:00:19   It's accidental (it's accidental)

02:00:23   They didn't mean to accidental (accidental)

02:00:28   ♪ The tech podcast so long ♪

02:00:31   - Can we have something light in the mood please

02:00:34   and thank you?

02:00:35   - I don't know, maybe we just have to come to it.

02:00:37   I did enjoy the part where he thought Stephen King

02:00:39   was balking at paying $20 so he said, "How about eight?"

02:00:42   (snorts)

02:00:43   - He's so dumb, he's so dumb.

02:00:45   - He's not dumb, that was trolling.

02:00:47   That was like, come on.

02:00:49   - Was it?

02:00:50   - He is a troll.

02:00:51   - Oh, he's unquestionably a troll, unquestionably.

02:00:54   - I mean, part of that, part of saying that he's always,

02:00:58   that he's always a troll is attributing to him

02:01:00   an intelligence, wisdom, and self-awareness

02:01:03   that is not an evidence, because it seems like--

02:01:05   - Right, thank you! - You know what I mean?

02:01:07   It's easier to say that because, well, obviously,

02:01:09   he must be smart 'cause he's rich,

02:01:10   and so when he says something stupid,

02:01:12   it must be because he's trolling to get a rise out of you,

02:01:14   but no, sometimes he's just stupid.

02:01:15   - Well, I mean, and the things,

02:01:17   like this strikes me as trolling,

02:01:19   but I'm not convinced it is.

02:01:21   - Yeah, for sure, he is definitely a troll.

02:01:23   Like, I'm not arguing with that.

02:01:24   - I don't know, he's 1,000% a troll.

02:01:26   But sometimes it's hard to tell when he's trolling

02:01:29   and when he's just dumb because there's enough

02:01:30   of both to go around.

02:01:31   - Well, and this is such an absurd tweet.

02:01:34   He tweeted a couple hours ago,

02:01:35   advertisers should support this poll.

02:01:37   Option one, freedom of speech.

02:01:41   Option two, political "correctness."

02:01:41   - That's trolling.

02:01:42   - I know it is, but it's so,

02:01:45   like why do we worship this jackass?

02:01:47   Why?

02:01:48   I don't get it.

02:01:49   - I don't think a lot of people worship him.

02:01:51   I don't think it's an epidemic of worship.

02:01:53   It's an epidemic of positive regard,

02:01:55   which is how we got Trump as president. That's what it is.

02:01:58   Can we please talk about anything else for the love of Christ, please?

02:02:01   I don't care what, anything, anything else. Why did I bring this up?

02:02:05   I brought this on myself. I'm an idiot. I shouldn't have brought it up.

02:02:08   I mean, we avoided it for as long as we could. He's like, well, he's,

02:02:11   he was going to buy it, but then he's trying to get out of it.

02:02:14   It's been going on for months. You know, it went on for, and eventually it,

02:02:17   you know, it ended the way, again,

02:02:20   things that fly in the face of the idea

02:02:23   of him being a mastermind. It seems like he wanted to buy Twitter in just sort of

02:02:27   a fit of pique, of like, "I should control this because I'm on it all the time and it sucks

02:02:31   and I can make it better and I should buy it for a ridiculous price." And then he kind

02:02:34   of said, "Well, actually, maybe I don't want to do that. It seems like a dumb idea. Can

02:02:37   I get out of it? Oh, no, I can't. Oh, I guess I'm buying it." Like, that is not the sign

02:02:40   of a stable genius. Like, it's not the sign of somebody who has got their stuff together

02:02:47   and is, you know, it sure is decisive action. He took a decisive action to do a stupid thing

02:02:52   and then he decisively said,

02:02:53   "I'm getting the hell out of this,"

02:02:54   and he decisively couldn't do that.

02:02:55   (laughing)

02:02:57   It does not reflect well on him.

02:02:59   If someone who was not a millionaire did that,

02:03:01   you would pity them, right?

02:03:03   You're not a billionaire, sorry, Elon.

02:03:08   Not America's richest man.

02:03:08   You would feel pity for them

02:03:09   and how little hold they had over their life

02:03:12   and what poor decisions they make.

02:03:13   But when someone who's fabulously wealthy does it,

02:03:15   people will bend over backwards

02:03:16   to attribute genius to those moves.

02:03:18   Those moves are not genius, not at all.

02:03:20   Just ask the bankers who lent him the money.

02:03:22   [beeping]
