119: Thinking, Fast and Slow
00:00:00
◼
►
Don't be alarmed, Myke.
00:00:01
◼
►
Oh, that's a terrible way to start.
00:00:03
◼
►
I don't like this.
00:00:04
◼
►
I am recording this podcast on the beta, on the Mac beta.
00:00:11
◼
►
Now, just to be clear, not because I want to,
00:00:15
◼
►
but because I have to.
00:00:16
◼
►
You don't have to.
00:00:17
◼
►
No, I do have to.
00:00:19
◼
►
All right, so.
00:00:19
◼
►
Why do you cause me so much stress?
00:00:21
◼
►
I'm not causing, I'm letting you know
00:00:23
◼
►
that you don't have to be stressed
00:00:25
◼
►
because I have the backup recording going,
00:00:27
◼
►
which is physically separate from everything else
00:00:31
◼
►
and will be perfectly fine as long as the batteries don't run down
00:00:34
◼
►
but I'd freshly changed them just before this episode so
00:00:37
◼
►
the chance of that happening is low
00:00:41
◼
►
assuming I picked the right batteries and didn't pick dead batteries
00:00:44
◼
►
- Just keep your eye on the recorder would be my request
00:00:46
◼
►
I'm sure it gives you some kind of indication when the battery's low and going low
00:00:49
◼
►
- Yeah it turns off
00:00:50
◼
►
so I've got it right in front of me so I can see if that happens
00:00:53
◼
►
but no I'm not doing this on purpose
00:00:56
◼
►
This isn't fun levels shenanigans.
00:00:59
◼
►
What happened is my writing computer has become...
00:01:06
◼
►
I guess the way to describe it is it has, like, a synchronization corruption in Dropbox
00:01:15
◼
►
that is causing me problems.
00:01:17
◼
►
And so I have had to quarantine my writing computer from any kind of network access
00:01:24
◼
►
so that all of the work that I do doesn't get messed up.
00:01:29
◼
►
The only other computers that I have are the laptops where I'm running the betas.
00:01:33
◼
►
And so that's why I'm talking to you from the beta right now.
00:01:39
◼
►
I'll allow it.
00:01:40
◼
►
Okay, you'll allow it?
00:01:42
◼
►
I'll allow it.
00:01:43
◼
►
I mean, I know you're having this problem because sometimes,
00:01:46
◼
►
like after our last episode, I had to text you and be like, "Where's the file?"
00:01:50
◼
►
So was it happening from then?
00:01:52
◼
►
Yes, so our last episode was the thing that finally clued me into,
00:01:58
◼
►
"Hey buddy, you've got some problem in your system."
00:02:01
◼
►
And you kept texting me like, "Where's the file?"
00:02:04
◼
►
And I kept checking on my writing computer, which is also the podcasting computer,
00:02:08
◼
►
and seeing files there.
00:02:10
◼
►
It says uploading.
00:02:11
◼
►
It should be with you any moment, Myke.
00:02:13
◼
►
But as your increasingly frequent messages conveyed to me, the file was not showing up.
00:02:19
◼
►
This always happens when I need the file quickly.
00:02:22
◼
►
And I did need the file quickly after our last episode
00:02:25
◼
►
'cause we're under a bit of a time crunch.
00:02:27
◼
►
'Cause I wanted to start editing straight away,
00:02:29
◼
►
which is one of the worst things you can do.
00:02:31
◼
►
Like with the way that I edit this show,
00:02:33
◼
►
that we just had the conversation,
00:02:34
◼
►
now I'm literally going to listen back
00:02:36
◼
►
to all of it immediately, which is like--
00:02:38
◼
►
- Oh, that's the worst. - That sucks.
00:02:41
◼
►
It's nice to have a couple of days at least.
00:02:43
◼
►
Which I also, I will say, I feel like it fits better.
00:02:46
◼
►
After a couple of days when I come back to that episode,
00:02:48
◼
►
not only is some of it kind of refreshing,
00:02:51
◼
►
I've also like my brain has been working on it
00:02:53
◼
►
a little bit more and I find that I'm able
00:02:56
◼
►
to immediately notice the parts that I know didn't work
00:02:59
◼
►
in a way that I don't get that after for some reason,
00:03:02
◼
►
like if I go straight into the episode,
00:03:03
◼
►
like there's this weird thing where I feel like my brain
00:03:06
◼
►
is kind of just like chewing on the conversation a bit,
00:03:09
◼
►
which is interesting.
00:03:10
◼
►
But yeah, so I always, this always happens
00:03:13
◼
►
and this might be like a selection effect kind of thing,
00:03:16
◼
►
like maybe there's always problems,
00:03:17
◼
►
but I don't usually notice them
00:03:19
◼
►
because they resolve themselves after a day or two.
00:03:21
◼
►
- Usually my fault is just simply not turning back
00:03:24
◼
►
on Dropbox after the conversation is over.
00:03:26
◼
►
That's the fault most of the time, but this time, okay.
00:03:30
◼
►
So what is happening, I don't think it's Dropbox's fault.
00:03:34
◼
►
I do think it's ultimately my fault,
00:03:37
◼
►
but the chain of events,
00:03:40
◼
►
as far as I was able to reconstruct this incident
00:03:43
◼
►
is that I keep a local copy of all of my Dropbox files
00:03:48
◼
►
on one of these giant Pegasus drive things
00:03:53
◼
►
that's under my desk,
00:03:54
◼
►
in one of these like,
00:03:55
◼
►
we can hold 50 terabytes of data kind of drives.
00:03:57
◼
►
- Do they have spinning hard drives or use SSDs in them?
00:03:59
◼
►
- I don't know, they're those funny shaped other drives
00:04:02
◼
►
that I only see in server stuff.
00:04:04
◼
►
I don't actually know if they're spinning drives
00:04:05
◼
►
on the inside or if they're SSDs on the inside.
00:04:07
◼
►
- Does your Pegasus thing make any noise?
00:04:10
◼
►
- It does, but it doesn't make
00:04:11
◼
►
spinny hard drive kinds of noises.
00:04:13
◼
►
It makes electrical kind of noises. So I'm going to guess they're solid state, but I don't know for sure.
00:04:17
◼
►
But so, precisely, you've now identified where the problem begins, because it does make noise.
00:04:22
◼
►
I've wanted to have it outside the acoustically separated writing computer which means that I need
00:04:28
◼
►
to run a wire from the Pegasus to the computer but of course I also have a standing desk.
00:04:36
◼
►
And so I set up the situation so that the wire could just reach when the standing desk was at the highest level.
00:04:45
◼
►
Great. Really good. Really good stuff.
00:04:50
◼
►
And this way, the Pegasus drive could be outside the little recording booth that I've made.
00:04:54
◼
►
And also, I could raise and lower the standing desk as long as I was using the preset memory heights for the standing desk.
00:05:04
◼
►
And hey, what's the issue with having your massive storage solution, the cable for it,
00:05:09
◼
►
just under slight tension constantly? What's the issue with that? No problem.
00:05:12
◼
►
That wouldn't cause any issues.
00:05:14
◼
►
Well, and it hasn't caused any issues for my entire quarantine year.
00:05:20
◼
►
Just before our last recording, I was attempting to redo some of the wires behind my desk,
00:05:29
◼
►
and while I was doing that, I thought, "Oh, I need a little bit more space
00:05:33
◼
►
under the standing desk while I'm working here," and so I pressed the up button and
00:05:38
◼
►
right out popped the cable. Now Dropbox was running at the time, and just to give people
00:05:45
◼
►
a sense of the scale of the thing, I checked this morning and I have 20 terabytes of data
00:05:52
◼
►
in about a million files in my Dropbox system. That's a lot of terabytes. It's a lot of terabytes.
00:05:58
◼
►
It's a lot of files. Obviously that's partly because I'm sharing documents with a bunch of
00:06:03
◼
►
people and like people that I work with and so there's like there's just a ton of stuff in there.
00:06:07
◼
►
But what happened is after the cable got pulled out, well, I plugged it back in and I thought
00:06:15
◼
►
hopefully nothing bad happened. But obviously something bad did happen because Dropbox started
00:06:23
◼
►
to re-index the entirety of those 20 terabytes and million files.
00:06:28
◼
►
Wait, so do you have, I just want to make sure I've got this right, so you have all
00:06:32
◼
►
this stuff on the Pegasus drive.
00:06:35
◼
►
And that's going up to Dropbox as well?
00:06:37
◼
►
No, it's all in Dropbox.
00:06:38
◼
►
What's on the Pegasus drive?
00:06:40
◼
►
The Pegasus drive is where I have my Dropbox folder.
00:06:44
◼
►
And I've told that Dropbox folder, "Keep everything saved locally."
00:06:48
◼
►
Right, okay.
00:06:49
◼
►
And then on your other machines you're doing that like, download it when you need it.
00:06:52
◼
►
doing the selective syncing thing.
00:06:54
◼
►
- Do you use selective sync
00:06:55
◼
►
or do you use the smart sync thing?
00:06:58
◼
►
- So I've, well, I'm slightly changing the way I work now
00:07:01
◼
►
because of this very problem.
00:07:03
◼
►
But previously I was using selective sync
00:07:06
◼
►
where you can tell it,
00:07:07
◼
►
just pretend like these folders don't exist.
00:07:09
◼
►
And part of the reason I was doing that
00:07:11
◼
►
is because every time I would install Dropbox
00:07:12
◼
►
on a new computer,
00:07:14
◼
►
it would give me this message that said,
00:07:15
◼
►
hey buddy, you have more than 500,000 files in your system.
00:07:19
◼
►
We strongly recommend
00:07:20
◼
►
don't try to synchronize all of this, like just use selective sync for what you need.
00:07:25
◼
►
And since the laptops only have a terabyte of data or whatever, I would have to do that.
00:07:29
◼
►
And also the old way they used to work about selecting files to be local or not local but
00:07:35
◼
►
still visible used to not work with Time Machine, but they seem to have fixed that. It does
00:07:39
◼
►
seem to work with Time Machine now.
00:07:41
◼
►
I think I'm having an issue with Dropbox and Time Machine.
00:07:43
◼
►
Okay, yeah. What do you mean?
00:07:45
◼
►
My Time Machine backup keeps failing, and it's telling me it needs to back up four terabytes
00:07:50
◼
►
of stuff. My iMac has a 1TB SSD in it, so I don't know where it's drawing 4TB of stuff from.
00:07:57
◼
►
Yeah, this is the kind of thing that you can run into.
00:07:59
◼
►
My Time Machine isn't working, and I think it's related to my Dropbox, because I have
00:08:04
◼
►
like 3TB of stuff in Dropbox.
00:08:06
◼
►
Yeah, I'd easily bet that that's what this is. Ideally, the Dropbox should register all
00:08:12
◼
►
of the files that aren't there locally as 0 bytes in size, but there are funny things
00:08:18
◼
►
that can happen when you have files that reference other files and like all sorts of complications
00:08:23
◼
►
I think maybe Time Machine is actually not a thing I can do anymore.
00:08:26
◼
►
So that's that's the situation.
00:08:28
◼
►
Everything was local on the Pegasus and then it got disconnected while Dropbox was running
00:08:33
◼
►
and then Dropbox attempted to re-index things.
00:08:37
◼
►
It seemed to be going fine at first with Dropbox, so the number is dropping, like indexing a
00:08:44
◼
►
million files, indexing 900,000 files.
00:08:47
◼
►
And I was like, "Okay, well, this will just take a while."
00:08:49
◼
►
But it got down to about 400,000 files and then just stopped.
00:08:55
◼
►
And so Dropbox kept saying, "Indexing 400,000 files."
00:09:00
◼
►
And also had this hilarious, like, uploading 200,000 files.
00:09:05
◼
►
But I could check the network access and see, "You're not doing anything, Dropbox.
00:09:09
◼
►
There's no data coming out of this computer.
00:09:11
◼
►
You're stuck. You're just stuck in this current position."
00:09:16
◼
►
So I thought, well, these are a lot of files.
00:09:18
◼
►
It's a lot of data.
00:09:19
◼
►
I'm sure I can just wait long enough and this problem will sort itself out.
00:09:23
◼
►
Uh, no, it didn't.
00:09:26
◼
►
I've been waiting five, six weeks now and it just didn't move at all.
00:09:31
◼
►
But what was happening is that my writing computer was downloading
00:09:36
◼
►
new stuff from my other computers that I was working on.
00:09:40
◼
►
And so I thought, oh, okay, I guess everything is staying in sync.
00:09:46
◼
►
But I've only just noticed in the past couple of days that it isn't.
00:09:52
◼
►
That the newer computers keep trying to revert to the way Dropbox was like a month ago before
00:09:58
◼
►
this happened and disappearing stuff that I've worked on.
00:10:02
◼
►
So I was like...
00:10:03
◼
►
So I realized, "Oh no indeed" is what I realized.
00:10:11
◼
►
So I thought, okay, writing computer, shut down immediately.
00:10:15
◼
►
Goodbye, like you are not getting network access ever again
00:10:20
◼
►
until I can create some kind of Faraday cage around it
00:10:24
◼
►
to boot it up and probably just like wipe the whole machine
00:10:27
◼
►
and start over if I'm gonna use it for something else.
00:10:28
◼
►
- I think that this situation compounded
00:10:32
◼
►
with other problems you have had with Dropbox in the past,
00:10:35
◼
►
I think is suggesting that you need
00:10:36
◼
►
a slightly different system than the one that you're using.
00:10:40
◼
►
I feel like there needs to be a cold storage which isn't connected to Dropbox anymore.
00:10:46
◼
►
I think 20 terabytes is too much to put in Dropbox.
00:10:51
◼
►
I think you're always going to have these problems.
00:10:53
◼
►
I mean maybe.
00:10:54
◼
►
I would be curious to know among Dropbox's enterprise users where do I rank percentage-wise
00:10:58
◼
►
in terms of amount of data that is being used.
00:11:01
◼
►
I feel like there have to be organizations that are using 100 times more than I'm using
00:11:04
◼
►
for Dropbox.
00:11:05
◼
►
Yeah, but it's probably all not for one person.
00:11:08
◼
►
Right, but this is where it's coming from,
00:11:09
◼
►
team stuff as well or working with other people like even between the two of us, right?
00:11:13
◼
►
Like we both have access to a copy of all of the Cortex files.
00:11:17
◼
►
Yeah, but I don't think that people even in like large organizations are sharing that volume of data between them, right?
00:11:24
◼
►
Like maybe an organization has 20 terabytes, but each individual only has a small percentage of that overall thing.
00:11:30
◼
►
Right, that they have - I see what you're saying that they have access to, that there's not any individual
00:11:35
◼
►
who's trying to keep on top of 20 terabytes worth of stuff.
00:11:39
◼
►
Okay, I see what you're saying.
00:11:40
◼
►
- I think particularly the way that you're doing it
00:11:42
◼
►
of like 20 terabytes of external storage,
00:11:46
◼
►
I think that that's like compounding the potential risk.
00:11:49
◼
►
- Hmm, what could go wrong?
00:11:52
◼
►
- All of the data being stored in that one place
00:11:54
◼
►
is probably a bad, bad idea.
00:11:56
◼
►
- Right, well I mean, part of the reason why
00:11:58
◼
►
I wanted a local copy of all of it
00:12:00
◼
►
is so that I can make my own backups
00:12:02
◼
►
and not just trust Dropbox to have all of the files
00:12:05
◼
►
all of the time. - Yeah.
00:12:05
◼
►
- Like that's what I'm just trying to do there, but.
00:12:07
◼
►
- No, I completely understand why you would do that.
00:12:10
◼
►
If I was, okay, this isn't necessarily helpful as such
00:12:14
◼
►
because it's so hard to get to this,
00:12:16
◼
►
but if I was gonna re-architect what you're doing,
00:12:19
◼
►
I think you need to have,
00:12:21
◼
►
there's the Dropbox active storage,
00:12:23
◼
►
and then there is an away-from-Dropbox storage,
00:12:27
◼
►
which is physically on a thing that you have in your home,
00:12:32
◼
►
but then that is also backed up to another service
00:12:36
◼
►
like Backblaze or something.
00:12:38
◼
►
So you have an online backup for it,
00:12:40
◼
►
and it's accessible to you,
00:12:42
◼
►
but not constantly like Dropbox is.
00:12:44
◼
►
I don't think you need that, I would expect.
00:12:47
◼
►
Like at your fingertips on every single computer
00:12:49
◼
►
that you have at any moment.
00:12:51
◼
►
Like really easy. - Yeah.
00:12:52
◼
►
- 'Cause like even with Backblaze,
00:12:54
◼
►
if you had it all there,
00:12:56
◼
►
you could log into Backblaze
00:12:57
◼
►
and download that data on any machine,
00:13:00
◼
►
but it's not like in a folder structure in Finder.
00:13:05
◼
►
I think you're putting too much stuff through that system.
00:13:09
◼
►
- Yes, I mean, possibly precisely because of the moment
00:13:12
◼
►
that we're in right now.
00:13:14
◼
►
- I think you've proven it because I think millions
00:13:17
◼
►
of files up and down, it's just, it's gonna get,
00:13:20
◼
►
like you only need one out of a million
00:13:25
◼
►
to have some kind of weirdness to it.
00:13:28
◼
►
- And then you're in this situation
00:13:29
◼
►
and I don't know how you wouldn't just continue
00:13:32
◼
►
to get in these situations forever
00:13:34
◼
►
unless you change something about the way you store files.
00:13:36
◼
►
- Yeah, I mean, that is my suspicion,
00:13:37
◼
►
is that some file has become corrupted in an odd way
00:13:42
◼
►
that doesn't allow Dropbox to continue to index it,
00:13:45
◼
►
and that's where the system is just getting stuck.
00:13:47
◼
►
And that it's trying to keep track of new things,
00:13:50
◼
►
but also the synchronization status keeps seeming like,
00:13:54
◼
►
oh no, these new files don't exist
00:13:55
◼
►
because I'm currently running on this machine
00:13:57
◼
►
where they don't, and this machine is up to date
00:13:59
◼
►
because I haven't finished the indexing.
00:14:02
◼
►
I noticed it because I was working on a, like a,
00:14:04
◼
►
just a really dumb little vlog.
00:14:06
◼
►
And then I went on my laptop to go edit it.
00:14:08
◼
►
And it's like, this file doesn't exist.
00:14:11
◼
►
This entire Final Cut project that's gigabytes in size,
00:14:15
◼
►
never heard of it.
00:14:15
◼
►
And I was like, oh God.
00:14:18
◼
►
- This is kind of funny to me,
00:14:19
◼
►
'cause like on Twitter a couple of days ago,
00:14:21
◼
►
I saw a conversation between Hank Green and MKBHD,
00:14:25
◼
►
where they were talking about the fact
00:14:26
◼
►
that once they upload videos, they delete everything.
00:14:29
◼
►
The only thing that exists is what's uploaded to YouTube.
00:14:32
◼
►
They don't keep anything.
00:14:33
◼
►
- Yeah, I mean, I'm just gonna say though,
00:14:35
◼
►
I think that does make more sense for both of them.
00:14:38
◼
►
Like I think that's quite a reasonable workflow.
00:14:40
◼
►
I think there's a little bit of a difference
00:14:42
◼
►
in the content there of I do want to keep
00:14:44
◼
►
the originals of everything.
00:14:45
◼
►
- What surprised me about MKBHD was that surely
00:14:48
◼
►
he needs B-roll footage sometimes.
00:14:51
◼
►
So maybe he saves a little bit of that, I don't know.
00:14:53
◼
►
But that was a surprise to me.
00:14:55
◼
►
But you know, like as I've said many times,
00:14:57
◼
►
the only shows that I keep the project files for
00:15:00
◼
►
is this one.
00:15:01
◼
►
I don't keep anything more than like a month or two.
00:15:05
◼
►
After a month or two, I delete all the project files
00:15:07
◼
►
and then just carry on with all my other shows.
00:15:09
◼
►
Except for Cortex, I have every Logic project of Cortex.
00:15:12
◼
►
I never thought it was gonna be useful until Moretex.
00:15:16
◼
►
It was so useful.
00:15:17
◼
►
- Right until you needed to remaster everything from Cortex.
00:15:19
◼
►
- Yeah, remove all of the ads from the entire back catalog,
00:15:21
◼
►
which is a feature of Moretex, by the way.
00:15:23
◼
►
If you go to getmoretex.com,
00:15:25
◼
►
not only do you get additional content for every episode
00:15:27
◼
►
and no ads, you get an entirely ad-free
00:15:30
◼
►
remastered back catalog.
00:15:31
◼
►
It's just higher audio quality.
00:15:33
◼
►
- Yeah, and it's for that same reason
00:15:35
◼
►
that I do wanna keep all of the video projects.
00:15:37
◼
►
- I do understand it.
00:15:38
◼
►
Like I'm not saying that you're wrong for doing it.
00:15:40
◼
►
It was just interesting to me,
00:15:42
◼
►
but I think Dropbox is not the place for that.
00:15:45
◼
►
- Yeah, I mean, maybe I need to figure out something else,
00:15:47
◼
►
but the problem of different people's versions,
00:15:51
◼
►
even for the old stuff, getting out of sync
00:15:54
◼
►
is a non-trivial issue for if you ever do need
00:15:58
◼
►
to reconstruct what is the current state of this thing.
00:16:01
◼
►
I'll have to think about it, but all of that is just to tell you why and how I'm recording to you from the beta currently
00:16:09
◼
►
and is also another continuing step of the saga of my writing computer, which I think maybe the lesson learned is that the writing computer should not also be the server for every file you have
00:16:25
◼
►
And also a standing desk computer.
00:16:28
◼
►
What an interesting idea.
00:16:29
◼
►
It's almost as if, if you were going to try and sequester a machine to do one thing, you
00:16:34
◼
►
don't make it do everything.
00:16:36
◼
►
It's funny that, really.
00:16:37
◼
►
Yeah, I think that's the lesson we've all learned today.
00:16:41
◼
►
This episode of Cortex is brought to you by Muse.
00:16:45
◼
►
Muse is a tool for thought on iPad.
00:16:48
◼
►
It gives you a spatial canvas for your research notes, your sketches, screenshots, bookmarks,
00:16:53
◼
►
PDFs and so much more. The Muse team believes that deep thinking doesn't happen in front of a computer.
00:16:59
◼
►
So Muse turns your iPad into a space inspired by your desk, letting you be personal, creative,
00:17:06
◼
►
and even a little bit messy. You can put anything on a Muse board. You can pull in relevant
00:17:11
◼
►
information from the web, email, Twitter, Slack, your files, notes or photos from your phone,
00:17:17
◼
►
and then just arrange it however you like. Muse lets you sift and sort through it all, helping you
00:17:22
◼
►
find new patterns and insights.
00:17:25
◼
►
There are times when I come across user interfaces that I just enjoy playing with and Muse is
00:17:29
◼
►
one of those.
00:17:30
◼
►
It does some things that feel really natural for how you want to interact with your iPad,
00:17:35
◼
►
making this bridge between physical and digital.
00:17:38
◼
►
You're able to freely place things wherever you want and you can move them around how
00:17:41
◼
►
you see fit, make them bigger, smaller.
00:17:43
◼
►
It's super intuitive and fun to use and it's an app where you're actually using both hands
00:17:48
◼
►
at the same time which I really enjoy.
00:17:50
◼
►
This is a tool that I'm going to be using when brainstorming product ideas in the future.
00:17:55
◼
►
It also feels really perfect for creating things like mood boards.
00:17:59
◼
►
Visit MuseApp.com to learn more and download Muse for free today.
00:18:07
◼
►
That's MuseApp.com to download Muse for free.
00:18:10
◼
►
Go there now, Muse.
00:18:12
◼
►
Because deep thinking doesn't happen in front of a computer.
00:18:15
◼
►
Our thanks to Muse for their support of this show and Relay FM.
00:18:20
◼
►
Hello Cortexans, we are once again, as we have for the last two years, taking this time to
00:18:25
◼
►
raise money for St. Jude Children's Research Hospital from now throughout September, which
00:18:30
◼
►
is Childhood Cancer Awareness Month.
00:18:32
◼
►
I want to tell you a little bit about St. Jude and why it's a special place and why we
00:18:36
◼
►
think it's deserving of your donations.
00:18:38
◼
►
So this is our third consecutive year of supporting the life-saving mission of St. Jude Children's
00:18:43
◼
►
Research Hospital.
00:18:44
◼
►
It's quite simple, they find cures, they save children.
00:18:48
◼
►
St. Jude is leading the way that the world understands, treats and defeats childhood
00:18:52
◼
►
cancer and other life-threatening diseases, but they cannot do it without the help of
00:18:56
◼
►
people like you.
00:18:57
◼
►
Because of generous donors, families never receive a bill from St. Jude for treatment,
00:19:02
◼
►
travel or food because all a family should have to worry about in these situations is
00:19:07
◼
►
just helping their child live.
00:19:09
◼
►
For context, the average cost to treat just one child with acute lymphoblastic leukemia,
00:19:14
◼
►
the most common form of childhood cancer is $203,074.
00:19:19
◼
►
To make this possible, it's a lot of money,
00:19:22
◼
►
it's so much money.
00:19:23
◼
►
- That is a breathtaking amount of money.
00:19:25
◼
►
- It's so much money.
00:19:27
◼
►
And to make this possible,
00:19:29
◼
►
about 80% of the funds necessary to sustain
00:19:31
◼
►
and grow St. Jude must be raised each year from donors,
00:19:35
◼
►
'cause this is an incredibly expensive thing.
00:19:39
◼
►
And the great thing about St. Jude really is not only
00:19:41
◼
►
do they treat the children,
00:19:43
◼
►
They're also a research hospital.
00:19:44
◼
►
So the things that they learn can be used
00:19:47
◼
►
for future cancer patients.
00:19:49
◼
►
And one of my favorite things about St. Jude
00:19:52
◼
►
is this knowledge, they share it with the rest of the world,
00:19:55
◼
►
the entire science community
00:19:56
◼
►
they will share this knowledge with.
00:19:58
◼
►
And that's what I love about them.
00:19:59
◼
►
Like it is one place, it is in Memphis, Tennessee,
00:20:02
◼
►
which is where my co-founder Stephen Hackett lives
00:20:05
◼
►
and we are particularly tied to St. Jude like emotionally
00:20:09
◼
►
because one of his children had treatment at St. Jude
00:20:12
◼
►
and it saved his life.
00:20:14
◼
►
And through our first two fundraising campaigns,
00:20:17
◼
►
the Relay FM community has raised over $800,000
00:20:21
◼
►
for the mission of St. Jude.
00:20:22
◼
►
And this year we want to cross 1 million.
00:20:25
◼
►
So you can help us by donating at stjude.org/relay today.
00:20:29
◼
►
This year, people who donate over $100
00:20:31
◼
►
will get an exclusive Relay FM sticker of thanks pack
00:20:34
◼
►
at the end of the campaign,
00:20:35
◼
►
just as a little thank you from us.
00:20:37
◼
►
Let's cure childhood cancer together.
00:20:40
◼
►
Million dollars this year, I wanna do it.
00:20:43
◼
►
We've set our goal at $333,333.33,
00:20:47
◼
►
'cause it's the third one.
00:20:50
◼
►
But when we hit $196,000 raised,
00:20:54
◼
►
we've done a million over three years.
00:20:57
◼
►
And when you think about it,
00:20:58
◼
►
it's an incredible amount of money.
00:21:00
◼
►
- It's breathtaking.
00:21:00
◼
►
- It's a drop in the ocean, really.
00:21:03
◼
►
But looking at that, that's like five children
00:21:05
◼
►
whose lives could be saved with that money.
00:21:08
◼
►
And that's kind of an incredible thing.
00:21:10
◼
►
- I also think the thing that you mentioned
00:21:12
◼
►
about the research being shared
00:21:13
◼
►
is just much less common in science
00:21:17
◼
►
than people think it is. - It really is.
00:21:19
◼
►
It really is so uncommon. - It's shockingly rare.
00:21:22
◼
►
- The reason we talk about it
00:21:23
◼
►
is because they are abnormal in what they do here, right?
00:21:26
◼
►
Like they share their science.
00:21:28
◼
►
They don't keep it to themselves
00:21:31
◼
►
and try and make money from it.
00:21:32
◼
►
They share it.
00:21:34
◼
►
- But you were attempting to do a calculation there
00:21:36
◼
►
of like children per hundred thousand dollars.
00:21:39
◼
►
I think that argument doesn't apply very well to St. Jude
00:21:42
◼
►
precisely because of this fact that they share the knowledge
00:21:46
◼
►
that they're able to get,
00:21:47
◼
►
which has a big multiplying effect for dollars donated.
00:21:52
◼
►
And I cannot believe that Relay is approaching
00:21:55
◼
►
a million dollars for St. Jude.
00:21:57
◼
►
Like it's an unbelievable number.
00:21:59
◼
►
And I think it's great.
00:22:01
◼
►
Like I really hope that we get there this year
00:22:05
◼
►
with that fundraiser. It's just, it's a lot of money and it really shows the generosity of
00:22:12
◼
►
all of the Relay listeners.
00:22:13
◼
►
Yeah, and it really is. And we have continued to be blown away by it every year, and I hope
00:22:18
◼
►
that people will continue to donate. It's at stjude.org/relay where you can donate today.
00:22:24
◼
►
And of course, we're continuing the tradition as part of this campaign through September.
00:22:29
◼
►
We're going to be holding Podcastathon 3.
00:22:33
◼
►
It is happening on September 17th,
00:22:35
◼
►
from 12 to 8 p.m. Eastern time.
00:22:38
◼
►
We're doing two hours more this year.
00:22:40
◼
►
It's an eight hour Podcastathon.
00:22:42
◼
►
- Oh my God.
00:22:44
◼
►
- We've done six years the first year,
00:22:46
◼
►
we were supposed to do six years the second year last year,
00:22:48
◼
►
but we did seven.
00:22:49
◼
►
- I presume you mean six hours.
00:22:51
◼
►
You said we did six years the first year.
00:22:56
◼
►
- Wait a second.
00:22:56
◼
►
- It just feels like six years when you're live.
00:22:58
◼
►
Six hours the first year, seven hours the second year,
00:23:02
◼
►
'cause we were close to meeting our goals,
00:23:04
◼
►
so we just kept going until we did it.
00:23:06
◼
►
- Oh, that's right, yes, that's right, I remember.
00:23:08
◼
►
- We're doing eight hours this year.
00:23:10
◼
►
We have so many things planned.
00:23:12
◼
►
We have multiple sets of plans.
00:23:15
◼
►
Maybe we're remote, maybe we're in person.
00:23:17
◼
►
We actually don't know at this point,
00:23:19
◼
►
but I've got the balloon room standing by
00:23:23
◼
►
in case I'm gonna be here in Mega Studio again.
00:23:25
◼
►
I'm super excited.
00:23:26
◼
►
We're gonna have tons of guests,
00:23:27
◼
►
loads of great stuff planned. So it's going to be on September 17th from 12 to 8pm US
00:23:33
◼
►
Eastern Time at twitch.tv/relayfm. If you go to twitch.tv/relayfm now and click the
00:23:39
◼
►
follow button you'll be alerted when things go live. So I'm super excited about the Podcastathon
00:23:44
◼
►
and to be once again raising money for such an incredible cause. Please donate
00:23:48
◼
►
at stjude.org/relay.
00:23:50
◼
►
So we have mentioned Cortex Animated a bunch of times in the past. These are wonderful
00:23:54
◼
►
videos created by H.M. Bhutte that we put on our YouTube channel, the Cortex YouTube
00:23:59
◼
►
channel. And every month they send us a video and we take a look and then we approve it
00:24:05
◼
►
and we upload it.
00:24:06
◼
►
Yeah, they're delightful. It's delightful every time.
00:24:08
◼
►
They're fantastic. They're all just incredible and I'm so pleased that we're able to make
00:24:11
◼
►
these happen with them. They do just a superb job. This time they were like, "This one's
00:24:16
◼
►
going to take me a little bit longer." We're like, "All right." And so they posted the
00:24:20
◼
►
video with some flashbacks in it. I don't want to spoil it all, but it's worth watching.
00:24:25
◼
►
Basically because if you remember on our previous episode, me and Grey were like dumbfounded
00:24:30
◼
►
with discovering The Effective Executive because we could not remember this book at all. And
00:24:35
◼
►
I still don't remember it.
00:24:36
◼
►
It was on your Kindle, yeah. We're like, "What is this?"
00:24:39
◼
►
In the episode for The Effective Executive, at the end of it, you say,
00:24:44
◼
►
"Don't let it tap to blue mana and cast 'forget on me' again. If I suggest it again, you have
00:24:49
◼
►
to remember, Myke, that we've already read it.
00:24:51
◼
►
And then I say, "Well, I will remember because there was something quite unique about this
00:24:57
◼
►
So I don't know exactly what has happened here, why we were convinced we needed to remember
00:25:03
◼
►
this book and then didn't.
00:25:05
◼
►
But that's... also, is that a magic joke?
00:25:08
◼
►
I was about to say, that is a magic reference.
00:25:11
◼
►
I enjoy this on several levels because this is clearly the thing that I would do every
00:25:17
◼
►
couple of years, which is just, "Oh, I can't, I can't get back into magic, but let me just,
00:25:22
◼
►
let me just read up a little bit on what's going on in the magic world."
00:25:25
◼
►
Let me just think about it a little.
00:25:26
◼
►
"Let me just think about it a bit."
00:25:28
◼
►
That's the same.
00:25:29
◼
►
"I'm not gonna, I'm not gonna push this button, but I'm gonna put my finger on it and see
00:25:33
◼
►
if it has a little give before it clicks, you know, like that kind of, that kind of thing."
00:25:38
◼
►
And so I enjoy this because the very fact that I would say that sentence indicates to
00:25:41
◼
►
me that I was at one of the heights of perhaps trying to get sucked back in, but then backing
00:25:46
◼
►
away, but so yes, I like it because I made a magic reference about neither of us should
00:25:52
◼
►
let a spell be cast upon us so that we do not remember this book. You are then confident
00:25:57
◼
►
that you will remember at the very least and then flash forward whatever it is, a couple
00:26:02
◼
►
years, neither of us had any memory of any part of this.
00:26:06
◼
►
I still don't remember it. I have no memory of this book.
00:26:10
◼
►
No, I don't, I don't remember a thing about this book.
00:26:13
◼
►
I have a, I have a guess.
00:26:14
◼
►
My, my, my best guess is that it was one of those books where we said something
00:26:22
◼
►
like, oh, it must've been really influential at the time because we've
00:26:27
◼
►
heard all of the ideas in other places.
00:26:29
◼
►
And so it makes the original seem really boring and unnoteworthy, even
00:26:35
◼
►
though maybe it was the thing that set the trend at the time.
00:26:37
◼
►
That's my best guess.
00:26:39
◼
►
Is this a foreshadowing of today's episode, Grey?
00:26:42
◼
►
Is this a foreshadowing?
00:26:43
◼
►
- I don't know what you could possibly mean.
00:26:47
◼
►
That's my best guess about,
00:26:49
◼
►
because here's the thing,
00:26:51
◼
►
you think about any kind of media,
00:26:53
◼
►
the best things are the best things.
00:26:56
◼
►
The worst things are also kind of the best in their own way
00:27:00
◼
►
because at least you can remember them.
00:27:02
◼
►
And the worst things are the ones
00:27:04
◼
►
that are actually dead in the middle and just boring.
00:27:06
◼
►
- The true worst things.
00:27:07
◼
►
- Yeah, the true worst things are the things
00:27:10
◼
►
that score five out of 10, rather than one or 10 out of 10.
00:27:15
◼
►
Five is the worst, 10 is the best, one is the next best,
00:27:19
◼
►
as far as the order of things.
00:27:21
◼
►
So that's my guess about The Effective Executive,
00:27:24
◼
►
what could make it so completely forgettable,
00:27:27
◼
►
but I don't know, I have no idea.
00:27:30
◼
►
- This episode of Cortex is brought to you
00:27:34
◼
►
by our good friends at Memberful.
00:27:36
◼
►
Memberful is the easiest way to sell memberships to your audience, used by the biggest creators.
00:27:43
◼
►
Generate sustainable recurring income while diversifying your revenue stream.
00:27:48
◼
►
You might have heard us talk about Moretex, which is part of the Relay FM membership program,
00:27:52
◼
►
but what you might not know is that Memberful is the platform that we use to power it all.
00:27:56
◼
►
They make it super easy for us to generate that extra revenue stream, whilst also delivering
00:28:00
◼
►
bonus content to our members.
00:28:03
◼
►
I really love being able to use Memberful.
00:28:05
◼
►
It's so easy for us to integrate it with the platforms that we use and make it super
00:28:09
◼
►
easy for our listeners to get additional content and to have ads removed in Moretex.
00:28:14
◼
►
Memberful makes it so easy for us to keep track of the people that are signing up and
00:28:18
◼
►
to integrate with other platforms and systems like Discord so we can have that integration
00:28:22
◼
►
there as well.
00:28:23
◼
►
Maybe you're already producing content and relying on advertising or other means of income.
00:28:28
◼
►
Memberful makes it easy to diversify that income with everything you need to run a membership
00:28:32
◼
►
program of your own, including custom branding, gift subscriptions, Apple Pay, free trials,
00:28:37
◼
►
private podcasts and tons more, while leaving you with full control and ownership of everything
00:28:42
◼
►
that relates to your audience, brand and membership.
00:28:45
◼
►
If you're a content creator, Memberful can help you monetise that passion.
00:28:48
◼
►
Get started for free at memberful.com/cortex, there's no credit card required.
00:28:53
◼
►
That's memberful.com, M E M B E R F U L dot com slash cortex, go there right now and check
00:28:59
◼
►
it out, it could be the start of something exciting.
00:29:01
◼
►
Our thanks to Memberful for their support of this show and Relay FM.
00:29:06
◼
►
Alright, Cortex Book Club time.
00:29:08
◼
►
- Cortex Book Club time.
00:29:09
◼
►
- "Thinking Fast and Slow" by Daniel Kahneman.
00:29:13
◼
►
I don't know if you knew, but he was a Nobel Prize winner, Grey, did you know that?
00:29:15
◼
►
- I thought this book won the Nobel Prize, isn't that what this is?
00:29:18
◼
►
- That's what the cover leads me to believe.
00:29:22
◼
►
[deep inhale]
00:29:25
◼
►
We have a bit of a problem with this book.
00:29:26
◼
►
- Oh, you-- Myke, you don't know what I think about this book.
00:29:30
◼
►
You don't have any idea.
00:29:31
◼
►
No, I never said that. I said there is a problem with this book.
00:29:34
◼
►
There is a problem with this book.
00:29:35
◼
►
The problem with this book, for me,
00:29:39
◼
►
happened on the Reddit thread of our last episode.
00:29:44
◼
►
Because we had a bunch of Cortexans say,
00:29:47
◼
►
"Oh boy, that's a dense book."
00:29:50
◼
►
And my brain said, "I don't want to read this anymore."
00:29:56
◼
►
So I really, really struggled to get started with this one.
00:30:01
◼
►
So what you're saying is when I requested that we pushed back the recording date of
00:30:05
◼
►
this episode by a week, you had no complaints about that because you probably hadn't even
00:30:11
◼
►
started the book.
00:30:12
◼
►
It helped me massively.
00:30:13
◼
►
I'd started it but I'd not gotten very far at all.
00:30:17
◼
►
And then there was like a whole week where I couldn't listen because we were traveling
00:30:20
◼
►
a little bit and I just, it was, I was so happy because otherwise I didn't know what
00:30:26
◼
►
I was gonna do.
00:30:30
◼
►
Let's just say it's not a book where you want to listen to all 20 hours in one day, for
00:30:35
◼
►
That would not be a pleasant experience.
00:30:36
◼
►
Did you try and do that?
00:30:40
◼
►
No, I didn't do that.
00:30:41
◼
►
But I partly needed to push back the date for similar reasons where I was looking at
00:30:46
◼
►
the number of things I needed to do between then and the recording date and the number
00:30:50
◼
►
of hours I had left in the book, which was something like 17 at that point, and I thought
00:30:56
◼
►
- I think we were at a very similar place at that point,
00:30:59
◼
►
to be honest.
00:31:02
◼
►
- Thought I'm gonna have a real problem.
00:31:05
◼
►
- The Cortexans were not wrong.
00:31:07
◼
►
This is an incredibly dense book.
00:31:09
◼
►
And it left me with a feeling,
00:31:12
◼
►
which I cannot believe I felt,
00:31:14
◼
►
where I missed the bat (beep) bananas stories
00:31:19
◼
►
from the other books.
00:31:21
◼
►
The things that would make me the most angry
00:31:25
◼
►
when we come to the show,
00:31:26
◼
►
like this is so annoying, why are you wasting my time
00:31:28
◼
►
with these stupid examples?
00:31:30
◼
►
They were the things I ended up missing.
00:31:33
◼
►
'Cause the problem with this book is, for most of it,
00:31:36
◼
►
I cannot attach to it because it's so dry.
00:31:40
◼
►
It is so dry and dense and there's just so much stuff.
00:31:45
◼
►
It is not a book to try and read quickly
00:31:50
◼
►
and I actually think this is really not a book for audio.
00:31:54
◼
►
Yeah, so what you're saying is you missed the E-Myth Revisited-style stories of
00:32:02
◼
►
"Oh, I went to a magic hotel!"
00:32:03
◼
►
I think I kind of did, which is so weird, but what I've realized is like, what I want
00:32:09
◼
►
from these books is good information and things that like, give some kind of emotional response
00:32:17
◼
►
Right, like riding on a motorcycle with your 17 children in Hawaii.
00:32:20
◼
►
All on the back.
00:32:21
◼
►
Right, all on the back and you go "Wait, how does that work?
00:32:23
◼
►
I don't understand.
00:32:24
◼
►
- But at least it gives me,
00:32:27
◼
►
I can imagine things or whatever.
00:32:29
◼
►
Daniel Kahneman loves an experiment
00:32:32
◼
►
more than any other human being alive.
00:32:36
◼
►
I feel like everything is an experiment.
00:32:38
◼
►
Did this experiment, did this experiment,
00:32:41
◼
►
looked at the pupils dilating, did this experiment.
00:32:43
◼
►
Hey, what about this experiment?
00:32:45
◼
►
There's so many of them and it's not,
00:32:48
◼
►
it doesn't grab me in the same way.
00:32:51
◼
►
I found it for that reason kind of really hard to try and get through.
00:32:56
◼
►
Yeah, so, oh Myke, I've got some things to tell you about those experiments later.
00:33:03
◼
►
I will also agree that, so when we discuss these books, you normally read the audiobook
00:33:08
◼
►
and I'm always like, "Oh Myke, it's a terrible mistake, you shouldn't read the
00:33:11
◼
►
Because these sorts of books are made for skimming, you have to be able to skim them.
00:33:18
◼
►
Because my schedule in the next month has radically changed at the last moment, I found
00:33:22
◼
►
myself extremely short on time in the last two weeks and I realized I'm gonna have to
00:33:29
◼
►
go through this book as an audiobook.
00:33:30
◼
►
I just don't have the time to sit down and read through it.
00:33:34
◼
►
And very quickly I realized, oh, this isn't just the normal situation where these books
00:33:44
◼
►
are not good in audio form.
00:33:46
◼
►
This book is particularly brutal in audio form.
00:33:51
◼
►
I found myself in this constantly frustrated situation where I knew the only time that
00:33:58
◼
►
I could get through it was when I was doing other things so I could listen in audio form
00:34:02
◼
►
while simultaneously knowing that I could be getting through the book easily five times
00:34:08
◼
►
faster if I was actually reading it because for reasons we'll get into later a lot of
00:34:14
◼
►
this stuff I just heard before or found completely unremarkable. And there was one part in particular
00:34:22
◼
►
where I did drop out and I read two chapters because I'm like, "I'm willing to bet I know
00:34:28
◼
►
what's in these chapters!" And so I was like, "Let me just quickly jump over to the actual
00:34:31
◼
►
book." It's like, "Okay, chapters three and four, reading in quotation marks, but actually
00:34:37
◼
►
skim reading very quickly and like blasting through that section." But that was the only
00:34:41
◼
►
part where I was able to do that.
00:34:42
◼
►
And then I had to get back into the audio book and yeah, it is a
00:34:48
◼
►
brutal book in audio form.
00:34:50
◼
►
I think it's a particularly bad one because like you said, there are, well,
00:34:55
◼
►
I do have complaints about some of his stories because I think that
00:35:00
◼
►
there are stories in this book, but they're all the same kind of story
00:35:04
◼
►
that I find really infuriating where he tells you a little bit about some
00:35:08
◼
►
colleague who like did this other thing.
00:35:11
◼
►
And that to me is, I don't know, it's just an infuriatingly, I don't know how to put
00:35:16
◼
►
this, but it's a little bit like he's really making sure to give credit and
00:35:20
◼
►
make all of his colleagues sound great.
00:35:22
◼
►
And so he constantly includes references to like, Oh, this is, this is from my
00:35:26
◼
►
genius colleague who he's so much smarter than me, we work together on this thing,
00:35:30
◼
►
but it's mostly him and like, he's so close.
00:35:33
◼
►
And like, it's like, I don't care.
00:35:34
◼
►
I don't care who did the thing.
00:35:36
◼
►
Like, I just care about the idea.
00:35:38
◼
►
I don't really care that this one came from Chicago and this one came
00:35:41
◼
►
from the University of Illinois.
00:35:43
◼
►
And like, oh, well that team at Illinois is great.
00:35:45
◼
►
Like, I don't care about that at all, but I do think that that is a, um,
00:35:50
◼
►
I'm going to put it this way.
00:35:51
◼
►
I think it is a side effect of defensive writing on the part of an academic who
00:36:00
◼
►
is trying to write a popular book.
00:36:05
◼
►
Is like, this is, this is like the popular writing version of citing
00:36:11
◼
►
where the work comes from.
00:36:13
◼
►
So he's trying not to just put in a little footnote that says like
00:36:16
◼
►
Smith et al University of Hawaii.
00:36:20
◼
►
He's instead trying to tell you a little bit about the people
00:36:23
◼
►
at the University of Hawaii.
00:36:24
◼
►
But, you know, just like when you watch the behind the scenes, uh, for making
00:36:28
◼
►
a movie, it's like, Oh, spoiler.
00:36:30
◼
►
Everyone's just got great things to say about everyone else.
00:36:33
◼
►
It's the same thing here academically.
00:36:35
◼
►
Like I didn't hear one bad thing about one colleague.
00:36:38
◼
►
I found that kind of stuff infuriating and also just made it very difficult to
00:36:43
◼
►
listen to because it's like, I know if I was reading this, as soon as I would
00:36:46
◼
►
hit one of those paragraphs, I would just jump right to the next paragraph and be
00:36:49
◼
►
like, yeah, yeah, just tell me the thing.
00:36:50
◼
►
I don't care about the person behind the thing.
00:36:52
◼
►
So I did do something with this book that I haven't done before.
00:36:56
◼
►
I didn't read all of it.
00:36:57
◼
►
Well, I listened to the first half in entirety.
00:37:00
◼
►
It's cut into a bunch of sections,
00:37:02
◼
►
this book, and in about section three it really lost me.
00:37:07
◼
►
I'm gonna get into why in a little bit later on.
00:37:11
◼
►
So I then started basically just jumping around.
00:37:15
◼
►
I would listen to something,
00:37:16
◼
►
get what I think is the main idea
00:37:18
◼
►
and then once he started going into all of the experiments
00:37:21
◼
►
that he'd done to prove his point,
00:37:22
◼
►
I would jump forward to the next section.
00:37:25
◼
►
So I feel like I was still getting the main ideas
00:37:28
◼
►
but I wasn't sitting through the supporting materials
00:37:32
◼
►
which again, this is a very normal thing with these types of books, but I think I prefer the presentation style of the fake person to the
00:37:39
◼
►
"Hey, let me tell you about the experiment."
00:37:41
◼
►
So I didn't listen to all of it.
00:37:43
◼
►
Yeah, the fake person is worse, but less boring, right?
00:37:48
◼
►
Which has some redeeming characteristics.
00:37:51
◼
►
It goes back to that good, worse, and true worst thing that we were talking about, right?
00:37:56
◼
►
Yeah, I was like, "Oh, I'm trying to remember what it is."
00:37:57
◼
►
I just, um, just recently I just, I did the thing that very rarely
00:38:02
◼
►
happens where I hate read a book.
00:38:04
◼
►
Like a book made me so angry that I finished it.
00:38:07
◼
►
God, I can't remember what it was.
00:38:14
◼
►
This is just a couple of months ago.
00:38:15
◼
►
It doesn't happen very often, but when it does, it's a, it's an odd experience.
00:38:18
◼
►
I'm like, I hate this book so much, but I'm going to finish it.
00:38:21
◼
►
But you know what?
00:38:23
◼
►
That's an experience.
00:38:25
◼
►
You know, I felt something.
00:38:26
◼
►
Whereas with the boring stuff it's just, it's so much worse in a completely unremarkable way.
00:38:32
◼
►
So, I've got so many complaints it's hard to know where to start.
00:38:40
◼
►
Can we talk about the actual good thing in the book? I want to sandwich this a little bit.
00:38:44
◼
►
Okay, yes. Let's talk about the good thing in the book,
00:38:47
◼
►
and then I will tell you my story about this book.
00:38:51
◼
►
And then I have a bunch of more complaints.
00:38:55
◼
►
Look, the thing about this book, the reason this book is successful, is because it has something
00:38:59
◼
►
which is genuinely very good, which is system one and two. This is the thing that makes this book
00:39:05
◼
►
what it is, this is the reason why this book is in so many businesses, it's the reason why
00:39:11
◼
►
when you join an advertising agency they will give you this book, like this is a very normal thing
00:39:15
◼
►
that happens. It's effectively saying that our brains work in one of two ways. There is system
00:39:24
◼
►
one, which is automatic responses to things. Like, for example, if you hear a
00:39:29
◼
►
loud noise, you'll immediately go look, like you turn and look at it. Simple
00:39:33
◼
►
things like driving a car with no traffic on a route that you know, basic
00:39:37
◼
►
sentence structure, all of that kind of stuff. These are just simple things that
00:39:41
◼
►
our brain can do automatically. And then there is system two, which is things
00:39:45
◼
►
that take more effort, things that need orderly steps, things that you have to
00:39:49
◼
►
pay attention to. Like focusing on the voice of a particular person in a loud environment
00:39:55
◼
►
takes effort. Filling out a form that you're unfamiliar with takes effort. Parking your
00:40:01
◼
►
car in a narrow space, right? These are things where you must focus on them. So the book
00:40:08
◼
►
at first focuses on this a lot, and as it gets later the book tangentially relates back
00:40:12
◼
►
to it, which is very strange to me because it feels like the entire book should be about this,
00:40:16
◼
►
but then it seems to go off in these weird areas,
00:40:20
◼
►
which is like, "Oh, and by the way,
00:40:21
◼
►
that's a part of system one."
00:40:22
◼
►
It's like, "All right, thanks Kahneman."
00:40:24
◼
►
But I really like this idea.
00:40:26
◼
►
I like a lot of where it comes from.
00:40:29
◼
►
I like how it can be used and is used a lot in marketing
00:40:34
◼
►
and stuff like that, right?
00:40:35
◼
►
Like system one kind of leads on subliminal messaging
00:40:39
◼
►
and that kind of stuff, right?
00:40:40
◼
►
Like these are the things that people will take advantage of
00:40:43
◼
►
to try and just get these ideas in your head
00:40:46
◼
►
and our system ones can be tricked.
00:40:49
◼
►
Like one of the key examples,
00:40:51
◼
►
this is actually a pretty good one.
00:40:52
◼
►
If somebody told you to think of the word eat
00:40:54
◼
►
and then showed you the letters S, O, then a space, and P,
00:40:58
◼
►
you would immediately think of soup.
00:41:01
◼
►
But if they told you the word clean
00:41:03
◼
►
and then showed you the same thing, you would say soap.
00:41:06
◼
►
Like simple stuff like that.
00:41:08
◼
►
I liked this idea.
00:41:09
◼
►
And there are some other parts that come from it.
00:41:11
◼
►
Like part of system two can be like being in a state of flow
00:41:15
◼
►
when you're working, when you're like really concentrating
00:41:18
◼
►
and you're in it, like all this kind of stuff.
00:41:20
◼
►
I found it really interesting,
00:41:21
◼
►
but that was kind of the entire book for me.
00:41:23
◼
►
- Yeah, I think if I would try to distill down
00:41:27
◼
►
the valuable thing here is,
00:41:31
◼
►
I feel like it never quite says so clearly,
00:41:33
◼
►
but the basic idea is a little bit like,
00:41:37
◼
►
if you need to make a decision that matters,
00:41:41
◼
►
you should notice when you haven't actually thought about it.
00:41:46
◼
►
- That like by default, your brain always wants to use
00:41:51
◼
►
the fast way of thinking.
00:41:53
◼
►
- One of the ways they do pronouncements that I like
00:41:54
◼
►
is saying that like your brain instinctively tries
00:41:57
◼
►
to find a system one answer to a system two question.
00:42:02
◼
►
Like would this person be good for this job?
00:42:05
◼
►
And you immediately look at them
00:42:07
◼
►
and try and judge them based on their appearance.
00:42:09
◼
►
Like that is a system one answer
00:42:11
◼
►
'cause the system two answer is actually doing
00:42:14
◼
►
some research on this person. And like, our brains try and make these impressions
00:42:19
◼
►
very quickly so it doesn't have to work.
00:42:21
◼
►
Like he refers to system two as lazy, which I like.
00:42:26
◼
►
I think it is a good idea to have people realize that the logical part of their
00:42:31
◼
►
brain is in some ways a smart, but lazy slacker and you know, like you need to
00:42:39
◼
►
rouse it at certain moments when it really matters and be like, Hey, pay attention.
00:42:44
◼
►
do the thing that you do. And he has some good examples in there of when are you more
00:42:53
◼
►
likely to fall for this? And I think the best two for me are familiar- they're related,
00:42:59
◼
►
but they're familiarity and availability. That you tend to go with things that you have
00:43:06
◼
►
just heard a bunch. So this is, this is like with marketing, like this is the whole idea
00:43:11
◼
►
of how advertising works in many ways is just repeatedly expose people to the same idea.
00:43:19
◼
►
And the contents of that idea doesn't really matter, it's just that people will then tend
00:43:24
◼
►
to prefer whatever that idea is over alternatives that they are less familiar with. So if you're
00:43:32
◼
►
buying a car, be aware that you're going to be tending towards brands you have heard more
00:43:40
◼
►
versus brands you have heard less and that's not always a logical thing to do. And then
00:43:45
◼
►
availability is a similar sort of thing where you just tend to when thinking of things you
00:43:52
◼
►
think of the most salient examples in your mind of a thing. So stuff that is emotionally
00:43:59
◼
►
resonant will come to mind first. Say either books you really liked or books you really
00:44:04
◼
►
hated like they're easier to remember than say dry books that might be filled with a
00:44:09
◼
►
lot of great information or whatever. So like, I think those two are useful to try to catch.
00:44:17
◼
►
Oh, am I just coming up with a reflexive answer? I'm just saying a thing that I have heard
00:44:23
◼
►
lots of people say, or I'm remembering a case that is emotionally salient or that was recent.
00:44:31
◼
►
I'm not thinking of what is the typical example of this case.
00:44:37
◼
►
So that I feel like that's how I would try to encapsulate what is in the book.
00:44:42
◼
►
I think the things you were touching on there, of course, cognitive ease, which is when our
00:44:47
◼
►
brains make logical jumps because we're familiar with something, even if the answer is not
00:44:51
◼
►
correct, but our memory of thinking that we know something will suffice, right?
00:44:55
◼
►
It's just like a thing that you couldn't know, but because you've experienced something in
00:45:00
◼
►
the past, you will just give an answer to it. And then the exposure effect, which is the
00:45:03
◼
►
more we see something, the more we are likely to feel positive about it.
00:45:07
◼
►
There's this one part that I like too which is talking about the way that our
00:45:11
◼
►
brains expect things differently, and how this can go from system two to system one.
00:45:16
◼
►
Like if something unexpected happens to you, you kind of deal with it with system
00:45:21
◼
►
two, because you have to try and work out what on earth is happening. But then
00:45:25
◼
►
once it's happened once, if it happens again, it becomes much more of a system
00:45:29
◼
►
one thing, and you're more likely to expect it to happen. And he uses an
00:45:32
◼
►
example of how he bumped into a friend in Italy and it was like a big thing, and
00:45:38
◼
►
then bumped into him in London and it was like, well, I see this guy John in
00:45:42
◼
►
different places, so it's not weird to me. The reason this resonated is I have a
00:45:47
◼
►
friend like this, his name's Matt, and multiple times I have bumped into him in
00:45:52
◼
►
places that seem really strange. Once was at a sporting event in New York City.
00:45:58
◼
►
And it was like a huge thing. It was like, "Oh my god, I can't believe you're here."
00:46:01
◼
►
Like, we're sitting two rows away from each other, like, "How wild is this?"
00:46:04
◼
►
And then a couple of weeks ago, we were staying at a hotel in London,
00:46:08
◼
►
and I saw him there, and it wasn't such a surprise.
00:46:10
◼
►
- Right. Oh yeah. - Because I've experienced it.
00:46:12
◼
►
Like, "Oh yeah, Matt's my friend that I see in various places around the world unexpectedly."
00:46:17
◼
►
Yeah, I like that example because I think people can understand that one quite well.
00:46:21
◼
►
That like, just by the nature of life, there is going to be that person who you seem to bump into
00:46:26
◼
►
more frequently than other people and it rapidly becomes, "Oh yeah, that's the person who's
00:46:30
◼
►
everywhere in your brain." It doesn't become remarkable at all.
00:46:33
◼
►
But you have taught me things like this before of like selection effects where like I can see that
00:46:41
◼
►
we're probably quite similar people so we end up being in similar places. There's just a higher
00:46:46
◼
►
percentage of chance that I will see him because we like the same kinds of things.
00:46:51
◼
►
Right. So if I'm going to see anyone it will be Matt.
00:46:54
◼
►
I mean, you have already stumbled upon one of the things that I find quite frustrating
00:46:59
◼
►
with this book, which is, I think, a conflation of like pure math, which he's often talking
00:47:07
◼
►
about, with the reality of social situations, which I feel like he just does not acknowledge.
00:47:16
◼
►
I was losing my mind at one part of this book.
00:47:21
◼
►
I can't wait to find out if it's the same one that I lost my mind over.
00:47:23
◼
►
But before we get there, I was just looking through my notes and I realized
00:47:27
◼
►
there's one other idea, which I think is worth saying that comes out of the book.
00:47:31
◼
►
It's a single paragraph.
00:47:32
◼
►
I've heard it before, but I still think it's like, this is always a good idea to hear.
00:47:37
◼
►
He's talking about how people remember what they have done.
00:47:41
◼
►
And so the example that he uses is like when you ask couples, what percentage
00:47:47
◼
►
of the housework do you think you do?
00:47:50
◼
►
the total will always be over 100%, right?
00:47:53
◼
►
Because both people will say,
00:47:54
◼
►
"Oh, I do 60% of the work
00:47:55
◼
►
and my partner only does 40% of the work."
00:47:58
◼
►
Or if you are in a group work situation in school,
00:48:03
◼
►
it's the same thing.
00:48:04
◼
►
The total amount of work that was done is 200%
00:48:07
◼
►
because each of the five people think that they did 40%.
00:48:09
◼
►
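A minimal sketch of the over-claiming arithmetic here, in Python; the five-person group and the 40% self-estimate are just the illustrative figures from the conversation, not data:

# Each of five group members estimates their own share of the work at 40%.
claimed_shares = [0.40] * 5
total_claimed = sum(claimed_shares)
print(f"Total claimed: {total_claimed:.0%}")  # 200% -- twice as much work as actually exists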
- Who was the leader of this project?
00:48:11
◼
►
We all were.
00:48:12
◼
►
- Right, exactly.
00:48:14
◼
►
I think this is one of those ideas
00:48:16
◼
►
that should be constantly hammered into people's minds, that you are more aware of the things
00:48:25
◼
►
that you do than you are aware of the things that other people do.
00:48:29
◼
►
That is not to say that you can't be in a marriage where one person is a total slacker
00:48:34
◼
►
on the housework, it's not saying everyone does the same thing, but you should just be
00:48:39
◼
►
aware that by default your brain is exceedingly aware of every tiny thing that you have to
00:48:46
◼
►
do and is almost completely oblivious of all of the things that everyone else has to do.
00:48:53
◼
►
And I really do think that one of the prime areas for this is like the employer-employee
00:49:00
◼
►
relationship of like, it's very easy for employees to imagine that their bosses do
00:49:06
◼
►
nothing, like, and that they do all of the work, you know, and it's just like, it's
00:49:11
◼
►
just an interesting situation and it's useful to keep that in mind and I think it's useful
00:49:16
◼
►
in work life when you're on a team, you're like, "Oh, I'm doing everything!" and you're
00:49:20
◼
►
like, "Are you really?" you know, or in a relationship, "Oh, I'm the one who does all
00:49:24
◼
►
the work in this relationship, do you really?" Maybe it is the case, but it is much more
00:49:30
◼
►
likely that you're just over-remembering your own contributions. And, like, this is actually
00:49:36
◼
►
something that I do think about a lot with other people is just like try to remember
00:49:41
◼
►
you always overestimate your own contributions to whatever the partnership is, in any way.
00:49:47
◼
►
So that was one of the few notes that I had, like, this is a great idea. It's one paragraph
00:49:53
◼
►
in the book but I gave it a pink highlight to show to myself like this is the most important
00:49:57
◼
►
- He talks about what you see is all there is,
00:50:01
◼
►
which I liked, which is a system one behavior,
00:50:04
◼
►
like that you just see something
00:50:06
◼
►
and you take in what you see
00:50:09
◼
►
and you make your judgment on it and that's that
00:50:11
◼
►
without actually taking in any sources or information.
00:50:16
◼
►
One of the things that this, which I really liked,
00:50:17
◼
►
is the halo effect.
00:50:19
◼
►
So they give this example of you'll meet somebody at a party
00:50:22
◼
►
and you're talking to someone at the party
00:50:23
◼
►
and you really like them and you think they're interesting.
00:50:26
◼
►
And then later on at that party, someone tells you about
00:50:29
◼
►
a charity that they're involved in and asks,
00:50:31
◼
►
do you know anybody who you think might be interested
00:50:34
◼
►
in donating?
00:50:34
◼
►
And then you immediately go back to person one
00:50:37
◼
►
and think they probably would
00:50:38
◼
►
'cause they seem like a nice person.
00:50:40
◼
►
But you do not know them at all.
00:50:41
◼
►
You just assume they are good and generous and kind
00:50:44
◼
►
because you like them.
00:50:46
◼
►
And I found that as such an interesting point
00:50:50
◼
►
for the modern world today.
00:50:52
◼
►
How many judgments and assessments we make about people
00:50:56
◼
►
just because we like them,
00:50:57
◼
►
without knowing everything about them.
00:50:59
◼
►
And I think this works in multiple ways.
00:51:02
◼
►
I think people need to be much more aware
00:51:04
◼
►
of this these days, that people are complicated
00:51:07
◼
►
and there's a lot to them that you do and do not know,
00:51:11
◼
►
and that it can be helpful to try and remember that
00:51:13
◼
►
when making judgments good and bad.
00:51:15
◼
►
- Yeah, yeah.
00:51:16
◼
►
I think he lightly touches on it in the Halo Effect section,
00:51:19
◼
►
but it's another one of these really under talked about ways
00:51:23
◼
►
that people influence you,
00:51:25
◼
►
which is just their sheer physical attractiveness. It
00:51:27
◼
►
is a kind of halo effect.
00:51:29
◼
►
That someone who is physically attractive
00:51:31
◼
►
gets overrated on all sorts of good qualities
00:51:35
◼
►
in this comical way. - So smart, so generous,
00:51:37
◼
►
so kind. (laughing)
00:51:39
◼
►
- Physical attractiveness is this kind of thing
00:51:41
◼
►
that is just impossible not to halo effect people over.
00:51:46
◼
►
Sort of talking about social realities though,
00:51:49
◼
►
I do think one of the key things
00:51:50
◼
►
about the halo effect though is like,
00:51:53
◼
►
this only applies when you don't really know the person.
00:51:58
◼
►
That, like, this is where like social reality kicks in of like, well, yeah,
00:52:02
◼
►
you can start to make judgments about people when you actually know them, but
00:52:06
◼
►
it is, it is useful to be aware that like, if you meet an attractive person
00:52:10
◼
►
who is paying you attention, your fast thinking part of your brain is going to
00:52:17
◼
►
be like, this person's great at everything, they should be my partner.
00:52:21
◼
►
And also they should be the new Dean of this college and they probably
00:52:26
◼
►
do all of these amazing things.
00:52:27
◼
►
Uh, like you, you just, it's useful to be aware of that when you're in one of those
00:52:31
◼
►
situations, like hold back judgments in the immediate moment until you have actually
00:52:36
◼
►
gathered some more information about the person.
00:52:38
◼
►
Yeah, I'm just looking through here.
00:52:40
◼
►
Do I have anything else that I think is particularly good?
00:52:42
◼
►
From this section, no, like it is after this section where things start to get rough
00:52:49
◼
►
for me, but there are some interesting parts to it.
00:52:52
◼
►
I particularly liked something called
00:52:54
◼
►
the availability cascade,
00:52:57
◼
►
which is when like the media will jump on a thing,
00:53:00
◼
►
making it like a circus,
00:53:02
◼
►
because the more you cover it, the more people care
00:53:04
◼
►
and they overweight unimportant things
00:53:06
◼
►
because people like to learn about them.
00:53:08
◼
►
And this is like stuff that directly appeals to system one.
00:53:12
◼
►
This just, I liked this.
00:53:13
◼
►
It just puts something into words,
00:53:15
◼
►
which we've spoken about before,
00:53:16
◼
►
which I've been dealing with over the last year or two.
00:53:19
◼
►
Like just about the way that news is covered
00:53:21
◼
►
and what's covered and what's important and what isn't.
00:53:24
◼
►
- Yeah, I highlighted that section as well,
00:53:25
◼
►
but my note with that is terrible name for this phenomenon.
00:53:29
◼
►
- Availability cascade, that's horrific.
00:53:32
◼
►
It doesn't make any sense.
00:53:33
◼
►
- Availability sounds like a good thing?
00:53:35
◼
►
Like an availability cascade
00:53:37
◼
►
sounds like a cornucopia of delight.
00:53:39
◼
►
- Cascade is a good word,
00:53:41
◼
►
but availability doesn't fit with even the example he gave.
00:53:45
◼
►
- Right, but it sounds more like it's a good thing
00:53:48
◼
►
than it's a bad thing.
00:53:50
◼
►
- Yeah, oh, I'm so free.
00:53:53
◼
►
- How available am I?
00:53:54
◼
►
Oh man, I've got a cascade of availability.
00:53:56
◼
►
This is amazing.
00:53:58
◼
►
There is a part in the beginning of the book
00:53:59
◼
►
where he talks about this idea,
00:54:02
◼
►
which I think I have a tendency to underrate,
00:54:04
◼
►
but I do think that he's right about,
00:54:06
◼
►
that giving specific vocabulary to certain ideas
00:54:11
◼
►
is helpful in terms of thinking about those ideas.
00:54:15
◼
►
I think it's particularly funny in this book because I would rate Kahneman as well below
00:54:21
◼
►
average at actually coming up with good names for the concepts that he's talking about.
00:54:25
◼
►
What makes it even worse is he doesn't stop naming things.
00:54:29
◼
►
There are like 2,000 different things in this book.
00:54:34
◼
►
This is one of my issues with it is it's too much branding.
00:54:38
◼
►
Like what Kahneman really has is like four books in one here because like he has the
00:54:44
◼
►
one really good idea and then like a bunch of others and it's like just a
00:54:48
◼
►
dartboard of naming stuff constantly.
00:54:51
◼
►
Here's the thing, we don't like Kahneman's book very much, but I mean, dude won the Nobel Prize for something, like...
00:54:57
◼
►
For this book.
00:55:00
◼
►
For this book.
00:55:00
◼
►
No, he's very smart. Very smart.
00:55:04
◼
►
So I think he's trying to do a survey of knowledge in some way, and of a lot of things that he was involved in, but this is where I need to tell you a story you're not
00:55:14
◼
►
gonna like about this book Myke. Are you ready? Yeah.
00:55:17
◼
►
This episode of Cortex is brought to you by Squarespace, the all-in-one platform to build your online presence and run your own business.
00:55:24
◼
►
From websites and online stores to marketing tools and analytics, Squarespace have got you covered.
00:55:29
◼
►
Because they combine cutting-edge design and world-class engineering to make it easier than ever for you to establish your home online and make your ideas a reality.
00:55:38
◼
►
Squarespace has absolutely everything that you're going to need to create a beautiful, modern website of your own.
00:55:43
◼
►
You start with their wonderful, professionally designed templates
00:55:47
◼
►
that use drag and drop tools that you can take advantage of to make your own, customising
00:55:52
◼
►
the look, the feel, the settings, even products that you have on sale with just a few clicks.
00:55:57
◼
►
And all of Squarespace's websites are optimised for mobile, your content automatically adjusts
00:56:01
◼
►
so it's going to look great on any device. I really love that Squarespace has inbuilt
00:56:05
◼
►
analytics so that's all just really easy to do, and their iPad app and iPhone app, they're
00:56:10
◼
►
so good. You can go in and view the important stuff you need, also you can publish content,
00:56:14
◼
►
you can even make changes to your website right from their apps.
00:56:18
◼
►
With Squarespace you'll get unlimited, free hosting, top of the line security and dependable
00:56:22
◼
►
resources that are there to help you succeed. There's nothing to patch or upgrade, they
00:56:27
◼
►
have award winning 24/7 customer support, you can even grab a unique domain name and
00:56:31
◼
►
take advantage of SEO and email marketing tools to get your ideas out there to the world.
00:56:36
◼
►
With Squarespace you can turn your big idea into a new website, showcase your work with their
00:56:41
◼
►
portfolio designs, publish your blog posts, promote your business, announce an upcoming
00:56:44
◼
►
event and so much more.
00:56:46
◼
►
Head to squarespace.com/cortex for a free trial today with no credit card required.
00:56:50
◼
►
And when you're ready to launch use the offer code CORTEX and you'll save 10% off
00:56:54
◼
►
your first purchase of a website or domain.
00:56:57
◼
►
That's squarespace.com/cortex and when you decide to sign up use the offer code CORTEX
00:57:01
◼
►
to get 10% off your first purchase and show your support for this show.
00:57:05
◼
►
Thanks to Squarespace for the continued support of Cortex and Relay FM.
00:57:10
◼
►
Okay, so here Myke is my experience with reading this book.
00:57:13
◼
►
So I'm listening to it and like we said it starts out with here's the idea, there
00:57:19
◼
►
are these two ways of thinking, you know, and that's like chapters one and two is
00:57:23
◼
►
getting you started on the book.
00:57:25
◼
►
As it goes on, one of the first things he talks about is ego depletion.
00:57:31
◼
►
This idea that you have a finite amount of willpower, you can only expend it on
00:57:35
◼
►
so many things, and that's partly because of the fact that type 2 thinking is taxing.
00:57:41
◼
►
I particularly enjoyed how he spent a really long time making sure that you believed him
00:57:46
◼
►
that mentally difficult work made you tired. I don't know about you, but I found that
00:57:51
◼
►
section a little bit like he was telling me running on a treadmill would make me tired.
00:57:55
◼
►
He's like, "Did you know if I had you do mental math really fast in a lab, you couldn't
00:58:03
◼
►
do it indefinitely?" Like, yeah.
00:58:06
◼
►
WOOOAAAH! Give this man a second Nobel Prize!
00:58:11
◼
►
And he was obviously really chuffed with this particular test he came up with about adding
00:58:15
◼
►
one to numbers, to sequences of numbers really quickly, because he insisted many times, like,
00:58:20
◼
►
"Don't just read this part of the book, you gotta try this!"
00:58:23
◼
►
Did you try it?
00:58:24
◼
►
No, of course not, I was walking around listening to an audio book.
00:58:26
◼
►
No I didn't either, I was like, "I'm just gonna wait for him to get to the result,
00:58:30
◼
►
because I'm not interested in doing the sums.
00:58:33
◼
►
And guess what?
00:58:34
◼
►
The result was, this'll make you real tired.
00:58:36
◼
►
Like, yeah, duh.
00:58:37
◼
►
That's why I didn't want to do it.
00:58:39
◼
►
I was system one-ing all my way through that section, dude.
00:58:44
◼
►
I can see where this one was going, Kahneman, man.
00:58:48
◼
►
Yeah, lazy slacker, system two looks up and is like, nice try.
00:58:52
◼
►
But so, so he starts talking about ego depletion, which for various reasons,
00:59:00
◼
►
we don't need to get into because I think ego depletion is like a whole other thing for another time.
00:59:04
◼
►
But the fact that he's talking about ego depletion raises a little bit of a yellow flag in my brain.
00:59:10
◼
►
And I go, "Hmm, uh-oh."
00:59:12
◼
►
I was like, "Oh, well, whatever. Let's just, we're gonna keep reading."
00:59:16
◼
►
Then we get to chapter four, which covers a topic called priming.
00:59:23
◼
►
And this is where I thought, "Oh no."
00:59:28
◼
►
And I bailed out of the audiobook to read the physical book because I thought, "I don't know how this is going to go."
00:59:34
◼
►
And so I skimmed through the section on priming, which I did not like for reasons that we'll get to.
00:59:42
◼
►
And I thought, "Okay, it's getting a little worse here. I'm going to jump back into the audiobook and keep listening."
00:59:49
◼
►
listening, and it just kept going on and on with all of these experiments in the social
00:59:55
◼
►
sciences that you were talking about. Kahneman never saw or participated in a behavioral
01:00:02
◼
►
economics experiment that he didn't want to tell you about, and it's like, "Here
01:00:06
◼
►
are all of these experiments!"
01:00:09
◼
►
I do feel like I've lived his entire career in the last couple of weeks.
01:00:13
◼
►
Yeah. It's a book that makes you feel like you're the other person, for sure, because
01:00:18
◼
►
you even have to participate in office chitchat between colleagues, where he tells you about
01:00:23
◼
►
like what this colleague thought, and then you, the colleague, was surprised and thought
01:00:26
◼
►
about it some more and because he's such a clever clogs he obviously realized his mistake
01:00:31
◼
►
because he's smarter than me, the author, or whatever.
01:00:33
◼
►
Oh, by the way, this is one of the most often cited papers in this field.
01:00:47
◼
►
super popular papers, their citation numbers are real low.
01:00:51
◼
►
- Like 400, it's like, all right, bud.
01:00:55
◼
►
It's not that many.
01:00:56
◼
►
- Right, but this actually leads directly
01:00:58
◼
►
into the problem that I have, which is like,
01:01:00
◼
►
oh, the most popular behavioral economics paper
01:01:02
◼
►
is cited 400 times.
01:01:04
◼
►
Like, it's actually quite, that's just quite a small number.
01:01:06
◼
►
It should raise some red flags in your brain.
01:01:08
◼
►
But so as I kept reading, I kept coming across
01:01:10
◼
►
all this behavioral economics and behavioral science
01:01:14
◼
►
and psychology experiments.
01:01:16
◼
►
And at one point, probably about a quarter of the way into the book, I thought, "I have
01:01:20
◼
►
to check when this book was published."
01:01:22
◼
►
So I go look.
01:01:24
◼
►
So the book is published in 2011 or 2010.
01:01:29
◼
►
And so this book was published right before a thing called the replication
01:01:40
◼
►
crisis came through like a, like a hurricane to destroy the social sciences.
01:01:48
◼
►
Are you familiar with the phrase, the replication crisis?
01:01:51
◼
►
Is this a thing you've ever heard?
01:01:53
◼
►
I have never heard of this before.
01:01:56
◼
►
So I think more people should know about the replication crisis because it is a
01:02:02
◼
►
big deal in the modern world, but I'm also slightly sad to tell you about it,
01:02:07
◼
►
Myke, because I know it will make you sad.
01:02:09
◼
►
Can I read from Wikipedia, which is a thing that I seem to keep doing recently?
01:02:14
◼
►
Yes, go right ahead.
01:02:15
◼
►
The replication crisis is an ongoing methodological crisis in which it has been found that many
01:02:20
◼
►
scientific studies are difficult or impossible to replicate or reproduce.
01:02:25
◼
►
The replication crisis most severely affects the social sciences and medicine, while survey
01:02:29
◼
►
data strongly indicates that all of the natural sciences are probably implicated as well.
01:02:34
◼
►
The phrase was coined in the early 2010s as part of a growing awareness of the problem.
01:02:39
◼
►
The replication crisis represents an important body of research in the field of metascience.
01:02:43
◼
►
Alright, early 2010s.
01:02:46
◼
►
Replication crisis has been on my radar for a really long time, and it's been quite an
01:02:51
◼
►
interesting thing to sort of follow how the scientific world has attempted to deal with
01:02:56
◼
►
this, but as the description there says, there are two particular areas that are just destroyed
01:03:04
◼
►
by this, and it is social sciences in particular, and medicine.
01:03:09
◼
►
And the replication crisis, you can summarize it as saying: an enormous percentage of these
01:03:18
◼
►
studies that you hear about, like in this sort of book where they say, "We did an experiment,
01:03:24
◼
►
and we had a basket of food, and we put eyeballs above the basket of food, and people stole
01:03:29
◼
►
from it less."
01:03:30
◼
►
These experimental results either don't replicate, which means when other people try to do the
01:03:36
◼
►
same experiment, they do not get the same results, or they have literally never been
01:03:41
◼
►
attempted to be replicated, which tells you almost nothing about the validity of the statement.
01:03:48
◼
►
And the absolute epicenter of what started the replication crisis was all of the social
01:03:56
◼
►
science work on priming because priming, this whole thing in chapter four, is this idea
01:04:04
◼
►
that kind of like spread through the greater society.
01:04:07
◼
►
If you show people images of older people, they'll walk more slowly down a hallway, right?
01:04:14
◼
►
Or you can make people act more virtuous if you have them swear on a Bible.
01:04:20
◼
►
Like this concept of priming that like you're putting ideas into someone's head and then
01:04:25
◼
►
they will act more like the ideas that you just put in their head. And this whole field was
01:04:29
◼
►
just destroyed of like, none of this is real. None of this replicates. You cannot prove
01:04:37
◼
►
that this effect exists. Or if it does exist, it's so incredibly infinitesimally small,
01:04:45
◼
►
that the results you're getting can't possibly be real. So this is the replication crisis.
01:04:50
◼
►
It seems like he's involved in this, Kahneman.
01:04:54
◼
►
So I don't know what the deal is.
01:05:00
◼
►
Let me say this.
01:05:01
◼
►
So once we got to this section of priming and we continued with all of these really
01:05:06
◼
►
cute sort of media-friendly experiments afterward, I just found myself in this position of it
01:05:13
◼
►
is incredibly difficult to take anything in this book at its word.
01:05:18
◼
►
I'm so pleased that you say this.
01:05:20
◼
►
Why do you say that?
01:05:21
◼
►
I was getting so angry at this section.
01:05:23
◼
►
Which section in particular?
01:05:24
◼
►
There are these situations where he creates fake people and personality profiles.
01:05:30
◼
►
We can get to the fake people.
01:05:31
◼
►
I hate the fake people section so much.
01:05:32
◼
►
Alright, great.
01:05:33
◼
►
Cool, cool, cool, cool.
01:05:34
◼
►
Alright, let's come back to that.
01:05:35
◼
►
We'll come back to that.
01:05:36
◼
►
Sorry, sorry, sorry.
01:05:37
◼
►
Finish what you were going to say.
01:05:38
◼
►
We're not even at the fake people, Myke.
01:05:41
◼
►
Cool, cool, cool, cool.
01:05:42
◼
►
Like, we're at the real science part, which is before all of the fake people.
01:05:47
◼
►
So I think Daniel Kahneman established a lot of, like the impression that I get from some
01:05:53
◼
►
of the stuff that he talks about in this book is I feel like he did a lot of the foundational
01:05:58
◼
►
work in the basic concept of the irrationality of human decision-making.
01:06:04
◼
►
And like, I have absolutely no argument with him there.
01:06:08
◼
►
Like I had this really weird experience reading the book where it was, "Hey, Daniel Kahneman,
01:06:15
◼
►
I'm totally on board with a lot of the ideas that you're expressing.
01:06:20
◼
►
I think that the environment around a person has a huge amount of
01:06:23
◼
►
impact on what they actually do.
01:06:25
◼
►
A concept we've discussed on the show many times, like you can
01:06:28
◼
►
influence your environment and your environment influences you.
01:06:31
◼
►
I'm totally on board for people don't make rational decisions all the time.
01:06:37
◼
►
And there are a huge number of ways that people just think lazily, and it's useful
01:06:44
◼
►
to try to put language and terms that express the ways that people think lazily.
01:06:51
◼
►
It's like, I'm on, I'm on board with you here, dude.
01:06:54
◼
►
But the problem is almost everything that you're using to back this up is like,
01:07:02
◼
►
scores very high on my bullsh*tometer.
01:07:05
◼
►
And that bullsh*tometer is not uncalibrated because this book is right at the heart of
01:07:11
◼
►
one of the biggest problems in the scientific world
01:07:15
◼
►
in the last 10 years.
01:07:17
◼
►
I have a suspicion that part of the reason
01:07:20
◼
►
this book is so popular is it must have been
01:07:24
◼
►
one of the last books published
01:07:28
◼
►
before it would have become very difficult
01:07:31
◼
►
to publish a book filled with all of these examples.
01:07:35
◼
►
- So that it is actually the book that contains
01:07:38
◼
►
the maximum density of examples of these kinds of stories.
01:07:43
◼
►
Because I think even a year or two later,
01:07:48
◼
►
more editors might have flagged this up of like,
01:07:50
◼
►
"Hey, how sure are you about this priming stuff?
01:07:53
◼
►
Like, have you looked into this?"
01:07:55
◼
►
One of my other complaints is,
01:07:56
◼
►
I do think the book lacks actionable things
01:07:59
◼
►
to do with some of this.
01:08:00
◼
►
Like there's a lot of stuff
01:08:01
◼
►
that's just extremely unactionable.
01:08:03
◼
►
But one of the things I've really become
01:08:04
◼
►
an increasing fan of over the years
01:08:06
◼
►
is try to quantify your thinking in terms of bets.
01:08:10
◼
►
And I was, you know,
01:08:11
◼
►
we were getting ready for the show this morning,
01:08:13
◼
►
I was trying to think like,
01:08:15
◼
►
how confident am I in making statements
01:08:19
◼
►
about the failure to replicate of studies in this book?
01:08:24
◼
►
Like, you know, I'm not an expert in this field,
01:08:26
◼
►
you know, I don't know.
01:08:27
◼
►
But I thought I would easily take an even money bet
01:08:33
◼
►
that at least 45% of the experiments mentioned in the first half of the book are wrong.
01:08:42
◼
►
Like, I would happily place a large amount of money on that bet.
01:08:45
◼
►
And wrong in the sense that they either don't replicate
01:08:50
◼
►
or they have never been attempted to be replicated,
01:08:52
◼
►
which is basically worthless in the social science.
01:08:54
◼
►
Like, a single paper that says "we got this crazy result"
01:08:58
◼
►
is literally worthless from a mathematical perspective.
01:09:01
◼
►
like it just tells you nothing except "hey, maybe you should do another one of these."
01:09:05
◼
►
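One small worked note on the "even money bet" framing above, since it carries the quantitative claim: taking an even-money bet on a statement is only favorable if you put better-than-even odds on it. With the stake normalized to 1:

\[
E = p \cdot (+1) + (1 - p) \cdot (-1) = 2p - 1 > 0 \quad \Longleftrightarrow \quad p > 0.5
\]

So happily taking that bet amounts to saying the chance that at least 45% of those experiments are wrong is comfortably above one half.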
So I have to limit it to the first half of the book because I exploded when we got to the fake
01:09:12
◼
►
people and just could not deal with it. I'm so pleased we really bounced at the same point.
01:09:19
◼
►
Because that was when I couldn't take it anymore. That was when I then started
01:09:23
◼
►
going through. It's really interesting. He has a new book out and I wonder what that's like,
01:09:27
◼
►
like, with this stuff in mind. They had a book come out this year, I think, called Noise.
01:09:33
◼
►
I don't know anything about it.
01:09:34
◼
►
Okay so I don't know anything about it but that title sounds a little bit like it's trying to
01:09:38
◼
►
talk about some of the replication crisis because I just want to mention something really quickly
01:09:43
◼
►
here because I think the replication crisis has been actually quite damaging to the wider world
01:09:50
◼
►
in a bunch of ways because you do get media reports or stories about like how people are
01:09:58
◼
►
under certain circumstances or how people act or what people do or like look at this wacky
01:10:03
◼
►
experiment where we get the wrong results. And I do think this stuff kind of just permeates
01:10:09
◼
►
society as this background knowledge of like, oh, we all know that people will be greedy under
01:10:14
◼
►
these circumstances or people will cheat under those circumstances. And like, I've looked into
01:10:19
◼
►
these papers sometimes and they just don't replicate, and they just get repeated as true.
01:10:26
◼
►
I don't think this book's gonna make you happier.
01:10:30
◼
►
This is from Amazon.
01:10:31
◼
►
"Wherever there is human judgment, there is noise.
01:10:34
◼
►
Imagine that two doctors in the same city give different diagnoses to identical patients,
01:10:39
◼
►
or that two judges in the same court give different sentences to people who have committed
01:10:42
◼
►
matching crimes.
01:10:44
◼
►
Now imagine that the same doctor and the same judge make different decisions depending on
01:10:47
◼
►
whether it is morning or afternoon or Monday or Wednesday.
01:10:50
◼
►
Oh my god, okay, so this is the... right, okay.
01:10:54
◼
►
So this is, he's actually hitting one right there, which I used to think was true and
01:10:58
◼
►
then looked into it more and it is not true, and it has to do with judges giving harsher
01:11:03
◼
►
sentences right before lunch is like this concept that-
01:11:06
◼
►
He references that in the book.
01:11:07
◼
►
-people are grumpy and hungry.
01:11:08
◼
►
He talks about that in the book.
01:11:09
◼
►
Yeah, so like, that doesn't replicate as far as I am aware.
01:11:11
◼
►
Like that, that paper failed to replicate when done with other things.
01:11:15
◼
►
So the reason I thought that the title "Noise" would be related to this is because—
01:11:20
◼
►
so here's the fundamental problem with the replication crisis.
01:11:26
◼
►
If you have, say, you know, in America, I don't know how many behavioral economics students there
01:11:35
◼
►
are or psychology students there are trying to get their PhDs, but you have people who
01:11:40
◼
►
need to do experiments.
01:11:42
◼
►
And you have lots of them who are doing experiments.
01:11:45
◼
►
You know full well, just from like the mathematics of large numbers, that some
01:11:52
◼
►
of those people will conduct an experiment and they will get extremely
01:11:58
◼
►
convincing results that variable A is related to variable B, even though
01:12:04
◼
►
they're not related at all, just by chance, because there's just a large
01:12:09
◼
►
number of people here. An example I used to do back when I was a teacher teaching some
01:12:14
◼
►
basic statistics is I'd have a class of 20 students and you have everyone stand up and
01:12:20
◼
►
everyone gets to flip a coin and if they flip heads they get to stay standing up and flip
01:12:25
◼
►
again. Well in a class of 20 people you're basically guaranteed you're going to get one
01:12:30
◼
►
kid who's really surprised they flipped heads four times in a row and like that's just basically
01:12:36
◼
►
statistically very likely to happen.
01:12:38
◼
►
But what's not likely to happen is that when you do it a second time, the same kid
01:12:44
◼
►
flips heads four times in a row, right?
01:12:47
◼
►
That kid wasn't really good at flipping coins or whatever.
01:12:50
◼
►
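A quick check of the classroom numbers in Grey's example, as a minimal Python sketch; the four-heads streak and the class of 20 are the figures from the conversation, nothing else is assumed:

# Chance a single student flips heads four times in a row
p_streak = 0.5 ** 4                        # 1/16, about 6%
# Chance at least one student in a class of 20 pulls it off
p_at_least_one = 1 - (1 - p_streak) ** 20
print(f"{p_at_least_one:.0%}")             # ~72%: someone almost certainly does it
print(f"{p_streak:.1%}")                   # ~6%: the chance the *same* kid repeats it on the next run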
So the replication crisis is interesting because in some ways it's a side effect
01:12:54
◼
►
of there are way more people doing science now than there were in the past.
01:12:59
◼
►
And so one of the problems that you have to deal with is when you have lots of
01:13:04
◼
►
of people doing experiments, you know that some of them are going to be really wrong,
01:13:11
◼
►
but also have shockingly convincing data, which is why you need to, you need to run
01:13:17
◼
►
it again, because it's the equivalent of someone publishing a paper that says,
01:13:23
◼
►
"I flipped a coin and it came up heads 10 times in a row.
01:13:26
◼
►
I must be amazing at this!"
01:13:29
◼
►
Like it's the mathematical equivalent of that.
01:13:32
◼
►
The other slightly more technical problem, which is not really worth getting into, but
01:13:36
◼
►
the bar for... I don't really want to get into it, is what people will know as, like,
01:13:41
◼
►
p-hacking.
01:13:42
◼
►
It's this probability metric that's used of like, how good does your data have to be to
01:13:47
◼
►
be published in a respectable peer-reviewed journal, and the threshold is not set very high.
01:13:54
◼
►
It's set so that you can basically be guaranteed that one in 20 papers can't be correct in
01:14:00
◼
►
a journal. That is roughly where the threshold is set, which is really appalling when you realize,
01:14:05
◼
►
oh, an edition of a journal may have 40 papers in it.
01:14:11
◼
►
So two of them, before we even do anything, you can be very confident are wrong without
01:14:18
◼
►
even having to look at any of the data, because you just know where the threshold has been
01:14:23
◼
►
set for what we will accept to publish in this journal.
01:14:27
◼
►
And that's like the best case scenario because the journals are only picking from papers
01:14:32
◼
►
that obviously have really convincing data.
01:14:35
◼
►
But those papers are produced by statistical outliers when they perform their experiments.
01:14:40
◼
►
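A back-of-the-envelope version of the threshold argument, as a Python sketch. This follows the episode's simplified reading of the conventional p < 0.05 cutoff (a one-in-twenty false-positive rate when there is no real effect), not a full false-discovery-rate analysis, and the 40-papers-per-issue figure is just the number used in the conversation:

# The conventional significance threshold in much of the social sciences
alpha = 0.05                       # "one in twenty"
papers_per_issue = 40              # illustrative figure from the episode
expected_false_positives = alpha * papers_per_issue
print(expected_false_positives)    # 2.0 -- roughly two papers per issue, on this rough reading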
So it's a huge problem in the field and it's why after the priming stuff and when it just
01:14:46
◼
►
kept continuing onward, I was like, I'm having a real hard time with this book, in
01:14:52
◼
►
this dual way of like, "I believe your fundamental thesis, but goddamn, did like, this book get
01:14:58
◼
►
published at the exact wrong year to include the maximum amount of almost certainly non-replicable
01:15:07
◼
►
experiments."
01:15:08
◼
►
It makes me feel so much better about how I felt about this book.
01:15:12
◼
►
Oh, okay, I thought you would be crying when you heard about the replication crisis.
01:15:16
◼
►
No, I mean, that's the whole thing that I want to look into a bit more, but it seems
01:15:20
◼
►
deeply unsettling but in a kind of tantalizing way. Which is interestingly
01:15:25
◼
►
kind of exactly what these books are like, right? Like they are... it's like
01:15:29
◼
►
tantalizing so you just want to believe it. Because like I was at a point like I
01:15:34
◼
►
was talking to Adina last night about this, because she asked me what I
01:15:37
◼
►
thought about it, and it was like, a part of me is like, I don't know
01:15:40
◼
►
who this book is for, or why. Like, it's not really a business book, it's not
01:15:46
◼
►
really a self-help book. It's not really a book about science but it's
01:15:50
◼
►
kind of accepted by all of them because it's like catnip to all of those
01:15:54
◼
►
different, especially like the business-y types of things because there's
01:15:57
◼
►
interesting stuff in here but every time it would get to a point where he would
01:16:02
◼
►
start to give examples and explain his interesting idea I would become more
01:16:07
◼
►
infuriated by the overall experience because some of the stuff it was like
01:16:13
◼
►
it's the very worst of these types of books where it's like I'm going to tell
01:16:19
◼
►
you a thing, then tell you everybody's wrong except me.
01:16:24
◼
►
And in other books, people do this, right?
01:16:27
◼
►
This is very normal in these types of books,
01:16:30
◼
►
but it's not usually being presented to me as science.
01:16:34
◼
►
- Yes, okay, I had a thought that I was gonna keep
01:16:37
◼
►
to myself, but you've expressed a similar feeling,
01:16:40
◼
►
so I feel less bad about it.
01:16:42
◼
►
Part of the reason I never read this book is,
01:16:46
◼
►
It was hugely recommended to me, which I often just find a sort of yellow
01:16:51
◼
►
flag for recommendations in general of like, you know, when a thing is
01:16:55
◼
►
overwhelmingly recommended, I can be really confident I won't like it.
01:16:59
◼
►
For example, Ready Player One, like everyone in the universe recommended
01:17:03
◼
►
it to me and is like, I can guarantee you, I will not like that book.
01:17:05
◼
►
This book had an additional layer, which is the people who recommended it to me
01:17:15
◼
►
would fall into a category that I think this book is kind of catnip for, which is a little
01:17:21
◼
►
bit of an elitist, "Oh, aren't I smarter than everyone?"
01:17:27
◼
►
And I think that this book has that kind of weaved through it all the time.
01:17:35
◼
►
It's like, it's a little bit set up for, "Oh, yeah, I know all about this stuff.
01:17:42
◼
►
I wouldn't fall for this kind of stuff, but look at how other people fall for this
01:17:46
◼
►
kind of stuff." And I don't have it highlighted, but there were a few little sentences that
01:17:51
◼
►
that just really rubbed me the wrong way where he's like, "So when we make policy for
01:17:58
◼
►
people, we need to keep in mind that they're thinking with their emotional brains."
01:18:03
◼
►
"When he starts talking about governments, it's like, luckily some governments are
01:18:08
◼
►
doing things the way I think they should be done. Hopefully they'll all come on board."
01:18:14
◼
►
Yeah, okay, so I'm glad it wasn't just me, but it's like, there's a quality of elite
01:18:23
◼
►
college-educated superiority that some people have when recommending this book, and it's
01:18:29
◼
►
one of the things that always put me off the book, and it's like, "Boy, it is in here."
01:18:35
◼
►
It bugged me. It bugged me a number of times.
01:18:39
◼
►
So let's talk about the fake people. We've got to talk about the fake people.
01:18:42
◼
►
Tell me about the fake people, Myke, because I have to hear what you think about this,
01:18:45
◼
►
because this is where I lost it to.
01:18:47
◼
►
So he creates two people. One is Tom W and one is Linda. And the Linda one is really
01:18:53
◼
►
controversial, and he actually notes it as such, which I appreciate, to the point where it
01:18:58
◼
►
is known as the "Linda problem" after publishing the paper.
01:19:02
◼
►
That's hilarious. Linda is the exact moment I checked out.
01:19:06
◼
►
So in a nutshell, creates a fake person,
01:19:09
◼
►
creates a personality profile about them,
01:19:11
◼
►
and then wants you to guess what jobs
01:19:13
◼
►
that they would be good at.
01:19:15
◼
►
Then says that all of your guesses are wrong.
01:19:19
◼
►
So what he explicitly does is creates a person
01:19:24
◼
►
who you are 100% expected to suggest
01:19:28
◼
►
that they would do this type of job,
01:19:29
◼
►
and he goes, "No, no, they'd be good at another one."
01:19:32
◼
►
But you've created this fake situation
01:19:35
◼
►
and told me to think a certain way.
01:19:38
◼
►
And then when I said, yeah, I believe you,
01:19:40
◼
►
you said, no, you're wrong.
01:19:42
◼
►
And I hate stuff like this.
01:19:44
◼
►
You created this completely fake situation.
01:19:47
◼
►
The same with Linda, right?
01:19:48
◼
►
Creates this personality profile.
01:19:50
◼
►
It was like, there's no way that they could be a bank teller.
01:19:52
◼
►
It's just impossible.
01:19:53
◼
►
Like, no, it's not impossible.
01:19:54
◼
►
They could be.
01:19:55
◼
►
And I find this so annoying because it's like,
01:19:58
◼
►
I'm so smart, you are so stupid.
01:20:01
◼
►
Or like there's another part in the book as well,
01:20:04
◼
►
this is much later on, where he's talking about experts,
01:20:08
◼
►
that all experts are wrong
01:20:10
◼
►
because they cannot actually predict the future.
01:20:13
◼
►
Like people are paid to forecast things,
01:20:16
◼
►
but there's no way that they could know them
01:20:18
◼
►
because it's the future.
01:20:19
◼
►
So they're all wrong.
01:20:20
◼
►
And it's kind of like,
01:20:22
◼
►
I'm not saying that he's incorrect,
01:20:25
◼
►
but by his own logic, what he has just said is wrong
01:20:30
◼
►
because he cannot actually know.
01:20:32
◼
►
And I really get annoyed when these types of books
01:20:35
◼
►
wrap themselves in either A, these falsehoods
01:20:38
◼
►
and tells you you're stupid for believing them
01:20:40
◼
►
even though they force you to believe them,
01:20:42
◼
►
or B, makes these grand sweeping statements
01:20:47
◼
►
that undo everything the statement has said,
01:20:50
◼
►
like it's eating its own tail, right?
01:20:52
◼
►
Like you can't trust anyone, everyone's wrong,
01:20:55
◼
►
but you can trust me except everyone's wrong.
01:20:58
◼
►
And like these two parts just, it really, unfortunately,
01:21:03
◼
►
reduced my overall feeling about this book.
01:21:07
◼
►
And I really wish that these, as with most of these books,
01:21:12
◼
►
it was just half the size and then he could have got out
01:21:14
◼
►
what he wanted to say, put system one and two in,
01:21:17
◼
►
give me some more about that and left it there.
01:21:20
◼
►
Because everything past that point
01:21:22
◼
►
really undermines the work, I think.
01:21:25
◼
►
- Yeah, the thing about experts, again,
01:21:27
◼
►
because what is true has been a repeated topic on this podcast.
01:21:32
◼
►
It's frustrating because I think in the what is true topic, the danger that you constantly
01:21:40
◼
►
have to avoid is becoming cynical and just reflexively going, "Oh, I can't believe experts.
01:21:48
◼
►
Experts are dumb."
01:21:49
◼
►
You can have an interesting conversation about under what circumstances does it make sense
01:21:56
◼
►
to trust expert advice. What are the constraints and what are the incentives that are acting
01:22:01
◼
►
upon an expert? And that, that gives you a way to frame people's advice. You know, a
01:22:09
◼
►
very, a very classic example is something that comes up in the past year. You say regulatory
01:22:14
◼
►
agencies, how much should you trust a regulatory agency? And you're like, well, the more that
01:22:20
◼
►
that agency, the people making decisions have something on the line personally, you should
01:22:25
◼
►
take that into account for trusting of that, or it's like, there are all of these
01:22:30
◼
►
different ways to think about that, but a dismissal that is also then couched in,
01:22:38
◼
►
except me as the expert, is just, it's the worst kind of thing.
01:22:43
◼
►
I think it encourages a kind of cynicism that is not helpful for actually solving
01:22:50
◼
►
problems and also tells you sort of implicitly, because you've read this far in this book,
01:22:57
◼
►
you obviously agree with all of this.
01:22:59
◼
►
So when I say you can trust me, I'm also saying you can trust you who trusts me.
01:23:05
◼
►
- You're the real expert here.
01:23:07
◼
►
I do want to just add on something that you said, because you made a good point about
01:23:10
◼
►
if there is something on the line, but I think that that can actually lend a little bit to
01:23:14
◼
►
what he's saying.
01:23:15
◼
►
And I just wanted to also suggest: say we have an expert.
01:23:19
◼
►
Yeah, they probably-- well, they definitely can't predict the future.
01:23:23
◼
►
But they know more about it than you do.
01:23:25
◼
►
And so if one of us is going to try and make a decision,
01:23:28
◼
►
maybe it's that person.
01:23:29
◼
►
And that was the thing that annoyed me about this book, where
01:23:32
◼
►
it's kind of like, no, experts know nothing.
01:23:34
◼
►
They don't know anything.
01:23:36
◼
►
There's no way they can predict anything.
01:23:37
◼
►
It's like, yeah, I know.
01:23:38
◼
►
We all know this.
01:23:39
◼
►
We all know that these people cannot predict the future.
01:23:42
◼
►
If you spent years studying something, you maybe have a better gut reaction than me,
01:23:48
◼
►
rando individual, who's just rolling up having read a news article in the Guardian.
01:23:53
◼
►
I don't think anybody is suggesting that people can accurately predict the future,
01:23:59
◼
►
but if we're going to try and base something on something, at least try and put some logic behind it.
01:24:04
◼
►
And I find it really annoying that he, as you say, is like, "Yeah, we shouldn't do that
01:24:09
◼
►
to anyone, except what I have to say."
01:24:12
◼
►
Just to get back to the fake people thing.
01:24:14
◼
►
Alright, so I just want the listeners to really understand what we're saying here.
01:24:19
◼
►
So I just had to pull up part of this.
01:24:20
◼
►
I'm just going to read a little bit of it.
01:24:22
◼
►
The Tom W one.
01:24:24
◼
►
So he presents you with this description of a person, Tom W, who's not a real person,
01:24:30
◼
►
is a constructed person for this experiment here, right?
01:24:35
◼
►
It says, "Tom W is of high intelligence, although lacking in true creativity.
01:24:41
◼
►
He has a need for order and clarity and for neat and tidy systems in which every detail
01:24:45
◼
►
finds its appropriate place.
01:24:48
◼
►
His writing is rather dull and mechanical, occasionally enlivened by somewhat corny puns
01:24:53
◼
►
and flashes of imagination of the sci-fi type.
01:24:57
◼
►
Now before we even get one sentence further, the note I wrote down in this section of the
01:25:03
◼
►
book is "I feel like I'm having a stroke."
01:25:06
◼
►
Like, how am I supposed to understand what is happening here?
01:25:13
◼
►
Like, I don't know about you.
01:25:16
◼
►
I literally can't conceive of this the way he's trying to ask me to conceive of this.
01:25:21
◼
►
It's, it's like, it's not a real person.
01:25:26
◼
►
So do I need to pretend it's a real person?
01:25:28
◼
►
If it is a real person, who's giving me this information?
01:25:34
◼
►
Like, is this information 100% accurate about this?
01:25:38
◼
►
- Yeah, 'cause this is like,
01:25:38
◼
►
"Oh, you have to judge this personality profile,
01:25:41
◼
►
which is a personality profile that doesn't exist.
01:25:44
◼
►
No one can write this about someone."
01:25:46
◼
►
- Yeah. - It's...
01:25:47
◼
►
- Every one of those sentences
01:25:50
◼
►
is thrown into immediate confusion.
01:25:53
◼
►
If I am quite literally doing what the whole book is about,
01:25:56
◼
►
hey, think about this seriously.
01:25:58
◼
►
Like, don't just make a quick judgment.
01:26:00
◼
►
Think about it seriously.
01:26:01
◼
►
But the moment I have to think about it seriously,
01:26:03
◼
►
I feel like I can't read. Like I can't absorb this in any way because the whole thing just
01:26:10
◼
►
falls apart. And then, yes, the thing that both of us find annoying is he then asks you
01:26:15
◼
►
to rank out of nine categories, you know, which of these categories do you think that
01:26:20
◼
►
he's most likely to work in? And then he goes, "LOL, no, you're wrong. He's not likely to
01:26:25
◼
►
be a librarian. He's likely to be a farmer because there's more farmers than librarians."
01:26:30
◼
►
- Yeah, oh, I hate that.
01:26:32
◼
►
There's more farms in the world.
01:26:33
◼
►
Oh, good, great, that's excellent.
01:26:36
◼
►
But are they this type of person?
01:26:37
◼
►
Like, let's be realistic here.
01:26:40
◼
►
Like, I know what you're trying to tell me,
01:26:42
◼
►
but the world doesn't work on probabilities.
01:26:45
◼
►
It's not, ugh.
01:26:47
◼
►
- This is what I meant by constant confusion
01:26:50
◼
►
of math problems and social situations.
01:26:53
◼
►
And the thing it made me think of is,
01:26:56
◼
►
like, I remember in high school,
01:26:58
◼
►
one of the standardized math things that we had to learn
01:27:00
◼
►
was these, I really quite liked them, but they were logic puzzles.
01:27:03
◼
►
They would be presented as a series of sentences.
01:27:06
◼
►
And so they would say something like, "John wears red every
01:27:12
◼
►
day that starts with a T.
01:27:15
◼
►
He only wears blue on every other day and he'll never wear yellow on a Sunday.
01:27:21
◼
►
If it's a Tuesday, what color is he likely to be wearing?"
01:27:24
◼
►
And step one of solving any of these problems is you go, "Okay, these words
01:27:30
◼
►
don't have anything to do with reality and you have to just turn them into mathematical
01:27:34
◼
►
statements and then it's very easy to solve, right?
01:27:39
◼
►
But this, so what Kahneman's, the trick that he's pulling here, he is explicitly asking
01:27:44
◼
►
you to solve a social problem.
01:27:47
◼
►
Here's a personality description of a person.
01:27:49
◼
►
What job do you think that they would like to do?
01:27:51
◼
►
And then he's pulling the rug out from under you going, "LOL, I actually only wanted a
01:27:55
◼
►
statistical answer."
01:27:56
◼
►
And it's like, "F*ck you!"
01:27:58
◼
►
Like, "F*ck you!"
01:27:59
◼
►
Yeah, you didn't ask me for that.
01:28:01
◼
►
I was walking around in my pre-cortex time, I was trying to really articulate like,
01:28:07
◼
►
why does this make me so mad?
01:28:09
◼
►
Like, but I couldn't put it into words.
01:28:11
◼
►
And then it finally dawned on me of like, I know what he's doing.
01:28:15
◼
►
So this is my metaphor for this section.
01:28:17
◼
►
Let me describe a student for you.
01:28:19
◼
►
She's the smartest girl in school, and she loves books.
01:28:23
◼
►
She's great at memorizing long lists of things.
01:28:26
◼
►
And she's about to be sorted into a Hogwarts house.
01:28:31
◼
►
Which house do you think she's going to be sorted into?
01:28:34
◼
►
Did you guess Ravenclaw?
01:28:36
◼
►
LOL no, she ended up in Gryffindor.
01:28:40
◼
►
She ended up in Gryffindor because there's more brave people than smart people.
01:28:46
◼
►
Now don't you feel stupid?
01:28:48
◼
►
It's infuriating.
01:28:49
◼
►
Like this whole thing is fake.
01:28:51
◼
►
It's not real.
01:28:51
◼
►
That's what it just dawned on me.
01:28:53
◼
►
Like that's what it is.
01:28:54
◼
►
I am being judged for guessing people's Hogwarts house incorrectly based on fictional
01:29:00
◼
►
descriptions of fictional people.
01:29:03
◼
►
And he's given me a lesson on like, well, you know, the stereotype of Ravenclaw students
01:29:08
◼
►
is that they're smart, but actually in the book, it's not really mentioned very often
01:29:11
◼
►
that they're smart.
01:29:12
◼
►
So joke's on you, like you really fell for something here.
01:29:16
◼
►
Like I don't like, it's maddening.
01:29:18
◼
►
It's absolutely maddening.
01:29:19
◼
►
Like, I actually find it more maddening because there is a good idea underneath this.
01:29:28
◼
►
It's just presented in the worst of all possible ways.
01:29:31
◼
►
I want to, again, I'm going back to Wikipedia because that's apparently what I do on this
01:29:35
◼
►
podcast now.
01:29:36
◼
►
So this is known as the conjunction fallacy in some part of it and it's going back to
01:29:42
◼
►
the Linda problem.
01:29:43
◼
►
I just want to read it out just so people that haven't read the book can, like, because
01:29:46
◼
►
we're just getting so upset now.
01:29:48
◼
►
Linda is 31 years old. Single, outspoken and very bright. She majored in philosophy. As
01:29:55
◼
►
a student, she was deeply concerned with issues of discrimination and social justice, and also
01:30:00
◼
►
participated in anti-nuclear demonstrations. Which is more probable? 1. Linda is a bank
01:30:05
◼
►
teller. 2. Linda is a bank teller and is active in the feminist movement. The majority of
01:30:09
◼
►
those asked choose option 2, however the probability of two events occurring together in conjunction
01:30:14
◼
►
is always less than or equal to the probability of either one occurring alone.
01:30:19
◼
►
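Written out, the rule the Wikipedia passage is stating is just that a conjunction can never be more probable than either of its parts:

\[
P(\text{bank teller} \wedge \text{feminist}) = P(\text{bank teller}) \cdot P(\text{feminist} \mid \text{bank teller}) \le P(\text{bank teller})
\]

However strongly the description points at "feminist", option 2 describes a subset of the people covered by option 1, so it cannot be the more probable of the two.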
So the idea is she's more likely to be a bank teller than a bank teller active in the feminist
01:30:24
◼
►
movement, because of the probability. You have to ignore the fact that you were told, you
01:30:28
◼
►
were told clearly, in such a way that would suggest that she would be active in the feminist movement.
01:30:35
◼
►
And that is so angering to me because I honestly genuinely believe this person is more likely
01:30:44
◼
►
with real, like, the way we believe about the world kind of thinking to be a bank teller
01:30:51
◼
►
active in the feminist movement than just a bank teller. Because people are not math.
01:30:57
◼
►
Yeah, so I mean, here's the thing, I will disagree with you on that. That's fine. Right.
01:31:03
◼
►
But this is why I think he's particularly bad at explaining this concept.
01:31:09
◼
►
Now, like, I think that the Linda one is less bad, because the fundamental thing
01:31:15
◼
►
that he's trying to convey is true from a mathematical perspective.
01:31:19
◼
►
Yeah, the Tom one is worse, I say.
01:31:22
◼
►
The Tom one is worse because it's just, you guessed wrong about this person's job.
01:31:26
◼
►
Like that's, that's what I mean with the Ravenclaw houses.
01:31:29
◼
►
But like, also with the Tom one, the thing that's frustrating to me about it is like,
01:31:34
◼
►
okay, so let me translate this into the useful idea.
01:31:38
◼
►
The idea, people, is: don't bet against the base rate unless you have a really good reason
01:31:46
◼
►
why you think this time is different.
01:31:49
◼
►
So all that means is, say... I think this is a really useful idea for trying to make predictions:
01:31:55
◼
►
you say, oh, I want to predict something.
01:31:59
◼
►
Well, if I didn't know anything about the details of this particular situation, but
01:32:05
◼
►
it's like a class of situations.
01:32:09
◼
►
So you might say, what's the chance that the CEO at a big tech company will get
01:32:16
◼
►
replaced within the next year, let's say, and you want to place a bet on that.
01:32:20
◼
►
You can lose yourself very easily in like, Oh, here's all these things
01:32:25
◼
►
that I know about Apple and this might have an effect or this might have an effect.
01:32:29
◼
►
I think this is super important.
01:32:31
◼
►
This thing, like you can get lost in those specifics.
01:32:34
◼
►
And this is the argument against experts in some ways is you can
01:32:39
◼
►
become too obsessed with the details.
01:32:41
◼
►
The very starting question should be, what is the likelihood in any
01:32:47
◼
►
given year that a tech CEO is replaced as CEO based on the last 10 years of data?
01:32:53
◼
►
And that should be your default betting position unless you have like a really
01:33:01
◼
►
good reason why you think you may know differently this time.
01:33:04
◼
►
And like, maybe you do, maybe you don't.
01:33:06
◼
►
And so that's what the Tom question is trying to get at.
01:33:09
◼
►
But the way it should be phrased is more like,
01:33:12
◼
►
if you have to guess what someone's job is,
01:33:15
◼
►
and you don't have reliable information about them,
01:33:20
◼
►
you should just guess whatever the most frequent job is,
01:33:23
◼
►
and you will be correct most of the time.
01:33:25
◼
►
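A minimal sketch of the base-rate arithmetic behind the Tom W point, in Python. Every number below (the counts of farmers and librarians and the made-up likelihoods for the personality sketch) is purely illustrative; the shape of the calculation, Bayes' rule, is the point, not the figures:

# Hypothetical base rates: many more farmers than librarians (illustrative only)
farmers, librarians = 200_000, 10_000

# Suppose the sketch is judged five times more likely for a librarian than a farmer
p_sketch_given_librarian = 0.5
p_sketch_given_farmer = 0.1

# Bayes' rule: posterior weight is prior count times likelihood
weight_librarian = librarians * p_sketch_given_librarian   # 5,000
weight_farmer = farmers * p_sketch_given_farmer            # 20,000
print(weight_farmer / (weight_farmer + weight_librarian))  # 0.8 -- the base rate still dominates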
But he just presents it in like this totally bizarre social way,
01:33:31
◼
►
Where if you have to take it seriously, in real life, when you're really
01:33:38
◼
►
interacting with people, I think people are actually pretty good at making
01:33:44
◼
►
correlative judgments about other people.
01:33:47
◼
►
And like, this is, this is why I say like the selection effect
01:33:50
◼
►
is really undervalued in humans.
01:33:51
◼
►
That if you know a couple things about someone, you probably can estimate
01:33:57
◼
►
very well other things about them.
01:34:00
◼
►
But he's not doing that with an artificial person.
01:34:04
◼
►
Like this artificial person is just math.
01:34:08
◼
►
And like, that's just not how it works if you are really interacting with someone.
01:34:14
◼
►
Like if Tom was a real person, I'm pretty sure if I was talking with him, I could
01:34:20
◼
►
figure out pretty quickly, is he more likely to be a librarian or a farmer based
01:34:25
◼
►
on information that you get from interacting with the person, rather than
01:34:29
◼
►
like this stroke inducing description of his personality.
01:34:33
◼
►
So that's why it's absolutely infuriating.
01:34:36
◼
►
And it's all the more infuriating because, like, don't bet against the
01:34:39
◼
►
base rate is a really good idea.
01:34:42
◼
►
I feel like it's an idea that I've only really become aware of in like
01:34:47
◼
►
the past five years as something that just wasn't really on my mind before.
01:34:50
◼
►
It's like, base rates are really important.
01:34:52
◼
►
You should think about it if a decision really matters.
01:34:54
◼
►
And so that's why I'm like, when I get to the Tom section, I'm like,
01:34:57
◼
►
"Ah, like I can't deal with it."
01:35:00
◼
►
And thank God this is not where I first came across
01:35:02
◼
►
this concept.
01:35:03
◼
►
This mixture of social and math, I think,
01:35:07
◼
►
serves neither of them.
01:35:10
◼
►
There's one more thing I just have to tell you about.
01:35:13
◼
►
Okay, 'cause I highlighted it,
01:35:15
◼
►
'cause this was the other time.
01:35:17
◼
►
I don't know if you have this experience, Myke,
01:35:18
◼
►
but sometimes when you're listening to a podcast
01:35:20
◼
►
or an audio book, like, you can remember exactly
01:35:22
◼
►
where you were when you heard something.
01:35:24
◼
►
- Yeah, yeah, yeah, yeah.
01:35:25
◼
►
I feel this and I've heard a lot of people say this to me about my shows in the past.
01:35:30
◼
►
Yeah, so I was listening to this audiobook and I was at a particular spot in London and
01:35:34
◼
►
again had like an aneurysm on the street when I got to this point in the audiobook and I
01:35:38
◼
►
know that forever in my life I will always think of this one corner in London as bat
01:35:44
◼
►
and ball corner. So do you remember the bat and ball section in the book? Did this strike you too?
01:35:50
◼
►
Oh yeah, yeah, yeah, yeah, yeah, yeah.
01:35:53
◼
►
So I'm just going to read this little section from the book word for word.
01:35:56
◼
►
He's talking about system one and system two thinking.
01:35:59
◼
►
For example, here is a puzzle.
01:36:02
◼
►
Do not try to solve it, but listen to your intuition.
01:36:06
◼
►
A bat and ball cost $1.10.
01:36:08
◼
►
The bat costs $1 more than the ball.
01:36:11
◼
►
How much does the ball cost?
01:36:14
◼
►
So I listened to that.
01:36:16
◼
►
I followed his instructions.
01:36:19
◼
►
And the first number that pops into my head is 10 cents as the answer, which of course
01:36:24
◼
►
is the wrong answer.
01:36:26
◼
►
And I actually knew it was the wrong answer because I've heard this before.
01:36:31
◼
►
But following his instructions, like don't try to solve it, just listen to your intuition,
01:36:35
◼
►
it's still the number that just pops right into my head of like, "Oh, it's got to be
01:36:39
◼
►
10 cents." That's not the way it works.
01:36:40
◼
►
Like it's actually 5 cents if you write it out with some algebra and you solve it.
01:36:44
◼
►
But this is one of those sections where he sort of goes on to be like, "lol, aren't
01:36:49
◼
►
people dumb?"
01:36:51
◼
►
And I had an aneurysm because later on he's like, he starts talking about, "Oh, how
01:36:56
◼
►
were people able to solve it?"
01:36:58
◼
►
Like they were obviously able to overcome their system one fast thinking and, you know,
01:37:05
◼
►
really work it out.
01:37:06
◼
►
It's safe to assume that the intuitive answer also came to the mind of those who ended up
01:37:10
◼
►
with the correct number, but they somehow managed to resist the intuition.
01:37:16
◼
►
And I'm like, okay, well, screw you, because you didn't give me the chance to actually solve it.
01:37:22
◼
►
You specifically told me, don't try to solve this, just say whatever pops into your head
01:37:25
◼
►
for the first time.
01:37:27
◼
►
And then this is also the mixing of socialness with math.
01:37:30
◼
►
"Furthermore, we also know that the people who gave the intuitive answer have missed
01:37:35
◼
►
an obvious social cue.
01:37:37
◼
►
They should have wondered why anyone would include in a questionnaire a puzzle with such an
01:37:42
◼
►
obvious-seeming answer."
01:37:44
◼
►
And it's like, God f*cking damn it!
01:37:46
◼
►
Like, I would have actually solved it if you didn't explicitly tell me, like, "Don't try
01:37:51
◼
►
to solve it," and now I feel like you're gaslighting me, like I should have rethought, like, "Oh,
01:37:58
◼
►
but it's such an obvious answer, like, there must be something more complicated."
01:38:01
◼
►
It's just like, it's another one of these weird traps of, "Oh, you got the wrong answer!"
01:38:07
◼
►
when I encouraged you to get the wrong answer.
01:38:09
◼
►
And I tricked you, I fooled you and like,
01:38:11
◼
►
ha ha ha, the correct answer is this one.
01:38:13
◼
►
It's just a bunch of stuff in the book is
01:38:15
◼
►
infuriating like that.
01:38:16
◼
►
And it's like, yeah, the social cue stuff,
01:38:18
◼
►
obviously, and it's also why anyone who ever
01:38:22
◼
►
does public speaking, I highly recommend you
01:38:25
◼
►
never do the thing where you ask the audience
01:38:28
◼
►
a question where you're expecting them to
01:38:30
◼
►
give one answer and then you're going to tell
01:38:31
◼
►
them, oh, it's the other answer.
01:38:33
◼
►
It just never works in an audience because with a
01:38:35
◼
►
real group of people, you can always feel it.
01:38:39
◼
►
They hesitate because they don't know what to do.
01:38:42
◼
►
They don't know if they're supposed to give the answer that they know that you
01:38:45
◼
►
want to give so that you can then say the other answer, or if they should pick the
01:38:50
◼
►
contradictory answer because they know from a lifetime of experience that when
01:38:54
◼
►
people ask really dumb questions with obvious answers, spoiler, it's going to
01:38:58
◼
►
be the surprising answer.
01:38:59
◼
►
Like don't ever do this.
01:39:02
◼
►
Like, so for him to just mention this thing about missing out on the social cue also just really flipped me out.
01:39:08
◼
►
It's like, you can't keep switching between are you trying to solve a math problem or are you aware of a social situation and using them to bounce off of each other.
01:39:17
◼
►
So that's a part of the book I did not enjoy.
01:39:19
◼
►
In case, like, so it took me a while, right?
01:39:23
◼
►
Like in case you're one of these people like me that struggles with it.
01:39:25
◼
►
It's $1.05, the cost of the bat.
01:39:31
◼
►
The bat costs $1 more than the ball. So the ball is five cents, the bat is $1.05.
01:39:35
◼
►
That's how that works. I hate this. Makes me feel stupid and I hate it.
01:39:40
◼
►
Yeah, and it also annoys me because, again, like with the fake people,
01:39:48
◼
►
I can't help but wonder all of these people who get this question wrong,
01:39:52
◼
►
what is the situation under which you're asking them? Like, it says something about like,
01:39:56
◼
►
oh, we asked all of these people at really smart colleges. Oh yeah, Harvard, MIT, and Princeton,
01:40:01
◼
►
We asked all of these people, graduates of these elite universities, and they got it wrong!
01:40:05
◼
►
That's like, yeah, but what's the scenario under which you asked these people?
01:40:09
◼
►
Because it matters quite a lot.
01:40:12
◼
►
I think anyone who graduates from MIT, if you gave them this question
01:40:16
◼
►
under a scenario in which like, "hey, really think about it,"
01:40:22
◼
►
I think they could get you the answer.
01:40:23
◼
►
I suspect like a lot of these wrong answers are because it's not worth anything to the person
01:40:29
◼
►
being asked to think about it for more than a second, right?
01:40:32
◼
►
It's just part of a thing.
01:40:34
◼
►
Like, I also kept having flashbacks to when I was getting my sociology
01:40:39
◼
►
degree and as part of that, guess what?
01:40:41
◼
►
You have to participate in these exact kind of experiments.
01:40:45
◼
►
It's like, okay, I had to go into the lab sometime and answer a bunch of
01:40:48
◼
►
questions on a computer, or they'd have you look at a thing
01:40:52
◼
►
and, you know, try to react to something.
01:40:53
◼
►
And this is also part of the replication crisis.
01:40:57
◼
►
It's like, guess what?
01:40:58
◼
►
A lot of these studies, they're not done on random people.
01:41:02
◼
►
They're done on undergraduates of psychology and sociology who are trying
01:41:07
◼
►
to get credits so they can graduate.
01:41:10
◼
►
And so, you know, when I did those experiments, number one, anyone who
01:41:16
◼
►
has participated in those things knows this:
01:41:17
◼
►
If you're doing a degree in sociology, spoiler, you already know that
01:41:23
◼
►
whatever they say they're studying is not the thing that they're studying.
01:41:26
◼
►
That's like step one of an experiment.
01:41:28
◼
►
So you're already thinking, "I wonder what they're really trying to find out
01:41:31
◼
►
in this experiment because they're asking me to solve math problems, but
01:41:34
◼
►
it's not really math problems.
01:41:35
◼
►
Like I know how this works."
01:41:36
◼
►
You know, or I remember sitting on the computer and you had to do one of these
01:41:40
◼
►
things where it's like, "Oh, we're going to show you certain kinds of pictures.
01:41:44
◼
►
And then you move the mouse cursor up, and for other kinds of pictures
01:41:47
◼
►
you move the mouse cursor down, or like you have to react."
01:41:49
◼
►
And like, "I didn't care, whatever.
01:41:52
◼
►
Like I'm there just to get a credit."
01:41:54
◼
►
But it's like, if it really mattered that I performed well at this task of like classifying
01:42:01
◼
►
different sorts of flowers quickly, you bet I could do it better if the incentive was there.
01:42:07
◼
►
So this is the other, like, massive confounding problem for all of this stuff.
01:42:11
◼
►
And so I just, you know, again, this is this is why I'm willing to bet just a huge amount
01:42:15
◼
►
of this stuff just does not check out.
01:42:18
◼
►
Even dumb little things where they're like, oh, the Harvard graduates can't get this question right.
01:42:23
◼
►
But who are you asking? And how? I bet it just wasn't worth their time at all to think about it, which
01:42:30
◼
►
you can sort of say is part of the idea of the book that people think fast and slow sometimes,
01:42:42
◼
►
but it's totally uninteresting as a piece of information that, like, if people don't care about a question, they won't think about it very much. So
01:42:51
◼
►
Sorry, I got way more worked up than I thought I was gonna be over this book.
01:42:55
◼
►
- I mean, I was pretty worked up about it too, so I'm pleased that you were.
01:42:58
◼
►
I will say that we have spoken about this book for much longer than I thought we were going to today.
01:43:03
◼
►
I really thought we were gonna have to plan more stuff, and we have a lot of stuff as is usual,
01:43:08
◼
►
that we're not gonna talk about today.
01:43:10
◼
►
Like many books, it has good things and it has a lot of bad things.
01:43:13
◼
►
Unfortunately, I think the bad things this book has are maybe worse than typical.
01:43:19
◼
►
Yeah, I would say that I anti-recommend this book.