54: goto fail;
Alright, do you want to do some follow-up?

Okay, sounds great.

There's hardly any. Is there any today?

Kind of, not really. Yeah, I'm lumping something that probably isn't, by the strictest definition, follow-up into follow-up, and that is: after our discussion about what comes after Objective-C, a lot of people have come out of the woodwork and said, "Hey guys, have you seen this Wolfram Language thing? That's gonna be the next big thing that will replace Objective-C." And it's not. So, anything else?

It is pretty cool, though.

Oh, it's cool as hell. I have no idea what I would use it for, if anything. I think I'm not smart enough to use it, actually, but it's really cool. I mean, all kidding aside, it is very, very cool, but it's serving a completely different purpose, and I don't see a mechanism by which that's going to be the way in which we build apps. And of course, somebody will say to us, "Oh well, but didn't you watch the whole video? They had sliders on there and other UI elements and blah blah blah." Yeah, but that's not really the point. It's not the sort of thing you'd build an app with. It's the sort of thing that you would do some really impressive and very cool data computations with, but it's not something you'd build an app with.

Did either one of you guys have to use Mathematica in school?
I did, and I've long since forgotten all of it. Or no, no, I used MATLAB. I'm sorry.

Yeah. Well, Mathematica... so the guy, what's his name, Stephen Wolfram? Mathematica is to him as Emacs is to Stallman, basically. Like, he's a super-genius crazy person who made an environment where he can sort of fulfill his dreams of computation, and he's just added to it and added to it over the years. Only Wolfram is a little bit more successful and a little bit more determined, and not quite as afflicted with RSI, and I think he hired a bunch of other people to do things. I think the difference is that Wolfram had a business that made money, which let him indulge his tastes to do this type of thing. So he's built this amazing thing for himself that works the way his mind works, that is basically: start with Mathematica and just expand out to fill the universe. But I'm not going to say something about it like, "Oh, it's just an insert-word-here," because it's not just an anything. What it does is very impressive. It's a life's work, and it is very interesting and impressive, but I think it's more an application: an application that you can program with, more than a programming language.

But anyway, regardless of what you think it can apply to, everything in that demo shows that it's good at doing the stuff in that demo. But Apple, of course, would need a language that's good at doing the things that Apple needs to do. And what does Apple need to do? They need to let developers write applications. They need to write applications themselves. And they need to write an OS. And for all of those purposes, this language is not useful.
Right. And don't let my kind of shrugging it off as a replacement for Objective-C take away from the fact that you're absolutely right: it is unbelievably impressive, the things that can be done with it, but it is serving an entirely different purpose. And that's the only point I'm trying to make.

Yeah, like, I'm not going to be writing the next, you know, Instapaper killer in Wolfram Language.

Yeah, and one of the aspects it has going against it is that, thus far, computer languages that lean heavily toward the math side of things, while always very interesting and powerful, tend not to catch on. That doesn't mean it can never happen, but so far the ones that are more math-like have not caught on as much as the ones that are less math-like. And I don't know why that is, but for example, Haskell is another one of those languages that looks more like you're doing math, or is more math-like, or even something like Lisp, kind of, sort of. And this definitely is toward the math side. I mean, you can solve integrals with it as part of the language, for free, all the symbolic stuff.

I mean, it started from Mathematica. How could it not be math-based?

But, yeah, so far, languages like that haven't caught on with the masses, no matter how cool they are for the people who use them for the things they're good at.

One problem I think I'd have trying to use this for anything is that, kind of similar in this one way to AppleScript, I think it would be hard for me to get started in this and to even know the kinds of things I could do. Because it can do so much, and in that way it's pretty unfocused. It's very broad, and you're presented with, like, here's this shell, basically, this interactive prompt that you can do whatever you want with. And the demos that he was showing off in the video look amazingly cool. I don't know necessarily how useful they would be for me, but they were still amazingly cool.
But I was looking at the kinds of syntax he was using, the kinds of commands he was using, the kinds of structures he was using, and I don't even know where I would begin with something like that. And I've had that same problem whenever I've used Wolfram Alpha as well. Every time I've tried to use Wolfram Alpha, I've tried phrasing things in a certain way, and I never guess the correct syntax, and it never does what I want, and I can tell there's a lot there, but it's really hard to get started. It's really hard to know, like, "Okay, what can I do here, and how do I ask it to do that?"

I think, like Lisp, the language itself is probably super simple. I think there's only a few things that exist: they probably have, like, tuples, and some syntax for function calls, and a couple other symbolic things to write math in ASCII that get converted into symbolic representations. That is the language, but the language is pointless. If you look at that big... they kept paging through, page after page, those little rectangles. Underneath each one of those is a vocabulary, which is basically like a library. What functions can I call? What things can I type? That's not part of the language per se. It's not as if those are keywords like "if" and loops and function declarations. The language syntactically looks very simple, to its discredit, I think, in that it looks like it would be very cumbersome to do anything remotely complicated.

But the power of the thing is: look at all these libraries that we have. Look at all the different functions we have. Look at what the options to those functions are. Look at how those functions can be composed with each other. And so it's kind of weird to call it a language. That's why it's more like an application or a set of libraries. And the set of libraries looks huge. Surely there's some function that does the thing that you want, with the options that you want, and if you can't find it exactly, you can build it by composing it out of other really powerful pieces, and put it all on a web front end, and get at it through a web service. Lots of cool stuff in there, but I think it's more like an application. It's more like an API than it is a language. And whatever it is, it's not suited to writing GUI applications or operating systems for phones or desktops.
Right, I had the same thought, that this would be a potentially extremely powerful add-on or processing... not unit, but I guess like a dynamic library for something in Objective-C, if you could get some sort of interface into it. But I don't see it replacing Objective-C or anything.

Very cool, though.

Very, very cool. So, what else is going on? You wanna talk about this SSL bug?

- Goto fail.
- All right, so moving on. No, I'm just kidding.
- It was, like, meme-ready.
- It just really was.
- It came pre-memed.
- I love that gotofail.com was actually available.
- Not for long, but yeah.
- And it was useful.
- Yeah, once it became a thing.

I don't know if I have all that much to say about this. I'm not sure I concur, Marco, with your tinfoil-hat reasoning that this was a deliberate act. And I'm happy for you to convince me that I am wrong. I'm not saying it wasn't deliberate, but to me it didn't reek of being deliberate like you seem to think. Do you want to kind of recap what leads you to believe that?
I mean, I'm not saying this was definitely an NSA security breach. My theory, which I think is plausible (I don't even know if I would say the most likely), is that I think it's reasonable to look at these events: this one duplicated line was inserted into this SSL... it was certificate verification code.

- Common name checking part, wasn't it? It was the step that checks that the common names match, and it skips over that step.

- I believe so. It was some part of the certificate verification step, so that you could make sure that the certificate, the SSL server that you're talking to, is who they say they are, and not a man in the middle. You know, man-in-the-middle attacks... I'm probably not even qualified to explain them in all their detail properly, but: somebody who could intercept your network traffic, at an ISP, or a wireless router in a coffee shop, or whatever the case may be, your school, your workplace, or your government. Somebody who could intercept your network traffic.

Normally, SSL is designed, if everything's done right, so that the server and you can talk securely. When you connect to the server, you can verify, through this series of cryptography steps, that the server you're talking to really is who they say they are, and that nobody else in the middle is listening in in a way that they can decode your data. This bug broke that assumption, so somebody could have been listening in and breaking SSL and watching your traffic, and the operating system was just skipping that verification step, or a part of it.

So, the way this was inserted in the file... and the files are open source. Not every revision is open source, but you can see the version that shipped in 10.8 and the version that shipped in 10.9, and you can see the diff there. The diff is not entirely convincing, because there could have been a lot of intermediate revisions in between. You don't know what happened between those two; all you see is the beginning point and the ending point. But if you look at the diff, not a lot in the file has changed between the two releases. There's this context parameter to some of these security calls that was removed, basically. It looks like the API just changed minorly, so that this one argument was no longer necessary, or something like that. So some of the calls had this very basic change to them that just removed this argument. There were almost no other changes in the entire function. And then this one extra "goto fail" line was inserted in the middle. So if you just look at the diff, it looks pretty bad. Like, it looks like, wow, no edits happened in the surrounding lines between these two releases; just this one line was inserted, kind of in the middle of nowhere. And it looks pretty bad.
Of course, as I said, though, you can't rely only on that. I think what worries me, and what makes me think that this could have been nefarious... and again, I want to say, as with my tweets: I don't necessarily believe that Apple itself officially knew about this, or introduced this intentionally, or was working with the NSA. I think it's much more likely, seeing how the NSA works, knowing that they have a program where... the New York Times reported this, I'll have to find the link, but I believe it said they had an annual budget of $250 million to go do things exactly like this. Basically, the NSA will get to an engineer who works at one of the big tech companies, or they'll have people sitting on standards bodies trying to argue for different standards to be subtly weakened or have these backdoors introduced, or the people who work at tech companies will become NSA, I don't know, supporters, agents, whatever they're called.

So we know that that kind of thing happens. We know all that from the Edward Snowden leaks and from the associated reporting that's gone on since then. All of that... that's not, like, an artificial tinfoil-hat thing. That kind of thing does happen. And so for this bug to be inserted in this file at this time... and again, another little piece of circumstantial evidence: this bug was inserted in, I believe, fall of 2012. That was the month before one of the NSA slide decks claimed that Apple had joined the PRISM program in some way. And that timing is really suspect as well.

So you can look at this, and we don't know yet, at least; we don't know what happened. We probably will never know what happened. It could have been an innocent mistake, an innocent line paste out of a vi buffer, or a weird merge artifact when the files were merged. Who knows, right? We can't tell exactly. But normally, when you try to rule out a nefarious tinfoil-hat conspiracy-theory kind of thing, you do it by saying, "Well, the simpler, more likely explanation is honest reason X." And I think in this case, looking at the environment we're in, looking at the kinds of things that we now know go on with the NSA and what they do with tech companies, and you look at exactly... I mean, for a one-line bug like this, this is a hell of a line to pick. If you look at what it did and the way it did it, it was so subtle. It's perfect, if you think about it: it's subtle enough that it passed whatever internal review they had (and we can talk about the fact that they probably had insufficient review and insufficient tests), but any kind of internal review, it might skip by, because it blends in. It's not obvious. It's not even obvious that it is a bug once you spot it; you kind of have to notice and be like, oh, wait a minute. You have to think about it for a second: oh, that's wrong.

And it could be explained away. If somebody was caught inserting it, it can be explained away by saying, oh, I must have hit paste wrong, or merged wrong. So there's a plausible explanation if you get caught. And it's exactly at the right point: it wasn't breaking all of SSL, it wasn't making all of SSL validate, but it was making this one particular class of thing validate, which is the kind of thing the NSA has been known to do.
So it's just a little too convenient in so many of these ways: the timing, the kind of thing it is, the perfection of exactly the right part of the file to cause a very convenient backdoor for the NSA, and in a way that looks really subtle and hard to find, and hard to attribute blame for once you do find it. And I'm sure they can look at their version history and see which employee inserted that, but again, there's a plausible reasoning: "Oh, it must have been a mistake during the merge," or something like that. So it's just a little too convenient to be a dumb, honest mistake, given the context, given the time it happened, what it did, the results it had, and what we now know about what our government does. So that's why I think... again, I wouldn't say that it's definitely the NSA, but I would say it would be naive to brush it off and say, "Oh, it probably wasn't them." I think the chances are better than that that it was them.

You know, I can't really argue with any of that. And I don't know, I guess I just want to believe that people aren't jerks like that, and that our government doesn't do things like that. But to be honest, that's me just keeping my head in the sand. So I don't know, John, what do you think about all of it?

For the timing thing, I think that's just as reasonably explained by saying the NSA discovered that this bug was in there, and "joining the program" basically means the NSA now has the capability to intercept traffic to Apple devices because of this bug that it knows about. And how would it know about the bug? Well, through its own testing, through trying to do man-in-the-middle attacks, perhaps having someone working at Apple who looks at the code before it's released to tell them that this is in there. So that would explain the timing, and that doesn't require the NSA to have caused the bug to be entered in any way. So the timing, I think, is a wash.

If I had to put money on it, I would bet that it's a merge error and it was accidental. And that doesn't mean the NSA wasn't exploiting it to do what they do, because it seems like they knew about it, based on the timing, and if they knew about it, I'm sure they would be exploiting it. The reason I think it's a merge error is because I don't think it's all that subtle. I think if the NSA were going to intentionally plant something like this, they would do it in a less discoverable way, because once it's discovered, it gets patched, and the NSA's goal is not to be discovered. This is not... you know, if you glance at it, you might miss it, casually visually inspecting it, but it's the type of thing that will be found, both in terms of the code and in terms of the massive effect it has. If they planted one, you would hope they would plant one that isn't so easy to find, because this is like, "I can spoof a certificate and it just accepts everything." It accepts any garbage; it skips that entire check. It's not like "if I carefully construct a certificate with a particular thing"; it's not exploiting a subtlety. It's the type of thing... it's amazing that it went undiscovered for as long as it did, because of Apple's apparent total lack of testing of their security frameworks.

So that, I think, argues against it being intentional. And the thing about plausible deniability (this is the really creepy part) is that I can imagine Apple has automated merge tools for bringing builds together. So if you were to find the commit that did this, I would imagine there's a good chance that it could be attributable to an automated merge tool. And then who do you blame, right? Like, oh, well, someone was really clever and set up this series of dominoes such that they knew that when we did this merge with that merge and that merge, it would mess something up. And if the only validation of a merge is a human being visually inspecting it and signing off, well, yes, that's easy enough to miss. Or if the validation of a merge is that it compiles and passes their apparently completely inadequate test suite, then that would let it go through too.

I think we all agree that it is entirely possible that the NSA did this and the government did this, because, like Marco said, this is something they do. But I think this is below the level of competence and sneakiness that I would expect from them. So I give it a less than 50% chance that it was done intentionally, a more than 50% chance that it was done unintentionally, and almost a 100% chance that the NSA both knew about it and exploited it.
I mean, you know, overall I agree, John. I agree that all of these factors could be explained away with reasonable, plausible explanations. It's just, when you add it all together... and again, if this had happened a year ago, before we knew so much from the Snowden leaks, before we knew all this stuff, a year ago I would have looked at this and thought, "Oh, well, it looks like somebody made a stupid mistake." But now that we know that this stuff happens, and because of how convenient it would be... you know, yeah, you're right that disabling these entire steps of SSL verification is pretty ham-fisted. However, they got it through, and it was there for over a year, right? So I'm sure they don't just try one thing, and maybe the other things they tried got caught or didn't ship.

- Or are still there.
- Yes, thanks. Sleep well tonight.
- Or are still there.

I'm sure they don't just leave themselves one option. Again, looking at just the fairly broad stroke that this bug used: for the same reason that you wouldn't rule out the timing, I wouldn't rule out the possibility of them doing this just because of how fairly ham-fisted it is, because in many other ways it's quite elegant in how innocent it looks and how hard it was to catch. And I'm sure they try many things, and some of them are ham-fisted and some of them are really clever, and the really clever ones, maybe they didn't work, or maybe they're still there, but this one we happened to find.

Well, none of us are ruling anything out. It's just that you're over 50% for thinking it was intentional, and I'm under. But that's basically it. We're all around the middle.

Yeah. And again, I'm not too far over 50%; I might say 60%, you know. But normally, with conspiracy theories, you've got to be like, "Well, there's just too many coincidences to believe your conspiracy theory. Too many coincidences would have to happen." This, I think, is subtly the other direction: you'd have to ignore a lot of coincidences that are the case to believe that this was totally innocent.

So do you think Apple will ever say anything publicly about the investigation that undoubtedly is taking place inside the company to determine the cause of this?

Oh, I really doubt that.

And the second question is: will they use their disclosure canary thing? You know, where in the previous disclosure they said, "We have not been contacted by government agencies to blah blah blah." And I forget what the word for that is; someone in the chat room will look it up. But they put that in there so that when you see that message disappear, you will know that they have been contacted by the government and told not to say anything about it. So that's, I guess, the thing we can actually watch for that's actionable. The next time they do one of those security disclosure statements, if the, whatever, canary statement is not there... again, we can't directly connect it to this incident. But at least, for example, if we see the statement again, we'll know that Apple had investigated it internally and that the... Like, I don't see... if it was the government, and they haven't been forced by the government not to disclose that it was the government, I don't see why they wouldn't make that public. Because they would basically be saying, "Hey, in essence, our government hacked us." They'd be angry, they'd talk to Congress about it, all this type of stuff, if they determined that to be the case. But if it was just an internal error, they probably won't say anything. And if the little canary statement is still there, then we also know that the NSA is not the one making them not say anything.

See, but I thought that the canary statement was more about getting at user data. I thought that the canary statement was something about how "we haven't received any requests from the NSA to do crap we didn't want to do." This would be a request to... "You're not allowed to say anything about the NSA mole that you discovered in your organization."

You're right, it's a different category. "We've never given them user data," and blah, blah, blah. It was a fairly wide-ranging statement. They can't anticipate what they may be forced not to say anything about, but I would imagine they would remove that. They would just simply not put that statement in there, because it's not an admission of anything; that's the point of the canary. You can just remove it, and that's their sort of signal to the outside world that government people have come and told us not to say anything, and we can legally just simply not say anything, and you can interpret that as a sign that we're being told not to say something about something.

Yeah, maybe, but it is two different things. Yeah, it says... I have it here; this is a thing from Matthew Panzarino. The very last line of Apple's report today states: "Apple has never received an order under Section 215 of the USA Patriot Act. We would expect to challenge such an order if served on us." Which to me sounds like something separate from what we're talking about.

Well, I mean, you'd have to look up what Section 215 of the Patriot Act says. Knowing the Patriot Act, it probably says the government can do whatever the hell it wants and you have no rights, in some sort of vague language that's broad. But I mean, that's all we've got. Because they can't go back in time and put in a canary about "We've never been infiltrated by the NSA; they've never added bugs to our code intentionally." And honestly, I don't know how they would ever determine that, because, like Marco said, even in the best-case scenario, where it actually is an individual developer, what are they going to do? Waterboard the guy? It could have been a legitimate mistake. They can ask him, "Did you put that there intentionally?" But if he did it intentionally, of course he's not going to tell you that he did. And you can't force him to tell you. And you'll just never know, because it is 100% plausible as a bug; people write bugs all the time, right? Lines get pasted twice, like Marco said. There is literally no way to force someone to tell you. You'll never know if that guy's telling the truth. You could torture him to death, and he dies having never said that he did it, and you still don't know whether he was lying or not.
00:25:27
◼
►
Let me tell you a story and then maybe Marco you can tell us about something sweet in my first job. I wrote uh,
00:25:37
◼
►
slot machines, which is a weird and odd story that's not worth explaining right now, but
00:25:42
◼
►
this was done in DOS using the Wacom C++ compiler.
00:25:47
◼
►
And because it was done in DOS, debugging in the traditional sense wasn't really a thing.
00:25:54
◼
►
So you basically had to print out a bunch of log statements and so on and so forth.
00:25:58
◼
►
Well, the machines that we built, and the software that we built, basically it had a
00:26:04
◼
►
a menu and then a series of different slot machine games. And what we were noticing all
00:26:10
◼
►
of a sudden is that after some arbitrary amount of time of going into a game and going to
00:26:15
◼
►
the menu, going to the game, going to the menu, going to the game, going to the menu,
00:26:18
◼
►
after like 30, 40, 50 times, all of a sudden we would get a hard crash. And we couldn't
00:26:23
◼
►
figure out what it was. And we had some really, really good C++ developers there. The team
00:26:29
◼
►
at that point was only like 20 people. But there were some really smart guys there.
00:26:33
◼
►
Now I'm straight out of college, so I don't know what the crap I'm doing.
00:26:36
◼
►
But after a while, it was on me to try to figure out—and a coworker, actually—to
00:26:43
◼
►
try to figure out where is this crash coming from?
00:26:47
◼
►
Why is it happening?
00:26:48
◼
►
And some of my much more experienced, much better coworkers had looked through diffs.
00:26:52
◼
►
They'd looked through check-ins.
00:26:54
◼
►
Nobody could figure out what it was.
00:26:57
◼
►
And eventually I figured it out.
00:26:58
◼
►
And I can't recall if I just spotted it or if I looked through the version history
00:27:04
◼
►
of all the files that had been changed lately.
00:27:07
◼
►
But what it ended up being was a fall-through in a switch statement.
00:27:13
◼
►
And in the switch statement, in each case, we were instantiating an object that was fairly large.
00:27:20
◼
►
I forget exactly what it was, but it was big.
00:27:22
◼
►
So what that means is we would create this new object and allocate a bunch of memory
00:27:28
◼
►
And then there was an accidental fall through that we didn't mean to have.
00:27:32
◼
►
And so we would create another one.
00:27:34
◼
►
And that first object, all that memory got leaked.
00:27:38
◼
►
And it took us forever to find it.
00:27:42
◼
►
It took seriously, I believe it was two weeks of myself and a coworker just digging through
00:27:47
◼
►
code for two straight weeks trying to figure out what it was.
00:27:50
◼
►
And I bring this up because it looked, aesthetically, very similar to this goto fail bug:
00:27:57
◼
►
what appeared on the surface to be a perfectly valid switch statement. And it just so happened
00:28:03
◼
►
that we had forgotten to put the word "break" with a semicolon after it. We were leaking
00:28:08
◼
►
memory and after 30 to 50 times going back out to the menu, that's what caused the error.
00:28:13
◼
►
So if you had Instruments, you would have known that. Because you could have run the
00:28:16
◼
►
little graph that shows the memory, you would have seen the leak.
00:28:19
◼
►
You're absolutely right. I know you're slightly being snarky, but that is absolutely true.
00:28:23
◼
►
And that's part of the reason why I think none of us are necessarily that excited to
00:28:26
◼
►
to get rid of Objective-C. But I bring this up because here was a situation where we had
00:28:32
◼
►
a handful of really good C++ developers. Now we didn't have a lot of process. You could
00:28:36
◼
►
say our methodology was not very good. But regardless, we didn't have a lot of process.
00:28:41
◼
►
But nevertheless, we had some really bright guys and girls going through this stuff and
00:28:45
◼
►
nobody could find it because it wasn't something that was visually, almost aesthetically obvious.
00:28:53
◼
►
And I see this as being a very, very, very similar situation.
00:28:58
◼
►
Just food for thought.
00:28:59
◼
►
There's examples of language misfeatures that lead to bugs.
00:29:02
◼
►
The language misfeature you cited is the fall through, or you need the break statement.
00:29:05
◼
►
The language misfeature, I think, in this bug was that you can have single clause conditionals
00:29:10
◼
►
without the braces that make it slightly less obvious.
00:29:13
◼
►
And some people say the language misfeature is that whitespace is insignificant, and the
00:29:17
◼
►
Python people will tell you how this will never happen in Python.
00:29:22
◼
►
All the talk about whether this is intentional or not, like bugs happen all the time that
00:29:25
◼
►
aren't security related, that are just plain bugs and cause things to crash.
00:29:30
◼
►
And language features do lead to more or fewer bugs.
00:29:35
◼
►
I would love to see some stats on like bugs that are attributable to human error that
00:29:43
◼
►
could potentially have been influenced by language features.
00:29:45
◼
►
So the single clause if thing I bet is probably pretty high up in any language that allows
00:29:51
◼
►
that type of thing. Because it's just so easy to accidentally put a line underneath it,
00:29:55
◼
►
or to indent things wrong in a misleading way.
00:29:59
◼
►
The fall through for case statements, forgetting the break,
00:30:03
◼
►
I mean, I can't count how many times I've done that. Usually it's so
00:30:07
◼
►
obvious because nothing works at all, but if you get unlucky enough and things happen to sort of
00:30:11
◼
►
appear to work, you won't notice it, because you just write it out and it looks
00:30:15
◼
►
all indented the way you want it to be, and you just forgot to put the word break. Who doesn't forget to put the word break?
00:30:19
◼
►
That's like a rite of passage when you learn C,
00:30:22
◼
►
you learn the case statement.
00:30:23
◼
►
You will forget to put in break.
00:30:25
◼
►
And your thing won't work right.
00:30:28
◼
►
Man, I found a nasty bug this week in my PHP framework.
00:30:31
◼
►
In my sort function-- so I have my model class,
00:30:35
◼
►
and I have a couple of convenient model sort functions
00:30:37
◼
►
to do common things.
00:30:39
◼
►
And one of them is if you have your array-- and PHP arrays
00:30:45
◼
►
are all hashes/dictionaries.
00:30:47
◼
►
it's all like, you know, arbitrary key to value.
00:30:50
◼
►
It can be strings or numbers.
00:30:52
◼
►
So the idea is if you have an array
00:30:53
◼
►
of numerically indexed models,
00:30:55
◼
►
and you want them to instead be indexed
00:30:57
◼
►
by one of the values on each one,
00:30:59
◼
►
like, say, ID to object,
00:31:01
◼
►
instead of just like zero through N,
00:31:03
◼
►
I had a function to assign the key of each object
00:31:10
◼
►
to be that property that you specify.
00:31:12
◼
►
And I had it work on the array in place.
00:31:15
◼
►
Oops. See the problem?
00:31:17
◼
►
Does PHP have defined semantics for what constructs allow you to modify the thing you're iterating
00:31:24
◼
►
over in place, and which ones don't?
00:31:28
◼
►
Not really. PHP, you can generally modify things as you iterate over them most of the
00:31:33
◼
►
time, and it usually works.
00:31:35
◼
►
It's kind of a problem that I have to say most of the time and usually there. But yeah,
00:31:41
◼
►
So as a result, as I was going through the numeric indexes, and I would say, "Alright,
00:31:48
◼
►
well, I have this property value of this one, so unset the numeric index that it was at
00:31:53
◼
►
and set it to the string index."
00:31:56
◼
►
So what happens when the value of one of those things is...
00:31:59
◼
►
What happens if the value of the one that was previously at id zero is two?
00:32:06
◼
►
Then you go to id one, do that, go to id two, and then that one gets unset.
00:32:10
◼
►
And so the resulting array can clobber certain elements based on their value and can actually
00:32:16
◼
►
have fewer elements in it than the input array.
00:32:20
◼
►
What I was getting at was, say you pulled off the first one and its ID value was 77
00:32:25
◼
►
and you shoved it into 77.
00:32:26
◼
►
Would you later on find yourself iterating over 77 because the iteration thing now sees
00:32:30
◼
►
a new entry down at 77?
00:32:32
◼
►
I think so, but the fact that I don't even know that for sure is a problem.
00:32:37
◼
►
Yeah, so it's like self-modifying code.
00:32:39
◼
►
Couldn't you, in theory, get yourself into an infinite loop where these things just keep
00:32:43
◼
►
getting shoved to the end and making new entries that you then iterate over that cause them
00:32:46
◼
►
to be shoved to the end at the end.
00:32:48
◼
►
All sorts of silliness.
00:32:50
◼
►
Yeah, it's crazy.
00:32:51
◼
►
And so yeah, that was an obvious, like, if you looked at the code it looked reasonable.
00:32:54
◼
►
Like, oh, of course.
00:32:56
◼
►
But until you thought about it, you're like, oh, wait a minute.
00:32:59
◼
►
And this was a utility function in my framework that's been there for about seven years.
00:33:06
◼
►
And so like, hmm, I wonder how many bugs that has caused.
00:33:10
◼
►
'Cause I don't use this function a lot,
00:33:12
◼
►
but when I do use it, and maybe all the time
00:33:15
◼
►
I've used it so far, like most of the time,
00:33:17
◼
►
it just never, it never had that situation,
00:33:19
◼
►
so I didn't notice it.
00:33:21
◼
►
But yeah, that was a problem.
00:33:23
◼
►
Anyway, we are sponsored this week,
00:33:25
◼
►
our first sponsor, 30 minutes in,
00:33:27
◼
►
our first sponsor is Picture Life.
00:33:30
◼
►
Now, we talked a while back, many times,
00:33:33
◼
►
about hosting your pictures online, photo storage,
00:33:36
◼
►
photo backups, stuff like that.
00:33:37
◼
►
So Picture Life is the one app you need
00:33:41
◼
►
for your photos and videos.
00:33:42
◼
►
Starting with seamless backup and deep integrations
00:33:45
◼
►
into iPhoto and Aperture.
00:33:47
◼
►
Picture Life auto organizes your photos
00:33:49
◼
►
and gives you the power to view
00:33:50
◼
►
and quickly search through them on any device.
00:33:53
◼
►
Picture Life's private sharing lets you easily control
00:33:56
◼
►
who sees which photos and their editor works
00:33:58
◼
►
on the web and iOS.
00:34:01
◼
►
Plans start at just $5 per month
00:34:03
◼
►
And ATP listeners get 30% off for life.
00:34:06
◼
►
Sign up at picturelife.com/ATP.
00:34:10
◼
►
They've never done this before, so really give this a shot.
00:34:13
◼
►
Make them love their sponsorship with us.
00:34:16
◼
►
You save 30% on the monthly fee for life.
00:34:19
◼
►
That's awesome.
00:34:20
◼
►
And so they have all sorts of cool features.
00:34:23
◼
►
They have a deep search, it's very, very powerful.
00:34:26
◼
►
It uses the face detection, all that other stuff
00:34:29
◼
►
that's really cool these days.
00:34:31
◼
►
They have apps for the Mac, apps for iOS,
00:34:34
◼
►
but they also even support Windows and Android.
00:34:36
◼
►
Android just launched in December
00:34:38
◼
►
and is very quickly coming up to speed.
00:34:40
◼
►
And this company was founded by people who really love photos.
00:34:45
◼
►
They love creativity and they love technology.
00:34:48
◼
►
Founders of this include Charles Forman, who I actually know,
00:34:51
◼
►
Charles Forman of OMG Pop, Jacob DeHart of Threadless,
00:34:55
◼
►
and Nate Westheimer of the New York Tech Meetup.
00:34:57
◼
►
I know him too.
00:34:59
◼
►
And this is backed by our VC friends at Spark Capital,
00:35:02
◼
►
and I know them too.
00:35:03
◼
►
They were the main VC behind Tumblr.
00:35:05
◼
►
So I'm very familiar with all these people.
00:35:06
◼
►
They're really good people.
00:35:07
◼
►
Anyway, Picture Life is the one app
00:35:10
◼
►
that your photos really need.
00:35:12
◼
►
So you can backup, search, edit, and share on Mac and iOS.
00:35:16
◼
►
Go to picturelife.com/atp, and you can get 30% off for life.
00:35:21
◼
►
Thanks a lot to Picture Life for sponsoring our show.
00:35:23
◼
►
- And if you don't want them to go away like Everpix,
00:35:25
◼
►
sign up, and unlike Everpix,
00:35:27
◼
►
they don't have an unlimited storage thing,
00:35:30
◼
►
so hopefully they'll have a more viable business model
00:35:33
◼
►
where you actually pay for what you use.
00:35:35
◼
►
- Yeah, it seems that way.
00:35:36
◼
►
Also, Charles Forman, like, that guy is a machine.
00:35:39
◼
►
Like, remember Draw Something, that was the big thing?
00:35:41
◼
►
That was like their big thing,
00:35:42
◼
►
but he had a site before that,
00:35:44
◼
►
I socialized with him a bit
00:35:47
◼
►
while he was working on that, through David Karp at Tumblr.
00:35:49
◼
►
We were all, we would go out to dinner
00:35:50
◼
►
a few times here and there.
00:35:52
◼
►
And that guy, he's incredibly smart.
00:35:55
◼
►
And he's just like a coding output machine.
00:36:00
◼
►
Like, I've rarely seen anybody be able to produce as much as he does, and he really,
00:36:06
◼
►
really knows his stuff.
00:36:07
◼
►
So I would certainly trust this.
00:36:09
◼
►
Now, to go back from before the sponsor break, it would be wrong of me not to mention that
00:36:15
◼
►
in Objective-C—I'm sorry, not Objective-C. I've been writing Objective-C today, which
00:36:20
◼
►
is why I'm all confused.
00:36:21
◼
►
In C#, there are interesting language features that prevent both of the bugs that Marco and
00:36:28
◼
►
I are talking about.
00:36:30
◼
►
Firstly, you can't have a fall-through in a switch statement unless that case is completely empty.
00:36:37
◼
►
So you would literally have one line, case 1, colon, the next line, case 2, colon.
00:36:42
◼
►
And if there's anything in between without a break statement, that's a compiler error.
00:36:46
◼
►
And the other thing is, if you try to modify pretty much any enumerable collection in place,
00:36:52
◼
►
I believe that's a runtime error, not a compiler error.
00:36:55
◼
►
And so that's just really nice ways
00:36:57
◼
►
to protect you from yourself, and that I really appreciate.
00:37:00
◼
►
Do I have to tell you that Perl would protect you
00:37:02
◼
►
from these things too, or do we just assume it now?
00:37:04
◼
►
Oh, Perl did it.
00:37:06
◼
►
Perl requires braces on single statement
00:37:09
◼
►
ifs, it doesn't let you do it without them.
00:37:11
◼
►
And that was intentional to avoid this feature,
00:37:13
◼
►
because the people who wrote Perl were writing Perl in C
00:37:16
◼
►
and hated that.
00:37:17
◼
►
And there's no switch statement, so problem solved there.
00:37:20
◼
►
Well, there is a terrible deprecated one
00:37:22
◼
►
that's part of a CPAN module, but it doesn't count.
00:37:24
◼
►
It's not part of the language.
00:37:25
◼
►
And the-- what was the other one?
00:37:28
◼
►
Oh, modifying sets.
00:37:30
◼
►
If you iterate over the keys of an associative array,
00:37:33
◼
►
you can actually modify the array
00:37:35
◼
►
because it gets the key list ahead of time
00:37:37
◼
►
and it doesn't make reference to the thing.
00:37:40
◼
►
Unfortunately, if you get the keys and values,
00:37:42
◼
►
then it does the good old PHP way where it's just madness.
00:37:45
◼
►
But there are other reasons not to do--
00:37:48
◼
►
should just deprecate getting the keys and values at the same time anyway because, I
00:37:52
◼
►
don't want to go into Perl details, but suffice it to say they have to keep
00:37:57
◼
►
an iterator value on a per-variable basis and that just leads
00:38:01
◼
►
to more madness, so that should be deprecated but isn't yet.
00:38:05
◼
►
This is one of those things too, like, you know, the go-to-fail bug, there's actually
00:38:10
◼
►
a compiler warning that warns on unreachable code blocks. And so if you have a function
00:38:17
◼
►
that contains like, you know, return zero and then a bunch of lines of code below it,
00:38:21
◼
►
well those lines will never be reached because that return statement, that unconditional
00:38:24
◼
►
return will execute and then everything else in the function will never be reached.
00:38:29
◼
►
The goto fail thing, again: if you have this unconditional goto statement, which
00:38:33
◼
►
is what the bug line was, that skips over the big block of the function, then that code
00:38:38
◼
►
is unreachable. And so there's a compiler warning for that in, I don't know if
00:38:42
◼
►
it's in GCC, but it's at least in LLVM.
00:38:45
◼
►
And the problem is, it's not part of the -Wall option
00:38:49
◼
►
that a lot of nerds use.
00:38:51
◼
►
It's not part of the all warnings default set,
00:38:54
◼
►
because I think, I was reading a little bit about this,
00:38:57
◼
►
I think the main reason why is because there's a lot of
00:39:01
◼
►
libraries and stuff that it would fail on
00:39:04
◼
►
for various reasons, and so, you know,
00:39:06
◼
►
it's always a tricky balance with warnings
00:39:09
◼
►
when you're striking that.
00:39:10
◼
►
Like I recently, in my PHP framework,
00:39:13
◼
►
I always had for the last few years,
00:39:16
◼
►
I've used what I call strict development mode,
00:39:18
◼
►
which is when you're in the development environment,
00:39:21
◼
►
everything, even the notice,
00:39:23
◼
►
the lightest level of PHP warnings,
00:39:26
◼
►
everything became an exception
00:39:28
◼
►
because I don't want my code to ever emit a notice
00:39:30
◼
►
and I don't care if it's gonna be littered
00:39:32
◼
►
with isset() calls all over the place,
00:39:35
◼
►
I don't care, everything,
00:39:36
◼
►
there should be no errors in development.
00:39:38
◼
►
And I recently, just even a couple of weeks ago I think it was, I recently decided, you
00:39:42
◼
►
know, why shouldn't that also apply to production?
00:39:45
◼
►
If I'm throwing exceptions on any minor error in development, that for, I think, good reasons,
00:39:52
◼
►
why should I be more lenient in production?
00:39:54
◼
►
In reality, if things are failing anywhere, I want to know about that so I can stop it,
00:39:59
◼
►
so I can fix it, so I can do the right thing.
00:40:00
◼
►
There's a reason it shouldn't apply in production if you don't control your own service, which
00:40:04
◼
►
a lot of people don't.
00:40:05
◼
►
like they're sort of deploying to,
00:40:07
◼
►
they have some baseline they need to support
00:40:09
◼
►
for their deployments,
00:40:10
◼
►
but they don't control every single detail.
00:40:12
◼
►
Like they don't have their own machines basically.
00:40:13
◼
►
They're not the one who installed PHP, for example.
00:40:15
◼
►
They're not the one controlling Apache
00:40:16
◼
►
and the upgrade cycle and stuff like that.
00:40:18
◼
►
And you don't want to run with all your warnings turned on,
00:40:20
◼
►
especially with warnings turned into fatal errors
00:40:22
◼
►
in that situation,
00:40:23
◼
►
because someone will do a minor point upgrade
00:40:25
◼
►
to Apache PHP or some other thing,
00:40:27
◼
►
which will suddenly cause warnings where once there were none
00:40:29
◼
►
and then your production is down
00:40:30
◼
►
because of something you didn't do.
00:40:31
◼
►
But in your case, since you control all of the,
00:40:33
◼
►
you control the version of everything,
00:40:35
◼
►
it's not going to get upgraded behind your back, so it is slightly more reasonable to do that.
00:40:39
◼
►
I think I would disagree on that. I think I would want that to break, because that's
00:40:43
◼
►
a problem that you should know about immediately. And if the people who are controlling the
00:40:48
◼
►
servers do any kind of testing, like if they maybe deploy it on a development server, or
00:40:55
◼
►
if you deploy directly to production with stuff like PHP updates, at least do it on
00:40:59
◼
►
one server first and see what happens, at least, if you're going to be that sloppy.
00:41:03
◼
►
And in all those cases, I'd rather the app actually crash and burn immediately.
00:41:09
◼
►
It's like the "fail early and completely" or "often" whatever the statement is.
00:41:14
◼
►
The warnings are always going to be something like, "This language feature is going to be
00:41:17
◼
►
deprecated sometime in the next two years, so you should stop using it."
00:41:20
◼
►
And it's like, that should not cause your app to go down in production.
00:41:23
◼
►
If that did, you'd be mad that it went down because you'd be like, "That's not important
00:41:27
◼
►
enough for production to stop working."
00:41:29
◼
►
And you'd be mad because they upgraded something and it broke your stuff or whatever.
00:41:33
◼
►
Historically, in my working career, the reason warnings get turned off in production is because
00:41:38
◼
►
it relates to this reason, even sometimes when we do control the entire stack, merely because
00:41:42
◼
►
a different department in the same company controls the upgrade cycle and stuff like that,
00:41:47
◼
►
and the other department doesn't want that department possibly screwing them up by changing
00:41:53
◼
►
something that causes a warning that is totally bogus and immaterial and stupid and really does
00:41:59
◼
►
not have anything to do with the functioning of the application. It's practically like the developers
00:42:02
◼
►
just like waving at you and saying, "We made a new message here! Look at our message! Hi,
00:42:06
◼
►
how you doing? Message, message!" And you're like, "I don't want my program to stop working
00:42:10
◼
►
because of that. I'll see it, fine, we'll go and patch up that thing so it doesn't emit
00:42:14
◼
►
the warning anymore. It's not like it'll be invisible, but turning it into a fatal error
00:42:18
◼
►
in production is usually a bridge too far."
00:42:20
◼
►
I still disagree. I see your point. I don't think that's a good enough reason.
00:42:26
◼
►
Well, for a one-man shop that controls everything, yeah, that's fine. But things get much more
00:42:30
◼
►
complicated as the organizations get bigger and things get farther away from the control
00:42:35
◼
►
of the people writing the code.
00:42:37
◼
►
Yeah, I completely agree with Jon.
00:42:39
◼
►
I think that comes from the fact that Jon and I have real jobs and it's just you by yourself.
00:42:46
◼
►
So it's not easy to deal with those sorts of issues.
00:42:51
◼
►
And even in consulting, it's an even finer line because a lot of times, myself and my
00:42:57
◼
►
team will build something, hand it off to the client, and then walk away. And so in
00:43:04
◼
►
many cases, the company for which we've built something may or may not really have the talent
00:43:11
◼
►
in house to figure out when some esoteric warning happens and what to do about it. And
00:43:17
◼
►
we're not necessarily contracted with this client anymore. So they're on their own.
00:43:22
◼
►
And not to say that we, you know, silently squash all exceptions or anything like that,
00:43:26
◼
►
for non-fatal things, oftentimes it's not in our or our client's best interest to
00:43:32
◼
►
cause a ruckus over those sorts of things.
00:43:35
◼
►
The C toolchain is better about doing this,
00:43:38
◼
►
which is basically why -Wall doesn't print so many errors, because whoever defined -Wall way back when,
00:43:44
◼
►
now no one can change it. Because if you did, you were like, "Oh, this compiled cleanly,
00:43:47
◼
►
and now it doesn't. Your compiler is broken," right? That's why there's -Weverything,
00:43:50
◼
►
because -Wall is now historical baggage like that.
00:43:54
◼
►
Whereas in the much looser, higher level languages
00:43:58
◼
►
and the fancier tools, people have no problem
00:44:01
◼
►
throwing out the next version, the next minor version
00:44:03
◼
►
of Ruby or Node or something like that
00:44:05
◼
►
and adding a bunch of warnings
00:44:06
◼
►
because they're trying to influence the people
00:44:09
◼
►
who are using the language,
00:44:09
◼
►
because they're trying to warn about deprecating features
00:44:11
◼
►
because they think they've decided
00:44:13
◼
►
that this could potentially be a problem.
00:44:16
◼
►
Because their warnings are like,
00:44:17
◼
►
we don't know this is wrong,
00:44:18
◼
►
otherwise we wouldn't have compiled it. But there might be something you might want to look at here,
00:44:23
◼
►
and this may be a new warning so you don't have a statement above it that says don't warn me about
00:44:27
◼
►
that thing. Like you were saying with the isset, but there's lots of other things you could
00:44:30
◼
►
like put pragmas and lexically scoped warnings, restrictions to say look I know normally a
00:44:35
◼
►
warning would be emitted here, but I know what I'm doing, let this thing go past, and you write that
00:44:39
◼
►
right into the code. Well you can only do that for the warnings you know about, and although your C
00:44:44
◼
►
compiler is not going to suddenly add a bunch of things to -Wall, lots of other
00:44:48
◼
►
languages and open source projects that are newer and moving much faster.
00:44:52
◼
►
will have no problem adding crap like that.
00:44:54
◼
►
And eventually just the fatigue of the organization keeping up with these things
00:44:57
◼
►
is like, you know, politically speaking, they'll be like, can't we just not make
00:45:01
◼
►
the warnings into exceptions in production and just look at the logs.
00:45:04
◼
►
And when we see new warnings, fix them, that will very quickly go through.
00:45:07
◼
►
Once production is down a few times and bosses and bosses' bosses are yelling
00:45:11
◼
►
down at development asking why production was down.
00:45:13
◼
►
And at that point, Marco will have a much harder time explaining to his boss or his
00:45:16
◼
►
boss's boss, why it actually really is a good thing that production went down because now
00:45:21
◼
►
we know about this failure right away.
00:45:23
◼
►
See, I think this is a lot like ending a Lisp program with a bunch of closing parentheses,
00:45:27
◼
►
you know, just in case you make a mistake, which was actually the recommended thing in
00:45:31
◼
►
my Comp Sci 201 or whatever class when we were learning Lisp. The professor actually instructed
00:45:37
◼
►
us to put a bunch of closing parentheses at the end of the file to make it easier. He
00:45:42
◼
►
might have been joking. But basically, from a practical perspective, if your company is
00:45:47
◼
►
a 24/7 online company and they lose, let's say, $5,000 in revenue for every 60 seconds
00:45:52
◼
►
your servers are down, it is much harder to make the argument that Marco has made. It
00:45:57
◼
►
really depends on the situation. And some reasons are stupid, like institutional reasons
00:46:02
◼
►
where there's kingdoms within the company that are fighting each other and the developers
00:46:07
◼
►
are distant from the code and they don't control the deployment. Those are sicknesses within
00:46:11
◼
►
the company, but then there are legitimate reasons like, well, say it is a company where
00:46:14
◼
►
everybody works together, but you know you lose X amount of dollars for every minute
00:46:17
◼
►
the server is down. It only takes one night of that happening to equal one developer's
00:46:23
◼
►
salary that they could have had for the thousands of dollars they lost for their servers being
00:46:28
◼
►
down during that time. And so, yeah, sometimes you just, practically speaking, don't have
00:46:33
◼
►
the luxury of turning all warnings into exceptions.
00:46:35
◼
►
See, I think this is a... everything you've just said, like, you know, where it's like
00:46:39
◼
►
really critical that if this happens in production it's really a big problem, that's all the
00:46:43
◼
►
more reason why you shouldn't be reckless deploying updates in production to your critical
00:46:48
◼
►
code like your language interpreter. Like, if you're deploying a new version of PHP to
00:46:54
◼
►
your production servers that you've never tested your code on in development, and you
00:46:58
◼
►
have the kind of situation where you're going to lose tons of money every minute that your
00:47:03
◼
►
site is down in the middle of the night because you updated it in the middle of the night
00:47:05
◼
►
and you have to wake up your programmers.
00:47:09
◼
►
To me, that's like a pretty...
00:47:12
◼
►
There's a number of things wrong with that.
00:47:14
◼
►
And it's not your warning level.
00:47:17
◼
►
Dynamic languages can emit unexpected warnings not just because they upgraded the version
00:47:21
◼
►
of the dynamic language interpreter, but simply because that code path didn't have coverage.
00:47:24
◼
►
And it's like they're basically runtime warnings.
00:47:29
◼
►
That exists.
00:47:30
◼
►
in dynamic languages, especially for web programming,
00:47:32
◼
►
that if you don't execute that code path,
00:47:34
◼
►
that warning will never be emitted.
00:47:36
◼
►
But if you have all warnings immediately
00:47:37
◼
►
become fatal exceptions, and in production,
00:47:39
◼
►
someone actually hits that code path,
00:47:41
◼
►
no one upgraded anything, but your server still died
00:47:45
◼
►
because that turned into a fatal exception.
00:47:48
◼
►
- There's always, between, with any kind of non-error,
00:47:55
◼
►
any kind of warning or notice,
00:47:57
◼
►
there's this balance that you're striking
00:47:59
◼
►
between convenience and easiness and tolerance of edge cases versus trying to be correct
00:48:09
◼
►
all the time. And it's kind of like security. There's a balance you have to strike between
00:48:14
◼
►
convenience and ease and the right thing. And if you're to the point where you're
00:48:22
◼
►
permitting lots of warnings to happen in production unexpectedly, I think that's a sign that
00:48:28
◼
►
something's wrong. Well, you're not permitting them, you just don't want them to be fatal errors.
00:48:31
◼
►
You'll address them as soon as you see them. Because a lot of them could be data-driven,
00:48:34
◼
►
for example. Value comes in and it's undefined. And how is that value undefined? It's okay for it
00:48:39
◼
►
to be undefined, but there's some warning that if you use an undefined value and there's one
00:48:42
◼
►
function that says this is unexpected, and you totally thought it should be expected,
00:48:45
◼
►
someone forgot to put, "Please don't give me warnings about undefined values when I pass this
00:48:49
◼
►
in because it's okay for it to be undefined." But you never hit that code in your testing,
00:48:52
◼
►
and it gets emitted. Like, warnings aren't necessarily... I know people like to turn them
00:48:56
◼
►
all on in development so they can just get things clean because it's much easier to verify
00:49:00
◼
►
that there's nothing there because once you let any leak through then it becomes an avalanche
00:49:03
◼
►
and you start ignoring them. But in production, enabling these things isn't necessarily saying
00:49:10
◼
►
"we are telling you there's something wrong with your code." In fact, almost all the time
00:49:14
◼
►
it's not telling you there's something wrong with your code and it's just really hard to
00:49:17
◼
►
be at the whim of these messages that don't actually tell you anything useful about your code
00:49:24
◼
►
in production, turning them into fatal errors. Leave them on in production, log them, immediately
00:49:27
◼
►
address each one of them so that you get the volume of those warnings down to zero again,
00:49:32
◼
►
but turn them into fatal exceptions? Like I said, I think that's too much.
00:49:36
◼
►
But you see, I think that, and people in the chat have also suggested, like, you know,
00:49:39
◼
►
you can, you should just log them and then, you know, have a policy to deal with them.
00:49:43
◼
►
I think in reality, that's much more likely to just get ignored, or to be, "Oh well,
00:49:48
◼
►
it only happens once a week for a few hours, or we've only ever seen this message five
00:49:55
◼
►
or six times, so we'll just ignore it." And I think that's the wrong approach for a lot
00:49:59
◼
►
of situations.
00:50:00
◼
►
But you don't need to ignore it. You'll know exactly what line it came from. You can get
00:50:04
◼
►
a stack trace with it. It's usually so easy to address because you'll immediately look
00:50:08
◼
►
at the warning, look at the line of code, and say, "Is this an acceptable condition
00:50:12
◼
►
or not?" If it's an acceptable condition, you just put in the pragma that says, "Don't
00:50:14
◼
►
emit that warning from this line anymore," done and done.
00:50:17
◼
►
If it's not an acceptable condition, then congratulations, you've been alerted to a
00:50:20
◼
►
potential bug and you need to change your code.
00:50:22
◼
►
Of those two situations, one is dispensed with in two seconds, and so there's not a barrier
00:50:26
◼
►
to entry to that, and the second one is you found a legitimate bug, and I think developers
00:50:30
◼
►
will want to fix that as well.
00:50:32
◼
►
I don't think people will ignore them, especially if you're running all development in the fatal
00:50:35
◼
►
errors mode, because that will ensure that your volume is zero in development.
00:50:39
◼
►
It's just when you get to production that you want to crank it back one notch, basically
00:50:43
◼
►
to make yourself not go down for reasons that you wouldn't want to be down.
00:50:46
◼
►
But again, I think the human behavior, like the reality of human nature is such that if
00:50:54
◼
►
you tolerate those warnings in production, even if the policy says you shouldn't, in
00:50:59
◼
►
practice that's going to lead to a lot of messy code staying there indefinitely. Whereas
00:51:05
◼
►
if it actually breaks, it forces you to fix it.
00:51:09
◼
►
Organizations are good at one thing: making policies, annoying policies. So I actually have
00:51:15
◼
►
more faith in a large corporation's ability to make a stupid policy, or not stupid, a policy
00:51:21
◼
►
that requires zero warnings, one that people don't like following because it's human nature to not
00:51:25
◼
►
want to deal with those things and to enforce it. It goes the other way. The policy that says you
00:51:30
◼
►
got to have all warnings on when you build your application, a lot of individual developers won't
00:51:36
◼
►
like that because they find it tedious to go trace down all those errors and everything like that.
00:51:39
◼
►
Indie developers tend to have to psych each other up to say,
00:51:42
◼
►
"I know I should be running with -Weverything, but I'm not."
00:51:45
◼
►
And they kind of have to encourage each other to do that
00:51:48
◼
►
because they know it's good for them,
00:51:48
◼
►
but it's human nature not to want to do that.
00:51:50
◼
►
- It's like flossing.
00:51:52
◼
►
- In an organization, you have someone two levels up
00:51:54
◼
►
who doesn't have to touch the code,
00:51:55
◼
►
who can just force everybody to do that
00:51:56
◼
►
and make it a policy.
00:51:57
◼
►
Like I think as Casey has said this many times in the past,
00:52:00
◼
►
and it's true of anyone who works in big companies,
00:52:02
◼
►
whenever anything goes wrong,
00:52:03
◼
►
someone wants to make a policy
00:52:04
◼
►
to prevent it from ever happening again,
00:52:05
◼
►
which eventually leads to a gigantic web of policies
00:52:09
◼
►
that paralyzes the company and makes them stupid and dumb.
00:52:11
◼
►
But like I said, that's the one thing they're good at
00:52:13
◼
►
is this bad thing happened, make a policy
00:52:16
◼
►
so it doesn't happen again,
00:52:17
◼
►
and they would institute the policy.
00:52:19
◼
►
We must have zero warnings.
00:52:20
◼
►
By the Tuesday after the build,
00:52:22
◼
►
all warnings must have tasks assigned for them
00:52:24
◼
►
with due dates, and that is, I think,
00:52:26
◼
►
something that big organizations are good at.
00:52:29
◼
►
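The "zero warnings by Tuesday" style of policy is usually enforced mechanically, for example a CI step that scans the build log and fails on any hit. A toy sketch (the function name is invented, and the sample log mimics clang's output format):

```python
import re

def count_warnings(build_log: str) -> int:
    """Count compiler warnings in a build log; a CI gate would fail on any."""
    return len(re.findall(r"warning:", build_log))

# A fabricated two-line log in clang's style: one warning, one note.
log = """\
main.m:12:9: warning: unused variable 'x' [-Wunused-variable]
main.m:40:1: note: expanded from macro
"""

count = count_warnings(log)
print(count)  # prints: 1
```

A real gate would then exit nonzero when the count is above zero, which is the automated version of someone two levels up making it a policy.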
- Right, and I think, Marco,
00:52:30
◼
►
what you're maybe losing sight of is that
00:52:34
◼
►
A lot of times, having a production failure, it just isn't an option, even for seconds.
00:52:42
◼
►
And Jon was alluding to this earlier, but you're coming from the position, as you
00:52:46
◼
►
should, of someone who is not only the peon coder, but is also the boss.
00:52:53
◼
►
And so if Overcast is out in the wild and it goes down, who do you have to answer to?
00:52:59
◼
►
Everyone will know why now.
00:53:00
◼
►
Yeah, exactly.
00:53:02
◼
►
time zone set.
00:53:04
◼
►
You know, you're modifying another collection while iterating over it.
00:53:07
◼
►
Well, now we at least know that you shouldn't use my framework if you have this situation.
00:53:12
◼
►
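Mutating a collection while iterating over it, the example Casey reaches for, is exactly the kind of condition where a loud failure beats silent corruption (Cocoa's fast enumeration throws in this situation). A minimal Python sketch of the same failure mode, with invented names:

```python
# Mutating a dict while iterating over it is detected and fails fast,
# rather than silently corrupting the traversal.
counts = {"a": 1, "b": 2}

def prune_wrong(d):
    for key in d:            # iterating the live dict...
        if d[key] < 2:
            del d[key]       # ...while mutating it: raises RuntimeError

def prune_right(d):
    for key in list(d):      # snapshot the keys first, then mutate safely
        if d[key] < 2:
            del d[key]

try:
    prune_wrong(dict(counts))
    failed_fast = False
except RuntimeError:
    failed_fast = True

pruned = dict(counts)
prune_right(pruned)
print(failed_fast, pruned)  # prints: True {'b': 2}
```

The fail-fast version tells you immediately which line is wrong; the silent version would be the once-a-week mystery you end up ignoring.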
But you know what I mean?
00:53:13
◼
►
Like, you have to answer—yes, you have to answer your customers, and that is kind of
00:53:16
◼
►
crummy, but your customers can't fire you.
00:53:18
◼
►
They could walk away and, okay, we can go down that rabbit hole if you really want to,
00:53:22
◼
►
but in a direct sense, they can't fire you.
00:53:26
◼
►
They can't not give you a bonus that year.
00:53:30
◼
►
They can't empirically hurt you.
00:53:33
◼
►
And it's very different for, I think I speak for Jon in saying it's very different for
00:53:36
◼
►
him and me, because if we make some sort of decision that we think that dying in production
00:53:43
◼
►
is better, somebody many, many, many rungs up the ladder from us may not agree with that.
00:53:50
◼
►
And I guarantee you, the bigger the company, the more, well, in my experience, the bigger
00:53:56
◼
►
the company the happier they are to find a head to chop off and let roll.
00:53:59
◼
►
Well, the big thing is you will not be able to convince them, even if you're 100% right,
00:54:03
◼
►
because there are many situations where you really are right. And this was like,
00:54:07
◼
►
ignore whether we're talking about warnings, like this was a legitimate reason. It is actually
00:54:10
◼
►
better that this happened than it not happened. Good luck convincing somebody three levels up
00:54:14
◼
►
in the org chart of that, even if you are just so right, and that everybody who is a sibling to you
00:54:19
◼
►
in the org chart and below you agrees, and they all sign a petition, and they all get out in the
00:54:22
◼
►
parking lot and say, "We're right." The COO or the CTO, and certainly the CEO: you will
00:54:28
◼
►
not convince them." And that's the harsh world that we live in.
00:54:31
◼
►
John: Let me challenge our listeners here. I'm honestly curious. I would like to know
00:54:36
◼
►
if you work at a company that has a big, important online infrastructure, Amazon, Google, stuff
00:54:47
◼
►
like that. Big important companies where you do things well online and it's really important
00:54:53
◼
►
that they stay up. I'm curious, what is your policy? What is your company's policy?
00:54:57
◼
►
And you can write us on the feedback form anonymously, you can use the throwaway Twitter
00:55:01
◼
►
accounts. We don't need this to be on the record. I'm just curious to know, what do
00:55:07
◼
►
the big companies do in reality where this stuff really does matter and where they are
00:55:11
◼
►
technically, generally well-run companies?
00:55:16
◼
►
I think that at that scale it's kind of different because I think Google and Amazon and Netflix
00:55:20
◼
►
and stuff like that have a design for expected failure.
00:55:23
◼
►
So they have to have basically instead of an organism that is like clean and running,
00:55:27
◼
►
they have to make one where cell death is expected and they just have little busy robots
00:55:31
◼
►
going through and cleaning up the dead cells and stuff.
00:55:33
◼
►
So that's a little bit different.
00:55:35
◼
►
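The "cell death is expected, little busy robots clean up" architecture can be caricatured as a supervisor loop that restarts workers instead of treating any single death as catastrophic. A toy sketch (the names and restart budget are invented for illustration):

```python
def run_with_supervision(task, max_restarts=5):
    """Toy supervisor: individual runs are *expected* to die; log and restart."""
    restarts = 0
    while restarts <= max_restarts:
        try:
            return task()
        except Exception as exc:
            restarts += 1
            print(f"worker died ({exc!r}); restart {restarts}/{max_restarts}")
    raise RuntimeError("restart budget exhausted")

attempts = 0

def flaky_worker():
    # Simulates a worker that dies twice before succeeding.
    global attempts
    attempts += 1
    if attempts < 3:
        raise ConnectionError("simulated cell death")
    return "ok"

result = run_with_supervision(flaky_worker)
print(result)  # prints two "worker died" lines, then: ok
```

Real systems at that scale layer health checks and replacement on top of this, but the core design choice is the same: assume failure, contain it, and keep the organism alive.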
And I'm thinking of like the medium term ones where you're not that big.
00:55:38
◼
►
Every human in the world is not hitting your server.
00:55:41
◼
►
But it has to be up.
00:55:42
◼
►
It just absolutely has to be.
00:55:43
◼
►
like stock trading or banking things, where there's only a few computers in this network, and
00:55:48
◼
►
they're directly hacked into by crazy fiber optic wires, and they're doing high frequency
00:55:51
◼
►
trading, or they're doing something like that, or they're doing bank transfers, but these
00:55:54
◼
►
three or four computers have to always be up, otherwise people lose millions of dollars,
00:56:00
◼
►
But again, like, that's more reason why you should be really careful what code runs on
00:56:04
◼
►
them and why you probably shouldn't ignore a warning.
00:56:07
◼
►
Well, you're not ignoring it, we're just going around in circles. It's like, it's the question
00:56:10
◼
►
of do you want something that doesn't necessarily have to be fatal to be fatal or to cause a piece
00:56:15
◼
►
of work to be put into someone's bin to fix that with a deadline because that's the policy and so
00:56:19
◼
►
on and so forth. But anyway, people can send us feedback on whatever they think the policy is.
00:56:21
◼
►
I think you'll be horrified to find that people don't enable warnings, period, even in development.
00:56:26
◼
►
Forget about ignoring them. They're not even enabled. Like if we actually took a mass survey,
00:56:31
◼
►
what we'd find out is there'd be a lot of, "Warnings? No, we turn those off. Those annoy us."
00:56:35
◼
►
Yeah, you're absolutely right. You're absolutely right. And the other thing I should point
00:56:38
◼
►
out is that you saying, Marco, that you only care—well, I shouldn't say that. The way
00:56:43
◼
►
you phrased the question was if your livelihood, your company's livelihood is based on some
00:56:49
◼
►
sort of online service or website or whatever, but what you're losing sight of is a lot
00:56:54
◼
►
of times my clients at the job in which I work, we oftentimes, but not always, do like
00:57:00
◼
►
a corporate intranet, you know, the same sort of thing that igloo does. And a lot of times
00:57:05
◼
►
we're told this cannot go down, when in reality, nothing will go broken if the intranet
00:57:13
◼
►
is down, you know what I mean? So a lot of times from on high, they say this intranet
00:57:18
◼
►
will always be up. But really, it doesn't matter. And so we have to make decisions as consultants
00:57:25
◼
►
based on the requirements from the client, even if we think it's bull. And I think that John
00:57:30
◼
►
was alluding to that earlier. And so a lot of times, even if you could say, "Well, it
00:57:35
◼
►
doesn't really matter if this goes down," a lot of times clients will say, "Well, it
00:57:38
◼
►
better friggin' be up," or, "We're coming and calling you guys and we're gonna be
00:57:42
◼
►
pissed." On a happier note, you want to tell me about something cool?
00:57:46
◼
►
Yeah, and before—one final thing, sorry—that I think that you're right that for your
00:57:53
◼
►
situation where, you know, if it's a consulting gig where you have to build some system for
00:57:58
◼
►
somebody and then effectively it's going to have zero maintenance for a while, whether
00:58:02
◼
►
that's weeks or years or decades, like then I think that's a different environment where
00:58:09
◼
►
you expect the software to just tolerate existing and whatever upgrades happen on its server,
00:58:14
◼
►
which probably won't even be a whole lot, honestly, but in reality, let's be realistic,
00:58:18
◼
►
it probably won't even be kept secure. But this thing has to operate with no programmer intervention indefinitely. Then
00:58:24
◼
►
it's a different story. Then I completely agree that you're already doing something
00:58:29
◼
►
that's, by software development standards, pretty bad, in that you're going to have
00:58:35
◼
►
unmaintained software in production use for a long time. But in reality that happens all
00:58:40
◼
►
the time. And so you're right that in practice in a lot of places you have to accommodate
00:58:45
◼
►
for that, but that's typically not the kind of place, as you just said, it's not the kind
00:58:49
◼
►
of place where it's super important that there be no technical errors, you know, all the
00:58:55
◼
►
time. So anyway.
00:58:56
◼
►
All right. We are sponsored, sorry, we are sponsored this week also, once again by our
00:59:02
◼
►
friends at Help Spot. Are you still using email clients for customer support, Casey?
00:59:08
◼
►
I do customer support? I mean, yes. Yes, I am.
00:59:12
◼
►
You're probably losing track of important tickets.
00:59:15
◼
►
- Trying to use Mark as Unread as an organizational tool,
00:59:18
◼
►
and IMing coworkers to see who's working on what.
00:59:21
◼
►
That's ridiculous.
00:59:22
◼
►
It's time to get organized.
00:59:24
◼
►
Most help desk software tries to be all things
00:59:26
◼
►
to all people.
00:59:27
◼
►
Infinite feature creep, these huge complex messes.
00:59:31
◼
►
Help Spot is focused.
00:59:32
◼
►
It deals only with customer inquiries
00:59:34
◼
►
and self-serve knowledge bases.
00:59:37
◼
►
There's no built-in asset management or password resetting
00:59:39
◼
►
or other unnecessary features to get in your way
00:59:41
◼
►
or require complex integration work
00:59:44
◼
►
with your application or your infrastructure.
00:59:46
◼
►
Helpdesk software is also usually really expensive.
00:59:49
◼
►
A lot of them are like around $600 per user per year,
00:59:52
◼
►
which is really high.
00:59:54
◼
►
HelpSpot is just $299 per user, once.
00:59:58
◼
►
You own it for life, it's not per month,
01:00:00
◼
►
it is $299 per user, one time.
01:00:03
◼
►
You own it for life.
01:00:04
◼
►
And there's no lock-in.
01:00:05
◼
►
You can download the software and host it yourself,
01:00:08
◼
►
or you can have it hosted for you.
01:00:10
◼
►
Either way, even if they host it for you,
01:00:12
◼
►
you always have access to the database
01:00:14
◼
►
that you can directly query, export, and take elsewhere.
01:00:18
◼
►
HelpSpot is not some new startup.
01:00:19
◼
►
They've been available for nearly a decade
01:00:22
◼
►
and they've been adopted by thousands of companies
01:00:24
◼
►
and organizations.
01:00:25
◼
►
Customers from single person startups
01:00:27
◼
►
to Fortune 500 companies all use HelpSpot
01:00:29
◼
►
to manage their support teams.
01:00:31
◼
►
So you can start a free trial today.
01:00:33
◼
►
Go to helpspot.com/atp.
01:00:36
◼
►
And then when you're ready to buy it,
01:00:37
◼
►
if you use coupon code ATP14 for the year of '14,
01:00:41
◼
►
Code ATP14 will save you $100 off your already very well-priced purchase.
01:00:47
◼
►
So thanks a lot to HelpSpot for sponsoring our show once again.
01:00:50
◼
►
Remember, go to helpspot.com/ATP and use coupon code ATP14 for $100 off.
01:00:57
◼
►
So here's the thing that we're kind of dancing around, and I think it might be time to pull
01:01:02
◼
►
the band-aid off.
01:01:03
◼
►
No, I wanted to do the script notes thing.
01:01:06
◼
►
I thought that's what you were talking about, because it's perfect.
01:01:08
◼
►
It's exactly what you were just talking about.
01:01:10
◼
►
I know what he wants. He wants to do software methodologies. He always does.
01:01:13
◼
►
We kind of touched on it a little bit, but we don't have time for Script Notes.
01:01:16
◼
►
You don't want to save it for after the second sponsor?
01:01:21
◼
►
It's happening. I hate to break it to you. It's happening.
01:01:25
◼
►
I've listened to that Script Notes podcast for nothing.
01:01:27
◼
►
Well, so did I!
01:01:29
◼
►
I even listened to the follow-up today, just to make sure I had it done for this episode.
01:01:32
◼
►
Gentlemen, relax. This is the way it's going to have to be. Daddy Casey has spoken.
01:01:37
◼
►
We are also sponsored this week by Squarespace.
01:01:39
◼
►
- You're such a jerk.
01:01:40
◼
►
Don't even stop.
01:01:42
◼
►
- The all-in-one platform that makes it fast and easy
01:01:44
◼
►
to create your own professional website
01:01:45
◼
►
or online portfolio.
01:01:47
◼
►
For a free trial and 10% off, go to squarespace.com
01:01:50
◼
►
and use offer code Casey.
01:01:54
◼
►
- Squarespace is always improving their platform
01:01:56
◼
►
with new features, new designs, and even better support.
01:01:59
◼
►
They have beautiful designs for you to start with
01:02:01
◼
►
and all the style options you need
01:02:02
◼
►
to create a unique website for you or your business.
01:02:06
◼
►
They have over 20 highly customizable templates
01:02:08
◼
►
for you to choose from.
01:02:10
◼
►
They've won numerous design awards
01:02:11
◼
►
from prestigious institutions,
01:02:13
◼
►
and it's incredibly easy to use and customize.
01:02:16
◼
►
If you want some help,
01:02:17
◼
►
Squarespace has an amazing support team
01:02:19
◼
►
that works 24 hours a day, seven days a week,
01:02:21
◼
►
with over 70 employees right here in New York City.
01:02:24
◼
►
All of this starts at just $8 a month,
01:02:27
◼
►
and includes a free domain if you sign up for a year.
01:02:30
◼
►
And you can start a free trial today
01:02:32
◼
►
with no credit card required.
01:02:34
◼
►
It's a real free trial, no credit card.
01:02:37
◼
►
And when you do sign up, use offer code Casey for 10% off
01:02:41
◼
►
to show your support for our show as well.
01:02:43
◼
►
And one more cool thing, two more cool things actually.
01:02:47
◼
►
You know what, I'm gonna go all out,
01:02:50
◼
►
three more cool things.
01:02:51
◼
►
Squarespace has introduced this new thing
01:02:53
◼
►
called Squarespace logo.
01:02:54
◼
►
In addition to building your website,
01:02:55
◼
►
you can now build your own logo for your site,
01:02:57
◼
►
business card, shirt, or whatever you want.
01:03:01
◼
►
Also, all Squarespace plans now include
01:03:03
◼
►
the commerce functionality.
01:03:04
◼
►
You can have a store where you sell
01:03:06
◼
►
physical or digital goods.
01:03:08
◼
►
They have all sorts of great integration there.
01:03:09
◼
►
It's so easy to set up.
01:03:11
◼
►
You can finally have your own online store
01:03:13
◼
►
with very, very little effort.
01:03:14
◼
►
It's really amazing.
01:03:15
◼
►
And every Squarespace plan now includes
01:03:17
◼
►
the commerce functionality at no additional charge.
01:03:20
◼
►
And finally, Squarespace is hiring.
01:03:23
◼
►
If you interview for an engineering or design position
01:03:27
◼
►
before March 15th, they will invite you
01:03:30
◼
►
and your partner to be New Yorkers for a weekend.
01:03:33
◼
►
They will fly you out to New York,
01:03:34
◼
►
put you up in one of the city's best hotels,
01:03:36
◼
►
and give you a long weekend of checking out
01:03:38
◼
►
some of their favorite attractions, cultural icons,
01:03:41
◼
►
and restaurants in the city.
01:03:42
◼
►
Squarespace will pick up the entire tab
01:03:44
◼
►
for this awesome trip to New York.
01:03:46
◼
►
They've been voted one of New York City's greatest places
01:03:48
◼
►
to work for two years running,
01:03:49
◼
►
so really put them on your short list.
01:03:51
◼
►
They're looking to hire 30 engineers and designers
01:03:54
◼
►
by March 15th.
01:03:55
◼
►
So go to beapartofit.squarespace.com,
01:03:58
◼
►
That's beapartofit.squarespace.com to learn more about this cool offer.
01:04:02
◼
►
Thanks a lot to Squarespace for sponsoring our show once again.
01:04:05
◼
►
So about the Script Notes episode...
01:04:07
◼
►
Don't even start with me.
01:04:08
◼
►
Are you really going to be that?
01:04:10
◼
►
If you really want me to abandon software methodologies, even though it perfectly fits
01:04:14
◼
►
the thread of this episode, you f***ing b****, then we can move on.
01:04:18
◼
►
Jon, what do you think about this episode?
01:04:20
◼
►
God, I hate you so much, Marco.
01:04:22
◼
►
Casey, you've got to assert yourself.
01:04:24
◼
►
If you want to talk about software methodologies, you can.
01:04:26
◼
►
I would say though that with the time left in the episode don't you think it deserves to be kind of like in the prime
01:04:31
◼
►
Spot with more time. I leave it up to I'm abstaining from this vote. So
01:04:35
◼
►
It's Casey versus Marco. Decide what we're gonna talk about next; I'm prepared for both.
01:04:40
◼
►
I kind of feel like software
01:04:42
◼
►
methodologies is a big enough topic that we wouldn't want to try to jam it into the end of the show.
01:04:46
◼
►
But I'll go either way
01:04:47
◼
►
I actually am going to agree with John's non-vote, which actually is a vote, but I'm gonna agree with John's non-vote and say
01:04:53
◼
►
That I I'm not trying to avoid it
01:04:56
◼
►
I do think it deserves more time than this.
01:04:58
◼
►
You know, you're probably right, and God help you trying to edit this episode,
01:05:03
◼
►
because now I've turned it into a complete cluster.
01:05:06
◼
►
Oh, this is all in. Just one horn needed, that's it.
01:05:08
◼
►
It's all in.
01:05:10
◼
►
I'm excited about that.
01:05:12
◼
►
This is what people tune in for, Casey.
01:05:13
◼
►
Oh, is this the show?
01:05:15
◼
►
All right, so we should probably catch people up on Script Notes as I curse this topic internally,
01:05:22
◼
►
even though it actually is very interesting.
01:05:23
◼
►
So a lot of people came and told us, "Oh, you should really listen to this Script Notes episode."
01:05:29
◼
►
And Script Notes is a podcast by two screenwriters, and I couldn't even tell you who they are
01:05:33
◼
►
off the top of my head.
01:05:34
◼
►
John August and some other guy.
01:05:37
◼
►
That's what I knew too, but I wasn't going to say that.
01:05:41
◼
►
And I only know who John August is because he was on other podcasts that I listened to.
01:05:44
◼
►
And I've seen his movies and I like them, and I'm like, "Oh, he's the guy who did that!"
01:05:49
◼
►
But yeah, I don't know the other guy.
01:05:50
◼
►
Sorry, other guy.
01:05:51
◼
►
Now that other guy knows exactly how I feel.
01:05:56
◼
►
Anyway, so there's this podcast about screenwriting and apparently the de facto industry standard
01:06:03
◼
►
for screenwriting is called what?
01:06:06
◼
►
What is it called again?
01:06:07
◼
►
Final draft.
01:06:09
◼
►
I wanted to say Final Cut.
01:06:10
◼
►
Final draft.
01:06:11
◼
►
And so it's a screenwriting application and it is, like I said, the industry standard.
01:06:14
◼
►
So from what I gather, the screenwriters, John August and Casey we'll call him, they
01:06:21
◼
►
don't particularly care for Final Draft.
01:06:23
◼
►
His name is Craig Mazin.
01:06:26
◼
►
So John and Craig don't really care for Final Draft.
01:06:29
◼
►
And they actually, I guess I complained about it in prior episodes, but this past episode
01:06:33
◼
►
at the time we record this, they had actually had the CEO of the company that makes Final
01:06:40
◼
►
Draft, as well as, what, a product manager? Is that right?
01:06:44
◼
►
Product, project, I don't know the difference. I live in my own little world here with no
01:06:47
◼
►
one. Fair enough. So, some other guy. We'll just call that person long-suffering employee.
01:06:53
◼
►
Right so so that other guy. So the two of them came on the show which I really
01:06:59
◼
►
respect because it was clear from the get-go that this wasn't going to go well
01:07:04
◼
►
for them and and so John and Craig had these two other guys on the show and
01:07:09
◼
►
started telling them all the things that they don't really like about Final Draft and why
01:07:15
◼
►
they feel like they've been wronged by not only the high purchase price of, what, $250?
01:07:22
◼
►
Is that right?
01:07:23
◼
►
Something like that.
01:07:24
◼
►
Some of them.
01:07:25
◼
►
250 bucks or so for Final Draft, but also the slow updates.
01:07:29
◼
►
And it was a really fascinating view of both sides of the coin of software development,
01:07:36
◼
►
as the company and the people who create software and the people who consume it and how they
01:07:44
◼
►
don't entirely understand what we go through, in the same way that the CEO and the long-suffering
01:07:51
◼
►
employee don't really understand what their customers go through either.
01:07:56
◼
►
And I have some takeaways from this, but let me open the floor to you guys and see.
01:08:01
◼
►
Marco, what did you think about all this?
01:08:04
◼
►
Well, first, I think... so I listened to the podcast, I made my own opinions of it, and
01:08:12
◼
►
I even made a whole post about it, and all that was before I had read this follow-up
01:08:17
◼
►
article from... there's a guy named Kent Tesman who writes a competing product called, I believe,
01:08:27
◼
►
called Fade In, here we go, and he's been a long time critic, I gather, of Final Draft,
01:08:34
◼
►
and that's one of the reasons why he started writing his own, because he hated Final Draft
01:08:37
◼
►
so much, so he started writing his own, and he's clearly a programmer/tech guy, but also
01:08:44
◼
►
a film guy, and he broke down, here I'll put the link in the show notes here, he broke
01:08:50
◼
►
down exactly some of the problems with Final Draft technically, like for instance that
01:08:57
◼
►
it still doesn't support Unicode, like really like major shortcomings and that one of the
01:09:04
◼
►
reasons why they had so much trouble going Retina was not necessarily because it was
01:09:10
◼
►
using Carbon, because you can do Retina with Carbon; it's because they were using QuickDraw,
01:09:16
◼
►
which was deprecated, what, 20 years ago?
01:09:20
◼
►
10 years ago?
01:09:21
◼
►
A long time ago, at any rate.
01:09:23
◼
►
They've built up quite some technical debt.
01:09:27
◼
►
And so there's some problems there.
01:09:29
◼
►
It basically seems like they wrote this application
01:09:33
◼
►
in the '80s and '90s and have been kind of sitting on it
01:09:39
◼
►
and not doing the really hard migrations
01:09:41
◼
►
and not modernizing all this time,
01:09:44
◼
►
And then all of a sudden they were forced to by their customers with things like Retina,
01:09:48
◼
►
and they were forced to suddenly do this quickly and it became a really big thing. And so it's
01:09:54
◼
►
a pretty typical story of pretty severe technical debt being ignored for way too long.
01:10:01
◼
►
I think what I got out of the CEO's comments and attitude was that he wants to make all
01:10:09
◼
►
all of his problems your problems.
01:10:12
◼
►
And this is something, you know, the reason why
01:10:14
◼
►
I think it's important for developers to hear this
01:10:17
◼
►
is because you can see both sides of it.
01:10:20
◼
►
You can see the CEO arguing the business side
01:10:24
◼
►
and the difficulties of the business side.
01:10:27
◼
►
And then you can see the customers arguing that,
01:10:29
◼
►
well, you know, your business stuff is your problem
01:10:32
◼
►
and it's not serving us at all
01:10:34
◼
►
and you're kind of treating us badly
01:10:35
◼
►
and your product needs a lot of work
01:10:37
◼
►
and is really stagnant and outdated.
01:10:39
◼
►
So you can see both sides of it, and you can kind of see,
01:10:42
◼
►
as a programmer, or especially as a company owner,
01:10:46
◼
►
if you own your own company or make your own product,
01:10:48
◼
►
you can see how you could become that CEO.
01:10:52
◼
►
And it should scare the crap out of you,
01:10:54
◼
►
because that's a plausible outcome
01:10:56
◼
►
for so many software developers.
01:10:58
◼
►
- I think developers should listen to this,
01:11:00
◼
►
because people with technical knowledge
01:11:02
◼
►
of both sides of the software industry
01:11:04
◼
►
people who listen to this podcast,
01:11:06
◼
►
or just Mac nerds into the indie development scene,
01:11:10
◼
►
will have the unique experience of listening to a podcast
01:11:13
◼
►
with two camps of angry people,
01:11:15
◼
►
both of whom are just massively wrong
01:11:17
◼
►
at the fundamental level about the major points
01:11:19
◼
►
of their arguments.
01:11:20
◼
►
The customers are wrong about,
01:11:22
◼
►
"Your product should be free,
01:11:23
◼
►
"and how hard is it to do this?"
01:11:26
◼
►
From the outside, they think everything is easy,
01:11:27
◼
►
and they think everything should be free,
01:11:29
◼
►
and Apple gives away the OS for free,
01:11:31
◼
►
"Why can't you give away Final Draft for me?"
01:11:33
◼
►
They're just so out of left field.
01:11:34
◼
►
You know where they're coming from,
01:11:35
◼
►
but it's like, man, they have no idea
01:11:37
◼
►
what it's like to make software.
01:11:38
◼
►
And as Marco pointed out, the CEO is being defensive
01:11:42
◼
►
in ways that are not appropriate for his job as a,
01:11:46
◼
►
like it's his job to figure out a way
01:11:47
◼
►
to keep your software up to date.
01:11:49
◼
►
To sell it like, you know, don't make,
01:11:51
◼
►
those are all your problems.
01:11:52
◼
►
You have to figure out a way,
01:11:53
◼
►
that's called being a software entrepreneur.
01:11:55
◼
►
Like you have to figure that stuff out.
01:11:56
◼
►
You can't throw it back in customers faces and say,
01:11:59
◼
►
you don't realize this, it's really hard to do this.
01:12:01
◼
►
And it's clear that the CEO doesn't have
01:12:03
◼
►
a lot of technical knowledge,
01:12:03
◼
►
so he doesn't even know the detail things.
01:12:05
◼
►
Like, I felt like I wanted to jump into the podcast
01:12:07
◼
►
and explain to each one of them their position,
01:12:10
◼
►
because they're both wrong.
01:12:11
◼
►
You customers, look, things can't be free and you're crazy,
01:12:14
◼
►
and let me explain why your reasoning is wrong.
01:12:15
◼
►
And you, CEO, you don't even know why this is hard,
01:12:17
◼
►
but it is really hard, and if you had a clue
01:12:19
◼
►
how hard it was, you would have started
01:12:20
◼
►
on these transitions years and years ago.
01:12:22
◼
►
And yes, they might have destroyed your company,
01:12:24
◼
►
but look how many other companies have been left
01:12:26
◼
►
in the wake of not being able to keep up
01:12:28
◼
►
with the changes in OS X and iOS.
01:12:30
◼
►
Like, that's your job.
01:12:31
◼
►
You wanna be a big boy in the software world?
01:12:33
◼
►
It's not easy.
01:12:34
◼
►
The best example is something like BBEdit, which started on classic Mac OS, and it has
01:12:38
◼
►
had to go through, it's so similar because it's a text editor, it had to go through all
01:12:41
◼
►
these things.
01:12:42
◼
►
They had QuickDraw, they had to get rid of that.
01:12:43
◼
►
They were carbon, they had to find a way to go on modern OS X.
01:12:47
◼
►
They had to adopt Unicode. They went through, like, they used to do ATSUI and like
01:12:52
◼
►
Core Text and whatever the hell; they've gone through multiple different underlying text engines.
01:12:57
◼
►
You have to do that, otherwise you're dead.
01:12:59
◼
►
And if some other nimble competitor who doesn't have your legacy concerns makes a new application,
01:13:04
◼
►
and the only reason you're staying successful is because you're an entrenched interest
01:13:06
◼
►
or whatever.
01:13:07
◼
►
We've all seen this play out a million times.
01:13:09
◼
►
So those two camps of people were just both angry, both talking past each other, and just
01:13:15
◼
►
both fundamentally wrong about their complaints about the other person.
01:13:21
◼
►
You can't have an app that has a code base that—you can't write an app in 1990 and
01:13:28
◼
►
still be around in 2014 and not have to have gone through a few really painful transitions
01:13:34
◼
►
in that time.
01:13:35
◼
►
Unless you coast on, like, "Well, we're just
01:13:37
◼
►
so entrenched, we're the big gorilla, we can afford to..."
01:13:39
◼
►
Like, even if you do that, you have a longer timer, but it's still a timer because eventually
01:13:43
◼
►
your app doesn't launch anymore and you're like, "Well, now I guess it's the time we
01:13:46
◼
►
have to move away from QuickDraw."
01:13:47
◼
►
It's like, "Nope, sorry, too late.
01:13:48
◼
►
Now you're dead."
01:13:49
◼
►
I don't think Final Draft is in that situation yet because they are such an entrenched interest.
01:13:53
◼
►
Hell, look at Adobe.
01:13:54
◼
►
Look how long it took them.
01:13:55
◼
►
We mentioned this last show.
01:13:56
◼
►
Look how long it took them to go Cocoa.
01:13:58
◼
►
The only reason they can get away with that
01:13:59
◼
►
is because they're Photoshop.
01:14:00
◼
►
We are Photoshop, we are the image editor,
01:14:02
◼
►
we ate up all the other ones,
01:14:04
◼
►
we bought Macromedia or whatever.
01:14:06
◼
►
And even then, eventually the OS is like, no seriously,
01:14:10
◼
►
you gotta be 64-bit, you gotta be Cocoa, right?
01:14:13
◼
►
Even then they're forced to upgrade,
01:14:14
◼
►
but yeah, the Final Draft CEO didn't understand,
01:14:18
◼
►
doesn't seem to understand what his company does
01:14:22
◼
►
or should be doing.
01:14:23
◼
►
What does your company do?
01:14:25
◼
►
We sell and maintain a software application;
01:14:28
◼
►
that's a dynamic business.
01:14:30
◼
►
You can't just keep making the same thing
01:14:32
◼
►
and expect, I don't know, expect everyone else
01:14:36
◼
►
to do your job for you and to make it
01:14:38
◼
►
so that your company doesn't have to do the hard things.
01:14:41
◼
►
I don't know.
01:14:43
◼
►
- Yeah, and there were some quotes in this that,
01:14:45
◼
►
only a handful that I'd like to quickly go over,
01:14:48
◼
►
that just really stuck out to me.
01:14:52
◼
►
So one of them, and actually John, you already quoted it,
01:14:55
◼
►
was, "Well," and this is John and Craig, "Well, Apple gives an entire operating system
01:15:01
◼
►
away for free."
01:15:02
◼
►
No, that wasn't them. That was the CEO, citing that as a problem that they have.
01:15:09
◼
►
No, I think it was the CEO pointed out to them that they make money from selling the
01:15:12
◼
►
phones, but I think it was the complaining customers who were saying, "If it wasn't
01:15:17
◼
►
the OS, it was something about, like, 'Look how cheap software is. Software has been devalued,
01:15:22
◼
►
therefore your thing should be one dollar or free.' It's like telling Adobe that Photoshop
01:15:26
◼
►
should be free. Or like ten cents because Angry Birds is ten cents.
01:15:29
◼
►
Well, to be fair, they were not complaining that Final Draft is not free. From what I
01:15:36
◼
►
gather they were complaining twofold, both that it is not a high enough quality product
01:15:41
◼
►
to be worth the high price anymore, and that the upgrades are not worth the upgrade price
01:15:48
◼
►
because of how little progress is made, relatively speaking, in each upgrade.
01:15:52
◼
►
That seemed like their bigger complaint, not that the app should be free or very, very cheap.
01:15:56
◼
►
It seems like they'd be very happy to pay the $250 for the app if it was a better app.
01:16:01
◼
►
Well, but even then, they want to pay $250 10 years ago and never pay again,
01:16:05
◼
►
and just continually have the app updated.
01:16:08
◼
►
I didn't get that impression. I don't—
01:16:09
◼
►
I definitely got that impression. That that's—
01:16:11
◼
►
Yeah, I did not.
01:16:12
◼
►
I think that I got the impression that they would have hated to spend $250 way back when,
01:16:18
◼
►
and then they just want that app to work forever and to continue to be updated with the OS and to
01:16:21
◼
►
get retina and to get Unicode support and everything all for free. They seem to have an entitlement
01:16:27
◼
►
complex that was not proportional to the value that they're deriving from the software. And it
01:16:33
◼
►
was kind of compounded by the fact that this is crappy software that hasn't been updated,
01:16:37
◼
►
so they feel burned by any amount of money they put towards it. It's mostly because they have
01:16:42
◼
►
resentment. They feel like they have to have this program because everybody uses it. I think that's
01:16:47
◼
►
That's the core of their dissatisfaction, is like, it's like an abusive relationship
01:16:50
◼
►
with like, well, you have to get Final Draft because everybody uses Final Draft.
01:16:54
◼
►
And then you're already bitter at this thing, and if it's not the most amazing program that
01:16:58
◼
►
does everything perfectly, then you're just going to be pissed when anything is wrong
01:17:02
◼
►
with it, and there seems to be a lot wrong with this program.
01:17:04
◼
►
Yeah, I think Marco is right, though, that I don't think they were embittered necessarily
01:17:09
◼
►
about paying for it.
01:17:10
◼
►
It just seems that they thought that it was an order of magnitude too expensive.
01:17:15
◼
►
I'm making numbers up. I'm putting words in their mouth. But instead of $250, it should
01:17:19
◼
►
be $100 upfront for the first release and like $20 for every supplementary release.
01:17:24
◼
►
And additionally, I think they were extremely embittered that these upgrades or updates,
01:17:29
◼
►
whatever they called them, were a heck of a lot of money. And even I would probably
01:17:33
◼
►
agree to this, a heck of a lot of money for really not that much update. And Retina was
01:17:38
◼
►
cited several times. I think they said it was like $100 for the Retina upgrade update,
01:17:42
◼
►
whatever it was.
01:17:43
◼
►
80 bucks or something. And maybe I'm crazy, but if it was an application that I use for
01:17:48
◼
►
my livelihood, I would pay $80 for a Retina upgrade. I wouldn't blink at that because
01:17:54
◼
►
say you use Photoshop to do your work, you're a graphic artist, and a Photoshop update comes
01:17:58
◼
►
out and the only upgrade in it is that the UI is Retina and a couple of bug fixes, and
01:18:03
◼
►
it's $80 for the upgrade. I mean, I would pay it. Wouldn't any of you pay it?
01:18:06
◼
►
Yeah. I mean, I suppose I would.
01:18:09
◼
►
Like maybe we have different... I guess we probably value software differently because
01:18:13
◼
►
we kind of know what goes into it. But it's not like it's 50 grand. I mean, it's not
01:18:19
◼
►
a site license to some CAD program. That's why I think they're getting used to the world
01:18:27
◼
►
where everything is cheaper free. But it's like, "But this is part of your livelihood.
01:18:31
◼
►
This is how you make money." And I think it all gets back to, "It's because they don't
01:18:34
◼
►
like this tool. They feel like they're forced to use it. If they had their choice, they
01:18:37
◼
►
would use something different and better." And it's like, "If you're going to force me
01:18:40
◼
►
to buy it and then now suddenly I'm not happy. Say you hated Photoshop and you needed it
01:18:46
◼
►
for your work. Say you hate Microsoft Word and you have to get it because you deal with
01:18:49
◼
►
stupid people all day who insist that you send them everything in doc format. Then you'd
01:18:54
◼
►
be pissed that you're going to do an $80 upgrade to Office and the only change was that it
01:18:59
◼
►
was retina and compatible with Mavericks. Then you'd be pissed, but you're mostly pissed
01:19:03
◼
►
because you hate Word so much that any money you put towards continuing this charade of
01:19:07
◼
►
having to use this program on behalf of other people, that's where I feel like they're coming from.
01:19:11
◼
►
I think it's that, but it's also that what was once a decent product has stagnated.
01:19:18
◼
►
The impression I got—and I've never used Final Draft—but the impression I got was
01:19:22
◼
►
that it was at one point a good thing.
01:19:25
◼
►
What they were saying about how, based on the length of the PDF or the printed document,
01:19:30
◼
►
you could take a guess at about how long the script was, that was very cool and apparently
01:19:34
◼
►
something that at least the CEO seemed to think was unique to them.
01:19:38
◼
►
Well anyways, the product started pretty solid and pretty good, but kind of never really
01:19:44
◼
►
went anywhere.
01:19:46
◼
►
And these two guys, John and Craig, for example, were really embittered that it took so long to
01:19:50
◼
►
get Retina support in there.
01:19:53
◼
►
And then on top of that, they got charged for it.
01:19:54
◼
►
So they were saying to the CEO, "You knew this was coming.
01:19:57
◼
►
You knew this was coming.
01:19:58
◼
►
How did you not figure this out?
01:19:59
◼
►
How did you not do it?"
01:20:00
◼
►
And at first, I was like, "Well, you know, it's hard.
01:20:03
◼
►
I still don't have my FastX update out for iOS 7 because I'm a slacker and I have better
01:20:07
◼
►
things to do with my time, but I can understand that." Well, then the CEO says, and this is
01:20:12
◼
►
another one of my quotes, "Well, we're 40 people." And I think to myself, "Okay,
01:20:16
◼
►
so how did you not get Retina done with 40 people?" But then later on, he says, "Well,
01:20:23
◼
►
but 10 to 15% of us are programmers." What the hell is everyone else doing?
01:20:27
◼
►
Sales and marketing. The best part was when he said, "You know how we learned about Retina?
01:20:32
◼
►
Someone brought a computer into the office and showed it to us." If that's how you learned about
01:20:36
◼
►
Retina, it shows you're not engaged with the platform on which you deploy your software.
01:20:41
◼
►
Like, even trivial—you don't even subscribe to Macworld. Like, you do nothing. Forget
01:20:46
◼
►
about going to WWDC, which of course you should do if your job is that you write a software
01:20:50
◼
►
product for the Mac, for crying out loud, right? Yeah, this was kind of depressing.
01:20:56
◼
►
Like, the CEO, like, he did himself and his company a disservice by coming on this program
01:21:01
◼
►
and saying all these things because every word out of his mouth was, like Marco said,
01:21:04
◼
►
It's like the answer to every single one of his comebacks was like, "That's your problem.
01:21:09
◼
►
That's not my problem."
01:21:10
◼
►
You know what I mean?
01:21:11
◼
►
It may be that the natural way of things is that you've now painted yourself into a technical
01:21:14
◼
►
corner and you must go out of business, and it will be a blessing for the industry because
01:21:18
◼
►
we can all give our money to these other hungrier developers who, if they're unlucky, will go
01:21:22
◼
►
through the exact same cycle as you, get big, become popular, not update their software
01:21:26
◼
►
for a decade, crumble under their own weight, and the cycle will continue.
01:21:30
◼
►
But that's not your customers' problems in any way, shape, or form.
01:21:33
◼
►
And so I guess he doesn't have people telling him,
01:21:36
◼
►
don't go onto a program and tell--
01:21:38
◼
►
even when your customers are actually wrong about everything
01:21:41
◼
►
should be free and stuff, you can't go onto a program
01:21:45
◼
►
and tell them about all your woes.
01:21:47
◼
►
And they're going to go, oh, well, now--
01:21:50
◼
►
I guess maybe he was hoping for empathy,
01:21:52
◼
►
that they would put themselves into his shoes
01:21:54
◼
►
as the non-technical CEO of a company that
01:21:57
◼
►
made terrible strategic decisions for the past decade,
01:22:00
◼
►
and go, oh, well, now I feel bad for you.
01:22:02
◼
►
It's all right.
01:22:02
◼
►
about buying your software.
01:22:04
◼
►
- I mean, and one of the, and one thing,
01:22:06
◼
►
I noticed Casey, you have this in the notes,
01:22:07
◼
►
and so I might as well bring it up now.
01:22:09
◼
►
One of the more interesting parts of this dynamic,
01:22:14
◼
►
I thought, was that the CEO opened the conversation
01:22:18
◼
►
basically by saying that they've done surveys,
01:22:22
◼
►
and I believe it was 92% of the people said
01:22:26
◼
►
that they are very happy with Final Draft,
01:22:28
◼
►
and therefore, they think they're doing great.
01:22:30
◼
►
And like that's so flawed on so many levels. And John and Craig briefly mentioned like,
01:22:38
◼
►
well, you know, that's only people who responded to the survey, which is obviously going to
01:22:43
◼
►
be mostly people who like you. Or, you know, like, that's so not a random sample of what
01:22:50
◼
►
people think of your product. And the CEO's entire attitude seemed to just be that, well,
01:22:57
◼
►
we keep hearing from people who like it
01:22:59
◼
►
and therefore everything's fine.
01:23:02
◼
►
And we don't have to,
01:23:03
◼
►
like his attitude from the very beginning was,
01:23:05
◼
►
we're fine because some people like us
01:23:07
◼
►
and therefore we don't need to make any changes at all.
01:23:09
◼
►
And it's so easy for people to fall into that trap.
01:23:13
◼
►
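The survey problem the hosts are describing here, where mostly happy users bother to respond, is classic self-selection bias, and a quick simulation shows how badly it can skew a result. All the numbers below (a 50% true happiness rate, and happy users being ten times more likely to answer the survey) are made up purely for illustration:

```python
import random

# Hypothetical illustration of survey self-selection bias: assume happy
# users are much more likely to answer a satisfaction survey than
# unhappy ones. Every rate here is an invented example, not real data.
random.seed(42)

N = 100_000                     # simulated user base
TRUE_HAPPY_RATE = 0.50          # assumed: half the real user base is happy
P_RESPOND_HAPPY = 0.30          # assumed: happy users respond 30% of the time
P_RESPOND_UNHAPPY = 0.03        # assumed: unhappy users respond 3% of the time

responses = []
for _ in range(N):
    happy = random.random() < TRUE_HAPPY_RATE
    p_respond = P_RESPOND_HAPPY if happy else P_RESPOND_UNHAPPY
    if random.random() < p_respond:
        responses.append(happy)

surveyed_happy = sum(responses) / len(responses)
print(f"true happiness rate: {TRUE_HAPPY_RATE:.0%}")
print(f"survey result:       {surveyed_happy:.0%}")  # roughly 91%
```

Even though only half the simulated users are happy, the survey reports roughly 91% satisfaction, which is eerily close to the 92% figure the CEO cited.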
If you select what you're listening to,
01:23:16
◼
►
and I've never seen this app
01:23:20
◼
►
and even I know everyone hates it.
01:23:22
◼
►
Like I've heard about this app for years now
01:23:26
◼
►
how much people hate this in random edge conversations here and there. And not even being in the
01:23:32
◼
►
business I know people hate it, obviously that's a pretty widespread problem. And it's
01:23:37
◼
►
so easy to surround yourself only by the inputs that you want to believe, I'm sure there's
01:23:44
◼
►
a term for that, it's so easy to fall into that that this guy honestly, I think he honestly
01:23:50
◼
►
thinks that everything's fine, that he's in complete denial of any major problems and
01:23:55
◼
►
And therefore, that's, I think, why he was so weirdly defensive and aggressive in his responses.
01:24:03
◼
►
I don't know if I really believe that he was sincere that he really believed that, but
01:24:06
◼
►
like it's possible because the denial is pretty strong.
01:24:08
◼
►
I think the reason you've heard about it and I have as well, even though we're not screenwriters,
01:24:12
◼
►
is because we travel in software development circles and anybody who knows anything about
01:24:15
◼
►
software can look at this application and you can smell like the death on it.
01:24:21
◼
►
Even if you don't know it's using QuickDraw, you're like, oh, geez, this program has not
01:24:24
◼
►
been updated for a while.
01:24:25
◼
►
You can see it's old. I have no idea how this looks, but I bet we could
01:24:28
◼
►
pick up that it's using older controls when it should be
01:24:30
◼
►
using newer ones, like, maybe for a long time, the text wasn't
01:24:33
◼
►
properly anti-aliased, or looked like it was rendered differently,
01:24:35
◼
►
because it's QuickDraw. And like, you know, we could just
01:24:37
◼
►
tell that there are problems with it. And the reason I think
01:24:42
◼
►
he may be right, if you took a random sampling of their
01:24:45
◼
►
customer base is because their customer base are not technical
01:24:47
◼
►
people. And they view this as kind of like, especially
01:24:50
◼
►
since it's such an institution, they view it as if you want to
01:24:53
◼
►
become a screenwriter, this is what you get, and this is the tool you deal with. And
01:24:57
◼
►
everyone complains about it, but like, you know, you might as well complain about the
01:25:00
◼
►
weather, because to do your job you have to use Final Draft in the same way that you might
01:25:04
◼
►
say you have to use Photoshop if you're a graphic designer, you just, you know, there's
01:25:08
◼
►
no other choice out there. But they're non-technical people, so they're not equipped to understand
01:25:14
◼
►
what's wrong with this program and why, and they're more willing to accept that like whatever's
01:25:18
◼
►
wrong with it, like what, that's just the way it is, what can you do? So in that respect,
01:25:23
◼
►
they may have good survey responses. But the thing that makes me think of the worst-
01:25:26
◼
►
case scenario is I will bet you in 2007, 2008, 2009, maybe even 2010, if you were to survey
01:25:32
◼
►
all BlackBerry users, they would say they love their BlackBerrys. It doesn't matter,
01:25:36
◼
►
RIM is still doomed, right? You could think, oh, the people who have BlackBerrys
01:25:40
◼
►
love it. It's like, yes, but the time has passed it by and customers may have it and
01:25:45
◼
►
may love it and maybe Stockholm Syndrome or maybe not know better, but like from the outside
01:25:49
◼
►
we all can see, "Bye bye, Blackberry!"
01:25:53
◼
►
Yeah, and the other thing that I found interesting about this, and it's the only other quote
01:25:58
◼
►
that I want to bring up, is apparently there was a feature called, they called it "Collaboriter,"
01:26:05
◼
►
which I guess was some sort of collaborative writing thing where you can work on a screenplay,
01:26:10
◼
►
two people can work on a screenplay at the same time.
01:26:12
◼
►
SubEthaEdit, yeah, it sounded a lot like that.
01:26:14
◼
►
Yeah, that's a very good analogy.
01:26:17
◼
►
And so they said, and this should be pretty much verbatim, CollaboWriter was built when
01:26:22
◼
►
it was on a peer-to-peer technology with no security.
01:26:25
◼
►
It would still work like that today.
01:26:28
◼
►
What they were saying was, when they built this feature that was designed to be used
01:26:32
◼
►
between two people, probably not co-located, they did it without even thinking about firewalls.
01:26:40
◼
►
And unless they wrote it in '96, are you kidding me?
01:26:45
◼
►
Like how is that okay?
01:26:47
◼
►
And that just completely sealed the deal in my mind that--
01:26:50
◼
►
Screenwriters don't have firewalls, Casey.
01:26:51
◼
►
You think they have IT departments?
01:26:53
◼
►
They're lonely people in their apartments with dial-up internet connection, slaving
01:26:58
◼
►
away at a script.
01:26:59
◼
►
And every router since 2001 has included a firewall in it.
01:27:04
◼
►
But-- Yeah, he wouldn't know about this stuff anyway
01:27:06
◼
►
if he was talking about it, but yeah.
01:27:09
◼
►
And that's the thing is, I mean, if that is such an obvious technical hurdle that you
01:27:14
◼
►
would have to get over to do that kind of feature, and they shipped it, at least for
01:27:19
◼
►
a little while, and I guess they've pulled it since, they shipped it without even thinking about it.
01:27:24
◼
►
Like, how is that possible?
01:27:26
◼
►
It just, it reminds me that there are terrible software developers out there, or if these
01:27:30
◼
►
developers are good, just unbelievably, indescribably out of touch management, or both.
01:27:36
◼
►
If I was on that podcast, I would have also brought up the meta point, which is that the
01:27:39
◼
►
screenwriting format is ridiculous and anachronistic and really deserves to be destroyed and torn down.
01:27:48
◼
►
And it is held aloft by the collective, you know, by tradition, basically.
01:27:52
◼
►
This is the way it's always done.
01:27:53
◼
►
This is what a screenplay looks like.
01:27:55
◼
►
Let me tell you about all the great qualities of it.
01:27:57
◼
►
Oh, you can always tell exactly how long a thing will be in minutes by looking at the
01:27:59
◼
►
number of pages.
01:28:00
◼
►
There's always a certain number of pages every day.
01:28:02
◼
►
Everyone knows it. Like, it can continue to be held aloft by that sort of oral tradition and indoctrination
01:28:09
◼
►
of new people into the industry that this is what a screenplay looks like. But the format is dumb.
01:28:13
◼
►
Like, it's a monospace font. It's formatted crazily. Lots of things are in all caps.
01:28:18
◼
►
And everyone who enters the industry and everybody gets used to it comes to like it and will not
01:28:23
◼
►
accept any other format. But bottom line, objectively, wipe out all the people who've
01:28:27
◼
►
ever seen a screenplay, show the new people this format, and they'll be like,
01:28:32
◼
►
Like, it is not of this time. It is of a different time, of a time of typewriters and monospace fonts and all caps and not, you know, it's dumb.
01:28:39
◼
►
And so, the best, the ideal thing to happen for the entire industry would be to not only for Final Draft to be disrupted, but for the entire screenplay format to be replaced by something better.
01:28:50
◼
►
I have dim hopes of that ever happening because if there's any group of people who is not raring for, you know, amazing disruption and innovation, it's screenwriters in the entertainment industry.
01:28:59
◼
►
They just want their format, they want it to be the way it is, they just want slightly
01:29:02
◼
►
better tools to work on it, and they will consider that a victory.
01:29:05
◼
►
But from the outside, it's clear that the screenplay format is stupid.
01:29:08
◼
►
Well, and to be fair, in the follow-up episode of their podcast, Jon and Craig did address
01:29:14
◼
►
that a little bit, and talking about things like how the one-page-per-minute thing is
01:29:18
◼
►
this kind of elementary assumption that, you know, it's like the three-paragraph essay
01:29:23
◼
►
you're taught in middle school.
01:29:24
◼
►
It's, you know, it's like, it's a very basic thing that, you know, in practice, once you
01:29:28
◼
►
get, once you're a real professional, it doesn't really, it's not that simple or it's not necessary
01:29:33
◼
►
to rely on things, you know, assumptions like that or these tenets like that. And there's also this
01:29:38
◼
►
thing called Fountain, which, from what I gather, is like a Markdown-like or Markdown-based plain
01:29:43
◼
►
text format that does away with manual pagination and stuff like that. And so it does sound like
01:29:49
◼
►
they are actually moving forward. And the problem really seems to be that Final Draft is not.
01:29:56
◼
►
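Fountain, the format mentioned here, really is a Markdown-style plain-text screenplay format. As a toy illustration, here is a tiny sketch of pulling scene headings out of a Fountain-ish document; the sample text and the simplified heading rule (a line starting with INT., EXT., etc.) are my own assumptions, not the full Fountain spec, which has more cases:

```python
import re

# A made-up Fountain-style fragment for illustration only.
SAMPLE = """\
INT. APARTMENT - NIGHT

A LONELY SCREENWRITER stares at a blinking cursor.

CASEY
Are you kidding me?

EXT. STREET - DAY
"""

# Simplified assumption: scene headings begin with one of these prefixes.
SCENE_RE = re.compile(r"^(INT|EXT|EST|I/E)[\. ]", re.IGNORECASE)

def scene_headings(text: str) -> list[str]:
    """Return lines that look like Fountain scene headings."""
    return [line for line in text.splitlines() if SCENE_RE.match(line)]

print(scene_headings(SAMPLE))
# ['INT. APARTMENT - NIGHT', 'EXT. STREET - DAY']
```

The appeal is exactly what the hosts describe: because the source is plain text, any tool can read and write it, and pagination becomes a rendering concern rather than part of the file format.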
And because Final Draft has this position of power right now where, you know, or at
01:30:03
◼
►
least they have to date, where they were the standard.
01:30:06
◼
►
They have their own file format and they're like, you know, they're relying on these
01:30:13
◼
►
invariants that, okay, well pagination matters above all else and all this other stuff.
01:30:17
◼
►
And then, and the reality is that industry is being disrupted just like so many other
01:30:24
◼
►
technical industries where now there's a whole lot of other apps coming out, some of them
01:30:29
◼
►
cheap and terrible, but some of them good, that are doing things in a more modern way.
01:30:35
◼
►
And that's the problem. Just like so many other industries, I don't think Final Draft
01:30:43
◼
►
is going to be disrupted and replaced by one thing. I think it's going to end up being
01:30:49
◼
►
disrupted and replaced by a few standards and then a few different apps that read and
01:30:53
◼
►
write those standards. And it seems like this Fountain format is probably going to be that
01:30:58
◼
►
standard. But again, I think it's funny now all three of us are talking about something
01:31:02
◼
►
we all know nothing about.
01:31:03
◼
►
Yeah, we're going to get all the email of people explaining the various features of
01:31:07
◼
►
the screenplay format that were created intentionally for a specific reason. And I'm sure the formatting
01:31:12
◼
►
and the layout and everything had a purpose originally, but some aspects of it are clearly
01:31:16
◼
►
just artifacts of the technology that was available at the time, specifically the all-caps,
01:31:21
◼
►
the monospace font, the layout being done basically by a series of spaces with the monospace font.
01:31:27
◼
►
Like, all those things are done because that's what you had—that's how, you know, you weren't
01:31:29
◼
►
gonna typeset it like a book because that's not efficient for the production process.
01:31:32
◼
►
You need people to be able to type these things up.
01:31:34
◼
►
It's kind of the same way that you see, like, the opening parentheses used to make ASCII
01:31:39
◼
►
art to make the little boxes on the front of legal documents.
01:31:42
◼
►
Marco must be familiar with this phenomenon.
01:31:43
◼
►
Like, the fact that that's still—it doesn't really matter as much in legal documents.
01:31:46
◼
►
It's like, whatever, lawyers, everything they do is all made up and crazy with their legal
01:31:50
◼
►
language and everything.
01:31:51
◼
►
But screenplays could benefit from a format that took advantage of modern technology. Keep all the
01:31:56
◼
►
good stuff about the old format in terms of visually blocking out things with white space
01:31:59
◼
►
and making it easy for actors to read and all, you know, I'm just making up what the
01:32:03
◼
►
features may be of the format. There are probably good things about the format, but the bad
01:32:07
◼
►
things are just being carried along like those parentheses on legal documents and they're
01:32:12
◼
►
just ridiculous at this point.
01:32:13
◼
►
All right. Thanks a lot to our three sponsors this week. Picture Life, Help Spot, and Squarespace.
01:32:20
◼
►
And we will see you next week.
01:32:22
◼
►
Now the show is over, they didn't even mean to begin
01:32:29
◼
►
'Cause it was accidental, oh it was accidental
01:32:34
◼
►
John didn't do any research, Marco and Casey wouldn't let him
01:32:39
◼
►
'Cause it was accidental, oh it was accidental
01:32:45
◼
►
And you can find the show notes at ATP.fm
01:32:50
◼
►
And if you're into Twitter, you can follow them
01:32:55
◼
►
@C-A-S-E-Y-L-I-S-S
01:32:59
◼
►
So that's Casey Liss M-A-R-C-O-A-R-M
01:33:03
◼
►
A-N-T, Marco Arment, S-I-R-A-C
01:33:08
◼
►
U-S-A, Syracusa
01:33:11
◼
►
It's accidental (it's accidental)
01:33:14
◼
►
They didn't mean to, accidental (accidental)
01:33:19
◼
►
Tech podcast so long
01:33:24
◼
►
"Someday, Casey, someday..."
01:33:27
◼
►
"I cannot believe I let you f***ing railroad me."
01:33:30
◼
►
"Well, the script notes is timely.
01:33:32
◼
►
But no, I abstained, so it was you and Marco could have worked that amongst yourselves.
01:33:36
◼
►
If Casey had insisted, I would have gone along with it."
01:33:39
◼
►
He didn't really abstain.
01:33:41
◼
►
I abstain—I gave my opinion, but I am abstaining from the vote.
01:33:45
◼
►
What I was trying to convince Casey was that not getting it done today does not mean it
01:33:49
◼
►
will never get done, and in fact it may be done better in the future, but he still could
01:33:52
◼
►
have insisted, and I would not have opposed him."
01:33:54
◼
►
Why don't you do it next week? We'll say right now: next week will be software methodologies. And nothing will happen. Bold statement.
01:34:00
◼
►
Nothing. Yeah, when Apple buys Nintendo, we won't talk about it at all. Oh
01:34:04
◼
►
God, now you've jinxed it and we'll never talk about it ever. This podcast will end if it results in Apple buying Nintendo.
01:34:11
◼
►
Your sacrifice will have been noble.
01:34:13
◼
►
You saying that we will do it next week, whether or not you're serious, has pretty much
01:34:23
◼
►
absolutely prevented that from happening ever. That's the equivalent of saying, "Well,
01:34:28
◼
►
this will be a short show." Ah, God, Margo.
01:34:33
◼
►
If Apple buys Nintendo and there's no John Siracusa podcast to talk about it, did it
01:34:37
◼
►
really happen?
01:34:38
◼
►
Yeah. This is my third podcast in like three days. I did The Incomparable. I did Command
01:34:44
◼
►
Space. Now I'm doing this. I'll have to just take a rest to get ready for the big
01:34:49
◼
►
Nintendo announcement.
01:34:51
◼
►
I hate you too.
01:34:55
◼
►
And we didn't even talk about driving with glass.
01:34:57
◼
►
That was another Marco topic.
01:34:58
◼
►
Yeah, I mean, is it that much more to say on it?
01:35:01
◼
►
I just wanted to say that I have looked into this zero amount as is my way for this show,
01:35:06
◼
►
but what I assumed when I saw the headline of that story flying by on Twitter was that
01:35:10
◼
►
they were only lobbying to allow you to wear it in the car.
01:35:13
◼
►
Like, for example, if you get it with prescription lenses and you need them to see to drive.
01:35:17
◼
►
States wanted to make it illegal, and Google didn't want it to be illegal for you to merely wear
01:35:21
◼
►
them. I didn't think they were lobbying for it to be legal for you to have it turned
01:35:26
◼
►
on and using it. But since I didn't read any of these articles, I have to say I don't
01:35:30
◼
►
know. Did you read them and you can tell me?
01:35:31
◼
►
I don't think we know to that much detail.
01:35:36
◼
►
Hell, might as well address this here. I've mainly heard mostly support of my position,
01:35:44
◼
►
And briefly my position is that for Google to actively lobby against states that want
01:35:51
◼
►
to prohibit glass use while driving, I think that's incredibly irresponsible by Google.
01:35:58
◼
►
And I think it's pretty much all reasonable people agree that texting while driving is
01:36:05
◼
►
dangerous and should be prohibited and certainly avoided.
01:36:11
◼
►
People do it anyway, but I think even the people who do it know that it's unsafe and wouldn't
01:36:15
◼
►
really argue that strongly that it was safe. And I think texting while driving and using
01:36:22
◼
►
Google Glass are pretty close. Like, I don't think there's a huge difference in safety
01:36:28
◼
►
in a car between doing those two things. I don't think one is dramatically more safe
01:36:33
◼
►
than the other. It's both like, it's engaging your visual attention in a way that, like,
01:36:41
◼
►
you have to interact with a multi-step process with a computer system where you're not looking
01:36:46
◼
►
at the road.
01:36:47
◼
►
Well, actually now that I've thought about it for the five minutes that you've talked,
01:36:50
◼
►
what about if they are envisioning something like the HUD on your M5?
01:36:55
◼
►
There are arguments that people have made that say things like, "Well, this is just
01:36:59
◼
►
like looking at a nav screen or it's just like looking at a heads up display. And I
01:37:03
◼
►
think the difference is that, you know, first of all, a nav screen and a heads-up display
01:37:08
◼
►
are designed very, very carefully and conservatively to be like maximally safe and also to minimize
01:37:17
◼
►
how long you have to look at them in ideal cases. Some of them go so far as to be like,
01:37:22
◼
►
you can't even, like some of them won't even let you enter an address in navigation
01:37:26
◼
►
if you're moving, for instance.
01:37:27
◼
►
There's all sorts of safety things that car systems use
01:37:31
◼
►
to either prevent or discourage you
01:37:33
◼
►
from using it very irresponsibly.
01:37:35
◼
►
Things like how DVD players,
01:37:37
◼
►
like sometimes there'll be these DVD players
01:37:39
◼
►
that can play video in the front seat,
01:37:41
◼
►
but you can only technically play it while you're parked.
01:37:43
◼
►
And yeah, some people will hack that and disable it,
01:37:45
◼
►
but there's all these things in place
01:37:47
◼
►
for things that are installed in cars,
01:37:49
◼
►
especially things that come stock from the manufacturers.
01:37:52
◼
►
There's all these safeties in place
01:37:54
◼
►
to try to make them as non-distracting as possible.
01:37:58
◼
►
And like a heads-up display displaying your speed,
01:38:02
◼
►
well, that's no more distracting than a speedometer.
01:38:06
◼
►
You don't have a lot of reasons to stare at that for a while.
01:38:09
◼
►
Well, what if the heads-up display on your glass
01:38:11
◼
►
was showing Nav overlaid on the street in front of you?
01:38:13
◼
►
If that is what it's showing, that's fine.
01:38:16
◼
►
But I think, again, this is like a human nature,
01:38:19
◼
►
human behavioral kind of thing.
01:38:22
◼
►
The reality is we've banned, in most places,
01:38:25
◼
►
we've banned handheld use of cell phones.
01:38:28
◼
►
Now, you could also say, well, what if you're using
01:38:31
◼
►
the cell phone in your hand looking at a navigation screen?
01:38:34
◼
►
And that's, in places where handheld cell phone use
01:38:37
◼
►
is banned, that's banned.
01:38:39
◼
►
And I think this is one of those cases where,
01:38:41
◼
►
the reason why that's banned is that,
01:38:44
◼
►
yeah, maybe you might be doing that,
01:38:45
◼
►
but there's also a very good chance
01:38:47
◼
►
that a lot of people who do that are also gonna like,
01:38:50
◼
►
you know, "Oh, send a quick text. It'll just take a second." You know, and there's
01:38:55
◼
►
all this potential for these multipurpose general computing systems, like smartphones,
01:38:59
◼
►
like Glass, there's all this potential for misuse, and misuse is so easy and so common
01:39:05
◼
►
that they should probably ban it outright because they know that, like, "Yeah, you
01:39:11
◼
►
might be using it for navigation, but there's a pretty good chance you're not," or "There's
01:39:14
◼
►
so many other things to do with it that aren't navigation that, you know, in reality people
01:39:18
◼
►
would be reading a text message or dictating a message or reading Twitter or reading an
01:39:24
◼
►
email that just came in. There's so many potential abuses there, and it is so much more distracting
01:39:31
◼
►
than the in-car systems because you're interacting with a computer at that point. It's a multi-step
01:39:37
◼
►
interaction. You're visually engaging it in a way that it's non-trivial and takes more
01:39:42
◼
►
than a split second of your attention. And so I do think there's a significant difference
01:39:46
◼
►
there. And some people have also said like, "Oh, well, what if glass is, you know, what
01:39:52
◼
►
if you can wear it but you have to keep it off?" Again, same thing. Like, you can't tell
01:39:58
◼
►
from the outside whether it's on or off. So a lot of people would just sneak by and say,
01:40:02
◼
►
"Oh, well, you know what, I'm just going to leave it on because, you know, I'm responsible.
01:40:05
◼
►
I'll be okay." And that's a problem too.
01:40:08
◼
►
That's why I'd like to know what Google was lobbying for. Like, basically, were they lobbying
01:40:11
◼
►
for you, you're allowed to wear it but it has to be off versus you're allowed to wear
01:40:13
◼
►
it and use it.
01:40:14
◼
►
I'm pretty sure it was the latter.
01:40:15
◼
►
I'm pretty sure they didn't distinguish between on and off, that they just wanted
01:40:18
◼
►
you to be able to wear it.
01:40:19
◼
►
Yeah, well, I think Google's bet, and I think they're right on this bet, is that
01:40:22
◼
►
augmented reality, as they call it, is probably the future of more or less everything.
01:40:26
◼
►
But I think you're right that if that is the future of everything, we are unfortunately
01:40:30
◼
►
going to have to wait for it to be built into cars, because then it will be more or less
01:40:34
◼
►
single purpose.
01:40:36
◼
►
And even if it's the exact same technology, the exact same kind of display where it overlays
01:40:40
◼
►
the nav on top of the street in front of you and shows you where you should turn with the
01:40:43
◼
►
like you can imagine lots of really cool features, because I think that is actually safer than looking down at a nav screen
01:40:48
◼
►
to see the little arrow and your little picture of your car going, if you could just continue to look out the window and it would just
01:40:53
◼
►
look like a little red line was painted on the road turning to the right. That's safer.
01:40:56
◼
►
And I think that is definitely coming. If that comes from your glasses, though,
01:41:00
◼
►
chances are good, even if you have like a driving mode, and even if the glasses have to be certified by, like, the Highway
01:41:06
◼
►
Association, it's so much safer when it's built into the car. A, because car technology moves so slowly that they're super conservative
01:41:12
◼
►
and everything sucks there anyway. And B, if it's built into the car, the car maker
01:41:16
◼
►
is extremely motivated to make sure that you can't read text on it.
01:41:19
◼
►
Right. Like, the car maker could not ship that.
01:41:22
◼
►
And like, yeah, and they're more reliable with that kind of stuff. Like, you could
01:41:27
◼
►
plausibly sue the car maker for designing a very distracting system a lot more easily than you could
01:41:31
◼
►
sue Google for making, you know... Right, because it's built into the car. Google's gonna say you
01:41:36
◼
►
shouldn't have been doing that while you're driving, but you're like, if it's built into the
01:41:39
◼
►
car you're going to say, "But you put it in the car."
01:41:42
◼
►
And that's why you can't enter the address when you're moving, because the car makers,
01:41:45
◼
►
if we allow them to do this, they will. They'll crash and they'll sue us.
01:41:48
◼
►
For what it's worth, I can actually read text messages on my iDrive in my BMW.
01:41:53
◼
►
Like on the screen?
01:41:55
◼
►
Well, you should crash into something and sue BMW and get a better car.
01:42:00
◼
►
I just wanted to point that out real quick. I'm sorry, Marco. Go ahead.
01:42:03
◼
►
Alright, so part of my position on this was a pretty severe condemnation, saying like,
01:42:09
◼
►
you know, Google is actively lobbying for something that I believe is pretty clearly
01:42:15
◼
►
very unsafe for driving. And car accidents are no joke. Car accidents
01:42:24
◼
►
are a serious problem. So many people get injured or die in car accidents. And for
01:42:30
◼
►
the most part, you know, things that we pay a lot of attention to, like plane crashes
01:42:34
◼
►
and terrorism, end up killing way fewer people than car crashes. Car crashes
01:42:39
◼
►
are a big problem. It's a really big deal and car safety should really be taken very,
01:42:45
◼
►
very seriously and not at all lightly and not at all recklessly. And so for Google to
01:42:51
◼
►
actively lobby against car safety basically, to actively lobby for their own self-interest
01:42:57
◼
►
in a way that's pretty clearly unsafe for general use in cars. People are probably going to
01:43:05
◼
►
die as a result of that. And if Google Glass becomes really popular, which it probably
01:43:11
◼
►
won't in all honesty, but suppose it does. Imagine how big of a problem that will be
01:43:17
◼
►
and how many people might unnecessarily die because Google fought safety legislation.
01:43:23
◼
►
That's really serious, you know, that's not a joking matter at all. So a lot
01:43:27
◼
►
of people said, "Well, isn't Apple at fault for people texting while using their phones
01:43:34
◼
►
and crashing?" And no, that's a completely different situation because Apple has not,
01:43:39
◼
►
to my knowledge, actively fought against anti-texting laws. And if they have, let me know, I would
01:43:46
◼
►
love to know that. But that is not at all the same thing and it's not comparable. You
01:43:52
◼
►
You know, it's obviously manufacturers on both sides
01:43:56
◼
►
of this weirdly political debate,
01:43:59
◼
►
manufacturers on both sides can potentially do more
01:44:03
◼
►
to prevent people from using their phones while driving.
01:44:06
◼
►
You know, now you can think, okay, well,
01:44:08
◼
►
Apple has this new M7 processor that can supposedly detect
01:44:12
◼
►
when you're in a car, it probably can't detect
01:44:15
◼
►
whether you're driving or a passenger,
01:44:16
◼
►
but it can detect whether you're in a car,
01:44:18
◼
►
maybe show some kind of warning or something,
01:44:19
◼
►
you know, there are things they could do,
01:44:20
◼
►
but they're not actively fighting against safety legislation.
01:44:25
◼
►
And that I think puts a very different type of action on it.
01:44:30
◼
►
What Google is doing I think is actively harmful,
01:44:33
◼
►
whereas the failure to try to prevent people
01:44:37
◼
►
from texting in a car is inaction, and
01:44:41
◼
►
inaction is not as bad as actively harming
01:44:44
◼
►
safety efforts, I think.
01:44:46
◼
►
- Well Google's also in the position to potentially
01:44:48
◼
►
bring the largest safety increase in car transportation
01:44:52
◼
►
ever through their self-driving cars,
01:44:54
◼
►
assuming that continues apace and does not fade away
01:44:56
◼
►
and they continue to be successful with it.
01:44:59
◼
►
I'm not saying that balances out.
01:45:01
◼
►
Therefore, you're allowed to do something
01:45:02
◼
►
that's going to kill more people now.
01:45:03
◼
►
That's a bad idea.
01:45:04
◼
►
But historically speaking, if history is any guide,
01:45:09
◼
►
everyone will forget that briefly they
01:45:11
◼
►
killed a bunch of people with Google Glass.
01:45:14
◼
►
I think what they're thinking with Google Glass is this:
01:45:17
◼
►
they also believe that augmented reality is the future.
01:45:20
◼
►
Just look at that crazy whatever-it-was-called thing,
01:45:22
◼
►
the phone thing that maps out the rooms and everything.
01:45:23
◼
►
The tech for doing that augmented reality stuff
01:45:25
◼
►
is getting better and better all the time.
01:45:27
◼
►
It's going to be everywhere.
01:45:28
◼
►
And Google was like, if we wait for the car makers to do this,
01:45:31
◼
►
it will take forever to get here and it will be crappy.
01:45:32
◼
►
And all of that is true, but I think they just have to--
01:45:36
◼
►
the alternative is they should either make their own car
01:45:39
◼
►
with this built in, hey, buy Tesla, go nuts,
01:45:41
◼
►
have a go at it.
01:45:42
◼
►
But trying to put it into a general purpose
01:45:45
◼
►
computing device that you wear on your face. A, you have to get everyone to wear that stuff
01:45:48
◼
►
on their faces. And B, they would have to sign up for all of the liability-type concerns that
01:45:55
◼
►
the car makers do in terms of regulation and everything. And they're not prepared to do that.
01:46:00
◼
►
They want the freedom to do whatever the hell they want. And just don't bother us,
01:46:05
◼
►
because eventually we're going to get to this awesome augmented reality that's coming
01:46:07
◼
►
eventually anyway. And we're going to get there first because we can move faster. But
01:46:12
◼
►
That's playing a little bit fast and loose with people's lives, I agree.
01:46:15
◼
►
And that also seems like almost a childish and naive…
01:46:19
◼
►
That's Google for you.
01:46:24
◼
►
That's the problem.
01:46:25
◼
►
Well, that's what we love about—that's what I love about Google.
01:46:27
◼
►
Like, the self-driving cars is the same type of thing.
01:46:29
◼
►
That's what we love about it, but at the same time, it gets them into that.
01:46:32
◼
►
That's why I always think of Google's corporate mindset not as evil, more as like
01:46:38
◼
►
naive hacker type of, you know, like, a denizen of Reddit. Like, technology is cool,
01:46:44
◼
►
we can do cool things with technology, and let's just go do those cool things because
01:46:47
◼
►
it's cool. And let's not think too much about the consequences or whether it will make money
01:46:51
◼
►
or anything like that, you know. That's what we both love and hate about Google.
01:46:55
◼
►
All right. Titles. Go to fail.
01:46:59
◼
►
How's the title not "Go to Fail"? Someone put it in all caps; this isn't BASIC.
01:47:05
◼
►
Haven't you seen the code? Oh, yeah, that's one point we didn't address
01:47:08
◼
►
about the goto thing that I do want to address. All the people who have never seen a C program
01:47:11
◼
►
or Unix C program are like, "goto? Who uses that?" If you are a C programmer, and you've come from the
01:47:18
◼
►
old school, like, just look at the source code of your favorite Unix: look at BSD, look at Linux or
01:47:22
◼
►
whatever, you will find goto everywhere, because they didn't really have exceptions. And if
01:47:28
◼
►
the control flow requires lots of nested ifs, then to get yourself out of an if you end up having to make,
01:47:34
◼
►
like, flag variables and really contorted logic. Goto is actually the cleanest solution in those
01:47:39
◼
►
situations where you want to get out of the normal flow of the program and go down to a label, you know,
01:47:43
◼
►
or you could do setjmp or whatever you want. Goto is the idiom that, believe it or not, I know
01:47:49
◼
►
the only thing you've heard about programming is that GoTo is evil because there's a paper that
01:47:52
◼
►
you've never read that got passed around the internet 10 years ago. But look at any real C
01:47:56
◼
►
source code and GoTo is there and it's used for exactly this purpose and it has all these same
01:48:02
◼
►
problems. This is not what the paper about GoTo was against, really. But you can see,
01:48:07
◼
►
goto does contribute to this anti-pattern, but that's why newer, better languages have…
01:48:13
◼
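The goto-cleanup idiom being described above can be sketched in plain C; this is a minimal illustration of the pattern, and the function and resource names here are invented, not from any code mentioned in the show:

```c
#include <stdio.h>
#include <stdlib.h>

/* Sketch: acquire several resources in sequence; on any failure,
   jump to a label that releases whatever was acquired so far.
   Without goto, this becomes deeply nested ifs or flag variables. */
static int process(const char *path)
{
    int ret = -1;          /* pessimistic default: failure */
    FILE *f = NULL;
    char *buf = NULL;

    f = fopen(path, "r");
    if (f == NULL)
        goto out;          /* nothing acquired yet, just return */

    buf = malloc(4096);
    if (buf == NULL)
        goto out_close;    /* must still close the file */

    /* ... real work would happen here ... */
    ret = 0;

    free(buf);
out_close:
    fclose(f);
out:
    return ret;
}
```

The cleanup labels run in reverse order of acquisition, which is exactly the shape you see all over BSD and Linux kernel code.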
►
Yeah, and GoTo is—there's things like break and continue in a loop that, I would
01:48:18
◼
►
argue, break and continue, especially break, that's really not a whole lot cleaner than
01:48:24
◼
►
GoTo. If you think about it, it's—
01:48:28
◼
►
But you need—if you have very contorted logic, like you have a condition, and then a
01:48:30
◼
►
condition and a condition, and you want to break all the way out of it, then you—even
01:48:34
◼
►
break doesn't save you—then you just end up with flag variables and it makes the code
01:48:37
◼
►
incomprehensible.
01:48:38
◼
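The contrast being described, where break only gets you out of one level and a flag variable has to be threaded through every loop, can be sketched with a simple 2-D search; both functions here are invented for illustration:

```c
/* `break` only exits one loop level, so without goto you need a
   flag variable checked at every enclosing level. */
static int find_flag(const int g[3][3], int target)
{
    int found = 0;
    for (int i = 0; i < 3 && !found; i++) {
        for (int j = 0; j < 3; j++) {
            if (g[i][j] == target) {
                found = 1;
                break;      /* exits only the inner loop */
            }
        }
    }
    return found;
}

/* The goto version jumps straight out of both loops at once. */
static int find_goto(const int g[3][3], int target)
{
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            if (g[i][j] == target)
                goto found;
    return 0;
found:
    return 1;
}
```

With three or four nesting levels the flag version gets markedly worse, which is the "incomprehensible" case being described.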
►
The people who read up in the function a little bit and saw the type of the variable, OSStatus,
01:48:44
◼
►
I'm pretty sure if you knew what that was, you were not surprised to see the goto. Like,
01:48:48
◼
►
I think that's like, you know, if you've been around old C APIs long enough, you know,
01:48:54
◼
►
you've probably seen that, and it's, you know, a typical thing where it's like, you call
01:48:57
◼
►
a bunch of API calls, and if they return zero, everything's cool, and if they return non-zero,
01:49:02
◼
►
something bad happened. And so if you want to do this, like, you know, eight-step process
01:49:07
◼
►
where you have to call these eight different API functions, and you have to have error
01:49:10
◼
►
checking code around every single one of them to say if any of these return non-zero, fail.
01:49:15
◼
►
That's what this was. That's exactly what this was. And fail, in this case, like the
01:49:21
◼
►
fail label didn't mean it has failed. The fail label was the destination of where to
01:49:26
◼
►
jump to if it had failed, which is basically like, go to the end. Like, go to the end of
01:49:32
◼
►
this logic block. That's what it was.
01:49:34
◼
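The multi-step OSStatus cascade described above, and the shape of the famous duplicated-line bug, can be sketched roughly like this; this is a hedged reconstruction of the idiom, not the actual SecureTransport source, and the constants and step functions are placeholders:

```c
/* Hypothetical OSStatus-style codes; the real constants differ. */
typedef int OSStatus;
enum { noErr = 0, errCheckFailed = -1 };

static OSStatus step_ok(void)  { return noErr; }
static OSStatus step_bad(void) { return errCheckFailed; }

/* The idiom: run each step, jump to the `fail` label on the first
   non-zero status. `fail` is the destination on failure, not a
   statement of failure: on success we fall through to it anyway. */
static OSStatus run_checks(OSStatus (*a)(void), OSStatus (*b)(void))
{
    OSStatus err;
    if ((err = a()) != 0)
        goto fail;
    if ((err = b()) != 0)
        goto fail;
fail:
    /* cleanup would go here; err holds the first failure, or 0 */
    return err;
}

/* The bug's shape: one duplicated `goto fail;` always jumps,
   so b() never runs, and if a() succeeded err is still 0. */
static OSStatus run_checks_buggy(OSStatus (*a)(void), OSStatus (*b)(void))
{
    OSStatus err;
    if ((err = a()) != 0)
        goto fail;
        goto fail;          /* unconditional: skips the check below */
    if ((err = b()) != 0)
        goto fail;
fail:
    return err;             /* reports "success" despite skipping b() */
}
```

Note that the buggy version is still perfectly valid C; the second `goto fail;` is just indented to look conditional, which is why it was so easy to miss.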
►
That's another Objective-C thing that we could have talked about in the Copeland 2010 show,
01:49:37
◼
►
is that, you know, it's a convention in basically Cocoa, not so much Objective-C, but Cocoa,
01:49:41
◼
►
to not use exceptions for control flow. Is that correct?
01:49:46
◼
►
Yeah. I mean, in Objective-C, exceptions are, by, like, policy and, like, the
01:49:54
◼
►
API norm, not meant to happen in running code most of the time. It's not meant to
01:50:02
◼
►
be control flow. You're probably not meant to catch an exception in Objective-C.
01:50:08
◼
►
Right. And it's not so much a language feature as it is an API convention, but still,
01:50:13
◼
►
that type of thing leads you to these—the tedium of out-param error conventions, where you pass
01:50:22
◼
►
an address to some error thing that's going to fill that, or return values where it's
01:50:26
◼
►
some status, whether it's OS status or any other type of thing.
01:50:29
◼
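The out-parameter error convention being described can be sketched in plain C; this loosely mirrors the NSError** and OSStatus styles, and all the names here are made up for illustration:

```c
#include <stddef.h>

/* The convention: the return value carries the result, and the
   caller passes the address of an error struct that the callee
   fills in on failure. Callers may pass NULL if they don't care. */
typedef struct {
    int code;               /* 0 means no error */
    const char *message;
} Error;

/* Returns the quotient; on division by zero, fills *err and
   returns 0 as a dummy value. */
static int checked_div(int a, int b, Error *err)
{
    if (b == 0) {
        if (err != NULL) {
            err->code = 1;
            err->message = "division by zero";
        }
        return 0;
    }
    return a / b;
}
```

The tedium is that every caller must remember to declare the error variable, pass its address, and check it afterward, which is exactly what exception-based languages automate.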
►
That type of thing is seen as slightly barbaric in languages that do allow you to use exceptions,
01:50:38
◼
►
and do expect you to use exceptions for flow control.
01:50:40
◼
►
And then of course there's the pathological case of, like, you know, checked exceptions
01:50:43
◼
►
and all the Java crap, and like it can go too far in the other direction as well.
01:50:46
◼
►
But I would have put that on a list of things about Objective-C that other languages do differently
01:50:51
◼
►
that people find cumbersome and tedious and error-prone and lead to these type of situations
01:50:57
◼
►
where you find yourself needing a goto.
01:50:59
◼
►
As far as I know, in my Objective-C code I've written so far, Instapaper, Bugshot, The Magazine,
01:51:05
◼
►
and Overcast, I don't think I've ever written a try/catch block. I think I've always made
01:51:10
◼
►
exceptions blow up the whole thing.
01:51:12
◼
►
Yeah, because in Cocoa, you're not supposed to—like, exception is supposed to be exceptional.
01:51:16
◼
►
That's my understanding of the policy, is that it's not for control flow. It's not
01:51:19
◼
►
just get me out of this nested block, it's like something has gone wrong, and maybe you
01:51:24
◼
►
can put up a dialog and exit your app or do something like that, but you're not supposed
01:51:27
◼
►
to use it as a form of flow control, even though I assume you can if you wanted to.
01:51:32
◼
►
Yeah, it's more like an assertion failure in Objective-C, I think.
01:51:35
◼
►
Yeah, exactly.
01:51:36
◼
►
That's how you're more supposed to use it.
01:51:37
◼
►
In fact, I think assertion failures might even throw exceptions.
01:51:39
◼
►
I don't remember exactly, but anyway.
01:51:42
◼
►
All right, titles.
01:51:45
◼
►
GoToFail is obvious.
01:51:47
◼
►
I'm not against it.
01:51:49
◼
►
A lot of podcasts I think are gonna have that title this week though, so we might want to
01:51:52
◼
►
avoid it so we can be cool.
01:51:55
◼
►
So we can be cool.
01:51:56
◼
►
I added "ish."
01:51:58
◼
►
Everything's relative.
01:52:03
◼