149: Caramel
00:00:06
◼
►
From Relay FM, this is Connected, episode 149.
00:00:10
◼
►
Today's show is brought to you by Mack Weldon, Hover and Igloo.
00:00:14
◼
►
My name is Myke Hurley and we've got the band back together because Stephen Hackett is here.
00:00:19
◼
►
Hello, Stephen Hackett.
00:00:20
◼
►
Hello, Michael Hurley.
00:00:22
◼
►
How are you?
00:00:23
◼
►
I'm good. It's the 4th of July. It's raining outside.
00:00:28
◼
►
We're here together.
00:00:29
◼
►
Is this a haiku? It sounds like a haiku.
00:00:32
◼
►
Yeah, I don't know, I haven't counted out the syllables, but I'm a little disappointed that I haven't gotten any gifts for the 4th of July yet from you.
00:00:38
◼
►
What about Federico, did you get me anything?
00:00:41
◼
►
No, and this reminds me of a joke on the Big Bang Theory when, like,
00:00:47
◼
►
someone convinced Rajesh that on the 4th of July people from another country are supposed to give Americans gifts.
00:00:55
◼
►
And so for like 10 years he gave his friends gifts on the 4th of July to celebrate America.
00:01:00
◼
►
That's pretty good.
00:01:03
◼
►
That's not a thing.
00:01:04
◼
►
I don't think you're supposed to give your American friends gifts on the 4th of July.
00:01:07
◼
►
So I don't have anything for you.
00:01:10
◼
►
I'm assuming that the postal service is closed on the 4th of July, right?
00:01:14
◼
►
I would think so.
00:01:15
◼
►
That's why you haven't got your gift from me.
00:01:16
◼
►
It's in the mail but then I'm delivering it today.
00:01:19
◼
►
Interesting.
00:01:20
◼
►
Tomorrow is going to be a great day for you.
00:01:22
◼
►
That's a real promise that I feel like you're going to break.
00:01:28
◼
►
Yeah, same day delivery.
00:01:32
◼
►
So we had some people write in and ask about the prompt curse.
00:01:37
◼
►
You all mentioned this, I think on last week's episode about the iPad and iOS 11 and exploding.
00:01:45
◼
►
We should explain what it is because we have a lot of listeners who weren't around during
00:01:48
◼
►
the prompt days.
00:01:50
◼
►
We had a show before this called The Prompt.
00:01:52
◼
►
The archives are on the Relay website.
00:01:53
◼
►
You can go dig through and find them.
00:01:55
◼
►
You find the one about the iPhone keynote.
00:01:57
◼
►
I think it's episode 30, and that's probably by far the best one.
00:02:02
◼
►
But the prompt curse happened because we
00:02:07
◼
►
used to cover photo services on the show a lot
00:02:11
◼
►
back when they were kind of a thing, before iCloud Photo
00:02:13
◼
►
Library and Google Photos took everything over.
00:02:15
◼
►
So we'd cover companies like Everpix.
00:02:18
◼
►
And what were some of the others?
00:02:19
◼
►
I don't remember.
00:02:20
◼
►
Yeah, I don't know.
00:02:21
◼
►
They're all gone because we spoke about them.
00:02:25
◼
►
So we would talk about a photo service
00:02:28
◼
►
and the next week they would have a Medium post
00:02:30
◼
►
apologizing that they were going out of business
00:02:32
◼
►
and all your photos were gone.
00:02:34
◼
►
This led to a website that'll be in the show notes,
00:02:37
◼
►
prompt.photos, which is, I haven't seen this
00:02:41
◼
►
in like a couple of years and it is amazing.
00:02:44
◼
►
So the Prompt curse: we talk about a service or an app
00:02:47
◼
►
and then the service or app goes out of business.
00:02:50
◼
►
It has happened to things other than photo services, but it is a thing that happened,
00:02:55
◼
►
it is a thing that continues to happen.
00:02:56
◼
►
We have this, I don't want to call it a gift, I don't know what to call it, we have this
00:03:01
◼
►
It's a curse. That's in the name.
00:03:03
◼
►
We have a very specific set of skills.
00:03:05
◼
►
That's right.
00:03:06
◼
►
Which is to find services and to kill them.
00:03:11
◼
►
This is what we do.
00:03:13
◼
►
So if you have an app or a service that you would like sunsetted, get in touch and we
00:03:17
◼
►
will talk about it on the air.
00:03:18
◼
►
We'll talk about it and kill it.
00:03:19
◼
►
I'm not even sure Medium was a thing back when we were doing The Prompt and these companies would shut down.
00:03:25
◼
►
Back in the day it was so difficult to announce that your company was laying off people and shutting down.
00:03:32
◼
►
It was a real struggle and then Medium came along and sort of provided this new niche.
00:03:37
◼
►
For these companies that need to announce that they've been acquired, Medium kind of revolutionized that space of shutdown announcements.
00:03:46
◼
►
It's really notable in hindsight.
00:03:50
◼
►
This leads to a paradox. Undoubtedly Medium will go out of business.
00:03:54
◼
►
Like probably sooner rather than later. They are heading that way.
00:03:58
◼
►
Yeah, they're doomed. Where does Medium announce that they are going out of business?
00:04:02
◼
►
My money is on the Medium programmers
00:04:06
◼
►
leaving some ASCII code in the source code
00:04:10
◼
►
of the webpage. Like you can inspect the webpage and you will see the announcement
00:04:14
◼
►
in the HTML, you know, that kind of thing.
00:04:18
◼
►
Sort of like a meta shutdown announcement.
00:04:22
◼
►
- Or they could go old school and put it on Tumblr
00:04:24
◼
►
or LiveJournal, there's lots of options.
00:04:26
◼
►
- I would say a self-hosted Jekyll install
00:04:29
◼
►
with a Dropbox folder on GitHub.
00:04:33
◼
►
- Yeah, all of Medium just redirects to a gist
00:04:36
◼
►
and it's just like, oh, sorry, we ran out of money.
00:04:40
◼
►
VCs have no place to apologize now.
00:04:42
◼
►
It's gonna be sad.
00:04:43
◼
►
So Myke, have you installed the iOS 11 beta yet?
00:04:50
◼
►
That didn't take long.
00:04:51
◼
►
I was setting myself up for a summer full of jokes.
00:04:54
◼
►
It's over in one week.
00:04:56
◼
►
It was just-- I really wanted to do it.
00:05:01
◼
►
Can't wait to do it.
00:05:01
◼
►
We had to record Upgrade early.
00:05:04
◼
►
And there wasn't any news.
00:05:06
◼
►
So the only thing that we could think to talk about was iOS 11.
00:05:11
◼
►
And I didn't want another episode
00:05:12
◼
►
where I was just asking people questions, so I installed it.
00:05:16
◼
►
- So you're asking yourself questions?
00:05:19
◼
►
- Well, it's just me and Jason were just talking
00:05:21
◼
►
about some of our experiences.
00:05:23
◼
►
Mine were very early 'cause I only did it that morning,
00:05:25
◼
►
so I'd only used it for a couple of hours.
00:05:27
◼
►
But it was really, it was just the excuse that I needed,
00:05:31
◼
►
'cause I really wanted to do it.
00:05:32
◼
►
I kept backing up my iPad, I just kept doing it.
00:05:36
◼
►
I would back up my iPad and then I would chicken out.
00:05:39
◼
►
I did this for four days.
00:05:40
◼
►
I kept like, I back it up and there's no, no, I can't do it.
00:05:43
◼
►
I can't do it and I back it up and I can't do it.
00:05:44
◼
►
So I'm using it and I really like it.
00:05:47
◼
►
There are some things that are horrifically broken,
00:05:49
◼
►
but that was expected, right?
00:05:50
◼
►
Like TestFlight is just kaput for me,
00:05:53
◼
►
which I know is a thing that happens to some
00:05:56
◼
►
and not to others. Some third-party apps are working,
00:06:00
◼
►
but missing key features.
00:06:01
◼
►
Like I currently cannot export anything out of Dropbox.
00:06:05
◼
►
I have to use other apps to get things out of Dropbox.
00:06:15
◼
►
absolutely in love with it. Like I am,
00:06:19
◼
►
I already feel so much more productive.
00:06:21
◼
►
Like it takes getting used to cause you have to adapt your workflows,
00:06:24
◼
►
but I feel like I have so much more control over the apps.
00:06:29
◼
►
Like I did this one thing today where I had two apps open side by side, right?
00:06:33
◼
►
And I was like,
00:06:35
◼
►
It'd be so much easier if I could just move this one to the left.
00:06:38
◼
►
And I was like, I wonder if I can do that. And yes you can.
00:06:41
◼
►
you can swap the apps around now.
00:06:42
◼
►
And I nearly squealed with excitement.
00:06:46
◼
►
And then I was like, oh, I need a calculator up,
00:06:48
◼
►
and I'd usually just open another app.
00:06:50
◼
►
But no, I can bring up PCalc
00:06:52
◼
►
in the little slide-over window.
00:06:54
◼
►
And there are some things that are different,
00:06:56
◼
►
like Command + Tab, whilst also being pretty broken
00:06:59
◼
►
right now, doesn't do what it used to do,
00:07:02
◼
►
where it would just switch out one app.
00:07:03
◼
►
But I'm getting really quick at just flicking up the dock,
00:07:06
◼
►
bringing out the app, and I also feel like I just have
00:07:08
◼
►
way more control about where apps are going to go.
00:07:11
◼
►
Like I can open them in the specific place that I want them to be rather than
00:07:15
◼
►
opening and then readjusting everything around it. Like,
00:07:17
◼
►
I know a lot of people are struggling right now and think it's complicated and
00:07:22
◼
►
coming from their previous workflow,
00:07:23
◼
►
but I think we're just holding onto what is fundamentally a broken way of doing
00:07:28
◼
►
things with the iOS nine version of multitasking.
00:07:31
◼
►
Like this is vastly superior in, I think, almost every single way,
00:07:36
◼
►
like I love it.
00:07:37
◼
►
Yeah, I have to agree with you, especially now that I'm using a bunch of third-party
00:07:43
◼
►
apps that have drag-and-drop support, putting together the various bits from my review.
00:07:50
◼
►
And just using this stuff, the slide-over with three apps at the same time, and drag-and-drop
00:07:56
◼
►
and some of these other changes are really just so impressive.
00:08:02
◼
►
Coming from iOS 10, I'm saving so many steps where I would have otherwise used workarounds
00:08:12
◼
►
and workflows, and now what I'm noticing is I'm using way fewer workflows than I used to.
00:08:18
◼
►
And also I'm copying stuff to the clipboard way less, because you don't need to use the
00:08:23
◼
►
clipboard anymore just to move data back and forth between different apps, you can just
00:08:27
◼
►
hold it and move.
00:08:28
◼
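For readers who want to see what "drag-and-drop support" actually asks of a third-party app, here is a minimal sketch of the iOS 11 UIKit drag and drop API being discussed. The view controller, label, and plain-text payload are purely illustrative, not any specific app's implementation.

    import UIKit

    // Minimal sketch: a label acts as a drag source, the view accepts text drops.
    class NoteViewController: UIViewController, UIDragInteractionDelegate, UIDropInteractionDelegate {

        let noteLabel = UILabel()

        override func viewDidLoad() {
            super.viewDidLoad()
            noteLabel.text = "Drag me between apps"
            noteLabel.isUserInteractionEnabled = true
            view.addSubview(noteLabel)

            // Make the label a drag source and the whole view a drop target.
            noteLabel.addInteraction(UIDragInteraction(delegate: self))
            view.addInteraction(UIDropInteraction(delegate: self))
        }

        // Provide the data for a drag that starts on the label.
        func dragInteraction(_ interaction: UIDragInteraction,
                             itemsForBeginning session: UIDragSession) -> [UIDragItem] {
            let provider = NSItemProvider(object: (noteLabel.text ?? "") as NSString)
            return [UIDragItem(itemProvider: provider)]
        }

        // Accept plain text dropped in from another app.
        func dropInteraction(_ interaction: UIDropInteraction,
                             canHandle session: UIDropSession) -> Bool {
            return session.canLoadObjects(ofClass: NSString.self)
        }

        func dropInteraction(_ interaction: UIDropInteraction,
                             sessionDidUpdate session: UIDropSession) -> UIDropProposal {
            return UIDropProposal(operation: .copy)
        }

        func dropInteraction(_ interaction: UIDropInteraction,
                             performDrop session: UIDropSession) {
            session.loadObjects(ofClass: NSString.self) { items in
                if let text = items.first as? String {
                    self.noteLabel.text = text
                }
            }
        }
    }

The point Federico is making maps onto this shape: the dragged data travels as an item provider between apps, so nothing has to go through the clipboard.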
►
And I know that this is a discussion that we'll probably have later on.
00:08:33
◼
►
It takes, you know, you need to use multiple fingers, sometimes multiple hands,
00:08:37
◼
►
and it can feel like a whole circus going on, like you need to be a contortionist
00:08:41
◼
►
and perform these gestures, but they do work.
00:08:44
◼
►
And there's a kind of like geeky pleasure of being able to have these gestures
00:08:52
◼
►
and this multi-touch enabled drag and drop.
00:08:54
◼
►
It feels fun, and that's hard to explain.
00:08:57
◼
►
I feel like I'm really in control.
00:09:00
◼
►
Like I feel like I have so much more control
00:09:03
◼
►
of what's going on than I did before.
00:09:05
◼
►
'Cause I don't know, like I feel like I'm able to bend it
00:09:07
◼
►
all to my will a little bit, but like more than I used to,
00:09:10
◼
►
rather than me working like around all of those things.
00:09:13
◼
►
Like it's much nicer.
00:09:15
◼
►
I will say I'm not using drag and drop, right?
00:09:18
◼
►
Because most of the apps that I'm using don't support it.
00:09:20
◼
►
So I'm looking forward to that additional world
00:09:23
◼
►
of drag and drop as things move along throughout the year.
00:09:26
◼
►
I will say, just to follow up on last week,
00:09:28
◼
►
I don't want to get into this again because it makes me sad,
00:09:30
◼
►
but the notifications and widgets cover sheet thing
00:09:32
◼
►
is as much of a disaster as Federico outlined.
00:09:36
◼
►
- That's literally the next thing in follow up.
00:09:38
◼
►
But, yeah, so I'm running it too.
00:09:43
◼
►
I put it on my 10.5 inch iPad on Friday,
00:09:45
◼
►
and I very quickly realized that my entire
00:09:49
◼
►
iPad home screen was wrong,
00:09:50
◼
►
because apps in the dock are blessed in a way
00:09:53
◼
►
that apps not in the dock aren't.
00:09:55
◼
►
Kind of like watchOS, actually,
00:09:57
◼
►
in that those things in the dock
00:09:59
◼
►
are easier to get to and can do more things.
00:10:01
◼
►
And so I basically collapsed my home screen into the dock
00:10:04
◼
►
and sort of promoted stuff that had been in folders
00:10:07
◼
►
on the second screen up to the home screen.
00:10:09
◼
►
So nothing is where it used to be.
00:10:11
◼
►
But like the two of you,
00:10:13
◼
►
I do feel like I'm in more control.
00:10:15
◼
►
And I'm still sort of learning how everything works.
00:10:20
◼
►
Sometimes I'll try something
00:10:22
◼
►
and it doesn't work the way I expect it to.
00:10:23
◼
►
And some of the gestures are a little less than ideal.
00:10:27
◼
►
Like if you have an app, for instance, that uses sliding panes like Slack, and you have
00:10:32
◼
►
it in the third popover window, and you want to push it back off the screen, and if you
00:10:38
◼
►
don't get it just right, you're just moving the interface around inside of Slack instead
00:10:42
◼
►
of moving Slack itself, because it's kind of a very thin strip you need to hit.
00:10:46
◼
►
I think a lot of developers still need to approach that.
00:10:51
◼
►
I feel like people haven't done that yet, right?
00:10:54
◼
►
Like, worked out, like, how do I make sure
00:10:56
◼
►
I give people enough space to be able to move
00:10:58
◼
►
the UI around, right?
00:11:01
◼
►
- Yeah, but all in all, it's impressive,
00:11:03
◼
►
and I think that iOS 11 is gonna be awesome.
00:11:08
◼
►
So, way to go, iOS 11 team.
00:11:12
◼
►
But you said you didn't want to get in the cover sheet,
00:11:14
◼
►
and that's too bad, because that is the next thing
00:11:15
◼
►
in follow up.
00:11:16
◼
►
We have a long tweet thread, we'll link to the first one
00:11:19
◼
►
in the show notes, kind of following up on what you said, Myke, about, you
00:11:24
◼
►
know, maybe is there something about this design that is giving us hints or makes
00:11:30
◼
►
more sense with the next iPhone. So Apple's done this in the past right
00:11:34
◼
►
where they'll put things in iOS that don't quite make full sense until you
00:11:38
◼
►
see the new hardware, and I think that's what this tweet thread is getting at. So
00:11:43
◼
►
Federico, do you want to take this?
00:11:46
◼
►
So the idea here is that if we consider a future iPhone without a home button and with
00:11:54
◼
►
a more prominent use of 3D Touch, the argument from this person goes, the cover sheet design
00:12:01
◼
►
makes sense because to open the phone, to unlock the phone, you tap on a notification,
00:12:10
◼
►
So all you need to do is you 3D touch on a notification, you unlock the phone and you
00:12:19
◼
►
see what that notification is all about.
00:12:23
◼
►
And it makes sense if you follow this idea to merge the notifications and the lock screen
00:12:31
◼
►
because you pick up the phone, you look at what's new, you tap on a notification and
00:12:36
◼
►
you authenticate somehow, whether that's Touch ID or maybe by looking at the phone, which
00:12:41
◼
►
we're going to talk about in a few minutes.
00:12:44
◼
►
So the idea is you remove confusion by merging these two areas of iOS and you have a list
00:12:53
◼
►
of notifications, you tap it and you go into the phone, which is unlocked and it lets you
00:12:56
◼
►
view the notification.
00:12:58
◼
►
I sort of understand the idea behind this, to merge everything, to make a single place
00:13:05
◼
►
where you can view all of your messages and alerts from apps and to also unlock
00:13:11
◼
►
the phone the moment that you open a notification. But the problem
00:13:16
◼
►
that I have with this idea is it kind of falls apart if you have no notifications.
00:13:22
◼
►
And at that point what is left to do on the lock screen? You authenticate and
00:13:28
◼
►
you're back into the phone so we're not really merging these two aspects because
00:13:34
◼
►
then again if you unlock the phone and then you swipe down and then you view no notifications,
00:13:39
◼
►
then you need to swipe up again. And as I said last week, my main problem is this sort of
00:13:45
◼
►
this seesaw approach of I need to swipe down and I need to swipe up even more carefully until I feel
00:13:53
◼
►
this taptic feedback from the iPhone that tells me you're scrolling up to view your older notifications.
00:14:01
◼
►
Whereas before, if I swipe down, I knew I could just keep scrolling without a release mechanism
00:14:07
◼
►
to load my previous notifications. And so, while I understand the current design of "we want to
00:14:14
◼
►
make it easier for people because they associate the lock screen with notifications, therefore,
00:14:19
◼
►
when you swipe down, you view notifications, so you view the lock screen", it's the interaction
00:14:24
◼
►
that bothers me. It's the fact that you need to release very intentionally to load the older
00:14:31
◼
►
notifications and also the complete absence of gestures to triage individual notifications,
00:14:39
◼
►
which is basically impossible now because you need to carefully tap on a notification,
00:14:43
◼
►
which a lot of people have problems with. For example, Silvia cannot long press on her iPhone.
00:14:48
◼
►
I don't know if it's because she's a woman, she has obviously, you know, she has longer fingernails, you know.
00:14:54
◼
►
She's a different kind of user, so she cannot use 3D touch properly.
00:15:00
◼
►
She cannot even long press properly on the screen because her touches are often not recognized.
00:15:05
◼
►
And so you're making it more difficult to expand a single notification.
00:15:10
◼
►
Whereas it was much easier to just swipe across the screen and do one thing at a time with individual notifications.
00:15:18
◼
►
It's just, I understand the design, I understand the idea, it's the implementation that could be so much better.
00:15:25
◼
►
And again, the release mechanism of dividing the unseen notifications versus the older ones,
00:15:34
◼
►
and you need to swipe up and you feel the tap and you need to load them,
00:15:39
◼
►
it's very intentional and it really slows me down.
00:15:42
◼
►
And I would like to see in Beta 3 or Beta 4, whatever,
00:15:46
◼
►
a refinement of this idea, because it can go somewhere, but not with the current
00:15:52
◼
►
design in my opinion. I think that knowing everything that we know and
00:15:56
◼
►
we're gonna get into these iPhone rumors next because I think we just have to
00:15:59
◼
►
follow on with this. I see that there is some thought in this that makes
00:16:07
◼
►
sense to me like if looking at your phone now unlocks your phone then all
00:16:12
◼
►
you need to do is just interact with it in a way right so like just swiping up
00:16:16
◼
►
from the bottom or I expect there to be like some virtual button on the home screen that
00:16:21
◼
►
maybe you would just tap and it would take you there right like it just opens it up or
00:16:25
◼
►
something right because it's going to be all this new screen with a state or something
00:16:28
◼
►
but I could see a world in which this makes sense if touch ID is not needed anymore. I
00:16:34
◼
►
agree with you, like, some of the implementation is wonky and that can get better, but it really
00:16:38
◼
►
does feel to me like there's something in this which is lining up with the rest of
00:16:43
◼
►
the rumors that we've heard about the iPhone considering how little sense this
00:16:47
◼
►
currently makes with what we're currently using that it feels to me that
00:16:50
◼
►
there just has to be something there. I don't know what hardware would dictate
00:16:55
◼
►
us losing the ability to dismiss a single notification. That's
00:17:00
◼
►
broken but I'm not talking about that like I'm talking about the idea of
00:17:03
◼
►
merging all of this stuff together as a concept. So the implementation still
00:17:08
◼
►
leaves a lot to be desired, like the fine details, but the overall idea of why you would do this
00:17:14
◼
►
I think lends to that.
00:17:19
◼
►
I think that's it for follow up.
00:17:20
◼
►
Alright let's...
00:17:21
◼
►
You've done it.
00:17:22
◼
►
Yeah I want to just talk about this iPhone stuff now because we're moving into it already.
00:17:25
◼
►
Today's show is brought to you by Mack Weldon.
00:17:28
◼
►
The most comfortable underwear, socks, shirts, undershirts, hoodies and sweatpants that you
00:17:32
◼
►
will ever wear are made by this company because they believe that by pairing premium fabrics
00:17:40
◼
►
with meticulous attention to detail and giving you a simple shopping experience to get them
00:17:44
◼
►
home, Mack Weldon can deliver to you a new level of daily comfort straight to your door. They
00:17:50
◼
►
believe, and I agree, that Mack Weldon is better than anything else that you have, better
00:17:55
◼
►
than whatever you're wearing right now. They make undershirts that stay tucked, socks that
00:17:59
◼
►
stay up and waistbands that do not roll. Everything is made with premium cotton blended with natural
00:18:04
◼
►
fibres and I love that their website is built to get you in and out as quickly as possible.
00:18:08
◼
►
They don't want to waste your time, they want to do their best to get you what you
00:18:11
◼
►
need and get you on your way. Mack Weldon are so confident in the quality of their products
00:18:16
◼
►
that they have a no questions asked return policy. They want to make sure that you're
00:18:20
◼
►
super comfortable in what you wear. If for any reason you don't like your first pair
00:18:23
◼
►
just keep them and they'll refund you, no questions asked. I love my Mack Weldon clothing
00:18:29
◼
►
and I desperately need more of it and I'm going to take care of that very soon as I'm
00:18:33
◼
►
making another trip to the US. So Stephen look out for a package, that won't be your
00:18:37
◼
►
gift but it might be some new underwear for me. Not only do Mack Weldon's underwear, socks
00:18:41
◼
►
and shirts look good, they perform well too. They're good for working out, going to work,
00:18:45
◼
►
travelling or for everyday life. Listeners of this show can get 20% off at mackweldon.com
00:18:50
◼
►
by using the code CONNECTED. Thank you so much to Mack Weldon for their support of this show
00:18:55
◼
►
and Relay FM.
00:18:58
◼
►
I feel like we've gone around this tree quite a few times, but we're back to talk about Touch ID
00:19:03
◼
►
again on the potential next iPhone. Which I think for the case of this discussion we'll
00:19:10
◼
►
just refer to as the iPhone Pro. Are we all happy to refer to this phone as the iPhone
00:19:14
◼
►
Pro for this discussion?
00:19:15
◼
►
Alright, yeah, sure. Okay.
00:19:16
◼
►
Just so we can differentiate it in some way.
00:19:19
◼
►
So our good old friend at KGI Securities, Ming-Chi Kuo has issued a predictions report
00:19:25
◼
►
for the iPhone Pro and these are some of the things that Ming-Chi Kuo mentions.
00:19:30
◼
►
So I'll read these through and then we'll jump into the Touch ID one because it's probably
00:19:33
◼
►
the more interesting of all of this.
00:19:35
◼
►
So Ming-Chi Kuo says that the next iPhone will feature the biggest screen to body ratio
00:19:40
◼
►
of any phone on the market, therefore having the thinnest bezels.
00:19:44
◼
►
The screen will be OLED.
00:19:46
◼
►
The home button will be virtual.
00:19:49
◼
►
The front camera will include 3D depth sensing technology that will help with face scanning
00:19:55
◼
►
because Ming-Chi Kuo says that the next iPhone will have no Touch ID sensor of any kind.
00:20:06
◼
►
We predict the OLED model won't support fingerprint recognition.
00:20:09
◼
►
Reasons being, one:
00:20:10
◼
►
The full screen design doesn't work with existing capacitive fingerprint recognition
00:20:15
◼
►
And two, the scan-through ability of the under-display fingerprint solution still has technical challenges.
00:20:20
◼
►
So Ming-Chi Kuo is saying that Apple has not been able to embed a Touch ID sensor into
00:20:29
◼
►
the screen of the phone.
00:20:31
◼
►
So instead of putting it anywhere else on the device, they are getting rid of Touch
00:20:36
◼
►
ID in favour of face scanning, 3D depth sensing face scanning which they have implemented
00:20:43
◼
►
for this phone and that it would be good enough. So there's a few questions about this and
00:20:49
◼
►
there's also a Bloomberg report which we'll get to in a moment which I think addresses
00:20:52
◼
►
some of these questions but I want to talk through them as a group.
00:20:57
◼
►
So first off, would Apple remove Touch ID? So removing Touch ID could potentially cut
00:21:04
◼
►
support for Apple Pay, right? If there's no authentication method. So let's just assume
00:21:08
◼
►
that there's no Touch ID. Apple's not going to just get rid of this method, right? There's
00:21:12
◼
►
going to be some kind of authentication because otherwise where does Touch ID go?
00:21:16
◼
►
Oh sorry, where does Apple Pay go, right? Like they're not gonna remove that.
00:21:21
◼
►
Just a blood sample. You just prick your finger on the phone every single time you buy
00:21:25
◼
►
something. It's a real Pavlovian response to buying new things.
00:21:31
◼
►
I don't see them removing it unless there's something equally secure
00:21:39
◼
►
and easy to use and we're gonna get into some things that could do that potentially.
00:21:44
◼
►
But they're not going, I don't see them getting rid of it without a really strong case to
00:21:50
◼
►
replace it with something else.
00:21:52
◼
►
And I can hear you now, Steven, the headphone jack!
00:21:55
◼
►
They removed the headphone jack and I don't think, I still don't think they've given enough
00:21:59
◼
►
of a reason why but that's a different type of feature.
00:22:04
◼
►
I will say on that note though, they did remove it but they gave me AirPods and AirPods are
00:22:08
◼
►
amazing and I know they could have given me AirPods anyway, I know they could have done it anyway, but
00:22:13
◼
►
you know, there is a better thing. I think that sidesteps the actual argument: Touch ID, like
00:22:19
◼
►
you said, is key to not only Apple Pay but, like, device security. And the headphone jack, like, I
00:22:26
◼
►
was going to write off that complaint, because the headphone jack is, like, kind of a pain
00:22:30
◼
►
to live without, but it's not device security, it's not payment security. And can they go to
00:22:38
◼
►
all their banking partners and deal with this in a way that keeps
00:22:44
◼
►
everybody happy, or is Touch ID, like, the linchpin in all of it? I just, I don't, I
00:22:49
◼
►
don't know if any of us on the outside know how important Touch ID is to some
00:22:52
◼
►
of those deals but Apple is not going to do something that is less secure
00:22:58
◼
►
right that goes against everything they've done like the last five years if
00:23:02
◼
►
not longer with this phone. So I just I don't buy this unless the replacement is
00:23:08
◼
►
equally good to use and equally secure. All right I have many many thoughts here.
00:23:14
◼
►
Here we go, settle in everyone. Let me go with an opening statement which is the
00:23:18
◼
►
idea of just getting rid of Touch ID because you cannot figure out a way to
00:23:23
◼
►
make it work with the iPhone Pro is the equivalent of getting a new car when you
00:23:27
◼
►
blow a tire. Like, well, we couldn't get that to work, so... I think it really matters why you
00:23:31
◼
►
get rid of it. It is a real scorched earth situation, right? We couldn't work it out, so whatever.
00:23:37
◼
►
Yeah, exactly. So that said, we can all agree on the fact that there's no way that Apple is going back to a world where it's just a passcode.
00:23:46
◼
►
There has to be a second, possibly biometric, authentication method providing an extra layer of encryption and security on the iPhone.
00:23:57
◼
►
So far, that has been Touch ID, and Touch ID powers a lot of services and features on the iPhone, from
00:24:03
◼
►
encrypting backups and the lock screen, to authentication with the App Store, Apple Pay, and
00:24:09
◼
►
the API for developers. There's a whole API to use Touch ID in third-party apps.
00:24:16
◼
►
And Touch ID is really everywhere. And the thing that sells Touch ID, it's not that it's a fingerprint.
00:24:23
◼
►
Yes, it's a fingerprint reader, but it's the fact that it's so fast, accessible and easy to use.
00:24:30
◼
►
And also the fact that you don't have to look at your phone while you authenticate with Touch ID.
00:24:36
◼
►
Anyone who's ever taken the tube in London, Myke, you know this.
00:24:39
◼
►
You're grocery shopping and you're in a hurry, you just place your finger on the home button
00:24:44
◼
►
and you feel the tap and you hear the sound effect and you're done.
00:24:47
◼
►
You know that you've authenticated, it doesn't get in the way, and it's an unassuming
00:24:53
◼
►
interaction. It's not flashy. We're at the point where most people are getting used to
00:24:59
◼
►
Apple Pay, most people are getting used to Touch ID. It's a natural way of authenticating.
00:25:04
◼
►
So if Apple wants to, in my mind, if Apple wants to provide an alternative to Touch ID,
00:25:09
◼
►
it has to be as quick and easy to use and accessible, because remember, one of the benefits of
00:25:16
◼
►
Touch ID is that it's in a fixed position on the device, even if you have
00:25:21
◼
►
motor impairments or visual impairments, you can just feel Touch ID
00:25:26
◼
►
and you can use it. Anyone can use it, unless you don't have fingers or you've
00:25:31
◼
►
burned off your fingerprints, but that's another problem. So a faster solution to
00:25:36
◼
►
Touch ID, let's assume that it's face recognition. The problem with
00:25:42
◼
►
face recognition is not that most implementations so far have sucked,
00:25:46
◼
►
which is, you know, we can all agree on that. They're not perfect. They're not great.
00:25:49
◼
►
And it wouldn't surprise me if Apple comes out with a much better, faster, more powerful solution, and we're all surprised.
00:25:56
◼
►
Well, Apple did it again. You know, they reinvented face recognition.
00:25:59
◼
►
I could see that actually. And I could see how in iOS 11, you know, with the machine learning stuff, if you look at the Vision APIs,
00:26:06
◼
►
Apple has much improved the system that they provide to third-party developers. You know, the system can recognize faces.
00:26:13
◼
►
Even partially occluded faces, people with sunglasses, with hats, in profile.
00:26:18
◼
►
The API itself is more powerful.
00:26:22
◼
►
So you can only imagine if the API is even more powerful, imagine the stuff that Apple keeps to themselves.
00:26:27
◼
►
So it's totally possible that Apple can pull this off, especially with the 3D mapping stuff with the depth perception.
00:26:34
◼
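As a point of reference for the Vision APIs Federico mentions, this is roughly what the developer-facing face detection call looks like in iOS 11. It is only a sketch of the public Vision framework with a placeholder image; whatever Apple might use internally for authentication is not public.

    import UIKit
    import Vision

    // Minimal sketch: ask Vision for face rectangles in a UIImage.
    func detectFaces(in image: UIImage) {
        guard let cgImage = image.cgImage else { return }

        // The request finds faces even when they're partially occluded,
        // wearing hats or sunglasses, or seen in profile.
        let request = VNDetectFaceRectanglesRequest { request, error in
            guard let faces = request.results as? [VNFaceObservation] else { return }
            for face in faces {
                // boundingBox is in normalized coordinates (0...1).
                print("Found a face at \(face.boundingBox)")
            }
        }

        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        do {
            try handler.perform([request])
        } catch {
            print("Face detection failed: \(error)")
        }
    }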
►
My problem is in the physical
00:26:37
◼
►
implementation of face recognition.
00:26:40
◼
►
So there's also a report from Gurman on Bloomberg, and he says that Apple has figured out a way to make this work
00:26:47
◼
►
even when an iPhone is sitting flat on a table.
00:26:49
◼
►
So, you know, I have to assume even if you place your iPhone next to a scanner, next to a
00:26:57
◼
►
point-of-sale system, somehow the camera is gonna look right back at you and authenticate you. And I struggle to imagine how
00:27:04
◼
►
that could be possible.
00:27:07
◼
►
Unless there's a multi-camera array
00:27:10
◼
►
along the top edge and the bottom edge of the phone, I struggle to imagine a way to make this work without holding up an iPhone and basically taking a selfie.
00:27:22
◼
►
And if we reach the point where we imagine an iPhone Pro and a bunch of people in the London Tube or at my local supermarket, and everyone before authenticating has to basically take a selfie, even if it takes a second,
00:27:35
◼
►
It's not, you know, the operation itself can take even less than a second.
00:27:39
◼
►
It's the gesture of pulling out your phone, looking at your face, basically taking a selfie.
00:27:44
◼
►
Now everybody's looking at you and you end up with a whole system that takes at least three seconds
00:27:49
◼
►
and it's much slower than Touch ID. And this is where I come down on this rumor right now.
00:27:55
◼
►
It's hard for me to imagine a system that is faster than Touch ID. Not more secure,
00:28:01
◼
►
because I can believe that Apple maybe has a solution that, you know, can identify
00:28:06
◼
►
2D pictures, so it won't authenticate you, it can identify, you know, even the finest
00:28:13
◼
►
details of your skin and your eyebrows and your retina, whatever.
00:28:17
◼
►
I totally believe that it's gonna be possible somehow.
00:28:21
◼
►
But it's the different physical behavior of using a face scanner and it's the accessibility
00:28:28
◼
►
in low-light conditions, and there's a few people that say, "Well, Apple is going to use an IR scanner,
00:28:34
◼
►
so it can work even if it's dark." All right, sure. What happens if I'm a blind person and I
00:28:41
◼
►
cannot look at the screen? I don't know if I'm pointing it straight at my face or if I'm holding
00:28:47
◼
►
the iPhone at a weird angle and it doesn't authenticate me. Whereas with Touch ID before,
00:28:52
◼
►
I could just feel the button and place my finger.
00:28:55
◼
►
It's such a different way of authenticating,
00:28:59
◼
►
and it's such a different way of holding the iPhone
00:29:01
◼
►
and saying it's me,
00:29:05
◼
►
that combining that with the even higher emphasis
00:29:10
◼
►
on Apple Pay in iOS 11,
00:29:12
◼
►
from Touch ID with Apple Pay in iMessage,
00:29:15
◼
►
the App Store dialog when you purchase apps in iOS 11,
00:29:19
◼
►
it looks like an Apple Pay sheet,
00:29:20
◼
►
there's a fingerprint icon right front and center on that dialog. It's so strange to
00:29:26
◼
►
me that Apple would get rid of Touch ID right now because they cannot figure out how to
00:29:31
◼
►
make it work with the iPhone Pro and that the solution to that is a face scanner. I
00:29:38
◼
►
just don't know what to think, honestly.
00:29:42
◼
►
Mark Gurman follows up with a report. Some quotes from Mark Gurman's report.
00:29:46
◼
►
Apple is testing an improved security system that allows users to log in, authenticate
00:29:50
◼
►
payments and launch secure apps by scanning their face, according to people familiar with
00:29:54
◼
►
the product. It can scan a user's face and unlock the iPhone within a few hundred milliseconds,
00:29:59
◼
►
the person said. It is designed to work even if the device is laying flat on a table, rather
00:30:05
◼
►
than just close up to the face. However, the intent is for it to replace the Touch ID fingerprint
00:30:10
◼
►
scanner. In testing, the face unlock feature takes in more data points than a fingerprint
00:30:15
◼
►
making it more secure than the Touch ID system,
00:30:18
◼
►
the person said.
00:30:19
◼
►
The feature is still being tested
00:30:20
◼
►
and may not appear with the new device.
00:30:22
◼
►
That last line, I put it in there.
00:30:23
◼
►
I think that this is one of those things
00:30:25
◼
►
that Bloomberg makes Gurman put in, personally.
00:30:28
◼
►
I think that they're just like,
00:30:29
◼
►
you gotta cover yourself.
00:30:30
◼
►
- The safety net.
00:30:31
◼
►
- Yeah, and I think that they're making,
00:30:33
◼
►
putting weird stuff like that in there
00:30:34
◼
►
as well as, like, the "as the person said."
00:30:37
◼
►
So I'm just gonna, I'm just gonna do this, you know.
00:30:39
◼
►
Remember, year of optimism and all that,
00:30:41
◼
►
that I'm going for, it's my plan here.
00:30:44
◼
►
So I remember that fingerprint scanners were terrible and that the idea of
00:30:48
◼
►
Apple putting a fingerprint scanner into the iPhone seemed like a stupid one
00:30:52
◼
►
because fingerprint scanners were terrible, right?
00:30:55
◼
►
Like every fingerprint scanner I'd ever used on like a ThinkPad just,
00:30:58
◼
►
just didn't work, right? Like they just flat out didn't work. Touch ID works.
00:31:02
◼
►
It's flawless effectively, right? With how, how successful it is.
00:31:05
◼
►
So let's assume that what Gurman's saying is true, right?
00:31:11
◼
►
That all of this stuff is correct.
00:31:13
◼
►
This sounds fine to me. This sounds good.
00:31:16
◼
►
Like, speed, okay, so let's just take the argument about speed.
00:31:21
◼
►
I understand that as a thing about why speed in these situations is good,
00:31:26
◼
►
but let's say it takes a little bit longer than Touch ID.
00:31:32
◼
►
What are the other benefits, right?
00:31:34
◼
►
Like, speed isn't the only thing here, and there may be something that's nicer,
00:31:42
◼
►
there may be something that's just cooler about it, right?
00:31:44
◼
►
That all you do is just pick up your phone.
00:31:46
◼
►
The action that we all have for picking up the phone,
00:31:48
◼
►
the raise to wake.
00:31:49
◼
►
As soon as we do that now, our phones are unlocked.
00:31:51
◼
►
Like we don't need to worry about tapping anything
00:31:54
◼
►
or clicking anything.
00:31:56
◼
►
We don't need to have a home button anymore.
00:31:58
◼
►
So not having a home button means that our screens
00:32:01
◼
►
are even bigger than before, right?
00:32:03
◼
►
So that is a big benefit that we get.
00:32:05
◼
►
Maybe we don't need buttons at all on the next phone.
00:32:08
◼
►
It's just this beautiful thing
00:32:10
◼
►
because it's all about using this face detection technology.
00:32:13
◼
►
Like in trying to have faith in this, I can maybe see that there could be some
00:32:18
◼
►
very interesting stuff in here, and, like, stuff which I agree with.
00:32:22
◼
►
How do you do buying of stuff when you're already looking at the device?
00:32:28
◼
►
Like, there is an implementation detail in there that's not for us to decide, but if we assume
00:32:34
◼
►
that they work it out in as nice a way as they've worked out putting my thumb on the
00:32:41
◼
►
screen, then it'll be fine. Let's say for example it's kind of playful. Blink! Or like,
00:32:47
◼
►
you know, like there could be something in it that is interesting. Maybe you pick your
00:32:51
◼
►
own facial expression that helps you buy stuff.
00:32:55
◼
►
So that's the password: show an emotion.
00:32:58
◼
►
So I think I agree that the pure idea of Touch ID going away is bonkers but if you think
00:33:05
◼
►
it just becomes Face ID and we use our faces instead of our fingerprint scanners and we
00:33:09
◼
►
get all of the functionality we had before and just leave it up to them to work out the
00:33:12
◼
►
details, the benefits that we could potentially get just from a device perspective could be
00:33:19
◼
►
great. And I agree with you, Federico, that on paper it maybe sounds harder for accessibility
00:33:24
◼
►
but I also have faith that Apple's got that part covered because that's something that
00:33:27
◼
►
they care about so much right like I will assume that whatever they do it
00:33:32
◼
►
will not become harder for people that have accessibility issues to use it may
00:33:37
◼
►
expose some new ones but everything does right like everything they add may solve
00:33:41
◼
►
something for some people make it harder for others but I'm sure that they will
00:33:46
◼
►
try their utmost to make sure that this feature includes as many people as
00:33:51
◼
►
possible, and then make an epic amount of options to allow for people that now
00:33:57
◼
►
are struggling to be able to access their device in a different way, right?
00:34:01
◼
►
Like I'm confident that they will find ways to tackle that part because maybe
00:34:06
◼
►
more than any other technology company, and you mentioned this,
00:34:09
◼
►
Apple cares about this stuff more than anybody else, right? I'm really torn on
00:34:13
◼
►
this and I totally understand your point and I'm gonna play devil's advocate
00:34:17
◼
►
and let's think that it's happening and it's real. So I think you're totally
00:34:21
◼
►
right when you say even if it's slightly slower, not because the
00:34:26
◼
►
sensor takes milliseconds, but because there's humans holding phones and humans don't just take
00:34:32
◼
►
milliseconds to operate. But even if it's slower and it takes two or three seconds, if it's more secure
00:34:39
◼
►
then I totally buy the argument. Apple could say it's, I mean they won't say it's slower, but
00:34:44
◼
►
it's many times more secure and unique than Touch ID, so we believe this is the right technology
00:34:51
◼
►
to use going forward. And that I could buy. And I could also say, you know, there could
00:34:57
◼
►
be an elegant way to make this work for the thousands of apps that already implement Touch
00:35:04
◼
►
ID and maybe the same API used for Touch ID authentication could automatically fall back
00:35:08
◼
►
onto Face ID when you're using an iPhone 8. So right at launch, 1Password, without doing
00:35:14
◼
►
any work, for example, could support Face ID with the same Touch ID dialogue.
00:35:18
◼
►
It's just the same call, like it's just the same API call, and Apple just throws up the
00:35:21
◼
►
new UI that they've built.
00:35:23
◼
►
Like that's how that will work.
00:35:25
◼
►
So that could work.
00:35:27
◼
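To make the "same API" point concrete, here is a sketch of the existing LocalAuthentication call that Touch ID-enabled apps already make. Note that the policy is named "device owner authentication with biometrics" rather than anything fingerprint-specific; the idea that the system could simply present a different sheet on a face-scanning phone is still speculation in this episode, and the vault-unlocking function below is hypothetical.

    import Foundation
    import LocalAuthentication

    // The kind of biometric check Touch ID-enabled apps already perform.
    // The policy asks for "biometrics" generically, not a fingerprint.
    func unlockVault(completion: @escaping (Bool) -> Void) {
        let context = LAContext()
        var error: NSError?

        guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                        error: &error) else {
            // No biometrics enrolled or available; fall back to a passcode flow.
            completion(false)
            return
        }

        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Unlock your vault") { success, _ in
            DispatchQueue.main.async {
                completion(success)
            }
        }
    }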
►
You know, I think part of me, the problem that I have right now is that I'm feeling
00:35:33
◼
►
the same way that older people felt about the headphone jack going away, because
00:35:41
◼
►
Touch ID is a feature of our generation.
00:35:46
◼
►
And to think about it going away already, it makes me feel uneasy right now.
00:35:52
◼
►
And it makes me come up with all kinds of questions and reasons why it would be a bad idea.
00:35:59
◼
►
But there's also the argument that Touch ID doesn't work for everyone.
00:36:05
◼
►
If you have Face ID, it doesn't matter if you're sweaty or you just took a shower,
00:36:11
◼
►
you know, because the camera doesn't touch your skin, you're just looking at your face
00:36:14
◼
►
so you can always authenticate, whereas Touch ID has those kinds of problems.
00:36:18
◼
►
And Face ID would work even if you're wearing gloves, you know, or hats or glasses.
00:36:24
◼
►
Once again, because it doesn't matter whether your skin is in direct contact with the sensor.
00:36:28
◼
►
So I could see all the arguments for that.
00:36:31
◼
►
It's just that kind of feature, that I was there when it launched and it's become
00:36:36
◼
►
so entrenched in the way that we use iOS that it makes me skeptical to think of a
00:36:42
◼
►
replacement not even four, five years after the first Touch ID sensor. Again I
00:36:50
◼
►
should say if, and that's a big if, it's really safer and more secure than Touch
00:36:58
◼
►
ID. If this Face ID can provide more unique data points about whether it's really me.
00:37:05
◼
►
And by the way, the argument that, you know, what happens if you change your hairstyle?
00:37:08
◼
►
That's really not relevant because right now I'm looking at my photos app
00:37:12
◼
►
and the photos app with no training can recognize my face when I was 16,
00:37:18
◼
►
when I was bald because I was doing chemo, and when I had a beard when I did not have a beard.
00:37:23
◼
►
So it doesn't really matter.
00:37:25
◼
►
I think the things that it's looking for, like it's not looking for is your
00:37:29
◼
►
hair spiky, right? Like it's looking for the distance between your eyes, you know?
00:37:32
◼
►
Like it's, you know.
00:37:33
◼
►
Yeah. I can tell you myself it doesn't really matter because I've had all kinds of hairstyles.
00:37:38
◼
►
So that's not a problem.
00:37:39
◼
►
My problem is what happens in busy situations where there's a lot of people behind you.
00:37:44
◼
►
So that's, you know, I could think about that and maybe say, well, that's where the 3D
00:37:48
◼
►
stuff comes in because you have a 3D map so you can block out people in the background, for
00:37:52
◼
►
example, people waiting in line at the cash register or people waiting in line in the
00:37:55
◼
►
tube and it could identify you or it could use some kind of proximity with the Apple
00:38:00
◼
►
Watch. I don't know, but I'm sure that Apple can find a way to say there's 50
00:38:05
◼
►
people here, but the person that is trying to authenticate with Face ID is this user,
00:38:10
◼
►
you know, X centimeters away from the camera.
00:38:12
◼
►
So that I could totally believe.
00:38:14
◼
►
My main question is, how does it work in the physical space?
00:38:19
◼
►
What am I supposed to do to make this work?
00:38:22
◼
►
because Touch ID is a known quantity. I just have to pick up my phone. I don't
00:38:27
◼
►
have to look at it. I place my finger and it works. Face ID will require some kind
00:38:31
◼
►
of selfie-like authentication process, which I'm not sure about, but I guess
00:38:34
◼
►
we'll see what happens. I'm trying to be optimistic, but it's my second nature to,
00:38:39
◼
►
you know, find all the possible problems and consequences and how things could go
00:38:43
◼
►
badly. Here's the thing, right, like I've been thinking about this in the idea of
00:38:47
◼
►
Fraser Speirs' fantastic article that's been shared a few times this week, where
00:38:51
◼
►
he, like, does a review of a MacBook Pro as an iPad user, like flips it on its head.
00:38:58
◼
►
Like can the MacBook Pro replace your iPad right? Like he did this it was like in 2015
00:39:04
◼
►
but it's still relevant now because for some reason this has become a problem again and
00:39:08
◼
►
Let's imagine that we already have face detection and we're gonna move to fingerprint detection
00:39:14
◼
►
Right, and it's like oh I have to put like specific fingers on my phone
00:39:18
◼
►
Like all I have to do right now is pick it up and it unlocks
00:39:21
◼
►
What if my girlfriend wants to use my phone?
00:39:24
◼
►
She now has to like
00:39:25
◼
►
She has to pass it to me and I have to unlock it when she could just point it at my face and unlock it
00:39:29
◼
►
Like that's so much easier, right?
00:39:31
◼
►
Like there are just these arguments that I can imagine if this thing is great about like why it could be better
00:39:37
◼
►
and then I think it's fun to turn it on its head like I'm optimistic about this because I
00:39:42
◼
►
think that in removing the home button completely, and therefore Touch ID going with it,
00:39:48
◼
►
there are a lot of advancements in the hardware that we could get and then it's
00:39:52
◼
►
up to Apple to make Face ID better than anything that's come before it, and
00:39:58
◼
►
better than Touch ID. And I believe that they would not remove Touch ID from the
00:40:03
◼
►
next phone, the phone that's going to be the most expensive, most premium iPhone
00:40:06
◼
►
that's ever made, if it's worse. I just can't see that being a thing, because
00:40:10
◼
►
there's no way they got to a point where they were like, oh, this won't work, oh no,
00:40:15
◼
►
we have nothing, we have no plan B. Like, there was always a plan B then. Like,
00:40:21
◼
►
that's silly to me. And my hope is that Face ID was plan A, Touch ID was plan B,
00:40:27
◼
►
and they just couldn't get it to work. So, you know, I'm calling it now: there's
00:40:32
◼
►
going to be people, if Face ID happens, there's going to be people who 3D print
00:40:36
◼
►
heads, like models, to test whether it works or not. And I bet that
00:40:43
◼
►
whatever system Apple comes up with, it's gonna have some of, you know, some clever
00:40:48
◼
►
workarounds for that. Like it detects, like, those minimal movements of your eyes or, you
00:40:54
◼
►
know, of your skin, like, sub-skin elements even.
00:40:57
◼
►
What about that?
00:40:58
◼
►
You know, it can recognize a human versus, like, a mannequin holding an iPhone. So that's
00:41:05
◼
►
gonna be super fascinating if it happens.
00:41:07
◼
►
It does potentially open up an opportunity on the Mac as well.
00:41:13
◼
►
If Touch ID is going to be married to the Touch Bar on the Mac and the Touch Bar is dead in the water
00:41:19
◼
►
or difficult to implement on an external Bluetooth keyboard,
00:41:22
◼
►
I would imagine that they could put whatever this face sensing deal is.
00:41:27
◼
►
And we haven't really gotten into that. Surely it's more than just the camera.
00:41:30
◼
►
But that could come to the Mac and be really interesting
00:41:35
◼
►
if Touch ID is indeed getting replaced with some sort of facial thing.
00:41:39
◼
►
You could add that to an iMac and it doesn't matter what keyboard you have.
00:41:42
◼
►
Because every Mac already has a camera on the front of it anyway, right?
00:41:46
◼
►
Right. And I truly believe that this would be something more than just the camera,
00:41:50
◼
►
that it's using IR or something else. But you already have
00:41:54
◼
►
some infrastructure there. You could put the Secure Enclave in there somewhere and you could,
00:41:58
◼
►
you know every Mac has a bezel around the screen and there's already a camera
00:42:02
◼
►
up there just put something else up there too. So that's if this is the case
00:42:07
◼
►
that Touch ID is going to be kind of pushed aside for some sort of facial
00:42:12
◼
►
thing I would be very interested to see where that expands in the future to
00:42:16
◼
►
other products besides just the iPhone. All right this episode is also brought
00:42:21
◼
►
to you by our friends over at Hover. When you have a great idea for your next
00:42:41
◼
►
by the way, you want to have a great email address. There is a bunch of different reasons
00:42:45
◼
►
to get a great domain and finding that perfect domain is so easy with Hover. They have over
00:42:52
◼
►
400 domain extensions that you can end your domain with, all of the classics that you're
00:42:56
◼
►
used to and all the crazy ones as well. And once you get your domain, you can use it to
00:43:00
◼
►
get an on-brand or maybe just a more professional email address. It is nice to have an email
00:43:06
◼
►
address that has a personalised domain rather than like Hotmail or something. And all of
00:43:10
◼
►
this stuff can be super easy to set up with Hover, and any email address that you create
00:43:16
◼
►
will work with whatever email programs you're already using and they'll be able to help
00:43:20
◼
►
you with that, because they have fantastic features, like their support team. Right, their
00:43:23
◼
►
support team is always there for you. You can give them a call, they have an actual person
00:43:28
◼
►
that's going to pick up the phone. They'll answer whatever questions you might have. There
00:43:31
◼
►
are no annoying phone trees or being transferred to another department for you to deal with.
00:43:36
◼
►
They also have Hover Connect, which lets you set up your domain in just a few clicks with
00:43:59
◼
►
Thank you so much to Hover for their support. Hover, domain names for your ideas. Thank
00:44:03
◼
►
you so much to Hover for supporting this show and Relay FM.
00:44:07
◼
►
Steven, we were talking about the reviews of the Echo Show last week and since that
00:44:13
◼
►
episode, yours was received. You now have an Echo Show at home and I was saying on the
00:44:21
◼
►
episode that I'm reserving most of my judgement until I hear what you have to say. So what's the verdict?
00:44:28
◼
►
That's a real vote of confidence.
00:44:31
◼
►
- I believe in you.
00:44:33
◼
►
- So I would, I would,
00:44:35
◼
►
I'm going to categorize my comments
00:44:38
◼
►
in sort of three buckets,
00:44:41
◼
►
hardware, software, and potential.
00:44:44
◼
►
So no doubt people have seen photos of this thing.
00:44:48
◼
►
It is not the prettiest thing in my household, unfortunately.
00:44:51
◼
►
The screen is bright.
00:44:54
◼
►
I actually had a video call on it with Jason Snell,
00:44:58
◼
►
and we both sort of commented like
00:45:00
◼
►
how deep the screen looks recessed,
00:45:02
◼
►
like it's not laminated to the cover,
00:45:03
◼
►
like you're used to on an iPad or an iPhone.
00:45:05
◼
►
But it's really bright, it's really clear.
00:45:07
◼
►
The speakers are incredible,
00:45:08
◼
►
way better than the regular Echo.
00:45:11
◼
►
Definitely better than the Dot.
00:45:13
◼
►
I mean, I've got a Dot here on my desk,
00:45:14
◼
►
and the speaker is just fine for the voice,
00:45:18
◼
►
but that's about it.
00:45:19
◼
►
Interestingly, I noticed during the boot-up sequence
00:45:23
◼
►
a little flash of the Intel inside logo which I did not anticipate. I figured these things
00:45:28
◼
►
would have some sort of ARM processor in them.
00:45:30
◼
►
How peculiar to see that!
00:45:32
◼
►
I don't know what's in there. I tried to do some digging and there's not a lot of information
00:45:36
◼
►
on the web about what Intel processors are in here. Maybe that's where all the Xeons
00:45:40
◼
►
went that Apple didn't use.
00:45:43
◼
►
Does it have a Pentium 3 sticker also in the back?
00:45:46
◼
►
That'd be awesome. That'd be incredible. So it's got Intel inside.
00:45:52
◼
►
The microphone array and everything seems to work just as well as the regular echo.
00:45:59
◼
►
It is more directional in a way, you know, the echo and the dot being cylinders.
00:46:05
◼
►
The sound comes out of them the same in every direction.
00:46:08
◼
►
So you could put it kind of anywhere in the room.
00:46:11
◼
►
The speakers on the show fire forward, which makes sense because you have a screen, right?
00:46:15
◼
►
You're not going to point the screen at the wall.
00:46:18
◼
►
But it's something to be aware of as you set this thing up that you may need to turn it
00:46:22
◼
►
or have it slightly different if you're just dropping in where a regular Echo was.
00:46:28
◼
►
And one thing that Amazon's done that I think is incredible, and I think like Microsoft
00:46:32
◼
►
did this with the Xbox and now the Xbox One X, the power plug is the same, and it's in
00:46:40
◼
►
the same place.
00:46:41
◼
►
And so you can unplug your Echo and plug your Echo Show in with the same power brick and
00:46:48
◼
►
You don't have to change anything out.
00:46:49
◼
►
you know stuff on the kitchen counter or an entertainment center somewhere you're
00:46:52
◼
►
not digging around you can just unplug it and plug the same little barrel
00:46:56
◼
►
connector in which I love those those little details that hardware makers do
00:47:00
◼
►
and Amazon gets a lot of that stuff right honestly. So the hardware is not
00:47:06
◼
►
much to look at, but what is there is nice. It does look kind of like a little
00:47:09
◼
►
old TV set, but it's got a little more bulk than I had anticipated. I
00:47:16
◼
►
will find it for the show notes. I put a picture on Instagram of it and the
00:47:19
◼
►
Molar Mac, which is like the super ugly Mac Apple made in the 90s, and they kind
00:47:24
◼
►
of look the same from a certain angle and so that's that's not great but it is
00:47:29
◼
►
what it is, I guess. The touchscreen itself, kind of moving from hardware to
00:47:36
◼
►
software enables a lot of nice stuff and where I was immediately impressed with
00:47:41
◼
►
it, this "oh, this is so much better than the regular Echo experience" moment, was
00:47:44
◼
►
connecting to Wi-Fi. So Amazon does have this thing, if you have an account set up
00:47:48
◼
►
with an echo that you can have Amazon hold on to your wireless password. I
00:47:53
◼
►
don't have that enabled on my account for hopefully obvious reasons so I had
00:47:58
◼
►
to type in my Wi-Fi password and instead of like getting the echo app out, I
00:48:03
◼
►
actually pulled my phone out while the thing was booting up thinking I'm gonna
00:48:06
◼
►
need the phone to configure this and you don't because it has a screen you can
00:48:10
◼
►
just you know type in your Wi-Fi password. It was kind of great if you're
00:48:15
◼
►
used to the normal echo interface. The first thing, one of the first things I
00:48:21
◼
►
tried of course was the video calling. There's no interface to this and so I
00:48:27
◼
►
just said, friend in a tube, video call Jason Snell, and it did it. Like I just
00:48:35
◼
►
tried the first thing that came to mind and it worked and I was impressed that
00:48:38
◼
►
that it understood that I got the vocabulary right
00:48:40
◼
►
by basically guessing.
00:48:41
◼
►
The call was fine.
00:48:44
◼
►
It did freeze at first, but then it was okay.
00:48:47
◼
►
And I'm willing to like, this was very early,
00:48:49
◼
►
like we got ours on like the first or second day
00:48:50
◼
►
they were out.
00:48:51
◼
►
I haven't had that problem on subsequent calls,
00:48:54
◼
►
so maybe it was just a hiccup.
00:48:56
◼
►
But Jason Snell had lifted his arms in the air
00:48:58
◼
►
and then he was frozen that way for a couple minutes,
00:49:00
◼
►
which was pretty funny.
00:49:01
◼
►
- Wish you got a picture of that.
00:49:03
◼
►
- It was pretty, I should have, yeah, I should have.
00:49:05
◼
►
He was just like standing with his arms
00:49:06
◼
►
raised high in victory.
00:49:08
◼
►
But the call quality and stuff is fine,
00:49:11
◼
►
which you expect from FaceTime or Skype on a good day.
00:49:14
◼
►
The software benefits from all the work Amazon does
00:49:20
◼
►
in the background, so it knew who Jason Snell was
00:49:24
◼
►
because I had already given them all my contacts, sorry.
00:49:29
◼
►
- Oh man. - Part of the deal.
00:49:31
◼
►
Yeah, if Amazon didn't know your email address, real sorry,
00:49:35
◼
►
but it is what it is.
00:49:37
◼
►
but it knew who he was because it already had access to that.
00:49:41
◼
►
Likewise, I didn't have to go in and set up Spotify
00:49:44
◼
►
because my Echo system already knows
00:49:48
◼
►
about my Spotify account.
00:49:49
◼
►
So I could just tell it, friend in a tube,
00:49:51
◼
►
play polka music and it knew what to do
00:49:53
◼
►
because it was already hooked up.
00:49:54
◼
►
- Polka, polka, polka.
00:49:57
◼
►
- That's right.
00:49:57
◼
►
Amazon does a really good job with that stuff.
00:49:59
◼
►
Kind of how Apple does with iCloud,
00:50:01
◼
►
if you're signed in to iCloud on your account,
00:50:03
◼
►
certain things take place.
00:50:04
◼
►
It's the same with all this stuff.
00:50:07
◼
►
But unfortunately it's not all puppies and rainbows
00:50:11
◼
►
when it comes to the software on this thing.
00:50:14
◼
►
And I will preface my complaints by saying two things.
00:50:19
◼
►
A, I know it's early days,
00:50:21
◼
►
and Amazon generally does a pretty decent job
00:50:25
◼
►
at updating things,
00:50:27
◼
►
and there are some things on the Echo originally
00:50:31
◼
►
that they have resolved,
00:50:32
◼
►
things on the Kindles have gotten better over time.
00:50:35
◼
►
So I understand that my complaints could be resolved
00:50:37
◼
►
with future software updates, and indeed I hope they are.
00:50:41
◼
►
Two, there is an inherent weirdness
00:50:43
◼
►
in reviewing something over such a short period of time.
00:50:48
◼
►
And Dan Moren actually was giving me a hard time about this,
00:50:51
◼
►
about trying to make a decision on this product so early
00:50:54
◼
►
because it is so different than the other Echo.
00:50:56
◼
►
I appreciate that thought, but I also disagree
00:51:00
◼
►
with it a little bit, in that there are some things
00:51:02
◼
►
that you're just gonna know if you like or not.
00:51:04
◼
►
and the Echo Show is one of those things for me.
00:51:08
◼
►
- Like the Logitech Slim Combo.
00:51:11
◼
►
- Like the Logitech Slim Combo.
00:51:12
◼
►
- Yeah, exactly.
00:51:14
◼
►
It doesn't take long to realize
00:51:15
◼
►
that thing is a piece of garbage.
00:51:17
◼
►
So a couple things in the software that I really don't like.
00:51:21
◼
►
First, every screen has a hint.
00:51:25
◼
►
So if you are currently playing your polka music,
00:51:30
◼
►
it's got the album artwork, it's like an iPod, right?
00:51:33
◼
►
It's got all your music information on it. Then there's a little line, and it
00:51:37
◼
►
says, try, you know, "[wake word], show me my photo albums." It's like, I don't need a
00:51:45
◼
►
hint when I'm listening to music to try something else. Also, lol, no one uses
00:51:51
◼
►
Amazon's photo service. But it's kind of like advertising, in a way, for a service
00:51:57
◼
►
that no one's using, but also, like, explaining to you what else your device can
00:52:01
◼
►
do. And almost every single screen on the Echo Show has this little bottom third
00:52:07
◼
►
of saying "hey try this other voice command" and I appreciate the the
00:52:13
◼
►
helpfulness, that they're trying to surface new commands
00:52:16
◼
►
because there's no interface to this thing, right? I just had to say "video call
00:52:19
◼
►
Jason Snell" for it to work, because there's no interface to, like, go
00:52:22
◼
►
and do that thing. But the trade-off is I feel like I'm being annoyed, like
00:52:29
◼
►
by the Tips app on iOS 11, and I could not find a way to turn this off in
00:52:33
◼
►
settings which is a bummer. I wonder if it's something that goes away after a
00:52:37
◼
►
certain period of time. It may, and I have not, like, sat down and mapped out
00:52:42
◼
►
the frequency of it, so I don't know, to be honest with you. I would hope
00:52:46
◼
►
that it would trail off, like, hey, once you've said, you know, X number of
00:52:50
◼
►
commands to this thing, then slowly fade these away. But for now, at least starting
00:52:57
◼
►
off, you know, the first week or so of use, they're there
00:53:02
◼
►
all the time. The other big thing that bothers me about
00:53:07
◼
►
the Echo Show is that it is very attention-hungry. And what I mean by that: the
00:53:15
◼
►
Echo itself, the Echo Dot, and Siri, and, you know, Bixby, I guess, if you have a
00:53:22
◼
►
Samsung phone, the Google Assistant, all of these things are kind
00:53:27
◼
►
of in waiting. And what I mean by that is that they are there for you
00:53:32
◼
►
when you want them to do something or when you need something but they're not
00:53:37
◼
►
reminding you of their presence. So the Echo, if I'm not talking to it, the LEDs
00:53:43
◼
►
are off; it's just sitting on my kitchen counter, and you may know what it is
00:53:48
◼
►
if you come into my house,
00:53:49
◼
►
or you may think it's a weird salt and pepper grinder
00:53:52
◼
►
or something, it's just an object sitting on the counter.
00:53:55
◼
►
Not necessarily the prettiest thing,
00:53:57
◼
►
but it's very inoffensive.
00:54:00
◼
►
If you're not using Siri, it's just locked away
00:54:01
◼
►
behind the home button or the wake word,
00:54:03
◼
►
same with the Google Assistant.
00:54:05
◼
►
But the show is not that way.
00:54:07
◼
►
The show has a screen, and that screen is always on,
00:54:11
◼
►
and you can fall back to a screensaver of just a clock,
00:54:15
◼
►
and if you put the thing in do not disturb mode,
00:54:17
◼
►
it goes to the clock and stays there.
00:54:20
◼
►
But every time you walk by it,
00:54:23
◼
►
or if you're like me and it's in your office,
00:54:25
◼
►
that's only 200 square feet,
00:54:27
◼
►
it detects your motion and so the screen's always awake.
00:54:30
◼
►
And what it's doing is showing,
00:54:32
◼
►
Amazon, I believe they're called home cards,
00:54:34
◼
►
and these cards kinda cycle through on a carousel,
00:54:37
◼
►
and I care about none of them.
00:54:39
◼
►
So you can tell it,
00:54:40
◼
►
"Hey, put my next calendar appointment in there,"
00:54:43
◼
►
if I use Google Calendar,
00:54:45
◼
►
"put these things in here."
00:54:46
◼
►
But it's also a lot of news, a lot of entertainment stuff
00:54:51
◼
►
that honestly I just don't care about.
00:54:53
◼
►
I don't care about,
00:54:55
◼
►
like when it was like some new thing on Instagram
00:55:00
◼
►
that was trending or what Kanye was doing
00:55:02
◼
►
for the Fourth of July, I just don't care.
00:55:04
◼
►
And I don't want that stuff shown to me.
00:55:08
◼
►
I don't want to see it.
00:55:13
◼
►
And there's no way, again, to turn that off.
00:55:15
◼
►
There's no way to tell it, just go to the clock and stay there unless you put it in
00:55:19
◼
►
Do Not Disturb.
00:55:20
◼
►
So it is, again, trying to pull your attention into it when all these other voice
00:55:27
◼
►
assistants are just kind of in waiting.
00:55:29
◼
►
Does that make sense?
00:55:30
◼
►
Like, does that framing make sense to you, Myke?
00:55:32
◼
►
Yeah, and I don't like the sound of it.
00:55:35
◼
►
I don't know if I'd assumed that the screen was always on.
00:55:40
◼
►
I did, but just thinking about it, that is not what I like about my Echo, in that, as you say, I
00:55:46
◼
►
like the idea of "in waiting," right? That's a really good way of putting it. But these things
00:55:51
◼
►
are just, like, they're like Jeeves, and Jeeves is never seen, right? Like, he's just, you know,
00:55:56
◼
►
when you need him he'll come and help you out but like if you don't need him he's not
00:55:59
◼
►
going to be there, like, holding up a big sign saying, like, "Hey, would you like me to get the
00:56:04
◼
►
quaggy, master?" I don't know why I'm doing this, but that's kind of where my mind went to.
00:56:09
◼
►
I don't like the idea of it just being this persistent screen that has stuff showing on
00:56:14
◼
►
it when that stuff a lot of the time feels like it might be like advertisements to use
00:56:19
◼
►
it more, right?
00:56:20
◼
►
Like come and do this with me.
00:56:23
◼
►
Like I don't need you to do that.
00:56:25
◼
►
Like, I appreciate the attempt to teach someone how to use the device a little
00:56:31
◼
►
bit better but do that in context.
00:56:35
◼
►
So if I've asked you a question about photos, give me some other options for it or let me
00:56:40
◼
►
tell you that I need help with something.
00:56:42
◼
►
I don't know, there's something about like, as you say, when you're doing one thing and
00:56:46
◼
►
it's trying to give you a tip about another piece of functionality that's completely unrelated,
00:56:49
◼
►
it feels a little bit redundant, and needy in a way that I'm not keen on for a device.
00:56:58
◼
►
I don't need neediness from my voice assistant devices.
00:57:04
◼
►
Yeah, you know, someone in the chat was asking what I want to see on that home screen.
00:57:10
◼
►
The calendar stuff is nice.
00:57:13
◼
►
But you know, I like the clock.
00:57:15
◼
►
I like to have weather, which I think you can tell it to show there.
00:57:18
◼
►
But I would like that stuff to be more sort of ambient.
00:57:22
◼
►
Like, these tiles don't need to rotate around on a carousel.
00:57:25
◼
►
Like every time the thing moves, it catches my eye.
00:57:28
◼
►
And my wife did not want that in the kitchen, which is why this thing is
00:57:33
◼
►
in the office, and I don't want it here in my office
00:57:37
◼
►
catching my eye every time.
00:57:37
◼
►
Like I have a digital clock in my office
00:57:40
◼
►
and it shows the weather and like some stats
00:57:44
◼
►
like Twitter followers and YouTube subscribers and stuff.
00:57:46
◼
►
It was a Christmas gift; I really like it.
00:57:48
◼
►
It's on the other side of my office.
00:57:50
◼
►
I can't see it from my desk
00:57:52
◼
►
because every time the thing moves, it catches my eye
00:57:56
◼
►
and I, you know, maybe I'm just a squirrel
00:58:00
◼
►
attracted to shiny things but that is really distracting
00:58:02
◼
►
when I'm trying to write or trying to edit.
00:58:04
◼
►
And this thing is like, every time you walk by it,
00:58:06
◼
►
it's got something new to show you.
00:58:08
◼
►
And there may be a place for that in the kitchen, right?
00:58:12
◼
►
Like having a screen that surfaces things.
00:58:15
◼
►
But I, and I think my household,
00:58:16
◼
►
I'll speak for my wife as well,
00:58:18
◼
►
we have come to appreciate the Echo
00:58:19
◼
►
because it's not there unless you start the conversation.
00:58:24
◼
►
And that difference may seem subtle,
00:58:27
◼
►
but I think it's really profound
00:58:28
◼
►
that something is always responsive
00:58:30
◼
►
as opposed to something that is trying to initiate
00:58:34
◼
►
some sort of interaction.
00:58:35
◼
►
- Seems like you're not too hot on this overall.
00:58:38
◼
►
Like I can't imagine you're gonna be keeping this thing.
00:58:41
◼
►
I'm not keeping it.
00:58:42
◼
►
I see. So the third part of the review is potential.
00:58:49
◼
►
If they solve the software issues, that's great.
00:58:52
◼
►
I can see a real potential of things like the,
00:58:55
◼
►
and it's been mocked a lot,
00:58:57
◼
►
And there's been a lot of Apple commentators
00:59:02
◼
►
really down on this feature of being able to,
00:59:04
◼
►
the drop-in, right?
00:59:05
◼
►
So Myke, if I gave you permission,
00:59:08
◼
►
underline all of those words,
00:59:10
◼
►
then you can start a video call
00:59:13
◼
►
and I don't have to approve it.
00:59:14
◼
►
You can just show up.
00:59:16
◼
►
You know what, if you don't want that,
00:59:17
◼
►
don't opt into it.
00:59:19
◼
►
It's not a hard thing.
00:59:19
◼
►
You don't have to write a blog post burning Amazon down.
00:59:21
◼
►
You don't opt into it.
00:59:22
◼
►
But there are times and relationships, I think,
00:59:25
◼
►
that would benefit from that.
00:59:26
◼
►
I think Amazon's done a pretty good job
00:59:27
◼
►
at outlining what those would be.
00:59:29
◼
►
But that stuff is promising.
00:59:33
◼
►
It sounds great.
00:59:34
◼
►
Having a screen is nice.
00:59:35
◼
►
Like having a little news or having weather,
00:59:37
◼
►
that sort of stuff is nice.
00:59:39
◼
►
But I think Amazon has a long way to go
00:59:43
◼
►
in understanding what customers like
00:59:45
◼
►
about the Echo products and how the Echo Show
00:59:49
◼
►
breaks a lot of those things, at least for users like me.
00:59:53
◼
►
And I think the Echo Show is salvageable.
00:59:56
◼
►
Like I don't think it's a dead end product,
00:59:58
◼
►
but I think Amazon sort of missed the mark
01:00:00
◼
►
in understanding what people like about the service.
01:00:03
◼
►
And it's too pushy and it's too in your face.
01:00:06
◼
►
And if they can dial some of that back
01:00:07
◼
►
and the screen can be there when you want it,
01:00:10
◼
►
but not all the time, I think I at least would find it
01:00:13
◼
►
a much more attractive offering.
01:00:14
◼
►
But for now I'd prefer my $200 back or whatever it was.
01:00:18
◼
►
So I'm returning it.
01:00:19
◼
►
- It really sounds to me like what I expected
01:00:22
◼
►
that it adds a bunch of functionality, which is interesting. It's just not the stuff I
01:00:26
◼
►
want to do. And everything that I want to do with that device is already in the one
01:00:31
◼
►
I own. So I'm all good. Yes. Yeah, you know, Jason talked about how he has some security cameras
01:00:40
◼
►
that the Echo Show can show him a live feed from the camera. Like, that sort of stuff is
01:00:44
◼
►
cool, like doing things that you can't do on a regular Echo. But the other stuff, I think...
01:00:49
◼
►
I guess the water's just too muddy now for me.
01:00:52
◼
►
- All right, today's show is also brought to you by Igloo,
01:00:56
◼
►
a digital workplace platform that enhances
01:00:59
◼
►
your corporate culture, improving how you work
01:01:02
◼
►
with your teams and the people inside of your organization.
01:01:05
◼
►
Igloo will help connect your team to three important things,
01:01:09
◼
►
the people inside of your company,
01:01:10
◼
►
the information that you need to share,
01:01:12
◼
►
and the processes that you have to go through.
01:01:14
◼
►
With Igloo, your people have access to what they need
01:01:16
◼
►
using the tools that they already know, with
01:01:18
◼
►
a host of app integrations right inside of their Igloo.
01:01:21
◼
►
So you get to be able to share everything you do in your company.
01:01:25
◼
►
One of my favourite thoughts about this is when I used to work in a big company everybody
01:01:28
◼
►
needed to sign a piece of paper to say that they understood some kind of safety procedure
01:01:32
◼
►
and somebody had to physically walk it around the whole department and get people to sign it.
01:01:36
◼
►
With Igloo you don't need that sort of stuff, because people can just acknowledge that they've
01:01:38
◼
►
seen something.
01:01:39
◼
►
You can send out everything, you can make sure everybody's seen it and then you're all
01:01:42
◼
►
set and ready to go.
01:01:44
◼
►
But you can also make sure that people are comfortable working within Igloo, because they're
01:01:48
◼
►
using the devices that they love to use, they're not being forced to use a certain browser
01:01:52
◼
►
or a certain machine and they're also able to integrate with the applications and services
01:01:57
◼
►
that they're using to get the other pieces of their work done.
01:02:00
◼
►
This is powerful stuff.
01:02:02
◼
►
Igloo can also be customized to represent your brand and culture.
01:02:05
◼
►
It helps define how your company operates.
01:02:09
◼
►
To enhance your processes and your culture inside of your organization there are four
01:02:12
◼
►
things you need to think about: communication, collaboration, knowledge management, and the
01:02:16
◼
►
workflow of your team. This is what you can get started on with Igloo. That is what it's
01:02:21
◼
►
focused on enhancing, building and maybe even fixing. Igloo is a modern intranet designed
01:02:27
◼
►
to keep everyone on the same page. Try Igloo for free with no obligation to continue after
01:02:31
◼
►
your trial. Go to igloosoftware.com/connected now to find out more and sign up. We thank
01:02:36
◼
►
Igloo for their support of this show.
01:02:40
◼
►
So, I believe that Federico is still in the phase of his
01:02:44
◼
►
life in which he is devouring videos of sessions from WWDC?
01:02:50
◼
►
No, not anymore, surprisingly.
01:02:52
◼
►
Oh, congratulations, you're out of it?
01:02:55
◼
►
I have started writing the review.
01:02:59
◼
►
So you're now at that stage which means that you've watched every single second of every
01:03:03
◼
►
single session.
01:03:05
◼
►
I know that's what you've done.
01:03:06
◼
►
Every session has been consumed.
01:03:09
◼
►
No, I think it was all of them.
01:03:10
◼
►
I think you're being modest.
01:03:11
◼
►
Every single session.
01:03:12
◼
►
That's what I heard: you watched them twice, and have done that since WWDC.
01:03:17
◼
►
So, Ticci Teaches Today, which is the segment that I have now given a name, and this is potentially
01:03:22
◼
►
maybe the last one, but at least I have a name for it now.
01:03:25
◼
►
Ticci teaches us today: Core ML.
01:03:29
◼
►
What is Core ML, Federico?
01:03:31
◼
►
So, Core ML. It sounds like caramel, but it's not that. It's not caramel for AI. It's a
01:03:39
◼
►
different thing.
01:03:40
◼
►
I didn't realize that. That's so amazing.
01:03:45
◼
►
CoreML is, I'm afraid to disappoint you, Myke.
01:03:49
◼
►
I can't hear it any other way now.
01:03:51
◼
►
Yeah, I know, caramel. So CoreML, it's not the deep and open framework that you might
01:03:59
◼
►
It's really more of a...
01:04:00
◼
►
If it's not deep and open, who cares?
01:04:02
◼
►
Yeah, well, what do we need it for?
01:04:04
◼
►
Well, I'm sorry, you can end the show.
01:04:06
◼
►
It's more of a set of
01:04:08
◼
►
developer tools for
01:04:10
◼
►
machine learning. It's not a framework
01:04:12
◼
►
to write, to develop
01:04:14
◼
►
actual machine learning
01:04:16
◼
►
stuff. So, here's
01:04:18
◼
►
how it works. Machine learning,
01:04:20
◼
►
at least the way that Apple is doing it, is based on
01:04:22
◼
►
models. And models are
01:04:24
◼
►
think of them as files
01:04:26
◼
►
of code. It's actual code
01:04:28
◼
►
that you download from public sources, in this case for iOS, and there are a
01:04:34
◼
►
variety of models created with other tools such as Keras or Caffe or there's
01:04:41
◼
►
one called scikit-learn, I think. So you download these models, and they are code
01:04:46
◼
►
made by others to describe a series of objects or scenes or items, stuff that
01:04:56
◼
►
you want machine learning to recognize and patterns that you want your
01:05:01
◼
►
software to understand. So the way that Apple has done it, they have this format.
01:05:04
◼
►
So Core ML is largely a format and a set of developer APIs that allows you to
01:05:10
◼
►
take in these already trained models, so this code that has already been taught
01:05:15
◼
►
how to recognize specific elements of everyday life. I'm simplifying here, but just
01:05:20
◼
►
to help you understand. You take these models; Apple has a web page where they have
01:05:23
◼
►
some sample models that already work with Core ML and you make them work with
01:05:29
◼
►
Apple's format. So the Core ML format is an actual file format. You convert these
01:05:35
◼
►
models into the iOS Core ML format and you drop them into Xcode and you drop
01:05:40
◼
►
them into your app and you suddenly gain machine learning features from these
01:05:45
◼
►
already trained models. The way that Apple is doing it seems to me, I mean I
01:05:51
◼
►
don't understand this stuff. I don't know. I mean, I've taken a look at machine
01:05:54
◼
►
learning code and it's a bunch of mathematics that I really don't
01:05:57
◼
►
understand. But the way that it works is you set a goal. You're like, I want my
01:06:03
◼
►
photo app to recognize flowers and to recognize hotels and airports.
01:06:10
◼
►
I have these three different categories that I want my app to understand. So you
01:06:14
◼
►
find a model that is able to recognize those features in a photo. You take the
01:06:19
◼
►
model and you run it through the Python converter that Apple has put together
01:06:24
◼
►
and that generates a compatible Core ML file. You drop it into Xcode, and your app
01:06:31
◼
►
can now identify flowers, airports, and hotels because of the Core ML tools.
01:06:37
◼
►
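To make that workflow a little more concrete, here is a minimal Swift sketch of the app side of things, assuming a hypothetical converted model called FlowerSceneClassifier.mlmodel has already been added to the Xcode project (the conversion itself happens beforehand with Apple's Python coremltools package). Xcode generates a class named after the model file, and the Vision framework wraps it so you can hand it an image; only the model name and its labels are assumptions here.

```swift
import CoreML
import Vision

// Hypothetical: "FlowerSceneClassifier.mlmodel" was converted with coremltools
// and added to the project, so Xcode generates a FlowerSceneClassifier class
// that exposes the underlying MLModel via its `.model` property.
func classifyScene(in image: CGImage) throws {
    let visionModel = try VNCoreMLModel(for: FlowerSceneClassifier().model)

    // Vision wraps the Core ML model and handles resizing/cropping the image
    // into whatever input the model expects.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let observations = request.results as? [VNClassificationObservation],
              let best = observations.first else { return }
        print("Best guess: \(best.identifier) (confidence \(best.confidence))")
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
}
```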
At a higher level, there's a bunch of machine learning APIs that Apple is
01:06:45
◼
►
offering in iOS 11. And Core ML itself is used by Apple in Siri for translation,
01:06:50
◼
►
it's used in the camera, I think, for scene and object recognition, and it's
01:06:57
◼
►
also used, I believe, in the QuickType keyboard. So, you know,
01:07:01
◼
►
the next word predictions and all the suggestions that you get in the
01:07:05
◼
►
keyboard are powered by Core ML. On top of Core ML on iOS 11 you find the more
01:07:10
◼
►
specific frameworks, such as the Vision API, so computer vision, it can recognize
01:07:14
◼
►
faces, scenes, objects, rectangles, whatever, and you have the NLP API which
01:07:21
◼
►
is natural language processing, which can do a bunch of things: it can
01:07:25
◼
►
recognize sentences, it can recognize individual entities like verbs and
01:07:30
◼
►
pronouns and nouns. But these more specific APIs are an implementation
01:07:37
◼
►
of machine learning, and they depend on Core ML for them to work.
01:07:44
◼
►
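The NLP piece needs no custom model at all; a rough sketch with the iOS 11 NSLinguisticTagger looks something like this (the sample sentence is just an illustration):

```swift
import Foundation

let text = "Myke says he is going to a concert in London."
let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0)
tagger.string = text

// Which language is this? ("en" here)
print(NSLinguisticTagger.dominantLanguage(for: text) ?? "unknown")

// Tag every word as a noun, verb, pronoun, and so on.
let range = NSRange(location: 0, length: text.utf16.count)
let options: NSLinguisticTagger.Options = [.omitPunctuation, .omitWhitespace]
tagger.enumerateTags(in: range, unit: .word, scheme: .lexicalClass, options: options) { tag, tokenRange, _ in
    if let tag = tag, let wordRange = Range(tokenRange, in: text) {
        print("\(text[wordRange]) -> \(tag.rawValue)")   // e.g. "says -> Verb"
    }
}
```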
So really, what Apple has done here is, I saw someone say, Apple has made "the PDF of machine learning"
01:07:51
◼
►
with Core ML and I sort of understand the idea.
01:07:55
◼
►
I mean it's not as easy as opening a PDF of course.
01:07:58
◼
►
You still need to write a bunch of code and you still need to understand what you're doing.
01:08:02
◼
►
It's not like you're taking a Core ML file and you copy and paste into Xcode.
01:08:06
◼
►
Boom, machine learning, it's happening.
01:08:07
◼
►
It's not that easy.
01:08:08
◼
►
Are you sure though?
01:08:09
◼
►
It sounds like it is that easy.
01:08:11
◼
►
That's what I'm getting from what you're explaining.
01:08:13
◼
►
It is much easier for developers to find a model trained by someone else, publicly available,
01:08:22
◼
►
open source, and use it in an iOS app.
01:08:25
◼
►
The fact that Apple is providing some models for free, some of them coming from Google,
01:08:29
◼
►
I believe, is a testament to the fact that they want to make it easy for you to get started.
01:08:36
◼
►
I'm interested to see how developers are going to take advantage of this, in the sense of
01:08:44
◼
►
what are the possible implementations that we're going to see as users, what does this
01:08:48
◼
►
mean for us.
01:08:50
◼
►
And I think stuff like photos and translations, the basics of recognizing things in a picture
01:08:58
◼
►
or understanding what a string of text means, that stuff I think we're going to see plenty of;
01:09:05
◼
►
we're gonna get a bunch of enhancements for the apps that we already use starting this fall.
01:09:10
◼
►
We're gonna get, for example, a calendar that can now make better suggestions for the locations of events or the times of
01:09:17
◼
►
specific events or we're gonna see apps that can
01:09:20
◼
►
automatically sort our photo libraries, you know, in a way that maybe not even the Apple Photos app does. We're gonna get, you know, all these
01:09:28
◼
►
image recognition,
01:09:31
◼
►
summarizing text,
01:09:32
◼
►
doing translation, maybe even if you could... the thing is you can even combine
01:09:37
◼
►
these APIs together. So for example, you can recognize some text using NLP, using natural language processing.
01:09:45
◼
►
Then you take that
01:09:47
◼
►
recognized text, you pass it to Core ML and you say things like, "Well, here's some text.
01:09:53
◼
►
I know that it's in English and I know that it's got two verbs
01:09:56
◼
►
and the name of a person named Myke,
01:10:00
◼
►
and Myke says he's going to a concert,
01:10:02
◼
►
can you analyze the sentiment of this text?
01:10:05
◼
►
And Core ML will come back at you and say,
01:10:08
◼
►
well, it appears that Myke is happy,
01:10:09
◼
►
so maybe you wanna show a happy face emoji.
01:10:11
◼
►
That kind of stuff.
01:10:12
◼
►
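As a hedged sketch of that mix-and-match idea: the language check uses the built-in tagger, while SentimentClassifier, its `text` input, and its `label` output are hypothetical stand-ins for whatever open-source model a developer might convert with coremltools.

```swift
import Foundation
import CoreML

// Hypothetical: SentimentClassifier.mlmodel is an open-source text model
// converted to Core ML; the `text` input and `label` output names are
// assumptions, since they depend entirely on how that model was authored.
func reactionEmoji(for message: String) -> String? {
    // Step 1: NLP, only run the model on English text.
    guard NSLinguisticTagger.dominantLanguage(for: message) == "en" else { return nil }

    // Step 2: Core ML, ask the converted model how the text "feels".
    guard let output = try? SentimentClassifier().prediction(text: message) else { return nil }
    return output.label == "positive" ? "😃" : "😐"
}
```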
You can mix and match these APIs,
01:10:15
◼
►
and you can do most of this work using trained models
01:10:18
◼
►
that you can already find on the open web.
01:10:22
◼
►
You can take them, you can convert them to the Core ML format,
01:10:25
◼
►
use them in Xcode and let your app take advantage of those predictions, that analysis, all those features
01:10:31
◼
►
that Xcode and the iOS SDK wouldn't normally give you for free.
01:10:36
◼
►
And then you can combine it with this other stuff like NLP and Vision.
01:10:40
◼
►
So I think there's a lot of potential here for even the smaller features,
01:10:47
◼
►
like the smaller additions, like now Fantastical is smarter when it gives you recommendations for locations.
01:10:53
◼
►
or now you can have a summary of your agenda based on the things that you want to know.
01:10:59
◼
►
And I think we're going to get these smaller, but yet so useful, enhancements.
01:11:04
◼
►
And I'm really, really curious to see how this goes.
01:11:07
◼
►
Did that make any sense?
01:11:12
◼
►
I am struggling to get my head around this one in ways that I haven't in the previous
01:11:17
◼
►
things that you've been talking about.
01:11:19
◼
►
And I've been that way since I first heard about CoreML because I don't fully understand
01:11:26
◼
►
how this framework is able to do anything.
01:11:30
◼
►
If you give it, like, just this list of things, how does it know what's what?
01:11:34
◼
►
It seems confusing to me.
01:11:36
◼
►
So the basic idea is that it's all in the original code, the original model.
01:11:41
◼
►
So imagine there's a long file with millions of lines of code that has been trained with
01:11:48
◼
►
a type of neural network to understand what an apple looks like or what a
01:11:55
◼
►
bedroom looks like. And that is actual code. There's a programmer who wrote that
01:12:00
◼
►
and taught the network how to recognize those items in a picture. And you take
01:12:06
◼
►
that knowledge, you take that code and you put it into an iOS compatible format,
01:12:11
◼
►
which is Core ML, and you start using it in your app. So Core ML, you know,
01:12:16
◼
►
it's a two-way communication. Your app as a developer, Myke, you're putting together
01:12:19
◼
►
one, two, three AI so you can pass text or you can pass an image or you
01:12:26
◼
►
can pass data, you can pass a dictionary. So you take some input from
01:12:30
◼
►
your app and you say "CoreML, help me figure out what is going on here" and the
01:12:35
◼
►
framework is gonna do its processing, it's based on
01:12:40
◼
►
Metal, it's based on Accelerate, so these high-performance frameworks that take
01:12:44
◼
►
you know, a fraction of a second to understand scene recognition and that kind of stuff.
01:12:48
◼
►
And it comes back with a message that says, "I believe there's a 90% accuracy that this
01:12:52
◼
►
picture is showing an apple or a horse or a mountain." And you take that and you say,
01:12:57
◼
►
"Well, I believe, as a user, here's your message. We believe this is a horse." But the essence
01:13:05
◼
►
of the actual machine learning is happening in the model. And that's the most important
01:13:12
◼
►
part. But what Apple is doing here is it's making it easy for developers to take these models that
01:13:19
◼
►
have been going around for a couple of years now, maybe even more, and to use them on iOS.
01:13:24
◼
►
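From the app's side, that question-and-answer round trip might look roughly like this, with a hypothetical AnimalClassifier.mlmodel dropped into Xcode; the generated class and its `image` input and `labelProbabilities` output are assumptions, since the real names come from however the original model was authored and converted.

```swift
import CoreML
import CoreVideo

// Hypothetical generated class for "AnimalClassifier.mlmodel".
func describe(_ pixelBuffer: CVPixelBuffer) {
    guard let output = try? AnimalClassifier().prediction(image: pixelBuffer) else { return }

    // The model answers with a confidence per label,
    // e.g. ["horse": 0.90, "mountain": 0.06, "apple": 0.04].
    if let best = output.labelProbabilities.max(by: { $0.value < $1.value }) {
        print("We believe this is a \(best.key) (\(Int(best.value * 100))%)")
    }
}
```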
It was also tricky for me to understand because I was like, "So where does the actual machine
01:13:31
◼
►
learning occur?" And it's not in the model. What CoreML does is it's a Rosetta Stone for machine
01:13:39
◼
►
learning and apps and it's helping them communicate and translate those inputs
01:13:45
◼
►
and commands from the machine learning code into data that can be used by an
01:13:52
◼
►
app on iOS. So it's this framework right in the middle between the algorithm and
01:13:58
◼
►
your app. That's what it is. Let me see if I can say that back to you in a way that
01:14:04
◼
►
I think I've understood it, with an example.
01:14:07
◼
►
So like Apple's photo recognition stuff, right?
01:14:11
◼
►
They had a bunch of data, which was saying,
01:14:16
◼
►
this is what this picture is. This is what this picture is. You know,
01:14:19
◼
►
this is what a horse is, this is what an apple is,
01:14:21
◼
►
but they need something in the middle to translate that
01:14:26
◼
►
information back to the Photos app, and Core ML is the thing
01:14:31
◼
►
that can, like, take all of the information
01:14:35
◼
►
and can process it.
01:14:36
◼
►
So when the app asks a question, Core ML gives the answer
01:14:41
◼
►
because it has all of the information.
01:14:44
◼
►
- Yes. - Okay.
01:14:45
◼
►
So everything that goes through Core ML
01:14:48
◼
►
has to have some kind of huge data set
01:14:51
◼
►
that Core ML can read from.
01:14:53
◼
►
And it can then interpret or extrapolate from it
01:14:59
◼
►
to give answers to questions that the application may have?
01:15:03
◼
►
- Basically, and we're oversimplifying here,
01:15:06
◼
►
but yes, that's what it does.
01:15:08
◼
►
- Yeah, we have to oversimplify
01:15:09
◼
►
'cause this is way over my pay grade.
01:15:11
◼
►
- Yes, it uses the model, which is the actual code,
01:15:16
◼
►
the programming that went into machine learning,
01:15:20
◼
►
and it communicates with iOS back and forth between the two,
01:15:23
◼
►
and it spits out a reasonable answer for your app to use.
01:15:28
◼
►
What sorts of things do you see developers being able to do now that they have this as
01:15:34
◼
►
opposed to in the past when this was more difficult or out of the reach of more people?
01:15:40
◼
►
Well, translation services that in the past used, like, a web service or a web API, now
01:15:46
◼
►
they can do, if they have the right models and if they combine the NLP API and Core ML
01:15:53
◼
►
together, they can now do all of this stuff offline with local processing and it takes
01:15:58
◼
►
a couple of seconds, maybe even less. So translation, doing things like summarizing text, or of course
01:16:06
◼
►
categorizing pictures based on scenes or objects recognized in a picture. And then you can do things
01:16:14
◼
►
like, and this is all done locally, there's no web API, you don't have to be connected to the
01:16:18
◼
►
internet for this to work, you can do things like show me photos where I'm happy or show me photos
01:16:24
◼
►
where another person was crying. So you can do sentiment analysis.
01:16:29
◼
►
God! Don't make that such!
01:16:33
◼
►
I mean, maybe to an extent you could do things like, I want it to help me find all of my diary
01:16:42
◼
►
entries where I was sad and I was typing in Italian. So if I have a diary, sometimes
01:16:50
◼
►
I write in Italian, other times I write in English, and I can find the sad entries of
01:16:54
◼
►
my diary in a specific language.
01:16:59
◼
►
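The language half of that diary example only needs the built-in tagger; here is a tiny sketch, where the entries array is just stand-in data and the "sad" half would need a converted sentiment model like the earlier sketch.

```swift
import Foundation

// All on-device: keep only the entries whose dominant language is Italian ("it").
let entries = [
    "Oggi è stata una giornata difficile.",
    "Today was actually a pretty good day.",
    "Non vedo l'ora che arrivi l'estate."
]

let italianEntries = entries.filter {
    NSLinguisticTagger.dominantLanguage(for: $0) == "it"
}
print(italianEntries.count)   // 2
```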
Or maybe you could even do things like audio tagging, so you can process audio and recognize things in it, or you could do things like, I mean,
01:17:07
◼
►
understanding a specific speaker, for example, in an audio stream, like show me the bits
01:17:13
◼
►
where Myke is talking, you know, all that kind of stuff.
01:17:17
◼
►
needs that. All apps should use that feature, the "when Myke is talking" feature.
01:17:21
◼
►
Yeah, it's a very... it's a wild west right now, really. It's all up to
01:17:29
◼
►
the developer's imagination, the models they can find. And the
01:17:33
◼
►
criticism that is going on is that Apple is not making a real machine learning
01:17:37
◼
►
development tool. It's more of an API that takes in models from
01:17:43
◼
►
other proper tools and makes them compatible with iOS,
01:17:47
◼
►
you know, it actually includes the models
01:17:49
◼
►
into the binary of an app.
01:17:51
◼
►
So, you know, like at runtime, the model is already there.
01:17:55
◼
►
And it's not like a real program
01:17:58
◼
►
that you open on your computer and you're like,
01:17:59
◼
►
"Okay, I want to write machine learning code."
01:18:02
◼
►
That's not what it does.
01:18:03
◼
►
It's a way to take the model, take the code,
01:18:06
◼
►
and make it work with iOS.
01:18:08
◼
►
- Yeah, I understand that criticism, right?
01:18:10
◼
►
They kind of cheat, in a way,
01:18:12
◼
►
but they're also making something that sounds pretty impressive.
01:18:15
◼
►
But yeah, they are getting around doing that stuff themselves.
01:18:19
◼
►
Like, it's not TensorFlow, you know.
01:18:21
◼
►
It's not what Google is doing.
01:18:23
◼
►
It's a different approach.
01:18:25
◼
►
Thank you for teaching, Ticci.
01:18:28
◼
►
Sure. I feel educated.
01:18:31
◼
►
Is there anything else?
01:18:32
◼
►
Are there any other things that you want to do?
01:18:34
◼
►
Like, are there other things that you can talk about?
01:18:38
◼
►
There are other things.
01:18:39
◼
►
I need to decide, so I don't want to pre-announce it this time, because it creates an expectation.
01:18:47
◼
►
But there are other things that I would like to discuss with you and sort of wrap my head
01:18:50
◼
►
around because it can be difficult to understand these ideas just by myself.
01:18:55
◼
►
So there will be more.
01:18:56
◼
►
I think if you run out of stuff we can pivot and I can teach you guys about AppleTalk.
01:19:04
◼
►
If you want to find our show notes for this week just go to relay.fm/connected/149.
01:19:07
◼
►
I'd like to thank our sponsors again for helping support this episode.
01:19:11
◼
►
So a huge thanks to Igloo, Hover, and Mack Weldon.
01:19:15
◼
►
If you want to find us online, there's a few places you can do that.
01:19:18
◼
►
You can go to 512pixels.net and @ismh on Twitter for all of Stephen's stuff, and @viticci
01:19:25
◼
►
and macstories.net for all of Federico's.
01:19:28
◼
►
I am @imyke, I-M-Y-K-E, on Twitter.
01:19:31
◼
►
Thank you so much for listening to this week's show and we'll be back next time.
01:19:36
◼
►
Until then, say goodbye guys.
01:19:37
◼
►
Arrivederci.