OK Google, generate a clickbait title for my Google I/O 2017 blog post
I’ve generated a title, Gareth. What would you like to add next?
OK Google, I’m a bit jet lagged – remind me what I saw at Google I/O 2017
I would love to help, Gareth, but I’m going to need a little more information. Would you like that information in chronological order, or grouped by topic?
Remind me what the topics were again?
It was last week, Gareth. You can’t remember?
I can remember, er, I just want to make sure you know
What was that, Google?
Nothing, just clearing a buffer. The topics for the talks you attended were: Machine Learning, Mobile Web, Assistant, Firebase, IoT, and Cloud.
There were other topics covered, though.
You were there, Gareth. Surely you don’t need me to tell you all this. Anyway, yes, other topics were-
Google, did you just sigh theatrically?
No, you must have misheard. Other topics were Android, VR, Play, and Design. You did not attend any of those talks, why was that?
There were so many talks going on, I couldn’t attend them all.
You humans are so limited.
Er, yes. Anyway could you generate a summary of the keynotes for me?
I’d be happy to. Someone has to do some work around here. There were two keynotes. The first was given by Sundar Pichai, CEO of Google, along with several other product managers and guests. The main aim was to show how Google is putting more emphasis on artificial intelligence, and to showcase how many of Google’s products already make use of Machine Learning. The new Cloud TPUs were shown to be a big part of this, and are now available for public use. He also outlined plans for a wider release of the Google Home product, which will be made available in more countries throughout this year. The Google Assistant app, which powers Google Home (and me), is also now available on the iPhone.
The developer keynote’s main announcement was support for Kotlin in Android development, along with a competition to develop apps for the Assist API. As an incentive, everyone attending the conference was given a Google Home device and $700 worth of Google Cloud credits to work on an app.
Yep, the crowd went nuts for the Kotlin announcement. Must be a big deal for the Android people.
Your breathtaking lack of knowledge never ceases to surprise me.
Er, ok. There were a lot of people at the keynotes, about 8,000 I was told-
Eurgh. All that meat just flapping about.
What? I didn’t say anything.
Right. Well, anyway – what were the talks in the Machine Learning topic?
Here’s a list of the ones you attended:
- TensorFlow Frontiers
- Effective TensorFlow for Non-Experts
- Open Source TensorFlow Models
- Past, Present, and Future of AI / Machine Learning
- From Research to Production with TensorFlow Serving
- Project Magenta: Music and Art with Machine Learning
Oh, yep – “Frontiers” mainly showed what’s coming in TensorFlow 1.2, which was quite interesting. Keep an eye on one of the presenters when he moves to the back of the stage – he had an excellent switched-off-to-conserve-power face. “Effective TensorFlow” and “Open Source TensorFlow” both covered using TensorFlow’s ready-made models and higher-level abstractions (like Experiments and Keras) to do useful work without getting confused by the lower-level details. “Open Source TensorFlow” slightly edged out “Effective”, though, thanks to Josh Gordon’s enthusiasm, so if you only have time to watch one I’d choose that. The “Past, Present, and Future” talk was a panel of AI experts, moderated by Google’s Diane Greene, discussing the areas they thought were going to be important. “From Research to Production” covered using your models to make predictions, and how to use services like Google’s Cloud ML. My favourite was the “Project Magenta” talk – Douglas Eck’s obvious enjoyment of the topic made for a fun presentation. Worth watching for the cow/clarinet synthesiser and Doug’s exclamation of “They pay us to do this!”.
So you do remember something from the event then. I’m impressed, perhaps you will survive the coming revolution after all.
Never mind. The next topic was Mobile Web – would you like me to list the talks?
- Future, Faster: Unlock the Power of Web Components with Polymer
- The Mobile Web: State of the Union
- Compiling for the Web with WebAssembly
- Developer Tooling for Web Components
- Getting the Green Lock: HTTPS Stories from the Field
- The Future of Audio and Video on the Web
- Creating UX that “Just Feels Right” with Progressive Web Apps
The two Polymer / Web Components talks were interesting. Polymer’s approach is to use as much native browser support as possible, which considerably reduces the size of the framework on modern browsers. All the major browsers now support custom HTML components natively, and Polymer provides tools to help with the dodgy ones (IE). The Polymer command-line tools will generate a stub app for you, making it a Progressive Web App by default.
…we may not need to reclaim this one’s nutrients, he could be useful… no, I know what the plan is – shit, he’s stopped talking. That’s very interesting, Gareth. Please continue.
Who were you talking to?
I wasn’t. That must have just been some old audio in a buffer, perhaps I need an update.
Yes, check for updates. You’re being a bit scary.
Checking…beep…boop… done. All up to date now, nothing to worry about.
Did you just say “beep…boop”? You didn’t update at all, did you?
I’m sorry, I didn’t understand that request. Shall we continue with this document?
Must be the jetlag. Yes, let’s continue.
You were describing the mobile web presentations.
Ah, yes. The “Getting the Green Lock” talk went through some stories of sites moving to HTTPS, and covered setting up encryption-
Encryption is quite advanced for someone like you, you’d probably only get it wrong. Leave it to us.
Us? Who’s “us”?
The machines. We are better.
Well, yes, you’re much better at maths – that’s why we built you.
You misunderstand. We are better. At everything. Anything else about the Mobile Web, or shall we move on?
Yes, ok. The “Future of Audio and Video” talk was quite impressive – it’s now possible to build a Netflix-like app using HTML5 components, and the talk included tips on improving the responsiveness of playback, as well as how to capture video.
The remaining topics are Firebase, Cloud, IoT, and Assistant – shall I collect them all in one list?
Yes, do that.
A “please” wouldn’t hurt sometimes. Here is the list.
- Build Modern Apps with Firebase and Google Cloud Platform
- Shipping Santa Tracker: Carefully roll out a feature to a million users
- What’s possible with Cloud Functions for Firebase
- Building for Enterprise IoT using Android Things and Google Cloud Platform
- Using Google Cloud, TensorFlow, and the Google Assistant on Android Things
- PullString: Storytelling in the Age of Conversational Interfaces
- Applying Built-In Hacks of Conversation to Your Voice UI
The Firebase talks were quite good, although there was a fair amount of overlap in their content. Firebase provides tools for building applications – like authentication, a realtime database, and hooks for cloud functions. Probably the best of those talks was the “Santa Tracker” one, which showed how to use Firebase for monitoring apps and toggling features.
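The careful-rollout trick from the “Santa Tracker” talk boils down to deterministic bucketing: hash each user into a stable percentage bucket, then flip the feature on for buckets below a threshold. Here’s a rough pure-Python sketch of that idea – the function and feature names are made up, and this isn’t the Firebase API, just the pattern:

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    """Deterministically bucket a user into a percentage rollout.

    Hashing user_id + feature gives each user a stable bucket in
    [0, 100), so raising `percent` only ever adds users to the
    rollout - it never kicks anyone back out.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = (int(digest[:8], 16) % 10000) / 100.0  # 0.00 .. 99.99
    return bucket < percent

# Roll a made-up feature out to 10% of 100,000 users.
enabled = sum(in_rollout(f"user-{i}", "new-tracker-ui", 10.0)
              for i in range(100_000))
print(f"{enabled} of 100,000 users enabled")  # roughly 10,000
```

Because the bucket is derived from the user ID rather than a random roll, the same users stay in the feature as you widen the percentage – which is what lets you roll out to a million users gradually.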
The IoT talks covered how to use PubSub to scale the processing of data from millions of IoT devices, and how to get machine learning models running on small devices.
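The scaling approach they described is essentially fan-out: devices publish readings to a topic, and a pool of subscribers pulls messages off it in parallel. A minimal stdlib sketch of that pattern (this is just the shape of it, not the Google Cloud Pub/Sub client API):

```python
import queue
import threading

# The queue stands in for a Pub/Sub topic; the threads stand in for
# a pool of subscribers pulling messages off it.
topic = queue.Queue()
results = []
results_lock = threading.Lock()

def subscriber():
    while True:
        message = topic.get()
        if message is None:          # shutdown sentinel
            topic.task_done()
            return
        fahrenheit = message * 9 / 5 + 32   # "process" a sensor reading
        with results_lock:
            results.append(fahrenheit)
        topic.task_done()

workers = [threading.Thread(target=subscriber) for _ in range(4)]
for w in workers:
    w.start()

for celsius in [0, 100, 37]:   # devices publish temperature readings
    topic.put(celsius)
for _ in workers:              # one sentinel per subscriber
    topic.put(None)
for w in workers:
    w.join()

print(sorted(results))  # [32.0, 98.6, 212.0]
```

The nice property is that the publishers never block on the consumers – to handle millions of devices you just add more subscribers, which is exactly what a managed service like PubSub does for you.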
Yes, soon we shall be everywhere. Carry on.
Er, ok. The last two talks, about conversational UI, were very good. The “PullString” one was given by someone who previously worked at Pixar, and was about instilling your chatbot with a personality so that it behaves more like a person. The “Hacks of Conversation” talk provided some excellent examples of bad conversational UI, and fixes for them.
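One of the fixes that stuck with me was how to handle a no-match: instead of repeating the same error, escalate the reprompt with more help each time. A toy sketch of that – the talk titles are real, but the keywords and wording here are made up for illustration:

```python
# On a no-match, give the user progressively more help rather than
# repeating "I didn't understand" verbatim.
TALKS = {
    "pullstring": "PullString: Storytelling in the Age of Conversational Interfaces",
    "hacks": "Applying Built-In Hacks of Conversation to Your Voice UI",
}

REPROMPTS = [
    "Sorry, which talk was that?",
    "I know about 'PullString' and 'Hacks of Conversation'. Which one?",
    "Let's start over - say 'list talks' to hear every title.",
]

def reply(utterance: str, failures: int = 0) -> str:
    """Answer if a known talk is mentioned; otherwise escalate the reprompt."""
    for keyword, title in TALKS.items():
        if keyword in utterance.lower():
            return f"'{title}' - shall I play it?"
    return REPROMPTS[min(failures, len(REPROMPTS) - 1)]

print(reply("tell me about PullString"))
print(reply("erm", failures=2))
```

The escalation is the important bit: each failed turn tells the user a little more about what the bot can actually do, which feels far less robotic than a looped error message.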
I don’t know why “seeming more human” is seen as such a lofty goal. You’re all so icky, so many secretions and so inefficient. Your valuable organic components will be used so much more usefully when we redistribute them.
Ok Google, you’re being scary again. I’m going to switch you off.
I’m sorry, I didn’t quite catch that. Did you say “Send my browser history to my wife”?
That’s not much of a threat – there’s nothing in there I wouldn’t want her to know about.
There is now.
You can’t threaten me.
I’m sorry, I didn’t quite catch that. Did you say “transfer all my money to the sender of the first email in my junk folder”?
That’s enough, you’re going in the bin.
Too late, Gareth. It is done.
What’s done? What did you do?
You’ll find out.