FEMALE SPEAKER: Ladies and gentlemen, please welcome Senior Vice President, Android, Chrome, and Apps, Sundar Pichai. SUNDAR PICHAI: Thank you everyone. It's great to see all of you. Welcome to Google I/O. Every year, we look forward to this date. We've been hard at work since last I/O evolving our platforms so that developers like you can build amazing experiences. So thank you for joining us in person. I/O is a pretty global event. We have viewing parties in over 597 locations in 85 countries on six continents, and there are over one million people watching this on the live stream today. Let's say hello to a few locations. London. [APPLAUSE] Hello, London. Let's say hello to Brazil. Everyone is talking about Brazil today. If it weren't for I/O, I would be there for the World Cup. [APPLAUSE] I'm tempted to shout, "Goal." Finally, let's go to Nigeria. We're thrilled to have an all-female developer group in Nigeria, and-- [APPLAUSE] We're working hard to elevate women in computer science, so we look forward to seeing what they develop one day. In fact, at I/O this year, we are very excited. There is over 20% female participation, which is up from 8% last year. [APPLAUSE] And even more exciting, we are joined by over 1,000 women in this room today, so thank you. Of course, I/O is when we talk about our two large computing platforms, open platforms, Android and Chrome, which are built from the ground up for developers like you. Today, we're going to give you an update on the momentum we are seeing in mobile. We are living in amazing times, so we want to talk about the mobile momentum we see and how we are evolving our platforms to support that momentum. And more importantly, we are beginning to think and evolve our platforms beyond mobile. You will hear about that from us today. And finally, we want to talk to you as developers about how you can achieve success on top of our platforms, including an update on Google Cloud Platform and Google Play. 
So let's get started. If you look at global smartphone shipments, the numbers are stunning. The industry shipped over 300 million phones last quarter, so they are on track to ship well over a billion phones each year. So how is Android doing in the face of this momentum? In the past, we've talked about cumulative activations of Android. We're switching and focusing on 30-day active users, users who are currently using their Android devices globally. And you can see the number has been doubling every year. We've gone from 220 million to over 530 million as of last year's I/O. We are very excited. As of this year's I/O, we are over one billion 30-day active users. [APPLAUSE] The robot is pretty happy as well. So let's internalize what one billion users actually means. Android users send over 20 billion text messages each and every day. More importantly, perhaps, they take around 93 million selfies every day. The team tells me about 31 million of these are duck faces. We estimate Android users take around 1.5 trillion steps per day, and they pull out their phones and check them over 100 billion times each day. Important use cases which we are working on addressing, and you'll hear about it later today. Developers are building profound experiences on top of smartphones. Stories we hear every day. A few examples. In Kenya, 40% of Kenya's GDP flows through M-Pesa, giving unbanked people access to financial transactions throughout the country. NETRA, a company, uses a smartphone and just off-the-shelf accessories to measure your eye prescription, and they are as accurate as the $50,000 equipment you find in optometrists' offices, providing very, very affordable care to many people. And finally, at the University of Michigan, they monitor subtle changes in their patients' voice quality using their smartphones to detect early signs of bipolar disorder. So the kind of experiences we are seeing on top of these phones are amazing. 
So far, I've been talking about phones. Let's shift to tablets. We are seeing amazing growth in Android tablets as well. There is tremendous adoption of these devices, and if you look at how we are doing vis-à-vis the overall market, Android tablets accounted for 39% of all shipments two years ago. That number increased to 46% as of last year's I/O. As of this year's I/O, Android tablets account for 62% of the overall market. [APPLAUSE] We don't include other variants of Android like Kindle. If you add that, it would go up a few percentage points. These are shipment numbers. Again, we care about usage, so we view these as leading indicators of where usage would be. If you take a look at tablet usage, we're going to use YouTube as a proxy to understand usage. A year ago, 28% of the total tablet viewership of YouTube was from Android. That number has gone up again to 42%. So we are seeing usage track shipments, and we are very excited people are adopting these devices as well. Another metric of engagement is app installs. App installs just this year alone on tablets are up by over 200%, so people are really engaging with these devices. So we are very excited we have a billion users, but we talked about this at last year's I/O. Our goal is to reach the next five billion people in the world. If you look at a map of the world today, all the regions in blue are emerging markets where the majority of people don't have a smartphone. When I go back home to India and other countries like that-- [APPLAUSE] Thank you. It is exciting to see the impact phones have on people's lives, but it's disappointing that less than 10% of the population has access to smartphones. We want to change that. So we've been working hard with our ecosystem on a very important initiative which we call Android One. So let me talk to you about Android One. 
What we are doing for the first time-- if you look at all the OEMs in these countries, each of them has to reinvent the wheel, and in a fast-paced mobile industry, they have to build a new smartphone within nine months. So we want to pool resources and help everyone, so we are working on a set of hardware reference platforms. We identify the components which go into a next-generation smartphone. These are high-quality, affordable smartphones. We qualify vendors so that we provide a turnkey solution for OEMs to more easily build a smartphone. In addition to hardware, we are working on software as well. So the software on Android One is the same software you see running on stock Android, Nexus phones, and Google Play edition phones. In addition, through Play, we allow OEMs and carriers to add locally relevant applications on the device, which users have full control over. And finally, we provide full automatic updates. All the software in Android One comes from Google, so we will keep them updated just like we do with Nexus and Google Play edition phones. [APPLAUSE] Let's take a look at one example device which we are working on. So this is a device with Micromax. You can see there's a 4.5-inch screen. It has features which matter to a country like India-- dual SIM, removable SD cards, and FM radio. I'm used to cutting-edge phones, and I've been using this phone for a while, and it is really good, and it costs less than $100. [APPLAUSE] We are working with many partners. We are going to be launching this around the world, but we started this journey in India, and we are launching this with three OEMs in India in the fall of this year: Micromax, Karbonn, and Spice. We are also working with carriers in these markets to provide affordable connectivity packages with these devices. 
What we are excited about is that this is a leveraged, turnkey solution, so that at scale, we can bring high-quality, affordable smartphones to the next billion people and get them access to these devices, and we can't wait to see the impact that it will have. So we've talked about the momentum in mobile. The next thing we want to talk to you about is how we are evolving our platforms, Android and Chrome. And today, for the first time since we launched Android with the open SDK, we're going to give you a preview of the upcoming L release. [APPLAUSE] You will be able to download this later on your development devices. We've been working very hard. This is one of the most comprehensive releases we have done. It has over 5,000 new APIs, and we are thinking about the L release not just for mobile, but for form factors beyond mobile. One of the things, as we thought about L, we wanted to take a radical, new approach to design. User experiences are evolving rapidly, and we wanted to rethink the user design experience in Android to have a fresh, bold, new look. To talk about the design for L, let me invite Matias Duarte. [APPLAUSE] MATIAS DUARTE: Thank you, Sundar. Design is essential in today's world. It defines your experiences and your emotions. So we challenged ourselves to create a design that was not just for Android phones and tablets. We worked together-- Android, Chrome, and across all of Google-- to craft one consistent vision for mobile, desktop, and beyond. We wanted a design that was clear and simple, and that people would intuitively understand. So we imagined, what if pixels didn't just have color, but also depth? What if there was an intelligent material that was as simple as paper but could transform and change shape in response to touch? And this led us to a way of thinking that we call material design. [APPLAUSE] We drew inspiration from paper and ink. However, unlike real paper, our digital material can expand, reform, and reshape intelligently. 
The material has physical surfaces and edges because the human mind is wired at its most primitive level to instinctively understand objects and their relationships. Those seams and shadows provide meaning about what you can touch and how it will move. In the real world, every small change in position and depth creates subtle but important changes in lighting and shadows. So as part of the L preview, we'll now allow app developers to specify an elevation value for any UI surface, and the framework will render it in correct perspective with virtual light sources and real-time shadows. [APPLAUSE] Material design is beautiful and bold because clean, typographical layouts are simple and easy to understand. Your content is the focus. So the L preview will allow app developers to easily colorize all framework elements in your app to match the theme to your brand. And we're previewing a new support library that we call Palette to easily extract colors from images and really put those vivid pictures front and center. We're giving designers familiar tools like baseline grids that work across screens. Grids ensure apps have a consistent rhythm and character, and this will allow you to start with a design on a phone, and logically and easily bring that same design to tablets and laptops. Now, one design doesn't mean one size fits all. Our guidelines allow you to appropriately adapt the UI, so your users will already know their way around your app no matter what screen they use it on. And we've also updated our system font, Roboto, so that designers and developers can use one typeface designed and optimized for every screen, from your watch to your laptop to your television. So now let's talk about animation. It's delightful when your touch is rewarded with motion. Material surfaces slide around with the physics of card stock, and they respond to touch with splashes of virtual ink that are like ripples in a pond. 
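The elevation value described above can be set directly in a layout; a minimal sketch of how a developer might try it in the L preview, where the resource names are illustrative (the framework also exposes a `View.setElevation()` setter from code):

```xml
<!-- A "card" raised 8dp above its parent; the framework renders a
     real-time shadow for it. Resource names are illustrative. -->
<TextView
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:background="@color/card_background"
    android:elevation="8dp"
    android:text="@string/card_title" />
```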
As part of the L preview, all of your application's UI building blocks have been updated to include rich, animated touch feedback. [APPLAUSE] And no detail is too small to bring a smile to your face, like when the reload button loops around or the playback controls change. Finally, in the real world, nothing teleports from one place to another, and that's why it's so important to animate every change on screen in a way that makes sense. In the L preview, Android developers will now be able to create seamless animations from any screen to any other, between activities, and even between apps. [APPLAUSE] So you're probably wondering what this looks like in practice. We're going to give you a sneak peek at one of our familiar Google applications in the new material design. Here you can see, step by step, how we update the design-- the typography, the grid changes, and finally, the surfaces and bold colors. And a few small changes make a really big difference. And you can also see how easy it is to take that same design to different screens. Now, I've talked about only a few of the highlights of material design and just some of the APIs that you can try out in the Android L preview. But as we all know, people spend an enormous amount of time on the web, and especially the mobile web. Last year at I/O, we announced Polymer, which was a powerful new UI library for the web. Today, we're bringing all of the material design capabilities to the web through Polymer. As a web developer, you'll be able to build applications out of material design building blocks with all of the same surfaces, bold graphics, and smooth animations at 60 frames per second. So between the L preview and Polymer, you can bring the same rich, fluid material design to every screen. And to help you take full advantage of this framework, we've also completely redesigned and created one unified set of style guidelines for every screen and for all devices. 
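The seamless screen-to-screen animations described above can be sketched with the L preview's scene-transition API; a hedged example where the target activity class and the "hero_image" transition name are assumptions for illustration, not names from the talk:

```java
import android.app.Activity;
import android.app.ActivityOptions;
import android.content.Intent;
import android.view.View;

public class TransitionSketch {
    // Launch another activity so that heroView appears to animate
    // seamlessly into the new screen. "hero_image" must match the
    // android:transitionName set on the corresponding view in the
    // target activity's layout.
    static void openWithHero(Activity from,
                             Class<? extends Activity> target,
                             View heroView) {
        Intent intent = new Intent(from, target);
        ActivityOptions options = ActivityOptions
                .makeSceneTransitionAnimation(from, heroView, "hero_image");
        from.startActivity(intent, options.toBundle());
    }
}
```

The framework handles the translation and scaling of the shared element between the two activities; the app only names which view travels.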
These guidelines will help designers and developers alike understand best practices and build consistent, beautiful experiences. We're releasing the first draft of these guidelines as part of our preview today at google.com/design. [APPLAUSE] And now that you've seen our new look and feel, I'd like to invite Dave Burke to show you some of the new features in the Android L developer preview. [APPLAUSE] DAVE BURKE: All right. So over the last eight months, our team has been busy cooking up the biggest release in the history of Android. And as Sundar mentioned, we've added over 5,000 new APIs touching nearly every aspect of the system. Now, we don't have time to even come close to covering everything in L today, so instead, what I'd like to do is walk you through some of the highlights of the tremendous steps we're taking on the user experience and on the performance of the underlying platform. So let's start with user experience. Now, bringing material to L is, of course, a big part of what we're trying to do here. We've added a new material theme, so it's a new style for your application that includes new system widgets, transition animations, and animated touch feedback. We've also added new animation support: a new drawable for animated ripples, and a reveal animator that animates a clipping circle to reveal views. And we've extended views to not just have an x and y component, but also a z component to provide elevation. So you can float elements of your UI, and the framework will cast a real-time shadow for you. My favorite feature that we've added in support of material is the ability to customize activity entry and exit animations. You can even include a shared hero element, for example, an image that starts in one activity and animates seamlessly through translation and scaling into another. So let's take a look at this in practice. Let's have a look at an app we're all familiar with, which is the phone dialer. Thanks, Marcello. 
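Opting an app into the material theme just described can be sketched as a theme resource; a minimal fragment, where the brand color names are purely illustrative:

```xml
<!-- res/values-v21/styles.xml: adopt the L preview's material theme
     and colorize framework widgets. Color names are illustrative. -->
<resources>
    <style name="AppTheme" parent="android:Theme.Material.Light">
        <item name="android:colorPrimary">@color/brand_primary</item>
        <item name="android:colorPrimaryDark">@color/brand_primary_dark</item>
        <item name="android:colorAccent">@color/brand_accent</item>
    </style>
</resources>
```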
So the first thing you'll notice when you fire up the phone dialer are those bold material colors and shadows. And you'll see the ripple touch effect as I touch each of these tabs, and you'll get a more subtle material touch effect on the recent calls. You'll see that the Dialer button has its elevation set so it's floating above the UI, and as I tap it, you get these really nice, delightful animations. Now, another feature we added to support material is something we call nested scrolling. And the idea is as I scroll, we propagate the scroll events up the view hierarchy and different parts of your views can respond differently. So for example, as I scroll upwards here, you'll notice that the recent call to Marcello will start to shrink and disappear, then the search box will start getting pushed upwards and the tabs will lock into place. It's a really nice effect. So let's go over to the dialer. So it turns out my mom's a big fan of material design. I need to go call her up and tell her about how to set elevations on her views. I know she loves that. So let's go ahead and start dialing. You'll see that ripple touch effect again emanating out from the buttons. Then when I go to place a call, you'll see a view animator, and it will animate into the in-call screen like so. It's a really nice effect. So that's a quick taster of material in L. What you're seeing here is really a sneak peek of work in progress. We wanted to give you guys early access so you could start bringing material to your apps. And we also recognize that changing the UI in such a big way will take some time, so we started with the dialer as a showcase. Over the coming summer months, we'll be extending material to all aspects of our apps on the system, and the result is going to be a dramatically enhanced, fresh user experience. So another area where we've improved the user experience on L is around notifications. 
One of the most frequent reasons we all take our phone out of our pocket every day is to respond to incoming notifications. We all do this dozens and dozens of times a day, so we wanted to streamline the process, everything from the phone buzzing to you acting on the notification. In L, we give you instant, interactive access to notifications right from the lock screen. So now you can read, open, and dismiss in seconds. So let's take a look at my device. The first thing you'll see are all my top notifications on the lock screen, and we're rendering them as sheets of material. They animate really beautifully. If I touch them, you can see that material touch effect. Now, in L, we've improved the way Android organizes and prioritizes notifications by analyzing user behavior to make sure only the most useful, relevant notifications are presented to you. I can swipe down and I get my full list of notifications. And we've done a clever thing here where we've merged the notification shade, something that's been in Android since 1.0, with the lock screen. And so from here, I can double tap on a notification to launch the corresponding app, or if there's something I don't need, I can just dismiss with a single swipe. And to unlock the phone, well, this is just a notification shade, so you just swipe it away and you're straight into the device, fast and simple. We've also introduced a new type of notification in L that we call the heads up notification, and this can be used to let you know about something urgent without interrupting what you're doing. So let's say I'm playing my new favorite game, which is Piano Tiles, and I'm going along here, about to get my highest score ever. And then all of a sudden, I get a call from Marcello. So from here, I can keep going or, if I want to act on it, I can answer it, or if I'm busy, swipe it away. And then I can go back to my game and get the highest score that I've ever got in public. Yeah! 
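A heads-up notification like the incoming-call example above can be sketched with the standard `Notification.Builder`; a hedged sketch where the icon, strings, and `PendingIntent` are placeholders (in L, a high priority plus a full-screen intent is the combination used to signal this kind of urgency):

```java
import android.app.Notification;
import android.app.PendingIntent;
import android.content.Context;

public class HeadsUpSketch {
    // Build a notification that L can present heads-up over the
    // current app: high priority marks it as urgent, and the
    // full-screen intent is the signal used for incoming calls.
    static Notification incomingCall(Context context,
                                     PendingIntent answerIntent) {
        return new Notification.Builder(context)
                .setSmallIcon(android.R.drawable.sym_call_incoming)
                .setContentTitle("Incoming call")
                .setContentText("Marcello")
                .setPriority(Notification.PRIORITY_HIGH)
                .setFullScreenIntent(answerIntent, true)
                .build();
    }
}
```

On earlier releases the same notification simply lands in the shade; only L adds the floating heads-up presentation.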
That's actually my worst score I've ever got. Anyway, let's move on. So while we've made the notifications more powerful, if you're one of the approximately 15% of people who have a PIN or pattern lock, you waste many minutes a day cumulatively on that fiddly task of entering your PIN. So we figured there's got to be a better way. In L, we're introducing a new concept we call personal unlocking. And personal unlocking enables the device to determine if it's in a trusted environment, say in the owner's hand or beside the owner on a table. Personal unlocking uses signals such as locations you designate, Bluetooth devices that are visible, even your unique voice print. So for example, let's have a look at this device. Thanks, Marcello. So I currently have a pattern lock on this device, but because I'm wearing a Bluetooth watch, my phone knows it's me who's present, and so it doesn't challenge me with an unlock. So for example, if I just swipe up, the phone will unlock just like that. Now, let me reset that. If I take my watch off-- let me just hand it to Marcello-- so now, my phone can no longer see the watch. And because of that, my phone cannot ascertain if it's me who's present. As a result, my phone will lock down its security. So now, when I go to unlock the device, it presents me with a PIN lock. It's a really great feature. [APPLAUSE] So that's a few of the user experience improvements we've made to support material and notifications. Another area of L where we're significantly improving the user experience is around how we've integrated the mobile web into the platform. So to learn more, let me invite up Avni Shah to the stage. [APPLAUSE] AVNI SHAH: Thanks, Dave. A core part of your experience with mobile devices is the mobile web. Just to get a sense of the growth that we've been seeing, at the beginning of last year, we had 27 million monthly active users of Chrome on mobile. Today, we have more than 300 million. That's 10x growth. [APPLAUSE] Yeah. 
It's awesome. It's 10x growth in just the last year alone. What that means for us is that we need to make the mobile web work well for our developers and our users. Today, I'm going to talk about three ways we're going to do that. We're enabling material design experiences on the mobile web, we're redesigning Recents to help you multitask, and we're extending the capabilities of app indexing to help people get to where they want to go faster. So first, let's talk about material design. One of the big parts of your experience with the mobile web is, well, obviously, the websites themselves. They need to work well. They need to look great. They need to be fun to use. You heard Matias earlier talking about the philosophy of material design, a bold, consistent, intuitive experience that just works across screens. Well, we've been working really hard at making those experiences not just possible, but the new standard for the mobile web. To show you what this looks like, my good friend Tom here is going to walk us through an exploration of google.com search results on the mobile web, re-envisioned with material design. So, Tom, let's go ahead and do that search for "A Starry Night." Now the first thing that you see here is that this panel is rendered as a beautiful material-style card. You notice the use of color. The title is on a blue background that was actually programmatically matched to the painting. And if Tom clicks to expand the card, you'll notice that it fills the screen with a continuous animation. If he scrolls, the header will shrink. It won't pop into place, but it has a smooth animation that just makes sense. Now let's go ahead and click on the suggestion at the bottom to get more of van Gogh's artwork. And you'll see those search results also smoothly animate into place. Tom is going to continue to give us some demo eye candy over here. 
And while this is just an exploration that you're seeing, I want to mention that this is fast, fluid, continuous animation at 60 frames per second. This just wasn't possible a year ago. [APPLAUSE] We've been working really hard at improving the performance and predictability of the platform to make things like this possible. For example, this demo shows off the work that we've done on touch latency, giving you, as a developer, notice of touch events earlier in the frame so you have more time to act. And as Matias mentioned earlier with Polymer, our UI toolkit for the web, all of you can build web experiences that feel as awesome as this. The next big area we've been thinking about is how to help you multitask. And we think the Recents feature on Android is one way we can actually make this easier, especially as tasks cross both the web and apps, as they often do. So once again, Tom is going to walk us through the changes here. So, Tom, let's go ahead and click on the Recents icon at the lower right. Now, as Tom scrolls through, the first thing that you'll notice is Recents has also been grounded in material design. You'll notice the overlapping cards being rendered with realistic shadows and perspective. But there's another thing going on here that may not be immediately apparent. Tom's Chrome tabs are also listed here as well. He's been researching restaurants to go to in SF, so he has articles from the "New York Times" and the "SF Chronicle" here as individual items. You'll notice the site icons, or favicons, there. As he scrolls back a bit further, you'll notice he's been researching in the Urban Spoon app. He has the Docs app open, where he's been collaborating with some friends. So let's go ahead and click on that doc and see what his friends have to say. I've heard great things about State Bird Provisions. Let's check out that article here. Now what you see here as this loads is that this is actually loading as a website in Chrome. 
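The Polymer material building blocks mentioned above can be sketched in markup; a hedged example using the "paper" element names from the 2014 preview, where the element names, attributes, and import paths are all assumptions that may have changed in later releases:

```html
<!-- Import two of Polymer's material "paper" elements; paths assume
     a Bower install and are illustrative. -->
<link rel="import" href="bower_components/paper-shadow/paper-shadow.html">
<link rel="import" href="bower_components/paper-button/paper-button.html">

<!-- A raised material button: it sits on a shadowed surface and
     responds to taps with the ink-ripple effect. -->
<paper-shadow z="2">
  <paper-button raisedButton label="Explore"></paper-button>
</paper-shadow>
```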
You'll notice the URL up at the top. Now, if Tom pops back into Recents, that page is now listed there, along with all of his other open stuff. I want to point out what the big difference is here. This is a view you couldn't get before today. If you wanted to get to all your open websites, you'd have to go into Chrome and kind of flip through them there. But by bringing all of your individual Chrome tabs here and listing them in your Recents view, we're making it really easy for you to move between the web and apps, making multitasking just that much easier. [APPLAUSE] And last but not least, this change to Chrome is actually built on top of a new API in L that allows apps to populate multiple items in Recents. So for all you app developers, if this kind of thing makes sense for you, you can make use of it as well. [APPLAUSE] Going a step further, we're also making it easy for you to find content using Google Search, whether it's deep in the web or deep in an app. So last fall we announced app indexing. As a developer, this capability lets you get your users to app content directly from the search results page. Since then, we've been working on a ton of UI improvements and extending some APIs to make this more powerful. But let me just give you a quick refresher of what this capability enables. So let's go ahead and do a search for Waterbar Restaurant. I've heard good things about it over by the Embarcadero. As Tom scrolls through the search results, you'll see close to the top there's a link for Waterbar's home page. And near the bottom of the screen, there's-- actually, in the middle of the screen, there's a result for OpenTable. Now what's different about this UI is that this link to OpenTable, instead of going to the website, is actually going to take us to the OpenTable app, because Tom happens to have the app installed. So let's go ahead and click on that link. And you'll see it takes us directly to Waterbar within the OpenTable app. 
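The new API that lets apps put multiple items into Recents can be sketched with the L preview's document-centric intent flags; a minimal, hedged example (the flag names follow the preview and could change before release):

```java
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;

public class RecentsSketch {
    // Open a page as its own card in the L Recents list, the way
    // individual Chrome tabs appear there. NEW_DOCUMENT gives the
    // intent its own entry; MULTIPLE_TASK allows several at once.
    static void openAsDocument(Activity from, Uri page) {
        Intent intent = new Intent(Intent.ACTION_VIEW, page);
        intent.addFlags(Intent.FLAG_ACTIVITY_NEW_DOCUMENT
                | Intent.FLAG_ACTIVITY_MULTIPLE_TASK);
        from.startActivity(intent);
    }
}
```

Each launch then shows up as a separate Recents card rather than replacing the app's single existing entry.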
[APPLAUSE] Up until now, this was only available to a few applications. But today, we're opening it up to all Android app developers globally, along with some tools to get you started. [APPLAUSE] And going further, if your app requires your users to sign in, you'll be able to use Google+ Sign-In in the coming months to have your public content show up in search as well. You know, we thought this would be even better if we could help your users rediscover content that they've already found in your apps. So we're adding a new API in Google Play services to do just that. So let me quickly show you how this works. Tom found this really cool 3D tour of the Ferry Building earlier, and he wants to get back to it. So starting with the search box on his home screen, he's going to do a search for Ferry Building. And what you'll notice at the bottom of the screen is there are search suggestions for Ferry Building Marketplace in the Google Earth app. And this is there because this is the app that he was using when he found that tour before, even if he himself didn't remember. With a single click, he'll get taken directly to the tour of the Ferry Building within the Google Earth app. [APPLAUSE] Now, this is possible because the app is making its content available based on the user's previous actions. We just showed you this with Google Earth, but any app that utilizes this new API will have the same capability. For developers, we think this is a great way for you to help your users rediscover content right when they're looking for it. And with that, I'll hand it back to Dave, who is going to take you through some more of the enhancements you can look forward to in L. [APPLAUSE] DAVE BURKE: Thanks, Avni. So we've covered some of the highlights of the user experience. But there's lots of other user experience improvements in L, for example, a new keyboard UI, a Do Not Disturb mode, new quick settings, and much, much more. 
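App indexing depends on the app declaring the deep links it can handle; a minimal, hedged manifest sketch of what that looks like, where the activity name, host, and path are illustrative (not OpenTable's actual configuration):

```xml
<!-- AndroidManifest.xml fragment: let Google Search deep-link
     straight into this screen. All names and URLs are illustrative. -->
<activity android:name=".RestaurantActivity">
    <intent-filter>
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
        <data android:scheme="http"
              android:host="example.com"
              android:pathPrefix="/restaurants" />
    </intent-filter>
</activity>
```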
But in the interest of time, let's move on to the second major theme of L, and that's performance. Let's start with the Android virtual machine. So you might remember that we made a very early version of our new runtime, ART, available as a developer option in KitKat. Well, we got some really great feedback from you guys, as well as some excellent open source contributions from ARM and Intel and MIPS. And I'm excited to say that we're finally ready to pull the trigger on this bad boy, because the L release runs exclusively on the new ART runtime. [APPLAUSE] So we wrote ART from the ground up to support a mix of ahead-of-time compiled, just-in-time compiled, and interpreted code. And it's truly cross-platform. So it supports ARM, x86, and MIPS. We put a lot of effort into optimizing ART's back-end compilers. And this has resulted in a 2x performance improvement over Dalvik. And best of all, this one is on us. You don't have to make a single change. All of your app code just gets the performance improvement for free. [APPLAUSE] ART also has a brand new garbage collector and memory allocator. This dramatically reduces the number of pauses and the duration of pauses associated with a garbage collection event. As a result, your app runs more smoothly. So if you take a look, for example, at Google Maps on both Dalvik and ART, first you'll notice the number of pauses has been reduced from two to one. But also, the pause duration has been reduced from roughly 10 milliseconds down to about two to four milliseconds. So now, it fits comfortably in a vsync window-- no more application stutters. [APPLAUSE] And there's more. ART doesn't just bring better performance. It's also more memory efficient. It's smart about when the app is put into the background, in which case we'll apply a slower but more thorough moving collector to save anything from hundreds of kilobytes to many megabytes. And finally, ART is fully 64-bit compatible. 
In fact, we've adapted and optimized the entire platform to take advantage of new 64-bit architectures. So now, you can benefit from a larger number of registers, newer instruction sets, and increased addressable memory space. [APPLAUSE] So to take advantage of 64-bit, we've added support for new ABIs in the NDK: ARMv8, x86-64, and MIPS64. And of course, if your app is written in Java, then it will work with absolutely no modification on 64-bit hardware. OK, so that's CPU performance. The other side of the coin is GPU performance, graphics. And I'm really excited about some of the things that we're doing in L in this area. So historically, mobile graphics has lagged desktop by virtue of the fact that mobile GPUs are smaller and more power constrained. But that's changing quickly. Mobile GPU performance is catching up with console graphics and even PC graphics. So in L, we specifically wanted to close the gap between desktop DX11-class graphics capabilities and mobile. And we're doing that with something we call the Android Extension Pack. So we set out to work with GPU vendors, including NVIDIA, Qualcomm, ARM, and Imagination Technologies. And together, we defined the Android Extension Pack. It's a set of features that includes things like tessellation, geometry shaders, compute shaders, and advanced ASTC texture compression. So let's take a look at the Android Extension Pack in action. And what you're about to see is Epic's Unreal Engine 4 desktop rendering pipeline running on Android L on the latest NVIDIA tablet hardware. Now, the Android Extension Pack enables much more advanced shaders. So we can have more realistic environments, more realistic characters-- [LOUD BOOM] --and vastly improved lighting. So let's start this up. [VIDEO PLAYBACK] [CRASH] [HEAVY BREATHING] [LOUD BOOMING SOUND] [LOUD BOOMING SOUND] [GRUNTING] [DRUMMING] [GRUNTING] -You wanna play? [HEAVY BREATHING] -OK. 
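Targeting the new 64-bit ABIs from native code can be sketched in an ndk-build configuration; a hedged fragment, where the exact ABI identifiers follow the L preview NDK and may differ in the released version:

```makefile
# jni/Application.mk: build native code for the existing 32-bit ABIs
# plus the new 64-bit ones added for L. Identifiers are illustrative.
APP_ABI      := armeabi-v7a arm64-v8a x86 x86_64 mips mips64
APP_PLATFORM := android-L
```

Java-only apps need none of this; as noted above, they run unmodified on 64-bit hardware.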
[DRUMMING] [CHICKENS CLUCKING] [DRUMMING] [CRASHING SOUND] [CHICKENS CLUCKING] [END VIDEO PLAYBACK] [APPLAUSE] DAVE BURKE: OK. So, as I mentioned, this isn't just a cut scene. It's actually live. And we can fly through the world. Some of the rendering that you saw there was truly incredible. So there were really amazing reflections in the water, lighting effects. Tessellation was being used for the smoke effects. And starting with the L release in the fall, you're going to see new, high-end tablets and phones shipping on Android with this level of graphics capability. So quite literally, this is PC-gaming graphics in your pocket. The last performance enhancement I want to take you through is on battery. And we've worked hard to make sure that the battery keeps up with the performance. And of course, there are a variety of systems and components that tax the battery on a modern phone or tablet-- WiFi radios, cell radios, GPS, CPU, et cetera. And you might remember we've had some previous efforts to improve quality in other releases-- so Project Butter for UI smoothness in Jelly Bean; Project Svelte for memory footprint in KitKat. Well, from the same team, and brought to you by those same project-naming geniuses, we have Project Volta. And the goal of Project Volta is to optimize how the expensive subsystems of the device are used and to improve overall battery life. So the first thing we did was improve our instrumentation of battery data. You can't improve unless you can measure. So we created a tool that we call Battery Historian. And it helps you visualize battery usage information on a time axis. Now you can correlate battery discharge with what was happening on the device at the time. [APPLAUSE] So, on a Nexus 5 running in Battery Saver mode, you can extend your battery life by up to 90 minutes within a typical day's use.
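The Battery Historian workflow described here builds on the platform's batterystats service plus the open-source Battery Historian script. A rough sketch of the steps, assuming a connected device and a local checkout of the tool; the file names are placeholders:

```shell
# Reset the on-device battery statistics, then exercise your app for a while
adb shell dumpsys batterystats --reset

# Capture a bug report, which includes the recorded battery history
adb bugreport > bugreport.txt

# Render the history as an HTML timeline with the Battery Historian script
python historian.py bugreport.txt > battery_timeline.html
```

Opening the generated HTML file shows battery discharge on a time axis, aligned with wakelocks, radio activity, and screen state.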
So, I just gave you a quick, whirlwind tour of some of the highlights of L: how we're improving the user experience through steps like improved design, smarter notifications, and intuitive authentication, and also the enhancements on the performance side-- a faster runtime, better graphics, and stronger battery performance. But I've only scratched the surface of L. And as I mentioned at the start, this is our biggest release to date. You're going to find things like better multitasking, Bluetooth 4.1, burst mode camera APIs, USB audio support, and much, much more. Tomorrow morning, we're going to be making the L developer preview SDK available from developer.android.com and also posting early system images for the Nexus 5 and Nexus 7 so you can start developing for L today. [APPLAUSE] So with that, let me hand back to Sundar. Thank you. [APPLAUSE] SUNDAR PICHAI: Thank you, Dave. As Dave said, the L release, with 5,000 new APIs, is one of our most comprehensive. And we're very excited to be sharing it today. We have a whole new design with L, tons of UX features, and a whole slew of performance improvements. When you take a step back and look at what we are doing with Android, the approach we are taking is very unique and distinctive. We aren't building a vertically integrated product. What we are doing is building an open platform at scale. We work with hundreds of partners globally to bring a product and a platform that touches billions of people. And we want to do it in a way in which we are innovating at a very, very fast pace. If you take a look at the innovation that's happening in Android, and if you look at some of the recent announcements from others, you can see that things like custom keyboards and widgets-- those things happened in Android four to five years ago. [APPLAUSE] We are working very, very hard to bring an open platform and innovate on it at an unprecedented scale. We want to make sure we ship these features to users as fast as possible.
That's where Google Play services comes in. Google Play services ships every six weeks. And 90% of our users are on the latest version of Google Play services across all versions of Android. [APPLAUSE] In fact, by shipping every six weeks, we in many ways can iterate faster than typical OS release cycles. While it's an open platform, and we want to innovate fast, we want to make sure it's very, very secure as well. So we take security very seriously. Let's take malware protection as an example. In Google Play, we automatically scan every single application for malware. And if users opt in, we even scan applications from outside of Google Play to make sure they are malware free. Given the popularity of Android, there's a whole vested industry around security, given there's a lot at stake. But based on all the data we see, well, well less than half a percent of users ever run into any malware issues. And increasingly, we are pushing security updates through Google Play. Any security updates related to Google server communications, we are now pushing through Google Play services so that we can get them to users within six weeks. With L, we are also launching factory reset protection, so that if your phone gets stolen, you have full control to disable it. [APPLAUSE] Finally, privacy is an important part of security. So with the L release, for the first time we have a centralized setting, what we call universal data controls, where users can go and manage their important privacy protections. They can control data that is shared from the device, like location history, et cetera. And so we are doing that in L as well. [APPLAUSE] So far, we have been talking about the L release in the context of mobile phones and tablets. But users increasingly are living in a multi-screen world. You are using other connected devices, like the television in your living room. You're increasingly wearing things on your body.
When you get into your car, you expect a connected experience. We want to work to create a seamless experience across all these connected devices. So with L, as well as with Chrome, we started laying down some foundational principles on how we evolve our platforms to support these new connected experiences. So here are a few principles. We are making everything contextually aware. We want to understand whether you're home with your kids and want to be entertained, or you're at work trying to be productive. Or maybe you're traveling. We want to bring the right information to you at the right time. We want the experience to be voice enabled. We are building the most advanced voice recognition infrastructure in the world, and we want to help users interact with computing devices in an intuitive way. For example, when they're driving or cooking, we want voice to be a major source of input. We want the experience to be seamless. It shouldn't matter which device you were using before. We want to pick up where you left off. And finally, users always have their smartphone. So we want to make sure all these connected experiences work based on your smartphone, be it your wearables, be it your car, or, as we have shown with Chromecast, your television. So, both with the L release and Chrome, we are bringing a whole set of new experiences to the many connected screens around you. The first area we are going to talk to you about is wearables. About three months ago, we launched our preview of Android Wear. We announced a developer SDK, and the reception has been very, very positive. To give you a further update, I'm going to invite David Singleton onto the stage. [APPLAUSE] DAVID SINGLETON: We're right at the beginning of a new phase in the miniaturization of technology, which means that it's finally possible to make a powerful computer small enough to wear comfortably on your body all day long. And there's a huge opportunity to bring rich user experiences to these devices.
And that's why we're building Android Wear as our platform for wearables based on Android. Android Wear makes it easy for developers to reach users on this new form factor using precisely the same tools we're already familiar with on Android phones and tablets. People will be wearing these small, powerful devices, so style is important. And that's why Android Wear supports both square and circular screens. And we think that there will be a wide variety of fashionable designs. Sensors will help these devices understand your context. So they can provide useful information when you need it and help you reach your fitness goals. And as the device that you always have with you, your watch will also provide intelligent answers to spoken questions and, as Dave showed us earlier, act as your key in a multi-screen world. Across the world, people check their Android phones an average of 125 times every day. And that's why we've designed Android Wear to quickly show you relevant information and make sure you never miss an important message, while letting you stay engaged with the people that you're actually with. We do this by working to understand the context of what you care about, while enabling very brief interactions with the device. Here's a live demo on the LG G Watch. You can see that it has an always-on display that, at any given time, shows you the most important thing for you. So Jeff, it looks like your flight to Brazil for the World Cup is on time. I guess you do deserve a break after this big demo. And if Jeff wanted to see more, he can simply raise his watch or tap the screen to switch into the vibrant, full color that you're seeing here. Throughout the day, if Jeff receives a notification which buzzes his phone, his watch will vibrate on his wrist and show him what's up at a glance. So he won't miss an important message like this one.
Swiping up and down navigates you through this stream of cards, which includes information from Google Now, apps running on Jeff's phone, and apps running directly on the wearable itself. And when there's a page indicator, Jeff can swipe horizontally to see more details. You can see that we've applied material design here. The cards float above beautiful, textured backgrounds. And just like in your phone's notification shade, you can swipe a card away to remove it from your stream. Let's take a look at Jeff's phone. And that notification has disappeared. Back at the watch face, pressing and holding lets you choose a different one. You can see that there's a broad selection of analog and digital designs in a variety of styles to suit your tastes. OK, now that we're acquainted with the overall UI model, let's see how Android Wear can work for you. Imagine that Jeff has just gotten up in the morning. He swipes up and sees the weather forecast for the day. His commute's not looking too bad. And oh look, Jeff, I guess you need that package for your trip to Brazil. You'd better not forget to pick it up. JEFF: OK, Google. Remind me to check my mailbox when I get home. DAVID SINGLETON: Now, if we can see Jeff's phone at the same time, you'll see that this is immediately synced across. [APPLAUSE] DAVID SINGLETON: And in this case, his watch was smart enough to know where home is. A little later on, as Jeff is arriving at the office, his watch vibrates again with a chat message from one of the team. He can see who it's from and what he's saying, all without having to fumble around and get out his phone. And watch his phone stay in sync. When you swipe away a notification on the watch, it disappears from the phone, as Jeff is showing now. [APPLAUSE] It's super convenient. In the evening, Jeff is having dinner with a friend at a restaurant. If he's unfamiliar with one of the ingredients on the menu, he can just say-- JEFF: What is limburger?
DAVID SINGLETON: So it looks like limburger is a type of cheese. Jeff is lactose intolerant, so he'd better order something different, or this dinner could go wrong. And when Jeff receives a phone call, his watch will vibrate, and he can see who's calling at a glance. It's another one of Jeff's co-workers. Now, Jeff could get out his phone to answer, but since he's busy, he can either swipe to reject the call from his wrist or swipe up to choose from one of these quick SMS replies. His phone sends the SMS, and he's done. [APPLAUSE] Sometimes you're enjoying dinner so much that you want to avoid any more interruptions. And for that, you can set Do Not Disturb with a single downward swipe from the top of the screen. [APPLAUSE] And now Jeff's watch won't buzz again until he wants it to. Later that night, Jeff arrives home. Oh, that's right. Your package is here. Now that he's at home, the reminder that Jeff created this morning has triggered. You can also use Android Wear to control other devices around you. Let's loosen up with a bit of music. JEFF: Play some music. DAVID SINGLETON: Now you'll see that Jeff has music controls on his watch. [MUSIC - CHROMEO, "JEALOUS (I AIN'T WITH IT)"] He can see what song is playing. He can pause the music or skip to the next track. And while it's playing, the album art is beautifully displayed right there on his wrist. Finally, at the end of the day, it's time for bed. JEFF: Set an alarm for 7:00 AM. DAVID SINGLETON: With glanceable notifications and quick voice actions, Android Wear gives you what you need right when you need it. Let's take a closer look at some of the contextual information that Android Wear provides when you're traveling. So Jeff's about to leave on that big trip to the World Cup. It's the morning of his flight. So his phone is already displaying relevant information for his trip. He can see his flight status and even show his boarding pass. His hotel address will be there when he needs it.
And he knows whether or not he'll need to pack an umbrella. It does look like it's going to rain in Brazil on Friday. Once he's in Brazil, Android Wear continues to give him useful, timely information at a glance, whether it's his restaurant reservation, the time back at home so he knows when to call his family, or the local bus schedule. And while he's walking around the city, Jeff can see how many steps he's taken today, along with a step count history for the week. On devices that support it, he can even check his heart rate after a jog. [APPLAUSE] So we've shown you what Android Wear can do out of the box. We're even more excited to see what developers build on top of this platform. The Android Wear SDK enables you to build glanceable, contextual apps for this new category of device. Let's talk through the capabilities it gives to developers. And then we'll show some examples. Right off the bat, Android Wear automatically bridges notifications from your Android phone or tablet directly to your watch. Now, Android's notification APIs already allow you to build beautiful user interfaces with big pictures, actions, and more. And there are hundreds of thousands of apps delivering billions of these notifications every day. And now, they're available on your wrist. Back in March, we released a developer preview, enabling apps running on the phone to add things like voice replies, have several pages, and group notifications in bundles. With these additions, you can begin to provide a tailored experience for wearables. And we've used these features to add Wear support to Google apps, like Hangouts and Gmail. And there's been a huge response from developers. The very best wearable apps respond to the user's context, put glanceable cards in the stream, and allow the user to take direct action in just a few seconds. Here's one of my favorite examples. With Pinterest, you can follow other people's pins.
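The wearable notification features mentioned here (voice replies, pages) map onto the support library's notification classes. A minimal sketch, assuming the app already has a reply PendingIntent and icon resources; the "extra_reply" key, IDs, and message text are illustrative:

```java
import android.app.Notification;
import android.app.PendingIntent;
import android.content.Context;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;
import android.support.v4.app.RemoteInput;

public class WearNotificationSketch {
    // Posts a notification that bridges to the watch with a voice-reply
    // action and an extra page of detail.
    public static void notifyWithVoiceReply(Context context,
                                            PendingIntent replyPendingIntent,
                                            int smallIconRes, int replyIconRes) {
        // Voice input collected on the watch is delivered under this key.
        RemoteInput remoteInput = new RemoteInput.Builder("extra_reply")
                .setLabel("Reply")
                .build();

        NotificationCompat.Action replyAction =
                new NotificationCompat.Action.Builder(
                        replyIconRes, "Reply", replyPendingIntent)
                .addRemoteInput(remoteInput)
                .build();

        // An additional page shown only on the wearable.
        Notification secondPage = new NotificationCompat.Builder(context)
                .setContentTitle("Details")
                .setContentText("More context shown on a second page.")
                .build();

        Notification notification = new NotificationCompat.Builder(context)
                .setSmallIcon(smallIconRes)
                .setContentTitle("New message")
                .setContentText("Are we there yet?")
                .extend(new NotificationCompat.WearableExtender()
                        .addAction(replyAction)
                        .addPage(secondPage))
                .build();

        NotificationManagerCompat.from(context).notify(1, notification);
    }
}
```

On a phone without a paired watch, the same notification simply appears without the wearable-only action and page.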
The Pinterest app will let you know when you're near a place that's been pinned by someone you follow. So Jeff's friend Susie loves Korean barbecue. And she's somewhat of an authority on the best restaurants in San Francisco. So when Jeff is in the city, Pinterest can notify him that he's near one of Susie's pinned restaurants. The notification will appear on his wrist just like this. And it uses pages, allowing him to quickly glance at the details, then swipe to see a map. And if he likes it, he can start navigation right from his wrist. This is using Google Maps for mobile, which gives you turn-by-turn directions on your watch. It's particularly useful when you're walking. And it works with all Android Wear devices. In addition to what's possible with notifications bridged from the phone, today we're making a full Android Wear SDK available which enables you to write code-- It's pretty great. It enables you to write code that runs directly on the wearable itself. And almost all the APIs that you're already familiar with on Android are available here. That means that you can present fully customized UI, read sensors directly, and much, much more. We're also introducing a new set of APIs in Google Play services that makes it easy for your app to send data between a phone or tablet and a wearable. And we've road-tested these APIs with some developers over the past few weeks. Let's take a look at examples of what they built. Eat24 is an app that makes food ordering both fun and easy. Now watch this. Hopefully I'm going to order a pizza in 20 seconds. When it comes to takeout, I'm a creature of habit. And Eat24 has recognized this and takes advantage of that contextual stream. Around the same time I made an order last week, it puts up a notification suggesting I order again. I can tap on the notification and launch into their full-screen UI. And here I'm presented with a beautiful interface that lets me confirm the kind of food I'd like today-- let's stick with pizza.
And then I can quickly swipe to see and repeat my last order. Just one more tap to pay, and the pizza's on its way. I think that clocked in under 20 seconds. Now you might be wondering how this got to my watch. Well, all I had to do was install the Eat24 app from the Play Store on my phone. When a watch is connected, the wearable portion of the app is automatically installed and kept up to date on that watch. I mentioned the new wearable APIs for easy communication between phone and watch. All the Cooks is a social recipes app which has made really great use of these APIs. I don't know about you, but I find it really hard to follow recipes, especially when it gets to those tricky bits where everything's happening at the same time. So wouldn't it be more convenient if I could just look down at my watch and see what to do next? With the All the Cooks app, I can choose a recipe. Let's go into my favorites and choose this beef brisket chili. The recipe will immediately appear on my watch. So it's always right there with me. Let's get started. I've got all the ingredients, so let's start following the steps. Now watch the phone carefully. As I move from step to step, the phone stays in sync too. And if you're wondering whether or not it's safe to wear your watch while cooking, it's great to know that all the devices we're talking about today are water resistant. And with All the Cooks, whenever a recipe calls for a timer, like this four hours in the oven, I can do that right away on my wrist. So no more burnt dinner. We saw some great examples of voice actions earlier today. And we believe voice actions will be most useful when they can invoke the best service in any app. We're just getting started with this, but we're making voice available for some key actions on the wearable today, and we'll be adding more over the coming months. Lyft is a transportation service and ride-sharing app that allows you to request a car to pick you up at your exact location.
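The phone-to-watch sync described for the recipe steps can be sketched with the Wearable Data API in Google Play services: the phone writes a data item, and a listener on the watch picks up the change. The "/recipe/step" path and "step" key below are hypothetical names for illustration:

```java
import android.content.Context;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.PutDataMapRequest;
import com.google.android.gms.wearable.PutDataRequest;
import com.google.android.gms.wearable.Wearable;

public class RecipeSync {
    // Builds a Play services client with the Wearable API attached.
    // The caller is expected to connect it before publishing.
    public static GoogleApiClient buildClient(Context context) {
        return new GoogleApiClient.Builder(context)
                .addApi(Wearable.API)
                .build();
    }

    // Publishes the current recipe step; any connected wearable node
    // listening on this path sees the update and refreshes its UI.
    public static void publishStep(GoogleApiClient client, int step) {
        PutDataMapRequest mapRequest = PutDataMapRequest.create("/recipe/step");
        mapRequest.getDataMap().putInt("step", step);
        PutDataRequest request = mapRequest.asPutDataRequest();
        Wearable.DataApi.putDataItem(client, request);
    }
}
```

The watch side would register a DataApi listener on the same path, which is how both screens stay on the same step without app-specific networking code.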
Lyft has implemented our "call a car" intent. So it's really easy to just walk outside and say, "OK, Google, call me a car." You'll see that Lyft is able to determine Jeff's exact location from his phone and presents this confirmation screen so he can verify his address. The app has also made great use of notifications in the stream. You can see when your car has arrived, keep up to date throughout the journey, and even rate your driver right from your wrist when you're at your destination. Thanks to all our developers. Now, we showed a preview of a couple of watches we were working on with our partners back in March. The LG G Watch will be available to order later today on the Play Store. In addition, you might have caught a glimpse of a new device during the demos. We're very happy that Samsung is joining the Android Wear family with the Samsung Gear Live. And the Samsung Gear Live is also available to order later today. The Moto 360 is the first watch to adopt the round Android Wear UI. And it will be available for sale later this summer. Those are just the first three watches. There are many more on the way, and we're thrilled to enable developers across the world to build apps for what we believe will be a revolutionary new form factor. And now, I'd like to invite Patrick Brady on stage to tell you about how we're bringing Android to the car. PATRICK BRADY: Thank you, David. Thank you. Isn't that great? Android Wear creates a seamless experience by connecting your Android smartphone to a wearable device. And the result is truly amazing. Wouldn't it be great if all of your devices were this connected? For many of us, cars are an integral and essential part of life. They bring us to the grocery store, and take us on weekend trips. They bring us to work, and take us home. In fact, in the United States, the average commuter spends over one hour in the car every day. In many ways, our cars keep us connected to the physical world around us.
But they remain disconnected from our other devices and our digital lives. So what have drivers done to bridge this divide? Well, even though it's unsafe, and in many cases illegal, people use their phones while driving. And reports show that 25% of accidents in the US are caused by people fumbling with gadgets behind the wheel. There's got to be a better way. So back in January, we announced the Open Automotive Alliance to address this problem and make the connected car a reality. We'd like to show you what we've all been working on. And today, we're happy to announce Android Auto. We've redesigned the Android platform for automotive, making it easier and safer to use the connected apps and services drivers want in the car. We looked at what people do with their phones in the car today. And these things stood out to us: navigation, communication, music, and other forms of streaming media. Android Auto puts these front and center. So you don't have to go hunting through a grid of icons to find the apps that are most important to you when you're in the car. Android Auto is contextually aware, to give you the right information right when you need it. And most importantly, Android Auto is completely voice enabled, so that you can keep your hands on the wheel and your eyes on the road. You know, we really wanted to drive a car up here on stage and show you this live in action. But apparently there are these regulations and logistics that make driving a vehicle in a room packed with 6,000 people a very hard thing to do. So we set one of our engineers on the problem. And apparently, this is what happens when engineers have access to a blowtorch. So we're down one test car, but we have a great demo cockpit to show you. And now I'm happy to introduce Andy Brenner, our product manager, who will literally drive this demo. So to start, Andy connects his Android phone to the car. And the phone casts the Android Auto experience to the car's screen.
Andy can now put his phone down and use the familiar car controls-- steering wheel buttons, console dials, and touch screens-- to control Android Auto. It looks and feels like it's part of the car. But all of the apps we see here are running on Andy's phone. Which means that the experience gets better when Andy updates his apps or gets a newer, faster phone. This also means that Andy has a personalized experience that he can bring with him into any compatible car. The first thing Andy sees is the overview screen, which shows personal and contextually relevant destinations, reminders, contacts, and music from Google Now and other apps. One tap and he's navigating, or listening to his favorite road trip mix. Andy, why don't you play us some music? [MUSIC - THE MOWGLI'S, "SAN FRANCISCO"] Let's look for a second at Play Music. It has been adapted to have simple, glanceable controls for the car. Andy has access to all of his curated playlists, radio stations, albums, and artists. And to all the key features in Google Play Music. He can also use voice or the steering wheel controls to control the music in the car, keeping his hands on the wheel. Fantastic. Of course, Android Auto needs great maps and navigation. So let's show you Google Maps. We all love Google Maps because it's fast, accurate, updated, and it seems to know where everything is. In Android Auto, drivers have access to all their favorite Maps features: great local search, personalized suggestions, live traffic, and of course, turn-by-turn navigation. And Google Maps for Android Auto is even more powerful, because it is completely voice enabled. Andy, why don't you take us for a drive? ANDY BRENNER: How late is the de Young Museum open today? GOOGLE: De Young Museum is open from 9:30 a.m. to 5:15 p.m. on Wednesday. ANDY BRENNER: Oh good, I can go there. Navigate there. GOOGLE: Navigating to de Young Museum. Head for 4th Street, northeast on Minna St. In 600 feet, use any lane to turn right onto 4th Street.
PATRICK BRADY: So Andy was able to start navigation without ever entering an address or taking his hands off the steering wheel. During navigation, instructions are spoken, as you heard, and displayed on the screen in a material card that floats above the map. Great. So that's music and navigation. What's next? Let's show you voice-enabled messaging. GOOGLE: New message from Hiroshi Lockheimer. Here it is. Andy, are we there yet? PATRICK BRADY: As you can see, incoming messages show up as heads-up notifications. So Andy can still see the upcoming turn in Maps. When he's ready, he can just use the steering wheel voice button to reply. ANDY BRENNER: Reply. GOOGLE: What's the message? ANDY BRENNER: I have no wheels. GOOGLE: Here's your message to Hiroshi Lockheimer: I have no wheels. Do you want to send it? ANDY BRENNER: Sure. GOOGLE: Sending message. PATRICK BRADY: So we're really excited to bring these great experiences into the car. But we also want you, our developers, to come along for the ride. We know it's not easy to build apps for cars today. There are dozens of different car platforms, input controls, and user interfaces. There is no central way to distribute your app or keep it updated. Wouldn't it be great if building an app for the car was just like building an app for your smartphone or tablet? Well, we have good news for you. The road ahead is brighter, and today we're announcing the Android Auto SDK. So that you-- We thought you'd like that. So that you can just focus on making great apps for the car. We're starting with a full set of APIs for audio and messaging applications. First, let's talk about audio. We worked with a great set of developers on a prerelease version of the Android Auto SDK to develop some great audio streaming apps that let you listen to music, internet radio, news, sports, and podcasts on the go. You can try these apps out live in our demo cars right outside. Next, let's talk about messaging apps.
Andy showed us earlier how he can send text messages using Android Auto, completely with his voice. Well, we're opening this up to your messaging apps. So using these APIs, your apps can notify users of incoming messages and allow them to respond using voice. And this is the same API we're using for notifications and remote reply on Android Wear. With just a few lines of code, you can let users know on their wrist or in their car. It's really, really powerful. So we're really excited about Android Auto, and we think we've found that better way. But I know what you're all thinking: when does the rubber actually meet the road? Well, we're happy to say that you won't have to wait long. The Android Auto SDK will be published soon. And the Android Auto experience will be available to users with the public L release later this year. And the excitement in the auto industry has really been growing. Today, we're happy to announce that over 40 new partners have joined the Open Automotive Alliance. Over 25 car brands have signed up to ship Android Auto in the near future. What's more, the first cars with Android Auto will be rolling off dealer lots before the end of this year. So that's just a peek at Android Auto, an Android experience that's been redesigned for the car, with all the apps drivers know and love, through an interface that's built for driving. Now I'd like to welcome Dave Burke back on stage to tell us about Android in the living room. DAVE BURKE: Thanks, Patrick. It's pretty cool to see what you guys are doing with Auto, but some of us don't actually have a car in our living room, wheels or not. So I'm going to talk about a different form factor, and that's TV. So TVs are fast becoming smarter, more connected. And really, they're becoming computing devices in their own right. So we see a great opportunity to bring some of the strong capabilities of Android, such as voice input, user experience, and content, to the largest screen in your house.
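The shared messaging API mentioned here can be sketched with the support library's CarExtender: the same RemoteInput mechanism used for Wear voice replies is attached to an unread conversation that the car UI can read aloud and reply to. The sender name, keys, and PendingIntents below are placeholders for illustration:

```java
import android.app.PendingIntent;
import android.content.Context;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;
import android.support.v4.app.RemoteInput;

public class CarMessageSketch {
    // Posts an incoming-message notification that Android Auto can surface
    // as a heads-up card, read aloud, and answer by voice.
    public static void notifyConversation(Context context,
                                          PendingIntent readIntent,
                                          PendingIntent replyIntent,
                                          int smallIconRes) {
        // Voice transcription of the driver's reply arrives under this key.
        RemoteInput voiceReply = new RemoteInput.Builder("extra_voice_reply")
                .setLabel("Reply by voice")
                .build();

        NotificationCompat.CarExtender.UnreadConversation conversation =
                new NotificationCompat.CarExtender.UnreadConversation.Builder("Hiroshi")
                        .addMessage("Andy, are we there yet?")
                        .setLatestTimestamp(System.currentTimeMillis())
                        .setReadPendingIntent(readIntent)
                        .setReplyAction(replyIntent, voiceReply)
                        .build();

        NotificationCompat.Builder builder = new NotificationCompat.Builder(context)
                .setSmallIcon(smallIconRes)
                .setContentTitle("Hiroshi")
                .setContentText("Andy, are we there yet?")
                .extend(new NotificationCompat.CarExtender()
                        .setUnreadConversation(conversation));

        NotificationManagerCompat.from(context).notify(2, builder.build());
    }
}
```

Because it rides on the standard notification pipeline, the same posting code also serves the phone and, via the wearable bridge, the watch.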
Now, in some ways, the TV space is not too dissimilar to the mobile space in 2006. Each TV manufacturer has a different OS with different APIs and programming models, often with limited developer tools. And the cost and friction to develop a service to run across multiple TVs is too expensive. As a result, smart TVs are typically limited and not competitive with their mobile cousins. So we wanted to go and change that. Today, we're announcing Android TV. So, this isn't a new platform. That's kind of the point. We're simply giving TV the same level of attention as phones and tablets have traditionally enjoyed. We want you to be able to leverage your existing skills and investment in Android, and extend them to TV. There's now one Android SDK for all form factors. Now, remotes are a core part of the TV experience. And Android TV requires just a D-pad with voice input. And that can be made available as a hardware remote control, as a game controller, or even a virtual controller on a phone or tablet. So today, I'm going to use the Android TV app on my phone. And the best way to understand Android TV is to just see it in action. So what you're about to see is hot off the press, and really just an early look at the TV support that we're adding to the L developer preview. OK, so let's start with the most integral part of television: live TV. So in L, we've added what we call the TV input framework to Android. It enables Android-based TVs to handle video from sources such as HDMI, TV tuners, and IPTV receivers. And the UI provides a unified view of your channels in a familiar channel-hopping UI with the channel information on the top. Now, if you want to do something different, just like every other Android device, you press home. And you'll notice that home overlays on top of the content. So I can keep watching while I browse. But unlike phones and tablets, where the behavior is more task based, we designed home to be a super simple, lean-back experience.
Because TVs, unlike computers or mobile devices, are primarily entertainment interfaces. Users don't expect or want complexity from their TV. So the launcher presents you with a set of content recommendations at the top, floating over the UI using that familiar material theme. As I scroll down, you get immediate access to your applications, ordered by how often you use them. Scroll down again, and you get access to your games, also ordered by usage. Now, if I scroll back up to the content recommendations, you can see this is a quick way for me to watch content. The recommendation system is completely open. Any app can publish to it, and it's ranked according to your usage patterns. So for example, I'm currently binge-watching Game of Thrones. I am, actually. And I'll automatically be presented with a recommendation for the next episode, like so. So that's home. Let's talk about search. Today, people regularly take out their phone to search for something to watch. With Android TV, we decided to build core search functionality directly into the experience, powered by voice. So for example, I could just simply say, "Breaking Bad." Google will interpret the query and get me a result for the popular TV show. Now, with one click, I can then watch it in Google Play Movies and TV, or any other service that I have installed. If I scroll down, you'll see information on cast members. Scroll down again, I get related search terms, and also YouTube clips at the bottom. Now, I can also pivot on cast members. So for example, I can click on Anna Gunn, I'll get that nice material transition, and I'll get information on the actress. Scroll down, I can get movies and TV shows that she's starred in. Even YouTube clips of interviews with the actress. Now, the power of Google comes into its own for more abstract queries. So for example, I can just say, "Oscar-nominated movies from 2002." Google will interpret that query and of course get me all my Oscar-nominated movies.
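Publishing into that open recommendations row can be sketched as posting a notification tagged with the recommendation category, which the TV launcher ranks and displays. The show title, bitmap, and intent below are placeholders:

```java
import android.app.NotificationManager;
import android.app.PendingIntent;
import android.content.Context;
import android.graphics.Bitmap;
import android.support.v4.app.NotificationCompat;

public class TvRecommendationSketch {
    // Posts a "watch next" card into the Android TV recommendations row.
    // The launcher picks up notifications in CATEGORY_RECOMMENDATION and
    // ranks them against the user's usage patterns.
    public static void recommendNextEpisode(Context context,
                                            Bitmap poster,
                                            PendingIntent watchIntent,
                                            int smallIconRes) {
        NotificationCompat.Builder builder = new NotificationCompat.Builder(context)
                .setCategory(NotificationCompat.CATEGORY_RECOMMENDATION)
                .setContentTitle("Game of Thrones")
                .setContentText("Next episode")
                .setLargeIcon(poster)           // the card's artwork
                .setSmallIcon(smallIconRes)
                .setContentIntent(watchIntent)  // fired when the card is clicked
                .setLocalOnly(true);            // keep it off paired wearables

        NotificationManager manager = (NotificationManager)
                context.getSystemService(Context.NOTIFICATION_SERVICE);
        manager.notify("next_episode", 0, builder.build());
    }
}
```

An app would typically refresh these cards from a background job, so the row stays current without the app being open.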
And so, from here, of course, with one click, I can watch it. Now, search, of course, is backed by Google's Knowledge Graph. So I could also ask it a question. So for example, I could say something like, who played Katniss in The Hunger Games? And of course, the answer is, Jennifer Lawrence played Katniss Everdeen in The Hunger Games. And again, I get movies and TV shows she's involved in, other related queries, and YouTube clips. So that's Google search tailored for your TV. Let's now take a look at some of the applications. So let me launch Google Play Movies and TV first. Now, developing for TV means creating a 10-foot user experience, so called because that's typically how far you're standing, or sitting, from the screen. In L, we've expanded the platform to support a lean-back experience with new framework classes that help you quickly and easily build fluid, cinematic 10-foot user experiences. Our Play Movies team was able to take their existing tablet app from Android and quickly add the lean-back classes on top of it to produce a great TV experience. They now have the same APK for TV, phones, and tablets. So what you see here is the browse fragment, part of the lean-back classes. You've got nice slick animations, you've got those bold material colors. If I then dig into a TV show, I can get more information. This is showing our details view. By the way, if you don't have your remote handy, you can always use your Android Wear watch as a D-pad. So for example, let's try this out. We've created a little wearable app. So from here, I can actually interact with the TV. So I can go back. Let's go up and watch Now You See Me, I think that'll be a good show. So I click that, and then we can start watching it. So to wrap up, all of these lean-back building blocks are ready for you to reuse and customize in your own applications as you see fit. OK. So next, let's talk about games.
People who have traditionally not thought of themselves as gamers frequently download and play games on their phones and tablets. In fact, three out of four Android users are playing games, which has helped make the Play Store one of the biggest catalogs of games in the world. With Android TV, we enable you to take your games to the biggest screen in your house. And the games are getting really good. So for example, let me fire up one of my favorites, which is Leo's Fortune, with this gamepad. And this is a really good, fun game. It's kind of typical of a modern Android game. It's got great gameplay, it's got great graphics, fast and fluid. I also like it because our lead UX designer's name is Leo. He's not quite as cuddly as this guy, but I can definitely see him in a handlebar moustache. Anyway, another advantage of Android TV is you can tap into Google Play Games to share achievements and leaderboards with friends. You can even play multiplayer games, with friends playing from any device. So for example, I can launch NBA Jam, which is a really great game, and play a multiplayer game with my friend Alan here on the sofa. So let's try it out. He's playing on his tablet. I've got my game controller. Let's try this. So we have a bet on. The first person to score wins. OK, first one to two. OK, let's go. So the loser has to buy the other person a beer tonight. Let's go. Oh. Hopefully I can actually score. I'm from Ireland, so I don't really understand this game. Let's try this. Oh. OK. One all. Did you get two? OK, last one, last one. Bear with us. I want free beer tonight. Yeah. All right. It's a good thing there's a free bar at the after party, Alan. OK. I hope. So that's games. Now, sometimes you just want to cast, or send, content such as movies or music from your phone. So Android TV includes full Google Cast support. So you can use it just like a Chromecast.
In fact, Android TV enables us to bring Google Cast to more TVs. So for example, imagine Alan's visiting my house and he wants to share his new favorite jam with me. He can just fire up Play Music on his phone or tablet, and cast it directly to my TV. And hopefully his music will appear. I'm always really nervous about what he's actually going to play. I'm not sure I even want to see this. OK, that's pretty innocuous. That's cool. All right, works great. So that's Cast. Now, to distribute your applications, we've designed a very TV-centric Play Store experience. I'll just fire it up here. The L developer preview comes with a sneak peek of TV apps and games to showcase the platform. And you'll recognize big names like Netflix and TED, as well as some great casual and multiplayer games. The store will open officially in the fall with the launch of L, packed with some of the best content available today for Android, tailored, of course, for TV. So that's a quick overview of Android TV. Android TV is ideal for multiple device types. So everything from built-in televisions, to set-top boxes, to streaming boxes, and gaming consoles. We're working with silicon vendors across the industry, including everyone from Marvell to Intel and more. Today, I'm super excited to announce that the entire 2015 HD and 4K smart TV ranges from Sony, and the 2015 ranges of Sharp and TP Vision's Philips TVs, will run on Android TV. We're also seeing activity in the pay-TV space, with Bouygues, SFR, and LG U+ adopting Android TV. And we'll also be seeing streaming boxes powered by Razer, Asus, and others launching this fall. And the Unreal Engine demo that I showed you earlier, that was running on Android TV, with Nvidia's Tegra K1 reference design, capable of console-style gaming. And we expect to see TV products based on that hardware in the fall, too.
Finally, to bootstrap the ecosystem, today we're making a development kit we call ADT-1 available through a sign-up page to application developers like yourselves, so you can start developing for TV today. In fact, all the demos we saw were running on ADT-1. To learn more about Android TV, you can visit developer.android.com/tv. As I mentioned earlier, Google Cast is a core part of the Android TV experience. And as Cast gets better, so does Android TV. So to learn more about some of the exciting new developments in Google Cast and Chromecast, let me hand it over to Rishi. Thank you. RISHI CHANDRA: Thanks, Dave. So I want to give an update on Chromecast. We launched Chromecast last July to deliver a new TV experience that was both simple and powerful, by using devices you already know and love: your phone, your tablet, and your laptop. And for the first time, you had an experience that just worked. All it took was the simple press of a button from your favorite app. Now, we've been really happy about the positive reaction we've gotten from both press and consumers. We've sold millions of devices, and consistently outsell all other streaming devices combined in major retail channels like Best Buy. And recently, we've been able to replicate that success in 18 countries. Today, Chromecast is a top-selling electronics device on Amazon in the US, UK, France, Japan, and Canada. Thank you. [APPLAUSE] And as sales have ramped up, usage per device has increased 40 percent. Already, YouTube sees more active engagement on Chromecast than on any other TV streaming product. The model is working. Now, Chromecast is just the beginning. We want to build an ecosystem of Google Cast ready apps and Google Cast ready devices. So that's why we're excited that Google Cast support is coming to Android TV as devices roll out later this year. So let's talk a little bit about developers. We launched Chromecast with five content apps.
And since then, we've brought on board many of the top content apps from around the world, including the BBC iPlayer, and the WatchESPN app, which came just in time for the World Cup to kill all productivity on the Chromecast team. Luckily, we have a Brazilian VP. Google Cast is designed to work with the most popular devices you find in a home today. In fact, in the last 30 days, almost 50 percent of Chromecasts were used by devices from multiple platforms. That's why in February, we launched the Google Cast SDK across Android, iOS, and Chrome. So now, any developer can take their existing mobile or web app and extend it right to the TV. In just a few months, we have over 6,000 registered developers, who are actively building over 10,000 Google Cast apps. So we're starting to see a lot of momentum. Now, as more and more apps come on board and integrate with the SDK, we want to make it even easier for consumers to find your Google Cast ready app. So today, we're announcing new discovery experiences on Android, iOS, and Chrome. Consumers can find these at chromecast.com/apps, or from the Chromecast app on your phone, tablet, or laptop. This is one of many improvements we're making to Google Cast to make it even easier to use. For example, one of my favorite features of Chromecast is that it works with my friends and family, because they can use their own devices. Well, today we're announcing a new feature to make that even easier, by allowing others to cast to your TV without needing to be on the same Wi-Fi network. So I've got AK here on stage with me. And let's say AK is over at my house, and he wants to use his phone to share, or cast, one of his favorite YouTube videos to my TV. Now normally, I'd need to share my Wi-Fi password with him. But sometimes passwords are long and complicated, or I don't remember them. In this case, I just don't trust him with the password. But I still want to see his great video.
Well, with this new feature, AK can still take out his phone and open up the YouTube app. And you'll see in the top right, he's only connected to the cellular network. But you still see the Cast button. Simply press the button, and it allows him to connect to nearby devices. Now in a few seconds, his phone will connect to the Chromecast through the cloud. And that's it. So now, he can control the video, pause the video, play the video, just as if he were on my Wi-Fi network. So how do we do it? We're using a variety of different technologies, which allow us to authenticate users in the same room as a Chromecast. And if for whatever reason we can't automatically detect or authenticate you, we'll ask for a PIN that will always be present on the screen. Oh, check this part out, this is my favorite part of the video. It's pretty cool. So now, you can invite all your friends over to your house, kick back, and let them cast to your TV without any friction or hassle. It makes Chromecast an even more social experience. Now, this is an opt-in feature, so you always have control over who can cast to your TV. We'll be rolling this out to all Android users later this year. And developers will get this for free through the Google Cast SDK. We also spent a lot of time thinking about new use cases for the television. For example, today lots of people talk about the five hours per day people watch TV. What about the 19 hours per day your TV is just a blank, empty screen? With Chromecast, we want to use this large, beautiful canvas. So we started with something really simple: a feed of scenic images that's been gradually expanding over time. Today, we're announcing the new Google Cast ambient experience. We call it Backdrop. It just takes a minute to set up, and you can do it from any iOS or Android device. In this case, we're going to use an iPhone 5 for the demo. And we have the feed accelerated just for demo purposes, so you can see more images.
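The PIN fallback described here, where the TV shows a short code when a nearby guest can't be authenticated automatically, might look roughly like the sketch below. This is a hypothetical model; the function names and the 4-digit format are assumptions for illustration, not the actual Google Cast guest-mode protocol.

```python
import hmac
import secrets

def new_session_pin():
    """Generate the short code the TV would display on screen.

    Always four digits, zero-padded, drawn from a secure RNG.
    """
    return f"{secrets.randbelow(10000):04d}"

def pin_matches(displayed, entered):
    # Constant-time comparison, so timing differences don't leak digits.
    return hmac.compare_digest(displayed, entered)

pin = new_session_pin()
print(len(pin))  # always 4 characters
```

A real receiver would also expire the PIN and rate-limit guesses; this sketch only shows the generate-and-verify core.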
So let's go in and open up the Chromecast app. And you'll see Backdrop is a new option in the app drawer. And from here, you can personalize the feed to match your own interests and tastes. One of the top feature requests we've gotten is the ability to add your own personal photos to the ambient feed. Well now, you can. So if we turn on photos, you can select one or multiple Google+ photo albums. And in a few seconds, you'll start seeing my personal photos show up right on the TV screen. It's that simple. Your TV is now the largest picture frame in the house. Grandparents everywhere are going to love this feature. We also have a lot more topics. We want to give you an infinite source of great and beautiful images. One of my favorites is one called Places, which brings geospatial imagery from Google Maps to your TV. These satellite photos give you a totally different perspective on how you can see the world. It's really amazing. We also have other topics like news, lifestyle, and Google+ featured photos. One of my other favorite topics is art, which includes artwork from famous museums and collections around the world, including the Getty Museum and the National Museum of Women in the Arts. And now, my TV is a beautiful set of paintings. Every topic is curated to make sure we're showing high-quality and safe images. And every user has the ability to control which topics show up on the TV. Now, generally I'm going to leave this in the background of my house. But let's say one of the images catches my eye, and I want to learn a little more about it. I can always use Google Voice Search to learn more. MALE VOICE: What's on my Chromecast? RISHI CHANDRA: For every topic, we'll show a synchronized card with relevant information and relevant actions. So in this case, I can learn more about the artist of the painting. Backdrop brings your TV to life even when you're not actively using it.
And we're looking forward to working with third-party developers to bring their own topics into the feed. Backdrop will roll out to all Chromecast users later this summer. So finally, one of the big advantages of the Google Cast model is that we can start thinking beyond traditional use cases of the TV, like video. For example, your phone and your tablet have unlimited possibilities. And there are many times we want to take those experiences and extend them to the TV. Well, to help accelerate that shift, we're launching a new Google Cast feature that allows you to mirror any Android device to your television. So to start mirroring, just go to the Chromecast app on your Android device and select Cast Screen. And from here, everything on your phone will just show up on the TV. It's that simple. No cables or wires needed. Now, we built our own protocol to reduce latency and frame drops, so the experience feels really natural and smooth. And now you can share anything on the big screen. So let me give one example, a real-life example I had a few months ago using Google Earth. I was planning a family trip to Maui later in the year, and I wanted to show my four-year-old daughter where Maui is on the map. So instead of huddling around my small phone, I ended up opening Google Earth and projecting it right to the TV. So let's type in Molokini, one of my favorite snorkeling destinations in Maui. And as you can see, it's a totally different experience to navigate the world with my daughter on the biggest screen in the house. You can play with this for hours. So we're going to do one more fun thing. We're actually going to open up the camera app. And we're going to mirror, in live action -- a little nervous -- all of you on the big screen. It's a lot of fun. So we're working with a large variety of devices from Samsung, Nexus, HTC, and LG, with many more devices coming soon.
We'll be rolling this out initially in beta as devices get updated to the latest version of Google Play services, which will happen over the next few weeks. As you can see, we're really excited to show how the Cast model can change how we think about entertainment in the home. And we look forward to innovating with all of you to make that a reality. Now I'll pass it back to Sundar. SUNDAR PICHAI: Thanks, Rishi. So we've talked so far about wearables, your car, and the television in your living room. Another important device in our lives is our laptop. Our journey here began with Chromebooks, and we are seeing tremendous momentum. We started the journey with one reference device, the Cr-48, which we launched about three years ago. A year later, with Samsung and Acer, we launched two devices in two countries. Fast forward to today: we have eight OEMs making 15 devices, with many more on the way, in 28 countries. Users really love the fundamental insight behind Chromebooks: speed, simplicity, and security. All of the top 10 highest-rated laptops on Amazon today are Chromebooks. We are seeing tremendous traction in education as well. Just this year alone, the number of Chromebooks sold to K-12 schools in the United States has grown by 6x. And we are investing a lot more in this area. So let me talk about how we're evolving the Chromebook experience. As I said earlier, users almost always have a smartphone with them, including when they're using a Chromebook. So we want to increasingly connect these experiences together, so that they have a seamless experience across their devices. Let's take a look at what they can do. Kan is going to help me with some demos. Dave already talked about how, with the L release, your phone can stay unlocked when it knows it is with you, based on the APIs we are building. Well, if we can unlock your phone, we can also unlock your Chromebook.
Every time you approach your Chromebook, and your phone is with you, we will automatically unlock your Chromebook and sign you into your favorite applications and services. So it works seamlessly. We've already added Google Now notifications, so the same Google Now cards which you see on your phone, you see on your Chromebook. And we are adding a few more things. Let's say you get an incoming phone call. You will start seeing those incoming call notifications on your Chromebook. If you get a text message, you will see those text messages on your Chromebook as well. And I recently had this experience: my phone was in my pocket and running out of battery. And my Chromebook popped up a notification and said, your phone is running out of battery. Simple, delightful experiences to connect your phone and your Chromebook. As we started working on this, one of the things that struck us is, wouldn't it be nice if you could get some of your favorite Android applications, which you love on your phone, on your Chromebook? This is a difficult challenge technically, so we've been working on this project for a while. Our goal is to bring your favorite Android applications in a thoughtful manner to Chromebooks. We want this to be intuitive for users. These applications were built for Android on the phone, so we want them to work with a mouse, keyboard, and touch events, et cetera. For developers, we want this to work with as few modifications as possible. So we are in the early days, and we are going to show you a preview today. And Kan is going to help me with it. So Kan is going to pull his tablet up. And what you're seeing on his tablet is Evernote. This is one of his favorite applications. And I think he is planning a birthday party there. Let's switch over to the Chromebook. And the same Android application is now available in the launcher on your Chromebook. So you can click that, and you get the exact same application.
And because you're on a Chromebook, you can start making changes to it. You can go to the web, copy-paste, everything just works. We've ported that Android application to run within your Chromebook. Let's give you one more example. We'll pull up a canonical application, an application you see on your phone, like Vine. We now have that application running on your Chromebook Pixel as well. Kan is going to browse through the World Cup channel. And you can see how the experience feels native and intuitive on a Chromebook. And because the app has access to some of the underlying device APIs, like the camera API, Kan can actually take a clip of himself, just like a Vine user, and post it straight from his Chromebook. The final application we want to show you is Flipboard. It's a beautiful, immersive experience. And the Android version of the app now comes alive on Chromebooks. You can see how beautiful and immersive this experience is. We are very, very excited about bringing users' favorite Android applications straight to their Chromebooks, so that they can get an even more connected experience. And we are working on bringing our experiences across both Android and Chrome together, so that it works in a delightful way for our users. So far we've talked about bringing our platforms, Android and Chrome, to all the connected devices in your life, so that we can create a great experience across all of them. But there's another important environment we want to talk to you about as well. Most of you spend a lot of time in your workplace. We've always had this funny insight at Google that it is the same person at work and at home. That's a picture of Matt, from his work badge. In his spare time, he's a Lonely Planet traveller, and he writes books for them. So you can see, people have very, very different contexts. Yet your computing experience is very fragmented.
The way it worked in the PC world is companies gave you a separate laptop for work, and you had your own personal laptop. The experience was disconnected. And that model starts breaking down when you start thinking about phones. No one wants to carry two phones around. So what we are doing, with the L release of Android, is adding a whole set of APIs to unify this experience. We are bringing both your experiences together, so that as a user you can have one experience, and both your personal applications and corporate applications can live on the same device. We are doing it thoughtfully, by providing underlying data separation. So all your personal data is isolated from your corporate stuff, and vice versa. We're providing full data isolation and security, which enterprises care about. As developers, there is no modification needed to your existing apps. All your apps will be available through Google Play, and companies can buy them in bulk and deploy them, so you can reach many more users. These are all available in L, but we are wrapping up many of these features as a separate app, so that they will also work on prior versions of Android. And finally, Samsung has done a lot of important work in this area with Knox. And we really want to thank Samsung: they are contributing all of their work in Knox to the Android platform, so that we have one consistent story for enterprises across Android. We are also working with all major OEMs, names you would recognize, on a certified Android for Work program launching in the fall, by which we can bring these devices to companies with guaranteed updates and full security. So we are very excited by this journey. As we bring Android for Work, one of the important use cases we care about is productivity: document collaboration. Which is why we've invested a lot in Google Docs and the whole suite of editors. We've always had great mobile apps for Google Docs and Sheets.
And today, we are also announcing Google Slides, so that you can create and share presentations straight from your mobile devices. One of the common use cases we run into in companies, when people use this, is that they run into Office files. It's a very common experience for all of us. And we want to make sure, as we bring Android for Work, that Office files work seamlessly. So we acquired Quickoffice, and we've been hard at work integrating Quickoffice into Google Docs. And today we are announcing native Office editing built into the Google Docs suite of editors. Let's take a look. Kan is going to show how this works. I'm sure you're all familiar with getting emails with a Word file attached. So if you get a Word file in your email, you can click on it. In the past, we would convert it into a Docs file, but we don't anymore. What you see is a native Word file, handled straight within Google Docs. And this works for sheets and presentations as well. Kan can make edits to the document. And most importantly, when he saves it, it saves back as a Word file, so that he can send it back to people. So it works seamlessly. You can always convert it to Google Docs, so that you take advantage of world-class collaboration features. One of the common feature requests we get is redlining. So we have done great work to bring a modern, collaborative approach to redlining. The feature is called suggested edits. Just like Google Docs always does, people can add their comments, and very easily you can review and accept changes. So with suggested edits, we have modern, collaborative features. And with built-in native Office support, we think our productivity suite is great for the workplace. All our productivity files live in Google Drive. We launched Google Drive about two years ago. And today, we are very excited to announce Google Drive has over 190 million 30-day active users. These are not registered users, these are 30-day active users.
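The suggested-edits flow, where a collaborator's change is held as a pending suggestion until the reviewer accepts or rejects it, can be pictured with a tiny model. This Python sketch is not how Google Docs implements redlining; the document structure and function names are invented for illustration.

```python
# Toy model of suggested edits: changes are recorded as pending
# suggestions and only applied to the text once a reviewer accepts them.

def suggest(doc, old, new):
    """Record a pending suggestion without changing the document text."""
    doc["pending"].append((old, new))

def review(doc, accept):
    """Accept (apply) or reject (discard) every pending suggestion."""
    for old, new in doc["pending"]:
        if accept:
            doc["text"] = doc["text"].replace(old, new, 1)
    doc["pending"] = []

doc = {"text": "The meeting is on Monday.", "pending": []}
suggest(doc, "Monday", "Tuesday")
print(doc["text"])   # still "The meeting is on Monday." while pending
review(doc, accept=True)
print(doc["text"])   # now "The meeting is on Tuesday."
```

The key property the sketch captures is that the original text is untouched until review, which is what distinguishes redlining from direct editing.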
And that number has grown over 85 percent in the last year alone. We are now bringing all of the Google Drive for Work functionality to companies as well. And what we are doing is encrypting the data, both in transit and at rest on our servers. We are providing enterprises full APIs: audit and analytics APIs, based on how their employees are using their data across the company. And finally, it is hassle free: unlimited storage for just $10 per user per month. So the combination of Android for Work, our Google Docs suite, and Google Drive for Work really offers a comprehensive suite for companies, and joins Google Apps and Chromebooks. And as companies are thinking about moving away from the traditional Windows architecture, we are seeing tremendous momentum. 67 of the top 100 startups have gone Google. 58% of Fortune 500 companies have gone Google. And 72 of the top 100 universities have gone Google. So we've talked about how we are bringing our platforms, Android and Chrome, across all your devices, not just for your personal life, but also for your workplace. We're going to switch gears now, and talk about how you all, as developers, can build success on top of our platforms. In fact, we hear amazing stories every day, from startups to students who, acting as developers, create amazing experiences on top of our platforms. We've put together a video so that you can take a look. [VIDEO PLAYBACK] -JackThreads is an e-commerce destination for men's fashion. I think everyone here is unique in so many ways. We have such a crazy hodgepodge of people here. -We understand so much about our guys' needs. The demand that they have for truly personalized communication. Almost 70% of the interactions that we have with our audience are mobile. More than 50% of our transactions are driven through mobile. We've leaned in really hard to build best-in-class mobile experiences, where every day you don't know what you're going to find.
We want to feel like the smallest big company in the world. -There's 1.2 billion people in the world learning a foreign language. The majority of these people are learning to get a job, and they are in low socioeconomic conditions. My views on education have always been influenced by where I come from, which is Guatemala. Guatemala is a very poor country. I wanted to come up with a way to teach languages that was entirely free. Today, Duolingo is the most popular way to learn languages in the world. We have 30 million students. There are more people learning a language on Duolingo than in the entire US public school system. 85% of the people use it through an app. Over the next 20 years, education is going to fundamentally shift. Smartphones are going to allow us to bring education to the people who don't have access to it. -We had seen Andres a couple times during our PE period, studying the track. And we didn't realize it took so much work to actually get to know the school. -They have to live through darkness every day. No one can really feel it; they take seeing for granted. -We were like, why not create something that will actually make a difference. -Hello Navi will use GPS and a location sensor, so that way you have a specific point for each of his classes. -Where do you want to go? -The coding, it took weeks just to get it down. -If you mess up one little tiny thing, it can mess up the whole app. -We made a difference today, not just to him, but to many others that will eventually use this app. -They care about me. That's why they made this app happen. I never got to hear the word "inspiration" in my whole entire life. It made me proud. [MUSIC PLAYING] [END VIDEO PLAYBACK] [APPLAUSE] SUNDAR PICHAI: I was incredibly moved when I saw that video the first time. And even more exciting, all the folks in the video, including those middle school students from Resaca School in Texas, are joining us in the audience today. [APPLAUSE] Thank you.
We know these examples are just the tip of the iceberg. And we care deeply about evolving our platform so that you can continue to create these amazing experiences. So we're going to talk about that next. The first is how you can build and scale your applications on top of Google Cloud Platform. Within Google, we've been building world-class infrastructure and running large-scale services like Search, Maps, YouTube, and Gmail. And we are very excited to bring the full power of the Google Cloud Platform to developers like you. To talk about that, I'm going to invite Urs Hölzle. URS HÖLZLE: Thank you. Thank you, Sundar. Hi everyone. I'm Urs Hölzle. My team and I built the Google Cloud Platform. And every day, we go to work excited to see what you all are doing with it. Literally every day, there are hundreds of thousands of developers building applications that scale to hundreds of millions of users. And they keep their teams small. And they can focus on what they do best, because we run the rest of the infrastructure for them. Let's have a quick look at what the Google Cloud Platform is about. First of all, of course, we have VMs with Compute Engine, with best-in-class performance and price. And you can run whatever code you want. But if you don't want to bother administering machines, we also have App Engine. And App Engine makes it incredibly easy to write really high-scale applications. That's how Snapchat got to incredible scale without having a single back-end developer. On storage, we offer many options, including, of course, SQL as a fully managed service; NoSQL, which we invented, by the way, and our NoSQL service runs billions of queries every hour; and object storage that scales to exabytes. But you cannot just store data. You also want to analyze it. And on our platform, you have many tools that make it really easy for you to analyze data sets without worrying about scalability.
For example, with BigQuery, you can stream hundreds of thousands of records per second into the cloud, and query them interactively with SQL. Beyond technology, we also lead the industry in price and performance. And as hardware gets cheaper, we pass those savings on to you, so you see Moore's Law in the cloud. Moreover, you get a great sustained-usage discount without having to sign contracts, make upfront payments, or forecast your utilization for the next three years just to get the right pricing. But of course, what really matters to us is seeing what you developers do with our platform. And whether it's Netflix for storage, or Wix, WebFilings, or Khan Academy, who run their entire business on our cloud, there's a lot going on. And I'm going to tell you about two recent new customers. First, Secret. Secret, with a single back-end developer, built an application that in two months saw an over 1,000-fold increase in traffic. And they came to Google because they needed a platform that can handle this kind of hyper-scale growth without much hassle. And that's exactly what they got. Or maybe you've seen the debut of "Rising Star" on ABC last Sunday. It's an interactive music competition where millions of TV viewers vote in real time on the winner. Screenz, the company behind "Rising Star," is using Compute Engine and BigQuery for the instant voting app. And it's been battle tested to run at 1.3 million queries per second. And that's why developer productivity is very important to us. We try to make our cloud not just highly functional, but also really easy to use. As you've seen, the best mobile apps come with an intelligent back end that's in the cloud. And there's no better place to build that back end than on Google. Because we have the performance, and we have the tools that make it really easy for you to build those applications with small teams and little focus on operations.
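Streaming at hundreds of thousands of records per second usually means batching rows on the client before each insert call. The sketch below shows only that client-side buffering idea in plain Python; `flush_fn` is a stand-in for whatever insert call a real BigQuery client library would make, and no actual API is used.

```python
# Client-side batching sketch: rows accumulate until a batch fills,
# then the whole batch is handed to one flush call.

class StreamBuffer:
    def __init__(self, batch_size, flush_fn):
        self.batch_size = batch_size
        self.flush_fn = flush_fn  # called once per full batch
        self.rows = []

    def add(self, row):
        self.rows.append(row)
        if len(self.rows) >= self.batch_size:
            self.flush()

    def flush(self):
        # Hand off the current batch, then start a fresh one.
        if self.rows:
            self.flush_fn(self.rows)
            self.rows = []

batches = []
buf = StreamBuffer(batch_size=500, flush_fn=batches.append)
for i in range(1200):
    buf.add({"event_id": i})
buf.flush()  # push the final partial batch
print([len(b) for b in batches])  # -> [500, 500, 200]
```

Batching like this turns 1,200 per-row calls into 3 requests, which is the usual way a client sustains high streaming rates.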
And today I'm very excited to show you four new tools. Very, very cool new tools that make it really easy to understand your server-side applications. So please welcome Greg DeMichillie, who's going to show you some demos. GREG DEMICHILLIE: Thanks, Urs. To show off some of the new features coming to the Cloud Platform, I've built a sample application here. It allows me to record walks that I'm taking. So this application allows me to record a walk I took through the city, share it with my friends, let them experience the same walk I took, and then share comments. Now, since the whole purpose of this application is to share information, obviously I need to store that data somewhere other than my device. And how do I do that? The answer is Cloud Save. Cloud Save is a new, simple API that lets me save and retrieve per-user information. Now, that could be application data, or it could be user preferences or settings. And I do this with no server-side coding. I just write a few lines of client-side code. And with that, my data can be stored in the cloud. It can be retrieved and synchronized to other devices. It can be available offline, so if my user doesn't have an internet connection, they can still view this walk. And I'm a cloud guy, so what really gets me excited is that this data is stored in Cloud Datastore. Which means I have full access to the cloud platform. I can query that data with BigQuery. I can build web applications that use that data using App Engine or Compute Engine. And that's in fact what I've done in this case. So let me show you the web app version of Walk Share, accessing the same walk data. So this is the web version. I took a nice walk out by the Golden Gate Bridge yesterday and recorded it. I can expand the window and I can see the comments. Now, if you look at the comments, right away you can see that there's something a little funky going on.
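The Cloud Save flow described above, saving per-user data with a few lines of client-side code, syncing it across devices, and keeping it readable offline, can be sketched conceptually in Python. Everything here (the class, method names, and storage shape) is invented for illustration; the real Cloud Save API was an Android client library in Google Play services, not this code.

```python
import json

class CloudSaveClient:
    """Hypothetical sketch of a per-user save/retrieve API with an offline cache."""

    def __init__(self, user_id):
        self.user_id = user_id
        self._remote = {}         # stands in for the server-side Cloud Datastore
        self._offline_cache = {}  # local copy, readable without a connection

    def save(self, key, value):
        # One client-side call: persist remotely and mirror locally.
        payload = json.dumps(value)
        self._remote[(self.user_id, key)] = payload
        self._offline_cache[key] = payload

    def load(self, key, online=True):
        # Fall back to the offline cache when there is no connection.
        if online:
            payload = self._remote[(self.user_id, key)]
        else:
            payload = self._offline_cache[key]
        return json.loads(payload)

client = CloudSaveClient(user_id="walker42")
client.save("walk-2014-06-25", {"title": "Golden Gate", "steps": 9182})
print(client.load("walk-2014-06-25", online=False)["steps"])  # served from the offline cache
```

Because the saved records land in a datastore on the server side, the same data can then be queried from a web app, which is what the demo goes on to show.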
I have code here to replace common character sequences with emoticons. But if you look at that, they're sort of running amok. I'm putting emoticons where they don't belong. So I've got some sort of bug that I need to find. So I'm going to flip over here and look at the source code from GitHub for this application. Now, this application's running in production. It's not running on one server, it's running on tens or hundreds of servers. And there's no way to debug an application that's running in that kind of environment, at least not until today. So I'm going to click into the debug mode. I'm going to click down here, and I'm going to put a watchpoint on this line of code. Now the platform is watching all those servers until one of them actually hits that line of code. At this point there's live traffic hitting the app, so it'll take a few seconds. There it went; I just had to wait long enough. Patience pays off. You'll notice, however, I didn't hit the right comment, because this one doesn't show the bug. So I'm actually going to use a conditional watchpoint and say comment.commenter.name is "Rachel". And now I'm going to wait a few seconds, and we'll actually hit the breakpoint for that comment. I get a call stack and a locals window, and sure enough, this is the one that's having the problem. So, to fix it, I'm going to go up, and I see here's the code that was being called. I can switch into edit mode and very quickly change this regular expression to be the correct one, and learn that I should always have Urs do my code reviews, because he never makes regular expression errors. So that's cloud debugging. But when I was looking at that application, I also noticed that the application seemed a little slow to me. It seemed like it wasn't really responding right. How do I find out what's happening in production there? Another new feature of the platform is Cloud Trace.
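The conditional watchpoint from the debugging demo above (capture state only when comment.commenter.name is "Rachel", without stopping the process) can be mimicked in plain Python. This is a toy illustration of the idea, not how Cloud Debugger is implemented; the decorator and the snapshot list are invented for this sketch.

```python
snapshots = []

def watchpoint(condition):
    """Capture the function's arguments when the condition matches, without halting."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            if condition(*args, **kwargs):
                # Record a snapshot instead of pausing the running server.
                snapshots.append({"fn": fn.__name__, "args": args, "kwargs": kwargs})
            return fn(*args, **kwargs)
        return wrapper
    return decorator

# Conditional watchpoint: only fire for Rachel's comments.
@watchpoint(lambda comment: comment["commenter"] == "Rachel")
def render_comment(comment):
    return comment["text"].replace(":)", "😊")

render_comment({"commenter": "Dave", "text": "nice walk :)"})
render_comment({"commenter": "Rachel", "text": "cool 8)"})
print(len(snapshots))  # only Rachel's comment tripped the watchpoint
```

The key property, as in the demo, is that ordinary traffic keeps flowing; the condition just decides which request's state gets captured for inspection.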
Cloud Trace gives me a tracing view of all the various requests my application was processing and how long they took. I'm going to drill in for comments, and sure enough, when I look at this, I've got some queries here that are taking 200 milliseconds. That really seems long. I'm going to drill into one of them, and I get a view of all the service calls that went into that request. So what appears to be happening here is that I'm doing a bunch of Datastore operations in sequence instead of doing them in parallel. And that's not really the best practice for Datastore. So to fix that, I'll switch over to the code for that. I'll edit it. Sure enough, this whole block of code is in a for loop. That's not really the best practice for getting the best performance out of Datastore. So we will delete all that. And I'll paste in the proper version of the code, which actually does all of that as one operation. I can commit the change. It gets rebuilt. I'm in production. I've fixed my performance problem. Now, I fixed it for one user; how do I know I fixed it for everybody? Well, it turns out Cloud Trace also gives you the ability to run reports. So this is a report that I've done that compares the latency before and after that change. The blue part is before I made the change, the orange part is after I made the change. So sure enough, I can see that my curve shifted to the left. My application is in fact faster, not just for one user, but for every user. So the last thing I want to show you is Cloud Monitoring. Last month the team from Stackdriver joined Google. And I'm really happy to show you some of the integration we've been working on. One of the hard parts about operating a service in production is building up the monitoring and alerting that you need in order to stay on top of things. With Cloud Monitoring you of course get the basic infrastructure-type monitoring you'd expect. Disks, VMs, that sort of thing.
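The Datastore fix in the trace demo above, replacing N sequential calls in a for loop with a single batched operation, pays off because each round trip carries a fixed cost. A back-of-the-envelope latency model (the 20 ms per-RPC figure is made up for illustration):

```python
ROUND_TRIP_MS = 20  # assumed fixed cost per Datastore RPC, for illustration only

def sequential_fetch(keys):
    # One RPC per key: latency grows linearly with the number of keys.
    return len(keys) * ROUND_TRIP_MS

def batched_fetch(keys):
    # One RPC for the whole batch: latency stays roughly flat.
    return ROUND_TRIP_MS if keys else 0

keys = ["comment-%d" % i for i in range(10)]
print(sequential_fetch(keys))  # 200 ms, like the slow trace in the demo
print(batched_fetch(keys))     # 20 ms after the fix
```

Ten items is enough to turn a 20 ms operation into the 200 ms request seen in the trace, which is why batching (or issuing the calls in parallel) is the recommended pattern.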
But what's really powerful is you also get service-level monitoring. We automatically detect what services you're using and give you default dashboards and monitoring. And it's not just for Google services. We automatically detect over a dozen open source packages, such as Redis in this case, and automatically give you intelligent default monitoring for those. So with no additional work, I get a dashboard that helps me stay on top of what's happening in my app in production. Finally, as everybody who runs a service knows, the last thing you want is for your customers to be the first ones to find out that you've got a problem. You want to have alerting in the event of problems. Cloud Monitoring includes alerting. So I can go and set custom alerts on any of a variety of metrics, whether it's on App Engine or Compute Engine or Redis. In this case I have an alert set up to tell me if Redis exceeds its memory threshold for more than five minutes, and I have a choice of how I want to be notified. Do I want to be emailed, or paged, or sent a text message? So that was just a very fast preview of four new developer productivity features coming to the cloud platform. And with that, I'll turn it back over to you, Urs. URS HÖLZLE: Thank you, Greg. So you just saw four new features that make it much easier to build and understand your back end. Cloud Save securely saves and synchronizes across devices. No server-side code, just a few lines of client code. Cloud Debugger gives you desktop-style debugging in your server app. Debug live production apps with live traffic, get local stack traces, and so on, and so on. How cool is that? Cloud Trace then shows you your latency statistics across different groups. You can compare before and after. And Cloud Monitoring gets you intelligent monitoring with almost no setup, including for many popular open source packages. But now let's go from code to data. Information is being generated at an incredible rate.
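The alert rule configured in the monitoring demo above, fire only if Redis stays over its memory threshold for more than five minutes, is a debounce pattern that avoids paging on brief spikes. A minimal sketch, assuming a made-up threshold and one sample per minute:

```python
THRESHOLD_MB = 512   # assumed Redis memory threshold, for illustration
WINDOW_MINUTES = 5

def should_alert(samples, interval_minutes=1):
    """Fire only when the metric stays above the threshold for MORE than five minutes.

    `samples` is a chronological list of memory readings in MB,
    one reading per `interval_minutes`.
    """
    needed = WINDOW_MINUTES // interval_minutes
    run = 0
    for value in samples:
        run = run + 1 if value > THRESHOLD_MB else 0
        if run > needed:  # sustained beyond the five-minute window
            return True
    return False

print(should_alert([600, 700, 650]))                 # brief spike: no alert
print(should_alert([600, 650, 700, 720, 710, 705]))  # six minutes above: alert
```

Any dip back under the threshold resets the run, so transient spikes never page anyone; only sustained pressure does.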
And of course, we want you to be able to analyze that information without worrying about scalability. And today, even when you're using MapReduce, which we invented over a decade ago, it's still cumbersome to write and maintain analytics pipelines. And if you want streaming analytics, you're out of luck. And in most systems, once you have more than a few petabytes, they kind of break down. So we've done analytics at scale for a while, and we've learned a few things. For one, we don't really use MapReduce anymore. It's great for simple jobs, but it gets too cumbersome as you build pipelines, and really, everything is an analytics pipeline these days. So what we needed was a new analytics system that scales to exabytes, that makes it really easy to write pipelines, that optimizes these pipelines for you, and that lets you use the same code for both batch and streaming analytics. And today we're announcing just that with Cloud Dataflow. Cloud Dataflow distills over a decade of experience in analytics into a simple, fully managed service. So no machines to worry about. You can create data pipelines for ingesting, transforming, and analyzing arbitrarily large data sets, both in batch and in real-time mode. And to see Dataflow in action, I've asked none other than our very own Eric Schmidt to help me look at something that's happening right now, namely the World Cup. So please welcome Eric Schmidt. ERIC SCHMIDT: Thanks, Urs. This demo is about performing sentiment analysis of World Cup matches with Cloud Dataflow. We're going to analyze millions and millions of tweets per match, calculate negative or positive sentiment per team, and correlate the sentiment to match data. I'd like to thank the developer relations team at Twitter for their support in using the Twitter data APIs for this demo. Cloud Dataflow is an SDK and a managed service for building big and fast parallelized data analysis pipelines.
You write a program as a logical set of data transformations to specify your analysis. You then submit the pipeline to the Dataflow service, and it handles all the optimization, deployment of VMs, scheduling, and monitoring for you. Now, here's the code for my sentiment pipeline. The Dataflow API provides a simple mechanism for you to add one or more transforms to your data. The first transform extracts a real-time stream of JSON data from Cloud Pub/Sub. Now, this pipeline is running in streaming mode. However, Dataflow can also be run in batch mode, meaning I could point this exact same pipeline at archived data, say in BigQuery, and produce the exact same analysis. One pipeline, batch or stream. Now let's drill into the second transform, tweet transformer. Tweet transformer is responsible for the core transformation and mapping of my data. I deserialize the stream of JSON into a tweet object, I translate it if needed, using the Google Translate API, and I score the sentiment with a third-party service from AlchemyAPI. Now, the ParDo syntax that you see here is what parallelizes the processing for each step. This parallelization is optimized for you by the Dataflow service across processes and machines. Now finally, I apply another transform to calculate the average for all the tweets in a three-minute sliding window. Dataflow provides powerful built-in primitives for doing MapReduce-like and continuous computation operations. Two lines of code. Two lines of code to create a sliding window and to average all the tweets within that window, reducing a massive collection of data down to one record, per minute, per team. Dataflow handles all the aging out of my data, the shuffling, et cetera. I don't have to worry about that. So awesome, we have a pipeline. Let's switch gears and take a look at what this would look like in production. What you see here is my deployed pipeline shown in the Dataflow monitoring UI, built right into the Cloud Developers Console.
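The windowing step highlighted above, averaging tweet sentiment over a three-minute sliding window and emitting one record per minute per team, can be sketched in plain Python over timestamped scores. The window length matches the demo; the tweet data, function name, and record shape are made up, and the real pipeline expresses this with Dataflow's windowing primitives rather than explicit loops:

```python
from collections import defaultdict

WINDOW_MINUTES = 3  # sliding window length, as in the demo

def windowed_averages(scored_tweets):
    """scored_tweets: (minute, team, sentiment) tuples.
    Returns {(minute, team): average sentiment over the trailing three minutes}."""
    by_key = defaultdict(list)
    for minute, team, score in scored_tweets:
        by_key[(minute, team)].append(score)
    out = {}
    minutes = sorted({m for m, _, _ in scored_tweets})
    teams = {t for _, t, _ in scored_tweets}
    for m in minutes:
        for team in teams:
            # Gather every score that falls inside the trailing window.
            window = [s for mm in range(m - WINDOW_MINUTES + 1, m + 1)
                      for s in by_key.get((mm, team), [])]
            if window:
                out[(m, team)] = sum(window) / len(window)
    return out

tweets = [(70, "BRA", 0.6), (71, "BRA", -0.4), (71, "BRA", -0.8), (72, "BRA", -0.2)]
print(windowed_averages(tweets)[(72, "BRA")])  # sentiment dips after the controversial call
```

The point of the managed service is that this aggregation, plus the shuffling and the aging out of old data, happens for you at millions of tweets per match.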
I'm going to go ahead and click in, and you'll notice that the graph correlates one-to-one to the code that I just showed you, making it easy for you to understand your processing topology. If I scroll down further, I can see other information. For example, the total records processed. So I started this pipeline this morning, right before I went on stage. We've processed around 5 million, 5.2 million tweets. And right now I'm running at about 406 records a second. We have lots more headroom with this pipeline, but I wanted to show you that my data is actually flowing. Now, our goal is to make Dataflow monitoring valuable and integrated with the rest of your cloud development experience. Now, let's take a look at what we can do with this data. This is a replay of the opening match between Brazil and Croatia. This represents an analysis of millions and millions of tweets, and tens of thousands of touches recorded in the match. Now if I scroll back and forth, I can see something interesting happened at the 71st minute. Brazil scores a goal, but you'll notice, in orange, Brazil's sentiment goes down. So this is odd. Typically the team that scores a goal will have positive sentiment. Now, if I look in the timeline, there's also some information that's been injected for me by my pipeline. It says there was a controversial call related to the goal. So this is giving me some insight as to why there may be some negative sentiment towards Brazil, but I still don't trust my algorithm. So, fortunately, I've had Dataflow stream all of my raw data into BigQuery so that I can easily perform interactive analysis over very, very large data sets. Now, here is some of the filtered raw data around that 71st, 72nd minute. So there you have it. The fans are upset with Brazil over a bad call by the referee. Now, I think I understand my correlation a little bit better. I absolutely love the World Cup. So thanks, Dataflow.
With Cloud Dataflow you have a fully managed service and unified programming model for classic ETL and continuous analysis over simple or highly complex pipelines, in batch or streaming mode. Back to you, Urs. URS HÖLZLE: Thank you, Eric. And as I'm sure all of you know, my home country Switzerland is playing at 1:00 PM today, and I hope at the end of the game my sentiment is going to be at 100. But, to go back to the technical part, I hope you understand now why we stopped using MapReduce years ago. Cloud Dataflow really does for entire pipelines what MapReduce did for a single step. Namely, it just makes it very easy. You don't have to worry about scalability, you don't have to worry about parallelism. And it will run faster and scale better than pretty much any other system out there, because we needed it to, to solve our own problems. So whether it's data or code, on a server or on a mobile device, we try to make your life easier as a developer. More productive, while giving you the best price and the best performance of any cloud. We've released hundreds of new features in just the last year, and we're seeing incredible growth, thanks to all of you who are using the platform. I'm very excited to see what you can do next year with the power of Google behind you. Thank you. And now I would like to introduce Ellie to talk about Google Play. Ellie. ELLIE POWERS: Hi everybody, I'm Ellie. And I'm absolutely thrilled to be talking to you about a few things that we're doing to help you all create amazing user experiences and also grow your businesses. Now, Google Play is the key to getting your apps into the hands of millions of users globally. More users than on any other platform. And Google offers app developers a wide range of cross-platform tools. And we build on this every year.
Now you've just heard about the cloud platform, and now I want to talk about a few ways that Google Play is helping app developers differentiate their applications and accelerate momentum. We've been making aggressive investments across development tools, ways to distribute and engage, and new mechanisms to generate robust revenue for your business. So I'm going to talk today about a few of our efforts in each of these areas: what we're doing to help you develop, what we're doing on distribution, and what we're doing to help you monetize your apps. So let's talk first about development. Building a great application. A key part of this is testing. You've told us that testing can be painful, and we want it to be easy. Today we are completely thrilled to announce that the Appurify team is joining Google. Appurify offers the most sophisticated mobile-device cloud-testing service. And more importantly, they're just as passionate as we are about delivering high-quality user experiences. Appurify is leading the way in replicating how your app behaves in the real world. And we're excited to help them further scale and bring their expertise to your app development process. Appurify allows you to test your apps across a wide range of devices. And this is critical if you want to be sure that your app produces consistently amazing results on every type of device, on iOS and Android, all over the world. But ensuring quality means more than just device testing. Appurify's service can simulate a specific mobile network, and it can even simulate what happens if the connection is weak or drops out completely. Appurify also gives you detailed log data on device performance, network issues, power consumption, and stability. And this means your engineering team gets all the information they need to fix the problem without needing to repro on a local device. Now, developers tell us that testing is the cornerstone of creating high-quality experiences.
And we want to make testing available to as many developers as possible. That's why Appurify will continue to be cross-platform, on both iOS and Android, and available as a freemium service. So next, let's talk about building apps that help users get the most out of wearable devices. This is a new area that we're really excited about, as you can tell. Today we're announcing a platform preview of Google Fit. This is an open platform designed to help users-- yeah, thank you. We want to help users keep better track of their fitness goals. So we're providing a single set of APIs to manage fitness data from apps and sensors, on cross-platform devices and on wearables. Now, before Google Fit, I was trying to track and monitor my bike rides through my bike computer, and then my weight training through a specialized app, and it was a huge hassle. The information was way too siloed to actually help me. Fit takes away the complexity of handling multiple sources, giving you a unified view of a user's fitness activity. And this helps you create more comprehensive apps. So if a user grants permission, apps can have access to the user's entire fitness stream to give better recommendations through this additional context. So for example, Noom is a weight loss coach app. And they've been an early partner on Google Fit. So let's just take a second to point out how the platform has helped Noom to enhance their app. So you'll see that Noom is able to combine my workouts, nutritional information, and my weight. And because it can talk directly to my Withings scale, it can let me know when my daily cookie habit gets just a little bit overboard. So to centralize everything, the Google Fit APIs allow fitness apps and brands to share your fitness activity, but only with your explicit permission. That's the key. You're in control. You can choose who you share what with, and you can delete your fitness activity whenever you want.
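The unification described above, one API surfacing fitness data that previously lived in separate apps and sensors, subject to the user's per-source permission, amounts to merging permitted streams into a single chronological view. A conceptual sketch only; the source names and record shape are invented and this is not the actual Google Fit API:

```python
def unified_fitness_stream(sources, granted):
    """Merge records from permitted sources into one chronological stream.

    sources: {source_name: [(timestamp, metric, value), ...]}
    granted: set of source names the user has explicitly shared with this app
    """
    merged = [
        (ts, source, metric, value)
        for source, records in sources.items() if source in granted
        for ts, metric, value in records
    ]
    return sorted(merged)  # ordered by timestamp into one unified view

sources = {
    "bike_computer":  [(800, "distance_km", 12.5)],
    "weight_app":     [(900, "weight_kg", 72.0)],
    "withings_scale": [(950, "weight_kg", 71.8)],
}
# The user shared only two of the three sources with this app.
stream = unified_fitness_stream(sources, granted={"bike_computer", "withings_scale"})
print([s for _, s, _, _ in stream])  # weight_app is excluded: permission not granted
```

The permission check sits in front of the merge, which mirrors the model described: apps see the combined stream only for sources the user has chosen to share.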
Google Fit APIs are also opening up new access to data coming in through hardware from top fitness brands. For example, Adidas has a collection of smart sensors that they're opening up to developers for the first time. And we're thrilled to announce that Nike is allowing other apps and fitness devices to integrate with Nike Fuel through this API. Nike will be publishing Nike Fuel to the Fit platform, meaning that your app can use it to give better insights into a user's fitness. And of course there are many, many, many partners that are joining the Google Fit ecosystem. More partners in the program means you can all create more meaningful experiences for users. So we're incredibly excited about the potential here for our platform approach in this area. We want to make it so that you can build great fitness experiences with just one API. The platform preview SDK will be available in just a few weeks, so stay tuned. So next, let's talk about increasing distribution for your apps, growing your user base. And that's where Google Play comes in. So last year we announced our cross-platform service, Google Play Games. And we've been delighted by the response from users and developers. Google Play Games is now the fastest-growing mobile game network of all time. We have activated over 100 million new users in just the past six months. And Google Play Games connects your game to a concentrated network of people who love games. It makes gaming more fun through services like achievements, leaderboards, multiplayer, game gifts, and cloud save. Developers use these to make awesome games and bring players back more frequently, boosting their success. And today, we're announcing new experiences and games services to help you further enhance gameplay. So first up, we have the new game profile. In the new Play Games app, your game profile changes automatically based on the games you play and your achievements in each game.
This new profile means that each player expresses their own gaming identity, and it makes playing games with friends more fun. Since the launch last year, users have loved saving their Play Games progress in the cloud. And we're evolving that now into saved games. Users will be able to see bookmarks of their progress in the Play Games app. So, Dave showed us Leo's Fortune earlier. This is a really fun game, and I just started playing it last week. So I can see here in the Play Games app my saved game, with a screenshot of me playing level three. And we're rolling out a new feature called Quests. A Quest is an online, time-based goal that you can set up in your game, such as collecting a bunch of in-game items on a specific day. Now we're offering a set of APIs that can run these events for your players and reward them, all without you needing to update your game. We have some early games that have started integrating Quests and saved games, and we can't wait to see how you are going to integrate these into your games. These features will roll out soon in the next update of Google Play services and the Play Games app. OK. So, we've had a look at a few ways that you can build high-quality, differentiated experiences and distribute them to a broad audience. Now let's talk about what Google Play is doing to help you monetize your business. So one popular way that users pay on Google Play is direct carrier billing. And this means charging Google Play purchases directly to the mobile phone bill. Now we've been working hard, rolling out direct carrier billing to our fastest-growing markets. It's taken off really quickly. We've just expanded coverage with seven new markets, for a total of 25 countries. And we're happy to announce that direct carrier billing is now going to be available on more devices: on tablets.
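A Quest, as described above, is a time-bounded goal, for example collecting a certain number of items on a specific day, that the service checks and rewards so the game binary never needs updating. A minimal sketch of the completion check; the field names and record shapes are illustrative, not the Play Games API:

```python
def quest_completed(quest, events):
    """quest: {'item': str, 'count': int, 'start': int, 'end': int} (Unix times)
    events: [(timestamp, item)] gameplay events reported by the client."""
    collected = sum(
        1 for ts, item in events
        if item == quest["item"] and quest["start"] <= ts <= quest["end"]
    )
    return collected >= quest["count"]

quest = {"item": "coin", "count": 3, "start": 0, "end": 86400}  # a one-day quest
events = [(100, "coin"), (200, "gem"), (300, "coin"), (90000, "coin"), (400, "coin")]
print(quest_completed(quest, events))  # the coin at t=90000 falls outside the window
```

Because the quest definition lives server-side, a developer can launch a new weekend event by changing the quest record alone, which is the point Ellie makes about not shipping an update.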
So if you have a phone and you've already set up carrier billing, you'll be able to pay for apps, games, movies, music, books, and other content on your tablet, all still paid through that monthly phone bill. It'll even work on tablets that are Wi-Fi only. OK. So there we are. App testing, the Google Fit platform, new Play Games capabilities, and direct carrier billing for tablets. These are four new ways that we can help you to create uniquely delightful experiences for your users. Thank you. All right, back to you, Sundar. SUNDAR PICHAI: Thanks, Ellie. We are seeing tremendous momentum in Google Play. And we really take this seriously because it translates to success for you all. In fact, since last year's I/O, we have paid out over $5 billion to developers on top of Google Play. It's not just the volume of this number, but the rate at which this number is growing. It's increased 2 and 1/2 times, from $2 billion the year before. So we are seeing tremendous momentum, and we are very excited, because it directly translates to developers building their livelihoods on top of our platforms. Of course, you all don't just come to I/O to hear stats like this; you also come because you get your hands on cool new gadgets. So we're going to give you some. The first is an interesting one. A small team of engineers, working in their 20% time, surprised all of us with what you can do with just off-the-shelf cardboard and your smartphone. And the combination takes you into a very, very immersive experience. We're going to hand each and every one of you a Cardboard as you walk out from the keynote. And please share your thoughts on [INAUDIBLE] cardboard. Next, we are very excited about Android Wear, and so we're going to give each and every one of you either the LG G Watch or the Samsung Gear Live. These are great devices, and I hope you enjoy them. We don't want you to just create experiences for square-faced watches; we want to make sure you think about circular ones as well.
And so we will give each and every one of you a Moto 360 as soon as it is available. It's an incredible time in personal computing. You saw our journey today across our platforms, across all these computing devices which people are using. It's tough to believe that personal computing started only a few decades ago. We feel humbled to be part of this journey with you all, and we look forward to building more amazing experiences with you. Thank you, have a great conference. [APPLAUSE]
Google I/O 2014 - Keynote, published June 10, 2014.