Also, I got tweeted by NASA. I’m going to stop the blog.
Turning petabytes of satellite imagery spanning 37 years into an interactive experience in product, on the web, and as an interactive installation at the UN. Timelapse in Google Earth provides visual evidence that climate and human behavior are changing Earth.
Our team worked on the original idea, product design/UX, writing, sound, comms & the main time-lapse camera moves and sequences… plus a bunch of other stuff. Lots of hard work. I played a small part.
And absolutely one of my favourite projects this year.
Part of Hack to Help.
Our communities are facing all types of challenges with the coronavirus (COVID-19) outbreak. In the midst of all of it, we’ve been inspired by the small hacks people are making to help in any way they can. So we’ve created this page as a resource for others who might want to help – whether that means creating something, contributing to others’ efforts, or simply volunteering your time to those in need.
Sideways Dictionary is a collection of fun, easy-to-understand analogies that help explain complicated technology terms. Use it to look up tech terms of all kinds, vote for the definitions you like best, and even add your own.
Creative Lab helped our friends at Google Jigsaw bring this project to life by writing, art directing, and producing two short animations. One to introduce the project, and a second to help people get in the right mindset to write and contribute analogies of their own.
AI Experiments is live.
— Google (@Google) November 16, 2016
There’s a bunch of amazing experiments on the site, but the one below is the one I spent the most time with during its early development phase.
Honestly, I never felt more out of my depth on a project than at the beginning of this one. Sitting in the kickoffs with Alex, Kyle and Yotam, who were deep in the weeds talking about t-SNE, dimensionality reduction, high-dimensional space, convolutional neural networks, and supervised vs. unsupervised learning. It was a full-on, nose-bleed crash course in ML. But so worth it. Do not fear this stuff. It’s a different world to start, but after a few weeks it begins to take. So please enjoy…
The Infinite Drum Machine
— Google (@Google) November 18, 2016
Sounds are complex and vary widely. This experiment uses machine learning to organize thousands of everyday sounds. The computer wasn’t given any descriptions or tags – only the audio. Using a technique called t-SNE, the computer placed similar sounds closer together. You can use the map to explore neighborhoods of similar sounds and even make beats using the drum sequencer.
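To make the idea above concrete, here’s a minimal sketch of the t-SNE step, assuming scikit-learn. This is not the project’s actual pipeline: in the real experiment each sound was summarized as a feature vector derived from its audio; here random vectors stand in for those features so the example is self-contained.

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)

# Stand-in for real audio descriptors (e.g. spectral features).
# In the experiment, each of the thousands of sounds would be
# summarized as one such high-dimensional vector.
n_sounds, n_features = 200, 64
features = rng.normal(size=(n_sounds, n_features))

# t-SNE reduces the high-dimensional features to 2-D coordinates
# that preserve local neighborhoods: similar vectors end up close
# together, giving each sound an (x, y) position on the map.
coords = TSNE(
    n_components=2,
    perplexity=30,
    init="pca",
    random_state=0,
).fit_transform(features)

print(coords.shape)  # one (x, y) point per sound: (200, 2)
```

Plotting `coords` (one point per sound) would give the kind of explorable map of sound neighborhoods the experiment is built around.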
Here’s the explainer video:
For an extra sneak peek into the development process, here’s a video showing an earlier prototype. This one has ~40k short samples from Freesound! For the final version we licensed ~17k.
This is one of the last projects I started working on in New York, so it’s great to see it out in the real world. Mad props to Alex, Catherine, Manny, Yotam, Eric, Jonas, Kyle, Gene and a bunch of other very smart people.
And yeah… what Kyle said.
— Kyle McDonald (@kcimc) November 16, 2016
AI Experiments website:
Most of the stuff I work on is confidential so I don’t get to share it publicly—but my last project could hardly be more visible. I’m not going to write lots about it as there’s already plenty of coverage out there. All I wanted to say is that it’s the meatiest (and most rewarding) thing I’ve ever tackled. We kicked the project off in January 2015 with ten people in a room sketching ideas. By the end of August we had 200+ engineers, designers, writers, product managers, and marketers preparing to flip the switches on 30+ product updates. Alongside the product updates and a ton of guidelines and toolkits, we also made this Google, Evolved video, a Google Doodle for the occasion, and shared the thinking on the Official Google Blog.
Everything went live on September 1st 2015.
Bonus: we also broke down the process and thinking in much more detail in a post on the Google Design Blog. If you’re into how things get made, you should definitely take the time to read it. You’ll get a better understanding of how the process worked, why the system and framework were designed to hold together, and what we wanted to reflect in the brand by making Google more accessible and useful to our users, wherever they may encounter it.
Here’s a little teaser.
Early this year, designers from all across the company, including Creative Lab and the Material Design team, convened in New York for an intense, week-long design sprint. We drafted a brief that identified four challenges we wanted to address:
- A scalable mark that could convey the feeling of the full logotype in constrained spaces.
- The incorporation of dynamic, intelligent motion that responded to users at all stages of an interaction.
- A systematic approach to branding in our products to provide consistency in people’s daily encounters with Google.
- A refinement of what makes us Googley, combining the best of the brand our users know and love with thoughtful consideration for how their needs are changing.
It was a huge team effort. Hope you like the work!
I’m excited to be able to share something I’ve been working on with the team at Google’s Advanced Technology and Projects (ATAP) group over the past few months.
Project Soli is a new technology that uses radar to enable new types of touchless interactions. My team worked on the overall project design and branding, early interaction ideas and use cases, demo and prototype ideas, and narrative storytelling, and we connected ATAP with the talented Jack Schulze and Timo Arnall.
Project Soli was announced at Google I/O in May 2015 to rave reviews, and the project team is now building out the DevKit. Developers and interested parties can sign up for updates on the official site.
Huge props to Ivan Poupyrev, Carsten Schwesig, and the entire team at ATAP. Excited to see where this will go.