Welcome back. In this course, we're learning how to use TensorFlow to develop deep learning models on mostly relatively small, academic datasets. But of course, deep learning has also had enormous success in industry applications. In this video, I'm joined by Doug Kelly, a data scientist from Google Cloud, to talk about how Google has industrialized deep learning. Doug, thanks for joining me today. Thanks for having me. Perhaps you'd like to start by telling learners a bit about your background. Absolutely. My name is Doug Kelly, and I work as a data scientist on Google Cloud, where I primarily work on decision-support analytics projects and on building machine learning systems on top of TensorFlow Extended (TFX) to improve the resolution, speed, and quality of technical support cases for GCP customers. This has involved numerous deep learning projects, including predicting the resolution time of a case to power interventions, predicting the product and feature issue tags of support cases as they come in, and combining deep learning with interpretability techniques like integrated gradients to decompose complex business metrics into their key drivers and prioritize operational improvements. Prior to Google, I oversaw data science content strategy at Coursera, so it's a real pleasure being back here and joining you in front of the camera after spending so many years behind it. Before that, I worked in a number of positions in the finance and utility space, where I got to experiment with neural networks back in the day, including text clustering with doc2vec and RNNs for time series forecasting. I think sometimes we have to remind ourselves that this current wave of deep learning research is still relatively new. Many people point to AlexNet's success in the ImageNet competition back in 2012 as sparking the surge of interest in deep learning, but of course, that really wasn't that long ago.
How has this rise in deep learning impacted your career? I would actually credit my intuition for neural networks to auditing Andrew Ng's Machine Learning course on Coursera back in 2014. I wrote my first neural network in Theano in graduate school, and to be honest, I didn't like them at first. I found them incredibly hard to debug, to tune, and to understand, and as I was primarily working with structured enterprise data, I was achieving much better results out of the box with just about any other machine learning approach out there, such as boosted trees in scikit-learn or XGBoost. I was in the depths of despair during a class project when a classmate of mine introduced me to their ace in the hole, and that was Keras. From that point on, I was Keras-first, and this was just about the time that TensorFlow was coming out, so before TensorFlow even. What Keras did was open up a whole new world of working with text and sequence data, which abounds in the enterprise and which I had never previously worked with. For learners interested in learning more about the history of deep learning over the past 10 years, I would highly recommend they check out the Heroes of Deep Learning series from DeepLearning.AI, where they can hear about many of these developments from the researchers who pushed this revolution forward themselves. Many researchers would likely highlight events like AlexNet in 2012 and perhaps BERT in 2018, which kicked off the transformer revolution in natural language processing. I would suggest that there's also a complementary timeline on the applied side. From my perspective, there was a first wave with libraries like Theano and scikit-learn being released in 2007 and 2010 respectively, and a second wave kicking off in 2015 with libraries like Chainer, Keras, and TensorFlow.
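To give a concrete sense of the ease of use being described here (this is a minimal illustrative sketch, not Doug's actual code), a Keras text classifier can be expressed in just a few lines: embed token IDs, pool over the sequence, and classify. The data below is a random stand-in for real tokenized documents.

```python
# Minimal sketch of a Keras model for sequence data: embed, pool, classify.
# Assumes tokenized documents represented as fixed-length integer ID sequences.
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Toy stand-in data: 8 "documents" of 10 token IDs each, with binary labels.
x = np.random.randint(0, 1000, size=(8, 10))
y = np.random.randint(0, 2, size=(8,))

model = keras.Sequential([
    keras.layers.Embedding(input_dim=1000, output_dim=16),  # token IDs -> dense vectors
    keras.layers.GlobalAveragePooling1D(),                  # average over the sequence
    keras.layers.Dense(1, activation="sigmoid"),            # binary prediction
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(x, y, epochs=1, verbose=0)

probs = model.predict(x, verbose=0)  # one probability per document
```

The entire model definition, training loop, and inference fit in a dozen lines, which is the "ace in the hole" quality being described.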
Perhaps now a third wave kicked off in 2017 with the release of PyTorch, and we've seen a convergence between these two frameworks, TensorFlow and PyTorch, over the past couple of years. On the education side, many learners were brought into the field starting with a first wave in 2012 with the rise of MOOCs, with both Coursera and Udacity launching that year. A second wave perhaps kicked off in 2016, with fast.ai taking an innovative approach to teaching machine learning and Kaggle evolving from just a competition platform into a real education resource, hitting over one million learners back in 2017. From 2017 to 2019, there was also a focus on end-to-end machine learning frameworks, shifting the emphasis from research to production. So you have frameworks like TFX from Google, Michelangelo from Uber, and many others designed to accelerate the deployment of machine learning more broadly in industry. So Google, of course, is well known as one of the leaders in AI research and products. What can you tell us about how machine learning is done at Google? I believe Google is one of the few companies in the world that is close to achieving industrial-scale machine learning. What I mean by industrial-scale machine learning is developing standardized machine learning system building blocks and deployment processes such that you reduce the marginal development time and cost for just about any practitioner to deploy a machine learning solution. So you have hundreds of practitioners around the world who are able to integrate machine learning into their applications. What makes Google so unique is its corporate strategy and commitment to becoming an AI-first company.
Alphabet's corporate structure draws inspiration from many previous innovative research institutions in that it brings researchers and practitioners closely together, working on a number of problems and taking risks across a wide range of areas. So you have thousands of researchers in Google AI producing cutting-edge research in machine learning, but also complementary research in compute, networking, and hardware. To further accelerate transitioning research into products, there are thousands of software engineers at Google working closely with researchers to integrate this research into products, nine of which have now reached over a billion users. In addition, they're working on frameworks like TensorFlow and TFX to accelerate the deployment of machine learning across the company. Supporting this translation from research to products, you also have thousands of data scientists, UX researchers, and product and program managers working behind the scenes to integrate machine learning into new and existing products and to measure its impact as well. I would also love to give a shout-out to the community: there's a very vibrant community outside of Google of thousands of contributors and partners making contributions to Google's open source products like TensorFlow, so you have this incredible ecosystem of contributions flowing back and forth through the company as well. Ultimately, it's a fascinating ecosystem to be even a small part of, and an incredible driver of machine learning research and application globally. Finally, I'd like to ask you, what are some of the applications where deep learning has had a big impact at Google? Yeah, the best applications, in my opinion, have been those that blend behind the scenes into products and augment users.
Whether learners know it or not, deep learning is now working behind the scenes to serve you more relevant search results, send you more relevant ads, help rank videos for you on YouTube, help understand and answer your questions as part of the Google Assistant, and help billions of users every day compose emails. In fact, since joining Google, one thing that stood out to me is that even just a few years ago, there was this narrative that deep learning is only good on unstructured data like text and video, and that if you have structured data, you're better off sticking to XGBoost. Since joining Google, I perceive that narrative beginning to change. I've seen deep learning start to make inroads into many products and services that previously utilized more traditional machine learning approaches. For example, YouTube has started to incorporate DNNs directly into some of its mixed and hybrid systems for candidate recommendation and video serving. I also attended the Kaggle Days conference this year, where I was on hand to witness Google Cloud's AutoML solution finish a close second behind the top Kagglers in the world on tabular data. So in summary, what I would leave learners with is that I have certainly perceived deep learning moving beyond just being another tool in the ML practitioner's toolkit into perhaps something more general: a different way for software engineers to build intelligent applications in the future. Doug, it's been great to chat. Thank you for joining me today. Thank you.