Episode Summary
What if every time you washed your dishes, your dishwasher got smarter? Now imagine your dishwasher getting smarter every time someone else washed their dishes.
Today, we are talking to Roger Barga, the General Manager of AWS Robotics. We discuss the recent advances in robotic programming as well as the benefits of the cloud in commercial and domestic applications.
Episode Show Notes & Transcript
Some of the highlights of the show include:
- The benefits of RoboMaker in code deployment
- How cloud computation frees up local resources
- Using machine learning to improve robot reaction
- How great a name RoboMaker is
- Amazon’s commitment to the enduring API
Transcript
Announcer: Hello, and welcome to Screaming in the Cloud with your host cloud economist Corey Quinn. This weekly show features conversations with people doing interesting work in the world of cloud. Thoughtful commentary on the state of the technical world and ridiculous titles for which Corey refuses to apologize. This is Screaming in the Cloud.
Corey Quinn: This episode of Screaming in the Cloud is sponsored by N2WS. There are a number of backup solutions that are available in AWS, including the recently announced AWS Backup. Well, AWS, back the #$*% up. Backups are incredibly easy. Restores, however, are absolutely not. You want to find out whether your backups worked well in advance instead of the way that most of us do: when they don’t work quite right immediately after you really, really, really needed them to work correctly. Check them out at n2ws.com. That’s n2ws.com. Thanks to them for supporting this ridiculous podcast.
Corey Quinn: Welcome to Screaming in the Cloud. I'm Corey Quinn. I'm joined by Roger Barga, General Manager of AWS Robotics. Roger, welcome to the show.
Roger Barga: Thank you.
Corey Quinn: So, starting at the very beginning what would you say that RoboMaker does exactly?
Roger Barga: So, RoboMaker tries to remove all the undifferentiated heavy lifting that a robotics application developer has to do: from the moment they start their project with multiple team members, making sure everybody has the exact same development environment; offering them a good runtime to actually run on their robot; and then offering them simulation as a way to test their application in 3D or 2D environments. It also complements that software with cloud services powered by AWS, because I believe the cloud is going to be one of the most powerful resources a robotics application developer has access to. And once these developers have built their application and tested it through simulation, maybe running hundreds of simulations to test their robot in different environments, it gives them the ability to publish their application to a robot anywhere in the world and manage hundreds or thousands of robots in a fleet. So, it really provides end-to-end application development support for robotics.
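As a rough sketch of the workflow Roger describes, kicking off a batch of RoboMaker simulation jobs with Python and boto3 might look something like this; the ARNs, IAM role, and launch configuration are placeholders for illustration, not values from the episode.

```python
import boto3

# Hypothetical resources; substitute your own simulation application and role.
SIM_APP_ARN = "arn:aws:robomaker:us-west-2:123456789012:simulation-application/my-sim-app/1"
IAM_ROLE = "arn:aws:iam::123456789012:role/RoboMakerSimulationRole"

robomaker = boto3.client("robomaker")

# Launch one simulation job per test world to exercise the robot in different environments.
for world in ["warehouse_small", "warehouse_large", "house_floorplan"]:
    response = robomaker.create_simulation_job(
        maxJobDurationInSeconds=3600,
        iamRole=IAM_ROLE,
        failureBehavior="Fail",
        simulationApplications=[{
            "application": SIM_APP_ARN,
            "launchConfig": {
                "packageName": "my_robot_simulation",    # hypothetical ROS package
                "launchFile": "navigation_test.launch",  # hypothetical launch file
                "environmentVariables": {"WORLD_NAME": world},
            },
        }],
        outputLocation={"s3Bucket": "my-sim-results", "s3Prefix": world},
    )
    print(world, response["arn"])
```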
Corey Quinn: Where does a service like that come from? I guess what sort of pain did you see in the industry that made you decide yeah this is a service that we should bring to market? I'm trying to imagine a conversation that ends with, "You know what would really help this problem? That's right, a whole bunch of robots." Now, in my business I don't have any of those needs. I also live in fairly gentrified San Francisco where giant piles of robots solve remarkably few problems in my life and introduce a whole bunch more. Obviously I am probably not the target market for this. Who is?
Roger Barga: Yeah. So, there's a number of innovative companies right now exploring what can be done with automation and robotics. And we view robotics as a very general term: it's anything that can sense, compute, and take action. So, a Coke machine can be a robot, a dishwasher can be a robot, the Kiva robots that are running around in our fulfillment centers. And so, we started talking to many of these startups, and to many of the developers within Amazon Robotics who are building and deploying robots, and we said, "Where do you spend your time? What's tough about this? What are the hard parts that really don't add value to your robot?" And this is how we started to understand the product definition for RoboMaker. Because what we saw is these developers spend 80, 90% of their time on tasks that add absolutely no unique value to what the robot they're trying to build will actually do. They have to set up a dev environment, set up simulation, manage the machines associated with that, deal with very clunky mechanisms to update their robots, let alone manage them once they're put into production. And this is what became the product definition for AWS RoboMaker.
Corey Quinn: All of which makes sense but how does the cloud enter into this?
Roger Barga: Yeah. So, if you look at where a robot actually spends its compute power and what resources it has available to it, you quickly find that most of the power spent on a robot goes to computing functions locally, which in fact could be shifted to the cloud, allowing more power to be used by the robot for movement and interacting with its environment. When you step back even further, think about how the cloud could be used to capture data about all the robots in production and detect trends. How could the cloud be used to coordinate and orchestrate the activity of a group of robots in your house or a fulfillment center? Then you start to see the value the cloud can bring, not only to an individual robot but to someone who's actually trying to build and optimize a business with a collection of robots.
Corey Quinn: I am in no way, shape, or form a roboticist, but to my naïve view I would imagine that if you have a robot as, I guess, society generally conceptualizes a robot, you have motors that provide locomotion, potentially there is a vacuum on it, maybe it has a bunch of articulated arms that do different things. It seems to me that the power requirements to power those motors are almost on a different order of magnitude than what it takes to power a CPU, a disk, or RAM. So, from where I sit it seems like having compute on device isn't really something that moves the needle in any meaningful way. Is that naïve of me?
Roger Barga: It is. It turns out, for a lot of robots we actually looked at, over 50% of their power was spent processing imagery coming in through the cameras and data coming in from the sensors, especially for things like navigation or creating SLAM maps, which are computationally intensive. When we could stream the data coming off of a lidar or a camera to the cloud, do the computationally intensive mapping, object recognition, and route planning up in the cloud, and push simple instructions down to the motors, you can save over half of the battery power. Not to mention that a developer who's trying to build an affordable robot does not have to put expensive compute on the device. I'll talk later about a customer we've been working with who puts very affordable, low-power chips on the robot, because they can push that compute capability to the cloud and spread that cost across thousands of robots instead of putting it on each and every robot. It brings the total cost down, and this is really important because we see a world where there could be hundreds of robots we work with throughout our houses and our businesses, and that cost has to be low for them to provide value for the company that's running them.
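To make the offloading pattern concrete, here is a minimal sketch of the split Roger describes, assuming a hypothetical cloud endpoint that does the heavy perception and planning; the robot streams a raw scan up and only a small motion command comes back down.

```python
import requests

# Hypothetical cloud endpoint that runs the SLAM / object recognition / route planning.
CLOUD_PLANNER_URL = "https://example.com/plan"  # placeholder, not a real AWS endpoint

def read_lidar_scan():
    """Stub: a real robot would read this from the lidar driver."""
    return [1.0] * 360  # 360 range readings, placeholder data

def drive(linear, angular):
    """Stub: a real robot would send this to the motor controllers."""
    print(f"drive linear={linear:.2f} m/s angular={angular:.2f} rad/s")

def control_step():
    # Ship the raw scan to the cloud and let it do the computationally expensive work.
    scan = read_lidar_scan()
    resp = requests.post(CLOUD_PLANNER_URL, json={"scan": scan}, timeout=0.2)
    resp.raise_for_status()
    cmd = resp.json()  # e.g. {"linear": 0.3, "angular": 0.1}
    # Only a tiny, cheap instruction comes back down to the motors.
    drive(cmd["linear"], cmd["angular"])

if __name__ == "__main__":
    control_step()
```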
Corey Quinn: When you're talking about a commercialized robot, something that a company generally tends to sell for a fixed fee and then it has capabilities that may be cloud empowered, do you find that the economic story winds up changing as a result? Instead of a fixed bill of materials for a robot that's out the door, now you have effectively cloud services, which are historically pay per use, which means that the life cycle and how long something's going to exist has, over time, a different economic model than existed previously. And do you find customers are okay with that?
Roger Barga: Indeed. We found this in cloud computing in general, where customers can amortize the cost and the investment of a piece of software not on a per-robot basis but on actual usage, and they find the economics actually work out in their favor. Not to mention the fact that they're sharing information. If you just think about robots navigating throughout your house or fulfillment center, each one of those has information about its local environment which it can share up to the cloud. If another robot needs to enter that part of the warehouse, it no longer needs to spend the compute power to understand what the map looks like. It can simply borrow from one of its neighbors who's been there previously and utilize that map, saving compute power for everybody.
So again, it's not only about sharing the compute power, it's about sharing the information that each one is gathering. And it's also exciting when we think about machine learning. Let's say we put a machine learning model on the robot for navigation and it bumps into a wall. It can actually send that little bit of information, that one or two errors it's going to make, up to the cloud, and if the customer has hundreds or thousands of these robots, each one making one or two mistakes, they now have a large corpus they can use to retrain the model and push a more intelligent model back down to the robot the next day.
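A sketch of the collect-mistakes-and-retrain loop he describes might look like this, with each robot uploading its rare failures to a shared store; the bucket name and event format are invented for illustration.

```python
import json
import time
import boto3

s3 = boto3.client("s3")
BUCKET = "my-robot-fleet-training-data"  # hypothetical bucket

def report_navigation_failure(robot_id, model_version, observation, action):
    """Upload one 'bumped into a wall' event so the fleet's mistakes can be
    pooled into a retraining corpus for the shared navigation model."""
    event = {
        "robot_id": robot_id,
        "model_version": model_version,
        "timestamp": time.time(),
        "observation": observation,  # e.g. a downsampled camera or lidar snapshot
        "action_taken": action,      # the command the model issued
        "outcome": "collision",
    }
    key = f"failures/{robot_id}/{int(event['timestamp'])}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(event))

# Each robot contributes only one or two of these a day; across thousands of
# robots that becomes a large corpus for retraining a smarter model.
report_navigation_failure("robot-0042", "nav-model-v7", observation=[0.2, 0.5, 0.9], action="forward")
```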
Corey Quinn: So, RoboMaker effectively empowers the hard, interesting parts of building what most of us think of as a robot, not the actual assembly-line pieces of constructing things in hardware?
Roger Barga: That is correct. And in fact, I'd even argue, when you look at the hard problems that roboticists have to solve, they've got really hard problems they're trying to solve, and some of it they're actually pioneering. So, to ask them to spend 90% of their time doing this undifferentiated heavy lifting is really slowing innovation in this field. We can give them that time back so they can do that innovative work, build that custom hardware that's going to make their robot special, and take advantage of the ecosystem and services that we're providing.
Corey Quinn: For some of us making fun of various AWS service names has almost become a sport and I take a look at RoboMaker and what it does and it is a shining example of an awesome name. It's very descriptive, it's catchy, it fits in a single Tweet which in some cases is hard to get to. It's so well named I almost have to assume that someone fought against it when it was first proposed. Was this the first name for the service you were considering or did you have a more contentious discussion?
Roger Barga: Naming is taken very seriously here at AWS. Names are important: a name will shape how a customer thinks about a service, it can shape what people think they can do with a service, and I have to admit I'm really bad at naming. I'm a very pragmatic individual. I came forward with some very simple names for the service, and with our leaders we stepped back as a group, and your five ideas turn into a list of 500, and you get to think about the merits and see what your peers think about them from their experience. It turns into a journey, but also a heck of a large number of meetings. When it's done you can look back and go, "Yeah, that's a great name. Why didn't I think of that in the first place?"
Corey Quinn: Speaking from personal experience, it is many orders of magnitude easier to make fun of a name than it is to come up with a good one. Naming is an art, and as much fun as I have with tearing down the very hard work of other people in that context, it's hard. There is no great way to get there. Changing gears slightly, there's been a lot of talk about ROS, or R-O-S, or however it's pronounced. I read, I don't speak. What is that?
Roger Barga: ROS, yes. You know, researchers 10, 12 years ago realized that research in robotics was being slowed down by the very problem I described for industrial applications of robots: that they too were repeating the same undifferentiated heavy lifting to build a robot so they could actually publish their thesis and do that last little bit of interesting work. So, the community got together and said, "Let's actually build an open source, academic runtime for robots." It's not an operating system, it's a message-passing relay bus. Think about a sensor that senses something and puts what it senses on a message bus under a topic. And again, a robot senses, computes, and acts, so there's a compute node that needs to process information from that sensor. It subscribes to that topic and does the processing.
If it needs to move a motor, it puts a message on the bus under another topic. And what's happened over the years is that academic institutions and researchers have been contributing to this ecosystem of ROS packages for different actuators, different sensors, and different computational tasks like navigation, and it's been picked up by industry as well. There are thousands of companies that are using ROS today for commercial applications of robots. And what's been happening over the last year is industry's been saying, "Let's advance this from a research platform to an industrial-strength open source platform for robotics where the code has been verified, tested, hardened, and the performance has been improved." And that's the effort called ROS 2, which we're proud to be part of.
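For readers who have not seen it, the sense/compute/act message passing Roger describes looks roughly like this in ROS 1 with rospy; the node and topic names are just examples, and a real system would run navigation or SLAM instead of this trivial obstacle check.

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

cmd_pub = None  # publisher for motor commands, set up in main

def on_scan(scan):
    # "Sense": the lidar driver node has published a LaserScan on the /scan topic.
    # "Compute": a trivial obstacle check stands in for real navigation.
    too_close = min(scan.ranges) < 0.5
    # "Act": publish a velocity command on another topic; the motor driver
    # node subscribes to /cmd_vel and moves the wheels.
    cmd = Twist()
    cmd.linear.x = 0.0 if too_close else 0.2
    cmd_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("simple_avoider")
    cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, on_scan)
    rospy.spin()
```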
Corey Quinn: And this is much larger than Amazon or AWS itself. This is a community- or industry-wide effort.
Roger Barga: Yes. Not unlike what happened with Linux. A number of companies have stepped up and said, "We have a vested interest in making the best industrial-strength runtime for robots that's open source, community supported, and community driven." And each of the companies on the technical steering committee for ROS 2, of which Amazon was one of the founding members, is actively contributing source code, designs, and code reviews, and reaching out to the broader open source community and to startups, asking for their feedback and their input as well to define and build ROS 2.
Corey Quinn: Which brings us to the topic of open source which has a bunch of different directions it can go in but let's start at the beginning. What are you doing that is open source? You said you are part of a larger effort. How does that manifest?
Roger Barga: So, as part of our membership of the technical steering committee, and just what we feel is our responsibility to the open source effort behind ROS 2, we have engineers who are actively contributing to the ROS 2 code. In fact, in the most recent release of ROS 2, Crystal, which just happened back in December, roughly 40% of the code and designs contributed came through my team. We have a number of engineers who are not only contributing source code to ROS 2 but reviewing designs and helping support the community through forums, playing a very active role. And this is what we expect every company on the ROS 2 technical steering committee to do as well, because this together is how we're going to make this a successful open source platform for robotics.
Corey Quinn: What are you doing that's different as far as the open source world goes these days?
Roger Barga: Yeah. So, Amazon and AWS have played very active roles in open source software, but not in all cases have we been so visibly active and vocal. In this case we are. We came out as a public member of the technical steering committee. We're helping actively drive discussions, getting feedback from the community, and doing so in a very visible manner. And we think this is important, because if you're a startup or another company thinking about investing in ROS, you need to see the names of the companies who are standing behind it and hear about the contributions they're making, so you can ask, "Can we trust our business with this?" What's been fantastic since the announcement of both the launch of RoboMaker and the ROS 2 initiative, and the companies, specifically Amazon, behind it, is that a number of robotics companies have approached us and said, "We're now moving to ROS 2. We see that Amazon's behind it. We see the RoboMaker support, and this is going to be community supported and led. Let's go ahead and port our robots over to ROS 2."
Corey Quinn: One thing that strikes me as a bit of a strange tangent off the idea of running robots that are cloud connected: as wonderful as the idea is of offloading all of the compute, all of the different heavy lifting they wind up doing, to a third-party provider that's somewhere in a data center far, far away, what about latency- and safety-sensitive concerns here? I mean, the easy example is an autonomous vehicle, which is a whole separate kettle of wax. But say we're waiting on an API call, there's a timeout, and we're doing exponential backoff. Meanwhile you are hurtling toward the bay and wanting not to drive into it. Do you find that there are use cases for which there is no substitute for on-device computation, or is there a fallback mode? How do you envision this?
Roger Barga: Absolutely, and we can talk about what customers have done, including our own robots in our own fulfillment centers. We also follow trends and anticipate where they're heading. 5G is unfolding, which will give us ubiquitous, low-latency connectivity for devices around the world. So yes, we see connectivity increasing and latencies falling, but let's talk about where we are today. If you want a highly interactive session with your robot and you cannot afford that latency, RoboMaker allows you to deploy code, including machine learning models, onto the robot itself for low-latency interaction. And then you can partition the work where you can handle high latency: if I want to tell my robot to go get me something out of the refrigerator, I'm okay if it takes 100 milliseconds of latency for that command to get up to the cloud, be translated into commands, and come back down to my robot.
So, good engineers will partition the work that needs to happen on the robot versus what can be offloaded to the cloud. And again, this is that interesting question of programming the edge, what role the cloud will play, and how you partition your work, which makes this such an interesting problem. Companies that need safety-critical processing will put that on board, on the robot, prioritize those messages and that processing, and offload other processing to the cloud.
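One way to picture that partitioning, sketched under the assumption of an on-board safety check plus an optional cloud call with a strict time budget and a conservative local fallback:

```python
import requests

CLOUD_PLANNER_URL = "https://example.com/plan"  # hypothetical endpoint
SAFETY_STOP_DISTANCE = 0.3  # metres

def local_safety_check(scan):
    """Latency-critical work stays on the robot: stop if anything is too close."""
    return min(scan) < SAFETY_STOP_DISTANCE

def plan_next_move(scan, goal):
    # Safety-critical processing runs on board, every cycle, with no network involved.
    if local_safety_check(scan):
        return {"linear": 0.0, "angular": 0.0}
    # Latency-tolerant work (global planning) can go to the cloud, with a short
    # timeout and a cautious local fallback if the call is slow or fails.
    try:
        resp = requests.post(CLOUD_PLANNER_URL, json={"scan": scan, "goal": goal}, timeout=0.1)
        resp.raise_for_status()
        return resp.json()
    except requests.RequestException:
        return {"linear": 0.05, "angular": 0.0}  # creep forward until the cloud answers again
```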
Corey Quinn: When this service was announced at Midnight Madness at re:Invent last year, it was fascinating in that it was sort of out there, not directly tied to other services. And what's also neat about this is that it was announced as being generally available. This was not an in-preview, coming-later, apply-for-access thing; it was there, ready to go, that evening, because what you absolutely want people to do is build industrial robots at 2:00 in the morning the day it's released, 'cause that could not possibly go wrong. I've never known Amazon to release a service that did not already have active customers using it. You're not generally a company that says, "Hey, we built this thing. We're super proud of it. Maybe someone will use this." You aren't the throw-a-bunch-of-stuff-at-the-wall-and-see-what-sticks company. You have customers actively using this on launch day. Do you have any you can talk about?
Roger Barga: I do. And in fact, even from the very inception of the project, because we work backwards from customers, we reached out to customers that are running robots in production as soon as we had our PR FAQ written, to get their input, their guidance, and their prioritization of features, and to really dive deep with them on actual use cases. We later onboarded those customers into an advisory board and then a beta program. So, we were working with customers months before the actual launch, including customers who were ready to go into production. We worked with NASA JPL to port their open source rover to RoboMaker and to ROS.
One of my favorite customers, because of the nature of the robot and how they're using the cloud, is Lea by Robot Care Systems. Lea is a walker robot for the elderly or the disabled. It's not something you would think of as a robot, but it's running ROS; it computes, it senses, it takes action. And in the case of Lea, they have added our cloud services for Polly and for Lex so that the customer can call the walker to them from across the room with their voice. Lea will respond, come to the patient, and interact with the patient in the most natural manner, through voice. But they're also streaming telemetry off of the walker through Kinesis data services, so they can understand the gait of the patient, their walking rate, how much activity they've done. Doctors can have dashboards that monitor their patient. If they feel the patient's recovering, they can build predictive models with that data and predict when the patient is going to recover, or detect if there's a negative trend and they need to intervene and take action. It's changed their business, it's changed the value prop they offer to their customers, and it's a great example of how RoboMaker and the cloud can complement robots in the home.
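The telemetry half of that story could be sketched like this with boto3 and Kinesis; the stream name and record fields are stand-ins for illustration, not details from Robot Care Systems.

```python
import json
import time
import boto3

kinesis = boto3.client("kinesis")
STREAM = "walker-telemetry"  # hypothetical Kinesis stream

def publish_gait_sample(walker_id, step_length_m, cadence_steps_per_min):
    """Stream one gait measurement off the walker so clinicians can build
    dashboards and predictive models over the aggregated data."""
    record = {
        "walker_id": walker_id,
        "timestamp": time.time(),
        "step_length_m": step_length_m,
        "cadence_steps_per_min": cadence_steps_per_min,
    }
    kinesis.put_record(
        StreamName=STREAM,
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=walker_id,
    )

publish_gait_sample("lea-0017", step_length_m=0.41, cadence_steps_per_min=88)
```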
Corey Quinn: When someone is looking through the vast, vast, vast list of various AWS services and they come across RoboMaker, which, again, props to the name, it's evocative. And let's say they're a new grad: they graduated from college yesterday and now they're figuring out what they want to do with their career. I'm told it doesn't quite work that way anymore, but let's pretend: "Oh wait, you mean I need to get a job?" And here we are, and they see that. Is there an easy on-ramp for this service for someone who is puttering around at home for fun? Is there a DeepRacer-style equivalent, or DeepRacer itself? Is this something that is going to be useful to someone who is not part of a larger organization, or do you generally need to already have a number of prerequisites before this starts to add value?
Roger Barga: Yeah. First off, when one looks at ROS and the educational materials that are available, they immediately have access to this, and in fact the University of Cambridge in the UK is using RoboMaker right now to teach their robotics class. We have an educational outreach program that includes 15 universities and higher-education institutions that are using RoboMaker to teach robotics to their students. In addition to that, once you actually launch RoboMaker and open it up, lo and behold, you'll find that it's actually used to train DeepRacer to learn how to race around a track.
We have about a half dozen sample applications today, with more soon to come, where we provide the source code and walk you through how we built the application. And in fact, the TurtleBot, which is the most widely used robot for education and for hobbyists, runs all of these applications. So, a customer can deploy an application to the robot in their living room, see it execute the program, change the program, and see the TurtleBot's behavior change as well. So, we have tried to make a number of resources available, and we have more to come, which I can talk about later.
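Pushing one of those sample applications out to a robot in your living room goes through a RoboMaker deployment job; a minimal sketch is below, with placeholder ARNs, package name, and launch file.

```python
import uuid
import boto3

robomaker = boto3.client("robomaker")

# Hypothetical ARNs for a registered fleet and a packaged robot application.
FLEET_ARN = "arn:aws:robomaker:us-west-2:123456789012:deployment-fleet/living-room/1"
ROBOT_APP_ARN = "arn:aws:robomaker:us-west-2:123456789012:robot-application/hello-turtlebot/1"

response = robomaker.create_deployment_job(
    clientRequestToken=str(uuid.uuid4()),
    fleet=FLEET_ARN,
    deploymentApplicationConfigs=[{
        "application": ROBOT_APP_ARN,
        "applicationVersion": "1",
        "launchConfig": {
            "packageName": "hello_turtlebot",  # hypothetical ROS package
            "launchFile": "wander.launch",     # hypothetical launch file
        },
    }],
)
print("Deployment job:", response["arn"])
```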
Corey Quinn: With the understanding that forward-looking statements, etc., etc. If we take a look back at some of the early launches of AWS, an awful lot of what was announced made absolutely zero sense. You're an online bookstore; why are you announcing a queuing service, or this thing called an object store that none of us had ever heard of? And now we are a decade and change later, looking back on that and seeing, okay, yeah, this was used to build an awful lot of transformative, amazing things. And it's never quite clear how much of the world today you folks saw coming back when this stuff was launched. So, in the context of RoboMaker, do you have a vision 10 years out from now, or however long it is, where we're going to be looking back and this was the most obvious thing in the world to build, but it needed to get to a certain place and now it empowers something transformative and grand? Or is this effectively aimed at today's customer requirements, or both?
Roger Barga: Yeah, I think that's part of Amazon's culture of being customer obsessed and of invent and simplify. I can assure you that every feature of RoboMaker was derived from talking with customers with actual, real pain points, both within the company and outside the company. And we're already consulting with companies now: now that we've built this service, what other new features can we add to enable you to do more with it? So again, I think the reason these services become more valuable, more viable, is not how they started but how they evolve working with customers. As their needs evolve, as their requirements evolve, as new applications of robots, in our case, evolve, we'll evolve with them.
Corey Quinn: A common refrain from Amazon is that collectively, as a company, you are, and I quote, "willing to be misunderstood for long periods of time." If you look right now at the feedback you've gotten since launch, how people are using the service, how people are talking about your service, how do you see that RoboMaker is potentially being misunderstood today?
Roger Barga: Yeah. So, developers have not had access to cloud services to take advantage of, both for fleet management and for augmenting the capabilities of the robot, so we do find it's foreign to roboticists, who have not had access to this capability. So, we find ourselves leading a dialogue with them about how we use cloud services to coordinate the robots in our fulfillment centers, and how other companies are using cloud services to program and control robots that are out in space hurtling towards new planets. And so, it is a little bit of an education about what the possibilities are, but also listening for what new services we should build. So, I do believe the most interesting space is that partitioning of functionality between the edge and the cloud and how it can complement their capabilities.
Corey Quinn: One of the more signature attributes of AWS has been that when you wind up launching a service, even if it's one that doesn't seem to make sense, doesn't seem to have a market, it never gets turned off. And every service you launch has customers, to my understanding, but APIs are almost perceived as promises from you folks. I feel like I can take this recording of our conversation and archive it, and in 50 years my descendants will be able to listen to it, and they may laugh an awful lot at how naïve the conversation was, etc., etc., but that service is still going to be there.
There are very few companies I would take that bet on, particularly in the technology space, but it seems to me that whenever something goes GA from AWS, I have remarkably little hesitation in recommending that people build their business on top of that service. The counterpoint to that is APIs are forever, for better or worse. Are you starting to see ways for the API to evolve? Have you gotten to a point, and you don't need to be specific on this, where, now that you've seen even the few months since it has gone GA, you would have made different decisions in how the service is interacted with, how it interacts with other services? Or, alternately, are you seeing ways to expand this far beyond where it is today and start embracing other AWS or third-party services that at launch you really hadn't considered using?
Roger Barga: So, we don't have any crystal balls that tell us how an API is going to hold up over time but we do know ...
Corey Quinn: I was hoping I could borrow it if you did.
Roger Barga: But we do know we have customer trust, and customers will actually take a dependency on our API, build their application on our API, and we don't have the hubris to think we can simply change an API and break those customers. So, we try to think very deeply and very carefully about the functionality of an API: is it as simple as possible, is it as cross-cutting as possible? 'Cause you can always add new APIs with different functionality over time, but you never want to deprecate an API for fear of breaking customers. So, there's a thought process that goes in there, but there's also an obligation to keep the API as it is. You can always add new APIs with new functionality. And again, a lot of that is, if you start with customers in beta programs, you know they're deriving value from it, so you know that API is going to continue to add value in the ecosystem. That doesn't mean we're not going to add more as we see additional ways of exposing functionality in a simpler or more powerful form for our customers. But there is that commitment that we will continue to support the APIs we have exposed.
Corey Quinn: As you take a look across the landscape of other AWS services: at launch you mentioned that there were a bunch of very high-level, very forward-thinking services that RoboMaker integrated with, and also CloudTrail. And I'm wondering, as you look across the ecosystem of various AWS services, are you seeing opportunities to integrate with different services that weren't necessarily there at first? And there are some ridiculous answers to that. "Yeah, we want to make sure that the robot can speak appropriately to Cost Explorer" sounds like something that not a lot of people would be clamoring for, and then, of course, I tend to make no predictions about anything AWS does; there's nothing I'll say will never happen. For all I know, there's a huge customer that you can't tell me about that's already doing a lot of work with robots and Cost Explorer, but I can't imagine what that would look like.
Roger Barga: So, obviously we were very excited about integrating RoboMaker with Polly and Lex so customers could have a more natural interaction with it. We were pleasantly surprised to find out later that CloudWatch turns out to be one of the most commonly used services, because customers want to know, "What the heck is going on in my robot? Where is my robot?" And so, you start to see services like that that expose meaningful value. SNS, which allows me to stream messages off my robot or maybe send a notification to my robot, is one we're looking at right now. We also have the ability to put an agent on a robot and update the operating system on the robot, much as you would with an EC2 instance, and we're seeing demand for that.
So, when we start to think about the pragmatic nuts and bolts of actually managing a robot, where it's at, what its telemetry is, we see new services that we're going to be integrating over time. I was also pleasantly surprised to see, when we launched alongside DeepRacer, where another team is using reinforcement learning to train a car, the interest and response we've gotten from companies that say, "I'd love to use reinforcement learning to train my robot to do new behaviors," and we need a deeper and richer integration there. So again, I think in the fullness of time we'll be building new services for fleet management and integrating even more AWS services into robots.
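The "what is going on in my robot" use of CloudWatch boils down to publishing custom metrics and logs from the device; a minimal sketch with a made-up namespace and metric follows.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

def report_battery(robot_id, battery_pct):
    """Publish a custom metric so operators can graph and alarm on low
    batteries across the whole fleet from one CloudWatch dashboard."""
    cloudwatch.put_metric_data(
        Namespace="RobotFleet",  # hypothetical namespace
        MetricData=[{
            "MetricName": "BatteryPercent",
            "Dimensions": [{"Name": "RobotId", "Value": robot_id}],
            "Value": battery_pct,
            "Unit": "Percent",
        }],
    )

report_battery("robot-0042", 37.5)
```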
Corey Quinn: CloudWatch is, I guess, one of those personal hobby horses I have, but credit where due: that service has been evolving rapidly over the last few months, and it's modernizing at a very interesting rate. A lot of the challenges that were historically there are no longer there now, and I'm sure even fewer will be by the time this episode airs. So, I want to be very clear that was a joke; that was not an actual criticism of the service. I guess one interesting aspect of this is the idea that you mentioned with Lea, the robot that walks around and integrates with various other services. It seems like this is almost a straight-shot play for some of the various Alexa services out there as well, where this winds up being able to empower different modes of interaction with existing things, both around the home as well as in the workplace.
It feels to me like, and I can't even articulate how, but this is a glimpse of a future where working on a computer no longer looks like sitting there typing into a terminal or an editor. It starts to look a lot more like a conversation, where you give a series of instructions and things start happening in the real world. It feels like a number of the things I've never spent a lot of time going into on the AWS side are things that interface with the real world, things that I try not to deal with as much as possible. IoT is an example of this as well, where it starts to hint at a future I can start to see the edges of but can't quite figure out what it's going to look like.
Roger Barga: Yeah, it is. Again, if we think about robots in their most general sense, they sense, they compute, and they act. And how many devices do we have to interact with today that do that for us? And think about the interfaces we have. I am confounded by my dishwasher. I can spend a half hour trying to get the darn thing to actually do the right load. What if I could walk up to it and tell it exactly what kind of load I wanted to run and what time I wanted it to start, and I could do the same with other appliances throughout my house, which in fact are robots? What's really surfacing is not necessarily something that's going to replace developers, but a more natural way of interacting with these devices, which are in fact robots: how we're interacting with these devices, how we're programming and managing them. So, I think that's an exciting future.
Corey Quinn: I would absolutely agree with that assessment. One thing I will point out, though I expect you won't have anything meaningful to share with me: you are not the GM of RoboMaker, you are the GM of AWS Robotics. And on the one hand, I feel like this might wind up being a story similar to Ground Station, which is in its own category called Satellite. Either there's about to be a whole lot of interesting space releases or it just really didn't fit into any other existing categories. Is this an area that you're seeing is ripe for expansion, or is this more or less a, "Well, we didn't really know where else to put the robot thing and we're done"?
Roger Barga: We think this is an area of great innovation and great opportunity for the years ahead. So, much like the naming exercise for a product, we applied the same exercise to our service and our team, not wanting to be locked into any single definition of what the team does or owns, thinking that in the fullness of time there will be other services. We're already talking with customers and validating what those services might be. But it's clearly a new category of emerging technology, so we should be prepared to build and manage services for our customers who are building robots out in the real world.
Corey Quinn: Thank you so much for taking the time out of your day to speak with me. This is an exciting space and I'm very interested to see what comes next.
Roger Barga: It's been fun talking to you today. Thank you.
Corey Quinn: Thanks very much. Roger Barga, General Manager of AWS Robotics. I'm Corey Quinn and this is Screaming in the Cloud.
Announcer: This has been this week's episode of Screaming in the Cloud. You can also find more Corey at screaminginthecloud.com or wherever fine snark is sold.