Thursday, November 21, 2019
Robotics: The Software Stage Is Here
It's 2017, and robots are still pretty dumb.

Robotic hardware has more or less arrived, and machines are currently hard at work in a wide range of industries including manufacturing, health care, and more. But the truth is, today's robots are not yet the stuff of science fiction dreams. They are only capable of performing rote, monotonous tasks, aren't good at adaptation, and still struggle with jobs requiring human interaction.

In order for robots to reach their full potential, then, it's time for the software that controls them to catch up with the capabilities of today's hardware. Researchers worldwide are working on this challenge right now, leveraging everything from artificial intelligence, to machine learning, to Big Data in order to better train robots and more seamlessly integrate them into daily life.

"It really does feel like robotics is exciting again," says Chris Roberts, head of industrial robotics at product development and design firm Cambridge Consultants. "Since the seventies, there has been this general steady progression of robots getting bigger and more precise and more powerful and more expensive. This hasn't really been a revolution in technology, but lots of individual things getting a bit better. Processors getting a bit faster and sensors getting a bit cheaper. With labor costs going up I expect what we'll see in the next few years is more of the very low-skilled jobs getting automated."

According to Dr. Dezhen Song, a professor in the Department of Computer Science and Engineering at Texas A&M University, high-level intelligence for more advanced tasks is still probably five to 10 years off, depending on the difficulty of the task and the robot behavior involved. Simpler, more repetitive tasks, such as picking and sorting produce, could be outsourced far sooner.

"If you want a fully autonomous system that functions like a human, that's probably very far off," he said. "But if you have specifically set up a task you want them to do, then we are very close. We actually are already there for some tasks."

Image: FANUC (a), Kawasaki (b), KUKA (c), and other major robotics companies are now manufacturing systems designed to work alongside humans.

Partners, Not Tools

In order for robots to become an autonomous part of the workforce they will need to become better at interacting and working side by side with humans, a process that robotics experts refer to as "cobotics," literally human-robot collaboration.

"Imagine you've got a robot working at the same lab bench as you and the robot is helping you," says Roberts. "Say you both reach for the same test tube. The robot will stop and it won't hurt you, whereas the last generation of robots would have. That's cobotics. But it's still too hard for that robot to plan around you. So, when you both try to reach for the same test tube it will stop; it won't retry; it won't say, 'You're reaching for that, so I'll take a different route to get it.'"

The challenge of cobotics is the fact that humans and robots tend to have overlapping skillsets, so developers need to determine which tasks to assign to robots and which to leave up to humans.
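To make Roberts' test-tube scenario concrete, here is a minimal sketch of that stop-but-don't-replan behavior. Every name here, and the 15 cm threshold, is an illustrative assumption, not taken from any real cobot SDK:

    # A minimal sketch of the safety behavior described above, assuming a
    # simple distance check between the robot's tool and a tracked human hand.
    import math
    from dataclasses import dataclass

    SAFETY_RADIUS_M = 0.15  # assumed threshold: halt within 15 cm of a person

    @dataclass
    class Point3D:
        x: float
        y: float
        z: float

    def distance(a: Point3D, b: Point3D) -> float:
        return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))

    def control_step(tool_pos: Point3D, hand_pos: Point3D) -> str:
        """One tick of the control loop: stop when the human is too close.

        Note what this sketch does NOT do: it never plans an alternate
        route around the person. Today's cobots simply halt; planning
        around you is the open problem Roberts describes.
        """
        if distance(tool_pos, hand_pos) < SAFETY_RADIUS_M:
            return "HALT"  # freeze in place until the workspace clears
        return "CONTINUE"  # otherwise proceed with the preplanned motion

    # Both arm and human reach for the same test tube: the robot halts.
    print(control_step(Point3D(0.40, 0.10, 0.30),    # robot tool position
                       Point3D(0.45, 0.12, 0.30)))   # human hand position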
It isn't solely a question of creating machines that handle tasks for us, but rather making them flexible enough to know when to step in and help us and when to let us take over.

Deep Learning: Teaching the Robots

This is where artificial intelligence and machine learning come in.

Deep learning is a neural network-based approach to machine learning that makes use of today's massive sets of data to train machines on behavior. By using these large data sets, programmers are now able to improve robots' object recognition skills, their natural language processing, their image classification, and more, resulting in smarter machines.

Image: A graph showing the number of organizations engaged with NVIDIA on deep learning, 2013-2015. Image: NVIDIA

According to Jesse Clayton, senior manager of product management for intelligent machines at Nvidia, three factors have enabled this new approach to machine learning: Big Data, so there is more data available to train neural networks; new training algorithms that are far more efficient than previous generations; and advanced new graphics processing technologies, enabling robots to see and perceive more about the world around them.

"The key part is training," he said. "This is where you're exposing a neural network to the sort of data that you want it to learn. So, if you want it to learn to detect people, or you want it to learn to detect cars, or if you want it to learn to detect widgets in a factory, you simply show many, many instances of that data and through that process it learns how to distinguish between cars or people or different types of widgets in a factory."

This is the process by which artificial intelligence becomes intelligent, and thanks to Big Data and cloud computing, it is accelerating.
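As a rough illustration of the training loop Clayton describes, the sketch below shows a tiny classifier learning to tell three classes apart by being shown many labeled instances. It uses PyTorch; the "car/person/widget" labels, the synthetic data, and every dimension here are stand-in assumptions, not a real robotics dataset:

    # A minimal sketch of "show many instances, learn to distinguish them."
    # Synthetic stand-in data; labels 0=car, 1=person, 2=widget are illustrative.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Pretend each example is a 16-number sensor reading from one of 3 classes.
    n_per_class, n_features, n_classes = 200, 16, 3
    means = torch.randn(n_classes, n_features) * 3
    X = torch.cat([means[c] + torch.randn(n_per_class, n_features)
                   for c in range(n_classes)])
    y = torch.arange(n_classes).repeat_interleave(n_per_class)

    model = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                          nn.Linear(32, n_classes))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    # Training is just this loop: predict, measure the error, nudge the
    # weights, and repeat over many instances until the network can
    # distinguish the classes on its own.
    for epoch in range(50):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()

    accuracy = (model(X).argmax(dim=1) == y).float().mean()
    print(f"final loss {loss.item():.3f}, training accuracy {accuracy:.0%}")

The same recipe scales up: swap the synthetic vectors for camera images and the toy network for a deep convolutional one, and you have the object-recognition training Clayton is talking about.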
"Right now, robots know to pick up a widget from this spot, move it over to this spot and put it back down," Clayton said. "They can't deal well with things like dynamic lighting, changing environments, or changes to a manufacturing line. So, there's a lot of opportunity to automate so many more things throughout the entire industrial supply chain, if robots could be smarter about dealing with more dynamic situations, and also smarter about being able to work with humans."

Clayton says he expects deep learning to start making real changes to robotics in the next five years, affecting not only manufacturing but a whole host of other industries as well.

The Rise of the Robots

Of course, no discussion of deep learning and robots teaching robots is complete without addressing the risk factors associated with having sentient, autonomous robots in close proximity to humans. By definition, machines are stronger and more resilient than the average person, and that creates a potential danger in the case of a malfunction or other breakdown in the cobotics working relationship.

This has not gone unnoticed by researchers.

"With robots, we're going to have situations where they might work in some environments, situations where I can control the environment, but might not work when we are in an environment where we cannot anticipate all of the possibilities," says Dr. Song. "So, we will have to be very careful. We have to have a fence, and within the fence we know the robot can work safely."

The problem is it's not always possible to establish that fence, especially as robots start getting closer and closer to humans.

Autonomous driving is a very good example of this, he explained, because in a self-driving car a person is essentially sitting inside a robot that is fully in control of the situation and is driving very close to other people out on the road. This is a car, and it can do real damage, to the occupant as well as others around it, in the event that something goes wrong. The possibility of any sort of accident, then, is unacceptable, and many layers of safeguards must be built in to protect the humans that are interacting with these machines.

This is a process that takes time and careful effort, meaning that the transition to fully interactive robots is going to be slow and methodical.

"It is good to be optimistic," says Dr. Song, "but it's not good to be overly optimistic about this technology. We have many years of work to do."

Tim Sprinkle is an independent writer.

For Further Discussion
"With labor costs going up I expect what we'll see in the next few years is more of the very low-skilled jobs getting automated." Chris Roberts, Cambridge Consultants