
If I say that I saw a robot today, what comes to mind? What is a robot?
This is not a trick question.
Robots in science fiction
We have seen many robots. We’ve seen Robbie the Robot from the 1956 film Forbidden Planet, Rosie the maid from The Jetsons, C-3PO and R2-D2 from the late-1970s Star Wars films, the animated Gigantor, Optimus Prime, Data from Star Trek, and Arnold Schwarzenegger’s T-800 Terminator, as well as later robots like Wall-E, Dolores (and all the other hosts) from Westworld, and the rest of the droids from the various Star Wars spin-offs.
All of this, together, has shaped our view of robots for years, at least in movies and TV.
We are also familiar with the stories these robots tell. Star Trek’s Data just wants to be more human. Isaac from The Orville is a Kaylon, a race of robots bent on destroying organic life. (Yet Isaac’s arc was one of redemption: he is a sympathetic Kaylon who helps turn the tide in favor of the organics.) Star Wars robots, especially the most merchandisable ones, have become friends and companions to their organic counterparts.
Then there are the many evil robots bent on destruction, such as Ultron; HAL 9000; the Daleks and Cybermen from Doctor Who; various incarnations of Terminator robots; Control, Nomad, Lore, and Peanut Hamper from Star Trek; and an assortment of battle droids from Star Wars.
These robots allowed writers to reflect humanity’s traits and problems onto mechanical creatures and to play with what happens when artificial life is created with or without the moral constraints that govern most humans.
Also: Star Wars Starter Guide: Every Movie Ranked and Graded
Robots in the real world
But robots do exist in the real world. And they don’t behave like C-3PO or Mr. Data. Instead, they range from giant automated factories and automobile welding robots to 3D printers and children’s toys. What makes these robots… robots? And what makes them different from the robots of science fiction?
To begin with, the robots of science fiction are often fully autonomous. Mr. Data from Star Trek: The Next Generation and The Doctor from Star Trek Voyager (a holographic AI) were even declared legal entities in the eyes of the fictional Federation. No one is claiming that my friend’s Tesla is legally a person.
Also: The best robot and AI innovations at CES
Indeed, the auto industry has evolved a set of aspirational criteria that define the “autonomy” of a robotic vehicle, and we can apply those criteria to other robots as well. The SAE J3016 standard has six levels: Levels 0, 1, and 2 describe limited automation, while Levels 3, 4, and 5 describe more fully autonomous functionality:
As the chart shows, the blue criteria (Levels 0-2) specify that a human driver must still be in control, even if assisted by the car, while the green criteria (Levels 3-5) specify, mostly, that the car is capable of making all the necessary decisions.
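The six levels can be condensed into a small lookup table. This is a minimal sketch, not the standard itself; the one-line summaries are my own paraphrase of the J3016 chart, and the function name is illustrative:

```python
# Simplified paraphrase of the SAE J3016 levels of driving automation.
# The one-line descriptions condense the standard's chart and are not
# official wording.
SAE_LEVELS = {
    0: ("No Automation", "human does all driving; system may warn only"),
    1: ("Driver Assistance", "system steers OR controls speed; human does the rest"),
    2: ("Partial Automation", "system steers AND controls speed; human must supervise"),
    3: ("Conditional Automation", "system drives in limited conditions; human takes over on request"),
    4: ("High Automation", "system drives in limited conditions; no takeover needed there"),
    5: ("Full Automation", "system drives everywhere, under all conditions"),
}

def human_must_supervise(level: int) -> bool:
    """Levels 0-2 (the 'blue' side of the chart) keep the human in control."""
    return level <= 2
```

The dividing line in the code mirrors the chart’s color split: everything at Level 2 or below still needs a human behind the wheel.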
Most of the real-world robots we have today fall on the blue side of that spectrum. That’s why I use the terms automatic vs. autonomous to separate robotic capabilities. Although the words sound similar, here’s how they differ:
- Automatic systems follow pre-defined rules to perform certain tasks.
- Autonomous systems can act independently, make decisions, and adapt to new situations.
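The distinction can be sketched in a few lines of code. This is a purely illustrative toy, assuming made-up sensor names; an automatic system just replays a script, while an autonomous one chooses its action from the situation:

```python
# Illustrative toy contrasting automatic vs. autonomous behavior.
# Function and sensor names are hypothetical, not from any real robot API.

def automatic_step(task_list, step):
    """Automatic: execute the pre-defined action for the current step."""
    return task_list[step]  # no decision-making, just the next scripted action

def autonomous_step(sensors):
    """Autonomous: decide what to do based on the current situation."""
    if sensors["obstacle_ahead"]:
        return "reroute"
    if sensors["battery_pct"] < 20:
        return "return_home"
    return "continue"
```

Notice that the automatic version cannot react to anything not already in its script, while the autonomous version adapts to whatever its sensors report.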
Today, I contend, most robots are merely automated devices with some level of real-world dynamism. They perform a series of steps, possibly varying based on specific criteria. (For example, a 3D printer will stop printing when it runs out of filament, only to restart when more material is loaded.) Autonomous devices include C-3PO, Mr. Data, the T-800, and Amazon’s dream of human-free delivery robots.
Also: BMW tests next-gen LiDAR that beats Tesla in Level 3 self-driving cars
Right now, we’re really, really good at automation. Autonomy, not so much. But we’re getting there.
Huge dynamic range
Today’s robots, while no Mr. Data, are generally quite useful and functional.
These include industrial robots, medical robots, military and defense robots, domestic robots, entertainment robots, space exploration and maintenance robots, agricultural robots, retail robots, underwater robots and telepresence robots that help people participate in an activity remotely.
My personal interest is focused on robots that are available and accessible to makers and hobbyists, robots that can empower individuals to build, design and prototype projects previously only possible with a shop full of fabrication machinery.
I’m talking about 3D printers, which build objects from layers of melted plastic; CNC devices, which cut, engrave, and carve away wood or metal to create objects; laser cutters, which are ideal for sign making, engraving, and producing very detailed parts and circuit boards; and even vinyl cutters, which use a careful, light touch to cut flexible material into complex patterns.
Also: This is the best and fastest sub-$300 3D printer I’ve tested yet
These machines use CAD software to program (that is, design) the object being manufactured. Those designs are then converted into a series of motion instructions that guide the machine through complex, repeatable steps.
I used a CNC, for example, to create a series of identical custom organizer racks for parts storage.
When I designed and assembled the organizer, the robot became a force multiplier, carving precise rack-holding features, a process beyond my woodworking skill set, which is mostly limited to hammering nails and driving screws.
Robots today have a huge dynamic range, from children’s learning toys to something surprisingly complex and massive: Amazon’s smart warehouses. There are thousands of robots in each of these warehouses, and since they all work in concert with each other, the entire warehouse itself can be considered one giant robot.
Span of autonomy
Let’s get back to our discussion of automated robots vs. autonomous robots. Automated robots can follow a set of tasks, usually supervised (or at least regularly checked) by a human operator.
My 3D printer is a good example. When I create a design in CAD software and then convert that design into G-code, what I’m creating is a series of movement instructions. The instructions specify the X and Y positions of the print head, along with how high the extruder must be for each successively added layer. The instructions also specify the temperature of the extruder, which determines how quickly and smoothly the plastic melts onto the previous layer.
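To give a feel for what those movement instructions look like, here is a minimal Python sketch that emits a simplified G-code fragment for one layer. The coordinates and temperature are made-up examples, and a real slicer adds far more (extrusion amounts, speeds, retraction), but `G1` moves and `M104` temperature commands are the genuine building blocks:

```python
def layer_gcode(points, z, extruder_temp=210):
    """Emit a simplified G-code fragment for one printed layer.

    Illustrative sketch only: real slicers also compute extrusion
    amounts, feed rates, retraction, and much more.
    """
    lines = [
        f"M104 S{extruder_temp}",  # set extruder temperature (Celsius)
        f"G1 Z{z:.2f}",            # raise the extruder to this layer's height
    ]
    for x, y in points:
        lines.append(f"G1 X{x:.2f} Y{y:.2f}")  # move the print head
    return "\n".join(lines)

# One triangular pass at layer height 0.2 mm:
print(layer_gcode([(0, 0), (20, 0), (20, 20)], z=0.2))
```

Every printed object is ultimately thousands of lines like these, executed one after another, which is exactly why the process is automatic rather than autonomous.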
Usually, I’ll start a print and then monitor it with a camera. On the not-rare-enough occasion that the print fails, or the printer just decides to spew melted plastic into the air, I usually catch it quickly enough, run to the fab lab, and cancel the print. The printer is automatic (it is following instructions), but there is nothing autonomous about this process.
Also: Generative AI will surpass what ChatGPT can do
Newer printers are incorporating some AI: cameras feed images to a processor that uses machine learning to examine each image and determine whether it shows a failure scenario. While the machine can’t fix those failures, the machine learning can stop the process, preventing wasted material and a potential safety hazard.
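The monitoring loop behind such a feature can be sketched simply. This is a hypothetical outline, assuming a placeholder `classify_frame()` that stands in for a real trained vision model:

```python
# Illustrative sketch of an ML print-failure watchdog.
# classify_frame() is a placeholder for a real trained model; here it
# just reads a pre-computed score from a dict standing in for an image.

def classify_frame(frame) -> float:
    """Placeholder: return the model's failure probability for one image."""
    return frame.get("failure_score", 0.0)

def monitor_print(frames, threshold=0.9):
    """Pause the print as soon as a frame looks like a failure."""
    for i, frame in enumerate(frames):
        if classify_frame(frame) >= threshold:
            return f"paused at frame {i}"  # stop before more plastic is wasted
    return "print completed"
```

The key design point is that the system only ever makes one decision, stop or continue; it cannot repair the print, which is why this is a first step toward autonomy rather than autonomy itself.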
When you visit Amazon’s warehouses, you will see more robots. There are carting robots that move along the floor, delivering products. Most of these are automated, not autonomous. But Amazon’s wildly complex conveyor system has intelligent imaging systems that watch products go by and make certain decisions about those objects as they pass. Here, we’re starting to see signs of machines managing themselves without human supervision.
I have a drone that exhibits some autonomous behavior as well. If, while I’m operating it with a hand-held controller, it flies out of radio range, the drone will take over on its own. It will plot a course back to its origin, reverse course, avoid obstacles like trees and power lines, and bring itself home without any interaction. It behaves the same way when it senses that its battery is too low to continue flying.
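The drone’s failsafe decision can be expressed as a tiny rule. This is a hypothetical sketch of the logic described above, with illustrative names and thresholds, not any real drone’s firmware:

```python
# Hypothetical return-to-home failsafe logic; the function name and the
# battery threshold are illustrative, not taken from a real drone SDK.

def failsafe_action(radio_link_ok: bool, battery_pct: float,
                    min_battery_pct: float = 25.0) -> str:
    """Decide whether the drone should take control away from the pilot."""
    if not radio_link_ok:
        return "return_to_home"   # lost the controller: fly back on its own
    if battery_pct < min_battery_pct:
        return "return_to_home"   # not enough charge to keep flying safely
    return "pilot_control"        # otherwise the human stays in charge
```

Even this small rule is genuinely autonomous in a narrow sense: the drone decides, on its own, to override its operator when its situation demands it.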
In each of these three examples (AI-assisted 3D printer monitoring, warehouse conveyor monitoring, and drone return-to-home flights), we are seeing autonomous behavior built as an extension of mostly automated systems. I think that’s how we’ll see autonomous features roll out: they will become available on a case-by-case basis, as more and more circumstances can be taken into account.
Also: The technology behind ChatGPT could power your next car’s AI driving assistant
Eventually, you’ll be able to crawl into your car and get an extra 45 minutes of sleep while the car takes you to the Starbucks nearest your office. But as the SAE Levels of Driving Automation chart we discussed earlier shows, Level 5 is a big step up. At that point, we trust the car to handle any and all road conditions and respond intelligently, carefully, quickly, and safely. Most experts believe that we’ll start seeing cars with this capability sometime after 2030.
When ChatGPT produces one of its famous hallucinations, it’s just annoying, and possibly embarrassing if someone uses that material in a piece of writing. But if a robot makes a mistake while operating in the real world, something can go physically wrong, possibly even fatally. Because the stakes are higher, greater care needs to be taken not only in developing fully autonomous systems, but in staging and releasing those systems to ensure they are safe to operate in our shared environment.
Looking Ahead: The Robots of Tomorrow
Let’s review our three main takeaways: First, science fiction has given us a picture of robots that is both cautionary and aspirational, but not necessarily practical. Second, many things in the real world can be considered robots. And third, the span of autonomy varies widely among real-world robots.
At first glance, it seems as if AI and robotics are inextricably linked. But as we’ve seen, AI can inform all, part, or none of a robot’s tasks, depending on the level of technology involved and a robot’s purpose. While it would be nice for a fabrication robot to know when it’s failing and stop, we get a lot of value from automated CNC devices and 3D printers that just follow their G-code instructions.
As we look to the future, we will see more autonomous systems. Siemens has an interesting vision of what a factory might look like in the coming decades, with many autonomous systems interacting with the overall production process.
Also: How horses may inform the future of robot-human interactions
Outside the world of entertainment characters, robots are complex machines that justify their cost and effort by the value they create, be it cost savings, time savings, the ability to undertake otherwise difficult processes, the ability to perform tasks in environments dangerous to humans, or the ability to multiply the efforts of human operators.
The ability to interact with the real world and perform automated actions is table stakes for participating in the robotics revolution. As we move forward, we hope to leverage machine learning, intelligent vision, generative AI, traditional programming skills, and mechanical design skills to open new doors, provide new opportunities, and help robots of all sizes and abilities do more to help us.
On the other hand, if, someday in the future, a robot starts yelling, “Exterminate! Exterminate!” …well, then… “Danger, Will Robinson!” “Bidi-bidi-bidi.” “These aren’t the droids you’re looking for.”
You can follow my daily project updates on social media. Don’t forget to subscribe to my weekly update newsletter on Substack, and follow me on Twitter at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, and on YouTube at YouTube.com/DavidGewirtzTV.