Who’s Really Doing AI?
There is no shortage of excitement around artificial intelligence. Lists upon lists tell of “promising” examples, just around the corner, coming soon.
But even claims of active AI implementations are often half-measures: AI up to a point, at which humans step in and take the final actions. Say you're using a regression tool to solve a problem. That's great. But what happens next? A human takes that output and makes the decision. Information alone is not AI, particularly when it comes from an off-the-shelf tool.
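The gap described here can be sketched in a few lines. This is a hypothetical example (all data and names are invented): a regression model produces a forecast, and instead of a human reading the number and deciding, a rule acts on the prediction automatically.

```python
# Hypothetical sketch: a regression forecast alone is "information";
# closing the loop means the system acts on the prediction itself.

def fit_line(xs, ys):
    """Ordinary least squares for one feature: returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x

# Invented data: daily demand over a week.
days = [1, 2, 3, 4, 5, 6, 7]
demand = [100, 103, 109, 112, 118, 121, 127]

slope, intercept = fit_line(days, demand)
forecast = slope * 8 + intercept  # predict day 8

# "Information": a human reads `forecast` and decides what to reorder.
# Closing the loop: the system decides and acts, no human needed.
def reorder_decision(predicted_demand, stock_on_hand):
    return max(0, round(predicted_demand - stock_on_hand))

order = reorder_decision(forecast, stock_on_hand=90)
print(f"forecast={forecast:.1f}, auto-order={order} units")
```

The regression is the easy part; the author's point is that the last two lines, where the prediction triggers an action, are what separate analytics from automation.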
True AI is AI from point A to point Z: a smart machine that learns its way through problems, anticipating and correcting along the way, without human input. It is data-driven, built on advanced algorithms, and hard to do, which is why few true examples exist. It's worth noting that calling something “AI” when it isn't is rarely an attempt to mislead; the field is evolving, and many are still getting up to speed on what the term means.
So how do you know if it’s AI or not? The litmus test is the answer to this question: Do you need a human or not?
If you think you’re going big on machine learning and AI, think bigger. Using predictive analytics in your marketing? Great. But think bigger: Do you need a human or not? AI could automate all of your marketing.
Here are a handful of examples of AI that illustrate what it means to fully leverage its potential:
Neural Networks in the Driver’s Seat: Tesla
Quite literally. Tesla uses neural networks to train its cars, and then walks away. Tesla builds its Autopilot hardware from the ground up, trains on driving and traffic scenarios collected from a million vehicles around the world, and builds algorithms on top of that ground-truth data to develop planning and decision-making systems. Open-loop, closed-loop and hardware-in-the-loop testing hold the infrastructure in place and guard against regressions.
Learning and Moving from Sensors, Cameras and Mathematical Models: Spot and Flippy
Boston Dynamics’ Spot walks, runs, climbs stairs and even leaps, mapping its environment in real time through cameras and sensors. Mathematical models help Spot determine the physics of its actions, taking the weight and movement of its metal robot body into account.
Flippy does exactly what its name suggests: it flips burgers. Designed specifically to work in kitchens, Miso Robotics' robot learns from its environment and acquires skills over time, using 3D and thermal scanners to guide its movement. It also has point-of-sale technology baked in to help drive decisions about what to do next.
Assisting and Anticipating
In some cases, “do you need a human or not” boils down to DIFM: do it for me. Personal assistant technologies and services, such as Siri, Alexa and Google Assistant, are good examples.
When machines get access to data and act on it autonomously, you can't get more AI than that, and that's exactly how Siri, Alexa and Google Assistant work. Tell Siri you need to be woken at a certain time and an alarm is set for you. Ask Alexa to remind you about something at a later date and it will, because it takes the data, stores it and autonomously uses it again.
Dynamic Optimization
Nest builds functions like these directly into its hardware. It learns the behaviors of the people in a household (when they come and go, what temperatures they prefer at different times of day) and then uses machine learning algorithms to decide what the home temperature should be at which times, warming or cooling autonomously.
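The learn-then-act pattern can be illustrated with a toy sketch. This is not Nest's actual algorithm, just an assumed simplification: treat every manual adjustment as a training example, learn each hour's preferred temperature, and then heat or cool toward it without being asked.

```python
from collections import defaultdict

# Toy sketch of a schedule-learning thermostat (not Nest's real
# algorithm): learn each hour's preferred temperature from past
# manual adjustments, then act on the learned schedule autonomously.

class LearningThermostat:
    def __init__(self):
        self.history = defaultdict(list)  # hour -> observed setpoints

    def record_adjustment(self, hour, temp_f):
        """A human touching the dial becomes a training example."""
        self.history[hour].append(temp_f)

    def target(self, hour):
        obs = self.history[hour]
        return sum(obs) / len(obs) if obs else 68.0  # assumed default

    def decide(self, hour, current_temp_f):
        """Autonomous action: no human in the loop at decision time."""
        goal = self.target(hour)
        if current_temp_f < goal - 1:
            return "heat"
        if current_temp_f > goal + 1:
            return "cool"
        return "hold"

stat = LearningThermostat()
for temp in (70, 71, 69):      # evenings: occupants keep choosing ~70°F
    stat.record_adjustment(18, temp)
print(stat.decide(18, 65))     # → heat (65°F is below the learned ~70°F)
```

The design choice worth noticing is that the human's role shifts from operating the device to unknowingly supplying training data; after enough examples, the decisions happen without anyone asking.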
In another case, assistance happens in collaboration with a human, but is not driven by one, optimizing on the fly. Tesla's Autobidder shifts power and resources to optimize energy usage among power customers, using a model similar to quantitative stock-trading algorithms.
Real-Time Response
Finally, it's easy to forget that AI has been hurling us through the air at top speed for more than a hundred years: autopilot was invented in 1912. Today's autopilot relies on far more advanced technology, obviously. Numerous sensors on a plane collect data throughout the flight, and the system executes and adjusts against a flight plan.
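The sense-compare-correct loop at the heart of this can be sketched in miniature. This is illustrative only, with invented numbers and a deliberately crude proportional correction; real flight control is vastly more sophisticated.

```python
# Bare-bones sketch of a closed control loop like the one behind
# autopilot: sense the current state, compare against the plan,
# apply a partial correction, and repeat every tick.

def autopilot_step(current_heading, planned_heading, gain=0.5):
    """One control tick: correct a fraction of the heading error."""
    error = planned_heading - current_heading
    return current_heading + gain * error

heading = 80.0      # degrees; sensors report we have drifted
plan = 90.0         # the flight plan says fly heading 090
for _ in range(6):  # repeated ticks converge on the plan
    heading = autopilot_step(heading, plan)
print(round(heading, 2))  # → 89.84, nearly back on plan
```

Each pass through the loop halves the remaining error, which is why no single correction needs to be perfect; the continual cycle of sensing and adjusting is what keeps the plane on its flight plan.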
Sophisticated drones, like planes, also operate on a collection of sensor inputs to execute against a plan or purpose. Skydio builds drones that construction businesses use both to guard construction zones and to inspect them, flying completely autonomously. They can measure how much material has been used, for example, by viewing and evaluating a large pile of rocks.
Since AI is largely about learning and executing, the potential over time is boundless. So don’t limit yourself to thinking AI is about segmenting Web traffic. It’s so much more.