What are the most important components of driving a car? The engine? The brakes? The driver? One of the three is no longer necessary, and judging by the title, it’s the driver. This simple act of removing the driver has a lot of work behind it (and a great impact as well). Driverless/autonomous vehicles are here and they are here to stay. There is so much potential within this industry, but first you need to understand what these vehicles are.
Autonomous vehicles are the future of transportation. Also known as self-driving cars or driverless cars, the name tells us they are automatic (auto) vehicles: vehicles that don’t rely on actual people (drivers) to go places. To this day, this still seems quite futuristic, but developments are already underway.
How the Vehicle Sees the World
First, we need to train the vehicle to do something we take for granted every day. How to perceive the world! This is sight, being aware of the things around us, knowing when something is moving, etc. Adding the next few components is only the tip of the iceberg. Beneath the water, we see algorithms and neural networks so that the vehicle understands what it’s seeing.
If you’ve seen a driverless car before (or a picture, not everyone is that lucky) you’ve probably noticed the little hat on top of the car. This my friend is a LiDAR unit. LiDAR units are critical to providing measurements/info that a camera simply can’t provide like the distance between the vehicle and other cars. It’s an array of lasers, rotating 360 degrees while scanning the world around it, locating different objects and how they move.
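To make that a little more concrete, here’s a toy sketch (with made-up readings, nothing like a real LiDAR driver) of how one rotating sweep of distance measurements turns into a map of points around the car:

```python
import math

def lidar_scan_to_points(ranges, angle_step_deg=1.0):
    """Convert one 360-degree LiDAR sweep (a list of distances in metres,
    one reading per angle step) into (x, y) points around the vehicle."""
    points = []
    for i, r in enumerate(ranges):
        if r is None:  # no return at this angle (nothing in range)
            continue
        theta = math.radians(i * angle_step_deg)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# A toy 4-reading sweep at 90-degree steps:
# 10 m ahead, 5 m to the left, nothing behind, 2 m to the right.
scan = [10.0, 5.0, None, 2.0]
print(lidar_scan_to_points(scan, angle_step_deg=90.0))
```

A real unit fires many lasers thousands of times per second, but the core idea is the same: angles plus distances become a 3D picture of the world.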
Radar sensors accompany the LiDAR unit. Using wireless sensing technology, they provide more information about how the objects around the vehicle are moving (e.g. motion characteristics, trajectory, a target’s distance). They complement the data that LiDAR collects to give the vehicle a fuller picture of the world around it.
Camera (Computer Vision)
Did someone say picture? Here comes the last component of the imaging trifecta. The camera looks for colours, edges and gradients to identify its surroundings. Using deep neural networks, the vehicle can identify what it sees and classify it: cars are one class, pedestrians another, then road lines, signs, the list goes on.
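To get a feel for what “looking for edges and gradients” means, here’s a tiny toy example. (Real systems run deep neural networks over full camera frames, not four-pixel rows, so treat this purely as an illustration of the idea.)

```python
def horizontal_edges(image):
    """Find vertical edges in a grayscale image (a grid of 0-255 brightness
    values) by taking the difference between neighbouring pixels -- a crude
    version of the gradients a vision system looks for."""
    edges = []
    for row in image:
        edges.append([abs(row[x + 1] - row[x]) for x in range(len(row) - 1)])
    return edges

# A tiny 3x4 "image": dark on the left, bright on the right,
# like the edge of a painted road line.
image = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
]
print(horizontal_edges(image))
# Each row shows a strong response exactly where the brightness jumps.
```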
One REALLY important component that hasn’t come up yet is the GPS.
This isn’t your standard Google Maps GPS. Standard GPS is accurate within 1–2 meters. Seems okay, but when you’re a driverless car, this means that you could be running over pedestrians on the sidewalk. Big no-no. We have to use much more sophisticated mathematical algorithms and high-definition maps to precisely locate the vehicle within single-digit centimetre accuracy.
The vehicle would use various landmarks to figure out where it is in the world based on its proximity. These landmarks are anything from fire hydrants, street lights, even manhole covers. (“What did you do this weekend?” “Ah, visited some landmarks” “Which ones?” “The usual. Street lights and mailboxes”).
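Here’s a brute-force toy version of that landmark idea: given the map positions of a few landmarks and the measured distance to each, search for the spot that best fits all the measurements. (Real cars use far faster probabilistic filters, but the underlying least-squares idea is the same.)

```python
import math

def locate(landmarks, distances, size=20.0, step=0.1):
    """Estimate the car's (x, y) position from measured distances to known
    landmarks by brute-force searching a grid for the best least-squares
    fit. Purely illustrative: real localization never searches like this."""
    best, best_err = None, float("inf")
    n = int(size / step)
    for i in range(n + 1):
        for j in range(n + 1):
            x, y = i * step, j * step
            err = sum((math.hypot(x - lx, y - ly) - d) ** 2
                      for (lx, ly), d in zip(landmarks, distances))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Three landmarks (say, a hydrant, a street light and a manhole cover)
# at known map positions; the car is really at (3, 4).
landmarks = [(0, 0), (10, 0), (0, 10)]
distances = [5.0, math.hypot(7, 4), math.hypot(3, 6)]
print(locate(landmarks, distances))  # close to (3.0, 4.0)
```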
One last *main* component of this vehicle (if we covered all of them, this article would never end). Ultrasonic sensors are usually located in the bumpers and wheel wells. They detect curbs and other vehicles when parking.
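The logic on top of an ultrasonic sensor can be as simple as a few distance thresholds, like a parking assist beeping faster as you get closer. The thresholds below are made up for illustration:

```python
def parking_warning(distance_m):
    """Turn an ultrasonic distance reading (metres to the nearest curb or
    car) into a parking-assist style warning. Threshold values here are
    invented for the example, not taken from any real system."""
    if distance_m < 0.3:
        return "STOP"
    if distance_m < 1.0:
        return "slow down"
    return "clear"

print(parking_warning(0.2))   # STOP
print(parking_warning(0.5))   # slow down
print(parking_warning(2.0))   # clear
```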
Starting to appreciate the complexity? So many components working together so that this vehicle can exist. But wait. There’s more…
So how does it move?
We now have an all-seeing, all-knowing vehicle. Unfortunately, it’s pretty useless if it doesn’t move.
Complex software processes all of this input, creates a trajectory and then sends instructions to the car’s actuators. The actuators control acceleration, braking and steering (thank you, actuators). Hard-coded rules, obstacle avoidance algorithms, predictive modelling and object recognition all help the software choose trajectories and send instructions.
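The simplest possible flavour of “send instructions to the actuators” is a proportional controller: steer back toward the planned trajectory, harder the further the car has drifted, clamped to what the steering hardware can physically do. The gain and limit below are made-up illustration values:

```python
def steering_command(lateral_error_m, gain_deg_per_m=10.0, max_angle_deg=30.0):
    """A tiny proportional steering controller. lateral_error_m is how far
    the car has drifted to the right of its planned trajectory (negative
    means left). Returns a steering angle in degrees, negative = left."""
    angle = -gain_deg_per_m * lateral_error_m  # drifted right -> steer left
    return max(-max_angle_deg, min(max_angle_deg, angle))

# Drifted 0.5 m to the right of the trajectory: steer 5 degrees left.
print(steering_command(0.5))    # -5.0
# Drifted 4 m to the left: the command is clamped to the 30-degree limit.
print(steering_command(-4.0))   # 30.0
```

Real vehicles use far more sophisticated controllers (PID, model-predictive control), but they are all doing some version of this: measure the error, correct it, respect the hardware’s limits.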
Putting it all together
The process of a working autonomous vehicle can be broken down into 5 basic components: computer vision, sensor fusion, localization, path planning, and control.
Computer vision: using camera images to figure out what the world around the vehicle looks like. A deep neural network recognizes and classifies the surroundings. (See Camera (Computer Vision) above for more information.)
Sensor fusion: we augment computer vision by incorporating data from the other sensors (LiDAR, radar, ultrasonic) to get a richer perception of the surroundings. This focuses on the motion of the objects around the vehicle; computer vision handles shapes and identification, while sensor fusion measures how things move.
Localization: once we know what the world looks like, localization figures out where we are in that world. We use the landmarks to find out precisely where the car is.
Path planning: next, we plot a trajectory through this world to where we want to go, from where the car is now to the destination we’ve chosen. The computer literally plots a path from one point to another.
Control: the execution of the other four components. The act of driving the trajectory planned during path planning, using the brakes, the steering wheel and the throttle.
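The whole five-stage loop can be sketched in a few lines. Every function below is a stub I made up so the structure is visible and runnable; none of them are real library calls:

```python
def drive_one_step(camera_frame, lidar, radar, landmark_map, destination):
    """One tick of the five-stage pipeline, with every real subsystem
    replaced by a placeholder so the data flow is easy to see."""
    objects  = computer_vision(camera_frame)              # what is around us?
    tracked  = sensor_fusion(objects, lidar, radar)       # how is it moving?
    position = localize(tracked, landmark_map)            # where are we?
    path     = plan_path(position, destination, tracked)  # how do we get there?
    return control(path, position)                        # brake/steer/throttle

# Stub implementations so the sketch runs end to end.
def computer_vision(frame):            return {"cars": frame.get("cars", [])}
def sensor_fusion(objs, lidar, radar): return {**objs, "speeds": radar}
def localize(tracked, landmark_map):   return landmark_map["start"]
def plan_path(pos, dest, tracked):     return [pos, dest]
def control(path, pos):                return {"steer": 0.0, "throttle": 0.2, "brake": 0.0}

command = drive_one_step({"cars": ["car_ahead"]}, lidar=[], radar=[3.0],
                         landmark_map={"start": (0, 0)}, destination=(100, 0))
print(command)  # the actuator command for this tick
```

A real stack runs this loop many times per second, with each stub replaced by the components described above.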
What problems does this solve?
This is an awesome piece of technology and so, it does have a positive impact on our society. Here are just a few examples:
Traffic? Thing of the past
Sometime in the future: “Open up your history textbooks, we’re going to talk about a thing called traffic. *dramatic pause* Imagine this, you’re on the road and you can’t move because all the other cars around you aren’t moving.” *Class gasps* “How is that even possible?”
Traffic is one of those things you don’t come out of without mild to severe annoyance (especially, if you’re already late). If we can get rid of it, sign me up!
CGP Grey has a great video going into detail about how solving traffic would work but here’s my little summary.
A lot of times, traffic isn’t even caused by severe car accidents like you’d imagine. It’s caused by someone braking too hard (for a valid reason, like a car changing lanes too fast) or getting too close to the car in front and then having to brake hard. Soon, the car behind the braker brakes, and the next, and the next, until someone comes to a complete stop. We know the story from there (*ahem, gridlock*).
This problem of shockwave traffic could be solved if everyone accelerated at the same time and kept the same spacing as everyone else.
Look, if anything, my 14 years on this Earth have taught me that most humans aren’t good at coordination that precise. Especially if they can’t talk to each other. I wonder if we can solve this with anything…
You know ‘em, you love ‘em, automated cars. They are much better at tasks this precise aaand they can communicate with each other much better (a.k.a without sticking their heads out of the window and yelling at each other on a highway).
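Here’s a deliberately over-simplified model of that shockwave. Assume the lead car briefly dips from 100 to 80 km/h, and each human driver, reacting late, slows a bit *more* than the car ahead; coordinated autonomous cars match the slowdown exactly. The numbers are invented purely to show the amplification effect:

```python
def min_speeds(n_cars, overreaction):
    """Toy shockwave model: the lead car dips to 80 km/h. Each following
    driver slows `overreaction` km/h more than the car ahead (roughly what
    late human reactions do); coordinated cars can use overreaction = 0.
    Returns each car's lowest speed during the dip."""
    speeds = [80.0]  # the lead car's lowest speed
    for _ in range(n_cars - 1):
        speeds.append(max(0.0, speeds[-1] - overreaction))
    return speeds

print(min_speeds(10, overreaction=10.0))  # humans: the 10th car stops dead
print(min_speeds(10, overreaction=0.0))   # coordinated: everyone dips to 80
```

With humans, a small dip grows into a full stop a few cars back; with perfectly coordinated cars, the dip passes down the line without growing.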
Most drivers know that the way they drive impacts their car’s MPG, Miles per Gallon (or L/100 km for the rest of the world). Two cars can end up with very different fuel economy after driving the same number of kilometres. Autonomous vehicles can more accurately measure, track and conserve fuel, saving money in the long term. Also, many autonomous vehicles are coming out as hybrids or fully electric, which saves on fuel too.
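A made-up fuel model shows why smooth driving wins. Pretend fuel use is a fixed cost per time step plus an extra cost every time the car accelerates hard; this matches no real car, it just illustrates the principle:

```python
def fuel_used(speed_profile, base=0.05, accel_penalty=0.02):
    """A toy fuel model: a fixed cost per time step plus an extra cost
    proportional to how hard the car speeds up. The constants are
    invented; only the comparison between profiles is meaningful."""
    fuel = 0.0
    for prev, cur in zip(speed_profile, speed_profile[1:]):
        fuel += base + accel_penalty * max(0.0, cur - prev)
    return fuel

smooth = [0, 10, 20, 30, 40, 50, 50, 50]  # gentle, steady speed-up
jerky  = [0, 30, 10, 50, 20, 50, 30, 50]  # stop-and-go style driving
print(fuel_used(smooth) < fuel_used(jerky))  # True: smooth uses less fuel
```

An autonomous vehicle can plan the smooth profile deliberately, every trip, instead of relying on the driver’s mood.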
These days, so much of the trucking process is automated. Why not automate the driving as well? Some truckers take incredibly long journeys across the country to deliver supplies. If we used automated vehicles for these kinds of trips, shipments would get to their destinations faster and more efficiently. We could also program several automated trucks to move in close formation, as if they were one vehicle, delivering even more products even faster.
What problems does this face?
Super promising right? At this time, there are still some wrinkles that autonomous vehicles have to iron out.
LiDAR and Radar
There are still a lot of open questions about LiDAR. First of all, it’s expensive, so we need to find a way to reduce costs there. We’re also still figuring out trade-offs like range vs. resolution, whether LiDAR signals will interfere with each other, and whether there will be enough usable frequencies to support all the autonomous vehicles. We need to solve these issues before these vehicles hit the road.
Living in Canada means we deal with heavy snowfall each winter. These cars will be driving in this weather, possibly before the roads are cleared. How will the cars recognize road lines if they’re covered in water, oil, ice, or debris?
If an autonomous vehicle gets into an accident, who is held accountable? The manufacturer? The passenger? This is a really important question, especially since, in a fully autonomous car, the passenger can’t override the controls.
You’ve probably heard about the trolley problem at one point or another. It has a really interesting application in autonomous vehicles. I’ll give a refresher of one of the more common versions:
You’re next to a trolley track. On the track, there are 5 workers and the trolley is approaching fast. Luckily, in front of you is a lever. This lever will make the trolley switch onto another track with only one worker on it. What do you do? Do you pull the lever and save the five workers or do you do nothing and save the one worker?
This is just one of the many versions. The more complicated ones have the same principle but different people: two elderly people or two children? Two businesswomen or two doctors? Two pregnant women and a criminal, or two mothers? It’s really hard stuff that questions your morals. This problem deserves an article of its own. As AI starts to take over, it might need to make these decisions. Which outcome is for the greater good?
At this point, if you’re still reading, you’re hopefully hyped up about autonomous vehicles and want to know when they’ll be ready. Unfortunately, mainstream use of autonomous vehicles is still a while away. The pandemic managed to slow down the production and research around autonomous vehicles as well. Now we wait. (Or get in on the action and help out 💪)
Parting Words of Wisdom:
“You don’t even realize how complicated driving is until you have to teach a computer to drive.” — Me
References: (A.K.A — A great way to learn more)
How a driverless vehicle works: https://www.youtube.com/watch?v=Ly92UcnoEMY
More on radar sensors: https://www.fierceelectronics.com/sensors/what-a-radar-sensor
Autonomous cars: https://www.synopsys.com/automotive/what-is-autonomous-car.html
A really really cool trolley problem simulator: https://www.moralmachine.net/
CGP Grey’s traffic solution video: https://www.youtube.com/watch?v=iHzzSao6ypE
Shockwave traffic jams: https://www.vox.com/2014/11/24/7276027/traffic-jam
Future of Trucking: https://medium.com/@smarthopapp/the-future-of-trucking-f33086c008f4
Feel free to contact me and chat more about the subject: firstname.lastname@example.org