3rd Gen LiDAR - Technology We Won't Stop Talking About
May 1, 2024 - Travis Simpkins (Owner, ADAS On Call)
​
We went to a wedding last month, and the couple had seated us with the "crypto bros." They are loud and obnoxious and just cannot wait to tell you all about the latest "ground floor opportunity" in some cryptocurrency investment.
​
We got the hint. Other people are not quite as fascinated with news about ADAS technology developments.
​
But... we just have to tell you why we are so excited about the latest generation of LiDAR sensors, and why, because of them, we could have fully autonomous vehicles before Boeing figures out how to keep doors closed on airplanes.
​
LiDAR is short for Light Detection and Ranging. It uses pulses of light instead of radio waves, which makes it faster and far more accurate than its cousin RADAR. (Yes, it does bother us that RADAR isn't RaDAR, but we are going to set that aside for now.)
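If you want the one-line physics behind the "Ranging" part: fire a pulse of light, time how long the echo takes to come back, multiply by the speed of light, and cut it in half. Here is a back-of-the-envelope version in plain Python (our illustration, not anybody's sensor firmware):

C = 299_792_458  # speed of light, meters per second

def range_from_echo(round_trip_seconds):
    # Time-of-flight ranging: the pulse travels out and back, so halve the round trip
    return C * round_trip_seconds / 2

print(range_from_echo(200e-9))  # an echo arriving 200 nanoseconds later puts the target about 30 meters out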
​
These different imaging technologies (cameras, RADAR, and LiDAR) need to provide the vehicle with a complete picture of the dynamic environment around it: the path of the road, other vehicles, fixed obstacles, traffic signals, wayward pets, a ladder falling off a truck, kids so entranced by TikTok that the only direction they look before crossing is down.
​
We are taking a few liberties with the science here (apologies to the applied science geniuses who, despite making pretty much everything possible, never get enough credit).
​
Radio waves aren't very good at detail. They see rough shapes - although they are great at identifying speed and speed changes, which is why they are the default for functions like Automatic Emergency Braking and Adaptive Cruise Control. But the distinction between a garbage bag blowing across a road and a child chasing that bag is an important one.
Cameras identify objects much better. They "see" the same way we do. But a camera image is "flat", two-dimensional. Converting that into a 3-D map of the world around a vehicle takes some processing, which translates to additional time and a lot of processing power. Also, cameras, like us, don't see well in the dark, and they can't see through things. Like trees and shrubbery.
​
LiDAR can see just fine at night and sees through all kinds of things. It was developed for 3-D mapping, so that's kind of its thing. The data from a RADAR signal is comparable to the data from a 3rd generation LiDAR sensor in the same way that a tetherball is comparable to an orbiting satellite.
​
3rd Generation LiDAR sensors are going to be the anchor of Level 3 vehicle autonomy. (For now, just know that a car without a steering wheel or pedals is Level 5, so three is getting pretty far up there.)
​
These new sensors are currently in use or in testing by Mercedes, GM, Volvo, and other OEMs. They are here - and fancy enough that lawsuits about patent infringement are dropping more often than Zach Bryan albums.
​
That thing ahead of you, the one you can't see because you are in a corner and there is foliage... LiDAR can identify it as a child (or a very small adult, or a Hobbit, possibly an Ewok, but presumably a child). LiDAR can calculate his relative Body Mass Index and estimate his response time and "getting out of the way" ability. I made that up, but it could.
It is extremely accurate. With 3D mapping and a predictive database, LiDAR can identify that the kid has a pudding cup in one hand and is licking pudding off the screen of the phone held in his other hand. It is that good.
Note: Yes, eating pudding without a spoon is a gender-neutral activity, but odds-wise that has 8-year-old boy written all over it.
​
So he (or she) is 26" from the curb, over two feet into the roadway. By the time you are at the apex of the corner, he will have taken 1.65 additional steps, placing him (or her) 38" from the centerline of your vehicle. Right where the grille and headlight meet.
​
"Pudding Hands" is 35 yards ahead of you, although because of the Amazon van parked at the beginning of the kid's driveway on the inside of the corner, you won't see them for another 15 yards. This will occur in about 2/3 of a second.
​
It is reasonable to predict that this kid has zero spatial awareness. And none of the athletic moves of a middle linebacker playing in a 4-3. We hope Amazon is delivering something they might use in an outdoor activity, but more likely it is just another case of gluten-free pudding cups.
​
At 50 mph you will cover the remaining 20 yards in 819 milliseconds.
Depending on the weight of your vehicle, road condition, and tire size and condition, the AEB (automatic emergency braking) might be able to bring you down from 50 in that 20 yards. Many collision avoidance systems can and will check for oncoming traffic and steer you around Pudding Hands while braking. A good system with 360-degree visibility will also check the speed and distance of the vehicle behind you, and mitigate the rear-end hit if the AEB has maximum braking applied.
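None of that is any manufacturer's actual code (they guard it like the last pudding cup), but the arithmetic and the order of the checks look roughly like this toy Python sketch, hypothetical names and all:

# The 819-millisecond figure: 20 remaining yards at 50 mph
remaining_ft = 20 * 3                      # 60 feet
speed_fps = 50 * 5280 / 3600               # roughly 73.3 feet per second
print(round(remaining_ft / speed_fps, 3))  # about 0.818 seconds, the ~819 milliseconds above

# A toy version of the decision order described above (simplified, not any OEM's real logic)
def collision_response(oncoming_lane_clear, rear_gap_safe):
    actions = ["apply maximum braking"]              # the AEB always starts here
    if oncoming_lane_clear:
        actions.append("steer around the obstacle")  # evasive steering while braking
    if not rear_gap_safe:
        actions.append("mitigate the rear-end hit")  # the car behind can't stop in time
    return actions

print(collision_response(oncoming_lane_clear=True, rear_gap_safe=False))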
​
But that won't be necessary in this case. Because of LiDAR's ability to map in three dimensions and include the roadway in that map, the AEB is already applying the brakes, bringing the speed down to 20 mph in the 15 yards before you are deep enough into the corner for line-of-sight visibility.
​
You wouldn't even have to hit the horn at this point - although I would personally lay into it as the car came to a gentle stop without any inputs from me. Maybe give him a pudding-dropping scare. It would be a funny thing to laugh at later.
​
Instead of the other thing.
​
-TS