The big headline is understandably that it crashes into a fake painted wall like a cartoon, but that’s not something that most drivers are likely to encounter on the road. The other two comparisons where lidar succeeded and cameras failed were the fog and rain tests, where the Tesla ran over a mannequin that was concealed from standard optical cameras by extreme weather conditions. Human eyes are obviously susceptible to the same conditions, but if the option is there, why not do better than human eyes?
Human eyes are much better at picking shapes out of complex visual clutter than computers are. Not to mention that human eyes have far better dynamic range than cameras (every time you drive into or out of a tunnel, the camera is essentially blind for a couple of seconds until it readjusts). Both points mean that while you absolutely do need the cameras, you can't just rely on them and say "oh well, humans only use their eyes", because the cameras and the computers behind them are nowhere near human-level capable.
I love that the justification for only using cameras is "humans do it". Like, are we not supposed to use any technology that humans can't replicate? Maybe we shouldn't fly planes because humans can't fly. It's such a dumb reason.
Human eyes are way better at this than any camera-based self-driving system. No self-driving system is anywhere close to handling a Swedish winter with bad weather and no sun, yet we Swedes do it routinely by the millions every day.
Whilst I agree on the wall, fog and rain are not extreme weather conditions. I'd rather he had used a less intense level of rain. Still, the fact is lidar worked even though it wasn't clear that it would.
Yep, winter will be the death of 100% self-driving cars. We humans can "filter out" snowflakes easily; computers can't.
That’s interesting. I’ve never had any problems with adaptive cruise in the winter. I’m pretty sure that my V90CC has a heated radar module.
Tesla users with low empathy: who cares?
Seriously though, why ditch lidar?
https://www.cdotrends.com/story/4083/lidar-u-turn-elon-musks-fools-errand-becomes-teslas-secret-weapon
Long story short: Musk is a greedy billionaire who likes to think he's an engineer, and he clearly isn't.
To save Elmo a few bucks.