
Does anyone know how these technologies work at night?


It shouldn't make a difference since, as far as I know, none of the tech they're using is vision-based except for a few of the cameras. The primary systems are LIDAR, radar, or other sensors that don't rely on perfect visual detection, and even the ones that do have a "night vision" mode or some adjustment that allows a similar type of function.


What about street signs, road markings, and even signals? Those must be detected visually, and they certainly look different at night.

Can you create a "normalized" image (hardware- or software-wise) that is independent of the time of day? Is that what you mean by "night-vision" and "modifier"?

I would still imagine that certain light conditions (e.g. just around sunset) could make things difficult.

Would be interesting if anyone knows how they handle this.


They do have cameras for those kinds of things, but most of them are a backup system; the actual signals, signs, and speed limits are embedded in the GPS map data. If you do a Google search, you can find several instances of Teslas speed-limiting a car going down the freeway where the GPS data was incorrect. The car all but ignored what the signs actually said, as long as it could identify that a sign was there.

I'm sure there are light conditions that it can't handle, but there are so many fallback systems that it doesn't have to rely simply on visual data.


Ah, OK, so if the car encounters a pre-mapped street sign, it doesn't bother to actually check the sign.

Hmm... seems a tad dangerous to me. What if the sign is changed, or two signs are in almost identical positions?

I suppose it would be OK if the pre-mapped version were only used whenever a confident visual prediction cannot be made (e.g. due to light, mud, bullet holes, or whatever).
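The fallback being suggested here can be sketched as a tiny decision rule. Everything in it is hypothetical (the function name, the confidence threshold, the units); it only illustrates the "trust the camera when it's confident, otherwise fall back to the map" behavior, which may or may not match what any real system does:

```python
def effective_speed_limit(map_limit_mph, vision_limit_mph, vision_confidence,
                          confidence_threshold=0.8):
    """Hypothetical sketch: use the camera's reading only when it is
    confident; otherwise fall back to the pre-mapped value."""
    if vision_limit_mph is not None and vision_confidence >= confidence_threshold:
        return vision_limit_mph
    return map_limit_mph

# A confident camera reading overrides the map...
effective_speed_limit(55, 65, 0.95)   # -> 65
# ...but a mud-covered or poorly lit sign falls back to the map.
effective_speed_limit(55, 65, 0.40)   # -> 55
effective_speed_limit(55, None, 0.0)  # -> 55
```

The behavior the thread actually describes is closer to the opposite (map wins unconditionally), which is exactly what makes the stale-map case dangerous.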


This is part of the issue being addressed in the current Autopilot. Tesla throttles the Autopilot speed to no more than a certain amount over the posted speed limit. Drivers have already been complaining that the posted limits don't match the pre-mapped limits.
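The complaint reduces to simple arithmetic. The 5 mph allowance below is a made-up figure, not Tesla's actual number; the point is just that when the cap is computed from the pre-mapped limit, a stale map throttles the car regardless of what the sign now says:

```python
def autopilot_max_speed(mapped_limit_mph, allowance_mph=5):
    """Hypothetical: cap = pre-mapped speed limit + a fixed allowance.
    If the map says 45 mph but the posted sign was raised to 55 mph,
    the car still caps itself at 50 mph -- the mismatch drivers report."""
    return mapped_limit_mph + allowance_mph

autopilot_max_speed(45)  # -> 50, even under a 55 mph sign
```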


.... still seems a bit dangerous.

Would also be interesting to hear what the other teams do, but I only remember seeing videos in daylight so far.

Somehow I completely overlooked this issue in the past.


Is the Tesla "Autopilot" vision-based? From what I recall, the fatal accident happened because the trailer was a similar color to the sky, or something like that, so it was not detected as an obstruction.


Parts of it are, yes. The issue with the truck was its height. The logs indicated that the trailer was being ignored as an overhead sign because its height was greater than the height of the cameras. The visual camera data was ignored completely by the system because the truck was perpendicular to the direction of travel.
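The failure mode described above amounts to a height-based classification filter. This is a loose sketch with invented names and an invented 1.4 m threshold, not the actual logic: anything whose underside sits above the sensor height gets binned as an overhead structure and dropped from the braking path, which is exactly how a high trailer underside could be misread as a sign or bridge:

```python
def classify_radar_return(obstacle_bottom_height_m, sensor_height_m=1.4):
    """Hypothetical filter: returns whose lowest point is above the
    sensor height are treated as overhead structures (signs, bridges)
    and excluded from the braking path."""
    if obstacle_bottom_height_m > sensor_height_m:
        return "overhead_structure"  # ignored by braking logic
    return "obstacle"

classify_radar_return(0.5)  # -> "obstacle": a car's bumper
classify_radar_return(1.6)  # -> "overhead_structure": a high trailer underside
```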


Even lane detection? I don't see how signs and road markings could be detected with radar.


It's not all detected with radar; just the primary systems are. There are visual systems, but they're not the primary detection layer. Lane lines are mostly visual, but they filter the input and use location-based data for things like signs and signals.



