Self-Driving Cars Ended Up Stranded in Traffic: Why and How?

How did automated self-driving cars get stuck in the slow lane? Let's find out.

Automated Self-Driving Cars: Problems and Delays

Tesla began offering beta tests of its "Full Self-Driving" (FSD) software to roughly 60,000 Tesla customers in late 2020; each had to pass a safety exam and pay $12,000 for access. These customers test the automated driving-assistance system to help enhance it before it is released to the wider public.

Putting new technology in the hands of inexperienced testers is an unusual approach for the Autonomous Vehicle (AV) sector. Other businesses, such as Alphabet's Waymo, GM's Cruise, and the autonomous vehicle startup Aurora, use trained safety operators to test their technology on predetermined routes. While the move has strengthened Tesla's populist credentials among supporters, it has also proven risky for the company's reputation: ever since Tesla put its technology in the hands of the public, a stream of videos documenting reckless-looking FSD behaviour has racked up views online.

Automated Self-Driving Cars Do Not Follow Road Signs

The NHTSA, for example, ordered Tesla to prevent the system from performing illegal "rolling stops", in which the car slowly rolls through a stop sign without ever coming to a complete stop, while a separate "unexpected braking" concern is currently under investigation. "I haven't even seen it get better," Ogan says of his experience with FSD. "It rather does crazy things with more assurance," he says.
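To make the "rolling stop" behaviour concrete, here is a minimal sketch of how one might flag it in logged speed data. The function, thresholds, and telemetry below are hypothetical illustrations, not any regulator's or vendor's actual tooling.

```python
def is_rolling_stop(speeds_mps, in_stop_zone, full_stop_mps=0.1):
    """Return True if the vehicle passed through the stop zone
    without its speed ever dropping to (near) zero."""
    zone_speeds = [v for v, flag in zip(speeds_mps, in_stop_zone) if flag]
    if not zone_speeds:
        return False  # the vehicle never entered a stop zone
    return min(zone_speeds) > full_stop_mps

# Hypothetical telemetry: the car slows to ~2.5 m/s at the stop
# sign but never actually stops, then accelerates away.
speeds = [8.0, 5.0, 2.5, 3.0, 6.0]        # metres per second
zone   = [True, True, True, True, False]  # True while at the stop line
print(is_rolling_stop(speeds, zone))      # True
```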

Automated Self-Driving Cars Do Not Recognise Pedestrians

The "learner driver" metaphor, according to Maynard, works for some of FSD's problems but breaks down when the technology participates in clearly non-human behaviour. For example, a disregard for driving dangerously close to pedestrians and a Tesla ploughing into a bollard went unnoticed by FSD. Tesla's Autopilot software, which has been linked to at least 12 incidents, has had similar issues.

Challenges with Automated Vehicles:

While around 80% of self-driving is very basic (keeping the car on the right side of the road, avoiding collisions), the next 10% entails more difficult circumstances such as roundabouts and complex junctions. "The last 10% is the toughest," Avery explains. "That's when you have, say, a cow standing in the centre of the road who refuses to move."

The AV sector is stuck in that last 20%, particularly the final 10%, which deals with the perilous subject of "edge cases". These are rare and unusual road situations: a ball bouncing across the street followed by a running child; difficult roadworks requiring the car to mount the kerb to pass; a group of protesters holding signs; or that stubborn cow.

Lack of data for every scenario:

Self-driving cars rely on a combination of machine-learning software and fundamental written rules such as "always stop at a red light". Machine-learning algorithms consume a large amount of driving data in order to "learn" how to drive safely. Because edge cases are rare in that data, the car never learns how to handle them effectively.
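A minimal sketch of that hybrid design follows, assuming a learned policy wrapped by hard-coded safety rules. All names here (Observation, Action, learned_policy) are illustrative placeholders, not any manufacturer's actual software.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    traffic_light: str       # "red", "yellow", "green", or "none"
    obstacle_ahead_m: float  # distance to nearest obstacle, metres

@dataclass
class Action:
    throttle: float  # 0.0 .. 1.0
    brake: float     # 0.0 .. 1.0

def learned_policy(obs: Observation) -> Action:
    # Stand-in for a trained model; here it just cruises.
    return Action(throttle=0.3, brake=0.0)

def drive(obs: Observation) -> Action:
    action = learned_policy(obs)
    # Written rule: always stop at a red light, regardless of
    # what the learned model proposes.
    if obs.traffic_light == "red":
        return Action(throttle=0.0, brake=1.0)
    # Written rule: emergency-brake for close obstacles.
    if obs.obstacle_ahead_m < 5.0:
        return Action(throttle=0.0, brake=1.0)
    return action

print(drive(Observation(traffic_light="red", obstacle_ahead_m=50.0)))
# Action(throttle=0.0, brake=1.0) -- the rule overrides the model
```

The catch described above is that the written rules only cover scenarios someone thought to write down, while the learned part rarely sees edge cases in its training data.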

While humans are able to generalise from one case to the next, a self-driving system that appears to "master" a given situation will not necessarily be able to reproduce that behaviour under slightly different circumstances. It is an issue for which no solution has yet been found.

Limitations of AI:

"A big part of real-world AI remains to be addressed to make unsupervised, universal full self-driving function," Musk tweeted in 2019. In the absence of a breakthrough in AI, driverless vehicles that function as well as people are unlikely to hit the market anytime soon.

To partially get around this challenge, other AV makers utilise high-definition maps that chart the lines of roads and pavements as well as the locations of traffic signs and speed limits. However, these maps must be updated regularly to keep up with ever-changing road conditions, and even then there is no guarantee of accuracy.
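To make the idea concrete, here is a minimal sketch of what one record in such a high-definition map might hold, with a simple staleness check. The schema and the 30-day threshold are assumptions for illustration, not any mapping vendor's format.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class RoadSegment:
    segment_id: str
    lane_boundaries: list   # polylines of (x, y) points along each lane edge
    speed_limit_kph: int
    traffic_signs: list = field(default_factory=list)  # (sign_type, x, y)
    last_surveyed: date = field(default_factory=date.today)

def is_stale(segment: RoadSegment, max_age_days: int = 30) -> bool:
    """Roadworks and signage changes make map data drift out of date;
    stale segments should trigger a re-survey or reduced trust."""
    return date.today() - segment.last_surveyed > timedelta(days=max_age_days)

# Example: a segment surveyed two months ago is flagged as stale.
old = RoadSegment("seg-001", lane_boundaries=[], speed_limit_kph=50,
                  last_surveyed=date.today() - timedelta(days=60))
print(is_stale(old))  # True
```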

The ultimate goal of autonomous vehicle designers is to develop vehicles that are safer than human-driven automobiles. Koopman argues that AV developers would have to outperform the human safety record to establish that their systems are safer. However, he feels that industry-wide metrics such as disengagement data (how often a person must take control to avoid an accident) exclude the most critical aspects of AV safety.
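As a rough illustration of the kind of metric Koopman criticises, the sketch below computes a miles-per-disengagement figure; the fleet numbers are invented purely for illustration.

```python
def miles_per_disengagement(total_miles: float, disengagements: int) -> float:
    """Industry-style headline metric: test miles driven per
    safety-driver takeover."""
    if disengagements == 0:
        return float("inf")
    return total_miles / disengagements

fleet_miles = 1_000_000    # hypothetical AV test miles
fleet_takeovers = 50       # hypothetical safety-driver interventions
print(miles_per_disengagement(fleet_miles, fleet_takeovers))  # 20000.0

# The flaw Koopman points to: easy freeway miles inflate this ratio
# while revealing nothing about rare edge cases, which is exactly
# where the safety risk concentrates.
```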

Lack of regulation:

The lack of regulation so far demonstrates the absence of global consensus in this area. The main question is this: "Is the software going to mature fast enough that it's both trusted and regulators give it the green light, before something truly awful happens and pulls the rug out from under the entire enterprise?"
