Tesla CEO Elon Musk slammed the latest version of the company's experimental driver-assistance software, FSD Beta 9.2, on Twitter on Monday, calling it "actually not great." In the United States, the company sells a Full Self-Driving Capability (FSD) package for $10,000 or $199 per month. Despite the name, this advanced driver assistance system does not make Tesla electric vehicles safe to drive without an attentive driver.
Elon Musk wrote: "FSD Beta 9.2 is actually not great imo, but Autopilot/AI team is rallying to improve as fast as possible. We're trying to have a single stack for both highway & city streets, but it requires massive NN retraining." FSD Beta is available only to Tesla staff and to drivers who have previously purchased FSD. The beta introduces new or revised functionality to the car's premium driver assistance systems. Drivers often agree to keep their experiences private, though some public FSD Beta users are permitted to publish videos on social media demonstrating and criticizing the latest features they have tested on US roads.
Regulators may eventually prohibit testing with untrained drivers on public roads. For now, however, no legislation stands in the way of Tesla's ability to turn its customers, as well as everyone else on the road, into guinea pigs. Musk's scathing post came only days after he bragged about Tesla's expertise in autonomous systems and components at an event called Tesla AI Day. At that event, held last Thursday, Tesla demonstrated a custom chip for training artificial intelligence networks in data centers. The chips will train models that automatically recognize a range of road obstacles in video feeds captured by cameras inside Tesla vehicles.
Tesla claims FSD will add the long-awaited ability to automatically steer on city streets later this year. Auto steer on city streets has been incorporated into FSD Beta, but it remains imprecise and incomplete. Musk's scathing remark comes after federal car safety officials in the United States launched a formal inquiry into Tesla's Autopilot system last week. Autopilot is the most basic version of Tesla's driver assistance technology, and it is now included as standard equipment in all of the company's vehicles. However, according to the NHTSA, Tesla vehicles operating with Autopilot or just traffic-aware cruise control collided with first responder vehicles on at least 11 occasions in the United States, injuring at least 17 people and killing one. This sparked an official investigation into whether Autopilot has safety flaws that the NHTSA may order Tesla to address.