The incidents occurred in multiple locations across the United States, prompting the National Highway Traffic Safety Administration (NHTSA) to open a new inquiry into Tesla's Full Self-Driving (FSD) system.
Video footage shared on social media shows cars equipped with the latest FSD software version veering across lane markings and even entering oncoming traffic before correcting course.
In a statement, the NHTSA said it was aware of "multiple reports of Tesla vehicles engaging in unsafe driving behaviour while using driver-assist features" and confirmed it had requested data from the company to assess whether the software poses a broader public risk.
Tesla's FSD mode, a beta programme currently available to select users, allows vehicles to navigate city streets, respond to traffic signals, and attempt complex manoeuvres with minimal driver input.
However, drivers are still required to maintain control and remain alert at all times.
The company, led by Elon Musk, has faced repeated scrutiny over the branding and performance of its self-driving software.
Earlier this year, the NHTSA forced Tesla to issue an over-the-air recall for more than two million cars, requiring updates to its Autopilot monitoring systems after a string of crashes linked to inattentive drivers.
Safety experts have warned that Tesla’s approach to real-world testing risks putting unproven AI systems on public roads.
Jake Fisher of Consumer Reports said: "This isn’t full autonomy — it’s a test environment with human lives at stake."