Tesla’s “full self-driving” debate now includes homemade dummies and tests on real children.

A North Carolina resident set out to refute a widely shared video in which a Tesla running “full self-driving” beta software, which allows the car to steer, brake and accelerate but requires an attentive driver ready to take the wheel, crashed into child-sized dummies.
Dan O’Dowd, the software company CEO who posted the video earlier this month, thinks the National Highway Traffic Safety Administration should ban “full self-driving” until Tesla CEO Elon Musk “proves he doesn’t mow down kids.”
It was then that Kupani, the owner of an auto shop specializing in imported cars and Teslas, stepped in and recruited his son. Although he calls himself a “BMW guy,” Kupani says BMW’s software can’t match what Tesla has to offer. It also wasn’t the first time he had enlisted his son, who Kupani says is 11, in a potentially viral automotive endeavor: earlier this year he posted a video of his son driving his Model S Plaid, which can hit 60 mph in 1.99 seconds, in a private parking lot. It has been viewed more than 250,000 times.

“Some people look at it and go, ‘Oh, that crazy dad, what is he doing?'” Kupani told CNN Business. “Well, I do a lot of stuff like that, but I’m going to make sure my kid doesn’t get hit.”

Kupani filmed a “full self-driving” test in the parking lot. His son stood at the end of a lane, holding a smartphone to record the run. Kupani accelerated the Tesla from the other side of the lot with “full self-driving” engaged, reaching 35 mph. The Tesla braked hard and came to a complete stop well ahead of his son.
Kupani ran another test with his son on the street using Autopilot, Tesla’s more basic driver-assistance software, and found that it stopped for his son as well. “This guy, Dan, says he’s an expert at this,” Kupani said. “Well, I’m an expert in automotive engineering, future technology, a professional driving instructor.”
Kupani is among the many Tesla supporters who took issue with O’Dowd’s video and decided to create their own tests. Some asked their children to help. Others built homemade mannequins or used inflatable dolls.
The passionate defense and criticism of “full self-driving” highlight how the technology has become a flashpoint for the industry. The California DMV recently said the name “full self-driving” is misleading and grounds for suspending or revoking Tesla’s license to sell cars in the state. Ralph Nader, whose critique of the auto industry in the 1960s helped create the National Highway Traffic Safety Administration (NHTSA), joined the chorus of “full self-driving” critics this month.

But it’s also yet another example of the unintended consequences of deploying an unfinished breakthrough technology in the wild — and shows how far some Tesla supporters are willing to go to protect it and the company. It turns out that so many people are doing their own experiments that one government agency has taken the extraordinary step of warning people not to use children to test car technology.

“Consumers should never attempt to create their own test scenarios or use real people, especially children, to test the performance of vehicle technology,” NHTSA said in a statement on Wednesday. The agency called the approach “extremely dangerous.”

Tesla testing

Earlier this month, California resident Tad Park saw another Tesla enthusiast looking to test “full self-driving” with a child and volunteered two of his own children. Park told CNN Business that it was “a little difficult” to get his wife to agree; she only did so after he promised to be the one driving.

“I’m never going to go over the top because my kids are the most precious thing in the world to me,” Park said. “I’m not going to risk their lives in any way.”

Park’s tests, unlike O’Dowd’s, started the Tesla from 0 mph. The car stopped in all of Park’s tests, avoiding two of his children in the video, including a 5-year-old. Park said he felt uncomfortable running the test at the higher speed of 40 mph, as O’Dowd did using dummies, with his own children.
Toronto resident Franklin Kadamuro built “box boy,” a child-sized figure made from old Amazon cardboard boxes. “Don’t blame me for what the machine does or doesn’t do,” he wrote at the beginning of his video. “I’m a big Tesla fan.”

His Tesla slowed as it approached the box boy. Then it accelerated again and hit the cardboard dummy. Kadamuro speculated that the cameras may not have been able to see the short boxes once they were directly in front of the bumper, so the car effectively forgot they were there.

Human babies learn at around eight months that an out-of-sight object still exists, many years before they qualify for a driver’s license. But that ability may still elude some AI systems, such as Tesla’s “full self-driving.” Another Tesla fan found a similar result.

Kadamuro said his video started as entertainment, but he wanted people to see that “full self-driving” is not perfect.

“I found that a lot of people have two extreme views on the ‘full self-driving’ beta,” Kadamuro said. “People like Dan think this is the worst thing in the world. I know some friends who think it’s almost perfect.”

Kadamuro said he also ran other tests in which his Tesla, traveling at higher speeds, successfully avoided the box boy.

According to Raj Rajkumar, a Carnegie Mellon University professor who researches autonomous vehicles, quickly and accurately detecting small objects, such as small children, will generally be harder than detecting large objects and adults for a computer-vision system like the one Tesla’s cars rely on.

The more pixels an object occupies in a camera image, the more information the system has to detect features and identify the object. The system will also be affected by the data it is trained on, such as the number of images of young children it is exposed to.

“Computer vision with machine learning is not 100% reliable,” Rajkumar said. “Like diagnosing a disease, there are always false positives and negatives.”
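Rajkumar’s point about false positives and negatives can be illustrated with a toy sketch. The numbers and scenarios below are invented for illustration and have nothing to do with Tesla’s actual perception system; they only show how a single confidence threshold trades missed detections against phantom ones:

```python
# Toy illustration of detection trade-offs; all scores are made up.
# Each tuple: (is an object actually present?, detector confidence score)
frames = [
    (True, 0.92),   # large, close pedestrian: many pixels, easy to detect
    (True, 0.41),   # small child far away: few pixels, low confidence
    (False, 0.12),  # empty road
    (False, 0.55),  # shadow mistaken for an obstacle
]

THRESHOLD = 0.5  # the detector reports an object only above this score

false_negatives = sum(1 for present, score in frames
                      if present and score < THRESHOLD)
false_positives = sum(1 for present, score in frames
                      if not present and score >= THRESHOLD)

print(false_negatives, false_positives)  # prints "1 1": one missed child, one phantom
```

Raising the threshold suppresses phantom braking but misses more small, low-confidence objects; lowering it does the reverse. Neither setting eliminates both error types, which is the sense in which such systems are “not 100% reliable.”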

Tesla, which generally does not engage with the professional news media, did not respond to a request for comment.

“Wild West Chaos Rules”

After criticism from Tesla fans of his original tests, O’Dowd released another video this month.

Some Tesla supporters criticized O’Dowd’s use of cones as lane markings in his initial tests, which may have limited the sedan’s ability to swerve around a dummy. Others claimed that O’Dowd’s test driver forced the Tesla to hit the dummy by pressing the accelerator pedal, which could not be seen in the videos O’Dowd posted. Some Tesla enthusiasts also pointed to blurry messages on the Tesla’s screen as an indication that the test driver was pressing the accelerator to rig the tests.


O’Dowd told CNN Business that the blurry messages concerned the unavailability of boost and uneven tire wear. CNN Business was unable to independently verify what the messages said because O’Dowd did not provide clearer video of what was happening inside the car during testing.

In his second video, O’Dowd tested without cones on a residential street and showed the Tesla’s interior, including the accelerator pedal. The Tesla, as in O’Dowd’s other tests, hit a child-sized dummy.
Earlier this year, O’Dowd lamented in an interview with CNN Business that no independent testing body examines the code behind “full self-driving.” The US government has no performance standards for automated driver-assistance technologies such as Autopilot.

O’Dowd is the founder of the Dawn Project, whose goal is to make computers safe for mankind. This year, he ran unsuccessfully for the US Senate in a campaign solely focused on his criticism of “full self-driving.”

NHTSA is currently investigating Tesla’s driver-assistance technology, so changes may be ahead.

“The software that drives the lives of billions of people in self-driving cars must be the best software ever written,” O’Dowd said. “We’re using the Wild West’s rules of absolute chaos, and we got something terrible.”
