Eugène Lefebvre was pushing a new technology to its limits.
Lefebvre was riding in a machine the media had derided as unsafe and unready. Editorials denounced the developers of the device as “bluffers” and “liars.” Yet enthusiasts like Lefebvre continued to “push the envelope.”
Lefebvre loved to show off his skill with the rapidly developing technology. A true believer in the new device, he went off script, improvising astounding moves never attempted before. The crowd of observers at the event was stunned by the maneuvers. In a moment, the event and its participants, including Lefebvre, bestowed instant credibility on an entire new industry, and on the maligned machine’s developers.
In that moment, the media, titans of industry and the proverbial crowned heads of Europe saw that the technology was real. They realized the vast potential of the device, and the world was changed by this revolutionary development. It wasn’t a driverless car.
Sadly, just nine days after that first international air race, the 1909 “Grande Semaine d’Aviation” at Reims, France, Eugène Lefebvre, test pilot, pushed the envelope too far. Not far from the site of his triumphal display of aerial derring-do, Lefebvre’s “Wright Flyer” crashed, killing him. He was only thirty years old.
A Dilemma: Safety vs Progress
The safety of the machine was immediately called into question, and justifiably so. However, Lefebvre’s own antics at the air show had earned him the derision of the event’s organizers for exhibiting “excessive recklessness and daring.” The New York Times account of the meet noted that Lefebvre had buzzed the crowd at low height, “. . . driving at the crowded tribunes, turned in the nick of time, went sailing off, swooped down again till he made the flags on the pillars and the plumes on the ladies’ hats flutter, and so played about at will for our applause.”
Wilbur Wright spent much of the remainder of his life vigorously – and often vehemently – defending the safety of the invention and arguing for the superiority of his technology. Yet Orville and Wilbur Wright’s flying machines would also take the lives of eleven more U.S. Army test pilots in the next few years. Since Lefebvre, hundreds, perhaps thousands of test pilots have surrendered their lives while perfecting air transport as we know it today.
The reality is test pilots die.
This is not to diminish the services of test pilots in any way, reduce their humanity to a number or to portray them as cogs in a machine. It is because of their bravery and sacrifice that we have achieved another reality: Traveling by commercial airline today is the safest form of transport on the planet.
Joshua Brown was not a test pilot. But he was perhaps the 21st century equivalent to a test pilot: he was a driverless technology “early adopter.” The ex-Navy SEAL and small business owner had recently purchased a Tesla Model S and, like Lefebvre before him, was pushing the new technology to its limits.
The Tesla Model S, pioneered by serial entrepreneur Elon Musk, is not a driverless vehicle but features a first-to-market driverless technology labeled “Autopilot.” This feature allows the driver to temporarily relinquish full control of the car to . . . the car.
Like Lefebvre, Brown was a true believer in what the device could do, often posting the results of his “flight” experiments on social media. Like the crowds at that first air show in Reims, fans of Tesla and Brown were wowed by the capabilities of the car and the passion of its “pilot.”
Via its onboard sensor array and artificial intelligence analytics, the Model S can perform maneuvers that no other conventional car with “cruise control” can achieve. In one notable video post, Brown’s Tesla adjusts for a last-moment lane change by a truck. The onboard computer reacts and corrects its course even as the truck straddles both the lane it was in and the Tesla’s lane.
Joshua Brown was not a Tesla employee, rigorously analyzing the capabilities of the machine. Nor was he a test pilot like Lefebvre, pushing the performance envelope to win the hearts and minds of doubters. He was an end user. Yet like Lefebvre, this “test pilot” died tragically. The onboard computer of Brown’s Model S failed to recognize and react to a tractor-trailer crossing in front of the Tesla on a Florida (U.S.) highway.
As they collided perpendicular to each other, the roof of the car was sheared off by the undercarriage of the truck. The vehicle, which Brown had nicknamed “Tessy,” continued on after the collision, barreling through two fences before striking a power pole and coming to rest. Brown was only forty years old.
A Blame Game
Unresolved is whether the technology, the barnstorming believer Brown, or the driver of the tractor-trailer was primarily at fault for the accident. Preliminary analysis suggests that “Tessy” may have been speeding (most likely at the instruction of the driver). Some evidence at the wreck site indicates Brown may have been watching a “Harry Potter” video at the time of the crash. Was Brown, like Lefebvre before him, exhibiting “excessive recklessness and daring”?
According to the New York Times, “The accident may have happened in part because the crash-avoidance system is designed to engage only when radar and computer vision systems agree [emphasis added by H2] that there is an obstacle, according to an industry executive with direct knowledge of the system.” It appears the computer may not have been able to discern the white paneling of the truck against the bright sky that day.
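The “agreement” requirement the Times describes can be sketched in a few lines of code. This is purely a hypothetical illustration of the reported design, not Tesla’s implementation; the function name and inputs are invented:

```python
# Hypothetical sketch of a two-sensor "agreement" gate: emergency braking
# engages only when BOTH radar and computer vision report an obstacle.
# Names and logic here are illustrative, not Tesla's actual code.

def should_brake(radar_sees_obstacle: bool, vision_sees_obstacle: bool) -> bool:
    # Requiring agreement suppresses false alarms (e.g., radar echoes
    # from overhead road signs), but it also means one blinded sensor
    # can veto a correct detection by the other.
    return radar_sees_obstacle and vision_sees_obstacle

# The Florida scenario as reported: radar may register the trailer, but
# vision cannot separate its white panel from the bright sky behind it.
print(should_brake(radar_sees_obstacle=True, vision_sees_obstacle=False))  # prints False
```

The design choice is a trade-off: demanding consensus reduces phantom braking at the cost of missing obstacles that only one sensor can see.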
Tesla was slow to publicly divulge its Autopilot feature’s involvement in the accident, which is now under investigation by the U.S. National Transportation Safety Board. Once the news was out, however, the company and founder Elon Musk, like Wilbur Wright before him, vigorously defended the safety of the invention and argued for the superiority of the technology. Critics voiced concerns that the Autopilot feature had been released to the public prematurely and lulled drivers into a false sense that nothing could go wrong.
Musk has long argued that driverless cars will actually improve safety and often points to statistics that back up this viewpoint. “This is the first known fatality in just over 130 million miles where Autopilot was activated,” the company said in its blog. “Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”
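Taken at face value, those figures invite a quick back-of-the-envelope comparison (critics note the comparison is imperfect, since Autopilot miles skew toward divided highways):

```python
# Comparing the miles-per-fatality figures quoted in Tesla's blog post.
autopilot = 130e6   # miles per fatality with Autopilot active (one known death)
us_all = 94e6       # miles per fatality, all US vehicles
world = 60e6        # miles per fatality, worldwide

print(round(autopilot / us_all, 2))   # Autopilot logged ~1.38x the US miles per fatality
print(round(autopilot / world, 2))    # and ~2.17x the worldwide figure
```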
Public Beta Mode?
Tesla’s Autopilot is, according to the company, in a “public beta” mode. “When drivers activate Autopilot,” the company said in its explanatory blog post, “the acknowledgment box explains, among other things, that Autopilot ‘is an assist feature that requires you to keep your hands on the steering wheel at all times,’ and that ‘you need to maintain control and responsibility for your vehicle’ while using it.” Additionally, every time Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.”
The company adds that, “The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.”
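The escalation the company describes (check, alert, then slow) can be sketched as a simple once-per-second state update. The grace periods and slowdown rate below are invented for illustration; Tesla does not publish them:

```python
# Illustrative sketch of the hands-on escalation described above: periodic
# checks, then visual/audible alerts, then a gradual slowdown until hands
# are detected again. All thresholds and rates are invented for the demo.

def autopilot_tick(hands_on: bool, seconds_hands_off: float, speed_mph: float):
    """Advance one second; return (seconds_hands_off, speed_mph, alert)."""
    if hands_on:
        return 0.0, speed_mph, None                # hands detected: reset timer
    seconds_hands_off += 1.0
    if seconds_hands_off < 15:                     # brief grace period
        return seconds_hands_off, speed_mph, None
    if seconds_hands_off < 30:                     # warn the driver
        return seconds_hands_off, speed_mph, "visual_and_audible_alert"
    # Still hands-off: bleed speed gradually rather than braking hard.
    return seconds_hands_off, max(speed_mph - 1.0, 0.0), "slowing_down"
```

The point of the gradual slowdown, rather than an abrupt stop, is to fail safely without creating a new hazard for traffic behind the car.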
Is Society Ready?
Critics, such as Dr. Patrick Lin, argue that there is more at stake than the death of one “test pilot.” Ethical questions arise, Lin writes on the IEEE’s website (IEEE.org).
These include the question of whether society is prepared for deaths like Brown’s, which might not have occurred had Brown been fully attentive and participating in driving the vehicle. So, even while saving lives, the technology may also cost other lives. But, Lin says, these are “different” lives.
“So, when we speculate whether the broader public would accept robot cars that are imperfect but still safer than today’s cars,” Lin says, “we should remember that it’s more than about the numbers. Would we really accept higher statistical safety, if that came with new risks and accident types that we could easily avoid today?”
Lin suggests, “. . . we’re very uncomfortable with the possibility that a robot car might choose to take a life, if it’s the lesser of two unavoidable evils. . . the debate isn’t over just by pointing at numbers. Consumer adoption of self-driving vehicles won’t just be driven by logic and statistics, but also by perceptions and emotions.”
The Beginning of the End . . . or Just the End of the Beginning?
Carmaker Tesla and the driverless car concept may have taken a public relations beating from this tragic incident. The Tesla Autopilot development team reportedly held a meeting to discuss and deal with the emotions sparked by the event.
In searching for some greater good in Brown’s unnecessary and possibly preventable death, we must arrive at the point where we understand there is, and will be, a human cost to managing the carnage on the highways, a carnage to which we have grown much too numb.
Everyone agrees with Elon Musk that the technology will improve, perhaps in part because of this event. It may even be perfected someday, though some doubt it. As safe as air travel is, airplanes still crash and even disappear.
But for managers of transportation systems, it is time to begin pondering how our operations and processes will be affected when we are no longer interacting with a customer who is a person, but with one that is a car.
Until then, however, the reality is test pilots die.
(A version of this article was published previously in PARKING magazine. It has been edited, updated and is reprinted here by permission of the author.)