First published on Tech.pinions
From the June 22 Wall Street Journal:
In the first 24 hours of the launch of an electric-scooter pilot program in the city of Hoboken, N.J., the local police department received more than 1,500 complaints and comments about the scooters, its police chief said.
Since the May 20 launch, a steady stream of complaints has rolled into the Hoboken Police Department. During that time, the department has also taken nine reports of collisions involving scooters and parked cars or pedestrians, the worst of which occurred when an 11-year-old rider struck a pedestrian, who needed stitches.
“The number of issues about e-scooters has matched all other traffic complaints for the year, and this is only in a month,” Hoboken Police Chief Ken Ferrante said in an interview Thursday.
This is the physical manifestation of all that’s wrong with many tech companies, notably Uber, Facebook, and Google: they throw out a new innovation without preparing for or caring about the consequences.
While the new idea might be exciting, novel, and even beneficial, these companies seem unable to understand, or too lazy to worry about, the unintended consequences. Instead of thinking like a chess player, looking several moves ahead, they’re like kids playing marbles.
The typical retort is that they never imagined their technology being used in ways they didn’t intend:
“We never imagined that people could be injured riding scooters without helmets among pedestrians, or that scooters would be left anywhere, blocking doorways and sidewalks.”
“We never imagined that targeted advertising could be used by adversarial countries to spread fake news during an election.”
“We never imagined that pedophiles would go after kids watching cartoons on YouTube.”
“We never expected an Uber driver to attack a passenger.”
But that argument, used by all of these companies, is wearing very thin after so many miscues. For Facebook in particular, the reputations of Zuckerberg, Sandberg, and the company itself have plummeted. Yet the company is so big and profitable that it just doesn’t care.
When companies evaluate a new product or feature, they do it on the basis of its profitability: will it bring in more revenue than it costs to implement? But if they never consider the cost to clean up, filter out the trolls, do better screening, and maintain a healthy environment, their profitability calculations will be all wrong. In many cases the cleanup costs, whether for scooters, YouTube videos, or Facebook feeds, are substantial. So substantial, in fact, that they resist doing what’s needed because it would throw their initial evaluation way out of whack.
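To make that arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. All of the figures (revenue, build_cost, cleanup_cost) are invented for illustration; they do not come from any company’s actual books.

```python
# Hypothetical product evaluation, illustrating how ignoring cleanup
# costs distorts ROI. All figures below are invented for illustration.

revenue = 10_000_000       # projected annual revenue
build_cost = 4_000_000     # cost to implement and launch
cleanup_cost = 5_000_000   # moderation, screening, support, damage control

# The evaluation as commonly done: revenue versus build cost only.
naive_roi = (revenue - build_cost) / build_cost

# The evaluation once the cost of maintaining a healthy
# environment is included in the total investment.
true_roi = (revenue - build_cost - cleanup_cost) / (build_cost + cleanup_cost)

print(f"ROI ignoring cleanup:  {naive_roi:.0%}")   # 150%
print(f"ROI including cleanup: {true_roi:.0%}")    # 11%
```

Under these made-up numbers, the same product goes from looking spectacular to barely breaking even, which is exactly why, once the cleanup bill arrives, companies resist paying it.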
And when they do react, they outsource the cleanup, as Facebook has, to make it less visible to the employees who created the mess in the first place. That eliminates the feedback loop that would prevent it from happening all over again. They’re motivated to keep repeating these mistakes to keep their investors happy and their stock price up.
This is not how products used to be evaluated when consequences were taken much more seriously, and it is not how they are still evaluated in other sectors of our economy. Hardware products have a cost to manufacture, to test, and to market, but they also carry the costs of warranties, ongoing engineering, replacing defects, and customer support. The ROI is more predictable. But with these new companies, caution and due diligence seem to be an afterthought. Their mantra is to throw out the product, see how well it works, and worry about fixing it later. Too often they never get around to the fixing because they’re on to their next new thing.
The proliferation of scooters in cities around the world is just another demonstration, albeit a more visible one, of these tech companies’ uncaring and selfish approach.
What’s disturbing is that these companies’ financial success, and their ability to get away with so much, is likely inspiring others to follow the same behavior. Witness Boeing’s approach to the 737 MAX. Boeing believed that it, too, could “throw out” a new plane with serious design flaws, without proper training, adequate testing, or an understanding of the consequences. And we know how well that worked.