Uber Driverless Car Kills Pedestrian; Police Say Uber May Not Be At Fault - Page 2 - Politics Forum.org | PoFo


#14898848
Godstud wrote:Perhaps not. It's an opinion. What happens next will depend on facts, not what people "feel".


It has nothing to do with "feel". If a driverless car cannot detect a pedestrian ahead on an open road it's a major fail. I guess a human could be excused in such a situation, but not a machine full of freaking sensors. Driverless cars might have other deficiencies, so they must at least manage the obvious situations flawlessly.
#14898933
Why would the capitalists care? Driverless cars mean they will be able to fire more people with driving jobs thus pushing up the (((shareholders))) dividends. The rich won't be the ones getting run over. Since when have the rich walked anywhere? Capitalism 101.
#14898963
Oh, FFS Decky, now you need maintenance people to fix and oversee the cars. Jobs aren't gone, but merely changed. God damned commies!!!
Last edited by Godstud on 23 Mar 2018 00:09, edited 1 time in total.
#14898997
Godstud wrote:Oh, FFS Decky, now you need maintenance people to fix and oversee the cars. Jobs aren't gone, but merely changed. God damned commies!!!

So you mean to tell me that, unlike the cars we have today, driverless cars require maintenance and repair?

That's sort of strange.

Actually they would probably need less maintenance, because they would probably be replaced more quickly, especially as they depend on continuing technical advancement. This would probably be built into the analysis. In some places, like Japan, there is already far more planned obsolescence of cars than you are aware of or familiar with: they use registration taxation to achieve it, which is exorbitant in the first instance and becomes more exorbitant the longer you own the car, and the Japanese bend over, clench their teeth, and take it, as they are wont to do on many things.
#14899029
Seems to me that finding a pedestrian with a big bike in the dark in perfect weather with completely clear roads would be a task an automated car should be good at. F*ing thing has radar and lidar.
#14899708
Godstud wrote:Because most humans are shit drivers, and there are times when humans should not drive, and yet they do anyways.


So the real answer is Moolah for Uber selling useless tech at a high price.

Most humans will still be shit drivers, and will be even shitter drivers with this shit doing it for them. When they have to switch off Autodrive they'll be shit at the normal stuff. They won't know what the fuck to do in an emergency just like this one.

H. G. Wells was right: we are turning into useless, dumb Eloi who one day won't even be capable of driving.
#14917765
The Unavoidable Folly of Making Humans Train Self-Driving Cars

UPDATE: On Thursday, May 24, the National Transportation Safety Board released its preliminary report from its investigation into Uber's fatal self-driving car crash in March in Tempe, Arizona.

The report explains that although the car's software determined it needed to stop to avoid hitting the woman crossing the street, it wasn't programmed to make emergency braking maneuvers. Instead of risking “erratic vehicle behavior” by letting the car decide when to slam on the brakes, Uber relies on the human behind the wheel to take control when necessary.

Problem is, Uber's safety driver wasn't looking at the road in the moments leading up to the crash. This story about why humans are ill-suited to this kind of work originally ran on March 24, 2018. We've updated it to include new details from the report.


So they are blaming the driver when the car's programmers failed to program the car to stop.

Isn't the point of a self-driving car to prevent accidents? What is the point of programming a self-driving car not to brake in a potentially life-threatening situation?

The National Transportation Safety Board's preliminary report from its investigation into the crash reveals that the car's sensors detected Herzberg about six seconds before the crash, and that the software classified her as an unknown object, then as a vehicle, and finally as a pedestrian.

Less than two seconds before impact, the car determined it needed to stop. But it couldn't. "According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior," the report says. "The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator." But video shows that the car's operator, Rafaela Vasquez, wasn’t watching the road in the moments leading to the crash. She told NTSB investigators she was looking at the system's interface, which is built into the center console.
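The decision flow the report describes — detect, classify, decide the car needs to stop, but leave emergency braking disabled and fall back on an unalerted human — can be sketched as a toy illustration. Every name, threshold, and branch below is a hypothetical assumption for illustration, not Uber's actual software:

```python
# Toy sketch of the decision flow described in the NTSB preliminary report.
# All function names, thresholds, and logic are illustrative assumptions,
# NOT Uber's actual code.

def plan_response(time_to_impact_s: float,
                  emergency_braking_enabled: bool,
                  alert_operator_enabled: bool) -> str:
    """Return the action the system takes for a detected object ahead."""
    if time_to_impact_s > 2.0:
        # Still tracking/classifying the object (unknown -> vehicle -> pedestrian).
        return "track_object"
    if emergency_braking_enabled:
        # The case the report says was disabled under computer control.
        return "emergency_brake"
    if alert_operator_enabled:
        # The report says the system was not designed to do even this.
        return "alert_operator"
    # What the report describes: no braking, no alert, human is expected to act.
    return "rely_on_operator"

# Configured as the report describes: braking disabled, no operator alert.
print(plan_response(1.3,
                    emergency_braking_enabled=False,
                    alert_operator_enabled=False))  # rely_on_operator
```

With both safeguards switched off, the only branch left at under two seconds to impact is relying on the human — which is exactly the gap the posts below are arguing about.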

And so, along with the entire notion that robots can be safer drivers than humans, the crash casts doubt on a fundamental tenet of this nascent industry: that the best way to keep everyone safe in these early years is to have humans sitting in the driver’s seat, ready to leap into action.

This is fucking retarded.

What is the point of expecting a driver to just sit there with his hands on the wheel and be ready to take control over a vehicle at any given time when the car's programming just decides to crap out?

In further reading, the main problem is that the driverless vehicle operators are idiots who can't stop playing around with their phones while driving.

So they are put in a driverless car and are expected to sit through traffic and be expected to take control over the vehicle. Sounds like a really stupid game.

Dozens of companies are developing autonomous driving technology in the United States. They all rely on human safety drivers as backups. The odd thing about that reliance is that it belies one of the key reasons so many people are working on this technology. We are good drivers when we’re vigilant. But we’re terrible at being vigilant. We get distracted and tired. We drink and do drugs. We kill 40,000 people on US roads every year and more than a million worldwide. Self-driving cars are supposed to fix that. But if we can’t be trusted to watch the road when we’re actually driving, how did anyone think we’d be good at it when the robot’s doing nearly all the work?

“Of course this was gonna be a problem,” says Missy Cummings, the director of the Humans and Autonomy Laboratory and Duke Robotics at Duke University. “Your brain doesn’t like to sit idle. It is painful.”

In good faith, people really thought the safety drivers were going to do a good job.

In 2015, Cummings and fellow researchers ran their own test. “We put people in a really boring, four-lane-highway driving simulator for four hours, to see how fast people would mentally check out,” she says. On average, people dropped their guard after 20 minutes. In some cases, it took just eight minutes.

Everyone developing self-driving tech knows how bad humans are at focusing on the road. That’s why many automakers have declined to develop semiautonomous tech, where a car drives itself in a simple scenario like highway cruising, but needs a person to supervise and grab the wheel when trouble seems imminent. That kind of system conjures the handoff problem, and as Volvo’s head of safety and driver-assist technologies told WIRED in 2016, "That problem's just too difficult.”

The problem for the companies eager to skip that icky middle ground and go right for a fully driverless car is that they believe the only way to get there is by training on public roads—the testing ground that offers all the vagaries and oddities these machines must master. And the only reasonable approach—from a pragmatic and political point of view—to testing imperfect tech in two-ton vehicles speeding around other people is to have a human supervisor.

“I think, in good faith, people really thought the safety drivers were going to do a good job,” Cummings says. In a rush to move past the oh-so-fallible human, the people developing truly driverless cars doubled down on, yes, the oh-so-fallible human.

That’s why, before letting them on the road, Uber puts its vehicle operators through a three-week training course at its Pittsburgh R&D center. Trainees spend time in a classroom reviewing the technology and the testing protocols, and on the track learning to spot and avoid trouble. They even get a day at a racetrack, practicing emergency maneuvers at highway speeds. They’re taught to keep their hands an inch or two from the steering wheel, and the right foot over the brake. If they simply have to look at their phones, they’re supposed to take control of the car and put it in park first.

There’s a sense of complacency when you’re driving the same loops over and over, and you trust the vehicle.

Working alone in eight-hour shifts (in Phoenix they earn about $24 an hour), the babysitters are then set loose into the wild. Each day, they get a briefing from an engineer: Here’s where you’ll be driving, here’s what to look for. Maybe this version of the software is acting a bit funky around cyclists, or taking one particular turn a little fast.

And constantly, they are told: Watch the road. Don’t look at your phone. If you’re tired, stop driving. Uber also audits vehicle logs for traffic violations, and it has a full-time employee who does nothing but investigate potential infractions of the rules. Uber has fired drivers caught (by other operators or by people on the street) looking at their phones.

Still, the vigilance decrement proves persistent. “There’s fatigue, there’s boredom,” says one former operator, who left Uber recently and requested not to be named. “There’s a sense of complacency when you’re driving the same loops over and over, and you trust the vehicle.” That’s especially true now that Uber’s cars are, overall, pretty good drivers. This driver said that by early this year, the car would regularly go 20 miles without requiring intervention. If you’re tooling around the suburbs, that might mean an hour or more. As any RAF cadet watching a broken clock in a cabin could tell you, that’s a long time to stay focused. “You get lulled into a false sense of security,” the driver says.
#14917871
NTSB wrote:"The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator."

So the driver is just a scapegoat? "Just sit here and do nothing. We'll throw you under the bus if we need you."

They literally programmed the car to ignore hazards. No braking, no swerving, and no warning. Uber should be charged with manslaughter.
#14921071
Godstud wrote:The person was invisible up until the lights were on her. I doubt a human would have reacted any better.

Exactly.
The most important point is that, in a world of cars, humans are not free to walk freely. They must be constantly vigilant and follow a set of strict rules about where to cross streets, and they must pay attention to drivers who don't pay attention. If pedestrians don't follow all the rules and walk only where the law confines them, they may be instantly killed: our capital punishment is death by fast-moving multi-tonne object.

The everyday use of the car has killed our natural way of moving around our environment, of socializing, and of playing on streets with neighbors (as kids), and this is a much more serious social problem than our car-propaganda-addled brains are able to comprehend.
#14921088
QatzelOk wrote:The most important point is that, in a world of cars, humans are not free to walk freely. They must be constantly vigilant and follow a set of strict rules regarding where to cross streets, and they must pay attention to drivers who don't pay attention. If pedestrians don't follow all the rules and walk as they are confined to do so by the law, they may be instantly killed: our capital punishment is death by fast-moving multi-tonne object.

QFT.
Here in Bangladesh the people coming from the village are not used to fast moving vehicles. The fastest many of them have experienced is an ox cart.
So they are not trained to swiftly jump out of the way when we drive our automobiles.
It is really hampering the pleasure we could experience at higher speeds.
C'est la vie.
#14921662
Ter wrote:QFT.
Here in Bangladesh the people coming from the village are not used to fast moving vehicles. The fastest many of them have experienced is an ox cart.
So they are not trained to swiftly jump out of the way when we drive our automobiles.
It is really hampering the pleasure we could experience at higher speeds.
C'est la vie.

In less developed countries, car infrastructure is less developed and death is random and instant.

In more developed countries, the people are used to coping with cars, and their deaths are slow and boring, from not being able to walk or play in streets casually.

In both cases, cars ruin the societies that they are permitted to dominate.

Driverless cars are just a new fad for brainless consumers to follow.