this post was submitted on 27 Mar 2025
663 points (94.6% liked)

Technology
[–] [email protected] 60 points 6 days ago (2 children)

They're super conservative. I rode in one just once. There was a parked ambulance about 30 feet down a side street with its lights on while paramedics helped someone. The car wouldn't drive forward through the intersection. It just detected the lights and froze. I had to get out and walk. If we all drove that conservatively we'd have fewer accidents, but we'd also congest the city to undrivability.

[–] [email protected] 20 points 6 days ago* (last edited 6 days ago) (1 children)

Back in February, I took a Waymo for the first time and was amazed at first. But then, in the middle of an empty four-lane road, it abruptly slammed on the brakes, twice. There was literally nothing in the road: no cars, and because it was raining, no pedestrians in sight.

If I had been holding a drink, it would have spelled disaster.

After the second abrupt stop, I was bracing for more for the remainder of the ride, even though the car generally went quite slow most of the time. It also had a strange habit of drifting between lanes through intersections and using the turn indicators like it had no idea where it was going: it kept alternating from left to right.

Honestly it felt like being in the car with a first time driver.

[–] [email protected] 12 points 6 days ago

Maybe the reason they crash less is that everyone around them has to be extremely careful with these cars. It's like in my country, where we put a big L on the rear of the car for first-year drivers.

[–] [email protected] 14 points 6 days ago (1 children)

How long ago was that? Last year I took a couple of rides near Phoenix and they did great, lights or no. The hardest part was dropping me off at the front of a hotel, as people were in and out and cars were everywhere. Still didn't have issues, just slowed down to 3mph when it had 15 years left or so.

[–] [email protected] 32 points 6 days ago (1 children)

just slowed down to 3mph when it had 15 years left or so

Damn, spending 15 years in a car going 3mph sounds terrible.

[–] [email protected] 14 points 6 days ago

Haha, yeah I didn't check that, was eating. 15 yards. I'm actually still sitting there.

[–] [email protected] 77 points 6 days ago* (last edited 6 days ago) (3 children)

Because they are driving under near-ideal conditions, in areas that are completely mapped out, guided away from roadworks and away from "confusing" intersections and other traffic situations, like unmarked roads, that humans deal with routinely without problem.
And in a situation they can't handle, they just stop, call for a human driver, and wait to get going again, regardless of whether they are blocking traffic.

I'm not blaming Waymo for doing it as safe as they can, that's great IMO.
But don't make it sound like they drive better than humans yet. There is still some way to go.

What's really obnoxious is that Elon Musk claimed this would be 100% ready by 2017. Full self driving, across America, day and night, safer than a human. I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.

[–] [email protected] 23 points 6 days ago (3 children)

You’re not wrong, but arguably that doesn’t invalidate the point: they do drive better than humans, because they’re so much better at judging their own limitations.

If human drivers refused to enter dangerous intersections, stopped every time things started to look dangerous, and handed off to a specialist to handle problems, driving might not produce the mountain of corpses it does today.

That said, you’re of course correct that they still have a long way to go in technical driving ability and handling of adverse conditions, but it’s interesting to consider that simple policy effectively enforced is enough to cancel out all the advantages that human drivers currently still have.

[–] [email protected] 20 points 6 days ago* (last edited 6 days ago) (2 children)

You are completely ignoring the under ideal circumstances part.
~~They can't drive at night AFAIK~~, they can't drive outside the area that is meticulously mapped out.
And even then, they often require human intervention.

If you asked a professional driver to do the exact same thing, I'm pretty sure that driver would have way better accident record than average humans too.

It seems to me you are missing the point I tried to make, and drawing a false conclusion by comparing apples to oranges.

[–] [email protected] 12 points 6 days ago (1 children)

Waymo can absolutely drive at night, I’ve seen them do it. They rely heavily on LIDAR, so the time of day makes no difference to them.

And apparently they only disengage and need human assistance every 17,000 miles, on average. Contrast that to something like Tesla’s “Full Self Driving” (ignoring the controversy over whether it counts or not), where the most generous numbers I could find for it are a disengagement every 71 city miles, on average, or every 245 city miles for a “critical disengagement.”
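For a rough sense of scale, a quick sketch of the gap those figures imply (using only the numbers quoted above, which are informal estimates from this thread, not official statistics):

```python
# Rough comparison of the disengagement figures quoted above.
# These are informal estimates, not official statistics.
waymo_miles_per_disengagement = 17_000
tesla_city_miles_per_disengagement = 71   # most generous FSD figure mentioned
tesla_city_miles_per_critical = 245       # "critical disengagement" figure

ratio = waymo_miles_per_disengagement / tesla_city_miles_per_disengagement
critical_ratio = waymo_miles_per_disengagement / tesla_city_miles_per_critical

print(f"Waymo drives ~{ratio:.0f}x farther per disengagement")    # ~239x
print(f"...and ~{critical_ratio:.0f}x farther per critical one")  # ~69x
```

Even against the "critical disengagement" number, that's nearly two orders of magnitude, which is why the two systems are hard to compare directly.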

You are correct in that Waymo is heavily geofenced, and that’s pretty annoying sometimes. I tried to ride one in Phoenix last year, but couldn’t get it to pick me up from the park I was visiting because I was just on the edge of their area. I suspect they would likely do fine if they went outside of their zones, but they really want to make sure they’re going to be successful so they’re deliberately slow-rolling where the service is available.

[–] Curious_Canid 99 points 6 days ago (14 children)

This would be more impressive if Waymos were fully self-driving. They aren't. They depend on remote "navigators" to make many of their most critical decisions. Those "navigators" may or may not be directly controlling the car, but things do not work without them.

When we have automated cars that do not actually rely on human beings, we will have something to talk about.

It's also worth noting that the human "navigators" are almost always poorly paid workers in third-world countries. The system will only scale if there are enough desperate poor people. Otherwise it quickly becomes too expensive.

[–] [email protected] 25 points 6 days ago (1 children)

@Curious_Canid @vegeta this is the case for the Amazon "just walk out" shops as well. Like Waymo, they frame it as the humans "just doing the hard part", but who knows what "annotating" means in this context? And notably, it's clearly more expensive to run than they thought, as they've decided to do Dash Carts instead, which look like basically a portable self-service checkout. The customer does the checking. https://www.theverge.com/2024/4/17/24133029/amazon-just-walk-out-cashierless-ai-india

[–] [email protected] 10 points 6 days ago (5 children)

Back when I was a fabricator I made some of the critical components used in Amazon stores. Amazon was incredibly particular about every little detail, even on parts that didn't call for tight tolerancing in any conceivable way. They, on several occasions, sent us one bad set of prints after another. Which we could only discover after completing a run of parts. We're talking 20-30 thousand units that ended up being scrapped because of their shitty prints. Millions of dollars set on fire, basically.

They became such a huge pain in the ass to work with we eliminated every single SKU they ordered from us.

[–] [email protected] 21 points 6 days ago (1 children)

Yeah, we just managed to put the slave workers behind a further layer of obfuscation: not merely relegated to their own quarters or part of town, but to a different city altogether, or even a different continent.

Tech dreams have become about a complete lack of humanity.

[–] Curious_Canid 22 points 6 days ago (2 children)

I saw an article recently, I should remember where, about how modern "tech" seems to be focused on how to insert a profit-taking element between two existing components of a system that already works just fine without it.

[–] [email protected] 14 points 6 days ago

That's called "rent-seeking behavior," and it's not new.

[–] Yoga 14 points 6 days ago (4 children)

The system will only scale if there are enough desperate poor people. Otherwise it quickly becomes too expensive.

You can also get MMORPG players to do it for pennies per hour for in-game currency or membership. RuneScape players would gladly control 5 'autonomous' cars if it meant that they could level up their farming level for free.

The game is basically designed to be an incredibly time-consuming Skinner box that takes minimal skill and effort, in order to maximize membership fees.

[–] [email protected] 41 points 6 days ago (5 children)

Considering the sort of driving issues and code violations I see on a daily basis, the standards for human drivers need raising. The issue is more lax humans than it is amazing robots.

[–] [email protected] 11 points 6 days ago (1 children)

It's hard to change humans. It's easy to roll out a firmware update.

[–] [email protected] 5 points 5 days ago

Raising the standards would result in 20-50% of the worst drivers being forced to do something else. If our infrastructure wasn't so car-centric, that would be perfectly fine.

[–] [email protected] 23 points 6 days ago (4 children)

We always knew good quality self-driving tech would vastly outperform human skill. It's nice to see some decent metrics!

[–] [email protected] 25 points 6 days ago (3 children)

But when it does crash, will Google accept the liability?

[–] [email protected] 21 points 6 days ago* (last edited 6 days ago) (6 children)

Unprofessional human drivers (yes, even you) are unbelievably bad at driving; it's only a matter of time. But call me when you can do it without just moving labor done by decently paid locals to labor done remotely in the third world.

[–] [email protected] 20 points 6 days ago (16 children)

"After 6 miles, Teslas crash a lot more than human drivers."

[–] [email protected] 10 points 6 days ago

I used to hate them for being slow and annoying. Now they drive like us and I hate them for being dicks. This morning, one of them made an insane move that only the worst Audi drivers in my area make: a massive left over a solid yellow, across an intersection with no stop sign, with me coming right at it, before it had even begun accelerating into the intersection.

[–] [email protected] 10 points 6 days ago

As a techno-optimist, I always expected self-driving to quickly become safer than human, at least in relatively controlled situations. However I’m at least as much a pessimist of human nature and the legal system.

Given self-driving vehicles that are demonstrably safer than humans, but not perfect, how can we get beyond humans taking advantage of them, and the massive liability for the remaining accidents?

[–] [email protected] 11 points 6 days ago (12 children)

I had a friend who worked for them in the past. They really aren't that impressive; they get stuck constantly. The tech might eventually be revolutionary for people who cannot drive for whatever reason, but right now it still needs a LOT of work.

[–] [email protected] 11 points 6 days ago (1 children)

What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash.

Also, I think it's worth discussing whether to include in the baseline certain driver assistance technologies, like automated braking, blind spot warnings, other warnings/visualizations of surrounding objects, cars, bikes, or pedestrians, etc. Throw in other things like traction control, antilock brakes, etc.

There are ways to make human driving safer without fully automating the driving, so it may not be appropriate to compare fully automated driving with fully manual driving. Hybrid approaches might be safer today, but we don't have the data to actually analyze that, as far as I can tell.
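One way to see why the underreporting point matters: even a modest reporting gap shifts the human baseline noticeably. A hypothetical sketch (the function and every number below are made-up placeholders for illustration, not data from the article):

```python
# Hypothetical sketch: how underreporting skews a crash-rate baseline.
# All figures below are made-up placeholders, not real data.
def crashes_per_million_miles(crashes, miles, reporting_rate=1.0):
    """Estimate the true crash rate, scaling reported crashes up by the
    assumed fraction of crashes that actually get reported."""
    true_crashes = crashes / reporting_rate
    return true_crashes / (miles / 1_000_000)

# If humans reported only ~60% of minor crashes, a naive comparison
# would understate the human baseline by two thirds:
naive = crashes_per_million_miles(crashes=4, miles=1_000_000)
adjusted = crashes_per_million_miles(crashes=4, miles=1_000_000,
                                     reporting_rate=0.6)
print(naive, adjusted)  # 4.0 vs ~6.67 per million miles
```

Since an autonomous fleet logs essentially every incident while humans do not, an unadjusted comparison is biased in the fleet's favor.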

[–] [email protected] 6 points 6 days ago

How are those food delivery "robot AI boxes" by Starship doing?
