darkevilmac

joined 2 years ago
[–] [email protected] 7 points 2 years ago (1 children)

Yes and no; I've worked on the backend for big apps before. You generally try to keep backward compatibility as long as possible to give clients time to update. You can't just change API routes and expect all the clients to be on the latest version overnight.
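To illustrate the idea (a minimal sketch, not any real app's API — the route names and response shapes are made up): the old endpoint stays registered as a thin adapter over the new one, so old clients keep working while they migrate.

```python
# Hypothetical versioned API: /v1 is deprecated but still served,
# implemented as an adapter over the /v2 handler.

def get_user_v2(user_id: int) -> dict:
    # New response shape: name split into separate fields.
    return {"id": user_id, "first_name": "Ada", "last_name": "Lovelace"}

def get_user_v1(user_id: int) -> dict:
    # Old response shape, kept alive by translating the v2 result.
    user = get_user_v2(user_id)
    return {"id": user["id"], "name": f"{user['first_name']} {user['last_name']}"}

ROUTES = {
    "/v1/users": get_user_v1,  # deprecated, removed only once clients update
    "/v2/users": get_user_v2,
}

def handle(path: str, user_id: int) -> dict:
    return ROUTES[path](user_id)
```

Old clients hitting `/v1/users` and new clients hitting `/v2/users` both get valid responses during the migration window.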

[–] [email protected] 12 points 2 years ago

Excel, is that you?

[–] [email protected] 37 points 2 years ago (2 children)

Nice try dentist

[–] [email protected] 2 points 2 years ago

Rendering with JS definitely makes a difference; it's part of the reason SSR is such a big deal for SEO.
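A toy sketch of why that matters (illustrative only, no specific framework assumed): a crawler that doesn't execute JavaScript only sees the raw HTML response, so a client-rendered shell exposes no content, while an SSR response ships the content up front.

```python
# What a non-JS-executing crawler sees in each case.

def client_rendered_page() -> str:
    # Content only appears after bundle.js runs in a browser;
    # the initial HTML is an empty shell.
    return '<div id="app"></div><script src="/bundle.js"></script>'

def server_rendered_page(title: str) -> str:
    # SSR puts the content directly in the initial HTML response.
    return (
        f'<div id="app"><h1>{title}</h1></div>'
        '<script src="/bundle.js"></script>'
    )
```

The indexable text ("Federated search" here) is present only in the server-rendered output.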

[–] [email protected] 29 points 2 years ago* (last edited 2 years ago) (1 children)

It's all well and good to have a revolution, but if nobody knows you're having one, then nothing really changes. There are still benefits to centralised services, one of which is scale. Effectively indexing that much data requires scale, which is why smaller search engines tend to be just white labels of things like Bing.

[–] [email protected] 3 points 2 years ago

Maybe, but I'm a bit more optimistic. I think if they just built something like a read-only service that pulls from federated sources, the way their web crawlers do for regular sites, they would basically be done.

The only concern there would be people trying to block them, like everyone has been doing to Meta.
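The read-only crawl idea above can be sketched like this (purely hypothetical — the instance URLs, feed endpoint, and post shape are invented for illustration; a real implementation would fetch public feeds over HTTP, e.g. ActivityPub outboxes):

```python
# Hypothetical read-only federated crawl: pull public posts from several
# instances and merge them into one index, de-duplicating federated copies
# by canonical URL, much like a web crawler treats regular sites.

def crawl_instances(fetch, instances: list[str]) -> dict[str, dict]:
    index: dict[str, dict] = {}
    for instance in instances:
        for post in fetch(instance):   # e.g. GET {instance}/api/posts
            index[post["url"]] = post  # same URL from two instances -> one entry
    return index

# Stub fetcher standing in for real HTTP requests.
def fake_fetch(instance: str) -> list[dict]:
    return [{"url": f"{instance}/post/1", "title": f"hello from {instance}"}]

index = crawl_instances(fake_fetch, ["https://a.example", "https://b.example"])
```

Passing the fetcher in as a parameter keeps the crawl logic testable without touching the network.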

[–] [email protected] 73 points 2 years ago (26 children)

I feel like Google is going to have to find a way to effectively index federated content at some point. The only way to really get human information is from sites like Reddit and Twitter. And both of those platforms seem to be dedicated to completely imploding at the moment.

[–] [email protected] 14 points 2 years ago (2 children)

Don't forget poetry!