stevecrox

joined 1 year ago
[–] [email protected] 4 points 1 year ago

Plex has been baking in features like that to help you see what is on other streaming channels, etc..

Personally, the whole point of Plex for me was as a container for my existing DVD/Blu-ray collection. While Plex has added some really cool features, increasingly they keep resetting the dashboard to try and force engagement with new features. It feels a bit user hostile, and I've been switching to Jellyfin (same idea, but entirely open source and self hosted).

From a discovery perspective, personally I've found good content tends to create its own word of mouth style buzz.

For example, at the moment you can't go near Twitter, Reddit, work, BBC News, etc. without someone talking about 'Mr Bates vs The Post Office'. Recently the Risa community kept mentioning Babylon 5, so I picked up all 5 seasons for £20 and watched it through. Similarly, the Risa community really seems to love Star Trek: Prodigy, so I'll probably give that a go at some point.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (2 children)

The FAANG companies have an internal kind of elitism that makes staff less effective.

If you look at any Google Java library (GWT, Gson, Guava, Protobuf, etc.), there was a commonly used open source library that existed years before and covered 90% of the functionality.

Google staff just don't think to look outside Google (after all, if Google hasn't solved it, what chance do outsiders have?) and so write something entirely from scratch.

Then, normally within 6 months, the open source library has added the killer new feature. The Google library only persists because people hold FAANG in high regard ("It's by Google so it must be good!"), yet it normally has serious issues/limitations.

The Google-associated projects that actually succeeded weren't controlled by Google (e.g. Hadoop came out of Google's papers but was written at Yahoo, Kubernetes got spun away from Google's control, etc.).

[–] [email protected] 1 points 1 year ago (1 children)

I wouldn't use "certified" in this context.

Limiting support of software to specific software configurations makes sense.

It's things like: Debian might ship Python 3.8, Ubuntu Python 3.9, openSUSE Python 3.10, etc. Your application might depend on a library that requires Python 3.9 and so act oddly on 3.8, so only supporting X distributions keeps the test/QA process sane.

This is also why Docker/Flatpak exist, since they let you pin all of this down.

However, the normal mix is RHEL/SUSE/Ubuntu, because those target businesses and your target market will most likely be running one of them.
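As a sketch of the kind of check this forces on an application, comparing a distro's shipped Python against your supported minimum is a one-liner with `sort -V` (the version numbers here are made up for illustration; in practice `shipped` would come from `python3 --version`):

```shell
# Hypothetical example: does the distro's Python meet our minimum?
required="3.9"
shipped="3.8"   # stand-in for whatever the target distro actually ships
newest="$(printf '%s\n%s\n' "$required" "$shipped" | sort -V | tail -n 1)"
if [ "$shipped" = "$newest" ] || [ "$shipped" = "$required" ]; then
  echo "supported: Python $shipped >= $required"
else
  echo "unsupported: Python $shipped < $required"
fi
```

With the values above this prints "unsupported: Python 3.8 < 3.9", which is exactly the distro/library mismatch being described.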

[–] [email protected] 2 points 1 year ago (3 children)

I suspect they mean around packaging.

I honestly believe Red Hat has a policy that everything should pull in GNOME. I have had headless RHEL installs where half the CLI tools require GNOME Keyring (even if they don't deal with secrets or store any). Back in RHEL 7, Kate (the KDE-based text editor) somehow pulled in a bunch of GTK dependencies.
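You can poke at this kind of dependency creep yourself. A rough sketch, using `kate` as the example package (what it prints obviously depends on which distro and package lists you have):

```shell
# Count GTK/GNOME-flavoured lines in a package's dependency list,
# using whichever package manager is available. Purely illustrative.
if command -v apt-cache >/dev/null 2>&1; then
  count="$(apt-cache depends kate 2>/dev/null | grep -icE 'gtk|gnome')"
elif command -v rpm >/dev/null 2>&1; then
  count="$(rpm -q --requires kate 2>/dev/null | grep -icE 'gtk|gnome')"
else
  count="unknown (no apt or rpm on this system)"
fi
echo "gtk/gnome-related dependency lines for kate: $count"
```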

Certification really means someone paid to go through a process, and so it's designed so they pass.

Think about the people you know who are Agile/Cloud/whatever certified and how all it means is they have learnt the basic examples.

It's no different when a business gets certified.

The only reason people care is because they can point to the cert if it all goes wrong.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

Debian isn't "old == stable", it's "tested == stable".

Debian has an effective rolling distribution in testing, which can get ahead of Arch.

At some point they freeze the software versions in testing and look for release-critical and major bugs. Once everything has been shaken out and fixes submitted where possible, it becomes stable.

The idea is people have tested a set baseline of software and there are no known major bugs.

For the last 4-5 releases, Debian has released every 2 years (similar to Ubuntu LTS). Debian tends to align its release with LTS kernel and Mesa releases, so there have been times when the latest stable was running newer versions than Ubuntu, and the "newest software" crown switches between Ubuntu LTS and Debian each year.

For some, the priority is to run software that won't have major bugs; that is what Debian, Ubuntu LTS and RHEL offer.

[–] [email protected] 1 points 1 year ago (1 children)

You realise Debian is the base distribution?

Ubuntu takes a cut from Debian testing every 6 months, adds some in-house stuff, puts it through QA and performs a release.

Linux Mint is produced by the Cinnamon devs, similar to KDE Neon. They take the latest Ubuntu LTS, remove many of the in-house additions, add the latest Cinnamon desktop and release.

Cinnamon got upstreamed into Debian to make the process easier.
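You can see that derivation chain recorded on any of these systems in `/etc/os-release`: Mint reports `ID_LIKE=ubuntu`, Ubuntu reports `ID_LIKE=debian`, and Debian itself, being the base, has no `ID_LIKE` at all. A quick sketch:

```shell
# Show the base-distribution relationship this thread describes.
# ID is the distro itself; ID_LIKE lists what it derives from.
. /etc/os-release 2>/dev/null || ID="unknown"
echo "distro: ${ID}, derived from: ${ID_LIKE:-nothing (this is a base distro)}"
```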

[–] [email protected] 3 points 1 year ago

The person is correct that this isn't a Linux problem; it relates to your experience.

Windows worked by giving everyone full permissions and opening every port. While Microsoft has tried to roll that back, the administration effort goes into restricting access.

Linux works on the opposite principle: you have to learn how to grant access to users and expose ports.

You would have to make this mental switch no matter what Linux task you're trying to learn.

Docker's guide to setting up a headless Docker host is copy/paste. You can install Docker Desktop on Linux, and the effort is identical to Windows. The only extra step is:

sudo usermod -aG docker $USER

to ensure your user can access the Docker daemon as a local user.
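One gotcha: the group change only takes effect on a new login session. A small check for whether it has kicked in yet (this is just plain group membership, nothing Docker-specific):

```shell
# Is the docker group active in the current session?
if id -nG | tr ' ' '\n' | grep -qx docker; then
  echo "docker group active: docker commands will work without sudo"
else
  echo "docker group not active: log out and back in after the usermod"
fi
```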

[–] [email protected] 37 points 1 year ago (2 children)

Most of the updates are about long-term support; the performance gains are a side product.

This driver was one of the earliest open source drivers developed by AMD. The point of the driver is to convert OpenGL (instructions games give to draw 3D shapes) into the low level commands a graphics card uses.

A library (TGSI) was written to do this; however, they found OpenGL commands often relied on the results of others, and converting back and forth was really CPU expensive.

So NIR was invented as an intermediate representation. You convert all OpenGL commands to NIR, and it uses way less CPU to convert from NIR to GPU commands and back.

People in their spare time have been updating the old AMD drivers so they use the same libraries, interfaces, etc.. as the modern AMD drivers.

This update removes the last of the TGSI usage, so now the driver uses only NIR.

From a dev perspective everything now works the same way (less effort); from a user perspective, those old cards get the performance bump NIR brought.

[–] [email protected] 5 points 1 year ago (2 children)

No, don't use Sid. No one should run it on a system they expect to work.

Debian has 3 phases stable, testing & unstable.

Debian unstable is the initial gate for pulling in new code; applications need to not break everything in that environment before they can be moved to testing. A freeze is periodically applied to testing, release-critical/major bugs are identified and fixed, and stable is released.

Sid is the naughty child in Toy Story who destroys things. Debian uses Toy Story characters to name things and so Unstable got the nickname Sid.

If you have newly released hardware you might need an updated kernel. This can be found via backports.

Similarly, Mesa covers the graphics drivers; you can pull the latest from backports, but again you only need to do this if your graphics card is too new.
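As a sketch for current Debian stable (the `bookworm` suite name is an assumption; substitute whichever release you are actually on), enabling backports and pulling a newer kernel from it looks like:

```shell
# Add the backports repository (Debian 12 "bookworm" assumed).
echo 'deb http://deb.debian.org/debian bookworm-backports main' | \
  sudo tee /etc/apt/sources.list.d/backports.list
sudo apt update

# Install the newer kernel from backports for this one command only;
# everything else keeps coming from stable.
sudo apt install -t bookworm-backports linux-image-amd64
```

The `-t` flag matters: backports packages are never installed unless you ask for them explicitly, so the rest of the system stays on stable versions.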

As someone who runs Debian stable with KDE, I can say it works great for gaming.

[–] [email protected] 1 points 1 year ago (1 children)

I switched my computer-illiterate family members over to reduce the effort of helping them, and they didn't notice.

As a helper..

There are distributions focussed on the latest and greatest (Arch, Fedora, etc..) and ones aiming for stability (Debian, Ubuntu, etc..). Think of them as groups with different views.

So Linux Mint is Ubuntu but with the latest Cinnamon desktop. Ubuntu is Debian but focused on fixed releases; it adds 'snaps' and includes "non-free" packages by default.

People have different views on how the desktop should work. The two big desktops are Gnome and KDE.

Gnome is like Marmite. It works completely differently to any other desktop, and people either love it or loathe it. It's often the distribution default.

KDE is the other camp; with Windows 10/11 I think Microsoft was trying to steal some of its best features. By default KDE looks very much like a Windows desktop, but lots of people mod it to look/act like macOS. Some people struggle with the number of options it provides.

Then there are lots of other desktops; for example, Cinnamon takes Gnome and turns it into a more traditional desktop.

Personally I would suggest Kubuntu as your first attempt. This is a fairly decent install guide.

Ubuntu tries to minimise the choices you need to make and the 6 month update cycle keeps it fairly stable.

Kubuntu is Ubuntu; it just makes KDE the default desktop instead of Gnome.
