this post was submitted on 02 Feb 2025
18 points (90.9% liked)

Hardware


All things related to technology hardware, with a focus on computing hardware.


Rules:

  1. Follow the Lemmy.world Rules - https://mastodon.world/about

  2. Be kind. No bullying, harassment, racism, sexism, etc. against other users.

  3. No Spam, illegal content, or NSFW content.

  4. Please stay on topic. Adjacent topics (e.g. software) are fine if they are strongly relevant to technology hardware. Another example would be business news for hardware-focused companies.

  5. Please try to post original sources when possible (as opposed to summaries).

  6. If posting an archived version of the article, please include a URL link to the original article in the body of the post.




But it still has a chance at the edge and the PC

[–] [email protected] 1 point 4 days ago (1 child)

But Nvidia already had chips and cards shipping that were suitable for AI, so it didn't need tons of up-front R&D. Intel would have to somehow make its Arc line better for AI, and do it faster than Nvidia is improving its own.

[–] [email protected] 1 point 4 days ago

I'm not saying Intel should pivot to AI; I hope they don't. But on the question of whether anyone has made a profit from AI, the biggest winner is probably Nvidia.