this post was submitted on 11 Feb 2025
600 points (99.0% liked)
Stop calling GPT AI.
That's the inaccurate name everyone's settled on. Kinda like how "sentient" is widely used to mean "sapient" despite being two different things.
I made a smartass comment earlier comparing AI to fire, but it's really my favorite metaphor for it - and it extends to this issue. Depending on how you define it, fire seems to meet the requirements for being alive. It tends to come up in the same conversations that question whether a virus is alive. I think it's fair to think of LLMs (particularly the current implementations) as intelligent - in the same way we think of fire or a virus as alive. They have many of the characteristics of intelligence while being a step removed from it.
That is an extremely apt parallel!
(I'm stealing it)
How is it not AI? Just because it's not AGI doesn't mean it's not AI. AI encompasses a lot of things.
You put a few GPTs in a trenchcoat and they're obviously AI. I can't speak to OpenAI's offerings, since I won't use them as a cloud service, but the local DeepSeek model I've tried is certainly AI. People keep moving the goalposts, with what seems to me a determination to avoid seeing the future that's already here. Download DeepSeek-Coder-V2 16B if you have 16 GB of RAM and 10 GB of storage and see for yourself - the requirements are ridiculously low for what it can do. It uses 50% of four CPU cores for about 15 seconds to solve a problem with detailed reasoning steps.
This article is about Gemini, not GPT. The generic term is LLM: Large Language Model.