Technology
A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.
Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.
This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
The sheer amount of secondhand content an LLM needs to consume for training inevitably includes copyrighted material. If they used this thread, the quotes OP included would end up in the training set.
The sheer number of fan forums and wikis about copyrighted material provides copious information about the stories and facilitates retelling them. They're right that avoiding it is impossible for a general-purpose LLM.
My personal experience so far, though, has been that general-purpose and multimodal LLMs are less consistently useful to me than GPT-4 was at launch. I think small, purpose-built LLMs backed by trusted content providers have a better chance of success for most users, but we'll see if anyone can make that work given the challenge of steering users to the right one for the right task.
I would just like to say, with open curiosity, that I think a nice solution would be for OpenAI to become a nonprofit with clear guidelines to follow.
What does that make me? Other than an idiot.
Of that, at least, I'm self-aware.
I feel like we’re disregarding the significance of artificial intelligence in our future, because the only thing anyone who cares is trying to do is regain control so they can DO something about it. But the news is becoming a feeding tube for the masses, and they’ve masked that with hatred for all of us.
Anyway, sorry for the diatribe. Happy new year.
I think OpenAI (or some part of it) is a non-profit. But corporate fuckery means it can largely be funded by for-profit companies, which then turn around and profit from that relationship. Corporate law is so weak and so laxly enforced that it's a bit of a joke, unfortunately.
I agree that AI has an important role to play in the future, but it's a lot more limited in its current form than a lot of people want to believe. I'm writing a tool that leverages AI as a sort of auto-DM for roleplaying, but AI hasn't written a line of code in it because the output is garbage. And frankly, I find the fun and value of the tool come from the other humans you play with, not the AI itself. The output just isn't that good.
It is supposedly a non-profit, and that is how the board of OpenAI was able to fire Altman, but then big tech (Microsoft) intervened and wrested back control.
It's basically Microsoft now.
“Oh, you know, it’s like in Fight Club.”
“Sorry, I have been trained in a legal manner and therefore know no cultural references from the last 30 years.”