this post was submitted on 21 Jan 2025

United Kingdom
[–] [email protected] 1 points 1 day ago (1 children)

Yeah, that was sort of my point. Modern/current technology at best mimics some functionality of intelligence, hence my claim that it is artificial. It really is no more than 1980s expert systems with much greater data speeds and sets, and more flexible algorithms. An evolution, not a revolution.

Hence why I'd say "artificial intelligence" applies to current technology: because it is not real.

If we ever develop anything that is intelligent in the way many fear, then by its very definition "AI" is no longer a valid term for it. Hence why I think we should stop using that term when talking about whether such things are safe or not. First, it gives the less informed the impression that we are anywhere close to such tech, creating invalid fears of current tech, when, let's face it, there are plenty of genuine arguments about the massive use of data.

But more importantly, if anything (sci-fi-like as it may be) that is truly able to learn and think for itself is ever developed (if "developed" would even be the correct term, as we really are that ill-informed on the how at the moment), then "artificial" would be a misnomer.

[–] [email protected] 1 points 2 hours ago (1 children)

On the contrary, I'd argue that it's entirely feasible to create an artificial intelligence. "All" you need to do is replicate the concept of thought, which is a never-ending train of relational contexts that are entirely dependent on the individual's life experiences. Putting that into practice is a huge job, but arguably not an impossible one. Such a creation, presuming it could create new concepts along the way, would certainly be deserving of the title "AI".

[–] [email protected] 1 points 40 minutes ago* (last edited 38 minutes ago) (1 children)

Have fun.

The issue is that your interpretation is, at best, a hypothesis, not a fact. And the only way to prove your hypothesis is to simulate the thought you wish to create.

Others have not managed it yet, but you may be the first. Personally, I am not sold on the idea, but I would love to see you prove me wrong.

That is after all the point of science.

But linguistics-wise:

How is that intelligence artificial?

[–] [email protected] 1 points 12 minutes ago

"Artificial" merely implies man-made, as opposed to naturally developed, IMO.

As for the hypothesis: a few years ago I took a crack at designing a system like that as an on-paper exercise. The vast majority of it was just... pushing data around and using existing data to suggest new data. Not all that dissimilar to how human beings think, to be honest. The big hurdles were optimisation and context, and allowing the platform to "grow" without letting it metastasize and without improperly restricting it. There are some hardware limitations to consider too: a storage backbone, for one, and interlinking every thread as opposed to having them wholly isolated from each other. There's the potential for thread interruption too, which as far as I'm aware is not something that any microcode packages support.
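To make the "existing data suggests new data" idea concrete, here's a purely illustrative toy sketch (all names made up, and obviously nothing like a full design): concepts are linked by how often they co-occur, and a "thought" just follows the strongest associations outward.

```python
from collections import defaultdict

class ContextGraph:
    """Toy relational-context store: concepts linked by co-occurrence weight."""

    def __init__(self):
        # links[a][b] counts how often concept a appeared alongside concept b
        self.links = defaultdict(lambda: defaultdict(int))

    def experience(self, *concepts):
        # Every co-occurrence strengthens the association in both directions.
        for a in concepts:
            for b in concepts:
                if a != b:
                    self.links[a][b] += 1

    def suggest(self, concept, n=3):
        # "Existing data suggests new data": follow the strongest links first.
        ranked = sorted(self.links[concept].items(),
                        key=lambda kv: kv[1], reverse=True)
        return [c for c, _ in ranked[:n]]

g = ContextGraph()
g.experience("rain", "cloud", "umbrella")
g.experience("rain", "cloud")
g.experience("umbrella", "shop")

print(g.suggest("rain"))  # ['cloud', 'umbrella']
```

The hard parts the comment above mentions (growth without metastasizing, context pruning, interlinked threads) are exactly what this sketch leaves out; it only shows the basic data-suggests-data mechanism.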

But despite all that, I'm still fairly certain one could build an approximation thereof. The complexity of inter-stimuli input (read: input from audio, visual, and potentially sensory endpoints, replicating vision, hearing and touch) isn't to be underestimated, though.

Perhaps one day I might take a crack at it, but it's also a morally grey area with quite a few caveats to it, so... uh... maybe.