View Full Version here: A chat with the GPT-3 AI about the forthcoming 100-trillion parameter GPT-4 AI
In this 10-minute video (https://youtu.be/hwzCta0Cx8k), released only a few minutes ago, Dr Alan Thompson interacts with the GPT-3 AI.
One interacts with GPT-3 in both directions using text.
The text was then post-processed with text-to-speech synthesis and visualised with the Leta avatar; these components are not part of the AI.
Thompson talks with GPT-3 about the forthcoming GPT-4.
Whereas GPT-3 has around 175 billion parameters, GPT-4 will have around 100 trillion parameters.
It won't be around for another few years, but 100 trillion is roughly the number of synapses in the human brain.
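To put those numbers in perspective, a quick back-of-the-envelope comparison (just a rough sketch: the 100-trillion figure is the one claimed in the video, and ~1e14 synapses is only a commonly cited estimate):

gpt3_params = 175e9   # GPT-3 parameter count
gpt4_params = 100e12  # 100 trillion, the figure claimed in the video
synapses = 1e14       # commonly cited estimate of synapses in a human brain

print(f"GPT-4 would be ~{gpt4_params / gpt3_params:.0f}x the size of GPT-3")
print(f"and ~{gpt4_params / synapses:.1f}x the synapse estimate")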
colour-coded
28-08-2021, 07:01 PM
Well, I'll go out and get some ice cream*
*lockdown conditions permitting
multiweb
29-08-2021, 09:46 AM
Next iteration might be GPT-4 asking questions and this bloke answering. :lol:
AdamJL
29-08-2021, 12:42 PM
We need to modify ourselves to understand machines rather than the other way around. Human language, emotion and expression are inefficient in too many ways.
multiweb
29-08-2021, 01:58 PM
Communication is just an interface; it doesn't affect the reasoning/problem-solving bit. But I think you're right about emotions. That could be an issue. :lol:
Sunfish
29-08-2021, 06:31 PM
I think that Dr Thompson shows a singular lack of imagination and the AI is correct. Nature is more powerful. AI is still just software, no matter how complex it seems, and like all systems it can become moribund unless it provides what people prefer and keeps receiving external input, purpose and continued direction from people. People's preferences and abilities are unpredictable.
AdamJL
29-08-2021, 06:50 PM
True. But then there's the speed issue as well. We can think faster when conscious reasoning isn't involved, and machines can think faster for many tasks. Consciousness gets in the way of problem solving, which is why we don't consciously compute the maths for the trajectory of a projectile aimed at us, or at our food or enemy: your hands know exactly where to go to catch a ball, for instance, without you having to calculate the physics involved consciously.
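For what it's worth, here's a rough sketch (plain Python, made-up numbers, air resistance ignored) of the kind of calculation our hands solve implicitly when catching a ball:

import math

def landing_point(v0, angle_deg, h0=2.0, g=9.81):
    """Horizontal range and flight time of a ball thrown at speed v0 (m/s),
    at angle_deg above the horizontal, from height h0 (m). No air resistance."""
    theta = math.radians(angle_deg)
    vx = v0 * math.cos(theta)
    vy = v0 * math.sin(theta)
    # positive root of h0 + vy*t - 0.5*g*t**2 = 0
    t = (vy + math.sqrt(vy**2 + 2 * g * h0)) / g
    return vx * t, t

range_m, flight_t = landing_point(v0=20.0, angle_deg=35.0)
print(f"Ball lands about {range_m:.1f} m away after {flight_t:.2f} s")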
The human conscious layer is more like a CEO. It gets second by second reports of what's happened, is oblivious to the workforce and processes actually doing the work, and then thinks it's done the job :)
The old "decide to move your hand" experiment shows the signal is already on the way to your hand before you think you've made the choice to move said hand.
There are a million links on the many experiments done in this field, but here's one:
https://www.thegreatcoursesdaily.com/why-did-you-move-your-hand-just-now-linking-free-will-and-movement/
Nature is more powerful because it's had billions of years at it. It's very very good at what it does. If AI can fulfil the potential of a singularity within the next 1000 years, that's orders of magnitude faster than nature. And it will only increase exponentially after that.
This is an article from 2015, over two parts. If anyone's interested in learning this topic, this is a great start:
https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html
https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-2.html
And on the subject of AI just being "software"... well, so are humans. We're just a software layer on top of an organic computer. We have coding (DNA), we have programming rules. We have inputs and outputs, our code can malfunction, we can get a virus, and we can be turned off with ease (hello Grim Reaper).
Lariliss
07-09-2021, 09:30 PM
AI recognition is vital for processing big data volumes, and for its impartiality.
Earth observation, one of the things we use satellites for (Skyrora: https://www.skyrora.com/blog/satellites), benefits from this: AI makes cool-headed judgements about patterns and post-processes the data efficiently.
The usual flow of human conversation is everyone talking to themselves, not really listening to the questions. That makes human speech closer to monologue than dialogue, which is why the data an AI collects and learns from can be noisy.
Humans also have the effect of 'thinking the same thing' and understanding each other from just a few words, without any explicit logic. GPT uses logic, which is why it will answer 'two eyes' when you ask it 'how many eyes do I have on my left foot?'.
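For anyone who wants to try that trick question themselves, here is a minimal sketch against the OpenAI completions endpoint as the Python client exposed it around 2021 (the engine name, prompt wording and key are placeholders, not taken from the video):

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; use your own key

# Ask GPT-3 the nonsense question from the post above and see whether it
# answers literally ("two eyes") or pushes back on the premise.
response = openai.Completion.create(
    engine="davinci",   # base GPT-3 engine available at the time
    prompt="Q: How many eyes do I have on my left foot?\nA:",
    max_tokens=10,
    temperature=0,
)

print(response.choices[0].text.strip())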
Emotions and past experience: our views and morality come from personal experience, which is transferred to and stored in particular neurons (engram cells).
An engram is a trace left by a stimulus: a repeated signal (a sound, a smell, a certain environment, etc.) provokes physical and biochemical changes in the processes of certain neurons.
When the stimulus is repeated, that 'trace' is activated and the cells that hold it recall the entire memory. In other words, engram neurons are responsible for accessing the recorded information, and for them to work a key signal must act on them.
The human brain works from thousands of years of accumulated memory, which is why our recognition and comprehension are faster. Low-level cells already expect to see, say, a dog and send signals upward; then, from the top down, the recognition is refined and either confirmed or not.