Maker Pro

A Purely-Electronic Brain -- Possible?

Rich the Philosophizer

Jan 1, 1970
Rich the Philosophizer wrote:
..
"[there are] two kinds of awareness... conscious awareness and sentient
awareness. The awareness of mind is called 'consciousness', and the
awareness of emotions and body sensations is called 'sentience' or
'feeling.'"
--- excerpted from http://www.godchannel.com/awareness.html

I'll be impressed when I see a machine that can feel.
It will depend on what is available to it to feel with. Digital nervous
system? Digital skin?

Well, when you poke an amoeba with a needle and it flinches, what did it
feel the needle with? How does an amoeba decide what's food vs. what's
dangerous?

I guess some kinds of sea slugs or some such have pretty sophisticated
behaviors, and their brain is like eight neurons.

Thanks!
Rich
 
Richard Dobson

Rich the Philosophizer wrote:
...
Well, when you poke an amoeba with a needle and it flinches, what did it
feel the needle with? How does an amoeba decide what's food vs. what's
dangerous?

I guess some kinds of sea slugs or some such have pretty sophisticated
behaviors, and their brain is like eight neurons.
A more fundamental feeling than danger: hunger. The concept of food.
And an instinct to look for food (plus the ability to direct movement
towards it). The sense of well-being when food is consumed. So, before
building the electronic brian, build a system (machine? organism?) that
registers lack of food, responds to that by seeking it out (hence, it needs
a mechanism that recognises food), and rewards itself when it has
found and consumed it. And remembers, for some amount of time, what it
did to find it - basic reinforcement learning. Eight neurons might just
be enough. For where the stomach leads, the brain will surely follow!
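The loop described above can be sketched in a few lines of Python. This is a toy illustration only - the world, exploration rate, and learning rate are all invented for the example - but it shows hunger-driven seeking plus basic reinforcement in action:

```python
import random

# Toy version of the "stomach-led" learner sketched above: a 1-D world
# where the agent grows hungrier each step, wanders in search of food,
# and reinforces whichever move last led it to food. All constants here
# are invented for illustration.
random.seed(1)

WORLD_SIZE = 10
FOOD_AT = 7                 # food sits near one end of the line
ACTIONS = (-1, +1)          # step left, step right

# One learned "preference" per action - a few neurons' worth of memory.
preference = {a: 0.0 for a in ACTIONS}

pos, hunger = 0, 0
for step in range(2000):
    hunger += 1                                    # lack of food registers
    if random.random() < 0.2 or preference[-1] == preference[+1]:
        action = random.choice(ACTIONS)            # explore
    else:
        action = max(ACTIONS, key=preference.get)  # follow what worked
    pos = max(0, min(WORLD_SIZE - 1, pos + action))
    if pos == FOOD_AT:                             # a mechanism recognises food
        preference[action] += 0.1 * hunger         # reward scales with need
        hunger = 0
        pos = 0                                    # wander off, get hungry again

print(preference)   # the action that finds food ends up strongly preferred
```

After enough steps the preference for stepping toward the food dominates the other action, which is all "reinforcement learning" means at this tiny scale.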


Richard Dobson
 
r norman

Well, when you poke an amoeba with a needle and it flinches, what did it
feel the needle with? How does an amoeba decide what's food vs. what's
dangerous?

I guess some kinds of sea slugs or some such have pretty sophisticated
behaviors, and their brain is like eight neurons.

If you do cross-post to bionet.neuroscience, could you at least make
some attempt at fact? Or at least keep your wild speculations to
within a few orders of magnitude of reality?
 
Michael A. Terrell

Richard said:
Rich the Philosophizer wrote:
..
A more fundamental feeling than danger: hunger. The concept of food.
And an instinct to look for food (plus the ability to direct movement
towards it). The sense of well-being when food is consumed. So, before
building the electronic brian, build a system (machine? organism?) that
registers lack of food, responds to that by seeking it out (hence, it needs
a mechanism that recognises food), and rewards itself when it has
found and consumed it. And remembers, for some amount of time, what it
did to find it - basic reinforcement learning. Eight neurons might just
be enough. For where the stomach leads, the brain will surely follow!

Richard Dobson


"before building the electronic brian"?


--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida
 
Bob Myers

Rich the Philosophizer said:
I'll be impressed when I see a machine that can feel.

I'd be equally impressed when I see someone come up
with an objective proof that anyone other than me can
feel. I know I can only because I experience it directly,
but how can you prove to me that the rest of you aren't
merely clever simulations?

Bob M.
 
Bob Myers

Richard Dobson said:
Rich the Philosophizer wrote:
..
A more fundamental feeling than danger: hunger. The concept of food. And
an instinct to look for food (plus the ability to direct movement towards
it).

So when you say that something "feels," what you really mean
is simply that you can observe it responding to stimuli? If
that's all you require, I would submit that we already have
many examples of machines that "feel."

Bob M.
 
Entertained by my own EIMC

Bob Myers said:
I'd be equally impressed when I see someone come up
with an objective proof that anyone other than me can
feel. I know I can only because I experience it directly,
but how can you prove to me that the rest of you aren't
merely clever simulations?

Bob M.

The nearest we can ever come to an objective proof is if we

1. note and carefully collate - not stupidly and stubbornly ignore - how
people with specific known brain deficiencies differ perceptually and
emotionally from people with the equivalent brain structures intact.

2. probe the psyche of people whilst making use of fMRI (or forthcoming,
increasingly precise and powerful similar technologies) in combination with
yet-to-be-refined means to freeze, with high spatiotemporal precision (and
of course harmlessly), the function of neurons (and/or their supporting
glia).


That's why IF you want to do anything other than create an artificially
intelligent fake of a feeling brain, you must at least recreate not just an
isolated brain ("just" :-^) but a brain with _all_ its coevolved
complementary exteroceptive and interoceptive machinery (and you would
still have left out such a "brain's" environmentally embedded developmental
and evolutionary history!).

As long as people who work on developing AI don't lose sight of what AI
actually stands for, discussions like this one would be highly unlikely to
involve such people. ;-)

 
Radium

Conceptually, there is no problem. Practically, it is not possible.

If such a silicon-based electronic brain could be designed and made
functional, certain advantages of this electronic brain over the human
biological brain are:

1. Lack of physical fragility
2. No need for glucose/oxygen [the device would use electricity for
energy]
3. No need for water
4. Faster signaling [electric signals are faster than chemical
signals]

However, this silicon brain would have certain disadvantages as well.
I see that this electric brain would be unable to form new silicon
cells [whereas the human brain can form new neurons]. Hence the amount
of information stored has more of a physical limit than in a human brain.
In addition, this brain would be an excellent conductor of electricity
[unlike the human brain] and could be easily damaged by exposure to
microwaves and other electric and magnetic energy. An electromagnetic
pulse (EMP) or even a solar flare could seriously damage such a brain that
relies purely on electric signals.
There are 100 billion neurons. If they were implemented in electronic
circuitry, it would cost a lot of money and take a lot of time - say, a
thousand billion dollars spread over two centuries. Then why bother? You
can hire grad students at minimum wage.

Aren't neural nets used in massively-parallel computing devices?
This is a conjecture about how a brain might work.
If you are interested in the brain, you might look at:

http://home.nycap.rr.com/rscanlon/brain/brain.htm

Thanks for the link
 
Richard Dobson

Bob Myers wrote:
...
So when you say that something "feels," what you really mean
is simply that you can observe it responding to stimuli? If
that's all you require, I would submit that we already have
many examples of machines that "feel."
It must surely be the starting point. We are all more instinctual and
reactive than we might like to think, much of the time. The difference
between us and the amoeba may be more a difference in scale (being aware of
being hungry, anticipating stuffing ourselves, fighting off competitors
for food) than in principle. But I am profoundly sceptical of "strong
AI". We may develop a machine that can mimic some higher brain functions
(and, now we know the heart literally has brain cells, maybe even higher
heart functions too), but sentience as we would define it seems
dependent not on connections as such, but on chemistry, constant
reconfiguring, and probably some non-local quantum dimension as well.

But if we are to make an artificial brain that bears any relation to
ours, the starting point should not be the model of an adult, but that
of a baby; and training might very well take just as long.

Richard Dobson
 
bob the builder

A little reality check here.

With a desktop and NEURON, one can model a dozen neurons, modeling
only the electrical characteristics. This will exhaust your PC.

You don't want to model everything at the smallest level. Those
proteins have a function - what function? What are they doing? Some
things about them will be of no consequence when modelling a brain.
You want to make everything as simple as possible, but not too simple
(sorry, Einstein).
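As a feel for what "modeling only the electrical characteristics" means at its very simplest, here is a leaky integrate-and-fire neuron in Python - far cruder than NEURON's compartmental models, with constants that are merely typical textbook values, not fitted to any real cell:

```python
# Leaky integrate-and-fire neuron: the "as simple as possible" end of the
# modelling spectrum. Units: ms, mV, nA, MOhm; all constants illustrative.

def simulate_lif(input_current, duration=100.0, dt=0.1, tau=10.0,
                 v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0, r_m=10.0):
    """Return spike times (ms) for a constant input current (nA)."""
    v = v_rest
    spikes = []
    for step in range(int(duration / dt)):
        # Voltage leaks back toward rest while the input drives it up.
        dv = (-(v - v_rest) + r_m * input_current) / tau
        v += dv * dt
        if v >= v_thresh:            # threshold crossed: emit a spike
            spikes.append(step * dt)
            v = v_reset              # and reset the membrane
    return spikes

# 2 nA drives the cell 20 mV above rest, past the 15 mV threshold gap:
print(len(simulate_lif(2.0)) > 0)   # True: a regular spike train
# 1 nA only reaches 10 mV above rest, so the cell stays silent:
print(len(simulate_lif(1.0)) == 0)  # True: no spikes
```

A real compartmental model multiplies this per-timestep cost by the number of compartments and ion-channel equations per cell, which is why even a dozen detailed cells can exhaust a desktop.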

And maybe it turns out my quad-core PC can only simulate 10,000
neurons. A decade from now my PC will do a million.
A very serious effort is underway in Lausanne: Blue Brain. IBM
furnished a supercomputer. There are about thirty-five very, very
bright people involved directly. They have successfully modeled a rat
macrocolumn (about 10,000 neurons). They are talking with IBM about
the next evolution of supercomputers. A rat macrocolumn is probably
(certainly?) equivalent to a human macrocolumn. They need to scale up
by a factor of ten million to get to the level of a human brain.
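The quoted scale-up factor checks out against the commonly cited (if rough) figure of about 10^11 neurons in a human brain:

```python
# Back-of-envelope check, assuming the often-cited ~100 billion neuron
# estimate for a human brain (a rough textbook figure, not Blue Brain's own).
macrocolumn_neurons = 10_000           # rat macrocolumn, as modeled
human_brain_neurons = 100_000_000_000  # ~1e11 neurons
print(human_brain_neurons // macrocolumn_neurons)  # 10000000: ten million
```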

These neurons lack their chemical insides. They simulate the
electrical signals, but not the molecular machines that produce them.
The cells do not model the proteins, or the genome. They say they are
going to add the proteins later.

They are very, very confident. They intend to model the entire human
brain.

Others are skeptical.

Including me. It may be very useful for understanding the human
brain at some level, in some way. But it's not going to create
artificial intelligence. You can't just increase the number of neurons
and then hope that your model suddenly becomes intelligent.

With 10,000 artificial neurons you should be able to create something
as smart as a bug (I know, real ones have about half a million of
them). Which would be a great achievement.
This question is asked of people working on Blue Brain. They have no
answer.

They should be fired :p
 
Don Bowey

The nearest we can ever come to an objective proof is if we

1. note and carefully collate - not stupidly and stubbornly ignore - how
people with specific known brain deficiencies differ perceptually and
emotionally from people with the equivalent brain structures intact.

2. probe the psyche of people whilst making use of fMRI (or forthcoming,
increasingly precise and powerful similar technologies) in combination with
yet-to-be-refined means to freeze, with high spatiotemporal precision (and
of course harmlessly), the function of neurons (and/or their supporting
glia).


That's why IF you want to do anything other than create an artificially
intelligent fake of a feeling brain, you must at least recreate not just an
isolated brain ("just" :-^) but a brain with _all_ its coevolved
complementary exteroceptive and interoceptive machinery (and you would
still have left out such a "brain's" environmentally embedded developmental
and evolutionary history!).

As long as people who work on developing AI don't lose sight of what AI
actually stands for, discussions like this one would be highly unlikely to
involve such people. ;-)

How can you say that? It seems to have involved you. ;-)
 
Jon Danniken

Richard Dobson said:
It must surely be the starting point. We are all more instinctual and
reactive than we might like to think, much of the time.

Wasn't it Nietzsche who first posited the possibility that even our most
advanced philosophical thinking is but the manifestation of a
pre-programmed set of instincts?

Jon
 
John H.

It was Darwin who wrote in his notes:

"Origin of man now proved. - Metaphysics must flourish. - He who understands
baboon will do more towards metaphysics than Locke."



His "M book"
 
Homer J Simpson

Jon Danniken said:
Wasn't it Nietzsche who first posited the possibility that even our most
advanced philosophical thinking is but the manifestation of a
pre-programmed set of instincts?

Does that explain why all too many have asses larger than elephants now that
the west has abundant, cheap food?
 
The Autist formerly known as

You can't, but I can. I have no evidence that you are anything other than
an automaton who thinks he is not one; I, on the other hand, am real :)

--
þT

L'autisme c'est moi

"Space folds, and folded space bends, and bent folded space contracts and
expands unevenly in every way inconceivable except to someone who does not
believe in the laws of mathematics"
 
Sjouke Burry

The Autist formerly known as said:
You can't, but I can. I have no evidence that you are anything other than
an automaton who thinks he is not one; I, on the other hand, am real :)
Be gone, figment of our imagination!!!!!!!!!!
 
Rich Grise

You don't want to model everything at the smallest level. Those
proteins have a function - what function? What are they doing? Some
things about them will be of no consequence when modelling a brain.
You want to make everything as simple as possible, but not too simple
(sorry, Einstein).

And maybe it turns out my quad-core PC can only simulate 10,000
neurons. A decade from now my PC will do a million.
I think it'd take more than that - a neuron on its own has considerable
smarts. I even have a hypothesis that, since neurons don't generally
reproduce, their DNA/RNA/mitochondria are freed up to do other stuff -
which could be where they store their memory.

So each neuron would need the processing power of a modern desktop,
enough gigs of storage to model a whole genome, and as much common sense
as a typical amoeba. ;-) (or even a phagocyte! ;-) )

Cheers!
Rich
 
Rich the Philosophizer

Rich the Philosophizer wrote:
...
A more fundamental feeling than danger: hunger. The concept of food.
And an instinct to look for food (plus the ability to direct movement
towards it). The sense of well-being when food is consumed. So, before
building the electronic brian, build a system (machine? organism?) that
registers lack of food, responds to that by seeking it out (hence, it needs
a mechanism that recognises food), and rewards itself when it has
found and consumed it. And remembers, for some amount of time, what it
did to find it - basic reinforcement learning. Eight neurons might just
be enough. For where the stomach leads, the brain will surely follow!

Dude! You got it! :-D :-D :-D

The fundamental driving force behind All That Is is Desire. >:->

Cheers!
Rich
 
Rich the Philosophizer

So when you say that something "feels," what you really mean
is simply that you can observe it responding to stimuli? If
that's all you require, I would submit that we already have
many examples of machines that "feel."

Have you ever been in a scary situation and felt "butterflies in your
stomach"?

That's the kind of feeling I'm talking about - everybody's "sixth"
sense, AKA "sentience", which most people won't even acknowledge exists.

Thanks,
Rich
 
r norman

- I even have a hypothesis that, since neurons don't generally
reproduce, their DNA/RNA/mitochondria are freed up to do other stuff -
which could be where they store their memory.

A little evidence (not to mention some knowledge of biology) would be
useful, here.
 