Old 14th August 2009, 04:38 AM   #361
Piggy
Unlicensed street skeptic
 
Originally Posted by steenkh View Post
So nothing but brains can have consciousness, by definition? Is this in the same way that only eyes can have vision because it is a specialised function?
No, that's not what I mean (unless you want to call anything that does it a "brain" in the same way you might want to call any vision device an "eye").

In fact, there may be other ways to generate consciousness than the one our brains are using. Nature often comes to the same solution by different means.

But as of now, this is the only one we know of.

But when we look at how consciousness is generated in the cases we know of, we see that it's not simply a global emergent property that arises from having a bunch of neurons hooked up, but rather it is a specialized function of the brain.

Old 14th August 2009, 04:47 AM   #362
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
I do not think it is meaningful to distinguish conscious thoughts from other thoughts, whatever they may be.

There has been some experimentation to show that people "make decisions" before they are "aware" that they make a decision. This sounds rather futile for me. Unconscious thoughts making decisions are as much part of consciousness as any other kinds of thoughts, so it really tells more about how we experience consciousness than what consciousness consists of.
Consciousness is your experience.

And the distinction between conscious and non-conscious activity in the brain is quite important.

We have to ask the question, not only "How are we conscious?", but "Why?"

Why does the brain bother to do this? There must be some evolutionary advantage to it.

The points you make here are ones I've made previously on this thread. Consciousness appears to be a specialized "downstream" function. Our awareness of our actions and decisions appears to be after-the-fact.

So why should evolution bother to do such a thing?

There's a wealth of experiments showing a distinction between conscious functions of the brain and all the rest (the majority) of activity.

It's well established that people can and do act on sensory information that the brain has picked up and processed but not bothered to make them consciously aware of. And Marvin's case shows clearly how the information fed to the module controlling emotional awareness can be short-circuited, even though the parts of the brain generating emotion continue to function.

So when we talk about a "conscious robot", we're not just talking about a robot that can do a lot of what our brains can do. We must be speaking of a robot that has this particular capacity to be "aware" of at least some of what it's doing.

Old 14th August 2009, 04:51 AM   #363
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
As long as we do not know exactly how the brain achieves consciousness, it remains an opinion.
But we may not need to know the whole story about how consciousness is produced by the brain in order to answer the question.

It's possible to have enough information to answer the question without knowing everything, just as we answer questions in astronomy without knowing everything about the cosmos.

I think we do know enough to say that the pencil brain is too slow to be able to do what the brain is doing when it generates conscious awareness.

Old 14th August 2009, 04:56 AM   #364
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
The OP states: "Let's assume that the appropriately organized software is conscious in a sense similar to that of human brains." Where do you read that the OP is not concerned with a virtual simulation of consciousness? How do you interpret the word "similar"?
What if there were an OP that said, "Let's assume we have enough food to feed an army...." Would you be ok with the premise that we can proceed with that thought experiment by positing that we have some sort of virtual food to feed that army?

The OP stipulates that we're dealing with a robot that is conscious like we are. In other words, we've got something here in the physical world that is actually self-aware.

No virtual "simulation" is being stipulated.

Old 14th August 2009, 05:51 AM   #365
steenkh
Philosopher

Originally Posted by Piggy View Post
But when we look at how consciousness is generated in the cases we know of, we see that it's not simply a global emergent property that arises from having a bunch of neurons hooked up, but rather it is a specialized function of the brain.
How do we see this? Why do bunches of neurons not constitute the specialised function of the brain? Bunches of photoreceptor cells also make up vision.

Originally Posted by Piggy View Post
Consciousness is your experience.
Yes, but I doubt that the experience could exist without the lower layer of consciousness.

Quote:
We have to ask the question, not only "How are we conscious?", but "Why?"

Why does the brain bother to do this? There must be some evolutionary advantage to it.
Definitely, but if it is an emergent quality, evolution might have stumbled upon it by chance. In fact, I am pretty sure that evolution does not go in any special direction, unless there is a god to direct it, so it is fairly certain that consciousness just happened, and evolution opportunistically latched on to a good thing.

Quote:
The points you make here are ones I've made previously on this thread. Consciousness appears to be a specialized "downstream" function. Our awareness of our actions and decisions appears to be after-the-fact.
Yes, our awareness is after the fact, but we are not consciously firing certain neurons, so it is rather obvious that any decision making is done on a level that is only later brought to the awareness level. Apparently, this is how consciousness works. I believe that you could not have consciousness without all of the levels.

Quote:
There's a wealth of experiments showing a distinction between conscious functions of the brain and all the rest (the majority) of activity.
Do these experiments also show that consciousness could exist without the rest of the activity?

Originally Posted by Piggy View Post
But we may not need to know the whole story about how consciousness is produced by the brain in order to answer the question.
Quite true. I said something along the same lines in an earlier post. As long as we do not know exactly what makes up consciousness, we can still simulate consciousness by simulating every single element. Once we have the knowledge of what makes up consciousness, we can cut down on everything that is not strictly necessary.

Quote:
I think we do know enough to say that the pencil brain is too slow to be able to do what the brain is doing when it generates conscious awareness.
You are supposing that the timing element is important. I see no reason to accept this.

Originally Posted by Piggy View Post
What if there were an OP that said, "Let's assume we have enough food to feed an army...." Would you be ok with the premise that we can proceed with that thought experiment by positing that we have some sort of virtual food to feed that army?
I fail to see the connection.

Quote:
The OP stipulates that we're dealing with a robot that is conscious like we are. In other words, we've got something here in the physical world that is actually self-aware.

No virtual "simulation" is being stipulated.
A robot is not a biological being. "Conscious like we are" necessarily means that it is a virtual simulation that is making the robot conscious. Though presumably not a paper machine.

Many years ago I produced a virtual simulation of a tape recorder that worked exactly like a tape recorder, except that it had no tapes. I see nothing in the OP that rules out that the robot is running a virtual simulation of a brain.

Old 14th August 2009, 06:27 AM   #366
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
How do we see this? Why do bunches of neurons not constitute the specialised function of the brain? Bunches of photoreceptor cells also make up vision.
Bunches of neurons do constitute the brain, obviously.

The position I was arguing against is that a bunch of neurons is all you need for consciousness to "emerge" from the critical mass.

That does not appear to be the case.

Consciousness, like vision, doesn't just arise from any ol' bundle of neurons. It requires particular kinds of circuitry.

Old 14th August 2009, 06:28 AM   #367
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
Yes, but I doubt that the experience could exist without the lower layer of consciousness.
I'm sorry, but I don't know what you mean by "lower layer of consciousness".

Old 14th August 2009, 06:30 AM   #368
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
Definitely, but if it is an emergent quality, evolution might have stumbled upon it by chance. In fact, I am pretty sure that evolution does not go in any special direction, unless there is a god to direct it, so it is fairly certain that consciousness just happened, and evolution opportunistically latched on to a good thing.
Of course it's true that consciousness evolved like everything else in our biological world. I don't understand why you're bringing it up.

Old 14th August 2009, 06:31 AM   #369
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
Yes, our awareness is after the fact, but we are not consciously firing certain neurons, so it is rather obvious that any decision making is done on a level that is only later brought to the awareness level. Apparently, this is how consciousness works. I believe that you could not have consciousness without all of the levels.
Yes, but why are you bringing this up?

I'm totally befuddled about why you're stating all these obvious points.

Old 14th August 2009, 06:32 AM   #370
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
Do these experiments also show that consciousness could exist without the rest of the activity?
Of course not. What in the world is your point?

Old 14th August 2009, 06:47 AM   #371
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
You are supposing that the timing element is important. I see no reason to accept this.
Well, let's take an example of a conscious event.

First of all, we know there's a timespan below which events will not be consciously processed. Flicker that image too fast, and an observer won't be aware of it, even though his brain has processed it (which we can tell b/c it influences the observer's behavior).

So consciousness, as you and roger have both pointed out, is something that doesn't exist in very small frames of time, but in what we might call macro time.

Let's take the example of being aware that someone has said your name at a party.

An enormous amount of data has to be aggregated and analyzed and associated. (Yes, I know the pencil brain can aggregate data etc.)

All the incoming sounds have to be parsed, matched with stored patterns, compared with each other, triaged, prioritized.

The result is a pretty massive assemblage of simultaneous information which results in something like: "In this particular setting, that set of sounds is someone saying my name, which is more important than what I'm focusing on now, so I'll attend to it instead".

Can that feat be accomplished by feeding discrete bits of information at a very slow rate into the areas of the brain responsible for conscious awareness?

No, because when you do that you lose the large-scale data coherence that's necessary for the brain to do this, and you have neuronal activity in discrete pulses that are below the minimum timespan for events to be consciously processed.

I suppose if you had a very slow machine that stored up information, then sent it in coherent bundles in short bursts of macro time to the modules responsible for conscious awareness, you could have intermittent bursts of consciousness, though.

Essentially, we're asking if there's a stall speed for consciousness like there's a stall speed for engines.

Old 14th August 2009, 06:50 AM   #372
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
A robot is not a biological being. "Conscious like we are" necessarily means that it is a virtual simulation that is making the robot conscious. Though presumably not a paper machine.

Many years ago I produced a virtual simulation of a tape recorder that worked exactly like a tape recorder, except that it had no tapes. I see nothing in the OP that rules out that the robot is running a virtual simulation of a brain.
This may be semantics, but at that point I'd argue you're no longer simulating consciousness. At that point, it's identity. You have consciousness.

As I read the OP, that's what's been stipulated.

Old 14th August 2009, 07:13 AM   #373
roger
Penultimate Amazing

Originally Posted by Piggy View Post

I say no, and for entirely practical reasons.

The step which, from my point of view, you are failing to make is to actually go look at how consciousness is created and ask, "Is this a function that can be maintained by something that works as slowly as a pencil brain?"

I say no, because you don't have simultaneous coordination of large enough amounts of coherent data over short enough continuous spans of time to achieve it.

That's why I say that the pencil brain cannot mimic that particular function.
I am addressing this point.

This next part isn't going to convince you, but just let me say it first.

The smallest possible unit of time is Planck time, on the order of 5*10^-44 seconds. Rather brief. Nothing happens that is shorter than that duration. If we looked at your brain using that time frame, you would no longer talk about 'simultaneous coordination'. Events that are happening simultaneously are in fact happening glacially - you wouldn't even see things happening on that time scale. The impulses running from neuron to neuron in the chemical soup would appear absolutely frozen to you on that time scale. You'd have to leave markers and come back days later to even notice they moved. You could reach in 'by hand' on that time scale and make things happen, one at a time, one neuron at a time, and have time to play a golf game and go to the symphony between each adjustment. Heck, at that time scale, you could write a million novels between each look at a single neuron (there are only 100 billion to concern yourself with, after all). In other words, at that time scale, even though things are happening simultaneously, the signals and events are happening so slowly that there is no need for changes to happen simultaneously - you can easily do it one at a time. You wouldn't be talking about simultaneity on this time scale, and you certainly wouldn't be talking about continuity. You'd see an absolutely frozen object that appears to do nothing over 100 years. This is unarguable: on a time scale that actually exists in our universe, the brain is basically doing nothing, 'continuous' has no meaning, and yet it produces consciousness. Hence, continuity is not a necessary property of consciousness.


Okay, so at this point I'm assuming you aren't convinced. If not, you are positing something 'special' about simultaneity that creates consciousness. From a computational point of view there is no reason to assume that. Certainly there is a need for our neurons, running at the speeds they do, to be simultaneously doing things to get everything they need to get done in the time they have available. But if those neurons were running, say, 10 trillion times faster than they do right now, why couldn't you just have one trillion, and a big bank of memory, do all the work of your 100 billion neurons? Computationally, they are equivalent, and I think we now have you saying consciousness is computational. Again, this is a thought experiment; no need to point out that our neurotransmitters wouldn't work at that speed.

I'm not sure what you meant by 'practical'. Certainly with our brain we need parallel processing to get things done in time. But the pencil brain is not constrained by our time scale, where 10^-44 sec is too small to notice and 1 second is a small but noticeable time increment. With the pencil brain, I'm saying 1 second is its Planck time interval. Unable to perceive it, basically nothing happens during it; the brain is frozen. But as 10^44 seconds pass, the pencil brain will perceive, it will think, it will be conscious. They are computationally equivalent.

I assume that you don't think the neurotransmitters are creating a 'field' or something that creates consciousness. I think we are both on the same page - it's how large bundles of neurons process information, self-referentially, that creates consciousness. It's the processing, nothing more. If so, there is no fundamental difference between parallel and sequential processing. Again, back to the Planck scale. Sure, even at that speed, there's an impulse running along nerve 10234 and another impulse along nerve 2343322. Simultaneously! Sure, but it's not the impulse that's creating consciousness, it's what happens when it reaches the neuron (broadly). A neuron gets a bunch of inputs, and at a certain threshold it fires. At the Planck scale, it'd be an astonishing miracle if two neurons got a signal at the same moment. At that time scale (if you were living such that a Planck interval felt like one second to you) you might wait decades between two events you swear are instantaneous. So, we know simultaneity in that sense plays no role.

So, we are left with the field concept. I don't see any evidence for such a thing, so I dismiss it. If there was a field, then speed and simultaneousness could matter, and the pencil brain, running on a substrate that doesn't generate that field, wouldn't work. But, that is just pie-in-the-sky speculation.

I can't think of any other objections or alternatives. On a time scale that actually exists in this world, your brain does not do the 'important things' simultaneously or continuously, where 'important things' means neurons receiving impulses and firing. It's the receiving, firing, and storing of data that creates consciousness, not the mere fact of neurotransmitters propagating a signal.

Therefore, a pencil brain operating on a time scale of seconds would work too, and would perceive time passing in parcels of billions of eons.
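For what it's worth, the computational equivalence roger is describing can be shown with a toy example in Python. The threshold units, weights and update rule below are all invented for illustration and have nothing to do with real neurons; the only point is that updating every unit in one "simultaneous" sweep and updating them one at a time against a frozen copy of the previous state produce exactly the same trajectory.

Code:
# Toy demonstration: a synchronous ("simultaneous") update of a network of
# threshold units gives the same result as updating them one at a time,
# as long as each sequential update reads from a frozen copy of the old state.
# All units, weights, and thresholds here are invented for illustration.

import random

random.seed(42)
N = 50                                    # number of toy "neurons"
weights = [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)]
state = [random.choice([0, 1]) for _ in range(N)]

def step_parallel(state):
    """Update every unit in one 'simultaneous' sweep."""
    return [1 if sum(w * s for w, s in zip(weights[i], state)) > 0 else 0
            for i in range(N)]

def step_sequential(state):
    """Update one unit at a time, as slowly as you like, reading the old state."""
    old = list(state)                     # frozen snapshot of the previous moment
    new = []
    for i in range(N):                    # could pause for a year between these
        total = sum(w * s for w, s in zip(weights[i], old))
        new.append(1 if total > 0 else 0)
    return new

a = b = state
for _ in range(10):                       # run ten "moments" both ways
    a = step_parallel(a)
    b = step_sequential(b)

print(a == b)                             # True: the trajectories are identical

The only thing the sequential version needs is somewhere to keep the old state while it works through the list, which is the pencil-and-paper equivalent of roger's "leave markers and come back later".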

Old 14th August 2009, 07:19 AM   #374
Piggy
Unlicensed street skeptic

I like roger's idea of reposting the OP at the top as a reminder:

Quote:
Consider a conscious robot with a brain composed of a computer running sophisticated software. Let's assume that the appropriately organized software is conscious in a sense similar to that of human brains.

Would the robot be conscious if we ran the computer at a significantly reduced clock speed? What if we single-stepped the program? What would this consciousness be like if we hand-executed the code with pencil and paper?
So this question boils down to whether there's a "stall speed" for consciousness, like there is with a car engine.

Now, for the car, we could theoretically build a second car which has equivalents for all the parts of the working car in the same arrangement, but everything moves much more slowly.

Let's say there is only one event per second in this model car.

(Assuming all physics is computable, both cars are computable.)

Q: Will the second car run?

A: No.

So we cannot simply assume that a slow model brain will be conscious. It depends on what the mechanism of consciousness is.

Old 14th August 2009, 07:24 AM   #375
roger
Penultimate Amazing

Originally Posted by Piggy View Post

So we cannot simply assume that a slow model brain will be conscious. It depends on what the mechanism of consciousness is.
Right. We all agree with this. Any given substrate, in the real world, has a minimum and maximum run speed. My beloved pencil brain wouldn't really work. Pencil fades in several hundred years, and paper deteriorates. We wouldn't actually get anything done before everything decomposed. We'd need this stuff to hang around for eons for the brain to do something useful, and that wouldn't happen.

But we are trying to discuss a more philosophical point rather than an implementation detail. Nothing, in principle, stops a pencil brain from having a consciousness that spans eons.

Old 14th August 2009, 07:32 AM   #376
Piggy
Unlicensed street skeptic

Originally Posted by roger View Post
The smallest possible unit of time is Planck time, on the order of 5*10^-44 seconds. Rather brief. Nothing happens that is shorter than that duration. If we looked at your brain using that time frame, you would no longer talk about 'simultaneous coordination'. Events that are happening simultaneously are in fact happening glacially - you wouldn't even see things happening on that time scale. The impulses running from neuron to neuron in the chemical soup would appear absolutely frozen to you on that time scale. You'd have to leave markers and come back days later to even notice they moved. You could reach in 'by hand' on that time scale and make things happen, one at a time, one neuron at a time, and have time to play a golf game and go to the symphony between each adjustment. Heck, at that time scale, you could write a million novels between each look at a single neuron (there are only 100 billion to concern yourself with, after all). In other words, at that time scale, even though things are happening simultaneously, the signals and events are happening so slowly that there is no need for changes to happen simultaneously - you can easily do it one at a time. You wouldn't be talking about simultaneity on this time scale, and you certainly wouldn't be talking about continuity. You'd see an absolutely frozen object that appears to do nothing over 100 years. This is unarguable: on a time scale that actually exists in our universe, the brain is basically doing nothing, 'continuous' has no meaning, and yet it produces consciousness. Hence, continuity is not a necessary property of consciousness.
Since there is no consciousness on that scale, it doesn't really matter.

In fact, let's consider why it should be that there is a subliminal timeframe at all.

Why doesn't the brain move anything that happens within very short timeframes into conscious processing? We know that the brain processes information below that timeframe, so we imagine that it could push it into conscious processing. Why doesn't it?

That's because consciousness does rely on the coordination of apparently coherent data. And below a certain time interval, that kind of coherence just doesn't exist.

You might argue that our conscious experience is like a movie -- frames that only appear continuous.

Below the subliminal threshold, the brain can still process data, but it can't lump that data into apparently coherent sets which can be used to create the experience of being aware of them.

We perceive our experience to be continuous, but it is not.

Instead, the brain has to (on very short timescales) create chunks of highly processed data that are treated as if they were coherent and simultaneous by the brain structures that generate consciousness. On the neural level, it is transferred in the same way as any other data. But these modules appear to process batches of it.

This is only possible at a high level of organization.

The result is our experience of being "aware" of things.

If we slow down the data stream so much that we break up these apparently coherent batches, then all input becomes subliminal.

Old 14th August 2009, 07:36 AM   #377
Piggy
Unlicensed street skeptic

Originally Posted by roger View Post
Right. We all agree with this. Any given substrate, in the real world, has a minimum and maximum run speed. My beloved pencil brain wouldn't really work. Pencil fades in several hundred years, and paper deteriorates. We wouldn't actually get anything done before everything decomposed. We'd need this stuff to hang around for eons for the brain to do something useful, and that wouldn't happen.

But we are trying to discuss a more philosophical point rather than an implementation detail. Nothing, in principle, stops a pencil brain from having a consciousness that spans eons.
I disagree. That's like saying that nothing in principle stops our one-event-per-second car from running.

In fact, it won't be able to run.

See my above post for an explanation of why a one-impulse-per-second brain won't run the consciousness function. At that speed, everything becomes subliminal, so the brain can't be conscious of anything.

Old 14th August 2009, 07:42 AM   #378
Piggy
Unlicensed street skeptic

Originally Posted by roger View Post
Certainly with our brain we need parallel processing to get things done in time. But the pencil brain is not constrained by our time scale, where 10^-44 sec is too small to notice and 1 second is a small but noticeable time increment. With the pencil brain, I'm saying 1 second is its Planck time interval. Unable to perceive it, basically nothing happens during it; the brain is frozen. But as 10^44 seconds pass, the pencil brain will perceive, it will think, it will be conscious. They are computationally equivalent.
What method are you going to use to overcome the subliminal threshold?

At some point, the pencil brain has to speed up.

If it works so slowly that all transfer of information happens at pencil speed, how do you create the apparent cohesion and simultaneity which the generation of conscious awareness requires?

You could do it by storing up sufficient data to create cohesion, then dumping all that aggregated and associated data very rapidly into the right module and allowing it to run at a speed that keeps up above the threshold. But then we'd have a hybrid, of which the pencil brain is only one part.
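To make the "hybrid" idea concrete, here is a minimal sketch of the two delivery patterns being contrasted: dripping items into a downstream consumer one at a time versus buffering them and handing over a whole chunk at once. The consumer function, the batch size and the data are all invented for illustration, and nothing in the code settles whether the distinction matters for consciousness.

Code:
# Sketch of "drip feed" vs "buffered burst" delivery to a downstream module.
# The module, batch size, and data are all invented for illustration.

from collections import deque

BATCH = 8                       # arbitrary "coherence" threshold

def downstream_module(chunk):
    """Stand-in for whatever consumes the data; it just reports what it got."""
    print(f"received {len(chunk)} items together: {chunk}")

def drip_feed(items):
    for item in items:          # one item per (arbitrarily slow) step
        downstream_module([item])

def buffered_burst(items):
    buffer = deque()
    for item in items:          # slow accumulation...
        buffer.append(item)
        if len(buffer) == BATCH:
            downstream_module(list(buffer))   # ...fast, coherent hand-off
            buffer.clear()
    if buffer:
        downstream_module(list(buffer))

data = list(range(20))
drip_feed(data)          # 20 deliveries of size 1
buffered_burst(data)     # 2 deliveries of size 8, one of size 4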

Old 14th August 2009, 07:43 AM   #379
Piggy
Unlicensed street skeptic

Originally Posted by roger View Post
On a time scale that actually exists in this world, your brain does not do the 'important things' simultaneously or continuously, where 'important things' means neurons receiving impulses and firing. It's the receiving, firing, and storing of data that creates consciousness, not the mere fact of neurotransmitters propagating a signal.
But the action of neurons is not the only important thing.

We have to look at larger scale structures, and how they behave at that level of granularity, to understand how consciousness is created.

Old 14th August 2009, 08:05 AM   #380
SK.
New Blood

Originally Posted by Piggy View Post
First of all, we know there's a timespan below which events will not be consciously processed. Flicker that image too fast, and an observer won't be aware of it, even though his brain has processed it (which we can tell b/c it influences the observer's behavior).
What you are describing has nothing to do with any dependence of consciousness on timespan, but with the synchronization of the computation that is/results in consciousness with the incoming sensory data. Of course you won't be able to just slow down or speed up a human's or AI's brain situated in the real world at will and expect it to stay conscious.

But let's say you create a log of the exact sensory inputs you receive while looking at that flickering image, with exact timestamp information, as well as the starting state of your computation device. You can then recreate the exact computation that happens when looking at that image, regardless of the actual speed at which it is performed. Be it a nanosecond or a thousand years, the computation and its results will stay the same. If this computation results in consciousness, it will do so regardless of the actual time it took.

One could actually consider not only using a pencil to run a brain, but also having another person use another pencil to compute the simulation environment that brain lives in and the associated incoming sensory input. And because both of these algorithms are computable, you could even have a single TM/human with a pencil computing both the brain's activity and the simulation it lives in. For such a system it doesn't matter whatsoever at which speed it is run; it will always be conscious and in sync with its sensory input.

Of course, all the above assumes consciousness to be computable, as discussed earlier in this thread.
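SK.'s replay argument can be sketched in a few lines of Python. The step function, the log format and the optional delay below are all invented; the only point illustrated is that a deterministic computation over a recorded input log reaches the same state whether it runs flat out or with an arbitrary pause between steps.

Code:
# Sketch of deterministic replay: a recorded log of (timestamp, input) pairs
# fed through the same step function yields the same states regardless of how
# slowly the replay is executed. The "brain" here is just a toy accumulator.

import time

def brain_step(state, timestamp, sensory_input):
    """Stand-in for one update of the simulated brain (pure, deterministic)."""
    return (state * 31 + sensory_input + timestamp) % 1_000_000_007

def run(log, initial_state, delay_per_step=0.0):
    state = initial_state
    for timestamp, sensory_input in log:
        state = brain_step(state, timestamp, sensory_input)
        time.sleep(delay_per_step)        # slow it down as much as you like
    return state

log = [(t, (t * 7) % 13) for t in range(1000)]   # invented sensory log

fast = run(log, initial_state=1)                         # well under a second
slow = run(log, initial_state=1, delay_per_step=0.001)   # ~1 s; could be years
print(fast == slow)                                      # True: same computation

The same would hold if brain_step were executed with pencil and paper; only the wall-clock time changes.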

Old 14th August 2009, 08:27 AM   #381
Piggy
Unlicensed street skeptic

Originally Posted by SK. View Post
Of course you won't be able to just slow down or speed up a human's or AI's brain situated in the real world at will and expect it to stay conscious.
And yet, that is precisely the question before us.

Quote:
Consider a conscious robot with a brain composed of a computer running sophisticated software. Let's assume that the appropriately organized software is conscious in a sense similar to that of human brains.

Would the robot be conscious if we ran the computer at a significantly reduced clock speed? What if we single-stepped the program? What would this consciousness be like if we hand-executed the code with pencil and paper?

Old 14th August 2009, 08:37 AM   #382
Piggy
Unlicensed street skeptic

Originally Posted by SK. View Post
But let's say you create a log of the exact sensory inputs you receive while looking at that flickering image, with exact timestamp information, as well as the starting state of your computation device. You can then recreate the exact computation that happens when looking at that image, regardless of the actual speed at which it is performed. Be it a nanosecond or a thousand years, the computation and its results will stay the same. If this computation results in consciousness, it will do so regardless of the actual time it took.

One could actually consider not only using a pencil to run a brain, but also having another person use another pencil to compute the simulation environment that brain lives in and the associated incoming sensory input. And because both of these algorithms are computable, you could even have a single TM/human with a pencil computing both the brain's activity and the simulation it lives in. For such a system it doesn't matter whatsoever at which speed it is run; it will always be conscious and in sync with its sensory input.

Of course, all the above assumes consciousness to be computable, as discussed earlier in this thread.
I wasn't talking about looking at movies, but the fact that our awareness is kind of like a movie, only apparently continuous.

Maintaining conscious awareness of our environment requires a certain level of processing speed, if you will.

Take the case of hearing your name at a party.

What is received by the brain modules that generate conscious experience (CMs) isn't the raw input, not by a long shot. What goes in has been highly processed, mixed with stored data, and "chunked".

What goes in is something akin to "That's my wife's voice saying my name a few feet over to my left".

Other parts of the brain do all the pre-processing. Then bundles of highly processed information are streamed into the CM. But the CM does not treat them as if they were streamed. It treats them as if they were coherent.

Often, the preprocessing introduces errors, sometimes gross errors, because it uses shortcuts. We often "see" things that aren't there, and fail to see things that are.

If we slow down the processing speed so that information drips into the CMs at a rate below the subliminal threshold, the apparent coherence is lost, and the CMs can't process the data, because it's not formatted correctly. It would be treated as discrete impulses, which the CMs can't "read".

Our brains do depend on a certain minimum speed in order to generate conscious awareness.

So the pencil brain would only work if it were part of a system which, at some points, acted much faster.

Old 14th August 2009, 08:40 AM   #383
SK.
New Blood

Originally Posted by Piggy View Post
And yet, that is precisely the question before us.
No, the question isn't whether an AI brain in the real world will stay conscious if it's run totally out of sync with its external interfaces.
This additional constraint doesn't exist in the OP's question.

Old 14th August 2009, 08:43 AM   #384
Piggy
Unlicensed street skeptic

Originally Posted by SK. View Post
No, the question isn't whether an AI brain in the real world will stay conscious if it's run totally out of sync with its external interfaces.
This additional constraint doesn't exist in the OP's question.
Then how do you read the question?

Old 14th August 2009, 09:13 AM   #385
SK.
New Blood

Originally Posted by Piggy View Post
Then how do you read the question?
There is no reference to external interfaces, so we might as well think about an AI that has none, or has none at the moment we're undertaking our experiment.

Consider this: we have an advanced robot that senses the world with some sensors. We now disconnect all external sensors (and internal ones too, just for good measure) for 5 seconds and then reconnect them. If the robot is conscious before unplugging the sensors, it is quite likely that it is conscious during the 5-second blackout period too. If we now severely underclock the robot's CPU during its blackout phase (and yes, its internal clock is dependent on the CPU clock), the robot will stay conscious and experience 5 conscious seconds, while in reality the blackout phase might have taken 10 seconds or 10,000 years, depending on the underclocking.
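A toy version of this blackout experiment, with the step rate, update rule and slowdown factor all invented for illustration: the robot's subjective clock is driven by how many steps it has executed, so it always "experiences" the same 5 seconds no matter how long the wall clock says the blackout lasted.

Code:
# Toy blackout experiment: subjective time is counted in executed steps,
# so an underclocked robot still "experiences" the same 5 seconds.
# Step rates, the update rule, and the slowdown factor are all invented.

import time

STEPS_PER_SUBJECTIVE_SECOND = 100          # arbitrary internal tick rate

def run_blackout(subjective_seconds, slowdown=0.0):
    state = 0
    steps = subjective_seconds * STEPS_PER_SUBJECTIVE_SECOND
    start = time.time()
    for _ in range(steps):                 # sensors disconnected: no input
        state = (state * 5 + 1) % 104729   # the robot keeps "thinking"
        time.sleep(slowdown)               # underclocking during the blackout
    wall = time.time() - start
    return state, steps / STEPS_PER_SUBJECTIVE_SECOND, wall

normal = run_blackout(5)
slowed = run_blackout(5, slowdown=0.002)   # could just as well be years
print(normal[0] == slowed[0])              # True: same internal history
print(normal[1], slowed[1])                # both experienced 5 "seconds"
print(normal[2], slowed[2])                # very different wall-clock times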

Old 14th August 2009, 09:32 AM   #386
Piggy
Unlicensed street skeptic

Originally Posted by SK. View Post
Consider this: we have an advanced robot that senses the world with some sensors. We now disconnect all external sensors (and internal ones too, just for good measure) for 5 seconds and then reconnect them. If the robot is conscious before unplugging the sensors, it is quite likely that it is conscious during the 5-second blackout period too.
If it is conscious during the sensory blackout, then it is dreaming, which is fine.

In common parlance, we are "unconscious" when dreaming, but for our purposes here we have to consider dreams to be "conscious" experience, since we are aware of what are essentially hallucinations during our dreams.

So we posit a dreaming robot.

Originally Posted by SK. View Post
If we now severely underclock the robot's CPU during its blackout phase (and yes, its internal clock is dependent on the CPU clock), the robot will stay conscious and experience 5 conscious seconds, while in reality the blackout phase might have taken 10 seconds or 10,000 years, depending on the underclocking.
Ok, I'll need some clarification on this point here.

What actually happens physically/electronically to the robot brain when we "severely underclock the... CPU"?

What's the actual difference in the CPU's activity between the normal and underclocked states?

Thanks.

Old 14th August 2009, 11:50 AM   #387
SK.
New Blood

Originally Posted by Piggy View Post
What actually happens physically/electronically to the robot brain when we "severely underclock the... CPU"?

What's the actual difference in the CPU's activity between the normal and underclocked states?

Thanks.
In its simplest form, underclocking would work like it does on most of today's notebook CPUs, by decreasing the value of the CPU multiplier. The exact method of slowing down isn't as important as the concept that a slowdown can be achieved in principle.
Be it by underclocking, inserting sleep / do-nothing instructions between each of the AI's instructions, or whatever else comes to one's mind.
The only necessary condition for completely equivalent computation, regardless of actual speed, is that all components of the system are slowed down proportionally, so they don't get out of sync.
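A sketch of that "slow everything down proportionally" condition, with an invented environment and brain that exchange values each tick. Scaling the one delay they share stretches wall-clock time without changing the sequence of states; letting only one side slow down is what would break the equivalence.

Code:
# Sketch: brain and environment advance in lock-step; one shared slowdown
# factor stretches wall-clock time without changing the computation.
# The step functions are toy stand-ins, not any real architecture.

import time

def env_step(env_state, motor_command):
    return (env_state + motor_command) % 97        # toy "world physics"

def brain_step(brain_state, sensor_reading):
    new_state = (brain_state * 3 + sensor_reading) % 101
    motor_command = new_state % 5
    return new_state, motor_command

def run(ticks, slowdown=0.0):
    env, brain, motor = 1, 1, 0
    trace = []
    for _ in range(ticks):
        env = env_step(env, motor)                 # both sides share...
        brain, motor = brain_step(brain, env)      # ...the same tick
        trace.append((env, brain, motor))
        time.sleep(slowdown)                       # and the same delay
    return trace

print(run(500) == run(500, slowdown=0.001))        # True for any slowdown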

Old 14th August 2009, 12:36 PM   #388
steenkh
Philosopher

Originally Posted by Piggy View Post
The position I was arguing against is that a bunch of neurons is all you need for consciousness to "emerge" from the critical mass.

That does not appear to be the case.
On what do you base this conclusion?

Quote:
Consciousness, like vision, doesn't just arise from any ol' bundle of neurons. It requires particular kinds of circuitry.
That is possible, but it might remain a Turing Machine, and it can in principle be simulated on paper or in silicon. Do you have any sources for the specialised circuitry?

Originally Posted by Piggy View Post
I'm sorry, but I don't know what you mean by "lower layer of consciousness".
The unconscious layer that is needed for consciousness.

Originally Posted by Piggy View Post
Of course it's true that consciousness evolved like everything else in our biological world. I don't understand why you're bringing it up.
You brought up evolution, I did not. I also did not see the relevance.

Originally Posted by Piggy View Post
Well, let's take an example of a conscious event.

First of all, we know there's a timespan below which events will not be consciously processed. Flicker that image too fast, and an observer won't be aware of it, even though his brain has processed it (which we can tell b/c it influences the observer's behavior).

So consciousness, as you and roger have both pointed out, is something that doesn't exist in very small frames of time, but in what we might call macro time.
So far you have not brought up anything that cannot be simulated. You do realise that we are not talking about a realtime simulation of a brain, right?

Quote:
Let's take the example of being aware that someone has said your name at a party.

An enormous amount of data has to be aggregated and analyzed and associated. (Yes, I know the pencil brain can aggregate data etc.)

All the incoming sounds have to be parsed, matched with stored patterns, compared with each other, triaged, prioritized.

The result is a pretty massive assemblage of simultaneous information which results in something like: "In this particular setting, that set of sounds is someone saying my name, which is more important than what I'm focusing on now, so I'll attend to it instead".

Can that feat be accomplished by feeding discrete bits of information at a very slow rate into the areas of the brain responsible for conscious awareness?
Not in a real brain, but in a simulated brain, it would be no problem.

Quote:
No, because when you do that you lose the large-scale data coherence that's necessary for the brain to do this, and you have neuronal activity in discrete pulses that are below the minimum timespan for events to be consciously processed.

I suppose if you had a very slow machine that stored up information, then sent it in coherent bundles in short bursts of macro time to the modules responsible for conscious awareness, you could have intermittent bursts of consciousness, though.

Essentially, we're asking if there's a stall speed for consciousness like there's a stall speed for engines.
This was the question of the OP, and I believe that for a biological brain, there is a stall speed. For a simulated brain there may be no stall speed at all. The entire contents of every element of the simulated brain, down to the last atom could be stored somewhere and be restored at a later stage, and that consciousness would never notice what had happened.
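The store-and-restore point is easy to sketch. The "brain" below is a trivial invented state machine; the only claim illustrated is that serialising the full state, waiting arbitrarily long, and restoring it leaves the subsequent computation identical to never having paused.

Code:
# Sketch of checkpoint/restore: freeze the whole simulated state, bring it
# back later, and the computation continues exactly as if nothing happened.
# The state and update rule are invented for illustration.

import pickle

def brain_step(state):
    """Toy deterministic update of the whole simulated brain state."""
    return {"tick": state["tick"] + 1,
            "memory": (state["memory"] * 7 + state["tick"]) % 999983}

state = {"tick": 0, "memory": 1}

for _ in range(100):                         # run for a while...
    state = brain_step(state)

snapshot = pickle.dumps(state)               # ...store every element of it
# (arbitrarily long gap here: minutes, or an archive on a shelf for eons)
restored = pickle.loads(snapshot)            # ...and restore it later

for _ in range(100):                         # continue both copies in parallel
    state = brain_step(state)
    restored = brain_step(restored)

print(state == restored)                     # True: the pause left no trace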

Old 14th August 2009, 01:11 PM   #389
Piggy
Unlicensed street skeptic

Originally Posted by SK. View Post
In its simplest form, underclocking would work like it does on most of today's notebook CPUs, by decreasing the value of the CPU multiplier. The exact method of slowing down isn't as important as the concept that a slowdown can be achieved in principle.
Be it by underclocking, inserting sleep / do-nothing instructions between each of the AI's instructions, or whatever else comes to one's mind.
The only necessary condition for completely equivalent computation, regardless of actual speed, is that all components of the system are slowed down proportionally, so they don't get out of sync.
So it would be the equivalent of, say, putting some sort of pause mechanism in the middle of each neuron, so that the impulse had to stop and wait for a prescribed amount of time before moving on?

(Impossible, yes, but in our hypothetical world.)

Old 14th August 2009, 01:22 PM   #390
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
On what do you base this conclusion?
I'm not sure why you want me to repeat this, but it's because of what we observe in the operation of the brain. For example, Marvin's case, in which a stroke disrupted the pathway leading to the part of the brain that generates consciousness of emotions, resulting in a kind of emotional blindness.

We also infer this from the types of errors that are typical of conscious perception, such as the famous "gorilla on the basketball court" experiment, in which people counting ball passes often ignore a person in a gorilla suit who walks onto the court, stands amidst the players, beats his chest, and walks off.

We know from many studies that the mind doesn't shut off perception of these things. Rather, it simply fails to move that information into conscious awareness. On the other hand, the mind also fills in sensory data with stored patterns and associations to create a complete conscious experience even when the data is incomplete.

Also, there are the studies showing that we only become aware of our decisions after we make them and begin to act on them.

What emerges from all of this is a pretty clear picture of a brain that takes sensory information, combines it with stored information and internal data regarding the brain's own states, then moves highly processed information selectively to brain modules responsible for generating conscious awareness. These modules are downstream from the pre-processing centers, as well as from the areas that route physical responses out to the rest of the body.

So we can be sure that consciousness does not simply emerge from a critical mass of neurons, but rather that it is a specialized function of the brain.

Since thermostats have no such capacity built into them, it's safe to say they are not conscious.

Old 14th August 2009, 01:25 PM   #391
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
That is possible, but it might remain a Turing Machine, and it can in principle be simulated on paper or in silicon. Do you have any sources for the specialised circuitry?
I do, but I can't access them right now.

I'll see if the vid on Marvin is still out there, and I'll try to find other sources on the studies I mention above, as well as anything newer regarding the identification of areas of the brain responsible for generating the experience of conscious awareness. Might not collect all that til tomorrow, tho.

Old 14th August 2009, 01:30 PM   #392
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
So far you have not brought up anything that cannot be simulated. You do realise that we are not talking about a realtime simulation of a brain, right?
If I understand what you mean by "simulation", I'm not claiming it can't be simulated -- although I suspect that simulating consciousness would be equivalent to instantiating it.

If you're talking about a very slow machine designed to simulate the brain, then that's the question at hand -- if it computes at a much slower rate, could it still be conscious?

I'm arguing that if it were very slow, it could not, because you'd lose the data coherence necessary for the CMs to work. If you sent data to them with long pauses between impulses, all information entering the CMs would be subliminal.

Old 14th August 2009, 01:32 PM   #393
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
Not in a real brain, but in a simulated brain, it would be no problem.
And on what do you base that?

In order to make it work, you're going to have to create CMs that know how to handle data in a way that our only working consciousness-generator doesn't appear able to handle.

How will that be accomplished?

Old 14th August 2009, 01:34 PM   #394
Piggy
Unlicensed street skeptic

Originally Posted by steenkh View Post
The entire contents of every element of the simulated brain, down to the last atom could be stored somewhere and be restored at a later stage, and that consciousness would never notice what had happened.
Ok, then you're talking about stopping and starting time, for all intents and purposes.

This makes the question trivial.

If you did that, of course consciousness would continue.

Old 14th August 2009, 02:17 PM   #395
steenkh
Philosopher

Originally Posted by Piggy View Post
I'm not sure why you want me to repeat this, but it's because of what we observe in the operation of the brain. For example, Marvin's case, in which a stroke disrupted the pathway leading to the part of the brain that generates consciousness of emotions, resulting in a kind of emotional blindness.
I have seen your arguments, but I never noticed anything that spoke against an emergent function of neurons or whatever you think is important in the brain.

Quote:
We also infer this from the types of errors that are typical of conscious perception, such as the famous "gorilla on the basketball court" experiment, in which people counting ball passes often ignore a person in a gorilla suit who walks onto the court, stands amidst the players, beats his chest, and walks off.
What connection does this have with emergence?

Quote:
We know from many studies that the mind doesn't shut off perception of these things. Rather, it simply fails to move that information into conscious awareness. On the other hand, the mind also fills in sensory data with stored patterns and associations to create a complete conscious experience even when the data is incomplete.

Also, there are the studies showing that we only become aware of our decisions after we make them and begin to act on them.
I fail to see the relevance for emergence.

Quote:
What emerges from all of this is a pretty clear picture of a brain that takes sensory information, adds it with stored information and internal data regarding the brain's own states, then moves highly processed information selectively to brain modules responsible for generating conscious awareness. These models are downstream from the pre-processing centers, as well as the areas that route physical responses out to the rest of the body.
That might be an accurate description of how the brain works, but you seem to think that emergence means that a random assembly of neurons will generate consciousness. On the contrary, you have just made the case that we need a certain level of specialisation plus a huge number of neurons in order to get consciousness, which still does not rule out that emergence is important.

Originally Posted by Piggy View Post
I do, but I can't access them right now.

I'll see if the vid on Marvin is still out there, and I'll try to find other sources on the studies I mention above, as well as anything newer regarding the identification of areas of the brain responsible for generating the experience of conscious awareness. Might not collect all that til tomorrow, tho.
OK, thanks. Take your time; I might not be online as much tomorrow anyway ...

Originally Posted by Piggy View Post
If I understand what you mean by "simulation", I'm not claiming it can't be simulated -- although I suspect that simulating consciousness would be equivalent to instantiating it.
I am not sure what you mean by "instantiating" here. I think it means that we achieve consciousness, and in that case we agree.

Quote:
If you're talking about a very slow machine designed to simulate the brain, then that's the question at hand -- if it computes at a much slower rate, could it still be conscious?
Yes, that was my interpretation of the question in the OP. My reply is a yes, but we might not be able to recognise that the machine is conscious if it takes too long to compute the various states.

Quote:
I'm arguing that if it were very slow, it could not, because you'd lose the data coherence necessary for the CMs to work. If you sent data to them with long pauses between impulses, all information entering the CMs would be subliminal.
I am sorry, but I do not know what a "CM" is. I am not sure why a simulation or a robot would have CMs.

Originally Posted by Piggy View Post
And on what do you base that?

In order to make it work, you're going to have to create CMs that know how to handle data in a way that our only working consciousness-generator doesn't appear able to handle.

How will that be accomplished?
By virtue of the precondition that the brain is a Turing Machine. I still do not know what a CM is, so I cannot yet answer that part of the question.

Originally Posted by Piggy View Post
Ok, then you're talking about stopping and starting time, for all intents and purposes.

This makes the question trivial.

If you did that, of course consciousness would continue.
That answers the question of the OP about what would happen if we single-stepped the robot's brain simulation.
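In the spirit of the Turing Machine precondition, here is a rough sketch of what I mean by single-stepping -- the update rule below is only a placeholder and says nothing about what a brain actually computes:

Code:
import time

def simulate(state, advance, steps, pause=0.0):
    for _ in range(steps):
        state = advance(state)   # one tick of simulated time
        time.sleep(pause)        # wall-clock gap between ticks, invisible from inside
    return state

advance = lambda s: (s * 31 + 7) % 1000   # placeholder update rule

fast = simulate(42, advance, 50)              # run flat out
slow = simulate(42, advance, 50, pause=0.01)  # single-stepped with pauses
print(fast == slow)                           # True: the end state is identical

Whether the steps run back to back or with an hour between them, the computation arrives at the same state; only an outside observer can tell the difference.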
__________________
Steen

--
Jack of all trades - master of none!
Old 14th August 2009, 06:33 PM   #396
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Posts: 15,905
Originally Posted by steenkh View Post
I have seen your arguments, but I never noticed anything that spoke against an emergent function of neurons or whatever you think is important in the brain.
Odd, because I made all those points before, but no matter, I've made them again.
__________________
.
How can you expect to be rescued if you don’t put first things first and act proper?
Old 14th August 2009, 06:46 PM   #397
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Posts: 15,905
Originally Posted by steenkh View Post
What connection does this have with emergence?
Patience, my friend. Try reading the whole thing.

I'm discussing, at your request I believe, why we can say that consciousness is a specialized function of the brain rather than an emergent property like the whiteness of clouds which arises merely from having a pile of neurons working together.

Part of the reason we know this is because of clues we get from the kinds of errors the brain makes in conscious processing, together with other evidence such as what happens when a brain like Marvin's "breaks" after being damaged.

From the gorilla experiment, and others like subliminal information tests, we see that consciousness does not deal with raw data, but rather with highly processed information. From Marvin, and the decision sequencing tests, we see that consciousness is a "downstream" process.

From this, we can deduce that the brain does one heck of a lot of work, and responds physically to input, before it bothers to make us aware of what's going on. And when it does so, it has already filtered out what it "thinks" is unimportant, filled in gaps, and attached associations from memory to the information before pushing the resulting product into consciousness.

If consciousness were an emergent property that arose from the mere fact of stringing neurons or circuits together, this is not what we'd expect to see.

Instead, what we observe is a specialized function like any other, such as vision.

We don't expect vision to arise as a "property" of any old bundle of neurons like the whiteness of clouds. It's a function of the system that requires a proper setup. Ditto for consciousness.
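As a caricature of that pipeline -- every name and rule below is invented, and it's only meant to show the shape of the argument, not how any real brain does it:

Code:
def preprocess(raw, memory):
    salient = [x for x in raw if x.get("salience", 0) > 0.5]      # unattended items never make it
    for item in salient:
        item.setdefault("label", memory.get(item["kind"], "?"))   # gaps filled from stored patterns
    return salient

def reflex(raw):
    return [x["kind"] for x in raw if x.get("threat")]            # responses routed before awareness

def awareness(processed):
    return ["I notice a " + item["label"] for item in processed]  # the late, heavily edited feed

memory = {"ball": "basketball", "ape": "gorilla suit"}
raw = [{"kind": "ball", "salience": 0.9},
       {"kind": "ape", "salience": 0.2, "threat": True}]

print(reflex(raw))                          # ['ape'] -- acted on without ever surfacing
print(awareness(preprocess(raw, memory)))   # ['I notice a basketball'] -- the ape stays invisible

The only point is the ordering: filtering, gap-filling and response routing all happen before the awareness stage ever sees anything.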
__________________
.
How can you expect to be rescued if you don’t put first things first and act proper?
Old 14th August 2009, 06:48 PM   #398
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Posts: 15,905
Originally Posted by steenkh View Post
That might be an accurate description of how the brain works, but you seem to think that emergence means that a random assembly of neurons will generate consciousness. On the contrary, you have just made the case that we need a certain level of specialisation plus a huge number of neurons in order to get consciousness, which still does not rule out that emergence is important.
I said in my first post on this topic that if consciousness is emergent, it is an emergent feature, not an emergent property like the whiteness of clouds. So I'm not ruling out emergence. I was arguing against a specific type of emergence hypothesis.
__________________
.
How can you expect to be rescued if you don’t put first things first and act proper?
Old 14th August 2009, 06:52 PM   #399
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Posts: 15,905
Originally Posted by steenkh View Post
I am sorry, but I do not know what a "CM" is. I am not sure why a simulation or a robot would have CM's.
CM is a shorthand, which I identified upthread, for "consciousness modules".

I'm finding some really interesting new stuff, tho, that might argue against the existence of CMs for generalized awareness, altho Marvin's case certainly presents evidence for a CM controlling emotional awareness.

As for our robot, we must assume that it uses the same method used by the human brain to produce consciousness, otherwise the question becomes nonsense. It would be like saying: "Suppose there's a car which has an engine that produces motion in a way unlike anything we currently know about or imagine -- how low could its idle speed go before it stalls?"
__________________
.
How can you expect to be rescued if you don’t put first things first and act proper?
Old 14th August 2009, 06:57 PM   #400
Piggy
Unlicensed street skeptic
 
Piggy's Avatar
 
Join Date: Mar 2006
Posts: 15,905
Originally Posted by steenkh View Post
That answers the question of the OP about what would happen if we single-stepped the robot's brain simulation.
I don't think so. It seems to me that the OP posits a conscious robot (not a simulation of consciousness, but a genuine conscious entity) and asks what would happen in the real world if we slowed its processing speed.

In a simulation where you simulate halting and restarting the entire universe, of course there's no change in anything. It's entirely trivial, no matter what you're simulating.
__________________
.
How can you expect to be rescued if you don’t put first things first and act proper?