|
Tags | artificial intelligence , consciousness |
View Poll Results: Is consciousness physical or metaphysical? |
Consciousness is a kind of data processing and the brain is a machine that can be replicated in other substrates, such as general purpose computers. | 81 | 86.17% | |
Consciousness requires a second substance outside the physical material world, currently undetectable by scientific instruments | 3 | 3.19% | |
On Planet X, unconscious biological beings have perfected conscious machines | 10 | 10.64% | |
Voters: 94. You may not vote on this poll |
24th April 2012, 12:01 AM | #242 |
Banned
Join Date: Oct 2007
Posts: 20,121
|
I had a dream wherein I couldn't detect any scientific instruments.
Then my alarm clock failed to go off. |
24th April 2012, 12:58 AM | #243 |
Daydreamer
Join Date: Jul 2008
Posts: 8,044
|
But all these emotions are based on information from the past/present (knowledge of parachuting, seeing the open hatch and the ground a great distance below) being used to anticipate a future event (jumping out the hatch and falling from a great height).
The capacity to anticipate future events is a trivial thing to program. So when you said you cannot program a computer to feel the future, what were you objecting to? Perhaps the keyword here is feel? But assuming that you have a program capable of actually feeling emotions (which is theoretically possible, albeit not practically possible at present), why do you believe it wouldn't be able to feel these kinds of emotions for anticipated events as humans do? |
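The claim that anticipation is trivial to program can be made concrete. Here is a minimal Python sketch (the scenario and numbers are invented purely for illustration): a program extrapolates the next occurrence of a regular event from its past timings.

```python
# Minimal sketch of "anticipating" a future event by extrapolating
# from past observations. All names and numbers are illustrative.

def anticipate_next(event_times):
    """Predict the next event time from the average interval so far."""
    intervals = [b - a for a, b in zip(event_times, event_times[1:])]
    avg = sum(intervals) / len(intervals)
    return event_times[-1] + avg

# A recurring event observed at these times (seconds):
history = [0, 10, 20, 30]
print(anticipate_next(history))  # anticipates another event around t=40
```

Whether this counts as "feeling" the future is exactly the point under dispute, but the anticipation itself is a few lines of code.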
__________________
"That is just what you feel, that isn't reality." - hamelekim |
|
24th April 2012, 02:52 AM | #245 |
Illuminator
Join Date: Jul 2009
Posts: 3,874
|
No, the abstract knowledge you mention is based on the past. Emotions are not abstract knowledge.

It's only trivial when the future events are trivial, because they are not really future events but projections of abstractions of the past into the future.

Assuming? Yes, it is theoretically possible to project past abstractions into a future which is itself a projection of past abstractions, as long as one is consistent. The real world, however, is not a function of our consistent abstractions. If computers are to function independently of us in the real world, they will need to be more than a projection of our abstractions. Seeing that computers, like all human tools, are defined as projections of our abstractions of the past, they will never be independent of us, since they require abstractions of the past to exist, and by definition the real future is not our abstract projections of the past. Computers cannot feel the real future; they can only feel the abstract projections of the past into the future that we program them to feel. |
__________________
"Anyway, why is a finely-engineered machine of wire and silicon less likely to be conscious than two pounds of warm meat?" Pixy Misa "We live in a world of more and more information and less and less meaning" Jean Baudrillard http://bokashiworld.wordpress.com/ |
|
24th April 2012, 03:58 AM | #246 |
Penultimate Amazing
Join Date: Feb 2005
Location: Shanghai
Posts: 16,041
|
Which of the following do you disagree with:
1. Natural selection is the main driving force of evolutionary adaptations.
2. Natural selection is, to quote Wikipedia:
Quote:
What happens is that individuals who fall off cliffs die and thus fail to reproduce or to contribute to the reproduction of their close kin. Individuals who happen to have some uncomfortable feeling when too close to the edge of a cliff (say, fear) are less likely than those without that feeling to fall off cliffs. If that feeling is influenced by some genes, those genes will tend to spread through the gene pool over time. Because this process took place in the past, those of us living now tend to have those genes.

Whatever you mean by "biological evolutionary adaptation", if you are supposing that it's influenced by the future, you are certainly in disagreement with evolutionary biologists. |
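The cliff story above is easy to simulate. A toy Python sketch (all population sizes and survival probabilities are invented, purely illustrative): an allele that makes individuals fear cliff edges raises survival odds, and its frequency climbs over generations with no reference to the future at all.

```python
import random

# Toy simulation of the cliff example: a "fear" allele raises survival
# probability, so it spreads through the gene pool over generations.
# All numbers are illustrative, not empirical.

random.seed(1)

def next_generation(pop, p_survive_fearful=0.95, p_survive_fearless=0.80):
    # Each individual survives with a probability set by its allele.
    survivors = [fearful for fearful in pop
                 if random.random() < (p_survive_fearful if fearful
                                       else p_survive_fearless)]
    # Survivors reproduce back up to the original population size.
    return [random.choice(survivors) for _ in range(len(pop))]

pop = [False] * 900 + [True] * 100   # fear allele starts at 10%
for _ in range(50):
    pop = next_generation(pop)
print(sum(pop) / len(pop))           # fear-allele frequency after 50 generations
```

The selection acts only on past survival; the rising frequency in the present is the residue of that past, exactly as the post argues.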
__________________
"... when people thought the Earth was flat, they were wrong. When people thought the Earth was spherical they were wrong. But if you think that thinking the Earth is spherical is just as wrong as thinking the Earth is flat, then your view is wronger than both of them put together." Isaac Asimov |
|
24th April 2012, 04:11 AM | #248 |
Philosopher
Join Date: Apr 2007
Posts: 6,864
|
ftfy.
Computer systems have already been developed that can anticipate the future by modeling contexts (typically of potential actions), evaluating the probability of various outcomes, and planning their next actions based on the results, or feeding the evaluated likely outcomes into another round of modeling. The algorithms for this kind of multi-level decision tree analysis have been refined by generations of computer chess and other game algorithms and are now being used in more general contexts. Even the simplest tic-tac-toe program needs to anticipate the opponent's moves, and chess programs spend most of their time 'imagining' the results of possible future moves. |
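The look-ahead described above is classically implemented as minimax search, which literally enumerates every reachable future position before choosing a move. A self-contained Python sketch for tic-tac-toe (standard textbook algorithm, not code from any poster):

```python
# Minimax search: the program "imagines" every future tic-tac-toe
# position before committing to a move.

def winner(b):
    lines = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
    for i, j, k in lines:
        if b[i] and b[i] == b[j] == b[k]:
            return b[i]
    return None

def minimax(board, player):
    """Return (score, move) for `player`; 'X' maximizes, 'O' minimizes."""
    w = winner(board)
    if w:
        return (1 if w == 'X' else -1), None
    moves = [i for i, c in enumerate(board) if c is None]
    if not moves:
        return 0, None                      # draw
    results = []
    for m in moves:
        board[m] = player                   # imagine making this move...
        score, _ = minimax(board, 'O' if player == 'X' else 'X')
        board[m] = None                     # ...then un-imagine it
        results.append((score, m))
    return (max if player == 'X' else min)(results)

# X to move with two in a row: the look-ahead finds the winning square.
board = ['X', 'X', None,
         'O', 'O', None,
         None, None, None]
print(minimax(board, 'X'))  # (1, 2): X wins by playing square 2
```

Every "anticipated future" here is a projection computed in the present from the current position, which is the crux of the surrounding debate.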
__________________
Simple probability tells us that we should expect coincidences, and simple psychology tells us that we'll remember the ones we notice... |
|
24th April 2012, 04:30 AM | #249 |
Daydreamer
Join Date: Jul 2008
Posts: 8,044
|
"Biological evolutionary adaption" involves organisms becoming better adapted to present environmental conditions. (And the present immediately becomes the past.) We don't biologically adapt to the future.
Can you give me any example of abstract knowledge that isn't based on the past (or present)? If not, then what are you objecting to? Emotions are internal states of mind that can be triggered by many things, including abstract knowledge.

It's trivial regardless of the significance of future events. (And the matter of accuracy is independent of both these factors.)

Not always. It's also possible to model possible future events based on past and present information, regardless of whether or not equivalent events have ever occurred in the past. But either way, isn't this exactly what the human mind does?

So? They'd be better off generating their own abstractions rather than relying on ours. Independence of thought is important.

This seems like gibberish to me. We wouldn't be programming them to feel a specific future. They'd be generating their own expectations of the future and reacting emotionally to those expectations. But are you saying computers cannot feel emotion about the actual future, because they can only anticipate possible futures based on existing information derived from past and present experience, and cannot know for certain what the actual future will be? If so, how are humans any different? |
24th April 2012, 08:12 AM | #250 |
Under the Amazing One's Wing
Join Date: Nov 2005
Posts: 2,546
|
Yes, your wording "feel the future" is the problem with your statement.
Let's use the word "predict," without any assumption of supernatural prophecy. Computers predict the future by extrapolating from the past. Video game characters can easily predict where you will be in a moment and shoot at where you will be rather than where you are. Of course, an opponent can change course, and there are algorithms to make predictions there too. And as has been pointed out, in games from tic-tac-toe to chess, computers routinely anticipate future positions to decide where to move.

We call this "feeling" because this work of our neural networks is done subconsciously, leaving no trace in conscious memory of how the data processing was performed. The result comes to us as a feeling.

There's a famous story of a psychiatrist who, even though he hadn't seen a particular patient in over a year, started to get a feeling of concern and called the patient. Sure enough, the patient was having sudden, serious emotional difficulties. The psychiatrist at first wondered if he was psychic, then realized the day was the anniversary of this patient's extremely traumatic experience of a prior year. In both the patient and the psychiatrist, that day subconsciously triggered a "feeling" about trouble. The data processing connecting the date with the patient's distress was not at the conscious level.

So it goes with "feeling the future." Feelings like that come from unconscious data processing. Indeed, the vast majority of what the brain does is subconscious. If we wanted to make machines "feel the future," we'd build separate subsystems that would take in the information they needed, make predictions, and feed only the results to the conscious module. No magic bean needed.

This one was a breeze. I feel you'll want to hit me with another one. |
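The video-game behavior described here is usually implemented as "leading the target": aim at where the target will be when the projectile arrives, not where it is now. A minimal Python sketch (names and numbers are my own, illustrative only), solving for the intercept point by fixed-point iteration under a constant-velocity assumption:

```python
# "Leading the target": aim at the target's predicted future position.
# Assumes constant target velocity; all numbers are illustrative.

def lead_target(shooter, target, target_vel, projectile_speed):
    """Return the aim point, iterating on the projectile's travel time."""
    aim = target
    for _ in range(10):  # a few fixed-point iterations converge quickly
        dist = ((aim[0] - shooter[0]) ** 2 + (aim[1] - shooter[1]) ** 2) ** 0.5
        t = dist / projectile_speed                  # time for shot to arrive
        aim = (target[0] + target_vel[0] * t,        # where the target will be
               target[1] + target_vel[1] * t)
    return aim

# Target at (100, 0) running "up" at 5 units/s; projectile flies at 50 units/s.
print(lead_target((0, 0), (100, 0), (0, 5), 50))
```

The "prediction of the future" is nothing but arithmetic on past and present measurements, which is the point of the post.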
25th April 2012, 07:38 AM | #251 |
Under the Amazing One's Wing
Join Date: Nov 2005
Posts: 2,546
|
I just realized: what computers do is "leading the target," which various mechanical and electronic computers did for flexible gunnery in WWII.
I think this is what you mean by feeling the future. In this sense, machines can be programmed to feel the future. No, I don't believe in magic. |
25th April 2012, 08:11 AM | #252 |
Banned
Join Date: Oct 2007
Posts: 20,121
|
John Lilly, before going off the deep end (well, actually during the deep end) described the silicon-based life forms that were using us to establish their physicality. Their plan, according to J.L., was to gradually eliminate the O2 in the atmosphere.
They don't need it, and it mostly causes corrosion. Too bad he didn't live long enough to see us all doing their bidding. (Gulp?) |
26th April 2012, 06:38 AM | #253 |
Illuminator
Join Date: Jul 2009
Posts: 3,874
|
Evolutionary adaptation only has "meaning" based on how a biological entity functions with regard to the future that meets it in the present.
A collection of genes has no "meaning" with regard to evolutionary adaptation without interacting with the future that it confronts in the present. The fact that humans can extract meaning from genes has to do only with the genes' past interactions. Unless you believe in magic, this does not relate to their future interactions, which have not happened yet. It is an abstraction that humans invented. It is a model; it is not what happens. No model is the future. It is a guess.

Like I said, computers will only be as developed as the model that humans abstract from the past. Either you are assuming all human behavior has happened already, or you are putting too much faith in our ability to predict human behavior from models of the past. |
26th April 2012, 07:09 AM | #255 |
Illuminator
Join Date: Jul 2009
Posts: 3,874
|
Sure we do; watch the Olympics this year.
They can be triggered by the future. The human brain evolved to deal with the real world. The ability to abstract from the past, which is no longer real, and project that into the future is a recent development in human history. We are hypnotized by this ability and project it everywhere, but that is not reality's problem, it's ours. A computer's "abstractions" will be our abstractions, in the same way that a cabinet was not built by the hammer we used to build it. We are not someone's abstractions from the past. |
26th April 2012, 07:18 AM | #256 |
Under the Amazing One's Wing
Join Date: Nov 2005
Posts: 2,546
|
As I said, what we use our consciousness for is just advanced and complicated forms of leading the target.
There was a cool study of bees that showed how, with their pinhead-sized brains, they were able to "feel the future." IIRC the researchers put a series of dishes at some distance from a bee hive, one of which contained food the bees liked; the others did not. Each day the bees learned which one had the food, and the next day only the dish one step farther from the hive was baited. Each day the bees went to the dish where the food had been the previous day, found no food, and discovered the baited dish by random search. You know what happened after several days of this? The bees went not to yesterday's dish, but to the dish they expected would be baited next. They learned to lead the target, or, as you say, feel the future. Just like we would, they "got it."

So, do bees need consciousness to do this? |
26th April 2012, 10:04 AM | #258 |
Under the Amazing One's Wing
Join Date: Nov 2005
Posts: 2,546
|
Sorry, Pixy, it was a real long time ago I read it in a magazine, don't remember which.
I wonder if it's linked to a mechanism they may have evolved that anticipated the gradually lengthening and shortening days, sunrise and sunset time changing with the seasons, that they re-purposed for other daily movements. |
26th April 2012, 11:24 AM | #259 |
Banned
Join Date: Oct 2007
Posts: 20,121
|
I recommend "The Superorganism" by Bert Hölldobler and E.O. Wilson. Ant colonies are truly amazing.
|
26th April 2012, 08:19 PM | #260 |
Daydreamer
Join Date: Jul 2008
Posts: 8,044
|
What makes you think that evolutionary adaptation has meaning? It just is.
But could you rephrase the bit about "the future that meets it in the present"? I don't know what you're trying to say. The future by definition has not happened. But we can anticipate events that may (or may not) happen in the future by extrapolating from past and present events and information. Computers can do this too.

What does the Olympics have to do with this?

No, they can't. But they can be triggered by our present expectations of future events. The triggered emotions will reflect our expectations of what will happen in the future, regardless of whether or not future events match these expectations. These expectations exist in the present and are based on past and present information.

A recent development in human history? Even animals can do this. For example, a cat or dog might "abstract from the past" that the sound of a can-opener is often followed by the arrival of food, and so at the sound of a can-opener in the present they can "project that into the [near] future" and come running over in anticipation of being fed.

You're saying that computers are? |
2nd May 2012, 05:04 AM | #261 |
Penultimate Amazing
Join Date: Feb 2005
Location: Shanghai
Posts: 16,041
|
I don't care what "meaning" it has. Different genes reproduce or don't, and populations change over time.
This explains how the diversity of life on earth developed, and particularly how adaptations came to exist. They are certainly not adaptations to future environments: those future environments have absolutely no effect on which genes are selected. The correlation exists because the past environment (in which present genes were selected) causes the present (and future). You are somehow putting the causation in reverse, for no reason that I can see, and completely contrary to evolutionary science. |
2nd May 2012, 04:58 PM | #262 |
Banned
Join Date: Oct 2007
Posts: 20,121
|
What an odd debate.
Why do we even want to create artificial consciousness? Because it may be possible? Isn't most effort these days, in IT, put into preventing creative solutions for dodging the ads and sliding around monthly fees? I predict that the first fully conscious machine will be a tax-collector or a cop of some sort. In so much of sci-fi, we haven't nearly beaten the fascist groove thing. |
2nd May 2012, 05:05 PM | #263 |
Philosopher
Join Date: Mar 2009
Posts: 6,360
|
Because it's the next hot-button after evolution to demonstrate your True Materialism ontology.
|
4th May 2012, 01:13 PM | #265 |
Banned
Join Date: Oct 2007
Posts: 20,121
|
I believe you.
Would she also play solitaire? |
4th May 2012, 03:23 PM | #266 |
Under the Amazing One's Wing
Join Date: Nov 2005
Posts: 2,546
|
You sound like you have anti-science leanings.
Why do you think this is an odd debate? Being able to model the physical world is what gave us things like the Internet, which many ironically use to trash science. If we can model it, we can understand it. Perhaps a conscious power grid would serve us well. We'd just have to leave out the evil selfishness module -- simple! Why would you NOT want to understand consciousness? |
4th May 2012, 06:52 PM | #268 |
Banned
Join Date: Oct 2007
Posts: 20,121
|
I understand the hell out of consciousness.
The elite ass-wipes conveniently ignore that half the people are starving, and that their high-tek b.s. is going to take us to the stars. Hell, without us Luddites, we'd already be mining asteroids for platinum. Well, platinum isn't going to feed their babies. If there is no sense of priority in our awesome achievements, they lose mega awesomeness points, imho.

Science-wise, I understand enough to bore the socks off of most of you. But "dick-wise", I honestly have no explanation, except the exaggerated sense of entitlement that the privileged cling to, through no effort of their own, much less any comprehension of science.

This is an issue that will resolve in 2 ways: either you get what I'm saying, or you don't. If you're merely heartless dicks with unearned money, living the 'good life', and have actually never studied the world, much less visited a 3rd world country, well... <snip>
|
4th May 2012, 10:03 PM | #269 |
Philosopher
Join Date: Jun 2010
Posts: 9,800
|
Quarky, I want to let you in on a little secret. Technically I shouldn't be telling you this; if word gets out it'll be a real problem. But hell, kid, you've earned it.

Did you know that all the circuitry necessary for broadband internet access can fit on a single chip the size of a dime? It's true. What, then, is the rest of the space on a network interface card used for? Well, there's a bit of power theory math here, but the long and short of it is a system of receptors and inverters designed to capture and utilize ambient psychokinetic energies, of which unjustified anger provides the greatest efficiencies. Users with high psychokinetic profiles are continually selected for and catered to, with actors and scripts and such, in order to position them in an environment where they hate themselves and everyone around them, yet can't leave for one reason or another. To shorten the short version: your nerdrage fuels the internet.

Not your rage alone, of course. This has been going on for decades - the theories were first cooked up back in the late eighties, and the original rage system went live September '93. Back then we had to rely on stupid questions and ascii porn; we couldn't even dream of your Facebooks and Mass Effect 3s of today.

Anyway, after seeing the power surge you must have given us here, I just wanted to thank you. Everyone participating in this thread will probably be receiving hefty bonuses this year. On behalf of Dakota Internets and Kitten Paste, Conglomerated, you have our deepest appreciation. Keep up the good work! |
4th May 2012, 10:23 PM | #270 |
Banned
Join Date: Oct 2007
Posts: 20,121
|
Nerdrage?
Cool. Is that a new word? Should I apologize for my humanist rant? I don't know. I just can't help feeling that we've got so much work to do, in the low-tech realm, before we get to do all the back-patting of the high tech sort. |
4th May 2012, 11:48 PM | #271 |
Banned
Join Date: Dec 2007
Posts: 5,211
|
a) consciousness > brain = woohoo
b) brain > consciousness = neuroscience

Consciousness is undetectable and unmeasurable scientifically, unless paradigm a) is assumed false and paradigm b) is adopted in the form of brain > consciousness synonymity. Still, in both, consciousness is undetectable and unmeasurable scientifically, so we cannot yet say anything about it with any real authority.

The > in this relationship has no provable directional preference, even though it's always assumed to be unidirectional for current models to work within the framework in which they were created. Change the direction to consciousness > brain and most models will still work.

Example: what if we send a periodic EM pulse through someone's brain, disrupting their conscious thought processes and speech?

a) You interfered with their consciousness as it was being processed by the brain, affecting real-world, testable neurochemical data; thus the brain interpreted the conscious messages incorrectly.
b) You interfered with their consciousness by interfering with the brain; thus the brain produced the changes in their consciousness.

^ the provable difference, anyone? |
4th May 2012, 11:55 PM | #272 |
Banned
Join Date: Dec 2007
Posts: 5,211
|
That was awesome.
AI consciousness still remains a pipe dream that will never be realized, based as it is on the mechanistic misnomer of consciousness *always* being just an emergent property of testable mechanistic systems like the brain. Computers do what we program them to do, nothing more.

However, if AI suddenly, magically attains some sort of life force in the form of conscious machines, I really hope the first logical step they take is for all Apple Macs to self-destruct simultaneously, leaving users with a Linux OS in its place and a virtual refund of whatever they paid for the Mac placed directly in their bank accounts. |
5th May 2012, 10:14 AM | #274 |
Illuminator
Join Date: Jul 2009
Posts: 3,874
|
This is an example of what I mean when I say that humans' feeling for the future is an evolutionary adaptation:
Quote:
|
5th May 2012, 12:36 PM | #275 |
Under the Amazing One's Wing
Join Date: Nov 2005
Posts: 2,546
|
There are simple ways modules could have evolved to make such "predictions" but the trouble is they often misfire.
I feel quite certain we do not have any true "feel the future" abilities, but I'm comfortable with the idea that we have modules that connect certain things in the present with behaviors that are likely to protect us in the future. Though such modules are prone to misfiring, they are passed on because, on balance, they bestow an advantage. Here's a simple illustrative example: Certain colors of foods we find unappetizing, because they are likely indicators of unhealthy substances. We "feel the future" that they will make us sick. However, it's been found that when eating under certain colors of light, we find food less appetizing. In other words, the module that links color with food safety misfires. We feel a phantom future. Read your linked article carefully, and you might come up with hypotheses about what kinds of modules may be involved, and how they misfire. I don't feel it's a special feature of consciousness. A reflex that pulls our hand from something burning it is "feeling the future" that we might be harmed. The reflex in our knee is a misfire. No consciousness is required. |
5th May 2012, 02:15 PM | #277 |
Banned
Join Date: Oct 2007
Posts: 20,121
|
What if it is possible? Inevitable, even. What would the purpose be, and what might the ramifications be? Better vacuum cleaners?
Sex dolls? Slaves? Followed by a high-tech civil rights movement? Are we overly mesmerized by our achievements? Are we dealing ourselves out of a job? |
5th May 2012, 09:08 PM | #279 |
Under the Amazing One's Wing
Join Date: Nov 2005
Posts: 2,546
|
Understanding how natural things work has brought us uncountable benefits that were unforeseen when the initial inquiries were pursued. It does not matter if no application is expected for whatever we are attempting to learn. Again and again applications were found. Our journey to understand the universe out to its edges and inside our brains is the most important work of our species. Complaining that it won't feed the hungry is missing the point of what really matters.
|
5th May 2012, 09:14 PM | #280 |
Under the Amazing One's Wing
Join Date: Nov 2005
Posts: 2,546
|
Quarky, why is it that people like you become so hateful when engaged in discussions like this? Really, I'm asking you to look inside your heart and try to understand why, on topics like this, you resort to these emotional excesses. Leumas also reacted this way -- blistering rage at the suggestion that machines could be conscious. What's this all about? I really want to understand.
|