Will machines completely replace all human beings?

This is the main board for discussing philosophy - formal, informal and in between.

Re: Will machines completely replace all human beings?

Postby Arminius » Wed Dec 21, 2016 1:28 am

Only_Humean wrote:
Arminius wrote:I agree. But do you think that the capabilities of machines are overestimated at the present time?


By whom? I'm sure many capabilities are underestimated by many, and many overestimated too. What's the most important group to consider - the common understanding, the understanding of policymakers, that of technicians, that of the shadowy cabal running the world? :)

The film industry? Is it "the most important group to consider"? - Maybe, maybe not. At least it is interested in the capabilities of machines.

Technicians must be more optimistic than pessimistic, which means that they could be in danger of overestimating the capabilities of machines.

Policymakers have to talk more optimistically than pessimistically, which means that, if they are politically interested in the capabilities of machines and talk about them publicly, they are in danger of over- and/or underestimating them.

The common understanding is a matter of a majority, and majorities do what they ought to do, which means these days: they are politically correct (so cf. policymakers).

If the shadowy cabal is interested in the capabilities of machines, then it could also be in danger of over- and/or underestimating them.

The answer to your question depends on the interest in the capabilities of machines, combined with the everlasting interest in keeping any majority from knowing what really happens. If the shadowy cabal and the policymakers are interested in the capabilities of machines, then the majority with its common understanding is interested in them too. But the shadowy cabal and the policymakers are always interested in keeping any majority from knowing what really happens, so the majority with its common understanding does not know what really happens. I think the political interest in the capabilities of machines is high, but it is not politically correct to talk about that theme so openly that the common understanding becomes capable of estimating the capabilities of machines correctly. There is always an interest in keeping any majority from knowing what really happens. This may lead to the following answer: currently, the capabilities of machines are both over- and underestimated, namely overestimated by some and underestimated by many people.
________________

Maybe this thread can illustrate that answer too (provided that ILP represents the world [ :lol: ]): this thread now has 115050 views and 1975 replies, so it seems to be an important thread. But if I compare the number of those who have posted in this thread with the number of those who have not, then I have to say that the number of ILP members who are really interested in the topic of this thread is relatively small. The majority is not interested in it. This majority is probably in danger of over- and/or underestimating the capabilities of machines.
Arminius
ILP Legend
 
Posts: 5732
Joined: Sat Mar 08, 2014 10:51 pm
Location: Saltus Teutoburgiensis

Re: Will machines completely replace all human beings?

Postby Tortis » Wed Dec 21, 2016 9:52 am

Maniacal Mongoose wrote:
Mildly deranged ranting?

Timewaster?

Whose sock puppet are you? You could be missing your true identity.


I'm afraid you have to face up to the sad reality that there may be more of us who are thinking this way.
Tortis
 
Posts: 60
Joined: Thu May 30, 2013 3:57 pm

Re: Will machines completely replace all human beings?

Postby Only_Humean » Wed Dec 21, 2016 10:58 am

Tortis wrote:
Only_Humean wrote:It sounds like you're confusing the signified and the referent, there. A brown horse may be just a concept in my mind, but it can also be a horse.


But the position is significantly different with an algorithm, which can only be a concept: there isn't anything "out there in the world" you can point to and say "there is the algorithm". You could point to a representation of an algorithm, but not the algorithm itself. Whereas with a horse, we can have the concept of a horse, a representation of a horse, and also the horse itself.


We can point to things that behave in ways that can be described algorithmically. It's a linguistic tool, not a reification. So...

Well, my suggestion is that an "algorithm" is not an identifiable, tangible process in the brain, but rather an idea about or a description of the processes that go on in the brain. "Algorithm" has a similar ontological status to "equation". Would you find it plausible that consciousness might be caused by equations?


I find it plausible that it is caused by processes that can be described algorithmically. Or put into equations. Don't you?

Galen Strawson, http://www.nytimes.com/2016/05/16/opini ... atter.html wrote:I find this odd because we know exactly what consciousness is — where by “consciousness” I mean what most people mean in this debate: experience of any kind whatever. It’s the most familiar thing there is, whether it’s experience of emotion, pain, understanding what someone is saying, seeing, hearing, touching, tasting or feeling. It is in fact the only thing in the universe whose ultimate intrinsic nature we can claim to know. It is utterly unmysterious.


We know exactly what our own consciousness is. We extrapolate that to other beings based on their similarity to us - I have a better idea of my brother's experience of something than that of a Kalahari bushman's, and better that than an orang-utan's; better that than a crab's; better that than a flatworm's. I have to say that the consciousness of a crab is a bit of a mystery, but I assume they feel pain because they respond in ways that indicate it, and I assume an oak tree doesn't because it doesn't (although it responds physically to damage and heals).

Consciousness doesn't necessarily mean human consciousness. I'm perfectly willing to grant that a computer won't know what it is to be human - or vice versa.

Not when we know that it is acting that way because we set it up to look like it is conscious, which is the case with computers now. I genuinely find it a little scary that you can think like that. It's like you've lost sight of what it means to be a conscious entity.


I'm not talking about computers now; we're at the level of modelling basic invertebrates with a few dozen neurons. I'm trying to get to the root of your argument that "consciousness is consciousness, and computers just aren't and can never be that."

If consciousness isn't the result of neural activity, what is it? It's certainly easy to drastically modify consciousness by modifying neural activity, and to end it by ending that. It seems a reasonable proposition.

And if it is, why is organically-mediated information processing somehow different from electronically-mediated?

Computers are getting closer and closer to passing the Turing Test. Are you starting to think they should be ascribed some minimal rights now? Or is that going to happen all of a sudden on the day a computer fools the Turing Test Committee?


I think that would depend on how and why a computer manages to do so. If it's by ELIZA-like language manipulation, then no.

Ah yes: but that knowledge is backed up by the lived experience of others whose reports you are able to place in the context of your own experience. If their lived experience is too far removed from your own, you won't be able to understand their reports. So for example if you are red/green colourblind you will be able to understand that red and green are colours without being able to understand how to distinguish between them. If you were completely blind you wouldn't be able to understand colour properly at all.


As Wittgenstein said, "if a lion could speak, we could not understand him." I'm still willing to grant lions the benefit of the doubt and not torture them for fun, though.

The biology of purpose keeps my nose above the surface.
- Brian Eno
Only_Humean
ILP Legend
 
Posts: 6198
Joined: Mon Jun 22, 2009 10:53 am
Location: Right here

Re: Will machines completely replace all human beings?

Postby James S Saint » Wed Dec 21, 2016 11:55 am

An algorithm is merely a specified process. The entire universe is nothing other than processes. Describe any one of them and you have specified an algorithm that physically exists (more scripturally known as a "spirit"). Regardless of whether Man understands the processes of the mind, because they physically exist, they are necessarily algorithms.
Clarify, Verify, Instill, and Reinforce the Perception of Hopes and Threats unto Anentropic Harmony :)
Else
From THIS age of sleep, Homo-sapien shall never awake.

The Wise gather together to help one another in EVERY aspect of living.

You are always more insecure than you think, just not by what you think.
The only absolute certainty is formed by the absolute lack of alternatives.
It is not merely "do what works", but "to accomplish what purpose in what time frame at what cost".
As long as the authority is secretive, the population will be subjugated.

Amid the lack of certainty, put faith in the wiser to believe.
Devil's Motto: Make it look good, safe, innocent, and wise.. until it is too late to choose otherwise.

The Real God ≡ The reason/cause for the Universe being what it is = "The situation cannot be what it is and also remain as it is".
.
James S Saint
ILP Legend
 
Posts: 25976
Joined: Sun Apr 18, 2010 8:05 pm

Re: Will machines completely replace all human beings?

Postby Ecmandu » Wed Dec 21, 2016 2:07 pm

I've been studying compositional signatures, consciousness signatures and behavioral signatures for... Hmm... 23 years now. Apply a frequency to the signature and you get output.

We have a superposition process that really is the answer to Turing, if you can't superimpose, it's not conscious. Otherwise known as possessions.
Ecmandu
ILP Legend
 
Posts: 11098
Joined: Thu Dec 11, 2014 1:22 am

Re: Will machines completely replace all human beings?

Postby Tortis » Wed Dec 21, 2016 2:55 pm

Only_Humean wrote:I find it plausible that it is caused by processes that can be described algorithmically. Or put into equations. Don't you?


Yes, but the cause then is the processes, not the description of them.

Actually I've changed my mind: the relevant processes in the brain can't be described algorithmically or put into equations, because of our limited understanding of matter.

Further thoughts a couple of hours later: aspects of the relevant processes in the brain can be described algorithmically, but not the entire process. But the important point is that this description is not the cause of consciousness.

I have to say that the consciousness of a crab is a bit of a mystery, but I assume they feel pain because they respond in ways that indicate it, and I assume an oak tree doesn't because it doesn't (although it responds physically to damage and heals).


We can respond to injury without feeling pain (as when we pull away from a hot surface), so I don't assume that crabs can feel pain.

Consciousness doesn't necessarily mean human consciousness. I'm perfectly willing to grant that a computer won't know what it is to be human - or vice versa.


I'm not willing to grant that a computer will know anything or feel anything.

I'm not talking about computers now; we're at the level of modelling basic invertebrates with a few dozen neurons. I'm trying to get to the root of your argument that "consciousness is consciousness, and computers just aren't and can never be that."

If consciousness isn't the result of neural activity, what is it? It's certainly easy to drastically modify consciousness by modifying neural activity, and to end it by ending that. It seems a reasonable proposition.


I believe that consciousness is the result of neural activity, but what is going on in computers has nothing to do with that kind of activity.

When you say you are not talking about computers when modelling basic invertebrates, what are you talking about?

And if it is, why is organically-mediated information processing somehow different from electronically-mediated?


"Information processing" has a similar ontological status to "algorithm" and "equation". If we look at a description of our visual system for example, we are likely to see sentences like "the optic nerve carries information from the retina to the brain". But what the optic nerve actually carries is electrochemical impulses. The entire process can be explained without making reference to "information".

Computers are getting closer and closer to passing the Turing Test. Are you starting to think they should be ascribed some minimal rights now? Or is that going to happen all of a sudden on the day a computer fools the Turing Test Committee?


I think that would depend on how and why a computer manages to do so. If it's by ELIZA-like language manipulation, then no.


So what kind of process would qualify?

Re: Will machines completely replace all human beings?

Postby nano-bug » Wed Dec 21, 2016 6:30 pm

It doesn't matter if a computer can actually know or feel anything. That's not what the Turing test is about. The test is about simulation of thought. Is it convincing? Can it fool you well enough?

So you might say, it can fool me, but it still can't actually think, so it doesn't really count. Who's counting?

When you think of the simulation of a machine, it's based on human interaction. If you really want to bend to this logic, think about your interaction with other human beings. When they say something to you, you hear the words in your head, you see their lips move as an image in your head. You then tell yourself, they are displaying thoughts behind those actions, they must be thinking, just like me! But all that is simulation.

There is very little beyond that which allows you to tap into their brain, know their exact thoughts like you are riding the waves of your own thought process. You can't even prove to yourself that you know they have a brain that thinks. Even if you open up their skull and look. It's a good guess about a series of associated implications. But it's still a guess.

Just like when a person walks around the corner we assume they still exist. It's an assumption we don't let go of. That others exist and have thoughts. Probably because the alternative is a scary or lonely one. Either way, we don't know the thoughts of others, it remains a surface assessment. Computers are no different when they beat the Turing test.

Yes, computers will replace . . . our jobs. But new jobs will replace the old ones, especially when they start teaching kids to code in the first grade.

When I say we will become more like computers, start with the idea that computers are already extensions of ourselves. Our brains compute. I move on to emotions. Since feelings lead to craving which leads to suffering, the most human of problems. Since computers solve problems, ending suffering is high on the list. Before that ultimate cure arrives, people will become more logical than emotional. Or more emotionally intelligent. I call it numb as normal. Chrome Novocain. Rusted parts replaced, for a 3D printed heart.

We won't be replaced. We'll evolve into bionic men. Our thoughts will be assisted by the thoughts* of a machine. It's called symbiosis. It occurs in nature.
Highly adaptable. Yes. Wait! What? Yes. He, herself, is a head fuck. Well, will you look at this little train of thought?
nano-bug
Philosopher
 
Posts: 3174
Joined: Fri Jul 21, 2006 5:35 pm
Location: The Virtuplex

Re: Will machines completely replace all human beings?

Postby Tortis » Wed Dec 21, 2016 6:41 pm

Ecmandu wrote:I've been studying compositional signatures, consciousness signatures and behavioral signatures for... Hmm... 23 years now. Apply a frequency to the signature and you get output.


This reads to me like nonsense, but maybe I am just ignorant. Can you explain what you mean in a little more detail?

Re: Will machines completely replace all human beings?

Postby Ecmandu » Wed Dec 21, 2016 8:30 pm

Tortis wrote:
Ecmandu wrote:I've been studying compositional signatures, consciousness signatures and behavioral signatures for... Hmm... 23 years now. Apply a frequency to the signature and you get output.


This reads to me like nonsense, but maybe I am just ignorant. Can you explain what you mean in a little more detail?


Sure, every being has a consciousness signature in the same sense that beings have unique skin creases.

These signatures come in clusters, as everyone thinks multiple things at the same time...

What makes them easy to isolate is the vast amount of unique data that flows through...

You think that would make it harder, but it actually makes it easier to isolate unique signatures.

Having a signature in itself is not enough.

You have to send a charge through the signature to activate the consciousness itself.

This is how you develop mind reading software.

I actually don't care if you think it's crap.

I was just answering your question.

The Turing test is resolved with superimposition processes. If you have the process down, when it refuses to map, you have a philosophic zombie, or just a behavioral signature.

Re: Will machines completely replace all human beings?

Postby Tortis » Wed Dec 21, 2016 11:12 pm

nano-bug wrote:It doesn't matter if a computer can actually know or feel anything. That's not what the Turing test is about. The test is about simulation of thought. Is it convincing? Can it fool you good enough?


That's nearly right: Turing said the question he wanted to answer was "can machines think?". Not "can machines simulate thought?".

So you might say, it can fool me, but it still can't actually think, so it doesn't really count. Who's counting?


Well Turing was, apparently.

There is very little beyond that which allows you to tap into their brain, know their exact thoughts like you are riding the waves of your own thought process. You can't even prove to yourself that you know they have a brain that thinks. Even if you open up their skull and look. It's a good guess about a series of associated implications. But it's still a guess.


I don't think this "Problem of Other Minds" is a serious problem. It's more of a conundrum, like Zeno's Arrow Paradox:

Zeno wrote:If everything when it occupies an equal space is at rest, and if that which is in locomotion is always occupying such a space at any moment, the flying arrow is therefore motionless.


There isn't any real room for doubt that things move, and there isn't any real room for doubt that other people are conscious.

Just like when a person walks around the corner we assume they still exist. It's an assumption we don't let go of. That others exist and have thoughts. Probably because the alternative is a scary or lonely one.


Do you not think it's because the alternative is just a bit too silly? It's science fiction again. Don't get me wrong, I like science fiction, but it is fiction, not philosophy (and not science either).

Either way, we don't know the thoughts of others, it remains a surface assessment. Computers are not different when they beat the Turing test.


I don't think it matters that we don't (directly) know the thoughts of others, but in any case we do know about our own thoughts. We know the kind of thing they are and where they come from. They arise from our lived experience, what we feel, see, hear. When we program a computer to simulate human behaviour, we know that it is behaving like that because of the program, not because it is having thoughts like those we have. That is really quite a ridiculous suggestion, although surprisingly widely accepted.

Re: Will machines completely replace all human beings?

Postby Tortis » Wed Dec 21, 2016 11:17 pm

Ecmandu wrote:
I actually don't care if you think it's crap.



Do you care if it is crap?

Re: Will machines completely replace all human beings?

Postby Ecmandu » Wed Dec 21, 2016 11:21 pm

Tortis wrote:
Ecmandu wrote:
I actually don't care if you think it's crap.



Do you care if it is crap?


I gave you the answer to your question.

I've done it before and travelled back in time.

None of you are smart enough to do it, so I'll explain outlines.

I've destroyed this whole world before

Re: Will machines completely replace all human beings?

Postby Tortis » Wed Dec 21, 2016 11:30 pm

Ecmandu wrote:

Do you care if it is crap?


I gave you the answer to your question.


Yeah I know but it was crap.

Re: Will machines completely replace all human beings?

Postby Ecmandu » Wed Dec 21, 2016 11:40 pm

Tortis wrote:
Ecmandu wrote:

Do you care if it is crap?


I gave you the answer to your question.


Yeah I know but it was crap.


Not my issue man.

You really have no clue WHAT you are talking to right now.

You'll understand someday for sure.

I literally had to reconstruct this entire world to resurrect in it.

I got a second chance.

Re: Will machines completely replace all human beings?

Postby Tortis » Thu Dec 22, 2016 12:12 am

Ecmandu wrote:
Not my issue man.

You really have no clue WHAT you are talking to right now.

You'll understand someday for sure.

I literally had to reconstruct this entire world to resurrect in it.

I got a second chance.


That sounds great, I'll be sure to call on you if I need any building work done.

Re: Will machines completely replace all human beings?

Postby Ecmandu » Thu Dec 22, 2016 12:16 am

Tortis wrote:
Ecmandu wrote:
Not my issue man.

You really have no clue WHAT you are talking to right now.

You'll understand someday for sure.

I literally had to reconstruct this entire world to resurrect in it.

I got a second chance.


That sounds great, I'll be sure to call on you if I need any building work done.


:) that's the smartest thing you've said here.

Actually, we all created this together, but I digress

I messed up big time... So bad, that we actually remade it...

I'm the Tesla of consciousness ...

Turns out, if you fuck with it, it fucks everything up...

And I used to think I was so fucking smart!

Re: Will machines completely replace all human beings?

Postby Ecmandu » Thu Dec 22, 2016 3:10 am

My favorite things actually...

Talking about the weather

Making jokes with cross dressers and trannies

Being the smartest person is not everything...

Being the best you can be... Is

Re: Will machines completely replace all human beings?

Postby Only_Humean » Thu Dec 22, 2016 12:03 pm

Tortis wrote:
Only_Humean wrote:I find it plausible that it is caused by processes that can be described algorithmically. Or put into equations. Don't you?


Yes, but the cause then is the processes, not the description of them.

Actually I've changed my mind: the relevant processes in the brain can't be described algorithmically or put into equations, because of our limited understanding of matter.

Further thoughts a couple of hours later: aspects of the relevant processes in the brain can be described algorithmically, but not the entire process. But the important point is that this description is not the cause of consciousness.


That's not relevant to my point at all, though; I just said it's the process and not the description. I don't see how you're not still confusing description and thing. It's not the description of bits and bytes that causes these words to appear on your screen, it's the process of charge moving through your laptop (/pc/tablet/phone).

I have to say that the consciousness of a crab is a bit of a mystery, but I assume they feel pain because they respond in ways that indicate it, and I assume an oak tree doesn't because it doesn't (although it responds physically to damage and heals).


We can respond to injury without feeling pain (as when we pull away from a hot surface), so I don't assume that crabs can feel pain.


Where does consciousness begin? Vertebrates? Mammals? Primates?

I'm not talking about computers now; we're at the level of modelling basic invertebrates with a few dozen neurons. I'm trying to get to the root of your argument that "consciousness is consciousness, and computers just aren't and can never be that."

If consciousness isn't the result of neural activity, what is it? It's certainly easy to drastically modify consciousness by modifying neural activity, and to end it by ending that. It seems a reasonable proposition.


I believe that consciousness is the result of neural activity, but what is going on in computers has nothing to do with that kind of activity.

How do they differ? (I'm not saying they don't, I'm curious)

When you say you are not talking about computers when modelling basic invertebrates, what are you talking about?


I meant: I'm not talking about modern-day computers being conscious, as even if consciousness is an emergent property of neural network activity, the limit of neural network modelling is still at a very basic level.


And if it is, why is organically-mediated information processing somehow different from electronically-mediated?


"Information processing" has a similar ontological status to "algorithm" and "equation". If we look at a description of our visual system for example, we are likely to see sentences like "the optic nerve carries information from the retina to the brain". But what the optic nerve actually carries is electrochemical impulses. The entire process can be explained without making reference to "information".


Why are dynamic electrochemical processes fundamentally different from dynamic electronic ones? Is consciousness to be found in carbon not silicon, or ions not electrons?

Computers are getting closer and closer to passing the Turing Test. Are you starting to think they should be ascribed some minimal rights now? Or is that going to happen all of a sudden on the day a computer fools the Turing Test Committee?


I think that would depend on how and why a computer manages to do so. If it's by ELIZA-like language manipulation, then no.


So what kind of process would qualify?


At the very least, some kind of conceptualisation rather than manipulating linguistic tokens.
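To make concrete what "manipulating linguistic tokens" means here: ELIZA worked by surface pattern matching and pronoun reflection, with no representation of meaning anywhere. A minimal sketch in that spirit (the patterns and canned responses are invented for illustration, not Weizenbaum's actual script):

```python
import re

# First/second-person swaps applied word by word; pure token substitution.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

# (pattern, response template) pairs, tried in order; the last is a catch-all.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"(.*)", re.I), "Please tell me more."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person tokens, word by word."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(sentence: str) -> str:
    """Return the first matching rule's template, filled with reflected captures."""
    for pattern, template in RULES:
        m = pattern.match(sentence.strip().rstrip("."))
        if m:
            return template.format(*[reflect(g) for g in m.groups()])
    return "Please tell me more."
```

Nothing in this program "conceptualises" sadness or jobs; `respond("I feel trapped by my job.")` yields "Why do you feel trapped by your job?" purely by regex capture and pronoun swapping, which is exactly the kind of process that shouldn't qualify.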

Re: Will machines completely replace all human beings?

Postby Tortis » Fri Dec 23, 2016 1:11 am

Only_Humean wrote:That's not relevant to my point at all, though; I just said it's the process and not the description. I don't see how you're not still confusing description and thing. It's not the description of bits and bytes that causes these words to appear on your screen, it's the process of charge moving through your laptop (/pc/tablet/phone).


I'm not sure we are understanding each other correctly. My position is, it's the electrical circuitry that causes the words to appear on the screen, not an algorithm, and it's the neuronal activity that causes consciousness and allows us to read and write the words, not an algorithm.

Where does consciousness begin? Vertebrates? Mammals? Primates?


We don't know precisely. You could ask a similar question about the developing human foetus. When does it first have experiences, and what kind of experiences are they? I would speculate that touch arrives first; it seems somehow more primitive, and the surface of the body is there before the eyes. Maybe feeling your tongue in your mouth, your fingers rubbing together?

I think it's quite possible that consciousness appears very early in evolution. Maybe worms can feel and could feel millions of years ago?

I believe that consciousness is the result of neural activity, but what is going on in computers has nothing to do with that kind of activity.


How do they differ? (I'm not saying they don't, I'm curious)


A computer running the same program can be made from different things, right? Vacuum tubes or transistors on silicon chips? And you can use different media, magnetic coatings or a laser reading bumps and hollows on a disk.

Consciousness arises from (or is) highly specific processes. It seems very likely that these developed from the processes that allow unconscious detection and response. Volvox is a green alga, a plant, which evolved 200 million years ago. It forms spherical colonies. The individual cells have eyespots and whip-like tails, which allow the colony to swim towards the light. As well as photoreceptor proteins, these eyespots contain a large number of complex signalling proteins. Light arriving at the eyespot sets off a photoelectric signal transduction process that ultimately triggers changes in the way the tail moves and causes the movement towards the light source.

Evolution has had 200 million years to build on what was already a complex system, and all that time the system has been becoming more specific.

Is it really credible that the same thing could now be achieved by vacuum tubes, or transistors, reading magnetic coatings, or microscopic bumps and hollows, or punched paper cards, or any number of other materials and technologies you could use? This just seems unscientific to me, irrational.

I'm not talking about modern-day computers being conscious, as even if consciousness is an emergent property of neural network activity, the limit of neural network modelling is still at a very basic level.


It's a great sales gimmick, but really this term "neural" is a bit of a con when applied to computers.

Why are dynamic electrochemical processes fundamentally different from dynamic electronic ones?


Because they can't produce the same effects. This is pretty obvious really!

Is consciousness to be found in carbon not silicon, or ions not electrons?


If we can discover the precise causal mechanisms we may be able to produce consciousness in other media. Would we really want to? I think what we want is precisely the opposite. There are already more than enough conscious beings. What we want is unconscious computers.

At the very least, some kind of conceptualisation rather than manipulating linguistic tokens.


Thanks Only, it's been an enjoyable discussion so far. Maybe we can come back to the Turing Test issue later?

Re: Will machines completely replace all human beings?

Postby James S Saint » Fri Dec 23, 2016 3:44 am

If one cannot settle on what the word "consciousness" refers to, I don't see how one can intelligently discuss from where it arises, what it takes to create it, or within what it might reside.

Re: Will machines completely replace all human beings?

Postby Tortis » Fri Dec 23, 2016 9:49 am

James S Saint wrote:If one cannot settle upon what the word "consciousness" refers to, I don't see how one can intelligently discuss from where it arises, what it takes to create it, or within what it might reside.


Galen Strawson wrote:Every day, it seems, some verifiably intelligent person tells us that we don’t know what consciousness is. The nature of consciousness, they say, is an awesome mystery. It’s the ultimate hard problem. The current Wikipedia entry is typical: Consciousness “is the most mysterious aspect of our lives”; philosophers “have struggled to comprehend the nature of consciousness.”

I find this odd because we know exactly what consciousness is — where by “consciousness” I mean what most people mean in this debate: experience of any kind whatever. It’s the most familiar thing there is, whether it’s experience of emotion, pain, understanding what someone is saying, seeing, hearing, touching, tasting or feeling. It is in fact the only thing in the universe whose ultimate intrinsic nature we can claim to know. It is utterly unmysterious.

Re: Will machines completely replace all human beings?

Postby James S Saint » Fri Dec 23, 2016 10:50 am

Tortis wrote:
Galen Strawson wrote:Every day, it seems, some verifiably intelligent person tells us that we don’t know what consciousness is. The nature of consciousness, they say, is an awesome mystery. It’s the ultimate hard problem. The current Wikipedia entry is typical: Consciousness “is the most mysterious aspect of our lives”; philosophers “have struggled to comprehend the nature of consciousness.”

I find this odd because we know exactly what consciousness is — where by “consciousness” I mean what most people mean in this debate: experience of any kind whatever. It’s the most familiar thing there is, whether it’s experience of emotion, pain, understanding what someone is saying, seeing, hearing, touching, tasting or feeling. It is in fact the only thing in the universe whose ultimate intrinsic nature we can claim to know. It is utterly unmysterious.

Merely substituting "experiencing" for "being conscious" doesn't really help much. A tin can can experience getting kicked, yet is hardly conscious of it. "Being aware" is a better substitute, but it still explains too little. What is required of something before we can say that it is aware, experiencing, or conscious? That is the mystery to which they were referring.

I define consciousness as the process of remote recognition. If anything can detect, locate, and identify something else, then it is aware of, conscious of, and experiencing the presence of that something. To be more obviously conscious, it must be in the process of detecting many things, perhaps objects in a room, facial expressions, or movements, and detecting them as separate from itself (hence "remote"). The act of being conscious of a terrain is the act of continuously maintaining a map of that terrain, from which decisions concerning further actions can be made.

Anything or anyone maintaining that activity is conscious of a specific terrain, yet perhaps unaware or unconscious of other terrains.
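Taken operationally, this definition amounts to a loop: detect items, locate and identify them as distinct from the observer, and keep an updated map on which decisions can be based. A toy sketch of that idea follows; the function, the object names, and the coordinates are all invented for illustration and are not any real system:

```python
def update_map(terrain_map, detections, self_id):
    """Toy 'remote recognition': record each detected object's identity
    and location, ignoring the observer itself (hence 'remote')."""
    for obj_id, location in detections:
        if obj_id != self_id:           # must be separate from itself
            terrain_map[obj_id] = location
    return terrain_map

# Illustrative detections as (identity, location) pairs.
m = update_map({}, [("chair", (1, 2)), ("me", (0, 0)), ("door", (5, 3))], "me")
print(m)  # {'chair': (1, 2), 'door': (5, 3)}
```

Whether this minimal bookkeeping deserves the word "conscious" is, of course, exactly what the rest of the thread disputes.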

But the problem with defining precisely what consciousness is resembles the problem of defining what a god is: people want to argue, and no authority is willing to step up and declare precisely what the word is to mean. The mystery is willingly maintained. Thus in any detailed discussion the participants must settle between themselves what the word is going to mean for the duration of their discussion. Without such agreement on the meanings of the words, vague inferences lead merely to endless bantering.

Re: Will machines completely replace all human beings?

Postby fuse » Fri Dec 23, 2016 11:32 am

Tortis wrote:Computers will never have the moral or legal status of humans because they can't feel, see, hear, think, understand. They aren't conscious. And they haven't become any more conscious as the result of the programming described in the article.

Strawson wrote:We examine the brain in ever greater detail, using increasingly powerful techniques like fMRI, and we observe extraordinarily complex neuroelectrochemical goings-on, but we can’t even begin to understand how these goings-on can be (or give rise to) conscious experiences.

If you agree with Strawson that we can't even begin to understand the physical goings-on that give rise to consciousness, then how can you know that artificial consciousness will never be possible?

If you do have reason to make such a strong claim, then you must have an idea about why the biochemical processes of consciousness could never be reproduced in another format.
User avatar
fuse
Philosopher
 
Posts: 4598
Joined: Thu Jul 20, 2006 5:13 pm

Re: Will machines completely replace all human beings?

Postby Tortis » Fri Dec 23, 2016 11:39 am

James S Saint wrote:I define consciousness as the process of remote recognition. If anything can detect, locate, and identify something else then it is aware of, conscious of, and experiencing the presence of that something.


Ok, but then you aren't talking about what other people are talking about when they talk about consciousness.

So, you go to Google or MIT or whatever and say "look, I made a conscious machine, it can detect, locate and identify something else". They aren't going to be much impressed, because that isn't what people generally mean by "conscious".

Re: Will machines completely replace all human beings?

Postby Tortis » Fri Dec 23, 2016 11:41 am

fuse wrote:If you do have reason be be able to make such a strong claim, then you must have an idea about why the biochemical processes of consciousness are not possibly repeatable in another format, ever.


I said in my last post but one: if we can discover the precise causal mechanisms we may be able to produce consciousness in other media.

And I tried to explain why that other medium (or format as you put it) won't be computation.
Last edited by Tortis on Fri Dec 23, 2016 12:38 pm, edited 1 time in total.
