Carleas
Right, but you have to look at the point of separation between the two systems we're comparing: Yes, a 3000 pound toad orbiting the Sun is pretty ridiculous. But it would be equally so in both materialist and non-materialist systems. Like you said, it's just a bizarre thing to propose. Likewise with a computer made of 1 trillion people holding flash cards. A very bizarre hypothetical, no doubt. But the argument doesn't revolve around "Isn't that many people holding flash cards bizarre?"; it revolves around the difference that a materialist [i]has to[/i] admit that such a situation could result in consciousness, whereas some other system may not. That's the absurdity in the argument- the very thing that materialism forces us to conclude about the hypothetical. That's why the argument works.
Computer Science as opposed to what? I don’t know that there’s some other theory that explains what computers do. But no, I completely disagree that running the Frogger code on flashcards is just as absurd as creating an authentic human consciousness with them.
Sure, I can go along with that, but just extend the instance of the program from an instant to something tangible like 3 seconds, and I think my argument is preserved- so we have flash cards, or stars, or stimulated ganglia lying in a ditch forcing a consciousness to come into existence, have your memories of last Tuesday for 3 seconds, and then annihilate. You may call ‘remembering last Tuesday’ a single brain state, but in fact it’s a composite thing that can progress from a beginning through to an end, and I don’t see why that wouldn’t be consciousness.
Do you, though? If I wanted to transfer the word document from one computer to another, I wouldn't need to transfer all that stuff, I'd only need a tiny bit of data- even if the transfer were to be unnecessarily physical, I would only be cutting out the smallest portion of the computer's hard drive and sticking it in another computer to have the same document. So yes, you certainly could take the computer's hard drive, toss it in a ditch, shock it with the right amount of electricity in the right way, and 'run' the word document, right? I'm no computer scientist, but I don't see why not. All the other stuff, graphics and memory, revolves around outside observers witnessing the program in some way.
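For what it's worth, the file-copy picture is easy to make concrete. Here's a minimal sketch (the filenames and contents are made up purely for illustration): a "document" is nothing but a string of bytes, and transferring it means copying those bytes, with no monitor, graphics engine, or word processor involved anywhere.

```python
# Create a stand-in "document" (hypothetical filename and contents,
# chosen only for this sketch).
with open("document.docx", "wb") as f:
    f.write(b"last Tuesday's memo")

# "Transferring" the document is just copying its bytes to another location.
with open("document.docx", "rb") as f:
    data = f.read()          # the document is nothing but this byte string

with open("copy.docx", "wb") as f:
    f.write(data)            # the second location now holds the same document

# Nothing here displays anything; graphics only enter the picture when an
# outside observer wants to view the file.
```

Whether an analogous "copy the bytes and stimulate them" story carries over to consciousness is, of course, exactly what's in dispute here.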
And that's a big part of the difference, that makes the consciousness thing absurd- consciousness is witnessed by default, because it's an act that its only witness performs. So, we can run a computer program without a monitor or any graphics engine with the understanding that the program is running 'invisibly'. But once we get to consciousness, there is always a witness- because that's part of what a conscious thought is. So questions like "If I stimulate disembodied memory ganglia, who is doing the remembering?" become difficult.
As long as we’re stuck with materialism, I’m pretty sure this has to be false- if you can say that my feet aren’t involved with remembering last Tuesday, you should be able to say which parts of my brain are and are not involved, as well. It can’t be that every part is used to remember everything. I mean, isn’t the whole foundation of the brain = mind argument based on the fact that if we damage particular, predictable portions of the brain, then predictable, particular mental functions become inhibited?