Moral and epistemic norms are real standards by which we are judged to be good or to reason rationally. The trouble is that there is no way for them to be material things.
Philosophy of Mind
Agency Arguments: Do reasoning and moral choice prove you have a nonphysical mind?
Rational agency is our capacity for reasoning. It seems genuinely different from mere mimicry, yet a reasoner and a mimic need not differ physically; so the difference seems to show we are non-physical. Moral agency is our faculty for deliberating between goods. It, too, seems physically indistinguishable from mimicry, so its existence also seems to show we are non-physical.
NOTES
Further Reading
The Intentionality Argument
Our thoughts are about things, a property we call intentionality. Material objects do not exhibit intentionality. In this video, I consider the possibility that this shows our minds are immaterial.
NOTES
The Intentionality Argument
- Thoughts have intentionality
- Material things don't have intentionality
- So, thoughts are not material things
- O1: causal view of intentionality
- R1: non-existent causes (e.g., unicorn)
- O1: these are just existent causes mixed together (e.g., unicorn = horn + horse)
- R1: the concept may be caused by existent causes, but the content of the thought isn't of those existent causes
- R2: different causes, same content
- R3: same cause, different content
- R4: reducing intentionality to an object ipso facto loses the 'aboutness' necessary for intentionality
- O2 (Dennett): intentionality has no explanatory power so we shouldn't posit it
- R1: it isn't something posited to explain, but something observed and in need of explaining
Further Reading
Qualia Arguments
Qualia arguments intend to show that the mind must be at least partially immaterial due to our qualitative experiences that can't be identified in the material brain. In this video I review three major kinds--the Bat Argument, Mary the Scientist, and the Zombie Argument--as well as objections to these arguments.
NOTES
Qualia Arguments
- Qualia are immaterial mental properties that exist
- S1: qualia like the experience of echolocation or color are not located in the brain
- S2: in a different possible world, a physical brain could exist without qualia
- Therefore, the mind is an existent immaterial entity
- O1: reducible
- S1: water & H2O
- R1: water is made up of H2O, but neurons aren't made up of qualia
- O2 (Andrew Melnyk): conceptually distinct descriptions, doesn't prove non-identity
- S1: amnesiac
- R1: the point isn't epistemological, but ontological
- O3 (Daniel Dennett): qualia are confused
- R1: a confused experience is still experience
- O4: qualia are theoretically effete in explaining behavior
- R1: qualia are the data to be explained
- O5: no one can accurately define qualia
- R1: mental ostensive definition
A further reply can be given to Objection 1: We initially conceive of H2O apart from water, but with further investigation we realize that H2O must be water, so we can no longer conceive of the two apart. The same strategy won't work for qualia, though, because 'water' is simply how H2O appears to us as opposed to how it appears under a microscope. Qualia are not how the brain appears to us, but appearance itself.
Further Reading
The Chinese Room Argument
Could computers think? Could robots have minds? The Chinese Room Argument, devised by John Searle, is a thought experiment meant to show that computers can't have minds, no matter how good technology gets. The amount of debate this thought experiment has garnered has been enormous, and it has proven to be one of the most fascinating ideas in philosophy. In this video, I explain the Chinese Room Argument and five major replies to it.
NOTES
- Definitions
- understands: Whatever it is we're referring to when, before we start doing philosophy and thinking about it, we say "X understands Y"
- X p-understands Y: "X runs a program that always produces a set of behaviors B we associate with understanding that thing Y"
- Program: a list of rules for what to do
- x r-understands y: x understands y in the full sense, which includes one or more of the following:
- Qualitative aspect: A feeling of understanding
- Conscious aspect: Awareness of understanding and how you are using it
- Intentional aspect: Content of understanding as we experience it
- x Ci-understands y: x produces the same behaviors as someone who understands y and this behavior begins with a causal connection from y to x
- x X-understands y: x has the same complexity as the brain of a person that understands y
- Strong AI
- (Computational theory of mind) Understanding is nothing more than p-understanding
- A computer can p-understand (Chinese)
- So, a computer can understand (Chinese)
- O1: The Chinese Room Argument
- If (1), then we can't p-understand without understanding
- I can p-understand (Chinese) without understanding (Chinese)
- S1: Chinese Room
- I don't understand Chinese
- In the middle of the room is:
- boxes of Chinese symbols (a data base)
- a book of instructions for manipulating the symbols (the program)
- People outside the room send in other Chinese symbols: questions in Chinese (the input)
- By following the instructions in the program, I pass out Chinese symbols that are correct answers to the questions (the output)
- I p-understand Chinese
- So, I p-understand Chinese without understanding Chinese, which is (5)
- So, ~(1)
- The only thing a computer can do is p-understand
- So, a computer can't understand
- R1: Systems Reply
- I am not the whole system here, but more like the CPU of the computer
- So, me not understanding is irrelevant
- The system as a whole understands, and that's what counts
- O1: Internalized Chinese Room Argument
- Memorize the rules, then there's only one physical system
- R1: Virtual Mind Reply
- There is a virtual mind working the program
- O1: there is only one physical system
- R2: Robot reply
- Include Ci-understanding
- O1: Internalized Chinese Room Robot
- Use digital readouts of cameras and this satisfies Ci-understanding without true understanding
- R3: Brain Simulator Reply
- Make a computer that takes natural language as input and runs a program identical to the one run by a human brain that understands Chinese
- Add X-understanding
- O1: Supergenius Internalized Chinese Room Robot
- Increase complexity of the Chinese Room program too
- O2 (Searle): the water valve brain
- R4: Other Minds
- We attribute understanding to other people because of their behavior
- Robots and aliens share the same behavior
- So, we should attribute understanding to robots and aliens
- N1: this is R-understanding
- S1: pragmatic reasons
- O1: anthropomorphizing is useful, but metaphoric
- R5: Intuition Reply
- The Chinese Room Argument is based on intuition
- Intuition is unreliable in metaphysics
- Computational Theory of Mind has explanatory power
- We should believe in things that have the most explanatory power
- So, we should trust Computational Theory of Mind over the Chinese Room Argument
- O1: framing CRA in the first person appeals to observation, not intuition
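The notion of p-understanding defined in these notes can be sketched in code: a program that does nothing but follow rules for matching incoming symbols to outgoing symbols. This is a minimal illustrative sketch, and the rule book and phrases are my own hypothetical examples, not from the video; the point is that every step is blind symbol manipulation, exactly as in the Chinese Room.

```python
# A hypothetical "book of instructions" mapping input symbols (questions in
# Chinese) to output symbols (answers in Chinese). Nothing in this table,
# or in the code that consults it, knows what any symbol means.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",        # "How are you?" -> "I'm fine, thanks."
    "天空是什么颜色？": "蓝色。",        # "What color is the sky?" -> "Blue."
}

def chinese_room(symbols: str) -> str:
    """Follow the instructions: look up the input symbols and pass out the
    listed output symbols. No step involves understanding Chinese."""
    return RULE_BOOK.get(symbols, "对不起，我不明白。")  # fallback: "Sorry, I don't understand."

# The room produces behavior we associate with understanding Chinese,
# so on the definitions above it p-understands Chinese without r-understanding it.
print(chinese_room("你好吗？"))
```

A real conversational program would need vastly more rules, but scaling up the table changes nothing about where (if anywhere) understanding resides, which is what the Systems, Virtual Mind, and Robot replies dispute.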
Further Reading
The Liberalism Objection to Functionalism
Functionalism holds that what makes psychological discourse true is the existence of something with the same functional organization as the mind, where the mind is understood as an abstract theory rather than a real thing. The Liberalism Objection claims that there are counterexamples to this idea. The most interesting counterexample is Ned Block's Chinese Brain thought experiment.
NOTES
- Important presuppositions:
- Psychological physicalism
- Psychological discourse is true for humans
- MRT: Psychological discourse is true for a set of non-human things, too
- Psychological discourse is false or meaningless for the complement of that set
- Functionalism: the same functional organization
- Definitions:
- c-mind: common sense mind
- what people typically mean by mind
- not identical to the brain
- f-mind: functionalist mind
- anything that has the same functional organization as the c-mind
- (Psychological physicalism) The c-mind is fictional
- But, the fiction has a structure/functional organization to it
- The human brain has the same functional organization as the c-mind
- So, by definition, it has an f-mind
- When we use the word mind, we should mean f-mind
- So, humans have a mind
- Arguments for (5)
- A1: it's useful
- We have certain goals as humans that we don't have as mere sacks of atoms
- O1 (Liberalism Objection): other things could realize that same functional organization that we wouldn’t say have a mind
- S1: Chinese brain
- R1: technically, it is a mind
- O1: from pragmatic definition
- The word 'mind' was defined pragmatically
- For the same word to apply, it must be useful in the same way
- But, it would not be useful to identify the Chinese brain as having a mind
- If it's not useful to claim the Chinese brain has a mind, but it has an f-mind, then having an f-mind isn't sufficient for having a mind
- O2: redefining words leads to confusion
- O2: the headache
Further Reading
Eliminativism vs. Reductivism vs. Non-reductivism
This video goes over the differences between eliminativism, reductivism, and non-reductivism.
NOTES
- eliminativism
- Psychological physicalism- in philosophy of mind, only physical things exist.
- Psychological discourse refers to non-physical things.
- So, psychological discourse is false.
- reductivism
- Psychological physicalism- in philosophy of mind, only physical things exist.
- Psychological discourse refers to specific physical things.
- So, psychological discourse is only true for things that share that physical makeup.
- But, psychological discourse is reducible to physical discourse.
- non-reductivism
- Psychological physicalism- in philosophy of mind, only physical things exist.
- Psychological discourse refers to non-specific physical things.
- So, psychological discourse is true for anything.
- Psychological discourse is not reducible to physical discourse.
Further Reading
Functionalism
Functionalism is a view in philosophy of mind that attempts to reconcile the Multiple Realizability Thesis (MRT) with psychological physicalism.
NOTES
Functionalism
- Psychological physicalism: In philosophy of mind, only physical things exist
- Psychological discourse (terms relating to the mind like "want" or "pain") refer to whatever takes an input and assigns an output
- E.g., "I want to surf" refers to whatever in me takes the input "seeing good waves" and assigns the output "go surfing"
- E.g., "I am in pain" refers to whatever in me takes the input "gets pinched" and assigns the output "winces and says ouch"
- MRT: The same functional organization can be 'realized by' multiple different physical systems
- functional organization- a complete description of all the input-output assignments done by a "mind"
- So, the same mind state can be in very different beings
- Psychological discourse covers a wide range of things in a way that physical discourse can't
- So, Non-reductivism: Psychological discourse can't be done away with in favor of physical discourse, even though the things mentioned in psychological discourse aren't real
- O1: the same psychological state can be realized by different functional organizations
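The functionalist picture above, in which a mind is a table of input-output assignments that different physical systems can realize, can be sketched as follows. The stimulus-response pairs and the two "realizer" classes are illustrative assumptions of mine, not anything from the source; what matters is that two very different implementations share one functional organization, which is MRT in miniature.

```python
# A "functional organization": a complete table of input -> output assignments.
# The entries echo the examples in the notes ("gets pinched" -> "winces and
# says ouch", etc.); a real organization would be enormously larger.
FUNCTIONAL_ORGANIZATION = {
    "gets pinched": "winces and says ouch",
    "sees good waves": "goes surfing",
}

class NeuronRealizer:
    """Realizes the organization with (pretend) carbon-based wetware."""
    def respond(self, stimulus: str) -> str:
        return FUNCTIONAL_ORGANIZATION[stimulus]

class SiliconRealizer:
    """Realizes the same organization with (pretend) silicon circuitry,
    stored and consulted differently, yet functionally identical."""
    def __init__(self):
        self._lookup = dict(FUNCTIONAL_ORGANIZATION)
    def respond(self, stimulus: str) -> str:
        return self._lookup[stimulus]

# Same functional organization, different physical realizers: on the
# functionalist definition, both have an f-mind.
for stimulus in FUNCTIONAL_ORGANIZATION:
    assert NeuronRealizer().respond(stimulus) == SiliconRealizer().respond(stimulus)
```

Objection O1 above runs the other way: the same psychological state might be realized by tables that differ in their fine-grained entries, so identity of functional organization may be too strict a condition as well as too loose.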
Further Reading
Hilary Putnam has a lot of his work on this subject collected in Philosophical Papers volume 2: Mind, Language, and Reality.
Non-reducibility: Is talk about the mind irreducible?
Multiple Realizability Argument
Reductivism is the claim that descriptions of the mind should be done away with in favor of descriptions of the brain. The Multiple Realizability Argument rejects reductivism because the same mind-state can be realized by multiple physical states. This video explores exactly what that means and how philosophers argue for it.
NOTES
- Multiple Realizability Argument
- realizable: an abstract description is 'realized' when it is made true by more concrete, ordinary objects
- Reductivism (reductive physicalism): Psychological categories can and should be replaced by physical categories
- So, there is a one-to-one correspondence between psychological categories and physical categories
- MRT: A mental state can be "realized by" or made true by more than one physical state
- So, there isn't a one-to-one correspondence
- So, reductivism is false
- Identity Theory entails reductivism, so Identity Theory is false too
- Arguing about MRT
- A1: biology
- Psychological states--minds--are an adaptive advantage
- It is likely aliens evolved on other planets
- So, it’s likely aliens have minds
- If aliens exist, then it's likely they evolved using different stuff
- So, it is likely aliens have minds like ours with different physical brains
- O1: (2) is wild
- O2: (1) is false
- Reactions are adaptive advantages, not psychological states
- A2: AI
- O1: appeal to the future
- O2: Chinese room
- A3: brain plasticity
- O1: different types of regions in the brain--this isn't possible for the mind
- A4: conceivable
- S1: Robots
- S2: gaseous creatures
- S3: brain prosthetics
- O1: not fine grained enough
- R1: implausible that the brain will match up perfectly like that