The argument from reason did not convince me.
There is a clear difference between computing input information to produce a specific output and understanding it the way a human does. The difference is neither vague nor complex.
What we mean by understanding or feeling something (a concept) is the web of relations it has to other concepts. Every word has a meaning because of its context. "Rain" means what it does partly through the concept (and thereby the feeling) of water, the sound of rain, its visual appearance, and so on. If a computer program is written that can identify rain by its visual aspect alone, it has not understood rain, because it has missed the sound of it. The argument is inductive: if you augment the program with sound detection, it will still miss the sensation rain produces on skin, and our capacity to recognize it by the touch of raindrops. The crux of the argument is this: to say a computer program has "understood" rain, it must produce or simulate an impact on its state as vivid as the impact rain has on a human.

The easy and immediate objection is that this vivid impact in our brain is not precisely specified, so the definition is vague and not very useful. My response is that our inability to satisfactorily define this network of impact and context is not a weakness of the argument but a gap in our understanding of how concepts relate in our heads - a gap this argument actually makes visible.

In addition, I argue that no two human beings share the exact same mapping for a concept; what is shared between humans is an overall similarity. In simpler terms, no two human beings understand rain in exactly the same way. Therefore, if a computer program comes sufficiently close to human complexity (how many concepts are influenced by a given concept), it could be accepted as an entity that understands it.
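To make the "overall similarity rather than exact mapping" point concrete, here is a deliberately crude toy model, not a claim about real cognition: treat each agent's grasp of a concept as the set of other concepts it is linked to, and compare two agents by the overlap of those sets. All the association sets and the choice of Jaccard similarity are my own illustrative assumptions, not part of the argument itself.

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two association sets: 0.0 (disjoint) to 1.0 (identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical association sets for the concept "rain" in three agents.
human_rain = {"water", "sound", "sight", "touch", "smell", "cold"}
vision_only_program = {"water", "sight"}                        # vision-only classifier
augmented_program = {"water", "sight", "sound", "touch", "cold"}  # more senses added

# The vision-only program shares little of the human mapping; the augmented
# one shares most of it, without the two mappings ever being identical.
print(jaccard(human_rain, vision_only_program))   # 2 shared / 6 total ≈ 0.33
print(jaccard(human_rain, augmented_program))     # 5 shared / 6 total ≈ 0.83
```

On this toy picture, "understanding" is a matter of degree: no two agents need identical sets, only sufficiently high overlap, which mirrors the claim that no two humans understand rain in exactly the same way either.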