The second kind of consciousness we will look at is Monitoring Consciousness. This involves being in a conscious state that is accompanied by a thought to the effect that you are in that state. It enables you to monitor your own actions and mental states, and to take appropriate action if necessary. In effect, it is a form of internal scanning.
Reason (1979) conducted a diary study of the errors people make in everyday actions, and attributed a number of them to a lack of monitoring consciousness. For example, one participant noted how she had unwrapped a sweet, put the paper in her mouth and thrown the sweet in the bin. Another told how they were just about to get into the bath when they noticed they still had their underwear on. Reason deduced that complex sequences of actions contain some highly automatized processes, and that we need to consciously monitor the actions at key points during the sequence in order to carry them out successfully.
A lack of monitoring consciousness may also be used to explain anosognosia, in which patients are unable to recognise, and therefore deal with, their own disease or illness. Studies by Damasio (1999) have found that some patients seem to be impaired in their conscious ability to monitor the states of their own bodies, and may insist that their body is working perfectly and is not impaired, when in fact it is paralysed.
The third kind of consciousness is Self-Consciousness. This is the possession of the concept of the self and the ability to use this concept in thinking about yourself. This involves not only awareness of the self at present, but also of the self in the past and anticipated future. There is reason to think that animals and babies can have conscious states without employing any concept of the self.
There is some loss of self-consciousness in cases of amnesia, where autobiographical memory is disrupted. Likewise, epilepsy and other head injuries may impair the sense of self. Asomatognosia is a condition in which patients are unable to recognise or even feel parts of their bodies, thereby losing the sense of being an embodied self.
Baars’ account of consciousness (1988) assumes that autonomous processors send information to a global workspace, which can then broadcast that information to other processors. Consciousness corresponds to the contents of the workspace. This theory may help explain the phenomena of the three kinds of consciousness we have looked at. It does not, however, accommodate the fourth kind of consciousness, Phenomenal Consciousness, to which we now turn.
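Baars’ architecture lends itself to a simple computational sketch. The following toy Python code is only an illustration of the broadcast idea, not an implementation of any published model; the class and variable names (Processor, GlobalWorkspace, the salience values) are all invented for the example. Specialised processors compete to post content to a shared workspace, and whatever wins is broadcast back to every processor.

```python
class Processor:
    """An autonomous specialist that can propose content and receive broadcasts."""
    def __init__(self, name):
        self.name = name
        self.received = []          # broadcasts this processor has seen

    def propose(self, content, salience):
        # Package a proposal as (salience, content, source-name).
        return (salience, content, self.name)

    def receive(self, content, source):
        self.received.append((content, source))


class GlobalWorkspace:
    """Holds one item at a time and broadcasts it to all registered processors."""
    def __init__(self):
        self.processors = []
        self.contents = None        # the 'conscious' content, on Baars' reading

    def register(self, processor):
        self.processors.append(processor)

    def cycle(self, proposals):
        # The most salient proposal wins access to the workspace...
        salience, content, source = max(proposals, key=lambda p: p[0])
        self.contents = content
        # ...and is broadcast to every processor, including the source.
        for p in self.processors:
            p.receive(content, source)


vision = Processor("vision")
hearing = Processor("hearing")
workspace = GlobalWorkspace()
workspace.register(vision)
workspace.register(hearing)

workspace.cycle([
    vision.propose("red light ahead", salience=0.9),
    hearing.propose("background hum", salience=0.2),
])
print(workspace.contents)           # the winning, globally available content
```

The key feature the sketch captures is that workspace contents become globally available: the auditory processor ends up 'knowing about' the visual content, which is how the theory accounts for the integration of information across specialised systems.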
In an attempt to illustrate Phenomenal Consciousness, let us consider four kinds of introspectible states: perceptual experiences, such as tastings and seeings; bodily sensations, such as pains, tickles and itches; imaginative experiences, such as those of one’s own actions or perceptions; and streams of thought, as in the experience of thinking ‘in words’ or ‘in images’. All of these states have features that make up ‘what it is like’ (or ‘seems like’ or ‘feels like’) for one to undergo them (Nagel, 1974). We sometimes try to describe these features, for example, by saying that a given pain is ‘sharp’ or ‘throbbing’ to some degree, or that a given visual image is ‘blurry’ or ‘moving’. These specific features are called ‘phenomenal properties’ or sometimes ‘raw feels’ or ‘qualia’, more or less interchangeably.
Phenomenal character has been described as the ‘hard problem’ of consciousness, which arises because statements involving the qualitative character of experience bear no obvious logical connections to descriptions couched in physical terms. There is an explanatory gap (Levine, 1983) between functions and experience, and we need an explanatory bridge to cross it. Scientists attempting to provide an objective account of phenomenal consciousness face three main problems. Firstly, they cannot observe the inner conscious states of another person, and so have to infer what they are like. Secondly, because phenomenal consciousness is qualitative, a feeling or an experience, its nature cannot be read off from a person’s outward behaviour. Finally, because consciousness possesses an essential point-of-view or first-person character (Nagel, 1974), to try to study it objectively may be to disregard its subjectivity. We would have to understand others’ points of view, which, whilst difficult enough with humans, is nearly impossible when dealing with the points of view of animals.
In attempting to find a solution to this ‘hard problem’, researchers have differed widely. Most seem to adopt one of six strategies. The first is where researchers simply explain something else (as with McGinn, 1989), feeling that the problem of experience is too difficult for now, and perhaps even outside the domain of science altogether. They choose instead to address problems such as reportability or self-concept. The second is to take a harder line and deny the phenomenon (for example, Dennett, 1991). According to this line, once we have explained functions such as accessibility and reportability, there is no further phenomenon called ‘experience’ to explain. Both of these strategies appear to be unsatisfactory: experience is a key aspect of our mental lives that cannot simply be disregarded or avoided.
The third option is where researchers believe that they are explaining experience (for example, Flohr, 1992), but usually pass over the relevant step in their explanation quickly and inadequately.
The fourth approach tries to explain the structure of experience (for example, Hardin, 1992). In general, certain facts about structures found in processing will correspond to, and arguably explain, facts about the structure of experience. This approach is limited, however, as it tells us nothing about why there is experience in the first place.
The fifth strategy is to isolate the substrate of experience, trying to identify the sort of brain processes from which experience arises (for example, Edelman, 1989). While this strategy can shed indirect light on the problem of experience, it is nevertheless incomplete. While it may tell us which processes give rise to experience, it does not tell us why and how.
The final strategy is one pioneered by Chalmers (1995). He believes that phenomenal consciousness may be a ‘fundamental feature’ of the universe: just as objects have mass and charge, and occupy space and time, they may also have experience as a further, irreducible feature. Whilst this is an appealing approach, it still leaves the question of why physical objects such as chairs and tables do not thereby achieve what we call consciousness.
So the hard problem remains unsolved for now. There appear to be no easy answers to the questions posed by consciousness, and as Chalmers puts it, “you can’t explain conscious experience on the cheap”. Despite the difficulties, it is encouraging to see that psychologists and philosophers alike are continuing their studies on this topic. Even if some of the answers are a long way off, what is being learned and developed into theories is highly beneficial, especially in helping patients with neuropsychological problems. If the answers finally do arrive, there will be deep implications for our understanding of mind and body alike.
Bibliography
Primary Source:
Braisby, N (2002). ‘Consciousness’ in Cooper, T and Roth, I (eds.) Challenging Psychological Issues. Milton Keynes: The Open University.
Secondary Sources:
Baars, B.J. (1988). A Cognitive Theory of Consciousness. Cambridge: Cambridge University Press.
Chalmers, D.J. (1995). ‘Facing up to the problem of consciousness’ in Journal of Consciousness Studies 2:200-19.
Damasio, A. (1999). The Feeling of What Happens: Body and Emotion in the Making of Consciousness. New York: Harcourt Brace.
Dennett, D.C. (1991). Consciousness Explained. Boston: Little, Brown.
Edelman, G. (1989). The Remembered Present: A Biological Theory of Consciousness. New York: Basic Books.
Flohr, H. (1992). ‘Qualia and brain processes’ in Beckermann, A., Flohr, H. and Kim, J. (eds.) Emergence or Reduction?: Prospects for Nonreductive Physicalism. Berlin: De Gruyter.
Hardin, C.L. (1992). ‘Physiology, phenomenology, and Spinoza's true colors’ in Beckermann, A., Flohr, H. and Kim, J. (eds.) Emergence or Reduction?: Prospects for Nonreductive Physicalism. Berlin: De Gruyter.
Levine, J. (1983). ‘Materialism and qualia: the explanatory gap’ in Pacific Philosophical Quarterly 64:354-61.
McGinn, C. (1989). ‘Can we solve the mind-body problem?’ in Mind 98:349-66.
Nagel, T. (1974). ‘What is it like to be a bat?’ in Philosophical Review 83:435-50.
Reason, J. (1979). ‘Actions not as planned: The price of automatization’ in Underwood, G. and Stevens, R. (eds.) Aspects of Consciousness, Vol.1, Psychological Issues. London: Academic Press.
Tye, M. (1995). Ten Problems of Consciousness. Cambridge, MA: MIT Press.