
Outline and Evaluate the Multi Store Model of Memory and One Alternative Model


Introduction

Models of memory are theoretical, and sometimes diagrammatic, representations of how the human memory works. Many models have been suggested by various psychologists in attempts to explain how information passes between the known stores of memory (such as long-term memory and short-term memory); they also take into account ideas of encoding, duration and capacity, as well as loss from memory through different mechanisms (such as displacement or decay).

One such proposed model is that of Atkinson and Shiffrin (1968), known as the Multi Store Model of Memory. The name of this model comes from its main assumption: that memory is made up of multiple stores, believed to be three. These three stores are sensory memory (a store holding an exact copy of a sensory stimulus in the same modality for a fraction of a second), short-term memory (where information may be stored for brief periods of time, but longer than in sensory memory) and long-term memory (a store containing information that has been held for long periods of time). Atkinson and Shiffrin's model shows how data travels through the three stores in a sequence, from sensory memory to short-term memory and eventually to long-term memory. Data in sensory memory may only enter short-term memory if it goes through the process of selective attention. Rehearsal of the information now held in short-term memory may permit it to remain there for longer than would otherwise be expected, and it may pass into long-term memory if repeated enough. The mechanisms of forgetting from each store are also suggested in this model: sensory memory may lose information through decay (forgetting due to the passage of time), whilst in short-term memory both decay and displacement (old information being replaced by newer information entering the memory store) may occur.

Middle

It was found that, as the number of digits increased, the time taken to respond to the verbal reasoning questions increased, but the delay was not significant (merely fractions of a second), and there was no increase in errors made in answering. From this it was concluded that each task used a separate substore of short-term memory: verbal reasoning required the central executive, whilst digit span required the phonological loop. As Atkinson and Shiffrin's Multi Store Model represented the stores as simplistic unitary stores, these results would be impossible according to the model. This experiment therefore contradicts their idea and suggests that it was too simplistic. Their model was also unable to explain exceptions, such as that of autistic savants. Such subjects challenge the theory that repetition and rehearsal must be used in order for information to pass from short-term memory to long-term memory: they are able to recall precise figures without the need for rehearsal, and show no sign of decay over time, nor any sign of other forgetting mechanisms. It is because Atkinson and Shiffrin's model is so rigid that exceptions like this cannot be explained or accounted for.

An alternative to the Multi Store Model was suggested by Craik and Lockhart (1972) in the Levels of Processing model. Their idea tackled the simplicity of Atkinson and Shiffrin's theory by emphasising the mental processes that occur in memory, rather than fixating on a rigid structure like that of the Multi Store Model, which had resulted in it being labelled 'too linear'. Craik and Lockhart rejected the idea of separate memory stores, instead suggesting that stimulus inputs go through a variety of different processing operations, each lying on a spectrum organised by 'depth'. They proposed two terms to explain this: shallow processing and deep processing. Shallow processing merely involves recognising the shape or visual appearance of the stimulus; for example, the shape of the letters within a word.

Conclusion

This should reduce the amount of interference that may occur, which Atkinson and Shiffrin suggested as the mechanism for forgetting in long-term memory. This is called release from proactive interference: by making new items distinctive from old items in memory, you decrease the chance of interference from existing memory traces. Each cue that is made will only match one item, so it cannot be confused with other, similar data that has already been stored.

None of the models formulated by psychologists are entirely accurate; however, they each have their strengths, and evidence from investigations that can support the theories behind them. Atkinson and Shiffrin's Multi Store Model of memory made the key assumption that there were three stores of memory, but it also showed many faults. It was criticised for being too simple: it claimed that the three stores were unitary when other research has shown evidence of substores, and it did not demonstrate how information can be accessed from long-term memory in order to encode information for short-term memory, as the diagram of the model is said to be too linear and does not account for interaction between the memory stores. Craik and Lockhart's Levels of Processing model took a different view of memory, proposing that there were not three memory stores, but rather that the retention of information differed only by the depth at which it was processed. However, there was no independent definition of depth, so it would be difficult to classify procedures using this model. It also agreed with the Multi Store Model that acoustic encoding was "short-term" whilst semantic encoding was "long-term", an idea that has been supported by many studies, including those of Conrad (1964) and Baddeley (1966). The validity of the ideas within Levels of Processing was improved when psychologists suggested that although the retention of information was not always dependent on the depth of processing, it did appear to be dependent on the effort of processing, as shown by Tyler et al. (1979).

