
Outline and Evaluate the Multi Store Model of Memory and One Alternative Model

Models of memory are theoretical, and sometimes diagrammatic, representations of how human memory works. Many models have been proposed by psychologists in an attempt to explain how information passes between the known stores of memory (such as long-term memory and short-term memory); they also take into account encoding, duration and capacity, as well as loss from memory through different mechanisms (such as displacement or decay).

One such model is that of Atkinson and Shiffrin (1968), known as the Multi Store Model of Memory. The name comes from its main assumption: that memory is made up of multiple stores, believed to be three. These are sensory memory (a store holding an exact copy of a sensory stimulus, in the same modality, for a fraction of a second), short-term memory (where information may be stored for brief periods, though longer than in sensory memory) and long-term memory (a store containing information that has been held for long periods of time). Atkinson and Shiffrin's model shows how data travels through the three stores in a sequence: from sensory memory to short-term memory and eventually to long-term memory. Data in sensory memory may only enter short-term memory if it goes through the process of selective attention. Rehearsal of the information now held in short-term memory may allow it to remain there longer than would otherwise be expected, and if repeated enough it may pass into long-term memory. The model also suggests the mechanisms of forgetting from each store: sensory memory may lose information through decay (forgetting due to the passage of time), whilst in short-term memory both decay and displacement (old information being replaced by newer information entering the store) may occur.


It was found that, as the number of digits increased, the time taken to respond to the verbal reasoning questions increased, but the delay was not significant (merely fractions of a second), and there was no increase in the number of errors made. From this it was concluded that each task used a separate substore of short-term memory: verbal reasoning required the central executive, whilst digit span required the phonological loop. As Atkinson and Shiffrin's Multi Store Model represented the stores as simplistic unitary stores, these results would be impossible according to the model; the experiment therefore contradicts their idea and suggests that it was too simplistic.

Their model was also unable to explain exceptions, such as autistic savants. Such subjects challenge the claim that repetition and rehearsal are needed for information to pass from short-term memory to long-term memory: they are able to recall precise figures without rehearsal, and show no sign of decay over time, nor of any other forgetting mechanism. Because Atkinson and Shiffrin's model is so rigid, exceptions like this cannot be explained or accounted for.

An alternative to the Multi Store Model was suggested by Craik and Lockhart (1972) in the Levels of Processing model. Their idea tackled the simplicity of Atkinson and Shiffrin's theory by emphasising the mental processes that occur in memory, rather than fixating on a rigid structure like that of the Multi Store Model, which had resulted in it being labelled 'too linear'. Craik and Lockhart rejected the idea of separate memory stores, instead suggesting that stimulus inputs go through a variety of different processing operations, each lying on a spectrum organised by 'depth'. They proposed two terms to describe this: shallow processing and deep processing.
Shallow processing merely involves recognising the shape or visual appearance of a stimulus; for example, the shape of the letters within a word.


This should reduce the amount of interference that may occur, which Atkinson and Shiffrin suggested as the mechanism for forgetting in long-term memory. This is called release from proactive interference: by making new items distinctive from old items in memory, you decrease the chance of interference from existing memory traces. Each cue that is made will match only one item, so it cannot be confused with similar data that has already been stored.

None of the models formulated by psychologists is entirely accurate; however, each has its strengths, along with evidence from investigations that can support the theories behind it. Atkinson and Shiffrin's Multi Store Model made the key assumption that there are three stores of memory, but it also showed many faults. It was criticised for being too simple: it claimed that the three stores were unitary when other research has shown evidence of substores, and it did not demonstrate how information can be accessed from long-term memory in order to encode information for short-term memory, as the diagram of the model is too linear and does not account for interaction between the memory stores. Craik and Lockhart's Levels of Processing model took a different view, proposing that there were not three memory stores but rather that the retention of information differed only by the depth at which it was processed. However, there was no independent definition of depth, so it would be difficult to classify procedures using this model. It did agree with the Multi Store Model that acoustic encoding was 'short-term' whilst semantic encoding was 'long-term', an idea supported by many studies, including those of Conrad (1964) and Baddeley (1966).
The validity of the ideas within Levels of Processing was improved when psychologists suggested that, although the retention of information was not always dependent on the depth of processing, it did appear to be dependent on the effort of processing, as shown by Tyler et al (1979).

28/11/2008

This student written piece of work is one of many that can be found in our AS and A Level Cognitive Psychology section.

