An analysis and evaluation of practical investigative assessment
Sreeja Bhaskaran
Introduction
What you test is what you get. The quest for accountability through the tool of assessment inevitably results in a tailoring of teaching to the required assessment. The pressure is on to teach the skills that can be counted and reported, but just how useful are these skills? Assessment in school science, as in other subjects, can take many forms, from oral questioning to the marking of homework exercises or short written tests of knowledge. Most often we think of assessment in the context of public examinations such as the GCSE and the A-level. Much of the media attention and political commentary places enormous faith in the ability of examination systems to give absolute answers. Indeed, educationalists have identified government policy as an "empty commitment" (Black and Wiliam 1998, p. 5). Whilst the TGAT have publicly emphasised the importance of formative assessment, the reality is that resources and political policy are clearly focused on the external testing of teachers and schools. Undoubtedly the National Curriculum has increased the amount of testing. In the UK, students are compulsorily examined at four points during their school career, at ages 7, 11, 14 and 16. Summative assessment does have a role, in so much as it can provide information for parents and employers indicating the overall achievement of pupils over a period of time. However, it does not help the teacher identify a pupil's areas of difficulty early enough to correct them. An end-of-topic test leaves no room to help the child who has failed to learn better.
If the majority of assessment is seen to consist of testing low-level skills and fact-based knowledge, education is diminished to training students to perform certain prescribed behaviours: emphasis falls on outcomes rather than processes, learning becomes passive, and trivial, observable, short-term behaviours are elevated over the enriching, higher-order, creative, open-ended, lifelong aspects of education.
Section 1 Assessment in school
The translation of the importance of assessment within the National Curriculum into achievable policies in schools can be very different. I have taught in schools where there is a strong whole-school policy on assessment (appendix 1), with a great deal of in-service training and departmental emphasis given to improving formative assessment. My lasting memories were of the quality of the teaching and learning experience that I was fortunate enough to participate in. However, a summative impression of this particular school would have been very poor in relation to its position in league tables. The school that I am currently teaching in has a less coordinated approach to assessment. Having been criticised recently by OFSTED for the quality of teaching and assessment in general, its response was to monitor standards more rigorously through more testing of pupils, seeing this as a way of achieving performance improvement targets. Teaching in an environment where assessment is so closely coupled with testing generates fearful learning and limits the quality of teaching and learning.
Assessment of science at Key Stage 3 of the National Curriculum is principally by externally set written tests towards the end of year 9. This test covers attainment targets 2, 3 and 4. Scientific investigation, or enquiry, is the first attainment target of National Curriculum science, in which class teachers are required to assess their pupils' performance. Investigations, or explorations as they are sometimes called, incorporate a new breed of practical activity. Investigative work encompasses a whole range of activities that are centred on helping pupils to learn. It is about encouraging pupils to ask questions and carry out investigations to answer their questions. The cognitive and practical processes of formulating hypotheses, controlling and manipulating variables, making appropriate measurements, interpreting data and applying knowledge of scientific concepts are all built up through investigative work. The fact that investigations give pupils an opportunity to test their own understanding of scientific phenomena is very important. Extensive evidence has been gathered which shows that pupils come into science lessons with a whole range of ideas about why things happen, many of which do not fit the accepted scientific explanations (Osborne and Freyberg 1985, Driver et al 1994). The process of changing from an original idea to a scientific one is difficult, and many pupils will pass through school retaining their original ideas. They may even give correct explanations for examination purposes whilst still retaining considerable doubt over the scientific reasoning. For meaningful learning to occur, pupils must construct their own understanding by modifying their pre-existing ideas in the light of new insights gained from their performance in, and the outcome of, the investigation. This constructivist approach to learning is largely based upon the work of the Children's Learning in Science Project (Scott 1987).
Assessment at Key Stage 4 is by means of the GCSE. Coursework requirements for the GCSE have focused on assessing investigative work as it is defined in the National Curriculum. The current model of assessment identifies four skill areas: planning, obtaining, analysing and evaluating evidence. Within the department that I am currently working in, pupils are encouraged to develop their skills in the four areas throughout Key Stages 3 and 4. By the end of Key Stage 3, investigations carried out by pupils are marked according to the GCSE syllabus criteria. The department follows the NEAB board; the criteria by which each skill is to be assessed are enclosed (appendix 2). The department's attitude towards moderation of assessed practicals differs from that of other departments I have worked in. The emphasis is on individuals applying consistent standards, in relation to the marking criteria and exemplar material provided by the examining board. There is very little in the way of internal departmental moderation aiming to establish a common standard between staff. There are no formal moderation meetings, and so no opportunity to exchange and mark work independently and discuss changes. The drawbacks of such a system are self-evident: in the last three years, external moderators have twice made corrective adjustments to the departmental marks.
I chose to do some investigative work with my year 9 group. Although they were a relatively high ability set, discussions with the class teacher revealed that they had done very little practical or investigative-based work, with priority of time given to preparation for Standard Attainment Tests. I had to take this into consideration, along with the departmental policy of assessing investigations in accordance with examining board criteria at the end of Key Stage 3, which this group was at. My chosen activity would have to strike a balance between being challenging enough to develop their thinking and not being too procedurally demanding for an inexperienced practical group. The investigation that I developed was taken from a "Thinking Science" activity, based upon the ideas of the Cognitive Acceleration through Science Education project (CASE, Shayer and Adey 1981). Thinking Science is about encouraging the development of thinking from what Piagetian psychology (Piaget and Inhelder 1969) terms concrete operational thinking, characterised by thought processes based upon perception that allow a child to describe a situation but not to explain it, to formal operational thinking, characterised by complex abstract thought arising from the development of higher cognitive skills such as classification, analysis, synthesis and deduction. I based my investigation on principles used in an activity placed at the beginning of the series. The activity was designed to lay important groundwork on the control of variables, and so related to the vital formal reasoning patterns needed in successful investigative work. The activity identified the relationship between the viscosity of a liquid (oil) and a change in temperature (appendix 3). Having discussed my proposal with the class teacher and laboratory technicians, I amended the activity to use a sugar solution instead, largely due to safety and availability factors.
Although I was going to use the same apparatus as suggested for the activity, I decided to ask the pupils to investigate any factor which might affect the viscosity of the sugar solution, so as to ensure consideration of a range of variables, including temperature. Although Gott and Mashiter (1991) have suggested that the move from handling categoric and discrete variables to continuous variables increases the complexity of a problem, I felt the pupils within this group were capable of this, and the relative simplicity of the procedures would leave them more time to develop reasoning patterns. I had only three lessons available to plan, carry out, analyse and evaluate the investigation. As I mentioned in my lesson plans (appendix 4), I aimed to introduce the task, develop ideas and get pupils to write their preliminary plans in the initial lesson, and then have them carry out the investigation in the following double lesson, leaving time at the end to discuss results and give instructions for the analysis and evaluation to be completed at home.
Section 2 Analysis of investigation
A quantitative breakdown of pupil responses to this investigative task can be found in appendix 5. As can be seen from this evidence, the skill area in which pupils were most successful was planning. All of the pupils had made an attempt in this area and all obtained marks in accordance with the criteria. Even the weakest student in this skill area was able to make a prediction, plan a simple procedure and attempt, albeit not fully, to describe how they had planned for a fair test (appendix 6). The stronger pupils were able to use their own scientific knowledge and understanding of processes to plan a detailed procedure, identifying the key variables to control and vary and deciding on a suitable number and range of observations or measurements to be made (appendix 7). Success in this skill area could be attributed to a combination of factors, including an appropriate teaching strategy coupled with prior experience in this area. I devoted a whole lesson to developing investigative plans (appendix 4). The strategy I adopted was based upon the recommendations given in the Open Work in Science Project (Jones et al 1992). This report gives examples of how teachers introduce investigations and discusses the problems of giving pupils just the right amount of information (pp. 61-78). I decided the best approach was to do a lot of "asking" rather than a lot of "telling". The stimulus activity came from a small demonstration of a liquid flowing through a funnel, with no form of explanation, followed by a brainstorming session to identify variables that would affect the rate of flow of the liquid. By doing this I was enabling the pupils to focus on the nature of the task and start to pinpoint the key variables involved. It also got the problem into a form that could be investigated and led to discussions of the practicalities of solving it.
Having perceived the problem and reformulated it through identification of variables, to give further guidance with planning I provided a support sheet devised by the department, which contained an array of questions. Frost (1995, p. 71) identifies such questions as supporting students in undertaking the thinking and decision-making processes for themselves (appendix 8). I was also able to use the sheets as formative tools to give feedback to the less able students, by collecting the planning sheets and adding comments and further questions to stimulate their thinking about their planned procedure (appendix 9). Another reason for the relatively high achievement in this skill area may be that it is a skill they had been able to exercise during previous practical activities.
Obtaining evidence was another skill area in which pupils fared relatively well, with only two pupils failing to be awarded any marks, due to absence and incomplete reports. Using apparatus and making measurements and observations are basic practical skills that Parkinson (1994) rightly comments would have developed through pupils' prior practical exercises. All pupils were able to make appropriate observations, and the majority met the need for accurate results through systematic and repeated measurements (appendix 10). This was also the point at which pupils had the opportunity to do a "trial run" and to modify their techniques and strategies appropriately. My own strategy at this point was to strike a balance between encouraging pupils to work independently and giving constructive feedback about the apparatus, materials and strategies that pupils adopted. I deliberately tried not to make the procedural demands too complex, due to the class's relative inexperience and timing factors. This was in accordance with Parkinson (1994, p. 130), who comments that children are likely to tackle a problem badly if it is overburdened with procedure.
A skill area not tackled with the same degree of success as the other two was analysing evidence and drawing conclusions. Although most pupils were able to manipulate the data and display it graphically, the interpretation of this data was often inconsistent with the evidence, or very superficial and lacking in detail (appendix 11). Often conclusions bore very little relation to scientific knowledge and understanding (appendix 12). The failure in this area may be due to numerous factors, including teaching strategy, the curriculum context of the investigation, pupil ability and the time devoted to this skill. In relation to teaching strategy, as my lesson plans reveal (appendix 4), where I had spent a third of the available time on planning, only a small fraction of the time was spent on developing analytical skills. Perhaps, in aiming to encourage independent thought, I had gone too far: the area in which pupils needed the most support ended up being the area in which they had to put in the most independent work. Having completed their exploratory stage, instead of reporting back on findings and consolidating knowledge, there was only a brief ten-minute discussion and instruction on completing the written report in accordance with the criteria requirements. On the positive side, I did go over the questions on their support sheets and gave pupils the opportunity to take these home to aid completion. However, I may have been wiser to consult the framework recommended by Jones and Kirk (1990) and the Open Work in Science Project (1992) for structuring investigation-based lessons. This suggests that after the exploratory stage there must be reporting, consolidation and application stages, where pupils report back on their findings and discuss procedures, enabling them to develop a greater procedural and conceptual understanding. The context within the curriculum in which investigations are set is another factor which may affect success in this area of analysing evidence.
Incorporating investigative work into a departmental scheme of work, in accordance with the appropriate unit or topic, can serve to enable progression in learning (Jones et al 1992). Such sequencing of activities can help to develop and complement conceptual knowledge and understanding. Had the investigation I chose been conducted within a unit which focused, for example, on teaching the particle theory of matter or the properties of solids, liquids and gases, there would have been scientific principles that pupils could use and relate to when analysing evidence from the investigation. However, the department I was teaching in had their Key Stage 3 schemes of work under review, and as such there was little coordination in incorporating investigative work into specific areas.
Further reasons for the lack of success in this skill area can be attributed to pupil ability. Research has identified that analysing evidence and drawing conclusions is a skill that many pupils fail to exercise effectively (Foulds, Gott and Feasey 1992). This may be related to Piaget's ideas that analysis is one of the higher cognitive skills related to formal operational thinking, which children sequentially develop through the late adolescent period. The mechanism and exact rate of this development are by no means certain; suffice to say that children will develop at different speeds depending on a variety of individual and environmental factors. However, differences in cognitive development could help to explain the variation in this skill area, where some pupils have obvious difficulty analysing and drawing conclusions (appendix 11/12) whilst others handle it with more ease (appendix 13).
The last skill area is that of evaluating evidence; again, as with analysis, there were serious weaknesses in this area. Many pupils had failed to complete, or even make an attempt at, evaluating the procedure. Even pupils who had attained high marks in other skill areas often evaluated their evidence quite superficially, without relevant depth (appendices 14 and 15). The possible reasons for this may be similar to those for analysis, that is, inappropriate teaching strategy, insufficient time and inexperience in using this skill. Instead of dictating the required criteria as I did, using phrases from the examining board criteria such as "you need to comment on reliability of evidence, account for strange results and propose further investigation", it would have been more constructive for pupils to report back and discuss the actual procedures of the investigation and their limitations, so that procedural understanding could be enhanced. Furthermore, it would have been useful to have what Jones and Kirk (1990) suggest as an application stage in the lesson framework, where pupils can suggest, modify or implement further investigation.
At Key Stage 3, class teachers are required to assess pupils' performance of whole investigations to give a level for attainment target 1 (SEAC 1993). The National Curriculum provides a formal link between teaching and assessment, with the different levels within attainment target 1 intended to mark out stepping stones in the progression of skills in this area. The class teacher will examine the levels reached by each pupil in each of the skill areas and look for the highest level that has been consistently attained. SEAC have suggested that teachers use their professional judgement to decide upon an overall level for Sc1 (SEAC 1993). My analysis of the investigation undertaken by this year 9 group can serve to contribute to the class teacher's assessment of their levels. As the year 9 group were only a week from progressing to Key Stage 4, departmental policy was to assess their investigative work, as would happen at Key Stage 4, through the GCSE syllabus criteria. These criteria (appendix 2) are focused on assessing investigative science as it is defined in the National Curriculum, which identifies four skill areas: planning, obtaining, analysing and evaluating evidence. My analysis of this class's performance in these skill areas can thus not only provide the class teacher with information on which to base assessment of levels, but also act as a guide to the kind of grade they are performing at in relation to the practical components of the GCSE. This information is vital for the class teacher, and also for the pupils, in understanding how they are currently performing against the criteria by which they will be assessed in the coming year. I have provided pupils with formative feedback in the form of comment sheets with targets for each skill area, based on my marks and analysis of the investigation they had conducted (appendix 16).
Section 3 Evaluations
It has been claimed by educationalists (Fairbrother 1992) that there is currently a strong movement in the direction of open-ended investigative problem solving as a form of practical assessment in science education. It is impossible to avoid, as it is now firmly part of the GCSE and the National Curriculum. It is seen not only as a reflection of the way "real scientists" work but also as a good way of teaching skills and knowledge. Gott et al (1988) identified two sorts of understanding required in order to do investigations: conceptual understanding, concerned with facts, concepts, laws and principles, and procedural understanding, or "processes", concerned with the different modes of thought involved when solving problems encountered in science and, more generally, in everyday life. Some reports and curriculum projects in the UK have not merely identified procedural understanding as a separate aspect of science performance but have argued that it is the most important aspect. The DES asserts (1985, p. 7) that "the essential characteristics of education in science is that it introduces pupils to the methods of science". Others have argued against this process-led assessment of practical work in science, identifying these processes as general cognitive skills that all humans develop without formal instruction, and which therefore do not need to be taught (Millar 1993). The viewpoint adopted by constructivists is that the processes that define investigative work can be a tool for confronting preconceptions and thus enable "children to develop more effective conceptual tools" (Millar and Driver 1987). There are many viewpoints as to what should be taught, understood and revealed through the assessment of practical-based work in science. I believe that teaching and assessing practical skills should not be an end in itself but should develop skills that can be applied in other situations.
Another way of looking at it, as Fairbrother (1993) argues, is that assessment should enable pupils to see the purpose of what they are being asked to do. Assessments must therefore be put into some kind of meaningful context, which investigations or open-ended problem solving can achieve. Furthermore, the efforts of reform-minded educators who strive to move away from low-level, fact-based testing in the curriculum in order to create and develop this form of practical-based assessment can only be a good thing. Practical assessment of this kind is designed to elicit critical thinking, problem solving and communication skills, which serve to prepare students for more competitive, high-performance workplaces that demand individuals possess and are able to use such skills.
The teaching and assessment of scientific investigation is not a simple task, particularly when you are relatively inexperienced. Investigative work differs in a number of ways from other types of practical work. For example, investigative work gives pupils the opportunity to test their own understanding of scientific phenomena, encourages pupils to make statements that they can test, allows them to plan their own investigations, and gives opportunity for discussing ideas with other pupils; in reality it encompasses a whole range of activities that are centred around helping pupils to learn. The role teachers take in these investigations must be, as Frost (1995) suggests, to "compliment and support" the role of students. The teacher's skill lies in their ability to enable students to undertake the thinking and decision making for themselves. It is all too easy for prior practical experience to take over and for the teacher to tell students what to do. I felt I managed to achieve the optimal balance when pupils were initially planning and implementing their investigations, without explicitly dictating what they should do. The suitability of this approach was supported by the fact that pupils achieved relatively high marks in these skill areas (appendix 5). However, when it came to analysing and evaluating evidence, I could see I gave insufficient guidance, due to time constraints and also a concern to ensure students think independently. In future I may well adapt the framework for structuring lessons recommended by the Open Work in Science Project (1992). The framework consists of five distinct stages, focusing, exploratory, reporting, consolidation and application, and discusses utilising different techniques of formative feedback. Furthermore, I would need to give more thought to the context and objectives that I want to achieve through completing an investigation.
It would be useful in the future to adopt a teacher planning sheet for investigative work, as suggested by Parkinson (1994, p. 125). Nevertheless, I take from this initial experience of teaching investigative work an enjoyment of the enthusiasm and commitment that students showed, arising from an ownership of the activity, and also a repertoire of useful questions and comments to support students, such as "Have you decided what to measure? How to measure it? How to record it?" and, of course, "Are you sure that's what you're meant to do with that piece of equipment?"
References
Black, P. and Wiliam, D. (1998) Inside the Black Box. London: King's College London.
Department of Education and Science (1985) Science at age 15 Science report for teachers no.5 London HMSO
Driver, R., Squires, A., Rushworth, P. and Wood-Robinson, V. (1994) Making Sense of Secondary Science. Routledge.
Fairbrother, B. (1991) Principles of practical assessment, in Woolnough, B. (ed.) Practical Work in Science. Milton Keynes: Open University Press.
Frost, J. (1995) Teaching Science. Woburn Press.
Gott, R. and Murphy, P. (1987) Assessing Investigations at Ages 13 and 15. APU Science Report for Teachers, No. 9. London: DES.
Gott, R. and Mashiter, J. (1991) Practical work in science: a task-based approach?, in Woolnough, B. (ed.) Practical Work in Science. Milton Keynes: Open University Press.
Gott, R., Welford, G. and Foulds, K. (1998) The Assessment of Practical Work in Science. Oxford: Basil Blackwell.
Jones, A.T. et al. (1992) Open Work in Science: Development of Investigations in Schools. ASE.
Jones, A.T. and Kirk, C.M. (1990) Introducing technological applications into the physics classroom: help or hindrance to learning? International Journal of Science Education, 12(5), pp. 481-490.
Millar, R. (1991) A means to an end: the role of processes in science education, in Woolnough, B. (ed.) Practical Work in Science. Milton Keynes: Open University Press.
Millar, R. and Driver, R. (1987) Beyond processes. Studies in Science Education, 14, pp. 33-62.
Osborne, R. and Freyberg, P. (1985) Learning in Science. Heinemann.
Parkinson, J. (1994) The Effective Teaching of Secondary Science. Longman.
Piaget, J. and Inhelder, B. (1969) The Psychology of the Child. Routledge and Kegan Paul.
Scott, P. (1987) A Constructivist View of Learning and Teaching Science. Children's Learning in Science Project, University of Leeds.
Shayer, M. and Adey, P. (1981) Towards a Science of Science Teaching. London: Heinemann.
Bibliography
Black, P. and Wiliam, D. (1998) Inside the Black Box. London: King's College London.
Cohen, L., Manion, L. and Morrison, K. (1996) A Guide to Teaching Practice. Routledge.
Frost, J. et al. (1995) Teaching Science. Woburn Press.
Harlen, W. (1983) Guides to Assessment in Education. Macmillan.
Millar, R. (1989) Doing Science. The Falmer Press.
Parkinson, J. (1994) The Effective Teaching of Secondary Science. Longman.
Qualter, A. et al. (1990) Exploration: A Way of Learning Science. Blackwell.
Rowlands, D. (1987) Problem Solving in Science and Technology. Hutchinson.
Turner, T. and DiMarco, W. (1998) Learning to Teach Science in the Secondary School. Routledge.
Wellington, J. (1994) Secondary Science. Routledge.
APPENDIX CONTENTS
1) School policy on assessment
2) GCSE criteria for assessment of practical work
3) Thinking Science practical activity
4) Lesson plans
5) Quantitative breakdown of pupil response
6) Pupil sample: planning
7) Pupil sample: planning
8) Departmentally devised support sheets
9) Pupil sample: planning feedback
10) Pupil sample: obtaining evidence
11) Pupil sample: analysis
12) Pupil sample: analysis
13) Pupil sample: evaluating
14) Pupil sample: evaluating
15) Feedback comments