The methodology used by Whitmarsh (2009) can be seen as “filling in the gap”, as previous research has often relied on survey checklists to measure public understanding of climate change. This study, however, employed a much more qualitative approach to reveal participants’ unprompted conceptions of climate change and global warming (Whitmarsh, 2009). Whitmarsh (2009) used a split survey design in which half the sample received questions using the term “climate change” and the other half the exact same questions with the term “global warming” instead. As the questionnaires were identical in all respects apart from the key terminology, clear comparisons could be made between the “climate change” group and the “global warming” group.
Whitmarsh’s decision to use a postal questionnaire meant that a large number of people could be reached, increasing the sample size. However, the method has limitations, the most important being that there is no control or supervision over who completes the questionnaire or how it is completed. On the other hand, there was no noteworthy bias in the allocation of respondents to the two survey versions: since allocation was random, any particular respondent had an equal probability of receiving either version. The major issue with the survey concerns demographics, namely whether the sample can be said to represent the UK public as a whole. Even though Whitmarsh (2009) states that the survey “represents a cross-section of socio-demographic groups”, the fact that “15 percent of the sample has no formal qualifications compared to 24 percent of the total population” challenges that claim. The returns were biased towards the more educated, a common characteristic of postal questionnaires, as it is normally those interested in the topic who respond. Whitmarsh (2009) appears to overlook this issue, instead weighting the data, which produces results with no significant change. Nevertheless, this bias could still have produced an inaccurate representation of the public’s understanding. Furthermore, the fact that the survey covered only Portsmouth strengthens the argument that the results do not accurately reflect public understanding of climate change across the UK, as there are almost certainly regional differences in understanding of climate change and global warming. What is more, the response rate was only 33.3%, and those who responded were most likely better educated, creating a skewed picture. For a postal questionnaire, though, the response rate is reasonable (Oliver, 1990) and comparable to response rates for similar surveys.
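The weighting mentioned above can be illustrated with a minimal sketch of post-stratification weighting, one standard way of correcting this kind of nonresponse bias; the article does not detail Whitmarsh’s exact procedure, so the code below is an assumption about the general technique, using the education figures quoted in the text (15% of the sample versus 24% of the population with no formal qualifications; the complementary shares follow from those).

```python
def weight(population_share, sample_share):
    """Post-stratification weight for one stratum: respondents in an
    under-represented group are weighted up, over-represented down,
    so the weighted sample matches the population distribution."""
    return population_share / sample_share

# Stratum shares from the figures quoted in the text (education bias).
w_no_quals = weight(0.24, 0.15)   # no formal qualifications → 1.6
w_quals    = weight(0.76, 0.85)   # some qualifications      → ≈0.894

# Check: the weighted sample shares now match the population shares.
print(0.15 * w_no_quals, 0.85 * w_quals)  # 0.24, 0.76
```

Applying such weights before analysis is what allows Whitmarsh to report that the weighted results show no significant change from the unweighted ones.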
Whitmarsh (2009) analysed and interpreted the data using SPSS (statistical software). The qualitative responses were first transformed into quantitative form using NVivo, which codes the results through a hierarchical coding procedure; this allowed the prevalence of conceptual themes to be quantified and directly compared (Whitmarsh, 2009). SPSS was then used to perform chi-square tests, which determined whether there was any significant variation between the responses to the two versions of the questionnaire (Whitmarsh, 2009).
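The chi-square test used here can be sketched briefly. The counts below are hypothetical, not Whitmarsh’s data; the sketch simply shows how such a test compares how often a coded theme was mentioned under the two questionnaire wordings.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table:
    sum over cells of (observed - expected)^2 / expected, where the
    expected count assumes theme mention is independent of wording."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Rows: "climate change" version, "global warming" version (hypothetical).
# Columns: respondents who mentioned the theme, those who did not.
table = [[120, 180],
         [150, 150]]
print(chi_square_2x2(table))  # ≈ 6.06 > 3.841, the 5% critical value at df = 1
```

A statistic above the critical value indicates a significant difference between the two wordings for that theme, which is the kind of variation Whitmarsh tested for in SPSS.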
There is a logical structure throughout, with the findings of the research presented in both statistical and textual form. The use of tables to present the results means information can be retrieved quickly and results can easily be compared. However, the lack of consistency of measurement throughout the article makes the findings difficult to analyse and evaluate. For example, Whitmarsh (2009) measures trust on a four-point scale, with “1 = not at all” and “4 = a lot” (Whitmarsh, 2009), but uses percentages to present all other statistical data. Whitmarsh (2009) has not tabulated all the data, making direct comparisons challenging. In fact, not all data is even shown in the article; instead Whitmarsh (2009) refers to results in other studies such as DEFRA’s (2002, 2007) and MORI’s (Norton and Leaman, 2004) surveys, which makes it difficult to analyse or judge those results. Finally, although the results were tabulated, it is unfortunate that statistical data was shown only in tables and not in other forms, such as graphs or pie charts, which in some cases could have been more appropriate. By and large, though, the balance of statistical data (presented in tables) and textual data (in written form) is adequate, especially when compared with the unbalanced presentation of data found in Sterman and Sweeney (2002).
In the discussion, Whitmarsh (2009) supports the use of a more qualitative survey design by noting that previous quantitative research produced a greater response to questions focused on the causes of climate change/global warming, purely because that research employed survey checklists while Whitmarsh (2009) did not. The variation in understanding of, and response to, the basic terminology may have resulted from the “media’s tendency to refer to ‘global warming’ instead of ‘climate change’”. That the public were more familiar with the term “global warming” is therefore unsurprising, given the public’s reliance on media sources for information.
Conclusion
The research Whitmarsh (2009) conducted is focused on two main concepts, which are argued and discussed throughout the paper. Whitmarsh (2009) highlights that “researchers must be aware that questionnaire wording will affect the responses given” (Whitmarsh, 2009), simply because there is no consistency of terminology in the surveys and questionnaires used to research public knowledge of, or attitudes to, climate change/global warming. Whitmarsh (2009) therefore advises that future research focused on public responses should recognise that terminology is not neutral and should not be used indiscriminately. By using a less restrictive, more qualitative and unprompted survey, Whitmarsh (2009) allowed participants to express their understanding “in their own words” (Whitmarsh, 2009), and argues that future research should make greater use of qualitative methods to explore understanding. Whitmarsh (2009) also suggests that researchers should not assume respondents share the same knowledge and terminology as themselves, as “there might be a trade-off between scientifically accurate information and affective public engagement in climate change/global warming” (Whitmarsh, 2009).
The methods used by Whitmarsh (2009) are appropriate and effective for the research concept being explored. Whitmarsh’s (2009) skill ensured that the argument was logical and flowing, although a wider variation in data presentation would have been fitting. On the whole, though, the article is well structured and its findings are both important and interesting in relation to research on public understanding of climate change/global warming.
References
Lorenzoni, I. and Pidgeon, N.F. (2006) Public views on climate change: European and USA perspectives, Climatic Change, 77, 73–95.
Norton, A. and Leaman, J. (2004) The Day After Tomorrow: Public Opinion on Climate Change, Ipsos MORI, 1–9.
Sterman, J.D. and Sweeney, L.B. (2002) Understanding public complacency about climate change: adults’ mental models of climate change violate conservation of matter, 1–33.
Whitmarsh, L. (2009) What’s in a name? Commonalities and differences in public understanding of “climate change” and “global warming”, Public Understanding of Science, 18, 401–420.
Electronic Word Count: 1483