Source:
Generation is defined as:
‘A stage or period of sequential technological development and innovation. A class of objects derived from a preceding class: a new generation of computers. The formation of a line or geometric figure by the movement of a point or line. The act or process of generating; origination, production, or procreation.’
Source:
Communication is defined as:
‘The act of communicating; transmission. The exchange of thoughts, messages, or information, as by speech, signals, writing, or behavior. Interpersonal rapport. The art and technique of using words effectively to impart information or ideas. The field of study concerned with the transmission of information by various means, such as print or broadcasting. Any of various professions involved with the transmission of information, such as advertising, broadcasting, or journalism. Something communicated; a message. A means of communicating, especially: A system, such as mail, telephone, or television, for sending and receiving messages. A network of routes for sending messages and transporting troops and supplies. The technology employed in transmitting messages. The transfer of information from one molecule, cell, or organism to another, as by chemical or electrical signals or by behaviors. An opening or connecting passage between two structures.’
2.1 How is research done?
Elizabeth Orna sets out three components of research that fall outside a self-evident and linear definition of such a process:
A: What researchers do before the searching begins, in order to decide what they are looking for.
B: What they do during the searching, to ensure they can make good use of what they find.
C: What they do after their investigation has led to discovery, in order to communicate what they find.
Source: Orna, Stevens, 1997.
Orna describes a model of staggered recursion whereby a pre-search definition of the search itself precedes the actual data-gathering, which in turn leads to discovery, which then has the potential to lead either to a further modification of the search definition or to a communication of what the initial investigation has discovered (Orna, Stevens, 1997).
‘The definition of “research focus” need not be a complete and detailed one; the theme can indeed come into focus as a result of a preliminary, fairly general searching.’
Source: Orna, Stevens, 1997.
Other authors, including Graziano (2004), Wisker (2001) and Blaxter (1996), mention a similar aspect of the research process: the institutional hope (or fallacy) of linear and unrestrictedly logical information gathering is not the sum of intellectual endeavour insofar as such gathering overlaps with knowledge generation. Their point is that the areas of interest uncovered in the pre-gathering stage may be changed both during the gathering stage and during a third, opaque stage called information formatting (the act of arranging and systematizing different ranges of data). Instead of this half-measure, the Author posits a constant in the research equation:
A: That formatting is reflexive and co-extensive with the entire research process.
B: This function may be unconscious or conscious; that is to say, autonomic or self-aware.
A useful definition of reflexivity is:
‘Directed back on itself: Of, relating to, or being a verb having an identical subject and direct object, as dressed in the sentence “She dressed herself.” Of, relating to, or being the pronoun used as the direct object of a reflexive verb, as herself in “She dressed herself.” Of or relating to a reflex. Elicited automatically; spontaneous: “a bid for... reflexive left-wing approval” (Marshall Delaney.)’
Source:
At this point, the Author feels it important to explicitly mention the two questions underpinning the current discussion:
A: What is research?
B: How do we do research?
While the former refers to the categorical functionality of preconditions to formal knowledge generation, the latter refers to the actuated processes of research, or rather the methodologies and applications of what the Author describes as ‘real-time’ research.
Practical research can be compared to a physical or even mechanical (the interaction and generative properties of well-linked components) event; a useful definition of actuation/instantiation is:
A: To put into motion or action; activate: electrical relays that actuate the elevator's movements.
Or:
B: To move to action: a speech that actuated dissent.
Source:
This understanding of research hints at a recursive/regressive apparent duality: the bare bones of procedural research and the complex interpretation factor which permeates the commonly purported effort towards ‘pure’ research. Orna and Stevens (1997) describe the above two-tier model in terms of the bifurcation of internalized external stimulation into knowledge and information (discrete but mutually informative). The former is transformed into the latter, which is then perceived as the former for another: research is therefore a mode of transformation. It is an active force and cannot be seen as unbiased, ‘pure’ or totally ‘scientific.’ The only scale which can be usefully applied in such a manner is that of the qualitative-quantitative spectrum.
A helpful definition of actualization (the aforementioned capacity of the empirical realm) is:
‘Making real or giving the appearance of reality.’
Source:
Note the subtle presence of transformation, interpretation and complicity within any attempt to realize the internal symbiont of external experience, and within the intercommunication of such between conscious/human entities and intelligences. Thomas Nagel (1986) has said that any diagnosis of the epistemological breakdown between the sciences and the humanities is due to a misconception of an irreducible dual reality comprehensible in terms of discrete objective and subjective modes. The main thrust of this pensée is that a purported liberation from any and all hermeneutical burden betrays a fundamental misapprehension of reality. The burden of perception can be defined as:
‘Interpretative Hermeneutical Circle: the dialectical relation between the interpreter’s horizon and the horizon of a text.’
Source: http://www.people.bu.edu
Or:
‘We need freely and imaginatively to vary the components of the phenomena being studied in order to determine which are their essential, and which their conditional, features; in this way we can detect what the essential relations between the things under consideration are. When these essential predicates have been identified (of human being, for example), then we have immediate (unmediated), apodictic knowledge. This is used to create a world of ideas (eidoi), which is a universal and so public world.’
Source:
In both private and public spheres, the research or phenomenological (for the objects and subjects of research are themselves subject to conscious processes) procedure is always and at best an approximation: the only real goal is to refine, hone and research the research itself as well as its preconditions. Ideally, both research and the research of the research would be almost co-extensive and co-temporal (Habermas, 1981) but such conscious praxis is, according to the author, probably beyond the abilities of mankind in its current state.
2.1.2 Numbers or Feelings? / Quantities and Qualities.
Orna and Stevens (1997) note that an aspect of ameliorative (the Author hesitates to use terms such as ‘good’ or ‘bad’) research is the utilization of prior knowledge and the consequent re-configuration and novel application of the researcher’s past experience and personality. The extent to which different approaches and measures are used in a research project can therefore be seen as an extension of the authorial voice, which is more or less explicit according to style and agenda.
The Author’s idea of a spectrum of quantitative and qualitative sources and information types is useful in two ways:
A: It refers to the fact that the ostensive typology is not discretely dualistic but rather proportional (or inherently dualistic).
B: It refers to the fact that there is no absolute position but a series of more or less amalgamations or single points heading up matrices of substantial pre-conditions discussed in the first part of this paper.
Wisker (2001) divides research into five kinds:
1: descriptive.
2: exploratory.
3: predictive.
4: explanatory.
5: action.
These categories, though second-order (necessarily super-imposed rather than analytic), are a useful systematization of positions along the quantitative-qualitative spectrum posited explicitly by the Author (though implicitly by Graziano, 2004). Wisker (2001) defines descriptive research as a preparatory mode to exploratory research in that the former:
‘…aims to find out more about a phenomenon and to capture it with detailed information.’
Source: Wisker, 2001.
Exploratory research:
‘…is commonly used when new knowledge is needed or sought, or when certain behaviour and the causes for the presentation of symptoms, actions, or events need discovering.’
Source: Wisker, 2001.
Explanatory research:
‘…specifically seeks to look at the cause/effect relationships between two or more phenomena; it can be immensely helpful when description and simple exploration have come up with a number of variables which confuse rather than clarify the assumptions and hypotheses.’
Source: Wisker, 2001.
Predictive research:
‘…predictive research is based on identification of relationships between variables, so changing one or more variables could change the outcomes; you can then deduce the outcomes to some extent.’
Source: Wisker, 2001.
Action research:
‘…is experimentally based and usually set up to try to solve a problem or try out a hypothesis which could improve a practical situation.’
Source: Wisker, 2001.
Walliman (2001) includes ethnogenic and feminist research modes:
Ethnogenic research:
‘In this approach, the researcher is interested in how the subjects of the research theorize about their own behaviour rather than imposing a theory from outside.’
Source: Walliman, 2001.
Feminist research:
‘Feminist research claims that researchers who ignore the differences between men's and women's knowledge have created invalid knowledge, as non-feminist paradigms usually ignore the partiality of researcher’s ideas about the world.’
Source: Walliman, 2001.
It is obvious that all these approaches use differing proportions of qualitative and quantitative data: the balance between the two genres is struck, self-consciously or otherwise, relative to the tension between research objectives (reflexive) and the nature of the sources (variable). Bearing this in mind, the following definitions will prove useful:
Qualitative:
‘1: involving distinctions based on qualities: qualitative change; qualitative data. 2: relating to or involving comparisons based on qualities.’
Source:
Quantitative:
‘1: expressible as a quantity or relating to or susceptible of measurement; export wheat without quantitative limitations; quantitative analysis determines the amounts and proportions of the chemical constituents of a substance or mixture 2: relating to the measurement of quantity.’
Source:
2.1.3 Research Paradigms
A definition concludes:
‘A research paradigm or perspective is the underlying set of beliefs about how the elements of the research area fit together and how we can enquire of it and make meaning of our discoveries.’
Source: Wisker, 2001.
Denzin and Lincoln (1998) have opposed the positivist and post positivist paradigms focusing on internal validity, external validity, reliability and objectivity with the constructivist model which favours a relativist ontology, transactional epistemology and hermeneutic/dialectical methodology. The former approach (or agenda) is useful for maintaining analytical rigour in an investigation but it cannot:
‘…take account of the ways in which inquiry is interactive and sets of facts can be read in different ways and are value-laden and not value-free.’
Source: Wisker, 2001.
However, the more humanist attitude to research aims at the:
‘…production of reconstructed understandings: it is not as focused on validity but concentrates instead on trustworthiness and authenticity.’
Source: Wisker, 2001.
Both these research paradigms are located prior to any research exercise and are existentially derived. As such, the balance between the two viewpoints is generally a function of the research personalities involved and the institutional bias they subscribe to (this last measure is variable but relevant in the majority of cases of human thought and action). A useful definition for this aspect of the conditions of research is:
‘A philosophy that emphasizes the uniqueness and isolation of the individual experience in a hostile or indifferent universe, regards human existence as unexplainable, and stresses freedom of choice and responsibility for the consequences of one's acts.’
Source:
Indeed personality can be summarized (though not conclusively by any means) as:
‘The quality or condition of being a person. The totality of qualities and traits, as of character or behavior, that is peculiar to a specific person. The pattern of collective character, behavioral, temperamental, emotional, and mental traits of a person. Distinctive qualities of a person, especially those distinguishing personal characteristics that make one socially appealing: won the election more on personality than on capability. A person as the embodiment of distinctive traits of mind and behavior. A person of prominence or notoriety: television personalities. An offensively personal remark. Often used in the plural: Let's not engage in personalities. The distinctive characteristics of a place or situation: furnishings that give a room personality.’
Source:
While bias may be defined as:
‘A preference or an inclination, especially one that inhibits impartial judgment. An unfair act or policy stemming from prejudice. A statistical sampling or testing error caused by systematically favoring some outcomes over others.’
Source:
As such, it is nearly unavoidable that a researcher will have many a bias, prejudice and agenda insofar as such phenomena are manifested as psychological and sociological survival mechanisms which are, moreover, the burden of free and choosing human entities (Camus, Sartre, 1942, 1946).
2.1.4 The Academy and its Institutions.
The Author feels it important at this juncture to render explicit an aspect of the discussion which has remained in the background of previous sections of this paper: the division between academic criteria of veracity and the attempts towards the same by external or non-academic knowledge generation. According to Kuhn, competing paradigms:
- Incorporate different criteria of acceptability for scientific explanations.
- Employ different concepts for describing and reporting the results of experiments and observations (“theory-ladenness of observation”).
- Incorporate different views about which experimental techniques and measuring/detecting instruments are reliable.
- Incorporate different views about which problems/questions are significant.
- All have anomalies.
Source:
What is important to note is that the implicit tension between modes of apprehending scientific truth is in fact an artificial division: a definition of science does not necessarily lead to scientism, and where scientism supports or is identifiable with the process of knowledge-generation within a knowledge society, such an academic environment need not be scientistic (though often such a misidentification proves to be the actuality).
Science may be defined as:
‘The classical definition of science is simply the state of “knowing” — specifically theoretical knowledge as opposed to practical knowledge. In the Middle Ages, the term “science” came to be used interchangeably with “arts,” the word for such practical knowledge. Thus, “liberal arts” and “liberal sciences” meant the same thing. Modern dictionaries are a bit more specific than that and offer a number of different ways in which the term science can be defined:
The observation, identification, description, experimental investigation, and theoretical explanation of phenomena.
Methodological activity, discipline, or study.
An activity that appears to require study and method.’
Source:
Or:
‘Finally, science is often used to refer to the community of scientists and researchers who do scientific work. It is this group of people who, through practicing science, effectively define what science is and how science is done. Philosophers of science attempt to describe what an ideal pursuit of science would look like, but it is the scientists who establish what it will really be.’
Source:
Scientism may be defined as:
‘Unlike the use of the scientific method as only one mode of reaching knowledge, scientism claims that science alone can render truth about the world and reality. Scientism's single-minded adherence to only the empirical, or testable, makes it a strictly scientific worldview, in much the same way that a Protestant fundamentalism that rejects science can be seen as a strictly religious worldview. Scientism sees it necessary to do away with most, if not all, philosophical and religious claims, as the truths they proclaim cannot be apprehended by the scientific method. In essence, Scientism sees science as the absolute and only justifiable access to the truth.’
Source: http://www.pbs.org
Academia may be defined as:
‘A school for special instruction. A secondary or college-preparatory school, especially a private one. The academic community; academe: “When there's moral leadership from the White House and from the academy, people tend to adjust” (Jesse Jackson). Higher education in general, a society of scholars, scientists, or artists.’
Source:
Or:
‘Academy:
- Plato's school for advanced education and the first institutional school of philosophy.
- Platonism.
- The disciples of Plato.’
Source:
The Author suggests that the telos of refining research parameters is a shared end, proportionate in methodology to personality. When the above-defined terms become overly intermingled, confusion, poor research and dubious knowledge-generation can occur. Any other/institutional paradigm held prior to a research project may thus be contingent on historical and socio-political bases unless such an agenda is made explicit and is usefully incorporated into the proposed research strategy, i.e. becoming conscious of the sometimes unconscious or even pre-conscious.
2.2 Tools of the Trade.
The choice of methods used to find and arrange information in research relates primarily to whether the researcher wishes to understand meanings (to look at, describe and understand experience, ideas, beliefs and values which are essentially, if not substantially, intangible) or whether numerical variables and the proving of existing hypotheses are what interest the researcher. Again, this categorization is broadly qualitative/quantitative, as discussed previously.
Wisker (2001) has said of the meeting between the two nominally discrete data-types:
‘This weighting can be determined by different times and places, different needs and abilities, the opportunities for different kinds of study, and different subjects.’
Accordingly, Wisker (2001) lists qualitative research methods as including:
1: Interviews. Face-to-face discussion with human subjects enables information to be gathered in response to open or closed questions and for this information to be recorded in memory, on paper or on tape. Closed questions can be universally simplistic (name, date of birth, etc.) or multiple-choice (Likert scale) in response, while open questions are interrogative and speculative (requiring the respondents’ opinions, beliefs and preferences as well as their own reasoning on their answers). Interview questions and structure should be prepared in relation to the research topic and to any desired perspective within or on that topic.
2: Focus groups. Focus groups enable the researcher to bring together a random or specified range of perspectives and/or expertise to discuss and scrutinize a particular issue. The format can be repeated over time to gauge the changes in people’s views as well as to take into account the effects of the researcher’s presence. Wisker notes that:
‘With several people present in a focus group, ideas and issues tend to shape themselves as people speak, and the subjects start to form an understanding as participants debate certain points.’
Source: Wisker, 2001.
3: Participant observation. In participant observation, the observer becomes one of the observed for a certain time period. However:
‘This is essentially a very subjective form of research and great care has to be taken in recognizing what is subjective, and what is objective fact when considering data and responses.’
Source: Wisker, 2001.
4: Personal learning logs. The log is a forum for recording response moment by moment, so tracking changing attitudes and the build-up of knowledge and understanding. It can be used as a research vehicle in itself to record, for example, how the researcher is feeling about undertaking the research, and is here very helpful if the research is personally related and ethically testing, involves emotions and so on.
Quantitative methods include:
1: Questionnaires. These can be understood as written interviews which must further be designed to take account of the non-verbal cues and nuances that would occur in a face-to-face interview. As such, questionnaires are an art form in and of themselves and require more planning than interviews. Given their ubiquity in modern life, good design is key to combating the low response rate to questionnaires or surveys. If using a large sample-base, an optical mark reader is a good way of collating collected data; otherwise a smaller sample-base of written responses requires after-the-event manual collation. When setting up questionnaires:
A: Avoid ambiguity in the questions.
B: Avoid repetition in the questions.
C: Keep the parameters and respondents’ details confidential.
D: Make sure the final/usable version has been trialled, piloted and refined.
E: Make sure the information desired is obtainable using the survey method.
Source: Wisker, 2001.
In addition, the questionnaire should be well laid-out for the ease-of-use of both researcher and respondent. The information gained should be easily coded and tabulated (Wisker, 2001.)
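By way of illustration, the coding and tabulation Wisker recommends can be sketched in a few lines of Python. The five-point Likert scale labels, the numerical coding and the responses below are hypothetical assumptions for the sketch, not taken from Wisker's text.

```python
from collections import Counter

# Hypothetical five-point Likert scale; labels and numerical coding
# are illustrative assumptions, not prescribed by Wisker (2001).
SCALE = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
         "agree": 4, "strongly agree": 5}

# Hypothetical written responses to a single closed question.
responses = ["agree", "neutral", "agree", "strongly agree", "disagree"]

coded = [SCALE[r] for r in responses]  # coding: labels to numbers
tabulated = Counter(coded)             # tabulation: frequency of each code

print(coded)         # [4, 3, 4, 5, 2]
print(tabulated[4])  # "agree" was chosen twice -> 2
```

Manual, after-the-event collation of a small sample-base amounts to exactly this kind of mapping and counting, however it is recorded.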
2: Statistical research. Virtually all other quantitative analysis of data comes under the domain of statistics: the extrapolation and manipulation of mathematically derived and represented patterns.
A pattern is defined as:
‘A model or original used as an archetype. A person or thing considered worthy of imitation. A plan, diagram, or model to be followed in making things: a dress pattern A representative sample; a specimen. An artistic or decorative design: a paisley pattern.’
Source:
Or:
‘A design of natural or accidental origin: patterns of bird formations. A consistent, characteristic form, style, or method. A composite of traits or features characteristic of an individual or a group: one's pattern of behavior. Form and style in an artistic work or body of artistic works.’
Source:
The above definitions refer to matrices of data either intelligible analytically (as extractions or extrusions of an empirical reality) or synthetically, as being phenomenologically contingent (Kant, 1781). Since there is this implicit polar cosmological assumption in statistical analysis (Author, 2004), the following definitions will prove useful:
A: Induction is defined as:
‘The process of deriving general principles from particular facts or instances. A conclusion reached by this process.’
Source:
B: Deduction is defined as:
‘The process of reasoning in which a conclusion follows necessarily from the stated premises; inference by reasoning from the general to the specific. A conclusion reached by this process.’
Source:
The point is that while mathematical representations of phenomena might be seen as the final mode, and number might be seen as the clearest model-basis, there is in fact a pre-supposition involved whereby the expectation of veracious numerical analysis of reality is borne out in the first instance by a redefinition of the fundamentals of cosmology:
‘Authority, first objectified as time, becomes rigidified by the gradually mathematized consciousness of time. Put slightly differently, time is a measure and exists as a reification or materiality thanks to the introduction of measure.’
Source:
In this sense, it is important to note that the poeticisation, linguification and abstraction of reality are contingent processes (Wittgenstein, 1912), ones which cannot be unchangeably removed from their intellectual context:
‘The importance of symbolization should also be noted, for a further interrelation consists of the fact that while the basic feature of all measurement is symbolic representation, the creation of a symbolic world is the condition of the existence of time.’
Source:
What we can say of the phenomenological moment of mathematical thought is only that it is a noble if doomed attempt at the elimination of self-processes in human thought:
‘The purpose of the mathematical aspect of language and concept is the more complete isolation of the concept from the senses. Math is the paradigm of abstract thought.’
Source:
Noble in the first instance because mathematical representation has on its side the generalizability of primal axioms which give rise to a functional essence of otherwise more reflexive/experiential subjects/objects and tragic in the same instance because such a movement is as unrepresentational as any possible totalistic/holistic viewpoint of any given particle in space-time:
‘Abstraction and equivalence of identity are inseparable; the suppression of the world's richness which is paramount in identity brought Adorno to the "primal world of ideology." The untruth of identity is simply that the concept does not exhaust the thing conceived. Mathematics is reified, ritualized thought, the virtual abandonment of thinking.’
Source:
What this preamble sets out to remind the naïve researcher is that while the current Scientistic paradigm (E. O. Wilson, 1978) sets out at once to debunk alternate empirical and conceptual approaches and to validate statistical order (mathematical phenomenology), what can be said of any totalizing methodology is in fact also true of mathematics:
‘In the birth of controls aimed at control of what is free and unordered, crystallized by early counting, we see a new attitude toward the world. If naming is a distancing, a mastery, so too is number, which is impoverished naming. Though numbering is a corollary of language, it is the signature of a critical breakthrough of alienation.’
Source:
2.2.1 Further Discussion of Statistics.
‘If you've been struggling to work out just how the war on Iraq is going to affect the economy, you're in good company. The statistics claiming to offer definite conclusions are scant and tenuous.’
Source:
Although any involved or meaningful understanding of the real world is not easily derivable from statistics, the descriptive aspect of statistics allows researchers to summarise large quantities of data using measures that are easily understood by an observer:
‘Thus descriptive statistics consist of graphical and numerical techniques for summarising data, i.e. reducing a large mass of data to simpler, more understandable terms.’
Source: Burns, 2000.
Another version of statistical analysis uses procedures based on a sample-base of information in order to develop generalizations about characteristics extant therein. The usual method is to relate the data to a central tendency (for example ‘0’ on a graph) by way of three distribution points (Burns, 2000):
1 The mean.
2 The median.
3 The mode.
The mean is easily skewed by extreme values, given that it is calculated arithmetically, whereas the median is drawn from an ordered hierarchy of values and does not involve manipulation of those values per se but merely their relative status. Burns (2000) reminds the researcher that the arithmetically derived mean is analytically more capacious/veracious than median or mode, especially in the case of inferential sample-bases where the likelihood of sampling-error is increased.
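The skewing effect of extreme values on the arithmetically calculated mean, as against the purely positional median, can be sketched with Python's standard statistics module; the scores below are hypothetical.

```python
import statistics

scores = [2, 3, 3, 4, 5]  # hypothetical scores
extreme = scores + [40]   # the same scores plus one extreme value

print(statistics.mean(scores))    # 3.4
print(statistics.median(scores))  # 3
print(statistics.mode(scores))    # 3

# The arithmetically calculated mean is dragged toward the extreme score,
# while the median, resting only on relative order, barely moves.
print(statistics.mean(extreme))    # 9.5
print(statistics.median(extreme))  # 3.5
```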
There exist many ways of measuring the spread of values around a chosen central tendency (Burns, 2000):
1: Range- the distance or territory between lowest and highest values/scores.
2: Variance-the sum of squared deviations from the mean (chosen central tendency) divided by the number/amount of values/scores.
3: Standard Deviation: a reflection of the domain/spread of values/scores around a chosen central tendency. For most research, where the sample-base of data is small relative to its underived sample-context, a modified denominator (N-1) must be used for greater analytic accuracy. Because of the squaring involved, this measure is also sensitive to skewed or extreme values, though it is unchanged when a constant is added to every score.
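The three measures of spread listed above can be sketched as follows. The scores are hypothetical; note that Python's statistics.variance and statistics.stdev already use the modified N-1 denominator mentioned for samples.

```python
import statistics

scores = [4, 8, 6, 5, 7]  # hypothetical sample of scores

value_range = max(scores) - min(scores)   # 1: range
sample_var = statistics.variance(scores)  # 2: variance (N-1 denominator)
sample_sd = statistics.stdev(scores)      # 3: standard deviation

print(value_range)  # 4
print(sample_var)   # 2.5

# Adding a constant shifts every score but leaves the spread untouched.
shifted = [s + 10 for s in scores]
print(abs(statistics.stdev(shifted) - sample_sd) < 1e-12)  # True
```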
While statistics ventures further into pure mathematics the more complex the calculation, the abovementioned techniques remain fundamental and consistently useful for generating simplified representations of otherwise intractable amounts of data. The dawn (utility-defined) of I.T. and computer technology generally has enabled faster and more accurate (within the accepted genre-specific research parameters) statistical analysis, and therefore the application of such research techniques to a greater range of specialisations (i.e. other than ‘hard’ sciences). A more thoroughgoing discussion of the contribution of I.T. to research will make up the final section of this paper.
If the scatter of data is represented graphically, the alternate heuristic presentation entails an altered perspective on the information generated. This is called a frequency distribution:
Normal distribution:
‘Normal distributions are symmetrical, affected only by random influences and will tend to balance out with extreme scores becoming progressively rarer.’
Source: Burns, 2000.
Skewed frequency distributions: these distributions are biased by factors that tend to push scores one way more than another (Burns, 2000). It is important to note that:
‘…our recorded observations tend to be discrete in practice rather than continuous, nor are they of infinite size. It is reasonable to speculate that no real variable is exactly normally distributed. This does not matter too much, as it is the utility of the model that counts. Even variables that appear normally distributed in a large homogenous data-sample will fail to be so under other circumstances, particularly with inferred sample-bases.’
Source: Burns, 2000.
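The symmetry and progressive rarity of extreme scores that characterize the normal model can be sketched with Python's statistics.NormalDist; a standard normal curve (mean 0, standard deviation 1) is assumed for the sketch.

```python
from statistics import NormalDist

nd = NormalDist(mu=0, sigma=1)  # the standard normal curve

# Symmetry: the density at +1 and -1 standard deviations is identical.
print(nd.pdf(1) == nd.pdf(-1))  # True

# Extreme scores become progressively rarer: the proportion of scores
# lying beyond ±k standard deviations shrinks rapidly as k grows.
for k in (1, 2, 3):
    tail = 2 * (1 - nd.cdf(k))
    print(k, round(tail, 4))  # roughly 0.3173, 0.0455, 0.0027
```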
Even though mathematics purports toward ultimate data-refinement, its statistical puissance is not completely self-contained or free of caprice. Probability calculations arising from the above quote are as prevalent in mathematics (Burns, 2000) as heuristic assumptions are in ‘soft’ data-generation methodologies. Even if the researcher can move most efficiently from phenomena to noumena (Kant, 1790) by way of mathematics, the contingencies of both researcher and research material being grounded phenomenally mean that even with the best efforts of Scientistic analysts (Dawkins, 1990), ‘hard’ scientific data is approximate and in no way the final/a priori (Kant, 1790) horizon of information potential.
Bearing the above in mind, the researcher generally utilizes the following weighting system (to determine the perceived predictability/probability status of any given value/score):
‘The three p-values of 0.05, 0.01 and 0.001 are conventionally accepted as the major thresholds for decisions about statistical significance.’
Source: Burns, 2000.
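A minimal sketch of how such thresholds are applied in practice: the two-tailed z-test and the observed score below are illustrative assumptions, while the 0.05/0.01/0.001 cut-offs follow the convention Burns cites.

```python
from statistics import NormalDist

def significance(p_value):
    """Return the strictest conventional threshold the p-value beats, if any."""
    for threshold in (0.001, 0.01, 0.05):
        if p_value < threshold:
            return threshold
    return None

# Hypothetical observed z-score under a two-tailed test:
z = 2.3
p = 2 * (1 - NormalDist().cdf(z))
print(round(p, 4))      # about 0.0214
print(significance(p))  # 0.05: significant at the 5% level only
```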
The threshold specifications are somewhat arbitrary, or at least are subject to the same categorical pressures discussed previously in relation to the researcher’s pre-enactive agenda(s):
‘This decision may involve political, social, educational, philosophic and economic considerations as well as statistical ones.’
Source: Burns, 2000.
Another dynamic affecting the parameters of statistical measurement is that of sampling criteria. As a general rule, the best approximation of reality that can be experimented on/drawn upon is that section closest to the research materials:
‘The sample must be representative in terms of variables which are known to be related to the characteristics we wish to study.’
Source: Burns, 2000.
The Author wishes to point out at this juncture that the actual practice of quantitative research becomes akin to qualitative research when the wish-fulfilment of Scientism (the transferring of personal research agendas and desires onto external actions, phenomena or people; Freud, 1896) is set aside. The variations of sampling procedures are as follows:
1: The Random Sample: Each unit of data has an equal chance of being selected and the selection of one subject is independent of the selection of any other.
2: The Systematic Sample: Each unit of data is derived according to a fixed interval or advanced algorithm.
3: The Stratified Sample: Each unit of data is derived from pre-formatted clusters or categories before randomization occurs (an apt analogy, the Author suggests, for the research process itself).
4: The Clustered Sample: Each unit of data is derived as a group-entity adjoined to related data-units.
5: The Staged Sample: Each unit of data is subject to a two-stage process whereby both groupings and members of each group are randomly determined.
6: The Opportunity Sample: Each unit of data is obtained in opportunistic circumstances; no satisfactory (predetermined, conditional) generalizability is possible.
Source: Burns, 2000.
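As an illustrative sketch (the Author’s; the population and sample sizes are invented), the first three of these procedures might be realized in Python as:

```python
import random

population = list(range(100))  # hypothetical units of data

# 1: Random sample -- every unit has an equal chance of selection
random_sample = random.sample(population, 10)

# 2: Systematic sample -- every k-th unit after a random start
k = len(population) // 10
start = random.randrange(k)
systematic_sample = population[start::k]

# 3: Stratified sample -- randomize within pre-formed categories
strata = {"low": population[:50], "high": population[50:]}
stratified_sample = [unit for group in strata.values()
                     for unit in random.sample(group, 5)]

print(len(random_sample), len(systematic_sample), len(stratified_sample))
# prints: 10 10 10
```

The stratified case makes Burns’s point concrete: categorization precedes randomization, so each stratum is guaranteed representation.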
The main problem with using samples of data drawn from their corresponding parameters or context is that they (the samples) are generally not identical with their sources. Statistics based on these samples will therefore be less than totally accurate. When measuring the distribution of any number of random sample mean-values the researcher must take into account the standard error:
‘This sampling variability of the mean, which is the extent to which a mean can be expected to vary as different samples of the same size are randomly selected from the same context, is expressed by its standard error. The standard error of any sample measure such as the mean is the standard deviation of the distribution of measures that would result if large numbers of different samples of the same size were randomly selected from the same context.’
Source: Burns, 2000.
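A minimal sketch of the standard error calculation (the Author’s illustration; the simulated sample and its parameters are invented):

```python
import math
import random
import statistics

# A hypothetical random sample of 30 scores drawn from a
# normally distributed population (mean 100, s.d. 15)
random.seed(1)
sample = [random.gauss(100, 15) for _ in range(30)]

mean = statistics.mean(sample)
# Standard error of the mean: the sample standard deviation
# divided by the square root of the sample size
sem = statistics.stdev(sample) / math.sqrt(len(sample))

print(f"mean = {mean:.1f}, standard error = {sem:.2f}")
```

With a sample of 30 and a standard deviation near 15, the standard error comes out near 15/√30 ≈ 2.7, quantifying how much the sample mean could be expected to wander across repeated sampling.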
Levels of measurement also affect how and what statistical analysis is done on the data, or rather what arithmetic function is utilized in tandem with a specific scale of data measurement:
1: Nominal Measurement: Observations are classified into categories with no necessary relationship existing between those categories.
2: Ordinal Measurement: Data are assigned rank order: thus their relative magnitude is meaningful.
3: Interval Measurement: The differences between scores or intervals can be treated as equal units: there exists a specific numerical distance between each pair of scores.
4: Ratio Measurement: Interval properties combined with a true zero point, so that units of data can be compared across scales in terms of meaningful ratios.
Source: Burns, 2000.
A variable is defined as a measurable characteristic of a sample (Burns, 2000) and is an intelligible manifestation of the relationships between entities within a statistical research model:
‘The independent variable is manipulated and its effect is measured by the changes in the dependent variable. A moderator variable affects the relationship between the two types of variable. A control variable is held constant and an intervening variable is a hypothetical entity whose effects are inferred.’
Source: Burns, 2000.
Methods such as factor analysis and multivariate correlation techniques (Graziano, 2004, also discusses Pearson product-moment correlation and Spearman rank-order correlation, both using the positive/negative number line to differentiate between closely and distantly related variables) are used in order to reduce:
‘…a plethora of variables to a few factors; by grouping variables that are moderately or highly correlated with each other together to form a factor. It is an empirical way of reducing an unwieldy mass of variables into a manageable few.’
Source: Burns, 2000.
Relationships between data which are linear and which are measured in turn by interval data can be subjected to linear regression:
‘Regression is an interesting phenomenon which is observed when there is a less than perfect relationship between two variables involved in prediction. In such a situation, it refers to the fact that the predicted score on a variable will be closer to the mean of the sample than is the predictor score.’
Source: Burns, 2000.
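Regression toward the mean can be illustrated numerically. With standardized scores, the least-squares prediction is the correlation multiplied by the predictor score, hence always closer to the mean whenever the correlation is imperfect (the Author’s sketch; the values are invented):

```python
# Standardized (z) scores: mean 0, standard deviation 1.
r = 0.6            # imperfect correlation between predictor and outcome
z_predictor = 2.0  # a score two standard deviations above the mean

# With standardized variables, the least-squares prediction is r * z.
z_predicted = r * z_predictor  # 1.2: closer to the mean of 0 than 2.0 is
```

Only when r = 1 (a perfect relationship) would the predicted score be as extreme as the predictor score.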
Simple linear regression is concerned with finding the straight line on a scatter diagram that fits the data best:
‘The line of best fit is therefore the straight line about which the sum of squared distances of the deviations of the scatter points is least or at a minimum.’
Source: Burns, 2000.
However, this method is reliant on assumptions of best approximation and future coherency:
‘For future predictions based on previously obtained data, we are assuming the future sample is quite similar to the sample of our original data.’
Source: Burns, 2000.
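The least-squares line of best fit described above can be sketched as follows (the Author’s illustration; the paired data values are invented):

```python
# Least-squares estimates of slope and intercept for paired data:
# the straight line minimizing the sum of squared deviations.
def best_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y over the variance of x
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    # The line always passes through the point (mean_x, mean_y)
    intercept = mean_y - slope * mean_x
    return slope, intercept

slope, intercept = best_fit([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
# slope ≈ 1.94, intercept ≈ 0.15
```

Predicting from this line for future data embodies exactly the assumption Burns notes: that the future sample resembles the sample from which the line was fitted.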
Another statistically useful procedure is ANOVA or analysis of variance:
‘The purpose of ANOVA is to decide whether the differences between (two, for example) samples is simply due to chance (sampling error) or whether there are systematic treatment effects that have caused scores in one group to be different from scores in other groups.’
Source: Burns, 2000.
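A minimal sketch of the one-way ANOVA F ratio, dividing between-treatments variability by within-treatments variability (the Author’s illustration; the group scores are invented):

```python
import statistics

def one_way_anova_f(groups):
    """F ratio: mean square between treatments over
    mean square within treatments (after Burns, 2000)."""
    all_scores = [x for g in groups for x in g]
    grand_mean = statistics.mean(all_scores)
    k = len(groups)          # number of treatment groups
    n = len(all_scores)      # total number of scores
    # Between-treatments sum of squares: group means vs grand mean
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                     for g in groups)
    # Within-treatments sum of squares: scores vs their group mean
    ss_within = sum((x - statistics.mean(g)) ** 2
                    for g in groups for x in g)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

f = one_way_anova_f([[4, 5, 6], [7, 8, 9], [1, 2, 3]])  # F = 27.0
```

A large F suggests systematic treatment effects rather than mere sampling error; the p-value for F is then read against the degrees of freedom (k − 1, n − k).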
ANOVA can manipulate factors (variables) on three different levels (values of the factor) even when the factors themselves are both internal (within-treatments variability) and external (between-treatments variability). Grouped-factor and two-way ANOVA processes are also used to derive a null hypothesis stating the interaction of factors and whether it is a statistically significant figure (Burns, 2000). The point of this summary is to show that the quantitative analysis of complex phenomena is itself very complex; it is the Author’s opinion that such phenomena are better analyzed qualitatively, as the measure of tractability suggests that such an approach fits certain subjects (social and humanities studies) more naturally (utilizing existential-experiential and emergence-consciousness related internal processes and information stores; Velmans, 2000; Deutsch, 1997).
A good method of making productive use of statistical studies is meta-analysis, which is a quantitative tool for comparing or combining results across a set of similar studies, facilitating statistically guided decisions about the strength of observed effects and the reliability of results across a range of studies (Burns, 2000; Graziano, 2004). Meta-analysis is a more efficient and effective way to summarize the results of large numbers of studies by using the following general principles:
1: Comparing Studies: This comparison is made when you want to determine whether two studies produce significantly different effects.
2: Combining Studies: This technique involves combining studies to determine the average effect size of a variable across studies.
Source: Graziano, 2004.
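The ‘combining studies’ principle can be sketched as a sample-size-weighted average effect size (the Author’s simplified illustration; practical meta-analysis typically weights by inverse variance, and the study figures here are invented):

```python
# Combining studies: sample-size-weighted average effect size.
studies = [
    {"n": 40, "effect_size": 0.30},
    {"n": 120, "effect_size": 0.55},
    {"n": 60, "effect_size": 0.45},
]

total_n = sum(s["n"] for s in studies)
# Larger studies contribute proportionally more to the combined estimate
mean_effect = sum(s["n"] * s["effect_size"] for s in studies) / total_n
# mean_effect ≈ 0.477
```

Note how the combined estimate sits nearest the largest study, which is precisely where the publication-bias problem Burns raises becomes acute: if the available studies are themselves a biased sample, the weighted average inherits that bias.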
For each approach, you can evaluate studies by comparing or combining either p-values or effect sizes (Burns, 2000). A problem with this procedure arises when one considers the type and availability of texts from which to draw comparisons or make combinations. Burns (2000) notes that a biased sample will inevitably arise in meta-analysis since it (meta-analysis) involves a sample of only those results published because they produced acceptable statistically significant results; even published research may be of uneven quality. Graziano (2004) lists the threats to validity as including:
1: Threats to statistical validity: unreliable measures, violation of underlying statistical assumptions, distortion of p-value.
2: Threats to construct validity: unclear and poorly supported theoretical bases, continuing unwarranted presence of rival theories.
3: Threats to external validity: The inaccurate generalization of results of a study to other participants, conditions, times and places.
4: Threats to internal validity: Any factor that weakens confidence that it was the independent variable that accounted for the results is a threat to internal validity.
Graziano (2004) also lists some measures or controls to maximize validity of data gathered:
1: General Control Procedures: Inclusive of careful preparation of the research setting, specification of measurement instruments, and replication.
2: Subject and Experimenter: The researcher and participants are kept blind as to the hypotheses and conditions under which each participant is being tested. Methods include automation of procedures, objective measures, multiple observers and deception.
3: Participant Selection and Assignment: Randomization of both selection and assignment keeps threats to validity minimized.
4: Design: The maximized conditions of controls in experimentation gives rise to the clearest indication of relationships between variables.
Both these threats and controls are attempts to rarefy and isolate phenomena to an impossible degree: although there is some place for abstracting patterns from empirical data, the Author believes that there exists a threshold after which the advantages gained through statistical analysis are waylaid by the increasing disparity between complex, unpredictable reality and a stratified, clarified sample-based statistical model of it. As mentioned previously, a sensible balance between quantitative and qualitative approaches yields the greatest utility: just because qualitative analysis does not require heavy abstraction does not entail that no quantitative analysis should be taken into account by the researcher. Equally, simply because quantitative analysis does not definitively rely on qualitative data, there is no reason why it should not use the latter as a checking mechanism for its own methodology (Author, 2004).
2.2.2 Statistical Epilogue
Multidimensional scaling techniques are a reflexive addendum to more orthodox modes of statistical analysis, whereby the underlying structure of reality as manifest in nature is sought within the statistical data derived (at a remove) from it (Carroll, Arabie, 1998). As such, multidimensional scaling can be seen as an extra stage in data formatting, seeking closer-to-base axioms fewer in number and greater in generalizability; in this respect it is an epistemological analogue of ANOVA procedures.
‘Multi-dimensional scaling techniques seek to identify underlying factors that account for complex patterns of scores. It is presumed that such underlying factors correspond to real variables in life.’
Source: Graziano, 2004.
Path analysis is a regression procedure that assumes the observed data to be contingent on a specific set of prior unobserved variables:
‘Path analysis tests the viability of a hypothesized causal model of the relationship between observed variables.’
Source: Graziano, 2004.
Taxonomic differentials between groups refer to real-life quantity/quality amalgamations and the non-obvious empirical instantiation of such entities and their apparent hierarchical positions. Graziano (2004) suggests the dimensional variance of intelligence in general and schizophrenia in particular as an example of the taxonomic dynamic.
Alpha/Beta levels of knowledge: the statistically prescriptive tendencies of the ‘either-or’ response to questions of knowledge-generation may actually be decreasing the sum of new knowledge in a given area, which has the knock-on effect of diminishing cumulative knowledge and the potential efficacy of meta-analysis (Graziano, 2004).
What we can see from these new schools of thought within quantitative/mathematical analysis is that the intangible challenge, the effectual presence of variables resistant to current statistical representation (or rather ‘qualitative data-types’), is being taken up within statistical science. Indeed, it is the Author’s main contention throughout this paper that neither mode of analysis alone is satisfactory and that the whole human endeavour (Deutsch, 1997), with research as a subset of aims and strivings, is a referent of the relationship between subject and object. While the paradigms and terminology may wax and wane, the essential human intellectual project is one of control (naming), comprehension (ordering) and self-doubt (critiquing) (Schussler-Fiorenza, 1996; Deutsch, 1997; Descartes, 1637).
3.1 Research and I.T.
A recent author on the subject of using computer technology for writing research has observed that:
‘Compared to learning how to surf the Net, download files from online services, and search the archives of an electronic discussion group, today’s parents had an easier job researching topics than do today’s high-school students.’
Source: Harmon, 1996.
While the initial skill-base necessary to research a paper before the I.T. revolution was incorporated more as an adjunct of general day-to-day routine, these heady days of instant WWW access mean that:
‘The good news is that today’s researcher can write interesting papers on a wider variety of subjects because he or she can locate an almost unlimited amount of information on almost any conceivable subject.’
Source: Harmon, 1996.
As Harmon (1996) notes, the increased ease of information gathering offers a more free-ranging brief and facility for knowledge generation among a potentially eclectic and stimulating worldwide participant network:
‘For less than $10 a month, a whole realm of almost limitless information can come onto a researcher’s PC over the telephone line.’
While a hard-line traditionalist academic may worry that the skills learned through orthodox research will be undermined by the ease of use of I.T., Harmon (1996) points out that:
‘While traditional reference works have customarily gone through an intensive review and verification process, electronic resources (especially Internet and Web ones) may be literally the collected and posted ramblings of one individual. Learning to evaluate sources for authenticity, authority and currency is an essential step to becoming an information literate person.’
It could also be, according to the Author, that another advantage of the I.T. mass peer review system (or the selection of material based on thousands of ‘hits’) is the placing of the burden of critical thought more firmly on the shoulders of the researcher. This may entail better engagement with the chosen topic, a greater sense of responsibility to other researchers and a more active sense of involvement with intellectual activity on a personal and individual level.
Sheehan-Harris (1996) notes that as against bulky printed tomes:
‘CD-ROMs and DVD-ROMs are great tools for secondary research because they allow you to research in ways that would be impossible with printed sources. Discs found in libraries today include ones filled with poetry, magazine and newspaper articles, journal indexes, paintings from the world’s great museums and even sports statistics.’
Sheehan-Harris (1996) divides the disc-based secondary research material into two forms:
1: Electronic Reference Books: the electronic format of standard library resources such as encyclopedias, dictionaries and handbooks. These disc versions are cheaper than paper books, especially given the size of an encyclopedia series. Though such discs can initially be difficult to use, the wealth of instant information, often presented in multimedia form, is a pleasurable experience for the researcher. A problem can be the lack of universal information indexing between products (different companies’ dictionaries or encyclopedias).
2: Journal Indexes:
‘A journal index will list magazine articles on a particular subject and tell you what magazine the article is in. This is much faster than searching through fifty-two issues of Time magazine for an article you thought you read a couple of months ago. Journal indexes list citations: the author and title of the paper, and the name, page numbers and date of the journal.’
Source: Sheehan-Harris, 1996.
Increased speed of access, a greater quantity of stored information, a wider range of formats/media, decreased cost and space savings are some of the benefits of electronic as opposed to print data-basing (Vecchio, 1995):
‘Though using a CD-ROM index may initially be more challenging, ultimately the technology yields both savings in time and more comprehensive research results.’
Source: Sheehan-Harris, 1996.
Vecchio (1995) discusses how on-line services such as search-engines, e-communities, e-mail groups, on-line forums and databases, and personal websites, as well as on-line academic institutions, all provide opportunities for access to information and opinion from around the globe:
‘Increasingly research as a process will require more mental run-time and less foot-work!’
Not only can the researcher view and request different information but in nearly all cases they can also download the information in a short amount of real-time and in varying formats (King, 1995).
However, the novelty aspect of Internet-based research has led to an extreme bias in the younger researcher population:
‘Students who rely on information that they can find on the Internet and neglect to consult books or other forms of publication are becoming an increasing worry to university and college lecturers, but that is not to say that searching the Internet for relevant information is all bad. There just has to be a balance.’
Source: Coombes, 2001.
As ever in any research:
‘The key is to contemplate, analyze and be critical of the information that you find, whether it is from a printed or on-line source.’
Source: Coombes, 2001.
- I.T. and Research: A Warning.
Plagiarism:
‘If you put your name on a piece of work to say you have written it, and in actual fact parts of it were lifted from elsewhere, it is known as plagiarism which is theft. Also you will be taking your praise under false pretences and, perhaps better deserved, you will also be taking your criticism.’
Source: Coombes, 2001.
Copyright:
‘A single Web page could be subject to several different copyrights, for example there could be a copyright for the graphics, another for the sounds used and yet another for the text. Permission from each copyright holder would be required if you wanted to copy the whole of that page.’
Source: Coombes, 2001.
Any Web page or any part thereof that the researcher wishes to copy is subject to permission gained from the Webmaster. Most Webmasters are happy to give permission if:
1: The information will be used for corporate promotional work (as long as credit is given.)
2: The information is being used for research purposes.
Source: Walliman, 2001.
Often, permission is given automatically: this status of a Web page is usually described in the small print at the bottom of the home-page (Coombes, 2001.)
Fair Dealing Criterion: The Copyright, Designs and Patents Act of 1988 stipulates that some copyrighted material may be downloaded for use in teaching or research situations. However, the parameters defining these situations are inexact and are left for the law-courts to settle case by case (Coombes, 2001.)
Copyright and the HMSO: In the U.K. in 1999, it was decided that unrestricted access and copying/dissemination rights would be granted to researchers using the valuable material held under the HMSO auspices:
‘Much of the information issued by the HMSO is in the form of public records, legislation, literature, statistics and government press notices.’
Source: Coombes, 2001.
Data Protection Act 1998: Section 33 of this U.K. act covers many aspects of computer data storage and access, a summary of which is displayed below:
1: Research purposes include historical and statistical purposes.
2: Conditions in relation to the processing of personal data include:
A: the data are not used to support measures or decisions with respect to particular individuals.
B: the data are not processed in such a way as to cause either damage or distress to the data subject.
3: Personal data processed only for research in accordance with the above conditions, may be kept indefinitely.
Source:
The general aim of these warnings and measures is to balance freedom of information with freedom of the individual and freedom of thought: all these important conditions on the possibility of Western livelihood are not, according to the Author, easily and mutually related and protected. Therefore, insofar as a knowledge-based society relies on the status of its knowledge generation, the contribution of I.T. should be taken as an experiment in progress and not accepted wholesale as the only (newest) mode of research:
‘The main advantage of using I.T. as a research tool lies in its speed, the variety of information sources accessible on the WWW, and its amalgamation into one workstation of the previously disparate facilities of word processing (Microsoft Word/Works), information gathering (Internet Explorer), analysis (Minitab), formation (Works, Spreadsheet), presentation (PowerPoint) and communication (Outlook Express).’
Source: Walliman, 2001.
The main threats to research integrity from I.T. include:
1: Lack of original work (see above in relation to meta-analysis).
2: Misrepresentation of the work of others: intellectual laziness is the modern-day version of the industrial age’s lack of manual work-ethic.
3: Lack of learning experience for the researcher: the researcher must incorporate some of the newly-generated thought processes in day-to-day life for the full benefits of research to be realized (see action research).
4: Emphasis on information gathering and de-emphasis on hard thought and personal input: hard thought is humanizing in that it evolves the human condition and aims at improving the status of humanity in general.
5: Emphasis on professional appearance over knowledge-substance: poor research disguised by professional presentation software legitimates weak knowledge generation, undermines computer technology and cheats the proponents of future research (one researcher’s conclusion may be the starting point for the intellectual journey of another).
Source: Coombes, 2001; Author, 2004.
- Ethical Considerations in Research.
Having discussed some of the issues to do with research practice in I.T., it is important to set out some aspects of ethical behaviour in non praxis-specific research:
NATURE OF RESEARCH:
1. Is the researcher adequately qualified and experienced to undertake the proposed research?
2. Is the research justified? Has the subject been studied before? If it has, is this additional study necessary?
3. Is the research question worth asking? That is, is it of importance?
4. Has the study been carefully designed and will it address the research question?
5. If the subjects are staff will they be able to maintain the usual standard of productivity while participating?
POSSIBLE HARMS:
6. Has the research proposal been approved by a research ethics committee?
7. Is there any unnecessary risk, discomfort, pain or inconvenience to participants?
8. Will the privacy of the subjects be respected?
9. Will confidentiality be respected, and is the Data Protection Act relevant?
(NB Anonymity is not the same as confidentiality.)
10. Will other relevant law / statutes be respected?
CONSENT:
11. Will all those involved either directly or indirectly in the study be required to give their informed consent?
12. Will a consent form be used?
13. Do any of the subjects come from 'at risk' groups? If so, have their rights been safeguarded?
14. How will this consent be obtained? Will participants be given adequate time to ask questions and to consider the proposal carefully before consenting?
15. Will subjects be made aware that they are free to withdraw at any time?
MONEY:
16. How is the research to be funded? Is the project independent? Are the researchers free to publish?
17. Are subjects to be reimbursed for all the costs involved? (In general, payment should not be offered as inducement, but simply to cover expenses.)
PROVISOS:
1. Is the research relevant to the practical context?
2. Has the research been adequately corroborated? How reliable are the research findings e.g. consider use of a control, recognition of unexpected ramifications.
3. Research should be implemented only after adequate discussion with relevant authorities and participants.
4. The facility of an advocate should be afforded the participant where the participant’s knowledge-base may prohibit the self-determination of his/her assent.
5. Just because research shows that some procedure (etc.) is biologically/economically/administratively effective it does not follow that it must be morally right. Technology should not dictate to morality. [Example: abortion, fetal brain tissue transplants, ECT, psychosurgery]
6. The consent of the participants is paramount concerning the implementation of any but minor research findings.
7. Care must be taken that 'research' is not used as an excuse for implementing changes which have as their motive mere cost-effectiveness (economic efficiency) managerial convenience, or institutional power. In other words, research has as its only true rationale the welfare of participants and society in general.
Source: Author, 2004.
In any research and more specifically in action research and paradigmatically-linked knowledge generation, the status of the individual must be respected and taken into consideration. The real life-world effects of technological, sociological and academic research are of equal if not greater import than the potency of any potential knowledge-generation. The universal morality which underpins the effects of research (or indeed any consequential human action) is inherent to civilized society:
‘Kant believed that we are not all fully rational but we can become more rational. It is this which separates us from the animals who are dominated by desires and instincts. A Categorical Imperative is done purely out of duty, it has no regard for self interest. Any rational person would follow a Categorical Imperative.’
Source:
Inasmuch as research is a sub-function or specialization of human consciousness (Zeman, 2000), its specific practitioners are isolated somewhat from the social context of their activities (this is true even for action research); an awareness of the Categorical Imperative is therefore paramount for research integrity and for the objects/subjects thereof:
‘An action done from duty does not have its moral worth in the purpose which is to be achieved through it but in the maxim by which it is determined. Its moral value, therefore, does not depend on the reality of the object of the action but merely on the principle of volition by which the action is done.’
Source:
A more straightforward moral axiom (à la Kant or Spinoza) is:
‘Act only according to that maxim by which you can at the same time will that it should become a universal law.’
Source:
4.1 What is Research?
In the discussion of research preceding this summary, the Author suggested a movement from a linear, exact, logical (masculine) model (Kohlberg, 1981) toward an involved, reflexive, multi-dimensional (feminine) understanding (Gilligan, 1990). An encompassing survey of all the dynamics involved is probably beyond the confines of this paper, but the Author expects that some of the more important aspects of research and researcher have been covered well enough that the following conclusions seem apt:
A: The Author holds that research is context-derived.
B: The Author holds that research is context-driven.
C: The Author holds that research and researcher are mutually determining.
D: Therefore the researcher is context-driven.
E: Therefore the researcher is context-derived.
F: Therefore the real object of research is context.
G: Therefore the real object of context is the researcher.
H: Therefore the real object of research is the researcher.
Source: Author, 2004.
As Deutsch (1997), Tipler (1997) and Greene (2003) have discussed, in the further reaches of physics and the experimental sciences and philosophies, the borders and delineations between conscious presence, action and consequence, and material (inanimate) reality are blurring more every day. It seems likely that any version of M-Theory developed therefrom will have to incorporate both qualitative and quantitative research, and maybe some new categories of substance and mode will have to be utilized in order to make sense of such a worldview. Perhaps, as Deutsch (1997) has suggested and Mackey (1996) has furthered theologically, the sum of conscious understanding is equivalent to the sum of possible matter-energy (and human self-) conversion in a quantum reality?
4.1.2 Research is?
Research is not ‘just’ anything. It is a compound phenomenon and process which references, incorporates and affects nearly every aspect of both life and working worlds. Its consequences are felt internally, externally and eternally. Research is a function of and a pre-requisite for conscious existence, and is therefore an emergent property of an anthropological/anthropomorphic universe.
4.1.3 Research is Also…
1: Stressful.
2: Time-consuming.
3: Isolationary.
4: Personally and intellectually demanding.
5: Not to be undertaken ad hoc.
Source: Blaxter, 2001.
I use this conception as it refers to the paradigmatic assumptions of today’s technologist agenda and so the conception also stands as a straw dog for the more intuitive schema described in this paper.
By which the Author refers to the Kantian distinction of qualities derivable from objective reality under analysis rather than synthetic qualities related to and reflexive toward objective reality under reason.
Aristotle (346 B.C.) set out this notion of emergent goals, aims, ambitions or fulfilments as a categorical measure of human action and intellection.