
CMNS 260: Empirical Communication Research Methods 4-Measurement (Part 1 of 2 slideshows)

Neuman & Robson, Chapters 5, 6, and part of 12 (pp. 332-338)

• systematic observation
• can be replicated
• requires:
  – a construct (concept)
  – an operational measure/instrument/tool for making empirical observations

Today’s Lecture

• Review core notions: concepts, operational measures, empirical research, language of variables, hypothesis testing, errors in explanation, etc.
• Reliability & Validity: relationship, types
• Levels of Measurement
• Scales & Indices (if time; material for this is in another slideshow)

Recall: The Research Process
(Babbie 1995: 101)

“Dimensions” of Research (Neuman 2000: 37)

• Purpose of Study: Exploratory; Descriptive; Explanatory
• Intended Use of Study: Basic; Applied (Action, Impact, Evaluation)
• Treatment of Time in Study: Cross-sectional; Longitudinal (Panel, Time series, Cohort analysis); Case Study; Trend study
• Space/Unit of Analysis: individual; family; household; artifact (media, technology)

The “Research Wheel”: steps in the research process (Neuman 1995: 12)

Choose Topic → Focus Research Question → Design Study → Collect Data → Analyze Data → Interpret Data → Inform Others

Developing research topics

Concepts
• symbol (image, words, practices…)
• definition: must be shared to have social meaning
• concepts with more than one possible value or attribute are sometimes called variables

Concept Clusters
• Examples:
  – Peer Group
  – Role Model
  – Broadcast Media
  – Ethnic Identity
  – Cultural trauma
  – Collective memory
  – Political economy

Measurement
• systematic observation
• can be replicated (by someone else)
• measures include:
  – concepts (constructs), theories
  – measurement instruments/tools
• must recognize the concept in observations (measures)
• ?? (# of library holdings as a measure of the quality of a university? Maclean’s Magazine survey results, 2000)

From Concept to Measure

Neuman (2000: 162)

Variables
• must have more than one possible “value” or “attribute”
• Types:
  – dependent variable (effect)
  – independent variable (cause)
  – intervening variable
  – control variable

Causal Relationships

• proposed for testing (NOT like assumptions)
• 5 characteristics of a causal hypothesis:
  – at least 2 variables
  – cause-effect relationship (cause must come before effect)
  – can be expressed as a prediction
  – logically linked to a research question and a theory
  – falsifiable

Errors in Explanation

Propositions

• logical statement about a (causal) relationship between two variables
• e.g., “Increased television watching leads to more shared family time and better communication between children & their parents”

Types of Hypotheses (note: “hypotheses” is the plural of “hypothesis”)

• Null hypothesis
  – predicts there is no relationship
• Direct relationship (positive correlation)
  – e.g., more time spent studying leads to higher grades
• Indirect relationship (negative correlation)
  – e.g., more time spent playing video games leads to lower grades
(both directions are illustrated in the sketch below)
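Not on the original slides: a minimal Python sketch of what a direct and an indirect relationship look like as correlations. The study-time, gaming-time, and grade numbers are simulated purely for illustration.

```python
# A minimal sketch (hypothetical data) of direct vs. indirect relationships.
import numpy as np

rng = np.random.default_rng(260)
n = 100

# Direct relationship: grades rise with study time (plus random noise).
hours_studying = rng.uniform(0, 20, n)
grades_study = 50 + 2.0 * hours_studying + rng.normal(0, 5, n)

# Indirect (negative) relationship: grades fall as gaming time rises.
hours_gaming = rng.uniform(0, 20, n)
grades_game = 90 - 1.5 * hours_gaming + rng.normal(0, 5, n)

print("studying vs. grades: r =", round(np.corrcoef(hours_studying, grades_study)[0, 1], 2))  # positive
print("gaming vs. grades:   r =", round(np.corrcoef(hours_gaming, grades_game)[0, 1], 2))     # negative
```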

Hypothesis Testing

Possible outcomes when testing hypotheses (using empirical research):
• support (confirm) the hypothesis
• reject (not support) the hypothesis
• partially confirm, or fail to support
• avoid use of PROVE (see the reporting sketch below)
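A minimal sketch (not from the slides) of how these outcomes are worded when a hypothesis is tested statistically; the data and the conventional 0.05 cutoff are assumptions for illustration. Note that even a significant result only supports the hypothesis.

```python
# Hypothetical test of "more study time -> higher grades" against the null.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
hours = rng.uniform(0, 20, 80)                    # simulated study time
grades = 50 + 2.0 * hours + rng.normal(0, 5, 80)  # simulated grades

r, p = pearsonr(hours, grades)
if p < 0.05:
    # The data SUPPORT the hypothesis; they do not PROVE it.
    print(f"r = {r:.2f}, p = {p:.3g}: reject the null; hypothesis supported")
else:
    print(f"r = {r:.2f}, p = {p:.3g}: fail to reject the null; hypothesis not supported")
```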

Causal diagrams

X →(+) Y   direct relationship (positive correlation)
X →(-) Y   indirect relationship (negative correlation)

Spurious Association example

Causal diagram:
R = Racism against non-whites
D = Discrimination against non-whites
S = Intelligence test scores
(The diagram shows a spurious association: two of the variables appear related only because both are affected by the third.)
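To make spuriousness concrete, here is a hypothetical simulation (not from the slides): a common cause Z drives both X and Y, so X and Y correlate strongly, yet the association disappears once Z is controlled for.

```python
# Simulated spurious association: Z causes both X and Y; X and Y do not
# cause each other, but they still correlate until Z is controlled for.
import numpy as np

rng = np.random.default_rng(42)
n = 5000
z = rng.normal(size=n)             # common cause
x = 0.8 * z + rng.normal(size=n)   # caused by z only
y = 0.8 * z + rng.normal(size=n)   # caused by z only

print("raw r(x, y) =", round(np.corrcoef(x, y)[0, 1], 2))  # clearly nonzero

def residuals(v, given):
    """Remove the linear effect of `given` from v (i.e., control for it)."""
    slope, intercept = np.polyfit(given, v, 1)
    return v - (slope * given + intercept)

rx, ry = residuals(x, z), residuals(y, z)
print("r(x, y | z) =", round(np.corrcoef(rx, ry)[0, 1], 2))  # about 0: spurious
```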

Good & Bad Research Questions

Abstract to Concrete
Concept to Measure

Reliability & Validity

Reliability
• dependability: is the indicator consistent? does it give the same result every time?

Validity
• measurement validity: how well the conceptual and operational definitions mesh with each other
• does the measurement tool measure what we think it measures?

Types of Validity

Content Validity
• the measure represents all aspects of the conceptual definition of the construct
• how adequately a measure covers behavior representative of the universe of behavior the test was designed to sample
• Example: “Love” (a content-valid measure would have to cover every dimension of the concept)

Face & Expert Panel Validity
• judgement by a group or by the scientific community that the indicator measures the construct (conceptual definition)
• Examples:
  – Socio-economic status (education, income & ?)
  – Digital Divide (differences in access to computers, internet, broadband? …)
(Diagram: Construct ↔ Measure, as judged by the scientific community)

Criterion Validity
• the validity of an indicator is verified by comparing it with another measure of the same construct in which a researcher has confidence
• predictive validity: e.g., comparing an aptitude test with later performance measures
• concurrent validity: e.g., comparing a new measure with an established one

Construct Validity
• a type of measurement validity that uses multiple indicators
  – the construct is a combination of measures of the same variable
• convergent: positive correlation with related measures
• discriminant: negative correlation with measures of different (opposing) variables
(both checks are illustrated in the sketch below)
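A minimal sketch (not from the slides) of the two checks: a simulated “new measure” should correlate positively with a related measure (convergent) and negatively with a measure of an opposing construct (discriminant). All scores are synthetic.

```python
# Hypothetical convergent/discriminant validity check via correlations.
import numpy as np

rng = np.random.default_rng(1)
n = 200
trait = rng.normal(size=n)                          # the underlying construct

new_measure = trait + rng.normal(0, 0.5, n)         # our instrument
related_measure = trait + rng.normal(0, 0.5, n)     # established, related measure
opposing_measure = -trait + rng.normal(0, 0.5, n)   # measure of an opposing construct

def r(a, b):
    return round(np.corrcoef(a, b)[0, 1], 2)

print("convergent r   =", r(new_measure, related_measure))   # strongly positive
print("discriminant r =", r(new_measure, opposing_measure))  # negative
```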

Other Dimensions of Validity

• Internal validity: no errors of logic internal to the research design
• External validity: results can be generalized
• Statistical validity: was the correct statistical methodology chosen? were its assumptions fully met?

Types of Reliability

• stability reliability: consistent over time
• representative reliability: consistent across different subgroups of a population
• equivalence reliability: consistent across multiple indicators
• intercoder reliability: a type of equivalence reliability (sketch below)
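Intercoder reliability is commonly reported as percent agreement plus a chance-corrected statistic such as Cohen’s kappa. A minimal sketch with two hypothetical coders and invented content categories:

```python
# Percent agreement and Cohen's kappa for two coders, same 10 items.
from collections import Counter

coder_a = ["news", "ad", "news", "drama", "ad", "news", "drama", "news", "ad", "news"]
coder_b = ["news", "ad", "news", "news",  "ad", "news", "drama", "news", "ad", "drama"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n  # raw agreement

# Agreement expected by chance, from each coder's category proportions.
pa, pb = Counter(coder_a), Counter(coder_b)
expected = sum((pa[c] / n) * (pb[c] / n) for c in pa.keys() | pb.keys())

kappa = (observed - expected) / (1 - expected)
print(f"agreement = {observed:.2f}, Cohen's kappa = {kappa:.2f}")
```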

Improving Reliability

• clearly conceptualize constructs
• increase the level of measurement
• use pretests, pilot studies
• use multiple indicators (see the alpha sketch below)

(Diagram: construct A, the independent variable, is measured by specific indicators a1, a2, a3; construct B, the dependent variable, by specific indicators b1, b2; the empirical association is then tested between the two measures.)
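When one construct is tapped by several indicators (like a1-a3 in the diagram), their internal consistency is often summarized with Cronbach’s alpha. A minimal sketch with made-up item scores:

```python
# Cronbach's alpha for three indicators of one construct (invented scores).
import numpy as np

# Rows = respondents, columns = indicators a1, a2, a3.
items = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [3, 4, 3],
    [1, 2, 1],
    [4, 4, 5],
])

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)       # variance of each indicator
total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed score

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")    # closer to 1 = more consistent
```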

Relationship between Measurement Reliability & Validity

• reliability is necessary for validity but does not guarantee it: “necessary but not sufficient”
• a measure can be reliable but invalid (e.g., consistently not measuring what you think you are measuring)

Quantitative & Qualitative “Trustworthiness”

Creating Measures

Measures must have response categories that are:
• mutually exclusive: each possible observation must fit in only one category
• exhaustive: the categories must cover all possibilities
Composite measures must also be:
• uni-dimensional
(a checking sketch follows)
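A minimal sketch (hypothetical age brackets) of checking the first two properties mechanically: every observation must land in exactly one response category.

```python
# Check that range-style response categories are exhaustive and
# mutually exclusive: each value must match exactly one category.
age_categories = {            # label -> (low, high), inclusive
    "18-24": (18, 24),
    "25-34": (25, 34),
    "35-49": (35, 49),
    "50+":   (50, 200),
}

def classify(age):
    matches = [label for label, (lo, hi) in age_categories.items() if lo <= age <= hi]
    if not matches:
        raise ValueError(f"not exhaustive: no category for age {age}")
    if len(matches) > 1:
        raise ValueError(f"not mutually exclusive: age {age} fits {matches}")
    return matches[0]

print(classify(30))    # '25-34'
try:
    classify(17)       # under-18s were left out of the scheme
except ValueError as err:
    print(err)         # not exhaustive: no category for age 17
```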

Levels of Measurement
• categories (or attributes) must be exhaustive & mutually exclusive
• relations between levels: you can collapse a higher level into a lower one, but not vice versa

Nominal Measurement
• different categories (names, labels, images), not ranked
• attributes are mutually exclusive and exhaustive
Example (Babbie 1995: 137): “What media do you use for finding out about news?”
Television / Newspapers / Radio / Magazines / Internet / Other

Ordinal Measurement
• different categories (mutually exclusive, exhaustive), rank-ordered
• attributes indicate relatively more or less of that variable
• distance between the attributes of a variable is imprecise
Example: “How important are newspapers as your news source?”

Interval Measurement
• different categories, ranked in order
• can also tell the amount of difference between categories
Example: temperature in degrees Celsius (equal intervals, but no true zero)
(Babbie 1995: 137)

Ratio Measurement
• different categories, ranked in order
• amount of difference between categories known
• can also state proportions (there is a true zero)
Example: “What was your income in dollars last year?”


Continuous & Discrete Variables
• Continuous variables:
  – can have an infinite number of values
  – interval and ratio levels of measurement
• Discrete variables:
  – distinct categories
  – nominal and ordinal levels of measurement
(a coding sketch follows)
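A minimal sketch (not from the slides) of how the levels map onto data coding, using pandas Categoricals for the discrete levels; all column names and values are invented.

```python
# Nominal and ordinal columns as pandas Categoricals; interval and ratio
# columns as plain numbers. Values are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "news_medium": ["TV", "Internet", "Radio", "Internet"],  # nominal
    "importance":  ["low", "high", "medium", "high"],        # ordinal
    "temp_c":      [18.5, 21.0, 19.2, 22.8],                 # interval
    "income":      [42000, 65000, 38000, 71000],             # ratio
})

# Nominal: unordered labels; counting is the meaningful operation.
df["news_medium"] = pd.Categorical(df["news_medium"])

# Ordinal: ranked labels; comparisons are meaningful, distances are not.
df["importance"] = pd.Categorical(
    df["importance"], categories=["low", "medium", "high"], ordered=True)

print(df["news_medium"].value_counts())         # fine at the nominal level
print((df["importance"] > "low").sum())         # ranking is meaningful
print(df["income"].max() / df["income"].min())  # ratios require a true zero
```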

Composite Measures (continued in second slide series)

• Composite measures are instruments that use several questions to measure a given variable (construct).

• A composite measure can be either unidimensional or multidimensional.

• Ex. indices (note: “indices” is the plural of “index”) and scales (a minimal index sketch follows)
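A minimal sketch (not from the slides) of the simplest composite measure: an additive index built from several Likert-type items; the construct and the scores are invented.

```python
# Additive index from three 1-5 items assumed to tap one construct.
import numpy as np

responses = np.array([    # rows = respondents, columns = items
    [4, 5, 4],
    [2, 1, 2],
    [3, 3, 4],
    [5, 4, 5],
])

index = responses.sum(axis=1)          # unidimensional index, range 3-15
print("index scores:", index)

# Rescale to 0-1 so indices of different lengths are comparable.
k, lo, hi = responses.shape[1], 1, 5
print("normalized:", (index - k * lo) / (k * (hi - lo)))
```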
