
Inquiry & Research

  • The inquiry or research activities of social scientists that we will be talking about in this course are really special cases of the more general search processes involved in human problem-solving — looking for the optimal solutions to challenges presented by the ecology in order to be competitively effective, to be "fit," and to survive.

  • They all use similar — stochastically based — problem-solving strategies or “algorithms”

  • Research methods, data analysis and statistics are central to what we do as a species and as cultures, as well as individuals. They reflect very significantly how we "think." And, as we will see, the process most often involves the "fuzzy logic" required by the real, empirical world rather than the rigid, abstract logic of philosophy.
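The "stochastic problem-solving algorithm" idea above can be made concrete with a few lines of code. This is a minimal, hypothetical sketch (the fitness function, step size, and iteration count are all invented for illustration): a candidate solution is repeatedly perturbed at random, and only improvements survive.

```python
import random

def stochastic_search(fitness, start, step=0.5, iterations=1000, seed=42):
    """Minimal stochastic hill climb: propose a random perturbation of the
    current solution and keep it only if it is 'fitter' (survival of the
    fittest used as a search algorithm)."""
    rng = random.Random(seed)
    best, best_fit = start, fitness(start)
    for _ in range(iterations):
        candidate = best + rng.uniform(-step, step)
        if fitness(candidate) > best_fit:
            best, best_fit = candidate, fitness(candidate)
    return best

# Hypothetical one-peak fitness landscape with its optimum at x = 3.
peak = stochastic_search(lambda x: -(x - 3) ** 2, start=0.0)
```

Despite never computing a derivative or making a plan, the search reliably ends up near the optimum.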

The Psychology of Science

Research is part of the perceptual process of scientists or those who engage in inquiry as part of science.

It is done by people, who typically are members of organizations and who receive support from funding and other agencies — to understand it we must understand those people, organizations and agencies.

From examining research we will learn as much about the researcher as the researched.

The perceptions resulting from research (e.g., data, theories) produce findings not “facts.” Those findings are only valuable or useful for you to the degree you decide that the methods by which they were acquired are valid for your needs.

The Philosophy of Science

  • Science is a system for the acquisition of knowledge about the empirical (sensory) world

  • (There are other systems for acquiring knowledge — art, humanities, religion)

  • The system includes

    People, institutions, culture, technology

    The knowledge — findings or “data,” theories, models

    Paradigms — the framework of beliefs within which science is conducted

    Paradigms consist of

    Ontology — our beliefs about the characteristics of the empirical world

    Epistemology — our beliefs about how we can reasonably inquire about that world

    Methodology — the methods we use to conduct that inquiry

    Purpose & Goals — why we do it & what we hope to achieve

    Basic approaches to paradigmatic change and knowledge development

    The evolutionary model (knowledge in science is accumulative; Karl Popper).

    The revolutionary model (knowledge in science is replaced in the process of transition; Thomas Kuhn).

    Less extreme & complex combinations of the two (Larry Laudan).

The Purposes of research

  • The purpose of inquiry or research is to find something out about the empirical world that we didn’t know before and, often, to do something useful with what we find —

    • description

    • explanation

    • prediction

    • control

  • Research can be theory/model/hypothesis generating (exploratory research — ”inductive,” “grounded theory”), testing (hypothesis-testing research — ”deductive”) or interwoven combinations of the two.

  • The purpose of research is not to demonstrate something that we — think — we already know

  • There are always a variety of ego-oriented motives associated with research activity

Your Problem …

  • (1) Identify one question you have about the University of Hawaii — the place, the people, the programs, the resources — that you don’t know, but need to know or would like to know. Really, any question will work.

  • (2) Take 30 minutes to venture out into our UH environment to inquire about that question.

  • (3) Don’t forget to come back!

Let’s Make a Deal a

  • The “Monty Hall Problem” from “Let’s Make a Deal” (September, 1990)–

    • Suppose the contestants on a game show are given the choice of three doors: Behind one door is a car; behind the others, goats. After a contestant picks a door, the host – who knows what’s behind all the doors – opens one of the unchosen doors, to always reveal a goat. He then says to the contestant, “Do you want to switch to the other unopened door?”

  • Is it to the contestant’s advantage to make the switch?

  • In Parade Magazine’s “Ask Marilyn” column, Marilyn vos Savant – famous for being listed by Guinness World Records as having the highest recorded IQ (228) – said “yes.”

  • Was she correct?

And beware “The Drunkard’s Walk”

  • Nobel prize winning psychologist Daniel Kahneman & colleagues (1982) & Leonard Mlodinow (“The Drunkard’s Walk,” 2008) illustrate that – while identifying the strategies associated with the best outcomes is central to swarming & optimization – people are actually not terribly good at doing so. We typically have a significant misunderstanding of the role of stochasticity or probability in life & task outcomes.

  • Understanding the basic laws of probability (predictions based on fixed probabilities) is key to understanding the stochasticity of evolutionary optimization processes.

  • Understanding the basic principles of statistics (the inference of those probabilities from empirical observations) is key to understanding the challenges of evaluating the outcomes of ourselves & neighbors.

  • An optimal strategy is one that works better than others over the long run – not on every occasion!

  • There is no reason to believe that success in completing tasks in the complex, turbulent global world of multinational enterprises is any less challenging than predicting the stock market, getting a hit in baseball or writing a best seller. Optimal strategies are ones that might, in fact, succeed significantly less than half the time. Try selling that to your client! But we do succeed better than our competitors … we do survive …
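The claim that an optimal strategy can succeed less than half the time while still beating its competitors is easy to demonstrate by simulation. The success probabilities below are purely hypothetical:

```python
import random

rng = random.Random(0)
TRIALS = 10_000

# Hypothetical success probabilities: even the "optimal" strategy
# succeeds well under half the time; it merely beats the alternative.
p_optimal, p_rival = 0.40, 0.30

optimal_rate = sum(rng.random() < p_optimal for _ in range(TRIALS)) / TRIALS
rival_rate = sum(rng.random() < p_rival for _ in range(TRIALS)) / TRIALS
```

Over many trials the "optimal" strategy fails most of the time yet consistently outperforms its rival — exactly the long-run sense of "optimal" the slide describes.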

YouTube: presentation for Google (42 min)

The Drunkard’s Walk c

  • The term “drunkard’s walk” describes the fundamentally chaotic movement of molecules in a liquid. The molecules randomly go this way or that in a straight line until deflected by encounters with other molecules, which send them off in other directions – thus the molecules “stagger” around in their environment. These movements essentially cancel each other out over time.

  • However, simply by chance there can be a preponderance of hits from some particular direction that produces a noticeable “wiggle” in that direction – the “Brownian motion” we observe in small particles.

  • Albert Einstein first recognized that much of the order we perceive in nature belies an invisible underlying chaos and can only be understood through the rules of randomness.

  • Mlodinow stresses that though patterns emerge from the randomness, each pattern is not necessarily meaningful. “And as important as it is to recognize the meaning when it is there, it is equally important not to extract meaning when it is not there. Avoiding the illusion of meaning in random patterns is a difficult task.” (168)
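A one-dimensional drunkard's walk is simple to simulate. In the sketch below (step count and seed are arbitrary), every step is a fair coin flip, yet the walk typically drifts far from its starting point, producing exactly the kind of apparent "pattern" Mlodinow warns against over-interpreting.

```python
import random

def drunkards_walk(steps, seed):
    """Symmetric 1-D random walk: each step is +1 or -1 with equal chance."""
    rng = random.Random(seed)
    position, path = 0, []
    for _ in range(steps):
        position += rng.choice((-1, 1))
        path.append(position)
    return path

path = drunkards_walk(10_000, seed=1)      # step count and seed are arbitrary
max_excursion = max(abs(p) for p in path)  # how far the 'drunkard' wanders
```

Even though the expected position is always zero, the walk wanders dozens of units away purely by chance.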

Let’s Make a Deal b

  • 92% of Americans (including over 1,000 Ph.D.s – many in math) who wrote in to the program “vehemently” said –

  • she was wrong.

    • “Two doors are available – open one and you win; open the other and you lose– so it seems self-evident that whether you change your choice or not, your chances of winning are 50/50. What could be simpler?” (Mlodinow, 2008, 44)

  • She was right!

Let’s Make a Deal c

  • In the Monty Hall problem you are facing three doors: behind one door is something valuable, say a shiny red Maserati; behind the other two, an item of far less interest, [say a goat]. You have chosen door 1. The sample space in this case is this list of three possible outcomes —

    • Maserati is behind door 1.

    • Maserati is behind door 2.

    • Maserati is behind door 3.

  • Each of these has a probability of 1 in 3. Since the assumption is that most people would prefer the Maserati, the first case is the winning case, and your chances of having guessed right are 1 in 3.

  • The next thing that happens is that the host, who knows what's behind all the doors, opens one you did not choose, revealing one of the [goats]. In opening this door, the host has used what he knows to avoid revealing the Maserati, so this is not a completely random process.

  • There are two cases to consider —

Let’s Make a Deal d

  • One is the case in which your initial choice was correct. Let's call that the “Lucky Guess” scenario. The host will now randomly open door 2 or door 3, and, if you choose to switch, you will lose. In the Lucky Guess scenario you are better off not switching, but the probability of landing in the Lucky Guess scenario is only 1 in 3.

  • The other case is that in which your initial choice was wrong. We'll call that the “Wrong Guess” scenario. The chances you guessed wrong are 2 out of 3, so the Wrong Guess scenario is twice as likely to occur as the Lucky Guess scenario. How does the Wrong Guess scenario differ from the Lucky Guess scenario? In the Wrong Guess scenario the Maserati is behind one of the doors you did not choose, and a [goat] is behind the other unchosen door. Unlike the Lucky Guess scenario, in this scenario the host does not randomly open an unchosen door. Since he does not want to reveal the Maserati, he chooses to open precisely the door that does not have the Maserati behind it. In other words, the host intervenes in what until now has been a random process. The host uses his knowledge to bias the result, violating randomness by guaranteeing that if you switch your choice, you will get the car. So if you find yourself in the Wrong Guess scenario, you will win if you switch.

Let’s Make a Deal e

  • So, if you are in the Lucky Guess scenario (probability 1 in 3), you'll win if you stick with your choice. If you are in the Wrong Guess scenario (probability 2 in 3), you will win if you switch your choice. And so your decision comes down to a guess: in which scenario do you find yourself? The odds are 2 to 1 that you are in the Wrong Guess scenario, and so it is better to switch.

  • The Monty Hall problem is hard to grasp because, unless you think about it carefully, the role of the host goes unappreciated. But the host is fixing the game! And empirical data confirms the logic — those who switched won twice as often as those who didn’t. (adapted from Mlodinow, 2008, 54-55)

  • [See also]
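The empirical claim above (that switchers win about twice as often as stayers) can be checked with a short Monte Carlo simulation. The trial count and seed below are arbitrary:

```python
import random

def monty_hall(switch, trials=100_000, seed=2024):
    """Play the Monty Hall game `trials` times and return the win rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)     # door hiding the Maserati
        choice = rng.randrange(3)  # contestant's initial pick
        # Host opens a door that is neither the pick nor the car
        # (which of two goat doors he opens never affects the win rate).
        opened = next(d for d in range(3) if d != choice and d != car)
        if switch:
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == car)
    return wins / trials

stay_rate = monty_hall(switch=False)
switch_rate = monty_hall(switch=True)
```

The staying strategy wins near 1 in 3 and the switching strategy near 2 in 3, matching the Lucky Guess / Wrong Guess analysis.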

Research Concepts

  • Research methods are the processes used to obtain data to facilitate making decisions with respect to theories, programs, or policies. They can focus on the characteristics of phenomena (descriptive research) or, more commonly, the relationship between phenomena — particularly causal relationships, because of the implications for intervention (explanatory/predictive research).

  • Data

  • Information about phenomena assessed (described/measured) along variables operationalized within a research design that assists us in making decisions. We can acquire data about —

    • Individuals/dyads or relationships/groups/networks/systems

    • Situations, contexts or ecologies

    • Characteristics/structures/outcomes/processes

Research Concepts (continued)

  • Causality

  • Traditional western culture’s criteria for establishing causality

    • category

    • precedence

    • proximity

    • covariation

    • Unidirectional with simple “billiard ball” causal links

  • Quantitative research → covariation

  • Qualitative research → proximity & precedence

  • Other and more recent views of causality

    • No causality

    • Multiple, mutual & reciprocal causality

    • Complex causality & Neural Networks




The living neuron

The human brain consists of neural cells that process information. Each cell works like a simple processor, and only the massive interaction among all cells and their parallel processing makes the brain's abilities possible. A neuron consists of a core, dendrites for incoming information, and an axon that carries outgoing information to connected neurons. Information is transported between neurons in the form of electrical stimulation. Incoming information that reaches the dendrites is summed and, if it exceeds a certain threshold, is delivered along the axon and passed on to other neurons; in this case the neuron is activated. If the incoming stimulation is too low, the information is not transported any further and the neuron is inhibited. The connections between the neurons are adaptive: the connection structure changes dynamically. The learning ability of the human brain is based on this adaptation.

Gurney (1996)
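The sum-and-threshold behavior described above is the basis of the classic McCulloch–Pitts threshold unit. A minimal sketch (the weights and threshold here are chosen purely for illustration):

```python
def neuron(inputs, weights, threshold):
    """Threshold unit in the spirit of the slide: incoming stimulation is
    summed, and the neuron fires (returns 1) only if the sum exceeds the
    threshold; otherwise it stays inhibited (returns 0)."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# With weights (1, 1) and threshold 1.5 the unit behaves as a logical AND:
# it fires only when both inputs are active.
and_gate = [neuron((a, b), (1, 1), 1.5) for a in (0, 1) for b in (0, 1)]
```

"Learning" in a neural network amounts to adjusting the weights, mirroring the adaptive connections the paragraph describes.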

Neural networks and complex causality

Recurrent network

Parallel constraint satisfaction network model of suicide

Back-propagation network

Research Concepts (continued)

  • In research the phenomena of interest are called variables because, to be studied, phenomena must manifest at least two values or levels.

  • In descriptive research these variables are simply called research variables. In explanatory/predictive/intervention research they are differentiated into —

    • Experimental or treatment variables → potential causes

      • Independent variables — vary orthogonally to each other.

      • Predictor variables — not necessarily orthogonal.

      • Intervening variables — moderate the effect of an independent variable

    • Outcome variables → effects of interest in the research

      • Dependent variables — in causal relationships.

      • Criterion variables — in not necessarily causal relationships.

    • Extraneous or confounding variables → other potential causes not controlled or manipulated and perhaps not assessed.

Research Concepts (continued)

  • Three key research tools

  • Assessment (measures the value/category of a variable)

  • Manipulation/intervention/treatment (alters the value/category of a variable)

  • Control (holds the value/category of a variable constant)

  • Typically we use —

  • Assessment, manipulation, or control of experimental variables.

  • Assessment of outcome variables.

  • Control and sometimes assessment of extraneous variables through random assignment or sometimes statistical techniques.

Research Concepts (continued)

  • Theories, Models, Programs, Policies, Hypotheses & Expectations

  • Theories or models are systems of explanation that exist within some paradigm and from which predictions, hypotheses or expectations can be derived concerning relationships that can be tested empirically → Basic or theoretical research

  • Programs or policies also are based on explicit or implicit models and have predictions associated with their effectiveness that can be tested empirically → Applied research, evaluation research, social impact assessment, etc.

  • To the degree that empirical data are consistent with the hypotheses, support is provided for the theories/programs/policies from which they are derived

  • Types of hypotheses

    • Null hypothesis (H0) → differences/relationships observed in samples are due to random variation or sampling error (remember “The Drunkard’s Walk”).

    • Alternative or research hypotheses (H1, H2, H3, etc.) → differences/relationships observed in samples are due to such differences/relationships in populations and may reflect theorized/program-specified effects.

  • Decisions about hypotheses

    • Accept or reject the null hypothesis (Decision Theory)

    • Accept/reject the null and alternative hypothesis (Weaker Decision Theory)

    • Adjust degree of acceptance of alternative hypothesis based on probability level

  • Types of error in making these decisions

    • Reject the null hypothesis when it is "true" (Type I error or alpha error → significance level)

    • Accept the null hypothesis when it is "false" (Type II error or beta error)
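The meaning of a Type I error can be made concrete by simulation: if we repeatedly draw samples from a population where the null hypothesis is true and test at alpha = .05, we should wrongly reject the null about 5% of the time. This sketch assumes a two-tailed z-test with known population standard deviation (all numbers are illustrative):

```python
import math
import random

rng = random.Random(7)
N, REPS, Z_CRIT = 30, 4000, 1.96  # sample size, replications, two-tailed .05 cutoff

def z_test_rejects():
    """Draw one sample from a population where H0 is true (mean 0, sd 1)
    and report whether a two-tailed z-test would (wrongly) reject H0."""
    sample = [rng.gauss(0, 1) for _ in range(N)]
    z = (sum(sample) / N) * math.sqrt(N)  # z = mean / (sigma / sqrt(N)), sigma = 1
    return abs(z) > Z_CRIT

type_i_rate = sum(z_test_rejects() for _ in range(REPS)) / REPS
```

Across the replications the rejection rate hovers near .05, the nominal alpha level: the significance level literally is the long-run Type I error rate.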

Research Concepts (continued)

  • Validity & Reliability

  • Validity

    • Construct validity → the validity of the operationalization of the theoretical or programmatic constructs being studied

    • Internal validity → the confidence with which effects observed can be attributed to the variables studied (sometimes “invalidity,” e.g., Baxter & Babbie)

    • External validity → the generalizability of the findings to other populations, settings, and procedures

    • Cross-cultural & ecological validity

  • Reliability

    • Inter-judge or -observer or -interviewer or -trainer reliability

    • Temporal reliability

    • Split-half reliability
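Split-half reliability can be computed by correlating the two half-test totals and applying the Spearman-Brown correction (the standard adjustment for full test length). The respondent data below are simulated purely for illustration:

```python
import random

def pearson(xs, ys):
    """Pearson correlation, written out so the sketch is self-contained."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def split_half_reliability(item_scores):
    """Correlate odd-item totals with even-item totals, then apply the
    Spearman-Brown correction for full test length."""
    odd = [sum(items[0::2]) for items in item_scores]
    even = [sum(items[1::2]) for items in item_scores]
    r = pearson(odd, even)
    return 2 * r / (1 + r)

# Simulated respondents (purely illustrative): each item score is a
# 'true' trait value plus noise, so the two halves should agree well.
rng = random.Random(3)
data = []
for _ in range(200):
    trait = rng.gauss(0, 1)
    data.append([trait + rng.gauss(0, 0.8) for _ in range(10)])

reliability = split_half_reliability(data)
```

Because every item reflects the same underlying trait, the two halves correlate strongly and the corrected reliability comes out high.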

Research Design

  • A research design represents the structure of variables assessed, manipulated, or controlled and is used to establish relationships between those variables. Designs range from what are called experimental designs to non-experimental designs, based on the degree to which the amount of manipulation and control allows for reasonable causal inferences to be made.

    • Experimental designs (emphasize covariation; quantitative; simplify context in terms of reducing the number of variables examined)

    • Quasi-experimental designs (do not have random assignment, often with less manipulation and control)

    • Non-experimental designs (emphasize context to establish covariation relationships, often qualitative)

    • Computer simulations involving iterative (many) tests of models, programs, etc.

Common Research Designs

[Slide tables comparing common research designs are not preserved in this transcript]

bABbiEs bAr

Inquiry and dialogue

  • Scientific knowledge is an emergent property of both inquiry and dialogue

  • among cultures and teams →

  • Perception & Communication

Research Concept Papers, Proposals & Reports a

  • Overview

    • Many good ideas – challenge is to develop them into good proposals

    • Presented to those responsible for approval, assistance or funding

    • Many different styles/formats depending on target

  • Concept papers

    • Brief 1 – 3 page description of major issues, objectives, methods, time-line, $, etc.

    • Time-saver for both researcher and target audiences

    • Often developed through several iterations

  • Proposals

  • (Required sections underlined)

  • Title

    • Short and to the point

    • Identifies both general topic and unique focus

  • Author(s)

    • Whose idea?

    • Who does significant work on the project? Footnotes?

    • Whose name will most contribute to approval, $ and publication?

  • Abstract

    • Short (e.g., 250 – 500 word) summary of key points

    • Often all that is read!

Research Concept Papers, Proposals & Reports b

  • Introduction (often untitled)

    • What do you want to do and why–the significance of the project?

    • What's been done before in terms of theory, research, or programs? The literature review.

  • Objectives

    • Planned accomplishments–not processes or methods

    • Be concrete, be realistic

    • Research questions, hypotheses, guidelines for intervention, etc.

  • Method

    • How are the objectives going to be met?

    • Research Design

    • Sample/subjects/participants

    • Assessment Instruments, equipment, technologies, etc.

    • Procedure

    • Data analysis plan–analyses tied to hypotheses or research questions

  • Capabilities of researcher(s)

    • Past research in the area

    • Past grants or contracts completed by organization

    • Facilities and resources

Research Concept Papers, Proposals & Reports c

  • Human subjects concerns

    • Participants must be voluntary

    • They have a right to privacy

    • They must be protected from harm–risk/benefit ratio

    • Institutional Review Boards (IRB). The UH IRB uses an “expedited procedure.” The Collaborative Institutional Training Initiative (CITI) human subjects protection and research ethics education program is available online and can be completed in 2 to 4 hours. This program was initially developed in 2000 through a collaborative process involving numerous universities.

  • Management plan

    • Organization of project in terms of people, tasks, resources and time-line

  • Budget–there’s no free lunch!

    • Direct costs–personnel, fringe, consultants, equipment, facilities, supplies/phone/photocopying/postage, travel, subjects, contractual

    • Indirect costs–“overhead,” some % of Total Direct Costs (TDC)

  • References

    • Everything in text should be in references & nothing in references not in text

    • American Psychological Association (APA) style

  • Appendices

  • Presentation

    • Attractive but not overdone, accurate & on time

    • This is an example of your capabilities

Research Concept Papers, Proposals & Reports d

  • Reports on Completed Projects

  • In addition to key sections of proposal –

  • Results of the research in terms of quantitative and/or qualitative data analysis related to the research questions or hypotheses.

  • Discussion of those results in terms of objectives, hypotheses or research questions and implications for theories, programs or policies and future research. For theses it typically includes identification of the “limitations” of the study

  • Presentations–short so need to focus on key points

  • Published reports–generally reviewed and edited

  • Final reports–includes expenditures

  • again

  • Perception (or Inquiry) & Communication

Professionalism

  • Ethics (accuracy in reporting methods & findings)

  • Citation (for access & fairness)

  • Plagiarism (of others & self)!

  • Authorship (inclusion & order)

  • Cultural relativity

  • Communicating to one's colleagues (Books, refereed journals, reports, oral presentations; online publication)

  • Communicating to one's community (Dealing with the social impact of research)

Data Collection Methodologies

  • Data collection methodologies are the processes we use to obtain data about variables within a research design. They involve the process of operationalizing the conceptual definitions of variables identified in those designs. These operationalizations are sometimes called operational definitions.

    • Quantitative data collection methods are designed to acquire information about frequencies, amounts or intensities of variables. They are usually relatively context-independent/general/nomothetic.

    • Qualitative data collection methods are designed to acquire information about processes, contexts, and meanings of variables and their relationships. They are usually relatively context-related/particular/idiographic.

  • The same general methodology (e.g., survey or case study) can often be used to acquire either quantitative or qualitative data. The two are not mutually exclusive but important complements in any multi-method research program. In fact, most good quantitative research has always incorporated the collection of qualitative data during “debriefing.”

  • The selection of methods is often dependent in part on the resources we have available.

  • Each methodology involves the use of different methods or “tools” of assessment, manipulation, and control as well as sampling procedures.

Some commonly used methodologies in communication research

  • Laboratory research (occurs in a removed, relatively controlled setting) vrs field research (occurs in natural setting); observation vrs intervention

  • Surveys, questionnaires & interviews (eliciting responses from individual participants)

  • Documentary or archival research (examining data already available)

  • Focus groups (using group dynamics in focused discussion to enhance data collection)

  • Participant observation (researcher is a participant in the phenomena of study)

  • Case studies (studying a single case or event representative of phenomena)

  • Media analysis, protocol or conversational analysis, hermeneutic inquiry (studying the content of communications in a variety of media)

  • Ecological observation (studying ecological contexts across people)

  • Ethnography (studying phenomena as participants describe & understand them)

  • Action research (studying questions or issues with particular attention to intervention or change) & participatory action research (“subjects” participate with researcher in design & conduct of the study)

  • Appreciative inquiry (asking questions that focus on highlighting the strengths – as opposed to weaknesses – of an organization to aid growth toward potential)

  • The World café (a structured conversational process for awakening collective intelligence about key questions and issues)

  • Computer simulations involving iterative tests of models, programs, etc., simultaneously or sequentially.

  • These methods aren’t mutually exclusive & all can provide quantitative or qualitative data.

Assessment methods

  • Laboratory, organizational or community observation, self report, or other report methods to acquire quantitative or qualitative data.

  • Assessment throughobservation

    • “Outside” vrs participant observation

    • Live/recorded

    • Construct validity issues in operationalizing variables in recording and coding data

    • Interobserver/interjudge/intercoder reliability

  • Assessment through self andother report–interviews, questionnaires, surveys, documentary research

    • Item wording–clarity, face & predictive (criterion) validity, demand characteristics

    • Response format–open/closed

    • Relationship between researcher and respondent

    • Culture-method interaction (e.g., individual/collectivist culture and data collection epistemology and methods → Pe-pua “Pagtatanong-tanong”)

    • Order effects, social desirability, response bias

    • Length

    • F2f versus online

    • Encoding/scoring (and error)

  • Measurement or scaling in assessment

    • Nominal, ordinal, interval & ratio scales

    • Discrete versus continuous scales

[Slide image: survey workflow (Collect Responses → Analyze Results)]

Manipulation & Control Methods

  • Manipulation, intervention or treatment methods – laboratory, organizational or community.

    • Topic specific–the essence of the study of potential cause(s).

    • Laboratorymanipulation or Field intervention–active

    • Opportunistic “intervention”–passive observation of “naturally” occurring events (more like assessment in that confounding variables may not be controlled)

    • Usually based on past research –replication, altered replication, balanced replication

    • Can require creativity–especially for altered replications or new topics–and resources.

  • Controlmethods

    • Random assignment of participants

    • Matching/pairing of participants

    • Assessment and statistical control or weighting and analysis of covariance
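Random assignment, the first control method listed above, can be sketched in a few lines. This illustrative version shuffles the participant pool and deals it into equal-sized groups (the condition names are hypothetical):

```python
import random

def random_assignment(participants, conditions, seed=11):
    """Shuffle the pool, then deal participants round-robin into conditions,
    so assignment is random and group sizes are as equal as possible."""
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)
    groups = {c: [] for c in conditions}
    for i, participant in enumerate(pool):
        groups[conditions[i % len(conditions)]].append(participant)
    return groups

# Hypothetical study: 30 participants, two conditions.
groups = random_assignment(range(30), ["treatment", "control"])
```

Because each participant's condition depends only on the shuffle, extraneous variables are, on average, balanced across groups rather than held constant one by one.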

Other key method issues

  • Sampling (operationalizing who or what we collect data from)

    • Representative sample (“probability sample” – random & independent selection from population)

    • Stratified random sample (random and independent selection within strata)

    • Quota sample

    • Opportunity sample

    • Sampling for diversity

  • Pre-testing or pilot testing

    • To check workability of procedures

    • To check operationalizations–assessment instruments, manipulations, controls

    • To train research staff

  • Data Management

    • Coding and inputting data (must use formats appropriate to data analysis requirements)

    • Editing/verifying data

    • Data reduction (e.g., content analysis, factor analysis)

    • Data storage–original documents, tapes, discs, hard drives, etc., backups
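The stratified random sample listed above (random selection within strata) can be sketched as follows; the population composition and sampling fraction are hypothetical:

```python
import random

def stratified_sample(population, strata_key, fraction, seed=5):
    """Sample the same fraction at random within each stratum, so the sample
    mirrors the population's composition on the stratifying variable."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(strata_key(unit), []).append(unit)
    sample = []
    for members in strata.values():
        k = round(len(members) * fraction)
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical population: 60 undergraduates and 40 graduate students.
population = [("undergrad", i) for i in range(60)] + [("grad", i) for i in range(40)]
sample = stratified_sample(population, strata_key=lambda u: u[0], fraction=0.25)
```

Unlike a simple random sample, the stratified sample is guaranteed to reproduce the 60/40 composition exactly, removing one source of sampling error.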

Computer Simulations

  • The traditional western social science paradigm views humans as primarily intentional & rational, and behavior, and its products, as caused by decision-making, planning, leadership, and so forth. Its epistemology & methods are consistent with this ontology, including assessment, manipulation/intervention and control in the "real world." But there are other paradigms.

  • Evolutionary computation, self-organization, and swarm intelligence are related paradigms that view people (teams, organizations, etc.) as potential problem solutions to challenges in the ecology and identify natural selection strategies to optimize the solutions (i.e., each person is a potential solution to some problem!). Evolution is a general problem-solving algorithm.

  • Within these latter paradigms human behavior, and its products, are seen to emerge – often in complex and unpredictable ways – from relatively simple rules of behavior/perception and communication.

  • But emergence occurs over many generations or iterations and is difficult to study with traditional methods with limits of time and resources. Computers, however, can run programs specifying behavioral rules within ecological parameters over many iterations very fast, and we can observe what types of behaviors and products emerge. This method is called – Computer Simulation. “The use of computer-based models in exploration is akin to the use of gedanken [thought] experiments in physics” (Holland, 1998, p. 241).

The Game


  • “The Game" illustrates through simulation how simple rules at the local level (perceptual/behavioral/communication) can produce emergence of unpredictable and complex structures at a global (organizational) level without the need to infer leadership, management, plans, recipes, or templates to guide behavior.

  • As you start playing the game, note that changing rules (e.g., for appropriate behavior) and parameters (e.g., population and sight distance) changes outcomes drastically and unpredictably, resulting in patterns that are very complex and appear planned or organized – but by whom!

  • Note how changing sight distance affects outcome in terms of number and stability of the emerging clusters (or teams). Play with the parameters (e.g., try population=78 or so, sight distance=7).

  • Note that communication difficulty or cultural diversity, etc. could be functionally similar to sight distance and be sufficient to produce cultural clustering without postulating other social psychological explanations. In what ways might cultural differences in the rules for local interactions affect the self-organization process and hence the global outcomes?

  • Note how the medium (e.g., online vs. face-to-face) could also be related to sight distance in its effects.

  • How much of what goes on in teams is attributable to leadership or management or previously learned global plans and how much "simply" emerges from relatively simple rules we learn for interacting at the local level?
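A hypothetical one-dimensional version of this kind of simulation shows clusters emerging from a purely local rule. The rule, population size, and sight distance below are illustrative assumptions, not the actual rules of “The Game”:

```python
# Agents on a line follow one local rule: drift toward the mean
# position of neighbors within sight. Clusters emerge globally
# without any plan or leader. All parameters are illustrative.
import random

random.seed(42)
SIGHT = 7                                   # sight distance parameter
positions = [random.uniform(0, 100) for _ in range(30)]

for step in range(200):
    new_positions = []
    for me in positions:
        visible = [p for p in positions if p != me and abs(p - me) <= SIGHT]
        if visible:
            # Local rule: drift toward the mean of visible neighbors.
            target = sum(visible) / len(visible)
            me += 0.2 * (target - me)
        new_positions.append(me)
    positions = new_positions

# Count emergent clusters: gaps wider than SIGHT separate clusters.
positions.sort()
clusters = 1 + sum(1 for a, b in zip(positions, positions[1:]) if b - a > SIGHT)
# Far fewer clusters than agents emerge, with no global plan at all.
```

Re-running with a different SIGHT value changes the number and stability of the emerging clusters, as the slide describes.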


Social Psychology of Research

  • The social nature of social research can present special potential threats to validity–particularly in the form of confounding variables.

  • From the Subject’s perspective

    • High regard for science

    • Desire to help (or hinder) researcher

    • Evaluation apprehension

    • Demand characteristics (Orne,1962; also Milgram, Zimbardo)

  • From the Researcher’s perspective

    • Experimenter/interviewer/observer effect (a tendency to obtain consistent differences in observation or measurement from other researchers across conditions on the same variable)

    • Experimenter/interviewer/observer bias (a tendency to obtain differences in behavior and/or observation or measurement between conditions consistent with the researcher's expectations) (Rosenthal & Fode, 1963 with rats & grad students; Cordaro & Ison, 1963 with planaria & grad students; Rosenthal & Jacobson, 1968 with self-fulfilling prophecy or expectancy effects)

  • What to do?

    • Be aware!

    • Double blind and automation procedures

    • Varying “researcher” as an experimental variable

    • Replication

    • Multiple methods


Special Issues in Cross-cultural & Intercultural Research

  • Objectives and motives for cross-cultural research

    • To test the generalizability of our findings, theories or programs developed in one culture in another culture

    • To develop our understanding of phenomena from a diverse cultural base (“pan-cultural research”). Emic (within-culture) versus etic (cross-culture) research perspectives

    • $ and sabbaticals


    • defining & operationalizing culture

    • concept/linguistic equivalence & back translation (does a given concept or word have equivalent meaning in different cultures?)

    • functional equivalence (does a concept function the same in different cultures?)

    • metric equivalence (do data points have comparable meaning in different cultures?)

    • differentiating “culture” from other variables

    • Researcher/subject interaction effects become intercultural interaction effects

    • Culture-method interaction (e.g., individual/collectivist culture and data collection epistemology and methods–Pe-pua “Pagtatanong-tanong”)


    • Diverse research teams

    • Multicultural research programs on topics including participatory action research, in which “subjects” participate with the researcher in the design & conduct of the study.


Qualitative Methods

  • Direct qualitative observations of typically natural settings often using the methods of participant observation and/or intensive interviewing.

  • Alternatively called ethnography, naturalistic research, narrative analysis, verbal protocols, etc. See Qualitative Methods for Management & Communication Research.

  • The epistemological foundation is that only through direct observation, careful listening and/or active participation can one get close to understanding those studied and the character of their social worlds.

  • Often more of an inductive (“grounded theory”) than deductive process; particularly valuable in exploratory research

  • A potentially rich, but very labor-intensive, time-consuming process that must be done in a persistent and precise manner and requires care and elaboration in publication.

  • The researcher and researcher's perspective are central to the analysis process and thus cannot be replaced by software or contracted to others.


Key Steps in Qualitative Data Analysis

  • Social science framing (structuring in terms of theory, hypotheses, research questions)

  • Gathering data typically with the researcher as participant or observer

  • Coding of observations/interview responses (an interactive process between the researcher and the data requiring immersion in that data).

  • Memoing (making interpretive notes as coding continues)

  • Diagramming (taxonomies, typologies, concept charts, flow charts, etc.)

  • Thinking flexibly and being open to insight and willing to change

  • Managing researcher anxiety (the qualitative analysis process is not mechanical or easy)

  • Writing it all up


Some examples of a Qualitative Data Analysis process

  • Fontaine & Emily (1978). Causal attribution and judicial discretion: A look at the verbal behavior of municipal court judges. Law and Human Behavior, 2, 323-337.

  • or

  • Fontaine (2004). Voices from the Road: Descriptions of a Sense of Presence in Intercultural and International Encounters. (Paper presented at the 28th International Congress of Psychology – ICP2004 – Beijing, China).


Fontaine & Emily (1978) a

  • Examination of the judges' verbal statements reveals the apparent use of the attributional processes discussed earlier. Some appear to reflect the use of a single basic process, others reflect a combination of processes. For instance, the following three excerpts seem to indicate the use of the logical process described by Kelley.

  • Judge: Does the defendant have any further evidence?

  • Defense Attorney: No, sir.

  • Judge: Having heard all the evidence of both the City and the defendant's testimony / the Court has no choice but / to find him guilty as charged / You are well aware that there is no "accessory" in the Kansas City Ordinances / Anyone participating in an offense is guilty / and on his testimony he was equally guilty / He asked "how much" / We note that he has a prior conviction on soliciting for prostitution / No doubt that's why he was there / Sentence is 15 days.

  • Notice the apparent use of consistency information–“We note that he has a prior conviction ...”–to infer the cause of the defendant's behavior–“No doubt that's why he was there.”

  • In the following example the judge appears to be looking more at how distinctive the act is for the defendant:

  • Judge: You are charged with appearing in an intoxicated condition. How do you plead?

  • Defendant: Guilty, your honor, but I want to. I have a statement.

  • Judge: What are you at MCI (Municipal Correctional Institute) for now?

  • Defendant: Being drunk.

  • Judge: For how long?

  • Defendant: I don't know.

  • Thus the judge appears to be seeking information about how distinctive the defendant's public intoxication is–"What are you at MCI for now?" In other words, is public intoxication the only type of criminal act in which the defendant engages (i.e., is it distinctive?) or just one of many types of criminal acts? The question is not how consistent his public intoxication is–although the judge does discover some consistency–but whether other criminal acts are usual for the defendant.


Fontaine & Emily (1978) b

  • Other statements seem to reflect the use of a process more similar to that described by Jones, in particular, concern with the social category of the defendant to determine the value of noncommon effects.

  • Judge: Now, do you have anything you want to say about why you did it? / At your age / with no prior record / why would you do something like that?

  • Defendant: The first time I've ever been arrested.

  • Judge: All right, you pleaded guilty, sir / You don't have any prior record whatsoever / At your age / The humiliation at having been arrested and coming to Court would be sufficient / A fine of $25 is assessed.

  • Notice that the judge seems particularly concerned with the value of such an act (shoplifting) to that category of defendant –quite likely to determine, as Jones proposes, the defendant's intention in committing the act and something about his dispositions. Further, he seems to assume, based upon the defendant's social category, that the entire situation is very humiliating to the defendant, i.e., the judge is stereotyping.

  • A similar process is reflected in the following example:

  • Judge: You have at least one prior conviction in a two-year period / However, I note that this case was continued the first time because you are a student / is that correct? You attend Drayton University?

  • Defendant: Yes.

  • Judge: Ordinarily I would order driver's school / However, that doesn't seem to be ... / Fine of $20.

  • The judge seems to conclude from some categorization of the defendant based on her student status that the normal sentence for the offense would be pointless or undesirable. This may be a case in which some process involving the defendant's social category had more impact, even though the consistency information ("at least one prior conviction") should have led to a causal attribution to the defendant's disposition and thus a relatively severe sentence if a logical process were used.


Fontaine (2004) a

  • "I was working there for an emergency relief organization for a three week assignment after a cyclone swept through the Samoa. A local government employee was assigned to the service center where I worked and we became friends. On my last night on the island, we went to visit a friend of his who was not home, so we waited for him by putting a blanket on the ground and sat for a few hours watching and discussing the stars, and talking about our different cultures and life experiences. That was magic aplenty, but to serenade us in the night there were two old Samoan men sitting in a little fale outside the house singing old Samoan songs. The stars, the smell of the sea, the singing, an enlightening conversation, and the beginnings of a cross-cultural friendship were all ingredients in what I can classify as one of the most powerful experiences in my life. Knowing I was leaving the next day heightened my senses. It was if I was trying to drink the magic all in before the dream vanished upon my boarding the jet home. I felt then as if this brief experience was a special chapter in my life. The experience made a dramatic impact on my emotions and my senses. I cannot look at the stars today without thinking about that night many years ago. I felt somehow connected to the island and its people with a bond of song and story topped off with a celestial dollop of starlight.“

  • “I take refuge in these images: the molten river, a gnarled tree, a brackish pool in which a single white lotus, now closing gently against the evening coolness, will again miraculously bloom. Three women pass me, giggling at my cropped hair, my indeterminate features. Do they think I'm a boy? Do they know I'm a woman and wonder at my aloneness, here on this road? Each one wears the vermilion streak lining the part in her glossy black hair--the furrow split by the plow--signifying her married status. A young tea vendor pours a cup of milky brew the color of his palms and flashes a radiant gap-toothed grin at me. 'Chai, sister?' he beckons earnestly, forcing another refusal from me; the momentum of my stride carries me on. A thin old man drives his dusty gray and blue-black bullocks leisurely toward the patchy fields on the riverbank. Their massive haunches heave a lazy rhythm as they move past me, their tails halfheartedly flicking away flies. I run my hand over their broad backs, touch the pungent skin of the world. Where am I in all this? A spectator, a ghost? A guest. Suddenly I become lighter, transparent. Things pass through. My senses are as permeable as a membrane. Someone is laughing.”


Fontaine (2004) b

  • Those descriptions represent probably the signature quality of a sense of presence in terms of its impact on the person–a vividness, realness, and a feeling of being very alive. It is like sensing the world with the gloss wiped away, without a lens, somehow more directly, and thus more intensely. Our culture, of course, focuses our perception on certain stimuli in the world and guides our construction of meaning for them. That's vitally important. It helps us survive, be adjusted, and perform well in our culture's world. It's as though our culture provides lenses through which we view the world and the "prescription" for them. But in focusing our perception through these lenses on some parts of a familiar world, it glosses over the remainder. The sense of presence is like blinking and, suddenly, perhaps briefly, seeing the world clearly. It is perceptual clarity or sensory intimacy. This is very similar to Seamon’s (1979, 105-111) description of heightened contact–"This vividness of presence is described as an inner tingling and quiet ... as a sense of reverence for time and place. The person is quiet and receptive in the moment of contact. ... Heightened contact, like noticing, is unexpected and sudden."


Quantitative Data Analysis

  • Data analysis basically involves an optimization process for determining the goodness of fit of alternative solutions to problems. These analyses are significantly modeled on the processes of neural networks–how the human mind thinks. Central to these processes is minimizing the sum of squares (the squared deviations of observations about some optimum).

  • Some of these analyses involve inference of cause-and-effect relationships in the network when variables are independent (e.g., with IVs, mediating Vs, and DVs, using ANOVA) and a single global optimum can be identified. A “Fujiyama landscape.”

  • Others involve the strength of relationships when the variables are not independent (e.g., using correlation and regression) and multiple, local optima are identified. A more common “fitness landscape.”
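The sum-of-squares idea can be made concrete with a small sketch (illustrative data): the sample mean is exactly the value that minimizes the sum of squared deviations.

```python
# The sample mean as an "optimum": no candidate value produces a
# smaller sum of squared deviations than the mean (illustrative data).

data = [3, 5, 7, 9, 11]

def sum_of_squares(values, candidate):
    """Sum of squared deviations of the observations about a candidate optimum."""
    return sum((x - candidate) ** 2 for x in values)

mean = sum(data) / len(data)           # 7 for these data
best_fit = sum_of_squares(data, mean)  # 40; any other candidate does worse
```

Trying any other candidate value (6.9, 7.1, etc.) yields a larger sum of squares, which is the goodness-of-fit logic the slide describes.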


Quantitative Data Analysis Concepts

  • Statistics or quantitative methods is a tool–largely based on applied mathematics–for helping describe and draw inferences from research data.

  • Descriptive statistics (describing or representing data in ways to facilitate interpretation)

  • Inferential statistics (analyses assisting us in making inferences about populations from data on samples)

  • Difference analyses (statistics for drawing inferences about differences in dependent variables in populations defined by independent variables–estimates the likelihood of causality, typically with experimental designs)

  • Relatedness analyses (statistics for drawing inferences about the relationship between variables in a population–estimates the likelihood of covariation, typically with non-experimental designs)

  • Parametric statistics (inferential statistics used to draw inferences about parameters, e.g., means & variances)

  • Non-parametric statistics (inferential statistics used to draw inferences about characteristics of populations other than parameters, e.g., frequencies & percentages)

  • Population (any class of phenomena defined in terms of unique and observable/measurable characteristics–usually the people we are trying to understand)

  • Sample (some subset of a population–the specific people we actually study)

  • Parameter (a mathematical characteristic of a population)

  • Statistic (a mathematical characteristic of a sample)


Data Analysis–Descriptive Statistics

  • Descriptive statistics describe or represent data in ways that facilitate interpretation. They help us simplify what is usually a complex array of data. If we are studying the entire population, they are all we need to interpret the data–but we usually aren’t. If we are studying a sample, they are usually the first thing we look at to see what we got, and they can guide decisions about subsequent inferential analyses.

  • Descriptive statistics are typically presented in table or figure form (e.g., in the latter--frequency polygons, line graphs, bar graphs, pie charts, etc.). The form of both is usually dictated by convention for consistency and to minimize biased presentation (see APA style).

  • Frequency distributions(distribution of observations along a scale–not a mathematical characteristic of a sample, thus not a statistic in the strict sense). Also percentage distribution, cumulative frequency distribution, and cumulative percentage distribution.

  • Scale score:  1   2   3   4   5   6   7   8   9   10

  • Frequency:    4   5   6  12   8   6   7   4   3    3
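These distributions are easy to build with standard-library tools; a minimal sketch with hypothetical scores (not the slide's data):

```python
# Frequency, percentage, and cumulative frequency distributions for a
# set of hypothetical scale scores.
from collections import Counter

scores = [2, 3, 3, 4, 4, 4, 5, 5, 6, 7, 7, 8]
n = len(scores)

freq = Counter(scores)                           # frequency distribution
pct = {s: 100 * f / n for s, f in freq.items()}  # percentage distribution

cum_freq = {}                                    # cumulative frequency distribution
running = 0
for s in sorted(freq):
    running += freq[s]
    cum_freq[s] = running
# The cumulative frequency at the highest score equals n.
```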


Data Analysis–Descriptive Statistics (continued)



Data Analysis–Inferential Statistics

  • Assist in making inferences about parameters from statistics.

  • We sometimes estimate parameters from statistics. Inferential statistics help us specify the limits within which we are confident that a parameter falls (confidence limits, confidence interval, and confidence coefficient expressed in terms of probability the parameter falls within the limits)

    • Example – mean # of class books read/week = 15. What is the parameter (μ) for the population of graduate students?
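A sketch of the estimation idea for this example, with hypothetical data and a normal approximation (the standard library has no t distribution):

```python
# A 95% confidence interval for a population mean, estimated from a
# sample mean (hypothetical books-read data; normal approximation).
from math import sqrt
from statistics import NormalDist, mean, stdev

books = [15, 12, 18, 14, 16, 13, 17, 15]   # books read/week (hypothetical)
m, s, n = mean(books), stdev(books), len(books)

z = NormalDist().inv_cdf(0.975)            # critical value for a 95% confidence coefficient
half_width = z * s / sqrt(n)               # critical value times standard error
ci = (m - half_width, m + half_width)
# We are ~95% confident the parameter falls within ci (the confidence limits).
```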

  • We may wish to infer if a sample is drawn from a population with known parameters or whether two or more samples are from the same population with respect to some variable. Inferential statistics help us specify the probability that the samples are from the same population (i.e., of the null hypothesis). The analyses used to calculate this probability are called tests of significance. The probability is called the level of significance or alpha level and if it is sufficiently low we say the effect is statistically significant, i.e., they are not from the same population.

    • Example – mean # of class books read/week this semester = 15. Last semester = 9. Are the students from the same population with respect to reading frequency?
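A sketch of such a test of significance for this example (hypothetical data; the two-sample t statistic with a large-sample normal approximation for the p-value, since the standard library has no t distribution):

```python
# Two-sample t statistic for books read/week this semester vs. last
# (hypothetical data). A small p suggests the two samples are not
# from the same population.
from math import sqrt
from statistics import NormalDist, mean, variance

this_sem = [15, 14, 16, 17, 13, 15, 16, 14]
last_sem = [9, 10, 8, 9, 11, 8, 9, 10]
n1, n2 = len(this_sem), len(last_sem)

pooled = ((n1 - 1) * variance(this_sem) + (n2 - 1) * variance(last_sem)) / (n1 + n2 - 2)
t = (mean(this_sem) - mean(last_sem)) / sqrt(pooled * (1 / n1 + 1 / n2))
p = 2 * (1 - NormalDist().cdf(abs(t)))   # two-tailed level of significance (approximate)
# If p falls below the alpha level (e.g., .05), the effect is
# statistically significant and we reject the null hypothesis.
```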

  • We sometimes wish to infer whether a relationship obtained in a sample between two or more variables occurs in the population. Tests of significance give us the probability that they are not related (i.e., of the null hypothesis) using the same terminology as above.

    • Example – Is there a relationship between books read/week and time spent on Facebook/week?
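A sketch of the corresponding relatedness analysis (hypothetical data; Pearson's r computed from its definition):

```python
# Pearson correlation between books read/week and Facebook hours/week
# (hypothetical data). r near -1 means more Facebook time goes with
# fewer books read.
from math import sqrt

books = [2, 4, 6, 8, 10]
facebook = [20, 16, 13, 9, 5]
n = len(books)
mx, my = sum(books) / n, sum(facebook) / n

cov = sum((x - mx) * (y - my) for x, y in zip(books, facebook))
ssx = sum((x - mx) ** 2 for x in books)      # sum of squares for x
ssy = sum((y - my) ** 2 for y in facebook)   # sum of squares for y
r = cov / sqrt(ssx * ssy)
```

A test of significance would then give the probability of obtaining an r this large if the variables were unrelated in the population.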

And again, beware!


Sampling distributions


Sampling distributions (continued)

  • Sampling distribution of sample variances–the distribution of the variances of samples along some variable.

  • The mean of sample variances is equal to the population variance, thus the variance of a representative sample is an unbiased estimate of the population variance.

  • The sampling distribution of sample variances is not normally distributed (it is positively skewed; the ratio of two independent variance estimates follows an F distribution).

  • We can differentiate the variance of sample means from the mean of sample variances. We thus have two independent ways of estimating the population variance–between-group variance and within-group variance, respectively. The comparison between them is the basis of analysis of variance–the more the two estimates differ, the less likely the groups are to be from the same population (i.e., the less likely the null hypothesis).
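The two variance estimates can be sketched directly (hypothetical three-group data with equal group sizes):

```python
# Between-group vs. within-group variance, the basis of ANOVA
# (hypothetical three-group data).
from statistics import mean, variance

groups = [[4, 5, 6, 5], [7, 8, 9, 8], [10, 11, 12, 11]]
n = len(groups[0])                           # equal group sizes

within = mean(variance(g) for g in groups)   # mean of sample variances
group_means = [mean(g) for g in groups]
between = n * variance(group_means)          # variance of sample means, scaled by n

F = between / within
# The larger F is, the less likely the groups come from the same
# population (the null hypothesis).
```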



Data Analysis–Analysis of Variance



Data Analysis–Correlation and Regression



Data Analysis–Other Common Analyses



Data Analysis–Data Files


Data Analysis–Statistical Software Packages

  • Microsoft Excel (OK, for many common analyses–t-test, ANOVA, correlation, etc.)

  • SPSS (Statistical Package for the Social Sciences–a full system)

  • SAS (Statistical Analysis System–a full system)

  • Online calculator for many simple descriptive stats & analyses

  • OK for many common analyses, particularly good for management


Data Analysis–Implications for Methods

  • Sample size (the larger the sample, the greater the degrees of freedom and the easier it is to get significance–because of a smaller error term and lower critical values for given alpha levels; sample size also influences the type of analysis appropriate, e.g., chi-square needs a minimum of about 5 expected per cell, while multiple regression and multivariate analyses need large n's)

  • Sample selection procedure (most tests are appropriate only to the degree that probability sampling is used–that is, samples are representative: randomly and independently selected)

  • Scale for outcome variables (influences the type of analysis appropriate)

  • Design in terms of the relationships between variables (influences the type of analysis appropriate; e.g., in ANOVA experimental variables must be orthogonal, or multiple regression is more appropriate–but the best solution is not unique)
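The sample-size point can be sketched numerically (hypothetical effect size and standard deviation): for a fixed effect, the test statistic grows with the square root of n.

```python
# Same effect, larger n: the standard error shrinks as sqrt(n), so the
# test statistic grows and significance gets easier (hypothetical values).
from math import sqrt

effect, sd = 2, 10                        # hypothetical mean difference and SD
zs = [effect / (sd / sqrt(n)) for n in (16, 64, 256)]
# Quadrupling n doubles the test statistic.
```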
