Term
| Describe the difference between basic and applied research. |
|
Definition
| Basic research is driven by curiosity about a scientific question; the motivation is to expand knowledge, not to invent something. Basic research lays the foundation for applied research. Applied research is designed to solve practical problems rather than to acquire knowledge for knowledge's sake; the goal of the applied scientist is to improve the human condition. |
|
|
Term
| Describe the characteristics of scientific research. What makes research scientific? |
|
Definition
| Research is observable, descriptive, systematic, and tied to theory. It poses significant questions that can be investigated empirically, provides a coherent and explicit chain of reasoning, and can be replicated and generalized across studies. Being tied to theory means relating constructs to observations and to each other. |
|
|
Term
| Explain empirical evidence, differentiate between empirical evidence and theory |
|
Definition
A source of knowledge acquired by means of observation or experimentation.
Empirical evidence is information that justifies belief in the truth or falsity of an empirical claim. A theory, by contrast, is a general explanation that organizes empirical evidence and is tested against it. |
|
|
Term
| Explain what a research problem is |
|
Definition
| An area of interest or concern. It must be in some way observable, and it may be basic or applied. |
|
|
Term
| Describe what is meant by a scientific theory. |
|
Definition
| A tested, well-substantiated explanation based on knowledge that has been repeatedly confirmed through observation or experiment. |
|
|
Term
| Describe what a construct is. |
|
Definition
| An unobservable trait, attribute, or characteristic. Constructs must be operationalized: identify indicators or variables that represent them. These variables are tangible, observable manifestations of the construct. |
|
|
Term
| Describe the characteristics of a good research question. |
|
Definition
| Feasible, clear, significant (worth investigating, strong rationale), ethical |
|
|
Term
| Describe the literature review process. |
|
Definition
| The foundation of any study and one of the best single determinants of study quality. Three parts: searching, reading, and writing. Brainstorm keywords; search; use primary and secondary sources; follow up on articles and keywords cited in what you've read, as well as lesser-known sources. Writing helps you trim the literature and see whether there are any holes or gaps; broaden the search if need be. |
|
|
Term
| Differentiate between primary and secondary sources |
|
Definition
| A primary source is an original document whereas secondary source interprets and analyzes primary sources. For example, The Diary of Anne Frank is a primary source. A book about Anne Frank within the context of WWII would be a secondary source. |
|
|
Term
| Differentiate between confidentiality and anonymity |
|
Definition
- Maintaining confidentiality of information collected from research participants means that only the investigator(s) or members of the research team can identify the responses of individual subjects
- Anonymity means that either the project does not collect identifying information about individual subjects (e.g., name, address, email address), or the project cannot link individual responses with participants' identities |
|
|
Term
| Be able to recognize the appropriate level of review by HSRB (IRB) for different study designs |
|
Definition
Exempt review: secondary data, surveys, interviews, public observations, educational tests
- In educational settings using normal educational practices
Expedited review: studies involving minimal risk, vulnerable populations, noninvasively collected biological samples, recorded data
Full review: studies involving more than minimal risk
All research should be reviewed |
|
|
Term
| Describe the Informed Consent Process |
|
Definition
Participants must be aware of what they will be asked to do in the study and must freely choose to participate.
Consent must come from a guardian for minors and those with diminished capacity.
Participants are free to withdraw at any time.
Information should be given in a language the participant can understand.
Information should include risks and benefits and should help participants decide whether to take part |
|
|
Term
| Describe the three ethical principles of the Belmont Report |
Definition
Respect for Persons: requirement to acknowledge autonomy and protect those with diminished autonomy
Beneficence: Do no harm; maximize possible benefits and minimize possible harms; benefits must outweigh risks
Justice: Requires that people be treated fairly; selection of participants as well |
|
|
Term
| Understand the conditions in which using deception in research might be appropriate |
|
Definition
In most instances, researchers do not use deception.
Deception is never used when there is a reasonable expectation of pain or severe emotional distress
The purpose of deception needs to be made explicit in the IRB application.
If deception is used, a debriefing of the true nature of the study needs to occur as soon as possible |
|
|
Term
|
Definition
Research plans must be approved
Do no harm (physically or mentally)
Informed consent |
|
|
Term
| Buckley Amendment (FERPA) |
|
Definition
Confidentiality of data
"Legitimate educational interest" |
|
|
Term
| Give examples of vulnerable populations in research |
Definition
| Children, those with mental disabilities, and those subject to significant levels of coercion, such as prisoners. |
|
|
Term
Define and give examples of variables
Differentiate between variables and constants |
|
Definition
A concept that varies within a class of objects; among students, hair color might be a variable. A constant does not vary within the class.
Student = constant, Hair color = variable |
|
|
Term
| Differentiate between categorical and continuous variables |
|
Definition
Categorical variables: differences in quality, not degree (e.g. male/female)
Continuous variables (quantitative): describes differences of degree |
|
|
Term
| Give examples of nominal, ordinal, interval, and ratio variables |
|
Definition
Nominal: groups data (gender, place of birth)
Ordinal: ranks data (SES, student rank)
Interval: equal difference between score points (test scores, personality scale)
Ratio: equal differences plus a true zero point (weight, length)
*Treat interval/ratio variables similarly in social sciences research |
|
|
Term
| Differentiate between independent and dependent variables |
|
Definition
Independent variable: the variable hypothesized to cause change; manipulated by the researcher
Dependent variable: the outcome variable changed by the intervention
Extraneous variables: variables that might influence the dependent variable; we want to control for extraneous variables |
|
|
Term
| Define a hypothesis and recognize difference between directional and nondirectional hypotheses |
|
Definition
Hypothesis: a proposed explanation for a phenomenon; a prediction of the outcome of the study; based on theory
Directional: an indication of the outcome in a specific direction
Nondirectional: predicts that a change or difference will occur but does not specify its direction; more ambiguous |
|
|
Term
| Describe what a representative sample means |
|
Definition
A subset of a statistical population that accurately reflects the members of the entire population; should be an unbiased indication of what the population is like.
|
|
|
Term
| Describe what is meant by generalizability |
Definition
The extent to which the findings from a sample can be generalized to the entire population |
|
|
Term
| Understand and recognize/give examples of different sampling techniques (random & non-random) |
|
Definition
Random: a simple random sample is chosen in such a way that every set of n individuals has an equal chance of being the sample actually selected
Non-random: convenience sampling (whoever is available or volunteers) or purposive sampling (the researcher decides whom in the population to collect data from) |
|
|
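As an illustration (not part of the original card), a minimal Python sketch contrasting a simple random sample with a non-random convenience sample; the population of 100 member IDs is hypothetical:

```python
import random

# Hypothetical population of 100 member IDs.
population = list(range(1, 101))

# Simple random sample: random.sample gives every set of n individuals
# an equal chance of being the sample actually selected.
random.seed(42)  # fixed seed so the sketch is reproducible
srs = random.sample(population, k=10)

# Convenience sample (non-random): whoever is "available" -- here modeled
# as the first 10 members, which invites bias.
convenience = population[:10]

print(sorted(srs))
print(convenience)
```

The random sample changes from run to run (absent a fixed seed), while the convenience sample always over-represents whoever happens to be easiest to reach.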
Term
Define and describe validity as a concept
Explain measurement validity |
|
Definition
The degree to which evidence and theory support the interpretations of test scores entailed by proposed uses of tests
How well does it test what it sets out to test? (accuracy)
Measurement validity: the ways we interpret and use measurements; some interpretations are more or less valid |
|
|
Term
| Explain content, criterion and construct validity and give examples of each |
|
Definition
Content validity: is my assessment focused on the content I wish to measure? Focus and format issues (printing, font, spacing, language, directions)
Criterion validity: can the scores on the assessment predict performance on a task? (predictive evidence)
Do the scores on the assessment relate to the scores on another well-established assessment? (concurrent evidence)
Construct validity: is the underlying cognitive skill being measured reflected in the measurement scores? Evidence: a clear, empirically testable definition of the construct; comparison between individuals with high and low levels of the construct |
|
|
Term
| Explain and describe reliability as a concept |
|
Definition
| The consistency of measurements when a test is repeated on a population of individuals or groups |
|
|
Term
| Explain and describe what is meant by measurement error |
|
Definition
| Measurement error is always present; efforts should be made to account for it and to reduce it |
|
|
Term
| Give examples of how to test the reliability of a set of scores |
|
Definition
Test-retest: administer the same test twice to the same group; theoretically useful but not always practical
Equivalent forms: administer two different forms of the same measure (like two different tape measures) to the same group
Interrater: two raters using the same form/criteria to score (e.g., the ACT Writing portion) |
|
|
Term
| Describe what is meant by internal validity |
|
Definition
| Concerned with the quality of the causal inferences you can make in your research; the recognition and removal of nuisance variables or unaccounted-for errors |
|
|