Research, Epidemiology and Statistics Exam I
Term | Definition |
---|---|
Definition of Research | Research is a systematic process based on the scientific method that facilitates the identification of relationships and determination of differences in order to answer a question |
T/F: Discovery arises from the research process when we set out to discover something | False, discovery arises from the research process, but we DON'T set out to discover something |
Pathway of Progress (how funding affects each stage, i.e., the benefit of funding at each stage) | Curiosity/Perceived Need -> Basic Research (Advance Fundamental Knowledge) -> Applied Research (Advancement of Technology) -> Product or Change Development (Acceptability or Profitability of the Change/Product) -> Quality of Life Improvements |
T/F: Health Research and Development dollars have increased as Health Care Spending has skyrocketed | False, Health R&D has increased marginally. |
Scientific Method/Process for Conducting Research (7 steps) | 1) Specify and justify a problem 2) Format your research question 3) Collect and summarize prior research 4) Develop your method of work 5) Select your study participants 6) Conduct study 7) Analyze and interpret data |
Descriptive research goal | Describes a health event, incidence, prevalence |
Association research goal | Risk factors, confounding variables, modifiers |
Causal research goal | Meets causal requirements, is valid and reliable; hardest to do |
Evaluation research goal | Determine efficacy of intervention, role in preventing and minimizing adverse events, must know the cause of the issue; what we mostly do in health sciences |
Definition of the measure: construct | an attribute or characteristic that is expressed in a general or abstract manner (example- scholastic achievement)– global idea |
Definition of the measure: variable | a characteristic or attribute that can be measured and that varies among individuals or organizations (example- grade point average)—measurable idea |
T/F: Mistakes made in the calculations or in reading the instrument are not considered in the error analysis | True |
Systematic error | reproducible inaccuracies that are consistently in the same direction. Ex- The cloth tape measure that you use to measure the length of an object has been stretched out from years of use. (As a result, all of your length measurements are too large.) |
Random error | statistical fluctuations (in either direction) in the data due to the precision limitations of the measurement device. Ex- You measure the mass of a flask three times using the same balance and get slightly different values |
Five sources of evidence in the pursuit of truth | 1) Custom and tradition 2) Authority 3) Personal experience 4) Deductive reasoning 5) Scientific inquiry |
Deductive Reasoning | Logic; thinking proceeds from general assumption to a specific application. Not sufficient as a source of new truth. Ex: All philosophers are moral. Socrates is a philosopher. Therefore, Socrates is moral. |
Inductive Reasoning | Conclusions about events (general) are based on information generated through many individual and direct observations (specific). |
Perfect Induction | Conclusions based on observations made from ALL members of a group or population |
Imperfect Induction | Conclusions based on observations made from a random sample of members of a population |
1st and arguably the most important step of the scientific method | Identifying the problem |
Step 2 of the scientific method | Formulating a hypothesis (belief or prediction of the eventual outcome of the research). Based on deductive reasoning. |
Null hypothesis | All is equal, no differences exist |
Alternative or research hypothesis | Usually specific and opposite of the null hypothesis |
Step 3 of scientific method | Developing a research plan; strategy must be made for gathering and analyzing info that is required to test hypothesis or answer |
4 parts of developing a research plan | 1) Selection of research methodology. 2) ID of subjects/participants 3) Description of data-gathering procedures 4) Specification of data analysis instruments. Summ: What, who, how, and what do you need |
Step 4 of scientific method | Collecting and Analyzing the Data; follow all of your predetermined protocols |
Step 5 of scientific method | Interpreting Results and Forming Conclusions; Data analysis is not an end in itself...Do the data support or refute the original hypothesis? |
Descriptive Questions purpose and examples | To describe phenomena or characteristics of a particular group of subjects being studied. Ex: survey research and qualitative research |
Difference Questions purpose and example of research type | To make comparisons btwn or w/in groups to see if there is a difference. Ex: Experimental research and non-experimental research |
Experimental research examples | Treatment vs control; pre vs post test comparisons |
Non-experimental research definition | compare one group to another based on existing characteristics |
Relationship Questions purpose. T/F: This establishes a cause and effect relationship. | To investigate the degree which 2+ variables are associated w/ each other (covary). Instead of looking at differences in groups, they look at relationships among them. False, this DOES NOT establish cause and effect. |
Hypothesis definition and what reasoning it is based off of | A concrete, specific statement about the relationships between phenomena w/ the belief/prediction about the eventual outcome of the research. Based on deductive reasoning |
Theory definition and what reasoning it is based off of | A theory establishes a cause-and-effect relationship between variables with the purpose of explaining and predicting phenomena w/ a belief/assumption about how things relate to each other. Based on inductive reasoning |
Definition of empiricism | Acquiring info and facts through the observation of our world. Allows us to make pragmatic observations, developing theories through our experiences and those observations. Often our Txs will be based non-scientifically on our experiences. |
Basic vs Applied Research | Basic: pure, fundamental research in order to discover new knowledge. Applied: central purpose to solve an immediate problem inferring beyond the group or situation studied. |
Quantitative vs Qualitative Research | Quan: numerical, measurable data with a traditional approach (question, hypothesis, procedures, controls, large samples, stat. analysis). Qual: non-numerical data using sociological research methods in a natural setting being interpretive or descriptive. |
Experimental vs Non-experimental Research characteristics and/or examples | E: 3 fundamental characteristics: 1) at least 1 independent variable 2) Extraneous variable controls 3) observation of dependent variable response to independent variable. Ex: cause and effect N: causal-comparative, descriptive, correlational, historical |
6 major limitations in conducting research | 1) Time 2) Costs 3) Access to resources 4) Approval by authorities 5) Ethical concerns 6) Expertise |
6 assumptions of qualitative designs (Researchers = Rs) | 1) Rs concerned w/ process>outcome 2) Rs interested in meaning 3) Rs instruments for collection and analysis 4) Fieldwork involved 5) Descriptive 6) Inductive |
Quantitative Descriptive | Descriptive statistics: graphical and numerical techniques for summarizing data. |
Quantitative Analytic | Inferential statistics: procedures for making generalizations about characteristics of a population based on information obtained from a sample taken from that population |
Population | any set of individuals (or objects) having some common observable characteristics. |
Sample | the subset of a population which represents the characteristics of the population. A sample consists of respondents or subjects |
An informant | a person from whom a linguist obtains information about language, dialect, or culture. |
A corpus | is a collection of written or spoken material. “body of information” |
1st 7 steps to experimental research | 1) IDing the research ? or problem area 2) 1st review of literature 3) Distilling the ? to a specific research problem 4) Further review of literature 5) Formulation of hypotheses 6) Determining the basic research approach 7) IDing the pop. and sample |
Steps 8-12 to experimental research | 8) Designing data collection plan 9) Selecting or developing specific data collection instruments or procedures 10) Choosing the method of data analysis 11) Implementing the research plan 12) Preparing the research report |
Study types from most valuable to least valuable (and their levels of medical evidence based on source of evidence) (7) | 1) Systematic review/meta-analysis 2) randomized controlled trial (I) 3) cohort studies (II-2) 4) case control studies (II-2) 5) case series/case reports (II-3) 6) animal research/lab tests 7) Expert Opinion (III) |
T/F: Most medicine is practiced based off of case reports | True, and they are one of the least reliable/valuable forms of study |
Concept of Study Base: definition; in cohort study; in cross-sectional study; in case-control study | pop. who experience the disease outcomes you will observe in your study; Cohort: 1+ explicitly defined cohorts based on characteristics at time zero. X-Sec: 1+ hypothetical cohorts sampled at one point in time. C-C: the cohort that gave rise to the cases |
Observational study vs experimental study | O: only observing/counting and not doing anything. E: interfering w/ disease |
Descriptive Studies | No assignment of exposure or risk factor as the intent is to observe and record. Gives us an idea of what is changing. |
Cross-Sectional Studies | carried out at one time point or over a short period. Idea of prevalence; shows characteristics/risk factors we wouldn't usually see. It is a good planning type of approach, but if a disease is rare, population can get too big. |
Point prevalence vs Period prevalence. What kind of study can measure prevalence? | Cross-sectional studies are the only type capable of calculating prevalence. Point: what we are usually speaking of when we refer to prevalence. Ex: Do you currently have a backache? Period: Ex: Have you had a backache in the past 6 months? |
Advantages and Disadvantages of Cross-Sectional Studies | A: cheap and quick, data frequently available through records/statistics, ideal for generating new hypotheses. D: A cause-and-effect relationship cannot be determined. Can measure exposure (E) or outcome (O), or both, but cannot tell you E caused O |
Ecological studies | studies of risk-modifying factors on outcomes based on differing populations defined either geographically or temporally. Ex: American diet vs Mediterranean diet |
Ecological fallacy | the error of assuming that relationships observed for groups necessarily apply to every individual in the group (you can't take group-level data and narrow it down to individuals). Ex: A Mediterranean man may live in the South and have HD, but not necessarily b/c he lives in the South. |
Retrospective Case-Control Studies: Definition and Goal | Compares Hxs of affected individuals vs non-affected individuals starting as far back as possible w/ the goal of determining the presence/absence or necessary magnitude of an exposure in affected individuals. Describe risk profile & analyze associations. |
T/F: Causation can be determined w/ retrospective case-control studies. | False: Associations can be made when a huge study is done, but no causation can be linked. |
Advantages and Disadvantages of Retrospective Case-Control Studies | A: Good initial explanatory study, efficient, cheap, quick, well suited for rare disease (it begins w/ subjects who already have outcome) D: Questionable data reliability, recall bias, sampling bias and others |
Prospective Case Control (Cohort) Studies concept | Subjects IDed according to presence/absence of risk factor -> followed over time until outcome occurs -> frequency of outcome compared btwn 2 groups |
Prospective Case Control (Cohort) Studies purposes | Describe incidence (new cases) of outcome over time. Analyze associations btwn risk factors and outcome. Recall you cannot prove causation, only make associations with large amounts of data. |
T/F: Retrospective Case-Control studies and historical cohort studies are different things. | True: Retrospective Case-Control doesn't use an entire cohort; you just ID subjects who met the case definition and select similar unaffected persons as the controls. A historical cohort study selects a group of participants regardless of status, using the entire cohort. |
Advantages and Disadvantage of Cohort Studies | A: more powerful design for defining incidence, no recall bias, powerful design for associating (not proving) cause with the effect. D: Expensive in time, money, and # of subjects. Loss of info due to subject attrition. Subjs. may change behavior -> error |
Randomized Controlled Trial (aka Clinical Trial) | Experimental study w/ unique features: intervention randomization and control group comparison (placebo vs Tx, or existing Tx vs new Tx). This is the GOLD STANDARD. |
Advantages and Disadvantages of Randomized Control Trials | A: Controls exposure: shows intervention vs not, increases likelihood that the only difference is the intervention. D: Ethical concern of w/holding Tx to group, $$$, pts/providers may not be willing participants. |
Blinding (Masking): Definition | In randomized controlled trials, Tx allocation for each subject is not revealed until the subject has been successfully entered into the trial, avoiding bias. |
Single vs Double vs Triple blind | S: Subject blinded to role D: Subject and investigator are blind. T: *Best* Subject, investigator and data-cleanup people are blind. The statistician can only be partially blinded since he has to know which subjects are in the same Tx group. |
Benefits of Blinding (just a few key ones from Compton's huge list) (one from each) | Participants blinded: less likely to have biased psychological or physical responses to intervention ("I'm supposed to feel better", Hawthorne Effect). Investigators blinded: less likely to transfer attitudes. Assessors blinded: less bias on outcome |
3 steps to Research Intent | 1) Know the problem 2) Determine what you want to evaluate 3) Formulate the process |
The Research Hypothesis definition | Know the question you want answered; restate the question into terms of the null hypothesis (Ho) and the research hypothesis (Ha); think about the corresponding analysis. |
Why emphasize medical ethics? | Nazi experiments -> Nuremberg Code. American experiments w/ African-Americans w/ untreated syphilis (Tuskegee). Terre Haute federal prison testing prophylaxis on prisoners (that was paid). Guatemalan prisoners/psych pts/prostitutes intentionally infected w/ syphilis |
National Research Act of 1974 | Result of revelations about Tuskegee: commission established to protect participants: IRB required, informed consent must be obtained, institutional assurances of compliance and IRB review must be provided to Federal Government. |
The Belmont Report: Definition and Key Ethical Points | established basic ethical principles for research involving human participants. KP: Respect for persons, beneficence, justice |
Fabrication | making up data or results and recording or reporting them |
Falsification | manipulating research materials or research participants, equipment, or processes, or changing, or omitting data or results, such that the research is not accurately represented in the research record |
Plagiarism | appropriating and using as one’s own the documented ideas, processes, results, or words of another without giving appropriate credit |
Direct Research benefits | research participant gets something from participation that he/she would not get otherwise (access to medical care information from a blood test, chance to talk about an important issue, learn a new skill) |
Societal Research benefits | Contribution to basic scientific knowledge, development of new products, techniques, develop detailed information for decision-making |
Privacy | Control over the extent, timing and circumstances of sharing oneself (physically, behaviorally or intellectually) with others; control of how much people know about you |
Participant rights for information privacy in medical research include (6) | 1) receive copy of the privacy notice governing the study, 2) file a complaint, 3) request restrictions to PMI, 4) see/copy records, update/amend records, 5) obtain list of disclosures, 6) expect their personal data is protected. |
Permitted disclosures w/o a release (5) | 1) Public Health activities (Infx disease tracking etc.) 2) Law enforcement/judicial proceedings 3) deceased persons 4) Patient directories 5) any disclosure needed to conduct daily facility operations. |
The Consent Process: Definition and elements of consent process | Informed consent is an educational process taking place btwn prospective participants and investigator. Elements: full disclosure of the nature of the research and its requirements, adequate comprehension, and understanding that the consent is NOT A CONTRACT (it does not waive liability) |
For valid informed consent: (4) | 1) participant must be competent 2) all relevant info about risk/benefit disclosed 3) Participants understand (written at 4th grade level) 4) Agreement is voluntary and free of coercion. "No possible way it can be misunderstood" |
Additional requirements for research involving children (2) | 1) Parents must give permission for children to participate 2) Children must ALSO be informed and MUST ASSENT (must say yes, cannot just fail to say no), must be witnessed. |
Functions of IRB: (7) | Ensure: 1) risks are minimized 2) risks are reasonable in relation to benefit 3) participant knows risks & how likely they are to occur 4) participant selection is fair & equitable 5) participation is voluntary 6) privacy protected 7) accept proposed research |
T/F: Research involving humans or animals may begin w/o approval if they have an IRB exemption. | True. This is the only exception. IRB can recommend suspension at any time. |
T/F: Updated consent form not necessary, even if new risks are discovered | False. Must modify consent form if any new information is gathered during the study |
Ethical considerations in animal research (6) | 1) must contribute to scientific knowledge 2) smallest # appropriate 3) cannot be subjected to unnecessary pain 4) Appropriate care to those maintained longer than a day 5) staff trained on care 6) humane euthanasia must be used |
Health Services Research: Definition and 3 things it can measure | Measures the financing, organization, delivery, and outcomes of health services. Measures: program structure, process, outcomes. |
T/F: Health Services Research measures effectiveness (what works in practice) instead of efficacy (what is measured in clinical trials). | True; Ex: Effectiveness: does this vaccine work for keeping populations immune? Efficacy: Does the vaccine produce antibodies in individual? |
T/F: When scientific basis for accepted practice is in doubt, using outcomes as measures discourages dogmatism and maintains more flexible approach to management | True. |
Disadvantages of using outcomes and measures (2) | 1) Even expert practitioners often unable to specify outcomes of optimal care. 2) How much of observed effect due to health care factors or patient factors. |
2 categories of outcome measures and examples | 1) Dichotomous: sick vs well, alive vs dead, Sx vs ASx. 2) Continuous: General health status, functional status (# of ADLs) |
Measuring outcome vs process. Why would we measure one over another? Example | Sometimes we cannot wait for the completion of the outcome, so we measure the process that directly impacts the outcome. For example, instead of measuring the rate of blindness from diabetic retinopathy, we can measure the process (dilated eye exam rates) |
Health Outcome Domains (The 6 D's) | 1) Death 2) Disease 3) Disability 4) Discomfort 5) Dissatisfaction 6) Destitution |
Differences between biomedical and health services research: Biomedical | Biomedical: narrow eligibility, randomized by individual pt, single component (drugs/devices/procedures), uniformity of intervention is high, outcomes test result centered, blinding is possible, cost control and process of care are easy. |
Differences between biomedical and health services research: Health Services Research | HSR: Broad eligibility, randomized by pt, provider, clinic, hospital etc; multiple intvn components including structure of care, uniformity of intvn is low, outcomes pts centered, process of care and cost control hard, blinding impossible |
Research vs Evaluation | R: production of general knowledge, researcher-derived questions, controlled setting, clear role, published. E: knowledge intended for use, funder-derived questions, action setting, role conflicts, not always published but action usually taken. |
6 Steps of Program Evaluation | Step 1: Engage Stakeholders Step 2: Describe the Program Step 3: Focus the Evaluation Design Step 4: Gather Credible Evidence Step 5: Justify Conclusion Step 6: Ensure Use Of Conclusions and Share Any Lessons Learned |
4 Standards of Program Evaluation | Utility: ensures that info needs of intended users are met. Feasibility: ensures that eval is realistic, prudent, diplomatic, frugal. Propriety: ensures eval conducted legally/ethically. Accuracy: ensures eval reveals and conveys technically accurate info |
T/F: When IDing stakeholders, include those involved w/ program operations, affected/served by the program, intended users of eval findings, supporters and skeptics | True |
4 questions in the narrowing process known as "questioning your topic" | 1) What are the parts of your topic? 2) What larger Hx is your topic part of? 3) What are the categories of your topic? 4) What is the importance of your topic? |
Strict definition vs loose definition for your research question | Strict: statement exploring relationship btwn 2+ concepts (probably 2 is preferable). Loose: a question calling for empirical investigation. |
4 part approach to your hypothesis | 1) Situate study in area ripe for investigation 2) Name a curiosity w/in the seed for focused attention 3) Name something specific about the curiosity that we need to know something about 4) Foreshadow the nature of what you will find out. |
Hypothesis Definition | possible, testable, answer to why and somehow questions, expressed as a relationship between two concepts, variables, or things |
Components of good research question | F.I.N.E.R: Feasible, Interesting, Novel, Ethical, Relevant |
Advantages/Disadvantages of Ecological Studies | A: generate interesting hypotheses, rely on routinely collected (easy to get) data, inexpensive. D: only show correlations not causal relationships. Do not reflect individual associations (ecological fallacy) |
Reflection | Activities producing information oriented toward the self-assessment of persons who are engaged in an experience |
Process Evaluation or Monitoring | Activities producing information about how a procedure, situation, training course, or general program was implemented |
Outcome Evaluation | Activities producing information about what outcomes occurred as a result of a procedure, situation, training course, or general program |
Correlational | Activities producing information about what relationship exists between aspects of a procedure, situation, training course, or general program |
Experimental Research | Activities producing information about why a specific outcome occurred |
Research Logic Models: Definition and Purpose | diagram/flowsheet/schematic to visually present: rationale behind the project, its importance and beneficiaries, overview of research activities planned, results you hope to achieve |
Key Logic Model Components (5) | 1) Activities: actions needed to meet objectives. 2) Resources: inputs needed to design & implement activities. 3) Outputs: the tangible results. 4) Outcome: measurable changes 5) Indicator: data to monitor, measure progress & success of initiative. |
Goal vs Objective | Goal: broad statement of intended outcomes for a project. Objective: statement of intended outcomes that is focused and time-specific of how to achieve goal |
S.M.A.R.T. Objectives | Specific, Measurable, Attainable, Result-oriented and relevant, Time-bound |
T/F: It is okay if there is not a clear stopping point in your study | False, you SHOULD have a clear stopping point and goal |
"Conformance to specification" also known as: | Quality |
Swiss Cheese Model | model in which multiple errors combine to create major adverse events: one error can propagate as it passes through successive safeguards until it becomes a major incident. A hole can be patched at any point |
8 Steps to quality control | 1) establish control criteria 2) determine info relevant to criteria 3) formulate way to collect info 4) collect and analyze info 5) compare collected info w/ established control info 6) decide on quality or no 7) provide info to stakeholder 8) pos reeval |
Total Quality Management and 3 elements of TQM | reduce waste and the cost of poor quality. Elements: 1) pt (customer) focus 2) teamwork 3) scientific approach |
7 Steps of Continuous Quality Improvement | 1) ID area for improvement 2) ID problem 3) establish desired outcome 4) specify improvement steps 5) collect/analyze data about factors preventing achievement 6) corrective action 7) monitor results |
What do you do with quality and RM data? | Learn from it, don't judge by it. |
T/F: Report individual performances at any time during aggregate reporting. | False, only once reporting has achieved at least 90% |
T/F: If you don’t have a quality improvement project that you can report on, Medicare can reduce your reimbursement. | True, have a project going! |
Management philosophy: Lean: based on 2 key themes | 1) continuous elimination of waste (usually time) 2) Respect for people and society; we're pursuing perfection w/ quality. |
Six Sigma focus and goal. Concept | eliminate defects through mistake-proofing. Premise that you establish 6 statistical standard deviations between the process mean and the nearest specification limit. |
5 structured project phases of the Six Sigma Project | 1) Define problem 2) Measure situation/interventions 3) Analyze process/intervention data 4) Improve process based on analysis 5) Control or sustain results |
Is Six Sigma enough? Example where it isn't: | 99.9% accuracy allows 1 defect for each 1000. If you are doing liver biopsies, you could have 1 death for every 1000 you perform. |
Trigger Tools Definition | Proactive and systematic method for IDing and measuring rate of adverse events over time. |
Failure Modes and Effects Analysis (FMEA) definition and 4 steps | prospective technique to prevent problems b/f they occur and reduce chance of adverse harm. 1) Modelling 2) Definition 3) Simulation 4) Analysis |
Risk analysis | is a systematic use of information to identify specific sources of harm (hazards) and to estimate the risk. |
Risk evaluation | compares the estimated risk against given risk criteria using a quantitative or qualitative scale to determine the significance of the risk. |
3 things pts want after a medical error | 1) Honest explanation 2) Remediation 3) Apology |
Primary indication for rapid response teams? | Provider is uncomfortable with the situation |
SBAR | Situation, Background, Assessment, Recommendation |
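The systematic vs. random error cards can be illustrated with a short simulation. This is a minimal Python sketch (the function name and all numeric values are invented for illustration): averaging many readings shrinks random error, but the systematic bias from the stretched tape measure survives in the mean.

```python
import random

def simulate_measurements(true_value, bias, noise_sd, n, seed=0):
    """Simulate n readings of a quantity with a known true value.

    bias models systematic error (same direction every time);
    noise_sd models random error (fluctuation in either direction).
    """
    rng = random.Random(seed)
    return [true_value + bias + rng.gauss(0, noise_sd) for _ in range(n)]

# A stretched tape measure: every reading is ~2 mm too long (systematic),
# plus small read-off fluctuations (random).
readings = simulate_measurements(true_value=100.0, bias=2.0, noise_sd=0.5, n=1000)
mean_reading = sum(readings) / len(readings)
# Averaging cancels the random fluctuations but not the bias:
# mean_reading stays near 102, not 100.
```

Note the design point this demonstrates: more repetitions improve precision (random error) but never accuracy (systematic error); only recalibrating the instrument fixes the bias.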
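The point vs. period prevalence card lends itself to a worked example. A minimal sketch using the card's backache survey framing; the function names and counts are hypothetical:

```python
def point_prevalence(has_condition_now, population):
    """Proportion with the condition at one moment: 'Do you currently have a backache?'"""
    return has_condition_now / population

def period_prevalence(had_condition_in_period, population):
    """Proportion with the condition at any time in a window: 'backache in the past 6 months?'"""
    return had_condition_in_period / population

# Hypothetical cross-sectional survey of 500 people:
# 40 report a backache today; 150 report one within the past 6 months.
point = point_prevalence(40, 500)     # 0.08
period = period_prevalence(150, 500)  # 0.30
```

Period prevalence is always at least as large as point prevalence for the same condition, since everyone counted "today" is also counted in the window.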
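For the retrospective case-control cards: the deck stresses that such studies yield associations, not causation. The odds ratio is the usual association measure for this design (the cards do not name it, so treat that framing as an assumption); here is a minimal sketch with a hypothetical 2x2 exposure table:

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds of exposure among cases divided by odds of exposure among controls."""
    case_odds = exposed_cases / unexposed_cases
    control_odds = exposed_controls / unexposed_controls
    return case_odds / control_odds

# Hypothetical study: 100 cases (30 exposed, 70 not),
# 100 controls (10 exposed, 90 not).
or_exposure = odds_ratio(30, 70, 10, 90)  # ~3.86
# OR > 1 suggests the exposure is associated with case status --
# an association only, not proof that the exposure caused the disease.
```
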
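The Six Sigma cards can be made concrete with a little arithmetic. This sketch converts a sigma level into defects per million opportunities using the conventional 1.5-sigma long-term shift of the process mean (an industry convention, not stated on the cards), and contrasts it with the 99.9%-accuracy (1-per-1000) liver-biopsy example:

```python
import math

def defects_per_million(sigma_level, shift=1.5):
    """One-sided normal tail area beyond the nearest specification limit,
    per million opportunities, after the conventional 1.5-sigma
    long-term drift of the process mean."""
    z = sigma_level - shift
    tail_probability = 0.5 * math.erfc(z / math.sqrt(2))
    return tail_probability * 1_000_000

dpmo_six_sigma = defects_per_million(6.0)  # ~3.4 defects per million
dpmo_999 = 0.001 * 1_000_000               # 99.9% accuracy = 1000 DPMO
# 1000 DPMO on liver biopsies could mean 1 death per 1000 procedures --
# hundreds of times worse than a true six-sigma process.
```
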