George Washington University
The Evaluators' Institute


Course Descriptions:

Analytic Approaches

Qualitative Data Analysis

Instructor: Dr. Patricia Rogers, Professor in Public Sector Evaluation at RMIT University (Royal Melbourne Institute of Technology), Australia

Description: Many evaluators find it challenging to analyze textual, visual, and aural data from interviews, diaries, observations, and open-ended questionnaire items in ways that are rigorous but practical within the time and staffing constraints of real evaluations. Analysis of qualitative data can range from simple enumeration and illustrative use to more detailed analysis requiring more expertise and time. In this class, participants will work through a structured approach to analyzing qualitative data based on an iterative process of considering the purpose of the analysis, reviewing suitable options, and working through interpretations. Techniques include grouping, summarizing, finding patterns, and discovering, developing, and testing relationships. The session will address practical and ethical issues in analyzing and reporting qualitative data, particularly who participates in interpretation, how confidentiality can be maintained, how analysis can be tracked and checked, and standards for good practice in qualitative data analysis. Hands-on exercises for individuals and small groups will be used throughout the class. Exercises will use manual analysis of data, and participants will also be introduced to NVivo and other computer packages that assist analysis. As part of the course, participants will receive the textbook Qualitative Data Analysis by Miles & Huberman (Sage, 1994).

Introduction to Cost-Benefit and Cost-Effectiveness Analysis

Description: The tools and techniques of cost-benefit and cost-effectiveness analysis will be presented, and students will have the opportunity to apply the procedures using actual case studies. Content includes: identification and measurement of costs and benefits; consideration of intangible costs and benefits; calculation of net program benefits; examination of the benefit-to-cost ratio; conducting a sensitivity analysis on assumptions; and understanding and handling risk factors. Public and private sector analysis will be contrasted. Alternative evaluation approaches, such as Value for Money and cost-utility analysis, will also be discussed. Participants will work in groups to assess the various costs and benefits applicable to the case studies.
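
The net-benefit and benefit-to-cost calculations named above can be sketched in a few lines. This is a hypothetical illustration, not course material: the program, dollar figures, and the +/-25% sensitivity range are all invented for the example.

```python
# Core cost-benefit calculations: net program benefits, the
# benefit-to-cost ratio, and a simple sensitivity check on one
# uncertain assumption. All figures are hypothetical.

def net_benefits(benefits, costs):
    """Net program benefits = total benefits - total costs."""
    return sum(benefits) - sum(costs)

def benefit_cost_ratio(benefits, costs):
    """Benefit-to-cost ratio = total benefits / total costs."""
    return sum(benefits) / sum(costs)

# A hypothetical job-training program (dollars).
benefits = [120_000, 45_000]   # e.g., earnings gains, reduced transfers
costs = [80_000, 20_000]       # e.g., operating costs, participant time

print(net_benefits(benefits, costs))        # 65000
print(benefit_cost_ratio(benefits, costs))  # 1.65

# Simple sensitivity analysis: vary the largest benefit by +/-25%.
for factor in (0.75, 1.0, 1.25):
    adjusted = [benefits[0] * factor, benefits[1]]
    print(factor, net_benefits(adjusted, costs))
```

Even under the pessimistic assumption, net benefits here stay positive, which is the kind of conclusion a sensitivity analysis is meant to support or undermine.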

Intermediate Cost-Benefit and Cost-Effectiveness Analysis

Description: The Intermediate Cost-Benefit Analysis course provides a more advanced and detailed review of the principles of social cost and social benefit estimation than is provided in TEI's Introduction to Cost-Benefit and Cost-Effectiveness Analysis. Working with the instructor, students will undertake hands-on estimation of the costs and benefits of actual programs in the computer lab. The objective is to develop the ability both to critically evaluate cost-benefit analyses of programs in the public and nonprofit sectors and to undertake such analyses using basic cost-benefit analysis tools.

Topics covered in the course will include:

I. Principles of Social Cost and Social Benefit Estimation

  1. Social Cost Estimation: (a) Components (capital, operating, administrative); (b) Budgetary and social opportunity cost
  2. Social Benefit Estimation: (a) Social vs. private benefits; (b) Revealed benefit measures (price/cost changes in the primary market, price/cost changes in analogous markets, benefits inferred from market trade-offs, and costs/damages avoided as benefit measures)
  3. Stated Preference Measures: Inferring benefits from survey data
  4. Benefit/Cost Transfer: Borrowing estimates of benefits and costs from elsewhere
  5. Timing of Benefits and Costs: (a) Discounting and net present value; (b) Dealing with inflation; (c) Choosing a discount rate
  6. Presenting Results: (a) Sensitivity analysis (partial sensitivity analysis, best/worst case scenarios, break-even analysis, and Monte Carlo analysis); (b) Present value of net social benefits; (c) Benefit-cost ratio; (d) Internal rate of return
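
The discounting and net present value calculations in topic 5, and the way the bottom line shifts with the choice of discount rate (topic 5c), can be sketched as follows. The cash flows and the two candidate rates are hypothetical, chosen only for illustration.

```python
# Discounting and net present value (NPV) under alternative
# discount rates. All figures are hypothetical.

def present_value(amount, rate, year):
    """Discount a future amount back to year 0."""
    return amount / (1 + rate) ** year

def npv(net_flows, rate):
    """Net present value of a list of (year, net benefit) flows."""
    return sum(present_value(flow, rate, year) for year, flow in net_flows)

# Hypothetical program: $100k cost up front, $40k net benefits in years 1-3.
flows = [(0, -100_000), (1, 40_000), (2, 40_000), (3, 40_000)]

# A higher discount rate shrinks the present value of future benefits,
# so the choice of rate can change the verdict on a program.
for rate in (0.03, 0.07):
    print(rate, round(npv(flows, rate), 2))
```

Here the program passes the NPV test at both rates, but the margin at 7% is far thinner than at 3%, which is exactly the kind of result a sensitivity table is meant to surface.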

II. Social Cost and Social Benefit Estimation in Practice

The use of the above principles of cost and benefit estimation will be illustrated using data drawn from benefit-cost analyses of several actual programs. The cases will be chosen to illustrate the application of the benefit/cost estimation principles to social, health, and environmental programs. Working with the instructor in the computer lab, students will create a benefit-cost analysis template and then use it to estimate social benefits and social costs and to present a benefit-cost bottom line.

This is an intermediate level course. Participants are assumed to have knowledge of and/or experience with cost-benefit and/or cost-effectiveness analysis equivalent to the TEI course Introduction to Cost-Benefit and Cost-Effectiveness Analysis.

Hierarchical Linear Modeling

Instructor: Dr. Gary T. Henry

Description: In many evaluations, program participants are nested within sites, schools, or groups. The nesting is sometimes multi-leveled, such as students within classes within schools within school districts. To make matters more complicated, we frequently have multiple observations taken over time on the same participants, such as years of student achievement scores or repeated measures of mental health status. Hierarchical linear models (HLM) have been developed to analyze these types of data accurately. These models make two important improvements over regular (ordinary least squares) regression. First, the standard errors used for testing statistical significance are corrected for the “nesting” or “clustering” of participants into groups. Participants in a “cluster” are usually more similar to each other than to participants in other “clusters,” and when this is left uncorrected it deflates the standard errors, leading to “false positives”: concluding that a coefficient is statistically significant when it is not. HLM corrects the standard errors and tests of statistical significance for nested data. Second, HLM appropriately apportions the variance that occurs at each level to that level and provides realistic estimates of the effects across levels.
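
The deflated standard errors described above can be quantified with the survey-sampling "design effect," deff = 1 + (m - 1) * ICC, where m is the cluster size and ICC is the intraclass correlation. This is a hypothetical illustration of the idea, not part of the course materials, and the ICC value used is invented.

```python
# How clustering inflates the true sampling variance relative to what
# ordinary (OLS) regression assumes. Hypothetical numbers throughout.
import math

def design_effect(cluster_size, icc):
    """deff = 1 + (m - 1) * ICC: variance inflation due to clustering."""
    return 1 + (cluster_size - 1) * icc

def corrected_se(naive_se, cluster_size, icc):
    """Naive OLS standard error inflated to account for clustering."""
    return naive_se * math.sqrt(design_effect(cluster_size, icc))

# Hypothetical example: 25 students per class, intraclass correlation 0.10.
print(round(design_effect(25, 0.10), 2))       # variance is ~3.4x too small
print(round(corrected_se(0.05, 25, 0.10), 3))  # SE of 0.05 becomes ~0.092
```

Even a modest within-cluster correlation nearly doubles the standard error here, which is why uncorrected analyses of nested data so often produce false positives.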

In this course, we lay a foundation for understanding, using, and interpreting HLM. We begin with multiple regression, including the assumptions that must be fulfilled for the coefficients and tests of statistical significance to be unbiased. Using a step-by-step approach, we will introduce the basic concepts of HLM and the notation that has been developed for presenting HLM models. We will focus on practical aspects of using HLM and on putting findings into language suitable for a report. The main objective of the course is to give participants a better understanding of HLM, how it can improve the analysis of data in many evaluations, and how to read and interpret reports and articles that use it. The course will not offer hands-on experience writing and implementing HLM statistical programs.

Practical Meta-Analysis: Summarizing Results Across Studies (Computer Lab)

Instructor: Dr. David B. Wilson, Professor in the Department of Criminology, Law & Society at George Mason University

Description: Meta-analysis is a technique for encoding, analyzing, and summarizing quantitative findings from research studies. It is used by applied researchers and evaluators to review, synthesize, and interpret existing research on such topics as the effects of interventions, assessment of change, differentiation of diagnostic or demographic groups, relationships between risk variables and subsequent behavior, and the reliability and validity of measurement instruments. This course will provide practical instruction on how to conduct meta-analysis, including (a) specifying the problem and gathering relevant studies, (b) coding procedures, (c) database structures, (d) analyzing meta-analytic databases, and (e) interpreting meta-analysis results. Participants will receive a detailed guide for conducting meta-analysis and a computer disk with applicable software. On the first day, procedures will be explained and implementation discussed. On the second day, participants will apply the analytic techniques hands-on at individual computers. Problems submitted sufficiently in advance by participants will be incorporated into class discussion or, where more appropriate, addressed in consultation after class hours.
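
The inverse-variance weighting at the heart of step (d) can be sketched as follows. The three studies and their effect sizes are hypothetical, and this sketch of a fixed-effect summary is not the software distributed in the course.

```python
# Fixed-effect meta-analysis: combine per-study effect sizes with
# inverse-variance weights. Hypothetical data throughout.
import math

def fixed_effect_summary(effects, variances):
    """Inverse-variance weighted mean effect size and its standard error."""
    weights = [1 / v for v in variances]
    mean = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return mean, se

# Three hypothetical studies: standardized mean differences and variances.
effects = [0.30, 0.50, 0.10]
variances = [0.04, 0.02, 0.08]

mean, se = fixed_effect_summary(effects, variances)
print(round(mean, 3), round(se, 3))
low, high = mean - 1.96 * se, mean + 1.96 * se  # 95% confidence interval
```

Precisely estimated studies (small variances) dominate the summary: the middle study, with the smallest variance, pulls the weighted mean well above the simple average of the three effects.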

Applied Statistics for Evaluators (Computer Lab)

Instructor: Dr. Theodore H. Poister, Professor of Public Management & Policy, Andrew Young School of Policy Studies, Georgia State University

Description: A set of statistical tools often used in program evaluations will be presented, with emphasis on appropriate application of techniques and interpretation of results. This course is designed to "demystify" statistics and provide a basis for understanding how and when to use particular techniques. While the principal concern is practical application in program evaluations rather than the mathematics underlying the procedures, a number of formulas and computations are covered to help students understand how the statistics work. Topics include an introduction to data analysis; simple descriptive statistics; examination of statistical relationships; the basics of statistical inference from sample data; two-sample t-tests; chi-square and associated measures; analysis of variance; and an introduction to simple and multiple regression analysis.

A variety of tabular and graphical output for presenting results of analyses will be explored, and strong emphasis will be placed on interpreting the results of statistical analyses appropriately. The class is conducted in a computer lab where each participant has a computer for illustrating techniques and applying them to a wide range of real-world data sets using SPSS software. No prior knowledge of statistics or SPSS is required. While this is an introductory course, it can also serve as a refresher for those with some training in statistics and for evaluators who work with statistics now but are not comfortable with when and how they should be used.
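
As one illustration of the techniques listed above, the pooled two-sample t statistic can be computed from first principles. The data are hypothetical and the course itself works in SPSS; this pure-Python sketch only shows how the statistic is built.

```python
# Two-sample t test (pooled-variance form), from first principles.
# Hypothetical data throughout.
import math

def two_sample_t(a, b):
    """Pooled two-sample t statistic and its degrees of freedom."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical outcome scores for a program group vs. a comparison group.
program = [12, 15, 14, 16, 13]
comparison = [10, 11, 12, 9, 13]

t, df = two_sample_t(program, comparison)
print(round(t, 2), df)  # 3.0 8
```

The t statistic is simply the difference in group means scaled by its estimated standard error; the larger it is relative to its degrees of freedom, the less plausible a zero true difference becomes.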

Applied Regression Analysis for Evaluators (Computer Lab)

Instructor: Dr. Gary T. Henry, Duncan MacRae ’09 and Rebecca Kyle MacRae Professor of Public Policy, Department of Public Policy and Director of the Carolina Institute for Public Policy at the University of North Carolina at Chapel Hill

Description: Evaluators often face situations where program outcomes vary across participants and they want to explain those differences. To understand the contribution of the program to the outcomes, it is often necessary to control for the influence of other factors. In these situations, regression analysis is the statistical tool evaluators most widely apply. The objective of this course is to describe and provide hands-on experience in conducting regression analysis and to aid participants in interpreting regression results in an evaluation context. The course begins with a review of hypothesis testing (t-tests) and a non-mathematical explanation of how the regression line is computed for bivariate regression. A major focus is on accurately interpreting regression coefficients and tests of significance, including the slope of the line, the t-statistic, and the statistics that measure how well the regression line fits the data. Participants will also learn how to find outliers that may be unduly influencing the results. Participants will have the opportunity to estimate multivariate regression models on cross-sectional data; diagnose the results to determine whether they may be misleading; and test the effects of program participation with pretest-posttest and posttest-only data. Regression-based procedures for testing mediated and moderated effects will be covered.

On the third day, students will conduct an independent analysis and write up the findings. Both peer and instructor feedback will be provided to build skills in interpreting findings and explaining them to interested audiences. Participants will use SPSS software to compute regression analyses and will have the opportunity to apply it to data from an actual evaluation. Students and instructor will work on interpreting the results and determining how to present them to evaluation audiences. The class is conducted in a computer lab where each participant has a computer for applying the content.
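
The way the bivariate regression line is computed can be made concrete with the two closed-form least-squares expressions. The data here are hypothetical, and the course itself uses SPSS; this sketch only illustrates the arithmetic behind the fitted line.

```python
# Least-squares slope and intercept for bivariate regression,
# computed directly from the data. Hypothetical data throughout.

def ols_line(x, y):
    """Return (intercept, slope) for the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Slope: covariance of x and y divided by the variance of x.
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx  # the line passes through the point of means
    return a, b

# Hypothetical: hours of program participation vs. outcome score.
hours = [1, 2, 3, 4, 5]
score = [2, 4, 5, 4, 5]

a, b = ols_line(hours, score)
print(round(a, 2), round(b, 2))  # 2.2 0.6
```

The slope of 0.6 would be read as "each additional hour of participation is associated with 0.6 more points on the outcome," which is the kind of coefficient interpretation the course emphasizes.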

Intermediate Qualitative Data Analysis

Instructor: Dr. Delwyn Goodrick, Evaluation Practitioner/Psychologist, Melbourne, Australia

Description: Data analysis involves creativity, sensitivity, and rigor. In its most basic form, qualitative data analysis involves some form of labeling, coding, and clustering in order to make sense of data collected from evaluation fieldwork, interviews, and/or document analysis. This intermediate level workshop builds on the basic coding and categorizing familiar to most evaluators and extends the array of strategies available to support rigorous interpretations.

This workshop presents an array of approaches to support the analysis of qualitative data, with an emphasis on procedures for the analysis of interview data. Strategies such as thematic analysis, pattern matching, template analysis, process tracing, schema analysis, and qualitative comparative analysis are outlined and illustrated with reference to examples from evaluation and from a range of disciplines, including sociology, education, political science, and psychology.

The core emphasis in the workshop is creating awareness of heuristics that support selection and application of appropriate analytic techniques that match the purpose of the evaluation, type of data, and practical considerations such as resource constraints. While a brief overview of qualitative analysis software is provided, the structure of the workshop focuses on analysis using manual methods. A range of activities to support critical thinking and application of principles is integrated within the workshop program.

Qualitative data analysis and writing go hand in hand. In the second part of the workshop strategies for transforming analysis through processes of description, interpretation and judgement will be presented. These issues are particularly important in the assessment of the credibility of qualitative evidence by evaluation audiences. Issues of quality, including validity, trustworthiness and authenticity of qualitative data are integrated throughout the workshop.

This is an intermediate level course. Participants are assumed to have some knowledge of and/or experience with qualitative data.

Participants will receive a text, Analyzing Qualitative Data: Systematic Approaches by H.R. Bernard and G.W. Ryan (Sage, 2010), to support learning within and beyond the workshop.

Specific issues to be addressed:

  • What are the implications of an evaluator's worldview for selection of qualitative data analysis (QDA) strategies?
  • Are there analytic options that are best suited to particular kinds of qualitative data?
  • How can participant experiences be portrayed through QDA without fracturing the data through formal coding?
  • What types of analysis may be appropriate for particular types of evaluation (program theory, realist, transformative)?
  • What strategies can be used to address interpretive dissent when working in evaluation teams?
  • What are some ways that qualitative and quantitative findings can be integrated in an evaluation report?
  • How can I sell the value of qualitative evidence to evaluation audiences?


Needs Assessment

Instructor: Dr. Ryan Watkins, Associate Professor at The George Washington University, Washington, DC

Description: The earliest decisions that lead to projects or programs are among the most critical in determining long-term success. This phase of project development transforms exciting ideas into project proposals, thereby setting the stage for a variety of actions that will eventually lead (if all goes well) to desirable results. Decisions ranging from whether to propose a sanitation project in South Asia or North Florida to selecting approaches that strengthen school management in South America or Eastern Kentucky are the early decisions from which evaluation results eventually flow.

Needs assessments support this earliest phase of project development with proven approaches to gathering information and making justifiable decisions. In a two-day workshop, learn how needs assessment tools and techniques help you identify, analyze, prioritize, and accomplish the results you really want to achieve. Filled with practical strategies, tools, and guides, the workshop covers both large-scale, formal needs assessments and less-formal assessments that guide daily decisions. The workshop blends rigorous methods and realistic tools that can help you make informed and reasoned decisions. Together, these methods and tools offer a comprehensive, yet realistic, approach to identifying needs and selecting among alternative paths forward.

Going beyond simple surveys, learn how to apply creative and engaging techniques and tools that clarify objectives and lead to innovative solutions. In this workshop we will focus on the pragmatic application of many needs assessment tools, giving participants the opportunity to practice their skills while learning how needs assessment techniques can improve the achievement of desired results. With participants from a variety of sectors and organizational roles, the workshop will illustrate how needs assessments can be of value in a variety of operational, capacity development, and staff learning functions.
