Evaluation Expert
Uses data from multiple sources to demonstrate the impact of learning solutions and identify opportunities for improvement.
The Basics
Work Overview
Director of Action, DIA Higher Education Collaborators (DIA)
Jun 2023 – May 2025 (2 yrs)
DIA’s mission is to improve student learning and retention in higher education by helping institutions move from “data to information, information to action”. DIA not only provides college partners with data on incoming students’ non-cognitive skills (e.g., time management, grit, help-seeking) but also supports partners in using these data to “improve cultures, strategies, and practices” among faculty, staff, and senior leaders. As Director of Action, I guided partners in the development, implementation, and evaluation of customized use-of-results plans, including (where needed) plans for product user training and support.
Associate Research Scientist, Curriculum Associates (CA)
Mar 2021 – May 2023 (2 yrs 3 mos)
CA is committed to “making classrooms better for teachers and students” by providing curriculum, assessment, and personalized instruction products for nearly half of the nation’s learners in grades K–8. As an Associate Research Scientist, I worked with internal teams (e.g., product, marketing, sales) to answer questions about the implementation and effectiveness of CA products and to develop external-facing materials (e.g., implementation guides, efficacy reports) for current and potential users.
Lead Assessment Consultant, JMU Center for Assessment and Research Studies (CARS)
Aug 2017 – May 2020 (2 yrs 10 mos)
Composed of faculty and graduate student assessment consultants, CARS guides all student learning outcomes assessment efforts at James Madison University (general education, academic degree, and co-curricular/student affairs). To achieve its vision of “inspiring and empowering faculty and staff to make evidence-based decisions to enhance student learning and development”, CARS offers consultation services for programs and departments, provides assessment training for faculty and staff, and works closely with senior leaders to create systems and processes that support learning assessment. As Lead Assessment Consultant (Doctoral Graduate Assistant), I worked primarily with co-curricular/student affairs programs, training staff and devising strategies to address cultural and organizational barriers to learning outcomes assessment.
Education Overview
Assessment & Measurement (Ph.D.), James Madison University
Degree Conferred: May 2020
The A&M program adheres to a “practitioner-scientist, applied model” of graduate training. In class, students receive instruction on program assessment (measuring the effectiveness of learning programs or interventions), measurement (developing and evaluating the quality of learning assessment instruments), and quantitative methods (research design and statistics). Outside of class, students practice what they’ve learned through assessment consultation work (20 hrs/wk) and applied research projects with university partners.
College Student Personnel Administration (M.Ed.), James Madison University
Degree Conferred: May 2016
The CSPA program “prepares graduates to excel as higher education educators” working in student services offices (e.g., Career Advising, Residence Life, Disability Services) at colleges and universities. Coursework explores young adult development, career development, counseling techniques, and organizational theories, among other topics. Outside of class, students have ample opportunity to apply their knowledge through two assistantships (20 hrs/wk) and two 150-hour practicum experiences in student services offices. As a CSPA student, I worked in Residence Life, Career & Academic Advising, and Multicultural Student Services.
Sociology & Public Policy (B.A.), University of North Carolina at Chapel Hill
Degree Conferred: May 2013
Completed a double major in Sociology and Public Policy, both with a strategic focus on education through electives such as SOCI 423: Sociology of Education and PLCY 530: Education Problems & Policy Solutions.
Relevant Experience
(See Comprehensive Portfolio to view work samples and case studies.)
- DIA: Evaluated the impact of trainings and other implementation support efforts on learner and organizational outcomes using a mix of quantitative and qualitative data collection methods (e.g., self-report surveys, focus groups, UX analytics, supervisor observations) and practical study designs to balance cost, feasibility, and the trustworthiness of results.
- DIA: Created custom reports for higher education clients designed to encourage use of results by highlighting actionable insights and providing recommended next steps in a user-friendly format.
- CA: Designed and implemented research studies (correlational, quasi-experimental, and qualitative) to evaluate the effectiveness of Curriculum Associates educational products.
  - Conducted a mixed-methods study to identify promising practices of schools that exceeded expectations (with respect to student learning growth) during the COVID-19 pandemic; interviewed district/school leaders from 16 schools across the country and used qualitative analysis methods to identify six key themes.
  - Wrote SQL queries to efficiently retrieve data from large databases (millions of records) and create cleaned datasets, used SAS and R to perform quantitative analyses (e.g., multiple/logistic regression, HLM, propensity score matching), engaged in rigorous QC processes to ensure the trustworthiness of results, and translated research findings into digestible insights for non-technical audiences.
  - Oversaw the conceptualization and kick-off of a $1M quasi-experimental research project to evaluate the effectiveness of CA’s premier math curriculum on academic outcomes.
- CARS: Consulted with faculty and student support staff at James Madison University to support the development of high-quality learning interventions and facilitate the use of assessment results to improve student learning. Common consultation topics included the following:
  - Writing measurable student learning outcomes
  - Developing intentional, evidence-informed learning interventions
  - Selecting and designing high-quality assessment measures
  - Interpreting, communicating, and using assessment results
- CARS: Designed, facilitated, and assessed an award-winning multi-day assessment workshop for educators and administrators at James Madison University that increased participants’ assessment knowledge, skills, and confidence.
  - Workshop assessed using a pre-post design and a locally developed assessment measure with items mapped to key learning outcomes.
  - Results used to make targeted recommendations for improvement (recommendations implemented the following year).
Other Qualifications
- Relevant Coursework (Ph.D.)
  - Assessment Methods & Instrument Design
  - Performance Assessment (rubric design, development, and use)
  - Computer Assisted Data Management & Analysis (statistical programming)
  - Quasi-Experimental Research Methods
  - Mixed Methods & Qualitative Research
  - Multivariate Statistics
- Technical Skills
  - Proficient in SAS and SPSS statistical analysis software.
  - Proficient in use of Excel for data cleaning, manipulation, and visualization (small datasets).
  - Additional skills in SQL (intermediate) and R (basic) programming languages for data management, cleaning, and analysis.
- Published six articles related to teaching, learning, and assessment in higher education, including an article describing the assessment and redesign of an academic probation program for college students.
- Facilitated dozens of workshops at regional, national, and international conferences on topics related to teaching, learning, and learning assessment.
