ASSISTments

A school-based program that aims to improve student achievement in mathematics through online homework tools and specialized teacher training.

Program Outcomes

  • Academic Performance

Program Type

  • Academic Services
  • School - Individual Strategies

Program Setting

  • Home
  • School

Continuum of Intervention

  • Universal Prevention

Age

  • Late Adolescence (15-18) - High School
  • Early Adolescence (12-14) - Middle School
  • Late Childhood (5-11) - K/Elementary

Gender

  • Both

Race/Ethnicity

  • All

Endorsements

Blueprints: Promising
What Works Clearinghouse: Meets Standards Without Reservations - Positive Effect

Program Information Contact

Brian Story
Director of Teacher Engagement
ASSISTments
Email: brian.story@assistments.org
Website: https://new.assistments.org/

Program Developer/Owner

Neil and Cristina Heffernan
Worcester Polytechnic Institute / The ASSISTments Foundation


Brief Description of the Program

The ASSISTments program consists of a web-based homework tool that gives immediate feedback and hints to students during homework time while simultaneously providing individualized feedback to students' teachers. Content in the program consists of grade-appropriate mathematics problems with answers, hints, and detailed guidance. These problems are bundled into problem sets that teachers can easily assign to their class. Once students complete the assignments, teachers receive reports on student performance, both individualized and class-level. Program content also includes "Skillbuilders," which are standards-focused, mastery-based assignments teachers can use to target specific standards; students demonstrate mastery when they enter three correct answers in a row. These practice problems can be assigned when learning a new skill or to refresh students' memories. Teachers may also receive professional development training to increase their ability to use the ASSISTments reports.

Outcomes

Primary Evidence Base for Certification

Study 1

Roschelle et al. (2016) and Murphy et al. (2020) found at posttest that, compared to the control group, the intervention group had significantly higher:

  • Math standardized test scores (TerraNova), especially for initially low-achieving students.

Brief Evaluation Methodology

Of the two studies Blueprints has reviewed, one meets Blueprints evidentiary standards (specificity, evaluation quality, impact, dissemination readiness). The study was conducted by independent evaluators.

Primary Evidence Base for Certification

Study 1

Roschelle et al. (2016) and Murphy et al. (2020) evaluated the program in a cluster randomized controlled trial. The 46 participating schools with 3,035 students were randomly assigned to either the intervention condition or a matched standard practice/waitlist control condition. Mathematics test scores at the end of the school year served as the outcome.

Study 1

Roschelle, J., Feng, M., Murphy, R. F., & Mason, C. A. (2016). Online mathematics homework increases student achievement. AERA Open, 2(4), 1-12.


Race/Ethnicity/Gender Details

Subgroup differences in program effects by race, ethnicity, or gender (coded in binary terms as male/female) or program effects for a sample of a specific racial, ethnic, or gender group:

Study 1 (Roschelle et al., 2016; Murphy et al., 2020) used a majority white sample and did not test for treatment moderation by gender or race.

Sample demographics including race, ethnicity, and gender for Blueprints-certified studies:

The student sample in Study 1 (Roschelle et al., 2016; Murphy et al., 2020) was evenly split on gender (49.3% male) but predominantly white (92.6%).

Custom Training and Coaching:

(Note: All support is virtual. Support can be curriculum-agnostic or aligned to Illustrative Math, Eureka Math, or Open Up Resources.)

ASSISTments Fundamentals provides math coaches and teachers the fundamentals they need to get started with a successful and impactful routine, customized to the goals and needs of a school or district.

Monthly coaching hours ensure schools have the ongoing support they need from expert ASSISTments trainers. The coaching engagement can be structured to support math learning goals with ASSISTments and ensure teachers get the ongoing support they need to implement with impact.

Custom Workshops can be designed as hands-on interactive workshops that support ongoing implementation and use of ASSISTments to shift teacher practice. Focus areas include fostering a growth mindset and positive math learning culture, engaging students in meaningful math discussions, and data-driven instructional planning.

School Essentials takes the impact of ASSISTments from the individual classroom to the entire school, giving every math teacher the tools to make an impact on learning. The School Essentials package provides a year-long trajectory of comprehensive, flexible support for teachers, instructional leaders and administrators. Instructional leaders will have the data they need to effectively support teachers, and teachers will engage in regular collaborative, data-driven discussions, supported by real-time student learning insights. Mistakes will be embraced as part of the learning process, building a positive culture around math.

Additional Website Resources: https://new.assistments.org/

  1. Regular monthly webinars that support new and experienced users, as well as a robust searchable library of webinars
  2. ASSISTments Certified Educator Program: on-demand modules that get teachers fully up to speed on using ASSISTments with impact, on their own schedule
  3. Teacher Corner: authentic artifacts from real teachers' classrooms that support effective use of ASSISTments
  4. User Resources and FAQs that support new and experienced teachers as they learn how to use ASSISTments and develop their routines

Source: Washington State Institute for Public Policy
All benefit-cost ratios are the most recent estimates published by the Washington State Institute for Public Policy (WSIPP) for Blueprints programs implemented in Washington State. These ratios are based on (a) meta-analytic estimates of effect size and (b) monetized benefits and calculated costs for programs as delivered in the State of Washington. Caution is recommended in applying these estimates of the benefit-cost ratio to any other state or local area. They are provided as an illustration of the benefit-cost ratio found in one specific state. When feasible, local costs and monetized benefits should be used to calculate expected local benefit-cost ratios. The formula for this calculation can be found on the WSIPP website.

Start-Up Costs

Initial Training and Technical Assistance

ASSISTments Fundamentals provides math coaches and teachers the fundamentals they need to get started with a successful and impactful routine, customized to the goals and needs of a school or district. (3 hours, $2,000 for up to 30 participants)

Additional free resources may be found online: https://new.assistments.org/

Curriculum and Materials

The ASSISTments platform is provided at no cost.

Licensing

No information is available

Other Start-Up Costs

Students need internet access to use ASSISTments. The program works best with the most recent version of either Chrome or Firefox. ASSISTments is compatible with mobile devices, tablets, Chromebooks, and laptops, and it integrates with Google Classroom and Canvas.

Intervention Implementation Costs

Ongoing Curriculum and Materials

No information is available

Staffing

Math teachers incorporate ASSISTments as part of regular in-class assignments or homework for students.

Other Implementation Costs

No information is available

Implementation Support and Fidelity Monitoring Costs

Ongoing Training and Technical Assistance

Monthly coaching hours ensure schools have the ongoing support they need from expert ASSISTments trainers. The coaching engagement can be structured to support math learning goals with ASSISTments and ensure teachers get the ongoing support they need to implement with impact. ($4,000 for up to 10 hours and 30 participants)

Custom Workshops can be designed as hands-on interactive workshops that support ongoing implementation and use of ASSISTments to shift teacher practice. Focus areas include fostering a growth mindset and positive math learning culture, engaging students in meaningful math discussions, and data-driven instructional planning. (Price varies)

School Essentials takes the impact of ASSISTments from the individual classroom to the entire school, giving every math teacher the tools to make an impact on learning. The School Essentials package provides a year-long trajectory of comprehensive, flexible support for teachers, instructional leaders and administrators. Instructional leaders will have the data they need to effectively support teachers, and teachers will engage in regular collaborative, data-driven discussions, supported by real-time student learning insights. Mistakes will be embraced as part of the learning process, building a positive culture around math. (3 hours, $2,000)

Fidelity Monitoring and Evaluation

No information is available

Ongoing License Fees

No information is available

Other Implementation Support and Fidelity Monitoring Costs

No information is available

Other Cost Considerations

No information is available

Year One Cost Example


No information is available

Program Developer/Owner

Neil and Cristina Heffernan
Worcester Polytechnic Institute / The ASSISTments Foundation
Email: cristina.heffernan@assistments.org
Website: new.assistments.org

Program Specifics

Program Goals

A school-based program that aims to improve student achievement in mathematics through online homework tools and specialized teacher training.

Population Demographics

ASSISTments offers content ranging from 1st grade through high school but is primarily recommended for students in grade 4 and above. The Blueprints-certified study included 7th-grade students only.

Other Risk and Protective Factors

Individual

  • Mathematics comprehension skills (Program Focus)

Risk/Protective Factor Domain

  • Individual

Description of the Program

In the study by Roschelle et al. (2016), teachers participated in three days of professional development prior to implementation and received six hours of coaching spread across three visits during the school year from an ASSISTments coach.

Theoretical Rationale

Consistent individual practice with mathematics is positively linked to performance, but homework typically does not provide assistance or feedback in a sufficiently timely manner to help a struggling student. The online, real-time homework tool allows for help and feedback, ostensibly improving mathematics practice.

Theoretical Orientation

  • Skill Oriented

Outcomes (Brief, over all studies)

Primary Evidence Base for Certification

Study 1

Roschelle et al. (2016) and Murphy et al. (2020) found at posttest that the intervention group scored significantly higher on the mathematics assessment tool than the control group. The effect was strongest for low-achieving students, but even high-achieving students had significant gains over control group students.

Mediating Effects

Study 1 (Murphy et al., 2020) presented mediation tests that found no significant indirect effects of the program on math test scores via a measure of targeted homework review practices by teachers.

Effect Size

Study 1 (Roschelle et al., 2016) reported a small effect size (Hedges' g = .18).
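
For context, Hedges' g is a standardized mean difference: the difference between treatment and control posttest means divided by the pooled standard deviation, multiplied by a small-sample correction factor. The study reports only the resulting value; a standard textbook form of the statistic (supplied here for reference, not taken from the study) is:

    g = J \cdot \frac{\bar{x}_T - \bar{x}_C}{s_p},
    \qquad
    s_p = \sqrt{\frac{(n_T - 1)\, s_T^2 + (n_C - 1)\, s_C^2}{n_T + n_C - 2}},
    \qquad
    J \approx 1 - \frac{3}{4(n_T + n_C) - 9}

where the subscripts T and C denote the treatment and control groups. By the usual convention, values near .2 are considered small effects.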

Generalizability

One study meets Blueprints standards for high-quality methods with strong evidence of program impact (i.e., is "certified" by Blueprints): Study 1 (Roschelle et al., 2016; Murphy et al., 2020). The sample for the study included only seventh-grade students.

Study 1 took place in the state of Maine and compared treatment schools to standard practice control schools.

Potential Limitations

Additional Studies (not certified by Blueprints)

Study 2 (Koedinger et al., 2010)

  • QED with no matching
  • No specifics about outcome measure
  • Incorrect level of analysis
  • Difference between conditions at baseline
  • No tests for differential attrition

Koedinger, K. R., McLaughlin, E. A., & Heffernan, N. T. (2010). A quasi-experimental evaluation of an on-line formative assessment and tutoring system. Journal of Educational Computing Research, 43(4), 489-510. doi:10.2190/EC.43.4.d

References

Study 1

Certified: Roschelle, J., Feng, M., Murphy, R. F., & Mason, C. A. (2016). Online mathematics homework increases student achievement. AERA Open, 2(4), 1-12.

Murphy, R., Roschelle, J., Feng, M., & Mason, C. A. (2020). Investigating efficacy, moderators and mediators for an online mathematics homework intervention. Journal of Research on Educational Effectiveness, 13(2), 235-270. doi:10.1080/19345747.2019.1710885

Study 2

Koedinger, K. R., McLaughlin, E. A., & Heffernan, N. T. (2010). A quasi-experimental evaluation of an on-line formative assessment and tutoring system. Journal of Educational Computing Research, 43(4), 489-510. doi:10.2190/EC.43.4.d

Study 1

Roschelle et al. (2016) presented main effects, while Murphy et al. (2020) primarily explored how teacher practices and behaviors mediated the impacts of the intervention.

Summary

Roschelle et al. (2016) and Murphy et al. (2020) evaluated the program in a cluster randomized controlled trial. The 46 participating schools with 3,035 students were randomly assigned to either the intervention condition or a matched standard practice/waitlist control condition. Mathematics test scores at the end of the school year served as the outcome.

Roschelle et al. (2016) and Murphy et al. (2020) found at posttest that, compared to the control group, the intervention group had significantly higher:

  • Math standardized test scores (TerraNova), especially for initially low-achieving students.

Evaluation Methodology

Design:

Recruitment: Participating schools (n=46) were recruited from throughout Maine to represent the range of school sizes and mathematics achievement levels across the state. Schools were recruited using a consistent set of messages and presentations, which were available on a website and as handouts. All participating schools self-selected into the study, expressing a desire to use the ASSISTments technology either immediately or after serving as a waitlist control. Two cohorts of 7th-grade classes participated, one starting in fall 2014 and one in fall 2015.

A generalizability test presented by Murphy et al. (2020) indicated that the sample generalizes to school districts in Maine but not most other states.

Assignment: Participating schools were first blocked on school type (K-8 or other) and then matched into pairs on prior standardized test achievement. Within each pair, one school was randomly assigned to the treatment condition and the other to a standard practice/waitlist control group. The control group received the ASSISTments intervention two years later, after all assessments had been completed. Prior to assignment, one school withdrew, and its pair was rematched and randomly assigned. After assignment, one treatment school dropped out of the study, and its matched pair was dropped as well. Ultimately, 45 schools were assigned and 43 actually participated in the intervention and assessment (22 treatment schools and 21 control schools). The sample included 87 teachers and 3,035 students.

Attrition: With the first school year devoted to teacher professional development and program support, the baseline came at the start of the second year and the posttest came at the end of the second year. Specifically, the assessments occurred in October 2014 and May 2015 for the first cohort and in October 2015 and May 2016 for the second cohort. At the school level, there was an attrition rate of 6.5%. Roschelle et al. (2016) reported that, out of the original 3,035 students in participating schools, 307 either changed schools or skipped the assessment while 122 new students moved into a participating school, resulting in a final analytic sample of 2,850 (94%). Murphy et al. (2020) reported slightly lower figures of 116 new students and a final analytic sample of 2,769 (91%); the difference likely stems from including data from 87 teachers.

Sample:

The student sample was evenly split on gender (49.3% male) but predominantly white (92.6%), with 38.7% of students receiving free or reduced-price lunch, and 12.2% of the student body attending specialized education classes. All students had access to laptops.

Measures:

Student Academic Achievement was measured using different standardized tests at baseline and posttest. The New England Common Assessment Program (NECAP) was used to establish baseline (6th grade) scores in mathematics and reading. At posttest, similar skills were assessed using the TerraNova standardized test, which is widely used across the country as it aligns with the Common Core Standards for Mathematics and reflects national curricula.

Analysis:

The primary analysis in Roschelle et al. (2016) used a two-level hierarchical linear model, with students nested within schools. Some schools had only one mathematics teacher, eliminating the need to nest students within classrooms, though models were also run with adjustments for a third level of clustering. Analyses also controlled for baseline scores, matching pairs, and available individual- and school-level demographic characteristics.

The analysis in Murphy et al. (2020) used a three-level hierarchical linear model, with students nested within classrooms and classrooms nested within schools. Covariates included classroom-level measures such as the mean prior math score for all students, school-level measures such as indicator variables for the matched pairs, and student-level measures such as sixth-grade math score.
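
For readers who want the general shape of such a model, below is a minimal sketch (not the authors' code) of a two-level random-intercept model in Python with statsmodels; the data file and column names (posttest, pretest, treatment, pair_id, school_id) are hypothetical placeholders.

    # Minimal two-level hierarchical linear model: students nested within
    # schools, with a random intercept for each school.
    # Assumes a hypothetical student-level file with columns:
    #   posttest, pretest, treatment (0/1), pair_id, school_id
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("assistments_students.csv")  # hypothetical file

    # Fixed effects: treatment indicator, baseline score, and matched-pair
    # dummies. Random effect: school intercept (the clustering unit).
    model = smf.mixedlm(
        "posttest ~ treatment + pretest + C(pair_id)",
        data=df,
        groups=df["school_id"],
    )
    result = model.fit()
    print(result.summary())

The coefficient on the treatment indicator estimates the adjusted program effect; standardizing such an estimate is what yields effect sizes like the Hedges' g values reported below.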

Intent-to-Treat:

All students in the target grade were assessed unless they moved away or were out of school on the testing day, though researchers omitted a control group school whose treatment group match dropped out of the study post-assignment.

Outcomes

Implementation Fidelity:

Roschelle et al. (2016) reported that coaches observed each teacher in his or her classroom at least three times during the intervention, but the observation findings were not reported. In terms of duration, students averaged 14 hours of use, much less than the 24 hours that evaluators anticipated students would use ASSISTments over the course of the semester.

Murphy et al. (2020) reported that the median classroom used the program for 22 weeks, about 55% of the available instructional weeks, which is considered a meaningful duration. Teachers opened 75% of the reports available to them, and students attempted approximately 33 program problems per week.

Baseline Equivalence:

Roschelle et al. (2016) reported no significant differences between conditions on six baseline measures. Murphy et al. (2020) reported two significant differences in eight tests. Treatment students were less likely than control students to be ethnic minorities and to receive free and reduced-price lunch (Table 2, n = 3035).

Differential Attrition:

Roschelle et al. (2016) reported only that the attrition rate in the two conditions differed by 5.6% and did not report whether this difference was significant. The article reported no tests for differential attrition by student characteristics or for baseline-by-condition attrition. However, on request the authors provided further differential attrition results, which showed no significant baseline-by-condition interactions for four math and reading baseline variables and three demographic characteristics. There was one significant baseline-by-condition interaction, for the ethnic minority characteristic.

Murphy et al. (2020) provided several additional details on differential attrition. First, they reported overall student-level attrition of 12.6% and a condition difference in attrition of 6.9%. The two figures in combination fall at the boundary between (a) acceptable and (b) acceptable under optimistic assumptions in relation to potential bias as specified by What Works Clearinghouse. Second, joiners (students entering the schools after the start of the program) showed one significant difference in eight tests for baseline equivalence: the treatment group had more ethnic minorities (see Table 3). Third, for the analysis sample that excluded dropouts and included joiners, tests for baseline equivalence of conditions showed one significant difference in eight tests: the treatment group had fewer students qualifying for free and reduced-price lunch, a difference that also emerged in tests of baseline equivalence using the full randomized sample.

Posttest:

Roschelle et al. (2016) found with the two-level model that the intervention group scored significantly higher on the posttest math achievement test than the control group (Hedges' g = .18). After splitting students into low- and high-achievement groups, the effect was larger for low-achieving students (g = .29), though both low- and high-achieving students had significant gains over control group students.

Murphy et al. (2020) replicated these results in the main effects analysis with the three-level model (Hedges' g = .22). Moderation tests found a statistically significant negative interaction between prior math scores and the treatment condition, favoring students who entered the study with lower prior math achievement. Mediation tests found no significant indirect effects of the program on math test scores via a measure of targeted homework review practices by teachers.

Long-Term:

The posttest assessment occurred immediately post-intervention.

Study 2

Summary

Koedinger et al. (2010) used a quasi-experimental design to examine 1,240 seventh-grade students from four Massachusetts middle schools. Three schools using the program were non-randomly selected as the intervention group, and one school not using the program was selected as the control group. A state standardized test score at the end of grade seven served as the outcome.

Koedinger et al. (2010) found that, compared to students in the control school and after adjustment for pretest scores, students in the three intervention schools had:

  • Higher standardized test scores.

Evaluation Methodology

Design:

Recruitment: The sample included seventh-grade students from four middle schools in an urban school district in Massachusetts. The eligible sample of 1,344 students was reduced to 1,240 by retaining those (1) with MCAS (Massachusetts Comprehensive Assessment System) scores from both 2006 (6th-grade test) and 2007 (7th-grade test), and (2) whose math teacher assignment could be determined either from ASSISTments use or from other test data.

Assignment: This quasi-experimental study non-randomly selected three schools that had adopted the program to serve as the intervention group (n = 947 students) and one school that had not adopted the program (because it had too few computers) to serve as the control group (n = 293 students). There was no matching. Intervention students used the ASSISTments program, while control students worked on traditional textbook activities.

Assessments/Attrition: The posttest came at the end of seventh grade, or after one year of the program. The analysis sample with complete outcome data was 92% of the eligible sample.

Sample:

The study did not present figures on race, ethnicity, gender, or free-lunch eligibility, but it noted that regular students made up 79% of the sample and special education students made up 21% of the sample.

Measures:

The single outcome measure came from the score obtained on the Massachusetts Comprehensive Assessment System, a standardized knowledge test. No other information was provided, and it is unclear if the test was limited to math items.

Analysis:

The analysis used analysis of covariance models with pretest scores and student group (regular vs. special education) as covariates. The models did not adjust for clustering within schools, the unit of assignment.
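
As an illustration (a sketch under stated assumptions, not the authors' code), this kind of ANCOVA can be fit as an ordinary least squares regression in Python with statsmodels; the file and column names (mcas_2007, mcas_2006, condition, sped) are hypothetical.

    # Minimal ANCOVA sketch: posttest MCAS score regressed on condition,
    # with pretest score and student group as covariates.
    # Assumes a hypothetical student-level file with columns:
    #   mcas_2007 (posttest), mcas_2006 (pretest), condition (0/1),
    #   sped (0 = regular education, 1 = special education)
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("mcas_students.csv")  # hypothetical file

    # Note: like the original analysis, this ignores clustering of students
    # within schools (the unit of assignment), which tends to understate
    # standard errors.
    model = smf.ols("mcas_2007 ~ condition + mcas_2006 + sped", data=df).fit()
    print(model.summary())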

Intent-to-Treat: The analysis used all students with complete data, though it did not impute missing data.

Outcomes

Implementation Fidelity:

Not examined.

Baseline Equivalence:

The study presented no figures for demographic measures, but the authors stated that the two conditions "had similar student characteristics (race, gender, limited English proficiency (LEP), free lunch eligibility, special education students) and teacher/school characteristics (licensed in teacher assignment, percent of core classes taught by 'highly qualified teachers,' student/teacher ratio)." For the outcome, the authors stated, and Table 1 confirmed, that the pretest score was substantially higher for the control group.

Differential Attrition:

Not examined.

Posttest:

The adjusted mean from the analysis of covariance was significantly higher for the intervention group students than the control group students (d = .23). Moderation tests showed a larger effect for special education students than for regular students.

Long-Term:

Not examined.

Contact

Blueprints for Healthy Youth Development
University of Colorado Boulder
Institute of Behavioral Science
UCB 483, Boulder, CO 80309

Email: blueprints@colorado.edu

Blueprints for Healthy Youth Development is
currently funded by Arnold Ventures (formerly the Laura and John Arnold Foundation) and historically has received funding from the Annie E. Casey Foundation and the Office of Juvenile Justice and Delinquency Prevention.