ASSISTments

A school-based program that aims to improve student achievement in mathematics through online homework tools and specialized teacher training.


Custom Training and Coaching

(Note: All support is virtual. Support can be curriculum-agnostic or aligned to Illustrative Math, Eureka Math, or Open Up Resources.)

ASSISTments Fundamentals provides math coaches and teachers the fundamentals they need to get started with a successful and impactful routine, customized to the goals and needs of a school or district.

Monthly coaching hours give schools ongoing support from expert ASSISTments trainers. The coaching engagement can be structured to support math learning goals with ASSISTments and to ensure teachers get the help they need to implement with impact.

Custom Workshops can be designed as hands-on interactive workshops that support ongoing implementation and use of ASSISTments to shift teacher practice. Focus areas include fostering a growth mindset and positive math learning culture, engaging students in meaningful math discussions, and data-driven instructional planning.

School Essentials takes the impact of ASSISTments from the individual classroom to the entire school, giving every math teacher the tools to make an impact on learning. The School Essentials package provides a year-long trajectory of comprehensive, flexible support for teachers, instructional leaders and administrators. Instructional leaders will have the data they need to effectively support teachers, and teachers will engage in regular collaborative, data-driven discussions, supported by real-time student learning insights. Mistakes will be embraced as part of the learning process, building a positive culture around math.

Additional Website Resources: https://new.assistments.org/

  1. Regular monthly webinars that support new and experienced users, as well as a robust searchable library of webinars
  2. ASSISTments Certified Educator Program: on-demand modules to get teachers fully up to speed on using ASSISTments with impact, on their own schedule
  3. Teacher Corner: authentic artifacts from real teachers' classrooms that support effective use of ASSISTments
  4. User Resources and FAQs to support new and experienced teachers as they learn how to use ASSISTments and develop their routines


Start-Up Costs

Initial Training and Technical Assistance

ASSISTments Fundamentals (described above under Custom Training and Coaching): 3 hours, $2,000 for up to 30 participants.

Additional free resources may be found online: https://new.assistments.org/

Curriculum and Materials

The ASSISTments platform is provided at no cost.

Licensing

No information is available

Other Start-Up Costs

Students need internet access to use ASSISTments. The program works best with the most recent version of Chrome or Firefox and is compatible with mobile devices, tablets, Chromebooks, and laptops, as well as with Google Classroom and Canvas.

Intervention Implementation Costs

Ongoing Curriculum and Materials

No information is available

Staffing

Math teachers incorporate ASSISTments as part of regular in-class assignments or homework for students.

Other Implementation Costs

No information is available

Implementation Support and Fidelity Monitoring Costs

Ongoing Training and Technical Assistance

Monthly coaching hours with expert ASSISTments trainers (described above under Custom Training and Coaching): $4,000 for up to 10 hours and 30 participants.

Custom Workshops (described above under Custom Training and Coaching): price varies.

School Essentials (described above under Custom Training and Coaching): 3 hours, $2,000.

Fidelity Monitoring and Evaluation

No information is available

Ongoing License Fees

No information is available

Other Implementation Support and Fidelity Monitoring Costs

No information is available

Other Cost Considerations

No information is available

Year One Cost Example


No information is available

Program Developer/Owner

Neil and Cristina Heffernan
Worcester Polytechnic Institute / The ASSISTments Foundation
Email: cristina.heffernan@assistments.org
Website: new.assistments.org

Program Outcomes

  • Academic Performance

Program Specifics

Program Type

  • Academic Services
  • School - Individual Strategies

Program Setting

  • Home
  • School

Continuum of Intervention

  • Universal Prevention

Program Goals

A school-based program that aims to improve student achievement in mathematics through online homework tools and specialized teacher training.

Population Demographics

ASSISTments offers content ranging from 1st grade through high school but is primarily recommended for students in grades 4 and above. The Blueprints-certified study included 7th-grade students only.

Target Population

Age

  • Late Adolescence (15-18) - High School
  • Early Adolescence (12-14) - Middle School
  • Late Childhood (5-11) - K/Elementary

Gender

  • Both

Race/Ethnicity

  • All

Race/Ethnicity/Gender Details

Roschelle et al. (2016) used a majority white sample and did not test for treatment moderation by gender or race.

Other Risk and Protective Factors

Individual

  • Mathematics comprehension skills (Program Focus)

Risk/Protective Factor Domain

  • Individual



Description of the Program

The ASSISTments program consists of a web-based homework tool that gives immediate feedback and hints to students during homework time while simultaneously providing individualized feedback to students' teachers. Content in the program consists of grade-appropriate mathematics problems with answers, hints, and detailed guidance. These problems are bundled into problem sets that teachers can easily assign to their class. Once students complete the assignments, teachers receive reports on student performance, both individualized and class-level. Program content also includes "Skillbuilders," which are standards-focused, mastery-based assignments teachers can use to target specific standards; students demonstrate mastery when they enter three correct answers in a row. These practice problems can be assigned when learning a new skill or to refresh students' memories. Teachers may also receive professional development training to increase their ability to use the ASSISTments reports.

In the study by Roschelle et al. (2016), teachers participated in three days of professional development prior to implementation and received six hours of coaching spread across three visits during the school year from an ASSISTments coach.
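
As a rough illustration of the Skillbuilder mastery rule described above (a student demonstrates mastery after three correct answers in a row), the minimal Python sketch below encodes that rule; the function name and data format are hypothetical and are not part of the ASSISTments platform.

    def has_mastered(responses, streak_needed=3):
        """Return True once the student has answered `streak_needed` problems correctly in a row.

        `responses` is a hypothetical list of booleans, one per attempted problem,
        in the order the student answered them (True = correct).
        """
        streak = 0
        for correct in responses:
            streak = streak + 1 if correct else 0
            if streak >= streak_needed:
                return True
        return False

    # Example: wrong, right, right, wrong, right, right, right -> mastered
    print(has_mastered([False, True, True, False, True, True, True]))  # True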

Theoretical Rationale

Consistent individual practice with mathematics is positively linked to performance, but homework typically does not provide assistance or feedback in a sufficiently timely manner to help a struggling student. The online, real-time homework tool allows for help and feedback, ostensibly improving mathematics practice.

Theoretical Orientation

  • Skill Oriented

Brief Evaluation Methodology

Study 1

Roschelle et al. (2016) and Murphy et al. (2020) evaluated the program in a cluster randomized controlled trial. The 46 participating schools with 3,035 students were randomly assigned to either the intervention condition or a matched standard practice/waitlist control condition. Mathematics test scores at the end of the school year served as the outcome.

Outcomes (Brief, over all studies)

Study 1

Roschelle et al. (2016) and Murphy et al. (2020) found at posttest that the intervention group scored significantly higher on the mathematics assessment tool than the control group. The effect was strongest for low-achieving students, but even high-achieving students had significant gains over control group students.

Outcomes

Study 1

Roschelle et al. (2016) and Murphy et al. (2020) found at posttest that, compared to the control group, the intervention group had significantly higher:

  • Math standardized test scores, especially for initially low-achieving students (Terra Nova testing)

Mediating Effects

Murphy et al. (2020) presented mediation tests that found no significant indirect effects of the program on math test scores via a measure of targeted homework review practices by teachers.

Effect Size

Roschelle et al. (2016) reported a small effect size relative to the control group, with Hedges' g = .18.
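
For reference, the sketch below shows how a Hedges' g is conventionally computed from two-group summary statistics; the input values are placeholders chosen only to illustrate the calculation, and the study's published estimate came from a covariate-adjusted hierarchical model rather than this simple formula.

    import math

    def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
        """Standardized mean difference with the small-sample (Hedges) correction."""
        # Pooled standard deviation across the two groups
        pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
        d = (mean_t - mean_c) / pooled_sd   # Cohen's d
        j = 1 - 3 / (4 * (n_t + n_c) - 9)   # small-sample correction factor
        return d * j

    # Placeholder summary statistics, not the study's data
    print(round(hedges_g(52.0, 50.0, 11.0, 11.0, 1400, 1450), 2))  # 0.18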

Generalizability

The program was tested on approximately 3,000 middle school students in Maine, a state that ensures all middle school students receive a laptop to take home. Findings would not be generalizable to states that do not provide computers to students. Tests found that the results generalized to other school districts in Maine but not to most other states.

Endorsements

Blueprints: Promising
What Works Clearinghouse: Meets Standards Without Reservations - Positive Effect

Program Information Contact

Brian Story
Director of Teacher Engagement
ASSISTments
Email: brian.story@assistments.org
Website: https://new.assistments.org/

References

Study 1

Certified: Roschelle, J., Feng, M., Murphy, R. F., & Mason, C. A. (2016). Online mathematics homework increases student achievement. AERA Open, 2(4), 1-12.

Murphy, R., Roschelle, J., Feng, M., & Mason, C. A. (2020). Investigating efficacy, moderators and mediators for an online mathematics homework intervention. Journal of Research on Educational Effectiveness, 13(2), 235-270. doi:10.1080/19345747.2019.1710885

Study 1

Roschelle et al. (2016) presented main effects, while Murphy et al. (2020) primarily explored how teacher practices and behaviors mediated the impacts of the intervention.

Evaluation Methodology

Design:

Recruitment: Participating schools (n=46) were recruited from throughout Maine to represent the range of school sizes and mathematics achievement levels across the state. Schools were recruited using a consistent set of messages and presentations, which were available on a website and as handouts. All participating schools self-selected into the study, expressing a desire to use the ASSISTments technology either immediately or after serving as a waitlist control. Two cohorts of 7th-grade classes participated, one starting in fall 2014 and one in fall 2015.

A generalizability test presented by Murphy et al. (2020) indicated that the sample generalizes to school districts in Maine but not most other states.

Assignment: Participating schools were first blocked on school type (K-8 or other) and then matched into pairs on prior standardized test achievement. Within each pair, one school was randomly assigned to the treatment condition and the other to a standard practice/waitlist control condition; the control group received the ASSISTments intervention two years later, after all assessments had been completed. Prior to assignment, one school withdrew; its partner was rematched and randomly assigned. After assignment, one treatment school dropped out of the study, and its matched partner was dropped as well. Ultimately, 45 schools were assigned and 43 actually participated in the intervention and assessment (22 treatment schools and 21 control schools). The sample included 87 teachers and 3,035 students.

Attrition: With the first school year devoted to teacher professional development and program support, the baseline came at the start of the second year and the posttest came at the end of the second year. Specifically, the assessments occurred in October 2014 and May 2015 for the first cohort and in October 2015 and May 2016 for the second cohort. At the school level, there was an attrition rate of 6.5%. Roschelle et al. (2016) reported that, out of the original 3,035 students in participating schools, 307 either changed schools or skipped the assessment while 122 new students moved into a participating school, resulting in a final analytic sample of 2,850 (94%). Murphy et al. (2020) reported slightly lower figures of 116 new students and a final analytic sample of 2,769 (91%); the difference likely stems from including data from 87 teachers.
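
As a quick check on the sample accounting quoted above, the arithmetic below reproduces the Roschelle et al. (2016) figures using only the numbers stated in this paragraph.

    # Student-level sample accounting from Roschelle et al. (2016)
    randomized = 3035        # students in participating schools at baseline
    left_or_skipped = 307    # changed schools or skipped the assessment
    joiners = 122            # new students who moved into a participating school
    analytic = randomized - left_or_skipped + joiners
    print(analytic, round(100 * analytic / randomized))  # 2850 94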

Sample:

The student sample was evenly split on gender (49.3% male) but predominantly white (92.6%), with 38.7% of students receiving free or reduced-price lunch, and 12.2% of the student body attending specialized education classes. All students had access to laptops.

Measures:

Student Academic Achievement was measured using different standardized tests at baseline and posttest. The New England Common Assessment Program (NECAP) was used to establish baseline (6th grade) scores in mathematics and reading. At posttest, similar skills were assessed using the TerraNova standardized test, which is widely used across the country as it aligns with the Common Core Standards for Mathematics and reflects national curricula.

Analysis:

The primary analysis in Roschelle et al. (2016) used a two-level hierarchical linear model, with students nested within schools. Because some schools had only one mathematics teacher, students were not also nested within classrooms in the primary model, though models were run with adjustments for this third level of clustering as well. Analyses also controlled for baseline scores, matching pairs, and available individual- and school-level demographic characteristics.

The analysis in Murphy et al. (2020) used a three-level hierarchical linear model, with students nested within classrooms and classrooms nested within schools. Covariates included classroom-level measures such as the mean prior math score for all students, school-level measures such as indicator variables for the matched pairs, and student-level measures such as sixth-grade math score.
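
To make the modeling approach concrete, the sketch below fits a two-level model of this general shape with statsmodels' MixedLM on simulated stand-in data; all variable names and values are hypothetical, and the published analyses included additional student- and school-level covariates (and, in Murphy et al., a classroom level).

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated stand-in data: one row per student with a school ID, matched-pair ID,
    # treatment indicator, baseline (6th-grade) math score, and posttest math score.
    rng = np.random.default_rng(0)
    rows = []
    for school in range(40):
        pair, treat = school // 2, school % 2
        school_effect = rng.normal(0, 2)
        for _ in range(60):
            baseline = rng.normal(50, 10)
            posttest = 10 + 0.8 * baseline + 2.0 * treat + school_effect + rng.normal(0, 8)
            rows.append((school, pair, treat, baseline, posttest))
    df = pd.DataFrame(rows, columns=["school_id", "pair_id", "treatment",
                                     "baseline_math", "posttest_math"])

    # Two-level model: students nested within schools (random intercept for school),
    # with fixed effects for treatment, baseline achievement, and matched pairs.
    model = smf.mixedlm("posttest_math ~ treatment + baseline_math + C(pair_id)",
                        data=df, groups=df["school_id"])
    print(model.fit().summary())

    # A moderation test of the kind reported by Murphy et al. (2020) would add a
    # treatment-by-baseline interaction, e.g. "treatment * baseline_math".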

Intent-to-Treat:

All students in the target grade were assessed unless they moved away or were out of school on the testing day, though researchers omitted a control group school whose treatment group match dropped out of the study post-assignment.

Outcomes

Implementation Fidelity:

Roschelle et al. (2016) reported that coaches observed each teacher in his or her classroom at least three times during the intervention, but findings from these observations were not reported. In terms of duration, students averaged 14 hours of use, much less than the 24 hours that evaluators anticipated students would use ASSISTments over the course of the semester.

Murphy et al. (2020) reported that the median classroom used the program for 22 weeks, about 55% of the available instructional weeks, which is considered a meaningful duration. Teachers opened 75% of the reports available to them, and students attempted approximately 33 program problems per week.

Baseline Equivalence:

Roschelle et al. (2016) reported no significant differences between conditions on six baseline measures. Murphy et al. (2020) reported two significant differences in eight tests. Treatment students were less likely than control students to be ethnic minorities and to receive free and reduced-price lunch (Table 2, n = 3035).

Differential Attrition:

Roschelle et al. (2016) reported only that the attrition rates in the two conditions differed by 5.6% and did not report whether this difference was significant. The article included no tests for differential attrition by student characteristics or for baseline-by-condition interactions. However, upon request, the authors provided additional differential attrition results, which showed no significant baseline-by-condition interactions for four math and reading baseline variables and three demographic characteristics; there was one significant baseline-by-condition interaction, for the ethnic minority characteristic.

Murphy et al. (2020) provided several additional details on differential attrition. First, they reported overall student-level attrition of 12.6% and a difference in attrition between conditions of 6.9%. In combination, these two figures fall at the boundary between (a) acceptable and (b) acceptable under optimistic assumptions with respect to potential bias, as specified by the What Works Clearinghouse. Second, joiners (students entering the schools after the start of the program) showed one significant difference in eight tests for baseline equivalence: the treatment group had more ethnic minorities (see Table 3). Third, for the analysis sample that excluded dropouts and included joiners, tests for baseline equivalence of conditions showed one significant difference in eight tests: the treatment group had fewer students qualifying for free and reduced-price lunch, a difference that also emerged in tests of baseline equivalence using the full randomized sample.

Posttest:

Roschelle et al. (2016) found with the two-level model that the intervention group scored significantly higher on the posttest math achievement test than the control group (Hedges' g = .18). After splitting students into low- and high-achievement groups, the effect was larger for low-achieving students (g = .29), though both low- and high-achieving students showed significant gains over control group students.

Murphy et al. (2020) replicated these results in the main effects analysis with the three-level model (Hedges' g = .22). Moderation tests found a statistically significant negative interaction between prior math scores and the treatment condition, indicating larger benefits for students who entered the study with lower prior math achievement. Mediation tests found no significant indirect effects of the program on math test scores via a measure of targeted homework review practices by teachers.

Long-Term:

The posttest assessment occurred immediately post-intervention.
