Accelerated Study in Associate Programs (ASAP)
A post-secondary, college-based prevention program that aims to address potential barriers to academic success and promote credit accumulation and associate degree completion among college students through comprehensive advisement, career, and tutoring services provided by dedicated advisers.
Endorsements
Blueprints: Model Plus
Social Programs that Work: Top Tier
What Works Clearinghouse: Meets Standards Without Reservations - Positive Effect
Program Information Contact
Christine Brongniart
University Executive Director
CUNY ASAP/ACE
Christine.Brongniart@cuny.edu
Website: http://www1.cuny.edu/sites/asap/
For ASAP replication:
Email: CUNYASAPReplication@cuny.edu
Website: https://www1.cuny.edu/sites/asap/replication/
The CUNY ASAP Central Office houses a replication team that partners with institutions committed to increasing graduation rates by replicating the ASAP model with a high level of fidelity. The successful outcomes from the earliest replication efforts in Ohio, supported by this team, have led to an enhanced replication structure that includes highly customized technical assistance, as described below, and access to a national network of community colleges known as the ASAP National Replication Collaborative.
The formal technical assistance that the team provides draws on the operational expertise of ASAP staff and includes structured guidance and troubleshooting support in all areas of program development and implementation, including college-wide program integration and oversight, staffing, marketing, recruitment, student financial supports, the ASAP advisement model, academic pathways, and using data for program management.
The CUNY ASAP team hosts regular consultations with replication partner colleges, with opportunities for engagement with the ASAP community, whose members have years of on-the-ground experience operating and scaling ASAP. In addition, partners receive professional development and ongoing support for key replication program staff, including an interactive orientation and training series on the ASAP advisement model. The CUNY ASAP team also offers guidance on designing a comprehensive evaluation approach and a robust data management system to help replication partners use data for program management, and provides unlimited access to a comprehensive archive of electronic materials, resources, and tools developed by CUNY ASAP to support each step of the replication process.
Several key resources are available on the CUNY ASAP replication webpage, including "Inside ASAP: A Resource Guide on Program Structure, Components, and Management," and a brief published in 2020 titled, "Improving Graduation Rates in New York City, Ohio, and Beyond: Accelerated Study in Associate Programs (ASAP) Replications and Lessons Learned."
Source: Washington State Institute for Public Policy
All benefit-cost ratios are the most recent estimates published by The Washington State Institute for Public Policy for Blueprint programs implemented in Washington State. These ratios are based on a) meta-analysis estimates of effect size and b) monetized benefits and calculated costs for programs as delivered in the State of Washington. Caution is recommended in applying these estimates of the benefit-cost ratio to any other state or local area. They are provided as an illustration of the benefit-cost ratio found in one specific state. When feasible, local costs and monetized benefits should be used to calculate expected local benefit-cost ratios. The formula for this calculation can be found on the WSIPP website.
Program Developer/Owner
The City University of New York (CUNY), New York, New York, United States
Program Outcomes
- Academic Performance
- Employment
- Post Secondary Education
Program Specifics
Program Type
- Academic Services
- Mentoring - Tutoring
- School - Individual Strategies
- Skills Training
Program Setting
- School
Continuum of Intervention
- Selective Prevention
Program Goals
A post-secondary, college-based prevention program that aims to address potential barriers to academic success and promote credit accumulation and associate degree completion among college students through comprehensive advisement, career, and tutoring services provided by dedicated advisers.
Population Demographics
The program targets low-income community college students.
Target Population
Age
- Adult
- Early Adulthood (19-24)
Gender
- Both
Gender Specific Findings
- Female
Race/Ethnicity
- All
Race/Ethnicity Specific Findings
- Hispanic or Latino
- African American
Subgroup Analysis Details
Study 1 (Scrivener et al., 2015; Weiss et al., 2019) used a sample that was homogeneous with respect to income, consisting of students eligible for Pell Grants or from families near or below the poverty level. Both reports also tested for subgroup effects by gender and found equal effects for males and females. In addition, Weiss et al. (2019) tested for subgroup effects by race and ethnicity and found stronger benefits for African Americans and non-Hispanics than for other groups. Note, however, that Weiss et al. (2019) found benefits for Hispanics in a separate subsample analysis.
Study 2 (Miller et al., 2020; Hill et al., 2023) used a sample that was homogeneous with respect to income, consisting of students eligible for Pell Grants. The study tested for subgroup effects by race and ethnicity and found equal benefits across groups. In addition, Miller et al. (2020) tested for subgroup effects by gender and found equal benefits for females and males, while Hill et al. (2023) found stronger benefits for females than males.
Sample demographics including race, ethnicity, and gender for Blueprints-certified studies:
- The sample for Study 1 (Scrivener et al., 2015; Weiss et al., 2019) was 62.1% female, 43.6% Hispanic, 34.3% Black, 10.0% White, 7.5% Asian or Pacific Islander, and 4.6% other race/ethnicity.
- The sample for Study 2 (Miller et al., 2020; Hill et al., 2023) was 64% female, 46% White, 35% Black, 10% Hispanic, and 10% other race/ethnicity.
Other Risk and Protective Factors
Full-time enrollment, early completion of remedial education courses
Risk/Protective Factor Domain
- Individual
- School
Risk/Protective Factors
Risk Factors
School: Low school commitment and attachment
Protective Factors
School: Opportunities for prosocial involvement in education, Rewards for prosocial involvement in school
*Risk/Protective Factor was significantly impacted by the program
Brief Description of the Program
The City University of New York (CUNY) Accelerated Study in Associate Programs (ASAP) is a comprehensive program for college students seeking an associate degree. ASAP is designed to help participating students earn their associate degrees as quickly as possible, with the goal of graduating at least 50 percent of students within three years. The program provides students with structured and wide-ranging supports, including financial resources (e.g., tuition waivers for students in receipt of financial aid with a gap need, textbook assistance, and MetroCards to assist with transportation), structured pathways to support academic momentum (e.g., full-time enrollment, block scheduled first-year courses, immediate and continuous enrollment in developmental education, winter and summer course-taking), and support services such as advisement, tutoring, and career development.
Description of the Program
The City University of New York (CUNY) Accelerated Study in Associate Programs (ASAP) model is a comprehensive college-based program designed to improve academic success and degree completion rates for associate-degree students. Committed to graduating at least 50% of students within three years, ASAP currently serves 25,000 students across nine CUNY colleges each academic year.
The CUNY ASAP model requires students to attend college full time during the fall and spring semesters and encourages early completion of developmental supports. The model also offers winter and summer enrollment (i.e., the opportunity to earn credits between the fall and spring terms) as a method to accelerate students' progress. Students are provided early registration so that they can create convenient schedules and obtain seats in needed courses. ASAP also offers intentional opportunities for early engagement through a mix of social, academic, and personal development activities before a student's first semester in ASAP. Activities include a comprehensive intake process and an ASAP Institute experience that is designed to familiarize students with essential college and ASAP policies, create community through team-building activities, and allow students to meet staff members and each other before the start of the semester. During their first year, students enroll in blocked courses, or courses with other ASAP students, and participate in ASAP-led group advisement sessions, covering topics such as goal setting, study skills, and academic planning.
Throughout the program, students receive comprehensive advisement from a full-time ASAP-dedicated adviser (assigned to them from the time they enter the program through degree completion), career development support from an ASAP-dedicated career specialist, and additional tutoring and academic support services. In the initial years of ASAP's implementation, the model required students to meet with an adviser twice per month, and advisers carried caseloads of approximately 60 to 80 students. With ASAP's expansion in 2012, the adviser-to-student ratio increased to 1:150 and the program implemented a support-level advisement model whereby, after their first semester, students meet with their advisers at a frequency based on their assigned support level (high, medium, or low); in their first semester, all students are placed in the high-support group. In addition, students who complete program requirements receive a tuition and fee gap scholarship that fills any gap between financial aid and the college's tuition and fees. Students also receive textbook assistance and an unlimited MetroCard for use on public transportation, contingent on fulfilling the program's eligibility and engagement requirements.
CUNY ASAP's program management structure involves collaboration between a dedicated ASAP Central Office team in the Office of Academic Affairs, and the ASAP program staff at CUNY colleges who are responsible for operating the program at their campuses. The ASAP Central Office is responsible for overall program administration, oversight, and reporting, as well as program-wide evaluation and data management to support iterative improvement.
CUNY ASAP's first replication partnership began in 2014 through a collaboration with a research and evaluation nonprofit organization called MDRC, the Ohio Department of Higher Education, and three Ohio community colleges that implemented programs based on ASAP. In the Ohio programs, students were encouraged to attend specific sections of existing "student success" courses that addressed goal setting, study skills, and academic planning. In addition, throughout the program, students were connected to colleges' existing career services, and students in developmental education courses were required to attend tutoring. Students were also required to meet with a program adviser twice per month in the first semester, with requirements in later semesters varying depending on the adviser's determination of the student's support level group, as in the CUNY ASAP advisement model. Program advisers also had student caseloads in line with the CUNY ASAP model. In addition, students received a tuition waiver that filled any gap between their existing grant financial aid and tuition and fees, textbook assistance, and a monthly $50 gift card to help students purchase groceries or gas and to serve as an incentive to meet other program requirements (for example, attending advising appointments). The Ohio programs were managed locally, with dedicated staffing and oversight from college leadership to support data collection, reporting, and iterative improvement.
Since the first ASAP replication project in Ohio, CUNY ASAP has continued to expand and partner with colleges in additional states across the U.S.
Theoretical Rationale
Accelerated Study in Associate Programs (ASAP) draws on research into the multiple factors that support or hinder student success in college.
Theoretical Orientation
- Skill Oriented
- Normative Education
Brief Evaluation Methodology
Primary Evidence Base for Certification
Study 1
Scrivener et al. (2015) and Weiss et al. (2019) implemented a multi-site experimental design in which 896 students from three City University of New York (CUNY) community colleges were individually randomized to a treatment or business-as-usual control group. Student enrollment in courses, college credit accumulation, and degree receipt were assessed from transcript data and student records at posttest (three years after baseline; Scrivener et al., 2015) and three years after the posttest (Weiss et al., 2019).
Study 2
Miller et al. (2020) and Hill et al. (2023) conducted a multi-site randomized controlled trial involving 1,501 students attending three community colleges in Ohio. Administrative records were used to assess college degree receipt and transfer to four-year universities at the end of the three-year program (Miller et al., 2020) and degree completion and earnings after six years (Hill et al., 2023).
Outcomes (Brief, over all studies)
Primary Evidence Base for Certification
Study 1
Scrivener et al. (2015) found that, compared to participants in the control condition, participants in the intervention condition showed significantly higher session enrollment, total credits earned, and rates of degree completion at the posttest (i.e., three years after baseline). In a long-term follow-up to Scrivener et al. (2015), Weiss et al. (2019) found that during the three years post-program, the intervention group (compared to the control group) earned more cumulative credits, completed degrees at a higher rate (not distinguishing between associate's and bachelor's degrees), took less time on average to earn a degree, and earned degrees at two-year colleges at a higher rate.
Study 2
Miller et al. (2020) reported that three years after baseline (considered the posttest), students in the treatment group showed significant improvements relative to the control condition on (1) degree completion, mostly at the associate level and (2) registering at a 4-year college. Hill et al. (2023) reported that six years after baseline (long-term follow-up), students in the treatment group showed significant improvements relative to the control condition on two confirmatory outcomes (earned a degree and annual earnings) and two exploratory outcomes (ever earned an associate degree, and ever earned a bachelor's degree).
Outcomes
Primary Evidence Base for Certification
Study 1
Scrivener et al. (2015) and Weiss et al. (2019) found that, compared to participants in the control condition, participants in the intervention condition showed significantly:
- higher session enrollment at the posttest
- higher cumulative credits earned at the posttest and during the post-intervention 3-year follow-up
- higher rates of "any" degree completion at the posttest and during 3-year follow-up
- higher rates of degree receipt at two-year colleges during the 3-year follow-up
- lower average time-to-earn-a-degree during the 3-year follow-up
Study 2
Miller et al. (2020) and Hill et al. (2023) found that students in the treatment group showed significant improvements relative to the control condition on:
- degree completion, mostly at the associate level, after three years
- registering at a 4-year college after three years
- ever earning a degree, ever earning an associate degree, and ever earning a bachelor's degree after six years
- annual earnings after six years
Effect Size
In Study 1, Scrivener et al. (2015) reported a 22% increase in session enrollment and credit accumulation for treatment students relative to control students, which translates to an average increase of 1.2 sessions and 8.7 credits associated with the treatment. Weiss et al. (2019) reported long-term effects of the intervention, compared to the control group: a 13% increase in cumulative credits earned, an 18% increase in degree receipt at any college, and a one-year reduction in time to degree.
In Study 2, Miller et al. (2020) and Hill et al. (2023) did not report effect sizes.
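To connect the relative effects quoted above with the absolute impacts reported in the study write-ups below, a minimal arithmetic sketch (the figures come from this write-up; the variable names are illustrative only):

```python
# Illustrative arithmetic only; figures are taken from the study write-ups in this profile.

# Weiss et al. (2019): the intervention group earned 7.0 more cumulative credits
# than the control-group base of 55.6 credits.
credit_effect = 7.0
control_credits = 55.6
relative_credit_gain = credit_effect / control_credits  # ~0.126, roughly the 13% quoted above

# Scrivener et al. (2015): ~40% of program group members vs. ~22% of control group
# members graduated within three years, an effect of roughly 18 percentage points.
program_grad_rate = 0.40
control_grad_rate = 0.22
percentage_point_effect = program_grad_rate - control_grad_rate  # 0.18

print(f"Relative credit gain: {relative_credit_gain:.1%}")
print(f"Graduation impact: {percentage_point_effect * 100:.0f} percentage points")
```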
Generalizability
Two studies meet Blueprints standards for high-quality methods with strong evidence of program impact (i.e., "certified" by Blueprints): Study 1 (Scrivener et al., 2015; Weiss et al., 2019) and Study 2 (Miller et al., 2020; Hill et al., 2023). Student eligibility criteria in both studies included being low-income and/or eligible for a Pell Grant.
- Study 1 took place in three City University of New York community colleges, in which treatment was compared to business-as-usual.
- Study 2 took place in three community colleges in Ohio, in which treatment was compared to business-as-usual.
Notes
In Study 1, Scrivener et al. (2015) examined outcomes at baseline and at the end of the 3-year program. Weiss et al. (2019) conducted a long-term follow-up study, which assessed outcomes three years post-program (or six years after baseline). In addition, some of the posttest effects were also reported in Weiss et al. (2019). Though the following citation did not report effects of Study 1, it did report information on baseline equivalence, so it is mentioned in this write-up:
Scrivener, S., Weiss, M. J., & Sommo, C. (2012). What can a multifaceted program do for community college students? Early results from an evaluation of Accelerated Study in Associate Programs (ASAP) for developmental education students. New York: MDRC.
On pages 61-65, Miller et al. (2020) provide a detailed description of how the Ohio ASAP model compares to the original CUNY ASAP model. Notable differences include participating students and costs. Miller et al. (2020) explain: "The students in the Ohio evaluation differed from those in the CUNY evaluation in various ways. For example, more Ohio sample students were nontraditional, meaning they were older, were working full time, or had children. This difference in the sample is notable, as there was some concern that the model, with its full-time attendance requirement, would not work for these types of students. Fewer Ohio sample students had developmental requirements than was the case in the CUNY evaluation. Ohio students were also closer to obtaining degrees when they entered the study than CUNY students, meaning that they entered the study with more credits already earned" (p. 61). In addition, the Ohio Programs model was based on a modified, less expensive version of CUNY ASAP. The authors note: "almost every category of costs was lower in the Ohio Programs evaluation than in the 2010 CUNY ASAP evaluation, probably because CUNY had to pay higher New York City salaries, dedicated tutors and career specialists, costs associated with blocked and linked courses, and higher monthly incentives in the form of Metro-Cards" (p. 66). Still, providing the Ohio Program is not cheap. Miller et al. (2020) note: "The direct cost of the Ohio Programs was about $5,500 per program group member over the three-year period. However, because they increased degree receipt, the programs had a lower cost per degree received. Nonetheless, colleges may struggle to sustain these services over the long term without funding support from the state or other sources" (p. 3).
Benefit-Cost Analysis:
ASAP has been the subject of a two-part cost-benefit study led by Dr. Henry M. Levin and a research team from the Center for Benefit-Cost Studies of Education (CBCSE) at Teachers College, Columbia University. In September 2012, Dr. Levin's cost-effectiveness report found that the average cost per three-year ASAP graduate was lower than the average cost per comparison group graduate. Part two of Dr. Levin's study was released in May 2013. This benefit-cost analysis found that an investment in ASAP has very large financial returns for both the taxpayer and the ASAP student due to increased lifetime earnings and tax revenues and reduced spending on public health, criminal justice, and public assistance.
JHE Article: Levin, H. M., & García, E. (2017). Accelerating community college graduation rates: A benefit-cost analysis. The Journal of Higher Education, 8(1), 1-27.
CBCSE Report: Levin, H. M., & García, E. (2013). Benefit‐cost analysis of Accelerated Study in Associate Programs (ASAP) of The City University of New York (CUNY). New York, NY: Center for Benefit-Cost Studies in Education, Teachers College, Columbia University.
CBCSE Report: Levin, H. M., & Garcia, E., with the assistance of Morgan, J. (2012). Cost-effectiveness of the Accelerated Study in Associate Programs (ASAP) of the City University of New York. New York, NY: Center for Benefit-Cost Studies in Education, Teachers College, Columbia University.
References
Study 1
Certified
Scrivener, S., Weiss, M., Ratledge, A., Rudd, T., Sommo, C., & Fresques, H. (2015). Doubling graduation rates: Three-year effects of CUNY's Accelerated Study in Associate Programs (ASAP) for developmental education students. New York: MDRC.
Certified
Weiss, M., Ratledge, A., Sommo, C., & Gupta, H. (2019). Supporting community college students from start to degree completion: Long-term evidence from a randomized trial of CUNY's ASAP. American Economic Journal: Applied Economics, 11(3): 253-297. doi.org/10.1257/app.20170430
Study 2
Certified
Miller, C., Headlam, C., Manno, M., & Cullinan, D. (2020). Increasing community college graduation rates with a proven model: Three-year results from the Accelerated Study in Associate Programs (ASAP) Ohio demonstration. MDRC.
Hill, C., Sommo, C., & Warner, K. (2023). From degrees to dollars: Six-year findings from the ASAP Ohio demonstration. MDRC.
Study 1
Summary
Scrivener et al. (2015) and Weiss et al. (2019) implemented a multi-site experimental design in which 896 students from three City University of New York (CUNY) community colleges were individually randomized to a treatment or business-as-usual control group. Student enrollment in courses, college credit accumulation, and degree receipt were assessed from transcript data and student records at posttest (three years after baseline; Scrivener et al., 2015) and three years after the posttest (Weiss et al., 2019).
Scrivener et al. (2015) and Weiss et al. (2019) found that, compared to participants in the control condition, participants in the intervention condition showed significantly:
- higher session enrollment at the posttest
- higher cumulative credits earned at the posttest and during the post-intervention 3-year follow-up
- higher rates of "any" degree completion at the posttest and during 3-year follow-up
- higher rates of degree receipt at two-year colleges during the 3-year follow-up
- lower average time-to-earn-a-degree during the 3-year follow-up
Evaluation Methodology
Design:
Recruitment: Participants were recruited from three City University of New York (CUNY) community colleges: Borough of Manhattan Community College, Kingsborough Community College, and LaGuardia Community College. Student eligibility criteria were: 1) family income below 200% of the federal poverty level, eligibility for a Pell Grant, or both; 2) needing one or two remedial courses (to build math, reading, or writing skills) based on CUNY Assessment Tests; 3) being a new student, or a continuing student who had previously earned 12 credits or fewer with at least a 2.0 grade point average; 4) New York City residency; 5) willingness to attend college full time; and 6) enrollment in an ASAP-eligible major. Students who met eligibility criteria were invited to participate in the study through letters, emails, and phone calls.
Assignment:
The assigned sample included 896 students from two cohorts, one each prior to the spring and fall semesters in 2010, at three of the six community colleges running the intervention at that time. Students were randomly assigned to the intervention condition (n=451) or business-as-usual control condition (n=445). Randomization was completed using a computer program.
Attrition:
Transcript data and student records were assessed for students' course session enrollment, course taking, and degree receipt (if applicable) three years after baseline (Scrivener et al., 2015). Initial and analysis Ns match in all tables, suggesting no attrition due to the use of student records.
Weiss et al. (2019) reported that 903 students were randomly assigned, though seven students were not included in any analyses because they withdrew from the study or their consent forms could not be recovered (less than 1% attrition). Thus, the analytic sample included a total of 896 students.
Sample:
The mean age of participants upon entering the study was 21.5 years. The sample was 62.1% female, 43.6% Hispanic, 34.3% Black, 10.0% White, 7.5% Asian or Pacific Islander, and 4.6% other race/ethnicity. Sixty percent of students were incoming freshmen, 33.5% were returning students, and 6.5% were transfer students. The majority of students (87.5%) received a Pell Grant, 59.9% needed remedial instruction in one subject (math, reading, or writing), 26.8% needed remedial instruction in two subjects, and only 1.9% were considered college-ready. Most students were unmarried and did not have any children when they entered the study, and most lived with their parents.
Measures:
Students' enrollment, course taking, and degree receipt (if applicable) were taken from transcript data and student records to measure outcomes.
Scrivener et al. (2015) reported use of the following outcome measures (at posttest, or 3 years after baseline): (1) session enrollment (number of semesters in which each student enrolled over the course of three years, out of 12 semesters), (2) total cumulative credits earned, and (3) "any" degree completion (not distinguishing between associate's and bachelor's).
Weiss et al. (2019) assessed the following main outcomes (four, five and six years after baseline):
- Any college enrollment by semester;
- Full-time enrollment at any college by semester (i.e., attempting 12 or more credits per semester at CUNY colleges);
"Enrollment" for each of the outcomes listed above was based on courses that students were enrolled in at the end of the add/drop period, indicating they were still enrolled in college.
- Marginal credit accumulation at CUNY colleges, which includes only credits earned in a particular semester;
- Cumulative credits earned at CUNY colleges (both remedial and college-level credits earned since baseline that lead to a degree);
- "Any" degree completion at any college (this measure did not distinguish between associate's and bachelor's. Also, degree receipt was cumulative. Those who earned a degree in an earlier semester were counted as having a degree in subsequent semesters);
- Average time to earn a degree;
- Degree receipt at two-year colleges; and,
- Degree receipt at four-year colleges.
Analysis:
The analysis pooled the sample across the two cohorts and three colleges.
In Scrivener et al. (2015), the authors state that they used generalized linear models to test for intervention effects (Footnote 4, p. 51), though they do not indicate whether prior-year outcomes were controlled in their models. All models included college-by-cohort combinations as covariates. They also plot results of the 12 individual two-tailed t-tests applied to differences between research groups within each year, semester, and academic session, with estimates adjusted by site and cohort (Figure 4.1, p. 52).
For all but one of the outcomes, Weiss et al. (2019) reported tables and figures including regression-adjusted averages with 90% CIs for the intervention and control groups to represent the effects of the ASAP program on academic outcomes. Estimates were adjusted by random assignment blocks and select baseline characteristics (gender, race, age, has any children, single parent, working, depends on parents for more than half of expenses, first in family to attend college, and earned a high school diploma; see footnote 6, p. 259). The authors stated that robust standard errors were used in all analyses.
For one outcome (average time-to-degree), results were reported by quantile (similar to quantile regression) and represent the quickest degree-earners. For this analysis, earned-a-degree estimates were not regression-adjusted. Average semesters-to-earn-degree was calculated as the average number of semesters it took for the first X percent of degree earners to earn their first degree. The authors explain in footnote 17 (pp. 270-271): "our missing data (due to time-censoring) is for individuals who have not yet earned a degree within six years. We know that the only possible values for time-to-degree for these nondegree earners are greater than six years or never. Thus, in the control group, they all fall above the forty-first quantile, and in the program group they all fall above the fifty-first quantile. Consequently, we plot the empirical distribution of the outcome up through the known quantiles without making a monotone treatment response assumption or a rank-preservation assumption...To avoid imputation of unknown points on the empirical cumulative distribution, we focus discussion on quantiles where data are available for the program and control groups."
For all outcomes, Weiss et al. (2019) presented program effects through the six years after baseline, with the post-program period (semesters 7-12) representing the one-, two-, and three-year follow-ups after the program ended (i.e., the long-term effects).
For both articles (Scrivener et al., 2015; Weiss et al., 2019), when the authors presented effect sizes, they were described in the text in the form of marginal effects, which represent the predicted probability of the postsecondary outcomes for treatment and control separately while holding other covariates constant at their means.
Intent-to-Treat: All available data were used in analyses. However, Table 1 shows missing covariate data (socio-demographics) ranging from 1% to 5%.
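The published reports do not include analysis code. As a rough, hypothetical illustration of the kind of regression adjustment described above (adjustment for random-assignment blocks and baseline covariates, robust standard errors, 90% confidence intervals), a minimal sketch in Python using statsmodels, with illustrative column names that are not taken from the studies' data files:

```python
# Illustrative sketch only -- not the authors' code. Assumes a student-level
# DataFrame with hypothetical columns: 'credits_cumulative' (outcome),
# 'treatment' (0/1 assignment), 'block' (random-assignment block), and a few
# baseline covariates ('female', 'age', 'has_children').
import pandas as pd
import statsmodels.formula.api as smf

def regression_adjusted_impact(df: pd.DataFrame) -> pd.Series:
    """Estimate a covariate-adjusted treatment effect on cumulative credits."""
    model = smf.ols(
        "credits_cumulative ~ treatment + C(block) + female + age + has_children",
        data=df,
    )
    # Heteroskedasticity-robust (HC1) standard errors, mirroring the report's
    # statement that robust standard errors were used in all analyses.
    fit = model.fit(cov_type="HC1")
    impact = fit.params["treatment"]
    ci_low, ci_high = fit.conf_int(alpha=0.10).loc["treatment"]  # 90% CI
    return pd.Series({"impact": impact, "ci_90_low": ci_low, "ci_90_high": ci_high})
```

Regression-adjusted group averages, like those shown in the report's figures, would then come from the fitted model's predictions for each group with the covariates held at their means.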
Outcomes
Implementation Fidelity:
Over 95 percent of program group members were exposed to at least some portion of the program. Weiss et al. (2019) also report that "Overall, ASAP was well implemented and the difference between ASAP and regular college services available to the study's control group was substantial." Program group members were more likely than control group members to report hearing college faculty/staff speak about the importance of taking developmental courses early and obtaining an associate's degree within 3 years.
In addition, 95 percent of program group students reported contact with advising, compared with 80 percent of control group students. Researchers also found large differences in students' reported receipt of transportation assistance and textbooks, indicating a dramatic service contrast. In addition, implementation data showed that a majority of treatment students enrolled in courses with a significant concentration of other treatment students, one goal of the program.
Baseline Equivalence:
No significant differences were detected in baseline demographics or academic-related variables (e.g., highest grade completed, diplomas/degrees earned, highest degree student plans to attain) for the assigned sample. This information was not reported in the article but was obtained from a separate report on this evaluation (Scrivener, Weiss, & Sommo, 2012).
Differential Attrition:
There was no attrition in either article for Study 1 (Scrivener et al., 2015; Weiss et al., 2019).
Posttest:
In Scrivener et al. (2015), three years after baseline (considered the posttest), students in the intervention condition showed significant improvements relative to the control condition on all three outcomes tested: (1) session enrollment rates (treatment increased enrollment by an estimated 4 percentage points at the posttest), (2) cumulative credits earned (treatment students earned an average of 8.7 more cumulative credits than control students), and (3) degree completion (program group members graduated at a rate of nearly 40 percent and control group members at a rate of 22 percent, for an estimated effect of around 18 percentage points).
Long-Term:
Weiss et al. (2019) reported long-term effects for the three years post-intervention (years 4-6 after baseline, or semesters 7-12). Results were shown in Figures 1-8 and Tables A1-A8.
There were significant long-term effects for the intervention group, compared to the control group, on four of the eight measured academic outcomes during the post-program period. The intervention group: (1) earned 7.0 more cumulative credits at CUNY colleges, a 13 percent increase over the control group base of 55.6 credits, or 12 percent of the 60 college-level credits required to earn an associate's degree (Figure 4, p. 268); (2) completed degrees at a rate 10 percentage points higher (Figure 5, p. 269); (3) had a lower average time to earn a degree (i.e., just under 4.9 semesters, which was one year, or two semesters, earlier than the control group; Figure 6, p. 270); and (4) earned more degrees at two-year colleges (note: see Panel A in Table A8 on page 293; dividing the covariate-adjusted effect by its standard error yields values greater than 1.96 for all results from semesters 7-12, indicating statistically significant findings at p < .05).
During years 4-6 after baseline (or semesters 7-12), there was no effect on: (1) number of semesters enrolled in college, (2) full-time enrollment, (3) marginal credits earned, or (4) earning a degree at a four-year college (note: see Panel B in Table A8 on page 293; dividing the covariate-adjusted effect by its standard error yields values less than 1.96 for all results from semesters 7-12, indicating no statistically significant differences at p < .05).
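For reference, the significance check described in the notes above corresponds to the standard large-sample, two-tailed Wald criterion:

$$ z = \frac{\hat{\beta}_{\text{adj}}}{\mathrm{SE}\left(\hat{\beta}_{\text{adj}}\right)}, \qquad |z| > 1.96 \;\Longleftrightarrow\; p < .05 \ \text{(two-tailed)}. $$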
Weiss et al. (2019) also conducted exploratory subgroup analyses of intervention effects on earning a degree at the three-year point (posttest) and during years 4-6 (long-term follow-up). Subgroup analyses were conducted across a range of sociodemographic variables (see Tables 4 and 5). Subgroup differences were detected only for race/ethnicity: degree-earning estimates were larger for treatment than control among Black and White students, compared to Hispanic students, at both the posttest (three years after baseline) and follow-up (during years 4-6 post-baseline).
Study 2
Summary
Miller et al. (2020) and Hill et al. (2023) conducted a multi-site randomized controlled trial involving 1,501 students attending three community colleges in Ohio. Administrative records were used to assess college degree receipt and transfer to four-year universities at the end of the three-year program (Miller et al., 2020) and degree completion and earnings after six years (Hill et al., 2023).
Miller et al. (2020) and Hill et al. (2023) found that students in the treatment group showed significant improvements relative to the control condition on:
- degree completion, mostly at the associate level, after three years
- registering at a 4-year college after three years
- ever earning a degree, ever earning an associate degree, and ever earning a bachelor's degree after six years
- annual earnings after six years
Evaluation Methodology
Design:
Recruitment: The evaluation took place at 3 of the 23 community colleges in Ohio. Recruitment for the evaluation took place before the spring 2015, fall 2015, and spring 2016 semesters. Eligible students were required to be: (1) low-income (i.e., eligible for Pell Grants); (2) seeking degrees; (3) willing to attend community college full-time; (4) majoring in degree programs that could be completed within three years; and (5) newly enrolled or with 24 or fewer credits earned. Students were invited to participate through letters, emails, and phone calls.
Assignment: The assigned sample included 1,501 students from three cohorts (before spring 2015, fall 2015, and spring 2016 semesters) at the three community colleges. Students were randomly assigned to the intervention condition (n=806) or business-as-usual control condition (n=695). Randomization was completed using a computer program.
Assessments/Attrition: The two reports examined outcomes after three years (Miller et al., 2020) and after six years (Hill et al., 2023). The three-year assessment serves as a posttest, while the six-year assessment serves as a long-term follow-up. At three years, the initial and analysis Ns matched in all tables, suggesting no attrition due to the use of student records. At six years, the initial and analysis Ns in Table 1 indicated either no attrition or an attrition rate of 1.3%.
Sample: The sample was 64% female, 46% White, 35% Black, 10% Hispanic, and 10% other race/ethnicity. The average age of participants when they joined the study was 23. In addition, 31 percent of the sample was 24 or older, 27 percent had children upon study entry, and a majority were working part-time. About half the sample was nontraditional, that is, students over age 24, working full time, with children, or without high school diplomas (having received General Educational Development [GED] certificates or other high school equivalencies instead). Three out of four students had developmental education requirements at study entry, meaning that they needed to complete at least one developmental course (in math, reading, or writing) as they progressed through college. Finally, about a third of the participants entered the study as new students, with no college credits earned, while another third had already earned 13 or more college credits.
Measures: Measures of academic outcomes were obtained from detailed college transcript records and placement exam data, all of which were provided by the three participating colleges. In addition, data from the National Student Clearinghouse, which covers student enrollment in nearly all postsecondary institutions in the United States, were used to examine enrollment, transfer, and graduation rates. Miller et al. (2020) reported on two outcomes: (1) "any" degree completion (not distinguishing between certificate, associate's, and bachelor's); and (2) registered at a 4-year college. Hill et al. (2023) prespecified two confirmatory outcomes (receiving a degree and annual earnings) and three exploratory outcomes (ever earned an associate degree, ever earned a bachelor's degree, and ever employed in year six). The measures came from the National Student Clearinghouse and the Ohio unemployment insurance wage records.
Analysis: Miller et al. (2020) provided no information on the three-year analysis, aside from a note at the bottom of Table 5.1 on page 44 explaining that the analysis used two-tailed tests and that estimates were adjusted by site, cohort, gender, race/ethnicity, age, parental status, marital status, weekly hours worked, dependence on parents for 50 percent or more of financial support, whether a student is the first family member to attend college, whether a student earned a high school diploma, the number of developmental education requirements at the time of random assignment, and intended enrollment level. It was not possible to control for college outcomes at baseline, but there were two measures of earlier academic performance. Hill et al. (2023) similarly offered little information other than a long list of controls used to obtain adjusted condition means at follow-up.
Missing Data Strategy: Most analyses used the full randomized sample, but for three outcomes at six years (Hill et al., 2023), the analysis used complete cases, excluding the small portion of participants who were missing outcome data.
Intent-to-Treat: All available data were used in analyses except for 19 participants (1.3%) in Hill et al. (2023) who did not provide social security numbers and had missing data on the labor market outcomes.
Outcomes
Implementation Fidelity: For the most part, the colleges successfully implemented the programs based on the ASAP Ohio Program model. Nearly all treatment students who were enrolled in semester 1 were enrolled full-time, as required by the program. By semester 4 (2 years after baseline), however, only about 60 percent of enrolled students were enrolled full-time, illustrating a fall in program participation over time. Treatment students who were enrolled but were not enrolled full-time were not eligible for the program's financial support but were eligible for advising, tutoring, and career services. Enrollment dropped steadily over the semesters, as students either earned degrees or left school.
Baseline Equivalence: Baseline characteristics by condition were reported in Appendix Table B.1 (Miller et al., 2020, pp. 83-85). No significant baseline differences (p < .05) were detected for 58 demographic or academic-related variables (e.g., highest grade completed, diplomas/degrees earned, highest degree student plans to attain).
Differential Attrition: Except for a survey used to assess program implementation, there was no or only minimal attrition.
Posttest: In Miller et al. (2020), three years after baseline (considered the posttest), students in the treatment group showed significant improvements relative to the control condition on both outcomes tested: (1) degree completion, mostly at the associate level, in the last three semesters of the study period; and (2) registration at a 4-year college in the last two semesters of the study period. These results were listed in Table 5.1 on page 44.
In terms of "risk and protective factors," results also showed that compared to control students, students in the treatment group met developmental education requirements at significantly higher rates. They also had higher enrollment rates and credits earned in semesters 1-4 (Figure 5.1 and Table 5.2, pp. 46-47).
Moderation tests in Table E.2 (pp. 111-112) showed no significant differences in the intervention effects on degrees earned across numerous subgroups.
Long-Term: The six-year evaluation (Hill et al., 2023) found that the intervention group had significantly better outcomes than the control group for the two confirmatory measures: ever earned a degree and annual earnings. The supplemental document lists more detailed findings by semester, quarter, and subgroup and shows that the results were significant when adjusting for multiple comparisons. The intervention group also had significantly better outcomes than the control group on two of three exploratory measures: ever earned an associate degree and ever earned a bachelor's degree.
Tests for moderation found that the intervention "seemed to have larger impacts for women than for men" on degree completion, but the authors noted that "this finding should be interpreted with caution given the number of subgroups tested." There were also strong intervention impacts on annual earnings for students who did not have developmental education needs at baseline and on members of spring cohorts.