Increasing students’ academic success in postsecondary endeavors is an important goal for both high school and college institutions today. However, the standards for high school graduation and college readiness are not well aligned, and successful transition from high school to college is problematic for many students, particularly in math. This article describes a P-16 collaborative effort to examine high school math achievement in relation to college math placement and how the results informed policies and practices in both organizations.
Academic achievement and college success are important both to an individual’s future well-being and to the country’s economic health. Baum, Ma, and Payea (2013) report that college graduates (bachelor’s degree) had median annual earnings of $56,000 in 2011, compared with median annual earnings of $35,000 for those with a high school diploma. Over the course of a lifetime, the difference in earning potential is dramatic. Similarly, health statistics indicate that college-educated individuals experience lower rates of health risks such as smoking and obesity (Baum et al., 2013). Finally, the poverty rate of college-educated individuals is lower than that of noncollege-educated individuals, their participation in federal public assistance programs is lower, and the federal income tax dollars generated by individuals with a bachelor’s degree are higher (Baum et al., 2013). While these are just a few examples, overall, both individuals and the nation as a whole benefit from increased educational attainment.
Adequate high school preparation is critical for subsequent success in college and careers. High school graduation rates have been one metric of a school system’s success for many years, and when the No Child Left Behind Act (U.S. Department of Education, 2002) included graduation rates in the federal accountability system, the emphasis on increasing graduation rates was heightened. Nationally, high school graduation rates increased more than 10 percentage points from 2000 to 2013 (Deming & Figlio, 2016). In 2013, New Jersey had an overall graduation rate of 87.5%, an increase of 4.4 percentage points from 2011 (DePaoli et al., 2015).
More recent efforts have focused not only on dropout prevention to increase graduation rates but also on increasing the number of students graduating college and career ready. One of the stated goals of the Race to the Top federal education initiative (U.S. Department of Education, 2009) was for all students to graduate from high school college and career ready. However, the definition of college readiness has varied depending on the source. For example, as reported in 2008, four leading sets of college readiness standards differed significantly (Rolfhus, Decker, Brite, & Gregory, 2010). The American Diploma Project had 62 English language arts standards, ACT had 191, the College Board had 115, and Standards for Success had 73; when the English language arts content and rigor of these four sets of standards were systematically compared, the alignment varied considerably.
The development of the Common Core State Standards was formally initiated in 2009 by the Council of Chief State School Officers (CCSSO) and the National Governors Association (NGA), with the intent to create a national set of K-12 standards with sufficient rigor to prepare students to be successful in college and careers (CCSSO & NGA, 2009). Conceptually, these standards would provide a consistent definition of college readiness across the states that adopted them (Tepe, 2014).
Despite graduation rate gains and some increased consensus about the definition of college and career readiness, The New York Times (Rich, 2015) recently questioned graduates’ readiness to successfully engage in college-level academic work. The lack of alignment between student proficiency levels required to graduate from high school and those required for success in college has been described in multiple studies (Venezia, Callan, Finney, Kirst, & Usdan, 2005; Venezia, Kirst, & Antonio, 2003; Wilkins, Hartman, Howland, & Sharma, 2010). Lack of communication and collaboration across K-12 school districts and postsecondary institutions to align efforts to adequately prepare high school graduates for college-level coursework has been identified as a problem by numerous researchers (Venezia et al., 2003; Venezia et al., 2005). Venezia et al. (2003) describe the lack of postsecondary understanding of K-12 academic standards and performance levels, as well as high school administrators’ and counselors’ lack of understanding of postsecondary placement practices, as key barriers to ensuring students are placed in the most appropriate college courses.
This lack of communication and alignment from high school to postsecondary institutions has led to varying practices of placing students in first-year college courses. The challenge is to identify a process by which students take the highest course in which they will be successful (achieve a passing grade) and avoid taking courses for which they are not prepared and are more likely to fail (Venezia et al., 2003). Many colleges use student performance on standardized tests such as the SAT or ACT to determine placement in initial courses (Bracco et al., 2014; Hodara & Cox, 2016). Other institutions rely on placement tests administered quickly and inexpensively to students as part of their registration process (Hughes & Scott-Clayton, 2011; Jaggars & Hodara, 2013; Scott-Clayton, Crosta, & Belfield, 2014). The accuracy of these assessments to predict future success has been questioned by a number of scholars (Bracco et al., 2014; Hughes & Scott-Clayton, 2011; Kirst, 2007; Latterell & Regal, 2003; Scott-Clayton, 2012). Several studies explore and recommend the use of multiple measures, including high school course completion and grades, as a way to increase the accuracy of placing students in the courses which will be most beneficial (Hughes & Scott-Clayton, 2011; Scott-Clayton, 2012; Scott-Clayton et al., 2014).
Numerous research studies have examined high school students’ mathematics performance as it relates to their subsequent performance at the postsecondary level (Adelman, 1999, 2004, 2006; Adelman, Danie, & Berkovits, 2003). For students assessed as not prepared to succeed in college-level math, the most common practice has been to recommend enrollment in developmental math (DM) courses. Developmental (or remedial) courses typically do not accrue credits for graduation or degree purposes. Considerable attention has focused on understanding the positive and negative effects of remedial courses on various measures of student success, such as persistence, subsequent success in passing courses, college credit accumulation, and on-time graduation (4-6 years). Assessing the impact of remedial course taking is complex. Logic would suggest that less prepared students are less likely to achieve these various indicators of college success; however, it is challenging to separate the effects of inadequate preparation from those of remedial course taking. One way to understand the effects of taking remedial courses is to first identify students who achieved similar levels of academic preparation and then compare outcomes for those who took remedial courses with those who did not.
The findings from several studies with research designs sufficiently rigorous to account for these differences report mixed outcomes. Comparing students with similar levels of high school preparation, Bettinger and Long (2009) found positive effects for Ohio college students who took remedial courses (lower dropout rates after 5 years and an increased 4- to 6-year graduation rate). That same study, however, showed negative effects on credit accumulation and first-year dropout. Multiple other analyses and studies report negative impacts of remedial course taking on outcomes such as attempted and completed academic credits, completing 1 year of college, and degree completion (Attewell, Lavin, Domina, & Levey, 2006; Calcagno & Long, 2008; Martorell & McFarlin, 2011). In addition, these studies show that, while some students may not experience negative achievement effects from remedial courses, they also do not show improved outcomes or benefits. Several studies describe differential effects based on students’ level of prior academic preparation, reporting that students with higher previous achievement (closer to the cutoff scores for college course readiness) who take remedial courses experience more negative outcomes (Chen, 2016; Scott-Clayton & Rodriguez, 2015). These studies also suggest the academic effects of remediation are mixed, the benefits are unclear for many of the more prepared students, and remediation may be an effective intervention for a much smaller portion of students than current practices reflect.
One of the clearest negative impacts on students who take remedial courses is the increased tuition cost associated with taking courses that do not count toward degree completion credits (Kirst, 2007; Scott-Clayton, 2012; Scott-Clayton et al., 2014; Scott-Clayton & Rodriguez, 2015; Venezia et al., 2003). Given that recent data suggest 50% of college students will take at least one remedial course in college, the economic costs of remedial courses, to both institutions and individuals, are significant (Chen, 2016; Martorell & McFarlin, 2011; Scott-Clayton et al., 2014; Scott-Clayton & Rodriguez, 2015). Without evidence of clear benefit, recommending that students incur additional expense for remedial courses is a questionable practice.
Postsecondary assessment and placement practices have consequences for students. In particular, the overplacement of students in remedial courses is described as a more frequent problem than the overplacement of students in credit-bearing courses where they struggle (Scott-Clayton, 2012; Scott-Clayton et al., 2014). The potential negative effects of taking remedial courses, particularly for better-prepared students, and the mixed evidence of benefits, compel institutions to examine course placement practices and subsequent student success measures at a local level. The goal is to place students in the highest courses in which they are likely to succeed (based on evidence of prior achievement), but avoid enrolling them in courses for which they do not have adequate preparation to successfully complete. This study focused specifically on K-12 mathematics achievement and postsecondary initial math course placement and success (achieving a passing grade in a first-semester credit-bearing math course) in a small urban university serving a significant percentage of first-generation college-going students.
A collaborative effort between a 4-year university (University) and a local P-12 district (District) provided the context to examine the links between mastery of specific math concepts and courses (K-12 achievement), postsecondary enrollment in developmental or credit-bearing math courses, and subsequent grades in the math courses. The research collaboration required a legal data sharing agreement (DSA) and represents an important cross-organizational effort to improve student outcomes.
The study question was as follows: Does mastery of specific mathematics courses and content (such as Algebra II [AlgII]) lead to first-year placement and success in credit-bearing postsecondary mathematics courses? The goal for the District was to better understand how to prepare students for success in college and careers, and the goal for the University was to enroll increased numbers of students in credit-bearing entry-level college mathematics courses (decreased numbers of students enrolled in DM courses) who subsequently earn a passing grade.
To better understand the problem and potential solutions, the study examined extant data on first-year University students who had attended and graduated from a local feeder school District. As an initial attempt to understand the alignment (or lack of alignment) between high school and college mathematics success, the methodology in this study involved collecting and analyzing descriptive data. There were three primary steps in developing the University/District collaborative effort and conducting the study:
1. Identify the required data elements and the data system from which to extract each. For each student, the following data were used:
From the District: high school standardized test scores and mathematics courses taken with corresponding grades earned and
From the University: SAT scores, University math placement test (MPT) scores (see Note 1), first-semester math course placement, and related first-semester math course grades.
2. Conduct descriptive analyses of relationships between K-12 mathematics performance and college mathematics course enrollment and performance. The primary analysis was a Pearson correlation in which high school math course taking, grades, and SAT scores were the independent variables and first-semester University mathematics course enrollment and grades were the dependent variables.
3. Synthesize results and identify how the results can inform interventions to improve student outcomes. The researcher and District administrator intended to examine the relationship between high school achievement and college math course success as a starting point for evaluating programs the high school had implemented to improve student achievement. The University achievement data would serve as an additional data point to examine in combination with the individual student data the District had been collecting over time. The researcher also worked collaboratively with the University institutional effectiveness office to examine results and implications for improved course placement practices.
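The descriptive correlation analysis in the steps above can be sketched in a few lines of Python. The paired scores below are illustrative placeholders, not the study cohort, and the helper function is simply a plain implementation of the Pearson product-moment formula:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient for paired scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative paired values for seven hypothetical students
# (independent variable: SAT math score; dependent variable: first-semester
# placement level coded 0-3, lowest developmental to highest credit bearing).
sat_math = [400, 430, 450, 470, 500, 520, 560]
placement = [0, 0, 1, 1, 2, 2, 3]

print(f"r = {pearson_r(sat_math, placement):.3f}")
```

In practice, a statistical package would also report the significance level (p value) alongside r; the sketch shows only the coefficient itself.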
As described, the data collected for this analysis were confidential student data. Sharing confidential information across the two agencies is limited by Family Educational Rights and Privacy Act (FERPA) regulations, and a legal counsel-approved DSA was developed to address these issues. It outlined the roles and responsibilities of each organization, defined the data collection requirements, and described the logistics and protocols for ensuring student confidentiality.
Collecting Student-Level Data
The researcher met several times with an IT programmer to generate a list of the specified first-year University students who had attended the P-12 school District with the following data elements: student name, birthdate (for ensuring K-12 record match), SAT scores (if available), university MPT scores (ARITH and ALGE), first-semester math course enrollment, and first-semester math course grade. The query generated a list of 55 students in an Excel spreadsheet.
The data elements added by the District were High School Proficiency Assessment (HSPA) scores (see Note 2); 8th-grade standardized test math scores; and 9th-, 10th-, 11th-, and 12th-grade math courses taken, with grades for each. The final data set contained 54 students (one student was dropped due to inactive high school status).
Data Analyses
The researcher initially sorted the completed data set by several different variables: SAT score, MPT score, and first-semester college course placement. While it was assumed that the University math department used multiple criteria for placing students in first-semester math courses, sorting the data in various ways made clear that 98% of the students enrolled in a math course were placed in developmental (two options: one lower and one higher) or credit-bearing courses on the basis of their MPT scores alone, regardless of what other math achievement data were available.
Students were placed in first-semester math courses as follows:
If the student scored ≤67 on the ARITH test and <77 on the ALGE test, she or he was placed in the lowest DM course (noncredit bearing).
If the student scored >67 on the ARITH test and <77 on the ALGE test, she or he was placed in a higher DM course (noncredit bearing).
If the student scored >67 on the ARITH test and ≥77 on the ALGE test, she or he was placed in a credit-bearing college math course.
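The three placement rules above can be sketched as a small decision function. The cutoffs (ARITH 67, ALGE 77) are those reported in the study; the function name, return labels, and the handling of the one score combination the rules do not cover are illustrative assumptions:

```python
def place_student(arith: int, alge: int) -> str:
    """Map the two MPT scores to a first-semester math placement.

    Cutoffs follow the rules described in the study; labels are illustrative.
    """
    if arith > 67 and alge >= 77:
        return "credit-bearing college math"
    if arith > 67 and alge < 77:
        return "higher developmental math (noncredit)"
    # The study specifies the lowest placement for ARITH <= 67 with ALGE < 77.
    # The remaining combination (ARITH <= 67 with ALGE >= 77) is not addressed
    # in the rules as stated and is treated here as lowest developmental.
    return "lowest developmental math (noncredit)"

print(place_student(arith=70, alge=80))  # credit-bearing college math
print(place_student(arith=70, alge=60))  # higher developmental math (noncredit)
print(place_student(arith=60, alge=60))  # lowest developmental math (noncredit)
```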
As a result of the reliance predominantly on MPT scores to determine first-semester math course placement, the statistical correlation analysis originally planned (examining high school courses taken in relation to first-semester math placement and grades) was not as meaningful as expected. Summary observations (see Table 1) indicate that, of the students who took AlgII (see Note 3) in 11th grade (almost all received passing grades) but did not take a math course in 12th grade, 78% were placed in a DM course, compared with 33% of the students who took P&S, Pre-Calc, or Calc in 11th or 12th grade.
Table 1. High School Math Course Taking Compared With First-Semester College Course Enrollment.

Given the limited findings from examining the relationship between high school course taking and first-semester college math course placement, other measures of high school success were examined. SAT scores have been validated as a predictor of first-year college grade point average (Kobrin, Patterson, Shaw, Mattern, & Barbuti, 2008) and can be considered a measure of high school math achievement in lieu of high school course taking. The collected data were therefore examined using SAT math scores as a proxy for high school math achievement. Summary observations indicate that, of the 54 students in the data set, 17% were not placed in first-semester math courses. All of these students had SAT scores above 400 (see Note 4), and three had taken and passed Pre-Calc in high school.
Of the students placed in first-semester math courses, 69% were enrolled in DM courses. For those students, SAT scores and DM course placement are presented in Table 2. Approximately 70% of the students placed in DM courses had SAT math scores above 400, and about 50% had scores of 450 or above. Logically, one would expect that as SAT scores increase, fewer students would be placed in the lowest DM course and increasingly placed in higher courses; however, students who scored between 450 and 499 were placed in the lowest DM course at more than twice the rate of students scoring lower (400-449). The reverse is true for the same SAT score ranges and placement in the higher DM course.
Table 2. SAT Score and Developmental Math (DM) Course Placement.

Regarding student success in first-semester math courses, as reflected in grades earned, too few students were in each grade category for each course to draw any conclusions.
Correlation of Math SAT Scores With MPT and HSPA Scores and First-Year Math College Course Placement
Using the Pearson correlation coefficient (r) to examine the relationships between students’ SAT scores and their MPT scores, HSPA scores, and first-year college course placement (lowest developmental to highest credit bearing), the analyses indicated statistically significant (.01 level) correlations for each (see Table 3). The strongest correlation was between the SAT score and the math score on the high school exit exam (r = .6475, p < .0001). The weakest was between the SAT and the ALGE score (r = .5005, p = .0057).
Table 3. Correlation Between SAT Math Scores and Other Assessments and First-Year Course.

Summary statistics describing the relationship between the math SAT and course placement (as determined by both MPT scores) provide additional information (see Table 4 and Figure 1). For the lowest DM course, the range of SAT scores is the widest (220 points); for the highest credit-bearing course, it is the narrowest (80 points). For the highest DM course and the lowest credit-bearing course, the mean, minimum, and maximum scores, as well as the ranges, are similar.
Table 4. Summary Statistics by Math SAT and Course Placement.

Further analysis of the correlation between the math SAT score and the ALGE MPT score by college course placement indicates a correlation for the highest DM course of r = .6047, though not at a significant level (p = .064). For the lowest credit-bearing course, the correlation between the SAT and ALGE MPT scores (r = .0595, p = .8888) indicates no relationship. If an SAT score cutoff (>449) had been used to place students in these two courses (highest developmental and lowest credit bearing), 30% would have been placed in different courses. Of the students who took the credit-bearing course, 18% would instead have taken the noncredit-bearing course (these students earned passing grades in the first semester), and of those who took the noncredit-bearing course, 88% would instead have taken the credit-bearing course.
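This kind of counterfactual reclassification, asking how many placements would change under an SAT cutoff, can be sketched as follows. The student records are illustrative placeholders (the study’s individual-level data are not reproduced); only the >449 cutoff comes from the analysis described above:

```python
SAT_CUTOFF = 449  # SAT math > 449 -> credit bearing, per the cutoff discussed above

# Illustrative (sat_math, actual_placement) pairs, not the study cohort.
students = [
    (430, "credit"),  # placed in credit bearing despite SAT below cutoff
    (460, "credit"),
    (470, "dev"),     # placed in developmental despite SAT above cutoff
    (440, "dev"),
    (480, "dev"),
]

def sat_placement(sat_math):
    """Hypothetical SAT-only placement rule."""
    return "credit" if sat_math > SAT_CUTOFF else "dev"

moved = sum(1 for sat, actual in students if sat_placement(sat) != actual)
print(f"{moved} of {len(students)} placements would change under the SAT cutoff")
```

Applied to the real cohort, the analogous count is what yields the 30% figure reported in the text.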
Discussion of the Data Analyses
The answer to the original research question, “Does mastery of specific mathematics courses and content (such as algebra) lead to first-year placement and success in credit-bearing postsecondary mathematics courses?” was not as readily obtained as expected, because the University used student MPT scores for first-semester math course placement rather than prior high school mathematics course-taking patterns or other existing math achievement measures (e.g., course grades, SAT scores). However, the data analyses indicated that students who took AlgII in 11th grade but no math course in 12th grade were enrolled in first-semester college DM courses at more than twice the rate of students who took P&S or higher in Grade 11 or 12 (most in Grade 12). High school math teachers and advisors can share this information with students and parents to help them understand that students who take advanced math courses in Grade 12 may be better prepared to enroll in credit-bearing courses in their first semester of college, avoiding the expense and the potentially negative effects of taking DM courses.
The analysis of the data on high school mathematics achievement as measured by the SAT, and on student placement and success in first-semester math courses, provides insight and informs further action and investigation to improve student success. The high correlation between MPT scores and SAT scores suggests the MPT scores do not provide additional helpful information for differentiation in the college course placement process. Furthermore, the similarity of the descriptive statistics and the lack of correlation between the SAT scores and MPT scores for the highest DM course and the lowest credit-bearing course suggest the MPT does not consistently differentiate between students placed in these courses. Finally, use of SAT scores rather than MPT scores could potentially result in more students enrolling in credit-bearing courses in which they could likely succeed.
In terms of grades earned at the end of the first semester, too few students are in each course and grade classification to draw meaningful conclusions. It is notable that almost all students passed the math courses they took first semester.
The researcher initially met with the University director of institutional effectiveness to review the complete data set and discuss the implications for the University. The researcher subsequently reviewed the findings with additional University administration and follow-up analyses and action were discussed. Finally, the data and findings were shared with the District administrator and the analyses discussed for potential District action.
Discussion of the University/District Research Partnership
The primary goal of this collaborative project was to establish a trusted working relationship between a P-12 District and a University such that confidential student achievement data could be shared to examine a current problem of practice of ensuring student success when transitioning from high school to college math. This goal was accomplished through regular, ongoing communication between the organizations; developing and adhering to the DSA; and collecting, analyzing, and sharing data related to the problem of practice examined. Regular meetings, frequent progress updates via phone calls and e-mails, and clear communication at every stage of the study prevented any unexpected developments that could have undermined the relationship. The University researcher had prior work experience in P-12 organizations, which was an asset in understanding and communicating across the different cultural norms of each. The DSA, in particular, helped build trust between the two organizations by establishing transparency and mutual expectations for the roles and responsibilities of each.
While the collaboration did not have the exact outcomes expected (the ability to examine the relationship between high school math achievement and postsecondary math course placement and achievement), the data examined did provide useful, actionable information for both organizations. For the University, the study data made placement practices more transparent and stimulated discussion of other methods for placing students and supporting their success in college-level math courses. The University now also considers SAT scores in the math course placement process. The findings also contributed to ongoing discussions between faculty and administration regarding the value, utility, and design of DM courses, and the math department revised its course offerings to align more closely with students’ majors (STEM, non-STEM). For the District, understanding University placement practices and their impact on student course taking can inform interventions to help students transition successfully to college-level math, such as advising students on the importance of enrolling in higher level math courses in Grade 12, making University placement practices more transparent, and increasing students’, parents’, and counselors’ awareness of the potentially negative effects of taking DM courses. The District had been considering offering an alternative higher level math course for 12th-grade students who have completed AlgII but do not want to enroll in Pre-Calc or P&S, and this study confirmed the potential positive effect of that action. Subsequently, the District added Discrete Math to the course offerings for students who successfully complete AlgII, and Advanced Placement Computer Science Principles for students who successfully complete Foundations to Computer Science.
The outcomes of this particular data collection and analyses represent an important step in developing a cross-organization conversation and collaboration to examine how to improve successful student transitions from high school to college. The results provide useful data to inform policy development and decisions in practice.
Limitations of the Study
The primary limitation of this study is the small data set; it is difficult to draw conclusions and make recommendations for changes in policy and/or practice based on the small cohort examined. The second limitation is that the time frame for the study was relatively short (12 months); examination of different cohorts over several years would have strengthened the findings. However, the limited findings suggest that additional data collection and analyses could provide meaningful information to add to these initial observations and inform improved college math placement decisions that result in increased college math course credit attainment.
Additional Questions for Further Consideration
The findings of this collaborative research study lead to additional questions for further investigation.
For the students in the study data set:
What were their second-semester math course grades in their first year of college? What was their final pass rate in credit-bearing math courses?
How many of these students were enrolled in the University for their second year of college?
How many of these students enrolled in credit-bearing math courses their second year at the University?
Of the students who took and passed DM courses, how many enrolled in credit-bearing math courses their second year at the University?
Are the correlations among SAT scores, HSPA scores, and MPT scores evident in this data set the same or similar for other cohorts of students from this District?
Are the correlations among SAT scores, HSPA scores, and MPT scores evident in this data set the same or similar for students from other local P-12 feeder school districts?
The answers to these questions could either strengthen or refute the initial observations from the existing analyses, and could contribute to improved decision making regarding student interventions, supports, and course placements to increase students’ successful transition from high school to college math.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Notes
1. The University administers two MPTs to incoming students: ARITH, which assesses basic math skills, and ALGE, which assesses basic algebra skills.
2. The HSPA is a state standardized test designed to assess mastery of 11th-grade state standards in Mathematics and Language Arts Literacy. Students take the test in the spring of Grade 11 and must score 200 on each section to graduate from high school (New Jersey Department of Education, 2006).
3. The typical sequencing of high school math courses from lowest to highest (in this study) is AlgII, Probability & Statistics (P&S), Precalculus (Pre-Calc), and Calculus (Calc).
4. The state department of education had defined an SAT score of ≥400 as one indicator of sufficient proficiency for high school graduation (New Jersey Department of Education, 2014).
References
Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns, and bachelor’s degree attainment. Washington, DC: U.S. Department of Education.
Adelman, C. (2004). Principal indicators of student academic histories in postsecondary education, 1972-2000. Washington, DC: U.S. Department of Education.
Adelman, C. (2006). The toolbox revisited: Paths to degree completion from high school through college. Washington, DC: U.S. Department of Education. Retrieved from http://www2.ed.gov/print/rschstat/research/pubs/toolboxrevisit/index.html
Adelman, C., Danie, B., Berkovits, I. (2003). Postsecondary attainment, attendance, curriculum and performance: Selected results from the NELS:88/2000 postsecondary education transcript study. Washington, DC: National Center for Education Statistics.
Attewell, P., Lavin, D., Domina, T., Levey, T. (2006). New evidence on college remediation. Journal of Higher Education, 77, 886-924.
Baum, S., Ma, J., Payea, K. (2013). Education pays 2013: The benefits of higher education for individuals and society. New York, NY: College Board, Advocacy & Policy Center.
Bettinger, E., Long, B. T. (2009, May). Addressing the needs of underprepared students in higher education: Does college remediation work? (NBER Working Paper No. 11325). Retrieved from http://www.nber.org/papers/w11325.pdf
Bracco, K., Dadgar, M., Austin, K., Klarin, B., Broed, M., Finkelstein, N., . . . Bugler, D. (2014). Exploring the use of multiple measures for placement into college-level courses: Seeking alternatives or improvement to the use of a single standardized test. San Francisco, CA: WestEd.
Calcagno, J., Long, B. (2008, July). The impact of postsecondary remediation using a regression discontinuity approach: Addressing endogenous sorting and noncompliance (Working Paper No. 14194). Cambridge, MA: National Bureau of Economic Research. doi:10.3386/w14194
Chen, X. (2016). Remedial coursetaking at U.S. public 2- and 4-year institutions: Scope, experiences, and outcomes (NCES 2016-05). Washington, DC: U.S. Department of Education, National Center for Education Statistics.
Council of Chief State School Officers & National Governors Association. (2009). Common Core State Standards initiative: Preparing America’s students for college and career. Retrieved from http://www.corestandards.org/about-the-standards/development.process/
Deming, D., Figlio, D. (2016). Accountability in U.S. education: Applying lessons from K-12 experience to higher education. Journal of Economic Perspectives, 30(3), 33-56.
DePaoli, J., Fox, J., Ingram, E., Maushard, M., Bridgeland, J., Balfanz, R. (2015). Building a grad nation: Progress and challenge in ending the high school dropout epidemic. Baltimore, MD: Civic Enterprises/Johns Hopkins University.
Hodara, M., Cox, M. (2016). Developmental education and college readiness at the University of Alaska (REL 2016-123). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northwest.
Hughes, K., Scott-Clayton, J. (2011). Assessing developmental assessment in community colleges. Community College Review, 39, 327-351. doi:10.1177/0091552111426898
Jaggars, S., Hodara, M. (2013). The opposing forces that shape developmental education. Community College Journal of Research and Practice, 37, 575-579.
Kirst, M. (2007). Who needs it? Identifying the proportion of students who require postsecondary remedial education is virtually impossible. National Crosstalk. Retrieved from http://www.highereducation.org/crosstalk/ct0107/voices0107-kirst.shtml
Kobrin, J., Patterson, B., Shaw, E., Mattern, K., Barbuti, S. (2008). Validity of the SAT® for predicting first-year college grade point average. New York, NY: College Board. Retrieved from http://files.eric.ed.gov/fulltext/ED563202.pdf
|
Latterell, C., Regal, R. (2003). Are placement tests for incoming undergraduate mathematics students worth the expense of administration? Primus, 13, 152-164. doi:10.1080/10511970308984054 Google Scholar | Crossref | |
|
Martorell, P., McFarlin, I. (2011). Help or hindrance? The effects of college remediation on academic and labor market outcome. Review of Economics and Statistics, 93, 436-454. Google Scholar | Crossref | ISI | |
|
New Jersey Department of Education . (2006). Your guide to the HSPA. Retrieved from http://www.nj.gov/education/assessment/hs/hspa_guide_english.pdf Google Scholar | |
|
New Jersey Department of Education . (2014). UPDATED: Graduation requirements for the classes of 2016, 2017, and 2018. Retrieved from http://www.state.nj.us/education/intervention/memos/120214grad.pdf Google Scholar | |
|
Rich, M. (2015, December 27). As graduation rates rise, experts fear diplomas come up short. The New York Times. Retrieved from https://www.nytimes.com/2015/12/27/us/as-graduation-rates-rise-experts-fear-standards-have-fallen.html?_r=0 Google Scholar | |
|
Rolfhus, E., Decker, L. E., Brite, J. L., Gregory, L. (2010). A systematic comparison of the American Diploma Project English language arts college readiness standards with those of the ACT, College Board, and Standards for Success (Issues & Answers Report, REL 2010–No. 086). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. Google Scholar | |
|
Scott-Clayton, J. (2012). Do high-stakes placement exams predict college success? (Working Paper No. 41). New York, NY: Teachers College, Columbia University. Retrieved from http://ccrc.tc.columbia.edu/media/k2/attachments/high-stakes-predict-success.pdf Google Scholar | |
|
Scott-Clayton, J., Crosta, P., Belfield, C. (2014). Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36, 371-393. doi:10.3102/0162373713517935 Google Scholar | SAGE Journals | ISI | |
|
Scott-Clayton, J., Rodriguez, O. (2015). Development, discouragement, or diversion? New evidence on the effects of college remediation policy. Association for Education Finance and Policy, 10, 4-45. doi:10.1162/EDFP_a_00150 Google Scholar | Crossref | ISI | |
|
Tepe, L. (2014). Common core goes to college: Building better connections between high school and higher education. Washington, DC: New America Foundation. Retrieved from https://www.newamerica.org/education-policy/policy-papers/common-core-goes-to-college Google Scholar | |
|
U.S. Department of Education . (2002). No Child Left Behind: A desktop reference. Retrieved from https://www2.ed.gov/admins/lead/account/nclbreference/reference.pdf Google Scholar | |
|
U.S. Department of Education . (2009). Race to the top program executive summary. Retrieved from http://www2.ed.gov/programs/racetothetop/executive-summary.pdf Google Scholar | |
|
Venezia, A., Callan, P., Finney, J., Kirst, M., Usdan, M. (2005). The governance divide: A report on a four-state study on improving college readiness and success. San Jose, CA: National Center for Public Policy and Higher Education. Retrieved from http://www.highereducation.org/reports/governance_divide/index.shtml Google Scholar | |
|
Venezia, A., Kirst, M., Antonio, A. (2003). Betraying the college dream: How disconnected K-12 and postsecondary education systems undermine student aspirations. San Jose, CA: Stanford University. Google Scholar | |
|
Wilkins, C., Hartman, J., Howland, N., Sharma, N. (2010). How prepared are students for college-level reading? Applying a Lexile®-based approach (Issues & Answers Report, REL 2010–No. 094). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. Google Scholar |
Author Biography
Jenifer J. Hartman served as a P-12 public school administrator for eleven years before joining the Regional Educational Laboratory Southwest as Director of Practice-Based Research. Since 2013, she has been a university professor and currently teaches in the educational leadership preparation program at the University of South Florida St. Petersburg.


