Research evaluating the technical merits of general outcome measures of science and social studies achievement is growing. This study examined the criterion validity of critical content monitoring. The research questions addressed the concurrent criterion validity of alternate presentation formats of critical content monitoring and the measure's predictive validity. Participants were fifth-grade students (N = 51) who completed five different forms of critical content monitoring probes, along with oral reading fluency and maze probes, across three benchmarking periods. Criterion measures were the science and social studies subtests of the online abbreviated Stanford Achievement Test, 10th edition. Concurrent correlation magnitudes for critical content monitoring ranged from .47 to .60; predictive correlations from fall and winter ranged from .23 to .64. In three of four cases, commonality analysis findings favored critical content monitoring over oral reading fluency and maze as benchmarking choices. Study limitations and implications for a benchmark assessment framework are discussed.
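The commonality analyses reported above partition the criterion variance (R²) explained by a pair of predictors into the portion unique to each predictor and the portion they share. As a rough illustration of the arithmetic, not the study's actual analysis, the two-predictor case can be sketched on synthetic data (all variable names and data here are hypothetical):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an OLS fit of y on X (intercept included)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()

def commonality_two_predictors(x1, x2, y):
    """Partition R^2 for two predictors into unique and common parts."""
    r2_1 = r_squared(x1[:, None], y)            # x1 alone
    r2_2 = r_squared(x2[:, None], y)            # x2 alone
    r2_12 = r_squared(np.column_stack([x1, x2]), y)  # both together
    return {
        "unique_x1": r2_12 - r2_2,              # variance only x1 explains
        "unique_x2": r2_12 - r2_1,              # variance only x2 explains
        "common": r2_1 + r2_2 - r2_12,          # variance shared by both
        "total": r2_12,
    }

# Hypothetical data: two correlated predictors of an achievement score,
# loosely analogous to critical content monitoring and oral reading fluency.
rng = np.random.default_rng(0)
shared = rng.normal(size=200)
x1 = shared + rng.normal(scale=0.8, size=200)
x2 = shared + rng.normal(scale=0.8, size=200)
y = 0.6 * shared + rng.normal(scale=0.7, size=200)
parts = commonality_two_predictors(x1, x2, y)
print(parts)
```

By construction the unique and common components sum to the full-model R², which is what lets a benchmarking study ask whether one measure adds explanatory variance beyond another.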
