Abstract
The purpose of this study was to evaluate the technical adequacy of an online adaptation of vocabulary matching known as critical content monitoring. Validity and reliability studies were conducted in fifth-grade science content with a sample of 106 students from one school. Participants were administered 20 parallel forms of the general outcome measure over a 2-week period. Criterion-related validity correlations with a statewide accountability test ranged from .36 to .55 across the 20 parallel forms, and a pooled estimate of the common correlation between the state test and the probes was .45. Although differences among the correlations were not found, statistically significant differences in probe mean scores were identified across the body of parallel forms. Alternate-form reliability correlations ranged from .21 to .73, with a median of .56. Student commentary on the online assessment process was largely positive. Limitations and implications are addressed.