Research

ONPAR

Bauman, J. (2008). A linguistic model to investigate item components in dynamic, multi-semiotic content testing. Washington, DC: Center for Applied Linguistics.

Carr, T. G. & Kopriva, R. J. (2013, April). Allowing diverse students to "show & tell" on mathematics and science assessments: ONPAR computer-interactive response formats. Presentation at the annual meeting of the National Council on Measurement in Education, San Francisco, CA.

Carr, T. G. & Kopriva, R. J. (2013, September). Science assessments for the future. The Next Generation Science Standards: Updates, Challenges and Opportunities. Symposium conducted at the National Association of State Boards of Education, Pittsburgh, PA.

Carr, T. G. & Kopriva, R. J. (2013, May). Greater than the sum of its parts: Theoretical and conceptual underpinnings of the ONPAR methodology for conveying meaning through multiple, multi-semiotic representations. Paper presented at the annual meeting of the National Council on Measurement in Education, San Francisco, CA.

Kopriva, R. J. & Wright, L. (2017). Score processes in assessing academic content of non-native speakers: Literature review and ONPAR summary. In K. Ercikan & J. Pellegrino (Eds.), Validation of score meaning for the next generation of assessments: The use of response processes (pp. 100-112). New York, NY: Routledge.

Kopriva, R. J. (2009). Assessing the skills and abilities in math and science of ELs with low English proficiency: A promising new method. AccELLerate!, 2(1), 7–10.

Kopriva, R. J. (2013, May). Preventing preventable tracking: The peril and promise for lower English proficient students. Presentation at the annual meeting of the National Council on Measurement in Education, San Francisco, CA.

Kopriva, R. J. (2013). The ONPAR methodology in the classroom: Why and how it seems to work for measuring challenging content of many ELLs. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J., Carr, T. G., Gabel, D., & Cameron, C. (2011). Improving the validity of mathematics results for students with learning disabilities in reading and other SwDs who struggle with language and literacy: Findings from the ONPAR elementary and middle school mathematics experimental study.

Kopriva, R. J., Gabel, D., & Bauman, J. (2009). What happens when large-scale items actually use the computer’s capabilities? Exploring issues and redefining challenges. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J., Gabel, D., & Cameron, C. (2009). Overview of results from the ONPAR elementary and middle school science experimental study with ELs and non-ELs: A promising new approach for measuring complex content knowledge of English learners with lower proficiency levels. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J., Winter, P. C., Triscari, R., Carr, T. G., Cameron, C., & Gabel, D. (2013). Assessing the knowledge, skills, and abilities of ELs, selected SwDs and controls on challenging high school science content: Results from randomized trials of ONPAR and technology-enhanced traditional end-of-course biology and chemistry tests. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Logan-Terry, A., & Wright, L. J. (2010). Making thinking visible: An analysis of English language learners’ interactions with access-based science assessment items. AccELLerate!, 2(4), 11–14.

Myers, B. (2015). The cognitive basis of ONPAR assessment: A white paper.

Wright, L. J. & Logan-Terry, A. (2013, April). Incorporating students' voices in the accommodations debate: A discourse analysis of students' interactions with traditional and multisemiotic test items [PDF document]. PowerPoint presentation given at the annual meeting of the American Educational Research Association, Philadelphia, PA.

Wright, L. J. (2013). Multimodality and measurement: Promise for assessing English learners and students who struggle with the language demands of tests. White Paper.

Wright, L. J., & Kopriva, R. J. (2009). Using cognitive labs to refine technology-enhanced assessment tasks and ensure their accessibility: Insights from data collected to inform ONPAR elementary and middle school science task development. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Wright, L. J., Kopriva, R. J., & Carr, T. G. (2011). ONPAR elementary and middle school mathematics cognitive lab results: How students interact with traditional and dynamic assessment tasks with novel features.

Wright, L. J., Staehr-Fenner, D., Moxley, K., Kopriva, R. J., & Carr, T. G. (2013). Exploring how diverse learners interact with computerized, multi-semiotic representations of meaning: Highlights from cognitive labs conducted with ONPAR end-of-course biology and chemistry assessment tasks.

Accessible Item and Test Development

Carr, T. G. & Kopriva, R. J. (2008). Digging deeper: Using expert reviews of distractor distributions for parallel test item dyads to analyze effective and ineffective item characteristics for English learners, students with learning disabilities, and students who are deaf/hard-of-hearing. White Paper.

Cawthon, S., Leppo, R., Carr, T. G., & Kopriva, R. J. (2013). Towards accessible assessments: The promises and limitations of test item adaptations for students with disabilities and English language learners. Educational Assessment, 18(2), 73-98.

Kopriva, R. J. (1999). A conceptual framework for the valid and comparable measurement of all students: An outline. White Paper. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. (2000). Ensuring accuracy in testing for ELLs. Washington, DC: Council of Chief State School Officers.

Kopriva, R. J. (2008a). Getting started: Issues of participation, alignment and validating access with test specifications. In R. J. Kopriva, Improving testing for English language learners: A comprehensive approach to designing, building, implementing, and interpreting better academic assessments (pp. 67-81). New York, NY: Routledge Publishers.

Kopriva, R. J. (2008b). Improving testing for English language learners: A comprehensive approach to designing, building, implementing, and interpreting better academic assessments. New York, NY: Routledge Publishers.

Kopriva, R. J. (2008c). Other relevant accommodations and pretest support. In R. J. Kopriva, Improving testing for English language learners: A comprehensive approach to designing, building, implementing, and interpreting better academic assessments (pp. 205-218). New York, NY: Routledge Publishers.

Kopriva, R. J. (2008d). Tools, test forms, and reviews. In R. J. Kopriva, Improving testing for English language learners: A comprehensive approach to designing, building, implementing, and interpreting better academic assessments (pp. 145-170). New York, NY: Routledge Publishers.

Kopriva, R. J. (2013). Issues and considerations for the second-generation of researchers interested in assessing ELLs on academic content: A review of four papers. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. & Albers, C. (2013). Considerations of achievement testing for students with individual needs. In K. F. Geisinger (Ed.), APA handbook of testing and assessment in psychology (Vol. 3, pp. 370-390). Washington, DC: American Psychological Association.

Kopriva, R. J., Wiley, D. E., & Emick, J. (2007). Inspecting the validity of large-scale assessment score inferences for ELLs and others under more optimal testing conditions—Does it measure up? Paper commissioned by the Assessment and Accountability Comprehensive Center, WestEd, San Francisco, CA.

Kopriva, R. J., Wiley, D., & Winter, P. (2003, April). Evidentiary logic in assessment of diverse learners: Valid Assessment of English Language Learners (VAELL) Project. Paper presented at the annual meeting of the National Council on Measurement in Education, Chicago, IL.

Kopriva, R. J., Wiley, D. E., & Winter, P. (2004, April). Re-examining the role of individual differences in educational assessment. Paper presented at the annual meeting of the National Council on Measurement in Education, San Diego, CA.

Kopriva, R. J. & Winter, P.C. (2012). Designing a cognitive skills framework for item development. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J., Winter, P. C., & Chen, C. S. (2006). Achieving accurate results for diverse learners with access-enhanced items: Summary of results from the VAELL grant.

Mann, H., Emick, J., Cho, M., & Kopriva, R. J. (2006, March). Considering the validity of test score inferences for English language learners with limited proficiency using language liaisons and other accommodations. Paper presented at the annual conference of the American Educational Research Association, San Francisco, CA.

Shaw, J., Abedi, J., & Kopriva, R. J. (in press, 2013). The future of content testing for assessing ELLs. Accepted for a special issue of Educational Assessment.

Sprehn, R. (2004). Standardized academic testing and culture: A white paper. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Winter, P. C., Kopriva, R. J., Chen, C. S., & Emick, J. E. (2006). Exploring individual and item factors that affect assessment validity for diverse learners: Results from a large-scale cognitive lab. Learning and Individual Differences, 16(4), 267-276.

Wright, L. J. & Bauman, J. (2009). The discourse of assessments: Identifying grammatical features of standardized science tests that contribute to their (in)accessibility for linguistically diverse learners. Presentation at the annual convention of the American Association for Applied Linguistics, Denver, CO.

Wright, L. J. & Logan-Terry, A. (2013, April). Incorporating students' voices in the accommodations debate: A discourse analysis of students' interactions with traditional and multisemiotic test items [PDF document]. PowerPoint presentation given at the annual meeting of the American Educational Research Association, Philadelphia, PA.

Classroom Assessment of English Learners

August, D., Valdez, G., Heritage, M., Herman, J., Kopriva, R. J., & Bailey, A. (submitted, 2013). Instructional strategies and tools for teaching and assessing ELs in classrooms. Accepted for a special issue of Educational Assessment.

Emick, J., Monroe, R., & Malagon Sprehn, M. (2006). Culture, education and testing in the United States: Investigating a novel response with classroom support for ELLs. Madison, WI: Wisconsin Center for Education Research, University of Wisconsin-Madison.

Kopriva, R. J. (2010a). Classroom assessment of ELs in content areas: What to consider and how to do it (Part 1). AccELLerate!, 3(1), 2–4.

Kopriva, R. J. (2010b). Getting down to the nitty gritty: Making content assessment work for ELs in classrooms (Part 2). Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. (2010, March). Meaningfully assessing English learners in local and statewide academic assessments: What does it entail? Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R. J. (2011, January). Considerations for meaningful classroom assessment of ELs in math and science: Getting started (Part 2). Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R. J. (2013). The ONPAR methodology in the classroom: Why and how it seems to work for measuring challenging content of many ELLs. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. & Lara, J. (1997). Scoring English language learners' papers more accurately. In Y. S. George & V. V. Van Horne (Eds.), Science education reform for all: Sustaining the science, mathematics, and technology education reform. Washington, DC: American Association for the Advancement of Science.

Kopriva, R. J. & Rasmussen, M. (2010, August). Considerations for meaningful classroom assessment of ELs in math and science: How to do it (Part 1). Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R. J. & Saez, S. (1997). Guide to scoring LEP student responses to open-ended mathematics items. Washington, DC: Council of Chief State School Officers.

Kopriva, R. J. & Sexton, U. (1999). Guide to scoring LEP student responses to open-ended science items. Washington, DC: Council of Chief State School Officers.

Kopriva, R. J. & Sexton, U. (2011). Using appropriate assessment processes in the classroom: How to get accurate information about the academic knowledge and skills of English language learners. In M. del Rosario-Basterra, E. Trumbull, & G. Solano-Flores (Eds.), Cultural validity in assessment: Addressing linguistic and cultural diversity. New York, NY: Routledge Publishers.

Roeber, E. (2014). Providing professional learning on formative assessment methods for educators. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Wright, L. J. & Logan-Terry, A. (2013, April). Incorporating students' voices in the accommodations debate: A discourse analysis of students' interactions with traditional and multisemiotic test items [PDF document]. PowerPoint presentation given at the annual meeting of the American Educational Research Association, Philadelphia, PA.

Differential Accommodations Assignment

Emick, J., Monroe, R., & Malagon Sprehn, M. (2006). Culture, education and testing in the United States: Investigating a novel response with classroom support for ELLs. Madison, WI: Wisconsin Center for Education Research, University of Wisconsin-Madison.

Kopriva, R. J. (2010, March). Meaningfully assessing English learners in local and statewide academic assessments: What does it entail? Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R. J. & Carr, T. G. (2009, April). It’s about time: Matching English learners and the ways they take tests by using an online tool to properly address individual needs. Presentation at the annual meeting of the National Council on Measurement in Education, San Diego, CA.

Kopriva, R. J., Emick, J. E., Hipolito-Delgado, C. P., & Cameron, C. A. (2007). Do proper accommodations assignments make a difference? Examining the impact of improved decision-making on scores for English language learners. Educational Measurement: Issues and Practice, 26(3), 11-20.

Kopriva, R. J. & Koran, J. (2008). Proper assignment of accommodations to individual students. In R. J. Kopriva, Improving testing for English language learners: A comprehensive approach to designing, building, implementing, and interpreting better academic assessments (pp. 221-258). New York, NY: Routledge Publishers.

Kopriva, R. J., Koran, J., & Hedgspeth, C. (2007). Addressing the importance of systematically matching student needs and test accommodations. In C. Cahalan-Laitusis & L. L. Cook (Eds.), Large-scale assessment and accommodations: What works? Arlington, VA: Council for Exceptional Children.

Koran, J. & Kopriva, R. J. (2006). Teacher assignment of test accommodations for English language learners: Does it work? Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Myers, B., & Kopriva, R. J. (2015). Decision trees linking individual student need to large-scale accommodations for English learners: A white paper.

Policy Issues

Boals, T., Kopriva, R. J., & Lundberg, T. (2012). Why who takes what test matters: The need for thinking differently about English learners and content tests. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. (1999). A conceptual framework for the valid and comparable measurement of all students: An outline. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. (2008). Getting started: Issues of participation, alignment and validating access with test specifications. In R. J. Kopriva, Improving testing for English language learners (pp. 67-81). New York, NY: Routledge Publishers.

Kopriva, R. J. (2010, March). Meaningfully assessing English learners in local and statewide academic assessments: What does it entail? Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R. J. (2012). English learners and large-scale content tests: Selected issues for today and tomorrow. Paper commissioned by the WIDA Consortium, University of Wisconsin-Madison, Madison, WI.

Kopriva, R. J., & Lara, J. (2009). Looking back and looking forward: Inclusion of all students in the National Assessment of Educational Progress. Historical paper commissioned for the National Assessment Governing Board (NAGB) 20th Anniversary Conference, Washington, DC.

Rigney, S. L., Wiley, D. E., & Kopriva, R. J. (2008). The past as preparation: Measurement, public policy and implications for access. In R. J. Kopriva (Ed.), Improving testing for English language learners: A comprehensive approach to designing, building, implementing, and interpreting better academic assessments (pp. 39-66). New York, NY: Routledge Publishers.

Scoring Practices

Kopriva, R. J. & Lara, J. (1997). Scoring English language learners' papers more accurately. In Y. S. George & V. V. Van Horne (Eds.), Science education reform for all: Sustaining the science, mathematics, and technology education reform. Washington, DC: American Association for the Advancement of Science.

Kopriva, R. J. & Saez, S. (1997). Guide to scoring LEP student responses to open-ended mathematics items. Washington, DC: Council of Chief State School Officers.

Kopriva, R. J. & Sexton, U. (1999). Guide to scoring LEP student responses to open-ended science items. Washington, DC: Council of Chief State School Officers.

Koran, J. & Kopriva, R. J. (2006). Teacher and STELLA assignment of test accommodations for English language learners: Does it work? Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Winter, P. C., Kopriva, R. J., Chen, C. S., & Emick, J. E. (2006). Exploring individual and item factors that affect assessment validity for diverse learners: Results from a large-scale cognitive lab. Learning and Individual Differences, 16(4), 267-276.

Technical Considerations

Kopriva, R. J. (2008). Selected technical considerations. In R. J. Kopriva (Ed.), Improving testing for English language learners (pp. 283-322). New York, NY: Routledge Publishers.

Kopriva, R. J. (2010). Where are we and where could we go next? In P. C. Winter (Ed.), Evaluating the comparability of scores from achievement test variations. Washington, DC: Council of Chief State School Officers.

Kopriva, R. J., Cameron, C. A., & Carr, T. G. (2008). The limits of DIF for special populations: Why this item evaluation tool is flawed for English learners, students with learning disabilities, and students who are deaf/hard-of-hearing.

Kopriva, R. J., Wiley, D. E., & Winter, P. C. (2006). Analyzing change in skill complexity using specially constructed test scores.

Kopriva, R. J. & Winter, P. C. (2012). Designing a cognitive skills framework for item development. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Technology

Kopriva, R. J. (2013, November). Mixing it up to make the most out of tech-based techniques for at-risk students (and others too). Presentation at the Instructional Sensitivity Conference, University of Kansas, Lawrence, KS.

Testing Students with Disabilities

Carr, T. G. & Kopriva, R. J. (2008). Digging deeper: Using expert reviews of distractor distributions for parallel test item dyads to analyze effective and ineffective item characteristics for English learners, students with learning disabilities, and students who are deaf/hard-of-hearing. White Paper.

Cawthon, S., Leppo, R., Carr, T. G., & Kopriva, R. J. (2013). Towards accessible assessments: The promises and limitations of test item adaptations for students with disabilities and English language learners. Educational Assessment, 18(2), 73-98.

Kopriva, R. J. & Albers, C. (2013). Considerations of achievement testing for students with individual needs. In K. F. Geisinger (Ed.), APA handbook of testing and assessment in psychology (Vol. 3, pp. 370-390). Washington, DC: American Psychological Association.

Kopriva, R. J., Carr, T. G., Gabel, D., & Cameron, C. (2011). Improving the validity of mathematics results for students with learning disabilities in reading and other SwDs who struggle with language and literacy: Findings from the ONPAR elementary and middle school mathematics experimental study.

Kopriva, R. J., & Lara, J. (2009). Looking back and looking forward: Inclusion of all students in the National Assessment of Educational Progress. Historical paper commissioned for the National Assessment Governing Board (NAGB) 20th Anniversary Conference, Washington, DC.

Kopriva, R. J., Winter, P. C., Triscari, R., Carr, T. G., Cameron, C., & Gabel, D. (2013). Assessing the knowledge, skills, and abilities of ELs, selected SwDs and controls on challenging high school science content: Results from randomized trials of ONPAR and technology-enhanced traditional end-of-course biology and chemistry tests. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.