Research

ONPAR

Auty, W. & Wright, L.J. (2020). Dynamic Interactive Formative Assessment Tasks and End-of-Unit Tests for Measuring Challenging Concepts and Skills of Diverse Middle School Students [Technical report].

Wright, L.J. & Malkin, L. (2020, June 1-5). Using formative assessment as an opportunity to promote academic language development: Results from the ONPAR science pilot [Poster session]. WIDA Board Meeting, online.

Wright, L.J., Malkin, L., & Eiring, E. (2019). Dynamic Interactive Formative Assessment Tasks and End-of-Unit Tests for Measuring Challenging Concepts and Skills of Diverse Middle School Students [Poster session]. WIDA Board Meeting, Madison, WI.

Wright, L.J. & Kopriva, R. (2019, March 6-9). Assessing English Learners' Understanding of Rigorous Mathematics in Classrooms [Conference presentation]. Society for Research on Educational Effectiveness, Washington, DC.

Wright, L.J. & Kopriva, R. (2018, October 8-10). ONPAR: Multisemiotic Assessment Design for Diverse Students [Conference presentation]. National Council on Measurement in Education Classroom Assessment Conference, Lawrence, KS.

Koran, J. & Kopriva, R.J. (2017). Framing appropriate accommodations in terms of individual need: Examining the fit of four approaches to selecting test accommodations of English language learners. Applied Measurement in Education, 30(2), 71-81.

Kopriva, R., Wright, L.J. & Malkin, L. (2017, August 10-11). Next Generation Science Design: Using Multisemiotics to Work for All in Authentic, Rich Assessment Environments [Conference presentation]. Designing Assessment for Three-Dimensional Standards: Current Approaches and Next Steps, Washington, DC.

Kopriva, R., Wright, L.J. & McGlone, M. (2017, April 27-May 1). Formative Assessment as a Response to the Standards Reform Movement [Poster presentation]. American Educational Research Association, San Antonio, TX.

Kopriva, R.J., Thurlow, M.L., Perie, M., Lazarus, S.S., & Clark, A. (2016). Test takers and the validity of score interpretations. Educational Psychologist, 51(1), 108-128.

Kopriva, R., Boals, T. & Wright, L.J. (2015, March 25-28). Effective Uses of Technology for Measuring Challenging Content in Classroom-Embedded Formative Assessment: What Works for English Learners [Conference presentation]. TESOL International Association, Toronto, Canada.

Kopriva, R.J. (2014). Second-generation challenges for making content assessments accessible for ELLs. Applied Measurement in Education, 27(4), 301-306.

Wright, L.J. (2014, November 13). ONPAR Virtual Performance Tasks in Mathematics and Science: Harnessing Technology to Capture Deep Learning of Rigorous Academic Content [Webinar]. Performance Testing Council, online.

Schultz, K. & McGlone, M. (2018, April). The promise of innovative methodologies for diverse students in mathematics assessment. Presentation at the annual meeting of the National Council of Teachers of Mathematics, Washington, DC.

Kopriva, R.J. & Wright, L. (2017). Score processes in assessing academic content of non-native speakers: Literature review and ONPAR summary. In K. Ercikan & J. Pellegrino (Eds.), Validation of score meaning for the next generation of assessments: The use of response processes (pp. 100-112). New York, NY: Routledge.

Kopriva, R. J. (2009). Assessing the skills and abilities in math and science of ELs with low English proficiency: A promising new method. AccELLerate!, 2(1), 7–10.

Kopriva, R. J. (2013, May). Preventing preventable tracking: The peril and promise for lower English proficient students. Presentation at the annual meeting of the National Council on Measurement in Education, San Francisco, CA.

Kopriva, R. J. (2013). The ONPAR methodology in the classroom: Why and how it seems to work for measuring challenging content of many ELLs. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J., Gabel, D., & Bauman, J. (2009). What happens when large-scale items actually use the computer’s capabilities? Exploring issues and redefining challenges. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J., Gabel, D., & Cameron, C. (2009). Overview of results from the ONPAR elementary and middle school science experimental study with ELs and non-ELs: A promising new approach for measuring complex content knowledge of English learners with lower proficiency levels. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J., Winter, P. C., Triscari, R., Carr, T. G., Cameron, C., & Gabel, D. (2013). Assessing the knowledge, skills, and abilities of ELs, selected SwDs and controls on challenging high school science content: Results from randomized trials of ONPAR and technology-enhanced traditional end-of-course biology and chemistry tests. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Logan-Terry, A., & Wright, L. J. (2010). Making thinking visible: An analysis of English language learners’ interactions with access-based science assessment items. AccELLerate!, 2(4), 11–14.

Myers, B. (2015). The cognitive basis of ONPAR assessment: A white paper.

Wright, L. J. & Logan-Terry, A. (2013, April). Incorporating students' voices in the accommodations debate: A discourse analysis of students' interactions with traditional and multisemiotic test items [PDF document]. PowerPoint presentation given at the annual meeting of the American Educational Research Association, Philadelphia, PA.

Wright, L. J. (2013). Multimodality and measurement: Promise for assessing English learners and students who struggle with the language demands of tests. White Paper.

Wright, L. J., & Kopriva, R. J. (2009). Using cognitive labs to refine technology-enhanced assessment tasks and ensure their accessibility: Insights from data collected to inform ONPAR elementary and middle school science task development. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Accessible Item and Test Development

Cawthon, S., Leppo, R., Carr, T. G., & Kopriva, R. J. (2013). Towards accessible assessments: The promises and limitations of test item adaptations for students with disabilities and English language learners. Educational Assessment, 18(2), 73-98.

Kopriva, R. J. (1999). A conceptual framework for the valid comparable measurement of all students: An outline. White Paper. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. (2000). Ensuring accuracy in testing for ELLs. Washington, DC: Council of Chief State School Officers.

Kopriva, R. J. (2008). Improving testing for English language learners: A comprehensive approach to designing, building, implementing, and interpreting better academic assessments. New York, NY: Routledge Publishers.

Kopriva, R. J. (2013). Issues and considerations for the second-generation of researchers interested in assessing ELLs on academic content: A review of four papers. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. & Albers, C. (2013). Considerations of achievement testing for students with individual needs. In K. F. Geisinger (Ed.), APA handbook of testing and assessment in psychology (Vol. 3, pp. 370-390). Washington, DC: American Psychological Association.

Kopriva, R. J., Wiley, D. E., & Emick, J. (2007). Inspecting the validity of large-scale assessment score inferences for ELLs and others under more optimal testing conditions: Does it measure up? Paper commissioned by the Assessment and Accountability Comprehensive Center, WestEd, San Francisco, CA.

Kopriva, R. J., Wiley, D., & Winter, P. (2003, April). Evidentiary logic in assessment of diverse learners: Valid Assessment of English Language Learners (VAELL) Project. Paper presented at the annual conference of the National Council on Measurement in Education, Chicago, IL.

Kopriva, R., Wiley, D. E., & Winter, P. (2004, April). Re-examining the role of individual differences in educational assessment. Paper presented at the annual conference of the National Council on Measurement in Education, San Diego, CA.

Kopriva, R. J. & Winter, P.C. (2012). Designing a cognitive skills framework for item development. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J., Winter, P. C., & Chen, C. S. (2006). Achieving accurate results for diverse learners with access-enhanced items: Summary of results from the VAELL grant.

Mann, H., Emick, J., Cho, M., & Kopriva, R. J. (2006, March). Considering the validity of test score inferences for English language learners with limited proficiency using language liaisons and other accommodations. Paper presented at the annual conference of the American Educational Research Association, San Francisco, CA.

Shaw, J., Abedi, J., & Kopriva, R. J. (in press). The future of content testing for assessing ELLs. Accepted for a special issue of Educational Assessment.

Sprehn, R. (2004). Standardized academic testing and culture: A white paper. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Winter, P. C., Kopriva, R. J., Chen, C., & Emick, J. E. (2006). Exploring individual and item factors that affect assessment validity for diverse learners: Results from a large-scale cognitive lab. Learning and Individual Differences, 16(4), 267-276.

Wright, L. J. & Bauman, J. (2009). The discourse of assessments: Identifying grammatical features of standardized science tests that contribute to their (in)accessibility for linguistically diverse learners. Presentation at the annual convention of the American Association for Applied Linguistics, Denver, CO.

Wright, L. J. & Logan-Terry, A. (2013, April). Incorporating students' voices in the accommodations debate: A discourse analysis of students' interactions with traditional and multisemiotic test items [PDF document]. PowerPoint presentation given at the annual meeting of the American Educational Research Association, Philadelphia, PA.

Classroom Assessment of English Learners

Emick, J., Monroe, R., & Malagon Sprehn, M. (2006). Culture, education, and testing in the United States: Investigating a novel response with classroom support for ELLs. Wisconsin Center for Education Research, University of Wisconsin-Madison, Madison, WI.

Kopriva, R. J. (2010a). Classroom assessment of ELs in content areas: What to consider and how to do it (Part 1). AccELLerate!, 3(1), 2–4.

Kopriva, R. J. (2010b). Getting down to the nitty-gritty: Making content assessment work for ELs in classrooms (Part 2). Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. (2010, March). Meaningfully assessing English learners in local and statewide academic assessments: What does it entail? Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R. J. (2011, January). Considerations for meaningful classroom assessment of ELs in math and science: Getting Started. (Part 2). Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R. J. (2013). The ONPAR methodology in the classroom: Why and how it seems to work for measuring challenging content of many ELLs. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R.J. & Lara, J. (1997). Scoring English language learners' papers more accurately. In Y. S. George & V.V. Van Horne (Eds.), Science education reform for all: Sustaining the science, mathematics, and technology education reform. Washington, DC: American Association for the Advancement of Science.

Kopriva, R.J. & Rasmussen, M. (2010, August). Considerations for meaningful classroom assessment of ELs in math and science: How to do it. (Part 1). Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R.J. & Saez, S. (1997). Guide to scoring LEP student responses to open-ended mathematics items. Washington, DC: Council of Chief State School Officers.

Kopriva, R.J. & Sexton, U. (1999). Guide to scoring LEP student responses to open-ended science items. Washington, DC: Council of Chief State School Officers.

Kopriva R.J. & Sexton, U. (2011). Using appropriate assessment processes in the classroom: How to get accurate information about the academic knowledge and skills of English language learners. In M. del Rosario-Basterra, E. Trumbull, & G. Solano-Flores (Eds.), Cultural validity in assessment: Addressing linguistic and cultural diversity. New York, NY: Routledge Publishers.

Roeber, E. (2014). Providing professional learning on formative assessment methods for educators. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Wright, L. J. & Logan-Terry, A. (2013, April). Incorporating students' voices in the accommodations debate: A discourse analysis of students' interactions with traditional and multisemiotic test items [PDF document]. PowerPoint presentation given at the annual meeting of the American Educational Research Association, Philadelphia, PA.

Differential Accommodations Assignment

Emick, J., Monroe, R., & Malagon Sprehn, M. (2006). Culture, education, and testing in the United States: Investigating a novel response with classroom support for ELLs. Wisconsin Center for Education Research, University of Wisconsin-Madison, Madison, WI.

Kopriva, R. J. (2010, March). Meaningfully assessing English learners in local and statewide academic assessments: What does it entail? Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R. J. & Carr, T. G. (2009, April). It’s about time: Matching English learners and the ways they take tests by using an online tool to properly address individual needs. Presentation at the annual meeting of the National Council on Measurement in Education, San Diego, CA.

Kopriva, R. J., Emick, J. E., Hipolito-Delgado, C. P., & Cameron, C. A. (2007). Do proper accommodations assignments make a difference? Examining the impact of improved decision-making on scores for English language learners. Educational Measurement: Issues and Practice, 26(3), 11-20.

Kopriva, R. J. & Koran, J. (2008). Proper assignment of accommodations to individual students. In R. J. Kopriva, Improving testing for English language learners: A comprehensive approach to designing, building, implementing, and interpreting better academic assessments (pp. 221-258). New York, NY: Routledge Publishers.

Kopriva, R. J., Koran, J. & Hedgspeth, C. (2007). Addressing the importance of systematically matching student needs and test accommodations. In C. Cahalan-Laitusis & L. L. Cook (Eds.), Large-scale assessment and accommodations: What works? Arlington, VA: Council for Exceptional Children.

Koran, J. & Kopriva, R. J. (2006). Teacher assignment of test accommodations for English language learners: Does it work? Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Myers, B., & Kopriva, R. J. (2015). Decision trees linking individual student need to large-scale accommodations for English learners: A white paper.

Policy Issues

Boals, T., Kopriva, R. J., & Lundberg, T. (2012). Why who takes what test matters: The need for thinking differently about English learners and content tests. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. (1999). A conceptual framework for the valid and comparable measurement of all students: An outline. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Kopriva, R. J. (2008). Getting started: Issues of participation, alignment and validating access with test specifications. In R. J. Kopriva, Improving testing for English language learners (pp. 67-81). New York, NY: Routledge Publishers.

Kopriva, R. J. (2010, March). Meaningfully assessing English learners in local and statewide academic assessments: What does it entail? Presentation for the National Clearinghouse for English Language Acquisition and Language Instruction Educational Programs.

Kopriva, R. J. (2012). English learners and large-scale content tests: Selected issues for today and tomorrow. Paper commissioned by the WIDA Consortium, University of Wisconsin-Madison, Madison, WI.

Kopriva, R. J., & Lara, J. (2009). Looking back and looking forward: Inclusion of all students in the National Assessment of Educational Progress. Historical paper commissioned for the National Assessment Governing Board (NAGB) 20th Anniversary Conference, Washington, DC.

Rigney, S. L., Wiley, D. E., & Kopriva, R. J. (2008). The past as preparation: Measurement, public policy and implications for access. In R. J. Kopriva (Ed.), Improving testing for English language learners: A comprehensive approach to designing, building, implementing, and interpreting better academic assessments (pp. 39-66). New York, NY: Routledge Publishers.

Scoring Practices

Kopriva, R. J. & Lara, J. (1997). Scoring English language learners' papers more accurately. In Y. S. George and V. V. Van Horne (Eds.), Science education reform for all: Sustaining the science, mathematics, and technology education reform. Washington, DC: American Association for the Advancement of Science.

Kopriva, R.J. & Saez, S. (1997). Guide to scoring LEP student responses to open-ended mathematics items. Washington, DC: Council of Chief State School Officers.

Kopriva, R.J. & Sexton, U. (1999). Guide to scoring LEP student responses to open-ended science items. Washington, DC: Council of Chief State School Officers. 

Koran, J. & Kopriva, R. J. (2006). Teacher and STELLA assignment of test accommodations for English language learners: Does it work? Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Winter, P., Kopriva, R.J., Chen, C. S., & Emick, J. (2006). Exploring individual and item factors that affect assessment validity for diverse learners: Results from a large-scale cognitive lab. Learning and Individual Differences, 16(4), 267-276.

Technical Considerations

Kopriva, R. J. (2008). Selected technical considerations. In R.J. Kopriva (Ed.), Improving testing for English language learners (pp. 283-322). New York, NY: Routledge Publishers.

Kopriva, R.J. (2010). Where are we and where could we go next? In P. C. Winter (Ed.), Evaluating the comparability of scores from achievement test variations. Washington, DC: Council of Chief State School Officers.

Kopriva, R. J., Wiley, D. E., & Winter, P. C. (2006). Analyzing change in skill complexity using specially constructed test scores.

Kopriva, R. J. & Winter, P. C. (2012). Designing a cognitive skills framework for item development. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.

Technology

Kopriva, R.J. (2013, November). Mixing it up to make the most out of tech-based techniques for at-risk students (and others too). Presentation at the Instructional Sensitivity Conference, University of Kansas, Lawrence, KS.

Testing Students with Disabilities

Cawthon, S., Leppo, R., Carr, T., & Kopriva, R. (2013). Towards accessible assessments: The promises and limitations of test item adaptations for students with disabilities and English language learners. Educational Assessment, 18(2), 73-98.

Kopriva, R.J. & Albers, C. (2013). Considerations of achievement testing for students with individual needs. In K. F. Geisinger (Ed.), APA handbook of testing and assessment in psychology (Vol. 3, pp. 370-390). Washington, DC: American Psychological Association.

Kopriva, R. J., & Lara, J. (2009). Looking back and looking forward: Inclusion of all students in the National Assessment of Educational Progress. Historical paper commissioned for the National Assessment Governing Board (NAGB) 20th Anniversary Conference, Washington, DC.

Kopriva, R. J., Winter, P. C., Triscari, R., Carr, T. G., Cameron, C., & Gabel, D. (2013). Assessing the knowledge, skills, and abilities of ELs, selected SwDs and controls on challenging high school science content: Results from randomized trials of ONPAR and technology-enhanced traditional end-of-course biology and chemistry tests. Madison, WI: Institute for Innovative Assessment, University of Wisconsin-Madison.