Standards for Developing Assessments of Learning Using Process Data

Chapter in: Re-imagining University Assessment in a Digital World

Part of the book series: The Enabling Power of Assessment (EPAS, volume 7)

Abstract

Digital technology is changing the assessment of learning. Digitised assessment can be more administratively efficient, more easily scaled, more effectively targeted to individual levels of performance, more integrated into the learning environment and more interactive, and it can support more imaginative, colourful and timely feedback. However, in this chapter, it is argued that ‘more, faster and prettier’ is only part of the assessment story of the first quarter of the twenty-first century. Education institutions are also being pressed to make distinctive shifts in what is learned and thus in what is assessed. Students now need to establish mastery of complex learning outcomes that extend beyond the cognitive domain, and beyond mastery of content knowledge, to mastery of competence and skill, including soft skills, or general capabilities. This chapter explores this assessment frontier, examining whether and how the large quantities of digital, process-oriented data generated by learning management systems and other digital learning tools can be used to make reliable and valid judgments about the degree to which students have mastered complex general capabilities. It is argued that ‘metrolytic’ standards for the development of assessment tools can be applied to ensure the requisite validity and reliability.
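To make concrete how process data might feed such judgments, the sketch below shows one hypothetical pipeline: LMS event logs are aggregated into dichotomous indicators of a capability, and a learner ability estimate is then obtained with a simple Rasch (one-parameter logistic) model. This is purely illustrative and not drawn from the chapter; the event names, indicator definitions, thresholds and item difficulties are invented for illustration, and the chapter's 'metrolytic' standards concern the validity and reliability requirements any such pipeline would need to satisfy, not a particular implementation.

```python
# Illustrative sketch only (not the chapter's method): turning hypothetical LMS
# event logs ("process data") into dichotomous indicators of a capability and
# summarising them as a Rasch (1PL) ability estimate.

import math
from collections import defaultdict

# Hypothetical raw process data: (learner_id, event_type) records from an LMS log.
events = [
    ("s1", "forum_post"), ("s1", "forum_reply"),
    ("s2", "forum_post"), ("s2", "resource_view"),
    ("s3", "forum_reply"), ("s3", "peer_review"),
]

# Hypothetical indicators of a 'collaboration' capability: an indicator scores 1
# if the learner produced at least `threshold` events of the given type.
indicators = {
    "initiates_discussion": ("forum_post", 1),
    "responds_to_peers": ("forum_reply", 1),
    "gives_feedback": ("peer_review", 1),
}

# Illustrative item difficulties on the logit scale (in practice these would be
# calibrated from data, not assumed).
difficulties = {
    "initiates_discussion": -0.5,
    "responds_to_peers": 0.0,
    "gives_feedback": 0.5,
}

# Step 1: aggregate events per learner.
counts = defaultdict(lambda: defaultdict(int))
for learner, event in events:
    counts[learner][event] += 1

# Step 2: score each indicator dichotomously (0/1) per learner.
scored = {
    learner: {
        name: int(counts[learner][event] >= threshold)
        for name, (event, threshold) in indicators.items()
    }
    for learner in counts
}

# Step 3: estimate each learner's ability by maximising the Rasch likelihood
# with Newton-Raphson, holding item difficulties fixed.
def rasch_ability(responses, difficulties, iterations=25):
    theta = 0.0
    for _ in range(iterations):
        gradient, hessian = 0.0, 0.0
        for item, x in responses.items():
            p = 1.0 / (1.0 + math.exp(-(theta - difficulties[item])))
            gradient += x - p          # d log-likelihood / d theta
            hessian -= p * (1.0 - p)   # d^2 log-likelihood / d theta^2
        if abs(hessian) < 1e-9:
            break
        theta -= gradient / hessian
    return theta

for learner in sorted(scored):
    responses = scored[learner]
    print(learner, responses, "theta =", round(rasch_ability(responses, difficulties), 2))
```

Running the sketch prints each learner's indicator profile and a logit-scale ability estimate. In practice, the choice of indicators and the evidence that they actually measure the intended capability, rather than incidental platform behaviour, are exactly what standards of the kind the chapter proposes are meant to scrutinise.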

Author information

Correspondence to Sandra Milligan.

Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Milligan, S. (2020). Standards for Developing Assessments of Learning Using Process Data. In: Bearman, M., Dawson, P., Ajjawi, R., Tai, J., Boud, D. (eds) Re-imagining University Assessment in a Digital World. The Enabling Power of Assessment, vol 7. Springer, Cham. https://doi.org/10.1007/978-3-030-41956-1_13

  • DOI: https://doi.org/10.1007/978-3-030-41956-1_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-41955-4

  • Online ISBN: 978-3-030-41956-1

  • eBook Packages: Education (R0)
