DOI: 10.1145/3159450.3159642

Connecting Evaluation and Computing Education Research: Why is it so Important?

Published: 21 February 2018

ABSTRACT

With the growth of computing education research over the last decade, there has been a call to strengthen empiricism within the computing education research community. Computer science education researchers are being asked to focus not only on the innovation that the research creates or the question it answers, but also on validating the claims made about the work. In this session, we will explore the relationship between evaluation and computing education research and why it is so vital to the success of the many computing education initiatives underway. The session will also help computing faculty engaged in computer science education research understand why it is essential to integrate evaluation and validation from the very first conceptual stages of their intervention programs.


Published in

SIGCSE '18: Proceedings of the 49th ACM Technical Symposium on Computer Science Education
February 2018, 1174 pages
ISBN: 9781450351034
DOI: 10.1145/3159450

Copyright © 2018 Owner/Author

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


Qualifiers: abstract

            Acceptance Rates

SIGCSE '18 paper acceptance rate: 161 of 459 submissions (35%). Overall acceptance rate: 1,595 of 4,542 submissions (35%).
