


Page history last edited by jkotsch1@kent.edu 10 years, 2 months ago

Technology-Based Assessments to Plan Differentiated Instruction


“Concentration of assessment data and cumulative documentation in computerized systems benefits all parties involved in the educational processes” (Eyal, 2012). 


Pros and Cons


Positive Aspects of Digitized/Technology-Driven Assessment Results

  • “The learners benefit because they have information about their scores, their implementation of tasks on time and their attendance records, as well as an overall picture of their learning situation relative to other students” (Eyal, 2010)
  • “The teachers benefit because it is possible to address a variety of learning styles and levels, and strengthen the personal connection between teachers and learners, thanks to the potential for ongoing dialogue and personal feedback” (Eyal, 2010)
  • “Principals may also use the digitally displayed assessment data; they can receive a general profile of a single student or class profiles at different levels of comparison” (Eyal, 2012). 
  •  “Computerized documentation of collection data enables precise assessment, reflection, and feedback” (Eyal, 2012). 
  •  “Computerized assessment items may include graphics, sound, animation, and multimedia with response options at different levels” (Eyal, 2012).


Negative Aspects of Digitized/Technology-Driven Assessment Results

  • Possibility of cheating increases
    • Peer-to-peer cheating as well as at-home cheating with a parent
    • Students may also look up answers from a book or other mobile device if not monitored 
  • Some assessments do not provide much interaction between students and teachers 
  • Deeper-thinking questions may involve more subjective responses, which normally are not included on digital assessments 


Informal Assessments


Formative Assessment


  • E-portfolios
    • Using websites as a space for files
      • Free sites for students, like Google Sites
      • Wikis can also be made to look like websites, which would also allow for student reflection 
    • Folioweb
    • PowerPoint as an interactive portfolio (with hyperlinks) 
  • Tech-created documents, filed into a physical portfolio
    • Word Documents
    • Print-outs of presentations, like PowerPoint
    • Filled-in PDF documents
    • Print-outs of computerized test results
    • Print-out of progress from computer-assisted instruction 


Observation/Anecdotal Notes

  • Apps allow for mobile observational/anecdotal note-taking
  • Apps available for notes
    • Evernote
    • Notability
    • Bamboo Paper
    • Microsoft One Note
    • MyScript Memo
    • Notes Plus 
  • Word processing also allows for digitized notes
  • Spreadsheet checklists
    • Teachers make a checklist of skills and write daily notes as to student performance on those skills 
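The spreadsheet-checklist idea above can be sketched in a few lines of Python; the skill names, student names, and status labels here are hypothetical examples, and in practice a teacher would more likely maintain this in Excel or Google Sheets:

```python
# A minimal sketch of a spreadsheet-style skills checklist.
# All names and status labels below are made up for illustration.
import csv
import io

skills = ["Decoding", "Fluency", "Comprehension"]
students = {
    "Alice": {"Decoding": "mastered", "Fluency": "developing", "Comprehension": ""},
    "Ben":   {"Decoding": "developing", "Fluency": "", "Comprehension": "mastered"},
}

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["Student"] + skills)  # header row: one column per skill
for name, notes in students.items():
    # one row per student, blank cells where no note has been made yet
    writer.writerow([name] + [notes.get(skill, "") for skill in skills])

print(buffer.getvalue())
```

The resulting CSV opens directly in any spreadsheet program, where daily notes on each skill can be added cell by cell.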


Teacher-Created Assessments

  • Rapid Response Systems
    • Ex: Mimio Vote 
  • Hot Potatoes teacher-made quizzes 



Journals

  • Students can write exit tickets, summaries, etc. in the form of a classroom journal 
    • Teachers can use blogs and RSS feeds that send the blog directly to the teacher
  • Online journals
  • Tablet journal apps
    • My Daily Journal
    • Private Journal
    • Chronicle for iPad
    • Moment Diary
    • My Daily Thoughts 
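The blog/RSS workflow above (student journal posts delivered directly to the teacher) could be sketched as follows. The feed content and URLs are hypothetical, and the feed is parsed from a string here rather than fetched over the network:

```python
# A minimal sketch of reading student journal posts from an RSS feed.
# The feed below is a made-up example; a real workflow would fetch the
# feed from the class blog's RSS URL.
import xml.etree.ElementTree as ET

rss = """<rss version="2.0"><channel>
  <title>Class Journal</title>
  <item><title>Exit ticket: fractions</title><link>http://example.com/1</link></item>
  <item><title>Summary: water cycle</title><link>http://example.com/2</link></item>
</channel></rss>"""

root = ET.fromstring(rss)
# Collect (title, link) pairs for each journal entry in the feed.
posts = [(item.findtext("title"), item.findtext("link"))
         for item in root.iter("item")]

for title, link in posts:
    print(title, link)
```

An RSS reader or a small script like this lets the teacher scan new entries without visiting each student's blog individually.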


Student Self-Assessment

  • Reviewing portfolio materials and writing a summary (Word document)


Formal Assessments 

Standardized Tests

  • Dynamic Indicators of Basic Early Literacy Skills (DIBELS)
    • Three benchmark assessments per year, with progress monitoring as needed
    • Results are digitized after the assessors give the assessment (results initially paper-based)
    • Digitized results can easily be made into graphs from the DIBELS program
    • Also has an iPad app
  • Math and reading benchmarks/progress monitoring
    • Also has an iPad app
  • Both of these allow teachers to assess student knowledge in order to level students accordingly for differentiation 


State Achievement Tests

  • Computerized testing is the direction of future state testing
  • Concerns 
    • Students' ability to perform online; student typing and computer skills 
  • New state testing mandates using technology and performance-based assessment


Progress Monitoring Assessments

  • Ex: Renaissance Learning (STAR Reading, STAR Math, STAR Early Literacy)
  • AIMSweb assessment 


Computer-Adapted Reading and Math Assessments

  • STAR Reading published by Renaissance Learning, Inc.
  • Scantron
    • Adaptive branching: "A procedure that evaluates a student's answer for each question and then displays the next question adapted to the correct level of difficulty" (McCormick & Zutell, 2011, p. 114).
    • Scantron machines allow students to use programs like Accelerated Math, which is an adaptive branching program 
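As a rough illustration of the adaptive-branching idea quoted above (not Scantron's or Accelerated Math's actual algorithm), a minimal sketch might look like this; the difficulty scale and the `next_difficulty` helper are hypothetical:

```python
# A minimal sketch of adaptive branching: evaluate each answer, then
# choose the next question's difficulty level accordingly.
# The 1-5 difficulty scale is a made-up example.

def next_difficulty(current_level, answered_correctly, min_level=1, max_level=5):
    """Move up one difficulty level after a correct answer, down one
    after an incorrect answer, staying within the item bank's range."""
    step = 1 if answered_correctly else -1
    return max(min_level, min(max_level, current_level + step))

# Simulate a short adaptive session, starting mid-range.
responses = [True, True, False, True]  # correct/incorrect answers
level = 3
history = []
for correct in responses:
    level = next_difficulty(level, correct)
    history.append(level)

print(history)  # each question's level adapts to the previous answer
```

The effect is that strong students are quickly routed to harder items while struggling students get easier ones, which is what makes the resulting scores useful for leveling students for differentiation.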



Online Assessment 

Online assessment is probably the most technology-driven assessment tool today. 


Some considerations and questions to keep in mind when reviewing an online assessment include:

  • The validity and reliability of the test: does it measure what it is supposed to measure?
  • Is the assessment standards-based?
  • Can the student work independently on the test or will an adult need to be present?
  • Does the assessment offer data gathering tools?
  • Is there both a pre-assessment and post-assessment?
  • What formats of questions are used?
    • Typical online assessments are mostly objective tests with a selection of answers provided
    • These include: multiple choice, true/false, and one-word cloze sentences
  • How interactive is it between students and teachers? (Sorensen and Takle, 2005)
  • Is it formative or summative?
    • “Effective integration of formative assessment in online learning environments has the potential to offer an appropriate structure for sustained meaningful interactions among learners and the teacher, and foster development of effective learning communities to facilitate meaningful learning and its assessment” (Sorensen and Takle, 2005). 


Designing Learning Assessments

Common Design Mistakes (Shank, 2011)

  • Expecting a bell curve - expecting the majority of students to fall in the "middle" of the bell curve does not indicate mastery of the subject matter. Instead of designing instruction to reflect the bell curve grading scale, course facilitators should provide instruction, practice, feedback, and remediation in order for students to successfully meet intended learning outcomes (pp. 4 - 5).
  • The wrong type of assessment - assessments should challenge students to demonstrate or perform with understanding. According to Shank, "...the optimal assessment type depends primarily on whether the objective is declarative (facts: name, list, state...) or procedural (task: calculate, formulate, build...)" (p. 5). It is important when designing assessments to include measures evaluating students' ability to know about the topic and how it is applied to a real-world situation (Kelly, 2006).
  • Invalid assessments - a quality assessment is valid in that it measures what it purports to measure. Quality is established by carefully matching course objectives and assessments (p. 5).
  • Poorly-written multiple choice exams - two common mistakes when writing multiple-choice questions are confusing language and flawed alternatives from which the learner selects the correct response. Poorly written question sets negatively impact the validity of assessments (pp. 5-6).
Additional Assessment Tools

  • www.studentprogress.org: National Center on Student Progress Monitoring. This site provides information on progress monitoring tools.
  • www.engrade.com: Online grading system.
  • http://nces.ed.gov/nceskids/createagraph/: A site to teach students how to graph their own progress. 



References


Eyal, L. (2012). Digital assessment literacy—the core role of the teacher in a digital environment. Educational Technology & Society, 15(2), 37–49.


Eyal, L. (2010). The reciprocity between learning-content management system (LCMS) and the assessment of learners in Elearning courses (Unpublished doctoral dissertation). Bar-Ilan University, Ramat-Gan, Israel.


Globman, R., & Kula, E. (2005). Multi-face evaluation. Holon, Israel: Basic Publishing.


Kelly, R. (2006, January). Authentic experiences, assessment development, online students' marketable skills. Online Classroom. Retrieved from http://www.vcu.edu/cte/resources/newsletters_archive/OC0601.PDF


McCormick, S., & Zutell, J. (2011). Instructing students who have literacy problems. (6th ed.). Boston, MA: Pearson. 


Shank, P. (2011, August). Four typical online learning assessment mistakes. Faculty Focus. Retrieved from http://ctfd.sfsu.edu/feature/four-typical-online-learning-assessment-mistakes.htm


Sorensen, E. K., & Takle, E. S. (2005). Investigating knowledge building dialogues in networked communities of practice: A collaborative learning endeavor across cultures. Interactive Educational Multimedia, 10, 50–60.


Return to table of contents
