
System for performing assessment without testing

  • Publication Date:
    March 29, 2016
  • Additional Information
    • Patent Number:
      9,299,266
    • Appl. No:
      13/209722
    • Application Filed:
      August 15, 2011
    • Abstract:
      A system for educational assessment without testing is provided that includes one or more client systems that are connected to a network allowing students or school officials to communicate with an education framework that performs and manages educational assessment. The one or more client systems issue a message to the education framework requesting a task to be performed. The educational assessment is administered independent of one or more educators so as to avoid interruption of instruction time. A server system receives the message and the education framework proceeds to process the contents of the message. The education framework includes a plurality of programming modules being executed on the server system that provides to educators specific information used for the educational assessment based on the contents of the message. The programming modules assist in calculating and determining one or more parameters for the educational assessment of the students as well as providing specific reports to educators as to the progress of the students.
    • Inventors:
      Crawford, Elizabeth Catherine (Somerville, MA, US); Petscher, Yaacov (Tallahassee, FL, US); Schatschneider, Christopher (Tallahassee, FL, US)
    • Assignees:
      LEXIA LEARNING SYSTEMS LLC (Concord, MA, US)
    • Claim:
      1. A system for educational assessment without testing comprising: one or more client systems that are connected to a network allowing students or educators to communicate with an education framework that performs and manages educational assessment, the one or more client systems that issue a message to the education framework requesting a task to be performed, the educational assessment is administered independent of one or more educators so as to avoid interruption of instruction time; and a server system that receives the message such that the education framework proceeds to process the contents of the message, the server system implementing an assessment module configured to calculate an education assessment by evaluating a student's quantified usage as well as a plurality of other student performance variables in determining whether a student is receiving benefit from using the education framework, the assessment module configured to calculate an average weekly usage time from the student's quantified usage, and compare the average weekly usage time to a norm sample to define a comparison value, and to use the comparison value with the other student performance variables to produce a performance predictor which specifically measures a student's chance of meeting a predefined benchmark, wherein a reporting module implemented by said server system simultaneously generates reports including (1) said performance predictor from a plurality of quantified risk performance variables for said students of a particular grade level, and (2) an indication of a skill set from a plurality of quantified skill sets for said students of said particular grade level to produce a prescription for each of said students, said quantified risk performance variables include a first set of a plurality of identifiers and said quantified skill sets include a different second set of a plurality of identifiers, said prescription is categorized by at least said first set of identifiers and said second set of identifiers and includes both a time said student of a particular grade level should use the system, and available lessons to target instruction, the performance predictor indicating each student's percent chance of reaching the predefined benchmark for a defined grade level and measuring the risk of failure, the benchmark is based on the norm sample and is correlated with other established progress monitoring tools allowing assessment of each student's performance predictor and determines a level of intensity of instruction needed to increase a likelihood each student meets the predefined benchmark.
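The assessment pipeline recited in claim 1 (average weekly usage, comparison against a norm sample, then a performance predictor expressing a percent chance of meeting a benchmark) can be sketched roughly as follows. All names, the z-score comparison, the logistic link, and the weighting scheme are illustrative assumptions, not the patented implementation:

```python
import math
from dataclasses import dataclass


@dataclass
class NormSample:
    """Summary statistics for the norm sample (illustrative fields)."""
    mean_weekly_minutes: float
    std_weekly_minutes: float


def average_weekly_usage(minutes_per_week: list[float]) -> float:
    """Average weekly usage time computed from the student's quantified usage."""
    return sum(minutes_per_week) / len(minutes_per_week)


def comparison_value(avg_weekly: float, norm: NormSample) -> float:
    """Compare average weekly usage to the norm sample; a z-score is assumed."""
    return (avg_weekly - norm.mean_weekly_minutes) / norm.std_weekly_minutes


def performance_predictor(comparison: float,
                          other_variables: dict[str, float],
                          weights: dict[str, float]) -> float:
    """Percent chance of meeting the predefined benchmark.

    Combines the usage comparison value with the other weighted student
    performance variables; the logistic link is an assumption.
    """
    score = weights.get("usage", 1.0) * comparison
    score += sum(weights.get(name, 0.0) * value
                 for name, value in other_variables.items())
    return 100.0 / (1.0 + math.exp(-score))
```

Under these assumptions a student whose usage matches the norm-sample mean, with no other variables supplied, lands at an even 50% chance; positive comparison values and positively weighted variables raise the prediction.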
    • Claim:
      2. The system of claim 1, wherein the students perform one or more selected daily skill activities.
    • Claim:
      3. The system of claim 2, wherein the server system further implements a student performance data module that analyzes the performance data of one or more selected daily skill activities and calculates the plurality of other student performance variables used in the educational assessment of the student.
    • Claim:
      4. The system of claim 3, wherein the student performance data module stores the plurality of other student performance variables in a database.
    • Claim:
      5. The system of claim 1, wherein the one or more benchmarks are calculated by determining a percentage of the students in the norm sample that completed each level of the one or more selected daily activities.
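Claim 5's benchmark computation — the percentage of norm-sample students who completed each activity level — could look like this minimal sketch. The data layout (a mapping from level name to per-student completion flags) is an assumption, not the patented schema:

```python
def benchmark_percentages(completions: dict[str, list[bool]]) -> dict[str, float]:
    """Percent of norm-sample students completing each activity level.

    `completions` maps an activity-level name to one completion flag per
    student in the norm sample (an assumed layout for illustration).
    """
    return {
        level: 100.0 * sum(flags) / len(flags)
        for level, flags in completions.items()
    }
```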
    • Claim:
      6. The system of claim 5, wherein the one or more parameters select a skill set that defines the educational level the student achieves using daily skills activities.
    • Claim:
      7. The system of claim 3, wherein the reporting module generates reports illustrating the student performance variables in a fashion that assists educators in the educational assessment of a student, class, school or district.
    • Claim:
      8. A method of performing educational assessment without comprising a test event comprising: providing one or more client systems that are connected to a network allowing students or educators to communicate with an education framework executed on a server system that performs and manages educational assessment, the one or more client systems configured to issue a message to the education framework requesting a task to be performed, the educational assessment is administered independent of one or more educators so as to avoid interruption of instruction time; and receiving the message and processing, via the education framework, the contents of the message, the education assessment is calculated via an assessment module implemented by the server system by evaluating a student's quantified usage as well as a plurality of other student performance variables in determining whether a student is receiving benefit from using the education framework, the student's quantified usage and the other student performance variables are used to produce a performance predictor which specifically measures a student's chance of meeting a predefined benchmark, the assessment module calculating an average weekly usage time from the student's quantified usage and the other student performance variables and comparing the average weekly usage time to a norm sample to produce the performance predictor, wherein a reporting module implemented by said server system simultaneously generates a report including (1) said performance predictor from a plurality of quantified risk performance predictors for said students of a particular grade level, the student's quantified usage and the other student performance variables, and (2) an indication of a skill set from a plurality of quantified skill sets for said students of said particular grade level to produce a monthly prescription for each of said students, said quantified risk performance variables include a first set of a plurality of identifiers and said quantified skill sets include a different second set of a plurality of identifiers, said prescription is categorized by at least said first set of identifiers and said second set of identifiers and includes both the time said student of a particular grade level should use the system, and available lessons to target instruction, the performance predictor indicating each student's percent chance of reaching the defined benchmark for a defined grade level and measuring the risk of failure, the benchmark is based on the norm sample and is correlated with other established monitoring tools allowing assessment of each student's performance predictor and determines the level of intensity of instruction needed to increase the likelihood each student meets the defined benchmark.
    • Claim:
      9. The method of claim 8, wherein the students perform daily skill activities.
    • Claim:
      10. The method of claim 9, wherein the server system further implements a student performance data module that analyzes the performance data of the student using the program and calculates the plurality of other student performance variables used in the educational assessment of the student.
    • Claim:
      11. The method of claim 10, wherein the student performance data module stores the plurality of other student performance variables in a database.
    • Claim:
      12. The method of claim 8, wherein the end-of-year benchmark is calculated by determining a percentage of the students in the norm sample that completed each level of the one or more selective educational assessment programs.
    • Claim:
      13. The method of claim 12, wherein the plurality of other student performance variables determine a skill set that defines the educational level the student achieves using the one or more educational programs.
    • Claim:
      14. The method of claim 10, wherein the reporting module generates reports illustrating the plurality of other student performance variables in a fashion that assists educators with the educational assessment of a student, class, school or district.
    • Claim:
      15. A memory device for storing a program being executed on an education framework, the program performs a method of performing educational assessment without comprising a test event, the method comprising: allowing students or educators to communicate with an education framework executed on a server system that performs and manages educational assessment using one or more client systems that are connected to a network, the one or more client systems issue a message to the education framework requesting a task to be performed, the educational assessment is administered independent of one or more educators so as to avoid interruption of instruction time; and receiving the message and processing, via the education framework, the contents of the message, the server system implementing an assessment module configured to calculate the education assessment by evaluating a student's quantified usage as well as a plurality of other student performance variables in determining whether a student is receiving benefit from using the education framework, the assessment module configured to calculate an average weekly usage time from the student's quantified usage, and compare the average weekly usage time to a norm sample to define a comparison value, and to use the comparison value with the other student performance variables to produce a performance predictor which specifically measures a student's chance of meeting a predefined benchmark; wherein a reporting module implemented by said server system simultaneously generates a report including (1) said performance predictor from a plurality of quantified risk performance variables for said students of a particular grade level, and (2) a skill set from a plurality of quantified skill sets for said students of said particular grade level to produce a monthly prescription for each of said students, said quantified risk performance variables include a first set of a plurality of identifiers and said quantified skill sets include a different second set of a plurality of identifiers, said prescription is categorized by at least said first set of identifiers and said second set of identifiers and includes both the time said student of a particular grade level should use the system, and available lessons to target instruction, the performance predictor indicating each student's percent chance of reaching the defined benchmark for a defined grade level and measuring the risk of failure, the benchmark is based on the norm sample and is correlated with other established progress monitoring tools allowing assessment of each student's performance predictor and determines the level of intensity of instruction needed to increase the likelihood each student meets the defined benchmark.
    • Claim:
      16. The memory device of claim 15, wherein the students perform daily skill activities.
    • Claim:
      17. The memory device of claim 16, wherein the server system further implements a student performance data module that analyzes the performance data of the student using the program and calculates the plurality of other student performance variables that are used in the educational assessment of the student.
    • Claim:
      18. The memory device of claim 17, wherein the student performance data module stores the plurality of other student performance variables in a database.
    • Claim:
      19. The memory device of claim 15, wherein the end-of-year benchmark is calculated by determining a percentage of the students in the norm sample that completed each level of the one or more selective educational assessment programs.
    • Claim:
      20. The memory device of claim 15, wherein the plurality of other student performance variables determine a skill set that defines the educational level the student achieves using the one or more educational programs.
    • Claim:
      21. The memory device of claim 17, wherein the reporting module generates reports illustrating the plurality of other student performance variables in a fashion that assists educators with the educational assessment of a student, class, school or district.
    • Claim:
      22. The system of claim 1, wherein the student's quantified usage is determined based on the student's use of the education framework to perform individualized skill activities.
    • Claim:
      23. The system of claim 1, wherein the performance predictor is associated with one of “On Target,” “Some Risk,” and “High Risk”.
    • Claim:
      24. The system of claim 1, wherein each student performance variable from the plurality of other student performance variables includes a weight associated with a predictive strength of that student performance variable.
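Claim 23 associates the performance predictor with one of three named risk labels. A minimal sketch of that labeling step, with cutoff thresholds that are purely assumed (the claim names the categories but gives no boundaries):

```python
def risk_category(percent_chance: float,
                  on_target_cutoff: float = 70.0,
                  some_risk_cutoff: float = 40.0) -> str:
    """Map a performance predictor (percent chance of meeting the
    benchmark) onto the claim-23 labels.

    The default cutoffs are hypothetical; claim 23 names only the
    categories, not their boundaries.
    """
    if percent_chance >= on_target_cutoff:
        return "On Target"
    if percent_chance >= some_risk_cutoff:
        return "Some Risk"
    return "High Risk"
```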
    • Patent References Cited:
      5059127 October 1991 Lewis et al.
      6144838 November 2000 Sheehan
      6322366 November 2001 Bergan et al.
      7440725 October 2008 Roussos
      7828552 November 2010 Shute et al.
      2004/0063085 April 2004 Ivanir et al.
      2006/0166174 July 2006 Rowe et al.
      2007/0009871 January 2007 Tidwell-Scheuring et al.
      2008/0227077 September 2008 Thrall et al.
      2008/0254433 October 2008 Woolf et al.
      2009/0035733 February 2009 Meitar et al.
      2009/0162827 June 2009 Benson et al.
    • Other References:
      Wikipedia, “Report Card: Progress Reports”, Web Archive, 2007. cited by examiner
      Florida Assessments for Instruction in Reading, Florida Center for Reading Research, 2009, 2 pages. cited by applicant
      Torgesen, J.K., “A Comprehensive K-3 Reading Assessment Plan”, Center on Instruction, 2006, 24 pages. cited by applicant
      Florida Center for Reading Research, “Florida Assessments for Instruction in Reading” Technical Manual, 2009-2010 Edition, Grades 3-12, 78 pages. cited by applicant
      Florida Center for Reading Research, “Florida Assessments for Instruction in Reading” Technical Manual, 2009-2010 Edition, Grades Kindergarten-Grade 2, 105 pages. cited by applicant
    • Assistant Examiner:
      Hong, Thomas
    • Primary Examiner:
      Yao, Sam
    • Accession Number:
      edspgr.09299266