Predictive Methodology
The Predictive methodology can be used with any assessment for which students have sufficient prior testing. This prior testing, or set of predictors, can include assessments in the same subject or in different subjects.
This model generates a predicted score for each student. A student's entering achievement reflects their achievement before the current school year, that is, when they entered the grade and subject or Keystone content area.
A predicted score is the score the student would likely earn on the selected assessment if the student made average, or typical, growth. To generate each student's predicted score, we build a robust statistical model of all students who took the selected assessment in the most recent year. The model includes the scores of all students in the reference group, along with their testing histories across years, grades, and subjects.
By considering how all other students performed on the assessment in relation to their testing histories, the model calculates a predicted score for each student based on their individual testing history.
To ensure precision in the predicted scores, for most subjects, a student must have at least three prior assessment scores. This does not mean three years of scores or three scores in the same subject, but simply three prior scores on state assessments across grades and subjects. There is one exception. To generate a predicted score for fourth-grade science, only two prior scores are required: third-grade math and ELA.
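The eligibility rule above can be sketched as a simple check. This is an illustrative sketch only; the function name, the score-record format, and the sample data are invented for this example and are not part of PVAAS.

```python
# Hypothetical sketch of the prior-score requirement described above.
# Each score record is a (grade, subject, score) tuple from a student's
# state testing history.

def has_sufficient_priors(prior_scores, current_grade, current_subject):
    """Return True if a predicted score could be generated.

    Most subjects require at least three prior scores, across any
    grades and subjects. Fourth-grade science is the exception: it
    requires only third-grade math and ELA.
    """
    if current_grade == 4 and current_subject == "science":
        required = {(3, "math"), (3, "ela")}
        return required <= {(g, s) for g, s, _ in prior_scores}
    return len(prior_scores) >= 3

# A student with only grade-3 math and ELA scores qualifies for
# grade-4 science but not for subjects needing three priors:
priors = [(3, "math", 1520), (3, "ela", 1480)]
print(has_sufficient_priors(priors, 4, "science"))  # True
print(has_sufficient_priors(priors, 5, "math"))     # False
```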
Let's consider an example. Zachary is a high-achieving student who has scored well on state assessments for the past few years, especially in math. To predict Zachary's score on the Keystone assessment, we:
- Determine the relationships between the testing histories of all students and their exiting achievement on this assessment in the same year.
- Use these relationships to determine what the expected score would be for Zachary, given his own personal testing history.
Based on Zachary's testing history, a score at the 83rd percentile would be a reasonable expectation for him.
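The two steps above can be sketched as a regression: learn the relationship between prior scores and current scores across the reference group, then apply it to one student's own history. This is a minimal illustration using ordinary least squares on invented data; the actual PVAAS model is far more robust.

```python
# Minimal sketch of the two-step prediction, assuming a simple linear
# model and invented data. Not the actual PVAAS model.
import numpy as np

# Step 1: learn the relationship between testing histories and current
# scores from the reference group (all students who took the assessment).
rng = np.random.default_rng(0)
prior_scores = rng.normal(1500, 100, size=(500, 3))   # 3 prior scores each
true_w = np.array([0.4, 0.3, 0.3])                    # unknown in practice
current = prior_scores @ true_w + rng.normal(0, 20, 500)

X = np.column_stack([np.ones(len(prior_scores)), prior_scores])  # intercept
coef, *_ = np.linalg.lstsq(X, current, rcond=None)

# Step 2: apply that relationship to one student's own testing history.
zachary_priors = np.array([1650.0, 1700.0, 1680.0])
predicted = coef[0] + zachary_priors @ coef[1:]
print(round(float(predicted), 1))
```

The fitted relationship, not the student's membership in any group, drives the prediction: two students with the same testing history receive the same predicted score.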
In contrast, Adam is a low-achieving student who has struggled in math. His prior scores on state assessments are low. Just as with Zachary, we use the relationships between the testing histories of all students and their exiting achievement on the assessment statewide to determine a predicted score for Adam. Based on Adam's own personal testing history, a score at the 26th percentile would be a reasonable expectation for him.
Once a predicted score has been generated for each student in the group, the predicted scores are averaged. Because this average predicted score is based on the students' prior test scores, it represents the entering achievement in this subject for the group of students.
Next, we compare the students' exiting achievement on the assessment to their entering achievement. If a group of students scores what they were predicted to score, on average, we can say that the group made average, or typical, growth. In other words, their growth was similar to the growth of students at the same achievement level across the reference group. This is the definition of meeting the growth standard in the predictive methodology.
If a group of students scores significantly higher than predicted, we can conclude that the group made more growth than their peers across the reference group. If a group scores significantly lower than predicted, the group did not grow as much as their peers.
The growth measure is a function of the difference between the students' entering achievement and their exiting achievement. This value is expressed in scale score points and indicates how much higher or lower the group scored, on average, compared to what they were expected to score given their individual testing histories. For example, a growth measure of 9.3 indicates that, on average, this group of students scored 9.3 scale score points higher than expected. When generating growth measures for Teacher Value-Added reports, students are weighted for each teacher based on the proportion of instructional responsibility claimed during roster verification.
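The growth measure described above can be sketched as a (weighted) average of the differences between actual and predicted scores. The function name and the student data below are invented for illustration; the weights stand in for the instructional-responsibility shares claimed during roster verification.

```python
# Hedged sketch of the growth-measure arithmetic described above.
# Scores and weights are invented.

def growth_measure(predicted, actual, weights=None):
    """Average of (actual - predicted), in scale score points.

    weights, when given, reflect the proportion of instructional
    responsibility claimed for each student during roster verification.
    """
    if weights is None:
        weights = [1.0] * len(predicted)
    total = sum(w * (a - p) for w, a, p in zip(weights, actual, predicted))
    return total / sum(weights)

predicted = [1500.0, 1540.0, 1480.0]
actual    = [1512.0, 1549.0, 1487.0]

# Unweighted: the group scored 9.3 points higher than expected on average.
print(round(growth_measure(predicted, actual), 1))                    # 9.3

# Weighted by claimed instructional responsibility:
weights = [1.0, 0.5, 0.5]
print(round(growth_measure(predicted, actual, weights), 1))           # 10.0
```

A positive value means the group, on average, exceeded its predicted scores; a negative value means it fell short of them.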