Vantage Learning’s Content and Assessment Services Division provides a full suite of testing services to our customers covering all stages of the test development process. Vantage Learning has a team of item and test development specialists, production specialists, editors, constructed response scoring specialists, and psychometricians to provide our clients with the finest in testing services.
Vantage Learning integrates the best in technology with unsurpassed quality of service to produce a completely customized solution that meets each client’s specific needs, while maintaining the integrity required by the Standards for Educational and Psychological Testing.
Services include the following:
Item and Content Development
- Development of item bank specifications to meet the needs of the assessment program
- Facilitation of Item Writing Workshops to train subject matter experts (SMEs) to write items according to specifications
- Item development by a core team of item writers on staff
- Item review—online and in-person item review meetings
- Item production for comparable online and offline delivery
- Item alignment to state and district standards
- Customized item authoring and banking tools
- Development of activities and lessons linked to state standards
- Creation of hints, suggestions, and feedback related to each item
- Development of artwork, audio, and other media linked to assessment questions and learning activities
Test Development
- Development of test specifications so that tests are aligned to the knowledge and skills the state has identified as important for all students
- Development of tests mapped to the test specifications available for online or offline delivery and reporting
- Development of equated forms
- Production of computer adaptive tests
- Customized test development tools
- Creation and production of test administration manuals for test proctors and teachers
- Creation of user manuals and test preparation materials
- Development of practice tests and tutorials
IntelliMetric® Constructed Response Scoring Services
Vantage’s IntelliMetric® scoring services include the following:
- Vantage’s Expert Scoring Center—independent IntelliMetric® hand scoring of essays by teachers and other professionals
- Short-answer essay scoring across a variety of subject areas and grades
- Extended response essay scoring across a variety of subject areas and grades
- Development of IntelliMetric® models for use in automated essay scoring
- Development of custom rubrics for use in the evaluation of constructed response items
- Selection of anchor documents and materials for scorer training sessions
- Facilitation of rater training sessions
- Evaluation of rater accuracy
- Prompt development
- Customized tools for scoring and rater evaluation
Innovation—Research and Development
- Development and evaluation of new item types for offline and online delivery
- Creation of simulated complex test environments
- Creation and validation of automated scoring systems for custom items
For more information about our testing services, please contact us.
ACER uses the IntelliMetric® automated essay scoring (AES) system from Vantage Learning to score the OWA essays automatically.
The IntelliMetric® system is 'trained' with a set of scripts and their scores, known as the 'training set'.
The 'training set' for the OWA consisted of writing scripts drawn from a representative sample population. Expert ACER human markers, trained specifically on each essay type, provided the scores and marking guidelines to accompany the scripts.
For each OWA essay topic (writing prompt), the IntelliMetric® system was trained to apply scores in the same way as a human marker, using a 'training set' of over 300 high-quality scripts per prompt.
Training the IntelliMetric® system works inductively: the system uses the human scores to 'learn' the marking rubric and the way the human markers have applied it. As more scripts from the training set are analysed, the system accumulates relationships between the scores and the features of the scripts, progressively 'building' itself from the material in the training set.
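In greatly simplified terms, this kind of inductive training amounts to fitting a model that maps measurable features of a script to the scores human markers assigned. IntelliMetric®'s actual features and model are proprietary and not described here; the single word-count feature and the least-squares linear fit below are purely illustrative assumptions.

```python
# Illustrative sketch only: IntelliMetric's real feature set and model are
# proprietary. Here we "train" on (script, human_score) pairs by fitting
# score = a * feature + b with ordinary least squares.
from statistics import mean


def extract_features(script: str) -> float:
    # Hypothetical single feature: essay length in words.
    return float(len(script.split()))


def train(training_set):
    """Fit a one-feature linear model to the human-scored training set."""
    xs = [extract_features(script) for script, _ in training_set]
    ys = [float(human_score) for _, human_score in training_set]
    mx, my = mean(xs), mean(ys)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    b = my - a * mx
    return a, b


def score(model, script: str) -> float:
    """Apply the trained model to an unseen script."""
    a, b = model
    return a * extract_features(script) + b
```

A real AES system would use hundreds of linguistic features and a far richer model, but the shape of the process is the same: the model's parameters are induced entirely from the human-scored training set.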
The reliability of automated essay scoring
Extensive research has been undertaken into computer-based marking of candidate writing, which has been shown to be as consistent as, if not more consistent than, traditional hand scoring. See the article 'An Overview of Automated Scoring of Essays', recently published in the Journal of Technology, Learning and Assessment.
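Consistency between automated and human scoring is commonly quantified with rater-agreement statistics such as quadratic weighted kappa. The self-contained sketch below shows one standard way to compute that statistic for two sets of integer scores; it is a general illustration, not a method taken from the article above.

```python
# Quadratic weighted kappa: 1.0 means perfect agreement between two raters,
# 0.0 means agreement no better than chance, negative values mean worse.
from collections import Counter


def quadratic_weighted_kappa(rater_a, rater_b, min_score, max_score):
    """Agreement between two equal-length lists of integer scores."""
    n = len(rater_a)
    k = max_score - min_score + 1
    observed = Counter(zip(rater_a, rater_b))   # joint score-pair counts
    hist_a = Counter(rater_a)                   # marginal counts, rater A
    hist_b = Counter(rater_b)                   # marginal counts, rater B
    num = 0.0
    den = 0.0
    for i in range(min_score, max_score + 1):
        for j in range(min_score, max_score + 1):
            weight = (i - j) ** 2 / (k - 1) ** 2
            num += weight * observed[(i, j)]
            # Expected count for (i, j) if the raters were independent.
            den += weight * hist_a[i] * hist_b[j] / n
    return 1.0 - num / den
```

In AES validation studies, this statistic (along with exact- and adjacent-agreement rates) is typically computed both between two human markers and between the automated system and a human marker, so the two levels of consistency can be compared directly.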