OWL Test Management System Case Studies

 

Case Study | Customer Service Hiring


The OWL Test Management System was the best choice for Rubrica. The flexibility and range of options within OWL’s system not only accommodated our original test design but also inspired us to think beyond standard testing approaches; this has ultimately resulted in testing products that are even more robust and unique than we had initially conceptualized.
— Jennie Norviel, Rubrica Founder
 

Rubrica Testing

Rubrica is a distinguished provider of high-volume, pre-employment tests for customer service providers, including companies with internal call centers and business process outsourcers. Rubrica was established by a team of language professionals, each with decades of specialized expertise in language acquisition and assessment. Rubrica prides itself on its ability to identify the highest-quality candidates by evaluating professional communication proficiency across language, empathy, and critical thinking skills. Their human-rated exams go beyond other employment assessments by including personalized feedback that documents performance and supports better hiring and training decisions. Rubrica's products are ideal for any organization that wants to control hiring and training costs and to distinguish itself by offering a superior customer service experience.

Background:

Rubrica was searching for a test management system on which to build and deliver their communication assessments for hiring contact center agents. Rubrica’s products include the following types of assessments: a written test for chat agents (Scriva), a verbal test for voice agents (Vocia), and a general pre-hire interview (Intreva). The company required an online examination provider with a robust and flexible set of features that would permit them to evaluate a range of complex communicative skills through the administration of a series of simulated customer service scenarios. Additionally, they needed a system that would help them control the costs of human rating, as this is a key differentiating factor of their product offering.

Results:

Using the OWL Test Management System, Rubrica's test developers built out each assessment in their product suite in under a month. Within the first year, Rubrica onboarded multiple clients who are employing a variety of its communicative assessment products to improve their hiring processes and control costs.

Specifically, Rubrica's clients have reported the following benefits:

  • Increased New Hire Success Rates - One customer found that 30% of the candidates they would have hired before using Rubrica's exams were actually unsuitable for hire. 20% of candidates were screened out by Rubrica's automatically scored screening test; an additional 10% were eliminated through a more comprehensive evaluation by human raters.

  • Human Evaluation of Language Proficiency, Empathy, and Critical Thinking - Because Rubrica offers an affordable human-rated alternative to AI evaluations, clients report more reliable results and less need for post-hire reassessments and costly manual corrective actions.

  • Reduced Training Costs - The detailed scores and personalized feedback provided for each candidate have been used to place agents in more appropriate roles from the start and reduce time spent on training. One new customer estimated savings of more than $100,000 in training costs that would otherwise have been wasted.

Rubrica Requirements & OWL TMS Solutions:

Provide Results Superior to AI Assessments

Confident in the critical importance of offering human-rated evaluations, Rubrica's team required an efficient and affordable way to manage their rating process. They wanted their raters to score responses within the system using their proprietary rubric, allowing the company to evaluate not only language proficiency but also empathy and critical thinking skills.

 

OWL Rater Module

From response distribution to blind rating to rater evaluation and management, OWL Testing Software has integrated and automated the entire human rating cycle. When a response is submitted, raters can be automatically notified to log in and complete their evaluation within the web-based OWL Rater Module. Additionally, OWL offers many features to manage primary and secondary raters and to automate the rating process according to each business's unique assessment workflow.


Control Costs with Automated Screenings

One way Rubrica controls the cost of human rating is by rating only higher-quality applicants, reducing the total number of ratings performed. Rubrica's exam flow includes an initial screener that eliminates the need to test under-qualified applicants. Candidates who score above a set performance threshold are then delivered the more challenging, human-rated exam. For example:

  • A multiple-choice listening exam is administered to qualify candidates for the speaking exam (Vocia).

  • An auto-scored reading assessment serves as the screener for the writing exam (Scriva).

 

OWL Auto-scoring and Assignment Linking

Many online exam systems offer automatic scoring; what OWL offers users is a way to make the most of that automation. Assignments can be configured so that all potential candidates are delivered an initial auto-scored evaluation. In Rubrica's case, these are reading and listening exams composed of multiple-choice questions. With OWL Assignment Linking, additional testing is delivered only to those candidates who meet a minimum level of performance on those exams. This means OWL users can save money and provide a superior level of human rating by focusing on only those candidates who demonstrate an appropriate level of capability.
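To make the linking idea concrete, here is a minimal sketch of the kind of threshold gating that assignment linking implies. All names and the 70% cutoff are hypothetical illustrations, not OWL's actual API or configuration:

    from dataclasses import dataclass, field

    # Illustrative sketch only: hypothetical names and threshold, not OWL's actual API.
    SCREENER_PASS_THRESHOLD = 0.70  # hypothetical minimum score on the auto-scored screener

    @dataclass
    class Candidate:
        name: str
        assignments: list = field(default_factory=list)

    def deliver_linked_assignment(candidate, screener_score, linked_exam):
        """Assign the human-rated exam only when the screener score clears the threshold."""
        if screener_score >= SCREENER_PASS_THRESHOLD:
            candidate.assignments.append(linked_exam)  # qualifies for the human-rated exam
            return True
        return False  # screened out: no human rating is spent on this candidate

    applicant = Candidate("A. Jones")
    deliver_linked_assignment(applicant, screener_score=0.82, linked_exam="Scriva writing exam")

The design point is simply that the expensive, human-rated step runs only after the cheap, automated step has filtered the pool.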


Build Activities to Reproduce Customer Experiences

Rubrica has customized its test suite to evaluate the specific needs of the customer service industry. They do this by incorporating multiple types of online exam questions and media prompts to simulate complex customer experience scenarios. Activities include such expected job tasks as chat responses, in which candidates read a series of chat exchanges and type their reply, and phone call inquiries, in which candidates listen to a customer call and respond verbally with a solution.

 

OWL Test Builder

OWL offers an extensive array of question types and features to customize online testing. Essay, Translation, Audio, and Video response items are all used by our assessment providers who need to assess a candidate's actual communication capabilities, rather than simply their ability to take an exam. By uploading proprietary images, video, and audio files, OWL users develop assessments that are then delivered through the flexible OWL Test Management System. They are not forced to tweak their assessments to fit into a more restrictive online exam delivery option.


Add Value with Individualized Performance Feedback

Rubrica offers clients individualized candidate feedback and documentation to communicate training needs, increase hiring success, and improve account assignments. They therefore required an online test management system that could automatically generate PDFs reporting both automated and hand-selected information about each candidate's performance.

 

OWL Certificates

With the automatic certificates feature, OWL users upload customized PDF files and attach them to their online assignments. This can be configured so that, upon finalization of a rating, a personalized certificate or document is produced. These files can include OWL Test Management System variables that insert rater feedback and performance information specific to each examinee.
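As a rough illustration of variable substitution, the sketch below fills placeholders in a feedback document with per-examinee values. The placeholder names and the use of a Python template are assumptions for illustration; they do not reflect OWL's actual variable syntax or PDF pipeline:

    from string import Template

    # Illustrative only: hypothetical placeholder names, not OWL's actual variable syntax.
    certificate = Template(
        "Candidate: $candidate_name\n"
        "Overall score: $overall_score\n"
        "Rater feedback: $rater_feedback"
    )

    print(certificate.substitute(
        candidate_name="A. Jones",
        overall_score="4.5 / 5.0",
        rater_feedback="Clear, empathetic replies; minor grammar slips under time pressure.",
    ))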


Offer Practice Tests

Rubrica wanted to help their clients find more qualified candidates and control costs even further by offering interested applicants a practice exam. In this way, candidates are given the opportunity to become familiar with both the platform and the tasks on the test prior to taking their official pre-hire evaluation.

 

OWL Practice Assignments

Rubrica employed the “Practice” feature of OWL Assignments to accomplish this aspect of their product offering. OWL's practice test feature allows the test taker to complete their auto-scored online exam and receive immediate performance feedback. The test taker can log in repeatedly and retake the exam to reevaluate their capabilities.


Case Study | Certification Board


In rolling out our test, we wanted to make sure that its delivery was consistent and was not subject to any variation. In OWL, we found the perfect solution. A solution that offers all candidates the opportunity to take the same test – no matter where or when it is taken.
— Martin J. Conroy, Senior Manager

National Board of Certification for Medical Interpreters

The National Board of Certification for Medical Interpreters (NBCMI) provides a professional credential to medical interpreters throughout the United States. The NBCMI allows interpreters to take one certification exam which can then serve as validation of their interpreting skills for any healthcare facility to which they apply across state lines. The Board strives to improve care and enhance the quality of interpretation in the healthcare industry by assuring that all interpreters in a clinical setting have the requisite skills.

Background: NBCMI contracted a psychometric services firm to help them create an assessment for certifying a medical interpreter's ability to effectively communicate between Spanish and English. The resulting exam consists of both written and oral components. The oral portion is a consecutive translation test that alternates questions between Spanish and English to replicate clinical scenarios. It is evaluated with a scoring rubric that was constructed exclusively for this exam. The board was looking for an efficient way to deliver this exam using one consistent platform at multiple testing centers across the United States.

Results: NBCMI found OWL's test management system to provide an ideal vehicle to deliver their certification assessment. In less than one year, the board has delivered more than 300 interpreter assessments through various testing sites throughout the United States. Although the current focus of the board is Spanish, their goal is to expand to include additional languages such as Cantonese, Korean, Mandarin, Russian, and Vietnamese. OWL's localized user interface and virtual keyboards will help to facilitate and enhance this expansion.

 

NBCMI Online Certification Requirements:


Smooth Transition from Paper & Tapes

NBCMI wanted to ensure that the transition away from paper and tape assessments went smoothly while maintaining a close adherence to testing standards.


Control the Cost of Evaluating Exams

The Board wanted to eliminate the costly and cumbersome process of disseminating candidates' completed tapes and test booklets for evaluation by the Board's raters.


Efficient & Standardized Exam Delivery

NBCMI wanted to deliver their certification exam at various test centers throughout the U.S. They needed a way to manage their assessment process that was cost-effective and efficient, as well as consistent and standardized.

 

OWL TMS Solutions:


OWL's Test Conversion Services

The OWL conversion team coordinated closely with the test's developer to create an interface that maintained the assessment's integrity and professional standards.


OWL's Integrated Rating Module

NBCMI raters can easily access the OWL TMS to rate tests. Now raters can listen to responses and score exams from any Internet-enabled computer.


OWL's Web-Based Test Management

With OWL's web-based system, proctors no longer need to be trained on the intricacies of delivering the oral certification exam. This makes the OWL solution cost-effective and ensures a consistent assessment at any test location.

Case Study | Global Business


We chose OWL because of its impressive versatility as well as its straightforward functionality for the test user. The OWL team was top-notch in listening to our needs, assisting us with individualized content build, and providing excellent training to our team ahead of go-live.
— Angela Kelly, Manager of Training

himagine Solutions

himagine solutions is a leading provider of healthcare outsourcing with a focus on Health Information Management (HIM) and related services. himagine has the largest team of HIM professionals in the U.S., serving healthcare providers that include short- and long-term acute care facilities, community-based hospitals, physician practices, and outpatient facilities. Their professional HIM outsourcing services include an industry-leading managed coding solution along with auditing, registry, and clinical documentation improvement.

Background: himagine was challenged with finding a versatile platform that would provide an efficient and standardized way to manage pre-employment testing for thousands of candidates. Each potential candidate applying for employment with himagine is required to participate in a two-part pre-employment screening process: a multiple-choice assessment and a phone interview, both completed prior to onboarding. In addition to pre-employment testing, himagine was looking for a solution to validate ongoing skill development of its current workforce in preparation for the government-mandated ICD-10 code implementation.

Results: Use of the OWL Test Management System has created efficiencies within existing business processes, allowing himagine to dramatically increase its candidate pool. Within a year of implementing the OWL Test Management System, himagine administered more than 2,000 pre-employment assessments to candidates looking to join the organization. In addition, himagine was able to use OWL to validate skill mastery for over 1,000 employees ahead of the government-mandated ICD-10 code implementation. For future use, himagine has identified additional OWL features, such as digitally captured oral responses, that will ultimately streamline the phone interview process.


 

himagine Online Exam Requirements:

Seamless Transition

himagine was looking for a speedy and smooth implementation of the OWL TMS, with little impact on recruiting, candidate vetting, or client onboarding.


Customizable Test Building

Due to the highly specialized content of the pre-employment assessment, as well as the vast number of candidates to be vetted for employment, a must-have for himagine was the flexibility to develop a customized online test question bank. The company also wanted features to ensure the integrity of their online assessments, which are delivered in a proctor-free environment.


Administrative Control and Automation

himagine was challenged to find a test scoring and feedback solution that worked within its internal recruiting processes. Specifically, they wanted automatic scoring, automated response data distribution, version control, and the highest level of administrator access and functionality.


Government Mandated Reporting

Soon after implementation of the OWL TMS, himagine had an immediate need to validate ongoing skill development for its existing workforce in preparation for the government-mandated ICD-10 code implementation. This included detailed compliance reporting mandates.


Candidate Driven Process

Because of the competitive nature of the staffing services industry, it was important to himagine that the pre-employment process be user-friendly. They were looking for a comprehensive test management system that could both improve internal efficiencies and facilitate, rather than hinder, the recruiting process.

 

OWL TMS Solutions:

OWL Implementation Team

OWL partnered closely with stakeholders and test developers at himagine to ensure that there was minimal disruption to business processes during site development and implementation.


OWL's Robust Test Builder

With OWL, himagine created multiple pre-employment assessments in the specialty areas of inpatient medicine, outpatient medicine, emergency room/diagnostic medicine, and physician services; each assessment had very different testing requirements. himagine employed available customization features, including answer shuffling and test question randomization.


OWL TMS Event Triggered Notifications

OWL listened to these needs and developed an event-triggered automatic messaging report that delivers detailed candidate response data via email to all pertinent departments within himagine. OWL's system administrator settings allow the himagine compliance team to control the process.
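The general shape of such event-triggered messaging can be sketched as follows. Event names, recipient addresses, and the message format are all hypothetical; this illustrates the pattern of pushing response data to subscribed departments when a test event fires, not OWL's implementation:

    # Hypothetical sketch: event names, addresses, and message format are illustrative only.
    RECIPIENTS_BY_EVENT = {
        "test_finalized": ["recruiting@example.com", "compliance@example.com"],
    }

    def on_event(event, candidate_name, response_summary):
        """Compose one notification per subscribed department when an event fires."""
        return [
            f"To: {recipient}\nSubject: {event}: {candidate_name}\n\n{response_summary}"
            for recipient in RECIPIENTS_BY_EVENT.get(event, [])
        ]

    for message in on_event("test_finalized", "A. Jones", "Score: 87% (multiple-choice assessment)"):
        print(message)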


OWL Data Warehouse and Reporting

himagine used OWL to create skills assessments to validate that their existing workforce was sufficiently prepared for the ICD-10 code implementation. Rich reporting available within the OWL Test Management System made it easy for himagine to pull the necessary data to manage compliance.


Automated Account Registration

With OWL's web-based system, himagine is able to invite candidates to create a user account and automatically register for their assessment. Using OWL, the candidate drives the test-taking process instead of the recruiting team. This provides flexibility for the test taker as well as efficiency in himagine's business processes.




Case Study | Higher Education


Unlike the online activities provided by the students’ workbooks, OWL allows students to create a portfolio of verbal recordings. They can practice their verbal skills and even receive recorded oral feedback from the instructor. They can also listen to their previous recordings in order to practice their speech as well as monitor their own progress. Ultimately, we are creating superior language learners and, in particular, better second language speakers.
— Gerry Milligan, Chair of the Modern Languages Department

CUNY College of Staten Island

The College of Staten Island is a senior college of The City University of New York (CSI/CUNY) offering Doctoral programs, Advanced Certificate programs, and Master’s programs, as well as Bachelor’s and Associate’s degrees. The College is accredited by the Middle States Commission on Higher Education.

Background: The Modern Languages Media Center at the College of Staten Island (CSI) provides a variety of aids for the learning and teaching of foreign languages. It was important to the world language faculty to have a versatile platform allowing for the creation of media-rich oral examinations in which students could provide an oral response. In addition, it was imperative that the faculty have the ability to grade the oral examinations and provide oral feedback directly to the student within the same platform.

Results: Efficiencies gained by installing OWL in the CSI Media Center include simpler software training and proctoring, a streamlined oral exam process for students, swift administration of oral proficiency exams, and improved program performance monitoring. More than 1,200 students at CSI are using OWL across four different languages, and the college performs over 3,000 oral proficiency exams each semester.


CUNY Online Assessment Requirements:

Fully Customizable Exams

A must-have for CSI CUNY was the flexibility to customize oral exams that simulate real-life situations across multiple languages/proficiencies.


Built-in Assessment Function

It was important to CSI CUNY to have the ability to rate oral assessments within the same system the students use to record their oral responses.


Student Feedback Vehicle

CSI CUNY wanted to provide direct feedback to students on their oral assessments. They wanted a tool for students to practice speaking and directly monitor their progress.

 

OWL TMS Solutions:

OWL Online Test Builder

OWL allows instructors to easily create content that integrates multimedia into questions, to time students' responses, and to randomize questions in various ways.


OWL's Rating Module

Test creators can apply grading rubrics and/or point systems to different exam sections. Faculty can create and rate oral exams on and off campus.


Multiple Feedback Modes

OWL offers many paths for providing student feedback. French professors at CSI/CUNY appreciate the ability to record feedback within the OWL TMS directly on the examinee’s response.

Case Study | Higher Education


With the push for accountability across the country, we knew that sooner or later, the university was going to ask us to measure the communicative abilities of all foreign language students by the end of the second year of language study. We wanted to be ready with the answers.
— Dr. Fernando Rubio, University of Utah, Foreign Language Department Chair

The University of Utah

The University of Utah Assesses Student Oral Proficiency at Home and Abroad

Background: Dr. Fernando Rubio and his fellow educators at the University of Utah decided to plan ahead. Like many universities, the University of Utah requires four semesters of foreign language for students who wish to receive a bachelor of arts degree. “With the push for accountability across the country, we knew that sooner or later, the university was going to ask us to measure the communicative abilities of all foreign language students by the end of the second year of language study,” says Dr. Rubio, Chair of the university’s foreign language department. “We wanted to be ready with the answers.”

In addition to the imminent demands by the university administration, some of the foreign language faculty were questioning the school’s policy regarding students who participated in the summer study-abroad programs. These students, who spent five weeks in the summer in one of eight foreign countries, were able to eliminate two semesters of study in the target language. The faculty were concerned: could a mere five weeks of immersion adequately replace an entire year’s worth of course work on campus?

Under Dr. Rubio's leadership, the foreign language department developed assessments to measure the communicative abilities of its students. At first, they relied on audio cassette tapes to conduct the SOPI-type tests, but even when all of the students and raters were on campus, this tended to be an administrative nightmare. The assessment tapes of the study-abroad students were even more troublesome, and the university needed a new solution. In an effort to use technology to their advantage, Dr. Rubio tried a widely used online course management system, but it didn't meet the department's needs. Ideally, students would be able to take the assessment wherever they were in the world, and likewise, the raters could rate the speech samples from any computer, whether in Utah, Germany, or Japan. The software would also need to allow them to test languages that do not use Roman alphabets, since the university offers Arabic, Japanese, and Chinese as well as Spanish, French, Italian, and German.


Testing

Dr. Rubio found the solution in OWL Testing Software, which is specifically designed to simplify oral proficiency assessment. Unlike the previous applications they had tried, OWL gave the language department the features and control it needed to manage both the on-campus and distance assessment program. Today, all foreign language students take a test at the end of the second semester on campus. Those students who participate in the study-abroad program also take the test at the end of the five weeks abroad, while they are still on location. No special software needs to be installed on the computers; each student uses a web browser to access the test from virtually anywhere in the world, no matter which language the student is learning.


Research

In addition to testing student proficiency levels, part of Dr. Rubio's research involves extensive analysis of student pronunciation. Using students' tests prior to their trip abroad, he might isolate certain sounds that challenge a student. For instance, he might measure vowel sounds to determine if they are pure vowels or diphthongs. The digital audio recordings not only allow him to listen to the speech samples more easily but also let him see visual representations of the sounds using spectrograms.

Until this year, Dr. Rubio’s research has been used internally at the University of Utah, though he has presented some of his findings at conferences. By the end of summer 2008, however, Dr. Rubio expects to have enough data for publication and looks forward to sharing the success of the university’s proficiency assessment program with other institutions and researchers.

 

Rating

Even though all of the raters are based in Utah, many of them travel extensively, including Dr. Rubio himself. In the past, one rater might be available in Utah, but the second rater would be abroad, and the whole rating process would be held up. Now, instead of having to wait until a program director hauled cassettes back to campus and both raters were in town, raters can access the tests via their web browser. “Arabic raters in Utah can access the results the day after the students take the test in Alexandria. Now, we can have results much more quickly than we could before,” says Dr. Rubio.


Results

Every year, the department receives special funding from the university to conduct the assessments, giving the university a true measurement of program success. Another advantage of assessing oral proficiency, says Dr. Rubio, is that the aggregated data provides good quality control of the study-abroad programs. “If we see that out of the eight programs, six are getting good results, but two are not, we recognize that there may be problems with the two programs and can make adjustments,” he notes.

And as to the question about whether five weeks abroad can truly take the place of an entire year's course work on campus? "After these four years, we've learned that students who go abroad learn as much or more than students who take language on campus. They experience a significant gain in proficiency between pre-test and post-test. They speak and write more and better," says Dr. Rubio. He notes that the results are difficult to generalize. Because the students who travel abroad might be considered a select sample, they may not be entirely representative; students who go abroad tend to be more motivated and must achieve a certain G.P.A. But the students who travel abroad tend to show greater progress even as compared to the best students who stay on campus.

Case Study | K-12


We have experienced a 12 percent increase in proficiency levels, and expect to build on this success each year.
— Marsha Plotkin-Goleman, Curriculum Supervisor World Languages

Pittsburgh Public Schools

Pittsburgh Public Schools (PPS) is the largest of 43 school districts in Allegheny County and the second largest in Pennsylvania. PPS serves approximately 26,000 students in Kindergarten through Grade 12 in 66 schools. About 34% of the district's overall student population is enrolled in a foreign language program.

Background: Understanding the importance of oral proficiency development in language learning, Pittsburgh Public Schools' world language administrators sought a simple yet sophisticated way to test students' speaking and listening skills. The district was looking for an online test management system that would enable them to collect and rate speech samples from more than 8,000 students across 66 schools.

Results: Pittsburgh Public Schools uses OWL to create, deliver, assess, and report on "high stakes" SOPI-like testing at 2-3 year intervals. PPS follows the basic format of both the ACTFL OPI and CAL's SOPI: warm-up, level checks, probes at a higher level, and a wind-down. By using OWL Testing Software, the district administers more than 1,700 high-stakes oral proficiency assessments each year, as well as 2,000 reading and listening exams this year alone. Not only does OWL Testing Software streamline what had been a difficult process, but the district can also easily gather valuable data to determine the effectiveness of its curricula, teaching methods, and rubrics.

Practice Activities for Language Students - These activities are used to build students' vocabulary in a way that resembles a game. PPS used a competition between classes and schools to generate a high volume of data on the effectiveness of computerized testing as a learning tool. For more on this subject, see the blog post on the district-wide PPS World Language Competition.

Teacher Candidate Assessments - These tests are used to demonstrate teacher candidate proficiency as part of the hiring screening process. The teacher candidate tests consist of a primary section, which is a SOPI-like test, followed by second and third sections of speaking and writing questions about general knowledge of the language and culture.


PPS Oral Exam Requirements:

Create/Share Large Item Bank

PPS wanted both to standardize assessments and to enhance instructor efficiency. The district's goal was to build an item bank large enough to allow instructors to create new tests easily, with just a few clicks.


Tool for Low-Stakes Testing

PPS wanted a test management system that could be responsive and used every day in the classroom for quizzes and regular in-class achievement and proficiency testing.


Pre- and Post-Achievement

These tests are primarily multiple choice and are given in several grades to demonstrate student progression from the beginning of the year to the end. PPS sought an efficient way to deliver and report on these exams.


High-Stakes Testing

PPS wanted to create, deliver, assess, and report on "high stakes" SOPI-like testing at 2-3 year intervals. They wished to follow the oral proficiency interview format of both the ACTFL OPI and CAL's SOPI.


Structured Rating System

PPS required the test management system to have an integrated rating module so that assessment of oral response activities could be performed in an automated and structured way. They wanted ratings to occur within the system using standardized rubrics.


Many Languages & Levels

The system had to be flexible and user-friendly enough to be used by students in all grades of study. The district wished to create tests for varying proficiency levels across seven different languages.


Vehicle for Student Feedback

In addition to testing, the district wanted to provide students with direct feedback on their oral responses. They were looking for a tool that would give students a way to directly monitor their progress and provide information for improvement, and that could capture and maintain student response data over the long term.

 

OWL TMS Solutions:

OWL Online Test Builder

OWL's online test builder allows instructors to easily incorporate images and audio into their assessment and practice activities. Department heads can create and share activities to manage testing within and across languages.


OWL Instructor Permissions

Teachers are able to create their own OWL activities to respond to specific student learning needs and view a gradebook report of ongoing student progress throughout the semester.


Response Data Warehouse

OWL not only makes it easy to deliver and auto-score multiple choice exams, but also has a built-in reporting feature which allows these pre- and post-test results to be easily compared within and across languages.


Flexible Test Builder

Using the many features for creating and presenting oral response items, PPS was able to create an online assessment following the format of warm-up, level checks, probes at a higher level, and wind-down.


Integrated Rating Module

Custom or industry rubrics can be uploaded to evaluate responses within OWL. The OWL TMS can automatically manage the distribution of blind assessment requests. Users designate the number of assessment requests, the primary and secondary raters, and the method of finalization.
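A minimal sketch of how blind double-rating with finalization might look, using hypothetical names and a simple averaging rule as the finalization method (OWL's actual finalization options are not shown):

    from statistics import mean

    # Hypothetical sketch of blind double-rating; names and the averaging rule are illustrative.
    def finalize_rating(response_id, rater_scores, required_raters=2):
        """Finalize once the required number of independent (blind) scores is in."""
        if len(rater_scores) < required_raters:
            return None  # still waiting on a primary or secondary rater
        return mean(rater_scores)  # one simple finalization method among many

    print(finalize_rating("resp-001", [3.5]))        # -> None (only the primary rater has scored)
    print(finalize_rating("resp-001", [3.5, 4.0]))   # -> 3.75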


Target Language Audio

The OWL test taker interface, combined with the target language audio feature, was the perfect fit. Users create a single assessment that is delivered to students in each of the languages of study.


OWL Results Publishing

By publishing results in OWL, PPS students can log back in to read or listen to specific assessment feedback left directly on their individual response. OWL also offers a way to create electronic student portfolios. This is a powerful way for students to hear their progress over the course of their entire academic career.

Case Study | Government


In researching potential software products, we found OWL to offer exceptional flexibility and functionality with respect to creating customized questions and tests, and supporting multiple language requirements.
— Paul D. Swinwood, President

Information and Communications Technology Council

The Information and Communications Technology Council (ICTC) is a not-for-profit national centre of expertise for the digital economy. Through trusted research, innovative talent solutions, and practical policy advice, ICTC fosters innovative and globally competitive Canadian industries empowered by a talented and diverse digital workforce.

Background: By creating industry and occupation-specific assessments, the Information and Communications Technology Council (ICTC) of Canada helps individuals develop their communication skills and evaluates their readiness to assimilate into Canada’s multilingual workplace. The ICTC was challenged to find a test management system in which workplace communication scenarios could be easily replicated in multiple languages, incorporating various types of multimedia. Also of extreme importance, the ICTC was searching for a company that would partner with them in order to customize a user interface that would integrate into an existing platform of online tools.

Results: Employers have identified Workplace Communications and Language as a key challenge to the integration of internationally educated professionals (IEPs) into the workplace. Working with OWL, ICTC has developed a set of tools that help IEPs:

  • Understand the language requirements for five occupations in the Canadian ICT sector (Business Analyst, IS Manager, Programmer, Project Manager, and Web Developer)

  • Assess their Canadian Language Benchmark level in English and/or French

  • Overcome language and workplace communication challenges


ICTC Assessment Requirements:

Customized Assessment Items

The Council wanted to create test items specific to the communication and language needs of the Information and Communications Technology industry. Furthermore, they wanted to create assessment programs specific to ICT occupations.


Consistent User Interface

ICTC was looking for more than a pre-packaged system. They were searching for a company that would work with them to customize a user interface for their language assessment tool that would integrate into ICTC's existing platform of online tools.


Immediate Dynamic Feedback

ICTC wanted to give candidates immediate information about their Canadian Language Benchmark level in English and/or French. They also wished to recommend steps for improvement. Their ultimate goal was to help new workers overcome language and workplace communication challenges.

 

OWL TMS Solutions:

Media-Rich Test Builder

With OWL, ICTC easily created industry- and occupation-specific assessments to replicate workplace communication scenarios in multiple languages. OWL users can incorporate audio, video, images and text in their online activities.


OWL Customized Solutions

The council contracted OWL to create a customized user interface that reflects the appearance of ICTC's other Workshops Online tools through the development of a custom skin. This consistent user interface enables the council to deliver a seamless experience.


Flight Path Feature

Using OWL's auto-rated items combined with the OWL Flight Path, ICTC provides the test taker with immediate feedback and next steps related to their language capabilities, including links to recommended workshops. These recommendations are based upon each individual's score on a specific test.
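As an illustration of score-based next steps, the sketch below maps score bands to recommended workshops. The band boundaries and workshop names are hypothetical, not ICTC's or OWL's actual Flight Path configuration:

    # Hypothetical score bands and workshop recommendations, for illustration only.
    FLIGHT_PATH = [
        (0.80, "Advanced workplace writing workshop"),
        (0.50, "Intermediate business communication workshop"),
        (0.00, "Foundational language benchmarks workshop"),
    ]

    def recommend_next_step(score):
        """Return the first workshop whose score band the candidate falls into."""
        for threshold, workshop in FLIGHT_PATH:
            if score >= threshold:
                return workshop

    print(recommend_next_step(0.64))  # -> "Intermediate business communication workshop"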