by Clive Shepherd
One thing that computers do really well - and with much less effort than human beings - is to run a test; a certain type of test that is, using highly-structured question formats for which answer judging, scoring and feedback can be readily automated. Because tests are not difficult to put together and deploy online - at least not at a superficial level - practically everyone does it. But what purpose do these tests really serve and do they provide us with the information we need about what students have learned? In this article, Clive Shepherd explores the opportunities provided by online assessment and, just as importantly, examines the limitations.
Upping the stakes
Life beyond multiple choice
Appendix: Five varieties of test item
Computers are good at testing learning. They never show bias, they can add up perfectly and they just love to provide you with charts and reports to tell you exactly who has done what to whom, when they did it and where. What's more, they can run authoring software that makes it a relatively simple form-filling exercise to create new tests that can be delivered securely and consistently across the globe, with not a traditional exam paper in sight. Online assessment looks easy and, superficially, it is. However, there's more to making a success of online assessment than buying a software package and typing in a set of questions. A great deal of skill and judgment is required to ensure that what you're testing bears any relationship to your real learning objectives and that online testing is used selectively, often alongside other assessment methods, both online and offline.
Simon Hoyle heads up EBC, a Manchester-based developer of bespoke e-learning solutions: "It's important to test the impact of learning, but most assessments really only check for knowledge, not skills. You need a follow-on process to look at what people do differently back on the job." Henry Stewart, MD of e-learning content provider LearnFish, agrees: "The key question is whether tests that use multiple-choice questions or similar are an effective measure of skills transfer. I don't believe they are. In teaching software applications, the real test is to have learners carry out tasks on a live document and email them to an e-tutor. This is a genuine test of skills transfer and has the advantage of increasing human contact, which we know increases the effectiveness of e-learning."
Is this attack on the usefulness of online tests fair? Well, only up to a point. There are many situations where knowledge is what you want to be testing, because the transfer of knowledge is the point of the exercise. A large proportion of today's workforce is made up of knowledge workers - people for whom the knowledge of concepts, principles, rules and procedures is absolutely essential. These are the raw material from which sensible judgments are made and actions initiated. Online assessments may not be able to assess all forms of knowledge adequately, but they can make a major contribution.
To further the case for online assessment, it is also fair to point out that many forms of computerised testing can in fact test for skills. These days, very many of the skills we need are cognitive - they do not involve any manual activity on the part of the performer. And cognitive skills are important - skills like problem-solving, decision-making, analysis, synthesis and evaluation. Not all cognitive skills can be thoroughly or even adequately tested using tests delivered by computer, but enough can to make the method useful. And that's not to mention such basic skills as arithmetic, typing and spelling, where the computer can actually generate test items automatically and to precisely fit the learner's profile.
Lastly, as even Stewart has to admit, tests - or 'quizzes' as they are often labelled to improve their user-friendliness - "are very effective motivational tools. Learners love completing them".
One of the great advantages of e-learning is its modularity - the learning can be divided up into easily manageable 'learning objects' and then deployed according to a learner's individual requirements. You could leave a learner to make their own selections from the objects available, but that depends on the learner's awareness of their own requirements - they may have a misguided impression of their starting level of skill or knowledge. Another option is to use an online assessment to diagnose the learning required and, as a result, point the learner in the right direction.
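The diagnostic routing described above can be sketched very simply. This is an illustrative example only - the topic names and mastery threshold are assumptions, not taken from any product mentioned in this article:

```python
def recommend_modules(diagnostic_scores, mastery=0.7):
    """Given per-topic scores from a pre-assessment (0.0 to 1.0),
    return the learning objects the learner still needs, in the
    order the topics were tested. The 0.7 mastery threshold is an
    arbitrary illustrative choice."""
    return [topic for topic, score in diagnostic_scores.items()
            if score < mastery]
```

In practice the returned topic list would be mapped onto specific classroom or online modules, rather than shown to the learner raw.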
Huthwaite International is a leader in sales training, with a world-renowned methodology - SPIN - and a successful programme of classroom courses. With time pressures in sales becoming ever tighter, Huthwaite has introduced a new online solution that cuts training time from 5 days to 2. Says Hoyle, whose company EBC developed the solution: "One area where online assessment really delivers value is in the way it can be used right up front to personalise the trainee's experience. The pre-assessment allows trainees to concentrate on the pre-work that they really need, so they are properly prepared for the 2-day workshop."
Stephen Citron is MD of Informatology, a company specialising in a novel form of pre-assessment. "Working in the area of Microsoft Office applications, we start with the premise that most learners don't know what they don't know. The applications have lots of power, but learners are simply unaware of the possibilities. Our solution is to show the learner a series of short - 5 to 15 second - animations, demonstrating just what each application can do. For Excel alone we have 72 animations. If a piece of functionality is of interest to a learner, they can indicate this and update their profile. This profile can then be used to point the learner towards suitable classroom or online modules." The savings from such an approach could be substantial, as Informatology customer Julian Hancock, Training Officer at the Court Service, points out: "It is evident that between one-third and two-thirds of training arranged in the traditional manner is wasted."
Toshiba is another company that is using pre-assessment to streamline its course offerings. "Instead of our three day on-site training courses, we will be able to offer our partners a more modular approach that will check a number of pre-conditions electronically before booking candidates onto specific courses. And, with most of the training done beforehand, using online tests created using Question Mark Perception, the courses can be shortened to a single day. This will reduce costs for all parties and enables us to carry out a wider range of training courses than ever before," comments John Hawes, National Business Development Manager Electronic Imaging Division for Toshiba.
Upping the stakes
There are times when an assessment means more than a little gentle encouragement that you're moving in the right direction. If you need a pass to obtain an academic or professional qualification that could transform your career, then the stakes are considerably higher. In these circumstances, is online assessment really an option? According to Hoyle, "there are limitations in any remote assessment. You can always cheat if you really want to." Even with the most advanced biometric technology, utilising fingerprint, voice, face or eye recognition, there's nothing to stop you taking the exam with your extremely knowledgeable friend sitting alongside you to supply you with the answers. When the stakes get high, the exam simply has to be invigilated.
This does not mean that online assessment is an inappropriate tool for high-profile exams, just that the exams should not be taken remotely. If you have the facilities available to provide each student taking the exam with access to a networked PC, then you are still going to benefit from a number of advantages. Firstly, there are no paper copies of the exam to be printed, distributed, or left lying around for prying eyes to see. Secondly, papers can be scored almost instantly and results fed back to administrators and students. Also, to further reduce the risk of cheating, particularly where the same exam papers are used repeatedly, you can automatically mix the order of questions and of items within questions. It's also important to ensure that the answers to the questions cannot be deduced by a streetwise student who takes a look at the HTML source - that means encoding the answers or, better still, making scoring a task that's carried out at the server end.
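The two precautions above - shuffling the paper and keeping the answer key on the server - might be sketched as follows. This is a minimal illustration, not the behaviour of any package named in this article; the question data and function names are hypothetical:

```python
import random

# Hypothetical question bank. The "correct" field never leaves the
# server; the browser only ever receives shuffled prompt/option text.
QUESTIONS = [
    {"prompt": "Which of these is a mammal?",
     "options": ["Whale", "Shark", "Crocodile", "Ostrich"],
     "correct": "Whale"},
    {"prompt": "In what year was the Battle of Hastings?",
     "options": ["1066", "1266", "1466", "1666"],
     "correct": "1066"},
]

def build_paper(questions, seed=None):
    """Shuffle the order of questions, and of options within each
    question, so repeated sittings see different papers."""
    rng = random.Random(seed)
    paper = []
    for q in rng.sample(questions, k=len(questions)):
        paper.append({"prompt": q["prompt"],
                      "options": rng.sample(q["options"],
                                            k=len(q["options"]))})
    return paper  # note: no answer key in what is sent out

def score(questions, answers):
    """Mark submitted answers (prompt -> chosen option) server-side."""
    key = {q["prompt"]: q["correct"] for q in questions}
    return sum(1 for prompt, choice in answers.items()
               if key.get(prompt) == choice)
```

Because scoring happens against the server-held key, viewing the page source reveals nothing about which option is right.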
Life beyond multiple choice
The most critical quality of an assessment is its ability to accurately test for the achievement of a learning objective. Assuming you're starting with specific, observable objectives - and that's a big assumption - this means selecting the right kind of test item and then carefully crafting the item to provide the right level of difficulty and a freedom from ambiguity. The creation of test items is not a trivial task and requires real attention to detail. Critically, you must test your test with typical learners and be prepared to make changes to ensure the test is both valid and reliable.
What happens if you cannot find a suitable test item to test for a particular learning objective? None of the options available really measures the knowledge, skill or attitude change that you're after. What you absolutely must not do is to revise your learning objective to make it more easily assessable (don't laugh - it happens all the time), or make do with an inaccurate test. The fact is you do have choices, many of them still within the online domain.
An example is the American Language Program from Columbia Interactive Arts and Sciences. The computer cannot automatically check the quality of your English (or American in this case), but a human can. For each assignment, students are asked to read documents or listen to simulated voicemails or meetings, write a document in response, review their work against a checklist and then submit the document to an online tutor for marking. A similar process is followed on Columbia's courses for programmers, where this time code is submitted online for assessment.
PricewaterhouseCoopers use a game, developed by EBC, to help clients assess their ability to fill in complex P11D tax forms. As situations arise, users must resolve the issues correctly or risk fines from the Inland Revenue (you can take realism too far!). EBC developed another creative assessment tool for Huthwaite: realising that sales managers do not always devote as much time to coaching as they should, they devised a self-assessment checklist by which salespeople can reflect on a real sales call they have just made and diagnose what went right and wrong.
Online assessment is just another angle on e-learning and, as such, can be too easily treated as a panacea and, just as easily, written off as a poor substitute for traditional methods. The reality is that computers provide their very own blend of advantages and disadvantages that need to be reviewed carefully against all the other options. Chances are that online assessment, in all its forms, has a place somewhere in your training or educational mix.
Appendix: Five varieties of test item
There are essentially just five types of question that you will encounter regularly in online assessments - selecting, supplying, ordering / ranking, matching and locating - although each has many variants.
With selecting questions, the user simply picks from a range of options. These are the most common questions and the simplest to develop. There are many varieties: at the simplest level, the user makes a binary decision - yes or no, true or false. With the ubiquitous multiple-choice question, the user is offered a wider range of alternatives - generally 3-6 - from which they make a single selection. A variant on this is to allow the user to make multiple selections from a list; sometimes a fixed number of selections is asked for, at other times it can be variable.
Options in selecting questions do not need to be presented as a text list: users may be asked to select from a number of images, to click on a part of an image; to drag selected items into a target area; they may even be asked to stop an audio or video sequence at a point where they recognise a particular event or situation is occurring. Here are some examples of selecting questions:
Are you sure you want to exit - yes or no?
Which of the following is an example of a mammal?
Tick those items on the list that correspond to how you feel about the course right now.
Click on the shapes in this picture that are flowchart symbols.
Stop the video when you see an example of good customer service.
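Judging a selecting question is the easiest case to automate. A minimal sketch (illustrative only - the all-or-nothing policy for multiple selections is one possible choice, not a universal rule):

```python
def score_single_select(correct, chosen):
    """Binary and multiple-choice: one option chosen, right or wrong."""
    return 1 if chosen == correct else 0

def score_multi_select(correct, chosen):
    """Multiple selections: full credit only for exactly the right
    set of ticks, so ticking everything scores nothing."""
    return 1 if set(chosen) == set(correct) else 0
```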
With supplying questions, the user supplies an answer by typing into a text box. There are several variants: the user may be asked to answer in words or numbers; the user's answer may or may not be constrained in length or form; the user may be asked to supply a single answer or a series of answers. Many test designers shy away from this type of question because of the computer's limited ability to interpret freely entered text, but the format can work well if the answers required are tightly constrained and some tolerance is allowed in terms of spelling or use of synonyms. It may be worth persisting, because only supplying questions test for recall from memory, whereas selecting questions test for recognition of correct answers from options displayed. Here are some examples of supplying questions:
Who is currently the President of Russia?
In what year was the Battle of Hastings?
List the five types of question.
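The tolerance for spelling and synonyms mentioned above can be approximated with a string-similarity measure. A hedged sketch using Python's standard library - the 0.85 threshold is an arbitrary illustrative value, and real answer judging is usually more sophisticated:

```python
from difflib import SequenceMatcher

def judge_supplied(answer, accepted, tolerance=0.85):
    """Accept a typed answer if it is close enough to any accepted
    form (which can include synonyms), tolerating minor misspellings.
    SequenceMatcher.ratio() returns 1.0 for identical strings and
    falls towards 0.0 as they diverge."""
    answer = answer.strip().lower()
    return any(
        SequenceMatcher(None, answer, form.lower()).ratio() >= tolerance
        for form in accepted
    )
```

So "Vladamir Putin" would still be accepted against the key "Vladimir Putin", while an unrelated answer would not.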
The purpose of this type of question is to have the user place a number of items in sequence, whether this is their logical order (ordering questions) or their order of importance (ranking questions). The user could make their selections by typing in numbers or letters, by picking from drop-down lists or by dragging-and-dropping the items into order. Here are some examples of ordering/ranking questions:
Place these steps in installing a hard disk drive in their correct order.
Place this list of monarchs in the order in which they reigned.
Place these potential hazards in order of importance.
Rank these brands of coffee in order of preference.
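Scoring an ordering question is equally mechanical. One illustrative policy (a point per correctly placed item - stricter all-or-nothing schemes are just as easy to automate):

```python
def score_ordering(correct_order, submitted_order):
    """One point for each item the learner placed in its correct
    position in the sequence."""
    return sum(1 for want, got in zip(correct_order, submitted_order)
               if want == got)
```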
With matching questions, the user identifies matched pairs from two lists or groups. The options may be represented textually or graphically. As with ordering and ranking questions above, a wide variety of input devices could be used. Here are some examples of matching questions:
Match the programming languages on the left with their originators, listed on the right.
Match the capital cities with their countries.
Match the names of these film stars with their faces.
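A matching question needs one extra step at presentation time: the right-hand column should be shuffled so position gives nothing away. An illustrative sketch (the pair data is hypothetical):

```python
import random

def present_matching(pairs, seed=None):
    """Show the left-hand items in order but shuffle the right-hand
    column before sending it to the learner."""
    rng = random.Random(seed)
    lefts = [left for left, _ in pairs]
    rights = rng.sample([right for _, right in pairs], k=len(pairs))
    return lefts, rights

def score_matching(pairs, submitted):
    """One point per correctly matched pair (left item -> right item)."""
    key = dict(pairs)
    return sum(1 for left, right in submitted.items()
               if key.get(left) == right)
```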
With locating questions, the user identifies the location of a part on a photo, illustration or diagram. The user may have to locate a single item or a series of items. Here are some examples of locating questions:
Click on the CPU on this diagram of a PC motherboard.
Click on Rwanda on this map of Africa.
Click where the kidney is positioned on this photograph of a person.
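Judging a locating question typically reduces to a hit test: did the click land inside a target region defined on the image? A minimal sketch, assuming rectangular hit regions in pixel coordinates:

```python
def in_region(click, region):
    """A locating answer is correct when the click (x, y) falls
    inside the target rectangle (left, top, width, height) defined
    on the image. Real tools may also support circular or polygonal
    hot spots."""
    x, y = click
    left, top, width, height = region
    return left <= x <= left + width and top <= y <= top + height
```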
Dedicated assessment tools:
RapidExam from Xstream Software, Inc.
Web Test from AES (SmartForce)
i-assess from EQL
e-test from Riva
YnotAssess from YnotLearn
General purpose e-learning authoring tools:
Authorware 5, Dreamweaver with Coursebuilder extensions, both from Macromedia
Toolbook II from Click2Learn
DazzlerMax from MaxIT