TRIADS (1992-current)


TRIADS is a powerful e-Assessment authoring and delivery system complemented by an interactive results analysis and reporting program. The unique characteristic of TRIADS is that it allows very flexible, visual question styles with sophisticated controls on question behaviour and scoring. Assessments may be run in summative or in formative mode with context-sensitive feedback either as an immediate response to a user action or as a summary at the end of each question. TRIADS can support scenarios that require concurrent, multiple interaction types overlaid on graphics and has some functionality that is not available in any other system.

History of Development

The direct precursor to TRIADS was initially developed by Professor Don Mackenzie as the University of Derby (UK) Interactive Assessment Delivery System, or ‘DIADS’, and used to support computer-delivered assessment in Earth Sciences within the University. The system was used for some components of the course-ware developed by the UK Earth Sciences Course-ware Consortium project under the leadership of Dr Bill Sowerbutts at the University of Manchester, UK, funded as part of the national TLTP initiative. This application strongly influenced the highly visual, course-ware style of interaction that now underpins all elements of the system’s design.

Between 1994 and 1999 the system was further developed to provide assessments in a wider range of disciplines across the University, and its profile was boosted nationally when it was used to underpin the TRIADS – Assessment of Learning Outcomes project. This was a collaboration between the Universities of Liverpool (Professor Chris Paul, lead), Derby and the Open University, funded as part of the national FDTL initiative. The renaming of the system from DIADS to the Tripartite Interactive Assessment Delivery System (TRIADS) occurred at this time in recognition of the contribution made by the three Universities. The system was used by 16 disciplines across 32 academic departments in 19 universities as part of the project. The state of development at this stage is summarised in Mackenzie (1999), which contains examples of some of the question styles available at the time. (See one example question from the paper below.)


In 1999 the University of Derby set up the Centre for Interactive Assessment Development (CIAD) in response to demand from a wide range of academic disciplines. This was a dedicated e-assessment design consultancy, production, monitoring and results reporting department to support all academic staff across the University.

Between 1999 and 2005 the wide-ranging functionality provided by the system attracted interest from a number of external organisations, including pilot e-assessments for schools’ examination bodies (e.g. Maughan & Mackenzie, 2004) and for parts of the UK National Health Service.

In 2005, the external interest in both e-assessment and e-learning prompted a move of development staff from the University’s e-learning unit and the core TRIADS development team into a new commercial e-learning development department of the University (Innovation for Learning, IfL). Production of e-assessments for external organisations was then concentrated in the wider commercial arm of the University. Competency assurance and continuing professional development assessments were developed by the team for medical staff in a range of NHS hospitals.

From 2011 onwards, the TRIADSystem has been developed and maintained by Prof. Don Mackenzie at Professional e-Assessment Services, which continues to support the TRIADS e-assessment clients of the University and other clients, largely in the field of competency assessment of medical professionals up to Consultant level.


All versions of DIADS and TRIADS are open-coded in Authorware Professional (previously from Macromedia, now © Adobe Systems). Authorware was initially selected for its intuitive and rapid development environment for producing a wide range of interactions suitable for e-learning course-ware.

Over time and use in a wide range of applications, the limitations of the standard Authorware interaction controls became apparent and a number of interactions were re-written using a combination of the Authorware scripting language and JavaScript in order to give finer control on question behaviour, response checking and scoring.

TRIADS can deliver all the question types listed below individually, with a variety of controls on their behaviour and scoring. Additionally, TRIADS can deliver text/numeric entry, checkbox matrix, hotspot and draw-line interactions concurrently within a single question. This concurrent functionality could potentially be extended to all interaction types, although it is difficult to imagine a scenario that would require all of them at the moment.


All interactions can be layered and overlain on graphics.

Code can be added to all question types to account for specialist scoring requirements and the system may incorporate biological and petrological microscope simulators if these are required for specialist assessments using these tools. The way in which these tools are used to answer a question can also be scored, if required.

The question display engine can be configured in a wide variety of modes which can include controlled random question delivery, menu option links to different question sets and the optional delivery of context-sensitive feedback as an immediate response to a user action or at the end of each question. Feedback can be held internally or externally in .rtf files or as links to web pages.
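The “controlled random” delivery mode described above can be pictured with a short sketch (Python here purely for illustration; TRIADS itself is coded in Authorware, and the bank contents, topic names and quota parameter below are all hypothetical):

```python
import random

# Hypothetical question bank: question IDs grouped by topic.
BANK = {
    "stratigraphy":  ["S1", "S2", "S3", "S4"],
    "mineralogy":    ["M1", "M2", "M3"],
    "palaeontology": ["P1", "P2", "P3", "P4", "P5"],
}

def controlled_random_paper(bank, per_topic, seed=None):
    """Draw a fixed number of questions per topic at random, so each
    candidate sees a different but comparably balanced paper."""
    rng = random.Random(seed)
    paper = []
    for topic in sorted(bank):
        paper.extend(rng.sample(bank[topic], per_topic))
    rng.shuffle(paper)
    return paper

paper = controlled_random_paper(BANK, per_topic=2, seed=1)
print(len(paper))  # 6: two questions from each of the three topics
```

The point of the per-topic quota is that randomisation stays “controlled”: every candidate gets a different paper, but the topic balance is identical across candidates.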

Assessments may be browser-delivered for intranet, internet and LMS via a plug-in, or compiled as an executable for CD or LAN delivery.


Whilst standard e-assessments can be produced and delivered using TRIADS, it comes into its own when producing simulation and scenario-based assessments where interactions are overlain on graphics and require a high degree of control on their behaviour and scoring.

TRIADS assessments are easily integrated into Authorware-based e-learning packages.

TRIADS questions are ideal for binding seamlessly into an e-learning narrative to generate a dialogue between the user and the e-tutoring element whilst tracking the scores.

There are very few limitations on the complexity of the assessment or nature of the questions that can be delivered using TRIADS.

General Limitations

Because of its wide-ranging functionality, TRIADS is very much a tool for specialist Assessment Developers in commercial or university-based e-learning/assessment development departments.

It is not generally suitable for use by most individual tutors/assessment designers on an occasional basis. This was demonstrated in the TRIADS Assessment of Learning Outcomes project when usage of TRIADS fell significantly after project funding supporting developers in each institution ended.

TRIADS is open-coded in Authorware Professional (© Adobe Systems) and is thus vulnerable to commercial decisions made by Adobe. Authorware is still currently (2013) available from Adobe but is regarded as a ‘mature’ product and will not be developed further. Thus its longevity is in question. Re-development of a system as sophisticated as TRIADS using any other development tool would be very expensive, and currently there appears to be no rapid development environment that gives the flexibility of delivery mode and level of interactivity provided by Authorware in a single package.

Web delivery of software developed using Authorware is provided via the Authorware Web Player 2004 plug-in. This works well, once installed, but its installation requires ‘Administrator’ security rights on client machines together with a rather specific browser setup. This creates barriers for individual users and difficulties in distribution across some secure corporate/institutional networks.

TRIADS is developed for Windows platforms only.


TRIADS has an unparalleled and innovatory range of assessment styles and any applicant who seeks to patent any kind of question delivery mechanism would need to ensure that their application did not infringe upon or plagiarise the work already done by the TRIADS team.

The TRIADSystem is © University of Derby and Prof. Don Mackenzie



Some related references

Boyle, A.P., Bryon, D.N. & Paul, C.R.C. (1997) Computer-based learning and assessment: A palaeontological case study with outcomes and implications. Computers & Geosciences, Volume 23, Issue 5, June 1997, Pages 573-580.

Mackenzie, D.M. (1997) Computer Aided Assessment at the University of Derby. A case study in: Brown, G., Bull, J. and Pendlebury,M. Assessing Student Learning in Higher Education. Chapter 13 pp 215-217. Publ. Routledge, London.

Mackenzie, D.M. (1997) Fully Interactive Assessment on the Web – A Shock to the System. Proceedings of the First Annual Computer Assisted Assessment Conference. 18th. June at the University of Loughborough.

Mackenzie, D.M., Regan, P.F., Wilkins, H. & Hutchinson, P. (1998) Integrating Fully Interactive Assessments with Web Resources – A flexible learning environment for students and an easier life for academics? Geological Society of America Annual Conference, Toronto. Abstracts p. A-389.

Mackenzie, D.M. (1999) Recent Developments in the Tripartite Interactive Assessment Delivery System. Proceedings of the Third Annual Computer Assisted Assessment Conference. University of Loughborough. 16th-17th June

Mackenzie, D.M., Wilkins, H., O’Hare, D. and Boyle, A. (1999) Practical Implementation of Recent Developments in the Tripartite Interactive Assessment Delivery System. A Workshop for the Third Annual Computer Assisted Assessment Conference. University of Loughborough. 16th-17th June

Boyle, A.P., Edwards, D.J., Mackenzie, D.M., Mills, B., O’Hare, D., Morris, E.C., Paul, C.R.C., Wilkins, H. and Williams, D.W. (2000) Developments in on-line assessment – experiences and evaluation of using TRIADS and its potential for assessment in electrical engineering. International Journal of Electrical Engineering Education, vol 37, part 1, pp 74-85.

Mackenzie, D.M., O’Hare, D., Paul, C., Boyle, A., Edwards, D., Williams, D. & Wilkins, H. (2004) Assessment for Learning: the TRIADS Assessment of Learning Outcomes Project and the development of a pedagogically friendly computer based assessment system. In O’Hare, D. & Mackenzie, D.M. (Eds) Advances in Computer Aided Assessment, SEDA Paper 116, pp 11-24. Staff and Educational Development Association Ltd., Birmingham.

Maughan, S. & Mackenzie, D.M. (2004) BioScope: The Assessment of Process and Outcomes using the TRIADSystem. Proceedings of the 8th International CAA Conference, Loughborough, 6th & 7th July 2004. ISBN 0-9539572-3-3.

Mackenzie, D.M. (2005) Online Assessment: quality production and delivery for higher education. Keynote Address in Enhancing Practice, Assessment Workshop Series No. 5 in Reflections on Assessment, Volume II, pp 22-29. Quality Assurance Agency for Higher Education, Gloucester. ISBN 1 84482 266 4.

SToMP project (1992 onwards)

The SToMP (Software Teaching of Modular Physics) project was funded in the UK in 1992 to create a learning environment for some common first year physics courses. From the mid 90s, this included an assessment system that supported the delivery of the following question types:

  • List (radio button)
  • List (check boxes)
  • Rank ordering
  • Pair matching
  • Numeric
  • Text
  • Random numeric

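As a rough illustration of what marking these question types involves, here is a sketch in Python (SToMP’s real implementation was in Visual Basic, and these function names and marking rules are assumptions rather than SToMP’s own):

```python
# Illustrative scorers for the question types listed above.
# A "random numeric" question would generate its parameters, and
# therefore its correct answer, afresh at delivery time, then score
# the response with the same numeric check used here.

def score_numeric(response, answer, rel_tol=0.01):
    """Numeric entry: accept within a relative tolerance of the answer."""
    try:
        return abs(float(response) - answer) <= abs(answer) * rel_tol
    except ValueError:
        return False  # non-numeric input scores zero

def score_checkboxes(selected, correct):
    """Check-box list: full marks only for exactly the correct set."""
    return set(selected) == set(correct)

def score_ordering(response, correct):
    """Rank ordering: the whole sequence must match."""
    return list(response) == list(correct)

def score_pairs(response, correct):
    """Pair matching: each item must be matched to its partner."""
    return dict(response) == dict(correct)

print(score_numeric("9.81", 9.8))    # True (within 1%)
print(score_ordering("bac", "abc"))  # False
```

Real systems of the period typically also allowed partial credit (e.g. per correct checkbox), which is omitted here for brevity.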
Dick Bacon (r.bacon at …) has shared information about the project and given permission to share the documents below.

He advises that after about six months the project’s Open University representative, Steve Swithenby, pointed out that they really should have an assessment component, and during a six-week sabbatical in Sweden, Dick created a prototype in Visual Basic handling, he believes, single- and multiple-choice, pairing and ordering questions. Around 1994-95 the real version was produced; see details in the document below. Around 2002 this was replaced by a QTI v1.2 system that outlived the rest of the SToMP project. Dick retains the SToMP name for his current SToMPII QTIv2 system.

Here is a screenshot of the 1994-95 version. It shows a five-question test, the questions being in the formatted text viewer on the right. The response box has a tab for each question, and the user has just got the first question wrong and has the feedback displayed. It is a self-test, so results are not recorded and the student can try again as many times as they want.

SToMP screenshot

Some links on SToMP

And here are copies of the SToMP manuals with great detail on the functionality:

  • Development manual for the whole of the SToMP materials, dated December 1997, which gives context (see chapter 5 for the assessment system) [PDF] [Word]
  • A version of chapter 5 on assessments, dated October 1994 [PDF] [Word]

Thanks Dick for sharing this important 1990s example of assessment technology.

Question Mark Professional (1993)

Question Mark Professional manual cover

Question Mark Professional, also known as Question Mark for DOS version 3, was launched in 1993 and marketed worldwide (with minor enhancements over the years) during the 1990s. The software was gradually superseded by Question Mark for Windows and Question Mark Perception.

Question Mark Professional consisted of:

  • Question Mark for DOS
  • Graphics Companion
  • Toolkit
  • Euro Pack (test delivery software in Dutch, French, German, Italian and Spanish)

There were various add-ons to the software including a Multimedia Editor and a Report Generator. Here are a few paragraphs from the user manual:

Question Mark™ is a computer program, which you can use to create, give, mark and analyse objective tests on a computer. Using Question Mark, you enter questions (on any topic) into the computer; students answer your questions on-screen, and the computer can mark and analyse their answers.

Question Mark is suitable for use both by individual trainers and teachers, seeking to create tests for their students, and by courseware developers who want to create tests for distribution. Question Mark can also be used in any situation where you want to ask people questions and analyse their answers: questionnaires, opinion polls, recruitment tests, and many sorts of form-filling.

Question Mark is easy to learn, and you do not need to know about computers to use it. You should start using it productively after only a short learning period.

Using Question Mark, you can create tests with:

    • up to 500 questions (either all or some chosen randomly);
    • each question in one of 9 types – multiple choice (includes yes/no, true/false), numeric, fill in blanks, word answer, free format, matching/ranking, multiple response, logical and explanation;
    • a variety of ways to present the test including giving the student feedback on answers after each question, after the test or not at all, and a range of options including time limits, hints for wrong answers, and letting students pass over questions;
    • control over the way the screen looks when the test is delivered, including the capability to include your own graphics in questions;
    • flexible and intelligent computer marking methods, where you can define a variety of correct answers and scoring methods;
    • the ability to call up other tests to follow on from this test, with the test chosen depending on the student score.
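The last capability in the list, chaining to a follow-on test depending on the student’s score, can be sketched in a few lines (Python for illustration only; the band thresholds and file names are hypothetical, not Question Mark’s):

```python
# Hypothetical sketch of score-dependent test chaining: after a test
# is marked, the next test file is chosen from score bands.

def next_test(score, bands):
    """bands: (minimum score, test name) pairs, highest threshold first."""
    for minimum, test in bands:
        if score >= minimum:
            return test
    return None

BANDS = [(80, "advanced.qm"), (50, "intermediate.qm"), (0, "remedial.qm")]
print(next_test(72, BANDS))  # intermediate.qm
```

This kind of branching let an author build adaptive sequences (remedial, standard, advanced) out of otherwise independent tests.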

Students can answer the tests without needing to be familiar with the use of computers; and there are measures to prevent students from getting access to the questions and answers illegally.

After the questions have been answered, there are sophisticated but easy-to-use ways to review and analyse the answers. You can collate and analyse answers from different students and different tests, sending output reports to screen, printer or disk.

Evidence of its release and availability:

Evidence of its functionality:

7 early descriptions of remote proctoring (1997 – 2007)

Here are 7 interesting documents that I found that discuss remotely monitoring computer based assessments.

I am very appreciative of the opportunity to contribute to this blog.  My name is Matthew Jaeh and I am the VP of Operations for ProctorU Inc.

ProctorU is an online proctoring service that allows examinees to test anywhere while still ensuring exam security for the testing organization. Using webcams, screen-sharing technology and proven authentication techniques, the staff at ProctorU’s three dedicated centers provide live monitoring of test-takers taking their examinations from home, work, or anywhere. Examinees connect to their monitor via a one-on-one video session, and the monitor provides pre-exam assistance and technical support at no additional charge. The monitor maintains security during the session by monitoring audio, video and the candidate’s screen throughout the entire examination.

Question Mark for Web (1995-96)

QM Web Manual cover

Question Mark for Web (QM Web) is believed to be the world’s first commercial web testing and surveying product. QM Web version 1 was released in October 1995. Version 2 of QM Web was released in September 1996.

To quote contemporary descriptive language: “Using QM Web, you create your questions and load them onto a Web or Intranet server. People anywhere in the world can then answer the questions using just a Web or Intranet browser (e.g. Netscape Navigator or Internet Explorer). Their answers can be sent back to you for analysis, or else marked on the server with immediate results.  If you need to gather information from people and analyze the results, using the Web or an Intranet, then QM Web is the answer for you. Typical uses of QM Web include:

  • Distance education. If you are presenting courses at a distance, you can add assessment – finding out what people know, checking what they have learnt, or offering a formal end of course exam to validate performance.
  • Self-assessment teaching material. Add interactivity to your training and educational material. Create quizzes that people can take to find out how much they know, or as practice tests prior to important exams.
  • Checking employee competence. Ask questions to prove your sales staff understand the products they are selling or that your employees know safety regulations.
  • Recruitment skill tests. Get a prospective employee to take a test to give you objective information on their capabilities, or gather information on the Web prior to selecting people for interview.
  • Surveys and questionnaires. Ask employee attitude surveys, get feedback from trainees on the quality of courses or ask your customers or people accessing your site what they think of your service.

QM Web gives you instant assessment, instant learning and instant results, anywhere in the world.”

QM Web took questions and assessments created in Question Mark Designer for Windows (a Windows assessment management system) and converted them into HTML for use on the web. A server-side program then allowed scoring, feedback and results reporting.
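The conversion step can be pictured with a small sketch (Python, purely illustrative; this is not QM Web’s actual converter, and the output markup and form layout below are assumptions):

```python
# Illustrative: rendering a multiple-choice question definition as an
# HTML radio-button form whose submission a server-side script could
# then score.

def question_to_html(qid, stem, choices):
    parts = ['<form action="/score" method="post">', f"<p>{stem}</p>"]
    for i, choice in enumerate(choices):
        parts.append(
            f'<input type="radio" name="{qid}" value="{i}"> {choice}<br>'
        )
    parts.append('<input type="submit" value="Submit"></form>')
    return "\n".join(parts)

html = question_to_html("q1", "Which is the largest planet?",
                        ["Mars", "Jupiter", "Venus"])
print(html.count('type="radio"'))  # 3, one radio button per choice
```

Because the browser only ever receives plain HTML, no client-side plug-in is needed, which is what made web-wide delivery practical in 1995-96.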

Evidence of its release and availability:

Evidence of its functionality:

QM Web functionality grew from its original release in 1995 to fuller capability in 1996. Manuals on this site have been uploaded with the permission of the copyright holder.

Feel free to ask me any questions about QM Web.

John Kleeman, May 29 2012

Question Mark Guardian (1995)

Questionmark Network Guardian manual cover

Question Mark Guardian (also called Network Guardian) was an add-on companion to Question Mark Designer for Windows which allowed you to control who had access to exams, tests, surveys and quizzes – including the ability to:

  • Maintain user records for testing
  • Limit access to particular users
  • Limit access to groups of users
  • Generate unique passwords
  • Provide time-windows for tests
  • Limit the number of times a test can be taken
  • Ensure tests are taken in sequence
  • Import user names and passwords

According to contemporary documentation, Question Mark Guardian helps you resolve your security problems. With Guardian you can:

  • Give users or groups of users access to certain question files, and prevent others from accessing the same questions;
  • Prevent people from giving a false name when they answer questions, by assigning a password to each user;
  • Stop users from being able to take a test more than once;
  • Stop people from running a test before it is released or after it has expired;
  • Ensure that users cannot attempt an advanced test without passing a simpler one first;
  • Keep a record of who does what and what their score is, thus preventing unscrupulous users from attempting a test and trying to hide this from you;
  • Assign users to groups within Question Mark Guardian so that you can look at results on a group basis.
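Taken together, these rules amount to a gatekeeping check run before a test is delivered. A minimal sketch, assuming a simple record layout (the field names are illustrative, not Guardian’s own):

```python
from datetime import datetime

def may_take_test(user, test, attempts_so_far, now):
    """Return True only if every access rule is satisfied."""
    if test["name"] not in user["allowed_tests"]:
        return False  # access limited to named users/groups
    if not (test["opens"] <= now <= test["closes"]):
        return False  # outside the test's time-window
    if attempts_so_far >= test["max_attempts"]:
        return False  # all permitted attempts already used
    prerequisite = test.get("prerequisite")
    if prerequisite and prerequisite not in user["passed"]:
        return False  # the simpler test must be passed first
    return True

user = {"allowed_tests": {"final"}, "passed": {"mock"}}
test = {"name": "final", "opens": datetime(1995, 6, 1),
        "closes": datetime(1995, 6, 30), "max_attempts": 1,
        "prerequisite": "mock"}
print(may_take_test(user, test, 0, datetime(1995, 6, 15)))  # True
```

Logging which user ran which test, and their score, then gives the audit trail that prevents a candidate hiding an attempt.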

Evidence of its release and availability:

Evidence of its functionality:

The first version of Network Guardian was called version 2.1. The user manual was printed in May 1995, ISBN 1 872089 13 5. The user manual has been uploaded to this site with the permission of the copyright holder.

Feel free to post any questions on Question Mark (Network) Guardian, and I will try to answer.

John Kleeman, May 25, 2012

Question Mark Perception v2 (1999)

Question Mark Perception Server v2 manual cover

Question Mark Perception version 2 was released in July 1999 and was a significant milestone in this assessment management system, building on the release of version 1 and adding security, reporting, multimedia and integration amongst other capabilities.

The main function of the software was to enable users to create their own tests, surveys, questionnaires and assessments for use via the web or an intranet.

Key capabilities:

  • Author questions and organize in topics
  • Select questions (including at random) and put in assessments
  • Control access to assessments securely, with a functional and innovative security model (see chapter 4 of the server manual for a description)
  • Deliver over the web
  • Support of a secure browser
  • Integrate with Learning Management and other systems
  • Report on the results

Evidence of its release and availability:

Evidence of its functionality:

Feel free to post any questions on Perception version 2 and I will try to answer.

John Kleeman, May 23 2012