Integrating Performance Items in Test Environments


Surprisingly, while there are technology challenges associated with implementing performance test items in computer-based test (CBT) environments, many of the key considerations are administrative and planning-related.


For this article, I will work from the premise that a third-party vendor is delivering your tests in a CBT environment. Your goal is to integrate performance test items, using simulations or emulations, into your existing tests, which are composed of item types such as multiple-choice, drag-and-drop and create-a-tree. See the December 2004 Design & Develop column if you need a refresher on the various types of test items.


What does it take to successfully integrate and deliver performance test items in a CBT environment? With a high level of cooperation, effective planning, realistic expectations and perseverance, performance test items can be integrated into CBT without a lot of pain. Depending upon the test delivery platform and the technology used to author test items, the integration effort can be complex. Let's look at some considerations you should address prior to embarking on this endeavor.


While not all considerations are associated with technology, technology is a prime consideration in this effort. It can be distilled down to the following factors:



  • The technology that is used to author the performance test items. (There is a comprehensive set of criteria for selecting this technology.)
  • The technology used to deliver the performance test items in the CBT delivery environment.
  • The technology required to integrate the performance test authoring drivers into the CBT delivery environment. (A hypothetical sketch of this kind of integration interface follows this list.)
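
To make that third factor more concrete, here is a minimal, hypothetical sketch of the kind of contract a delivery engine might expose so that simulation-based performance items and conventional items can be sequenced in the same exam. The names used here (ItemDriver, ItemResult, SimulationDriver and so on) are my own illustrative assumptions, not the API of any particular authoring tool or delivery platform.

```typescript
// Hypothetical item-driver contract; names and shapes are illustrative only
// and do not reflect any specific CBT delivery platform or authoring tool.

interface ItemResult {
  itemId: string;
  score: number;          // points earned, as reported by the item driver
  maxScore: number;       // maximum points possible for the item
  responseData: string;   // serialized candidate response (e.g., JSON)
}

interface ItemDriver {
  // Load the item content produced by the authoring tool.
  load(itemId: string, contentUri: string): Promise<void>;
  // Present the item and collect the candidate's response.
  run(): Promise<ItemResult>;
  // Release any resources (windows, processes, network sessions).
  unload(): Promise<void>;
}

// A conventional selected-response item and a simulation-based performance
// item both satisfy the same contract, so the delivery engine can sequence
// them in a single exam without special-case logic.
class MultipleChoiceDriver implements ItemDriver {
  async load(itemId: string, contentUri: string): Promise<void> { /* fetch and parse the item */ }
  async run(): Promise<ItemResult> {
    return { itemId: "mc-001", score: 1, maxScore: 1, responseData: '{"selected":"B"}' };
  }
  async unload(): Promise<void> { /* nothing to clean up */ }
}

class SimulationDriver implements ItemDriver {
  async load(itemId: string, contentUri: string): Promise<void> { /* start the simulation runtime */ }
  async run(): Promise<ItemResult> {
    // A performance item typically scores several observable actions.
    return { itemId: "sim-001", score: 3, maxScore: 5, responseData: '{"actionsCompleted":3}' };
  }
  async unload(): Promise<void> { /* shut down the simulation runtime */ }
}
```

In practice, the integration work described in the third bullet often amounts to building and testing an adapter such as the SimulationDriver above against the delivery vendor's actual interfaces, which is where much of the effort and cost tends to land.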


Now add the following questions to these considerations:



  • Has the technology that you are using been previously integrated into the target CBT environment?
  • If you are working with a test delivery provider, will they allow you to use any integration work that has already been done?


While the technology issues can be complex, in many cases they can be worked out. As is often the case in the software industry, time and money will help solve technology problems. Unfortunately, not everyone has endless resources.


I have found that the more challenging considerations are not technology-related. They involve people, processes and communication. I recommend that you do your homework prior to making a commitment, asking and answering the following questions:



  • Do you have the in-house expertise to do the development and testing?
  • Does the test delivery provider have the expertise?
  • Who will pay for the integration and testing effort, and how much will it cost?
  • Will non-disclosure agreements (NDAs) or contracts be required to protect intellectual property? (These can be very time consuming to draft, obtain agreement and get signed.)
  • How many people, and which people, need to be involved?
  • If you are working with a test delivery provider, will you be competing for resources with other clients? This could potentially delay your project.


This is not an exhaustive list of questions, but it provides a good sense of what must be addressed. It does not take much to get bogged down in administrative and legal issues before any technical work starts. My recommendation is to allow plenty of time to work through the planning before committing to a delivery date.


There are numerous success stories about organizations that have integrated performance test items into CBT environments. The time frames I'm aware of span one to two years from conception to implementation. Efforts like these are still relatively new in performance testing, so there is no average cost, time frame or established best practice yet. Consequently, exercising diligence in the planning process will help mitigate potential challenges and contribute to the overall success of the integration effort.


The Performance Testing Council is working on collecting data and documenting best practices associated with integrating performance testing into CBT environments. The premise is that the continued move toward performance testing across multiple industries will drive demand for this kind of integration. See the Council's website for more information.


James A. DiIanni is the director of assessment and certification exam development at Microsoft Learning and supports the Microsoft Certified Professional program. His experience with performance testing started in 1986 developing simulators for the U.S. Navy, and he has been involved in the IT certification industry since 1997. He can be reached at




