Migrating Manual Performance Tests
The recent Association of Test Publishers (ATP) conference in Scottsdale, Ariz., was enlightening with respect to the future direction of migrating manual performance tests to computer-based test environments. Participation in the Performance Testing Council (PTC) has afforded me the opportunity to see excellent presentations from professionals across numerous industries on efforts to migrate manually delivered performance tests to computer-based test environments.
As a strong advocate of performance testing, I’m encouraged by the trend toward moving manually delivered performance tests to computer-based testing environments. The trend is clearly validated by the growing number of simulation item types in computer-based testing, the evolving thinking around performance testing and the variety of delivery methods now available.
The bottom line is that there is an increasing demand within testing, certification and licensure to validate skills and competencies by “doing” instead of “memorizing.”
What are the primary issues associated with migrating manually delivered performance tests to computer-based test environments? As with any topic of this magnitude and complexity, there is endless research to be done and volumes of text to be written. To get you started, here are four questions that should be asked before moving a manually delivered performance test to a computer-based testing environment.
Why is the manually delivered test being migrated to a computer-based environment?
I’m not being facetious when I start with this question. If you are beyond this point, you have already done your homework with respect to return on investment as a result of scalability, geographic availability, associated costs and benefits, value to the testing candidate, etc. I only pose the question to help ground us in the commitment that is required to make this effort happen. Historically, performance tests have demanded varying levels of commitment in both time and cost.
Is your job task analysis (JTA) current?
When is the last time you updated your JTA? Maintaining test validity is extremely important, regardless of how the test is delivered. Even in computer-based test environments, we must ensure that what is being tested is relevant to the job. Job tasks change over time, and while these changes vary across professions, they must be taken into consideration. For example, in the information technology world, new product releases often make tasks easier to perform or eliminate tasks altogether.
Does your exam design, development, delivery and maintenance infrastructure support computer-based testing?
Don’t underestimate the infrastructure changes that may be required to support computer-based testing environments. One example is item banking, where there is a need to store and manage content. Your ability to control content throughout its life cycle and publish it to a computer-based test environment depends on how effectively that content is managed. Item developers can burn a lot of cycles, and consequently money, working with poorly managed item content.
Have you fully evaluated the technology used to author and deliver the computerized performance test?
I addressed numerous considerations for integrating performance test items into computer-based test environments in my February 2005 column. (To read the column, visit http://www.certmag.com/articles/templates/cmag_nl_credentials_content.asp?articleid=1111&zoneid=97.) Once again, these considerations are applicable to both the authoring and the test delivery technology. In addition, here are some other questions to consider: What happens if there are changes to either the authoring software or the delivery software? Will this result in an unexpected effort to retest all computer-based items? What measures and processes do you have in place to manage changes in the software over time?
The best thing about migrating manual performance tests to computer-based environments is that it is becoming more common. These increasing efforts will help us gain a better understanding of the associated processes, procedures, and trials and tribulations. I believe we have just begun to explore the potential of deploying performance tests in computer-based test environments, and with the continued advancement of technology, that potential is enormous. The important thing to remember when migrating a manual performance test to a computer-based test environment is that it’s not about emulating what we are currently testing, but about how we can enhance the testing experience through a new delivery mechanism.
James A. DiIanni is the director of assessment and certification exam development at Microsoft Learning and supports the Microsoft Certified Professional program. He can be reached at firstname.lastname@example.org.