Program Development and Maintenance Process
The National Institute for Certification in Engineering Technologies (NICET) develops certification programs that recognize individuals who meet industry-established standards of job knowledge and work experience. The success of these programs is based on the expertise of the industry professionals who work together to define the content of the exams and the criteria for certification. Volunteer subject matter experts (SMEs) are needed at every stage of the program development/maintenance process. Certification development and maintenance is an ongoing process guided by industry input and data analysis.
Practice Analysis Development
The term “practice analysis” refers to both a document and a process. The document is a listing of all the tasks that are performed by a technician who would be certified through the program, together with the knowledge and skills involved in each task. It is developed by a group of SMEs who come together via meetings and conference calls. The work involves defining the scope of each certification Level, identifying the major areas of responsibility (“domains”), and identifying the tasks, knowledge (including appropriate standards), and skills associated with each Level. The definitions are reviewed to ensure clarity of meaning and consistency with the scope of each Level. Additional reviews are conducted to determine any revisions that may be suggested by the results of each stage of the validation process.
Validation Survey
Once its design is finalized, the practice analysis is “validated” by a sample of practitioners from the program area. This review is conducted in two rounds. First, the document is sent to a few individuals who are asked for largely open-ended comments. After that, it is sent—in the form of a survey—to a large sample of technicians who are asked to respond to specific questions about the content. They are also given the opportunity to offer open-ended comments. Results are analyzed statistically and comments are compiled. These results are used to determine whether the practice analysis is a good representation of the work performed by technicians. Final results of the validation are presented to the Practice Analysis Group for consideration in making final revisions to the document.
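NICET does not publish the details of its survey analysis, but a typical validation analysis of this kind computes, for each task, the mean importance rating across respondents and flags low-rated tasks for the group to reconsider. The sketch below is purely illustrative: the task names, rating scale, and retention threshold are all hypothetical.

```python
# Illustrative sketch only: NICET's actual survey analysis is not published.
# For each task, compute the mean importance rating across respondents and
# flag tasks falling below a (hypothetical) retention threshold.
from statistics import mean

def summarize_task_ratings(responses, threshold=2.5):
    """responses: dict mapping task name -> list of importance ratings.
    Returns (means, flagged), where flagged lists tasks below threshold."""
    means = {task: mean(r) for task, r in responses.items()}
    flagged = [task for task, m in means.items() if m < threshold]
    return means, flagged

if __name__ == "__main__":
    # Hypothetical tasks, rated 1 (not important) to 4 (critical).
    responses = {
        "Interpret shop drawings": [4, 4, 3, 4],
        "Calibrate test equipment": [3, 2, 3, 3],
        "Archive legacy records": [2, 1, 2, 3],
    }
    means, flagged = summarize_task_ratings(responses)
    print(flagged)  # tasks the Practice Analysis Group may revise or remove
```

In practice such a summary would feed back into the Practice Analysis Group's final revisions rather than mechanically deciding which tasks stay.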
Blueprint Construction
At this stage, a group of SMEs advises NICET on issues related to methods and criteria for assessing qualifications for certification. The group prepares a test “blueprint” (the number of test questions per topic area) based on the practice analysis and validation survey results. They finalize the reference materials and the types of questions appropriate for each exam. They may be involved in outlining some specific topic areas and will develop/recommend the work experience criteria for certification at each Level.
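The article does not describe how survey results translate into question counts, but one common way to build such a blueprint is to allocate questions to each domain in proportion to its validated importance weight. The sketch below assumes hypothetical domain names and weights, and uses largest-remainder rounding so the counts sum exactly to the planned exam length.

```python
# Illustrative sketch only: NICET does not publish its blueprint formula.
# Allocate questions per domain proportionally to (hypothetical) importance
# weights, using largest-remainder rounding to hit the exact exam length.

def allocate_questions(domain_weights, total_questions):
    """Return a dict mapping each domain to its question count."""
    total_weight = sum(domain_weights.values())
    raw = {d: total_questions * w / total_weight
           for d, w in domain_weights.items()}
    counts = {d: int(r) for d, r in raw.items()}
    # Hand the leftover questions to the largest fractional remainders.
    remainder = total_questions - sum(counts.values())
    for d in sorted(raw, key=lambda d: raw[d] - counts[d],
                    reverse=True)[:remainder]:
        counts[d] += 1
    return counts

if __name__ == "__main__":
    # Hypothetical domains and survey-derived weights for a 60-question exam.
    weights = {"Plans and Specifications": 0.35,
               "Installation": 0.40,
               "Inspection and Testing": 0.25}
    print(allocate_questions(weights, 60))
```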
Item Development
These individuals will write and review test questions for NICET certification exams. Writing is performed remotely in conjunction with other SMEs. Members of the group will meet in person or remotely to review test questions. Questions are reviewed to determine if they meet the standards for inclusion in certification exams and to allocate the appropriate amount of time for each question. (Individuals involved in developing questions are not included in the standard-setting process.)
NICET test questions are formatted as multiple choice (a single correct answer) or multiple response (two or three correct answers). Multiple response questions can appear in various formats, including a single 4-square graphic, multiple graphics, or a listing of answer choices without a graphic. NICET does not use true/false formatting in test questions.
The development of quality test questions requires a predetermined standard and style guide. NICET has set the following standards so test questions are:
• Focused on significant, important, and relevant content (the things that really matter);
• At the appropriate level of difficulty;
• Clear, precise, and direct (not confusing);
• Free of unintentional sources of difficulty;
• Free of unintended answer cues; and
• Unbiased toward any candidate.
Questions are reviewed against the following specifications:
• Is the stem clear?
• Are the responses clear?
• Is/are the key(s) the ONLY correct answer(s)?
• Is there a rationale for any math calculations?
• Are the distractors plausible yet incorrect?
• Is there any area for interpretation in this question?
• Is this question appropriate for this level?
• Is this question in the appropriate topic?
• Is the listed reference and section correct?
• How long will it take for a baseline candidate to answer this question?
• Is this question an enemy/variant to another question?
SMEs are instructed to avoid the following in the development of test questions:
• Trivia,
• Fill-in-the-blank,
• Made-up words,
• Humor/puns,
• Tricks,
• “All of the above” or equivalent,
• “None of the above” or equivalent, and
• Negative phrasing (“not” or “except”).
Standard Setting
A standard-setting session is held when a new program is being developed, when an existing program is undergoing a major redevelopment, or when large numbers of new questions are being written to create new versions of the test. To determine the passing score for a new test version, NICET uses SMEs and a “modified Angoff” procedure to establish the minimum number of correct answers required to demonstrate that a candidate possesses the necessary level of job knowledge for certification. SMEs meet remotely or in person for this task. After receiving training in the process, participants rate the test questions by comparing each question with the standard of knowledge and skill expected for that level of certification. Their ratings are then compiled and statistically analyzed by NICET to establish the passing score for the exam. (Individuals involved in standard setting are not included in the item development process.)
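The arithmetic behind an Angoff-style cut score can be sketched briefly. In a typical modified Angoff procedure, each SME estimates, for every question, the probability that a minimally qualified candidate would answer it correctly; averaging those estimates across raters and summing over questions yields a raw passing score. NICET's exact statistical treatment is not published, and the ratings below are made up for illustration.

```python
# Illustrative sketch of a modified-Angoff cut score calculation.
# NICET's actual procedure is not published; the ratings are hypothetical.

def angoff_cut_score(ratings):
    """ratings: one list per question, each holding SME probability
    estimates that a borderline candidate answers correctly."""
    per_question = [sum(r) / len(r) for r in ratings]  # mean across raters
    # Expected number of correct answers for a borderline candidate.
    return sum(per_question)

if __name__ == "__main__":
    # Three SMEs rating a tiny, hypothetical four-question exam.
    ratings = [
        [0.90, 0.80, 0.85],  # easier item
        [0.60, 0.70, 0.65],
        [0.50, 0.55, 0.45],  # harder item
        [0.75, 0.70, 0.80],
    ]
    print(angoff_cut_score(ratings))
```

In practice the raw value would be rounded to a whole number of questions, and rater agreement would be examined before the score is adopted.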
Training is essential to developing a qualified workforce. However, to avoid “training to the test,” the NICET development procedures limit participation by those who develop or conduct formal training related to that exam, or who plan to do so in the next two years. Not all training activities are affected.
NICET incorporates the following overarching concepts while developing and maintaining certification programs.
• Valid – The exam covers the right material for the specialty.
• Reliable – Candidates get the same results on multiple attempts when all other factors remain the same.
• Fair – The exam is free of bias; only knowledge of the practice affects results.
• Legally Defensible – NICET uses a defensible standard test development process to ensure that the exam accurately measures knowledge, skills, and abilities of candidates.
For more information, visit nicet.org.
EDITOR’S NOTE: Reprinted with permission from NICET at nicet.org/about-us/nicet-news/nicet-newsletter/october-2020/nicet-certification-program-development-maintenance-process.