AICP Certification Exam Development Process

Job Analysis

Through a Job Analysis survey, the specific key responsibilities and competencies required for effective performance in a specialized planning field are identified. Subject matter experts (SMEs) are assembled to participate in a focus group facilitated by Prometric. The focus group evaluates the knowledge, skills, and abilities necessary to measure the desired level of competency in a given planning specialization, including establishing the tasks and subtasks required to earn the designation.

Test Specification (Blueprint) Development

The development of a test specification (blueprint) document follows the Job Analysis and ensures that each section and objective is appropriately represented on the test. Ratings are solicited from the SMEs during the Job Analysis to ensure that the weighting assigned to each objective is appropriate.
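At its core, the weighting step is a proportional allocation exercise. The short sketch below illustrates only that arithmetic; the content areas, rating scale, and exam length are hypothetical and are not AICP or Prometric figures.

    # Hypothetical sketch: turning mean SME importance ratings into blueprint
    # weights and item counts. All names and numbers below are illustrative.
    sme_ratings = {                 # mean importance rating per content area (1-5 scale)
        "Plan Making": 4.6,
        "Plan Implementation": 4.1,
        "Public Engagement": 3.8,
        "Ethics and Law": 3.5,
    }
    total_items = 100               # assumed exam length

    total_rating = sum(sme_ratings.values())
    blueprint = {
        area: round(rating / total_rating * total_items)
        for area, rating in sme_ratings.items()
    }

    for area, n_items in blueprint.items():
        print(f"{area}: {n_items} items ({n_items / total_items:.0%} of the exam)")

Because of rounding, the allocated counts may need a small manual adjustment to sum exactly to the intended exam length.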

Item Writing Workshop

A panel of SMEs is recruited and trained to write items (test questions) during an Item Writing Workshop. During the workshop, a Prometric facilitator conducts item writing training and guides the item writing process. SMEs then author questions and submit them for editing and review. Each item is designed to be an accurate measure of knowledge relevant to the goals of the testing program, as well as psychometrically sound and legally defensible.

Item Review

A separate panel of SMEs, under the direction of a qualified Prometric psychometrician, reviews each newly written test item for accuracy, clarity, valid linkage to the test specifications, and correctness of the identified key. SMEs then revise items as needed to meet these requirements.

Psychometric and Language Item Editing

Each item undergoes a psychometric, sensitivity, and language review. Prometric's psychometric review verifies that each item conforms to recognized psychometric item writing guidelines and does not favor any particular nationality, race, religion, or gender. During the editing process, Prometric editors and test developers check all items for grammar, usage, readability, clarity, and consistency. Once items are selected for inclusion, a Prometric test developer constructs the test in accordance with the test specifications.

Test Administration

The test is administered to qualified candidates in a live testing environment.

Item Analysis and Review of Statistically Flagged Items

Prometric combines detailed statistical study with subject matter expert judgment to measure the relationship between item types and objectives; to assess the fairness, reliability, and validity of each item; and to examine performance discrepancies. SMEs review poorly performing items and help determine how best to score them. Prometric's proven methods include the modified Angoff and Borderline Group methods.
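The statistics behind this step come from classical item analysis. As a rough, hedged illustration (the response matrix and flagging thresholds below are invented and do not represent Prometric's actual procedure), item difficulty can be computed as the proportion of candidates answering an item correctly, and discrimination as the point-biserial correlation between the item and the total score:

    # Minimal classical item-analysis sketch; data and thresholds are illustrative.
    # Each row of `responses` is one candidate; each column is one item (1 = correct).
    import numpy as np

    responses = np.array([
        [1, 1, 0, 1],
        [1, 0, 0, 1],
        [0, 1, 1, 1],
        [1, 1, 0, 1],
        [0, 0, 1, 0],
    ])

    difficulty = responses.mean(axis=0)      # proportion of candidates answering correctly
    total_scores = responses.sum(axis=1)

    # Point-biserial discrimination: correlation between each item and the total score.
    discrimination = np.array([
        np.corrcoef(responses[:, i], total_scores)[0, 1]
        for i in range(responses.shape[1])
    ])

    # Flag items that are extremely easy or hard, or that fail to separate
    # stronger candidates from weaker ones, for SME review.
    flagged = [
        i for i in range(responses.shape[1])
        if not (0.2 <= difficulty[i] <= 0.9) or discrimination[i] < 0.2
    ]
    print("difficulty:", difficulty.round(2))
    print("discrimination:", discrimination.round(2))
    print("items flagged for SME review:", flagged)

In operational programs the discrimination index is usually corrected by removing the item from the total score, and flagging rules are more nuanced; the point here is only that flagged items go back to SMEs rather than being rescored automatically.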

Standard Setting / Cut Score Analysis

A standard setting task force of SMEs is created to determine the pass-fail standards and the associated cut scores for the test. A cut score is the minimum score that a candidate must obtain in order to pass the exam. The SMEs examine the content outline and make a preliminary assessment of what a minimally qualified candidate would and would not know. The task force then takes the exam themselves and rates each item individually against the definition of the minimally qualified candidate they discussed.

The SMEs reconvene to adjust their ratings of the minimally qualified candidate based on their own performance and collective discussion. Prometric juxtaposes the results of the Item Analysis and Standard Setting studies with candidate performance to recommend a range of scoring options. The cut score that is selected is intended to ensure that only qualified candidates pass the exam and are awarded the credential.
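For the modified Angoff method named above, the arithmetic behind the recommended cut score is straightforward: each SME estimates, for every item, the probability that a minimally qualified candidate would answer it correctly, and the per-item averages are summed. The sketch below uses invented ratings for a five-item example and illustrates only that aggregation step, not Prometric's full study.

    # Hedged sketch of modified Angoff aggregation; all ratings are invented.
    angoff_ratings = {               # rater -> probability estimates for each of 5 items
        "SME 1": [0.70, 0.60, 0.80, 0.55, 0.90],
        "SME 2": [0.65, 0.70, 0.75, 0.50, 0.85],
        "SME 3": [0.75, 0.65, 0.85, 0.60, 0.95],
    }

    n_items = len(next(iter(angoff_ratings.values())))
    item_means = [
        sum(ratings[i] for ratings in angoff_ratings.values()) / len(angoff_ratings)
        for i in range(n_items)
    ]

    # Expected raw score of a minimally qualified candidate.
    raw_cut_score = sum(item_means)
    print(f"Recommended cut score: {raw_cut_score:.1f} of {n_items} items "
          f"({raw_cut_score / n_items:.0%})")

In practice this recommendation is one input among several; as described above, it is weighed against item analysis results and actual candidate performance before a final cut score is adopted.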

The selected cut score will not always produce the desired pass rate at first, because of factors such as a small candidate pool, but as more candidates take the test, pass rates should stabilize over time to reflect the optimal pass-fail ratio.

Ongoing Test Maintenance

Over time, the test is periodically reevaluated to ensure that it remains a relevant and accurate measure of candidates' expertise in the field. As in the early stages of the test development process, a task force of SMEs reexamines performance discrepancies and reassesses the fairness, reliability, and validity of test items to determine whether adjustments to the test or the cut score are necessary.

