Why Doesn't IDEA Use the N/A Option?

September 30, 2013

By Steve Benton 

From time to time, IDEA Center staff are asked why we do not include the use of the Not Applicable (N/A) option in the Likert Scale for the Student Ratings of Instruction instrument.

Researchers are split on the issue of whether to provide such an option (Dillman, 2007). The IDEA Center elects not to provide it. For the 12 learning objectives, the instructor determines what is applicable, not the student. For the other summative items, a forced choice is preferable. For the 20 teaching methods, we rely more on the research and on the relationship of relevant objectives to methods than on students' perceptions of which methods are applicable to the class they are taking.

The standard set of recommended directions for each class is printed on page 4 of the "Directions to Faculty":

“IDEA focuses on what the instructor was trying to teach and on what you learned. As such, an instructor is not expected to do well on every item. In recognition of this, items not related to this course are not counted in the final evaluation.”

Students are asked to respond to all 12 objectives for the following reasons (not necessarily in order of importance). First, by responding to all 12, students can discriminate between the objectives that were emphasized in the course and those that were not. In fact, when instruction is effective, student mean ratings on "unimportant" objectives are typically lower than their ratings on important or essential objectives. The IDEA instrument relies on this concurrent validity, which is consistently demonstrated between faculty ratings of importance and student ratings of progress on the objectives. Were we not to include "unimportant" objectives on the form, that important piece of validity evidence, which users want and need, would be lost. Beyond that is the impractical and expensive prospect of tailoring each form to the individual instructor. The IDEA Center, a nonprofit organization, is committed to keeping costs within reasonable limits.

Moreover, some instructors appreciate being able to see students' mean ratings on unimportant objectives on page 3 of the report (statistical details). An instructor sometimes identifies an objective on page 3 with higher student progress ratings than expected. This can be helpful formative information about incidental or unintentional student learning.


Dillman, D. A. (2007). Mail and internet surveys: The tailored design method. Hoboken, NJ: John Wiley & Sons.
