Evidence for the construct validity of the skills portion of the assessments comes from independent research reports of gains in skills scores after participation in a course or training program aimed at building critical thinking skills.
Some of these peer-reviewed publications, written by researchers from around the world, are posted as examples of how the skills assessments provided by Insight Assessment have been used to document gains in critical thinking skills.
For this to occur, any improvement in scores must be attributable to improvements in critical thinking and not to some other external factor. In other words, so far as possible, all variables are held constant with one exception: a treatment is supplied that is expected to increase critical thinking skills. This might be, for example, a staff development program focused on case-based analysis of real problems with an emphasis on training critical thinking skills, a course in critical thinking that gives students or working professionals practice in using their critical thinking skills, a class or internship focused on training reasoning skills, or some other such treatment. It would then be reasonable to attribute improved posttest scores for the same individual or group to the effects of the intervention to build critical thinking skills. To maximize the quality of the testing conditions, consultations with Insight Assessment technical staff on testing plan design are available as part of the new client package when clients are beginning new testing programs.
Construct validity is also demonstrated by correlational studies in which critical thinking scores are correlated with other measures that purport to include the construct. The critical thinking skills portion of these assessments has demonstrated strong correlations with other instruments that purport to include a measure of critical thinking or higher-order reasoning as a component of their scores or ratings. High correlations with standardized tests of college-level preparedness in higher-order reasoning have been demonstrated (GRE Total Score: Pearson r = .719, p < .001; GRE Analytic: r = .708, p < .001; GRE Verbal: r = .716, p < .001; GRE Quantitative: r = .582, p < .001). These correlations indicate the degree to which these more broadly focused instruments capture an assessment of critical thinking. A number of these relationships were reported in a large multi-site research study involving 50 programs of health science education assessing students' critical thinking.
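For readers unfamiliar with the statistic, a Pearson r such as those reported above measures the strength of the linear relationship between two sets of paired scores. The sketch below is purely illustrative: the data are hypothetical, not drawn from any Insight Assessment study, and the function simply implements the standard sample correlation formula (covariance divided by the product of the standard deviations).

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient:
    covariance of xs and ys divided by the product of their
    standard deviations. Returns a value in [-1, 1]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired scores: critical thinking skills test vs.
# a broader standardized measure (illustrative values only).
ct_scores    = [15, 18, 22, 25, 27, 30, 31, 34]
other_scores = [480, 510, 560, 590, 600, 640, 650, 700]

r = pearson_r(ct_scores, other_scores)
print(f"Pearson r = {r:.3f}")
```

An r near .7, as reported for the GRE comparisons, indicates that roughly half of the variance in one measure (r squared) is shared with the other; statistical packages additionally report a p-value indicating whether the correlation differs reliably from zero.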
Contact Insight Assessment for further information about the construct validity of our test instruments.