Assessing Models of Liberal Education: An Empirical Comparison
Postsecondary institutions today are well aware of the importance of curricular assessment. When, as in the case of Miami University of Ohio, two alternative approaches to general education coexist on the same campus, the task of evaluating the outcomes of the two curricula becomes even more pressing.
Miami University wanted to distinguish the effects on student intellectual and personal development of two general education curricula: the University Requirement, a disciplinary, distribution-based general education program (replaced in fall 1992 with a new liberal education program), and the Western College Program, an interdisciplinary core curriculum.
Miami administered a dozen different tests to matched groups of first-, second-, and fourth-year students enrolled in the disciplinary and the interdisciplinary curricula. Qualitative and quantitative measures were used, and cross-sectional as well as longitudinal data were collected and analyzed. This multifaceted approach yielded a complex picture of the students, along with certain patterns of strengths and weaknesses.
In general, measures of general liberal arts skills--the ACT-Comp (ARC version), the Academic Profile, the Test of Thematic Analysis, the Analysis of Argument, the Measure of Epistemological Reflection and the SUNY-Fredonia Tests of General Education--showed few significant differences between students engaged in the interdisciplinary curriculum and those in the disciplinary curriculum.
While performance on most measures was quite high, the results did suggest a need for students in both groups to increase their understanding of the scientific method and improve their knowledge of global issues. It also became apparent, from the Measure of Epistemological Reflection, that interdisciplinary students enter and graduate at a higher cognitive level than their disciplinary peers. The test's author speculated that Miami faculty in both programs may orient their teaching to a higher cognitive level than most students have achieved.
Several of the instruments used seem to have been designed for students of lower academic ability than Miami's, with the consequence that first-year students received extremely high scores, leaving little room for sophomores and seniors to show gains. Several faculty members argue that many of the instruments measure only traditional linear reasoning, and that in some cases the higher-level reasoning processes exhibited by advanced students may have been evaluated negatively.
The most significant differences between the two groups of students emerged from tests of cognitive development and of specific academic behaviors and values--the Myers-Briggs Type Indicator, the ACT Activity Inventory, the College and University Environment Scales, the College Characteristics Index, the College Student Experiences Questionnaire, and the Cooperative Institutional Research Profile. Data show that whereas students in the disciplinary curriculum exhibit higher levels of involvement in social and athletic activities, Western College students participate more frequently in the intellectual, artistic, political and human service activities of the campus. Although this category of instruments did reveal significant differences between the two groups, many of these differences were foreshadowed by entry data from the Cooperative Institutional Research Profile (CIRP) before the students enrolled.
As a result of the project, awareness of the importance of assessment and of the national attention and financial rewards that assessment programs can bring to institutions grew considerably on campus. New faculty were hired to strengthen instruction in scientific and quantitative reasoning. "Expectation Statements," written each semester by the interdisciplinary students only, yielded much information on how students experience their lives on campus, and led to some changes in the treatment of the sophomore year, particularly the development of the students' upper-level program of study.
Further, because the tests were shown to have serious shortcomings for Miami students, faculty were stimulated to continue seeking better ways to measure the effects of general education (see below).
Because of the intensive and prolonged nature of the testing, it was difficult to keep students participating. Considerable attrition meant that groups that began as closely matched ended far less well matched.
The task of transcribing the interview tapes also turned out to be inordinately laborious, and delayed the availability of an important source of data.
Major Insights and Lessons Learned
Although the project did not demonstrate consistent, significant academic differences between the two groups of students, clear differences did emerge in the areas of campus values, academic behaviors, involvement in learning, and interpersonal interactions. It is not possible to tell, however, whether these differences result from the respective curricula or from self-selection on the part of the students. The project reached a "chicken or the egg" conclusion: campus cultures either shape the academic experience of students or are shaped by them.
Perhaps the most significant insight of the project for Miami University was related to the inadequacy of nationally available standardized liberal arts skills tests for measuring the different effects on student learning of the disciplinary and interdisciplinary curricula. This may be due to the high caliber of the populations being measured, which makes it very difficult to obtain gain scores large enough to differentiate between the two groups. On the other hand, the tests may not be sufficiently refined to measure the relatively subtle differences between the outcomes of the two curricula. It is possible that the curricula result in differences not in liberal arts skills, but in the behaviors and values mentioned above. It is also possible that the specific nature of the curriculum has less influence on student learning than do campus ethos, student self-selection, class size, and other non-curricular factors.
The national standardized tests do, of course, provide national norms for student performance. Because they show a high degree of correlation with college-entrance scores, they may constitute more meaningful measures of recruitment efforts than of curricular effects.
Two parallel portfolio programs implemented in fall 1992 assess both the Western College Program and the new liberal education curriculum. All students in the Western Program are required to keep portfolios. A random sample of 40 students campus-wide has been recruited out of each entering class since the summer of 1990 to participate in the project. Results have been shared via the liberal education newsletter, "The Miami Plan." A campus-wide Assessment Council developed a Statement of Philosophy and Goals for Assessment at Miami University, and the University Senate passed it unanimously in 1992. Implementation of assessment within the academic departments has been the focus of the Assessment Council's work during the 1992-93 academic year. This work is supported by a State of Ohio Program Excellence Grant and the Miami Liberal Education Program.
In addition to its 1990 final report to FIPSE, the project has resulted in over twenty conference and workshop presentations. Several reports of project results, plus materials on the continuation of the project since FIPSE funding expired, are available: 1) Excerpts from the liberal education newsletter, "The Miami Plan," on portfolios, student time use, interviews, and free-writing exercises; 2) the Miami Statement on Philosophy and Goals for Assessment; 3) the 1992 Summer Orientation talk to parents sharing assessment results; and, 4) the October 1992 AAHE Bulletin article, "AAHE's Assessment Forum Changes Hands," pp. 10-14, that provides observations about the Miami project and subsequent work. A chapter written with SUNY-Fredonia on the two FIPSE projects will appear in the Jossey-Bass book, Are We Making a Difference? edited by Trudy Banta.
For further information please contact:
Karl Schilling, Director
AAHE Assessment Forum
One Dupont Circle, Suite 360
Washington, D.C. 20036-1110