Archived Information

Short Web-Based Version of
Answers in the Tool Box: Academic Intensity, Attendance Patterns, and
Bachelor's Degree Attainment

by Clifford Adelman
Senior Research Analyst, U.S. Department of Education


The text, tables, and appendices of the full monograph (124 pages) are also available on this web site. This short version includes the executive summary of the monograph (with page references to the original), a selection of its tables with new narrative text wrapped around them, and this introduction. Researchers should use the full monograph text, for which the citation is:

Adelman, C. 1999. Answers in the Tool Box: Academic Intensity, Attendance Patterns, and Bachelor's Degree Attainment. Washington, DC: U.S. Department of Education.

The lessons of this study are addressed to seven audiences:

  1. People who make decisions about the provision of curriculum and guidance in secondary schools.
  2. People who seek to encourage all students to continue their education after high school, particularly those people who are advocates for minority students.
  3. People who advise and monitor students once they are in college.
  4. People in positions from which they seek accountability for college student attainment.
  5. Journalists who report on and interpret trends in education.
  6. Researchers who study the role of postsecondary education in human resource development and equity issues, and faculty supervising graduate students and their dissertations on these issues.
  7. Students themselves.

It is obvious from the study that the academic intensity and quality of one's high school curriculum (not test scores, and certainly not class rank or grade point average) counts most in preparation for bachelor's degree completion, so

  1. Opportunity-to-learn is our most important objective.
  2. But opportunity-to-learn means little unless students take advantage of the opportunity—and school, peer and family environments are supportive.
  3. Opportunity-to-learn means more than school time. We must find ways to help students fill their curricular portfolios with as much content as possible during non-school hours.
  4. Social skills development and aspirations-boosting mean nothing without opportunity-to-learn.

This study is concerned with BA completion, but there are other postsecondary populations just as important as those who actively seek a bachelor's degree.

Executive Summary

Answers in the Tool Box is a study about what contributes most to long-term bachelor's degree completion of students who attend 4-year colleges (even if they also attend other types of institutions).

Degree completion is the true bottom line for college administrators, state legislators, parents, and most importantly, students—not retention to the second year, not persistence without a degree, but completion.

This study tells a story built from the high school and college transcript records, test scores, and surveys of a national cohort from the time they were in the 10th grade in 1980 until roughly age 30 in 1993. The story gives them 11 years to enter higher education, attend a 4-year college, and complete a bachelor's degree. In these respects—based on transcripts and using a long-term bachelor's degree attainment marker—this story is, surprisingly, new.

This study was motivated by four developments in higher education during the 1990s:

  1. The growing public use of institutional graduation rates as a measure of accountability, and the tendency in public policy and opinion to blame colleges for students' failure to complete degrees and/or for failure to complete degrees in a timely manner.

  2. An ever expanding proportion of high school graduating classes entering postsecondary education, and new federal policies encouraging even more students to enter or return to higher education. Our system is being challenged simply to maintain, let alone improve, college graduation rates.

  3. The increasing tendency, overlooked in both policy and research, for students to attend two, three, or more colleges (sometimes in alternating patterns, sometimes simultaneously) in the course of their undergraduate careers.

  4. The rising heat of disputes involving admissions formulas at selective colleges where affirmative action policies have been challenged. These disputes, carried into the media and hence dominating public understanding, involve two indicators of pre-college attainment—grades/class rank versus test scores—without any reference to high school curriculum and its role in the degree completion rates of the mass of minority students.

The story of what contributes most to bachelor's degree attainment works toward six ordinary least squares regression equations that progressively add blocks of key variables following the progress of students from high school into higher education and through the first true year of attendance. The penultimate model (the fifth in the series) accounts for about 43 percent of the variance in bachelor's degree completion [p. 74]. The sixth equation simply indicates that one hits a plateau of explanation at this point. For a story-line such as this, 43 percent is a very high number. A five-step logistic regression then provides both a dramatic underscoring of the principal findings and some enlightening variations.
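The blockwise regression strategy described here can be sketched in miniature. The sketch below uses invented data and invented variable names (acres, ses, cont); it is not the study's model, only an illustration of how adding a block of attendance variables raises the variance explained (R-squared):

```python
# A miniature sketch of blockwise OLS, with invented data: regress a 0/1
# completion outcome on successive blocks of predictors and watch R-squared
# rise as a new block enters. None of these numbers come from the study.
import random

random.seed(1)

def ols_r2(X, y):
    """R-squared of an OLS fit (with intercept), via the normal equations."""
    n, k = len(X), len(X[0]) + 1
    A = [[1.0] + list(row) for row in X]          # prepend intercept column
    AtA = [[sum(A[i][r] * A[i][c] for i in range(n)) for c in range(k)]
           for r in range(k)]
    Aty = [sum(A[i][r] * y[i] for i in range(n)) for r in range(k)]
    for col in range(k):                          # Gaussian elimination
        piv = max(range(col, k), key=lambda r: abs(AtA[r][col]))
        AtA[col], AtA[piv] = AtA[piv], AtA[col]
        Aty[col], Aty[piv] = Aty[piv], Aty[col]
        for r in range(col + 1, k):
            f = AtA[r][col] / AtA[col][col]
            for c in range(col, k):
                AtA[r][c] -= f * AtA[col][c]
            Aty[r] -= f * Aty[col]
    b = [0.0] * k
    for r in range(k - 1, -1, -1):                # back substitution
        b[r] = (Aty[r] - sum(AtA[r][c] * b[c] for c in range(r + 1, k))) / AtA[r][r]
    yhat = [sum(bc * ac for bc, ac in zip(b, row)) for row in A]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Invented cohort: two pre-college quintile measures, one attendance flag.
n = 500
acres = [random.randint(1, 5) for _ in range(n)]
ses = [random.randint(1, 5) for _ in range(n)]
cont = [random.randint(0, 1) for _ in range(n)]   # "continuously enrolled"
y = [1 if 0.4 * a + 0.1 * s + 0.8 * c + random.gauss(0, 1) > 2.0 else 0
     for a, s, c in zip(acres, ses, cont)]

r2_1 = ols_r2(list(zip(acres, ses)), y)           # pre-college block only
r2_2 = ols_r2(list(zip(acres, ses, cont)), y)     # + attendance block
print(f"R^2, pre-college block:  {r2_1:.3f}")
print(f"R^2, + attendance block: {r2_2:.3f}")
```

Because the models are nested, R-squared can only rise as a block enters; the interesting question, as in the study, is by how much, and when it plateaus.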

There are 11 variables in the penultimate linear regression model. The two most important variables, accounting for the bulk of the model's explanatory power, are:

In the logistic version of the penultimate model, the same 11 variables (out of 24) are statistically significant, but those displaying the strongest relationships to degree completion (the highest "odds ratios") are all post-matriculation phenomena: continuous enrollment, community college to 4-year college transfer, and the trend in one's college grades.

Among the 11 variables, the following are not usually found in similar analyses:

The only demographic variable that remains in the equation at its penultimate iteration is socioeconomic status, and by the time students have passed through their first year of college, SES provides but a very modest contribution to eventual degree completion. No matter how many times (and in different formulations) we try to introduce race as a variable, it does not meet the most generous of threshold criteria for statistical significance.

Selected Findings

High School Background

College Attendance Patterns

Degree Completion

Conclusions That Follow from These Findings:

What We Learned: Variables to Discard

Examples of stock building-block variables that are discarded because of weak architecture:

. . .and Variables Reconstructed

What We Learned: Principles to Guide Research and Evaluation

The monograph concludes with "tool box" recommendations to those who execute policy regarding both pre-college opportunity-to-learn and post-matriculation advisement. The tool box metaphor is a logical consequence of the analysis. It says that if we are disappointed with uneven or inequitable outcomes of postsecondary education, we must focus our efforts on aspects of student experience that are realistically subject to intervention and change.

We do not have tools to change intentions or perceptions, or to orchestrate affective influences on students' decisions. The events of students' life course histories through their 20s lie largely beyond the micromanagement of collegiate institutions. But we do have the tools to provide increased academic intensity and quality of pre-college curricula, to assure continuous enrollment, to advise for productive first-year college performance, and to keep community college transfer students from jumping ship to the 4-year institution too early.

The recommendations thus address dual enrollment, direct provision of secondary school curriculum by college instructors, an 11-month rolling admissions cycle for all 4-year colleges, using Internet-situated courses to keep college students continuously enrolled (even for one course), implementation of institutional policies restricting the extent of course withdrawals/incompletes/repeats, realistic credit loads, and advisement that is both sensitive and sensible.

The story and its analyses are derived from and apply to a cohort whose history covers the period 1980-1993. There is another and more contemporary cohort whose history, beginning in 1988, is still in progress. Will the story-line change? Will the analyses be validated? Will we have attained greater equity in degree-completion rates for minority students? Have attendance patterns become even more complex, and more oriented toward competences and certifications as opposed to degrees? Only a full data-gathering for this cohort in the year 2000 and the collection of its college transcripts in 2001 will tell.

Selected Tables and Narrative

The principal data source for this study is the second of the three great age-cohort longitudinal studies designed and executed by the National Center for Education Statistics of the U.S. Department of Education. It is called the "High School & Beyond/Sophomore cohort," and it follows the history of the scheduled high school graduating class of 1982 from the time they were in 10th grade in 1980 until roughly age 30 in 1993. The study also uses data from the first longitudinal study (that of the high school graduating class of 1972) and the third (in-process) study, known as the NELS-88 (it began with 8th graders in 1988 and will conclude with a survey of this cohort at age 26/27 in the year 2000 and the gathering of their college transcripts in 2001). All these data sources include high school records and college transcripts. Such data do not lie.

In Part I of Answers in the Tool Box, an index of "academic resources" is constructed from three indicators of pre-college study: academic intensity/quality of curriculum, test scores, and class rank/academic GPA. The curriculum variable consists of 40 gradations. The gradations are determined first by the number of Carnegie units of English, mathematics, science/core laboratory science, history/social studies, and foreign language study on a student's transcript, and then by the highest level of mathematics the students reached in high school, the number of Advanced Placement courses, whether the student ever took remedial coursework in math or reading, etc. The curriculum gradations comprise what is called a "criterion-referenced" scale. Unlike the case of test scores or class rank/academic GPA, everybody can reach the top rung on the ladder.

The test score variable covers the 93 percent of students in the sample who took a "mini, enhanced SAT" administered by NCES to all its longitudinal studies cohorts. Since not all high schools compute class rank, and since academic GPA correlated with class rank at .841, a combined variable was created. All three components of "Academic Resources" were set out in quintiles.

The first table in this selection from Answers in the Tool Box takes the three components of the Academic Resources variable, by quintiles, and indicates the percent of students within each quintile who earned a bachelor's degree by age 30. The table cuts the universe of students four different ways. The distribution clearly indicates that no matter which way you cut it, the curriculum variable (academic intensity + quality) produces higher bachelor's degree completion rates than either test scores or class rank/academic GPA.

Table 5 [page 15 in the original text].—Percent of HS&B/So students completing bachelor's degrees, by quintile of performance on three component variables of "academic resources."

                          Quintile: Highest   2nd    3rd    4th   Lowest
1) Base Group
N=10470; Weighted N=2.83M
   H.S. Curriculum                   70      44     19      5*     3*
   12th Grade Test                   64      37     16      8      3
2) On-Time HS Grads Only
N=9635; Weighted N=2.61M
   H.S. Curriculum                   70      45     19      5*     3*
   12th Grade Test                   64      38     17      8      3
3) On-Time HS Grads with
SES Data; N=8819;
Weighted N=2.31M
   H.S. Curriculum                   69      45     19      6*     3*
   12th Grade Test                   64      38     17      9      3
   Rank/GPA                          59      37     21      9      5
4) On-Time HS Grads with
SES Data Who Attended
College at Any Time;
N=6868; Weighted N=1.82M
   H.S. Curriculum                   72      49     25     10*     6*
   12th Grade Test                   67      43     23     14      7
   Rank/GPA                          64      42     28     14      9

NOTES: (1) The difference between any pair of estimates in a row is statistically significant at p<.05 except for those indicated by asterisks. (2) Senior Year Weight used for groups 1-3; Postsecondary Transcript Weight #1 used for group 4.
SOURCE: National Center for Education Statistics: High School & Beyond/Sophomore cohort restricted file, NCES CD#98-135.

We then combine the three background variables into the master-variable, Academic Resources (or ACRES—a purposeful agricultural allusion). The weight of each background variable was determined using a logistic regression. Curriculum accounts for 41 percent of ACRES, test scores for 30 percent, and class rank/academic GPA for 29 percent. After the combination, we distribute students by quintile of total "ACRES-points," and match the bachelor's degree attainment rates under that formula against those by quintile of socioeconomic status (SES).

The results provide a preliminary indication that Academic Resources will be a stronger variable than SES in explaining what really makes a difference in degree completion.
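The combination just described can be sketched in code. Only the component weights (curriculum .41, test scores .30, rank/GPA .29) come from the monograph; the student records and the quintile numbering convention (5 = highest) are invented for illustration:

```python
# A hedged sketch of the ACRES composite: a weighted sum of three quintile
# components, re-cut into quintiles of "ACRES-points." Weights are from the
# monograph; the cohort below is invented.
import random

random.seed(7)
WEIGHTS = {"curriculum": 0.41, "test": 0.30, "rank_gpa": 0.29}

def acres_points(student):
    """Weighted sum of the three component quintile scores."""
    return sum(WEIGHTS[k] * student[k] for k in WEIGHTS)

# Invented cohort of 1,000 students with random component quintiles (1-5).
students = [{k: random.randint(1, 5) for k in WEIGHTS} for _ in range(1000)]
scores = sorted(acres_points(s) for s in students)

def acres_quintile(points):
    """Re-cut the composite into quintiles by position in the distribution."""
    rank = sum(1 for s in scores if s <= points) / len(scores)
    return min(5, int(rank * 5) + 1)

demo = {"curriculum": 5, "test": 4, "rank_gpa": 3}
pts = acres_points(demo)            # 0.41*5 + 0.30*4 + 0.29*3 = 4.12
print(f"ACRES points {pts:.2f} -> quintile {acres_quintile(pts)}")
```

The heavier curriculum weight means a student strong in curriculum but middling in rank/GPA still lands in a high ACRES quintile, which is the point of the design.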

Table 11 [page 24 of original text].—ACRES versus socioeconomic status in relation to bachelor's degree completion, High School & Beyond/Sophomore cohort, 1982-1993

                         Percent Completing Bachelor's Degrees Within Each Quintile
                         (Highest)                              (Lowest)
All High School Graduates
   Academic Resources      72.5     44.8     17.3      5.4      2.0
   Socioec Status          55.4     32.9     23.2     14.5      7.2
All Who Attended
a 4-Year College
   Academic Resources      80.3     64.1*
   Socioec Status          72.1     59.6*    55.5*    45.5     35.4

NOTES: (1) The first universe of all high school graduates includes students who received a diploma or GED by 12/31/87 for whom SES and ACRES could be determined. Weighted N=2.45M. The second universe restricts the first by attendance at a 4-year college at any time. Weighted N=1.15M. (2) The Senior Year Weight was used. (3) All row and column pair comparisons are significant at p<.05 except those indicated by asterisks.
SOURCE: National Center for Education Statistics: High School & Beyond/Sophomore cohort, NCES CD#98-135.

Why choose bachelor's degree attainment? Because students don't go to college with the objective of merely getting to the second year. The bottom line for students, their families, institutions, and state legislatures (which worry about such things) is completion. Besides, if all we did was look at persistence from the first to the second year of study, we would find a very odd portrait of "success," as the following table should make clear.

Table 13 [page 28 in original text]—The fallacy of temporal persistence: percent of 1982 HS&B/So college entrants who "persisted" to the academic year 1983-1984, by credits earned in the first year, number of remedial courses, and degree completion

                           Credits Earned in First Year
                           0-12    13-19   20-28    >28    % of All
By Number of
Remedial Courses
    None                   LOW N    5.2*   28.6    62.3     48.5
    One                     8.1*    8.4*   30.9    52.6     21.8
    Three or More          14.1*   19.0    36.6    30.3     17.7
By Highest
Degree Earned
    None                   19.4    19.9    36.5    24.2     29.7
    Associate's            LOW N   10.4*   26.4    56.1     10.6
    Bachelor's or Higher    1.9*    3.9*   29.4    64.8     59.8

NOTES: (1) Universe consists of all on-time high school graduates who entered postsecondary education between June and December, 1982 and who were also enrolled during the Academic Year beginning July 1, 1983. (2) Weighted N=1.2M; (3)*Low-N cells with no statistical significance. (4) Rows for credits earned may not add to 100.0% due to rounding and Low-N cells.
SOURCE: National Center for Education Statistics: High School & Beyond/Sophomore cohort, CD #98-135.

Three out of ten "persisters" arrived at "year 2" of their college careers with a good deal less than what one would call "sophomore standing": fewer than 20 credits, three or more remedial courses, or both. They may have "persisted," but their chances of completing a degree (associate's or bachelor's) are rather low.

It is important to note that, when one considers the entire system of postsecondary education and follows the student through that system (no matter how many schools the student attends), bachelor's degree completion rates are very high. Answers in the Tool Box takes the position that, no matter what students say, unless they actually set foot in a 4-year school at some time after high school they are not in the denominator of degree completion rates. It is neither accurate nor fair to include a student who attends only a cosmetology school or another student who attends only a community college for purposes of earning a certificate in radiologic technology in the universe of potential bachelor's degree recipients. To illustrate, the next table sets forth the system degree completion rates for all students, and then for students who attended a 4-year college at some time. The table also illustrates the effect of advancing beyond an adjusted semester's worth of credits (10) or a standard one-year credit load (30).

Table 14 [p.29 in original text].—Percent of students completing degrees, by credit-generation thresholds, High School & Beyond/Sophomore cohort, 1982-1993

                 All Postsecondary Students      Attended a 4-Year College At Any Time
                                                     >0           >10           >30
                  >0     >10    >30              All   DIR     All   DIR     All    DIR
Degree by
Age 30:
   None          50.7   45.1   32.7              31.5  26.8    29.5  25.6    22.7*  20.1*
   Associate's    9.3   10.4   12.6               5.6*  5.5*    5.8*  5.6*    6.3*   6.0*
   Bachelor's    40.0   44.5   54.8              62.9  67.7    64.7  68.8    71.0   73.9
   Weighted N    1.95   1.84   1.48              1.31   1.

NOTES: (1) DIR=Direct Entry. (2) Universe=all students for whom transcripts were received. (3) Weighted N is in millions. (4) Columns may not add to 100.0% due to rounding. (5) Paired comparisons (All v. DIR) are significant at p<.05 except those asterisked.
SOURCE: National Center for Education Statistics: High School & Beyond/Sophomore cohort, CD# 98-135.

One of the most commonly used variables in analyses of access and persistence in higher education is something researchers have labeled "aspirations." It turns out, however, that the questions we ask of students are not really about their educational "aspirations," but rather about their expectations, plans, and commitments. For the High School & Beyond/Sophomores, we asked 6 pairs of such questions, in the 10th grade and again in the 12th grade. On the basis of these 6 pairs, a variable called "anticipations" was constructed. This variable measures the level and consistency of the student's vision and concrete plans for his/her future education. The next table shows us what happened to students at each degree of "anticipation." The data indicate the virtues of reconstructing a variable that is often thoughtlessly applied.

Table 15 [page 34 in the original text].—Degree anticipations and highest degree earned by age 30 in the HS&B/So: percent of students completing degrees at each level of anticipation.

                               All Postsecondary Students            All Who Ever Attended a 4-Year College
Anticipations                  None   Assoc.  Bach.  % of All        None   Assoc.  Bach.  % of All
Bachelor's Consistent          31.5    4.0    65.5     33.9          25.1    3.3    71.6     53.4
Increased to Bachelor's        55.3   11.7    33.0     28.6          36.2    7.4    56.4     28.9
Associate's Consistent
  or Reduced from Bachelor's   67.2   16.7    16.1     17.4          43.6   12.8*   43.6     11.1
Certificate or Associate's     88.6*   6.3*    5.1*     8.5          72.2*   5.2    22.6*     3.4
No Degree or Never Knew        89.0*   7.6*    3.4*    11.5          69.0*  10.1*   20.9*     3.3

NOTES: (1) The universes consist of all students for whom "anticipations" could be determined from survey responses in both the 10th and 12th grades. For all postsecondary students, the Weighted N=2.38M; for those who attended a 4-year college at any time, the Weighted N=1.37M. (2) Column pairs are significant at p<.05 except those marked by asterisks.
SOURCE: National Center for Education Statistics: High School & Beyond/Sophomore Cohort, NCES CD#98-135.

Part III of Answers in the Tool Box provides an explication of post-matriculation attendance patterns in higher education. The data and analysis make it clear that going to college in the 1990s means something very different from what it did 20 years ago. The next three tables provide parameters and guidance for understanding (a) the nature and extent of multi-institutional attendance, and (b) the fact that attending more than one school does not produce a significantly lower degree completion rate (even though it obviously takes longer to earn the degree under those circumstances).

Table 18 [page 41 in the original text].—Change in the percent of students attending one and more undergraduate schools: "Basic Participation Populations"* v. Bachelor's Degree Recipients

                          Number of Undergraduate Schools
                           One     Two     Three+
Basic Participants
   NLS-72 (1972-1984)     59.5%   30.6%     9.9%
   HS&B/So (1982-1993)    46.8    33.7     19.5
Bachelor's Recipients
   NLS-72 (1972-1984)     50.4    36.4     13.2
   HS&B/So (1982-1993)    41.7    36.7     21.6

Effect size for Basic Participants = .36
Effect size for Bachelor's Recipients = .24
* Basic participation = earned more than 10 undergraduate credits.

NOTES: (1) All within-cohort differences are significant at p<.05. (2) The effect size is used for measuring differences in means across cohorts. One takes an unweighted sample N, mean, and standard deviation for each cohort group. Then one creates a "pooled standard deviation" by the formula: (N1 x SD1 + N2 x SD2) / (N1 + N2). Lastly, one determines the difference in the means and divides that difference by the pooled standard deviation. The resulting effect size functions like a Standard Deviation Unit or z-score; that is, it tells us whether the change in the mean is truly meaningful. In this case, the effect size for basic participants is moderate and that for bachelor's degree recipients small. For guidance on interpreting SDU-type changes, see Bowen, H.R., Investment in Learning (San Francisco: Jossey-Bass), 1977, p. 103.
SOURCES: National Center for Education Statistics: National Longitudinal Study of the High School Class of 1972, and High School & Beyond/Sophomore cohort.
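The effect-size recipe in the notes to Table 18 translates directly into code. Only the formula comes from the text; the cohort Ns, means, and standard deviations below are invented for illustration:

```python
# The effect-size computation from the table notes, written out.
def effect_size(n1, mean1, sd1, n2, mean2, sd2):
    """Difference in means divided by the pooled standard deviation,
    (N1*SD1 + N2*SD2) / (N1 + N2), as defined in the notes above."""
    pooled_sd = (n1 * sd1 + n2 * sd2) / (n1 + n2)
    return (mean2 - mean1) / pooled_sd

# Invented example: mean number of schools attended rising across cohorts.
es = effect_size(12000, 1.50, 0.68, 11000, 1.73, 0.79)
print(f"effect size = {es:.2f}")      # a small-to-moderate shift
```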


Table 19 [page 42 in the original text].—Percent of students attending more than one institution of postsecondary education by type of "referent" first institution attended

                          NLS-72    HS&B/So    BPS/90
First Institution
   4-Year College          38.8*     52.3*      50.1
   2-Year College          36.4*     46.5       56.5

Sector Share of
All Cohort
   4-Year College          54.7      50.5       55.5
   2-Year College          37.8      39.7       40.1

NOTES: (1) The universe for the BPS/90 distribution consists of "typical" college-age (17-23) students only, so as to render it more comparable to the age-cohort longitudinal studies. (2) The universes for both the NLS-72 and HS&B/So consist of all students with postsecondary records. (3) Definition of first institution of attendance for the HS&B/So differs from that of the other studies. See text. (4) All within-cohort differences are statistically significant at p<.05 except those column pairs indicated by asterisks.
SOURCES: National Center for Education Statistics: (1) National Longitudinal Study of the High School Class of 1972; (2) High School & Beyond/Sophomore Cohort, NCES CD #98-135; (3) Beginning Postsecondary Students, 1989-1990, Data Analysis System.


Table 20 [page 45 in the original text].—A portrait of multi-institutional attendance of the HS&B/So, 1982-1993: percent of students attending more than one college, by institutional characteristics and other attendance variables.

                                 Number of Undergraduate Institutions
                                 One    Two   Three+   % of All   % Earning
All who earned >10 credits        47     35     18       100%        45
Type of Referent First Institution
    Liberal Arts                  43     38     19*        6         73
    Community College             46     34     20        34         20
Selectivity of First Institution
    Highly Selective              51     31     18*        3         92
    Selective                     44     36     20         9         85
    Non-Selective                 44     36     20        52         65
    Open Door                     56     28     16        31          7
    Not Rated                     75     18*     7*        5          2
Institutional Combinations
    4-year only                   56     31     13        46         75
    4-year to 2-year              --     71     29         4         --
    2-year to 4-year              --     63     37        10         66
    Alternating/Simultaneous      --     30     70         5         52
    4-year and other              --     78     22*        3         17*
    2-year only                   75     18      7*       22         --
    2-year and other              --     77     23         3         --
    Other only                    90     10*    --         6         --
    2-year, 4-year & other        --     --    100         2         14*

Notes: (1) The universe consists of all students who earned more than 10 credits. Weighted N=1.87M. (2) "Other" institutions consist principally of non-degree granting vocational schools, and these are "not rated" in terms of selectivity. (3) Institutional type based on the Carnegie classification system in effect in 1987. (4)* Low-N cells. Estimates are not significant.
SOURCE: National Center for Education Statistics: High School & Beyond/Sophomore cohort, NCES CD#98-135.

Multi-institutional attendance is not the only notable feature in the changing landscape of student behavior in higher education. For decades we have talked about full-time and part-time students in terms of their original enrollment status in a given term. What we learn from the transcripts is that the proportion of all grades indicating that a student withdrew from a course without penalty, left a course incomplete, or repeated a course even though it was passed on the first attempt jumped from 4 percent to 7 percent between the 1970s and the 1980s. That may not sound like a large proportion of the grades, but it's a deadly increase that calls the concept of "full-time/part-time" into question. After all, students may start a semester with a 15-credit load, but withdraw from or leave incomplete 9 of those credits. If students do this too often, as the next table indicates, their likelihood of finishing a degree is low.
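The bookkeeping just described (what the next table calls the "DWI" index) can be sketched as a simple proportion. The one-letter grade codes below are illustrative assumptions, not the actual transcript coding used in the study:

```python
# A sketch of a DWI index: the share of attempted courses whose grades are
# drops/withdrawals, incompletes, or no-penalty repeats. Codes are invented.
DWI_CODES = {"W", "I", "R"}            # withdrew, incomplete, repeated

def dwi_index(grades):
    """Proportion of attempted courses that became DWIs."""
    if not grades:
        return 0.0
    return sum(1 for g in grades if g in DWI_CODES) / len(grades)

transcript = ["A", "B", "W", "C", "I", "A", "B", "B", "W", "A"]
print(f"DWI index: {dwi_index(transcript):.0%}")   # 3 of 10 attempts
```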


Table 26 [page 55 in the original text].—DWI* Index: percentage of HS&B/So students who dropped, withdrew, or left incomplete proportions of attempted undergraduate courses in three ranges, by highest degree earned, 1982-1993

                              Proportion of All Attempted Courses that Became DWIs
                              (three ranges, lowest to highest)
For ALL Students:                 62.0        14.7        23.3
Highest Degree Earned:
    Bachelor's                    78.4        14.9*        6.7
    Graduate                      87.8         9.6         2.6

Notes: (1) All row and column differences significant at p<.05 except the column pair indicated by asterisks. (2) Universe consists of all HS&B/So students with complete postsecondary records. Weighted N=1.65M. (3) Rows add to 100.0%.
SOURCE: National Center for Education Statistics: High School & Beyond/Sophomore cohort, NCES CD#98-135.
*DWI=Drops, Withdrawals & Incompletes.

Part IV of the study brings together nearly 30 major variables in two kinds of equations with bachelor's degree attainment as the dependent variable. The variables are fed into the equations in blocks that follow the life-course from high school into postsecondary education and through the first year of college. This is fairly technical material for the general reader, but it is absolutely essential for determining what really counts. The next table presents the first stage of both sets of equations, here in the "logistic" version. The logistic equation produces an "odds ratio," a metric that answers the question, "Given a one-unit change in the value of any independent variable, how much do the chances of earning a bachelor's degree change?" The criterion for even glancing at an odds ratio is that all the other data for the variable in that equation are statistically significant.

So, the next table takes students' demographic backgrounds (not only SES, race, and sex, but also whether they had children of their own by age 22), their educational anticipations, and their ACRES, and sets forth the first stage of analysis. SES and ACRES are in quintiles, anticipations has five values (see p. 6 above), race has two values (non-Asian minority/ everybody else), and "children" (parenthood) has two values (yes/no). To illustrate what happens with an odds ratio: Academic Resources is a quintile variable; each move up the quintile ladder increases the odds of earning a bachelor's degree by 97 percent. The t-value is very strong (8.58) and the probability that this relationship would occur by chance (the p-value) is less than one in 1,000.
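The odds-ratio arithmetic can be verified directly: a logistic parameter b implies an odds ratio of exp(b) per one-unit change in the predictor. The figures below (parameter 0.678, odds ratio 1.97) are the Academic Resources values quoted in the text:

```python
# Converting a logistic regression parameter to an odds ratio: exp(b).
import math

b_acres = 0.678                        # parameter reported for ACRES
odds_ratio = math.exp(b_acres)
print(f"one quintile step:  odds ratio = {odds_ratio:.2f}")
# Steps compound multiplicatively, not additively:
print(f"two quintile steps: odds ratio = {math.exp(2 * b_acres):.2f}")
```

This is why a one-quintile move "increases the odds by 97 percent": exp(0.678) is approximately 1.97.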


Table 30 [page 62 of original text].—Logistic account of the relationship of pre-college and family variables to bachelor's degree completion among 4-year college students in the High School & Beyond/Sophomore cohort, 1982-1993

Universe: All students for whom the evidence confirms 4-year college attendance at any time (whether transcripts were received or not), who received a high school diploma or equivalent prior to 1988, and who evidenced positive values for all variables in the model. N=4,943. Weighted N=1.179M. Simple s.e.=.697; Taylor series s.e.=1.082; Design effect=1.55.

                      Parameter   Adj. s.e.     t       p     Odds Ratio
Academic Resources      0.678       0.051      8.58    .001      1.97

NOTES: (1) Standard errors are adjusted in accordance with design effects of the stratified sample used in High School & Beyond. See technical appendix and Skinner, Holt and Smith (1989). (2) Significance level of t (p) based on a two-tailed test.
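As a worked check of the design-effect adjustment described in note (1), assuming (an assumption, not stated outright in the table) that the printed standard error of 0.051 is the simple random-sample value: inflating it by the design effect of 1.55 reproduces the reported t of 8.58 for Academic Resources:

```python
# Design-effect adjustment sketch: s.e. from a simple-random-sample formula
# is inflated by the design effect before computing t. The assumption here
# is that 0.051 is the unadjusted s.e.; 1.55 is the design effect given in
# the universe description above.
DESIGN_EFFECT = 1.55

param, se_simple = 0.678, 0.051
se_adjusted = se_simple * DESIGN_EFFECT
t = param / se_adjusted
print(f"adjusted s.e. = {se_adjusted:.3f}, t = {t:.2f}")
```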

In the "linear" form of this equation, these variables account for 23 percent of the variance in bachelor's degree completion rates, with ACRES accounting for the bulk of that figure. In the linear form, too, race and sex would be dropped as variables because they did not meet the minimum criterion for statistical significance.

The second stage of the equations adds in financial aid variables: grants, loans, and various forms of college work/study (combined into a single variable). Whether the student took out an educational loan has no effect on degree completion in either type of equation, and the variable is dropped from the "linear" story line.

The third stage brings college attendance patterns into the story, and is extremely important because (1) in the linear version, it lifts the explanatory power of the model from the 23-24 percent range to 36 percent—a rather dramatic increase, and (2) in the logistic version, it demonstrates how the relationships among the variables alter when a new set enters. The following table is an extract from the logistic presentation in table 39 (pp. 80-81) of the original text:

Logistic Version of the College Attendance Pattern Stage of the Model Explaining Factors in Bachelor's Degree Attainment (by age 30) of Students in the High School & Beyond/Sophomore Cohort Who Attended a 4-Year College at Any Time, 1982-1993

Variable                            Parameter   Odds Ratio    p
Continuous Enrollment                 1.553        4.73      .001
Classic Transfer                      1.349        3.85      .001
Academic Resources (ACRES)            0.544        1.72      .001
Did Not Return to 1st Institution    -0.691        0.50      .01
SES                                   0.194        1.21      .02
1st Institution Was Selective         0.693        2.00      .05
Student Received Grant                0.405        1.50      .05
1st Institution Was a 4-Year          0.403        1.50      ----
College Work-Study                    0.259        1.30      ----
No Delay Entering from HS             0.262        1.30      ----
1st Institution Was a Doctoral        0.153        1.17      ----
Attended Other than 2 or 4-Year      -0.652        0.52      ----
Sex (Male=1)                         -0.150        0.86      ----
Took Education Loan                   0.056        1.06      ----
Number of Schools Attended           -0.009        0.99      ----

Only half of these variables are statistically significant, and race is barely so. The closer an odds ratio approaches 1.0, the less it means. Put those two observations together, and one sees instantly that number of schools attended, taking out a loan, and gender are irrelevant at this third stage, and remain so through the rest of the equations. Continuous enrollment and classic transfer (defined as earning a semester's worth of credits at a 2-year college before moving to a 4-year and getting beyond a semester's worth of credits at the 4-year school) become very important to the entire story, and hold their positions through the 4th stage (first year performance) and final stage (extended performance indicators). When first year performance indicators enter (grades, total credits earned, and the ratio of credits earned to credits attempted), race joins the ranks of variables that are not statistically significant. By the time students move beyond their first year, race joins sex at an odds ratio approaching 1.0. This makes sense: as one moves along the paths into and through higher education, who you are is not anywhere nearly as important as what you do.

Academic Resources (the ACRES variable) still plays one of the stronger roles in what you do. In the "linear" version of the equation series, ACRES is the strongest variable in the final iteration (which accounts for about 43 percent of the variance in bachelor's degree completion rates), and in the logistic version, it is one of the top six. For pre-college performance, dominated by the intensity and quality of high school curriculum, to have such a lingering effect sends a very powerful message to high school students.

One of the most critical issues that can be addressed with the data and analyses of this study is how to narrow the gap in bachelor's degree attainment rates by race. This is a matter of doing the right thing for minority students: not merely admitting them to college, but making sure that they have the momentum to complete degrees. Otherwise, we defraud them. The components of the ACRES variable tell us what tools to use. The following table is a simulation. Suppose we took the top 40 percent of students in each of the three components of ACRES: curriculum, test scores, and class rank/academic GPA. What were their bachelor's degree attainment rates? How do those rates compare with all students? And which of the three components produces the best results for minority students? The answer is a "no-brainer": curriculum wins, hands down! And curriculum is the only component of pre-college preparation that we can do something about for sure! And curriculum is the only component in which everybody can be at the top, provided (a) that they have the opportunity-to-learn and (b) that they take advantage of the opportunity.

Look at what happens in table 40. For all African-American students who entered 4-year colleges directly from high school, the bachelor's degree attainment rate by age 30 was 45 percent. If we took the top 40 percent by class rank/GPA, the rate rises to 59 percent. Sounds good! If we took the top 40 percent by test scores, it rises to 67 percent. Sounds better, but not as good as it can get. For if we took the top 40 percent by the curriculum intensity/quality indicator, the graduation rate rises to 73 percent, and the gap between black and white degree attainment rates shrinks from 30 percentage points to 13. Now which of these three indicators would you rather use in advising African-American students? Which of these should be the most important in college admissions, if one is really interested in the student?
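The cited rates and gaps also let us back out the implied white attainment rates in each scenario. This is back-of-envelope arithmetic from the figures quoted in the paragraph above, not numbers taken directly from Table 40:

```python
# Figures quoted in the text: black attainment rate and black-white gap,
# overall and for the top 40 percent by curriculum (percentage points).
black_all, gap_all = 45, 30
black_curr, gap_curr = 73, 13

# Implied white rates: black rate plus the gap.
white_all = black_all + gap_all    # overall
white_curr = black_curr + gap_curr # top 40% by curriculum

# The gap narrows even though both groups' rates rise.
gap_shrinkage = gap_all - gap_curr

print(white_all, white_curr, gap_shrinkage)  # 75 86 17
```

Note that selecting on curriculum raises both groups' rates (the implied white rate rises from 75 to 86 percent), yet the black rate rises faster, which is exactly why the gap shrinks by 17 points.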

When we look at the impact on Latino students, the case against using class rank/GPA is even more powerful, for here it actually has a negative effect on graduation rates (61 percent for all versus 57 percent for the top 40 percent by rank/GPA). At the same time, using the top 40 percent by curriculum produces a graduation rate statistically equivalent to that for Anglos.

Table 40 [page 85 of the original text].—Bachelor's degree completion rates for students in the top two quintiles of each component of ACRES, who entered 4-year colleges directly from on-time high school graduation, by race, High School & Beyond/Sophomore cohort, 1982-1993

Column headings (the data rows did not survive in this web version):
  Curriculum:      highest 40% and Algebra 2
  Test Scores:     highest 40% of combined scale
  Class Rank/GPA:  highest 40% of combined variable
Notes: (1) Universe for "ALL" consists of all on-time high school graduates who entered 4-year colleges directly from high school, and whose college transcript files are not incomplete (Weighted N=859K); the universe for the three component groups adds high school records with positive values for all three components (Weighted N=805K). (2) Standard errors are in parentheses.
SOURCE: National Center for Education Statistics: High School & Beyond/ Sophomore cohort, NCES CD#98-135.

The monograph concludes with "tool box" recommendations designed to increase the curricular content in the portfolios that everybody—but minority students, in particular—brings forward from high school into higher education, recommendations that goad institutions of higher education themselves into being more active providers of that content. We need massive and creative efforts including dual-enrollment, direct provision of curriculum by colleges and community colleges to rural high schools in particular, establishing and expanding community technology centers (using churches, public libraries, granges, Boys-and-Girls clubs, etc.) to provide academic material during non-school time, and other strategies. We do some of this now, but not enough. Opportunity-to-learn is still lacking in too many places, and jiggling with admissions formulas in selective colleges won't solve the problem.
