Learning Anytime Anywhere Partnerships (LAAP)

LAAP Evaluation and Performance Indicators
Archived Information


The Government Performance and Results Act of 1993 (GPRA) is a straightforward statute that requires all Federal agencies to manage their activities with attention to the consequences of those activities. Each agency is to state clearly what it intends to accomplish, identify the resources required, and periodically report its progress to Congress. In doing so, GPRA is expected to strengthen accountability for the expenditure of public funds, improve Congressional decision-making through more objective information on the effectiveness of Federal programs, and promote a new government focus on results, service delivery, and customer satisfaction.

While the goals of LAAP are complex and the funded projects are diverse, several objectives and indicators have been selected for reporting GPRA results. Evaluation plans in final LAAP proposals should include gathering these data when they are relevant to the particular project goals and activities.

GPRA Objectives and Indicators for the LAAP Program

Objective 1: Develop innovative partnerships that achieve economies of scale in delivering asynchronous distance education and training.

Indicator 1.1 National Distribution—the number of products, courses, and/or degree programs developed for delivery statewide or nationally will increase.

Objective 2: Increase access to asynchronous distance education for diverse groups of learners, especially to prepare them for work in technical and other areas of critical shortage or for the changing requirements of their fields.

Indicator 2.1 Number of "underserved" students—the number of underserved students enrolled each year will increase, i.e., individuals with disabilities, individuals in remote areas, welfare recipients or displaced workers, underrepresented populations (Native American, Hispanic, African American), and other adults not otherwise able to participate in postsecondary education.

Indicator 2.2 Course Completion Rate—the rate at which enrolled students complete courses or training programs will increase.

Objective 3: Enable advancements in quality and accountability within postsecondary distance education.

Indicator 3.1 Competency-based—the number of courses that base assessment on student competency, rather than on traditional units of instruction, will increase.

Objective 4: Continue or expand LAAP projects beyond Federal funding.

Indicator 4.1 Projects sustained—the number of projects sustained or expanded at least two years beyond the Federal funding period will increase.

Additional LAAP Program Performance Indicators

In addition, the LAAP program is interested in evaluation questions and indicators beyond those reported for GPRA. The following list is not intended to be a blueprint for project evaluation, but, depending on the nature of the LAAP project, some of these questions and indicators should be addressed.

LAAP Evaluation Questions and Indicators

ACCESS: Does the project increase access among populations who otherwise have difficulty accessing postsecondary education? Does it reach a national or multi-state audience? Do learners successfully complete the courses/program?

  1. Number of learners enrolled by project year
  2. Increase in student enrollment, or use of learning "objects," by categories of underserved populations
  3. Geographic distribution of enrollment
  4. Rate of course completion; certificate or degree program retention/completion

IMPACT ON LEARNING: Does the project result in increased learning? Higher order learning outcomes? Better performance? New attitudes toward technology or learning? Changed learning behaviors?

  1. Improved learning outcomes, based on objective measures (not self-report, GPA or course grade)
  2. Improved attitude or behavior toward technology

QUALITY: Is a fully articulated system of quality assurance in place? Technologically-based performance assessments? Comprehensive student services?

  1. Documentation of processes and instruments
  2. Increased range of student support services
  3. Increased use of competency-based assessment

FLEXIBLE EDUCATION DESIGN AND DELIVERY: Does the courseware/module adapt to learner differences? Does it allow flexibility in pacing? Entry/exit? Is content or pedagogy adaptable to learner abilities, styles, or prior learning? What's the teaching efficacy of particular design features?

  1. Increase in use of flexible features (analysis of products and procedures)

COLLABORATION: Do the partnerships result in economies of scale? Is there evidence of resource-sharing? Elimination of duplicate courses or programs or student services? Joint faculty efforts? New joint policies or procedures developed? Are the partnerships continuing beyond the grant funding period? Is collaborative activity increasing (more partners included, spheres of collaboration increased)?

  1. New policies or procedures in admissions, registration, advising, course/curriculum development or revision, faculty development, staffing, teaching methods, resource allocation, intellectual property, or other
  2. Cost savings resulting from resource sharing
  3. Changes in partnership: new partners, new activities, additional partnering ventures

WORKFORCE IMPACT: Does the project prepare adult learners for work in technical areas of critical labor shortage? For changing requirements in career fields or industries? Is there evidence of increased or enhanced job placement?

  1. Numbers of learners taking and passing job/industry certification exams
  2. Employment of displaced workers or former welfare recipients

COST EFFECTIVENESS: Does increased enrollment/tuition offset the increased cost of developing and delivering highly interactive programs? What's the "break-even point" for scaling up enrollment? What are the real costs of this delivery approach? Does the use of the new technologies result in any real cost savings?

  1. Activity-based cost analysis
  2. Cost-benefit analysis
  3. Cost comparisons of staffing patterns, partnering arrangements, development processes, etc.

SCALING UP & DISSEMINATION: Is large-scale expansion feasible? Are elements of the project (or the project’s entire "model") being adapted/adopted elsewhere?

  1. Evidence of large-scale expansion
  2. Evidence of project’s impact on the practice of distance learning, e.g., the formation of other partnerships, use of software or courseware, adaptation of policies or procedures by others
"Additional LAAP Program Performance Indicators" developed by Joan Krejci Griggs, FIPSE LAAP Program Officer

Last Modified: 06/09/2004