Quality Matters: Assuring the Quality of Online Courses
Quality assurance of online courses is of prime importance to all the stakeholders in higher education: students, faculty, administrators, institutions, accrediting agencies, and legislators all benefit from a rigorous process of quality assurance and control. In the interest of defining and assessing quality in online courses, in September 2003, the U.S. Department of Education Fund for the Improvement of Postsecondary Education (FIPSE) awarded MarylandOnline (MOL; http://MarylandOnline.org) a three-year Quality Matters (QM) grant to arrive at consensus among these stakeholders and to develop the tools, process, and infrastructure to assure the quality of online courses.
Presently, as the Quality Matters team wraps up its three-year project, it has received a series of significant recognitions: the WCET Outstanding Work (WOW) Award (2005), the United States Distance Learning Association's (USDLA) 21st Century Best Practice Award (2005), and the Maryland Distance Learning Association's Program of the Year Award (2005).
Thus, it seems a propitious moment to present the concept, process, and recent experiences of Quality Matters to the whole UMUC community.
Quality Matters: The Concept
FIPSE funded Quality Matters as an innovative project that pushes the boundaries of higher education quality assurance in the following ways:
- Institutional participation is voluntary. The grant project involves multiple institutions with shared project tasks and responsibilities.
- The focus is on course improvement, not evaluation. The QM project is crafted to provide faculty with feedback on their course design for continuous improvement, rather than definitive evaluation of faculty performance.
- The Peer Review process is inter-institutional. The QM rubric and process were created by teams representing a variety of higher education segments and intended for use by all levels of the education community. In addition, guidelines dictate that review teams include at least one member (preferably two) from outside the course's home institution.
- QM can serve as a national model. The collaborative, academic, research-based discussions, processes, and tools of QM make it a model amenable to national adoption.
The challenge in this project has been to create a process that can be endorsed by all institutions without compromising institutional autonomy. Similarly, it has been crucial to communicate strongly and unequivocally to all participating faculty members that their academic standards and individual teaching approaches will be respected and embraced.
Quality Matters: The Process
The QM grant project (http://www.QualityMatters.org) has created an inter-institutional continuous improvement process for assuring the quality of online courses. The credibility, reliability, and strength of this process stem from three core features of the project. First, the process has been vetted by faculty experts. Second, all review criteria and the extensive Quality Matters rubric are based on solid research literature, eleven national standards of best practice, and instructional design principles. Finally, participation by faculty, instructional designers, and institutions is voluntary, collaborative, and supportive. The Quality Matters team has worked hard to communicate these principles to all participants and to establish strong collaborative guidelines.
Concretely, a team of three peer reviewers is trained to use the QM Rubric to review the quality of each online course. The rubric consists of 40 weighted review elements shown in the research literature to positively impact student learning. Each review team includes one content expert and at least one member from an institution other than the course's home institution. Review team members work both individually and collaboratively, and in communication with the faculty member who developed the course. The reviewers provide feedback on the exceptional elements of the course and offer constructive recommendations for improving it.
The backbone of the Quality Matters rubric, and another aspect of its strength, lies in the integration of review elements that touch upon learning objectives, assessments, and activities. Through this integrated approach, faculty reviewers take a holistic view of the course design. For example, reviewers are asked to consider whether the assessments and activities are indeed driven by the learning objectives. Even if a course clearly states learning objectives, the course cannot meet the stated criteria unless those learning objectives are clearly linked to assessments and activities.
Quality Matters Recognition: Standards and Recent Experiences
The QM course review process is not seen as having a win/lose or pass/fail outcome. Rather, the expectation is that each course reviewed will eventually meet QM quality expectations. To receive QM Recognition, a course must demonstrate all essential review elements (the most heavily weighted elements) and receive a combined total score of 68 out of 80 possible points. This represents a required minimum score of 85%. If a course does not meet these standards after initial review, the faculty member may make changes and improvements and submit the course for an expedited review. To encourage a continuous improvement process, the QM project provides instructional design support for implementing the review team's recommendations if no such support exists at the faculty member's institution. QM has reviewed over 100 courses from 18 different Maryland schools and 10 different schools outside Maryland. Upon initial review, approximately 50% of the courses do not meet QM expectations. The vast majority of these courses subsequently undergo minor revisions in a short period of time and receive QM Recognition.
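The recognition rule described above combines two conditions: every essential review element must be met, and the combined score must reach the 85% threshold (68 of 80 points). A minimal sketch of that logic, with function and field names that are illustrative rather than part of the official QM rubric, might look like this:

```python
# Hypothetical sketch of the QM Recognition rule described above.
# The function name and inputs are illustrative, not an official QM tool.

def meets_qm_recognition(essential_met, total_score, max_score=80, min_score=68):
    """A course earns recognition only if ALL essential (most heavily
    weighted) elements are met AND the combined score reaches 85%,
    i.e., at least 68 of 80 possible points."""
    return all(essential_met) and total_score >= min_score

# All essential elements met and 70/80 points: recognized.
print(meets_qm_recognition([True, True, True], 70))   # True
# One essential element missed: not recognized, despite a high score.
print(meets_qm_recognition([True, False, True], 75))  # False
# All essentials met but 67/80 falls below the 85% threshold.
print(meets_qm_recognition([True, True, True], 67))   # False
```

Note that a high total score alone is not sufficient; the all-essential-elements condition reflects the rubric's emphasis on its most heavily weighted standards.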
The QM project is a complex venture that involves and benefits constituencies at institutions across Maryland and the United States. First, faculty gain solid recommendations for refining course design and materials, thus improving the teaching and learning process, while also benefiting from access to professional development activities and an expanded and diversified peer network. Second, instructional designers, technologists, and distance education staff gain access to shared resources, training, and a widened professional network. Third, institutions gain access to a quality assurance activity that will inform their programmatic decisions and strengthen their accreditation credentials, to activities that foster collaboration and resource sharing across institutions, and to expedited articulation pathways. Finally, and most importantly, students benefit from improved courses and increased access to postsecondary learning opportunities.
QM has moved beyond Maryland-only activities; it is now a nationally recognized project. In addition to a number of national partners, individuals and programs from 130 different institutions of higher education across 28 different states are participating in QM. Over 600 faculty have been trained to use the rubric to review online courses. Ninety-seven percent of faculty who have gone through training have agreed (28%) or strongly agreed (69%) that "the Quality Matters project will positively impact teaching and learning at my institution." Two-thirds of the trainees indicated that they intended to make improvements in their own course based on their exposure to the rubric, while an even higher number (71%) planned to use the rubric as an aid in their own course development activities.
Participants in the course review process have indicated that it is a positive and valuable professional development experience. A significant number of peer reviewers have indicated continued interest in serving in this capacity, and faculty whose courses have been reviewed have expressed interest in becoming peer reviewers themselves. Peer reviewers have reported a number of personal benefits, including attainment of:
- an increased awareness of the standards of best practice
- insights to improve their own course
- guidelines for developing a new course
- ideas and approaches from other courses, adaptable to their own course
To date, over 100 UMUC faculty, staff, and program administrators are participating in QM. The QM project is co-directed by Christina Sax, Collegiate Professor in Biology and Assistant Dean of Social, Behavioral, Natural, and Mathematical Sciences in UMUC's Adelphi division. Additionally, Richard Schumaker of UMUC's Center for Teaching and Learning has served on the QM project in a number of different capacities, and recently became certified as an official QM Trainer. Eighteen different undergraduate courses have undergone QM review.
During the Center for Teaching and Learning's (CTL) 2006 Summer Institute, the QM Rubric will be linked to CTL's and GSMT's Expectations documents and will lay the foundation for the Using Self-Review for Continuous Improvement of Your Teaching track. This track will be adapted for a global online CTL workshop set to debut in spring 2007. The QM rubric has also been incorporated into the School of Undergraduate Studies online course development guidelines, and future plans call for the inclusion of the QM rubric as a resource in CTL A201 WebTycho training.
We welcome your questions and suggestions about Quality Matters. For more information, or to participate in the QM project, contact Christina Sax (email@example.com) or Richard Schumaker (firstname.lastname@example.org).