Nearly six years ago, the University of Washington (UW) faced a 4 percent budget reduction. The faculty debated a number of questions: Should the reduction be met through vertical or horizontal cuts? What processes should be set up to consider eliminating one program or another and how might the attendant ill will caused by such steps be mitigated?
But to make determinations about "best" kinds of cuts, our academic leadership had its own set of questions. If the University of Washington were to eliminate the X track in Y department, how many majors would be affected? How soon would it be until all of the students in track X would graduate? What alternative pathways might students choose who otherwise would have chosen track X? What other departments and programs would be likely to be affected by such a cut? Analogous questions were posed for each of the 10 or so programs--or parts of programs--being considered for elimination, consolidation, or reorganization.
The provost and the deans wanted answers to these questions--a reasonable set of requests, to be sure. But the institution discovered that it was extremely difficult to provide such answers, even though the requisite data existed somewhere at the university. And when the necessary information could be gleaned, it was extremely costly to extract in terms of staff time. This inability to provide decision support in a timely manner was deeply troubling. Absent compelling data to support their decisions, academic leaders were forced to make vague statements of rationale, drawing heavily on the criterion of a unit's or a program's "centrality" to the institution's mission. Just how this centrality might be measured could never be defined satisfactorily.
Given the prospect of dwindling state support and the likelihood of future budget cuts, this state of affairs was unsatisfactory. Responsible academic management at minimum requires the ability to track student progress and measure performance. At a deeper level, though, consensual decision-making involving faculty, chairs, deans, the provost, and the president requires access to a legitimate, recognizable, and common body of information. Too often in moments of crisis, faculty and administrators are locked in a battle over what is "true." Consensus is impossible when essential facts are in contention. In contrast, common information takes the conversation to a higher policy plane: it allows rational discussion of what should be rather than what is.
In response to the absence of an adequate decision support system, a group of us--spanning numerous university divisions including the Office of Undergraduate Education, Planning and Budgeting, Student Affairs, and the Graduate School--proclaimed ourselves the University Strategic Analysis Group (USAG). And together, we produced a relational database that has become a key resource for academic decision-making at the UW. Born of crisis, the USAG database now supports ongoing academic planning at departmental, college, and universitywide levels, directly affecting the culture of our decision-making.
Although this database has had big payoffs, implementing any change in a large university is always problematic. At the University of Washington, this project grew as a revolution from below. It depended on the voluntary, extraordinary efforts of a group of individuals who simply decided to devote themselves to solving these problems. There was no mandate from above. The existence of similar projects at other universities with different histories and philosophies suggests that some of the same ends can be accomplished under different political and institutional conditions. Even so, experience suggests that there are fundamental questions and principles that can help guide any similar initiative.
Once need is established, the second step is to survey existing data resources: What exists presently, in what form, and to what end? This takes time to assess. Even staff who have worked with data at a university for many years may never have asked questions about what data resources are owned by the university as a whole. Staff also typically have not peered over database fences, and may not understand the purpose or logic of any beyond those for which they have direct responsibility.
At the University of Washington, for example, we found a rich array of databases--all of which used disparate computing technologies, file formats, and data definitions. These included the Student Information System (student course enrollment records and student demographics) from the Registrar's Office, course evaluations from the Office of Educational Assessment, building and room information in the Capital and Space Planning Office, the Financial Accounting Systems and Faculty Workload Systems in the Office of Planning and Budgeting, grant and contract expenditures from the Office of Research, graduate student records from the Graduate School, and so on. Moreover, nearly every one of these resided in a different vice president's or vice provost's domain of responsibility.
Each database also had a different audience or master--sometimes the state, sometimes the federal government, and sometimes a particular academic unit (such as the Graduate School). Each also had an "owner" who controlled the shape of the database and governed access to it. Some resided on a Unisys mainframe (programmed in COBOL); others relied upon a PC environment using proprietary software. Five years ago, the challenges of making these disparate systems relate to one another were considerable. Nonetheless, the survey of existing databases was heartening. Data collection is expensive, not only in time, but also in good will. Thus, the first operational rule we adopted: no new data. The challenge was to take the many data riches that already existed and meld them into a single accessible information resource.
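The payoff of melding separately owned sources into one relational resource can be sketched in miniature. The following Python fragment is purely illustrative--the table names, columns, and the shared department key are assumptions for the example, not UW's actual schema. It shows the essential move: once constituent data sit in one relational store, a question that spans ownership boundaries (here, students taught per faculty FTE) becomes a single join.

```python
# Illustrative sketch only: meld two separately maintained data sources
# into one relational store. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")  # a single shared database
cur = conn.cursor()

# Each constituent office contributes its existing data; no new data collected.
cur.execute("CREATE TABLE enrollments (dept TEXT, course TEXT, enrolled INTEGER)")
cur.execute("CREATE TABLE workload (dept TEXT, faculty_fte REAL)")
cur.executemany("INSERT INTO enrollments VALUES (?, ?, ?)",
                [("SOC", "SOC 101", 240), ("SOC", "SOC 270", 80)])
cur.execute("INSERT INTO workload VALUES (?, ?)", ("SOC", 16.0))

# A cross-boundary question becomes one join: students taught per faculty FTE.
cur.execute("""
    SELECT e.dept, SUM(e.enrolled) * 1.0 / w.faculty_fte AS students_per_fte
    FROM enrollments e JOIN workload w ON e.dept = w.dept
    GROUP BY e.dept
""")
rows = cur.fetchall()
print(rows)  # [('SOC', 20.0)]
```

The design choice this illustrates is the one described above: each owner keeps responsibility for collecting and vouching for its own table, while the shared store exists only to relate them.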
One source of resistance to building such a resource, ironically, is the sense that nothing worthwhile can be accomplished until a complete grand design is in place. There are two parts to this problem: data and technology. The pace of technological change is such that no system design will ever be "best." Better choices are always available. Starting with proven and tested technology, which can be deployed reasonably quickly and which is based on principles of adaptability and scalability, is essential. Improvements can and should follow.
The main prize is the data. The big task is to revise the structure of data storage now existing in disparate university information systems to "fit" a common relational database environment. This fit will never be perfect and cannot anticipate all future contingencies. A grand view is useful; details (and mistakes) will follow. The repository of data will grow and its structure will change regardless of the quality of the initial database design. Getting the data into the hands of potential users is essential and should always be the project's ultimate purpose. The time and energy required to structure a "perfect" data model detracts from the purpose of providing the data resource in the first place. Once started, the organization and structure of the database will evolve, because its purposes will change with the environment.
Despite the importance of the goal, however, the daunting array of incompatible databases is as likely to inspire paralysis as action. In response, the necessary direction comes not from the world of data managers, but from institutional priorities.
At the University of Washington, our first priority was to understand student progress--course-taking patterns, major choice, course access issues, retention and graduation patterns, and the like. This, then, guided the development of the relational database: student records, faculty workload, and time schedule information first; budget records second; and so on. This process must be continuously recalibrated.
The next step is equally critical for design: to identify both the principal clients for the database and the principal unit of analysis. Our decision was to privilege deans and chairs and thus to make the principal unit of analysis the department or program. Our key academic decision-makers are chairs and deans, and budgets are organized by unit (department and program, aggregating to college or school). Unit-based accountability was also beginning to gain currency as a management approach at the University of Washington, making access to data at the unit level even more critical. Many universities are also part of a statewide system. In such circumstances, the principal clients are more likely to be vice presidents or other administrators at the system level. Indeed, a system structure places the primary accountability burden on institution-level vice presidents, who are beholden both to deans and to the statewide system.
The final decision is to settle on the character of the database: should it be primarily oriented toward analysis or accounting? Our strong recommendation is that such a database should be used principally for analytic purposes. Transactions and reporting requirements--as well as the responsibility to collect and vouch for the accuracy of the data--should remain the responsibility of the owners of each data source. Freedom from reporting requirements allows the relational database to remain a dynamic creation, changing with new priorities, responding to the changing needs of users, and inspiring ever more sophisticated approaches to leadership and to resolving important management questions.
Accomplishing these steps requires a shift of perspective--especially on the part of the owners of particular sources of data. They must be willing to cede some control over their portion of the world and to adopt a university-level rather than a unit-level stance. Our experience suggests that the most meaningful way to accomplish this shift is to bring all owners of constituent databases to the design table. The project will always benefit from their particular knowledge and expertise. At the same time, the process of becoming identified with the university's well-being as a whole will develop naturally with the exchange of information and ideas.
These experiences suggest the following design principles for any such effort:
Following the first phase, though, it became important to institute a new political strategy designed to educate and inspire use. We made a presentation about the initiative to each of the 17 schools and colleges, and used actual data to describe the college that we were visiting, taking the opportunity to make comparisons with other UW colleges on the same dimensions. These descriptions and comparisons portrayed their units in a way in which they had never been seen before.
During those presentations we were also able to get feedback on the accuracy of the information. One of the important secondary consequences of this enterprise has been to show colleges, schools, programs, and departments the data about them that exist in the university environment. In the vast majority of cases, the units themselves supplied the data, but, before these presentations, they had rarely, if ever, seen the data in the aggregate. Deans and chairs considered this, in and of itself, to be a beneficial outcome of the project. Because things were now out in the open, they were able both to inspire improved reporting of data and to fix heretofore unknown errors that had accumulated over time.
During these visits we also made a commitment to train chairs, deans, and their designates to use the database directly. That, of course, was the true promise of this project: to enable key academic decision-makers to ask their own questions, to model their own futures, to explore solutions to their own problems. Decentralizing access to information facilitated unit strategic planning, an institutional priority in its own right.
In the following year, we began training sessions. They required an eight-hour commitment and knowledge of Microsoft Access. We wrote training manuals for each of the major databases and led the training sessions. At last count, 130 people had been trained at the UW, and demand continues. Those trained fell roughly into three categories: a small number who wanted to know some basics but who do not actually conduct much analysis, a majority who learned to conduct analyses specific to their unit's interests, and a few who became what we call "power users." These power users draw upon the database routinely to carry out their work. They are a force for change at the UW, and an important source of information to us about how to improve the resource as a whole. Recognizing their status, they have now organized themselves informally as a group, known as Parnassus.
Sometimes there are rocky points, as one dean discovers that the FTE student/faculty ratio in his/her college far exceeds that of another, or that his/her resource allocation seems to be out of line. But even these moments of tension are usually turned to good effect. Either they foster an appreciation that other units make qualitatively different kinds of contributions to the common good, or they provoke a discussion among deans and the provost about optimal principles of resource allocation and appropriate performance expectations. After three and a half years of unmediated access to the database, war has not broken out and suspicion has subsided.
The second question--that of security--is of greater concern. We do not have the resources to "police" use of the database, nor do we wish to. That is why we restricted access and provide training only to deans, chairs, and their designates. When a chair or dean sends someone to training, he or she assumes responsibility for that person's actions. Up to now, this system has worked satisfactorily.
How has the University of Washington used this data resource for decision-making? Some examples follow.
One consequence was a chorus of student complaints about the inability to get into courses and to graduate on time. The university could not respond with authority to these complaints: it had no information on the basis of which to gauge the extent of the problem and no way to assess success in fixing it.
Working with the university's central computing staff, we revised the data recorded and maintained in the registration process. For each course offered, the university began an unduplicated count of how many students attempted to register for the course who were denied registration because of inadequate space. If a student at any time secured enrollment in the course, the count was correspondingly revised. The data were captured and stored at the conclusion of each enrollment period. In addition to other data on the number of places offered and enrollment already maintained, the university now recorded sufficient information to be able to determine the following for each individual course offering:
Percent Offered Enrollment Utilized: the proportion of offered enrollment space actually filled (total enrollment relative to total space offered).
Percent Enrollment Demand Satisfied: the proportion of total enrollment demand actually met (total enrollment relative to total demand).
The first measure views enrollment management in an efficiency context and the second in an effectiveness context. Data that yield these measures are made available to everyone involved in enrollment management. Department-level personnel can address enrollment management issues in particular course offerings. University-level administrators can allocate resources to departments to improve the satisfaction of student demand, especially for bottleneck courses. This marks a significant improvement: In years past, scarce resources would have been allocated across the board based upon aggregate numbers and student complaints, rather than in response to the magnitude of the queue of students outside each classroom door.
The ability to respond to unforeseen events has also been enhanced. In the fall of 1997, the university unexpectedly experienced a surge in acceptances of admission offers, resulting in an enrollment of 300 more freshmen than anticipated. The ability to forecast the additional enrollment demand that would be generated by these new students made it possible to add course offerings in advance. It was also possible to see where there was underused capacity that could address the unexpected student demand.
Over time the capability to measure and manage enrollment has grown. Individual colleges and their deans are beginning to be able to address and manage enrollments, and to speak to these issues in the context of decisions about planning and budgeting. The satisfaction of enrollment demand at the lower division has risen since academic year 1994-95 from 80.6 percent to 82.3 percent, and the use of available places rose from 87.0 percent to 88.5 percent while average class size remained constant at 39.7. Similar results are evident at the upper-division level with enrollment demand satisfaction rising from 85.7 percent to 89.8 percent and enrollment use remaining constant at 72 percent but with average class size falling to 25.8 from 26.8. These accomplishments are within the context of a growing student body.
Another unanticipated use of this data resource was discovered in planning and scheduling summer quarter course offerings. Previously, these offerings were determined by what faculty were willing to teach, and were constrained by minimum enrollment standards. Data about actual student enrollment demand provided a new basis for determining summer quarter course offerings and for giving priority to courses with significant levels of unsatisfied enrollment demand during the fall, winter, and spring academic terms.
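The prioritization described above reduces to a simple ranking. The fragment below is a toy illustration--the course names and demand figures are invented--showing how unsatisfied demand accumulated over the regular terms might order summer offerings.

```python
# Illustrative only: rank candidate summer offerings by unsatisfied
# enrollment demand from fall, winter, and spring. Figures are invented.
unmet_demand = {"CHEM 142": 180, "MATH 126": 95, "HIST 111": 4}

priority = sorted(unmet_demand, key=unmet_demand.get, reverse=True)
print(priority)  # ['CHEM 142', 'MATH 126', 'HIST 111']
```

Courses with large queues rise to the top of the summer schedule; courses that everyone who wanted got into fall to the bottom, regardless of what faculty happen to prefer teaching.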
The central advising staff members initiated a "strategic advising program" through which they identify those students most in need of advising assistance from the perspective of institutional student retention. Using the database, we helped them develop criteria to identify these students. Eventually, these criteria included cumulative number of credits earned, minimum and maximum grade point average, and the absence of a declared major. Collectively, they targeted academically strong--but not the strongest--students who had a significant probability of dropping out as juniors. The criteria could be applied each quarter and a group of students could be targeted by the central advising staff for tailored intervention. Previously, the central advising staff would not have been able to identify such students. The decision-support database has enabled efforts to make better use of central advising resources. In turn, a significant university retention issue is being addressed in a systematic way.
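Applying such criteria each quarter is, operationally, a filter over student records. The sketch below is hypothetical throughout--the thresholds, field names, and sample records are invented for illustration, not the advising program's actual values--but it shows the shape of the query: enough credits to be near junior standing, a GPA band that is strong but excludes the strongest students, and no declared major.

```python
# Hypothetical sketch of quarterly strategic-advising criteria.
# Thresholds and student fields are invented, not UW's actual values.

students = [
    {"id": 1, "credits": 95,  "gpa": 3.1, "major": None},
    {"id": 2, "credits": 95,  "gpa": 3.9, "major": None},   # strongest: excluded
    {"id": 3, "credits": 40,  "gpa": 3.1, "major": None},   # too few credits
    {"id": 4, "credits": 100, "gpa": 3.2, "major": "SOC"},  # already declared
]

def target_for_advising(s, min_credits=90, min_gpa=2.8, max_gpa=3.6):
    """Academically strong--but not the strongest--upper-division
    students with no declared major."""
    return (s["credits"] >= min_credits
            and min_gpa <= s["gpa"] <= max_gpa
            and s["major"] is None)

targeted = [s["id"] for s in students if target_for_advising(s)]
print(targeted)  # [1]
```

Because the filter runs against the shared database rather than any one office's records, the advising staff can rerun it every quarter without new data collection.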
At the University of Washington, the lack of such data on actual teaching contributions was a significant barrier to achieving cooperation across colleges and departments. In a college that had experienced a long-term enrollment decline, for example, the absence of data on teaching contributions across disciplines and departmental boundaries unintentionally served as a disincentive for faculty to teach in other colleges. Once it was shown that reliable information about actual teaching contributions could be reported by those who did the teaching, resistance to teaching outside disciplinary and organizational boundaries diminished. Teaching outside organizational boundaries can now be recognized and consciously managed.
To be sure, there are many stories associated with each of the arrows in Chart 1. The transition has been hardest of all, in some ways, for the owners of the original constituent databases. It is not easy to argue that giving up monopolistic control of a database will always have a happy outcome for the administrator who is responsible for such a database. It is not easy to accept that ceding control over information won't reduce power. In fact, it does, at least in the short run.
The transition has also been difficult for academic leaders. In part, this is because these changes were symptomatic of other more profound and unsettling changes in higher education including growing accountability, more attention to student needs and desires, and the tension between access and quality that seems to be ever-present at state institutions.
But the positive reaction to this shift should not be underestimated. Certainly it exceeded our wildest expectations. Notoriously demanding groups of departmental chairs applauded our effort. The initial and continuing demand for training is an excellent indicator of interest and support. While the old paradigm served those who owned data, the new paradigm serves those who need it. The more difficult the environmental constraints--budgetary and political--the more important it is to inform those who must make the tough decisions. Only in the context of shared information can the fundamental value governing decision-making in the academy, consensus, be preserved.
The shift in perspective has been so swift and complete that it has become fully institutionalized. New deans and department chairs have never known a time when they could not count on prompt analyses to answer often complex questions. Our challenge now is to adapt to newly emerging issues and to capture new data that can respond to those issues. Our experience is that issues emerge more quickly than the database can evolve. Sometimes we are clever enough to anticipate issues. Most of the time we are behind. The new level of analytic capacity available to academic decision-makers is now taken for granted. And so it should be.
People are empowered by access to data. They begin to address questions that they have always wondered about or have perceived as being important but have been unable to address. When they can do so, belief is replaced by knowledge. Choices grounded in fact replace choices based upon hope or insistence. Greater timeliness and responsiveness to contemporary issues become possible, even in the absence of budget cuts and other crises.
By Debra Friedman and Phillip H. Hoffman
Debra Friedman is associate provost for academic planning at the University
of Washington, where she founded the University Strategic Analysis Group, which
created a cutting-edge relational database to aid UW academic leaders in
strategic planning. She teaches in the Evans School of Public Affairs and the
Department of Sociology, and is the author of Towards a Structure of
Indifference: The Social Origins of Maternal Custody (Aldine, 1995). Phillip H.
Hoffman is director of Institutional Studies. He is responsible for the decision
support capacity of the University of Washington, both internally and in
collaboration with other higher education institutions.