Developing a Knowledge Management Strategy for Andrews University

Wednesday, January 16, 2002

Invited:

Name Position
Emilio García-Marenko Registrar
Ann Gibson Dean, School of Business Administration
Karen Graham Dean, School of Education
David Heise Chief Information Officer
Jim Massena Director, Institutional Research
Pat Mutch Vice President, Academic Administration
Stephen Payne Vice President, Enrollment Management
Linda Thorman Dean, Graduate School

Agenda

1.      Information access and availability (Intranet)

§         Web, web, web: we need an Institutional Research web site with interactive and canned reports, to complement other offerings

§         Archives of Acrobat PDF files for hardcopy reports

§         data extracts for further analysis in Excel, etc

§         interactive data models from the data warehouse

 

2.      Access Control Issues

§         Opportunity to attack some of the silos still standing?

 

3.      Data Quality

§         responsibility for monitoring, correcting, etc

§         process for ensuring accuracy

 

4.      What Information? Unmet or partially met needs

§         self-service, interactive analysis, "drill-down", "slice & dice"

§         key performance indicators

§         trend analyses

§         retention studies

§         grade distributions

§         student satisfaction

§         alumni satisfaction

§         faculty contribution/workload

§         student demand by course/section

§         demographics, external comparisons

§         report to NAD (Dallas Kindopp)


 

5.      Prioritized value of reports

§         review current offerings along with unmet needs

§         determine importance rankings

 

6.      Structure of IR

§         How do we structure Institutional Research to better equip it to provide the services we are becoming increasingly dependent on?

§         Job Descriptions

 

Meeting Notes

The agenda was far too long to cover completely in a single one-and-a-half-hour meeting. We spent most of our time together on item 1, and also touched on parts of items 2 and 4.

Agenda Item 1 - Web

(a) Sue Schwab and Jim Massena should prepare a dictionary of reports and scripts that are available, and place it on the Institutional Research web (http://www.andrews.edu/ITS/IR/).

Name: Commonly known report or query name (with hypertext link if available via the web)

Description: Explanation of report content, context, and scope, etc

Format:

§    paper only

§    downloadable or web viewable static (e.g., pdf)

§    downloadable, delimited for Excel analysis

§    datacube for OLAP interactive analysis

§    others?

Author:

§    Jim Massena

§    Sue Schwab

... ... ... ...
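The dictionary of reports described above could be sketched as a small data structure. The following Python sketch is illustrative only: the four field names come from the proposed table, and the format values and authors come from the minutes, but the sample report name and description are hypothetical.

```python
# A minimal sketch of one report-dictionary entry, using the four
# columns proposed in the minutes: Name, Description, Format, Author.
# The optional link is present only when the report is web viewable.

FORMATS = {
    "paper only",
    "downloadable or web viewable static (e.g., pdf)",
    "downloadable, delimited for Excel analysis",
    "datacube for OLAP interactive analysis",
}

def make_entry(name, description, fmt, author, link=None):
    """Build one dictionary-of-reports entry, validating the format."""
    if fmt not in FORMATS:
        raise ValueError(f"unknown format: {fmt}")
    return {"name": name, "description": description,
            "format": fmt, "author": author, "link": link}

# Hypothetical example entry (report name and description invented):
entry = make_entry(
    name="Fall Enrollment Summary",
    description="Headcount by school and class level, fall semesters",
    fmt="downloadable, delimited for Excel analysis",
    author="Jim Massena",
)
print(entry["author"])  # → Jim Massena
```

Keeping the allowed formats in a single set makes it easy to validate new entries as Sue Schwab and Jim Massena add reports to the web site.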

(b) Comments were made about modes of information delivery and interpretation. Some appreciate being able to talk with Jim as he delivers the information, while for others, a formal appointment makes a better setting for delving into the meaning of the reports.

(c) The Institutional Research web should contain links to US News and World Report, Peterson's Guide, and other online sources of comparative and demographic information.

Agenda Item 2 - Access Control

The issues of openness versus privacy and confidentiality were discussed. As hinted at in the agenda heading for item 1, the primary reason for placing Institutional Research information on the web is to improve access and availability, but in a way that restricts access to the internal Intranet, and blocks access from the World Wide Web. Further discussions are needed to determine what information, at the summarized level, should be considered private to the relevant chairs, managers and administrators. In this context, Pat Mutch mentioned an article she had read in Change Magazine about a data warehousing project at the University of Washington.
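The Intranet-only restriction described above could be enforced by checking each request's client address against the campus address ranges. A minimal Python sketch follows; the RFC 1918 address blocks shown are hypothetical placeholders, not Andrews University's actual network ranges.

```python
import ipaddress

# Hypothetical internal address blocks standing in for the campus
# Intranet; a real deployment would list the university's own ranges.
INTERNAL_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_internal(client_ip: str) -> bool:
    """Return True only for addresses inside the internal networks,
    i.e. requests that should be allowed to reach the IR web."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in INTERNAL_NETWORKS)

print(is_internal("10.5.3.20"))    # → True  (Intranet request)
print(is_internal("203.0.113.9"))  # → False (World Wide Web request)
```

In practice the same check is usually configured in the web server itself rather than in application code, but the logic is the same: allow the internal ranges, deny everything else.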

Agenda Item 4 - Indicators

During the meeting, several other indicators were added to the list suggested in the agenda, and pooling all of those along with lists developed in the Data Warehouse Steering Committee and other venues yields the following list (sorted alphabetically):
    1. Accreditation
    2. Alumni Satisfaction
    3. Applications, Acceptances, Registrations, Yield Rates
    4. Class Sizes
    5. Costs and sources of funds for different mixes of students
    6. Deans Statistics
    7. Demographics, external comparisons
    8. Donor Tracking/Analysis (Census Report)
    9. Faculty Load Analysis
    10. Faculty Productivity
    11. FTE Tracking (Human Resources)
    12. Grade Distributions
    13. Graduate Admissions
    14. Market Segment Analysis
    15. Program cost tracking, multiple sources of income/revenue per student
    16. Registration Analysis
    17. Report to NAD (Dallas Kindopp's comparative statistics)
    18. Research cost tracking for various kinds of research
    19. Retention Analysis
    20. Student Achievement/Outcomes
    21. Student Aid Tracking/Analysis
    22. Student Demand By Course/Section
    23. Student Satisfaction
    24. Viable Majors