Data Warehousing Steering Committee

November 18, 1998 1:30pm
AD306

Members present Rich Forrester, Lisa Jardine, Sue Schwab, Gary Dickerson, Brad Christensen, David Heise, Karen Stockton for Sharon Pittman
Absent members Emilio Garcia-Marenko, Mistee Arias, Derek Bradfield, Ann Gibson, Ron Herr, Pat Mutch, David Penner, Jack Stout, Ed Wines
Prayer Brad Christensen
Minutes Minutes from last week were distributed and David Heise recapped the last meeting. Minutes were accepted.
Review of Data Warehousing Terms Gary Dickerson distributed the Data Warehouse Definitions document.

David Heise - this document is intended to be a reference document for unfamiliar terms and it will be updated as necessary. Unless we can agree on the meaning of terms, we will have problems agreeing about the meaning of the data.

Gary Dickerson - at the last meeting, some members had questions about the term ‘business unit’. Business unit may be a term that comes into focus as we proceed, especially as we address the set of questions.

Rich Forrester - Business units may go across departmental lines in order to answer questions.

David Heise - We will have a common data warehouse that supports the separate business units’ needs.

Data Warehouse Readiness Assessment Gary Dickerson distributed the results of the Data Warehouse Readiness Assessment the committee members completed at the last meeting.

Item of note: The final score was a 71, showing that we are ready for our first increments in data warehousing. The developers of this instrument have noted that this score is typical of the scores of organizations when first tested. Scores typically go down when the assessment is completed a second time. This is due to ‘uninformed optimism’ vs. ‘informed pessimism’.

On page 2, the scores to take note of are the extremely high or extremely low scores. The IT readiness score has always been low. We may need to explain what we’re trying to do to the non-IT people and how it will benefit them.

David Heise – a question on page 2: is there a weighting factor applied to the scores? Are the differences in the column scores significant?

Karen Stockton – do they typically test top-line administrators?

Gary Dickerson – this assessment was completed by the committee members.

David Heise – what was your question related to?

Karen Stockton – I was concerned about the data entry involved.

David Heise – no data entry required. The Banner data will feed the warehouse.

Gary Dickerson – the data warehouse addresses easy access to the data for end users.

Karen Stockton – is this compatible with ASCII data exports?

David Heise – yes, it is, and it can support a number of tools.

Gary Dickerson – it depends upon the end-user tools used to access the data. It should eliminate the rework.

Karen Stockton – concern about the compatibility with SPSS and such packages.

Sue Schwab – we can extract data from Banner in any format you need. Please contact me.

David Heise – data mining tools can help with the statistical analysis. The complexity of the transaction data in Banner is simplified by organizing the data by business questions instead of transaction questions.

Gary Dickerson – this is one of the reasons we need the business users in on the planning process.

David Heise – we need people to ask the type of questions you are asking in order for this to be successful. Mark Clayton invited us to Whirlpool, and he said the key to the success of their data warehousing project was that the users drove the process and the warehouse was created to answer the users’ questions.

Page 3 didn’t print well, but the sheet details the measurement points related to each question.

Page 4 details the high responses, the average responses, and the low responses to each question in the assessment. For the large differentials we need to examine why the disparity exists so that we can come to common understanding and expectations for the project.

Rich Forrester – What was question 53?

Gary Dickerson – ‘The IT organization worked with the business units to select the warehouse development method’.

We also need to look at some scores that were very high and see why. We’re going to do more analysis to generate some items to concentrate on.

Demonstration of End-user Data Analysis Tool David Heise demonstrated PowerPlay from COGNOS.

Rich Forrester – The business units identify the questions and what they want to analyze and ITS builds the information?

David Heise – yes.

Gary Dickerson – it is important that we agree on the definitions of the terms used, e.g., how do you define a student? This allows you to compare apples to apples.

Gary Dickerson – can you set up tools like this to do standard reports?

David Heise – yes. PowerPlay has two modes, one of which supports standard reports. You can build reports and save them to run each time the data warehouse data is refreshed.

Karen Stockton – when you purchase the data warehouse, do you then decide which tools to buy?

Gary Dickerson – there are two main approaches: one is the data warehouse in a box, the other is best of breed for each tool you require. At The Data Warehouse Institute conference we attended, it was suggested that you purchase the ‘best of breed’ tools.

David Heise – explained some of the data warehouse elements and their relationship to our needs.

Karen Stockton – bottom line, how many new tools must I learn? How much does it cost me as a department?

David Heise – you would pay for the client software you needed. The cost ranges from $500/seat to $900/seat, depending upon capabilities.

Rich Forrester – by the time we implement a data warehouse, the technology may decrease in price.

David Heise – that is almost guaranteed with Microsoft’s entry into the OLAP market.

Sue Schwab – can we import historical data?

Gary Dickerson – yes.

David Heise – explained the ETL (Extract/Transform/Load) tools and how they might work. He also noted the need to bring in information from Institutional Research that isn’t stored in Banner.

Brainstorming Top-Level Questions David Heise – what are some of the important, pressing questions that we need to answer at Andrews University?
  1. Applications, Acceptances, Registrations
  2. Class Sizes
  3. Costs and sources of funds for different mixes of students
  4. Deans Statistics
  5. Donor Tracking/Analysis (Census Report)
  6. Faculty Load Analysis
  7. Faculty Productivity
  8. Market Segment Analysis
  9. Program cost tracking, multiple sources of income/revenue per student
  10. Registration Analysis
  11. Research cost tracking for various kinds of research
  12. Retention Analysis
  13. Student Achievement/Outcomes
  14. Student Aid Tracking/Analysis
  15. Viable Majors

Gary Dickerson – we will put the list out for comments and additions by committee members and update it later.

Next Meeting To be advised.
Listserv address DWSteering@andrews.edu
Web site http://www.andrews.edu/ITS/AS/dw/