Inmon and Kimball data warehouse definitions
Bill Inmon, one of the first authors on the subject of data warehousing, has defined a data warehouse in terms of the characteristics of the data repository:
The data in the database is organized so that all the data elements relating to the same real-world event or object are linked together.
The changes to the data in the database are tracked and recorded so that reports can be produced showing changes over time.
Data in the database is never overwritten or deleted - once committed, the data is static, read-only, and retained for future reporting.
The database contains data from most or all of an organization's operational systems and this data is made consistent.
Ralph Kimball, another well-known author on data warehousing, defines a data warehouse as "a copy of transaction data specifically structured for query and analysis."
Inmon and Kimball represent differing views on one aspect of data warehousing: whether a data warehouse should reside in a single logical repository, and which methodology best supports each view. Kimball, in 1997, stated that "...the data warehouse is nothing more than the union of all the data marts". He advocated a bottom-up data warehousing methodology in which individual data marts store subsets of an organization's data. The data marts could later be combined into an all-encompassing data warehouse. Inmon responded in 1998 by saying, "You can catch all the minnows in the ocean and stack them together and they still do not make a whale," indicating the opposing view that the data warehouse should be designed from the top down to include all corporate data. In this methodology, data marts are created only after the complete data warehouse has been created.
A broader definition of a data warehouse
The aforementioned definitions concentrate on data. However, the means to retrieve and analyze data, to extract, transform and load data, and to manage data are essential components of a data warehouse. Many references to a data warehouse use this broader definition.
Data warehouses versus operational systems
Through use of database normalization and an entity-relationship model, operational systems are optimized for preservation of data integrity and speed of recording of business transactions (see OLTP). Data warehouses are optimized for speed of data retrieval. Frequently, data in data warehouses are denormalized via a dimension-based model. Also, to speed data retrieval, data warehouse data are often stored multiple times - in their most granular form and in summaries called aggregates.
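As a minimal sketch of the "store data multiple times" idea (the table and column names here are invented for illustration), the following shows the same sales data held once at its most granular level and once as a precomputed monthly aggregate, so that summary reports need not re-scan every transaction row:

```python
from collections import defaultdict

# Hypothetical granular sales records, one row per business transaction,
# as they might arrive from an operational system.
sales = [
    {"date": "2007-01-05", "product": "widget", "amount": 120.0},
    {"date": "2007-01-19", "product": "widget", "amount": 80.0},
    {"date": "2007-02-02", "product": "gadget", "amount": 200.0},
]

# Precomputed aggregate: total sales per (month, product).
# A query against this table reads far fewer rows than a query
# that re-summarizes the granular data.
monthly_totals = defaultdict(float)
for row in sales:
    month = row["date"][:7]  # e.g. "2007-01"
    monthly_totals[(month, row["product"])] += row["amount"]
```

The trade-off is the one described above: faster retrieval at the cost of redundant storage and extra work at load time to keep the aggregate consistent with the granular data.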
Operational system designers generally follow the Codd rules of data normalization in order to ensure data integrity. Codd defines five increasingly stringent rules of normalization. Fully normalized database designs (that is, those satisfying all five Codd rules) often result in information from a business transaction being stored in dozens to hundreds of tables. Relational databases are efficient at managing the relationships between these tables. The databases have very fast insert/update performance because only a small amount of data in those tables is affected in each transaction processed. However, because of frequent enhancements, many operational systems evolve into a collection of cryptic names and seemingly unrelated and obscure data structures. These enhancements may improve performance but complicate use of the systems. Finally, in order to improve performance, older data are usually periodically purged from operational systems.
In the reporting and analysis done in a data warehouse, thousands to billions of transaction records may need to be read, sorted, summarized and reported on. Data warehouse data are gathered from the operational systems over extended periods of time. For example, a data warehouse may hold data from whatever point in time clean, business-relevant data became available. Data warehouse designers suggest that operational system data be restructured to facilitate reporting and analysis. Also, because of the performance impact of running reporting and analysis against a data warehouse on the same server as the operational systems, data warehouse designers recommend that data warehouses be physically separated from operational databases.
The concept of data warehousing dates back to the mid-1980s when IBM researchers Barry Devlin and Paul Murphy developed the "information warehouse". In essence, the data warehousing concept was intended to provide an architectural model for the flow of data from operational systems to decision support environments. The concept attempted to address the various problems associated with this flow - mainly, the high costs associated with it. In the absence of a data warehousing architecture, an enormous amount of redundancy of information was required to support the multiple decision support environments that usually existed. In larger corporations it was typical for multiple decision support environments to operate independently. Each environment served different users but often required much of the same data. The process of gathering, cleaning and integrating data from various sources, usually long-existing operational systems (usually referred to as legacy systems), was typically in part replicated for each environment. Moreover, the operational systems were frequently reexamined as new decision support requirements emerged. Often new requirements necessitated gathering, cleaning and integrating new data from the operational systems that were logically related to prior gathered data.
Based on analogies with real-life warehouses, data warehouses were intended as large-scale collection/storage/staging areas for corporate data. Data could be retrieved from one central point or data could be distributed to "retail stores" or "data marts" which were tailored for ready access by users.
Key developments in early years of data warehousing were:
1983 - Teradata introduces a database management system specifically designed for decision support.
1988 - Barry Devlin and Paul Murphy publish the article "An architecture for a business and information system" in the IBM Systems Journal, where they introduce the term "information warehouse".
1990 - Red Brick Systems introduces Red Brick Warehouse, a database management system specifically for data warehousing.
1991 - Prism Solutions introduces Prism Warehouse Manager, software for developing a data warehouse.
1991 - Bill Inmon publishes the book Building the Data Warehouse.
1995 - The Data Warehousing Institute, a for-profit organization that promotes data warehousing, is founded.
1996 - Ralph Kimball publishes the book The Data Warehouse Toolkit.
1997 - Oracle 8, with support for star queries, is released.
As technology improved (lowering the cost of more performance) and as user requirements increased (faster data load cycle times and more features), data warehouses evolved through the following stages:
Offline Operational Databases
Data warehouses in this initial stage are developed by simply copying the data of an operational system to another server where the processing load of reporting against the copied data does not impact the operational system's performance.
Offline Data Warehouse
Data warehouses at this stage are updated from data in the operational systems on a regular basis and the data warehouse data is stored in a data structure designed to facilitate reporting.
Real-Time Data Warehouse
Data warehouses at this stage are updated every time an operational system performs a transaction (e.g., an order, a delivery or a booking).
Integrated Data Warehouse
Data warehouses at this stage are updated every time an operational system performs a transaction. The data warehouses then generate transactions that are passed back into the operational systems.
Data warehouse architecture
There is no widespread agreement on exactly what constitutes a data warehouse architecture. Writers on the subject, though not necessarily contradicting one another, differ on how much importance they assign to the possible components of an architecture. One possible conceptualization of a data warehouse architecture consists of the following interconnected layers:
Operational database layer
The source data for the data warehouse
Informational access layer
The data accessed for reporting and analyzing and the tools for reporting and analyzing data
Data access layer
The interface between the operational database layer and the informational access layer
Metadata layer
The data directory (which is often much more detailed than an operational system data directory)
Normalized versus dimensional approach to storage of data
There are two leading approaches to storing data in a data warehouse - the dimensional approach and the normalized approach.
In the dimensional approach, transaction data are partitioned into "facts", which are generally numeric transaction data, and "dimensions", which are the reference information that gives context to the facts. For example, a sales transaction can be broken up into facts such as the number of products ordered and the price paid for the products, and into dimensions such as order date, customer name, product number, order ship-to and bill-to locations, and salesperson responsible for receiving the order. A main advantage of a dimensional approach is that the data warehouse is easier for the user to understand and to use. Also, the retrieval of data from the data warehouse tends to operate very quickly. The main disadvantages of the dimensional approach are that, in order to maintain the integrity of facts and dimensions, loading the data warehouse with data from different operational systems is complicated, and that it is difficult to modify the data warehouse structure if the organization adopting the dimensional approach changes the way in which it does business.
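The fact/dimension split described above can be sketched as follows (all table and column names are invented for illustration): numeric transaction data sit in a fact table, and the reference data giving them context sit in dimension tables keyed by identifiers.

```python
# Hypothetical dimension tables: reference data that gives context to facts.
customers = {1: {"name": "Acme Corp", "city": "Boston"}}
products = {10: {"name": "Widget", "category": "Hardware"}}

# Hypothetical fact table: numeric transaction data keyed by dimension ids.
sales_facts = [
    {"customer_id": 1, "product_id": 10, "order_date": "2007-03-01",
     "quantity": 5, "price_paid": 49.95},
]

# A typical dimensional query: resolve each fact's dimension keys
# to readable context with simple lookups.
rows = []
for fact in sales_facts:
    customer = customers[fact["customer_id"]]
    product = products[fact["product_id"]]
    rows.append((customer["name"], product["name"],
                 fact["quantity"], fact["price_paid"]))
```

Because every fact resolves to context in one lookup per dimension, queries stay simple and fast, which is the ease-of-use advantage noted above.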
In the normalized approach, the data in the data warehouse are stored following, to a degree, the Codd normalization rules. Tables are grouped together by subject areas that reflect general data categories (e.g., data on customers, products, finance, etc.). The main advantage of this approach is that it is straightforward to add information into the database. A disadvantage of this approach is that, because of the number of tables involved, it can be difficult for users both to join data from different sources into meaningful information and to access the information without a precise understanding of the sources of data and of the data structure of the data warehouse.
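To illustrate the join burden of a normalized design, here is a hypothetical sketch (all table, column and key names are invented) in which the same sales information is spread across four tables, so that even a simple report requires knowing which keys relate which tables:

```python
# Hypothetical normalized tables: each subject area in its own table.
customers = [{"customer_id": 1, "name": "Acme Corp"}]
orders = [{"order_id": 100, "customer_id": 1, "order_date": "2007-03-01"}]
order_lines = [{"order_id": 100, "product_id": 10, "quantity": 5}]
products = [{"product_id": 10, "name": "Widget"}]

# Even this simple report needs three joins, and the analyst must know
# the key relationships between the tables to produce it.
report = []
for line in order_lines:
    order = next(o for o in orders if o["order_id"] == line["order_id"])
    customer = next(c for c in customers
                    if c["customer_id"] == order["customer_id"])
    product = next(p for p in products
                   if p["product_id"] == line["product_id"])
    report.append((customer["name"], product["name"], line["quantity"]))
```

Adding a new subject area is easy (just add a table), but each new table adds another join that report writers must understand, which is the trade-off described above.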
These approaches are not exact opposites of each other. Dimensional approaches can involve normalizing data to a degree.
Another important decision in designing a data warehouse is which data to conform and how to conform the data. For example, one operational system feeding data into the data warehouse may use "M" and "F" to denote sex of an employee while another operational system may use "Male" and "Female". Though this is a simple example, much of the work in implementing a data warehouse is devoted to making similar meaning data consistent when they are stored in the data warehouse. Typically, extract, transform, load tools are used in this work.
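The sex-code example above is the kind of mapping an extract, transform, load step performs. A minimal sketch (the field names and mapping table are hypothetical):

```python
# Hypothetical conforming map: both source systems' codes map to the
# single representation chosen for the data warehouse.
CONFORM_SEX = {"M": "Male", "F": "Female", "Male": "Male", "Female": "Female"}

def conform(row):
    """Return a copy of an employee row with its sex code conformed."""
    row = dict(row)
    row["sex"] = CONFORM_SEX[row["sex"]]
    return row

# One source system uses "M"/"F"; the other uses "Male"/"Female".
system_a = [{"employee_id": 1, "sex": "M"}]
system_b = [{"employee_id": 2, "sex": "Female"}]

# Transform step: all rows reach the warehouse with consistent values.
warehouse_rows = [conform(r) for r in system_a + system_b]
```

Real ETL tools apply many such mappings per load, but the principle is the same: similar-meaning data from different sources are made consistent before they are stored.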
Advantages of data warehouses
There are many advantages to using a data warehouse. Some of them are:
Data warehouses make it easier for end users to access a variety of data.
Data warehouses facilitate decision support system applications such as trend reports (e.g., the items with the most sales in a particular area within the last two years) and exception reports (reports that show actual performance versus goals).
Data warehouses can work in conjunction with and, hence, enhance the value of operational business applications, notably customer relationship management (CRM) systems.
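The trend and exception reports mentioned above can be sketched in a few lines (the item names, figures and goals are invented for illustration):

```python
# Hypothetical yearly sales figures and goals for a few items.
sales_by_year = {
    "widget": {2006: 900, 2007: 1100},
    "gadget": {2006: 400, 2007: 350},
}
goals = {"widget": 1000, "gadget": 500}

# Trend report: items ranked by total sales over the last two years.
trend = sorted(sales_by_year,
               key=lambda item: sum(sales_by_year[item].values()),
               reverse=True)

# Exception report: items whose latest-year sales fell short of goal.
exceptions = [item for item, goal in goals.items()
              if sales_by_year[item][2007] < goal]
```

In a real warehouse the same logic would run as queries over fact and dimension tables, but the shape of the reports is the same.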
Disadvantages of data warehouses
There are also disadvantages to using a data warehouse. Some of them are:
Over their life, data warehouses can have high costs. The data warehouse is usually not static, and maintenance costs are high.
Data warehouses can get outdated relatively quickly. There is a cost of delivering suboptimal information to the organization.
There is often a fine line between data warehouses and operational systems. Duplicate, expensive functionality may be developed. Or, functionality may be developed in the data warehouse that, in retrospect, should have been developed in the operational systems and vice versa.
The future of data warehousing
Data warehousing, like any technology niche, has a history of innovations that did not receive market acceptance.
A 2007 Gartner Group paper predicted the following technologies could be disruptive to the business intelligence market:
- Service Oriented Architecture
- Search capabilities integrated into reporting and analysis technology
- Software as a Service
- Analytic tools that work in memory
Another prediction is that database performance will continue to be improved by the use of data warehouse appliances, many of which incorporate the developments in the aforementioned Gartner Group paper.
Finally, management consultant Thomas Davenport, among others, predicts that more organizations will seek to differentiate themselves by using analytics enabled by data warehouses.