Traditional Business Intelligence Platforms
The traditional Business Intelligence platforms of the past two decades have chiefly succeeded in
providing users with comprehensive historical reporting and user-friendly ad-hoc analysis tools. While these platforms have delivered tremendous value for historical reporting, more users now require analysis techniques that give them direct access to data without relying on IT specialists.
Federal agencies in the analytics space have highlighted the following challenges associated with traditional BI solutions:
Modern Business Intelligence Platform
While traditional BI platforms provide analyses that answer the question “What happened?” from a
historical perspective, modern platforms can answer “What is happening, what will happen, and why?”,
offering the ability not only to monitor a continuous pulse of the organization through rapid
analytics, but also to accomplish mission objectives through predictive analytics.
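To make the contrast concrete, the minimal sketch below compares a descriptive summary (“What happened?”) with a simple forecast (“What will happen?”). It is an illustrative assumption, not a prescribed method: the monthly case counts are invented and the linear trend model stands in for whatever predictive technique an organization would actually use.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical monthly case counts for an agency program (illustrative data only).
months = np.arange(1, 13).reshape(-1, 1)          # Jan..Dec
cases = np.array([120, 132, 128, 140, 151, 149,
                  160, 158, 171, 175, 183, 190])

# Descriptive ("What happened?"): summarize the historical record.
print("Total cases last year:", cases.sum())
print("Average cases per month:", round(cases.mean(), 1))

# Predictive ("What will happen?"): fit a simple trend and project forward.
model = LinearRegression().fit(months, cases)
next_quarter = np.arange(13, 16).reshape(-1, 1)    # next three months
forecast = model.predict(next_quarter)
print("Forecast for next quarter:", np.round(forecast, 1))
```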
Integrating Traditional and Modern BI Platforms
Data platform changes are necessary to lay the foundation for an enterprise-wide data
transformation, yet organizations are rightfully wary of scrapping their entire IT architecture and
starting fresh. Data warehouses continue to play a key role in existing data platforms, providing
the thoroughly cleansed, organized, and governed data that most businesses need.
The data warehouse allows business executives and others without deep technical knowledge to gain insights from historical data with relative ease. Warehouse data is
highly accurate because IT specialists scrub it, test it rigorously, and know its layers
thoroughly. However, the challenges associated with traditional BI are creating demand to
augment the enterprise data warehouse (EDW) with another form of architecture optimized for quick access to ever-changing
data: the Hadoop Data Lake.
Organizations looking to modernize their analytics platforms have started to adopt the concept of
data lakes. Data lakes store information in its raw, unfiltered form, whether structured, semi-structured, or unstructured. Unlike the stand-alone EDW, data lakes perform very little automated cleansing and transformation of data, which allows data to be ingested far more efficiently but shifts most of the responsibility for data preparation and analysis to business users.
Built on the Hadoop Distributed File System (HDFS), data lakes offer a low-cost solution for efficiently
storing and analyzing many types of data in their native form. A data lake solution coupled with a
data warehouse defines the next generation of BI and offers an optimal foundation for data
analysis, as shown in Figure 2.
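To make the combined architecture more concrete, the following sketch blends curated warehouse data (read over JDBC) with raw event data from an HDFS-backed lake. The JDBC URL, table, column names, and credentials are assumptions for illustration and would differ for any real EDW.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("warehouse-plus-lake-sketch").getOrCreate()

# Curated, governed data from the warehouse (hypothetical JDBC connection and table).
customers = (spark.read.format("jdbc")
             .option("url", "jdbc:postgresql://edw.example.gov:5432/warehouse")
             .option("dbtable", "dim_customer")
             .option("user", "report_user")
             .option("password", "***")
             .load())

# Raw, fast-changing data from the Hadoop data lake (hypothetical path and fields).
events = spark.read.json("hdfs:///data-lake/raw/events/")

# Blend the two: governed attributes from the EDW enrich raw lake events.
activity = (events.join(customers, on="customer_id", how="left")
            .groupBy("customer_segment")
            .agg(F.count("*").alias("event_count")))

activity.show()
```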
