Telephony Douglas Hackney Applied Business Intelligence

Publication: Telephony
Author: Douglas Hackney
Headline: Applied Business Intelligence
[callout: no organization can afford to deny itself the power of business intelligence ]
[begin copy]
1 Business Intelligence
1.1 Definition
Business Intelligence (BI) is a market that will be worth $148 billion by 2003, according to
Survey.com. It is growing at exponential rates; is mission critical to every business, regardless of
size; has every major technology company as a player; features e-everything delivery and
interaction; offers exponentially more capability at orders of magnitude lower prices than just a
year ago; is commercializing top-secret technology from formerly off-limits US government
programs; has some of the hottest initial public offerings (IPO) and best performing companies of
the last two years; and has nowhere to go but up. BI consists of all activities related to organizing
and delivering information and analysis to the business. This includes data mining, knowledge
management, analytical applications, reporting systems, data warehouses, etc. The BI space is
an exciting place to be today, but only if you leverage it to provide high-impact solutions that solve
specific business problems.
2 Architectures
2.1 Monolithic
The primary components in a BI infrastructure are data warehouse (DW) systems. These
systems combine and integrate data from a wide variety of operational systems. They cleanse the
data to remove errors; they standardize the data so that key entities (such as product and
customer) and metrics / measures (such as revenue and net profit) are consistent across the
system; and they integrate the data so that information from different systems (such as
accounting data and marketing data) can be combined to yield very high impact information and
analysis (such as lifetime value of a customer or profit by product).
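The cleanse, standardize and integrate steps described above can be sketched in a few lines of Python. All of the source records, field names and mapping logic here are hypothetical examples, not any particular vendor's ETL implementation:

```python
# Sketch of the cleanse -> standardize -> integrate pipeline described above.
# All records and field names are hypothetical examples.

# Raw extracts from two operational systems with inconsistent conventions.
accounting_rows = [
    {"cust": "ACME Corp ", "prod_code": "WID-1", "revenue_usd": "1200.50"},
    {"cust": "acme corp",  "prod_code": "WID-1", "revenue_usd": "800.00"},
]
marketing_rows = [
    {"customer_name": "Acme Corp", "product": "WID-1", "campaign_cost": "300.00"},
]

def cleanse_name(name: str) -> str:
    """Remove whitespace errors and standardize casing on the customer key."""
    return " ".join(name.split()).title()

# Standardize both feeds onto one schema, then integrate on the shared keys.
facts = {}
for row in accounting_rows:
    key = (cleanse_name(row["cust"]), row["prod_code"])
    facts.setdefault(key, {"revenue": 0.0, "cost": 0.0})
    facts[key]["revenue"] += float(row["revenue_usd"])
for row in marketing_rows:
    key = (cleanse_name(row["customer_name"]), row["product"])
    facts.setdefault(key, {"revenue": 0.0, "cost": 0.0})
    facts[key]["cost"] += float(row["campaign_cost"])

# High-impact derived metric: profit by customer/product across both systems.
for (customer, product), metrics in facts.items():
    metrics["profit"] = metrics["revenue"] - metrics["cost"]
```

Once the accounting and marketing feeds share one customer key, a cross-system metric such as profit by customer becomes a simple calculation rather than a reconciliation project.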
In the early to mid '90s, many organizations attempted to build their BI infrastructure data
warehouse elements in a "top-down" monolithic fashion (see figure one). This approach
attempted to model the enterprise, then incrementally build a central mega-data warehouse
resource. This has become known as the "dream of homogeneity" as it assumes and demands a
consistency of systems, data and architectures that is inconsistent with the heterogeneous nature
of a business environment. These large scale enterprise-class projects had trouble delivering
value to the business, with studies showing failure rates from 30% (Meta Group) to 80% (DWN
and OTR). Enterprise data warehouses aren't the only large-scale projects having trouble. A
recent Boston Consulting Group (www.BCG.com) study showed a 70% failure rate in large-scale enterprise projects involving ERP, CRM and similar systems. These high failure rates led to the
development of an alternative approach to achieve the goal of the enterprise data warehouse
called "bottom up." It involves the creation of a series of highly targeted, architected data marts
that are integrated into the resulting data warehouse system. This approach has proven to be
very popular and effective. The BCG study found that small, targeted solutions are five times
more likely to be rated as a success by the business.
It is surprising that, in the face of these statistics, some old-think DW guru adherents remain fixated on the idea that the enterprise, top-down, monolithic DW approach is the only viable
way to achieve the goal of an integrated information resource. This group is blind to five key
factors: 1) both methods, top-down and bottom-up, are viable given a suitable political and
cultural environment; 2) top-down monolithic approaches are sure death in organizations that lack
the senior level support, long-term sustainable political will, and political and communication skills
required to be a success; 3) top-down monolithic approaches are incapable of accommodating
today's heterogeneous mix of custom DW/data marts (DM), turn-key, packaged DW/DMs, data
mining and analytical applications (see figure two); 4) technological considerations such as
architectures, approaches, tools, technologies, etc. are meaningless to the business - it is fast,
measurable high-impact on the business that counts; and 5) the business makes the rules, not
the technologists.
2.2 Federated BI Architecture
The current BI market is built on the foundation of a modern BI infrastructure, consisting of a
federated BI architecture accommodating all the components of a contemporary BI system:
packaged/turnkey data warehouses (DW) and data marts (DM), packaged/turnkey analytical
applications (AA), custom built DWs and DMs, custom built AAs, data mining, online analytical
processing (OLAP) tools, query and reporting (Q&R) tools, production reporting tools, data quality
tools, extraction transformation and load (ETL) tools, system management tools, information
delivery tools, enterprise information portals, reporting systems, knowledge management
systems, database systems, etc. The federated BI architecture is the "big tent" that provides the
foundation and environment to facilitate and enable business information flow, analysis and
decision making.
As the internet is a network of networks, a federated DW architecture is an architecture of
architectures (see figure three). It provides a framework for the integration, to the greatest extent
possible, of disparate DW, DM and analytical application systems. A federated DW architecture is
the most pragmatic route to provide the maximum amount of architecture possible given the
political and implementation realities of real-world sites.
A federated DW architecture shares as much core information among the various systems as
possible. This is accomplished by sharing critical "master files" or dimensions, common metrics
and measures and other high impact data across all systems that can make use of the
information. It is usually accomplished via an enterprise class ETL tool, which provides a common
meta data repository, and the use of common data staging areas.
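The dimension-sharing idea can be illustrated with a minimal sketch: one conformed "master file" dimension is maintained in a common staging area and published to every mart that can use it. The mart names, keys and dimension contents below are hypothetical:

```python
# Sketch of sharing a conformed dimension across federated marts.
# Mart names and dimension contents are hypothetical examples.

# The "master file" customer dimension, maintained once in the staging area.
staging_area = {
    "dim_customer": {
        101: {"name": "Acme Corp", "segment": "Enterprise"},
        102: {"name": "Beta LLC",  "segment": "SMB"},
    }
}

def publish_dimension(staging, dim_name, marts):
    """Copy one conformed dimension into every mart that can use it,
    so all systems report against identical keys and attributes."""
    for mart in marts:
        mart[dim_name] = dict(staging[dim_name])

billing_mart, churn_mart = {}, {}
publish_dimension(staging_area, "dim_customer", [billing_mart, churn_mart])
```

Because both marts now resolve customer 101 to the same standardized record, reports drawn from either system agree on who the customer is, which is the practical payoff of the federated approach.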
2.3 Sample Telecommunications Architecture
A packet based telecommunications company has high demands for BI, and often has a business
model based on core BI functionality, such as bandwidth/utilization based billing, real time
configuration, etc. To meet these needs, a federated BI architecture is required to
accommodate the heterogeneous BI requirements inherent in providing the near-real time
analysis demanded by the networking organization along with service / support team requirements
and the billing, utilization and analysis needs (see figure four).
Telephony and packet BI systems face special challenges in the areas of data volume and real-
time data streams. While typical data warehouse systems are considered large if they contain a
terabyte of data, a packet system can easily contain ten terabytes or more. To provide support for
provisioning, support and dynamic billing the system must also manipulate very large volumes of
data in near-real time. The data must be gathered from a worldwide network of devices,
cleansed, integrated and aggregated within minutes. These requirements are well beyond the
run-of-the-mill architectures, ETL tools and server systems found in everyday data
warehouse systems and require special expertise, experience, techniques and technologies to be
successful.
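The gather-and-aggregate-within-minutes requirement can be sketched as a windowed rollup of raw usage records. The device IDs, timestamps and byte counts below are hypothetical, and a real packet system would run this continuously over a stream rather than a list:

```python
# Sketch of near-real-time aggregation of packet usage records into
# billing-grade buckets. All device IDs and figures are hypothetical.
from collections import defaultdict

WINDOW_SECONDS = 300  # aggregate into five-minute buckets

# Raw usage records from a worldwide network of devices:
# (epoch_seconds, device_id, bytes_transferred)
records = [
    (1000, "router-eu-1", 5_000),
    (1100, "router-eu-1", 7_000),
    (1400, "router-us-3", 2_000),
]

def aggregate(stream):
    """Roll raw records up to per-device, per-window byte totals --
    the granularity needed for utilization-based billing."""
    totals = defaultdict(int)
    for ts, device, nbytes in stream:
        window = ts - (ts % WINDOW_SECONDS)  # start time of the bucket
        totals[(device, window)] += nbytes
    return dict(totals)

usage = aggregate(records)
```

The hard part at ten-terabyte scale is not the arithmetic but doing it continuously, worldwide, within minutes, which is what pushes these systems beyond everyday warehouse infrastructure.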
3 Solutions
No BI system, regardless of its technical elegance or purity of design vision, has a prayer of
survival if it does not provide direct business value and solve a specific business problem. The
most popular ways to achieve this goal are via analytical applications and data mining.
3.1 Analytical Applications
The most popular form of BI utilization from the business perspective is via packaged, turn-key
analytical applications. A true, high-business-impact, analytical application is defined by the
following characteristics:
1. Architected, integrated data from multiple sources (internal & external)
An analytical application includes (or, at a minimum, can include) information from multiple
sources, both native OLTP applications, as in the case of an analytical application offered by an
ERP vendor, and external information from heterogeneous OLTP systems or 3rd party vendors.
Note that many ERP vendor supplied analytical application offerings have no capability to
capture, leverage or utilize external data of any kind. This shortcoming cannot be overly
emphasized as you consider the implications of an environment made up of disparate, non-
architected analytical applications, each with its own semantics, business rules, etc.
2. Flexible, multi-dimensional analysis, drill (up, down, across) and reporting
Analytical applications allow business users a flexible environment to view business metrics and
measures by any number of pertinent dimensions, with any required number of members.
Analytical applications allow seamless drill through into pertinent detailed transactions and
flexible and easy movement across dimensions and measures. They also provide the capability
to view and report information in all forms required by the applicable business processes, i.e.
detailed lists as well as summary cross tab.
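The "drill up, down, across" behavior can be sketched as a rollup over a small fact set: summarizing by fewer dimensions drills up, by more dimensions drills down. The dimensions (region, product) and revenue figures here are hypothetical:

```python
# Sketch of multi-dimensional rollup and drill over a tiny fact set.
# Dimension names and values are hypothetical examples.
from collections import defaultdict

facts = [
    {"region": "East", "product": "DSL",   "revenue": 100},
    {"region": "East", "product": "Fiber", "revenue": 250},
    {"region": "West", "product": "DSL",   "revenue": 150},
]

def rollup(rows, *dims):
    """Summarize revenue by any combination of dimensions: passing
    fewer dimensions 'drills up', passing more 'drills down'."""
    totals = defaultdict(int)
    for row in rows:
        key = tuple(row[d] for d in dims)
        totals[key] += row["revenue"]
    return dict(totals)

by_region = rollup(facts, "region")                     # summary view
by_region_product = rollup(facts, "region", "product")  # drill down
```

A production OLAP engine pre-computes and indexes these aggregations, but the user-facing behavior is the same pivot between summary and detail.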
3. Turnkey package / short time to market
Analytical applications feature rapid deployment, with easy data extraction and/or integration into
OLTP packages and data sets; indigenous OLAP or native support for industry standard OLAP
engines; pre-formatted, pre-defined relevant business metrics, measures, Key Performance
Indicators (KPI), etc.; and implementation ready agents, reports, and aggregations.
4. Integrated business processes
Analytical applications provide domain specific solutions to specific business challenges,
including internal representations of relevant business processes. Analytical applications provide an environment for interacting with the business process: they present applicable metrics and measures, and allow users to inspect and alter process values and measures.
5. Self measuring (internally monitored ROI, etc.)
Analytical applications provide internal value measurement of the relevant business processes
and of the analytical application itself. They monitor the ongoing utilization of the analytical
application and its effects on the business process, and in doing so provide ongoing ROI
analysis of both the business process and the analytical application. They also track the
propagation of the tool throughout the organization, the relative sophistication of its usage,
optimization of the system and identification of best practices regarding its use.
6. Closed loop system
An analytical application provides a closed loop, feeding new inputs back into the host OLTP or
data warehouse / data mart system. As the users interact with the business process, they
introduce new information or alter existing information, as in a budgeting and forecasting system.
These new values are then fed back into the source systems as new or modified information for
use by all users of the source system and all downstream BI systems. Note that this new or
altered information must flow back into the analytical application in real time or near real time.
This places extraordinary challenges on the technical infrastructure of data warehouse and data
mart systems more accustomed to relatively leisurely monthly, weekly or daily information
refreshes. It also places heavy demands for massive re-calculation and reallocation of data, as in
budget vs. actual calculations or performance against plan. An even greater challenge is that
these write-back, flow-through prerequisites require a level of process rigor and structure that is
diametrically opposed to the free-form flexibility required of a successful BI system. This is a key
technological and cultural hurdle that many teams cannot overcome.
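The closed loop described above, using budgeting as the example, can be sketched as follows. The system names, quarters and budget figures are hypothetical, and a real implementation would trigger large-scale recalculation rather than a single variance:

```python
# Sketch of closed-loop write-back: a budget revised in the analytical
# application flows back to the source system. All figures are hypothetical.

source_system = {"budget": {"Q1": 1000.0}}   # host OLTP system of record
analytic_app  = {"budget": {"Q1": 1000.0},   # analytical application copy
                 "actual": {"Q1": 1150.0}}

def write_back(quarter, new_budget):
    """A user revises the budget in the analytical application; the change
    flows back to the source system, and variance is recalculated so all
    downstream BI systems see the new figure."""
    analytic_app["budget"][quarter] = new_budget
    source_system["budget"][quarter] = new_budget  # close the loop
    # A real system would now re-run budget-vs-actual across the model;
    # here we recompute the one affected variance.
    return analytic_app["actual"][quarter] - new_budget

variance = write_back("Q1", 1200.0)
```

The difficulty in practice is not this data movement but doing it in near real time with full process rigor, which is exactly the hurdle the paragraph above describes.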
3.2 Data Mining
Data mining solutions are a key weapon in the BI arsenal. They are used to reveal trends and
relationships, and predict future outcomes. They are built on variations of artificial intelligence
such as neural networks, machine learning and genetic algorithms. Data mining tools are a
powerful technological and competitive weapon and form the underpinnings of powerful product
offerings, and infrastructure and support capabilities for packet companies.
Most organizations use data mining tools for the discovery of previously unknown relationships,
trends and anomalies, as well as to predict future outcomes. On the customer side of the house
these capabilities are used for target marketing, churn management, fraud detection and
promotion management. Packet content BI systems can also use data mining tools to track, trend
and predict network volumes, spot significant outlier behavior, optimize system configuration and
performance, and optimize the structure and design of customer offerings.
4 Conclusion
A federated BI system is a prerequisite to survive and thrive in today's fast changing and evolving
market. Without the capabilities provided by integrated data, powerful analytical tools and
insightful data mining applications companies are at a tremendous disadvantage and find
themselves unable to compete with their better informed and capable competitors. With the
players, the customers and the fundamental possibilities of the market changing daily, no
organization can afford to deny itself the power of business intelligence.
[end copy]
Enterprise Group, Ltd. www.egltd.com [email protected]
“Enterprise Group, Ltd.” is a servicemark and should be treated as such.
“We build business intelligence” is a servicemark of Enterprise Group, Ltd. and should be treated
as such.
Other company and product names may be trademarked, servicemarked or registered, and
should be treated as such.
Copyright ©2000, Enterprise Group, Ltd. All rights reserved.
