IN-MEMORY TECHNOLOGY AND THE AGILITY OF
BUSINESS INTELLIGENCE – A CASE STUDY AT A GERMAN
SPORTSWEAR COMPANY
Tobias Knabke, Mercator School of Management, University of Duisburg-Essen, Duisburg,
Germany, [email protected]
Sebastian Olbrich, Mercator School of Management, University of Duisburg-Essen, Duisburg,
Germany, [email protected]
Abstract
The retail industry has changed significantly in recent years due to altered customer shopping behavior and technological advancements. This forces organizations to adapt quickly to dynamically evolving circumstances. Most major organizations utilize business intelligence (BI) to support their corporate strategies; therefore, the adaptability of BI has gained importance in both research and industry practice over the last years. Agility is particularly challenging in the domain of BI since the underlying architecture of enterprise-wide decision support with data warehouse (DWH)-based BI is built not upon agility, but upon reliability and robustness. Although the usage of agile project approaches like Scrum has been explored, there is still a lack of research investigating further effects on BI agility. Hence, we analyzed whether the characteristics of DWH and BI impact the agility of BI in an in-depth case study at a globally operating German sportswear designer and manufacturer. In particular, we want to identify whether a technology like in-memory (IM) can help to achieve more BI agility. The findings indicate that IM technology acts as a technology enabler for agile BI: the impact of some DWH characteristics on BI agility is significantly positively influenced if IM technology is used.
Keywords: Business Intelligence, Agility, In-Memory Database, Case Study.

1 INTRODUCTION AND MOTIVATION
The empowerment of the consumer through emerging technologies has altered the retail industry dramatically in recent years. For instance, customers use smartphones to compare prices while shopping in stores or base their buying decisions on feedback via social media platforms. Additionally, an ever-growing list of online retailers offers products at competitive prices delivered directly to customers (MacKenzie et al. 2013). Market research (Davison and Burt 2014; MacKenzie et al. 2013; Mulpuru et al. 2014) indicates that these business and technology trends completely change retail as we know it today. Some of these industry observers even predict more changes in the next few years than there were over the past century. Thus, retail organizations need to adapt quickly to this dynamically changing environment in order to stay successful. Managers and decision makers of retail companies must be able to promptly answer questions like where to sell products (large or small stores and/or online) or how to react to competitors' pricing. They need to operate and adapt their multichannel strategies flexibly and frequently gain insights from huge amounts of customer and social media data.
Hence, fact-based decision making on a broad and reliable data basis, combined with being prepared for multiple scenarios, is one of the most obvious approaches. Decision making, as well as the execution of business processes, is usually supported by information systems (IS). Business intelligence (BI), as a distinct class of dispositive (analytical) IS, is used as an instrument to understand and gain insights from internal and external information, and was among the top priorities of CIOs in 2014 (Schulte et al. 2013). Primarily utilized to reflect operational performance (reporting-centric), organizations increasingly use BI to actively steer the future. Therefore, quick adaptation is crucial to support timely decision making for retailers as described above. But achieving agility is particularly challenging in the domain of BI (Moss 2009). The vision for BI has traditionally been a single, central repository of data that supports operational and analytical functions for the entire organization, so the tasks of reporting and consolidation typically have rigid requirements in terms of robustness, reliability, and non-volatility of the data provided by the system (Inmon 1996). On the other hand, BI needs to adjust to changing situations and must collect an enormous amount of data about the surrounding environment (Chen et al. 2012; Gartner 2011; Redman 2008). Many organizations utilize a data warehouse (DWH) as the basic concept for BI. As a DWH is rather static by design, the question remains how BI can be adapted faster
and therefore behave in a more agile way. Although the usage of agile project approaches like Scrum
(Schwaber 1997) has been explored, there is still a lack of research investigating further effects on BI
agility. Such effects may be achieved by different architectural approaches (Caruso 2011), adequate
organizational structures and processes (Zimmer et al. 2012) or technologies such as in-memory (IM)
(Evelson 2011). Current research activities identified positive impacts of in-memory databases
(IMDB) on BI (Knabke et al. 2014; Knabke and Olbrich 2011; Plattner 2009; Plattner and Zeier 2011).
But, the impact of IMDB on the agility of BI has not been sufficiently investigated and mostly
promoted by software vendors. Thus, the aim of this paper is to investigate if and how the usage of
IMDB affects the adaptability of BI. Therefore, we conducted a case study at a globally operating
German sportswear designer and manufacturer who implemented an architectural switch from disk-
based databases (DRDB) to IMDB. To accompany that project, we address the following research
questions:
- Do requirements of BI agility negatively interact with the common DWH-based BI approach?
- Does the usage of IM technology affect the agility of BI at the surveyed organization?
To achieve a common theoretical foundation we first give a background on DWH-based BI and IMDB
and highlight the value of agility in the context of BI. Afterwards, we introduce our research approach
and the case study setting. The fourth section explains the data collection process in detail. Next, we
present the results of our study and their interpretation before considering its limitations. In the last
section, we describe our contribution as well as an outlook to future research opportunities.
2 THEORETICAL FOUNDATION
2.1 Data Warehouse-based BI and In-Memory Databases
BI can be defined as “a broad category of applications, technologies, and processes for gathering,
storing, accessing, and analyzing data to help business users make better decisions” (Watson 2009). It
is an umbrella term for systems and processes that turn raw data into useful information (Chen and
Siau 2012; Wixom and Watson 2010). Most multidimensional BI systems, particularly in
organizations with several source systems, utilize the DWH approach to systematically extract,
harmonize and provide data to reflect the organization’s single point of truth (Kimball and Ross 2002;
Rifaie et al. 2008; Watson 2009; Watson and Wixom 2007). A DWH is built to fulfill fundamental
requirements (Inmon 1996), i.e. integration, subject-orientation, time-variance and non-volatility. It
usually consists of several layers that physically store data if based on DRDB. Data is extracted from
source systems, transformed and loaded into the DWH. This process is called ETL-process (extract,
transform and load). The data is further cleansed, harmonized and consolidated inside the DWH as the single source of truth of an enterprise. To meet application-specific requirements, the "general" data can be enriched with business logic before being made available for analysis and reporting. In DWHs based on DRDB, data is usually aggregated to meet performance and response time requirements during analysis operations (Knabke and Olbrich 2011). In addition, many BI tools use a de-normalized approach (e.g. the star schema) (Kimball 1996), which allows for efficient read operations on big data volumes.
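To illustrate the de-normalized star schema mentioned above, the following minimal sketch builds a central fact table referencing small dimension tables and runs a typical OLAP-style aggregation. All table and column names are hypothetical; SQLite merely stands in for a reporting database.

```python
import sqlite3

# In-memory SQLite database standing in for a reporting schema (illustration only).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_store   (store_id INTEGER PRIMARY KEY, country TEXT);
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    -- Central fact table: one row per sale, foreign keys into the dimensions.
    CREATE TABLE fact_sales  (store_id INTEGER, product_id INTEGER, revenue REAL);
""")
con.executemany("INSERT INTO dim_store VALUES (?, ?)", [(1, "DE"), (2, "US")])
con.executemany("INSERT INTO dim_product VALUES (?, ?)", [(10, "Shoes"), (11, "Apparel")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 99.0), (1, 11, 49.0), (2, 10, 120.0)])

# A typical read-mostly query: join facts to dimensions and aggregate.
rows = con.execute("""
    SELECT s.country, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_store s   ON f.store_id = s.store_id
    JOIN dim_product p ON f.product_id = p.product_id
    GROUP BY s.country, p.category
""").fetchall()
print(rows)
```

The few wide joins and the absence of further normalization are what make such schemas efficient for large scan-and-aggregate reads.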
Although data can be cached in the main memory of a DRDB system, it needs to be processed and stored in several layers, and the primary storage location remains a magnetic hard disk. An IMDB, in contrast, keeps its data permanently in the main memory of the underlying hardware. Main memory is directly accessible by the CPU(s) and access is orders of magnitude faster (Garcia-Molina and Salem 1992). Due to recent price reductions for main memory and the usage of dedicated compression techniques, it is now possible to hold even the entire data of large companies in memory (Plattner and Zeier 2011). IMDB-based BI infrastructures use column-oriented data storage to optimally support online analytical processing (OLAP) applications like BI. Column-oriented storage also allows for better suited compression techniques and yields huge performance gains – up to a factor of 1000 with real-life data (Plattner 2009).
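A toy sketch of why column-oriented storage compresses well: storing each attribute contiguously lets simple schemes such as run-length encoding collapse the long runs typical of low-cardinality columns, and an aggregation touches only the one column it needs. The data below is invented for illustration.

```python
from itertools import groupby

# Row-oriented layout: each record stored together (as a DRDB page would).
rows = [("2014-01-01", "DE", 99.0),
        ("2014-01-01", "DE", 49.0),
        ("2014-01-01", "US", 120.0),
        ("2014-01-02", "US", 75.0)]

# Column-oriented layout: each attribute stored contiguously.
dates, countries, revenues = map(list, zip(*rows))

def run_length_encode(column):
    """Collapse runs of equal values -- effective on low-cardinality columns."""
    return [(value, len(list(run))) for value, run in groupby(column)]

print(run_length_encode(dates))      # [('2014-01-01', 3), ('2014-01-02', 1)]
print(run_length_encode(countries))  # [('DE', 2), ('US', 2)]

# An aggregation scans only the one column it needs:
print(sum(revenues))                 # 343.0
```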
2.2 Agility and its Value for BI
The idea of organizational agility has been established in practice and discussed in literature for
decades and is not limited to IS. It originated from the field of manufacturing (Pankaj et al. 2009; van
Oosterhout et al. 2007) and has also been used for several years in different management areas, such
as corporate performance management or supply chain management. In business literature, it drew
mainstream attention through the work of Goldman et al. (1991) with regard to “agile manufacturing”.
Nevertheless, the definition of agility is ambiguous in scientific literature and industry (McCoy and Plummer 2006; van Oosterhout et al. 2007). While the term "agile" is described as the ability "to move
quickly and easily” by Oxford Advanced Learner’s Dictionary (Hornby and Cowie 1989, p. 23),
researchers have provided a wide range of definitions, often with deficiencies in the academic
approach to arrive at these definitions (Pankaj et al. 2009). In contrast, Conboy and Fitzgerald
(Conboy and Fitzgerald 2004b) conducted a cross-discipline literature review to derive a holistic
definition of agility. In particular, they investigated the underlying concepts of agility, i.e. flexibility
and leanness (Conboy 2009; Sharifi and Zhang 1999; Towill and Christopher 2002). They define
agility as “the continual readiness of an entity to rapidly or inherently, proactively or reactively,
embrace change, through high quality, simplistic, economical components and relationships with its
environment” (Conboy and Fitzgerald 2004a, p. 110). This definition is in line with the definition of
Pankaj et al. (Pankaj et al. 2009) who stated that agility must respect the abilities to sense a change,
diagnose a change as well as select and execute a response to a change in real-time. However, real-
time does not necessarily mean a very short amount of time, e.g. seconds. Instead, the actual physical
length of time is dependent on the context of the IS and may differ for strategic, tactical and
operational IS (Marjanovic 2007; White 2005) as they support business processes of different time
frames. For instance, the time frame for strategic processes may range from months to years (Pankaj et
al. 2009).
To get an understanding of agility in a BI context, we follow the framework suggested by Knabke and
Olbrich (2013). We believe this framework suits our research very well as the authors analyzed the
concept of agility in IS in a structured literature review and mapped their findings to the domain of BI.
As a result of their analysis they grouped similar constructs of BI agility as illustrated in Figure 1.
These agility dimensions are briefly explained in the following:
Change Behavior is a central construct of agility and describes the behavior of BI with regard to change. A system can behave reactively or proactively, create change, or even learn from it.
Perceived Customer Value (PCV) highlights the importance of quality, simplicity and economy as
values for the customer of BI.
Time describes the ability of BI to adapt to changing environments over time. This can either happen
in a continuous process or on an “ad-hoc” basis. The actual physical length of time is dependent on the
context of the IS and may differ for strategic, tactical and operational IS.
Process comprises the ability of BI to sense, analyze and respond to a change. Agile BI should support
methodologies and organizational structures to be able to quickly respond to changing requirements.
Model incorporates the architecture of BI. Agile BI may even require a new architectural approach which is, among other things, reusable, reconfigurable and scalable.
Approach describes the process method that is used in BI projects, e.g. traditional models such as
waterfall models or agile methods like Scrum.
Technology considers the underlying technology of BI. This may e.g. be a DRDB or an IMDB.
Environment of BI can be interpreted in multiple ways such as business processes, people, customers,
clients, or formalities. It respects the fact that the need for change often arises outside the IS.

Figure 1. Framework for BI agility
Knabke and Olbrich (2013) identified the BI creation process, i.e. the approach, as one aspect of BI agility. However, the observed organization focused only on a technology shift and used only one project approach, a traditional one. Therefore, we neglect the dimension approach in our study as we lack data to analyze it. Nevertheless, to the best of our knowledge, no study analyzes how a technological advancement like IMDB affects BI in terms of agility in a real-life case. Thus, our study attempts to fill this research gap by analyzing how IM-based BI behaves according to the agility dimensions above.
3 RESEARCH DESIGN – CASE STUDY RESEARCH
3.1 Case Study Research
We decided on case study research to determine how IM technology in particular affects BI agility.
According to Yin (1994) a case study is “an empirical inquiry that investigates a contemporary
phenomenon within its real-life context, especially when the boundaries between phenomenon and
context are not clearly evident” (Yin 1994, p. 13). We believe that the method of case study research
is well-suited for our problem for several reasons (Dubé and Paré 2003; Yin 1994). First, case study
research provides a way to analyze the impact of technology on BI agility in a natural setting of a real-
life use case without exerting control over the process or setup of the BI transformation project using
IM technology. Second, since the BI agility phenomenon we are investigating cannot be separated
from the BI transformation project, the boundaries between phenomenon and context are not clearly
obvious. Third, we use multiple sources of evidence, namely qualitative and quantitative data.
Our case study research method consists of three steps (Dubé and Paré 2003). The first step is research
design. It refers to the “attributes associated with the design of the study” (Dubé and Paré 2003,
p. 605). Data collection as the second step describes the quality of the data collection process
including the data collection methods (qualitative and quantitative). The third step, data analysis, is
concerned with the process description, the use of techniques as well as modes of data analysis.
3.2 Case Study Design: The Quest for Agile BI at a German Sportswear Label
The case study was conducted at a globally operating sportswear designer and manufacturer with several billion € in revenue and a profit of several hundred million €. The company, headquartered in Germany, employed more than 50,000 people in 2014 and is one of the biggest sportswear designers
and manufacturers in the world. Its industry has seen some rapid change and fierce competition in
recent years. Hence, the company wants to react to market changes earlier than its competitors and
introduce agile analytics. In a global BI transformation project the organization aimed to consolidate
and transform all global BI applications from different DRDB landscapes into one single, IMDB-
based platform. The initiative started at the end of 2012 and aims to offer high performance reporting and business analytics capabilities based on a consolidated and harmonized data basis with a globally agreed understanding of data structures and key figures ("one version of the truth"). The BI
transformation has a high profile within the organization as BI is seen as a main driver to turn data into
business relevant information. The initiative sets the foundation for several advanced business
scenarios to allow even more fact-based decision making. Examples of these scenarios are big data analytics, predictive analytics and integrated business planning, which were technically not feasible without the transformation program. The program is planned to be completely finalized in 2015. Nevertheless, some major BI applications have already gone live, which justifies an in-depth scientific analysis at this time.
The former BI landscape consisted of two major DWH-based BI systems. One DWH system
contained all retail-related data. All other BI-related information (manufacturing, finance etc.) was
stored in a second DWH system. Both systems were based on DRDB with a necessary performance add-in to meet at least minimum performance requirements. According to the person responsible for BI, no reporting or querying was possible without this additional performance infrastructure (analytical database in
Figure 2). In the new landscape only one DWH-based BI system exists which is completely based on
an IMDB. The overall system landscape is shown in Figure 2.

[Figure: the BI transformation initiative replaces the DRDB BI landscape – two global DWHs on DRDB, each with an analytical database add-on, plus local DWHs on DRDB with local BI tools, all fed by global and local source systems and serving global BI reporting and analysis – with an IMDB BI landscape: a single global DWH on an IMDB with local schemas and local BI add-ons, fed by the same global and local source systems.]

Figure 2. System landscapes overview
3.3 Unit of Analysis
The phenomenon under investigation is the impact of IM technology on the agility of BI. As the
organization utilizes a DWH as the basic concept for its BI activities, we use the criteria for DWH
constituted by Inmon (1996) as the starting point of our analysis. First, Inmon claimed that the integration of data from (diverse) sources ensures consistency and yields a single point of truth. Second, BI elements should be organized according to the subject areas of the organization (subject-orientation). Third, structures in a DWH need to contain a connection to time to show changes over time (time-variance). Fourth, data in the DWH should never be altered (non-volatility). Based on the evidence of high BI
project failure (Chenoweth et al. 2006; Hwang and Xu 2005; Joshi and Curtis 1999; Olszak 2014; Shin
2003), we assume that the fundamentals of DWH-based BI as operated currently, i.e. based on DRDB,
contradict the requirements of today’s agile environments. This indicates the importance of agile BI.
[Figure: research model – the DWH/BI characteristics integration, subject-orientation, time-variance and non-volatility influence the BI agility dimensions change behavior, perceived customer value, time, process, model and environment; technology (disk-resident vs. in-memory databases) moderates these relations.]
Figure 3. Unit of analysis
To identify the impacts of DWH-based BI on BI agility we propose our research model as depicted in Figure 3. Each DWH-based BI characteristic (Inmon 1996) may influence BI agility (Knabke and Olbrich 2013). As an exemplary impact, integration may affect BI agility in terms of model. As we primarily aim to analyze the impact of IM technology on BI agility in our study, the utilized technology is a central construct of our unit of analysis. Therefore, we include technology as a moderator in our research model. In a moderator effect, the impact of an independent variable (here DWH/BI characteristics) on an outcome variable (here BI agility) depends on a third variable, the moderator variable (here technology) (Hayes and Matthes 2009). Taking the moderator into account, the example stated above can be extended so that the impact of integration on BI agility in terms of model is influenced by technology.
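A dichotomous moderator like this can be assessed by a group comparison: estimate the path per group and take the difference. The sketch below illustrates the idea with invented Likert-coded scores and uses a standardized bivariate regression weight (Pearson's r) as a simple stand-in for a PLS path coefficient; it is not the study's actual PLS estimation.

```python
def path_coefficient(x, y):
    """Standardized bivariate regression weight (equals Pearson's r)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

# Hypothetical ratings of one DWH characteristic and one agility dimension,
# split by the moderator "technology" (old DRDB system vs. new IMDB system).
characteristic_old = [-2, -1, 0, 1, -1, 0]
agility_old        = [-1, -1, 0, 0, -2, 1]
characteristic_new = [-1, 0, 1, 2, 1, 2]
agility_new        = [0, 1, 1, 3, 2, 2]

pc_old = path_coefficient(characteristic_old, agility_old)
pc_new = path_coefficient(characteristic_new, agility_new)

# With a dichotomous moderator, the moderating effect is read off as the
# difference between the two group-wise path coefficients.
moderation = pc_new - pc_old
print(round(pc_old, 2), round(pc_new, 2), round(moderation, 2))  # 0.64 0.9 0.26
```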
4 DATA COLLECTION
Different sources of evidence, e.g. qualitative and quantitative data, can be used to investigate the unit
of analysis to provide a broad picture of the phenomenon of interest (Dubé and Paré 2003; Yin 1994).
Hence, we followed a five-step approach to obtain the necessary data as illustrated in Figure 4. It
integrates two data collection techniques as well as qualitative and quantitative data. First, we
developed a survey-based questionnaire and scrutinized it with a group of researchers in our institute.
The questionnaire was developed following the rules of Dillman (Dillman 1978; Dillman 2000; Dillman et al. 2009). It contains each question for both the new (IMDB) and the old (DRDB) BI landscape. Additionally, it includes control questions (Bhattacherjee 2012). Second, we conducted six semi-structured individual interviews with experts at the company of the case study. Within these sessions we verified our research model and the questionnaire. The experts
have been chosen from business and IT (three each) to cover both points of view. We selected them as
they play a major role in the BI transformation project. The group of experts consisted of project
managers as well as responsible functional and technical stakeholders. According to the results of the
interviews we adapted our research model and the questionnaire in a third step. Step four was the conduct of a structured, self-administered survey (Leeuw et al. 2008). The questionnaire was
available on the web within a period of three weeks for a closed group invited via email. For each
variable, e.g. perceived customer value, the participants rated a set of four to ten statements. The
statements “The old BI systems were easy to understand and use.” and “The new BI system is easy to
understand and use.” are one example of the dimension perceived customer value. The answers
consisted of non-dichotomous 7-point Likert scales. The participants’ responses can be aggregated in a
standardized manner and used for quantitative analysis with this study approach (Bhattacherjee 2012).
We evaluated the answers of the survey using quantitative (statistical) methods (see section 5). We
discussed the findings of this analysis with five experts in the last step (one business expert was not
available due to holiday season) to find explanations for (especially unforeseen) results (see section 6).

Figure 4. Data collection process
The results of the quantitative evaluation (step four in Figure 4) were obtained using a gradual approach that is illustrated in Figure 5. First, we coded the survey participants' answers to numeric values. The 7-point Likert scale answers from "strongly disagree" to "strongly agree" were coded with
“-3” to “+3”. Afterwards, we calculated basic survey statistics such as the distribution of IT and
business participants. In step three we conducted an analysis of correlation within the BI
characteristics proposed by Inmon (1996). Fourth, we analyzed the relation between BI characteristics
and BI agility without distinguishing between the used technologies DRDB and IMDB. Last, in
pursuit of our second research question, we looked at the impact of BI characteristics on BI agility and
how the moderator variable technology affects this relation. For step three we used the standard
statistical method of analysis of correlation. For the quantitative analysis of step four and five in
Figure 5 we applied partial least squares (PLS) as an established mathematical procedure (Chin 1998;
Kline 1998; Rönkkö et al. 2012) to identify the path coefficients (PC). Various approaches exist to test
the existence of moderation depending on the scale of the moderator effects (Henseler and Chin 2010).
Based on the setting of the use case we used a pragmatic approach to determine the moderator effect
of the variable technology. Since the assumed moderator is dichotomous, i.e. only has two values (old
BI system, new BI system), a moderating effect can be assessed by a group comparison. Therefore, we
calculated the difference between the PCs for the old and new model. This difference can be
interpreted as the moderating effect of the variable technology (Henseler and Fassott 2010).

Figure 5. Steps of the quantitative survey evaluation
Both the BI characteristics and the dimensions of agility are not directly measurable and were therefore modelled as latent variables (Backhaus et al. 2013). Thus, we assigned indicators to each latent variable. Statements in the questionnaire represent these indicators, and participants of the survey rate their agreement with the statements. Each latent variable is then determined as a weighted linear combination of its indicators, and the dependent latent variables are a weighted linear combination of the independent ones (Backhaus et al. 2013). We chose PLS because it can cope with small sample sizes as well as non-normal data and is adequate for exploratory research (Hair et al. 2011). It is advisable to check for outliers to ensure reliable data quality (Weiber and Mühlhaus
2010). Therefore, we used an agglomerative clustering technique called single-linkage (or nearest-neighbor) clustering (Backhaus 2011). Such algorithms initially treat every data set as a cluster and then reduce the number of clusters by joining them until only one cluster containing all data sets is left. The outliers were determined using a rule of thumb that has proven itself in practice (Backhaus 2011). For the quantitative analysis we used the software tools SPSS Statistics
Version 22 (IBM 2013) and SmartPLS Version 3.1.3 (SmartPLS 2014).
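The outlier screening described above can be sketched as follows: merge the two closest clusters (single linkage: distance between their closest members) until one cluster remains, and inspect the merge heights; an observation that only joins at a far larger height than the rest is a candidate outlier. The 1-D toy data is invented; the study applied this to the full multivariate answer sets.

```python
def single_linkage_merge_heights(points):
    """Agglomerate clusters by single linkage; return the merge distances."""
    clusters = [[p] for p in points]
    heights = []
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Single linkage: distance between the closest members.
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        heights.append(d)
        clusters[i] += clusters.pop(j)  # merge the closest pair of clusters
    return heights

data = [0.1, 0.2, 0.15, 0.3, 0.25, 5.0]   # one obvious outlier at 5.0
heights = single_linkage_merge_heights(data)
print(heights[-1])  # the last merge height is far larger than all others
```

A practical rule of thumb (as hinted at in the text) is then to cut the dendrogram below that last, disproportionately large merge and flag the isolated observations.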
5 DATA ANALYSIS AND RESULTS OF THE QUANTITATIVE
EVALUATION
The email distribution list contained 187 persons. 69 of the invited persons accessed and started the survey (36.9%). 43 of them completed the questionnaire, i.e. 62.3% of the participants or 23% of all invited persons. After removing outliers as described above, 39 participants remained (20.9% of the invitees). As every participant was asked questions regarding both the old and the new landscape, we achieved a total of 78 answer sets. 14 of the 39 participants (35.9%) were users from functional or business departments, 24 (61.5%) were from IT and one (2.6%) answered "other". Regarding their involvement in the BI transformation, three participants (7.7%) were business project members, 13 were IT project members from the organization itself (33.3%) and another 11 were external project members (28.2%). 12 participants (30.8%) were end-users with no project involvement.
We analyzed the answer sets (n=78) without differentiating between the underlying technologies
(IMDB and DRDB) to determine correlations within the DWH characteristics proposed by Inmon
(1996). All four variables, i.e. subject-orientation, integration, time-variance and non-volatility,
correlate positively and no negative correlations exist. The coefficients are between 0.38 and 0.60.
Values below 0.9 are deemed acceptable as very high correlation would question the definition of the
latent variables (Huber 2007).
5.1 Dependencies between BI Characteristics and Agility
Table 1 (n=78) summarizes the dependencies between DWH/BI characteristics and BI agility without
moderator distinction. Considering four variables within DWH-based BI characteristics and six
variables in BI agility, this sub model contains 24 relations. The table shows the PCs between
DWH/BI characteristics and the agility of BI. PCs are the weights of the paths obtained by regressions.
Every PC should have an absolute value of at least 0.1 (Huber 2007). Four of the 24 PCs did not meet this threshold; conversely, 83.3% fulfill the criterion. We applied bootstrapping to assess the significance of the paths. At a significance level of 10%, the resulting t-values (t) should exceed 1.65 (Huber 2007). As shown in Table 1, 41.7% (10 of 24) of the relations are significant at the 0.1 level and meet the criterion.

                     Change Behavior   PCV             Time            Process         Model           Environment
                     PC     t          PC     t        PC     t        PC     t        PC     t        PC     t
Subject-Orientation  .02    0.08       .33*   1.90**   .29*   1.89**   .00    0.05     .13*   0.07     .12*   0.83
Integration          .11*   0.81       .17*   1.13     .06    0.50     .19*   1.86**   .10*   0.53     .16*   1.09
Time-Variance        .14*   0.65       .08*   0.46     .05    0.36     .32*   2.45**   .24*   1.27     .36*   2.22**
Non-Volatility       .47*   2.75**     .29*   1.68**   .33*   2.48**   .40*   2.96**   .33*   2.08**   .10*   0.61
Notes: * Path coefficient above threshold (0.10)
       ** Significant t-value, i.e. t-value above threshold (1.65)
Table 1. Path coefficients between DWH-based BI characteristics and BI agility
5.2 Impacts of Technology on the Agility of BI
We split the data set (n=78) along the lines of the variable technology as the moderator is dichotomous. This results in two groups of n=39 responses each (DRDB and IMDB) and facilitates the assessment of a moderating effect. Table 2 illustrates the PCs of the PLS procedure for these separate estimates. The moderating effect of the variable technology is the difference (Diff) between the new (IMDB) and the old (DRDB) landscape (Henseler and Fassott 2010). Although all paths with an absolute coefficient of 0.10 or higher can already be included in moderating considerations (Huber 2007), we chose a cut-off value of 0.25 to reflect our more conservative approach.

                      Change      Perceived         Time      Process    Model      Environ-
                      Behavior    Customer Value                                    ment
Subject-      DRDB    -0.25        0.14             0.15      -0.22       0.05      -0.03
Orientation   IMDB     0.07        0.43             0.47       0.08       0.40       0.31
              Diff     0.31*+      0.29*            0.33*+     0.31*+     0.35*      0.35*+
Integration   DRDB     0.31        0.04            -0.07       0.25       0.01       0.21
              IMDB     0.24        0.39             0.01       0.23      -0.10       0.24
              Diff    -0.07        0.35*            0.08      -0.02      -0.11       0.03
Time-         DRDB     0.25        0.20             0.07       0.36       0.21       0.41
Variance      IMDB    -0.10        0.20             0.07       0.23       0.40       0.22
              Diff    -0.35**      0.00             0.00      -0.13       0.19      -0.19
Non-          DRDB     0.44        0.38             0.43       0.57       0.45       0.17
Volatility    IMDB     0.58       -0.02             0.27       0.45       0.23      -0.03
              Diff     0.14       -0.41**+         -0.16      -0.12      -0.22      -0.20
Notes: * significant positive moderating effect (difference >= +0.25)
       ** significant negative moderating effect (difference <= -0.25)
Table 2. Path coefficients per technology group (DRDB vs. IMDB) and moderating effect of technology
