The influence of information processing needs on the continuous use of business intelligence
Kevin McCormack
DRK Research, 5425 Willow Bridge Lane, Raleigh, NC 27526-8484, USA
Peter Trkman
University of Ljubljana, Faculty of Economics, Kardeljeva ploščad 17, 1000 Ljubljana, Slovenia
Introduction
The complexity of business today means that a company regularly needs to perform complex analyses of vast quantities of data (Azvine, Nauck and Ho, 2003). Many enterprises now deploy business intelligence systems to obtain timely information about the processes within the organization and the company's environment, combining information on past circumstances, present events and projected future actions to answer questions or solve problems (Dinter, 2013). Business intelligence is the ability of an organization to reason, plan, predict, solve problems, think abstractly, comprehend, innovate and learn in ways that increase organizational knowledge, inform decision processes, enable effective actions, and help to establish and achieve business goals (Bose, 2009; Chaudhuri, Dayal and Narasayya, 2011; Popovic, Coelho and Jaklic, 2012).
The role of business intelligence is to create an informational environment in which operational data gathered from transactional systems and external sources can be analysed in order to reveal strategic business dimensions (Petrini and Pozzebon, 2009). This information should help organizations respond to key business issues, make predictions and act on real-time data to improve the quality and speed of decision-making (MacMillan, 2010). Yet the implementation of business intelligence is a challenging initiative and often fails in practice (Lukman, Hackney, Popovic, Jaklic and Irani, 2011; Tsitoura and Stephens, 2012), and the expected benefits do not always justify the investment in business intelligence technology (Shanks and Sharma, 2011; Trkman, McCormack, Oliveira and Ladeira, 2010). This means that the development of business intelligence capabilities can only enable information-driven decisions; it cannot put them into effect as well.
The available literature (see Jourdan, Rainer and Marshall, 2008 for a full review) often focuses on anecdotal case studies (Davenport, 2009) or a statistical analysis of the impact of business intelligence (Trkman et al., 2010). This is clearly important, especially in the initial, exploratory phase. However, it gives little indication of the factors that lead to a proper outcome of business intelligence implementation. This outcome can prove to be either a radical or an incremental change in decision-making, or it can be watered down to just a few fancy charts in a newly acquired programme. Even more importantly, an analysis of the impact of business intelligence should not focus on the impact at a particular moment in time but should be longitudinal, so as to establish how and especially why it varies over time. As with enterprise resource planning systems (Häkkinen and Hilmola, 2008b), case studies examining the long-term success or failure of business intelligence are scarce.
On the theoretical side, rooting business intelligence research within information systems research requires a rigorous theory-based grounding and analysis of its impact. Since the rapid evolution of technologies and managerial methods poses a significant challenge to theorists (Lewis, 1998), most previous research on business intelligence was atheoretical. Our analysis thus uses organizational information processing theory (Galbraith, 1974) to analyse how new business intelligence technologies can drastically enhance information processing capabilities but do not necessarily have the same effect on information processing needs. While improved decision-making is definitely in the best interest of the organization as a whole, it is not necessarily so for a particular individual, process or decision-making situation. A careful analysis of information processing needs is thus required.
The paper presents a longitudinal case study of a North American company in the synthetic crude oil industry to show how the development of information processing capabilities was initially matched by an increase in information processing needs. However, a change in management and the high turnover of key analytically oriented employees caused a reduction in information processing needs, which eventually led to the erosion of information processing capabilities.
The structure of the paper is as follows. First, the role of business intelligence in changing decision-making is discussed. Then the information processing theory and the ways in which information processing needs can be influenced are examined. A case study is presented and analysed from the information processing theory perspective. The main findings and further research topics are set out at the end.
The influence of business intelligence on decision-making
The information from a business intelligence system needs to be integrated into an organization's business processes. This can be achieved by building a closed-loop decision-making system in which the outputs of business intelligence are used by operational managers in the form of recommended actions (Azvine et al., 2003). A necessary prerequisite for efficient use is the availability of high-quality data. This includes proper data management encompassing the identification of users' needs, data unification, data cleansing and the improvement of data quality control (Popovic et al., 2009). Most business intelligence initiatives have thus focused on developing a high-quality business intelligence data asset that is used instead of existing reporting systems (Shanks, Sharma, Seddon and Reynolds, 2010). Business intelligence systems derive their value from their ability to extract specific and changing data from a wide variety of heterogeneous sources (MacMillan, 2010). The insight gained from such data analyses is then used to direct, optimise and automate decision-making (Bose, 2009). However, this is easier said than done: managing an enterprise requires efficient data management in order to monitor activities and assess the performance of various business processes (Häkkinen and Hilmola, 2008a).
This general summary of the findings of previous studies certainly supports the importance of business intelligence use and its potential impact on the organization. Yet it does not give deeper insight into the ways business intelligence is adopted and how it changes the way individuals use (or do not use) its tools for decision-making. Like other strategic technologies, business intelligence will evolve or change depending on organizational needs and maturity (Russell, Haddad, Bruni and Granger, 2010). Despite the significance of business intelligence research, little attention has been given to examining the natural progression of business intelligence adoption and maturation within organizations (Russell et al., 2010). Organizational factors are indeed crucial for utilising business intelligence to make better decisions (Viaene and Den Bunder, 2011).
Despite some well-publicised case studies of business intelligence being employed to support value-creating actions, Shanks et al. (2010) found little evidence of this. Since business intelligence takes up resources and the benefits actually realised in practice are not always clear (Lönnqvist and Pirttimäki, 2006), this may lead to disappointment after the initial enthusiasm. Value should be obtained and measured throughout the business intelligence implementation cycle, not just at the end of the project (Viaene and Den Bunder, 2011). A greater focus is thus needed on how and to what extent employees adopt and continue to use the available tools. After all, business intelligence systems are not used by an organization but by the individuals working within it. An individual's information-seeking behaviour is consistent with his or her information processing capability (Driver and Streufert, 1969). Assuming that users will somehow start using the developed business intelligence applications ('if you build them, they will come') is a sure recipe for failure (Glassey, 1998).
User adoption of various kinds of information systems is undoubtedly one of the most thoroughly studied topics. Therefore, the usual guidelines for software usability in general (Nielsen, 1999) and decision-support systems in particular (Clark, Jones and Armstrong, 2007) should be followed in the development of business intelligence systems. However, we focus mainly on an analysis of the factors motivating users to want or need to use business intelligence tools in the first place. In this respect, information processing needs within businesses require further research.
Information processing capabilities and needs
In our paper the impact of business intelligence is examined from the organizational information processing view, which posits that information processing capabilities and needs have to be aligned (Galbraith, 1974). Information processing is the purposeful generation, aggregation, transformation and dissemination of information associated with accomplishing some organizational task (Robey and Sales, 1994; Stock and Tatikonda, 2008). The company thus needs to identify those areas in which an increase in information processing needs demands an increase in its information processing capabilities (Fairbank, Labianca, Steensma and Metters, 2006). Consequently, the interactive effect of information needs and capability has a significant effect on performance (Premkumar, Ramamurthy and Saunders, 2005). In accordance with information processing theory we thus argue that the most effective organizational design strategies are those that recognise an appropriate fit between an organization's ability to handle information and the information required (c.f. Narayanan, Jayaraman, Luo and Swaminathan, 2011). Yet establishing this fit presents a complex challenge and the alignment process needs to be understood (Chou, Chang, Cheng and Tsai, 2013).
As noted, information processing capabilities must be aligned with needs (Galbraith, 1974): capabilities need to be balanced with the extent to which decision makers' information requirements are met (Grosswiele, Röglinger and Friedl, 2013). Higher performance standards entail less availability of slack resources and thus greater processing needs to achieve the goals (Galbraith, 1974; Trentin, Forza and Perin, 2012). The information processing view thus sees the linkage between a key organizational resource (information) and its management (i.e., the use of information) as an organization's most critical performance factor (Fairbank et al., 2006).
Yet a high level of information processing capabilities is not necessarily positive; a company may experience subpar results if its information capabilities are high but unbalanced (Sleptsov and Anand, 2008). When information processing capabilities are lower than what a task requires, performance standards will not be met. Conversely, when an organization possesses more information processing capabilities than required, the task will be accomplished inefficiently (Stock and Tatikonda, 2008). Further, information processing needs may change along with organizational development; business intelligence implementation is thus not a one-off project. If an organization does not select a suitable system to support managers' decisional tasks and does not consciously match its information needs and information capabilities, misfit will follow (Mani, Barua and Whinston, 2010).
Finally, we argue that not only companies but also individuals (the company's employees) seek to establish a fit between information processing needs and capabilities, since both are present at the organizational and the individual level. How much individuals believe they need information processing to make a decision in a particular situation influences their level of use.
The influence on information processing capabilities
The potential impact of business intelligence on processing capabilities has been extensively studied. Past research mostly focused on the integration of data and development of tools to process those data. Some of the technological capabilities needed for implementing business intelligence are clear. The successful implementation of real-time business intelligence requires a real-time data warehouse. Critical data must be captured from source systems, loaded into a warehouse, and made available (Anderson-Lehman, Watson, Wixom and Hoffer, 2004). Actionable information for applications and users must be provided while the costs of communicating information should be reduced by improving the quality and speed of information processing (Songpol, Chiquan and Napatsawan, 2004).
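As a purely illustrative aside, the capture, load and make-available cycle described above can be sketched as follows. The source, warehouse and field names are hypothetical assumptions, and production real-time business intelligence platforms rely on dedicated extract-transform-load or streaming tools rather than a script like this; the sketch only makes the basic steps concrete.

```python
# Illustrative capture -> load -> make-available cycle (hypothetical names).
# Real deployments use dedicated ETL/streaming tools; this only sketches the
# steps named in the text: capture from a source system, load into a warehouse
# table, and make the data available for reporting.
import sqlite3
from datetime import datetime, timezone


def capture_from_source() -> list[tuple]:
    """Stand-in for reading new records from a transactional source system."""
    loaded_at = datetime.now(timezone.utc).isoformat()
    return [('order-1001', 'approved', loaded_at),
            ('order-1002', 'pending', loaded_at)]


def load_into_warehouse(rows: list[tuple], db_path: str = 'warehouse.db') -> None:
    """Append captured records to a (here: local SQLite) warehouse table."""
    conn = sqlite3.connect(db_path)
    with conn:  # commits the transaction on success
        conn.execute('CREATE TABLE IF NOT EXISTS fact_orders '
                     '(order_id TEXT, status TEXT, loaded_at TEXT)')
        conn.executemany('INSERT INTO fact_orders VALUES (?, ?, ?)', rows)
    conn.close()


if __name__ == '__main__':
    load_into_warehouse(capture_from_source())
    # Once loaded, the data are 'made available' to reporting and analysis tools.
```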
Business intelligence implementation should involve developing information capabilities that lead to better decision-making, which can be referred to as business intelligence capabilities (Susarla, Barua and Whinston, 2010). How well such activities can be performed depends on the quality of the business intelligence tools and the quality of data used as input (Susarla et al. 2010). However, although many information technology techniques have been developed for intelligent information processing, even the most advanced ones are not yet mature enough to solve complex real-world problems (Zhong, Liu and Yao, 2007). Therefore they must be complemented by the information processing capabilities of individual employees and/or groups of employees. An organization must determine what kind of information is necessary and which capabilities need to be in place to allow successful operation (Lönnqvist and Pirttimäki, 2006). This is very important since developing the “wrong” capabilities may have a negative effect (Sleptsov and Anand, 2008). A company must thus be able to routinely develop information processing capabilities to enable agility in a turbulent business environment (Huang, Pan, Ouyang and Chou, 2011). However, the information processing capabilities can only be viewed as valuable if they are used (Popovic et al. 2012); thus information processing needs also need to be studied.
Influences on information processing needs
The influence of information processing needs on business intelligence adoption has mainly been studied at the organizational level. A well-known claim in this respect is that companies in more information-intensive sectors are more likely to adopt business intelligence than those in less information-intensive sectors (Thong, 1999).
Yet information processing needs arise at the organizational as well as the individual level. The latter denotes the need of an individual to use information processing to make a decision. Information technology applications could often address unsatisfied information needs, but managers and users are unaware of this capability (Ragowsky, Licker and Gefen, 2012). Business intelligence services are, after all, evoked by organizational actors (usually employees) to help execute their tasks (Mathiassen and Sørensen, 2008). These actors should view data as an asset that can be turned into a competitive advantage when used properly (Wixom, Watson, Reynolds and Hoffer, 2008). Information processing needs therefore have to be related to organizational objectives (Huotari and Wilson, 2001).
There are several ways of making such decisions, ranging from the purely intuitive (i.e., gut feeling) to the more structured (simple spreadsheet calculations or decisions supported by advanced business intelligence). This means that while buy-in from top management and their engaged support are absolutely necessary for successful information systems implementation (Hedgebeth, 2007; Neufeld, Dong and Higgins, 2007; Sharma and Yetton, 2003), this in itself is not enough to secure continuous use. Users may often seek only small amounts of information, and even that from a limited range of sources and providers (Marcella and Illingworth, 2012). Something must ensure a permanently high level of information processing needs that will lead to higher business intelligence use.
Some such possibilities are encouragement from management, a system of incentives, individual heroics, regulative pressure or a so-called analytics-oriented culture (see e.g., Davenport, Harris and Morison, 2010; Kiron and Shockley, 2011).
Accordingly, success requires more than just establishing the right infrastructure: tools and systems must be well matched as well as effectively implemented to become part of the problem-solving requirements (Kleinschmidt, De Brentani and Salomo, 2010). Business intelligence usually increases information complexity, alters the manner of information processing and impacts group decision processes and outcomes. The use of business intelligence can thus even exacerbate the overload problem since massive amounts of (sometimes irrelevant) information are available at the click of a mouse (Paul and Nazareth, 2010).
Users can therefore choose either heuristic processing of information, characterised by shallower, less critical information processing, or more systematic processing that leads to a deeper, more elaborate, argument-based evaluation of information (Scholten, van Knippenberg, Nijstad and De Dreu, 2007). It is therefore vital that the relationship between users and their information needs is properly interpreted (Normore, 2011). In general, the active search for information depends on the corresponding information-acquisition costs and marginal benefits (Papathanassis and Knolle, 2011).
If we assume that the cost of data acquisition from an advanced business intelligence system can be relatively low (depending, of course, on the system's usability), an important question is what the benefits are for the individual employee. If performance expectations and measurements are vague, the stimulus for using business intelligence may be very low (Harris, Craig and Egan, 2009) and users may perceive its use as a waste of time. In a typical example from an insurance company, employees who used some simple analytical tools saved the company thousands of dollars by being able to reject unjustified claims by policyholders. However, managers considered them less productive than others because of the reduced number of claims processed per hour, which was a key performance indicator for the company.
This example illustrates the importance of managers’ attitudes. Senior management should thus have a vision for business intelligence, provide the necessary resources and insist on the use of information-based decision-making (Cadez and Guilding, 2008; Watson and Wixom, 2007). Users should be provided with data access tools that are appropriate for their needs, trained in how to use the tools and the available data, and given access to people who can help them use business intelligence (Watson and Wixom, 2007). This should resolve the tension between the different information processing needs: namely programmed, quick-action loops on one hand (e.g., built-in, automatic business intelligence) and the increased requirement for emergence and improvisation (e.g., people-assisted decisions) on the other (c.f. Carlsson and El Sawy, 2008). Only in such an environment and in the context of process accountability will the users perceive a greater need for more information and choose the correct decision alternative more often. This motivation produces high-quality decisions because it stimulates systematic information processing (Scholten et al., 2007).
Following this review, we set out the two main research questions as follows:
Research question 1: Following Wilson's (2006) suggestion that we need a better understanding of the needs that lead to information-seeking behaviour, we ask the following question: What is the role of the information processing needs of users in the long-term success of business intelligence?
Research question 2: Following the claim by Elbashir, Collier and Sutton (2011) about the indirect effect of top management on the success of business intelligence, we ask the following: How does influencing the information processing needs of employees affect the use of business intelligence and what role does top management play?
Case study
Justification of a case study
We used an interpretative mixed method approach that combined a qualitative case study and surveys. According to Yin (2003) a complex interaction between individuals, groups and information systems leading to the adoption and successful use of business intelligence can best be analysed with a mixed method approach. Case study research is particularly appropriate for the study of the development, implementation and use of information systems within organizations (Darke, Shanks and Broadbent, 1998).
In our particular example, a case study combined with survey methods was deemed appropriate. Business intelligence services are configurations of heterogeneous information processing capabilities, and evoking different configurations may lead to equally satisfactory outcomes (Mathiassen and Sørensen, 2008); conversely, very similar configurations may lead to considerably different outcomes. Surveys can provide a focus, while case studies can provide the organizational context for studying the relationship between strategy and information technology (Benbasat, Goldstein and Mead, 1987).
Method
A longitudinal mixed method study of Oilcom was chosen to analyse the way business intelligence implementation influenced both information processing capabilities and needs. The usual guidelines for mixed methods (case studies with surveys) were followed (Benbasat et al. 1987; Eisenhardt 1989; Yin 2003). Several data collection methods were used to collect, triangulate and validate the data. Four of the researchers collaborated on developing the company’s business intelligence capabilities over three years and made direct observations during this period. Internal documentation was reviewed and the current business processes were analysed. Process maps were constructed and studied and more than fifty end-users were interviewed. Several data sets were gathered using existing manual data sources (spreadsheets) and combined in order to understand the current situation and quantify the potential opportunity for a business case.
These combined data sets were also used to calculate the current-state metrics discussed in this case. Room bookings, no shows and late arrivals were identified as critical performance metrics. They were used to examine process performance, to develop business cases for improvement, and to measure the impact of business intelligence before, during and after the changes. In addition, an online web-based survey and on-site interviews were used to gather information about room availability and guest satisfaction at the fifteen lodges.
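To make these metric definitions concrete, the minimal sketch below shows how a no-show rate and the share of short-notice requests can be computed from booking records. The field names and sample data are hypothetical illustrations, not the spreadsheet logic actually used at Oilcom.

```python
# Minimal illustration of the metric calculations described in the text.
# Field names and sample records are hypothetical; the actual data were
# gathered from Oilcom spreadsheets and, later, the booking system.
from dataclasses import dataclass
from datetime import date


@dataclass
class Booking:
    requested_on: date    # date the room request was submitted
    arrival_date: date    # date the worker was expected to arrive
    guest_arrived: bool   # False = "no show" (room paid for, nobody arrived)


def no_show_rate(bookings: list[Booking]) -> float:
    """Share of reserved (and paid for) rooms where no one showed up."""
    return sum(not b.guest_arrived for b in bookings) / len(bookings)


def short_notice_share(bookings: list[Booking], min_days: int = 2) -> float:
    """Share of requests submitted with fewer than `min_days` days' notice."""
    late = sum((b.arrival_date - b.requested_on).days < min_days for b in bookings)
    return late / len(bookings)


if __name__ == '__main__':
    sample = [
        Booking(date(2009, 5, 1), date(2009, 5, 20), True),
        Booking(date(2009, 5, 18), date(2009, 5, 19), False),
        Booking(date(2009, 5, 10), date(2009, 5, 11), True),
    ]
    print(f'No-show rate: {no_show_rate(sample):.0%}')              # 33%
    print(f'Short-notice share: {short_notice_share(sample):.0%}')  # 67%
```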
Interviews with Oilcom managers, outside contractors, lodge managers and individual workers were also conducted during the three-year effort. The sample consisted of more than fifty Oilcom employees responsible for the assignment of rooms and allocation of room costs. Twenty Oilcom managers responsible for the people using the rooms were interviewed. In addition, each of the camp managers and one clerk from each camp were interviewed and observed (fifteen camps). Also, several hundred camp guests were interviewed and several thousand were surveyed out of a camp population of 15,000. All fifteen camps were also visited during this process. Contractor employees responsible for requesting rooms were interviewed by phone and observed and re-interviewed over a period of three years. All process maps produced were reviewed with the process teams and validated. All data collected and examined were gathered from the Oilcom process teams and validated with the teams in workshops held during the study. The final performance data was gathered directly from the system and validated with the process teams.
Case description and analysis
Oilcom is a producer of synthetic crude oil (the name is fictional; all other data are real) produced from oil sands in Canada. In late 2007, Oilcom began investing heavily in order to expand its production. To sustain its operations and support the ongoing expansion, thousands of skilled construction workers had to be transported from other regions of North America and from other countries to this remote part of the world. This meant that Oilcom had to coordinate more than 10,000 temporary workers at 15 lodge sites each day in support of 175 simultaneous construction projects and 300 different construction contractors, along with lodges, buses and airport transport. Oilcom had to continually manage the assignment and movement of workers to or between projects. Oilcom thus faced the same challenges as companies in many industries which constantly strive to minimise idle resources while aiming to increase revenue from new project opportunities and to improve the quality of the staff assigned to each project (Chenthamarakshan et al., 2010).
By the summer of 2008, the manual processes for coordinating the transportation and housing of its temporary workforce were collapsing under the enormity of the task; the information processing capabilities were not technologically advanced enough to match the organization's needs. On an annual basis, Oilcom was losing millions of dollars through overbooked rooms, underutilised transportation, worker turnover and construction project delays. Information on the process was mostly unavailable or unreliable, so an exact calculation of the loss was not even possible. The total costs of the process were over US$500 million a year. While management knew the process was inefficient, it could neither estimate the total loss nor solve the problems itself.
Oilcom engaged external consultants to quickly survey the current operations and recommend changes. It was clear that trying to manage the work flow with e-mails, faxes and spreadsheets was impossible. The survey results showed a high level of dissatisfaction with the current process and some facility issues (cleanliness, food, etc.). The people managing the process reported a high level of frustration with the lack of timely data and their inability to secure rooms and provide confirmation to guests in a timely manner when requested.
A web-based, self-service, centralised data repository was envisioned where Oilcom could enter its workforce requirements (forecast demand), contractors could enter accommodation requests for their workers (actual demand), lodges could enter current accommodation availability (supply) and bus and airline companies could provide updates to scheduled routes and flights (supply). With access to timely demand and supply information, better decisions could be made by every participant, at every level. Single-point data entry also would minimise the data quality errors rampant in the manual process. Oilcom and the hundreds of participating companies could visualise the approaching demand and plan/prepare as needed for on-time delivery.
An outside firm was contracted to develop the system quickly. The intent was to make the system inexpensive and easy to replace so a software as a service delivery approach was chosen. Within two weeks of the start date a prototype system was being evaluated by users and within another two weeks user feedback was incorporated and the basic system was operational. Quick reference guides were built to instruct various levels of users and train hundreds of contractors and Oilcom personnel on how to enter data and monitor their requests for approval. In less than ten weeks the basic system was developed. The general consensus across participants was that the system was easy to use, information was immediately and broadly accessible across the organization and, most importantly, the system produced reliable results. A typical opinion is expressed by a key requestor: 'It feels like it is natural; it fits with what we need to do and there is very little "friction" or unnecessary work'.
With the force of the senior director and the project management directors requesting and using the business intelligence views and reports, a simultaneous increase was seen in information processing needs. This put pressure on the process team members to use the developed business intelligence system. It also created incentives for people to use business intelligence and gain recognition from the senior director.
The system consisted of an Excel-like dashboard with tabs presenting different business intelligence functions. A typical example was the MP report (named after the first initials of the two vice presidents sponsoring the project), developed as a daily 'coffee table' document for people at all levels to see and discuss the issues for that day and the following week. This was an on-demand, automatically generated compilation of supply and demand for a lodge for a specified day, coupled with a forecast projection eight days out. It also showed the number of people charged to each project or cost centre. It was viewed as a critical document, looked at daily to help make informed decisions. In addition, key performance indicators were set up to measure progress and help people focus on performance improvements. The system connected easily to Excel and allowed the preparation of creative reports.
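The sketch below gives a rough, hypothetical impression of the kind of compilation the MP report provided: confirmed demand set against lodge capacity for a requested day plus an eight-day forward projection. The data structures and names are assumptions for illustration, not the actual system's design, and the cost-centre breakdown mentioned above is omitted.

```python
# Hypothetical sketch of an MP-style supply/demand compilation for one lodge:
# confirmed demand per day against available rooms, for the requested day plus
# an eight-day forward projection (negative gap = overbooked).
from collections import Counter
from datetime import date, timedelta


def mp_report(lodge_capacity: int,
              confirmed_arrivals: list[date],
              report_day: date,
              horizon_days: int = 8) -> list[dict]:
    """Return one row per day with demand, supply and the projected gap."""
    demand_by_day = Counter(confirmed_arrivals)
    rows = []
    for offset in range(horizon_days + 1):
        day = report_day + timedelta(days=offset)
        demand = demand_by_day[day]
        rows.append({
            'day': day.isoformat(),
            'demand': demand,
            'supply': lodge_capacity,
            'gap': lodge_capacity - demand,
        })
    return rows


if __name__ == '__main__':
    today = date(2009, 6, 1)
    arrivals = [today] * 640 + [today + timedelta(days=3)] * 710
    for row in mp_report(lodge_capacity=700, confirmed_arrivals=arrivals,
                         report_day=today):
        print(row)
```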
The performance results were significant. No shows, a key measure of rooms reserved and paid for but where no one showed up (reserved rooms were required to be held for seven days), went from 30% to less than 10%, saving US$20 million per year. No rooms (measured informally in the surveys mentioned in the introduction to this case), whereby workers arrived and rooms were not available, went from 15% to less than 2%. The share of requests made with fewer than two days' advance notice dropped from 80% to 43%. All of these produced significant savings.
When the senior director left in late 2009 and there was significant turnover in project management, the behavioural incentives (the managerial recognition for making a decision supported by business intelligence) changed dramatically and the analytics behaviour slowly collapsed into mere transactional support. The new leadership was generally unaware of the system's capabilities. Owing to the collapse of oil prices at the end of 2009, the rate of pay and the quality of the people employed fell significantly. In fact, three new senior directors came and went within a year, and the recruitment process for new leaders did not include training in the process or the system. Many employees who had enjoyed the previously increased information processing capabilities left the company because their analytical approach did not match the reduced information processing needs of the organization. Turnover in the core users' group was high: 70-80% of these staff left over those three years.
With the new leadership, the job requirements changed. Employees were not expected to use the business intelligence reports and analyses, leading to the degradation of job competencies. The focus was on transactions and getting them done. The initial business intelligence system had relatively few business rules built into it because of the variable nature of the process (e.g., although the rule is that workers, supervisors and executives should be in different types of rooms, sometimes a worker can be placed in a supervisor’s room). This was initially beneficial as it gave employees flexibility to use both their personal judgement and business intelligence capabilities to make the optimal decision. However, it also enabled a quicker downgrade of the processing capabilities.
A typical example of this downgrade was the MP report mentioned earlier, which included information on current demand, supply and forecasts by business unit and lodge. While the core process did not change and its stated goals and key performance indicators remained the same, the perceived importance of this document went from 'the most important document available' to 'nobody looks at it'. This confirms the finding of Chou et al. (2007) that not all documents generated by business intelligence applications are useful, only those that satisfy employees' needs and preferences. However, those needs depend on the leadership's reporting requirements, and in this case the new management did not look deeply into the numbers.
After three years, the new management (the third composition of the leadership team in two years) felt that even the existing processing capabilities available to contractor employees were too high because of some data entry issues (e.g., contractor requestors entering employees in the system without their employee identification numbers). They therefore shut down most self-service requesting and the process reverted to a typical Excel-by-e-mail process, coupled with faxes and telephone calls, sent to a centralised data entry group. This group then entered the approved reservation and used the system to allocate charges to projects.
In addition, contractor requestors were blocked from accessing information about their current or historical requests, thus blinding them to process performance and essentially reducing their information processing capabilities to zero. The requirement for forecasts from the project management group was abandoned as well, which also meant the elimination of reporting on the difference between forecast and actual demand. Both had been original drivers of the development of the system. The system that was originally meant to provide on-demand, distributed business intelligence capabilities in both reporting and prediction thus became mostly a centralised transactional system accessible only to a small, centrally located group. The management-by-numbers culture gave way to basic transactional performance.
The degradation of performance was considerable. The total number of requests went from 50,000 in 2009 to 107,000 in 2011. The percentage of no shows (when a room was booked but nobody arrived) went from less than 10% to almost 12%. Since rooms are reserved and paid for when a request is submitted, this accounted for a significant financial outlay: the roughly two-percentage-point increase on 107,000 requests means over 2,000 additional rooms were held for up to seven days at US$150 per day, which equals about US$2.1 million a year. The share of requests submitted with less than two days' notice went from 43% to almost 90%.
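The back-of-envelope calculation behind the US$2.1 million figure can be reproduced as follows, using the rounded numbers quoted in the text.

```python
# Reproduction of the back-of-envelope cost of the increased no-show rate,
# using the rounded figures quoted in the text.
requests_2011 = 107_000
no_show_increase = 0.02        # roughly two percentage points (from <10% to ~12%)
room_rate_per_day = 150        # US$ per room per day
days_held = 7                  # reserved rooms were held for up to seven days

extra_rooms = requests_2011 * no_show_increase        # 2,140 rooms ("over 2,000")
annual_cost = 2_000 * room_rate_per_day * days_held   # with the text's rounding
print(f'about {extra_rooms:.0f} extra rooms held -> roughly US${annual_cost:,} a year')
# prints: about 2140 extra rooms held -> roughly US$2,100,000 a year
```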
The average days-to-arrival metric (how many days before the individual's arrival the request was submitted) went from thirty days in 2009 to ten days in 2011. If someone showed up without a room being reserved, they were offered a room at a central lodge or a hotel in town and checked into the 'penalty box'. A lack of planning always leads to errors, with a deviation of actual from forecast demand being defined as an error. This penalty box metric went from 1.85% in 2009 to 2.13% in 2011. This clearly shows that the erosion of information processing capabilities led to the degradation of process performance and an increase in costs compared with 2009.
Interestingly, however, the current management of this process was unaware of the performance degradation, since they did not even examine the performance. Transactions were executed with minimal friction and the basic job got done. The leadership essentially had a short-term focus, and process improvements (or degradation) were not of interest unless there was a crisis. Even our analysis was conducted only for this paper and has not yet been presented to the current Oilcom leadership.
Discussion
Our first research question was: What is the role of the information processing needs of users in the long-term success of business intelligence? We have shown that business intelligence can enhance information processing capabilities if it is properly implemented. However, only a simultaneous increase in information processing needs (which can arise from managers' attitudes towards analytically based decisions, an analytics-oriented culture or other pressures) will ensure it is really used. In the Oilcom case, an intervention by external companies increased the information processing capabilities, but that only lasted as long as information-based decision-making was required by management.
Information processing needs exist at both the individual and the organizational level. The latter has been well studied. Yet we showed that although the organization as a whole may require information-based decisions, this may not be reflected in the perceptions of the individuals who make those decisions. Information processing needs are thus both objective (how much information processing is needed to improve the outcome of the decision) and subjective (based on an individual decision maker's perception of how much information he or she needs).
Using business intelligence can be cumbersome for users; if they are not required or stimulated to engage, they may not do so. Thus business intelligence alone will not guarantee improvement, even if it is perfect from a technological and data-content perspective. Although data-driven decision-making is beneficial for the organization, it does not necessarily benefit the individuals within it, as it can place extra work on them to use the increased information processing capabilities.
Our second research question was: How does influencing the information processing needs of employees affect the use of business intelligence and what role does the top management play?
Similarly to Elbashir et al. (2011), we found that while top management plays a significant role in the effective deployment of business intelligence systems, its impact is indirect; in our case, through the potential to influence the processing needs. Employees are often interested mainly in what is required of them, and the same transaction in the same process can be done in several ways. Information processing needs are thus very much in the eye of the beholder. Actions taken by managers to institute, support and legitimise the required new contexts are therefore critical to successful implementation (Sharma and Yetton, 2003).
Interestingly, tangible performance-based incentives (e.g., financial bonuses) may not even be needed to increase the processing needs. In our case, the recognition of and demand for analytically based decisions sufficed. This further confirms that the drive to acquire social acknowledgement or status can contribute importantly to the adoption of information systems (c.f. Abraham, Boudreau, Junglas and Watson, 2013). It also reinforces the need to consider the specific context of use when designing performance measurement for information technology intensive systems (Elbashir et al., 2008). If performance is not both measured and recognised, it is possible that transactions will run smoothly but inefficiently. If information processing capabilities in the organization are lower than information processing needs at the transactional level, it is likely that people will complain, since they will be unable to perform their daily activities. On the other hand, if the capabilities are higher than the needs, it is possible they will not be used, without this lack of use being noticed by anyone, despite sub-optimal decisions continuously being made.
Conclusion
Our paper has enhanced the understanding of how information processing needs develop in organizations. We have shown that a better understanding of the needs that fuel an individual's information-seeking behaviour is required (Wilson, 2006). The research shows how top management can affect the information processing needs, leading to better acceptance, better decision-making and consequently better performance. We also showed, similarly to Savolainen (2011), that greater environmental uncertainty does not necessarily lead to higher information processing needs, as is often argued (e.g., Shockley et al., 2011); the amount of information required to perform a task is not fixed. Nevertheless, as shown by our case, environmental turbulence (e.g., a crisis) can serve as a force stimulating information processing needs. However, increasing those needs with other interventions in less turbulent times may be equally beneficial.
The paper has important practical implications. Previous studies of leaders' influence on the success of information technology projects have not provided much guidance on the specific managerial behaviour needed for implementation success (Neufeld, Dong and Higgins, 2007). For business intelligence to succeed, management should not only provide nominal or verbal support, funding, training and the like, but, most importantly, should ensure that employees are required to make decisions supported by business intelligence. The information processing needs thus have to be carefully managed.
Obviously, our case suffers from the usual limitations of case study research. A single case may not be sufficient to allow the generalisation of the findings. Various characteristics of different types of business intelligence systems may have different influences on business intelligence adoption. Our analysis also does not completely apply to embedded capabilities (Azvine et al. 2005) that are fully automated in the software. Further, the industry in question is mainly focused on service operations management. Information processing capabilities, needs and consequently adoption may be different in more knowledge-intensive and less-structured processes.
Future research avenues include case studies in various organizational settings. A statistical analysis of the influence of perceived information processing needs on the individual adoption and continued use of business intelligence is also called for. In this respect, a more rigorous method for measuring information processing needs would be needed; this would also help to measure the various levels and intensities of use in different phases of business intelligence implementation. Such studies can help not only to implement business intelligence better but also to improve the likelihood of its proper use in the long run, leading to better decision-making and consequently better performance.
Acknowledgements
We are grateful for help and several extremely useful comments and suggestions by the editor Dr. T. D. Wilson, associate editor Dr. Charles Cole, copy-editor Dr. Peta Wellstead and two anonymous reviewers that enabled a considerable improvement of an earlier draft of the paper.
Peter Trkman acknowledges the financial support from the Slovenian Research Agency (project No. J5-6816).
About the authors
Kevin McCormack, Ph.D. is President of DRK Research, a global research network. He is also a professor at two universities: Northwood University, Midland, Michigan, USA and Skema Business School (a French business school). He has over thirty years of business leadership, engineering, teaching, research and consulting experience in information technology, operations management and supply chain management. He can be contacted at: kmccormack@drkresearch.org
Peter Trkman is an Associate Professor in the Department of Information and Logistics Management, Faculty of Economics, University of Ljubljana, Slovenia. His research deals with various aspects of business process and supply chain management. His work has been cited over 1,200 times. He can be contacted at: peter.trkman@ef.uni-lj.si