
May 11, 2010

Answers to a Reporter's Questions About Pervasive BI - Part 2

In the last post, I shared part of a recent dialogue I had with Canadian reporter David Hamilton of Web Host Industry Review (WHIR) who asked some interesting questions about pervasive business intelligence (BI). Here is the rest of that interview.

WHIR: One of the success story examples of an organization using BI was Blue Cross Blue Shield of Kansas City which, among other things, used its newly found insight into its data to streamline internal processes, and even expose previously unavailable data to physicians who could use that knowledge to better treat patients. It seems to me that some of the benefits of BI in this case were, perhaps, not altogether anticipated. What are some of the - I guess - unforeseen or unexpected benefits of BI for organizations? 

VF: This is a fascinating topic. People pay money to hear companies tell these stories. I’ll give some examples that I’ve seen in recent years.

  • I think it was a surprise even to Wal-Mart when, several years ago, the company was able to track the spread of flu across the US more precisely and accurately than the CDC, based on purchases of flu medications in its stores.

  • I have seen a few occasions where results of BI analysis done in the data warehouse (DW) had discrepancies with the operational system reports. Further investigation uncovered errors in the operational systems which had an impact on the company’s business.

  • Continental Airlines’ BI system allowed them to understand and serve their OnePass customers so well that when JetBlue was launched in the early ’00s, Continental was able to forgo the route their competitors took when they introduced low-cost lines like Ted and Song. Continental’s analytics enabled them to differentiate themselves on their service and attract and retain the customers willing to pay a premium for better service, a move that helped them maintain a strong competitive market position even in the face of downward price pressure.


  • And then there is one of the most famous cases of all, laid out so superbly by Michael Lewis in “Moneyball.” During the 2001 Major League Baseball draft, Harvard grad and statistics whiz Paul DePodesta introduced Oakland A’s general manager Billy Beane to the concept of analyzing data to select the right players and the best team, for the best price. Only DePodesta’s analysis of Kevin Youkilis’s statistical data revealed the promise of the kid whom the scouts dismissed as “a fat third baseman who couldn’t run, throw or field.”

The 2009 American League rankings looked like this for the Boston Red Sox player who might never have made it to the pros but for data analysis:

On-base %: 2nd
Slugging %: 5th
Runs created: 8th
Pitches per plate appearance: 1st

Youkilis also holds Major League Baseball’s all-time record for most consecutive errorless games at first base (238, in 2008) and MLB’s record for fielding attempts without an error at first base (2,002, also in 2008). He won the 2007 AL Gold Glove award for first basemen, and in his major league rookie year, 2004, he was only the 7th player in Red Sox history to hit a home run in his first game.

As contrary as it is to century-old conventional scouting wisdom, predictive analysis has nevertheless proven to be successful. Evidence includes, as analytics expert and my fellow Red Sox fan Thomas Davenport laments, its adoption by the New York Yankees.                               

  • Technically, data mining is different from BI, but if you’re using the broad definition of BI which encompasses everything including the data warehouse, then we can talk about data mining here. Discovering the unknown and unanticipated is what data mining is all about, as illustrated in the proverbial beer and diapers example. As the urban legend goes, analysis of Osco store receipts indicated a correlation between purchases of beer and diapers, and moving them near each other resulted in a spike in sales.
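Data mining tools formalize this kind of discovery as association rules. As a rough illustration only (the receipts and numbers below are invented, not the actual Osco data), a market-basket analysis boils down to counting how often items co-occur and comparing that to chance:

```python
# Minimal association-rule sketch with made-up receipt data (not Osco's actual figures).
# It computes support, confidence, and lift for the pair (beer, diapers).

receipts = [
    {"beer", "diapers", "chips"},
    {"diapers", "wipes"},
    {"beer", "diapers"},
    {"milk", "bread"},
    {"beer", "chips"},
]

def pair_stats(transactions, a, b):
    n = len(transactions)
    n_a = sum(1 for t in transactions if a in t)
    n_b = sum(1 for t in transactions if b in t)
    n_ab = sum(1 for t in transactions if a in t and b in t)
    support = n_ab / n                                 # how often the pair appears at all
    confidence = n_ab / n_a if n_a else 0.0            # P(b | a)
    lift = confidence / (n_b / n) if n_b else 0.0      # > 1 means co-occurrence beyond chance
    return support, confidence, lift

print(pair_stats(receipts, "beer", "diapers"))  # (0.4, 0.666..., 1.111...)
```

A lift above 1 is the signal that two items appear together more often than their individual popularity would predict, which is the kind of pattern that gets products moved next to each other on the shelf.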

WHIR: Increases in technology have only recently opened up the possibilities of collecting and analyzing massive amounts of data. Can you tell me a little about the infrastructure needed to process this data and present it in usable ways?

VF: There are numerous new and existing technologies aimed at increasing the ability to collect and analyze large volumes of data by improving performance: data compression, 64-bit computing and in-memory processing, in-database analytics, MapReduce, column-oriented database storage, massively parallel processing and flash memory, to name a few. But there is more to this challenge than just the volume of the data.
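To make one of these concrete, here is a minimal sketch of the map/shuffle/reduce pattern that MapReduce platforms parallelize across many machines; the sample records and field names are hypothetical:

```python
# Toy illustration of the map/shuffle/reduce pattern: total sales by region.
# Real MapReduce platforms run the map and reduce phases in parallel across
# many nodes; this sketch only shows the shape of the computation.
from collections import defaultdict

records = [
    {"region": "west", "sale": 120.0},
    {"region": "east", "sale": 75.5},
    {"region": "west", "sale": 30.0},
]

# Map: emit (key, value) pairs
mapped = [(r["region"], r["sale"]) for r in records]

# Shuffle: group values by key
grouped = defaultdict(list)
for key, value in mapped:
    grouped[key].append(value)

# Reduce: aggregate each group
totals = {key: sum(values) for key, values in grouped.items()}
print(totals)  # {'west': 150.0, 'east': 75.5}
```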

We are entering a new generation of DW/BI with an expanded set of business needs that pushes the data requirements in several ways. Just a few examples:

  • Analyzing unstructured and semi-structured content along with structured data
  • Integrating data from many sources, including federation of data not in the DW
  • Using a Complex Event Processing (CEP) engine to analyze incoming real-time data streams, possibly joining event data with data persisted in the DW (a minimal sketch follows this list)
  • Lower data latency – loading data into a DW for analysis on a continuous basis (vs. periodic batch load or frequent mini-batches)
  • High availability systems that support data analysis for mission-critical applications
  • Automating data classification and business actions
  • Maintenance of detail data – More and more organizations are looking to the DW as a system of record to, among other things, ensure regulatory compliance, which means maintaining data lineage and detail.
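To illustrate the CEP bullet above, here is a minimal sketch of a sliding-window rule that enriches streaming events with reference data, the kind of logic a real CEP engine would express declaratively. The customer IDs, threshold, and reference table are all assumptions for illustration:

```python
# Illustrative sliding-window rule of the kind a CEP engine would express declaratively.
# Rule: alert when a customer generates THRESHOLD or more events within WINDOW_SECONDS,
# enriching the alert with reference data that would normally be persisted in the DW.
from collections import defaultdict, deque

WINDOW_SECONDS = 60
THRESHOLD = 3

dw_customer_segment = {"C-101": "premium", "C-202": "standard"}  # hypothetical DW lookup

windows = defaultdict(deque)  # customer_id -> timestamps of recent events

def on_event(customer_id, timestamp):
    w = windows[customer_id]
    w.append(timestamp)
    while w and timestamp - w[0] > WINDOW_SECONDS:  # evict events outside the window
        w.popleft()
    if len(w) >= THRESHOLD:
        segment = dw_customer_segment.get(customer_id, "unknown")
        print(f"ALERT: {customer_id} ({segment}) generated {len(w)} events in {WINDOW_SECONDS}s")

for cid, ts in [("C-101", 0), ("C-101", 20), ("C-101", 45), ("C-202", 50)]:
    on_event(cid, ts)
```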

An organization’s data volumes, the number and types of users, and its requirements in any or all of these areas will dictate the needed infrastructure, which will differ in each case. What is certain is that an infrastructure built today will need to handle these data volumes and address the requirements above, either now or in the near future, because they represent the direction in which BI is going.

I would like to say farewell, as I am retiring from the HP BI blog and this will be my last post.

Posted by HP Business Intelligence Solutions at 4:26 AM

May 6, 2010

Digital Marketing Agencies – the overlooked partner for software vendors and IT consulting companies

I'm excited to introduce guest blogger Simone Burrows. At HP, Simone is our business intelligence solution manager for the Health and Life Sciences industry. What a privilege to have someone who writes with such passion and conviction! Simone, I hope you can join us often! - John Santaferraro

Ripped jeans and cowboy boots mandatory.

The IT and Marketing functions of many organizations are merging, presenting a challenge for digital marketing agencies as they try to work out who they want to be when they grow up. While digital marketing agencies have largely recognized that this key client-side trend must be addressed, software and IT consulting companies don't seem to have the issue on their radar, and are missing a vital partnership opportunity as a result.

A recent blog post by Pro Bose, VP of Digital Strategy at a leading NY agency, addressed the issue from the agency perspective as follows:

"With digital marketing increasingly taking on a central role in the media matrix, technology enabled platforms and features such as personalization and real time optimization based on analytics etc are very hard to separate from strategic customer segmentation as a separate offering. Most industries have started to look at budget allocations for initiatives that intermingle the boundaries of IT and Marketing. There is no reason marketing agencies shouldn't productize offerings in collaboration with software (enablement) companies. And certainly not try to build it out internally - the past decade is strewn with carcasses of overpaid underdelivered technology products that were the excesses of ad agencies trying to build them."

So, the agencies seem to have learned their lesson and aren't trying to have a sideline in marketing software development - a wise choice. But what about the software vendors and IT consulting companies in the game of selling and implementing marketing analytics, web portals, campaign management or complete CRM suites? Well, despite their best efforts to really understand the needs of their customers, they just don't have the in-depth marketing 'know-how' of the ad agencies, and certainly aren't in the business of providing the creative brilliance that the best of them deliver. Yet they recognize that being able to 'talk marketing' when positioning their own marketing products and services is vital. So how do they take it up a level?

It is clear to me there's a huge opportunity for the two to work together. Just imagine a customer's surprise at being presented with a really tightly knit marketing solution. The customer research, digital strategy and creative content combine to form the marketing 'vision' from the marketing agency. Then we bring it to reality with the best information management, data integration, reporting, CRM and advanced analytics technologies. Maybe host it as an end-to-end marketing solution on a super-duper integrated data warehousing platform (HP Neoview comes to mind...). The IT consulting company ties all this together so their customer doesn't have to. Dreamy.

So, if it's such a great idea, why isn't this happening? I mentioned that most software vendors and IT consulting companies just don't have the agencies on their radar. But Pro Bose also points out a key barrier -

"Cultural context is everything and product companies and agencies have cultures that are sand and water."

Hmm, no kidding.  Just try walking into the office of an ad agency suited and booted in your best consultancy clothes, only to find yourself terribly overdressed compared to the cool agency folk in their ripped jeans and cowboy boots. 

Software vendors and consulting companies have enough problems making their alliances with other software vendors and consulting companies bear fruit.  Meetings, dinners, promises made, not so many jointly closed deals. 

The same lesson about these 'traditional' alliances also applies to making new alliances with ad agencies successful. It's all about relationships. You can sign all the fancy agreements and alliances paperwork you want, but unless the person at Company A likes the person at Company B, nothing will ever get off the ground.

So, a call to the Alliances organizations at marketing software vendors and consulting companies - it's time to shed the corporate clothing, change into your best ripped jeans and cowboy boots, and start brainstorming end-to-end marketing solutions for your customers.

Check out Pro Bose's blog!

Post by Simone Burrows
HP Business Intelligence Solutions
Health & Life Sciences Industry

Posted by HP Business Intelligence Solutions at 10:16 AM

April 27, 2010

Answers to a Reporter's Questions About Pervasive BI

After publishing the recent post about Pervasive BI, I have done several interviews covering the topic in more depth. Here is the dialogue I had with Canadian reporter David Hamilton of Web Host Industry Review (WHIR) who asked some interesting questions.

WHIR: Some business people I have spoken to have said that what they’ve seen over the past 18 months or so has had them question a lot of their conventional wisdom on how they operate their organizations. Do you see the rise in business intelligence use corresponding to the declining economic conditions during the financial crisis?

VF: With the recession have come emerging regulatory and transparency requirements, along with the need to better manage risk, control fraud, and not just increase the ability to cross-sell and up-sell, but improve the whole customer experience and provide better service as a means of retaining customers and increasing their long-term value.

Prior to the financial crisis, most companies’ BI focus was on generating a consistent set of accurate financial reports to help them understand and manage the business. The goal was to achieve a single version of the truth of (historical) data in the data warehouse for analysis. But analysis based on history means you’re using yesterday’s results to determine tomorrow’s performance. As a result, many companies were beginning to make their data warehouses and BI systems more operational and predictive.

Some companies kept data mart proliferation in check, but many did not. Where there was data mart consolidation, the approach was often to move the existing data structure (problems and all) to a new platform. In most cases, data quality, data governance, and MDM were left to be dealt with later.

With the onset of the financial crisis and economic uncertainty, priorities shifted. IT went into cost-cutting mode, scrutinizing acquisition costs and payback periods more closely, and replacing traditional data marts and warehouses with new low-cost appliances. Dollars per terabyte moved to the forefront in evaluating data warehouse platforms. Low-cost and open source BI tools took hold in the market. The need to improve staff productivity put emphasis on increasing the efficiency of costly data integration processes. We have seen increased investments in the past couple of years in data quality, data governance and MDM products and services.

As we begin to see a leveling off of the economic decline, we’re seeing another shift. The need to cut and control costs continues. There is a sentiment that this is not a temporary recession, but a new economic reality. At the same time, competition is keener, driving the need for better customer management. There is a recognition that if you can’t respond to conditions in the moment, you can’t influence their outcome. It is this dilemma of extended cost control while having to increase customer service that is driving a change in companies’ use of business intelligence. There is a strong belief that BI can be tapped to help control business and IT costs, but at the same time, it is imperative that companies do a much better job of exploiting data as a valuable asset: doing predictive analysis, making use of data generated by social media and other unstructured content, optimizing operational decisions, and so on.

In the “Top 10 Trends in Business Intelligence for 2010” whitepaper from HP, we note the likelihood that companies and industries will come out of the recession with a transformation agenda. They will look to IT, and specifically BI, as an important lever for transformation initiatives.

This will call for a different BI infrastructure than what companies have been using, which explains why, in a recent survey by The Data Warehousing Institute (TDWI), 46% of respondents indicated that they anticipate replacing their current primary data warehouse platform by 2012. Here are just a few examples:

  • More collaboration with partners, suppliers, trade associations, consortia, industry information exchanges. This means better quality data and a common data framework for sharing and reuse.
  • More analytic support for operational decisions and processes. This means lower data latency, real-time analysis added to the current BI load, and high availability systems.
  • Improved decisions. This involves integrating data from many different sources in real time, including seemingly insignificant and unrelated events.

In summary, we see both a rise in use of BI as well as a different use, driven by not just the recession itself, but the widespread expectation of a new economic reality as we move forward that will require a fundamental change in the way we make use of business data.

WHIR: Unlike some other business services, BI is not one-size-fits-all when it comes to implementation. When a business comes to HP asking for BI, how involved do businesses have to be in order to ensure very pervasive BI?

VF: The goal may not necessarily be for BI to be pervasive, although wise organizations would like to leverage and maximize their investment in DW/BI. The initial goal is to address a business problem or opportunity. The more business people can be involved in the design, development and implementation of the BI system, the more likely it is that it will meet their needs. In addition to defining requirements, they can act as advocates to the rest of the organization in validating and encouraging use of the DW/BI system and communicating its success.

One of the things that HP helps clients with is overcoming the cultural and political issues in sharing and integrating data. HP guides clients in the establishment of not just a data warehouse, but a trusted data environment that will allow business users to standardize data usage across business teams and develop a thriving data user community.

The use of BI has not become pervasive in organizations; only about 24% of employees use it. In my post about pervasive BI, I discussed many of the obstacles that must be overcome before it can be more pervasive.

In the next post, I will answer more of David Hamilton’s questions, including this very interesting one:

WHIR: What are some of the unforeseen or unexpected benefits of BI for organizations?

Posted by HP Business Intelligence Solutions at 8:31 AM

April 21, 2010

Live BI, CeNSE and BRAIN are HP Labs’ newest projects for BI

A couple of my recent posts have talked about innovation and modernist thinking for business intelligence and data warehousing. I thought it would be fun for you to hear more about what's going on in HP Labs. This stuff is true BI innovation.

While HP Labs has a number of innovative projects involving information management, the CeNSE project is especially remarkable. I mentioned it in my previous post on Information in Motion. HP's Central Nervous System for the Earth (CeNSE) is designed to radically change the way we gather, communicate and analyze information. CeNSE combines nanotechnology, networking and business analytics breakthroughs from the Labs to make the world a better, safer, more efficient place for all of us.

With CeNSE, billions of nano-scale sensors will be placed around the globe and will be used to feel, taste, smell, see and hear the world around us.  Then, using powerful computing networks, the information can be analyzed and actions taken in real-time to benefit society.  For example, with CeNSE, people will be able to prepare for an earthquake before it hits, or anticipate that a building is due to collapse in time to evacuate all tenants. Think of numerous applications where we can monitor what is happening around the world, respond with intelligence, and make the world a safer and more sustainable place.

BRAIN (Behaviorally Robust Aggregation of Information) is a method of more accurately predicting behavior and improving forecasts by taking the bias, manipulation and hierarchy out of prediction. My good friend Colin White would call this Decision Science.

BRAIN uses powerful algorithms to more accurately predict behavior and improve forecasting. Imagine being able to use the simplicity of a survey to discover the truth and predict the future based on common market data. It is designed to perform in environments where market mechanisms are either not useful (due to the environment) or non-viable for complexity reasons.

How perfect is this for the uncertainty faced by our post-recession economy? It's a way of addressing the need for "Pattern-based Strategies," a concept recently evangelized by lead analysts and executives at the Gartner Group.

Live BI is a project that takes this new information in motion and information everywhere from sensors and pushes it toward real-time. It is a unified data and analytics platform that allows much more powerful and sophisticated analysis of highly complex data in real time. Imagine being able to take the pulse of the planet at any moment in time. There are countless uses for global data on temperatures, vibrations, pressures, chemical readings and more from all over the planet.

A few days ago, HP published its annual Global Citizenship Report with more details in the Information Explosion section. Check out the report and let me know your thoughts.  What do you think about the interesting new innovations that HP is pursuing to help harness the information explosion?  What projects or initiatives do you find most interesting?

This is why I love my company. HP is committed to using technology in a number of areas that really will make our world better: improving the economy, healthcare, education, the environment and more.

Post by John Santaferraro
Twitter: santaferraro

 

Posted by HP Business Intelligence Solutions at 4:05 PM

April 15, 2010

Information in Motion: A Key Driver for Business Intelligence Modernization

We have transitioned from a world of static data and transactional batch loads to a world of information in motion, one where the old, static data warehouse is rapidly being replaced by systems that stream data in and stream intelligence out. The streaming data sources are on the increase, the need for near real-time data is increasing, and the volume of information in motion is staggering.

The influx of new streams of data is beyond what we ever imagined possible. Consider sensors in the digital oilfield, monitoring of the smart grid, smart meters and home monitors, the digital hospital, the digital healthcare system, location data flowing from mobile devices, and RFID on everything from shipments to assets to people in the field.

For example, on an energy smart grid there are monitors that read electrical phase 60-100 times per second, and thousands of these phase monitors across the entire grid. In addition, millions of energy smart meters on homes and businesses send reads back to a central repository every 15 minutes, in some cases more frequently, producing terabytes of data every single day.
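For a rough sense of scale, here is the back-of-envelope arithmetic; the meter count and record size are assumptions for illustration, not figures from any actual utility:

```python
# Back-of-envelope estimate of daily smart-meter data volume.
# All figures below are illustrative assumptions, not measurements from a real grid.
meters = 10_000_000            # assumed number of smart meters on the grid
reads_per_day = 24 * 60 // 15  # one read every 15 minutes = 96 reads/day
bytes_per_read = 2_000         # assumed record size, including metadata and overhead

daily_bytes = meters * reads_per_day * bytes_per_read
print(f"{daily_bytes / 1e12:.2f} TB per day")  # ~1.92 TB/day under these assumptions
```

And that is before counting the phase monitors sampling dozens of times per second.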

Along with data pouring in, there is a need for intelligence pouring out. Analyzing near real-time phase readings can drive the optimization of the energy grid and help identify potential outages before they occur. Every smart meter can also be connected to a web-based portal in each home or business, and when readings are analyzed in near real-time, that intelligence provides insight into customer offers or automated control of home appliances.

On the digital oilfield, there has been a global effort to modernize drilling rigs and production platforms by covering them with devices that monitor pressure, temperature, flow and a host of other measurements. The result is massive amounts of data pouring into data historians and operational systems, all very useful from an analytic perspective.

To add to oilfield information in motion, HP and Shell Oil recently announced a joint project to provide a wireless sensing system to acquire extremely high-resolution seismic data. HP combines sensor research originally intended for its printers with memristor technology, which merges memory and logic in a package so small and so low-power that it enables a pushpin-sized sensor. These sensors are about 1,000 times more sensitive than today's mass-produced devices.

There is amazing potential for harnessing billions of drilling and production readings, millions of engineer interactions, and hundreds of thousands of financial transactions every day into digital oilfield intelligence solutions. Streaming out as intelligence, there are opportunities to predict outages or slowdowns in production, automate decision making in the field, and ultimately, to increase well profitability by individual well, basin, or region.

Business intelligence and data warehousing are changing right before our eyes. Without question, the changes mentioned above drive the need for BI modernization. Traditional data warehousing and business intelligence systems were not designed for this kind of environment. When you consider the matrix of information in motion, here are some questions to consider:

  • How do I tap into the information in motion in near real time?
  • What data is most relevant to my near real time needs?
  • What intelligence do I need to stream out of my near real time feeds?
  • What changes do I need to make to my BI and DW platforms to support the new paradigm of information in motion?
  • What is the value of these new sources of data and intelligence for my business?

And just for the record, HP Neoview Advantage was built for such a world as this!

For some fun reading, check out the blog post on CeNSE (Central Nervous System for the Earth), or just watch the video...

 

Post by John Santaferraro
Twitter: santaferraro

Posted by HP Business Intelligence Solutions at 12:48 PM

Why Isn't Business Intelligence More Pervasive?

Pervasive business intelligence (BI) is loosely defined as BI for the masses; empowering everyone in the organization, at all levels, with analytics, alerts and feedback mechanisms to make the right decision at the right time.

Pervasive BI: The Hype

For the past several years there have been widespread projections of pervasive BI: accessible by employees, partners, and customers by the, and I quote, “millions.” No longer just for “quants”: with today’s tools, “anyone with access to a computer, PDA or smartphone can do analytics.”

Pervasive BI: The Reality

 Fueled by the possibilities described in Thomas Davenport’s 2006 Harvard Business Review article “Competing on Analytics,” many organizations aim to deploy advanced analytics and extend use of the data warehouse (DW) to a wider community of users, supporting operational as well as strategic decisions.

But the reality of pervasive BI has fallen far short of the hype. In a case study published in August 2008, IDC analysts made the point that “Despite the fact that the term Business Intelligence was first coined in 1958 and the first BI software tools emerged in the 1970s, BI is not truly pervasive in any organization.”

And according to BI tools guru Cindi Howson, in 2009 the percentage of employees using BI was 24%, down from 25% in 2007. Even today, BI is used mostly by business analysts and power users (roughly 50% penetration vs. 23% use among field staff).

Pervasive BI: The Obstacles

It’s not for lack of promise; there are a number of reasons why BI has not become more pervasive, despite the intent of many organizations.

  • Tools have historically been expensive and, in many cases, too difficult for the business masses to use.
  • Tools and the way they’ve been implemented don’t always match the needs of all users.
  • The BI systems in place today were built for strategic decisions, the sweet spot of traditional BI. They don’t lend themselves easily to expansion to real-time analysis based on real-time data to support operational decisions.
  • Most organizations strive to protect and leverage the investment that they’ve already made. It’s difficult to build out a system that supports everyone from a conglomeration of departmental data marts, each designed for a single specific application or use.
  • BI has historically been considered discretionary. Investing in it to exploit data beyond support for business analysts has not been deemed a high priority.
  • Despite a growing trend toward fact-based decision-making, many companies adhere to a culture of making decisions by gut feel.
  • Even after 20 years of building DW/BI systems, few organizations have established data integration and enterprise information management as a strategic competence. Issues of poor data quality, data hidden in tightly coupled systems, as well as lack of master data management, data governance, metadata management or any automated means to reconcile differences between data from different sources are widespread and create barriers to reliable analytics.

BI is critical to competitive success, but before organizations can take full advantage, they need to overcome these obstacles.

Technology to the Rescue

Not to worry. In the high tech industry, there is never a shortage of solutions. Here are just a few of the new technologies (and some not so new, but marketed with renewed vigor) standing ready to make BI more pervasive:

  • Open source BI tools – Solve the price issue and you’ve solved the problem.
  • SaaS BI – By replacing upfront licensing fees and yearly maintenance costs with a monthly or annual subscription fee, SaaS BI “breaks the barriers preventing the spread of BI outside the executive suite.”
  • Leveraging new architecture – One vendor offers “components for building pervasive BI systems where data is stored in memory, allowing quick access and easy scaling with growing demands.”
  • Excel – the interface users know and love, with access to live data
  • Web-based BI – enables deployment of “large groups of users at the click of a mouse.”
  • Overlay mash-ups and dashboards – In the world of web-based BI, these two main kinds of mash-ups are “among several BI 2.0 innovations that help spread BI wider and deeper across organizations; a concept known as pervasive BI.” 

Pervasive BI: The First Step

Depending on your infrastructure, one or more of these may help ease the issues of performance, data access, cost, and management overhead. But fundamentally, inconsistent data does create barriers to reliable analytics and strategic use of data. Organizations need to overcome that before BI can be pervasive.

HP consultants worked with client Steven Yon, Division Director, Common Services, Corporate Operations and Information Services at National City, one of the largest financial holding companies in the US, to develop a standard process for enterprise information management and information governance. He talks about how important it was to establish an environment of data governance with consistent metrics and rules, allowing National City to exploit their data as a strategic asset in order to better serve their customers: “If data must go through several conversion steps before it can be used, it makes it difficult to respond to customer needs. We can’t be agile unless data is integrated and coherent. The result is better customer experience. Differentiating ourselves through insight into customer behavior allows us to win relationships even in a commodity business.”

Our research and consulting work at HP indicate an increasing recognition that to truly make the most of BI and analytics, you need to “get the data right.”

Click here for more information about HP’s Information Governance Services to help clients protect, manage and develop data as a valued asset.

Posted by HP Business Intelligence Solutions at 4:47 AM

April 8, 2010

Look Who's Talking About Connected Intelligence

With our recent talk about connected intelligence at HP, I thought it was interesting that Tim Berners-Lee, the inventor of the World Wide Web, is talking about the value of connected data. Rather than trying to wax eloquent, I'll let him speak for himself. This is his presentation at the TED Conference in 2009. Check this out...

 Post by John Santaferraro
Twitter: santaferraro

 

Posted by HP Business Intelligence Solutions at 11:06 AM

April 5, 2010

Are You Ready? A New Generation of Data Warehouse and BI Systems is Here

“You’re gonna need a bigger disk.” That’s what Amity Police Chief Brody would say if he were chasing down today’s voracious data analysis needs.

A recent InformationWeek Analytics 2010 State of Enterprise Storage survey revealed “an alarming state of affairs.” Nearly half of the IT professionals said they had insufficient storage resources for critical applications, up from 30% the previous year.

I doubt that anyone has gone more than a week or two without reading the latest statistics on the dramatic rate of data growth, so I won’t repeat any here. But that reality, along with regulatory requirements to keep much of the data longer is putting a strain on IT storage budgets and causing IT to search for ways to reduce that strain: SANs, cheaper drives, data compression, multi-temperature storage, etc.

And yet, according to industry analysts, in the data warehousing/business intelligence (DW/BI) world there was a resurgence in independent data marts in 2009 – the same year in which storage budgets were reaching a breaking point. Wouldn’t consolidation, or the reduction of persistent data marts (which are the embodiment of redundant data and therefore wasted storage) make more sense in the belt-tightened reality of the new economy?

A rise in data marts means not only more disk drives, but more processors, memory, ETL scripts to manage, maintenance, floor space, power, and overall system management, with less corporate control over the reliability of the data. One way to help meet today’s users’ appetite for data while maintaining its security and control is to stop duplicating it. Keep it in one place and manage the access, providing all users with the ability to analyze it as they see fit.

What is the reason for this increase in data marts after a data mart consolidation trend that dates back to late 2001? I think there are several.

  • The onus of success: When a data warehouse team successfully provides a trusted analytic data environment, users line up to get access to it. This often results in a system overload and offloading some of those users onto a low-cost mart.
  • Growing analytic needs: Statistical analysis, data mining, predictive models, visualization, etc. have spawned adoption of specialized analytic data marts or sandboxes.
  • New implementations: A growth area for data warehousing is first-time implementations (or first serious undertakings) in emerging markets and medium-sized businesses. Despite what we have learned about best practices over the past 20 years, many new implementations forgo the “single version of the truth” centralized EDW approach and build siloed departmental data marts, learning the same lessons of their predecessors the hard way.
  • Operational BI: Low data latency, real-time analysis and transactional queries have put a strain on traditional data warehouses and caused a resurgence of operational data store (ODS) or ODS-like platforms.

That last point gets at the current challenge of trying to run and manage multiple types of workloads all on the same system with good performance and the ability to scale to high volumes, all without increasing the administrative overhead. This is known as a “mixed workload” which could easily include a mix of batch data load, continuous real-time data load, a large number of standard reports, pre-determined as well as ad hoc queries, complex queries, data mining and operational analytics.

As industry experts point out, most existing data warehouse implementations are overwhelmed by mixed workload requirements, from both a performance and management perspective. In response to users’ needs, IT departments have resorted to multiple warehouse deployments. To be sure, today’s mixed workload reality is quite different from the world in which most of today’s DW/BI systems were developed. Tuning them to optimize for these conflicting requirements is not always possible. But proliferating specialized data marts is not feasible either.

In fact, in a survey by The Data Warehousing Institute (TDWI) regarding next generation data warehouse platforms, 46% of respondents say they are “planning a data warehouse platform replacement by 2012. Many others anticipate keeping their current platforms, but updating them significantly.” TDWI’s position is that “certain relatively new technologies, techniques, and business practices are driving the majority of data warehouses and their platforms toward a redesign, major retrofit, or even replacement significant enough to be recognized as a new generation.”

At HP, we agree that a new generation is at hand. In the “Top 10 Trends in Business Intelligence for 2010” whitepaper, we describe our view of the new generation of DW/BI systems and note that we are seeing a change in the BI culture. Leading companies are beginning to think not just about “informing the user” but also about “improving the process,” and using analytic results directly in the workflow context to drive operational execution and dynamic process change. We see the coming economic recovery as an impetus for organizations to invest in a new generation decision management system that will help them to transform their business for a new generation of uses and users (while easing the strain on their storage systems and budgets). The whitepaper discusses some of the technologies which hold promise for enabling this new generation of data warehouses and BI systems.

What do you see as the drivers of the new generation, and what form is it taking? I welcome your comments and wish you a happy 35th anniversary of the movie classic, “Jaws.”

Posted by HP Business Intelligence Solutions at 7:42 PM

April 2, 2010

Welcome to the World of Connected Intelligence!

Connect. It's what we do with dots. It's what we do online. It's what networking is all about. It's the last thing someone thinks about when they are building a stand-alone data mart on a single purpose database or analytic appliance. It's not on the mind of the business sponsor who ships his data off to a cloud in order to get a quick reporting need met.

It is the answer to our disconnected world of information. We've seen people build siloed applications and store data in silos for years. Now they are building business intelligence applications in silos. Marketing owns its own set of BI apps; sales has its own reporting systems; finance owns the applications that track profitability; and product data management lives in its own world.

In the world of connected intelligence, every element of business intelligence is tied together in a way that drives more business value and allows companies to respond more quickly to changing requirements.

Connected Intelligence starts with getting the data right. Most companies are already thinking of connecting their own data. We look beyond that to identify the value of connecting to outside information or looking at streams of events that would open up the world of instantaneous decisions. Industries like communications, finance, and energy have complex sets of data, complex data stores, complex data flows and complex networks of people who need information to do their jobs. This is an area that requires more than technology. It's the combination of people, processes, and technology that helps companies lay the foundation for connected intelligence.

I recently spoke to two veteran data warehouse and business intelligence leaders in the retail and manufacturing industries. Both had been on the job for more than 10 years. Both had built mature information management programs in their companies. And both were faced with the same challenge: "How do I take the still complex analytics I'm producing, simplify the content, and get it to the front lines in the form of a recommended action?"

Once you have the data right, the second level of connected intelligence helps companies use that new view of the enterprise in more ways than any other approach. It's all about connecting insight to action. Think of what you could do by embedding analytics in business processes or distributing the right analytic results to millions of front line decision makers every single day. When analytics reach the front lines, companies can finally make decisions at the speed of business and make every decision count.

Once the business is blanketed with analytics, the third level of connected intelligence opens the door to new business possibilities. What happens when you have connections between your business intelligence applications? What happens when you discover 5 or 10 new combinations of analytics that were never before possible in a disconnected environment? Innovation happens. You begin to discover new business models that support breakthrough initiatives. You monetize the value of information and transform your business.

In the utility industry, I was surprised that most companies don't know their actual cost of power. As a result, their pricing models tend to require a mix of regulatory guidance, prediction, and intuition. If these companies could connect trading information with regulatory, demand, grid, and customer information, they would be able to create models to automate the pricing function. If they were able to connect this kind of data in near real-time, they could create a dynamic pricing engine that could potentially change the way companies do business in their industry.
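To sketch what such a dynamic pricing engine might look like in outline, here is a small illustration; every input, coefficient, and field name is hypothetical, chosen only to show the idea of connecting cost, demand, grid, regulatory and customer data:

```python
# Hypothetical sketch of a dynamic pricing rule that blends near real-time feeds.
# None of the inputs or coefficients come from an actual utility; they only illustrate
# the idea of connecting trading, demand, grid, regulatory and customer data.

def dynamic_price(cost_of_power, forecast_demand_mw, grid_capacity_mw,
                  regulatory_cap, customer_segment):
    utilization = forecast_demand_mw / grid_capacity_mw
    scarcity_premium = max(0.0, utilization - 0.8) * 0.5          # surcharge as the grid nears capacity
    segment_discount = {"industrial": 0.05, "residential": 0.0}.get(customer_segment, 0.0)
    price = cost_of_power * (1.10 + scarcity_premium - segment_discount)  # 10% base margin
    return min(price, regulatory_cap)                             # never exceed the regulated ceiling

print(dynamic_price(cost_of_power=0.08, forecast_demand_mw=9200, grid_capacity_mw=10000,
                    regulatory_cap=0.14, customer_segment="residential"))  # about 0.093 $/kWh
```

The real value would come from feeding the inputs in near real-time rather than from the pricing formula itself, which any utility would tune to its own market and regulatory constraints.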

That's connected intelligence, an approach to BI that connects analytics in more ways than ever before to drive increasing business value and open the world of intelligent innovation.

And this is why I love working at HP! I love working with a team of innovators that are committed to helping companies use technology in ways they never before dreamed possible.

But I'm curious: what do you think about the idea of connected intelligence? I would love to hear your thoughts...

Post by John Santaferraro
Twitter: santaferraro

Posted by HP Business Intelligence Solutions at 10:53 AM

March 30, 2010

Innovation, Transformation, and the BICC

A recent tweet from an industry analyst intimated that he was meeting with a client who had spent $11 million on their business intelligence implementation over the last year and that they were still uncertain of the value they would be getting from the investment.  

This is a wake-up call for anyone not currently pursuing a business intelligence competency center. The verdict is in: companies who started their BICC efforts 3 to 4 years ago are now using BI to transform their businesses.

Since we've been helping companies align IT and business using the concept of a Business Intelligence Competency Center for more than 5 years, I've had the privilege of talking to companies who have mastered the art.  Just for the record, anyone can write a book, anyone can write a white paper, but I prefer talking to people who have done it and learning from them. That's exactly what I did; I talked to people who are seeing the benefits of a working BICC. Here are three trends that get me excited about BICC success.

TREND ONE: Business Intelligence Tied to Corporate Objectives

A common theme for BICC success is the right level of business sponsorship and the right mix of business and IT participation. With these two elements in place, we are seeing companies consistently tie their entire business intelligence program to very specific corporate objectives and align it with the overall mission and vision of the company. In some cases, companies are even mapping every single BI initiative to a specific business process or set of processes. This means that nothing is done in BI that doesn't have a direct impact on the business with measurable results.

TREND TWO: Move from BI Maintenance to Innovation

I'm surprised at the number of organizations that are still approaching BI as a means for driving more reports for the business. I'm equally surprised at the number of business users who are operating in a world of reports that dates back to the days of green-bar reports.

With a working BICC in place, companies are seeing the walls between business and IT come tumbling down. IT is discovering more of what the business actually needs to operate more efficiently. The business is getting a better understanding of what can be done with business intelligence technology. The end result: amazing efficiency.

Imagine what it costs to produce 6,000 reports. Consider what it would be like to reduce the number of reports produced to 180 and exceed the expectations of the business. Even better, think about the resources you just freed to work on innovative uses of business intelligence.

TREND THREE: Emergence of Trusted Information

Because a working BICC drives toward data ownership, data stewardship, and data governance, companies who are doing it right are experiencing significant improvements in data quality. Companies who have been driving this kind of governance program are actually changing the information culture of their companies.

In today's world, the first thing you do when you get a report is question the accuracy of the figures. In fact, it's likely that you will get a second report on the same topic, from another part of the company, and you will find conflicts in the results.

With the right governance in place, you can maintain data heritage in a way that informs anyone viewing a report of the sources of the data and the details of any transformation and cleansing that impact the end result. We are seeing companies move toward certified reports or certified sets of data, stamped with the mark of the BICC, and therefore trusted by the business users.

The culture shift is simple to understand. When people get used to getting trusted information, they will immediately question or reject anything that doesn't have the BICC stamp. It takes time and discipline to change the information culture of a company, but it's worth the journey.

So, these are just a few of the trends we are seeing from companies who have taken governance seriously. And this is a wake-up call for anyone not currently pursuing a business intelligence competency center. Or perhaps we should be talking about a Business Intelligence Innovation Center!

I'd love to hear about your success with Business Intelligence Competency Centers. What have you done that's worked well? What results have you seen? Either leave a comment below, or send me an email at john.santaferraro@hp.com.

Download this white paper (0.70MB, PDF) to read more about how to implement a Business Intelligence Competency Center.

You can also dig deeper into the Top 10 Trends in Business Intelligence for 2010.

Post by John Santaferraro
Twitter: santaferraro

Posted by HP Business Intelligence Solutions at 12:02 PM