Possibilities (and Possible Pitfalls) of Cloud-Based BI

Ann All

Just last week, I wrote about Shaklee's positive experiences with cloud-based business intelligence, shared by CIO Ken Harris during a recent TDWI webcast. Harris' key indicator of success: The company now has multiple users in multiple functional areas using multiple applications -- without being forced to do so. "That's how I know we've got a winner," said Harris. That might not sound like such a big deal, but it surely is given BI's poor track record.


Consider Forrester Research analyst Boris Evelson's discovery, after speaking to a few dozen business professionals, that not one of them relied on IT for their day-to-day information needs. That lack of interest in working with IT to solve their information issues -- which Evelson attributes to the high cost and inflexibility of standard BI solutions, slow deployment times and overall poor relationships with IT, among other factors -- has led business users to rely heavily on Excel spreadsheets. Though many users like Excel, Evelson says they want pre-built BI solutions that can better handle larger amounts of data and allow them to collaborate with co-workers.


For some companies, like Shaklee, that's cloud-based BI. In a TDWI interview, Datasource Consulting President Steve Dine recaps some of the benefits, several of which were also mentioned by Harris during the webcast: ability to easily scale computing resources, faster implementation times, lower upfront costs, ability to leverage the cloud for proofs of concept and upgrades, and ability to scale geographically.


Like Harris, Dine says cloud security can exceed the security found on a company's own premises. Harris made an interesting point: because concerns are more top-of-mind when trusting data to an "outsider," CIOs are less likely to sweep security issues under the rug with cloud providers and assume "everything is OK," as sometimes happens with internal security.


Dine says data transfer speeds can be an issue. While technologies such as Hadoop and MapReduce offer the ability to scale, they don't work well for all types of data. Other methods for transferring large data volumes can create additional overhead to daily processes -- one of the annoyances that cloud-based BI promises to eliminate. And more specifically:

When you're dynamically bringing up instances of virtualized servers, you don't know where those are located within a data center. You can put them within the same zone, but you can't necessarily co-locate those instances on the same box or rack. Therefore, you don't know what the network throughput is between your different cloud-located servers. Also, in most cases you can't really control the architecture of your data storage layer. You will likely be limited to software RAID and won't be able to choose the type of communication backbone between your CPU and storage. You're essentially locked into how the cloud vendor's storage is architected.
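Dine's point about unpredictable inter-instance throughput is at least measurable. Below is a minimal sketch in Python (the port and hosts are placeholders, not anything Dine described) that streams a test payload between two instances and reports the observed rate; run receive() on one box and send() on the other:

    # Minimal inter-instance throughput probe (placeholder port/hosts).
    import socket
    import time

    CHUNK = 1 << 20          # 1 MiB per send
    TOTAL = 100 * CHUNK      # 100 MiB test payload
    PORT = 5001              # arbitrary test port

    def receive():
        """Accept one connection, drain it, and report throughput."""
        srv = socket.socket()
        srv.bind(("", PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        received, start = 0, time.time()
        while True:
            data = conn.recv(CHUNK)
            if not data:
                break
            received += len(data)
        secs = time.time() - start
        print(f"{received / secs / 1e6:.1f} MB/s over {secs:.1f}s")

    def send(peer_host):
        """Stream TOTAL bytes to the receiving instance."""
        sock = socket.create_connection((peer_host, PORT))
        payload = b"\0" * CHUNK
        for _ in range(TOTAL // CHUNK):
            sock.sendall(payload)
        sock.close()

If the two instances happened to land far apart in the data center, the reported rate will show it.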

Harris downplayed data transfer speeds, calling it a "non-issue" for Shaklee, which uses a solution from PivotLink. Shaklee loads all of its sales, customer, marketing and cost information into its data warehouse. It's "a lot" of data, said Harris, loaded at least daily, in various formats. Commenting on my post, Ajay Dawar said some of PivotLink's customers have "billions of rows of data, thousands of internal and external users and multiple data sources and some have all three of the above, combined."


Dine encourages companies to carefully review cloud pricing structures, as they vary from provider to provider. A possible "gotcha": costs can vary by month or quarter, making usage-based billing a square peg trying to fit into the round hole of traditional corporate budgets.
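To see why that clashes with a flat budget line, consider a toy calculation (all rates and hours below are invented for illustration, not any provider's actual pricing):

    # Usage-based bills fluctuate; corporate budgets assume a flat line item.
    hourly_rate = 0.40                                 # $/instance-hour (hypothetical)
    monthly_instance_hours = [2200, 1800, 3900, 2600]  # varies with workload
    budget_per_month = 1000.00                         # what finance planned for

    for month, hours in enumerate(monthly_instance_hours, start=1):
        bill = hours * hourly_rate
        print(f"Month {month}: ${bill:,.2f} ({bill - budget_per_month:+,.2f} vs. budget)")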


Beyond specific pricing plans, Forrester's Evelson offers a comprehensive list of what to look for in a vendor offering cloud-based BI applications. Among his items:

  • Ability to modify, expose and reuse product functionality in other applications via APIs or Web services (a sketch of what this looks like follows the list).
  • Enough flexibility so the software doesn't depend on a single data source.
  • Favorable recommendations from Salesforce.com or other vendors upon whose platform, data source or analytics the BI product is based.
  • Functionality to support various styles of BI including reporting, querying, OLAP and dashboards.
  • References from customers in production (last two words are key).
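On the first item, reusing product functionality via a Web-service API generally looks something like the minimal Python sketch below. The endpoint, token and response shape are hypothetical placeholders; each vendor defines its own API:

    # Hypothetical example of reusing a hosted BI report from another application.
    import json
    import urllib.request

    API_BASE = "https://bi.example.com/api/v1"   # placeholder URL
    TOKEN = "..."                                # placeholder credential

    def run_report(report_id):
        """Execute a hosted report and return its result as parsed JSON."""
        req = urllib.request.Request(
            f"{API_BASE}/reports/{report_id}/run",
            headers={"Authorization": f"Bearer {TOKEN}"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # Another application can then embed the result, e.g.:
    # rows = run_report("weekly-sales")["rows"]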


Evelson also suggests four ways to mitigate the risks of cloud-based BI:

  • Back up your own data (a minimal backup sketch follows this list).
  • Line up a cloud-based Plan B that can be used if your vendor fails (remember LucidEra).
  • Get a commitment from Salesforce.com or similar vendor to support your SaaS analytics if your vendor fails.
  • Create a contingency plan for internal IT to take over and migrate cloud data into the company's enterprise BI solution, and periodically test it.
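On the first of those, backing up your own data can be as simple as a scheduled export pulled down to storage you control. A minimal sketch, assuming a hypothetical bulk-export URL (substitute your vendor's actual export API):

    # Nightly pull of a full export to dated local files (placeholder URL/path).
    import datetime
    import pathlib
    import urllib.request

    EXPORT_URL = "https://bi.example.com/api/v1/export?format=csv"  # placeholder
    BACKUP_DIR = pathlib.Path("/var/backups/saas-bi")

    def nightly_backup():
        """Download today's export and write it to a dated local file."""
        BACKUP_DIR.mkdir(parents=True, exist_ok=True)
        stamp = datetime.date.today().isoformat()
        target = BACKUP_DIR / f"export-{stamp}.csv"
        with urllib.request.urlopen(EXPORT_URL) as resp:
            target.write_bytes(resp.read())
        return target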


To end on a positive, rather than a scary, note: Datasource Consulting's Dine thinks cloud computing may finally lead to the long-promised "pervasive BI." He said:

We're starting to see licensing models change to meet the utilization-based computing model. In the near future, companies may no longer be constrained by large, upfront, user-based licensing fees. Software-as-a-service vendors are also leveraging the cloud, making it easier for small businesses to load and analyze their data with very little upfront cost and administrative overhead.


Comments
Dec 7, 2009 11:32 AM Dan Graham says:

Like Mr. Evelson, I believe that clouds are a tectonic shift in IT, rivaling the importance of eCommerce in the late '90s. But most analysts estimate clouds will only be 15-20% of an IT infrastructure by 2015. (WOW -- 20%!) And a great deal of that 20% will be private clouds, a variation of Infrastructure as a Service (IaaS).

Stephen Dine's comments on public clouds (also IaaS) cannot be compared to Shaklee on PivotLink (SaaS). IaaS has all the same characteristics of traditional do-it-yourself BI/EDW except the hardware is off premises. The enterprise still needs to work with IT on justifications, procurement, design, development, testing, and finally production. Transfer speed is always important: uploading 10 gigabytes at 100 Mbps takes at least 15 minutes, and a terabyte takes a couple of days. Contrary to this article, security is the #1 worry in every cloud survey by Forrester, IDC, and Gartner. The public cloud vendors are working hard on this, but some won't even let your auditors inspect the site and their procedures. Some vendors are good guys; others will get you fired.
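The arithmetic behind those figures is easy to reproduce; this quick Python check assumes a fully sustained link with no protocol overhead:

    def transfer_time(gigabytes, megabits_per_sec):
        """Seconds to move `gigabytes` of data at `megabits_per_sec`."""
        return gigabytes * 8e9 / (megabits_per_sec * 1e6)

    print(f"10 GB @ 100 Mbps: {transfer_time(10, 100) / 60:.0f} minutes")   # ~13 min at line rate
    print(f"1 TB @ 100 Mbps: {transfer_time(1000, 100) / 3600:.0f} hours")  # ~22 hours at line rate

At real-world sustained rates well below 100 Mbps, the terabyte case stretches toward a couple of days.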

Shaklee went down the SaaS path. When your BI SaaS vendor is a leader such as PivotLink, it's a great approach. By renting the final solution and not the tools to build it, a small or medium business can get into full production quickly. Shaklee did this right, bypassing the infrastructure and programming needed. Yet there are cautions. PivotLink is a startup. According to Gartner, only a few SaaS vendors are currently profitable. There is some risk. More important, SaaS solutions predefine the entire solution "their way." While SaaS vendors all claim unlimited BI flexibility and functions, their data models are limited and their front-end tools are not even close to the power of BI tools such as MicroStrategy, Business Objects, etc. There's nothing wrong with SaaS BI solutions if they solve your problems. Yet successful BI solutions tend to grow, and grow well beyond the initial design.

As the cloud vendors mature, so will we all. My company, Teradata, recently released the Teradata Enterprise Analytic Cloud, composed of three IaaS solutions (public and private cloud) and a SaaS partnership. Right now, internal private clouds are probably the best bet for secure, high-performance data marts and warehouses. In terms of performance, however, it's still difficult to beat the flexibility and service levels a strong IT operations staff can achieve with a central enterprise data warehouse.

Dan Graham

Teradata Corporation

Program Director Active Data Warehousing

Dec 14, 2009 2:25 AM Ajay Dawar says, in response to Dan Graham:

Dan

A very well-written response. I applaud your pointing to real numbers like data transfer rates and giving an on-the-ground perspective on the cloud. Separately, having worked at Informix and Red Brick as a kernel engineer in R&D, I have tremendous respect for Teradata's technology and scale.

I'd like to respond to some of the points that you made.

1. Data Transfer Speeds

You are right that at 100 Mbps, 1 terabyte might take a couple of days. Moreover, we all know that it is hard to even get 100 Mbps on a sustained basis. In many cases, though, this is a first-time ETL and not the amount of data that needs to be transferred every day for a daily refresh. PivotLink has customers that analyze terabytes of data, and the first-time ETL is handled differently than daily refreshes. Sometimes we face a bigger bottleneck: when extracting data from an application, the application itself doesn't pump data at 10 GB per 15 minutes. That said, our customers leverage compression technologies when sending this large-size data for the first-time ETL. The key challenge here is automation, and we have automated this process satisfactorily for our customers that have data in the terabytes range. All things considered, the 100 Mbps limit is not a practical hindrance to most SaaS deployments.
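The compression step can be as simple as gzipping the extract before upload -- text-heavy exports often shrink several-fold. A minimal sketch (file names are placeholders, not our actual process):

    # Gzip a large extract before shipping it for the first-time ETL.
    import gzip
    import shutil

    def compress_extract(src="full_extract.csv", dst="full_extract.csv.gz"):
        """Write a gzip-compressed copy of the extract for upload."""
        with open(src, "rb") as fin, gzip.open(dst, "wb") as fout:
            shutil.copyfileobj(fin, fout)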

I don't know of SaaS deployments at the petabyte scale with daily transfers of terabytes of new data. But with the progress made by Vertica, ParAccel, Aster Data, the people working on Hadoop and MapReduce, and eventually Cisco's push toward gigabit Ethernet, this is only a matter of time.

2. Company Profitability

If profitability were a deciding factor in whether to buy from a certain software vendor, then no software company would ever come into being -- unless the first sale made them profitable. No one would have signed up to be the first customer of Microsoft, Oracle, HP, Teradata, Salesforce.com...

The real issue here isn't profitability, but the cost of switching if the vendor went under. Profitability is important only insofar as it indicates how long the vendor can provide continued, reliable operations and deliver on promised SLAs.

Customers should ask their SaaS Vendors about:

- Data backups and recovering data

- Getting reports and analytical results back

- Adequacy of processes that support backup, recovery and uptime

In fact, the above is exactly what IT is qualified to assess. This is the part of IT's role that leverages its unique competence.

We have some proof that vendor switching works just fine in the SaaS BI space. When LucidEra went under, three vendors (including PivotLink) jumped in and were able to transition LucidEra customers to their respective offerings. PivotLink customers transitioned in under 90 days. It will be hard to find a company that switched from one of the Big 3 on-prem BI companies to another in under 90 days -- with NO IT overhead.

3. Security

More on that in the next post. Meanwhile, please find some information at www.pivotlink.com/sas-70

4. "Limited" Data Models and Claims of Ultimate Flexibility by SaaS Vendors

More on that in the next post.

