I tried live tweeting a webinar I attended last week, and boy, were my fingers tired! (I multitasked, also working on a blog post and responding to the occasional instant message and e-mail.) Not sure how many of my blog readers also follow me on Twitter, so I thought I'd recap what was an interesting hour on cloud-based business intelligence. (You can follow me on Twitter @all1ann if you are so inclined.)
I gave about half my attention to Wayne Eckerson, director of TDWI Research, as he shared several slides detailing differences between cloud computing, software-as-a-service and on-demand infrastructure; the benefits of cloud computing; some common uses; and some areas to consider (integration capabilities, data-loading capabilities, etc.). I admit I wanted him to speed it up, but then I saw a slide that helped explain the need for a tutorial.
Just 9 percent of 183 BI professionals who attended TDWI's World Conference earlier this month described themselves as "very familiar" with cloud computing. Forty-two percent said they were "somewhat familiar" with cloud computing, and 49 percent were "not very familiar" with it. That number surprised me, as it feels like I spend a lot of time reading about cloud computing and a fair amount of time writing about it myself.
Still, despite those numbers, it looks as if we can expect a fairly big BI movement to the cloud within the next three years. While 85 percent of the attendees said none of their companies' BI solutions run in the public cloud today, that number shrank to 47 percent who felt the same would be true in three years. Thirteen percent said some portion of their BI solution was currently in the cloud, with 46 percent predicting this would happen within three years.
Most commonly found in the cloud today: BI tools, cited by 5 percent of respondents, with 12 percent expecting to see them in the cloud within three years. Other BI components: reports (4 percent today, 14 percent within three years); hardware (3 percent today, 8 percent within three years); data marts (2 percent today, 7 percent within three years); extract/transform/load (3 percent today, 4 percent within three years); and source data (4 percent today, 7 percent within three years). Other slides focused on the size of companies deploying BI in the cloud (most popular at companies with $500 million to $1 billion in annual revenues) and reasons for adoption (the three most popular were low cost, faster development and a lack of internal technology focus).
No offense to Wayne Eckerson, but I generally find real-world experiences from users more compelling than research numbers. Such was the case with the rest of the webinar, which was devoted to Shaklee CIO Ken Harris' discussion of his company's experience with products from webinar sponsor PivotLink. (Not to be confused with Microsoft's new PowerPivot BI technology, which I wrote about yesterday.)
Straight from a User
In a scenario that probably sounds familiar to at least some companies, Harris said Shaklee had huge amounts of data, a need for real-time analysis, an outdated and expensive data warehouse, and a growing number of internal and external users who wanted to access the data. The ultimate goal was for improved visibility into information and enhanced collaboration throughout the company. After evaluating both in-house and other cloud-based solutions, Shaklee selected PivotLink because it met several criteria, most notably being able to answer three "impossible" queries suggested by users within three weeks.
Harris said he "knows I've got a winner" because multiple users in multiple functional areas are using multiple applications without being forced to do so. Though the presentation slide indicates PivotLink's solution was implemented in 120 days, Harris twice said it was deployed in 90 days.
Though Eckerson had mentioned inability to load large amounts of data as a possible shortcoming of cloud-based BI, Harris said this has been a "non-issue" for Shaklee. The company loads all sales, customer, marketing and cost information into its data warehouse. It's "a lot" of data, loaded at least daily, in various formats.
Harris described data cleansing as "an ongoing process" for Shaklee. While IT led the cleansing process, all functional units were involved. Though it wasn't all properly cleansed the first time around, Harris said this wasn't necessarily a bad thing. Quickly loading data into the warehouse revealed problems in underlying source data that hadn't previously been obvious. So Shaklee knew where it needed to apply energy and resources.
The Usual Questions
Harris also addressed several key concerns associated with software-as-a-service and cloud computing: Is it secure? Is it really cost effective?
Security is "absolutely better" with PivotLink than with Shaklee's previous system, Harris said. He suggested simply treating a SaaS provider's facility as your own and asking internal security experts to perform a review. It should be "no different than internal analysis," he said.
As for cost, Harris said "it wasn't even close": SaaS represented a major savings for Shaklee. There are essentially no internal resources devoted to PivotLink's software, which he said yields large annual cost savings on hardware, staff and other items.
Harris also offered some good advice on finding the right partner, which is key. "With SaaS, you're signing up for an ongoing relationship," even more so than with on-premise software vendors, he said. Get a partner that "truly understands SaaS," and "SLAs are key."
I found his conclusion perhaps the most interesting part of the presentation. The challenges involved with SaaS are not as great as those associated with implementing BI, he said. I think he means that people and processes represent the biggest challenges. Those challenges are going to be there, no matter which software delivery model you choose.
You can view the slides from the presentation "When SaaS and On-Premise BI Collide: The Pros and Cons of Moving BI to the Cloud" using this link.