Last week, I wrote about TrustRadius, a business software review site, and how it differentiates itself from two other business software review sites that had recently come to my attention—G2 Crowd and IT Central Station. There’s more to the TrustRadius story that sheds an informative light on what’s driving the emergence of these crowdsourced review sites.
In my interview with TrustRadius co-founder and CEO Vinay Bhagat, I wanted to get a sense of what compels a business software user to invest the time and effort that’s required to provide the type of meaningful review that TrustRadius is looking for. He said he’s found several core motivations:
One is they just buy into the whole concept, and have personally felt the pain of evaluations in the past. They’re willing, therefore, to pay it forward and help build a fact base, just like people spend time on sites like Stack Overflow or Wikipedia. You’ve got a lot of invested people who are contributing a lot of their time to those types of sites, just because they really believe in the concept and in building the body of knowledge.

A second core motivation is to enhance one’s professional reputation. So when you write a review on TrustRadius, and as other people start to comment on or like it, your TrustRadius reputation score grows. We would like that TrustRadius score, longer term, to really be a meaningful metric, whereby if you’re a recruiter, you look at someone’s TrustRadius score to see how well they’re respected. We also think it could be a proxy for vendor influence. So if I’m in a procurement role, or I’m a VP of technology, and I have a high TrustRadius score, it means I’m trusted on the social network. Hopefully, that would have some sway with how vendors see that person’s ability to influence market decisions for that product. If you’re a consultant or a professional services provider, demonstrating that you have the ability to write authoritative content can be a branding and a lead-generating mechanism, as well. So we believe that reputation ultimately is a really key factor.

The third reason is that people are asked. Historically, a lot of social sites were very passive. So just by virtue of being asked, a lot of people say, “Thank you for asking for my expertise, my opinion—I haven’t been asked before.” Typically, people are more inclined to contribute content when they’ve had a very positive or very negative experience, or when they’ve just finished an implementation. So we’re trying to learn how to time our requests correctly, when it’s fresh in their heads—to understand the timing cycles of implementations.
Bhagat pointed out that TrustRadius vets every review before it goes live, and it frequently works with the person to enhance the review to make sure it’s fully balanced, comprehensible, and thorough:
We’ve intentionally introduced a fair amount of friction in the process to make sure we get high-quality content, because our thesis is that people will read five to 10 reviews—maybe more, but they’re not going to sift through 100 reviews—and they want those reviews to be from people they can trust, and people who are providing quality insights. … Ideally, those should be from five to 10 people directly like them. Probably what’s critical mass for any given product is 25 to 30 reviews, so that you can find enough people like you. Clearly, having more is better. But we don’t think people are going to want to sift through 100 unstructured comments in order to make a decision. They’d rather hear from five to 10 people who are like them, who are providing something of substance.
Bhagat elaborated on how TrustRadius attracts power users to provide reviews:
When we do our outbound research to identify people, we’re looking at people who seem to have in-depth product knowledge. Maybe they’ve blogged about a product; maybe they have a certification from a vendor. We certainly cast a wider net than that, as well. But the way that content is gathered on our network, by having someone go through a questionnaire, tends to filter out the people who don’t really have material knowledge about a product.
I asked about TrustRadius’s revenue model, and Bhagat said it’s still up in the air:
To be honest, we’ve got a thesis as to what our revenue model will be, but we’re focused more on trying to build a rich, vibrant community first. I chose to raise venture capital—we closed a $5 million round with the Mayfield Fund in early July—with a viewpoint that while monetization is clearly important, it’s more important to figure out how to build a clearly useful content asset first. Ultimately, the way that we aim to make money is through charging buyers for premium content, and vendors for premium services. We are testing monetization on the vendor side today; we have not done anything to try to monetize buyers as of yet.

So long-term, on the buyer side, one area I see as a possibility is selling curated reports—that would be an example of premium content that could either be purchased one-off, or through subscription. We’ve also had an expression of interest from some large enterprises who do purchasing as teams, where they would like to be able to do a bit more collaboration. So if they’re buying a complex product, and there are 10 people involved in the purchase cycle, and they want to have a place where they can share, and bookmark and note things—that would be an example of more of an enterprise subscription down the road.

On the vendor side, what we’re testing today is content licensing. So for a vendor that has a number of very positive reviews on TrustRadius, we’re giving them the option to license that content to use outside of our site, where they can post it on their website, or equip their sales reps with a PDF version of a TrustRadius review to use as collateral for sales prospecting. That’s something we’re just testing right now; we see other premium services for vendors over time. But again, the primary focus of the business today—99 percent of our energy—is all about how we source, structure, and provide something of real value to people, such that we become an important intermediary between the buyer and the seller.
Finally, I asked Bhagat what he foresees happening with the traditional IT market research firms like Gartner and Forrester—whether they’ll be forced to adopt more of a crowdsourced approach. He said he thinks there will always be a place for the traditional technology analyst:
I don’t know if the Gartner/Forrester model is going to be the long-term winning model. But they have huge incumbency and huge relationships, and I don’t believe any one of us is going to put them out of business in the next five to 10 years. What we’ve seen is the proliferation of boutique firms like Ray Wang at Constellation, and the folks at Altimeter, who are more specialized in nature. I would say their role is more forward-thinking—strategic guidance around trends in an industry.

We don’t see ourselves as analysts, per se. We see ourselves as curators of what the crowd is saying. In our case, we’re not trying to crowdsource en masse; we’re trying to find the pearls of wisdom within the crowd that are worth listening to. What we’re really trying to do is report on what people think about different products today, what business problems people are trying to solve, how well those products meet those needs, and what it’s really like in the trenches to work with one of these tools and one of these vendors.

I think that type of content is quite distinct from what the analysts are doing. To some degree, of course, Gartner and Forrester talk about tool effectiveness, and so forth. But what I see the boutique analysts trying to do is to be more forward-looking in nature, and I don’t really see that as necessarily our role. Our role is more to report on the here and now—what success is being achieved, and what people are looking for from these tools.