Some Insight on Affiliate Review Sites

Someone recently asked me, I believe during a conference session I was speaking in, how to deal with negative reviews left by site visitors on affiliate review sites for merchants the site is an affiliate of. The person mentioned that they had been deleting negative reviews of the products/services they were affiliates for as the reviews came in.

[headdesk]

I made it clear that I almost never* (see below) delete negative product reviews on my affiliate sites (or give a positive editorial review where one isn’t warranted), regardless of whether I’m an affiliate for the merchant in question or not.

Why not?

Several reasons…

User trust wins for long-term affiliate performance

In the grand scheme, my end goal is the size of my commission checks and my net revenue – not my earnings per visitor. As I said in my recent review of Traffic and Trust:

“My focus is about the users. If I have the users, I will find a way to monetize them.

In simple terms, I’d rather make 1 cent per visitor and have a million visitors than make 10 cents per visitor and have 1000 visitors. My goal is to make sure I have the best site on a given topic for my specific audience. I want to rank because I deserve to rank. I want to earn user trust, because if and when I do, they will become my biggest champions.

The more users I have, the more opportunities I have to make money. The more defensible my site becomes, in terms of both revenue and search engines. I will monetize my sites however possible, provided the user experience isn’t killed or tainted in the process. #unicorns”

People are more likely to leave negative reviews

I’ve made this statement publicly before, based merely on the experience of owning several large review sites (our site on prepaid wireless has over 14,000 consumer reviews alone, and it isn’t even our most trafficked site).

Now there is finally some actual statistical data to back it up. In June 2010, Nielsen released a study on Online Shopping Trends. In it, they revealed that, according to their survey, 32% of people in North America were more likely to share a negative review online than a positive one. The global figure was even higher, at 41%.


Sidenote: I’d highly recommend downloading the entire report, as it also gives insight into which industries users won’t buy a product in *without* consulting online reviews first. If you don’t have the ability to do review sites, I’d stay away from these sectors. And if you do, I’d look into getting into them. ;-)

People expect to see negative reviews

First off, as a consumer, I know I’d certainly find it odd if I went to read reviews on a product and saw that all 246 of them were positive. No product gets that many positive reviews without some negatives slipping in here and there. When all I see are positives, I figure you’re hiding something. You’re not showing me the “real deal,” and I’ve instantly lost trust in you as a source of information.

As an owner of multiple high-quality affiliate review sites, I’ve seen no proof that negative reviews correlate significantly with the number of folks who (1) click on affiliate links on the product page and (2) buy the item in question.

I took one of my review sites, one that gets six figures’ worth of clicks on “product links” per month, and gathered some data from it for the month of January. The data compares five products we affiliate for on that site (not everything we feature is a product we affiliate for) that are comparable in most respects (meaning I am not comparing a 99-cent screw with a four-hundred-dollar drill).

The chart below shows the number of affiliate clicks (raw), the number of search engine referrals the site received that month for each product’s specific “brand terms” (raw), the number of unique page views each “product page” on our site received (raw), the average consumer rating the product has on the site (relative: each “star” was worth 10,000) and the conversion rate for that product (relative: the actual conversion rate times two million), so you can easily see the correlations (or non-correlations).


As you can see, the product with the highest average review does not necessarily have the best overall conversion rate. As a matter of fact, the product with the lowest average consumer review has the second-highest conversion rate. Sure, it’s not ridiculously scientific, but this isn’t a “small sample” either. Like I said, this is a high-volume site in terms of traffic. And I see this across most of the review sites we build and manage. From what we’ve seen, if folks start out interested in a specific brand, it takes a lot to convince them otherwise (look at the unique page views of a product page compared to the conversion rate of that page).
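The relative scaling described above (each “star” worth 10,000, conversion rate times two million) can be sketched as follows. The product names and numbers here are hypothetical placeholders, not the site’s actual data:

```python
# Hypothetical per-product numbers, purely illustrative.
products = [
    # (name, affiliate_clicks, brand_referrals, unique_pvs, avg_stars, conv_rate)
    ("Product A", 12500, 8200, 30100, 4.6, 0.031),
    ("Product B", 9800, 5400, 22000, 3.1, 0.042),
]

def chart_row(name, clicks, referrals, upvs, stars, conv):
    """Put every metric on one comparable axis for a single chart:
    raw counts stay raw; stars are worth 10,000 each;
    conversion rate is multiplied by 2,000,000."""
    return {
        "product": name,
        "clicks": clicks,
        "brand_referrals": referrals,
        "unique_pvs": upvs,
        "rating_scaled": stars * 10_000,
        "conversion_scaled": conv * 2_000_000,
    }

for p in products:
    print(chart_row(*p))
```

The point of the scaling is just to make a 0.03 conversion rate and a 30,000-page-view count visible on the same axis; it doesn’t change which product “wins” on any metric.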

So why on earth would one risk the trust of their users (if this concept confuses you, try grabbing a copy of Traffic and Trust to learn it) and delete negative reviews when, again, in my experience it doesn’t produce an increase in conversions? I’m not saying you should meddle with the reviews even if it *did* increase conversions… but since it likely doesn’t, there really is zero reason to screw with the true opinions your users have about the products you provide information on.

Now, what do we see correlation on? Honestly?

What they see first on the site.

Much like studies have shown that clicks on and attention to sites listed in Google drop severely the further down the page they appear, we’ve found the same with on-page listings at review sites.

“Brand terms” excluded, people entering the sites on generic terms will often look at the products they see first (provided those products are relevant to the generic term). So if you sort products by average user review, you’ll probably see traffic (and, as a likely result, sales) dispersed the same way. If you did not sort them that way, you’d likely see traffic dispersed in the order in which the products are listed (we do on sites where they are not sorted by average review).
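As a rough sketch of the two listing orders described above (the products and ratings are hypothetical, purely illustrative):

```python
# Hypothetical product listings, purely illustrative.
products = [
    {"name": "Widget", "avg_rating": 3.2, "listed_at": 1},
    {"name": "Gadget", "avg_rating": 4.8, "listed_at": 2},
    {"name": "Gizmo",  "avg_rating": 4.1, "listed_at": 3},
]

# Sorting by average user review puts the best-reviewed product first,
# so that's where the eyeballs (and likely the sales) concentrate.
by_rating = sorted(products, key=lambda p: p["avg_rating"], reverse=True)

# Without that sort, traffic disperses in plain listing order instead.
by_listing = sorted(products, key=lambda p: p["listed_at"])

print([p["name"] for p in by_rating])
print([p["name"] for p in by_listing])
```

Either way, position on the page, not the review score itself, is what the traffic pattern follows.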

Now, I’m not saying reviews have NO effect on people’s buying patterns. I’m just saying that having negative reviews mixed in with positive ones doesn’t seem to “kill” sales of a product, a common fear among affiliates who allow reviews. Especially if your site receives a lot of traffic from folks looking for that specific product.

Granted, we also have high review volume, so it is unlikely that any product we have has ten negative reviews and no positive ones. I don’t have any personal data to pull from on sites where the review count is only a handful and all are negative. So, that situation might (and very well could) see different results. I can only speak for my own experience and data.

Reviews allow you to target keywords you otherwise couldn’t

Frankly, reviews allow you to target various misspellings you’d either look illiterate for dropping into editorial content or that you’d never think of.

So let Quora continue to show they are a clueless VC puppet by correcting the language, spelling and grammar mistakes of their community (hat tip to Michael Gray for mentioning that to me) and losing out on resulting search traffic. Not everyone who is smart and/or with money to spend spells like Webster.

My recommendation is to leave your user reviews alone in that respect. Let them bring in long tail traffic you’d never have thought of (or looked intelligent for) targeting.

When do we edit reviews?

Now, I’m not saying to have zero moderation when it comes to reviews. We moderate reviews for the basics. Meaning we will kill your review if you use hate speech; if you flame other users rather than keep the review to discussing the product at hand; if the same person spams us multiple times, leaving a horrible review for a company under nine different names (we let one stand); and/or if you leave a review with personal information (i.e. if you give out the home phone number of an executive at a company). I’m sure there might be a few other randomly occurring instances, but those are the main ones we deal with regularly.
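As a rough sketch of these basic moderation rules (the word list, phone pattern, and duplicate threshold here are placeholder assumptions, not the site’s real criteria):

```python
import re

# Placeholder hate-speech list and phone pattern; the actual moderation
# criteria aren't public, so these are illustrative assumptions only.
HATE_TERMS = {"slur1", "slur2"}
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def should_remove(review_text, prior_reviews_for_product_by_author):
    """Return True if a review trips one of the basic moderation rules."""
    text = review_text.lower()
    if any(term in text for term in HATE_TERMS):
        return True   # hate speech
    if PHONE_RE.search(review_text):
        return True   # personal info, e.g. a phone number
    if prior_reviews_for_product_by_author >= 1:
        return True   # same person reviewing the same product again: we let one stand
    return False
```

Flaming other users is the one rule above that a keyword filter can’t catch reliably; that call still needs a human moderator.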

About Rae Hoffman

Rae Hoffman aka "Sugarrae" is an affiliate marketing veteran and the CEO of PushFire, a search marketing agency specializing in SEO audits and link building strategies. She is also the author of the often controversial Sugarrae blog. You can connect with Rae via Twitter, Google+ and Facebook.

