Last year, I asked a bunch of link development pros to sit down and do an interview on the topic of developing links with regard to SEO. What I ended up with as a result was a six-thousand-plus-word tutorial on developing backlinks.
The unique thing about this interview, from my perspective, was that not a single one of the five experts interviewed last year saw anyone else’s answers before the interview was published:
An important thing to remember about this group interview is that no one saw anyone else's answers before writing their own. This isn't about a single answer followed by four head nods. Any agreements come from true beliefs and any contrary opinions come from the same. We're all good at what we do, but it doesn't mean we always agree. ;-) I only saw the other answers as I put this post together and the others will see each other's opinions on the questions that were posed for the first time when this is published.
So, this year, we're doing it again… only this time, not only did all of last year's experts return, but we added six new ones to the mix. As a result, the below is over twelve thousand words of opinion on link development and, by extension, the organic search engine optimization landscape as a whole.
Once again, get ready to learn a bit about link development methods and theories from:
– Eric Ward, the Link Moses behind URL Wire
– Rand Fishkin from SEOMoz
– Roger Montti, the founder and owner of martinibuster.com
– Todd Malicoat of Stuntdubl and Clientside
– Justilien Gaspard, link columnist for SearchEngineWatch.com, author of his own link building blog and of a course for the SEMPO Institute
– Aaron Wall of SEO Book and Clientside SEM
– Debra Mastaler of Alliance Link and The Link Spiel
– Michael Gray of the Graywolf SEO Blog
– Andy Hagans, the lazy SEO of the Tropical SEO Blog
– Jim Boykin of We Build Pages and Internet Marketing Ninjas
– and myself, Rae Hoffman, CEO of Sugarrae and MFE Interactive
Each interviewee contributed one question that they wanted everyone else to answer (and they had to answer it themselves as well). Without further ado, grab a cup of coffee and dig in…
Question One: What are the top 5 or 10 “open” link sources that you still use? Things like social media sites, forums, directories, comments or sites with UGC options that can provide value – even if it’s only indirect because the links are nofollowed (but the exposure makes it worthwhile).
I'm a huge cheerleader of niche social media sites — you asked for 5 or 10 open link sources; there are 29 listed there. Of course they deliver less traffic than Digg or Stumble, but the traffic they drive is much more targeted, and it's also much easier to "hit" on them.
Wikipedia, Yahoo! Answers, Digg Comments, Yahoo! Directory and Flickr
That’s a loaded question.
Top 2 for Link Building
1. Niche & Vertical Directories (mostly paid ones – therefore, not so open)
2. Social Media
Top 3 for Exposure and Building Awareness
1. Social Media
I encourage clients to get involved in forums and blogs for their industry. That way, when some issue needs to be promoted, they have established profiles and can reach out.
The sites mentioned — forums, directories, etc. — are all viable options and we continue to use them in our link building efforts. However, I also look for large-scale venues with opt-in memberships as well as media lists, answer sites, topical networks, bookmarking sites, topic aggregators and blogging networks to draw links from.
Wikipedia and del.icio.us are two great places to pull sites from and then request links. And if the situation warrants, we'll also hit the contest, awards, and coupon sites as well.
Obviously, YouTube and many other video sharing sites are great for generating traffic and awareness. They might not bring link pop, but a well-made video helps argue your case. StumbleUpon is another such site. While you're at it, you might throw a few links and a positive review on the McAfee Site Advisor page.
An underutilized open link source is government websites at the local, state and federal levels. There are many angles for obtaining one-way inbound links from government sites. Creative searching will find numerous lists of commercial websites that meet certain conditions for obtaining the links. These are “open” to the extent that there are no restrictions except for meeting certain criteria. Sorry I’m not more explicit, but those who get it should be able to figure it out.
Yahoo! Directory, DMOZ, press releases… a few blog reviews if the review is on a relevant site (nofollowed links to keep your nose clean), and nofollowed banners on high-traffic relevant sites.
YouTube, Yahoo! Answers, quality directories like Best of the Web's regular website directory or blog directory, forums in whatever niche your site is involved in, niche social media sites… and as much as I hate to say it, Wikipedia. I also utilize relationships with other bloggers in the niche that we've spent time building. Press releases when you have a reason to write one. StumbleUpon, StumbleUpon, StumbleUpon… seriously… Stumble is great for driving traffic and building a site fan base.
Even though everyone still gripes about them, it's always worth a shot to make submissions at places like DMOZ and Wikipedia. Sometimes they get in and sometimes they don't; it's not worth losing sleep over, but they are worth submitting to. I always submit to the Yahoo! Directory – if you don't have $300 for that link, you shouldn't be running a website, and it's a quality hand-edited directory. The same principle applies to Best of the Web. I often submit to Business.com as well, as they give you multiple links. Aside from these sites, there are not many that I use for EVERY site that I work on.
If the content I'm building links and publicity to truly contributes to the target page experience, then I will use Wikipedia. But it has to be a fit that is indisputable. For example, Jazz.com (disclosure: client) was appropriate for several Wikipedia links, a couple of examples of which are http://en.wikipedia.org/wiki/Ted_Gioia (Jazz.com founder) and http://en.wikipedia.org/wiki/Brad_Mehldau. The key is to be honest with yourself about whether or not you are adding value to the Wikipedia article. Just because someone pays you money to get them links does not make them wiki-worthy.
I'll also sometimes use StumbleUpon, but rarely via a direct stumble. I have other ways to more politely get Stumble traction. CSEs can be a good source, but not for every topic. I have never used and still don't use directories, at least not the Yahoo! or DMOZ wannabe clones or any listed on those "731 directories you can submit to" type of lists. The only directories that I use are the ones I personally research and identify as being 100% germane to the content I'm linking. For example, if the site I'm linking is violin maker ifshinviolins.com (not a client), then I am only looking for small, heavily vetted directories like this. Sounds boring? Wrong. Get 30 or 40 links from those types of venues and you will rank, friend.
OK, I think a lot of people get worked up when they find "open" places that allow links without nofollow, and a lot of the time it's not worth it. For example, look at press release sites: even though the links are straight, Google "turned off" those sites' ability to pass PageRank or link juice ages ago. That said, you shouldn't ignore those places. At the very least, use them for their marketing value; those links still act as a "pointer" and can help search engines discover your content, even though they don't give it much juice.
Back to the question: for me, social media sites give you the most bang for your money/time investment. If you've done a good job creating content that's appealing to that audience, you have the ability to generate a huge amount of links. My favorites are StumbleUpon, Digg, Delicious, Propeller and Reddit.
Yahoo! Directory, Business.com, Work.com, JoeAnt.com, & BOTW. The fact that most of them have a fee (except Work.com) and they all have legitimate editorial review keeps out a lot of the riff raff and keeps those links providing value. As a bonus tip, I find that PPC ads work well if they are aligned with content that is link worthy.
Question Two: Are you afraid of talking about link building in public for fear that Matt might want to make you an example?
I wouldn’t fear any personal retribution :-) But I sure wouldn’t ever want to discuss one of my own sites. Google’s anti-spam team seems to apply their guidelines in a pretty subjective and random manner, so I don’t see any advantage to putting your own site in the limelight. I wouldn’t want to fall victim to another AWallgate.
Not really, but there are certainly clients we’ve worked with and are working with that I wouldn’t mention because I’d hate to have their link profile researched. Usually, it’s not even because of actions we’ve done or recommended, just due to previous SEO activity or unintentional links that would make them (and us) look bad.
Is this the former Soviet Union with the KGB? Hmmm, Matt Cutts did do a co-op in college with the CIA…
A little, but it’s less about being made an example of and more about devaluing a link source. I’ve been called out, so been there, survived that. When you talk publicly about issues relating to someone else’s business it’s only natural to expect backlash, comes with the territory. Besides, Matt is/seems like a decent person, I’ve seen people go at him and he always manages to keep his cool and respond in a professional way. More professional than I would be.
But… I am concerned about sharing open locations and tactics that can be monitored by the engines and devalued.
No, everything I discuss relates to building genuine citations, as well as citations that on visual inspection cannot be discerned as anything but a legit link. If your links can't withstand visual scrutiny, then you have to step back and seriously consider whether what you're buying or paying for is worth whatever short-term gains it's producing, because if it can't pass a hand check then you will have to cope with the possibility of penalties looming in your future should a competitor out you to the spam police, or should the site be randomly selected for a quality check.
For instance, you may not pass a hand check where the relationship between most of your links and the site subject is tenuous at best and strained to the point of ridiculousness at worst. All you have to do is ask: will it pass a hand check? If your answer is yes, then there is nothing to hide.
For everything you do, the big question should be: will it pass a hand check? This has been my guiding principle for many years now. In 2008, and probably for the foreseeable future, you're going to read more about this concept in link building blogs, and hear others discussing it more in link building sessions at conferences. Whether you're buying links or reciprocating them, the thought you should keep in mind is: will it pass a hand check? That's really where the top tier of link building is at.
I’d only be afraid if I was saying things that might piss off a search engine so that they’d want to make you an example. I always try to think, “If I were an engineer at a search engine, what would I think about what Jim is saying….and if I didn’t like what Jim was saying, what could I do.”
Loaded question… Obviously, I have called Google out for things I've disagreed with them about. I don't get nervous about mentioning my sites… hello people, this is 2008 and Google has long been a registrar. They can easily figure out what domains I own if they want to, since I register my domains with truthful information. But you'd hope (and I'd like to believe) they have more integrity than that as individuals, regardless of being part of big old Google. As for Matt specifically, he's not a "bad" person, LOL. Hey, maybe one day he'll even let go of the fact that years ago I was a spammer. But I won't hold my breath. ;-)
I don't think Matt targets people specifically unless they make a point of making themselves a target by acting like a jackass. I would be more concerned with outing certain techniques that will become less effective, since Matt's goal is improved search relevance. I think people get confused about Matt's goals a lot of the time – he may enjoy messing with SEOs on occasion, but his true goal is not that – it's improving relevance and reducing spam, period. When counteracting SEO techniques reduces spam and improves relevance, he's going to do what he can to counteract those techniques. It's unfortunate that people don't recognize that, despite being somewhat adversarial, the goals of SEOs and search engineers are not completely at odds.
I am happy to have Matt make an example of me anytime. People like to take shots at me and say I drink the Google Kool-Aid, to which I say: yep, I sure do, pour me another glass please.
It's not only Matt you need to be afraid of; there are other Googlers in "stealth mode" monitoring many of the forums and blogs. I said something in a post on SEOmoz a few weeks ago when Matt was on vacation, which coincidentally was removed from the #1 slot the afternoon after I mentioned it, so it's definitely something you need to be aware of. Clever folks know where they are reading and use it to their advantage.
General tips, no, but I know that if you are too specific, have an idea that works too well, and spread it too widely, you might get some sort of retribution for it.
Question Three: How much do you stress internal linking on your own or clients’ sites? Do you have a quick rule of thumb or strategy to maximize the effectiveness of internal links?
I think internal linkage is one of the most underrated ways to increase rankings and traffic. You can't just depend on your menus and sitemaps either – you have to work helpful internal links into your copy. I try to make sure every article on our sites has at least five internal links.
It’s one of the areas we spend the most time concentrating on for our large content clients, primarily because it has the greatest ability to impact rankings, inclusion and traffic. There’s no one rule of thumb we use, but we certainly have a set of best practices that we regularly follow and advise. You can find a lot of these on the SEOmoz blog – Link & Ranking Strategies for Enterprise Sites is a good example.
I stress internal linking a lot. In order to take full advantage of external links, a site needs a good internal linking structure. Otherwise, it is not maximizing all available resources.
Strategy? Keep it looking natural, link within your content. With larger sites, link in themes. For example, if a site has sections for all 50 states, then create a linking pattern based on that. New York pages link to other New York pages. Florida pages link to other Florida pages, etc.
Our link building campaigns have three parts: optimizing internal links (except for new sites), attracting external links, and refocusing existing links. In regards to internal linking: links should be absolute, and unnecessary stop words should be removed. Use keyword anchors when it makes sense, but don't overdo it, especially in the navigation areas. Add optimized contextual links in content areas and point to pages fully optimized for the hyperlinked phrases.
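To make that advice concrete, here is a minimal sketch (the domain, paths, and anchor text are all hypothetical) of an absolute, keyword-anchored contextual link next to the kind of link that wastes the opportunity:

```html
<!-- Absolute URL with a descriptive keyword anchor, placed in body copy -->
<p>We also review <a href="http://www.example.com/miami-resorts/">Miami resorts</a>
for winter travelers.</p>

<!-- Weaker: relative URL, generic anchor, no keyword signal -->
<p>For resort reviews, <a href="page2.html">click here</a>.</p>
```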
Basic site architecture is important. I like to group similar topics together and create clusters of similar topics that reference each other. This relates to hierarchical site organization, basic taxonomic structuring of web pages. I don't stress it so much as it's the normal course of doing things, like breathing. I don't do, nor do I recommend, turning important keywords within the body of the content into links, that sort of thing. It looks cheesy, like slapping on a handful of cologne. Aside from ROS links in the navigation, which should be managed carefully, I don't think that links within a site have enough power to help a site to merit the tradeoff of appearing somewhat spammy, which could affect its ability to attract links. In combination with other spammy-looking tactics, it could give a negative overall impression.
Only in an initial site review will we look at internal linking. My quick rule of thumb is “the more important the page, the more internal links it should have”.
Internal linking is so overlooked. The part about it most people miss is that having a great theme pyramid and accessible sitemap is only the bare bones. Good internal linking within your content is key (without overdoing it from a usability standpoint). Additionally, techniques like siloing have been created, with the nofollow tag allowing siloing to get even more detailed. My quick rule of thumb is always to link one mention of something we have information about elsewhere on the site to that content. Of course, doing this in moderation shouldn’t need to be stated. You shouldn’t have 100 links to internal pages within an article. Use common sense.
Internal linking is critical to search engine rankings. Brett Tabke's search engine theme pyramids post is the granddaddy document I use to explain just how important it is to clients and friends. The biggest rule of thumb is to not link often to (or have indexed) pages that have marginal value from an external site search perspective. This means that if a user would not find the page from Google, don't waste a lot of internal link juice on it. You can link to it all day for users, but you might as well slap a noindex, nofollow on it, and a nofollow on the link itself.
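That rule of thumb can be sketched in markup (the page and URL here are hypothetical): the standard robots meta tag goes on the low-value page, and rel="nofollow" goes on the internal links pointing at it:

```html
<!-- On the marginal page itself (e.g., a printer-friendly duplicate): -->
<meta name="robots" content="noindex, nofollow">

<!-- On internal links pointing to that page: -->
<a href="/print/widget-guide.html" rel="nofollow">Printer-friendly version</a>
```

Users can still reach the page, but it stays out of the index and no internal link juice is spent on it.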
If the site's existing rankings show me that Google likes what that site is doing, then internal linking is the single most important place in the world you can start your linking strategy.
You want to try to take advantage of your strongest sub-pages to spread around your link equity. If you have pages that have lots of inbound links, put some targeted anchor text back to other pages on your site. This is especially useful with linkbait-style pieces. Create a really exceptional informational resource, and spread the incoming link juice with optimal internal anchor text back to your money pages.
I think this is an issue that is screwed up on many sites… no alt tag on the site logo link to the homepage, flat link structure that puts loser categories in parallel with the winners, and no seasonal promotions or editor call-outs of best-selling items. The key to fixing this is to look at your sales data and web analytics to make sure you are promoting your best content. It is easier to turn a #6 ranking into a #2 ranking than it is to try to capture new keywords.
If your site is new to the field and you are unsure how comprehensive your coverage is, you can use Microsoft or Compete Search Analytics category reports and reports on competing sites' traffic to find the best keywords you should be targeting.
Question Four: If you had 7 days to train a link developer, which concepts would you focus on each day as the most important concepts?
Link baiting, link baiting, link baiting. Oh, and social media ;-)
7 Days, huh? I think I’d spend day one training and talking and days 2-7 giving them actual projects to work on. There’s nothing like real life experience (especially with deadlines and goals) to get someone invested in link development. Big concepts?
1. Content first, links second
2. What others say about you is more important than what you say about yourself, so get some independent referrals to help bolster your campaign
3. When building manual links, vary everything you can to make pattern detection as difficult as possible – and that includes temporal patterns!
1. Power of Links – How Search Engine Work & Basic SEO
2. History of Link Building – Tactics to Avoid
3. Finding, Identifying & Assessing Value of Links
4. Traditional Marketing & Public Relations Tactics
5. Guerrilla Marketing Tactics
6. Social Media & Viral Marketing
7. Influence & Negotiations
I’ve written a couple of articles at SearchEngineWatch.com on training link developers: 7 Tips for Training Link Developers and Training Link Developers to Become Marketing Gurus.
Day one: Definition of link popularity, the importance of anchor text, algorithmic influences
Day two: How to run, read and use data from link reports.
Day three: Internal link optimization and how to reclaim links.
Day four: Foundational linking tactics
Day five: Content generation and media tactics
Day six: Social media tactics
Day seven: Paid links, networks, overall review
Learn what gets a site banned, understand how visual inspections are bringing sites down, understand how multiple tactics when taken together can give a negative impression, and understand what the quality evaluators are looking for – and, more importantly, what those giving out links are looking for.
Too many old habits remain, even though the reasons for them have long disappeared. For instance, the outdated notion that PR 4+ links are the standard for links, and anything less is beneath consideration has no basis or foundation in a link building strategy today. Do not pay attention to the toolbar PageRank meter. Even if it’s gray or white, ignore it. Learn the important site evaluation metrics and go by those. In short, evaluate, seriously evaluate, the reasons why you are doing what you are doing, and try to see if there are alternate better ways of doing it. You’ll find new opportunities this way. Above all, ask if it will pass a hand check.
Day 1. Learn all about the website, the services, the products, the company, the competitors, etc.
Day 2. Analyze who links to your competitors and related sites and make a list of who you should contact
Day 3. Start contacting sites, proving you’re human and that you’ve been to the sites.
Day 4. Study “the art of the deal”.
Day 5. Learning the best pages to get links from.
Day 6. Learn to create ads that blend in and get clicks.
Day 7. Study Linkbaiting and create topics for linkbait.
I'm going to assume by the fact that they need training that they're totally green. In that case, they're working nights too. So, day one is to read Aaron Wall's SEO Book (which you should totally order using my affiliate link under this post) from front to back. It is a quick, concise "green guide" for anyone who needs to learn the basics of search engine optimization. This will save me several hours and my breath explaining the same concepts. Day two is learning to understand why links matter and the basics of link development (a presentation that was given to a small group of extremely green-to-SEO folks last summer and has never been posted online before – cheers). Day three would be learning keyword research and how to create angles and write in an engaging/net-friendly style (hell yes, my link developers write content)… you get links much more easily by publishing content that people want to link to, once you make them aware of it, than by getting people to link to content you wanted to write.
Day four is learning all about the Linkerati and how to develop relationships with other bloggers in the niche they’re working in. Day five would likely be learning the importance of anchor text, how to utilize it for best results when developing off site links and how to do internal linking properly from within the content they develop for promotion. I’d also focus on how to leverage the relationships they develop to get inbound anchor text changed from other sites if it isn’t optimal whenever possible. Day six would be learning to backtrack what the competition is doing to match their own efforts while utilizing the rest of their training to surpass them. Day seven would include social media training… not on gaming the current networks, but understanding how social media works in general. How to leverage it to build a site audience and how it affects things like universal search.
Day 1 – Explain the paradox of pagerank mattering
Day 2 – How to value a link – explain quality over quantity and relevance.
Day 3 – Have them read the masters and ask questions
Day 4 – Teach them where to start with tools
Day 5 – Show them how to write a link request
Day 6 – Create sample ads with them with proper deep links and co-citation.
Day 7 – Let them rest so they don’t jump out of an 8th story window
This is sort of another way of asking about link value factors, which I'm on record as saying – and have been insulted for saying – are overrated.
Day 1). What not to do part I Why social voting sites are pointless for 95% of web sites
Day 2). What not to do part II Why mass directory submission is pointless for 95% of web sites
Day 3). What not to do part III Why mass article submission is pointless for 95% of web sites
Day 4). What not to do part IV Why PR chasing paid links are pointless for 95% of web sites
Day 5). What not to do part V Why generalist social bookmarking is pointless for 95% of web sites
Day 6). What TO DO part I How to identify inbound link targets that will help your site succeed
Day 7). What TO DO part II How to obtain those links once you’ve identified them
Day 1 – anchor text, what it is, how to use it to your advantage, when linking in or out
Day 2 – directories, article syndication, and off site satellite content (ie squidoo)
Day 3 – Blogs, comments, forums
Day 4 – paid links, sponsored advertising, hosted content
Day 5 – backlink analysis via search engines and tools
Day 6 – review and pulling it all together into what makes links good, bad or useless
Day 7 – take a break and work on something else; you need time to sharpen the saw
Day 1 – a day following social media sites and blogs about their favorite hobbies to see what sort of stuff is considered remarkable and linkworthy.
Day 2 – read Seth Godin's Purple Cow to reinforce day 1.
Day 3 would introduce feed readers and tracking blogs. Introduce concept of personal bias as it relates to passion in organic links.
Day 4 all about looking credible, the power of anchor text, and the value of links from authoritative websites.
Day 5 Rae’s post about when unique content is not unique, and creating a list of 20 unique ideas that can be added to the site.
Day 6 create linkbait.
Day 7 market it via email.
Question Five: How will recent trends such as personalization and universal search affect the way SEOs develop and execute link building strategies?
Well I think any development that improves search relevancy is going to favor “real” citations over faked citations. There’s never been a higher value proposition for going whitehat in your link building efforts.
Not much, honestly, though there may be some vertical search integration systems that reward certain types of geo-targeted or content-targeted links, resulting in additional efforts being focused in those areas.
The focus will change from “getting links” to “getting in front of customers.” Additional attention will be given to more traditional marketing tactics. This could range from hiring a PR Firm (public relations, not pagerank) to get into the news, to promoting images, to creating information videos.
Specific tactics will really depend on the industry, since Google currently operates 14 different verticals that could be incorporated into universal search. Obviously, Google won’t be using all of them for each search term. Code search won’t fit in well when searching for “Miami Resorts,” yet maps and images are perfect.
The elements driving universal search (video, podcasts, photos, book listings etc.) provide opportunity to attract more links from varied sources given the range of elements. These elements enhance the quality of your content and when combined and promoted, increase your ability to reach a wider audience. In addition elements associated with universal search can be used for specific marketing initiatives such as reputation management. (Tip – create video rebuttals).
As for personalization – With toolbar browsing and search history data being incorporated into the search results as a way to augment organic link mapping, link builders need to find a way to use this data to locate the influential sites people are visiting. Keep a close eye on your stats programs and what’s coming back under search options like “similar pages” and query refinement suggestions to find additional/new/different websites to draw links from.
Definitely changes the game in terms of converting featured universal search blocks into feeders – the old piggyback SEO strategy – in order to gain links for traffic. So in that sense, it's about reassessing the reasons for obtaining a link. YouTube ranking? Build the videos and promote them into the SERPs. News ranking? Write some press releases and get them into the news, that sort of thing. I see the Universal search effort as an opportunity to get more positions in there. People involved with Local search have been dealing with this for a bit longer, and it's only becoming more intense: seeing what aggregator sites are featured in Local search, then getting aggregated with a link. However, some of the Universal Search components are short-lived in the SERPs, like the news blocks. Yet they're good for traffic, which falls into the links-for-traffic approach.
As far as personalization is concerned, I think it’s going to make some site owners consider the value of their content more, the usefulness of it, and most importantly thinking of ways users can share your content with friends to solidify blocks of people coming to your site and influencing the SERPs of friends who haven’t been to your site. Recommendation algos could play a part, so wrapping your mind around getting friends to visit may be one part of it. In terms of links, this might cause a reassessment of strategy, to go back to links for traffic. This is theoretical though because it’s still a bit off.
I don’t think it will change much, if anything, as far as link building strategies.
Smart SEOs won't have to adjust much because they've been marketing their asses off anyway and saw things like universal search coming. Smart SEOs have become, and are becoming, overall online marketers, forcing their link development strategies to get wider and more creative and, at times, become a side effect of an overall "marketing plan".
These trends may minimize the overall effectiveness of link building campaigns, but despite the declining value of return on investment for quality link hunting, links will remain an important algorithm variable for quite some time to come. The impact and cross-referencing of links as a variable reference to other data will be the important piece of the puzzle. An example of this may be using clickstream data (from toolbars, personalization information, etc.) as VALIDATORS for link popularity as a variable. This means that if you have 100 new links to your site, the clickstream data had better support that level of growth, or your site ranking may be dampened. This, of course, is all completely speculative, and is only one example of how this type of data MAY be used.
In the short term it’s a spamfest, with only the real leading edge folks getting any short term value from it. Over time, it will become much more useful to more and more sites, and actually spark a nice new round of creativity as folks create media elements they wouldn’t have before.
I can see a ski manufacturer choosing to create technical videos on the ski manufacturing process that they might not have ever created if not for the consumer video search potential. Today, nobody has. Let's see what the same search yields a year from now.
Google seems to be embracing personalized search the most, but I've never talked to a "normal" person who wanted it or thought it was good or helpful. The most reaction I ever got was a lukewarm "hmm, that's cool" followed by a "wait a minute, does that mean they are watching me?"
I love universal search; I think it provides the opportunity for sites that may not have authority on their own to leverage things like Google Maps, YouTube or other sites to drive traffic.
As Google's house content fills up the search results, ranking in the top 2 or 3 organic listings (after Google's house content) will be like ranking #5 in the past. So you need to submit YouTube videos and put yourself in other verticals as well.
Personalization means that you need to get a following. Real editorial sites with frequently updated content will replace thin affiliate sites as a more profitable strategy.
Question Six: Reciprocal links work. Do you recommend them, and how is the tactic different today? If you don’t recommend them, why not?
A normal linking “pattern” will have a small percentage of links which are reciprocal. So Google’s not going to penalize you for that (that is, for having a small percentage of links which are reciprocal). Link when it makes sense for your visitors and when it sends you good traffic. Don’t go overboard. And don’t read advice on this topic from before 2005 ;-)
I don’t recommend the “let’s trade links” emails or the pages with long lists of “link partners.” However, I certainly do recommend getting links from sites that you link to and vice versa. I think the patterns the search engines look for to discount reciprocal links try to test for unnatural linking, and oftentimes, interlinking between blogs and sites is one of the most natural processes on the web.
I don’t personally use reciprocal links because my focus is on the harder-to-get, one-way links. However, reciprocal links are natural, influence rankings, and are part of a “natural links profile.”
For clients who want to do that aspect on their own, I encourage them to offer links from within their content pages and to be very selective about choosing partners.
I also stress the need to avoid link software since many of these tools leave footprints or unhealthy linking patterns. Keep it looking natural and use reciprocal linking in moderation.
The power behind reciprocal links lies in the control you have to dictate the anchor text and where it points. I don’t recommend reciprocal links as a solo linking tactic, but it can be part of your overall link mix without issue. We’ll swap if it’s the only way to get a link from an authority site.
Some light link reciprocation is fine and makes sense in many ways. As I mentioned above, you may want to ask if it will pass a hand check. For instance, aggressive reciprocation, such as linking back and forth to another site in a related industry from your navigation bar may not pass a hand check, so in that case you may want to no-follow the links so that you’re doing it for the targeted traffic and sidestep penalty issues altogether. I would like to emphasize that the test for 2008 is going to be, will it pass a hand check?
I recommend that a small percentage of your backlinks can be reciprocal if they are highly related. I don’t recommend getting on “links pages” designed for “traders” though. I don’t think Google is a fan of counting links from links pages.
I feel like I’m beating my head into a brick wall every time this topic comes up. No one listens… “but, my competitors with a nine year old domain and tons of one way links because of self reinforcing authority whose reciprocal links are five years old on other nine year old sites rank!” – no shit sherlock… but maybe it’s due to the surrounding circumstances and not the face value reciprocal links. Reciprocal links have their place in a marketing plan… some of our sites have reciprocal links that weren’t “traded” – we linked to them and they eventually linked to us or vice versa due to quality and traffic. Reciprocal links are natural in moderation. Reciprocal links pages, automated programs… let them die in peace. New sites have different rules than older sites. Develop traffic people… develop traffic.
I find it cumbersome to beat dead horses. Reciprocal links are sometimes natural. If it’s natural, you can have recip links. Don’t get carried away. Peanutbutter.com should always link to Jelly.com – the world wouldn’t be right if they didn’t link to each other.
Reciprocal links absolutely positively DO WORK, so long as the motivation and impetus for the swap in the first place had nothing to do with search rank and everything to do with relevance.
If these two sites want to link to each other then it’s ludicrous for any link builder or consultant to tell them there’s no value in it.
On the other hand, if Garth’s Body Piercing site wants to swap links with Gertrude’s Cribbage Boards site because they think it will help their search rank, then they are either both idiots or were sold link building services by idiots.
Sure they still work; what doesn’t work are the incredibly foot printable recip directories and pages everyone added to their sites a few years ago. If I get a link from the New York Times tomorrow, does anybody really think the value of that link is going to drop by any meaningful amount if I link back to that page? Of course not. Look at things like blog carnivals: they’re one huge interlinked reciprocal link fest, but the content and structure is completely different from the old style of link exchanges, so there’s still value. When this interview gets published I am absolutely going to link to it.
I value my time quite highly, so on that basis I have not relied too heavily on reciprocal links for a few years. I find it is generally cheaper just to create linkworthy content and then spend the former link exchange begging time just asking for one ways.
I reciprocate if needed on key links, but I think most reciprocation does not have much of a positive ROI for me unless it is much more than a link swap… like a true business partnership.
Question Seven: Do you think the search engines are currently taking steps to dampen the effects of bursty style link growth that is typical of viral content? Do you think they will in the future?
I don’t see why they would. “Bursty” link growth is natural in the age of blogging and social media. For searches on very new topics or news stories, they’d be decreasing relevancy if they ignored these bursty link patterns.
I think that in some cases, they try to dampen it when they think the intent was spam or manipulation, but in other cases, they might actually reward it when it points to content that’s part of a big news story or generating a lot of natural excitement and interest. I wrote about this exact topic in-depth (with charts and illustrations) here – Tracking Temporal Trends in Content and Link Growth.
I think search engines are exploring ways to filter out artificial bursts of link growth from linkbait that is then 301-redirected to a commercial page or another domain. They could simply put a time delay/damping filter on those links after the redirect.
I doubt they would ever try to dampen all bursts of link growth. This would degrade the quality and relevance of search results since we live in a quickly changing society. They just need to separate the natural from the artificial.
We’re still seeing good results after launching campaigns and attracting large numbers of links from social and general media sites. However, since the engines are fickle and sensitive to having their rankings manipulated it stands to reason the law of diminishing returns may kick in and links generated in large numbers will be sandboxed or given less credit.
On the other hand, the types of links being attracted tend to be of low quality in a lot of these campaigns generated through the social media sites so some of this can also be attributed to a “low PageRank” type explanation.
I believe what matters are the reasons why a citation is granted. The Church of the Flying Spaghetti Monster ranks for the phrase, Intelligent Design, even though it’s actually opposed to the idea of Intelligent Design, and virtually all its links are virally gained, and I would surmise they were gained in bursts as its notoriety fluctuated, most notably from exposure on BoingBoing. For the foreseeable future, I think current relevancy algos can handle these kinds of links. So in the case of the Spaghetti Monster, obviously no steps have been taken to dampen its links even though the burst has passed it by. It earned those links by being relevant for the phrase, Intelligent Design. It earned that SERP position by having relevant links.
I think there’s value in “bursty” style link growth and it shouldn’t be dampened. Getting a bunch of links from blogs/news/viral places shows value… the problem is rankings then drop unless you’re taking steps to continue to build links.
Viral content is normal and ignoring links from viral content would be inane. If I were an engine, I wouldn’t be looking to “dampen” the effect of bursty style link growth. I might dampen the effect of links from social media sites themselves… but getting a bunch of links from sites *outside* of the social media networks is a sign of *great* content/linkbait going viral and not something I’d be looking to “dampen” (though I may look to dampen pages redirected suddenly to another domain or “unrelated to the linkbait topic” homepage – which sucks when you legitimately change the url on a previously viral page without changing the content and watch your ranks go down for that page due to it). And at the moment, I certainly don’t see bursty style link growth outside the social media sites (aka on normal sites) being dampened.
Search engines are absolutely taking steps to dampen the effects of bursty growth. Viral link growth is one of the last sweet bastions of SEO – where traffic comes easy. It will be one of the techniques we will one day look back on as the “good old days” when it was “easy” to get a bunch of links at once. Of course, linkbaiting is still not an easy tactic – it requires good know-how of the linkbaiting hooks and an understanding of social media marketing. The search engines are definitely building information about the trends in link growth and how it applies to other variables in search rankings.
Why wouldn’t they? It’s as natural an evolution of the algorithm as it is the natural evolution of viral content. Some of it is great, most of it has an agenda and it blows.
Google is pushing viral content as one of the few “acceptable” methods of link acquisition, so naturally SEOs jumped all over it and are starting to abuse it. It’s only natural that Google notices this and is taking steps to dampen the effects. They don’t want to negate it, but if they can “slice the peak off” and reduce the effects of the spike they certainly will. What this means is you can’t look at link growth as a one shot deal; it’s something you need to keep doing regularly over time, to keep growing your links. I’d personally rather gain 500 links over a 6 month period, instead of 500 links tomorrow and nothing for the next 5 months.
The fact that they accidentally banned their own official AdSense blog in the past shows that they are probably looking at link acquisition rates in some way. Surely they will have to do so more aggressively in the future. Recently there was a widget with hidden links to a payday loan site.
Question Eight: Google has announced Knol as a Google hosted supply of content, and already includes YouTube videos near the top of their search results. With this editorial blending of house content filling up the search results, how long do you think it will be before webmasters stop trusting Google advice in general? What will be the straw that breaks the camel’s back? How polluted will the link graph get when webmasters realize Google has no real control over it? What links will still pass weight in that sort of free for all linking environment?
If Aaron Wall is a gauge of cutting-edge SEO – and I think he is – then I would say the tide is already beginning to turn against blindly trusting Google. The application of their “rules” is just too subjective and inconsistent for a professional webmaster to say “well, I’ll just follow the rules and then I should be OK.”
I don’t hate Google or anything. But there’s no reason to “like” or “trust” them either. They’re morally and legally obligated to do whatever they need to do to deliver maximum shareholder return. Why would that be perfectly aligned with my own personal and professional interests? They’re a highly-profitable corporation with a website that’s heavily ingrained into the webmaster’s ecosystem. Nothing more, nothing less, despite their lofty mission statement.
I think that editorial link detection has gotten tremendously more advanced over the last few years and that Google and others will continue to focus their efforts on innovating in this area. While I certainly have concerns about “in-house” content like Knols, YouTube and even Wikipedia dominating search query results, I think that unless those results are making users happy, they’ll be done away with in short order. Google won the search wars by providing a better experience, but if someone takes over that position of “best” quality from them, I think we’ll see Google fight hard and fast and drop anything that might be hurting user satisfaction.
I believe people are already starting to mistrust Google. Yet, I don’t see much change taking place until another search engine gains market share. After all, Google has made many people very wealthy.
As for links, those that are harder to get will continue to pass value. Just think in terms of sites and links that Google would trust to pass high editorial value.
Links with age. Links from trusted sites. Links from Google.
Webmasters are already handing over their used underwear to Google via Webmaster Tools, so how is something like Knol going to challenge their trust in Google more than it is currently challenged? Google has reasonably defined the rules for playing in its sandbox, so I don’t see any camel backs being broken. Yes some people are targeted, but that’s more from not honestly surveying whether a particular tactic will pass a hand check, or a lack of discretion in linking strategy. Too much success can be a negative event because it brings more scrutiny and that’s when your link strategy is going to come under pressure. I’m not sure how important it is whether webmasters think the link graph is polluted. More important is how users perceive the SERPs and how useful it is.
I see Google as the dominant search authority for years. I’d prefer to work with them as opposed to against them. I believe links will still be the “deciding factor” in rankings for years to come, and any linking method that gets popular, and can be automated, will be filtered.
Google Knol… sigh. “So, like, we control seventy five percent of online searches and we totally promise to not manipulate our algorithm so our pages come up top.” – To be fair, it’s no different than what Yahoo does with its sports and finance portals (aside from Google getting their content developed for free from users vs. Yahoo having to pay for at least some of their content)… except that Yahoo has much less market share and therefore them possibly using nepotism in their search results isn’t nearly as scary. Knol is likely an example of Google proving that they can crush anything that gets in their way. Wikipedia, with all its juicy traffic, won’t slap Google’s ads all over their pages, and their solution is to build their own. As for trusting Google, sheep will generally continue to be sheep. To be fair, the paranoid will also generally continue to be paranoid. I’d hope people would take a stand, but nobody cares enough until it’s too late. Google has “real control” over every site on the web because they’re essentially a new kind/twist of monopsony. Wow, maybe I should re-think my answer on number two. ;-)
That was way more than one question (*cough* Aaron). I think plenty of people blindly trust Google and that will likely not stop any time soon. Resistance is futile. There will be a group of folks that will always keep a mindful eye on what G is doing, but there is very little that can be done about a lot of it, other than rolling with the changes (or selling books:). I think there are lots of things that probably should have tipped off the general business community, and general webmaster population that just because Google “does no evil”, it doesn’t mean they will “do no harm” (thanks to Paul G for the quote). Your business model may get assimilated – it is your responsibility to see it coming, and sell out before your competitors do. The weak will be eaten, and we will auction off their babies.
The family of links that pass weight will shrink to a smaller and smaller percentage of the overall link universe, and at the same time, the searcher himself is going to have to create better queries to obtain results of any use.
Well, my position on Google is pretty well known; however, I think Knol represents a dramatic shift in their modus operandi. Not to sound insulting, but a lot of people in Google really don’t understand how the internet works. They don’t understand that people make a living selling goods, services, and information on the internet. Instead they sit in their bean bag and lava lamp filled cubicles letting mommy and daddy take care of them, then they use technology to bring all of that information under the stewardship of Google, without caring how this change affects other people, or the ramifications of these changes on society as a whole. Gathering all of this information radically unbalances natural market forces, especially when the information is controlled by the exact same people who are in charge of ranking it. If thousands of years of human history have taught us anything, it’s that people act in their own self interests first and foremost; to deny the most basic of human instincts is supremely naive.
Google polluting the SERPs with its own properties could be the start of their downfall. People like a certain amount of change and diversity. Situation comedies or reality TV may be “hot” for a few years, but eventually people want to see something new or different. When it’s all Google all the time, it might not be Google time any more. More and more webmasters are waking up to the fact that Google may not be acting in their best interests; to the others still in the dark I offer you the red pill and a chance to see how deep the rabbit hole goes…
I am not sure it will be THAT bad, but they did offend a lot of people this year with the PageRank editing stuff. Unfortunately they have no real competition at this point in time, so they can do whatever they see fit.
Given how widely link schemes are spreading I envision Google placing a bit more weight on blog subscriber stats and toolbar usage data, though stuff like StumbleUpon ads allow people to buy users too, so they are going to have to balance out new criteria with old as they add stuff to the algorithms.
Question Nine: In Google’s algorithm updates for 2008, what changes do you expect in terms of how links come into play?
More hand-editing and a larger eval team. I think they may have reached the point of diminishing returns in respect to what they can do algorithmically.
I think we’ll continue to see a focus on detecting paid links (both direct and indirect) and on trying to give heavier value to truly editorial links. I also think there may be a lot of activity around anchor text pattern detection and trying to reward sites whose link profiles are very natural.
I expect updates to the algorithm regarding paid links. While many talk about on-page footprints or hand-checks, I believe the algorithm changes will eventually be based on linking data and patterns.
One has to remember, at its essence, Google is a “data mining company” and not a search company. Search is just a product of data mining.
“Google’s mission is to organize the world’s information and make it universally accessible and useful.”
As previously mentioned, I expect some damping filter to be applied to 301 redirects that occur after a burst of links from linkbait or something viral. That will be fairly easy to implement and may already have been introduced.
I suspect we’ll continue to see a push to discourage paid links or any type of linking tactic that works to purposely, on a large scale, manipulate search rankings. In addition, the engines will continue to refine how they assign relevance based on the text surrounding an anchor and the text on the destination page.
Search history data will continue to find its way into the serps and more importantly, the engines will find a way to display consumer reviews for products and services being endorsed in the various social networks. Looking for running shoes? Up pops the serps and user generated product reviews along with them.
Just theorizing a possible scenario, not really a prediction. But they could evaluate the data produced by their quality evaluators and use it to automate some of the link spam fighting, even if it’s just to chip away at the easy ones.
I think Google’s always trying to filter poor links….sometimes they’re good at it….sometimes they’re not..but they’re always working on it.
More of the same of what we’ve seen in 2007. The great paid links debate will continue to rage on. More collateral damage. More hypocrisy. And more sophistication, complication and separation of the wheat from the chaff in successful link development strategies and campaigns.
The TYPE of links becomes more and more important. Topical relevance detection is constantly improving, and cross referencing link data with clickstream data will continue to improve. At the end of the day – you need more unique content as a way to generate better and higher quality links. Make it impossible for your competitors to NOT link to you, and you won’t lose in SERPS.
I have a gut feeling that as of today, for every 10 people trying to build an honest link profile, there are 90 people doing nothing more than seeking page one rankings regardless of the collateral damage along the way. So I’d expect Google to make some very solid advances in identifying those who seek to manipulate on behalf of junk.
I expect them to definitely take steps to diminish viral content, social media, and link baiting. I see them relying more and more on user data from trusted and converging data points. For example, a surge in Google Analytics alone won’t mean much, but a surge in analytics data, with accompanying link growth and blog subscribers, is much more trustworthy.
More hand editing and them getting better at filtering out some spiky links and links where the anchor text profile is too well aligned.
Question Ten: You always hear about “leaving footprints” in regards to buying links and gaining organic links as well. What in your opinion are the three top “footprints” you see SEO’s leave when developing links that would flag them as “unnatural” to you?
1. Overlinking with a single anchor text.
2. Overlinking to a single page on your site.
3. Overutilizing a certain type of link (sitewide, reciprocal, etc.)
4. Notice a theme here? Look natural! Even if you aren’t.
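The footprints above all reduce to measurable skew in a backlink profile. As a rough illustration (my own sketch, not any engine's actual method), here is how one might quantify the first one – overlinking with a single anchor text – assuming a simple list of (source URL, anchor text) pairs; the threshold and data are hypothetical:

```python
from collections import Counter

def anchor_concentration(backlinks):
    """Return the most common anchor text and the share of backlinks using it.

    `backlinks` is a list of (source_url, anchor_text) pairs.
    """
    anchors = Counter(anchor.lower().strip() for _, anchor in backlinks)
    total = sum(anchors.values())
    top_anchor, top_count = anchors.most_common(1)[0]
    return top_anchor, top_count / total

# Illustrative made-up backlink data
links = [
    ("http://example-blog.com/post", "buy blue widgets"),
    ("http://example-news.com/story", "buy blue widgets"),
    ("http://example-forum.com/thread", "buy blue widgets"),
    ("http://example-directory.com/listing", "Acme Widgets"),
]

anchor, share = anchor_concentration(links)
if share > 0.5:  # arbitrary demo threshold, not a known engine value
    print(f"possible over-optimization: {share:.0%} of links use '{anchor}'")
```

A natural profile tends to spread anchors across brand names, bare URLs and incidental phrases, so a single commercial phrase dominating the distribution is exactly the kind of statistical outlier the experts describe.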
1. Location – the types of sites you see in a domain’s backlink profile are often the biggest giveaway that manipulative linking tactics are being employed
2. Timing – 500 links to a homepage don’t all appear in a week unless there’s something exciting going on with that website or company. Search engines know when “exciting” events are happening because they see queries, content, news sites, blogs, social media, etc. heat up. If that doesn’t happen and links do, you might expect a further investigation.
3. Association – Who else is being linked-to on the sites where you’re earning links. Payday loans? Low-rate Mortgages? Casinos? Pharmaceuticals? When you see a high number of disparate, commercial-focused links pointing out, it’s going to be pretty easy for the search engines to identify and discount those links.
How about the Top 4?
1. Broker Links
2. Links from Networks of Sites
3. Link Software
4. Anchor Text
Telltale footprints that seem unnatural could be: lots of new links using one or two keyword anchors, all inbound links coming from pages in the same PageRank range, a site with a large number of backlinks but only from a handful of URLs, all inbound links pointing to the main dot com.
A. Anything that can be revealed by a backlink search.
B. Preponderance of links coming from the same niche even though that niche isn’t ordinarily associated with natural backlink profiles for your niche. For instance, you run a travel site about Thailand and the overwhelming majority of your backlinks are coming from recipe site pages featuring cuisine from Thailand. Something like that screams of paid links and invites scrutiny. Will it cause a site to rank? Probably. Will it pass a hand check…?
C. Crap content with amazing backlinks. Raises eyebrows, at least.
Being listed in tons of “second tier” directories, leaving link trading software footprints, having your link text always followed by the same text. Having links to your site that don’t generate clicks. (Yes, I think Google looks at this… links that aren’t clicked are not worth as much as links that are… see Yahoo’s new patent.)
Wow, how do I spot thee? Let me count the ways… I guess my top footprints would be:
1. Buying from link networks, especially those with obvious footprints. I have *yet* to do a site clinic where we didn’t bust at least one site within two minutes for having obvious paid links in their backlinks (buying from newspapers or radio stations are the biggest offenders).
2. Too much of the same keyword rich anchor text. Even worse when you buy fifty links with the same anchor text (or go on a big organic push that has the same outcome) that all go up the same week and then your site doesn’t get another link for months.
3. Abusing *any* good tactic. For instance, if you find a way to create a relation between your site and one that isn’t typically related to it (creating an angle to relate to normally unrelated sites) to get a *relevant* link… you don’t want to do that 400 times. Another example would be that creating a few tight reciprocal linking relationships to cross promote two sites in a prominent way is much different than having a links section with 40 links pages, 100 links to a page, even if they’re all “relevant”. Ninety percent of your links being triangular… you get the picture.
The worst “footprints” come from automated link pop manipulation schemes. Any code that could be detected by a find and replace, or a regular expression, is likely going to get picked up by an algorithm. From a human perspective, footprints may include common servers, whois info, location, common design layout, similar content structure, and a variety of other things. Webmasters and SEO’s should definitely be mindful of leaving “footprints” that a good SEO detective could uncover, as well as a search engine algorithm.
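To make the "find and replace, or a regular expression" point concrete, here is a minimal sketch of the kind of pattern matching a detector (or an SEO detective) might run against a page. The phrases, patterns and function are illustrative assumptions, not a known engine's list:

```python
import re

# Hypothetical boilerplate phrases common to templated link exchange pages.
FOOTPRINTS = [
    re.compile(r"links?\s+partners?", re.I),
    re.compile(r"add\s+your\s+(?:link|site)", re.I),
    re.compile(r"powered\s+by\s+\w+\s+link\s+exchange", re.I),
]

def flag_page(html):
    """Return the footprint patterns that appear in a page's HTML."""
    return [p.pattern for p in FOOTPRINTS if p.search(html)]

# A made-up page fragment that trips two of the three patterns above
page = "<h1>Link Partners</h1><p>Add your link here!</p>"
print(flag_page(page))
```

The point is how cheap this detection is: anything a script can stamp onto thousands of pages, a one-line regular expression can find on thousands of pages.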
I suppose if the page has the following phrase in the text… “Buy a link on this page and let our pagerank help your site rank higher!” Then it might be a flag. OK, that’s a joke, but what’s funny is how many sites actually say nearly this exact thing.
The number one mistake I think inexperienced SEO’s make with links is not varying inbound anchor text and surrounding text enough. For example they want to rank for “buy blue widgets” so 90% of their inbound links have that as anchor text. It’s a backwards way to approach the problem. Work on doing everything you can getting the links you need to get enough trust to rank, once you do then work on getting the more targeted anchor text.
The next is link networks. Creating a massive interconnected network of sites is dangerous. Sites like IAC’s Expedia, Lendingtree, and Ticketmaster are interlinked very heavily. A lot of webmasters look at this and see it as an endorsement of good behavior, and it often gets them in trouble. The thing to understand is each of those sites would rank just as well without the interlinking. You need to build the trust without depending on your own network to do it, when you do then you can interlink with impunity.
The last mistake people make is linking only for search engines. The end result of good rankings in search engines is qualified traffic, yet some people worry only about getting links that appeal to search engines, scoffing at high profile links that come with a nofollow tag. I’ve seen nofollowed links from Wikipedia pages bring hundreds of people per day, and that’s nothing to laugh at. So yes, go after the links that improve the rankings, but don’t neglect the ones that bring traffic.
1. All sitewide links.
2. No anchor text variation.
3. Links exclusively from low quality sites.
Question Eleven: You have a brand new web site devoted to deep sea rescue equipment and education. You have one and only one person who can work full time on link building for the next 90 days; then they will leave forever, and nobody will be able to do any link building work beyond that time. The site will continue to have new content added on a monthly basis forever. What advice would you give them?
Deep sea rescue equipment you say? It doesn’t actually sound crazy competitive, so a 90-day linking campaign is probably fairly feasible. I’d put the link builder on two tasks – first, a manual link building campaign – finding directories about science and oceanic studies, contacting relevant bloggers, making one-off link buys, finding stale websites on the topic that still rank well and offering to buy them (for 301s), etc. Second – I’d have the link builder dive into some viral content on the world’s most terrifying deep sea rescues – totally top of Digg/Reddit/Newsvine/Stumble material if done properly. Once they’ve got some links to the site to help kick it off, I suspect that in such a rarified industry, the site would actually thrive quite nicely.
I’d advise them to reexamine their long-term search marketing strategy because linking and SEO are not a one-time cure. While I don’t consider the term competitive, any site serious about long-term top rankings must make a commitment to promote their business.
In my SEW column, Top 5 SEO and Link Building Challenges for 2008, I discuss how neglected link building destroys top rankings.
From my column:
“This trend of established sites dropping off the first page will only increase in 2008. The fact is while these sites were enjoying the benefits of top ranking; their competition was being proactive in promoting their site and building links.”
Think of it this way. If you were a top player in a brick-n-mortar business, would you stop promoting your company once it reached the top? Any CEO will tell you once you reach the top tier, you must protect your position. Search marketing is no different.
Start with installing an RSS feed on the site to distribute content as it’s added. Then I’d create a Yahoo! 360 page and install the RSS feed there as well to ensure the content flows through Yahoo News and into the Y! Search results. Do the same for MSN Live Search. Add the URL to all major directories offering lifetime submission. Add to DMOZ and Wikipedia if they’ll have you. Create and add deep sea videos, photos, reports, papers, research, etc. and tag them. Create a page per tag in del.icio.us.
Make sure whatever you do that it can pass a hand check. Build trust links, and obtain links (with appropriate anchors) from high quality sites judged by the metrics of backlinks, quality of content, and the enthusiasm of site visitors for the content.
Submit to DMOZ and Yahoo, run tons of searches for related/complementary sites and write to them requesting that they add your resources. Come up with linkbait article topics and write those. Build a forum. Try to build a community. If the link builder did a good job you should see a good ROI on their work, and thus you won’t want them to leave after 90 days. If they have to leave, hire another one to continue the work.
To find another industry that is more lucrative where they can afford a marketer. ;-) Ok, seriously… you can’t do link development once and then never again… now, you might be able to do enough link development, marketing and promotion in ninety days to get a following to help develop links for you, provided you’re developing killer content, but I am still uneasy with “one time” marketing campaigns. So, in all honesty, the advice I would give them is to get a more realistic approach and carve out time to do link development, even if it was one hour per week.
1. Understand the value of a link.
2. Be the most popular blogger in the world of deep sea rescue (with a pseudonym – so we can continue to blog later). Beat everyone to breaking the biggest news about anything regarding deep sea rescue equipment and education. Contact every site in the space you can find in a professional and courteous manner.
3. Get some links from high authority sites that have anything to do with the ocean. Get lots of them (preferably from colleges, schools, and the government).
4. Link to the coast guard a lot, and try to get a link anywhere they have one. They save people, and are pretty much all around water rescue rockstars.
Gee, glad you asked!
Hire a link building expert to help you solve this seemingly unsolvable problem.
Or, since several people complain that I always say what not to do and never what TO do, here you go. This is my boring, unsexy white hat approach, but it’s sure to get you ranked fast and keep you there. This is also just a fraction of what could be done. A list of twenty things to get started.
Over the course of the 90 day period, make sure that one person learns every task/skill below by doing each task under the tutelage of someone who knows how to do it.
1). Start with this web search at Google: deep sea rescue equipment
a). Do same search at Google news
b). Do same search at Google blogs
c). Do same search at Google images
d). Do same search at Google video
e). Do same search at Google groups
2). List the URLs of top 30 unique domains from the organic rankings of each search. Remove dupes.
3). List domains and pages of all page one paid listings
4). Take that complete list of URLs and identify every backlink to each one of them. Do not limit your backlink analysis to just the big four engines. Use about 30 engines across every imaginable database. I use over 50.
5). If done right, the scripts running the analysis will take 2-3 days to finish the full run. (caveat: I have my own scripts to do this) This will provide a raw list of about 100,000 backlinks across the family of URLs.
6). Import into Excel by URL family, one competing URL per column
7). Run a co-citation analysis
8). Run a high value text string in URL analysis
9). Run a trust factors analysis for all TLDs
10). Identify every library based URL
11). Separate most obvious blog venues
12). Flag all social media URL strings
13). Filter/Run tag search for key terms
14). Filter list by highest and lowest comp URL co-citation factor
15). Take list of highest co-citation URLs not linking to your site
16). Take list of lowest co-citation URLs not linking to your site
17). Visit those URLs, apply subjective analysis to determine validity
18). If valid, identify owner.
19). Contact identified owner via email or phone.
Lather, rinse, repeat.
20). Do this search. Create a directory of those 2300 results only including the ones that are truly legit. Let each one know you have included them. Consider joining any that are of value to you, especially if they have a member links section.
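The core of steps 2, 4, and 7 above — de-duplicating competing domains, merging their backlinks, and scoring co-citation — can be sketched in miniature. This is a toy illustration under assumed data, not Eric’s actual scripts: every URL below is a hypothetical stand-in, and a real run would pull backlinks from dozens of sources:

```python
# Toy sketch of steps 2, 4, and 7: dedupe competing domains, merge their
# backlinks, and count co-citation (pages that link to several competitors).
# All URLs here are hypothetical stand-ins for real crawl data.
from collections import Counter
from urllib.parse import urlparse

def unique_domains(urls):
    """Step 2: reduce ranked URLs to a de-duplicated list of domains."""
    seen, out = set(), []
    for url in urls:
        domain = urlparse(url).netloc
        if domain not in seen:
            seen.add(domain)
            out.append(domain)
    return out

def co_citation(backlinks_by_competitor):
    """Step 7: count how many competitors each linking page points to."""
    counts = Counter()
    for links in backlinks_by_competitor.values():
        for page in set(links):  # one vote per competitor
            counts[page] += 1
    return counts

ranked = ["http://a.example.com/page1", "http://a.example.com/page2",
          "http://b.example.org/gear"]
domains = unique_domains(ranked)  # two unique competing domains

backlinks = {
    "a.example.com": ["http://hub.example.net/links", "http://blog.example.info/"],
    "b.example.org": ["http://hub.example.net/links"],
}
scores = co_citation(backlinks)
# hub.example.net/links points at both competitors, so it scores highest —
# exactly the kind of page steps 15-19 would flag for outreach.
```

Pages with the highest co-citation counts that don’t yet link to your site become the outreach list; the subjective validity check and owner identification in steps 17–19 still have to be done by hand.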
Make sure you are providing content that, once the site is gaining traction, can attract links on its own after those 90 days are up, like an RSS feed/blog, etc. Let anyone put that feed on their site. (Example, not a client and I don’t know them – http://www.divephotoguide.com/customrss/ ) Seek UGC in the form of deep sea rescue photographs or stories/tales of rescue. Consider creating a portable deep sea news widget. Consider creating a Google Custom Search engine only including sites in your industry.
Consider that after reading the above, Eric Ward might just have a clue after all, despite some folks thinking all I do is work for million dollar content that doesn’t need help anyway. The reality is that I work for any content if it’s linkworthy.
OK, first I’ll answer the real question: I don’t think it’s possible for a site to continue ranking long term, for competitive phrases, with zero link growth. Reading the big Google patent application and watching the part of the algo that looks and acts like the “sandbox” but really isn’t is, IMHO, proof positive that links gained over time are part of the equation.
Second, I’m not a fan of restrictive questions or unrealistic scenarios, but I’ll play along. I’d cover the basics like directories, article submission sites, and emailing relevant sites in the industry. But as much as possible, I’d try to build an email marketing list of site owners in the space and make sure they get notified every time content gets added. I’d also word it to plant the seed that I’m fishing for links. IMHO, Google needs to see continual link growth, and without a long term strategy that addresses the problem, you’ll end up with a garden that isn’t getting watered anymore. Maybe that didn’t answer the question exactly, but it passed along some knowledge.
Tackle the topic with the top rescue stories ever, the history of submarine wrecks, the history of the field of deep sea rescue… get some evergreen content on the site that is linkworthy. Then pitch it to people who should care about it and see if some of them would be willing to reference it.
Participate in the community and see what they like, what ideas they spread in the past, and what ideas they are spreading. Look at top competing sites for intel.
If content is added to the site continuously, make sure it is easy to subscribe to and worthy of a subscription. Make sure the content writers know basic SEO and write keyword-rich, click-friendly page titles.