Posts filed under '2. SEO'
Hi everyone –
For those of you who follow these things, just letting you know that the October 27, 2007 PR update (the Google PageRank update) is now underway.
It’s been a long time coming, and to be honest I thought that it might never come. Showing PageRank (or at least the toolbar version – TBPR) is one of those things that I often think serves no real useful purpose to the average webmaster.
It’s really just a measure of how many folks link to you (and how many link to them… recursively) and NOT – despite what the toolbar says when you hover over it – a measure of how important Google thinks your site is. If you want to know that, just check your visitation stats.
I’ll take that further – I think the little green bar probably helps erode the quality of the internet as a whole by encouraging the abuse of the Google algorithm through link exchange / paid links etc.
It’s like crack for webmasters – it causes a kind of ‘PR fixation’ amongst the SEO and webmaster community. I think that’s probably something which is to the detriment of inexperienced webmasters as it tends to sidetrack them from paying more attention to the other aspects of SEO – like writing good content, for instance. Every new webmaster has suffered from that syndrome at some point or another.
I think there are a number of reasons not to worry too much about Pagerank. I described my views about this around the last PR update in April, and I found this informative article about why pagerank isn’t something to worry about too much a while back.
Of course the whole issue does tend to polarise people. Some folks are firmly of the view that PR has ZERO impact upon your position relative to other sites in the search index, whereas others (including me) believe it’s still quite an important measure – simply because it is an effective way to measure popularity algorithmically, and has no really accurate peer at present. You can see one such holy war regarding PageRank in this thread – in the red corner we have Cass-Hacks, in the blue corner we have dockarl (me).
Anyway, here’s hoping that your PR moved in the right direction – but if it didn’t, do not despair!
October 27th, 2007
I’ve been on a hiatus from writing here, so I thought I might break the trend by talking about the practice of creating multiple websites to ‘corner the market’ – jealously guarding your URL to ensure no-one uses a variation.
An example might be registering mysite.com, and then being seduced by the offer (GoDaddy does this regularly) to register variants of your new domain name (eg .biz, .net, .org) at a ‘special discount’. They don’t offer fries just yet, but domain sellers really are the masters of the up-sell.
I consider registering more than one domain a bit pointless
The days of people memorising and typing a URL into a browser are pretty much over. Except for a few notable and brilliant exceptions with catchy names like utheguru.com and oyoy.eu (and other less successful or well-known sites such as Google and YouTube), most people get to a site the new-fangled way – by following links or doing a search. So, in essence, you’re probably paying extra for not much benefit.
Furthermore, the practice can have insidious side effects – you can actually shoot yourself in the foot.
Multiple domains = Multiple sources of links
When presented with duplicate content, google often seems to pick one page as the ‘original’ and consign the others as unimportant copies, and they don’t rank well.
You could end up with a situation where google chooses a page from each of your site copies as the ‘original’ and you end up with search traffic spread between all four.
Registering Multiple domains for the same site can actually be bad for business
Links to your sites naturally tend to come with traffic – and a lot of traffic generally comes from search… so… you’ll also end up with your incoming links spread between all the copies of your site.
In such a circumstance, synergy (the whole is greater than the sum of its parts) does NOT apply. You end up with four sites, each with a quarter of the links it should have, rather than one strong site that aggregates all the power of the incoming links in one place. End result? You don’t rank as well as you could.
How to use your multiple domains ‘the right way’
Best practice is to use something called a 301 redirect – rather than having four actual copies of your site all competing with each other, a 301 redirect seamlessly redirects visitors (and Google) to the ‘main’ URL you want to rank well. If you Google “how to do a 301 redirect” you should be on your way to understanding that a bit better.
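To make the idea concrete, here’s a minimal sketch of what a 301 response looks like at the application level, written as a tiny WSGI app. The canonical host name and paths below are made-up placeholders, not a recommendation of any particular setup:

```python
# Sketch: permanently redirect every alternate domain to one canonical host.
# CANONICAL is a hypothetical example - substitute your own 'main' domain.
CANONICAL = "www.mysite.com"

def redirect_app(environ, start_response):
    host = environ.get("HTTP_HOST", "")
    path = environ.get("PATH_INFO", "/")
    if host != CANONICAL:
        # 301 Moved Permanently: browsers and Googlebot both follow the
        # Location header, and Google consolidates link credit on the target.
        start_response("301 Moved Permanently",
                       [("Location", "http://%s%s" % (CANONICAL, path))])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>Welcome</h1>"]
```

On typical shared hosting the same effect is usually achieved with a `RewriteRule` carrying the `[R=301,L]` flag in an Apache .htaccess file; either way the principle is identical – one status code, one Location header pointing at the preferred URL.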
September 2nd, 2007
In a case of long term pain for short term gain (not, as you might first hope, short term pain for long term gain), everyone’s favorite search engine has abolished the supplemental index.
But before you go running around your office whooping with delight like I did this morning – STOP. Google hasn’t abolished the supps, they’ve just stopped telling us which pages are in supps.
What’s that mean to the average punter?
Well, it means fewer questions on the webmaster forums starting with ‘why are my pages all in the supplemental index’, and less time spent by ‘mom and pop’ sites worrying about it.
Possibly a good move.
Me, well, I’m skeptical about the move. The overriding stated aim of Google is to return quality results. I’ve seen plenty of quality pages in the supplemental index – google has stated repeatedly that the biggest reason for a page being in the supps is NOT a perceived lack of quality, but rather a lack of pagerank.
It’s nice to know they are there so that we can make an effort to bring them into the main index where they belong. Google should be adding MORE tools to help genuine webmasters assess how they can improve their index penetration, not fewer.
It’s a case of ‘need to know’ – Google now no longer reckons we ‘need to know’ which pages their algorithms consider unworthy of a place in the main index. My initial feeling about that move is that it seems a little paternalistic.
Google has eviscerated the ONLY tool that goes any way toward explaining why a page might be performing poorly.
My take? If they are going to stop tagging pages as supplemental they should just abolish the supplemental index altogether – if a page is being crawled but isn’t in the index, well, we know it sucks – so why lump it in with other results? Put differently, why show us pages in a site: search if they’re not going to rank anyway.
At the moment I’m leaning towards thinking this might have been a (short term) backwards step, although it wouldn’t surprise me if we see some new tools in the Google Webmaster Tools arsenal to help deal with this problem.
ADDENDUM:- Richard Hearne (www.redcardinal.ie) put it best recently on the google webmaster help forums –
“Of course Google would rather we didn’t discuss or even consider this supplemental index. Then again if Google was serious about fixing issues like these they would scrap the supplemental index… or give us back the supplemental tag so that we can try to fix these issues ourselves. “
August 1st, 2007
Escape the Supplemental Index
So you have found yourself in the Google supplemental index and you want to escape.
Fair enough – unless you are a webmaster / blogger it’s hard to understand just how frustrating it is to find your hard work ‘binned’ to the supplemental index – but worry no more, it’s easier to get out of the supplemental index than you may think.
In this, part two of my ongoing series on the supplemental index (see part one here – The Google Supplemental Index – A Primer), I’ll be giving you three key steps you can take to get your web page out of the supplemental index and stay out.
STEP 1 – Duplicate Content causes Supplementals
Pick a few key pages on your site, and run them through ‘copyscape’ (www.copyscape.com). If copyscape says you have duplicate content on your pages, this could be the reason for the supplemental status of your pages.
Edit the pages, make them more unique, put any quoted material inside a <blockquote> tag, and try again. Then move to Step 2.
STEP 2 – Backlinks, Backlinks and Backlinks
So you have a page in the supplementals, it is brimming with unique content, and you just can’t wait to get it out – it’s not hard. I have used this technique many, many times, and if done correctly you’ll find it helps bring your whole site from the ‘infant’ status I spoke about in my previous article to ‘adolescence’.
- Find a page on your site that is in the supplementals, that has heaps of unique content, and note down the url of that page.
- Find a site that has PR3 or better, and allows you to post your url.
- If you don’t know what Pagerank is, I define it in my article about nofollow
- Don’t know how to discover pagerank? You can do so by getting Firefox with Google Toolbar (download it from my toolbar to the right)
- Post your URL on that page, using descriptive anchor text (eg, if your page is about widgets, the link should say ‘widgets’ if possible). Try to make your link a deep link – like www.utheguru.com/301-redirects instead of just www.utheguru.com
- Can’t find somewhere you can post a link? Some tips:-
- Your host’s forum / bulletin board (make sure that they aren’t no-following links).
- A friend with an established website (a link from the first page is always best)
- Another of your own websites (I’ve done this before and it works)
- Paid editorial.
- DO NOT subscribe to link exchange schemes, ‘free’ directory listings or other such ‘offers’. At best, they don’t work, at worst, they can get you penalized.
This strategy has worked without fail for me.
Use it, and expect your target page to be out of the supplementals within a week or less.
Some people call it giving a page ‘link juice’, or ‘link love’ – whatever you call it, it works.
STEP 3 – Submit a Sitemap to Google
Google Webmaster Central, and Matt Cutts’ video about Webmaster Tools, will bring you up to speed about this process.
To generate the sitemap for submission, I highly recommend the following free tool.
Why submit a sitemap? Well, you’ve gone to the effort of getting Google ‘interested’ in your site, so you want to give it the best chance possible of indexing your site properly.
A sitemap will help it do this.
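If you’d rather roll your own instead of using a generator, a bare-bones sitemap is easy to produce. This sketch builds the minimal XML that the sitemaps.org protocol (the format Google accepts) expects – the URLs here are invented placeholders:

```python
# Build a minimal XML sitemap (sitemaps.org protocol, version 0.9).
# The URLs passed in below are hypothetical examples.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # escape() protects &, < and > so the XML stays well-formed
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)

sitemap = build_sitemap([
    "http://www.mysite.com/",
    "http://www.mysite.com/widgets",
])
```

Save the output as sitemap.xml in your web root and submit its URL through Google Webmaster Tools.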
Tomorrow, in part three of this series, I’ll be talking about strategies that will help to KEEP your site indexed.
This advice should help you to progress to a ‘mature site’ that is crawled and indexed regularly, without the need for further intervention to keep new pages from going supplemental.
July 20th, 2007
Nope – actually the July 2007 PR update is not underway at all. It’s almost impossible to know when the next PR update will get underway, and trying to guess the date is a bit pointless.
When is the next Pagerank (PR) Update?
As per usual, people have been trying to guess when the next PageRank (PR) update will happen, and some even swear that it is happening now (as postulated by some commentators on Matt Cutts’ blog). I personally see absolutely no evidence that there is a July toolbar PR update underway – and in fact, if previous trends are any guide, it is likely that the next toolbar PageRank update will not be until August 2007 at the earliest.
There are a few little ‘ripples in cyberspace’ that are a little indicative that SOMETHING is happening, but I’ve checked all the datacentres for a number of my sites, and I can say quite definitively that a toolbar PageRank update is not happening right at the moment.
Will the next PR update be in August?
Who knows. The next PageRank push could be in August, it could be in January 2008, or I might be totally wrong and it could be happening right now – but, in any case, don’t get a fixation about PR updates. Why?
It’s a common misconception that all your link building goes unrewarded until toolbar PR is updated – that’s just not true. Real PR (that which Google uses internally) is a dynamic, constantly changing beastie – Google just keeps the real value a secret so that webmasters like us don’t go crazy watching our PR go up and down like a yo-yo in between updates.
If you’d like to know why NOT TO WORRY about PR updates, and how to improve your Page Rank between now and the next one, please see my post about the last pagerank update.
July 16th, 2007
I’ve been thinking of doing another ‘big’ post about SEO and a little strategy that’s jumped out at me recently.
Since sometimes doing one big post is a bit overwhelming, I’ve decided to just write about one little minuscule part of the post first to whet your appetite.
Buzz is the noise that bees make.
My first experience with bee keeping was on my school camp, about age 16. My particular school had a 10 week outdoor education curriculum – every grade 10 class would head out to the school farm (“Ironbark”) for 10 weeks – the aim was to be pretty self sufficient – we had to milk the cows for our milk, make our own butter, bale hay, keep (and ultimately cut the heads off and eat) our own poultry, and of course beekeeping was one of the cool things we got to do too.
The (two) Birds and the Bees
Quite early on I volunteered to head out with one of the local bee keepers to learn all about robbing bee hives – of course, there was an ulterior motive. There were only three positions on the “bee team” and the other two had been taken by the two prettiest girls in the class – Shae and Natalie Alexander, (who in my opinion at the time was a complete SPUNK) 😀 .
I figured that, on balance, the very real possibility of being stung to death by a marauding swarm of angry bees was probably offset by the chance to spend an entire day with them 🙂
So.. off we went. I, being the gentleman that I was, let the girls take the very best beekeeping overalls. I was left with a very moth-eaten pair of blue mechanics overalls.
Handed a roll of masking tape I went about patching the 101 holes in the overalls and set to work. When we first cracked the hive open I remember the beautiful low hum coming out of the hive as we puffed the smoke over the bees.
Sweetness turns to Sadness
Everything went quite ok for about the first 4 (out of 10) frames – we brushed off the bees, replaced each honey filled frame with a frame of fake comb called ‘foundation’ and moved to the next frame.
By the 5th frame, however, the bees were starting to get pretty darn angry. It didn’t matter how much more smoke I puffed over the bees – the low hum was steadily increasing, and gradually more and more guard bees started shooting out kamikaze style and belting into my head net.
I think we had about three frames to go when things started to get really crazy – the hum was now something more akin to an F-16 ratcheting up for take off. The inevitable happened – I’d missed patching a hole, and a bee got inside my overalls and stung me – ouch! But I was super Matt – there was no way I was going to moan about it in front of the two prettiest girls in the class 😉
The thing I failed to realise, though, was that when a bee releases its sting it also releases a scent.
Honey, I’ve lost my pants
Before I knew it, I had virtually every bee in the hive clinging to my blue overalls screaming bloody murder.
The girls (along with the beekeeper) cleared out, hopped in the truck and locked the doors. After initially trying in vain to get them to let me in the truck (there was NO WAY they were going to let me in with all those bees 🙂 ), I finally realised I was going to have to get myself out of that particular situation on my own – so I blindly galloped down the hill, stripping off my clothing as I went – heading for the farm dam.
“Splash” – I belly-flopped into the muddy dam (I was about 6 feet tall and 65kg then – skinny as a rake – I must have looked a sight running down the hill with a swarm of bees chasing me and only a hat to ensure my modesty).
I think I spent about half an hour in the dam, popping up every 30 seconds or so for a gulp of air, before the bees finally decided they’d had their pound of flesh and headed home 😀 . The girls thought it was absolutely hilarious – I still hear about ‘Matt and the bees’ occasionally when I run into old school friends. We counted 65 stings on the way home.
Beee vereee vereee careful wit zee bees
Stupidly, after that introduction, I became a bee keeper – I still have about 10 hives.
I’ve learnt a few things about bees since – if you move slowly, methodically, it is actually possible to raid a hive without any protective clothing, nets or smoke at all – it’s not hard.
If you move quickly though, or you happen to accidentally squish a bee, you’re in deep trouble. The bee next to the one that has just been squished tends to tell his neighbor (buzz – buzz), the neighbor then buzzes to his neighbors, and generally in a matter of seconds you have a hive full of very irate bees.
Bees, Buzz and SEO
Buzz is their form of communication. Buzz is how they get things done. Buzz is the very thing that binds the hive into the co-operative society that it is.
In short, buzz is like an amplifier – in no time flat a buzz from a solitary individual in the hive is capable of mobilizing the forces of the whole hive to a dedicated purpose.
Buzz, my friends, is a powerful force.
July 11th, 2007
Shambhavi Sarasvati has recently asked me for some advice. She’s changed the permalink structure on her WordPress site, and then 301 redirected the old permalinks to the new permalink format.
Shambhavi is a bit concerned that it seems to be taking a while for Google to update to the new permalink structure.
OK – firstly – don’t get concerned. If your 301 redirects are good (and I’ve checked – they are), Google will eventually get around to updating to the new URLs. In the meantime, anyone who finds you via a Google search will still be redirected to the correct page when they click through – since the redirect is server side, the old URLs will still work in the interim.
Secondly – how to get Google to crawl and update a bit quicker? Not too hard – just try to get some links to the pages you want updated. Alternatively, post an HTML sitemap somewhere on your site containing the OLD URLs, and get people to link to it. This has the effect of getting Google to visit the old URLs, which will then (hopefully) lead it to update its index to the new URLs when it sees the old ones have been 301 redirected. Sometimes getting links to deep pages (ie posts) rather than just the index page seems to be more effective.
It also helps if these links are from relatively high PageRank sites that are crawled regularly. If you have any suggestions about how Shambhavi might improve her site further, please leave a comment. By the way, if there are any CSS gurus reading this, I noticed a problem with Shambhavi’s sidebar in IE – it’s fine in Firefox, but sits right at the bottom in IE – I don’t know how to fix it – any suggestions?
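The old-URL sitemap trick can be sketched in a few lines: given the old permalinks (the ones below are invented examples), emit a plain HTML list of links for Googlebot to recrawl, so it re-fetches each old URL and sees the 301:

```python
# Sketch: build an HTML page listing OLD permalinks so that Googlebot
# revisits them, follows the 301s, and updates its index to the new URLs.
# The URLs below are hypothetical examples.
from xml.sax.saxutils import escape

def old_url_sitemap(old_urls):
    items = "\n".join(
        '  <li><a href="%s">%s</a></li>'
        % (escape(u, {'"': "&quot;"}), escape(u))  # keep the HTML well-formed
        for u in old_urls)
    return "<ul>\n%s\n</ul>" % items

page = old_url_sitemap([
    "http://example.com/?p=123",
    "http://example.com/?p=124",
])
```

Drop the generated list into any page on your site (or a friend’s) that gets crawled, and let the redirects do the rest.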
June 22nd, 2007
Well, I kept it a big secret from you all because I didn’t want to jinx myself – but I was invited recently to an interview with Google (it was the ‘exciting little company’ I spoke about a couple of months back in a post about an upcoming interview).
The position was to be based at Mountain View, California, and as part of the great Webmaster support team – along with neat and very bright people like Adam Lasnik, Vanessa Fox, Aaron D’Souza and Matt Cutts.
The position was ‘Webmaster Trends Analyst’, something I felt uniquely attracted to – I’ve a strong background in stats (from my undergraduate degree and time running scientific trials with the Sugar industry), have run several ecommerce sites and have a Master’s Degree in Computer and Comms Engineering – as well as being a regular poster on the Google Webmaster Forums – so I love hearing about what other folks are up to.
It was an exciting opportunity – so accordingly I took some valuable time off my PhD to prepare – before hopping on the plane for the 13 hour flight to San Francisco.
It was a great experience, but unfortunately I didn’t get the position.
I was disappointed.
As I wrote to one of my contacts about it:-
“so, either I’ll start looking for work again or I’ll bite my bum, put the pedal to the metal and get back into the PhD.
G was going to be a great fit because working with people like yourself would have been a ‘learning’ experience rather than just a job – I hate the 9-5 ‘office worker’ style culture of uni, but love the learning side.
My main problem when it comes to being hired is that of previous job experience..
I start to look like a jack of all trades but an expert at none.
Imho I thought that would be what would get me the job with G, as I’ve been told it makes me a pretty powerful educator – and a great interface between nutty engineer / scientist types and the general public.”
But let me take a step back here for a moment – I need to emphasise that I found the whole experience incredibly rejuvenating, and regardless of the fact I wasn’t successful, I still feel honoured.
If Google were to turn around today and say they wanted to employ me, I’d say yes in a heartbeat.
Why? Because any company that actually recruits internationally for a position known as ‘Webmaster Trends Analyst’ is a company that has a conscience. I don’t see such a position advertised at Yahoo. I don’t see such a position advertised at MSN… actually, I don’t see such a position advertised ANYWHERE.
When I was going through the interviews, one of the interviewers (and I hope I’m not out of line here) actually spoke about the fact that Google pulls together information from heaps of different resources (blogs, forums etc) on a regular basis and tries to quantitatively (from qualitative signals) assess ‘webmaster sentiment’ – and use it as an early warning system to alert them if things (like an algorithm change for instance) have had any unforeseen impact. That made me sit back and go ‘wow’.
I count myself very lucky to have been interviewed by a great company with a social conscience like Google and dearly hope an opportunity pops up soon and I get another crack at it (You can contact me if you know of one).
But enough of that – the whole experience was a complete blast – let me show you a few photos.
This first photo is the centre of the Googleplex – it’s a neat place. I like the fact that I seem to have captured a black crow in mid-flight right below the Google sign 🙂
Took this photo on a toilet break at the ‘plex’ – judging from the pace of the interview I figured that time is a commodity in short supply at the googleplex, but this pic (right above the urinal) really rammed it home “Testing on the Toilet” – an A4 page giving thought provoking code tips to the engineers. 🙂
I got the opportunity to do a fair bit of sight seeing while I was there…
The Golden Gate Bridge (with me in front of it).
You’ll notice in all of these photos that I’m wearing one of two shirts – bloody Qantas sent all my luggage to Helsinki on the way over, so I had only a pair of shorts, a pair of moleskins, the shirt I wore on the plane and one I bought for the interview (this one) for the whole trip – don’t get me started about QANTAS.
Across the Golden Gate Bridge from San Fran is a beaut spot called Point Reyes – here’s a photo looking back towards San Fran from there (with the Golden Gate Bridge in the background).
A pretty flower at Point Reyes – I believe the plant is called Pigface – why, I don’t know 🙂
A ‘Hummer Limousine’ – Wow!
Another pretty flower in San Fran (are you a REAL Aussie! That’s so COOL! I want a photo with you!!) – the people were very friendly at “Kell’s Bar” – I love a good Irish Pub, and this one was a beauty – it’s just off Columbus.
The owner (right) and head barman of Kell’s Bar..
They shouted me quite a few Guinnesses – here I think that magical brown stout is starting to have its curative effects 🙂 (I am not too sure whether the spooky red eyes were caused by the camera or the gazillion pints of Guinness)
The morning after – one of those famous cable trams in San Fran.
My Hotel was right in the centre of SF (Sutter and Powell) – I got it for a nightly rate of like $69 – it was fantastic
Just a pretty picture of one of the brass fire hydrants they have all over the place in San Fran.
Some San Fran street art- this was in Chinatown – San Fran has the best Chinatown I’ve seen in any international city – I felt like I was back in Beijing.
On the way out of San Fran – you can see the city itself and the Bay Bridge top RHS.
I loved San Fran – it was such a vibrant colourful city – I hope to go back there someday soon.
QANTAS strikes again – I had to wait 18 hours for my flight back – by the time I took this photo (the time on my watch is AM by the way) we’d been locked in LA airport with no food or refreshments all night waiting for our plane which was recursively only going to be ‘another twenty minutes’ all night.
Some rather unfortunate baggage handler had managed to run into the wing with the mobile stairs, causing severe damage to the port side aileron.
I felt sorry for the parents with little kids – the time on my watch is AM – roughly 20 hours after the plane was meant to leave.
June 18th, 2007
WordPress, like many other CMSes (content management systems), creates duplicates of your posts all over your website.
Having duplicate content can lead to less-than-optimal search engine listings, and is one of the factors that cause ‘supplemental results’ in Google.
In the following article I describe a WordPress plugin I’ve developed to help address these issues.
You can download the plugin now by clicking here or read the full article.
June 16th, 2007
A little while back I asked the following question on the Google Webmasters help forum:-
Is it OK to duplicate content in a different language?
Nobody could really give me a solid answer at the time. At the risk of setting off a new wave of ‘language spamming’, it seems it is. The following pronouncement from Matt Cutts (Google) seems to confirm it.
Matt: Having content from two different domains isn’t risky if they are in different languages (for example, Chinese and English), but if you have the exact same content on two different domains, it’s better to use a permanent redirect from the duplicate domains to a single preferred domain. (See this interview with Matt Cutts for the full-length version.)
Language Spamming ??
What do you people think about that? To me, it’s a very significant admission of a potential major future web-spam weakness, given the availability of (relatively accurate) online translation tools like Babel-Fish etc. It also presents enormous SEO possibilities for crawlers / spammers.
Apart from the obvious inferences, I have a few others:-
- Can Googlebot ‘understand’ foreign language words in an english site?
- If so, what effect do these foreign language words have upon a site’s ‘relevance score’…
Ist es OKAY, Inhalt in einer anderen Sprache zu kopieren?
¿Es ACEPTABLE duplicar el contenido en una diversa lengua?
May 9th, 2007
The April 2007 Pagerank Update
It’s official – as of a few hours ago new PRs are starting to filter through the system – the April 2007 toolbar PR update is underway! Here are a few insights about the update and what it means to you, and some tips and tricks you might not know..
Why is my Pagerank Jumping Around?
When a toolbar PR update happens, it doesn’t happen all at once – Google has many ‘datacenters’, and your new PR will ‘percolate’ between those datacentres over the next few days to a week.
The PR shown in your toolbar is usually taken from a relatively random datacenter – for that reason, you’ll tend to see your toolbar PR jumping around a lot. This isn’t an indication of any kind of penalty, or anything unusual – it’s just an indication that the PR update is underway.
You can see your PR over the various datacentres at http://www.oy-oy.eu.
What is Pagerank (PR)?
PR, or pagerank is one of the factors used by Google to calculate the importance of your site. Importance is different to relevance – you can have a very low PR site and still outrank much higher PR sites that don’t have content that is as relevant to the user’s search as yours.
People tend to get fixated on PR as it is one of the most visible forms of ‘feedback’ from Google about how your site building efforts are going – and since it only gets updated 3 or 4 times a year, people with active sites (including me) tend to look forward to it.
Should I worry too much about PR?
No. A few reasons:-
- RELEVANCE almost always beats PR if you want good search engine positioning – such things as the words that people use to link back to your site, the words in your page, your page title and headings, and the words in your URL all give Google clues about the relevance of your site. Some people claim that there are 200+ factors such as these that Google uses to calculate relevance.
- Pagerank is generally out of date – it is really, in its most basic form, just a snapshot measure of how many other sites link back to you (and how many sites link back to them).
- You can have a PR 0 site and still beat much higher PR sites in a Google search if you concentrate on RELEVANCE.
As time has gone by, Google has become much better at gleaning ‘relevance’ from a page, and with that enhanced capability the relative importance of PR (probably once the major contributor to search engine positioning) has been diluted by these other factors. It is still a factor, though, and it is worth aiming to improve your PR.
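That ‘who links to you, and who links to them… recursively’ idea is exactly what the original PageRank formula computes. Here’s a toy power-iteration sketch over a made-up three-page link graph (the pages and graph are invented for illustration; 0.85 is the damping factor from the published PageRank paper):

```python
# Toy PageRank by power iteration over a hypothetical 3-page web.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}  # page -> pages it links to
damping = 0.85
pages = list(links)
pr = {p: 1.0 / len(pages) for p in pages}  # start with equal rank everywhere

for _ in range(50):  # iterate until the values settle
    new = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = pr[page] / len(outlinks)  # a page splits its rank among its links
        for target in outlinks:
            new[target] += damping * share
    pr = new
```

In this tiny graph “c” ends up with the highest rank because both “a” and “b” link to it – which is the whole point: rank flows along links, recursively, and a link from an already-important page is worth more.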
Tips and Tricks to Improve your PR
Well, it’s too late now for this update, but if you’d like to work towards improving your PR (and site traffic) you need to get more sites linking to you, and preferably sites with high PR. Here’s some tips off the top of my head:-
- LINK OUT – link to sites that interest you. This has two effects – it makes your site much more informative for your readers, and it also helps other sites (the targets of your links) learn about you. Whilst it is counter-intuitive that linking out will improve the number of sites linking to you, it does. Why? Because it tends to increase your readership. A site with lots of readers becomes a site that people want to link to. Also, people with active sites tend to spend a lot of time monitoring who is linking to them – write an interesting article which links to their high PR site, and it’s likely they’ll come and check out your site – if you are lucky, you might get a link from their high PR site back to you as a thank-you.
- WRITE UNIQUE, INFORMATIVE, INTERESTING ARTICLES – if you ever do a Google search for something and you can’t find what you’re looking for easily you have a great opportunity. Find the answer, and write about it. Chances are other people are asking the same question – and you’ll attract links if you write a good quality blog entry about it. Sites that just regurgitate / duplicate information easily found elsewhere won’t tend to get lots of links.
- WRITE SOMETHING CONTROVERSIAL – this is one strategy fitting under the general banner ‘link bait’. My best performing pages are those that have controversial content :).
- USE SOCIAL NETWORKING TOOLS – Things like MyBlogLog, FeedBurner, Digg etc are a great way to improve your following and traffic. I can pretty much guarantee that links to my site increase proportionally with the amount of traffic I receive.
- MAKE A USEFUL TOOL – many of my links come from my wordpress theme, Blix-Krieg. If you put something on your site that is useful, you will attract links.
- USE YOUR HOST – Many web hosting companies have online forums for their users – often, these forums have obscenely high PR. Write something genuinely interesting, and link back to it from your host’s forum. This is also often a great way to help trigger an initial crawl on a new site (see my series on the supplemental index for more info on this).
- BE A GUEST POSTER. Many sites (including mine) allow users to submit their own articles for inclusion – take advantage of the opportunity – write an interesting, relevant article and ask the owner of a high PR site if they’d like to include it – with a link to your own site in the body.
Also check out this page on the top 13 things that won’t affect your pagerank by JLH. Actually, JLH is an example of a successful blogger that applies a lot of these principles – he writes great articles that are often interesting, controversial and informative all at the same time. He links liberally. He uses a broad array of social networking tools.
Now – could I please ask you folks a favour? I’ve written a WTF at technorati – I’d appreciate your votes – it’s my first experiment with social networking 🙂 Click this link to vote.
Any other suggestions, feel free to post – hell, why not add your url to your comments – I remove no-follow from all comments after 14 days if they are relevant.
All the best,
April 28th, 2007
Well – that’s a provocative title – but at least they do work sometimes.. let me explain more..
What is Google Bombing?
Some argue that the first widely known Googlebomb was created by a men’s magazine, which used the anchor text (anchor text is the words I use in a link) ‘dumb motherfu**er’ in a link to a site selling George W Bush merchandise.
In fact George seems to have been the target of quite a few Google-bombs.
My first experience of a Google Bomb was when I was sent an email suggesting I type the words ‘miserable failure’ in a google search back in about 2003 – the resultant page was, of course, George W. Bush’s Whitehouse page.
Strictly defined, a good Google Bomb should be constructed in such a way that a site returned for a given phrase does not even have that phrase in its content. The theory is that if enough sites link to a site using a particular word or phrase, Google will simply assume that the site must in fact be about that phrase – even if the phrase isn’t on the target page.
So, of course, George Bush’s site doesn’t in fact have the words ‘miserable failure’ on it at all, but it (once) ranked first place for that phrase in any Google search because of the viral campaign launched to get thousands of webmasters to link back to that site with those words.
The coining of the actual phrase ‘Google Bomb’ is credited to a fellow by the name of Adam Mathes, who linked the phrase ‘talentless hack’ to a friend’s website.. this was documented on the site www.uber.nu, which unfortunately seems to be down now.
Google bombs the GoogleBomb
I used to love the Google Bomb so much that I registered and still own a site that I hoped would become a place where people could suggest and democratically vote upon potential political / humorous / educational googlebombs.
I was intoxicated by their potential power to educate and cause giggles, and perhaps even lead to real change – but, alas, on January 25th, 2007, things came to a premature halt.
On that day, Google announced in this post on their official blog that the glory days had come to an end – Google had created an algorithm that would curtail the impact of Google Bombs. Matt Cutts also spoke about the algorithm change in this article..
I was very sad and disappointed – but all good things must come to an end.
Some practical examples of good anchor text selection in action
These aren’t really examples of ‘Google Bombs’ per-se, but they do show the power of good anchor text selection.
For example, linking back to my lingerie site with the anchor text ‘brassiere’ brought my site from 5th page to 2nd position very quickly for that highly competitive word, even though I don’t have it on my site anywhere.
As another example, I once had an occasion where the inventor of a product which I was manufacturing and promoting (and into which I had sunk hundreds of thousands of dollars in cash and man-hours as seed funding) was becoming difficult and adversarial towards us, and had appointed a competitor without our knowledge.
The person was interviewed on national TV, and through sheer pig-headedness, chose to promote the product under a different name to what we had been promoting it as for several years. It was just a spiteful attempt to send search traffic to a competitor who had been appointed without our knowledge.
GoogleBombs / anchor text work very quickly
This taught me a little lesson about Google Bombs (or more specifically in this case, good use of anchor text) – because they are distributed in nature, they can work very, very rapidly to alter search results.
Luckily, in this case, we had a head start. I had noticed in my logs about three days before that we were receiving hits from the website of a media organisation.. this led me to the website, and I noted that it was a ‘draft’ page detailing the upcoming interview, in which this person referred to our product by another name.
Subsequently, I went to the person’s website and discovered that the ‘buy this product’ link pointed to a new site with a URL containing the ‘new name’. That site was in fact 302 redirecting to our site, in what I suspected at the time (correctly) was an amateurish attempt to nick our PR in preparation for an assault upon our business – on consulting my site logs, I discovered we had been receiving hits with this site as HTTP_REFERER for at least 3 months..
Their cover was blown. I spent the next few days starting to optimize my site for ‘our new name’, and had 301 redirected their 302 redirect using some clever .htaccess tricks to a site I knew was likely to get their new site banned or at the least seriously retard their hijack attempt.
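For anyone curious about the mechanics, a referrer-based rule like that can be written with Apache’s mod_rewrite in .htaccess – this is only a sketch of the general idea (the domain names are made up; I’m not revealing the actual rules used):

```apache
# Sketch only – redirect any visitor who arrives via the rogue site's
# 302 (i.e. whose Referer header is the rogue domain) with a permanent
# 301 redirect to some other destination. Domain names are invented.
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://(www\.)?rogue-site\.example [NC]
RewriteRule ^(.*)$ http://another-destination.example/ [R=301,L]
```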
I also got some friends with regularly crawled sites to link to the ‘rogue site’ to give Google a clear shot at indexing the ‘new’ content – this might seem like a low act – but, remember, this was a defensive action rather than offensive.
Through the use of googlebombs (it helps having webmaster friends) and the fact I already had an established brand and high crawl rate, I was able to quickly (in less than 24 hours) rank first place for the products ‘newly invented’ name, and take advantage of the media exposure.
I noted on the night before the interview aired, that the 302 redirect was removed – but the damage had already been done. It actually took them about 3 months to get reincluded in Google, so my counter-attack seemed to have worked.
We also ranked first place for the person’s name for about the next 6 months, which, of course, made us the villains, not the person attempting to steal our rankings 😉
So – how does the new anti-GoogleBomb algo work?
I thought possibly Google may have changed their algorithm in such a way that a googlebomb for a word or phrase that either seemed contextually irrelevant, or didn’t exist on the target page would no longer rank. This made me a little worried that what had been a previously powerful seo tool for some of my commercial sites would no longer work.
As a lot of you know, I have a pretty successful WordPress theme called Blix Krieg.
When people install my theme, there is a link in the footer back to this site, and also one of my commercial sites (www.jaisaben.com). The anchor text for the link back to my other site is ‘by theDuck’.
Just recently, I was checking Google Webmaster tools and I found that a fair percentage of traffic coming to my other site now comes from people searching for the phrase ‘theDuck’. Who the hell searches for “theDuck” – I dunno – but it seems quite a few do.
Sure – it’s not a highly competitive phrase, but it does prove to me that Google hasn’t deprecated the value of inbound link anchor text outright – and whatever their new anti-googlebomb algorithm is, it probably has very little to do with contextual relevance.
Certainly, my Jaisaben site does not have the phrase ‘theDuck’ on it anywhere, it has nothing about ducks on it and yet it now ranks in second place for a search for the phrase ‘theduck’ – purely and simply because of anchor text.
Lessons I have Learnt
This has taught me a few lessons:-
- Next time I release a wordpress theme, I’ll use real words rather than nonsense ones in my footer anchor-text, preferably valuable ones (mesothelioma anyone? – see this link).
- Anchor text should still be in anyone’s SEO arsenal.
- It would appear (at least for uncompetitive phrases) that having heaps of sites linking back to you with the exact same anchor text doesn’t cause any penalty, contrary to what other SEOs have said.
- However the new Google-bombing algorithm works, it is not simple, and it still leaves enough latitude to use anchor text in clever and powerful ways.
- Never form a business partnership with a mad person, no matter how good their ideas seem to be, unless you have the patience of a Saint and the bank account of Bill Gates. The same probably also applies to personal relationships 🙂
All the best!
April 9th, 2007
Hi everyone – sorry for the long time between drinks 🙂 I’ve been working hard getting back into my PhD…
Matt Cutts Blog Hijacked
Just an interesting little tid-bit today. It seems that Google’s famous unofficial blogger, Matt Cutts, has had his site hijacked by a bunch of hackers calling themselves the Dark Seo Team.
It may be that the Hijack has been resolved by the time you read this, so just in case, if you click the thumbnail above, you can see what his site looked like when I visited today.
Is this a WordPress Vulnerability?
It seems that no-one, not even Matt, is immune to being hacked. I wonder whether this has anything to do with Matt’s recent upgrade to the latest version of WordPress?
I guess we will see soon enough!
It’s interesting that the hack is from a mob calling themselves Dark SEO..
Interesting, I say, because I think I saw warning signs that they were planning something funky about three weeks back.
Dark SEO had Cloaked Copies of Matt Cutt’s Page Weeks Ago
How? Well I was having a brief look at my Google webmaster tools ‘links’ section back then, and I noticed that one of my pages (Damn Ugly Websites) was being linked to from a site with a fairly suspicious URL (click here to see the site). I checked out the site and pretty quickly realised they were running a cleverly cloaked copy of Matt’s Site…
What is Cloaking?
What’s Cloaking? Well, basically, if you follow the link, you’ll see it looks like an innocuous web page to us, the human reader, but it seems that if you happen to be googlebot, that site presents completely different content – a copy of Matt’s Site. The perpetrators? None other than darkseoteam.com, who have now Hijacked Matt’s Site.
I did send a message to Matt suggesting he should check it out.. I think my exact words were “Matt, I found this URL – you should probably check it out” – Wow – if it turns out that was the first stage in their hack attack, I will be feeling very prescient indeed :).
But DAMN – If they were clever enough to hack his site, surely they should have been clever enough to put some adsense units on there as well 🙂
NOTE: He got the better of me.. this was an April Fools Joke from Matt 🙂
April 1st, 2007
Just thought I’d throw another few links to some new blogs I’ve seen hit the ‘link-o-radar’ recently.
As you all probably know, I develop the wordpress theme that you are looking at (get it at my blix krieg download page) which is a wordpress 2.1 compatible version of Sebastien Schmieg’s Blix theme, and incorporates adsense (inspired by additions from SEO Dave).
I also do a fair bit of ‘blog-optimisation’ for people, including help with installation, customization and SEO considerations to help make blogs a financial and social-networking success.
Whenever someone installs my theme, I get pinged, and I often go check a random selection of them out on a weekly basis.
Here’s a few from this week which I liked:-
Triathlete Dad – Regular commenter on this Blog, Susie J, has finally convinced her husband Dave to join the club and get a blog. Dave’s blog is lining up to be a real success – he talks of his inspirational change from (slightly) overweight technical rep to triathlon-racing superdad.
This is one blog I’ll be watching – certainly the content is well written, and from an income perspective, I think the ads that adsense is pulling at the moment are extremely relevant to the content and likely to produce a good return.
Horse Logos – not a BlixKrieg blog but rather a Zen-Cart based ecommerce site I’m in the process of helping set up for a customer. I’m proud of how this is going – coming along nicely and indexed in less than a week.
Working for Cats – a nice Blix Krieg based blog about CATS – again, I was commissioned this week to help the owner make a few mods. Amongst other things, we widened the default theme out a little and fixed a few SEO problems.
Midwestern NGO – a non-profit organisation dedicated to helping developing countries. Welcome to BlixKrieg!
Lotan’s Blog Space – A blog about conservation in Israel and interesting things like living in an eco-dome. I love the work these people have done to the theme, and especially the background art – looks great, two thumbs up!
Just a small comment about comments – I’d highly recommend that Lotan enable comments without registration – users and readers are what make a blog, and being able to make comments tends to get readers interested, and keep them. Plugins like Akismet and Bad Behaviour are extraordinarily brilliant at filtering spam – but if you are really concerned about unsolicited comments, you can enable Akismet AND require that first-time commenters’ comments are reviewed by you before they are given unfettered access – I’m willing to help you if you’re not sure, Lotan.
name not included – A commercial site that’s pinched the theme and removed acknowledgement. I wasn’t going to include this one, as when I visited they had taken the footer out – ie removed the backlinks to wordpress and this theme. That’s a pet peeve of mine, as the development of this theme is largely driven by the traffic we get. But I’ve decided instead to add this no-followed link, as you might like to see what they’ve done.
March 16th, 2007
Just a quick post today.. been busy with work..
I was reading the local (online) newspaper today and came across this article.
What struck me was not so much the article, but rather the ad that had been placed next to it.. sometimes automatic ad placements send an unintended message – (click the picture to make it readable).
For those readers who aren’t familiar with Rugby, it’s kind of equivalent to American football.
March 15th, 2007
The ‘Scraper Site’ – Benevolent Friend or Deadly Foe?
One of our regulars, Susie J, left the following question for me this morning –
Can you have a bad link? I checked my inlinks through technorati. A few stood out with questions marks. Here’s a couple of them:
These sites do not have any of their own content — just a list of other sites. There is a link to a specific article on my site — but it does not identify my site by name.
Hiya Susie – these are called ‘scraper’ sites.
I’ve got several of them linking back to me too.
There are a number of things you need to consider first before you get too worried about them.
Links from a Bad Neighbourhood – Good or Bad?
Is it bad to have them linking back to you? Well, there are a number of different perspectives on that.
I’d say this right off the bat – Google knows that you can’t help who links to you, so it is impossible to get an official Google ‘penalty’ from such a site linking to you.
If that were possible, I could set up a mean link farm violating every one of Google’s webmaster guidelines, and get my competitors struck off Google’s index just by linking to them from my Uber-evil site.
The only exception, of course, is if you link back to the scrapers, in which case it is possible (but unlikely) that Google may consider you’re participating in some link exchange scheme with them and you might get penalised – that’s called linking to a ‘bad neighbourhood’.
Whether or not links from these sites is good or bad from an SEO perspective is a different matter.
What’s their game?
I had the following discussion about this with a few of my SEO friends a few months back, and the general consensus is that those sites are trying to get good search engine positioning by fooling Google into thinking that they are authorities on a particular topic – such as the common cold, in this instance.
Since they link back to me, I don’t get overly perturbed about them, but I have been puzzled about what their game is – because:-
- They can’t be after Pagerank – who’s going to link back to a site with no real information? (except people like us, wondering why they are linking to us – but you’ll note I nofollowed the links to them)
- They aren’t stealing content – they are acknowledging the source of the content.
- They aren’t MFOA (made for adsense) as they (mostly) aren’t displaying ads YET.
So what’s their game? Well Susie, I got your message this morning right after I got back from the gym. I’ve just had a shower (my thinking place) and I believe I may have their strategy sussed.
I reckon they have the same opinion as me – make your outlinks count. Whilst linking out to other sites does, by definition, reduce your pagerank, the effect on your search engine positioning can actually be positive.
This is somewhere along the same lines as ‘it’s not what you know, it’s who you know’ – if you link to a lot of other sites about a topic you start to look like an authority in that topic.
A Devious Black-Hat Scheme..
So search engine positioning is really a combination of relevance and pagerank. So in this case, they are trying to gain relevance in the topic of ‘the common cold’.
I think their strategy might go something like this.
- Use adwords to find some lucrative keywords (for instance, I would imagine competition for the keyphrase ‘the common cold’ would be fierce, so it would be lucrative).
- Crawl the net looking for articles about ‘the common cold’ – or better still, just do a Google or technorati search for the phrase.
- Take small snippets of those articles, and link back to the origin, thus reducing the likelihood of being reported as spammers (after all, everyone likes being linked to).
- Cobble together a large number of snippets in such a way that it’s unlikely that the density of information from any one source is suspiciously high on the page (thus avoiding the possibility of triggering a spam flag or duplicate content penalty from Google – and being deindexed or sent to supplemental).
- Wait to be crawled by Google.
So now, what do they have – they’ve got a keyword rich page, full of relevant links to topical pages about the common cold.. If I’m an automated robot I’m beginning to figure ‘hey, this looks like an interesting page about the common cold’.
So, they’ve got relevance – all they are now missing for good search engine positioning for the phrase ‘the common cold’ is pagerank (PR). Easily fixed – buy a link from a high pagerank site, or indeed (since these people likely have heaps of sites) throw a link at the page from several of your high pagerank sites, preferably in a related field.
Now Comes the Traffic.
VOILA! You’ve got pagerank and relevance – you suddenly appear to Google to be an authority on the topic of ‘the common cold’.
So hopefully, since you’re now the new authority on the common cold, you’ve got great search engine positioning too – and with positioning comes traffic – lots of traffic.
Sir Richard Branson started his empire by standing out the front of potential locations for his record stores, and physically counting the number of people walking past each site per day. He knew that the more people walking past the better – this is the online equivalent.
Think about it – the two links you sent me are scraper sites about cancer and the common cold. Hands up anyone that doesn’t know someone who’s had a cold this winter? Hands up anyone that doesn’t know of someone affected by cancer?
These keywords weren’t chosen by accident – they both have potentially very high traffic!
Money – lots of money, with Adsense.
Here’s where the brilliance lies – since the site doesn’t really give any answers, the first thing people are going to want to do when they get to the site is go elsewhere – so, what to do with all this traffic?
BRING ON THE ADSENSE. Scatter adsense all over the site and make clicking them the only real way of escaping. Remove the links back to the original sites (after all, you only had them there to make yourself look legit and stop people from reporting you as spammers) and you’ve successfully run the black-hat gauntlet and probably made a motza on your lucrative keyword.
These schemes are all about maximizing traffic and hence financial reward.
They don’t expect to be around long before they are taken down or detected. This is probably the reason they choose very high traffic keywords – so that they can make hay while the sun is shining.
So is having links from scraper sites bad for me?
So from a net useability perspective, sure, these sites are bad for everyone.
I can remember when I first started surfing the net way back in the mid nineties, you could search for just about anything and it would return a multitude of links to porn – back then all you really had to do to game a search engine was to have heaps of ‘keywords’ on your page (a favourite tactic was to have a huge list of smut related words at the bottom of the page). Luckily Google’s algo has matured and that just doesn’t work anymore.
Plus – as of last year, the majority of web users are using the web for commerce and business, rather than porn, which had dominated searches for the entire public history of the net (says a lot about human nature, hey?). So these days, the majority of these schemes are in it for adsense income.
Don’t know about you, but I don’t want to go back to the bad old days where search results are dominated by useless crud – only this time it’s useless crud with adsense ads, rather than asking for your credit card number or offering ‘free previews’. Luckily, so far, Google seems to be keeping pace with the spammers, and (whilst there is doubtless still loads of money to be made) things have become a whole lot harder for them.
The verdict – good or bad? From the individual short term perspective of your site, being linked to by these sites probably has no effect (at worst) and perhaps even a small positive effect (at best) on your pagerank.
It’s those that steal your content and don’t link back to you that are bad, as (occasionally) Google deems their version of your content the ‘original’ and cans your site to the supplementals as a plagiarised copy.
What can I do about plagiarised content?
A good way to check for copies of your content online is to use the tool called COPYSCAPE.
If it really irritates you that these sites have copies of your material, there are a number of things you can do about it.
First and foremost, most of these sites use some form of spider to harvest your content.
You can try banning the rogue spiders using robots.txt as described in this article, but that approach only works for the ‘well behaved’ bots – those that obey robots.txt. Furthermore, many of these bots seem to harvest their information directly from technorati, so there is nothing you can do about that.
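For the well-behaved bots, a robots.txt ban looks like this – note the user-agent name here is invented for illustration, and (as above) genuinely rogue scrapers will simply ignore it:

```text
# robots.txt – ask a (hypothetical) misbehaving spider to stay out
User-agent: RogueScraperBot
Disallow: /

# Everyone else may crawl the whole site
User-agent: *
Disallow:
```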
The second approach is to report the sites as a spam site to Google (you can do that in Google webmaster tools – it’s under the ‘tools’ menu described in this article). This gives Google a ‘heads up’ that the site is a spam site.
As for me personally – now that I’ve realised their game, I’ll be reporting these sites.
This goes against my ‘all publicity is good publicity’ ethos, but what the heck – why should they be making money at the expense of legitimate sites.
All the best,
March 12th, 2007
In part three of this series about some tips and tricks I learnt during my recent visits to the Google Adsense Conference in Brisbane, Australia, I’m going to write about ‘targeting’ your ads to your content.
People get grumpy about the relatively low income they receive from adsense ads on their blogs – essentially, they want to know how to make money from their blogs, and having heard the rags-to-riches stories of bloggers making hundreds of thousands of dollars from advertising online, they want a slice of the action – and why the hell not?
So now I need you to take a deep breath whilst I take you through some of the latest guidelines for helping google to serve ads that are more likely to be of interest to your customers.
Basically, to have a site that makes money you need at least three things:-
- Website Traffic.
- Relevant Content.
- Relevant Ads, and a high click through rate from those ads.
Those are the three key ingredients for making money from Google Adsense, and lacking any one of them will be to the detriment of the others.
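Just to illustrate how those ingredients interact – adsense income is essentially multiplicative, so a weakness in any single factor drags down the whole result. A quick back-of-envelope sketch (all figures invented for illustration):

```python
# Back-of-envelope AdSense income: the ingredients multiply, so a
# shortfall in any single factor drags down the whole result.
def estimated_income(daily_visitors, ctr, earnings_per_click):
    """ctr is the click-through rate as a fraction (0.01 means 1%)."""
    return daily_visitors * ctr * earnings_per_click

# Hypothetical figures: same traffic and payout, different CTR
info_site = estimated_income(1000, 0.01, 0.20)     # 1% CTR, roughly $2/day
product_site = estimated_income(1000, 0.10, 0.20)  # 10% CTR, roughly $20/day
```

Same traffic, same payout per click – a tenfold difference in CTR is a tenfold difference in income, which is why the rest of this post is about targeting.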
Assuming you have good traffic (and I’ll be writing tutorials on that in the near future) and compelling content, all that remains is to encourage a high ‘click thru rate’ on your site.
I have to admit, on this site, the content is generally great and informational, but I’ve previously simply relied on the google adsense code to serve ads that are likely to be clicked on. I’ve tried out image ads, I’ve tried out text ads, I’ve tried out different colours and positions for the ads – and I’ve seen minor changes from doing so.
A lot of the time, though, I think I go a bit too far by being too solutions-oriented – for instance, someone has a problem, I’ll try to solve it for them, they leave happy, I get good feedback from them and probably generate traffic through referrals. That’s all great, and it’s a part of my growth strategy at this early stage of my blog.
So, having the content, I hope the ads will be clicked and everything will be fine from there – but I’ve been finding this isn’t the case. On my other commercial sites, I end up with click-through ratios (CTR – the percentage of visitors that click on ads) of less than 1%, whereas on my product-based sites I get closer to 10%.
I started to think about the reasons this may be – perhaps my readers are ‘ad savvy’ and have a form of blindness to the ad content – or perhaps I am providing exactly what they need – information – and they have no need to follow my ads to get more of it.
Someone suggested to me that I should leave articles I write ‘hanging’ so that the reader feels compelled to look at the ads to find more information – not a bad idea, but it goes against the ethos of this blog to an extent. I think the real answer is to write my articles in such a way that they want to take the next step, and offer, in the advertisements, companies and individuals that may help them do so.
Enter, stage right: adsense section targeting – this was released last year, and is an incredibly simple way to ensure that your ads are ‘micro-targeted’ to the niche group viewing your pages. So why haven’t we all heard about it? I think a lot of website and SEO people have kept this one close to their chests, as it’s a fantastic tool that can really help dramatically increase your returns and make the SEO people look worth their weight in gold.
But you don’t need to be a big shot blogger to implement this code – all you need to do is place tags around the content you think is most appropriate to your audience.
the tags are <!-- google_ad_section_start --> and <!-- google_ad_section_end -->
When the google adsense bot sees the <!-- google_ad_section_start --> tag, it expects that any information appearing between that tag and the end tag should be used when it considers what sort of ads to serve.
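In a page template, that ends up looking something like this (the surrounding markup is just an invented example):

```html
<div class="post">
  <!-- Sidebar, navigation etc. outside the tags carry less weight -->

  <!-- google_ad_section_start -->
  <p>The content between the tags is what the AdSense crawler is
     asked to emphasise when choosing which ads to serve.</p>
  <!-- google_ad_section_end -->
</div>
```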
So, for example, I have taken an abstract from a recent Wired Magazine Article about how Yahoo has missed the boat when it comes to website advertising.
I found a paragraph in there that speaks about Hollywood, TV shows, theaters, TV sets – all things that aren’t really spoken about much in the rest of the article, but that I think will bring commercial ads about technology – things that some of the geeks reading this blog might be interested in.
The truth is that when Semel worked in Hollywood, he understood more about how movies and TV shows made it to theaters and TV sets than virtually anyone else on the planet. Early in his career, during stints in New York, Cleveland, and Los Angeles, all Semel did was sell movies to theater chain owners. He’d show up at each theater — there were only a handful of national chains then — with a list of the movies Warner was going to release over the next few months, and each owner would bid on the movies he wanted.
In the next paragraph, I’ve found interesting information about the infrastructure of Yahoo – keywords like servers, technology, redesigning a database, redesigning a user interface – all are rock-solid keywords that should hopefully trigger ‘mediabot’ to deliver an interesting combination of consumer products ads and advertising for high-grade database and server technology.
But now, despite Semel’s achievements in Hollywood and early success at Yahoo, Silicon Valley is buzzing with a familiar refrain: Wouldn’t an executive with a little more technology savvy be a better fit? Semel has been Yahoo’s CEO for nearly six years, yet he has never acquired an intuitive sense of the company’s plumbing. He understands how to do deals and partnerships, he gets how to market Yahoo’s brand, and he knows how to tap Yahoo’s giant user base to sell brand advertising to corporations. But the challenges of integrating two giant computer systems or redesigning a database or redoing a user interface? Many who have met with him at Yahoo say he still doesn’t know the right questions to ask about technology. “Terry could never pound the table and say, ‘This is where we need to go, guys,’” one former Yahoo executive says. “On those subjects, he always had to have someone next to him explaining why it was important.” One could have made a convincing argument two years ago that such deep technical knowledge didn’t matter much. But now we have empirical evidence: At Yahoo, the marketers rule, and at Google the engineers rule. And for that, Yahoo is finally paying the price.
The Lesson endeth for today – tomorrow we will see the results and expand upon them to make some money 🙂 Don’t be alarmed if it doesn’t look like it’s worked at first – it can take 24 to 48 hours.. patience 🙂
March 12th, 2007
This post follows on from my tutorial about pulling pages out of the supplemental index.
A reader at Google Webmaster Help Forums has asked me if it would be possible to post a link to his site about classical music to try and pull one of his pages out of the supplementals.
I have had difficulty with one page from my site that insists on staying in the supplemental index. I had a mistyped URL that I subsequently made a 301 redirect back to the correct link. Now both the bad URL and the good are in the supplemental and have remained there for ages, and I can’t seem to shift them. Would it be possible for you to throw a link at that page for me to try to force it back out?
Your site has a couple of other probs that may be causing the supps, SMc (in particular, check for duplicate content), but let’s try and see if it works.
Here is a link to SMc’s Classical Music Site – By the way SMc – some tips:-
- I don’t remember where it was that I read this, but google prefers short URLs – the physical length of the URL, and the ‘depth’ of the URL (depth of directories), should be kept to a minimum. I think it was Vanessa Fox that talked about that – if someone has a link, please post it.
- You should keep the number of links on a page to less than 100, if possible (see Google’s Webmaster Guidelines) –
- Every page should be 2 or 3 links from the home page.
- Links from ‘related’ websites with high PR probably carry more weight than links from ‘unrelated’ sites (i.e. mine versus a music site).
- Use copyscape to check for duplicate content on your pages (see my primer on the causes of supplementals here).
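On the ‘2 or 3 links from the home page’ tip above – you can actually check click depth yourself with a simple breadth-first search over your site’s internal link graph. The page names below are made up for illustration, not SMc’s real structure:

```python
from collections import deque

# Hypothetical site: each page maps to the pages it links to internally.
links = {
    "/": ["/composers/", "/recordings/"],
    "/composers/": ["/composers/bach/", "/composers/mozart/"],
    "/recordings/": ["/recordings/2007/"],
    "/composers/bach/": [],
    "/composers/mozart/": [],
    "/recordings/2007/": ["/recordings/2007/january/"],
    "/recordings/2007/january/": [],
}

def click_depths(start="/"):
    """Breadth-first search: each page's distance (in clicks) from home."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links[page]:
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

depths = click_depths()
# Pages more than 3 clicks deep are candidates for better internal linking.
deep_pages = [page for page, d in depths.items() if d > 3]
```

Any page that shows up in `deep_pages` (or doesn’t show up in `depths` at all – i.e. is orphaned) is worth linking to from somewhere nearer the home page.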
Follow-up – it worked 🙂
March 10th, 2007
A while ago one of my online buddies, JLH, wrote this tongue-in-cheek post (or should that be boast? 🙂 ), in which he pointed out that he now outranked Google’s own famous blogger, Matt Cutts, for one of his posts.
JLH and I have been having a light-hearted game of one-upmanship for a while now (see the now infamous Banalities of Bananas post here, and my even more ridiculous second attempt to beat JLH on the lucrative ‘Banal Bananas’ keywords here). JLH ultimately prevailed in the Banal Bananas stakes, so I’ve been racking my brains about ways to beat him since… so JLH – here’s my chance to match you on this one…
We now outrank Matt Cutts for the search “How to get out of the supplemental index” – granted, it’s probably a temporary fluke, but I thought I’d get mileage from it while I can 🙂
Cheers and Have a great day,
Update – for the moment we seem to be holding in 2nd position for the above search, and getting some nice traffic too… must have done something right 🙂
March 8th, 2007
What are inlinks (Backlinks)?
Inlinks (also known as backlinks) are an important measure of the ‘popularity’ of your site – the greater the number of other sites that link to you, the more likely it is that you will have a high Pagerank (PR – see a definition of pagerank here).
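That recursive ‘links to you, weighted by who links to them’ idea can be sketched in a few lines of Python. This is a toy three-page web illustrating the general PageRank iteration, not Google’s actual implementation:

```python
# Toy web: each page maps to the pages it links out to.
links = {
    "a": ["b", "c"],  # page a links to b and c
    "b": ["c"],
    "c": ["a"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Repeatedly share each page's rank among the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start everyone equal
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

ranks = pagerank(links)
```

Page ‘c’ ends up with the highest rank here because it is linked from both ‘a’ and ‘b’ – which is exactly the sense in which more (and better-ranked) inlinks mean higher PR.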
Does Google show Backlinks I have to my site?
Yes, but not accurately. There are two main ways:-
- You can use the link: modifier in a Google search (eg link:www.utheguru.com), but it’s known far and wide as being very inaccurate – in fact Google claims that it is deliberately inaccurate.
- You can also use the ‘links’ tab in Google Webmaster Tools, but it is likewise known to be out of date and inaccurate.
A More Accurate Backlink Count – Yahoo Site Explorer
Yahoo, in my honest opinion, has the most up-to-date way of counting links to your site – it is called Yahoo Site Explorer.
How to use Yahoo Site Explorer to Count Inlinks (Backlinks)
Here’s a brief step by step guide to using this Yahoo Site Explorer Feature. (You can click on the thumbnails to get a full size view).
Step 1 – go to www.yahoo.com and enter site: followed by the name of your site in the search box (eg site:www.utheguru.com). This will take you to Site Explorer.
Step 2 – At the top left hand side above the search results, click on ‘Inlinks’.
Step 3 – In the Leftmost ‘Show Links’ Box, click on ‘except from this domain’ – This stops Yahoo from including links from within your own site.
Step 4 – In the next drop-down box across (to:), click on either ‘Entire Site’ or ‘Only This URL’ – I’d suggest ‘Entire Site’ is best, but you can check links to individual URLs using the other option, which can be handy.
Step 5 – Voila! You can now see how many inlinks you have, and where they come from.
I have a WordPress Plugin that will give you a count of other sites linking to you automatically – it’s called the wordpress yahoo sidebar widget, and you can see it here.
March 7th, 2007