Gone Supplemental – Joomlahacks
January 10th, 2007
Heard recently on the Google Webmasters forum about a guy who had been putting heaps and heaps of effort into a nice SMF forum at www.jaisaben.com. Problem is, he used the Joomlahacks Joomla-SMF bridge, and now all his content in Google has gone supplemental. Supplemental, as we all know, means no one can find your content when searching in Google. It gets put into a supplemental index, which is where Google puts content it thinks is duplicate, or that for some other magical reason (which us mere mortals will never know) it considers inappropriate. It's the kiss of death for anyone running a content site and hoping to generate income through AdSense or similar advertising.

Others in forums I have read have suggested that CMS systems like WordPress and Joomla actually cause the problem because most of the headers are the same or similar between pages; still another idea is that the RSS feed causes the problem because it, in essence, contains duplicate content. What a huge bummer.

I think I've worked out why this happened to him, though: the Joomlahacks SMF bridge leaves your original SMF board wherever you put it and rewrites all URLs to a new form. That means DUPLICATE CONTENT, guys and girls. What a waste of time and effort for the poor guy. Now I find myself wondering how he can get this content out of the supplemental index. I thought about moving the SMF board to a subdomain of another URL, but that doesn't get around the original problem; I'm guessing it would just be reindexed as supplemental or duplicate content. This raises a number of questions:
- How much do you have to rework original content to make it appear to Google as not duplicate content?
- Why in the heck didn’t Joomlahacks think of this problem?
- Is there any benefit for SMF purposes in aggregating content into subdomains?
- Could he have got around this problem if he had restricted Google from crawling his /forum subdirectory using robots.txt? (See the sketch after this list.)
- What do you think about the idea that CMS systems in themselves may lead to supplemental problems?
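On the robots.txt question, here's a minimal sketch of what that restriction might look like. The /forum path is my assumption about where the untouched SMF board still answers, with the bridge serving the rewritten URLs elsewhere:

# Hypothetical robots.txt at the site root.
# Assumes the original SMF board is still reachable under /forum
# while the Joomla bridge serves the rewritten URLs.
User-agent: *
Disallow: /forum/

In theory that keeps crawlers from ever seeing the second copy, though it would also stop any inbound links to /forum from being followed.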
Any ideas?
Entry filed under: SEO, SEO Discussions
6 Comments
1. toughtitties | January 11th, 2007 at 4:40 am
Yep – we have the same problem too… I'm thinking this might also have something to do with PageRank? I have some blogs with low PageRank that have big problems with supplementals, but my higher-PageRank blogs have no such problem – same WordPress install…
2. DuckMan | January 11th, 2007 at 3:49 pm
After asking about this on the Google Webmaster forums, I was given the following advice:
3. MrGamma | January 21st, 2007 at 8:38 pm
I can't believe you quoted me… If I could take back the above statement I would… I no longer believe anything which I stated there… In fact… all of the above mentioned can affect your rankings… but it is highly unlikely that it is the cause of them falling into the supplemental index…
My Mind will change tomorrow I am sure…
4. MrGamma | January 21st, 2007 at 8:56 pm
I should elaborate then…
* How much do you have to rework original content to make it appear to Google as not duplicate content?
I don't believe Google has a duplicate content penalty… It has duplicate content filters… Google will treat pages differently if some pages are identical matches and others are closely matched on the same site. It most likely applies the filter on a page-by-page basis and a domain-by-domain basis.
* Why in the heck didn’t Joomlahacks think of this problem?
I have not used Joomlahacks… But I imagine it is a URL rewriting mechanism… If all of the old URLs are rewritten and suddenly the links which were responsible for distributing PageRank are no longer in existence… then all of the site is prone to fall into the supplemental index. Redirects need to be in place to catch the incoming association from those old URLs and redirect the PageRank into the new URLs…
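To make that concrete, here is a minimal Apache mod_rewrite sketch of such a redirect. The URL patterns are hypothetical stand-ins; I haven't used the bridge either, so the real rewritten form (including the com_smf component name) may well differ:

# Hypothetical .htaccess in the old /forum directory.
# Permanently redirects an old SMF topic URL like
#   /forum/index.php?topic=123.0
# to an assumed bridged form like
#   /index.php?option=com_smf&topic=123.0
RewriteEngine On
RewriteCond %{QUERY_STRING} ^topic=([0-9.]+)$
RewriteRule ^index\.php$ /index.php?option=com_smf&topic=%1 [R=301,L]

A 301 tells Google the move is permanent, so whatever value the old URLs had accrued should, in theory, flow to the new ones instead of leaving two competing copies in the index.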
* Is there any benefit for SMF purposes in aggregating content into subdomains?
The keywords in a URL are important, as they are weighted against the content of the pages behind them… for any other purpose, it is something I don't know.
* Could he have got around this problem if he had restricted Google from crawling his /forum subdirectory using robots.txt?
I think it is a bad idea to restrict the robots from crawling any of your pages which have content you want indexed. The only reason the robots file should be used is to keep multiple requests to the same page via Google's dynamic URL crawling technique to a minimum… so that Google may access more of your content faster, rather than different states of your content.
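As a speculative example of that legitimate use: forum software like SMF can expose the same page under many parameterised URLs (session IDs, sort orders, print views), and Googlebot honours wildcard patterns in robots.txt, so something along these lines would keep it off those duplicate states while the real content stays crawlable. The parameter names here are illustrative, not taken from the post:

# Hypothetical rules using Googlebot's wildcard extension
# to skip duplicate "states" of the same forum pages.
User-agent: Googlebot
Disallow: /*PHPSESSID=
Disallow: /*;sort=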
* What do you think about the idea that CMS systems in themselves may lead to supplemental problems?
Free software is loaded with problems, and optimizing for search engines is sometimes the last thing on a programmer's mind.
I have yet to see an OSCommerce template in use on a site which I frequent… I feel they do very poorly with search engines in general… same goes for phpBB… There is something wrong with these freeware solutions, and my first guess is that the lack of unique structure in the templates triggers a duplicate content filter of sorts… which moves people to the back of the line as far as search results go…
This is highly speculative of course.
5. MrGamma | August 6th, 2007 at 7:03 pm
Doh, I just need a link…
6. Erwin Lozano | March 19th, 2009 at 4:02 pm
This is indeed scary. We all want good traffic, and being indexed as supplemental will hit us big time. Thanks for the warning and for the elaboration, MrGamma, highly appreciated!