Google Penguin Update – Get your traffic and SERP Ranks BACK
If you were among the many sites hit hard by the Google Penguin update, here
are a few tips and tricks on how to get your traffic back, recover your
SERP rankings and get your website or blog back to normal!
•Links that come from PR n/a or PR0 sites. REALLY – ARE YOU SURE?
First off the bat, if you discount all the links which sit on PR n/a or
PR0 pages, you will be discriminating against around 90% of the pages on
the Internet! I’m sorry, but just because a page is currently PR0 or
PR n/a doesn’t mean it’s toxic or bad IN ANY WAY, and when has anyone
from Google ever said it was? In fact this is not the first time a
so-called authority has suggested such links are bad; it’s a common
misconception which I’ve seen espoused in pretty much every major
article on this subject in the last year.
Moreover, this is probably the quickest way to build a list of FALSE
POSITIVES, which will have negative value and end up causing even more
anguish, confusion and harm to a domain’s rankings if you use this as a
signal in your LINK AUDIT and subsequent link-removal campaign.
AND THE REASON? The page could simply be newly discovered (within the
last four months) and hence not yet included in one of Google’s PageRank
updates (which take place roughly four times per year), or it may just
be a page on a large site which doesn’t have enough PageRank (juice) to
pass around all of its wonderful pages, and is thus unable to lift those
pages into the so-called SAFE ZONE of PR1 or above…
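To see how much of a backlink profile such a rule would throw away, here is a minimal sketch (the function name and the data shape are my own illustration, not anyone’s actual audit tool): it takes a mapping of linking URLs to their toolbar PageRank, with None standing in for PR n/a, and reports the share that a “discard PR0/PR n/a” filter would discard.

```python
def pr_filter_impact(links):
    """links: mapping of linking URL -> toolbar PageRank (None for PR n/a).

    Returns the fraction of the backlink profile that a
    'discard PR0 and PR n/a' rule would throw away - often
    the vast majority, as argued above."""
    discarded = [url for url, pr in links.items() if pr is None or pr == 0]
    return len(discarded) / len(links)
```

Run it on a typical profile and you will usually see a figure close to the ~90% quoted above, which is exactly why this filter generates so many false positives.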
This is one of those myths that seriously needs to be put to bed, and I
welcome the day when someone asks one of the spokespeople at Google for
clarification or advice on this much-lauded white elephant in the room.
If you’ve been sold on this as a method of locating your bad links, you
will have wasted all subsequent effort in getting those links removed and
no doubt done yourself a disservice in the process.
And what about all the other RED flags raised by the so-called experts?
•Links that are site wide
This one is undoubtedly true, and the only question left is how to most
effectively find such links in the data you have. These links are
normally located in the sidebar (blogroll) or footer of bunk domains,
often alongside a bunch of other similarly smelly links, all with clear
money-term anchor texts.
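As a rough illustration of how you might surface site-wide links in a backlink export, here is a hedged Python sketch (the function name, data shape and threshold are assumptions for illustration, not a standard tool): it groups linking pages by domain and flags any domain where the same link appears across an unusually large number of its pages, the typical footer or blogroll footprint.

```python
from collections import defaultdict
from urllib.parse import urlparse

def flag_sitewide(backlinks, min_pages=50):
    """backlinks: iterable of (source_url, anchor_text) rows from an export.

    Groups linking pages by domain and returns the domains where the
    link appears on at least min_pages distinct pages - the classic
    footprint of a footer or blogroll (site-wide) link."""
    pages_per_domain = defaultdict(set)
    for source_url, _anchor in backlinks:
        domain = urlparse(source_url).netloc
        pages_per_domain[domain].add(source_url)
    return {d: len(pages) for d, pages in pages_per_domain.items()
            if len(pages) >= min_pages}
```

In practice you would tune min_pages against the size of each linking site, since a link on 50 pages of a 60-page blog is a very different animal from 50 pages of a million-page portal.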
•Links which come from very new domains. REALLY – HERE WE GO AGAIN – MORE BOLOGNESE!
New domains typically have no PageRank or juice value, and since new
domains are sandboxed (Google’s penalty on new domains which have not yet
earned trust), Google sandboxes the outbound links on those domains in
the same manner; hence anyone who tries to manipulate the SERPs with such
techniques fails. This is not some widespread problem that Google are
still grappling with, so singling out the new domains you have links
from is just going to add more white noise to any data you end up with.
•Links which come from domains with little traffic.
This just shows a complete misunderstanding of, firstly, the nature of
the Internet and, secondly, the problems that Google are grappling with.
If Google were to use such a metric or signal for improving the SERPs,
they would end up with so much false data that they would be totally
lost and have no idea what was an authority or why. Most articles see a
spike of traffic when first published, and that traffic dwindles as they
get buried away. When you build a module into an algorithm you want data
which means something significant, not something that will result in
pure white noise. Google have NEVER said or even suggested that they use
such a signal to lower rankings, and in all my years in SEARCH I’ve
never seen any evidence that such a concept is being used to qualify the
value of links, despite beliefs to the contrary.
You can put this one in the quack box along with another similar signal
I’ve seen used to mark links as suspicious, one called Low Link
Velocity, which is apparently when a domain or page is no longer
attracting backlinks as quickly as it once did and is therefore likely
to be a domain that has been sold to a link spammer and is now being
used as a link farm. Again, this could never be used by Google as a
signal to qualify links: many domains and pages attract a natural spike
of backlinks when first published, and interest generally diminishes as
time goes by. That’s pretty normal and certainly not a signal which
could be relied upon to qualify a link!
•Links which come from sites with identical C class. FACT OR FICTION?
In general there is some truth to this one; however, it’s not a simple
case of black and white. It’s certainly a useful signal for Google to
see what’s connected during a manual inspection, particularly once they
have smelt a rat, but it is most likely treated as a SOFT signal in the
SERPs. They would most likely limit the juice being passed from one IP
to another, and these links should certainly be avoided, since they
would help Google find networks. That said, there’s another problem I’ve
seen here: I have seen historical data being used in some link audit
reports whereby the hosting was changed years ago, so the IP data being
supplied is historic and hence misleading. It’s a good signal ONLY if
the data is reliable and fresh!
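For what a C-class check actually looks like, here is a minimal sketch (names and data shape are my own, for illustration): it buckets linking domains by the first three octets of their IPv4 address, the so-called C class, and surfaces buckets containing more than one linking domain. As noted above, this is only meaningful if the IP data is current rather than historical.

```python
from collections import defaultdict

def group_by_c_class(link_ips):
    """link_ips: mapping of linking domain -> current IPv4 address.

    Buckets domains by their /24 ("C class") prefix and returns only
    the buckets holding more than one domain - a possible hint of a
    hosted link network, and only a SOFT signal at that."""
    buckets = defaultdict(set)
    for domain, ip in link_ips.items():
        c_class = ".".join(ip.split(".")[:3])
        buckets[c_class].add(domain)
    return {c: sorted(domains) for c, domains in buckets.items()
            if len(domains) > 1}
```

A shared C class on its own proves nothing (plenty of innocent sites share cheap hosting), which is why this belongs alongside a manual inspection rather than an automatic removal list.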
•Links from pages with a large number of external links. ANOTHER FALSE ASSUMPTION?
I’d tend to agree that if a page has lots of outbound links it’s most
likely of low value, though in some instances pages do carry value while
having lots of outbound links: typically where all the links are natural
and relevant to each other, and where the anchors are not money terms,
you could safely assume the page is worthy. So crudely treating all
pages as bad based on the number of outbound links alone would
undoubtedly raise false positives, and there is no evidence to suggest
Google is that crude.
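The point above, that a raw outbound-link count must be combined with another signal before it means anything, can be sketched as follows (the function, threshold and money-term list are assumptions for illustration): a page is flagged only when a high external-link count co-occurs with money-term anchors, and either signal alone is left inconclusive.

```python
def score_page(links, money_terms, max_outbound=100):
    """links: list of (href, anchor_text) pairs for a page's external links.
    money_terms: set of lower-cased commercial anchor phrases.

    Flags a page only when a high outbound count co-occurs with
    money-term anchors; either signal on its own is inconclusive,
    per the false-positive argument above."""
    money_hits = sum(1 for _href, anchor in links
                     if anchor.lower() in money_terms)
    if len(links) > max_outbound and money_hits:
        return "suspicious"
    return "inconclusive"
```

A curated links page with 200 relevant, naturally anchored links would come back inconclusive here, which is exactly the behaviour a crude count-only filter gets wrong.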
So where does that leave us? Pretty much in a mess, based on some of
these very unsound signals. Raising flags which are unsafe results in
huge numbers of ‘possibly’ suspicious links which then have to be
reviewed again manually, which totally defeats the whole purpose of
having a link audit in the first place!