Bill Slawski covers 20 different ways search engines may rerank search results. It’s certainly worth a read.
Here’s a summary of the main points:
- Filtering of duplicate or near-duplicate content (a minimal sketch of this idea appears after the list)
- Removing multiple relevant pages from the same site
- Based upon personal interests
- Reranking based upon local inter-connectivity
- Sorting for country specific results
- Sorting for language specific results
- Looking at population or audience segmentation information
- Reranking based upon historical data
- Reordering based upon topic familiarity
- Changing orders based upon commercial intent
- Reranking and removing results based upon mobile device friendliness
- Reranking based upon accessibility
- Reranking based upon editorial content
- Reranking based upon additional terms (boosting) and comparing text similarity
- Reordering based upon implicit feedback from user activities and click-throughs
- Reranking based upon community endorsement
- Reranking based upon information redundancy
- Reranking based upon storylines
- Reranking by modeling how topics spread through blogs, news, and web pages like an infectious disease
- Reranking based upon conceptually related information including time-based and use-based factors
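To make the first item on the list a little more concrete, here is a minimal, hypothetical Python sketch of near-duplicate filtering using word shingles and Jaccard similarity. The shingle size, the 0.8 threshold, and the example snippets are illustrative assumptions, not details from Slawski’s post or from any search engine.

```python
# Sketch: drop lower-ranked results whose text looks too similar to a
# result already kept. Shingle size and threshold are illustrative only.

def shingles(text, size=3):
    """Return the set of overlapping word n-grams (shingles) in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def filter_near_duplicates(results, threshold=0.8):
    """`results` is a list of (url, snippet_text) pairs in ranked order.

    Keep each result only if it is not too similar to one already kept.
    """
    kept = []
    kept_shingles = []
    for url, text in results:
        sh = shingles(text)
        if all(jaccard(sh, prev) < threshold for prev in kept_shingles):
            kept.append((url, text))
            kept_shingles.append(sh)
    return kept

if __name__ == "__main__":
    ranked = [
        ("a.example/page", "cheap flights to paris book now and save"),
        ("b.example/page", "cheap flights to paris book now and save money"),
        ("c.example/page", "weather forecast for paris this weekend"),
    ]
    for url, _ in filter_near_duplicates(ranked):
        print(url)  # b.example/page is dropped as a near duplicate of a.example/page
```

Real systems use far more scalable techniques (MinHash or SimHash fingerprints, for example), but the filtering decision is the same: suppress lower-ranked results that look too much like something already shown.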
There’s been talk lately in various forums that Google in particular pushes sites with minor infractions down to the fourth page of search results. Since a drop of roughly 30 positions lands a former top-10 result on page four, the phenomenon is often called the -30 penalty, and click-through statistics show that results on the fourth page and beyond get little to no traffic. No one knows for sure whether the penalty is permanent (and if not, how long it lasts) or how serious an infraction has to be to trigger it. I personally haven’t seen any downward movement in my clients’ sites. Those whose sites suddenly drop from the top 10 to the fourth page would do well to investigate possible problems such as links from bad neighborhoods, links that look purchased, or a lack of unique content.