As you may be aware, Google now uses machine learning to analyze user intent and dynamically surface the best results for each individual search query. This means that not even Google's own engineers can always explain why one website ranks above another. Ranking has always been something of a mystery to the general public, but it is becoming more mysterious even to the most seasoned SEO experts.
For example, there used to be roughly 200 factors that determined how a website ranked for a specific keyword. In 2018, the weight of each factor shifts in real time depending on the keyword, user intent, bounce rates, and other signals customized to the individual search.
Why Should You Care?
More complexity means a huge opportunity and a competitive edge for those able to stay ahead of the curve. That translates into better-than-average rankings, more leads, and ultimately more market share and top-line revenue than your competition.
For established practitioners, machine learning is good news: it is now much harder to figure out why something ranks well, and the barriers to entry in SEO are rising sharply.
It also creates a challenge, however, because the added complexity demands more time, more analysis, and a much deeper understanding of the art and science that goes into a world-class SEO service.
At Gustin Quon, we have been investing in machine learning programs and statistical analysis capability, allowing us to efficiently analyze our clients' websites, their competition, and the exact on-site and off-site modifications required to reach top positions for the desired keywords and cities.
In short, we collect a large volume of data from Google, websites, and other third-party metric providers. We then compute correlation coefficients (Pearson's and Spearman's) on the data to identify which ranking factors correlate significantly with Google position, by how much, and where our clients' deficits lie.
Here’s a brief overview of how we now go about a typical SEO campaign:
- Initial data: target URL, geography, country
- Complete scrape of Google’s first 10 pages
- Complete scrape of all 100 ranking websites in the top 10 pages
- On-site data and content
- SEMRush – rankings, LSI keywords, organic and paid traffic
- AHREFS data
- Majestic SEO data
- Analysis of page 1 and top 3 ranking pages/domains compared to those in lesser positions
- Correlations between each ranking factor and position, focusing on where the distributions show a strong, medium, inverse, or weak/no correlation.
- Implementation of findings and changes required. Test and tweak over the life of the campaign.
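The correlation-banding step in the list above can be sketched as a simple classifier. The 0.5 and 0.3 cutoffs below are common rules of thumb for illustration only, not Gustin Quon's actual thresholds:

```python
# Illustrative sketch: bucket a correlation coefficient into the
# strong / medium / weak bands mentioned above. Cutoffs are assumed.
def classify_correlation(r: float) -> str:
    direction = "inverse" if r < 0 else "positive"
    strength = abs(r)
    if strength >= 0.5:
        return f"strong {direction}"
    if strength >= 0.3:
        return f"medium {direction}"
    return "weak/none"

print(classify_correlation(-0.72))  # → strong inverse
print(classify_correlation(0.35))   # → medium positive
print(classify_correlation(0.10))   # → weak/none
```

Factors landing in the strong bands become the priorities for the implementation step; weak/none factors are deprioritized rather than tweaked by guesswork.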
By using the available data to formulate the exact changes required to rank, we take the guesswork out of the process, allowing us to rank websites faster and more effectively than other agencies.
As Google becomes more complex, so do we.
If you're interested in seeing how machine learning can help your website rank, ask your account manager or sales representative for a report. The process is new, but we are rolling it out to all clients in early 2018. This is just the start of the AI future we will soon be living in. As a Gustin Quon client, your bases are already covered.