User requests are ambiguous in most cases. To help with this, Google specialists created the RankBrain algorithm. It can interpret queries it has not previously encountered by comparing them with already seen queries and their search results. About 15% of Google queries have never been seen before.
RankBrain is based on vectors that only search engine robots can interpret. According to search quality specialist Greg Corrado, RankBrain is the third most important Google ranking signal. In addition to this algorithm, there are about 10,000 signals and as many sub-signals that also affect ranking.
The principle of the RankBrain algorithm
Tasks that the algorithm solves:
- Understanding user requests.
Previously, search results involved a fair amount of guesswork on the search engine’s part.
For example, the request “gray console produced by Sony.”
While trying to match page content to the user’s request, Google often could not understand the person’s intent and what s/he wanted to find. Any mention of these words in the text, title, or description of a page ended up in a muddled results page. Most likely, the user would not have found the answer to his or her question.
What has changed?
Now, the machine-learning algorithm comes to the rescue. Google runs a search cycle over related phrases.
By comparing the results of these related queries, Google can generate a relevant results page.
For example, people once searched for a “gray console designed by Nintendo.” The results for this request were all about game consoles; therefore, for the request “gray console made by Sony,” the output will be similar, about game consoles, in this case, the PlayStation.
In its blog, Google published a description of an exciting technology that most likely resembles the one RankBrain uses. This technology is called “Word2vec.” It uses distributed representations of text to capture similarities between concepts.
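The idea behind distributed representations can be illustrated with a toy sketch: words become vectors, and similar concepts end up with vectors pointing in similar directions. The vectors below are hand-made for illustration only; real Word2vec embeddings are learned from large text corpora and have hundreds of dimensions.

```python
from math import sqrt

# Toy 4-dimensional "embeddings" (invented for illustration; real
# Word2vec vectors are learned from text, not written by hand).
vectors = {
    "playstation": [0.9, 0.8, 0.1, 0.0],
    "nintendo":    [0.8, 0.9, 0.2, 0.1],
    "toaster":     [0.0, 0.1, 0.9, 0.8],
}

def cosine(a, b):
    """Cosine similarity: close to 1.0 for similar directions, near 0.0 for unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

print(cosine(vectors["playstation"], vectors["nintendo"]))  # high: similar concepts
print(cosine(vectors["playstation"], vectors["toaster"]))   # low: unrelated concepts
```

This is how “gray console by Nintendo” and “gray console by Sony” can be recognized as near neighbors even though the exact query was never seen before.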
In addition to query matching, the RankBrain algorithm focuses on:
- CTR, the click-through rate.
- Dwell Time, the length of a click (the time between clicking a link in the search results and returning to the same results page).
- Bounce Rate, the share of single-page visits.
- Pogo-sticking, returning to the SERP and clicking a different search result.
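A minimal sketch of how these engagement metrics could be computed from click logs; the session records and the 10-second pogo-stick threshold below are illustrative assumptions, not Google’s actual data or rules.

```python
# Toy search-session records (invented for illustration).
sessions = [
    {"clicked": True,  "dwell_sec": 190, "returned_to_serp": False},
    {"clicked": True,  "dwell_sec": 8,   "returned_to_serp": True},   # quick return
    {"clicked": False, "dwell_sec": 0,   "returned_to_serp": False},  # no click
    {"clicked": True,  "dwell_sec": 45,  "returned_to_serp": True},
]

clicks = [s for s in sessions if s["clicked"]]
ctr = len(clicks) / len(sessions)                         # click-through rate
avg_dwell = sum(s["dwell_sec"] for s in clicks) / len(clicks)
# Treat a very quick return to the SERP (<10 s here) as a pogo-stick.
pogo = sum(1 for s in clicks if s["returned_to_serp"] and s["dwell_sec"] < 10)

print(f"CTR: {ctr:.0%}, avg dwell: {avg_dwell:.0f}s, pogo-sticks: {pogo}")
```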
The average viewing time for sites in the TOP-10 is 3 minutes 10 seconds (research published by Searchmetrics confirms this). This suggests that page-view length matters as one of Google’s ranking signals.
The click-through rate of organic results can drop because of SERP features such as:
- A block with a featured video;
For these reasons, organic CTR in US mobile search has decreased by 41.4% since 2015.
Your snippet should shout, “click me.”
How to optimize the title and description for CTR?
- Add words to the title that trigger emotions (effective, robust, proven). This is not suitable for all niches, but where appropriate, it is worth trying.
- Include brackets in the title. A study of 3.3 million headlines found that titles with brackets had a 33% higher CTR than those without. You can enclose numbers or individual phrases in brackets, for example (2018), [Examples], (Tips).
- Use numbers. They stand out well among words.
- Also, work with emotions.
- Sell your page’s content in the description. The user must know why s/he should go to your site. Include part of the full answer in it.
- And of course, do not forget about keywords.
Ranking in featured snippets
According to SEMrush statistics, 11.3% of search results include featured snippets.
To get into featured snippets, you need to find out:
- Which of your competitors’ queries get into featured snippets.
- Which of your queries should get there.
In most cases, featured snippets pull in small pieces of text, 28-30 words on average and 84 at most. These are answers to long-tail, local, or question queries. If you place short, definition-style texts of about 30 words at the beginning of the page, the probability of the page landing in a featured snippet increases.
Such text usually appears as a paragraph, though lists and tables show up as well. In about 82% of cases, the version in the featured snippet is a paragraph of text.
Use the H2 and H3 headers correctly and add a long, natural query to a header where appropriate; Google will pull it into a snippet.
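The word-count guidance above can be turned into a quick self-check. This is a rough sketch; the 30-word target and 84-word maximum are taken from the figures in this article, not from any Google specification.

```python
# Check whether a candidate opening paragraph fits the typical
# featured-snippet length (about 28-30 words, up to ~84).
def snippet_friendly(paragraph, maximum=84):
    """Return (fits, word_count) for a candidate snippet paragraph."""
    words = len(paragraph.split())
    return words <= maximum, words

text = ("RankBrain is a machine-learning ranking algorithm that helps "
        "Google interpret queries it has never seen before by comparing "
        "them with similar, previously seen queries and their results.")

ok, n = snippet_friendly(text)
print(ok, n)
```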
Historically, when Google analyzed an HTML page, it counted how many times a keyword appeared in the text.
At each page visit, the robot checked whether the keyword appeared in the title, description, URL, alt attributes, or H1.
Google still pays attention to this, but not as much as before.
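The old-style keyword check described above can be sketched in a few lines. The HTML snippet and the keyword are made-up examples, and real crawlers use full HTML parsers rather than regular expressions; this is only an illustration of the counting idea.

```python
import re

# Toy page (invented for illustration).
html = """
<html><head>
<title>Gray Console by Sony: PlayStation Review</title>
<meta name="description" content="Everything about the gray Sony console.">
</head><body>
<h1>The Gray Sony Console</h1>
<p>The console ships in gray and black. The gray console is popular.</p>
</body></html>
"""

keyword = "gray console"

def count_in(tag_pattern, text, kw):
    """Count case-insensitive keyword hits inside the first match of a tag."""
    m = re.search(tag_pattern, text, re.IGNORECASE | re.DOTALL)
    return m.group(1).lower().count(kw) if m else 0

title_hits = count_in(r"<title>(.*?)</title>", html, keyword)
h1_hits = count_in(r"<h1>(.*?)</h1>", html, keyword)
body_hits = html.lower().count(keyword)

print(title_hits, h1_hits, body_hits)
```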
Now, Google’s main aim is to understand the context in which the keywords appear.
The primary goal of Google is to provide the user with the best result, and in most cases, that is not text stuffed with keywords.
According to American studies, the best content is longer than 2,000 words. It must:
- Elaborate on a topic well;
- Be authoritative;
- Be shared socially.
Expand texts with LSI keywords. These are words and phrases closely related to the page’s topic that complement and expand the article. Take, for example, an article about the Paleo diet.
Its LSI keywords might include:
- Food portions;
- Weight loss;
If the text contains many LSI words rather than a pile of repeated keywords, the search engine will see the article as one that elaborates on its topic thoroughly.
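One crude way to picture this idea: score an article by how many distinct related phrases it mentions versus how often it repeats the main keyword. The phrase list below is an illustrative assumption for a Paleo-diet article, not an output of any real LSI tool.

```python
# Illustrative related-phrase list for a Paleo-diet article (assumed).
lsi_phrases = ["food portions", "weight loss", "protein", "whole foods"]
main_keyword = "paleo diet"

article = ("The paleo diet emphasizes whole foods and sensible food portions. "
           "Many people report weight loss and higher protein intake.").lower()

# Topical coverage: share of related phrases the article mentions at least once.
coverage = sum(1 for p in lsi_phrases if p in article) / len(lsi_phrases)
# Raw keyword repetition, which alone says little about thoroughness.
keyword_repeats = article.count(main_keyword)

print(f"LSI coverage: {coverage:.0%}, keyword repeats: {keyword_repeats}")
```

High coverage with few keyword repeats is the pattern the paragraph above describes: breadth of related vocabulary rather than keyword stuffing.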
YouTube is the second largest search engine in the world, twice as popular as Bing. By 2021, video will account for 80% of internet traffic. The time people spend on YouTube has grown by 60% compared to last year. Many experts are too lazy to work with video, but YouTube is too big a platform to ignore. Nearly 55% of Google’s search results contain videos, and almost all of them come from YouTube. And since YouTube belongs to Google, I think we can expect even more videos in 2019. Google has already begun to mix YouTube videos into image search.
Content for links
Getting backlinks without good, engaging content is impossible. Google itself has stated that links and content are the #1 and #2 factors that influence rankings. A study that analyzed more than a million results showed that links and content will still have a significant impact on Google positions in 2019.
- Consider your site’s authority, expertise, and trust. On August 1, 2018, Google updated its algorithm so that these three indicators play a significant role in assessing the quality of an Internet resource. Follow-up surveys identified the niches most affected by the change.
- On July 20, Google updated its recommendations for quality raters. It also offered new guidance on evaluating E-A-T (the expertise, authoritativeness, and trustworthiness of material posted on websites of various topics). These indicators had already served as guiding principles for many years. Thus, Google wants to check everything about the person who manages the site and how much of its information can be trusted.
- Publish unique content.
- Encourage people to post comments on the blog. Google wants to see active communication on your pages, even more than activity on social networks.
- Solve problems with “dead” pages. When your site has many pages, some of them begin to lose traffic after a while due to incomplete content, poor quality, etc. Combine several such pages/articles into one large piece in which they complement each other, and the “dead” traffic will begin to revive. The result will be better than the separate pieces were before.
- Offer the opportunity to write guest posts. If you have connections with reputable authors in your niche who publish their work in magazines or even just on their social network pages, arrange to get backlinks.
About the author: Melisa Marzett is a freelance writer who works for Professional essay writers – penessays.com and enjoys simple things in life such as walking down the street, nature, and the random people she happens to meet on her way, observing their moods. She is not much of a talker but an observer who is curious, attentive, and a good listener.