What is included in a semantic core and how to build it correctly. Do you need to select key phrases, or can you do without them?

Novice webmasters faced with the need to create a semantic core often don't know where to start, although there is nothing complicated about the process. Simply put, you need to collect a list of the key phrases that Internet users type when searching for the kind of information your site offers.

The more complete and accurate this list is, the easier it is for a copywriter to write a good text, and for you to reach high positions for the queries you need. This article covers how to build a large, high-quality semantic core and what to do with it afterwards so that the site reaches the top and gathers plenty of traffic.

The semantic core is a set of key phrases grouped by meaning, where each group reflects one need or desire of the user (an intent) - that is, what a person has in mind when they type their query into the search bar.

The whole process of creating a core can be represented in 4 steps:

  1. A person faces a task or problem;
  2. They formulate in their head how to find a solution through search;
  3. They type the query into Yandex or Google, and many other people do the same;
  4. The most frequent query variants end up in analytics services and become the key phrases that we collect and group by need. The result of all these manipulations is the semantic core.

Is it necessary to select key phrases or can you do without it?

Previously, semantics was compiled to find the most frequent keywords on a topic, insert them into the text, and get good search visibility for them. For the last five years, search engines have been moving toward a model where the relevance of a document to a query is assessed not by the number of keywords and the variety of their forms in the text, but by how well the text covers the intent.

Google started this in 2013 with the Hummingbird algorithm; Yandex followed in 2016 and 2017 with its Palekh and Korolev technologies, respectively.

Texts written without a semantic core cannot cover a topic fully, which means competing with the top for high- and medium-frequency queries will not work. Betting on low-frequency queries alone makes no sense either - they bring too little traffic.

If you want to successfully promote yourself or your product on the Internet, you need to learn how to compose correct semantics that fully reflects users' needs.

Search query classification

Let's analyze three parameters by which keywords are classified.

By frequency:

  • High-frequency (HF) - phrases that define the topic, usually 1-2 words long. On average they start at 1,000-3,000 queries per month and, depending on the topic, can reach hundreds of thousands of impressions. Site home pages are most often optimized for them.
  • Medium-frequency (MF) - separate directions within the topic, mostly 2-3 words long, with an exact frequency of 500 to 1,000. Typically these are commercial site categories or topics for large informational articles.
  • Low-frequency (LF) - queries that look for a specific answer to a question, usually 3-4 words long. These can be product cards or article topics, searched 50 to 500 times per month on average.
  • When analyzing metrics or statistics-counter data you may find one more type - micro-LF keys: phrases that are often asked just once. There is no point optimizing a page for them; it is enough to rank in the top for the LF queries that contain them.
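Where exactly these boundaries lie varies by niche, but the banding itself is easy to make concrete. A minimal sketch in Python, using the rough monthly figures above as illustrative thresholds only:

```python
def frequency_band(exact_monthly: int) -> str:
    """Classify a key phrase by its exact monthly frequency.

    Thresholds follow the rough figures quoted above and are
    illustrative; real bands depend on the topic.
    """
    if exact_monthly >= 1000:
        return "HF"        # defines the topic; usually 1-2 words
    if exact_monthly >= 500:
        return "MF"        # a direction within the topic
    if exact_monthly >= 50:
        return "LF"        # a specific question; 3-4 words
    return "micro-LF"      # asked a handful of times at most

print(frequency_band(2500))  # HF
```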



By competitiveness:

  • Highly competitive (HC);
  • Medium competition (MC);
  • Low competition (LC).

By user need (intent):

  • Navigational - express the user's desire to find a specific Internet resource or information on it;
  • Informational - characterized by the need for information as the answer to the query;
  • Transactional - directly related to the desire to make a purchase;
  • Fuzzy (general) - those for which it is difficult to determine the intent accurately;
  • Geo-dependent and geo-independent - reflect the need to find information or complete a transaction in a particular city, or without any regional reference.


Depending on the type of site, the following recommendations apply when selecting key phrases for the semantic core.

  1. Informational resource. The main focus is finding article topics in the form of MF and LF queries with low competition. Open each topic widely and deeply, optimizing the page for a large number of LF keys.
  2. Online store or commercial site. Collect HF, MF and LF queries, segmenting as precisely as possible so that all phrases in a cluster are transactional. Focus on finding well-converting, low-competition LF keywords.

How to correctly compose a large semantic core - step by step instructions

We have reached the main part of the article, where I will walk through, one by one, the main stages of building the core of a future site.
To make the process clearer, all steps are shown with examples.

Search for basic phrases

Working with the SEO core begins with the selection of a primary list of basic words and phrases (HF) that best characterize the subject matter and are used in a broad sense. They are also called markers.

These can be the names of directions, types of products, or popular queries from the topic. As a rule, they consist of 1-2 words and get tens, sometimes hundreds of thousands, of impressions per month. It is better not to take overly broad keys, so as not to drown in negative keywords at the expansion stage.

The most convenient way to select marker phrases is Yandex Wordstat. Typing a query into it, we see in the left column the phrases that contain it, and in the right column similar queries, among which you can often find suitable topics for expansion. The service also shows the base frequency of a phrase, that is, how many times per month it was asked in all word forms and with any words added to it.

By itself, this frequency is of little interest, so to get more accurate values you need to use operators. Let's look at what they are and what they are for.

Operators Yandex Wordstat:

1) "..." - quotes. A query in quotation marks allows you to track how many times a phrase was searched in Yandex with all its word forms, but without adding other words (tails).

2) ! (exclamation mark). Placed before each word in the query, it fixes the word form and gives the number of impressions for the key phrase in exactly the specified word forms, but with tails allowed.

3) "! ...! ...! ..." - quotes and an exclamation mark before each word. The most important operator for the optimizer. It allows you to understand how many times a keyword is requested per month strictly according to a given phrase, as it is written, without adding any words.

4) + (plus). Yandex Wordstat ignores prepositions and pronouns in queries. If you need it to count them, put a plus sign in front of them.

5) - (minus). The second most important operator. It quickly weeds out unsuitable words: after the analyzed phrase, put a minus and the stop word. If there are several, repeat the procedure.

6) (...|...). If you need Yandex Wordstat data for several phrases at once, enclose them in parentheses and separate them with a vertical bar. In practice this method is rarely used.
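If you prepare queries in bulk, wrapping phrases in these operators is easy to automate. A small sketch (pure string manipulation; it does not call any Wordstat API):

```python
def with_quotes(phrase: str) -> str:
    """ "phrase" - all word forms allowed, but no added words (no tails)."""
    return f'"{phrase}"'

def with_fixed_forms(phrase: str) -> str:
    """!word !word - exact word forms, tails still allowed."""
    return " ".join("!" + word for word in phrase.split())

def exact_match(phrase: str) -> str:
    """ "!word !word" - exact phrase as written, no tails."""
    return with_quotes(with_fixed_forms(phrase))

print(exact_match("semantic core"))  # "!semantic !core"
```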

For convenient work with the service, I recommend installing the special browser extension Wordstat Assistant. It works in Mozilla Firefox, Google Chrome, and Yandex Browser and lets you copy phrases and their frequencies with one click on the "+" or "Add all" icon.


Let's say we decided to start our own SEO blog. Let's choose the basic phrases for it:

  • semantic core;
  • optimization;
  • copywriting;
  • promotion;
  • monetization;
  • Direct

Search for synonyms

When formulating a query to search engines, users can use words that are similar in meaning, but different in spelling.

For example, "car" and "car".

It is important to find as many synonyms for the main words as possible in order to increase the coverage of the future semantic core. If this is not done, then when parsing, we will miss a whole layer of key phrases that reveal user needs.

What we use:

  • Brainstorming;
  • The right column of Yandex Wordstat;
  • Queries typed in the Latin or Cyrillic alphabet (transliterated variants);
  • Special terms, abbreviations, and slang from the subject area;
  • The related-query blocks in Yandex and Google search results;
  • Competitors' snippets.

As a result of all these actions, we get the following list of phrases for our chosen topic:


Extending basic queries

Let's parse these keywords to identify people's basic needs in this area.
The most convenient way to do this is in the Key Collector program, but if you'd rather not pay 1,800 rubles for a license, use its free analogue, SlovoEB.

Its functionality is certainly weaker, but it is fine for small projects.
If you don't want to dig into desktop programs, you can use the Just-Magic and Rush Analytics services. Still, it's better to spend a little time and master the software.

I will show how it works in Key Collector, but if you work with SlovoEB everything will be clear too - the interfaces are similar.

Procedure:

1) Add the list of basic phrases to the program and collect the base and exact frequency for them. If you plan to promote in a specific region, specify the regionality. For informational sites this is usually unnecessary.


2) Parse the left column of Yandex Wordstat for the added words to get all queries from our topic.


3) The output is 3,374 phrases. Let's collect their exact frequency, as in step 1.


4) Check whether the list contains keys with zero base frequency.


If there are, delete them and go to the next step.

Negative words

Many people neglect collecting negative keywords, replacing it with manually removing unsuitable phrases. But once you try it, you will find it convenient and a real time-saver.

Open the Data -> Analysis tab in Key Collector. Select grouping by individual words and scroll through the list of keys. When you see an unsuitable phrase, click the blue icon and add the word, with all its word forms, to the stop words.


In SlovoEB, work with stop words is implemented in a more simplified form, but you can still create your own list of unsuitable phrases and apply it to the list.

Don't forget to use sorting by base frequency and by the number of phrases. This helps to quickly trim the list of original phrases or filter out rare ones.


After compiling the list of stop words, apply it to the project and proceed to collecting search suggestions.

Parsing search suggestions

When you type a query into Yandex or Google, the search engine offers continuations drawn from the most popular phrases users enter. These keywords are called search suggestions.

Many of them never appear in Wordstat, so when building a semantic core it is essential to collect such queries too.

By default, Key Collector parses them by iterating over endings, the Cyrillic and Latin alphabets, and a space after each phrase. If you are ready to sacrifice quantity to significantly speed up the process, tick the box "Collect only top suggestions without brute force and without a space after the phrase".
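The brute force itself is simple to picture: the tool appends a space and every letter of both alphabets to the seed phrase and requests suggestions for each variant. A hypothetical sketch of generating that probe list (the actual fetching through the engines' suggest endpoints is not shown):

```python
import string

def suggest_probes(seed: str) -> list[str]:
    """Build the probe queries that suggest parsers iterate over:
    the bare phrase, the phrase with a trailing space, and the
    phrase followed by each Latin and Cyrillic letter."""
    cyrillic = [chr(code) for code in range(ord("а"), ord("я") + 1)]
    probes = [seed, seed + " "]
    probes += [f"{seed} {letter}" for letter in string.ascii_lowercase]
    probes += [f"{seed} {letter}" for letter in cyrillic]
    return probes

print(len(suggest_probes("semantic core")))  # 60 probe queries
```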


Among search suggestions you can often find phrases with good frequency and competition ten times lower than in Wordstat, so in narrow niches I recommend collecting as many words as possible.

The parsing time directly depends on the number of simultaneous requests to the search engines' servers. Key Collector supports up to 50 threads, but to parse in this mode you will need the same number of proxies and Yandex accounts.

For our project, collecting suggestions produced 29,595 unique phrases. The whole process took a little over 2 hours on 10 threads; with 50 threads it would fit within 25 minutes.


Determining the base and exact frequency for all phrases

For further work it is important to determine the base and exact frequency and weed out all zeros. Keep queries with a low number of impressions if they are targeted:
they will help you understand the topic better and build a more complete article structure than what is currently in the top.

Before collecting frequencies, first filter out everything unnecessary:

  • repetitions;
  • keys containing special characters;
  • duplicate phrases (via the Implicit Duplicates Analysis tool).


For the remaining phrases, we will determine the exact and basic frequency.

a) for phrases of up to 7 words:

  • Select them with the filter "Phrase consists of no more than 7 words";
  • Open the "Collect from Yandex.Direct" window by clicking the "D" icon;
  • Specify the region if necessary;
  • Choose the guaranteed impressions mode;
  • Set the collection period to 1 month and tick the required frequency types;
  • Click "Get data".


b) for phrases of 8 or more words:

  • Set a filter on the "Phrase" column: "consists of at least 8 words";
  • Specify the region below if you are promoting in a specific city;
  • Click the magnifying glass and select "Collect all frequency types".
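The word-count split matters because only phrases of up to 7 words can go through the bulk Yandex.Direct collection; longer ones need the slower path. A small sketch of the split, assuming the keys sit in a plain list:

```python
def split_for_collection(phrases: list[str], limit: int = 7):
    """Separate phrases eligible for bulk frequency collection via
    Yandex.Direct (up to `limit` words) from the longer ones that
    need the slower per-phrase Wordstat check."""
    direct_ok = [p for p in phrases if len(p.split()) <= limit]
    needs_wordstat = [p for p in phrases if len(p.split()) > limit]
    return direct_ok, needs_wordstat

short, long_tail = split_for_collection(
    ["buy bike", "how to choose a bike for a tall adult beginner"])
```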


Cleaning keywords from garbage

After receiving the impression counts for our keys, we can start filtering out the unsuitable ones.

Let's consider the order of actions in steps:

1. Go to the "Group Analysis" Key Collector and sort the keys by the number of words used. The task is to find non-target and frequent ones and add them to the list of stop words.
We do everything in the same way as in the "Minus words" paragraph.


2. Apply all found stop words to the phrase list and look through it to make sure no target queries are lost. After checking, click "Delete marked phrases".


3. Filter out dummy phrases - those rarely used in exact match but with a high base frequency. To do this, in Key Collector's settings, under the "KEY & SERP" item, insert the calculation formula KEY1 = YandexWordstatBaseFreq / YandexWordstatQuotePointFreq and save the changes.


4. Calculate KEY1 and delete the phrases for which this parameter is 100 or more.
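The same filter is easy to reproduce outside the program if your phrases and frequencies live in a spreadsheet export. A minimal sketch, assuming a mapping of phrase to (base, exact) frequency:

```python
def drop_dummy_phrases(freqs: dict[str, tuple[int, int]],
                       max_ratio: float = 100.0):
    """Keep phrases whose KEY1 = base/exact ratio is below max_ratio.

    freqs maps phrase -> (base_freq, exact_freq); a zero exact
    frequency marks the phrase as a dummy outright.
    """
    return {
        phrase: (base, exact)
        for phrase, (base, exact) in freqs.items()
        if exact > 0 and base / exact < max_ratio
    }

sample = {"buy a bike": (5400, 320), "bike bike cheap": (14000, 2)}
print(drop_dummy_phrases(sample))  # the second phrase (ratio 7000) is dropped
```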


The remaining keys need to be grouped by landing page.

Clustering

The distribution of queries into groups begins with clustering phrases by the top search results, using the free Majento clusterer. I recommend KeyAssort, a paid analogue with wider functionality and faster operation, but the free tool is quite enough for a small core. The only caveat: to work in either of them you will need to buy XML limits. The average price is 5 rubles per 1,000 requests, so processing an average core of 20-30 thousand keys will cost 100-150 rubles. The address of the service used is shown in the screenshot below.


The essence of clustering keys by this method is to group phrases whose Yandex top-10 results share common URLs:

  • with every other phrase in the group (Hard);
  • with the most frequent query in the group (Soft).

Depending on how many such matches are required across different sites, clustering thresholds of 2, 3, 4 ... 10 are distinguished.

The advantage of this method is that it groups phrases according to people's needs, not just by synonyms. This lets you immediately see which keywords can be used on the same landing page.

For informational sites, suitable options are:

  • Soft with a threshold of 3-4, followed by manual cleaning;
  • Hard with a threshold of 3, followed by merging clusters by meaning.

Online stores and commercial sites are usually promoted with Hard clustering at a threshold of 3. The topic is extensive, so I will cover it later in a separate article.
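To make the Hard logic tangible, here is a minimal greedy sketch, assuming you have already fetched the top-10 URL set for every phrase (for instance, through the XML limits mentioned above):

```python
def hard_clusters(serps: dict[str, set[str]],
                  threshold: int = 3) -> list[list[str]]:
    """Greedy Hard clustering: a phrase joins a cluster only if its
    top-10 shares at least `threshold` URLs with the top-10 of every
    phrase already in that cluster; otherwise it starts a new one."""
    clusters: list[list[str]] = []
    for phrase, urls in serps.items():
        for cluster in clusters:
            if all(len(urls & serps[member]) >= threshold for member in cluster):
                cluster.append(phrase)
                break
        else:
            clusters.append([phrase])
    return clusters
```

Soft mode would differ only in comparing each phrase against the most frequent phrase of the cluster instead of against every member.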

For our project, after grouping by the Hard method, 317 groups were obtained.


Competition check

There is no point in promoting highly competitive queries: it is difficult to get into the top, and without the top there is no traffic for the article. To understand which topics are worth writing about, we use the following method:

We look at the exact frequency of the group of phrases the article targets and at the competition score in Mutagen. For informational sites, I recommend taking on topics with a total exact frequency of 300 or more and a Mutagen competition score from 1 to 12 inclusive.

In commercial topics, focus on the margin of the product or service and on what competitors in the top 10 are doing. Even 5-10 targeted requests per month can justify a separate page.

How to check query competition:

a) manually, by typing the phrase into the service itself or via bulk tasks;


b) in batch mode through the Key Collector program.


Topic selection and grouping

Let's go through each group obtained after clustering and select topics for the site.
Majento, unlike KeyAssort, does not let you download the number of impressions per phrase, so they will have to be collected again through Key Collector.

Instructions:

1) Export all groups from Majento in CSV format;
2) Concatenate phrases in Excel using the "group:key" mask;
3) Load the resulting list into Key Collector. In the settings, the "Group:Key" import mode must be enabled, without monitoring the presence of phrases in other groups;


4) Collect the base and exact frequency for the keywords in the newly created groups (if you use KeyAssort this is unnecessary - the program can work with additional columns);
5) Look for clusters with a unique intent that contain at least 3 phrases and a total of more than 300 impressions across all queries. Then check the 3-4 most frequent ones for Mutagen competition; if among them there are keys with competition below 12, take the topic on;

6) Look through the remaining groups. If phrases are close in meaning and can be covered on one page, merge them. For groups containing new meanings, estimate the total frequency of phrases: if it is less than 150 per month, set the group aside until you have gone through the entire core. It may later be possible to merge it with another cluster and reach 300 exact impressions - the minimum at which an article is worth taking on. To speed up manual grouping, use the auxiliary tools: the quick filter and the frequency dictionary. They help you quickly find suitable phrases in other clusters;


Attention! How do you know whether clusters can be merged? Take 2 frequent keys selected for the landing page in step 5 and 1 query from the new group.
Add them to Arsenkin's "Unload Top 10" tool, specifying the region if necessary. Then look at the number of color-coded intersections between the third phrase and the other two. Merge the groups if there are 3 or more. If there are none, or only one, the intents are different and the groups cannot be merged; with 2 intersections, review the search results by hand and use logic.
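Under one reading of this rule, the check can also be scripted if you pull the top-10 results yourself; the 3-or-more intersection threshold below is taken from the paragraph above:

```python
def can_merge(page_serps: list[set[str]], candidate_serp: set[str],
              threshold: int = 3) -> bool:
    """page_serps: top-10 URL sets of the two frequent keys already
    chosen for the landing page; candidate_serp: top-10 URL set of a
    phrase from the group being considered for merging."""
    return all(len(candidate_serp & serp) >= threshold for serp in page_serps)
```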

7) After grouping the keys, we get a list of promising article topics and the semantics for them.


Removing queries for other content types

When compiling a semantic core, it is important to understand that blogs and content sites do not need commercial queries, just as online stores do not need informational ones.

Go through each group and remove everything unsuitable. If the intent of a query cannot be determined precisely, compare the search results or use the tools:

  • the commercialization check from Pixel Tools (free, but with a daily check limit);
  • the Just-Magic service - clustering with the "check query commerciality" box ticked (paid, cost depends on the tariff).

After that, we move on to the last stage.

Optimizing phrases

We optimize the semantic core so that the SEO specialist and the copywriter find it convenient to work with later. To do this, in each group we keep the key phrases that fully reflect people's needs and contain as many synonyms of the main phrases as possible.

Algorithm of actions:

  • Sort the keywords in Excel or Key Collector alphabetically from A to Z;
  • Choose the ones that reveal the topic from different angles and in different words. Other things being equal, keep the phrases with the higher exact frequency or the lower KEY1 (the ratio of base to exact frequency);
  • Remove keywords with fewer than 7 impressions per month that carry no new meanings and contain no unique synonyms.
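The mechanical part of this cleanup is scriptable too. A sketch, assuming rows of (phrase, base_freq, exact_freq) from a spreadsheet export; judging which phrases add new meanings or unique synonyms still has to be done by hand:

```python
def tidy_group(rows: list[tuple[str, int, int]], min_exact: int = 7):
    """Sort a group's keys alphabetically and drop the rarest ones.

    rows: (phrase, base_freq, exact_freq) tuples. Phrases with fewer
    than min_exact impressions per month are removed; ties between
    similar phrases favor a lower base/exact ratio (KEY1).
    """
    kept = [row for row in rows if row[2] >= min_exact]
    return sorted(kept, key=lambda row: row[0])
```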

An example of what a well-composed semantic core looks like -

The phrases I marked in red do not fit the intent. If you neglect my recommendations on manual grouping and skip the compatibility check, the page will end up optimized for incompatible key phrases, and you can forget about high positions for the promoted queries.

Final checklist

  1. Select the main high-frequency queries that define the topic;
  2. Look for synonyms for them, using the left and right columns of Wordstat, competitors' sites, and their snippets;
  3. Expand the collected queries by parsing the left column of Wordstat;
  4. Prepare a list of stop words and apply it to the resulting phrases;
  5. Parse Yandex and Google search suggestions;
  6. Collect the base and exact frequency;
  7. Expand the list of negative keywords; clean out garbage and dummy queries;
  8. Cluster through Majento or KeyAssort: for informational sites in Soft mode with a threshold of 3-4, for commercial Internet resources with the Hard method at a threshold of 3;
  9. Import the data into Key Collector and determine the competition of 3-4 phrases for each cluster with a unique intent;
  10. Select topics and decide on landing pages for queries, based on the total exact impressions of all phrases in a cluster (from 300 for informational sites) and the Mutagen competition of the most frequent ones (up to 12);
  11. For each suitable page, look for other clusters with similar user needs. If they can be covered on one page, merge them. When the need is unclear, or you suspect a different type of content or page should answer it, check the search results or use Pixel Tools or Just-Magic. For content sites the core should consist of informational queries, for commercial sites of transactional ones; delete the rest;
  12. Sort the keys in each group alphabetically and keep those that describe the topic from different angles and in different words. Other things being equal, give priority to queries with a lower base-to-exact ratio and a higher number of exact impressions per month.

What to do with the SEO core after its creation

You made a list of keys, handed it to the author, and he wrote an excellent article revealing every meaning in full. If only it were that simple. A text will deliver results only if the copywriter clearly understands what you want from him and how he can check his own work.

Let's break down 4 components which, worked through properly, are guaranteed to bring plenty of targeted traffic to the article:

A good structure. Analyze the queries selected for the landing page and identify what needs people have in this topic. Then write an article outline that fully answers them. The goal is for visitors to receive a comprehensive, in-depth answer to the semantics you have compiled; this produces good behavioral signals and high relevance to the intent. Once you have an outline, look at competitors' sites by typing in the main search query you are promoting. Do it in exactly this order: first draft it yourself, then look at what others have and refine where necessary.

Optimization for keys. The article itself is targeted at the 1-2 most frequent keys with Mutagen competition up to 12. Another 2-3 medium-frequency phrases can be used as headings, in diluted form - that is, with additional words not tied to the topic, synonyms, and other word forms. From low-frequency phrases, pull out the unique part - the tail - and embed it evenly in the text. The search engines will find and stitch everything together themselves.

Synonyms for the main queries. Write them out separately from the semantic core and instruct the copywriter to use them evenly throughout the text. This reduces the density of the main words while keeping the text optimized enough to reach the top.

Topic-setting phrases. By themselves, LSI phrases do not promote the page, but their presence suggests that the text was most likely written by an expert, which is a plus for content quality. To find thematic phrases, use the "Terms of Reference for a Copywriter" tool from Pixel Tools.


An alternative method for selecting key phrases using services for competitor analysis

There is a quick approach to building a semantic core that suits both novice and advanced users. The essence of the method is that we initially select keys not for the entire site or a category, but for a specific article or landing page.

It can be implemented in 2 ways, which differ in how we choose page topics and how deeply we expand the key phrases:

  • by parsing primary keys;
  • based on competitor analysis.

Each can be implemented at a simpler or a more advanced level. Let's go through all the options.

Without using programs

A copywriter or webmaster often has no desire to deal with the interfaces of numerous programs, but still needs good topics and key phrases for them.
This method is exactly for beginners and those who don't want the hassle: all actions are performed without additional software, using simple and understandable services.

What you need:

  • The Keys.so service for competitor analysis - 1,500 rubles (15% off with the "altblog" promo code);
  • Mutagen - query competitiveness checks at 30 kopecks each, base and exact frequency collection at 2 kopecks per check;
  • Bukvarix - the free version, or a business account for 995 rubles (currently discounted to 695 rubles).

Option 1. Selecting a topic by parsing basic phrases:

  1. Select the main keys of the topic in the broad sense, using brainstorming and the left and right columns of Yandex Wordstat;
  2. Find synonyms for them using the methods described earlier;
  3. Feed all the collected marker queries into Bukvarix (a paid tariff is required) in the advanced "Search by keyword list" mode;
  4. Set the filter: "!Exact !frequency" from 50, number of words from 3;
  5. Export the entire list to Excel;
  6. Select all the keywords and send them for grouping to the Kulakov Clusterizer service. For a regional site, select the desired city. Leave the clustering threshold at 2 for informational sites; set it to 3 for commercial ones;
  7. After grouping, select article topics by reviewing the resulting clusters. Take those with at least 3 phrases and a unique intent. Analyzing the site URLs in the "Competitors" column (on the right in Kulakov's table) helps to better understand people's needs. Also remember to check Mutagen competition: run 2-3 queries from the cluster, and if all are above 12, the topic is not worth taking;
  8. Once the future landing page's topic is decided, it remains to pick key phrases for it;
  9. From the "Competitors" field, copy 3 URLs of pages of the appropriate type (for an informational site take links to articles; for a commercial one, to shops);
  10. Insert them one by one into Keys.so and unload all their key phrases;
  11. Combine them in Excel and delete duplicates;
  12. Service data alone is not enough, so expand it using Bukvarix again;
  13. Send the resulting list for clustering to the Kulakov Clusterizer;
  14. Select the query groups suitable for the landing page, focusing on the intent;
  15. Collect the base and exact frequency through Mutagen in "Mass tasks" mode;
  16. Export the list with the refined impression data to Excel and remove zeros for both frequency types;
  17. In Excel, add a formula for the ratio of base to exact frequency and keep only the keys for which this ratio is below 100;
  18. Delete queries that call for a different content type;
  19. Keep the phrases that fully, and in different words, reveal the main intent;
  20. Repeat steps 8-19 for the remaining topics.

Option 2. Choosing a topic through competitor analysis:

1. Look for top sites in your topic by entering high-frequency queries and reviewing the results with Arsenkin's "Top-10 Analysis" tool. Finding 1-2 suitable resources is enough. If you are promoting a site in a specific city, specify the region;
2. Go to the Keys.so service, enter the URLs of the sites you found, and see which of the competitors' pages bring the most traffic;
3. Check the competitiveness of 3-5 of their most exact-frequency queries. If it is above 12 for all of them, it is better to look for a less competitive topic;
4. If you need more sites to analyze, open the "Competitors" tab and set the parameters: similarity - 3, thematicity - 10. Sort the data in descending order of traffic;
5. Once you have chosen a topic, type its name into the search results and copy 3 URLs from the top;
6. Then repeat steps 10-19 from option 1.

Using Key Collector or SlovoEB

This method differs from the previous one only in that some operations use the Key Collector program and the keys are expanded more deeply.

What you need:

  • the Key Collector program - 1,800 rubles;
  • all the same services as in the previous method.

"Advanced - 1"

  1. Parse the left and right Yandex columns for the entire list of phrases;
  2. Collect the exact and base frequency through Key Collector;
  3. Calculate the KEY1 indicator;
  4. Delete zero-frequency queries and those with KEY1 > 100;
  5. Then do everything as in steps 18-19 of option 1.

"Advanced - 2"

  1. Do steps 1-5 as in option 2;
  2. Collect keys in Keys.so for each URL;
  3. Remove duplicates in Key Collector;
  4. Repeat steps 1-4 of the "Advanced - 1" method.

Now let's compare the number of keys obtained and their total exact frequency when building the semantic core by the different methods:

As the table shows, the best result came from the alternative method of creating a per-page core, "Advanced 1-2": 34% more target keys, with total traffic across the cluster 51% higher than with the classical method.

The screenshots below show what the finished core looks like in each case. I took phrases with at least 7 exact impressions per month so that keyword quality can be judged. For the full semantics, see the table at the "View" link.

a)


b)


c)

Now you know that the most common approach - the one everyone uses - is not always the most correct, but you shouldn't give up the other methods either. Much depends on the topic itself. For commercial sites, where there are not that many keys, the classic variant is quite enough. On informational sites you can also get excellent results if you properly draw up the copywriter's brief, build a good structure, and do the SEO. We will discuss all of this in detail in the following articles.

3 common mistakes when creating a semantic core

1. Skimming the top when collecting phrases. Parsing Wordstat alone is not enough for a good result!
More than 70% of queries that people enter rarely or only periodically never get there at all, yet among them there are often key phrases with good conversion and genuinely low competition. How not to miss them? Be sure to collect search suggestions and combine data from different sources (site counters, statistics services, and keyword databases).

2. Mixing informational and commercial queries on one page. We have already discussed that key phrases differ by the type of need. If a visitor who wants to make a purchase lands on your site and sees an article page in response to his query, do you think he will be satisfied? No! Search engines reason the same way when ranking a page, which means you can immediately forget about the top for MF and HF phrases. So, if you are unsure about a query's type, check the search results or use the Pixel Tools or Just-Magic commerciality checks.

3. Choosing highly competitive queries for promotion. Positions for highly competitive HF phrases depend 60-70% on behavioral factors, and to earn those you need to get into the top first. The more contenders, the longer the queue and the higher the requirements for sites. Everything is like in life or sports: becoming a world champion is much harder than earning the same title in your city.
So it is better to enter a quiet niche than an overheated one.

Getting into the top used to be even harder: places were held on a first-come-first-served basis. Leaders stayed in front and could only be displaced by accumulating behavioral factors - but how do you get those from the second or third page? Yandex broke this vicious circle in the summer of 2015 with the "multi-armed bandit" algorithm, whose essence is to randomly raise and lower site positions in order to see whether more worthy candidates for the top have appeared.

How much money do you need to start?

To answer this question, let's calculate the cost of the arsenal of programs and services needed to prepare and group key phrases for 100 articles.

The very minimum (suitable for the classic version):

1. SlovoEB - free
2. Majento clusterer - free
3. For captcha recognition - 30 rubles.
4. Xml-limits - 70 rubles.
5. Checking the competition of a request for Mutagen - 10 checks per day for free
6. If you are in no hurry and are ready to spend 20-30 hours on parsing, you can do without a proxy.
—————————
Total: about 100 rubles. If you enter captchas yourself and earn XML limits in exchange for those shared from your own site, you can actually prepare the core for free. You will just need an extra day to set up and master the programs, and another 3-4 days to wait for the parsing results.

The standard semantist's set (for the advanced and classic methods):

1. Key Collector - 1,900 rubles
2. KeyAssort - 1,700 rubles
3. Bukvarix (business account) - 650 rubles
4. Keys.so competitor analysis service - 1,500 rubles
5. 5 proxies - 350 rubles per month
6. Anti-captcha - about 30 rubles
7. XML limits - about 80 rubles
8. Mutagen competition checks (1 check = 30 kopecks) - about 200 rubles
———————-
Total: 6,410 rubles. You can, of course, do without KeyAssort by replacing it with the Majento clusterer, and use SlovoEB instead of Key Collector - then 2,810 rubles will be enough.

Is it worth entrusting the development of the core to a "pro", or is it better to figure it out and do it yourself?

If a person regularly does what he loves and keeps improving at it, then logically his results should be far better than a beginner's. But with keyword selection, it often turns out exactly the opposite.

Why does a beginner do better than a professional in 90% of cases?

It's all about the approach. The semantist's task is not to collect the best possible core for you, but to finish the job in the shortest time and at a quality level you will accept.

If you do everything yourself following the algorithms described earlier, the result will be an order of magnitude better, for two reasons:

  • You understand the topic. You know the needs of your clients or site users, so at the initial stage you can maximize the marker queries for parsing, using plenty of synonyms and specific terms.
  • You are interested in doing everything properly. A business owner, or an employee of the company, will naturally approach the issue more responsibly and try to do everything to the maximum. The more complete the core and the more low-competition queries in it, the more targeted traffic it will collect, and the higher the profit with the same investment in content.

And how do you find the remaining 10% who build the core better than you would?

Look for companies where key phrase selection is the core competence, and agree upfront on the result you want: average, like everyone else's, or the maximum. In the second case it will be 2-3 times more expensive, but in the long term it pays off many times over. For those who want to order the service from me - all the necessary information and conditions. I guarantee the quality!

Why it is so important to work out the semantics fully

Here, as in any field, the principle of the "good and bad choice" applies. What does it mean?
Every day we choose between options:

  • dating someone who seems fine but doesn't really click, or working out what you want and building a harmonious relationship with the right person;
  • doing work you don't like, or finding something your heart is in and making it your profession;
  • renting retail space in a low-traffic location, or waiting until a suitable spot becomes available;
  • hiring not the best sales manager, but the one who performed best at today's interview.

Everything seems clear. But look at it from the other side, treating each choice as an investment in the future. This is where it gets interesting!

You saved 3-5 thousand rubles on the semantic core - and you're thrilled! But what does this lead to next:

a) for informational sites:

  • A traffic loss of at least 1.5 times with the same investment in content. Comparing different methods of collecting key phrases, we have already established empirically that the alternative method yields 51% more;
  • The project sinks faster in the search results: competitors easily overtake you by answering the intent more fully.

b) for commercial projects:

  • Fewer leads, or more expensive ones. If your semantics are the same as everyone else's, you are promoting on the same queries as your competitors, and a large number of offers against constant demand reduces each player's market share;
  • Lower conversion. Specific queries convert into sales better; saving on the semantic core means losing the highest-converting keys;
  • Harder promotion. The more contenders for the top, the higher the requirements for each candidate.

I wish you always make a good choice and invest only in a plus!

P.S. The bonus "How to write a good article with bad semantics", along with other life hacks for promoting and earning money online, can be found in my group


In this article, you will learn:

  • How to compose the semantic core of the site
  • What programs to use for this
  • How to analyze the semantic core of a competitor's website
  • What mistakes are most often made in assembling the semantic core
  • How much does it cost to order a ready-made semantic core of the site

The semantic core is the basis of any Internet resource and the guarantee of its successful promotion and of attracting the target audience. In this article you will learn how to correctly create a site's semantic core and what mistakes to avoid.

What is the semantic core of the site

The easiest and at the same time most effective way to attract visitors to your site is to make them show interest themselves by clicking through from Yandex or Google. To do this, you need to find out what your target audience is interested in and how - with what words - users search for the information they need. The semantic core will help you with this.

The semantic core is a collection of individual words and phrases that characterize the subject matter and structure of your site. Semantics is originally a branch of philology dealing with the meaning of words; nowadays it is more often understood as the study of meaning in general.

From this we can conclude that the terms "semantic core" and "meaning core" are synonyms.

The purpose of creating a site's semantic core is to fill the site with content attractive to users. To do this, you need to find out what keywords they will use to search for the information posted on your site.



Building a site's semantic core involves distributing search queries, or groups of queries, across pages in a way that satisfies the target audience as fully as possible.

This can be done in two ways. The first is to analyze users' search phrases and create the site structure based on them. The second is to first devise the skeleton of the future site and then, after analysis, distribute keywords across it.

Each method has a right to exist, but the second is more logical: first you create the site structure, and then you fill it with the search queries by which potential customers can find the content they need.

This way you demonstrate proactivity: you decide for yourself what information to convey to site visitors. Otherwise, by building the site structure from keywords alone, you merely adjust to the surrounding reality.

There is a fundamental difference between the SEO specialist's and the marketer's approaches to creating a site's semantic core.

The classic optimizer will tell you: to create a website, you need to select phrases and words from search queries that can get you to the TOP of the search results, then form the structure of the future site from them and distribute the keywords across the pages. The page content is then created for the selected keywords.

A marketer or entrepreneur approaches website creation differently. First, he thinks about what the site is for and what information it will carry to users. Then he sketches an approximate site structure and a list of pages. At the next stage, he creates the semantic core in order to understand what search queries potential customers use to look for that information.

What are the disadvantages of working with the Semantic Core from an SEO professional's perspective? First of all, with this approach, the quality of information on the site deteriorates significantly.

It should be up to the company to decide what to tell customers, rather than serving content as a mere reaction to the most popular search queries. Such blind optimization can also eliminate some promising low-frequency queries.

The result of creating a semantic core is a list of keywords distributed across the site's pages: the page URLs, the keywords, and their query frequency.

An example of the semantic core of the site

How to compose the semantic core of the site: step by step instructions

Step 1. Compiling an initial list of queries

First, select the most popular search queries on your site's topic. Here are some ways to do this:

1. The brainstorming method - you, alone or with colleagues, spend a short period writing down all the words and phrases by which, in your opinion, users will search for the information on your site.

Write down all possible options, including:

  • variations in the spelling of the name of a product or service, synonymous words, ways of spelling the name in Latin letters and Cyrillic;
  • full names and abbreviations;
  • slang words;
  • references to the constituent elements of a product or service, for example, building materials - sand, brick, corrugated board, putty, etc.;
  • adjectives that reflect the significant characteristics of a product or service (quality repair, fast delivery, painless dental treatment).

2. Analyze your competitors' sites. Open your browser in incognito mode for your region and look through the competitor sites that the search results show for your topic. Find all potential keywords. You can determine a competitor site's semantic core using services such as bukvarix.com.

Analyze contextual advertising. On your own, or with specialized services (for example, spywords.ru or advodka.com), study competitors' semantic cores and find out what keywords they use.

With all three approaches, you end up with a fairly large keyword list. But it will still not be enough to create an effective semantic core.

Step 2. Expanding the resulting list

At this stage, the Yandex.Wordstat and Google AdWords services will help you. Enter the words from the key list formed at the first stage into the search bar of either service, one by one, and you will receive lists of refined and associative search queries.

Refined queries may include not only your word but other words and phrases too. For example, entering the keyword "dog" gives 11,115,538 queries containing it, including last month's queries such as "dog photos", "dog treatment", and "dog breeds".


Associative queries are the words or phrases that users searched for along with your query. For example, along with the keyword "dog", users entered: "dry food", "royal canin", "Tibetan mastiff", etc. These search queries may also be useful to you.


In addition, there are special programs for creating a site's semantic core, such as Key Collector and SlovoEB, and online services such as Topvisor and serpstat.com. They allow you not only to select keywords, but also to analyze them and group search queries.

To expand your key list as much as possible, see what the search suggestions show: there you will find the most popular queries beginning with the same letters or words as yours.

Step 3. Removing unnecessary queries

Search terms can be classified in different ways. Depending on the frequency, requests are:

  • high-frequency (more than 1500 requests per month);
  • medium-frequency (600-1500 requests per month);
  • low-frequency (100-200 requests per month).

This classification is rather arbitrary. The assignment of a request to a particular category will depend on the topic of a particular site.

In recent years, there has been an upward trend in low-frequency queries. Therefore, to promote the site, the semantic core should include medium and low frequency queries.

There is less competition among them, so it will be much easier to raise your site to the first page of search results than when working with high-frequency queries. Plus, many search engines welcome sites using low-frequency keywords.

Another classification of search queries is based on search goals:

  1. Information - keywords that users enter in the search for certain information. For example: "how to glue the tiles in the bathroom yourself", "how to connect the dishwasher."
  2. Transactional - keywords that are entered by users planning to take some action. For example: "watch a movie online for free", "download the game", "buy building materials."
  3. Vital - queries that users enter in search of a specific site. For example: "Sberbank online", "buy a refrigerator on Yandex.Market", "vacancies on HeadHunter".
  4. Others (general) - all other search queries by which you can understand what the user is looking for. For example, the query “car” can be entered by the user if he wants to sell, buy or repair a car.

Now it's time to remove from the list all keywords that:

  • do not correspond to the theme of your site;
  • include brand names of competitors;
  • include the names of other regions (for example, buy an iPhone in Moscow if your site only works for Western Siberia);
  • contain typos or errors (if a user types "sabaka" instead of "dog", the search engine counts it as a separate search query).

Step 4. Determining competitive queries

To distribute keywords effectively across the site's pages, you need to rank them by importance. For this, use the Keyword Effectiveness Index (KEI). The calculation formula:

KEI = P² / C,

where P is the number of keyword impressions over the last month, and C is the number of sites optimized for this search query.

The formula shows that the more popular the keyword, the higher the KEI, the more targeted traffic you will attract to your website. High competition for a search query makes it difficult to promote a site on it, which is reflected in the KEI value.

Thus, the higher the KEI, the more popular the search query, and vice versa: the lower the keyword performance index, the higher the competition for it.

There is a simplified variation of this formula:

KEI = P² / U,

where, instead of C, U is the number of pages optimized for this keyword.

Let's look at an example of how to use the Keyword Performance Index KEI. Let's determine the frequency of requests using the Yandex Wordstat service:


Next, let's see how many pages appear in the search results for this query.


Let's substitute the found values of the variables into the formula and calculate the keyword effectiveness index KEI:

KEI = (206,146 × 206,146) / 70,000,000 ≈ 607

How to interpret KEI values:

  • KEI below 10: the query is ineffective;
  • KEI from 10 to 100: the query is effective and will attract the target audience to the site;
  • KEI from 100 to 400: the query is very effective and will attract a significant share of traffic;
  • KEI above 400: the query is maximally effective and will attract a huge number of users.

Keep in mind that the gradation of KEI values depends on the site's topic. The scale above therefore cannot be applied to all Internet resources: for some, even KEI > 400 may be insufficient, and for narrow-topic sites the classification does not apply at all.
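A minimal sketch of the calculation and the scale above, with the caveat that, as just noted, the bands are indicative and topic-dependent:

```python
def kei(impressions: int, competing_pages: int) -> float:
    """Keyword Effectiveness Index: KEI = P^2 / C."""
    return impressions ** 2 / competing_pages

def kei_band(value: float) -> str:
    if value < 10:
        return "ineffective"
    if value < 100:
        return "effective"
    if value < 400:
        return "highly effective"
    return "maximally effective"

value = kei(206_146, 70_000_000)
print(round(value), kei_band(value))  # 607 maximally effective
```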

Step 5. Grouping keywords on the site

Clustering a site's semantic core is the process of grouping search queries based on logic and on search engine results. Before starting, it is important to make sure that the specialist doing it understands all the intricacies of the company and its product and knows their specifics.

This work is costly, especially for a multi-page Internet resource, but you don't have to do it manually: you can cluster the semantic core automatically using special services such as Topvisor or Seranking.ru.

But it is better to double-check the results, since the programs' logic for dividing keys into groups may not coincide with yours. In the end, you will get the final site structure and a clear picture of which pages to create and which to drop.

When is it necessary to analyze the semantic core of the competitors' website

  1. When starting a new project.

You are working on a new project and building its semantic core from scratch. To do this, you have decided to analyze the keywords competitors use to promote their sites.

Many of them suit you, so you use them to fill out your semantic core. Consider, however, the niche the competitors operate in: if you plan to occupy a small market share while they operate at the federal level, you cannot simply copy their semantic core wholesale.

  2. When expanding the semantic core of a working site.

You have a website that needs promotion. The semantic core was formed long ago but is ineffective; the site needs optimization, restructuring, and content updates to grow traffic. Where do you start?

The first step is to analyze the semantic core of competing sites using specialized services.

How to use keywords from competitors' sites in the most effective way?

Here are some simple rules. First, consider the percentage of keyword overlap between your site and other Internet resources. If your site is still under development, choose any competing site, analyze it, and use its keywords as the basis of your semantic core.

In the future, you will simply compare how your reference keys intersect with keys from competitors' sites. The easiest way is to upload the list of all competing sites through the service and filter them by the percentage of overlap.

Then download the semantic cores of the first few sites into Excel or Key Collector and add the new keywords to your own semantic core.

Second, before copying keys from a donor site, be sure to inspect it visually.

  3. When buying a ready-made website for subsequent development or resale.

Consider an example: you want to buy a specific site, but before the final decision you need to assess its potential. The easiest way is to study its semantic core, comparing the site's current coverage with that of its competitors.

Take the strongest competitor as a benchmark and compare its visibility with that of the Internet resource you plan to acquire. A significant lag behind the benchmark is a good sign: it means the site has the potential to expand its semantic core and attract new traffic.

Pros and cons of analyzing the semantic core of competitors through special services

The principle of operation of many services for determining keywords on other people's sites is as follows:

  • a list of the most popular search queries is formed;
  • for each key, 1-10 pages of search results (SERPs) are collected;
  • this collection of key phrases is repeated at regular intervals (weekly, monthly, or yearly).

Disadvantages of this approach:

  • services reveal only the visible part of search queries on competing organizations' websites;
  • services keep a kind of "snapshot" of the SERP, made at the moment the keywords were collected;
  • services can determine visibility only for the search queries in their databases - that is, they show only the keywords they know about;
  • to get reliable data about keywords on a competing site, you need to know when the search results were collected (visibility analysis);
  • not all queries appear in the search results, so the service does not see them. The reasons vary: the site's pages have not yet been indexed, or the search engine does not rank them because they load slowly, contain viruses, and so on;
  • there is usually no information about which keys are in the database the service uses to collect search results.

Thus, a service does not reconstruct the real semantic core underlying the site, only a small visible part of it.

Based on the above, the following conclusions can be drawn:

  1. The semantic core of a competitor's site, formed with the help of special services, does not give a complete, current picture.
  2. Checking a competitor's semantic core helps to supplement the semantics of your own Internet resource or to analyze competing companies' marketing policies.
  3. The larger the service's keyword base, the slower the SERP processing and the lower the relevance of the semantics: while the service collects results for the beginning of the database, the data at its end expires.
  4. Services do not disclose how relevant their databases are or when they were last updated. Therefore, you cannot know how well the keywords collected from a competitor's site reflect its real semantic core.
  5. A significant advantage of this approach is access to a large list of competitors' keywords, many of which you can use to expand your own site's semantic core.

TOP-3 paid services where you can find out the semantic core of competitors

Megaindex Premium Analytics


This service has a rich arsenal for analyzing the semantic core of competing sites. Using the module "Site visibility" you can find and download a list of key phrases, identify sites with similar topics that can be used to expand the semantic core of your site.

One of the disadvantages of Megaindex Premium Analytics is that you cannot filter key lists inside the service itself: you first have to export them to Excel.


Keys.so


To analyze a semantic core with the keys.so service, enter the URL of a competitor's site, select suitable sites by the number of matching key phrases, analyze them, and export the list of search queries they are promoted for. The service makes this quick and simple, and its modern interface is a nice bonus.

Cons: a small search-phrase database and an insufficient visibility refresh rate.


Spywords.ru


This service not only analyzes visibility but also provides statistics on advertisements in Yandex.Direct. The spywords.ru interface is hard to grasp at first because it is overloaded with functionality, but overall the service does its job well.

Using the service, you can analyze competing sites, identify intersections in key phrases, and export a list of competitors' keys. The main disadvantage is the relatively small database (about 23 million search phrases).


Thanks to special programs, sites and their semantic cores are no longer a mystery to you. You can easily analyze any competitors' Internet resources that interest you. Here are some tips for using this information:

  1. Use keywords only from sites with similar topics (the more intersections with yours, the better).
  2. Do not analyze portals: their semantic cores are too large. As a result, you will not enrich your own core, you will only bloat it. And that, as you already know, can be done endlessly.
  3. When buying a website, be guided by the indicators of its current visibility in the search engine, compare them with the sites included in the TOP to assess the development potential.
  4. Take keywords from competitors' sites to complement the semantic core of your site, not build it from scratch.
  5. The larger the base of the service that you use, the more complete your semantic core will be. But pay attention to the frequency of updating the databases of search phrases.

7 services that will help you create the semantic core of your website from scratch online

Google Keyword Planner


If you are wondering how to create a semantic core for your site, pay attention to this service. It can be used not only in Runet but also in any other segment where AdWords works.

Open Google AdWords. In the top panel, in the "Tools" section, click on "Keyword Planner". A new menu will appear in which you need to select the section "Search for new keywords by phrase, site or category". Here you can configure the following parameters:

  • a keyword or phrase that will be searched for;
  • subject of the product or service;
  • region of search queries;
  • the language in which users enter search queries;
  • search system for key phrases;
  • negative keywords (should not be present in key phrases).

Next, click the "Get options" button, after which Google AdWords will give you possible synonyms for your keyword or phrase. The resulting data can be exported to Google Docs or downloaded as a CSV file.
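Once the options are downloaded as CSV, the negative-keyword idea is easy to apply offline as well. A minimal sketch, assuming a file keyword_ideas.csv with a "Keyword" column (both the file name and column name are illustrative):

```python
import csv

NEGATIVE = {"free", "download"}  # illustrative negative keywords

with open("keyword_ideas.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Keep only phrases that contain none of the negative words.
cleaned = [r for r in rows
           if not NEGATIVE & set(r["Keyword"].lower().split())]
print(f"{len(cleaned)} of {len(rows)} phrases kept")
```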

Benefits of using Google AdWords:

  • the ability to select synonyms for a key phrase;
  • using negative keywords to refine the search query;
  • access to a huge database of search queries of the Google search engine.

The main drawback of the service: if you have a free account, then Google AdWords will provide inaccurate data on the frequency of search queries. The margin of error is so significant that it is impossible to rely on these indicators when promoting a site. The way out is to buy access to a paid account or use another service.

Serpstat


This service allows you to comprehensively collect user search queries by keywords and by site domains. Serpstat is constantly expanding its set of regional databases.

The service allows you to identify the key competitors of your site, determine the search phrases by which they are promoted, and generate a list of them for subsequent use in the semantic core of your Internet resource.

Serpstat service advantages:

  • a wide range of tools for analyzing the semantic core of competitors' sites;
  • informative reporting forms reflecting the frequency indicators for the selected region;
  • the option of exporting search queries for specific pages of the site.

Cons of the Serpstat service:

  • although the service's databases are constantly updated, there is no guarantee that the frequency data is realistic in the interval between updates;
  • not all search phrases with low frequency are displayed by the service;
  • a limited set of languages and countries the service works with.

Key Collector


This program will help you not only assemble a site's semantic core but also expand, clean, and cluster it. Key Collector can collect search queries, provide data on their frequency for selected regions, and process semantics.

The program searches for key phrases based on seed lists and can work with databases of various formats.

Key Collector can show the frequency of keywords from data downloaded from Serpstat, Yandex Wordstat and other services.

Semrush


Compiling a site's semantic core in Semrush is free, but you will receive no more than 10 keywords with frequency data for the selected region. In addition, the service lets you find out what other search queries users in other regions enter for your keyword.

Pros of the Semrush service:

  • works worldwide: you can collect search query frequency data for Western regions;
  • for each key phrase gives out TOP sites in the search results. You can focus on them in the future, when forming the semantic core of your own site.

Cons of the Semrush service:

  • if you want to get more than 10 keywords, you need to purchase the paid version for $100;
  • it is impossible to download the complete list of keywords.

Keywordtool


This service allows you to collect key phrases for a semantic core from foreign Internet resources in broad match. Keywordtool also picks up search suggestions and phrases that contain the base key.

If you use the free version of the program, then in one session you can get no more than 1000 search phrases without data on the level of their frequency.

Keywordtool service advantages:

  • works with different languages and in many countries of the world;
  • shows search queries not only from search engines, but also from popular online stores (Amazon, eBay, App Store) and the largest video hosting service YouTube;
  • the breadth of coverage of search phrases exceeds that of Google AdWords;
  • the generated list of search queries can be easily copied into a table of any format.

Disadvantages of the Keywordtool service:

  • the free version does not provide data on the frequency of search queries;
  • there is no way to load keywords immediately with a list;
  • searches only for phrases that can contain the base keyword and does not take possible synonyms into account.

Ubersuggest


The semantic core of the site in the Ubersuggest service can be created based on the search queries of users from almost any country in the world in any language. If you use the free version, you will receive up to 750 search phrases per request.

The advantage of the service is the ability to sort the list of keywords in alphabetical order, taking into account the prefix. All search queries are automatically grouped, which makes it easier to work with them when forming the semantic core of the site.

The disadvantages of Ubersuggest are inaccurate search query frequency data in the free version and the inability to search by keyword synonyms.

Ahrefs Keywords Explorer


This service can collect keywords for your semantic core in broad, phrase, and exact match for the selected region, taking frequency into account.

There is an option to select negative keywords and view the TOP search results in Google by your main keywords.

The main disadvantages of Ahrefs Keywords Explorer are that only a paid version exists and that data accuracy depends on how fresh its databases are.

Frequently asked questions on compiling the semantic core of the site

  • How many keys are enough to create the semantic core of the site (100, 1000, 100,000)?

There is no definite answer to this question. It all depends on the specifics of the site, its structure, the actions of competitors. The optimal number of keys is determined individually.

  • Is it worth using ready-made databases of key phrases to form the semantic core of the site?

On the Internet, you can find many different resources with themed keybases. For example, Pastukhov's Base, UP Base, Mutagen, KeyBooster, etc. This is not to say that you should not use such sources. Such databases contain significant archives of search queries that will be useful to you for website promotion.

But keep metrics like keyword competitiveness and relevance in mind, and remember that your competitors may use the same ready-made databases. Another disadvantage of such sources is that they may lack key phrases that matter specifically to you.

  • How to use the semantic core of the site?

The key phrases selected to create the semantic core are used to generate a relevance map. It includes the title, description and h1-h6 headings, which are needed to promote the site. Also, keys are taken as a basis for writing SEO texts for website pages.
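In code form, one row of such a relevance map might be represented like this. This is only a sketch: the field set mirrors the elements listed above, and the URL, texts, and frequencies are invented:

```python
# One page's entry in a relevance map: URL, meta tags, headings and
# the key phrases (with frequencies) assigned to that page.
relevance_map = [
    {
        "url": "/cakes/wedding/",
        "title": "Wedding cakes to order",            # main key in the title
        "description": "Order a wedding cake with delivery.",
        "h1": "Wedding cakes to order",
        "keys": [("wedding cake to order", 880), ("buy wedding cake", 320)],
    },
]
```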

  • Should I take queries with zero frequency for the semantic core of the site?

This is advisable in the following cases:

  1. If you spend a minimum of resources and time to create pages with such keys. For example, the generation of SEO filter pages in automatic mode in online stores.
  2. Zero frequency is not absolute, that is, at the time of collecting information, the frequency level is zero, but the search engine history shows queries for this word or phrase.
  3. Zero frequency only in the selected region, in other regions the level of frequency for this key is higher.

5 common mistakes when assembling a semantic core for a website

  1. Avoiding highly competitive keywords. Including such a key in the core does not oblige you to push the site to the TOP for it at all costs. You can use such a search phrase as an addition to the semantic core, as a content idea.
  2. Refusing to use low-frequency keys. Such search terms can also be used as content ideas.
  3. Creating pages for individual search queries. Surely you have seen sites with a separate page for each similar request (for example, "buy a wedding cake" and "make a wedding cake to order"). But a user who enters these queries actually wants the same thing. There is no point in making multiple pages.
  4. Building the semantic core solely with services. Of course, collecting keys automatically makes life easier. But the result will be of minimal value if you do not analyze it. After all, only you understand the specifics of your industry, the level of competition, and your company's promotions.
  5. Over-focusing on collecting keys. If you have a small site, start by collecting semantics using Yandex or Google services. Do not immediately dive into analyzing competitors' semantic cores or collecting keys from different search engines. All these methods will come in handy when you realize it is time to expand the core.

Or maybe it is better to order the compilation of the semantic core of the site?

You can try to compose the semantic core yourself using the free services that we talked about. For example, Google's Keyword Planner might give you good results. But if you are interested in building a large quality semantic core, plan this line item in your budget.

On average, developing a site's semantic core will cost from 30 to 70 thousand rubles. As you remember, the final price depends on the subject area of the business and the optimal number of search queries.

How not to buy a pig in a poke

A good semantic core will not come cheap. To make sure the contractor understands this work and will do it at a high level, ask them to collect test semantics for one query. This is usually done for free.

To check the results, run the list of keys through Mutagen and analyze how many of them are high-frequency and low-competition. Contractors often deliver lists with a large number of key phrases, many of which are completely unsuitable for further use.



Organic search is the most effective source of targeted traffic. To use it, you need to make the site interesting and visible to users of Yandex and Google search engines. There is no need to reinvent the wheel here: it is enough to define what the audience of your project is interested in and how they seek information. This task is solved when building a semantic core.

The semantic core is a set of words and phrases that reflect the topic and structure of the site. Semantics is the branch of linguistics that studies the meaning of language units. In this sense, the terms "semantic core" and "core of meaning" are identical. Remember this: it will keep you from sliding into keyword stuffing, that is, cramming content with keywords.

Composing the semantic core, you are answering the global question: what information can be found on the site. Since one of the main principles of business and marketing is customer focus, the creation of the semantic core can be viewed from the other side. You need to determine what search terms users are using to search for information that will be published on the site.

Building a core of meaning solves another problem. We are talking about the distribution of search phrases across the pages of the resource. By working with the core, you determine which page is the best answer to a particular search query or group of queries.

There are two approaches to solving this problem.

  • The first assumes creating the site structure based on the results of analyzing user search queries. In this case, the semantic core defines the framework and architecture of the resource.
  • The second involves planning the resource structure in advance, before analyzing search queries. In this case, the semantic core is distributed over the finished framework.

Both approaches work in one way or another. But it is more logical to first plan the structure of the site and then determine the queries by which users will be able to find this or that page. In this case, you remain proactive: you choose what you want to tell potential customers. If you instead fit the resource structure to the keys, you merely react to the environment rather than actively shaping it.

The difference between the SEO and marketing approaches to building a core needs to be clearly emphasized here. The logic of a typical old-school SEO goes like this: to build a website, you need to find keywords and select phrases that can realistically reach the top of the results. After that, you need to create the site structure and distribute the keys across pages. The page content is then optimized for the key phrases.

This is the logic of a businessman or a marketer: you need to decide what information to broadcast to the audience using the site. To do this, you need to know your industry and business well. First, you need to plan a rough site structure and a preliminary list of pages. After that, when building a semantic core, you need to find out how the audience searches for information. With the help of content, you need to answer the questions that the audience asks.

What are the negative consequences of using the "SEO" approach in practice? When development starts from keyword statistics rather than from the business, the informational value of the resource decreases. A business must shape trends and choose what to say to customers. It should not limit itself to reacting to search phrase statistics and creating pages only for the sake of optimizing the site for some key.

The planned result of building a semantic core is a list of key queries distributed across the pages of the site. It contains URLs of pages, search queries and an indication of their frequency.

How to build a site structure

The site structure is a hierarchical page layout. With its help, you solve several problems: you plan an information policy and the logic of information presentation, ensure the usability of the resource, and ensure that the site meets the requirements of search engines.

To build a structure, use a convenient tool for you: spreadsheet editors, Word or other software. You can also draw the structure on a piece of paper.

When planning your hierarchy, answer two questions:

  1. What information do you want to communicate to users?
  2. Where should this or that information block be published?

Imagine planning a site structure for a small pastry shop. The resource includes information pages, a publications section, and a showcase or product catalog. Visually, the structure might look like this:

For further work with the semantic core, arrange the site structure as a table. In it, list the page names and indicate their hierarchy. Also include columns for page URLs, keywords, and frequency. The table might look like this:
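Since the original illustration is not reproduced here, a minimal sketch of such a table as CSV can stand in for it. The page names and hierarchy are placeholders from the pastry-shop example:

```python
import csv

# Columns mirror the text: page name, parent (subordination), URL,
# keys and frequency; the last three stay empty for now.
structure = [
    ("Home",    "",        "", "", ""),
    ("Catalog", "Home",    "", "", ""),
    ("Cakes",   "Catalog", "", "", ""),
    ("Blog",    "Home",    "", "", ""),
]

with open("site_structure.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Page", "Parent", "URL", "Keys", "Frequency"])
    writer.writerows(structure)
```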

You will fill in the URL, Keys, and Frequency columns later. For now, move on to searching for keywords.

What you need to know about keywords

To build a semantic core, you must understand what keywords are and which keys your audience uses. With this knowledge, you will be able to correctly use one of the keyword research tools.

What keys are used by the audience

Keys are words or phrases that potential customers use to find the information they need. For example, to make a cake, the user enters the query "Napoleon recipe with photo" into the search box.

Keywords are classified according to several criteria. In terms of popularity, high-, medium- and low-frequency queries are distinguished. According to various sources, search phrases are grouped as follows:

  • Low-frequency queries include those with up to 100 impressions per month. Some experts raise the threshold to 1,000 impressions.
  • Mid-frequency queries include those with up to 1,000 impressions. Sometimes experts raise the threshold to 5,000 impressions.
  • High-frequency queries include phrases with 1,000 impressions or more. Some authors count as high-frequency only keys with 5,000 or even 10,000 queries.

The difference in the frequency estimate is due to the different popularity of the topics. If you are building a core for an online store that sells laptops, the phrase “buy a samsung laptop” with a display frequency of about 6,000 per month will be midrange. If you are building a core for a sports club site, the request for "aikido section" with a frequency of about 1000 requests will be high-frequency.
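Because the HF/MF/LF boundaries float with the topic, it can help to treat them as parameters rather than constants. A minimal sketch; the default thresholds follow the classification above, and the sample calls echo the laptop example:

```python
# Treat HF/MF/LF boundaries as parameters, since they depend on the topic.
# Defaults follow the generic classification above; pass higher thresholds
# for hot niches like laptop retail.
def frequency_class(impressions, lf_max=100, mf_max=1000):
    if impressions <= lf_max:
        return "LF"
    if impressions <= mf_max:
        return "MF"
    return "HF"

print(frequency_class(6000))                             # HF by generic thresholds
print(frequency_class(6000, lf_max=1000, mf_max=10000))  # MF in a hot niche
```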

What do you need to know about frequency when composing a semantic core? According to various sources, from two-thirds to four-fifths of all user queries are low-frequency. Therefore, you need to build the widest possible semantic core and constantly expand it with low-frequency phrases.

Does this mean that high and medium frequency requests can be ignored? No, you can't do without them. But consider low-frequency keys as the main resource for attracting targeted visitors.

According to the needs of users, the keys are combined into the following groups:

  • Informational. The audience uses them to find information. Examples of informational queries: "how to store baked goods correctly", "how to separate the yolk from the white".
  • Transactional. Users enter them when they plan to take an action. This group includes keys like "buy a bread maker", "download a recipe book", "order pizza with delivery".
  • Other queries. These are key phrases for which it is difficult to determine the user's intent. For example, when a person uses the key "cake", they may be planning to buy one or to bake it themselves. In addition, the user may be looking for information about cakes: definition, features, classification, etc.

Some experts distinguish navigation queries into a separate group. With their help, the audience searches for information on specific sites. Here are some examples: "laptop connected", "city express track delivery", "sign up for LinkedIn." Navigation queries that are not specific to your business can be ignored when compiling the semantic core.

How to use this classification when building a semantic core? First, consider the needs of your audience when distributing keys across pages and creating a content plan. Everything is obvious here: publications in informational sections should answer informational queries, and most key phrases without an explicit intent should go there as well. Transactional queries should be answered by pages from the "Store" or "Showcase" sections.

Second, remember that many transactional queries are commercial. To attract organic traffic for the query "buy a Samsung smartphone", you will have to compete with Euroset, Eldorado, and other business heavyweights. To avoid unequal competition, use the recommendation above: expand your core as much as possible and reduce the frequency of requests. For example, the frequency of the query "buy a Samsung Galaxy s6 smartphone" is an order of magnitude lower than that of "buy a Samsung Galaxy smartphone".

What you need to know about the anatomy of search queries

Search phrases consist of several parts: a body, a specifier, and a tail. This is easiest to see with an example.

Consider the query "cake". It cannot be used to determine the user's intent. It is high-frequency, which means high competition in the search results. Promoting with this query will bring a large share of non-targeted traffic, which hurts behavioral metrics. The high frequency and non-specificity of the "cake" query are determined by its anatomy: it consists only of a body.

Now look at the query "buy a cake". It consists of the body "cake" and the specifier "buy". The latter determines the user's intent. Specifiers indicate whether a key is transactional or informational. Take a look at the examples:

  • Buy a cake.
  • Cake recipes.
  • How to serve a cake.

Sometimes specifiers can express exactly the opposite intent of the user. A simple example: users are planning to buy or sell a car.

Now look at the query "buy cake delivery". It consists of a body, a specifier, and a tail. The tail does not change the user's intent or informational need but details it. Take a look at the examples:

  • Buy cake online.
  • Buy a cake in Tula with delivery.
  • Buy a homemade cake in Orel.

In each case, the intention of the person to purchase the cake is visible. And the tail of the keyword phrase details this need.

Knowledge of the anatomy of search phrases allows you to derive a conditional formula for selecting keys for the semantic core. You must define basic terms related to your business, product, and user needs. For example, customers of a confectionery company are interested in cakes, pastries, cookies, pastries, cupcakes and other confectionery.

After that, you need to find the tails and specifiers that the project's audience uses with the basic terms. With tailed phrases, you simultaneously increase reach and reduce competition for the core.
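This "formula" is easy to mechanize. Below is a tiny sketch that crosses illustrative bodies, specifiers, and tails into candidate long-tail phrases; the generated list is only raw material that still has to be checked against real frequency data:

```python
# Derive long-tail candidates from the "body + specifier + tail" anatomy.
# All word lists are invented examples for the pastry-shop theme.
from itertools import product

bodies = ["cake", "cupcakes"]
specifiers = ["buy", "order"]
tails = ["with delivery", "in Tula", "online"]

candidates = [" ".join(p) for p in product(specifiers, bodies, tails)]
# e.g. "buy cake with delivery", "order cupcakes in Tula", ...
print(len(candidates), "candidate phrases")
```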

The long tail is a term for the strategy of promoting a resource with low-frequency keywords. It consists in using the maximum number of keys with a low level of competition. Low-frequency promotion makes marketing campaigns highly effective. This is due to the following factors:

  • Promoting with low-frequency keys requires less effort than promoting with high-frequency competitive queries.
  • Working with long-tail queries is practically guaranteed to bring results, although marketers cannot always predict exactly which keys will generate traffic. With high-frequency queries, honest marketers cannot guarantee results at all.
  • Low-frequency queries make the search results match user needs more precisely.

For large sites, the semantic core can contain tens of thousands of queries, and it is almost impossible to pick and correctly group them by hand.

Services for compiling the semantic core

There are quite a few keyword research tools out there. You can build a core using paid or free services and programs. Choose a specific tool depending on your tasks.

Key Collector

You cannot do without this tool if you are engaged in internet marketing professionally, develop multiple sites or compose the core for a large site. Here is a list of the main tasks that the program solves:

  • Selection of keywords. Key Collector collects queries through Yandex Wordstat.
  • Parsing search suggestions.
  • Clipping inappropriate search phrases with stop words.
  • Filtering requests by frequency.
  • Search for implicit duplicate queries.
  • Definition of seasonal requests.
  • Collection of statistics from third-party services and platforms: Liveinternet.ru, Metrika, Google Analytics, Google AdWords, Direct, Vkontakte and others.
  • Search for pages relevant to the request.
  • Search query clustering.

Key collector is a multifunctional tool that automates the operations required to build a semantic core. The program is paid. You can perform all the actions that Key Collector "knows how" with the help of alternative free tools. But for this you will have to use several services and programs.

SlovoEB

This is a free tool from the creators of Key Collector. The program collects keywords through Wordstat, determines the frequency of queries, parses search suggestions.

To use the program, enter the login and password for your Direct account in the settings. Do not use your main account, as Yandex may block it for automatic requests.

Create a new project. On the Data tab, select the Add Phrases option. Specify search phrases that the project audience is supposed to use to find information about products.

In the "Collect keywords and statistics" section of the menu, select the required option and run the program. For example, determine the frequency of key phrases.

The tool allows you to select keywords, as well as automatically perform some tasks related to the analysis and grouping of queries.

Keyword selection service Yandex Wordstat

To see which phrases a page is shown for in Yandex search results, open the "Search queries" tab in the Yandex.Webmaster panel -> "Recent requests".

We can see the phrases for which there were clicks or the site snippet was shown in the TOP-50 of Yandex for the last 7 days.

To see the data only for the page that interests us, you need to use filters.

The possibilities of searching for additional phrases in Yandex.Webmaster are not limited to this.

Go to the "Search queries" tab -> "Recommended Requests".

There may not be many phrases here, but you can find additional phrases for which the promoted page does not fall into the TOP-50.

Request history

The big disadvantage of the visibility analysis in Yandex.Webmaster, of course, is that the data is available only for the last 7 days. To get around this limitation a little, you can try to supplement the list using the "Searches" tab -> "Request history".

Here you will need to select "Popular Searches".

You will receive a list of the most popular phrases from the last 3 months.

To get phrases from Google Search Console, go to the "Search Traffic" tab -> "Analysis of search queries". Next, select "Impressions", "CTR", "Clicks". This will allow you to see more information that can be useful when analyzing phrases.

By default, the tab displays data for 28 days, but you can expand the range to 90 days. You can also select the desired country.

As a result, we get a list of requests similar to the one shown in the screenshot.

New version of Search Console

Google has already made some of the tools available in the new version of the panel. To view queries for a page, go to the "Status" tab -> "Efficiency".

In the new version, filters are arranged differently, but the filtering logic is preserved. I think there is no point in dwelling on this question. Of the significant differences, it is worth noting the ability to analyze data for a longer period, and not just for 90 days. A significant advantage when compared to Yandex.Webmaster (only 7 days).

Analysis services for competing websites

Competitor sites are a great source of keyword ideas. If you are interested in a specific page, you can manually determine the search phrases for which it is optimized. To find the main keys, it is usually enough to read the material or check the content of the keywords meta tag in the page code. You can also use services for semantic analysis of texts, for example, Istio or Advego.

If you need to analyze an entire site, use comprehensive competitive analysis services.

You can use other tools to collect key phrases as well. Here are some examples: Google Trends, WordTracker, WordStream, Ubersuggest, Topvisor. But do not rush to master all services and programs at once. If you are compiling the semantic core for your own small site, use a free tool, for example, the Yandex keyword selection service or Google Keyword Planner.

How to find keywords for the semantic core

The process of selecting key phrases consists of several stages:

  1. In the first, you will define the base keywords that the audience uses to search for your product or business.
  2. The second stage is devoted to the expansion of the semantic core.
  3. In the third step, you will remove inappropriate search phrases.

Defining base keys

Fill in a spreadsheet or write down general search phrases related to your business and products. Gather colleagues and brainstorm. Record all proposed ideas without discussion.

Your list will look something like this:

Most of the keys you wrote down are high in frequency and low in specificity. To get high-specificity mid- and low-frequency search phrases, you need to expand your core as much as possible.

Expanding the semantic core

You will accomplish this task using keyword research tools such as Wordstat. If your business is tied to a region, select the appropriate region in the settings.

With the help of the key phrase selection service, you need to analyze all the keys recorded at the previous stage.

Copy the phrases from the left column of Wordstat and paste into the table. Pay attention to the right column of Wordstat. In it, Yandex offers phrases that people used in conjunction with the main query. Depending on the content, you can immediately select the appropriate keys from the right column or copy the entire list. In the second case, inappropriate requests will be eliminated at the next stage.

The result of this stage will be a list of search phrases for each basic key that you brainstormed. The lists can contain hundreds or even thousands of queries.
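If you paste Wordstat results into a plain text file rather than a spreadsheet, a small script can turn them into (phrase, frequency) pairs. This sketch assumes a tab-separated copy-paste layout, which is an assumption: adjust the split if your export comes out differently:

```python
# Turn phrases copied from the Wordstat left column into sorted
# (phrase, frequency) pairs. The sample data below is illustrative.
raw = """cakes to order\t11500
wedding cakes to order\t1300
children's cakes to order\t2100"""

pairs = []
for line in raw.splitlines():
    phrase, _, freq = line.rpartition("\t")
    if phrase and freq.isdigit():
        pairs.append((phrase.strip(), int(freq)))

pairs.sort(key=lambda x: -x[1])  # most frequent first
print(pairs)
```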

Removing unsuitable search phrases

This is the most time-consuming stage of working with the core. You need to manually remove inappropriate search phrases from it.

Do not use frequency, competition, or other purely "SEO" metrics as criteria for evaluating keys. Do you know why old-school SEOs consider certain search phrases junk? Take the key "diet cake": the Wordstat service predicts only 3 impressions per month for it in the Cherepovets region.

To promote pages for specific keywords, old school SEOs bought or rented links. By the way, some experts still use this approach. It is clear that search phrases with low frequency in most cases do not pay off the money spent on buying links.

Now look at the phrase "diet cakes" through the eyes of a typical marketer. Some of the confectionery company's target audience is genuinely interested in such products. Therefore, the key can and should be included in the semantic core. If the pastry shop makes appropriate products, the phrase will come in handy in the product description section. If the company for some reason does not work with diet cakes, the key can be used as a content idea for the informational section.

What phrases can be safely excluded from the list? Here are some examples:

  • Keys mentioning competing brands.
  • Keys that mention products or services that you do not sell or plan to sell.
  • Keys with the words "inexpensive", "cheap", "with a discount". If you are not dumping, cut off those who love the cheap so as not to spoil the behavioral metrics.
  • Duplicate keys. For example, of the three keys "custom-made cakes for a birthday", "cakes to order for the day" and "cakes to order for birth", it is enough to leave the first one.
  • Keys mentioning inappropriate regions or addresses. For example, if you serve residents of the Northern District of Cherepovets, the key "cakes to order industrial district" is not suitable for you.
  • Phrases entered with errors or typos. Search engines understand that the user is looking for croissants even if they type a misspelled version of the word into the search box.

After deleting the inappropriate phrases, you have a list of queries for the basic key "cakes to order". The same lists need to be drawn up for the other basic keys from the brainstorming phase. After that, move on to grouping key phrases.
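As an illustration of this cleaning stage, here is a minimal Python sketch that drops phrases matching the exclusion rules above. All of the word lists are invented examples, not a standard:

```python
# Drop phrases containing stop words, competitor brands, or regions
# you do not serve; every list here is purely illustrative.
STOP_WORDS = {"cheap", "inexpensive", "discount"}   # if you are not dumping
COMPETITOR_BRANDS = {"samplebrand"}                 # hypothetical brand name
WRONG_REGIONS = {"industrial"}                      # districts you don't serve

def keep(phrase):
    words = set(phrase.lower().split())
    return not words & (STOP_WORDS | COMPETITOR_BRANDS | WRONG_REGIONS)

keys = [
    "cakes to order",
    "cheap cakes to order",
    "cakes to order industrial district",
]
cleaned = [k for k in keys if keep(k)]
print(cleaned)  # ['cakes to order']
```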

How to group keywords and build a relevance map

The search phrases with which users find (or will find) your site are combined into semantic clusters: groups of queries that are similar in meaning. This process is called search query clustering. For example, the semantic cluster "Cake" includes all key phrases associated with this word: cake recipes, ordering a cake, cake photos, wedding cakes, etc.

A semantic cluster is a group of queries united by meaning. It is a multi-level structure: inside the first-order cluster "Cake" there are second-order clusters "Cake recipes", "Ordering cakes", "Photos of cakes". Within the second-order cluster "Cake recipes" one could theoretically distinguish a third level: "Recipes for cakes with mastic", "Recipes for biscuit cakes", "Recipes for shortbread cakes". The number of levels depends on the breadth of the topic. In practice, in most topics it is sufficient to single out business-specific second-order clusters within the first-order clusters.

In theory, a semantic cluster can have many levels.
In practice, you will have to work with clusters of the first and second levels.

You will have brainstormed most of the first-level clusters when writing down the basic key phrases. For this, it is enough to understand your own business and to refer to the site diagram you drew up before starting work on the semantic core.

It is very important to perform clustering correctly at the second level. Here, search phrases are distinguished by specifiers that indicate user intent. A simple example is the "cake recipes" and "cakes to order" clusters. Phrases from the first cluster are used by people who need information; keys from the second are used by customers who want to buy a cake.

You have identified the search phrases for the cake-to-order cluster using Wordstat and manual filtering. They must be distributed between the pages of the "Cakes" section.

For example, the cluster contains the query "custom-made soccer cakes" in several variants.

If there is a corresponding product in the company's assortment, create a corresponding page in the "Mastic Cakes" section. Add it to the site structure: specify the name, URL, and search phrases with their frequency.

Use keyword research tools to see what other search phrases potential customers use to find soccer-themed cakes. Add the relevant ones to the page's keyword list.

In the list of cluster search phrases, mark the distributed keys in a convenient way. Distribute the remaining search phrases.

If necessary, change the structure of the site: create new sections and categories. For example, the page "Custom Paw Patrol Cakes" should go under the "Baby Cakes" section. At the same time, it can be included in the "Mastic Cakes" section.

Pay attention to two points. First, the cluster may not have matching phrases for the page you are planning to create. This can happen for a variety of reasons. For example, these include the imperfection of the tools for collecting search phrases or their incorrect use, as well as the low popularity of the product.

The absence of a suitable key in the cluster is not a reason to refuse to create a page and sell a product. For example, imagine that a confectionery company sells children's cakes featuring Peppa Pig characters. If the corresponding keys are not included in the list, clarify the needs of the audience using Wordstat or another service. In most cases, there will be suitable queries.

Secondly, even after removing unnecessary keys, search phrases may remain in the cluster that are not suitable for the created and scheduled pages. They can be ignored or used in another cluster. For example, if a pastry shop for some reason fundamentally does not sell Napoleon cake, the corresponding key phrases can be used in the Recipes section.

Search query clustering

Grouping of search queries can be carried out manually, in Excel or Google spreadsheets, or automated, using special applications and services.

Clustering allows you to understand how requests can be distributed across the pages of the site for their fastest and most effective promotion.

Automatic clustering or grouping of search queries of the semantic core is carried out based on the analysis of sites included in the TOP-10 results of the search engines Google and Yandex.

How automatic query grouping works: for each query, the TOP-10 search results are examined. If at least 4-6 of the same URLs appear in the results for two queries, those queries can be grouped and placed on one page.

Automatic grouping is the fastest and most effective way to combine keywords into an almost ready-to-use site structure.
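For illustration, here is a minimal sketch of that grouping rule in Python. The serp data is a stand-in for whatever SERP source you actually have (a service export or API); the URL-overlap threshold follows the 4-6 range described above:

```python
# Group queries onto one page when their TOP-10 result URLs overlap
# by at least `threshold` entries.
def cluster(queries, serp, threshold=4):
    """serp: dict query -> set of TOP-10 URLs; returns a list of groups."""
    groups = []
    for q in queries:
        for group in groups:
            # compare with the group's first query as its representative
            if len(serp[q] & serp[group[0]]) >= threshold:
                group.append(q)
                break
        else:
            groups.append([q])
    return groups

serp = {
    "buy wedding cake":   {"a.com", "b.com", "c.com", "d.com", "e.com"},
    "wedding cake order": {"a.com", "b.com", "c.com", "d.com", "f.com"},
    "cake recipe":        {"x.com", "y.com", "z.com", "w.com", "v.com"},
}
print(cluster(list(serp), serp))
# [['buy wedding cake', 'wedding cake order'], ['cake recipe']]
```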

If the grouping is wrong from the point of view of search engine statistics, it will, alas, be impossible to form the site structure and distribute queries across its pages in a way that successfully pushes them to the TOP!

Apps and services for automatic grouping of search queries

Among the services that automate the grouping of keywords, it is worth highlighting:

  • Key Collector.
  • Rush Analytics.
  • TopVisor.

After distributing all the keys, you will receive a list of existing and planned site pages with URL, search phrases and frequency. What to do with them next?

What to do with the semantic core

A table with a semantic core should become a roadmap and the main source of ideas when planning your content.

Look: you have a list with preliminary page titles and search phrases. They define the needs of the audience. When drawing up a content plan, you just need to clarify the name of the page or publication. Include your main search term. This is not always the most popular key. In addition to popularity, the query in the title should best reflect the needs of the page audience.

Use the rest of the search phrases as an answer to the question "what to write about". Remember, you don't need to force every search phrase into your content or product description at all costs. The content should cover the topic and answer users' questions. Again, focus on information needs, not on search phrases and how to fit them into the text.

Semantic core for online stores

The specificity of collecting and clustering semantics for online stores lies in the presence of four groups of pages that are very important for all subsequent work:

  • Home page.
  • Pages of sections and subsections of the catalog.
  • Product card pages.
  • Blog article pages.

Above, we have already talked about different types of search queries: informational, transactional, commercial, navigational. For the section and product pages of an online store, transactional queries are the most interesting, i.e. queries whose authors want to see sites where they can make a purchase.

It is necessary to start forming the core with a list of goods that you are already selling or planning to sell.

For online stores:

  • as " body»Requests will be made product names;
  • as " specifiers"Phrases:" buy», « price», « sale», « to order», « a photo», « description», «

Useful materials from the blog on collecting keys for semantics, clustering queries and optimizing site pages.

Article topics:


Semantic core



A correctly composed semantic core brings exactly the users you need to your site, while an unsuccessful one can bury the site deep in the search results.

Working with the queries included in the semantic core (SC) consists of collection, cleaning, and clustering. Once the queries are grouped, you need to determine the optimal place for each group: a page of your resource, part of your site's content, or a third-party site.


How to collect keys for the SC


Briefly about the important: which operators to use in Wordstat to see the queries you need, and how to make your work with the service easier.

Wordstat does not provide absolutely accurate information: it does not contain all queries, and the data can be distorted, because not all consumers use Yandex. However, from this data you can draw conclusions about the popularity of a topic or product, roughly predict demand, collect keys, and find ideas for new content.

You can search for data by simply typing a query into the service's search box, but there are also operators for refining queries: additional characters that narrow down what is counted. They work on the search tabs by words and by regions; on the query history tab, only the "+query" operator can be used.
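As a brief reminder (this list follows Wordstat's commonly documented syntax but is not exhaustive), the main operators look like this:

  • "cakes to order" - quotation marks count only this phrase and its word forms, without tails;
  • !cake - the exclamation mark fixes the exact word form;
  • cake -recipe - the minus sign excludes phrases containing a given word;
  • how +to store a cake - the plus sign forces a stop word to be counted;
  • cake (order|buy) - parentheses and the vertical bar group alternatives;
  • [buy cake] - square brackets fix the word order.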

In the article:

  • Why do you need Wordstat
  • Working with Wordstat operators
  • How to read Wordstat data
  • Extensions for Yandex Wordstat

We strive to become leaders of the search results: how analyzing top-ranking articles helps in working on content, what criteria to analyze by, and how to do it faster and more efficiently.

It is difficult to track the results of blogging and posting other texts on the site without detailed analytics. How do you understand why competitors' articles are in the top and yours are not, even though you write better and with more talent?

In the article:

  • What is usually advised
  • How to analyze
  • Cons of the approach
  • Benefits of content analysis
  • Tools

How to write optimized texts


Which content gets more links and social signals? Backlinko partnered with BuzzSumo to analyze 912 million blog posts: article length, headline format, social signals and article backlinks, and deduced recommendations for content marketing. We have translated and adapted the study.

In the article:

  • Summary of Conclusions from Content Research
  • New knowledge about content marketing, in detail:
  1. Which content gets more links
  2. What texts are more popular on social networks
  3. Backlinks are hard to get
  4. What materials are collected by all reposts
  5. How the number of backlinks and reposts is related
  6. Which headlines bring more shares
  7. What day of the week is it better to publish content
  8. What content format is shared more often
  9. Which content format gets more links
  10. How to generate links and reposts of B2B and B2C content

Michael (Kashchey)

18.11.2015

The semantic core of the site: what is it? Collecting the semantic core and analyzing key queries

What is the semantic core (SC)? Before I give the answers, let's go over some related concepts. This is necessary so that we speak the same language. So:

A key query (KQ) is a phrase typed into the search bar of Yandex, Google, etc.

Query frequency. Queries fall into low-frequency, mid-frequency, and high-frequency categories (LF, MF, HF).

Target audience (TA): those who are interested in your services, products, or information.

So what is the SC? The semantic core is the collection of key queries of all categories that bring your target audience to your site. The second question to consider before compiling a semantic core (more precisely, before the story of how to compose one) is key query frequency. What is it, and how do you divide queries by frequency?

Dividing queries by frequency is not difficult. If a key is entered more than 1,000 times a month, it is definitely HF. If 100-1,000 times, it is MF. Anything under 100 is LF.

Attention! In some narrow topics, these numbers do not work. In that case, find the query with the highest frequency: that will be your HF. MF queries will sit between LF and HF. Which services help you find out how many people type a given key phrase each month? Look for the answer in the article: (in it you will also find information about SEO services that help you build an SC).

Now that I have tried to explain which is which, we proceed to the main thing: to collect the semantic core.

Compilation of the semantic core for the site

Building a semantic core is not as easy as it sounds. You need to consider all possible options for LF and MF requests. To compile an SEO core, it is better to use special services. Information about them can be found at the link above.

How to select requests? Let's say you are building a website for cat lovers. How would you go about looking for information about cats? What would you write to search? The first thing that comes to mind. For instance:

Cat (HF +) (cats are not a separate request)

Siamese cat (HF)

Cats (midrange)

What do domestic cats eat (LF)

I checked the frequency on the wordstat.yandex.ru service. Like this:

Pay attention to the quotation marks. They let you find out how many people entered the query in its exact form. When compiling a semantic core, you need to focus on exact-form queries and their "tails". You can read about "tails" in a separate article.

Hope this is clear.

Finding all possible thematic keys is dreary, painstaking work that takes a lot of time. However, a lot depends on the quality of the site's semantic core: namely, the success of the resource's further SEO optimization.

What is the most important thing in compiling a semantic core?

The most important thing in drawing up a semantic core is to correctly structure all key queries. This is necessary in order to use the compiled semantic core as efficiently as possible. And there is nothing better than a table.

Here's a good example of a table. By the way, the table is best made in Excel.

So what do we see? A competent structure that is easy to work with. You can add your own columns to the table to make your task easier.

Your task is to find as many low-frequency queries with low competition as possible and promote your site for these queries. How can you tell if a query is low-competitive? If the request is LF, then in 80% of cases it has little competition. You can also check the level of competition in a search engine. Like this:

The result: 43 million responses. For the cat topic, competition will be low. For other topics, you need to benchmark against other numbers. For example, the key "copywriter" is an HF key with 2 million responses and high competition.
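To compare keys across topics on a common footing, one rough heuristic (an assumption of this workflow, not a standard metric) is to normalize the number of search results by the query's monthly frequency, in the spirit of KEI. A minimal sketch:

```python
# A rough, KEI-style heuristic: results per interested searcher.
# Lower values hint at less competition relative to demand. The
# frequency numbers below are invented purely for illustration.
def competition_ratio(results_count, monthly_frequency):
    return results_count / max(monthly_frequency, 1)

print(competition_ratio(43_000_000, 52_000))  # cat topic (example frequency)
print(competition_ratio(2_000_000, 1_200))    # "copywriter" (example frequency)
```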

HF queries will automatically fit into articles written for LF queries; this is normal. Ideally, you write one article per key query, select a picture for it, post it to social network groups, and promote it with linked articles, but this is long and expensive. Therefore, 2-3 keys go into one article, which reduces the cost of the articles.

Did the article not answer your question? Ask it in the comments!

P.S. I will be pleased.
