How to use a semantic core: a simple example of compiling one

The semantic core of a site is the set of keywords and phrases that most fully describe the site's subject matter and focus.

Compiling the semantic core is the second most important step in creating a website, right after choosing its topic. All future promotion of a new site in search engines depends on it.

At first glance, finding keywords for a website does not seem difficult. In practice, the process has many nuances that must be taken into account.

In today's article we will work through the main aspects of compiling a semantic core.

What is the semantic core for?

The semantic core matters for a site of any subject, from an ordinary blog to an online store. Blog owners need to work constantly on improving their search rankings, and the keywords of the semantic core play a major role in this. Online store owners need to know how buyers search for the products the store sells.

To bring your site into the TOP-10 for its search queries, you need to compile the semantic core correctly and optimize high-quality, unique content for it. Without unique content there is little point in discussing the benefits of a semantic core. Each unique article should be optimized for one query or several similar ones.

How to make the semantic core of the site?

The online keyword-selection services can help in compiling the semantic core of a site. Almost every search engine has one: Yandex has Wordstat, Google has Google AdWords, and Rambler has Rambler Adstat. In these services you can select the main keywords on a specific topic in various word forms and combinations.

In Wordstat's left column you can see the number of monthly requests not only for a given word but also for various combinations containing it. The right column shows statistics for other keywords that users who entered the given query also searched for. This information can be useful for creating relevant content for the site.

In Yandex Wordstat you can also select a specific region to see request statistics for that region only. Such information is useful for companies that provide services within a single region.

The Key Collector program can also help compile the semantic core. With it you can quickly collect all the keywords and determine their effectiveness and competitiveness. The program can also analyze the site for compliance with the semantic core.

The main drawback of Key Collector is that it is paid: the program costs 1,500 rubles.

You can also use the drop-down suggestions in search engines to build the semantic core. If you type "semantic core" into Google, it will offer several more keywords related to the entered query.

What to do with the semantic core of the site?

After compiling the list of keywords, divide it into conditional groups by query frequency. All search queries fall into three classes: high-frequency, mid-frequency, and low-frequency.

It is best to arrange the semantic core as a table with the high-frequency queries at the top, the mid-frequency ones below them, and the low-frequency ones lower still. Words and phrases on each line should be similar in subject matter and morphology.
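The table described above can be sketched in a few lines of code. This is a minimal illustration; the queries and monthly counts are invented examples, not real statistics.

```python
# A minimal sketch of arranging a semantic core as a frequency-sorted table.
# The query list and monthly counts are made-up illustrations.
queries = {
    "office tables": 14070,            # high-frequency
    "office corner table": 1154,       # mid-frequency
    "office table with cabinet": 219,  # low-frequency
}

# Sort by monthly frequency, highest first, as the article suggests.
table = sorted(queries.items(), key=lambda kv: kv[1], reverse=True)

for phrase, freq in table:
    print(f"{phrase:<30} {freq:>6}")
```

The same sorted structure works as the starting point for grouping queries into blocks later on.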

A correctly composed semantic core can greatly facilitate the further promotion of the site in search engines. Promotion affects the site's traffic, and with it the income.

In the last post, we found out what the semantic core of a site is. Suppose it is already ready, created independently or ordered from a specialist.

Now let's figure out what to do with the semantic core next.

We can use it to promote our site.

So, step by step ...

1. We divide the keywords into groups by the number of requests:

After we have collected all the words relevant to our topic, they need to be divided into groups according to the number of requests for these words in the search.

High-frequency queries (HF) form the backbone of the niche. They typically consist of one word with a general meaning (e.g., construction, taxes, website)
- roughly 4,000-10,000 queries/month.

Mid-frequency queries (MF) are derived from high-frequency ones and narrow the search. That is, they are high-frequency queries with additional clarifying words, usually two or three words in total
- roughly 1,000-4,000 queries/month.

For instance: buy trellises, optimize the site, real estate in Kiev.

Low-frequency queries (LF) are specific to the niche and describe a detailed search.

More precisely, these are mid-frequency queries with deeper clarifying additions, as well as specific word combinations, i.e., queries that reflect the essence of the search most precisely
- roughly 10-1,000 queries/month.

Example: construction of multi-storey buildings Moscow, construction materials inexpensively with delivery, wholesale cement Kiev.

HF (high-frequency) queries are the most-searched words or phrases in your topic (the most popular words).

LF (low-frequency) queries are words or phrases that are searched for in search engines with a low frequency and are related to your topic.

MF (mid-frequency) queries fall somewhere between LF and HF.

All the figures here are very approximate. They depend heavily on the niche you are choosing keywords for, so in practice the numeric cutoffs should be picked after reviewing all the statistics.

For instance:

- apartment renovation — HF
- apartment renovation Moscow — MF
- elite apartment renovation Moscow — LF
- interior design — HF
- apartment interior design — MF
- interior design of an apartment St. Petersburg — LF

It happens that the greatest return comes from keys with only 1-2 requests per month.
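The bucketing rule above can be expressed as a small function. The cutoffs used here are the article's example figures (1,000 and 4,000 queries/month); as the text stresses, real thresholds are niche-dependent and should be chosen after reviewing the full statistics.

```python
def classify(freq_per_month: int) -> str:
    """Rough HF/MF/LF bucketing using the article's example cutoffs.

    The cutoffs (1,000 and 4,000) are illustrative only: real
    thresholds depend on the niche and must be picked after
    reviewing the full statistics.
    """
    if freq_per_month >= 4000:
        return "HF"
    if freq_per_month >= 1000:
        return "MF"
    return "LF"

print(classify(14070), classify(1154), classify(219))
```

Making the cutoffs parameters of the function would let you re-tune them per niche without changing the calling code.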

2. We break the semantic core into blocks:

A block is a list of queries that extend one, usually highly competitive, key: a group of keywords for which a specific category page, or the posts in that category, will be optimized.

For instance, the query block for "office tables" might look like this:

General:

office tables (14070)
office table price (669)
office furniture tables (685)
office desks photos (543)

Specific:

office corner table (1154)
buy an office table (1099)
office table buy (1099)
office computer table (417)
office desk (358)
office desk (244)
office table with cabinet (219)

The number of impressions per month is shown in brackets.

We can assume that if we optimize the site's posts for these keywords and manage to bring them to the first positions in the search results, we will get approximately that many visitors.

3. Make a list of entry points:

Entry points are the pages that, once users land on them, can interest them with the information they contain. These are the pages we will promote in search engines.
Such a list can consist of the home page plus 5-20 section (category) pages.

By the way, for successful promotion you need to design the correct site structure from the start, combining thematically similar articles into categories.
If the information on your resource is partially or completely duplicated, choose as the entry point the page with the more complete information that is the most convenient to promote (higher static weight, larger link mass).

In the case of online stores, portals, catalogs of companies or services, entry points can be pages of individual products (companies) or simply internal pages containing an answer to a possible user question.

In that case, the number of entry points on a site can reach several hundred or even thousands.

4. We distribute the collected keys by entry points:

For each entry point, we select relevant queries based on the content of the article.
As a rule, the number of pages on a site is small, much smaller than the number of collected keys.

The question arises: how do we distribute the phrases to achieve the maximum effect, and what should the semantic core of the site look like overall?

Here is how I do it:

Home page (the site's "face") — HF,
2nd level (categories) — MF,
3rd level (posts) — LF.
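The three-level scheme above amounts to routing each classified query to a page type. A minimal sketch, using the earlier apartment-renovation examples as illustrative data:

```python
from collections import defaultdict

# Illustrative (query, class) pairs, classified as described earlier.
keys = [
    ("apartment renovation", "HF"),
    ("apartment renovation Moscow", "MF"),
    ("elite apartment renovation Moscow", "LF"),
]

# The article's scheme: HF -> home page, MF -> categories, LF -> posts.
PAGE_FOR_CLASS = {"HF": "home", "MF": "category", "LF": "post"}

plan = defaultdict(list)
for phrase, cls in keys:
    plan[PAGE_FOR_CLASS[cls]].append(phrase)

print(dict(plan))
```

As the text notes right after, this mapping is not strict; adjust it to the subject matter and structure of the site.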

1. The home page will naturally be the first entry point.

If the content on the home page is dynamic (frequently updated), add a short text (up to 1,000 characters) containing the keywords for which the home page will be promoted.

According to statistics, the home page accounts for 30-50% of all search traffic.
It appears in search results most often, and most external links point to it.

Therefore, the home page should be optimized for high-frequency, highly competitive keys.

2. Pages of the 2nd level:
Each section (category) is optimized for 1-3 mid-frequency keys and, possibly, for a number of low-frequency ones.

3. Pages of the 3rd level:
We optimize individual posts for 1-2 low-frequency keys and, at the same time, for several of their word forms.

This is not a strict division - it all depends on the peculiarities of the subject matter and the structure of the site.

For instance, tables can be:

- wooden
- metal
- plastic
- prefabricated
- dining
- office

The group of queries related to the "tables" key contains general queries:

tables, characteristics of tables, tables to buy, tables price, table to buy moscow

We will optimize the category page for them.

and specific ones: metal tables.

For these we either optimize an article (a specific product) or create a subsection (a subcategory, if we have many metal tables) and optimize it for this query.

The semantic core of the site, examples:

Home page. We optimize it for the following high-frequency queries:

office tables (14070)
office table price (669)
office furniture tables (685)
office desks photos (543)

Category pages. We optimize them for mid-frequency queries:

buy a computer desk (1099)
computer table to buy (1099)
computer table (417)
computer desktop (358)
computer desk Moscow (456)

Selected articles. We optimize them for low-frequency queries:

corner computer table (1154)
computer table with drawer (219)
inexpensive computer table (180)
computer table dimensions (165)
executive computer desk (123)
computer desk black (90)
office computer tables (88)

As a result of all these actions, we should get a table matching each entry point to its list of queries.

For each entry point, select one block, at most two.

Try to avoid planning too many query blocks for a single entry point.

How many keywords should be on the page?

A page can be optimized for any number of words, from 1 to 50. It depends on:

- which words are used;
- how many words you could find on the topic;
- how you divided the collected words into groups.

After optimizing the page, be sure to run a semantic analysis on it, i.e., check its "academic" and "classical" nausea (keyword density) so that there is no over-spam.
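The two nausea metrics can be approximated in a few lines. Definitions vary between services, so the formulas below are common approximations rather than an official standard: classical nausea is taken as the square root of the count of the most frequent word, and academic nausea as that word's share of all words, in percent.

```python
import math
import re
from collections import Counter

def nausea(text):
    """Rough 'classical' and 'academic' nausea estimates.

    These formulas are common approximations; exact definitions
    differ between semantic-analysis services.
    """
    words = re.findall(r"\w+", text.lower())
    top_count = Counter(words).most_common(1)[0][1]
    classical = math.sqrt(top_count)              # sqrt of top word's count
    academic = 100.0 * top_count / len(words)     # top word's share, %
    return classical, academic

c, a = nausea("table table table desk chair")
print(round(c, 2), round(a, 1))  # 1.73 60.0
```

A 60% share for one word, as in this toy text, would be severe over-spam; real pages aim far lower.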

Good afternoon, dear readers of my SEO blog, and visitors who arrived from search engines looking for a detailed manual on selecting keywords for their web resource. Keyword selection is the basis of every SEO specialist's and blogger's work on internal optimization, the foundation of successful search promotion of a website or blog. Without it, your goals will not be reached, whether they are selling goods or services, offering information products, or simply monetizing traffic through advertising.

If your semantic core is composed incorrectly, then however many resources you invest in promoting your landing pages, you should not expect visitors from search engines. They simply will not find your blog, because it will not be on the first two pages of the SERP (the top 20): you will be creating posts and pages for the wrong keywords.

That is why I wrote this manual, which describes in detail and with an example the principles of keyword search and the rules for estimating promotion costs. All the material has been tested on my own experience and works well; the technique suits both beginner SEOs and bloggers. I am sure that after studying this manual you will no longer wonder how to create a semantic core for your blog.


To see all the routine work involved in compiling a semantic core, we will select keywords for a computer online store. Of course, for a large store the core can consist of thousands of keywords or more, so in this example we will consider only a small set. The main thing is to understand the principle of selecting keywords for a site.

1. Analysis of site topics

The purpose of the first stage is to find an initial list of queries. Here we analyze the site's field of activity: we find a set of phrases that describe what the web resource does, its tasks, and the ways they are carried out. We have to predict and collect the words and phrases by which potential visitors will search for the site.

1.1 The initial list of phrases on the topic of the site.

If the site has not yet been created or is still in progress, we compile an initial list of words and phrases that describe the subject area. These phrases can be obtained from the customer for whom we are creating the semantic core. If the customer does not know them, or is simply not that interested (this also happens), we will have to look through thematic sites in the same area and select the words and terms characteristic of each one. There is no need at this stage to judge which will be useful and which will not; our task is to collect all thematic phrases from these resources. Such phrases usually occur in the text 3-10 times, unlike common words.
If the customer's site is already online, go through its sections and pages, reading the descriptions of its goods or services. The principle is the same: select everything that can describe the topic of the resource. After that, go through the competing sites in the area and select additional phrases. Figure 1 shows the general scheme of searching for key queries for the future semantic core.

Figure 1. Finding keywords for the core

Now let's create an initial list using the online store example. Suppose we have no customer site, so we turn directly to the search engines to analyze the topic. Take, for example, the phrase "buy a computer" and look at the top 10 in Yandex (Figure 2).

Figure 2. Issuance of first places for the keyword "buy a computer"

We choose a site in the top ten results, below the Yandex.Direct contextual advertising block. We look through its pages, headings, and home page and write out thematic phrases (Figure 3).

Figure 3. We are looking for keywords for our core

Do not forget the keywords that may be specified in the description and keywords meta tags and in the title tag (Figure 4).

Figure 4.

1.2. Extended list of phrases

In addition to phrases that fit the topic we need, we should add phrases that can expand the range of our initial list. Let's list the main sources of such phrases:

  • synonyms, which both expand the keyword list and diversify the text for better readability (for example, in our case "computer", "hardware", and "PC" are synonyms; personally, I use the synonymizer.ru service to find synonyms — Figure 5);

Figure 5. On this site I am looking for synonyms

  • verbs, nouns, and adjectives sharing a common root (for example, for our site, "computer" and "computing" share a root);
  • terms that describe not the topic itself but the name or purpose of the site (for example, in our case the words "store" or "online store");
  • additional words with which we can form various queries (for example, in our computer store, for the word "computer" these can be "sale", "buy", "sell", "purchase", etc.);
  • related terms that can complement the thematic area of the site for which the semantic core is being compiled (for example, for our computer store, the word "computers" associates with phrases such as "accessories", "laptop bag", etc.);
  • clue words , or associative queries in search engines (such queries can be seen when you type a query in the search bar - Figure 6);

Figure 6. Search word suggestions

  • slang, abbreviated, professional, and misspelled names of goods or services, and terms from the subject area for which the word list is compiled (for example, in our case "laptop" may appear in queries in slang or misspelled forms, and "motherboard" under its slang abbreviation);

As you can see, compiling the extended list, like the initial one, is almost entirely manual work. You have to weigh various options and small details when choosing words for these lists. But it is at this stage that you need to give your best, putting aside all distracting thoughts and ordinary human laziness.

1.3. Formation of a list of requests.

After selecting the additional words, we need to compile a list of queries (a list of masks) for the semantic core. Each term from the starter list can be combined with words from the extended list. All plausible combinations should be written into a separate table; phrases that are clearly unlikely as real search queries need not be entered. Figure 7 shows part of such a table for our example.

Figure 7. List of masks
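The combination step can be automated before the manual weeding: take the cross product of starter terms and extension words in both word orders, then strip out the implausible phrases by hand, as the text recommends. The word lists below are illustrative.

```python
from itertools import product

# Starter terms and extension words from the running example (illustrative).
starters = ["computer", "laptop"]
extras = ["buy", "price", "sale"]

# Combine each starter with each extension word, in both orders;
# a set removes accidental duplicates.
masks = sorted(
    {f"{a} {b}" for s, e in product(starters, extras) for a, b in ((s, e), (e, s))}
)

print(len(masks))  # 12
```

The output is only a raw candidate list; implausible combinations still have to be removed manually before frequency checking.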

2. Clearing the list

Now we need to clean up the keywords in the list of masks. To do this, we sort the list in descending order of frequency — the number of times Internet users submit the query to a search engine per month, i.e., a measure of its popularity.

2.1 Sorting requests by frequency.

What is this for? Each key query of the future semantic core is potential traffic from the Internet. The more visits a landing page gets for a word, the more impressions and, accordingly, the greater the likelihood of a transaction (an action on the page — in our case, buying a computer). Knowing the expected number of visits, you can plan the number of potential visitors (for a computer store, potential buyers).
To sort by frequency you can use various resources, from specialized programs (for example, the paid KeyCollector or the free SlovoEB) to the services of automatic promotion systems (SeoPult, WebEffector, Rookee) and the search engines' own services (Yandex Wordstat and the Google AdWords Keyword Tool). The last two services are most often used as the primary source of visit statistics. For our example we will use the Yandex service — an excellent tool for building a semantic core.

2.2. Service Wordstat.yandex.ru

If you still do not know how to use Wordstat in your work, I suggest the following material:

To obtain the frequencies of the keywords of the list of masks, you must perform the following actions:

a) go to the keyword statistics page and enter the first phrase from the list of masks. If the site will be promoted in a particular region, use "specify region". After entering the query, two columns appear: the left column lists phrases that include ours, with monthly impressions; the right column shows similar associative queries with their own frequencies, which can supplement the list of masks (Figure 8). We write down each keyword or phrase;

Figure 8. Kernel Keyword Frequency Checking

b) after checking all the phrases, keep only the keywords whose frequency exceeds your chosen threshold (for our online store we keep all queries with a frequency above 40 — Figure 9);

Figure 9. Correction of the list taking into account the specified frequency

c) now let's check the frequency of the exact occurrence of each key phrase. To do this, enclose the query whose impressions we want to know in quotation marks and put an exclamation mark before each word (Figure 10);

Figure 10 . Checking the frequency of exact occurrences of key phrases

d) after checking all the key phrases in our modified table, we exclude words whose exact frequency is below 10 (Figure 11). Such phrases are called dummy queries: because of their tiny number of impressions they will bring no traffic to landing pages. As a result, we get a forecast table of the site's monthly traffic.

Figure 11 . Final adjustment of the list of keywords
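Step d) is a simple threshold filter. A sketch with invented exact-match frequencies, using the article's cutoff of 10:

```python
# Sketch: drop 'dummy' phrases whose exact-match frequency is below the
# article's threshold of 10. The frequencies here are invented examples.
exact_freq = {
    "buy a computer": 4213,
    "buy a computer inexpensively": 84,
    "buy a computer with delivery to the village": 3,
}

MIN_EXACT = 10
core = {q: f for q, f in exact_freq.items() if f >= MIN_EXACT}

print(sorted(core))
```

Keeping the threshold as a named constant makes it easy to tighten or loosen per niche.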

3. Estimating the cost of promoting a request

After correcting the list by removing low-frequency words, you need to check what resources (mainly money) are required to promote your landing pages in the search engines. The cost of promotion can be estimated from the link value of each keyword as reported by SEO aggregators (for example, Rookee or SeoPult). Let's use the SeoPult service to see which keywords should be excluded because of high promotion costs.
Only registered users can work in this aggregator, so we register and add a new project (Figure 12).

Figure 12 .

If you are working on creating a semantic core of a web resource that is not yet in the index of search engines, in the site url field, enter the address of any indexed site. The main thing in this case is that the site you have chosen belongs to the same region for which we checked the frequency of keywords.
On the next page of the project, add the keys, select the desired number of top positions, and click the button to calculate the predicted cost of one click from the search engines (Figure 13).

Figure 13 . Core Keyword Budgeting Steps

The figure shows that some keywords have a much higher cost per click than others. This suggests that those queries are highly competitive in their topic and therefore require correspondingly large investments in external links to promote successfully. For landing pages targeting low-frequency keywords, words costing no more than 5-10 rubles per click are suitable; usually competent internal linking and a couple of external links are enough to get such pages into the top 10.
Having reviewed all the keywords from the list of masks in this way, we group the queries into high-, medium-, and low-competition sets based on their link value. Now you have a ready semantic core. It remains to choose the landing pages for promotion, fill them with unique content, optimize them, and get them indexed by the search engines.

Conclusions

Let's summarize the above in a short manual on compiling a semantic core:

  1. We analyze the topic of the site. This analysis yields an initial list of words that describe the topic and direction of the site.
  2. We supplement the initial list with various additional words (synonyms, terms, words with the same root, etc.).
  3. We compile the list of query masks.
  4. We clean up the semantic core, excluding dummy words and keywords whose monthly impressions fall below the chosen threshold.
  5. We estimate the cost of promoting each keyword and group the keywords into lists by competition level.

This concludes our small practice session in selecting keywords for the core. As you can see, compiling a semantic core is laborious but not difficult. The main thing is to develop a clear plan, write your own manual, and stick to it.

The semantic core is the set of keywords, queries, and search phrases for which you need to promote your site so that targeted visitors come to it from search engines. Compiling the semantic core is the second step after setting up your site on hosting. How much quality traffic your site gets depends on a well-designed core.

The need for a semantic core comes down to several points.

First, it makes it possible to develop a more effective search promotion strategy: a webmaster who compiles the semantic core for his project will have a clear idea of which promotion methods to apply to the site, and can estimate the cost of promotion, which depends strongly on how competitive the key phrases are in the search results.

Secondly, the semantic core makes it possible to fill the resource with better quality content, that is, content that will be well optimized for key queries. For example, if you want to make a site about stretch ceilings, then the semantic core of the queries should be selected "starting from" this keyword.

In addition, compiling the semantic core involves determining how often search engine users enter each keyword, which makes it possible to decide in which search engine a particular query deserves more promotion attention.

It is also worth noting some rules that must be observed to compile a high-quality and effective semantic core.

The semantic core should include all possible key queries by which the site can be promoted, except those that cannot bring even a small amount of traffic to the resource. Accordingly, it should contain high-frequency (HF), mid-frequency (MF), and low-frequency (LF) queries.

Conventionally, these queries can be broken down as follows: LF — up to 1,000 requests per month, MF — 1,000 to 10,000, HF — more than 10,000. Keep in mind that these numbers differ significantly between topics (in some particularly narrow topics the maximum query frequency does not exceed 100 requests per month).

Low-frequency queries simply must be included in the semantic core, because any young resource can advance in search engines primarily through them, even without external promotion: you write an article for a low-frequency query, the search engine indexes it, and the site may reach the TOP within a few weeks.

It is also necessary to follow the rule of structuring the semantic core: all the site's keywords must be combined into groups not only by frequency but also by degree of similarity. This makes it possible to optimize content better — for example, to optimize a single text for several queries.
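A very naive version of similarity grouping buckets queries by a shared word. Real grouping would also consider frequency and meaning, as the text says; the queries below are illustrative.

```python
from collections import defaultdict

# Naive similarity grouping: bucket queries by their first word.
# Illustrative queries based on the "stretch ceilings" example above.
queries = [
    "stretch ceilings",
    "stretch ceilings price",
    "stretch ceilings Moscow",
    "ceiling lighting",
]

groups = defaultdict(list)
for q in queries:
    groups[q.split()[0]].append(q)

print({k: len(v) for k, v in groups.items()})  # {'stretch': 3, 'ceiling': 1}
```

Each resulting group is a candidate for one optimized text covering several related queries.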

It is also advisable to include such types of queries as synonymous queries, slang queries, abbreviations and so on in the structure of the semantic core of many sites.

Here are some good examples of how you can increase website traffic simply by writing optimized articles that target mid-frequency and a large number of low-frequency queries:

1. A site for men. Although it is updated very rarely (just over 25 articles published in total), thanks to well-chosen queries it keeps gaining traffic. No links were purchased for the site at all!

At the moment, the growth in attendance has stopped, since only 3 articles were published.

2. Site for women. At first, non-optimized articles were published on it. In the summer, a semantic core was compiled, which consisted of queries with very low competition (I will describe how to collect such queries below). For these requests, relevant articles were written and several links were acquired on thematic resources. You can see the results of this promotion below:

3. A medical site. A medical niche with more or less reasonable competition was chosen, and more than 200 articles were published for attractive queries with good bids (online, "bid" denotes the cost of one user click on an advertising link). I started buying links in February, when the site was 1.5 months old. So far most of the traffic comes from Google; Yandex has not yet counted the purchased links.

As you can see, the selection of the semantic core plays an important role in website promotion. Even with a small or no budget, you can create a traffic site that will bring you profit.

Selection of search queries

You can select queries for the semantic core in the following ways:

Using the free search-query statistics services of Google, Yandex, or Rambler.

Using special software or an online service (free or paid).

By analyzing competitors' sites.

Selection of search queries using search query statistics services Google, Yandex or Rambler

I want to warn you right away that these services do not show very accurate information about the number of search queries per month.

For example, if a user looks through 4 pages of search results for the query "plastic windows", the statistics service will count not 1 but 4 impressions, since every results page the user views counts as an impression, not just the first one. In practice, therefore, the real numbers will be somewhat lower than the services show.

To determine the exact number of clicks, the best way is, of course, to look at the traffic statistics of competitors in the TOP-10 (their liveinternet or mail.ru counters, if open). That way you can understand how much traffic the query you are interested in will actually bring.

You can also roughly calculate how many visitors a given query will bring depending on the position your site takes in the results. Here is how your site's CTR (click-through rate) changes across positions in the TOP:

For example, consider the query "!renovate !apartments" for the region "Moscow and region":

In this case, if your site occupies 5th position in the results, the query will bring about 75 visitors per month (6% CTR); 4th place (8%) — about 100 visitors/month; 3rd place (11%) — 135 visitors/month; 2nd place (18%) — 223 visitors/month; and 1st position (28%) — 350 visitors per month.
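The arithmetic behind those figures is just frequency times CTR. The monthly frequency of about 1,250 used below is inferred from the quoted numbers (28% of 1,250 is the quoted 350 visits for 1st place); the text's 223 and 135 suggest it rounds slightly differently for some positions.

```python
# Estimated monthly visits by position, using the CTR figures quoted
# in the text. The base frequency of ~1,250/month is an inference from
# the article's own numbers, not a stated value.
CTR_BY_POSITION = {1: 0.28, 2: 0.18, 3: 0.11, 4: 0.08, 5: 0.06}

def estimated_visits(monthly_freq: int, position: int) -> int:
    return round(monthly_freq * CTR_BY_POSITION[position])

for pos in sorted(CTR_BY_POSITION):
    print(pos, estimated_visits(1250, pos))
```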

CTR can also be improved with an eye-catching snippet, increasing traffic for the query. You can read elsewhere what a snippet is and how to improve it.

Google search statistics

Previously, I used Google's query statistics more often, since I promoted primarily in that search engine. Back then it was enough to optimize an article and buy as many links to it as possible from pages with PR (it is very important that these links look natural and that visitors actually follow them) — and voila, you are in the TOP!

Now the situation in Google is such that promoting there is not much easier than in Yandex. You therefore have to pay much more attention both to writing and designing (!) articles and to buying links for the site.

I would also like to draw your attention to the following fact:

Yandex is the most popular in Russia (Yandex — 57.7%, Google — 30.6%, Mail.ru — 8.9%, Rambler — 1.5%, Nigma/Ask.com — 0.4%), so if you are promoting in this country, you should focus on Yandex first.

In Ukraine, the situation looks different: Google - 70-80%, Yandex - 20-25%. Therefore, Ukrainian webmasters should focus on promotion in Google.

To use Google search statistics, go to

Let's look at an example of keyword research for a culinary website.

First of all, you need to enter the main query, on the basis of which keyword options for the future semantic core of the site will be selected. I entered the query "how to cook".

The next step is to choose the match type. There are 3 match types in total: broad, phrase, and exact. I advise you to choose exact match, as this option shows the most accurate information for a query. Here is why.

Broad match means that impression statistics will be shown for all queries containing the words of the given query. For example, for the query "plastic windows", statistics will cover everything that includes the word "plastic" or the word "windows" (plastic, window, buy windows, buy blinds for windows, pvc windows prices). In short, there will be a lot of "garbage".

Phrase match means that statistics will be shown for queries containing the words in the exact order in which they are listed; other words may appear alongside the specified phrase. For the query "plastic windows", queries such as "inexpensive plastic windows", "plastic windows Moscow", "how much are plastic windows", etc. will be counted.
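To make the difference between the three match types concrete, here is a simplified sketch in Python. It is a toy illustration only: real engines also match word forms and morphology, which this version ignores.

```python
def matches(keyword: str, query: str, mode: str) -> bool:
    """Toy illustration of broad / phrase / exact matching."""
    kw, q = keyword.lower().split(), query.lower().split()
    if mode == "broad":    # all keyword words present, any order, any position
        return all(word in q for word in kw)
    if mode == "phrase":   # keyword as a contiguous phrase, extra words allowed
        return any(q[i:i + len(kw)] == kw for i in range(len(q) - len(kw) + 1))
    if mode == "exact":    # the query is exactly the keyword, nothing else
        return q == kw
    raise ValueError(f"unknown match type: {mode}")

print(matches("plastic windows", "inexpensive plastic windows", "phrase"))  # True
print(matches("plastic windows", "windows plastic cheap", "phrase"))        # False
print(matches("plastic windows", "windows plastic cheap", "broad"))         # True
```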

We are most interested in the metrics "Number of requests per month (target regions)" and "Estimated CPC" (if we are going to place AdSense ads on the site's pages).

Yandex search query statistics

I use Yandex query statistics almost every day, as they are presented in a more convenient form than their Google counterpart.

The only drawback of Wordstat is that it has no match types, no way to save the selected queries to your computer, no cost-per-click data, etc.

To obtain more accurate results, you need to use special operators that refine the queries we are interested in. A list of operators can be found here.

If you just enter the query "how to cook" in Wordstat, we get the following statistics:

This is the same as if we chose “broad match” in Adwords.

If you enter the same query, but already in quotes, we get more accurate statistics (analogous to phrase matching in Adwords):

Well, to get statistics only for the given query itself, you need to use the "!" operator: "!how !to !cook"

To get even more accurate results, you need to specify the region for which the site is being promoted:

Also in the top panel of Wordstat there are tools with which you can view statistics for a given query by region, by month and by week. By the way, using the latter, it is very convenient to analyze the statistics of seasonal queries.

For example, by analyzing the query "how to cook", you can find out that it is most popular in December (not surprising - everyone is preparing for the New Year):

Rambler search statistics

I want to warn you right away that Rambler's query statistics lose relevance year after year (primarily because of this search engine's low popularity). So you probably won't even have to work with it.

You do not need to enter any operators in Adstat - it immediately shows the frequency of the query in the form in which it was entered. It also shows separate statistics for the query's frequency on the first page of search results and across all results pages, including the first.

Selection of search queries using special software or online services

Rookee can not only promote your queries - it can also help you create the semantic core of your site.

With Rookee, you can easily select a semantic core for your site, roughly predict the number of visits for the selected queries, and estimate the cost of promoting them to the top.

Selection of queries using the free Slovoeb program

If you are going to compile the semantic core at a more professional level, or you need query statistics from Google AdWords, Rambler Adstat, the VKontakte social network, various link aggregators, etc., I advise you to purchase Key Collector right away.

If you want to compile a large semantic core but do not want to spend money on paid programs, the best option is the Slovoeb program (read about the program here). It is the "younger brother" of Key Collector and lets you collect a semantic core based on Yandex.Wordstat query statistics.

Installing the SlovoEB program.

Download the archive with the software.

Make sure the archive is unlocked. To do this, open the file properties (select "Properties" in the context menu) and click the "Unblock" button, if there is one.

Unzip the contents of the archive.

Run the executable file Slovoeb.exe

We create a new project:

Select the required region (the button Regions Yandex.Wordstat):

We save the changes.

Click on the button "Left column Yandex.Wordstat"

If necessary, set "stop words" (words that must not appear in our semantic core). Stop words might be: "free" (if you sell something on your site), "forum", "wikipedia" (if you have an informational site with no forum of its own), "porn", "sex" (everything is clear here), etc.
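As an illustration, filtering a query list against such a stop-word list could look like this (the stop words and queries here are made-up examples):

```python
STOP_WORDS = {"free", "forum", "wikipedia"}

def drop_stop_word_queries(queries, stop_words=STOP_WORDS):
    """Remove queries containing any stop word as a whole word."""
    return [q for q in queries if not set(q.lower().split()) & stop_words]

queries = [
    "stretch ceilings moscow",
    "stretch ceilings free",
    "stretch ceilings forum",
    "stretch ceilings installation",
]
print(drop_stop_word_queries(queries))
# ['stretch ceilings moscow', 'stretch ceilings installation']
```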

Now you need to set the initial list of words on the basis of which the semantic core will be compiled. Let's build the core for a company that installs stretch ceilings in Moscow.

When compiling any semantic core, you first need to classify the topic being analyzed.

In our case, stretch ceilings can be classified according to the following criteria (I make such convenient mind maps in the MindJet MindManager program):

Helpful advice: for some reason, many webmasters forget to include the names of small settlements in the semantic core.

In our case, we could include in the semantic core the names of the Moscow districts and Moscow-region cities that interest us. Even if these keywords get very few requests per month ("Golitsyno stretch ceilings", "Aprelevka stretch ceilings", etc.), you should still write at least one short article for each, with the required key in its title. You won't even have to promote such articles, because competition for these queries is usually very low.

10-20 such articles, and your site will consistently bring several additional orders from these cities.

Press the button "Yandex.Wordstat left column" and enter the required queries:

Click on the "Parse" button. As a result, we get the following list of requests:

We filter out all unnecessary requests that do not fit the site's topic (for example, "do-it-yourself stretch ceilings" - although this request will bring some traffic, it will not bring us customers who will order the installation of ceilings). We select these requests and delete them so that we do not waste time analyzing them in the future.

Now we need to refine the frequency for each of the keys. Click on the "Collect '!' frequencies" button:

Now we have the following information: the request, its general and exact frequency.

Now, based on the frequencies obtained, you need to review all the queries and delete unnecessary ones.

Unnecessary queries are queries that have:

The exact ("!") frequency is very low. In my chosen topic you need to value every visitor, so I will filter out queries with a monthly frequency below 10. If this were not a construction topic but some general one, you could safely filter out queries with a frequency below 50-100 per month.

The ratio of broad to exact frequency is very high. For example, the query "buy stretch ceilings" (1335/5) can be deleted immediately, because it is a "dummy" query.

Queries with very high competition also need to be removed - it will be difficult to rank for them (especially if you have a young site and a small promotion budget). Such a query, in our case, is "stretch ceilings". Besides, queries of 3-4 or more words are usually more effective - they bring more targeted visitors.
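The three filters above can be sketched in code. The thresholds (minimum exact frequency of 10, the broad-to-exact ratio, the minimum word count) are this article's rough rules of thumb, not fixed standards:

```python
def filter_queries(rows, min_exact=10, max_ratio=100, min_words=3):
    """rows: (phrase, broad_freq, exact_freq) tuples, as collected above."""
    kept = []
    for phrase, broad, exact in rows:
        if exact < min_exact:                  # too little real demand
            continue
        if broad / exact > max_ratio:          # "dummy" query
            continue
        if len(phrase.split()) < min_words:    # too short / too competitive
            continue
        kept.append(phrase)
    return kept

rows = [
    ("buy stretch ceilings", 1335, 5),           # dummy: exact frequency only 5
    ("stretch ceilings", 90000, 4000),           # too short and competitive
    ("stretch ceilings price moscow", 800, 120), # keeper
]
print(filter_queries(rows))  # ['stretch ceilings price moscow']
```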

In addition to Slovoeb, there is another excellent program for convenient automatic collection, analysis and processing of Yandex.Direct keyword impression statistics.

News queries

In this case, the semantic core is not created the same way as for a regular content project. For a news site, you first need to select the sections under which news will be published:

After that, you need to select queries for each of the sections. News queries can be expected and unexpected.

For the first type of query, the news event is predictable. Most often, you can even pinpoint the date when a particular query will surge in popularity. It can be any holiday (New Year, May 9, March 8, February 23, Independence Day, Constitution Day, church holidays) or event (music events, concerts, film premieres, sports competitions, presidential elections).

Prepare such requests in advance and determine the approximate volume of traffic in this case using

Also, do not forget to look at the statistics of traffic to your site (if you have already reviewed some event in your time) and competitors' sites (if their statistics are open).

The second type of queries is less predictable. These include breaking news: cataclysms, catastrophes, some events in the families of famous people (birth / wedding / death), the release of an unannounced software product, etc. In this case, you just need to be one of the first to publish this news.

To do this, it is not enough just to monitor news in Google and Yandex - in that case your site will merely be one of those that reprinted the event. A more effective method, which can hit a big jackpot, is to follow foreign sites. By publishing such news among the first on the Runet, you will receive, in addition to the tons of traffic that may well bring down your hosting's servers, a huge number of backlinks to your site.

Cinema queries can also be categorized as expected queries. Indeed, in this case, the date of the premiere of the film is known in advance, the script of the film and its actors are approximately known. Therefore, you need to prepare in advance the page on which the film will appear, add its trailer there for a while. You can also publish on the site news about the film and its actors. This tactic will help you take TOP positions in search engines in advance and will bring visitors to the site even before its premiere.

Do you want to know which queries are currently trending, or predict the future relevance of your topic? Then use services that provide trend information. Such a service lets you perform many kinds of analysis: comparing search trends for several queries, analyzing a query's geographic regions, viewing the hottest trends of the moment, viewing related current queries, exporting results to CSV, subscribing to an RSS feed of hot trends, etc.

How to speed up the collection of the semantic core?

I think everyone who has compiled a semantic core has had the thought: "How long and tedious this parsing is - I'm sick of sorting and grouping thousands of these queries!". That is normal. I feel the same way sometimes, especially when I have to parse and sift a semantic core of several tens of thousands of queries.

Before starting parsing, I strongly advise you to split all queries into groups. For example, if your site is about building a house, break it down into foundations, walls, windows, doors, roofs, wiring, heating, etc. It will simply be much easier later to sort and group queries when they sit in small groups linked by a narrow topic. If you just parse everything in one pile, you will get an unrealistically huge list that will take more than one day to process. By working through the semantic core in small steps, you will not only process all the queries more efficiently, but you will also be able to order articles from copywriters for the keys you have already collected.

The process of collecting the semantic core almost always starts with automatic query parsing (I use Key Collector for this). You can also collect queries manually, but when working with a large number of queries I see no reason to waste precious time on this routine work.

If you work with other programs, then most likely the function of working with proxy servers will be available in them. This allows you to speed up the parsing process and protect your IP from the search engine ban. To be honest, it's not very pleasant when you need to urgently fulfill an order, and your IP is banned for a day due to frequent calls to the Google / Yandex statistics service. This is where paid proxy servers come to the rescue.

Personally, I do not use them at the moment for one simple reason - they constantly get banned, it is not easy to find high-quality working proxies, and I have no desire to keep paying for them. So I found an alternative way of collecting the semantic core that sped up the process several times over.

Otherwise, you will have to look for other sites for analysis.

In most cases, sites close these statistics, so you can use the entry point statistics.

As a result, we get statistics for the site's most visited pages. We open them, write out the main queries they were optimized for, and collect the semantic core on that basis.

By the way, if the site has statistics “By search phrases”, you can make your work easier and collect requests using the Key Collector (you just need to enter the site address and click on the “Get data” button):

The second way to analyze a competitor's site is by examining its content.

Some resources have a "Most Popular Articles" widget (or something like that). Sometimes the most popular articles are selected based on the number of comments, sometimes based on the number of views.

In any case, having a list of the most popular articles in front of your eyes, you can figure out what requests this or that article was written for.

The third way is to use tools. One of them was created to analyze site trust and, to be honest, estimates trust rather poorly. But what it does well is analyze the queries of competitors' sites.

Enter the address of any site (even with closed statistics!) And click the trust check button. At the very bottom of the page, the site visibility statistics by keywords will be displayed:

Website visibility in search engines (by keywords)

The only drawback is that these queries cannot be exported, you have to copy everything manually.

The fourth way is to use special services.

With their help, you can determine all the queries for which a site ranks in the TOP-100 of both search engines. Positions and queries can be exported to xls format, but I could not open that file on any of my computers.

Well, the last way to find out competitors' keywords is by using

Let's analyze the query "log house" as an example. In the "Competitor Analysis" tab, enter this request (if you want to analyze a specific site, you just need to enter its url in this field).

As a result, we get information about the frequency of the request, the number of advertisers in each of the search engines (and if there are advertisers, then there is money in this niche) and the average cost of a click in Google and Yandex:

You can also view all ads in Yandex.Direct and Google Adwords:

And this is what the TOP of each search engine looks like. You can click on any of the domains and see all the queries for which it is in the TOP, and all of its contextual ads:

There is another way to analyze competitors' requests, which few people know about - using the Ukrainian service

I would call it the "Ukrainian version of SpyWords", since they are very similar in functionality. The only difference is that its database contains the key phrases Ukrainian users search for. So if you work in the UA-net, this service will be very useful to you!

Analyzing query competition

So, the queries are collected. Now you need to analyze the competition for each of them to understand how difficult it will be to promote a given keyword.

Personally, when I create a new site, I first try to write articles for queries with low competition. This lets you bring the site to quite good traffic quickly and with minimal investment. Even in topics like construction and medicine you can reach 500-1000 visitors within 2.5 months, and women's topics are easier still.

Let's see how you can analyze the competition in a manual and automatic way.

Manual way of analyzing competition

Enter the desired keyword in the search and look at the TOP-10 (and, if necessary, the TOP-20) sites in the results.

The most basic parameters to look at are:

The number of main pages in the TOP (if you are promoting an internal page of your site while competitors mostly rank with their main pages, you will most likely not be able to overtake them);

The number of exact occurrences of the keyword in page Titles.

For queries such as "website promotion", "how to lose weight", "how to build a house", competition is unrealistically high (the TOP is full of main pages with the exact keyword in the Title), so you should not target them. But if you target the query "how to build a house from foam blocks with your own hands with a basement", you will have a much better chance of reaching the TOP. So let me stress once again: target queries of 3 or more words.

Also, if you are analyzing Yandex results, pay attention to the sites' TCI (the higher it is, the harder it will be to overtake them, since a high TCI usually indicates a large link mass), whether they are in the Yandex Catalog (such sites have more trust), and their age (search engines favor older sites).

If you are analyzing TOP sites in Google, look at the same parameters I described above, except that instead of the Yandex Catalog there is the DMOZ directory, and instead of TCI there is PR (if the TOP pages have PR from 3 to 10, it won't be easy to overtake them).

I advise you to analyze sites using a plugin that shows all the information about a competitor's site:

Automatic way of analyzing query competition

If there are many queries, you can use programs that will do all the work hundreds of times faster. In this case, you can use Slovoeb or Key Collector.

Previously, when analyzing competitors, I used the KEI (competition index). This function is available in both Key Collector and Slovoeb.

In Slovoeb, the KEI indicator simply shows the total number of sites in the search results for a particular query.

In this respect Key Collector has the advantage, since it lets you set your own formula for calculating the KEI parameter. To do this, go to Program Settings - KEI:

In the field "Formula for calculating KEI 1" insert:

(KEI_YandexMainPagesCount * KEI_YandexMainPagesCount * KEI_YandexMainPagesCount) + (KEI_YandexTitlesCount * KEI_YandexTitlesCount * KEI_YandexTitlesCount)

In the field "Formula for calculating KEI 2" insert:

(KEI_GoogleMainPagesCount * KEI_GoogleMainPagesCount * KEI_GoogleMainPagesCount) + (KEI_GoogleTitlesCount * KEI_GoogleTitlesCount * KEI_GoogleTitlesCount)

This formula takes into account the number of main pages in the search results for a given keyword and the number of pages in the TOP-10 in which this keyword is included in the page title. Thus, you can get more or less objective competition data for each request.

In this case, the lower a query's KEI, the better. The best keywords will have KEI = 0 (provided they have at least some traffic, of course).
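In plain terms, each of the two formulas above is simply the cube of the number of main pages in the TOP-10 plus the cube of the number of TOP-10 titles containing the keyword. A minimal sketch:

```python
def kei(main_pages_in_top10: int, title_matches_in_top10: int) -> int:
    """KEI per the formulas above: lower is better, 0 is uncontested."""
    return main_pages_in_top10 ** 3 + title_matches_in_top10 ** 3

print(kei(0, 0))   # 0    - an ideal, uncontested query
print(kei(2, 5))   # 133  - moderate competition
print(kei(8, 10))  # 1512 - very hard to compete for
```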

Click on the data collection buttons for Yandex and Google:

Then click on this button:

The KEI 1 and KEI 2 columns will display data on KEI queries for Yandex and Google
respectively. Let's sort the queries in ascending order of the KEI 1 column:

As you can see, among the selected queries there are some that shouldn't be a problem for promotion. In less competitive topics, in order to bring such a query to the TOP-10, you just need to write a good optimized article. And you won't need to buy external links to promote it!

As I said above, I have used KEI before. Now, to assess the competition, I just need to get the number of main pages in the TOP and the number of occurrences of the keyword in the Title of the page. Key Collector has a similar function:

After that, I sort queries by the "Number of main pages in Yandex" column and make sure that there are no more than 2 main pages in the TOP for this parameter, with as few Title occurrences as possible. After I have compiled the semantic core for all these queries, I relax the filter parameters. Thus, the first articles on the site are published for low-competition queries, and the last ones for medium- and high-competition queries.

After all the most interesting queries have been collected and grouped (I will discuss grouping below), click the "Export data" button and save them to a file. I usually include the following parameters in the export: the exact ("!") Yandex frequency for the given region, the number of main pages in the results, and the number of keyword occurrences in Titles.

Tip: Key Collector sometimes shows the number of query occurrences in Titles slightly incorrectly. Therefore, it is advisable to additionally enter these queries in Yandex and check the results manually.

You can also evaluate the competitiveness of search queries using the free

Grouping requests

After all the queries have been selected, it is time for the rather boring and monotonous work of grouping them. You need to pick out similar queries that can be combined into one small group and promoted on a single page.

For example, suppose the resulting list contains queries like "how to learn to do push-ups", "how to learn to do push-ups at home", "learn to do push-ups for a girl". Such queries can be combined into one group, and one large optimized article can be written for it.

To speed up the grouping process, parse keywords in parts (for example, if you have a fitness site, split the semantic core during parsing into groups covering the neck, arms, back, chest, abs, legs, etc.). This will make your work much easier!
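A minimal sketch of such pre-grouping, assuming each query is simply assigned to the first topic word it contains (the topic words and queries here are illustrative):

```python
from collections import defaultdict

def group_queries(queries, topic_words):
    """Put each query into the group of the first topic word it contains;
    anything unmatched goes to 'other' for manual review."""
    groups = defaultdict(list)
    for query in queries:
        for topic in topic_words:
            if topic in query.lower():
                groups[topic].append(query)
                break
        else:
            groups["other"].append(query)
    return dict(groups)

queries = [
    "how to learn to do push-ups",
    "push-ups at home",
    "exercises for the back",
    "best protein shakes",
]
print(group_queries(queries, ["push-up", "back", "abs"]))
# {'push-up': [... 2 queries ...], 'back': [...], 'other': [...]}
```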

If the resulting groups contain a small number of queries, you can stop there. But if you still end up with lists of several dozen or even hundreds of queries, you can try the following methods.

Working with stop words

In Key Collector it is possible to specify stop words that can be used to mark unwanted keywords in the resulting query table. Such queries are usually removed from the semantic core.

In addition to removing unwanted queries, this function can be used to search for all the necessary word forms for a given key.

We indicate the required key:

All word forms of the specified key are highlighted in the table with queries:

We transfer them to a new tab and already there we manually work with all requests.

Filtering for the "Phrase" field

You can find all word forms of a given keyword using the filtering settings for the "Phrase" field.

Let's try to find all the queries that include the word "bars":

As a result, we get:

Group Analysis Tool

This is the most convenient tool for grouping phrases and further manually filtering them.

Go to the "Data" - "Group Analysis" tab:

And this is what is displayed if you open any of these groups:

By marking any group of phrases in this window, you simultaneously mark or uncheck a phrase in the main table with all queries.

In any case, you cannot do without manual work. But thanks to Key Collector, a (not small!) part of the work has already been done, which at least somewhat eases the process of compiling the site's semantic core.

After you manually process all the requests, you should get something like this:

How to find a low-competitive profitable niche?

First of all, you must decide how you will earn with your website. Many novice webmasters make the same silly mistake: first they create a site on a topic they like (for example, after buying an iPhone or iPad, everyone immediately runs off to make yet another "apple" site), and only then realize that competition in the niche is very high and it will be almost impossible for their mediocre little site to break into the first positions. Or they create a culinary site because they like to cook, and then realize with horror that they don't know how to monetize that traffic.

Before creating any site, immediately decide how you will monetize the project. If you have an entertainment theme, teaser ads are most likely suitable for you. For commercial niches (in which something is sold) contextual and banner ads are perfect.

I also want to dissuade you right away from creating general-topic sites. Sites that write "everything about everything" are no longer easy to promote, because the TOPs have long been occupied by well-known, trusted portals. It is more profitable to make a narrow-topic site that will outdo all competitors in a short time and hold its TOP position for a long while. I am telling you this from personal experience.

My narrow-topic medical site, just 4 months after its creation, took first places in Google and Yandex, overtaking the largest general-topic medical portals.

Narrow-topic sites are EASIER and FASTER to promote!

Let's move on to choosing a commercial niche for a site where we will earn on contextual advertising. At this stage, you need to adhere to 3 criteria:

When assessing the competition in a niche, I use search for the Moscow region, where competition is usually higher. To do this, set "Moscow" in the region settings of your Yandex account:

The exceptions are cases when the site is made for a specific region - then you need to look at competitors for this particular region.

The main signs of low competition in search results are as follows:

A lack of full-fledged answers to the query (irrelevant results).

No more than 2-3 main pages in the TOP. A large number of main pages means that entire sites are being deliberately promoted for this query; in that case it will be much harder to promote an internal page of your site.

Inaccurate phrases in snippets indicate that the required information is simply absent from competitors' sites, or that their pages are not optimized for this query at all.

A small number of large thematic portals. If, for many queries from your future semantic core, the TOP is full of large portals and narrow-topic sites, you must understand that the niche has already formed and competition there is very high.

More than 10,000 requests per month in base frequency. This criterion means you should not "narrow down" the site's topic too much. The niche must have enough traffic to earn from contextual ads. Therefore, the main query of the chosen topic should have at least 10,000 requests per month in Wordstat (without quotes and with the region taken into account!). Otherwise, you will need to broaden the topic a little.

You can also roughly estimate the amount of traffic in a niche using statistics of traffic to sites that occupy the first positions in the TOP. If this statistics is closed for them, then use

Most seasoned MFA site makers look for niches not this way, but with the help of a service or the Micro Niche Finder program.

The latter, by the way, shows the SOC parameter (0-100 - the query can reach the TOP on internal optimization alone; 100-2000 - average competition). You can safely select queries with an SOC below 300.

The screenshot below shows not only query frequencies but also the average cost per click and the site's positions for the queries:

There is also a useful feature called "Potential ad buyers":

You can also just enter the query you are interested in and analyze it and similar queries:

As you can see, the whole picture is in front of you, you just have to check the competition for each of the requests and choose the most interesting of them.

Estimating the cost of bids for YAN sites

Now let's look at how to analyze the cost of bids for YAN sites.

We proceed by entering the queries we are interested in:

As a result, we get:

Attention! Look at the guaranteed-impressions bid! The actual approximate cost per click will be about 4 times lower than what the service shows.

It turns out that if you bring a page about sinusitis treatment to the TOP and visitors click the ads, the cost of 1 click on a YAN ad will be about 12 rubles (1.55 / 4 = $0.39).
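This rule of thumb (actual CPC is roughly the guaranteed-impressions bid divided by 4) is easy to wrap in a helper. The exchange rate below is the one implied by the article's own numbers and is purely illustrative:

```python
def estimated_cpc(guaranteed_bid_usd: float, usd_to_rub: float = 31.0):
    """The article's rule of thumb: real click cost ~= guaranteed bid / 4.
    Returns (USD per click, rubles per click)."""
    cpc_usd = guaranteed_bid_usd / 4
    return round(cpc_usd, 2), round(cpc_usd * usd_to_rub)

print(estimated_cpc(1.55))  # (0.39, 12) - about $0.39, or ~12 rubles per click
```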

If you select the "Moscow and region" region, the bids will be even higher.

That is why it is so important to take first places in the TOP in this particular region.

Please note that when analyzing queries for content projects you should not count commercial queries (for example, "buy a table"). Analyze and promote informational queries instead ("how to make a table with your own hands", "how to choose a table for the office", "where to buy a table", etc.).

Where to look for niche search queries?

1. Start brainstorming, write down all your interests.

2. Write down what inspires you in life (it can be some kind of dream, or a relationship, lifestyle, etc.).

3. Look around and write down all the things that surround you (write in a notebook literally everything: pen, light bulb, wallpaper, sofa, armchair, pictures).

4. What do you do for a living? Do you operate a machine in a factory or work as an ENT doctor in a hospital? Write down all the things you use at work in a notebook.

5. What are you good at? Write that down too.

6. Look through ads in newspapers, magazines, ads on websites, spam in the mail. Perhaps they offer something interesting that you could make a website about.

Find sites there with traffic of 1000-3000 people per day. Look at their topics and write down the most frequent queries that bring them traffic. Perhaps there is something interesting there.

Let me walk through the selection of one narrow-topic niche.

So, I became interested in the disease known as "heel spur". This is a very narrow microniche for which you can create a small site of about 30 pages. Ideally, I advise you to look for broader niches that could support a site of 100 or more pages.

We open and check the frequency of this request (we will not specify anything in the region settings!):

Excellent! The frequency of the main query is over 10,000. There is traffic in this niche.

Now I want to check the competition for the queries the site will be promoted for ("heel spur treatment", "treatment of a heel spur", "how to treat a heel spur").

I switch to Yandex, set the "Moscow" region in the settings, and this is what I get:

The TOP contains many internal pages of general-topic medical sites, which will not be hard for us to overtake (though not earlier than in 3-4 months), and the main page of 1 site that is tailored to heel spur treatment.

The wnopa site is a 10-page low-quality site (which, however, has PR = 3) with traffic of about 500 visitors. A site like that can also be overtaken if you make a better site and promote its home page for this query.

Having analyzed the TOP for each query this way, we can conclude that this microniche is not yet occupied.

Now the last stage is left - checking bids.

We go to the service and enter our queries there (you do not need to choose a region, since our site will be visited by Russian-speaking people from all over the world).

Now I take the arithmetic mean of all the guaranteed-impressions bids. That comes to $0.812. Dividing this figure by 4 gives roughly $0.2 (6 rubles) per click. For a medical topic this is a normal figure, but if you wish, you can find topics with better bids.
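As a rough sketch, the arithmetic above looks like this in Python (the bid values here are hypothetical placeholders; the real ones come from the keyword service for your queries):

```python
# Hypothetical guaranteed-impression bids in $, one per query
bids = [0.95, 0.80, 0.72, 0.78]

avg_bid = sum(bids) / len(bids)   # arithmetic mean of the bids
estimated_cpc = avg_bid / 4       # the divide-by-4 rule of thumb used above

print(f"average bid: ${avg_bid:.3f}, estimated CPC: ${estimated_cpc:.2f}")
# → average bid: $0.812, estimated CPC: $0.20
```

With these placeholder numbers the result lands on the same ~$0.2 per click as in the example.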

Should you choose a similar niche for your website? It's up to you to decide. I would look for broader niches that have more traffic and better cost per click.

If you really want to find a good topic, I highly recommend you take the next step!

Make a table with the following columns (the data I show is approximate):

You should write down at least 25 queries a day this way! Spend 2 to 5 days looking for a niche, and in the end you will have plenty to choose from!

Choosing a niche is one of the most important steps in creating a website; treat it carelessly, and consider that you will simply waste your time and money later.

How to write the right optimized article

When I write an article optimized for any request, I first of all think about making it useful and interesting for site visitors. Plus, I try to arrange the text so that it is pleasant to read. In my understanding, ideally designed text should contain:

1. A bright, effective headline. The title is what first attracts the user when he views the search results. Note that the headline must be fully relevant to what is written in the text!

2. No "water" in the text. Few people like it when something that could be said in two words is spread over several pages of text.

3. Simple paragraphs and sentences. To make the text easier to read, break large sentences into short ones that fit on one line, and simplify complex constructions. It is no accident that newspapers and magazines publish text in short sentences and narrow columns: the reader perceives such text better, and his eyes do not tire as quickly.

The paragraphs in the text should also be small. Best of all, text will be perceived in which there is an alternation of small and medium paragraphs. A paragraph of 4-5 lines is ideal.

4. Use subheadings in the text (h2, h3, h4). A person needs 2 seconds to "scan" an open page and decide whether to read it or not. During this time, he scans the page with his eyes and highlights the most important elements for himself that help him understand what the text is about. Subheadings are exactly the tool with which you can catch the visitor's attention.

To make the text structured, easier to read, use one subheading for every 2-3 paragraphs.

Please note that these tags must be used in order (h2 to h4). For instance:

You don't need to do this:

<h3>Subtitle</h3>

<h3>Subtitle</h3>

<h2>Subtitle</h2>

And this is how you can:

<h2>Subtitle</h2>

<h3>Subtitle</h3>

<h4>Subtitle</h4>

You can also do the following:

<h2>How to pump up the press</h2>

<h3>How to build abs at home</h3>

<h3>How to pump up the press on the horizontal bar</h3>

<h2>How to build biceps</h2>

<h3>How to build biceps without exercise machines</h3>
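The ordering rule can also be checked mechanically. A minimal sketch (the function name and regular expression are mine, not part of any tool mentioned here):

```python
import re

def heading_order_ok(html: str) -> bool:
    """Check that subheadings go in order: the first one is h2, and no
    level is skipped going down (h2 -> h3 -> h4, never h2 straight to h4,
    and never an h3 before any h2)."""
    levels = [int(m) for m in re.findall(r"<h([2-4])[^>]*>", html, re.I)]
    if not levels:
        return True
    if levels[0] != 2:
        return False          # the first subheading must be h2
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:    # e.g. jumping from h2 straight to h4
            return False
    return True

print(heading_order_ok("<h2>A</h2><h3>B</h3><h4>C</h4>"))  # True
print(heading_order_ok("<h3>A</h3><h3>B</h3><h2>C</h2>"))  # False
```

Going back up a level (h3 followed by a new h2 section) is allowed, as in the abs/biceps example above.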

In the texts, keywords should be used to a minimum (no more than 2-3 times, depending on the size of the article). The text should be as natural as possible!

Forget about bolding and italicizing keywords. This is a clear sign of over-optimization, and Yandex dislikes it very much. Highlight only important sentences, thoughts, and ideas this way (and try to keep keywords out of them).

5. Use bulleted and numbered lists, blockquotes (not to be confused with quotations and aphorisms of famous people! Put only important thoughts, ideas, and advice in them), and pictures (with the alt and title attributes filled in and a caption) in all your texts. Insert interesting thematic videos into articles; this has a very good effect on how long visitors stay on the site.

6. If the article is very long, it makes sense to add a table of contents with anchor links for navigating the text.

After you have made all the necessary changes, read the text through again. Is it hard to read? Maybe some sentences should be made even shorter? Or should the font be changed to a more readable one?

Only when you are satisfied with all the changes should you fill in the title, description and keywords and send the article for publication.

Title requirements

Title is the heading in the snippet that appears in the search results. The title and the heading in the article (the one wrapped in the h1 tag) must differ from each other, and at the same time both must contain the main keyword (or several keys).

Description requirements

The description should be bright and interesting. In it, you need to briefly convey the meaning of the article and do not forget to mention the necessary keywords.

Requirements for Keywords

List the promoted keywords here separated by spaces, no more than 10 of them.
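These three requirements can be sanity-checked with a small helper. A sketch, assuming exactly the rules stated above (the function name and message strings are mine):

```python
def check_meta(title: str, h1: str, keywords: str, main_key: str) -> list[str]:
    """Check the meta requirements above: the title and the article heading
    must differ, both must contain the main key, and the keywords field
    should hold at most 10 space-separated entries."""
    problems = []
    if title.strip().lower() == h1.strip().lower():
        problems.append("title duplicates the article heading")
    if main_key.lower() not in title.lower():
        problems.append("main key missing from title")
    if main_key.lower() not in h1.lower():
        problems.append("main key missing from heading")
    if len(keywords.split()) > 10:
        problems.append("more than 10 keywords")
    return problems

# An empty list means the meta passes all the checks
print(check_meta("How to learn push-ups fast",
                 "How to do push-ups: a beginner plan",
                 "pushups abs training", "push-ups"))  # []
```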

So, here is how I write optimized articles.

First of all, I choose the queries I will use in the article. For example, I will take the basic query "how to learn to do push-ups" and enter it into wordstat.yandex.ru:


Wordstat has displayed 14 queries. I will try to use most of them in the article whenever possible (it is enough to mention each keyword once). Keywords can and should be declined, rearranged, replaced with synonyms, made as diverse as possible. Search engines that understand the Russian language (especially Yandex) perfectly recognize synonyms and take them into account when ranking.

I try to determine the article size by the average size of articles from the TOP.

For example, if 7 of the TOP-10 sites have text sizes of mostly 4,000-7,000 characters, and 3 sites have a much lower (or higher) figure, I simply ignore those 3.
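A sketch of that outlier-trimming average (the exact cut-off of half to double the median is my assumption; the text only says the outliers are ignored):

```python
from statistics import mean, median

def target_size(sizes: list[int]) -> int:
    """Average the article sizes of the TOP, ignoring obvious outliers:
    anything smaller than half or larger than twice the median."""
    m = median(sizes)
    typical = [s for s in sizes if m / 2 <= s <= m * 2]
    return round(mean(typical))

# 7 typical sites plus 3 outliers, as in the example above
sizes = [4500, 5000, 6800, 4200, 6000, 5500, 4800, 900, 15200, 14000]
print(target_size(sizes))  # 5257 — a target size inside the 4,000-7,000 band
```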

Here is the approximate HTML structure of an article optimized for the query "how to learn to do push-ups":

<h1>How can a girl learn to do push-ups from the floor and on the uneven bars from scratch?</h1>

……………………………

<h2>How a girl can learn to do push-ups 100 or more times in 2 months</h2>

……………………………

<h2>What should be the right nutrition?</h2>

……………………………

<h2>Daily regime</h2>

……………………………
……………………………

<h2>Training program</h2>

Some SEOs advise always writing the main key into the first paragraph. In principle this is an effective method, but if you have 100 articles on your site, each with the promoted query in its first paragraph, it can backfire.

In any case, I advise you not to get hung up on one specific scheme, experiment, experiment and experiment again. And remember, if you want to make a site that search robots will love, make it so that real people love it!

How to increase website traffic using simple manipulation with linking?

I think each of you has seen many different linking schemes on sites, and I am more than sure that you did not attach much importance to them. In vain! Let's look at the first linking scheme, the one most sites have. It lacks links from the internal pages to the main page, but that is not so important. What matters is that in this case the most important pages on the site are the category pages (which are most often not promoted and are even closed from indexing in robots.txt!) and the main page. The post pages here carry the least weight:

Now let's think logically: which pages do SEOs promoting content projects push first of all? That's right: pages with posts (in our case, the 3rd-level pages). So why, when promoting a site with external factors (links), do we so often forget the no less important internal factor: correct linking?

Now let's take a look at a scheme in which internal pages will have more weight:

This can be achieved simply by properly adjusting the flow of weight from the top-level pages to the bottom, third-level pages. Also, do not forget that internal pages will be additionally pumped by external links on other sites.

How do you set up this weight transfer? Very simply: with the rel="nofollow" attribute and the noindex tag.

Let's consider a site where you need to pump internal pages:

On the main page (1 level): all internal links are open for indexing, external links are closed via nofollow noindex, the link to the sitemap is open.

On pages with headings (2nd level): links to pages with posts (3rd level) and to a sitemap are open for indexing, all other links (to the main, headings, tags, external) are closed via nofollow noindex.

On pages with posts (3rd level): links to pages of the same, 3rd level and to the site map are open, all other links (to the main page, headings, tags, external) are closed via nofollow noindex.

After this linking was done, the sitemap received the most weight and distributed it evenly to all the internal pages. The internal pages, in turn, gained significantly in weight. The main page and the category pages received the least weight, which is exactly what I wanted to achieve.

I check the page weight using a program. It helps to visualize the distribution of static weight across the site, concentrate it on the promoted pages, and thereby raise the site in search results while significantly saving the budget on external links.
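The weight distribution such a program visualizes can be approximated with a simplified PageRank iteration over a toy site graph that follows the scheme above (the 0.85 damping factor is a standard PageRank assumption, not something from the text):

```python
# Toy graph: every page links to the sitemap, the sitemap links to the
# posts, categories pass weight only to posts, posts link between themselves.
links = {
    "main":    ["cat1", "cat2", "sitemap"],
    "cat1":    ["post1", "post2", "sitemap"],
    "cat2":    ["post3", "post4", "sitemap"],
    "post1":   ["post2", "sitemap"],
    "post2":   ["post1", "sitemap"],
    "post3":   ["post4", "sitemap"],
    "post4":   ["post3", "sitemap"],
    "sitemap": ["post1", "post2", "post3", "post4"],
}

pages = list(links)
weight = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # power iteration until the weights settle
    new = {p: (1 - 0.85) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = 0.85 * weight[page] / len(outgoing)
        for target in outgoing:
            new[target] += share
    weight = new

for page, w in sorted(weight.items(), key=lambda kv: -kv[1]):
    print(f"{page:8s} {w:.3f}")
```

Running this, the sitemap ends up with the most weight, the posts come next, and the main and category pages get the least, matching the outcome described above.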

Now, let's move on to the fun part - the results of such an experiment.

1. A women's site. The changes to the weight distribution were made in January this year. The experiment brought +1000 to daily traffic:

The site is under the Google filter, so the traffic went up only in Yandex:

Do not pay attention to the drop in traffic in May; this is seasonality (the May holidays).

2. Second site for women, changes were also made in January:

This site, however, is constantly updated and links are purchased to it.

Terms of reference for copywriters for writing articles

Article title: How can greens be useful?

Uniqueness: not less than 95% by Etxt

Article size: 4,000-6,000 characters without spaces

Keywords:

how are greens useful - the main key

healthy greens

useful properties of greens

benefits of greenery

Requirements for the article:

The main keyword (listed first) does not have to be in the first paragraph, but it must appear in the first half of the text!

FORBIDDEN: grouping the keywords in one part of the text; distribute them evenly throughout the text.

Keywords should be highlighted in bold (this is only needed to make checking the completed order easier).

Keywords must be worked in harmoniously (so use not only direct occurrences of the keys: you can also decline them, swap words, use synonyms, etc.).

Use subheadings in your text. Try not to use direct occurrences of keywords in your subheadings. Make subheadings as varied as possible for each article!

Use no more than 2-3 direct occurrences of the main query in the text!

If an article must use the keys "how greens are useful" and "how greens are useful for humans", then by writing the second key you automatically use the first. That is, there is no need to write a key twice if one already contains an occurrence of the other.
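The measurable parts of this brief can be checked automatically. A sketch (the thresholds follow the brief above; the function name and messages are mine):

```python
import re

def check_article(text: str, main_key: str) -> list[str]:
    """Check a draft against the brief: 4,000-6,000 characters without
    spaces, the main key appears in the first half of the text, and no
    more than 3 direct occurrences of it."""
    problems = []
    size = len(re.sub(r"\s", "", text))
    if not 4000 <= size <= 6000:
        problems.append(f"size without spaces is {size}, want 4000-6000")
    lowered = text.lower()
    first_hit = lowered.find(main_key.lower())
    if first_hit == -1 or first_hit > len(text) // 2:
        problems.append("main key is not in the first half of the text")
    if lowered.count(main_key.lower()) > 3:
        problems.append("more than 3 direct occurrences of the main key")
    return problems
```

Uniqueness still has to be checked by an external service such as Etxt; only the size and keyword rules are automated here.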

After that, install and activate the plugin, then go to its settings and set up optimization for the site as a whole: enter the site name, describe what the site is about, and list the general keywords. All of this is done in the plugin settings, as shown in the picture:

Be sure to select Enabled, otherwise the plugin will not work. Home Title is the name of our site, its title; it is desirable to include three high-frequency (HF) queries in it, separated by commas. The next field, Home Description, is the site description; place about 8 mid-frequency (MF) queries here, but write it in understandable language rather than as a bare list of phrases, for example: "this site will tell you about making money online, about creating sites", and so on. Next comes the Home Keywords field, where we also enter our 8 mid-frequency queries. An example is shown in the picture above, so you can easily figure it out. Then come the rest of the plugin's settings:

Leave everything else unchanged except the two checkboxes shown in the figure. Do not forget to tick them, thereby prohibiting the indexing of archives, which will have a positive effect on the site. Well, that's all: now you know how to create a semantic core. But this is just the beginning; you need to work on the site constantly, promote it by keywords, and optimize each of your articles. A start has been made, and your site has its own core. If you want to know how to optimize each article using the All in One SEO Pack plugin, follow the site's news. Good luck to everyone, and a huge number of visitors!
