Whether it is training institutions, SEO tutorials, or the knowledge points shared on various SEO websites, there are blind spots. These blind spots have a huge impact on most newcomers to the SEO industry, and even affect the sites they are optimizing for actual rankings. Drawing on the many questions SEO friends have asked me, I have compiled the eight most basic and comprehensive knowledge points and misunderstandings, to interpret one by one.
First, the content must be original
Regarding content originality, we cannot say it is bad, but we should not blindly and deliberately pursue it. While emphasizing fresh content, we must also learn to optimize the SEO quality of the page (reasonable page layout, mixing pictures with text, and so on). Many SEO practitioners in the industry who emphasize originality have probably noticed this: after producing so much original content, not a single inner page participates in the rankings. At best, the extra originality only increases the index count and lifts the homepage ranking.
For a small business website, pure originality cannot deliver a great increase in value, because in most cases it only helps your homepage. For those who really operate keyword SEO, originality and indexing matter, but basic on-page SEO optimization matters more. It is as if you insist on original content every day while ignoring the core of SEO page quality (image specifications, SEO-standard TDK, correct use of H tags, appropriate bold text, the keyword frequency of the page, and so on). Even for large-scale website optimization, original content is an important core point. But look at any well-ranked large site with a million indexed pages: none of them published one million original, high-quality articles. What they built instead is an aggregation of pages around keyword demand points, forming a huge site. If you are still grinding out originals, try content aggregation; the results may be better!
Second, this backlink is not that external link
Whether you are SEO staff who have been in the business for a while or have just started, SEO platforms such as Love Station and Webmaster Tools are practical tools for everyone. But these must-have products leave many friends unclear about the difference between external links and backlinks. As shown below:
As of the time of writing, we can see that Lu Songsong's blog shows 26,700 backlinks, and the search results matched by Webmaster Tools are exactly the same number. Many SEO personnel mistake this "backlink" figure for their own website's external links. To understand where this data comes from, you must first understand an advanced search instruction in the search engine: domain.
Simply put, a domain query returns a site's related domains, also called the site's backlink domain names, and the search result count is the number of matching pages (the same website can be counted multiple times).
If you treat the results of a domain query as your website's external links, then you understand too little about SEO basics, because an external link built as anchor text cannot be retrieved with domain at all.
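The gap between the two can be sketched in a few lines of Python. This is an illustration only, under my own assumptions (it is not how any search engine is actually implemented): a domain-style query roughly matches pages whose visible text contains the domain string, while an anchor-text backlink hides the URL inside an href attribute, so the two lookups find different pages. The page names and HTML snippets are made up.

```python
import re

# Two hypothetical pages: one mentions the domain in plain text,
# the other links to it with anchor text (the URL is only in href).
pages = {
    "page_a.html": "Recommended reading: lusongsong.com has good SEO notes.",
    "page_b.html": '<a href="https://lusongsong.com/">SEO notes</a>',
}

def visible_text(html: str) -> str:
    """Strip tags so only user-visible text remains."""
    return re.sub(r"<[^>]+>", " ", html)

def domain_style_matches(domain: str) -> list:
    """Pages whose visible text mentions the domain (what domain: counts)."""
    return [name for name, html in pages.items()
            if domain in visible_text(html)]

def anchor_backlinks(domain: str) -> list:
    """Pages linking to the domain via an <a href> (a real external link)."""
    return [name for name, html in pages.items()
            if re.search(r'href="[^"]*' + re.escape(domain), html)]

print(domain_style_matches("lusongsong.com"))  # ['page_a.html']
print(anchor_backlinks("lusongsong.com"))      # ['page_b.html']
```

The anchor-text page never shows the domain in its visible text, which is exactly why it is invisible to a domain query while still being a genuine external link.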
Third, the ranking is unstable
In general, only two kinds of friends ask this question. One has never brushed clicks, and assumes the instability is caused by black-hat behavior. The other brushed the rankings, did not renew the service, and the rankings dropped, causing the instability. In fact, I have published a very detailed article about fast-ranking algorithms on Lu Songsong's blog. You may not understand the technology, but you must understand the thinking. As for brushing click rankings: if you can truly simulate real clicks, or the software parameters are set properly, the ranking will be very stable. Of course, there is another situation that makes a website's ranking unstable: the website itself is weak. Appropriate content updates and the introduction of external links improve the keyword weight of the page (also called keyword elevation) and stabilize the site's weight; once that weight is stable, the keyword rankings you brush will also become very stable.
Fourth, this weight is not that weight
Even after doing SEO for many years, you will still glance at a website's "weight" from time to time. In fact, every search engine has its own page-ranking algorithm. Such an algorithm combines various SEO factors and synthesizes them into a final score, and that score is the true weight of page quality.
However, many friends who exchange friendly links look only at this so-called weight value to decide whether to exchange the link.
First of all, understand the main point: the weights shown on websites such as Love Station and Webmaster Tools are just estimated values, ranging from 1 to 10, generated from traffic projections based on indexed keyword rankings. This so-called weight is meaningless, whether for your own website or for evaluating friendly link exchanges. At any time, a brand word with no real search volume can be brushed to a 10,000 index and a weight of 5 will appear immediately, while some heavily searched words may not even be included in the Baidu Index despite their huge search volume, as shown in the figure below:
Therefore, many friends prioritize high-index keywords when optimizing, while the words that are truly valuable, convert well, and have large search volume get ignored. Whether a website really has high weight depends on at least the following points:
1. Domain name age
The age of a site's domain name arguably accounts for the largest share of SEO ranking results (more valuable than any external links or content, because site weight is domain weight). The advantages of an old site are self-evident. Even if it has never done SEO, re-optimizing a domain that has existed for several years has an advantage over any new site, because search engines do not need to run a new-site evaluation on it.
2. Page update frequency
Page update frequency does not determine a website's weight, but it is a very good factor when assessing friendly link exchanges. What is the purpose of updating pages? It is to attract frequent spider crawls. A site that spiders crawl frequently passes that benefit along: friendly links exchanged with such a site help spiders crawl your own site as well.
3. Inner page ranking
Whether a site's weight is high can be seen from its inner-page rankings. For large websites in particular, the domain carries a very high score: even if an inner page has no external links at all, it will be quickly included, indexed, and begin participating in the rankings.
Fifth, keyword density should follow 2%-8%
I do not know who started this misunderstanding; at least I have never seen any search engine say it. Although 2%-8% is only a reference value, it has become a rigid rule for many SEO novices, who deliberately chase that density ratio when optimizing. At the same 5% keyword density, some sites may be cheating and some are not. At the same 15% density, some sites may be cheating and some are not. The core measure of whether a site is stuffing a keyword is definitely not the keyword density, but how the keyword frequency is distributed across the page structure itself.
Many friends know that you can check keyword density with tools like Love Station or Webmaster Tools, but if you look carefully, you will find that the densities the platforms report differ greatly. To understand why, you first need to learn how keyword density on a web page is calculated. Here is an example using Love Station and Webmaster Tools, as shown below:
We can see from the figure above that the keyword density of the brand term on Lu Songsong's blog is 0.83% and 1.2% respectively. Although the two platforms compute different densities, the real difference is that they handle the page's characters differently. Neither Love Station nor Webmaster Tools calculates density on the raw page; both first strip all HTML element code from the page and then count the total characters of the remaining text.
For both Love Station and Webmaster Tools, the calculation formula itself is the same; the densities differ because the keyword data each tool extracts after crawling is different. The calculation is as follows:
Webpage keyword density (percentage) = total length of key characters (keyword length × keyword frequency) / total length of page text
Love Station data: about 0.0083 (about 0.83%) = 96 characters (3 characters × 32 occurrences) / 11,584 characters
Webmaster Tools data: about 0.0117 (about 1.2%) = 165 characters (3 characters × 55 occurrences) / 14,070 characters
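The formula above is a one-liner. The sketch below reproduces both tool results, assuming a 3-character keyword as in the example (the exact occurrence counts and text lengths are the figures quoted above):

```python
def keyword_density(keyword_length: int, frequency: int,
                    total_text_length: int) -> float:
    """Density = (keyword length × frequency) / total visible-text length."""
    return keyword_length * frequency / total_text_length

# Reproducing the two tool results above (3-character keyword):
aizhan_density = keyword_density(3, 32, 11584)   # 96 / 11584
chinaz_density = keyword_density(3, 55, 14070)   # 165 / 14070

print(f"{aizhan_density:.2%}")  # 0.83%
print(f"{chinaz_density:.2%}")  # 1.17% (the tool rounds this to 1.2%)
```

Note that the only inputs that differ between the two tools are the occurrence count and the total text length, which is exactly the point: same formula, different crawled data.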
The web page itself is composed of multiple DIV sections, commonly three: the head, the middle, and the bottom. The head may contain a top bar and navigation; the middle is composed of multiple DIV layers depending on the type of website; the bottom mostly holds bottom navigation and bottom friendly links. Suppose your keyword appears 32 times, as on Lu Songsong's blog (Love Station data). If you put all 32 occurrences in one section and bold them, then even a user, let alone a search engine, can tell your website is deliberately stuffing keywords. So the density is not what matters; what matters is distributing the keyword occurrences effectively and naturally. Even if you exceed 8%, the search engine will not conclude from that number alone that you are stuffing keywords. However, most SEO practitioners never deliberately think about this problem, because the tools hand them a ready-made density calculation.
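The distribution argument above can be made concrete with a toy check. This is a hedged illustration under my own assumptions, not a real search-engine signal: the section names and repeated-word texts are made up, and the metric is simply what fraction of all occurrences lands in the busiest section.

```python
def counts_by_section(sections: dict, keyword: str) -> dict:
    """Count keyword occurrences in each page section."""
    return {name: text.count(keyword) for name, text in sections.items()}

def max_section_share(sections: dict, keyword: str) -> float:
    """Fraction of all occurrences concentrated in the busiest section."""
    counts = counts_by_section(sections, keyword)
    total = sum(counts.values())
    return max(counts.values()) / total if total else 0.0

# Same 32 occurrences, two different layouts (hypothetical pages):
stuffed = {"head": "", "middle": "seo " * 32, "bottom": ""}
natural = {"head": "seo " * 6, "middle": "seo " * 20, "bottom": "seo " * 6}

print(max_section_share(stuffed, "seo"))  # 1.0   — all 32 in one block
print(max_section_share(natural, "seo"))  # 0.625 — spread across the page
```

Both pages have identical keyword density, yet one is visibly a single stuffed block, which is the distinction density alone cannot capture.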
Sixth, this algorithm is not that algorithm
Regarding algorithms, many friends emphasize the algorithms published by search engines, but very few truly understand any of them. Baidu, for example, has the so-called Blue Sky algorithm, Green Dill algorithm, Ice Bucket algorithm, and so on. Whatever the algorithm, its announcement is only a few hundred or a few thousand words, and then every SEO practitioner is left to keep guessing at it. The real, genuinely useful search-engine algorithms are nearly universal: for link analysis there are the HITS algorithm and the Hilltop algorithm; for keywords there is the TF-IDF algorithm; for web pages there are document retrieval models. If you can master these algorithms and use them freely, it is not difficult to work out the ranking signals of search engines. If instead you blindly chase those vague announcements, it is hard to learn the secret behind search-result rankings.
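To make the TF-IDF mention concrete, here is a minimal sketch of the classic textbook weighting (production engines use tuned variants, so treat this as the idea only). The three tiny "documents" are made-up, pre-tokenized word lists:

```python
import math

docs = [
    ["seo", "ranking", "keyword", "seo"],
    ["content", "original", "keyword"],
    ["link", "analysis", "ranking"],
]

def tf_idf(term: str, doc: list, corpus: list) -> float:
    tf = doc.count(term) / len(doc)            # term frequency in this doc
    df = sum(1 for d in corpus if term in d)   # number of docs with the term
    idf = math.log(len(corpus) / df)           # inverse document frequency
    return tf * idf

# "seo" appears twice in a 4-word doc and in only 1 of 3 docs, so it
# scores high; "keyword" appears in 2 of 3 docs, so it scores lower.
print(round(tf_idf("seo", docs[0], docs), 3))      # 0.549
print(round(tf_idf("keyword", docs[0], docs), 3))  # 0.101
```

The intuition is exactly what makes it useful for SEO analysis: a term that is frequent on one page but rare across the corpus is what that page is "about."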
Seventh, the index determines keyword difficulty
This misunderstanding should arguably be placed at the top of the list, but I have left it for the end. Whether you are an agency-side SEO marketing company or an individual SEO practitioner, most rate the keyword index (Baidu Index) as the core criterion for keyword optimization difficulty. In fact, real keyword difficulty includes at least four categories, and the index is arguably the least valuable of them. In essence, a keyword's search index only indicates its periodic popularity; it cannot reflect the keyword's difficulty. There are at least four major types of keyword optimization difficulty, from top to bottom.