Learning SEO and need to understand code? Get these points clear

To do SEO well, you need some basic knowledge of code, and small optimization details matter: for an SEOer, they deliver twice the result for half the effort. So code is still worth learning, and it must be understood. If you cannot read or fix code, future development will suffer and much will remain unclear to you. Code is therefore very important in this field. The recommendation is to learn HTML first and then SEO; it is a great advantage for future work and research.

Optimization points

W3C Standard

Search engine ranking algorithms were developed with the W3C standards in mind, so a website's code should conform to the W3C standards as far as possible. The easiest check is whether there is a DOCTYPE declaration at the top of the page's code. DOCTYPE is a document declaration in HTML and XHTML that tells the browser which HTML or XHTML specification the current document uses. The declaration sits at the very beginning of the document. Taking XHTML 1.0 as an example, there are three DTDs:

  1. Transitional DTD (transitional).
  2. Frameset DTD (frameset).
  3. Strict DTD (strict).
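For example, the XHTML 1.0 Transitional declaration looks like this, while modern HTML5 documents use the much shorter form:

```html
<!-- XHTML 1.0 Transitional DOCTYPE, placed on the first line of the document -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

<!-- Modern HTML5 documents simply declare: -->
<!DOCTYPE html>
```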

META tags

Meta tags should all be in place. Their main function is to tell search engines what information the page contains: the title, keywords, description, author information, version and so on. Even though search engines no longer put much weight on keywords and description, meta tags are still very important for SEO.

  1. TITLE: should match the site's positioning and its main keywords. Not much more needs saying about the title here.
  2. Description and keywords: it is best to fill these two tags in carefully once and then leave them alone, rather than revising them repeatedly. Even a high-weight site cannot afford to toy with them. Make sure you fill them out and do not modify them afterwards.
  3. Robots: the robots meta tag tells search engines which pages may be indexed and which may not. The possible values of its content attribute are all, none, index, noindex, follow and nofollow; the default is all. Setting up a robots.txt file gives even finer control over which pages are crawled.
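A minimal head section covering these meta tags might look like the following sketch; every value shown is a placeholder:

```html
<head>
  <meta charset="utf-8">
  <!-- Title: site positioning plus the main keyword -->
  <title>Example Site - Main Keyword</title>
  <!-- Fill these in once and avoid modifying them later -->
  <meta name="description" content="A short, accurate summary of the page.">
  <meta name="keywords" content="keyword1, keyword2">
  <!-- Robots meta tag: allow indexing this page and following its links -->
  <meta name="robots" content="index, follow">
</head>
```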


DIV + CSS framework

A website written with a DIV + CSS framework is SEO-friendly because the code is clear and streamlined and the total amount of markup is reduced. This improves the efficiency of spider crawling and helps indexing. TABLE layouts are worse because when a spider visits a page and encounters multiple levels of nesting, it may skip the nested content; in short, too much nesting hurts indexing, while DIV + CSS avoids the problem.
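As a rough illustration of the difference, here is the same content in a nested table layout versus DIV + CSS (the class names are invented for the example):

```html
<!-- Table layout: the spider must unwrap several nesting levels -->
<table><tr><td>
  <table><tr><td>Content</td></tr></table>
</td></tr></table>

<!-- DIV + CSS: flat markup, with all styling kept in a stylesheet -->
<div class="sidebar">Menu</div>
<div class="content">Content</div>
```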

Code optimization of website tags

Optimizing tag code is not difficult. Generally you can use the CTRL + F search function to find the corresponding tags. Here are some tags that are often optimized:

  1. H1-H6: mainly used for headings. Search engines give text inside an H1 tag more weight than other content. The highest-weight element on a page is the title, followed by H1, then H2, and so on. The rule: H1 should appear only once per page, never more, and the earlier it appears in the code, the better. Keep each of the other H2-H6 tags to roughly 1-5 uses. More is not better; overuse hurts both appearance and indexing.
  2. ALT: mainly used for image descriptions. ALT attributes can be found with the search function.

Check which images have an empty ALT and write a description for each. All for the sake of indexing!

  3. Strong and b: mainly used in content. Both strong and b render as bold, but b is purely visual bolding in HTML, while strong declares emphasis, which makes it better suited to keywords in content. Strong also carries more weight than b, so for weight's sake use strong. Where the code uses b, consider replacing it with strong. Just remember: use it in moderation, because overuse backfires.
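These tag guidelines can be sketched in a short snippet (the heading text, image path and alt text are placeholder examples):

```html
<h1>Main Page Topic</h1>   <!-- exactly one H1 per page, early in the code -->
<h2>Section Heading</h2>   <!-- H2-H6 used sparingly, 1-5 times each -->
<img src="photo.jpg" alt="Short description of the photo">
<p>Mark <strong>important keywords</strong> with strong rather than <b>plain bolding</b>.</p>
```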

Matters needing attention

  1. Keep your pages as small as possible, because each time a search engine crawler visits your site, the amount of data it will store is limited. A common recommendation is to keep pages under 100 KB, the smaller the better, though not below 5 KB. Smaller pages also have the advantage of letting your site form a denser internal link network.
  2. Minimize the use of decorative images and Flash. The crawlers dispatched by the content indexer cannot read images; they can only judge an image's content from attributes such as ALT and TITLE. Flash is ignored by search engine crawlers altogether.
  3. Minimize the use of JS, and encapsulate all JS code in external files called from the page. Search engines do not like inline JS, and it lowers a site's friendliness score.

  4. Try not to use table layouts, because search engines are reluctant to crawl content nested more than three tables deep. Crawlers can be lazy, so keep both code and content within three levels of nesting.

  5. Try not to scatter CSS among HTML tags; encapsulate it in an external file called from the page. If CSS appears inline in HTML tags, crawlers are distracted by markup that means nothing for optimization, so keep it in a dedicated CSS file.
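The JS and CSS advice above boils down to moving inline code into external files; a minimal sketch (the file names are placeholders):

```html
<head>
  <!-- External stylesheet instead of inline style="" attributes -->
  <link rel="stylesheet" href="styles.css">
  <!-- External script instead of inline <script> blocks -->
  <script src="main.js" defer></script>
</head>
```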

How to optimize a new site to speed up ranking

Webmasters often say that getting a new website indexed is difficult. So what should we do to get spiders to visit our new site smoothly and fall in love with it? That takes some technique. Here are some common ways to make spiders take to a new site.

Transferring weight and attracting spiders within a site group

Many of you will have heard of the "site group" concept: building a large number of websites that pass weight to one another. Note that white-hat optimization of second-level domain sites can earn rankings as long as you persist; after all, a second-level domain carries more weight than a second-level directory, second only to the top-level domain, which is very helpful for keyword optimization. As long as your servers can handle the load, you can run a large number of second-level domain sites as a site group to get your keywords ranking well.

When building a site group you often run into the new-site problem: a second-level domain site has just launched, and although the spider indexes it immediately, the articles updated on it are not indexed and released for a long time, so the keywords targeted by that subdomain naturally go nowhere. In that situation, what should we do so that the subdomain's updated content gets indexed and released as soon as possible?

Update articles regularly

Although it is a cliché, this cannot be ignored; it is the top priority. A website lives on its content. If the search engine never releases your articles, does that mean the site's ranking prospects have vanished? Not at all. Even when a site: query shows only the homepage, regularly updated articles still get indexed, and as the number of indexed articles grows, even the unreleased ones pass weight to the homepage. Over time, the second-level site's rankings will gradually appear and slowly climb; at that point, reaching the first page for your keywords, or even better positions, is only a matter of time.

  1. Analyze the reading habits of target consumers

SEO technique matters, but planning matters more; SEO is driven by planning and thinking. Technique improves steadily with practice, but planning is different: it draws on many kinds of knowledge and demands creativity. We often talk about user experience in optimization, so what is user experience? In essence, it is the reading habits and interests of your target consumers. If you can grasp what these people want, the user experience improves accordingly. For example, if your consumer group is students but the site keeps promoting products students cannot afford, those consumers will naturally lose interest in the content and the bounce rate will rise. You then need to cater to their consumption habits: give consumers what they love, in the right place, and the site's bounce rate will naturally fall, search engine spiders will trust it more, and the site's content will be indexed faster.

User experience also includes factors such as page load speed and a clean, uncluttered design, so optimizing a website means combining many kinds of knowledge. Only when you really understand your site's structure can you look at it from the consumer's perspective, raise the overall quality of the site, and keep target consumers on it. As time on site grows and the bounce rate gradually falls from 100% toward 70%, the indexing of the new site also benefits.

Do a good job of spreading external links

Besides attracting consumers, we also need to promote the website appropriately so that others know what the site is and what problems its products or services can solve. Done well, promotion not only attracts accurate traffic but also encourages spiders to crawl and index, so this work is also very important.

External links should push high-quality articles

In the search engines' original conception, high-quality content gets actively shared, and reprinters publish a link back to the source out of respect for copyright. But our world is not perfect; not everyone's standards are that noble, especially where money is involved, many things change. Plagiarism of every kind is endless: many people openly pass off others' articles as their own, not even bothering with pseudo-original rewrites, simply copying, pasting and editing them into their own; worse, a good article ends up republished mangled beyond recognition. The original purpose of spreading high-quality articles has been distorted for profit, and the search engines' crackdown on external links shows they can no longer tolerate such behavior.

If you hope to keep doing link-building work, please change this mindset. Churning out garbage links will only make the road narrower and narrower: it wastes your time and leaves your optimization work with nothing accumulated. It is not worth it!

External links attract traffic and spiders

A good article can arouse readers' sympathy and draw them to the website; a good external link also lets target consumers see what the site does and whether its products suit them. Well-built external links therefore attract accurate traffic.

In addition, actively spreading external links brings spiders to your website to "eat" the many articles that have not yet been indexed, raising the site's overall weight, so external links matter a great deal for guiding spiders.

Publish external links for quality, not quantity

One high-quality external link can beat hundreds of garbage links, because the IPs a spider can crawl are limited; it only reaches the places that are easy to crawl. For example, between a forum post stuffed with links and one carrying a single relevant link, which does the spider prefer? The latter. Likewise, you can contribute articles to popular sites such as A5 and Webmaster Home; a successful submission inevitably brings a large number of reprints, which benefits a new site's optimization in many ways.

As long as you diligently build external links and update articles, you will soon find one or even several spider IPs lingering on your site, constantly crawling page after page. Keeping such spiders on the website gives you real motivation to push the site to the next level, and your enthusiasm for the work rises with it.

SMO (social media optimization) needs to proceed in parallel

News-source resources are very important to a website: they make the site known and at the same time secure very high-quality external resources.

Sharing channels of all kinds are also critical, such as reposting the site's articles to WeChat Moments and to various SNS communities. Even if these shares do not leave a lasting external link, spreading the site's content is very helpful for brand building.

New sites often face a search engine review period, which many webmasters find very hard: on one side the pressure from their bosses, on the other the search engine's whims. So everyone hopes the new site will be indexed smoothly and ranked smoothly.

What is SEO network optimization?

For beginners new to SEO, we first need to be clear about what SEO network optimization is; only by understanding the definition can we study it further. Here, this SEO tutorial explains in detail what SEO is, along with related background knowledge.

There are plenty of definitions of this on the Internet and in search engines. Different people explain SEO in different words, but the end result is the same: using your website to get traffic from search engines, and then completing product sales, service delivery and brand exposure.

By the encyclopedia definition, SEO = Search Engine Optimization. It means understanding how search engines rank pages and, on that basis, optimizing both on-site and off-site factors to raise the probability that the site's keywords rank well, thereby obtaining traffic.

It is worth pointing out that SEO is not only about ranking; it is a combination of five elements: search demand coverage, indexing, ranking, presentation, and data analysis. Every SEO technique can be filed under one of these, and a deep grasp of this framework will matter as you deepen your SEO skills.

Suppose you run an SEO site; such sites are countless, so how do you make your keywords rank higher and get seen by users? This is where SEO network optimization serves as a marketing tool: with the right techniques, push the site's keywords up the rankings and expose the information to the segmented users, further raising the conversion rate of the site's products and services and bringing returns to the site's owner or business.

Below is a detailed explanation of what SEO network optimization involves.

One: Typical features of high-quality sites.

1: Reasonable keyword layout across the site. A systematic keyword database for the industry segment can satisfy different users' differentiated keyword searches.

2: A good indexing ratio: a large number of effectively indexed pages and a low proportion of invalid pages.

3: Brand words, long-tail keywords and secondary keywords are positioned reasonably in search engines, with long-tail keywords accounting for the larger share of traffic.

4: The site layout is reasonable, the interface attractive and visually strong, and the interactive features complete; the site satisfies the search needs of most segmented users, retains the new visitors who click through, and converts them into returning users.

5: The website opens quickly and reliably, giving a good user experience.

6: The site is adapted for users browsing on different devices, so that everyone can browse conveniently and normally.

7: The site's external links grow naturally.

8: The site's content has a large audience and high quality.

9: A good accumulation of historical data and considerable brand influence in search engines.

10: Users show good visit depth, a low bounce rate and long dwell times.

Two: on-site SEO network optimization.

SEO divides into on-site and off-site optimization. On-site optimization, literally, means optimizing inside the website itself: keyword layout, link structure, interactive features, images, code, device adaptation, relevance building and so on.

At present, the built-in SEO features of most CMS site-building systems on the market are fairly complete, so an ordinary SEOer can shift the focus of the work to producing high-quality content.

From this point of view, choosing a site-building system and templates suited to your industry is especially important. For example, for a blog, building on WordPress is ideal: the link structure, interactive features (comments) and streamlined code that webmasters care about are already set up.

Three: off-site SEO network optimization.

Setting aside the on-site part, what remains is off-site SEO. Compared with on-site optimization, off-site optimization is less controllable, a problem common to many in the trade.

Off-site optimization includes authority building and the analysis of user-behavior data.

1: Authority building includes building external links, i.e. inbound links of all kinds (hint: ordered from strongest to weakest, anchor-text links, then hyperlinks, then plain-text links), the exposure of the site's brand words, the accumulation of the site's historical data, and the site's credibility.

2: User-behavior data mainly means click behavior: by crafting the title and description, you raise the site's click-through rate among results whose keyword rankings are not far apart.


The above explains in detail what SEO network optimization is and how to carry it out. For an ordinary enterprise site it is not difficult: do the content well, let the external links grow "naturally", and the site will stabilize, earn keyword rankings, and achieve the final conversion into products and services.