Learning SEO and need to understand code? Just get these points clear

To do SEO well, you need some basic knowledge of code, because much of the work is optimizing small details in a page. For an SEOer, that knowledge makes the work twice as effective for half the effort. If you cannot read or fix code, it will hold back your future development in this field. So code really does matter here. The recommendation is to learn HTML first and then SEO: it gives you a real advantage in later development and research.

Optimization points

W3C Standard

Search engine ranking algorithms were developed with the W3C standards as a basis, so a website's code should conform to W3C standards as far as possible. The easiest check is whether there is a DOCTYPE declaration at the top of the page's code. DOCTYPE is a document declaration in HTML and XHTML that tells the browser which HTML or XHTML specification the current document uses. The DOCTYPE declaration sits at the very beginning of the document. Taking XHTML as an example, there are three DTDs:

  1. Transitional DTD (transitional).
  2. Frameset DTD (frameset).
  3. Strict DTD (strict).
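For reference, these are the standard XHTML 1.0 declarations for the three DTDs (each must be the very first line of the document), plus the much simpler modern HTML5 form:

```html
<!-- XHTML 1.0 Transitional -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

<!-- XHTML 1.0 Frameset -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Frameset//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-frameset.dtd">

<!-- XHTML 1.0 Strict -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">

<!-- A modern HTML5 document simply declares: -->
<!DOCTYPE html>
```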

META Tags

Make sure the meta tags are all in place. Their main function is to tell search engines what information the page contains: the title, keywords, description, author, version and so on. Even though search engines no longer give keywords and description much ranking weight, meta tags are still very important for SEO.

  1. TITLE: It should match the character of the website and contain the main keywords. There is not much more to say about the title here.
  2. Description and keywords: It is best not to leave these two tags empty, but also not to keep revising them once filled in. Even a high-weight site cannot afford to play with them: fill them out once, then leave them alone.
  3. Robots: The robots meta tag tells search engines which pages may be indexed and which may not. The possible values of its content attribute are all, none, index, noindex, follow and nofollow; the default is all. Setting up a robots.txt file gives even finer control over how pages are crawled.
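Putting the tags above together, the head of a page might look like this (the site name and all text are invented for illustration):

```html
<head>
  <meta charset="utf-8">
  <!-- Title: site character plus the main keywords -->
  <title>Example Shop - Handmade Leather Bags</title>
  <!-- Fill these in once and avoid revising them afterwards -->
  <meta name="description" content="Handmade leather bags and wallets, crafted to order.">
  <meta name="keywords" content="leather bags, handmade wallets">
  <!-- Allow indexing and link following (this is also the default) -->
  <meta name="robots" content="index, follow">
</head>
```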


DIV + CSS

A website written with a DIV + CSS layout is SEO-friendly because the code is clear and streamlined and the overall page size is reduced, which improves the efficiency of spider crawling and helps pages get indexed. TABLE layouts are worse because when a spider visits a page and encounters multiple levels of nesting, it may skip the nested content. In short, too much nesting hurts inclusion, and DIV + CSS avoids it.
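As a rough illustration (content invented), here is the same piece of content in a nested table layout versus a DIV + CSS layout:

```html
<!-- Table layout: the content sits several nesting levels deep -->
<table><tr><td>
  <table><tr><td>
    Main article text...
  </td></tr></table>
</td></tr></table>

<!-- DIV + CSS: flat markup; presentation moves to an external stylesheet -->
<div class="main">Main article text...</div>
```

The DIV version keeps the content near the top of the markup tree, which is exactly what makes it easier for a spider to reach.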

Code optimization of website tags

Optimizing tag code is not complicated. Generally you can use the CTRL + F search function to find the corresponding tags. Here are the tags that are most often optimized:

  1. H1-H6: Mainly used for headings. Search engines give H1 text higher weight than other content; the highest-weight element on a page is the title, followed by H1, then H2, and so on. The rule: H1 should appear only once per page, generally on the main page heading, and the earlier it appears in the code the better. Keep the other H2-H6 tags to roughly 1-5 uses each; more than that does not help, and it hurts both readability and inclusion.
  2. ALT: Mainly used for image descriptions. Use the search function to find the ALT attributes, see which images have an empty ALT, and write a description for each one, all for the sake of inclusion.

  3. Strong and b: Mainly used in content. Both strong and b render as bold, but b is purely visual bolding in HTML, while strong declares emphasis, which makes it better suited to keywords in content. Since strong carries more weight than b, replace b with strong where you find it in the code, all for the sake of weight. Just remember to use it in moderation; too much will hurt rather than help.
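A short fragment combining the three tags above (all text and file names invented for illustration):

```html
<!-- One H1 per page, as early in the code as possible -->
<h1>Handmade Leather Bags</h1>

<h2>Why leather?</h2>
<!-- Describe every image in its alt attribute -->
<img src="bag.jpg" alt="Brown handmade leather bag">

<!-- strong declares emphasis; prefer it over b for keywords, in moderation -->
<p>Our <strong>handmade leather bags</strong> are cut to order.</p>
```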

Matters needing attention

  1. Try to keep your pages as small as possible, because the amount of data a search engine crawler stores on each visit to your site is limited. It is generally recommended to stay under 100 KB, and the smaller the better, though not below 5 KB. Smaller pages also have the advantage of helping your site form a large internal link network.
  2. Minimize the use of useless images and Flash. The crawler sent by the content indexer cannot read images; it can only judge an image's content from attributes such as alt and title. Flash is ignored by search engine crawlers entirely.
  3. Minimize the use of JS as much as possible, and encapsulate all JS code in external files that the page calls. Search engines do not like inline JS; it lowers a site's friendliness score.

  4. Try not to use table layouts, because search engines are reluctant to crawl table content nested more than three levels deep. Search engine crawlers are sometimes lazy, so keep your code and content within three nesting levels.

  5. Try not to scatter CSS through HTML tags; encapsulate it in an external file instead. If CSS appears inline in HTML tags, search engine crawlers are distracted by material that means nothing for optimization, so it is recommended to put it all in a dedicated CSS file.
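Moving scripts and styles out of the page comes down to the same pattern in both cases: external files referenced from the head (the file names here are invented):

```html
<head>
  <!-- Styles live in a dedicated CSS file, not in inline style attributes -->
  <link rel="stylesheet" href="site.css">
  <!-- Scripts live in an external file rather than inline script blocks;
       defer keeps the script from blocking the page parse -->
  <script src="site.js" defer></script>
</head>
```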
