SEO practitioners must understand the working principle of the search engine spider
September 11, 2017
Website optimization work revolves around the Baidu spider; I believe nobody would dispute that. Nevertheless, many optimization practitioners are unfamiliar with how the spider actually works, so their optimization stays at a rudimentary stage and cannot improve further.

That being the case, we need to understand the spider's working principle and develop and refine our optimization plans accordingly. Let me briefly explain what that means in practice.

First: distribute internal links sensibly, so the spider crawls more deeply and more widely.

A search engine spider is much like a real one: as long as there is a large web, it can easily crawl across it and grab its food. Our website is that web, and internal links are its threads. If the threads are too few, the Baidu spider cannot crawl our site deeply or broadly. This shows that in the optimization process we must pay attention to internal link construction and give the spider more, and more tightly connected, entry points. The most common technique is to add one or more links to related articles at the bottom of each article.

Of course, this does not mean stuffing every article with internal links for no reason; the links should be natural and valuable, so that they are friendly to both users and spiders.

Second: streamline the website, so the spider can grab pages more easily.

After crawling a site for a while, the spider begins grabbing pages. It will not necessarily capture every page successfully; it works according to the site's structure and specific circumstances, and complex elements on the site become stumbling blocks to crawling. So we should learn to streamline the website:

(1) Streamline the CSS and JS code. Many webmasters pay no attention to whether their site's source code is streamlined, so many sites suffer from redundant CSS and JS, which makes the pages harder for the spider to crawl. If you understand the code, merge the duplicate parts yourself; if not, it is worth spending a little money to have someone do it.

(2) Remove Flash and compress images. Images and Flash are hard for spiders to capture. For Flash, I strongly suggest deleting it; it serves neither users nor spiders well. For image-heavy sites, I suggest webmasters keep an image compression tool ready, compress every picture before uploading it, and try to add an alt attribute to each image so the spider can better identify and grab it.
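To make the "web and threads" picture concrete, here is a minimal sketch of the link-discovery step a spider performs on a page, using only the Python standard library. The page markup and URLs below are invented for illustration; a real crawler would also fetch pages, respect robots.txt, and deduplicate.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects href targets from <a> tags, the way a crawler
    discovers the internal "threads" leading to new pages."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL,
                    # just as a spider does when following them.
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical article footer with a "related articles" block,
# the internal-linking technique described above.
page = """
<article>
  <h1>Post</h1>
  <p>Related reading:</p>
  <ul>
    <li><a href="/seo/internal-links.html">Internal link basics</a></li>
    <li><a href="/seo/site-structure.html">Site structure</a></li>
  </ul>
</article>
"""

collector = LinkCollector("https://example.com/blog/post.html")
collector.feed(page)
print(collector.links)
```

Each related-article link at the bottom of a post becomes another entry point the spider can follow deeper into the site.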
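The code-streamlining advice in point (1) can be sketched as a toy CSS minifier. This is a simplification for illustration only: real minifiers handle strings, nested syntax, and many edge cases these regexes would break on, which is why paying a professional is a reasonable option.

```python
import re

def minify_css(css: str) -> str:
    """A rough CSS minifier: strips comments and collapses
    whitespace, shedding the redundant bytes that make pages
    heavier for a spider to fetch."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten punctuation
    return css.strip()

# Hypothetical bloated stylesheet fragment.
bloated = """
/* header styles */
.header {
    color : #333 ;
    margin : 0 ;
}
"""
print(minify_css(bloated))
```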
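For the alt-attribute advice in point (2), a small standard-library checker can flag images a spider would struggle to identify. The sample markup and file names are hypothetical.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flags <img> tags that lack a non-empty alt attribute,
    since spiders rely on alt text to understand images."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                # Record the src of any image missing alt text.
                self.missing.append(attr_map.get("src", "?"))

page = """
<img src="logo.png" alt="Company logo">
<img src="banner.jpg">
<img src="icon.png" alt="">
"""
checker = MissingAltChecker()
checker.feed(page)
print(checker.missing)
```

Running such a check before publishing helps ensure every compressed image you upload also carries the alt text the spider needs.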