Web Robots (also known as Web Wanderers, Crawlers, or Spiders) are programs that traverse the Web automatically. Search engines such as Google use them to index web content, spammers use them to scan for email addresses, and they have many other uses. Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol.

II. Initiating organizations of the Internet Search Engine Service Self-Regulation Pact: Baidu, Jike Search, Pangu Search, Qihoo 360, Shanda Literature, Sogou, Tencent, NetEase, Sina, Easou, Yicha, Zhongsou.
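As a sketch of how the Robots Exclusion Protocol works in practice, the snippet below parses a minimal, hypothetical /robots.txt (the rules shown are illustrative, not from any real site) using Python's standard-library urllib.robotparser, and checks whether a crawler may fetch two paths:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only.
# "User-agent: *" applies the rules to all robots;
# "Disallow: /private/" asks them not to crawl that path.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "/private/data.html"))  # False: disallowed
print(parser.can_fetch("*", "/public/index.html"))  # True: allowed
```

Note that the protocol is purely advisory: well-behaved robots such as search engine crawlers honor these rules voluntarily, but nothing technically prevents a robot from ignoring them.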