A Web Crawler is a program that crawls through sites on the Web and indexes their URLs. Search engines use crawlers to index URLs on the Web; Google's crawler, for instance, was originally written in Python, and other search engines use different kinds of crawlers. In this post I'm going to show you how to create a simple Web Crawler in PHP. The code shown here was written by me.... [READ MORE]
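Just to illustrate the idea before the full post, here is a minimal sketch of such a crawler in PHP. This is not the code from the post itself: the seed URL and page limit are placeholders, and it assumes allow_url_fopen is enabled so file_get_contents can fetch remote pages.

```php
<?php
// Minimal crawler sketch: fetch pages starting from a seed URL,
// collect the absolute links found on each page, and treat the
// list of visited URLs as the "index".

function crawl(string $seedUrl, int $maxPages = 10): array
{
    $queue   = [$seedUrl];
    $visited = [];

    while ($queue && count($visited) < $maxPages) {
        $url = array_shift($queue);
        if (isset($visited[$url])) {
            continue;
        }
        $visited[$url] = true;

        // Fetch the page; skip it quietly if the request fails.
        $html = @file_get_contents($url);
        if ($html === false) {
            continue;
        }

        // Parse the HTML and queue every absolute link on the page.
        $dom = new DOMDocument();
        @$dom->loadHTML($html);
        foreach ($dom->getElementsByTagName('a') as $anchor) {
            $href = $anchor->getAttribute('href');
            if (preg_match('#^https?://#i', $href) && !isset($visited[$href])) {
                $queue[] = $href;
            }
        }
    }

    return array_keys($visited);
}

print_r(crawl('http://example.com/'));
```

A real crawler would also store page titles or content alongside each URL and respect robots.txt, but the queue-and-visited-list structure above is the core of it.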
Submitting Blogger Blog Sitemap To Google & Other Search Engines
A sitemap is an XML file that contains all the links of your site or blog. It can be created manually or generated automatically using online tools, and it can be submitted to search engines either through the tools they provide or by referencing it in robots.txt. In Blogger blogs the sitemap is located in different places. On blogs with the .blogspot.com subdomain it sits at a simple location, i.e. http://YOURBLOG.blogspot.com/sitemap.xml, but other blogs, i.e. blogs on custom domains, don't have a sitemap at that URL.... [READ MORE]
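For the robots.txt route mentioned above, pointing search engines at the sitemap takes a single Sitemap line. A minimal sketch, assuming the default .blogspot.com address from the excerpt (substitute your own blog's domain):

```
# robots.txt at the root of the blog
User-agent: *
Allow: /

# Tell crawlers where the sitemap lives
Sitemap: http://YOURBLOG.blogspot.com/sitemap.xml
```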