Google search was created in 1997 by Larry Page and Sergey Brin, and today it handles more than three billion searches every day. These searches run against an index built from trillions of web pages, which Google reports to be roughly 95 petabytes in size.
According to Google, they use special software known as Googlebot, running on a large number of computers, to crawl the web. The crawler starts from the state of its last crawl and looks for new sites, updates to existing sites and invalid links. Google doesn't accept money to favour one page over another by crawling it more often. Once the pages have been crawled, the crawlers report back the pages they have visited, and the 95-petabyte index is updated.
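Googlebot itself is proprietary, but the basic loop it performs, following links outward from pages it already knows about and skipping the ones that fail, can be sketched in a few lines of Python. Everything below (the function names, the `max_pages` cap, the error handling) is an illustrative assumption, not Google's actual code:

```python
import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl from a seed URL.

    Returns a mapping of each fetched URL to the list of links
    found on that page; links that fail to load are skipped.
    """
    queue = deque([seed_url])
    seen = {seed_url}
    link_graph = {}
    while queue and len(link_graph) < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # invalid link: record nothing, move on
        parser = LinkExtractor()
        parser.feed(html)
        # Resolve relative links against the current page's URL
        outlinks = [urljoin(url, href) for href in parser.links]
        outlinks = [u for u in outlinks if urlparse(u).scheme in ("http", "https")]
        link_graph[url] = outlinks
        for link in outlinks:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return link_graph
```

Run against a real site, crawl() returns a small who-links-to-whom graph, which happens to be exactly the input the ranking step described below needs.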
Crawling only builds the index, though. Google search doesn't just visit pages and show results; several other factors are weighed before relevant results appear in a search. Some of these factors are known, while others are kept confidential, since unfair means could otherwise be used to rig the system. Some of the known factors are:
- Type of content (relevancy of the data)
- Quality of content
- Freshness of the content
- Authenticity of the site
- Domain name and address (URL) of the website
- Social media promotions
- Number of links pointing to a particular web page
- Value of those links.
The last two factors feed into an important algorithm known as "PageRank". PageRank rates web pages with a score, and pages earn their scores based on where their incoming links come from, i.e. how authoritative, heavily trafficked, authentic and well established the linking pages are.
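The core idea is easy to sketch: a page's score is the damped sum of the scores of the pages linking to it, with each linking page splitting its own score across its outbound links. The snippet below is a toy version of that calculation, not Google's production ranking code; the damping value comes from the original PageRank paper, while the graph and iteration count are made up for illustration:

```python
def pagerank(link_graph, damping=0.85, iterations=50):
    """Iteratively computes PageRank scores for a small link graph.

    link_graph maps each page to the list of pages it links to.
    damping is the probability that a surfer follows a link rather
    than jumping to a random page (0.85, as in the original paper).
    """
    pages = list(link_graph)
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_ranks = {}
        for page in pages:
            # A page's score is the damped sum of the scores of the
            # pages linking to it, each contribution divided by how
            # many outbound links the linking page has.
            incoming = sum(
                ranks[src] / len(targets)
                for src, targets in link_graph.items()
                if page in targets
            )
            new_ranks[page] = (1 - damping) / n + damping * incoming
        ranks = new_ranks
    return ranks

# A toy graph: every other page links to "home".
graph = {
    "home": ["about"],
    "about": ["home"],
    "blog": ["home", "about"],
    "contact": ["home"],
}
print(pagerank(graph))
```

In the toy graph, "home" collects links from every other page and ends up with the highest score, which is exactly the behaviour the "number of links" and "value of those links" factors describe.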