SEO stands for "Search Engine Optimization". It is the set of processes by which a website or web page is built or optimized to improve its visibility and help it appear at the top of SERPs (Search Engine Result Pages).
Google Analytics is a free web analytics tool first rolled out in late 2005 and made generally available to users in August 2006. It sits between the website and its visitors and offers a complete overview of visitor statistics, describing general website activity: page views, site visits, bounce rate, average time spent on the site or its pages, traffic sources, visitor location, and so on. It is also commonly used for tracking AdWords campaigns.
Webmaster Tools is a free service provided by Google that gives a complete report on indexing data, crawl errors, backlink information, search queries, website malware issues, and CTR, and also lets you submit an XML sitemap. Basically, it acts as a mediator between the website and the search engine, providing a complete overview of data, issues, and other queries.
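As a quick illustration, the kind of XML sitemap submitted through Webmaster Tools follows the sitemaps.org protocol and can be generated in a few lines of Python. This is only a minimal sketch; the URLs are placeholders, and a real sitemap would usually also carry `lastmod` or `priority` entries:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        # <loc> holds the absolute URL of one page on the site.
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Placeholder pages for illustration only.
print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

The resulting string is what you would save as `sitemap.xml` and submit through the Webmaster Tools interface.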
A blog is information or discussion published on a website or the World Wide Web, made up of distinct entries called posts. It is more individual than an article or a press release: personal in both style and the ideas and information it contains, and written much the way you might talk to your readers. It is also called a web diary or online diary.
☛ Search engines rank and place websites on the web according to their ranking criteria.
☛ Any site can be made search engine friendly, which allows search engines to crawl it and display it prominently according to pre-defined ranking criteria.
☛ Once a site is submitted to a search engine, its information is collected and the links on each page are followed.
☛ When the site is visited, the linked pages are crawled in turn, and the links on those pages are followed as well.
☛ The more websites that link to a site, the more easily the spider (information gatherer) will find and display it.
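The link-following step described above can be sketched with Python's standard `html.parser` module. This is a toy, not a real crawler: the page is a hardcoded string standing in for a document the spider has already fetched over HTTP:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag: the links a spider would follow next."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Stand-in for a fetched page (a real spider would download it first).
page = '<html><body><a href="/about">About</a> <a href="/contact">Contact</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about', '/contact']
```

Each extracted link is what the spider would queue up to visit next, which is how it moves from page to page.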
☛ Spiders are the information-gathering tools (bots) that collect information about websites on the Internet.
☛ These programs gather the content of a website and store it for later processing.
☛ There are two ways a spider can find and search a website:
☛ The search engine can discover the website directly through its meta keywords and other meta data.
☛ The site owner can also submit information about the website to the search engine, using the related data to make the site appear in its results.
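To show what "meta keywords or data" look like to a spider, here is a small sketch, again using the stdlib `html.parser`, that reads the `keywords` and `description` meta tags from a page head. The HTML is an invented example:

```python
from html.parser import HTMLParser

class MetaReader(HTMLParser):
    """Read the meta keywords/description a spider can use to classify a page."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name") in ("keywords", "description"):
                self.meta[d["name"]] = d.get("content", "")

# Invented page head for illustration.
head = ('<head><meta name="keywords" content="seo, crawling, index">'
        '<meta name="description" content="A short page summary."></head>')
reader = MetaReader()
reader.feed(head)
print(reader.meta)
```

Modern engines give meta keywords little ranking weight, but the description is still commonly used for the snippet shown in results.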
☛ Search engines use a few basic components and features to make results easier for users to find.
☛ A search engine performs three basic actions:
☛ Gathering information: spiders crawl the web and collect page content, listing it for indexing by the search engine.
☛ Analyzing information: the gathered content is processed and ranked relative to other pages and queries.
☛ Displaying information: the results are presented to the user in the right order.
☛ The query program receives the search request, compares it with the index, and shows the matching results accordingly.
☛ Some engines also maintain a structured directory of topics, which organizes the data crawled from the web.
☛ Web portal sites offer both search engines and directories that find information and make it accessible to all.
☛ These basic components and features are essential for better optimization for search engines.
☛ The components work together to provide the information the search engine needs to perform well.
Internet search engines have three parts through which these functions are performed:
☛ A spider, also known as a crawler or bot, travels from one page to another and reads each page it visits.
☛ It fetches the pages that need to be searchable on the web and displayed in the search engine.
☛ It reads a website by following the hyperlinks on all its pages, moving from one page to the next.
☛ A catalog, or index, is created by programs that compile the pages read from the websites.
☛ A query program receives the search request and compares it with the index to locate the results.
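The three parts above can be sketched as a toy pipeline: a dict of pages stands in for what the spider gathered, an inverted index plays the catalog, and `search` plays the query program. The URLs and text are invented placeholders, and real engines rank results rather than just intersecting word matches:

```python
# Stand-in for content the spider has already gathered (placeholder URLs/text).
pages = {
    "https://example.com/seo": "seo improves search engine visibility",
    "https://example.com/ga":  "google analytics reports visitor statistics",
}

# Catalog/index: map each word to the set of pages containing it.
index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

def search(query):
    """Query program: compare the request with the index, return pages matching every word."""
    results = None
    for word in query.lower().split():
        matches = index.get(word, set())
        results = matches if results is None else results & matches
    return sorted(results or [])

print(search("search visibility"))  # ['https://example.com/seo']
```

Intersecting the per-word page sets is the simplest form of "comparing the entry with the index"; production engines layer ranking signals on top of this lookup.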
☛ A search engine and a directory are structured differently, and submission to each is done separately.
☛ Submission can take time before instances of the website are listed on the search engines.
☛ A submission only appears in an engine if it is made correctly, which gives the site better visibility.
☛ Engines add sites according to the submissions they receive and the estimated quality of the sources.
☛ Submitted sites are placed in a queue, and the engines and directories then check the site's content.