Sitemaps were traditionally used as a navigation aid to help visitors find particular content on a website. A standard HTML sitemap is a single page containing links to, and descriptions of, every page on the site, sometimes organized under headings.

The sitemap concept has matured considerably in the past few years, beginning when Google introduced Sitemaps 0.84 in June 2005 to let site owners publish lists of links for an entire website. That is when the idea progressed from on-site navigation to pushing your website URLs out to the wider digital community through server-based files, which come in a variety of formats including HTML, XML, ROR, plain text, and gzip-compressed (.gz) files. Support soon spread to the major search engines: in November 2006, Google, MSN, and Yahoo announced joint support for the Sitemaps protocol, the version was renamed Sitemaps 0.90, and the website Sitemaps.org was created. In April 2007, Ask and IBM announced support for Sitemaps, and at the same time Google, Yahoo, and MSN announced that their search robots would auto-discover sitemaps through a directive in robots.txt. With these sitemaps, search engines can index an entire website far more intelligently.
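For illustration, a minimal XML sitemap under the Sitemaps 0.90 protocol looks like the sketch below; the URL, date, and frequency values are placeholders, and only the <loc> element is actually required by the protocol:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- One <url> entry per page; only <loc> is required -->
        <loc>http://www.example.com/</loc>
        <lastmod>2007-04-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

Auto-discovery then works by adding a single line to the site's robots.txt that points crawlers at the file:

    Sitemap: http://www.example.com/sitemap.xml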
Do you know whether your website is optimized page by page, or only as a whole? Per-page optimization has been an ongoing theme in keeping a website marketable through the search engines. It also changes how you think about site flow: because visitors can enter on any page, every page needs a targeted call to action suited to that entry point.
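As a sketch of what per-page optimization means in practice, each page gets its own targeted title and meta description instead of site-wide boilerplate; the page and wording below are hypothetical:

    <!-- Hypothetical product page: title and description target this page's topic, not the whole site -->
    <title>Blue Widgets | Acme Widget Co.</title>
    <meta name="description" content="Compare Acme's blue widgets by size and price, with free shipping on orders over $50.">

The targeted call to action then sits in the page body, matched to why a searcher would land on that particular page.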
W3C compliance can have a positive impact on search engine rankings. It’s fair to conclude that if code works well for visitors and browsers, it can only benefit your efforts with the search engines. So why take the risk of running bad website code when good code helps visitors and search engine spiders alike?