Techniques For Websites On Google Search Engines Information Technology Essay


Today search engines have become part of our lives. From students to professors, researchers to technocrats, people use search engines to ask questions and resolve queries on any subject in the world. With more than 12 billion searches performed each month as of January 2009 (according to comScore), approximately 400 million web searches are performed every day. This means that on average more than 4,500 searches are performed every single second of every day. All these data show how deeply search engines have influenced our lives. No one would have imagined twenty years ago that search engines would play such an important role in our daily lives.

There are four or five major search engines, and each differs from the others in some small way. Unofficially, Google is the king of search engines because of the accuracy with which it answers a search query. This is what helped Google turn into a household word. Google owns approximately 65% of the search market share, and its search technology handles more than 2,900 searches per second. These highly accurate results were possible because Google's designers added link popularity as an additional parameter to the then-standard keyword search. This combination of keywords and link popularity produced more accurate results than keywords alone could have. Moreover, link popularity and keywords are just two of hundreds of different parameters a search engine uses to rank the relevancy of search results. Because of Google's large market share and people's confidence in its results, Google is now more or less a benchmark for other search engines. That is why Google is the search engine most websites want to be optimized for; but merely being findable on Google does not help. Eighty-four percent of searchers never go beyond the second page of search engine results. Let's try to understand the situation with an example.

Imagine the web as one giant market or bazaar, with many shops scattered all over it. Having a site on the first page of the search results, i.e., in the top 10, is the same as having a shop on the main road or near the entrance of the largest shopping mall in the market. To be outside the top 20, i.e., not in the first two search result pages, is like having a corner shop on the very outskirts of the market, in some interior spot that no one usually visits. The footfall in a main market is massive, with people passing by all the time. On the web, a space among the top ten positions on Google has just the same effect. The nearer to the number one position the website gets, the greater the chances that visitors can be turned into potential customers. That is why a top position in Google is considered a mark of quality by most web surfers.

Search engine optimization essentially involves making it easy for search engines to find the website and boosting the position of the website in their rankings. SEO is the science of customizing elements of the web site to achieve the best possible search engine ranking. That's really all there is to search engine optimization. However, as simple as it sounds, it is not. Ranking in any search results is affected by both the internal and external elements of a website, so all of these elements should be examined. Good SEO can be very difficult to achieve, and great SEO seems pretty well impossible at times. Search engines naturally change and mature, as do the technologies and principles that underpin SEO. For this reason, the SEO plan should be treated as a dynamic, changing document.

OPTIMIZATION TECHNIQUES

Website optimization can be divided into two categories.

On page optimization

Off page Optimization

On Page optimization: On-page SEO covers the techniques that one can use on the site itself. It is about what the webmaster wants to say about his website; it is the webmaster's opportunity to describe the content of the website. The on-page elements include page titles, page metadata, headings, body text, internal links, outbound links, and image alt tags, as well as a host of other assets.

Page Title: The very first variable element on a page is the <title> tag, which appears in the <head> section. The title of the web page is probably the most important of all the on-page elements. Browsers display the title tag in the top bar of the browser window and, in more recent browser versions, on the tabs used to move between different web pages in the same window. The title is therefore also an important navigational aid for browser users. Title tags not only tell a browser what text to display in the title bar; they are also very important for search engines. Search bots read page titles and use the information to determine what the pages are about. Theoretically, there is no limit to how long a title tag can be. In practice, however, both search engines and browsers truncate long titles to keep their display consistent.

W3C (World Wide Web Consortium) guidelines state that the length of the title tag should, ideally, be 64 characters (including spaces) or less.

Google truncates at 66 characters or at the last complete word before that, whichever is shorter.

Yahoo! truncates at exactly 120 characters, even if this means that the displayed title ends partway through a word.

Here are the recommendations for constructing the title tag.

Every page must have a title tag and each page's title tag should be different. There should, nonetheless, be consistency across the pages in how the title tags have been constructed.

The length of each title tag should generally not exceed 85 characters (including spaces) and should never exceed 120 characters.

The title should be written so that it truncates gracefully at exactly 66 characters and at exactly 75 characters (including spaces). That means it should break on complete words, in a phrase that scans well and uses proper grammar.

Where possible, the title tag should incorporate a short call to action (in a verb-noun construct), or at the very least a provocative statement that encourages the user to click on the link.

The keywords used in the title tag should all be repeated in the URL, the meta-keyword tag, heading tags, and page body text. Synonyms of keywords in the title tag should also appear in the page body text.

Research has shown that modest and appropriate use of capital letters makes links stand out, so people are more likely to click on them. One can also notice the use of the ampersand (&) instead of the word "and" in most title tags on existing websites. This is because words like "the," "an," and "and" are stop words and are ignored by Google for most searches. Using them takes up valuable space in the tag, so they should be avoided.
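Putting these recommendations together, a homepage title tag might look like the following sketch (the site and wording are hypothetical). Note the verb-noun call to action, the modest capitalization, and the fact that it reads as a complete phrase well inside the truncation limits:

<title>Buy Handmade Leather Bags Online | Free Delivery in India</title>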

Meta description: The meta-description tag is placed between the <head> tags in the HTML page. Its purpose is to provide a brief synopsis of page content, which builds on the headline included in the title tags. The correct syntax is as follows:

<meta name="description" content="put your description in here." />

When a person searches for something on the internet, the search engine returns a long list of web addresses and a bit of introductory text about each site. The text in the meta-description tag is what the person sees as the introductory text for the website. If a meta description is not included, the text will be pulled from somewhere on the page and might not make any sense. It is therefore best to add a meta description: a one- or two-sentence description of the web site. It also acts as a sales message to potential visitors. The catch with meta-description tags is that they work differently in different search engines.

For example, Google gives very little weight to meta descriptions. Instead, the Google search engine looks at the text on the page itself, and it does not necessarily display the meta-description text either. What it shows is the content surrounding the instance of the keyword on your site. Google calls this a snippet.

Meta keywords: The meta-keyword tag (or tags) is also placed between the <head> tags in the HTML page and was intended solely for use by search engines. The correct syntax is as follows:

<meta name="keywords" content="keyword1, keyword2, keywordn" />

The idea behind meta keywords is that the keyword tag provides information to the search engine about the contents of a web page. Because of large-scale misuse by webmasters, the tag has lost its importance. Some search engines, like Yahoo!, still use it, but many, like Google, don't. Still, the meta-keyword tag may come in handy for other search engines.

Following are the tips for writing Meta keywords:

Separate each phrase with either a comma or a space but not both.

Use lower case for all keywords and pluralize phrases where possible.

Include capitalized or non-plural equivalents of the selected keywords. Exclude all page-specific phrases from the homepage tag.

Use the most relevant site-wide phrases to begin the section and category page meta-keyword tags, and then also include the most relevant deep phrases.

On content pages, choose keywords that are relevant to that page only.

Never use more than 15 words in the meta-keyword tag, and ideally limit it to 12 words.

Ensure that all of the key phrases in the keyword tag appear at least once more somewhere in the on-page elements (i.e., title tag, description tag, headings, alt tags, and body text). If it is not possible, seriously consider removing the keyword from the tag.

Many websites come up with one great list of keywords and then repeat it on every single page, which is not the right way. Keywords should be specific to each page. A keyword should be used only if that word also appears in the page's body text; otherwise search engines may penalize the site for inappropriate use of keywords.
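As a minimal sketch (the page and phrases are hypothetical), here is how the title, description, and keyword tags of a single product page might fit together, with every phrase in the keyword tag echoed in the other on-page elements as recommended above:

<head>

<title>Buy Mobile Headphones Online | Acme Store</title>

<meta name="description" content="Buy mobile headphones online at Acme Store. Free shipping on all headphones and mobile accessories." />

<meta name="keywords" content="mobile headphones, buy mobile headphones online, mobile accessories" />

</head>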

Heading tag: Header tags are the attributes that set up the different levels of headings and subheadings on a web site. There can be as many as six different levels of headings, though most web sites use only about four. These headers are useful in search engine optimization because, when keywords are put into a heading, the search engine can infer that those keywords are important enough to appear in the heading text of the webpage. Search engines pay more attention to them, weighing them more heavily than keywords in body text. Header tags should be placed immediately before the body text they introduce, and the text of the header goes between the opening and closing tags. Regular HTML supports up to six levels of heading tags:

<h1>Your heading text</h1>

<h2>First subheading</h2>

<h3>Second subheading</h3>

<h4>Third subheading</h4>

And so on….

Body text: Body text is another place where inclusion of keywords is possible. There is no hard-and-fast rule on the number of times that the keywords should appear on a page. One can use these words regularly in the text, but should not use them out of context or just as a ploy to improve the search engine standings. There are also tags that indicate special formatting in text. Those tags are:

<b>Bold</b>

<i>Italics</i>

<strong>Strongly Emphasized</strong>

<em>Emphasis</em>

<li>List Item</li>

Each of these tags indicates special formatting for the word or phrase between the opening and closing tags, and that special emphasis makes a search engine crawler take notice of those words. Therefore, whenever possible, keywords should be used within these tags. This helps both search engines and human readers identify the key text. Keywords should be used where appropriate; they should not be stuffed into a site simply to improve its search engine rankings.
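A short sketch of this in practice (the copy and key phrase are hypothetical); the keyword sits naturally inside <strong> and <em> tags rather than being stuffed:

<h2>Mobile Headphones</h2>

<p>Our range of <strong>mobile headphones</strong> covers every budget, and each pair of <em>headphones</em> ships with a one-year warranty.</p>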

Image alt tag: Alt tags are the alternative text that's displayed on the site while a graphic is loading or if it doesn't load at all. The original purpose of alt tags was twofold:

To allow text-browser users to "see" images and to help partially sighted or blind users to "read" images using, for example, screen-reader software.

These tags are another place where one might want to include the keywords to help boost keyword frequency and improve search engine rankings.

The alt tag describes the picture used in the webpage, and some browsers display it when the user hovers the mouse over the image. If, for technical reasons, the image does not load, the alt text is displayed in its place. Another use of the alt tag is in image search. Many people today search for images on Google, so if all the images are tagged properly, they may surface at the top of image search results, which ultimately brings users to the webpage. Following are the suggestions for the use of alt tags:

Every image on a given website must have an alt tag, with the exception of spacer images (alt text on these is very irritating for disabled users).

The alt tag should faithfully describe the image concerned; the opportunity to include the key phrases should not be lost, but do not simply stuff the tag with keywords.
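For example (the file names and product are hypothetical), a product image and a spacer image would be tagged like this:

<img src="images/intex-headphones.jpg" alt="Intex wired mobile headphones, black" />

<!-- spacer images are the exception: leave the alt text empty -->

<img src="images/spacer.gif" alt="" />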

Anchor text: Anchor text is the visible text of a hyperlink on a web page. When a search engine looks at the web page, it automatically follows all the links on the page. If those links (or even a large portion of them) are text-based links, so much the better, because then the search engine sees not just the link to another page but also the keywords.

A general strategy for anchor text is to vary it according to the links associated with it. When the same anchor text is repeated throughout the webpage, it begins to lose its effectiveness and can in fact cause a search engine crawler to rank the website lower in the SERPs. It is much more effective to use multiple keywords and phrases as anchor text on a web page. Varying the anchor text also helps maintain consistency in the keywords and phrases in use.
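As a sketch (the URLs and phrases are hypothetical), varied keyword-rich anchors pointing to related pages work better than repeating one phrase or using a bare "click here":

<a href="/products/headphones">mobile headphones</a>

<a href="/products/headphones/wireless">wireless headphones for mobiles</a>

<a href="/guides/headphone-care">caring for your headphones</a>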

Age of site: The reason this makes a valuable signal is that a site that has been around for a long time must be doing something right in the eyes of the community it serves. Spammy or poor-quality sites ultimately don't do as well, so they don't last as long. A website that is three or four years old therefore gets preferred over one that is six months old.

Domain name: When a keyword is contained in a domain name or a subdomain, there is a good chance that the keyword is highly relevant to the content of the site. A website with a keyword in the domain name may rank slightly higher than another website that does not use the keyword in the domain (all other things being equal). Google actually reads URLs, looking for keywords in them. For example, if there is a web site with the domain name paav-bhaji.com and someone searches on Google for paav bhaji, Google sees paav-bhaji.com as a match. Because a dash appears between the two words, Google recognizes the words in the domain name. Google also interprets periods and slashes as word separators. Here are a few tips for selecting a domain name.

A domain name should be short, easy to spell, and easy to remember.

The name of the website should be based on the business the website is in, not simply on the name of the company (e.g., XYZ.com).

In almost all cases, the .com version of a domain name is most preferred.

URL Nomenclature: The URL (Uniform Resource Locator) is the literal address of a web site on the Internet. It is the address that site visitors type into their browser's address bar to reach a website or, in some cases, the link those users click to reach it.

As discussed earlier, the URL should be as descriptive as possible without being long and hard to remember. So a URL of www.atopkeyword.com is much more effective than a URL of www.partofyourcompanynameonline.com.

But there is more to a URL than just the base name. A website's structure probably has several levels of pages and files, so the base URL will include a path to additional pages and folders. Unfortunately, if a website has hundreds of pages or dynamic content, it could end up with a URL that looks like this:

http://www.yoursite.com/o/ASIN/B00023K9TC/ref=s9_asin_title_11966_p/10289462962020168?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center

There are a couple of problems with such a URL. The first is that visitors will not be able to remember it; the second is that a chance to use valuable keywords is lost, because the URL is an undecipherable collection of letters and numbers.

A website's URLs should therefore be as short and descriptive as possible. A user should be able to understand which part of the website he or she is currently accessing.

For example:

http://www.yoursite.com/products/Mobiles/Accessories/id=507846_Intex_headphones

That URL is much shorter, and it's much more memorable than the longer one. Individuals might still have difficulty remembering it, but they are more likely to remember it than one that is full of random letters and numbers. This illustrates the two key bits of advice about URL nomenclature:

URLs should be descriptive without being overly long, and they should give visitors a good idea of what to expect on the page.

This method also opens up the potential for including keywords in the URL, which helps not only as crawlers look at the site but also when the URL is posted as a link on other web sites or mailing lists. Such URLs can be crawled by search engine crawlers, and they are easy for visitors to use, to remember, and to understand.

Domain expiry date: The length of time for which the domain name is registered can also affect the search engine ranking. Many hackers use throwaway domains, i.e., domain names registered for no more than a year, because they usually don't even get to use the domain for a full year before being shut down. For this reason some search engines have implemented ranking criteria that give priority to domains registered for longer periods. A longer registration also shows a commitment to maintaining the web site.

Hosting Location: Location is one of the most important issues a company faces when hosting a domain. For example, if the company is based in India but has purchased a domain hosted on a server in the U.S., its search engine rankings will suffer. Geographically, search engine crawlers will read the site as contradicting its location. Because many search engines serve up results with some element of geographical location included, this contradiction can be enough to affect the ranking. The website must therefore be registered and/or hosted in the appropriate country. If the website is trying to appeal to a U.S. audience, make sure the hosting partner's servers are based in the United States. Likewise, if a website is after the Indian market, it should be hosted in India.

Google knows where the servers are and will automatically favor the web site in that country's search results, even though that country may not be the website's target market.

Outbound links: A website should also consider linking out (where appropriate) to other high-quality websites on a similar topic or theme to its own. Generally, it should target sites with a PageRank of 7 (PR7) or better for best effect. Keyword-rich link text can be used to make the connection.

Google has always tracked "related sites." One can see what Google pairs a site with by typing site:yourdomain.com into the query box and then clicking the "similar sites" link in the result. The list will contain many miscellaneous websites that have no link with the queried website, but the key point is that some of them are chosen based on the outbound links from the website.

Off page Optimization: Off-page SEO is the heart and soul of a successful optimization campaign and comprises all the tools and techniques needed to get other site owners to rate a website. All of these techniques together deliver a large quantity of high-quality links to a website from important, reliable, and relevant sources.

Inbound Links: Inbound link-building activities fall into two groups: active and passive. The former involves webmasters actively contacting other sites to submit or request a link. The latter involves link baiting, where webmasters create attractive content and invite others (in a more passive way) to link back to it.

To ensure source importance, one should target sites established at least three years ago (and preferably more than seven) that carry a homepage PageRank of at least PR3 (and ideally PR5 or more). To ensure source relevance, one should target pages, where possible, that have similar content or operate in the same niche.

Websites

www.qlew.org

www.123exchangelinks.com

www.linkmarket.net

www.linkmetro.com

www.linkdiy.com*

www.links4trade.com*

www.linkexchanged.com*

*paid service

Reciprocal link exchange: On the web it has been common practice for years to exchange links between webmasters. Webpages like linkpartners.html or web-resources.html are examples of this practice. Some webmasters go further and build huge link directories off the back of their sites, with outbound links to literally hundreds of partners. Some of these directories are even built using dedicated link-building scripts that semi-automate the process.

However, Google deflates the value of such links in computing overall rankings, and therefore they are not as useful as they once were.

To send reciprocal link exchange requests, the following approach (and etiquette) can be followed:

Only seek links from related sites.

Begin by putting a link to their site on your own site first.

Make sure that the note is well written and personalized to the recipient.

Be specific in the note about what link - and keyword-rich link text - you would like them to use.

Directory submission: Web directories are a brave attempt to create a human-maintained taxonomy or classification of the entire web, and their numbers have grown dramatically over the last four or five years. Directories can be grouped into five categories.

Paid directories, which charge webmasters to list their sites or carry their pay-per-click adverts, either on a one-off basis or using a recurring monthly or annual fee.

Reciprocal directories, which will list sites but only in exchange for a reciprocal backlink.

Free directories, which are prepared to list your site for free.

Bidding directories, a relatively new type of directory service, where webmasters bid against each other to maintain key positions in the directory for their chosen keywords.

Deep link directories (whether paid, reciprocal, bidding, or free), which are prepared to list one or more deep links direct to the inner pages of your website.

Directories

www.dmoz.org

www.lii.org

www.webworldindex.com

www.directorydelta.com

www.jayde.com

www.iozoo.com

www.tsection.com

www.tikifind.com

www.premiumsites.org

www.webmasterhole.info

www.topdot.org

www.urlchief.com

www.i-searches.com

www.directoryworld.net

www.linkdirec.com

Search Engine Submission

www.exactseek.com

www.gigablast.com

www.entireweb.com

www.scrubtheweb.com

www.ulyseek.com

www.amfibi.com

www.searchhippo.com

Search engine submission: Search engine submission is the process of submitting the site URL to the thousands of search engines across the web so that they can crawl it. To find other search engines, search on Google for the words "submit URL," "submit site," and "add to listing" in turn. There are many such engines; a list of top search engine submission sites is given in the table.

Search engine submission software should not be used. One can use the same description as used for the directory submissions. Always place the homepage URL in the submission box; the engine will find and crawl the rest of the pages from there.

Forum participation: Forum participation is one of the most interesting areas of active link building. In an active forum, many posters have beautifully crafted signatures showing the name of their business (often linked using keyword-rich anchor text). The main reason for this is that search engines index these links, and in some cases both the forum posts and the profile pages of users - which also carry links - acquire a decent PageRank from Google. On many bulletin boards the syntax for a signature file is as follows:

Ayush Singh, Marketing Executive

[url=http://www.yourdomain.com]keyword anchor text[/url]

Forum participation not only builds inbound links but also spreads awareness of the website and may bring direct traffic to it. Actively participating in various forums helps build the brand, because people come to associate the poster with the website.

Following are a few tips for using forums effectively to promote a website.

Research areas of interest related to the website through a search on Google.

Identify no more than three or four forums to join.

Sign up and create a profile page that contains a link to the website and a signature file containing keyword-rich anchor text.

Begin posting answers to questions that you are qualified to answer, and do not drag up old posts from the archives.

Try to tie the postings back to advice on the website where possible.

Articles Submissions repository

www.articlesfactory.com

www.articledashboard.com

http://ezinearticles.com

www.articleonramp.com

www.ideamarketers.com

www.goarticles.com

Articles, newsletters, and e-zines: The single best free way to promote a site is article submission. There are literally thousands of e-zines (electronic magazines) and content sites across the web, covering all manner of specialist subjects. Many of these are run by hobbyists and poorly paid webmasters with very little time on their hands. These sites can therefore be reached by creating high-quality content (1,000-1,500 words) covering a particular topic or answering a specific question, sharing hard-earned knowledge and expertise. Articles can be posted on up to five high-quality free article repositories, together with a synopsis and a resource box ("about the author") that links back to the website or business blog. Webmasters and editors find and "syndicate" the content by adding it to their sites. Because the resource box must be included in exchange for the free content, the whole process builds excellent backlinks to a website.

Online public relations: Online PR is very similar to article submission. As with articles, one writes a press release, publishes it to a PR repository, and hopes that other sites will pick it up and use it. Where online PR differs (and is thus arguably more powerful) is that it targets the mainstream news media more precisely. If the press release is well written, it may find its way into major media networks. Some PR sites even offer the opportunity to have the release fed into Google News and Yahoo! News. A single PR7 link from a BBC webpage could be the difference between a top 30 and a top 10 ranking for one of the key search phrases. A good press release is not a boring advert for the business but a real story of genuine interest to the audience. Press releases should be submitted to one or more of the main online PR repositories.

Press release repository

www.businesswire.com

www.prnewswire.com

www.prweb.com

www.virtualpressoffice.com

www.prlog.org

Tagging: Portable bookmarking (e.g., Google Bookmarks) and social tagging (e.g., del.icio.us) allow users to maintain and categorize lists of the pages and sites they like most and to remember them for the future. In essence, this is a new voting system that challenges the current proxy of relevance used by search engines: inbound links. There are now a large number of social bookmarking and tagging services, most of which permit members to share their tags with other users and the public at large. In addition, one can subscribe to other people's tags.

Social Bookmarking Websites

digg.com

Technorati.com

del.icio.us

Propeller.com

StumbleUpon.com

The smart webmaster will make it easy for internet users to tag or bookmark their pages, and will ensure that their content includes controversial, topical, newsworthy, or funny material that is likely to attract many tags from surfers.
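One simple way of making this easy is to add "bookmark this page" links that pre-fill a service's submission form. The sketch below follows the submit-URL patterns these services have historically used; the exact parameters are an assumption and should be checked against each service's current documentation (the page URL and title are hypothetical):

<!-- parameters are illustrative; verify against the service's current docs -->

<a href="http://digg.com/submit?url=http://www.yoursite.com/article.html&title=Article+Title">Digg this</a>

<a href="http://del.icio.us/post?url=http://www.yoursite.com/article.html&title=Article+Title">Bookmark on del.icio.us</a>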

SEO Tools:

The Alexa Toolbar: Alexa is a company owned by Amazon.com. It's been around a long time, and millions of people around the world use the Alexa Toolbar. Every time someone uses the toolbar to visit a Web site, the toolbar sends the URL of the page to Alexa, allowing the system to create an enormous database of site visits.

The Alexa Toolbar provides traffic information: it can be used to see quickly how popular a site is and to get detailed traffic analysis, such as an estimate of the percentage of internet users who visit the site each month.

The Alexa Toolbar is available for all major browsers and provides a rank for every website. A website ranked 435 has a good ranking, whereas a website ranked 135,670 has to work hard to generate traffic. Alexa cannot be used as the sole parameter, though, because its rankings are based only on Alexa Toolbar users, who are a tiny fraction of total internet users worldwide.

The Google Toolbar: The Google Toolbar is the most important toolbar for any SEO because it shows the PageRank of the websites visited. This is the only official way to learn a website's PageRank. Google provides other tools too. The toolbar is freely available for download from google.com and provides the following useful features:

A way to search Google without going to www.google.com first.

A quick view of the Google PageRank.

A quick way to see if a Web page is already indexed by Google.

A quick way to see some of the pages linking to a Web page.

Keywords Finder: Keywords are the most important part of any SEO plan. Although Google provides a free keyword suggestion tool, there are other free sources that may come in handy if someone wants to target other major websites. A keyword finder helps to find keywords related to a particular domain.

http://keywordfinder.org/

http://www.wordstream.com/keyword-niche-finder/

Crawlers: Some utilities will read a web page and display its content the way a search engine is likely to see it. When looking at a competitor's pages, one can sometimes see things that aren't visible to the site visitor but have been placed on the page for the benefit of the search engines. A webmaster can also view his own webpages this way to check that all the links are readable by the search engines.

Here are a couple of these utilities:

Sim Spider: www.searchengineworld.com/cgi-bin/sim_spider.cgi

Delorie: www.delorie.com/web/ses.cgi

Keyword Density: As discussed earlier, keywords are the most important part of any SEO plan, and proper utilization of keywords is just as necessary. If a webpage contains more keywords than required, the result is a negative impact on ranking. It is therefore sometimes useful to check pages for keyword density, and many online tools are available to help do so, such as the following:

Search Engine World's Keyword Density Analyzer: www.searchengineworld.com/cgi-bin/kwda.cgi

KeywordDensity.com: www.keyworddensity.com

Spannerworks Keyword Density Analyzer: www.spannerworks.com/seotoolkit/seo-toolkit/keyword-density-analyzer/

Validating HTML: The W3C is the self-appointed guardian of HTML. It is an influential organization, and a friend of theirs is a friend of Google's; third-party accreditation and recognition is massively important to Google.

If webmasters follow W3C's validation instructions and pass, then that's a big plus point for the website.

Webmasters should take validation seriously, but with a grain of salt; sometimes it's a matter of form over function. If the HTML errors can be fixed, this should be done at the earliest opportunity. It has been worth a PageRank jump of at least one place for many a web site. Following is the link to check HTML validity.

http://validator.w3.org
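As a small illustration (the markup is hypothetical), here is a snippet that fails XHTML validation in two common ways, an unclosed paragraph and an image with no alt attribute, followed by a version that passes:

<!-- fails validation: unclosed <p>, missing alt -->

<p>Welcome to our store

<img src="logo.gif">

<!-- passes validation -->

<p>Welcome to our store</p>

<img src="logo.gif" alt="Acme Store logo" />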

Website owner's Address & Location: WHOIS gives information about a website: the name and address of the owner and other vital details. This can come in handy if webmasters want to see what information Google can access about their website.

http://www.whois.net

Sitemap tools: If a website does not have a site map, this must be addressed, no matter how small the site is. Even if the web site was created using WYSIWYG (what you see is what you get) software, a map can often be created automatically by the program and added as a page. If developers created the site, webmasters can ask them to map it out, as this is information they should have produced when they built the site in the first place. Finally, one can get a sitemap generated free, in return for advertising the service, at:

http://www.xml-sitemaps.com
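A minimal sitemap in the XML format defined by the sitemaps.org protocol looks like this (the URLs and date are hypothetical); only <loc> is required per URL, while <lastmod>, <changefreq>, and <priority> are optional hints:

<?xml version="1.0" encoding="UTF-8"?>

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">

<url>

<loc>http://www.yoursite.com/</loc>

<lastmod>2015-11-30</lastmod>

<changefreq>weekly</changefreq>

<priority>1.0</priority>

</url>

<url>

<loc>http://www.yoursite.com/products/headphones</loc>

</url>

</urlset>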

Check Inbound & outbound links: A backlink checker tells how many sites are linking to a particular website. Quality backlinks are essential to increase search engine rankings, and for the same reason it is necessary to find the backlinks to the website as well as to competitor sites. A blend of deep links and home page links proves more useful in enhancing the rankings. Link building services can be relied on to obtain quality links.

http://www.backlinkwatch.com/

Similarity Checker: Search engines ignore and even penalize sites that contain duplicate content. A similarity checker helps to maintain unique content on the site.

http://www.studiokraft.com/index.php/action/page-similarity-checker

http://www.webconfs.com/similar-page-checker.php

Indexed Pages Tool: The tool lists the number of pages indexed by the major search engines.

http://www.seomoz.org/toolbox/indexed

Search Engine Ranking Checker: Checks the position of the site in various search engines for each search term. Automation saves SEO professionals much of the time spent ascertaining the ranking of a website.

Site Comparison Tool: What better way to get an indication of how much search engine optimization a website needs than by comparing it with one of its competitors' websites?

http://www.justsearching.co.uk/tools/site-comparison/

Unethical SEO practices

Domain cloaking: On the surface, domain cloaking sounds like a great idea. The concept is to show users a pretty web site that meets their needs while showing search engines a highly optimized page that would be almost useless to users. In other words, it's a slimy trick for gaining search engine rankings while still giving users a nice site to look at.

It starts with content cloaking, accomplished by creating web-site code that can detect and differentiate a crawler from a site user. When the crawler enters the site, it is redirected to another web site that has been optimized for high search engine results. The problem with trying to gain higher rankings this way is that many search engines can now spot it. As soon as they find that a web page uses such a cloaking method, the page is delisted from the search index and excluded from the results.

Duplicate content: When a website is built, the content often presents one of the greatest challenges, especially for a site with hundreds of pages. Many people opt to purchase bits of content, or even scrape content from other web sites, to help populate their own. These shortcuts can cause real issues with search engines. Say a web site is about some form of marketing. It's very easy to surf around the web and find hundreds (or even thousands) of web sites offering free, permission-granted content to include on the web site. The problem is that every other person or company creating a web site could be doing the same thing. A single article on a topic then appears on hundreds of web sites, and users searching for the topic find nothing new when every site has the same article.

To combat this type of content generation, some search engines now include in their search algorithms a method of measuring how fresh site content is. If the crawler examines the site and finds that much of its content also appears on hundreds of other web sites, the site runs the risk of either ranking low or being delisted from the search engine's index.

Hidden pages: One last SEO issue concerns the damage that hidden pages can inflict on an SEO strategy. These are pages in the web site that are visible only to a search crawler. Hidden pages can also lead to related issues such as hidden keywords and hidden links. Keywords and links help to boost search rankings, so many people try to capitalize on this by hiding them within the body of a web page, sometimes in a font color that perfectly matches the site background.

There's no way around the issue of hidden pages. If a web site contains hidden pages, it's just a matter of time before the crawler figures out that the content is part of a hidden SEO strategy.

Once that's determined by the crawler, the site ranking will drop drastically.
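For illustration only (do not use this), the following is the sort of hidden text a crawler learns to detect: a keyword-stuffed paragraph that is invisible to visitors because its font color matches the background:

<body style="background-color: #ffffff">

<!-- invisible to visitors, but visible to crawlers -->

<p style="color: #ffffff">cheap headphones cheap headphones cheap headphones</p>

</body>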

Keyword Stuffing: Keyword stuffing is the practice of loading web pages with keywords in an effort to artificially improve ranking in search engine results. This could mean using a specific keyword or key phrase a dozen times, or hundreds of times.

Temporarily, this might improve the page ranking. However, if it does, the improvement won't last, because when the search engine crawler examines the site, it will find the multiple keyword uses.

Because search engine crawlers use an algorithm to determine whether a keyword is used a reasonable number of times on a site, they will discover very quickly that the site can't support the number of times the keyword or key phrase is used. The result is that the site is either dropped deeper in the rankings or, as happens in most cases, removed completely from them.

Doorway pages: Doorway pages are often confused with landing pages, but they are not even close in their functions. Landing pages are designed to be rich in content, and visitors usually come to these pages through PPC ads. Doorway pages, on the other hand, are created specifically for search engines with the intent of increasing search engine results rankings.

Doorway pages usually use some form of redirection so that when visitors click through the link in the search engine results and land on the page, they are immediately taken to another page. This is accomplished with a fast meta refresh, JavaScript, or server-side redirection. The meta refresh is used less often now than in the past, because many search engines penalize web sites that use such tactics.
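The fast meta refresh mentioned above is a single tag in the <head> section; a zero-second delay sends the visitor straight to another page (the destination URL is hypothetical), which is why search engines treat it as a red flag:

<meta http-equiv="refresh" content="0;url=http://www.yoursite.com/real-page.html" />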

In place of the meta refresh, some web sites have found clever ways to trick visitors into clicking a link that takes them forward to the web site they're being drawn to. Some web sites have also designed content-rich doorways: doorway pages that include some element of content as well as a basic navigational structure consistent with the rest of the web site.

These pages, like other doorway pages, are still designed to draw high quantities of visitors.

Page jacking: Page jacking is a method of search engine spam where whole pages - and even whole web sites - are copied for the purpose of increasing search ranking and traffic for another site.

In one example of page jacking, a person might copy a whole site such as Microsoft's. They then cloak the copied site, but it still appears in search listings. When unsuspecting users click through the listing for Microsoft, they are taken not to the Microsoft page but to another page the hijacker funnels them to. Page jacking is not only a good way to get a web site delisted from search engine results; it is also a crime that can result in a stiff fine and possibly jail time, and it carries trademark and copyright infringement issues as well.

Bait and switch: Bait and switch in SEO is the practice of creating an optimized web page specifically for search engines with the intent of obtaining good rankings. When those rankings are obtained, the company replaces the optimized site with one that's less optimized and more normal. The result is nearly instant traffic when the site has been switched.

Bait and switch does have one downfall. Most search engine crawlers revisit a site several times a month. And when the crawler revisits a bait-and-switch site, it will see that the content of the site has changed, and will adjust search rankings accordingly. In other words, the person who set up the bait and switch put a lot of time into a temporary optimization solution.

FFAs and IBLNs: FFA stands for free-for-all link farms, and IBLN stands for independent back-linking networks. In its Webmaster Guidelines, Google says:

Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web as your own ranking may be affected adversely by those links.

In practice, Google identifies "bad neighborhoods" by devaluing backlinks from the same IP subnet. Where a site is simply a link farm (listing loads of links to other sites in exchange for links back or money), Google will eventually identify it as a bad neighborhood and deflate the value of its links in the index. IBLNs are networks of sites that all directly or indirectly link back to one site in such a way as to promote it up the search engine rankings. The way IBLNs get around Google's IP monitoring is by using a completely different web-hosting plan for every site that links directly back to the main site. This is not foolproof either; if detected, it can lead to Google simply wiping all the direct referrers from its index or, worse, dropping the entire IBLN, including the main site one was trying to optimize.

10 steps to optimize a website:

The name of the website should reflect the business the website is in; do not be tempted to use the name of the company instead.

The website should be hosted in the country where its target market is.

A list of keywords must be prepared, keeping in mind the words people would use to search. This is the most important phase, because the website is optimized on the basis of these keywords.

Use these keywords properly in the title, meta tags, body text, heading tags, etc.

Each page should have its own unique keywords. Never repeat the same keywords on every page.

Use an HTML validation tool to check the validity of the website's code and correct any mistakes found.

Register with Google Analytics to keep track of visitors and other traffic trends. A link that is visited often can be used to promote the website, and appropriate changes should be made to links that do not generate any traffic.

Social networking sites such as Facebook and Twitter can be used to generate traffic and to spread the word about the website.

Use relevant forums, blogs, and portals to share links to the website. Write articles and press releases to spread information about the website.

Never use unethical SEO techniques: they may boost rankings temporarily, but in the long run they will come under Google's radar, which will eventually lead to the website being banned.