Techniques For Websites On Google Search Engine Information Technology Essay

Published: November 30, 2015 Words: 7124

Have a query? Go to any search engine, type a word or phrase into the search box, and click the search button. Wait a few seconds, and links to thousands (or millions) of pages will appear. With more than 12 billion searches being performed each month as of January 2009 (according to comScore), approximately 400 million web searches are performed every day. This means that, on average, more than 4,500 searches are performed every single second of every day. In addition to the existing competition, users now expect any search engine to return responses to their queries in less than one second. All of this shows how deeply search engines have become woven into our lives. No one would have imagined twenty years ago that search engines would play such an important role in our daily routines.

There are four or five major search engines, and each differs from the others in some small way. Unofficially, Google is the king of search engines because of the accuracy with which it returns results for a search query. This is what helped Google turn into a household word. Google owns approximately 65% of the search market share, and its search technology handles more than 2,900 searches per second. This accuracy was achieved when the Google designers combined keyword searches with link popularity: the combination of keywords and the popularity of links to those pages produced more accurate results than keywords alone could have. It is also important to understand that link popularity and keywords are just two of hundreds of different criteria that search engines can use in ranking the relevancy of web pages. Due to its large market share and people's confidence in its results, Google is now more or less the benchmark for other search engines. That is why Google is the search engine most websites want to be optimized for. But merely showing up somewhere in Google's results does not help: eighty-four percent of searchers never go beyond the second page of search engine results [1]. Let's try to understand the situation with an example.

Imagine the web as one giant market or bazaar, with many shops scattered all over it. Having a site in the top 10 is the same as having a shop right on the main road or near the entrance of the largest shopping mall in the market. Being outside the top 20 is like having a corner shop on the very outskirts of the market, in some interior spot that hardly anyone visits. The footfall in a major mall is massive, with people coming in and out of the shops all the time. On the web, a place among the top ten positions on Google has just the same effect. The nearer to the number one position the website gets, the greater the chances that visitors can be turned into customers. It's almost as if web surfers associate a top position on Google with a quality brand.

Search engine optimization essentially involves making it easy for search engines to find the website and boosting the position of the website in their rankings. SEO is the science of customizing elements of the web site to achieve the best possible search engine ranking [2] . That's really all there is to search engine optimization. But as simple as it sounds, it is not. Both internal and external elements of the site affect the way it's ranked in any given search engine, so all of these elements should be taken into consideration. Good SEO can be very difficult to achieve, and great SEO seems pretty well impossible at times. Search engines will naturally change and mature, as the technologies and principles that enable SEO and the engines themselves change. For this reason, the SEO plan should be considered a dynamic, changing document.

OPTIMIZATION TECHNIQUES

Website optimization can be divided into two categories:

On-page optimization

Off-page optimization

On-page optimization: On-page SEO covers all the techniques one can apply on the site itself. It is about what we want to say about our website: an opportunity to explain what the website is all about without exaggerating. The on-page elements include page titles, page metadata, headings, body text, internal links, off-page links, and image alt tags, as well as a host of other assets.

Page title: The very first variable element on your page is the <title> tag, which appears in the <head> section. The title of the web page is probably the most important of all the on-page elements. Browsers display the title tag in the top bar of the browser window and, in more recent browser versions, in the tabs used to navigate between different web pages in the same window. Thus, the title is also an important navigation and usability aid for browser users. Title tags not only tell a browser what text to display in the title bar; they are also very important for search engines. Search bots read page titles and use the information to determine what the pages are about. If you have a keyword between your title tags that competing pages don't have, you have a good chance of getting at or near the top of the search results. Theoretically, there is no limit to how long a title tag can be. In practice, however, both search engines and browsers truncate long titles to keep their display consistent.

W3C (World Wide Web Consortium) guidelines state that the length of the title tag should, ideally, be 64 characters (including spaces) or less.

Google truncates at 66 characters or at the last complete word, whichever is shorter.

Yahoo! truncates at exactly 120 characters, even if this means that the displayed title ends partway through a word.

Here are the recommendations for constructing the title tag.

Every page must have a title tag and each page's title tag should be different. There should, nonetheless, be consistency across the pages in how the title tags have been constructed.

The length of each title tag should generally not exceed 85 characters (including spaces) and should never exceed 120 characters.

The title should be written in such a way that it truncates gracefully at exactly 66 characters and at exactly 75 characters (including spaces): complete words, in a phrase that scans well and uses proper grammar.

Where possible, the title tag should incorporate a short call to action (in a verb-noun construct), or at the very least a provocative statement that encourages the user to click on the link.

The keywords used in the title tag should all be repeated in the URL, the meta-keyword tag, heading tags, and page body text. Synonyms of keywords in the title tag should also appear in the page body text.

Research has shown that modest and appropriate use of capitalization makes the links stand out, so people are more likely to click on them. You will also notice the use of the ampersand (&) instead of the word "and" in most title tags on the web today. This is because words like "the," "an," and "and" are stop words that Google ignores for most searches; using them takes up valuable space in the tag and should be avoided.
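Putting these recommendations together, here is a sketch of a title tag for a hypothetical page (the site, keywords, and wording are illustrative only, not taken from any real site):

<!-- hypothetical example: keywords first, then a short call to action, kept within the truncation limits discussed above -->
<title>Handmade Shell Necklaces - Buy Coastal Jewellery Online</title>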

Meta description: The meta-description tag is placed between the <head> tags in the HTML page. Its purpose is to provide a brief synopsis of page content, which builds on the headline included in the title tags. The correct syntax is as follows:

<meta name="description" content="put your description in here." />

When a person searches for something on the internet, the search engine returns a long list of web addresses, each with a bit of introductory text about the site. The text we enter in the meta-description tag is what appears as that introductory text. If we don't include a meta description, the text will be picked more or less at random from the page and, quite frankly, might not make an awful lot of sense. A good meta description is a one- or two-sentence description of the web site. It should also be a sales message to potential visitors: what's going to make them click on our link and not a competitor's? Strong, clean copy is the answer - a sales message, a teaser, a call to action that encourages them to click through to our website. The catch with meta-description tags is that they work differently for different search engines. For example, Google gives very little weight to meta descriptions. Instead, Google looks at the text on the page, and on the SERPs [3] it doesn't display the meta-description text either; what it shows is the content surrounding the instance of the keyword on your site. Google calls this a snippet.

Meta keywords: The meta-keyword tag (or tags) is also placed between the <head> tags in your HTML page and was intended solely for use by search engines. The correct syntax is as follows:

<meta name="keywords" content="keyword1, keyword2, keywordn" />

The idea is that the keyword tags provide information to the search engine about the contents of a web page.

Following are the tips for writing Meta keywords:

Separate each phrase with either a comma or a space but not both.

Use lower case for all keywords and pluralize phrases where possible.

Include capitalized or non-plural equivalents of the selected keywords. Exclude all deep phrases from the homepage tag.

Use the most relevant site-wide phrases to begin your section and category page meta-keyword tags, and then also include the most relevant deep phrases.

On content pages, choose keywords that are relevant to that page only.

Never use more than 15 words in the meta-keyword tag, and ideally try to limit it to 12 words.

Ensure that all of the key phrases in the keyword tag appear at least once more somewhere in the on-page elements (i.e., title tag, description tag, headings, alt tags, and body text). If it is not possible, seriously consider removing the keyword from the tag.

The classic problem most web sites have is coming up with a great list of keywords and then repeating them on every single page. This is not the right approach: keep the keywords specific to each page. Use a keyword (or, for that matter, a reference in your title or description) only if that word is actually used in the page's body text; if you do not, you'll be penalized.

The keywords meta tag was originally created as an indexing tool: a way for the page author to tell search engines what the page is about by listing, yes, keywords. Although quite important in the past, this meta tag isn't as important these days. Some search engines use it, but many don't; reportedly Yahoo! does, while Google doesn't. Still, the meta tag may come in handy for the other search engines.

Heading tags: Header tags are the elements that set up the different levels of headings and subheadings on your web site. There can be as many as six different levels of headings, though most web sites use only about four. These headers are useful in search engine optimization because, when we put keywords into a heading, we're saying to a search engine, "These keywords are so important that they appear in my heading text." Search engines pay more attention to them, weighing them more heavily than keywords in body text. Header tags should be placed immediately before the body text they introduce, with the heading text between the opening and closing tags. Regular HTML supports up to six levels of heading tags:

<h1>Your heading text</h1>

<h2>First subheading</h2>
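For instance, a product page might use the levels to mirror the page's structure, with the main keyword in the top-level heading (the wording here is hypothetical):

<h1>Handmade Shell Necklaces</h1>
<h2>Red Scallop Necklaces</h2>
<h3>Care and Cleaning</h3>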

Body text: Body text is another place where we want to include our keywords where possible. There is no hard-and-fast rule on the number of times keywords should appear on a page. Use them regularly in the text, but don't use them out of context or just as a ploy to improve search engine standings. If the keywords don't work in the normal flow of the text on the page, don't include them. There are also tags that indicate special formatting in text. Those tags are:

<b>Bold</b>

<i>Italics</i>

<strong>Strongly Emphasized</strong>

<em>Emphasis</em>

<li>List item</li>

Each of these tags indicates special formatting for the word or phrase within the opening and closing tags, and the special emphasis makes a search engine crawler take notice of those words. Therefore, if we can use keywords within those tags, we should try to. This helps both search engines and human readers to identify our key text. Only use keywords where appropriate and avoid stuffing keywords into your site simply to improve your search engine rankings.
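As a quick illustration, keywords can sit inside these tags within perfectly natural body copy (the product and phrasing are hypothetical):

<p>Our <strong>handmade shell necklaces</strong> are strung by hand on the coast.
Each <em>red scallop necklace</em> is a one-of-a-kind piece.</p>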

A useful tool for checking keyword use is the Spannerworks Keyword Density Analyzer (currently at www.spannerworks.com/seotoolkit/seo-toolkit/keyword-density-analyzer/).

Image alt tags: Alt tags hold the alternative text that is displayed on the site while a graphic is loading, or if it doesn't load at all. The original purpose of alt tags was twofold: to allow text-browser users to "see" images, and to help partially sighted or blind users to "read" images using, for example, screen-reader software. These tags are another place where we might want to include keywords to help boost keyword frequency and improve search engine rankings. Whenever an image is used in the website, ensure that the following recommendations are met (a short example follows the list):

Every image must have an alt tag, with the exception of spacer images (alt text on spacers is very irritating for disabled users).

The alt tag should faithfully describe the image concerned; the opportunity to include the key phrases should not be lost, but do not simply stuff the tag with keywords.
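For example (the file name and description below are hypothetical):

<img src="red-scallop-necklace.jpg" alt="Handmade red scallop shell necklace" />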

Anchor text: Anchor text is the visible text of a hyperlink on a web page. When a search engine looks at the page, it automatically follows all the links on it. If those links (or even a large portion of them) are text-based links, so much the better, because then what the search engine sees is not just the link to another page but also the keywords.

A general strategy for anchor text is to vary it according to the links associated with it. When we repeat the same anchor text over and over again on a given web page, it begins to lose its effectiveness, and it can actually cause a search engine crawler to rank our site lower in the SERPs. It is much more effective to use multiple keywords and phrases as anchor text, varying the wording while maintaining consistency in the keywords and phrases we use.
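A sketch of what this variation looks like in practice, with two differently worded, keyword-rich links pointing at the same hypothetical page:

<a href="http://www.yoursite.com/products/necklaces/">handmade shell necklaces</a>
<a href="http://www.yoursite.com/products/necklaces/">buy coastal shell jewellery online</a>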

Age of site: The reason this makes for a valuable signal is that a site that has been around for a long time must be doing something right in the eyes of the community it is trying to serve. Spammy or poor-quality sites ultimately don't do as well, so they don't last as long.

Domain name: When a keyword is contained in a domain name or a subdomain, there is a good chance that the keyword is pretty relevant to the content of the site. A website with a keyword in the domain name may rank slightly higher than another website that does not use the keyword in the domain (all other things being equal). Google actually reads URLs, looking for keywords in them. For instance, if you have a web site with the domain name rodent-racing.com and someone searches Google for rodent racing, Google sees rodent-racing.com as a match. Because a dash appears between the two words, Google recognizes the words in the domain name. (Google also interprets periods and slashes as word separators.)

A domain name should be short, easy to spell, and easy to remember.

In almost all cases, you should get the .com version of a domain name.

Domain registration: The length of time for which you register your domain name could also affect your search engine ranking. Many hackers use throwaway domains - domain names registered for no more than a year - because they usually don't even get to use the domain for a full year before being shut down. For this reason, some search engines have implemented ranking criteria that give priority to domains registered for longer periods. A longer registration also shows a commitment to maintaining the web site.

Hosting location: One of the biggest issues you'll face with domain hosting is the location of your hosting company. If you're in the United States and you purchase hosting on a server in England, your search engine rankings may suffer: crawlers will read your site's geography as contradicting your location. Because many search engines include some element of geographical location in their results, this contradiction could be enough to affect your ranking. The difficult part is ensuring that domains are registered and/or hosted in the appropriate country. If you are trying to appeal to a U.S. audience, be sure that your hosting partner's servers are based in the United States. Likewise, if you're after the Irish market, host your web site in Ireland. Google knows where the servers are and will automatically favor your web site in that country's search results - and that may not be your target market.

Outbound links: I generally recommend that clients consider linking out from their site (where appropriate) to other high-quality websites on a similar topic or theme to their own. Generally, you should target sites with a PageRank of PR7 or better for best effect, and use keyword-rich links to make the connection.

Google has always tracked "related sites." You can see what Google pairs you with by typing site:yourdomain.com into the query box and then clicking on the "similar sites" link in the result. What sort of company are you keeping? You may be somewhat confused by the sites you see there, but don't worry too much. The key point is that some of them are based on your outbound links. A more interesting way to explore these interrelationships is to pay a visit to the visually stunning kartoo.com. Try searching on your business name to see how your site relates (in keyword terms) to your competitors.

URLs: The URL (Uniform Resource Locator) is the literal address of your web site on the Internet. It's the address that site visitors type into their browser's address bar to reach you, or, in some cases, the link those users click to find you. Ideally, your URL should be as descriptive as possible without being long and hard to remember. So, as you've learned, a URL of www.atopkeyword.com is much more effective than a URL of www.partofyourcompanynameonline.com.

But there is more to a URL than just the base name. Your site's structure probably has several levels of pages and files, so the base URL will be followed by a path to additional pages and folders. Unfortunately, if you have a site with hundreds of pages or dynamic content, you could end up with a URL that looks like this:

http://www.yoursite.com/o/ASIN/B00023K9TC/ref=s9_asin_title_11966_p/10289462962020168?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center1&pf_rd_r=1A562KV3VPEPKDF3Z65D&pf_rd_t=101&pf_rd_p=291577501&pf_rd_i=507846

There are a couple of problems with that URL. The first is that there's no way visitors will remember all of it. The second is that you've lost valuable keyword real estate, because the URL is an undecipherable collection of letters and numbers.

A better option is to keep your URLs as short and descriptive as possible. Say the preceding long URL leads users to a handmade red scallop shell necklace that you have for sale. Rather than a long URL with no meaning at all, you could create a URL for the necklace page that reads something like this:

http://www.yoursite.com/products/necklace/shells/id=507846_red_scallop

That URL is much shorter and much more memorable than the longer one. Individuals might still have difficulty remembering it, but they are more likely to remember it than one that's full of random letters and numbers. This illustrates the two key bits of advice here: URLs should be descriptive without being overly long, and they should give visitors a good idea of what to expect on the page. Creating the URLs for your pages this way also opens up the potential for including keywords in the URL, which helps not only when crawlers look at your site, but also when your URL is posted as a link on other web sites or mailing lists. The URL you select for your web site and create for your web pages is an important piece of text: URLs are crawled by search engines, and they should be easy for visitors to use, remember, and understand.

Off-page optimization: Off-page SEO is the heart and soul of a successful optimization campaign and comprises all the tools and techniques you need to get other site owners to rate your site. All of these techniques together deliver a large quantity of high-quality links to your site from important, reliable, and relevant sources.

Inbound Links: Your inbound link-building activities fall into two groups: active and passive. The former involves you actively contacting other sites to submit or request a link. The latter involves link baiting, where you create attractive content and invite others (in a more passive way) to link back to it.

To ensure source importance, target sites that have been established for at least three years (and preferably more than seven) and that carry a homepage PageRank of at least PR3 (and ideally PR5 or more). To ensure source relevance, target pages that, where possible, have content similar to your own or operate in the same niche.

Link exchange websites

www.qlew.org

www.123exchangelinks.com

www.linkmarket.net

www.linkmetro.com

www.linkdiy.com*

www.links4trade.com*

www.linkexchanged.com*

*paid service

Reciprocal link exchange: On the web it has been common practice for years to exchange links between webmasters - witness the pages called linkpartners.html or web-resources.html. Some webmasters go further and build huge link directories off the back of their sites, with outbound links to literally hundreds of partners. Some of these directories are even built using dedicated link-building scripts that semi-automate the process. However, it is true that reciprocal links are worth less than they used to be, as Google deflates the value of such links in computing your rankings.

Visit www.linkmetro.com and register for the service.

If you must send out reciprocal link requests to other sites, here is my recommended approach (and etiquette):

Only seek links with related sites.

Begin by putting a link to their site on your own site first.

Make sure that your note is well written and personalized to the recipient.

Be specific in your note about what link - and keyword-rich link text - you would like them to use.

Directory submission: Web directories are a brave attempt to create a human-maintained taxonomy or classification of the entire web and their numbers have grown dramatically over the last four or five years. I tend to group directories into five categories.

Paid directories, which charge webmasters to list their sites or carry their pay-per-click adverts, either on a one-off basis or using a recurring monthly or annual fee.

Reciprocal directories, which will list sites but only in exchange for a reciprocal backlink.

Free directories, which are prepared to list your site for free.

Bidding directories, a relatively new type of directory service, where webmasters bid against each other to maintain key positions in the directory for their chosen keywords.

Deep link directories (whether paid, reciprocal, bidding, or free), which are prepared to list one or more deep links direct to the inner pages of your website.

Directories

www.dmoz.org

www.lii.org

www.webworldindex.com

www.directorydelta.com

www.jayde.com

www.iozoo.com

www.tsection.com

www.tikifind.com

www.premiumsites.org

www.webmasterhole.info

www.topdot.org

www.urlchief.com

www.i-searches.com

www.directoryworld.net

www.linkdirec.com

Search engine submission: Search engine submission is the process of submitting your site URL to the thousands of search engines across the web for them to crawl. To find other search engines, search on Google for the words "submit URL," "submit site," and "add to listing" in turn and work your way slowly through the top 100 results. There are so many search engines; you could well spend a few happy weekends on this activity alone.

Do not use search engine submission software. Use the same description as you used for the directory submissions (where a description is asked for). Always place your homepage URL in the submission box; the engine (if decent) will find and crawl the rest of your pages from there.

Search Engine Submission

www.exactseek.com

www.gigablast.com

www.entireweb.com

www.scrubtheweb.com

www.ulyseek.com

www.amfibi.com

www.searchhippo.com

Forum participation: Now we get to one of the most interesting areas of active link building: forum participation. You may have noticed that many posters on forums have beautifully crafted signatures, which show the name of their business (often linked using keyword-rich anchor text). There is a reason for this: search engines index these links, and in some cases both the forum posts and the profile pages of users - which also carry links - acquire a decent PageRank from Google. On many bulletin boards (for example, vBulletin) the syntax for a signature file is as follows:

John Doe, Managing Director

[url=http://www.yourdomain.com]keyword anchor text[/url]
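On most boards, that BBCode is rendered as an ordinary HTML link, which is what the search engines actually index; the equivalent markup is roughly:

<a href="http://www.yourdomain.com">keyword anchor text</a>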

Recommendations:

Research areas of interest related to your site through a search on Google.

Identify no more than three or four forums you can join.

Sign up and create a profile page that contains a link to your website or blog, and a signature file containing keyword-rich anchor text.

Begin posting answers to questions that you are qualified to answer and do not drag up old posts from the archives.

Try to tie your postings back to advice on your business blog where possible.

Articles, newsletters, and e-zines: The single best free way to promote your site is article submission. It may not have escaped your notice that there are literally thousands of e-zines (electronic magazines) and content sites across the web, covering all manner of specialist subjects. Many of these are run by hobbyists and poorly paid webmasters with very little time on their hands. Create high-quality content (1,000-1,500 words) covering a particular topic or answering a specific question, sharing your hard-earned knowledge and expertise. Post your article on up to five high-quality free article repositories, together with a synopsis and a resource box ("about the author") that links back to your website or business blog. Webmasters and editors find and "syndicate" your content by adding it to their sites. As you have required them to include the resource box (as a condition of free reproduction), they build excellent backlinks to your site.

Article submission repositories

www.articlesfactory.com

www.articledashboard.com

http://ezinearticles.com

www.articleonramp.com

www.ideamarketers.com

www.goarticles.com

Online public relations: Online PR is a close cousin of article submission. As with articles, you write a press release, publish it to a PR repository, and hope that other sites will pick it up and use it. Where online PR differs (and is thus arguably more powerful) is that it targets the mainstream news media more precisely. Get it right and your humble release could end up featuring on the website of a major news network. Some PR sites even offer you the opportunity to see your release fed into Google News and Yahoo! News. A single PR7 link from a BBC webpage could be the difference between a top 30 and a top 10 ranking for one of your key search phrases. A good press release is not a boring advert for your business but a real story of genuine interest to your audience. You should also submit your press release to one or more of the main online PR repositories.

Press release repositories

www.businesswire.com

www.prnewswire.com

www.prweb.com

www.virtualpressoffice.com

www.prlog.org

Tagging: Tagging covers portable bookmarking (e.g., Google Bookmarks) and social tagging (e.g., del.icio.us), which allow users to maintain and categorize lists of the pages and sites they like most and to remember them for the future. In essence, this is a new voting system to challenge the current proxy of relevance used by search engines: inbound links. There are now a large number of social bookmarking and tagging services, most of which permit members to share their tags with other users and the public at large. In addition, you can subscribe to other people's tags. Popular services include:

digg.com

Technorati.com

del.icio.us

Propeller.com

StumbleUpon.com

The smart webmaster will make it easy for internet users to tag or bookmark their pages, and will ensure that their content includes controversial, topical, newsworthy, or funny material that is likely to attract many tags from surfers.
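One simple way to make tagging easy is a plain "bookmark this page" link on every page. Each service defines its own submission URL, so check the service's documentation; the del.icio.us-style link below is illustrative only, and the parameter names are assumptions:

<!-- illustrative only: submission URL formats vary by service and may change -->
<a href="http://del.icio.us/post?url=http://www.yoursite.com/&title=Your+Page+Title">Bookmark this page on del.icio.us</a>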

Tools:

The Alexa Toolbar: Alexa is a company owned by Amazon.com. It's been around a long time, and millions of people around the world use the Alexa Toolbar. Every time someone uses the toolbar to visit a Web site, the toolbar sends the URL of the page to Alexa, allowing the system to create an enormous database of site visits.

The Alexa Toolbar can provide traffic information to you - you can quickly see how popular a site is and even view a detailed traffic analysis, such as an estimate of the percentage of Internet users who visit the site each month.

Work with the Alexa Toolbar for a while, and you'll quickly get a feel for site popularity. A site ranks 453? That's pretty good. 1,987,123? That's a sign that hardly anyone visits the site.

The Google Toolbar: Go to toolbar.google.com, and download and install the Google Toolbar. This fantastic little tool provides you with the following useful features:

A way to search Google without going to www.google.com first

A quick view of the Google PageRank, an important metric

A quick way to see if a web page is already indexed by Google

A quick way to see some of the pages linking to a web page


Crawler simulators: Some utilities will read a web page and display its content in the manner in which a search engine is likely to see it. When looking at a competitor's pages, you can sometimes see things that aren't visible to the site visitor but that have been placed on the page for the benefit of the search engines. When viewing your own pages, you may want to check that all the links are readable by the search engines (these tools generally provide a list of readable links).

Here are a couple of these utilities:

Sim Spider: www.searchengineworld.com/cgi-bin/sim_spider.cgi

Delorie: www.delorie.com/web/ses.cgi

Keyword density: It's sometimes interesting to check your pages for keyword density, and many tools are available to help you do so. Web Position Gold has a built-in density tool, and you can find various online tools as well, such as the following:

Search Engine World's Keyword Density Analyzer: www.searchengineworld.com/cgi-bin/kwda.cgi

KeywordDensity.com: www.keyworddensity.com

Validating HTML: The W3C is the self-appointed guardian of HTML. It is an influential organization, and a friend of theirs is a friend of Google's. I can't stress this enough: third-party accreditation and recognition is massively important to Google. If you follow W3C's validation instructions and pass, the validation logo you'll display on your site will be worth its weight in gold in terms of how the Googlebot regards your site from then on.

W3C is pretty picky. For example, one of my sites was pulled up on an error: it wasn't compliant with Internet Explorer version 4 (Microsoft doesn't even support IE 4.x anymore, so why on earth should I?). Take what they say seriously but with a grain of salt - sometimes it's a matter of form over function. If you can fix those errors, do so and enjoy your newfound ally. It's been worth at least a PageRank jump of one place for many a web site. The validator is at http://validator.w3.org
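Validation is always checked against the document type your page declares, so make sure the page starts with the correct doctype. For example, a page written in XHTML 1.0 Transitional (the dialect implied by the self-closing meta-tag syntax shown earlier) would begin:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">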

You can run a WHOIS report from numerous web sites. For starters, try www.whois.net and check that everything is present and correct.

Sitemap tools

If you don't already have a site map, this issue must be addressed, no matter how small your site is. Even if you have created your site using WYSIWYG (what you see is what you get) software, a map can often be created automatically by the program and added as a page. If developers created your site, you can ask them to map out the site, as this is information they should have produced when they built the site in the first place. Finally, you can get one free at http://www.xml-sitemaps.com in return for advertising their service.
Checking the performance of the website:

NetMechanic is a great tool, and if you're happy to check one web page at a time, it's absolutely free! It tests the technical functions of your site, and within a few seconds you will receive some important feedback about how well your site performs:

Load Time

HTML Check and Repair

Browser Compatibility

Link Check

Check inbound and outbound links: If you have asked other sites for an inbound (or backward) link, you might even have received a response saying that you're being added. Well, it's time to check. Click on Link Popularity on the Marketleap site and enter your URL, and you'll be able to see how many web sites link to you and - the best part - who they are.

WHAT SHOULD NOT BE DONE

Domain cloaking

On the surface, domain cloaking sounds like a great idea. The concept is to show users a pretty web site that meets their needs, while at the same time showing search engines a highly optimized page that would probably be almost useless to users. In other words, it's a slimy trick to gain search engine ranking while providing users with a nice site to look at. It starts with content cloaking, which is accomplished by creating web-site code that can detect and differentiate a crawler from a site user. When the crawler enters the site, it is redirected to another web site that has been optimized for high search engine results. The problem with trying to gain higher search results this way is that many search engines can now spot it. As soon as they find that a web page uses such a cloaking method, the page is delisted from the search index and not included in the results.

Duplicate content

When you're putting together a web site, the content for that site often presents one of the greatest challenges, especially if it's a site that includes hundreds of pages. Many people opt to purchase bits of content, or even scrape content from other web sites, to help populate their own. These shortcuts can cause real issues with search engines.

Say your web site is about some form of marketing. It's very easy to surf around the web and find hundreds (or even thousands) of web sites from which you can pull free, permission-granted content to include on your web site. The problem is that every other person or company creating a web site could be doing the same thing. And the result? A single article on a topic appears on hundreds of web sites, and users aren't finding anything new if they search for the topic and every site has the same article.

To help combat this type of content generation, some search engines now include in their search algorithms a method to measure how fresh site content is. If the crawler examines your site and finds that much of your content is also on hundreds of other web sites, you run the risk of either ranking low or being delisted from the search engine's indexing database.

Hidden pages

One last SEO issue concerns the damage to your SEO strategy that hidden pages can inflict. These are pages in your web site that are visible only to a search crawler. Hidden pages can also lead to issues like hidden keywords and hidden links. Keywords and links help to boost your search rankings, so many people try to capitalize on these requirements by hiding them within the body of a web page, sometimes in a font color that perfectly matches the site background. There's no way around the issue of hidden pages: if you have a web site and it contains hidden pages, it's just a matter of time before the crawler figures out that the content is part of a hidden SEO strategy. Once the crawler determines that, your site ranking will drop drastically.
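For recognition purposes only, this is the classic hidden-text pattern crawlers look for: text set in the same color as the page background (the keywords are hypothetical). Do not use it:

<!-- hidden text: white keywords on a white background; crawlers detect and penalize this -->
<body bgcolor="#ffffff">
<font color="#ffffff">handmade shell necklaces cheap necklaces buy necklaces</font>
</body>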

Keyword Stuffing:

Keyword stuffing is the practice of loading your web pages with keywords in an effort to artificially improve your ranking in search engine results. Depending on the page that you're trying to stuff, this could mean using a specific keyword or keyphrase a dozen times, or hundreds of times.

Temporarily, this might improve your page ranking. But if it does, the improvement won't last, because when the search engine crawler examines your site, it will find the multiple keyword uses. Because search engine crawlers use an algorithm to determine whether a keyword is used a reasonable number of times on a page, they will discover very quickly that your content can't support the number of times you've used that keyword or keyphrase. The result will be that your site is either dropped deeper in the rankings or (and this is what happens in most cases) removed completely from search engine rankings.

Doorway pages

Doorway pages are often confused with landing pages, but they are not even close in their functions. Landing pages are designed to be rich in content, and visitors usually come to these pages through PPC ads. Doorway pages, on the other hand, are created specifically for search engines with the intent of increasing search engine results rankings.

Doorway pages usually use some form of redirection, so that when visitors click through the link in the search engine results and land on the page, they are immediately taken to another page. This is accomplished with a fast meta refresh, JavaScript, or server-side redirection. The meta refresh is a technique that is used less often now than in the past, because many search engines penalize web sites that use such tactics.

In place of the meta refresh, some web sites have found clever ways to trick visitors into clicking a link that leads them forward to the web site they're being drawn to. There are also some web sites that have designed content-rich doorways: doorway pages that have some element of content included, as well as a basic navigational structure that's consistent with the rest of the web site. These pages, like other doorway pages, are still designed to draw high quantities of visitors.
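For recognition, a fast meta refresh of the kind described above looks like this; the zero means the visitor is redirected instantly (the target URL is hypothetical):

<!-- instant redirect from a doorway page; many engines penalize this -->
<meta http-equiv="refresh" content="0;url=http://www.yoursite.com/real-page.html" />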

Page jacking: Page jacking is a method of search engine spam that's similar in nature to scraping. The difference is that with page jacking, whole pages - and even whole web sites - are copied for the purpose of increasing search ranking and traffic for another site. In one example of page jacking, a person might copy a whole site like Microsoft's. They then cloak that site, but it still appears in search listings. When unsuspecting users click through the listing for Microsoft, they're taken not to the Microsoft page but to another page that the hijacker funnels them to. Not only is page jacking a good way to get your web site delisted from search engine results, but it's also a crime that can result in a stiff fine and possibly jail time. There are also trademark and copyright infringement issues associated with page jacking.

Bait and switch

Bait and switch in SEO is the practice of creating an optimized web page specifically for search engines, with the intent of obtaining good rankings. When those rankings are obtained, the company replaces the optimized site with one that's less optimized and more normal. The result is nearly instant traffic when the site has been switched.

Bait and switch does have one downfall. Most search engine crawlers revisit a site several times a month, and when the crawler revisits a bait-and-switch site, it will see that the content of the site has changed and will adjust search rankings accordingly. In other words, the person who set up the bait and switch put a lot of time into a temporary optimization solution.

FFAs and IBLNs

Or free-for-all link farms and independent backlinking networks, to expand the acronyms. In its Webmaster Guidelines, Google says: "Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or 'bad neighborhoods' on the web as your own ranking may be affected adversely by those links."

In practice, Google identifies "bad neighborhoods" by devaluing backlinks from the same IP subnet. Where a site is simply a link farm site (one that lists loads of links to other sites in exchange for links back or money), Google will eventually identify it as a bad neighborhood and deflate the value of its links in the index.

IBLNs are networks of sites that all directly or indirectly link back to your site in such a way as to promote it through the search engine rankings. The way IBLNs get around Google's IP monitoring is by using a completely different web-hosting plan for every site that links back directly to you. This is not foolproof either: if detected, it can lead to Google simply wiping from its index all the direct referrers (the sites it finds built simply to link to your main site) or, worse, dropping your entire IBLN, including the main site you were trying to optimize. Don't be daft - keep it clean!

10 STEPS TO OPTIMIZE THE WEBSITE

Select a name for the website related to the industry, not the name of the company.

Host the website in the country you want to target.

Prepare a list of keywords you feel people would use to search for you. Try to use specific keywords rather than general ones.

Use these keywords properly in the title, meta tags, and body text. Never overuse the keywords.

Use an HTML validation tool to check the validity of your code.