Search Engine Optimization | Search Engine Marketing | Internet & Online Marketing Details

Information on SEO, SEM, PPC, SEO/SEM, Search Engine Optimization, Search Engine Marketing, Pay Per Click, Internet Marketing, Online Marketing.

Tuesday, August 12, 2008

SEO/SEM

Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a website from search engines for targeted keywords. The higher or earlier a website appears in search engine results, the more searchers and traffic it receives. SEO can be described as making a site more appealing to users by adding unique content and ensuring that the content is easily indexed by search engine robots. In short, SEO is the process of earning websites higher rankings in search engines.

SEO involves coding a web site with good presentation and structure, and fixing problems that could prevent search engine indexing programs from fully spidering the site. Keyword stuffing and link farms will harm your website and may get it removed from the index.
SEO may require changes to the HTML source code of a web site, and to its content management.
Meta tags provided a guide to each page's content, but using meta data to index pages proved unreliable, because some webmasters abused meta tags by including irrelevant keywords to artificially increase page impressions and ad revenue. Inaccurate, incomplete, and inconsistent meta data caused pages to rank for irrelevant searches and fail to rank for relevant ones. Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well.
To give users better results, search engines needed result pages to show the most relevant matches rather than unrelated pages stuffed with keywords by unscrupulous webmasters. They responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
History Of Search Engines:
Running a company website does not by itself guarantee success. Following the tips given on this site will help your website rank higher in directories and search engine results, and therefore increase traffic and the number of potential clients.
History:
Before the HTTP protocol was invented, the internet was just a huge network of FTP servers, used mainly as a means of file exchange. Before websites existed, the first search engines ran over FTP and similar protocols. Only after Tim Berners-Lee created the HTTP protocol did we get the World Wide Web, and the Internet acquired its present shape.
Search Engines Today :
Modern web searchers are divided into two main groups:
• Search engines
• Directories
Search engines automatically 'crawl' web pages by following hyperlinks and store copies of them in an index, so that they can generate a list of resources in response to user requests.
Directories are compiled by category by humans, who are site owners or directory editors.

The twelve search engines listed below cover about 90 percent of all online searches performed on the Internet. Those search engines are:
• www.Google.com
• www.Yahoo.com
• MSN Search
• AOL Search
• www.AltaVista.com
• www.Lycos.com
• www.Netscape.com
• www.HotBot.com
• Ask Jeeves/Teoma
• www.AllTheWeb.com
• www.Wisenut.com
• www.iWon.com
How do Search Engines Work?
All search engines consist of three main parts:
• The Spider (or worm)
• The Index
• The Search Algorithm.

The spider (or worm) continuously 'crawls' web space, following links that lead either to different websites or to pages within one website. The spider 'reads' each page's content and passes the data to the index.

The index is the next stage after crawling. It is a storage area for spidered web pages and is of huge magnitude: Google's index, for example, is said to contain more than three billion pages.

The search algorithm is the third and most sophisticated stage of a search engine. It is a very complicated mechanism that sorts an immense database within seconds and produces the results list. The more relevant the search engine judges a web page to be, the nearer the top of the list it appears. Site owners and webmasters should therefore attend to their site's relevancy to its keywords.

The algorithm is unique to each search engine, and is a trade secret kept hidden from the public.
Most modern web searchers combine the two systems to produce their results.
Architecture of search engines:
Spider - a browser-like program that downloads web pages.

Crawler – a program that automatically follows all of the links on each web page.
Indexer - a program that analyzes web pages downloaded by the spider and the crawler.

Database– storage for downloaded and processed pages.
Results engine – extracts search results from the database.

Web server – a server that is responsible for interaction between the user and other search engine components.
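In rough outline, the pipeline described above can be sketched as a toy program. This is a minimal illustration only: the PAGES dictionary stands in for the web, and real engines fetch pages over HTTP, parse HTML, and rank results with far more sophisticated algorithms.

```python
# A toy illustration of the components above: a "spider" that follows
# links, an inverted index, and a simple search routine.
# PAGES maps a URL to (page text, list of out-links); it stands in for
# the live web, which a real spider would fetch over HTTP.
PAGES = {
    "a.html": ("seo basics and keyword research", ["b.html"]),
    "b.html": ("keyword density explained", ["c.html"]),
    "c.html": ("link building tips", []),
}

def crawl(start):
    """Follow links breadth-first from `start`, returning visited pages."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        queue.extend(PAGES[url][1])   # follow out-links
    return seen

def build_index(urls):
    """Map each word to the set of pages containing it."""
    index = {}
    for url in urls:
        for word in PAGES[url][0].split():
            index.setdefault(word, set()).add(url)
    return index

def search(index, term):
    """Return pages containing `term` (a real engine would also rank them)."""
    return sorted(index.get(term, set()))

visited = crawl("a.html")
index = build_index(visited)
print(search(index, "keyword"))   # ['a.html', 'b.html']
```

The spider, indexer, and results engine from the list above correspond to `crawl`, `build_index`, and `search`; the database is the in-memory `index` dictionary.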
When a website is specifically designed to be friendly to the tools search engines use to analyze websites (called spiders), that design work is called search engine optimization.
Seo Methodology :
• Off line/off page Optimization
• Online/on page Optimization
• Position Monitoring
On page Optimization :
• Pre Optimization Report
• Key word research
• Competitors website analysis
• Rewriting robot friendly text
• H1 H2 Tags Optimization
• Title Tag Optimization
• Meta Tag Optimization
• Key words Optimization
• Alt Tag Optimization
• Website structure Optimization
• Body text and content Optimization
• Site map for link Optimization
Off page Optimization :
• Hosting of Google site map
• Website submission to all leading search engines with a global database
• Submission to country-specific search engines with country-related databases
• Submission to general directories
• Submission to product specific directories
• Submission to country specific directories
• Trade lead posting
Position Monitoring :
• Monitoring website ranking with different keywords
• Renewal of expired trade leads and posting of new trade leads
• Constant research of updated technology for better positioning
• Research on current popular directories and site submission
• Changing methodology with change in search engine algorithm
COMMON MISTAKES IN SEO :
Search engine optimization is the practice of making a website friendly both to search engines and to the people who search for businesses through them. Webmasters commonly make the following mistakes.

Over-optimization can also hurt your website's ranking.
1. Incorrectly designed websites:
• Lack of proper navigation
• Using frames to save web designers' design time
• Large images, which make pages slow to download. If large images are necessary, consider using thumbnails that open the full image in a separate page. (This also creates more pages and more text for the spiders to crawl.)
• Using high-resolution graphics (try to use low-resolution graphics instead)
2. Poorly written content:
Content absolutely must have targeted keywords and phrases. If content is written properly, you can work in more targeted keywords and appropriate phrases.
The absence of targeted keywords and phrases can break your site: if you have not used related keywords in your body text, your site will not appear in the listings when a user types keywords related to it.

Make sure the keywords placed in the meta keyword tag follow logically from your content.

People who reach your site from a search will leave as soon as they see that the home page is irrelevant or does not match the keyword or phrase they searched for. Use a tool such as Wordtracker to find what people actually type into search engines to find goods and services similar to yours, and concentrate on ranking well for those terms.

3. Duplicated content:
Using more than one page with a different name but the same content will be treated by search engines as a trick, and this hurts ranking. Never copy content from other websites.
4. Improper use of meta tags, or a site without meta tags:
Meta tags carry a page's keywords and description. They help search engines search quickly, and they help websites improve their rankings; meta tags should be included on every page of the website.
Improper meta tags, or their absence, can misguide search engines and lead to improper listing of websites. The absence of a page title will also harm a site's ranking.
5. Absence of a site map:
Site maps help web crawlers index websites more quickly and efficiently. A site map presents the structure of the entire website on one page, which is very useful for search engine optimization. Site maps generally let search engines go directly to a page instead of discovering it through links.

Google Sitemaps is an easy way to tell Google about all the pages on your site, which pages are most important to you, and when those pages change, for a smarter crawl and fresher search results.
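Generating such a file is mechanical. Below is a minimal sketch that emits the sitemaps.org XML format that Google Sitemaps reads; the URLs, dates, and priorities are made-up examples:

```python
# Minimal sketch of generating an XML sitemap in the sitemaps.org format.
# The example URLs, lastmod dates, and priorities are invented.

def make_sitemap(urls):
    """Build sitemap XML from (loc, lastmod, priority) tuples."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, lastmod, priority in urls:
        lines.append("  <url>")
        lines.append(f"    <loc>{loc}</loc>")
        lines.append(f"    <lastmod>{lastmod}</lastmod>")
        lines.append(f"    <priority>{priority}</priority>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

xml = make_sitemap([
    ("http://www.example.com/", "2008-08-12", "1.0"),
    ("http://www.example.com/products.html", "2008-08-01", "0.8"),
])
print(xml)
```

The resulting file is uploaded to the site root and its location is submitted to Google.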
6. Page cloaking:
In this method webmasters deceive search engines by substituting a declared description for the real page content. Spidering robots, recognized by their IP addresses or host names, are redirected to a page that is specially polished to meet search engines' requirements but unreadable to a human being. To detect cloakers, spiders often visit from fake IP addresses and under fictitious names; users' feedback is collected to check the relevancy of content against description; and pages are reviewed by search engine staff. If any difference is found, the site is penalized.
Technical Definition of SEO
Search engine optimization is the process of increasing the number of visitors to a Web site by ranking high in the results of a search engine. The higher a Web site ranks in the results of a search, the greater the chance that the site will be visited by a user.
SERPs: search results are also referred to as SERPs (Search Engine Result Pages).
Classification of SEO:
1. ON Page Optimization
2. OFF Page Optimization
ON Page Optimization:- On-page optimization involves the various activities that can be performed on your site itself, everything from the top of the page to the bottom:

a. URL changes, such as converting dynamic URLs (which include a query string with characters like ? and &) into static HTML-based URLs. These static URLs are also known as SE-friendly URLs.
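The rewrite can be sketched as follows. The exact mapping scheme here (joining parameter values with hyphens) is only an illustration; real sites usually implement this with server rewrite rules:

```python
# A sketch of converting a dynamic URL with a query string into a
# static-looking, SE-friendly path. The hyphen-joined slug scheme is an
# illustrative assumption, not a standard.
from urllib.parse import urlparse, parse_qsl

def to_static(url):
    parts = urlparse(url)
    params = parse_qsl(parts.query)        # e.g. [("cat", "shoes"), ("id", "42")]
    stem = parts.path.rsplit(".", 1)[0]    # drop the .php/.asp extension
    slug = "-".join(value for _, value in params)
    return f"{stem}-{slug}.html" if slug else f"{stem}.html"

print(to_static("/products.php?cat=shoes&id=42"))   # /products-shoes-42.html
```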

b. Modify the title; try to include your most competitive keyword in it, as the title is the one element that search engines consider most seriously, and it helps produce better SERPs.

c. Prepare Appropriate Meta Tags.

The limits are as follows:
Title tag: 65 characters (including spaces)
Meta keyword tag: 256 characters (including spaces)
Meta description tag: 256 characters (including spaces)
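These limits can be checked mechanically. The sketch below hardcodes the figures quoted above; actual engine cut-offs vary and the example tag texts are invented:

```python
# A small checker for the tag-length limits quoted above. The 65/256
# character figures are the ones this article gives; real engines'
# cut-offs vary over time.
LIMITS = {"title": 65, "keywords": 256, "description": 256}

def check_lengths(tags):
    """Return the names of tags whose text (spaces included) exceeds its limit."""
    return [name for name, text in tags.items() if len(text) > LIMITS[name]]

tags = {
    "title": "Example Widgets | Buy Widgets Online",
    "keywords": "widgets, buy widgets, cheap widgets",
    "description": "Example.com sells widgets of every size and colour.",
}
print(check_lengths(tags))   # [] -- everything within limits
```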

d. The content of the page should be SE-friendly, meaning your targeted keywords should be present in the page's content.

e. Proper use of heading tags: use H1 for main headings, H2 for subheadings, H3 and so on. The hierarchy should be correct, as this is considered SE-friendly.

f. Avoid HTML frames, which some search engines find hard to navigate; use style sheets (CSS) instead.

g. Description of images
Optimize image alternate-text attributes (ALT tags): describe every image on the page properly so that visitors can understand from the text what the image shows, and if possible work your keywords in by embedding them appropriately in the ALT text.

h. Keep pages to less than 100 kilobytes and preferably not much more than a screen full of text.

i. Avoid JavaScript and Flash as much as possible; search engines do not like them and prefer simple HTML.

j. Search engines also like fresh content and will spider this more frequently. A regularly updated news page, even a blog, can provide deep links to the rest of the website.

k. Take care of keyword density: the ratio of the number of occurrences of a particular keyword or phrase to the total number of words on a page.
Do not repeat any keyword too densely on a page, or it will be considered spam.
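The ratio defined above can be computed directly. This is a simplified sketch that splits on whitespace and ignores punctuation; the sample page text is invented:

```python
# Keyword density as defined above: occurrences of the phrase divided by
# the total number of words on the page. Punctuation handling is omitted
# for simplicity.
def keyword_density(text, keyword):
    words = text.lower().split()
    kw = keyword.lower().split()
    # Count positions where the phrase matches the next len(kw) words.
    hits = sum(words[i:i + len(kw)] == kw for i in range(len(words)))
    return hits / len(words) if words else 0.0

page = "cheap widgets for sale buy cheap widgets today"
print(round(keyword_density(page, "cheap widgets"), 3))   # 0.25
```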

l. Keep the number of outbound links on a page to no more than 50-60, and make sure they relate to the theme of your site.
In Short ON Page Summarizes:
•Keyword Density
•Words in Title Tag
•Words in the Page
•Words in Links
•Words in Headings
•Words in Bold
•Beginning Words •Words in URL
•Meta Tags (some engines)
•HTML Validation
•Link Structure
•Indexability of the page
•and Hundreds of Other Factors
OFF Page Optimization: involves various activities performed off your site; you do not have to change anything on the site itself.

1. Link Building :- (Reciprocal and One Way)

Classification of Links:

A. Inbound links:- When someone places a link on his site that, when clicked, points to our site, it is called an inbound link. That means it is coming to us from someone else.

B. Outbound links:- When we place a link on our site that, when clicked, points to some other site, it is called an outbound link.

Back links:- Back links are the most important factor in search engine ranking, for the sole reason that search engines consider each back link a "vote" for your site. The greater the number of votes, the higher your ranking. The number and quality of back links is by far the most important ranking factor in Google.
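The "vote" intuition above is the basis of Google's PageRank algorithm. Below is a minimal power-iteration sketch over a toy three-page link graph; the damping factor 0.85 is the value from the original PageRank paper, and the link graph is invented:

```python
# Minimal PageRank sketch: each page's "votes" (in-links) raise its score,
# weighted by the score of the voter and split among the voter's out-links.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}

def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}       # start uniform
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            for out in outs:
                new[out] += damping * rank[page] / len(outs)
        rank = new
    return rank

ranks = pagerank(links)
print(max(ranks, key=ranks.get))   # "c" -- it receives the most votes
```

Page "c" wins because it collects links from both "a" and "b"; real rankings combine this link score with many on-page factors.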

Back links can be further classified into the following:

Authority links: links from theme-related authority sites.

Directory Links: Directory links refer to the links that are obtained from directories. Directory links are also important for search engine rankings as they are more relevant and themed links.

The main directories one should strive to get links from are Dmoz.org and the Yahoo Directory. These two directories have the highest authority, and links from them help a great deal to improve one's search engine rankings.

DMOZ:- Directory Mozilla, also called the personal directory of Google.

If your site gets listed in this directory, it can have a very positive effect on your Google SERPs.

Reciprocal links: reciprocal link exchange is the best strategy for any new site to gain back links. In this type of link building, you get a link from a site in return for a link from your site to theirs.

Nowadays Google devalues these reciprocal links, which is why most clients prefer one-way links only.

Paid links: paid links, or run-of-site links, refer to links that you have bought. Such links are heavily devalued by Google, and you should refrain from getting them, as they might hurt your site's trust rank and quality score.

One-way links:- Someone lists your site and gives you a back link, but instead of linking back from your primary site you give a back link (BL) from some third-party site or directory. One-way links are valued highly by search engines.

The entire process is called One Way Link Building.

3-way or triangular links:- These are also a sort of one-way link, with the small distinction that here you give the back link not from a directory but from some theme-relevant site. Consider this example:

Site A:- education
Site B:- education
Site C:- education

Now sites A and B are yours, and you want to exchange links between your site A and site C.
What you need is for C to place your link; in return you give a BL from B (mind, not from A).

This type of link is called 3 way link and the entire process is called 3 way link building.


2. Article submission:- Articles are one of the important sources of one-way links, and they also help make your site's content relevant.
->You can write a well-formatted article related to the theme of your site and embed your targeted keywords in it. For the most competitive keywords you can use repetition.
->You can embed hyperlinks to your site inside the article under various headings, starting with a new paragraph.
->You can embed hyperlinks for your competitive keywords as well.
Once the article is ready, you can submit it to the various article submission sites available free online. After submission it will take some time to be approved by a moderator, so keep checking regularly after 5-7 days to see when your article gets published.

3. Press release:- A public relations announcement issued to the news media and other targeted publications to let the public know of company developments.
->You can participate in press releases, which are again a source of one-way links.
->The rest of the procedure is the same as article submission.

4. Anchor texts:- Anchor text is the text that is hyperlinked when linking to a site (when you click on anchor text, it takes you to the address mentioned in the link).
Keywords present in anchor text are heavily weighted by search engines when ranking a site for that keyword or key phrase.
It is advisable to use variations in your anchor texts so that the back links look natural to Google. If Google sees that all the back links have the same anchor text, it might devalue some links, considering them unnatural or paid. Using variations in the anchor text also increases the chances of ranking for additional keywords.
So make sure you use anchor text that contains your targeted keyword and is also compelling enough for visitors to click. Remember, we build back links not only for search engine benefit but also for traffic, so anchor text should be both keyword-rich and smart.
Anchor texts should be specific to an individual page and relevant to the page content as well. This relevance helps increase the trust rank and the quality score of your linked page.
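Anchor-text variation in a set of back links can be checked with the standard library alone. A sketch, where the HTML fragment and the anchor texts are invented examples:

```python
# Collect anchor texts from an HTML fragment and count how varied they
# are; if every back link uses the same anchor text, it looks unnatural.
from html.parser import HTMLParser
from collections import Counter

class AnchorCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_a = False
        self.anchors = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_a = True
    def handle_endtag(self, tag):
        if tag == "a":
            self.in_a = False
    def handle_data(self, data):
        if self.in_a:                     # only text inside <a>...</a>
            self.anchors.append(data.strip())

html = ('<p><a href="/w">cheap widgets</a> '
        '<a href="/w">buy widgets online</a> '
        '<a href="/w">cheap widgets</a></p>')
collector = AnchorCollector()
collector.feed(html)
print(Counter(collector.anchors))   # two variations, "cheap widgets" twice
```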

5. Classified ads:- Classified ad sites are popular sites covering a vast range of topics and categories, where the public posts theme-relevant ads.
They help drive a large amount of traffic to your site.

http://www.adlandpro.com
http://www.ezilon.com
http://olx.com

These are a few examples of classified ad sites, and you can post on them free of cost.

6. Discussion forums:- Places where the public comes with a variety of problems and discusses them to find appropriate help.

SEO HAT is one of the most informative and useful sites available for gaining broad knowledge and achieving success.
In discussion forums you can put a link to your site in your signature, which is posted (at the end of the post) each time you participate in any discussion.
Because these forums get new content every day from the thousands of people submitting queries, they are crawled frequently by search engines; so a link to your site placed in your signature will also be crawled each time, and will help produce better SERPs.

7. Advertisement in social media and big business portals:- Advertising your business in major business and shopping portals also helps you achieve popularity.
Social networking sites play the same role.

Digg.com Page Rank: 8
Digg.com is a good site for news article marketing. We can upload videos and podcasts. To work with this site we have to create an account, then submit the URL of our blog or website along with a user name and password. After submitting the required information we can start uploading news articles, videos and podcasts, and can get good traffic to the site.

Squidoo.com Page Rank: 7
Squidoo is a good site for making a free blog. We can publish articles in blogs that link to our site and get traffic from it. Our website's page rank will also be affected positively, as Squidoo's own page rank is good.

hubpages.com Page Rank: 5
Hubpages.com is a good site where you can easily publish information on a topic you love to write about. Some good topics are business, sports, technology, travel, etc.

ning.com Page Rank: 7
Ning.com is a social networking site. From following features we can have benefits:
• Create blog posts with photos, files, and moods.
• Manage blog posts and moderate comments.
• Choose the option to publish in the future.
• Members can customize their own pages.
• A profile question section where you write the questions.
• A member "chatter wall" for others to leave chatters.
• Option to add widgets to profile page.
• Member photos and videos.
43things.com Page Rank: 7
43things.com allows us to make a free blog to publish our own articles that link to our site.

advogato.org Page Rank: 7
The site allows us to publish articles once we have an account.

mallworld.net Page Rank: 5
The site allows us to publish press releases and articles. Basically it is a community site where we can share our views with others.
New Google Patent & Quality Guidelines
As we all know, in this era of Internet technology Google is a god among search engines, simply because of the quality of information it puts in front of you through its smart algorithms and out-of-the-box strategies.

Google keeps refining its approaches to search engine rankings to provide the best possible results for user queries. Recently a new Google patent and quality guidelines have been announced, and hopefully they will be implemented soon.

Here is a detailed look at what the new Google engine may stress:

1. Exemplary Search Engine:The Search Engines strive to provide high quality results for a search query. There are several factors that may affect the quality of the results generated by a search engine. For example, some web site producers use spamming techniques to artificially inflate their rank. Also, “stale” documents (i.e., those documents that have not been updated for a period of time and, thus, contain stale data) may be ranked higher than “fresher” documents (i.e., those documents that have been more recently updated and, thus, contain more recent data). In some particular contexts, the higher ranking stale documents degrade the search results. Thus, there remains a need to improve the quality of results generated by search engines.

2. Document Inception Date: According to another implementation, the date on which a document's domain was registered may be used as an indication of the document's inception date. It may be assumed that a document with a fairly recent inception date will not yet have a significant number of links from other documents (back links). Under existing link-based scoring techniques, which score on the number of links to and from a document, such a recent document would be scored lower than an older document with more back links. Note, too, that new domains might have certain advantages over old ones: a document with an inception date of yesterday that is referenced by 10 back links may be scored higher than a document with an inception date of 10 years ago that is referenced by 100 back links, because the rate of link growth for the former is relatively higher. While a spiky rate of growth in the number of back links may be a factor used to score documents, it may also signal an attempt to spam the search engine; in that situation the search engine may actually lower a document's score to reduce the effect of spamming.

3. Content Updates/Changes: A document whose content is edited often may be scored differently than a document whose content remains static over time. Relevant factors include the number of "new" or unique pages associated with a document over a period of time, the ratio of new or unique pages to the total number of pages associated with that document, and the amount the document is updated over one or more periods of time. Content deemed unimportant when changed, such as JavaScript, comments, advertisements, navigational elements or date/time tags, may be given relatively little weight or even ignored altogether when determining the update amount (UA), while content deemed important when changed, such as the title or the anchor text of forward links, may be given more weight. Documents showing an increase in their rate of change might be scored higher than documents with a steady rate of change, even if that steady rate is relatively high; the amount of change may also be a factor in this scoring.

4. Query Analysis:Another factor is Query Analysis the number of results pages your website is ranked for. Query-based factor may relate to the occurrence of certain search terms appearing in queries over time. A particular set of search terms may increasingly appear in queries over a period of time. For example, terms relating to a “hot” topic that is gaining/has gained popularity or a breaking news event would conceivably appear frequently over a period of time. In this case, search engine may score documents associated with these search terms (or queries) higher than documents not associated with these terms. Another query-based factor may relate to queries that remain relatively constant over time but lead to results that change over time. For example, a query relating to “world series champion” leads to search results that change over time (e.g., documents relating to a particular team dominate search results in a given year or time of year). This change can be monitored and used to score documents accordingly. Yet another query-based factor might relate to the “staleness” of documents returned as search results. The staleness of a document may be based on factors, such as document creation date, anchor growth, traffic, content change, forward/back link growth, etc. For some queries, recent documents are very important.

5. Anchor Text:If the content of a document changes such that it differs significantly from the anchor text associated with its back links, then the domain associated with the document may have changed significantly (completely) from a previous. This may occur when a domain expires and a different party purchases the domain. Because anchor text is often considered to be part of the document to which its associated link points, the domain may show up in search results for queries that are no longer on topic. This is an undesirable result. One way to address this problem is to estimate the date that a domain changed its focus. This may be done by determining a date when the text of a document changes significantly or when the text of the anchor text changes significantly. All links and/or anchor text prior to that date may then be ignored or discounted. The freshness of anchor text may also be used as a factor in scoring documents. The freshness of an anchor text may be determined, for example, by the date of appearance/change of the anchor text. Thus the Anchor text is as always the most important factor which can sometimes damage your site very badly.

6. Traffic:According to an implementation consistent with the principles of the invention, information relating to traffic associated with a document over time may be used to generate a score associated with the document. Now Google may monitor the time-varying characteristics of traffic to, or other “use” of, a document by one or more users. A large reduction in traffic may indicate that a document may be stale. Hence this time Google looks at our traffic.

7. User Behavior:According to an implementation consistent with the principles of the invention, information corresponding to individual or aggregate user behavior relating to a document over time may be used to generate (or alter) a score associated with the document. For example, search engine may monitor the number of times that a document is selected from a set of search results and the amount of time one or more users spend accessing the document. If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively.

8. Domain Related Information: Google looks at your web hosting provider to see whether there are spam, porn, or gateway websites on the same server; you might get punished unwittingly, so please choose your web hosting provider carefully. Also, or alternatively, the age of (or other information about) a name server associated with a domain may be used to predict the legitimacy of the domain. A "good" name server may have a mix of different domains from different registrars and a history of hosting those domains, while a "bad" name server might host mainly pornography or doorway domains, domains with commercial words (a common indicator of spam), or primarily bulk domains.

9. User Maintained/Generated Data:Bookmarking and other user maintained data from the Google Tool bar and Browser partners will help to measure the importance of the page. Search engine may monitor data maintained or generated by a user, such as “bookmarks,” “favorites,” or other types of data that may provide some indication of documents favored by, or of interest to, the user. Search engine may obtain this data either directly (e.g., via a browser assistant) or indirectly (e.g., via a browser) to determine the importance of the document.

Conclusion
1. Do not change your link text drastically, or you can hurt your previous linking efforts.
2. If you're buying links, make sure you update your website/landing page, or at least have a good amount of text changed; otherwise you can be caught for spamming.
3. Using some keyword-ranking reporting software can damage your website's rankings, simply because it checks positions without clicking through to your site, and Google will conclude your website is not a quality resource.
4. A new thing with this update is that your web hosting can really hurt your rankings when there are porn/spam/gateway websites on your web server. This is very harsh on shared hosting accounts, as you don't really know what is on the server besides your website, and you can get punished for nothing.


Search Engines Vs Basic Meta Tags
Meta tags serve as the official documentation of a web site. There was a time when meta tags were given special attention and weight as part of web site optimization, but nowadays the trend has completely changed.

On one side of Internet marketing, spammers keep implementing black-hat SEO tricks to play with the major search engines, while on the other hand Google keeps refining its algorithms to fight back and beat these spammers.

So it is all search engines versus spammers, and in between come the real SEO people, who work hard according to Google's guidelines and optimize websites for stable, long-term results. Anyway, to come to the point: I was talking about meta tags.

Nowadays most of the major search engines do not give meta tags any special attention or preference. They no longer have a significant effect on a website's SERPs or rankings; hundreds of other activities play a vital role in on-page optimization, such as titles, proper use of H tags, website navigation architecture, bold and italic fonts, font size, and so on.
WHY DO SEARCH ENGINES NOT CARE ABOUT META TAGS?
This question is quite interesting, and most people who are new to SEO, or have little knowledge of it, are surprised when they learn that search engines don't bother with meta tags any more.

Meta keywords: This tag is meant to hold the unique keywords related to the content of your web page. It gives a fair idea of the whole content of the page, so search engines can tell what the particular page is all about.

BUT what most people started doing was using this tag as a vehicle for promoting their web pages in an unethical way.

Most spammers stuff this tag with the most competitive keywords for their website, without regard to the content actually available on the page or to the recommended length of the tag.

Some keep stuffing the same keywords into the meta keywords tag with multiple repetitions, hoping to cheat the search engine spiders into ranking well for those keywords.

In time, Google detected all these kinds of unethical promotional activities, so nowadays it ignores the meta keywords tag. This tag can therefore be neglected.

Meta description: This tag gives an abstract of the web page so that visitors can get a fair view of what the page is all about. It still has some importance, as most search engines use the content of this tag as the snippet, which is usually displayed in the SERPs.

BUT again, most people stuff this tag with competitive keywords that don't even make sense, without considering the recommended length of the tag, and try to cheat the search engines.

If you have unique, quality content on your site, you can even ignore this tag, because search engines will automatically pick the most relevant content from your web page that matches the query entered by users. So this tag, too, can be neglected.

Meta title: This tag acts as an identity for the web page, giving a clue to the whole discussion or purpose of the page. Google displays the page's <title> element as the title in the SERPs; please remember that I'm talking here about the <title> element of the web page, not a meta title tag.

If you have prepared a well-defined, appropriate page <title>, there is no need to put a meta title tag on your website.

Most people keep the meta title tag and the page's <title> element exactly the same, and they should be. Search engines give preference to the <title> element, not to the meta title tag of the web page, so this tag can also be neglected.

So if you are going to start on-page optimization for a site, I recommend you do not waste your time preparing these meta tags. Instead, work hard on refining your industry keywords, build powerful, unique, quality content, and use your time to improve the rest of the on-page factors for your website; that will give you far better results than these meta tags.
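For reference, here is a minimal sketch of how the tags discussed above look in a page's HTML head; the site name and text are placeholder examples, not taken from any real site:

```html
<head>
  <!-- The <title> element: the one search engines actually display and weight -->
  <title>Worldwide SEO - Search Engine Optimization Tips</title>
  <!-- Meta description: may be reused as the SERP snippet -->
  <meta name="description" content="Practical search engine optimization tips and tools.">
  <!-- Meta keywords: largely ignored by the major search engines today -->
  <meta name="keywords" content="seo, search engine optimization">
</head>
```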



SEO Hats
In the SEO world we often come across this term: the different types of SEO hats.

First, what do the hats mean?
The hats are really just a means of forcing you to think about the different aspects of a problem. They are categorized by colored hats, with the color symbolizing the approach or aspect of thinking you should adopt at that moment. Think of them as modes of attacking a problem.

So what are the six hats?

The White Hat: symbolizing facts and hard data.
The Red Hat: symbolizing emotions. Here we start making hypotheses about what's going on, but only on an emotional level.
The Black or Gray Hat: the color of a judge's robe. Think negatively and critically here.
The Yellow Hat: where we talk about sunny, happy, positive things.
The Green Hat: where you let your mind roam wide and free in endless pastures looking for solutions. Here you relax and explicitly think about solutions, ideally using a creative idea-generation technique.
The Blue Hat: your sky-high, big-picture view of the problem. Think in a wider way.
White Hat SEO:

All the techniques that fall under Google's Webmaster Guidelines, which I discussed in the first post about on-page and off-page optimization, are called white hat SEO techniques.
Black Hat SEO:

What is Black Hat SEO?
It may be defined as the set of techniques used to get higher search engine rankings in an unethical manner. So anything the search engines forbid us from doing, such as doorway pages, cloaking, and hidden text, is black hat SEO.
Black hat SEO Techniques
Hidden Text: Hidden text is textual content that your visitors can't see but that is still readable by search engines. It can be done by making the font color the same as the background color, or the same as a background image.
Keyword Stuffing: A technique used to spam search engines by placing excessive or irrelevant keywords in the meta keywords tag or throughout the content of the page, exploiting the fact that search engines scan web pages for the words that users enter as search criteria.

Landing Pages/Doorway Pages/Bridge Pages: These may be defined as fake pages that the user will never see. Doorway pages are packed with keywords and designed solely for the search engines, in an attempt to trick them into ranking the site higher.

Cloaking: A technique in which the content presented to search engines differs from the content presented to the user's browser. This is done by delivering content based on the IP address of the user requesting the page: when the requester is identified as a search engine spider, a server-side script delivers a different version of the web page. It is used to deceive the search engine. In practice, it is a CGI script that compares the IP address of the requester against a list of known search engine spider IP addresses; if no match is found, the requester is identified as "human." The point is that highly optimized web pages are served to search engines while human visitors get a different web page from the same URL.
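To make the mechanism concrete (shown only to illustrate the spam technique described above, not as something to deploy), here is a minimal Python sketch of that IP check; the spider addresses are made-up placeholders, not real crawler IPs:

```python
# Hypothetical crawler addresses - placeholders, not real spider IPs.
SPIDER_IPS = {"66.249.0.1", "157.55.0.2"}

def choose_page(requester_ip):
    """Return which version of the page a cloaking script would serve."""
    if requester_ip in SPIDER_IPS:
        # A spider match gets the keyword-stuffed, search-engine-only version.
        return "optimized-for-spiders.html"
    # No match: the requester is treated as human and gets the normal page.
    return "normal.html"

print(choose_page("66.249.0.1"))   # spider IP -> optimized page
print(choose_page("203.0.113.9"))  # human visitor -> normal page
```

This is exactly why cloaking is detectable and punishable: the two audiences receive different content from the same URL.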

Link Farming: The process of exchanging reciprocal links with websites purely in order to increase search engine rankings. This is done just to increase the number of sites that link to yours, but it is considered spam: Google says we should exchange links only with sites whose theme is as relevant as our own, and a links page should not contain more than 50 links.

Page Hijacking: A form of spamming a search engine's index, mostly done by web hosts when we forget to renew our domain name. It is achieved by creating a rough copy of a popular website that shows content similar to the original to a web crawler but redirects web surfers to an unrelated or malicious website. Link spammers monitor DNS records for domains that will expire soon, buy them when they expire, and replace the pages with links to their own pages.

Automated Search Engine Submission Software: We should not use such software for promotion purposes. It doesn't work efficiently, and it sometimes gets websites banned. We should not rely on it.

Duplicate Content: Using the same content on different pages. Google considers it spam, and if it detects a large amount of duplicate content on different subdomains or main domains, it can impose a "duplicate content penalty," which results in lost rankings. So we should put distinct, meaningful content on our pages, and on subdomains as well.

Content Solely for Search Engines: Don't publish any content that is meant for search engines only. Content with a high keyword density is not useful to human beings, and it is easily detected by Google and other search engines, which results in penalties. Therefore we should always write content first for humans, and then for crawlers and robots.

Frames: Though frames are not strictly "black hat SEO," they should be avoided, as they cause huge problems for crawlers trying to index the site.

Email Spamming: Sending unwanted, unsolicited email to large numbers of people. It is not directly related to SEO, but it is spamming all the same.

These, then, are the major black hat SEO techniques. They give short-term results and lead to sites being banned, so we should never implement them.
Difference Between Black Hat and White Hat SEO
• Black hat SEO is meant for search engines only, while white hat is meant for human visitors.
• Black hat techniques are hidden from human beings; white hat techniques are visible to them.
• Black hat is meant for short-term results only; white hat is meant for long-term results.
• Black hat methods are unethical ways of optimizing a site; white hat methods are ethical according to search engine norms.
• Black hat methods may result in a site getting banned; white hat methods consistently bring good search engine rankings.
• Search engines are enemies of black hat SEO, while for white hat SEO techniques they are friends.


So always go for white hat SEO techniques, which make your site search engine friendly and fulfill the real motive of making a website.
Search Engine Friendly URLs
What are SEF URLs? "SEF URLs" is the common abbreviation for Search Engine Friendly URLs. The URL is the very first element of any website, and it should be chosen carefully if you are looking to promote your website on the major search engines.

Search engines always tend to give preference to the things that human users like most. So when selecting your URL, take a few things into consideration:

Your domain name should be human-friendly: not too long, easy to remember, and theme-specific.

If the name you have chosen is available as a .com, don't go for any other TLD (.biz, .info, and so on), as .com is the most popular and the one people like.

Take care of canonical issues carefully:

http://www.domainname.com
http://domainname.com
http://domainname.com/index.html
http://www.domainname.com/index.html


To a normal human visitor, all the domains listed above may seem quite similar, but from a search engine's point of view they are entirely different from each other.

So fix all these issues at hosting time, or follow search-engine-friendly redirect methods to solve the canonical issues.

The preferred domain form is http://www.domainname.com, so for the rest of the variants simply implement a search-engine-friendly permanent (301) redirect.
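On an Apache server, that permanent redirect is typically done in an .htaccess file; a minimal sketch, assuming mod_rewrite is enabled and with domainname.com standing in for your own domain:

```apache
RewriteEngine On
# Send the non-www host to the www host with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^domainname\.com$ [NC]
RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]
# Collapse /index.html onto the root URL as well
RewriteRule ^index\.html$ http://www.domainname.com/ [R=301,L]
```

With these rules in place, all four variants listed earlier resolve to the single preferred URL, and search engines consolidate them into one listing.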

Don't go for dynamic URLs: search engines mostly prefer static URLs, because they find them easier to crawl and navigate than dynamic ones. Dynamic URLs that contain query-string characters (?, =, *, %, &) are all more difficult to crawl than simple static URLs.

Try to embed your targeted keyword in your domain name: whether the main keyword in a URL helps rankings is one of the most commonly asked questions.

The answer is quite simple: it barely matters.
An advanced search engine like Google bases its rankings on hundreds of factors, so a keyword in the URL comes very last and has only a minute effect. In some other search engines, such as MSN, a keyword in the URL still carries some weight, but not in Google.

Underscores or dashes: this is another choice you face when designing your URLs.
Google treats these special characters very differently. In my view, dashes are much more beneficial than underscores as far as Google is concerned.

Let's try it ourselves:
Go to Google and
enter: worldwide-seo. Note the number of results and analyze them carefully. Now
enter: worldwide_seo and do the same.

If you look carefully, you will see that the former query returns more results than the latter. That is because Google returns all the results matching worldwide, seo, and worldwide seo.

The latter returns only results specifically containing worldwide_seo.
So why lose traffic from all the other possible combinations by replacing dashes with underscores? Go for dashes rather than underscores.
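The difference comes down to word tokenization: a hyphen is typically treated as a word separator, while an underscore is often kept as part of the word, as in programming identifiers. A rough Python illustration of that splitting behavior (an analogy for how such a tokenizer works, not Google's actual code):

```python
import re

def tokenize(url_slug):
    """Split a slug into words the way a hyphen-as-separator tokenizer would:
    hyphens break words apart, underscores stay inside the token."""
    return re.findall(r"[A-Za-z0-9_]+", url_slug)

print(tokenize("worldwide-seo"))  # ['worldwide', 'seo'] - two searchable words
print(tokenize("worldwide_seo"))  # ['worldwide_seo'] - one opaque token
```

The dashed slug yields two independent words that can match many queries; the underscored slug yields a single token that matches far fewer.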
Placing Appropriate Files on Root Directory...
Here we will cover the files that should be uploaded to the root directory of a website and how they help with SEO. We will discuss robots.txt, urllist.txt, info.txt, and sitemap.xml.
Site map
A sitemap acts as an abstract of your website, much like the index of a book. It lists the URLs of all the site's pages and can exist in different formats: XML, text, or HTML.
Each format has its own purpose; some are meant for search engines and others for human visitors.
XML and text sitemaps are best suited to search engines such as Google and Yahoo, for better indexing of your website.
HTML sitemaps are designed in an attractive way so that any visitor can easily find any page on your site; these graphical sitemaps list all of the site's pages, presenting the entire navigation of your website in a human-friendly format for visitors' ease.
1. sitemap.xml: An XML sitemap lists the following basic information for each URL:
Location: http://www.domain.com/
Priority: 0 to 1.0
Last modified: 2007-10-05T16:16:09+00:00
Change frequency: always, hourly, daily, weekly, monthly, yearly, or never

You can create it manually, or use one of the various online tools available to generate it. Once the XML sitemap file is done, upload it to the root directory of the website.
A well-formed sitemap helps your site get indexed quickly.
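A minimal sketch of such a sitemap.xml file, using the fields above; the URL and values are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com/</loc>
    <lastmod>2007-10-05T16:16:09+00:00</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Each additional page of the site gets its own `<url>` entry inside the `<urlset>`.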

2. urllist.txt: This text file lists all the URLs of a particular website in text format, and is sometimes referred to as the textual sitemap of the site. Yahoo gives special attention to this text sitemap when deep-indexing your website.
You can create it manually by putting one URL per line in a text file, or with the help of free online tools. Again, upload this file as urllist.txt in the root directory of the site.

3. robots.txt: This text file is meant specifically for spiders, and it mostly lists the areas of your website that you do not want crawled by spiders or search engines.
The file consists of directives that instruct spiders how to crawl your website.
For example, you can add Disallow rules to keep spiders out of private or duplicate areas of the site.
You can also point spiders to your sitemap through robots.txt by placing the following lines in the file:

User-agent: *
Sitemap: http://www.yoursite.com/sitemap.xml

The robots.txt file, too, is uploaded to the root directory of the site.
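A slightly fuller robots.txt sketch, combining the sitemap line with Disallow rules; the directory names here are placeholder examples:

```
User-agent: *
# Keep crawlers out of areas that should not be indexed (example paths)
Disallow: /admin/
Disallow: /cgi-bin/
# Point crawlers at the XML sitemap
Sitemap: http://www.yoursite.com/sitemap.xml
```

Everything not listed under a Disallow rule remains open to crawling.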

4. info.txt: This text file includes very basic information about your website, such as:
url:
site_owner:
address:
city:
state:
country:
postal_code:
phone_number:
display_email:
site_name:
Title:
site_description:

This file, too, is uploaded to the root directory of the site.
Multilingual Search Engine Optimization

DEFINITION: Multilingual SEO/search means understanding your online market completely and building a relevant, effective strategy: getting your site into good SEO and PPC positions on non-English-language search engines.
Facts to Be Undertaken
• It is a fact that people search in their own language, and that people in different parts of the world use the Internet in very different ways. For example, in many parts of the world, Google is not the most popular or preferred search engine.
• In the past, companies used to translate their websites in the hope of getting new business. Today, however, multilingual SEO is considered a far more effective international marketing approach than mere translation of a website.
• 63 percent of the total global online population is non-English speaking. That means a lot of people are using search engines in other languages.
• Bigmouth Media's track record in multilingual search engine optimization across global territories is proven: the agency works in over 20 languages, including non-Western character sets, and has tripled traffic to major brands' sites across all continents.
• Multiple languages are used around the globe, and every territory has its own search engine preferences, so you have to stay up to the minute on the major players in every global region.
Procedure
1. Translate the site into the required languages (German, Italian, Spanish, Dutch, French, Arabic, Portuguese, Chinese, and so on).
2. Now you have to draw traffic to the newly translated sites. There are many ways to draw traffic, but the search engines are just as important in German or Spanish as they are in English.
3. The main thing is to be able to move around in the language; if you are not fluent, make sure a translator cleans up any text edits without undoing the changes that are key to your multilingual SEO efforts.
4. Let's assume the original site is in English, the translation into French (for example) is already complete, and you have a list of English search terms (keywords).
5. The first step is to identify equivalent French search terms. This might not give you the same number of terms. For instance, if you start with the 10 search terms around the word "socks" (buy socks, buy socks online, glow-in-the-dark socks, etc.), you will most likely end up with twice as many search terms in French, as there are two common words for socks in French ("bas" and "chaussettes"). This might mean that you need to create additional landing pages for French searchers.
6. Be very careful about relying on literal, official translations when doing keyword research.
7. The next step, of course, is to find out which of the search terms are worth pursuing.
8. The third step is to group the search terms into natural groupings and assign each group to a page on the website, just as one would do in English.
9. On-page optimization: we just have to place the search terms in all the right places.
10. In English, a reference to "website monitoring service" would count as a reference for the search term "website monitoring". But the German equivalent, Überwachungsservice für Webseiten, reads literally in English as "monitoring service for websites".
11. Many companies keep the same filenames when they create a translated site. Keeping the same filenames helps the webmaster keep the internal linking structure simple.
12. Next comes where to house the translated site: on a separate domain or in a subdomain. The general consensus is that it is preferable to give it its own domain with the appropriate country extension.
Second best is a subdomain, which at least carries a semblance of being a separate site and allows some directories to consider it a home page for listing purposes (and you want those directory links).
13. Don't forget to build the links that are so important to your optimization: good-quality, relevant links, both in terms of topics and in terms of the search terms in the language of the site.
Benefits
• Effective multilingual SEO campaigns not only bring the best traffic to your site; they also ensure that the site is usable and speaks to the audience you are targeting in that country. There is no point attracting new visitors if your site fails to meet their expectations. Your multilingual content needs to combine an understanding of what people expect to find with how they search. The key is always to place the international user first, which in turn will benefit your business.
• Sales can increase: localized sites are said to be three to seven times more likely to sell.
• Sell your product to more people abroad.
• Reach new international Clients.
• Globalize your brand name.
• Achieve ROI through Cost effective international SEO.
SEO Tools
I'm going to share with all of you a large collection of SEO tools that can really help with optimizing websites.

These tools can cut down the manual hours you spend on reports for things like keyword analysis, competitor analysis, domain analysis, and so on.
Tools Related To Domains
Basic Domain Analysis:-

1. http://www.who.is/
This will show you: TLD availability, domain name, status, registrar, whois server, referral URL, expiration date, creation date, last update date, name servers, IP address, IP location, website status, server type, Alexa trend/rank, page views per visit, cache date, registrant, and the domain's contact information.

2. Server Information:
http://www.kenkai.com/seo-tools-server-info.php

3. Domain Wayback Check tool: This tool gives you a detailed view of a site's history, from the day of its creation to the present, through all the phases the site has gone through.
http://web.archive.org/collections/web.html
Keyword Research & Analysis
1. Keyword Suggestion Tools:

freekeywords.wordtracker.com
digitalpoint.com/suggestion/
trends.google.com
https://adwords.google.com/select/KeywordToolExternal
http://www.google.com/experimental/
keyworddiscovery.com
http://keywords.submitexpress.com/
goodkeywords.com
http://labs.google.com/sets
inventory.overture.com
iwebtool.com

2. Keyword Density Analysis:
http://www.seocentro.com/tools/search-engines/keyword-density.html
Competitors analysis Tools
iwebtool.com
Seocompany.ca
To check how your competitors are performing on the major search engines, go for this:
http://www.urltrends.com/comparetrend.php
Page Rank Check
To check Page rank or trust rank go for these:
http://www.prcheckingtool.com/
http://www.trustrank.org/
iwebtool.com
seocompany.ca

Reciprocal Link Checking Tool:
http://tools.seobook.com/general/link-check/

Backlink check tool: shows you all your backlinks, along with the anchor text, PR, and outbound links on each linking page.
http://www.backlinkwatch.com/

Meta Tags Analysis:
http://www.seocentro.com/tools/search-engines/metatag-analyzer.html
Tools for Different Google Data Canters
To check your PR, indexed pages, backlinks, and so on across different Google data centers, check out these:
http://oyoy.eu/google/pr/
http://oyoy.eu/google/cache/
http://oyoy.eu/google/links/
http://oyoy.eu/google/pages/

Check SERPs on GOOGLE:
This very useful tool lets you see how your website performs on Google, i.e., which keywords you rank at the top for.
http://www.seodigger.com/
Site Map Generator Tools
1. XML Sitemaps: indexes a maximum of 500 pages per sitemap; generates HTML, XML, and text sitemaps.
http://www.xml-sitemaps.com/

2. A1 Sitemap Generator: unlimited pages; generates both HTML and XML sitemaps.
http://www.micro-sys.dk/products/sitemap-generator/

Basic Elements Of Google SERPs
This is a very common topic and doesn't contain any out-of-the-box information, but I thought I'd share the basic elements of Google's SERPs with all of you.

1. Title: This element is taken from the page's <title> element.
2. Snippet: A basic introduction fetched from the web page depending on the query.
There is no hard and fast rule that this information comes from the meta description tag only.
Sometimes it is taken from the DMOZ web directory, if the website is listed on DMOZ; this happens mostly when Google cannot find appropriate information on the page itself for a particular query, in which case it prefers the DMOZ description.
3. URL: Exactly the URL Google has found in its database containing the most appropriate information for the query the user entered. It can be any inner page of the website, or the home page.
4. Page Size: Google's SERPs display the page size of every result returned for the entered query.
5. Cached Info: The link to Google's cached snapshot, reflecting Googlebot's last visit to the respective page.
6. Similar Pages: Displays the websites with the same theme that are most relevant to the particular site.
7. Sitelinks: This is a most important feature of Google's organic results: quality sites receive a bonus of up to eight sitelinks displayed alongside their normal listing in the SERPs. These links show the most valued quality pages from your website, as selected by Google.
See the difference: instead of a single result in the SERPs, you get up to eight extra pages from your website displayed to users, helping them navigate easily to your site's most popular areas.

How can these sitelinks be generated? Most people consider it a matter of luck.

My experience with sitelinks is somewhat different:

The domain/site should be aged and genuinely on-theme enough to gain Google's trust.
Smart navigation in the footer can play a part in gaining sitelinks.
So design your footer like this: pick some highly informative (quality-content) pages from your site, place them under smart anchor text in the footer, and use it throughout your site.

A good number of one-way backlinks matching those anchors will do the rest for you.

8. Google Maps/Stock Quotes: For some queries you will notice a plus sign (click + to expand) within the Google SERPs, which started showing up next to invitations to map an address or get a stock quote a month ago. Google added these features to go beyond the user's query terms and provide real relevancy using complex algorithms.

Google Maps reflects the address and location of a particular business; anyone can create their own Google Maps listing, which is free and quite easy.
Google Stock Quotes produce a graphical representation of valuable information for a particular company.
What is a Forum?
A forum is a place where groups of people can submit postings for all to read.
Users can discuss ideas and topics and plan out events.
"Forum" is another term for message board or chat board.

Who uses forums?
Lots of people use forums. Gamers use forums to discuss game play and combat strategies.
Programmers use them to discuss code ideas and to lay out groundwork before beginning projects.
You can provide help or support to others on many subjects through a forum.
It’s just a really great way to discuss any subject with other people.
There are also other great features like searching to find posts others have made based on keywords you enter.