Saturday, December 18, 2010

XHTML2 Working Group Documents Published as W3C Notes

On 16 December 2010 W3C published a number of documents from the XHTML 2 Working Group as W3C Notes:
  • XHTML 2.0, a general-purpose markup language designed to represent documents for a wide range of purposes across the World Wide Web.
  • XML Events 2, designed to provide an interoperable way of associating behaviors with document-level markup.
  • CURIE Syntax 1.0, a syntax for expressing Compact URIs.
  • HLink, a module that provides the ability to specify which attributes of elements represent links, and how those links should be traversed. HLink also extends XLink use to a wider class of languages than those restricted to the syntactic style allowed by XLink.
  • XHTML Role Attribute Module, a module to support role classification of elements
  • XHTML Access Module, a module designed to enhance document accessibility.
  • XFrames, intended originally to replace HTML Frames.
Learn more about the XHTML 2 Working Group.


Related Links
XHTML page with considerations of search engine optimization

Tuesday, December 14, 2010

Search engine updates on the keywords meta tag

Even though Google and some other major search engines do not use the keywords meta tag, other search engines still treat it as a good location to include words that you want indexed when those words do not appear in your content.

Three examples for using the keywords meta tag are:

1. To include synonyms of words that are not included in the content of your page. People search on many variations of words and phrases. Sometimes it is hard to include all of the different variations of those words in the content of your page without making your page sound like a keywords page.

2. To place emphasis on certain words. It is not always easy to place your major keywords at the top of your page or in an <h1> tag; your major keywords might not be introduced until the middle of your page. Using the keywords meta tag, you can add these keywords to inform the search engines that they are important. Remember two important rules when using this tag: when emphasizing keywords, do not place the same keyword more than two times in the keywords meta tag, since this could be considered keyword spamming and can reduce your ranking value. Also, do not place irrelevant keywords in your meta tag; using irrelevant keywords could get your site banned from many search engines.

3. Another use for the keywords meta tag is to add words from different languages. A multi-language site with one language as its default page might want to add words from other languages to attract people speaking languages that are different than the language of the default page.
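Putting the three uses above together, a keywords meta tag that adds synonyms and a second-language variant might look like the sketch below (the keyword values are purely illustrative):

```html
<!-- Placed inside the <head> of the page; keyword values are examples only -->
<meta name="keywords" content="search engine optimization, seo, site ranking, optimizacion para buscadores" />
```

Note that each keyword appears at most twice, and every term is relevant to the page, in line with the rules above.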

See Also
Importance Of Keywords in SEO
SEO-Title of your Web page
SEO Tasks

Tuesday, December 7, 2010

SEO-Title of your Web page

The title of your Web page is the first piece of information that an Internet searcher will see about your site. Many people will decide whether or not to click to your site based on the title text.

Title tags are also used for:
1. Indexing your site with almost all major search engines.
2. Describing your page when someone adds it to their Favorites or Bookmarks.
3. Providing the main hyperlink text that links to your site in search results.

For these reasons, make sure that your title tag is descriptive, unique, clear, concise and contains keywords that people use when they are searching for your site. Do not use "Home Page". This is neither descriptive nor unique. Also, use different titles for different pages. This will allow your results to be unique in the results page if multiple pages from your site are displayed together.
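As a sketch of the difference, compare a generic title with a descriptive one (the product and company names here are invented examples):

```html
<!-- Too generic: neither descriptive nor unique -->
<title>Home Page</title>

<!-- Descriptive, unique, and keyword-bearing -->
<title>Handmade Leather Wallets - Acme Leather Goods</title>
```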

Read
SEO - Page description meta tag

Tuesday, November 30, 2010

XHTML 1.1, XHTML Basic 1.1, XHTML Print Recommendations Revised

On 24 November 2010 the XHTML2 Working Group published three revised Recommendations: XHTML 1.1 - Module-based XHTML Second Edition, XHTML Basic 1.1 Second Edition, and XHTML-Print Second Edition. A related specification, XHTML Modularization, defines a framework for building XHTML language definitions from a set of modules. Each of the revised Recommendations combines modules to different ends: XHTML 1.1 uses a "full" set of modules; XHTML Basic 1.1 is a minimal set of modules for environments such as mobile phones, PDAs, pagers, and set-top boxes; XHTML-Print targets printing, e.g., from mobile devices to printers that may not be full-featured. These revisions incorporate corrections to errata; see each document for the list of changes. Learn more about the XHTML 2 Working Group.

Tuesday, November 23, 2010

HTML5 Web Messaging Draft Published

19 November 2010

The Web Applications Working Group has published the First Public Working Draft of HTML5 Web Messaging. This specification defines two mechanisms for communicating between browsing contexts in HTML documents. Cross document messaging allows documents to communicate with each other regardless of their source domain, in a way designed to not enable cross-site scripting attacks. Channel messaging allows independent pieces of code (e.g. running in different browsing contexts) to communicate directly. Learn more about the Rich Web Client Activity.

Monday, November 22, 2010

SEO - New Features in ASP.NET4

New features in ASP.NET 4 for search engine optimization.
  1. Page.MetaKeywords and Page.MetaDescription properties
  2. URL Routing support for ASP.NET Web Forms
  3. Response.RedirectPermanent() method
Page.MetaKeywords and Page.MetaDescription properties


With ASP.NET 4 Web Forms we can easily set MetaKeywords and MetaDescription in the code-behind class; the keywords and description meta tags can now be set programmatically. Below is a simple code snippet that demonstrates setting these properties within a Page_Load() event handler.
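A minimal sketch of such a code-behind handler (the page title and keyword values are illustrative):

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    // MetaKeywords and MetaDescription are new Page properties in ASP.NET 4;
    // they render as <meta name="keywords"/> and <meta name="description"/> in the head
    Page.Title = "Products - Software";
    Page.MetaKeywords = "software, products, SEO";
    Page.MetaDescription = "Browse our catalog of software products.";
}
```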


In addition to setting the Keywords and Description properties programmatically in your code-behind, you can now also set them declaratively within the @Page directive at the top of .aspx pages. The snippet below demonstrates how to do this:
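A sketch of the declarative form (the file name, class name, and meta values are placeholders):

```aspx
<%@ Page Language="C#" AutoEventWireup="true"
    CodeBehind="Products.aspx.cs" Inherits="MyApp.Products"
    MetaKeywords="software, products, SEO"
    MetaDescription="Browse our catalog of software products." %>
```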



URL Routing support for ASP.NET Web Forms

URL routing was a capability we first introduced with ASP.NET 3.5 SP1, and which is already used within ASP.NET MVC applications to expose clean, SEO-friendly “web 2.0” URLs.  URL routing lets you configure an application to accept request URLs that do not map to physical files. Instead, you can use routing to define URLs that are semantically meaningful to users and that can help with search-engine optimization (SEO).
For example, the URL for a traditional page that displays product categories might look like this:
http://www.mysite.com/products.aspx?category=software
Using the URL routing engine in ASP.NET 4 you can now configure the application to accept the following URL instead to render the same information:
http://www.mysite.com/products/software
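Such a route can be registered at application startup with the MapPageRoute() helper introduced in ASP.NET 4 (the route name and paths below are illustrative):

```csharp
// In Global.asax.cs
void Application_Start(object sender, EventArgs e)
{
    // Maps requests for /products/{category} onto the physical products.aspx page
    System.Web.Routing.RouteTable.Routes.MapPageRoute(
        "products-route",        // route name
        "products/{category}",   // URL pattern
        "~/products.aspx");      // physical file that handles the request
}
```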
Response.RedirectPermanent() method
It is pretty common within web applications to move pages and other content around over time, which can lead to an accumulation of stale links in search engines.
ASP.NET 4 introduces a new Response.RedirectPermanent(string url) helper method that can be used to perform a redirect using an HTTP 301 (moved permanently) response.  This will cause search engines and other user agents that recognize permanent redirects to store and use the new URL that is associated with the content.  This will enable your content to be indexed and your search engine page ranking to improve.
Below is an example of using the new Response.RedirectPermanent() method to redirect to a specific URL:
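A sketch of such a redirect from the old page's code-behind (the target URL is an example):

```csharp
// In the code-behind of the page at the old URL
protected void Page_Load(object sender, EventArgs e)
{
    // Issues an HTTP 301 (moved permanently) instead of the default 302
    Response.RedirectPermanent("/products/software");
}
```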


ASP.NET 4 also introduces new Response.RedirectToRoute(string routeName) and Response.RedirectToRoutePermanent(string routeName) helper methods that can be used to redirect users using either a temporary or permanent redirect using the URL routing engine.  The code snippets below demonstrate how to issue temporary and permanent redirects to named routes (that take a category parameter) registered with the URL routing system.
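A sketch of both helpers, assuming a route named "products-route" with a category parameter has been registered with the routing system:

```csharp
// Temporary (302) redirect to the named route, passing the category parameter
Response.RedirectToRoute("products-route", new { category = "software" });

// Permanent (301) redirect to the same named route
Response.RedirectToRoutePermanent("products-route", new { category = "software" });
```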


 
 

Saturday, November 20, 2010

SEO Things to Remember

Some basic things to remember while optimizing a website for search engines:

SEO Analysis and Review
Pre-Optimization SEO Analysis
Keyword Research & Analysis
Competitive Analysis
Log File Analysis
No. of On-Page Optimization Activities According to the Package 
Title Tag Optimization
META Tags Optimization
Image Alt Tags Optimization
Content Optimization
HTML Sitemap Creation
Robots.txt Creation
Google Analytics Setup
Google XML Sitemap
No. of Off-Page Optimization Activities According to the Package 
Search engine submission
- Manual
Directories Submission
- Manual
Link Exchange 
Article Writing
Article Submissions
Classified Submissions
Forum Postings
No. of SMO Activities According to the Package 
SMO Profile Creations
(Twitter, Facebook etc.)
Social Bookmarking
Press Release Distribution
Blog Posting
Video Submissions



* Create unique, accurate page titles
* Make use of the "description" meta tag
* Improving Site Structure
* Improve the structure of your URLs
* Make your site easier to navigate
* Optimizing Content
* Offer quality content and services
* Write better anchor text
* Optimize your use of images
* Use heading tags appropriately
* Make effective use of robots.txt
* Be aware of rel="nofollow" for links

SEO for Mobile Phones
* Notify Google of mobile sites
* Guide mobile users accurately
* Promotions and Analysis
* Promote your website in the right ways
* Make use of free webmaster tools

Read more

Tuesday, November 16, 2010

Global Adoption of W3C Standards Boosted by ISO/IEC Official Recognition

W3C 03 November 2010

The International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) took steps that will encourage greater international adoption of W3C standards. W3C is now an "ISO/IEC JTC 1 PAS Submitter", bringing "de jure" standards communities closer to the Internet ecosystem. As national bodies refer increasingly to W3C's widely deployed standards, users will benefit from an improved Web experience based on W3C's standards for an Open Web Platform. W3C expects to use this process (1) to help avoid global market fragmentation; (2) to improve deployment within government use of the specification; and (3) when there is evidence of stability/market acceptance of the specification. Web Services specifications will likely constitute the first package W3C will submit, by the end of 2010. For more information, see the W3C PAS Submission FAQ.

Thursday, November 11, 2010

How to track SEO progress?

"How do I track SEO progress?" is a very common question for SEO experts and website owners. You can track search engine optimization progress by using various analysis tools. It is very important to find out the popularity of a website, its standards compliance, its visitor behavior, and the quality of its content.
Top SEO analysis tools are listed below.

Woo Rank
WooRank is the fruit of much hard work by online marketing consultants, clockwork-precision developers and creative designers. Together they’ve built a simple yet powerful website analysis tool meant to leverage search engines.

Website Grader
Website Grader by HubSpot is one of the most popular website tools; it analyzes your full page for on-site and off-site SEO factors. The analysis also includes domain information, meta-data, RSS links, and inbound and outbound links.

Lipperhey
Lipperhey is an online service that uses objective criteria to analyze the quality and searchability of a website by the major search engines. Lipperhey is very easy to use: you enter the URL of the website you would like to have analyzed, then Lipperhey analyzes and assesses it. If you want to improve your website's results, get a subscription and let Lipperhey tell you exactly what you need to do.

SEOrush
SEOrush is a one-touch resource which provides a free SEO report for your URL or a competitor's URL. The report covers on-page SEO and off-page SEO as well as social presence, indexed pages, validation, meta information, various ranks, and standard domain information.

Link Voodoo
Link Voodoo is a powerful combination of a TrustRank Score Calculator and a Backlink Analyzer. These are just the first two steps in what will be a complete and revolutionary set of professional link building tools.

Piloun
This website also gives a brief insight into your page's SEO and details about your Alexa, Google, and Compete rankings as well. It also analyzes your JavaScript and CSS.

SpyderMate
SpyderMate is a web-based search engine optimization analysis tool. Users simply enter their web site's URL and SpyderMate returns a very detailed analysis of the site's current internet marketing state. It can be used as a benchmark for SEO success as well as a guide to whipping your web site into shape. SpyderMate is useful both for those new to SEO and for professional SEO practitioners.

SEO workers
SEO Workers are experts at analyzing, discovering, and implementing successful Internet marketing optimization strategies for every type of website, from high-end ecommerce sites to business services websites in all niche markets.

SEO Chat Tools
The SEO Tools found in this section were designed to assist you in configuring your website(s) for search engine optimization within the different search engines. Select one of the SEO Tools from below to begin optimizing your website(s) for top placement within the search engines.

SEOmoz
The SEOmoz toolset includes over twenty SEO tools designed to help with every aspect of search engine optimization, including on-page targeting, site crawlability, competitive analysis, rank checking, and keyword difficulty.

Wednesday, October 27, 2010

First Draft of Navigation Timing Draft Published

26 October 2010

The Web Performance Working Group has published the First Public Working Draft of Navigation Timing. To address the need for complete information on user experience, this document introduces the NavigationTiming interfaces. These interfaces allow JavaScript mechanisms to provide complete client-side latency measurements within applications, including a user's perceived page load time. Learn more about the Rich Web Client Activity.

Tuesday, October 26, 2010

Eight HTML5 Drafts Updated

The HTML Working Group published eight documents:

•Working Drafts of the HTML5 specification, the accompanying explanatory document HTML5 differences from HTML4, and the related non-normative reference HTML: The Markup Language.

•Working Drafts of the specifications HTML+RDFa 1.1 and HTML Microdata, which define mechanisms for embedding machine-readable data in HTML documents, and the specification HTML Canvas 2D Context, which defines a 2D immediate-mode graphics API for use with the HTML5 canvas element.

•HTML5: Techniques for providing useful text alternatives, which is intended to help authors provide useful text alternatives for images in HTML documents.

•Polyglot Markup: HTML-Compatible XHTML Documents, which is intended to help authors produce XHTML documents that are also compatible with non-XML HTML syntax and parsing rules.

Learn more about HTML5.



Tuesday, October 19, 2010

Updated Techniques for Web Content Accessibility Guidelines (WCAG) 2.0

The Web Content Accessibility Guidelines Working Group has published updates of two Notes that accompany WCAG 2.0: Techniques for WCAG 2.0 and Understanding WCAG 2.0. (This is not an update to WCAG 2.0, which is a stable document.)
W3C News

Thursday, September 23, 2010

SEM - Search Engine Marketing Tips

To start SEM you need to understand and analyze your client's business information and what they are best known for.

For example, get a brief description of the company or business, its products and services, who the other competitors are, and what they are doing.
Create a list of important technical information about the client's business.

If your client has not identified the keywords for the business, make a list of suggested keywords and do some further research to find the keywords that are best for the business.
Identify the geographical target of the business and its market: local, regional, national, international, or all of these.

Clarify the client's view on pay-per-click, email and newsletter marketing, affiliate marketing, and so on. Then make a plan for link popularity and a link-building marketing campaign for the website.

List the best customers of your client's business, and identify which organizations or industries are supportive of and/or complementary to its products and services.

Finalize the goals of SEM, such as higher rankings in search engines (Google, Yahoo, MSN, Ask, etc.) and an increased number of unique and targeted visitors.

Then create a pre-SEO analysis report covering:
1. Index Status: Number of WebPages indexed by search engines
2. Back links: Incoming links to your website other than your own.
3. IP Neighbors: Website IP Neighbors
4. Blacklist IP: List of anti-spam databases where the website IP is blacklisted
5. Canonicalization: Process of picking up the right URL for the website
6. Domain Duplication: Duplicate URLs of the website
7. Black Hat: Unethical techniques implemented in the website
8. Traffic Estimation: Provides you a rough traffic estimate of the website
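For item 5, one common way to handle canonicalization is to declare the preferred URL with a rel="canonical" link element on every duplicate variant of a page (the URL below is an invented example):

```html
<!-- Placed in the <head> of each duplicate variant of the page -->
<link rel="canonical" href="http://www.example.com/products/software" />
```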


 More SEO Tasks

Wednesday, September 22, 2010

W3C Multimodal Architecture and Interfaces Draft Updated

The Multimodal Interaction Working Group has published an updated Working Draft of Multimodal Architecture and Interfaces (MMI Architecture), which defines a general and flexible framework providing interoperability among modality-specific components from different vendors - for example, speech recognition from one vendor and handwriting recognition from another. The main changes from the previous draft are
(1) the inclusion of state charts for modality components,
(2) the addition of a 'confidential' field to life-cycle events and
(3) the removal of the 'media' field from life-cycle events.

A diff-marked version of this document is available. Learn more about the W3C Multimodal Interaction Activity.

Sunday, September 5, 2010

XML Security Drafts Published

The XML Security Working Group has published five working drafts. XML Signature 2.0, Canonical XML 2.0 and the XML Signature Streamable Profile of XPath 1.0 are part of an ongoing effort to rework XML Signature and Canonical XML in order to address issues around performance, streaming, robustness, and attack surface. The Working Group has also published updated Working Drafts for its XML Signature Best Practices and XML Security Relax NG Schemas Working Group Notes. Learn more about XML Security.

Tuesday, August 24, 2010

Web Portal Requirements and Analysis

You should implement your portal in a series of short, focused phases, each of which is designed to deliver value to the business. The modular nature of the portal, which promotes the creation of modular, reusable portlet- and template-based UIs, user profiles, and content types, helps in this regard.

Basic Requirements in Web Portal development:
• Provide portal administration user interface
• Provide functionality of User groups and directory
• Portal should highlight Dashboard, intranet or enterprise searching
• Provide facility for Dashboard personalization and customization
• Include Calendar tool (individual, group, enterprise)
• Provide alerting messages and events
• Instant messaging
• Facility for newsletters subscription
• Blogging application
1. Member login
2. Blog Creation: Provide topic selection and tag selection for blog
3. Blog Verification and approval
4. Allow comments on blogs
5. Stars rating for Blogs
• Licensed news and information feeds
1. Latest News
2. Most Viewed news
3. Most Commented news
4. Other Top Stories
• RSS aggregators and publish RSS feeds
• Attention streams
• Collaboration spaces and team sites
• Facility for profile management
• Storage and document repositories
• Mapping and geolocation tools

Monday, August 23, 2010

The Worldwide Web Consortium (W3C) announced Web Performance Working Group

W3C has announced a new Web Performance Working Group, whose mission is to provide methods to measure aspects of application performance of user agent features and APIs. As Web browsers and their underlying engines include richer capabilities and become more powerful, Web developers are building more sophisticated applications where application performance is increasingly important. Developers need the ability to assess and understand the performance characteristics of their applications using well-defined interoperable methods. This new Working Group will look at user agent features and APIs to measure aspects of application performance. Group deliverables will apply to desktop and mobile browsers and other non-browser environments where appropriate and will be consistent with Web technologies designed in other working groups including HTML, CSS, WebApps, DAP and SVG. Learn more in the Working Group charter and how this work fits into the W3C's Rich Web Client Activity.

Thursday, August 19, 2010

SEO for CMS - SEO for Content Management Systems

The first step is to choose a quality CMS. Using the various analytical reports on open source CMSs, choose the one that best fits your business requirements. Analyze the hosting, geographic location, and domain situation, and ensure the maintenance and upgrade path is clear.

This will reduce the SEO consultant's basic work. Set a strict quality-control procedure for the content management system implementation.

Validate a reference site with the W3C validators. Check that metadata and the meta description are accessible and editable in the CMS; use the correct plug-in to achieve this. Check that URLs are generated correctly for content, and use a standard procedure for this activity.

The main drawback of a CMS is that it can use different paths for the same content. You can fix this with a plugin that allows you to choose the most valid address for each content item or page.
Other regular SEO activities also apply to a CMS:
1. Use Key Words in URLs
2. Use Site Navigation as Text links, don't use images
3. Publish to a Flat Directory
4. Eliminate Broken Links
5. Xml Site Maps submission
6. Google Site Maps
7. Avoid Spelling Errors
8. Avoid Duplicate Content
9. Appropriate Use of robots.txt
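For item 9, an appropriate robots.txt blocks only what should stay out of the index and points crawlers at the sitemap. A minimal sketch (the paths and domain are placeholders):

```text
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: http://www.example.com/sitemap.xml
```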

Tuesday, August 17, 2010

Role of w3c validations in search engine optimization

Obviously content is very important for search engine optimization.
If you have a site with no quality content and no backlinks, yet are 100% compliant with W3C standards, the site will still not rank well in search engines.

Google has clearly stated that W3C validation does not affect search engine rankings; a valid page is not rewarded with a higher search ranking.

Still, we should consider the points listed below as the role of W3C validation in search engine optimization:

1. Minimize page load time for a great website visitor experience.
2. Minimize bounce rates for your website.

Once you have validated your markup, CSS, RSS feeds, and mobile compatibility using the W3C validators, you can move on to other important search engine optimization tasks.
http://seow3c.blogspot.com/2010/07/seo-tasks.html

Monday, August 16, 2010

Google Ranking - KEYWORD plays very important role

Here we discuss the importance of keywords in website development.
The domain name and page names should be valid keywords relevant to the web content; in a URL, the first words are the most important.
Areas where the keyword plays a very important role:
1. Page header with proper keywords
2. Page title tag with proper keyword
3. Description meta tag with keyword
4. Keyword in the keywords meta tag
5. Keyword density in body text
6. Keyword in H1, H2 and H3
7. Font size of keyword
8. Keyword in alt text
9. Keyword in links to site pages (anchor text)

Web Security Context Working Group has published a W3C Recommendation

The Web Security Context Working Group has published a W3C Recommendation of Web Security Context: User Interface Guidelines. This specification deals with the trust decisions that users must make online, and with ways to support them in making safe and informed decisions where possible. It describes user interactions and user interface guidelines with a goal toward making security usable, based on known best practice in this area. Learn more about the Security Activity.

Tuesday, August 10, 2010

RDFa Group has published two Working Drafts: RDFa Core 1.1 and XHTML+RDFa 1.1

The RDFa Working Group has just published two Working Drafts: RDFa Core 1.1 and XHTML+RDFa 1.1. RDFa Core 1.1 is a specification for attributes to express structured data in any markup language. The embedded data already available in the markup language (e.g., XHTML) is reused by the RDFa markup, so that publishers don't need to repeat significant data in the document content. XHTML+RDFa 1.1 is an XHTML family markup language that extends the XHTML 1.1 markup language with the attributes defined in RDFa Core 1.1. This document is intended for authors who want to create XHTML-family documents that embed rich semantic markup. Learn more about the Semantic Web Activity.

Saturday, August 7, 2010

W3C Releases All-in-One Validator - Unicorn

W3C has released Unicorn, a one-stop tool to help people improve the quality of their Web pages. Unicorn combines a number of popular tools in a single, easy interface, including the Markup validator, CSS validator, mobileOK checker, and Feed validator, which remain available as individual services as well. W3C invites developers to enhance the service by creating new modules and testing them in its online developer space (or by installing Unicorn locally). W3C looks forward to code contributions from the community as well as suggestions for new features. Learn more about W3C open source software.


The W3C's unified validator service will definitely help people to improve the quality of their Web pages.


Saturday, July 31, 2010

XHTML page with considerations of search engine optimization

Obviously content is very important for search engine optimization. But it is equally important to code your pages using external CSS and JavaScript files and to use relevant, semantic XHTML formatting, which is part of on-page optimization.


Benefits of XHTML
  • Strict XML syntax encourages authors to write well-formed markup, which some authors may find easier to maintain
  • Integrates directly with other XML vocabularies, such as SVG and MathML
  • Allows the use of XML Processing, which some authors use as part of their editing and/or publishing processes  
Definitions of XHTML

•XHTML (Extensible Hypertext Markup Language) is a family of XML markup languages that mirror or extend versions of the widely used Hypertext Markup Language (HTML), the language in which web pages are written
•The next generation of HTML, compliant with XML standards. Although it is very similar to the current HTML, it follows a stricter set of rules, thus allowing for better automatic code validation.
•A general purpose markup language that utilises existing and future XML elements to present information on the WWW

Important XHTML elements

1. First declare the document type: always remember to include the correct DOCTYPE to tell the browser which version of XHTML you are using
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
2. Assign the document an xmlns attribute and set the page language code
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
3. Declare the head element of the page  <head>
4. Declare the character set used in the page
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
5. Declare the page title element
<title>Page Title</title>
6. Declare the page description
<meta name="description" content="page description" />
7. Include external CSS and JavaScript files.
<link href="css/styles1.css" rel="stylesheet" type="text/css" media="all" />
<script src="js/fngetScript.js" type="text/javascript"></script>

8. Close the head element
</head>
9. The document body
  <body>

Important XHTML elements of a page

<h1></h1>
Use only one <h1> element on a page; like the page title, it should be a short description of the page's text content. In other words, the <h1> tag should represent the main topic of the page.
<h2></h2> / <h3></h3>
Subheadings <h2> through <h6> are also available. If one heading is a sub-topic of another, use the next-level heading.
<p></p>
The paragraph tag.
<em></em>
Used to italicize text. As with the <strong> tag, search engines put emphasis on text inside this tag.
<strong></strong>
Used to bold text.
<a></a>
<a href="pagename" rel="nofollow">
<img>
Include an image alt attribute along with width and height:
<img src="image.gif" width="150" height="60" alt="Image keyword" />

Remember to close all the tags. Never over-use <DIV> tags; use them only in appropriate places. For lists it is more appropriate to use the <UL> tag instead. Validate your web page as it is being developed.

SEO Tasks

Basic SEO Tasks

Today, we are going to list the required steps to design and develop SEO-compliant web sites. We apply these steps to every web site we design and develop to ensure that every customer ranks high on the most popular search engines.

1. Table based layouts are an outdated and ineffective practice for web design
        a. Check xhtml/css templates
2. Web pages should be designed specifically for easy printing by end users
3. Alternative text is easy to implement and key for search engine placement and accessibility
4. Headings are easy to implement and key for search engine placement and accessibility
5. Avoid keywords with exceptionally high competition
6. Find the Best Keywords ( keywords meta tag )
7. Discover What Your Competitors are Doing
8. Write Very Linkable & Sharable Content ( Tips for Web Content Writer )
9. Optimize Your Title and Meta Tags ( Page Title of web page , Page description meta tag )
10. Optimizing Your Headings and Subheadings
11. Use Title and ALT Attributes
12. Optimizing File Nomenclatures
13. Tell the Search Engines What to Index
14. Feed Search Engines Static and XML Site Maps
15. Use Checklists and Validators
          1. W3C Markup Validation Service (XHTML): test your design at http://validator.w3.org/check?uri=www.seow3c.com
          2. CSS Validation Service :  http://jigsaw.w3.org/css-validator/
          3. Link Checker :  http://validator.w3.org/checklink
16. Plan Internet Marketing of website
          1. Leave comments and backlinks on social media networks and related websites.
                  Back link checker
                  Google search text: link:seow3c.blogspot.com/
          2. Blog posting
          http://seow3c.blogspot.com/
          3. Press Release
          4. Documents sharing (sample links)
          5. Company profile submission on wiki sites

SEO Things to Remember
Search Engine Marketing

If your web design is not ranking high on search engines, please contact us so we can run a scan to determine the cause and address any SEO concerns.
 

Friday, July 30, 2010

What is Search Engine Optimization

Search Engine Optimization (SEO) means getting your site ranked higher in search results so that more people show up at your doorstep.
Paid and Organic search listings
Organic search engine listings are the main results users see when they do a Google search. The websites appearing in the organic listings appear because those sites are most relevant to the user’s keywords. Indeed, most of these sites appear in the top of the search engine results because the webmasters of these sites have used SEO tactics to ensure top rankings.
The paid or sponsored listings usually appear on the top, bottom and to the right of the regular organic listings. Usually these are pay per click (PPC) ads, which means the website owner only pays when someone clicks on his ad (as opposed to paying for impressions).
This isn’t an either/or game. Just because you do SEO doesn’t mean you can’t/shouldn’t use PPC and vice versa.
SEO is not free traffic, it takes time and/or money to get good organic rankings but in the long run it’s usually cheaper than PPC.

What's off-page SEO?
Off page SEO refers to those things you do outside of your own web pages to enhance their rankings in the search engines.
This is a glorified way of saying, “get links” and did I mention, “more links”.

How quickly will I see results?
If you target long tail keywords you can see results pretty quickly but always remember SEO is a long term strategy not a set and forget thing.
If you’re after more competitive keywords prepare to commit to it for at least three months of consistent effort.

Thursday, July 29, 2010

Open Source CMS Reports: History of CMS Awards Received

Open Source CMS Reports: History of CMS Awards Received: "Joomla! Best Overall Open Source CMS - 2d Place (Packt Publishing) 2008 Best PHP Open Source CMS - 2d Place (tie) (Packt Publishing) 2008 ..."

Open Source CMS Reports: CMS Comparison: Ranking for Alexa, Compete and Qua...

Open Source CMS Reports: CMS Comparison: Ranking for Alexa, Compete and Qua...: "Compete and Quantcast also provide rankings of the sites on the web. Both of these systems, however tend to focus only on the highest traffi..."

Open Source CMS Reports: Open Source CMS Google PageRank

Open Source CMS Reports: Open Source CMS Google PageRank: "9 >> Joomla!,Plone,WordPress8 >> DotNetNuke,Drupal,eZ Publish,Typo3,Xoops7 >> Alfresco,e107,Jahia,Liferay,MODx,OpenCms,phpWebSite,TikiWiki6 ..."

Open Source CMS Reports: Open Source CMS Market Updates

Open Source CMS Reports: Open Source CMS Market Updates: "The data shows a clear and significant lead for Joomla! Joomla! is not only the leading system, but exceeds the nearest competitor, Drupal, ..."

Open Source CMS Reports: Download links for Availabel CMS (Content manageme...

Open Source CMS Reports: Download links for Availabel CMS (Content manageme...: "WordPress is a leading open source CMS. Backed by both a largecommunity and the efforts of Automattic, Inc., it has grown froma pure bloggin..."