Tuesday, July 31, 2012

Free SEO Tools

1. Search Engine Submissions

Having your website indexed and searchable by search engines is the first and most important step in SEO for your website. This free service will allow you to submit your website to over 20 key search engines (including Google, Yahoo, Bing & Ask) and jump-start your SEO campaign.

2. Website SEO Analysis

Enter your website’s URL on Woo Rank and get a detailed SEO analysis report of your website in seconds. They will also send a PDF of the same SEO report straight to your email address, should you wish.
Please click here to go to Woo Rank.

3. Keyword Tool

Google’s very own Keyword Tool. You do not have to be registered or signed in to Google to use this keyword tool. Use it to research your keywords and see what people are searching for on Google and how competitive those keywords are.
Please click here to go to Google Keyword Tool.

4. Keyword & Website Analysis

Alexa is one of the key website analysis tools for researching your website and those of your competitors. It will show you the Alexa Rank along with search metrics such as high impact keywords.
Please click here to go to Alexa.

5. Free Back Links Directory

The main method of effective back linking is getting other sites to send a one-way (i.e. non-reciprocal) link to your website. Some directories are good for this, and Global-Dir.com offers free one-way back links.

Monday, July 30, 2012

Applying WCAG 2.0 to Non-Web ICT - First Draft Published

27 July 2012
The Web Content Accessibility Guidelines Working Group (WCAG WG) published the First Public Working Draft of Applying WCAG 2.0 to Non-Web Information and Communications Technologies (WCAG2ICT). It is a draft of an informative (that is, not normative) W3C Working Group Note that will clarify how Web Content Accessibility Guidelines (WCAG) 2.0 can be applied to non-Web ICT. Please see important background information in the Call for Review e-mail. Comments are welcome through 7 September 2012. Read about the Web Accessibility Initiative (WAI).

Friday, July 27, 2012

W3C Invites Implementations of Page Visibility, Performance Timeline, and User Timing

26 July 2012
The Web Performance Working Group invites implementation of three Candidate Recommendations:
  • Page Visibility, which defines a means for site developers to programmatically determine the current visibility state of the page in order to develop power- and CPU-efficient web applications.
  • Performance Timeline, which defines a unified interface to store and retrieve performance metric data. This specification does not cover individual performance metric interfaces.
  • User Timing, which defines an interface to help web developers measure the performance of their applications by giving them access to high precision timestamps.
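A short sketch of how these three specifications might work together in a page; the application functions are hypothetical, and browser support for these Candidate Recommendations varied at the time:

```javascript
// Pause expensive work when the page is hidden (Page Visibility).
document.addEventListener("visibilitychange", function () {
  if (document.hidden) {
    pauseAnimations();   // hypothetical app function
  } else {
    resumeAnimations();  // hypothetical app function
  }
});

// Mark and measure a span of application code with
// high-precision timestamps (User Timing).
performance.mark("render-start");
renderPage();            // hypothetical app function
performance.mark("render-end");
performance.measure("render", "render-start", "render-end");

// Retrieve the recorded entries through the Performance Timeline.
var entries = performance.getEntriesByName("render");
console.log(entries[0].duration);
```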
Learn more about the Rich Web Client Activity.

Best Practices for Fragment Identifiers and Media Type Definitions Draft Published

26 July 2012

The Technical Architecture Group has published the First Public Working Draft of Best Practices for Fragment Identifiers and Media Type Definitions. Fragment identifiers within URIs are specified as being interpreted based on the media type of a representation. Media type definitions therefore have to provide details about how fragment identifiers are interpreted for that media type. This document recommends best practices for the authors of media type definitions, for the authors of structured syntax suffix definitions (such as +xml), for the authors of specifications that define syntax for fragment identifiers, and for authors that publish documents that are intended to be used with fragment identifiers or who refer to URIs using fragment identifiers. Learn more about the Technical Architecture Group.
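For example, the same fragment identifier can mean different things depending on the media type of the representation it is resolved against (the URI below is hypothetical):

```
http://example.com/report#chap2
  text/html        → the element whose id attribute is "chap2"
  application/xml  → interpreted per the XPointer framework for XML media types
```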

Tuesday, July 24, 2012

Adobe, Google, Microsoft Sponsorships Bolster W3C Staffing of HTML5 Work

24 July 2012
W3C is pleased to announce commitments from Adobe, Google, and Microsoft for sponsorship funding that will enable W3C to provide additional staffing in support of the HTML Working Group's full range of activities, including editing several specifications and developing tests. These sponsorships will help W3C fill a position announced in June in response to an April call for editors from the HTML Working Group Chairs. In their April email, the Chairs also outlined the group's parallel efforts to finalize a stable HTML5 standard by 2014 and engage with the community on future HTML features. Learn more about the HTML Working Group.

Three Provenance Last Call Drafts Published

24 July 2012
The Provenance Working Group published three Last Call Working Drafts. Provenance is information about entities, activities, and people involved in producing a piece of data or thing, which can be used to form assessments about its quality, reliability, or trustworthiness.
  • PROV-DM: The PROV Data Model introduces the provenance concepts found in PROV and defines PROV-DM types and relations. The PROV data model is domain-agnostic, but is equipped with extensibility points allowing domain-specific information to be included.
  • PROV-O: The PROV Ontology expresses the PROV Data Model using the OWL2 Web Ontology Language (OWL2). It provides a set of classes, properties, and restrictions that can be used to represent and interchange provenance information generated in different systems and under different contexts. It can also be specialized to create new classes and properties to model provenance information for different applications and domains.
  • PROV-N: The Provenance Notation is introduced to provide examples of the PROV data model: aimed at human consumption, PROV-N allows serializations of PROV instances to be created in a compact manner. PROV-N facilitates the mapping of the PROV data model to concrete syntax, and is used as the basis for a formal semantics of PROV. The purpose of this document is to define the PROV-N notation.
Comments on the Last Call Working Drafts are welcome through 18 September. The group also published a Working Draft of PROV Model Primer, which provides an intuitive introduction and guide to the PROV specification for provenance on the Web. The primer is intended as a starting point for those wishing to create or use PROV data. Learn more about the Semantic Web Activity.

Last Call: SPARQL 1.1 Query Language

24 July 2012
The SPARQL Working Group has published a Last Call Working Draft of SPARQL 1.1 Query Language. RDF is a directed, labeled graph data format for representing information in the Web. This specification defines the syntax and semantics of the SPARQL query language for RDF. SPARQL can be used to express queries across diverse data sources, whether the data is stored natively as RDF or viewed as RDF via middleware. SPARQL contains capabilities for querying required and optional graph patterns along with their conjunctions and disjunctions. SPARQL also supports aggregation, subqueries, negation, creating values by expressions, extensible value testing, and constraining queries by source RDF graph. The results of SPARQL queries can be result sets or RDF graphs. Comments are welcome through 21 August. Learn more about the Semantic Web Activity.
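As a flavor of the capabilities listed above, here is a small example query combining an optional graph pattern with aggregation; the FOAF vocabulary is real, but the data it would run against is hypothetical:

```sparql
PREFIX foaf: <http://xmlns.com/foaf/0.1/>

# For each person, count how many people they know (aggregation),
# keeping people with no foaf:knows triples via OPTIONAL.
SELECT ?name (COUNT(?friend) AS ?friends)
WHERE {
  ?person foaf:name ?name .
  OPTIONAL { ?person foaf:knows ?friend }
}
GROUP BY ?name
```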

SEO Advice: Create a free product or service

Content drives the most traffic when you offer something useful. There are many types of useful content you can create, and they largely depend on the niche of your site. You can have articles with tons of advice or short tips, but one of the most powerful ways to get traffic is to create a free product or service. When this product or service gets popular and people start visiting your site, chances are that they will visit the other sections of the site as well.

Free products and services are great for getting free traffic to your site, and one of the best varieties in this aspect is viral content. Viral content is called so because it spreads like a virus – i.e. when users like your content, they send it to their friends, post it on various sites, and promote it for free in many different ways. Viral content distributes itself, and your only task is to create it and submit it to a couple of popular sites. After that, users pick it up and distribute it for you. Viral content can be a hot video or a presentation, but it can also be a good old article or an image.

Thursday, July 19, 2012

W3C Identifies how the Web will Transform the Digital Signage Industry

18 July 2012
W3C announced new momentum for making the Web the future interoperable platform for Digital Signage. W3C issued a summary of key topics and use cases for bringing Digital Signage to the Web, as well as a first gap analysis of enhancements to the Web to enable the transformation of the Digital Signage ecosystem. Digital signage covers a spectrum of display sizes and locations, from sports arenas and urban video terminals of every shape, to monitors in elevators, storefront windows, train stations, and public kiosks featuring rich interactivity. In June, an initial opportunity to discuss next-generation Web-based Digital Signage services drew industry stakeholders to a W3C Workshop "All Signs Point to the Web," hosted by NTT. Read the full press release about the Workshop report and join the Web-based Signage Business Group to develop use cases and requirements for standardization.

Registration Open for Mobile Web Training Courses (in English y en Español)

18 July 2012
Registration is now open for a new round of mobile Web online training courses, which begin on 3 September 2012. In these courses, you learn to "mobilize" pages and deliver a good Web experience on mobile devices. These 6-week W3C online training courses, supported by experienced and professional trainers, let you study at your own pace. The courses are delivered separately in English and in Spanish.
Learn more about W3C online training for developers.

Monday, July 16, 2012

Basic optimization tips: Effects on traffic

Meta Description

The meta description tag is an element search engines use to help determine what the page is about. The tag also appears as your site description in search results, so writing your tag to appeal to human eyes can lead to increased clicks on your listing. You will notice that the search engine bolds the keywords you originally searched for in the description tag. If you do not have a description tag, Google will write one for you, and you really don't want that.
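A hedged example of what such a tag might look like; the page and description text are invented for illustration, and a common rule of thumb is to keep the description unique and around 150-160 characters:

```html
<head>
  <!-- Hypothetical page: a unique, human-readable summary of this page only -->
  <meta name="description"
        content="Free SEO tools and tips: search engine submission, keyword research, and website analysis for beginners.">
</head>
```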

Google PageRank

Your Google PageRank (toolbar PageRank is between 0 and 10 - the higher, the better) is basically a measure of trust and authority. Domains with a higher PR are likely to rank for more terms than sites with low PR.

Alexa Rank

Your Alexa Rank is actually a measure of traffic to your site where they rank your site compared to all other sites in the world. If your site is in the top 100,000 Alexa rank, your traffic is probably doing OK. For example, pearanalytics.com floats between 47,000 and 75,000.

Title Tag

The title tag is an element that the search engines use to help determine what the page is about. Since it shows up as the first line of your listing in search results, it can make or break your clicks if it does not sound appealing to people. A concise and appropriate title tag projects an image of professionalism as well as encourages users to bookmark your page, knowing they won't have to edit text to remember what they bookmarked.
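A minimal illustration (the title text and site name are hypothetical); titles are commonly kept concise, with the most important keywords near the front:

```html
<head>
  <!-- Shows up as the first line of the search listing and as the bookmark name -->
  <title>Free SEO Tools and Basic Optimization Tips | Example Site</title>
</head>
```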

Robots.txt

Your robots.txt file, located in your root folder, is a way for webmasters to indicate which pages/folders/directories should not be accessed by crawlers or search engines. A good example is any page behind a login. However, there are some serious misuses of the robots file that we come across sometimes, and we want to try and alert you to those. Overusing it or blocking too many sections of your site can harm your inbound link effectiveness.
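A minimal robots.txt sketch blocking the kind of login-protected areas mentioned above; the paths are hypothetical, and anything not disallowed stays crawlable by default:

```
# robots.txt in the site root, e.g. http://example.com/robots.txt
User-agent: *
Disallow: /login/    # pages behind a login
Disallow: /admin/    # private administration area
```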

Page Load Time

Having slow loading pages can affect your rankings, and ultimately your traffic. Even the traffic you do get may bounce at a higher rate on slow loading pages. If you have an e-commerce site, expect a loss in sales for pages that load too slowly.

Clean URL

Use clean URLs and add targeted keywords where you can to enhance the SEO friendliness of your site. Search engines will highlight those keywords in the results, so descriptive pages fare better than random characters and number sequences. But be careful about pages that have affiliate codes or IDs in them: if they are duplicate copies of existing pages on the site, make sure the search engine is not seeing these affiliate pages as duplicate content. You can "NOINDEX" them if necessary.
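A minimal sketch of turning a page title into a clean, keyword-bearing URL slug; real sites would also need to guarantee uniqueness and handle non-Latin characters:

```javascript
// Turn a human-readable page title into a clean URL slug.
function slugify(title) {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse every run of non-alphanumerics into one hyphen
    .replace(/^-+|-+$/g, "");    // trim leading/trailing hyphens
}

// e.g. /blog/10-basic-optimization-tips instead of /blog/post.php?id=8371
console.log(slugify("10 Basic Optimization Tips!"));
```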

Domain Age

A young domain will likely not rank well immediately depending on competitiveness, unless there is a major social or viral event to drive a massive amount of traffic to the site in a short period of time. Domain age is used in the "trust and authority" calculation the search engine does. Also, purchase your domain out to 5 or 10 years instead of just 1 or 2 years at a time. That makes the search engine comfortable that you plan on being around a while.

Analytics

While this won't affect your traffic, it is what you need to measure YOUR traffic.

Friday, July 13, 2012

W3C Release: Four Device API Specifications Published; HTML Media Capture Last Call

12 July 2012
The Device APIs Working Group has published four Working Drafts:
  • a Last Call Working Draft of HTML Media Capture. The HTML Media Capture specification defines HTML form extensions that facilitate user access to media capture capabilities of the hosting device. Comments are welcome through 9 August.
  • a First Public Working Draft of Pick Media Intent, which enables access to a user's media gallery from a Web application.
  • a First Public Working Draft of Proximity Events, which defines a means to receive events that correspond to a proximity sensor detecting the presence of a physical object.
  • a Working Draft of Pick Contacts Intent, which enables access to a user's address book service from a Web application.
Learn more about the Ubiquitous Web Applications Activity.

W3C Release: File API Draft Published

12 July 2012
The Web Applications Working Group has published a Working Draft of File API. Web applications should have the ability to manipulate as wide a range of user input as possible, including files that a user may wish to upload to a remote server or manipulate inside a rich web application. This specification defines the basic representations for files, lists of files, errors raised by access to files, and programmatic ways to read files. Additionally, this specification defines an interface that represents "raw data" which can be asynchronously processed on the main thread of conforming user agents. The interfaces and API defined in this specification can be used with other interfaces and APIs exposed to the Open Web Platform. Learn more about the Rich Web Client Activity.
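A minimal sketch of reading a user-selected file with the draft's FileReader interface; it assumes a hypothetical `<input type="file" id="upload">` element on the page:

```javascript
// Read a user-selected file as text, asynchronously, on the main thread.
document.getElementById("upload").addEventListener("change", function (event) {
  var file = event.target.files[0];  // a File, taken from the FileList
  var reader = new FileReader();

  reader.onload = function () {
    console.log(reader.result);      // the file's contents as a string
  };
  reader.onerror = function () {
    console.log("Read failed:", reader.error);
  };

  reader.readAsText(file);
});
```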

W3C Release: Two JSON-LD First Drafts Published

12 July 2012
JSON has proven to be a highly useful object serialization and messaging format. JSON-LD harmonizes the representation of Linked Data in JSON by outlining a common JSON representation format for expressing directed graphs; mixing both Linked Data and non-Linked Data in a single document. The RDF Working Group has published two related First Public Working Drafts:
  • JSON-LD API 1.0 outlines an API and a set of algorithms for transforming JSON-LD documents in order to make them easier to work with in programming environments like JavaScript, Python, and Ruby.
  • JSON-LD Syntax 1.0 outlines a common JSON representation format for expressing directed graphs; mixing both Linked Data and non-Linked Data in a single document.
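For illustration, a small hypothetical JSON-LD document: the @context maps plain JSON keys to Linked Data IRIs (here, FOAF terms), so the same document can be consumed as ordinary JSON or as a directed graph:

```json
{
  "@context": {
    "name": "http://xmlns.com/foaf/0.1/name",
    "homepage": {
      "@id": "http://xmlns.com/foaf/0.1/homepage",
      "@type": "@id"
    }
  },
  "name": "Jane Doe",
  "homepage": "http://example.com/jane"
}
```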
Learn more about the Semantic Web Activity.

Thursday, July 12, 2012

SEO and w3c standards: SEO Tasks

SEO and w3c standards: SEO Tasks: Basic SEO Tasks 1. Table based layouts are an outdated and ineffective practice for web design         a. Check xhtml/css templates 2. ...

Thursday, July 5, 2012

Impact of W3C Validation on SEO

Here we refer to some experts' comments to analyze the effect of W3C validation on SEO.

1. Aaron Wall, of SEOBook:
    If you want to get links from web designers who charge high rates then W3C validation is important to SEO, otherwise it has little direct importance outside of ensuring proper rendering to end users. When one visits Amazon.com or Google or Yahoo! (or just about any billion Dollar+ internet company) they will find a website that doesn’t validate. Why is that?

2. Brent Payne, SEO Director for Tribune
    I like to keep errors under 25 or so, though Tribune has 100+ errors. Perfect code, I don’t think is necessary but you don’t want to have too malformed of HTML. Some say it is a ranking factor, I say you just don’t want to have stuff that is too unexpected for the bots.

3. Dennis Goedegebuure, Senior Manager & Head of SEO at eBay Inc.
    It depends on the type of errors and how many, it all depends on whether the crawler can actually read the real content of the page.

4. Matt Cutts, the official voice of Google
Here is a video in which Matt discusses the effect of W3C validation on SEO.
As you can see from the above testimonials, validation has no direct impact on SEO and you will not receive any ranking bonus from Google. But this doesn’t mean that you shouldn’t fix some of the problems: doing so makes the site more appealing to visitors and helps keep it really speedy.

Tuesday, July 3, 2012

Social Media impact on SEO

The long and short of the answer would appear to be yes. Both Google and Bing admitted late last year to using "social signals" to help rank search results. Additionally, Google has a beta test version of social search under its Labs tools that directly signals which social media outlet it has pulled results from, as well as its beta version of the "+1 button," which is its answer to the Facebook "Like" button.

From Bing:

“We do look at the social authority of a user. We look at how many people you follow, how many follow you, and this can add a little weight to a listing in regular search results.”

And from Google:

“Yes, we do use [tweeted links and RTs] as a signal. It is used as a signal in our organic and news rankings. We also use it to enhance our news universal by marking how many people shared an article.”

The bottom line is this: social signals do matter from an SEO standpoint, but they aren’t strong enough to justify artificially padding your Facebook “likes”, Twitter “tweets” and Google “+1s.” Instead, providing your website readers with social tools that make sharing your content on social networks easy, and encouraging them to share articles they enjoy, should be enough to grow your social signals profile in an effective and sustainable way.

Selectors API Level 2 Draft Published

28 June 2012
The Web Applications Working Group has published a Working Draft of Selectors API Level 2. Selectors, which are widely used in CSS, are patterns that match against elements in a tree structure. The Selectors API specification defines methods for retrieving Element nodes from the DOM by matching against a group of selectors, and for testing if a given element matches a particular selector. It is often desirable to perform DOM operations on a specific set of elements in a document. These methods simplify the process of acquiring and testing specific elements, especially compared with the more verbose techniques defined and used in the past. Learn more about the Rich Web Client Activity.
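A brief sketch of the two method families the specification defines; the selectors and markup are hypothetical:

```javascript
// Retrieve elements by selector instead of manual DOM traversal.
var firstLink = document.querySelector("nav a.active");   // first matching Element, or null
var items = document.querySelectorAll("ul.menu > li");    // all matches, as a static NodeList

// Level 2 also defines testing an element against a selector
// (named matches(); browsers of the era shipped prefixed variants).
if (firstLink && firstLink.matches(".active")) {
  console.log("The first link is the active one.");
}
```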

Media Capture and Streams Draft Published

28 June 2012
The Web Real-Time Communication and Device APIs Working Groups published a Public Working Draft of Media Capture and Streams. This document defines a set of JavaScript APIs that allow local media, including audio and video, to be requested from a platform. Learn more about the Ubiquitous Web Applications Activity.
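A hedged sketch of requesting local media in the callback style of this era's draft; implementations at the time exposed the method behind vendor prefixes (e.g. webkitGetUserMedia), and the exact API shape was still in flux:

```javascript
// Request the local camera and microphone as a MediaStream.
navigator.getUserMedia(
  { video: true, audio: true },
  function (stream) {
    // Attach the local stream to a <video> element for preview.
    var video = document.querySelector("video");
    video.src = URL.createObjectURL(stream);
    video.play();
  },
  function (error) {
    console.log("getUserMedia failed:", error);
  }
);
```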

Web Intents Draft Published

26 June 2012
The Web Intents Task Force, jointly operated by the Device Applications (DAP) Working Group and the Web Applications (WebApps) Working Group, has published a First Public Working Draft of Web Intents. This document defines DOM interfaces and markup used by client and service pages to create, receive, and reply to Web Intents messages, and the procedures the User Agent carries out to facilitate that process. Learn more about the Ubiquitous Web Applications Activity and the Rich Web Client Activity.