W3C today announced a new charter for the HTML Working Group, which runs until the end of June 2015.
The HTML Working Group's mission remains the development of the HTML language and its associated APIs. This new charter
includes an experiment where the HTML Working Group can publish some of their Recommendation-track specifications under
a permissive license. This is intended to encourage collaboration and make it easier to reuse materials.
See the charter and the FAQ Regarding HTML Working Group Charter License Experiment for more details. The group remains on schedule to bring HTML 5.0 to W3C Recommendation in 2014. Learn more about the HTML Working Group.
Monday, September 30, 2013
Thursday, September 26, 2013
Just a Few Days Left to Register for W3C HTML5 Training Course
Don’t miss it! Register now for the upcoming W3C HTML5 online course
that starts next Monday, 30 September 2013. Acclaimed trainer Michel
Buffa will cover the techniques developers and designers need to create
great Web pages and apps. This new course edition has been significantly
enhanced since the June 2013 edition. It features additional sections, including a JavaScript crash course and advanced techniques for time-based animation, 2D geometric transformations, and the Web Audio API, all illustrated by numerous examples. Learn more about W3DevCampus, the official W3C online training for Web developers.
First Public Working Draft: WAI-ARIA 1.1
The Protocols and Formats Working Group today published a First Public Working Draft of Accessible Rich Internet Applications WAI-ARIA 1.1.
WAI-ARIA provides an ontology of roles, states, and properties that define accessible user interface elements. It can be used to improve the accessibility and interoperability of web content, particularly web applications. It is introduced in the WAI-ARIA Overview. WAI-ARIA 1.1 is expected to include only a few changes from 1.0; the primary change in this Draft is the addition of aria-describedat. Learn about the current status of WAI-ARIA 1.0 and 1.1, and the Web Accessibility Initiative (WAI).

Tuesday, September 24, 2013
Timed Text Markup Language 1 (TTML1) (Second Edition) is a W3C Recommendation
The Timed Text Working Group has published a W3C Recommendation of Timed Text Markup Language 1 (TTML1) (Second Edition).
This document specifies Timed Text Markup Language (TTML), Version 1,
also known as TTML1, in terms of a vocabulary and semantics thereof. The
Timed Text Markup Language is a content type that represents timed text
media for the purpose of interchange among authoring systems. Timed
text is textual information that is intrinsically or extrinsically
associated with timing information. It is intended to be used for the
purpose of transcoding or exchanging timed text information among legacy
distribution content formats presently in use for subtitling and
captioning functions. In addition to being used for interchange among
legacy distribution content formats, TTML Content may be used directly
as a distribution format, for example, providing a standard content
format to reference from a <track> element in an HTML5 document,
or a <text> or <textstream> media element in a SMIL 2.1
document. Learn more about the Video in the Web Activity.
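As a concrete illustration of the `<track>` case mentioned above, TTML content can be referenced from an HTML5 document like this (a sketch only: the file names are invented, and browser support for TTML in `<track>` is engine-dependent, with many browsers expecting WebVTT instead):

```html
<!-- Hypothetical file names for illustration. TTML support in <track>
     varies by engine; many browsers only accept WebVTT here. -->
<video src="lecture.mp4" controls>
  <track src="captions.ttml" kind="captions" srclang="en" label="English">
</video>
```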
Call for Review: Internationalization Tag Set (ITS) Version 2.0 Proposed Recommendation Published
The MultilingualWeb-LT Working Group has published a Proposed Recommendation of Internationalization Tag Set (ITS) Version 2.0.
The technology described in this document, "Internationalization Tag Set (ITS) 2.0", enhances the foundation for integrating automated processing of human language into core Web technologies. ITS 2.0 bears many commonalities with its predecessor, ITS 1.0, but provides additional concepts designed to foster the automated creation and processing of multilingual Web content. ITS 2.0 focuses on HTML and on XML-based formats in general, and can leverage processing based on the
XML Localization Interchange File Format (XLIFF), as well as the Natural
Language Processing Interchange Format (NIF). Comments are welcome
through 22 October. Learn more about the Internationalization Activity.
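As one concrete example of an ITS 2.0 data category: the Translate data category maps onto HTML5's native translate attribute, letting authors mark content that machine-translation workflows should leave untouched (a minimal sketch; the element content is invented):

```html
<!-- The Translate data category in HTML: "Start" is a UI label that
     should not be machine-translated. -->
<p>Press the <span translate="no">Start</span> button to begin.</p>
```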
Network Service Discovery Draft Published
The Device APIs Working Group has published a Working Draft of Network Service Discovery.
This specification defines a mechanism for an HTML document to discover
and subsequently communicate with HTTP-based services advertised via
common discovery protocols within the current network. Learn more about
the Ubiquitous Web Applications Activity.
SEO Optimization - Keyword Density
It has always been about being noticed. People dress well or do something bizarre in order to be noticed or to make a statement. To get ahead in life or business, one needs to be at the top, first in line. More often than not, the World Wide Web functions along the same principles.
With tough competition between websites, articles, and e-commerce sites, what makes one more successful than another is the position it gets on search engines. The higher the ranking, the larger the number of hits, or traffic, to the site. To achieve a higher position, websites use the strategy of SEO optimization, and content is written so that it is keyword dense.
It is not as simple as using words repeatedly; you need to know which keywords are relevant to your website and pages. Once you have a master list, ensure that:
• The keyword meta tag matches your content and has around 900 characters or 25 words.
• The keyword density is between 3-10%. Avoid using the same word more than once in a sentence, and check a keyword validator to determine the exact density percentage.
• Use is concentrated in the top half of the page; many search engine spiders do not go beyond 25-50% of a page.
• Keywords are spelled with variations and also include plurals.
Go one step further and make use of a keyword density checker. This is an automated system that combs through your web content and highlights words used in higher density, enabling you to perfect your pages and provide what search engines and their spiders want.
A Keyword Density Cloud (see http://www.webconfs.com/keyword-density-checker.php) can be used to crawl selected URLs, analyze word density, and remove common stop words. The tool can be added to your website and fulfil a vital role in SEO optimization.
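As a rough illustration of what such a checker computes, here is a minimal density function (a sketch only; real tools like the one above also strip stop words and handle multi-word phrases, and the function name is invented):

```javascript
// Minimal keyword-density sketch: the percentage of words in `text`
// that match `keyword`, case-insensitively.
function keywordDensity(text, keyword) {
  const words = text.toLowerCase().match(/[a-z0-9']+/g) || [];
  const hits = words.filter((w) => w === keyword.toLowerCase()).length;
  return words.length === 0 ? 0 : (hits / words.length) * 100;
}

// "seo" appears 2 times among 8 words, i.e. 25% - far above the 3-10%
// range recommended above, so this sample text would be over-optimized.
console.log(keywordDensity("SEO tips: good SEO content wins search rankings", "seo"));
```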
It is all about knowing how to gather your audience or customers, and a keyword density checker or cloud can help you past several stumbling blocks. The golden rules of SEO are keyword density, keyword frequency, keyword prominence, and keyword proximity.
The basic rule of thumb is to know what keywords your potential customers are likely to use. Keywords must not be random but relevant to your line of business. The trick is to strike a balance: neither too many nor too few. Too many can get your site banned, and too few means lower rankings, so your wonderful site goes unnoticed.
So, while sticking to your marketing plan and focus, take a moment to take care of web content and its many nuances. Make content meaningful and relevant, find the most important words users will type to find what your site offers (put yourself in the user's shoes), and construct content using the golden rules above.
If you tread the right path, success on the World Wide Web will be yours.
Author Bio
Aaron Brooks is a freelance writer for www.1888seoservices.com, a website offering SEO consulting, link building, professional SEO training, and online marketing tips.
Article Source: http://www.ArticleGeek.com - Free Website Content
Sunday, September 22, 2013
Last Call: TriG
20 September 2013
The RDF Working Group has published a Last Call Working Draft of TriG.
The Resource Description Framework (RDF) is a general-purpose language
for representing information in the Web. This document defines a textual
syntax for RDF called TriG that allows an RDF dataset to be completely
written in a compact and natural text form, with abbreviations for
common usage patterns and datatypes. TriG is an extension of the Turtle format. Comments are welcome through 11 October. Learn more about the Semantic Web Activity.

DOMMatrix interface Draft Published
20 September 2013
The CSS Working Group and the SVG Working Group have published a First Public Working Draft of DOMMatrix interface.
This specification describes a transformation matrix interface with dimensions of 3x2 and 4x4. The transformation matrix interface replaces the SVGMatrix interface from SVG. It is a common interface used to describe 2D and 3D transformations on a graphical context for SVG, Canvas 2D Context, and CSS Transforms. Learn more about the Style Activity and the Graphics Activity.

CSS Ruby Module Level 1 and CSS Syntax Module Level 3 Drafts Published
20 September 2013
The Cascading Style Sheets (CSS) Working Group has published two Working Drafts today:
- CSS Ruby Module Level 1. "Ruby" refers to short runs of text alongside the base text, typically used in East Asian documents to indicate pronunciation or to provide a short annotation. This module describes the rendering model and formatting controls related to displaying ruby annotations in CSS. CSS is a language for describing the rendering of structured documents (such as HTML and XML) on screen, on paper, in speech, etc.
- CSS Syntax Module Level 3. This module describes, in general terms, the basic structure and syntax of CSS stylesheets. It defines, in detail, the syntax and parsing of CSS - how to turn a stream of bytes into a meaningful stylesheet.
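For the ruby draft, a small illustrative CSS fragment (the property names are taken from the Level 1 draft and may change before the spec stabilizes):

```css
/* Position ruby annotations above the base text and center each
   annotation over its base, per the CSS Ruby Module Level 1 draft. */
ruby { ruby-position: over; }
rt   { ruby-align: center; }
```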
Friday, September 20, 2013
SEO and w3c standards: SEO - Right video at the right time is priceless
Using Videos In News Releases for Better SEO: In a survey conducted by Forbes Magazine, 75% of senior executives said that they watched wo...
SEO and w3c standards: Bad backlinks kill search engine rankings
Bad backlinks kill search engine rankings. Website owners and SEO companies know this. Still, some small business owners have spent $100,00...
Wednesday, September 18, 2013
Effects on Small & Medium-Sized Businesses - Penguin 2.0
Local Citations
Local business directories are high-quality sites meant to list your business details, much like the "yellow pages" but online. The more high-quality, trusted sites that contain your business name, address, and phone number, the better: these listings help you rank locally and give you a stronger organic presence.
A page for each location
The recent Penguin update calls for more specific pages. The best way an SMB owner can interpret this is by creating a specific page for each location, with unique text. By creating pages that have unique location information, you have a better chance of ranking each page locally in your DMA (designated market area).
Added tip: if you include microdata on these pages, you will get an extra boost!
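For instance, a location page might mark up its name, address, and phone number with schema.org microdata (a sketch; all business details below are placeholders):

```html
<!-- schema.org LocalBusiness microdata; every value is a placeholder. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Plumbing</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
  Phone: <span itemprop="telephone">(555) 555-0100</span>
</div>
```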
Claim your authorship
Recently, Matt Cutts made a video about authorship tags and their importance for local SEO. Adding this tag helps Google distinguish you from potential spammers or unnatural SEO methods, and helps protect you from entering any negative filters associated with this update.
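The 2013-era authorship markup linked a page to the author's Google+ profile with rel="author" (the profile URL below is a placeholder, not a real profile):

```html
<!-- Authorship: rel="author" pointing at the author's Google+ profile.
     The profile URL is a placeholder. -->
<link rel="author" href="https://plus.google.com/your-profile-id">
```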
Power of Local Pages
To rank locally, having powerful local pages on Google, Bing, and Yahoo can get you more organic coverage as well as a better ranking in the search results pages. Fill out as much information as you can for these pages (hours of operation, types of products and services, etc.) to fully optimize them. Push people to provide consumer reviews when possible; an active, positive customer review section can help you rank better and make more sales!
Social Media
Get involved with Facebook, Twitter, and your other social platforms. Penguin takes into account your social influence, and that plays a part in your overall local rankings. Come up with a posting schedule and keep to it. Use more imagery in your posts, because studies show there is more engagement when you do. Produce interesting content on your site and share it on your social profiles. This will drive traffic as well as boost your organic rankings.
Clean up your bad links
For the first time ever, we have to worry about links pointing to our sites from years ago. If toxic links point to your site, they need to be pruned and removed; your current links could be holding you back. Link Research Tools offers a Link Detox report you can run. Once you have run it, you can log into Google Webmaster Tools and disavow those links, or even contact the website owners directly to have the toxic links removed.
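A disavow file, as uploaded to Google Webmaster Tools, is plain text with one entry per line, mixing whole domains and individual URLs (the entries below are placeholders):

```
# Placeholder disavow entries; lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow a single page:
http://link-farm.example/bad-links.html
```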
If you apply these seven simple steps, you will see your site's rankings in the search engines start to grow quickly, and you can rest assured the tactics you are implementing are safe and effective.
New Tracking Protection Working Group Chairs
18 September 2013
Today W3C appointed two new Chairs of the Tracking Protection Working Group: Justin Brookman and Carl Cargill. They join continuing co-Chair Matthias Schunter. The Working Group updated two draft DNT specifications this week, and Matthias Schunter announced a stable plan for reaching Last Call. Learn more about the Tracking Protection Working Group.

W3C Webinar: Developing Portable Mobile Applications with Compelling User Experience using the W3C MMI Architecture
17 September 2013
The W3C Multimodal Interaction Working Group (MMI-WG) is pleased to announce its second webinar, "Developing Portable Mobile Applications with Compelling User Experience using the W3C MMI Architecture", to be held on September 24, 2013, at 11:00 a.m. ET. Prior to this webinar, the MMI-WG held the W3C Workshop on Rich Multimodal Application Development on July 22-23 in the New York metropolitan area, US, and identified that distributed/dynamic applications depend on the ability of devices and environments to find each other and learn what modalities they support. This second webinar will therefore focus on device/service discovery for handling Modality Components of the MMI Architecture dynamically.
The discussion during the webinar will interest anyone who wants to take advantage of the dramatic increase in new interaction modes, whether for health care, financial services, broadcasting, automotive, gaming, or consumer devices.
Several experts from the industry and analyst communities will share their experiences and views on the explosive growth of opportunities for the development of applications that provide enhanced multimodal user-experiences. Read more and register for the webinar. Learn more about Multimodal Interaction at W3C.
Monday, September 16, 2013
Updated Drafts of Tracking Preference Expression (DNT), and Tracking Compliance and Scope
13 September 2013
The Tracking Protection Working Group has updated two Working Drafts:- Tracking Preference Expression (DNT). This specification defines the technical mechanisms for expressing a tracking preference via the DNT request header field in HTTP, via an HTML DOM property readable by embedded scripts, and via properties accessible to various user agent plug-in or extension APIs. It also defines mechanisms for sites to signal whether and how they honor this preference, both in the form of a machine-readable tracking status resource at a well-known location and via a "Tk" response header field, and a mechanism for allowing the user to approve exceptions to DNT as desired.
- Tracking Compliance and Scope. This specification defines the meaning of a Do Not Track (DNT) preference and sets out practices for websites to comply with this preference.
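Per the TPE draft, the DNT request header carries "1" (the user opts out of tracking), "0" (the user grants permission), or is absent (no preference expressed). A minimal server-side sketch of interpreting that value (the function and return labels are invented for illustration):

```javascript
// Interpret the value of an incoming DNT request header, following the
// meanings defined in the Tracking Preference Expression draft.
function interpretDNT(headerValue) {
  if (headerValue === "1") return "opt-out";  // user declines tracking
  if (headerValue === "0") return "consent";  // user permits tracking
  return "no-preference";                     // header absent or unrecognized
}

console.log(interpretDNT("1")); // opt-out
```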
Wednesday, September 11, 2013
W3C at Intel's IDF'13; W3DevCampus featured
11 September 2013
W3C is present at the Intel Developer Forum (IDF'13), happening now in San Francisco, USA. Through talks in developer sessions and discussions at the W3C booth, W3C explains the many advantages of using Open Web Platform technologies, and how the W3DevCampus training program helps developers get trained on these Web technologies. W3DevCampus is the official W3C training program and features a W3C HTML5 course now open for registration. Meet Bernard Gidon at the W3C booth at IDF'13 to learn more, then enroll to become an HTML5 expert!

W3C Invites Implementations of JSON-LD 1.0
10 September 2013
The RDF Working Group and the JSON-LD Community Group published the Candidate Recommendations of JSON-LD 1.0 and JSON-LD 1.0 Processing Algorithms and API. This signals the beginning of the call for implementations for JSON-LD 1.0. JSON-LD harmonizes the representation of Linked Data in JSON by describing a common JSON representation format for expressing directed graphs, mixing both Linked Data and non-Linked Data in a single document. The syntax is designed not to disturb already-deployed systems running on JSON, but to provide a smooth upgrade path from JSON to JSON-LD. It is primarily intended as a way to use Linked Data in Web-based programming environments, to build interoperable Linked Data Web services, and to store Linked Data in JSON-based storage engines.
The JSON-LD 1.0 specification describes the JSON-LD language in a way that is useful to authors, and provides the core grammar of the language for implementers. The JSON-LD 1.0 Processing Algorithms and API specification describes useful algorithms for working with JSON-LD data, and specifies an Application Programming Interface that can be used to transform JSON-LD documents to make them easier to work with in programming environments like JavaScript, Python, and Ruby.
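A small JSON-LD document shows the idea: ordinary JSON whose @context maps keys to IRIs, so Linked Data consumers and plain JSON consumers can both read it (the schema.org terms and example IRIs below are illustrative, not taken from the spec):

```javascript
// A JSON-LD document is plain JSON plus the @context/@id keywords.
const doc = {
  "@context": {
    "name": "http://schema.org/name",
    "homepage": { "@id": "http://schema.org/url", "@type": "@id" }
  },
  "@id": "http://example.org/people/alice",
  "name": "Alice",
  "homepage": "http://example.org/alice/"
};

// Code that knows nothing about Linked Data can still use it as JSON:
console.log(doc.name); // Alice
```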
Learn more about the Semantic Web Activity.
Thursday, September 5, 2013
Media Capture and Streams Draft Published
03 September 2013
The Web Real-Time Communication Working Group and the Device APIs Working Group have published an updated Working Draft of Media Capture and Streams.
This document defines a set of JavaScript APIs that allow local media,
including audio and video, to be requested from a platform. Learn more
about the Ubiquitous Web Applications Activity.
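A hedged sketch of requesting local media with this API: in current browsers the entry point is navigator.mediaDevices.getUserMedia(constraints), which resolves to a MediaStream (the 2013-era drafts used a prefixed, callback-based navigator.getUserMedia instead, and the constraint values below are illustrative):

```javascript
// Constraints describing the media to request: audio plus 720p video.
const constraints = {
  audio: true,
  video: { width: { ideal: 1280 }, height: { ideal: 720 } }
};

// Request a MediaStream when running in a browser; returns null elsewhere.
function requestStream() {
  if (typeof navigator !== "undefined" && navigator.mediaDevices) {
    return navigator.mediaDevices.getUserMedia(constraints); // Promise<MediaStream>
  }
  return null;
}
```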