Sunday, August 5, 2012

Principles of Web 2.0

In the previous post I shared the definitions of Web 2.0 while looking at Service 2.0 as a concept inspired by Web 2.0. In this post, I'll focus on the principles of Web 2.0 as laid out in the work of Tim O'Reilly.

Principles of Web 2.0

In 2005, Tim O’Reilly first drew the Web 2.0 meme map in a brainstorming session, and it was later cited in research. When O’Reilly wrote his formal paper in 2007 he did not include the meme map, yet it remains available in the original article on the O’Reilly website. The meme map shows the concepts radiating from the core of Web 2.0.

In his paper, O’Reilly listed the main principles that can be used to differentiate Web 1.0 from Web 2.0 sites, businesses, or companies. The principles were:
1.    The web as platform
2.    Harnessing collective intelligence
3.    Data is the next Intel inside
4.    End of the software release cycle
5.    Lightweight programming models
6.    Software above the level of a single device
7.    Rich user experiences
In his conclusion he listed the main core competencies of Web 2.0 companies, which share the same spirit but not the same structure. The competencies were as follows:
1.    Services, not packaged software, with cost-effective scalability,
2.    Control over unique, hard-to-recreate data sources that get richer as more people use them,
3.    Trusting users as co-developers,
4.    Harnessing collective intelligence,
5.    Leveraging the long tail through customer self-service,
6.    Software above the level of a single device,
7.    Lightweight user interfaces, development models, AND business models.
In 2009 O’Reilly presented a new paper, “Web Squared: Web 2.0 Five Years On,” in which he stressed and redefined some of the earlier characteristics and added new ones. The paper discussed four new aspects of Web 2.0: sensory-oriented collective intelligence, the learning web, the information shadow of reality on the web, and the real-time collective mind.
The following is a compiled summary of the principles of Web 2.0 as described by O’Reilly in his two papers:

1.    The Web As Platform

a.    Value gets created in the service delivered over the platform (Google vs. Netscape case).
b.    Value is co-created by the users and facilitated by the software (Facebook, Twitter, and YouTube cases).
c.    Long-tail focus through customer self-service and algorithmic data management (AdSense vs. DoubleClick case).
d.    Architecture of participation: new users bring new resources, the service gets better automatically as the network of users grows (BitTorrent vs. Akamai case).
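The "service gets better as the network of users grows" claim is often illustrated (outside O'Reilly's papers) by the quadratic growth of potential user-to-user connections. A minimal Python sketch, purely for illustration:

```python
# Illustrative sketch (not from O'Reilly's paper): potential pairwise
# connections in a network grow quadratically with the number of users,
# one common way to quantify why new users make the service more valuable.

def potential_connections(users: int) -> int:
    """Number of distinct user-to-user links among `users` participants."""
    return users * (users - 1) // 2

for n in (10, 100, 1000):
    print(n, potential_connections(n))  # 10x the users, ~100x the links
```

The point is only the shape of the curve: each new participant adds links to every existing participant, so value compounds rather than merely accumulates.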

2.    Harnessing Collective Intelligence

a.    The product is the collective work of the users. The product grows organically due to the user activity (eBay case).
b.    Provider’s role is to enable the context of user activity.
c.    Value lies in engaging user participation to create flow around the products offered (Amazon vs. case).
d.    Radical trust: with enough participants, all flaws are shallow (Wikipedia case).
e.    Folksonomy: allowing multiple overlapping categorizations by customers, rather than the provider's own rigid categorization, known as taxonomy ( and Flickr cases).
f.     Viral marketing: recommendations propagating from one customer to another, as opposed to traditional advertising ( case).
g.    Open source, or open-anything: the product of collective intelligence applied to production.
h.    Network effects are the key to dominance in the Web 2.0 era.
i.      The blogosphere turned the web into a sort of global brain, with conversation going on all the time.
j.     The wisdom of crowds: the constant interactions of the crowd's members define the visibility and influence of members or artifacts.
k.    We, the media: the audience decides what's important, not the traditional media provider.
l.      Smart devices, not only humans, are feeding data all the time on location, speed, and view; this data is collected, presented, and acted upon in real time.
m.   The web is a marvel of crowdsourcing (YouTube, Flickr, Twitter, MySpace, and Facebook cases).
n.    Applications are built to direct people to perform certain tasks (Wikipedia, Amazon, Digg, and Mechanical Turk cases).
o.    The network gets smarter as it grows. The combination of device capabilities, network access, and crowdsourcing defines a new level of intelligence.
p.    Discovering implied metadata, and building a database to capture that metadata and/or foster an ecosystem around it.
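Point (e) above, folksonomy, is easy to see in code. A minimal Python sketch with invented photo-tagging data: many users attach their own overlapping labels, and a shared index emerges that no single rigid taxonomy would produce.

```python
from collections import defaultdict

# Hypothetical photo-tagging data: each (item, tag) pair is a label some
# user freely chose, rather than a category the provider imposed.
user_tags = [
    ("photo1", "sunset"),
    ("photo1", "beach"),
    ("photo1", "vacation"),
    ("photo2", "beach"),
    ("photo2", "surfing"),
]

index = defaultdict(set)          # tag -> items carrying that tag
for item, tag in user_tags:
    index[tag].add(item)

# Overlap emerges from the crowd: "beach" now links both photos,
# while "sunset" and "surfing" keep their own narrower views.
print(sorted(index["beach"]))
```

Nothing stops an item from living under many tags at once; that overlap is exactly what a single-category taxonomy forbids.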

3.    Data is the Next "Intel Inside"

a.    Infoware: products that rely on databases as a main competency, where the data is the major asset.
b.    Control over databases defines market control, and allows for financial returns and the leveraging of network effects (Yahoo!, Google, Amazon, eBay, and Network Solutions cases).
c.    Ownership of data, not ownership of software, is the competitive edge, since software can be imitated or open-sourced (MapQuest vs. Yahoo!, Microsoft, and Google case).
d.    Enhancing data, whether by the provider's own efforts or by harnessing collective intelligence, creates huge added value on top of publicly available or easy-to-imitate databases (Amazon vs. case).
e.    Data enhancement is itself a value that can be offered to other providers, shifting the focus from the original data to the enhanced, accessible intermediate database as a data source. Providers can also merge more than one source of data to enable services that never existed before (Google Maps, NavTeq & case).
f.     The high expense of creating data, and the high returns on owning it, will put data at the center of competition over ownership.
g.    Network-wide data systems could be formed to provide reliable aggregate data sources. These systems will be a major component of the "internet operating system" and will enable future applications (the case of identity systems relying on PayPal, Amazon 1-Click, and Google's use of the cell number as an identifier).
h.    Open data: given the importance and criticality of data ownership, data-owning providers (like Amazon) may start enforcing their data copyright policies more strictly. This will lead to the rise of a free-data movement and open-data projects, just as proprietary software gave rise to the free software movement and open source (Wikipedia and Greasemonkey examples).
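Point (d) above, layering user contributions on top of an easy-to-imitate database, can be sketched as follows. The catalog entry, review data, and field names are hypothetical, but the shape matches the Amazon example: anyone can buy the base book data, while the reviews are hard to recreate.

```python
# Commodity base data any competitor could license or copy.
base_catalog = {"0596007973": {"title": "Learning Python"}}

# Hard-to-recreate user contributions: (reviewer, star rating) pairs.
user_reviews = {"0596007973": [("alice", 5), ("bob", 4)]}

def enriched(isbn: str) -> dict:
    """Return the base record augmented with crowd-contributed signals."""
    record = dict(base_catalog[isbn])          # copy the commodity data
    ratings = [stars for _, stars in user_reviews.get(isbn, [])]
    record["avg_rating"] = sum(ratings) / len(ratings) if ratings else None
    record["review_count"] = len(ratings)
    return record

print(enriched("0596007973"))
```

The enriched record, not the base catalog, is what competitors cannot copy, which is the sense in which data becomes the "Intel Inside."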

4.    End of the Software Release Cycle

a.    Looking at the software as a service rather than a product implies fundamental changes to business models.
b.    Operations are a core competency, more critical than the software artifact itself. Data has to be constantly and continuously maintained and operated (the case of Google's search algorithm vs. Google's system administration, networking, and load balancing). This implies a shift toward development tools suited to building dynamic systems that enable constant change.
c.    Users must be treated as co-developers.
d.    The move from "release early and release often" to the "perpetual beta": a service may remain in beta mode for years after release.
e.    Frequent monthly, weekly, and daily updates, or even a build rate of every half hour.
f.     Real-time monitoring of user behavior and of the response to new features is a major new competency.
g.    This pace of development is a challenge for traditional providers, who would require radical changes in their development lifecycles, design patterns, and corresponding business models and revenue sources.
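The "perpetual beta" practice behind points d through f, exposing a new feature to a slice of users and watching the response before committing, might be sketched like this. The rollout fraction and session counts are invented for illustration:

```python
import random
from collections import Counter

# Hypothetical perpetual-beta rollout: show the new feature to a small
# random slice of sessions and count exposure in real time, so the team
# can widen the rollout or pull the feature based on live behavior.
random.seed(42)                    # fixed seed so the sketch is repeatable
ROLLOUT_FRACTION = 0.10            # 10% of sessions see the beta feature

usage = Counter()
for session in range(1000):
    variant = "beta" if random.random() < ROLLOUT_FRACTION else "stable"
    usage[variant] += 1

print(usage["beta"], usage["stable"])
```

In a real service the counter would feed a live dashboard; the point is that release decisions become continuous operations driven by monitoring, not scheduled shipping events.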

5.    Lightweight Programming Models

a.    Simplicity guarantees higher adoption than complexity: simple models are more widely adopted than formal, sophisticated corporate models (Amazon's case of 5% corporate SOAP vs. 95% simple REST usage).
b.    Allow for loosely coupled, even fragile, systems rather than tightly coupled corporate systems.
c.    Think syndication, not coordination: care more about getting the data to the other side than about controlling it once it gets there.
d.    Design for hackability and remixability: lower the barriers to reuse. Allow users to access things the way they want, when they want; allow them to take apart, hack, remix, and reuse the components of the service creatively (Google Maps vs. ESRI case).
e.    Move from "all rights reserved" to "some rights reserved".
f.     Innovation in assembly: allow easy reuse and remixing of existing services so that a third party can provide a new service. The abundance of accessible, reusable service components will open a new competition over combining existing services into new ones. This availability also gives existing providers room for differentiation, since they can reuse components from other services to enrich, renew, and reposition their own.
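Point (a), the verbosity gap between a simple REST-style call and a SOAP envelope, can be made concrete. The endpoint URL and parameter names below are invented for illustration, not Amazon's actual API:

```python
from urllib.parse import urlencode

# The same hypothetical book lookup, expressed two ways.
params = {"Operation": "ItemLookup", "ItemId": "0596007973"}

# REST style: a single URL any script, browser, or spreadsheet can issue.
rest_request = "https://example.com/lookup?" + urlencode(params)

# SOAP style: the identical request wrapped in an XML envelope that
# typically also needs a WSDL contract and client tooling to construct.
soap_request = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ItemLookup><ItemId>0596007973</ItemId></ItemLookup>
  </soap:Body>
</soap:Envelope>"""

print(len(rest_request), len(soap_request))  # REST is far shorter
```

The byte counts understate the real difference: the REST call needs no toolkit at all, which is exactly why the lightweight model won the adoption race O'Reilly describes.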

6.    Software Above the Level of a Single Device

a.    Allow seamless integration of multiple devices, the web, and software.
b.    Devices will not only consume data but will also produce and report it (car and traffic monitoring example).
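Point (b) can be sketched with the traffic-monitoring example itself. The device readings and the 50 km/h congestion threshold are invented for illustration:

```python
from statistics import mean

# Hypothetical device reports: vehicles on one road segment act as
# sensors, each producing data rather than only consuming it.
reports = [
    {"device": "car-1", "segment": "A1", "speed_kmh": 62},
    {"device": "car-2", "segment": "A1", "speed_kmh": 55},
    {"device": "car-3", "segment": "A1", "speed_kmh": 30},
]

# The service aggregates the crowd of device readings into a live signal.
avg_speed = mean(r["speed_kmh"] for r in reports)
status = "congested" if avg_speed < 50 else "flowing"
print(avg_speed, status)
```

No single car knows the traffic state; the intelligence lives in the aggregation across devices, which is the "software above the level of a single device" idea in miniature.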

7.    Rich User Experiences

a.    The development of tools and standards has allowed the creation of rich, interactive applications.
b.    Attempts from both directions, delivering desktop-like rich applications on the web on one side and bringing web and online features into desktop software on the other, continue to enhance the user experience.
c.    Rich applications have the power to learn from the user, access the user's data, and leverage the architecture of participation and the collective intelligence of the social network.
