The list went on and on. But what was it that made us identify one application or approach as "Web 1.0" and another as "Web 2.0"? (The question is particularly urgent because the Web 2.0 meme has become so widespread that companies are now pasting it on as a marketing buzzword, with no real understanding of just what it means. The question is particularly difficult because many of those buzzword-addicted startups are definitely not Web 2.0, while some of the applications we identified as Web 2.0, like Napster and BitTorrent, are not even properly web applications!) We began trying to tease out the principles that are demonstrated in one way or another by the success stories of web 1.0 and by the most interesting of the new applications.

1. The Web As Platform

Like many important concepts, Web 2.0 doesn't have a hard boundary, but rather, a gravitational core. You can visualize Web 2.0 as a set of principles and practices that tie together a veritable solar system of sites that demonstrate some or all of those principles, at a varying distance from that core.

[Figure 1: Web 2.0 meme map]

Figure 1 shows a "meme map" of Web 2.0 that was developed at a brainstorming session during FOO Camp, a conference at O'Reilly Media. It's very much a work in progress, but shows the many ideas that radiate out from the Web 2.0 core.

For example, at the first Web 2.0 conference, in October 2004, John Battelle and I listed a preliminary set of principles in our opening talk. The first of those principles was "The web as platform." Yet that was also a rallying cry of Web 1.0 darling Netscape, which went down in flames after a heated battle with Microsoft. What's more, two of our initial Web 1.0 exemplars, DoubleClick and Akamai, were both pioneers in treating the web as a platform. People don't often think of it as "web services", but in fact, ad serving was the first widely deployed web service, and the first widely deployed "mashup" (to use another term that has gained currency of late). Every banner ad is served through a seamless cooperation between two websites, delivering an integrated page to a reader on yet another computer. Akamai also treats the network as the platform, working at a deeper level of the stack to build a transparent caching and content delivery network that eases bandwidth congestion.

Nonetheless, these pioneers provided useful contrasts because later entrants have taken their solution to the same problem even further, understanding something deeper about the nature of the new platform. Both DoubleClick and Akamai were Web 2.0 pioneers, yet we can also see how it's possible to realize more of the possibilities by embracing additional Web 2.0 design patterns.

Let's drill down for a moment into each of these three cases, teasing out some of the essential elements of difference.

Netscape vs. Google

If Netscape was the standard bearer for Web 1.0, Google is most certainly the standard bearer for Web 2.0, if only because their respective IPOs were defining events for each era. So let's start with a comparison of these two companies and their positioning.

Netscape framed "the web as platform" in terms of the old software paradigm: their flagship product was the web browser, a desktop application, and their strategy was to use their dominance in the browser market to establish a market for high-priced server products. Control over standards for displaying content and applications in the browser would, in theory, give Netscape the kind of market power enjoyed by Microsoft in the PC market. Much like the "horseless carriage" framed the automobile as an extension of the familiar, Netscape promoted a "webtop" to replace the desktop, and planned to populate that webtop with information updates and applets pushed to the webtop by information providers who would purchase Netscape servers.

In the end, both web browsers and web servers turned out to be commodities, and value moved "up the stack" to services delivered over the web platform.

Google, by contrast, began its life as a native web application, never sold or packaged, but delivered as a service, with customers paying, directly or indirectly, for the use of that service. None of the trappings of the old software industry are present. No scheduled software releases, just continuous improvement. No licensing or sale, just usage. No porting to different platforms so that customers can run the software on their own equipment, just a massively scalable collection of commodity PCs running open source operating systems plus homegrown applications and utilities that no one outside the company ever gets to see.

At bottom, Google requires a competency that Netscape never needed: database management. Google isn't just a collection of software tools, it's a specialized database. Without the data, the tools are useless; without the software, the data is unmanageable. Software licensing and control over APIs--the lever of power in the previous era--is irrelevant because the software never need be distributed but only performed, and also because without the ability to collect and manage the data, the software is of little use. In fact, the value of the software is proportional to the scale and dynamism of the data it helps to manage.

Google's service is not a server--though it is delivered by a massive collection of internet servers--nor a browser--though it is experienced by the user within the browser. Nor does its flagship search service even host the content that it enables users to find. Much like a phone call, which happens not just on the phones at either end of the call, but on the network in between, Google happens in the space between browser and search engine and destination content server, as an enabler or middleman between the user and his or her online experience.

While both Netscape and Google could be described as software companies, it's clear that Netscape belonged to the same software world as Lotus, Microsoft, Oracle, SAP, and other companies that got their start in the 1980s software revolution, while Google's fellows are other internet applications like eBay, Amazon, Napster, and yes, DoubleClick and Akamai.



What Is Web 2.0? Definition, Impact, and Examples


Web 2.0 describes the current state of the internet, which has more user-generated content and usability for end-users compared to its earlier incarnation, Web 1.0. Web 2.0 generally refers to the 21st-century internet applications that have transformed the digital era in the aftermath of the dotcom bubble.

Key Takeaways

  • Web 2.0 describes the current state of the internet, which has more user-generated content and usability for end-users compared to its earlier incarnation, Web 1.0.
  • It does not refer to any specific technical upgrades to the internet; it refers to a shift in how the internet is used.
  • There is a higher level of information sharing and interconnectedness among participants in the new age of the internet.
  • It allowed for the creation of applications such as Facebook, X (formerly Twitter), Reddit, TikTok, and Wikipedia.
  • Web 2.0 paved the way for Web 3.0, the next generation of the web that uses many of the same technologies to approach problems differently.

The term Web 2.0 first came into use in 1999 as the internet pivoted toward a system that actively engaged the user. Users were encouraged to provide content, rather than just viewing it. The social aspect of the internet has been particularly transformed; in general, social media allows users to engage and interact with one another by sharing thoughts, perspectives, and opinions. Users can tag, share, post, and like.

Web 2.0 does not refer to any specific technical upgrades to the internet. It simply refers to a shift in how the internet is used in the 21st century. In the new age, there is a higher level of information sharing and interconnectedness among participants. This new version allows users to actively participate in the experience rather than just acting as passive viewers who take in information.

Because of Web 2.0, people can publish articles and comments across different platforms, and the ability to create accounts on different sites has increased content creation and participation. Web 2.0 also gave rise to web apps, self-publishing platforms like WordPress, Medium, and Substack, and social media sites. Examples of Web 2.0 sites include Wikipedia, Facebook, X, and various blogs, all of which transformed the way information is shared and delivered.

History of Web 2.0

In a 1999 article called "Fragmented Future," Darcy DiNucci coined the phrase Web 2.0. In the article, DiNucci notes that the "first glimmerings" of this new stage of the web were beginning to appear, and describes Web 2.0 as a "transport mechanism, the ether through which interactivity happens."

The phrase became popularized after a 2004 conference held by O'Reilly Media and MediaLive International. Tim O'Reilly, founder and chief executive officer (CEO) of the media company, is credited with championing the term, hosting various interviews and Web 2.0 conferences to explore the early business models for web content.

The workings of Web 2.0 have continually evolved over the years; rather than being created as a single fixed thing, its definition and capabilities continue to change. For example, Justin Hall is credited as one of the first bloggers, even though his personal blog dates back to 1994.

Advantages and Disadvantages of Web 2.0

Advantages

The development of technology has allowed users to share their thoughts and opinions with others, creating new ways of organizing and connecting with other people. One of the largest advantages of Web 2.0 is improved communication through web applications that enhance interactivity, collaboration, and knowledge sharing.

This is most evident through social networking, where individuals armed with a Web 2.0 connection can publish content, share ideas, extract information, and subscribe to various informational feeds. This has brought about major strides in marketing optimization as more strategic, targeted marketing approaches are now possible.

Web 2.0 also brings about a certain level of equity. Most individuals have an equal chance of posting their views and comments, and each individual may build a network of contacts. Because information may be transmitted more quickly under Web 2.0 compared to prior methods of information sharing, the latest updates and news may be available to more people.

Disadvantages

Unfortunately, there are a lot of disadvantages to the internet acting more like an open forum. Through the expansion of social media, we have seen an increase in online stalking, doxing , cyberbullying, identity theft , and other online crimes. There is also the threat of misinformation spreading among users, whether that's through open-source information-sharing sites or on social media.

Individuals may blame Web 2.0 for misinformation, information overload, or the unreliability of what people read. As almost anyone can post anything via various blogs, social media, or Web 2.0 outlets, there is an increased risk of confusion on what is real and what sources may be deemed reliable.

As a result, Web 2.0 raises the stakes around communication. Fake accounts, spammers, forgers, and hackers are more likely to attempt to steal information, imitate personas, or trick unsuspecting Web 2.0 users into following their agenda. Because Web 2.0 platforms do not always verify information, and often cannot, there is a heightened risk of bad actors taking advantage.

Web 2.0 vs. Web 1.0

Web 1.0 is used to describe the first stage of the internet. At this point, there were few content creators; most of those using the internet were consumers. Static pages were more common than dynamic HTML, which incorporates interactive and animated websites with specific coding or language.

Content in this stage came from a server’s file system rather than a database management system. Users were able to sign online guestbooks and HTML forms were sent via email. Examples of internet sites that are classified as Web 1.0 are Britannica Online, personal websites, and mp3.com. In general, these websites are static and have limited functionality and flexibility.

Web 2.0:

  • Dynamic information (always changing)
  • Less control over user input
  • Promotes greater collaboration, as channels are more dynamic and flexible
  • Considered much more social and interactive-driven

Web 1.0:

  • Static information (more difficult to change)
  • More controlled user input
  • Promoted individual contribution; channels were less dynamic
  • Considered much more informative and data-driven

Web 2.0 vs. Web 3.0

The world is already shifting into the next iteration of the web (appropriately dubbed "Web 3.0"). Though both rely on many similar technologies, they use the available capabilities to solve problems differently.

One strong example of Web 3.0 relates to currency. Under Web 2.0, users could input fiat currency information such as bank account numbers or credit card data, which the receiver could then process to complete a transaction. Web 3.0 approaches the same transaction with a different process: with the introduction of Bitcoin, Ethereum, and other cryptocurrencies, the same problem can be solved in a theoretically more efficient way.

Web 3.0 is more heavily rooted in increasing the trust between users. More often, applications rely on decentralization, letting data be exchanged in several locations simultaneously. Web 3.0 is also more likely to incorporate artificial intelligence or machine learning applications.

Web 2.0:

  • Focuses on reading and writing content
  • May be more susceptible to less-secure technology
  • May use more antiquated, simpler processing techniques
  • Primarily aims to connect people

Web 3.0:

  • Focused on creating content
  • Often has more robust cybersecurity measures
  • May incorporate more advanced concepts such as AI or machine learning
  • Primarily aims to connect data or information

Web 2.0 Components

There is no single, universally accepted definition for Web 2.0. Instead, it's best described as a series of components that, when put together, create an online environment of interactivity and greater capacity compared to the original version of the web. Here are the more prominent components of Web 2.0.

Wikis

Wikis are often information repositories that collect input from various users. Users may edit, update, and change the information within a web page, meaning there is often no singular owner of the page or the information within. As opposed to users simply absorbing information given to them, wiki-based sites such as Wikipedia are successful when users contribute information to the site.

Software Applications

The early days of the web relied upon local software installed on-premises. With Web 2.0, applications gained a greater opportunity to be housed off-site, downloaded over the web, or even offered as a service via web applications and cloud computing. This ushered in a new type of business model in which companies sell software applications on a monthly subscription basis.

Social Networking

Social networking is often one of the first things people think of when discussing Web 2.0. It is similar to wikis in that individuals are empowered to post information on the web. Whereas wikis are informational and often require verification, social networking has looser constraints on what can be posted. In addition, users have greater capabilities to interact and connect with other social networking users.

General User-Generated Content

In addition to social media posts, users can more easily post artwork, images, audio, video, or other user-generated media. This content may be offered online for sale or distributed freely. This has led to wider crediting of content creators (though creators are also at greater risk of having their content stolen by others).

Crowdsourcing

Though many may think of Web 2.0 in terms of individual contribution, it also brought about great capabilities for crowdsourced, crowdfunded, and crowd-tested content. Web 2.0 lets individuals collectively share resources to meet a common goal, whether that goal is knowledge-based or financial.

There is no single universally accepted definition for Web 2.0 or Web 3.0. Because of its expansive nature, it's often hard to confine the boundaries of Web 2.0 into a single simple definition.

Web 2.0 Applications

The components above are directly related to the applications of Web 2.0. Those components allowed for new types of software, platforms, or applications that are still used today.

  • Zoom, Netflix, and Spotify are all examples of software as a service (SaaS). With the greater capability of connecting individuals via Web 2.0, off-premise software applications are exponentially more capable and powerful.
  • HuffPost, Boing Boing, and TechCrunch are blogs that allow users to put opinions and information onto web pages. These pages are informative, much like Web 1.0 sites; however, individual contributors have a much greater ability to create and distribute their own content.
  • X, Instagram, Facebook, and Threads are social media networks that allow personalized content to be uploaded to the web. This content can then be shared with a private collection of friends or with a broad social media user base.
  • Reddit, Digg, and Pinterest are also applications that allow for user input. These applications are geared more toward organizing social content around specific themes or topics, much as the original web forums did.
  • YouTube, TikTok, and Flickr are further examples of content sharing; these applications specialize in the distribution of multimedia such as video, audio, and images.

What Does Web 2.0 Mean?

Web 2.0 describes how the initial version of the web has advanced into a more robust, capable system. After the breakthrough of the web's initial capabilities, greater technologies were developed to allow users to interact with, and contribute to, what resides on the web more freely. The ability of web users to be more closely connected to one another is at the core of Web 2.0.

What Are Examples of Web 2.0 Applications?

The most commonly cited examples of Web 2.0 applications include Facebook, X, Instagram, and TikTok. These sites allow users to interact with web pages instead of simply viewing them. The category also extends to sites like Wikipedia, where a broad range of users can help shape the information that is shared and distributed on the web.

Are Web 2.0 and Web 3.0 the Same?

Not quite. Web 2.0 and Web 3.0 use many of the same technologies (AJAX, JavaScript, HTML5, CSS3), but Web 3.0 is more likely to leverage newer technologies and principles in an attempt to connect information in ways that drive even greater value.

The Bottom Line

In the early days of web browsing, users would often navigate to simple web pages filled with information and little or no ability to interact with the page. Today, the web has advanced and allows users to connect with others, contribute information, and have greater flexibility in how the web is used. Though Web 2.0 is already paving the way for Web 3.0, many of the fundamental pieces of Web 2.0 are still in use today.

Sources:

  • Web Design Museum. "Web 2.0."
  • Darcy DiNucci. "Fragmented Future."
  • O'Reilly. "Web 2.0 and the Emergent Internet Operating System."
  • University of Notre Dame of Maryland. "History of Blogging."


Essay: Web 2.0


The term "Web 2.0" was first used in January 1999 by Darcy DiNucci, a consultant on electronic information design (information architecture). The Web we know now, which loads into a browser window in essentially static screenfuls, is only an embryo of the Web to come. The first glimmerings of Web 2.0 are beginning to appear, and we are just starting to see how that embryo might develop. The Web will be understood not as screenfuls of text and graphics but as a transport mechanism, the ether through which interactivity happens.

The term Web 2.0 was initially championed by bloggers and by technology journalists, culminating in TIME magazine naming "You" its 2006 Person of the Year. Web 2.0 websites allow users to do more than just retrieve information. Building on what was already possible in "Web 1.0", they provide the user with more user interface, software, and storage facilities, all through the browser. This has been called "network as platform" computing. Major features of Web 2.0 include social networking sites, user-created websites, self-publishing platforms, tagging, and social bookmarking. Users can provide the data that is on a Web 2.0 site and exercise some control over that data.

Web 2.0 offers all users the same freedom to contribute. While this opens the possibility for serious debate and collaboration, it also increases the incidence of "spamming" and "trolling" by unscrupulous or less mature users. Because it is impossible to exclude group members who don't contribute from sharing in the resulting goods, there is also the risk that serious members will prefer to withhold their effort and free ride on the contributions of others.

Key features of Web 2.0 include:

  • Folksonomy (free classification of information)
  • Rich user experience
  • User as a contributor
  • The long tail
  • User participation
  • Basic trust
  • Dispersion

The client-side (web browser) technologies used in Web 2.0 development include Ajax and JavaScript frameworks such as YUI Library, Dojo Toolkit, MooTools, jQuery and Prototype JavaScript Framework. Ajax programming uses JavaScript to upload and download new data from the web server without undergoing a full page reload.
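To make that concrete, here is a minimal Ajax sketch in plain JavaScript, with no framework. The endpoint "/latest-comments" and the element id "comments" are hypothetical placeholders rather than any real service; the point is only that the browser requests a fragment of data in the background and patches it into the page instead of reloading the whole document.

    // Minimal Ajax sketch: ask the server for fresh data and update part of
    // the page without a full reload. URL and element id are placeholders.
    function refreshComments() {
      var request = new XMLHttpRequest();
      request.open("GET", "/latest-comments", true); // true = asynchronous
      request.onreadystatechange = function () {
        if (request.readyState === 4 && request.status === 200) {
          // Patch the returned fragment into the existing page.
          document.getElementById("comments").innerHTML = request.responseText;
        }
      };
      request.send();
    }

    // Poll for new content every 30 seconds instead of reloading the page.
    setInterval(refreshComments, 30000);

Libraries such as jQuery or Prototype wrap this pattern in shorter helper calls, but the underlying mechanism is the same XMLHttpRequest exchange.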

Adobe Flex is another technology often used in Web 2.0 applications. Compared to JavaScript libraries like jQuery, Flex makes it easier for programmers to populate large data grids, charts, and other heavy user interactions.

On the server side, Web 2.0 uses many of the same technologies as Web 1.0. Languages such as PHP, Ruby, Perl, Python, as well as JSP, and ASP.NET, are used by developers to output data dynamically using information from files and databases. What has begun to change in Web 2.0 is the way this data is formatted. In the early days of the Internet, there was little need for different websites to communicate with each other and share data. In the new "participatory web", however, sharing data between sites has become an essential capability.
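As a sketch of what that sharing looks like from the publishing side, the snippet below exposes a site's recent posts as JSON over HTTP so that other sites can fetch and remix them. It uses Node.js purely to keep all the examples in one language; a Web 2.0-era site would more likely have produced the same JSON from PHP, Ruby, or Perl, and the route and data shown here are invented placeholders.

    // Hypothetical sketch: publish machine-readable data for other sites.
    const http = require("http");

    const recentPosts = [
      { title: "Hello, participatory web", url: "/posts/1" },
      { title: "Why feeds matter", url: "/posts/2" },
    ];

    http.createServer((req, res) => {
      if (req.url === "/api/posts") {
        // JSON output that another site, feed reader, or mashup can consume.
        res.writeHead(200, { "Content-Type": "application/json" });
        res.end(JSON.stringify(recentPosts));
      } else {
        res.writeHead(404);
        res.end("Not found");
      }
    }).listen(8080);

The same data could just as easily be serialized as RSS or Atom; the essential change from Web 1.0 is that the output is structured for other programs to consume, not only rendered as HTML for human readers.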

Web 2.0 can be described in three parts:

  • Rich Internet application (RIA): the experience brought from the desktop to the browser, both from a graphical and a usability point of view. Buzzwords related to RIA include Ajax and Flash.
  • Web-oriented architecture (WOA): a key piece of Web 2.0 that defines how Web 2.0 applications expose their functionality so that other applications can leverage and integrate it, yielding a set of much richer applications. Examples are feeds, RSS, web services, and mash-ups (a small mashup sketch follows below).
  • Social web: defines how Web 2.0 applications interact much more with the end user and make the end user an integral part of the site.

The social web represents a fundamental shift in the way people communicate. It consists of a number of online tools and platforms where people share their perspectives, opinions, thoughts, and experiences. Web 2.0 applications tend to interact much more with the end user, so the end user is not only a user of the application but also a participant, through:

  • Podcasting
  • Blogging
  • Tagging
  • Curating with RSS
  • Social bookmarking
  • Social networking
  • Web content voting

The popularity of the term Web 2.0, along with the increasing use of blogs, wikis, and social networking technologies, has led many in academia and business to append a flurry of 2.0's to existing concepts and fields of study, including Library 2.0, Social Work 2.0, Enterprise 2.0, PR 2.0, Classroom 2.0, Publishing 2.0, Medicine 2.0, Telco 2.0, Travel 2.0, Government 2.0, and even Porn 2.0. Many of these 2.0s refer to Web 2.0 technologies as the source of the new version in their respective disciplines and areas.
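The web-oriented architecture described above is easiest to see in a tiny mashup. The sketch below, again in browser JavaScript, pulls data from two hypothetical feeds exposed by different sites and combines them into one page fragment; both URLs and the element id "mashup" are placeholders, and a real deployment would also need the feed hosts to allow cross-site requests.

    // Hypothetical mashup sketch: remix two sites' feeds into one page.
    async function buildMashup() {
      const [posts, photos] = await Promise.all([
        fetch("https://blog.example.com/api/posts").then((r) => r.json()),
        fetch("https://photos.example.com/api/recent").then((r) => r.json()),
      ]);

      // Interleave the two sources into a single HTML list.
      const items = posts
        .map((p) => "<li>Post: " + p.title + "</li>")
        .concat(photos.map((ph) => "<li>Photo: " + ph.caption + "</li>"));

      document.getElementById("mashup").innerHTML =
        "<ul>" + items.join("") + "</ul>";
    }

    buildMashup();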

Blogs, wikis and RSS are often held up as exemplary manifestations of Web 2.0. A reader of a blog or a wiki is provided with tools to add a comment or even, in the case of the wiki, to edit the content. This is what we call the Read/Write web. Talis believes that Library 2.0 means harnessing this type of participation so that libraries can benefit from increasingly rich collaborative cataloging efforts, such as including contributions from partner libraries as well as adding rich enhancements, such as book jackets or movie files, to records from publishers and others.

Futurist John Smart, lead author of the Metaverse Roadmap, defines Web 3.0 as the first-generation Metaverse (convergence of the virtual and physical world), a web development layer that includes TV-quality open video, 3D simulations, augmented reality, human-constructed semantic standards, and pervasive broadband, wireless, and sensors. Web 3.0's early geosocial (Foursquare, etc.) and augmented reality (Layar, etc.) webs are an extension of Web 2.0's participatory technologies and social networks (Facebook, etc.) into 3D space. Of all its metaverse-like developments, Smart suggests Web 3.0's most defining characteristic will be the mass diffusion of NTSC-or-better quality video to TVs, laptops, tablets, and mobile devices, a time when "the internet swallows the television." Smart considers Web 3.0 to be the Semantic Web and, in particular, the rise of statistical, machine-constructed semantic tags and algorithms, driven by broad collective use of conversational interfaces, perhaps circa 2020. David Siegel's perspective in Pull: The Power of the Semantic Web (2009) is consonant with this, proposing that the growth of human-constructed semantic standards and data will be a slow, industry-specific, incremental process for years to come, perhaps unlikely to tip into broad social utility until after 2020.

According to some internet experts, Web 3.0 will enable the use of autonomous agents to perform tasks for the user. Rather than gearing results toward the user's keywords, search engines will gear them toward the user.




Web 2.0 and Literary Criticism

Edited by Aarthi Vadde and Jessica Pressman

Aarthi Vadde and Jessica Pressman

Multiplayer Lit/Multiplayer Crit

Sarah Wasserman

The Participatory Cultures of Omenana: Reading and Writing on a Nigerian SF Website

Matthew Eatough

Do It for the Vine: Literary Reviews and Online Amplification

Kinohi Nishikawa

Can Literary Theory be Participatory?

Priya Joshi

The Handwritten Styles of Instagram Poetry

Seth Perlow

Studying and Preserving the Global Networks of Twitter Literature

Christian Howard

Close Shaves with Content

Tess McNulty

Prescribed Print: Bibliotherapy after Web 2.0

Leah Price

Literary Criticism 2.0: Emerging Ideas

Jared Zeiders

A Creative Reading of Web 2.0 and Literary Criticism Using Voyant’s TermsBerry

Tina Lumbis

Introduction

Web 2.0 is changing the literary. We all know this, and we have emergent fields of study based upon this knowledge: electronic literature, game studies, cultural analytics, digital humanities. Yet, scholarship on how the contemporary digital environment is transforming literary criticism deserves more attention. Our cluster takes up this topic: in the individual essays, in their networked relationship, and in the pathway to its production and publication.

This cluster of essays began as a seminar at the American Comparative Literature Association (ACLA) conference, held at Georgetown University in March 2019. Aarthi and Jessica bonded over a shared sense that literary critics need to learn from, follow, and take seriously the trends in web 2.0 literature and literary culture even as we remained skeptical of web 2.0's rhetoric of newness and openness. Jessica had recently published a polemical piece, "Electronic Literature as Comparative Literature," a call for comparative literature scholars to take seriously media and digital translation as part of comparative studies; Aarthi had recently published on the transformative nature of collaborative writing platforms in "Amateur Creativity: Contemporary Literature and the Digital Publishing Scene." Both of these pieces were oriented to the changing media ecology of literary and critical writing, and we convened the seminar with the hopes of converging our individual lines of inquiry while inviting others to the table.

Our call-for-papers asked potential participants to explore the impact of information technologies on the making of contemporary literature and literary culture in a global context. The personal computer, mobile devices, the cloud, the server farm, the search engine, the algorithm, and the network are now indispensable parts of daily life; they are equally indispensable to the reading, writing, and distribution of literature. We wished to understand the consequences of their ubiquity on specific developments in the contemporary literary field — its aesthetic forms, medial substrates, institutional sites of canonization, and informal sites of readership. Popular aspects of web 2.0 literariness elude our specialist methods of analysis and challenge the prevailing norms of selectivity guiding our profession. How might literary history and literary value look different if the discipline paid more heed to the collaborative reading and writing practices of social media, online reviewing culture, and self-publishing platforms? Where is a literary critic to focus her gaze when what counts as content, text, or poetics is inseparable from hardware, software, platform, social network?

The contributors to this cluster collectively examine the specific challenges and opportunities posed to literary critics approaching web 2.0 participatory culture. They present astute analyses of literature in medial translation, make social media platforms their objects of study, and take methodological cues from multiplayer collaborations of various kinds. Yet, even as the topic of our seminar sought to address the bleeding-edge of literary culture, its format (three days spent in close conversation around a table) reminded us of how much older than the internet participatory culture is and how much literary criticism depends upon it. Although the term gained popularity as a way of distinguishing web 2.0 and differentiating digital media from broadcast media, one could say participatory culture goes as far back as the ancient Greek practice of methexis in which audiences would participate and improvise in theatrical performances. To retain one's individual voice while also becoming enmeshed in a collective such as "the people" or "the hive mind" is a paradox of participatory culture that applies as much to theories of democratic politics and the state as it does to internet culture and literary tradition.

The seminar at Georgetown was so generative in part because we as members understood ourselves to be becoming a written collective even before we met in person. We knew from the start that we were going to turn the seminar into a published cluster for  Post45 , so we began our conversation at the conference with the intention of building something together for a digital venue. Though we each wrote and shared individual papers for the seminar, we used these short talks as building blocks for a conversation that combined and recombined over the space of the conference and the editorial process. The essays before you bear the spirit of our dialogue, and the current (we hesitate to say "final") form of the cluster reflects our editorial desire to bring the collaborative ethos of the live seminar into the more highly mediated and modular environment of WordPress.

One way in which we have approximated conversation is through hyperlinks, both conceptual and programmatic. We asked each of our contributors to consider their individual essay in relationship to the other contributions and to revise with the goal of referencing key streams of thought from the seminar. We also asked each contributor to denote in their essays where those connections occur and where we could include HTML hyperlinks to other essays so that the conceptual connections perform programmatically as a network. What we realized around that table in that room, and what any virtual meet-up misses, is the importance of live, in-person, embodied interaction. Residual liveness is, perhaps, why online participatory culture is so vibrant and addictive. Philip Auslander understood the importance of "liveness" — the sense of being "live" even when that embodied presence is experienced through technological mediation (extended by microphones, screens, etc.) — before web 2.0 emerged, and the concept of liveness seems ever more vital to understanding our contemporary literary condition. 1  Many of the essays in this cluster attend to the liveness of contemporary literary culture, from emphasis on handwriting in Instagram poetry to the need for a literary theory driven by actual readers rather than the ideal reader of critical imagination .

Rather than thinking of the published cluster as a remediation of the seminar, we seek to present it as evidence of the iterative, even recursive nature of participatory culture. Participation at its most idealistic implies active membership in some community larger than ourselves, but as the emancipatory promises of web 2.0 lose their luster, so too do their notions of agency and self-fashioning. We are not only linked to other humans, objects, and data files, but we are also informed and, in machinic metaphors, even formatted by these relationships. Participation and conscription converge in online life as "opting in" has become the default setting.

If we consider how computational infrastructures inform our experiences as users, we face uneasy questions of how our objects of study find us online. Digital life is not only about using tools but also about being shaped and even circumscribed by them. Our past searches on Google inform what we find in future searches; our network of friends on Facebook shapes the content of our sidebars. What we see online is neither random nor unimportant, and it is determined by exactly what does not meet the eye. As cultural and literary critics, this fact should have profound implications on our approach to our work. Our seminar kept returning to questions of what we could not see and why that occlusion matters.

Perspective and orientation are never just about vision but also about politics, as Sara Ahmed has shown; and, following Franco Moretti, a visualization can change the way we see and the questions we ask. 2 It is not clear whether such concepts have the capacity to explain our lack of access to user data or the invisibility of the algorithms guiding what we see, but it seems right to say that too provincial a notion of literary value is its own form of blindness. Why do we roll our eyes at Instagram poetry? Why do we blow off amateur reviewers and critics? Why do we trivialize the platforms that process and promote literariness to wider audiences than we as scholars will ever reach (e.g. Amazon Vine, social media teasers, the genre of listicles)? Our seminar served as a reminder that literary critics need to think reflexively and critically about why and how we proceed with our values and assertions, close readings and citations, in a climate of both information overload and opacity. Our cluster now gives you a taste of those conversations as they take networked form.

Sarah Wasserman offers an account of "multiplayer crit" to illustrate how literary criticism has changed with the web-enabled times and should continue to become a more collaborative endeavor. Her inspiration for "multiplayer crit" is the multiplayer novel, which formally and thematically embeds the strategies of contemporary videogames. Multiplayer crit goes further than the singly-authored novel or monograph by modelling scenes of collective reading and writing that recall gamers building worlds together. A degree of autonomous world-building also plays into the rise of online literary journals curating African science-fiction, as Matt Eatough explains. A focus on new African literary journals that publish online, especially the Nigerian science fiction magazine  Omenana , which serves as Matt's case study, reveals a younger generation of writers more interested in cultivating common tastes than in assuming the social missions of an Achebe or Ngũgĩ. In adopting the perspectives and vocabularies of web 2.0, and fan culture in particular, these magazines set out to release African literature from its historical obligations to educate the public and satisfy a Euro-American market. Kinohi Nishikawa takes us to the polar opposite of the digital publishing scene with Amazon Vine: a platform for Amazon.com users to write online reviews and form community while increasing company revenue. Nishikawa considers the implications of Vine on and for literature, especially literature by writers outside of the United States and for literature in translation.

If literary criticism, little magazines, and reviewing culture are proving adaptable and even reinvigorated by the participatory cultures of web 2.0, the fate of literary theory remains less sure. Priya Joshi asks the million-dollar question: Can literary theory be participatory? Turning away from the abstracted concepts of author, reader, and text, she brings book history's methods of analysis to bear on the communication circuits of contemporary literature. If "Theory" with a capital T remains a category intent on policing the boundaries of the literary, she argues, then it will be unable to account for the popular movements endogenous to contemporary literary culture. Seth Perlow and Christian Howard spotlight such popular (even viral) movements by taking Instapoetry and Twitterature as their objects of study. While both concede the distance of these corporate-branded genres from the sanctified space of the literary, they use that distance to reconsider the metaphysics of authorial presence and the critic's duty to specify inherited criteria of value when writing about such texts.

Finally, Tess McNulty and Leah Price take us from the zone of popular culture to the paraliterary where reading resides not in the hands of critics but in those of marketers, scientists, and doctors. Tess asks what counts as "content" in web 2.0, prompting us to recognize and shift our critical gaze towards the stuff that demands our attention but does not reward it. She counterintuitively finds the strategies of clickbait permeating the aesthetics of writers we can confidently call literary, foremost George Saunders. Leah in turn makes a distinction between reading literature and literary reading as she follows the newly minted profession of bibliotherapy through its neuroscientific rationales and its own participatory networks. When reading recommendations morph into medical prescriptions and concierge services, the consolations of literature soften the critical edge of disciplinary literary studies.

As is evident from each of these synopses, a major objective of this cluster is to conduct a self-assessment of our own processes, methods, and perspectives as literary critics. To that end, we have turned the tables on ourselves and made our contributions into a combined object of study for two new participants: Tina Lumbis and Jared Zeiders . Both graduate students in English and Comparative Literature at San Diego State University with a focus on digital humanities, Tina and Jared synthesize the ideas from our essayistic arguments into two different types of data visualizations that together add another layer of interpretation and collectivization.

Our editorial vision is to create a microcosm of the participatory cultures we study. We employ an array of digital tools and digitally-informed strategies to chart the changing contours of the literary field. As our cluster demonstrates, contemporary literary criticism need not be segregated into qualitative and quantitative camps. The criticism for which we advocate ranges between methods, modes, and objects of study. The literary is alive and well within web 2.0 participatory cultures and presents ample opportunity for creative, even heterodox, approaches to it.

Aarthi Vadde is associate professor of English at Duke University. She is the author of  Chimeras of Form: Modernist Internationalism beyond Europe, 1914-2016   (Columbia UP, 2016; winner of 2018 Harry Levin Prize) and the co-editor of  The Critic as Amateur   (Bloomsbury Academic, 2019).

Jessica Pressman is associate professor of English at San Diego State University. She is the author of Digital Modernism: Making It New in New Media (Oxford UP, 2014), co-author, with Mark C. Marino and Jeremy Douglass, of Reading Project: A Collaborative Analysis of William Poundstone's Project for Tachistoscope {Bottomless Pit} (Iowa UP, 2015), and co-editor, with N. Katherine Hayles, of Comparative Textual Media: Transforming the Humanities in a Postprint Era (Minnesota UP, 2013).

Keywords : Web 2.0, literary criticism, participatory culture, liveness

  1. Philip Auslander, Liveness: Performance in a Mediatized Culture (New York: Routledge, 1999).
  2. See Sara Ahmed, Queer Phenomenology: Orientations, Objects, Others (Durham: Duke University Press, 2006) and Franco Moretti, Graphs, Maps, Trees (New York: Verso, 2005).


What is Web 2.0?

P.S. Interestingly, the fact that I'm using Flickr for figures in my writing these days led to a leak of one part of the essay before the piece as a whole was published. I posted a "Web 2.0 meme map" to Flickr so I could reference it from there in the article. I thought I'd get the article up before anyone noticed it, but I was surprised to see Business Week pick it up last week.

Comments (16)

  Josh Owens [09.30.05 08:08 PM]

Good read. We actually discussed the map in question on our latest podcast... http://web20show.com/articles/2005/09/28/web-2-0-show-episode-2

  Shamil [10.01.05 01:48 AM]

Great article explaining not only Web 2.0 but the underlying psychology and business logic.

There is one doubt though: access. Isn't access a critical component of making Web 2.0 a ubiquitous platform? It appears to me that a monopoly could emerge in this capital-intensive business, as entire networks are owned by a few large telecom companies.

Would appreciate your thoughts on the above. Thanks, Shamil

  Douglass Turner [10.01.05 07:58 AM]

Excellent piece. What Web 2.0 does not address, indeed is incapable of addressing, is the need for fundamental advancement in the user interface of the Web. The Internet is the platform, but loose coupling and mashups create woeful interfaces.

If you want to get a look at where Web interfaces need to go, take a look at our work in search interface technology called SearchIris:

demo: http://www.visual-io.com/search/index.html

howto: http://www.visual-io.com/search/about.html

  Tim O'Reilly [10.01.05 10:33 AM]

I said I'm not fond of definitions, but I woke up this morning with the start of one in my head:

Web 2.0 is the network as platform, spanning all connected devices; Web 2.0 applications are those that make the most of the intrinsic advantages of that platform: delivering software as a continually-updated service that gets better the more people use it, consuming and remixing data from multiple sources, including individual users, while providing their own data and services in a form that allows remixing by others, creating network effects through an "architecture of participation," and going beyond the page metaphor of Web 1.0 to deliver rich user experiences.

  C. Enrique Ortiz [10.01.05 05:19 PM]

Do you remember "the network is the computer"? Web 2.0 is based on that same concept, except that Web 2.0 further defines what the network should be -- an open and always available architecture and set of services, that can be consumed as is, or combined into compound services, that are accessible via open and consistent methods, regardless of device, platform, computer language.

  SutroStyle [10.01.05 11:37 PM]

You should really look into this news before you muse about Web 2.0; it might be more important: http://www.eweek.com/article2/0,1895,1865104,00.asp

  Thomas Madsen-Mygdal [10.02.05 01:35 PM]

I'm very disappointed that you're promoting company-owned data silos as a core of web 2.0.

"Control over unique, hard-to-recreate data sources that get richer as more people use them".

Although you describe the free data movement as something that will come, the result before that will be many years of lock-in.

Products where the product actually is the community/commons. Products that get network effects and require people to use their product to be a part of the community/commons.

Is it really "web 2.0" to lock your customers in and base your business model on aggregation of your customers' data, while at the same time limiting competition because of network effects? Simple open standards could allow decentralization today, and in many instances already are doing it. (What would the blogosphere be like if it were based on this centralized model?)

Why shouldn't we be able to do this now, and why not promote this model, which already is emerging in many ways?

  Suresh Kumar [10.02.05 01:40 PM]

I don't think Tim was promoting data silos...

I think he was making the point that a Web 2.0 company knows how to extend the value of a 'data-set' that it may not necessarily own.

Google Maps and Amazon are examples of that.

  Tim O'Reilly [10.02.05 03:15 PM]

What Suresh said is correct.

However, I *do* believe that owning a hard-to-recreate source of data will be seen as a competitive advantage in the Web 2.0 era. Recognizing that fact is not "promoting" it. If anything, it will allow those who value free access to data to take steps to ensure that they own their own data, rather than getting sucked into various schemes where other people have them by the short hairs because they don't realize what the levers of power are.

  choi li akiro singh santos [10.03.05 09:07 AM]

East Asia is already on Web 4.0 :-)

Great summary. The discussion and examples ignore the real changes sweeping East Asia. The last time I checked, it is the WORLD Wide Web. East Asia is paving the way as far as the next-generation Internet goes: while North America is stuck at 1-3 megabits, we are at 10 megabits.

Here is a recent article on Web 4.0 in South Korea:

http://www.businessweek.com/@@@l0wPIUQ@6DlRwEA/magazine/content/05_39/b3952405.htm

One cannot underestimate the potential of multi-player games in creating persistent virtual environments. Warcraft has hundreds of thousands of FANATIC users in South Korea alone:

http://www.businessweek.com/@@IPq474UQ*aDlRwEA/magazine/content/05_38/b3951085.htm

We in the east think that our edge in bandwidth will allow us to not look too much to the West this time around. We may not be doing much in terms of conceptualizing the framework we are in, but things are moving so fast out here, we defer to you bandwidth-starved folks in the West on that point. As William Gibson aptly said, "The future is here. It's just not evenly distributed yet."

Perhaps another core competency is the ability to function and build communities in a multi-lingual WWW.

  Thomas Madsen-Mygdal [10.04.05 05:13 AM]

I understand your position.

Although I respect your "radar" tremendously, I think it's off a bit on this one. There's data proving that what you're promoting/deconstructing isn't the emerging model.

And that piece of data is that little thing called the blogosphere.

If one sat down, read your article, and accepted it as best practice, the whole blogosphere would be running on one giant eBay-style service where you'd need to be a member in order to be part of the party. All the early blog pioneers recognized that they were part of something bigger and that any attempts to centralize it would fail. So they created RSS, ping, the MetaWeblog API, etc. Blogs were decentralized from their very beginning. Imagine how much less innovation would have happened if all the tools and services were controlled by a single company.

What is happening now is that the new companies aren't pushing open standards, creating light versions of the semantic web in their spaces, etc. They are pushing the eBay best practice of the mid-90s, not the decentralized web model that really is the core of what's happening.

So imho there's data to back the decentralized model as the emerging one. And in my opinion, especially at a time when you know this is going to become hype/buzz, there's certainly a possibility to take a stand and make sure we don't spend 10 years going down the same wrong path as before.

I think you're underestimating the impact you're having on this one, and thereby the responsibility.

As Brenda Laurel says in a great quote: "Stories, movies, videogames and websites don't have to be about values to have a profound influence on values. Values are everywhere, embedded in every aspect of our culture and lurking in the very natures of our media and technologies. Culture workers know the question isn't whether they are there but who is taking responsibility for them, how they are being shaped, and how they are shaping us for the future."

At its core, web 2.0 should be about giving up control over users and data, and recognizing that your little company is only a very small part of something much larger, whether in your field or in general.

  Tim O'Reilly [10.04.05 09:14 AM]

I think you misunderstand my position. It's far more nuanced than the idea that either centralization OR decentralization is a trump card. I believe that decentralization is a key driver of innovation in Web 2.0 (P2P, web services, blogging -- and the fundamental architecture of the web, or the internet itself, being good examples.) But each wave of decentralization leads to clever new forms of centralization, clever new forms of competitive advantage.

I'm willing to take a large bet that the blogosphere is NOT the counter example that you argue for. Why? History repeats itself. The rhetoric of the early web was much the same as the rhetoric of blogging: everyone is equal, anyone can put up a web site. Not only that, the web was one of the most profoundly decentralized architectures you can imagine, with zero barriers to entry. Yet within only a decade, we had giant companies so powerful that a whole industry has grown up whereby the "long tail" of decentralization tries to optimize their notice by the search engines, who are now at the hugely profitable head of the once flat web.

And this is already happening in blogging. Look at the influence of sites like Technorati, Feedster, Bloglines etc. in anointing "the top bloggers" (who get increasing attention as a result). Notice also how many of the sites in the top 100 blogs are already owned not by individuals but by blog publishing companies like Weblogsinc and Gawker Media.

As to taking a stand, I will take my stand here: identifying a trend is not promoting it. Is a scientist who writes about global warming promoting it?

I see all around me people who understand that the business opportunity is in controlling key data sources, or building network-effect businesses that give them a preeminent position in what was originally a decentralized marketplace.

How you choose to react to that is up to you. Some people will make it part of their business plan, others will try to oppose it. But knowing that that is the game is step one.

If you look at my talks (and I've been talking about this for quite a few years), one slide I frequently show is one about the Internet Operating System, with two images, and the question, "What kind of operating system do we want?" The two images: "One ring to bind them", and a routing map of the Internet, with the caption "Small Pieces Loosely Joined."

I love the "small pieces loosely joined" architecture of the Internet and of the most successful open source projects, and I spend a great deal of my time promoting the benefits of that architecture. But I also recognize the counterforce, and understand its attraction, perhaps even its inevitability.

When Steven Levy was working on the profile he did of me in the recent issue of Wired, we talked about this dynamic. I quoted from a poem of Wallace Stevens (Esthetique du Mal): "the tragedy begins again, in the yes of the realist spoken because he must say yes, because beneath every no lay a yes that had never been broken."

Decentralization is overcome by re-centralization again and again (the PC, once the haven of homebrew hackers, became the core of huge corporate hegemonies; the internet followed the same path; so too will the latest generation of applications.) Yet innovation comes again and again from the fringes, because of something in us that keeps inventing a new, free future, and will not take no for an answer.

That dynamic is the core of progress. It was only through the profit impulse, which led to huge companies taking control of these once decentralized hacker projects, that they were brought to the general public. The hackers move on, and make the future interesting again.

It's really all quite lovely.

  George Chiramattel [10.14.05 12:18 PM]

Hi Tim, First of all let me congratulate you on this beautiful article.

I would also like to add to this discussion. If the Internet represents the 'collective intelligence of humanity' then in my opinion we require better tooling to utilize it. I wouldn't expect the 'virtual brain of humanity' to come with a 'search box' as its primary interface :-) At the following URL, I have described how we can build a better tool to handle the huge volume of information that is getting published on the net. I call this tool FolkMind. http://www.chiramattel.com/george/blog/archives/2005_10.html#a000084

  gogle [06.15.06 02:59 AM]

Well. You write, "I'm willing to take a large bet that the blogosphere is NOT the counterexample that you argue for. Why? History repeats itself." But history never repeats. I am not arguing with you.

  Alex Szilaghi [01.11.07 08:59 PM]

Web 2.0 is the future. It's really hard to define something that is just starting. Anyhow, keep up the writing. You have good info in your articles.

  taly weiss [01.17.07 07:23 AM]

Web 2.0 is an open-to-all platform, and as such it must hold all information with no restraints. That is the beauty of this platform, and yes, it does imply that you will find a lot of irrelevant information (the Web 2.0 antagonists claim that "we get too much crap"). But this is exactly why contribution is necessary. If you come across unimportant information, you should "bury" it, or rank it low. This contribution will serve to sort the information by its relevancy and true value. I see contribution not only as a means of adding more information but as a key to helping others navigate the endless information on offer. With others actively responding, you can decide what is best to read and what is worth your attention. Improving contribution behavior is a necessary key to Web 2.0's success, but unfortunately it hasn't yet received the attention it deserves. We can easily recognize that not many actually contribute to this collective intelligence, as the participation ratios found on Web 2.0 sites (voters or rankers relative to viewers, commenters relative to viewers, members relative to viewers) reach about 3-5% at most. For more on this issue see www.trendsspotting.com/blog/?p=1, www.trendsspotting.com/blog/?p=4

Web 2.0 essay

Excite really never got the business model right at all. We fell into the classic problem of how when a new medium comes out it adopts the practices, the content, the business models of the old medium—which fails, and then the more appropriate models get figured out.
Sites like del.icio.us and flickr allow users to "tag" content with descriptive tokens. But there is also huge source of implicit tags that they ignore: the text within web links. Moreover, these links represent a social network connecting the individuals and organizations who created the pages, and by using graph theory we can compute from this network an estimate of the reputation of each member. We plan to mine the web for these implicit tags, and use them together with the reputation hierarchy they embody to enhance web searches.
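A minimal sketch of that idea might look like the following, assuming a toy link graph; the site names, anchor texts, and the PageRank-style damping factor are illustrative assumptions, not the authors' actual system.

```python
# Minimal sketch: estimate reputation from a web link graph with a
# PageRank-style power iteration, keeping link anchor text as implicit
# "tags" for each target. The graph below is made up; in practice the
# edges would come from crawled pages. Dangling nodes (sites with no
# outgoing links) are simply ignored for brevity.

# who links to whom: source -> list of (target, anchor_text)
links = {
    "alice.example": [("bob.example", "great photo tips"), ("carol.example", "linux howto")],
    "bob.example":   [("carol.example", "linux howto")],
    "carol.example": [("alice.example", "photography")],
}

def reputation(links, damping=0.85, iterations=50):
    """Return a PageRank-like reputation score for each site."""
    sites = set(links) | {t for outs in links.values() for t, _ in outs}
    rank = {s: 1.0 / len(sites) for s in sites}
    for _ in range(iterations):
        new_rank = {s: (1.0 - damping) / len(sites) for s in sites}
        for source, outs in links.items():
            if not outs:
                continue
            share = damping * rank[source] / len(outs)
            for target, _anchor in outs:
                new_rank[target] += share
        rank = new_rank
    return rank

def implicit_tags(links):
    """Collect the anchor text pointing at each site as its implicit tags."""
    tags = {}
    for outs in links.values():
        for target, anchor in outs:
            tags.setdefault(target, []).append(anchor)
    return tags

if __name__ == "__main__":
    tags = implicit_tags(links)
    for site, score in sorted(reputation(links).items(), key=lambda kv: -kv[1]):
        print(f"{score:.3f}  {site}  tags={tags.get(site, [])}")
```

A search that weights matches by these reputation scores, and that treats the collected anchor text as descriptive tags, is one way to combine the two signals the paragraph above describes.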


Six ways to make Web 2.0 work

Technologies known collectively as Web 2.0 have spread widely among consumers over the past five years. Social-networking Web sites, such as Facebook and MySpace, now attract more than 100 million visitors a month. As the popularity of Web 2.0 has grown, companies have noted the intense consumer engagement and creativity surrounding these technologies. Many organizations, keen to harness Web 2.0 internally, are experimenting with the tools or deploying them on a trial basis.

Over the past two years, McKinsey has studied more than 50 early adopters to garner insights into successful efforts to use Web 2.0 as a way of unlocking participation. We have surveyed, independently, a range of executives on Web 2.0 adoption. Our work suggests the challenges that lie ahead. To date, as many survey respondents are dissatisfied with their use of Web 2.0 technologies as are satisfied. Many of the dissenters cite impediments such as organizational structure, the inability of managers to understand the new levers of change, and a lack of understanding about how value is created using Web 2.0 tools. We have found that, unless a number of success factors are present, Web 2.0 efforts often fail to launch or to reach expected heights of usage. Executives who are suspicious or uncomfortable with perceived changes or risks often call off these efforts. Others fail because managers simply don’t know how to encourage the type of participation that will produce meaningful results.

Twitter responses from our readers

After “Six ways to make Web 2.0 work” was posted, we wanted to encourage Twitter users to continue the conversation. Twitter allows individuals to broadcast 140-character posts to a loosely connected community of followers. Within a few days, over 300 posts used the #web2.0work hashtag we established to monitor conversations and respond to the stream of opinions surrounding the article. (Hashtags are a community-driven convention for adding additional context and metadata to Twitter posts. Users can create a hashtag simply by prefixing a word with a hash (#) symbol, for example #web2.0work.)
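In code, the hashtag convention amounts to nothing more than scanning posts for tokens that begin with #. The short sketch below is purely illustrative: the sample posts and the handling of the period in #web2.0work are assumptions, not the monitoring setup actually used for the article.

```python
import re

# Minimal sketch: pull hashtags out of short posts and group posts by tag.
# The pattern allows "." inside a tag so "#web2.0work" is kept whole; a
# trailing "." at the end of a sentence is stripped. Sample posts are made up.

HASHTAG = re.compile(r"#[\w.]+")

posts = [
    "Rec 1 is spot on, even for conservative companies. #web2.0work",
    "Point 3 really important, embed new things in the day job #web2.0work",
    "Interesting read on enterprise adoption.",
]

def hashtags(text):
    return [tag.rstrip(".").lower() for tag in HASHTAG.findall(text)]

def group_by_tag(posts):
    groups = {}
    for post in posts:
        for tag in hashtags(post):
            groups.setdefault(tag, []).append(post)
    return groups

if __name__ == "__main__":
    for tag, tagged in group_by_tag(posts).items():
        print(tag, len(tagged), "post(s)")
```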

The tweets (a tweet is a single post or status update on Twitter) came in several varieties. Many respondents simply reported that we had posted the article and offered a shortened URL back to the piece on mckinseyquarterly.com. Others, however, went further, commenting on the findings of the article and sharing how they have been integrating some of the “six ways” precepts into their own Web 2.0 processes.

@estephen : @mckquarterly #web2.0work Rec 1 is spot on, even for conservative companies. we use this technique for wiki/blog internally. very effective.

@Racecarver48 : #web2.0work From recent first pilots, I can specifically agree with your theses 4, 5, and 6. We applied the successful w/o knowing them;-)

@Nurbani : @McKQuarterly #web2.0work #5 hit home 4 mktg initiatives.

The issue that Twitter respondents seemed to agree on the most was that Web 2.0 work can’t be added to an already full load, but instead needs to be meshed with everyday workflows.

@Barry_H : #web2.0work McKinsey report http://bit.ly/5Ac1y. Point 3 really important, embed new things in the day job don't add more work

@sifowler : @McKQuarterly 3. The tool has to fit in the workflow. Perceived 'extra' work reduces take-up. #web2.0work

@Salv_Reina : @McKQuarterly re Workflow, this is critical. If the tool sits outside day 2 day work, it won't take easily #web2.0work

Some users raised questions about points in our analysis that they thought were missing or not fully developed.

@tomguarriello : @McKQuarterly Yes, but your recs don't address the fear of social media that paralyzes many organizations. Loss of control/risk stops them

@RichardStacy : @mckquarterly possible adjunct to point 2 - create permission to fail, encourage experimentation and provide 'ROI break' #web2.0work.

Or what they thought we got wrong.

@drkleiman : it feels like too much focus on the tools and tech itself, not enough on thinking thru strategy & goals that can be accomplished #web2.0work

Others felt the article’s list focused too heavily on issues internal to companies rather than on the potential for participation beyond organizational borders.

@timolsen : #web2.0work - Article emphasizes how to use web 2.0 internally, I think using it externally is the real kicker.

One of the article’s authors, Michael Chui, @mchui , directed this respondent to other published research where we analyze how Web 2.0 can be used to enlist participation among consumers.

@mchui : @timolsen Definitely agree that external uses of Web 2.0 are important - see http://bit.ly/FDPsM #web2.0work

In some cases, the article prompted a fuller dialogue away from the Twitter environment. A number of tweets posted links to more substantial blog site responses.

@theparallaxview : new post 'McKinsey's Six of the Best for Web 2.0 Work' http://bit.ly/xvc9v #web2.0work @McKQuarterly

@broadstuff : McKinsey - 6 Lessons for making good use of Web 2.0 http://tinyurl.com/bd9ahg

@SocialMedia411 : Contribution and Connection are the New Currency (ConversationAgent): http://bit.ly/aSBY2 Response to McKinsey Report

@SandeepVizEdu : 6 Ways To Make Web2.0 Work - Visual Adaption of #web2.0work by McKinsey http://tinyurl.com/b5f5o6

One respondent, after reading the piece, vowed to take action.

@kwjenkins : @McKQuarterly The article inspired me to start a blog on our work Intranet - we believe in collab, just not a lot of grassroots efforts yet

Some historical perspective is useful. Web 2.0, the latest wave in corporate technology adoptions, could have a more far-reaching organizational impact than technologies adopted in the 1990s, such as enterprise resource planning (ERP), customer relationship management (CRM), and supply chain management (Exhibit 1). The latest Web tools have a strong bottom-up element and engage a broad base of workers. They also demand a mind-set different from that of earlier IT programs, which were instituted primarily by edicts from senior managers.

Exhibit 1. The new tools: Web 2.0 tools and technology adoption

Web 2.0 covers a range of technologies. The most widely used are blogs, wikis, podcasts, information tagging, prediction markets, and social networks (Exhibit 2). New technologies constantly appear as the Internet continues to evolve. Of the companies we interviewed for our research, all were using at least one of these tools. What distinguishes them from previous technologies is the high degree of participation they require to be effective. Unlike ERP and CRM, where most users either simply process information in the form of reports or use the technology to execute transactions (such as issuing payments or entering customer orders), Web 2.0 technologies are interactive and require users to generate new information and content or to edit the work of other participants.

Exhibit 2. A range of technologies: Web 2.0 technologies for collaboration and communication

Earlier technologies often required expensive and lengthy technical implementations, as well as the realignment of formal business processes. With such memories still fresh, some executives naturally remain wary of Web 2.0. But the new tools are different. While they are inherently disruptive and often challenge an organization and its culture, they are not technically complex to implement. Rather, they are a relatively lightweight overlay to the existing infrastructure and do not necessarily require complex technology integration.

Gains from participation

Clay Shirky, an adjunct professor at New York University, calls the underused human potential at companies an immense “cognitive surplus” and one that could be tapped by participatory tools. Corporate leaders are, of course, eager to find new ways to add value. Over the past 15 years, using a combination of technology investments and process reengineering, they have substantially raised the productivity of transactional processes. Web 2.0 promises further gains, although the capabilities differ from those of the past technologies (Exhibit 3).

Exhibit 3. Management capabilities unlocked by participation: Web 2.0 content generation and community building

Research by our colleagues shows how differences in collaboration are correlated with large differences in corporate performance (Scott C. Beardsley, Bradford C. Johnson, and James M. Manyika, “Competitive advantage from better interactions,” mckinseyquarterly.com, May 2006). Our most recent Web 2.0 survey demonstrates that despite early frustrations, a growing number of companies remain committed to capturing the collaborative benefits of Web 2.0 (“Building the Web 2.0 Enterprise: McKinsey Global Survey Results,” mckinseyquarterly.com, July 2008). Since we first polled global executives two years ago, the adoption of these tools has continued. Spending on them is now a relatively modest $1 billion, but the level of investment is expected to grow by more than 15 percent annually over the next five years, despite the current recession (see G. Oliver Young et al., “Can enterprise Web 2.0 survive the recession?” forrester.com, January 6, 2009).

Management imperatives for unlocking participation

To help companies navigate the Web 2.0 landscape, we have identified six critical factors that determine the outcome of efforts to implement these technologies.

1. The transformation to a bottom-up culture needs help from the top. Web 2.0 projects often are seen as grassroots experiments, and leaders sometimes believe the technologies will be adopted without management intervention—a “build it and they will come” philosophy. These business leaders are correct in thinking that participatory technologies are founded upon bottom-up involvement from frontline staffers and that this pattern is fundamentally different from the rollout of ERP systems, for example, where compliance with rules is mandatory. Successful participation, however, requires not only grassroots activity but also a different leadership approach: senior executives often become role models and lead through informal channels.

At Lockheed Martin, for instance, a direct report to the CIO championed the use of blogs and wikis when they were introduced. The executive evangelized the benefits of Web 2.0 technologies to other senior leaders and acted as a role model by establishing his own blog. He set goals for adoption across the organization, as well as for the volume of contributions. The result was widespread acceptance and collaboration across the company’s divisions.

2. The best uses come from users—but they require help to scale. In earlier IT campaigns, identifying and prioritizing the applications that would generate the greatest business value was relatively easy. These applications focused primarily on improving the effectiveness and efficiency of known business processes within functional silos (for example, supply-chain-management software to improve coordination across the network). By contrast, our research shows the applications that drive the most value through participatory technologies often aren’t those that management expects.

Efforts go awry when organizations try to dictate their preferred uses of the technologies—a strategy that fits applications designed specifically to improve the performance of known processes—rather than observing what works and then scaling it up. When management chooses the wrong uses, organizations often don’t regroup by switching to applications that might be successful. One global technology player, for example, introduced a collection of participatory tools that management judged would help the company’s new hires quickly get up to speed in their jobs. The intended use never caught on, but people in the company’s recruiting staff began using the tools to share recruiting tips and pass along information about specific candidates and their qualifications. The company, however, has yet to scale up this successful, albeit unintended, use.

At AT&T, it was frontline staffers who found the best use for a participatory technology—in this case, using Web 2.0 for collaborative project management. Rather than dictating the use, management broadened participation by supporting an awareness campaign to seed further experimentation. Over a 12-month period, the use of the technology rose to 95 percent of employees, from 65 percent.

3. What’s in the workflow is what gets used. Perhaps because of the novelty of Web 2.0 initiatives, they’re often considered separate from mainstream work. Earlier generations of technologies, by contrast, often explicitly replaced the tools employees used to accomplish tasks. Thus, using Web 2.0 and participating in online work communities often becomes just another “to do” on an already crowded list of tasks.

Participatory technologies have the highest chance of success when incorporated into a user’s daily workflow. The importance of this principle is sometimes masked by short-term success when technologies are unveiled with great fanfare; with the excitement of the launch, contributions seem to flourish. As normal daily workloads pile up, however, the energy and attention surrounding the rollout decline, as does participation. One professional-services firm introduced a wiki-based knowledge-management system, to which employees were expected to contribute, in addition to their daily tasks. Immediately following the launch, a group of enthusiasts used the wikis vigorously, but as time passed they gave the effort less personal time—outside their daily workflow—and participation levels fell.

Google is an instructive case to the contrary. It has modified the way work is typically done and has made Web tools relevant to how employees actually do their jobs. The company’s engineers use blogs and wikis as core tools for reporting on the progress of their work. Managers stay abreast of their progress and provide direction by using tools that make it easy to mine data on workflows. Engineers are better able to coordinate work with one another and can request or provide backup help when needed. The easily accessible project data allows senior managers to allocate resources to the most important and time-sensitive projects.

Pixar moved in a similar direction when it upgraded a Web 2.0 tool that didn’t quite mesh with the way animators did their jobs. The company started with basic text-based wikis to share information about films in production and to document meeting notes. That was unsatisfactory, since collaborative problem solving at the studio works best when animators, software engineers, managers, and directors analyze and discuss real clips and frames from a movie (see Hayagreeva Rao, Robert Sutton, and Allen P. Webb, “Innovation lessons from Pixar: An interview with Oscar-winning director Brad Bird,” mckinseyquarterly.com, April 2008). Once Pixar built video into the wikis, their quality improved as critiques became more relevant. The efficiency of the project groups increased as well.

4. Appeal to the participants’ egos and needs—not just their wallets. Traditional management incentives aren’t particularly useful for encouraging participation. (Exceptions exist for harnessing information markets and searching crowd expertise, where formal incentives are an essential part of the mechanism for participation.) Earlier technology adoptions could be guided readily with techniques such as management by objectives, as well as standardized bonus pay or individual feedback. The failure of employees to use a mandated application would affect their performance metrics and reviews. These methods tend to fall short when applied to unlocking participation. In one failed attempt, a leading Web company set performance evaluation criteria that included the frequency of postings on the company’s newly launched wiki. While individuals were posting enough entries to meet the benchmarks, the contributions were generally of low quality. Similarly, a professional-services firm tried to use steady management pressure to get individuals to post on wikis. Participation increased when managers doled out frequent feedback but never reached self-sustaining levels.

A more effective approach plays to the Web’s ethos and the participants’ desire for recognition: bolstering the reputation of participants in relevant communities, rewarding enthusiasm, or acknowledging the quality and usefulness of contributions. ArcelorMittal, for instance, found that when prizes for contributions were handed out at prominent company meetings, employees submitted many more ideas for business improvements than they did when the awards were given in less-public forums.

5. The right solution comes from the right participants. Targeting users who can create a critical mass for participation as well as add value is another key to success. With an ERP rollout, the process is straightforward: a company simply identifies the number of installations (or “seats”) it needs to buy for functions such as purchasing or finance and accounting. With participatory technologies, it’s far from obvious which individuals will be the best participants. Without the right base, efforts are often ineffective. A pharmaceutical company tried to generate new product ideas by tapping suggestions from visitors to its corporate Web site. It soon discovered that most of them had neither the skills nor the knowledge to make meaningful contributions, so the quality of the ideas was very low.

To select users who will help drive a self-sustaining effort (often enthusiastic early technology adopters who have rich personal networks and will thus share knowledge and exchange ideas), a thoughtful approach is required. When P&G introduced wikis and blogs to foster collaboration among its workgroups, the company targeted technology-savvy and respected opinion leaders within the organization. Some of these people ranked high in the corporate hierarchy, while others were influential scientists or employees to whom other colleagues would turn for advice or other assistance.

When Best Buy experimented with internal information markets, the goal was to ensure that participation helped to create value. In these markets, employees place bets on business outcomes, such as sales forecasts. (See Renée Dye, “The promise of prediction markets: A roundtable,” mckinseyquarterly.com, April 2008, and the video “Betting on prediction markets,” mckinseyquarterly.com, November 2007.) To improve the chances of success, Best Buy cast its net widely, going beyond in-house forecasting experts; it also sought out participants with a more diverse base of operational knowledge who could apply independent judgment to the prediction markets. The resulting forecasts were more accurate than those produced by the company’s experts.
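The mechanics behind such a market can be sketched in a few lines. The example below uses Hanson's logarithmic market scoring rule, one common way to run an internal prediction market; the outcomes, trades, and liquidity parameter b are illustrative assumptions, not a description of Best Buy's actual system.

```python
import math

# Minimal sketch of a prediction market using the logarithmic market
# scoring rule (LMSR). Participants buy shares in mutually exclusive
# outcomes; the instantaneous prices sum to 1 and can be read as the
# market's probability forecast.

class PredictionMarket:
    def __init__(self, outcomes, b=100.0):
        self.b = b                              # liquidity: higher b means prices move less per trade
        self.shares = {o: 0.0 for o in outcomes}

    def _cost(self, shares):
        return self.b * math.log(sum(math.exp(q / self.b) for q in shares.values()))

    def prices(self):
        z = sum(math.exp(q / self.b) for q in self.shares.values())
        return {o: math.exp(q / self.b) / z for o, q in self.shares.items()}

    def buy(self, outcome, amount):
        """Buy `amount` shares of `outcome`; return the cost charged to the trader."""
        before = self._cost(self.shares)
        self.shares[outcome] += amount
        return self._cost(self.shares) - before

if __name__ == "__main__":
    market = PredictionMarket(["Q4 sales beat forecast", "Q4 sales miss forecast"])
    print(market.prices())                      # starts at 50/50
    cost = market.buy("Q4 sales beat forecast", 40)
    print(f"trade cost: {cost:.2f}")
    print(market.prices())                      # price of 'beat' rises toward the crowd's belief
```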

6. Balance the top-down and self-management of risk. A common reason for failed participation is discomfort with it, or even fear. In some cases, the lack of management control over the self-organizing nature and power of dissent is the issue. In others, it’s the potential repercussions of content—through blogs, social networks, and other venues—that is detrimental to the company. Numerous executives we interviewed said that participatory initiatives had been stalled by legal and HR concerns. These risks differ markedly from those of previous technology adoptions, where the chief downside was high costs and poor execution.

Companies often have difficulty maintaining the right balance of freedom and control. Some organizations, trying to accommodate new Web standards, have adopted total laissez-faire policies, eschewing even basic controls that screen out inappropriate postings. In some cases, these organizations have been burned.

Prudent managers should work with the legal, HR, and IT security functions to establish reasonable policies, such as prohibiting anonymous posting. Fears are often overblown, however, and the social norms enforced by users in the participating communities can be very effective at policing user exchanges and thus mitigating risks. The sites of some companies incorporate “flag as inappropriate” buttons, which temporarily remove suspect postings until they can be reviewed, though officials report that these functions are rarely used. Participatory technologies should include auditing functions, similar to those for e-mail, that track all contributions and their authors. Ultimately, however, companies must recognize that successful participation means engaging in authentic conversations with participants.

Acceptance of Web 2.0 technologies in business is growing. Encouraging participation calls for new approaches that break with the methods used to deploy IT in the past. Company leaders first need to survey their current practices. Once they feel comfortable with some level of controlled disruption, they can begin testing the new participatory tools. The management imperatives we have outlined should improve the likelihood of success.

Michael Chui is a consultant in McKinsey’s San Francisco office; Andy Miller is an associate principal in the Silicon Valley office, where Roger Roberts is a principal.

The authors would like to acknowledge the contributions of their colleagues James Manyika, Yooki Park, Bryan Pate, and Kausik Rajgopal.


Nicholas Carr's blog

The amorality of Web 2.0

This post, along with seventy-eight others, is collected in the book Utopia Is Creepy.

From the start, the World Wide Web has been a vessel of quasi-religious longing. And why not? For those seeking to transcend the physical world, the Web presents a readymade Promised Land. On the Internet, we’re all bodiless, symbols speaking to symbols in symbols. The early texts of Web metaphysics, many written by thinkers associated with or influenced by the post-60s New Age movement, are rich with a sense of impending spiritual release; they describe the passage into the cyber world as a process of personal and communal unshackling, a journey that frees us from traditional constraints on our intelligence, our communities, our meager physical selves. We become free-floating netizens in a more enlightened, almost angelic, realm.

But as the Web matured during the late 1990s, the dreams of a digital awakening went unfulfilled. The Net turned out to be more about commerce than consciousness, more a mall than a commune. And when the new millennium arrived, it brought not a new age but a dispiritingly commonplace popping of a bubble of earthly greed. Somewhere along the way, the moneychangers had taken over the temple. The Internet had transformed many things, but it had not transformed us. We were the same as ever.

The New New Age

But the yearning for a higher consciousness didn’t burst with the bubble. Web 1.0 may have turned out to be spiritual vaporware, but now we have the hyper-hyped upgrade: Web 2.0. In a profile of Internet savant Tim O’Reilly in the current issue of Wired, Steven Levy writes that “the idea of collective consciousness is becoming manifest in the Internet.” He quotes O’Reilly: “The Internet today is so much an echo of what we were talking about at [New Age HQ] Esalen in the ’70s – except we didn’t know it would be technology-mediated.” Levy then asks, “Could it be that the Internet – or what O’Reilly calls Web 2.0 – is really the successor to the human potential movement?”

Levy’s article appears in the afterglow of Kevin Kelly’s sweeping “We Are the Web” in Wired’s August issue. Kelly, erstwhile prophet of the Long Boom, surveys the development of the World Wide Web, from the Netscape IPO ten years ago, and concludes that it has become a “magic window” that provides a “spookily godlike” perspective on existence. “I doubt angels have a better view of humanity,” he writes.

But that’s only the beginning. In the future, according to Kelly, the Web will grant us not only the vision of gods but also their power. The Web is becoming “the OS for a megacomputer that encompasses the Internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network. This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and bodies but our minds … We will live inside this thing.”

The revelation continues:

There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born. You and I are alive at this moment. We should marvel, but people alive at such times usually don’t. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization. Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.

This isn’t the language of exposition. It’s the language of rapture.

The Cult of the Amateur

Now, lest you dismiss me as a mere cynic, if not a fallen angel, let me make clear that I’m all for seeking transcendence, whether it’s by going to church or living in a hut in the woods or sitting at the feet of the Maharishi or gazing into the glittering pixels of an LCD screen. One gathers one’s manna where one finds it. And if there’s a higher consciousness to be found, then by all means let’s get elevated. My problem is this: When we view the Web in religious terms, when we imbue it with our personal yearning for transcendence, we can no longer see it objectively. By necessity, we have to look at the Internet as a moral force, not as a simple collection of inanimate hardware and software. No decent person wants to worship an amoral conglomeration of technology.

And so all the things that Web 2.0 represents – participation, collectivism, virtual communities, amateurism – become unarguably good things, things to be nurtured and applauded, emblems of progress toward a more enlightened state. But is it really so? Is there a counterargument to be made? Might, on balance, the practical effect of Web 2.0 on society and culture be bad, not good? To see Web 2.0 as a moral force is to turn a deaf ear to such questions.

Let me bring the discussion down to a brass tack. If you read anything about Web 2.0, you’ll inevitably find praise heaped upon Wikipedia as a glorious manifestation of “the age of participation.” Wikipedia is an open-source encyclopedia; anyone who wants to contribute can add an entry or edit an existing one. O’Reilly, in a new essay on Web 2.0, says that Wikipedia marks “a profound change in the dynamics of content creation” – a leap beyond the Web 1.0 model of Britannica Online. To Kevin Kelly, Wikipedia shows how the Web is allowing us to pool our individual brains into a great collective mind. It’s a harbinger of the Machine.

In theory, Wikipedia is a beautiful thing – it has to be a beautiful thing if the Web is leading us to a higher consciousness. In reality, though, Wikipedia isn’t very good at all. Certainly, it’s useful – I regularly consult it to get a quick gloss on a subject. But at a factual level it’s unreliable, and the writing is often appalling. I wouldn’t depend on it as a source, and I certainly wouldn’t recommend it to a student writing a research paper.

Take, for instance, this section from Wikipedia’s biography of Bill Gates, excerpted verbatim:

Gates married Melinda French on January 1, 1994. They have three children, Jennifer Katharine Gates (born April 26, 1996), Rory John Gates (born May 23, 1999) and Phoebe Adele Gates (born September 14, 2002). In 1994, Gates acquired the Codex Leicester, a collection of writings by Leonardo da Vinci; as of 2003 it was on display at the Seattle Art Museum. In 1997, Gates was the victim of a bizarre extortion plot by Chicago resident Adam Quinn Pletcher. Gates testified at the subsequent trial. Pletcher was convicted and sentenced in July 1998 to six years in prison. In February 1998 Gates was attacked by Noël Godin with a cream pie. In July 2005, he solicited the services of famed lawyer Hesham Foda. According to Forbes, Gates contributed money to the 2004 presidential campaign of George W. Bush. According to the Center for Responsive Politics, Gates is cited as having contributed at least $33,335 to over 50 political campaigns during the 2004 election cycle.

Excuse me for stating the obvious, but this is garbage, an incoherent hodge-podge of dubious factoids (who the heck is “famed lawyer Hesham Foda”?) that adds up to something far less than the sum of its parts.

Here’s Wikipedia on Jane Fonda’s life, again excerpted verbatim:

Her nickname as a youth—Lady Jane—was one she reportedly disliked. She traveled to Communist Russia in 1964 and was impressed by the people, who welcomed her warmly as Henry’s daughter. In the mid-1960s she bought a farm outside of Paris, had it renovated and personally started a garden. She visited Andy Warhol’s Factory in 1966. About her 1971 Oscar win, her father Henry said: “How in hell would you like to have been in this business as long as I and have one of your kids win an Oscar before you do?” Jane was on the cover of Life magazine, March 29, 1968. While early she had grown both distant from and critical of her father for much of her young life, in 1980, she bought the play “On Golden Pond” for the purpose of acting alongside her father—hoping he might win the Oscar that had eluded him throughout his career. He won, and when she accepted the Oscar on his behalf, she said it was “the happiest night of my life.” Director and first husband Roger Vadim once said about her: “Living with Jane was difficult in the beginning … she had so many, how do you say, ‘bachelor habits.’ Too much organization. Time is her enemy. She cannot relax. Always there is something to do.” Vadim also said, “There is also in Jane a basic wish to carry things to the limit.”

This is worse than bad, and it is, unfortunately, representative of the slipshod quality of much of Wikipedia. Remember, this emanation of collective intelligence is not just a couple of months old. It’s been around for nearly five years and has been worked over by many thousands of diligent contributors. At this point, it seems fair to ask exactly when the intelligence in “collective intelligence” will begin to manifest itself. When will the great Wikipedia get good? Or is “good” an old-fashioned concept that doesn’t apply to emergent phenomena like communal on-line encyclopedias?

The promoters of Web 2.0 venerate the amateur and distrust the professional. We see it in their unalloyed praise of Wikipedia, and we see it in their worship of open-source software and myriad other examples of democratic creativity. Perhaps nowhere, though, is their love of amateurism so apparent as in their promotion of blogging as an alternative to what they call “the mainstream media.” Here’s O’Reilly: “While mainstream media may see individual blogs as competitors, what is really unnerving is that the competition is with the blogosphere as a whole. This is not just a competition between sites, but a competition between business models. The world of Web 2.0 is also the world of what Dan Gillmor calls ‘we, the media,’ a world in which ‘the former audience,’ not a few people in a back room, decides what’s important.”

I’m all for blogs and blogging. (I’m writing this, ain’t I?) But I’m not blind to the limitations and the flaws of the blogosphere – its superficiality, its emphasis on opinion over reporting, its echolalia, its tendency to reinforce rather than challenge ideological extremism and segregation. Now, all the same criticisms can (and should) be hurled at segments of the mainstream media. And yet, at its best, the mainstream media is able to do things that are different from – and, yes, more important than – what bloggers can do. Those despised “people in a back room” can fund in-depth reporting and research. They can underwrite projects that can take months or years to reach fruition – or that may fail altogether. They can hire and pay talented people who would not be able to survive as sole proprietors on the Internet. They can employ editors and proofreaders and other unsung protectors of quality work. They can place, with equal weight, opposing ideologies on the same page. Forced to choose between reading blogs and subscribing to, say, the New York Times, the Financial Times, the Atlantic, and the Economist, I will choose the latter. I will take the professionals over the amateurs.

But I don’t want to be forced to make that choice.

Scary Economics

And so, having gone on for so long, I at long last come to my point. The Internet is changing the economics of creative work – or, to put it more broadly, the economics of culture – and it’s doing it in a way that may well restrict rather than expand our choices. Wikipedia might be a pale shadow of the Britannica, but because it’s created by amateurs rather than professionals, it’s free. And free trumps quality all the time. So what happens to those poor saps who write encyclopedias for a living? They wither and die. The same thing happens when blogs and other free on-line content go up against old-fashioned newspapers and magazines. Of course the mainstream media sees the blogosphere as a competitor. It is a competitor. And, given the economics of the competition, it may well turn out to be a superior competitor. The layoffs we’ve recently seen at major newspapers may just be the beginning, and those layoffs should be cause not for self-satisfied snickering but for despair. Implicit in the ecstatic visions of Web 2.0 is the hegemony of the amateur. I for one can’t imagine anything more frightening.

In “We Are the Web,” Kelly writes that “because of the ease of creation and dissemination, online culture is the culture.” I hope he’s wrong, but I fear he’s right – or will come to be right.

Like it or not, Web 2.0, like Web 1.0, is amoral. It’s a set of technologies – a machine, not a Machine – that alters the forms and economics of production and consumption. It doesn’t care whether its consequences are good or bad. It doesn’t care whether it brings us to a higher consciousness or a lower one. It doesn’t care whether it burnishes our culture or dulls it. It doesn’t care whether it leads us into a golden age or a dark one. So let’s can the millennialist rhetoric and see the thing for what it is, not what we wish it would be.


193 thoughts on “The amorality of Web 2.0”

If you think about it, those who would infuse some designed progress toward perfection into the Web 2.0 of TCP/IP have something in common with the Dover, PA school board, who want to legislate positive design and purpose into the much older web of DNA/RNA. I agree that the web is amoral. And I believe that is its virtue.

“They can employ editors and proofreaders and other unsung protectors of quality work.”

This is one of the main problems with software today, free or proprietary. A lot of it gets slapped together and rushed to release, riddled with bugs and quality issues, and don’t even get me started on usability issues.

Where’s the fire?

“I agree that the web is amoral. And I believe that is its virtue.”

Hear, hear.

This process will claim some of the “unsung protectors” as it were, some of whom deserve better. When, really, has it ever been different?

But in its amorality, this medium will also uncover that person who, BUT FOR the right credentials, might have contributed mightily with his or her ideas to the betterment of us all. It’s as close to color blind as society gets.

The point is not found in all the dreck. It’s finding the jewel in the trash.

A superb post by Nick!

‘Echolalia’ — there’s a lookup word. I used Dictionary.com.

Moral or otherwise, the Web 1 or 2 or whathaveyou is nice. I learn from it. I add to it. But I agree that it’s as dangerous, stupid, sad or coherent as one wants it to be.

Kelly is a fuzz-brain. Tim is not.

Open source and Wikipedia are not the same. And faith in open source does not mean shunning the professional for the amateur. How does Nick explain the high quality of the Apache Web Server and the robustness of PHP? Wikipedia is a mess, I agree, but that is due to its own model.

Let me talk about something else here. Skills and professions are not something that existed just because the Web (1.0, 2.0, or X.X) was not there, and they are not something that will cease to exist when the Web is ubiquitous. The human race found that there is a class of people who can do a set of jobs better than the rest, or who are chosen to specialize in a job. That is the reason why some people became carpenters, some soldiers, and some priests. If the job performed by specialists becomes too easy, that specialization goes away. The economics are exactly proportional to the need for specialization.

The Web does not make people expert content creators. This specialization will remain, and people will get paid for it. People will not go for a fairy-tale Wikipedia and will always go for Harry Potter.

Layoffs at the media houses may be explained as a shrinking need for average content creators. Quality will always be rewarded economically.

Shouvik, I didn’t mean to imply (and I’m sorry if I did) that open source software is of poor quality. (I’ve written often of the critical importance of open source to the future of IT.) I was simply saying that the veneration of open source efforts, often at the expense of traditional for-pay software development, is one manifestation of the cult of the amateur.

As for your claim that “quality will always be rewarded economically,” I think you’re being much too complacent.

Asay: ‘The Amorality of Web 2.0’ (Nick Carr)

It’s so hard to find intelligent contrarians these days. I experienced that firsthand today and yesterday at LinuxWorld UK, where you were cheered for saying inane but popular things like, “My dream is to bless the world with Linux desktops…

It is always interesting to read exceptionally well-crafted content — especially when you don’t totally agree (or disagree) with the point of view of the author.

I am on a team of graduate students at MIT that is looking at the “business side of Web 2.0.” As such, we are trying to determine if there really is a “there there.” The jury is still out… but at some level, it’s starting to feel like the 1990s again…

Everything you’ve written here is a valid opinion, and commercial encyclopedias are doomed anyway because (as Microsoft is finding out with Linux) it’s hard to compete with free. (I eagerly await EB putting out TCO studies on Wikipedia.)

Speaking as someone who’s highly involved in it (I write stuff, I’m an administrator, I’m on the Arbitration Committee, I’m a mailing list moderator, I do media interviews), Wikipedia is of mediocre quality with some really good bits. If you hit the “Random page” link twenty times, you’ll end up mostly with sketchy three-paragraph stub articles.

That said, the good bits are fantastic. Although articles good enough to make “Featured Article” status (which are indeed excellent) tend to be hideously esoteric; somehow getting more general articles up to that sort of quality is not facilitated at present.

Encyclopedia Britannica is an amazing work. It’s of consistent high quality, it’s one of the great books in the English language and it’s doomed. Brilliant but pricey has difficulty competing economically with free and apparently adequate (see http://en.wikipedia.org/wiki/Worse_is_better – this story plays out over and over again in the computing field and is the essence of “disruptive technology”). They could release the entire EB under an open content license, but they have shareholders who might want a word about that.

So if we want a good encyclopedia in ten years, it’s going to have to be a good Wikipedia. So those who care about getting a good encyclopedia are going to have to work out how to make Wikipedia better, or there won’t be anything.

I’ve made some efforts in this direction – pushing toward a page-rating feature, a “Rate this page” tab at the top, which, unlike an editorial committee, will actually scale with the contributor base and will highlight areas in need of attention. (See http://meta.wikimedia.org/wiki/Article_validation_feature and http://meta.wikimedia.org/wiki/En_validation_topics – the feature is currently waiting on an implementation the lead developer thinks won’t kill the database.) Recent discussion on the WikiEN-L mailing list has also included proposals for a scalable article rating system.

Wikipedia is likely to be it by first-mover advantage and network effect. Think about what you can do to ensure there is a good encyclopedia in ten years.
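The aggregation behind the "Rate this page" idea the commenter describes can be sketched very simply; the 1-5 scale, the thresholds, and the article titles below are illustrative assumptions, not the feature Wikipedia eventually shipped.

```python
# Minimal sketch of aggregating per-page ratings and flagging pages that
# need attention. Ratings are assumed to be 1-5 stars; the threshold and
# the minimum-vote cutoff are arbitrary illustrative choices.

from collections import defaultdict
from statistics import mean

ratings = defaultdict(list)   # page title -> list of 1-5 star ratings

def rate(page, stars):
    if not 1 <= stars <= 5:
        raise ValueError("rating must be between 1 and 5")
    ratings[page].append(stars)

def needs_attention(threshold=3.0, min_votes=3):
    """Return (average, page) pairs for low-rated pages with enough votes."""
    return sorted(
        (mean(votes), page)
        for page, votes in ratings.items()
        if len(votes) >= min_votes and mean(votes) < threshold
    )

if __name__ == "__main__":
    for stars in (2, 3, 2, 2):
        rate("Bill Gates", stars)
    for stars in (5, 4, 5):
        rate("Apache HTTP Server", stars)
    print(needs_attention())   # flags the low-rated page: [(2.25, 'Bill Gates')]
```

Because the flagging is computed from reader votes rather than by an editorial committee, the worklist grows with the contributor base, which is the scaling property the comment above argues for.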

This is a very interesting piece. I think that your main point — that the Internet is not what it has been hailed or condemned as — is very true.

I only wonder where the Web is headed. Perhaps it can only be as perfect as its creators.

When you subscribe to any blog you know what you are getting into. Generally blogs are opinionated and sprinkled with facts here and there. You do your own DD before coming to any conclusion. Only when professionals start misrepresenting information does it become unethical. The case in point is the Fox News article on OpenDocument file formats, where it raps a Massachusetts official. During the initial release of the article it conveniently failed to disclose that it was sponsored by Microsoft. Cases like this lead people to cheer for open source at the expense of proprietary systems. I guess people prefer being amoral to being unethical.

The problem I have with your examples on Wikipedia is that you haven’t picked the best examples. If you review the featured articles, would you reach the same conclusion?

I chose the two entries I used (Gates and Fonda) at random, and they were the first two I looked at – I didn’t try to find the worst examples possible, in other words; I simply took the first two I went to. (I wanted to choose subjects that most people would have some familiarity with.) I have looked at a lot of other entries previously and since, and many are every bit as bad as the two I featured. You’re right, though, that there are very good entries, and I suppose I could have searched for a couple of those and featured them. But an encyclopedia can’t just have a small percentage of good entries and be considered a success. I would argue, in fact, that the overall quality of an encyclopedia is best judged by its weakest entries rather than its best. What’s the worth of an unreliable reference work?

Do Ants Have Souls?

Nick Carr writes a brilliant piece on people’s quasi-religious fervor over Web 2.0. He quotes from Kevin Kelly’s We Are The Web (this is from the last page of the article):There is only one time in the history of each

“I would argue, in fact, that the overall quality of an encyclopedia is best judged by its weakest entries rather than its best. What’s the worth of an unreliable reference work?”

Those are really two separate things. Given that Wikipedia lets you see inside the sausage factory, judge it by the results of “Random link” and articles about things you do know (as you did). On the second point, that too many articles are unreferenced is a problem we’re at work on – that’s actually a different problem than quality of writing and coverage, IMO.

It’s strange… you write a blog to criticize Web 2.0, when blogs – a Web 2.0 application – are the reason you are heard and read over the Internet!

Do computers make us smarter?

With all the talk lately about the new net revolution, Web 2.0, and all of that (e.g., point and counter-point), it is interesting to throw some actual research into the mix. Lowell Monke’s recent article in Orion Magazine does just that.

[R]e…

The Cult of the Amateur – Really?

Nicholas Carr has a post on the amateur nature of the collective consciousness of the Internet. It’s ironic. He points to the slipshod quality of much of Wikipedia to highlight the flaws of the blogosphere, including echolalia and a tendency to reinforce…

“In theory, Wikipedia is a beautiful thing.”

In theory? I thought that in theory it wouldn’t work at all…

I don’t know about this theory that Wikipedia is a heap of junk just because it compares poorly with the EB.

I mean, does anyone *have* a copy of the EB from its fifth year? Who says that was better than Wikipedia?

I mean, aren’t we comparing a very new and very ambitious encyclopedia with a two-hundred-year-old encyclopedia, and expecting *rather* too much? As in, what gives?

When Kevin Kelly turned up again, I knew that the resurgence of tech was real. Enough real value was being created to allow hucksters and frauds to make a living again.

The question is, are we empowering the esoteric at the expense of the authoritative? Maybe the answer to that is yes, so far. Maybe that dream of authority was always a myth anyway. The Britannica was a status purchase for the middle class, unread and displayed on the shelves. Why should it be our yardstick?

The web is a dreadfully imperfect tool. But let’s ask whether our technology serves human needs better, not whether it matches a previous dream of human knowledge.

Nice piece! I don’t disagree with most of it. Except for the very last paragraph, which is really the only paragraph dealing with the point of your story. That the web, machines, and maybe even technology are not moral.

I’ve changed my mind on this. I used to think technology was neutral — just a tool — you could use it for good or evil. Pretty standard belief for us nerds. But in spending the last three years trying to figure out what the greater meaning of technology is I’ve reluctantly concluded that technology is a moral force (for the good). I’ll need a whole book to make that argument (if I can) and that is what I am working on.

But you have to agree it is an important and vital question. I hope you continue your investigation of it.

Does the EB even have entries for Bill Gates and Jane Fonda? Not that I think it should, but why are these metrics for comparing encyclopedic performance?

And I’d answer my own question, but the town’s library has been shut down and I don’t feel like spending $49.95 to find out. Seeing how the online edition only has 73,000 articles in it, I’m doubtful.

Nicholas Carr has an interesting piece on the Web 2.0 phenomenon, the vision of the web as a sort of collective consciousness that will fundamentally change human culture, and even the very concept of human intelligence.

This is a very interesting p…

I think everyone in the Wikipedia community is trying very hard to make the quality “good,” as you say, and Wikipedia certainly responds to input such as this. You might be happy to know that both articles you mentioned have since been added to cleanup projects, in addition to broader discussions about ways to improve writing quality.

At this point, it seems fair to ask exactly when the intelligence in “collective intelligence” will begin to manifest itself. Or is “good” an old-fashioned concept that doesn’t apply to emergent phenomena like communal on-line encyclopedias?

I certainly am not of the opinion that Wikipedia is some transcendent work beyond description as good or bad, but I think this point might be looked at more closely. A work of whatever size that is edited and written over time by a collection of people who, in all probability, have varying masteries of English will inevitably read as bad writing. It takes another person to come in and combine all the probably factually correct information into sentence structures that are pleasing to read; wikis sometimes call this refactoring. It is a difficult and time-consuming process, as you can imagine, but one Wikipedia is trying to make more appealing for editors.


Web 2.0 Essay


A number of technology geeks have given various explanations of what Web 2.0 is, and the exact definition is still a topic of debate. However, I believe that Web 2.0 refers to web-based applications that can be accessed from anywhere and that let users contribute content to the Internet rather than just view it. Is Web 2.0 a boon or a bane to society? Opinions differ. In a recent study, researchers observed that young people use this technology more extensively than older people do; because young people regularly use social networking sites, which are built on Web 2.0 technology, they become heavy users. When surveyed about their children's use of the Internet and other social media, most parents reported that their children constantly use social networking sites, mainly Facebook, MySpace, and Twitter, to keep in touch with others and to share pictures and videos of events. Parents also said that children these days are trying to develop their e-skills by making use of such technologies. Web 2.0 not only serves as a source of communication and entertainment but also as a good platform for networking, claim many businesspersons. Since sites like LinkedIn and YouTube are visited by huge numbers of people every day, social media marketing has become an easy and very effective means of business marketing. I agree with all of these points and support Web 2.0 because it is very easy to use. At the same time, I also believe that this technology has eroded people's privacy to a certain extent. It allows a lot of information to be posted online, which often creates a negative impact. Extensive use of social networking sites has made children more prone to bullying, harassment, and abuse. Taking advantage of easy worldwide access, people have started exploiting its features for wrongdoing such as hacking accounts, spamming, creating fake IDs, posting comments against competitors and rivals, and committing forgeries.

SOCIAL MEDIA CAUSES EATING DISORDERS

A widely accepted opinion prevails among researchers who study the relationship between media and eating disorders: thinness-depicting and thinness-promoting (TDP) media have a huge impact on teenagers. Young men and women develop eating disorders when they are exposed to TDP media more frequently. More women start dieting when they read more TDP magazines than when they watch such shows, whereas men start dieting and exercising when they watch TDP shows; they exercise and diet both for themselves and for women. One study suggests that Facebook triggers eating disorders. Teenage girls who spend hours on Facebook flicking through photos and albums are more likely to develop body-image problems that lead to eating disorders. Doctors say that posting selfies on Facebook and the constant sight of many photos and albums can make girls think they are fat and thus develop eating disorders. Researchers Evelyn Meier and James Gray reported: ‘It is not the total time spent on the internet or Facebook, but the amount of Facebook time allocated to photo activity that is associated with greater thin ideal internalization, self-objectification, weight dissatisfaction, and drive for thinness.’ The problem, the researchers said, is that for almost all teenagers Facebook has replaced the traditional way of meeting and talking to each other. Another trend that TDP media have created among young women is the so-called ‘thigh gap’, which has caused a vast number of eating disorder cases. Experts blame the media for planting the idea of diet plans with dangerous weight-loss goals in the minds of young girls and women, who then diet until they are so slender that their thighs do not touch when they stand. This trend set in from the time magazines, social media, and TV started promoting underweight models.

ADAM MORDECAI’S ANALYSIS OF INCOME INEQUALITY IN THE USA

“9 Out Of 10 Americans Are Completely Wrong About This Mind-Blowing Fact” is a video created by Adam Mordecai and based on research by Harvard professor Michael I. Norton, who found that 9 out of 10 people taking his survey thought that wealth is distributed far more evenly in the United States than it actually is. He conducted the study by dividing the population of the United States into five groups of 20% each: the top 20%, the bottom 20%, and a middle class subdivided into second, third, and fourth quintiles. According to Mordecai, 92% of respondents, i.e., roughly 9 out of 10, believed that the wealthiest people would be 10-20 times better off than the poorest Americans, that the middle class would be booming and healthy, transitioning smoothly into wealth, and that the poverty line would be off the charts. I used to think that the poorest 20-30% were beginning to suffer, that the middle class was suffering slightly, and that the wealthiest were making 100 times what the poor make and 10 times what middle-class people make. This was the assumption of the survey participants as well. However, Mordecai says that the reality is not even close to what people think. As seen in the chart, the poorest Americans do not even register; they have hardly any pocket change. The middle class is virtually indistinguishable from the poor, and even the top 10-20% is struggling. Only those at the very top are doing considerably better, and the top 2-5% is literally off the chart. The top 1% own ten times more than what the chart can show, which means nearly 40% of America's wealth is owned by the top 1%, while the bottom 80% owns only 7% of the country's wealth. The top 1% of earners take home a quarter of the nation's income and own 50% of the country's stocks, while the bottom 50% owns only 0.5% of the stocks; the bottom 50% is not investing but just scraping by. Adam Mordecai concludes with the fact that a CEO earns almost 384 times as much as an average employee of the company. My personal inference from the video was that an average employee must work for more than a month to make what the CEO makes in an hour. This kind of wealth distribution is very risky for the United States. Although I should have better insight into income inequality, I cannot help thinking that this portrayal of it shows how broken the nation is and how ironically different the reality is from what we citizens assume it to be.

Innes, Emma. “Could Facebook Trigger Eating Disorders? Teenage Girls Who Spend Hours Looking at Posted Photos ‘Develop Poor Body Image’.” Daily Mail, 4 Dec. 2013. 23 Jan. 2014 <http://www.dailymail.co.uk/health/article-2518062/Could-Facebook-trigger-eating-disorders-Teenage-girls-spend-hours-looking-posted-photos-develop-poor-body-image.html>.

Mordecai, Adam. “9 Out Of 10 Americans Are Completely Wrong About This Mind-Blowing Fact.” Upworthy, n.d. 23 Jan. 2014 <http://www.upworthy.com/9-out-of-10-americans-are-completely-wrong-about-this-mind-blowing-fact-2>.

Rudd, Peter, and Matthew Walker. Children and Young People's Views on Web 2.0 Technologies. Slough: National Foundation for Educational Research, 2010.


Web 2.0 Platform: Facebook Essay


The Internet has made the world a small place because it provides a platform for people to interact and do business. Many websites have been developed to interconnect people using Web 2.0 technology, including Facebook, Twitter, and MySpace, among several others.

Facebook is the most popular social networking site, as evidenced by the number of users, which continues to increase every day. On Facebook one can find friends, jobs, and many other things. This paper therefore focuses on how Facebook has influenced people's lives.

Almost everyone is talking about Facebook because on this site one can trace childhood friends and make new ones. Popoola (2011) argues that the main reason many people prefer this social medium for communication is that it is cheap and convenient.

Unlike mobile phones, which require airtime credit that is perceived to be very costly, all one needs is a computer connected to the Internet or an Internet-enabled mobile phone.

The popularity of this social network has led many people to seek computer literacy because everyone, whether old or young, wants to move as the world moves.

Anderson (2007) explains that before the advent of social networking sites such as Facebook, people communicated through mobile phones and letters, which were expensive and unreliable. Facebook provides information about people by displaying individual details of every user, which is useful for security purposes.

When on Facebook, one befriends the people he/she likes by looking at their profile information, but then the person that is being requested for friendship has to confirm that he/she is known to the person making the request.

Communication is made possible on Facebook through an electronic mailing service that allows users to send and receive messages. Alternatively, users can chat in real-time through a chat utility integrated in Facebook.

According to Bronk (2008), Facebook has helped people retain their friends, unlike before, when many friendships ended as people went their different ways after going through the education system.

On Facebook, people are not required to meet physically because they can interact virtually, just as they would if they were in close vicinity. Friendships are based on interests, and this has led to the rise of cross-cultural interactions. In such cases, interaction is not biased because users can befriend anyone regardless of race or gender.

Facebook is informative because users are able to update one another on what is happening in different parts of the world. Many users are using the platform to express themselves through the status updates that are posted regularly. For instance, activists have continued to use Facebook to express their grievances.

In countries like Egypt, the government disabled the Internet because the revolution against it was perceived to be coordinated through Facebook. By barring people from accessing the Internet, the Egyptian government tried to contain the revolution, but that was not enough, because the stage for protesting against the government had already been set.

Politicians too have changed the way they conduct their campaigns. Holding political rallies has many disadvantages: they can be met with a lot of hostility, and the cost of fueling campaign vehicles is extremely high (Fox, 2009).

Politicians have therefore embraced Facebook, staging their campaigns on the site because they are assured that their message will reach many people and will not be blocked in any way. Once a user has posted something on the wall, it cannot be manipulated, unlike posters, which can easily be defaced or torn to pieces.

This has not stopped politicians from holding rallies on the ground, because having alternative ways of reaching their target audience broadens their chances of success. In essence, Facebook is a useful tool, especially for political aspirants who do not have the money to travel around.

Businesspeople and companies also benefit from Facebook's capabilities. For anyone who wishes to market goods and services, Facebook is an ideal advertising platform because it is cheap and efficient, unlike other marketing methods.

Facebook has many users who reside in various parts of the world, and it is ideal for companies that wish to expand their businesses. Greenstein (2009) argues that the popularity of Facebook has led many companies to include a link to the site on their websites.

In addition, many companies advertise their jobs through Facebook, making it a useful tool for jobseekers and employers alike. Most companies use the site to conduct surveys about their products and services.

This is cost-effective because it would be very expensive to hire a third party to conduct the survey. Moreover, many of the awards issued nowadays are influenced by polls that take place on the social network.

Facebook has also become a factor in decision-making processes in most companies; many employees have been sacked because of comments they posted on Facebook. Hawkins (n.d.) states that employers use social networking sites to track what their employees are doing, and in that way they are able to judge the personality of their employees.

Some employers argue that Facebook has contributed to their company's decline in performance, explaining that their employees sneak onto the social network and leave their duties unattended.

Moreover, Facebook is identified as one element that fosters unity in today’s world. This is because people are able to communicate from different parts of the world without the barrier of their cultural differences.

Perhaps this is because Facebook allows people to communicate in their interactive languages, thus eliminating the problem of language barrier – the users are only present virtually and none of the communicating parties can identify any weaknesses in the other person, and judgments can only be made by the comments one posts on the wall.

Similarly, Facebook has enabled many people to find marriage partners. According to Lecky-Thompson (2009) there are so many people who are in successful marriages, owing to their interactions on Facebook. However, there are many marriages that have been ruined by this social site.

This is because many couples fight and argue about comments made by one of them, especially comments directed at the opposite sex. Another problem is that someone can use somebody else's picture and thus hide their identity.

A major drawback of Facebook is that it is difficult to understand someone's character while interacting with them on the social network. There are also criminals who use the site to trap unsuspecting users, later hold them hostage, and sometimes end up killing them.

For this reason, people are encouraged to meet strangers only in public places so as to get to know them better. Facebook also enables users to establish their own social circles, called groups, which are formed by people who share common interests, such as assisting each other in times of need (Lecky-Thompson, 2009).

Children are also affected by Facebook, and this has caused their performance in school to decline. Many teachers complain that learners have become so obsessed with Facebook that they finish their homework in a rush, just to please the teacher.

Children now spend most of their time chatting with friends on the social network, leaving little time to play constructive games like soccer, which play a major role in making them creative and in helping them discover their talents early enough.

However, to others, Facebook is their source of consolation because when they are annoyed, they pour it out on the network. There are many people who pass their time by just chatting or playing games on the site for the sake of avoiding boredom.

Siegler (2009) argues that the emergence of social networking sites such as Facebook is bringing the world together, sealing the gap left by modernization, in which everyone had to go his or her own way in search of a livelihood.

Today everyone belongs to a social network, because there are many of them and one can join and leave at one's own pleasure. It is surprising to find that everyone is so busy that even parents do not have time for their children; some use the social network to interact with their children while they are away from home.

This is much better than having no time at all. Likewise, smart teachers make good use of Facebook to communicate with their learners about lessons that require detailed explanations. Some even go a step further and post assignments on the site.

Sometimes professionals use Facebook to share information, which makes their work easier because the more knowledgeable they are, the faster they can solve issues. Since most people are obsessed with Facebook, however, they do not work on their face-to-face interpersonal skills; Facebook enables them to express themselves without feeling ashamed, which is beneficial to poor communicators.

Currently, Facebook is the most popular social networking site using Web 2.0 technology, and it is useful for creating awareness. The site has also frequently served as evidence of crime, with some users found on the wrong side of the law for, say, displaying explicit pictures. In some countries the site has been used to spread hate speech among ethnic communities.

In such cases, the government of the country concerned has had to monitor the statements posted on the site so that situations leading to civil unrest can be avoided.

Similarly, there has been a problem of hackers who manipulate users' profiles and pose as those individuals. However, Facebook management has put safety measures in place so that the details that can be accessed are limited according to each user's specifications.

Anderson, T. (2007). Web 2.0 and New Media Definitions. NewCommBiz.com. Web.

Bronk, C. (2008). Convergence and Connectivity: 1 of 2. YouTube. Web.

Fox, P. (2009). Friends and Neighbors. Guardian.co.uk. Web.

Greenstein, H. (2009). Facebook Pages vs Facebook Groups: What's the Difference? Web.

Hawkins, K. (n.d.). What is a Social Networking Site? Web.

Lecky-Thompson, G. (2009). Facebook: Good or Bad for Communication. Web.

Popoola, J. (2011). What are the Effects of Social Networking Websites? Ezine Articles. Web.

Siegler, M. (2009). Location is the Missing Link between Social Networks and the Real World. Web.





Open access | Published: 02 May 2024

Web 2.0 technologies and translator training: assessing trainees’ use of instant messaging as a collaborative tool in accomplishing translation tasks

Kizito Tekwa (ORCID: orcid.org/0000-0003-2886-9106), Wenchao Su (ORCID: orcid.org/0000-0002-9851-9753) & Defeng Li

Humanities and Social Sciences Communications, volume 11, Article number: 555 (2024)


Subjects: Language and linguistics

Web 2.0 technologies have had a significant impact on collaborative communication practices in teaching, learning, and professional work environments. In translation studies, computer-supported collaborative translation tasks have mainly been discussed within the project-based learning framework, where research has foregrounded correlations between collaboration and performance. However, trainees’ specific uses of collaborative tools, including transcripts of real-time exchanges, have neither been sufficiently investigated nor informed pedagogical strategies and approaches in any tangible way. This study bridges this gap by evaluating trainees’ collaborative practices while they translated a text, localized a restaurant menu, and simulated the design and launch of a language service agency. Data was gathered from a questionnaire, in-class presentations, and real-time instant messaging (IM) transcripts. Data analysis of the real-time IM exchanges unveiled considerable trainee communicative practices during collaborative tasks. Furthermore, correlations were established between the volume of instant messages, time of exchange, role played by trainees, and conversation themes with the teams’ final assessment performances. This study provided valuable insights into the effectiveness of IM as a collaborative tool in training environments. It also informed our suggested guidelines for properly integrating IM into the translator training curriculum.


Introduction

Web 2.0 technology use, including social media and instant messaging (IM) by translation professionals, has been central in the discourse of the past decade (Desjardins, 2011 ; García, 2010 ; Pym, 2011 ). The use of IM is predominantly a result of its ubiquity, portability, availability, locatability, and multimodality (Schrock, 2015 ) following rapid advances in technological development. Consequently, modern professional workplaces have become inherently collaborative, turning IM into a core communication tool between translators and their peers, project managers, clients, and subject experts (Kerremans et al., 2019 ).

Undoubtedly, the technology-driven professional environment has a bearing on translator training, with an increasing number of calls, some backed by empirical research, to align training with the realities of the industry (Su and Li, 2023 ). Monti ( 2012 ), for instance, has recommended the integration of Web 2.0 in translator training, arguing that trainers should “take into account these emerging trends and should adopt cooperative teaching models based on the new translation technologies, which allow the simulation of real work contexts” (p. 797). In her appeal to integrate Facebook into translator training, Desjardins ( 2011 ) has argued that “social networking sites can be incorporated [into translator training] in ways that foster complex learning provided students are taught to use them judiciously” (p. 187). Therefore, to heed these calls, trainers have adopted various training models, including project-based learning (PjBL), in order to emphasize autonomy, engage learners as they resolve real-life problems, develop trainees’ critical thinking skills, and foreground collaboration.

Regarding the application of PjBL to translator training, researchers (Kerremans et al., 2019 ; Tekwa, 2020 ) have profoundly explored the contribution of IM to completing collaborative tasks. However, while there appears to be a consensus on the cruciality of information exchange during collaborative translation tasks, thorough investigations have not been undertaken to explain how such interactions occur, what role interlocutors play, how IM practices are related to translation performance, and how IM use could be optimized via moderation and guidance. In addition, the practical use of IM in collaborative translation tasks has not been sufficiently observed to lay solid groundwork for its adequate integration in collaborative translation tasks within the PjBL framework.

Therefore, this study evaluates the practical employment of real-time IM by 68 trainee translators (nine teams) working collaboratively on translation, localization, and language service provision (LSP) tasks. Real-time IM exchanges provided voluntarily by trainees, their in-class presentations, and responses to a questionnaire were gathered and analyzed to explore how trainees worked together to perform the assigned tasks. In particular, the analysis focused on trainees’ IM practices, including how often, on which days, and at what times they exchanged IM; their preferred (synchronous or asynchronous) IM form; how they resolved task-related problems; and whether they assumed or were assigned roles. Finally, the data were analyzed to determine whether trainees’ IM practices correlated with their final assessment of the three tasks. Based on the study’s findings, guidelines were suggested for integrating IM as a collaborative tool in PjBL.

Project-based learning and communication challenges in collaborative tasks

Project-based learning (PjBL), or “the translation task and project-based approach” (Hurtado Albir, 2015, p. 256), encourages students to develop problem-solving, critical thinking, and collaborative skills, a claim supported by empirical evidence from translation classrooms (e.g., Li et al., 2015). We therefore consider project-based learning, especially group-oriented project-based learning, a valuable framework for designing and implementing the collaborative tasks in the present study of translator training. One of the foundational tenets of PjBL is collaborative or group work, where students “work together, share their findings, and decide how to best represent their new knowledge” (Li et al., 2015, p. 3). For group-oriented PjBL to succeed, members therefore need at least basic communication skills; for PjBL to achieve its expected outcomes, learners must be able to communicate with each other.

Due to technological advances, much of the communication in today’s PjBL tasks is computer-mediated. That means information is shared via various media, including IM and other chat-based platforms, web conferencing, video conferencing, traditional text messages, and other digital communication networks (Zafirov, 2013 ). Even though it has been associated with higher performance, computer-mediated communication poses non-negligible challenges within the group collaboration framework. For instance, Moghaddas and Khoshsaligheh ( 2019 ) found that trainees are easily distracted during conversations, while García González and Veiga Díaz ( 2015 ) concluded that collaborative exchanges are time-consuming and members lack teamwork experience. Similarly, Apandi and Afiah ( 2019 ) raised concerns regarding the prior readiness of trainees and trainers, the difficulty of introverted trainees adapting, the challenges of accessing physical locations to observe teams at work, and the translation of culture-bound words. Furthermore, researchers have investigated the IM communication habits of teams engaged in PjBL-based tasks, concluding that high-performing teams start early, are consistent in their exchanges, develop camaraderie, are better organized, and engage in deep, rich, and thought-provoking conversations. In contrast, low-performing teams start slowly, are inconsistent and erratic, are poorly organized, and engage in shallow exchanges (Thomas and MacGregor, 2005 ).

Meanwhile, in terms of PjBL methodologies adopted, researchers have tended to rely on questionnaires, observations, presentations, or periodic reports. For instance, Pitkäsalo and Ketola (2018) based their analysis of trainee collaborative practices and exchanges on periodically submitted group reports, while Apandi and Afiah (2019), García González and Veiga Díaz (2015), Moghaddas and Khoshsaligheh (2019), Li et al. (2015), Prieto-Velasco and Fuentes-Luque (2016), and Martins and Ferreira (2019) adopted either a single approach or a combination of approaches, including instructor observations, reports, presentations, and semi-structured interviews, in their assessment of how trainees collaborated during translation tasks. Though these approaches yielded insightful conclusions, they show that the raw data of trainees’ IM exchanges have not been adequately explored to understand learners’ communicative practices during IM group tasks. In our opinion, this represents a significant research gap worth filling, especially given that within the industry there have been calls for increased collaboration among professionals, including via current mobile messaging platforms (Désilets and Van Der Meer, 2011; Gough, 2011).

Therefore, conversant with some of the IM challenges outlined above, we designed this study to focus on real-time IMs shared by groups performing collaborative tasks within the PjBL framework. In other words, we analyzed trainees’ conversation transcripts and questionnaire responses to answer two fundamental questions:

How did trainees use real-time IM as a collaborative tool to accomplish translation tasks?

To what extent did their IM practices correlate with their performance?

The two fundamental research questions comprise sub-areas, as outlined in Fig. 1 below.

Figure 1: The two main research questions and sub-areas of investigation.

Methodology

This section discusses the participants, study design (i.e., task objectives, pre-task, task execution, reporting, and assessment), and data collection and analysis. The data collection section equally underscores the mixed methods used by describing the quantitative and qualitative data collection, instruments, and analyses.

Participants

In total, 68 trainees (female: n = 58; male: n = 10) participated in the study. They were first-year students in the professional Master of Translation and Interpreting (MTI) program at a Chinese university. They were enrolled in the mandatory Language Services and Project Management (LS&PM) course designed to provide students with broad practical and theoretical knowledge of the translation industry and guide them toward making potential career choices within the expanding profession. The course introduces trainees to translation (e.g., computer-assisted translation (CAT), machine translation (MT), machine translation postediting (MTPE), and translation techniques), localization (e.g., websites, menus, tools, and resources), and project management (e.g., processes, deadlines, industry practices, tools, and resources). The LS&PM course runs in the fall semester of each academic year and lasts 16 weeks, from early September to early January. The trainees formed nine teams, each comprising eight or nine members who voluntarily agreed to work together.

Task design

The teams had to complete three tasks: translate a text, localize a menu, and design and launch an LSP agency. Regarding technology use, the translation and localization tasks had to be accomplished on the Smartcat and POEditor CAT tools. In contrast, the LSP task used a WeChat mini-program designed and launched on the social media platform (Lin et al., 2020 ) that allows companies and individuals to develop, launch, and disseminate applications, products, services, news, and business information in multiple formats.

A class WeChat group was created to facilitate communication. Then, each team created a separate WeChat group to facilitate exchanges while accomplishing the tasks. Therefore, in addition to face-to-face meetings, IM was the primary means of communication. WeChat was chosen as the IM platform because it is a learning tool (Hou et al., 2021 ; Shi et al., 2017 ; Xue et al., 2021 ) and, arguably, China’s most popular messaging application, meaning trainees used it daily for multiple reasons, including exchanging messages, voicemails, calls, photos, documents, and pictures. Though the teams met offline whenever necessary, it is noteworthy that solely their raw IM exchanges were analyzed within the context of this study.

The tasks had different objectives, as illustrated in Table 1 .

The translation task required trainees to collaborate in translating (from Chinese to English) a text of approximately 2500 characters using the Smartcat CAT tool. The teams had two weeks to accomplish the task, i.e., use the tool, discuss, and agree upon target text equivalents. They had to use the MT function of the tool and post-edit the output, thereby developing their technological, textual, terminological, and collaborative competencies and other skills (e.g., leadership, team player, and time management) necessary to succeed in today’s industry (Kiraly, 2005 ).

Concerning the localization task, the trainees were requested to adapt the menu of a Chinese restaurant hypothetically setting up shop in New York City in a predominantly white, middle-income neighborhood where spicy food is unpopular and half of the population is vegan. Therefore, the task required them to think critically and adapt the menu by adding, removing, or modifying ingredients, improving the design, describing dishes, and employing persuasive language. Each task lasted two weeks, during which members used various technologies and exchanged IM as they worked together.

Furthermore, regarding designing and launching an LSP, the trainees were expected to either use any current LSP as their model or brainstorm creative and innovative ideas to meet current and future clients’ multiple needs. The tasks required collaborative work, critical thinking, and information mining on websites, applications, and other mini-programs. The teams also had to design the mini-program and employ marketing language (emotional, functional, descriptive, formatting, and positioning). Each project had to be accomplished within a specific time frame, informed by a workflow (Hurtado Albir, 2015 ; Kiraly, 2005 ), which included the pre-task, task execution, reporting, revision, and assessment stages.

Before the trainees performed the assigned tasks, they had practical in-class training that lasted for three weeks. Specifically, they received theory-based lectures on localization, translation, project management, and language services. In addition, they had hands-on practice using technology tools, simulated short-duration group tasks and took a short quiz to ensure they were familiar with the tools and processes. The pre-task also allowed them to work in teams and practice exchanging IM, discussing challenges, and building trust and confidence. During the pre-task, team members sent and received instant messages via the class WeChat group, which included the instructor. This was a crucial opportunity for the instructor to answer questions, guide the exchanges, and ensure that instructions for the tasks were precise, concise, understood, and followed.

Task execution

The tasks, instructions, due dates, and tools to use were transmitted to the teams via the class WeChat group. Thereafter, the instructor mainly functioned as a guide, while the trainees worked collaboratively in teams, conversing face-to-face or exchanging IM in their team WeChat group. They could return to the class WeChat group to ask the instructor or other team members questions. Project-based learning mainly occurred at this phase, where trainees relied on their critical thinking skills while interacting with their team members to develop heuristics useful in real professional workplaces.

Each week, in class, the teams reported on their progress and the roles assigned to or taken up voluntarily by members. They also discussed the challenges encountered and measures taken to resolve them, their teamwork atmosphere, and their level of camaraderie. Most often, teams that had faced the same problems shared their perspectives and experiences given that the “ways and means of accomplishing … goals vary greatly from group to group, from project to project and from student to student” (Kiraly, 2005 , p. 1109). Meanwhile, the instructor offered advice and guidance and, most importantly, gathered data on team members’ roles.

The translation task was assessed based on rubrics and assessment methods suggested by Hurtado Albir ( 2015 ), albeit slightly modified to reflect specific trainee learning needs (see Supplementary Appendix 2 ). The instructor provided feedback after reviewing the teams’ final translation. The localization task was assessed based on how well the menu was translated and adapted (taste, item descriptions, rare ingredient descriptions, layout, measurements, price, color, and selling language) for local diners and how the teams justified terminological and design-based choices during their presentations (see Supplementary Appendix 3 ). As for the LSP task, the trainees were assessed on the design, effectiveness of the business language, services offered, ability to track and quickly respond to client requests, and project management procedures (see Supplementary Appendix 4 ).

Data collection and analysis

Data were gathered through a questionnaire on trainees’ IM communication practices and experiences. Furthermore, in-class presentations, interviews with trainees, and raw IM exchanges voluntarily provided by the teams at the end of the semester were analyzed to obtain various findings.

Questionnaire

The questionnaire gathered data from individual trainees who worked collaboratively and communicated via IM, after they consented to the data being analyzed within the framework of this study. In particular, the questionnaire comprised eight questions focusing on the trainees’ perceptions of their team’s level of camaraderie, members’ IM exchange practices, problems encountered, problem-resolution strategies, and team members’ attitudes within the context of the problem-based collaborative tasks. The questionnaire comprised mainly Likert scale–type questions to facilitate data collection, given that it took participants an average of only 73.23 seconds to complete.

Real instant message exchanges and in-class presentation data

During the weekly in-class presentations, data were gathered on task organization and the roles assigned to or voluntarily taken up by team members. The nine teams each voluntarily provided their corpus of IM conversations at the semester’s end. The data comprised 78,138 Chinese characters and English words forwarded as Microsoft WordPad files to the instructor via email. There were 4544 IM turns, defined as “one line from one participant that ends at the point the participant presses ‘enter’ and sends the transmission to his or her interlocutor” (MacKiewicz and Lam, 2009 , p. 419).

The IM exchanges were downloaded and pre-treated for analysis. The conversations of each team were pre-treated individually and carefully verified by two assistants hired for this purpose. Specifically, IM turns that contained pertinent information were color-coded using different colors, as outlined in Table 2 below.

Synchronous IM exchanges were considered to be messages sent back and forth within intervals of less than 60 seconds. In other words, for an exchange to be considered synchronous, a reply had to occur within 60 seconds (MacKiewicz and Lam, 2009 ). Whenever a turn contained information belonging to two or more categories, the colors were aligned next to each other to facilitate tallying. For example, if trainees engaged in synchronous exchanges about a specific theme, such as scheduling, two colors (red and orange) were used to identify the theme and the IM exchange form, as illustrated in Fig. 2 below.

Figure 2: Use of colors to identify various instant messaging categories.

The six turns above discussed one theme, i.e., scheduling, identified using red. However, turns 1, 5, and 6 were asynchronous exchanges (they occurred at different times and were represented with the green color), while turns 2, 3, and 4 were synchronous exchanges (they all occurred at 7:26 pm) and were coded using orange. The IM exchange categories of each team were identically marked and then tallied to obtain means and standard deviations.
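To make the coding rule concrete, the following is a minimal Python sketch of the 60-second synchronous/asynchronous classification described above, applied to a chronologically ordered list of IM turns. The data structure and field names ("team", "timestamp") are hypothetical illustrations; the authors' actual pre-treatment was done manually with color codes, not with this script.

```python
from datetime import datetime

# A turn counts toward a synchronous exchange if it arrives within 60 seconds of the
# previous turn in the same team chat; otherwise it is tallied as asynchronous.

def classify_turns(turns):
    """turns: chronologically ordered list of {'team': str, 'timestamp': datetime}."""
    counts = {}
    for prev, curr in zip(turns, turns[1:]):
        if prev["team"] != curr["team"]:
            continue  # only compare consecutive turns within the same team chat
        gap = (curr["timestamp"] - prev["timestamp"]).total_seconds()
        kind = "synchronous" if gap < 60 else "asynchronous"
        counts.setdefault(curr["team"], {"synchronous": 0, "asynchronous": 0})[kind] += 1
    return counts

# Hypothetical example: two turns a few seconds apart, then one much later.
sample = [
    {"team": "Team 3", "timestamp": datetime(2023, 10, 16, 19, 26, 5)},
    {"team": "Team 3", "timestamp": datetime(2023, 10, 16, 19, 26, 40)},
    {"team": "Team 3", "timestamp": datetime(2023, 10, 16, 21, 10, 0)},
]
print(classify_turns(sample))  # {'Team 3': {'synchronous': 1, 'asynchronous': 1}}
```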

Data analysis

The questionnaire responses were downloaded from the free online survey website into an Excel spreadsheet and then uploaded to SPSS for further analysis. The questionnaire was internally consistent, with a Cronbach’s alpha of α = 0.89. Furthermore, SPSS was used to calculate means, standard deviations, Pearson’s correlation coefficients for several variables, and t-tests that determined statistically significant differences among the variables. Meanwhile, we conducted thematic analyses of trainees’ interviews to extract the textual information that underpins their IM exchanges.
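As a rough illustration only (the authors used SPSS, not Python), the same kinds of statistics can be reproduced with numpy and scipy. Every array below is a hypothetical placeholder standing in for per-respondent or per-team values, not the study's data.

```python
import numpy as np
from scipy import stats

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of k Likert items; rows = respondents, columns = items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Hypothetical 4-item Likert responses from four respondents.
likert = np.array([[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 3]], float)
print(f"alpha = {cronbach_alpha(likert):.2f}")

# Pearson correlation between per-team synchronous turn counts and final scores.
sync_turns = np.array([738, 250, 178, 195, 166, 131, 680, 620, 470])  # hypothetical
final_score = np.array([92, 74, 70, 72, 69, 66, 90, 88, 86])          # hypothetical
r, p = stats.pearsonr(sync_turns, final_score)
print(f"r = {r:.2f}, p = {p:.3f}")

# Paired t-test comparing two messaging forms (e.g., % text vs. % emoji per team).
text_pct = np.array([88, 90, 85, 79, 92, 76, 82, 95, 89], float)      # hypothetical
emoji_pct = np.array([5, 3, 6, 4, 4, 0, 6, 3, 11], float)             # hypothetical
t, p_t = stats.ttest_rel(text_pct, emoji_pct)
print(f"t = {t:.2f}, p = {p_t:.3f}")
```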

How trainees used real-time IM as a collaborative tool to accomplish translation tasks

To understand the extent to which trainees utilized IM to accomplish the assigned tasks, we investigated the frequency of exchanges, preferred communication time and day, preferred communication forms, the role played by members, themes discussed, the extent of IM multimodality, and communicative and technical challenges and how they were resolved.

The top-performing teams communicated better than the bottom-performing teams

Communication frequency

We found significant differences in the volume of IM exchanges, with the top-performing teams communicating significantly more frequently while working on the three tasks. The data of IM exchanges showed that Teams 1, 7, 8, and 9 exchanged 887, 848, 756, and 678 IM turns, respectively. The volume of IM turns exchanged by the four teams accounted for over half of the total IM exchanges. In contrast, the bottom-performing teams, namely Teams 2, 3, 4, 5, and 6, exchanged 357, 297, 279, 255, and 187 IM turns, respectively, as presented in Fig. 3 below.

Figure 3: Volume of IM exchanges per team of participants.

Preferred communication day and times

We then examined whether the teams preferred to exchange IM on specific days and time slots. One reason was to understand how close to or far away from the day of the lesson and how late into the night teams discussed. We assumed that such knowledge might be instrumental in designing future collaborative tasks and guiding IM exchanges within translation tasks and PjBL.

Our data indicated that participants exchanged the most instant messages (20%) on Monday, the day of the lesson, and Tuesday (17%), the day following the lesson, as presented in Fig. 4 below.

Figure 4: Daily number of instant messages exchanged by participants.

In contrast, teams exchanged the least IM on Friday (four days after the lesson). However, as the lesson day approached, the volume of exchanges gradually increased on Saturday and Sunday. Upon further analysis, we found that messages exchanged prior to, on, or the day after the class often centered around the core themes, i.e., were devoid of non-task-related themes and issues. For instance, Conversation 1 in Fig. 5 is an excerpt of a task-focused IM conversation by Team 8 members on the day before the lesson.

Figure 5: Conversation 1.

In terms of the communication times, we divided the day into multiple three-hour time slots (6 am–8:59 am / 9 am–11:59 am / 12:00 pm–2:59 pm / 3:00 pm–5:59 pm / 6:00 pm–8:59 pm / 9:00 pm–11:59 pm) and a six-hour time slot (12:00 am–5:59 am). Then, we tallied the number of IM turns exchanged within each time slot. We found that the least popular time slots were early in the morning, between 6:00 am and 8:59 am, and after midnight, between 12:00 am and 5:59 am. The data also showed that Teams 9, 5, and 3 preferred communicating between 9:00 pm and 11:59 pm, relatively late at night.
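A minimal sketch of how such day and time-slot tallies can be produced from time-stamped turns, assuming each turn carries a datetime; the slot boundaries mirror those listed above, and the sample timestamps are hypothetical rather than drawn from the study corpus.

```python
from collections import Counter
from datetime import datetime

# Time-slot boundaries as (start_hour, end_hour, label), matching the slots in the text.
SLOTS = [(0, 6, "12:00 am–5:59 am"), (6, 9, "6 am–8:59 am"), (9, 12, "9 am–11:59 am"),
         (12, 15, "12:00 pm–2:59 pm"), (15, 18, "3:00 pm–5:59 pm"),
         (18, 21, "6:00 pm–8:59 pm"), (21, 24, "9:00 pm–11:59 pm")]

def slot_label(ts: datetime) -> str:
    return next(label for lo, hi, label in SLOTS if lo <= ts.hour < hi)

# Hypothetical turn timestamps (one Sunday evening, one Monday morning, one Monday night).
timestamps = [datetime(2023, 10, 15, 21, 30), datetime(2023, 10, 16, 10, 5),
              datetime(2023, 10, 16, 22, 45)]
by_day = Counter(ts.strftime("%A") for ts in timestamps)
by_slot = Counter(slot_label(ts) for ts in timestamps)
print(by_day)   # Counter({'Monday': 2, 'Sunday': 1})
print(by_slot)  # Counter({'9:00 pm–11:59 pm': 2, '9 am–11:59 am': 1})
```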

Preferred IM type

The study intended to understand whether participants preferred synchronous or asynchronous conversations and whether the preferred conversation mode had a bearing on their performance. Our data analysis showed that participants engaged more in synchronous (3228 turns, 71%) than asynchronous (1317 turns, 29%) conversations (Fig. 6 ).

Figure 6: Percentage of synchronous and asynchronous instant messaging exchanges per team.

In terms of conversation type per team, we found that Team 1 sent the most synchronous IM turns (83.2%) and the fewest asynchronous (16.8%) IM turns (Fig. 6 ). In contrast, Team 3 exchanged the fewest (60%) synchronous and the most (40%) asynchronous IM turns. We also found that overall, teams exchanged 60% or more synchronous IM turns, making this form of communication the most preferred. Our data further showed that during synchronous IM, participants brainstormed ideas, seldom derailing from the topic. For instance, in Conversation 2 (Fig. 7 ) below, members of Team 3 discussed their menu layout (localization task) synchronously, sticking to the topic of conversation.

Figure 7: Conversation 2.

In terms of correlation with the final assessment, there was a high positive correlation between synchronous exchanges and the final assessment [r(9) = 0.74, p = 0.022, non-significant] and a high negative correlation between asynchronous exchanges and the final assessment [r(9) = -0.88, p < 0.002, significant]. The data corroborated previous findings, indicating that synchronous communication encourages brainstorming and fosters more productive exchanges than asynchronous communication (Thomas and MacGregor, 2005).

Roles played by members

The data gathered during the weekly team in-class presentations revealed that teams employed two methods to fill seven fundamental roles. According to the data, roles were either assigned to members permanently or on an ad hoc basis or were voluntarily taken up by members (Fig. 8 ).

Figure 8: Structure of teams showing assigned/volunteered roles.

In particular, Teams 1, 7, 8, and 9 (top-performing teams) identified four crucial roles (i.e., team lead, terminology manager, timekeeper, and secretary), which were filled permanently regardless of the task they performed (Table 3 ). While the team lead oversaw the entire project, the terminology manager documented, stored, and updated term records. Meanwhile, the timekeeper reminded members about deadlines, and the secretary recorded the challenges encountered and a list of questions to discuss in class or ask the trainer.

In contrast, Teams 2, 3, 4, 5, and 6 (low-performing teams) had only one permanent role, team lead, which they filled consistently (Table 3 ). The other roles were assigned to members or were voluntarily filled on an ad hoc basis. For instance, the team lead of Team 6 assigned a “competent” member to manage terminology while other members translated the document.

The data also showed that most roles were assigned during the LSP design and launch task, which required a content writer/creator, data miner, page designer, and members with practical knowledge of WeChat mini-programs. Furthermore, regarding team organization and role play, we found discrepancies between high- and low-performing teams that reflected their performance in the final assessment score.

Conversational themes

Our data analysis showed that participants discussed multiple themes, some of which were not directly related to the tasks assigned. While task-related themes included localization, translation, LSP, scheduling, and planning, non-task-related themes included gossip, personal issues, and entertainment (e.g., meals, games, concerts, and music). It was also found that most IM exchanges (33%) occurred when teams discussed the LSP design and launch task. Furthermore, trainees also exchanged more IMs when performing the localization task (31%) than the translation task (26%).

In terms of non-task-related IM exchanges, we found that team members discussed personal issues (1%), entertainment (1%), gossip (4%), and scheduling-related issues (4%). However, we found significant discrepancies in how the teams discussed themes unrelated to the assigned task (Table 4 ). For instance, 14% of the IM exchanges of Team 4, one of the low-performing teams, were gossip, while 13% of the IM exchanges of Team 2, another low-performing team, were entertainment.

Furthermore, we analyzed the transcripts of the teams’ IM exchanges and found that gossip played a significant role. For instance, 7% of the instant messages exchanged by members of Teams 9 and 3 were gossip. However, while Team 9, a top-performing team, used gossip to transition to other themes, Team 3 appeared to gossip for gossip’s sake. For their part, Team 8 and Team 9 extensively discussed scheduling issues that accounted for 10% and 15% of the total volume of IMs exchanged. While the themes of gossip and entertainment suggested a degree of camaraderie among team members, as indicated in the survey findings, the lengthy exchanges on scheduling appeared to corroborate the questionnaire responses of some trainees who perceived IM exchanges to be time-consuming. We equally noted that non-task-related themes had several functions, as outlined in Fig. 9 below.

Figure 9: Functions of non-task-related themes.

Finally, we found a relationship between the number of themes discussed and how teams performed in the final assessment. Specific statistical evidence is presented in the section on the association between IM practices and the final assessment.

IM multimodality

The 4544 IM turns exchanged by the nine teams were multimodal, including texts, images, screenshots, emojis, and documents. In total, 3954 turns (87%) were text, while 204 (4%), 313 (7%), and 74 (2%) turns were images, emojis, and documents, respectively. It was impossible to account for voicemails, given the inability to export them from the WeChat IM platform.

After breaking down the volume of exchanges by team, we found an uneven multimodal IM exchange pattern that confirmed the popularity of text messaging as an IM mode, corroborating research highlighting this phenomenon, especially among college students (Shi et al., 2017; So, 2016).

According to the data in Fig. 10, Team 7 had the most multimodal exchanges, with images and screenshots accounting for 12% and documents for 6% of their exchanges. In contrast, Team 9 exchanged no instant messages in picture mode, even though they exchanged 94 emojis (11%). Meanwhile, Team 6 shared no emojis, preferring texts (76%), images (12%), and documents (21%). A t-test to determine the statistical differences among the messaging forms confirmed differences between text messages (M = 87.67, SD = 7.18), images (M = 4.11, SD = 5.69), documents (M = 2.11, SD = 3.86), and emojis (M = 4.56, SD = 1.18). However, only the differences involving text messages and emojis were statistically significant (p = 0.001 and p = 0.005, respectively).

Fig. 10: Distribution of multimodal messages by teams.
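The comparison above was run in SPSS. Purely for illustration, a similar check could be scripted with SciPy; the nine-element percentage arrays below are hypothetical placeholders rather than the study’s data, and the use of a paired test across the nine teams is an assumption, since the exact test design is not specified here.

```python
# Minimal sketch: comparing two modality shares across the nine teams with a t-test.
# The arrays are illustrative placeholders, not the study's data; the paired design is assumed.
from scipy import stats

text_pct = [87, 92, 80, 95, 85, 76, 82, 94, 89]   # % of each team's turns that were text
emoji_pct = [5, 3, 6, 4, 5, 0, 9, 11, 7]          # % of each team's turns that were emojis

t_stat, p_val = stats.ttest_rel(text_pct, emoji_pct)  # paired across teams
print(f"text vs. emoji: t = {t_stat:.2f}, p = {p_val:.4f}")
```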

Challenges encountered and resolution strategies

One of the main problems encountered during the exchanges was the unwillingness of members to engage in discussions. Other negative experiences included a lack of time consciousness, a dislike for IM exchanges, laziness, and the inability of members to get along. For instance, as outlined in Fig. 11, the questionnaire data showed that 62% of the trainees believed all their team members got along, while 1% believed none of them got along with the others.

Fig. 11: Extent to which team members got along.

Though this study emphasizes communication-related problems, it is worth noting that teams also encountered technical difficulties, including sharing large files on the IM platform, downloading and uploading files, selecting design options for the language service agency logo, and translating technical terms. For example, in Conversation 3 (Fig. 12) below, members of Teams 5 and 6 discussed and resolved two problems encountered while uploading a picture and registering a mini-program.

Fig. 12: Conversation 3.

Our analysis indicated that technical and communicative problems were resolved using several strategies, including online file sharing and offline meetings, as demonstrated in Teams 5 and 6’s IM exchanges. Other strategies included online discussion/debate, Internet mining (finding the solution by searching the Internet), troubleshooting, and role swapping (the team member who knows the solution swaps roles with the one who has the problem). The problems and the specific solutions adopted to resolve them are summarized in Fig. 13.

Fig. 13: Problems and solution strategies employed by teams.

The data showed that most strategies were deployed to resolve technical difficulties, with teams adopting at least two solution strategies, including online discussion/debate. In contrast, communicative challenges were resolved solely through online discussion/debate. Furthermore, we found that the top-performing teams (Teams 1, 7, 8, and 9) each employed four problem-resolution strategies, while the majority (80%) of the low-performing teams adopted at most two strategies to overcome the challenges they encountered.

Meanwhile, the IM exchange corpus had no information on strategies adopted to resolve the inability of members to get along. Consequently, we contacted the three participants (P1, P2, and P3) who got along with “few team members” and “no team members” for their perspectives. Four main reasons accounted for their behavior: 1) the attitude of some members; 2) incompatible schedules; 3) individual trainees’ personalities; and 4) technical problems. For instance, P3 maintained: “It was difficult for me to keep talking when some group members believed they knew everything and won’t accept others’ opinion[s]. If I had known them well at the start of the class, I wouldn’t have joined their group.” Meanwhile, P2 claimed their mobile device was nonfunctional during the first half of the semester, making it impossible to exchange instant messages seamlessly.

The association between IM practices and the final assessment

IM frequency

The highest average score on the final assessment was recorded for the LSP task (89.3%), followed by the localization task (83.3%) and the translation task (82.4%) (Table 5). The number of IM turns exchanged by teams reflected their final assessment scores, even though IM was only one conversational mode. The data indicated that teams that exchanged more IM turns performed better than teams that exchanged fewer IM turns.

Table 5 showed that the four teams (Teams 1, 9, 8, and 7) that exchanged the most instant messages also had the highest scores in the final assessment. In contrast, Team 6, which had the fewest IM exchanges (4%), also had the lowest performance on the final assessment. There was a very strong and significant correlation [r(9) = 0.97, p < 0.001] between the final assessment scores and the teams’ percentages of IM exchanges.

Furthermore, the volume of IM exchanges for each task correlated with the teams’ average final assessment scores. There were strong and significant correlations between the final score and the IM turns specific to the localization task [r(9) = 0.98, p < 0.001] and to the translation task [r(9) = 0.98, p < 0.001]. The correlation between the final assessment score and the IM exchanges specific to the LSP task was slightly lower [r(9) = 0.84, p = 0.004]. Therefore, based on the volume of IMs exchanged, we concluded that teams performed better on the tasks for which they exchanged the most IM turns. The translation and localization tasks also contributed slightly more to the final assessment score than the LSP task.
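The correlation analyses, like the t-tests, were presumably run in SPSS. Regardless of the tool, the same kind of association can be checked with SciPy’s Pearson correlation; the team-level values below are hypothetical placeholders, not the reported data.

```python
# Minimal sketch: Pearson correlation between each team's share of IM turns and its
# final assessment score. All values are illustrative placeholders.
from scipy import stats

im_share = [18, 9, 8, 10, 7, 4, 14, 13, 17]         # % of all IM turns, per team
final_score = [90, 80, 78, 81, 79, 75, 88, 86, 89]  # final assessment score, per team

r, p = stats.pearsonr(im_share, final_score)
print(f"r = {r:.2f}, p = {p:.3f}")
```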

Conversation time

Furthermore, we identified a relationship between conversation time and the final assessment score. Though teams conversed during different time slots, we found that four teams, including the three top-performing teams (8, 1, and 2), preferred conversing in the morning (9:00 am–11:59 am). In contrast, three teams, including Teams 5, 3, and 9 (one of the four top-performing teams), preferred conversing in the evening between 9:00 pm and 11:59 pm, as illustrated in Fig. 14 below.

Fig. 14: Conversation pattern based on time slots.

We performed t-tests in SPSS to determine the statistical differences between the various IM time slots. The data revealed statistically significant differences for the 9:00 am–11:59 am (p < 0.001), 12:00 pm–2:59 pm (p < 0.001), and 6:00 pm–8:59 pm (p < 0.001) conversation time slots.
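For readers working with exported chat logs, the slot distribution behind Fig. 14 could be derived by binning turn timestamps into three-hour slots; per-slot shares could then be compared with the same kind of t-test sketched earlier. The CSV file name and the team and timestamp columns below are assumptions for illustration and do not reflect the export format used in the study.

```python
# Minimal sketch: binning IM turn timestamps into three-hour slots, per team.
# "im_turns.csv", "team", and "timestamp" are illustrative assumptions only.
import pandas as pd

turns = pd.read_csv("im_turns.csv", parse_dates=["timestamp"])

edges = list(range(0, 25, 3))                                    # 0, 3, ..., 24
labels = [f"{h:02d}:00-{h + 2:02d}:59" for h in range(0, 24, 3)]
turns["slot"] = pd.cut(turns["timestamp"].dt.hour, bins=edges, labels=labels, right=False)

# Turn counts per team and time slot, comparable to Fig. 14.
print(turns.groupby(["team", "slot"], observed=False).size().unstack(fill_value=0))
```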

Other correlations

Our data analysis also revealed correlations between the final assessment score and the roles played by team members, the number of conversation themes, and the teams’ problem-resolution strategies. With regard to role play, there was a strong, significant positive correlation between the teams’ final assessment scores and the total number of roles played by team members [r(9) = 0.89, p < 0.001]. We also found positive correlations between the final assessment score and the roles played in the localization task [r(9) = 0.94, p < 0.001], the translation task [r(9) = 0.74, p = 0.022], and the LSP task [r(9) = 0.94, p < 0.001], with the localization and LSP task roles showing the strongest statistically significant associations. In other words, the more roles team members played across the various tasks, the higher their teams’ assessment scores.

In terms of the number of conversation themes, we found a higher positive correlation between the final assessment score and scheduling [r(9) = 0.46, p = 0.20] than between the final assessment score and personal issues [r(9) = 0.35, p = 0.36] or gossip [r(9) = 0.16, p = 0.68]. In contrast, there was a negative correlation between the final assessment score and entertainment [r(9) = –0.32, p = 0.40]. Though not statistically significant, these results suggested that exchanges on scheduling had a more positive influence on team performance.

With regard to the correlation between the number of strategies adopted by teams to resolve various challenges and the final assessment score, we found a strong and significant positive correlation between the two variables [r(9) = 0.90, p < 0.001]. In other words, the more solutions a team adopted, the better they performed on the final assessment. The finding appeared to corroborate other studies (Stadler et al., 2018; Veerasamy et al., 2019) that have linked problem-solving skills to academic performance among university students.

Discussion and recommendations

The findings had far-reaching implications for IM use in collaborative translation tasks, particularly within the PjBL framework. The trainees’ IM practices revealed marked differences between high- and low-performing teams in IM frequency, role play, IM types, conversation types, and themes discussed. There were also correlations between teams’ IM practices and final assessment scores. Meanwhile, findings from the data gathered during in-class presentations indicated that teams that identified and filled specific roles while performing the three tasks performed better than teams that filled roles on an ad hoc basis. In addition, the data from actual IM exchanges revealed that the volume of IMs exchanged aligned with the final assessment scores, with most of the IM conversations occurring on or around the day of the lesson. The findings also indicated that synchronous communication was the preferred form of IM, and most top-performing teams conversed in the morning between 9:00 am and 11:59 am. Regarding modality, text was the most popular IM mode, followed by emojis. Furthermore, the teams discussed several task-related and non-task-related themes. Finally, teams overcame challenges by employing various problem-resolution strategies, including debating the problem online and offline, mining the Internet, sharing files and documents, voting, and working simultaneously to edit cloud-based files.

The findings indicated that IM is a useful collaborative tool for completing multiple and interrelated tasks, particularly within the broader framework of PjBL. Moreover, the ability of the top-performing teams to organize themselves, identify and fill various roles, and resolve challenges using multiple strategies appeared to answer calls by advocates of the task-based approach and PjBL to “place students in the center of the translating [localizing and LSP designing] operation so that they can understand [their] dynamics” (Hurtado Albir, 2015, p. 15).

Though IM exchanges accounted for only part of the teams’ collaborative communication (the other part being offline meetings), we argue that adequate IM communication guidelines could significantly improve the effectiveness of IM use in collaborative tasks. In our estimation, the volume, time, manner, and thematic content of the IM exchanges, along with the problem-resolution strategies adopted by teams, are crucial and substantiate our call to integrate IM in translation tasks and project design. Meanwhile, conscious that the effectiveness of such an endeavor depends on the course objectives and other variables, including IM clients and related technological affordances, we limit ourselves, at this juncture, to providing a few general guidelines (see Supplementary Appendix 1) in favor of such an integration.

First, we recommend that IM exchange guidelines be written as a template and presented to teams before they begin working on a task. The guideline template should be customizable: the ability of teams to fill the template with options that work best for them is crucial and constitutes the very essence of the guidelines. Second, the guidelines should include conversation days and time slots for members to select based on their availability to communicate online or offline and to work simultaneously on cloud-based documents. We believe a pre-arranged conversation and work schedule may significantly minimize or eliminate the time spent debating schedule-related matters. Third, the guidelines should ensure that teams spend most of their time discussing pertinent themes. The instructor could therefore suggest themes that fall within the scope of the task and allow teams to set aside task and non-task time slots or discussion quotas. All team members, especially those not playing a specific role, may be encouraged to participate more in group discussions, increasing their overall engagement. Fourth, a moderator could be assigned to steer conversations back to predetermined themes and to avoid lengthy discussions of topics unrelated to the task. Fifth, the guidelines should be expandable to include options such as language choice (if there is a need to communicate in a specific language) and a section where teams can document challenges encountered during exchanges and the strategies adopted to resolve them.

Conclusion

This study sought to investigate the use of IM as a collaborative tool within the context of PjBL. The participants, trainee translators enrolled in the MTI program at a Chinese university, were required to perform three tasks: translation, localization, and language service provision. The study adopted a mixed-methods approach, combining quantitative and qualitative data gathered from voluntarily donated IM exchanges, in-class presentations, interviews, and responses to a questionnaire completed by the 68 trainees who participated in the study. The findings showed discrepancies among the teams in terms of IM frequency, task-related and non-task-related themes discussed, conversation times, roles played by members, and preferred IM types. Correlations were also established between teams’ IM practices and their performance in the final assessment. In particular, top-performing teams exchanged more IMs, discussed more themes, identified and filled permanent roles when performing various tasks, and deployed more problem-resolution strategies.

Informed by the findings, we deemed it necessary to foster the integration of IM as a collaborative tool in translation tasks by providing guidelines for its application in learning environments. The guidelines, we argued, should be a template distributed to teams prior to the commencement of the tasks. The template should be customizable, allowing teams to discuss and fill in the blanks with various information, including online and offline conversation schedules, themes (both task-related and non-task-related, if applicable) to discuss, and roles (fixed and ad hoc) to fill. In addition, a moderator should steer members back to task-oriented discussion themes. We posit that such guidelines may save time, further engage members, ensure that conversations focus on the right topics, and offer teams better control over their deliberations. However, we emphasize that any proposed guidelines must complement, not overshadow, the raison d’être of collaborative tasks within the PjBL framework, which underscores autonomy, critical thinking, overcoming challenges, and developing multiple competencies required in today’s technology-oriented translation industry.

Limitations of the study

One limitation of the current study is its sample of only 68 participants. Though the number appears representative, we believe more participants, preferably drawn from more than one MTI program, could offer more insightful findings. In addition, the study focused on communicative problems, meaning future research could assess technical issues in greater depth, including those not highlighted in IM conversation threads.

Data availability

The data supporting this study’s findings are available on request from the corresponding author. The data are not publicly available due to privacy or ethical restrictions.

References

Apandi A, Afiah DSS (2019) Project based learning in translation class. Acad J Perspect Educ Lang, Lit 7(2):101–108


Désilets A, Van Der Meer J (2011) Co-creating a repository of best practices for collaborative translation. Linguist Antverpiensia N Ser–Themes Transl Stud 10:27–45

Desjardins R (2011) Facebook me!: Initial insights in favour of using social networking as a tool for translator training. Linguist Antverpiensia N Ser–Themes Transl Stud 10:175–193

García González M, Veiga Díaz MT (2015) Guided inquiry and project-based learning in the field of specialised translation: a description of two learning experiences. Perspectives 23(1):107–123


García I (2010) The proper place of professionals (and non-professionals and machines) in web translation. Tradumática (8):0001-0007

Gough J (2011) An empirical study of professional translators’ attitudes, use and awareness of Web 2.0 technologies, and implications for the adoption of emerging technologies and trends. Linguist Antverpiensia N Ser–Themes Transl Stud 10:195–217

Hou R, Han S, Wang K, Zhang C (2021) To WeChat or to more chat during learning? The relationship between WeChat and learning from the perspective of university students. Educ Inf Technol 26(2):1813–1832

Hurtado Albir A (2015) The acquisition of translation competence. Competences, tasks, and assessment in translator training. Meta 60(2):256–280

Kerremans K, Gutiérrez RL, Stengers H, Cox A, Rillof P (2019) Technology use by public service interpreters and translators: The link between frequency of use and forms of prior training. FITISPos Int J 6(1):107–122

Kiraly D (2005) Project-based learning: a case for situated translation. Meta: J des Trad/Meta: Translators’ J 50(4):1098–1111

Li D, Zhang C, He Y (2015) Project-based learning in teaching translation: students’ perceptions. Interpret Transl Train 9(1):1–19

Lin Y, Qiu J, Chen P (2020) Exploration and practice on intelligent teaching patterns based on WeChat Mini Program. In: Proceedings of the 9th International Conference on Educational and Information Technology, ICEIT’20, Oxford, United Kingdom, pp. 153–157

MacKiewicz J, Lam C (2009) Coherence in workplace instant messages. J Tech Writ Commun 39(4):417–431

Martins C, Ferreira C (2019) Project-based learning in audiovisual translation: A case study in error analysis. J Audiov Transl 2(1):152–182

Moghaddas M, Khoshsaligheh M (2019) Implementing project-based learning in a Persian translation class: a mixed-methods study. Interpret Transl Train 13(2):190–209

Monti J (2012) Translators’ knowledge in the cloud: the new translation technologies. Proceedings of the International Symposium on Language and Communication: Research trends and challenges (ISLC). İzmir University, Turkey, pp. 789–799

Pitkäsalo E, Ketola A (2018) Collaborative translation in a virtual classroom: Proposal for a course design. Translett Int J Transl Interpret (1): 93-119

Prieto-Velasco JA, Fuentes-Luque A (2016) A collaborative multimodal working environment for the development of instrumental and professional competencies of student translators: an innovative teaching experience. Interpret Transl Train 10(1):76–91

Pym A (2011) What technology does to translating. Transl Interpret 3(1):1–9

Schrock AR (2015) Communicative affordances of mobile media: portability, availability, locatability, and multimediality. Int J Commun 9:1229–1246

Shi Z, Luo G, He L (2017) Mobile-assisted language learning using Wechat instant messaging. Int J Emerg Technol Learn 12(2):16–26

So S (2016) Mobile instant messaging support for teaching and learning in higher education. Internet High Educ 31:32–42

Stadler M, Becker N, Schult J, Niepel C, Spinath FM, Sparfeldt JR, Greiff S (2018) The logic of success: the relation between complex problem-solving skills and university achievement. High Educ 76(1):1–15

Su W, Li D (2023) The effectiveness of translation technology training: a mixed methods study. Humanit Soc Sci Commun 10(1):1–12


Tekwa K (2020) Real-time machine-translated instant messaging: a brief overview with implications for translator training. In: Zhao, J, Li, D, & Tian, L (eds.) Translation education: a tribute to the establishment of World Interpreter and Translator Training Association (WITTA). pp. 135–153, Springer

Thomas WR, MacGregor SK (2005) Online project-based learning: how collaborative strategies and problem solving processes impact performance. J Interact Learn Res 16(1):83–107

Veerasamy AK, D’Souza D, Lindén R, Laakso MJ (2019) Relationship between perceived problem‐solving skills and academic performance of novice learners in introductory programming courses. J Comput Assist Learn 35(2):246–255

Xue S, Hu X, Chi X, Zhang J (2021) Building an online community of practice through WeChat for teacher professional learning. Prof Dev Educ 47(4):613–637

Zafirov C (2013) New challenges for the project based learning in the digital age. Trakia J Sci 11(3):298–302


Acknowledgements

This work was supported by the research grant (No. CTS202306) from the Center for Translation Studies, Guangdong University of Foreign Studies, and the 2023 Guangdong University Quality Assurance Project by the Department of Education of Guangdong Province ([2024] 9).

Author information

Authors and affiliations

School of Foreign Languages, Shenzhen Technology University, Shenzhen, China

Kizito Tekwa

School of Interpreting and Translation Studies, Guangdong University of Foreign Studies, Guangzhou, China

Center for Translation Studies, Guangdong University of Foreign Studies, Guangzhou, China

Centre for Studies of Translation, Interpreting and Cognition, Faculty of Arts and Humanities, University of Macau, Macau, China


Contributions

Conception and design of the work: KT, WS. Supervision: DL. Original draft: KT. Revising and editing: KT, WS, DL. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Wenchao Su.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

Approval was obtained from the University of Macau Ethics Review Committee. The procedures used in this study adhere to the tenets of the Declaration of Helsinki.

Informed consent

Informed consent was obtained from all participants in the study.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Appendix 1–4

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Tekwa, K., Su, W. & Li, D. Web 2.0 technologies and translator training: assessing trainees’ use of instant messaging as a collaborative tool in accomplishing translation tasks. Humanit Soc Sci Commun 11, 555 (2024). https://doi.org/10.1057/s41599-024-02934-5


Received: 31 May 2023

Accepted: 05 March 2024

Published: 02 May 2024

DOI: https://doi.org/10.1057/s41599-024-02934-5

