The term “social media” continues to rise in usage and popularity, but what do people mean when they say it? Is it misleading? And who needs a label, anyway?
Both Aliza Sherman and Olivier Blanchard have taken on this topic recently, each with similar conclusions (i.e., “social media” is not the best term, it’s hard to come up with a better one, but we should probably try). They suggest “social web” and “social communications” as possible alternatives, each of which has its appeal. At the same time, neither term seems to get its arms all the way around the paradigm shift that’s taking place.
It all started with Web 2.0, when people tried to stick a label on a really big thing that was happening and that no one quite understood. The term was coined for the O'Reilly Media conference in 2004, and it sparked quite a debate. People argued about what "it" meant, as if you could point at something and say whether or not it was "Web 2.0." It was a vague umbrella term used to cover a wide array of related concepts, but in the end, it stuck.
“Social media” as a term suffers from the same problem. It’s an umbrella used to cover a huge array of practices, technologies and philosophies about digital content and engagement (as Aliza Sherman illustrates). It reflects another paradigm shift, in the same way “Web 2.0” did.
Out of curiosity, I did a search on Google Trends to see how search behavior related to some of these umbrella terms has evolved since 2004.
Even though this is an imprecise way to get at what people are saying and talking about (since it only reflects search volume), I think it still shows a few interesting things:
- Use of “social media” is rising exponentially, and has caught up with “social networking” as a term
- Use of “social networking” continues to rise, but not as quickly as “social media”
- The two variants of “Web 2.0” are still used a bit more frequently, but their use has declined dramatically since 2007
Labels are for forests, not trees
People in the business of understanding, explaining and advancing paradigm shifts like “Web 2.0” and “social media” are passionate about what terms are used. Terminology matters to practitioners, and they use and evolve it for a few reasons:
- To provide a reference point for discussions and debates
- To improve understanding of the paradigm shifts that are occurring
- To self-identify with others in their profession
For people who aren’t practitioners, it’s a different story. When it comes to labels like “Web 2.0” and “social media,” most people don’t care about the nuanced distinctions being made. In addition, I would argue that getting the label “just right” won’t help describe these paradigm shifts to non-practitioners. After all, many people don’t even know what a Web browser is (let alone the Internet).
Paradigm shifts and the labels stuck on them are about taking a look at the forest, and they’re just not that relevant to daily life. Most people only care about the trees of connecting with friends or watching cool videos on YouTube or sharing a picture of their kids. Labels are for practitioners and people trying to make sense of what’s happening.
Paradigm shifts are label-proof
No one ever agreed on the meaning of the term Web 2.0, and it ultimately didn’t matter. I don’t think anyone will agree on the meaning of the term social media, either. Three years from now, its usage will probably die down, following in the footsteps of “Web 2.0.”
As much as we want to stick labels on these big paradigm shifts, they resist them. They reflect deep societal change and technological disruption, and no simple label is ever going to capture the richness of that change and its impact on humanity. Searching for exactly the right term may be a fool’s errand, but it’s probably a worthwhile one, because the debate itself is what matters.