His idea was this: struggling writers who finally get published on an established website or magazine are often forced to delete earlier drafts of the work to give that publisher exclusive rights over the content. So why not create a self-destructing content format to head off copyright issues and save both parties the aggravation?
My first reaction? Utterly ridiculous. Why would someone want to erase something they’ve worked so hard at? I’d always assumed the “Mission Impossible” approach to information management (“this tape will self-destruct in five seconds…”) was confined to clichéd spy films.
If it seems ridiculous, that’s partly because it’s so difficult. The Internet has forever blurred the line between fair use and copyright infringement, because it’s so hard to retain control of information once it’s reproduced on someone else’s screen. There are all manner of video processing tools, content scrapers, screenshot plugins and crawlers geared towards downloading and capturing online information in one format or another (there’s even the Internet Archive’s Wayback Machine, which downloads and saves old iterations of websites for posterity). You might delete a web page or post, but hundreds of people might already have made a replica.
And so came the idea of self-destructing content.
The publishing perspective
Returning control of digital assets to the distributor is a curious idea. Publishers (as in Aaron’s story) still see Digital Rights Management in analogue terms. That view was far easier to hold in an age when physical items were sold; it takes a black-and-white view of an area that is incredibly grey. What they fail to realise is that once content is viewed online (even more so than with the Mission Impossible tape) it has already been copied: copied into the user’s memory, however fallible; copied across multiple servers; copied into the machine, software or app it’s viewed in.
Maybe this makes self-destructing content a halfway good idea, then? At least until you consider what readers and end users want.
What the customer wants
Aside from a minority who expect everything for free, most people realise that entertainment (film, music, writing) takes time and costs money. If we can get it for free we will, but otherwise we’re willing to pay for the things we value. Essentially, this is not a discussion about digital rights management (for that kinda thing, try elsewhere).
Instead, it’s about what we expect when we access content across various devices, at different times. Above all, we desire consistency. We assume (because of magazines, books, even physical places) that when we find something in one location, it inherently exists there and always will. In the same way, when we find a website or video at one URL, we expect it to stay there.
Inconsistency of experience is one of the major downfalls of hardware and software. It’s what drives someone to take a hammer to their smartphone or laptop. For an end user, self-destructing content damages our experience and disorientates us.
What does the Internet want?
It’s one thing considering the supplier and the recipient’s needs, but where does the medium stand in all this? What does the technology lend itself towards and where should we take it?
The Web is good at many things, but it excels at duplication, connection and fluidity. Duplication, because digital information can be replicated almost instantaneously. This is totally at odds with self-destructing content, and probably explains why it’s such a holy grail for content owners (desirable and yet elusive).
The connection bit is clear: linking, searching and recommending are all things the Internet’s good at. It funnels, redirects and scatters us over millions of pages. However, all these signposts can lead us to dead ends, thanks to the fluid nature of the web.
Top-level domains (TLDs) are our places. Websites are the activities we find there. Posts, comments and uploads are the activity equipment. Most of the time we can rely on them staying consistent, just as we remembered them. But that’s only because the owners choose to keep them that way; in reality they can change everything from the equipment (a post or article) through to the venue (the TLD). The infrastructure of the web allows owners to be remarkably flexible, and to update or remove as they see fit.
Is a solution possible?
Surely we’re ignoring the inherent make-up of the web? To expect the Internet to remain consistent and unchanging is to forget that it’s a constantly evolving medium. It’s dynamic, fragmented and global. As the struggle over the WikiLeaks website showed, it can exist anywhere.
Equally, the web was made for propagating information and creating networks from it. Deleting content on command does two things:
- it destroys these networks, which is unhelpful, and
- it fails to eradicate every copy of an article or video anyway.
If content self-destructs unexpectedly, it will only harm people’s trust in the Internet and any value they find in its content. But it’s also possible that users and consumers of content want it to self-destruct…
Messaging apps Snapchat and Wickr both base their USPs on the ability to delete your own content after one view or a few days. There’s something novel about this, but it’s also the ultimate form of information control (as per Mission Impossible).
So what would lead you to use Snapchat instead of another messaging service like WhatsApp or BBM? I think it’s the fact that any message or photo is temporary. The service does two things: it lets people share more personal (private, intimate, embarrassing) material, and it has the allure of Chatroulette, because you’re never sure what you’ll receive. It’s too soon to say whether these apps have a long-term future, but it’s clear that some people want self-destructing content.
The struggle for the Internet
What do we want the web to look like? Should users or content owners decide the direction? Or do we simply let code happen? Thoughts and suggestions below: