Content questions: will we have the answers?

This is about information: who controls its flow, who uses it, and who watches you when you use it.

This is about you. Because you access information — or content — on the internet, and because you probably create it as well.

Will someone have the power to tell you what content is and is not appropriate? Who controls what happens to the content you publish? Will someone use your content to deceive or mislead?

Just this month, three news stories have brought these questions into sharper focus. Will we, as writing professionals, have good answers? We’d better, because I don’t know if anyone else will.

When is content inappropriate? Who decides?

[Image: advertisement captioned “Don’t worry, it’s just Twitter.” Scene from a recent ad appearing on Twitter’s website and in movie theaters.]

On March 1, Twitter CEO Jack Dorsey promised to start measuring the platform’s “health” as a first step to freeing users from trolls and propaganda. (Josh Bernoff does a great job of breaking down the announcement.) Admitting that “we didn’t fully predict or understand the real-world negative consequences” of Twitter’s free-for-all format, Dorsey promised to get busy and fix the problem.

Can he fix it? Can he put the lid back on Pandora’s box? It strikes me as too little, too late.

Of course, Dorsey isn’t the only social-media mogul wringing his hands. Mark Zuckerberg’s Facebook, besieged by covert Russian advertisers during the 2016 presidential campaign, is looking for ways to take back the platform, as it were, and ensure a more balanced flow of content.

This, too, sounds like too little, too late. More to the point, the people in charge of Twitter and Facebook aren’t journalists or publishers. They’re technologists, ill-equipped to consider the complex ethical issues associated with controlling the creation, dissemination, and use of content.

Who controls the content?

The day after Jack Dorsey’s announcement, Foreign Policy reported that Chinese authorities are demanding personal information — home, school, and work addresses; photos; scans of their ID cards; even marriage certificates — from Uighurs who have emigrated to France. (The Uighurs are a mostly Muslim ethnic group concentrated in northwest China. For decades they’ve clashed with Beijing over issues of autonomy.)

Why would Uighurs living in France — some of them French citizens — give their personal information to the Chinese government? Because if they don’t, the government threatens to make life difficult for their family members back in China.

In possession of their personal information, of course, the Chinese authorities can more easily keep tabs on a group they see as seditious — even from thousands of miles away.

George Orwell foresaw this kind of thing when he wrote 1984. For most of us, though, the “information age” was touted as a time of greater community and greater freedom, as the light of knowledge and truth flooded the world.

What if content is used to deceive?

[Image: a woman walks past a wall filled with social media logos in India (Jagadeesh Nv/EPA-EFE/Shutterstock, via Washington Post).]

Knowledge and truth? Or something else?

Robert D. Kaplan, of the Center for a New American Security, writing in the Washington Post, described the potential for using digital and video technology to create an illusory world — in the same way movie directors used 20th-century technology to stage lavish scenes, like the parting of the Red Sea in The Ten Commandments.

A movie director can keep an audience in thrall for a couple of hours, Kaplan explains. But a 21st-century authoritarian regime will be able to create a propaganda bubble in which it entraps all of its citizens, all the time.

Already, Kaplan reports, “the Chinese, eventually with the help of big data, are working on following the Internet searches of their citizens, and then determining who needs to be singled out for further observation. If a government or a company knows the destination and sequence of all of your searches, it is virtually inside your mind.” (He could just as easily have said Google here, not just the Chinese.)

For Kaplan, today’s “dark age of technology” is forcing us to fight for objectivity — for the assurance that the information we access is real and not an artifice made to spread propaganda or steal our privacy.

Is Kaplan being alarmist? Has he himself, by cherry-picking facts and exaggerating the magnitude of the problem, created the very artificial reality that he decries? Not if you ask me. From watching the news media and social media, I’m convinced that the situation is already more precarious than most people want to admit.

Which, again, prompts questions about how to ensure that content is accurate and unbiased, that it’s made freely available, and that it won’t be used to manipulate or deceive. Questions that might soon dominate the professions in which we, as content producers, work.

How will we answer?

When the questions come, will we have the right answers? We can’t expect Jack Dorsey or Mark Zuckerberg — well-intentioned though they may be — to even understand the complexity of the questions, let alone find the answers.

So it’s up to us, the content experts. What are we doing to prepare?


9 thoughts on “Content questions: will we have the answers?”

  1. Pingback: Content questions: will we have the answers? — Leading Technical Communication – blancaandieorozco

  2. Mark Baker

    Is Kaplan being alarmist? Yes.

    Is content used to create an illusory world? Yes, since the first word was written.

    What the web actually does is thrust such a tide of illusions on us that it makes us more conscious of how illusory the pictures it paints are.

    This is an old wolf in new sheep’s clothing. But there are so many wolves now, and their sheep’s clothing is such a bad fit, that we can always see their paws and teeth sticking out.

    We are more aware of the wolf, not because he is more subtle than before, but because he is less so.

    If there is a danger in the present situation, it is not that we will mistake all wolves for sheep but that, in our cynicism, we will mistake all sheep for wolves.

  3. Larry Kunz Post author

    Mark, thanks for your comment. I don’t think you speak for everyone when you say that the web makes us “more aware of the wolf.” I think that many people — I daresay all of the people some of the time, and some of the people all of the time — struggle to distinguish the wolves from the sheep. Some even have begun to say, “Well, it doesn’t matter. Your truth can be different from my truth, and that’s OK.”

    While the web has helped many of us hone our critical-thinking skills, the wolves are still very clever. Real lies are being spread, and real harm is being done.

    1. Mark Baker

      Larry, of course real lies are being spread. Of course real harm is being done. And of course many people struggle to tell wolves from sheep. But that was true of paper before it was true of the web. It was true of papyrus and vellum and stone inscriptions and cave paintings before it was true of paper. We are a horrible species. We lie, cheat, steal, and kill. But it was not the web that made us so.

      If we start blaming the web for the sins of man we will only play into the hands of the wolves who seek to censor the web for their own ends. There has never been and will never be any form of censorship that favors truth over lies or sheep over wolves.

  4. Larry Kunz Post author

    I don’t blame the web for the sins of man. But we should recognize that the web has given the sinners a powerful new tool. They’re talking about that very thing this week at SXSW.

    At the risk of opening up Pandora’s box: Guns don’t kill. People kill. But people with guns kill a lot more easily.

    1. Mark Baker

      The web has given everyone a powerful new tool, saint and sinner, wolf and shepherd. Wolves in sheep’s clothing are not really the thing you should be watching out for. It is wolves in shepherd’s clothing that do the real damage.

      If you want to make the comparison to guns, then the comparison is not guns to the web; it is guns to words and the web to Walmart. Censoring the web is censoring words.

      It is the wolves in shepherd’s clothing that lead the censorship charge, and the way they do it is always the same. “Oh no! People you disagree with are saying things and other people are listening. How distressing for you. Give me the keys to the web and to the presses and I will make them be quiet so they don’t worry you any more.”

      Fear mongering about speech is always, always, always, the leading edge of oppression.

  5. Pingback: Content questions: is the human element worth a try? | Leading Technical Communication

  6. Pingback: Content questions: a crisis of trust | Leading Technical Communication

  7. Pingback: Content questions: Critical Thinking 101 | Leading Technical Communication
