Carrying the earth on our shoulders

At last week’s STC Summit, I attended a couple of presentations that probed the same question. It’s an old question, but it’s still a thorny one.

[Image: the Atlas statue at Rockefeller Center]

I used to see this guy on childhood trips to New York. Now he reminds me of my Tech Pubs colleagues.

How can we integrate content into a unified presentation when the content comes from all over the place? When different teams — communication specialists and nonspecialists — are creating content using different tools and different styles, often with different objectives in mind, how can we present it to customers as a unified whole?

Both presentations showcased successful case studies for integrating content. Both placed the Tech Pubs department at the center of the action. Yet both left me wondering why this whole thing — integrating content produced independently and content produced as part of a collaborative effort — isn’t easier.

Policing the public square

No doubt you’ve seen the news. Facebook CEO Mark Zuckerberg has gone up to Capitol Hill this week to answer questions from several different Congressional committees. They want to know what Facebook is doing about privacy breaches and interference by foreign actors.

In his prepared testimony, Zuckerberg said:

“It’s not enough to just connect people, we have to make sure those connections are positive. It’s not enough to just give people a voice, we have to make sure people aren’t using it to hurt people or spread misinformation. It’s not enough to give people control of their information, we have to make sure developers they’ve given it to are protecting it too. Across the board, we have a responsibility to not just build tools, but to make sure those tools are used for good.”

Mark Zuckerberg and assistant Andrea Besmehn arrive on Capitol Hill (photo source: NPR)

So, as NPR’s Camila Domonoske points out, Facebook now admits that it’s a content publisher, not just a technology platform on which other people create content. That’s big news.

Here’s even bigger news:

Content questions: Critical Thinking 101

In my first content questions piece, I cited Robert D. Kaplan’s Washington Post article, in which he describes how people use content to distort and deceive — how information becomes misinformation and then the misinformation is amplified.

[Image: a wolf in a forest]

Reader Mark Baker proffered this comment:

This is an old wolf in new sheep’s clothing, but there are so many wolves now, and their sheep’s clothing is such a bad fit that we can always see their paws and teeth sticking out.

I respectfully disagree.

This is not to pick on Mark, with whom — based on his subsequent comments and on other conversations we’ve had in this forum — I agree on most things. But here, at least, I think he understates the problem.

Sure, sometimes it’s easy to spot the content frauds — just as in Cold War-era spy movies, where you knew who the bad guys were because they had Russian accents.

But many wolves are better at masking their true selves. Social media, especially, makes for effective masks. It’s easy to pretend you’re something you’re not.

(It’s been 25 years since Peter Steiner’s famous “nobody knows you’re a dog” cartoon in the New Yorker. How much has really changed since then?)

Exposing the wolves

We try hard to spot the wolves behind the masks. We look for trusted allies who can curate the content we receive. And we instinctively turn toward people who resemble ourselves — our tribe.

That exposes some of the wolves, but not nearly all of them.

To expose the rest, we need to stop judging wolves by their appearance and start judging the things they say (or write).

In other words, we need to think critically.

Content questions: a crisis of trust

We’ve been talking about content, about who gets to decide what is and isn’t appropriate, and especially about what happens to the content you publish.

A lot of it comes down to trust. Can we trust the content we encounter? How do we know? And, of course, how can we create content that people will recognize as trustworthy?

Meet the Edelman Trust Barometer. Published by the Edelman research firm, the barometer is an international study that focuses on the degree to which people trust “institutions” — defined by Edelman as government, business, media, and NGOs.

Richard Edelman (screen shot from The Battle for Truth)

I don’t think I’m off base in substituting the term content providers for institutions. After all, the content we consume — the content on which we base our opinions and our worldview — comes predominantly from government, business, media, and NGOs. And the content you create probably falls into one of those categories.

The newest Trust Barometer finds that people’s trust in institutions — or content providers — is dropping precipitously, especially in the U.S.

In the words of CEO Richard Edelman, “the United States is enduring an unprecedented crisis of trust.” Edelman even posted a short video, titled The Battle for Truth, in which he said (emphasis mine):

  • “We don’t have shared facts. Therefore, we lack rational discourse.”
  • “Silence is a tax on truth, and we have to speak up.”

By speaking up, Edelman means that it’s incumbent on every institution — every content provider — to “fill the void for quality information.” Trustworthy information.

I don’t disagree with him. But I doubt that every content provider is willing or able.

What do you and I, as consumers of content, do then?

Content questions: is the human element worth a try?

At a time when the news media is under intense scrutiny, when people struggle to distinguish reliable information from “fake news” from merely biased news, how will we decide — and who will decide:

  • When is content inappropriate?
  • Who controls the content?
  • What if content is used to deceive?

I posed these questions last week, with emphasis on the information, or the content, that we create. And I asked how we — the content creators — will shape the answers.

Answering the content conundrum

Steven Brill, interviewed on CNN on March 4, 2018

Here’s one answer, from Steven Brill, whose Wikipedia page calls him a “journalist-entrepreneur.” Brill’s new project is called NewsGuard.

NewsGuard, whose launch date has not been announced, will try to “help consumers distinguish between sites that are trying to get it right and sites that are trying to trick people.” Those are the words of Brian Stelter, who interviewed Brill for CNN’s “Reliable Sources” earlier this month.

Content questions: will we have the answers?

This is about information: who controls its flow, who uses it, and who watches you when you use it.

This is about you. Because you access information — or content — on the internet, and because you probably create it as well.

Will someone have the power to tell you what content is and is not appropriate? Who controls what happens to the content you publish? Will someone use your content to deceive or mislead?

Just this month, three news stories have brought these questions into sharper focus. Will we, as writing professionals, have good answers? We’d better, because I don’t know if anyone else will.

When is content inappropriate? Who decides?

Scene from a recent ad appearing on Twitter’s website and in movie theaters

On March 1, Twitter CEO Jack Dorsey promised to start measuring the platform’s “health” as a first step to freeing users from trolls and propaganda. (Josh Bernoff does a great job of breaking down the announcement.) Admitting that “we didn’t fully predict or understand the real-world negative consequences” of Twitter’s free-for-all format, Dorsey promises to get busy and fix the problem.

Can he fix it? Can he put the lid back on Pandora’s box? It strikes me as too little, too late.