Fake News: A Dedomenocratic Perspective

While the Dedomenocratic Party did not offer candidates in the 2016 election, the election advanced its agenda.  One way the election contributed to dedomenocracy eventually became characterized as fake news.  Claims that news was fake started early in the cycle, initially as a play on the name Fox News, recast as Faux News.  By the time the election was over, the accusation of promoting fake news extended across all news sources, particularly those that criticized the Democratic Party candidates or those that promoted the Republican Party candidates.  In response, there were counter-claims of fake news harming or benefiting the other side.  While these charges and counter-charges appear hopelessly confused in terms of benefiting one side or the other, the entire discussion benefits the Dedomenocratic Party by drawing attention to data instead of people.

One way I would describe a dedomenocracy is as a form of government by data instead of by people.  In the extreme, this government is more concerned about data than about people.  In essence, this form of government protects what might be called the rights of data, and does so by using data.  Human individuals have rights, but those rights are redefined in terms of how humans collect and defend data.

While I welcome the discussion about fake news, I disagree that we should label news as fake or not fake.  From a data perspective, there are other ways to distinguish data qualities.  Also, there is value in false data (news).  The existence of fake news is itself a type of good data, and to take full advantage of that data, we need to preserve the fake data and make it available for study whenever it appears.

From a dedomenocratic perspective, it is useful to discredit data with other data.  It is not useful to suppress or erase fake data.  Instead, it is far more valuable to match data that cover the same phenomena so that conflicting accounts can be contrasted against supporting data.  Even when one datum is shown to be far inferior, it is still valuable to keep it in the data store, since it will remain relevant by influencing future observations.
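The store-everything policy described above can be sketched in a few lines of Python.  This is a minimal illustration, not an implementation: the record fields, the `rally-size` phenomenon, and the support counts are all hypothetical, invented only to show how conflicting accounts are ranked rather than erased.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    phenomenon: str   # what the datum claims to describe
    source: str       # the means of obtaining the observation
    claim: str        # the content of the datum
    support: int = 0  # count of corroborating observations

class DataStore:
    """Keeps every datum, including discredited ones, grouped by phenomenon."""
    def __init__(self):
        self.records = {}

    def add(self, obs):
        self.records.setdefault(obs.phenomenon, []).append(obs)

    def contrast(self, phenomenon):
        # Rank all observations of the phenomenon by support,
        # without suppressing or erasing the inferior ones.
        return sorted(self.records.get(phenomenon, []),
                      key=lambda o: o.support, reverse=True)

store = DataStore()
store.add(Observation("rally-size", "aerial photo", "~50,000 attendees", support=3))
store.add(Observation("rally-size", "campaign press release", "~250,000 attendees"))
ranked = store.contrast("rally-size")
# Both accounts remain in the store; the contrast is made visible, not suppressed.
```

The inferior datum stays queryable, so a later reader can see not only the better-supported claim but also what the discredited claim was and where it came from.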

Dedomenocratic processes focus on data, not on news.  Data is an observation.  Data may include gossip, rumors, opinions, or imaginations, but these are identified as distinct from actual observations by recording the means of obtaining each observation.

I like the discussion of fake news because, from my perspective, all news is in some sense fake.  In my earlier discussions, I complained about a particular category of data I called dark data.  In those discussions, I was particularly describing the problem of preconceived models or theories producing data that is distinct from direct observations.  However, models and theories are special cases of a broader concept of human narratives.  Scientific theories or models are human narratives written in mathematical or statistical languages.  The narratives instruct us what to expect from future observations.

Describing dark data as narratives instead of models makes the concept of dark data applicable to journalism.  All published journalism consists of narratives.  Journalism may vary in quality in terms of the abundance and sourcing of data, but in the end the expression of that data is in human language and arranged in a story.  The story is a narrative that inevitably incorporates preconceived models of reality, either the journalist's or the journalist's audience's.  All narrative is dark data.  In other words, all news is fake news.

Earlier, I envisioned a future where journalists will specialize in their investigative skills to collect data into data stores.  In this world, software would automate narrative-building because it is more easily automated than the task of uncovering new observations.  Also, news consumers will demand narrative automation to build fresh narratives that incorporate all relevant information at the time the article is read.  Journalists will earn their living by gathering data instead of writing narratives.
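Read-time narrative automation can be sketched very simply.  The following toy example assumes nothing beyond a list of timestamped observations; the observations, timestamps, and `build_narrative` function are all hypothetical, meant only to show a story assembled from whatever is in the data store at the moment of reading.

```python
from datetime import datetime, timezone

# Hypothetical data store for one topic: (ISO timestamp, observation) pairs,
# in whatever order the journalists happened to deposit them.
observations = [
    ("2017-01-03T09:00:00", "Official count revised to 120 injuries."),
    ("2017-01-01T12:00:00", "Initial reports describe about 40 injuries."),
    ("2017-01-05T16:30:00", "Engineers attribute the damage to soil liquefaction."),
]

def build_narrative(observations, read_time=None):
    """Assemble a fresh story from every observation available when the
    reader opens the article, rather than when a journalist wrote it."""
    read_time = read_time or datetime.now(timezone.utc).isoformat()
    known = sorted(observations)  # ISO timestamps sort chronologically
    lines = [f"As of {read_time}:"]
    lines += [f"- {when}: {what}" for when, what in known]
    return "\n".join(lines)

print(build_narrative(observations, read_time="2017-01-06T08:00:00"))
```

Two readers opening the same article on different days would call `build_narrative` against different contents of the store, and so would read different stories.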

Personally, I want that now when I read anything.  I want to read what is known now, not what was known when the article was written.  While historic accounts may have sources that can be checked, the published accounts offer no easy way to link to future corrections, additions, or rebuttals.  With modern data technology, it is increasingly annoying to have to consume an earlier account that omits relevant later developments.  I don't want journalists to write narratives; I want them to uncover new observations, particularly related to social and political topics.

In this century, I want a different type of news than what worked in prior centuries.  I want news to present data to me in some form of interactive visualization tool: very graphical for highly numeric data, infographic in nature for more diverse data, or automated narratives for more subjective information.  This visualization would be machine-generated from the data available at the moment the content is being read.

An exciting consequence of this new approach is that no two people would read the exact same news story on a particular topic unless they read it at the exact same time.  Someone relying on a reading of the article several days earlier will be challenged to discuss the same topic with someone reading the same article just today.  Each reading will have access to all relevant data at the time the article is read.  This will fundamentally change rhetoric.  For example, it would introduce a new fallacy of arguing from an outdated narrative.  Historically, one can quote a source because it never changes.  In the new era, the only valid sources are the narratives based on the most current data.  The old narrative itself becomes irrelevant to the debate, beyond the data point that it was a narrative that worked sometime in the past.

It seems harsh to describe all news as fake news.  I think it is easier to see the distinction between the dark data of narrative and the brighter data of observations with an illustration: consider the news of a particular earthquake.  In my definition, the description of the event as an earthquake is dark data, or fake news.  The concept of an earthquake is a narrative; it is a word that rapidly conveys the nature of the event to the largest population, but the message is a narrative about the general concepts of an earthquake.  A narrative-free description of the event would provide observations of the times, magnitudes, and directions of movements of the earth or the shaking of structures.  Enriching information would include the timing and nature of damages and injuries.  Additional information may document the causes of those damages and injuries as results of falling or breaking objects, structural failures, etc.  The information would include the scientific measurements of epicenters and subterranean movements.
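Such a narrative-free record might look like the following sketch.  Every field name and value here is invented for illustration; the point is only that the record carries raw observations and nowhere attaches the preconceived label to the event.

```python
# Hypothetical narrative-free event record: raw observations only,
# grouped by what was measured, with no label imposed on the event.
event = {
    "ground_motion": [
        {"time": "T+0s", "magnitude": 6.4, "direction_deg": 30},
    ],
    "structural_effects": [
        {"time": "T+75s", "observation": "facade cracking", "cause": "shaking"},
    ],
    "injuries": [
        {"time": "T+120s", "count": 2, "cause": "falling debris"},
    ],
    "instrument_readings": [
        {"epicenter": {"lat": 0.0, "lon": 0.0}, "depth_km": 15.0},
    ],
}

# The word "earthquake" appears nowhere in the record itself.
assert "earthquake" not in repr(event).lower()
```

A visualization or automated narration built from such a record describes exactly what was observed, leaving pattern-finding across events to the data rather than to an inherited category.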

We can document exhaustively all aspects of this event without ever mentioning the word earthquake.  With all this data, the actual word earthquake is not necessary.  News consumers would consult the data visualization or automated narration to learn all that is known about the event at the time.  With this data capability, the word earthquake is not even helpful.  There is no benefit to imposing a preconceived notion of earthquake on this particular event.  We know everything there is to know from the observations.  Also, being free of the preconceived notion of earthquake, we are liberated to find other patterns that can tie together multiple events in ways that may help us find better ways to cope with future events.

Artificially collecting a diverse range of events under a common word like earthquake motivates us to study earthquakes in general as something that might be predicted or better prepared for.  In contrast, having the details of each event absent an overall label of earthquake allows us to focus on local conditions of both geography and human social behaviors to better adapt to what matters most.  The problem for government is not the earthquake; the problem is how to deal with the kinds of challenges that result from it.

Obviously, the recent discussions of fake news are about politics instead of natural disasters, but I think the reporting of both has the same problems and the same solutions.  The fake news is the narrative describing the motives, intentions, or consequences of political or societal actors.  The real news, the bright data, is the actual observations of evidence related to their actions.  As more observations become available to add knowledge or to challenge earlier observations, we will have access to new narratives that always start fresh from the observations available in the data store.  Automated data visualization and narration is the real news that will replace the fake news.  What really makes legacy news fake is the tyrannical influence of past narratives on what future observations we accept.  Fake news is the need to keep old narratives relevant when such a narrative never would have emerged if started from scratch with the data available at the current moment.

