On this blog, I frequently discuss my own fantasy government that I call a dedomenocracy. One key feature of this government is that instead of asking the population to democratically choose policy, it asks the population to democratically define how to measure the greater good. When some future crisis occurs, what does the population prioritize, and what is the population willing to sacrifice? In such a government, I cannot imagine that we would agree to sacrifice our younger people (and especially not our young women) for the benefit of elderly people (and especially not old men). We do not live in a dedomenocracy.
The government must quit. Certainly there will be new cases and new deaths. The rate will fluctuate over time, location, and demographic. The important information is whether the population is tolerating this, and whether it is adapting. To get this information, the government needs to stand down and watch.
This navigation reminds me of the hyperspace shortcuts in science fiction. In both cases, the ship is in a shortcut where spatial properties differ from the more routine conditions of open seas in deep waters. In both cases, the navigator must rely on the information he had when he entered the shortcut. The navigator has little if any relevant measurement of what will really matter to the outcome of the journey.
I envision a distant time when a dedomenocracy has been operating for multiple generations, so it has good data about human responses to crises. That data should tell the algorithm that humans are prone to fear reactions. It will also tell the algorithm that an overprotected population lacks the experience of handling real fears.
A dedomenocracy fears nothing while a democracy fears everything. In this context, everything refers to the collective library of scientific knowledge. Nothing refers to the empty space that may harbor plans we can only learn by paying close attention to the present, allowing observations to contradict theories we accepted in the past.
There is a benefit to opening our processes to the possibility that reality may be changing, where the change comes from an evolving intelligence or even from a plethora of competing intelligences that have transitions of power much like our political systems. Admitting dark data into our algorithms blinds us to this possibility, especially when we allow dark data to have priority over observations.
A government by data could consider the observations of iatrogenic complications and deaths. The public's fear of a virus could grant this government permission to impose some new authoritarian policy that would do something, but that something would exploit the opportunity to improve future prospects based on all observations of the current world. Such a government would be free to decide to tackle the problem of iatrogenesis instead of the problem of the virus. Fixing the overextension of medicine may ultimately benefit more people than overreacting to a virus that is not as threatening as the population perceives.
In a democracy, the declaration of an emergency is a declaration to freeze science, particularly in those areas that tend to predict the most pessimistic results if nothing is done. I suspect this is inevitable because a democracy selects specific individuals to be leaders, and human leadership demands steadfast determination to see a policy to completion and to instill confidence in the population. Given the recent experience, this particular property of democracy raises doubts about a democracy's ability to handle a new emergency that is inconsistent with established theories and the operational plans based on those theories.
All government funded scientists, whether through salary, contract, or grant, have a conflict of interest when it comes to providing science to support government policies. The strong bias is toward supporting those policies and avoiding any challenge to those policies.
Through this processing of big data, the algorithm will make discoveries about the world that it is incapable of disclosing to humans. Instead it will act on these discoveries in an attempt to optimize some objective. There is a much more profound benefit to this arrangement: if the humans were to become aware of the discovery, they may be incapable of handling it. Humans will panic at the implications.