虫子游戈

Will big data and artificial intelligence destroy democracy?

This article is translated from the English version on Scientific American, with the German version available at Spektrum der Wissenschaft. Authors: Dirk Helbing, Bruno S. Frey, Gerd Gigerenzer, Ernst Hafen, Michael Hagner, Yvonne Hofstetter, Jeroen van den Hoven, Roberto V. Zicari, Andrej Zwitter. February 25, 2017.

“Enlightenment is humanity's emergence from its self-imposed immaturity. Immaturity is the inability to use one's understanding without guidance from another.”

—— Immanuel Kant, “What is Enlightenment?” (1784)

The digital revolution is in full swing. How will it change our world? The amount of data we generate doubles every year. In other words: in 2016 we produced as much data as in the entire history of humankind up to 2015. Every minute we conduct hundreds of thousands of Google searches and post hundreds of thousands of Facebook updates. This data reveals what we think and feel. Soon, everything around us (including our clothing) will be connected to the internet. It is estimated that in 10 years there will be 150 billion networked measuring sensors, 20 times the Earth's population. By then, the amount of data will double every 12 hours. Many companies are already trying to turn this "big data" into "big money."

Everything will become intelligent; soon we will not only have smart phones but also smart homes, smart factories, and smart cities. Can we expect such developments to create smarter nations and a smarter planet?

In fact, the field of artificial intelligence is making breathtaking progress, particularly in the automation of data analysis. AI is no longer programmed line by line; it is now capable of learning and thereby continuously developing itself. Recently, an algorithm developed by Google DeepMind taught itself to play 49 Atari games. Algorithms can now recognize handwritten characters and patterns almost as well as humans, and in some tasks they even outperform us. They can also describe the content of photos and videos. Today, 70% of financial transactions are executed by algorithms, and some news content is generated automatically. All of this will have significant economic consequences: in the next 10 to 20 years, about half of today's jobs will be threatened by algorithms, and 40% of today's Fortune 500 companies will disappear within a decade.

It is expected that supercomputers will soon surpass human capabilities in almost all fields—likely between 2020 and 2060. Experts have already begun to sound the alarm. Visionary technologists like Elon Musk of Tesla, Bill Gates of Microsoft, and Steve Wozniak, co-founder of Apple, have warned that superintelligence poses a serious danger to human existence, potentially even more dangerous than nuclear weapons.

Is this alarmism?

One thing is clear: our economy and the way we organize society will undergo fundamental changes. We are experiencing the largest transformation since the end of World War II; after production automation and the creation of self-driving cars, the next step is the automation of society. Thus, society stands at a crossroads, with enormous opportunities ahead but also considerable risks. If we make the wrong decisions, we could threaten our greatest historical achievements.

In the 1940s, American mathematician Norbert Wiener (1894-1964) invented cybernetics. He believed that the behavior of systems could be controlled through appropriate feedback. Soon, some researchers imagined controlling the economy and society based on this fundamental principle, but the necessary technology did not exist at that time.

Today, Singapore is seen as a perfect example of a data-controlled society. What began as a program to protect citizens from terrorism now also shapes the country's economic and immigration policy, its real-estate market, and its school curricula. China is on a similar path. Baidu (China's Google) recently invited the military to participate in its "China Brain Project," which runs so-called deep learning algorithms on data collected from users' search-engine activity. Beyond this, China is planning a form of social control: recent reports indicate that every Chinese citizen will receive a so-called "social credit score" that will determine the conditions under which they can obtain loans, jobs, or travel visas to other countries. This kind of personal surveillance would encompass people's internet browsing habits and their social behavior (see the "Focus on China" section).

With consumers facing ever more frequent credit checks and some online stores experimenting with personalized pricing, we in the West are moving in a similar direction. It is increasingly clear that we have all become the focus of institutional surveillance, as the 2015 disclosure of the UK secret service's "Karma Police" program showed: it comprehensively screens everyone's internet use. Is "Big Brother" becoming a reality?

Programmed Society, Programmed Citizens

Everything starts off quite harmless. Search engines and recommendation platforms have begun to provide us with personalized suggestions for products and services. This information is based on personal data and metadata collected from previous searches, purchases, travel behavior, and social activities. Although users' identities are officially protected, they can be easily inferred in practice. Today, algorithms clearly know what we have done, what we are thinking, and how we feel—perhaps even better than our friends, family, or even ourselves. Many times, the recommendations we receive are so fitting that we feel the results are our own decisions, even though they are not. In fact, we are being remotely controlled more successfully than ever before. The more we are understood, the less likely our choices are to be free, and the more likely they are to be predetermined by others.

But the problem does not stop there. Certain software platforms are evolving towards "persuasive computing." In the future, these platforms will have the ability to guide us through a series of actions using complex manipulation techniques to execute complicated workflows or generate free content for internet platforms, from which companies can profit immensely. The current trend is shifting from programming computers to programming people.

These technologies are also becoming increasingly popular in politics. Under the label "nudging," governments are attempting on a large scale to steer citizens toward healthier or more environmentally friendly behavior (a modern form of paternalism). This new, caring government is not only interested in what we do; it also wants to make sure we do what it considers right. The magic phrase is "big nudging," the combination of big data with nudging. To many, this appears to be a digital wand that allows the populace to be governed efficiently without involving citizens in democratic processes. Could it overcome vested interests and optimize the course of the world? If so, citizens would be ruled by a data-empowered "benevolent ruler" able to produce the desired economic and social outcomes almost as if by magic.

Pre-programmed Disasters

A look at the relevant scientific literature, however, shows that attempts to control opinions in a so-called "optimized" way are doomed to fail, because the problem is highly complex: the dynamics of opinion formation are full of surprises. No one knows how the digital wand (i.e., manipulative nudging techniques) could best be used. What is good and what is bad often becomes apparent only in hindsight. During the 2009 swine flu pandemic in Germany, for example, everyone was encouraged to get vaccinated. We now know, however, that a certain proportion of those vaccinated developed a rare disease, narcolepsy. Fortunately, the number of people who chose to get vaccinated was not higher!

Another example is that health insurance providers recently attempted to encourage people to exercise more by distributing smart health wristbands to reduce the incidence of cardiovascular diseases; but ultimately, this could lead to an increase in hip surgeries. In complex systems like society, improvements in one area almost inevitably lead to deterioration in another. Therefore, large-scale interventions can sometimes end up being large-scale mistakes.

Moreover, criminals, terrorists, and extremists will eventually attempt to seize control of this digital wand—even if we might not notice it. Almost all companies and institutions have been hacked, including the Pentagon, the White House, and the NSA.

When there is a lack of sufficient transparency and democratic control, another problem arises: systems can be eroded from within. Search algorithms and recommendation systems may be affected. Companies can bid to purchase specific phrases to achieve more favorable results. Governments may also have the ability to influence outcomes. During elections, governments can nudge undecided voters to support themselves—this is a form of manipulation that is difficult to detect. Therefore, whoever controls this technology can win elections—pushing themselves into power.

The problem is even more severe in countries where a single search engine or social media platform holds a dominant market share: the public can be decisively influenced, and such countries can effectively be manipulated from abroad. Although the European Court of Justice's ruling of October 6, 2015 limits the unrestricted export of European data, the underlying problem remains unresolved within Europe, let alone elsewhere.

What adverse side effects can we expect? To remain undetected, manipulators exploit a so-called "resonance effect": suggestions fully customized to each individual. In this way, local trends are gradually reinforced through repetition, ultimately producing "filter bubbles" or "echo chambers": in the end, all the information you receive merely mirrors your own views. This can drive social polarization, splitting society into groups that no longer understand one another and clash ever more often. Personalized information can thus unintentionally destroy social cohesion. This can already be observed in American politics, where the divide between Democrats and Republicans has grown so severe that political compromise has become almost impossible. The result is a fragmentation of society, possibly even its collapse.
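The reinforcement dynamic behind such filter bubbles can be illustrated with a toy simulation. Everything here is invented for illustration: the `recommend` function, the three-topic catalog, and the 90% personalization rate do not describe any real platform. The point is only that a recommender which mostly serves more of what a user already consumed turns a small initial lean into a dominant share of that user's information diet:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

CATALOG = ["left", "right", "center"]  # hypothetical content topics

def recommend(history, personalization=0.9):
    """Hypothetical recommender: with high probability, serve more of what
    the user already consumed; otherwise pick uniformly from the catalog."""
    if history and random.random() < personalization:
        return random.choice(history)  # reinforce past behavior
    return random.choice(CATALOG)     # occasional fresh suggestion

history = ["left"] * 3  # a small initial lean toward one topic
for _ in range(300):
    history.append(recommend(history))

# Share of each topic in the user's final information diet
shares = {t: history.count(t) / len(history) for t in CATALOG}
```

Even with a 10% chance of serving something new, the early clicks compound: the longer the loop runs, the more the history, and hence the recommendations, converge on the starting preference.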

Due to the resonance effect, large-scale changes in public opinion can only occur slowly and gradually. There is a time lag in the occurrence of this effect, but likewise, what has already happened cannot be easily undone. For instance, resentment against minorities or immigrants may spiral out of control; excessive nationalism may lead to discrimination, extremism, and conflict.

Perhaps more importantly, manipulative methods can indeed change the way we make decisions. They can at least temporarily obscure other relevant cultural and social habits. Overall, the large-scale use of manipulative methods may lead to severe social harm, including cruel behavior in the digital world. Who should be held accountable for this?

Given the massive fines imposed on tobacco companies, banks, IT firms, and carmakers in recent years, this raises legal issues that should not be overlooked. But which laws, if any, might be violated? First of all, manipulative technologies restrict the freedom of choice. If remote control of our behavior worked perfectly, we would essentially be digital slaves, merely executing decisions that were in fact made by others beforehand. Of course, manipulative technologies work only partially. Nevertheless, our freedom is disappearing slowly but surely; indeed, slowly enough that citizens have so far put up little resistance.

The great Enlightenment thinker Immanuel Kant offered insights that are highly relevant here. Among other things, he noted that a state that tries to determine the happiness of its citizens is a despotic state. The right to individual self-development, however, can be exercised only by those who control their own lives, and this presupposes informational self-determination. It concerns nothing less than our most important constitutional rights. Democracy cannot function well unless these rights are respected; if they are restricted, our constitution, our society, and our state are endangered.

Because manipulative technologies such as big nudging operate much like personalized advertising, other laws are affected as well. Advertisements must be truthful and not misleading, and they may not exploit certain psychological manipulation techniques such as subliminal stimuli. This is why it is prohibited to splice a split-second soft-drink advertisement into a film: the ad is not consciously perceived yet may still have a subconscious effect. Furthermore, the current wholesale collection and processing of personal data does not comply with the data protection laws of European countries and some other places.

Finally, the legality of personalized pricing is also questionable, as it may be a misuse of insider information. Other relevant aspects may include violations of equality and non-discrimination principles as well as competition laws, as free market access and price transparency can no longer be guaranteed. This situation can be compared to companies selling products at lower prices in other countries but trying to prevent purchases through those countries. This has previously led to very high punitive fines.

Personalized advertising and pricing differ from traditional advertising or discount coupons because the latter are not targeted, do not invade our privacy, do not exploit our psychological weaknesses, and do not deprive us of critical thinking.

Nor should we forget that in academia even the most harmless decision experiments are considered research on human subjects and must be approved by a responsible ethics committee. In every case, participants must give informed consent. By contrast, a single click to accept the contents of a hundred-page "terms of use" agreement (the model now used by many information platforms) is grossly inadequate.

While this is the case in academia, nudging and other manipulative technologies can conduct experiments on millions of people without informing them, without transparency, and without ethical constraints. Even large social networks like Facebook or online dating platforms like OkCupid have publicly admitted to conducting such social experiments. If we want to avoid irresponsible research on humans and society (just think of the psychologists who participated in implementing torture), we urgently need to enforce high standards, particularly scientific quality standards and codes of conduct similar to the Hippocratic Oath.

Have Our Thoughts, Our Freedom, and Our Democracy Already Fallen?

Suppose there were a superintelligent machine with godlike knowledge and superhuman abilities: would we follow its commands? That seems possible. But if we did, the warnings voiced by Elon Musk, Bill Gates, Steve Wozniak, Stephen Hawking, and others would come true: computers would take over the world. We must be clear that a superintelligence can also make mistakes, lie, pursue selfish interests, or be manipulated. Above all, it could never match the distributed collective intelligence of the entire population.

The idea of replacing the thoughts of all citizens with computer clusters is absurd, as this would greatly reduce the diversity and quality of achievable solutions. We can already see clearly that despite the significant increase in data recently and the use of personalized information, the problems of this world have not diminished—in fact, quite the opposite! World peace is fragile. Long-term climate change may lead to the most severe loss of species since the extinction of the dinosaurs. We have not yet overcome the financial crisis and its impact on the economy. Cybercrime is estimated to cause losses of $3 trillion annually. Nations and terrorists are also preparing for cyber warfare.


Figure 1: The development of the digital world. Source: Dirk Helbing. With the help of big data, we can now make better, evidence-based decisions. However, top-down control will increasingly fail, as the complexity of society will grow exponentially with the networking of our world. Distributed control methods will become increasingly important. Only by leveraging collective wisdom can we hope to find suitable solutions to the complexity challenges of our world.

In a rapidly changing world, superintelligence may never provide perfect decisions (see Figure 1): the rate of increase in system complexity outpaces the growth of data, while the growth of data outpaces the development of data processing capabilities, and the speed of data transmission is limited. This can lead systems to overlook local knowledge and facts, which are crucial for achieving excellent solutions. Distributed local control methods often outperform centralized ones, especially in complex systems where behavior is highly variable, unpredictable, and cannot be optimized in real time. This has already become a reality in urban traffic control, and the problems will be even more severe for our highly networked, globalized social and economic systems.

Additionally, there is a danger: decision manipulation through powerful algorithms may undermine the foundation of "collective wisdom," which often flexibly adapts to the challenges of our complex world. For collective wisdom to be effective, individuals' information searches and decisions must be conducted independently. However, if our judgments and decisions are predetermined by algorithms, this effectively amounts to brainwashing. The existence of wisdom would be reduced to merely receiving instructions, responding automatically to stimuli.
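The independence requirement can be made concrete with a small numerical sketch. All numbers here are invented assumptions (the true value, the noise level, the strength of the shared "nudge"): when individual errors are independent they largely cancel in the aggregate, but when everyone is pulled toward the same biased signal, the aggregate inherits the bias:

```python
import random
import statistics

random.seed(0)
TRUTH = 100.0  # the quantity the crowd is trying to estimate
N = 1000       # number of individuals

# Independent guesses: each person's error is their own, so errors cancel.
independent = [TRUTH + random.gauss(0, 20) for _ in range(N)]

# "Nudged" guesses: everyone is pulled 80% of the way toward the same
# biased signal, so the errors are correlated and no longer cancel.
biased_signal = TRUTH + 30.0
nudged = [0.8 * biased_signal + 0.2 * (TRUTH + random.gauss(0, 20))
          for _ in range(N)]

err_independent = abs(statistics.mean(independent) - TRUTH)
err_nudged = abs(statistics.mean(nudged) - TRUTH)
```

This is the classic "wisdom of crowds" condition: averaging helps only as long as errors are uncorrelated, which is precisely what algorithmically predetermined judgments destroy.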

In other words: personalized information builds "filter bubbles" around us, digital prisons for our thinking. How could creativity and thinking "outside the box" survive under such conditions? Ultimately, a centralized system of bureaucratic behavioral and social control using a superintelligent information system would result in a new form of dictatorship. The top-down controlled society, operating under the banner of "liberal paternalism," is therefore in principle a totalitarian regime with a pretty facade.

In fact, the goal of big nudging is to press the behavior of many people into norms and to manipulate their views and decisions. This places it in the realm of political propaganda, aiming to disempower citizens through behavioral control. We fear the long-term consequences would be disastrous, especially given the harmful cultural effects described above.

A Better Digital Society is Possible

Despite fierce global competition, the wise approach for democratic countries is not to discard the achievements of centuries. Compared to other political systems, Western democracies have the advantage of having learned to cope with diversity and plurality. Now they just need to learn how to leverage them more.

In the future, these countries will find a healthy balance between businesses, governments, and citizens. This requires a networked mindset and the establishment of "ecosystems" for information, innovation, products, and services. For good results, it is important not only to create opportunities for participation but also to support diversity. Because we cannot determine the best objective function: should we optimize GDP per capita or sustainability? Power or peace? Happiness or life expectancy? Usually, we can only know which is better after the fact. By allowing the pursuit of various different goals, a diverse society can better cope with the unexpected challenges that may arise.

Centralized, top-down control is a solution of the past, suitable only for low-complexity systems. Federal systems and majority voting are the solutions of the present. As the economy and culture develop, social complexity will continue to increase. The future solution is collective wisdom. This means that citizen science, crowdsourcing, and online discussion platforms are very important new methods for providing more knowledge, ideas, and resources.

Collective wisdom requires high diversity. However, today's personalized information systems are undermining diversity, and this trend is strengthening.

Social diversity is as important as biological diversity. It not only contributes to collective wisdom and innovation but also provides resilience to society—the ability of our society to respond to unexpected shocks. Reducing social diversity often also diminishes economic and social functionality and performance. This is why totalitarian regimes frequently come into conflict with neighboring countries. A typical long-term consequence of reduced social diversity is political instability and war, which has been seen throughout history. Therefore, diversity and participation rights should not be seen as concessions to citizens but as functional prerequisites for a prosperous, complex, modern society.


Figure 2: The digital crossroads. Source: Dirk Helbing. Our society stands at a crossroads: if more powerful algorithms are controlled by a few decision-makers and reduce our self-determination, we will regress to a form of feudalism (Feudalism 2.0), as we will lose important historical achievements. However, we still have the opportunity to choose the path of digital democracy or Democracy 2.0, which can benefit us all (see https://vimeo.com/147442522).

Overall, it can be said that we are now standing at a crossroads (see Figure 2). Big data, artificial intelligence, cybernetics, and behavioral economics are shaping our society—whether for better or worse. If these widely used technologies are incompatible with the core values of our society, they will inevitably cause widespread destruction. They could create an automated society with totalitarian characteristics. In the worst-case scenario, centralized AI could control what we know and think, as well as how we behave. We are at a historic moment where it is time to decide the right path—a path that will allow us to benefit from this digital revolution. Therefore, we urge adherence to the following fundamental principles:

  1. Further decentralize the functions of information systems;
  2. Support informational self-determination and participation;
  3. Enhance transparency for better trust;
  4. Reduce distortion and pollution of information;
  5. Enable user-controlled information filtering technologies;
  6. Support social and economic diversity;
  7. Enhance interoperability and improve opportunities for collaborative cooperation;
  8. Create digital assistants and coordination tools;
  9. Support collective wisdom;
  10. Promote responsible behavior in the digital world through digital cognitive education and enlightenment.

By following this digital policy, we can all benefit from the results of this digital revolution, whether in the economy, government, civil society, or elsewhere. What are we waiting for?

A Strategy for the Digital Age

Big data and artificial intelligence are undoubtedly important innovations. From personalized medicine to sustainable cities, they have tremendous potential to catalyze economic value and social progress. However, using these technologies to deprive citizens of their agency is completely unacceptable. Big nudging and social credit scores will centrally misuse the personal data collected to control behavior in a fundamentally totalitarian manner. This is not only incompatible with human rights and democratic principles but also unsuitable for managing modern innovative societies. To address the real problems of this world, far better methods need to be proposed in the fields of information and risk management. The field of responsible innovation research and the “Data for Humanity” initiative can provide guidance on how big data and artificial intelligence should be used for the benefit of society.

What can we do now? First, even in this era of digital revolution, the basic rights of citizens should be protected, as these rights are the fundamental prerequisites for a practical and viable democratic society. This requires creating a new social contract based on trust and cooperation, which does not view citizens and customers as obstacles or exploitable resources but as partners. To this end, the state must provide an appropriate regulatory framework to ensure that these technologies are designed and used in a manner consistent with democracy. This must guarantee informational self-determination, not just theoretically but also practically, as this is a prerequisite for us to live in a self-determined and responsible manner.

We should have the right to obtain copies of the personal data collected from us. Laws should stipulate that this information must be automatically sent in a standardized format to personal data storage locations, allowing individuals to manage the use of their data (potentially with the support of certain AI-based digital assistants). To ensure better privacy and prevent discrimination, laws must punish unauthorized use of data. Individuals must then have the ability to decide who can use their information, for what purposes, and for how long. Additionally, appropriate measures must be taken to ensure the security of data storage and exchange.

Sophisticated reputation systems based on multiple criteria could help improve the quality of the information on which our decisions are based. If data filters and recommendation and search algorithms could be chosen and configured by users, we could examine issues from multiple perspectives and would be less vulnerable to manipulation by distorted information.

Furthermore, we need to provide citizens with effective complaint procedures and effective sanctions for rule violations. Finally, to achieve sufficient transparency and trust, leading scientific institutions should act as trustees for data and algorithms that currently evade democratic control. This also requires appropriate codes of conduct, which must be adhered to by those with access to sensitive data and algorithms—akin to a Hippocratic Oath for information technology professionals.

Additionally, we need a digital agenda to lay the foundation for the future of work and the digital society. Every year, we invest billions of dollars in agriculture and public infrastructure, schools, and universities—to benefit industry and services.

So what public systems do we need to ensure the success of the digital society? First, a completely new educational philosophy is needed. Greater emphasis should be placed on critical thinking, creativity, innovation, and entrepreneurship, rather than producing standardized workers (whose future tasks will be performed by robots and computer algorithms). Education should also help individuals understand how to use digital technologies responsibly and critically, as citizens must clearly understand how the digital world interacts with the physical world. For citizens to effectively and responsibly exercise their rights, they must understand these technologies and know how to use them appropriately. Therefore, scientific, industrial, political, and educational institutions need to provide this knowledge more widely.

Second, participatory platforms are needed to make it easier for individuals to become self-employed, establish their own projects, find partners, promote products and services worldwide, manage resources, and contribute to taxes and social security (an economy shared by all). To improve this aspect, towns and even villages could establish emerging digital community centers where new ideas can be collaboratively developed and tested for free. The open and innovative approaches discovered in these centers can promote large-scale collaborative innovation.

Specific types of competitions can provide additional motivation for innovation, help raise public awareness, and create momentum for a participatory digital society. They are particularly useful in mobilizing civil society, ensuring that local contributions address global issues (for example, through the "Climate Olympics"). For instance, platforms aimed at coordinating scarce resources can help unlock the enormous potential of the circular and sharing economy, which is still underdeveloped.

With a commitment to open data strategies, governments and industries will increasingly provide data for scientific and public purposes, thereby creating conditions for an effective information and innovation ecosystem to keep pace with the challenges of our world. This can be incentivized through tax reductions, similar to how some countries incentivize the use of environmentally friendly technologies through tax breaks.

Third, establishing a citizen-run "digital nervous system" can open up new Internet of Things opportunities for everyone and provide real-time data observation available to all. If we want to use resources more sustainably and slow down climate change, we need to measure the positive and negative side effects of our interactions with others and the environment. By using appropriate feedback loops, systems can be influenced in a self-organizing manner to achieve the desired results.
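As a minimal sketch of what such a feedback loop means (a generic proportional controller; the target, gain, and the quantity being measured are placeholder assumptions, not a concrete policy proposal): the system repeatedly measures its deviation from a desired outcome and applies a correction proportional to that deviation, so the measured quantity converges toward the target without any central planner prescribing each step.

```python
def feedback_step(state: float, target: float, gain: float = 0.3) -> float:
    """One proportional-feedback update: correct by a fraction (gain)
    of the measured deviation from the desired outcome."""
    error = target - state
    return state + gain * error

# Illustrative run: steering a measured quantity (say, a community's
# resource-consumption index) toward a sustainable target value.
state, target = 10.0, 4.0
for _ in range(30):
    state = feedback_step(state, target)
```

With a gain between 0 and 1, the deviation shrinks geometrically each round; this is the simplest instance of the self-organizing, measurement-driven regulation described above.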

To successfully implement this, we need various incentive and exchange systems, providing them to all economic, political, and social innovators. This can create entirely new markets, laying the foundation for new prosperity. By diversifying the financial system (for example, through functionally differentiated currencies) and establishing new rules for compensating innovative inventions, the nearly limitless potential of the digital economy can be unleashed.

To better cope with the complexity and diversity of our future world and turn it into an advantage, we will need personal digital assistants. These digital assistants will also benefit from the development of artificial intelligence. It is foreseeable that in the future, many networks combining human wisdom and artificial intelligence will be flexibly constructed and configured as needed. However, to maintain control over our lives, these networks should be controlled in a distributed manner. In particular, individuals should be able to log in or out as needed.

Democratic Platforms

The "Wikipedia of Cultures" could ultimately help coordinate various activities in a highly diverse world and make them compatible with each other. It will make the successful principles inherent in world cultures evident so that they can be combined in new ways. Such a "Cultural Genome Project" would also serve as a form of peace project, as it would raise public awareness of the value of social and cultural diversity. Global companies have long recognized that culturally diverse and multidisciplinary teams are more successful than homogeneous teams. However, effectively organizing the knowledge and ideas of many people requires a framework to create collective wisdom, and this framework is still lacking in many places. To change this, providing online deliberation platforms would be very useful. They can also create the framework needed to achieve upgraded digital democracy while providing citizens with more opportunities for participation. This is important because many of the problems facing today's world can only be managed through contributions from civil society.

Focus on China: Will Future Societies Look Like This?#

How would behavioral and social control affect our lives? The social credit score currently being rolled out in China offers a glimpse. There, all citizens are rated on a one-dimensional scale. Everything they do raises or lowers their score. The aim of this is not only mass surveillance: the score, which also depends on individuals' clicks on the internet and on whether their behavior is politically "correct," determines their credit conditions, their eligibility for certain jobs, and their travel visas. The social credit score is therefore an instrument of behavioral and social control. Even the behavior of friends and acquaintances affects one's score, applying a principle of collective liability: everyone becomes at once a guardian of virtue and an informant, and those who think unorthodox thoughts are isolated. Were such principles to spread widely in democratic countries, it would not matter whether it was the state or influential companies setting the rules; in either case, the pillars of democracy would be directly threatened:

  • The pervasive tracking and measuring of all activities that leave digital traces would produce "naked" citizens, steadily eroding their dignity and privacy.
  • Decision-making would no longer be free, since a choice that is "wrong" from the perspective of the government or company defining the scoring system would carry negative consequences. Individual autonomy would, in principle, be abolished.
  • Every small mistake would be punished, and no one would be above suspicion; the presumption of innocence would be abandoned. Predictive policing would even lead to punishment for violations that have not yet occurred but are merely anticipated.
  • Since the underlying algorithms cannot operate flawlessly, fairness and justice would give way to a new kind of arbitrariness, against which individuals would find it nearly impossible to defend themselves.
  • If individual goals were set externally, the possibility of personal self-development would be eliminated, and with it democratic pluralism.
  • Local cultural and social norms would no longer be the basis for appropriate, context-dependent behavior.
  • Controlling society with a one-dimensional objective function would produce more conflict and thereby endanger social security. Severe instability would likely follow, as we have seen in our financial system.

Such social control would turn self-responsible citizens into underlings, yielding a Feudalism 2.0. This is diametrically opposed to democratic values. It is therefore time for an Enlightenment 2.0, founded on digital self-determination and giving rise to a Democracy 2.0. This will require democratic technologies: information systems that are compatible with democratic principles; otherwise they will destroy our society.

Further Reading#

ACLU: Orwellian Citizen Score, China's credit score system, is a warning for Americans, http://www.computerworld.com/article/2990203/security/aclu-orwellian-citizen-score-chinas-credit-score-system-is-a-warning-for-americans.html

Big data, meet Big Brother: China invents the digital totalitarian state. The worrying implications of its social-credit project. The Economist (December 17, 2016).

Harris, S. The Social Laboratory, Foreign Policy (29 July 2014), http://foreignpolicy.com/2014/07/29/the-social-laboratory/

Tong, V.J.C. Predicting how people think and behave, International Innovation, http://www.internationalinnovation.com/predicting-how-people-think-and-behave/

Mnih, V., Kavukcuoglu, K., Silver, D., et al.: Human-level control through deep reinforcement learning. In: Nature 518, pp. 529-533, 2015.

Frey, B. S. and Gallus, J.: Beneficial and Exploitative Nudges. In: Economic Analysis of Law in European Legal Scholarship. Springer, 2015.

Gigerenzer, G.: On the Supposed Evidence for Libertarian Paternalism. In: Review of Philosophy and Psychology 6(3), pp. 361-383, 2015.

Grassegger, H. and Krogerus, M.: Ich habe nur gezeigt, dass es die Bombe gibt [I have only shown that the bomb exists]. Das Magazin (December 3, 2016), https://www.dasmagazin.ch/2016/12/03/ich-habe-nur-gezeigt-dass-es-die-bombe-gibt/

Hafen, E., Kossmann, D. and Brand, A.: Health data cooperatives—citizen empowerment. In: Methods of Information in Medicine 53(2), pp. 82–86, 2014.

Helbing, D.: The Automation of Society Is Next: How to Survive the Digital Revolution. CreateSpace, 2015.

Helbing, D.: Thinking Ahead—Essays on Big Data, Digital Revolution, and Participatory Market Society. Springer, 2015.

Helbing, D. and Pournaras, E.: Build Digital Democracy. In: Nature 527, pp. 33-34, 2015.

van den Hoven, J., Vermaas, P.E. and van de Poel, I.: Handbook of Ethics, Values and Technological Design. Springer, 2015.

Zicari, R. and Zwitter, A.: Data for Humanity: An Open Letter. Frankfurt Big Data Lab, July 13, 2015.

Zwitter, A.: Big Data Ethics. In: Big Data & Society 1(2), 2014.
