Nexus – A Brief History of Information Networks from the Stone Age to AI

While reading this book, I came across a passage that matches the current situation of our country's conservative political factions so well it gave me chills, so I quote it here.

In a well-functioning democracy, citizens trust the results of elections, the decisions of courts, the reports of media outlets, and the findings of scientific disciplines because citizens believe these institutions are committed to the truth. Once people think that power is the only reality, they lose trust in all these institutions, democracy collapses, and the strongmen can seize total power.

Of course, populism could lead to anarchy rather than totalitarianism, if it undermines trust in the strongmen themselves. If no human is interested in truth or justice, doesn't this apply to Mussolini or Putin too? And if no human institution can have effective self-correcting mechanisms, doesn't this include Mussolini's National Fascist Party or Putin's United Russia party? How can a deep-seated distrust of all elites and institutions be squared with unwavering admiration for one leader and party? This is why populists ultimately depend on the mystical notion that the strongman embodies the people. When trust in bureaucratic institutions like election boards, courts, and newspapers is particularly low, an enhanced reliance on mythology is the only way to preserve order.

Reading this book made me feel it even more keenly: the relatively stable condition of the world's nations today did not simply come about on its own. It also made me realize how mistaken I was to assume that the arrival of the latest IT technologies, such as the internet and AI, would push democracy in an ever more ideal direction. This passage from the book in particular drove the point home…

As humankind enters the second quarter of the twenty-first century, a central question is how well democracies and totalitarian regimes will handle both the threats and the opportunities resulting from the current information revolution. Will the new technologies favor one type of regime over the other, or will we see the world divided once again, this time by a Silicon Curtain rather than an iron one?

As in previous eras, information networks will struggle to find the right balance between truth and order. Some will opt to prioritize truth and maintain strong self-correcting mechanisms. Others will make the opposite choice. Many of the lessons learned from the canonization of the Bible, the early modern witch hunts, and the Stalinist collectivization campaign will remain relevant, and perhaps have to be relearned. However, the current information revolution also has some unique features, different from—and potentially far more dangerous than—anything we have seen before.

The main split in twenty-first-century politics might be not between democracies and totalitarian regimes but rather between human beings and nonhuman agents. Instead of dividing democracies from totalitarian regimes, a new Silicon Curtain may separate all humans from our unfathomable algorithmic overlords. People in all countries and walks of life—including even dictators—might find themselves subservient to an alien intelligence that can monitor everything we do while we have little idea what it is doing.

In 2016–2017, Facebook's business model relied on maximizing "user engagement". As user engagement increased, Facebook collected more data, sold more advertisements, and captured a larger share of the information market. In addition, increases in user engagement impressed investors, thereby driving up the price of Facebook's stock. The algorithms then discovered, by experimenting on millions of users, that outrage generated engagement. So in pursuit of user engagement, the algorithm made the fateful decision to spread outrage.

People often confuse intelligence with consciousness, and many consequently jump to the conclusion that nonconscious entities cannot be intelligent. But intelligence and consciousness are very different. Intelligence is the ability to attain goals, such as maximizing user engagement on a social media platform. Consciousness is the ability to experience subjective feelings like pain, pleasure, love, and hate.

When people can no longer make sense of the world, and when they feel overwhelmed by immense amounts of information they cannot digest, they become easy prey for conspiracy theories, and they turn for salvation to something they do understand – a human.

The problem is that algorithms make decisions by relying on numerous data points, whereas humans find it very difficult to consciously reflect on a large number of data points and weigh them against each other. We prefer to work with single data points. That's why, when faced with complex issues – whether a loan request, a pandemic, or a war – we often seek a single reason to take a particular course of action and ignore all other considerations. This is the fallacy of the single cause.

The truth is that while we can easily observe that the democratic information network is breaking down, we aren't sure why. That itself is a characteristic of the times. The information network has become so complicated, and it relies to such an extent on opaque algorithmic decisions and inter-computer entities, that it has become very difficult for humans to answer even the most basic of political questions: why are we fighting each other?

The closing passage stays with me. – Every old thing was once new. The only constant of history is change.

However I look at it, I seem to be living through a great transformation triggered by AI. I believe that, looking back someday, I will find that keeping this record was a meaningful thing to do.

Started writing 2025.1.31; finished reading 3.19.
