Have you read Yuval Noah Harari's Homo Deus? Perhaps you've read his Sapiens, in which he recounts the history of humanity to the present moment. In Homo Deus, he offers his thoughts on humanity's future. Although he makes many intriguing points, I focus on his idea that in the coming years, we will separate what we term consciousness from what we consider to be intelligence.
What does this mean? For Harari, it means that, over time, humanity will come to prize intelligence itself, that is, the ability to envision and manage an orderly society, over physical awareness and sentience. In other words, as he sees it, what humanity will value most highly in the future is maintaining social order, and if doing so no longer requires organic beings, then non-organic entities or algorithms will do just fine. Whether these entities have consciousness will not matter. Whether they are aware of themselves as selves will not count.
I suppose we could think here of the world of the movie The Matrix. And we'd probably be on target. Yet the issue goes deeper than a world controlled by machines. If we decide that consciousness is no longer critical to creating a society, why do we need to exist at all? Why do we even need to be human?
And what is human?
Although I applaud what technology can do for us, I also agree with Harari that if we do not temper our affection for technology with a firm understanding of what it means to be human, we will no longer be the humans we are today. Maybe this is for the best, maybe it is not. If you value your consciousness and awareness at all, however, it is a decided step backwards.
Ultimately, this way of thinking underscores the danger of subscribing, without thought of a spiritual reality, to a naturalistic Darwinian worldview in which humanity is nothing more than just another blip on the evolutionary screen: we're worthless.
And why not? No God, no divine image: did we ever really matter?