Happiness Is Wanting What You Have
Reading Ray Kurzweil's The Singularity Is Near: When Humans Transcend Biology, I came across this:
Once a planet yields a technology-creating species and that species creates computation (as has happened here), it is only a matter of a few centuries before its intelligence saturates the matter and energy in its vicinity, and it begins to expand outward at at least the speed of light (with some suggestions of circumventing this limit). Such a civilization will then overcome gravity (through exquisite and vast technology) and other cosmological forces—or, to be fully accurate, it will maneuver and control these forces—and engineer the universe it wants. This is the goal of the Singularity.
That last sentence is a strong statement. Basically, Kurzweil is saying the ultimate goal of intelligence is to engineer the universe to match its wants. This reflects a very utilitarian, anthropocentric philosophy. I'm generally a fan of progress: reducing suffering, expanding scientific understanding, using our resources more efficiently, pushing new artistic boundaries. But does this really mean bending the entire universe to our will?
I stopped hard at the end of that paragraph. "This is the goal of the Singularity." In reflecting on this with my puny biological brain, I recalled something my dad liked to say when I was growing up: happiness is wanting what you have, not having what you want.
It's easy to toss this aside as a trite turn of phrase -- especially when you're a teenager and your dad has maybe taken advantage of it as a parenting tool -- but it's one of those sayings that keeps coming back to me. It very neatly gets at one of the fundamental notions of philosophy. Do you take a hands-off approach to your world, and smile upon it as a gift, despite all the things that don't work to your advantage? Or do you scrutinize your world, see how it could be better -- according to your own judgment -- and work to change it?
I go back and forth on this. On the one hand, it gets under my skin when people try to put a stop to progress because it's "unnatural". What's natural? There's no natural or unnatural. Everything just is. The question of right or wrong should never be based on this completely fabricated notion of what's natural. On the other hand, I see the cost of progress, and I cringe at what we humans have done, and continue to do.
Humans are in the midst of the Sixth Extinction, and the blood is on our hands. We didn't set out to drive thousands of species to extinction, but it seems to be the unavoidable outcome of our growth as a species, both in terms of population and of technology. Can we stop the Sixth Extinction? Given our track record, it's hard to imagine that we could. But real AI is around the corner, and a completely mind-blowing explosion of intelligence will ensue. Perhaps there's some hope, as humans take advantage of and ultimately merge with AI, that we will find ways to live better and achieve progress while at the same time reducing and even reversing the damage we do. (When I mention reversing, I'm thinking of the potential to salvage genetic material from extinct species and reintroduce them, or replicate them in v-worlds. If the term v-world is new to you, or merely intriguing, you would probably enjoy my novel, Upload.)
However, if we take the utilitarian, anthropocentric view proffered by Kurzweil, which is essentially a selfish "have what you want" philosophy, the extinction of species on Earth will be nothing compared to the harnessing of an entire sun's energy, or the ultimate conversion of all matter and energy into one unified effort to maximize the computational capacity of the universe.
Someday, left to run its own course, the Sun would eventually consume the Earth -- barring intervention on our part, or on the part of some alien species who has other plans for it. Similarly, the Universe would either melt into entropic nothingness or collapse into another Big Bang. I tend to favor the entropy model. And I like to think of the Universe as a seed pod, the nutrients of which intelligent life will consume, so we can ultimately spring forth into some new world, leaving the husk of the seed behind. Seen in this light, it's only natural we would destroy what the Universe has to offer en route to new life. How exactly we would outlive the Universe is very much a matter of speculation, but it seems like a nice optimistic "new beginnings" twist on what could otherwise be an existentially depressing storyline.
Like I said, I go back and forth on this. When I take the small view, I want to pursue a low-impact, very Zen-like state of detached love for all that is around me. When I think big, I think in terms of progress, scientific understanding -- perhaps even a clear, provable knowledge of how the Universe came to be, and what else is out there.
When a beaver builds a dam, floods a wood, and creates a pond, a lot of creatures are killed in the process. The homes of insects, worms, rodents, and other small creatures that lived near the dammed stream are flooded. Trees and other plants caught in the newly formed pond eventually die, and the trees that go into making the dam are obviously killed outright. Beavers do a lot of damage, but they also create new habitats. (One of my all-time favorite Scientific American articles mentions this in an analysis of the impact of wolves on Yellowstone: Lessons from the Wolf.) Arguably, the beaver has increased the informational complexity of its environment. There is loss and suffering, but there is also creation, and beauty.
Where do I ultimately land? There are days, when I learn of yet another truly ugly, horrid thing that humans have wrought upon our innocent planet-mates, when I think the Universe would be better off without us. Maybe there's some more promising alien species, who didn't rise out of war and hatred, greed and jealousy -- who didn't have such a shitty childhood. Let them lead the way. But until we know of such a more-enlightened counterpart, we're the best thing going. Maybe once we jump the rails of biological evolution, leave our lizard brains behind, and merge with AI to become a computer-based life form -- maybe we'll get out of our teenage years and become the loving, peaceful, long-sighted post-humans I dream we could be.
When I was in high school, I won an essay contest sponsored by the local Optimists Club. I've often thought that was rather ironic, given my penchant for seeing the dark side of our species. But maybe, in the end, that wasn't so far off. I look to the promise of greater intelligence to carry us away from our mean roots. Nearly all indications so far suggest that as our intelligence and education increase, we do get nicer to each other, but we also get faster and faster at consuming our resources and screwing all the other species on this planet. But maybe there's a threshold, and once we get past that, we'll actually find a way to be happy, continue to make progress, and enrich the world around us instead of depleting it. That should be the goal of the Singularity.
But what if my dad was right, and we'd really be happiest looking at the Universe as a gift, and going with the flow? We'll never know which is the correct answer. How frustrating is that? We only get to choose one path, and can never know whether the other would have been better.