Overweening Generalist

Showing posts with label forecasting. Show all posts

Friday, April 5, 2013

Stephen Wolfram's Model of Information

In the 1940s, John von Neumann and Stanislaw Ulam began playing around with the idea that natural systems could be reduced to simple rules and initial conditions, then played forward as a sort of cellular automaton. And I remember when I first read about cellular automata - James Gleick's book Chaos: Making A New Science had just come out - and it was filled with mind-spaghettifying ideas. Ideas like artificial life, the now-famous "Butterfly Effect," chaos mathematics and Benoit Mandelbrot and fractal geometry and fractal art, and much of it was way over my Generalist's head, but exciting. Cellular automata were in there. I had never heard of them.

                                                           Wolfram

Years later I picked up Stephen Wolfram's book after it came out in 2002: A New Kind of Science was about 1300 pages long, and was the manifesto of a guy who earned a PhD in particle physics from Caltech when he was 20, then received one of the first MacArthur "Genius" awards at age 21. This guy had a way to model just about everything: syntactic structures, social systems, particle physics. Just about everything. It turns out he was a big-time guy in cellular automata, carrying on in the tradition of another "Martian," John von Neumann.

Wolfram's math was over my head, but books like this make me excited just to be in the presence of this sort of compendious mind. It's the kind of book I take off the shelf and open at random and read, hoping for some sort of inspiration. It usually works. Wolfram models information in our world upon his forays into cellular automata, in which you have a very basic system under initial conditions, and watch it evolve. He developed a taxonomy of the sorts of systems that arise, which he called "Class 1," "Class 2," and so on. These first two classes exhibit a low order of complexity; they tend to reach a level of constancy and repetition that's sorta boring. There are no surprises. They go on and on, ad nauseam, or die. A system like this? A clock.

His Class 3 level I think of as "noise." You can't predict anything. It's seemingly entirely random, like being bombarded by cosmic rays. If there's any structure at all, it's too complex to discern. It seems akin to entropy. A system like this? Your TV tuned to a dead channel: all static and noise.

                                        cellular automata being simulated, played out. 

Wolfram's Class 4 is where the action is: these systems turn out lots of surprises. They're complex but there's structure; you can model from them and make a certain sense out of what's going on. Systems like this are intellectually exciting and basically describe any theory or "law" in the sciences. They're surfing the edge, almost falling into "noise" but never quite. It reminded me of Ilya Prigogine's ideas about complex adaptive systems and negative entropy, how life flourishes despite how "hot" it burns and uses resources. It creates information, structure, patterns, complexity. Indeed, Prigogine and Wolfram seem compatible enough to me...
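Wolfram's classes come out of his "elementary" one-dimensional cellular automata, which are simple enough to sketch in a few lines. The toy code below is mine, not Wolfram's; the rule numbers follow his standard numbering, and rules 250, 30, and 110 are commonly cited examples of Class 1/2, Class 3, and Class 4 behavior, respectively.

```python
# Elementary cellular automaton: each cell is 0 or 1, and a cell's next
# state depends only on itself and its two neighbors. The "rule number"
# is an 8-bit lookup table over the 8 possible neighborhoods.

def step(cells, rule):
    """Apply one elementary-CA rule to a row of cells (edges wrap around)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # a value 0..7
        out.append((rule >> neighborhood) & 1)
    return out

def run(rule, width=48, steps=10):
    """Evolve from a single live cell; return the list of rows."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        rows.append(cells)
    return rows

# Rule 250 settles into repetitive wallpaper (Class 1/2), rule 30 looks
# like static (Class 3), and rule 110 shows structured surprises (Class 4).
for rule in (250, 30, 110):
    print(f"Rule {rule}:")
    for row in run(rule):
        print("".join("#" if c else "." for c in row))
```

Run it and you can see the three temperaments at a glance: wallpaper, static, and something stranger in between.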

                                 Shannon's basic equation for information theory:
                                 world-shattering stuff, turns out
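For the curious: the equation in that image is Shannon's entropy, H = -Σ p log₂ p, the average information in bits per symbol coming out of a source. Here's a quick sketch of what it computes (my illustration, not anything from Shannon's paper):

```python
# Shannon entropy: H = -sum(p * log2(p)) over a source's symbol
# probabilities, measured in bits. A fair coin yields 1 bit per flip;
# the more lopsided the odds, the less information each outcome carries.
import math

def shannon_entropy(probs):
    """Entropy in bits of a probability distribution; zero terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # loaded coin: roughly 0.47 bits
print(shannon_entropy([0.25] * 4))   # four equally likely symbols: 2.0 bits
```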


My Other Information Systems
Probably because of my intellectual temperament - which includes not being particularly adept at math - I had always been very impressed with guys like Wolfram and what they were able to do with math, but I have also been suspicious that they're somehow operating from the conceit - or rather, the flawed assumption - that numbers can describe everything, and that everything interesting to us is really just stuff interacting with the environment and doing computations. I thought these weird math geniuses had become hamstrung by the computer metaphor, and as I saw how different the human brain was from what they had asserted it was - a "biological computer" - I felt my suspicions confirmed.

I remember Timothy Leary giving a talk in Hollywood. He had been reading a recent book and was very enthusiastic about it. It was titled Three Scientists and Their Gods, by Robert Wright. So of course I had to read it. It's about Ed Fredkin, E.O. Wilson, and Kenneth Boulding. Leary seemed taken by Fredkin especially. Fredkin's was an Everything-Is-Digital-Information-Processing view. Leary's psychedelic intellectual friend Robert Anton Wilson seemed interested in this view too, but never committed to it. RAW always seemed more committed to Claude Shannon's mathematical theory of communication - which is the gold standard for quantifying information - but Shannon's theory has information with no necessary semantic component; RAW made a heady brew from combining Shannon with Korzybski, who was all about semantics and our environment and how we make meanings.

Earlier, the originator of pragmatism, Charles Sanders Peirce, had developed a theory of semiotics that took into consideration the content of information using signs and a mind interacting with signs; he had begun to work out a system of defining, quantifying, and taking into account the evolution of a piece of information. This was the "pragmatic theory of information," but it hasn't gone all that far. Shannon's 1948 paper blew it off the map. But still, "information" had to have some sort of semantic component to it, or I had difficulty grasping it. Shannon's and von Neumann's and Fredkin's and Wolfram's and Leary's ideas about "information" felt too disembodied to me; my intuition told me this couldn't be right. But I'm starting to come over to their side. Let me explain.


Modeling Natural Processes
Via cellular automata theory and the gobs of other stuff a Mind like Wolfram has, he said you can only get so far by modeling life as atoms, or genes or natural laws or as matter existing in curved space at the bottom of a gravity well. More fruitfully, we can model any natural process as computation. Big deal, right? Yea, but think of what this implies: Wolfram thought we can model a redwood tree as a human mind as a dripping faucet as a wild vine growing along a series of trees in a dense jungle thicket in the Amazon. Why? Because all of these systems were "Class 4" systems, and these are the only really interesting things going on. All of these systems exhibit the behavior of "universal computation" systems. (If this reminds you of fractals and art and Jackson Pollock, you're right: I see all of this stuff as a Piece. And so, apparently, does the math.)

Also: you cannot develop an algorithm that jumps ahead to predict where the system will be at Time X; this follows from the undecidability results Alan Turing proved in 1936, and Wolfram calls it "computational irreducibility." You can't predict faster than the natural process itself; you have to wait and see what the system does. This blows to smithereens any Laplacian Demonic idea about knowing all the initial conditions and being able to predict everything. So guys like Ray Kurzweil - who has become more and more a sort of Prophet for quantifying the acceleration of information and making bold, even bombastic predictions about what will happen to our world, our society? Wolfram/Turing say no. There are no shortcuts, and our natural world is irreducible to anything close to Laplace's Demon. The system is too robust to reduce to even what Kurzweil seems to think it is. Robert Anton Wilson used the term "fundamentalist futurism" to criticize those groups of intellectuals in history that Karl Popper had called the enemies of the Open Society. I think the term may apply to Kurzweil too, but I'm not sure. Certainly it seems to apply to Hegelian historicism, most varieties of Marxism, Plato's Republic, and Leo Strauss and the Wolfowitz/Bush/Cheney Neo-Cons.

As I read Wolfram and Kurzweil, the latter seems to see our world, modeled within Wolfram's classificatory scheme, as something like a Class 2 system: complicated, but, if you know enough about the algorithm that undergirds the whole schmeer, fairly predictable.

Arrogance? Aye, but human, all too human, as Fred N. wrote.

                                                    Drew Endy, now at Stanford

Synthetic Biology
Leary, with his penchant for neologizing, had defined "contelligence" in his 1970s book Info-Psychology as "the conscious reception, integration and transmission of energy signals." There were eight fairly discrete levels of this reception--->integration--->transmission dynamic (modeled on the synaptic actions of the neuron). All well and good and trippy, but a team at Stanford led by Drew Endy has made a computer out of living cells.

Engineers at Stanford, MIT, and a bunch of other places have made biological computers. Do you know how a computer must be able to store lots of data? Well, it turns out storing data in DNA is insanely, wildly do-able and has more storage space than you can imagine. Perhaps you heard that some more of these everything-is-a-computer types stored all of Shakespeare's Sonnets in DNA. But that's small taters: it looks like we'll be able to store entire libraries, TV shows, movies, and CDs in DNA. Read THIS and see if you don't feel your mind getting a tad spaghettified.

So: a silicon chip uses transistors to control the flow of electrons along a path; Endy and his team at Stanford have developed a "transcriptor" to control the flow of RNA polymerase along a strand of DNA, using Boolean Integrase Logic gates (or "BIL gates" - so there's your geek humor for the day!). Endy says their biological computers are not going to replace the thing you're using to read this, but they will be able to get into tiny, tight quarters and feed back info and manipulate data inside and between cells...something your Smart Phone cannot do.
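The "Boolean" part of Boolean Integrase Logic is ordinary logic-gate behavior, which you can caricature in a few lines. This is a toy model of my own invention - the gate names and the alarm scenario are made up for illustration, not Endy's actual genetic constructs:

```python
# Toy model of the logic a transcriptor implements in DNA: each input
# stands in for an integrase that has (or hasn't) flipped a control
# segment, and each gate's output stands in for whether transcription flows.

def bil_and(a, b):
    # Both integrases must flip their segments for transcription to flow.
    return a and b

def bil_or(a, b):
    # Either integrase alone opens a path.
    return a or b

def bil_not(a):
    # A flipped segment blocks an otherwise open path.
    return not a

# With AND, OR, and NOT you can compose any Boolean circuit, which is the
# sense in which a cell carrying transcriptors is a (tiny, slow) computer.
# Invented example: flag a cell only if a toxin is present, the cell is
# dividing, and no repair signal is active.
def alarm(toxin, dividing, repair_signal):
    return bil_and(bil_and(toxin, dividing), bil_not(repair_signal))

print(alarm(True, True, False))   # True: worth reporting
print(alarm(True, True, True))    # False: repair is handling it
```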

Endy sees his biological computers as inhabiting a cell and telling us if a certain toxin is present there. They could also tell us how often that cell has divided, giving us early info on cancer, for example. They could also tell us how a drug is interacting with the cell, making therapeutic drugs more individually tailored.

In a line that reminded me of dear old Crazy Uncle Tim, Endy told NPR that, "Any system that's receiving information, processing information, and then using that activity to control what happens next, you can think of as a computing system."

For more on bio-computing, see HERE and HERE.

I'm starting to swing more with Wolfram. But there are many other little snippets that are swaying me. I still like older forms of "information," more human-scaled and poetic and embodied.

But then there are the intelligent slime-molds, which I will leave you with. Grok in their fullness. Don't say I ain't never gave ya nuthin'!

How Brainless Slime Molds Redefine Intelligence.

Saturday, May 19, 2012

Obesity, OR: "Does Our Butt Look Big In That?" (Pt. 3)

A lyricist named Bernie Taupin once wrote this line in a song called "The Bitch Is Back," sung by Elton John:

"Times are changin' now the poor get fat."

And if anyone wants to know why or how this historical turn of events took place, it's easy to find out that our ingenious modern era with its manipulation of science and technology has produced food at a level to mock Malthus, and cheaply, too. (In the rich countries.) Evolutionarily, for 99% of the time we've been homo sapiens it's been a real slog to capture enough calories and eat a diet with enough protein, fat, and carbs to keep us going, and the average life expectancy had risen to the unheard-of high of 38 years in Unistat by 1850. Evolutionarily, we were pretty much programmed to die by 40. Why sit around as old people and use up the tribe's precious resources? Just for your stories and wisdom? Write that shit down, grandpa, and die already. You're taking up space and it's been at least five years since you used the plow worth a damn.

The Reverend Thomas Malthus was a catastrophist. If you were around when he was doing his version of what Sir Martin Rees is doing now, and you were prone, let's say, to "pessimistic thinking," you might have thought him a prophet. Basically he said we humans reproduce at an exponential rate, while food production grows arithmetically. It was only a matter of time before famines became common and quite widespread. Malthus was a Man of God, too...No wonder his outlook was so prone to bleakness...(I tend to listen worriedly to Sir Martin Rees, though, truth be told, but that's for another blogspew.)

                                    Reverend Malthus, 1766-1834. Sociologist, economist,
                                                         pessimist.

All too human, Malthus did his futurology and prognostications while living in what Nassim Nicholas Taleb calls Mediocristan: he was using far too simple mathematics and couldn't factor in something totally unknowable but fairly Black Swan-ish: we were able to harness mind-power to produce more and more food in smaller and smaller areas, and quicker and quicker, and then transport got better and faster, and refrigeration came into its own...another futurologist proven wrong. (Temporarily?)

But in the ultra-short period of, say, 125 years, this easy access to sugar (which was always hard to find for 99% of our existence), fat, and carbs - all delightful, life-enriching and acting on dopamine levels in the brain (AKA the "reward system") - threw us a curve. We didn't know how to handle it. And then other sciences and technologies combined forces and made our lives comfier and comfier, to the point where, very very suddenly, on our evolutionary scale, we sit around all day long, every day, and eat rich, fatty food. Meanwhile, our bodies are basically the same ones we had a million years ago. No wonder we're fat!

Now we are so rich we've extended the average lifespan to double what it was in 1850, and we're dying of degenerative diseases. Now the game is not predicting when the food will run out, but when we'll learn how to handle the food. And maybe our analytical tools are more sophisticated than Malthus's.

We saw in my last entry that the NCHS/CDC say the stats show the obesity epidemic is leveling off already. A recent mega-research paper predicted 42% of the Unistat public would have a Body Mass Index of 30 or higher by 2030, but we have reasons to doubt that. The CDC in 2003 predicted that by 2010 40% of the public would be clinically obese (BMI above 30); the number turned out to be 35.7%.

The British Dept of Health predicted in 1999 that by 2010 25% of Brits would be obese. They updated this prediction in 2006 to 33%. By 2010 the number was 26.1%. Fudge factors? Yes, all sorts of them. First off, of course, many who responded by admitting they were eating fudge as they spoke. Then again...

Some numbers were obtained by phone surveys, asking people how much they weighed, and people tend to prevaricate in that situation. Nonetheless, the numbers are probably pretty close. They have turned out to not be as bad as our best predictors predicted. Do the predictors have a vested interest in their High Numbers? Yes, probably. More money gets thrown at Public Health and obesity-related problems, and some of that money sticks to the predictors and their colleagues. But still: we have a long row to hoe, and it's not going to be easy.

                                Chicago-style deep-dish pizza: now qualifies as a "vegetable"
                                in Unistat schools, thanks to the Goliath food and beverage 
                                industry and their lobbyists. Man, this looks good right about 
                                                       now! Eh?

Why will it be difficult? Well, that too is a very complex problem, but if we look at the Goliath-like Food and Beverage Industry and what it can afford in lobbying Congress, versus the public interest groups that want to educate and restrict massive amounts of sugar and fat in schools, or curb advertising aimed at children, well, David gets stomped to death by Goliath like an ant. In the last three years, four government agencies sought to reduce sugar, salt and fat in food marketed to kids: Congress killed it. The Center For Science in the Public Interest - a bunch of do-gooders who object to 9 year olds who weigh 170 pounds already - spent $70,000 last year lobbying Congress. The Food and Beverage Industry spends that every 13 hours. Pizza is now classified as a "vegetable" in schools. According to this article from Reuters, the food/bev industry has never lost a significant political battle, and its tactics are the same as the tobacco industry's were: we're just giving people what they want in a free society. There's no real proof our food and drink is making people sick. They need to moderate their own intake, and exercise more. If you made a hefty paycheck working as a lobbyist for big Food and Bev, wouldn't you say that too?

Note that Ol' Captain Buzzkill William Dietz makes an appearance in the above-cited article: "This may be the first generation of children that has a lower life span than their parents."

Here are two classic takes on why we're fat, from different points of view. First, check out Professor Richard McKenzie, who may be getting some of that sweet Food and Bev money alongside his emeritus professor dough. Yes, we're fatter. On average, Unistatians are 26 lbs heavier than they were in 1960. SUVs were made for fatties. Gurneys have had to be reinforced, stadium seats widened. Because we're on average 26 pounds fatter than in 1960, we use an extra two billion gallons of gasoline and jet fuel. We create much more greenhouse gas and our medical costs have skyrocketed. But, as he argues in his book Heavy: The Surprising Reasons America Is the Land of the Free and Home of the Fat, it's all due to lowered tariffs, cheap imports, and "our growing economic freedoms," which go with political freedoms. No reason to change any of the freedom stuff! (I'll let you mull this one over on your own.)

I think it's a classic, valid libertarian view. There's much to say for it. I'm not completely sold on how we're economically freer now, though. But the freedom argument holds some appreciable weight (sorry!) with me. What I object to is the ultra-monied Food/Bev lobby and their louder bullhorns. They don't want frank education about food and what it's doing to us. For guys like McKenzie, money equals freedom, but I'd like more "freedom" for the educators.

                                     Jonah Lehrer, brilliant popularizer of neuroscience, 
                                     the latest psychology, and very creative science writer,
                                                born in 1981.

From Wired, here's a typically smart article from Jonah Lehrer. Why do people eat too much? Well, we're really bad at recognizing when we're full. (That long legacy of hungry homo saps.) Also, restaurateurs think we expect huge portions, and we probably do. So plates have gotten bigger and bigger. Serving sizes are up, Lehrer says, 40% over the last 25 years. We're prone to mimicking the behaviors of those around us. And yes, Big is Good. But why? Lehrer links this to primate status-seeking, which I find fascinating. The problem is: seeking high status by getting the big serving, we get obese, which lowers status. Talk about a vicious circle!

As always, Lehrer suggests a way out of the predicament: if we become mindful of the power/powerlessness module in our primate brain that links Big Food to High Status and therefore Power, we realize the folly. Mindfulness. It's a big theme in much of Lehrer's writings on neuroscience. But it's easier said than done.

In closing, I suggest we meditate - or ruminate? - a bit on the epigraph Jonah Lehrer uses at the beginning of his article, the quote from M.F.K. Fisher. Is it true? If so, how much do you think it explains about our obesity problem? Do you think some subconscious part of our brain tends to equate food with security, security with love, love with food?

Wednesday, July 6, 2011

Musing on Two Mad Scientists, Part Deux (of Deux)

"The future influences the present just as much as the past." - Nietzsche
---------------------------------------------
I can't recall at what age I had my first experience of deja vu, but I remember telling my aunt about the strange state of mind I had just experienced and she told me the name. Ahh...like my mom and dad's Crosby, Stills, Nash and Young record! But what did that music have to do with my experience? Oh, wait a minute: they're both about being trippy.



I see the term vuja de thrown around a lot more recently - usually by Business Gurus, who always want to find a new way to get us to "think outside the box," while I always protest the fact that you/we/they are in a BOX in the first place - and I'm pretty sure vuja de originated with the American philosopher Carlin, but now it's taken to mean something like, "the odd experience of feeling like you've just seen something new, even though you have in fact seen it many times before." But that definition fits jamais vu better, it seems to me. (Or is that really another packaging of deja vu? I feel like I just read somewhere...) Carlin said, in Napalm and Silly Putty, that vuja de was "the distinct sense that somehow, something that just happened has never happened before. Nothing seems familiar. And then suddenly the feeling is gone. Vuja de." (p.29)

Meanwhile, the comedian David Cross asserts that deja vu "is just the lazy man's version of telling the future."
------------
Check out Kurzweil, from February, 2009. It's less than 9 minutes, but gives the standard Kurzweil rap:


One thing I like is that Kurzweil has in the past few years been emphasizing what's so difficult to model in artificial intelligence: human embodiment, our humor, our methods for dealing with ambiguity, and our poetry, empathy, and loving affection. Nevertheless, he seems like a True Believer who must be taken seriously. And if you haven't already taken him seriously, I can pretty much guarantee that, when you do delve into his books, his thought, his lebenswelt (or "life-world"), no matter if you're enchanted or appalled or a combination of both or something in-between, you will be changed. I do not think we as a species are programmed to think about accelerating change such as this. Why would this type of thought come naturally? We've only had the light bulb for about 130 years! The dinosaurs had no way of knowing what an asteroid impact would be; we do know, but we don't know what accelerating technological change will do to us. I already see it carrying some of us away.
-----------------------------
On a certain level, I read about Moravec, Kurzweil, and many others like them, and it seems absurd. They point to computing technology's ephemeralization - Buckminster Fuller's coinage for the tendency of technology to get smaller, faster, more powerful, and cheaper - and to the fact that computers have already brought information to the poorest Africans, etc. But in the rich First World, fewer people can afford health care or their mortgage, the cost of living is rising, education costs have skyrocketed, etc, etc, etc: things seem to be getting worse. Oh, but the Transhumanists, Singularitarians, Extropians, and other (well-off) people have been saying it's going to get better because of all this movement towards the technological Singularity...just around the corner.

It reminds me of Christian evangelism. And the Singularity is the Transhumanists' Rapture. It's a more nuanced form than the 2012 Mayan calendar meme thing, but...I want to see a movement toward real social sanity. An economics "as if people mattered," as E.F. Schumacher said. Not an economics of growth for the sake of growth - growth for the Ruling Class, though the ideology of growth-for-growth's-sake is shared by the cancer cell, as Edward Abbey said - or Consumerism, which does not make us happy.

We have nothing but unsanity and out-and-out insanity now, with "free market" fundamentalists in banking/government/corporate power. And there are literally billionaires in Unistat who are clamoring for Obama to raise their taxes; they care about the future of the country, and Trickle-Down Economics has never worked; it truly seems like voodoo. Yea, verily, it is a sham. Yet Obama can't raise their taxes: the market fundamentalists have too much power.

(Do Not Adjust Your Mind: It Is Reality That Has Malfunctioned. - Robert Anton Wilson)

And this Voodoo Economics (the term was used by George H.W. Bush, when he was running against Ronald Reagan, but when Bush was chosen for VP he shut up about it) market fundamentalism is taken seriously by "experts" in High Places. When the government bails out a firm like Goldman Sachs, who played a large part in ruining the world economy in the first place, and no one goes to jail...how is someone like me supposed to take Ray Kurzweil seriously?

But I digress...
--------------------
Another way to unpack the Singularitarians - one possible model - is that, looking at the long clock of human history, or even just since the last Ice Age/the Holocene Epoch - a very wild concatenation of things happened, and suddenly, we were the species with these huge brains that we didn't really know how to operate. And, in a mere 250-300 years after we figured out how to do "science" well, we have probes leaving the solar system, we are on Mars, we have Internet, jet travel, etc. It's really, really RILLY weird. And: is this what ordinarily happens on a habitable planet? Will we annihilate ourselves somehow, because we didn't know how to keep emotional pace with our technology? Will Mother Nature shake us off like a bad case of fleas? Has nuclear annihilation happened countless times on countless planets in the multiverse? Is there a skeleton key for getting out of this?
----------------------
Is intelligence..."evil"? (Sad idea: I personally consider it "sexy" but there may be a considerable downside in the wrong hands?) I consider the word "evil" mostly as a religious construct, but for our speculative purposes, let it fly for now. Is the vision of downloading ourselves into silicon cyborgs so we can live forever and explore the galaxy and other worlds...is that a valid script for us? Is that where all "this" was heading? The first 99% of our time as homo sapiens we run around forests, party, and die by age 30 in a loincloth...then in the last 1% we get...what we read in those various books of World History? And then in the last 1/10 of that 1% we download ourselves into "machines of loving grace" and blow this popsicle stand of a planet for better digs in Eternity? I can't buy it. Or rather, it's not my Main Model as a model agnostic. Here's why:

Look at the basic software problem. Now, I am no computer scientist (far from it!), but I understand that a basic problem with software is the legacy of earlier software. Everything gets built upon existing architecture, and, even by the time of Windows 95 Microsoft could not have possibly checked it all out for bugs; they knew that when millions of people started using it and bugs became manifest, then they would patch and do triage. And that was 1995! Everything is ultra-complex now, but brittle. Stuff crashes. Can you imagine the quality of the crash when people are beginning to be concerned with attempting to download the information in their nervous systems? And I'm not even going to mention the gangs of neo-Marxists and green-anarchist hackers who want to screw it up for those Elites who will want to go silicon-transhuman. In addition, George Carlin would like to remind us: "We will never be an advanced civilization as long as rain showers delay the launching of a space rocket."

Secondly, there are some forecasting obstacles that seem impossible to deal with. We are dealing with complex adaptive systems, and try as we have, there seems to be no overarching theory for how to deal with them. Immune systems and stock markets are horribly nonlinear. The massive feedback loops in all of these systems mean that adaptation and learning take place at the same time. Self-organization, chaos, strange fractal attractors, frozen accidents, lever points: Kurzweil says yes, but...look at my Law of Accelerating Returns! Look at my charts!

Yes, but: Thirdly, the idea that information has been doubling every 18 months or so: the Law of Diminishing Returns could easily kick in. Some scholars in the field think it already has. We had the Wright Brothers, then steady increase in flight technology, then in 1969 a human walked on the moon...but no plans yet for a walk on Mars. Conditions change. We run up against new terrains, and it changes the equations. Maybe?
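That worry can be made concrete with a toy comparison: an exponential curve and a logistic S-curve that share the same early doubling time are nearly indistinguishable at first, then part ways badly. The numbers below are made up purely for illustration, not a model of any real trend:

```python
# Exponential growth vs. logistic (S-curve) growth with the same early
# doubling time. Extrapolating from the early data can't tell them apart,
# which is one reason long-range tech forecasts go wrong.
import math

DOUBLING_TIME = 1.5  # arbitrary units; illustrative only

def exponential(t):
    return 2 ** (t / DOUBLING_TIME)

def logistic(t, ceiling=100.0):
    # Starts at 1 and doubles at the same early rate, but saturates
    # at a carrying capacity ("diminishing returns").
    r = math.log(2) / DOUBLING_TIME
    return ceiling / (1 + (ceiling - 1) * math.exp(-r * t))

for t in (0, 3, 6, 12, 24):
    print(f"t={t:>2}  exponential={exponential(t):>8.1f}  logistic={logistic(t):>6.1f}")
```

At t=0 the two curves agree exactly; by t=24 the exponential has run into the tens of thousands while the logistic is flattening against its ceiling.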

Fourth: My intuition tells me the whole Singularity scenario is too redolent of Cartesian ideas about disembodied minds, in addition to my suspicions about the Christian evangelizing mode writ into Technology. This is just a personal hunch; it might be wrong. (Could Kurzweil, et al. be doing Ironic Science? Just a thought.) Nassim Nicholas Taleb has done exhaustive research on the basic fraudulency of forecasting, especially in economics and banking. He distrusts "experts" as I do. This is a simple part of my stance in my Number One model for thinking about the Technological Singularity, remember. (My fourth model is: they're right.) Taleb says in What We Believe But Can't Prove that we have an overestimation of knowledge in the social sciences. This would impinge on the totalizing metanarrative of the Singularitarians. "It is said, 'The wise see things coming.' To me, the wise are those who know they cannot see things coming." (pp. 199-200)
---------------------------
For a fun time: watch a documentary from a couple years ago called TechnoCalyps. I found it on YouTube. It should fan the flames...
---------------------------------------------
To say that Mad Scientists like Kurzweil and Moravec have delusions of grandeur seems far too easy to me. The ancient Chinese alchemists and the Epic of Gilgamesh are largely concerned with attaining immortality. It's clearly in our script. But is there a defect in our wiring? I have to admit, as I get older, I'd...rather not get older. I want to be as young and healthy as I can be. I want my memory and intellect to...not falter. I'm excited for what they will have to offer there. But I'm not sure about immortality. It seems like a Poet's Dream. And indeed, another way to look at Kurzweil, et al.: they are poets and don't realize it....
-------------------
As of today, subject to change with new information, etc: My Top Four Models For Thinking About the Technological Singularity:

1.) Sure we have Moore's Law, but we also have Murphy's Law: the rich will control all this great technology; we already see a basic emotional indifference by the rich towards the rest of us, and they have gotten richer since Reagan while everyone else has stayed the same or gotten relatively poorer. Applying to sociology the same exponential curves the technophiles use, this will continue. The rich will inherit the Earth and Space; the rest of us will be lucky if they don't exterminate us, or as the CIA says in its assassination manual: "terminate with extreme prejudice."

2.) The basic Terminator scenario: the super-intelligent robots wipe the humans out.

3.) A super-super robot freaks out and kills not only what humans remain but all of the merely super-robots.

4.) The utopian scenarios of Kurzweil, Moravec, et al. We can choose to not download ourselves into silicon; others will want to. I want to take a pill to wipe out any disease and add telomere length to my genes without any untoward effects. The organ cloning and implantation thing that is same-day is great! I feel 28 again...and in some ways, I literally am! Because nanotechnology has gotten so good, it can assemble gold bars, so money doesn't mean much anymore; the things we need to exist are plentiful and dirt-cheap, etc, etc, etc. We figured out a way to reverse global warming, and we bioengineered bacteria to eat pollution. Many people have left the planet to live in L5 space colonies, now that they're safe. I can jack into a tiny implant and learn Chinese immediately. It's great! You should see this place! Anyone can live in any way they want, virtually...and I mean that literally.
-------------------------
I welcome your take!
-------------------------
Professor Carlin had a forecast about the future and human longevity:

"The human life span will be extended to 200 years, but the last 150 will be spent in unremitting pain and sadness." - p.93, When Will Jesus Bring the Pork Chops?


On that note: Onward and upward!

Monday, May 30, 2011

Forecasting and Futurology on Memorial Day in the U.S.

Today Unistatians (i.e., the People of Unistat, more commonly known as "the United States," and where I happen to have been born and "raised," and within whose borders I currently reside) have a national holiday to drink, fight, watch sports, and air grievances with family and friends - among other things - in memory of people who died while fighting in one of Unistat's military services. For most Unistatians, the "reasons" why those people went to war and died in the first place are naive, embarrassingly wrong, or opaque. But fuck it! Let's parTAY!

Certain thinkers since, oh, let's say Diderot's Encyclopedia, have been wondering if we can somehow figure out a way to use rational-empirical physicalist methods to forecast the future. This kind of thing really got going with the advent of science fiction and brilliant Generalists such as H.G. Wells and Jules Verne. And it became a sort of cottage industry circa 1950. Futurology!

There seem to be two broad approaches linking physical processes to the acceleration of history and technology: 1.) thermodynamic measurements, and 2.) information theory and chaos mathematics.

The second one interests me more, mostly because, as a non-specialist, I find the Information Theory/Chaos Math folk have more engaging (and therefore plausible) narratives. (In my current state of ignorance it seems that thermodynamics and information intersected around 1944-48, with the work of scientific Illuminati like John von Neumann, Norbert Wiener, Erwin Schrödinger, and Claude Shannon. More some other day. When I've read up more on it.)

For the nigh-curious, here is a list of some Futurologists.

Info-flow and acceleration of technology (and madness, anxiety and reactionary politics?):
Around 1990, for example, the mathematician Theodore Gordon (for some reason left off the Wiki list) published a paper arguing that chaos increases as information flow throughout society increases, and that the two are probably intertwined. As I read Gordon now, he was saying that more and more Black Swan events would happen, and that they might be almost impossible to predict. In principle these events should become easier to anticipate, but because of the cognitive biases of "experts" and "specialists" they are not. The global financial meltdown of 2008 should have been the Final Bell to start thinking about World Systems differently, building in more robustness and minimizing fragility, but there is no reason to believe this is happening or will happen, given events since August 2008. (These "locked-in" cognitive biases of "experts" are a huge hurdle, seems to me.)
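The Black Swan point can be illustrated very loosely - this is my own toy sketch, not Gordon's or Taleb's actual model - by comparing a thin-tailed (Gaussian) world with a fat-tailed (Pareto) one. In the fat-tailed world, the single largest event dwarfs the average in a way that Gaussian-trained "expert" intuition never anticipates:

```python
# Toy comparison of thin-tailed vs. fat-tailed samples. The distributions,
# sample size, and Pareto shape parameter are all illustrative assumptions.
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def max_over_mean(sample):
    """How much the single biggest observation dominates the average."""
    return max(sample) / (sum(sample) / len(sample))

gaussian = [abs(random.gauss(0, 1)) for _ in range(10_000)]      # thin tail
pareto = [random.paretovariate(1.1) for _ in range(10_000)]      # fat tail

print(f"Gaussian world: max/mean = {max_over_mean(gaussian):.1f}")
print(f"Pareto world:   max/mean = {max_over_mean(pareto):.1f}")
```

In the Gaussian sample the biggest observation is only a few times the mean; in the Pareto sample one freak draw can be hundreds or thousands of times the mean - the statistical shape of a Black Swan.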

Moving along, ever-accelerating...

Sir Martin Rees has a sobering quasi-prediction for humanity's future, and if ya wanna, give yourself an intellectual thrill-chill and watch his 18-minute talk from July 2005 on this here:


(Much) more sanguine futurologists such as Ray Kurzweil think we will reach a point of "singularity." And it seems rather soon. Different singularitarians forecast different dates for a whole new ballgame: our genomics, robotics, computer science, nanotechnology, etc, will increase in their acceleration and everything will change so radically that people will look back on us at this moment and laugh at how "primitive" we were. This is too big to go into here, but I'll address my personal pixillation over their whole compelling narrative in a future blog-spew. Suffice it to say: when I read the Extropians and Singularitarians, part of me is very excited and enthusiastic if they're right, and another part of me (intuition?) is horrified and tends to encourage a spiral toward the anxiety pole, damn my overweening imagination!

Anyway, something's coming, and pronto! Heads up!

Finally, I leave us with a quote/forecast from Nassim Nicholas Taleb, author of The Black Swan and Fooled By Randomness:

"The twentieth century was the bankruptcy of the social utopia; the twenty-first will be that of the technological one." - found on p.31 of Taleb's The Bed of Procrustes.


[Irony? Taleb had a NY Times best seller with The Black Swan, but now when you type those words into your search engine, you get the first three pages covered with information about a film about...ballet? (Just kidding: I saw it and thought it intense and psychedelic-Jungian. But poor Nassim! Was it a black swan event for him?)]


P.S.: Have a great Memorial Day! And in memoriam, think about reading General Smedley Butler's short book War Is A Racket; it just may blow your mind while leaving everything else intact.