Overweening Generalist

Sunday, November 10, 2013

Robots and Technological Unemployment: Further Considerations

In my last blogspew I wrote a bit about all the ideas and rhetoric I encountered as a kid in the 1970s, reading books and magazines from earlier in the century, when said rhetoric was about the End of Toil. The End of Toil still seems possible, but we're stuck in a dumb game: be unfathomably rich, or live in a constant state of biosurvival anxiety over lack of money and the fear of poverty, homelessness, hunger, penury.

Maybe the ballsiest rhetoric about "all that" came from John Maynard Keynes in 1930, just as the Great Depression was setting in. In a short yet profound essay, "Economic Possibilities for Our Grandchildren," Keynes - a polymath - wrote that the end of the economic game was in sight, that many wouldn't know what to do with themselves, that working three hours a day is quite enough for most people, and that a few will know how to live a life of leisure - the goal of a true liberal arts education - while others will have a rough go of it.

"I feel sure that with a little more experience we shall use the new-found bounty of nature quite differently from the way in which the rich use it today, and will map out for ourselves a plan of life quite otherwise than theirs." - Keynes


Read this essay if you haven't before. (If it seems "tl;dr," skip to section II.) He says that within 100 years this end of toil would be possible. As I write: 16 years and change from now. Which reminds me of a couple of studies that came out in the last 18 months.

We must know something about where we've been in order to understand where we are, and where we might be going. 

Profs. Erik Brynjolfsson and Andrew McAfee
A 98-page book appeared around January 23rd of 2012 titled Race Against The Machine. In a stunning move by an increasingly lame TV institution, 60 Minutes actually did a segment about the technological unemployment that Brynjolfsson and McAfee had warned about, and allowed them on as talking heads. The segment featured much footage of robots in factories doing the work that humans previously did. Famous AI/roboticist Rodney Brooks is shown with one of his robots that can learn, pick up an object from the floor, work cheaper than a Chinese factory worker, be programmed by a human to do a new task in a matter of minutes, etc.

Everyone should have been talking and writing about Tech Unemployment after this, but few did. I think most of the population is clueless and in reactionary mode, while the Owner Class would rather the population not know what's going to happen to them. It was right there on your beloved teevee, people!

Here's the 14 minute segment, in case you missed it. That's Brynjolfsson in the pic.


In a blog post after the 60 Minutes piece ran, McAfee complained that other experts had misunderstood what they were saying. Near the end of the post he writes:

Previous waves of automation, like the mechanization of agriculture and the advent of electric power to factories, have not resulted in large-scale unemployment or impoverishment of the average worker. But the historical pattern isn’t giving me a lot of comfort these days, simply because we’ve never before seen automation encroach so broadly and deeply, while also improving so quickly at the same time.

Now: These guys are not my heroes. I've read their stuff. I object to their avoidance of the human questions of suffering under continuing austerity and the defunct neoliberal economic model. In the 60 Minutes piece McAfee is asked about the human fallout, and he acts befuddled, saying only that "science fiction" is his best guide. Maybe he'd get too much crap from colleagues if he brought up Universal Basic Income? If you look at McAfee's blog there's nothing there about what to do about human suffering (that I could see), and in his book with Brynjolfsson they stress more "education" and "entrepreneurship," which I find tin-eared, or just plain stupid. Look at the education system NOW, look at the debt...and where are these new jobs that people would be "educated" to do going to come from? Servicing robots? What a joke. You just spent a dense 90 pages writing about the inexorableness of machines in the workforce. Fer crissakes! Read the Keynes essay from 1930! (Maybe if you rise so far in academia that you teach at M.I.T. [Brynjolfsson] or Harvard Business [McAfee], you aren't required to address ideas of human suffering?)

Interestingly, if you read the comments to McAfee's blog post I linked to above, the UBI is mentioned. 

Matthew Yglesias of Slate is pro-UBI, but thinks the idea of permanent technological unemployment is a myth...because in the past when new tech revolutionized production, it created new jobs. Here's McAfee's rebuttal. I find McAfee persuasive here. Do you?

Worse than McAfee, to my eyes, is Brynjolfsson's TED talk. How wonderful! The solution to technological unemployment? Work alongside a robot with advanced Watson-like abilities! Because...it worked in chess.

This is just pathetic. I applaud these two geeks for pointing out the obvious rapid influx of technological unemployment. They are just silly asses when it comes to what to do about the human fallout, in my opinion.


                                    a still from Fritz Lang's 1927 film Metropolis

Profs. Carl Benedikt Frey and Michael A. Osborne
Both of Oxford. On September 17, 2013 they produced a paper, "The Future of Employment: How Susceptible Are Jobs To Computerisation?" They were motivated by a 1933 paper by Keynes about the possibility that machines will put most people out of work. They also cite Brynjolfsson and McAfee. They say 47% of all jobs in Unistat are at high risk of evaporating under computers/automation/robots/better AI systems...within the next 20 years. Round it out to 16 years and change, just to make it interesting?

Which jobs are susceptible to loss?
-transportation
-data crunchers
-logistics
-production labor
-office support/administrative support
-sales
-service
-construction
-machine operators
-crafts/repairs

Let's not even talk about booksellers, journalists, musicians, travel agents and a bunch more - who still exist! - but...you know what I mean?

Some things could slow or speed up the loss of these jobs: regulation of technologies as they come online, and access to cheaper labor. In a paper, Frank Levy of M.I.T. and Richard Murnane of Harvard address the types of jobs that will be lost to robots: "Each of these occupations contained significant amounts of routine work that could be expressed in deductive or inductive rules and so were candidates for computer substitution and/or offshoring."

A.I. has gotten better and better at pattern recognition/machine learning and crunching Big Data, so a lot of clerical and administrative jobs are on the way out. 

Computerization will hit a bottleneck or technological plateau, then A.I. will be so good that it will replace most of the jobs in management, science, engineering, and even (this one really gets me) the arts. 

Look at the jobs not susceptible to automation. They mostly suck; the post-war boom and middle class labor movement seem a thousand years ago. The jobs that are hard to replace with a robot are low-wage: buildings and grounds maintenance, housecleaning, food preparation (although I've seen robots on video...nevermind), personal services like doing manicures and haircuts, personal care of the elderly (although I've seen videos of robots doing this work...nevermind), or any job involving abstract, unstructured cognitive work that's hard to write code for. And even with these jobs, software like Network Manager is often used.

Frey and Osborne advise more education to do the sorts of jobs robots can't do: "Acquire creative and social skills," they say. Is it me or is this just fucking ridiculous? It's almost worse than Brynjolfsson's "work side by side with the robots!" Just acquire social skills! Just learn to be more creative! 

Do these academics ever leave the Ivory Tower and talk to strangers in the streets? Message to Frey and Osborne: you were spurred to write your paper by a 1933 paper by Keynes. Please re-read his 1930 essay, "Economic Possibilities for Our Grandchildren," and get real: work is, for the most part, something we as a species should try to cure. It's a malady. We have the cure in our sights. It's all about sharing the wealth enough so we can not be burdened with biosurvival anxiety and drudgery. And you'd be surprised how many of us know how to handle leisure. We will still "work," although we may not consider it that. Work may "be" play, but it will be productive. And how many boons have come to humanity when people saw some little problem that needed to be overcome, had the time to tinker, to "screw around," and made a contribution to humankind? Answer: almost all boons...

According to Frey and Osborne, only those occupations that require a high degree of creativity or "social intelligence" or other advanced skills can resist the rise of AI. I saw one paper - my notes are scattered so I can't say where - but two jobs that will last for awhile were (I'm not kidding and if any of you challenge me on this I will find the source to prove it): CEO and poet. 

This is where we're headed. And sooner rather than later, friends. It's time to think about what Life is for. Is it to compete in the rat race so you don't have to live under a freeway overpass? Or is money different than wealth?









Wednesday, November 6, 2013

Rise of the Robots and Technological Unemployment

When I was in grammar school and high school I'd often ditch class and go to the library. One thing I'd learned was good for laffs and the imagination: looking at microfilm of old Life magazines, or, if the library had bound volumes of a whole year of some old magazine, reading those. Ads in magazines like Collier's showed a doctor saying he preferred these cigarettes over all others because of their fine, smooth taste. His stethoscope around his neck, smiling. Wow! How things had changed since...1952!



Always wondrous were ads for gadgets that would eliminate drudgery and free up the woman of the house (it was always a woman) to live a life of leisure. The rhetoric of machines that would eliminate soul-numbing work captured my attention at a very early age because all you had to do was extrapolate...wouldn't it be cool if dad didn't have to go to work and he and mom would be there when I got home from school...doing...whatever it was they wanted to do? What would my world be like when I was an old man of 30?

As I began to study the history of the Industrial Revolution up to present days, I found this rhetoric of labor and machines a constant: at some point in the future - possibly my own future - we would enter another Epoch: robots and computers (same thing) would do all the horrible work, leaving humans to create, socialize, dream. How would the bills get paid? I didn't know, never having paid bills. I figured the money went to others...who worked. But: their work would have gone away too, right?

Everyone would be playing games, painting, writing poetry or learned papers and books, learning new languages or music, or joyously goofing off.

         "Because everything in her house is waterproof, the housewife of 2000..." Wow!

It doesn't look like it's going to happen like They Promised, does it? Why?

Well, the simple answer: instead of the populace understanding that any machine that puts people out of work was invented not only by a genius and his team, but that the genius and his team built upon millions of hours of previous work by toilers and tinkerers and basic scientific research funded by everyone - all of whom were supported by farmers and mothers - we instead allowed the idea that whoever could buy the biggest and fastest machines owned All Of That.

There seem to be a few hundred choice entry points to tell this story to myself and y'all, but for now I'll cut to December of 2012.



Paul Krugman
In one of his shorter posts for the NYT, Krugman published "Rise of the Robots" on December 8, 2012. He notes that the "college premium" had been stagnant for a few years. In other words, the payoff for getting a degree was not showing its previous earning power in the marketplace. When he first started writing about income inequality twenty years earlier, it was about the gap between laborers and CEOs and other assholes, like hedge fund managers. Now it seems to be between workers and capital...and OMG Marxism! The dreaded Karl Marx, hibernating for a hundred years, suddenly stirs. Production rises, income of labor stays the same and then begins to lose. Why? Automation. Read the article. "If this is the wave of the future, it makes nonsense of just about all the conventional wisdom on reducing inequality." Education won't help when what we really have are a few people who own machines. The biggest and fastest machines. Those with the biggest and fastest machines are reaping all the rewards; everyone else gets the shaft. You buy the biggest machines, you pay 100-1000 of the brightest PhDs to collect data, write algorithms, maintain the data servers...you win! Everyone else is fucked.

Jaron Lanier
Jaron Lanier, computer whiz/prodigy/generalist/genius, says he was there (and he was, as numerous books on the history of Silicon Valley attest) when this really got going, and he and his famous friends thought it was going to be this incredible "information is free" thing that would make everyone's lives better. Now he says they were horribly wrong. Because it turns out that the NSA, Wal-Mart, Facebook, Goldman Sachs...all bought the biggest, fastest computers and hired an army of gifted geeks. He has ideas about how to save us, and I think they're good to begin our thinking with.

I've followed Lanier's career for a long time. I think he's one of the best and most interesting thinkers in the world, but rather than talk about his ideas, I'd rather you took the time to watch what he's saying about the existential situation we're in now:

Here's 4 minutes on "Why Facebook isn't free."


Here he is interviewed by Andrew Keen, about Lanier's book Who Owns The Future? It's about 10 minutes and 40 seconds:


Finally, for 27 minutes or so - I think you'll find it well worthwhile - he's interviewed about his books and his changed thinking and what we might do to remedy this "jobless recovery" situation. NB around 5:20 to 6:00, in talking about the structural changes from Kodak to Instagram: "We pretend that the people who do the work don't exist." Another notable moment: from around 8:00 on: "honesty in accounting" could solve the mess the middle class is in. Also a fascinating point: around 11:30: "levies" and their history:


I have a bee in my bonnet and I'm afraid you're going to be hearing more from the OG on income inequality, American fascism, mob mentality, robots/automation/computers, Real Wealth vs. Money, the college loan bubble, Missing Public Discussions, the social fallout of Winner-Take-All Hypercapitalism and Privateering, and ideas about how we might extricate ourselves from rising misery.

Tuesday, October 29, 2013

Synthetic Biology: Potentials Perilous and Promising

"Synbio," or synthetic biology, is here. It's alive!:


It's already been three years since Craig Venter's team made a species that was self-replicating...and its parents were not a mom and dad, but a computer.

In 2003 the human genome was sequenced. It cost billions of dollars and took up the energies of people in over 160 labs. Now you can buy a sequencing machine for a few thousand and sequence your own genome overnight. Or pay 23andMe $99. By this time next year it'll probably be half that.
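That price trajectory is plain exponential decay, and the arithmetic is worth seeing. A minimal sketch in Python - the one-year halving time and the $99 starting price are just this post's figures, not real market data:

```python
def projected_cost(current_cost, years, halving_time=1.0):
    """Project a price that halves every `halving_time` years."""
    return current_cost * 0.5 ** (years / halving_time)

# Using the post's figures: $99 today, halving roughly every year.
one_year_out = projected_cost(99, 1)    # 49.5
five_years_out = projected_cost(99, 5)  # about $3
```

(Compound halving is the same arithmetic behind Moore's Law; whether sequencing prices actually keep halving yearly is anyone's guess.)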

Synthetic biology, according to Venter, will change everyone's life at some point. Its upside: we can make microbes that eat carbon dioxide. We can generate flu vaccines almost overnight. Tiny critters that generate clean biofuels that are cheaper and as efficient as fossil fuels seem possible. The brilliant Drew Endy of Stanford is gung-ho about genetic engineering and synthetic biology, claiming it already constitutes 2% of the Unistat economy and is growing at 12% annually.

Venter commissioned a panel to study the potential issues in public health and national security arising from synbio. Two big problems jumped out:

1. Synthetic biological work had become so cheap that most of the people doing it weren't even trained biologists, so there was no consensus about standards, ethics or safety.

2. What standards governments and international bodies did have were ten years old, and so may as well have been 100 years old.

You're probably wondering what I'm wondering: when will someone get hold of some genome of a relatively benign virus or bacteria, tweak it using known methods, then use it as a bioweapon?

You can email a genetic sequence to someone else. You need to buy a few things to tinker with, but it's doable. I'm trying to spook you for Halloween. Is it working yet?

In the 18th century, Giambattista Vico, countering Rene Descartes, asserted that humans can only know what they have made. True understanding can only come from something the mind makes, and Descartes's notion of "distinct ideas" in the mind as a basis for philosophy was flawed because we did not make the mind; Descartes was doing metaphysics. Vico called his principle verum factum: that which is true and that which is made convert into each other; anything else is an abstraction. (I linked Vico's idea to Niels Bohr's Copenhagen Interpretation of quantum mechanics HERE, in case anyone wants to see how bent I can get.)

Back to biology: there's GOF - Gain Of Function - research, which is also growing at an exponential rate, or at least ultra-quickly. Here's how it applies to the Pandora's Box of synbio: biologists attempt to combat some potentially horrific pathogen by creating it in the lab, so they can then figure out a way to develop a vaccine for it. We can only know what we have made, as Vico said.

At a conference for scientists a researcher said that he'd tinkered with the H5N1 virus then being talked about as a potential killer of millions, if it mutated. It's an influenza virus, and he tinkered with it so that one host could infect another via transmission through the air. Then another researcher said he'd done the same thing. They both published their papers, in the bigtime journals Science and Nature. They knew what they had done could be interpreted as reckless, and indeed: both journals were persuaded to omit the part of each biologist's work that detailed the techniques by which they took a dangerous virus and made it far more dangerous, because who knows which band of deranged and sick mo fos would read this stuff and get ideas? And carry it off? (Besides The State, of course, by which I mean Google "Tuskegee Syphilis Study.")

But...can you really keep info under wraps? ("Paging Mr. Snowden! Mr. Edward Snowden; Please come to the white courtesy phone...")

In reading about uncooperative governments (SARS in China, anyone?), the paranoia about Western governmental power (read up on Indonesia and its lethal H5N1 outbreaks), governmental snafus, international differences between countries, and just how hopelessly behind the curve biosecurity experts are in Unistat alone...I'm not sanguine, friends. It's only a matter of time. Let us pray the international bioterrorists make a crucial mistake and the deaths are limited.

However, when it does happen? There's nothing more paranoia-inducing than a massively-social-mediated group of people terrified of the invisible death-bringing entities that may be in the very air they're breathing. All bets are off, and it seems just the thing to get Ted Cruz elected President. (Then: watch out, "liberals": all that NSA data could be gunnin' fer ya!)

With seven billion on the planet now, even if a pandemic arose "naturally" and killed off 3-5% of the population (like the Spanish Flu of 1918 did), how much more paranoid are we now than then? Many people who didn't die will go to their grave convinced the Other was responsible...

I hope you're scared now, or I'm not doing my job, on this, October 29...

The old Biology: you observed life from outside that life, wondered about details and behavior and then dissected to see how it worked, or placed the life in some environment and observed.

The new Biology: you're an engineer: you know the life-form because you created it, from genomic information and computer models. Now you watch to see how it plays out. If it moves, eats, respires and replicates, you've created a new species!

So...yeah. The scary part is anyone with a serious political beef, or simple hatred, can align with others and send away for stuff and do what's called 4-D printing: those microbes that were just info on a screen are now ready to be released into your enemy's territory. You send away for stuff, you use steganography (al-Qaeda left a code in a porn video). Sequencers are cheap. The data is there. One lingering problem for would-be attackers: many biotech companies keep track of "nucleotides of concern": any known dangerous sequences are tracked: who is it that wants this info?

So: we have bioterror security experts who aren't sure how to determine threats, or whether a threat is all that important; they don't know how to surveil those who'd go the whole nine and release something unspeakable, and they're not sure how to combat the pathogens anyway. Supposedly the international community is getting its act together along these lines. But...let's recall a sobering fact: in 2002 at SUNY Stony Brook, researchers took the genetic code for polio and made that virus. Because...verum factum, and Gain Of Function (GOF). If we truly know these bad boys we stand a chance of combating them when they come at us.

And let's not forget that in 2005 researchers sequenced the 1918 Spanish flu virus that killed 50 million people. They sequenced it...and then of course they made it. And the speed and cost of doing this is becoming ever-quicker and ever-cheaper. Just think: the Spanish Flu killed 50 million, but its lethality was only 2.5%. On the other hand, the H5N1 killed 59% of the people it infected. Can you imagine a huge batch of H5N1 tweaked (like two researchers have already done) to become transmissible via air?
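The fear in that last question is easy to quantify. A back-of-envelope sketch in Python, using only the figures quoted in this post (50 million dead at ~2.5% lethality for 1918; H5N1's reported ~59% lethality) - these are the post's round numbers, not careful epidemiological estimates:

```python
def deaths(infections, case_fatality_rate):
    """Expected deaths for a given number of infections."""
    return infections * case_fatality_rate

# 1918 flu: 50 million dead at ~2.5% lethality implies ~2 billion infections.
implied_infections_1918 = 50_000_000 / 0.025

# The same scale of infection at H5N1's reported ~59% lethality:
hypothetical_h5n1_deaths = deaths(implied_infections_1918, 0.59)  # over a billion
```

(For what it's worth, 2 billion infections would exceed the world's 1918 population of roughly 1.8 billion, a hint at how loose these round figures are - but the order of magnitude is the point.)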

(By the way: now is not the time to read this article about how Unistat labs are insecure. Just don't read this, or it might even bum your Halloween.)

Other Bad Signs: in Unistat the CDC and NIH don't have the infrastructure to develop massive amounts of vaccine for something that might appear. How many would need that magic shot or pill? Not as many as those hundreds of millions who'd take Lipitor or Viagra, paying for it all and making investors happy. Big Pharma is in the Big Money game; they cannot afford to spend an estimated $700 million to $1 billion to develop a vaccine or pill, when maybe after the bioterror attack quarantine and international cooperation stops the spread. There's no money in that! (SARS was stifled largely because of quarantine and cooperation.)

To sum up: synbio offers incredible promise, but just one really "successful" bioterror attack by angry young men who take their own version of a merciless God and some old border dispute very seriously...and life on Earth will have truly changed, and not in a good way. Because we have cops and monitors on one hand, but cheap technology, sheer fluid-like information and motivated ingenuity on the other hand. (Please make sure you wash both hands, thoroughly, when you're done reading this morose report.)

Dr. Frankenstein's imperative makes every day from here on out all the more fraught with drama, eh?

Happy Halloween! Muahahahahaha<cough>ahamuahahaha! Okay that's it: I may have failed to scare you, but in writing this - consulting 13 articles and taking notes - I've grown pallid, anemic and weak in my anxiety attack, and it further sickens me to say, "Well, I just hope that all happens after I'm dead and gone, 'cuz..." What kind of morality is that? It's like saying, "I hope all-out nuclear war happens after I'm dead, while your children are still around to experience it."

Now if you'll excuse me. I need to go rest. Oy! (No, but seriously: don't drink and drive on Halloween.)



Wednesday, October 23, 2013

Metaphysics and Overspecialization: A Meander

Woody Allen once talked about the time when he was expelled from college because he was caught cheating during the final exam in Metaphysics, when he "looked into the soul of the boy sitting next to me."

Aristotle
The subject of metaphysics is something I will never truly understand, but I'm cool with it. It's a blast to try to understand. There are many, many metaphysical roads to take from Aristotle, who is generally credited with being the first to tackle metaphysics as a "science" or a topic in philosophical thought, even though he didn't use the word, apparently. Along with a lot of hand waving and trying to field answers from students about "ultimate things," he variously called what his translators have labeled as metaphysics "theology," "first philosophy," "first science," and "wisdom." When I read him, he wants to get at the "first cause" of things. He wants to talk about ontology, or the Being-ness of stuff. (Kant put epistemology as the "first philosophy," but I'm getting ahead of myself.) The part that has most intrigued me lately is the search for that which does not change. Given my understanding of physics, I'm not sure there "is" anything that does not change, but it's an interesting idea to think with, and I guess Aristotle was influenced by Plato here, at least a little.

The semantics of "metaphysics" among non-professional philosophers (like the OG) has always seemed a mess, but as I get older, that bothers me less and less. Just think of what metaphysics implies: thinking about things that are above physics. Or beyond physics. It's supposedly a topic that addresses those things that have no mass, no atomic or subatomic structure and no energy. I usually see the topic as what Max Stirner called a "spook": we humans can make up all kinds of things and ideas that simply don't exist, and then reify those "things." And yet, as some sort of humanist type, this notion of metaphysics goes back so far...it's a part of us. And therefore, it can't be negligible, even if it's just made out of words.

Aristotle's origin of all things was with the Prime Mover. For a good time over the next month, mentally insert "Prime Mover" in place of "God" every time anyone says it or writes it...or you think of It. Report your results! ("For Prime Mover' sake! Put the toilet seat down once every blue moon!")

Words
I remember where I was when it happened. I was sitting in a room a few blocks from the Pacific Ocean, near the Los Angeles harbor. I was half-awake and listening to some scientist answer questions on the radio. He said something about language and neuroscience and I started to perk up and listen attentively: it turns out that language, our words, have physical status. It's hard to pin down, but neural imaging, studies of brain-damaged people, and our understanding of synapses and learning show that the words we use are all tied up with larger neural clusters (made of atomic goo and having some weight and mass) that have to do with our being human beings with bodies and living in a world with language...but the words themselves have some physical, ontological status, even if it's hazy and difficult to pin down. They're taking up neural space. It made sense: language does not Speak from On High to us. It's not "out there" and emanating from some Superior Being. It's a biological property, and for abstruse evolutionary reasons we developed it to a very high degree, compared with our other-ape cousins. And a lot of it seems localized in the brain - language, that is - and it's so enchanting to other parts of our brain that most of us seem to think that language "really does" reflect "reality." It "actually" maps anything "out there" into words, in a perfect fit. If we're stunned and "can't find the words to express...," it's only due to some temporary imbalance. Possibly of the humors. Or not enough coffee. Too much wine. Not enough sleep. You're freakin' stoned again?

Topics
What are our thought-chains that lead us back to privately pondering the origin of matter, what happens after death, whether there's a superior, even transcendent intelligence inhering somewhere, or a perennial favorite: why is there something rather than nothing?...or why we find ourselves getting all worked up over other peoples' answers to these questions? For me, often: mortality thoughts. And, especially in groups of friends and acquaintances of "curious, breathing, laughing flesh"(Whitman), we're already outside our ordinary "reality": drinking some wine or other inebriant accomplices, jousting with witticisms, stoned on weed, euphoric in music, coming down softly from a whirl of flirtation. I know these states get me going on some metaphysical topic, but often I keep it to myself. Although I do like to hear where you stand. And why.

                                                   Habermas

Jurgen Habermas (and Marx)
I see Habermas as the Noam Chomsky of the European Union, committed to rationality and saving Europe from monied interests. (To my German and other European readers: I apologize in advance for the paltry riffs on Habermas I'm about to play.) Habermas, now in his 80s, is still fighting for something saner. He's made splashes in legal theory, political theory, sociology, psychology. (Here's a blogger-champion to read on him.)

I first became interested in Habermas when I heard a lecture about his idea of an "ideal speech situation," which seemed to come out of his historical views on the rise of literacy and media, coffee houses and newspapers. Everyone should be allowed to speak their view, without fear of recrimination. Metaphysical appeals are the wrong way: we should talk about what's demonstrably "real." Only rational thought will save us. With enough of this massively democratic speech, the better ideas will out. I'm making this too simple to an absurd degree.

Anyway, since the early 1980s - when Habermas was advocating no metaphysics in public speech about our life conditions - he's gradually softened up. He now believes that the discourse of religion has its place in public speech in his massively democratic ideology. Even though he confesses he's "unmusical" when it comes to religion (borrowing a phrase from one of his biggest influences, Max Weber), Habermas thinks there's no getting around the impulse to religious thought. Even though he still seems to be an atheist, he's made amends with religion while trying to maintain his Kantian-Enlightenment rationalistic ideal-speech-situation idea.

Peter Berger reviews Habermas's slow move toward allowing religion/metaphysics. I agree that Habermas is like Edward Gibbon's magistrate, who finds the various religious beliefs of the populace "useful."

Being a carrier of the Critical Theory tradition (even though he had some cogent critiques of Adorno, Horkheimer, et al. and their opposition to "instrumental rationality"), Habermas is thoroughly steeped in Marx's ideas about religion: that it was "the sigh of an oppressed creature, the heart of a heartless world, and the soul of soulless conditions," and that it was the "opiate of the people." Marx: "The abolition of religion as the illusory happiness of the people is the demand for their real happiness. To call on them to give up their illusions about their condition is to call on them to give up on a condition that requires illusions. The criticism of religion is, therefore, in embryo, the criticism of that vale of tears of which religion is the halo." - Contribution To The Critique of Hegel's Philosophy of Right, 1843.

A common reading of this goes something like this, thumbnail: Religion is a conspiracy, mostly by the Ruling Class, to conceal from Workers the actual reasons for our unhappiness. I think anyone pondering Marx (or The New Atheists, for that matter) ought realize the ambiguity here in Marx. For he might also be saying: religion has the correct insight that our suffering must be overcome and what we really desire ought to have satisfaction, but in looking toward religion we make a fundamental error, seeking our happiness through metaphysics and not through the nitty-gritty mundane, materialist world, which will require knowledge and action.

Anyway, one of the most renowned thinkers in the European Union had at the center of his social ideas the rejection of appeals to metaphysics as a basis for rational understanding, and now he's allowing metaphysics into that program. Which now seems primarily aimed at saving the idea of a relatively sane Europe.

I've often wondered, in the years since I read Habermas's Theory of Communicative Action (I read volume 1 and only thumbed through volume 2), whether the ideal speakers in his ideal democracy might need to know more than what they've specialized in, because I've been in rooms with far too many well-educated specialists who can't understand why the other guy ain't seein' it all from his angle. About which more later below...

Very Brief Take on Philosopher Kings
I still get a charge out of the far-more ancient-than-Aristotle Chinese view of metaphysics in the Tao Te Ching: "That which is above matter is the Tao." Hey: it's a decent take. Or at least it works for me.

There was a time when the Schoolmen, the Scholastics, doing philosophy as theologians, were The Cheese intellectuals in the West. When they decided to hold the Renaissance, starting on January 1, 1500, some Humanists, artists, engineers, poets and political philosophers began to get a piece of the action. By around 1860, Natural Philosophy (AKA "science") began to rack up win after win. And this held sway through the Roaring 20th century.

Richard Rorty said the Philosophers had always insisted that, no matter what others thought, theirs was The Cheese all along. They had constructed a bunch of elaborate systems that placed something between the individual and the world: Mind. Language. History's Laws. But theirs was the discipline ne plus ultra.

Whatever: it had always been assumed, says Rorty in his Philosophy and the Mirror of Nature, that the role of philosophers was meta-cultural criticism, resting on the assumption that only philosophers had a "God's Eye View" of all the other sub-disciplines and fields of study. It was even up to philosophers to tell the lesser historians or economists or psychologists or anthropologists to do more of this, less of that. We're never going to arrive at a One True Real Copy of Reality if you keep doing that sort of fieldwork! Do something else. The very picture of Plato's Philosopher Kings. As the renegade Marxist sociologist Alvin Gouldner called it: a Platonic Complex.

Rorty says, enough with the idea of architectonic disciplines: the true role of the philosopher is to live up to its name: love wisdom. And we do that by being Generalists: we read about popular culture and wonder what it means for sociology. We talk to some historians about medicine and get some ideas there...how can this all "hang together...?" We read the philosophy of science and then about actual conditions in labs and see if we find something there. We look at marginalized discourses and books and authors and then make conversations about what they may have to offer that is being missed by those not being marginalized. We wonder about happiness and political power and economics and language and quantum mechanics and Dark Matter...and how it all hangs together. Or might.

Rorty says: enough with the Philosopher King role. It never worked and was pretentious and it alienated philosophers from a more valuable role: as messengers between disciplines. Generalists.

(Right now witness the Third Culturalists trying to assume the role of Philosopher Kings, and attacks from the traditional Humanities and other places. Maybe start HERE. How much of it has to do with funding and prestige?)

Contra people like Pinker and Dawkins and (what I see as) their sophisticated scientism, I do not abide by the idea that there exists any meta-discourse, anywhere. (As of October, 2013)

                                          rendering of Margaret Fuller

The Fascinating Case of Buckminster Fuller's Metaphysics
Talk about a Generalist! And yet, as I parse Fuller's books, I always get the feeling that, as much as he paid lip-service to economics, sociology, poetry, and the humanities, he thinks (he died in 1983, but his ideas are still alive for me) Science is a meta-discourse. And metaphysics actuates science.

So what is metaphysics, according to Fuller? Scientific laws that express a tremendous amount of generalization from a dizzying welter of individual cases. Or, an example in Bucky-speak:

Humans are unique in respect to all other creatures in that they also have minds that can discover constantly varying interrelationships existing only between a number of special case experiences as individually apprehended by their brains, which covarying interrelationship rates can only be expressed mathematically. For example, human minds discovered the law of relative interattractiveness of celestial bodies, whose initial intensity is the product of masses of any two such celestial bodies, while the force of whose interattractiveness varies inversely as the second power of the arithmetical interdistancing increases.
-Critical Path, p.63

Fuller thinks that humans, constantly looking into Nature, using their Minds (different than the brain), discover generalities expressed in the language of math. As time goes on, these generalities get honed and become evermore exact and interaccomodative. (<----I just used a word that I'm not sure even exists, but every time I study Bucky I get infected with his unique verb-ifying language style, so I say what the hell and let 'er fly.)

But this bit about the Mind not being the same as the brain? Well, first let's get to Fuller's conception of God:

Acknowledging the mathematically elegant intellectual integrity of eternally regenerative Universe is one way of identifying God. 

Ohhh...another Platonist. Hey, whatever floats your Dymaxion House!

God may also be identified as the synergy of the interbehavioral relationships of all the principles unpredicted by the behaviors of characteristics of any of the principles considered only separately. 

Recall: Fuller is the grandnephew of American Transcendentalist Margaret Fuller. There's something genetic. Nevertheless, Fuller seemed Leonardo enough for the 20th century.

Oh yea: Mind does not equal Brain:

Brains always and only coordinate the special case information progressively apprehended in pure principle by the separate senses operating in pure mathematical-frequency principle. Brain then sorts out the information to describe and identify whole-system characteristics, storing them in the memory bank as system concepts for single or multiple recall for principle-seeking consideration and reconsideration as system integrities by searching and ever reassessing mind. 

Okay, this brain sounds pretty impressive to me. How can anything be better than that? Well, here's how Bucky conceived mind:

Only minds have the capability to discover principles. Once in a very great while scientists' minds discover principles and put them to rigorous physical test before accepting them as principle. More often theologists or others discover principles but do not subject them to the rigorous physical-special-case testing before accepting and employing them as working-assumption principles.
-pp.159-160, Critical Path

The mind, unlike the brain, is weightless, massless, colorless, and not detected by any instrument that I know of. Furthermore, for Fuller, the physical principles that actually work to run our world of technics and know-how are also weightless, massless, odorless, colorless, and they don't take in or emit energy, etc:

Mind and general physical principles, generalized, are metaphysical entities. And their synergy runs the world.

Fuller, in book after book, is able to think about our lives and educations and be somewhat dispassionate about the way we were trained to think of inquiry and knowledge as being separate entities. At other times he sees this as something like a conspiracy theory against Mind by powerful interests. Why so much at stake? Because specialization gets you extinct. And we need as many people to think in creative, generalistic ways as possible if we are to avert catastrophe. Think of his God, his idea of Mind, your Mind. Does it make sense? In the introductory chapter to Synergetics he sees specialization as fostering isolation, futility and confused feelings. Humanity is "deprived of comprehensive understanding." Understanding based on the soundest metaphysical principles. Because most of us in the West were educated to specialize, we tend to abandon personal responsibility for thinking of the Big Picture, and taking social action. We let others deal with the big stuff. He doesn't say it, but he seems to equate specialization with marginalization. "Specialization's preoccupation with parts deliberately forfeits the opportunity to apprehend and comprehend what is provided exclusively by synergy."

Fuller sees art, science, economics and "ideology" all as having separate "drives" and "complexedly interacting trends" which could be understood via synergetics, but hardly anyone "in" one of those fields seems to believe this. This is threatening to the survival of the species. Giant pandas only eat bamboo. When the bamboo is gone, the panda is gone. 99% of all species that ever existed are extinct (or a like number; it's not good news), and not all extinctions were due to specialization or overspecialization, but there have been enough extinctions, presumably, due to this short-sightedness. And we're supposed to have all the tools! We live at the equator and near the Arctic Circle, in rain forests and deserts, savannas and at 10,000 feet above sea level.

Related to this, here may be one of the brainiest conspiracy theories you'll ever read:

We have also noted how the power structures successively dominant over human affairs had for aeons successfully imposed a "specialization" upon the intellectually bright and physically talented members of society as a reliable means of keeping them academically and professionally divided - ergo, "conquered," powerless. The separate individuals' special, expert glimpses of the separate, invisible reality increments became so infinitesimally fractionated and narrow that they gave no hint of the significant part their work played in the omni-integrating evolutionary flow of total knowledge and its power-structure exploitability in contradistinction to its omni-humanity-advancing potentials. Thus the few became uselessly overadvantaged instead of the many becoming regeneratively ever more universally advantaged.
--p.162, Critical Path

In a slim and criminally underrated and under-read book by Fuller, GRUNCH of Giants, he goes into the history of this conspiracy by the very few to use the "wizards" for their own control of wealth and power. And if you can get with the prose style, you might find it very rewarding.

With this I abandon my typing with the idea that we've specialized too much; we've been marginalized, the survival of our species is at stake, and the deepest synergetic nexus of survival and real wealth is metaphysical know-how. I had no idea I'd end up here. Adieu!

P.S: Not long ago I was delving around in the philosopher Willard Van Orman Quine, and in 1948 he seems to have thought very much like Fuller on these topics. I wonder if Fuller influenced him, or Quine influenced Fuller, or this is another of those convergences that Charles Fort described as "It's steam engines when it comes steam engine time." In 1948, in an essay, "On What There Is," Quine said that our best scientific theories "carry an ontological commitment" to objects whose existence is incompatible with nominalism.

                                                 Buckminster Fuller

Thursday, October 17, 2013

The Demonic Powers of (Some) Books: A Take or Three

A while back one of my intellectual colleagues urged me to read Fritz Leiber's novel Our Lady of Darkness, and if you haven't read it yet, it's October and the perfect time to get down to the library and read this thing. It's even better if you live near San Francisco, as it's set there. Leiber, influenced by Lovecraft and Clark Ashton Smith and Montague Rhodes James, uses some Jungian riffs and gets off a tremendous work I couldn't put down. It's weird, realistic, creepy, and destabilizing, somewhat artsy in style and yet a page-turner. Because I'm all out of breath I'll just say Yo This Is An Amazing Book. It's perfect for Halloween-times. (As I link the title to Amazon - I take no money from them - I noted the reviews were less than 4.5 stars...which is simply absurd, trust me on this.)

Have you ever been in a Big City and felt like It had something to say? As if there were signs all around, but you didn't quite have the key to read the language?

Leiber posits a secret art of reading Cities, and predicting and manipulating the future, via Megapolisomancy, and a dark character named Thibaut de Castries literally wrote the book on this art. Everything that makes up the metropolis: steel, wire, and cement; paper, rubber and bricks...has always had effects on humans throughout history. The effects are physiological, psychological, and, perhaps most importantly: hyper-psychological. I'd say "parapsychological" but this could be misconstrued. It's creepier than that. De Castries also wrote The Grand Cipher, but I don't want to say too much here. Ever since I finished Leiber, my forays in the City - always an expedition in psychogeography - have never felt the same. It's those damned...elementals emanating from the stuff the City is made from. But I won't go into it. Save for the utterly demonic aspect of Leiber's novel.

                                                      Fritz Leiber

"Demonic"? Aye, but not in the American evangelical's sense. The word's had a peculiar evolution. Everyone who's studied any philosophy knows that Socrates attributed whatever he "knew" to his daemon: a voice that spoke to him. This demonic voice was associated with Divine Knowledge. And I remember reading how Goethe was so blown away by JS Bach he said Bach was demonic.

In late 18th-19th century Europe, highly influenced by Hamann and Herder, Goethe saw uncanny creative genius as "demonic." Goethe seems fairly demonic his own self, but that reminds me of one of his books, The Sorrows of Young Werther. It made the demonic Goethe a huge celebrity writer-star at age 24, and was based on autobiographical elements that Goethe later regretted sharing with the world: a very romantic young man's unrequited love leads him to suicide. And the book was responsible for "copycat" suicides in real life. Is it Goethe's fault? The book's fault? The culture's fault?

I used to say it's a combination of all three, but mostly the culture. Now I prefer to attribute the suicides to the book more than the culture or Goethe. I have my reasons. It seems to me the demonic in the 19th century sense is probably at large in every culture, almost everywhen. And while the demonic powers reside in Goethe's nervous system, those books, when disseminated throughout Germany and then the rest of Europe, went out of Goethe's hands. If the culture's "right" then you get readers who succumb to something irrational they see in the book. But the Book actuated the suicides. Goethe's writing resonated so strongly with young people who saw in themselves aspects of the fictional character. And killed themselves.

Other books are linked to killers. Demonic?



A confession: Here's where I realize I'm a bit...off: I'm bibliomane enough to admit to a Walter Mitty thrill that books can have such powers over humans.

Stephen King voluntarily pulled his novel Rage, a work he started while in high school, because it might prove an "accelerant" to school gun violence, already notorious in Unistat. I can see his point. Already it looks like maybe there was a copycat killing. And yet: is it a publicity stunt? Something to garner a heavier demand for the novel? Am I being cynical? King says guns aren't the problem in Unistat; it's the Kardashianization of culture that's the problem, and King himself owns guns and is a big 2nd Amendment guy.

Now hold on, wait a minute: if I assert the absurdity of blaming Marilyn Manson for the Columbine killers, or Judas Priest or Ozzy Osbourne for other self-inflicted deaths of Unistatian teens, why do I support the book medium over those musical texts? Good Question. Here's how I've negotiated it: in reading interviews and seeing the rock stars talk about their work - and I'm thoroughly acquainted with their music, by the way - I believe the musicians when they say they're writing that music for the joy and fun of it, and Ozzy liked to ask: who believes Vincent Price was an actor who meant harm to his audience?

The writer of a book is working with the nature of the book, the reading of which is almost exclusively solitary, and silent. Reading a novel makes demands on the nervous system that are unique to the act of reading and certainly different than the apprehension of auditory musical texts. But it's the intent and subjectivity of the Author which, combined with the phenomenology and physiology of reading books, makes some of them...demonic.



It seems only fair to ask the author of a book that might possibly cause untoward (or desirable) effects on its readers to warn them in some way, but the very nature of fiction and the unheimlich aspects of the demonic...seems to violate the rules of the game. However, a warning or notice is given from time to time. The fair warning. For example, in a series of putatively "non-fiction" postscripts to a 700+-page surrealistic novel, Robert Anton Wilson tells his readers:

This book, being part of the only serious conspiracy it describes [...] has programmed the reader in ways that he or she will not understand for a period of months (or perhaps years) [...] Officials at Harvard thought Dr. Timothy Leary was joking when he warned that students should not be allowed to indiscriminately remove dangerous, habit-forming books from the library unless each student proves a definite need for each volume.
-Illuminatus! Trilogy, p.774, omnibus ed.

Who among us can withhold admiration for the author who, by embedding a non-existent text within the actual text in such overwhelmingly vivid fashion, influences later generations to actually produce a "real" version of the once-imaginary book? One might think immediately of the Necronomicon. But this has been going on for some time. Here's Francis King:

Someone has only to announce the existence of a mysterious book, or an even more mysterious occult fraternity, and there will always be those who are prepared to produce the required article or organization - usually for a suitably large fee. For example, no one had heard of any alchemical writings of the early English St. Dunstan until the Elizabethan magician Edward Kelly stated that he had found a strange red powder of projection and The Book of St. Dunstan, describing how to use this same red powder for the purpose of transmuting base metals into gold, in the ruins of Glastonbury Abbey. Nevertheless, within fifty years of Kelly first making his claim to this discovery no less than half a dozen alchemical tracts had been printed, all of them differing one from another, and each claiming to be the sole authentic Book of St. Dunstan.
-Sexuality, Magic and Perversion, pp.5-6

But these wild, inspired imaginings that go viral: they act as palimpsests, they infuse and infect and imbue the gesticulations and ideation of far-flung gens, dead ignorant of their originations. Fer crissake: look at the abominable life of The Protocols of the Learned Elders of Zion. Now the Priory of Sion has momentum. The Gemstone and The Octopus will fuel conspiracy thinking for a long while yet. These works might be thought of as "non-fiction," but they seem somehow like hyperfiction to me. They are demonic, but not in Goethe's sense. And there are too many to name.

The prolific historian Philip Jenkins traced the origin of satanic panics in 1980s Unistat to a 1926 novel written by Herbert S. Gorman titled The Place Called Dagon. Lovecraft himself was influenced by this novel. What's sorta odd (a digression!) to me: Gorman was the first biographer of James Joyce, his 1924 book receiving much help from Joyce himself, and now thought to be a wonderful source for how Joyce wanted to have been perceived. Gorman was a busy writer and he could have no inkling that, 55 years later, a strain of high-strung xtian PTA types would read his novel and get ideas. So to wrap up this digression: we have a bizarre synchro-mesh of a newspaper reporter and novelist, Lovecraft, Joyce, and the McMartin preschool debacle, among others...

Demonic?



Peter Lamborn Wilson: "The world of apocrypha is a world of books made real, which may well be understood and appreciated by readers of Borges, Calvino, Lewis Carroll - or certain sufis. The apocryphal imagination turns 'Tibet' or 'Egypt' into an amulet or mantram with which to unlock an 'other world', most real in dreams and books and dreams of books, visions induced by holy fasting or noxious alchemic fumes."
-Sacred Drift: Essays on the Margins of Islam, p.22

More PLW: "According to the Manicheans, books might be Angels, living personifications of the Word from On High - or from elsewhere, from another reality. There exist angelic alphabets. The British magus and alchemist, John Dee, received angelic transmissions in the Enochian alphabet, and Jewish magicians used angelic letters in their amulets and Kabbalistic meditations."
-The Little Book of Angels, p.6

A final thought from PLW: "The crude truth is perhaps that texts can only change reality when they inspire readers to see and act, rather than merely see. [...] Just as there exist books which have inspired earthshaking crimes we would like to broadcast texts which cause hearers to seize (or at least make a grab for) the happiness God denies us. Exhortations to hijack reality. But even more we would like to purge our lives of everything which obstructs or delays us from setting out - not to sell guns or slaves in Abyssinia - not to be either robbers or cops - not to escape the world or to rule it but to open ourselves to difference. I share with the most reactionary moralists the presumption that art can really affect reality in this way, and I despise the liberals who say all art should be permitted because - after all - it's only art."
-Immediatism, Essays by Hakim Bey, pp.57-58

Maybe I ought remember my William James and think about the predispositions of readers who might "allow" a book to take hold of them, influencing but not causing them to act in a way a contemporary evangelist would deem "demonic." It would seem James's "tender-minded" might be more prone to the lure of such books than his "tough-minded." Maybe Erik Davis is right when he writes of Lovecraft's doomed protagonists, bookish types (like some people we know?) whose "intellectual curiosity drives them to pore through forbidden books or local folklore."

"district attorneys hunt for books so evil they are not protected by the First Amendment..." - RAW, p.8, Everything Is Under Control

Okay, for today I'm ready to call this a wash; suffice it to say that only some books are demonic, as are some authors (only they might not know it); culture has some skin in this demonic game, and I'm not sure how much. Writing has always been associated with magic, danger, the demonic. Let us try not to forget it...

                                                       Thoth, who seems to have 
                                                       started this whole damned
                                                           thing.