Overweening Generalist

Thursday, October 17, 2013

The Demonic Powers of (Some) Books: A Take or Three

A while back one of my intellectual colleagues urged me to read Fritz Leiber's novel Our Lady of Darkness, and if you haven't read it yet, it's October and the perfect time to get down to the library and read this thing. It's even better if you live near San Francisco, as it's set there. Leiber, influenced by Lovecraft and Clark Ashton Smith and Montague Rhodes James, uses some Jungian riffs and gets off a tremendous work I couldn't put down. It's weird, realistic, creepy, and destabilizing, somewhat artsy in style and yet a page-turner. Because I'm all out of breath I'll just say Yo This Is An Amazing Book. It's perfect for Halloween-times. (As I link the title to Amazon - I take no money from them - I noted the reviews were less than 4.5 stars...which is simply absurd, trust me on this.)

Have you ever been in a Big City and felt like It had something to say? As if there were signs all around, but you didn't quite have the key to read the language?

Leiber posits a secret art of reading Cities, and predicting and manipulating the future, via Megapolisomancy, and a dark character named Thibaut de Castries literally wrote the book on this art. Everything that makes up the metropolis: steel, wire, and cement; paper, rubber and bricks...has always had effects on humans. The effects are physiological, psychological, and, perhaps most importantly: hyper-psychological. I'd say "parapsychological" but this could be misconstrued. It's creepier than that. De Castries also wrote The Grand Cipher, but I don't want to say too much here. Ever since I finished Leiber, my forays in the City - always an expedition in psychogeography - have never felt the same. It's those damned...elementals emanating from the stuff the City is made from. But I won't go into it. Save for the utterly demonic aspect of Leiber's novel.

                                                      Fritz Leiber

"Demonic"? Aye, but not in the American evangelical's sense. The word's had a peculiar evolution. Everyone who's studied any philosophy knows that Socrates attributed whatever he "knew" to his daemon: a voice that spoke to him. This demonic voice was associated with Divine Knowledge. And I remember reading how Goethe was so blown away by JS Bach he said Bach was demonic.

In late 18th- and early 19th-century Europe, Goethe, highly influenced by Hamann and Herder, saw uncanny creative genius as "demonic." Goethe seems fairly demonic his own self, but that reminds me of one of his books, The Sorrows of Young Werther. It made the demonic Goethe a huge celebrity writer-star at age 24, and was based on autobiographical elements that Goethe later regretted sharing with the world: a very romantic young man's unrequited love leads him to suicide. And the book was responsible for "copycat" suicides in real life. Is it Goethe's fault? The book's fault? The culture's fault?

I used to say it's a combination of all three, but mostly the culture. Now I prefer to attribute the suicides to the book more than the culture or Goethe. I have my reasons. It seems to me the demonic in the 19th century sense is probably at large in every culture, almost everywhen. And while the demonic powers resided in Goethe's nervous system, those books, when disseminated throughout Germany and then the rest of Europe, went out of Goethe's hands. If the culture's "right" then you get readers who succumb to something irrational they see in the book. But the Book actuated the suicides. Goethe's writing resonated so strongly with young people who saw in themselves aspects of the fictional character that some of them killed themselves.

Other books are linked to killers. Demonic?



A confession: Here's where I realize I'm a bit...off: I'm bibliomane enough to admit to a Walter Mitty thrill that books can have such powers over humans.

Stephen King voluntarily pulled his novel Rage, a work he started while in high school, because it might serve as an "accelerant" to school gun violence, already notorious in Unistat. I can see his point. Already it looks like maybe there was a copycat killing. And yet: is it a publicity stunt? Something to garner a heavier demand for the novel? Am I being cynical? King says guns aren't the problem in Unistat; it's the Kardashianization of culture that's the problem, and King himself owns guns and is a big 2nd Amendment guy.

Now hold on, wait a minute: if I assert the absurdity of blaming Marilyn Manson for the Columbine killers, or Judas Priest or Ozzy Osbourne for other self-inflicted deaths of Unistatian teens, why do I single out the book medium as more potent than those musical texts? Good Question. Here's how I've negotiated it: in reading interviews and seeing the rock stars talk about their work - and I'm thoroughly acquainted with their music, by the way - I believe the musicians when they say they're making that music for the joy and fun of it, and Ozzy liked to ask: who believes Vincent Price was an actor who meant harm to his audience?

The writer of a book is working with the nature of the book, the reading of which is almost exclusively solitary, and silent. Reading a novel makes demands on the nervous system that are unique to the act of reading and certainly different from the apprehension of auditory musical texts. It's the intent and subjectivity of the Author that, combined with the phenomenology and physiology of reading, makes some books...demonic.



It seems only fair to ask the author of a book that might cause untoward (or desirable) effects on its readers to warn them in some way, but the very nature of fiction and the unheimlich aspects of the demonic...seem to violate the rules of the game. However, a warning or notice is given from time to time. The fair warning. For example, in a series of putatively "non-fiction" postscripts to a 700+-page surrealistic novel, Robert Anton Wilson tells his readers:

This book, being part of the only serious conspiracy it describes [...] has programmed the reader in ways that he or she will not understand for a period of months (or perhaps years) [...] Officials at Harvard thought Dr. Timothy Leary was joking when he warned that students should not be allowed to indiscriminately remove dangerous, habit-forming books from the library unless each student proves a definite need for each volume.
-Illuminatus! Trilogy, p.774, omnibus ed.

Who among us can withhold admiration for the author who, by embedding a non-existent text within the actual text in such overwhelmingly vivid fashion, influences later generations to actually produce a "real" version of the once-imaginary book? One might think immediately of the Necronomicon. But this has been going on for some time. Here's Francis King:

Someone has only to announce the existence of a mysterious book, or an even more mysterious occult fraternity, and there will always be those who are prepared to produce the required article or organization - usually for a suitably large fee. For example, no one had heard of any alchemical writings of the early English St. Dunstan until the Elizabethan magician Edward Kelly stated that he had found a strange red powder of projection and The Book of St. Dunstan, describing how to use this same red powder for the purpose of transmuting base metals into gold, in the ruins of Glastonbury Abbey. Nevertheless, within fifty years of Kelly first making his claim to this discovery no less than half a dozen alchemical tracts had been printed, all of them differing one from another, and each claiming to be the sole authentic Book of St. Dunstan.
-Sexuality, Magic and Perversion, pp.5-6

But these wild, inspired imaginings that go viral: they act as palimpsests, they infuse and infect and imbue the gesticulations and ideation of far-flung gens, dead ignorant of their originations. Fer crissake: look at the abominable life of The Protocols of the Learned Elders of Zion. Now the Priory of Sion has momentum. The Gemstone and The Octopus will fuel conspiracy thinking for a long while yet. These works might be thought of as "non-fiction," but they seem somehow like hyperfiction to me. They are demonic, but not in Goethe's sense. And there are too many to name.

The prolific historian Philip Jenkins traced the origin of satanic panics in 1980s Unistat to a 1926 novel written by Herbert S. Gorman titled The Place Called Dagon. Lovecraft himself was influenced by this novel. What's sorta odd (a digression!) to me: Gorman was the first biographer of James Joyce, his 1924 book receiving much help from Joyce himself, and now thought to be a wonderful source for how Joyce wanted to be perceived. Gorman was a busy writer and he could have no inkling that, 55 years later, a strain of high-strung xtian PTA types would read his novel and get ideas. So to wrap up this digression: we have a bizarre synchro-mesh of a newspaper reporter and novelist, Lovecraft, Joyce, and the McMartin preschool debacle, among others...

Demonic?



Peter Lamborn Wilson: "The world of apocrypha is a world of books made real, which may well be understood and appreciated by readers of Borges, Calvino, Lewis Carroll - or certain sufis. The apocryphal imagination turns 'Tibet' or 'Egypt' into an amulet or mantram with which to unlock an 'other world', most real in dreams and books and dreams of books, visions induced by holy fasting or noxious alchemic fumes."
-Sacred Drift: Essays on the Margins of Islam, p.22

More PLW: "According to the Manicheans, books might be Angels, living personifications of the Word from On High - or from elsewhere, from another reality. There exist angelic alphabets. The British magus and alchemist, John Dee, received angelic transmissions in the Enochian alphabet, and Jewish magicians used angelic letters in their amulets and Kabbalistic meditations."
-The Little Book of Angels, p.6

A final thought from PLW: "The crude truth is perhaps that texts can only change reality when they inspire readers to see and act, rather than merely see. [...] Just as there exist books which have inspired earthshaking crimes we would like to broadcast texts which cause hearers to seize (or at least make a grab for) the happiness God denies us. Exhortations to hijack reality. But even more we would like to purge our lives of everything which obstructs or delays us from setting out - not to sell guns or slaves in Abyssinia - not to be either robbers or cops - not to escape the world or to rule it but to open ourselves to difference. I share with the most reactionary moralists the presumption that art can really affect reality in this way, and I despise the liberals who say all art should be permitted because - after all - it's only art."
-Immediatism, Essays by Hakim Bey, pp.57-58

Maybe I ought to remember my William James and think about the predispositions of readers who might "allow" a book to take hold of them, influencing but not causing them to act in a way a contemporary evangelist would deem "demonic." It would seem James's "tender-minded" might be more prone to the lure of such books than his "tough-minded." Maybe Erik Davis is right when he writes of Lovecraft's doomed protagonists, bookish types (like some people we know?) whose "intellectual curiosity drives them to pore through forbidden books or local folklore."

"district attorneys hunt for books so evil they are not protected by the First Amendment..." - RAW, p.8, Everything Is Under Control

Okay, for today I'm ready to call this a wash, and suffice to say that only some books are demonic, as are some authors (only they might not know it); culture has some skin in this demonic game, and I'm not sure how much. Writing has always been associated with magic, danger, the demonic. Let us try not to forget it...

                                                       Thoth, who seems to have 
                                                       started this whole damned
                                                           thing.


Friday, October 11, 2013

Euclidean Quotidian: 90 Degree Angles and the Semantic Unconscious

Ten Scattershot Ideas, One For Each Finger and Two Thumbs

1.) Supposedly the medieval Europeans thought Euclid's works were the same as the one we know as Eucleides of Megara, so olde books about geometry in Europe were by "Megarensis." They weren't the same dude: "Megarensis" was a contemporary of Plato; the great Euclid of high school geometry was closer to being contemporary with some of Plato's early students.

The Arabs got hold of Euclid and thought the name was made of ucli (the key) and dis (measure). At any rate, his Elements was the model of rationality ne plus ultra, and I'm writing this piece after pondering Euclid's influence on two philosophers, Vico and Spinoza, who were not the first to mimic the potent rhetorical form and structure of Euclid.

2.) In Peter Thonemann's review of three books for the TLS, note the story of the Malawi girl who, charged with learning how to set a dinner table English-style, experienced a steep learning curve, because the world she grew up in was curvilinear; there were no right angles. She had to learn the "order of things" we take for granted as "the way things are done." I also thought it interesting that when the Romans rolled through the peoples of Europe, they brought right angles and rectangles and ideas about straight lines and order with them, the Irish being the last to "convert," and it went along with Christianity.

There's a question of the "reading" of artefacts from the long-dead: if they built with right angles, was their social structure more authoritarian? Some think so. Others think what matters is the initial posit and then the iterated forms that grew from there. Mikhail Okhitovich, Soviet sociological thinker of the 1930s, asserted that right angles originated with private land ownership, then extended to architectural forms, and represent a non-communistic mode of thought; because of this, curvilinear forms in architecture were the best and most egalitarian.



Before rigid hierarchical forms of State, what was often found were circular forms, which have a center but seem to resist hierarchy...on some level. Do Euclidean forms give rise to a form of thought that permeates a culture, and if so, is this idea mostly unconscious, part of the paideuma?

Many non-communist Left-ish thinkers have assumed that dwellings based on rectangles and 90 degree angles were somehow metaphors for artificiality, non-organicism, or simply convention, and living in "boxes" tended to encourage conformist social ideas and a stifling of creativity. Look at any fat book on great 20th century architects and buildings. Look at Buckminster Fuller.

3.) A pop kulch example of a leftist strain in American thought is found in this folk song: "Little Boxes." Boxes and conformity. Boxes and restraint. Boxes and the suburbs, Levittowns.

4.) The distaste for "boxes" runs through countless intellectual and aesthetic fields. Nietzsche lays out this probe: "Mathematics would certainly not have come into existence if one had known from the beginning that there was no exactly straight line, no actual circle, no absolute magnitude," and we are left to wonder. Our contemporary Nassim Nicholas Taleb writes in his Bed of Procrustes, "They are born, then put in a box; they go home to live in a box; they study by ticking boxes; they go to what is called 'work' in a box, where they sit in their cubicle box; they drive to the grocery store in a box to buy food in a box; they go to the gym in a box to sit in a box; they talk about thinking 'outside the box'; and when they die they are put in a box. All boxes, Euclidean, smooth boxes." (p.31)

5.) Art critic Jed Perl wonders about the state of painting and painters in today's art world. At one time the rectangular frame of the painting was a given. The artist played an outré role in society. But now practically all competing media are either rectangle-shaped (iPod/iPad/iPhone?), or text is read within a rectangular-ish frame (the screen you're using now?); further: images in the most popular media are dynamic inside a rectangular frame: TV, films, the camera frame. Could it be that the "degree of stabilizing supremacy of that rectangle has been undermined by the technology that surrounds us?" Perl asks. He knows painters. It's his milieu. And Perl asserts that today's painter, because of the static image inside a rectangle, has been forced to go on the defensive or offensive, which presents a new hindrance. At the same time, Perl asserts that painting is not dead.

6.) In what appears to be an untitled poem, Tony Quagliano:

I read this poem about geometry
or shadows
or was it poetics, or
some analogy among the three---
that sounds right
a poem about science and art
itself some artful connection
opting for the poem of course (being a poem) slyly
saying math's impure
or at least not pure enough
for one geometer not impressed by Euclid
or more impressed by non-Euclid
or some such twist
and what gets me, why I mention this at all, is
that the poem was good

though no one bled directly in it
words were clean, scientific
stitched in artful lines for the anthologist
and while a slashed wrist would have to wait
this poem of shadows, or math
or some connection in the courtyard of art
this fragile suture, poet to geometer, takes life
over your dead body
and mine

and it was good
which is why I mention this at all.
-p.65, Language Matters: Selected Poetry

7.) I remember reading about some hotshot engineering students - probably at CalTech? - and the problem of stacking oranges at the grocery store. Because of their roundness, there's far more unused space (AKA "air") between oranges. How to maximize the number of oranges stackable? Well, you obviously make square oranges, using the Lego-mind. Easier said than done.
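The "air between oranges" intuition can actually be put to numbers. The densest possible stacking of identical spheres - the grocer's pyramid, which Kepler conjectured and Thomas Hales eventually proved optimal - still leaves roughly a quarter of the crate empty, while cubes tile space with no waste at all. A back-of-the-envelope sketch (mine, not the engineering students'):

```python
import math

# Densest sphere packing (face-centered cubic, per the Kepler conjecture,
# proved by Hales): fraction of the crate that is actually orange.
sphere_density = math.pi / (3 * math.sqrt(2))

# Cubes ("square oranges", the Lego-mind) tile space perfectly.
cube_density = 1.0

print(f"spheres fill {sphere_density:.1%} of the crate")  # ~74.0%
print(f"air between the oranges: {1 - sphere_density:.1%}")  # ~26.0%
print(f"square oranges fill {cube_density:.0%}")
```

So even a perfect grocer loses about 26% of the crate to air, which is why the square orange keeps tempting engineers.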


I hadn't thought much about shipping containers and how they have made the world seem far smaller and distance irrelevant until I read Andrew Curry's fine piece in Nautilus not long ago. "Invisible to most people, (shipping containers) are fundamental to how practically everything in our consumer-driven lives works." As for packing as much stuff into a space as efficiently as possible, it doesn't get much better than shipping containers. ("Invisible to most people...")

Score one for rectilinearity.

8.) One of the Prophets of Euclidean space and modern consciousness, Marshall McLuhan, in 1968:

The visual sense, alone of our senses, creates the forms of space and time that are uniform, continuous and connected. Euclidean space is the prerogative of visual and literate man. With the advent of electric circuitry and the instant movement of information, Euclidean space recedes, and the non-Euclidean geometries emerge. Lewis Carroll, the Oxford mathematician, was perfectly aware of this change in our world when he took Alice through the looking-glass into the world where each object creates its own space and conditions. To the visual or Euclidean man, objects do not create time and space. They are merely fitted into time and space. The idea of the world as an environment that is more or less fixed is very much the product of literacy and visual assumptions. In his book The Philosophical Impact of Contemporary Physics Milic Capek explains some of the strange confusions in the scientific mind that result from the encounter of the old non-Euclidean spaces of preliterate man with the Euclidean and Newtonian spaces of literate man. The scientists of our time are just as confused as the philosophers, or the teachers, and it is for the reason that Whitehead assigned: they still have the illusion that the new developments are to be fitted into the old space or environment.
-p. 347, Essential McLuhan, from an essay, "The Emperor's New Clothes," originally in Through the Vanishing Point: Space in Poetry and Painting, co-written with Harley Parker. McLuhan asserted in 1968 that "the artist is a person who is especially aware of the challenge and dangers of new environments presented to human sensibility." McLuhan thought artists were subversive because society expected the replication of existing orders and forms, but artists violated these expectations.

Three thoughts:
a.) In 1968 McLuhan may have been far more prophetic than he thought: not only are scientists still trying to come to terms with non-Euclidean findings in astrophysics, materials science, microbiology, subatomic physics (but I do see some inroads), but going back to Jed Perl's essay on the "state of the art" in painting 45 years later, McLuhan's "with the advent of electric circuitry"...and I think maybe painting, contra Perl, may be, if not dead, in the ICU, condition: critical.

b.) When I do that mental yoga which allows me into McLuhan's thought-space, I realize how intensely Euclidean my assumptions seem, as based on the idea of Gutenberg Man and the space of the literate reader of texts, for hours every day, decades on end, eyes decoding 26 symbols with punctuation, left to right, linear left to right, left to right (THIS), left to right, punctuation. In my conditioned assumptions of quotidian reality, objects "really do" fit inside of space and time. I want them to create space and time themselves, by power of their sheer Being, capital be. But most of the time: no. I have to work on it. How do I get out of Gutenberg Euclidean head space? Cannabis, film, walks in nature, animation, humor and surrealism, reading Joyce or Pound, get into the Korzybski-Zen level of the phenomenal event-level, pre-language, observing without hypnotizing and misleading "woids," and then careful consciousness of abstracting, watching myself abstract until It all melts, or something strange in science. You have your ways.

c.) For such an overwhelmingly "straight" Euclidean man, Prof. McLuhan's mind (his personal politics were a sort of conservative Catholic with tinges of anarchy?) was, to me, reliably non-Euclidean and psychedelic. His deep immersion in James Joyce and Ezra Pound was probably a significant influence here, but there was so so so so much more. He was an absolute virtuoso at playing with metaphors and combining those ideas with others, if only just to see if they were thrilling and made anyone else want to think about some idea in some new way. I find this an anarchist strain in McLuhan's thought. (How about I take a catholic idea about the senses and think about the new electronic media, like radio or TV? I can add ideas I copped from Thomas Nashe, Wyndham Lewis, Ezra Pound, Harold Innis, and anthropologists. And Finnegans Wake! And mythology, Poe, Einstein, and painting's figure/ground and the rise of the Renaissance's vanishing point? And then: Vico! And commercials and comic strips!? And Walter J. Ong...and and and...)




9.) Robert Anton Wilson's extensions of Timothy Leary's ideas of the evolution of "circuits" in the human mind drew heavily on Euclid for the first three "domesticated primate" aspects of all of us: the oral/biosurvival circuit is about approach/avoidance and is represented in Euclidean metaphor as "forward-back." The second circuit stage of development (according to the theory, we "imprint" all of these circuits), the anal/territorial circuit, is about up/down, and represents the deeper levels of any thinking about politics, whether within the family, local city, national, or international. Notice up/down fits well in Euclidean space-thought.

The third circuit is about right/left and, for mammals like us, is based on the bilateral symmetry of the body and the nervous system, in which nature has seen fit to encourage a dominance of one side over the other, most people's left-hemisphere motor cortex encouraging right-handedness. Conceptual thought and left-right equations (think: algebra!) and logic all fall under the third circuit.

Although neuroscientific ideas about hemisphericalization in evolution and discrete modules of each of the brain's two hemispheres have moved away from a once-popular notion of the "holistic" right hemisphere and the "linear" left, these metaphors still seem to resonate. For Wilson, right-handedness and math and literacy in symbolic humans indicate a left-hemisphere domination (the left hemisphere controls the right side of the body) which has unconsciously biased "linear" and hierarchical forms in human history, which begins with writing. The right hemisphere remains relatively "silent," seemingly subdued by assumptions about "reality" made by the left hemisphere (especially in industrialized Western humans), and we have yet to harness the intuitive genius housed there.

So much ink has been spilled over these ideas, once extremely popular but now seemingly in a slow descent. Nevertheless, these ideas live, as you may have noticed from a conversation within the past few months. Why?

Well, I think it's because there's still some truth to the right/left brain modularity-of-function idea, although it's not as simple as those who popularized the findings of the Sperry and Gazzaniga "split brain" experiments. Also: I think Wilson was on to something: "Right-hand dominance, and associated preferences for the linear left-lobe functions of the brain, determine our normal modes of artifact-manufacture and conceptual thought, i.e., third circuit 'mind.' It is no accident, then, that our logic (and our computer-design) follows the either-or, binary structure of these circuits. Nor is it an accident that our geometry, until the last century, has been Euclidean. Euclid's geometry, Aristotle's logic, and Newton's physics are meta-programs synthesizing and generalizing first brain forward-back, second brain up-down and third brain right-left programs." - Cosmic Trigger vol 1, pp.199-200

For Wilson (and Leary) there were relatively "new" circuits that have appeared in human evolution over the last 11,000 years or so. And they seem non-Euclidean, more organic, curvilinear, and more inclusive of a holistic, total-floating body sense, as if we were meant to move through space/time.

To be clear: Euclid and his forebears the Pythagoreans wormed their way into our paideuma due to the natural evolution of mammals on a rocky, watery planet with an atmosphere conducive to carbon-based replicative life forms under the purview of an energy-source star at a Goldilocks distance. We got Euclidean forms because that's the way we evolved. Which may Beg the Q, but it's one of my favored narratives, and my entire brain, both hemispheres, seems to harmonically resonate with it.

[Further extrapolations from Wilson on this complex of ideas: see Illuminatus! Trilogy, pp.793-795; Prometheus Rising, pp.97-100; Schrodinger's Cat Trilogy, pp.342-347.]

10.) I grew up in boxy architecture, and when I first encountered this idea - about rectangles and 90 degree angles and conformity - I also found out that, though we've forgotten how we did it, at some point we had to learn to see in 3-D spatial terms. Supposedly some cultural anthropologists had gone into deepest darkest rain forest Africa and lived with and studied pygmies, whose complete environment was always giant trees and vines and moving through those living breathing green spaces, always canopied by jungle thickness as "ceiling." And when they were taken to a clearing at the edge of the forest and the anthropologists pointed to a man and a jeep far off in the distance, the natives thought they were seeing a tiny man. They had not learned to see over vistas of "open space."

So, I lay in bed and looked at the point where the ceiling meets the walls. Two walls meet at the "point" of the ceiling. And I tried to remember what it was like to not see that as a point in space. It's akin to many visual illusions or the Necker Cube you've all seen. It was fruitless. Until, one day...O! Such little things that thrill me. Aye: the corner was on a flat plane. And then it pointed out toward me...

I attest, I assert: when I enter buildings of a non-Euclidean build, my consciousness is altered. An inventory of memories and anecdotes would bore you and me, but I wonder if you have felt the same? I love round rooms. A spiral staircase can really get me going. On and on. But here's the thing: if I had grown up in a non-Euclidean house, I strongly suspect that entering a Euclidean "tiny box" house would alter my consciousness also. Because I think these represent the unfamiliar structure of space...

I hope I didn't come off like some un-hep "square" in this blogspew.


Saturday, October 5, 2013

I Didn't Build This Blog

In the Blizzard of Memes, you can't see what hit you. Metaphors land softly in the snow of your neural fluids, only dimly noted, at best. Other cleaning systems, borne by fluids electrically discharging, move the thought-stuff out of your system, and what remains...remains. Other memes are noted...as "memes" because, well, here you are: noting them. But what's inside?

And what's forgotten?

When Obama made a now-famous speech in his campaign against Romney (who ironically invented Obamacare), he riffed on "You didn't build that." I immediately recognized this riff as being probably stolen (watch my loaded words here!), or borrowed from Elizabeth Warren. I'd read or heard or seen Warren give a variation, well fleshed-out, years earlier. But she didn't build that.

I just finished reading the Wikipedia article on "You Didn't Build That." I thought it was a decent Wiki, although as I read it I found - as I usually do - that I'd wanted it to link to...something older. Because Elizabeth Warren didn't build "You didn't build that." (But I've always believed she had long internalized the conceptual framework of the idea; and I think it gets near to the heart of our central tragedy in Unistat that hardly anyone understands that conceptual framework. When Obama used the idea, I had the strong feeling he had not fully internalized that conceptual framework; he was merely riffing and playing his role.) Let me explain. Try to...

The Wikipedia link above? If you clicked on it and only skimmed it for two seconds? All the iterations that article has gone through? The contributors? They didn't build that. Jimmy Wales didn't build it. The infrastructure of the Internet? The infrastructure that supports that infrastructure? The history of architecture, design, craftsmanship, planning, industrial works, mathematical, chemical, and physical ideas? As the Jewish comedian said, "Don't get me started!"

I assume you're reading this in some sort of environment. My guess is, you're "indoors." (I find it taxes my imagination to visualize anyone reading OG outside, walking down a street, on a mobile device, but who knows?) Anyway, indoors or out: look away from the screen for 30 seconds or three breaths and note your surroundings. Did you build that? As I look around this cramped, book-packed room, I find I assembled most of it. The bones - walls, ceiling, the wiring inside the walls, the paint, those little screws that hold the plate around the light switch on the wall...I most definitely did not build that.

(I was thinking just now of tough guy and revolutionary Modernist Ezra Pound who, barely scraping by, went out of his way, tirelessly taking pains to make sure that rich lady patrons of the arts knew about his friends - the relatively unknown Joyce, cummings, Hemingway, Frost, Eliot, on and on - and that those soon-to-be "important" artists were going to be supported, subsidized, receive notice. Meanwhile, Pound the poet-revolutionary made his own furniture and got by on a bowl of soup. He could look on the serviceable chair with pride. Did he build that?)

In 1919, a writer writing about a very, very old idea:

The now dead inventor of the steam engine could not have produced his ingenious invention except by using the living powers of other dead men - except by using the material and spiritual or mental wealth created by those who had gone before. In the inventor's intellectual equipment there was actively present the kinetic use-value of 'bound-up-time,' enabling him to discover the laws of heat, water, and steam; and he employed both the potential and kinetic use-values of mechanical instruments, methods of work, and scientific knowledge of his time and generation - use-values of wealth created by the genius and toil of by-gone generations.
-pp.121-122, Manhood of Humanity, Alfred Korzybski

Who knows to what degree this idea has sunk in. I know that when I first encountered it, it felt totally revolutionary. And yet, I found I kept forgetting it. Growing up in Unistat, you very easily become brainwashed into believing, without reservation, that everything someone has, they..."made that." For what it's worth, I now find the idea completely preposterous and feel embarrassment when I remember how naive I'd been to believe it. (And I'm embarrassed that so many of Us still believe it.)

I'd encounter the idea articulated by Korzybski (NB: he didn't build that) again and again, and it was wondrous and seemed "truer" than what my conditioning led me to work with. It seemed very much like when I learned as a young boy that the sun didn't "set" but instead we were on a much smaller body, rotating away from the sun. I knew intellectually this was true, but my natural, naive experience of the sun moving and not us...held. It took practice to get over this. Now when the sun "sets" I can feel us moving on Earth, from my relatively inertial standpoint.

"We" can be utterly profoundly liberating as a concept internalized. Or so I assert.

You can take in history in an embodied way that seems to me qualitatively richer than what was dished to you by cultural conditioning. And, to take a Poet out of context, this "makes all the difference."

"We" goes back a long time, to the most inchoate use of tools by our deepest ancestors.

The words on the screen you're looking at right now are made of letters, and people helped you learn to read them; those mornings, they ate breakfasts they had merely assembled. The phonemes, the sounds, the poetry of language and its resonances: they helped you learn to decode these, as others had done for them.

Countless tinkerers throughout human time added incrementally to the sum totals of technics. The ones who tried a new approach that didn't work, but others took note? This too created value: we now know what doesn't seem to work. Let's go more in this direction. And hey: why not keep notes?

Assembling, let me be clear, is nothing to be sneezed at. It is a creative act. But it seems thoroughly encompassed within building.

You have built much. Probably far more than you realize. You don't realize the many things you have built because of categorical accounting schemes you assumed were true. You have built neural circuitry in other mammals, for one thing...You have played a part in building me. (How? Just think about it. Hint: maybe it has something to do with one of Korzybski's triumvirate "material and spiritual or mental wealth"?)

This laptop I'm using? I built none of it. (But "We" built it all.) The silicon, the plastic, the glass, the mathematics "inside" it? I didn't build that. The router, the insulation on the thingumbob that plugs into the whatsit that gives me the juice? Me no build-a dat ting, no. Let's not even get into the Server, or satellites, or the stuff that goes into the foundation of the building that houses the thing that supports the dealio that runs on the doohickey, artifact, and article that goes into that gadget over there, that thing made of metal but's really some alloy of some sort? Who the hell mined that? But I digress...

My ideas at Overweening Generalist? An absurdly complex agglomeration and concatenation of metaphors and names that I didn't build, but I may have used a form, a syntax, a display, an array of combined ideas that may have spurred something within you. I got that from those who went before me. It's my understanding that almost all knowledge percolates constantly in these fashions. But I didn't build it all ex nihilo, of course. The details seem fabulous but true...

No: It's more like We the culture threw out an ungawdly amount of mindstuff, and some of it stuck inside my head! (I didn't build any of those metaphors, I merely borrowed them, so if you have any complaints, please see The Mgt.) The quasi-hidden form of the desktop? The icons? The people at Blogger? The coders that made Blogger so easy for a dunderhead like myself to use, so I can write this crap so you can read it? Me no build zees zing, neethuh! (The farmers that grew the food that allowed most thinkers and tinkerers and laborers to get off the farm and do weirder things with knowledge, like build engines, roads, algorithms, surgical steel, Etsy?)

WE built it, like Korzybski says. Take what he's saying about the steam engine and just extrapolate, and give yourself credit for doing so, for it's a creative act to do so, and who knows what brilliant and novel ways you're envisioning this idea, but if you make something of this idea (did I do that, just now, today?), give yourself some credit. Just not all the credit.

Because giving yourself all the credit just seems to me...childish. Or, I'll be charitable: child-like. Naive, and, as the Philosopher said, "Human, all-too human." Other times I say: greedy and pretentious and stupid.

There seems very much I have not said here.

Here's Prairie Populist Elizabeth ("Betty") Warren, with a variation on a very very olde idea: Is she right? If not, how is she wrong?


Wednesday, October 2, 2013

The Drug Report: Crisis In Psychopharmacology

It's been at least 30 years since a truly new drug has hit the market that addresses the needs of patients suffering from depression, anxiety, manic depression (now rather bloodlessly called "bipolar disorder"), or schizophrenia. Any "new" drugs in the last 30 years have been basically variations on older, established drugs (so-called "Me Too" drugs), an effort by drug companies to keep up with the competition. These non-new "new" drugs are almost always marketed as "blockbuster" or "revolutionary" therapeutics, touting fewer side effects than older, competing drugs. They are not new, and the side effects are merely different, not fewer. The 50 or so psychiatric drugs bring in $25 billion a year in Unistat alone. And they're pretty lousy.

(I know, I know: you'd be far worse off without the one that worked for you. Hey: they do some good. For some people. I want better drugs for you, is all. And we were promised them with the 2000 mapping of the human genome. So...where are they? Later.)

                                                        serotonin

The drugs people use - by every estimate I've seen, between 20% and 25% of the Unistat population takes at least one of these - were discovered by accident. By serendipity. In the 15 years after 1945. In 1952 a tuberculosis drug didn't work for TB, but iproniazid sure elicited euphoria when tested! Bingo: the first antidepressant. The drug that became Tofranil was supposed to work for schizophrenics, but it didn't help them, only made them run naked into town, laughing. Another antidepressant. In 1949 lithium was discovered, by accident, to treat manic depression. In 1957 Leo Sternbach was about ready to give up his research into a class of antihistamines - things were looking like a dead end - when he stumbled onto the benzodiazepines: your Valium, Xanax, Lorazepam, Klonopin, etc.: an empire of anti-anxiety drugs, and a huge influence on the tonality of culture in the West in the latter half of the 20th century.

With better technics, we learned much more about neurons and neurotransmitters. The SSRIs seemed to treat depression and anxiety. They were really the last big breakthrough. Ever since then, clinical trials that have made it to Phase III have been nothing but huge, sad, very expensive wastes. And so Novartis, GlaxoSmithKline, AstraZeneca, Pfizer, Sanofi and Merck have by and large quit trying. They've halted clinical trials and moved on to research that shows more promise. The pipeline for new psychopharmacological drugs is dry.

                                    psilocybin, very much like serotonin in structure

Wait a minute: with more neuroscientists than ever before, far better imaging devices, a tremendous acceleration of knowledge about the human brain over the past 30 years...why? And mental illness takes an increasing toll on us. If not you, someone you know. Why is this so difficult? Is it because what R.D. Laing called "the medical model" finally showed its hand? (A pair of nines?)

Again: our technology to map our cells, genes, and organs at ever finer grain is greater than ever. We now have a deeper understanding of the human genome, an explosive discovery of the complexity of the epigenome, increasing understanding of how our environment and microbes interact with us...why don't we have a drug that will cure depression by now? Are we simply too complex to understand? Were we destined to be granted a brief window of time in which a few "happy accidents" would yield up as good as it gets, and it all ended 30 years ago? What about our computing power and pharmacological knowledge? Isn't it also subject to Moore's Law: a doubling roughly every 18 months? Shouldn't we have had a bevy of breakthroughs by now?
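The Moore's Law arithmetic invoked above is easy to make concrete. A quick sketch, assuming only the 18-month doubling period quoted in the text (everything else follows from it):

```python
# If computing power doubles roughly every 18 months - the figure
# quoted above - then thirty years allows 30 / 1.5 = 20 doublings.
years = 30
doubling_period = 1.5  # years per doubling
doublings = years / doubling_period
growth = 2 ** doublings  # total multiplier over the whole span

print(int(doublings), f"{growth:,.0f}")  # 20 1,048,576
```

That is: roughly a million-fold increase in raw computing power over the period in question, which is exactly why the dry drug pipeline seems so puzzling.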

What are we doing wrong?

In 2011 Eli Lilly thought they had a breakthrough for schizophrenia. They'd given PCP to mice, then their new drug, and...the mice calmed down! Everything went well. They got to Phase III clinical trials (humans) and 18 months later the drug was dead. Placebos worked just as well. Lilly is another company that has all but given up now too.

                                    LSD: like psilocybin and serotonin, structurally

Some New Ways of Thinking and Genuine Promise 
Steven Hyman of Harvard and M.I.T. knows this field well. He was quoted in an article I read as admitting of his colleagues, "People are tired of curing mice."

Let's go back to the last breakthrough: Prozac and all its cousins.

It had been assumed that, when those happy accidents occurred, there must be a theoretical basis. Pharmacologists have always acted as if they were on top of what was going on, but the trade secret was that they were faking it: when a drug worked, it went on the market, people used it, and it "worked" well enough, but at first the chemists and psychiatrists had no idea why. With better understanding of the brain, they revived the ancient model of an imbalance of humors as an explanatory scheme. Only they juiced it up: they found these drugs altered neurotransmitters. Therefore, the lack of the neurotransmitter caused the disease! It seemed quite plausible, and very much like the hardcore finding that insulin works for diabetics.



Nassim Nicholas Taleb says this is a classic case of the "reverse-engineering problem": drop an ice cube on the floor and then go play cards with your friends in the other room. Can you visualize the cube melting down into a tiny pool of water? Of course you can. You walk back into the kitchen and see a tiny pool of water where you had dropped the cube. It's pretty straightforward. Now: imagine walking down the street and coming upon a tiny pool of water. A little spot of wet. How many ways can you dream up to explain the cause of this spot?

A cop comes upon a drunken man looking for his keys, at night, under a streetlight. The cop asks the drunk why he keeps looking under the streetlight, and the drunk says it's because the light is so much better there.

Obviously, even our best researchers have been looking where the light was bright. And the reverse-engineered explanation of our not-all-that-great/we-can-do-better psychopharmacological drugs? Human. All-too human.



The neurotransmitters are not the cause of the mental illness. They merely point at the underlying cause; neurotransmitters (dopamine, serotonin, norepinephrine, etc) are tangential and partial. Reverse-engineering to allow more serotonin to remain in the synaptic gap between neurons was a genius move; too bad there are a handful of studies that show SSRIs work little better than placebos. (For some people they have worked well enough; I don't want to slight this!) All in all, there's a "truthiness" about depression drugs.

In studies we treat everyone the same, while knowing they have variable epigenomes. This is receiving major research attention and seems quite promising, to my eyes. We also have a semantic problem: experts observe and test a patient, then name the disease they "have," which is a major problem, because people and diseases do not fall into our socially constructed, convenient categories as neatly as we'd like. This problem is now far more widely acknowledged than ever, which seems promising to me. One example is the Research Domain Criteria: map behavioral abnormalities and symptoms and link them to specific causes in the brain, without labels like "schizophrenia" or "panic disorder." Why is this approach better? Because it's more targeted. Instead of looking at one or two neurotransmitters that "cause" schizophrenia, we try to find out specifically what causes people to hear voices, or become catatonic.

The idea that we must take 18 years from conception through clinical trials is being re-thought. Even more crucially for mental disease: non-human animal studies long ago reached diminishing returns. Now the idea is that small-scale, carefully controlled studies on humans will speed up the process and may yield breakthroughs in shorter periods.

Another area of promise: when a drug failed, it often worked for a few people. But our gold standard of drug testing: double-blind and placebo-controlled? The rules were that if the placebo worked as well as the drug, throw out the drug. But the people who were helped probably should have told us something.

Along those lines, there is a strong call to restore abandoned or "invisible" clinical trials to correct the scientific record. We may learn some very interesting things from "failed" trials.

The techniques surrounding stem cells have accelerated at a dizzying pace, and for the better: now researchers can test cells and drugs in a dish and make very good guesses as to whether a compound would have some efficacy.

With the mapping of the human genome in 2000, hundreds of utopian promises were made that now seem embarrassing or like outright quackery. But there was reason to be optimistic. We thought that because we were very complex, we'd have the most genes, but instead of 100,000 we only had about 21,000. Grapes have more genes than we do: this was nothing like what we'd expected. Worse: 13 years later we now know that a "bigger" system - in terms of complexity - governs the genome: the epigenome. It turns out that RNA plays a far, far bigger part than we'd thought. The complexity can seem overwhelming.

In 2002 researcher Andrew Hopkins came up with an eye-opening paper, the "druggable genome": Okay: we'd thought we had 100,000 genes. We have closer to 21,000. He estimated that only about 10% of those genes coded for proteins that could bind to small molecules, which is how drugs work, basically. So: about 2,100 genes. But he estimated that, of those, only about 20% would be likely to involve diseases. So now we're down to about 420 possibilities for targets. And then he guessed we'd already discovered 50% of those (probably accidentally?). We only had 210 targets left? For all diseases, not just mental illnesses? Not exactly a rosy scenario. But...
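Hopkins's funnel, as described above, is just compounding percentages. A quick sketch of the arithmetic (the figures are the rough estimates quoted in the text, not Hopkins's exact numbers):

```python
# The "druggable genome" funnel (2002), using the rough estimates above.
genes = 21_000                          # human protein-coding genes, not 100,000
druggable = int(genes * 0.10)           # ~10% code for proteins that bind small molecules
disease_linked = int(druggable * 0.20)  # ~20% of those likely involved in disease
remaining = disease_linked - int(disease_linked * 0.50)  # ~half already discovered

print(druggable, disease_linked, remaining)  # 2100 420 210
```

About 210 undiscovered targets, for all diseases: each percentage cut compounds on the last, which is why the final number is so startlingly small.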

Cheminformatics! This is a burgeoning discipline using the aforementioned computational doubling: there are tens of thousands of compounds in digitized libraries. Do you test them all? Two guys wrote an algorithm to teach a computer to sift through a welter of data on TB, which is becoming antibiotic-resistant. A Big Deal, quite threatening to all of us, potentially. Their algorithm said: find all compounds that are like the drugs that used to work on tuberculosis. So you get that data set. Then the algorithm says: throw out every compound known to be toxic to mammalian cells. You have a smaller set, but a safer one to work with. The algorithm rediscovered a 40-year-old drug that had been shown to have anti-TB properties but had been forgotten.
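The two-step sieve described above can be sketched in miniature. Everything here - the compound names, the similarity scores, the threshold - is invented for illustration; a real cheminformatics pipeline computes similarity from molecular fingerprints (e.g. Tanimoto scores) rather than reading it from a table:

```python
# Toy sketch of the two-step sieve: (1) keep compounds resembling drugs
# that worked on TB, (2) discard anything known to be toxic to mammalian
# cells. All records below are hypothetical.
compounds = [
    {"name": "cmpd_A", "similarity": 0.91, "toxic": False},
    {"name": "cmpd_B", "similarity": 0.88, "toxic": True},
    {"name": "cmpd_C", "similarity": 0.35, "toxic": False},
    {"name": "cmpd_D", "similarity": 0.79, "toxic": False},
]

def sieve(library, threshold=0.75):
    """Return compounds similar to known anti-TB drugs, minus the toxic ones."""
    similar = [c for c in library if c["similarity"] >= threshold]
    return [c for c in similar if not c["toxic"]]

candidates = sieve(compounds)
print([c["name"] for c in candidates])  # ['cmpd_A', 'cmpd_D']
```

The point of the design is that the expensive step (lab testing) happens only on the small, pre-screened set, which is how an old, forgotten compound can resurface.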




Even more interesting and promising: researchers in Cambridge, MA have taken messenger RNA (mRNA), an ultra-fragile molecule which, when injected, activates the body's immune response, tweaked a couple of "letters" in its nucleotide sequence, and made a non-fragile mRNA that does not turn on the immune system. What this could do is take the information from the DNA in a gene and use it to "fix" missing or broken proteins in another cell, in effect causing a patient with a (probably inherited?) protein abnormality to make a drug inside their own cells!

Nessa Carey, a gifted explainer of how epigenetics works in our bodies, has urged us to be cautious about getting too excited over drugs based on DNA-RNA, because so far, "One of the major problems with this kind of approach therapeutically may sound rather mundane. Nucleic acids, such as RNA-DNA, are just difficult to turn into good drugs. Most good existing drugs - ibuprofen, Viagra, antihistamines - have certain characteristics in common. You can swallow them, they get across your gut wall, they get distributed around your body, they don't get destroyed too quickly by your liver, they get taken in by cells, and they work their effects on the molecules in or on the cells. Those all sound like really simple things, but they're often the most difficult things to get right when developing a new drug."

Finally, there is a very real call to combine all our new technologies with an active looking for happy accidents, like in the 1945-60 period. We find as many compounds that could possibly have efficacy, get people willing to be guinea pigs to try them (we have far better ways to guess at what's likely to have horrendous side effects or death-dealing qualities, but we're by no means "covered" here), and see what happens! Yes, the dark side is that the poor will probably be the ones to sign up...How do we find new things to try? "Scientists Map All Possible Drug-Like Chemical Compounds." It turns out the drunk looking for his keys was far more accurate an analogy than we might've guessed. Or wanted to guess. Check out all the unexplored chemical "space" yet to be charted! It reminds me of the incredible number of phenethylamines and tryptamines that Alexander Shulgin mapped: but a drop in the ocean? (Shulgin deserved the Nobel Prize for Chemistry: just read-up on his career! It's almost criminal he didn't get the Prize.) It's like looking for signs of life in the Milky Way! Or more prosaically: like geologists learning how to more profitably drill for oil. It's also about algorithms and possibilities and adventure and hellacious mistakes yet to be made.

To all of us looking for better living through chemistry: Bon appétit! I do think we may make it through this bottleneck to a whole new world of more sophisticated drugs that will make all the ones we've had since 1945 look primitive. Maybe?

Some Of The Works Consulted:
The Epigenetics Revolution by Nessa Carey
"No New Meds," by Laura Sanders:
http://www.sciencenews.org/view/feature/id/348115/description/No_New_Meds
Happy Accidents: Serendipity In Modern Medical Breakthroughs, by Morton A. Meyers
"The Psychiatric Drug Crisis" by Gary Greenberg:
http://www.newyorker.com/online/blogs/elements/2013/09/psychiatry-prozac-ssri-mental-health-theory-discredited.html
PIHKAL: A Chemical Love Story, by Alexander and Ann Shulgin
"Where Are All The Miracle Drugs?" by Brian Palmer:
http://www.slate.com/articles/health_and_science/human_genome/2013/09/human_genome_drugs_where_are_the_miracle_cures_from_genomics_did_the_genome.single.html
"Messenger RNAs Could Create a New Class of Drugs," by Susan Young:
http://www.technologyreview.com/news/512926/messenger-rnas-could-create-a-new-class-of-drugs/
"Faster, Smarter and Cheaper Drug Discovery":
http://www.sciencedaily.com/releases/2013/03/130321131920.htm
Serendipity: Accidental Discoveries In Science, by Royston Roberts
Hope or Hype: The Obsession With Medical Advances and the High Cost of False Promises, by Richard A. Deyo and Donald L. Patrick
"Experts Propose Restoring Invisible and Abandoned Trials to 'Correct the Scientific Record'":
http://www.sciencecodex.com/experts_propose_restoring_invisible_and_abandoned_trials_to_correct_the_scientific_record-114055
The Black Swan: The Impact of the Highly Improbable, by Nassim Nicholas Taleb

Friday, September 27, 2013

Evgeny Morozov, Thomas Pynchon, and the dot.com Bubble

I had been reading tons of stuff over the past few days on three fascinating cybermedia critics: Sherry Turkle, Jaron Lanier, and Douglas Rushkoff. In, say, 1996, all three were fairly gung-ho about the vast liberating potentialities of the digital era; now, in 2013, all three have quite grave doubts about how things have turned out. All three are stellar thinkers (I think one of them is just a staggering genius who should be far better known). They all came at cyberculture from different directions. But I got sidetracked, so maybe next month.

The Internet as "we" know it is only about 22 years old. By 1995, only 15 million people were on the Net. I find it jaw-dropping how It has changed everything in such a short span of time. In studying Turkle, Rushkoff and Lanier and how their views have changed, I spun off serendipitously into all sorts of other areas. Among other things, I found I didn't understand the "dot-com" bubble bursting all that well, so I started poking around  for assumptions about commerce and the Net, 1995-99.

                                          Kevin Kelly, one of the uber-cyberutopians

Concomitantly, I've been reading Pynchon's new novel Bleeding Edge - 'cuz it's freakin' Pynchon! - and it turns out to have a lot to say about the bubble. I'm calling it a coincidance, Robert Anton Wilson's word for something between "coincidence" and "synchronicity," that was actuated by his reading of James Joyce's Ulysses and Finnegans Wake. One of RAW's books is titled Coincidance, and reading it will elucidate what he meant by "coincidance" far better than I did here...

Anyway, I found over the past six or seven years that I'd developed an immunity to the approximately 3700 books (and counting) that hype how great this new digital age will be. I've seen plenty of upside; most of us will by now acknowledge there's quite a downside to it, too. The stakes seem not inconsiderable, to put it mildly and doubly negatively.

I think I saw downsides before most of my friends and colleagues, but that may be due to the sociology of knowledge: many of them had jobs that were "wired" to the gills; meanwhile, I've struggled. My position as a reader-writer-thinker type has been on the edge of poverty; you simply get different perspectives from that vantage point. And yet, in keeping with the sociology of knowledge as I understand it (largely through Berger, Luckmann, Mannheim, Vico, Werner Stark, early Marx and McLuhan), my perspective is but one, yet possibly incorporates a wider view of the scene: I have no ideological commitments in the sense that I have not had to answer to authorities or bosses or peers in business, academia, or a funded private sector. If I had had a job in any of those places I believe I'd be like anyone else: being in those situations necessarily influences (an unkind word would be "warps") one's perspective on things. A steady, livable income is obviously desirable, but I have not had that. Mutatis mutandis: those in steady, honorable positions know things that I don't. (Obviously!)

So I found myself gravitating toward critics of cyber-utopianism (I miscounted: there are 3956 books that do nothing but encourage us to think It's All Gonna Be Just Great), and found a hero in a young Belarus-born academic named Evgeny Morozov. Perhaps you've read him: he's published two books, and had articles in Foreign Policy, NYT, WSJ, TLS, Economist, Slate, New Scientist, New Prospect, Boston Review, SF Chronicle...and many more. If the info on his Wiki page is right, he's not yet 30. He was educated in Bulgaria, moved to Berlin, has been at Stanford and Georgetown, and now he's working on a PhD in the History of Science at Harvard.

                                                      Morozov

What an odd egg Morozov is. He already seems to have an encyclopedic grasp of technology and media and how they affect the social sphere. He's perhaps the foremost critic of cyberutopian rhetoric, and for an Eastern European not yet 30, his rapier wit in English at times shines with a Gore Vidal-like gleam. At other times he reveals his age, but I must caution those conditioned to the rosy future of all things digital: Morozov the prolific gadfly may ridicule once too often, but his voice seems a necessary corrective as we move further into the Snowden Era. Color Morozov non-sanguine. His position as a species of Nay-Sayer seems absolutely legitimate, and his knowledge and rhetoric strike me as stellar.

Okay, I'm not the biggest fan of hatchet jobs in book-criticism, and have long thought the only people who deserve to be savaged are the powerful, the wealthy, the pompous. If you're paying I'd be happy to savagely review Dick Cheney's latest book about how right he's been his whole life, or anything Donald Trump writes. But Evgeny reminded me that some of the cyberutopians in the second decade of the 21st century are ripe for the hatchet, and just check out this job Evgeny pulled off in The New Republic. He's bilious, abrasive, sarcastic, very smart, and funny. An enfant terrible. 
(I've looked at Khanna's stuff and think he deserves everything that Evgeny dishes.)

His two books are The Net Delusion and To Save Everything, Click Here. But the subtitles are the calling cards for Morozov, he who is fed up with the rhetoric of cyberutopianism: "The Dark Side of Internet Freedom," and "The Folly of Technological Solutionism."

Morozov's history of the Net is one of the better ones I've seen (see The Net Delusion), and he goes way back to Pentagon-funded engineers like Vint Cerf, Norbert Wiener, Vannevar Bush, and David D. Clark. Where he gets really interesting is when he begins to discuss Kevin Kelly, Stewart Brand, John Perry Barlow, Howard Rheingold (and yes, Jaron Lanier) and their crowd. There are at least 93 books that go over their story and I'm betting you know these guys well. Morozov seems to admire them, and I definitely do, too.

The problem is: these guys were anarchist-libertarian former hippies with deep roots in the hedonistic 1960s, and they developed a revolutionary rhetoric about how the Internet could change the world and make it a far, far, far better place. With the Net, we could be rid of the Intermediary: free exchange of ideas, different ways of trading, and politics would all transform our social reality. They were preaching a "flat" world at least 10 years before that colossal fraud Thomas Friedman was. But these guys were the real deal, and they seemed to believe their own rhetoric. All that's not the problem, though. The problem was: they believed they could deal with The Suits/Wall St/Control, and we now see how that turned out. (I'm consumed by the "Information wants to be free" ideology they came up with. I believed it 98% in 1999. Now? Uhhh...maybe a forthcoming blogspew?)

But back to 1995-99. Who was it that once said that history was the temporary resultant of rival gangs of programmers?

Morozov thinks the lasting achievement of the early cyberutopians was that they wrested the Net from the Cold War-mentality short-haired engineers. The cyberutopians in turn believed they were smarter and could handle the Big Biz people who would want to use the Net to make money. At some point, the cyberutopians realized they'd need some cash to make their ideas go over big, so they found themselves having dinner with Suits, and seemed to genuinely believe they could do their thing using private capital without getting, to borrow a term from hypertext pioneer Ted Nelson, "intertwingled."

Here's what I'm still trying to puzzle out: why did Venture Capitalists invest at all in these start-ups that seemed like really neat-o ideas but couldn't seem to deliver real services? This is fascinating to me. I can't help but think the cyberutopians' rhetoric hypnotized them into abandoning all traditional methods of assessing risk and the likelihood of true financial performance. It seems that Bill Gates (who was once "hippie" enough that he might have joined Stewart Brand, but didn't) and other believers in NeoLiberal economics being done with the Net, PLUS the cyberutopians' dazzling pitches, clouded the Venture Capitalists' minds. And at Pets.com, probably the most-cited example of the ensuing insanity: at one point - around late 1999 - they were spending $12 million on advertising against only $620,000 in revenue. The bubble exploded soon after. O! The humanity!

I thought of writing about Rushkoff/Turkle/Lanier but ended up typing far too much around Evgeny Morozov. I barely touched on the Bubble stuff, probably because I'm still trying to understand it, with 13 years' hindsight. But I'd like to end with Pynchon talking about this stuff in Bleeding Edge:

It's Spring of 2001 and the heroine of the book, Maxine Tarnow, fraud investigator in Manhattan, is doing some detective work:

Silicon Alley in the nineties provided more than enough work for fraud investigators. The money in play, especially after about 1995, was staggering, and you couldn't expect elements of the fraudster community not to go after some of it, especially HR executives, for whom the invention of the computerized payroll was often confused with a license to steal. If this generation of con artists came up short now and then in IT skills, they made up for it in the area of social engineering, and many entreprenerds, being trusting souls, got taken. But sometimes distinctions between hustling and being hustled broke down. It didn't escape Maxine's notice that, given stock valuations on some start-ups of interest chiefly to the insane, there might not be much difference. How is a business plan that depends on faith in 'network effects' kicking in someday different from the celestial pastry exercise known as a Ponzi scheme? Venture capitalists feared industrywide for their rapacity were observed to surface from pitch sessions with open wallets and leaking eyeballs, having been subjected to nerd-produced videos with subliminal messages and sound tracks featuring oldie mixes that pushed more buttons than a speed freak with a Nintendo 64. Who was less innocent here?
-pp.71-72

If The Reader has a recommendation for a particularly great book on the Bubble, or books or articles of dissentual data around Morozov, feel free to drop the title or link in the comments. Aun aprendo. Danke!

Sunday, September 22, 2013

FLOTUS Flouts Humility Yet Again; Practically Waterboards Entire Nation!

First, fellow Americans, choke this 54-second clip from the Liberal Media down, if you can take it:



Michelle Obama must be stopped, ladies and germs. She's out of control and obviously in the pocket of Big Water. Her latest extremist liberal agenda? To force more water down our throats! Where is the outrage?

Oh, here's some:

I knew Rush Limbaugh would be there for me, and he is. And thanks for digging deeper, Rush: she's not only in the pocket of Big Water, but Big Soda too. And liberal (which means socialist which means communist which means Nazi-Stalinist-Bin Laden Lovers which means Traitors) bloggers think they're funny with lines like, "If Mrs. Obama asked the nation to smoke more cigars and go for nightly Oxycontin and mayonnaise smoothies, Rush would suddenly be against that." Yea, real funny: Har...Har...Hardy-Har-Har! (golf-clap).


                                Meet the new Public Enemy!

Alyson Goodman of the CDC thinks her study of the supposed hydration deficit among the citizenry indicates we are probably choosing less healthy "beverages" than water. Oh? So water is now a "beverage," is it? How Orwellian! I don't know about you, friend, but when I'm settling in with something called a "beverage" I better be gettin' good and hammered. I'm sorry, Michelle, but I'll quote another First Lady and Just Say No to you and your Big Government water-pushers.

Why do they hate America? Why do they want to control what we drink? Why do they hate freedom so much? Isn't it enough that the thing they are now telling us can improve our health (water) is the very same thing that has MURDERED so many people in Boulder, Colorado recently? Does the First Lady have no sense of decency?

I'm afraid, fellow Americans, that Rush has only hit the tip of the ocean-liner-killing iceberg: water is far more dangerous than my fellow Americans know. I used italics in that last sentence to highlight how grave I see the situation. What I aim to do is provide a little relief with some FACTS.

Friends: you start off small. Maybe on a hot day you have nothing better to do but grab a bottle of water at the Try 'N Save and the next thing you know you're hooked: soon your body will eerily seem like it's made of water. And you can't get enough. It's insidious! The next thing you know you're living in a shack in the Great Pacific Garbage Patch, which is largely made of...used plastic water bottles! The kind Michelle Obama got you started on! You're condemned to live in a desolate, floating, watery grave. Is this what you want, Mrs. Obama? Dear Reader: THINK!

Oh, it gets worse, yes in-deedy.

You say, "But Overweening Generalist-dude, you're overreacting..."

Oh, I'm "OVERREACTING," am I???

"Yes: I'll just drink a little bit more water from my tap," you say.

I thought you'd say that. Tsk tsk tsk! You poor, unwitting sap. Lemme pour you a tall cool glass of FACTS, straight-up: your tap water is laced with Prozac, Zoloft, Paxil...all those "antidepressants." Stuff like Dammitol and Repressitol. It's...depressing, frankly. Hell, maybe you should drink the water. But you need to take the bull by the tail and look the facts in the face: SSRIs are in your tap water, friends.


Maybe it's a set-up. I wouldn't put it past her. Get the public so loopy on funny-pills that they don't even notice their water lacks a certain...taste. Then they take your farm because now you're living on/in the Pacific Garbage Patch, wandering around like a zombie, all wasted on Dammitols. (Good luck with the farm in the American Southwest, suckers!: No water!)


Call me Mr. Fancy Pants, but I don't care: water tastes boring to me. If you wanna indulge a little, hey, it's a free country (for how long under the Obamas I don't know). If you wanna live all sedated on a patch of floating garbage the size of Texas, it's your funeral. Water for me lacks...I dunno...bourbon? Anyway...


But wait: it gets even worse. (Worser? Worsier? Worsy? Worsiestlier?) It's bad.



                       This image seems to say, "I'm Michelle Obama
                      and my husband and I are bent on world 
                      domination, please vote for us!" What do you
                      make of it? Or are you too AFRAID to say?

If you live in some parts of our great-great Nation, you can turn on the tap and drink a brain-eating amoeba! Yep: some All-American boy in Louisiana was playing with the Slip 'N Slide and one drop of water from the hose went up his nose, which was all the amoeba needed and now the child's brain is colonized by a microbe that literally eats away the precious grey goo and kills. That's the bad news. I call that a Slip 'N Slide of Death, friends. Let's not sugarcoat it.


The good news is this: they blame the rise of brain-eating amoebas in US waterways on global warming, which we know is a Liberal Plot, so therefore nothing to fear. They can't scare us. Also: did you read that article? The sources are NBC, the CDC, NPR, National Geographic and "scientists." All of them I'm Smarter Than You and I Drive a Volvo and Eat Brie Liberals. Therefore the news is false. And Katrina was just a fluke anyway. (By the way, watch where you step in Louisiana, as you can pick up a nasty fluke...)


You ain't heard the end of the implications of Mrs. Obama's attempt to get more water into us. Did you know you can drink too much water and DIE? It's true. You can call me a "cup half-empty" kinda guy, but friends: I say we take no chances here. Call me rather a Cup Bone Dry guy. 


The evidence against the seemingly harmless chemical compound of two hydrogen atoms bonded to one oxygen seems like a little thing. And indeed it is. Until you add it up. Pretty soon: you're staring at a big glass of water. Murderous, Obama-endorsed water. And just look: that sweaty glass of water seems to be staring right back at you, just laughing. (Sure, that's probably only my reflection, but I think my point holds.) And you say you're gonna drink that? What have you been smoking?


Now I lay the hammer down and cinch my case against Mrs. Obama and her attempt to water-log the citizens of the United States of America: Not long ago a man who studies such things - and yes, he's a scientist, but sometimes they actually know something worth knowing - Stanley Falkow of Stanford, who studies bacteria, said that the "world is covered in a fine patina of feces." Talk about the Straight Poop!


And why is the world covered in micropoo? Because people don't wash their hands! They think they do, but they don't. And it stands to reason that the closer you get to the bathroom, the more concentrated the patina. And where are you going to be spending more time if you listen to the FLOTUS? Think about it: you drink all those cups of water They want you to drink and suddenly, you're getting far more shit - literally - on you than ever before. Because you're gonna hafta pee, let's face it.


You can call me alarmist, but I think I know a thing or two when it comes to poo, and you followers of Mrs. Obama don't know shit: patinas of feces carry patinas of microbes, brain-eating types and every damned thing else you can think of. 


Sadly, fine citizens, I'm forced to believe that this latest plot by the Obamas is all about making us sick, infecting our very bodily essences with feces and microbes - call it a Bataan Death March Into Obamacare - while causing some of us to literally drink ourselves to death, all while Nature has seen fit to hit us with a drought. If we live, we're destined to wander like zombies, zonked out on Zoloft from the tap water, along the rim of the Great Pacific Garbage Patch. And the Cubs will still have never won the Series.


My recommendation: strong beer or red wine or bourbon, straight. Wear gloves. Everywhere. Even to bed. Don't shower any more than is necessary. Less, even. Sure, you'll smell like...a patina of feces. But the Good News is: only the Good Americans will smell too. Ye shall know them by their smell, which be like unto feces. Get used to adult diapers, even if you're a teenager. It's a small price to pay to maintain our freedom.


Anything else and the Liberal Agenda wins again. You can thank me in your prayers. Not this time, Mrs. Obama! Sorry!


I remind us that there were some who knew this all along! Clemenceau! Communist subversion!