As of yesterday, I've been "keeping" a journal for 27 years now. I've probably missed writing something for a given day maybe 20 times, probably fewer. It is compulsive, and obviously a habit.
I've filled cheap spiral-bound lined notebooks - the cheapest I can find at a stationery store or supermarket - writing on both sides of the page, with lots of lists in the top margins and little bits of arithmetic.
I'll fill one up over 11 to 16 months, find a swatch of cheap masking tape and write the beginning and ending dates on it, then plaster the tape onto the cover of the notebook, then stash it away in a closet with the others.
Sounds kinda sick? Maybe. Sounds like something Prozac might help? Maybe. After a couple of years of doing it, I went on a kick of reading all of Gore Vidal: his historical novels, his quasi-surrealist "outrageous" novels (like Myra Breckinridge, though there are others). And - Gore would've hated to see this - I think he was a better essayist than novelist. Even though I often vehemently disagree with Vidal - especially on the value of certain writers over others - I'm always impressed with his considerable ability as an essayist.
Gore Vidal, who half-jokingly asserted that diarists were dangerous.
When he was in his early twenties he lived with Anais Nin.
And one day I was reading an essay when the topic of diarists came up. Vidal thought - perhaps this was part arch-humor - that diarists were suspect. He linked assassins (Arthur Bremer, for example) to their diaries. People who wrote only for themselves were suspect. It hurt, a little. But I kept on.
What the hell do I write? Well, the first few years I'd write a lot, every day. Because my life seemed exciting, and I wanted to remember it. Many years later I sat down and read the things I wrote in my early twenties...and it seems like I'm reading someone else's life. Frankly, I sound like a precocious 14-year-old girl. "I fixed my bike!" Exclamation points. I'd like to think I'd been putting off re-packing the ball bearings, but I probably just fixed a flat and...was glad I was able to ride again. (!)
Now, I'll often note the mundane. I'll cover four days on one page. Whether I did yoga or not, stuff I ate, people I exchanged emails with. A particular interaction with a guitar student from the day. Oh-so quotidian, and I know you'd be bored to read it.
A reader may note I used the term "diarists" in the title of this blogspew, but when I talk to my friends, I say "journal." Because I've read many famous published diaries (Anais Nin, Samuel Pepys, Anne Frank, the usual suspects) and they seem like "literature" to me. We know Nin thought there would be readers of her diaries. Having an audience in mind greatly changes the content and tone, to put it mildly. Certainly there are entries among my logorrhea that seem fit to be read by others, but when I think about it, I'm one of those compulsive jotters who's really okay with them not being read after my death. What the hell? Page through them for a day or two, have a laff, learn something new and lurid about beastly-dead Michael, then fer crissakes: burn the things for warmth. Or light.
Or just to buy space in a closet.
Okay, some of you actually liked finding great-grandma-ma's diary from the late 19th century. I get it. Do I see myself as great grandma-ma? No. But perhaps I should...
Another reason I don't call myself a "diarist" is that I used to think it gendered: women keep diaries; men write in journals. I don't believe that anymore, but I'm okay with being stuck in my ways. Also: there's a sense in which the bulk of my dull recordings of my days seems almost more like a "log" and doesn't even deserve the same term as what Anne Frank did.
To return to Gore Vidal's riff - which he repeated a few times - I think he has a point. When Jodi Arias was arrested, she wrote a memoir (apparently) in prison, "in case I become famous." Ted Kaczynski, rather famously, had a manifesto. Norwegian mass killer Anders Breivik, who killed 77 and left over 300 injured, gifted us with a 1,500-page Facebook document in which he railed against immigrants and multiculturalism, claimed Western culture is dead, and described how close he felt to his "Viking" heritage, etc. He also dropped some of his charm onto YouTube, which I haven't seen. Breivik plagiarized from Kaczynski too. The unkindest cut.
Jared Lee Loughner, who shot Arizona Congresswoman Gabrielle Giffords and others, was found to be a paranoid schizophrenic preoccupied with the English language, alternative currencies, and a fear of mind control. He bequeathed something for us all on YouTube before heading down to the rally to shoot. (The understanding and representation of Loughner in my neural circuits sit adjacent to Robert De Niro's character in Taxi Driver, Travis Bickle, and Secret Service guys, and, in a private moment, "Are you talkin' to me?" And no wonder: screenwriter Paul Schrader had Arthur Bremer in mind.)
The Virginia Tech killer, Seung-Hui Cho, sent a roughly 1,800-word statement to NBC, with a cache of personal videos and photos. He was inspired by Columbine. Ex-LAPD cop Chris Dorner, fired from the force, left an 11-page manifesto about why he had to kill (it was a "necessary evil"); he was pissed about the Rodney King incident and about how he was treated by fellow cops. So he lost it. I remember watching that manhunt live on TV in Los Angeles. The cops looked about as ready to take Dorner alive as they were to take the SLA alive, once they were sure Patty Hearst wasn't in that safe-house in Los Angeles.
I could go on. And on and on. And you may say, "Yea, but you're talking about manifestoes and YouTube videos and Facebook rants." And I say, yea: I think social media has made a lot of people into diarists of a sort.
But really: the Vidal riff is too arch by half. Most of us do it for therapy or simply to ward off "real life" when it becomes a bit too intense. When I read a greatly abridged version of Pepys's diary a few years ago, I was struck by how often he went to the theatre and saw Shakespeare. He notes which play, and I think, "Gee, he saw Taming of the Shrew just a few months ago." But I'm like that with film noir. Read my...errr...journal and note how often I re-watched Double Indemnity or Out of the Past or The Killers or even Armored Car Robbery (saw this again two nights ago: lots of 1950 location shots near places in LA I used to live, and Charles McGraw may be the most hard-boiled actor in all of noir)...
The writer Sarah Manguso published a 93-page book about her 20+ years of compulsive diarizing, and I found this interview with Julie Beck interesting. I think Manguso's sickness (a rare autoimmune disease she wrote a book about) and middle-class upbringing must have something to do with writing 800,000 words and counting. I have never counted my words, not really caring. What Manguso says about when she started resonates with me: things in her life seemed momentous, so much had happened to her, to her own mind, and she wanted to remember it. It is a way of dealing with mortality and memory, no doubt. She thinks keeping a diary serves as a safeguard against "living thoughtlessly." I can see that. But I'm too close to it all to know to what extent it worked. It does provide solace amid anxiety. The word "graphomania" comes up.
For Manguso, pregnancy and its hormonal cataclysm changed her view of her compulsive diarizing: ordinary "reality" became as important as those "momentous" events, which usually, in hindsight, were not so momentous. My favorite line from the interview:
Every exchange that I had with another person, everything I observed, every little throwaway moment I had on the subway observing this and that, the denseness of the experience just seemed unmanageable without writing it down.
For me, this is redolent of a Borges piece, or maybe something from Oliver Sacks.
Here's a huge difference between Manguso and me: I tend to want to "manage" my excitement over ideas I've read in books. Rarely have little impersonal moments with strangers made it into my log/journal/diary, unless they were exceptionally funny or wonderfully weird. I have witnessed verbal tiffs between friends and acquaintances and written down what I could remember when I got home, in case anyone asked later. What did we do last Christmas? Hold on, I'll go look it up.
In the Beck interview Manguso comments on her diary book, Ongoingness: The End of a Diary, but also on her other books. She says narrative, whether in reading or writing, doesn't come easily to her, hence her style. Then she adds, "and I don't need to read or re-read an entire book or re-watch an entire movie." But I love to re-read my favorite books. With each re-reading I'm able to see more and go deeper into that world. Same with films. But: I am not enamored of narrative either; I return to my books and films for mood, style, effects, form. Last night I saw Truffaut's Jules and Jim for maybe the eighth time. And still, it's only as the film nears the climax that I'm reminded of the ending, which I remember being shocked by the first time. It's quite a climax...so why do I seem to remember it only hazily? I think because I watch it for the friendship of Jules and Jim, the depiction of the French and German countryside from 1900 to 1930, French manners, the simmering mental illness of Catherine, the way they negotiate the menage, the accepted insanity of WWI and Jules and Jim being terrified they might kill each other, the interspersed file footage, the cuts and freeze frames and sheer beauty of Jeanne Moreau. The voice-over. Last night I noted that the first five minutes seem "new" to me (they're not, of course: my brain is blitzed by the romantic mood of the opening), and that the denouement barely registers for me.
I guess some relatively compartmentalized area of my self sees the climax, remembers the shock from my first viewing, sort of shrugs it off as "Of course you had to end a film like this that way for it to have the emotionally logical effect of such a plot, its syntax, the chaotic madness of the femme etc..." Then I quickly go back to being bathed in the incredible pathos of the film. (In truth I love Truffaut's 400 Blows even more.)
What actually happens to the characters at the end of Jules and Jim seems trivial to my emotional needs, apparently. I once worked with a librarian who could give a detailed chronological synopsis of what happens in a work of fiction, and I thought her simply marvelous for this display, so different was her mind from mine.
This apprehension of how individual nervous systems abstract signals from our environment and concentrate them - this otherness of other people's minds - is what makes me love them. Perhaps my diarizing, via personal feedback, helped me toward this appreciation?
Finally, I put forth the idea that "social media" has made many of us diarizers. This may be part of why I don't "do" social media. I've yet to Tweet. I was on Facebook for one day. I've heard of "Snapchat" but I don't really know what it is, nor do I care.
However, I started blogging in order to see what I think about ideas, and maybe entertain certain strange minds that resonate with mine. If blogging of the OG sort can be considered social media, so be it: I do social media. But no doubt that rare handful of posts that are mostly about "me" must qualify as social media. And this post seems the most self-indulgent one I've done. I'll try to wait a long time before I write in such a personal way again. Some aspect of my nervous system seems to be pushing itself to the fore and saying "This wasn't an OG post!"
Oh, well.
Some Sources Read Just Before Writing This
"Poor Historians: Some Notes on the Medical Memoir," by Suzanne Koven
"The Pleasure of Keeping - and Re-reading - Diaries," by Elisa Segrave
"Personal Manifestos: Never A Good Sign"
Jia Tolentino's insightful review of Manguso's book about her diary
Friday, September 9, 2016
Monday, March 24, 2014
Recent Research on Odors
Last July I read a delightful essay by a former chemistry teacher, responding to an article in Scientific American that defended the minor, competing theory of how olfaction works in humans (and presumably other mammals and critters): that each odorant molecule has a characteristic quantum vibration, and this is what distinguishes smells for us. In one test, a hydrogen atom in a molecule was swapped for a heavier deuterium isotope - which technically does not change the molecular structure - yet both flies and people could smell the difference. Previously, the idea of quantum vibration at work in the nervous system was laughed off by detractors as "fashionable junk science." The reigning idea of olfaction is that a molecule docks, lock-and-key style, with one of our 400-or-so different types of odor receptors (which relay to the olfactory bulb), and the receptors act in concert. Once docked, a chain reaction occurs and the brain recognizes, "Hey, that smells like sandalwood," or "Ewww! What smells like rotten eggs in here?" The author of the essay, Ruchira Paul, wasn't entirely convinced of the quantum vibrational theory, but was still open to it.
What I liked most were Ruchira's observations that we have a limited vocabulary for odors, that we don't have accurate standards for measuring smells, that our memory of odors lasts longer than our memory of sights, and that our sense of scents seems uniquely intimate in its link with our own biographies and memories, our history. This seems probable because olfaction is wired into the limbic system. She writes that smell is the "forgotten sense" in the sense that, in psychophysical testing of our perceptual apparatus, researchers have had better instruments for measuring our range of detection in sight and sound, because those are purely physical phenomena, while smell and taste are chemical senses and thus more unwieldy and difficult to measure.
The physics: we have three types of color receptors (cones), and researchers have estimated humans can distinguish about 10 million colors. Wavelengths of light turned out to be quite amenable to measurement.
Our ears are very complex, miraculous little organs working in concert, and biophysical research in the branch of physics called acoustics found that humans can distinguish something like 500,000 different tones - a number that diminishes with age. (<---Depressingly, I found I dropped out at 14kHz. And yes, I'm close to twice 25.)
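If you want to ballpark your own high-frequency cutoff without fancy gear, here's a minimal sketch - my own toy, not anything from the studies mentioned here - that writes a pure sine tone to a WAV file; play it, raise the frequency, and note where it goes silent for you (mind your volume and your speakers' limits).

```python
# A toy tone generator (my own sketch, not from any cited study):
# writes a mono 16-bit WAV file containing a sine tone at a chosen frequency.
import math
import struct
import wave

def write_tone(filename, freq_hz, seconds=2.0, rate=44100, amplitude=0.3):
    """Write `seconds` of a sine tone at `freq_hz` to a mono 16-bit WAV file."""
    frames = bytearray()
    for i in range(int(seconds * rate)):
        sample = amplitude * math.sin(2 * math.pi * freq_hz * i / rate)
        frames += struct.pack("<h", int(sample * 32767))
    with wave.open(filename, "w") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        w.writeframes(bytes(frames))

# The 14 kHz tone I apparently can no longer hear:
write_tone("tone_14kHz.wav", 14000)
```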
But what about odors? Chemistry-detection/measurement turned out to be more of a sticky wicket. Many of us grew up hearing and believing that we humans are completely defeated by dogs in our ability to detect odors. If you have an old biology textbook hanging around the house it probably says humans can perceive about 10,000 different odors, while dogs detect 300,000,000. This turns out to be a vastly un-empirical guesstimate from the 1920s. How far-off the guesstimate was we'll get to in a moment.
A year ago, March 2013, I began doing some research on medical diagnoses via odor analysis, because of all those articles on dogs I'd read over the years: how they could detect cancer and all that. It turned out to be fascinating, hot, exciting stuff, and maybe I'll do a separate blogspew on it one day, but just a quick diversion into that area...
Researchers at Cedars-Sinai in Los Angeles say, of course: obesity is not difficult to detect. I mean, just look at that dude! And why are people obese? Well, obviously: they eat too much and don't exercise enough. Jeez, no foolin', Sherlock? Tell me something we don't know! How about this: certain gas-emitting bacteria in your gut microbiome show up on your breath, and the ratio of the gut bacteria associated with fatness to those that aren't may be a deeper reason why you're obese. The implications are <ahem> large. And this gives our overweight loved ones cause for hope, because if we can figure out how to alter that gut-bacteria ratio via drugs or even simple probiotics, we could be on the way to defeating obesity. (And oh man! This has become a hot research topic; there's a lot riding on making this work.) ("Doctors Detect Obesity Bug On Breath")
Also, dig: 11 months ago, in PLOS ONE, a possible discovery of individual human metabolic phenotypes! (Human wha?) Okay, our gadgets are now becoming so sophisticated that the chemical world is finally becoming much easier to map. Maybe it will soon catch up with the purely physical phenomena we can measure. Researchers in Zurich noted that, despite fluctuating factors involving diet and the gut microbiome, people's urine remains "highly individual," and that urine phenotypes (phenotypes: that which we can observe; genotypes: an organism's genetic makeup, which codes for the expression that largely gives rise to the phenotype) persist over time. The Zurich group had a group of subjects exhale into a mass spectrometer over nine days and found that, "consistent with previous metabolomic studies based on urine, we conclude that individual signatures of breath composition exist." I've even heard this individual-breath-signature idea bandied about as a way to get rid of all our passwords, but I'm not sure if the geek was joking or not.
Back to our general sense of smells...
Last September a study appeared in PLOS ONE that I found intriguing. In 1985 a book appeared called Atlas of Odor Character Profiles, by Andrew Dravnieks. Researchers used this book as their basic data set, and with it they've determined that there are around ten basic, tightly-structured categories of odor. They are:
1. fragrant
2. woody/resinous
3. fruity (non-citrus)
4. chemical
5. minty/peppermint
6. sweet
7. popcorn
8. lemon
9. pungent
10. decayed
[Those last two are both "sickening"]
What they hope to do now is demonstrate the soundness of this model - to me an overly-rationalistic take, but what do I know? - by predicting how a given chemical compound is likely to smell. (My first guess? Number 4.) The researchers used very elaborate statistical techniques to arrive at the ten categories and will continue to refine them as they test their model. And maybe I shouldn't be so snarky: the basic model for the sense of taste has remained the same for very many years, only going from four to five since 1985 in the West (sweetness, sourness, bitterness, saltiness, with the relatively recent addition of umami).
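For the curious, here's a rough sketch of the general family of "elaborate statistical techniques" I have in mind: factoring a big odorant-by-descriptor ratings matrix into a small number of categories. I'm assuming something like non-negative matrix factorization run on made-up stand-in data; whether that matches the paper's exact pipeline, I can't promise.

```python
# A hedged sketch: factor a fabricated odorant-by-descriptor ratings matrix
# into ~10 parts -- the general flavor of analysis behind "ten categories."
# The real study's data and exact method aren't reproduced here.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
ratings = rng.random((144, 146))   # stand-in: 144 odorants x 146 descriptors (illustrative sizes)

model = NMF(n_components=10, init="nndsvda", max_iter=500)
odorant_weights = model.fit_transform(ratings)   # each odorant's weight on the 10 categories
category_profiles = model.components_            # each category's descriptor profile

# Assign each odorant to its dominant category and count the membership:
dominant = odorant_weights.argmax(axis=1)
print(np.bincount(dominant, minlength=10))
```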
I hope these guys are on to something, if only because we could all internalize these ten categories and then invent more words to describe the nuances within each. If the model holds up, Ruchira Paul's observation that we don't yet have accurate standards by which to measure smell may eventually be eclipsed by some new, "objective" model.
The Latest: Humans Can Detect One TRILLION Odors ("Conservative Estimate")
You may have heard the news from last week. See HERE for a decent overview. Researchers at Rockefeller University took 26 participants and used 128 different odor molecules. (In the actual phenomenological-existential "real" world there are vastly more odors, but that's part of why this research is so clever.) They tested a person with three vials at a time: two held the same mixture drawn from the 128 odors, and the third held a different mixture; the task was to pick the odd one out. The result: as long as the two mixtures shared no more than about half of their component molecules, people could reliably smell the difference. The researchers admitted that the admixtures were often "nasty and weird." Think about it: they could mix 10, 20, or 30 odors, in any combination from the 128. That yields trillions of possible scents, and people could detect the differences. One basic odor of the 128 might be "orange," another "spearmint" or "anise," but they were mixed together in all sorts of groupings. No wonder they were "nasty and weird."
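To see where "trillions" comes from, here's some quick back-of-the-envelope arithmetic (mine, not the researchers') counting how many distinct mixtures of 10, 20, or 30 components you can draw from a pool of 128 molecules:

```python
# Counting possible mixtures: "128 choose k" for the mixture sizes the study used.
from math import comb

for k in (10, 20, 30):
    print(f"C(128, {k}) = {comb(128, k):,}")

# C(128, 10) alone is roughly 2.3 x 10^14 -- hundreds of trillions -- and the
# larger mixtures are astronomically more numerous, so "trillions" is modest.
```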
We have around 400 different smell receptors, working in concert. The smell of a rose involves around 275 different molecules in a unique combination.
One of the olfactory researchers, Andreas Keller, said, "The message here is that we have far more sensitivity in our sense of smell than for which we give ourselves credit. We just don't pay attention to it and don't use it in everyday life."
I want to see this study replicated many times. It almost seems too wonderful to be true. I hope there's no Clever Hans Effect tainting the research. It makes me wonder about training humans to smell cancer like dogs, but we seem so biased toward sophisticated gadgetry in this regard, and against dogs and human perceptual apparatus alone, that I won't hold my breath...or nose.
Imagining Smells: An Uncommon Gift
Or so Oliver Sacks tells us. Most of us have little trouble conjuring in our minds a sight or sound from our vivid past. But it's rare to summon an induced hallucination of odor. However, some can do this, and Sacks relates what one "Gordon C." wrote to him in 2011:
Smelling objects that are not visible seems to have been a part of my life for as long as I can remember....If, for instance, I think for a few minutes about my long dead grandmother, I can almost immediately recall with near-perfect sensory awareness the powder that she always used. If I'm writing to someone about lilacs, or any specific flowering plant, my olfactory senses produce that fragrance. This is not to say that merely writing the word "roses" produces the scent; I have to recall a specific instance connected with a rose, or whatever, in order to produce the effect. I always considered this ability to be quite natural, and it wasn't until adolescence that I discovered it was not normal for everyone. Now I consider it a wonderful gift of my specific brain.
-pp.45-46, Hallucinations
On the other hand, there are other, more terrifying olfactory hallucinations described in this wonderful book: people who had traumatic accidents, were violently attacked, or witnessed something horrific may, when they chance to encounter the smell (or similar smells) associated with the traumatic moment, experience a shell-shocked "replay" back to the Very Unpleasant Moment.
Let us tend instead to those moments when some odor sends us back in time to something comforting or interesting - which, with the olfactory/memory nexus, seems far more common than the triggering of traumatic memories.
Wednesday, September 18, 2013
On Vision and Judgement
Just let me discuss three recent analyses of the topic before I get out of your hair?
"Experts" and Winners of Classical Music Competitions
So, among the more than 1,000 people in this study on musical excellence - which you've agreed to take part in - are ordinary people, musicians, and "experts," the last being the sort of people who judge the winners of the Tchaikovsky or Paganini International Competitions. There are many of these competitions worldwide, and winning one may get you a tour or a deal to record a few CDs. You signed up for this study, and you're randomly assigned to one of three conditions: 1) only listen to the top three finalists and then try to guess who won; 2) view and listen to the top three and guess who won; or 3) only view the top three, without hearing them play.
Of course, this is a major competition, so all three are shredding, hot-assed players. They all kick ass and play gorgeously. Give it your best shot anyway.
You can probably "see" where this went: the third group - who only watched and never heard a note - guessed the actual winners of BigTime International classical music competitions at rates far above chance. The group (made up randomly of musicians, ordinary people, and "experts") that only listened to the top three did the worst at guessing, and the group that watched with sound did only slightly better.
This suggests a few things. One: we have an unconscious bias toward visual data even when judging audio data; visual data even seems to interfere with audio data. Two: "experts" once again tend to be full of crap. Three: as a longtime rock guitarist, I laugh at this, because it has always been an "open secret" that the coolest-looking guitarist will impress the fans more than the not-so-attractive guy who plays circles around him.
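A hedged aside on what "far above chance" can mean with three finalists (my own toy numbers - Tsay's actual hit rates aren't quoted in this post): blind guessing gets you about 33%, and even a modest-looking bump above that is hard to explain by luck.

```python
# Toy binomial check: how unlikely is it to guess a 3-way choice correctly
# 50 times out of 100 by pure luck? (Numbers are hypothetical, for illustration.)
from math import comb

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i)) for i in range(k, n + 1))

print(binom_tail(100, 50, 1/3))   # well under 0.001 -- not plausibly luck
```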
But classical music was supposed to be different. And I wondered why so many of the female violinists on the covers of my classical CDs were so pulchritudinous. (Aye, but they play marvelously too! I don't hold their looks against them.)
Funny, here's how the study got going: Dr. Chia-Jung Tsay has PhDs in Organizational Behavior and Psychology, and a PhD in Music, from Harvard. She studied piano at Juilliard. As a kid, she entered piano competitions and noticed that the reception of her auditions seemed slightly different depending on whether she sent in only an audio recording of her playing or a videotape. Tsay played at Carnegie Hall at age 16. Ah...and here is a classical music glamor shot of Dr. Tsay:
After doing her study, Dr. Tsay says she thinks the experts aren't judging solely on "superficial" (by which I think she means: hotness?) criteria, but that there's something about visual information that is very compelling to our brains. We have always been told we shouldn't judge a book by its cover, but publishers and marketers know that we do anyway. When instrumentalists seem to be "hamming it up" they may be merely acting because they think it's what the audience wants, or their playing may be so embodied that they emote in a strikingly visual way, all while being far more "aware" of what they're doing musically than of what they're doing with their body or face, or both. I have known both types of players. I have been both types.
In the end, I think we might need to reconsider the idea of "purity" usually assumed by performance: if you're moved by a performance...you're moved by the performance. Just be aware that the visual aspect (if there is one) probably shaped your experience. I think we can't get away from this; it's how we're wired. Maybe we should be a little easier on ourselves. It's biology!
Link to a brief discussion of Tsay's study HERE. NPR and Tsay discussing her study, with a photo of the famously emotive virtuoso pianist Lang Lang.
Long ago I saw the piano virtuoso Arturo Benedetti Michelangeli play on the old Arts and Entertainment channel back when cable TV was "new" in the US, and they'd show an entire concert without commercials. He was near the end of his career (he died in 1995), and had let his hair grow long. He was a really hot player, firing off Liszt, but exuded a bitchiness come la prima donna, and he reminded me of when I'd recently seen Ritchie Blackmore play live: sneering, swaggering, total command over technique, dressed in flamboyant black, with a hint of lasciviousness. I found both Blackmore and Michelangeli captivating, even thrilling.
Nicolo Paganini the Genoese came on the 19th century violin scene at a time when the "free agent" musician could make a lot of money and not have to answer to royalty, beg the aristocracy for money or rely on patronage. The new, more powerful "middle class" (AKA bourgeoisie) of Europe loved him. Paganini's technique was otherworldly, and he greatly inspired Chopin, Liszt, and Schumann. But Paganini also cultivated a "demonic" image, which also put asses in the seats. He was the first Jimi Hendrix or Ozzy Osbourne, in a way. But he also pioneered dazzling violin techniques. Paganini may have been the first to exploit his visual, emotive self in a "flamboyant" way in an effort to accentuate his musical self.
HERE is Blackmore acolyte Yngwie Malmsteen's showmanship: the entire rock vocabulary, mixed with Bach, Paganini, and Hendrix.
HERE is Nadja Salerno-Sonnenberg playing the finale of the Mendelssohn Violin Concerto in E minor, on American TV, accompanied by a pianist. She flubs a bit, but she's fiery and kinetic as all hell, an Italian-American "tomboy" who loved to play baseball in the street as a kid. I always found her incredibly emotive visually, in addition to her "pure" playing.
Experiment: click on the next link and LISTEN without seeing anything. Then watch the guy play. Do you recognize the name? Is he any good? He's a white guy wearing a t-shirt, apparently at an audition.
The McGurk Effect
This one's really weird: it turns out that what we "see" people say influences what we think we heard, and we can be fooled. And it seems we can't do much to correct for it. Vision influences hearing here, too. "Reality" seems to be warped in a surprising way. Watch the video!
The Demeanor Assumption
This is a term from lawyer and fraud specialist Robert Hunter. Emily Pronin of Princeton, who studies our ability to detect lies, calls it "the illusion of asymmetric insight." What is it?
I'm drawing on Ian Leslie's piece from New Statesman.
This month a court in England ruled that a Muslim woman must remove her hijab/veil when giving evidence. Some experts applauded the ruling but thought it didn't go far enough: no one should be allowed to veil their face in court, ever, the reasoning being that the more body language and facial expression we see, the better juries, lawyers, and judges can ascertain who's telling the truth. Or who's lying. (I loved the CSICOP-sounding group the "National Secular Society.")
But it turns out we humans are under the illusion that we can determine who's lying. Studies show we're not very good at all. Frankly, we kinda suck at it. Liars can look you straight in the face and get away with it. The innocent can appear twitchy and nervous and suspicious. When I read about this, I thought of Kafka, and especially Anthony Perkins in Orson Welles's The Trial. Also, growing up a snotty thin long-hair "hippie" kid in a town that was not accepting, I was used to telling the truth and not being believed by adults.
Ian Leslie's article made me want to read his book, Born Liars: Why We Can't Live Without Deceit, about research into how lousy we are at detecting lies. He argues that, contrary to the court's opinion, we might "hear" the testimony and evidence better if everyone were veiled!
When you meet me, most prominent in your mind are two things: 1.) my face; and 2.) your own thoughts. You probably think you can read my thoughts while your own are private. But...I'm meeting you, too. Why wouldn't things be the same for me? Leslie sums up this cognitive bias thus: "I am never quite what I seem; you are an open book."
How does this data about how bad we are at detecting lies reflect on the stuff about judging musical performances? Does the Demeanor Assumption throw off classical music "experts"?
I previously wrote on the topic of deception HERE.
Trailer for Welles's interpretation of Kafka's The Trial:
Friday, May 3, 2013
Rushkoff's Presentism and Further Musings on Time
Prof. George Carlin was performing on The Tony Orlando and Dawn Rainbow Hour - a TV show - in 1976. He told the audience about Time as a concept. He said that there is no "present" we can truly say we're living in, because just as we identify it...ZOINK!...it's gone. We've moved on to the next moment. "There's no present. Everything is in the near future, or the recent past." - copped from Sullivan's delightful biography of Carlin, 7 Dirty Words: The Life and Crimes of George Carlin. The gnomish intellectuality of Carlin's POV on "presentism" notwithstanding, I do enjoy the para-Zen-like musing as a potent little philosophical riff, akin to Led Zeppelin's very simple but powerful "Whole Lotta Love."
Both Carlin and Zep are "basic" but they rock steady...On to my bit.
Douglas Rushkoff has produced yet another marvelous book, Present Shock: When Everything Happens Now. Because most of the libraries in my area don't yet have it, or if they do the waiting list is too long, and also because I don't have any money, I once again stood (sometimes sat) in a bookstore and did a very fast read of the whole thing, over about two hours. I will re-read it when it's less the molten hot commodity and I have more time to devote to it. Speaking of time/Time...
Douglas Rushkoff
Rushkoff says that, within our collective historical consciousness, our technologies have landed us in "the future." Now, and for the foreseeable future, which is now. So let's start to feel freed up from thinking of book-like narratives of seeing a beginning, rising action, falling action, climax, denouement (or variations of that), because terrorism, child starvation and global warming don't have these narrative structures to them. They're not like: dictators in Europe out of hand, killing people, Hitler invades Poland, death, more stuff pushes the action, death, Pearl Harbor, D-Day, death, death, death, Victory in Europe day, death, slaughter, carpet bombing, lives stunted forever, Hitler and Eva off themselves in a bunker, death and then another dollop of mass death, "discovery" of the Death Camps, Hiroshima, death, ticker-tape parade. That sorta stuff. (In my reading outside idiot-making "school" I developed a habit of imposing complex, intertwingling narratives-without-end...both because I find these conceptions of "history" more true-to-phenomenological "facts" although not "identical" to them, and because I simply find it far more instructive and interesting to conceive of historical narrative this way. Ya got a bettuh whaddyacall..."method"? Do me a solid and let's hear it.)
There's so much going on with our gadgets now that we're not managing time in a way that will make us happier; we're too caught up in the modes of Industrial (and corporations') Time, and this really ought to stop. It starts with you: maintain eye contact when you're with someone you really want or need to be communicating with. Ignore the cell phone when it vibrates. The email, your text messages, scrolling for Facebook "friends" can wait. Rushkoff's one of My Guys, obviously.
Franz Boas, trained as a physicist before he more or less invented modern cultural anthropology, traveled to live among the Inuit. Extensive immersion-in-the-field studies were essential in this new discipline, which sought to distance itself from the old anthropology: reading the latest dispatches from missionaries, spies, or pirates and making armchair proclamations about..."the" Pygmies/French/Laplanders/Maori/Turks/Yanomamo, etc. Boas wrote later about "culture shock." It's something that must have been experienced countless times before by missionaries and pirates, but Boas was a true intellectual and very reflective. The utter newness of the ways "eskimos" lived, what they thought and took for granted as "reality," disoriented Boas; it was all too much. His mind was blown by the experience. Culture shock became a to-be-expected part of the training of cultural anthropologists.
I had the Experience when I spent three weeks in Tokyo/Kyoto/Osaka/Kathmandu. But Boas stuck it out. His students - people like Margaret Mead, Ruth Benedict, Alfred Kroeber, Edward Sapir, Zora Neale Hurston, and many others - developed an empirical view of "culture," and altered the idea of the Platonic One True Reality. A major aspect of cultural relativity is the observation that different peoples seem to experience Time very differently than Western Industrialized Humans, or anyone else under 24/7 electric lights, an effective "grid," clocks everywhere, mass transit, schedule-schedule-schedule, metropolises, and mass communications.
Sociologist Alvin Toffler was a huge deal when he came out with his book Future Shock in 1970. This sort of shock was defined as "too much change happening in too short a period of time," or something like that. It led to disorientation (and there was no Internet! no Twitter or Facebook or...blogs) and anxiety. It reminded me of Aldous Huxley's idea of the consciousness or nervous system on psychedelic drugs: it's as if "normal consciousness" were a garden hose with a kink in it, so the water only flows out in dribs and drabs, but on a psychedelic the hose is unkinked, and the amount of...information, perception, ideas, emotions...is overwhelming. It feels like a firehose in your brain, when what you're used to is...a dribble here and there. It can be - it will be - overwhelming. Your comfy old-sneakers whelm will be overtaken. But you knew that.
Another Net guru, Clay Shirky, coined the term "filter failure" as a sort of update on "future shock." This implies that we were all trained to acknowledge that some sort of mental "filter" or way of practicing a....mental hygiene? was part of our educations. (Was it yours?) Rushkoff, one of those that Richard Rorty would have labeled a Strong Poet, has minted some of his own terms here, the two that stick out most clearly being "digiphrenia" and "fractalnoia." On Rushkoff's blog he posted some of his favorite early reviews and media interviews, which probably flesh a lot of this out better than I've done here...
George W.S. Trow
It's firehoses for everyone now, seemingly, and you don't even need LSD. Everyone is On, all the time, now, 24/7. Twittering, blogging, texting, chatting, On Demand. So where in the narrative arc of life or some current news story are we? We don't know. (I think George W. S. Trow had some amazing things to say about our lack of context - he loathed what TV had done to our sense of history; see his short - so it won't eat up your TV-time - book Within The Context Of No Context.) And yet gadzillions of bits of information about what's "going on" are readily available. The Boston Marathon was bombed. There's a story! Now: to find out. Notice how the story plays out into the "future" (what will Unistat do with Tsarnaev?) and the "past" (was Tamerlan under the guidance of plotting, malevolent terrorist elders?). When will we "know" that this story has a climax and denouement? Answer: we will impose some sort of "ending" ourselves.
What, then, about our very real problems, like starvation and global warming? Here the story has even less of a discernible narrative arc. It looks like we can ameliorate, manage, divert, defer...The 19th-century-novel model of historical events has broken down, aided by our extremely sophisticated communication gadgets. In his previous book, Program Or Be Programmed: Ten Commandments For The Digital Age (which is short and doesn't demand too much of your...time), Rushkoff urged us to be in our own real time, to not always be plugged in and "on," to live our lives without the imperatives of our digital gadgets, which are programmed by others, the code hidden from us. Are your gadgets using you to do what some corporations want? Or are you using your gadgets in accordance with your consciously-negotiated, present, and personal values? His recent book seems to take on a far grander theme: how our sense of Time is and has been affected by technology.
The section on chronobiology was particularly interesting, and when I have more time alone, at home, to delve into it, I think I'll skip back to those bits first.
Another area of his thought I wish to study further: I confess I don't feel I've grokked in its fullness the riffs on money and capitalism and the dated Industrial Age and how we can get out of this mess. Rushkoff says we live in a "steady-state real-time economy" now. Maybe I'm brainwashed by the Corporations or Control, but I need tutoring on that one. He certainly seems to think he's on to something. Must...obtain...book...
One thing: why do we plan for the "future" when the banks will just steal all our retirement money anyway, and any corporation we work for could not possibly care less about us? (This reminds me of the fiery, sharp, acid, and hilarious Lee Camp, reacting to the possible positive aspects of the recent news story that CEO pay has increased 1,000 times since 1950, relative to what the worker makes. See HERE.)
At any "rate" (HA!), Time marches on and we're all wired with lots of information, to put it mildly. Do we know how to make sense of it all? We certainly have ample opportunities to panic, to react with fear and paranoia, and to propagate more erroneous and just plain pernicious info ourselves. Also: notice how we must always be "catching up" on the latest whatever. We've fallen behind. (Really? What are our lives about? Completing multiple "jobs" that we uncomfortably saddle ourselves with, seemingly of our own "free will"?)
Is this the "new normal"? I don't know, but I do know that the next person who says, "Well what I heard was that..." about a very recent news story will make me want to slug them in the mouth...but I won't. The OG don't swing that way. He all about Peace 'n shit. But do try to cite a source?
Rushkoff says we can make meanings of our lives filled with information by using a very powerful human tool: pattern recognition. And yet: he seems to buy the idea that we're in a post-narrative age; I tend to go along with this golden postmodern trope, but note there are enough Nuts who are angry that the rest of us won't swallow what they and only they possess: the Ultimate One True Truth. Rather, I see a radical breakdown of thousands-year-old hardcore narrative tropes. Most people are not hardcore ideologues who have one consistent POV. And Nassim Nicholas Taleb notes in his Bed Of Procrustes that people who think that intelligence is about noticing patterns that are relevant are wrong; in a complex world "intelligence consists in ignoring things that are irrelevant."
Finally, I got the idea that We have some semantic issues. I already thought that - hell I ALWAYS think that - but Rushkoff hammered more of it home. I was reminded of the great old anthropologist Weston LaBarre's term "group archosis," which he defined as "Nonsense and misinformation so ancient and pervasive as to be seemingly inextricable from our thinking." (found on p.53 of Robert Edgerton's book Sick Societies)
Similarly, the world-systems theorist Immanuel Wallerstein dropped the term "unthinking" on me while I was reading his Uncertainties of Knowledge, p.104. As opposed to rethinking, "unthinking" emphasizes that some very deep-seated notions - even though the physical sciences have shown them to be inadequate - nevertheless stay with us and lead us epistemologically astray.
Some Previous Blogspews on "Time"
History and Perception of Time: Labeling and Control
Historical Consciousness and Deep Time: A Ramble
Still Caught In Time
Five Brief Riffs On The Oddity of Time
Keeping Up To Date On Time Travel
Both Carlin and Zep are "basic" but they rock-steady...On to my bit.
Douglas Rushkoff has produced yet another marvelous book, Present Shock: When Everything Happens Now. Because most of the libraries in my area don't yet have it, or if they do the waiting list is too long, and also because I don't have any money, I once again stood (sometimes sat) in a bookstore and did a very fast read of the whole thing, over about two hours. I will re-read it when it's less the molten hot commodity and I have more time to devote to it. Speaking of time/Time...
Douglas Rushkoff
Rushkoff says that, within our collective historical consciousness, our technologies have landed us in "the future." Now, and for the foreseeable future, which is now. So let's start to feel freed up from thinking of book-like narratives of seeing a beginning, rising action, falling action, climax, denouement (or variations of that), because terrorism, child starvation and global warming don't have these narrative structures to them. They're not like: dictators in Europe out of hand, killing people, Hitler invades Poland, death, more stuff pushes the action, death, Pearl Harbor, D-Day, death, death, death, Victory in Europe day, death, slaughter, carpet bombing, lives stunted forever, Hitler and Eva off themselves in a bunker, death and then another dollop of mass death, "discovery" of the Death Camps, Hiroshima, death, ticker-tape parade. That sorta stuff. (In my reading outside idiot-making "school" I developed a habit of imposing complex, intertwingling narratives-without-end...both because I find these conceptions of "history" more true-to-phenomenological "facts" although not "identical" to them, and because I simply find it far more instructive and interesting to conceive of historical narrative this way. Ya got a bettuh whaddyacall..."method"? Do me a solid and let's hear it.)
There's so much going on with our gadgets now that we're not managing time in the way that will make us more happy; we're too caught up in the modes of Industrial (and corporation's) Time and this really ought to stop. It starts with you: maintain eye contact when you're with someone you really want or need to be communicating with. Ignore the cell phone when it vibrates. The email, your text messages, scrolling for Facebook "friends" can wait. Rushkoff's one of My Guys, obviously.
Franz Boas, trained as a physicist but he then invented Modern Cultural Anthropology, traveled to live among the Inuit. Extensive immersion-within-the-field studies were essential in this new discipline, which sought to distance itself from the old anthropology: reading the latest dispatches from missionaries, spies, or pirates and making armchair proclamations about..."the" Pygmies/French/Laplanders/Maori/Turks/Yanomano, etc. Boas wrote later about "culture shock." It's something that must have been experienced countless times before by missionaries and pirates; but Boas was a true intellectual and very reflective. The utter newness of the ways "eskimos" lived, what they thought and took-for-granted as "reality" disoriented Boas; it was all too much. His mind was blown by the experience. Culture shock became a to-be-expected in the training of cultural anthropologists.
I had the Experience when I spent three weeks in Tokyo/Kyoto/Osaka/Kathmandu. But Boas stuck it out. His students - people like Margaret Mead and Ruth Benedict and Alfred Kroeber and Edward Sapir and Zora Neal Hurston and many others developed an empirical view of "culture," and altered the idea of the Platonic One True Reality. A major aspect of cultural relativity is the observation that different peoples seem to experience Time very differently than Western Industrialized Humans, or anyone else under 24/7 electric lights/effective "grid", clocks everywhere, mass transit, schedule, schedule, schedule, metropolises, and mass communications.
Futurist Alvin Toffler was a huge deal when he came out with his book Future Shock in 1970. This sort of shock was defined as "too much change happening in too short of a period of time," or something like that. It led to disorientation (and there was no Internet! no Twitter or Facebook or...blogs) and anxiety. It reminded me of Aldous Huxley's idea of the consciousness or nervous system on psychedelic drugs: it's as if "normal consciousness" were a garden hose with a kink in it, so the water only flows out in dribs and drabs, but on a psychedelic, the hose is unkinked and the amount of...information, perception, ideas, emotions...is overwhelming. It feels like a firehose in your brain, when what you're used to is...a dribble here and there. It can be - it will be - overwhelming. Your comfy old-sneakers whelm will be overtaken. But you knew that.
Another Net guru, Clay Shirky, coined the term "filter failure" as a sort of update on "future shock." This implies that we were all trained to acknowledge that some sort of mental "filter" or way of practicing a....mental hygiene? was part of our educations. (Was it yours?) Rushkoff, one of those that Richard Rorty would have labeled a Strong Poet, has minted some of his own terms here, the two that stick out most clearly being "digiphrenia" and "fractalnoia." On Rushkoff's blog he posted some of his favorite early reviews and media interviews, which probably flesh a lot of this out better than I've done here...
George W.S. Trow
It's firehoses for everyone now, seemingly, and you don't even need LSD. Everyone is On, all the time, now, 24/7. Twittering, blogging, texting, chatting, On Demand. So where in the narrative arc of life or some current news story are we? We don't know. (I think George W. S. Trow had some amazing things to say about our lack of context - he loathed what TV had done to our sense of history, and see his short - so it won't eat up your TV-time - book Within The Context Of No Context.) And yet gadzillions of bits of information about what's "going on" are readily available. The Boston Marathon was bombed. There's a story! Now: to find out. Notice how the story plays out into the "future" (what will Unistat do with Tsarnaev?) and the "past" (was Tamerlan under the guidance of plotting, malevolent terrorist elders?). When will we "know" that this story has a climax and denouement? Answer: we will impose some sort of "ending" ourselves.
What, then, about our very real problems, like starvation and global warming? Here the story has even less of a discernible narrative arc. It looks like we can ameliorate, manage, divert, defer...The idea of the 19th century novel and our sense of historical events have broken down, aided by our extremely sophisticated communication gadgets. In his previous book, Program Or Be Programmed: Ten Commandments For The Digital Age (which is short and doesn't demand too much of your...time), Rushkoff urged us to be in our own real time, to not always be plugged in and "on," to live our lives without the imperatives of our digital gadgets, which are programmed by others, the code hidden from you. Are your gadgets using you to do what some corporations want? Or are you using your gadgets in accordance with your consciously-negotiated and present and personal values? His recent book seems to take on a far grander theme: how our sense of Time is and has been affected by technology.
The section on chronobiology was particularly interesting, and when I have more time alone, at home, to delve into it, I think I'll skip back to those bits first.
Another area of his thought I wish to study further: I confess I don't feel I've grokked in its fullness the riffs on money and capitalism and the dated Industrial Age and how we can get out of this mess. Rushkoff says we live in a "steady state real time economy" now. Maybe I'm brainwashed by the Corporations or Control, but I need tutoring on that one. He certainly seems to think he's on to something. Must...obtain...book...
One thing: why do we plan for the "future" when the banks will just steal all our retirement money anyway and any corporation we work for could not possibly care less about us? (This reminds me of the fiery, sharp, acid, and hilarious Lee Camp, reacting to the possible positive aspects of the recent news story that CEO pay has grown, since 1950, to something like 1,000 times what the average worker makes. See HERE.)
At any "rate" (HA!), Time marches on and we're all wired with lots of information, to put it mildly. Do we know how to make sense of it all? We certainly have ample opportunities to panic, react with fear, paranoia, and to propagate more erroneous and just plain pernicious info outselves. Also: notice how we must always be "catching up" on the latest whatever. We've fallen behind. (Really? What are our lives about? Completing multiple "jobs" that we uncomfortably saddle ourselves with, seemingly of our own "free will"?)
Is this the "new normal"? I don't know, but I do know that the next person who says, "Well what I heard was that..." about a very recent news story will make me want to slug them in the mouth...but I won't. The OG don't swing that way. He all about Peace 'n shit. But do try to cite a source?
Rushkoff says we can make meanings of our lives filled with information by using a very powerful human tool: pattern recognition. And yet: he seems to buy the idea that we're in a post-narrative age; I tend to go along with this golden postmodern trope, but note there are enough Nuts who are angry that the rest of us won't swallow what they and only they possess: the Ultimate One True Truth. Rather, I see a radical breakdown of thousands-year-old hardcore narrative tropes. Most people are not hardcore ideologues who have one consistent POV. And Nassim Nicholas Taleb notes in his Bed Of Procrustes that people who think that intelligence is about noticing patterns that are relevant are wrong; in a complex world "intelligence consists in ignoring things that are irrelevant."
Finally, I got the idea that We have some semantic issues. I already thought that - hell I ALWAYS think that - but Rushkoff hammered more of it home. I was reminded of the great old anthropologist Weston LaBarre's term "group archosis," which he defined as "Nonsense and misinformation so ancient and pervasive as to be seemingly inextricable from our thinking." (found on p.53 of Robert Edgerton's book Sick Societies)
Similarly, the world systems-theorist Immanuel Wallerstein dropped the term "unthinking" on me while I was reading his Uncertainties of Knowledge, p.104. As opposed to rethinking, "unthinking" emphasizes that certain very deep-seated notions, even though the physical sciences have shown them to be inadequate, nevertheless stay with us and lead us epistemologically astray.
Some Previous Blogspews on "Time"
History and Perception of Time: Labeling and Control
Historical Consciousness and Deep Time: A Ramble
Still Caught In Time
Five Brief Riffs On The Oddity of Time
Keeping Up To Date On Time Travel
Tuesday, April 23, 2013
World Book Night, 2013, Late Edition: Conspiracy Reading and Patterns
Because I'm on Pacific Standard Time, I get the news late. Let me explain international time zones to you. Some places are 15 minutes off. Others 30 minutes. If you're in Nepal you're 15 minutes behind Bangladesh, but Myanmar (Burma) is 30 minutes ahead of Bangladesh, which means if you're a major player in the Myanmar junta and want to call your friend in Nepal to say "Wassuuuuuup!?"...your friend's clock is running 45 minutes behind yours. Which just seems Discordian to me. London is something like eight hours ahead of me; it's already tomorrow's middle-of-the-morning there "now."
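If you'd rather check that offset arithmetic than take my Discordian word for it, here's a minimal sketch using Python's standard zoneinfo module (Python 3.9+). The reference instant is an arbitrary example I picked; the zone names are the standard IANA identifiers, and offsets do shift with daylight saving:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library, Python 3.9+

# One reference instant, re-expressed in several zones.
instant = datetime(2013, 4, 23, 21, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

for zone in ["Asia/Kathmandu", "Asia/Dhaka", "Asia/Yangon",
             "Europe/London", "Asia/Tokyo"]:
    local = instant.astimezone(ZoneInfo(zone))
    print(f"{zone:20} UTC offset {local.utcoffset()}  local: {local:%Y-%m-%d %H:%M}")
```

Run it and you can confirm Kathmandu sits 15 minutes behind Dhaka, and Yangon 30 minutes ahead of Dhaka, which still seems Discordian to me.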
Wait a minute: I can't explain that. The half-hour thing, I mean. I'm used to New York being three hours "ahead" of us. Tokyo is something like 16 ahead, so presumably they Know Things that I don't. The inscrutable, Wise East, aye. I'm not really sure how I started off on this time zone crap when the title of this blogspew was supposedly about books and conspiracy "reading and patterns." Sorry.
Anyway, from where I stand/sit, from my relative inertial position and perspective, it's still April 23, or World Book Night, which, if I can trust Wikipedia, started around 1995 in London.
Buncha do-gooders tryna get more folks to read. Okay, I admit I'm a sympathizer...
April 23 - on or about - is also the day the demonic, horrible events in the 805 page tome Illuminatus! Trilogy begin. If you haven't read that book but are thinking about starting it: don't. It screws with your mind. It most definitely wrecked me forever. I'll never be the same. Others have said very similar things. I know it's "only" a novel, but at least half of it refers to actual historical events.
Many have admired the Book for its wry satirization of conspiracy thinking. I have adopted that point of view, if only for my own sanity.
There's been a long strain in academic discourse about books belonging to one long thread of previous books. All books are in some sense a response to previous books. I like this idea a lot, although I'm not totally sold on it being "correct."
There's some interesting fancy computer research being done on "macroanalysis" of texts based on an author's word choice, style, themes, and "overarching subject matter" that suggests some interesting things about relative "originality" and the influence of a previous author, whether a writer knew they were influenced or not. See, for example, HERE.
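For a toy sense of how that sort of comparison might work at its crudest - this is just an illustrative sketch, not the method of the research linked above, and the two snippets of text are made up - you can reduce each text to a word-frequency profile and compute a cosine similarity between the profiles:

```python
import math
import re
from collections import Counter

def word_freqs(text: str) -> Counter:
    """Lowercased word counts: the crudest possible stylistic fingerprint."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine of the angle between two word-count vectors (1.0 = identical profiles)."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

text_1 = "The authors of the golden apple wrote of conspiracies and submarines."
text_2 = "Conspiracies, submarines, and a golden apple: the authors return."
print(cosine_similarity(word_freqs(text_1), word_freqs(text_2)))
```

Real macroanalysis works over thousands of texts and far subtler features, but the basic move - turn style into numbers, then measure distances - is the same.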
I've made very many guesses about the influences on Shea and Wilson in the writing of their damned Illuminatus!, but I'd like to see what some future computer algorithms say about Robert Anton Wilson's influences. He's openly stated about 30 or so; what would the computer say?
Anyway, the Illuminatus! cites innumerable books - mostly ones that "actually" exist in my own phenomenological/existential sensory-sensual world in space/time. Possibly yours, too.
And today's "real world" news feels like it's been influenced by the aforementioned book. Just a sampler:
- Tamerlan Tsarnaev was an Alex Jones fan. Maybe? Quite possibly. And let's not even address the insanely delicious irony, but I'm reminded of Shea and Wilson's "Tar Baby Principle" mentioned in Illuminatus!: You will become attached to what you attack. This idea always seemed a cousin to Buddhism's "you become what you behold." But wait a minute...
...if Tamerlan was really influenced by - or "understood"? - Jones, he sorta horribly ironically got it wrong, because Jones thinks attacks blamed on Muslims are really set-ups or "false flag" attacks engineered by the Evil Gummint. So...how does some pernicious idea about terrorism in the name of Islamic jihad figure in? (I still see the brothers Tsarnaev as more like Harris and Klebold.)
- Glenn Beck thinks that whatever imbecilic conspiracy theory his own pea-brain imagines, it must be accepted as true unless the government can prove it's wrong. The word "thinks" in the previous sentence should be taken ironically. Of course. This is ethics, law, and logical thinking straight from the Idiot's Fun House. Lemme see if I can get on Beck's wavelength here. <Ahem> Okay: Hey, Glenn Beck? I'm not so sure that the decent real Americans haven't not negated their "misunderestimation" of you. Now: prove me wrong, or you're Evil Incarnate and I'm the True Bestest Murrkin who truly loves his country, mom, the flag, a baby's smiling face, Betty-Sue's halter top, NASCAR, and hard cider on a sweltering Missouri summer night, etcetera! <the OG wept>
It appears as if the paranoid Elvis impersonator from Mississippi who mailed Ricin to Obama may have been framed. This seems whacked enough to have been in Illuminatus! In case you haven't been following this story (i.e., you "have a life"), the Elvis impersonator who apparently did NOT send Ricin to the POTUS did think he was trying to expose a shadowy world of human organ harvesters. I am not making this jit up.
Professor Jennifer Whitson says, based on her research, that if we feel out of control, we will find patterns and connections and "see" things that aren't there; our brains so need to feel like things "make sense." See reportage on her findings HERE, HERE, and HERE. It's interesting stuff, and those who've read Kahneman will be familiar with this. But I'm not sure if it describes all conspiracy thought. That's far too rationalistic for me. Why? Because, well, Watergate really happened. There are conspiracy laws on the books and people go to prison for it all the time. Watergate really was a conspiracy. "Conspiracy" seems a huge semantic spook that needs some robust and fairly massive and creative intellectual work in order for us to be able to think more clearly about the concept. The idea that aliens from another planet or dimension have been controlling us for our entire history as a species ought not, it seems to me, be on an equal epistemic footing with the idea that the Neo-Cons lied Unistat into the Iraq War.
But hey, I'm biased. (And so are you.)
Does Whitson's work account for all "ritual" and "superstition" and "religion"? Maybe a lot of it, is my guess as of this date.
Ralph Waldo Emerson: "Society, everywhere, is in conspiracy against the manhood of each of its members." Let's be charitable and update his 19th century assumptions to include the fairer sex. What does Emerson mean here?
In Castaways of the Image Planet, Geoffrey O'Brien writes about Fritz Lang's Spies and the 20th century mindset and logic of paranoia and conspiracies: film, the bureaucratic state, the collage-like logic of images. Those who are fans of Lang's (like me!) know he saw Osama bin Laden and Goebbels figures long before those guys were doing their schtuff. (See his Dr. Mabuse films!) Is all the "news" about secret underground terror networks and the Deep State - secret networks that operate within and outside government agencies who cooperate (at times) in order to maintain the status quo - is all this really "new"?
Would it help to stop calling some ideas "conspiracies" and start thinking of them as "normal primate-mammalian politics"?
Or, yet another of many paths to take: conceptual blending? (More serious writers on conspiracy need to become thoroughly acquainted with this idea!)
So far, the best and most underrated book by academics that takes conspiracy theories seriously as a philosophical problem - a problem of epistemology - is Conspiracy Theories: The Philosophical Debate, ed. by David Coady. (Get it via your library: that's an insane price.) Robert Anton Wilson is mentioned in there. Such problems of demarcation lines vis a vis Karl Popper! And what about the pragmatic approach?
"Any epistemological elite, religious or secular, must develop a system of cognitive defenses to defend its claims against the outside criticisms, but also, very importantly, to assuage the doubts harbored by insiders..." - Adventures of an Accidental Sociologist, by Peter Berger, pp.36-37
An academic writer with a law degree, Mark Fenster, had a hit with Conspiracy Theories: Secrecy and Power In American Culture. So much so that he's updated it for the post 9/11 era. He's the only academic I've read that seems to understand the philosophical aspects of deep play - the ludic aspects - in Wilson's work. When citizens feel like voting isn't enough to satisfy their need to participate in the power process, they resort to conspiracy narratives as a way to participate. And largely, they draw upon popular culture's narratives, and creatively tweak and combine. Some of it should give us much cause for alarm. With further and further connections and deeper, hidden orders uncovered, there's a quite-human neurobiological buzz of adrenaline...and "wonder and awe." And let's face it: delight. Conspiracies are exciting.
"Let us not, in the pride of our superior knowledge, turn with contempt from the follies of our predecessors...He is but a superficial thinker who would despise and refuse to hear of them merely because they are absurd." - Charles MacKay's 1852 ed. of Memoirs of Extraordinary Popular Delusions and the Madness of Crowds
Professor Timothy Melley's Empire of Conspiracy: The Culture of Paranoia in Postwar America reminds us of Jennifer Whitson's thesis, but combines the multifarious ideas about mind control and surveillance and other aspects of "control" a citizen may feel is in the hands of Others. Melley's key term is "agency panic" and I think he was not drilling in a dry hole in that book.
"Or maybe it's the repetition. Maybe you've been looking at this stuff for so long that you've read all this into it. And talking with other people who've done the same thing." - Pattern Recognition, William Gibson, p.109
There are many other above-average, well-researched books on conspiracy thinking and paranoia. But I still see Robert Anton Wilson's oeuvre as an ideal Ground Zero for all that. Or rather: a Staging Area.
Yes, yes, yes! These books refer to other books, which refer to others ad infinitum. Nice bibliographies. Reading about reading and interpretations about interpretations. Does something seem...missing there? Maybe?
Fiction about truth can be stranger than whatever "reality" seems. And the word is not the thing; the map is not the territory. The menu is not the meal.
Happy reading! (But you've been warned about the Illuminatus! Trilogy)
The opening 2 minutes and 17 seconds from Fritz Lang's 1928 film Spies:
Sunday, January 20, 2013
History and Perception of Time: Labeling and Control
I use the word "control" in the title but I think in this semantic sense it's human; oh-so human.
Here's What I'll Ramble On About Here:
- Noocene Epoch
-"human progress"
- acceleration of data, information
- Anthropocene Epoch
- Holocene Epoch
- a final riff
So: How do you think we're doing so far in the Noocene Epoch? (There oughtta be an umlaut over that second "o" in Noocene.) I copped this Epoch from The Biosphere and Noosphere Reader. There it was defined as something like: how we manage and adapt to the immense amount of knowledge we've created. My answer is: I don't know, but I suspect a lot of us are having birth-pangs of a rather longue duree, if we can use that term on a personal scale.
Mutt: We can't.
Jute: We can.
Mutt: You won't.
Jute: I will.
With something like an exponential increase in world population and technological development, including Teilhard's global media/communications vision of a noosphere (the human mind permeating the electromagnetic spectrum), we seem to be going a bit nuts; it may be coming too fast for our biologically-evolved selves. And are we making similarly steep gains in empathy, understanding, and a general updating of ethics and manners, a cosmopolitan outlook? My knee-jerk says nay; Steven Pinker wants to argue something like a "yes" to this in his recent doorstop The Better Angels Of Our Nature. And I so want to believe his basic thesis is right.
Human "Progress"
On the other hand, there's a long tradition of denial of "progress" by heavyweight thinkers. I usually read them as necessary correctives to a general cultural mindlessness about "progress." Chris Hedges has a bit of a jeremiad this week: the very technological boom that we've created - it started only a few minutes ago, on the vast homo sapiens sapiens timescale - is the very thing that may be taking us down. For those of us with an atavistic need for Bad Time when there's one to be had, read Hedges's "The Myth of Human Progress."
Acceleration of Info
Robert Anton Wilson thought the general rise of social lunacy and conspiracy theory was related to the information flow-through in society, which, according to statistics he derived from French economist Georges Anderla, was doubling at ever-increasing rates. Bytes, Data, Information, Knowledge, and Wisdom may all "be" different things, indeed, but RAW's (and Kurzweil's, for that matter) notion of pegging an idea and a method for counting, then watching the curve rise absurdly quickly, seems an effective rhetoric to get us to think of acceleration of processes, however flawed the methodology may be.
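The arithmetic behind that rhetoric is easy to play with yourself. Here's a minimal sketch - the doubling years below are placeholders I made up for illustration, not Anderla's or RAW's actual figures - showing how the intervals between doublings shrink while the running total explodes:

```python
# Illustrative placeholders only: NOT Anderla's or RAW's actual figures.
doubling_years = [1500, 1750, 1900, 1950, 1960, 1967, 1973]

units = 1          # take the stock of information at year 1 CE as one unit
previous_year = 1  # year 1 CE as the baseline
for year in doubling_years:
    units *= 2
    print(f"by {year}: {units:4d} units (this doubling took {year - previous_year} years)")
    previous_year = year
```

However you peg the starting unit and whatever you choose to count, the shape of that curve is the whole argument.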
Futurist Juan Enriquez talking about data-doubling for 2 minutes.
Ray Kurzweil's Law of Accelerating Returns (ancient!: from 2001)
Robert Anton Wilson and Terence McKenna on information doubling; 4 minutes
Anthropocene Epoch
According to RAW's Jumping Jesus, we were at 4 Jesus in 1500, then 8 by 1750 and the start of the Industrial Age. I increasingly see the Industrial Age as now being described as the beginning of the Anthropocene Epoch. Can we get out of it unscathed? I increasingly doubt it. I don't mean the human experiment on this rocky watery planet will end soon, but I do think we will radically alter what it means to be "human" in the next 30-50 years.
Cesare Emiliani
Holocene Epoch
The Age of Faith. The call of Being. The Mind of Europe, the Ming Dynasty, the "postmodern," The Sixties...All of these ways of conceptualizing our time here (and any other one you can think of) happened during the Holocene Epoch. Cesare Emiliani proposed a calendar based on it: he thought our current calendar, which hinges on when a Jewish rabbi-carpenter-anarchist was born, was too subjective. The "entirely recent" (AKA "Holocene") is, for Emiliani, anything from 10,000 years ago to today, roughly the Neolithic to now. The last great Ice Age had receded; the human story is told in the last 10,000 years, and so why don't we just add a "1" to whatever year we're in now and think of time that way? So, we're living in 12013 now.
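The conversion Emiliani proposed is about as simple as calendar reform gets: add 10,000 years to the Common Era date, which for any four-digit year just means sticking a "1" on the front. A trivial sketch (the function name is mine):

```python
def holocene_year(ce_year: int) -> int:
    """Emiliani's Holocene Era (HE): the Common Era year plus 10,000."""
    return ce_year + 10_000

print(holocene_year(2013))  # -> 12013
```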
I confess I'm a sucker for romantic intellectuals who are so overweening in their grandiosity of ideation that they think they can change the basic calendar. Do I think Emiliani's idea will ever catch on? Not a chance. But it has caught on with me. I like the psychological sense of a new way to control my perception of time with the Holocene.
Final Riff
To whatever extent humanity's many problems represent an Existential Risk: climate change, lurking plagues, asteroid collisions, Mutually Assured Destruction, and continued overpopulation (the world had roughly 200 million total when the anarchist rabbi was born; 791 million in 1750; 1.6 billion in 1900; 2.9 billion in 1960; 3.6 in 1970; 4.4 in 1980; 5.2 in 1990; 6.0 in 2000; and we passed 7,000,000,000 around Halloween, 2011); whether there's another Great Dying, or a Robot Apocalypse, or a happy Singularity or Omega Point: we will need to pass through something Ahead that we might later think of as a Bottleneck Epoch.
On another level and despite the many charming cyclical models of Time and History proffered by some of our more ingenious thinkers, the ideas from Hegel, Marx, Heidegger and Derrida lead me to agree with Derrida: there is no lost original language or vocabulary that will restore our sense of being grounded in some sort of Absolute Ultimate. All that is or seems, seems as metaphor, and we must find our way bravely in this present (which we want, at times, to be "timeless"). We post-postmoderns: can we believe in a teleology for our species, within an historical trajectory? Do we take seriously an eschatology? Clearly some do, but they seem in a negligible minority. In the previous paragraph I hazarded a Bottleneck Epoch, my optimism winning out. I, like Buckminster Fuller, am biased: I like the humans and I, as Bucky said, want them "to be a success in universe."
Nonetheless, how do we think about our present eschatoteleological dilemma? (A: mostly, we don't.)
I wrote this entire post in hopes that someone will think me a Heavy Cat.
Saturday, June 30, 2012
Poetry, Conversation, Translations, In-Form-a-Tion, Etc
In an interview about his book of poetry titled Uselysses, Noel Black recalls a time with Harry Mathews at a San Francisco art school, and I like this passage because it sheds light on my two previous blogspews on "translation":
Harry Mathews came and gave a lecture to a class I took at New College, and I had this amazing conversation with him afterward about the “I” and “self” and that whole labyrinth. I’ll never forget what he said to me because it was so freeing. I’m paraphrasing here, but what he said is that Americans, because most of us only speak one language, have a tendency to believe that language comes from within us out of some sort of linguistic font of self, which leads us to this “I” to which we cling. For a lot of Europeans, on the other hand, many of whom are polyglots, language is something external that’s not only mutable, but easily rearranged and manipulated and only loosely regarded as any part of a fixed self. By that measure, he had concluded after many years of living abroad, that you could only know yourself with the shared language you were using with another person, i.e. you are creating a different self with each different person you’re with in whatever common language you happen to share. I loved that idea so much because what it says is that the I is always in relationship, that it’s a conversation, a community.
-gleaned from this Levi Rubeck interview with Noel Black
It seems what Noel Black is getting from Harry Mathews here supports what Lera Boroditsky has been arguing in her academic career. But then, beware: this is me, today, interpreting/translating/fumbling to explain to my Dear Reader what I think is going on. Lots may get lost in my grapples with Boroditsky's thought, with what Noel Black seeks to remember ("I'm paraphrasing here...") from his time with Harry Mathews, what Mathews thinks "makes sense" vis a vis European polyglots and language and the "self" versus what monolingual Unistatians think about where language comes from, how it relates to a "self," etc, etc, etc.
I'll return to Noel Black's poetry later, but want to try and haul in some things about interpretations, translations, and Information Theory.
How ordinary my having a blog seems!
One of my own spiritual fathers, Robert Anton Wilson, often wrote and talked about the acceleration of information. In his cosmic intellectual goofy-sufi-like humor, he eventually dubbed this mathematical doubling the Jumping Jesus Phenomenon, and if you don't know what it is, the first paragraph in this link gives the inkling. RAW took this stuff seriously, if not siriusly, and he actually has quite a lot to say about the statistical mechanisms underlying this phenomenon, and the social, personal, political and philosophical implications of it. Terence McKenna had some similar things to say on this topic, although they seemed more teleological and metaphysical in bent than RAW's. If one looks at Ray Kurzweil's work, particularly the fat The Singularity Is Near, Kurzweil seems to have picked up on information doubling from RAW and his influences and taken it, exponentially, to another level, but that's for The Reader (and future history?) to decide.
There's also quite an abundant literature that seeks to determine boundary distinctions and interactions between types of knowledge, and, more fascinating to me, the differences between data, information, knowledge and wisdom. I have found nothing but vast abstruse disagreements here. Which is fine with me.
RAW, in his widely dispersed writings on information doubling in history, touched on the qualitative aspects therein, but any reader can easily miss that in favor of what I see as - wild and ironically - a Platonified view of how information works. If Claude Shannon's and Warren Weaver's and John Von Neumann's and Norbert Wiener's and Gregory Bateson's and...all the others mentioned in James Gleick's marvelous 2011 book The Information - the information that can be quantified and more or less given a relatively thick description as to how it worked within a scientific/technological sense - THAT information...then yes. Maybe. But I wonder about our brain's unrealistic expectations of information transformation, mostly because I played, at childhood birthday parties, the game of Telephone...and never stopped thinking about what it meant. (Huh? WTF is the OG onto now?)
I remember how funny it was that what went into the first kid's ear turned out, after the 16th iteration, to be shorter and weirder, bearing almost no resemblance to the original. Why was it funny?
Well, humor may be a way to let our guards down and admit we're not perfect. We seem not even close to perfect. And that it's okay, because being human? We're far too complex to get strings of information exactly right, using our wetware. (I didn't think this stuff about Telephone through until the last couple of years.) Getting data and memory 100% right is not something we do well in these embodied minds of aggregated replicators and evolutionarily legacy-ed mind-software. But what about other implications?
One of the "reasons" the market tanked in 2008 was "we" ("quants" and others) had written algorithms into very posh hi-powered computers that took in data about either buying or selling, and in microseconds, made "decisions" on whether it was buy or sell time. And the data/information the algorithms were working on was extensive. But it was based on earlier data strings that supposedly represented something in the Actual World that you or I would care about...like buying a house. But again: this data string was based on decisions made in microseconds from other data strings. It was like Telephone, only these algorithms were making buy/sell decisions without any human minds interposed. And yet, all you needed was one bit of some "interpretation" (can robotic algorithms, no matter how complex, be said to interpret?) that the algorithm "thought" was okay...to be wrong in some way. And then we had a sort of Telephone-like iteration which resulted in something that was not given to the same mirth of a children's backyard party and something rather like a Worldwide Economic Depression. Oh, humans, talking amongst themselves, playing by what the "rules" allowed, made bad decisions too...
Were there unethical, even criminally negligent decisions made? You betta you ass there were.
There's a LOT missing in this basic schematic, no?
Why do we allow complex mathematical formulae, worked over at blinding speeds by hi-powered computers, to make "decisions" about what are basically human values? Because it seemed/seems like a cost-efficient or "neat-o" idea? I read about this stuff and the FOLLY angers me. (Hence the Robin Hood Tax seems like a very sound idea to me.)
Now, I'm writing this based on my best understanding as a Generalist, some dude reading books, and maybe I've missed something? Maybe the authors I've read missed something? Maybe I misread some tiny bit based on some tiny bit someone, who I trust KNOWS something...errr...missed? Hey, that's life.
It's not at all convincing or clear to me that higher levels of abstracted information make for something healthier, to borrow from a hippie slogan, "for children and other living things." Too many artifacts and errors creep in. We must always have something like wise humans attempting to referee. It seems far too many Geeks and policy makers do not get this.
And that's my point. Information transfer from brain to brain, or from algorithm to algorithm, or from algorithm to brain, is not a Platonically perfect dealio, ladies and germs. And the repercussions are not academic; they come packing a world of hurt at times.
Back to Black...
Noel Black's Uselysses
Gawd, what a delightful, funny book of poetry. This alone should be enough to pique you, but maybe my taste is not close enough to yours, so I'll say a few things about why I like the book.
Black remembered someone calling themselves a "depressionist," and, being an artist in today's Unistat? It's easy to feel useless, no matter how much effort and soul-baring you do. For a punster - who prefers portmanteau to pun, as Joyce preferred the portmanteau as stylistic device - "uselysses" can describe the interior feelings of many an artist in this day and age.
He's steeped in the poetic tradition, but feels its weight, Whitman appearing at odd times throughout the book, laying his body down along the entire stretch of road one might drive from New York to San Francisco. And, steeped in the art school/academic and especially San Francisco poetry scene (where certain poets - the LANGUAGE poets, say - live in their own poetic fiefdoms while poets of Other Schools keep to theirs), he had to leave, go back to Colorado Springs, where he was brought up by a gay father who died of AIDS, and a lesbian mother. He goes back to evangelical right wing hotbed Colorado Springs to reclaim it, in some Whitmanian sense, for Art.
Black is another of the younger poets who draw upon and allude to TV and pop culture (almost) as much as anyone, and his surrealistic sensibility and sense of the cosmic absurdity of his own fleeting thoughts in the mundane get worked over into something truly artistic and human and hilarious. Aside from a very witty section of short poems all based on how some famous poets died, he's perhaps best-known as the author of a chapbook (contained in Uselysses) called Moby K. Dick (an odd concatenation of Philip K Dick's book Ubik and Melville's masterwork), in which he takes books he's loved and combines their deeper structures, has a sort of ethereal chat between both books (or book/another author), and neither book comes to the reader in bold relief; rather, the odd intertwining essence-ish-ness of each book speaks in the short poems, variously titled "Lord Jim Thompson," "Paul Austerlitz," "Watchmen in the Rye," "Huckleberry Finnegans Wake," "The Spy Who Came in from the Cold Blood," you get the idea.
If you're a Joycean looking for something in, say, "Huckleberry Finnegans Wake" you're unlikely to come away with anything, save for the idea that a poet drew, in some odd way, on Finnegans Wake. Black's purpose here seems closer to William Burroughs's use of the cut-up method, only Black is cutting up general feelings - interpretations - in his mind about the two books, how his sense of himself was subsumed while reading those books, and how his made-meanings of texts dreamily interposed in the overnight "dialogue" between both books as they sat on his shelf.
But this is Black at his artsiest. The Noel Black I had most fun with was the one who wanted to write poetry again because it was FUN, and he had calendrical time and geographic distance from two places he'd tried to make it, San Francisco and Brooklyn. Colorado Springs seems to suit him fine. Here are some lines from a poem that, to me, depict my favorite aspects of Black. From "Poem of Carl Sagan":
It must be confusing for Christians
who arrive in Heaven
to find Carl Sagan seated at the right hand of God,
which is a gigantic, glowing vagina
floating above the Captain's chair
on the deck of the Starship Enterprise.
"It's interesting - and I never imagined this - "
says Carl, using the weirding voices of Science to soothe the recently dead,
"that the Universe is merely an emanation of the brain,
which as we look into it, tricks us into believing
that we are gazing into an unfathomable outward expanse
that is but the unknowable inner reaches of our own minds. Now,
who would like to be reborn?"
Vaginal wormholes, C. S. Lewis perturbed by it all, Star Trek's spaceship as the Holy Ghost, telling someone they'll understand heaven a lot better if they re-read Dune, then here's Carl Sagan again:
Then he unzips his burnt-orange windbreaker
and a laser of love shoots out
from the spectral Starfleet logo upon his heart,
zapping them all into the raptures of wordless knowledge
as God folds their souls into dream.
If this all sounds vaguely like you and your funny friends, high in college, your parents split up or dead, listening to the Scorpions in someone's mom's basement in Unistatian suburbia in 1981, then yea: you probably have something in common with Noel Black. And you can either confirm or deny this by reading this book, but ESPECIALLY the last part, a rather long poem called "Prophecies For The Past," which was, to me, one of the most moving poems I've read by a currently living poet. I found Uselysses in my public library, but will buy the book if only for "Prophecies For The Past," which articulates a living reality for so very many of us, growing up in broken homes in cultural poverty pockets or suburban white America, last 30 years of the 20th century.
Noel Black, family man, seems wonderfully jester-wise and nutty, wears his resilient heart on his sleeve, which I picture as paisley right now, for some reason. I loved this book.
Harry Mathews came and gave a lecture to a class I took at New College, and I had this amazing conversation with him afterward about the “I” and “self” and that whole labyrinth. I’ll never forget what he said to me because it was so freeing. I’m paraphrasing here, but what he said is that Americans, because most of us only speak one language, have a tendency to believe that language comes from within us out of some sort of linguistic font of self, which leads us to this “I” to which we cling. For a lot of Europeans, on the other hand, many of whom are polyglots, language is something external that’s not only mutable, but easily rearranged and manipulated and only loosely regarded as any part of a fixed self. By that measure, he had concluded after many years of living abroad, that you could only know yourself with the shared language you were using with another person, i.e. you are creating a different self with each different person you’re with in whatever common language you happen to share. I loved that idea so much because what it says is that the I is always in relationship, that it’s a conversation, a community.
-gleaned from this Levi Rubeck interview with Noel Black
It seems what Noel Black is getting from Harry Mathews here supports what Lera Boroditsky has been arguing in her academic career. But then, beware: this is me, today, interpreting/translating/fumbling to explain to my Dear Reader what I think is going on. Lots may get lost in my grapples with Boroditsky's thought, with what Noel Black seeks to remember ("I'm paraphrasing here...") from his time with Harry Mathews, what Mathews thinks "makes sense" vis a vis European polyglots and language and the "self" versus what monolingual Unistatians think about where language comes from, how it relates to a "self," etc, etc, etc.
I'll return to Noel Black's poetry later, but want to try and haul in some things about interpretations, translations, and Information Theory.
How ordinary my having a blog seems!
One of my own spiritual fathers, Robert Anton Wilson, often wrote and talked about the acceleration of information. In his cosmic intellectual goofy-sufi-like humor, he eventually dubbed this mathematical doubling the Jumping Jesus Phenomenon, and if you don't know what it is, the first paragraph in this link gives the inkling. RAW took this stuff seriously, if not siriusly, and he actually has quite a lot to say about the statistical mechanisms underlying this phenomenon, and the social, personal, political and philosophical implications of it. Terence McKenna had some similar things to say on this topic, although they seemed more teleological and metaphysical in bend than RAW's. If one looks at Ray Kurzweil's work, particularly the fat The Singularity Is Near, Kurzweil seems to have picked up from RAW and his influences on information doubling and logarithmically taken it to another level, but that's for The Reader (and future history?) to decide.
There's also quite an abundant literature that seeks to determine boundary distinctions and interactions between types of knowledge, and, more fascinating to me, the differences between data, information, knowledge and wisdom. I have found nothing but vast abstruse disagreements here. Which is fine with me.
RAW, in his widely dispersed writings on information doubling in history, touched on the qualitative aspects therein, but any reader can easily miss that in favor of what I see as - wild and ironically - a Platonified view of how information works. If Claude Shannon's and Warren Weaver's and John Von Neumann's and Norbert Wiener's and Gregory Bateson's and...all the others mentioned in James Gleick's marvelous 2011 book The Information - the information that can be quantified and more or less given a relatively thick description as to how it worked within a scientific/technological sense - THAT information...then yes. Maybe. But I wonder about our brain's unrealistic expectations of information transformation, mostly because I played, at childhood birthday parties, the game of Telephone...and never stopped thinking about what it meant. (Huh? WTF is the OG onto now?)
I remember how funny it was that, what went into the first kid's ear turned out, after the 16th iteration, to be shorter and weirder and having almost no resemblance to the original. Why was it funny?
Well, humor may be a way to let our guards down and admit we're not perfect. We seem not even close to perfect. And that it's okay, because being human? We're far too complex to get strings of information exactly right, using our wetware. (I didn't think this stuff about Telephone until the last couple years.) Getting data and memory 100% is not something we do well in these embodied minds of aggregated replicators and evolutionarily legacy-ed mind-software. But what about other implications?
One of the "reasons" the market tanked in 2008 was "we" ("quants" and others) had written algorithms into very posh hi-powered computers that took in data about either buying or selling, and in microseconds, made "decisions" on whether it was buy or sell time. And the data/information the algorithms were working on was extensive. But it was based on earlier data strings that supposedly represented something in the Actual World that you or I would care about...like buying a house. But again: this data string was based on decisions made in microseconds from other data strings. It was like Telephone, only these algorithms were making buy/sell decisions without any human minds interposed. And yet, all you needed was one bit of some "interpretation" (can robotic algorithms, no matter how complex, be said to interpret?) that the algorithm "thought" was okay...to be wrong in some way. And then we had a sort of Telephone-like iteration which resulted in something that was not given to the same mirth of a children's backyard party and something rather like a Worldwide Economic Depression. Oh, humans, talking amongst themselves, playing by what the "rules" allowed, made bad decisions too...
Were there unethical, even criminally negligent decisions made? You betta you ass there were.
There's a LOT missing in this basic schematic, no?
Why do we allow complex mathematical formulae, worked over at blinding speeds by hi-powered computers, to make "decisions" about what are basically human values? Because it seemed/seems like a cost-efficient or "neat-o" idea? I read about this stuff and the FOLLY angers me. (Hence the Robin Hood Tax seems like a very sound idea to me.)
Now, I'm writing this based on my best understanding as a Generalist, some dude reading books, and maybe I've missed something? Maybe the authors I've read missed something? Maybe I misread some tiny bit based on some tiny bit someone, who I trust KNOWS something...errr...missed? Hey, that's life.
It's not at all convincing or clear to me that higher levels of abstracted information make for something healthier, to borrow from a hippie slogan, "for children and other living things." Too many artifacts and errors creep in. We must always have something like wise humans attempting to referee. It seems far too many Geeks and policy makers do not get this.
And that's my point. Information transfer from brain to brain, or from algorithm to algorithm, or from algorithm to brain, is not a Platonically perfect dealio, ladies and germs. And the repercussions are not academic; they come packing a world of hurt at times.
Back to Black...
Noel Black's Uselysses
Gawd, what a delightful, funny book of poetry. This alone should be enough to pique your interest, but maybe my taste is not close enough to yours, so I'll say a few things about why I like the book.
Black remembered someone calling themselves a "depressionist," and, being an artist in today's Unistat? It's easy to feel useless, no matter how much effort and soul-baring you put in. For a punster - who prefers portmanteau to pun, as Joyce preferred the portmanteau as a stylistic device - "uselysses" can describe the interior feelings of many an artist in this day and age.
He's steeped in the poetic tradition, but feels its weight, Whitman appearing at odd times throughout the book, laying his body down along the entire stretch of road one might drive from New York to San Francisco. And, steeped in the art school/academic and especially San Francisco poetry scene (where certain, say, LANGUAGE poets live in their own poetic fiefdoms while poets of Other Schools live in theirs), he had to leave, go back to Colorado Springs, where he was brought up by a gay father who died of AIDS, and a lesbian mother. He goes back to evangelical right-wing hotbed Colorado Springs to reclaim it, in some Whitmanian sense, for Art.
Another of the younger poets who draw upon and allude to TV and pop culture (almost) as much as anyone, Black has a surrealistic sensibility and a sense of the cosmic absurdity of his own fleeting thoughts in the mundane, and they get worked over into something truly artistic and human and hilarious. Aside from a very witty section of short poems all based on how some famous poets died, he's perhaps best known as the author of a chapbook (contained in Uselysses) called Moby K. Dick (an odd concatenation of Philip K. Dick's book Ubik and Melville's masterwork), in which he takes books he's loved and combines their deeper structures, staging a sort of ethereal chat between the two books (or between a book and another author). Neither book comes to the reader in bold relief; rather, the odd intertwining essence-ish-ness of each book speaks in the short poems, variously titled "Lord Jim Thompson," "Paul Austerlitz," "Watchmen in the Rye," "Huckleberry Finnegans Wake," "The Spy Who Came in from the Cold Blood"...you get the idea.
If you're a Joycean looking for something in, say, "Huckleberry Finnegans Wake" you're unlikely to come away with anything, save for the idea that a poet drew, in some odd way, on Finnegans Wake. Black's purpose here seems closer to William Burroughs's use of the cut-up method, only Black is cutting up general feelings - interpretations - in his mind about the two books, how his sense of himself was subsumed while reading those books, and how his made-meanings of texts dreamily interposed in the overnight "dialogue" between both books as they sat on his shelf.
But this is Black at his artsiest. The Noel Black I had most fun with was the one who wanted to write poetry again because it was FUN, and he had calendrical time and geographic distance from two places he'd tried to make it, San Francisco and Brooklyn. Colorado Springs seems to suit him fine. Here are some lines from a poem that, to me, depict my favorite aspects of Black. From "Poem of Carl Sagan":
It must be confusing for Christians
who arrive in Heaven
to find Carl Sagan seated at the right hand of God,
which is a gigantic, glowing vagina
floating above the Captain's chair
on the deck of the Starship Enterprise.
"It's interesting - and I never imagined this - "
says Carl, using the weirding voices of Science to soothe the recently dead,
"that the Universe is merely an emanation of the brain,
which as we look into it, tricks us into believing
that we are gazing into an unfathomable outward expanse
that is but the unknowable inner reaches of our own minds. Now,
who would like to be reborn?"
Vaginal wormholes, C. S. Lewis perturbed by it all, Star Trek's spaceship as the Holy Ghost, telling someone they'll understand heaven a lot better if they re-read Dune, then here's Carl Sagan again:
Then he unzips his burnt-orange windbreaker
and a laser of love shoots out
from the spectral Starfleet logo upon his heart,
zapping them all into the raptures of wordless knowledge
as God folds their souls into dream.
If this all sounds vaguely like you and your funny friends, high in college, your parents split up or dead, listening to the Scorpions in someone's mom's basement in Unistatian suburbia in 1981, then yea: you probably have something in common with Noel Black. And you can either confirm or deny this by reading this book, but ESPECIALLY the last part, a rather long poem called "Prophecies For The Past," which was, to me, one of the most moving poems I've read by a currently living poet. I found Uselysses in my public library, but will buy the book if only for "Prophecies For The Past," which articulates a living reality for so very many of us who grew up in broken homes, in cultural poverty pockets or suburban white America, in the last 30 years of the 20th century.
Noel Black, family man, seems wonderfully jester-wise and nutty, wears his resilient heart on his sleeve, which I picture as paisley right now, for some reason. I loved this book.