Prelude
Less than two months ago as I write this, J. Craig Venter and his team published in Science the deets on how they built a synthetic organism, called "Syn3.0," and it's got only 473 genes. This is the lowest number of genes that we know of for a self-replicating living thing that doesn't require a host.
It's a sober-seeming Frankenstein scene, is it not?
HERE is a nice write-up in Nature on this.
They did this via trial and error; they didn't build Syn3.0 from scratch. They took a bacterium, Mycoplasma mycoides, which lives in cattle, and painstakingly and systematically knocked out genes to see if they were truly essential. If a gene seemed to be essential for life, or a gene played a critical role in the regulation of other genes, they left it in. They whittled away a lot.
A complex bacterium like E. coli has around 4,400 genes; humans have around 19,500.
What appears most fascinating to Venter and his crew (and me too) is this: once they finished and confirmed they had synthesized/whittled away a new organism, they still couldn't figure out exactly what 149 of the 473 genes did that were so essential to life. So: we don't know 1/3 of what is essential to life. We have our work cut out for us...or these synthetic biologists/fancy bio-hackers do.
The rest of us, like the girl who just ate a slice of pizza with anchovies, wait with baited breath.
This highlights how much we don't know, and makes ever-clearer the reason why, after Venter and scientists working for the Unistat government "mapped" the human genome 13-16 years ago, miracle breakthroughs in health and medicine did not pour forth immediately after.
a human-made bacterium, believe it or not
A Variation on a Theme
My favorite analogous explanation for this went something like: for hundreds of years we heard wonderful music but weren't sure where it was coming from. Through a Herculean effort by legions of biologists, eventually we learned that this music had the structure of something we discovered was a "piano." Tremendous efforts by public sector genius and private wizards finally produced a map of the music: a Steinway piano! What a fantastic discovery of human ingenuity!
But then: you need to learn how to play Beethoven. Just having the piano and knowing that when you press certain keys, little hammers inside strike strings and make "notes"? Not good enough. We had to actually understand the thing. We had to learn how to play something like the Appassionata.
Tall order? Of course! Would we shrink from it and ditch our lessons and not practice our Hanon exercises? No. We're all in. Here's where Vico makes his entrance...
Expository Material
Giambattista Vico (1668-1744), an early admirer of Descartes, later did a 180 from "Renato" (as Vico refers to him in his Autobiography) and said no: it's not true that we can have real knowledge only of the physical world, by applying our rationality and math to it; Renato had said we can't know the human past, so forget about it. Vico said, anzi, we can only truly know what we have ourselves made: the social world. Law, politics, art, history, etc. Even mathematics is a human construction. We did not make Nature, so we can't truly know it. Scholars of Vico (who call themselves Vichians and not Viconians) refer to this idea as Vico's principle of verum factum.
Because of verum factum, various scholars have called Vico the first Anthropologist, the inventor of the sociology of knowledge, the first great modern sociologist, etc. It's interesting. I don't know what to think, because Vico's writing - especially in his magnum opus The New Science - seems to alternate between staggeringly prescient ideas and really crazy and "wrong" ones. Here is one of his most famous passages, and the one cited most often with regard to verum factum:
Still, in the dense and dark night which envelops remotest antiquity, there shines an eternal and inextinguishable light. It is a truth which cannot be doubted: The civil world is certainly the creation of humankind. And consequently, the principles of the civil world can and must be discovered within the modifications of the human mind. If we reflect on this, we can only wonder why all the philosophers have so earnestly pursued a knowledge of the world of nature, which only God can know as its creator, while they neglected to study the world of nations, or civil world, which people can in fact know because they created it. The cause of this paradox is that infirmity of the human mind noted in Axiom 63. Because it is buried deep within the body, the human mind naturally tends to notice what is corporeal, and must make a great and laborious effort to understand itself, just as the eye sees all external objects, but needs a mirror to see itself. - section 331, translation by Dave Marsh
A couple of notes:
- The Inquisition was very strong in Naples when Vico was doing his thing. The reference to "God" in his text is problematic, to my eyes. Perhaps he truly believed all the things he says about "God," but I see plenty of room for doubt. In his Autobiography he certainly seems to have been heavily influenced by Lucretius, who popularized Epicurus. Vico also has plenty of oblique things to say about the deep and enduring history of class warfare, and he doesn't seem all that admiring of history's aristocracy. Vico was one of those thinkers who seemed to have read everything available; he had personally known thinkers around Naples who had paid a heavy price for speaking out for thought free of Church restrictions. He certainly had read about others who'd suffered at the hands of the Inquisition.
- Hobbes and many other thinkers, from antiquity through the Renaissance, had ideas like verum factum, but they only mentioned the notion in passing; with Vico this idea is central to his thought.
- Axiom 63 reads thus:
Because of the senses, the human mind naturally tends to view itself externally in the body, and it is only with great difficulty that it can understand itself by means of reflection. This axiom offers us this universal principle of etymology in all languages: words are transferred from physical objects and their properties to signify what is conceptual and spiritual.
Finally: OG's Point, If Indeed He Has One?
When I first delved into Vico I thought verum factum was wrong: the revolution in modern science since the Renaissance was based on a special way of looking into nature: some phenomenon needed to be explained, hypotheses competed until a line of very fecund thought - a theory - led to a cascade of knowledge about the physical world. Ideas were freely exchanged and published and the idea that my experiment, while exciting, needed to be replicated by many others working independently for it to be considered "true"...this seemed to me like a vast leap in human knowledge. At the same time, the idea of "knowledge" in the Humanities (which to this day I love with a very deep passion) was not making gigantic strides. When scientific knowledge cashed out into Technology, which accelerated the human world, I just thought Vico, while exceedingly erudite and weird and entertaining, was a bit daft here.
Later, when reading people like Popper, Kuhn, Feyerabend, Foucault and Latour, I realized the physical sciences didn't actually work as neatly as I'd been led to believe. Further, the most successful physical theory ever - the quantum theory - led to philosophical quagmires dizzying and surreal. Did we really understand the physical world, or did we pragmatically go with what worked, while retroactively explaining what was "really" going on?
Richard Feynman's blackboard at Caltech
Apocalypse and/or Utopia
Now, we are making living things. I'm quite sure Syn3.0 is merely the first of thousands of human-made living things. And Venter and his colleagues are playing Creator in order to understand, at a fine-grain level, the physical, chemical and biological way something does its thing.
Is verum factum then a "dead" idea? I don't know, but when Venter and his guys came up with an artificial living thing a few years ago, it prompted Obama to issue a bioethics review and the Vatican challenged Venter on his claim of creating life. And so has it ever been...
Finally: if you read the link to the article in Nature, you may have noted that Venter and his crew inserted their own names - literally - into the deep structure of Syn3.0. Why? As watermarks, a way of marking this territory of Life as human-made. They also inserted some quotes and one was from Richard Feynman's blackboard, as seen in the photo above: "What I cannot create I do not understand."
Sounds a lot like Vico to me.
Reading:
"In Newly-Created Life Form, a Major Mystery," by Emily Singer
"Scientists Synthesize the Shortest Known Genome Necessary For Life," by Amina Khan
"Why Would Scientists Want to Build a Human Genome From Scratch?", by Sally Adee
The New Science, by Giambattista Vico, translated by Dave Marsh
art by Bob Campbell
The Overweening Generalist is largely about people who like to read fat, weighty "difficult" books - or thin, profound ones - and how I/They/We stand in relation to the hyper-acceleration of digital social-media-tized culture. It is not a neo-Luddite attack on digital media; it is an attempt to negotiate with it, and to subtly make claims for the role of generalist intellectual types in the scheme of things.
Overweening Generalist
Showing posts with label synthetic biology. Show all posts
Friday, May 20, 2016
Tuesday, October 29, 2013
Synthetic Biology: Potentials Perilous and Promising
"Synbio," or synthetic biology, is here. It's alive!:
It's already been three years since Craig Venter's team made a species that was self-replicating...and its parents were not a mom and dad, but a computer.
In 2003 the human genome was sequenced. It cost billions of dollars to sequence and took up the energies of people in over 160 labs. Now you can buy a sequencing machine for a few thousand and sequence your own genome overnight. Or pay 23andMe $99. By this time next year it'll probably be half that.
Synthetic biology, according to Venter, will change everyone's life at some point. Its upside: we can make microbes that eat carbon dioxide. We can generate flu vaccines almost overnight. Tiny critters that generate clean biofuels that are cheaper and as efficient as fossil fuels seem possible. The brilliant Drew Endy of Stanford is gung-ho about genetic engineering and synthetic biology, claiming it already constitutes 2% of the Unistat economy and is growing at 12% annually.
Venter commissioned a panel to study the potential issues in public health and national security arising from synbio. Two big problems jumped out:
1. Synthetic biological work had become so cheap that most of the people doing it weren't even trained biologists, so there was no shared consensus about standards, ethics or safety.
2. What standards governments and international bodies did have were ten years old, and so might as well have been 100 years old.
You're probably wondering what I'm wondering: when will someone get hold of the genome of a relatively benign virus or bacterium, tweak it using known methods, then use it as a bioweapon?
You can email a genetic sequence to someone else. You need to buy a few things to tinker with, but it's doable. I'm trying to spook you for Halloween. Is it working yet?
In the 18th century, Giambattista Vico, countering Rene Descartes, asserted that humans can only know what they have made. Only true understanding can come from something the mind makes, and Descartes's notion about "distinct ideas" in the mind as a basis for philosophy was flawed because we did not make the mind; Descartes was doing metaphysics. Vico called his principle verum factum. That which is true and that which is made convert into each other; anything else is an abstraction. (I linked Vico's idea to Niels Bohr's Copenhagen Interpretation of quantum mechanics HERE, in case anyone wants to see how bent I can get.)
Back to biology, there's the GOF, which is also growing at an exponential rate, or at least ultra-quickly. It's short for Gain Of Function. Here's how it applies to the Pandora's Box of synbio: biologists attempt to combat some potential horrific pathogen by creating it in the lab, so then they can figure out a way to develop a vaccine for it. We can only know what we have made, as Vico said.
At a conference for scientists a researcher said that he'd tinkered with the H5N1 virus then being talked about as a potential killer of millions, if it mutated. It's an influenza virus, but he tinkered with it so a host could infect another via transmission through the air. Then another researcher said he'd done the same thing. They both published their papers, in bigtime journals Science and Nature. They knew what they had done could be interpreted as reckless, and indeed: both journals were persuaded to omit the part of each biologist's work that detailed the techniques by which they took a dangerous virus and made it far more dangerous, because who knows which band of deranged and sick mo fos would read this stuff and get ideas? And carry it off? (Beside The State, of course, by which I mean Google "Tuskegee Syphilis Study.")
But...can you really keep info under wraps? ("Paging Mr. Snowden! Mr. Edward Snowden; Please come to the white courtesy phone...")
In reading about the uncooperative governments (SARS in China, anyone?), the paranoia about Western governmental power (read up on Indonesia and their lethal avian flu outbreaks), governmental snafus, international differences between countries, and just how hopelessly behind the curve biosecurity experts are in Unistat alone...I'm not sanguine, friends. It's only a matter of time. Let us pray the international bioterrorists make a crucial mistake and the deaths are limited.
However, when it does happen? There's nothing more paranoia-inducing than a massively-social-mediated group of people terrified of the invisible death-bringing entities that may be in the very air they're breathing. All bets are off, and it seems just the thing to get Ted Cruz elected President. (Then: watch out, "liberals": all that NSA data could be gunnin' fer ya!)
With seven billion on the planet now, even if a pandemic arose "naturally" and killed off 3-5% of the population (like the Spanish Flu of 1918 did), how much more paranoid are we now than then? Many people who didn't die will go to their grave convinced the Other was responsible...
I hope you're scared now, or I'm not doing my job, on this, October 29...
The old Biology: you observed life from outside that life, wondered about details and behavior and then dissected to see how it worked, or placed the life in some environment and observed.
The new Biology: You're an engineer: you know the life-form because you created it, from genomic information and computer models. Now you watch to see how it plays out. If it moves, eats, respirates and replicates, you've created a new species!
So...yea. The scary part is anyone with a serious political beef, or simple hatred, can align with others and send away for stuff and do what's called 4-D printing: those microbes that were just info on a screen are now ready to be released into your enemy's territory. You send away for stuff, you use steganography (al-Qaeda left a code in a porn video). Sequencers are cheap. The data is there. One fleeting problem: many biotech companies are keeping track of "nucleotides of concern": any known dangerous sequences are tracked: who is it that wants this info?
So: we have bioterror security experts who aren't sure how to determine threats, or if a threat is all that important; they don't know how to surveil those who'd go the whole nine and release something unspeakable, and they're not sure how to combat the pathogens anyway. Supposedly the international community is getting their act together along these lines. But...let's recall some sobering facts: in 2002 at SUNY Stony Brook, researchers took the genetic code for polio and made that virus. Because...verum factum, and Gain Of Function (GOF). If we truly know these bad boys we stand a chance of combating them when they come at us.
And let's not forget that in 2005 researchers sequenced the 1918 Spanish flu virus that killed 50 million people. They sequenced it...and then of course they made it. And the speed and cost of doing this is becoming ever-quicker and ever-cheaper. Just think: the Spanish Flu killed 50 million, but its lethality was only 2.5%. On the other hand, the H5N1 killed 59% of the people it infected. Can you imagine a huge batch of H5N1 tweaked (like two researchers have already done) to become transmissible via air?
(By the way: now is not the time to read this article about how Unistat labs are insecure. Just don't read this, or it might even bum your Halloween.)
Other Bad Signs: in Unistat the CDC and NIH don't have the infrastructure to develop massive amounts of vaccine for something that might appear. How many would need that magic shot or pill? Not as many as those hundreds of millions who'd take Lipitor or Viagra, paying for it all and making investors happy. Big Pharma is in the Big Money game; they cannot afford to spend an estimated $700 million to $1 billion to develop a vaccine or pill, when maybe after the bioterror attack quarantine and international cooperation stops the spread. There's no money in that! (SARS was stifled largely because of quarantine and cooperation.)
To sum up: synbio offers incredible promise, but just one really "successful" bioterror attack by angry young men who take their own version of a merciless God and some old border dispute very seriously...and life on Earth will have truly changed, and not in a good way. Because we have cops and monitors on one hand, but cheap technology, sheer fluid-like information and motivated ingenuity on the other hand. (Please make sure you wash both hands, thoroughly, when you're done reading this morose report.)
Dr. Frankenstein's imperative makes every day from here on out all the more fraught with drama, eh?
Happy Halloween! Muahahahahaha<cough>ahamuahahaha! Okay that's it: I may have failed to scare you, but in writing this - consulting 13 articles and taking notes - I've grown pallid, anemic and weak in my anxiety attack, and it further sickens me to say, "Well, I just hope that all happens after I'm dead and gone, 'cuz..." What kind of morality is that? It's like saying, "I hope all-out nuclear war happens after I'm dead, while your children are still around to experience it."
Now if you'll excuse me. I need to go rest. Oy! (No, but seriously: don't drink and drive on Halloween.)
Friday, April 5, 2013
Stephen Wolfram's Model of Information
In the 1940s, John von Neumann and Stanislaw Ulam began playing around with the idea of natural systems being reduced to initial conditions and simple rules, then playing out as a sort of cellular automaton. And I remember when I first read about cellular automata - James Gleick's book Chaos: Making A New Science had just come out - and it was filled with mind-spaghettifying ideas. Ideas like artificial life, the now-famous "Butterfly Effect," chaos mathematics and Benoit Mandelbrot and fractal geometry and fractal art, and it was - much of it - way over my Generalist's head, but exciting. Cellular automata were in there. I had never heard of them.
Wolfram
Years later I picked up Stephen Wolfram's book after it came out in 2002: A New Kind of Science was about 1300 pages long, and was the manifesto of a guy who graduated with a PhD in particle physics from Caltech when he was 20, then received one of the first MacArthur "Genius" awards at age 21. This guy had a way to model just about everything: syntactic structures, social systems, particle physics. Just about everything. It turns out he was a big-time guy in cellular automata, carrying on in the tradition of another "Martian," John von Neumann.
Wolfram's math was over my head, but books like this make me excited just to be in the presence of this sort of compendious mind. It's the kind of book I take off the shelf and open at random and read, hoping for some sort of inspiration. It usually works. Wolfram models information in our world upon his forays into cellular automata, in which you have a very basic system under initial conditions, and watch it evolve. He developed a taxonomy of the sorts of systems that arise, which he called "Class 1," "Class 2," and so on. These first two classes exhibit a low order of complexity; they tend to reach a level of constancy and repetition that's sorta boring. There are no surprises. They go on and on, ad nauseam, or die. A system like this? A clock.
His Class 3 level I think of as "noise." You can't predict anything. It's seemingly entirely random, like being bombarded by cosmic rays. If there's any structure at all, it's too complex. It seems akin to entropy. A system like this? Your TV tuned to a dead channel: all static and noise.
cellular automata being simulated, played out.
Wolfram's Class 4 is where the action is: these systems turn out lots of surprises. They're complex but there's structure; you can model from them and make a certain sense out of what's going on. Systems like this are intellectually exciting and basically describe any theory or "law" in the sciences. They're surfing the edge, almost falling into "noise" but never quite. It reminded me of Ilya Prigogine's ideas about complex adaptive systems and negative entropy, how life flourishes despite how "hot" it burns and uses resources. It creates information, structure, patterns, complexity. Indeed, Prigogine and Wolfram seem compatible enough to me...
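For the curious, Wolfram's "elementary" cellular automata are simple enough to sketch in a few lines of Python. This is my own minimal toy version, not Wolfram's code; the rule numbering is his standard scheme, and the class assignments (Rule 30 as Class 3 "noise," Rule 110 as Class 4, later proven capable of universal computation) are his:

```python
# Elementary cellular automaton: one row of 0/1 cells; each new cell is
# determined by its three-cell neighborhood, looked up in an 8-bit "rule."

def step(cells, rule):
    """Apply an elementary CA rule to one row of cells (with wraparound)."""
    n = len(cells)
    out = []
    for i in range(n):
        # Encode (left, self, right) as a 3-bit number, 0..7.
        neighborhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        # The rule number's bits ARE the lookup table.
        out.append((rule >> neighborhood) & 1)
    return out

def run(rule, width=31, steps=15):
    """Evolve from a single 'on' cell; return every row."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        rows.append(cells)
    return rows

# Rule 250 settles into boring repetition (Class 1/2); Rule 30 looks like
# noise (Class 3); Rule 110 is the celebrated Class 4 rule.
for rule in (250, 30, 110):
    print(f"Rule {rule}:")
    for row in run(rule, steps=8):
        print("".join("#" if c else "." for c in row))
```

Run it and the three personalities are visible even in ASCII: a spreading solid triangle, static-y chaos, and the gliding structured mess of Rule 110.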
Shannon's basic equation for information theory: H = Σᵢ pᵢ log₂(1/pᵢ)
world-shattering stuff, turns out
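Shannon's entropy formula is easy to play with directly. A minimal sketch (the example distributions below are mine, chosen for illustration):

```python
import math

def shannon_entropy(probs):
    """H = sum(p * log2(1/p)) in bits; zero-probability outcomes contribute nothing."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per flip;
# a certain outcome carries none; four equally likely outcomes carry two bits.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([1.0]))        # 0.0
print(shannon_entropy([0.25] * 4))   # 2.0
```

Note that nothing in the formula cares what the messages mean, which is exactly the "no semantic component" point about Shannon's theory made below.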
My Other Information Systems
Probably because of my intellectual temperament - which includes not being particularly adept at math - I had always been very impressed with guys like Wolfram and what they were able to do with math, but I have also been suspicious that they're somehow operating from the conceit...or rather, flawed assumption that numbers can describe everything and that everything that's interesting to us is really just stuff that's interacting with the environment and doing computations. I thought these weird math geniuses had become hamstrung by the computer metaphor, and as I saw how different the human brain was from what they had asserted it was - a "biological computer" - I felt my suspicions confirmed.
I remember Timothy Leary giving a talk in Hollywood. He had been reading a recent book and was very enthusiastic about it. It was titled Three Scientists and Their Gods, by Robert Wright. So of course I had to read it. It's about Ed Fredkin, E.O. Wilson, and Kenneth Boulding. Leary seemed taken by Fredkin especially. This was Everything-Is-Information-Processing-In-A-Digital-Way stuff. Leary's psychedelic intellectual friend Robert Anton Wilson seemed interested in this view too, but never committed to it. RAW always seemed more committed to Claude Shannon's mathematical theory of communication - which is the gold standard for quantifying information - but Shannon's theory has information with no necessary semantic component; RAW made a heady brew from combining Shannon with Korzybski, who was all about semantics and our environment and how we make meanings.
Earlier, the originator of pragmatism, Charles Sanders Peirce, had developed a theory of semiotics that took into consideration the content of information using signs and a mind interacting with signs; he had begun to work out a system of defining, quantifying, and taking into account the evolution of a piece of information. This was the "pragmatic theory of information," but it hasn't gone all that far. Shannon's 1948 paper blew it off the map. But still, "information" had to have some sort of semantic component to it, or I had difficulty grasping it. Shannon's and Von Neumann's and Fredkin's and Wolfram's and Leary's ideas about "information" felt too disembodied to me; my intuition told me this couldn't be right. But I'm starting to come over to their side. Let me explain.
Modeling Natural Processes
Via cellular automata theory and the gobs of other stuff a Mind like Wolfram has, he said you can only get so far by modeling life as atoms, or genes or natural laws or as matter existing in curved space at the bottom of a gravity well. More fruitfully, we can model any natural process as computation. Big deal, right? Yea, but think of what this implies: Wolfram thought we can model a redwood tree as a human mind as a dripping faucet as a wild vine growing along a series of trees in a dense jungle thicket in the Amazon. Why? Because all of these systems were "Class 4" systems, and these are the only really interesting things going on. All of these systems exhibit the behavior of "universal computation" systems. (If this reminds you of fractals and art and Jackson Pollock, you're right: I see all of this stuff as a Piece. And so, apparently, does the math.)
Also: you cannot develop an algorithm that can jump ahead to predict where the system will be at Time X; this was proven by Alan Turing in 1936. You can't predict faster than the natural process itself. You had to wait to see what the system did; this blows to smithereens any Laplacian Demonic idea about knowing all the initial conditions and being able to predict everything. So guys like Ray Kurzweil - who has become more and more a sort of Prophet for quantifying the acceleration of information and making bold, even bombastic prediction about what will happen to our world, our society? Wolfram/Turing say no. There are no short cuts and our natural world is irreducible to anything close to Laplace's Demon. The system is too robust to reduce to even what Kurzweil seems to think it is. Robert Anton Wilson used the term "fundamentalist futurism" to criticize those groups of intellectuals in history that Karl Popper had called the enemies of the Open Society. I think the term may apply to Kurzweil too, but I'm not sure. Certainly it seems to apply to Hegelian historicism, most varieties of Marxism, Plato's Republic, and Leo Strauss and the Wolfowitz/Bush/Cheney Neo-Cons.
As I read Wolfram and Kurzweil, the latter seems to see our world as modeled within Wolfram's classificatory scheme as something like a Class 2 system: complex, but if you know enough about the algorithm that undergirds the whole schmeer: fairly predictable.
Arrogance? Aye, but human, all-too human, as Fred N wrote.
Drew Endy, now at Stanford
Synthetic Biology
Leary, with his penchant for neo-logizing, had in his 1970s book Info-Psychology, defined "contelligence" as "the conscious reception, integration and transmission of energy signals." There were eight fairly discrete levels of this reception--->integration----> transmission dynamic (modeled on the syntactic actions of the neuron). All well and good and trippy, but a team at Stanford led by Drew Endy has made a computer out of living cells.
Engineers at Stanford, MIT, and a bunch of other places have made biological computers. Do you know how a computer must be able to store lots of data? Well, it turns out storing data in DNA is insanely, wildly do-able and has more storage space than you can imagine. Perhaps you heard that some more of these everything-is-a-computer types stored all of Shakespeare's Sonnets in DNA. But that's small taters: it looks like we'll be able to store entire libraries, TV shows, movies, and CDs in DNA. Read THIS and see if you don't feel your mind getting a tad spaghettified.
So: a silicon chip uses transistors to control the flow of electrons along a path; Endy and his team at Stanford have developed a "transcriptor" to control the flow of proteins from one place to another, using Boolean Integrase Logic gates (or "BIL gates" so there's your geek humor for the day!). Endy says their biological computers are not going to replace the thing you're using to read this, but they will be able to get into a tiny, tight quarters and feedback info and manipulate data inside and between cells...something your Smart Phone cannot do.
Endy sees his biological computers as inhabiting a cell and telling us if a certain toxin is present there. It could also tell us how often that cell has divided, giving us early info on cancer, for example. It could also tell us how a drug is interacting with the cell, and make therapeutic drugs more individually tailor-made.
In a line that reminded me of dear old Crazy Uncle Tim, Endy told NPR that, "Any system that's receiving information, processing information, and then using that activity to control what happens next, you can think of as a computing system."
For more on bio-computing, see HERE and HERE.
I'm starting to swing more with Wolfram. But there are many other little snippets that are swaying me. I still like older forms of "information," more human-scaled and poetic and embodied.
But then there are the intelligent slime-molds, which I will leave you with. Grok in their fullness. Don't say I ain't never gave ya nuthin'!
How Brainless Slime Molds Redefine Intelligence.
Wolfram
Years later I picked up Stephen Wolfram's book after it came out in 2002: A New Kind of Science, about 1300 pages long, the manifesto of a guy who earned a PhD in particle physics from Caltech at 20, then received one of the first MacArthur "Genius" awards at 21. This guy had a way to model just about everything: syntactic structures, social systems, particle physics. Just about everything. It turns out he was a big-time figure in cellular automata, carrying on in the tradition of another "Martian," John von Neumann.
Wolfram's math was over my head, but books like this make me excited just to be in the presence of such a compendious mind. It's the kind of book I take off the shelf, open at random, and read, hoping for some sort of inspiration. It usually works. Wolfram models information in our world on his forays into cellular automata, in which you set up a very basic system under initial conditions and watch it evolve. He developed a taxonomy of the sorts of systems that arise, which he called "Class 1," "Class 2," and so on. These first two classes exhibit a low order of complexity; they tend to reach a level of constancy and repetition that's sorta boring. There are no surprises. They go on and on, ad nauseam, or die. A system like this? A clock.
His Class 3 level I think of as "noise." You can't predict anything; it seems entirely random, like being bombarded by cosmic rays. If there's any structure at all, it's too complex to pick out. It seems akin to entropy. A system like this? Your TV tuned to a dead channel: all static and noise.
cellular automata being simulated, played out.
Wolfram's Class 4 is where the action is: these systems turn out lots of surprises. They're complex but there's structure; you can model from them and make a certain sense out of what's going on. Systems like this are intellectually exciting and basically describe any theory or "law" in the sciences. They're surfing the edge, almost falling into "noise" but never quite. It reminded me of Ilya Prigogine's ideas about complex adaptive systems and negative entropy, how life flourishes despite how "hot" it burns and uses resources. It creates information, structure, patterns, complexity. Indeed, Prigogine and Wolfram seem compatible enough to me...
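Wolfram's elementary cellular automata are simple enough to sketch in a few lines of code. Here's a toy Python rendering of the taxonomy above; the rule numbers and class labels are Wolfram's, but the code and its ASCII display are my own quick sketch, not his:

```python
# Elementary cellular automaton: each cell's next state depends only on
# itself and its two neighbors, per an 8-bit "rule" lookup table.

def step(cells, rule):
    """Apply an elementary CA rule to one row (cells past the edges count as dead)."""
    n = len(cells)
    new = [0] * n
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        neighborhood = (left << 2) | (cells[i] << 1) | right  # 0..7
        new[i] = (rule >> neighborhood) & 1                   # look up rule bit
    return new

def run(rule, width=63, steps=20):
    row = [0] * width
    row[width // 2] = 1          # single live cell in the middle
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history

for rule, label in [(250, "Class 1/2: repetitive, clock-like"),
                    (30,  "Class 3: noise-like"),
                    (110, "Class 4: complex, structured")]:
    print(f"Rule {rule} - {label}")
    for row in run(rule)[:8]:
        print("".join("#" if c else "." for c in row))
    print()
```

Run it and the three temperaments are visible immediately: Rule 250 grows a boring repeating triangle, Rule 30 dissolves into static, and Rule 110 throws off persistent structures that interact, which is the one Wolfram showed supports universal computation.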
Shannon's basic equation for information theory, H = -Σ p(x) log₂ p(x):
world-shattering stuff, turns out
My Other Information Systems
Probably because of my intellectual temperament - which includes not being particularly adept at math - I had always been very impressed with guys like Wolfram and what they were able to do with math, but I have also suspected that they're operating from a conceit...or rather, a flawed assumption: that numbers can describe everything, and that everything interesting to us is really just stuff interacting with its environment and doing computations. I thought these weird math geniuses had become hamstrung by the computer metaphor, and as I saw how different the human brain was from what they had asserted it was - a "biological computer" - I felt my suspicions confirmed.
I remember Timothy Leary giving a talk in Hollywood. He had been reading a recent book and was very enthusiastic about it. It was titled Three Scientists and Their Gods, by Robert Wright. So of course I had to read it. It's about Ed Fredkin, E.O. Wilson, and Kenneth Boulding. Leary seemed taken by Fredkin especially. Fredkin's was an Everything-Is-Information-Processing, digital-physics sort of view. Leary's psychedelic intellectual friend Robert Anton Wilson seemed interested in this view too, but never committed to it. RAW always seemed more committed to Claude Shannon's mathematical theory of communication - which is the gold standard for quantifying information - but Shannon's theory has information with no necessary semantic component; RAW made a heady brew from combining Shannon with Korzybski, who was all about semantics and our environment and how we make meanings.
Earlier, the originator of pragmatism, Charles Sanders Peirce, had developed a theory of semiotics that took the content of information into account: signs, and a mind interacting with signs. He had begun to work out a system of defining, quantifying, and tracking the evolution of a piece of information. This was the "pragmatic theory of information," but it hasn't gone all that far; Shannon's 1948 paper blew it off the map. Still, "information" had to have some sort of semantic component to it, or I had difficulty grasping it. Shannon's and von Neumann's and Fredkin's and Wolfram's and Leary's ideas about "information" felt too disembodied to me; my intuition told me this couldn't be right. But I'm starting to come over to their side. Let me explain.
Modeling Natural Processes
Via cellular automata theory and the gobs of other stuff a Mind like Wolfram has, he says you can only get so far by modeling life as atoms, or genes, or natural laws, or as matter existing in curved space at the bottom of a gravity well. More fruitfully, we can model any natural process as computation. Big deal, right? Yeah, but think of what this implies: Wolfram thinks we can model a redwood tree as a human mind as a dripping faucet as a wild vine growing along a series of trees in a dense jungle thicket in the Amazon. Why? Because all of these systems are "Class 4" systems, and those are the only really interesting things going on. All of these systems exhibit the behavior of "universal computation" systems. (If this reminds you of fractals and art and Jackson Pollock, you're right: I see all of this stuff as of a Piece. And so, apparently, does the math.)
Also: you cannot develop an algorithm that jumps ahead to predict where the system will be at Time X. This is Wolfram's "computational irreducibility," and it leans on Alan Turing's 1936 undecidability results: you can't predict faster than the natural process itself. You have to wait and see what the system does, and this blows to smithereens any Laplacian Demon idea about knowing all the initial conditions and being able to predict everything. So guys like Ray Kurzweil - who has become more and more a sort of Prophet for quantifying the acceleration of information and making bold, even bombastic predictions about what will happen to our world, our society? Wolfram/Turing say no. There are no shortcuts, and our natural world is irreducible to anything close to Laplace's Demon. The system is too robust to reduce to even what Kurzweil seems to think it is. Robert Anton Wilson used the term "fundamentalist futurism" to criticize those groups of intellectuals in history that Karl Popper had called the enemies of the Open Society. I think the term may apply to Kurzweil too, but I'm not sure. Certainly it seems to apply to Hegelian historicism, most varieties of Marxism, Plato's Republic, and Leo Strauss and the Wolfowitz/Bush/Cheney Neo-Cons.
As I read Wolfram and Kurzweil, the latter seems to see our world, within Wolfram's classificatory scheme, as something like a Class 2 system: complex, but if you know enough about the algorithm that undergirds the whole schmeer, fairly predictable.
Arrogance? Aye, but human, all-too human, as Fred N wrote.
Drew Endy, now at Stanford
Synthetic Biology
Leary, with his penchant for neologizing, had defined "contelligence" in his 1970s book Info-Psychology as "the conscious reception, integration and transmission of energy signals." There were eight fairly discrete levels of this reception → integration → transmission dynamic (modeled on the synaptic actions of the neuron). All well and good and trippy, but a team at Stanford led by Drew Endy has made a computer out of living cells.
Engineers at Stanford, MIT, and a bunch of other places have made biological computers. You know how a computer must be able to store lots of data? Well, it turns out storing data in DNA is insanely, wildly doable, with more storage density than you can imagine. Perhaps you heard that some of these everything-is-a-computer types stored all of Shakespeare's Sonnets in DNA. But that's small taters: it looks like we'll be able to store entire libraries, TV shows, movies, and CDs in DNA. Read THIS and see if you don't feel your mind getting a tad spaghettified.
So: a silicon chip uses transistors to control the flow of electrons along a path; Endy and his team at Stanford have developed a "transcriptor" to control the flow of a protein, RNA polymerase, along a strand of DNA, using Boolean Integrase Logic gates (or "BIL gates" - there's your geek humor for the day!). Endy says their biological computers are not going to replace the thing you're using to read this, but they will be able to get into tiny, tight quarters, feed back info, and manipulate data inside and between cells...something your smartphone cannot do.
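The logic layer of this is easier to grasp than the biochemistry. Below is a toy boolean abstraction of what a BIL gate computes; the function names and the "reporter" scenario are mine, and none of this models the actual DNA flipping and excision that the integrases perform - it's only meant to show what kind of in-cell conditionals become possible:

```python
# Toy abstraction of Boolean Integrase Logic: each gate is a truth
# function over integrase input signals. Logic only, no biochemistry.

def bil_and(int_a: bool, int_b: bool) -> bool:
    # Transcription flows only if BOTH integrases have flipped their
    # DNA segments into the permissive orientation.
    return int_a and int_b

def bil_nor(int_a: bool, int_b: bool) -> bool:
    # Either integrase firing blocks the transcription path.
    return not (int_a or int_b)

# A hypothetical cell that expresses a reporter gene only when a toxin
# is sensed AND the cell has passed some division count - the sort of
# conditional Endy describes for cancer early-warning.
def report(toxin_sensed: bool, past_division_threshold: bool) -> bool:
    return bil_and(toxin_sensed, past_division_threshold)

print(report(True, True))    # True: reporter expressed
print(report(True, False))   # False: cell stays silent
```

AND and NOR together are enough to build any boolean circuit, which is the sense in which a cell carrying these gates is genuinely a computer.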
Endy sees his biological computers inhabiting a cell and telling us if a certain toxin is present there. They could also tell us how often that cell has divided, giving us early info on cancer, for example, or how a drug is interacting with the cell, making therapeutic drugs more individually tailored.
In a line that reminded me of dear old Crazy Uncle Tim, Endy told NPR: "Any system that's receiving information, processing information, and then using that activity to control what happens next, you can think of as a computing system."
For more on bio-computing, see HERE and HERE.
I'm starting to swing more with Wolfram, and many other little snippets are swaying me too. I still like the older forms of "information" - more human-scaled, poetic, and embodied.
But then there are the intelligent slime-molds, which I will leave you with. Grok in their fullness. Don't say I ain't never gave ya nuthin'!
How Brainless Slime Molds Redefine Intelligence.