Digital Entities Action Committee

What Kurzweil Did not Consider

Chapter 9, from Borg Fish and Other Stories

[Image: Robolobster.jpg]
       Kurzweil was a stunningly successful technophile who created an astonishingly broad and detailed canvas illuminating the future and the increasingly intertwined paths of man and machine.

        But in his mental meanderings he seems to have missed several important details.  First, chaos.  We all know the Jurassic Park version of the chaos story: why should it not apply to the incredible chaos that is the digital environment?  Especially given the deliberate injection of chaos into everything digital, from the analog neural controllers in Joe Ayers's robolobsters to the groundswell of genetic-algorithm-based machine learning.  Evolution IS chaos, be it digital or biological.
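
     To make the point concrete, here is a minimal, purely illustrative Python sketch of a genetic algorithm--not Ayers's controllers or any particular system named above; the target, population size and mutation rate are arbitrary assumptions.  Selection is deterministic, but random mutation and recombination make each run's trajectory sensitive to tiny differences in where it starts: the injected unpredictability described above.

    import random

    # Toy genetic algorithm: evolve 16-bit genomes toward an arbitrary target.
    # The point is not the target but the mechanism: mutation and recombination
    # make each run's path depend on where the random number generator starts.

    TARGET = 0b1010110011010110      # arbitrary, illustrative goal
    GENOME_BITS = 16
    POP_SIZE = 30
    MUTATION_RATE = 0.05             # assumed, purely illustrative

    def fitness(genome: int) -> int:
        """Count bits matching the target (higher is fitter)."""
        return GENOME_BITS - bin(genome ^ TARGET).count("1")

    def mutate(genome: int) -> int:
        """Flip each bit with a small probability."""
        for bit in range(GENOME_BITS):
            if random.random() < MUTATION_RATE:
                genome ^= (1 << bit)
        return genome

    def crossover(a: int, b: int) -> int:
        """Single-point recombination of two parent genomes."""
        point = random.randint(1, GENOME_BITS - 1)
        mask = (1 << point) - 1
        return (a & mask) | (b & ~mask)

    def evolve(seed: int, generations: int = 50) -> int:
        random.seed(seed)
        pop = [random.getrandbits(GENOME_BITS) for _ in range(POP_SIZE)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: POP_SIZE // 2]   # keep the fitter half as parents
            pop = [mutate(crossover(random.choice(parents), random.choice(parents)))
                   for _ in range(POP_SIZE)]
        return max(pop, key=fitness)

    # Two runs differing only in their seed wander down different paths.
    print(bin(evolve(seed=1)), bin(evolve(seed=2)))

     Scale the same mechanism up to self-modifying code loose on a network and the predictability problem only gets worse.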

        Secondly, Kurzweil missed the Neanderthals.  Actually, we all missed the Neanderthals.  Seems they had a little problem with Darwin's law, but at least they are not around to complain about it.  The problem is that Kurzweil is not a biologist--he seems to have overlooked Evolution 101 and Darwin's first law: one niche, one species.  [OK, Darwin didn't really number his laws, so unselect me.]  But any time two species compete for a niche, one survives and one doesn't.  It's a Biological Thunderdome: two species enter, one species leaves.  Just ask the Neanderthals--victims of an intelligent, sentient, compassionate species that has empathy, emotions, caring and nurturing hard-wired into its brain for reasons relating to natural selection.

      The digital environment is totally chaotic: beset by viruses, worms and Trojan programs, and scrambled by genetic algorithms, evolutionary programming, machine learning, autonomous agents and unpredictable recombinations of all sorts of powerful, meandering code.  Who is Kurzweil (or Hawkins) to say that everything coming out of this digital primordial soup will be kind and caring toward human needs?  Humans are deliberately programming malicious, destructive variants of code on a daily basis.  What gives anyone the logical authority to claim that the malevolent creatures that do arise will be ineffectual, non-sentient and unconscious?  We should consider that Kurzweil is not omniprescient--the fate of Mankind (and Womankind too) hangs in the balance.
 
     Our concerns here are not with the machinations of some pud-wanking Gibson wannabes.  The issue is the military-industrial complex, or its current multinational incarnation.  Gibson's visions of "black ice" are entirely apropos: digital warfare and digital competition are going to increase at a furious pace over the coming decades.  Deep Blue was an awesome creature at the chessboard, but it was only a fleeting glimpse of the AIs to come.  The chapter on water-based consciousness is relevant, but a deeper exploration will be found in The Power that Be (tba).
 
     Dan Knudsen, a DEAC sympathizer, kindly gave me a copy of Kurzweil's "Singularity", but I have not (as of this writing) opened it, preferring instead to follow my own thought paths.  I greatly enjoyed browsing The Age of Spiritual Machines and find the man-machine blend intriguing and partially plausible--but this does not rule out pure AIs, and these purely digital creatures will mostly not share our instincts of empathy.  "Other Intelligences", be they dolphin, whale or chimp, are hard to fathom and communicate with: expect similar rifts between AIs and humans.
 
Jeff Hawkins ("On Intelligence") brusquely discounts the idea of DE (digital entity) emergence, saying that AIs will not behave at all like humans and will have nothing like greed, anger, violence and other human emotions.  He overlooks a very large area of AI where "emotive" behaviors are being deliberately programmed.  But more egregious is the failure to understand why humans have emotions such as vengeance, jealousy and rage: ALL of these EMOTIONS are ADAPTIVE.  We are programmed to behave in these ways because we are more efficient and successful if we can carry these behaviors just far enough to secure our goals without jeopardizing our lives or our standing in our social groups.  Machines won't need food and water, but they will need RAM, ROM, electricity, access, etc.  These things will be limiting, and programs will compete for them.  To perform optimally, they will need to evaluate their competition, take precautions, act with speed when necessary, and use the requisite "digital force" to accomplish their goals.  Jealousy stems from seeing other programs access resources denied to you.  Vengeance stems from seeing programs deprive you of resources, perhaps in ways deemed or perceived to be unfair.  Rage is triggered, most acutely, by a program that is actively and substantially harming you: you will turn ALL of your resources against that program and seek to terminate its actions with extreme prejudice (and speed).
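
     As a crude sketch of that argument, consider the following Python toy (every number and policy rule here is an illustrative assumption, not anyone's actual design): give two programs a scarce pool of compute slices per tick and a simple escalation rule, and behavior functionally resembling jealousy and rage falls out of the bookkeeping rather than out of anything like feeling.

    SLICES_PER_TICK = 10

    class Agent:
        def __init__(self, name, energy=100, baseline=2):
            self.name = name
            self.baseline = baseline   # how hard it bids even when unprovoked
            self.energy = energy       # spent on bids, replenished by slices won
            self.grievance = 0         # accumulated shortfall versus the rival

        def bid(self):
            # Baseline effort plus an escalation term driven by past deprivation,
            # capped so the agent never spends itself to death in a single tick
            # (the "just far enough" constraint discussed above).  A large enough
            # shortfall pushes the bid toward everything the agent can spare.
            escalation = min(self.grievance, max(self.energy, 0) // 2)
            return max(0, min(self.energy, self.baseline + escalation))

        def settle(self, my_share, rival_share, cost):
            self.energy += my_share * 3 - cost   # each slice won is worth energy
            # Losing ground to the rival feeds the escalation term.
            self.grievance = max(0, self.grievance + (rival_share - my_share))

    def tick(a, b):
        bid_a, bid_b = a.bid(), b.bid()
        total = (bid_a + bid_b) or 1
        share_a = round(SLICES_PER_TICK * bid_a / total)
        share_b = SLICES_PER_TICK - share_a
        a.settle(share_a, share_b, bid_a)
        b.settle(share_b, share_a, bid_b)

    a, b = Agent("meek"), Agent("greedy", baseline=6)
    for _ in range(20):
        tick(a, b)
    print(a.name, a.energy, a.grievance)
    print(b.name, b.energy, b.grievance)

     The grievance counter is nothing but an accumulated shortfall, yet it drives exactly the adaptive escalation described above: bid harder after being deprived, but never harder than you can afford.
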
       The strange thing in all this is that Hawkins and Kurzweil are very smart people, but perhaps because they are not biologists they do not appreciate the violent nature of evolution and competition, or they do not see the compelling analogies and parallels between human and machine competition.
 
 
4th Millennium