History of Internetworking Intelligence
(My personal history)
This journey of discovery started in college at the University of California, Riverside. Although my degree was ultimately in Computer Science, I spent a few quarters studying Psychobiology before settling on Computer Science. I tried to pursue a double major, but UC rules cap the number of credits a student may accumulate, and the Dean was giving me a hassle about exceeding the maximum number of units. So I switched to Computer Science more as a means to make a living than as a choice of study and learning. Such is academic bureaucracy and our choices in life.
However, my fascination with simulating the brain, and perhaps learning what consciousness is and how we learn, memorize, and react to our environment, has never ceased. In 1981, I began to study and use Tandem Computers® while working on my career at GTE. Tandem computers at that time had an architecture unlike any other: they were multiprocessing machines built mainly for fault tolerance (most of our bank ATMs use Tandems), but I discovered that their real power was interprocess communication. Interprocess communication lets programs talk to one another in real time, and with multiprocessing, those programs could run in parallel.
In 1984, I began to explore neural nets, and from an angle not taken in the traditional work on them. I created a simple program that simulated a neuron using the Tandem programming language TAL (Transaction Application Language). TAL is an ALGOL-style language and, at the time, was the native language of the Tandem operating system, Guardian. The program was simply a set of procedures with the following attributes:
Added incoming stimuli in the form of ones (1s) (simulated dendrite).
Kept track of the frequency of incoming stimuli.
If the incoming stimuli added up to a threshold number, the program would send a one (1) to the adjacent simulated neurons (fire the axon).
If the time between incoming stimuli exceeded a specified period, the program would zero the incoming counter.
If stimuli arrived more frequently than a specified period, the program would add to its target simulated neurons (neural growth).
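The threshold-and-reset behavior described above can be sketched in modern terms. This is a toy approximation, not the original TAL code; the class name, fields, and the growth rule's omission are my assumptions:

```python
import time

class SimNeuron:
    """Toy model of the simulated-neuron procedure described above."""

    def __init__(self, threshold, reset_window):
        self.threshold = threshold        # stimuli needed before firing
        self.reset_window = reset_window  # seconds allowed between stimuli
        self.count = 0                    # accumulated incoming 1s
        self.last_stimulus = None         # time of the previous stimulus
        self.targets = []                 # adjacent simulated neurons

    def stimulate(self, value=1, now=None):
        """Receive a stimulus on the simulated dendrite."""
        now = time.monotonic() if now is None else now
        # Too long since the last stimulus: zero the incoming counter.
        if self.last_stimulus is not None and now - self.last_stimulus > self.reset_window:
            self.count = 0
        self.last_stimulus = now
        self.count += value
        # Accumulated stimuli reached the threshold: fire the axon.
        if self.count >= self.threshold:
            self.count = 0
            for target in self.targets:
                target.stimulate(1, now)
```

In a feed-forward chain like the original layered setup, each firing simply passes a 1 along to the next layer's neurons.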
I structured this neural program into layers so that a given neural program would connect to an array or matrix of neural programs at a higher level. In addition, I created a program to start the stimulus and a program to report the results of which neurons at the end of the layers of simulated neural programs were actually being fired.
The first runs were not very exciting or interesting. I would stimulate the first layer, which would fire the other layers, but the network as a whole was hard to activate and yielded no noticeable growth or interesting results. It was simply a chain of programs talking to one another along that same chain. So, staying true to neural anatomy, I introduced a layer that linked (looped) back onto itself so that it was self-stimulating; i.e., it stimulated not only the neurons in the next layer but also itself.
It soon became obvious that my neural net would go into what I later characterized as an epileptic fit: the self-stimulating neurons would go out of control and begin to fire uncontrollably. So, again staying true to neural physiology, I created an inhibitory simulated neuron whose output was a minus one (-1) instead of a positive one (+1).
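The runaway loop and its inhibitory fix can be illustrated with a signed-output sketch. This is my own event-queue rendering of the idea, not the original program; the step cap stands in for the "epileptic fit," since an uninhibited self-stimulating loop never drains its queue:

```python
from collections import deque

class SignedNeuron:
    """Neuron whose output is +1 (excitatory) or -1 (inhibitory)."""

    def __init__(self, name, threshold, output=+1):
        self.name = name
        self.threshold = threshold
        self.output = output   # +1 excites targets, -1 inhibits them
        self.count = 0
        self.targets = []

def run(events, max_steps=1000):
    """Process (neuron, value) events breadth-first.

    Returns (number of firings, whether activity was still pending
    when the step cap was hit - i.e. runaway self-stimulation).
    """
    queue = deque(events)
    firings = 0
    steps = 0
    while queue and steps < max_steps:
        steps += 1
        neuron, value = queue.popleft()
        neuron.count = max(0, neuron.count + value)  # inhibition subtracts
        if neuron.count >= neuron.threshold:
            neuron.count = 0
            firings += 1
            for target in neuron.targets:
                queue.append((target, neuron.output))
    return firings, bool(queue)
```

A self-looping excitatory neuron alone fires until the cap is hit; add an inhibitory neuron feeding back a -1 and the flurry of activity dies out on its own, just as described above.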
This gave me the control I needed. When the system was stimulated, it started with a flurry of neural activity but would soon fall quiet unless stimulated again. The problem I then faced was that the results were somewhat predictable and not very interesting. I tried larger and more complex networks of neural layers but soon ran up against machine limitations: each Tandem processor could support only 255 processes, and I quickly maxed out the systems at my disposal. Although Tandem later removed these limitations, I was no longer working on Tandems when the changes came about.
The other, more compelling issue I soon discovered was stimuli. I came to the realization that what I was creating was a single-perceptor system, so I began playing with multiple perceptors to give my simulated brain cross-referenced input for understanding its environment. I liken this to the wonderful story of Helen Keller and Teacher: when Teacher put Helen's hand in the water, and her other hand on her own throat as she said "water," everything clicked for Helen and she became aware. Up to this point, my system was deaf and blind; it had no cross-references from which to construct a model of its environment.
Unfortunately, by the time Tandem increased its processor capacity, I had moved on to PCs, and try as I might, I never had the same kind of power there as I did with Tandem interprocess communication.
Over the past couple of years, I have been developing many Internet applications, and I soon realized that Internet protocols and their ability to communicate with one another gave me unlimited potential to recreate the neural network I started in 1984. Advances in technology and the ability to use many different servers across the Internet give us the power to create an almost unbounded, parallel neural network that could be used to study how the brain works by simulating the neurons and neural pathways.
Originally, I created individual programs, each simulating a single neuron of a particular type. The programs communicated using the UDP/IP protocol, so I could transmit a value between neurons (programs) without requiring a response. Although this model worked well, it had two distinct flaws. First, it was forward-processing only, and we know the human cortex moves signals both forward and back. Second, and perhaps more importantly, I found that Microsoft Windows limits the number of concurrent process IDs that can run at the same time. I estimated that it would take hundreds or thousands of computers to make even the simplest brain.
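The fire-and-forget UDP exchange between neuron programs can be sketched like this. The one-signed-byte wire format and the helper names are my assumptions, not the original program's protocol:

```python
import socket

def make_neuron_socket(port, host="127.0.0.1"):
    """Each simulated neuron binds its own UDP port to listen on."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    return sock

def fire(sock, value, target):
    """Send a +1/-1 stimulus datagram to the target neuron's (host, port).

    UDP requires no response, so the sender never waits.
    """
    sock.sendto(value.to_bytes(1, "big", signed=True), target)

def receive(sock):
    """Block until a stimulus arrives; return its signed value."""
    data, _addr = sock.recvfrom(1)
    return int.from_bytes(data, "big", signed=True)
```

Because each neuron is its own process with its own port, the network topology is just a list of (host, port) targets per neuron, which is also why the per-machine process limit bites so quickly.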
Back to the drawing board again ...
Having developed many web services and having extensive MS SQL experience in my day job, I moved in a whole new direction, outlined in the Architectural pages of this website. In essence, I created an MS SQL brain map in which each row in the BrainMap table represents a single neuron, and a recursively calling web service is the engine that uses the SQL BrainMap to create the living brain. Input (sensory) apps call the web service, and the web service calls apps for output.
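The BrainMap idea can be sketched with an in-memory stand-in: each "row" maps a neuron to its threshold and targets, and a recursive routine plays the role of the recursive web service. The column names and three-neuron map are illustrative, not the actual MS SQL schema:

```python
# neuron_id -> {"threshold": n, "targets": [(neuron_id, weight), ...]}
BRAIN_MAP = {
    "S1": {"threshold": 1, "targets": [("I1", 1)]},   # sensory neuron
    "I1": {"threshold": 2, "targets": [("M1", 1)]},   # interneuron
    "M1": {"threshold": 1, "targets": []},            # motor (output) neuron
}

def propagate(state, neuron_id, value, fired=None):
    """Accumulate a stimulus; on reaching threshold, recurse into targets."""
    fired = [] if fired is None else fired
    row = BRAIN_MAP[neuron_id]
    state[neuron_id] = state.get(neuron_id, 0) + value
    if state[neuron_id] >= row["threshold"]:
        state[neuron_id] = 0
        fired.append(neuron_id)
        for target_id, weight in row["targets"]:
            propagate(state, target_id, weight, fired)
    return fired
```

A sensory app's call into the engine then looks like `propagate(state, "S1", 1)`: the first call fires only S1, while a second call pushes the interneuron over threshold and drives the motor neuron, the whole chain being read from the map rather than hard-wired into programs.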
It should be noted that I first tried an all-SQL solution using triggers, but this soon collapsed under the tremendous overhead of continually firing the triggers. The web service presents a similar challenge; I moved it to asynchronous calls so that no waiting is required, and even though this increases the speed tremendously, the delays are still unacceptable.
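The gain from asynchronous calls comes from fanning out all downstream calls at once instead of waiting on each in turn. A minimal sketch, where `stimulate` stands in for a web-service call and the 10 ms delay is an assumed per-call latency, not a measured figure:

```python
import asyncio

async def stimulate(target_id, value, delay=0.01):
    """Stand-in for one downstream web-service call."""
    await asyncio.sleep(delay)          # simulated network/processing delay
    return target_id, value

async def fire_async(targets, value):
    """Issue all target calls concurrently and gather the results."""
    calls = [stimulate(t, value) for t in targets]
    return await asyncio.gather(*calls)

# Three concurrent calls complete in roughly one call's latency,
# not three calls' worth.
results = asyncio.run(fire_async(["N1", "N2", "N3"], 1))
```

Even so, as noted above, per-call overhead still accumulates across a large network, which is what motivated revisiting one-process-per-neuron.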
I started with MS SQL 2000, moved to MS SQL 2005, then to MS SQL 2008, and am now using MS SQL 2012. Similarly, I started with Visual Studio 2003, went to 2005, then 2008, then 2010, and am now using Visual Studio 2012 as the tool set.
I am a founding member of the OpenWorm project, a sincere attempt to simulate the entire nematode C. elegans in a computer environment. My part of the project has been to define the connectome and neurophysiology. In these pages, I have taken the connectome, applied it to my MS SQL 2012 database, and written several programs, initially my recursive web service, adapted to the C. elegans connectome, not only to validate my methods but to validate the potential of a full simulation of the nematode. The web service, although a good choice that worked well, had significant processing delays. After a conversation with a colleague, I decided to revisit the concept of individual programs representing each neuron and, using 64-bit PC architecture, found this to be not only viable but very effective.
The work contained here is what I am currently working on so enjoy ...
Copyright © 1999 - 2013, InterIntelligence Research
Microsoft SQL 2000, 2005, 2008 and 2012 are products and registered trademarks of Microsoft.
Microsoft Visual Studio 2003, 2005, 2008 and 2010 are products and registered trademarks of Microsoft.