Computation and Neural Systems

The Computation and Neural Systems (CNS) program was established at the California Institute of Technology in 1986 with the goal of training PhD students interested in exploring the relationship between the structure of neuron-like circuits/networks and the computations performed in such systems, whether natural or synthetic. The program was designed to foster the exchange of ideas and collaboration among engineers, neuroscientists, and theoreticians.

History


In the early 1980s, having laid out the foundations of VLSI,[1] Carver Mead became interested in exploring the similarities between computation in the brain and the kinds of computation that could be carried out in analog silicon electronic circuits. Mead joined with John Hopfield, who was studying the theoretical foundations of neural computation,[2] to pursue these questions together. Their first joint course in this area was entitled “Physics of Computation”, with Hopfield teaching his work on neural networks and Mead his work on replicating neuronal structures in highly integrated electronic circuits.[3] Given the interest among both students and faculty, they decided to expand on these themes the following year. Richard Feynman joined them, and three separate courses resulted: Hopfield's on neural networks, Mead's on neuromorphic analog circuits,[4] and Feynman's on the physics of computation.[3][5] At this point, Mead and Hopfield realized that a new field was emerging, with neuroscientists and the researchers building computational models and circuits all talking to one another.

In the fall of 1986, John Hopfield championed the formation of an interdisciplinary PhD program to build a scholarly community studying questions at the interface of neurobiology, electrical engineering, computer science, and physics. It was called Computation and Neural Systems (CNS). The unifying theme of the program was the relationship between the physical structure of a computational system (physical or biological hardware), the dynamics of its operation, and the computational problems that it can efficiently solve. The creation of this multidisciplinary program stemmed largely from progress on several previously unrelated fronts: the analysis of complex neural systems at both the single-cell and network levels[6] using a variety of techniques (in particular, patch-clamp recordings, intracellular and extracellular single- and multi-unit electrophysiology in the awake animal, and functional brain-imaging techniques such as functional magnetic resonance imaging (fMRI)), the theoretical analysis of nervous structures (computational neuroscience), and the modeling of artificial neural networks for engineering purposes.[2] The program started with a small number of existing faculty drawn from the various divisions. Among the early founding faculty were Carver Mead, John Hopfield, David Van Essen, Geoffrey Fox, James Bower, Mark Konishi, John Allman, Ed Posner and Demetri Psaltis. In that year, Christof Koch was hired as the program's first external professor.
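The associative memory described in Hopfield's 1982 paper[2] gives a minimal, concrete illustration of this unifying theme: a network whose connection weights (its "hardware") determine which patterns its dynamics settle into. The following Python sketch is illustrative only and is not drawn from the program's materials; the network size, the single stored pattern, and the corruption level are arbitrary assumptions.

    # Minimal Hopfield-network sketch (after Hopfield 1982, ref [2]).
    # All concrete values (64 units, one stored pattern, 16 flipped bits)
    # are illustrative assumptions, not taken from the CNS curriculum.
    import numpy as np

    rng = np.random.default_rng(0)

    n = 64
    pattern = rng.choice([-1, 1], size=n)     # one +/-1 memory to store

    # Hebbian outer-product rule fixes the "physical structure" (weights).
    W = np.outer(pattern, pattern).astype(float)
    np.fill_diagonal(W, 0.0)                  # no self-connections

    # Cue the network with a corrupted copy: flip a quarter of the bits.
    state = pattern.copy()
    state[rng.choice(n, size=n // 4, replace=False)] *= -1

    # Asynchronous dynamics: each unit aligns with its local field.
    for _ in range(5):
        for i in rng.permutation(n):
            state[i] = 1 if W[i] @ state >= 0 else -1

    print("stored pattern recovered:", np.array_equal(state, pattern))

Because each asynchronous update can only lower the network's energy, the state settles into a fixed point; with a single stored pattern, that fixed point is the memory itself whenever fewer than half of the bits in the cue are corrupted.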

Since 1990, about 110 graduate students have been awarded a PhD in CNS and 14 an MS in CNS. About two-thirds of CNS graduates have pursued an academic career, with the remainder founding or joining start-up companies. Over this time, the average duration of a CNS PhD has been 5.6 years.

During this time, the executive officers of the CNS Program were John Hopfield, Demetri Psaltis, Christof Koch, and Pietro Perona. The current executive officer is Thanos Siapas.[7]


CNS faculty founded or co-founded a number of conferences and workshops.


Notable alumni


References

  1. C. Mead and L. Conway, Introduction to VLSI Systems. Addison-Wesley, Reading, MA (1980)
  2. J.J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities." Proc. Natl. Acad. Sci. USA, vol. 79, pp. 2554–2558, April 1982
  3. Shirley K. Cohen, Interview with Carver Mead. Archives of the California Institute of Technology (PDF)
  4. C. Mead, Analog VLSI and Neural Systems. Addison-Wesley (1989)
  5. R.P. Feynman, Feynman Lectures on Computation. Tony Hey and Robin W. Allen, eds. Perseus Books Group (2000). ISBN 0738202967
  6. D.J. Felleman and D.C. Van Essen, "Distributed hierarchical processing in the primate cerebral cortex." Cerebral Cortex 1(1) (1991)
  7. "Contacts - Biology and Biological Engineering". Caltech. Archived from the original on 7 July 2024.
