My most recent, complete, and readable work is available on the physics preprint site arXiv, under the title Elastic Nanocomputation in an Ideal Brain. That work supersedes what is written below, which I leave posted for reasons of personal history, and because I am unmotivated to keep this website fully synchronized with my research.
“We don't know what a neuron does, we don't know how neurons are connected, and we don't know how the connections change with learning. But most important is that we don't even know what the whole circuit is trying to do. Detect regularity? Detect change? Be stable? Be sensitive? Preserve information? Throw away information? Our best chance for understanding will come from lucky guesses about the general statistical structure of perceptual processing, and from practical, partial solutions which can be tweaked and "hacked" into ever-better systems which solve real-world problems.”
Unlike most neuroscientists, I've worked as an engineer and software architect, and thus have unusual and strong opinions about how best to approximate and build a model of the brain that really works and can do something useful. Below are outlines of my past research and links to some articles I've written: some are for laymen, some for specialists, and some are free-form discussions; some are published, some are not.
This subject is exciting because it strongly hints that real neurons are a hundred-fold better at computing and transmitting information than people have given them credit for. The old view is that each neuron is slow and noisy; my results show that neurons are capable of being both faster and more accurate, since the so-called "noise" is probably just high-density information we don't yet understand.
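To put a rough number on that gap, here is a back-of-envelope sketch (every number below is an illustrative assumption, not a measurement) comparing the information a spike train carries when only its count matters with what it could carry if spike timing is precise:

    import math

    # Back-of-envelope comparison with assumed, illustrative numbers.
    mean_rate = 10.0   # spikes per second (assumed typical rate)
    window = 0.100     # rate-code integration window, in seconds
    precision = 0.001  # assumed timing resolution of a pulse code, seconds

    # Rate code: the only message per window is the spike count. At 10 Hz
    # a 100 ms window holds ~1 spike; suppose counts 0-3 are distinguishable.
    rate_bits_per_s = math.log2(4) / window  # ~20 bits/s

    # Timing code: each interspike interval (mean 100 ms), resolved to 1 ms,
    # can take ~100 distinguishable values, i.e. log2(100) bits per spike.
    timing_bits_per_s = mean_rate * math.log2((1 / mean_rate) / precision)

    print(f"rate code:   ~{rate_bits_per_s:.0f} bits/s")    # ~20
    print(f"timing code: ~{timing_bits_per_s:.0f} bits/s")  # ~66
    # Finer timing precision, and coordination across many neurons, are
    # what would be needed to push this gap toward the hundred-fold figure.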
The following paper is my best summary of these claims. It starts with a bit of judo, taking someone else's refutation and showing that it actually supports my case, contrary to their claims. It then shows that realistic cells (with lots of thin dendrites) are even better at powerful, fast information-processing than the simple models we were arguing about. It closes by noting that a sacred cow of science, Occam's Razor (the scientific bias toward choosing simple explanations over complex ones), can lead us to systematically underestimate the complexity of powerful information-processing systems like brains: when we look at a complicated neuron, should we assume that it's as dumb and slow as our measurements have been, or should we assume that it's as complex and subtle as Nature might have evolved it to be?
Simple codes vs. efficient codes
Each of the following micro-essays discusses an underestimated issue in neural theory. They are meant to persuade and explain, not to survey previous work or create a scientifically unassailable edifice. Read them at your own risk.
Probabilities vs. Parameters
An analog signal can represent the probability of something being there (e.g. likely vs. unlikely), or a parameter of it (e.g. angle of tilt). What is the difference between these two approaches? What are the advantages and disadvantages of each?
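As a toy illustration of the ambiguity (the values and names below are invented for this sketch, not taken from the essay), the same analog value reads very differently under the two conventions:

    signal = 0.7  # one analog value, e.g. a firing rate normalized to [0, 1]

    # Reading 1: the value is the PROBABILITY that a feature is present.
    p_edge = signal            # "an edge is probably there" (70% belief)

    # Reading 2: the value is a PARAMETER of a feature assumed present.
    tilt_deg = signal * 180.0  # "the edge is tilted 126 degrees"

    # The ambiguity: a weak signal could mean "probably nothing there"
    # under reading 1, or "definitely there, with a small tilt" under
    # reading 2; the downstream circuit must know which was meant.
    print(f"as probability: P(edge) = {p_edge:.2f}")
    print(f"as parameter:   tilt = {tilt_deg:.0f} degrees")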
Types of invariance
An "invariance"
is some representation of a signal which doesn't change even when the
signal does change... for example, you know a floor stays fixed even
as you move your head and walk around. Here is a catalog of several
types of invariance we expect to find in perception, and some of the
tricks the brain may use for discovering them.
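One standard trick of this kind, chosen here purely for illustration (the essay's own catalog is broader): the magnitude of a Fourier transform ignores where a pattern sits, making it invariant to translation.

    import numpy as np

    # The Fourier magnitude spectrum is unchanged by (circular) translation,
    # so it is a shift-invariant representation of a signal.
    rng = np.random.default_rng(0)
    signal = rng.standard_normal(64)
    shifted = np.roll(signal, 17)  # the same pattern, moved in space

    rep_original = np.abs(np.fft.fft(signal))
    rep_shifted = np.abs(np.fft.fft(shifted))

    print(np.allclose(rep_original, rep_shifted))  # True: unchanged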
Energy use by brains
Our opinions about what neurons do are based almost exclusively on measurements of neurons which fire at least 10 spikes per second, which may be ten- or even a hundred-fold faster than typical neurons during normal perception. Why are realistic spike rates so slow? What are the implications? Why are reports biased towards unrealistically fast-firing neurons?
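A minimal simulation of that reporting bias (the rate distribution and detection threshold below are assumptions, not data): if only fast-firing cells get noticed and isolated, the recorded average exceeds the true average many-fold.

    import numpy as np

    # Assumed lognormal distribution of true firing rates in a population.
    rng = np.random.default_rng(0)
    true_rates = rng.lognormal(mean=-1.0, sigma=1.0, size=100_000)  # spikes/s

    # Assume an experimenter only isolates cells firing >= 10 spikes/s.
    detected = true_rates[true_rates >= 10.0]

    print(f"true mean rate:     {true_rates.mean():.2f} spikes/s")  # ~0.6
    print(f"recorded mean rate: {detected.mean():.2f} spikes/s")    # >10
    print(f"fraction detected:  {len(detected) / len(true_rates):.5f}")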
'Belief particles' have the binding problem
The "binding problem" is a well-known challenge for
representations which break objects apart into complementary features
(like shape and position). I argue that it is also a problem for
probablistic representations which break objects apart into competing
hypotheses (beliefs). Since real brains probably face both problems,
its circuitry may solve both problems at once.
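Here is a toy version of the feature-based half of the problem (the encoding below is invented for illustration): detectors that report features independently cannot say which feature belonged to which object.

    # Two scenes made of (color, shape) objects, with the pairings swapped.
    scene_a = [("red", "square"), ("green", "circle")]
    scene_b = [("red", "circle"), ("green", "square")]

    def feature_bag(scene):
        """Independent detectors: each reports only that its feature occurred."""
        colors = {color for color, _ in scene}
        shapes = {shape for _, shape in scene}
        return colors, shapes

    # Both scenes light up exactly the same detectors, so the pairing
    # (which color went with which shape) is lost: the binding problem.
    print(feature_bag(scene_a) == feature_bag(scene_b))  # True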
Bill spent a year in West Africa imagining neural circuits and re-inventing the Hebb Rule before being exposed to real neurobiology and computational theory at Caltech. His most impactful research has been in three fields: 1) the paradox that irregular cortical spiking probably represents an information-dense, precise-time pulse code, 2) the electro-chemical properties of neocortical neurons which make a precise-time code both possible and desirable, and 3) an interpretation of cortex (and thalamo-cortical feedback) as a system which learns to predict its own sensory input in real time. His recent focus has been on designing software APIs which isolate the scaling and hierarchy issues of cortical-style signal processing from the modules which perform feature extraction/compression/reconstruction/prediction on smaller groups of signals.
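A hypothetical sketch of that separation (every class and method name below is invented; the actual APIs are not published here): a hierarchy layer handles grouping and stacking, while each module sees only one small signal group through a narrow interface.

    import numpy as np

    class Module:
        """Local processor for a small group of signals: a stand-in for
        feature extraction/compression/reconstruction/prediction."""
        def __init__(self, n_in: int, n_out: int, rng: np.random.Generator):
            self.weights = 0.1 * rng.standard_normal((n_out, n_in))

        def encode(self, x: np.ndarray) -> np.ndarray:
            return np.tanh(self.weights @ x)

    class Hierarchy:
        """Scaling layer: stacks modules without knowing what they compute."""
        def __init__(self, widths: list[int], seed: int = 0):
            rng = np.random.default_rng(seed)
            self.modules = [Module(a, b, rng)
                            for a, b in zip(widths, widths[1:])]

        def run(self, x: np.ndarray) -> np.ndarray:
            for module in self.modules:
                x = module.encode(x)
            return x

    # 64 raw signals compressed through two stages down to 4 features.
    print(Hierarchy([64, 16, 4]).run(np.ones(64)).shape)  # (4,)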
Bill has taught physics in West Africa and "Physics for Poets" in
junior college, and has Physics degrees from Haverford College (BS)
and Caltech (PhD), where he worked in the Computation and Neural
Systems Program under Christof Koch. He was a postdoctoral fellow
at the Mathematics Research Group at the National Institutes of
Health, and has worked at half a dozen Silicon Valley companies
as scientist, engineer, and software architect, in many fields:
statistical algorithms, biophysics, software tools, web applications,
and finance.
Publication list