The brain operates asynchronously over continuous time. Since our artificial computers use global clocks and discrete time, a natural question is whether the continuity of time matters with respect to the results or efficiency of the computation.
Fortunately, it is easy to see that we can ignore most of the implications of continuous-time computation. Dynamical systems can be approximated arbitrarily well in discrete time by shrinking the timestep: such a system is governed by differential equations, whose solutions are locally linear, so a repeated tangent-line approximation (Euler's method) converges to the true trajectory.¹
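As a concrete sketch of this (the equation and step sizes are illustrative), forward-Euler integration of dx/dt = −x shows the tangent-line error shrinking as the timestep does:

```python
import math

def euler(f, x0, dt, t_final):
    """Integrate dx/dt = f(x) by repeated tangent-line steps of size dt."""
    x = x0
    for _ in range(round(t_final / dt)):
        x += dt * f(x)  # follow the local tangent for one timestep
    return x

# dx/dt = -x has the exact solution x0 * exp(-t).
f = lambda x: -x
exact = math.exp(-1.0)
coarse = euler(f, 1.0, 0.1, 1.0)    # 10 steps
fine = euler(f, 1.0, 0.001, 1.0)    # 1000 steps: much closer to exact
print(abs(coarse - exact), abs(fine - exact))
```

Shrinking the timestep by 100× here shrinks the error by roughly the same factor, which is the first-order convergence one expects from Euler's method.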
However, all physical systems are subject to noise from a variety of sources, so the state vector effectively has finite precision even in real life. This is like a solution “band” instead of a solution curve; the noise groups nearby solutions into equivalence classes. To reliably perform computation, the system must be robust to a certain level of noise, and entire solution bands must converge to the desired result. So long as our timestep is small enough that the approximation error at each step is less than the noise contribution, we can expect any sufficiently robust computational system to produce the desired result, whether through attractor dynamics or error-correction mechanisms.
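To illustrate the solution-band idea, consider a toy bistable system dx/dt = x − x³, which has stable attractors at ±1 (the system and noise level here are my own illustrative choices, not taken from the text). Integrating a band of nearby initial conditions with both discretization error and injected noise, every trajectory still settles into the same attractor:

```python
import random

def noisy_euler(drift, x0, dt, t_final, sigma, rng):
    """Euler-Maruyama: a deterministic drift step plus Gaussian noise each step."""
    x = x0
    for _ in range(round(t_final / dt)):
        x += dt * drift(x) + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
    return x

# dx/dt = x - x^3: stable attractors at +1 and -1, unstable point at 0.
drift = lambda x: x - x ** 3
rng = random.Random(0)  # fixed seed for reproducibility
finals = [noisy_euler(drift, x0, 0.01, 10.0, 0.05, rng)
          for x0 in (0.3, 0.5, 0.7) for _ in range(20)]
print(min(finals), max(finals))  # the whole band settles near +1
```

Because the attractor basin absorbs both the per-step tangent-line error and the injected noise, the computation (here, "decide which basin you started in") is insensitive to the details of the discretization.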
The presence of noise encourages us to believe that all physical computers must be robust to finite precision, and hence no more powerful than discrete-time, finite-precision simulations of themselves.
The question of efficiency remains: numerically approximating the trajectories of a high-dimensional dynamical system that computes the way the brain does may be intractable on modern hardware. Can we do much better by finding computational primitives in the brain that define a rule-based system independent of physical time, and then implementing those primitives directly in software? This question will guide future exploration.
1. Moore, Cristopher. “Finite-dimensional analog computers: Flows, maps, and recurrent neural networks.” 1st International Conference on Unconventional Models of Computation (UMC'98). Vol. 98. 1998.