The Silver Bullet presents an interesting alternative to the algorithmic, Universal Turing Machine approach to computing: behaving machines built from sensors and effectors linked to an environment. The proposal that Von Neumann machines emulate this parallel, synchronous model by simulating many simple, independent behavioural units is very close to the implementation of LSL.
Certainly virtual environments map onto this model well, but I've often thought that the Von Neumann architecture is ill-suited to distributed systems, which fundamentally process many concurrent requests. Despite that fundamental concurrency, these applications are often by necessity implemented using relatively small numbers of operating system threads processing events and storing partial results across asynchronous I/O calls. It's an approach which breaks conceptually atomic operations into a sequence of event handlers split on non-blocking I/O calls, and all too often it requires programmers to implement the stack tearing manually: saving all the stack data to a heap object which is kept until the I/O completes.
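To make the manual stack tearing concrete, here's a hedged sketch in Java. All the names (`PendingRequest`, `asyncRead`, the handlers) are illustrative, not a real API: a conceptually atomic "do some work, read, combine" operation is torn into two event handlers, with the locals that would have lived on the stack saved into a heap object that survives the non-blocking call.

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.IntConsumer;

public class StackTearing {
    // Heap object holding the "stack" state that must survive
    // across the asynchronous I/O boundary.
    static class PendingRequest {
        final String input;
        int partialResult;   // work computed before the I/O call
        PendingRequest(String input) { this.input = input; }
    }

    // Stand-in for a non-blocking I/O layer: completions are queued
    // and drained later by an event loop.
    static final Queue<Runnable> completions = new ArrayDeque<>();

    static void asyncRead(PendingRequest req, IntConsumer onComplete) {
        // Pretend the read completes later with the input's length.
        completions.add(() -> onComplete.accept(req.input.length()));
    }

    static int finished = -1;

    // Handler 1: start the operation, park partial state on the heap.
    static void handleRequest(String input) {
        PendingRequest req = new PendingRequest(input);
        req.partialResult = 10;  // the "first half" of the operation
        asyncRead(req, bytes -> handleReadComplete(req, bytes));
    }

    // Handler 2: resume from the heap object once the I/O completes.
    static void handleReadComplete(PendingRequest req, int bytes) {
        finished = req.partialResult + bytes;  // the "second half"
    }

    public static void main(String[] args) {
        handleRequest("hello");
        // A minimal event loop draining I/O completions.
        while (!completions.isEmpty()) completions.poll().run();
        System.out.println(finished);  // 10 + "hello".length() = 15
    }
}
```

Note how one logical operation ends up smeared across two methods and a state-carrying object; with more I/O calls, the number of handlers and the bookkeeping grow accordingly.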
This is exactly the job which is automated by microthreading systems such as JavaGoX, Brakes and the Mono microthreading support we've implemented for Second Life, so maybe those facilities will be more generally useful in the future.
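For contrast, here's a hedged sketch of what microthreading buys you, using an ordinary Java thread as a stand-in for a microthread (the real systems named above suspend and restore stacks far more cheaply). The point is the shape of the code: the same operation is written as straight-line, blocking-style code, the locals stay on the stack, and the runtime preserves them across the blocking call instead of the programmer copying them to a heap object.

```java
public class Microthreaded {
    // Stands in for a blocking I/O call; in a microthreaded system the
    // scheduler would capture and restore the stack around this point.
    static int slowRead(String input) throws InterruptedException {
        Thread.sleep(10);
        return input.length();
    }

    static volatile int result;

    public static void main(String[] args) throws InterruptedException {
        Thread micro = new Thread(() -> {
            try {
                int partial = 10;               // a plain local, no heap object
                int bytes = slowRead("hello");  // "blocks"; stack is preserved
                result = partial + bytes;       // resume with locals intact
            } catch (InterruptedException ignored) {
            }
        });
        micro.start();
        micro.join();
        System.out.println(result);  // 15, same answer as the torn version
    }
}
```

The logic is identical to the event-handler version, but it reads as one atomic operation because the stack tearing has been pushed down into the runtime.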