Constructor Theory

I'll be completely honest: I have a really difficult time grasping constructor theory. But I love the breadth of its conceptual uses, and I'll keep reading about it until I understand it better! Much of it reminds me of Turing's and Church's work. I wonder how constructor theory intersects with lambda calculus, too. Much to learn!

 
[Image: the front of a Bombe code-breaking machine at Bletchley Park]
 

The Church-Turing Thesis

Oh boy. I really need to read this but I'm definitely intimidated!

Exciting Papers

 

Sub-kBT micro-electromechanical irreversible logic gate

Here's a lovely paper that came across my email recently. From what I gathered, it uses a simple logic-gate experiment to show that physical irreversibility and logical irreversibility should be treated independently. They built a logic gate that performs a logically irreversible operation while dissipating an arbitrarily small amount of energy. In other words, the gate barely dissipates any heat at all when performing the computation, which makes it almost (but not quite) physically reversible, yet it is not reversible logically. This was all shown with an OR gate, then extended to a NOR gate, for which the average energy dissipation was ⟨Q⟩ = 0.31 ± 0.05 kBT. Naturally, these gates were combined to make a simple 1-bit adding machine.

... the OR gate, realized with a micro-electromechanical cantilever, can be operated with energy well below kBT, at room temperature, if the operation is performed slowly enough and friction losses are minimized. Thus, no fundamental energy limit need be associated with irreversible logic computation in general and physical irreversibility is not necessarily implied.
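
To pin down what "logically irreversible" means here: the OR gate merges three distinct input pairs into the output 1, so the output alone cannot tell you what the inputs were. A minimal sketch (plain Python; the only number taken from the paper is the ⟨Q⟩ quoted above) that enumerates the preimages and computes the Landauer bound kBT ln 2 for comparison:

```python
import math
from itertools import product

# Logical irreversibility: OR sends three distinct input pairs to the
# output 1, so the inputs cannot be recovered from the output alone.
preimages = {0: [], 1: []}
for a, b in product([0, 1], repeat=2):
    preimages[a | b].append((a, b))
print(preimages)  # {0: [(0, 0)], 1: [(0, 1), (1, 0), (1, 1)]}

# Landauer's bound: erasing one bit must dissipate at least kB*T*ln(2).
# ln(2) ≈ 0.693, so the paper's measured ⟨Q⟩ = 0.31 kBT for the NOR gate
# sits well below the bound naively associated with irreversible logic.
kB = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                    # room temperature, K
print(kB * T * math.log(2))  # ≈ 2.87e-21 J, i.e. 0.693 kBT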

Very thought-provoking results. I wonder what reversibility means in terms of logic. Physically, it means the amount of energy dissipated; logically, it means what you're able to know about the inputs. Although I'm not keen on believing the digital world and the physical world are really that different, it's interesting to think about the relationship between the two. How does knowledge about past and future states relate to the arrow of time? How does it relate to the second law? And, more importantly, how does this relate to living systems and how they compute?

Lingering questions: How much more slowly does the gate have to operate to dissipate less than kBT? Have these results been reproduced?

 

Dependence of dissipation on the initial distribution over states

"A new kind of second law"

This is an excellent analysis of the second law of thermodynamics in terms of initial states, rather than the processes that change one state into another. If I understand it correctly, this is an analysis of the initial states of computers. In other words, the paper focuses on non-equilibrium processes: a system is prepared in some initial condition, then time passes and it relaxes toward equilibrium. This is computation in one of its most basic forms: take a state and evolve it in time. If a system is not in equilibrium, you don't need to put work into it for it to evolve. Instead, it dissipates energy into the heat bath. Some of that energy cannot be retrieved, so the process is irreversible.
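
As a cartoon of that "prepare an initial state and let it relax" picture, here's a toy two-state Markov chain; the transition probabilities are made up purely for illustration:

```python
import numpy as np

# Hypothetical column-stochastic transition matrix: T[i, j] = P(j -> i) per step.
T = np.array([[0.9, 0.3],
              [0.1, 0.7]])

p = np.array([1.0, 0.0])  # an out-of-equilibrium initial condition
for _ in range(50):
    p = T @ p             # evolve the state in time
print(p)                  # relaxes toward the stationary distribution [0.75, 0.25]
```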

Here, they found that there is extra dissipation in a system if the distribution of initial states differs from the distribution that would give minimal dissipation. That makes sense on a first read. But think about it for a second: there is dissipation that depends on the initial states, NOT on the process that takes one state into another. Specifically,

We show that if P is run with any initial distribution r0, then the associated dissipation Wd(r0) equals Wd(q0) plus an extra cost. That extra cost is the drop in the value of the Kullback-Leibler (KL) divergence [27, 28] between r(x) and q(x) as the system evolves from t = 0 to t = 1, i.e., the KL divergence between r0 and q0 minus the KL divergence between r1 and q1.

But perhaps the most interesting part of this study is that none of it depends on the presence of a heat bath at all. Also, the extra work is 0 if the computation is completely reversible; otherwise it depends only on the distribution of the initial states. This reinforces the idea that computation relies on initial conditions just as much as on the actual process (the Hamiltonian). Another clue in the mysterious relationship between data and programs.
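
A sketch of that bookkeeping, under the assumption that the process P can be modeled as a stochastic map over a toy state space (all numbers here are invented for illustration):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return float(np.sum(p * np.log(p / q)))

# Hypothetical column-stochastic map standing in for the process P.
T = np.array([[0.9, 0.3],
              [0.1, 0.7]])

q0 = np.array([0.6, 0.4])  # assumed minimally-dissipative initial distribution
r0 = np.array([0.1, 0.9])  # the distribution the process is actually run with
q1, r1 = T @ q0, T @ r0    # both distributions after the process runs

# Extra cost = D(r0 || q0) - D(r1 || q1). It is nonnegative (KL divergence
# never increases under a stochastic map), zero when r0 = q0, and also zero
# when the map just permutes states, i.e. is logically reversible.
print(kl(r0, q0) - kl(r1, q1))
```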

 

Wave-Based Turing Machine: Time Reversal and Information Erasing

Droplets bouncing around on a liquid surface are apparently a great way to think about quantum mechanics, or at least about a system where a wave encodes information about a discrete object. There's a nice video showing how a particle bouncing on a liquid surface follows a chaotic trajectory; the probability distribution of the particle's position ends up looking something like the most unstable mode allowed by the boundaries confining it. But this paper shows that the droplet's trajectory can be completely reversed, even though it is chaotic. That's because the trajectory is entirely encoded in the resonating wave the droplet rides. In other words, the system remembers what the droplet has done, and that memory can be accessed and then played back in reverse.

A great physical (well, MORE physical than usual) example of what memory is.
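
As a loose analogy only (this is a generic time-reversible integrator, not the paper's hydrodynamic model): in reversible dynamics, flipping the velocity makes the system retrace its own history, which is the flavor of what the reversed wave field does to the droplet:

```python
import numpy as np

def leapfrog(x, v, steps, dt=1e-3):
    """Time-reversible (kick-drift-kick) integration of a pendulum, a = -sin(x)."""
    for _ in range(steps):
        v += 0.5 * dt * -np.sin(x)
        x += dt * v
        v += 0.5 * dt * -np.sin(x)
    return x, v

x0, v0 = 1.0, 0.0
x1, v1 = leapfrog(x0, v0, steps=10_000)   # run forward
xr, vr = leapfrog(x1, -v1, steps=10_000)  # flip the velocity and run again
print(xr - x0, vr + v0)                   # both ~0: the dynamics retrace their history
```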

S. Perrard, E. Fort, and Y. Couder, "Wave-Based Turing Machine: Time Reversal and Information Erasing," Phys. Rev. Lett. 117, 094502 (2016). http://dx.doi.org/10.1103/PhysRevLett.117.094502 (author page: https://blog.espci.fr/efort)