Unconventional Values
Today, a few words about unconventional values.
#1. Thermodynamic AI
We've already written about thermodynamic AI and the startup Normal Computing (see the post), but a fresh conversation has just been released between Diamandis and Guillaume Verdon, founder of another thermodynamic AI startup, Extropic, and originator of effective accelerationism (e/acc).
Lex Fridman also recently recorded a session with him:
Extropic describes their approach here (https://www.extropic.ai/future). It appears essentially similar to Normal Computing's but is implemented on different hardware: Normal Computing's SPUs use LC circuits, while Extropic relies on the Josephson effect in superconductors. For the mass market, Extropic aims to build something simpler on transistors that works at room temperature, though I haven't seen the details.
There's a good post "What’s the difference between Extropic, Normal Computing, and D-Wave?" trying to sort out the entire existing zoo.
#2. Optical Computing
Another interesting topic is optical computing. Quanta recently published a very brief overview of the subject. The startup Lightmatter, for instance, works in this field; its products include the programmable photonic interconnect Passage and the accelerator Envise, plus a DL framework called Idiom. I'm not sure how mature all of this is; it seems still far from the scale of modern hardware and the models trained on it, but the field is worth watching.
It feels like this is primarily an interconnect story. Google already uses optical circuit switches (OCS) instead of InfiniBand for TPUv4 pods (a more detailed article here: https://arxiv.org/abs/2304.01433). The Open Compute Project is also developing this direction, see #1:
and #2:
In DL, interconnect addresses the problem of underutilized hardware: many computations are actually communication- (or I/O-) bound. It's a long-standing, big topic; see also the roofline performance model. Faster memory helps a lot here too (as long as the data fits).
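The roofline model makes the compute- vs. memory-bound distinction concrete: attainable throughput is capped by either peak compute or memory bandwidth times arithmetic intensity, whichever is lower. A minimal sketch with illustrative numbers (not the spec of any particular chip):

```python
# Roofline model sketch: a kernel is memory-bound when its arithmetic
# intensity (FLOPs per byte moved) falls below peak_flops / peak_bw.
# All hardware numbers below are illustrative assumptions.

def roofline_attainable_flops(arith_intensity, peak_flops, peak_bw):
    """Attainable FLOP/s under the roofline model."""
    return min(peak_flops, arith_intensity * peak_bw)

# Illustrative accelerator: 100 TFLOP/s compute, 2 TB/s memory bandwidth.
peak_flops = 100e12
peak_bw = 2e12
ridge_point = peak_flops / peak_bw  # 50 FLOPs/byte

# Elementwise add of two fp32 vectors: 1 FLOP per 12 bytes moved
# (read 4+4 bytes, write 4) -> far below the ridge point, memory-bound.
ai_add = 1 / 12
print(roofline_attainable_flops(ai_add, peak_flops, peak_bw))
```

With these assumed numbers, the elementwise add reaches only a tiny fraction of peak compute, which is exactly why faster memory and interconnect matter so much.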
However, there's a whole ecosystem there, including, of course, matrix accelerators (https://www.nature.com/articles/s41566-024-01394-2, https://arxiv.org/abs/2309.10232, https://spie.org/news/matrix-multiplications-at-the-speed-of-light, https://www.nature.com/articles/s41377-022-00717-8).
#3. DNA Storage
Another interesting topic is DNA storage. Data storage needs are growing faster than our capabilities, and there are expectations that we will soon be submerged in this ocean of data. Moreover, current storage technologies are not particularly long-lasting, allowing storage only for decades, and only with periodic maintenance. I recalled a quote from Liu Cixin's "Death's End":
"We informed the government that with the current state of technology, it is impossible to preserve ten gigabytes of images and one gigabyte of text — the minimum requirements for the Museum — for a billion years. They didn't believe us. We had to provide proof. Then they agreed to lower the standard to one hundred million years."
Of course, DNA storage won't hold data for hundreds of millions of years, but it theoretically allows storage for far longer than mere decades.
In October 2020, Illumina, Microsoft, Twist Bioscience, and Western Digital founded the DNA Data Storage Alliance. The Alliance has an introductory publication "An introduction to DNA data storage" from 2021, and here's a recent popular review from IEEE Spectrum.
Progress in the field continues, notably with the enzyme terminal deoxynucleotidyl transferase (TdT), which can append new letters to the ends of single-stranded DNA.
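Independent of the synthesis chemistry, the data itself has to be mapped onto base sequences. A minimal sketch at the theoretical density of 2 bits per base (real codecs add constraints and error correction, e.g. avoiding long homopolymer runs; this toy version ignores all that):

```python
# Toy bytes <-> DNA codec at 2 bits per base: 00->A, 01->C, 10->G, 11->T.
# Purely illustrative; not any real DNA storage encoding scheme.

BASES = "ACGT"

def encode(data: bytes) -> str:
    """Map each byte to four bases, most significant 2-bit chunk first."""
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            out.append(BASES[(byte >> shift) & 0b11])
    return "".join(out)

def decode(seq: str) -> bytes:
    """Inverse of encode: four bases back into one byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for ch in seq[i:i + 4]:
            byte = (byte << 2) | BASES.index(ch)
        out.append(byte)
    return bytes(out)

print(encode(b"Hi"))  # -> 'CAGACGGC'
```

Practical schemes often deliberately drop to 1 bit per base or lower to satisfy biochemical constraints, which is the encoding assumed in the bandwidth figures below.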
To compete with the magnetic tapes used for archiving, one must be able to write at 2 Gbps, i.e., 2 billion bases per second (in an encoding scheme where one base encodes 1 bit, not the 2 bits theoretically possible). The current DNA synthesis market, as the author of the Spectrum article estimates, is equivalent to only 300 thousand bases per second. Quite far off, but progress in information storage is exponential, and in sequencing it's even faster than exponential. Synthesis isn't there yet, but it keeps improving. When we reach such bandwidth (equivalent to 20 human genomes per minute), the threat landscape will, of course, change no less dramatically.
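A quick back-of-envelope check of these figures (the genome equivalence works out if one counts a diploid human genome of roughly 6 billion bases, an assumption on my part):

```python
# Sanity-check of the bandwidth figures above; all values are rough.
target_bases_per_s = 2e9     # 2 Gbps at the assumed 1 bit per base
current_bases_per_s = 3e5    # ~300 thousand bases/s (market estimate)
diploid_genome_bases = 6e9   # ~6 billion bases, diploid human genome

gap = target_bases_per_s / current_bases_per_s
genomes_per_min = target_bases_per_s * 60 / diploid_genome_bases
print(f"gap: {gap:.0f}x, genomes/min: {genomes_per_min:.0f}")
# gap ~ 6667x; ~20 genomes per minute
```

So current synthesis capacity is nearly four orders of magnitude short of the tape-archival target.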
I wrote a review back in 2017 covering DNA storage and computing, as well as an exotic construct called the Nondeterministic Universal Turing Machine (NUTM).
Overall, interesting topics, stay tuned!