
Can Crypto Find a Purpose? A Blockchain Approach to Optimizing Neural Networks


Authors: Igor Arsenin and Arturas Vaitaitis



As training ever-larger transformer-based models encounters diminishing returns, a novel blockchain protocol could advance AI by shifting the emphasis to optimizing neural network architectures, harnessing the decentralized computational power of blockchain networks. The protocol would replace the arbitrary hashing puzzles of proof-of-work with a useful task: improving the benchmark scores of AI models on standardized datasets, with architectures specified through interfaces such as the Open Neural Network Exchange (ONNX) format. The economic potential of blockchain technology could draw a diverse range of players into the field, sparking a competitive drive to develop more efficient and effective neural networks, potentially giving blockchain a purpose beyond digital currency while democratizing the field of AI.


The remarkable progress in large language models (LLMs) can be traced to two key developments: the discovery of the right neural network architecture, based on transformers and attention mechanisms, as outlined in the groundbreaking paper "Attention is All You Need" (2017)[^1^]; and the subsequent scaling of that architecture, training ever-larger models on ever-expanding datasets while keeping the network design fairly fixed.


A recent Nature feature, “In AI, is bigger always better?”[^2^], surveyed the scaling properties of LLMs. The breakthrough occurred once model size reached a few billion parameters, and the race for bigger models is ongoing. However, this approach faces constraints, as Sam Altman explained in a recent interview: “I think we’re at the end of the era where it’s gonna be these giant models, and we’ll make them better in other ways.”[^3^]


A significant constraint is memory, as all model parameters must be held in memory. While hardware will undoubtedly improve in line with Moore's Law, the largest models will remain accessible primarily to major corporations, creating a bottleneck. Furthermore, we are approaching the limits of available datasets as the internet is exhaustively mined for data. Although incorporating video could expand the quantity of data, its semantic value is in doubt. There are already indications that training larger transformer-based models on fixed datasets is yielding diminishing returns.


The human brain demonstrates that limited input from the sensory organs can be coupled with a larger, more complex neuronal architecture to produce advanced cognition. The brain has about 86 billion neurons and 100 trillion synapses, making it two orders of magnitude larger than GPT-4. The amount of written data a high-achieving human can consume over a lifetime is about 4 billion tokens (assuming a 600-words-per-minute reading speed and 4 hours of reading a day). The amount of visual information taken in over a lifespan is on the order of 1 trillion tokens, and only a fraction of it is probably of any use. The brain therefore fits the paradigm of a larger, more complicated model trained on a smaller dataset, which suggests that after size, the next low-hanging fruit for enhancing LLMs is optimizing the network itself.
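The reading estimate is easy to sanity-check. A back-of-envelope sketch in Python, where the reading speed, daily hours, reading years, and tokens-per-word ratio are all rough assumptions:

```python
# Back-of-envelope check of the lifetime reading figure above.
# All constants are rough assumptions, not measurements.

WORDS_PER_MINUTE = 600   # high-end reading speed
HOURS_PER_DAY = 4        # daily reading time
YEARS = 60               # reading years in a lifetime
TOKENS_PER_WORD = 1.3    # typical tokenizer ratio

words = WORDS_PER_MINUTE * 60 * HOURS_PER_DAY * 365 * YEARS
tokens = words * TOKENS_PER_WORD
print(f"lifetime reading: ~{tokens / 1e9:.1f} billion tokens")  # ~4.1 billion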


Some promising research on optimizing network architecture includes Neural Architecture Search (NAS) and AutoML, which employ machine learning algorithms to discover optimal network structures. However, they have yet to produce breakthroughs on the scale of transformers. Notably, the human brain resembles recurrent neural networks (RNNs) rather than transformers and is remarkably efficient to run, consuming under 100 watts of power, compared to roughly 40 kW for a small GPT-3 engine in continuous use.


Enter Bitcoin and Blockchain Technology


With a market cap of nearly $500 billion out of a total crypto market capitalization of roughly $1.2 trillion, Bitcoin still relies on proof-of-work validation. This immense computational power could potentially be harnessed to run algorithms, such as the evolutionary search sketched below, to develop improved neural networks.
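A minimal sketch of what such an evolutionary search could look like: miners mutate candidate architectures and keep the fittest. The configuration fields and the fitness stub here are illustrative placeholders, not a real benchmark.

```python
import random

def fitness(config):
    # Placeholder: in practice, train and evaluate the architecture on the
    # standardized dataset and return its benchmark score.
    return -abs(config["layers"] - 48) - abs(config["heads"] - 16)

def mutate(config):
    # Randomly perturb one architectural hyperparameter.
    child = dict(config)
    key = random.choice(list(child))
    child[key] = max(1, child[key] + random.choice([-2, -1, 1, 2]))
    return child

population = [{"layers": 12, "heads": 8} for _ in range(16)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:4]                       # keep the fittest
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(12)]    # refill via mutation

print("best architecture:", max(population, key=fitness))
```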


Blockchain technology has struggled to find a definitive purpose beyond its most successful use case: digital money. Other applications, such as Decentralized Autonomous Organizations (DAOs), have largely failed to take off. Even Bitcoin's role as a form of currency remains uncertain due to its persistent price volatility and lack of widespread adoption.


A novel crypto protocol, building on the cryptographic core of Bitcoin's blockchain, could potentially transform the field of AI by harnessing the power of decentralized computing to develop and optimize neural network architectures. This approach would repurpose the immense computational resources used by blockchain participants, known as miners, and direct them towards advancing large language models (LLMs).


At the heart of this proposal lies the proof-of-work concept, which currently requires miners to solve computationally costly puzzles: repeatedly evaluating the Secure Hash Algorithm 256-bit (SHA-256) until the output falls below a difficulty target. Instead of these arbitrary hashing puzzles, the new crypto protocol would focus on improving the benchmark scores of AI models on a medium-size standardized dataset, such as a 10-gigabyte text corpus.
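For concreteness, here is the kind of computation that would be replaced: a minimal Python sketch of the Bitcoin-style nonce search, where the header bytes and difficulty are toy values.

```python
import hashlib

def proof_of_work(header: bytes, difficulty_bits: int) -> int:
    """Search for a nonce whose SHA-256 digest falls below the target."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

nonce = proof_of_work(b"block header", difficulty_bits=20)
print("found nonce:", nonce)
```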


A standardized interface, such as the Open Neural Network Exchange (ONNX) format[^4^], could be employed to specify neural network architectures. These specifications would be easily verifiable and publicly visible, allowing both financial rewards and bragging rights to motivate miners to invest in the endeavor. Furthermore, those who discover valuable improvements to LLM architectures might opt to monetize their advances through copyright or licensing rather than merely posting them on the blockchain.
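One plausible submission workflow, sketched under assumptions: export a model to ONNX and fingerprint the file with a hash. The tiny PyTorch model and the file names are purely illustrative; a real submission would be a full LLM.

```python
import hashlib
import torch
import torch.nn as nn

# Illustrative stand-in for a real architecture.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 128))
dummy_input = torch.randn(1, 128)

# Serialize the architecture and weights to the ONNX format.
torch.onnx.export(model, dummy_input, "candidate.onnx")

# Fingerprint the file; this hash is what gets posted on-chain.
with open("candidate.onnx", "rb") as f:
    model_hash = hashlib.sha256(f.read()).hexdigest()
print("submission hash:", model_hash)
```

Publishing the hash first and the full ONNX file later (a commit-reveal pattern) would keep a submission verifiable without letting competitors copy it mid-round.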


Many technical details of this proposal will need to be worked out to ensure the blockchain is functional, secure, and provides the correct incentives. However, widely adopted LLM benchmarks, such as GLUE and SuperGLUE[^5^], are a good starting point for designing an automated model-verification system.


Rather than requiring complex held-out test sets to avoid the overfitting issue that plagues rapidly growing models, a simpler method would be to benchmark LLMs through in-sample testing, with a cap on model size (for instance, 10 billion parameters) to keep the scope of complexity feasible. A model that is small relative to the training corpus cannot simply memorize it, so in-sample accuracy still reflects genuine modeling ability.


In practice, LLMs would be submitted to the blockchain via a unique hash value. These models would then be compared on a randomly selected in-sample test. The verification step involves running each model on the chosen test set, a relatively inexpensive operation. The winner is the model with the best combination of accuracy and performance; it is granted the right to mint a blockchain coin and pen the next block in the chain.
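A minimal sketch of the selection step, under stated assumptions: each candidate was already committed on-chain by its hash, the test subset is drawn from a seed all validators share (for example, the previous block hash), and the winner maximizes a combined accuracy/performance score. The `evaluate` function is a hypothetical stand-in for an actual benchmark run.

```python
import random

def evaluate(model_path, test_indices):
    # Placeholder: load the ONNX model, run it on the selected examples,
    # and measure accuracy and latency.
    return 0.87, 42.0  # illustrative (accuracy, latency in ms)

def select_winner(candidates, shared_seed, test_set_size, sample_size):
    rng = random.Random(shared_seed)          # reproducible across validators
    test_indices = rng.sample(range(test_set_size), sample_size)

    def score(path):
        accuracy, latency_ms = evaluate(path, test_indices)
        return accuracy - 0.001 * latency_ms  # assumed trade-off weight

    return max(candidates, key=score)

winner = select_winner(["model_a.onnx", "model_b.onnx"],
                       shared_seed="previous-block-hash",
                       test_set_size=100_000, sample_size=1_000)
print("winning model:", winner)
```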


This benchmarking process can adapt over time. For example, the accuracy threshold can be raised or lowered to control the overall pace of the blockchain, an operation mimicking Bitcoin's mining-difficulty adjustment. The simpler the verification process, and the closer it is to existing standard protocols, the more robust, scalable, and ultimately successful the blockchain will become.
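A sketch of such an adjustment rule, mimicking Bitcoin's difficulty retargeting: if blocks (that is, model improvements) arrive too quickly, raise the bar; if too slowly, lower it. The target interval and step size are assumed values.

```python
TARGET_BLOCK_INTERVAL_HOURS = 24.0
ADJUSTMENT_STEP = 0.002  # accuracy points per retarget

def adjust_threshold(threshold, avg_interval_hours):
    """Retarget the required accuracy improvement based on recent block pace."""
    if avg_interval_hours < TARGET_BLOCK_INTERVAL_HOURS:
        return threshold + ADJUSTMENT_STEP           # improvements coming too easily
    return max(0.0, threshold - ADJUSTMENT_STEP)     # too hard; ease the requirement

print(adjust_threshold(0.900, avg_interval_hours=6.0))  # -> 0.902
```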


Although proof-of-work presents an appealing starting point for redirecting the computational power of blockchain towards optimizing neural networks, the proof-of-stake incentive structure of blockchains, such as Ethereum or Solana, could also be adopted. Most of the computation for training neural nets will occur off-chain, while the primary on-chain task will involve validation or benchmarking of models. The incentive for model validation can be structured similarly to proof-of-stake blockchains, with payments deriving from transaction fees and validation rewards. The primary incentive for model builders will remain the minting of cryptocurrency.


By harnessing the economic potential of blockchain technology, a diverse range of players could be attracted to the field, igniting a competition to develop more efficient and effective neural networks. In doing so, blockchain may finally discover a purpose beyond digital currency – propelling innovation and progress in the realm of artificial intelligence and large language models.


Footnotes: 

  1. https://arxiv.org/abs/1706.03762
  2. https://www.nature.com/articles/d41586-023-00641-w
  3. https://techcrunch.com/2023/04/14/sam-altman-size-of-llms-wont-matter-as-much-moving-forward/
  4. https://onnx.ai/
  5. https://gluebenchmark.com/ and https://super.gluebenchmark.com/



