Overview
Many sectors depend heavily on large simulations of physical systems based primarily on traditional methods such as Molecular Dynamics and Density Functional Theory; these include pharmaceuticals, energy, and semiconductors. For example, during the COVID-19 pandemic, millions of Molecular Dynamics simulations were run, largely independently, on the ACE receptor and spike protein to better understand their binding mechanisms [3]. Currently, most of this information is dormant, redundant, and inconclusive: dormant because simulation data is analyzed for publications or industrial applications and then left on local storage; redundant because teams around the world often run highly similar simulations; and inconclusive because single simulations frequently lack enough information to support firm conclusions. Centralized infrastructures are thus quite limiting for developing AI-centric frameworks that improve the efficiency and accuracy of physics computation and knowledge extraction. And this is only the surface of the problem. The deeper problem is that there is no natural way to incorporate vast and diverse physics information (experiments, quarks, chemicals, proteins), data, knowledge, and algorithms in a cohesive and synergetic manner.
We narrowly missed out on funding our first proposal in Fund7's AI category. Since then, we have refactored the project and updated our plans.
Objectives and Goals
Our end goal is clear: to create the right infrastructure to incentivize mass adoption of Cardano-based protocols across the computationally oriented scientific communities, including academia, industry, start-ups, and individual community members.
We are creating a decentralized protocol for the simulation of physical systems, leveraging NuNet for computational resources and SingularityNET for AI enhancements, with open-ended improvements drawing on anything from deep learning [1] to neuro-symbolic AI [2], quantum chemistry [4], and cognitive architectures [5]. Additionally, we are building a tokenomics system to incentivize computation, data, algorithm development, mining, and community rewards for collaboration and support from individual community members, academics, and even corporations. One of our driving principles is coupling advancements in artificial intelligence to advancements in functional near-term technologies.
Our solutions will be useful in markets such as biotechnology, artificial intelligence, and chemical synthesis, among others. These are quickly growing markets, and bridging their demand into the Cardano ecosystem would be a major boost to its health. Take the biotechnology market alone: it is expected to surpass $1.5 trillion by 2030, growing at nearly ten percent per year [6].
The paradigm shift we are creating with SingularityNET and NuNet stems from a computational and algorithmic environment for end-to-end integration of multi-scale simulations: theoretical and AI algorithms built up from heterogeneous data sources, symbolic knowledge extraction, and cognitive principles, leading to the most interconnected framework for self-consistent computations in the physical sciences. The system is designed to mimic High Performance Computing infrastructures, and in principle, once NuNet is fully developed with a large enough ecosystem, we should be able to simulate molecular systems faster than many top supercomputers. All of our code will be developed for parallelized, multi-virtual-node CPUs/GPUs. By integrating AI, we should also be able to surpass many conventional bottlenecks of such computations.
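To give a concrete sense of the kind of workload this targets, the sketch below is a minimal Lennard-Jones molecular-dynamics step in Python/NumPy. It is purely illustrative (not the protocol's actual code): the O(N²) pairwise force loop is exactly the sort of computation that shards naturally across distributed CPU/GPU nodes, since each node can compute forces for a block of particle pairs independently.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces; O(N^2) and trivially shardable
    across nodes by splitting the pair matrix into blocks."""
    disp = pos[:, None, :] - pos[None, :, :]      # (N, N, 3) displacements r_i - r_j
    r2 = np.sum(disp**2, axis=-1)                 # squared pair distances
    np.fill_diagonal(r2, np.inf)                  # exclude self-interaction
    inv_r6 = (sigma**2 / r2) ** 3
    # F_ij = 24*eps*(2*(sigma/r)^12 - (sigma/r)^6) / r^2 * (r_i - r_j)
    coeff = 24.0 * eps * (2.0 * inv_r6**2 - inv_r6) / r2
    return np.sum(coeff[:, :, None] * disp, axis=1)

def velocity_verlet(pos, vel, dt=1e-3, steps=100, mass=1.0):
    """Integrate Newton's equations with the velocity Verlet scheme."""
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f / mass
        pos += dt * vel
        f = lj_forces(pos)
        vel += 0.5 * dt * f / mass
    return pos, vel
```

In a distributed setting, each force evaluation would be farmed out across the network and the results reduced before the integration step; the integrator itself is cheap and local.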
Industry and community
From an industry perspective, users (entities taking advantage of our computational protocol) can exchange tokens for theoretical computations of a particular system of study and/or for private or public algorithms developed by various entities (individuals, research labs, corporations, community members). From the community perspective, contributors are rewarded for data, computation, and algorithm development, among other things.
Rewards are mainly earned through the following activities: contributing physics data (experiments, simulation data, theory), providing computational resources and storage, developing algorithms (new algorithms, training neural networks, improving existing networks), mining, and technology development. The first two are straightforward. In short, mining is the eventually-automated process of performing specific computations, suggested by community members or recommended by an AI agent, which anyone can take part in by staking or allocating resources. In addition, entities that develop on the protocol (via any of the above, including mining) earn rewards through a predetermined ratio of the tokens paid by industrial entities, enforced by smart contracts.
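As a rough sketch of how such a predetermined split might be encoded, consider the Python fragment below. The contributor categories and ratios here are hypothetical placeholders, not the final tokenomics; in the real protocol the ratios would be fixed on-chain by smart contracts rather than in application code.

```python
from fractions import Fraction

# Hypothetical split of an industrial payment among contribution types.
# These ratios are illustrative only; the actual values would be set on-chain.
SPLIT = {
    "data": Fraction(25, 100),
    "compute": Fraction(40, 100),
    "algorithms": Fraction(20, 100),
    "mining": Fraction(10, 100),
    "treasury": Fraction(5, 100),
}
assert sum(SPLIT.values()) == 1  # ratios must cover the whole payment

def distribute(payment_tokens: int) -> dict:
    """Split a payment by the predetermined ratios, rounding each share
    down and sending any indivisible remainder to the treasury so that
    no tokens are created or destroyed."""
    payout = {k: int(payment_tokens * r) for k, r in SPLIT.items()}
    payout["treasury"] += payment_tokens - sum(payout.values())
    return payout
```

Exact `Fraction` arithmetic avoids floating-point rounding drift, and the remainder rule guarantees the payout always sums to the original payment.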
We are submitting under the miscellaneous challenge because no other category fits our proposal well. Semi-related categories include Developer Ecosystem, Open-Source Ecosystem, and Business Creation. While our end objective partially aligns with each of these, the main focus and output of the current proposal is a single phase of development: obtaining a foundational suite of algorithms and infrastructure from which to address further developments in later funding rounds (through Project Catalyst, SingularityNET's DeepFunding, or third-party funding). Since this proposal is foundational specifically to physics algorithms and their implementation on NuNet, we find it difficult to make a compelling argument in the other challenge settings. That said, we do have smaller proposals to begin outreach to industry and academic collaborators in parallel where possible.
The main risks are general technical research and development uncertainties and the complexity of the project. We are fairly confident the team can handle difficulties, though that may require additional time and work. We are also working with NuNet, and any delays on their side could be problematic in the near term, but these can be circumvented by focusing on what can be implemented directly today. They are a well-proven team; delays may happen, but they build great code.