Dune Analytics for Cardano: explore, visualize and discover trends in DeFi markets

Current Project Status: Not approved (unfunded)
Amount Received: ₳0
Amount Requested: ₳190,666
Percentage Received: 0.00%
Solution

A high-fidelity smart-contract analytics platform that brings real-time market data to the Cardano ecosystem. Be the first to discover new crypto trends and top-performing strategies.

Problem

[Image: Dune Analytics dashboard]

Cardano contains vast amounts of DeFi smart-contract data that isn’t easily accessible. As a result, valuable financial DeFi trends and insights are not being discovered and leveraged!




Please describe your proposed solution.

PART 1 (current proposal)

Blockchain analytics companies like Dune Analytics and Messari are massively popular on EVM blockchains and have provided huge benefits across various applications. They have empowered developers to build better dApps, traders to make better decisions, and researchers to discover new trends in DeFi.

Data Accessibility:

The Cardano blockchain contains vast amounts of data that is publicly available but cannot be easily accessed or analyzed. This project will abstract away that complexity and offer easy access to valuable smart contract data, significantly reducing the barrier to entry for users looking to explore DeFi data from top Cardano projects.

Scope of work:

  • REST API to access smart contract data feeds from top Cardano dApps
  • Minswap, Liqwid
  • 2–3 additional projects TBD (e.g. WingRiders, Indigo, VyFi)
  • High-fidelity financial data feeds designed for low latency, a necessity for DeFi applications involved in trading, derivatives, and risk management.
  • Enterprise-grade security, reliability and availability
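As an illustration of the intended developer experience, the data feed could be consumed through simple parameterized REST endpoints. A minimal sketch — the host, paths, and query parameters below are hypothetical placeholders, not a published API:

```python
from urllib.parse import urlencode

# Placeholder host; the real service URL is not part of this proposal text.
BASE = "https://api.example.com/v1"

def trades_url(dex: str, pair: str, limit: int = 100) -> str:
    """Build a query URL for a hypothetical DEX trade feed endpoint."""
    query = urlencode({"pair": pair, "limit": limit})
    return f"{BASE}/dex/{dex}/trades?{query}"

# A client would GET this URL and receive a JSON list of recent trades.
print(trades_url("minswap", "ADA/MIN"))
# → https://api.example.com/v1/dex/minswap/trades?pair=ADA%2FMIN&limit=100
```

The same pattern would extend to lending and NFT endpoints (e.g. a hypothetical `/lending/{protocol}/rates`), keeping the interface uniform across dApps.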

Data Analysis:

Even with access to raw blockchain data, analyzing specific smart contract interactions on DEXes and lending protocols can be quite challenging. This project will also process and aggregate contract data into dApp metrics and financial indicators, enabling more advanced blockchain analytics that are particularly valuable for researchers, traders, and developers looking to extract deep insights from Cardano’s DeFi ecosystem.

Scope of work:

  • REST API to access aggregated dApp metrics and financial indicators
  • DEXes - market price, trade volume, order book, pool liquidity, token TVL, and other token metrics
  • Lending protocols - borrowing/lending rates, loan status, liquidations, and other loan metrics
  • NFT marketplaces - floor price, trade volume, sale history, policy history, and other NFT metrics
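The kind of aggregation involved is straightforward once raw events are indexed: given a window of swap events, metrics like trade volume and a volume-weighted average price (VWAP) fall out directly. The field names and sample values below are illustrative assumptions, not the actual feed schema:

```python
# Sample raw trade events as a hypothetical indexer might emit them.
trades = [
    {"price": 0.30, "amount": 1000},
    {"price": 0.31, "amount": 500},
    {"price": 0.29, "amount": 2000},
]

# Aggregate into two of the DEX metrics named above.
volume = sum(t["amount"] for t in trades)
vwap = sum(t["price"] * t["amount"] for t in trades) / volume

print(volume, round(vwap, 4))
# → 3500 0.2957
```

Lending and NFT metrics (utilization rates, floor prices) would be computed the same way: fold a stream of typed contract events into a small set of named indicators.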

[Image: DEX metrics]

PART 2 (future proposal)

Data Visualization: Extracting insights from time series data can be difficult without the right tools and interface. This proposal will allow users to create custom dashboards with various visualizations, similar to Dune Analytics. Users will be able to sort, filter, and aggregate data to identify trends and gain valuable insights.


Knowledge Sharing: In many cases, users must start their analyses from scratch, even if others have conducted similar analyses in the past. This project will foster an open community where users can share their analyses and build upon the work of others. This saves time and promotes a collective understanding of the Cardano ecosystem.


How does your proposed solution address the challenge and what benefits will this bring to the Cardano ecosystem?

Some Cardano projects have already confirmed that they will use this solution to power key components in their dApps. These projects are:

  • Metera: Platform to create and invest in tokenized portfolios
  • Genius Yield: Next-generation DEX and yield optimizer

However, this proposal offers new possibilities for the community at large. It brings new capabilities to Scrolls, an open-source Cardano indexer, and integrates with Maestro’s dApp platform to offer a novel and powerful interface to:

1) Access complex DeFi data

2) Explore DeFi trends in real-time, and

3) Discover new valuable blockchain insights.

Whether you're a developer, researcher, or trader, this powerful toolset will democratize access to DeFi data and help further fuel adoption and innovation in the ecosystem.

How do you intend to measure the success of your project?

Measuring the success of this project will be straightforward. Data feed API metrics will be tracked, monitored, and shared publicly with the community to evaluate API usage growth, request volume, and endpoint distribution. The public report will include:

  • Daily & monthly request volume
  • Monthly active users and growth rate
  • Monthly endpoint usage distribution
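All three report metrics derive directly from an API request log. A minimal sketch, assuming a hypothetical log of (day, user, endpoint) records:

```python
from collections import Counter
from datetime import date

# Hypothetical request log rows: (day, user_id, endpoint).
log = [
    (date(2023, 7, 1), "u1", "/dex/trades"),
    (date(2023, 7, 1), "u2", "/dex/trades"),
    (date(2023, 7, 2), "u1", "/lending/rates"),
    (date(2023, 7, 2), "u1", "/dex/trades"),
]

monthly_requests = len(log)                                # request volume
monthly_active_users = len({user for _, user, _ in log})   # distinct callers
endpoint_distribution = Counter(ep for _, _, ep in log)    # usage per endpoint

print(monthly_requests, monthly_active_users, dict(endpoint_distribution))
# → 4 2 {'/dex/trades': 3, '/lending/rates': 1}
```

Growth rate then follows by comparing these counts month over month.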

Please describe your plans to share the outputs and results of your project?

The project’s outputs will be available to the community either as a self-hosted solution or as a hosted API service.

Self-hosted

A fork of Scrolls will be open-sourced and extended with new reducers that can parse and process smart contract market data.

Hosted API

For users who don’t want to run their own Scrolls infrastructure, Maestro will offer an end-to-end data feed API, fully managed by its internal indexer and data processing pipeline, to provide high-fidelity, high-performance smart contract data feeds. This will lower the barrier to entry and democratize DeFi metrics for everyone, empowering teams such as Metera and Genius Yield to build better dApps and offer innovative features.

What is your capability to deliver your project with high levels of trust and accountability?

Maestro has a proven track record of providing Cardano infrastructure services: Blockchain Indexer, Transaction Manager, and Turbo Transaction. In particular, Maestro’s indexer endpoints average 300,000 requests/day and power top projects and dApps on Cardano. A smart contract data feed API plays directly to the strengths of Maestro’s expertise and positions the project for a high level of success.

What are the main goals for the project and how will you validate if your approach is feasible?

Data Accessibility:

The Cardano blockchain contains vast amounts of data that is publicly available but cannot be easily accessed or analyzed. This project will abstract away that complexity and offer easy access to valuable smart contract data, significantly reducing the barrier to entry for users looking to explore DeFi data from top Cardano projects.

Scope of work:

  • REST API to access smart contract data feeds from top Cardano dApps (Minswap, WingRiders, Liqwid, Indigo, etc.)
  • High-fidelity financial data feeds designed for low latency, a necessity for DeFi applications involved in trading, derivatives, and risk management.
  • Enterprise-grade security, reliability and availability

Data Analysis:

Despite access to raw blockchain data, analyzing specific smart contract interactions on DEXes and lending protocols is challenging. This project will also process and aggregate contract data into dApp metrics and financial indicators, enabling more advanced blockchain analytics that are particularly valuable for researchers, traders, and developers looking to extract deep insights from Cardano’s DeFi ecosystem.

Scope of work:

  • REST API to access aggregated dapp metrics and financial indicators
  • DEXes - market price, trade volume, order book, pool liquidity, token TVL, and other token metrics
  • Lending protocols - borrowing/lending rates, loan status, liquidations, and other loan metrics
  • NFT marketplaces - floor price, trade volume, sale history, policy history, and other NFT metrics

Please provide a detailed breakdown of your project’s milestones and each of the main tasks or activities to reach the milestone plus the expected timeline for the delivery.

Milestone 1 - Implement data feed endpoint schema for raw metrics (non-aggregated)

Timeline: 2 months

Tasks:

  • Select the specific protocols that will be indexed (e.g. DEX, lending, NFT marketplace contracts)
  • Define endpoint schema for raw metrics
  • Implement REST API

Milestone 2 - Implement backend ETL pipeline architecture

Timeline: 2 months

Tasks:

  • Extract - extraction of raw blockchain data in real-time
  • Transform - event-driven data processing pipeline customized to each protocol
  • Load - scalable data storage system paired with fast and efficient querying interface
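The three stages above can be sketched as a small event-driven pipeline. Everything here — event shapes, protocol names, the in-memory store — is an illustrative assumption, not Maestro's actual implementation:

```python
def extract(events):
    """Extract: stream raw on-chain events (stubbed with an in-memory list)."""
    yield from events

def transform(event):
    """Transform: normalize a protocol-specific event into a common record."""
    if event.get("type") == "swap":
        return {"protocol": event["dex"], "metric": "trade", "value": event["amount"]}
    return None  # event types we don't index are dropped

def load(records, store):
    """Load: append processed records to a queryable store."""
    store.extend(r for r in records if r is not None)

store = []
raw = [{"type": "swap", "dex": "minswap", "amount": 42}, {"type": "deposit"}]
load((transform(e) for e in extract(raw)), store)

print(store)
# → [{'protocol': 'minswap', 'metric': 'trade', 'value': 42}]
```

In production the extract stage would follow the chain tip in real time and the load stage would target a scalable store, but the stage boundaries stay the same.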

Milestone 3 - Implement data feed endpoint schema for aggregated metrics

Timeline: 1 month

Tasks

  • Select the specific aggregated metrics that will be computed for the selected protocols (e.g. DEX, lending, NFT marketplace)
  • Define endpoint schema for aggregated metrics
  • Implement REST API

Milestone 4 - End-to-end integration and comprehensive testing

Timeline: 1 month

Tasks

  • Integrate backend pipeline with raw metrics endpoints
  • Integrate backend pipeline with aggregated metrics endpoints
  • Test data liveness and accuracy
  • Test latency in high load scenarios
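The latency test above can be sketched as repeated timed calls against an endpoint, checked against a percentile threshold. The handler below is a local stand-in for a real API call, and the 1-second budget is an illustrative assumption:

```python
import time

def handler():
    """Stand-in for an API request; a real test would issue an HTTP call."""
    return {"ok": True}

# Time 100 calls and compute the 95th-percentile latency.
latencies = []
for _ in range(100):
    t0 = time.perf_counter()
    handler()
    latencies.append(time.perf_counter() - t0)

p95 = sorted(latencies)[94]
print(p95 < 1.0)  # trivially true for a local stub
```

Under high load the same harness would run many such loops concurrently and compare p95/p99 against the service's latency targets.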

Please describe the deliverables, outputs and intended outcomes of each milestone.

Deliverable 1 - Implement data feed endpoint schema for raw metrics (non-aggregated)

  • Scrolls reducers for DEX contract
  • Scrolls reducers for Lending contract
  • Endpoint schema for raw contract metrics
  • Deploy contract data feed APIs

Deliverable 2 - Implement backend ETL pipeline architecture

  • Extraction - extraction pipeline of raw blockchain data in real-time
  • Transforming - event-driven data processing pipeline customized to each protocol
  • Loading - scalable data storage system paired with fast and efficient querying interface

Deliverable 3 - Implement data feed endpoint schema for aggregated metrics

  • Aggregation of indexed data into DeFi metrics (e.g. DEX, lending, NFT marketplace)
  • Endpoint schema for aggregated metrics
  • Deploy aggregated contract data feed APIs

Deliverable 4 - End-to-end integration and comprehensive testing

  • Integrate backend pipeline with raw metrics endpoints
  • Integrate backend pipeline with aggregated metrics endpoints
  • Test data liveness and accuracy
  • Test latency in high load scenarios

Please provide a detailed budget breakdown of the proposed work and resources.

Deliverable 1 - Implement data feed endpoint schema for raw metrics (non-aggregated)

$55/hr × 5 weeks × 40 hr/week = $11,000

Deliverable 2 - Implement backend ETL pipeline architecture

$55/hr × 9 weeks × 40 hr/week = $19,800

Deliverable 3 - Implement data feed endpoint schema for aggregated metrics

$55/hr × 6 weeks × 40 hr/week = $13,200

Deliverable 4 - End-to-end integration and comprehensive testing

$55/hr × 6 weeks × 40 hr/week = $13,200

TOTAL = $57,200

TOTAL (at $0.30/ADA) = 190,666 ADA

Who is in the project team and what are their roles?

Varderes Barsegyan - Engineering Manager, Software Architect and Go Developer - <https://www.linkedin.com/in/barsegyanvarderes>

Maestro CTO, Genius Yield TPM;

Varderes Barsegyan is an engineer with a diverse background in physics, computer science, aerospace engineering, bioinformatics, and blockchain technology. As the Co-founder and CTO of Maestro, a leading provider of blockchain infrastructure for Cardano, he is trailblazing the way for the financial operating system of the world.

Jamie Harper - Senior Rust Engineer - <https://www.linkedin.com/in/jamie-h-8bb539114/>

Maestro engineer, Cardano open source contributor;

James is a Rust developer with unique experience in the Cardano ecosystem. Over a period of three years, he has been involved in auditing the Cardano ledger and related codebases, earning recognition as a contributor in the Cardano Babbage era ledger specification.

His in-depth understanding of Cardano's internals, acquired through auditing, led James to begin developing his own software within the ecosystem; he is a top contributor to some of TxPipe's open-source projects, Scrolls and Pallas. He has also shared valuable insights through write-ups warning of common developer pitfalls within the Cardano ecosystem, contributing to improved security practices.

James later joined Maestro, where he plays a key role in designing, developing, and optimizing their developer platform and indexer, working with the rest of the team to ensure they meet the evolving needs of developers within the Cardano ecosystem.

Jeev B. - Senior DevOps Engineer - <https://www.linkedin.com/in/jeevb>

Maestro engineer, Union.AI staff engineer;

Jeev Balakrishnan is an accomplished staff software engineer with a proven track record of success. With his expertise and strong technical skills, he has played a crucial role in leading mission-critical projects at Union.AI, Freenome, and Maestro. Jeev's ability to deliver exceptional solutions, coupled with his innovative mindset, has made him a valuable contributor to the Cardano ecosystem.

Pedro Lucas - Technical Business Analyst - <https://www.linkedin.com/in/pedrohlucas>

Maestro Developer Experience, BizDev;

Pedro Lucas has over 20 years of experience in IT, working as a Technical Business Analyst on Business Process Management and decision-support DataViz solutions in finance and banking. He has been in crypto for three years and 100% dedicated to Cardano communities and technology for almost two. Pedro has helped in Gimbalabs, among other communities, created and ran 'Cardano for non-techs' workshop sessions, and now collaborates with Maestro, focusing on Developer Experience and Business Development.

How does the cost of the project represent value for money for the Cardano ecosystem?

The value to the Cardano ecosystem is potentially quite high. Analytics companies in the EVM ecosystem, such as Dune Analytics and Messari, are massively popular and have tens of thousands of users. Their platforms empower developers to build better dApps, traders to make better decisions, and researchers to discover new DeFi trends.

Maestro has carefully analyzed the costs associated with building and maintaining this service. This includes but is not limited to product development, engineering and marketing. The end result, however, is a sophisticated data aggregation and query system with economies of scale. Thousands of people in the ecosystem will be able to query mission critical data or extend the system to fit their particular needs.

The budget is at or below standard rates for these professional services in Europe and the USA.
