Wolfram: AI - LLM Distributed Inference Services
Current Project Status: Complete
Amount Received: ₳100,000
Amount Requested: ₳100,000
Percentage Received: 100.00%
Solution

Development and prototyping of a distributed LLM inference service. The modular system will support a range of models.
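
The proposal does not spell out the architecture, but as a rough, hypothetical illustration of what a "modular, distributed inference" layer can look like, the sketch below registers several worker nodes, each advertising the models it serves, and routes a prompt to one of them. All node URLs, model names, and classes here are invented for illustration and are not the funded implementation.

```python
# Minimal sketch (assumed design): route an inference request to one of
# several distributed worker nodes that host the requested model.
from dataclasses import dataclass, field
from typing import List
import random


@dataclass
class WorkerNode:
    """A remote node advertising which models it can serve."""
    url: str
    models: List[str]


@dataclass
class InferenceRouter:
    """Keeps a registry of nodes and picks one that hosts the model."""
    nodes: List[WorkerNode] = field(default_factory=list)

    def register(self, node: WorkerNode) -> None:
        self.nodes.append(node)

    def route(self, model: str, prompt: str) -> str:
        candidates = [n for n in self.nodes if model in n.models]
        if not candidates:
            raise ValueError(f"no node serves model {model!r}")
        node = random.choice(candidates)  # naive load spreading
        # A real service would make an HTTP/gRPC call to node.url here.
        return f"[{node.url}] {model}: (response to {prompt!r})"


router = InferenceRouter()
router.register(WorkerNode("http://node-a.example", ["llama-3-8b", "mistral-7b"]))
router.register(WorkerNode("http://node-b.example", ["llama-3-8b"]))
print(router.route("mistral-7b", "Hello from Cardano"))
```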

Problem

Building and operating LLM inference infrastructure is costly and centralized, placing control in the hands of major corporations and leaving it inaccessible to much of the global population.

  • Impact Alignment
  • Feasibility
  • Value for money

Wolfram Blockchain Labs (3 members)

Playlist

  • EP2: epoch_length (3m 24s), authored by Darlington Kofa
  • EP1: 'd' parameter (4m 3s), authored by Darlington Kofa
  • EP3: key_deposit (3m 48s), authored by Darlington Kofa
  • EP4: epoch_no (2m 16s), authored by Darlington Kofa
  • EP5: max_block_size (3m 14s), authored by Darlington Kofa
  • EP6: pool_deposit (3m 19s), authored by Darlington Kofa
  • EP7: max_tx_size (4m 59s), authored by Darlington Kofa