
How to Run LLM using Crynux Network

Running LLM tasks with various open-source models can be as simple as calling an OpenAI-compliant API via the Crynux Network. The examples below, in Python and JavaScript, demonstrate how to send an LLM chat completion task to the Crynux Network using the official OpenAI SDK:

from openai import OpenAI

client = OpenAI(
    base_url="https://bridge.crynux.ai/v1/llm",
    api_key="wo19nkaeWy4ly34iexE7DKtNIY6fZWErCAU8l--735U=", # For public demonstration only, strict rate limit applied.
    timeout=180,
    max_retries=1,
)

res = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B",
    messages=[
        {
            "role": "user",
            "content": "What is the capital of France?",
        },
    ],
    stream=False
)

print(res)

The same call in JavaScript:

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://bridge.crynux.ai/v1/llm",
  apiKey: "wo19nkaeWy4ly34iexE7DKtNIY6fZWErCAU8l--735U=", // For public demonstration only, strict rate limit applied.
  timeout: 180000,
  maxRetries: 1,
});

async function main() {
  try {
    const chatCompletion = await client.chat.completions.create({
      model: "Qwen/Qwen2.5-7B",
      messages: [
        {
          role: "user",
          content: "What is the capital of France?",
        },
      ],
      stream: false,
    });
    console.log("Chat completion response:", chatCompletion);
    
    return chatCompletion;
  } catch (error) {
    console.error("Error:", error);
  }
}

main();

This is the standard code for invoking OpenAI models through their API. The only modification is the base_url, which is changed from the OpenAI endpoint to the official Crynux Bridge.
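Because the endpoint is OpenAI-compliant, the SDK itself is optional: any plain HTTP client can send the same request. Below is a minimal sketch using only the Python standard library; the helper names (`build_chat_request`, `chat_once`) are illustrative and not part of any Crynux SDK.

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for an OpenAI-compliant /chat/completions endpoint."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode("utf-8")
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


def chat_once(base_url: str, api_key: str, model: str, prompt: str) -> dict:
    """Send the request and decode the JSON response."""
    req = build_chat_request(base_url, api_key, model, prompt)
    with urllib.request.urlopen(req, timeout=180) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Example usage (performs a real network call):
# res = chat_once("https://bridge.crynux.ai/v1/llm", "<your API key>",
#                 "Qwen/Qwen2.5-7B", "What is the capital of France?")
# print(res["choices"][0]["message"]["content"])
```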

The API Key in the example code is for public demonstration purposes only and has a strict rate limit, making it unsuitable for production environments. To use the Crynux Network in production, choose one of the following methods:

Method 1: Using the Official Crynux Bridge

You can request a separate API Key with a higher quota from the Crynux Discord server. Join the server and request new keys from an admin in the "applications" channel.

Method 2: Hosting Your Own Crynux Bridge

You can host your own instance of the Crynux Bridge to provide private APIs for your application. This approach gives you greater control over various system aspects, including reliability and speed-related configurations.
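From your application's perspective, a self-hosted bridge is a drop-in replacement: the client code stays the same and only the endpoint and key change. A minimal sketch follows; the localhost URL and the key below are placeholders, standing in for wherever your own instance is reachable and whatever key you configured for it.

```python
# Placeholders -- substitute the address of your own Crynux Bridge
# instance and an API key you configured for it.
SELF_HOSTED_BASE_URL = "http://localhost:8080/v1/llm"
SELF_HOSTED_API_KEY = "an-api-key-you-configured"

# The client is then created exactly as in the earlier examples:
# client = OpenAI(
#     base_url=SELF_HOSTED_BASE_URL,
#     api_key=SELF_HOSTED_API_KEY,
#     timeout=180,
# )
```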

Method 3: Sending Tasks Directly to the Blockchain

You can bypass the Crynux Bridge entirely and interact directly with the blockchain and the Crynux Relay to send tasks. Crynux SDKs are available in various languages and can be embedded directly into your code to run LLM tasks. Please consult the Crynux SDK documentation for detailed usage instructions.

The API provided by the official Crynux Bridge supports both the OpenAI-compliant /completions and /chat/completions endpoints. Features such as streaming, tool calling, and numerous other configuration options are also supported. For a comprehensive list of supported features, please refer to the Crynux Bridge documentation.
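Streaming, for instance, works through the standard SDK interface. The sketch below assumes a `client` configured with the Crynux base_url as in the examples at the top of this page; it consumes the response chunk by chunk and returns the assembled reply.

```python
def stream_chat(client, model: str, prompt: str) -> str:
    """Stream a chat completion and return the full reply text.

    `client` is an OpenAI client whose base_url points at a Crynux
    Bridge, created as in the earlier examples.
    """
    stream = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    reply = ""
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks (e.g. the final one) carry no content
            print(delta, end="", flush=True)
            reply += delta
    print()
    return reply


# Example usage (performs a real network call):
# answer = stream_chat(client, "Qwen/Qwen2.5-7B", "What is the capital of France?")
```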

Starting a Crynux Bridge is as straightforward as running a Docker container. The only additional requirement is a wallet funded with enough (test) CNX to cover the tasks you run on the network. At the moment, you can also get test CNX for free in the Crynux Discord server.

Crynux Bridge is fully open-sourced on GitHub. A step-by-step guide for starting a Crynux Bridge instance is available in the Crynux Bridge documentation.