
Thread: Dllama - Local LLM Inference


    A simple, easy-to-use library for doing local LLM inference directly from your favorite programming language.

    Download
    tinyBigGAMES/Dllama: Local LLM inference Library (github.com)
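
    The example below loads its settings from a JSON config file passed to Dllama_Init. The exact schema is defined by the repository; the field names in this sketch (model path, GPU layer count, thread count) are assumptions for illustration only, so check the repo's sample config for the real keys.
    Code:
    {
      "model_path": "C:/LLM/gguf",
      "numGPULayers": -1,
      "numThreads": -1
    }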

    Simple example
    Code:
    uses
      System.SysUtils,
      Dllama;

    begin
      // initialize the library from a JSON config file
      if not Dllama_Init('config.json', nil) then
        Exit;
      try
        // queue up the conversation messages
        Dllama_AddMessage(ROLE_SYSTEM, 'You are a helpful AI assistant');
        Dllama_AddMessage(ROLE_USER, 'What is AI?');

        // run inference against the "phi3" model
        if Dllama_Inference('phi3', 1024, nil, nil, nil) then
        begin
          // success
        end
        else
        begin
          // error
        end;
      finally
        // release all library resources
        Dllama_Quit();
      end;
    end.
