Thread: Dllama - Local LLM Inference

  1. #1

    Dllama - Local LLM Inference

    A simple, easy-to-use library for doing local LLM inference directly from your favorite programming language.

    Download
    tinyBigGAMES/Dllama: Local LLM inference Library (github.com)

    Simple example
    Code:
    uses
      System.SysUtils,
      Dllama;
    
    
    begin
      // init
      if not Dllama_Init('config.json', nil) then
        Exit;
      try
        // add message
        Dllama_AddMessage(ROLE_SYSTEM, 'You are a helpful AI assistant');
        Dllama_AddMessage(ROLE_USER, 'What is AI?');
        
        // do inference
        if Dllama_Inference('phi3', 1024, nil, nil, nil) then
        begin
          // success
        end
        else
        begin
          // error
        end;
      finally
        Dllama_Quit();
      end;
    end.
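
    The call to Dllama_Init above reads its settings from a config.json file whose contents are not shown in this post. As a purely hypothetical sketch of what such a file might hold (every key name below is an assumption for illustration, not Dllama's documented schema), it could look something like:
    Code:
    {
      "model_path": "C:/LLM/models",
      "model": "phi3",
      "max_context": 1024
    }
    Check the GitHub repository linked above for the actual configuration keys the library expects.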

  2. #2
    I just tried this project of yours and was not impressed. While it seemed that some text was being generated by the AI, the output was an incoherent mess of random words. I don't have enough knowledge on this topic to even begin to guess what might be wrong.
