A simple, easy-to-use library for doing local LLM inference directly from your favorite programming language.
Download
tinyBigGAMES/Dllama: Local LLM inference Library (github.com)
Simple example
Code:
uses
  System.SysUtils,
  Dllama;

begin
  // initialize the library from its config file
  if not Dllama_Init('config.json', nil) then
    Exit;
  try
    // add system and user messages
    Dllama_AddMessage(ROLE_SYSTEM, 'You are a helpful AI assistant');
    Dllama_AddMessage(ROLE_USER, 'What is AI?');

    // do inference
    if Dllama_Inference('phi3', 1024, nil, nil, nil) then
    begin
      // success
    end
  else
    begin
      // error
    end;
  finally
    // shut the library down
    Dllama_Quit();
  end;
end.