
Thread: Dllama - Local LLM Inference

  1. #1
Quote Originally Posted by Jonax
I'm afraid I didn't even try the project. So much to do, so little time. Besides, I'm no GitHub user, so I can't really comment at all, other than the general point that it's good to see some Pascal activity. Keep up the good work.
Hi, thanks. You can download it directly without having an account.

  2. #2
Thanks for the link. I'm afraid I won't be able to try this anyway, despite the tempting foreign-language capabilities. But I have a couple of general questions.
Does this run on Embarcadero Delphi?
How much disk space is needed/recommended?

  3. #3
Quote Originally Posted by Jonax
Thanks for the link. I'm afraid I won't be able to try this anyway, despite the tempting foreign-language capabilities. But I have a couple of general questions.
Does this run on Embarcadero Delphi?
How much disk space is needed/recommended?
- Yes: Pascal (Delphi/FreePascal) and C/C++ (C++ Builder, Visual Studio 2020, Pelles C).
- ~5MB for the distro itself, and then you will need a model to use. The smallest is Phi3, which is ~2.5GB in size; most models that I can run in VRAM are 4-8GB in size.
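The size figures above lend themselves to a quick back-of-envelope check before downloading a model. Here is a minimal Python sketch (not part of Dllama; the `fits_in_vram` helper and the 1GB overhead allowance are assumptions for illustration) that estimates whether a model of a given size is likely to run fully in VRAM:

```python
def fits_in_vram(model_size_gb: float, vram_gb: float, overhead_gb: float = 1.0) -> bool:
    """Rough estimate: model weights plus a fixed allowance for the
    KV cache and activations must fit within the card's VRAM.
    The 1.0 GB overhead default is an assumed ballpark, not a Dllama figure."""
    return model_size_gb + overhead_gb <= vram_gb

# Figures from the post above: Phi3 is ~2.5GB; typical models are 4-8GB.
# The 8GB VRAM budget below is an assumed card size for illustration.
print(fits_in_vram(2.5, 8.0))  # → True: the smallest model fits comfortably
print(fits_in_vram(7.5, 8.0))  # → False: an 8GB-class model leaves no headroom
```

This is only a heuristic; actual memory use also depends on quantization and context length.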
