Search keyword: local-llm, 5 results found
Sort order: by update time, descending

mahmoudnaggar/laravel-lmstudio

Advanced LM Studio integration for Laravel - Run local LLMs with OpenAI-compatible API, model management, streaming, embeddings, and more

Version: v1.0.0  Downloads: 4  Stars: 4  Clicks: 2

Updated: 2025-12-24 23:12
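
For context, LM Studio's local server speaks the OpenAI chat-completions protocol, so an integration like the one above wraps calls of the following shape. This is a minimal sketch using raw cURL rather than the package's own classes; the base URL reflects LM Studio's usual default port (1234), and the model identifier is a placeholder.

<?php
// Minimal sketch: call LM Studio's OpenAI-compatible endpoint directly.
// Assumes the local server runs on its usual default port (1234);
// the model identifier below is a placeholder.
$payload = json_encode([
    'model'    => 'your-local-model',
    'messages' => [
        ['role' => 'user', 'content' => 'Summarise what a local LLM is in one sentence.'],
    ],
    'temperature' => 0.7,
]);

$ch = curl_init('http://localhost:1234/v1/chat/completions');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_POSTFIELDS     => $payload,
]);

$response = curl_exec($ch);
curl_close($ch);

$data = json_decode($response, true);
echo $data['choices'][0]['message']['content'] ?? 'no response';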

pandarose/ollama-stream

Stream a real-time response from a local Ollama model

Version: unknown  Downloads: 4  Stars: 0  Clicks: 3

Updated: 2025-05-28 18:35
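
As a rough idea of what streaming from Ollama involves (independent of this package's own interface, which isn't shown here), Ollama's documented /api/generate endpoint emits newline-delimited JSON chunks when stream is enabled. The model name is a placeholder and the default host/port (localhost:11434) is assumed.

<?php
// Minimal streaming sketch against Ollama's documented HTTP API
// (not this package's own interface). Default host/port assumed.
$payload = json_encode([
    'model'  => 'llama3',          // placeholder model name
    'prompt' => 'Why run an LLM locally?',
    'stream' => true,              // Ollama streams newline-delimited JSON
]);

$ch = curl_init('http://localhost:11434/api/generate');
curl_setopt_array($ch, [
    CURLOPT_POST          => true,
    CURLOPT_HTTPHEADER    => ['Content-Type: application/json'],
    CURLOPT_POSTFIELDS    => $payload,
    // Called for every chunk as it arrives; print tokens immediately.
    // A robust client would buffer partial JSON lines split across chunks.
    CURLOPT_WRITEFUNCTION => function ($ch, $chunk) {
        foreach (array_filter(explode("\n", $chunk)) as $line) {
            $part = json_decode($line, true);
            echo $part['response'] ?? '';
        }
        return strlen($chunk);     // tell cURL the chunk was consumed
    },
]);

curl_exec($ch);
curl_close($ch);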

ardagnsrn/ollama-php

This is a PHP library for Ollama. Ollama is an open-source project that serves as a powerful and user-friendly platform for running LLMs on your local machine. It acts as a bridge between the complexities of LLM technology and the desire for an accessible and customizable AI experience.

Version: 1.0.5  Downloads: 35.98k  Stars: 196  Clicks: 2

Updated: 2024-08-14 12:46
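
The library's own classes aren't reproduced here; as a baseline, this is the kind of request such a client wraps, made against Ollama's documented /api/chat endpoint directly (default port 11434, placeholder model name).

<?php
// Baseline sketch of the HTTP call an Ollama client library wraps.
// Uses Ollama's documented /api/chat endpoint; model name is a placeholder.
$payload = json_encode([
    'model'    => 'llama3',
    'messages' => [
        ['role' => 'user', 'content' => 'List three benefits of running models locally.'],
    ],
    'stream'   => false,   // ask for a single JSON response instead of chunks
]);

$context = stream_context_create([
    'http' => [
        'method'  => 'POST',
        'header'  => "Content-Type: application/json\r\n",
        'content' => $payload,
    ],
]);

$raw  = file_get_contents('http://localhost:11434/api/chat', false, $context);
$data = json_decode($raw, true);
echo $data['message']['content'] ?? 'no response';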

akramzerarka/llama-cpp-php

The package enables the use of the LLama C++ library in PHP, thereby allowing the setup and execution of LLM models in PHP on your local machine.

Version: v0.3  Downloads: 16  Stars: 1  Clicks: 2

Updated: 2024-02-21 07:00

kambo/llama-cpp-php

The package enables the use of the LLama C++ library in PHP, thereby allowing the setup and execution of LLM models in PHP on your local machine.

Version: v0.2.0-alpha  Downloads: 312  Stars: 46  Clicks: 1

Updated: 2023-04-13 20:27
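
The two llama-cpp-php packages above appear to bind the llama.cpp library in-process, so no HTTP server is involved; their exact PHP APIs aren't reproduced here. As a rough, package-agnostic illustration of running a local GGUF model from PHP, the sketch below simply shells out to the upstream llama.cpp CLI. The binary name, flags, and model path are assumptions taken from upstream llama.cpp, not from either package.

<?php
// Package-agnostic sketch: run a prompt through the upstream llama.cpp CLI.
// Binary name ("llama-cli"), flags, and the model path are assumptions
// based on upstream llama.cpp, not on either PHP package above.
$model  = '/path/to/model.gguf';
$prompt = 'Explain quantization in one sentence.';

$cmd = sprintf(
    'llama-cli -m %s -p %s -n 64 2>/dev/null',
    escapeshellarg($model),
    escapeshellarg($prompt)
);

$output = shell_exec($cmd);
echo is_string($output) ? $output : 'model run failed';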