kambo/llama-cpp-php Bug Fixes & Feature Extensions

Bug fixes, new features, and multi-environment deployment support, with fast turnaround for your development needs.

Email: yvsm@zunyunkeji.com | QQ: 316430983 | WeChat: yvsm316

kambo/llama-cpp-php

Latest version: v0.2.0-alpha

Composer install command:

composer require kambo/llama-cpp-php

Package Description

The package enables the use of the LLama C++ library in PHP, thereby allowing the setup and execution of LLM models in PHP on your local machine.

README


The package enables the use of the LLama C++ library in PHP, thereby allowing the setup and execution of LLM models in PHP on your local machine.

This is highly experimental and not suitable for production use!

Use at your own risk!

Only Linux is supported!

[asciicast: demo recording]

Installation

You can install the package via composer:

composer require kambo/llama-cpp-php kambo/llama-cpp-php-linux-lib

Note: the kambo/llama-cpp-php-linux-lib package contains a binary library for Linux.
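
The Linux binary library is what the PHP binding calls into at runtime; on typical setups this goes through PHP's FFI extension. As a quick pre-flight check (the FFI requirement is an assumption based on how such bindings usually work, not something stated in this README), you can verify the extension is loaded before going further:

// Hypothetical pre-flight check: assumes the binding loads the llama.cpp shared
// library via PHP's FFI extension (an assumption; not stated in this README).
if (!extension_loaded('ffi')) {
    fwrite(STDERR, 'The PHP FFI extension is not loaded; enable it before using kambo/llama-cpp-php.' . PHP_EOL);
    exit(1);
}
echo 'FFI extension is available.' . PHP_EOL;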

Usage

Get a model; for example, you can download one with this command:

wget https://huggingface.co/LLukas22/gpt4all-lora-quantized-ggjt/resolve/main/ggjt-model.bin

Then generate text from PHP:

// Class imports (namespaces assumed from the package layout; adjust if they differ in your version).
use Kambo\LLamaCPP\LLamaCPP;
use Kambo\LLamaCPP\Context;
use Kambo\LLamaCPP\Parameters\ModelParameters;
use Kambo\LLamaCPP\Parameters\GenerationParameters;

$template = "You are a programmer, write PHP class that will add two numbers and print the result. Stop at class end.";
// Point ModelParameters at the downloaded model file; adjust the path to where ggjt-model.bin was saved.
$context = Context::createWithParameter(new ModelParameters(__DIR__ . '/models/ggjt-model.bin'));
$llama = new LLamaCPP($context);
echo "Prompt: \033[0;32m" . $template . "\033[0m" . PHP_EOL;

// Tokens are streamed back one by one as they are generated.
foreach ($llama->generate($template, new GenerationParameters(predictLength: 200)) as $token) {
    echo $token;
}
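
The generate() call yields the completion token by token, so the loop above streams output as it is produced. If you would rather work with the whole completion at once, the generator can simply be buffered into a string; a minimal sketch reusing only the calls shown above:

// Collect the streamed tokens into a single string instead of printing them immediately.
$completion = '';
foreach ($llama->generate($template, new GenerationParameters(predictLength: 200)) as $token) {
    $completion .= $token;
}
echo $completion . PHP_EOL;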

License

The MIT License (MIT). Please see License File for more information.

Statistics

  • Total downloads: 312
  • Monthly downloads: 0
  • Daily downloads: 0
  • Favorites: 46
  • Hits: 0
  • Dependents: 1
  • Suggesters: 0

GitHub Information

  • Stars: 46
  • Watchers: 5
  • Forks: 4
  • Language: PHP

Other Information

  • License: MIT
  • Last updated: 2023-04-13