
takaaki-mizuno/llm-json-adapter

Latest stable version: 0.1.0

Composer install command:

composer require takaaki-mizuno/llm-json-adapter

Package overview

README

What is it?

When calling an LLM from your application, you often want the output as JSON. OpenAI's GPT API has a mechanism called Function Calling, which can return JSON, but Google's Gemini does not appear to offer equivalent functionality.

Therefore, I created a wrapper library that lets you switch between LLMs and still get results in JSON. This library can do the following:

  • Define the result you want to receive as a JSON Schema (a sample schema is sketched after this list)
  • Switch between LLM providers (the examples below cover OpenAI's GPT, Google's Gemini, AWS Bedrock, and Ollama)
  • Retry a specified number of times if JSON retrieval fails
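
The schema passed to the Response model in the examples below is a standard JSON Schema, written as a PHP array. The following is only a minimal sketch, assuming a hypothetical result that carries a title and a list of tags; the field names are illustrative and not part of the library's API.

// Hypothetical JSON Schema, expressed as a PHP array, describing the JSON
// structure we want the LLM to return. "title" and "tags" are made-up
// example fields, not anything defined by the library.
$schema = [
    "type" => "object",
    "properties" => [
        "title" => [
            "type" => "string",
            "description" => "A short title for the generated content",
        ],
        "tags" => [
            "type" => "array",
            "items" => ["type" => "string"],
            "description" => "Keywords describing the content",
        ],
    ],
    "required" => ["title", "tags"],
];

An array like this can be used wherever the examples below show the [JSON SCHEMA] placeholder.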

Installation

composer require takaaki-mizuno/llm-json-adapter

How to use

Use the following code to get the results in JSON.

OpenAI

// Configure the adapter: choose a provider, pass its credentials and
// attributes, and set how many times to retry if JSON retrieval fails.
$instance = new LLMJsonAdapter(
    providerName: "openai",
    attributes: [
        "api_key" => "[API-KEY]",
        "model" => "gpt-3.5-turbo",
    ],
    maximumRetryCount: 3,
    model: "gpt-3.5-turbo",
    defaultLanguage: "en"
);

// Describe the result you expect: a name, a description, and the JSON Schema
// the returned JSON should conform to.
$response = new \TakaakiMizuno\LLMJsonAdapter\Models\Response(
    name: "response data name",
    description: "response data description",
    schema: [JSON SCHEMA]
);

Google Gemini

$instance = new LLMJsonAdapter(
    providerName: "google",
    attributes: [
        "api_key" => "[API-KEY]",
        "model" => "gemini-1.5-pro-latest",
    ],
    maximumRetryCount: 3,
    defaultLanguage: "en"
);

$response = new \TakaakiMizuno\LLMJsonAdapter\Models\Response(
    name: "response data name",
    description: "response data description",
    schema: [JSON SCHEMA]
);

AWS Bedrock

$instance = new LLMJsonAdapter(
    providerName: "bedrock",
    attributes: [
        'accessKeyId' => '[ACCESS-KEY]',
        'secretAccessKey' => '[SECRET-KEY]',
        'model' => 'anthropic.claude-3-haiku-20240307-v1:0',
    ],
    maximumRetryCount: 3,
    defaultLanguage: "en"
);

$response = new \TakaakiMizuno\LLMJsonAdapter\Models\Response(
    name: "response data name",
    description: "response data description",
    schema: [JSON SCHEMA]
);

Ollama

$instance = new LLMJsonAdapter(
    providerName: "ollama",
    attributes: [
        'url' => "http://localhost:11434",
        'model' => 'llama3',
    ],
    maximumRetryCount: 3,
    defaultLanguage: "en"
);

$response = new \TakaakiMizuno\LLMJsonAdapter\Models\Response(
    name: "response data name",
    description: "response data description",
    schema: [JSON SCHEMA]
);
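
The examples above stop at constructing the adapter and the Response definition; the retrieval call itself is not shown in this README. Purely as a sketch of how the returned JSON might be handled in plain PHP, assuming the adapter eventually hands back a JSON string that should match the schema (the $json value below is a hard-coded stand-in, not output produced by the library):

// Stand-in for the JSON string the adapter is expected to return.
$json = '{"title": "Example", "tags": ["llm", "json"]}';

// Decode into a PHP array and fail loudly on malformed JSON (PHP 7.3+).
try {
    $data = json_decode($json, true, 512, JSON_THROW_ON_ERROR);
} catch (JsonException $e) {
    exit("Invalid JSON from LLM: " . $e->getMessage());
}

// Check that the fields marked "required" in the schema are present.
foreach (["title", "tags"] as $field) {
    if (!array_key_exists($field, $data)) {
        exit("Missing required field: " . $field);
    }
}

echo $data["title"], PHP_EOL;
echo implode(", ", $data["tags"]), PHP_EOL;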

Statistics

  • Total downloads: 4
  • Monthly downloads: 0
  • Daily downloads: 0
  • Favorites: 1
  • Views: 0
  • Dependent projects: 0
  • Suggesters: 0

GitHub information

  • Stars: 1
  • Watchers: 1
  • Forks: 0
  • Language: PHP

Other information

  • License: MIT
  • Last updated: 2024-01-28