saqqal/llm-integration-bundle
Latest stable version: v1.0.0

Install via Composer:

```shell
composer require saqqal/llm-integration-bundle
```
Package overview

A Symfony bundle for integrating large language models (LLMs) into applications via API providers, with support for API Together, OpenAI, and more.
README
LLMIntegrationBundle is a powerful Symfony bundle that seamlessly integrates Large Language Models (LLMs) into your Symfony applications. With support for multiple AI providers and a flexible architecture, it's designed for easy extension and customization.
📚 Table of Contents
- Features
- Installation
- Configuration
- Usage
- Available AI Clients
- CLI Commands
- Extending the Bundle
- Exception Handling
- Testing
- License
- Author
- Contributing
- Documentation
- Acknowledgements
✨ Features
- 🌐 Support for multiple AI providers
- ⚙️ Flexible configuration
- 🛡️ Exception handling with custom exceptions
- 🖥️ CLI integration for generating new AI service classes
- 🧩 Extensible architecture
- 🧪 Comprehensive unit testing
📦 Installation
Install the bundle using Composer:
composer require saqqal/llm-integration-bundle
🛠️ Configuration
- Register the bundle in `config/bundles.php`:

```php
<?php

return [
    // ...
    Saqqal\LlmIntegrationBundle\LlmIntegrationBundle::class => ['all' => true],
];
```
- Create `config/packages/llm_integration.yaml`:

```yaml
llm_integration:
    llm_provider: 'api_together'
    llm_api_key: '%env(LLM_PROVIDER_API_KEY)%'
    llm_model: 'meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo'
```
- Set the API key in your `.env` file:

```dotenv
LLM_PROVIDER_API_KEY=your_api_key_here
```
🚀 Usage
Injecting the AI Service
Inject `AiServiceInterface` into your services or controllers:

```php
use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;

class YourService
{
    private AiServiceInterface $aiService;

    public function __construct(AiServiceInterface $aiService)
    {
        $this->aiService = $aiService;
    }

    // ...
}
```
Generating Responses
Use the `generate` method to send prompts and receive responses:

```php
public function generateResponse(string $prompt): string
{
    $response = $this->aiService->generate($prompt);

    return $response->getData()['content'];
}
```
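As a sketch, the service above can be wired into a Symfony controller. The controller class and the `prompt` request field are illustrative, not part of the bundle:

```php
use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;

class ChatController
{
    public function __construct(private readonly AiServiceInterface $aiService)
    {
    }

    public function chat(Request $request): JsonResponse
    {
        // Read the prompt from the request and delegate to the bundle's AI service
        $prompt = (string) $request->request->get('prompt', '');
        $response = $this->aiService->generate($prompt);

        return new JsonResponse(['content' => $response->getData()['content']]);
    }
}
```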
Changing Output Type
You can change the output type to `DynamicAiResponse` for more flexible access to API responses:

```php
public function generateDynamicResponse(string $prompt): mixed
{
    $response = $this->aiService->generate($prompt, [], true);

    return $response->choices[0]->message->content;
}
```
🤝 Available AI Clients
LLMIntegrationBundle supports the following AI clients:
- API Together (`ApiTogetherClient`)
- OpenAI (`OpenAiClient`)
- Anthropic (`AnthropicClient`)
- Arliai (`ArliaiClient`)
- Deepinfra (`DeepinfraClient`)
- Groq (`GroqClient`)
- HuggingFace (`HuggingFaceClient`)
- Mistral (`MistralClient`)
- OpenRouter (`OpenRouterClient`)
- Tavily (`TavilyClient`)
To use a specific client, set `llm_provider` in your configuration to the corresponding provider name.
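For example, switching from API Together to OpenAI only requires changing the configuration (the `OPENAI_API_KEY` variable name and the model name here are illustrative assumptions):

```yaml
llm_integration:
    llm_provider: 'openai'
    llm_api_key: '%env(OPENAI_API_KEY)%'
    llm_model: 'gpt-4o-mini'
```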
💻 CLI Commands
Generate a new AI service class:

```shell
php bin/console llm:create-ai-service
```
Follow the prompts to enter the provider name and API endpoint.
List available AI clients:

```shell
php bin/console llm:list-ai-services
```
This command lists all available AI clients tagged with the `#[AiClient]` attribute.
🔧 Extending the Bundle
To add a new AI provider:
- Create a new client class extending `AbstractAiClient`:

```php
use Saqqal\LlmIntegrationBundle\Attribute\AiClient;
use Saqqal\LlmIntegrationBundle\Client\AbstractAiClient;

#[AiClient('your_provider')]
class YourProviderClient extends AbstractAiClient
{
    protected function getApiUrl(): string
    {
        return 'https://api.yourprovider.com/v1/chat/completions';
    }

    protected function getAdditionalRequestData(string $prompt, ?string $model): array
    {
        return [
            // Add provider-specific options here
        ];
    }
}
```
- Update your configuration to use the new provider:

```yaml
llm_integration:
    llm_provider: 'your_provider'
    llm_api_key: '%env(YOUR_PROVIDER_API_KEY)%'
    llm_model: 'your-default-model'
```
🚦 Exception Handling
Create an event subscriber to handle `LlmIntegrationExceptionEvent`:

```php
use Saqqal\LlmIntegrationBundle\Event\LlmIntegrationExceptionEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class LlmIntegrationExceptionSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        return [
            LlmIntegrationExceptionEvent::class => 'onLlmIntegrationException',
        ];
    }

    public function onLlmIntegrationException(LlmIntegrationExceptionEvent $event): void
    {
        $exception = $event->getException();
        // Handle the exception
    }
}
```
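If your project uses Symfony's default `services.yaml` with `autoconfigure: true`, implementations of `EventSubscriberInterface` are registered automatically. Otherwise, the subscriber can be tagged explicitly; the namespace below assumes the class lives under `App\EventSubscriber`:

```yaml
services:
    App\EventSubscriber\LlmIntegrationExceptionSubscriber:
        tags: ['kernel.event_subscriber']
```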
🧪 Testing
Run the test suite:

```shell
./vendor/bin/phpunit
```
📄 License
This bundle is released under the MIT License. See the LICENSE file for details.
👨‍💻 Author
Abdelaziz Saqqal - LinkedIn - Portfolio
🤝 Contributing
Contributions are welcome! Please fork the repository and submit a pull request with your changes.
📚 Documentation
For more detailed documentation, please visit our Wiki.
Statistics
- Total downloads: 10
- Monthly downloads: 0
- Daily downloads: 0
- Favorites: 1
- Views: 2
- Dependent projects: 0
- Recommendations: 0
Other information
- License: MIT
- Last updated: 2024-09-22