noah-medra/prompt-builder
Latest stable version: 1.0.2
Composer install command:
composer require noah-medra/prompt-builder
Package description
Laravel package to build dynamic AI prompts
README
PromptBuilder is a Laravel package that allows you to create and execute AI prompts in a flexible and structured way. This package enables you to generate dynamic queries, customize parameters, and manage conversation history to improve the relevance of responses.
Installation
Composer
To install PromptBuilder via Composer, run the following command:
composer require noah-medra/prompt-builder
Usage
Creating a PromptBuilder object
Start by instantiating a PromptBuilder object:
use NoahMedra\PromptBuilder\PromptBuilder;

$builder = PromptBuilder::make();
Defining the Driver (Processing Engine)
By default, PromptBuilder uses the OllamaDriver. If you want to use another driver, you can specify it using the driver() method.
use NoahMedra\PromptBuilder\Drivers\HuggingFaceDriver;

$builder->driver(HuggingFaceDriver::class);
Adding instructions (with Sub-Instructions and Conditions)
Instructions are rules or constraints you want to apply to the AI's generated response. Instructions can have sub-instructions or depend on conditions: you can chain as many instructions as you need and even nest conditions.
Here is an example where instructions can contain sub-instructions, and conditional logic can be applied using the when() method.
Example with Instructions and Sub-Instructions:
$builder->instruction("### Financial History") ->instruction("We will also review your financial history to identify trends.") ->when( true, // This condition is true, so the sub-instructions will be applied function($builder) { $builder->instruction("Here is a summary of your financial history for the last three months."); ->when(false, fn($ist) => $ist->add('It seems there are no significant changes in your financial behavior this month. We’ll continue to monitor this trend for future insights.')) $builder->instruction("Your balance has fluctuated between X and Y. It seems like you had some unexpected expenses."); } );
In this example:
- Main instructions: The first two instructions introduce the financial history topic and explain the review process.
- Conditional instructions: The when() method checks a condition (in this case, true) and, if it passes, adds additional sub-instructions (e.g., a summary of the financial history).
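Because the when() callback receives the builder again, conditions can themselves contain further when() calls. A minimal sketch of nesting, using placeholder boolean flags in place of real checks:

$hasTransactions = true;  // placeholder condition
$hasSavings = false;      // placeholder condition

$builder->when($hasTransactions, function ($builder) use ($hasSavings) {
    $builder->instruction("Summarize the user's recent transactions.")
        ->when($hasSavings, function ($builder) {
            $builder->instruction("Also comment on the savings balance.");
        });
});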
Adding dynamic parameters
You can pass custom parameters to your prompt with the withParams() method:
$builder->withParams(['key' => 'value']);
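Because the bundled OllamaDriver spreads getParams() into the body of its HTTP request (see the driver example below), parameters can also carry engine options alongside the prompt. A hedged sketch, assuming the target API understands an Ollama-style options field:

// These keys are forwarded verbatim to the driver's request payload,
// so any field the backing API accepts can be supplied here.
$builder->withParams([
    'options' => ['temperature' => 0.2], // assumption: Ollama chat option
]);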
Defining a context
You can add additional context to guide the AI in generating the response.
$builder->context("You are a virtual assistant with expertise in financial analysis.");
Asking a question
You can ask a question that will be included in the final prompt.
$builder->ask("What are the main trends in the past three months of my financial data?");
Managing history
If you want the AI to use previous conversation history to provide more contextual responses, enable the useHistory() option.
$builder->useHistory(true);
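How the history is stored is not documented here; the sketch below only illustrates the intent, assuming the builder can be reused for a follow-up question and that each process() call adds the exchange to the stored history:

$builder->useHistory(true)
    ->ask("What are the main trends in the past three months of my financial data?");
$builder->process();

// Assumption: with history enabled, a follow-up question can rely on the
// previous exchange for context instead of restating it.
$builder->ask("And how does that compare to the previous quarter?");
$builder->process();

echo $builder->getOutput()->get('message.content');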
Generating and retrieving the response
Once all instructions and parameters are defined, call process() to run the prompt, then retrieve the AI's response with the getOutput() method:
$builder->process();

$output = $builder->getOutput();

echo $output->get('message.content'); // Display the generated response
echo $output->get('model');           // Display the model that produced it
Handling JSON responses
If you want the AI's response to be formatted as JSON, use the expectResponseFormat() method.
$builder->expectResponseFormat('{"resume": "Summary of the response", "response": "Your response here"}');
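Since the expected format is passed as a JSON template, the response body can then be decoded on the PHP side. A sketch, assuming the model actually honors the requested structure (it may not, hence the guard):

$builder->process();

$json = json_decode($builder->getOutput()->get('message.content'), true);

if (is_array($json)) {
    echo $json['resume'] ?? '';    // short summary, if the model followed the template
    echo $json['response'] ?? '';  // full answer
}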
Driver: How It Works (Input/Output)
The Driver is responsible for handling the input (your prompt) and generating the output (the AI's response). Each driver implements the DriverInterface and provides a process() method that takes a BuilderInput object, processes the prompt, and returns a BuilderOutput.
For example, the OllamaDriver sends the prompt to a local Ollama service and receives the AI's response.
Here is an example of the OllamaDriver:
namespace App\Drivers;

use Exception;
use Illuminate\Support\Facades\Http;
use NoahMedra\PromptBuilder\BuilderInput;
use NoahMedra\PromptBuilder\BuilderOutput;
use NoahMedra\PromptBuilder\Drivers\DriverInterface;

class OllamaDriver implements DriverInterface
{
    public function process(BuilderInput $input): BuilderOutput
    {
        return new BuilderOutput($this->executePrompt($input));
    }

    private function executePrompt(BuilderInput $input): string
    {
        $response = Http::withHeaders([
            'Content-Type' => 'application/x-www-form-urlencoded',
        ])->post('http://localhost:11434/api/chat', [
            'model' => 'llama3.1',
            'stream' => false,
            'messages' => [
                [
                    'role' => 'user',
                    'content' => $input->getPromptText(),
                ],
            ],
            ...($input->getParams() ?? []),
        ]);

        if ($response->failed()) {
            throw new Exception($response->body());
        }

        return $response->body();
    }
}
In this example:
- The process() method receives a BuilderInput object which contains the prompt text.
- The executePrompt() method sends the prompt to a remote API (Ollama) and receives the AI's response.
- The response is returned as a BuilderOutput object.
Complete Example
Here is a complete example of how to use PromptBuilder:
use NoahMedra\PromptBuilder\PromptBuilder;

$builder = PromptBuilder::make();

// Add parameters, instructions, context, and a question
$builder->withParams(['lang' => 'en'])
    ->instruction("Provide a clear and concise answer.")
    ->instruction("The response should be in JSON format.")
    ->instruction("Include the following sub-instructions:")
    ->when(true, function ($builder) {
        $builder
            ->instruction("Ensure the answer is clear and easy to understand.")
            ->instruction("Be concise and avoid unnecessary details.");
    })
    ->context("You are an expert in financial analysis.")
    ->ask("What are the main trends in the past three months of my financial data?");

// Generate and get the response
$builder->process();
$output = $builder->getOutput();

echo $output->get('message.content');
Features
- Driver flexibility: Easily add or replace text-processing engines.
- Dynamic instructions with sub-instructions and conditions: Add custom instructions that can have nested sub-instructions, and build complex prompts with conditional instructions using the when() method.
- History management: Store previous conversations to provide context.
- JSON response support: Format the AI's responses in JSON for easy integration.
- Dynamic parameters: Pass custom parameters to your prompts.
Customization
You can easily extend PromptBuilder by creating your own drivers. To do this, simply create a class that implements the DriverInterface (shown in the OllamaDriver example above) and pass it to the driver() method.
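A minimal sketch of a custom driver, assuming BuilderOutput accepts the raw response body as a string (as the OllamaDriver above does) and that getOutput()->get() reads dot-notation paths from that JSON; the FakeDriver name and canned payload are purely illustrative:

namespace App\Drivers;

use NoahMedra\PromptBuilder\BuilderInput;
use NoahMedra\PromptBuilder\BuilderOutput;
use NoahMedra\PromptBuilder\Drivers\DriverInterface;

// A canned driver, handy for tests: it never calls an external API.
class FakeDriver implements DriverInterface
{
    public function process(BuilderInput $input): BuilderOutput
    {
        $body = json_encode([
            'model'   => 'fake',
            'message' => ['content' => 'Canned answer for: ' . $input->getPromptText()],
        ]);

        return new BuilderOutput($body);
    }
}

Register it the same way as the built-in drivers:

$builder->driver(\App\Drivers\FakeDriver::class);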
Contributions
Contributions are welcome! If you have ideas to improve the package, feel free to submit an issue or a pull request.
Statistics
- Total downloads: 31
- Monthly downloads: 0
- Daily downloads: 0
- Favorites: 0
- Hits: 0
- Dependents: 0
- Suggesters: 0
Other information
- License: MIT
- Last updated: 2025-12-02