Class LLMSummarizationCompressor
- Namespace: AiDotNet.PromptEngineering.Compression
- Assembly: AiDotNet.dll
Compressor that uses an LLM to intelligently summarize and compress prompts.
public class LLMSummarizationCompressor : PromptCompressorBase, IPromptCompressor
- Inheritance: object → PromptCompressorBase → LLMSummarizationCompressor
- Implements: IPromptCompressor
Remarks
This compressor delegates compression to a language model, which can understand context and produce semantically equivalent but shorter versions of prompts. It's the most intelligent form of compression but requires an LLM call.
For Beginners: Uses AI to make your prompt shorter while keeping the meaning.
Example:
var compressor = new LLMSummarizationCompressor(summarizeFunc);
string verbose = @"I would like you to help me with a task. The task involves
taking the following customer feedback data and analyzing it to identify
the main themes and patterns. Please pay special attention to any recurring
complaints or suggestions that customers have made. After your analysis,
provide a summary of your findings.";
string compressed = await compressor.CompressAsync(verbose);
// Result: "Analyze customer feedback to identify themes, recurring complaints,
// and suggestions. Summarize findings."
When to use:
- For complex prompts where simple pattern matching won't work
- When semantic understanding is required
- When maximum compression with preserved meaning is needed
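The example above passes a summarizeFunc delegate without showing where it comes from. The sketch below shows one way such a delegate might be wired to an LLM client; ChatClient and CompleteAsync are hypothetical placeholders for whatever SDK you use, not part of AiDotNet.
// Hypothetical chat-completion client; replace with your LLM SDK of choice.
var client = new ChatClient(apiKey: Environment.GetEnvironmentVariable("LLM_API_KEY"));
Func<string, CancellationToken, Task<string>> summarizeFunc = async (text, ct) =>
{
    // The compressor hands over the text to shorten; return the model's shorter version.
    return await client.CompleteAsync(
        "Rewrite the following prompt as briefly as possible while preserving its meaning:\n\n" + text,
        ct);
};
var compressor = new LLMSummarizationCompressor(summarizeFunc);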
Constructors
LLMSummarizationCompressor(Func<string, CancellationToken, Task<string>>?, string?, Func<string, int>?)
Initializes a new instance of the LLMSummarizationCompressor class.
public LLMSummarizationCompressor(Func<string, CancellationToken, Task<string>>? summarizeFunc = null, string? systemPrompt = null, Func<string, int>? tokenCounter = null)
Parameters
summarizeFunc Func<string, CancellationToken, Task<string>>
Function that calls an LLM to summarize text. Takes the text to compress and returns the compressed version.
systemPrompt string
Optional custom system prompt for the compression.
tokenCounter Func<string, int>
Optional custom token counter function.
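As a sketch of supplying all three arguments (reusing the hypothetical summarizeFunc from the example above); the token counter shown is a rough word-count estimate, not the library's built-in counter.
var compressor = new LLMSummarizationCompressor(
    summarizeFunc: summarizeFunc,
    systemPrompt: "Compress the user's prompt to the fewest tokens that preserve its intent.",
    tokenCounter: text => text.Split(new[] { ' ', '\n', '\t' }, StringSplitOptions.RemoveEmptyEntries).Length);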
Properties
SystemPrompt
Gets the compression prompt template.
public string SystemPrompt { get; }
Property Value
string
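A short sketch, assuming the property surfaces the systemPrompt supplied to the constructor and otherwise a built-in default template:
var custom = new LLMSummarizationCompressor(summarizeFunc, systemPrompt: "Shorten prompts aggressively.");
Console.WriteLine(custom.SystemPrompt);   // presumably the custom prompt above
var standard = new LLMSummarizationCompressor(summarizeFunc);
Console.WriteLine(standard.SystemPrompt); // the built-in compression prompt template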
Methods
CompressAsync(string, CompressionOptions?, CancellationToken)
Compresses the prompt asynchronously using the LLM.
public override Task<string> CompressAsync(string prompt, CompressionOptions? options = null, CancellationToken cancellationToken = default)
Parameters
prompt string
options CompressionOptions
cancellationToken CancellationToken
Returns
Task<string>
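A usage sketch with a timeout-based cancellation token; options is left as null here because the members of CompressionOptions are not documented on this page.
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
string compressed = await compressor.CompressAsync(
    prompt: verbosePrompt,
    options: null,                  // fall back to default compression options
    cancellationToken: cts.Token);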
CompressCore(string, CompressionOptions)
Compresses the prompt synchronously.
protected override string CompressCore(string prompt, CompressionOptions options)
Parameters
prompt string
options CompressionOptions
Returns
string
Remarks
If no summarization function is provided, falls back to a simple rule-based compression. For full LLM-based compression, use CompressAsync.
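Because CompressCore is an overridable member, a derived compressor can customize the synchronous fallback. A minimal sketch, in which the whitespace-collapsing step is illustrative and not part of AiDotNet:
public class TrimmedSummarizationCompressor : LLMSummarizationCompressor
{
    protected override string CompressCore(string prompt, CompressionOptions options)
    {
        // Collapse runs of whitespace, then defer to the base rule-based fallback.
        string collapsed = System.Text.RegularExpressions.Regex.Replace(prompt, @"\s+", " ").Trim();
        return base.CompressCore(collapsed, options);
    }
}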