**Pevooo** commented Mar 26, 2025

✨ Add CustomTextGenerator for Custom Function-Based Text Generation

πŸ“Œ Summary

This PR introduces CustomTextGenerator, a new text generation class for llmx that lets users supply their own text generation function. This enables custom generation logic, such as wrapping local models or external APIs, making llmx more flexible.

πŸ› οΈ Changes Introduced

  • New CustomTextGenerator Class
    • Accepts a text_generation_function: Callable[[str], str] to generate text based on a user-defined function.
    • Implements caching to reuse previously generated responses.
    • Includes token counting functionality via num_tokens_from_messages().
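
The changes above can be sketched as a minimal, self-contained class. This is an illustrative approximation, not the PR's actual code: the real class presumably subclasses llmx's generator base and returns a structured response object (as in the usage example below), while this sketch returns a plain string and uses a naive whitespace token count.

```python
from typing import Callable, Dict, List

class CustomTextGenerator:
    """Sketch of a generator driven by a user-supplied function."""

    def __init__(self, text_generation_function: Callable[[str], str]):
        self.text_generation_function = text_generation_function
        self._cache: Dict[str, str] = {}  # flattened prompt -> cached response

    def _flatten(self, messages: List[dict]) -> str:
        # Join chat messages into a single "role: content" prompt string
        return "\n".join(f"{m['role']}: {m['content']}" for m in messages)

    def generate(self, messages: List[dict], use_cache: bool = True) -> str:
        prompt = self._flatten(messages)
        if use_cache and prompt in self._cache:
            return self._cache[prompt]  # reuse a previously generated response
        result = self.text_generation_function(prompt)
        self._cache[prompt] = result
        return result

    def num_tokens_from_messages(self, messages: List[dict]) -> int:
        # Crude whitespace-based count; a real implementation would likely
        # use a proper tokenizer such as tiktoken
        return len(self._flatten(messages).split())
```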

βœ… How It Works

```python
# Define a custom text generation function
def my_custom_function(prompt: str) -> str:
    return f"Generated response for: {prompt}"

# Instantiate CustomTextGenerator with the custom function
custom_gen = CustomTextGenerator(text_generation_function=my_custom_function)

# Generate text from a chat-style message list
response = custom_gen.generate([{"role": "user", "content": "Hello"}])
print(response.text[0].content)  # Output: "Generated response for: user: Hello"
```

πŸš€ Why This is Useful

  • Enables custom text generation logic beyond built-in providers.
  • Improves flexibility for users integrating external models or APIs.
  • Supports caching, reducing redundant calls and improving efficiency.
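
The caching benefit can be illustrated generically with Python's standard memoization decorator. This is a stand-in to show the idea, not llmx's cache implementation; the counter confirms the underlying function runs only once per distinct prompt.

```python
from functools import lru_cache

call_count = 0

@lru_cache(maxsize=None)
def expensive_generation(prompt: str) -> str:
    # Stand-in for a slow model or API call; counts real invocations
    global call_count
    call_count += 1
    return f"Generated response for: {prompt}"

expensive_generation("user: Hello")
expensive_generation("user: Hello")  # served from cache; no second call
```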

Pevooo marked this pull request as draft March 26, 2025 23:59
Pevooo marked this pull request as ready for review March 27, 2025 00:02