Summarix is a Python package that processes user-provided text statements or stories and extracts structured summaries or insights using a language model combined with regex pattern matching and automatic retries. By leveraging a pattern-aware conversation framework, it turns plain text prompts into organized, actionable data, ensuring that user input is interpreted consistently and mapped into predefined data formats. This reduces ambiguity and makes automated knowledge extraction and storytelling analysis more reliable.
- Uses LangChain language models (ChatLLM7 by default)
- Regex pattern matching for precise output extraction (see the sketch after this list)
- Supports custom language model integration
- Handles retries and error management seamlessly
- Simplifies conversion of complex text inputs into structured data
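The snippet below is a minimal, hypothetical sketch of this general approach (query a chat model, extract a field with a regex, and retry on failure). It is illustrative only and is not Summarix's actual implementation; the ask_model callable, the SUMMARY: output format, and the retry count are all assumptions.

import re
import time

# Illustrative sketch only: the SUMMARY: format, the ask_model callable, and
# the retry count are assumptions, not Summarix's real internals.
SUMMARY_PATTERN = re.compile(r"SUMMARY:\s*(.+)", re.DOTALL)

def extract_with_retries(ask_model, prompt, max_retries=3):
    # ask_model is any callable that takes a prompt string and returns text,
    # e.g. a thin wrapper around a LangChain chat model's invoke() call.
    for _ in range(max_retries):
        text = ask_model(prompt)
        match = SUMMARY_PATTERN.search(text)
        if match:
            return match.group(1).strip()  # the captured, structured summary
        time.sleep(1)  # brief pause before retrying with the same prompt
    raise ValueError("no structured summary found after retries")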
Install the package via pip:
pip install summarix

Import the main function and use it with your input text:
from summarix import summarix
response = summarix(user_input="Your text here")

- user_input (str): The text statement or story to process.
- llm (Optional[BaseChatModel]): A custom langchain language model instance. Defaults to using ChatLLM7.
- api_key (Optional[str]): API key for the LLM7 service. If not provided, the package looks for the LLM7_API_KEY environment variable or falls back to the default free tier.
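For example, the key can be supplied through the environment instead of the function call; the sketch below assumes the behaviour described above and uses a placeholder key:

import os

from summarix import summarix

# Placeholder key: when api_key is not passed, summarix reads LLM7_API_KEY
# from the environment, as described above.
os.environ["LLM7_API_KEY"] = "your_api_key"
response = summarix(user_input="Your text here")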
You can pass your own language model instance to use other providers supported by LangChain, such as OpenAI, Anthropic, or Google Generative AI.
Example using OpenAI:
from langchain_openai import ChatOpenAI
from summarix import summarix
llm = ChatOpenAI()  # reads the OPENAI_API_KEY environment variable by default
response = summarix(user_input="Analyze this story", llm=llm)

Example using Anthropic:
from langchain_anthropic import ChatAnthropic
from summarix import summarix
llm = ChatAnthropic(model="claude-3-5-sonnet-latest")  # example model name; reads ANTHROPIC_API_KEY from the environment
response = summarix(user_input="Describe the scenario", llm=llm)

Example using Google Generative AI:
from langchain_google_genai import ChatGoogleGenerativeAI
from summarix import summarix
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # example model name; reads GOOGLE_API_KEY from the environment
response = summarix(user_input="Generate insights", llm=llm)

The default rate limits for LLM7's free tier are sufficient for most use cases. For higher limits, obtain a free API key at https://token.llm7.io/ and provide it via the LLM7_API_KEY environment variable or directly in the function call:
response = summarix(user_input="Task", api_key="your_api_key")For issues or feature requests, please visit the GitHub repository:
https://github.com/chigwell/summarix/issues
Eugene Evstafev
Email: hi@eugene.plus
GitHub: chigwell