forum-guard

License: MIT

forum-guard is a Python package for structured, reliable processing of free-text forum submissions. It takes user input such as comments, questions, and feedback, and applies pattern matching with automatic retries to extract key information, topics, or sentiment from the provided text, so that moderators and automated tools can categorize submissions consistently.
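The package's internals are not shown here, but the general technique it describes, querying a model and retrying until the reply matches an expected pattern, can be sketched as below. The names (`extract_with_retries`, `fake_model`) and the regex are illustrative assumptions, not part of the forum_guard API:

```python
import re
from typing import Callable, Optional

def extract_with_retries(
    text: str,
    query: Callable[[str], str],
    pattern: str,
    max_retries: int = 3,
) -> Optional[str]:
    """Ask `query` for a reply and retry until it matches `pattern`.

    Returns the first regex capture group, or None if every attempt fails.
    """
    for _ in range(max_retries):
        reply = query(text)
        match = re.search(pattern, reply)
        if match:
            return match.group(1)
    return None

# Stand-in for a chat-model call; a real caller would invoke an LLM here.
def fake_model(prompt: str) -> str:
    return "topic: shipping-delays"

topic = extract_with_retries(
    "My order is two weeks late, what is going on?",
    fake_model,
    r"topic:\s*([\w-]+)",
)
print(topic)  # shipping-delays
```

Retrying matters because model output is nondeterministic: a reply that fails the pattern once may succeed on a second attempt.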

Installation

pip install forum_guard

Usage

Here's an example of how to use the forum_guard function in Python:

from forum_guard import forum_guard

user_input = "Your user input text here."
response = forum_guard(user_input)
print(response)

Input Parameters

  • user_input (str): The user input text to process.
  • llm (Optional[BaseChatModel]): A LangChain chat model instance (any langchain_core BaseChatModel). If not provided, the default ChatLLM7 is used.
  • api_key (Optional[str]): The API key for ChatLLM7. If not provided, it attempts to read from the environment variable LLM7_API_KEY.

Custom LLM Usage

You can pass your own language model instance to forum_guard. Supported models include, but are not limited to:

OpenAI:

from langchain_openai import ChatOpenAI
from forum_guard import forum_guard

llm = ChatOpenAI()
response = forum_guard(user_input, llm=llm)

Anthropic:

from langchain_anthropic import ChatAnthropic
from forum_guard import forum_guard

llm = ChatAnthropic()
response = forum_guard(user_input, llm=llm)

Google Gemini:

from langchain_google_genai import ChatGoogleGenerativeAI
from forum_guard import forum_guard

llm = ChatGoogleGenerativeAI()
response = forum_guard(user_input, llm=llm)

Notes

  • The default ChatLLM7 is provided by the langchain_llm7 package. You can install it via:
pip install langchain-llm7
  • To increase rate limits, set your own LLM7_API_KEY as an environment variable or pass it directly:
response = forum_guard(user_input, api_key="your_api_key")
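The key-resolution order described above (an explicitly passed api_key first, then the LLM7_API_KEY environment variable) can be sketched as follows. `resolve_api_key` is a hypothetical helper for illustration, not part of the forum_guard API:

```python
import os
from typing import Optional

def resolve_api_key(api_key: Optional[str] = None) -> Optional[str]:
    """Prefer an explicitly passed key; fall back to the environment."""
    if api_key is not None:
        return api_key
    return os.environ.get("LLM7_API_KEY")

# An explicit argument wins over the environment variable.
os.environ["LLM7_API_KEY"] = "env-key"
print(resolve_api_key("passed-key"))  # passed-key
print(resolve_api_key())              # env-key
```

If neither source provides a key, the helper returns None, matching the package's behavior of falling back to its default unauthenticated rate limits.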

Support

For issues and feature requests, use the GitHub issue tracker: https://github.com/chigwell/forum-guard/issues

Author

Eugene Evstafev (chigwell)
Email: hi@eugene.plus