This directory contains comprehensive documentation for the ProactivePulse MSP Insight Agent system.
- architecture.md - System architecture and design patterns
- api-reference.md - Complete API documentation
- data-models.md - Data structures and schemas
- deployment.md - Deployment guides for local and AWS
- user-guide.md - End-user documentation
- development.md - Developer setup and contribution guidelines
- troubleshooting.md - Common issues and solutions
ProactivePulse is a dual-mode system that correlates infrastructure metrics with support tickets to generate proactive insights for Managed Service Providers (MSPs).
- Anomaly Detection: Identifies unusual patterns in metrics
- Ticket Clustering: Groups similar support issues
- Correlation Engine: Links anomalies with ticket patterns
- Proactive Insights: Generates root-cause hypotheses and recommended actions
- Dual Mode: Runs locally for development or on AWS for production
Local Mode (`MODE=local`):
- scikit-learn for anomaly detection
- sentence-transformers for NLP
- Local file system storage
- FastAPI development server
AWS Mode (`MODE=aws`):
- Amazon SageMaker for anomaly detection
- Amazon Bedrock for NLP and text generation
- S3 and DynamoDB for storage
- Lambda functions with API Gateway
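The two modes above amount to a single dispatch on the `MODE` environment variable. The following is an illustrative sketch only, not the actual ProactivePulse code; the function name and backend labels are ours, and the real system would wire up concrete clients (scikit-learn models, boto3 clients, etc.) at this point:

```python
import os

def select_backends(mode=None):
    """Pick service backends based on MODE ('local' or 'aws').

    Illustrative sketch -- a real implementation would return
    constructed client objects rather than labels.
    """
    mode = (mode or os.environ.get("MODE", "local")).lower()
    backends = {
        "local": {
            "anomaly_detection": "scikit-learn",
            "nlp": "sentence-transformers",
            "storage": "local file system",
            "api": "FastAPI development server",
        },
        "aws": {
            "anomaly_detection": "Amazon SageMaker",
            "nlp": "Amazon Bedrock",
            "storage": "S3 + DynamoDB",
            "api": "Lambda + API Gateway",
        },
    }
    if mode not in backends:
        raise ValueError(f"MODE must be 'local' or 'aws', got {mode!r}")
    return backends[mode]
```

Keeping the mode decision in one place like this means the rest of the pipeline never needs to branch on `MODE` itself.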
- AWS Account with appropriate permissions
- AWS CLI configured with credentials
- S3 buckets for data storage
- DynamoDB tables for structured data
- IAM roles and policies for service access
To deploy in AWS mode:

1. Set `MODE=aws` in your environment configuration.
2. Configure AWS credentials using one of these methods:
   - AWS credentials file (`~/.aws/credentials`)
   - Environment variables (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`)
   - IAM roles (for EC2 or Lambda deployments)
3. Update the following AWS-specific configuration values:
   - `AWS_REGION`: Your preferred AWS region
   - `S3_BUCKET_RAW`: S3 bucket for raw data
   - `S3_BUCKET_PROCESSED`: S3 bucket for processed data
   - `DYNAMODB_INSIGHTS_TABLE`: DynamoDB table for insights
   - `DYNAMODB_ANOMALIES_TABLE`: DynamoDB table for anomalies
   - `DYNAMODB_CLUSTERS_TABLE`: DynamoDB table for clusters
   - `AWS_BEDROCK_MODEL_TEXT`: Bedrock model for text generation
   - `AWS_BEDROCK_MODEL_EMBED`: Bedrock model for embeddings
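An AWS deployment can fail late and confusingly when one of these variables is unset, so a small startup check is worth having. A minimal sketch (the function name is ours, not part of the project):

```python
import os

# Variables the AWS mode needs, per the configuration list above.
REQUIRED_AWS_VARS = [
    "AWS_REGION",
    "S3_BUCKET_RAW",
    "S3_BUCKET_PROCESSED",
    "DYNAMODB_INSIGHTS_TABLE",
    "DYNAMODB_ANOMALIES_TABLE",
    "DYNAMODB_CLUSTERS_TABLE",
    "AWS_BEDROCK_MODEL_TEXT",
    "AWS_BEDROCK_MODEL_EMBED",
]

def missing_aws_config(env=None):
    """Return the names of required AWS-mode variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_AWS_VARS if not env.get(name)]
```

Calling `missing_aws_config()` at startup and refusing to boot when the returned list is non-empty turns a runtime `NoSuchBucket` surprise into an immediate, readable error.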
The system requires the following AWS resources to be created:
- **S3 Buckets**:
  - Raw data bucket for incoming tickets and metrics
  - Processed data bucket for analyzed results
- **DynamoDB Tables**:
  - Insights table for storing generated insights
  - Anomalies table for storing detected anomalies
  - Clusters table for storing ticket clusters
- **IAM Roles and Policies**:
  - Permissions for S3 read/write operations
  - Permissions for DynamoDB read/write operations
  - Permissions for Bedrock model access
  - Permissions for SageMaker endpoint invocation (if using)
- **Bedrock Model Access**:
  - Enable access to selected Bedrock models in the AWS console
- **SageMaker Endpoint (optional)**:
  - Deploy a Random Cut Forest model for enhanced anomaly detection
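As a sketch of the DynamoDB side of this checklist, the three tables can be declared as `create_table` parameter dictionaries in the shape boto3 expects. The key attributes (`insight_id`, `anomaly_id`, `cluster_id`) are illustrative guesses, not the documented schema; consult data-models.md and your access patterns before settling on keys:

```python
# Hypothetical table definitions. Key names are assumptions made for
# illustration; the real schema lives in data-models.md.
DYNAMODB_TABLES = {
    "YourInsightsTable": {
        "KeySchema": [{"AttributeName": "insight_id", "KeyType": "HASH"}],
        "AttributeDefinitions": [
            {"AttributeName": "insight_id", "AttributeType": "S"}
        ],
        "BillingMode": "PAY_PER_REQUEST",  # on-demand, no capacity planning
    },
    "YourAnomaliesTable": {
        "KeySchema": [{"AttributeName": "anomaly_id", "KeyType": "HASH"}],
        "AttributeDefinitions": [
            {"AttributeName": "anomaly_id", "AttributeType": "S"}
        ],
        "BillingMode": "PAY_PER_REQUEST",
    },
    "YourClustersTable": {
        "KeySchema": [{"AttributeName": "cluster_id", "KeyType": "HASH"}],
        "AttributeDefinitions": [
            {"AttributeName": "cluster_id", "AttributeType": "S"}
        ],
        "BillingMode": "PAY_PER_REQUEST",
    },
}
```

With a boto3 DynamoDB client, each entry could then be passed as `client.create_table(TableName=name, **params)`.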
See `.env.example` for a complete list of environment variables with descriptions.
For AWS deployment, the following variables are particularly important:
```
MODE=aws
AWS_REGION=us-east-1
S3_BUCKET_RAW=your-raw-data-bucket
S3_BUCKET_PROCESSED=your-processed-data-bucket
DYNAMODB_INSIGHTS_TABLE=YourInsightsTable
DYNAMODB_ANOMALIES_TABLE=YourAnomaliesTable
DYNAMODB_CLUSTERS_TABLE=YourClustersTable
AWS_BEDROCK_MODEL_TEXT=amazon.nova-lite-v1
AWS_BEDROCK_MODEL_EMBED=amazon.titan-embed-text-v1
```

- Issues: Report bugs and feature requests on GitHub
- Discussions: Join community discussions
- Documentation: Check relevant documentation files
- Support: Contact the development team
See the Development Guide (development.md) for information on:
- Setting up development environment
- Code style and standards
- Testing requirements
- Pull request process