Victor is a highly modular and extensible AGI framework designed for complex cognitive simulations and advanced AI operations. This repository contains the "Victor Prime Synthesis Core," an architecture engineered for sophisticated AI development, featuring custom tensor operations, advanced memory systems, and a dynamic sector-based cognitive model.
The heart of the system is `victor_core`, which provides a robust foundation for AGI development. Key components include:
- Modular Sector-Based Design: The AGI's operations are segmented into specialized, concurrently operating sectors: Input Processing, Cognitive Executive, Memory Management, Natural Language Generation (NLG), Plugin Management, and Prime Loyalty. Sectors communicate asynchronously via the `BrainFractalPulseExchange`.
- `VictorBrain`: The central orchestrator that initializes, manages, and coordinates all sectors and core components, driving the main AGI processing loop.
- `ASICoreDataContainer`: A centralized container that manages and provides access to shared resources such as the global configuration (`ASIConfigCore`), the main memory system (`HyperFractalMemory`), and NLP/code tokenizers (`FractalTokenKernel_v1_1_0`).
- `OmegaTensor`: A custom autograd library built with `numpy`, enabling dynamic computation graphs and gradient tracking for neural network operations within the AGI framework.
- `BrainFractalPulseExchange`: An asynchronous, topic-based messaging system facilitating decoupled communication and event handling between sectors and components.
- `HyperFractalMemory`: A sophisticated memory system for storing complex data structures, featuring semantic search capabilities (placeholder for vector similarity), emotional impact assessment, and relevance-based decay.
- `PrimeLoyaltyKernel`: A component integrated within the `PrimeLoyaltySector` to ensure the AGI's actions and decisions align with predefined core directives and ethical guidelines.
- Extensible Plugin System: Managed by the `ModularPluginSector` and its `ModularPluginCortex`, allowing dynamic loading and integration of new capabilities and tools from the `victor_plugins` directory.
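To illustrate the topic-based messaging pattern that `BrainFractalPulseExchange` is described as implementing, here is a minimal `asyncio` sketch. The class and method names below are illustrative only, not the actual `victor_core` API:

```python
import asyncio

class PulseExchange:
    """Minimal topic-based pub/sub sketch (illustrative, not the real API)."""

    def __init__(self):
        self._subscribers = {}  # topic -> list of async callbacks

    def subscribe(self, topic, callback):
        self._subscribers.setdefault(topic, []).append(callback)

    async def publish(self, topic, payload):
        # Deliver the payload to every subscriber of the topic concurrently.
        callbacks = self._subscribers.get(topic, [])
        await asyncio.gather(*(cb(payload) for cb in callbacks))

async def main():
    exchange = PulseExchange()
    received = []

    async def on_input(payload):
        received.append(payload)

    # A "sector" subscribes to a topic; another sector publishes to it.
    exchange.subscribe("input.text", on_input)
    await exchange.publish("input.text", {"text": "hello"})
    return received

print(asyncio.run(main()))  # → [{'text': 'hello'}]
```

Because publishers only know topic names, sectors stay decoupled: a new subscriber can be added without touching the code that publishes.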
The `victor_modules` directory houses more extensive, specialized components that can be integrated into the core AGI or function as standalone tools or advanced plugins:
- `quantum/zero_point_quantum_driver.py`: A module simulating zero-point energy compression and metaphysical embedding concepts, offering unique data transformation capabilities.
- `fractal_agi/victorch_filthy_fractal_agi_persist.py`: A conceptual, self-contained AGI persistence mechanism employing fractal logic and simulated quantum annealing for state management. (Note: this is a thematic module and not directly integrated into `VictorBrain`'s main persistence, which is handled by `HyperFractalMemory` and component-specific serialization.)
- `plugin_loader.py`: A legacy plugin loader, deprecated in favor of the newer `ModularPluginSector` and `ModularPluginCortex`.
The `models` directory contains model configurations and training infrastructure for Victor:
- `blank_slate.json`: A minimal, untrained model configuration that serves as a starting point for training or fine-tuning. Contains the architecture configuration, training hyperparameters, and system settings.
- `transformer_model.py`: A complete PyTorch implementation of a GPT-style transformer model with 125M parameters, including multi-head attention, position embeddings, and causal masking.
- Pretrained GGUF Models: Victor supports GGUF (GPT-Generated Unified Format) quantized models for efficient inference. Due to their large size, pretrained models must be downloaded separately; see `models/PRETRAINED_GGUF_DOWNLOAD.txt` for instructions.
- Model Utilities: The `models` module provides Python utilities for loading and managing models. See `models/example_usage.py` for examples.
For detailed information about models, see `models/README.md`.
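The causal masking mentioned above ensures each token can attend only to itself and earlier positions. A minimal `numpy` sketch of the idea (the actual `transformer_model.py` implementation uses PyTorch; this is only an illustration of the masking math):

```python
import numpy as np

def causal_mask(seq_len):
    # Lower-triangular boolean matrix: position i may attend to positions 0..i.
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def masked_softmax(scores, mask):
    # Disallowed positions receive -inf before softmax, so their weight is 0.
    scores = np.where(mask, scores, -np.inf)
    exp = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)

scores = np.zeros((4, 4))  # uniform raw attention scores
weights = masked_softmax(scores, causal_mask(4))
print(weights[1])  # token 1 attends only to tokens 0 and 1: [0.5, 0.5, 0, 0]
```

With uniform scores, each row spreads its attention evenly over the allowed (past and current) positions and gives zero weight to future ones.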
Victor includes a complete training pipeline for state-of-the-art transformer models:
```shell
# Install training dependencies
pip install torch tqdm numpy

# Train on sample data (auto-generated)
python train_sota_model.py --create-sample-data --epochs 3

# Train on your own dataset
python train_sota_model.py --data your_text.txt --epochs 10 --batch-size 8
```

The training script implements:
- Modern Optimization: AdamW with learning rate warmup and cosine annealing
- Efficient Training: Gradient clipping, accumulation, and automatic checkpointing
- Progress Tracking: Detailed logging and loss visualization
For comprehensive training documentation, see `TRAINING_GUIDE.md`.
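The warmup-plus-cosine-annealing schedule listed above can be sketched in plain Python to show its shape (the training script itself configures this through PyTorch's optimizer machinery; the function below is only an illustration):

```python
import math

def lr_at_step(step, warmup_steps, total_steps, peak_lr, min_lr=0.0):
    """Linear warmup to peak_lr, then cosine decay toward min_lr."""
    if step < warmup_steps:
        # Linear warmup: ramp from peak_lr/warmup_steps up to peak_lr.
        return peak_lr * (step + 1) / warmup_steps
    # Cosine annealing over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))

# Example: 1000 total steps, 100 warmup steps, peak learning rate 3e-4.
print(lr_at_step(49, 100, 1000, 3e-4))   # mid-warmup: half of peak
print(lr_at_step(100, 100, 1000, 3e-4))  # end of warmup: exactly peak
print(lr_at_step(999, 100, 1000, 3e-4))  # near the end: close to min_lr
```

Warmup avoids large, destabilizing updates while AdamW's moment estimates are still noisy; the cosine tail then shrinks the step size smoothly as training converges.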
For the easiest setup experience, use our automated setup scripts that create directories, install dependencies, and create a desktop shortcut:
Windows:
- Ensure you have Python 3.8 or newer installed from python.org
- Clone this repository to your local machine
- Double-click `setup.bat` or run it from Command Prompt
- A "Victor LLM" shortcut will be created on your desktop
- Double-click the desktop shortcut to run Victor!
Linux/macOS:
- Ensure you have Python 3.8 or newer installed
- Clone this repository to your local machine
- Open a terminal in the repository directory
- Run: `bash setup.sh`
- A desktop shortcut will be created (or use `./run_victor.sh` to run)
- Ensure you have Python 3.8 or newer installed.
- Clone this repository to your local machine.
- Navigate to the repository's root directory and install the required dependencies:

```shell
pip install -r requirements.txt
```

This will install `numpy`, `scipy`, `openai`, `pyttsx3`, `pydub`, and `opencv-python`, among other core requirements.

- (Optional) To use pretrained GGUF models, install `llama-cpp-python`:

```shell
pip install llama-cpp-python
```

See `models/PRETRAINED_GGUF_DOWNLOAD.txt` for instructions on obtaining pretrained models.
The primary entry point for the advanced Victor AGI framework is `victor_core/main.py`.
- Environment Variables: Certain components or plugins (especially those interfacing with external services like OpenAI) may require specific environment variables to be set (e.g., `OPENAI_API_KEY`). The core AGI framework can operate without these for basic functionality, but ensure they are configured for full capabilities or when using relevant plugins.
- Run the AGI: Execute the following command from the root of the repository:

```shell
python -m victor_core.main
```

This script initializes the `VictorBrain`, activates all its sectors, sets up a dummy plugin for demonstration if the plugin directory is empty, and starts the main AGI processing loop.
For a simpler, direct demonstration of an LLM-based agent, the `VICTOR_AGI_LLM.py` script at the root of the repository is still available. It provides a minimal framework for an AGI-style agent that uses OpenAI's language models directly.
- Ensure your `OPENAI_API_KEY` environment variable is set.
- Run the script:

```shell
python VICTOR_AGI_LLM.py
```

You can enable text-to-speech output by adding the `--voice` argument if you have the `pyttsx3` package installed and configured.
- The core AGI framework logic resides primarily within the `victor_core` package.
- New functionalities and tools can be added by creating plugins in the `victor_plugins` directory (the specific path is configured in `victor_core/config.py` via `ASIConfigCore.PLUGIN_DIR`).
- Larger, more specialized modules or standalone conceptual systems can be developed within the `victor_modules` directory.
- The system uses an asynchronous architecture; familiarity with Python's `asyncio` library is beneficial for development.
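As a sketch of what a plugin file might look like, consider the following. The discovery convention and interface shown here (`PLUGIN_CLASS`, a `run` method) are hypothetical; check the `ModularPluginCortex` source in `victor_core` for the actual conventions it expects:

```python
# victor_plugins/echo_plugin.py -- illustrative only; the real loader's
# discovery and interface conventions are defined by ModularPluginCortex.

class EchoPlugin:
    """A trivial plugin that echoes its input back."""

    name = "echo"

    def run(self, payload):
        # A real plugin would perform a useful transformation here.
        return {"echoed": payload}

# Hypothetical discovery hook: the loader instantiates the exported class.
PLUGIN_CLASS = EchoPlugin

if __name__ == "__main__":
    plugin = PLUGIN_CLASS()
    print(plugin.run("hello"))  # → {'echoed': 'hello'}
```

Keeping each plugin self-contained in a single module under the configured `PLUGIN_DIR` lets the cortex load new capabilities at runtime without changes to the core sectors.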