Commit 0d0ae5b

apartsin and claude committed
Add README with banner, badges, examples, and installation instructions
- Generated banner image via Gemini API
- Added top-level pyproject.toml for pip install from root
- MIT LICENSE file
- Quick start examples in Python and TypeScript
- Links to all documentation, tutorials, and samples
- Package builds as modelmesh-lite 0.1.0 (wheel + sdist)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
1 parent 3c928b4 commit 0d0ae5b

5 files changed: 286 additions & 7 deletions


LICENSE

Lines changed: 21 additions & 0 deletions
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 ModelMesh Contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md

Lines changed: 202 additions & 0 deletions
@@ -0,0 +1,202 @@
<p align="center">
  <img src="docs/assets/banner.png" alt="ModelMesh" width="100%">
</p>

<h1 align="center">ModelMesh Lite</h1>

<p align="center">
  <strong>One integration point for all your AI providers.</strong><br>
  Automatic failover, free-tier aggregation, and capability-based routing.
</p>

<p align="center">
  <a href="https://pypi.org/project/modelmesh-lite/"><img src="https://img.shields.io/pypi/v/modelmesh-lite?color=blue" alt="PyPI"></a>
  <a href="https://pypi.org/project/modelmesh-lite/"><img src="https://img.shields.io/pypi/pyversions/modelmesh-lite" alt="Python"></a>
  <a href="LICENSE"><img src="https://img.shields.io/badge/license-MIT-green" alt="License"></a>
  <a href="https://github.com/ApartsinProjects/ModelMesh/actions"><img src="https://img.shields.io/badge/tests-356%20passed-brightgreen" alt="Tests"></a>
</p>

---

Your application requests a **capability** (e.g. "chat completion"). ModelMesh picks the best available provider, rotates on failure, and chains free quotas across providers -- all behind a standard OpenAI SDK interface.

## Install

```bash
pip install modelmesh-lite
```

## Quick Start

Set an API key and go:

```bash
export OPENAI_API_KEY="sk-..."
```

### Python

```python
import modelmesh

# Create an OpenAI-compatible client with automatic provider routing
client = modelmesh.create("chat-completion")

response = client.chat.completions.create(
    model="chat-completion",  # virtual model name = capability pool
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

### TypeScript

```typescript
import { ModelMesh } from "@modelmesh/core";

const mesh = new ModelMesh();
await mesh.initialize({ providers: ["openai"] });
const client = mesh.getClient();

const response = await client.chat.completions.create({
  model: "chat-completion",
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(response.choices[0].message.content);
```

## What Happens Under the Hood

```
client.chat.completions.create(model="chat-completion", ...)
                  |
                  v
+-------------+      +-------------+      +-------------+
|   Router    | -->  |    Pool     | -->  |    Model    | --> Provider API
+-------------+      +-------------+      +-------------+
 Resolves the         Groups models        Selects best        Sends request,
 capability to        that can do          active model        handles retry
 a pool               the task             (rotation policy)   and failover
```

**`"chat-completion"`** resolves to a pool containing all models that support chat. The pool's **rotation policy** picks the best active model. If it fails, the router retries with backoff, then rotates to the next model. When a provider's free quota runs out, rotation automatically moves to the next provider.
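
The retry-then-rotate behavior described above can be sketched as a small loop. This is a hypothetical illustration with invented helper names (`call_with_failover`, `QuotaExhausted`), not the library's actual internals:

```python
import time

class QuotaExhausted(Exception):
    """Stand-in for a provider's 'free tier used up' error (illustrative)."""

def call_with_failover(providers, request, max_retries=2, backoff=0.01):
    # Try the active provider with exponential backoff; on quota
    # exhaustion or repeated failure, rotate to the next provider.
    last_error = None
    for provider in providers:  # rotation order = pool order
        for attempt in range(max_retries + 1):
            try:
                return provider(request)  # success: stick with this provider
            except QuotaExhausted as err:
                last_error = err
                break  # quota gone: rotate immediately, no point retrying
            except Exception as err:
                last_error = err
                time.sleep(backoff * 2 ** attempt)  # back off, then retry
    raise RuntimeError("all providers failed") from last_error

# Mock providers standing in for real connectors
def exhausted(request):
    raise QuotaExhausted("free tier used up")

def healthy(request):
    return f"echo: {request}"

print(call_with_failover([exhausted, healthy], "Hello!"))  # → echo: Hello!
```

A `stick-until-failure` strategy keeps returning the same provider until it errors, which is what the outer loop models here.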

## Multi-Provider Example

Add more API keys -- ModelMesh chains them automatically:

```bash
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GEMINI_API_KEY="AI..."
```

```python
import modelmesh

# All detected providers join the pool -- failover is automatic
client = modelmesh.create("chat-completion")
response = client.chat.completions.create(
    model="chat-completion",
    messages=[{"role": "user", "content": "Explain quantum computing briefly."}],
)
```

OpenAI, Anthropic, and Gemini models are now in the same pool. If OpenAI is down or its quota is exhausted, the request routes to Anthropic, then Gemini.
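
The pool membership described here can be pictured as a capability-to-models mapping. A hypothetical sketch using the model IDs from the YAML example below (not ModelMesh's real data structure):

```python
# Hypothetical capability -> ordered candidate models mapping
POOLS = {
    "chat-completion": [
        "openai.gpt-4o",
        "anthropic.claude-sonnet-4",
    ],
}

def resolve(capability: str) -> list[str]:
    """Return the ordered candidate models for a capability, or fail loudly."""
    if capability not in POOLS:
        raise ValueError(f"no pool registered for capability {capability!r}")
    return POOLS[capability]

print(resolve("chat-completion")[0])  # → openai.gpt-4o
```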

## YAML Configuration

For full control, use a configuration file:

```yaml
# modelmesh.yaml
providers:
  openai.llm.v1:
    connector: openai.llm.v1
    config:
      api_key: "${secrets:OPENAI_API_KEY}"

  anthropic.claude.v1:
    connector: anthropic.claude.v1
    config:
      api_key: "${secrets:ANTHROPIC_API_KEY}"

models:
  openai.gpt-4o:
    provider: openai.llm.v1
    capabilities:
      - generation.text-generation.chat-completion

  anthropic.claude-sonnet-4:
    provider: anthropic.claude.v1
    capabilities:
      - generation.text-generation.chat-completion

pools:
  chat:
    capability: generation.text-generation.chat-completion
    strategy: stick-until-failure
```

```python
client = modelmesh.create(config="modelmesh.yaml")
```
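
The `${secrets:...}` placeholders above suggest environment-backed substitution. A minimal sketch of such an expansion step (assumed behavior; the library's actual resolver may differ):

```python
import os
import re

# Matches ${secrets:NAME} placeholders, capturing NAME
_SECRET = re.compile(r"\$\{secrets:([A-Za-z0-9_]+)\}")

def expand_secrets(value: str) -> str:
    # Replace each ${secrets:NAME} with the NAME environment variable,
    # falling back to an empty string when the variable is unset.
    return _SECRET.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["DEMO_API_KEY"] = "sk-demo"
print(expand_secrets("${secrets:DEMO_API_KEY}"))  # → sk-demo
```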

## Key Features

| Feature | Description |
|---|---|
| **OpenAI-compatible** | Drop-in replacement for any OpenAI SDK client |
| **Multi-provider routing** | OpenAI, Anthropic, Gemini, Groq, and more |
| **Automatic failover** | Retry with backoff, then rotate to next model |
| **Free-tier aggregation** | Chain quotas across providers |
| **Capability-based pools** | Request tasks, not specific providers |
| **8 rotation strategies** | Stick-until-failure, cost-first, latency-first, round-robin, and more |
| **Pluggable connectors** | Extend any integration point with the CDK |
| **Zero dependencies** | Core library has no external dependencies |

## Documentation

| Document | Description |
|---|---|
| **[System Concept](docs/SystemConcept.md)** | Architecture, design, and full feature overview |
| **[Model Capabilities](docs/ModelCapabilities.md)** | Capability hierarchy tree and predefined pools |
| **[System Configuration](docs/SystemConfiguration.md)** | Full YAML configuration reference |
| **[Connector Catalogue](docs/ConnectorCatalogue.md)** | All pre-shipped connectors with config schemas |
| **[Connector Interfaces](docs/ConnectorInterfaces.md)** | Interface definitions for all connector types |
| **[System Services](docs/SystemServices.md)** | Runtime objects: Router, Pool, Model, State |

### CDK (Connector Development Kit)

| Document | Description |
|---|---|
| **[CDK Overview](docs/cdk/Overview.md)** | Architecture and class hierarchy |
| **[Base Classes](docs/cdk/BaseClasses.md)** | Reference for all CDK base classes |
| **[Developer Guide](docs/cdk/DeveloperGuide.md)** | Tutorials: build your own connectors |
| **[Convenience Layer](docs/cdk/ConvenienceLayer.md)** | QuickProvider and zero-config setup |
| **[Mixins](docs/cdk/Mixins.md)** | Cache, metrics, rate limiter, HTTP client |

### Samples

| Collection | Description |
|---|---|
| **[Quickstart](samples/quickstart/)** | 6 progressive examples in Python and TypeScript |
| **[System Integration](samples/system/)** | Multi-provider, streaming, embeddings, cost optimization |
| **[CDK Tutorials](samples/cdk/)** | Build providers, rotation policies, and more |
| **[Custom Connectors](samples/connectors/)** | Full custom connector examples for all 6 types |

## Development

```bash
# Clone and install dev dependencies
git clone https://github.com/ApartsinProjects/ModelMesh.git
cd ModelMesh

# Run tests
pip install pytest
cd src/python && python -m pytest ../../tests/ -v
```

## License

[MIT](LICENSE)

docs/assets/banner.png

234 KB
(binary file added)

pyproject.toml

Lines changed: 49 additions & 0 deletions
@@ -0,0 +1,49 @@
[build-system]
requires = ["setuptools>=68.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "modelmesh-lite"
version = "0.1.0"
description = "Capability-driven AI model routing with automatic failover and free-tier aggregation"
readme = "README.md"
requires-python = ">=3.11"
license = "MIT"
authors = [
    {name = "ModelMesh Contributors"}
]
keywords = ["ai", "llm", "routing", "openai", "multi-provider", "failover", "load-balancing"]
classifiers = [
    "Development Status :: 3 - Alpha",
    "Intended Audience :: Developers",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: 3.12",
    "Programming Language :: Python :: 3.13",
    "Programming Language :: Python :: 3.14",
    "Topic :: Software Development :: Libraries",
    "Topic :: Scientific/Engineering :: Artificial Intelligence",
]
dependencies = []

[project.optional-dependencies]
yaml = ["pyyaml>=6.0"]
full = ["pyyaml>=6.0"]
dev = ["pytest>=7.0", "pytest-asyncio>=0.21", "ruff>=0.1"]

[project.urls]
Homepage = "https://github.com/ApartsinProjects/ModelMesh"
Documentation = "https://github.com/ApartsinProjects/ModelMesh/tree/master/docs"
Repository = "https://github.com/ApartsinProjects/ModelMesh"
Issues = "https://github.com/ApartsinProjects/ModelMesh/issues"

[tool.setuptools.packages.find]
where = ["src/python"]
include = ["modelmesh*"]

[tool.ruff]
line-length = 120
target-version = "py311"

[tool.pytest.ini_options]
asyncio_mode = "auto"

src/python/pyproject.toml

Lines changed: 14 additions & 7 deletions
@@ -1,18 +1,18 @@
 [build-system]
 requires = ["setuptools>=68.0", "wheel"]
-build-backend = "setuptools.backends._legacy:_Backend"
+build-backend = "setuptools.build_meta"

 [project]
-name = "modelmesh"
+name = "modelmesh-lite"
 version = "0.1.0"
-description = "ModelMesh Lite — Capability-driven AI model routing with OpenAI-compatible interface"
+description = "Capability-driven AI model routing with automatic failover and free-tier aggregation"
 readme = "../../README.md"
 requires-python = ">=3.11"
 license = {text = "MIT"}
 authors = [
     {name = "ModelMesh Contributors"}
 ]
-keywords = ["ai", "llm", "routing", "openai", "multi-provider"]
+keywords = ["ai", "llm", "routing", "openai", "multi-provider", "failover", "load-balancing"]
 classifiers = [
     "Development Status :: 3 - Alpha",
     "Intended Audience :: Developers",
@@ -21,16 +21,23 @@ classifiers = [
     "Programming Language :: Python :: 3.11",
     "Programming Language :: Python :: 3.12",
     "Programming Language :: Python :: 3.13",
+    "Programming Language :: Python :: 3.14",
     "Topic :: Software Development :: Libraries",
+    "Topic :: Scientific/Engineering :: Artificial Intelligence",
 ]
-dependencies = []  # Zero external dependencies for core
+dependencies = []

 [project.optional-dependencies]
-aiohttp = ["aiohttp>=3.9"]
 yaml = ["pyyaml>=6.0"]
-full = ["aiohttp>=3.9", "pyyaml>=6.0"]
+full = ["pyyaml>=6.0"]
 dev = ["pytest>=7.0", "pytest-asyncio>=0.21", "ruff>=0.1"]

+[project.urls]
+Homepage = "https://github.com/ApartsinProjects/ModelMesh"
+Documentation = "https://github.com/ApartsinProjects/ModelMesh/tree/master/docs"
+Repository = "https://github.com/ApartsinProjects/ModelMesh"
+Issues = "https://github.com/ApartsinProjects/ModelMesh/issues"
+
 [tool.setuptools.packages.find]
 where = ["."]
 include = ["modelmesh*"]
