Axiom.jl

What is Axiom.jl?

Axiom.jl is a next-generation ML framework that combines:

  • Compile-time verification - Shape errors caught before runtime

  • Formal guarantees - Mathematical proofs about model behavior

  • High performance - Rust backend for production speed

  • Julia elegance - Express models as mathematical specifications

using Axiom

@axiom ImageClassifier begin
    input :: Image(224, 224, 3)
    output :: Probabilities(1000)

    features = input |> ResNet50(pretrained=true) |> GlobalAvgPool()
    output = features |> Dense(2048, 1000) |> Softmax

    # GUARANTEED at compile time
    @ensure sum(output) ≈ 1.0
    @ensure all(output .>= 0)
end

model = ImageClassifier()
predictions = model(images)  # Type-safe, verified, fast

Features

Compile-Time Shape Verification

# PyTorch: Runtime error after hours of training
# Axiom.jl: Compile error in milliseconds

@axiom BrokenModel begin
    input :: Tensor{Float32, (224, 224, 3)}
    features = input |> Conv(64, (3,3))
    output = features |> Dense(10)  # COMPILE ERROR!
    # "Shape mismatch: Conv output is (222,222,64), Dense expects vector"
end

Formal Verification

@axiom SafeClassifier begin
    # ...
    @ensure valid_probabilities(output)    # Runtime assertion
    @prove ∀x. sum(softmax(x)) == 1.0      # Mathematical proof
end

# Generate verification certificates
cert = verify(model) |> generate_certificate
save_certificate(cert, "fda_submission.cert")

PyTorch Compatibility

# Load any PyTorch model
model = from_pytorch("resnet50.pth")

# Get 2-3x speedup immediately
output = model(input)  # Uses Rust backend

# Add verification
result = verify(model, properties=[ValidProbabilities()])

High Performance

# Development: Julia backend
model = Sequential(Dense(784, 128, relu), Dense(128, 10))

# Production: Rust backend (2-3x faster)
prod_model = compile(model, backend=:rust, optimize=:aggressive)

Quick Start

Installation

using Pkg
Pkg.add("Axiom")
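
If Axiom is not yet in the General registry, it can presumably be installed straight from this repository instead (repo path taken from the project page; adjust to your fork as needed):

using Pkg
Pkg.add(url="https://github.com/hyperpolymath/Axiom.jl")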

Hello World

using Axiom

# Define a simple classifier
model = Sequential(
    Dense(784, 256, relu),
    Dense(256, 10),
    Softmax()
)

# Generate sample data
x = randn(Float32, 32, 784)

# Inference
predictions = model(x)

# Verify properties
@ensure all(sum(predictions, dims=2) .≈ 1.0)

With @axiom DSL

using Axiom

@axiom MNISTClassifier begin
    input :: Tensor{Float32, (:batch, 28, 28, 1)}
    output :: Probabilities(10)

    features = input |> Conv(32, (3,3)) |> ReLU |> MaxPool((2,2))
    features = features |> Conv(64, (3,3)) |> ReLU |> MaxPool((2,2))
    flat = features |> GlobalAvgPool() |> Flatten
    output = flat |> Dense(64, 10) |> Softmax

    @ensure valid_probabilities(output)
end

model = MNISTClassifier()
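
A minimal inference sketch, assuming the model accepts a batch-first Float32 array matching the input specification above (the batch size and random data are illustrative):

x = randn(Float32, 8, 28, 28, 1)            # batch of 8 MNIST-shaped inputs
probs = model(x)                            # shape-checked forward pass
@ensure all(sum(probs, dims=2) .≈ 1.0)      # each row is a valid probability vector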

Why Axiom.jl?

The Problem

ML models are deployed in critical applications:

  • Medical diagnosis

  • Autonomous vehicles

  • Financial systems

  • Criminal justice

Yet our tools allow bugs to slip through to production.

The Solution

Axiom.jl catches bugs before they cause harm:

Issue                   PyTorch          Axiom.jl
Shape mismatch          Runtime crash    Compile error
NaN in output           Silent failure   Detected/proven
Invalid probabilities   Undetected       Guaranteed impossible
Adversarial fragility   Unknown          Certifiable bounds
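
As a sketch of how these properties map onto the macros shown above (@ensure and valid_probabilities appear earlier in this README; the finiteness check uses standard Julia isfinite and is illustrative, not a confirmed built-in property):

@axiom AuditedClassifier begin
    # ... layers as above ...
    @ensure all(isfinite.(output))         # rules out NaN/Inf in the output
    @ensure valid_probabilities(output)    # entries ≥ 0 and rows sum to 1
end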

Documentation

Project Structure

Axiom.jl/
├── src/                 # Julia source
│   ├── Axiom.jl        # Main module
│   ├── types/          # Tensor type system
│   ├── layers/         # Neural network layers
│   ├── dsl/            # @axiom macro system
│   ├── verification/   # @ensure, @prove
│   ├── training/       # Optimizers, loss functions
│   └── backends/       # Julia/Rust backends
├── rust/               # Rust performance backend
│   ├── src/
│   │   ├── ops/       # Matrix, conv, activation ops
│   │   └── ffi.rs     # Julia FFI bindings
│   └── Cargo.toml
├── test/               # Test suite
├── examples/           # Example models
└── docs/               # Documentation & wiki

Roadmap

  • v0.1 - Core framework, DSL, verification basics

  • v0.2 - Full Rust backend, GPU support

  • v0.3 - Hugging Face integration, model zoo

  • v0.4 - Advanced proofs, SMT integration

  • v1.0 - Production ready, industry certifications

Contributing

We welcome contributions! See CONTRIBUTING.md.

  • Bug reports and feature requests

  • Documentation improvements

  • New layers and operations

  • Performance optimizations

  • Verification methods

Julia-First Verification

Axiom’s proof system is Julia-native by default. SMT solving runs through packages/SMTLib.jl with no Rust dependency. The Rust SMT runner is an optional backend you can enable for hardened subprocess control.

Julia-native example:

@prove ∃x. x > 0

Optional Rust runner (the export lines are shell commands; the @prove call is issued from the Julia session):

export AXIOM_SMT_RUNNER=rust
export AXIOM_SMT_LIB=/path/to/libaxiom_core.so
export AXIOM_SMT_SOLVER=z3
@prove ∃x. x > 0
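
The same configuration can also be set from within Julia before proving (ENV is standard Julia; the variable names are the ones listed above, and the library path is a placeholder):

ENV["AXIOM_SMT_RUNNER"] = "rust"
ENV["AXIOM_SMT_LIB"]    = "/path/to/libaxiom_core.so"
ENV["AXIOM_SMT_SOLVER"] = "z3"

@prove ∃x. x > 0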

License

Palimpsest-MPL-1.0 License - see LICENSE for details.

Acknowledgments

Axiom.jl builds on the shoulders of giants:

  • Julia - The language

  • Flux.jl - Inspiration for Julia ML

  • Rust - Performance backend

  • PyTorch - Ecosystem compatibility


The future of ML is verified.
