A high-performance, robust tool to convert MySQL dump files to PostgreSQL-compatible format.
- High Performance: Multi-threaded processing with optimized streaming for large files
- Comprehensive Conversion: Handles data types, functions, constraints, and indexes
- Flexible Output: Write to file or stream to stdout
- Validation: Optional PostgreSQL syntax validation
- Robust Error Handling: Graceful handling of malformed SQL with detailed reporting
- Memory Efficient: Processes files in configurable chunks to handle large dumps
```bash
cargo install mysql2postgres
```

Or build from source:

```bash
git clone https://github.com/yourusername/mysql2postgres.git
cd mysql2postgres
cargo build --release
```

```bash
# Convert to file
mysql2postgres input.sql output.sql

# Stream to stdout
mysql2postgres input.sql --stdout

# Pipe to PostgreSQL
mysql2postgres dump.sql --stdout | psql -d mydatabase

# With verbose logging
mysql2postgres input.sql output.sql --verbose

# Skip validation for faster processing
mysql2postgres input.sql output.sql --no-validation
```

| MySQL Type | PostgreSQL Type | Notes |
|---|---|---|
| TINYINT | SMALLINT | |
| INT/INTEGER | INTEGER | |
| BIGINT | BIGINT | |
| AUTO_INCREMENT | SERIAL/BIGSERIAL | Based on integer type |
| VARCHAR(n) | VARCHAR(n) | |
| TEXT/LONGTEXT | TEXT | |
| DATETIME | TIMESTAMP | |
| JSON | JSONB | For better performance |
| ENUM | VARCHAR(255) | With comment |
| SET | TEXT | With comment |
| BLOB/BINARY | BYTEA | |
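The mappings above amount to a simple lookup. As an illustrative sketch only (`map_column_type` is a hypothetical helper, not part of the mysql2postgres API):

```rust
// Hypothetical helper (not the tool's actual code): map a bare MySQL type
// name to its PostgreSQL equivalent, following the table above.
fn map_column_type(mysql_type: &str) -> String {
    match mysql_type.to_ascii_uppercase().as_str() {
        "TINYINT" => "SMALLINT".to_string(),
        "INT" | "INTEGER" => "INTEGER".to_string(),
        "DATETIME" => "TIMESTAMP".to_string(),
        "TEXT" | "LONGTEXT" => "TEXT".to_string(),
        "JSON" => "JSONB".to_string(),
        "BLOB" | "BINARY" => "BYTEA".to_string(),
        "ENUM" => "VARCHAR(255)".to_string(),
        "SET" => "TEXT".to_string(),
        // Types that already match (BIGINT, VARCHAR(n), ...) pass through,
        // normalized to upper case.
        other => other.to_string(),
    }
}

fn main() {
    println!("{}", map_column_type("datetime")); // TIMESTAMP
    println!("{}", map_column_type("json"));     // JSONB
}
```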
| MySQL Function | PostgreSQL Equivalent |
|---|---|
| NOW() | NOW() |
| CURDATE() | CURRENT_DATE |
| CURTIME() | CURRENT_TIME |
| UNIX_TIMESTAMP() | EXTRACT(EPOCH FROM NOW()) |
| IFNULL(a,b) | COALESCE(a,b) |
| IF(cond,a,b) | CASE WHEN cond THEN a ELSE b END |
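Most of these mappings are plain renames with identical argument shapes, so a substitution pass covers them; `IF(cond,a,b)` is the exception, since it must be restructured into a `CASE` expression. A minimal sketch (illustrative, not the tool's actual code):

```rust
// Illustrative only: IFNULL and COALESCE take the same arguments, so a
// prefix rename suffices. IF(cond,a,b) cannot be handled this way; it
// needs real parsing to emit CASE WHEN cond THEN a ELSE b END.
fn convert_ifnull(sql: &str) -> String {
    sql.replace("IFNULL(", "COALESCE(")
}

fn main() {
    println!("{}", convert_ifnull("SELECT IFNULL(name, 'n/a') FROM users;"));
    // SELECT COALESCE(name, 'n/a') FROM users;
}
```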
- Backtick Identifiers: Converted to double quotes
- ENGINE Specifications: Removed (InnoDB, MyISAM, etc.)
- Character Set/Collation: Removed or converted
- INSERT IGNORE: Converted to plain INSERT
- ON DUPLICATE KEY UPDATE: Removed
- Index Definitions: Converted to comments with suggestions
- Zero Dates: Converted to valid PostgreSQL dates
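Two of the rewrites above, backtick conversion and ENGINE stripping, can be sketched with plain string operations (a naive sketch; the real converter must be token-aware so backticks inside string literals are left alone):

```rust
// Naive sketch of two of the conversions above; not the tool's actual code.
fn convert_create_line(line: &str) -> String {
    // Backtick identifiers become double-quoted identifiers.
    let line = line.replace('`', "\"");
    // Drop a trailing ENGINE specification (ENGINE=InnoDB, ENGINE=MyISAM, ...).
    match line.find("ENGINE=") {
        Some(pos) => format!("{};", line[..pos].trim_end()),
        None => line,
    }
}

fn main() {
    println!("{}", convert_create_line("CREATE TABLE `users` (id INT) ENGINE=InnoDB;"));
    // CREATE TABLE "users" (id INT);
}
```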
```text
mysql2postgres [OPTIONS] <INPUT_FILE> [OUTPUT_FILE]

Arguments:
  <INPUT_FILE>   Path to MySQL dump file
  [OUTPUT_FILE]  Path to output PostgreSQL dump file (optional when using --stdout)

Options:
      --chunk-size <CHUNK_SIZE>  Chunk size for reading large files [default: 8192]
      --test                     Run test suite
  -v, --verbose                  Enable verbose logging
      --no-validation            Skip PostgreSQL output validation
      --stdout                   Output the converted SQL directly to stdout
  -h, --help                     Print help
  -V, --version                  Print version
```
The tool is designed for high performance:
- Streaming Processing: Memory usage remains constant regardless of file size
- Parallel Processing: CPU-intensive conversions use all available cores
- Optimized Patterns: Pre-compiled regex patterns and fast string matching
- Efficient I/O: Large buffers and batch processing for optimal throughput
On a modern machine, typical performance:
- ~100MB/s for simple dumps (mostly INSERT statements)
- ~50MB/s for complex schemas with many data type conversions
- ~1M lines/minute for mixed content
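The streaming model can be sketched as a line-by-line pass through a fixed-size read buffer, so memory use is bounded by the buffer rather than the file (a sketch under assumed behavior; `process_line` is a hypothetical stand-in for the converter's per-line work):

```rust
use std::fs::File;
use std::io::{BufRead, BufReader};

// Hypothetical stand-in for the converter's per-line work.
fn process_line(line: &str) -> String {
    line.replace('`', "\"")
}

// Stream a dump through a fixed-size buffer: memory stays constant
// regardless of file size, matching the design described above.
fn stream_convert(path: &str, chunk_size: usize) -> std::io::Result<u64> {
    let reader = BufReader::with_capacity(chunk_size, File::open(path)?);
    let mut lines = 0u64;
    for line in reader.lines() {
        let _converted = process_line(&line?);
        lines += 1;
    }
    Ok(lines)
}
```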
```rust
use mysql2postgres::{MySQLToPostgreSQLConverter, ConversionStats};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Convert to file
    let converter = MySQLToPostgreSQLConverter::new("input.sql", "output.sql", 8192)?;
    let _stats: ConversionStats = converter.convert()?;

    // Convert to stdout
    let converter = MySQLToPostgreSQLConverter::new_with_stdout("input.sql", 8192)?;
    let stats = converter.convert()?;
    println!("Converted {} tables", stats.tables_created);
    Ok(())
}
```

The tool provides comprehensive error reporting:
- Parse Errors: Line numbers and context for malformed SQL
- Conversion Warnings: Non-critical issues that were handled
- Validation Errors: PostgreSQL syntax issues (when validation enabled)
- I/O Errors: File access and write permissions
Run the full test suite:
```bash
cargo test
```

Run benchmarks:

```bash
cargo bench
```

Run with verbose output:

```bash
cargo test -- --nocapture
```

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Add tests for new functionality
- Ensure all tests pass (`cargo test`)
- Run formatting (`cargo fmt`)
- Run linting (`cargo clippy`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
```bash
git clone https://github.com/yourusername/mysql2postgres.git
cd mysql2postgres

# Install development dependencies
cargo install cargo-watch cargo-edit

# Run tests in watch mode
cargo watch -x test

# Run with file watching
cargo watch -x 'run -- sample_mysql.sql --stdout'
```

- Complex Triggers: Not supported; will be commented out
- Stored Procedures: Not converted, require manual translation
- Advanced MySQL Features: Some features may need manual adjustment
- Character Set Edge Cases: Some complex encoding scenarios may need review
This project is licensed under the MIT License - see the LICENSE file for details.
See CHANGELOG.md for detailed version history.
- Bug Reports: GitHub Issues
- Feature Requests: GitHub Discussions
- Documentation: docs.rs
- Inspired by various MySQL to PostgreSQL migration tools
- Built with the excellent Rust ecosystem libraries
- Thanks to all contributors and users providing feedback