# 📸 React Native Live Document Edge Detection
A high-performance, real-time document edge detection and crop library for React Native (iOS & Android).
| iOS (Live Detection) | Android (Live Detection) |
|---|---|
| ![]() | ![]() |

| iOS (Manual Crop) | Android (Manual Crop) |
|---|---|
| ![]() | ![]() |
## Features

- Real-time Edge Detection: Detects document boundaries instantly from the camera feed.
- Cross-Platform: Fully supported on iOS and Android.
- High Quality Capture: Captures high-resolution images of the document.
- Perspective Correction: Automatically crops and warps the detected document to a flat rectangle.
- Customizable UI: Customize the overlay color, stroke width, and more.
- TypeScript Support: First-class TypeScript definitions included.
## Installation

```sh
npm install react-native-live-detect-edges
# or
yarn add react-native-live-detect-edges
```

### iOS

Don't forget to install the pods:

```sh
cd ios && pod install
```

Add the following key to your `Info.plist` to request camera permission:

```xml
<key>NSCameraUsageDescription</key>
<string>We need access to your camera to scan documents.</string>
```

### Android

Ensure you have the camera permission in `AndroidManifest.xml` (the library adds it, but your app must request it at runtime):

```xml
<uses-permission android:name="android.permission.CAMERA" />
```

## Usage

### `LiveDetectEdgesView`

This component renders the camera preview with the live edge detection overlay.
```tsx
import { LiveDetectEdgesView } from 'react-native-live-detect-edges';

<LiveDetectEdgesView
  style={{ flex: 1 }}
  overlayColor="rgba(0, 255, 0, 0.5)"
  overlayStrokeWidth={4}
/>;
```

### Capturing a photo

Use the `takePhoto` method to capture the current document. It returns the full image and the detected corner points.
```tsx
import { takePhoto } from 'react-native-live-detect-edges';

const handleCapture = async () => {
  try {
    const result = await takePhoto();
    console.log('Original Image:', result.originalImage.uri);
    console.log('Cropped Image:', result.image.uri); // Auto-cropped if detected
  } catch (error) {
    console.error('Capture failed:', error);
  }
};
```

### Manual cropping

If you want to manually adjust the crop points (e.g., in a custom UI), use `cropImage`.
```tsx
import { cropImage } from 'react-native-live-detect-edges';

const result = await cropImage({
  imageUri: 'file:///path/to/image.jpg',
  quad: {
    topLeft: { x: 100, y: 100 },
    topRight: { x: 400, y: 100 },
    bottomRight: { x: 400, y: 500 },
    bottomLeft: { x: 100, y: 500 },
  },
});

console.log('New Cropped URI:', result.uri);
```

To obtain the `quad` (`Quadrilateral`) coordinates for manual cropping, you typically need a UI that allows users to drag corner points over the image.
We provide a full example of such a UI using react-native-gesture-handler and react-native-reanimated in the example app.
👉 Check out the implementation here: `example/src/screens/CropScreen.tsx`
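One detail such a UI has to handle: the corners the user drags live in view (screen) coordinates, while `cropImage` expects coordinates in the image's own pixel space. Below is a minimal sketch of that conversion, assuming the image is displayed letterboxed ("contain" behavior: centered, aspect ratio preserved); the helper names and `Point`/`Quadrilateral` shapes are illustrative, mirroring the examples above rather than the library's exported types.

```typescript
type Point = { x: number; y: number };
type Quadrilateral = {
  topLeft: Point;
  topRight: Point;
  bottomRight: Point;
  bottomLeft: Point;
};
type Size = { width: number; height: number };

// Map a point from the displayed view's coordinates to the source
// image's pixel coordinates, assuming "contain"-style letterboxing
// (image centered in the view, aspect ratio preserved).
function viewToImage(p: Point, view: Size, image: Size): Point {
  const scale = Math.min(view.width / image.width, view.height / image.height);
  const offsetX = (view.width - image.width * scale) / 2;
  const offsetY = (view.height - image.height * scale) / 2;
  return {
    x: (p.x - offsetX) / scale,
    y: (p.y - offsetY) / scale,
  };
}

// Convert all four dragged corners in one go.
function quadViewToImage(quad: Quadrilateral, view: Size, image: Size): Quadrilateral {
  return {
    topLeft: viewToImage(quad.topLeft, view, image),
    topRight: viewToImage(quad.topRight, view, image),
    bottomRight: viewToImage(quad.bottomRight, view, image),
    bottomLeft: viewToImage(quad.bottomLeft, view, image),
  };
}
```

If your crop screen renders the image with a different resize mode (e.g. "cover"), the scale and offsets change accordingly, but the same inverse-transform idea applies.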
## Props

| Prop | Type | Default | Description |
|---|---|---|---|
| `overlayColor` | `string` | `"rgba(0, 255, 0, 0.5)"` | The color of the edge detection overlay (stroke). |
| `overlayFillColor` | `string` | `undefined` | The fill color of the detected area. Defaults to a semi-transparent version of `overlayColor` if not specified. |
| `overlayStrokeWidth` | `number` | `4` | The thickness of the detection lines. |
| `style` | `ViewStyle` | - | Standard React Native style prop. |
## Methods

### `takePhoto()`

Captures the current frame, attempts to detect edges, and saves both the original and a cropped version.

Returns:

- `originalImage`: `{ uri, width, height }`
- `image`: `{ uri, width, height }` (the auto-cropped result)
- `detectedPoints`: `Quadrilateral` (corner points in image coordinates)
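The documented result shape can be modeled in TypeScript roughly as follows (the type names here are illustrative, not necessarily the library's exported identifiers). A small helper then shows how a capture result maps onto `cropImage`-style params, e.g. to re-crop the original after the user has adjusted the detected corners:

```typescript
type Point = { x: number; y: number };

type Quadrilateral = {
  topLeft: Point;
  topRight: Point;
  bottomRight: Point;
  bottomLeft: Point;
};

type CapturedImage = { uri: string; width: number; height: number };

// Shape of the takePhoto() result as documented above.
type CaptureResult = {
  originalImage: CapturedImage;
  image: CapturedImage;
  detectedPoints: Quadrilateral;
};

// Build cropImage() params from a capture result: crop the
// original image using the detected (or user-edited) corners.
function toCropParams(result: CaptureResult): { imageUri: string; quad: Quadrilateral } {
  return {
    imageUri: result.originalImage.uri,
    quad: result.detectedPoints,
  };
}
```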
### `cropImage(options)`

Performs a perspective transform on an image given specific corner points.

Params:

- `imageUri`: `string` (local file path)
- `quad`: `Quadrilateral` (`topLeft`, `topRight`, `bottomRight`, `bottomLeft`)
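When the quad comes from a user-editable UI, it can be worth sanity-checking it before calling `cropImage`: a dragged corner can easily produce a self-intersecting shape that has no sensible perspective transform. A hypothetical validator (not part of the library's API) that accepts only convex, non-degenerate quads by checking that every corner turns the same way around the perimeter:

```typescript
type Point = { x: number; y: number };
type Quadrilateral = {
  topLeft: Point;
  topRight: Point;
  bottomRight: Point;
  bottomLeft: Point;
};

// 2D cross product of (b - a) and (c - b); its sign tells whether
// the turn at b is clockwise or counter-clockwise.
function cross(a: Point, b: Point, c: Point): number {
  return (b.x - a.x) * (c.y - b.y) - (b.y - a.y) * (c.x - b.x);
}

// A quad is convex (and not self-intersecting) when all four turns
// along its perimeter have the same sign. Collinear corners make
// the quad degenerate, so an all-zero result is rejected too.
function isConvexQuad(q: Quadrilateral): boolean {
  const pts = [q.topLeft, q.topRight, q.bottomRight, q.bottomLeft];
  let sign = 0;
  for (let i = 0; i < 4; i++) {
    const c = cross(pts[i], pts[(i + 1) % 4], pts[(i + 2) % 4]);
    if (c !== 0) {
      if (sign === 0) sign = Math.sign(c);
      else if (Math.sign(c) !== sign) return false;
    }
  }
  return sign !== 0;
}
```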
## Contributing

See the contributing guide to learn how to contribute to the repository and the development workflow.
## Acknowledgements

Special thanks to the authors of these amazing libraries:
- Android: Big thanks to pynicolas for FairScan. This library provided the core inspiration and logic for the Android implementation.
- iOS: Huge shoutout to the team at WeTransfer and all contributors of WeScan for building such a robust foundation.
## License

MIT
Made with create-react-native-library



