41 changes: 41 additions & 0 deletions flash/troubleshooting.mdx
@@ -420,6 +420,47 @@ Failed to deserialize result: [error]

3. **Check argument types**: Input arguments must also be serializable.

### Payload too large

Contributor note: Documents the `MAX_PAYLOAD_SIZE` (10 MB) limit and `DESERIALIZE_TIMEOUT_SECONDS` (30s) timeout enforced in `src/runpod_flash/runtime/serialization.py`. The new `PayloadTooLargeError` and `DeserializeTimeoutError` exceptions are defined in `src/runpod_flash/runtime/exceptions.py`.

**Error:**
```
Payload size X MB exceeds limit of 10.0 MB
```

**Cause:** The serialized argument exceeds the 10 MB limit. Flash uses base64 encoding, which expands data by approximately 33%, so roughly 7.5 MB of raw data becomes 10 MB when encoded.
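
You can estimate whether a payload will fit before sending it by measuring its encoded size locally. A minimal sketch, using the stdlib `pickle` module as a stand-in for `cloudpickle` (both expose the same `dumps` interface); the constant simply mirrors the documented 10 MB limit:

```python
import base64
import pickle  # stand-in for cloudpickle; both expose dumps()

MAX_PAYLOAD_SIZE = 10 * 1024 * 1024  # mirrors the documented 10 MB limit

def encoded_size(obj) -> int:
    """Size in bytes of obj after pickling and base64 encoding."""
    return len(base64.b64encode(pickle.dumps(obj)))

data = b"x" * 1_000_000  # 1 MB of raw bytes
size = encoded_size(data)
print(f"{size / 1e6:.2f} MB encoded, fits: {size <= MAX_PAYLOAD_SIZE}")
```

Note the ~33% expansion: 1 MB of raw bytes encodes to roughly 1.33 MB.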

**Solutions:**

1. **Use network volumes for large data**: Save large data to a network volume and pass the file path:
```python
@Endpoint(name="processor", gpu=GpuGroup.ANY, volume="vol_abc123")
async def process(file_path: str) -> dict:
import numpy as np
data = np.load(file_path) # Load from volume
return {"result": process_data(data)}
```

2. **Compress data before sending**: For data that must be passed directly, use compression:
```python
import gzip

compressed = gzip.compress(data.tobytes())  # e.g. raw bytes of a NumPy array
# Pass `compressed` instead of `data`; the endpoint restores it with
# gzip.decompress(compressed) (and np.frombuffer for arrays).
```

3. **Split large requests**: Break large datasets into smaller chunks and process them in multiple requests.
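
A chunking helper is easy to sketch. `chunks` below is a hypothetical helper, not part of the Flash SDK; the chunk size is application-specific and should be chosen so each serialized chunk stays under the limit:

```python
def chunks(items: list, chunk_size: int):
    """Yield successive chunk_size-sized slices of items."""
    for i in range(0, len(items), chunk_size):
        yield items[i:i + chunk_size]

# Submit one request per chunk, e.g.:
# results = [await process(chunk) for chunk in chunks(items, 1000)]
```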

### Deserialization timeout

**Error:**
```
Deserialization timed out after 30s
```

**Cause:** The deserialization process took longer than 30 seconds. This usually indicates malformed or corrupted serialized data that causes the unpickle operation to hang.

**Solution:** Verify your input data is properly serialized. If you're manually constructing payloads, ensure the data was serialized using `cloudpickle` and encoded with base64. The Flash SDK handles this automatically for programmatic calls.
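
As an illustration of the expected encoding, here is a round-trip sketch using the stdlib `pickle` module as a stand-in for `cloudpickle` (the `dumps`/`loads` interface is the same); the argument dictionary and file path are hypothetical:

```python
import base64
import pickle  # stand-in for cloudpickle; dumps/loads interface matches

# Serialize and base64-encode the arguments (hypothetical payload):
args = {"file_path": "/data/input.npy"}
payload = base64.b64encode(pickle.dumps(args)).decode("ascii")

# The worker reverses the process; a corrupt or truncated payload
# fails here immediately rather than hanging for the full 30s timeout.
restored = pickle.loads(base64.b64decode(payload))
assert restored == args
```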

### Circuit breaker open

**Error:**