Bring Your Own Storage (BYOS)
Store your emails and attachments in your own cloud storage. Maintain full control over your data with support for Amazon S3, Cloudflare R2, Azure Blob Storage, Google Cloud Storage, and other S3-compatible services.
Why Bring Your Own Storage?
Custom storage gives you complete ownership and control over your email data. Instead of relying on Mailhooks platform storage, your emails and attachments are stored directly in your own cloud storage bucket.
Data Sovereignty
Keep your data in your preferred region and jurisdiction. Meet compliance requirements like GDPR, HIPAA, or industry-specific regulations.
Cost Optimization
Leverage your existing cloud storage contracts and volume discounts. Use cost-effective options like Cloudflare R2 with zero egress fees.
Data Retention Control
Set your own lifecycle policies, backup schedules, and retention periods. Integrate with your existing data management workflows.
Direct Access
Access raw email files directly from your bucket for custom processing, archiving, or integration with other tools.
Supported Storage Providers
Mailhooks supports major cloud storage providers and any S3-compatible storage service.
Amazon S3 / S3-Compatible Storage
Connect to Amazon S3 or any S3-compatible storage service including Cloudflare R2, DigitalOcean Spaces, MinIO, Backblaze B2, and Wasabi.
Configuration Fields
| Field | Description | Required |
|---|---|---|
| bucket | Your S3 bucket name | Yes |
| region | AWS region (e.g., us-east-1) - for Amazon S3 | Yes (AWS) |
| endpoint | Custom endpoint URL - for S3-compatible services | Yes (Compatible) |
| accessKeyId | Access key ID for authentication | Yes |
| secretAccessKey | Secret access key for authentication | Yes |
| forcePathStyle | Use path-style URLs instead of virtual-hosted-style | No |
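As an example, here is how these fields might look for a Cloudflare R2 connection. This is a sketch: the field names follow the table above, while the bucket name, account ID, and credentials are placeholders.

```json
{
  "bucket": "my-email-bucket",
  "endpoint": "https://<account-id>.r2.cloudflarestorage.com",
  "accessKeyId": "<your-r2-access-key-id>",
  "secretAccessKey": "<your-r2-secret-access-key>",
  "forcePathStyle": true
}
```

For Amazon S3 itself, you would set `region` instead of `endpoint`.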
For Cloudflare R2, use an endpoint of the form https://<account-id>.r2.cloudflarestorage.com.
Azure Blob Storage
Store emails in Microsoft Azure Blob Storage with enterprise-grade security and global availability.
Configuration Fields
| Field | Description | Required |
|---|---|---|
| accountName | Azure Storage account name | Yes |
| containerName | Blob container name for storing emails | Yes |
| accountKey | Storage account access key | Yes |
Google Cloud Storage
Use Google Cloud Storage for seamless integration with your GCP infrastructure and BigQuery analytics.
Configuration Fields
| Field | Description | Required |
|---|---|---|
| projectId | Your GCP project ID | Yes |
| bucketName | GCS bucket name for storing emails | Yes |
| credentials | Service account JSON key file contents | Yes |
Storage Path Format
Emails and attachments are stored using organized, human-readable paths that make it easy to browse and manage your data directly in your storage bucket.
Email Storage Path
emails/{domain}/{year}/{month}/{day}/{emailId}.eml
Example:
emails/mail.example.com/2025/01/09/abc123def.eml
Attachment Storage Path
attachments/{domain}/{emailId}/{filename}
Example:
attachments/mail.example.com/abc123def/document.pdf
Use a path prefix (e.g., mailhooks/) to organize your bucket if you're using it for multiple purposes.
Attachment Storage
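The two path patterns above are simple enough to reproduce in code. The helpers below are an illustrative sketch (the function names are our own, not part of the SDK):

```typescript
// Build the storage key for a raw email (.eml) file.
// Pattern: emails/{domain}/{year}/{month}/{day}/{emailId}.eml
function emailStoragePath(domain: string, date: Date, emailId: string): string {
  const year = date.getUTCFullYear();
  const month = String(date.getUTCMonth() + 1).padStart(2, '0');
  const day = String(date.getUTCDate()).padStart(2, '0');
  return `emails/${domain}/${year}/${month}/${day}/${emailId}.eml`;
}

// Build the storage key for an attachment.
// Pattern: attachments/{domain}/{emailId}/{filename}
function attachmentStoragePath(domain: string, emailId: string, filename: string): string {
  return `attachments/${domain}/${emailId}/${filename}`;
}

console.log(emailStoragePath('mail.example.com', new Date(Date.UTC(2025, 0, 9)), 'abc123def'));
// emails/mail.example.com/2025/01/09/abc123def.eml
console.log(attachmentStoragePath('mail.example.com', 'abc123def', 'document.pdf'));
// attachments/mail.example.com/abc123def/document.pdf
```

This is useful, for example, when reconstructing keys for bulk downloads or lifecycle scripts that operate directly on your bucket.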
When an email contains attachments, they are stored separately in your bucket alongside the email. This allows for efficient storage and direct access to individual files without parsing the entire email.
How Attachments Are Stored
Each attachment is stored with its original filename in a folder specific to the email it belongs to. The API response includes storage paths for each attachment.
API Response with Attachments
{
"id": "abc123def",
"from": "[email protected]",
"subject": "Invoice Attached",
"storagePath": "emails/mail.example.com/2025/01/09/abc123def.eml",
"attachments": [
{
"id": "att_001",
"filename": "invoice.pdf",
"contentType": "application/pdf",
"size": 125432,
"storagePath": "attachments/mail.example.com/abc123def/invoice.pdf"
},
{
"id": "att_002",
"filename": "receipt.png",
"contentType": "image/png",
"size": 45678,
"storagePath": "attachments/mail.example.com/abc123def/receipt.png"
}
]
}
Downloading Attachments
You can download attachments directly from your storage bucket using the storage path, or use the Mailhooks SDK for convenience.
Using the Mailhooks SDK
import { Mailhooks } from '@mailhooks/sdk';
const mailhooks = new Mailhooks({ apiKey: 'mh_your_api_key' });
// Download attachment via Mailhooks API
const attachment = await mailhooks.emails.downloadAttachment(
'abc123def', // emailId
'att_001' // attachmentId
);
// Save to file
import fs from 'fs';
fs.writeFileSync(attachment.filename, Buffer.from(attachment.data));
Direct from S3 Bucket
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';
const s3 = new S3Client({
region: 'auto',
endpoint: 'https://xxx.r2.cloudflarestorage.com',
credentials: {
accessKeyId: process.env.R2_ACCESS_KEY_ID,
secretAccessKey: process.env.R2_SECRET_ACCESS_KEY,
},
});
// Download attachment directly from your bucket
const response = await s3.send(new GetObjectCommand({
Bucket: 'my-email-bucket',
Key: 'attachments/mail.example.com/abc123def/invoice.pdf',
}));
const fileBuffer = await response.Body.transformToByteArray();
Configuration Options
Path Prefix
Add a custom prefix to all storage paths. Useful when sharing a bucket with other applications or organizing by environment.
# Without prefix
emails/mail.example.com/2025/01/09/abc123.eml
# With prefix "mailhooks/"
mailhooks/emails/mail.example.com/2025/01/09/abc123.eml
Fallback Behavior
Configure what happens when your custom storage is temporarily unavailable.
Use Platform Storage (Default)
Emails are stored on Mailhooks platform storage if your custom storage fails. Ensures no email loss.
Reject Email
Emails are rejected with a temporary failure if your storage is unavailable. The sender's server will retry delivery.
Domain-Level Configuration
Configure different storage settings per domain. Useful for multi-tenant applications or when different domains have different compliance requirements.
Setting Up Custom Storage
Create a Storage Bucket
Create a bucket in your preferred cloud provider. Ensure it's configured with appropriate access controls.
Create Access Credentials
Generate API keys or service account credentials with write access to your bucket.
Configure in Dashboard
Navigate to Settings > Storage and enter your provider credentials.
Test Connection
Use the "Test Connection" button to verify your configuration before saving.
Accessing Storage Paths via API
When custom storage is configured, the email API response includes the storage path so you can access the raw email file directly from your bucket.
{
"id": "abc123def",
"from": "[email protected]",
"to": ["[email protected]"],
"subject": "Hello World",
"storagePath": "emails/mail.yourdomain.com/2025/01/09/abc123def.eml",
...
}
Use this path with your cloud provider's SDK to download the original .eml file for custom processing, archiving, or analysis.
Parsing Emails with the SDK
The Mailhooks SDK includes a parseEml utility for parsing raw .eml files stored in your custom storage. This is useful when you want to process emails directly from your bucket without going through the Mailhooks API.
Installing the SDK
npm install @mailhooks/sdk
The parseEml Function
The parseEml function takes raw EML content and returns a structured email object with all the parsed data.
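To make that shape concrete, here is a deliberately simplified, self-contained parser for plain-text EML content. It illustrates the kind of object parseEml produces; it is not the SDK's implementation (real EML parsing must also handle MIME multipart bodies, content-transfer encodings, and character sets):

```typescript
interface SimpleParsedEmail {
  from: string;
  to: string[];
  subject: string;
  body: string;
  headers: Record<string, string>;
}

// Simplified EML parsing: header block, blank line, plain-text body.
function parseEmlSketch(raw: string): SimpleParsedEmail {
  const [headerPart, ...bodyParts] = raw.split('\r\n\r\n');
  const headers: Record<string, string> = {};
  // Unfold continuation lines, then split each header on its first colon.
  for (const line of headerPart.replace(/\r\n[ \t]+/g, ' ').split('\r\n')) {
    const idx = line.indexOf(':');
    if (idx > 0) headers[line.slice(0, idx).toLowerCase()] = line.slice(idx + 1).trim();
  }
  return {
    from: headers['from'] ?? '',
    to: (headers['to'] ?? '').split(',').map((a) => a.trim()).filter(Boolean),
    subject: headers['subject'] ?? '',
    body: bodyParts.join('\r\n\r\n'),
    headers,
  };
}

const email = parseEmlSketch(
  'From: sender@example.com\r\nTo: inbox@mail.example.com\r\nSubject: Hello World\r\n\r\nHi there!'
);
console.log(email.subject); // Hello World
console.log(email.from);    // sender@example.com
```

In practice you would call the SDK's parseEml with the raw buffer downloaded from your bucket, as shown in the provider examples in this guide.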
Return Type
interface ParsedEmail {
from: string; // Sender email address
to: string[]; // Array of recipient addresses
subject: string; // Email subject line
body: string; // Plain text body
html?: string; // HTML body (if available)
attachments: Array<{ // Attachment metadata
filename: string;
contentType: string;
size: number;
}>;
headers: Record<string, string>; // Email headers
date?: Date; // Date the email was sent
}
Complete Example: Webhook + Custom Storage
Here's a complete example showing how to handle a webhook notification and parse the email directly from your S3 bucket.
import express from 'express';
import { parseEml } from '@mailhooks/sdk';
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';
const app = express();
app.use(express.json());
// Configure S3 client for your storage (e.g., Cloudflare R2)
const s3 = new S3Client({
region: 'auto',
endpoint: process.env.R2_ENDPOINT,
credentials: {
accessKeyId: process.env.R2_ACCESS_KEY_ID!,
secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
},
});
app.post('/api/webhooks/mailhooks', async (req, res) => {
const { event, data } = req.body;
if (event === 'email.received') {
// Check if email uses custom storage
if (data.usesCustomStorage && data.storagePath) {
// Fetch the raw EML from your bucket
const response = await s3.send(new GetObjectCommand({
Bucket: process.env.R2_BUCKET!,
Key: data.storagePath,
}));
// Parse the EML content
const emlBuffer = Buffer.from(
await response.Body!.transformToByteArray()
);
const email = await parseEml(emlBuffer);
// Now you have full access to the parsed email
console.log('From:', email.from);
console.log('Subject:', email.subject);
console.log('Body:', email.body);
console.log('HTML:', email.html);
console.log('Attachments:', email.attachments.length);
console.log('Headers:', email.headers);
// Process the email...
await processEmail(email);
}
}
res.status(200).send('OK');
});
async function processEmail(email: ParsedEmail) {
// Your custom processing logic here
// e.g., extract data, store in database, trigger workflows
}
Fetching and Parsing from Azure Blob Storage
import { BlobServiceClient } from '@azure/storage-blob';
import { parseEml } from '@mailhooks/sdk';
const blobService = BlobServiceClient.fromConnectionString(
process.env.AZURE_STORAGE_CONNECTION_STRING!
);
const containerClient = blobService.getContainerClient('emails');
async function getEmailFromAzure(storagePath: string) {
const blobClient = containerClient.getBlobClient(storagePath);
const downloadResponse = await blobClient.download();
// Convert stream to buffer
const chunks: Buffer[] = [];
for await (const chunk of downloadResponse.readableStreamBody!) {
chunks.push(Buffer.from(chunk));
}
const emlBuffer = Buffer.concat(chunks);
// Parse the email
return await parseEml(emlBuffer);
}
Fetching and Parsing from Google Cloud Storage
import { Storage } from '@google-cloud/storage';
import { parseEml } from '@mailhooks/sdk';
const storage = new Storage({
projectId: process.env.GCP_PROJECT_ID,
keyFilename: process.env.GCP_KEY_FILE,
});
const bucket = storage.bucket(process.env.GCS_BUCKET!);
async function getEmailFromGCS(storagePath: string) {
const file = bucket.file(storagePath);
const [contents] = await file.download();
// Parse the email
return await parseEml(contents);
}
Use Cases
Compliance & Archiving
Meet regulatory requirements by storing emails in your own infrastructure with custom retention policies and audit trails.
Custom Processing Pipeline
Trigger cloud functions or batch processing jobs when emails arrive in your bucket for custom ML/AI analysis.
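On Amazon S3, for example, such a pipeline can be wired up with bucket event notifications. The sketch below shows the shape of a PutBucketNotificationConfiguration payload that invokes a Lambda function whenever a new email object lands under the emails/ prefix; the Lambda ARN is a placeholder.

```json
{
  "LambdaFunctionConfigurations": [
    {
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:process-inbound-email",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {
        "Key": {
          "FilterRules": [{ "Name": "prefix", "Value": "emails/" }]
        }
      }
    }
  ]
}
```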
Multi-Region Deployment
Store emails in region-specific buckets to meet data residency requirements and reduce latency for global applications.
Cost Optimization
Use storage providers with favorable pricing like Cloudflare R2 (zero egress) or apply lifecycle policies to move old emails to cold storage.
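On Amazon S3, such a policy is expressed as a lifecycle configuration. The sketch below (assumed rule ID and retention windows) transitions emails to Glacier after 90 days and deletes them after a year:

```json
{
  "Rules": [
    {
      "ID": "archive-old-emails",
      "Status": "Enabled",
      "Filter": { "Prefix": "emails/" },
      "Transitions": [{ "Days": 90, "StorageClass": "GLACIER" }],
      "Expiration": { "Days": 365 }
    }
  ]
}
```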