Batch Processing Strategies
When you need to submit thousands of credit records at once, batch processing lets you upload large portfolios efficiently while handling rate limits, failures, and progress tracking.
1. When to Use Batch Processing
Batch processing is the preferred approach for large-scale submissions. Use it when:
Large Portfolios
Submitting more than a few hundred records at a time. The single-record API becomes impractical above this threshold.
Monthly Submissions
Regular monthly reporting cycles where all active accounts need to be submitted together within a reporting window.
Initial Data Load
Onboarding an existing portfolio for the first time, which may involve tens of thousands of records.
Corrections & Resubmissions
Resubmitting a group of corrected records after receiving CRA rejection notices.
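When sizing a submission plan, a rough estimate of upload time helps decide between approaches. The sketch below estimates how long a synchronous chunked upload would take under the rate limits described in the next section; the helper name and constants are illustrative, not part of the API.

```javascript
// Rough wall-clock estimate for a synchronous chunked upload,
// based on the documented limits: 1,000 records/request, 10 requests/min.
const RECORDS_PER_REQUEST = 1000;
const REQUESTS_PER_MINUTE = 10;

function estimateUploadMinutes(recordCount) {
  const requests = Math.ceil(recordCount / RECORDS_PER_REQUEST);
  return Math.ceil(requests / REQUESTS_PER_MINUTE);
}

// A 25,000-record portfolio needs 25 requests, so about 3 minutes.
console.log(estimateUploadMinutes(25000));
```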
2. Rate Limits and Throughput
The API enforces rate limits to ensure platform stability. Plan your batch strategy around these constraints:
Rate Limits:
┌──────────────────────┬────────────────────────────────┐
│ Limit                │ Value                          │
├──────────────────────┼────────────────────────────────┤
│ Requests per minute  │ 10 requests/min                │
│ Records per request  │ Up to 1,000 records/request    │
│ Max file size        │ 10 MB per request              │
│ Effective throughput │ ~10,000 records/min            │
│ Hourly throughput    │ ~600,000 records/hour          │
└──────────────────────┴────────────────────────────────┘

Note: If you exceed the rate limit, the API returns a 429 Too Many Requests response with a Retry-After header indicating how many seconds to wait before retrying.
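The Retry-After behavior above can be honored with a small backoff wrapper. This is a minimal sketch; `submitChunkWithBackoff` and `retryDelayMs` are illustrative helper names, not part of the API.

```javascript
// Sketch: honor a 429's Retry-After header before resubmitting a chunk.
const DEFAULT_BACKOFF_MS = 6500;

// Pure helper: derive the wait time from a Retry-After header value (seconds).
// Falls back to a fixed delay when the header is missing or unparseable.
function retryDelayMs(retryAfterHeader, fallbackMs = DEFAULT_BACKOFF_MS) {
  const seconds = Number(retryAfterHeader);
  return Number.isFinite(seconds) && seconds > 0 ? seconds * 1000 : fallbackMs;
}

async function submitChunkWithBackoff(chunk, apiKey, maxAttempts = 5) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const response = await fetch('https://api.metro2.switchlabs.dev/v1/records', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${apiKey}`
      },
      body: JSON.stringify({ records: chunk })
    });
    if (response.status !== 429) return response.json();
    // Rate limited: wait as instructed by the server, then retry.
    const waitMs = retryDelayMs(response.headers.get('Retry-After'));
    console.warn(`429 received, waiting ${waitMs} ms (attempt ${attempt}/${maxAttempts})`);
    await new Promise(resolve => setTimeout(resolve, waitMs));
  }
  throw new Error('Rate limit retries exhausted');
}
```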
3. Chunking Strategies
Split your records into chunks that fit within the 1,000-record-per-request limit. Here is a practical implementation:
// Split records into chunks and submit with rate limiting
const CHUNK_SIZE = 1000;
const DELAY_MS = 6500; // Stay safely under 10 req/min

function chunkArray(records, size) {
  const chunks = [];
  for (let i = 0; i < records.length; i += size) {
    chunks.push(records.slice(i, i + size));
  }
  return chunks;
}

async function submitBatch(records, apiKey) {
  const chunks = chunkArray(records, CHUNK_SIZE);
  const results = [];
  console.log(`Submitting ${records.length} records in ${chunks.length} chunks`);
  for (let i = 0; i < chunks.length; i++) {
    console.log(`Chunk ${i + 1}/${chunks.length} (${chunks[i].length} records)`);
    const response = await fetch('https://api.metro2.switchlabs.dev/v1/records', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${apiKey}`
      },
      body: JSON.stringify({ records: chunks[i] })
    });
    const result = await response.json();
    results.push({
      chunk: i + 1,
      status: response.status,
      accepted: result.accepted || 0,
      rejected: result.rejected || 0,
      errors: result.errors || []
    });
    // Wait between chunks to respect rate limits
    if (i < chunks.length - 1) {
      await new Promise(resolve => setTimeout(resolve, DELAY_MS));
    }
  }
  return results;
}
// Usage
const allRecords = loadRecordsFromDatabase();
const results = await submitBatch(allRecords, process.env.METRO2_API_KEY);
const totalAccepted = results.reduce((sum, r) => sum + r.accepted, 0);
const totalRejected = results.reduce((sum, r) => sum + r.rejected, 0);
console.log(`Done: ${totalAccepted} accepted, ${totalRejected} rejected`);

4. Handling Partial Failures
When a batch request is partially successful, some records are accepted while others are rejected. The API returns a per-record breakdown; each entry in errors identifies the failing record by its zero-based index within the submitted request:
// Example response with partial failures
{
  "batchId": "batch_abc123",
  "status": "completed_with_errors",
  "summary": {
    "total": 1000,
    "accepted": 987,
    "rejected": 13
  },
  "errors": [
    {
      "index": 42,
      "accountNumber": "ACCT-0042",
      "code": "INVALID_SSN",
      "message": "SSN format is invalid"
    },
    {
      "index": 156,
      "accountNumber": "ACCT-0156",
      "code": "MISSING_REQUIRED_FIELD",
      "message": "dateOpened is required"
    }
  ]
}

// Collect failed records and retry after fixing
async function submitWithRetry(records, apiKey, maxRetries = 2) {
  let remaining = [...records];
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const chunks = chunkArray(remaining, CHUNK_SIZE);
    const failedRecords = [];
    for (const chunk of chunks) {
      const response = await fetch('https://api.metro2.switchlabs.dev/v1/records', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'Authorization': `Bearer ${apiKey}`
        },
        body: JSON.stringify({ records: chunk })
      });
      const result = await response.json();
      // Collect records that failed with retryable errors
      if (result.errors) {
        for (const error of result.errors) {
          if (isRetryable(error.code)) {
            failedRecords.push(chunk[error.index]);
          } else {
            console.error(`Permanent failure: ${error.accountNumber} - ${error.message}`);
          }
        }
      }
      await new Promise(resolve => setTimeout(resolve, DELAY_MS));
    }
    if (failedRecords.length === 0) return []; // Everything accepted
    remaining = failedRecords;
    console.log(`Retry ${attempt + 1}: ${remaining.length} records to resubmit`);
  }
  return remaining; // Records still failing after all retries
}

function isRetryable(code) {
  const retryableCodes = ['RATE_LIMITED', 'TIMEOUT', 'INTERNAL_ERROR'];
  return retryableCodes.includes(code);
}

5. Async Processing with Status Polling
For very large portfolios, use the async batch endpoint. You submit all records at once and poll for completion:
// Step 1: Create an async batch
const createResponse = await fetch('https://api.metro2.switchlabs.dev/v1/batches', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${apiKey}`
  },
  body: JSON.stringify({
    records: allRecords, // Can exceed 1,000 records
    callbackUrl: 'https://your-app.com/webhooks/batch-complete' // Optional
  })
});
const { batchId } = await createResponse.json();
console.log(`Batch created: ${batchId}`);

// Step 2: Poll for status
async function pollBatchStatus(batchId, apiKey) {
  while (true) {
    const statusResponse = await fetch(
      `https://api.metro2.switchlabs.dev/v1/batches/${batchId}`,
      { headers: { 'Authorization': `Bearer ${apiKey}` } }
    );
    const status = await statusResponse.json();
    console.log(`Status: ${status.state} (${status.progress}%)`);
    if (status.state === 'completed' || status.state === 'failed') {
      return status;
    }
    // Poll every 10 seconds
    await new Promise(resolve => setTimeout(resolve, 10000));
  }
}

const finalStatus = await pollBatchStatus(batchId, apiKey);
console.log(`Batch complete: ${finalStatus.summary.accepted} accepted`);

Batch Status Values
queued
Batch is waiting to be processed
processing
Records are being validated and submitted
completed
All records have been processed (check summary for details)
failed
Batch failed to process (e.g., invalid format, auth error)

6. Monitoring Batch Progress via Webhooks
Instead of polling, you can configure a webhook callback URL when creating the batch. The API will send a POST request to your endpoint when the batch completes:
// Webhook payload sent to your callbackUrl
{
  "event": "batch.completed",
  "batchId": "batch_abc123",
  "timestamp": "2025-03-15T14:30:00Z",
  "summary": {
    "total": 15000,
    "accepted": 14892,
    "rejected": 108,
    "processingTimeMs": 45230
  },
  "errorsUrl": "https://api.metro2.switchlabs.dev/v1/batches/batch_abc123/errors"
}

// Your webhook handler
app.post('/webhooks/batch-complete', (req, res) => {
  const { batchId, summary } = req.body;
  console.log(`Batch ${batchId} complete:`);
  console.log(`  Accepted: ${summary.accepted}`);
  console.log(`  Rejected: ${summary.rejected}`);
  if (summary.rejected > 0) {
    // Fetch and process error details
    fetchAndHandleErrors(req.body.errorsUrl);
  }
  res.status(200).send('OK');
});

Tip
For portfolios over 5,000 records, use async batch processing for best performance. The async endpoint handles chunking internally and provides progress tracking, so you do not need to manage chunks yourself.
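The guideline above can be sketched as a simple dispatcher. Here, submitBatch and pollBatchStatus refer to the functions defined earlier in this guide, the 5,000-record threshold follows the recommendation in the tip, and the helper names are illustrative, not part of the API.

```javascript
// Route a portfolio to sync chunked submission or the async batch
// endpoint, based on the 5,000-record guideline.
const ASYNC_THRESHOLD = 5000;

function chooseStrategy(recordCount) {
  return recordCount > ASYNC_THRESHOLD ? 'async' : 'sync';
}

async function submitPortfolio(records, apiKey) {
  if (chooseStrategy(records.length) === 'async') {
    // Async path: one request, server-side chunking, then poll.
    const res = await fetch('https://api.metro2.switchlabs.dev/v1/batches', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${apiKey}`
      },
      body: JSON.stringify({ records })
    });
    const { batchId } = await res.json();
    return pollBatchStatus(batchId, apiKey); // Defined in section 5
  }
  // Sync path: client-side chunking with rate limiting.
  return submitBatch(records, apiKey); // Defined in section 3
}
```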
Related Resources
API Reference
- POST /v1/records — Synchronous submission
- POST /v1/batches — Async batch submission
- GET /v1/batches/:id — Batch status polling