
    Batch Processing

    The current public batch model is `submit-metro2-data` plus `batch-status`. There is no separate public `/api/v1/batches` or `/api/v1/records` surface.

    Current Pattern

    1. Chunk your records locally into manageable request sizes.
    2. POST each chunk to `/api/v1/submit-metro2-data`.
    3. Capture the returned `batch_id`.
    4. Poll `/api/v1/batch-status/{id}` for the stored result.

    Rate Limits

    • submit-metro2-data: 300 requests / 60 seconds
    • A 429 response includes retry_after.
    • There is no documented maximum number of records per request, so keep chunk sizes conservative and test with your own data.
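    Given the 300 requests / 60 seconds limit, proactive client-side pacing can avoid most 429s before they happen. The helper below is a sketch, not part of the API: the 200 ms spacing is simply derived from the published limit.

    ```javascript
    // Minimum spacing that stays under 300 requests per 60 seconds:
    // 60,000 ms / 300 = 200 ms between requests.
    const MIN_INTERVAL_MS = 60_000 / 300;

    let lastRequestAt = 0;

    // Await this before each submit so requests are never sent
    // faster than the published rate limit allows.
    async function throttle() {
      const now = Date.now();
      const wait = Math.max(0, lastRequestAt + MIN_INTERVAL_MS - now);
      if (wait > 0) {
        await new Promise((resolve) => setTimeout(resolve, wait));
      }
      lastRequestAt = Date.now();
    }
    ```

    Even with pacing, keep the 429 handling: other clients sharing the same key still count against the window.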

    Example

    const CHUNK_SIZE = 500;
    
    function chunk(records, size) {
      const out = [];
      for (let i = 0; i < records.length; i += size) {
        out.push(records.slice(i, i + size));
      }
      return out;
    }
    
    for (const batch of chunk(records, CHUNK_SIZE)) {
      // Retry the same chunk after a 429 instead of skipping it.
      let submitted = false;
      while (!submitted) {
        const response = await fetch(
          "https://metro2.switchlabs.dev/api/v1/submit-metro2-data",
          {
            method: "POST",
            headers: {
              "Content-Type": "application/json",
              Authorization: `Bearer ${apiKey}`,
            },
            body: JSON.stringify({ records: batch }),
          },
        );
    
        const result = await response.json();
    
        if (response.status === 429) {
          // Back off for the server-suggested interval, then retry this chunk.
          await new Promise((resolve) =>
            setTimeout(resolve, (result.retry_after || 1) * 1000),
          );
          continue;
        }
    
        console.log(result.batch_id, result.accepted, result.rejected);
        submitted = true;
      }
    }
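    The example above covers submission only; step 4 is polling `/api/v1/batch-status/{id}` for the stored result. A minimal polling helper might look like the sketch below. Note the hedge: the `status` field and its `"completed"`/`"failed"` values are illustrative assumptions, not documented response fields, so inspect real responses before relying on them.

    ```javascript
    // Poll batch-status until a terminal state or the attempt budget runs out.
    // The terminal-state check is an assumption about the response shape.
    async function waitForBatch(batchId, apiKey, { intervalMs = 2000, maxAttempts = 30 } = {}) {
      for (let attempt = 0; attempt < maxAttempts; attempt++) {
        const response = await fetch(
          `https://metro2.switchlabs.dev/api/v1/batch-status/${batchId}`,
          { headers: { Authorization: `Bearer ${apiKey}` } },
        );
        const result = await response.json();

        // Hypothetical terminal values; verify against actual responses.
        if (result.status === "completed" || result.status === "failed") {
          return result;
        }
        await new Promise((resolve) => setTimeout(resolve, intervalMs));
      }
      throw new Error(`Batch ${batchId} did not reach a terminal state in time`);
    }
    ```

    Collect each `batch_id` from the submit loop and poll them after all chunks are sent, rather than polling inline, so submission is not blocked on slow batches.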

    Do Not Rely On

    • /api/v1/records
    • /api/v1/batches
    • /api/v1/batches/{id}
    • Webhook batch-completion callbacks under `/api/v1/webhooks`