Batch Processing
For high-volume workloads, the batch endpoints accept up to 500 properties per job and process them asynchronously. Instead of waiting on each valuation in turn, you submit a job, receive a job_id, and poll for completion.
Endpoints
| Endpoint | Description |
|---|---|
| POST /v1/avm/batch/property | Submit a property valuation batch |
| POST /v1/avm/batch/rental | Submit a rental valuation batch |
| POST /v1/avm/batch/multifamily | Submit a multifamily valuation batch |
| GET /v1/avm/batch/{job_id} | Poll job status and retrieve results |
Step 1 — Submit a batch job
curl -X POST https://api.padstats.io/v1/avm/batch/property \
-H "X-API-Key: YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"notify_url": "https://yourserver.com/webhooks/avm",
"properties": [
{
"property_id": "ref-001",
"location": { "address": "123 Main St, Austin, TX 78701" },
"search_params": { "radius_mi": 5.0, "num_comparables": 0 },
"property_fields": {
"total_bedrooms": 3,
"total_bathrooms": 2,
"living_area": 1800,
"year_built": 1995,
"property_type": "houses"
}
},
{
"property_id": "ref-002",
"location": { "address": "456 Oak Ave, Tampa, FL 33602" },
"search_params": { "radius_mi": 5.0, "num_comparables": 0 },
"property_fields": {
"total_bedrooms": 4,
"total_bathrooms": 3,
"living_area": 2400,
"year_built": 2010,
"property_type": "houses"
}
}
]
}'
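The same request body can be assembled in Python before posting. A minimal sketch, where the build_batch_payload helper is illustrative rather than part of any official client:

```python
import json

def build_batch_payload(properties, notify_url=None):
    """Assemble a batch request body; jobs accept at most 500 properties."""
    if len(properties) > 500:
        raise ValueError("max 500 properties per batch job")
    payload = {"properties": properties}
    if notify_url:
        payload["notify_url"] = notify_url
    return payload

properties = [
    {
        "property_id": "ref-001",
        "location": {"address": "123 Main St, Austin, TX 78701"},
        "search_params": {"radius_mi": 5.0, "num_comparables": 0},
        "property_fields": {
            "total_bedrooms": 3,
            "total_bathrooms": 2,
            "living_area": 1800,
            "year_built": 1995,
            "property_type": "houses",
        },
    },
]

payload = build_batch_payload(properties, notify_url="https://yourserver.com/webhooks/avm")
body = json.dumps(payload)  # POST this to /v1/avm/batch/property with the X-API-Key header
```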
Batch request fields
| Field | Type | Required | Description |
|---|---|---|---|
| properties | array | Yes | Array of property objects (max 500) |
| properties[].property_id | string | No | Your reference ID — echoed back in results |
| properties[].location | object | Yes | address or latitude + longitude |
| properties[].search_params | object | No | radius_mi, num_comparables, lookback_months, date |
| properties[].property_fields | object | Yes | Same fields as the single-property AVM |
| notify_url | string | No | Webhook URL — called when the job completes |
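Because a single job caps at 500 properties, larger portfolios need to be split across multiple jobs. A minimal sketch of that split (the helper name is illustrative):

```python
def chunk_properties(properties, size=500):
    """Yield successive slices no larger than the 500-property job limit."""
    for i in range(0, len(properties), size):
        yield properties[i:i + size]

# Each chunk becomes one POST /v1/avm/batch/property request,
# and each submission returns its own job_id to poll.
```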
Response (HTTP 202 Accepted)
{
"job_id": "f47ac10b-58cc-4372-a567-0e02b2c3d479",
"status": "queued",
"avm_type": "property",
"total": 2,
"created_at": "2026-03-16T14:32:01.123Z",
"status_url": "/v1/avm/batch/f47ac10b-58cc-4372-a567-0e02b2c3d479",
"notify_url": "https://yourserver.com/webhooks/avm"
}
Use status_url as the polling path — it's the same as GET /v1/avm/batch/{job_id} pre-formatted for convenience.
Step 2 — Poll for completion
curl "https://api.padstats.io/v1/avm/batch/f47ac10b-58cc-4372-a567-0e02b2c3d479" \
-H "X-API-Key: YOUR_API_KEY"
In-progress response
{
"job_id": "f47ac10b-...",
"status": "running",
"total": 500,
"completed": 312,
"failed": 0,
"progress_pct": 62.4
}
Completed response
{
"job_id": "f47ac10b-...",
"status": "complete",
"total": 500,
"completed": 498,
"failed": 2,
"progress_pct": 100.0,
"completed_at": "2026-03-16T14:35:44.789Z",
"updated_at": "2026-03-16T14:35:44.789Z",
"webhook_delivered": true,
"results": [
{
"index": 0,
"property_id": "ref-001",
"status": "complete",
"data": {
"valuation": {
"point_value": 487000,
"confidence_interval_95": { "lower": 452000, "upper": 522000 }
}
}
},
{
"index": 1,
"property_id": "ref-002",
"status": "error",
"error": "Unable to geocode address"
}
]
}
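Each result carries its index and your property_id, so successes and failures can be separated after polling. A sketch assuming the completed-response shape above (partition_results is illustrative):

```python
def partition_results(job):
    """Split a completed job's results by per-property status."""
    ok = [r for r in job["results"] if r["status"] == "complete"]
    failed = [r for r in job["results"] if r["status"] == "error"]
    return ok, failed

job = {
    "status": "complete",
    "results": [
        {"index": 0, "property_id": "ref-001", "status": "complete",
         "data": {"valuation": {"point_value": 487000}}},
        {"index": 1, "property_id": "ref-002", "status": "error",
         "error": "Unable to geocode address"},
    ],
}
ok, failed = partition_results(job)
```

Failed entries keep their property_id, so they can be corrected and resubmitted in a follow-up batch.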
Webhooks
If you supply a notify_url, PadStats will POST the completed job result to that URL when the job finishes. The webhook payload is identical to the polling response above.
Webhook delivery is retried up to 3 times with exponential backoff (2s → 4s → 8s) on failure. Check webhook_delivered: true/false in the polling response to confirm delivery.
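A minimal receiver sketch using only the standard library; the port and handler names are assumptions, and the payload parsing matches the polling response shape above:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_batch_webhook(body: bytes) -> dict:
    """Parse a webhook payload, which has the same shape as the polling response."""
    job = json.loads(body)
    if job.get("status") == "complete":
        failed = [r for r in job.get("results", []) if r["status"] == "error"]
        print(f"Job {job['job_id']}: {job['completed']} ok, {len(failed)} failed")
    return job

class AvmWebhook(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        handle_batch_webhook(self.rfile.read(length))
        # Acknowledge quickly with a 2xx so delivery is not retried.
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), AvmWebhook).serve_forever()
```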
Performance guidance
| Scenario | Recommended config |
|---|---|
| Need only valuations, no comparable listings | num_comparables: 0 — 2–3s per 8 properties |
| Need comparable properties returned | num_comparables: 5 — 5–7s per 8 properties |
| Maximum throughput | num_comparables: 0, radius_mi: 3.0 |
Setting num_comparables: 0 is the most impactful single optimization — it skips the comparables database query entirely.
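The per-8-property figures above imply roughly linear scaling, so a back-of-envelope runtime estimate follows from the range midpoints (the helper is illustrative, and real throughput varies with load):

```python
def estimate_seconds(num_properties, with_comparables=False):
    """Rough batch runtime from the guidance table, using range midpoints."""
    per_8 = 6.0 if with_comparables else 2.5  # 5-7s vs 2-3s per 8 properties
    return (num_properties / 8) * per_8

# A full 500-property job with num_comparables: 0 lands around 2-3 minutes,
# which suggests a polling interval of about 10 seconds is reasonable.
```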
Polling pattern (Python)
import time
import requests
API_KEY = "YOUR_API_KEY"
BASE = "https://api.padstats.io"
def poll_batch(job_id, interval=10, timeout=600):
    """Poll a batch job until it reaches a terminal state or the deadline passes."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        r = requests.get(
            f"{BASE}/v1/avm/batch/{job_id}",
            headers={"X-API-Key": API_KEY},
            timeout=30,
        )
        r.raise_for_status()
        job = r.json()
        print(f"Status: {job['status']} — {job.get('progress_pct', 0):.1f}%")
        if job["status"] in ("complete", "failed"):
            return job
        time.sleep(interval)
    raise TimeoutError("Batch job did not complete within timeout")