Ingest API

Connect any TMS, WMS, or OMS to ShipIQ via push webhooks or periodic polling.

Overview

The ShipIQ Ingest API receives shipment events from your systems and produces a normalised, deduplicated record of every shipment. It supports two integration modes:

Push (webhooks)

Your TMS/WMS posts events to POST /ingest/{source} in real time. ShipIQ acknowledges immediately and normalises asynchronously.

Pull (polling)

ShipIQ periodically calls your API on a configurable interval, fetching new events since the last cursor. Cursor state is persisted per source.
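To illustrate the Pull mode, here is a minimal sketch of a cursor-driven polling loop. The page shape used here (`events`, `next_cursor`, `has_more`) is a hypothetical example of what your API might return; ShipIQ's actual poller is configured to match your API's real pagination.

```python
def poll_since(fetch, cursor=None):
    """Drain all pages newer than `cursor`; return (events, new_cursor).

    `fetch(cursor)` stands in for a call like GET /events?since=<cursor>
    against your system. The new cursor is persisted per source, so the
    next poll resumes where this one stopped.
    """
    events = []
    while True:
        page = fetch(cursor)
        events.extend(page["events"])
        cursor = page.get("next_cursor", cursor)
        if not page.get("has_more"):
            return events, cursor
```

Between polls, only the cursor needs to be stored; the events themselves are preserved verbatim on ShipIQ's side once fetched.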

All inbound payloads are preserved verbatim before normalisation. This gives you a full audit trail and the ability to replay any event without data loss.

Authentication

Every request to the ingest service must include a per-source API key as a bearer token in the Authorization header. Keys are scoped to a single ingestion source — a key generated for mcleod-prod cannot be used to push events to a different source.

POST /ingest/mcleod-prod HTTP/1.1
Host: ingest.shipiq.ca
Authorization: Bearer siq_live_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Content-Type: application/json

Generate keys from Settings → Integrations → <source> in the ShipIQ app. The plaintext key is shown exactly once at creation; the server stores only a SHA-256 hash. Rotate by generating a new key and revoking the old one — no deployment or restart required.

Requests with a missing, malformed, or revoked key receive a 401 Unauthorized. Requests with a valid key whose source does not match the URL path receive 403 Forbidden.
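The key-storage scheme described above (server keeps only a SHA-256 hash, plaintext shown once) can be sketched as follows. The function names are illustrative, not part of the API; the constant-time comparison is a standard precaution when checking secrets.

```python
import hashlib
import hmac

def hash_key(plaintext):
    # Computed once at key creation; only this hash is persisted.
    return hashlib.sha256(plaintext.encode()).hexdigest()

def verify_key(presented, stored_hash):
    # hmac.compare_digest avoids leaking hash prefixes via timing.
    return hmac.compare_digest(hash_key(presented), stored_hash)
```

Rotation falls out of this design naturally: a new key gets a new stored hash, and revoking the old one simply deletes the old hash.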

Push Endpoint

POST /ingest/{source}

Sends a single event payload from an external system. The {source} path parameter must match a registered source slug for your workspace.

Path parameters

source (string, required)

The slug of the registered ingestion source, e.g. sample or mcleod.

Request headers

Authorization (string, required)

Bearer token for this source. Format: Bearer siq_live_….

Content-Type (string, required)

Must be application/json.

Response

HTTP/1.1 202 Accepted
Content-Type: application/json

{ "status": "accepted" }

A 202 means the event was accepted. Normalisation happens asynchronously and typically completes within seconds.

Example — curl

curl -X POST https://ingest.shipiq.ca/ingest/sample \
  -H "Authorization: Bearer siq_live_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "id": "SAMPLE-001",
    "status": "in_transit",
    "tracking": "TRK-12345",
    "bol": "BOL-99999",
    "carrier": "FedEx Freight",
    "origin":      { "city": "Chicago",  "state": "IL", "postal_code": "60601", "country": "US" },
    "destination": { "city": "New York", "state": "NY", "postal_code": "10001", "country": "US" },
    "eta": "2026-04-28T18:00:00Z",
    "weight_lbs": 500,
    "updated_at": "2026-04-26T10:00:00Z"
  }'
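The same request as the curl example, as a minimal Python sketch using only the standard library. The retry/backoff policy is illustrative, not something ShipIQ mandates; 202 is the only success status.

```python
import json
import time
import urllib.error
import urllib.request

INGEST_URL = "https://ingest.shipiq.ca/ingest/{source}"

def build_request(source, api_key, payload):
    """Assemble the POST /ingest/{source} request with auth headers."""
    return urllib.request.Request(
        INGEST_URL.format(source=source),
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def post_event(source, api_key, payload, retries=3):
    """POST one event; retry transient 5xx responses with backoff."""
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(build_request(source, api_key, payload)) as resp:
                return resp.status  # 202 Accepted on success
        except urllib.error.HTTPError as e:
            if e.code < 500 or attempt == retries - 1:
                raise  # 4xx means the request itself is wrong; do not retry
            time.sleep(2 ** attempt)
```

Because the endpoint acknowledges before normalising, a 202 here confirms receipt only; payload-format problems surface later in the integration audit view.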

Health Check

GET /health

Returns 200 OK when the service is running. No authentication required. Use this as a liveness probe in your container orchestration or uptime monitor.

Sources

Each external system is represented as an ingestion source in your ShipIQ workspace. Register sources from Settings → Integrations in the app — no deployment required.

Slug     Type   Mode          Description
sample   tms    Push          Flat test payload — no real system required
mcleod   tms    Push / Pull   McLeod Software TMS (standard REST schema)

Need a source type that isn't listed? Contact us — we'll add an integration for your system.

Source: sample

The sample source uses a flat, self-describing JSON schema. Use it to test the full pipeline end-to-end without a live TMS or WMS.

{
  "id":              string   // required — stable dedup key for this event
  "event_type":      string   // optional — defaults to "shipment_update"
  "tracking":        string   // carrier tracking / PRO number
  "bol":             string   // Bill of Lading number (used for cross-source matching)
  "status":          string   // see Status Values below
  "carrier":         string   // carrier name
  "origin": {
    "city":          string
    "state":         string
    "postal_code":   string
    "country":       string   // ISO 3166-1 alpha-2
  }
  "destination":     { /* same shape as origin */ }
  "eta":             string   // RFC 3339 timestamp
  "actual_delivery": string   // RFC 3339 timestamp
  "weight_lbs":      number   // converted to kg on ingest
  "updated_at":      string   // RFC 3339 — used for conflict resolution ordering
}

Only id is required. All other fields are optional — the worker will only update canonical fields that are present in the payload.
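A sketch of how such a present-fields-only update might be applied, assuming a pounds-to-kilograms conversion on ingest. The function name, field list, and rounding are assumptions for illustration; the actual worker is maintained by ShipIQ.

```python
LB_TO_KG = 0.45359237

def normalise_sample(payload):
    """Normalise one sample-source event into the canonical shape."""
    if "id" not in payload:
        raise ValueError("sample events require a stable 'id'")
    out = {
        "id": payload["id"],
        "event_type": payload.get("event_type", "shipment_update"),
    }
    # Copy only fields actually present; absent fields leave the
    # existing shipment record untouched.
    for field in ("tracking", "bol", "status", "carrier", "origin",
                  "destination", "eta", "actual_delivery", "updated_at"):
        if field in payload:
            out[field] = payload[field]
    if "weight_lbs" in payload:
        out["weight_kg"] = round(payload["weight_lbs"] * LB_TO_KG, 3)
    return out
```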

Source: mcleod

The mcleod source maps McLeod Software's standard REST API webhook schema. If your McLeod instance uses a non-standard schema, contact us — we can configure a custom mapping for your variant.

{
  "orderId":           string   // required — McLeod order ID
  "eventType":         string   // optional — defaults to "shipment_update"
  "proNumber":         string   // PRO number (tracking number + match key)
  "bolNumber":         string   // BOL number (cross-source matching)
  "status":            string
  "carrier": {
    "name":            string
    "scac":            string
  }
  "origin": {
    "city":            string
    "state":           string
    "zip":             string
    "country":         string
  }
  "destination":         { /* same shape as origin */ }
  "estimatedDelivery":   string  // RFC 3339
  "actualDelivery":      string  // RFC 3339
  "weightLbs":           number
  "updatedAt":           string  // RFC 3339
}
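The schema above maps onto the same canonical shape as the sample source. The following sketch shows one plausible translation (proNumber to tracking, zip to postal_code, and so on); it is illustrative only, since the real mapper lives inside ShipIQ.

```python
def map_mcleod(payload):
    """Translate a McLeod webhook payload into the canonical flat shape."""
    def place(p):
        if p is None:
            return None
        return {
            "city": p.get("city"),
            "state": p.get("state"),
            "postal_code": p.get("zip"),   # zip -> canonical postal_code
            "country": p.get("country"),
        }
    return {
        "id": payload["orderId"],          # required
        "event_type": payload.get("eventType", "shipment_update"),
        "tracking": payload.get("proNumber"),
        "bol": payload.get("bolNumber"),
        "status": payload.get("status"),
        "carrier": (payload.get("carrier") or {}).get("name"),
        "origin": place(payload.get("origin")),
        "destination": place(payload.get("destination")),
        "eta": payload.get("estimatedDelivery"),
        "actual_delivery": payload.get("actualDelivery"),
        "weight_lbs": payload.get("weightLbs"),
        "updated_at": payload.get("updatedAt"),
    }
```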

Status Values

ShipIQ normalises raw provider statuses into a standard set. The following values are recognised:

Raw value (in payload)   Displayed as   Notes
in_transit               In Transit     Shipment is moving
booked                   Pending        Confirmed but not yet picked up
at_warehouse             Held           At intermediate facility
delivered                Delivered      Final delivery confirmed
delayed                  Delayed        Running behind schedule
exception                Exception      Requires manual attention
held                     Held           Held for customs / inspection

Unrecognised status values fall back to In Transit in the UI.
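The table above amounts to a simple lookup with a fallback, which can be sketched as:

```python
STATUS_DISPLAY = {
    "in_transit":   "In Transit",
    "booked":       "Pending",
    "at_warehouse": "Held",
    "delivered":    "Delivered",
    "delayed":      "Delayed",
    "exception":    "Exception",
    "held":         "Held",
}

def display_status(raw):
    # Unknown raw values fall back to In Transit, per the table above.
    return STATUS_DISPLAY.get(raw, "In Transit")
```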

Match Keys

Match keys allow ShipIQ to recognise that two events from different sources describe the same physical shipment and merge them into a single canonical record.

When normalising an event, the mapper extracts one or more match keys from the payload. If any key already exists in shipment_match_keys, the incoming event is merged into the existing shipment rather than creating a new one.

Key type           Field (sample source)   Description
bol                bol                     Bill of Lading — most common cross-system identifier
pro                tracking                Carrier PRO number
po                                         Purchase order number (mapper-defined)
carrier_tracking   tracking                Carrier-issued tracking number

A shipment from your TMS with bol: "BOL-99999" and a separate event from your WMS also carrying bol: "BOL-99999" will automatically be merged — even if they use completely different internal IDs.
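A sketch of the lookup described above, with an in-memory dict standing in for the shipment_match_keys store. Which fields feed which key types follows the sample-source table; the function names are illustrative.

```python
def extract_match_keys(event):
    """Return (key_type, value) pairs for a sample-source event."""
    keys = set()
    if event.get("bol"):
        keys.add(("bol", event["bol"]))
    if event.get("tracking"):
        keys.add(("pro", event["tracking"]))
        keys.add(("carrier_tracking", event["tracking"]))
    return keys

def resolve_shipment(event, index):
    """index maps (key_type, value) -> shipment_id. Return an existing
    shipment_id if any match key is already known, else None (create new)."""
    for key in extract_match_keys(event):
        if key in index:
            return index[key]
    return None
```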

Conflict Resolution

When two sources report data for the same shipment, ShipIQ uses field-level timestamp-based merging rather than simple last-write-wins. For each canonical field, ShipIQ tracks which source last wrote it and when.

// Per-field contribution tracking
{
  "status":    { "source": "mcleod", "at": "2026-04-26T14:00:00Z" },
  "eta":       { "source": "mcleod", "at": "2026-04-26T09:00:00Z" },
  "origin":    { "source": "sample", "at": "2026-04-26T08:00:00Z" },
  "weight_kg": { "source": "wms",    "at": "2026-04-26T16:00:00Z" }
}

For each field in an incoming event, ShipIQ compares the event's updated_at timestamp against the recorded contribution timestamp. The field is only overwritten if the incoming event is newer.

Example

Your TMS updates status → in_transit at 14:00. Your WMS sends a weight correction at 16:00 with no status field — status stays as the TMS value. The WMS's weight_kg wins because 16:00 is newer than the TMS timestamp for that field.

A stale event with an older timestamp than the current contribution for a field is silently skipped — the newer value is preserved.
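The field-level merge described above can be sketched as follows. This is an illustrative model, not ShipIQ's implementation; the string comparison assumes all timestamps are RFC 3339 in UTC (Z suffix), where lexicographic order matches chronological order.

```python
def merge_fields(current, contributions, incoming, source, updated_at):
    """Overwrite a field only when the incoming event's updated_at is
    newer than that field's recorded contribution timestamp."""
    for field, value in incoming.items():
        prior = contributions.get(field)
        if prior is None or updated_at > prior["at"]:
            current[field] = value
            contributions[field] = {"source": source, "at": updated_at}
        # else: stale for this field; keep the newer value silently
```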

Failed events appear with their error in the integration audit view (Settings → Integrations → <source> → Audit). Contact support to retry.

Error Codes

Status               Cause                                                               Resolution
202 Accepted         Success — event queued for normalisation
400 Bad Request      Malformed bearer token, or mapper could not extract id/event_type   Verify the Authorization header and payload schema
401 Unauthorized     Missing, invalid, or revoked bearer token                           Generate a new key from Settings → Integrations
403 Forbidden        Key valid but not authorized for this source slug                   Use the key that was generated for this specific source
404 Not Found        Unknown source slug in path                                         Register the source in Settings → Integrations and deploy its mapper
500 Internal Error   An internal error occurred                                          Retry after a short delay; contact support if it persists

Normalisation errors are surfaced in the in-app integration audit view, not as HTTP errors — the push endpoint only guarantees that the event was accepted, not that normalisation succeeded. Monitor the audit view for failed events to catch payload-format issues early.