Migrate from Bigeye
Export your Bigeye monitors via their API and translate to ODCS YAML with a short Python recipe. Best-effort mapping — verify field names against your tenant before running at scale.
Recipe status
Bigeye does not publish a stable export format for monitors. This recipe calls their documented metric-list API and maps common metric types to ODCS. Field names and metric type strings can vary between Bigeye versions and tenants. Test against a handful of monitors first. For large deployments or tenant-specific edge cases, email us and we will translate directly.
What you need
- A Bigeye workspace API token (generate from your Bigeye admin panel → API Keys). Set it as `BIGEYE_API_TOKEN` in your shell environment.
- The numeric `warehouseId` of the warehouse you want to migrate. Set as `BIGEYE_WAREHOUSE_ID`.
- Python 3.9+, `requests`, and `PyYAML`.
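Before running the recipe you can verify both variables are set. A minimal preflight sketch (the variable names match the list above; the `preflight` helper itself is illustrative, not part of the recipe):

```python
import os


def preflight(env):
    """Return the names of required settings missing from an environment mapping."""
    return [v for v in ("BIGEYE_API_TOKEN", "BIGEYE_WAREHOUSE_ID")
            if not env.get(v)]


# Pass os.environ in real use; an empty mapping reports both as missing.
print(preflight({}))      # ['BIGEYE_API_TOKEN', 'BIGEYE_WAREHOUSE_ID']
print(preflight(os.environ))
</imports>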
The recipe
Save as `bigeye_to_odcs.py`, run from any machine with network access to the Bigeye API, and pipe the output to a YAML file.
```python
# bigeye_to_odcs.py — best-effort translation of Bigeye monitors to ODCS YAML.
# Requires a Bigeye workspace API token. Treat the output as a starting point;
# verify field names against your Bigeye tenant before running at scale.
import os
import sys

import requests
import yaml

BIGEYE_API = "https://app.bigeye.com/api/v1"
TOKEN = os.environ["BIGEYE_API_TOKEN"]
WAREHOUSE_ID = int(os.environ.get("BIGEYE_WAREHOUSE_ID", "0"))

session = requests.Session()
session.headers.update({"Authorization": f"apikey {TOKEN}"})


def list_monitors(warehouse_id):
    # The Bigeye metric/monitor list endpoint has evolved between API versions.
    # Adjust the path for your tenant if the response shape differs.
    r = session.get(
        f"{BIGEYE_API}/metrics",
        params={"warehouseId": warehouse_id, "pageSize": 500},
    )
    r.raise_for_status()
    return r.json().get("metrics", [])


# Map common Bigeye metric types to ODCS validity entries.
# Bigeye metric names may differ slightly per tenant; extend this map as needed.
TYPE_MAP = {
    "PERCENT_NULL": "not_null",
    "PERCENT_UNIQUE": "unique",
    "FRESHNESS": "freshness",
    "ROW_COUNT": "row_count",
    "MIN": "range_min",
    "MAX": "range_max",
}

validity = []
for m in list_monitors(WAREHOUSE_ID):
    t = m.get("metricType") or m.get("type")
    col = m.get("column", {}).get("name") if m.get("column") else None
    rule = TYPE_MAP.get(t)
    if rule is None:
        # Warn on stderr so the YAML on stdout stays clean when piped to a file.
        print(f"[warn] unmapped metric type: {t} (name={m.get('name')})",
              file=sys.stderr)
        continue
    entry = {"rule": rule}
    if col:
        entry["column"] = col
    # Preserve Bigeye-configured thresholds where present.
    if "thresholds" in m:
        entry["thresholds"] = m["thresholds"]
    validity.append(entry)

odcs = {
    "apiVersion": "v3.1.0",
    "kind": "DataContract",
    "name": "bigeye_migration",
    "customProperties": {"anomalyarmor": {"validity": validity}},
}
print(yaml.safe_dump(odcs, sort_keys=False))
```

Then run `python bigeye_to_odcs.py > contract.yaml` and paste the resulting YAML into /migrate/soda — that is the shared ODCS entry point, not Soda-specific.
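To sanity-check the mapping before touching the API, you can run the recipe's translation loop on hand-written sample payloads. The metric dicts below are illustrative, not real Bigeye responses — your tenant's field names may differ:

```python
# Subset of the recipe's TYPE_MAP, enough to exercise the loop.
TYPE_MAP = {"PERCENT_NULL": "not_null", "FRESHNESS": "freshness"}

# Hypothetical payloads shaped the way the recipe expects; real Bigeye
# responses may use different field names — verify against your tenant.
sample_metrics = [
    {"metricType": "PERCENT_NULL", "name": "orders.id null check",
     "column": {"name": "id"}, "thresholds": [{"upperBound": 0.0}]},
    {"metricType": "SOME_CUSTOM_SQL", "name": "custom metric"},  # unmapped
]

validity = []
for m in sample_metrics:
    t = m.get("metricType") or m.get("type")
    rule = TYPE_MAP.get(t)
    if rule is None:
        continue  # the real recipe warns on stderr here
    entry = {"rule": rule}
    if m.get("column"):
        entry["column"] = m["column"]["name"]
    if "thresholds" in m:
        entry["thresholds"] = m["thresholds"]
    validity.append(entry)

print(validity)
# → [{'rule': 'not_null', 'column': 'id', 'thresholds': [{'upperBound': 0.0}]}]
```

The unmapped `SOME_CUSTOM_SQL` metric is skipped, matching the recipe's warn-and-continue behavior.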
What does not translate automatically
Bigeye autometric suggestions, advanced seasonal models, and custom SQL metrics that depend on the Bigeye runtime do not carry over; the recipe logs a warning for each unmapped metric. AnomalyArmor builds its own baselines post-import, so autometric-equivalent coverage is established automatically — you do not need to translate those one-for-one.
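If you want a record of what did not translate rather than just warnings, one option is to partition monitors into translatable entries and a manual-review list you can keep alongside the contract. A sketch, assuming the same metric dict shape as the recipe (the `partition_monitors` helper is not part of the recipe):

```python
def partition_monitors(monitors, type_map):
    """Split monitors into translatable validity entries and a manual-review list."""
    validity, manual_review = [], []
    for m in monitors:
        t = m.get("metricType") or m.get("type")
        rule = type_map.get(t)
        if rule is None:
            # Keep just enough context to find the monitor again in Bigeye.
            manual_review.append({"name": m.get("name"), "metricType": t})
        else:
            validity.append({"rule": rule})
    return validity, manual_review


ok, todo = partition_monitors(
    [{"metricType": "ROW_COUNT", "name": "rows"},
     {"metricType": "CUSTOM_SQL", "name": "weekly revenue check"}],
    {"ROW_COUNT": "row_count"},
)
print(ok)    # → [{'rule': 'row_count'}]
print(todo)  # → [{'name': 'weekly revenue check', 'metricType': 'CUSTOM_SQL'}]
```

The `todo` list gives you a checklist for rebuilding custom SQL metrics by hand after import.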
Ready to move the monitors over?