Any AI you already use, plugged in (ChatGPT, Claude, Cursor, Copilot). Custom internal tools your team builds on top. Same reconciled Amazon data underneath all of it — replacing six SaaS subscriptions with one.
DataDoe gives developers a clean, structured data layer over Seller Central, Vendor Central and Amazon Ads — accessible through REST, MCP, or BigQuery on day one.
We handle auth, retries, rate limits, schema drift. You ship products.
Four ways out, all over the same canonical schema: (1) a REST API for backend services and automations; (2) an MCP server for AI assistants like Claude, ChatGPT, Cursor, Codex, Copilot and Gemini CLI; (3) direct BigQuery access, mirrored into your own GCP project; and (4) recurring exports as CSV, JSON, TSV, XML or Excel, delivered on a schedule. Underneath all four sits the same data: 40+ canonical tables across Seller Central, Vendor Central and Amazon Ads, reconciled, typed and BigQuery-compatible.
Yes — that's the main reason teams pick it. Hit the REST API from your backend services in any language. Mirror tables into your BigQuery project and feed them straight into your existing dbt models, Looker dashboards, Hex notebooks or Metabase. Drop the MCP connection into Cursor, Claude Code or Codex so your devs can query Amazon data in plain English while building. Schedule recurring exports to land files in your inbox without writing pipeline code. Same data, every entry point your stack already uses.
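To make the REST path concrete, here is a minimal sketch of what backend usage could look like. The base URL, endpoint path, header and response shape are illustrative assumptions, not DataDoe's documented API; the response-handling logic runs against a canned payload so it works offline:

```python
import json
from urllib.request import Request, urlopen

BASE_URL = "https://api.datadoe.example/v1"  # hypothetical base URL

def fetch_settlements(seller_id: str, api_key: str) -> list[dict]:
    """Fetch settlement rows for one connected Seller (hypothetical endpoint)."""
    req = Request(
        f"{BASE_URL}/sellers/{seller_id}/settlements",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urlopen(req) as resp:
        return json.load(resp)["rows"]

def total_payout(rows: list[dict]) -> float:
    """Sum net payout across settlement rows, regardless of marketplace."""
    return round(sum(r["net_amount"] for r in rows), 2)

# Canned payload standing in for a live response (illustrative field names).
sample = [
    {"settlement_id": "S-1", "marketplace": "US", "net_amount": 1250.40},
    {"settlement_id": "S-2", "marketplace": "DE", "net_amount": 310.10},
]
print(total_payout(sample))  # → 1560.5
```

The same two-function shape ports to any HTTP client in any language; only the transport call changes.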
Each account — US Seller Central, EU Vendor Central, JP marketplace, the new brand you just acquired — goes through Amazon's official Login with Amazon (LWA) OAuth flow once. Each Seller or Vendor lives in your workspace as a scoped entity with its own ID, name and marketplace. No SP-API plumbing per new account. No token refresh code. No regional endpoint juggling. Just a hosted OAuth flow and a Seller or Vendor you can query against.
The REST API works with anything that speaks HTTP — Python, Go, Ruby, PHP, Rust, .NET, Elixir, TypeScript, JavaScript. AI coding tools (Cursor, Claude Code, Codex, Copilot) generate clients straight off the API. BigQuery is plain SQL, so any tool with a BQ connector works — dbt, Hex, Looker, Metabase, Tableau, Power BI, your notebook of choice. The MCP server is compatible with any MCP client out there.
Yes. DataDoe is a registered SP-API developer. Each Amazon account you connect goes through Amazon's official Login with Amazon (LWA) consent flow — your team grants access through Amazon's own screen, no credential sharing, no shadow scraping. Tokens, refresh handling, regional endpoints and rate-limit compliance are managed upstream. Data is encrypted at rest, scoped per organization, and accessed only through authenticated keys you issue from your workspace.
Different feeds have different refresh patterns. Settlements, listings, products, orders and AWD sync continuously. Sales & traffic, FBA inventory health and ad performance refresh daily. Repeat purchase, market basket and weekly organic search ranks refresh weekly. The BigQuery dataset is synced daily. Initial backfill on connect goes up to 735 days for some Vendor Central tables, 730 days for order items and reimbursements, 365 days for repeat purchase, and shorter windows where Amazon caps history.
Yes. Every workspace can mirror its tables to a BigQuery dataset in your own GCP project. Your analysts run any SQL they want, join with your own internal data (manufacturing costs, headcount, freight, anything), materialize views, build dbt models, point Looker, Hex or Metabase at it. The data structure follows a stable, documented canonical schema — common keys are normalized so cross-feed joins don't require column aliasing.
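As a sketch of the kind of cross-source join this enables, the snippet below uses Python's built-in sqlite3 as a local stand-in for BigQuery; the table and column names are illustrative assumptions, not the documented canonical schema:

```python
import sqlite3

# In-memory database standing in for the mirrored BigQuery dataset.
db = sqlite3.connect(":memory:")
db.executescript("""
    -- Illustrative mirror of an orders table keyed by ASIN.
    CREATE TABLE orders (asin TEXT, units INTEGER, revenue REAL);
    -- Your own internal data, joined on the same key.
    CREATE TABLE unit_costs (asin TEXT, landed_cost REAL);
    INSERT INTO orders VALUES ('B0EXAMPLE1', 40, 799.60), ('B0EXAMPLE2', 10, 249.90);
    INSERT INTO unit_costs VALUES ('B0EXAMPLE1', 9.50), ('B0EXAMPLE2', 12.00);
""")

# Gross margin per ASIN: revenue minus (units * your landed cost).
rows = db.execute("""
    SELECT o.asin, ROUND(o.revenue - o.units * c.landed_cost, 2) AS margin
    FROM orders o
    JOIN unit_costs c ON c.asin = o.asin
    ORDER BY margin DESC
""").fetchall()
print(rows)
```

Because the canonical schema normalizes keys like ASIN across feeds, the same join pattern works between Amazon tables themselves, not just against your internal data.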
Yes, and Amazon Ads on top of that. Vendor Central data includes weekly demand forecasting with mean and 70/80/90% confidence levels, repeat purchase performance, market basket co-purchase analysis, and the daily sales / traffic / inventory feed with the full Manufacturing-retail vs Sourcing-retail split: NPPM, sell-through rate, vendor lead time, 90+ days aged inventory. Plus all of Seller Central (settlements, FBA inventory health, returns, account health, search performance, AWD, COGS and more) and Amazon Ads (Sponsored Products, Sponsored Brands and Sponsored Display: search terms, placements, ad groups, negative keywords). One workspace, all three sources, one canonical schema.
That's the work we do upstream so your team doesn't get pulled off roadmap. Amazon adds fields, deprecates endpoints, changes report formats and adjusts rate limits regularly. We absorb those changes into our pipeline — your queries against the canonical schema keep working. No surprise 4am pages because Amazon shipped a regional API change overnight.
Your data can live in your BigQuery project — so exporting is just a SQL query. Or use the REST API to dump tables to your own warehouse. There's no proprietary format, no obfuscated columns, no lock-in on the data itself. You can also schedule recurring exports as CSV, JSON, TSV, XML or Excel files at any time, delivered to email recipients of your choice. Walk away with everything you ever ingested.
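The "dump tables to your own warehouse" path usually reduces to walking a paginated endpoint to exhaustion. Here is a sketch of that loop, assuming a hypothetical cursor-paginated page shape ({"rows": [...], "next_cursor": ...}); the fetcher is stubbed so the logic runs offline:

```python
from typing import Callable, Iterator, Optional

def iter_rows(fetch_page: Callable[[Optional[str]], dict]) -> Iterator[dict]:
    """Yield every row from a cursor-paginated table.

    `fetch_page(cursor)` is assumed to return a page shaped like
    {"rows": [...], "next_cursor": "..."} with next_cursor None on the last page.
    """
    cursor = None
    while True:
        page = fetch_page(cursor)
        yield from page["rows"]
        cursor = page.get("next_cursor")
        if cursor is None:
            return

# Stubbed fetcher standing in for a real HTTP call, so the loop runs here.
pages = {
    None: {"rows": [{"id": 1}, {"id": 2}], "next_cursor": "p2"},
    "p2": {"rows": [{"id": 3}], "next_cursor": None},
}
rows = list(iter_rows(lambda c: pages[c]))
print(len(rows))  # → 3
```

Swap the stub for a real HTTP call and write each batch to your warehouse's bulk-load API, and that is the whole exit pipeline.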
Every integration. Full onboarding support. If it’s not the best decision you made in 2026, you can cancel anytime.
Skip six months of SP-API integration
Hands-on onboarding from the team that built it
Connect anything with API & MCP
Replace SaaS tools with your own apps
Access Amazon-audited infrastructure