Questions to ask an AI vendor before you sign

A straight checklist for contracts and demos—written for people who are not machine-learning experts.

Training and your data

Ask plainly: “Will our inputs or outputs be used to train your models?” Get the answer in writing. If the salesperson hesitates, treat that as a red flag.

Ask whether you can opt out, and whether opt-out costs extra. Some vendors charge for enterprise privacy; that is normal, but it should not be a surprise after the fact.

Ask what happens if a user pastes something they should not have—can you purge it, and how long does that take?

  • Where is data processed (region/country)?
  • Retention: how long are prompts and logs kept?
  • Subprocessors: who else touches the data?
  • Can you get a data processing agreement that matches your industry?

Accuracy and “I don’t know”

Models guess. Ask how the product handles uncertainty. Good systems cite sources, show confidence, or route to a human when the question is out of scope.

Ask for a demo on your own messy example—not the polished slide deck sample.

Ask how often the underlying model changes, and how you are notified. Silent upgrades can break a workflow you tuned for weeks.

Security and access

You need SSO, role-based access, and audit logs that your IT person can actually read. If the product is “admin password in a spreadsheet,” walk away.

Ask about incident response: who do you call, and what is their SLA?

Ask how secrets and API keys are stored, and whether you can rotate them without a professional services project.

Support, roadmap, and references

Ask for two references at companies your size—not just Fortune 500 logos.

Clarify support hours and channels. For SMBs, long email-only queues turn small bugs into big outages.

Ask what is on the roadmap versus what is vapor. An honest “not planned” beats a shrug.

Commercials: pilots, limits, and exit

Negotiate a pilot with clear success criteria and an off-ramp. Month-to-month after the pilot is fine if you are still evaluating.

Understand token or seat limits. Bills often spike when usage spreads beyond the team that ran the pilot.
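To see why rollout changes the bill, it helps to do the arithmetic before signing. Here is a minimal back-of-the-envelope sketch; every price and usage number in it is an invented assumption for illustration, not any vendor's real rate.

```python
# Rough monthly cost estimate for token-based pricing.
# All prices and usage figures are illustrative assumptions,
# not any vendor's actual rates.

def monthly_cost(users, requests_per_user_per_day,
                 tokens_per_request, price_per_1k_tokens,
                 workdays=21):
    """Estimate one month's bill for a token-priced API."""
    total_tokens = (users * requests_per_user_per_day
                    * tokens_per_request * workdays)
    return total_tokens / 1000 * price_per_1k_tokens

# The 5-person pilot team looks cheap...
pilot = monthly_cost(users=5, requests_per_user_per_day=20,
                     tokens_per_request=2000, price_per_1k_tokens=0.01)

# ...but the same habits across 200 users is 40x the bill.
rollout = monthly_cost(users=200, requests_per_user_per_day=20,
                       tokens_per_request=2000, price_per_1k_tokens=0.01)

print(f"pilot:   ${pilot:,.2f}/month")    # pilot:   $42.00/month
print(f"rollout: ${rollout:,.2f}/month")  # rollout: $1,680.00/month
```

Run the same arithmetic with the vendor's actual rates and your own headcount before you agree to a limit.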

Ask how you export your prompts, workflows, and configuration if you leave. Lock-in is not always bad, but it should be a choice, not a trap.

Clarify what happens to data at contract end—deletion timelines and certificates if you need them.

After you sign: minimum oversight

Assign someone to read release notes monthly.

Keep a one-page map: which workflows use the tool, which data classes are involved, and who approves changes.
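That one-page map can live anywhere—a spreadsheet is fine—but keeping it as structured data makes quarterly reviews queryable. A minimal sketch, with invented workflow names, data classes, and approvers:

```python
# A sketch of the "one-page map" as structured data.
# Workflows, data classes, and approvers below are invented examples.

TOOL_MAP = [
    {"workflow": "support ticket triage",
     "data_classes": ["customer email", "ticket text"],
     "change_approver": "Head of Support"},
    {"workflow": "contract summarization",
     "data_classes": ["legal documents"],
     "change_approver": "General Counsel"},
]

def workflows_touching(data_class):
    """List workflows that involve a given data class (e.g. for a review)."""
    return [row["workflow"] for row in TOOL_MAP
            if data_class in row["data_classes"]]

print(workflows_touching("legal documents"))  # ['contract summarization']
```

When a vendor announces a model change or a breach, this lets you answer "which of our workflows and data are affected?" in minutes instead of meetings.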

Review access quarterly—especially for contractors who left.