AI cites a source or URL that does not exist
Classic symptom: a tidy reference to a ruling or article that turns out to be fabricated. It looks real, and that is exactly the problem.
Try this first
1. Click every source before citing it, even if it looks plausible
2. Ask the AI for only sources it can actually retrieve
3. Use a web-enabled AI for current topics
4. Mark in your document which sources came from AI
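Step 1 above can be partly automated. The sketch below, a hypothetical helper (function name and user-agent string are assumptions, not from any library), checks whether a cited URL even resolves. A successful response is necessary but not sufficient: you still have to open the page and confirm it says what the AI claims.

```python
# Hypothetical citation pre-check using only the Python standard library.
# It reports whether a URL answers an HTTP request at all; it does NOT
# verify that the page supports the claim being cited.
from urllib import request, error


def url_resolves(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL answers an HTTP request, False otherwise."""
    try:
        req = request.Request(
            url,
            method="HEAD",
            headers={"User-Agent": "citation-check/0.1"},  # assumed name
        )
        with request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except (error.URLError, ValueError):
        # Fabricated citations usually fail here: dead domain, malformed
        # URL, or a 404 from a real site that never published the piece.
        return False
```

A fabricated reference typically fails this check outright, but passing it only means a page exists at that address, so manual reading remains step 1.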
When to bring us in
Fabricated sources in legal or medical work are a hard no.
See also
- Can I paste a customer file or email into ChatGPT? Depends on the account and settings. Free ChatGPT and a Team tenant behave very differently from what most people assume.
- I want a one-page AI policy for my team: a real one-pager beats a thick document nobody reads. Four headers and concrete examples.
- How do I tell if an AI answer is made up? Models sound confident even when they are wrong. A few habits catch most mistakes.
None of the above fits?
Describe your situation below. We pass your input plus the steps you already saw to our AI and return tailored next-step advice. If it's too risky to DIY, we'll say so.
Or skip the DIY entirely
Our Managed IT clients do not have to look these things up. One point of contact, a fixed monthly price, and issues resolved within business hours.