Honesty About AI Usage in the Workplace Decreases Trust, Research Shows

Whether you’re using AI to write cover letters, grade papers or draft ad campaigns, you might want to think twice about telling others. That simple act of disclosure can make people trust you less, our new peer-reviewed study found.

As researchers who study trust, we see this as a paradox. After all, being honest and transparent usually makes people trust you more. But across 13 experiments involving more than 5,000 participants, we found a consistent pattern: Revealing that you relied on AI undermines how trustworthy you seem.

Participants in our study included students, legal analysts, hiring managers and investors, among others. Interestingly, we found that even evaluators who were tech-savvy were less trusting of people who said they used AI. While having a positive view of technology reduced the effect slightly, it didn’t erase it.

Why would being open and transparent about using AI make people trust you less? One reason is that people still expect human effort in writing, thinking and innovating. When AI steps into that role and you highlight it, your work looks less legitimate.

But there’s a caveat: If you’re using AI on the job, the cover-up may be worse than the crime. We found that quietly using AI can trigger the steepest decline in trust if others uncover it later. So being upfront may ultimately be a better policy.

Being caught using AI by a third party has consequences, as one New York attorney can attest.

Why it matters

A global survey of 13,000 people found that about half had used AI at work, often for tasks such as writing emails or analyzing data. People typically assume that being open about using these tools is the right choice.

Yet our research suggests doing so may backfire. This creates a dilemma for those who value honesty but also need to rely on trust to maintain strong relationships with clients and colleagues. In fields where credibility is essential – such as finance, health care and higher education – even a small loss of trust can damage a career or brand.

The consequences go beyond individual reputations. Trust is often called the social “glue” that holds society together. It drives collaboration, boosts morale and keeps customers loyal. When that trust is shaken, entire organizations can feel the effects through lower productivity, reduced motivation and weakened team cohesion.

If disclosing AI use sparks suspicion, users face a difficult choice: embrace transparency and risk a backlash, or stay silent and risk being exposed later – an outcome our findings suggest erodes trust even more.

That’s why understanding the AI transparency dilemma is so important. Whether you’re a manager rolling out new technology or an artist deciding whether to credit AI in your portfolio, the stakes are rising.

What still isn’t known

It’s unclear whether this transparency penalty will fade over time. As AI becomes more widespread – and potentially more reliable – disclosing its use may eventually seem less suspect.

There’s also no consensus on how organizations should handle AI disclosure. One option is to make transparency completely voluntary, leaving the decision to disclose up to each individual. Another is to mandate disclosure across the board. Our research suggests that the threat of being exposed by a third party can motivate compliance, provided the policy is stringently enforced through tools such as AI detectors.

A third approach is cultural: building a workplace where AI use is seen as normal, accepted and legitimate. We think this kind of environment could soften the trust penalty and support both transparency and credibility.

The Research Brief is a short take on interesting academic work.

The post “Being honest about using AI at work makes people trust you less, research finds” by Oliver Schilke, Director of the Center for Trust Studies, Professor of Management and Organizations, University of Arizona, was published on 05/06/2025 by theconversation.com.