27.10.2025
Fear of AI is rising. Expert reveals why managers — and their leaders — feel unprepared
With headlines warning of job cuts and “intelligent agents” taking over knowledge work, many executives are feeling uneasy about what artificial intelligence means for their own effectiveness.
A recent Gartner survey of HR leaders found that only eight per cent believe their managers have the skills to use AI effectively, and just 14 per cent say their organisation provides adequate support to help managers integrate AI into everyday work. For many leadership teams, this confidence gap is fuelling quiet but widespread anxiety about the future of management itself.
Fineas Tatar, productivity expert and co-founder of premium executive assistant service Viva, says this fear is not irrational. It is a signal that leadership must evolve.
“Executives rarely admit it, but many are anxious about how AI will affect their ability to lead. The real danger isn’t being replaced by technology. It’s hesitating to adapt while the landscape changes around you.”
Tatar says that AI anxiety often takes three forms among senior leaders:
The Imposter Instinct
Leaders worry they are not technical enough to make informed decisions about AI tools. This creates hesitation and signals uncertainty to their teams, which slows innovation.
The Control Ghost
When leaders do not fully trust AI systems or the people using them, they tighten oversight. This creates bottlenecks and reduces the creative space teams need to experiment and improve.
The Perfection Phantom
Executives often delay adoption until they “fully understand” the technology. In fast-moving markets, that delay becomes a competitive cost.
Rather than suppressing fear, Tatar says leaders should use it as information. It highlights where learning, structure, and support are needed most. He recommends three focus areas for executives who want to stay effective as AI transforms their organisations:
1. Build AI fluency through curiosity, not mastery
Executives do not need to become engineers. What matters is learning to ask sharper questions, identify reliable data, and apply sound judgement. AI is only as useful as the clarity of the person guiding it.
2. Design human-centred systems
The goal of AI should be to remove tactical noise, not to replace people. Leaders who integrate AI thoughtfully free up time for higher-level work such as strategy, culture, and stakeholder relationships.
3. Lead with transparency
Admitting uncertainty does not weaken authority. It builds trust and psychological safety. When leaders share what they are learning, teams are more likely to experiment responsibly and adopt AI in productive ways.
Tatar concluded: “The executives who thrive in this transition are those who face it directly, turn it into action, and design the right support systems around them. AI will not erase leadership, but it will redefine what effective leadership looks like.”
Posted by:
FMJ