Microsoft’s recent wave of layoffs, one of its largest in years, has sparked controversy, not only for its scale but also for the company’s internal response. Nearly 9,000 employees, many in the Xbox gaming division, were let go in early July; amid the fallout, a senior executive’s suggestion that laid-off workers use AI tools such as Microsoft Copilot and ChatGPT to process their grief has raised eyebrows.
Matt Turnbull, a senior producer at Xbox Game Studios Publishing, published a LinkedIn post offering advice to colleagues and industry peers navigating job loss. The post, later deleted, was widely circulated online after being shared by developer Brandon Sheffield on Bluesky and documented by multiple outlets.
“I know these types of tools engender strong feelings in people, but I’d be remiss in not trying to offer the best advice I can under the circumstances,” Turnbull wrote. He explained that he had been experimenting with generative AI platforms such as Microsoft Copilot to help reduce the “emotional and cognitive load” that comes with job loss.
Turnbull provided a collection of AI prompts aimed at helping users manage both professional tasks and emotional distress. These included suggestions for rewriting CVs, composing outreach messages to former colleagues, and even reframing feelings of imposter syndrome. One recommended prompt read: “I’m struggling with imposter syndrome after being laid off. Can you help me reframe this experience in a way that reminds me what I’m good at?”
The post nonetheless drew swift backlash. Critics on social media described the advice as “gross” and “completely detached from reality,” especially at a time when many view artificial intelligence as a factor contributing to job losses in the first place.
The criticism is compounded by the broader context. Microsoft cut more than 6,000 jobs earlier this year, on top of the 10,000 roles eliminated in 2023. The latest round coincided with the cancellation of several gaming projects in development. Executives have cited a need to adapt to a “dynamic marketplace,” with the growing integration of AI often named as a strategic priority.
The use of AI for emotional support remains highly contentious. While Microsoft has promoted Copilot as a potential well-being assistant, particularly for Gen Z and millennial users, mental health professionals have warned against positioning general-purpose chatbots as substitutes for therapy.
Microsoft CEO Satya Nadella and Copilot lead Mustafa Suleyman have both pitched the tool as a productivity booster and, increasingly, a personalised adviser. Suleyman recently claimed that Copilot can “sense a user’s comfort boundaries, diagnose issues, and suggest solutions,” according to Fortune.
Despite these ambitions, experts continue to raise concerns about data privacy, misinformation, and the risk of users relying too heavily on emotionless AI for complex human issues.
Turnbull’s original intent may have been to offer practical help in a time of crisis. But to many, the suggestion of turning to Copilot for emotional clarity felt tone-deaf, especially when it came from an executive at the very company behind the layoffs.