The Mixternal Comms Playbook is pleased to present part two of an exclusive three-part series from Frank Dias that explores how a solo communicator can effectively run an employee engagement survey with the help of artificial intelligence. Frank breaks down his step-by-step approach to building and executing a survey aligned with business priorities. He uses AI tools to streamline tasks such as question development, data analysis, and reporting. He highlights the importance of AI in augmenting human expertise, emphasizing that communicators can use it to boost efficiency, save time, and generate actionable insights.
Read part one here:
And now, part two…
This article is focused on one of my IC+AI use cases: the employee engagement survey.
In my three-part behind-the-scenes series, I’m breaking down my workflow into which parts are person-owned and which parts are AI-augmented/enhanced in running an employee engagement survey. I highlight what I did and how I did it.
The challenge
Can one person, with the help of AI, run an employee engagement and communications survey across a global audience of 3,000+ employees and deliver valuable, actionable insights to the company's leaders, aligned with their priorities?
Part 1: Survey Development & Design
Part 2: Survey Execution, Communications & Data Collection
Part 3: Reporting, Analysis, and Leadership Presentation
In this second installment of my series on AI-enhanced employee engagement surveys (you can read part 1 here), I cover survey execution, the communications around it, and what to do with the collected data: preparing it for AI use so it's ready for the deep analysis behind my report, which I'll cover in part 3.
A quick recap
My simple survey workflow looked like this:
Together with AI, I developed, shaped, and landed on these five themes, which map to what a healthy organisation should want to measure and, where necessary, improve:
Leadership, Strategy, Progress, & Direction
Employee Connection & Communications
Performance, Productivity, & Development
Work Environment, Culture, & Wellbeing
People Management, Feedback, & Recognition
Here's a copy of the list of 50 questions I used for my survey. In hindsight, 50 is far too many.
I’ve already covered how I developed the survey and its design. Now, it's time to tackle the survey execution, communications, data collection, and analysis process.
1. Go-live: Launching the survey
Survey tool – The organisation I worked in used the Google Workspace suite, so to keep things simple, accessible to the target audience, and in-house, I used Google Forms. I didn't need to pay for a fancy external survey tool.
Once I laid out the question set, it was easy to create the form, set up branching, and decide the question types and how I wanted people to input their answers. I break this down in my list of 50 questions.
Identifying the target audience – This was simple: everyone! However, I needed key stakeholders to sense-check the questions, the survey approach, and the outcomes I was aiming for, so the results would add value for them.
As I mentioned in Part 1, I connected with my manager first, then with the right senior leaders, including HR, the Social Equity/DEI team, my marketing and comms team, and several other trusted contacts. I was also after their support in promoting the survey once I'd taken their feedback on board and tweaked it.
Drafting the communications – I kept this simple too: a CEO email to everyone, with reminder chasers over a two-week period, and targeted emails with participation league tables to department heads to prompt their local calls to action. I created two specific assets:
The CEO comms – An example of the CEO email I sent out. I had it translated into French and included the translation after the English, as a single email to everyone.
About the survey one-pager – This covered details I didn't include in the CEO comms, such as the survey's purpose, objectives, and measurement, and how I would manage sensitive data collection, usage, and access.
Run-time – I ran the survey for just two weeks. Ideally, I'd recommend three weeks as a minimum and four as a maximum. Once it was live, it was mostly a matter of waiting for the closing date, though I regularly peeked at the answers to spot early trends.
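As a side note, the participation league tables I sent to department heads are easy to automate. Here's a minimal sketch, assuming the survey tool exports one row per response tagged with the respondent's department; all department names, headcounts, and response numbers below are invented for illustration:

```python
from collections import Counter

# Invented headcounts and responses for illustration only.
headcount = {"Engineering": 900, "Sales": 600, "Operations": 700, "HR": 100}
responses = (
    ["Engineering"] * 540 + ["Sales"] * 420 + ["Operations"] * 280 + ["HR"] * 80
)

# Count responses per department.
counts = Counter(responses)

# Participation rate per department, sorted into a league table.
league = sorted(
    ((dept, counts[dept] / size) for dept, size in headcount.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for dept, rate in league:
    print(f"{dept:<12} {rate:.0%}")
```

Sharing the sorted table with department heads turns raw counts into a gentle bit of competition, which is the motivational effect the targeted emails were after.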
2. Using AI for data collection and initial analysis
I won't dwell on this too much, as I cover some of the details in Part 1. After trying and comparing other tools, I used ChatGPT's data analysis feature, the most reliable AI tool I found at the time.
Using AI for this data-heavy lifting was vital to speeding up the process of finding insights, generating analysis, and comparing the data against demographics for in-depth reporting.
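Before uploading any export to an AI tool, it's worth stripping out anything that could identify a respondent. Here's a minimal sketch of that preparation step, assuming a CSV export of responses; the column names and rows are invented for illustration, not the actual survey data:

```python
import csv
import io

# Columns that could identify a respondent (invented names for illustration).
SENSITIVE = {"Timestamp", "Email Address", "Name"}

# Stand-in for a CSV file exported from the survey tool.
raw = io.StringIO(
    "Timestamp,Email Address,Department,Q1 score\n"
    "2024-01-01,a@example.com,Sales,4\n"
    "2024-01-02,b@example.com,HR,5\n"
)

# Drop the sensitive columns from every row.
reader = csv.DictReader(raw)
cleaned = [
    {key: value for key, value in row.items() if key not in SENSITIVE}
    for row in reader
]

# Write the anonymised rows back out, ready to upload for analysis.
writer_out = io.StringIO()
writer = csv.DictWriter(writer_out, fieldnames=["Department", "Q1 score"])
writer.writeheader()
writer.writerows(cleaned)
print(writer_out.getvalue())
```

Keeping only aggregate-friendly columns like department and scores means the AI tool can still cut the data by demographics without ever seeing who said what.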
Once the survey closed, I looked through Google Forms' built-in visual summary of the results, including graphs, to get an initial feel for the stories in the data.
A sample of two of the visuals: