Measuring Internal Comms before it was Cool with Angela Sinickas

Chuck Gose: You’ve worked with a lot of global companies across so many industries. Which do you feel communicators get to have the most fun in? And by fun, I guess I’m referring to being creative and clever.

Angela Sinickas: Hey, I don’t do creative and clever. I do numbers.

CG: But surely you see companies in some industries who are being creative in telling number and data stories too?

AS: I think it has less to do with industry and more with the senior leaders of the company. I’ve seen an oil company CEO who had more emotional intelligence than CEOs of “softer” companies in the entertainment industry. When communicators have leaders who get what comms can really do for them, communicators are trusted when they make recommendations, no matter how crazy it might seem — including getting dressed up in costumes at town halls or singing You are the Wind Beneath my Wings at employee recognition events.

CG: What made that CEO different?

AS: He really connected with people, not just talked with them. And, he was more systemic: When I interviewed him on what he expected communication-wise from all managers and supervisors, the answers he gave me were identical to the answers I got from managers at all levels as to what communication behaviors they thought the CEO expected from them. That rarely happens.

“I made my first speech on measurement in 1981 when no communicators ever thought they’d need that information.”

CG: Your reputation and work is built around measurement. I really want to dig into this. Cool?

AS: Way.

CG: This might be a strange question but do you enjoy it or are you just really good at it? Or both?

AS: I started laughing at the first part of “enjoy it.” Absolutely I do enjoy it. When I took the ACT and SAT pre-college exams, my math and verbal scores were almost the same, which made it hard to choose a major. I did some writing for the daily newspaper at the University of Illinois, and halfway through college I switched my major to journalism. My first few jobs were for the U of I medical center, so I communicated about scientific things — a good blend of aptitude and skills.

At my next employer, the CEO was an accountant, so I had to tap into the science/math part of my brain to find a way to get numbers and research into my plans for him, and I started getting scientific about communication. Great results! From being the editor of the employee magazine, I was promoted to a new position as internal comms manager, and got a staff of two and a budget during a recession. That’s when I really began to enjoy measurement! I kept doing it in every job I had, and I love sharing this expertise with other communicators and hearing how it has enhanced their own careers and credibility with their leaders. I’m still as passionate talking about this as I was when I made my first speech on measurement in 1981, when no communicators ever thought they’d need that information.

CG: Are you an internal comms measurement hipster then?

AS: I’m not cool enough to know what a hipster is.

CG: Like you, I love math, numbers and data. I proudly tell people that I took calculus in college because I wanted to – not because I had to. But what do you tell communicators who hate measurement? Is it like a kid who won’t try lima beans – just tell them they have to do it?

AS: No, it’s more like putting a little mashed apple into their mashed lima beans: you sweeten the concept. I usually start by saying, just do a different type of research. We ALL do research before we write anything, right? I just suggest that research include some empirical data. It can be as easy as looking at some online usage data, or asking a sample of employees some questions about a topic before you develop a full-blown communication plan that will fail because you weren’t aware of employees’ current knowledge, attitudes or behaviors related to the topic of your campaign.

“When they do measure, [Internal Communicators] can fall into the trap of measuring their own activities because it’s easier to do, doesn’t require money and doesn’t require permission.”

CG: From your experience, what’s the biggest mistake communicators make when it comes to measurement? And you can’t say “they’re not measuring.”

AS: I once wrote about the top 10 mistakes. I think internal communicators fear that the numbers might show they’re doing a bad job, and that’s why they want to avoid the entire situation. And when they do measure, they can fall into the trap of measuring their own activities because it’s easier to do, doesn’t require money and doesn’t require permission. But saying you wrote six articles about safety doesn’t give you the right to take credit for a reduction in accidents.

CG: It doesn’t?

AS: It could, but you need to build the chain of evidence. For example, if you wrote about safety only for employees in half your locations, and occurrences of that type of accident went down in your pilot locations but stayed the same in the other ones, now you’re building a credible correlation. Or, as we did with one client, we asked employees on a survey which safety behaviors they paid more attention to in the past year specifically because of the new communications they received (cause and effect). Not only did we see the reported behavioral impact of the communication, but we compared the survey results with the actual reduction in accidents. Not surprisingly, the reduction rates were greatest for the behaviors employees said they now paid the most attention to. That doesn’t take calculus to calculate. It’s just adding up survey percentages and comparing them against another department’s outcomes.

CG: Health and safety departments are great places to start, in my opinion. There’s obviously an emotional component to what they do, but they are data driven. Everything they do ties back to a metric of some kind, to watch for trends and hiccups in performance. Communicators could learn from how other departments in their own companies use data.

AS: So true. My clients are thrilled when they can point to some data from a survey we’ve done to explain to an executive why what they want to do won’t work. For example, one company’s function heads were pushing back against not being allowed to send all-employee emails anymore. They all had to give the raw information to comms, who then included a number of these announcements in consolidated emails with better written headlines and summaries — with links to the original raw data on the intranet. In our recent survey, 85% of employees said they now read more of these announcements in the consolidated and edited form than they ever did when they were one-offs. End of debate.

CG: Let’s say a communicator joins a new company and day one they see no measurement is in place. What are some early steps you recommend they take to get started?

AS: I recommend focus groups to get a feel for what is going on “out there.” Based on that, you can draft better survey questions to quantify what is working and what needs improvement. I also recommend starting with interviews with each member of the executive leadership team to identify how they think IC is currently working and what they would like to see improved. That information also feeds into the first survey design.

CG: I thought you were going to say “Call Angela at…”

AS: That goes without saying. Or at least, it’s better if you say it. The number, by the way is (714) 904-0671!

CG: Well done. When it comes to focus groups and more qualitative data, how do you balance it out when it conflicts with quantitative results? What I’m asking is that sometimes what people say is different from how they behave.

AS: There are a couple of issues here. First, in focus groups, you’re not talking to a truly representative slice of the entire employee population, and second, you’re not talking to enough of them for any type of statistical reliability. So what you hear in focus groups is not only not representative of the whole, but you get a bit of peer pressure in a face-to-face setting that makes quieter people uncomfortable disagreeing out loud with their louder peers. That’s why a survey is always a more accurate way to find out how prevalent various types of knowledge, attitudes and behaviors actually are.

CG: And how do you feel about anonymous versus named surveys?

AS: For employee surveys, you’ve got to make it feel as anonymous as possible, especially if any of the questions could come back to bite them, such as questions about their supervisor’s communication skills or behaviors. That also means keeping demographic questions to a minimum — only the things you could act on differently. You can communicate differently by location, business unit, shift or job level. You can’t segment communication approaches by age, gender or, generally, years of service (other than new employee orientation and pre-retirement planning). For those types of segmentation, we have to respond to the overall percentages of the employee group who want or need different approaches and make those approaches available to everyone.

CG: Do you think there’s any value at all in capturing things like age, gender, years of service, etc then when gathering feedback and data?

AS: It can sometimes be interesting to clients, but not actionable. It’s particularly fun when we do ask something like age categories (linked to the generations) and come back with a result that shows the younger people also still want a printed version of the 32-page magazine, or older people use a particular type of social medium in their personal lives at a much higher percentage than anticipated. But you can’t do anything with that data, and you pay a lot more to get all your results sliced and diced so many more ways.

“If you do research that isn’t actionable, you’re wasting time and money. And it’s a shame because sometimes it’s so easy to make a survey question more actionable.”

CG: I think that’s a great lesson you shared. Not all data you capture can be actionable. Or should be. That doesn’t mean that it’s not valuable.

AS: I think I disagree. If you do research that isn’t actionable, you’re wasting time and money. And it’s a shame because sometimes it’s so easy to make a survey question more actionable. For example, I saw a homemade survey question that asked employees to agree/disagree with the statement, “I get too much email.” About one-third agreed, and so the communicator concluded that there’s too much email usage. First of all, that makes no sense if only one-third agreed. Another one-third disagreed with the statement and another one-third were neutral. How many of them actually meant they were getting too little by email, or just the right amount? You can’t tell. A better form for that question is, “Are you getting too much, too little or the right amount through email?” That will be actionable.

CG: Is there something that you see communicators measuring that you believe is a waste of time and resources?

AS: Most usage statistics, because all most of them tell you when someone clicks on a link is that you did a great job of promoting it. You don’t know whether that click led to the information they were hoping to find, whether they read it, whether they understood it, or whether they are going to act on it. Many of these numbers need to be put into context. If your readership goes down in the summer or around the holidays, it’s not because the content got bad, but because people aren’t at work. Those numbers need to be tracked as the percentage of available eyeballs at work each month.

CG: Are you a baseball fan?

AS: Only my Cubbies. I was born in Chicago, and while I was working for the Chicago Tribune when they owned the Cubs, I could hear the organ music from my back stairs.

CG: I am a baseball fan. And here are a few things I know. One, most Cubs fans aren’t baseball fans. And two, how baseball is measured has changed with advanced analytics. When I was a kid, you only had home runs, batting average, steals, ERA, etc. Now there are all kinds of ways to analyze baseball through these new analytics that give more accurate comparisons. Do you ever think IC will get to this level?

AS: We are there — when you work with the right people to guide you to understanding what will matter in terms of measurement, not just to manage your own program, but also to make sense to your management team. And it’s not just me. There’s a lot of great stuff online about this. The CIPR in the UK developed an IC Measurement Matrix that’s clear, simple, and actionable. You can use it to help develop survey questions or to do measures without surveys.

CG: And one final question for you. Describe your thoughts on internal communications via an emoji.

AS: Aww, I’m having too much fun to stop. Here’s what I got: ?

I’m cautiously optimistic that we are elevating the field to a professional management function that gets credibility with leaders.

CG: Thanks for the great chat, Angela!
