Measuring the Effectiveness of a Published Corporate Plan
Q: I work with Brisbane City Council, Australia’s largest local government. Council has a strong commitment to communicating with its 1.2 million-strong target audience, and in June this year launched its most readable Corporate Plan ever. We are now in the throes of assessing the effectiveness of this report and would appreciate any assistance in identifying relevant criteria.
A: Dear Orla:
Ideally, it would be wonderful to have some of the following types of measures on last year’s Corporate Plan to compare against this year’s. If you did measure anything specific last year, I would start by repeating those measures this year to see if there is any change. Here are some other types effectiveness measures you might try:
Not having seen the plan, I don’t know what type of information is included in it, or why it is issued each year. However, I’m guessing that there are some desirable outcomes you’d like from the readers of the plan. I’d start there:
- Do you want more or fewer of them to show up at Council meetings on any particular topic covered in the plan?
- How many of them used a phone number or address the Plan provided for them to contact the Council with questions, comments or concerns?
- Did you want them to vote a particular way on any public referendums?
- Did you want them to change their opinions on certain issues that you track by some type of poll on an ongoing basis?
Whatever those ideal outcomes are, that’s the first thing I would track and compare against the level they were at BEFORE the plan was issued. If there is an outcome that you want each year after the plan is issued, then try to compare your outcome this year against the outcome after last year’s plan was issued.
Reading grade level test
Many word processing programs offer a reading grade level test that tells you how many years of formal education someone needs in order to understand the writing. It often appears as part of the grammar check under the Tools or Edit menu. See how well the grade level matches your target audience’s average education level. In a separate email, I’m attaching a worksheet you can use to do this by hand if your software doesn’t offer this tool.
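If you would rather automate the by-hand worksheet, the widely used Flesch-Kincaid grade-level formula (0.39 x average words per sentence + 11.8 x average syllables per word - 15.59) is easy to compute yourself. Here is a minimal Python sketch; note that it uses a rough vowel-group syllable counter rather than a pronunciation dictionary, so treat its output as an estimate:

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable count: runs of vowels, minus a silent trailing 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level: years of schooling needed to follow the text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

sample = "The Council will hold a public meeting. Residents may attend and ask questions."
print(round(flesch_kincaid_grade(sample), 1))  # prints 5.1
```

A result of about 5 means roughly five years of formal schooling, so you can compare the score directly against your audience’s average education level.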
Phone or paper survey
Do a telephone survey with at least 400 to 600 randomly selected respondents (but preferably more). First ask whether they remember receiving the Plan. If not, that tells you how many either didn’t receive it or tossed it out before figuring out exactly what it was. For the respondents who do remember receiving it, you can ask a series of readership-survey questions, such as:
- How much of the Plan they actually read
- Which types of information they prefer to read
- How easy the writing is to understand
- Whether the Plan was too long, too short or just right
- How clear the section headings/headlines were
- How effective the photos and illustrations are
- How easy the layout is to follow
- What the preferred distribution method is
- The overall value of receiving the Plan
You can also ask them to what extent reading the Plan affected a series of potential behavioral or attitudinal outcomes (such as voting differently on an issue, discussing something they read with a neighbor or friend, changing their opinion on an issue, or feeling differently about the Council itself).
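The 400-to-600 figure reflects the usual margin of error for a simple random sample: at 95% confidence, assuming the worst-case 50/50 split on a question, 400 respondents give roughly +/-4.9% and 600 give about +/-4.0%. A quick sketch of that arithmetic in Python:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion p estimated from n random respondents."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (400, 600, 1000):
    print(n, f"±{margin_of_error(n):.1%}")
# 400 ±4.9%
# 600 ±4.0%
# 1000 ±3.1%
```

Note the diminishing returns: going from 600 to 1,000 respondents only narrows the margin by about one percentage point, which is why samples in this range are common for citizen surveys.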
Starch Test focus groups
You could convene a series of Starch Test focus groups with randomly selected citizens who said they read all or part of the Plan. Start the session by asking participants what they remember seeing in the Plan (this is the unaided-recall section of the test). Ask each person to write down their own list of the topics, pictures, headlines, anything they remember from the Plan. Then debrief each one so you can see which elements had the greatest overall recall.
Then hand each participant a copy of the Plan and a worksheet that lists each section or element of the Plan down the left-hand side of a table, with column headings of “Skipped,” “Skimmed” and “Read Thoroughly.” Ask each participant to go through the Plan page by page and place a check mark in one of the columns for each section to indicate how they read it. Then discuss with the group what types of sections they read thoroughly and WHY, and what they skipped and WHY. This will tell you exactly what appealed to them about the Plan and what didn’t, from the perspectives of content, writing style and design/formatting. (This is the aided-recall portion of the test.)
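Once the worksheets are collected, the column tallies are simple to summarize per section. A sketch of that tabulation in Python (every section name and count below is invented purely for illustration, not taken from any actual Corporate Plan test):

```python
def readership_shares(responses):
    """Fraction of participants who checked 'Read Thoroughly' for each section."""
    return {section: tally["Read Thoroughly"] / sum(tally.values())
            for section, tally in responses.items()}

# Hypothetical worksheet tallies: section -> count of check marks per column.
responses = {
    "Mayor's message":  {"Skipped": 1, "Skimmed": 3, "Read Thoroughly": 8},
    "Budget summary":   {"Skipped": 6, "Skimmed": 4, "Read Thoroughly": 2},
    "Capital projects": {"Skipped": 2, "Skimmed": 5, "Read Thoroughly": 5},
}

for section, share in readership_shares(responses).items():
    print(f"{section}: read thoroughly by {share:.0%} of participants")
```

Ranking sections this way makes it easy to pick the highest- and lowest-scoring ones to probe in the “why” discussion with the group.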
After you finish debriefing them on the current Plan, you can distribute last year’s Plan and have the group critique the differences from year to year to see what they like better about either one.
Hope these ideas give you something to get started on. Feel free to email me directly with more explanation of the Plan and what it’s intended to do so I can provide more specific recommendations for you.
Angela D. Sinickas