Building the Campaign System Every Semester Runs On: University of Arizona


You're running campus communications for a large university. You know the brand. You know the audiences you're supposed to reach. You've been doing this long enough to have strong intuitions about what works.
But intuitions about what audiences want aren't the same as confirmed data on what they actually need — and when your next initiative is campus-wide safety communications, reaching five very different groups on a topic that's emotionally loaded for a lot of people, "we think we know" isn't enough of a foundation to build from. And even if you go out and get the data yourself, you still have to figure out what to do with it across a decentralized campus where a dozen different departments each run their own slice of communications.
That's where the University of Arizona's central marketing team was when we started working together.
TL;DR
- Client: University of Arizona
- Situation: Needed to launch a campus-wide safety communications initiative across five distinct audiences, with no confirmed data on any of them and no shared system for departments to coordinate from.
- The work: Multi-segment audience panel, full campaign strategy, and a complete evergreen campaign system.
- The shift: From every department working off different assumptions about what their audiences want, to one confirmed foundation the whole campus executes from.
- The outcome: What five audiences actually told them — plus the complete campaign system their teams could use immediately and keep using every semester.
The Challenge
The university’s central marketing team had been managing campus communications long enough to know what they were doing. What they didn't have, for this particular initiative, was any baseline data on what five very different campus audiences actually knew about safety resources, how they wanted to hear about them, or which messages would land versus which ones would get scrolled past.
That's a specific kind of stuck. Not "we don't know how to run campaigns" stuck. More like: "We're about to spend a meaningful amount of time and money communicating about something important and sensitive, and we're going to have to guess at the foundation it's built on." For a team that's accountable for campus-wide communications, that's uncomfortable. Because when the messages don't land with students (or land wrong) there's no good explanation that doesn't start with "we assumed."
Here's the part most people skip over when they think about audience research: even if you run the focus groups and get solid findings, you've still got a gap. What do you do with what you learned? How does a research report become a content calendar? How does "students prefer to hear about this from peers" become an actual playbook for recruiting and activating peer influencers? How does "parents want clear, direct information in email" become a template the right department can execute against next Tuesday?
Research reports sit on shelves. The university needed something that would get used, and keep getting used year over year with only minor tweaks.
They needed to know what each audience actually told them, and they needed that to become the system their teams would work from, not a document their teams would eventually get around to reading.
The Shift
What made this work worth doing was committing upfront to the full arc: go directly to each of the five audiences → find out what they actually think, know, and need → use that to build strategy → then turn the strategy into a campaign system that departments can run without guessing.
Most projects stop at step two. This one had to go all the way to step four.
The belief that had to change going in: it's not enough to say "we'll run some focus groups and use what we learn to inform our strategy." The goal had to be "what the audiences tell us becomes the actual infrastructure (the personas, the content calendar, the templates, the playbooks, the measurement system) that our teams use this semester and the next one and the one after that."
That's a different scope of commitment. And it's also the only version of this project that doesn't end up on a shelf.

Jenna Rutschman
Executive Director, Campus Marketing Strategy
University of Arizona
In my role at central marketing within U of A, [Sunny's] been invaluable, offering workshops, counsel, and working collaboratively on numerous initiatives. [She's] easy to work with, fun, intelligent, and forward-thinking. I trust them to deliver beyond expectations, on time, and within budget. Their transparency and proactive communication ensure any potential issues are addressed promptly.
The Work
We started where we had to: talking directly to each of the five audiences.
Phase 1: Built and Ran Multi-Segment Research
We designed and ran a panel covering current students, graduate students, faculty, administrative staff, and parents. The goal wasn't only to document what they knew, it was to understand how they thought about campus safety, which channels they actually used, who they trusted to communicate this kind of information, and which early messages felt useful versus which ones they'd tune out. Because the subject matter had the potential to hit hard emotionally, the conversations had to be structured carefully — sensitive enough to get real answers, direct enough to get actionable ones.
What came out of the virtual focus groups wasn't just a picture of what each group knew. It was what they didn't know that the university had assumed they did, what they actually cared about that no one had thought to ask, and critically, which messages landed before any budget was spent finding out the hard way.
- Designed the research panel
Built a research panel covering five campus audiences: current students, graduate students, faculty, administrative staff, and parents. Designed it to capture both hard data (what they know, where they get information, which channels they use) and softer insights (what resonates emotionally, whom they trust, what makes them pay attention).
- Ran virtual focus groups
Conducted virtual focus groups with each segment, asking critical questions about awareness, current knowledge, how they prefer to receive information, and which early messages resonated versus fell flat. The subject matter had the potential to be highly emotional for many people, so we had to be sensitive while still getting actionable data.
- Gathered baseline intelligence and quantified qualitative responses
Research showed what each group actually knew (versus what the university assumed they knew), which messages landed, which channels they actually used, and whom they trusted. Every insight was captured specifically to inform what to create and how to create it, not just filed as "interesting findings."
Phase 2: Turned Intelligence Into Something Teams Could Use
From the audience conversations, we built personas. Not the kind that gets printed on foam boards and referenced once. These were built around mindset and behavior — how each audience moved from unaware to informed to taking action. Each persona came with: the content they needed at each stage, the channels they actually used, who they trusted, and their level of understanding of the topic. A team member could pick up any one of these and know what to create without another strategy conversation.
Then we mapped how each persona moved through the campaign and translated those maps into concrete content specifications: what to say, in which channel, in what tone, and at what point in the arc. Not journey maps for the strategy deck. Actual briefs departments could hand off and execute.
The channel strategies went in the same direction. Not "students use Instagram" — but here's the format that works for this audience on this platform, here's how often, here's how to adapt the message while keeping the campaign consistent, and here's what you're measuring to know if it's working.
- Built personas teams could actually use
Created personas based on mindsets and behaviors, not just demographics. These weren't abstract profiles: each one included exactly what content the audience needed, which channels they used, whom they trusted, and what stage of understanding they were at. Teams could look at a persona and know exactly what to create for them.
- Mapped journeys to content needs
Showed how each persona moved from awareness to understanding to action, then translated that into concrete specifications: what information to deliver at each stage, through which channels, and in what tone. Journey maps became content briefs teams could execute against immediately.
- Created channel playbooks
Built strategies for Facebook, Instagram, X, LinkedIn, and YouTube Shorts, but didn't stop at "students use Instagram." Created execution playbooks: which personas use each platform, what formats work, how often to post, how to adapt messaging while staying consistent, and what success looks like. Teams could start posting immediately without guessing.
Phase 3: Built a Complete Multi-Channel Campaign System
This is where the audience data became infrastructure.
We created an evergreen content calendar — not a "fall semester plan," but a structure built to be updated each semester rather than rebuilt. Themes, messaging angles, and content types are defined so that teams are improving what works rather than starting from scratch every time.
Working with an agency partner, we created a production-ready asset library: copy templates by audience and channel, a graphics library, and a design system that allows each department to create locally relevant content while keeping the visual and message framework consistent. Not examples to inspire future work; these were the actual files teams could use the same week.
We built influencer activation playbooks for each audience. Not "peer voices are influential for students" but: here's how to identify who those peer voices are, here's how to recruit them, here's what to ask them to create, here's how to amplify it, and here's how to know if it's working.
And we built the production system itself: who creates what for which audiences, how it gets reviewed, how approval works across departments, so the campaign could execute consistently without everything funneling back through central and creating a bottleneck.
- Evergreen content calendar
Created a messaging calendar mapped to campaign goals and audience needs, designed as an evergreen system. The structure works semester after semester with updates, not rebuilds: themes, messaging angles, and content types are defined for ongoing use, and every piece of content is aligned with specific audience needs.
- Production-ready asset library
Working with an agency partner, created assets teams could use immediately: copy templates for each audience-and-channel combination, a graphics library (icons, animated GIFs, video overlays, social templates), and a design system allowing customization while keeping things consistent. Everything was built to adapt, so teams could update with new information while keeping the proven structure.
- Influencer activation playbooks
Identified influential voices for each audience (peer students, orientation leaders, campus organizations, faculty, parents), then built complete activation plans: how to recruit them, what to tell them, what content they should create, how to amplify it, and how to know if it's working. "Students trust peers" became "here's exactly how to recruit and activate peer influencers."
- Production system for ongoing execution
Built the process: who creates what content for which audiences, how it gets reviewed and approved, and how to maintain quality across decentralized departments. This meant consistent execution without everything funneling through one bottleneck.
- Measurement system tied to the research baseline
Set up tracking with specific metrics tied back to the baseline research, and built in a follow-up research approach to see whether awareness shifted and to inform ongoing optimization. Not just "measure engagement" but "here's exactly how to track it and use the data to improve."
The Outcome
The campaign launched ahead of the fall semester with teams using the system immediately, not months later, after trying to figure out how to turn strategy into actual posts.
What's different now isn't just that the university has a campaign. It's that they have a foundation every department can build from. Teams across campus can create locally relevant content while staying consistent because they have persona guides, asset libraries, and content frameworks, not just strategic recommendations floating somewhere in a shared drive. Different departments can work independently without stepping on each other's messaging because the system defines who does what and how.
And next semester, they're not rebuilding from zero. They're updating what already works.
For a campus-wide initiative on a sensitive topic, reaching audiences as different from each other as undergraduates and parents, the alternative to this kind of shared foundation is every department doing its own thing and hoping it adds up to something coherent. It usually doesn't.
Measurement was tied back to what the audiences told us in the panel, so follow-up data will show whether awareness actually shifted and where to focus next.
The Takeaway
For other organizations:
Most research ends with insights and recommendations. Teams are left to figure out execution on their own, and most never do it effectively, or they rebuild from scratch each cycle. The value is in going all the way: run the research → turn insights into strategy → build a turnkey campaign teams can use immediately and keep using.
Who's probably in the same stuck place:
Organizations with research reports that never become campaigns. Universities that need consistent communications across decentralized operations but have no bridge from insights to execution. Teams rebuilding campaigns each cycle because they don't have an evergreen system to build from. Organizations that understand their audiences in theory but can't turn that into calendars, assets, and systems that keep working.
What this case proves:
Research without execution is expensive shelf decoration. The value is in the complete arc: audience intelligence → strategy → turnkey evergreen campaign. Organizations need research that gathers actionable insights, personas that become content specs, journeys that become calendars, channel strategies that become playbooks, and assets that enable immediate use and ongoing updates. This research-to-execution capability, especially building evergreen systems instead of one-time campaigns, is what separates consultants who deliver insights from partners who deliver sustained results.
