All the lead-generation and sales in the world won’t save you if your product doesn’t perform.

Digital marketing can be pretty straightforward. Learn about your customer, build the right content that speaks to them, get it placed in front of them, make sure your website has a way for people to show that they’re interested, and watch the leads roll in.

Pretty soon, you’ll be able to say things like, “online marketing is responsible for 25% of the company’s sales this quarter” and other fancy sentences that show your bosses, the board, and whoever else that you’re crushing it.

It’s a good life: doing things, showing a return on investment, keeping the sales queue full. But what happens when you notice that, as you land more and more clients, engagement with your product is dropping?

That’s where we found ourselves last July. Here’s a chart.

Oof.

Not only were we not keeping up with last year, we were losing ground a little bit every month. As we scaled up from 200 to 1,000 clients, our only respite was Summer break when all the schools we work with went on vacation.

That day, we kicked off a project that turned things around for a small group of our sites. In fact, it blew them up, with average traffic numbers that beat the rest of our network by 300% and set new standards for our team.

That's more like it.

It all happened because back in July, we knew that without a product that performed, new customers were useless. And we couldn’t count on our current customers to figure out how to right the ship on their own either. So we decided to break marketing out of its traditional role and make it quarterback of a cross-departmental team dedicated to understanding and improving engagement on our websites.

Here’s what we did:

Step 1: Group ’em up!

1,000 unique customers was no joke, and we knew we wouldn’t have any kind of impact on all of them at once, so we started small and asked our local sales team to pick a school or two from each of their states. We then took those customers, organized them, and made them our test group.

Here are all the pretty logos:

27 customers, spread out across the states

(I’ve got the) Power Rankings

At the start, we didn’t really know what we were looking for. We talked to the schools, let them do what they were already doing, pulled the usage numbers at the end of the month (for us, it was pageviews), and ranked them, starting in August (when school got back in session). This gave us a chance to actually see what was happening.
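In data terms, the power rankings were nothing fancy: pull each school’s monthly pageviews and sort. Here’s a minimal sketch in Python (the team names and numbers are made up, and your analytics export will look different):

    # Minimal sketch of the monthly power rankings: sort the test group
    # by that month's pageviews. Team names and numbers are made up.
    august = [
        {"name": "Wildcats", "pageviews": 18200},
        {"name": "Trojans", "pageviews": 9400},
        {"name": "Hornets", "pageviews": 12750},
    ]

    rankings = sorted(august, key=lambda school: school["pageviews"], reverse=True)

    for rank, school in enumerate(rankings, start=1):
        print(f'{rank}. {school["name"]}: {school["pageviews"]:,} pageviews')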

Here’s how August shaped up:

Why were the best the best?

Once we had our rankings, we started at the top. What did our highest achievers have in common?

We needed a measuring stick to help us rate everyone else and guide our plan.

The top three schools on our rankings had two things in common:

  • They were able to get 3x the school’s enrollment to visit their site that month. Lots of people from the community were visiting, over and above the students who went there.
  • Once someone visited, they hit 7 or more pages on the school’s site during the month. The community was engaged enough with what was on the site to keep clicking around.

So, we started focusing on those two measurements, and they told us a lot.

Based on that information, if a school didn’t have 3x their enrollment visiting the site, they needed help with marketing. For one reason or another, the customer just wasn’t getting enough different people from their community to the site.

Then, if a school’s visitors weren’t hitting 7 pages over the course of a month, it was a content problem. For whatever reason, the articles, photos, and information on the school’s site just weren’t exciting enough to rack up pageviews.
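That logic boils down to two comparisons. Here’s a rough sketch, with the field names and sample numbers as stand-ins for whatever your analytics gives you (the 3x-enrollment and 7-pages-per-visitor thresholds are the ones above):

    # Rough sketch of the two-threshold diagnosis. Field names and the
    # sample school are hypothetical; the thresholds come from the top
    # three schools in the rankings.

    VISITOR_MULTIPLE = 3     # monthly visitors should be >= 3x enrollment
    PAGES_PER_VISITOR = 7    # each visitor should hit >= 7 pages a month

    def diagnose(school):
        """Return which kind of help a school needs, if any."""
        needs = []
        if school["visitors"] < VISITOR_MULTIPLE * school["enrollment"]:
            needs.append("marketing")   # not enough of the community showing up
        if school["pageviews"] / school["visitors"] < PAGES_PER_VISITOR:
            needs.append("content")     # visitors aren't clicking around
        return needs or ["on track"]

    # Hypothetical numbers: lots of pages per visitor, not enough visitors.
    example = {"enrollment": 800, "visitors": 1200, "pageviews": 11500}
    print(diagnose(example))   # ['marketing']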

Now check out the power rankings again, this time with our math added:

We dragged that same math across the entire test group and learned:

  • Our top three hit both requirements. They didn’t need marketing or content help.
  • Five of our schools needed help marketing their sites.
  • Four of our schools needed help improving content on their sites.
  • Ten of our schools needed improvement in both.
  • Three of our schools didn’t have enough data to make the call.

Where they placed also mattered. It wasn’t a clear trend, but roughly, we noticed that if a school was toward the bottom of the chart and wanted to move up a little higher, it was best to concentrate on marketing their site. If the school was near the top but just couldn’t top the charts? Work on content. Almost everyone in places 16–25 needed to work on both.

Based on this ratio, we also learned who had the most potential. Just look at #7, the Eagles. Visitors to their site hit almost 10 pages in the month, the most in the entire group, but the school fell so far short in marketing the site that they couldn’t make those visits count. Even so, we knew that going forward, they’d be a school to watch.

These ratios taught us where to focus, and how to teach our customers to improve.

Get specific

Ratios were a great start, but they were only the beginning. Now that we knew what needed to improve, the next question was how they could do it. So we looked back at the schools, dug deeper into the analytics, and got specific.

Analytics told us that photos were a huge driver of pageviews — more than almost every other content category.

In August, a couple of tactics jumped out at us as making a big difference across the sites in our group:

Marketing the site:

  • Connecting an athletics page to a school’s website — As easy as hooking a school’s VNN athletics website up to their main school site so that when someone clicks ‘athletics,’ they’re redirected to the VNN page. In the beginning, parents and fans usually go to the school’s website first when they’re looking for information.
  • Promotion of the athletic page on the school site — exactly what it sounds like. Did the school put a link to their site on the announcements page?
  • Sharing of posts on social media — Schools higher up on our list had higher share counts per article than schools that placed lower. Social media was the perfect way to increase a site’s reach in the community.

Improving Content:

  • “Tell me what I need to know” content — Game recaps, score reporting, and rich media content are an integral part of what makes a VNN site valuable to a school, but at its most basic, it makes sense that a big number of visitors to the site are simply looking for information on where they need to be, whether that’s a schedule, tryout information, or the plans for Friday’s tailgate. We saw that posts telling people what they need to know outperformed game information over 70% of the time.
  • Photos — If a school posted over 100 photos from games in a month, they were almost twice as likely to make the top ten. Seven pages is nothing when you’re looking at photos.

Some of the tactics, like 301 redirects and posting logistics content, seem like no-brainers, but with the number of schools coming through our site build process, and every school being different, it was easy to miss one when a school was just starting out.
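If it helps to see it, the “connect the athletics page” tactic is just a permanent (301) redirect. It usually lives in the school’s CMS or web server settings, but if the school’s site were, say, a small Flask app, a sketch would look like this (the route and destination URL are placeholders, not a real school’s address):

    # Sketch of the 301-redirect tactic, assuming (hypothetically) the
    # school's main site ran a small Flask app. The destination URL is a
    # placeholder for the school's VNN athletics site.
    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/athletics")
    def athletics():
        # Permanently send anyone who clicks "Athletics" over to the VNN site.
        return redirect("https://athletics.example-school.org", code=301)

    if __name__ == "__main__":
        app.run()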

One thing, every month.

From these specific tactics, we were able to create one monthly go-forward action that we thought each school on the list could benefit most from, based on what we’d seen on their specific site. This kept it easy for our support team to communicate, and for our customer schools to remember and execute on.

For some, it was getting a bigger Facebook following, for others, it was making sure coaches posted ticket information the day before a game.

At the end of September, we charted the schools again, looked back at the actions each school took, and ran the ratios again. Then in October, and November. Every month, little things changed — in the middle of a season, photos might be more important than links, and at the beginning, announcing rosters might work better than scores — but the process stayed the same:

  • Use the test group.
  • Pull the data.
  • Run the measurements.
  • Make a list of tactics that could directly improve your measurements.
  • Keep it simple and give customers one thing to do.
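Put together, the monthly cycle amounted to something like the sketch below. The school fields and the tactic picker are placeholders; diagnose() is the two-threshold check sketched earlier:

    # Sketch of the monthly cycle: rank the test group, run the
    # measurements, and give each school one thing to do. Everything
    # except the thresholds (inside diagnose) is a placeholder.

    def monthly_review(schools, pick_tactic):
        """Rank by pageviews, diagnose each school, assign one action."""
        ranked = sorted(schools, key=lambda s: s["pageviews"], reverse=True)

        plan = []
        for rank, school in enumerate(ranked, start=1):
            needs = diagnose(school)             # marketing, content, both, or on track
            action = pick_tactic(school, needs)  # e.g. "post ticket info the day before games"
            plan.append((rank, school["name"], needs, action))
        return plan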

Pretty soon, we had a good idea how to help a customer become more successful from soup to nuts, and it was showing in our monthly reporting. Of the 27 schools in our test group, 23 of them were beating our network average, and each month, we’re adding one or two more to that list. In 2016, we’re going to scale it up from our 27-school test group to the 1,500 clients we’ve got across the company.

If you’re a marketer and notice engagement with your product dropping, it’s a problem you can fix. Take it from us and follow our steps to pull a plan together, because all the great brand building you’re doing won’t matter if you’ve got a product that’s simply existing.
