You’ve heard of report monkeys—I’m going to tell you about data squirrels. Data squirrels spend lots of energy collecting data; then they hoard it. They carry it around in their cheeks, feeling well-prepared for the winter, but their mouths are so full they can’t process and swallow their hoarded goods.
I’m thinking of one former client who was using nearly every variable available at the time: virtually every report in SiteCatalyst was filled with data. In keeping with “Best Practices”, they had a weekly meeting to discuss their implementation and keep it up to date; the data was tested, current, and abundant. Their dashboards were vibrant and plentiful. But they were so busy collecting and hoarding data that they had no time to process and analyze it. The idea of taking action on the data was never part of the process.
For all of their effort, how much value do you think they were getting out of their analytics tool?
Sidenote: Data Squirrels and Report Monkeys make great friends. The Squirrel hands data to the Monkey, and the Monkey hands “reporting requirements” back to the Squirrel. This cycle can go on indefinitely without producing any real value.
You know what makes the web analyst in me want to go cry in a corner? This all-too-common situation:
A major company spends thousands of dollars and dozens of man-hours on an implementation project to grab and report on some new data. The implementation consultant gets the data, shows them the working reports, signs off on the project, then walks away. Six months later, the company calls the consultant and says, “This report doesn’t work anymore; someone disabled the variable five months ago and it took us until now to notice. Can we bring you back on to fix the problem?” (Presumably so the data can go another five months without being used.)
If you can go five months without looking at a report, why-oh-why did you throw money at someone to make that report available? Sadly, I know the answer: it’s easier to throw money at something and feel a sense of accomplishment than it is to spend time analyzing and making potentially risky decisions based on the data.
It’s a lovely feeling to wrap up an implementation and feel like the hard part is done. But if you really want value out of your tool, the work is far from over. Fortunately, the post-implementation work is the fun stuff: the stuff that doesn’t depend on your IT department to roll out new code; the stuff that has the potential to realize actual, provable value for your company.
If you must, on the day your implementation project is completed, block out a few hours two weeks or a month out to play with the data until you get an action out of it. Schedule a meeting with the analyst-minded folks in your organization and go over the data: NOT to discuss how complete your implementation is, but to discuss what insight the data has given you and what the next steps are. Don’t let USING your data fall down the priority list just because it doesn’t have a project and a budget attached to it.
This may just be a matter of semantics, but as an industry we need to change our mindset from “I want to track…” to any of the following:
- I want to HYPOTHESIZE…
- I want to TEST…
- I want to COMPARE…
- I want to BRAINSTORM…
- I want to OPTIMIZE…
- I want to DECIDE…
- I want to ACT UPON…
“Tracking” or merely collecting data should never be the goal or even a main focus. ALWAYS implement with this question in mind: “How am I going to USE this data to optimize my site?”
A major healthcare website recently hired an agency with which I am familiar. The client had used Adobe Analytics for years and noticed a few things that were a little off in their reports (have I mentioned how much I hate the Form Analysis plugin?), so they hired this agency to do an implementation audit. As they signed the contract for the audit, they said, “You’ll notice we don’t use conversion variables or events, and that’s fine. We’re really content with just the traffic data; just help us make sure it’s coming through properly.” In other words, “This solution worked for us four years ago, please just keep it working as is.”
Oh, how this breaks my heart! Because I know that a year from now they might say, “Gee, we’re spending a chunk of money on our web analytics, but we’ve never done a thing with the data. Let’s bring on an expert to help us analyze.” And that expert will say, “Analyze? Analyze what?” Then they’ll need to re-implement, then wait even longer before they have real, actionable data.
If you’re using the same set of reports that you were using 18 months ago, you are very likely losing value on your implementation. That is, unless your website hasn’t changed, in which case there may be bigger problems. The web is constantly evolving; so should your site, and so should your implementation.
The problem, of course, is finding a balance between “We don’t need any newfangled reports” and “Let’s track everything in case we might need it someday.” The best way around this? Any time you change your implementation, don’t think about what data you want to track now (again: never think “I want to track”). Think of which reports you’ve actively used in the last three months. Think of which reports you will use in the next three months. If current reports don’t fall on that list, scrap them, or at least hide them from your menu so they don’t distract you.
See what analytics trends are on the rise. Check out some of the top blogs in the industry, particularly your vendor’s blog (like Omniture’s Industry Insights), to see what’s up-and-coming in analytics. If you’re hiring an implementation consultant, either from a vendor or an agency, don’t just tell them what reports you’d like. Ask them which cool reports others have been using. Use their experience. They may be content to take orders or fulfill the requirements of the contract (which are usually written by salespeople, not analysts), but it’s very likely that they have great ideas if you’ll give them a couple of hours to incorporate them.
As an implementation consultant, this is something I’ve always struggled with. I get excited by the challenge of finding unique ways to bring in new data. When a client says, “I want to track order confirmation numbers broken down by coupon codes with campaign click-throughs as a metric,” my natural inclination is to say “Sure, we can do that!” and whip out my JavaScript editor. And if that report NEVER gets used, I’ll never know unless the client calls me back in.
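And that’s the trap: the tagging itself is the easy part. A request like the one above boils down to a few lines of page code. Here’s a minimal sketch in legacy SiteCatalyst-style JavaScript; the report suite ID, variable slots (eVar10, eVar11), and page-level values (orderId, couponCode) are all hypothetical stand-ins, not anything from a real implementation:

```javascript
// Hypothetical order-confirmation tagging. The slot numbers and page
// variables below are illustrative; map them to your own solution design.
var s = s_gi("myreportsuite");  // fetch the SiteCatalyst instance for the report suite

s.eVar10 = orderId;             // order confirmation number
s.eVar11 = couponCode;          // coupon code (the breakdown dimension)
s.events = "purchase";          // conversion event the eVars will report against
s.t();                          // fire the image request
```

Twenty minutes of work for the consultant. The open question, as always, is whether anyone will ever look at the resulting report.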
I write this post as a step in the direction of repentance. I have been a Data Squirrel enabler, and I know it. Learn from my past and don’t allow these mistakes to diminish the value of your implementation.