The sins of data in digital design (and thoughts on redemption)

by John Kaminsky

Having led a design-focused digital agency for many years, I’ve suffered inside when a decision was made to add a “sign up for a discount or newsletter” pop-up before a user even had a chance to see the content they requested. Or to insert one more display ad unit into a page already swimming in them. Or to stick a “buy now” button so far up a user’s nose that a first glance at the page caused a searing headache.

Time and again, I stood up in conference rooms and made an impassioned case for reason. And I often got my legs knocked right out from under me — by data. The pop-up has accelerated the growth of our email database. The data says revenue goes up with more ads on the page. The A/B test shows the conversion rate is up. I grasped at straws, trying to formulate a persuasive response: user experience quality, brand equity. Basic human decency? Thanks for sharing your concern. What’s next on the agenda?

Going beyond tactical design decisions, the invisible hand of data is widely seen as a main contributor to the design and UX homogenization of the web. The data shows what performs, good performers become best practices, and then best practices are embedded into templates, code libraries and tools that are designed to make spinning up a new website or app easier and less expensive than ever. All of this exerts an overwhelming gravitational pull on UI designers and engineers. Is it any wonder that 80% of the web looks like the same half-dozen Squarespace, WordPress, or Shopify templates? Have our user experience architects become interior decorators, whose job it is to apply a pleasant veneer to the walls of a templated universe?*

Image caption: Marriage? Charitable foundation? Furniture boutique? This user interface is for you. Squarespace “Anya” and “Peña”, and Shopify “Debut” templates.

So, is the consequence of data-driven design a lack of consideration for anything that can’t easily be quantified, and a slow march towards bland homogenization of the internet?

Thankfully, I don’t believe that to be the case. Like money or any other asset, data can be used for good or bad — and it’s frequently mishandled (sometimes honestly, sometimes manipulatively) by amateur and sophisticated professionals alike.

To be deployed successfully, and to avoid the aforementioned pitfalls, data must be used within a framework that is holistic, strategic, and balanced.

Was the ultimate business goal we were working towards a larger email database or an increased conversion rate? Of course not — those were just avenues to a larger goal. My weakness in those meetings was failing to build consensus beforehand on a total system of metrics, before any single metric was applied as a measuring stick for design and user experience. By taking a narrow view that focused on optimizing for specific (and perhaps arbitrary) metrics, the totality of the system was ignored and short-term, small-minded aspects were prioritized. Watching individual, easy-to-understand metrics tick upward can be highly satisfying. But what evidence did we have that these metrics were contributing to the holistic growth of the brand or business? It turned out that they weren’t, and the data eventually showed that. A more holistic data framework would have enabled us to put these metrics into context and to evaluate their broader impacts.

To ensure your metrics framework remains holistic, there are other temptations to be wary of. The first is the allure of combating narrow-minded thinking with more of the same. Could I have argued that adding an “instant popup” on a landing page negatively impacted bounce rate? Probably — but I would have been falling into the trap of using one narrow metric to argue against another, absent the critical holistic perspective.

Another common temptation is to consciously or unconsciously overweight the metrics that have improved when analyzing the performance of a design change. If your holistic set of metrics is comprehensively defined before analysis begins, the temptation to paint bullseyes where the arrows land is diminished.

To sell more stuff or to show more ads isn’t a strategy. To be strategic, there must be agreement on the things you’re *not* going to do in addition to the things you are. If the defined strategic goal had been to grow long-term customer relationships and lifetime customer value, it would have been far easier to show how certain short-term decisions might provide an initial bump but have neutral or negative consequences in the long term. Of course, if the strategy was to maximize revenue in the current quarter, then those short-term decisions were spot-on. Instead of evaluating decisions against a strategy that all stakeholders had bought into, I’d been pleading for the importance of a soft attribute (user experience quality, brand equity) versus a hard, measurable attribute like conversion rate or newsletter signup growth. That battle was lost before it began.

More broadly, adopting best practices or incorporating code libraries or components without contextualizing their contribution to a truly differentiated strategy is a contributor to the creeping white-breadification of the web, which gets worse by the day.

This is a good opportunity to apply the 80/20 rule — think about the 20% of your digital experience that’s most critical to your strategy and then lavish 80% of your effort there. Clearly prioritize the metrics that measure where your strategy differs from the competition and elevate those in the consciousness of all stakeholders.

It’s no secret that Google ensures each small change is vetted, tested and ramped up before becoming part of their user experience. However, this is a single facet of one of the most aggressive systems in history for generating and validating new, ambitious thinking within a company (e.g., 20% time, venture investments, moonshot programs). The intelligent use of data provides a mechanism for ensuring that the best big ideas are the ones that make it to market, and smooths the edges to give them the greatest chance of success. If an organization attempts to skip directly to measurement and optimization without generating big ideas, well…

Inspired ideas + optimization = optimized inspiration.
Mediocre ideas + optimization = ?

In my conference room debates, it wouldn’t have been the right time to argue for an overhaul of a client’s R&D program or to pause the design review to encourage the client to “think big”. But I could have used this line of reasoning to consistently and explicitly advocate for bigger, more ambitious product design moves in conjunction with optimization from the very first pitch meeting. This would have had the added benefit of lifting the discourse in the room out of the weeds and to the more elevated viewpoint where every good designer prefers to operate.

I’d encourage anyone who designs in a measurable environment (read: any digital experience, and an increasing number of physical experiences) to ensure they have a holistic, strategic, and balanced framework to operate within, and to understand how the various metrics used to measure design success fit within that framework.

One final note: The examples at the beginning of this article are simplified cases of data being viewed through a narrow lens, to support objectively bad decision making. This happens (all the time). But often decision-makers are faced with complex, hard-to-wrangle challenges where objective truth is harder to find. The same framework of holistic-strategic-balanced is still the right path forward.

* I love and respect the interior design profession, but they can only do their best work if architects and engineers do too.




Aviary Analytics

We help clients achieve transformative growth using advanced analytics, data modeling, and AI.