Slipperiness vs Conveying the Full Story

Challenge / Question

I was asked last week by a client: how can I present the full story of our research outcomes, so that people will not only want to read it, but will find the experience enjoyable and informative? How do we get people to take in the whole story, not just bits and bytes of it?

This is a great question, and it is a challenge that many of our clients face – one that is not easily ‘solved’. The web is a slippery place, and the back button is conveniently easy to use.

Think engagement

Keeping people focussed and engaged in web content is difficult. However, this is not unique to the web; any form of publication where there is no face-to-face contact faces the same issue – it is simply not possible to force or coerce people into reading and understanding the entirety of a topic.

On the other hand, web and digital channels do present benefits that we should keep in mind: reach, cost, and the capacity to engage (including interactivity).

So, given that it is not possible to force people to keep reading and learning, how can we maximise the level of engagement and the amount of information that does get taken in?

Take on the Challenge

One of the strengths of the web is its flexibility: we have copious options and approaches available to us (note that this can also present a challenge). We are not stuck with a bound book and pages; we have complete presentation and editorial control, so we are free to optimise and use every trick available to make our content as engaging as we possibly can. In more sophisticated digital implementations we can even test what works (e.g. A/B testing) and rearrange content presentation on the fly.
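To make that concrete, below is a minimal sketch of a deterministic A/B split in TypeScript; the function and variable names are invented for illustration and are not tied to any particular testing tool.

```typescript
// Minimal sketch of a deterministic A/B split for a content page.
// All names here are illustrative, not taken from any particular product.

type Variant = "A" | "B";

// Assign a visitor to a variant based on a stable identifier (e.g. a cookie value),
// so the same person always sees the same presentation of the content.
function assignVariant(visitorId: string): Variant {
  let hash = 0;
  for (const char of visitorId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0; // simple rolling hash
  }
  return hash % 2 === 0 ? "A" : "B";
}

// Example: choose between two presentations of the same research summary.
const variant = assignVariant("visitor-cookie-1234");
const pageLayout =
  variant === "A"
    ? "long-form narrative with inline charts"
    : "short summary with expandable detail sections";

console.log(`Variant ${variant}: serve ${pageLayout}`);
```

In practice a dedicated testing or personalisation tool would handle the assignment, measurement and reporting, but the underlying principle is the same: split the audience consistently, present the content differently, and keep the version that engages better.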

All of this takes significant effort from the project team, in terms of time, money and mental energy. It will likely take several iterations around ideas and methods of presentation, and involve creativity and ingenuity.

To visualise the benefit, I imagine a lopsided seesaw – the more effort we put in, the greater the reach and the easier it will be for our audiences to engage with and understand the information. Whilst this analogy is not precise enough to be expressed as an equation, if it were it would be non-linear, meaning that the project team effort can be substantial and can increase significantly with the complexity and volume of information.

[Figure: project team effort – the lopsided seesaw]

Be realistic

One of the common issues we encounter is caused by unrealistic expectations – people expecting web visitors to spend hours on a website or complete all of the actions available to them (the web is just another channel; it is great, but not magic). This is possibly due to an ‘over-sell’ at the beginning of a project, or simply a misperception of what success might look like. Here the aim is to compare the web as a channel to others via universal metrics, rather than zeroing in on internal metrics like page views or bounce rates.

Measure and track goals and metrics that can feasibly be achieved, and set internal (and sponsor) expectations realistically. Compare success at an ‘outcomes level’ via metrics such as cost per touch, information accessibility, and impact (e.g. change created or catalysed in the real world). These metrics usually help to pull everyone involved back to the big picture.
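As a rough illustration of what an ‘outcomes level’ comparison might look like, the sketch below calculates cost per touch for a handful of channels; the channel names and figures are invented for the example and are not drawn from any real project.

```typescript
// Illustrative comparison of channels by cost per touch.
// The channel names and figures below are made up for the example.

interface Channel {
  name: string;
  totalCost: number; // total spend on the channel, in dollars
  touches: number;   // meaningful interactions (visits, attendees, readers)
}

const channels: Channel[] = [
  { name: "Research website", totalCost: 40000, touches: 25000 },
  { name: "Printed report",   totalCost: 30000, touches: 2000 },
  { name: "Road show",        totalCost: 60000, touches: 1500 },
];

for (const channel of channels) {
  const costPerTouch = channel.totalCost / channel.touches;
  console.log(`${channel.name}: $${costPerTouch.toFixed(2)} per touch`);
}
```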

This is particularly important for websites and portals conveying in-depth information, as they generally have quite niche audiences. If we compare the internal / performance / standard web metrics of, say, a retail site to our own, we are unlikely to see a favourable picture.

Re-engage

One of the advantages of the web is that we can take some control of the interaction (with permission, of course). The most common way is to encourage people to sign up to an email list of some kind, but we can be more inventive than that. We might send alerts, or help facilitate a process of change (based on the information or advice provided via the website), including notifications and tracking of status changes. We could ask people to connect via social media and narrowcast information (reminders, alerts, news) via those channels. There is an ever-increasing number and variety of digital channels.

Re-engaging might also involve non-digital channels. We have been involved in the production of many web-based platforms that are part of wider communications programs, such as road shows or compulsory self-assessment programs. These activities are an ideal opportunity both to engage and to collect contact details for re-engagement.

Get the basics right

When we are approached to provide advice on existing sites, we occasionally see that the basics are just not executed well. Getting the basics right is an entry-level requirement; without this, optimising or tweaking is likely to have limited impact.

So what would we categorise as the basics? We would consider:

  1. Understanding the target audience and pitching all aspects of the information appropriately
  2. A high-quality aesthetic (appropriate for the audience) – helping to create an impression of credibility / quality and to build trust
  3. Thorough information architecture and content planning
  4. Appropriate interactivity (beyond the scope of this article – visit lcubed.com.au or see our engagement claw for more on this one)
  5. User experience testing (one way to quality-control the work)

There are more – but this list is a good start.

Measure to improve

This is where we should be using internal metrics – to make things better. However, usage metrics can only tell us part of the story, so we are also advocates of asking people (users) for feedback. In this way we can gather both quantitative and qualitative feedback.

Try as we might, our website or communications platform is unlikely to be perfect for all audiences, so ongoing assessment and improvement is important. At a high level we might use quantitative measures to identify areas of weakness and then ask users for their input on what improvement might look like.
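As a minimal sketch of that two-step approach (the data shape and thresholds below are assumptions, purely for illustration), we might use usage figures to shortlist the pages that appear weakest, and then put just those pages in front of users for qualitative feedback.

```typescript
// Minimal sketch: use quantitative data to shortlist pages for qualitative follow-up.
// The data structure and thresholds are assumptions for illustration only.

interface PageStats {
  path: string;
  visits: number;
  exitRate: number;         // proportion of visits that end on this page
  avgTimeOnPageSec: number; // average time spent on the page, in seconds
}

// Flag pages that people reach but appear to abandon quickly.
function pagesNeedingFeedback(stats: PageStats[], minVisits = 200): PageStats[] {
  return stats.filter(
    (page) =>
      page.visits >= minVisits &&
      page.exitRate > 0.6 &&
      page.avgTimeOnPageSec < 30
  );
}

const exampleStats: PageStats[] = [
  { path: "/findings/summary", visits: 1200, exitRate: 0.35, avgTimeOnPageSec: 95 },
  { path: "/findings/methods", visits: 450,  exitRate: 0.72, avgTimeOnPageSec: 22 },
  { path: "/about",            visits: 90,   exitRate: 0.80, avgTimeOnPageSec: 10 },
];

// These are the pages we would put in front of real users and ask:
// "What would make this page more useful to you?"
console.log(pagesNeedingFeedback(exampleStats).map((page) => page.path));
```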

See below for a quick guide to quantitative and qualitative data collection and its use – and which is more useful at which stage of a project.

|  | Before launch | After launch |
| --- | --- | --- |
| What to do | Qualitative – ask people to establish a picture of what needs to be done | Quantitative – measure to identify exceptions and weaknesses, and/or set targets for improvements |
| How to do it | Quantitative – set measurable targets, project-manage implementation and establish initial benchmarks | Qualitative – ask people what improvements they would like to see, or brainstorm changes |

Quick reference – How to use quantitative data and qualitative input in (complex topic) website projects.