Engagement is Complex
Engagement is a spectrum of psychological intensity, and the term is useful for broadly discussing how active, interested, and committed people are. That vague general definition was fine… until we started trying to measure engagement. Measuring feelings is always a dicey business, and approaches vary wildly depending on the methodology and tools used. Measurement techniques include observation, interviews, surveys, digital activity, and text analysis.
I think measuring emotion is a bit of a fool’s errand; in the end, emotion only impacts outcomes when it is expressed in behavior. I don’t know that someone thinks something is funny until they laugh – and that laughter is likely to elicit a behavioral response in me. Similarly, the value of content does not change with the volume of people who see it. It is the application of content that changes and that has value. The content remains the same.
Because behaviors do not occur in isolation, measuring one action in isolation doesn’t tell you much. The current state of the art still calculates the volume of single behaviors or transactions. The RFM (recency, frequency, monetary) analysis model is how many online retailers track activity; they can use it to create automated prompts and drill down into the transaction and buying history of each customer. That is helpful if you are a retailer, but even then, experts would caution that “past performance is no guarantee of future results.” This type of data also does little to help us understand how behaviors change. I would want to know what behavior patterns looked like for people who moved from one category to the next, because then I could pinpoint which behavior I should make easier or harder, and when in the behavior flow; THAT is how I could impact future potential.
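To make the RFM model concrete, here is a minimal sketch of RFM scoring over a list of transactions. The data and customer IDs are hypothetical; real retail systems compute the same three numbers, just at scale.

```python
from datetime import date

# Hypothetical transactions: (customer_id, purchase_date, amount)
transactions = [
    ("a", date(2024, 1, 5), 40.0),
    ("a", date(2024, 3, 1), 25.0),
    ("b", date(2023, 11, 20), 300.0),
]

def rfm(transactions, today):
    """Score each customer: Recency (days since last purchase),
    Frequency (number of purchases), Monetary (total spend)."""
    scores = {}
    for cust, when, amount in transactions:
        r, f, m = scores.get(cust, (None, 0, 0.0))
        days_ago = (today - when).days
        r = days_ago if r is None else min(r, days_ago)
        scores[cust] = (r, f + 1, m + amount)
    return scores

print(rfm(transactions, date(2024, 4, 1)))
```

Note what this sketch illustrates about the article’s point: each customer’s score is computed in isolation, so nothing here captures what one customer did *after* another behavior, or how people move between categories.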
Behavior Change is the Only Thing That Impacts Value
If we want to measure the value of collaboration, communication, and engagement inside organizations, this approach does even less; more content or transactions (emails, deliverables, and meetings) are not necessarily better. What I want to know is how the best work is accomplished. It’s the sequence of behaviors – which behavior followed the last one – that is interesting. By comparing multiple groups, can we learn something about which behavior sequences make a difference, and can we then influence the trajectory of future behavior?
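One simple way to make “which behavior followed the last one” measurable is to count behavior-to-behavior transitions per group and compare the resulting patterns. This is a hypothetical sketch; the group names and behavior labels are illustrative.

```python
from collections import Counter

# Hypothetical, time-ordered behavior logs for two groups
groups = {
    "team_a": ["ask", "answer", "share", "ask", "answer"],
    "team_b": ["share", "share", "lurk", "lurk"],
}

def transition_counts(events):
    """Count each consecutive (behavior, next_behavior) pair."""
    return Counter(zip(events, events[1:]))

for name, events in groups.items():
    print(name, dict(transition_counts(events)))
```

Comparing these transition tables across groups is what single-behavior volume counts cannot do: two teams can produce the same number of posts with completely different sequences.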
If you are a fan of economics, you know that the only way to change the economics of something is to change behavior patterns. In plain language, changing how someone behaves is the only way a person, group, or organization will grow, adapt, become more effective, or produce more value. Behavior is the only thing that matters.
Originally, I thought this was only an issue for community platforms but virtually none of our business software tracks behavior sequences, which is curious.
Databases evolved from mathematics, accounting, and engineering, not the social sciences. They were originally used to calculate numbers, not observe behavior.
The databases of large software platforms are still designed around counting content or transactions, not people and their behaviors.
– Rachel Happe
We can tell you everything about the history of transactions or files and very little about the history of users as a group. This foundational issue with technology has sent the entire leadership world on a wild goose chase, producing a lot of complicated solutions and little strategic insight. We can’t even tell from a platform’s analytics whether it is paying for itself. Organizations spend billions on technology annually and don’t really know which of it is effective. Mind-boggling.
This data schematic from Salesforce helps to see this; the primary “parent” data in its database is the Account, which makes sense if you are tracking past sales. However, it means that individual contacts are nested rather than being the primary identifier or organizing element, and any transaction a contact has is nested even further beneath them. That means I can go in and see the transaction pattern of each contact in isolation, but it is much harder to see the behavior flow of segments of people who share a common attribute.
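The structural difference can be sketched in a few lines. The first shape mirrors the account-centric nesting described above; the second is a flat, person-keyed event stream that makes segment-level behavior flows a simple sort and filter. All names, segments, and fields here are hypothetical, not an actual Salesforce schema.

```python
# Account-centric nesting: transactions buried under contacts under accounts
account = {
    "name": "Acme",
    "contacts": [
        {"name": "Kim", "transactions": [{"date": "2024-01-05", "amount": 40}]},
    ],
}

# Behavior-centric alternative: one flat event stream keyed by person
events = [
    {"person": "Kim", "segment": "new_hire", "behavior": "ask",
     "ts": "2024-01-05T09:00"},
    {"person": "Lee", "segment": "new_hire", "behavior": "answer",
     "ts": "2024-01-05T09:30"},
]

def segment_flow(events, segment):
    """Time-ordered behaviors for everyone in a segment."""
    return [e["behavior"]
            for e in sorted(events, key=lambda e: e["ts"])
            if e["segment"] == segment]

print(segment_flow(events, "new_hire"))
```

With the nested shape, answering “what did all new hires do, in order?” requires walking every account and contact; with the flat shape, it is one query.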
New: Engaged Organization’s Engagement Compass
Measuring engagement requires us to define how we will do so. Engagement itself is an emotional state, which eludes exact measurement, but it impacts how we behave. We don’t speak up much if we are unengaged. We don’t ask questions when we feel vulnerable. We won’t offer alternative perspectives if we feel we will be ignored, cause a problem, or have them shot down. Defining and categorizing those behaviors is critical for measurement, and they are an effective proxy for understanding engagement in an objective way.
I first did this with the Community Engagement Framework and have recently extended it to become Engaged Organization’s new Engagement Compass. This framework categorizes different types of engagement behaviors by their relative value, defines the feelings each category of behavior reflects, and indicates the role of moderation and community management in shifting those behavior patterns.
Capturing these behaviors and showing how they shift over time can help leaders identify how culture is changing across populations. No single system captures all of this data; when I have developed dashboards with clients, it has required mapping the available data to engagement categories and adapting as more data becomes available from more systems.
Monitoring how these behaviors shift can help us better understand how organizations and cultures work. Which behaviors repeat the most and how frequently? Are people comfortable asking questions? Seeing those behavior patterns provides a concrete and less murky analysis of how cultures operate. It provides both a benchmark and a way to identify opportunities to remove friction.
Once you can really SEE the mix and flow of behaviors, the real work can begin. A lot of things impact behavior: leadership, training, governance changes, new tools and technology, different incentives, and more. Hypotheses about what impacts behaviors can be tested. Does adding training make any meaningful difference? Does a governance change have any influence? The impact and value will be self-evident – IF you can get the data.
Working transparently in networks, communities, and groups has one of the largest impacts on value because everyone has access to the latest information and everyone can operate in the same context. The payoff is so enormous that the investment in fostering that change would be self-evident – if we could see it.
An Engagement Dashboard Showing Self-Evident Value
So how do we SEE shifts in behavior and value? What might a dashboard look like?
After years of thinking about this, I decided to design what I would most like to see in an engagement dashboard. It would show sequences of behavior over time and allow comparisons between segments. It would show where each behavior falls on the Engagement Compass and might let me toggle to see which behaviors represent a milestone. I would want to zoom in and out in time and compare different segments and trends over time.
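The core aggregation behind such a dashboard is small: roll raw behavior events up by segment, time period, and Engagement Compass category, then trend the counts. This sketch uses made-up events and an illustrative behavior-to-category mapping; a real Compass mapping would be richer.

```python
from collections import defaultdict

# Hypothetical behavior events: (segment, month, behavior)
events = [
    ("sales", "2024-01", "ask"),
    ("sales", "2024-01", "attack"),
    ("sales", "2024-02", "ask"),
    ("eng",   "2024-01", "share"),
]

# Illustrative mapping of behaviors onto compass categories
COMPASS = {"ask": "constructive", "share": "constructive",
           "attack": "destructive"}

def trend(events):
    """Count behaviors per (segment, month, compass category)."""
    counts = defaultdict(int)
    for segment, month, behavior in events:
        counts[(segment, month, COMPASS.get(behavior, "neutral"))] += 1
    return dict(counts)

print(trend(events))
```

Note that the output is aggregated at the segment level by design – no individual appears in it, which is consistent with avoiding surveillance of specific people.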
What I don’t need or want is the data for individuals, because surveillance and the punitive responses that come from it do not encourage the growth of positive behaviors. The best use of this data is to learn, adjust, test, and adapt environmental factors, or to identify what the most successful groups are doing and the opportunities that reveals.
This dashboard would effectively highlight groups that:
- Increase and maintain constructive behaviors.
- Can effectively shift destructive behavior to a healthier state.
- Have the highest constructive engagement overall.
- Are the fastest at completing a workflow.
- Are fastest at moving between different milestones.
This information would make it easy to pinpoint issues, opportunities, best practices, and effective behaviors, which can then be encouraged and rewarded – and that is both meaningful and actionable.
Case Study: The Value of Tracking Behaviors
Working with a client, we developed a community-centric strategy aimed at increasing engagement, and the depth of engagement, on their social intranet. As part of that strategy, we identified three aligned ‘key behaviors’ that were both reasonable to expect based on past engagement (the next best step) and would help realize the strategy. One of them was asking questions. The client undertook a research effort with an academic team to evaluate how often and when that behavior happened. They looked at what they considered successful and unsuccessful projects and analyzed how question-asking showed up in each set. The results were fascinating – and powerful.
They discovered that the most successful projects had very different behavior patterns. In successful projects, they saw a lot of questions in the first phase and far fewer in the subsequent three phases. In unsuccessful projects they saw the reverse: few questions in the beginning and more and more as the project went on. This case study comes from the 2021 Digital Workplace Communities report.
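The pattern they found – questions front-loaded in successful projects, back-loaded in unsuccessful ones – can be checked with a simple per-phase tally. The counts below are invented for illustration, not the study’s actual numbers.

```python
# Hypothetical question counts across four project phases
projects = {
    "successful":   [12, 3, 2, 1],   # questions front-loaded
    "unsuccessful": [1, 4, 7, 11],   # questions grow as problems surface
}

def early_question_share(phase_counts):
    """Fraction of all questions asked in the first phase."""
    return phase_counts[0] / sum(phase_counts)

for label, counts in projects.items():
    print(label, round(early_question_share(counts), 2))
```

A metric like this is exactly the kind of behavior-sequence signal that transaction counts alone would miss: both project types might ask a similar total number of questions, but the timing tells the story.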
Interesting, isn’t it?
Have you seen data that shows this type of behavior sequence? Does it make sense to you? Does it spark other insights?
Have you looked at your engagement behavior this way? What did you learn?