In my recent talk on everyday identity, I suggested that login-time consent to data sharing is not a great example of human-centered design.
Even if we had already figured out the perfect ceremony for real-time consent or developed the best login interfaces, individuals still tend to be disadvantaged in the federated identity balance of power — that big flashing “I Agree, Here’s My Data” button might as well read “I’m Over a Barrel, So Go Ahead and Take It Anyway”.
David Weinberger has this analysis (do read the whole thing):
Since just about every vendor on the Web would like to know more about you rather than less, why won’t just about every vendor ask for more information rather than less? It’s all just a button press.
The golfer use case in my slides, which uses InfoCard flows, highlights this issue as well. In real life, my boss was actually asked for his Social Security Number (!) as a prerequisite for starting a new account while trying to book a tee time over the phone. In that communication mode it’s easier to just say “no, thanks” and hang up; with an information card, many people might just press Return to get it over with.
So how do we get to truly human-centered design? We take into account people’s real tendencies and desires, and try to bake these into identity ecosystems in a way that redresses the power balance.
Here are three common tendencies: new-relationship energy (the conscious effort you’re willing to invest when something is new vs. familiar), the efficiency imperative (the impatience with annoying multi-step interactions that makes you stop paying attention), and the self-revelation imperative (accepting that it’s legitimate to choose to share data about yourself when it gets you something of value).
Based on these, here’s what I suggest:
Let’s reduce the routine gathering of data-sharing consent at login time — it doesn’t materially empower individuals and, as a bonus, it annoys them. Instead, we should find a way to let people configure data usage policies at the time of establishing relationships with online partners; without this, people are stuck with accepting others’ terms and have no window in which to impose any of their own. In essence, we need to be thinking about the game theory of identity! To quote David Weinberger again:
[I]f we’re going to make it easy to give out our personal information, we ought to be thinking about the norms, market forces, or rules that would make it harder to ask for that information.
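To make the relationship-time idea concrete, here is a minimal sketch (all names and attributes are hypothetical, not any real InfoCard or federation API): a data-usage policy the individual configures once, when the relationship with a relying party is first established, and which is then enforced silently at every later login instead of a consent dialog.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a policy set once at relationship-establishment
# time, rather than gathered as consent on every login.
@dataclass
class DataUsagePolicy:
    relying_party: str
    shareable_attributes: set = field(default_factory=set)  # e.g. {"email"}
    purposes: set = field(default_factory=set)              # e.g. {"booking"}
    retention_days: int = 30

    def permits(self, attribute: str, purpose: str) -> bool:
        """True only if both the attribute and the purpose were pre-approved."""
        return attribute in self.shareable_attributes and purpose in self.purposes

# Configured once, when the relationship begins...
policy = DataUsagePolicy(
    relying_party="golf-club.example",
    shareable_attributes={"email", "handicap"},
    purposes={"tee-time-booking"},
)

# ...then enforced quietly on every later interaction.
print(policy.permits("email", "tee-time-booking"))  # True
print(policy.permits("ssn", "tee-time-booking"))    # False
```

The point of the sketch is the timing: the individual’s terms are stated during the one window — relationship establishment — when they still have leverage.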
We also need to enable applications to get something useful done when handed only a tiny slice of someone’s personally identifiable information, and use pseudonyms and other privacy measures zealously when coordinating among applications. If we can’t enable this, we’ll continue to be asked for way too much information because it’s the apps’ path of least resistance.
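One well-known way to get the pseudonym part done is a pairwise identifier derived per application, so that apps can recognize a returning user without ever seeing a global identifier they could correlate with each other. A minimal sketch, assuming a secret held only by the identity provider (the names here are illustrative):

```python
import hashlib
import hmac

# Hypothetical sketch: derive a stable, per-application pseudonym from a
# secret held by the identity provider. Each app gets a different opaque
# value for the same person, so apps cannot correlate users between them.
def pairwise_pseudonym(idp_secret: bytes, user_id: str, app_id: str) -> str:
    msg = f"{user_id}|{app_id}".encode()
    return hmac.new(idp_secret, msg, hashlib.sha256).hexdigest()

secret = b"idp-private-key-material"  # never leaves the identity provider
a = pairwise_pseudonym(secret, "alice", "golf-club.example")
b = pairwise_pseudonym(secret, "alice", "news-site.example")

print(a == pairwise_pseudonym(secret, "alice", "golf-club.example"))  # True: stable per app
print(a == b)  # False: different per app, so sites can't cross-link
```

With this in place, an app’s path of least resistance becomes the tiny slice — a pseudonym plus only the attributes it genuinely needs — rather than the whole profile.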
Finally, we should reserve user-approval loops for extraordinary circumstances, ideally those dictated by people’s own preference settings — which allows identity-based app behavior to go on in the background (e.g., while we’re sleeping, windsurfing, or whatever) as appropriate and to grab our attention when we need it.
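The approval-loop idea can be sketched in a few lines (again hypothetical names, not a real protocol): requests that fall within a person’s standing preferences proceed silently in the background, and only out-of-policy requests escalate to a real-time approval.

```python
# Hypothetical sketch: background identity transactions proceed silently
# when they match the person's standing preferences; only the
# extraordinary, out-of-policy request grabs the person's attention.
APPROVED_SILENTLY = "approved"
NEEDS_USER_APPROVAL = "ask-the-user"

def handle_request(requested_attrs, preferences):
    """preferences: set of attribute names pre-approved for silent release."""
    if set(requested_attrs) <= preferences:
        return APPROVED_SILENTLY   # runs while we're sleeping or windsurfing
    return NEEDS_USER_APPROVAL     # the rare case worth interrupting us for

prefs = {"email", "display_name"}
print(handle_request(["email"], prefs))         # approved
print(handle_request(["email", "ssn"], prefs))  # ask-the-user
```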
(More thoughts soon on some solution opportunities in all this…)