Can I have your undivided attention?
Here are some random thoughts and a recommendation for a podcast series, Your Undivided Attention. It explores, for example, “what it means to become sophisticated about human nature” by interviewing hypnotists, magicians, and experts on the dynamics of cults, election hacking, and the powers of persuasion. Cool, right?
So, the inventor of the infinite scroll said (after the invention) that making an interface easier to use is not necessarily a good thing for humanity. If a UI is designed not to help you but to hold you, our conscious minds usually cannot keep up with our impulses. Classic examples can be found in social media platforms and games.
This also comes down to what tasks we want the user to achieve. If the tasks stem from holding the user, like the things infinite scrolling or endless swiping are applied to, we cannot really talk about “user-centeredness” or being “humane” or “empathetic”. However, if we have a task like buying a train ticket as swiftly and simply as possible and letting the user go on their way, the user is guided in a whole other direction. That direction, or goal, could be helping the user continue with their day and meet with other people.
Sometimes optimizing the experience for the user can be supported by simply asking them what they want (not only after a user testing session or via a feedback form). For instance, if LinkedIn asked me every now and then how much time I want to spend there per day and gave me the option to set goals for that usage, I’d feel quite good about it. Sometimes friction in UX is a good thing, more so if the user has a say in it.
Just like casinos, whose physical spaces embody an asymmetry of power (design that guides people towards slot machines and reduces conscious behaviour) and whose main KPIs include time spent on site, we have a similar situation among digital devices and services. We as designers, developers and stakeholders define and create those spaces.
This calls for responsibility and action from the people creating these products. It’s not enough to just iterate and optimize; there should be incentives to make algorithms driven by user intentions and values. Nor is it enough to provide options for the user and hand them tools to manage their digital wellbeing when they (we) spend most of their time in a space where power is unequally distributed.