In a recent TechCrunch article, Nir Eyal—a founder of two startups and a Lecturer in Marketing at the Stanford Graduate School of Business—admits that he wants you to be addicted to his products. Despite the drug rhetoric that abounds, he believes that addiction, reinterpreted as consumer manipulation, might actually improve our lives, since the people who manufacture these addictions have morals, too. That's why he created his "Manipulation Matrix."
The matrix seeks to help you answer not, "Can I hook users?" but "Should I attempt to?" It crosses two questions—does the maker use the product, and does the maker believe it materially improves users' lives—to yield four types of creator: facilitators, peddlers, entertainers, and dealers.
The argument rests on the creator's individual, not communal, moral compass: whether the creator sincerely believes the idea improves people's lives. If it all lines up, you get "facilitators" like Mark Zuckerberg or the creators of Twitter. But how do we know how they make decisions? Aren't their moral compasses enigmatic at best?
Unlike an addiction to nicotine, new technologies offer an opportunity to dramatically improve the lives of users. Yet like all technologies, recent advances in the habit-forming potential of web products have both positive and negative effects.
As the march of technology makes the world a more addictive place, innovators need to consider their role. It will be years, perhaps generations, before society develops the antibodies to new addictions. In the meantime, users will have to judge the as-yet-unknown consequences for themselves, while creators will have to live with the moral repercussions of how they spend their professional lives.
Of course, the market should allow for all kinds of technological consumption, and what counts as "harm" remains an abiding question. When will users admit they're addicted? It seems a matter of critical distance—whether these users can trust themselves, or their dealers, peddlers, facilitators, and entertainers, to know what hurts and what helps.