Does Experience Design Need to Steal Your Data?
Last summer, I attended a conference at a resort in Southern California. For weeks beforehand, I received emails encouraging me to upload a picture of myself so that I could take advantage of the facial recognition registration system the organizers were using. It would be convenient and frictionless, the emails promised. I could see why that might be true, but I also saw drawbacks. First, I had to find and upload a photo, which probably wouldn’t take less time than registering onsite anyway. Second, I object in principle to giving companies more personal data than they need. Who knows how secure all these registration systems are? When I arrived at the resort, I finger-typed my name into an on-screen keyboard and printed out a badge. It took 15 seconds and gave me the opportunity to initiate a chat with two of the organizers.
After the conference, which I enjoyed, I was dorking around with my iPhone while waiting for a flight home. I happened to burrow down into the privacy settings and noticed that the device keeps a list called Significant Locations. It’s this list that makes it possible for apps to remember places the device believes are important to you. The list showed that I had just spent three days at this resort. Yes, I’d just spent time there, but in what sense was it now “significant” to me?
I began to wonder about all the data I had relinquished in attending this one professional conference. I bought a flight, giving my name, email, physical address, credit card data, frequent flier number, passport number, Known Traveler Number. At the airport, I gave fingerprints to the Clear system, and my luggage was scanned. When I landed, I took an Uber, giving my name, email, credit card data, my current whereabouts. Then the conference wanted to scan my face and give my contact information to its sponsoring partners.
Business as usual suddenly started to seem like too much. It was a massive increase in what security experts call my “attack surface”: the amount of personal data I expose to potential misuse.
This is the modern world. We know it. Our every movement generates a series of data points. The stewards of this data give you a Terms of Service or a Privacy Statement that you accept because it’s too long and full of legalese to actually read. Then they sell your data. Or worse, they lose it to hackers. I used to worry I was becoming paranoid but recent scandals — everything from the Equifax data breach (which affected me) to the hacking of the American election (which affects everybody) — have only reconfirmed the pitfalls of giving up so much personal information. It now seems to me that paranoia is a privacy best practice.
This paranoia, however, puts two parts of me at odds with each other. As an attendee at a conference, I did not want to surrender so much data for so flimsy a purpose. But as the leader of an experience design agency, I recognize that clients have a tremendous appetite for all this information. Their intentions are good. They want to understand their audiences better. They want to personalize the experiences they offer. They want to see that their messages are having an influence on sentiment and behavior. They want to track how far these messages are being shared. They want to know about the return on their investments.
All this requires data, and clients are increasing the pressure on their agency partners to measure and monitor. They get Nielsen ratings for their advertisements and click-through rates for their digital campaigns. Why shouldn’t they get data from their conference, their pop-up shop, or their general session? It’s a reasonable request. Experience design agencies turn to surveys, registration data, crowd-tracking devices, screens that capture demographic data. Soon we will be inventing devices that scan audiences for signs of their attentiveness, or we will be asking them to install apps on their Apple Watches that feed biometric data back to our clients. Did the CEO’s keynote cause a spike in attendee heart rate?
This data can be useful and meaningful, but I wonder if it isn’t time to include privacy in our practice of experience design. As professionals, we pride ourselves on thinking through an experience from the vantage point of the attendee. We imagine the ability of our creative to influence the way an attendee conceives of a brand. But do we take into account what the attendee thinks about privacy?
Have we asked ourselves whether personalization validates poking, prodding, prying, and spying on the audiences we are supposed to be galvanizing?
Have we considered whether a person really views facial recognition as a worthwhile tradeoff for a seamless registration? How long will it be before hackers break into one of the major registration companies in our industry and download a trove of Fortune 500 email addresses that can be used in phishing or “whaling” attacks?
If we decide these risks are worthwhile, we are obliged to be prudent in the data we ask for, secure in the way that we store it, and ethical in how we put it to use. But it also occurs to me that a counter-narrative could be proposed — one in which the difficulty of quantifying experiences stops being a liability and becomes a virtue. Something ineffable happens when people meet with each other face-to-face. Our industry is scrambling to define it and chop it up into KPIs. But the ineffable may also be a safe haven, a moment we can share with others without fear of leaking our personal information into the hands of those who would exploit us.