Who owns the mirrorworld?
Some of my favorite talks at AWE had to do with the ethics of spatial computing. I've been in conversations about ethics and MR before, but they usually revolved around how we could use MR to teach people to be more ethical because, well, "learning technologies." The social, political, and other human issues surrounding these technologies, and especially the data they collect, are a lot more complicated. Kent Bye, host of the Voices of VR Podcast, presented a framework for thinking about these issues during a keynote. Though he spoke for nearly an hour, he only had time to touch on each of these important topics. There's clearly a lot we need to figure out. The unanswered questions: Whose job is it to solve all these problems? And who would we trust to do it?
One reason these issues are so important - and so dangerous - is that data is a commodity. Personal data, geographical data, corporate data, surveillance data - all these and other forms of information can be used to make money. Now, I'm all about making money - I've got a yacht to buy - but what is the right business model for our data? Should we expect to "own" our data? Who pays for it? Just as importantly, who pays to keep it safe? Kevin Kelly made the argument that expecting to "own" our data is an outdated, "agricultural" model that is not sophisticated enough to address all these ethical concerns. We need to rethink how we operationalize ownership.
The other reason the commoditization of our personal data is an issue is that with the mass adoption of any new technology, what usually happens, at least at first, is that the rich get richer and the poor get poorer. If my data are worth money, then the more I generate, the more I get paid. The mirrorworld will provide us with new kinds of art, entertainment, media, educational opportunities, connections, and ultimately jobs, but if I can't access it, I can't use it. Some of you might not think this is your problem, but in an interconnected world, threats to security are shared. Cybersecurity ain't cheap, and we're only as secure as our lowest common denominator. It's like any other disease - you can wash your hands all day, but if the person next to you on the airplane is sneezing, you're still at risk of getting sick. We all share the same air up there.
Another question I found particularly interesting has to do with revoking access. Right now, if you are particularly offensive on social media, those companies reserve the right to block you, and they are held responsible for the content on their sites. While we can debate whether Twitter and Facebook should block fake news, until our government regulates it, it's up to them. In China, however, social credit scores are being used to evaluate and punish citizens by limiting their access to travel, schools, and even their pets. We need to be able to trust our governments with our data, but can we? And what happens when our online presence bleeds through into our real lives? (And yes, I know this was an episode of Black Mirror.)
Next up: The World in Machine Readable Format
These posts are written or shared by QIC team members. We find this stuff interesting, exciting, and totally awesome! We hope you do too!