Making privacy public

By Trisha Kulkarni

Fall quarter can bring with it many great experiences: the motivation of a new year of classes, the excitement for dining hall food that you can still stand to eat (for now) and of course, the Stanford Marriage Pact.

Every year, around November, a survey promises to help Stanford students meet their match. After filling out questions about their interests, values and future goals, each student is paired with someone who completed the form in a complementary way. Along with about 4,000 other students who participated this year, I got lost in the fun of filling out the form and awaiting my results. Only after the whole ordeal was over did I realize how easily I had handed so much personal information to an anonymous source without giving it a second thought.

Last year I took CS 181: Computer Science, Ethics, and Policy, which explored the implications of how technology is created and used. In the span of ten weeks, we discussed topics ranging from autonomous vehicles to social media echo chambers to facial recognition. One of our units focused on data privacy and the limitations of our current approach to the subject. By the end of the class, I was overwhelmed by the details I had overlooked in my daily habits, such as clicking “accept” without reading a privacy notice or continuously feeding an app or website information about my interests in order to customize my experience.

At first, I chalked up my inattention to these issues to my own obliviousness. It was only after the class that I realized that each of us is constantly putting our data out into the world, often with little thought, and, even worse, that there are few ways to avoid doing so in the digital age.

When we unlock our phones, we give companies data about our facial features. When we watch videos or shop on Amazon, we provide data on our interests and needs as consumers. When we share a post on Facebook, we establish which views we endorse or object to. In each of these cases, we tell ourselves that we are volunteering our data to improve our own user experience. But do we really have a choice?

Even if I resolve to read privacy agreements in 2020, wading through confusing jargon will not change my options: either accept the conditions and use the app, or refuse and go without. The reality is that we are not going to boycott the Internet, and without changing our approach to privacy, we will keep handing endless information about our personal lives to a wide array of sources.

Instead, in both official and unofficial spaces, we need to start having conversations about how to protect ourselves in an ever-more-complicated digital world. True freedom to use technology at will begins with awareness of the choices we have to create the best and safest user experience. Large-scale privacy problems exist, but we, as individuals, need to be equipped to combat them.

Contact Trisha Kulkarni at trishak8 ‘at’ stanford.edu.
