Stanford students now spend four-fifths of the waking day staring at a screen; is this the new college normal?

Opinion by Conrad Safranek
July 8, 2020, 10:07 p.m.

So far during online college classes, Stanford students have averaged 78% of their waking day (12.4 hours per day) staring at a laptop, tablet or phone, a jump from an already high 50% during regular classes. The data suggest that we are not equipped for this new normal.

Imagine twenty years ago someone tried to convince you of this claim:

“By the year 2020, you will spend almost every waking moment within reaching distance of a charged hunk of metal. What’s more, your current ‘normal’ day will be cut in half to make room for 8 hours of sedentary time staring at a screen.”

At the start of the century, most would have laughed off this prediction. Such global and fundamental shifts in human behavior normally unfold at evolutionary rates, on the order of millennia or at the very least centuries. Right? Twenty years later, for many college students, this 50% cut to our regular day to make room for our screens has become an inescapable reality.

I recently conducted a longitudinal study of 16 Stanford undergraduates to track screen time before and after the start of quarantine and online classes. Preliminary results show that average screen time across devices during a regular week of in-person classes consumes 50.2% of the waking day. What’s more, this has risen to a staggering 77.6% of our waking day due to the recent switch to online classes during the COVID-19 lockdown. Assuming eight hours of sleep, this means non-screen time has dwindled from an already-low eight hours per day during regular classes to a now measly 3.6 hours. In short, all your in-person time, your time in nature, your time cooking and eating has been compressed into a small sliver of what it used to be.
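For readers who want to check the arithmetic, here is a minimal sketch of how the non-screen hours above follow from the reported percentages, assuming the eight hours of sleep stated in the paragraph (the variable names are illustrative, not from the study itself):

```python
# Back-of-the-envelope check of the screen-time figures above.
# Assumes 8 hours of sleep, leaving 16 waking hours per day.
WAKING_HOURS = 24 - 8  # 16 waking hours

def non_screen_hours(screen_fraction):
    """Waking hours left over after screen time."""
    return WAKING_HOURS * (1 - screen_fraction)

# Reported screen-time shares of the waking day
in_person = non_screen_hours(0.502)  # regular, in-person classes
online = non_screen_hours(0.776)     # online classes under lockdown

print(f"In-person classes: {in_person:.1f} non-screen hours/day")  # ~8.0
print(f"Online classes:    {online:.1f} non-screen hours/day")     # ~3.6
```

The same arithmetic also recovers the 12.4 hours per day of screen time cited at the top of the piece (16 × 0.776 ≈ 12.4).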


So, where is all the time going, and what are the implications of this shift? There are two things to consider here:

First, psychology has shown us that self-control and willpower draw upon a limited pool of energy that can be used up. This “ego depletion” means that when you expend energy on a task that requires significant focus, you subsequently become less able to sustain the same level of self-control required to resist temptations and quick reward.

Second, and unfortunately quite related, is the fact that nearly every screen is fraught with distractions. Any task on your laptop or phone sits one instant switch away from a different one. Whether it’s the ping of an incoming email, a Snapchat banner at the top of your phone or just a stray memory of something you forgot to do earlier, a simple swipe lets you switch tabs, windows or applications to chase the new thought.

Stitch together ego depletion and ease of distraction, and you can imagine how increasing screen time bodes poorly for our focus and self-control. The more time we spend on our devices, the less able we are to fend off distractions and temptations. As part of the study, each Stanford student checked their phone and laptop screen time history to report how they spent their device time during regular and online classes respectively (as defined by Apple’s screen time history categorizations).

The data showed that, with the increase in screen time that came with online classes, the percent of total device time spent on “Productivity” took a dip while the categories of “Entertainment” and “Social Networking” rose. This is perhaps surprising, especially considering that all our class time has been funneled into Zoom, which is categorized as productivity. Despite these extra hours, the share of device time spent actually focused is shrinking as our waning willpower gives way to distractions.


The most worrisome part of it all is the utter lack of understanding as to how these hours of device time may be affecting us.

Some of the brightest minds from universities around the world are being plucked by big tech companies to apply their expertise towards one ominous end: building tech that, in a word, gets “used.” The success of a new phone app or social media platform is measured by how many minutes users rack up glued to the product and how deeply it embeds dependency into daily life.

Tech is designed to be addictive. Take a look at your iPhone email. On your home screen a bright red circle sits on the app icon inviting you to pop it open. You swipe down on your mail to refresh the page, leading to a variable reward — sometimes good news, sometimes a chore needing to be dealt with and sometimes nothing. This “variable ratio” reward, similar to the intermittent reinforcement of a slot machine, has been shown in behavioral studies to be the most habit-forming type of conditioning. What’s more, responding to an email takes 30 seconds and leaves you with a spike of dopamine and a feeling of accomplishment. No wonder it’s hard to sustain focus on that 60-page online textbook reading that you have open in the other tab.

What little research has been conducted is focused largely on screen time among youth, and the initial results are not great. For youth and adolescents, hallmark studies have demonstrated a correlation between social media use and decreased self-esteem, as well as an association between late-night device access and disordered sleep. Furthermore, high screen time among teens has been directly linked to disease outcomes, ranging from obesity to depression. Some studies have even gone as far as paralleling screen time overuse with the craving behaviors of substance dependence.

Adults are rightfully worried about kids, but perhaps they should also be worried about themselves. We currently have very little understanding of how 56 (and now, because of quarantine, close to 90) hours of weekly screen time are affecting us. Studies on adults are currently limited, in part because tech companies do not want to make user data accessible. For example, Apple currently refuses to let users choose whether to release their own screen time history data to third-party apps that might help them improve their focus and decrease their screen time, let alone to a researcher trying to better understand device usage.

So, what’s next? I predict we’re going to see a massive backlash against screens in the next ten years. To get there, though, we need to take some proactive steps towards actually understanding how screens are impacting both youth and adult populations. Once research has built a case for how important it is to better manage screen time, my next hope is that powerful tools will be developed and integrated across devices to help us take back some control over how we interact with our devices.


In closing, consider this eerie 1940s image of children running through a cloud of D.D.T. At the time, the insecticide was considered a miracle of the modern world and used for everything from lice removal to hospital bed sanitization.

Though it is perhaps dramatic to compare screen time to D.D.T., time and time again in history technologies have expanded faster than society can understand how they impact us. While there’s no denying the vast utility of smartphones and laptops, as screen time continues to eat up more and more of our waking day, it’s time we start thinking more critically about how this may be affecting us, and how we can build tools to manage it.

Contact Conrad Safranek at conradws ‘at’

