Too much television-watching in young adulthood could lead to cognitive problems in mid-life, suggested a study Wednesday that tracked more than 3,000 people for 25 years.
A quarter century of telling white coats about your glowing-screen drool-dish habits and making lots of that long green in the process. And hey, we might have even learned something, although the unbelievably flawed design of this study might undermine that a little bit. Still, two and a half decades well spent, everyone; give yourselves a round of applause.
People who reported watching more than three hours of television per day as young adults were twice as likely to suffer from poor cognition down the road, compared to those who were more active and reported less screen time.
Maybe think twice before that "Ow! My Balls!" marathon. Poor cognition, a nice new euphemism for good ole down home idiocy we've all come to appreciate and cherish.
The research, published in the Journal of the American Medical Association's JAMA Psychiatry, tracked 3,247 adults who were aged 18-30 when they enrolled in the study.
And yet there are a few cranks who claim that academic rigor is vanishing from the mush sciences.
Scientists assessed cognitive function in the 25th year using three tests of mental processing speed, executive function and verbal memory.
No electric shocks, unlawful imprisonments or giant mazes? Man, you used to be cool, Psychiatry.
Low levels of physical activity and lots of television-watching were linked to slower processing speed and worse executive function, the study found.
Oh, by the way there were also these other confounding variables, but you're getting your news from Yahoo so you probably didn't understand most of these words anyway.
Verbal memory, however, did not appear to be affected by the amount of television time.
It's time to build your vocabulary by passively absorbing hours of "Gattaca: The Series."
The study was led by Tina Hoang of the Northern California Institute for Research and Education at the Veterans Affairs Medical Center, San Francisco; and Kristine Yaffe of the University of California, San Francisco.
That laid-back California attitude. I'll have the study done in twenty-five years, mom. Quit bugging me and harshing my buzz.
We win again! Take that, Wallonia! USA! USA!
According to Andrew Przybylski, an experimental psychologist at the University of Oxford who was not involved in the research, the study had several shortcomings.
Yeah, no kidding.
"First, these data rely entirely on a potentially problematic self-reported measure for television time," he said in a statement.
You should have hooked them up to wires like a real scientist.
Researchers also did not assess participants' cognitive function at the beginning of the study, which left them without a baseline for comparison.
"I didn't feel like doing it, mom! I'm sure it won't matter."
He also pointed out that "nearly one in three participants did not complete the study," further weakening the strength of the findings.
I could not complete this brutal marathon of writing down my idiot box habits and getting paid.
"Taken together, the work should provoke continued conversation about the nature of different forms of interactive media and underline the value of open science methodology including open datasets, pre-registered analysis plans, and robust and open peer review process," he said in a statement.
I mean, were you suckas even thinkin' bout yer pre-registered analysis plans? Your dataset is wack.
"Until these innovations are introduced into this research literature, we will be left scratching our heads at studies like this."
Now if you'll excuse me, I have an award-winning "selfie addiction" study to conduct.