Your age could determine how likely you are to spread fake news on Facebook.
Facebook users older than 65 shared seven times more misinformation during the 2016 US presidential election season than users between the ages of 18 and 29, a study published Wednesday in Science Advances found. Conservatives were also more likely to share fake news than liberals or moderates.
Social media companies, including Facebook, have been ramping up their efforts to combat misinformation since the 2016 election. Facebook found evidence that Russian trolls used the social network to sow discord among Americans by spreading fake news and targeting ads at voters.
As part of the study, nearly 1,300 Facebook users shared their social media posts with researchers at New York University and Princeton University in 2016. More than 90 percent of the users didn't spread misinformation to their friends during the election season.
The study, though, found a link between a person's age and how likely they were to spread fake news. About 11 percent of people over 65 shared articles from a fake news site, while only 3 percent of users between 18 and 29 did so. The pattern held even after researchers controlled for other factors, such as a person's political ideology.
One theory is that older Facebook users lack "the level of digital media literacy" needed to determine if they should trust what they read online, the study notes.
Conservatives and Republicans also shared more fake news than Democrats and liberals, the study found. About 18 percent of Republicans in the study shared a link to a fake news site compared to 3.5 percent of Democrats.
Researchers looked at a list of fake news sites compiled by BuzzFeed News to define what content was false or misleading. Some of the fake news sites included The Denver Guardian, True Pundit and the Conservative Daily Post.
The study's authors, though, noted there could be limitations to their findings. Conservative Facebook users, for example, could have simply been exposed to more fake news on the social network, so the patterns the researchers found may not reflect how willing this group is to believe or share misinformation.
A Facebook spokesperson declined to comment on the study, but pointed to some of the company's efforts to combat misinformation. The social network, which has partnered with third-party fact-checkers, also has online tips to help users spot misinformation. Some of the tips include being wary of headlines with capital letters and exclamation points, looking closely at an article's link and investigating the source of a story.