The central metaphor of the new Marvel film "Spider-Man: Far From Home" is that the news shouldn't be trusted. Many agree, but whether people label an outlet "fake" seems to depend on how far its slant differs from their own.

If you are a Republican and regard CNN as biased, having a conservative voice on the network does not make the outlet seem less biased; it makes the participant seem less credible, even if viewers believe the individual is honest.

It can seem a little maddening until we recognize that people do not make either/or decisions. On most issues there could be three or more points, and people will move toward or away from the space between them depending on the issue. Objectivity is one point in a triangle of trust regarding media, as are Honesty and Knowledge, finds a new Personality and Social Psychology Bulletin paper.

No one really believes in objectivity in modern times, and journalists no longer even put on the pretense of it. The assumption is that if you were hired at a particular network or publication, you either match the editorial bias, which will reflect the audience that pays the bills, or you are a token member of the dissent.

Grandparents are a less polarizing example than the media, notes Laura Wallace, PhD, a social psychologist at The Ohio State University and lead author of the paper. "Most everyone agrees that grandparents are honest. But if Grandma says that her grandson Johnny is the best soccer player around, most people will smile politely but not believe her. She's obviously biased."

The study has an obvious confounder: its participants were college undergraduates taking psychology courses, a selection that carries plenty of inherent bias of its own. In the experiment, 169 undergraduate students read a fictitious conversation between aid workers trying to decide how to allocate resources at the beginning of an Ebola outbreak in the Congo. They had to decide whether to allocate limited resources to Rutu, a rural area where the outbreak started, or Poko, a nearby city where the disease had spread. The aid workers were all described as "highly trained." One worker, Roger, advocated for sending aid to Rutu and, for some participants, was described as having worked in that area as a Peace Corps volunteer, which might indicate that he is biased. For other participants, this information was omitted, leaving no indication of bias.

After reading the conversation, participants completed a questionnaire in which they evaluated the aid workers' proposals. Results showed that when Roger was described as having a previous connection to Rutu, participants thought he was biased in his recommendation to send aid there, even though they also thought he was trustworthy, an expert in the field, and likable.

As a result, study participants thought his suggestion to send aid to Rutu was less credible, but only when they were told he had previously worked there.

So knowledge of bias may damage credibility, just as untrustworthiness does.

But bias and untrustworthiness do not always have the same consequences.

"In the case of biased, but honest sources, the information they present might only support one side of the issue, but at least people can treat the information as useful for understanding that side," Wallace said. "Untrustworthy sources may never be that useful."

In addition, the difference between a biased source and an untrustworthy source matters greatly if the source changes positions. In a separate, unpublished study, the same researchers found that when untrustworthy sources change their position, it does not make them any more or less persuasive.

"Untrustworthy sources are seen as unpredictable. You can't tell what position they are going to take and it is not seen as meaning anything if they flip-flop," she said.

When biased sources changed their positions on an issue, however, there was a positive effect on persuasion.

"People believe there must be new evidence that is really compelling to get a biased source to change positions and take the opposite side," Wallace said. "So there are sometimes differences in how effective biased sources are compared to untrustworthy ones."

Previous studies suggest that people tend to believe that those who agree with them are less biased than those who disagree with them, so this study used a topic and location college students were unlikely to know anything about, to try to control for pre-existing knowledge.