Facebook moderators were instructed not to remove extreme, abusive or graphic content from the platform even when it violated the company's guidelines, an undercover investigation has found.

While nudity is almost always removed, violent videos involving assaults on children, racially charged hate speech and images of self-harm among underage users all remained on Facebook after being reported by users and reviewed by moderators.

"These revelations about Facebook's content moderation are alarming, but not surprising," said Julian Knight, a member of the UK’s Digital, Culture, Media and Sport Select Committee.

The Conservative MP said: "Facebook has recently committed to reducing fake news and improving privacy on its platform, which is welcome.

"But they don't seem as committed to sacrificing profits made from extreme content, as is demonstrated by Channel 4's investigation."

Facebook called the practices "mistakes" which do not "reflect Facebook's policies or values".

In the investigation by Channel 4's Dispatches programme, a reporter worked at Cpl Resources, Facebook's largest centre for UK content moderation.

Over a six-week period between March and April this year, the reporter attended training sessions and filmed conversations in the Cpl offices in Dublin.

A particularly shocking video featured in the programme showed an adult man punching and stamping on a screaming toddler.

Moderators marked the video as disturbing - meaning users must click to view it - and allowed it to remain online, going on to use it in training sessions as an example of acceptable content.

One moderator filmed in the programme said: "If you start censoring too much then people stop using the platform. It's all about money at the end of the day."

Facebook told Dispatches the video should have been removed by moderators.

Roger McNamee, an early investor in Facebook who has since become highly critical of its impact on society, described such videos as the "crack cocaine" of the company's product.

Mr McNamee said: "It's the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform.

"Facebook understood that it was desirable to have people spend more time on site. If you're going to have an advertising-based business, you need them to see the ads so you want them to spend more time on the site."

Facebook's vice-president of global policy solutions, Lord Allan, disagreed.

"There is a minority who are prepared to abuse our systems and other internet platforms to share the most offensive kind of material," the former Liberal Democrat MP said.

"But I just don't agree that that is the experience that most people want and that's not the experience we're trying to deliver."

In one training session filmed by Dispatches, the group is shown a cartoon of a woman drowning a young white girl in a bathtub, accompanied by the caption "when your daughter's first crush is a little negro boy".

The trainer says such images should be ignored by moderators and allowed to remain online.

Facebook later told Dispatches the picture violated its policies on hate speech and that it would investigate the incident.

The investigation also found that extremist pages with large followings were given special consideration, treated in the same manner as pages belonging to governments and large news organisations.

The Facebook page for jailed far-right leader Tommy Robinson is given such protection, meaning frontline moderators cannot directly remove material which may violate policies.

Instead it is referred to a more senior reviewing team at Facebook known as "Cross Check".

"Obviously they have a lot of followers so they're generating a lot of revenue for Facebook," one moderator said of the fascist Britain First page, which was deleted in March after deputy leader Jayda Fransen was convicted of racially aggravated harassment.

In a statement released ahead of the programme's broadcast, Lord Allan said: "It's clear that some of what is shown in the programme does not reflect Facebook's policies or values, and falls short of the high standards we expect.