18 March 2015

FPB earmarks R8m to censor online

Film and Publications Board panel discussion.

ANALYSIS
Soon, the Film and Publications Board (FPB) hopes it will “have the power to order an administrator of any online platform to take down any content that the Board may deem to be potentially harmful and disturbing to children of certain ages.”

In other words, if anyone again decides to display a presumed presidential penis online, the FPB will be there to save the day.

In 2012, when artist Brett Murray did just that with his controversial painting “The Spear” (featuring the exposed genitals of a man strongly resembling President Jacob Zuma), the FPB “classified” the painting as unsuitable for children under the age of 16. By that time, however, the painting itself had been defaced while pictures of the original spread online, and so the FPB decided it too had to venture online, and censor it there.

It found that it lacked the reach to do so, and was still scratching its collective head about how to go about censoring everyone from Wikipedia to local news websites when its own appeals board slapped it down – hard – for classifying the painting in the first place.

“The Spear” was declassified, but the underlying problem continued to haunt the FPB: how does it censor the web?

Earlier this month it delivered an answer, by way of a 14-page set of draft regulations for online content that would, in theory, have given it the power to order Wikipedia et al to remove the image, and then send them an invoice for the cost of doing so.

Much like its conduct with “The Spear”, the draft regulations show a pig-headed insistence on replicating the FPB’s real-world censorship role in an online space. And although there are elements of the regulations that seem set to make it into law, other aspects are impractical, unconstitutional, and sometimes downright silly.

‘Lack of understanding’
“A lot of the difficulties with the draft [regulations] flow from the FPB’s lack of understanding of how content is distributed online,” says Dominic Cull of Ellipsis Regulatory Solutions, an expert adviser to the Internet Service Providers’ Association (Ispa) and other bodies directly affected by the FPB’s efforts. “They deal well with the easy stuff, but don’t understand where the easy stops and the hard begins.”

The “easy stuff” is television’s move online. The FPB is, uncontroversially, responsible for providing age ratings for broadcast television and films in cinemas, down to the trailers for movies.

That kind of entertainment is rapidly moving towards online streaming, with several home-grown services now offering films and series on demand. The amount of content these streaming services can make available is enormous – in theory, every film ever produced anywhere on the planet – and well beyond the FPB’s capacity to review each title. So the board has proposed co-regulation, which will allow each streaming provider to classify its own content with an in-house team of people after they are trained, for no more than five days, by the FPB.

By June 2016, everything such streaming providers make available to South African audiences must be rated. That may prove a challenge for in-house teams too, so the FPB draft regulations make another concession – content classified under another system can be “deemed” classified by the FPB, if the regimes are sufficiently similar. In practice, a 13 (language) or 16 (nudity) classification imposed by regulators in the US or Europe will be accepted for local use, and everyone can get on with their streaming.

Policing online – the hard stuff
So far so good, but then the FPB delves into “hard stuff”, the world of what it describes as self-generated content. This, the draft regulations say, could include “a drawing, picture, illustration or painting; recording or any other message or communication, including a visual presentation, placed on any distribution network including, but not confined to, the internet”.

Translation – make a sexy video and upload it to YouTube and the FPB could be coming for you. Likewise if you tweet a picture of your artistic interpretation of the genitals of a public figure. And perhaps even if you describe the results of a botched circumcision on your blog.

Streaming services will be responsible for their own classification expenses, and those who distribute “self-generated” content can expect an invoice from the FPB once it has decided to classify such content of its own accord. So it appears that finding harmful content is the driver behind an expected eight-fold increase in the money the FPB says it will need to police the online space.

This week the department of communications, under which the FPB falls, published its budget forecasts. In the last financial year, the budget shows, the FPB spent under R1-million for “online and mobile content regulation”. By 2016 that is expected to increase to R8.2-million.

The department of communications has long dreamt of having a crack team of people surfing the internet and looking for things that must be policed. In 2002, in the Electronic Communications and Transactions Act, these “cyber inspectors” were given wide powers to inspect websites, with the unspoken justification that they would be a first line of defence against child pornography, as well as a guard against cyber crooks, cyber attackers and any other baddies to whom the prefix could be attached.

Internal disputes
Despite several promises in the years since, cyber inspectors were never appointed. One reason for the long delay was a fierce fight between various government departments for the staff, budget, and prestige that would come with policing the internet. Departments dealing with children and justice all wanted a piece of the action, and fought every attempt to implement the cyber inspector provision to a standstill.

Much the same seems set to happen with the FPB’s new regulations in the near future.

The department of state security is expected to release, within weeks, a first draft of the Cybercrimes and Related Matters Bill. That law is being drafted behind closed doors, but is understood to be a clear victory for the state security department over the department of telecommunications and the department of communications. By several accounts the law will make online child pornography a matter for security services rather than censors.

That will be a significant blow for the FPB, which, in its arguments for broader powers, often insidiously conflates child pornography with sexual content to which children should not be exposed.

That, in turn, will leave the FPB with only the old argument that the internet is a minefield of sex, and that somebody must think of the children – arguments that have foundered time and again on the rocks of technical infeasibility.