Thursday, 21 September 2017

2017 is the year we realise we've been doing the Internet wrong

2017 is the year we realise we've been doing the Internet wrong: "But there is a seam of libertarianism in technology which sees it as above and beyond the state in general and regulation in particular. Even as a replacement for it. Who needs a public sector if you have dual core processing? When tech was the poor relation in the global economy that could be interesting and disruptive. Now tech is the global economy, it is self-serving.

These apps were developed in a time of neoliberal consensus. The state was beaten and bowed, shrunk to its role of uprooting barriers and getting out of the way of the brilliant, innovative, invisible hand of the private sector. When I was at Ofcom in the 2000s we strove valiantly, day and night, to avoid any regulation of the internet, even where that included consumer rights and fairer power distribution." 'via Blog this'

We need universal digital suffrage to make technology work for us all | Prospect Magazine

We need universal digital suffrage to make technology work for us all | Prospect Magazine: "Universal Digital Suffrage

Reliable access to the internet is a prerequisite for being a digital citizen—but 14 per cent of adults said they did not have access to the internet at home in 2016.

The Tory/Lib Dem Coalition forgot digital inclusion for most of its tenure—when it finally remembered it demonstrated a poverty of ambition. Its target for inclusion was (and remains) 90 per cent. So one in ten will never have access to digital services.

And the current Government is so utterly unambitious about broadband provision that it has now re-announced the same pot of broadband money three times.

 In the 19th Century the Tories finally came round to the idea that universal suffrage was a democratic prerequisite. The task now is to make them understand the importance of universal digital suffrage—or elect a Labour government that does." 'via Blog this'

Running a responsible ccTLD - Nominet

Running a responsible ccTLD - Nominet: "For over twenty years we have been working to maintain the relevance, stability, security and safety of the .UK domain. We keep pace with the criminals, stay ahead of the trends and ensure everyone understands the benefits of being part of the UK’s namespace.

 Like many of our fellow ccTLD registries, we are alert to the multitude of cyber threats facing our industry. We have invested significantly in our infrastructure and have developed a sophisticated network analytics capability, which can process the billions of requests moving around our DNS infrastructure and provide actionable insight.

The DNS can identify unusual patterns and trends that can be indicative of network threats and potentially criminal behaviours. Using the data we gather, we can protect our DNS but also provide insight to third party clients using it to see what is happening with their own system, either to help mitigate a cyber attack or gather data in the aftermath.

 As a country code registry, it’s been important to maintain equitable access to the namespace." 'via Blog this'

Theresa May's speech is just the latest in politicians wilfully misunderstanding the internet

Theresa May's speech is just the latest in politicians wilfully misunderstanding the internet: "As is so often the case, The Daily Mail started it. After the Parsons Green attack last week, the newspaper wasted no time in allocating blame. A day after the tube bombing, the Mail's front page headline read: WEB GIANTS WITH BLOOD ON THEIR HANDS. This isn't a new line of argument for the paper, which labelled Google "the terrorist's friend" after the Westminster attack in March.

As I wrote in the magazine back in April, the government (with the aid of particular papers) consistently uses the threat of terrorism to challenge tech giants and thus justify extreme invasions of our online privacy.

This year, Amber Rudd condemned WhatsApp's privacy-protecting encryption practices, the Snoopers' Charter passed with little fanfare, the Electoral Commission suggested social media trolls should be banned from voting, and now - just today - Theresa May has threatened web giants with fines if they fail to remove extremist content from their site in just two hours. 

 No one can disagree with the premise that Google, YouTube, and Facebook should remove content that encourages terrorism from their sites - and it is a premise designed to be impossible to disagree with. What we can argue against is the disproportional reactions by the government and the Mail, which seem to solely blame terrorism on our online freedoms, work against not with tech giants, and wilfully misunderstand the internet in order to push through ever more extreme acts of surveillance and censorship.

It is right for May to put pressure on companies to go "further and faster" in tackling extremism - as she is due to say to the United Nations general assembly later today. Yet she is demanding artificially intelligent solutions that don't yet exist and placing an arbitrary two hour time frame on company action.

In April, Facebook faced scrutiny after a video in which a killer shot a grandfather remained on the site for two hours. Yet Facebook actually acted within 23 minutes of the video being reported, and the delay was due to the fact that not one of their users flagged the content until one hour and 45 minutes after it had been uploaded. It is impossible for Facebook's team to trawl through everything uploaded on the site (100 million hours of video are watched on Facebook every day) but at present, the AI solutions May and other ministers demand don't exist. (And incidentally, the fact the video was removed within two hours didn't stop it being downloaded and widely shared across other social media sites). 

 As Jamie Bartlett, Director of the Centre for the Analysis of Social Media at Demos, told me after a home affairs committee report accused Facebook, Twitter, and YouTube of "consciously failing" to tackle extremism last year:

“The argument is that because Facebook and Twitter are very good at taking down copyright claims they should be better at tackling extremism. But in those cases you are given a hashed file by the copyright holder and they say: ‘Find this file on your database and remove it please’. This is very different from extremism. You’re talking about complicated nuanced linguistic patterns each of which are usually unique, and are very hard for an algorithm to determine.”

 At least May is in good company. Last November, health secretary Jeremy Hunt argued that it was up to tech companies to reduce the teenage suicide rate, helpfully suggesting "a lock" on phone contracts, referring to image-recognition technology that didn't exist, and misunderstanding the limitations of algorithms designed to limit abuse. And who can forget Amber Rudd's comment about the "necessary hashtags"? In fact, our own Media Mole had a round-up of blunderous statements made by politicians about technology after the Westminster attack, and as a bonus, here's a round-up of Donald Trump's best quotes about "the cyber".

But in all seriousness, the government have to acknowledge the limits of technology to end online radicalisation.

And not only do we need to understand limits - we need to impose them. Even if total censorship of extremist content were possible, does that mean it's desirable to entrust this power to tech giants?

As I wrote back in April: "When we ignore these realities and beg Facebook to act, we embolden the moral crusade of surveillance. We cannot at once bemoan Facebook’s power in the world and simultaneously beg it to take total control. When you ask Facebook to review all of the content of all of its billions of users, you are asking for a God." " 'via Blog this'

Julia Reda MEP – What the Commission found out about copyright infringement but ‘forgot’ to tell us

Julia Reda – What the Commission found out about copyright infringement but ‘forgot’ to tell us: "Copyright policy is usually based on the underlying assumption that copyright infringement has a direct negative effect on rightsholders’ revenues.

The most recent example for this kind of reasoning is the Commission’s highly controversial proposal of requiring hosting providers to install content filters to surveil all user-uploaded content. The Commission claims this measure is necessary to address a “value gap”, a supposed displacement of value from licensed music streaming services to hosting services like YouTube, which host a mixture of licensed and unlicensed content.

To properly discuss such far-reaching proposals, we clearly need to have access to all available evidence on whether such displacement actually takes place in practice.

 This study may have remained buried in a drawer for several more years to come if it weren’t for an access to documents request I filed under the European Union’s Freedom of Information law on July 27, 2017, after having become aware of the public tender for this study dating back to 2013." 'via Blog this'

Wednesday, 20 September 2017

Annual Report 2017: Information Law Group

The Information Law Group was established in 2014/15 and seed-funded by LPS in 2015/16, in addition to external funding from European Commission projects and the RDF in 2014/15. It held its second annual Seminar and other guest seminars, and its first annual PhD and Work in Progress Workshops on 20 June 2016.
Six external seminars were organized in 2016/17:
·        3rd Annual Information Law Seminar by Prof. Roger Brownsword, a joint event with the School of Law,
·        visiting speaker Hugh Tomlinson QC, a joint event with SCHRR,
·        Jeremy Olivier from Ofcom,
·        Prof. Andrea Matwyshyn from Northeastern University,
·        Dr Mélanie Dulong de Rosnay from CNRS Paris, visiting at LSE,
·        Justin Walford, senior editorial counsel at The Sun newspaper.
The workshops were held on 3 May 2017: the PhD workshop in the morning and the Work in Progress workshop in the afternoon (2pm-5.30pm). Holding both on the same day ensured that some professors attended the PhD workshop to give comments. There were three PhD presentations in the morning workshop, one internal to Sussex. Six Works in Progress were presented at the afternoon workshop, two internal to Sussex. The two discussants included one internal to Sussex. The external presenters came from Cambridge, Hertfordshire, Leeds and Tilburg/EBU; the external discussant was from LSE.
Total attendance across both workshops was approximately twenty (including PhD students, speakers, discussants and chairs).
In the course of planning the workshops, we also collaborated with the Crime Research Group's first Annual Lecture, so that attendees could go on to that lecture and its reception after the Work in Progress seminar concluded.
A significant outcome from the first Annual Seminar (2015) is that Chris Marsden has been invited to present a paper at a conference at Georgetown University in February 2018, to be published in the Georgetown Law Technology Review. The timescale shows that sustained research planning produces results over time: three years on from its establishment, the Group has achieved real visibility.
As a result of the Group's expansion, with the recruitment of Judith and Nicolo and our growing links to Engineering, Informatics, SPRU, IDS and the Sussex Humanities Lab/journalism programme, we plan to apply in academic year 2017/18 for the establishment of a more broad-based interdisciplinary Centre for Information Governance Research.

Thursday, 14 September 2017

Australia poised for media mergers after ownership reforms

Australia poised for media mergers after ownership reforms: "Australian media companies Fairfax and News International have lobbied for an easing to ownership restrictions, which limit companies to owning two of the three main media interests — radio, newspaper and television. They argue that cut-throat competition for advertising from Facebook and Google makes reform necessary, while the rise of internet competitors means there would be no loss of media diversity.

 Pauline Hanson’s One Nation party also won a commitment from the government to: force the state broadcaster ABC to disclose all staff salaries above A$200,000; legislate to ensure ABC coverage is fair and balanced; and order a review into “competitive neutrality” in broadcasting.

The removal of the “two out of three” rules on cross media ownership could aid a consortium led by Lachlan Murdoch, co-chairman of News Corp, in its battle to take over Ten. Its previous “conditional” offer for the Australian broadcaster was complicated by Mr Murdoch’s ties with News Corp, which already owns radio and newspaper assets in Australia, and may have fallen foul of existing media laws." 'via Blog this'