Wednesday, 26 April 2017

AI report fed by DeepMind, Amazon, Uber urges greater access to public sector data sets | TechCrunch

AI report fed by DeepMind, Amazon, Uber urges greater access to public sector data sets | TechCrunch: "Ultimately, the report does call for “urgent consideration” to be given to what it describes as “the ‘careful stewardship’ needed over the next ten years to ensure that the dividends from machine learning… benefit all in UK society.” And it’s true to say, as we’ve said before, that policymakers and regulators do need to step up and start building frameworks and determining rules to ensure machine learning technologists do not have the chance to asset strip the public sector’s crown jewels before they’ve even been valued (not to mention leave future citizens unable to pay for the fancy services that will then be sold back to them, powered by machine learning models freely fatted up on publicly funded data).

 But the suggested 10-year time frame seems disingenuous, to put it mildly. With — for instance — very large quantities of sensitive NHS data already flowing from the public sector into the hands of one of the world’s most market capitalized companies (Alphabet/Google/DeepMind) there would seem to be rather more short-term urgency for policymakers to address this issue — not leave it on the back burner for a decade or so. Indeed, parliamentarians have already been urging action on AI-related concerns like algorithmic accountability." 'via Blog this'

Tuesday, 25 April 2017

These internet firsts will remind you how far we've come - Business Insider

These internet firsts will remind you how far we've come - Business Insider: "In October 1969, UCLA student Charley Kline was attempting to send the word “login” over to the Stanford Research Institute using the internet’s precursor: ARPANET.

At first, the system crashed, only managing to send the letters “l” and “o”. But an hour or so later, the full message was successfully sent and history was made:" 'via Blog this'

Monday, 24 April 2017

SCL: European Net Neutrality, at last?

SCL: European Net Neutrality, at last? Luca Belli and Chris Marsden review the long history of developments, and the latest position, on net neutrality in Europe, amid some hopeful signs.

Net neutrality is the principle mandating that internet traffic be managed in a non-discriminatory fashion, in order to fully safeguard internet users' rights. On 30 August 2016, all EU and EEA members finally obtained guidance on how to implement sound net neutrality provisions. The path has been tortuous and uneasy, starting from 'not neutrality', reaching an open Internet compromise and, finally, attaining net neutrality protections. In this article, we aim briefly to recount how net neutrality evolved in Europe and how much significant progress has been made by the recently adopted net neutrality Guidelines. 'via Blog this'

Saturday, 15 April 2017

The Low-Down: Streaming Now Makes Most of the US Music Industry's Revenue

The Low-Down: Streaming Now Makes Most of the US Music Industry's Revenue: "Overall last year, retail revenues from recorded music in the US grew 11.4 percent to $7.7 billion, the biggest gain since 1998, according to the RIAA. Even with such growth the industry is still licking its wounds from the last decade and a half -- sales remain about half what they were in 1999, the heyday of the CD.
Subscriptions, like the monthly fees for Apple Music or Spotify's paid tier, were the biggest money maker at $2.3 billion, and they basically doubled from a year earlier, the RIAA said." 'via Blog this'

Thursday, 13 April 2017

FCA Publishes Discussion Paper on the Regulation of DLT (blockchains)

FCA Publishes Discussion Paper on the Regulation of DLT: "The FCA continues its ‘wait-and-see’ approach before considering changes to its framework. It will instead explore emerging business models and continue to help innovators test-bed solutions in its regulatory sandbox.

The FCA remains technology neutral/agnostic, but it is encouraging to note its approach to resilience and openness to regulating on technology outcomes, in line with statutory objectives.

The paper also recognises that DLT is not a panacea, and that market outcomes like faster payments could be delivered by other technologies. It is, however, indicative of an increasingly mature approach to technology risk, and the paper does recognise DLT’s innovative potential for record-keeping and efficiency.

 With a voluntary standards process also underway and increasing regulatory accommodation, end-users will be more accepting of the increasing trust that DLT affords, allowing benefits around efficiency, transparency and provenance to be fully realised. This much is very encouraging for UK DLT and cements the UK’s position as a global fintech hub with a forward-looking regulatory regime." 'via Blog this'

Wednesday, 12 April 2017

Where to after Watson: The challenges and future of data retention in the UK (BIICL)

Where to after Watson: The challenges and future of data retention in the UK (BIICL): "The judgment of the CJEU in the Watson case was handed down shortly before the year's end in 2016. The determination that member states may not impose on communications providers a general obligation to retain data was applauded by privacy groups and has undoubtedly caused disquiet among those involved in policing and intelligence. What parliamentarians and judges will make of it in the coming months - and, post-Brexit, years - is both uncertain and important.

In this event experts will examine the strengths, weaknesses and implications of the decision, with an eye to rights protections, the need to combat serious crime, and the practicalities of managing both in light of the European Court's decision." 'via Blog this'

Monday, 10 April 2017

Balkinization: Assessing Algorithmic Authority

Balkinization: Assessing Algorithmic Authority: "Compared to these examples, the obscurity at the heart of our "cultural voting machines" (as I call dominant intermediaries) may seem trivial. But when a private entity grows important enough, its own secret laws deserve at least some scrutiny.

 I have little faith that such scrutiny will come any time soon. But until it does, we should not forget that the success of algorithmic authorities depends in large part on their owners' ability to convince us of the importance--not merely the accuracy--of their results. A society that obsesses over the top Google News results has made those results important, and we are ill-advised to assume the reverse (that the results are obsessed over because they are important) without some narrative account of why the algorithm is superior to, say, the “news judgment” of editors at traditional media.

(Algorithmic authority may simply be a way of rewarding engineers (rather than media personalities) for amusing ourselves to death.) " 'via Blog this'

Data Ethics Group - The Alan Turing Institute

Data Ethics Group - The Alan Turing Institute: "Made up of academics specialising in ethics, social science, law, policy-making, and big data and algorithms, the Data Ethics Group will drive the Institute’s research agenda in data ethics, and work across the organisation to provide advice and guidance on ethical best practice in data science.

The Group will work in collaboration with the broader data science community, will support public dialogue on relevant topics, and will set open calls for participation in workshops, as well as public events.

In a connected project, The Alan Turing Institute is participating in the Royal Society and British Academy project on data governance." 'via Blog this'

Do robots have rights? The European Parliament addresses artificial intelligence and robotics

Do robots have rights? The European Parliament addresses artificial intelligence and robotics: "The European Parliament has put forward initial proposals in its resolution on legal rules for machines that are able to act with a high degree of autonomy and take their own decisions through being equipped with AI and having physical freedom of movement.

This will not be the final word on the matter from a legal perspective, and we are still some years away from corresponding laws being enacted. In the meantime, technical development in the field of AI and robotics will not wait for national or European lawmakers and is set to continue unabated. It remains to be seen whether technical progress might not soon overtake the legal discussion.

 Aside from the legal issues surrounding robotics, lawyers will be interested to see how AI finds its way into our own professional lives. There has been a lot of talk recently about legal tech and digital transformation in relation to legal advice. Yet just looking at the numerous new legal issues that arise in connection with AI and robotics, robots appear to be creating as much new work for us on the one hand as intelligent assistants will be able to take over on the other." 'via Blog this'

Sunday, 9 April 2017

International stakeholder engagement - Ofcom

International stakeholder engagement - Ofcom: "Ofcom hosts an International Stakeholders Forum (ISF) every 4 months. This is the primary means through which we aim to update UK stakeholders on our international activities. We also use these as an opportunity to share information, as well as impressions, on international policy developments.  As well as this forum, we hold dedicated spectrum briefing sessions, details of which can be found here.

 For further information on these meetings and if you want to be added to the circulation list please email" 'via Blog this'

The ongoing war on encryption – TechnoLlama

The ongoing war on encryption – TechnoLlama: "Calls to have technology firms offer backdoor access to private and encrypted communications must be read as a call to endanger everyone’s communications by making them easier to read by hackers. Moreover, encryption is not proprietary, it is just a clever use of maths, and there is no way that governments will ever be able to ban that.

If somehow an app is made vulnerable, terrorists will move to another method, and we the public will still be left vulnerable.

But you may argue that we should never give up, and that the fight against terrorism is a worthy cause. It certainly is, but we cannot give up our expectations of security on the assumption that somewhere terrorists are using an encrypted tool to communicate with one another. There is little evidence that this is the case, and even strong evidence to the contrary. The Paris terrorists used unencrypted burner mobile phones to communicate, and also favoured face-to-face contact.

 We cannot give away our rights based on fables and ignorance." 'via Blog this'
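The article's point that encryption "is just a clever use of maths" is easy to demonstrate: a few lines of code implement an unbreakable cipher. Below is a toy one-time-pad sketch in Python, an illustration of the principle only, not a production cipher; real systems use vetted libraries.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte;
    # applying the same operation again recovers the original.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at dawn"
key = os.urandom(len(message))  # one-time pad: random key as long as the message
ciphertext = xor_cipher(message, key)
assert xor_cipher(ciphertext, key) == message  # decryption is the same XOR
```

Used once with a truly random key, this scheme is information-theoretically secure, which underlines the argument: the mathematics is public knowledge and cannot be legislated away.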

From Smart Cities 1.0 to 2.0: it's not (only) about the tech

From Smart Cities 1.0 to 2.0: it's not (only) about the tech: "Today’s Internet of Things technologies, data analytics platforms and sensor-enabled services are sure to deliver new ways to understand, visualise and analyse the nature and scale of many of our most pressing urban challenges.

 But solving challenges such as waste management, urban liveability and land-use planning will require more than technology investments, data-capture services or digital prototypes. Solutions will also depend on effective long-term partnerships within and beyond government.

While the digital infrastructure is no doubt important, it will be the city governments that invest in new ways to collaborate and co-innovate that will ultimately lead the way in delivering the smarter, more responsive services our cities so desperately need." 'via Blog this'

Saturday, 8 April 2017

Bundeskartellamt 18th Conference on Competition, Berlin, 16 March 2017 | European Commission

Bundeskartellamt 18th Conference on Competition, Berlin, 16 March 2017 | European Commission: "The challenges that automated systems create are very real. If they help companies to fix prices, they really could make our economy work less well for everyone else.

So as competition enforcers, we need to keep an eye out for cartels that use software to work more effectively. If those tools allow companies to enforce their cartels more strictly, we may need to reflect that in the fines that we impose.

And businesses also need to know that when they decide to use an automated system, they will be held responsible for what it does. So they had better know how that system works.

 In The Hitchhiker's Guide to the Galaxy, the Guide in question was a sort of electronic book. Although it was often wildly inaccurate, it was also a huge success. That was partly because of the words printed in big, friendly letters on the cover: “Don't Panic”.

I think that's good advice. We certainly shouldn't panic about the way algorithms are affecting markets." 'via Blog this'
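The mechanism the speech worries about can be made concrete with a toy sketch (my illustration, not from the speech): a pricing bot that instantly matches any rival undercut removes the gain from secret discounting, which is what can make a software-enforced cartel more stable than a handshake one.

```python
def enforcement_bot(agreed_price: float, rival_price: float) -> float:
    """Post the cartel's agreed price while the rival honours it, but
    retaliate immediately by matching any undercut, so a deviator
    wins no customers and deviation stops paying."""
    if rival_price < agreed_price:
        return rival_price  # automatic, instant punishment
    return agreed_price

print(enforcement_bot(100.0, 100.0))  # 100.0 — cartel price holds
print(enforcement_bot(100.0, 90.0))   # 90.0 — undercut matched at once
```

The competition-law point is that no human ever needs to pick up the phone: the software does the monitoring and the punishing, which is why enforcers say firms will be held responsible for what their systems do.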

The IPKat: First live blocking order granted in the UK

The IPKat: First live blocking order granted in the UK: "This is an important order that demonstrates how technological advancement prompts a re-consideration of traditional approaches, including whether intermediary injunctions should be only aimed at blocking access to infringing websites [the answer appears to be no, and this order may pave the way to even more creative enforcement strategies in the future].

 Arnold J's decision shows how the law - including the one on blocking orders - is subject to evolution. This is so also to permit that the 'high level of protection' that the InfoSoc Directive [from which s97A CDPA derives] intends to provide is actually guaranteed.

 As far as the GS Media 'profit-making intention' is concerned, to some extent the view of Arnold J appears somewhat narrower (but practically not dissimilar) than that of other courts, eg the District Court of Attunda in Sweden [here, here, and here] that have applied GS Media so far. Further applications of GS Media by UK courts are however keenly awaited." 'via Blog this'

Wednesday, 5 April 2017

Finding Proportionality in Surveillance Laws – Andrew Murray, Inforrm's Blog

Finding Proportionality in Surveillance Laws – Andrew Murray | Inforrm's Blog: "Much of the Bill’s activity is to formalise and restate pre-existing surveillance powers. One of the key criticisms of the extant powers of the security and law enforcement services is that the law lacks clarity. Indeed it was this lack of clarity which led the Investigatory Powers Tribunal to rule in the landmark case of Liberty v GCHQ that the regulations which covered GCHQ’s access to emails and phone records intercepted by the US National Security Agency breached Articles 8 and 10 of the European Convention on Human Rights.

Following a number of strong critiques of the law including numerous legal challenges the Government received three reports into the current law: the report of the Intelligence and Security Committee of Parliament, “Privacy and Security: A modern and transparent legal framework”; the report of the Independent Reviewer of Terrorism Legislation. “A Question of Trust”; and the report of the Royal United Services Institute: “A Democratic Licence to Operate”. All three reported deficiencies in the law’s transparency.

 As a result the Bill restates much of the existing law in a way which should be more transparent and which, in theory, should allow for greater democratic and legal oversight of the powers of the security and law enforcement services. In essence the Bill is split into sections: interception, retention, equipment interference and oversight, with each of the three substantive powers split again into targeted and bulk." 'via Blog this'

Tim Berners-Lee: selling private citizens' browsing data is 'disgusting' Guardian

Tim Berners-Lee: selling private citizens' browsing data is 'disgusting' | Technology | The Guardian: "The Twitter folks, who crowed about how great anonymity was for the “Arab spring” – never say that without quotes – then suddenly they find that this anonymity is really not appreciated when it’s used by nasty misogynist bullies and they realize they have to tweak their system to limit not necessarily behavior but the way it propagates. They’ve talked about using AI to distinguish between constructive and unconstructive comments; one possibility is that by tweaking the code in things, you can have a sea change in the way society works." 'via Blog this'

Final Programme: PhD WIP workshop 3 May 11am-1pm

-  Elif Mendos Kuşkonmaz (Queen Mary University of London): The EU-US PNR Agreement under EU Privacy & DP law
-  Paul Pedley (City, University of London): Protecting the privacy of library users
-  Maria Bjarnadottir (Sussex): Who is the guarantor of human rights on the internet?
Chair: Chris Marsden (Sussex)
Discussants: Nico Zingales (Sussex), Andres Guadamuz (Sussex). 
Logistics: 11am-1pm, 3 May, in the Moot Room, Freeman Building, University of Sussex.
Afternoon Workshop: all PhD attendees are registered to attend the afternoon workshop, 2pm-5.30pm in F22, without charge (programme here), the evening lecture by the Europol Director, and the drinks reception in Fulton B at 6.30pm.

UPDATE: Special guest speaker and drinks reception for Annual WIP Seminar

In addition to a packed afternoon of talks - and a morning PhD WIP workshop - we will also be able to attend the Annual Lecture by Rob Wainwright, Director of Europol, whose talk is likely to touch on issues of cybercrime and online liability. This will run 5.30-6.30pm - the afternoon concludes with a free drinks reception outside Fulton B from 6.30pm onwards.

Tuesday, 4 April 2017

Minister explains Rudd's 'necessary hashtags' after week of confusion | Technology | The Guardian

Minister explains Rudd's 'necessary hashtags' after week of confusion | Technology | The Guardian: "PhotoDNA has been successfully used in the fight against online child abuse imagery, but is less well suited to extremist content due to the broader nature of such material. Nonetheless, in December 2016, social media firms including Facebook, Twitter, Google and Microsoft committed to contribute image and video hashes of terrorist content to a shared database, to speed discovery and takedown of material that breaches each site’s terms of service." 'via Blog this'
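The shared-database scheme described above works by exchanging content hashes rather than the content itself. PhotoDNA's perceptual hash is proprietary and robust to re-encoding; the sketch below substitutes SHA-256 (which, unlike a perceptual hash, only matches byte-identical files) purely to show the contribute-and-lookup flow, with hypothetical function names.

```python
import hashlib

shared_hash_db: set = set()  # the cross-platform database holds hashes only

def content_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash such as PhotoDNA; SHA-256 matches
    # only byte-identical files, unlike a real perceptual hash.
    return hashlib.sha256(data).hexdigest()

def contribute(data: bytes) -> None:
    """A platform that removes content under its terms of service
    contributes the hash, never the material itself."""
    shared_hash_db.add(content_hash(data))

def is_known(data: bytes) -> bool:
    """Other platforms screen uploads against the shared hash set."""
    return content_hash(data) in shared_hash_db

contribute(b"removed-video-bytes")
print(is_known(b"removed-video-bytes"))  # True
print(is_known(b"unrelated-upload"))     # False
```

The design choice matters for privacy and copyright alike: platforms never redistribute the offending material, only a fingerprint that lets others recognise it.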

2016 No.607: The Open Internet Access (EU Regulation) Regulations 2016

"19.—(1) Where OFCOM determine that there are reasonable grounds for believing that a person
is breaching, or has breached an obligation under Articles 3, 4 or 5 of the EU Regulation or under
these Regulations they may give that person a notification under this regulation.

21.—(1) The amount of a penalty notified under regulation 19 (other than a penalty falling
within regulation 20(5)) is to be such amount as OFCOM determine to be—
(a) appropriate; and
(b) proportionate to the breach in respect of which it is imposed,
but in the case of a breach of an information requirement not exceeding £2,000,000, and in the
case of any other breach of the EU Regulation or these Regulations, not exceeding ten per cent. of
the turnover of the notified person’s relevant business for the relevant period."
'via Blog this'
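Regulation 21's cap can be restated as a simple rule: £2,000,000 for a breach of an information requirement, otherwise ten per cent of relevant-business turnover. A minimal sketch of the cap alone follows; the actual penalty must also be "appropriate" and "proportionate", which OFCOM determines case by case.

```python
def penalty_cap(information_requirement_breach: bool, relevant_turnover: float) -> float:
    """Maximum penalty under regulation 21 of SI 2016/607 (cap only)."""
    if information_requirement_breach:
        return 2_000_000.0           # fixed cap for information-requirement breaches
    return 0.10 * relevant_turnover  # ten per cent of relevant-business turnover

print(penalty_cap(True, 500_000_000.0))   # 2000000.0
print(penalty_cap(False, 500_000_000.0))  # 50000000.0
```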

Tuesday, 28 March 2017

A Longitudinal Measurement Study of 4chan’s Politically Incorrect Forum and its Effect on the Web – Bentham’s Gaze

A Longitudinal Measurement Study of 4chan’s Politically Incorrect Forum and its Effect on the Web – Bentham’s Gaze: "Ultimately, 4chan and /pol/ are continuously evolving.  Over the past year, the sale of 4chan to Hiroyuki Nishimura, recent rumors of the site struggling with monetization, the introduction of very mild moderation by so-called janitors, or other controversial events like the #GamerGate incident, naturally create shifts in topics and activities, as well as users moving to other, somewhat similar sites (e.g. 8chan). But as the world increasingly looks at 4chan, 4chan will not so silently be looking back — a fact that we can personally attest to." 'via Blog this'

Populism and Privacy - UN Special Rapporteur on Privacy

a.      2015-2017 have seen a growing tendency, especially though not exclusively in Europe, to indulge in “gesture-politics”. In other words, the past eighteen months have seen politicians who wish to be seen to be doing something about security legislating privacy-intrusive powers into being – or legalising existing practices – without in any way demonstrating that this is either a proportionate or indeed an effective way to tackle terrorism.
b.      The new laws introduced are predicated on the psychology of fear: the disproportionate though understandable fear that electorates may have in the face of the threat of terrorism. The level of the fear prevents the electorate from objectively assessing the effectiveness of the privacy-intrusive measures proposed.
c.       There is little or no evidence to persuade the SRP of either the efficacy or the proportionality of some of the extremely privacy-intrusive measures that have been introduced by new surveillance laws in France, Germany, the UK and the USA. Like Judge Robart in the recent case on the immigration ban in the USA, the SRP must seek evidence for the proportionality of the measures provided for by laws[1]. In the same way as Judge Robart asked as to precisely how many cases of terrorism were carried out since 2001 by nationals of the states subjected to the immigration ban, the SRP must ask as to whether it would not be much more proportional, never mind more cost-effective and less privacy-intrusive if more money was spent on the human resources required to carry out targeted surveillance and infiltration and if less effort were expended on electronic surveillance. This, in a time when the vast majority of all terrorist attacks were carried out by suspects already known to the authorities prior to the attacks.
d.      There is also growing evidence that the information held by states, including that collected through bulk acquisition or “mass surveillance” is increasingly vulnerable to being hacked by hostile governments or organised crime. The risk created by the collection of such data has nowhere been demonstrated to be proportional to the reduction of risk achieved by bulk acquisition.
e.       Furthermore, the abuse of data collected by bulk acquisition remains a primary source of concern. Without necessarily casting aspersions on the incoming US administration, the concerns expressed in that context by a senior HRW researcher are worth reproducing: “In the US, the National Security Agency continues its information dragnet on millions of people every day, despite modest reforms in 2015. Now the keys to the world’s most sophisticated surveillance apparatus have been handed over to a candidate (who) threatened to imprison his political opponent, register and ban Muslims, deport millions of immigrants, and menace the free press.”[2] While the checks and balances existing in the USA or indeed the ethical standards of the Executive itself may hopefully push the US away from the realisation of such risks, the point being made here by the SRP is that once the data sets produced by mass surveillance or bulk acquisition exist and a new unscrupulous administration comes into power anywhere in the world, the potential for abuse of such data is such so as to preclude its very collection in the first place.
f.       RECOMMENDATION: Desist from playing the fear card, and improve security through proportionate and effective measures not with unduly disproportionate privacy-intrusive laws “I don’t believe that any form of leadership is best exercised by using fear. True political leadership does not play the fear card” [3]

                    [2]   Cynthia Wong, “Surveillance in the age of populism”, Human Rights Watch, last accessed on 12 February 2017 at
                    [3]   Cardinal Vincent Nichols speaking to the BBC on Sunday 5 February 2017 – Westminster Hour website

Monday, 27 March 2017

Europe will fine Twitter, Facebook, Google etc unless they rip up T&Cs • The Register

Europe will fine Twitter, Facebook, Google etc unless they rip up T&Cs • The Register: "An official from the EC's consumer protection authorities confirmed it intends to "take action to make sure social media companies comply with EU consumer rules."

 Today's crackdown follows a letter sent to tech giants at the end of last year pointing out that the rules users sign up to when they use their services are not consistent with European law and need to be changed.

Those letters resulted in a flurry of activity by the US-based companies, introducing new policies and processes in an effort to head off a formal investigation.

But, as the German government made clear earlier this week when it announced plans to fine them up to €50m for not taking down illegal content within 24 hours, those efforts were not sufficient.

Germany – which remains the most powerful member of the European Union – promised it would also push its efforts to make Facebook, Twitter and friends more accountable in Europe. The decision to push for changes to their terms and conditions appears to be the first stage of that.

 As for the changes requested by the EC, they appear to be focused on pulling out the legal language that the companies use to avoid liability as far as possible.

In particular, the requirement for any user of the services worldwide to sue the company in the state of California – where most of the companies are based and which has a tech-friendly legal system – is top of the list, with the EC saying it needs to be changed so users can sue the company in their home country.

 There is also a push to remove or reform language over consumers waiving their rights, including the ability to cancel a contract – something that would likely change social media companies' ability to claim that anything posted to their networks is their property. And changes have been requested over how the companies determine what is suitable content submitted by users." 'via Blog this'

Sunday, 26 March 2017

European Parliament offers scathing criticism of EU-US Privacy Shield

European Parliament offers scathing criticism of EU-US Privacy Shield: "After the vote, Claude Moraes, the Civil Liberties Committee Chairman, said that “the Civil Liberties Committee resolution adopted today sends a clear message that, while the Privacy Shield contains significant improvements compared to the former EU-US Safe Harbour, key deficiencies remain to be urgently resolved”.

The parliament resolution thus acknowledges significant improvements along with offering scathing criticism of the new agreement. The lack of effective judicial redress for EU citizens in the US is among the issues highlighted. Specifically, the resolution states that “neither the Privacy Shield Principles nor the letters of the U.S. administration providing clarifications and assurances demonstrate the existence of effective judicial redress rights for individuals in the EU whose personal data are transferred to a U.S. organisation under the Privacy Shield Principles”.

The resolution also criticises the fact that “the Ombudsperson mechanism set up by the U.S. Department of State is not sufficiently independent”." 'via Blog this'

Friday, 24 March 2017

Senate votes to let ISPs sell your Web browsing history to advertisers | Ars Technica

Senate votes to let ISPs sell your Web browsing history to advertisers | Ars Technica: "The rules were approved in October 2016 by the Federal Communications Commission's then-Democratic leadership, but are opposed by the FCC's new Republican majority and Republicans in Congress. The Senate today used its power under the Congressional Review Act to ensure that the FCC rulemaking "shall have no force or effect" and to prevent the FCC from issuing similar regulations in the future.

 The House, also controlled by Republicans, would need to vote on the measure before the privacy rules are officially eliminated. President Trump could also preserve the privacy rules by issuing a veto. If the House and Trump agree with the Senate's action, ISPs won't have to seek customer approval before sharing their browsing histories and other private information with advertisers." 'via Blog this'

Free Speech and Protected Privacy: Balancing Two Human Rights 5 April 1pm

Free Speech and Protected Privacy: Balancing Two Human Rights : News and events : ... : Law : University of Sussex: "Free Speech and Protected Privacy: Balancing Two Human Rights
Wednesday 5 April 13:00 until 14:30
Ashdown House, Room 101

Speaker: Hugh Tomlinson QC, Matrix Chambers

Part of the series: Sussex Centre for Human Rights Research

Hugh Tomlinson QC, a member of Matrix Chambers, is a noted specialist in media and information law including defamation, confidence, privacy and data protection. " 'via Blog this'

Thursday, 23 March 2017

Thank heavens the wrangling over BT's Openreach separation has ended • The Register

Thank heavens the wrangling over BT's Openreach separation has ended • The Register: "What hasn’t changed under the legal separation, as opposed to a structural one, is where Openreach’s profits go, with Shurmer noting they “will flow back to the BT Group”. The group's budget will also be controlled by BT. In terms of investment, the announcement will make no difference to BT’s current broadband roll-out plans. “This agreement is based on the guidance we have already given the City around our investment plans, so there is no change there.”

The biz is currently committed to connecting 10 million customers to its ultrafast hybrid fibre-and-copper G.fast technology, and 2 million to “pure fibre” connections, by 2020. Critics have said the biz is relying too much on G.fast over full fibre.

 However, Shurmer hinted the new structure could help boost further investment. "But what we do have now with this new consultation process is this new approach to developing a business case for future network investment." 'via Blog this'

Home Office admits it's preparing to accept EU ruling on surveillance • The Register

Home Office admits it's preparing to accept EU ruling on surveillance • The Register: "Other than the notable omission of a draft code of practice on communications data alongside the other draft codes published last month, it has been unclear whether the Home Office had paid any attention to the ruling at all – until last Friday, when an IT tender relating to the Investigatory Powers Act made mention of "a new communications data independent authorising body", which was spotted by the Open Rights Group.

 Regarding the new authorising body, a Home Office spokesperson repeated to The Register that it was "disappointed" and "carefully considering [the ruling's] implications".

"The government will vigorously defend the fundamental powers in the Investigatory Powers Act because they are vital to the police and intelligence agencies in arresting criminals, prosecuting paedophiles and preventing terrorist attacks," the spokesperson added. "We will provide Parliament and the courts with an update on our response to the judgment in due course."

 While the ambiguity of "in due course" has become something of a running joke for those asking questions of the department, it did also inform us that although the CJEU ruling was specifically directed at a previous bit of legislation which the Investigatory Powers Act replaced, DRIPA, it was currently considering how the ruling would affect the new Snoopers' Charter." 'via Blog this'

Wednesday, 22 March 2017

The world's leading privacy pros talk GDPR with El Reg • The Register

The world's leading privacy pros talk GDPR with El Reg • The Register: "The European Court of Justice ultimately conceded that Safe Harbor was indeed invalid, and suddenly there was no legal basis for American megacorps to continue quaffing Europeans' data. Not that those companies cared, or agreed even. Facebook, Microsoft, and Salesforce have continued to shuttle Zuckabytes back home through "model clauses" contracts, a measure which is again being challenged by Schrems.

 Even if this workaround is shot down during the ongoing court case in Dublin, however, the EU and US share much in terms of cultural values regarding privacy, suggested Hughes." 'via Blog this'

Monday, 20 March 2017

Selling your soul while negotiating the conditions: from notice and consent to data control by design | SpringerLink

Selling your soul while negotiating the conditions: from notice and consent to data control by design | SpringerLink: "This article claims that the Notice and Consent (N&C) approach is not efficient to protect the privacy of personal data. On the contrary, N&C could be seen as a license to freely exploit the individual’s personal data. For this reason, legislators and regulators around the world have been advocating for different and more efficient safeguards, notably through the implementation of the Privacy by Design (PbD) concept, which is predicated on the assumption that privacy cannot be assured solely by compliance with regulatory frameworks. In this sense, PbD affirms that privacy should become a key concern for developers and organisations alike, thus permeating new products and services as well as the organisational modi operandi.

Through this paper, we aim at uncovering evidence of the inefficiency of the N&C approach, as well as the possibility to further enhance PbD, in order to provide the individual with increased control on her personal data. The paper aims at shifting the focus of the discussion from “take it or leave it” contracts to concrete solutions aimed at empowering individuals. As such, we are putting forth the Data Control by Design (DCD) concept, which we see as an essential complement to N&C and PbD approaches advocated by data-protection regulators. The technical mechanisms that would enable DCD are currently available (for example, User Managed Access (UMA) v1.0.1 Core Protocol).

We, therefore, argue that data protection frameworks should foster the adoption of DCD mechanisms in conjunction with PbD approaches, and privacy protections should be designed in a way that allows every individual to utilise interoperable DCD tools to efficiently manage the privacy of her personal data. After having scrutinised the N&C, PbD and DCD approaches we discuss the specificities of health and genetic data, and the role of DCD in this context, stressing that the sensitivity of genetic and health data requires special scrutiny from regulators and developers alike. In conclusion, we argue that concrete solutions allowing for DCD already exist and that policy makers should join efforts together with other stakeholders to foster the concrete adoption of the DCD approach." 'via Blog this'
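UMA itself is a full OAuth-based protocol; the toy sketch below only illustrates the core Data Control by Design idea the abstract describes: the data subject holds a machine-readable policy that is consulted, deny-by-default, before any party may process a given category of her data. All names, categories and parties here are hypothetical.

```python
# Hypothetical per-user policy: data category -> requesting party -> allowed?
user_policy = {
    "location": {"ad-network.example": False, "health-app.example": True},
    "genome":   {"ad-network.example": False, "health-app.example": False},
}

def may_process(data_category, requesting_party):
    """Deny by default; the individual's own policy is the source of truth."""
    return user_policy.get(data_category, {}).get(requesting_party, False)

print(may_process("location", "health-app.example"))  # True
print(may_process("genome", "health-app.example"))    # False: sensitive data needs explicit opt-in
```

Note the contrast with Notice and Consent: nothing is processable merely because a privacy notice was clicked through; unknown categories and unknown parties fall through to a refusal.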

Co-regulation in EU personal data protection: the case of technical standards and the privacy by design standardisation 'mandate' | Kamara | European Journal of Law and Technology

Co-regulation in EU personal data protection: the case of technical standards and the privacy by design standardisation 'mandate' | Kamara | European Journal of Law and Technology: "The recently adopted General Data Protection Regulation (GDPR), a technology-neutral law, endorses self-regulatory instruments, such as certification and technical standards. Even before the adoption of the General Data Protection Regulation, standardisation activity in the field of privacy management and data security had emerged.

In 2015, the European Commission issued the first standardisation request to the European Standardisation Organisations to develop privacy management standards based on art. 8 of the EU Charter of Fundamental Rights.

There is a rising shift from command-and-control regulation to the inclusion of co-regulation tools in the EU data protection legislation. The aim of this article is to provide insights on the role of standardisation as a form of co-regulation in the data protection context. " 'via Blog this'

Cyberleagle: The Investigatory Powers Act - swan or turkey?

Cyberleagle: The Investigatory Powers Act - swan or turkey?: "Over 300 pages make up what then Prime Minister David Cameron described as the most important Bill of the last Parliament. When it comes into force the IP Act will replace much of RIPA (the Regulation of Investigatory Powers Act 2000), described by David Anderson Q.C.’s report A Question of Trust as ‘incomprehensible to all but a tiny band of initiates’. It will also supersede a batch of non-RIPA powers that had been exercised in secret over many years - some, so the Investigatory Powers Tribunal has found, on the basis of an insufficiently clear legal framework. 

None of this would have occurred but for the 2013 Snowden revelations of the scale of GCHQ’s use of bulk interception powers. Two years post-Snowden the government was still acknowledging previously unknown (except to those in the know) uses of opaque statutory powers. 

Three Reviews and several Parliamentary Committees later, it remains a matter of opinion whether the thousands of hours of labour that went into the Act have brought forth a swan or a turkey. If the lengthy incubation has produced a swan, it is one whose feathers are already looking distinctly ruffled following the CJEU judgment in Watson/Tele2, issued three weeks after Royal Assent. That decision will at a minimum require the data retention aspects of the Act to be substantially amended. " 'via Blog this'

YouTube Censors Everyone: Feminists, LGBT Vloggers, Pundits and Gamers | Heat Street

YouTube Censors Everyone: Feminists, LGBT Vloggers, Pundits and Gamers | Heat Street: "YouTube has caved in to calls for content restrictions and censorship on its platform, implementing an optional new feature called “restricted mode”.

It’s designed to censor indecent material — the kind that advertisers do not wish to be associated with.

According to Google, the optional feature “uses community flagging, age-restrictions, and other signals to identify and filter out potentially inappropriate content.”

It’s a feature that’s been around for at least a year, but YouTube producers haven’t been feeling the hurt until now.

Since YouTube ramped up the mode’s restrictions, several LGBT bloggers discovered that their content was blocked, and accused the platform of hiding their videos." 'via Blog this'

Friday, 17 March 2017

Algorithms in decision-making inquiry launched - UK Parliament

Algorithms in decision-making inquiry launched - News from Parliament - UK Parliament: "The Committee would welcome written submissions by Friday 21 April 2017 on the following points:

 The extent of current and future use of algorithms in decision-making in Government and public bodies, businesses and others, and the corresponding risks and opportunities;

Whether 'good practice' in algorithmic decision-making can be identified and spread, including in terms of:
—  The scope for algorithmic decision-making to eliminate, introduce or amplify biases or discrimination, and how any such bias can be detected and overcome;

Whether and how algorithmic decision-making can be conducted in a ‘transparent’ or ‘accountable’ way, and the scope for decisions made by an algorithm to be fully understood and challenged;

The implications of increased transparency in terms of copyright and commercial sensitivity, and protection of an individual’s data;

Methods for providing regulatory oversight of algorithmic decision-making, such as the rights described in the EU General Data Protection Regulation 2016.

The Committee would welcome views on the issues above, and submissions that illustrate how the issues vary by context through case studies of the use of algorithmic decision-making." 'via Blog this'
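On the Committee's question of how bias in algorithmic decision-making "can be detected", one widely used heuristic (not named in the inquiry itself) is the "four-fifths rule": compare selection rates between groups and flag a ratio below 0.8. The data below is entirely hypothetical.

```python
def selection_rate(decisions):
    """Fraction of favourable outcomes in a list of booleans."""
    return sum(decisions) / len(decisions)

def four_fifths_check(group_a, group_b, threshold=0.8):
    """Return (ratio, passes): the lower selection rate divided by the
    higher. A ratio below the threshold is a common red flag for
    disparate impact, though not proof of discrimination."""
    ra, rb = selection_rate(group_a), selection_rate(group_b)
    lo, hi = min(ra, rb), max(ra, rb)
    ratio = lo / hi if hi else 1.0
    return ratio, ratio >= threshold

# Hypothetical outcomes of an automated decision (True = approved)
group_a = [True] * 60 + [False] * 40   # 60% selection rate
group_b = [True] * 30 + [False] * 70   # 30% selection rate

ratio, ok = four_fifths_check(group_a, group_b)
print(round(ratio, 2), ok)  # 0.5 False -> potential disparate impact
```

Such a check is cheap to run, which is partly why transparency (access to the decisions themselves) matters more than access to the algorithm's source code.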

DeepMind AI faces privacy questions about its data deal with the NHS | WIRED UK

DeepMind faces privacy questions about its data deal with the NHS | WIRED UK: "The mostly-silent centre of arguments is the Information Commissioner's Office (ICO), which oversees data protection issues in the UK. The body has been investigating the DeepMind and NHS deal since initial complaints were made.

The ICO confirmed to WIRED that its investigation into the sharing of patient information was close to finishing.

"We continue to work with the National Data Guardian and have been in regular contact with the Royal Free and Deep Mind who have provided information about the development of the Streams app," the ICO said. "This has been subject to detailed review as part of our investigation. It’s the responsibility of businesses and organisations to comply with data protection law.”" 'via Blog this'

Thursday, 16 March 2017

Advertisers look forward to buying your Web browsing history from ISPs | Ars Technica

Advertisers look forward to buying your Web browsing history from ISPs | Ars Technica: "If no agency enforces privacy rules, "consumers will have no ability to stop Internet service providers from invading their privacy and selling sensitive information about their health, finances, and children to advertisers, insurers, data brokers or others who can profit off of this personal information, all without their affirmative consent," Sen. Edward Markey (D-Mass.) said last week.

 Acting FTC Chairwoman Maureen Ohlhausen said last year that the FTC recommends getting opt-in consent for "unexpected collection or use of consumers’ sensitive data such as Social Security numbers, financial information, and information about children," and an opt-out system for other data, she wrote. Under that scenario, ISPs apparently would not need opt-in consent from customers before sharing Web browsing history." 'via Blog this'

Wednesday, 15 March 2017

Data hungry gov’t vows to eyeball data offences in woolly digital pledge | Ars Technica UK

Data hungry gov’t vows to eyeball data offences in woolly digital pledge | Ars Technica UK: "Digital minister Matt Hancock has previously said that the government would implement the GDPR "in full"—a vow repeated in the DCMS' digital strategy, which highlights concerns about the transfer of data between the UK and European Union once Brexit kicks in.

"As part of our plans for the UK’s exit from the EU, we will be seeking to ensure that data flows remain uninterrupted, and will be considering all the available options that will provide legal certainty for businesses and individuals alike," it said.

 Britain's data watchdog, the Information Commissioner's Office, told Ars that the DCMS was leading a review of data protection offences. It declined to comment, however, on how such a review might affect the controversial Part 5 of the Digital Economy Bill." 'via Blog this'

Tuesday, 14 March 2017

GDPR, the proposed Copyright Directive and intermediary liability: one more time! | Peep Beep!

The GDPR, the proposed Copyright Directive and intermediary liability: one more time! | Peep Beep!: "One way to make sense of the GDPR could be to say that it implicitly acknowledges that the E-Commerce Directive liability exemptions should apply even in situations in which the service provider is (primarily) liable as a data controller.

 Note that the Court of Appeal in Northern Ireland did not wait for the GDPR to hold that Facebook, as a data controller and an information society provider, could avail itself of the national transposition of Article 14 of the E-Commerce Directive in CG v Facebook Ireland Ltd & Anor [2016] NICA 54 (21 December 2016).

 Such an interpretation is sensible, although if the characterisation of data controller is retained it would seem logical [but who is interested in logic?] to conclude after Google Spain that the processing performed by Facebook should therefore be distinct from the processing performed by the uploader of the information.

 However because Articles 12-14, strictly speaking, only target one specific situation: liability for the (unlawful) information transmitted or stored by their users, a cumulative application of EU data protection law and e.g. Article 14 of the E-Commerce Directive could appear odd in some instances, e.g. in the case of a search engine referencing content lawfully published." 'via Blog this'

AI, machine learning and personal data | ICO Blog

AI, machine learning and personal data | ICO Blog: "When the General Data Protection Regulation (GDPR) comes into force in 2018, the regulatory toolkit will be sharpened. Some key changes will be:

  1.  more powerful rights for individuals, including rights in relation to automated decisions and profiling; 
  2. new accountability provisions, including the implementation of codes of conduct and certification mechanisms that will help to improve standards and hold organisations to account in areas such as automated decision making; 
  3.  increased enforcement powers for the ICO, including the ability to issue fines of up to €20,000,000 or 4% of annual worldwide turnover for infringements of the regulation. 

These changes, and more, will contribute towards a relevant and effective regime for the regulation of personal data in the world of big data, AI and machine learning." 'via Blog this'
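The headline fine cap quoted above is "whichever is higher" of the two limbs, which is worth spelling out as arithmetic (turnover figures below are invented):

```python
def gdpr_max_fine(annual_turnover_eur):
    """Upper bound of a top-tier GDPR fine: EUR 20m or 4% of annual
    worldwide turnover, whichever is higher."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# For a company turning over EUR 1bn, the 4% limb dominates:
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
# For a smaller firm, the flat EUR 20m cap applies:
print(gdpr_max_fine(100_000_000))    # 20000000
```

The 4% limb is what gives the regime teeth against the largest data controllers, for whom a flat €20m would be a rounding error.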

Monday, 13 March 2017

I invented the web. Here are three things we need to change to save it: Tim Berners-Lee

I invented the web. Here are three things we need to change to save it | Tim Berners-Lee | Technology | The Guardian: "Through collaboration with – or coercion of – companies, governments are also increasingly watching our every move online and passing extreme laws that trample on our rights to privacy. In repressive regimes, it’s easy to see the harm that can be caused – bloggers can be arrested or killed, and political opponents can be monitored. But even in countries where we believe governments have citizens’ best interests at heart, watching everyone all the time is simply going too far. It creates a chilling effect on free speech and stops the web from being used as a space to explore important topics, such as sensitive health issues, sexuality or religion." 'via Blog this'

ICO Upholds £1,000 Fine Against TalkTalk for Personal Data Breach

ICO Upholds £1,000 Fine Against TalkTalk for Personal Data Breach - ISPreview UK: "ICO then raised the issue with TalkTalk on 20th November and the ISP confirmed reception of that letter. However it then took until 27th November before TalkTalk’s Information Security Officer, Mike Rabbitt, was able to confirm that an investigation had been started, although they didn’t officially confirm that a data breach had occurred until 1st December.

TalkTalk claims that the delay in reporting the breach was because “the incident had not been reported to either [TalkTalk’s] Information Security or Fraud team.”

In February 2016 the ICO informed TalkTalk that they intended to impose a fine for the reporting failure, which TalkTalk opposed and ultimately the case went to appeal.

 Suffice to say that the Tribunal was unanimous in dismissing TalkTalk’s appeal." 'via Blog this'

Sunday, 12 March 2017

Video of ICO Elizabeth Denham discusses GDPR | ICAEW

Information commissioner Elizabeth Denham discusses GDPR | ICAEW: "In a wide-ranging speech, the commissioner noted that however fast regulation moves, technology moves faster. She outlined the new General Data Protection Regulation (GDPR) which will be with us in May 2018 and the important role that ICAEW members have to play in spreading the word about the new requirements" 'via Blog this'

CJEU judgment in Watson « Independent Reviewer of Terrorism Legislation

CJEU judgment in Watson « Independent Reviewer of Terrorism Legislation: "The CJEU considered that DRIPA 2014 “exceeds the limit of what is strictly necessary and cannot be considered to be justified, within a democratic society“: para 107.  But it referred the case back to the English Court of Appeal for a decision on the extent to which UK law is consistent with EU requirements (para 124).  The  battle will resume there in the New Year.

The case (Case C-698/15) was joined with a Swedish case brought by Tele2 Sverige AB (Case C-203/15)." 'via Blog this'

Wednesday, 8 March 2017

Dubliner who is the CIA's go-to smart guy for cyber security tech start-ups

Meet the Dubliner who is the CIA's go-to smart guy for cyber security tech start-ups - "Paladin is focused on several aspects of cyber security, he says. "If you think about it, we've benefited enormously from the internet in a very short space of time, and as cyber security threats grow, we're only perhaps now realising the true cost of that. The Internet of Things brings a whole new set of security concerns, so that's one obvious area we're looking at. Blockchain - a system for permanently storing transaction records on networks of unrelated computers permanently and verifiably - is another area of interest, particularly for 'know your client' functions and how it may provide greater security for customers.

"Enterprise IT and its operation of secure transactions is another one. A key one is threat analysis - the use of data to understand what's going on that might threaten a company's IP and operations. It's about how data is analysed, used and protected; how do transactions take place, is it seamless and who is storing data. The final one is how secure information interfaces with genomic or gene sequencing in the diagnostics and therapeutics functions related to health.

 "What we know for certain is that there's a constantly evolving set of threats against our personal data and that of corporates and governments. The reaction to that is a set of innovations, we want to invest in that innovation and the market is large and growing. The threat faced by businesses is often existential. This isn't just an IT problem, it's one of which a CEO is now constantly aware."" 'via Blog this'

Wikileaks 'reveals CIA hacking tools' - BBC News

Wikileaks 'reveals CIA hacking tools' - BBC News: "There is a huge amount of information in the CIA data dump but a lot of it, such as its apparent success in compromising smart TVs, is not that surprising. Lone researchers have managed similar hacks, so smart government agents were always going to be able to go further.
Plus, we kind of know that a lot of the modern internet-of-things gear is broken as all kinds of holes have been found in all kinds of gadgets - including cars.

What's more interesting is the work said to have been done on iPhone and Android handsets. That's because Apple works hard to make sure iOS is secure and Google has made a real effort lately to secure its operating system. For a spy agency, access to those gadgets is key because they travel everywhere with a target.
What is likely to hit the CIA the hardest is losing control of all the zero day exploits and malware detailed in the papers." 'via Blog this'

Monday, 6 March 2017

Copyright and Open Access: A Sussex Humanities Lab Lunchtime Debate : Sussex Humanities Lab : University of Sussex

Copyright and Open Access: A Sussex Humanities Lab Lunchtime Debate : Sussex Humanities Lab : University of Sussex: "In light of the changing policy on copyright being pursued by the University, and the changing IP environment for higher education, the Sussex Humanities Lab is hosting a debate between Prof David Berry and Prof Tim Hitchcock" 'via Blog this'

About Internet of Things research: PETRAS

About | PETRAS: "The PETRAS Internet of Things Research Hub is a consortium of nine leading UK universities which will work together over the next three years to explore critical issues in privacy, ethics, trust, reliability, acceptability, and security." 'via Blog this'

Master spy behind Snoopers’ Charter wants to gag leakers, journalists | Ars Technica UK

Master spy behind Snoopers’ Charter wants to gag leakers, journalists | Ars Technica UK: "Hancock, these days, is the government's cheerleader for the Digital Economy Bill—which is currently winging its way with ease through parliament. However, controversial provisions within Part 5 of the draft law fail to offer any safeguards for plans to share citizen data more widely. And everyone from privacy campaigners to doctors are deeply concerned about the government's plans.

The draft law is name-checked a number of times in the Law Commission's Protection of Official Data review, where it explores the wobbly "legislative landscape" on personal information disclosure offences in the UK. "The provisions contained in the Digital Economy Bill do not streamline the legislative landscape, but rather add to it. From a theoretical perspective the legislative landscape looks irrational, dispersed, and lacking in uniformity," it said.
It went on to discuss the "practical implications" by arguing that "the potential for the offences to overlap is likely to be increased when the Digital Economy Bill receives the Royal Assent," seemingly in a clear acknowledgement that more leaks of sensitive government information will take place.

Notably, the Law Commission failed to once mention the EU's upcoming General Data Protection Regulation, which Hancock has said will be implemented in full in 2018—in part to allow online businesses to continue to transfer data between the UK and the soon-to-be 27-member state bloc." 'via Blog this'

Understanding the Consumer Review Fairness Act of 2016 by Eric Goldman :: SSRN

Understanding the Consumer Review Fairness Act of 2016 by Eric Goldman :: SSRN: "Anti-review clauses distort the marketplace benefits society gets from consumer reviews by suppressing peer feedback from prospective consumers, which in turn helps poor vendors stay in business and diminishes the returns that good vendors get from investments in quality (thus degrading their willingness to make those investments).

 Recognizing the threats posed by anti-review clauses, Congress banned them in the Consumer Review Fairness Act of 2016 (the CRFA). As the House Report explains, the law seeks “to preserve the credibility and value of online consumer reviews by prohibiting non-disparagement clauses restricting negative, yet truthful, reviews of products and services by consumers.” By doing so, the CRFA helps advance the effective functioning of marketplaces." 'via Blog this'

Privacy: Ten More Questions for President Trump-Lawfare

Ten More Questions for President Trump - Lawfare:

"You say that you “bet a good lawyer could make a great case out of the fact that President Obama was tapping my phones in October, just prior to Election!” Are you planning to bring suit against Obama or anyone else under either 50 U.S.C. § 1810—which provides for civil remedies for “[a]n aggrieved person, other than a foreign power or an agent of a foreign power . . . who has been subjected to an electronic surveillance”—or under 18 U.S.C. § 2520—which provides that “any person whose wire, oral, or electronic communication is intercepted . . . in violation of [criminal wiretap law] may in a civil action recover from the person or entity, other than the United States, which engaged in that violation”?

To the extent no such surveillance took place or you have grossly mischaracterized it, do you have any concerns that you might have imputed grave misconduct to your predecessor—in the language of New York Times v. Sullivan—with “‘actual malice’—that is, with knowledge that it was false or with reckless disregard of whether it was false or not”?" 'via Blog this'

Google’s Artificial Brain Learns to Find Cat Videos: WIRED

Google’s Artificial Brain Learns to Find Cat Videos | WIRED: "Since coming out to the public in 2011, the secretive Google X lab — thought to be located in the California Bay Area — has released research on the Internet of Things, a space elevator and autonomous driving.

 Its latest venture, though not nearing the number of neurons in the human brain (thought to be over 80 billion), is one of the world’s most advanced brain simulators. In 2009, IBM developed a brain simulator that replicated one billion human brain neurons connected by ten trillion synapses.

 However, Google’s latest offering appears to be the first to identify objects without hints and additional information. " 'via Blog this'

Sunday, 5 March 2017

U.S. Government’s Privacy Watchdog Is Basically Dead, Emails Reveal

The U.S. Government’s Privacy Watchdog Is Basically Dead, Emails Reveal: "One key item on PCLOB’s agenda for the near future was helping ensure that privacy rights were protected in the course of implementing a pact called Privacy Shield, which would allow corporate information transfers to the U.S. from within the European Union. The U.S. government reassured Europeans, fearful of American surveillance programs, that PCLOB would be involved in overseeing such transfers.

But with only one member, that’s unlikely, says Jake Laperruque, senior counsel at the legal think tank The Constitution Project. “PCLOB falling away may be another nail in the coffin for the US-EU Privacy Shield unless Congress gets serious” about reforming other areas of surveillance policy, he wrote in an email to The Intercept." 'via Blog this'

Wednesday, 1 March 2017

ACS:Law: When bad things happen to bad people – TechnoLlama

ACS:Law: When bad things happen to bad people – TechnoLlama: "This being the Internet, the first thing some enterprising souls did was to copy the data and to start sharing it online immediately through torrent sites (as of writing, the file is still there, but I will not link to it for reasons that will become obvious).

The emails contained some potentially embarrassing details about the practice at ACS:Law, particularly some indication that the firm targeted married men and pensioners with the gay porn allegations, hoping that it would prompt unquestioning payment from the accused. In other words, blackmail and extortion, using copyright as an excuse to obtain easy money from unsuspecting victims." 'via Blog this'

Tuesday, 28 February 2017

Over half of the world's internet traffic isn't coming from humans: AI & Law

Over half of the world's internet traffic isn't coming from humans:

"According to a new report from Imperva, more than half of all internet traffic now comes from bots — software applications designed to do everything from posting Pro-Trump messages on Twitter to crawling the web to deliver better search results.

Bots, these days, are capable of just about any repetitive online process humans are, and much like humans, they’re a mixed bag. Some are good, others are detrimental.

The good news is, good bot use grew considerably in 2016 while bad bot traffic remained the same." 'via Blog this'
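A first-pass version of the traffic classification behind reports like Imperva's is simple User-Agent matching, sketched below with invented signatures and log entries. (Real measurement is much harder: "bad" bots routinely spoof browser User-Agents, so production systems also use behavioural and network signals.)

```python
# Hypothetical signature list -- real lists are far larger and curated.
KNOWN_BOT_SIGNATURES = ("googlebot", "bingbot", "crawler", "spider", "bot/")

def is_bot(user_agent):
    """Crude classifier: does the User-Agent match a known bot signature?"""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

# Invented request log
requests_log = [
    "Mozilla/5.0 (Windows NT 10.0) Chrome/56.0",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "Mozilla/5.0 (compatible; bingbot/2.0)",
]

bot_share = sum(is_bot(ua) for ua in requests_log) / len(requests_log)
print(f"{bot_share:.0%} of sampled requests came from bots")  # 67%
```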

Platform Regulation: How advertising fuels fake news | LSE Media Policy Project

How advertising fuels fake news | LSE Media Policy Project: "It could of course be argued that it was always the case that cheap, vivid-if-dubious content paid, which is why of course newspapers always published ‘fake news’ that would attract idle consumers standing at supermarket checkouts. But the new system bypasses the checks and ethical balances that had evolved in most Western press systems: freedom of the press was always subject to balancing rights, and self-regulation and professional ethics which encouraged accuracy and responsible journalism.

The platforms also benefit. They are dependent on consumers spending more time with sticky content, and also arguably benefit from the current power shift away from traditional news publishers. Ad agencies, ad networks, and the networks and service providers that provide net access to consumers all benefit.

There is a reason that the impacts are being felt globally and have even been linked to ‘post truth’ politics more generally: This shift in advertising models is not something that is happening at the margins: it is a massive structural change transforming media systems everywhere." 'via Blog this'

AI’s inflation paradox: FT Alphaville on AI and law

AI’s inflation paradox | FT Alphaville: "From a capital investment perspective that’s a message that asks: why invest in costly capital intensive equipment when it’s so much more cost effective to hire low-paid humans to do the same job? That business model is well exploited by technology companies like Amazon.

As argued on Wednesday, the idea society should fear the invention of robots which would displace humans from low-paid menial work is laughable.

Schumpeterian logic dictates that as long as overall productivity goes up — i.e. these robots are more cost-efficient to operate in that industry than humans — this sort of tech will help us create much more attractive jobs elsewhere.

But of course, if what we’re really doing is spending resources on technology for technology’s sake, which isn’t more efficient than humans at making the essential stuff, it stands to reason an ever greater number of humans must move down the job-quality chain (if not to subsistence-level work) to compensate for the deficit.

You could think of it this way: Allocating robots and AI systems to the luxury of knowledge work only demotes humans to having to do more of the menial work." 'via Blog this'

UK forced to derail Snoopers’ Charter blanket data slurp after EU ruling | Ars Technica UK

UK forced to derail Snoopers’ Charter blanket data slurp after EU ruling | Ars Technica UK: ""The European Court of Justice handed down a judgment relating to the UK’s communications data regime in December. The matter must now be considered by the domestic courts and the consultation on the communications data code of practice has been deferred until this has taken place," a spokesperson confirmed to Ars on Friday.

 A public consultation on the various draft codes of practice required to accompany the Investigatory Powers Act, colloquially known as the Snoopers' Charter, was published with a glaring omission: the blueprint for the home office's communications data code wasn't among the cache of documents released by Whitehall officials.

 Draft codes released back in March last year when the legislation was being scrutinised in parliament have now been "superseded" by those published on Thursday as part of a six-week-long public consultation, the home office said.

However, it was initially silent on why the communications data code had altogether disappeared from view. The missing-in-action draft statutory code should provide detailed guidance to government agencies and ISPs and other comms providers (collectively referred to as CSPs) "on the procedures to be followed when acquisition of communications data takes place," under the provisions laid out in the Investigatory Powers Act 2016.

Ars understands that so-called Internet Connection Records are yet to be captured by CSPs as required under the new law. It seemed clear that the home office had mothballed implementation of those provisions, following the recent ruling from the Court of Justice of the European Union on the "general and indiscriminate" retention of citizens' communications data." 'via Blog this'

Monday, 27 February 2017

Cogent Accidentally Blocks Websites In Global Ham-Fisted Piracy Filtering Effort | Techdirt

Cogent Accidentally Blocks Websites In Global Ham-Fisted Piracy Filtering Effort | Techdirt: "Last week, reports began to emerge that internet users were unable to access The Pirate Bay and other BitTorrent-focused websites. Ultimately it was discovered that this was courtesy of transit provider Cogent, which was blackholing an undetermined number of IP addresses allegedly linked to copyright infringement.

The IP addresses in question didn't belong to the websites -- but to popular CDN provider Cloudflare. All told, Cogent's blockade impacted around twenty different websites -- but the impact was global, with ISP users worldwide unable to access these IP addresses if they traveled the Cogent network.

Initially, Cogent wouldn't comment whatsoever on why this was occurring, but later confirmed to Ars Technica that the company had received a Spanish court order (it's not clear if it's the same 2015 order demanding Cogent block access to a music streaming website).

Cogent was vague about the order itself, but did confirm that The Pirate Bay was blocked -- despite it not being a target of the court order. Subsequent routing checks confirmed the impact was global across Cogent's footprint." 'via Blog this'
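The collateral damage described above follows from how CDNs work: many unrelated sites sit behind the same front-end IP address, so blackholing one IP at the transit level takes down every site that shares it. A minimal sketch (all domain names and addresses here are invented for illustration, not taken from the article):

```python
# Hypothetical mapping of hostnames to shared CDN front-end IPs.
# In practice a single Cloudflare IP can front many customers' sites.
cdn_ip_of = {
    "targeted-site.example": "104.16.0.10",
    "unrelated-blog.example": "104.16.0.10",  # same shared CDN IP
    "small-shop.example": "104.16.0.10",
    "other-site.example": "104.16.0.99",      # different IP, unaffected
}

# The single address named in a blocking order.
blackholed = {"104.16.0.10"}

# Blackholing is IP-level, so every domain resolving to that IP
# goes dark for users routed through the blocking network.
blocked = sorted(d for d, ip in cdn_ip_of.items() if ip in blackholed)
print(blocked)
```

The point of the sketch: the block filter never sees a hostname, only an address, which is why a court order aimed at one site can silently knock out its IP neighbours.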

How to nuke websites you don't like: Slam Google with millions of bogus DMCA takedowns • The Register

How to nuke websites you don't like: Slam Google with millions of bogus DMCA takedowns • The Register: "Which of course raises the question: Why? And the answer is three-fold:

1] There is no reason not to. Once a company has created an automated script to throw out URLs and send them automatically to Google, it is extremely easy and fast to do so. But perhaps more importantly, there is no mechanism for punishing abuse of the system. Companies can send millions of requests and there is no comeback. They can send millions the next day. And the next.

2] They will occasionally get one right. A 0.03 per cent success rate would ruin any other business, even spammers, but to IP lawyers, getting any positive result ever is a good outcome. Especially since they are paid by the hour.

3] It focuses Google's attention on specific websites. In one respect, Google is to blame for this abuse of the system. In talking about its system for handling copyright infringement and DMCA takedowns, the company's legal director for copyright, Fred von Lohmann, told a Congressional hearing on the copyright issue not only that Google "relies on copyright owners to inform us" of infringing material, but that "Google has been demoting sites based on the number of takedown notices they receive from copyright owners." It would have taken big corporations' IP lawyers about three seconds to realize that sending millions of requests – even completely fake ones – for particular websites was likely to achieve their main goal of downgrading them from the first few pages of a Google search. And so that's what they have done." 'via Blog this'
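The economics in point 2 are worth making concrete. A back-of-envelope check (the one-million figure is illustrative; only the 0.03 per cent rate comes from the article):

```python
# Hypothetical daily volume of automated takedown notices.
requests_sent = 1_000_000

# The 0.03 per cent success rate cited in the article.
success_rate = 0.03 / 100

# "Wins" for the senders -- a few hundred valid takedowns...
valid_hits = round(requests_sent * success_rate)
print(valid_hits)

# ...while every one of the million notices still counts as a
# demotion signal against the targeted sites, valid or not.
demotion_signals = requests_sent
print(demotion_signals)
```

That asymmetry is the whole incentive: the cost of a bogus notice is zero, the marginal benefit of each one (demotion pressure) is not.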

Thursday, 23 February 2017

Op-ed: Mark Zuckerberg’s manifesto is a political trainwreck | Ars Technica UK

Op-ed: Mark Zuckerberg’s manifesto is a political trainwreck | Ars Technica UK: "Zuckerberg adds that he's thinking of creating a "worldwide voting system" for Facebook users which could then be used as a template for how "collective decision-making may work in other aspects of the global community." That's a vague formulation. But coming on the heels of his comments about politicians' engagement on Facebook, he sounds like he's floating the idea of turning Facebook into the infrastructure for managing elections.

For anyone who watched the way Facebook fragmented U.S. citizens during the most recent election, this is a chilling thought. But it also makes sense as a way forward for the social media giant." 'via Blog this'

Cloud industry body sets up new data protection code • The Register

Cloud industry body sets up new data protection code • The Register: "Data protection law expert Kuan Hon...said: "This is a very positive step. I hope that this code will be approved for GDPR purposes, whether by the European Commission or a national data protection supervisory authority, to enable transfers to adhering cloud providers, even if they are outside the EU, as well as to help evidence their compliance generally." 

Hon has called for the EU's E-Commerce Directive to be updated to address an anomaly which exposes infrastructure cloud providers to potential liabilities for unlawful handling of personal data by their customers, even if they are not aware of their customers’ activities. She said the anomaly will be more striking when the GDPR takes effect." 'via Blog this'

BMG Rights Mgmt. (US) LLC v. Cox Commc'ns, Inc., No. 1:14-cv-1611, 2017 BL 44014 (E.D. Va. Feb. 14, 2017), Court Opinion

Bloomberg Law - Document - BMG Rights Mgmt. (US) LLC v. Cox Commc'ns, Inc., No. 1:14-cv-1611, 2017 BL 44014 (E.D. Va. Feb. 14, 2017), Court Opinion: Copyright lawyers can be expensive...

"In this case, the jury awarded $25 million in damages for infringement. Separately, BMG incurred more than $10 million in attorney's fees" 'via Blog this'

Wednesday, 22 February 2017

3rd Annual Information Law Research Seminar: Prof. Roger Brownsword 29 March 2pm

The Right to Know and the Right Not to Know Revisited
2.00-4.00pm 29 March Freeman Centre F22
Abstract: In the context of the availability of non-invasive prenatal testing (and the upcoming report by the Nuffield Council on Bioethics on this topic) as well as the systematic genotyping of UK Biobank’s participants, this paper considers the plausibility, basis, scope, and weight of the claim that participants and patients have a right to know as well as a right not to know the results of the genetic analysis undertaken.
Bio: Roger Brownsword holds professorial positions at King’s College London and Bournemouth University, and he is Honorary Professor in Law at Sheffield University. Until his retirement in 2010, he was founding Director of TELOS, an inter-disciplinary research centre at King’s College London that focuses on law, ethics, and technology. He was a member of the Nuffield Council on Bioethics from 2004–2010, he chaired the Ethics and Governance Council for UK Biobank from 2011–2015, he is a member of the UK National Screening Committee and, currently, he is a member of the Royal Society Working Party on Machine Learning. He has published more than a dozen books (most recently the Oxford Handbook of Law, Regulation and Technology [with Eloise Scotford and Karen Yeung]) and some 250 academic papers; he is a member of the editorial board of the Modern Law Review and (with Han Somsen) founding general editor of Law, Innovation and Technology. He was a member of the Law panel for RAE2008 and of the international Law panel for RAE2014 in Hong Kong.