Wednesday, 26 April 2017

AI report fed by DeepMind, Amazon, Uber urges greater access to public sector data sets | TechCrunch

AI report fed by DeepMind, Amazon, Uber urges greater access to public sector data sets | TechCrunch: "Ultimately, the report does call for “urgent consideration” to be given to what it describes as “the ‘careful stewardship’ needed over the next ten years to ensure that the dividends from machine learning… benefit all in UK society.” And it’s true to say, as we’ve said before, that policymakers and regulators do need to step up and start building frameworks and determining rules to ensure machine learning technologists do not have the chance to asset strip the public sector’s crown jewels before they’ve even been valued (not to mention leave future citizens unable to pay for the fancy services that will then be sold back to them, powered by machine learning models freely fatted up on publicly funded data).

 But the suggested 10-year time frame seems disingenuous, to put it mildly. With — for instance — very large quantities of sensitive NHS data already flowing from the public sector into the hands of one of the world’s most market capitalized companies (Alphabet/Google/DeepMind) there would seem to be rather more short-term urgency for policymakers to address this issue — not leave it on the back burner for a decade or so. Indeed, parliamentarians have already been urging action on AI-related concerns like algorithmic accountability." 'via Blog this'

Tuesday, 25 April 2017

These internet firsts will remind you how far we've come - Business Insider

These internet firsts will remind you how far we've come - Business Insider: "In October 1969, UCLA student Charley Kline was attempting to send the word “login” over to the Stanford Research Institute using the internet’s precursor: ARPANET.

At first, the system crashed, only managing to send the letters “l” and “o”. But an hour or so later, the full message was successfully sent and history was made:" 'via Blog this'

Monday, 24 April 2017

SCL: European Net Neutrality, at last?

SCL: European Net Neutrality, at last?: "Luca Belli and Chris Marsden review the long history of developments, and the latest position, on net neutrality in Europe, amid some hopeful signs.

Net neutrality is the principle mandating that internet traffic be managed in a non-discriminatory fashion, in order to fully safeguard internet users' rights. On 30 August 2016, all EU and EEA members finally obtained guidance on how to implement sound net neutrality provisions. The path has been tortuous and uneasy, starting from 'not neutrality', reaching an open Internet compromise and, finally, attaining net neutrality protections. In this article, we aim briefly to recount how net neutrality evolved in Europe and how much significant progress has been made by the recently adopted net neutrality Guidelines." 'via Blog this'

Saturday, 15 April 2017

The Low-Down: Streaming Now Makes Most of the US Music Industry's Revenue

The Low-Down: Streaming Now Makes Most of the US Music Industry's Revenue: "Overall last year, retail revenues from recorded music in the US grew 11.4 percent to $7.7 billion, the biggest gain since 1998, according to the RIAA. Even with such growth the industry is still licking its wounds from the last decade and a half -- sales remain about half what they were in 1999, the heyday of the CD.
Subscriptions, like the monthly fees for Apple Music or Spotify's paid tier, were the biggest money maker at $2.3 billion, and they basically doubled from a year earlier, the RIAA said." 'via Blog this'

Thursday, 13 April 2017

FCA Publishes Discussion Paper on the Regulation of DLT (blockchains)

FCA Publishes Discussion Paper on the Regulation of DLT: "The FCA continues its ‘wait-and-see’ approach before considering changes to its framework. It will instead explore emerging business models and continue to help innovators test-bed solutions in its regulatory sandbox.

The FCA remains technology neutral/agnostic, but it is encouraging to note its approach to resilience and its openness to regulating on technology outcomes, in line with statutory objectives.

The paper also recognises that DLT is not a panacea and that market outcomes like faster payments could be delivered by other technologies. It is indicative however of an increasingly mature approach to technology risk and the paper does recognise DLT’s innovative potential for record-keeping and efficiency.

 With a voluntary standards process also underway and increasing regulatory accommodation, end-users will be more accepting of the increasing trust that DLT affords, allowing benefits around efficiency, transparency and provenance to be fully realised. This much is very encouraging for UK DLT and cements the UK’s position as a global fintech hub with a forward-looking regulatory regime." 'via Blog this'
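
The record-keeping point is easy to see in miniature: at the core of most DLT designs is an append-only log in which each entry commits to its predecessor via a hash, so earlier records cannot be silently rewritten. A minimal sketch in Python, purely illustrative (real distributed ledgers add consensus, replication and digital signatures on top of this):

import hashlib
import json
import time

def entry_hash(entry: dict) -> str:
    # Canonical JSON so the same record always produces the same hash
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

class MiniLedger:
    """Append-only log where each record commits to the previous record's hash."""
    def __init__(self):
        self.entries = []

    def append(self, payload: dict) -> dict:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        record = {"index": len(self.entries), "time": time.time(),
                  "payload": payload, "prev_hash": prev}
        record["hash"] = entry_hash(record)  # hash taken before the "hash" key exists
        self.entries.append(record)
        return record

    def verify(self) -> bool:
        # Tampering with any earlier record breaks every later prev_hash link
        for i, rec in enumerate(self.entries):
            body = {k: v for k, v in rec.items() if k != "hash"}
            if rec["hash"] != entry_hash(body):
                return False
            if i > 0 and rec["prev_hash"] != self.entries[i - 1]["hash"]:
                return False
        return True

ledger = MiniLedger()
ledger.append({"event": "trade settled", "ref": "T-001"})
ledger.append({"event": "payment made", "ref": "P-042"})
assert ledger.verify()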

Wednesday, 12 April 2017

Where to after Watson: The challenges and future of data retention in the UK (BIICL)

Where to after Watson: The challenges and future of data retention in the UK (BIICL): "The judgment of the CJEU in the Watson case was handed down shortly before the year's end in 2016. The determination that member states may not impose on communications providers a general obligation to retain data was applauded by privacy groups and has undoubtedly caused disquiet among those involved in policing and intelligence. What parliamentarians and judges will make of it in the coming months - and, post-Brexit, years - is both uncertain and important.

In this event experts will examine the strengths, weaknesses and implications of the decision, with an eye to rights protections, the need to combat serious crime, and the practicalities of managing both in light of the European Court's decision." 'via Blog this'

Monday, 10 April 2017

Balkinization: Assessing Algorithmic Authority

Balkinization: Assessing Algorithmic Authority: "Compared to these examples, the obscurity at the heart of our "cultural voting machines" (as I call dominant intermediaries) may seem trivial. But when a private entity grows important enough, its own secret laws deserve at least some scrutiny.

 I have little faith that such scrutiny will come any time soon. But until it does, we should not forget that the success of algorithmic authorities depends in large part on their owners' ability to convince us of the importance--not merely the accuracy--of their results. A society that obsesses over the top Google News results has made those results important, and we are ill-advised to assume the reverse (that the results are obsessed over because they are important) without some narrative account of why the algorithm is superior to, say, the “news judgment” of editors at traditional media.

(Algorithmic authority may simply be a way of rewarding engineers (rather than media personalities) for amusing ourselves to death.) " 'via Blog this'

Data Ethics Group - The Alan Turing Institute

Data Ethics Group - The Alan Turing Institute: "Made up of academics specialising in ethics, social science, law, policy-making, and big data and algorithms, the Data Ethics Group will drive the Institute’s research agenda in data ethics, and work across the organisation to provide advice and guidance on ethical best practice in data science.

The Group will work in collaboration with the broader data science community, will support public dialogue on relevant topics, and will set open calls for participation in workshops, as well as public events.

In a connected project, The Alan Turing Institute is participating in the Royal Society and British Academy project on data governance." 'via Blog this'

Do robots have rights? The European Parliament addresses artificial intelligence and robotics

Do robots have rights? The European Parliament addresses artificial intelligence and robotics: "The European Parliament has put forward initial proposals in its resolution on legal rules for machines that are able to act with a high degree of autonomy and take their own decisions through being equipped with AI and having physical freedom of movement.

This will not be the final word on the matter from a legal perspective, and we are still some years away from corresponding laws being enacted. In the meantime, technical development in the field of AI and robotics will not wait for national or European lawmakers and is set to continue unabated. It remains to be seen whether technical progress might not soon overtake the legal discussion.

 Aside from the legal issues surrounding robotics, lawyers will be interested to see how AI finds its way into our own professional lives. There has been a lot of talk recently about legal tech and digital transformation in relation to legal advice. Yet just looking at the numerous new legal issues that arise in connection with AI and robotics, robots appear to be creating as much new work for us on the one hand as intelligent assistants will be able to take over on the other." 'via Blog this'

Sunday, 9 April 2017

International stakeholder engagement - Ofcom

International stakeholder engagement - Ofcom: "Ofcom hosts an International Stakeholders Forum (ISF) every 4 months. This is the primary means through which we aim to update UK stakeholders on our international activities. We also use these as an opportunity to share information, as well as impressions, on international policy developments.  As well as this forum, we hold dedicated spectrum briefing sessions, details of which can be found here.

For further information on these meetings, or if you want to be added to the circulation list, please email ofcom.international@ofcom.org.uk" 'via Blog this'

The ongoing war on encryption – TechnoLlama

The ongoing war on encryption – TechnoLlama: "Calls to have technology firms offer backdoor access to private and encrypted communications must be read as a call to endanger everyone’s communications by making them easier to read by hackers. Moreover, encryption is not proprietary, it is just a clever use of maths, and there is no way that governments will ever be able to ban that.

If somehow an app is made vulnerable, terrorists will move to another method, and we the public will still be left vulnerable.

But you may argue that we should never give up, and that the fight against terrorism is a worthy cause. It certainly is, but we cannot give up our expectations of security on the assumption that somewhere a terrorist is using an encrypted tool to communicate with one another. There is little evidence that this is the case, and even strong evidence to the contrary. The Paris terrorists used unencrypted burner mobile phones to communicate, and also favoured face to face contact.

 We cannot give away our rights based on fables and ignorance." 'via Blog this'
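
The "just maths" point is easy to demonstrate: anyone can write a textbook cipher in a few lines, which is why banning or backdooring particular apps does not remove the capability, it only moves it. A one-time pad sketch in Python, purely illustrative (real systems should use vetted cryptographic libraries, not hand-rolled code):

import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    # XOR every byte with a random key of equal length; the same
    # operation both encrypts and decrypts (a one-time pad)
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"meet at dawn"
key = secrets.token_bytes(len(plaintext))   # random, secret, used only once
ciphertext = otp_xor(plaintext, key)
assert otp_xor(ciphertext, key) == plaintext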

From Smart Cities 1.0 to 2.0: it's not (only) about the tech

From Smart Cities 1.0 to 2.0: it's not (only) about the tech: "Today’s Internet of Things technologies, data analytics platforms and sensor-enabled services are sure to deliver new ways to understand, visualise and analyse the nature and scale of many of our most pressing urban challenges.

 But solving challenges such as waste management, urban liveability and land-use planning will require more than technology investments, data-capture services or digital prototypes. Solutions will also depend on effective long-term partnerships within and beyond government.

While the digital infrastructure is no doubt important, it will be the city governments that invest in new ways to collaborate and co-innovate that will ultimately lead the way in delivering the smarter, more responsive services our cities so desperately need." 'via Blog this'

Saturday, 8 April 2017

Bundeskartellamt 18th Conference on Competition, Berlin, 16 March 2017 | European Commission

Bundeskartellamt 18th Conference on Competition, Berlin, 16 March 2017 | European Commission: "The challenges that automated systems create are very real. If they help companies to fix prices, they really could make our economy work less well for everyone else.

So as competition enforcers, we need to keep an eye out for cartels that use software to work more effectively. If those tools allow companies to enforce their cartels more strictly, we may need to reflect that in the fines that we impose.

And businesses also need to know that when they decide to use an automated system, they will be held responsible for what it does. So they had better know how that system works.

 In The Hitchhiker's Guide to the Galaxy, the Guide in question was a sort of electronic book. Although it was often wildly inaccurate, it was also a huge success. That was partly because of the words printed in big, friendly letters on the cover: “Don't Panic”.

I think that's good advice. We certainly shouldn't panic about the way algorithms are affecting markets." 'via Blog this'
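
The "automated systems" in question can be as mundane as a repricing rule that continuously tracks rivals; the competition concern arises when many sellers run the same rule, or when an agreed floor is quietly coded into it. A deliberately naive sketch (hypothetical function and parameters, not any real pricing tool; prices in pence):

def reprice(own_cost: int, competitor_prices: list[int],
            undercut: int = 1, agreed_floor: int | None = None) -> int:
    """Match the cheapest rival minus a small undercut, never selling below cost.

    If agreed_floor comes from an agreement with rivals rather than a
    unilateral decision, the same code becomes a cartel-enforcement tool.
    """
    target = min(competitor_prices) - undercut
    floor = own_cost if agreed_floor is None else max(own_cost, agreed_floor)
    return max(target, floor)

print(reprice(own_cost=800, competitor_prices=[1050, 990, 1100]))                     # 989
print(reprice(own_cost=800, competitor_prices=[1050, 990, 1100], agreed_floor=1050))  # 1050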

The IPKat: First live blocking order granted in the UK

The IPKat: First live blocking order granted in the UK: "This is an important order that demonstrates how technological advancement prompts a re-consideration of traditional approaches, including whether intermediary injunctions should be only aimed at blocking access to infringing websites [the answer appears to be no, and this order may pave the way to even more creative enforcement strategies in the future].

Arnold J's decision shows how the law - including the law on blocking orders - is subject to evolution. This is also necessary to ensure that the 'high level of protection' which the InfoSoc Directive [from which s97A CDPA derives] intends to provide is actually guaranteed.

As far as the GS Media 'profit-making intention' is concerned, to some extent the view of Arnold J appears narrower than (but practically not dissimilar to) that of other courts that have applied GS Media so far, eg the District Court of Attunda in Sweden [here, here, and here]. Further applications of GS Media by UK courts are, however, keenly awaited." 'via Blog this'

Wednesday, 5 April 2017

Finding Proportionality in Surveillance Laws – Andrew Murray, Inforrm's Blog

Finding Proportionality in Surveillance Laws – Andrew Murray | Inforrm's Blog: "Much of the Bill’s activity is to formalise and restate pre-existing surveillance powers. One of the key criticisms of the extant powers of the security and law enforcement services is that the law lacks clarity. Indeed it was this lack of clarity which led the Investigatory Powers Tribunal to rule in the landmark case of Liberty v GCHQ that the regulations which covered GCHQ’s access to emails and phone records intercepted by the US National Security Agency breached Articles 8 and 10 of the European Convention on Human Rights.

Following a number of strong critiques of the law, including numerous legal challenges, the Government received three reports into the current law: the report of the Intelligence and Security Committee of Parliament, “Privacy and Security: A modern and transparent legal framework”; the report of the Independent Reviewer of Terrorism Legislation, “A Question of Trust”; and the report of the Royal United Services Institute, “A Democratic Licence to Operate”. All three reported deficiencies in the law’s transparency.

 As a result the Bill restates much of the existing law in a way which should be more transparent and which, in theory, should allow for greater democratic and legal oversight of the powers of the security and law enforcement services. In essence the Bill is split into sections: interception, retention, equipment interference and oversight, with each of the three substantive powers split again into targeted and bulk." 'via Blog this'

Tim Berners-Lee: selling private citizens' browsing data is 'disgusting' Guardian

Tim Berners-Lee: selling private citizens' browsing data is 'disgusting' | Technology | The Guardian: "The Twitter folks, who crowed about how great anonymity was for the “Arab spring” – never say that without quotes – then suddenly they find that this anonymity is really not appreciated when it’s used by nasty misogynist bullies and they realize they have to tweak their system to limit not necessarily behavior but the way it propagates. They’ve talked about using AI to distinguish between constructive and unconstructive comments; one possibility is that by tweaking the code in things, you can have a sea change in the way society works." 'via Blog this'

Final Programme: PhD WIP workshop 3 May 11am-1pm

Speakers: 
-  Elif Mendos Kuşkonmaz (Queen Mary University of London): The EU-US PNR Agreement under EU Privacy & DP law
-  Paul Pedley (City, University of London): Protecting the privacy of library users
-  Maria Bjarnadottir (Sussex): Who is the guarantor of human rights on the internet?
Chair: Chris Marsden (Sussex)
Discussants: Nico Zingales (Sussex), Andres Guadamuz (Sussex). 
Logistics: 11am-1pm, 3 May, in the Moot Room, Freeman Building, University of Sussex.
Afternoon Workshop: all PhD attendees are registered to attend the afternoon workshop, 2pm-5.30pm in F22, without charge (programme here), as well as the evening lecture by the Europol Director and the drinks reception in Fulton B at 6.30pm.

UPDATE: Special guest speaker and drinks reception for Annual WIP Seminar

In addition to a packed afternoon of talks - and a morning PhD WIP workshop - we will also be able to attend the Annual Lecture by Rob Wainwright, Director of Europol, whose talk is likely to touch on issues of cybercrime and online liability. This will run 5.30-6.30pm - the afternoon concludes with a free drinks reception outside Fulton B from 6.30pm onwards.

Tuesday, 4 April 2017

Minister explains Rudd's 'necessary hashtags' after week of confusion | Technology | The Guardian

Minister explains Rudd's 'necessary hashtags' after week of confusion | Technology | The Guardian: "PhotoDNA has been successfully used in the fight against online child abuse imagery, but is less well suited to extremist content due to the broader nature of such material. Nonetheless, in December 2016, social media firms including Facebook, Twitter, Google and Microsoft committed to contribute image and video hashes of terrorist content to a shared database, to speed discovery and takedown of material that breaches each site’s terms of service." 'via Blog this'
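
The shared-database mechanism itself is simple: each firm contributes fingerprints (hashes) of known material, and new uploads are checked against the pooled set. PhotoDNA is a proprietary perceptual hash; the sketch below substitutes SHA-256, which only catches byte-identical copies, which is exactly the limitation perceptual hashing is designed to overcome:

import hashlib

def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual hash: SHA-256 matches only exact duplicates
    return hashlib.sha256(data).hexdigest()

# Pooled industry database of hashes contributed by participating firms
shared_hashes = {
    fingerprint(b"<bytes of known terrorist video>"),
    fingerprint(b"<bytes of known abuse image>"),
}

def should_flag(upload: bytes) -> bool:
    """Check an upload against the shared hash set before (or soon after) publishing."""
    return fingerprint(upload) in shared_hashes

print(should_flag(b"<bytes of known terrorist video>"))  # True: exact copy
print(should_flag(b"holiday photo"))                     # False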

2016 No.607: The Open Internet Access (EU Regulation) Regulations 2016

"19.—(1) Where OFCOM determine that there are reasonable grounds for believing that a person
is breaching, or has breached an obligation under Articles 3, 4 or 5 of the EU Regulation or under
these Regulations they may give that person a notification under this regulation.

21.—(1) The amount of a penalty notified under regulation 19 (other than a penalty falling
within regulation 20(5)) is to be such amount as OFCOM determine to be—
(a) appropriate; and
(b) proportionate to the breach in respect of which it is imposed,
but in the case of a breach of an information requirement not exceeding £2,000,000, and in the
case of any other breach of the EU Regulation or these Regulations, not exceeding ten per cent. of
the turnover of the notified person’s relevant business for the relevant period." 'via Blog this'
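
Regulation 21 in effect sets two caps. Expressed as a small function (a sketch of the rule as quoted above, not legal advice; "relevant business" and "relevant period" carry their own statutory definitions):

def max_penalty(information_requirement_breach: bool, relevant_turnover: float) -> float:
    """Upper limit on an OFCOM penalty under regulation 21 as quoted above.

    Breach of an information requirement: capped at £2,000,000.
    Any other breach of the EU Regulation or these Regulations: capped at
    10% of the notified person's relevant business turnover for the relevant period.
    """
    if information_requirement_breach:
        return 2_000_000.0
    return 0.10 * relevant_turnover

print(max_penalty(True, 500_000_000))   # 2000000.0
print(max_penalty(False, 500_000_000))  # 50000000.0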