Paper plane protesters urge Russia to unblock Telegram app | Reuters: "Russia began blocking Telegram on April 16 after the app refused to comply with a court order to grant state security services access to its users’ encrypted messages.
Russia’s FSB Federal Security Service has said it needs access to some of those messages for its work, which includes guarding against militant attacks.
In the process of blocking the app, state watchdog Roskomnadzor also cut off access to a slew of other websites.
Telegram’s founder, Russian entrepreneur Pavel Durov, called for “digital resistance” in response to the decision and promised to fund anyone developing proxies and VPNs to dodge the block." 'via Blog this'
For researchers and students of cyberlaw and Internet regulation. The information law group in IT and IP Law, launched in 2013, led the EC-funded FP7 Internet Science and DG JUSTICE Openlaws projects. The group has strong links to the legal profession through board membership in the Society for Computers and Law and IFCLA conferences. Sussex ITIP Masters degree (LLM), PhD projects, Internet Law and IP Law courses.
Monday, 30 April 2018
Sunday, 29 April 2018
CYBER Algorithms have become so powerful we need a robust, Europe-wide response | Marietje Schaake MEP
Algorithms have become so powerful we need a robust, Europe-wide response | Marietje Schaake | Opinion | The Guardian: "Think about how algorithms can add to discrimination rather than combat it: is it really as easy for a black person to rent or post a room on Airbnb as it is for a white person? How do we prevent bogus conspiracy theories going viral? Is political content, whatever its leanings, treated equally on Facebook? Are users informed about who pays for political ads? Coca-Cola’s recipe may be a secret, but it can be tested for compliance with health requirements. And if hundreds of millions of people suddenly drank nothing but soft drinks, surely public authorities would start raising concerns and work out policies to address them.
None of this means we need new EU regulators or EU regulations to oversee technology platforms. The notion that laws should apply online as they do offline generally holds. But we do have to make sure there is accountability beyond mere promises of better behaviour.
For oversight to be possible, regulators need to be able to assess the workings of algorithms. This can be done in a confidential manner, by empowering telecommunications and competition regulators. Those software codes wouldn’t need to be published, but their workings could be scrutinised. The impact of algorithms could be tested through a form of sampling to assess their intentions, and whether they promote some kinds of content while downplaying others." 'via Blog this'
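The sampling-based audit Schaake describes can be pictured quite concretely. A minimal sketch follows (Python, purely illustrative: "ranker", "test_items" and the "category" field are hypothetical stand-ins for a platform's opaque ranking function and a labelled test corpus, not any real API). It repeatedly feeds random batches of labelled items to the ranker and counts which content categories surface in the top results:

import random
from collections import Counter

def audit_ranking(ranker, test_items, runs=1000, batch_size=50, top_k=10):
    # Repeatedly rank random batches of labelled test items and count
    # how often each content category reaches the top of the results.
    exposure = Counter()
    for _ in range(runs):
        batch = random.sample(test_items, k=min(batch_size, len(test_items)))
        ranked = ranker(batch)              # the platform's opaque ranking function (assumed callable)
        for item in ranked[:top_k]:
            exposure[item["category"]] += 1
    return exposure                         # e.g. Counter({'news': 4200, 'conspiracy': 900, ...})

A regulator comparing such exposure counts against the composition of the input sample would have at least a crude, confidential measure of whether some kinds of content are systematically promoted or downplayed, without the source code ever being published.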
Friday, 27 April 2018
PRIVACY SEMINAR 4: YouTube Community Guidelines enforcement – Google Transparency Report
YouTube Community Guidelines enforcement – Google Transparency Report: "YouTube relies on a combination of people and technology to flag inappropriate content and enforce YouTube’s Community Guidelines. This report provides data on the flags YouTube receives and how we enforce our policies. Flags can come from our automated flagging systems, from members of the Trusted Flagger program (NGOs, government agencies, and individuals) or from users in the broader YouTube community.
We rely on teams from around the world to review flagged videos and remove content that violates our terms; restrict videos (e.g., age-restrict content that may not be appropriate for all audiences); or leave the content live when it doesn’t violate our guidelines.
This chart shows the volume of videos removed by YouTube, by source of first detection (automated flagging or human detection). Flags from human detection can come from a user or a member of YouTube’s Trusted Flagger program. Trusted Flagger program members include individuals, NGOs, and government agencies that are particularly effective at notifying YouTube of content that violates our Community Guidelines." 'via Blog this'
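In data terms, the chart the report describes is simply a tally of removals keyed by the source of first detection. A toy sketch (Python; the field names are invented for illustration and do not reflect YouTube's actual data model):

from collections import Counter

removals = [
    {"video_id": "a1", "first_detection": "automated"},
    {"video_id": "b2", "first_detection": "trusted_flagger"},
    {"video_id": "c3", "first_detection": "user"},
    {"video_id": "d4", "first_detection": "automated"},
]
by_source = Counter(r["first_detection"] for r in removals)
print(by_source)    # Counter({'automated': 2, 'trusted_flagger': 1, 'user': 1})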
REGULATION: FTC lags international counterparts in staffing for privacy enforcement - FTCWatch
FTC lags international counterparts in staffing for privacy enforcement - FTCWatch: "Comparisons between the FTC’s privacy division and data protection regulators in the UK, Japan and other countries must consider that those regulators are stand-alone agencies. But the gap between the spending and staffing of those agencies compared to the FTC’s privacy staff is huge. That is particularly striking given that most of the world’s biggest Internet companies, such as Facebook, Google and Amazon, are based in the US.
The UK’s Information Commissioner’s Office, for example, has a budget of $34 million (£24 million) and a workforce of 520. But the ICO expects its budget to grow to $54 million (£38 million) by 2019, with a headcount of 700 by 2020. The US has five times the population of the UK.
In Ireland, the Data Protection Commissioner has a budget of $14.5 million (€11.7 million), with an expected headcount of 140 by the end of 2018. Ireland has a population of just 4.5 million, but the Irish DPC will be the lead enforcement agency for the GDPR because Facebook, Google and many other multinational Internet companies base their European operations there.
And in Japan, the Personal Information Protection Commission has a budget of $32.2 million (3.5 billion yen), and a staff of 103.
Comparisons of the FTC’s privacy spending with other regulators, such as the FCC in the US and data protection authorities in other countries, show the FTC’s privacy resources are “so pathetic,” said Chris Hoofnagle, a professor at the University of California, Berkeley School of Law, who published a book in 2016 on the history of the FTC.
“Put it this way — the United Kingdom’s ICO, tasked with protecting a much smaller population and on information rights issues only, has a staff of over 500,” Hoofnagle said." 'via Blog this'
PRIVACY Ireland: Digital Age Of Consent Should Be 16, Say Fianna Fáil And Sinn Féin
Digital Age Of Consent Should Be 16, Say Fianna Fáil And Sinn Féin: "Ministers intend to set the digital age of consent at the lowest possible age – 13 years – despite cyber security experts warning it would allow social media firms to gather information about young children, including where they live. But Fianna Fáil has told Extra.ie it will ‘put down an amendment to the Data Protection Bill at committee stage in order to amend the age of digital consent from 13 to 16’.
Jim O’Callaghan, the party’s justice spokesman, said: ‘Fianna Fáil believes that children should be protected as much as possible from data profiling and commercial targeting." 'via Blog this'
CYBER Regulation on the implementation and functioning of the .eu Top Level Domain name | Digital Single Market
Regulation on the implementation and functioning of the .eu Top Level Domain name | Digital Single Market: "Proposal for a Regulation of the European Parliament and of the Council on the implementation and functioning of the .eu Top Level Domain name and repealing Regulation (EC) No 733/2002 and Commission Regulation (EC) No 874/2004" 'via Blog this'
Privacy: RTBF Ruling: High Court Judgment on Delisting
SCL: The ‘Right to be Forgotten’ Ruling: High Court Judgment on Delisting: "The much-anticipated decision in NT 1 & NT 2 v Google LLC [2018] EWHC 799 (QB) was handed down on 13 April 2018. The joint judgment in two separate claims against Google, is the first time the English courts have had to rule on the application of the ‘right to be forgotten’ principle following the decision in Google Spain SL, Google Inc. v Agencia Espanola de Proteccion de Datos (AEPD) and Mario Costeja Gonzalez (Case C-131/12).
This article explores the decision and the ramifications on future delisting requests to Google. The judgment necessarily had to deal with a number of novel issues and discuss the legal approach to such claims." 'via Blog this'
CYBER ‘Worst of Both Worlds’ FOSTA Signed Into Law, Completing Section 230’s Evisceration – Technology & Marketing Law Blog
‘Worst of Both Worlds’ FOSTA Signed Into Law, Completing Section 230’s Evisceration – Technology & Marketing Law Blog: "To recap: even before it became law, we had proof that FOSTA wasn’t needed to prosecute Backpage or ensure victims’ abilities to sue Backpage. This proof was entirely foreseeable, and FOSTA opponents repeatedly told Congress that these developments were likely to occur soon and that slowing down the process would reveal that.
(Here are the predictions I directly presented to Congress: Sex Trafficking Exceptions to Section 230, Balancing Section 230 and Anti-Sex Trafficking Initiatives, and Answers to Questions for the Record). Incredibly, Congress quickly pushed forward despite the failure of its key justifications.
Development #3: FOSTA opponents warned Congress that the law would chill legal speech. Even before the law was signed, this prediction materialized–the Internet has already started shrinking in three major ways:
* sites that catered to sex workers have shuttered. For example, Reddit shut down various subreddits used by sex workers. This has thrown the sex worker community into turmoil and disarray, and many sex workers have raised substantial fears about increased physical safety risks.
* online personals have shrunk, including most prominently Craigslist’s personals section.
* major services, including Google and Microsoft, have taken new steps to ban legal but unwanted content." 'via Blog this'
CYBER: Wisconsin Appeals Court Blows Open Big Holes in CDA S.230–Daniel v. Armslist – Technology & Marketing Law Blog
Wisconsin Appeals Court Blows Open Big Holes in Section 230–Daniel v. Armslist – Technology & Marketing Law Blog: "Congress eviscerated Section 230 via the Worst of Both Worlds FOSTA, but defendants have been doing well with Section 230 defenses over the past year-plus. Then, last week, a Wisconsin appeals court issued a published opinion that massively screws up Section 230 jurisprudence.
I don’t know if the timing is a coincidence or a signal of broader common law retrenchment of Section 230 post-FOSTA. Either way, it’s very troubling.
The case relates to a shooting in the Milwaukee area that killed four people and wounded four others. The shooter found the seller of the gun and ammo on Armslist, an online marketplace for such things, even though the shooter was subject to a court order banning him from owning a gun. (The maxim “if guns are outlawed, only outlaws will have guns” seems vaguely apropos here). The shooter and seller consummated the transaction offline, so Armslist functioned as an online classified advertising service. (Thus, this case doesn’t turn on Armslist functioning like a marketplace; contrast the Airbnb v. SF ruling).
A shooting victim’s estate sued Armslist for negligence for its role in the transaction.
The lower court dismissed the case on Section 230 grounds. The appeals court reversed." 'via Blog this'
CYBER Christine Lagarde: Addressing the Dark Side of the Crypto World: IMF Blog
Addressing the Dark Side of the Crypto World | IMF Blog: "These advances will take years to refine and implement. Two examples highlight the promise of this approach over the long term:
Distributed ledger technology (DLT) can be used to speed up information-sharing between market participants and regulators. Those who have a shared interest in maintaining safe online transactions need to be able to communicate seamlessly.
The technology that enables instant global transactions could be used to create registries of standard, verified, customer information along with digital signatures. Better use of data by governments can also help free up resources for priority needs and reduce tax evasion, including evasion related to cross-border transactions.
Biometrics, artificial intelligence, and cryptography can enhance digital security and identify suspicious transactions in close to real time. This would give law enforcement a leg up in acting fast to stop illegal transactions.
This is one way to help us remove the “pollution” from the crypto-assets ecosystem.
We also need to ensure that the same rules apply to protect consumers in both digital and non-digital transactions. The U.S. Securities and Exchange Commission and other regulators around the world now apply the same laws to some initial coin offerings (ICOs) as they do to offerings of standard securities. This helps to increase transparency and alert buyers to potential risks." 'via Blog this'
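The "registries of standard, verified, customer information along with digital signatures" that Lagarde mentions are straightforward to illustrate. A minimal sketch, assuming the third-party Python "cryptography" package is installed; the record fields are invented for illustration only. A verified customer record is canonicalised and signed, so any participant holding the public key can check it has not been tampered with:

import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()      # held by the registry operator
public_key = private_key.public_key()           # distributed to market participants and regulators

record = {"customer_id": "ACME-0001", "name": "Example Ltd", "kyc_status": "verified"}
payload = json.dumps(record, sort_keys=True).encode("utf-8")

signature = private_key.sign(payload)           # attached to the shared registry entry
public_key.verify(signature, payload)           # raises InvalidSignature if the record was altered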
Thursday, 26 April 2018
CYBER British EU Commissioner: ID check & prior approval for online posts - EDRi
LEAK: British EU Commissioner: ID check & prior approval for online posts - EDRi: "For the past year, Commissioner King and his services have been strongly pushing for “upload filtering (pdf)” – the automatic approval of all uploads in all formats before they are put online.
The aim is to ensure that nothing that was previously removed on the basis of the law, or the arbitrary terms of service of an internet company, or that is or has been assessed as being unwelcome or illegal by a guess made by an AI programme can be uploaded or re-uploaded to the internet.
If the European Commission succeeds in getting this principle accepted by the European Parliament in the Copyright Directive (vote is scheduled for 20-21 June 2018), it plans to rush out new legislation to cover other forms of content within weeks. It seems that some Members of the European Parliament (MEPs) are already being lobbied to push for this.
Paradoxically, while the European Commission uses populist demands about “all parties” making “more efforts and faster progress” on removing “illegal” content, the Commission itself has no idea how many items of allegedly illegal content that were flagged by the EU police cooperation agency Europol led to an investigation or a prosecution – clearly showing a lack of a serious, diligent approach “from all sides”. “From all sides, except ours” might be more accurate.
ID Checks: Now, acting on his own initiative, Commissioner King has decided that “voluntary” identification (by companies that are eager to collect as much data about us as possible) is the next battle – this time in the fight against “online disinformation” (whatever that may mean) and to fight against abuse of data (collecting data as a way of avoiding collected data from being abused). Facebook’s “real-name policy” has previously caused demonstrable harm to vulnerable and marginalised groups.
In the letter, King proposes multiple ways of achieving this control – such as through the WHOIS database of domain name owners, through surveillance of IPv6 internet protocol numbers. The European Court of Human Rights ruled this week (pdf) that a court order is needed to gain access to IP address data." 'via Blog this'
CYBER Online platforms: Commission sets new standards on transparency and fairness
European Commission - PRESS RELEASES - Press release - Online platforms: Commission sets new standards on transparency and fairness: "The new rules will tackle these concerns by:
[1] Increasing transparency: Providers of online intermediation services must ensure that their terms and conditions for professional users are easily understandable and easily available.
This includes setting out in advance the possible reasons why a professional user may be delisted or suspended from a platform. Providers also have to respect a reasonable minimum notice period for implementing changes to the terms and conditions.
If a provider of online intermediation services suspends or terminates all or part of what a business user offers, this provider will need to state the reasons for this. In addition, the providers of these services must formulate and publish general policies on (i) what data generated through their services can be accessed, by whom and under what conditions; (ii) how they treat their own goods or services compared to those offered by their professional users; and (iii) how they use contract clauses to demand the most favourable range or price of products and services offered by their professional users (so-called Most-Favoured-Nation (MFN) clauses). Finally, both online intermediation services as well as online search engines must set out the general criteria that determine how goods and services are ranked in search results.
[2] Resolving disputes more effectively: Providers of online intermediation services are required to set up an internal complaint-handling system. To facilitate out-of-court dispute resolution, all providers of online intermediation services will have to list in their terms and conditions the independent and qualified mediators they are willing to work with in good faith to resolve disputes. The industry will also be encouraged to voluntarily set up specific independent mediators capable of dealing with disputes arising in the context of online intermediation services. Finally, associations representing businesses will be granted the right to bring court proceedings on behalf of businesses to enforce the new transparency and dispute settlement rules.
[3] Setting up an EU Observatory to monitor the impact of the new rules: The Observatory would monitor current as well as emerging issues and opportunities in the digital economy, with a view to enabling the Commission to follow up on today's legislative proposal if appropriate. Particular attention will be paid to developments in policy and regulatory approaches all over Europe.
Depending on the progress achieved and based on the insights gained through the EU Observatory, the Commission will assess the need for further measures within three years." 'via Blog this'
British MPs Just Called Facebook A "Morality-Free Zone" And Likened It To A "Vampire Squid"
British MPs Just Called Facebook A "Morality-Free Zone" And Likened It To A "Vampire Squid": "Throughout the session, Schroepfer also defended Zuckerberg's decision not to fly to London and address the MPs before the committee himself, insisting the CEO was busy "in the office" fixing the problems raised by the recent scandals.
It was left to committee chair Damian Collins to ask about a Politico story published during the session which suggested Zuckerberg had agreed to address MPs of the European parliament in Brussels.
"I did not know that," Schroepfer said. "That is news to me right now."
Collins replied: "He’s obviously found time to get out of the office and go to Brussels."" 'via Blog this'
Hate Speech: After the Toronto attack don't explain Incel ideology, ban it | WIRED UK
After the Toronto attack don't explain Incel ideology, ban it | WIRED UK: "in the UK misogyny is not considered a hate crime, a fact debated by MPs in parliament only last month. If it were, tackling such sites under existing hate crime legislation would be easier, notes Bjarnadóttir. "If this doesn't spark the debate and take further, I seriously question what would," she adds." 'via Blog this'
Blockchains, The Rule of Code vs. The Rule of Law - TechBros by TechSis?
The Rule of Code vs. The Rule of Law - Harvard University Press Blog: "Blockchain technology facilitates the emergence of new self-contained and autonomous systems of rules that create order without law and implement what can be thought of as private regulatory frameworks, which we refer to as lex cryptographica. These systems enable people to communicate, organize, and exchange value on a peer-to-peer basis, with less of a need for intermediary operators. They provide individuals with the opportunity to create a new normative layer or a customized system of code-based rules that can be readily incorporated into the fabric of this new technological construct—thereby making it easier for people to circumvent the law.
Lex cryptographica shares certain similarities with the more traditional means of regulation by code. Both purport to regulate individuals by introducing a specific set of affordances and constraints embedded directly into the fabric of a technological system. Lex cryptographica, however, distinguishes itself from today’s code-based regimes in that it operates autonomously—independently of any government or other centralized authority.
If the vision of blockchain proponents edges toward reality, we may delegate power to technological constructs that could displace current bureaucratic systems, governed by hierarchy and laws, with algocratic systems, governed by deterministic rules dictated by silicon chips, computers, and those that program them. These systems could improve society in demonstrable ways, but they also could restrain rather than enhance individual freedom.
When it comes to freedom and autonomy, the assumption that the rule of code is superior to the rule of law is a delicate one—and one that has yet to be tested. As Lawrence Lessig has already warned, “When government disappears, it’s not as if paradise will take its place. When governments are gone, other interests will take their place.”" 'via Blog this'
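What "code-based rules" means in practice is easiest to see in a toy example. A deliberately simplified sketch follows (Python; real lex cryptographica would run as a smart contract on a blockchain rather than as local code). Once deployed, the release condition is applied mechanically, with no official, court or intermediary in the loop to grant an exception:

class TimeLockedEscrow:
    # A toy "rule of code": funds are released when, and only when,
    # a condition fixed at deployment time is satisfied.
    def __init__(self, amount, release_height):
        self.amount = amount
        self.release_height = release_height    # the rule, frozen into the system
        self.released = False

    def try_release(self, current_block_height):
        if not self.released and current_block_height >= self.release_height:
            self.released = True
            return self.amount                  # payout happens because the code says so
        return 0                                # otherwise nothing anyone argues changes the outcome

escrow = TimeLockedEscrow(amount=100, release_height=500_000)
print(escrow.try_release(499_999))   # 0 - the rule is not yet satisfied
print(escrow.try_release(500_000))   # 100 - released automatically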
Wednesday, 25 April 2018
Academics Against Press Publishers’ Right: 169 European Academics Warn Against It - IVIR
Academics Against Press Publishers’ Right: 169 European Academics Warn Against It - IVIR: "Article 11 of the proposal for a Directive on Copyright in the Digital Single Market, as it currently stands following negotiations in the EU Council and Parliament, is a bad piece of legislation.
Why?
The proposal would likely impede the free flow of information that is of vital importance to democracy. This is because it would create very broad rights of ownership in news and other information. These rights would be territorial – there would be one for each Member State.
The rights would be owned by established institutional producers of news. And in each Member State, the new right would sit on top of all the other property rights that such publishers of news already enjoy: copyrights, database rights, broadcast rights and other related rights.
This proliferation of different rights for established players would make it more expensive for other people to use news content. Transaction costs would be greatly increased, as permissions would need to be sought for virtually any use. Even using the smallest part of a press publication (except perhaps for strictly private use) would mean payment would be due to an institutional news publisher.
That means, the proposal would be likely to harm journalists, photographers, citizen journalists and many other non-institutional creators and producers of news, especially the growing number of freelancers.
The people most likely to benefit would be the big established news institutions. If they should benefit, this is likely to exacerbate existing power asymmetries in media markets that already suffer from worrying levels of concentration in many Member States.
That said, it is not clear that even these big news institutions would benefit. Similar rights introduced in Germany and Spain were not effective.
The proposed right would provide no protection against ‘fake news’.
There is no sound economic case for the introduction of such a right. An additional intellectual property right would not change the fundamental problems that news institutions face. They would still have to compete with many other actors for consumer attention, advertisers and hence revenue."
'via Blog this'
FOSTA/SESTA Becomes Law Despite Strong Opposition –
FOSTA/SESTA Becomes Law Despite Strong Opposition –: "FOSTA/SESTA was the response to a 2016 sex trafficking case involving Backpage.com that was dismissed under Section 230.
The First Circuit concluded that Backpage’s choices about “what content can appear on the website and in what form,” did not amount to content creation, but were instead editorial choices protected under Section 230.
Many viewed this result as an affront to victims of sex trafficking and a loophole for companies like Backpage to profit from trafficking.
Nothing in Section 230 protects online forums from being charged under criminal law. Federal criminal law prohibits knowingly facilitating prostitution, and knowingly selling, soliciting, or advertising the sexual services of victims of sex trafficking." 'via Blog this'
Friday, 20 April 2018
Facebook: Pirate MEP calls for end to targeted advertising "arms race"
See https://twitter.com/Cellular_PP/status/986852470226898945
Tuesday, 17 April 2018
CYBER Cloud Computing: Legal & Regulatory Challenges (Ian Walden) - YouTube
Cloud Computing: Legal & Regulatory Challenges (Ian Walden) - YouTube: "Dr Ian Walden of the Centre for Commercial Law Studies, Queen Mary, University of London, and Of Counsel at Baker & McKenzie, speaking at the Faculty of Law, UCC conference "Regulating Cloud Computing: Clear Skies Ahead" on 16 November 2012" 'via Blog this'
PRIVACY - NT1 and NT2 v Google LLC [2018] EWHC 799 (QB)
One Brick Court - News - NT1 and NT2 v Google LLC [2018] EWHC 799 (QB):
"13/04/2018
Judgment was handed down in the first two “right to be forgotten” claims against Google LLC to be brought in England and Wales. Both claimants sought orders for the removal of Google search results which linked to information about their “spent” criminal convictions. They also sought compensation under the Data Protection Act 1998 (DPA) and damages for misuse of private information.
Warby J rejected NT1’s claim, but in NT2’s claim he made an order to “delist” search results. NT2’s claim of inaccuracy in respect of one of the links, to a national newspaper report, was upheld. The Judge found in respect of other links that “the crime and punishment information has become out of date, irrelevant and of no sufficient legitimate interest to users of Google Search”. NT2 was also successful in his claim for misuse of private information.
Warby J declined to make any award of compensation or damages to NT2. Following the CJEU’s decision in Google Spain, Google had engaged in “an enterprise...committed to compliance with the relevant requirements” and, in the current legal climate, “it would be harsh to say it had failed to take reasonable care to do so”.
Google was therefore entitled to the defence under section 13(3) of the DPA, and for similar reasons, no damages were payable for misuse of private information." 'via Blog this'
"13/04/2018
Judgment was handed down in the first two “right to be forgotten” claims against Google LLC to be brought in England and Wales. Both claimants sought orders for the removal of Google search results which linked to information about their “spent” criminal convictions. They also sought compensation under the Data Protection Act 1998 (DPA) and damages for misuse of private information.
Warby J rejected NT1’s claim, but in NT2’s claim he made an order to “delist” search results. NT2’s claim of inaccuracy in respect of one of the links, to a national newspaper report, was upheld. The Judge found in respect of other links that “the crime and punishment information has become out of date, irrelevant and of no sufficient legitimate interest to users of Google Search”. NT2 was also successful in his claim for misuse of private information.
Warby J declined to make any award of compensation or damages to NT2. Following the CJEU’s decision in Google Spain, Google had engaged in “an enterprise...committed to compliance with the relevant requirements” and, in the current legal climate, “it would be harsh to say it had failed to take reasonable care to do so”.
Google was therefore entitled to the defence under section 13(3) of the DPA, and for similar reasons, no damages were payable for misuse of private information." 'via Blog this'
MEDIA: Communications List User Group Meeting: 15 February 2018 Inforrm's Blog
Media and Communications List User Group Meeting: 15 February 2018 – Paul Magrath | Inforrm's Blog: "One of the matters discussed was the question of transferring cases from other divisions into the Queen’s Bench Division Media and Communications List (M&CL). At present, the M&CL was not a specialist list for the purposes of the Civil Procedure Rules, such as would enable the transfer in of cases. That would require a practice direction from the Lord Chief Justice. It was a matter that might be considered at some future date, but as things stood any transfer had to be done by the court in which the proceedings had been issued.
A recent case where the issue had arisen was Appleby Global Group LLC v British Broadcasting Corporation [2018] EWHC 104 (Ch) (in which claims for damages and a permanent injunction against media defendants arising out of the alleged misuse of confidential financial information in the so-called ‘Paradise Papers’ had been brought in the Chancery Division and the judge, Rose J, declined to transfer the case to the M&CL).
Warby J accepted that there might be cases involving media issues that needed specialist expertise (such as technical company law points) that justified their being heard in another court, such as the Chancery Division." 'via Blog this'
Monday, 16 April 2018
Privacy: Commission guidance on the direct application of the GDPR
EUR-Lex - 52018DC0043 - EN - EUR-Lex: "Communication from the Commission to the European Parliament and the Council
Stronger protection, new opportunities - Commission guidance on the direct application of the General Data Protection Regulation as of 25 May 2018
Introduction
On 6 April 2016, the EU agreed to a major reform of its data protection framework, by adopting the data protection reform package, comprising the General Data Protection Regulation (GDPR) replacing the twenty-year-old Directive 95/46/EC (‘Data Protection Directive’) and the Police Directive.
On 25 May 2018, the new EU-wide data protection instrument, the General Data Protection Regulation ("the Regulation"), will become directly applicable, two years after its adoption and entry into force.
The new Regulation will strengthen the protection of the individual’s right to personal data protection, reflecting the nature of data protection as a fundamental right for the European Union." 'via Blog this'
CYBER Week 11 House of Lords Committee Report: AI in the UK
AI in the UK: — Shorthand Social: "The UK must seek to actively shape AI's development and utilisation, or risk passively acquiescing to its many likely consequences. A shared ethical AI framework is needed to give clarity as to how AI can best be used to benefit individuals and society. By establishing these principles, the UK can lead by example in the international community.
We recommend that the Government convene a global summit of governments, academia and industry to establish international norms for the design, development, regulation and deployment of artificial intelligence.
The prejudices of the past must not be unwittingly built into automated systems, and such systems must be carefully designed from the beginning, with input from as diverse a group of people as possible." 'via Blog this'
Google Spain – 2014 High Court judgment - Norwich Pharmacal
Google Spain – new High Court judgment - Panopticon: "Mr Hegglin is an individual who is resident in Hong Kong, but has previously lived in and retained close connections with the UK. An anonymous person posted abusive and defamatory material concerning Mr Hegglin on a number of websites which were then indexed on Google.
Mr Hegglin went on to bring proceedings against Google Inc under the DPA, including claims under s. 10 (right to prevent processing likely to cause substantial damage or distress) and s. 14 (right to rectification). He sought an injunction requiring Google Inc to block specific sites containing the allegations and a Norwich Pharmacal order was made.
Relying specifically on Google Spain, Bean J held that service of the DPA proceedings could properly be effected on Google Inc. He also held that England was the appropriate forum for the dispute and was also suitable for the trial, particularly as the defamatory remarks risked damage to Mr Hegglin’s reputation in England." 'via Blog this'
How should internet regulation be improved? Committee launches inquiry: UK Parliament
How should internet regulation be improved? Committee launches inquiry - News from Parliament - UK Parliament: "The Committee will explore how the regulation of the internet should be improved, and whether specific regulation is required or whether the existing law is adequate. The inquiry also investigates whether the online platforms have sufficient accountability and transparency, and whether they use fair and effective processes to moderate content.
Over the course of the inquiry the Committee will hear evidence on what information online platforms should provide to consumers about the use of their personal data and what responsibility online platforms should have for the content that they host.
The Committee seeks evidence on questions including:
Is there a need to introduce specific regulation for the internet?
What should be the legal liability of online platforms for the content that they host?
How effective, fair and transparent are online platforms in moderating content that they host?
What role should users play in establishing and maintaining online community standards for content and behaviour?
What effect will the United Kingdom leaving the European Union have on the Government’s regulation of the internet?" 'via Blog this'
Thursday, 12 April 2018
PRIVACY Congress won’t hurt Zuckerberg and Facebook, but GDPR and Europe could | WIRED UK
Congress won’t hurt Zuckerberg and Facebook, but GDPR and Europe could | WIRED UK: "Under pressure to be seen taking action on data protection, Zuckerberg – who, in 2010, remember, argued that privacy was no longer a “social norm” – announced on April 2 that he would like to extend the GDPR’s protections to all Facebook’s users “in spirit”.
No-one knew exactly what this meant – “We’re still nailing down details on this,” Zuckerberg told Reuters – and the backlash was swift. Op-eds were penned; tweets were fired out; US and European consumer groups made demands in open letters.
A few days later, Zuckerberg issued an apologetic (if still extremely unclear) climbdown.
Somehow, GDPR, bane of IT departments and sales tool of dubious “security consultants”, has turned into a political rallying point. It’s a bit like finding that your office HR policy has become the key text for a revolutionary movement." 'via Blog this'
Wednesday, 11 April 2018
EU Commissioner Margrethe Vestager: Facebook is designed to create addiction – like tobacco and alcohol
EU Commissioner Margrethe Vestager: Facebook is designed to create addiction – like tobacco and alcohol: "She is a Twitter user herself but says she has never used Facebook, originally because she felt at the time that it was wrong to “look over the shoulders” of her children online.
She also has a strong antipathy to accepting all sorts of conditions to be able to use social media.
“I just don’t tick that box and go somewhere else. I just get so, argh!.. I just can’t accept it. If you read just the first paragraph of Facebook’s terms of conditions, you will find it unreasonable.”
Margrethe Vestager stresses that the new rules require providers of a service to only ask for the information needed to deliver their service." 'via Blog this'
Monday, 9 April 2018
Data Enforcement Cyprus | Insights | Linklaters
Data Protected Cyprus | Insights | Linklaters: "In Cyprus, there is no current enforcement practice in relation to the GDPR. However, the enforcement of the current law is instructive.
The Commissioner investigates complaints submitted to her office and also launches her own investigations. Criminal proceedings for contraventions of the current legislation (the “DPA”) have been brought in a limited number of cases and there have been a couple of reported convictions.
The most significant civil sanction imposed by the Commissioner under the DPA to date is a fine of €10,000 imposed in November 2017 on the Cyprus Telecommunications Authority (CYTA), the government-owned telecom operator, for failure to implement appropriate organisational measures (revision of employee access rights on change of position within the organisation) to prevent unauthorised access to and disclosure of personal data of a significant number of CYTA clients to a third party. This matter is still being investigated by the police and it is possible that criminal charges may be brought against the employee and/or the third party to whom the data were disclosed.
In an earlier case, a fine of €3,000 and an order to terminate processing and destroy relevant personal data had been imposed on a company that had infringed the proportionality principle under the DPA as more data than necessary was being collected.
A fine of €3,000 has also been imposed on two occasions for failure by the Nicosia General Hospital to take appropriate security measures to protect patient personal data contained in their hospital files from accidental or unintended loss or destruction.
In another case, the Commissioner imposed a fine of €2,562 on a company that had infringed various provisions of the DPA, including by sending advertising text messages without the prior written consent of the data subjects; and failing to notify the Commissioner of the commencement of processing. The same fine of €2,562 was imposed on the Director-General of a government ministry for breach of the DPA provisions on the security of sensitive personal data.
The most significant civil sanction imposed by the Commissioner under the ePrivacy Law (see below) to date was a fine of €8,000 on a person who had repeatedly infringed various provisions of the ePrivacy Law, specifically: (i) the prohibition on the use of electronic mail for direct marketing purposes without the recipient’s prior consent; and (ii) the requirement that the sender’s identity and a valid electronic mail address, to which a request that communications cease may be sent, be included in such electronic mail.
The first criminal proceeding to be reported under the DPA involved the owner of a massage business who had installed a secret video camera without consent of clients and without notification to the Commissioner. The sentence imposed at first instance was three months’ imprisonment, which was reduced to 55 days on appeal.
In a more recent criminal case, a sentence of 16 months’ imprisonment was imposed on an individual for a breach of the prohibition on unauthorised access to, and processing of, personal data. The case involved the unauthorised use of credit card information of other persons for the purpose of illegal money withdrawals.
In another case, the court imposed a criminal fine of €1,200 for the unauthorised dissemination of personal data through social media." 'via Blog this'
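The two ePrivacy conditions quoted above (prior consent to e-mail marketing, plus a disclosed sender identity and a working opt-out address in every message) reduce to a simple pre-send check. Below is a minimal sketch under stated assumptions: the field and function names are hypothetical and are not drawn from the Cypriot ePrivacy Law or from any real mailing library.

```python
# Illustrative sketch only: names and fields are hypothetical, not taken
# from the Cypriot ePrivacy Law or any real e-mail marketing library.
from dataclasses import dataclass

@dataclass
class MarketingEmail:
    recipient: str
    sender_identity: str   # who the message is from, as shown to the recipient
    opt_out_address: str   # an address to which a "stop sending" request can be sent
    body: str

def may_send(email: MarketingEmail, consents: set[str]) -> bool:
    """True only if the quoted ePrivacy-style conditions are met:
    (i) the recipient gave prior consent to direct marketing by e-mail, and
    (ii) the message discloses the sender's identity and a valid opt-out address."""
    has_prior_consent = email.recipient in consents
    has_sender_identity = bool(email.sender_identity.strip())
    has_opt_out = "@" in email.opt_out_address  # crude placeholder validity check
    return has_prior_consent and has_sender_identity and has_opt_out

if __name__ == "__main__":
    consents = {"alice@example.com"}
    msg = MarketingEmail(
        recipient="bob@example.com",
        sender_identity="Acme Ltd",
        opt_out_address="unsubscribe@acme.example",
        body="Spring offers",
    )
    print(may_send(msg, consents))  # False: no prior consent from bob@example.com
```

The point of the sketch is simply that both limbs of the quoted rule are cumulative: consent alone is not enough if the message omits the sender's identity or a usable opt-out address.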
Data Protected Cyprus | Insights | Linklaters
Data Protected Cyprus | Insights | Linklaters: "Are there any special rules when processing personal data about children?
Consent from a child in relation to online services will only be valid if authorised by a parent. A child is someone under 16 years old, though Member States may reduce this age to 13.
It is not yet known whether in Cyprus the age at which a child can provide a valid consent will be reduced below 16." 'via Blog this'
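The quoted rule is essentially a threshold test: a child's own consent to an online service is valid only from the applicable age (16 by default under the GDPR, which a Member State may lower, but not below 13); under that age, parental authorisation is required. A minimal sketch follows, assuming a hypothetical national lookup table; the per-country values are illustrative only, and, as the excerpt notes, Cyprus's choice was still open at the time of the post.

```python
# Minimal sketch of the GDPR Article 8 age-of-consent test.
# The per-country table is a hypothetical illustration, not a statement
# of the thresholds actually adopted by each Member State.
DEFAULT_AGE_OF_CONSENT = 16   # GDPR default
MIN_PERMITTED_AGE = 13        # floor below which Member States may not go

NATIONAL_AGE = {
    "CY": None,  # Cyprus: undecided at the time of the quoted post
    "UK": 13,    # illustrative example value only
}

def consent_age(country: str) -> int:
    age = NATIONAL_AGE.get(country)
    return age if age is not None else DEFAULT_AGE_OF_CONSENT

def needs_parental_authorisation(age: int, country: str) -> bool:
    """True if the child's own consent is not valid without parental authorisation."""
    return age < consent_age(country)

print(needs_parental_authorisation(15, "CY"))  # True: below the default 16-year threshold
print(needs_parental_authorisation(15, "UK"))  # False under the illustrative 13-year threshold
```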
Wednesday, 4 April 2018
Privacy: FTC failed to enforce Facebook consent decree, critics charge amid firestorm - FTCWatch
FTC failed to enforce Facebook consent decree, critics charge amid firestorm - FTCWatch: "consumer advocates are directing much of their ire at the FTC.
“I’m glad people are finally saying that maybe they should enforce their consent orders, but it is a little bit ironic that the people who were there [at the agency] back in the day and could have done something are now the ones saying that the FTC should act,” Marc Rotenberg, president and executive director of the Electronic Privacy Information Center, said in an interview.
“This whole Cambridge Analytica controversy can be laid directly at the doors of the Federal Trade Commission,” said Jeff Chester, executive director of the Center for Digital Democracy. “Despite the consent decree, Facebook expanded its data gathering practices without constraint every day. I made all of that information available to the FTC. … You know how many e-mails I sent? Hundreds. I said, ‘Here is what they are doing — it is not permitted by the consent decree.’”
Rotenberg also noted that EPIC had “repeatedly told the commission, every time they asked what do we need to do, we answered very simply, ‘enforce your consent orders.’ We said to the FTC that there should be a formal process every time there is a substantial change in business practice that implicates personal data by a company that is subject to a consent order concerning privacy. The FTC has an affirmative obligation to determine whether that change in business practice violates the consent order.”
“We wanted them to create a formal process, and we wanted them to stay on top of these companies. It didn’t happen,” he added. “What most of the world doesn’t understand is that the company was actually subject to a legal order. It wasn’t, ‘maybe we can go ask Mark Zuckerberg to be nicer.’ We knew in Washington what the constraints on the practices were supposed to be, and the FTC simply dropped the ball.”
Other advocates share that anger, as reflected in a letter signed by 17 leading consumer privacy groups last month to FTC acting Chairman Maureen Ohlhausen and Commissioner Terrell McSweeny, charging: “It is unconscionable that the FTC allowed this unprecedented disclosure of Americans’ personal data to occur. The FTC’s failure to act imperils not only privacy but democracy as well.”" 'via Blog this'
Privacy: The Future of Self-Regulation is Co-Regulation by Ira Rubinstein :: SSRN
The Future of Self-Regulation is Co-Regulation by Ira Rubinstein :: SSRN: "Abstract
Modern regulatory theory has long treated voluntary self-regulation and direct government regulation as opposing ends of a regulatory continuum, with most self-regulatory schemes falling somewhere in the middle. This chapter explores the middle ground by examining co-regulatory approaches to privacy, in which industry enjoys considerable flexibility in shaping self-regulatory guidelines, consumer advocacy groups have a seat at the table, and government sets default requirements and retains general oversight authority to approve and enforce these guidelines. Privacy co-regulation is generally understood as a collaborative, flexible, and performance-based approach to privacy regulation that draws on the theoretical insights of collaborative governance theory.
This paper argues that privacy self-regulation in the form of voluntary codes has had a sufficiently long run to prove its worth but has failed. Now is the time to make the transition to co-regulation, especially in the U.S. It is organized into three sections. The first considers in greater detail the differences between self-regulation and co-regulation. The second looks at the failure and stubborn persistence of voluntary codes of conduct. The third shifts the discussion to three case studies of privacy codes and practices that have benefited from a co-regulatory approach. In the past few years, there have been some notable developments in co-regulatory schemes as well as some important empirical studies. These new materials provide an opportunity to understand the conditions for the success (and failure) of co-regulatory solutions in the privacy field and what this implies for the future of regulatory innovation. The chapter concludes by offering a few recommendations on how the U.S. Congress can implement co-regulatory approaches in any future legislation to optimally protect online consumer privacy while preserving innovation and economic growth." 'via Blog this'