
Privacy and Rights



Originally published in Queue vol. 13, no. 8



Meng-Day (Mandel) Yu, Srinivas Devadas - Pervasive, Dynamic Authentication of Physical Items
The use of silicon PUF circuits

Nicholas Diakopoulos - Accountability in Algorithmic Decision-making
A view from computational journalism

Jim Waldo, Alan Ramos, Weina Scott, William Scott, Doug Lloyd, Katherine O'Leary - A Threat Analysis of RFID Passports
Do RFID passports make us vulnerable to identity theft?

Whitfield Diffie, Susan Landau - Communications Surveillance
As the sophistication of wiretapping technology grows, so too do the risks it poses to our privacy and security.


Comments (newest first)

Gary LaFever | Sun, 03 Apr 2016 00:09:34 UTC

This article does a good job of explaining the statistical underpinnings of the threats that shortcomings of traditional approaches to data privacy pose to realizing the full benefit of Big Data. These approaches were developed before the increasing volume, variety, and velocity of data turned little-"d" data into Big Data. As a result, they purposefully insert inaccuracy into Big Data that vitiates the value of the data.

In a speech entitled "A Call to Arms: The Role of Technologists in Protecting Privacy in the Age of Big Data," then-FTC Commissioner Julie Brill issued a call to arms to technologists, saying: "it is important to recognize that you, the computer scientists, the engineers, the programmers, the technologists, have a unique set of skills that are key to solving these critical privacy issues. If you join me in this effort, I think that together we can help big data operate in a system that respects consumer privacy and engenders consumer trust, allowing big data to reach its full potential to benefit us all."

If individuals and society as a whole are to benefit from the Big Data Grand Bargain (see Forbes article), computer scientists, data scientists, engineers, programmers, and technologists must devote themselves to developing new ways to balance the benefit of the Big Data Grand Bargain against its costs, including privacy, in a manner that does not reduce or eliminate the very value of what we're trying to protect: Big Data. See #bigprivacy.

Steve Welburn | Wed, 03 Feb 2016 15:10:22 UTC

The DataSHIELD project in the UK provides a framework for users to submit restricted queries against sensitive datasets.

Olivia Angiuli | Tue, 03 Nov 2015 20:15:39 UTC

Hi Jaksa, the point you raise is definitely another alternative approach to privacy preservation that has been proposed, and I'd point you to this paper to read more about it. In my opinion, the largest downside of this approach is that all queries ever performed on a dataset, in combination, would have to meet a given privacy standard. This is because all researchers who ever query this dataset could potentially combine their results, and the combination of these queries must, in aggregate, meet the privacy standard. The implication is that the more queries you run on the dataset, the worse your results get. The approach discussed in this paper attempts to sidestep that problem by providing a "universally optimal" dataset for public use.
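The composition effect described here can be sketched with the Laplace mechanism from differential privacy. This is a hypothetical illustration (the function names, total budget, and query counts are mine, not from the article): if a fixed privacy budget must cover every query ever issued, each additional query gets a smaller share of the budget, so its noise scale grows.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def noisy_count(true_count, epsilon):
    """A counting query (sensitivity 1) answered with the Laplace mechanism."""
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical total privacy budget shared by ALL queries ever run.
total_budget = 1.0

for k in (1, 10, 100):
    eps_per_query = total_budget / k   # sequential composition splits the budget
    scale = 1.0 / eps_per_query        # noise scale grows linearly with k
    print(f"{k:>3} queries -> noise scale {scale:g} per answer")
```

Under these assumptions, spreading the same budget over 100 queries makes each answer roughly 100 times noisier than a single-query release, which is the "more queries, worse results" trade-off the comment describes.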

Jaksa Vuckovic | Fri, 30 Oct 2015 09:46:44 UTC

It seems that re-identification is quite easy to do and could be a big problem for privacy. What do you think about moving to a model where, instead of sharing the dataset, third parties are allowed to submit queries? Obviously the query language should not have the power to return individual records. Would such a solution eliminate, or at least reduce, the possibility of re-identification?
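A restricted-query interface of the kind proposed here is often sketched as an aggregate-only API with small-group suppression. The interface, threshold, and data below are hypothetical, purely for illustration:

```python
# Hypothetical restricted-query interface: callers may only request
# aggregate counts, and any nonzero group smaller than a threshold
# is suppressed rather than returned.

MIN_GROUP_SIZE = 5

def count_query(records, predicate):
    """Return a count, suppressing results that would expose small groups."""
    n = sum(1 for r in records if predicate(r))
    if 0 < n < MIN_GROUP_SIZE:
        return None  # suppressed: too few matching individuals
    return n

patients = [{"age": 34, "zip": "02139"},
            {"age": 71, "zip": "02139"},
            {"age": 45, "zip": "94110"}]

print(count_query(patients, lambda r: r["zip"] == "02139"))  # → None (n=2 is below the threshold)
```

Note that suppression alone does not stop differencing attacks (subtracting one permitted count from another to isolate an individual), which is one reason the combined effect of many queries still has to be accounted for.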


© 2019 ACM, Inc. All Rights Reserved.