4 Trends and Challenges in Privacy HCI Research


In the previous sections, we provided an overview of the research landscape of HCI as it relates to privacy. As a conclusion to this article, we outline several trends that are changing the privacy landscape, as well as major research challenges in the field. While each of the research subfields reviewed in Section 3 tackles a specific aspect of privacy in HCI, we focus here on five “grand challenges” that span several subfields:

  • Developing more effective and efficient ways for end-users to manage their privacy.

  • Gaining a deeper understanding of people’s attitudes and behaviors towards privacy.

  • Developing a “Privacy Toolbox.”

  • Improving organizational management of personal data.

  • Converging privacy research with technological adoption models.

Below, we outline each of these trends, indicate where we see current research headed, and identify the challenges facing researchers and practitioners.

4.1 Better Ways of Helping End-Users Manage Their Personal Privacy


It is becoming increasingly difficult to manage personal privacy as information and communication technologies become pervasive. Personal information is fragmented across a number of devices, applications, web sites, and organizations, each with different user interfaces, notifications, and management policies. We argue that new approaches are needed to alleviate the burden on users of managing their personal privacy.

Information and communication technologies increasingly retain information about the individuals using them, and surveillance systems are spreading into the workplace (in the form of email and web monitoring) and into other spheres of daily activity (e.g., broadcasting the interior of night clubs, bars, or beaches [52]). Often, these systems collect information unbeknownst to the user. Furthermore, the development of digital sensors has enabled the collection of novel types of information in everyday situations (e.g., automatic toll payment systems based on RFID and license plate recognition [111], implantable sensors monitoring the health of patients [205], monitoring systems deployed in the homes of elderly people [38]). Technical and economic considerations suggest that sensing technologies will become a ubiquitous infrastructure, open for use by individuals as well as organizations for a wide array of purposes. A distinctive characteristic of these systems is that interaction is increasingly implicit, falling outside the scope of Norman’s “Seven Stages of Action” [227]. This kind of implicit interaction requires new mechanisms for managing the resulting risks to personal information and privacy.

One possible solution to the problems above is to develop more effective and less burdensome user interfaces for helping people make good privacy decisions. A key challenge is that there is currently no agreement as to which interaction styles are best suited for each type of information disclosure. Rule- or policy-based mechanisms may be suboptimal for many applications, as discussed in Section 3.2.2. Other interaction styles, such as social translucency and plausible deniability, might achieve comparable effects with far less burden and a greater sense of control [28], but there are no clear guidelines on how to build plausible deniability into computing systems. Ambiguity has been discussed as a design resource in other contexts (e.g., games) [117], and we believe it will become an increasingly important design element in the context of privacy. In short, much more work is needed to determine the efficacy of these different approaches in a wider range of contexts.
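
To make the contrast concrete, consider the following minimal sketch in Python. It is ours, not drawn from any system cited above, and all names are hypothetical; it juxtaposes an explicit rule table with a deliberately ambiguous response that preserves plausible deniability:

    from dataclasses import dataclass

    @dataclass
    class DisclosureRequest:
        requester: str   # who is asking for the data
        data_type: str   # e.g., "location"

    # Rule-based approach: every disclosure is matched against explicit,
    # user-authored rules. Precise, but burdensome to author and maintain.
    RULES = {
        ("spouse", "location"): "allow",
        ("employer", "location"): "deny",
    }

    def rule_based_decision(req: DisclosureRequest) -> str:
        # Unmatched requests fall through to a conservative default.
        return RULES.get((req.requester, req.data_type), "deny")

    # Plausible-deniability approach: an explicit "deny" itself leaks the
    # fact that the user refused. A deliberately ambiguous answer makes a
    # refusal indistinguishable from a benign failure (offline, no signal).
    def deniable_response(allowed: bool, value: str) -> str:
        return value if allowed else "unavailable"

The point of the second function is the policy it encodes: the requester cannot tell refusal from unavailability, which is precisely the kind of ambiguity that [28] and [117] identify as a design resource.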

Another possibility is to consider a better division of labor that helps shoulder the burden of managing personal privacy. A consensus is slowly building in the research community that privacy-sensitive applications cannot make all data transfers explicit, nor require users to track them all. The related UIs and interaction patterns would simply be too complex and unwieldy. From a data protection viewpoint, experience shows that most data subjects are unable or unwilling to control all disclosures of personal information, and to keep track of all parties that process their personal data [64, 95]. Distributing the burden of managing one’s personal privacy across a combination of operating systems, networking infrastructure, software applications, system administrators, organizations, and third parties could help address this problem. Ideally, these entities would provide advice to users or make trusted decisions on their behalf, with the ultimate goal being to reduce the overall effort required to make good decisions. Taking email spam as an example, multiple entities—including ISPs, local system administrators, and automatic filters—all contribute to reducing the amount of spam that end-users receive. Here, it makes sense to share the costs of spam reduction since the hardship would otherwise be borne by a large number of individuals.
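
Continuing the spam example, a minimal sketch of this division of labor might look as follows. The three stages and their filtering rules are our illustrative assumptions, not drawn from any cited system:

    def isp_filter(messages):
        # The ISP drops traffic coming from hosts on a shared blocklist.
        blocklist = {"spammer.example"}
        return [m for m in messages if m["origin"] not in blocklist]

    def admin_filter(messages):
        # The local system administrator applies organization-wide rules.
        return [m for m in messages if not m.get("flagged_by_org", False)]

    def user_filter(messages):
        # The end-user's client applies personal preferences last, over a
        # much smaller, pre-screened set of messages.
        return [m for m in messages if m["spam_score"] < 0.8]

    def deliver(messages):
        # Each stage removes part of the load before the next one runs,
        # so no single party (least of all the user) bears the whole cost.
        for stage in (isp_filter, admin_filter, user_filter):
            messages = stage(messages)
        return messages

The design choice mirrors the argument above: decisions requiring the least personal context sit with the parties best positioned to make them at scale, and only the residue reaches the individual.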

Trusted proxies are another example of third parties that can help manage privacy. For instance, MedicAlert is a paid service that stores personal medical records and forwards them to first responders in the event of a medical emergency. Such organizations, whether not-for-profit (like MedicAlert) or for-profit (regulated by a service contract), could include:


  • evaluation clearinghouses that indicate which products and services to trust. For example, SiteAdvisor [265] evaluates web sites’ spam, popup, and virus risks, and provides ratings via a web browser plug-in.

  • services that hold users’ location information and disclose it only in case of emergency or subpoena, similar to current mobile telecom operators (a minimal sketch of such a service follows this list).

  • services that seed suspected privacy violators with fake personal data and track how that data is used and shared.

  • services that check whether an individual reveals so much personal information in her resume that she is at risk of identity theft [280].
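
The location-escrow service above could be realized along the following lines. This is a hypothetical sketch: the class and method names are ours, and a real deployment would add credential verification and audit logging:

    class LocationEscrow:
        # Reasons under which the proxy may release a user's location.
        AUTHORIZED_REASONS = {"medical_emergency", "court_subpoena"}

        def __init__(self):
            self._store = {}  # user_id -> latest location fix

        def update(self, user_id, location):
            # Devices push location updates to the proxy; the proxy
            # never publishes them proactively.
            self._store[user_id] = location

        def request_disclosure(self, user_id, reason, credential_ok):
            # Release requires both an authorized reason and a verified
            # credential (e.g., a first responder's or a court's).
            if reason in self.AUTHORIZED_REASONS and credential_ok:
                return self._store.get(user_id)
            raise PermissionError("disclosure not authorized")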

In summary, privacy protection is a “systemic property” that requires support at all levels. However, special care should be exercised in allocating responsibility and oversight correctly, because the business goals of many organizations may not be aligned with those of the users, as suggested by recent controversies over security leaks at large personal data brokerage firms [297, 300].


