TECHNOLOGY AND PRIVACY: A SYMPOSIUM
Thursday 30 July 1998
Institute of Civil Engineers, 1 Great George Street, Westminster, London
1. Introduction and Rationale
The potential effects of the new electronic technologies on privacy are much debated, and a wide range of issues is frequently raised.
Unfortunately, we have as yet little clear idea about how to assess these matters. Many suppose that new technical capacity - the new techniques of data-mining, new developments in information scanning and visual recognition - will inevitably impact on privacy. Others are more cautious, arguing that the effects are as yet unclear; that impact will depend on a whole variety of factors including market response, the rate of take-up and public resistance; and that there is nothing to suggest that the effects of these technologies will necessarily be experienced in terms of violations of privacy.
A major problem is that there is little interaction between the various constituencies with relevant expertise: technology developers, technology appliers in business and government, policy makers, academic researchers. Those who are well informed about likely technical developments have had little access to frameworks for taking into account the human and social dimensions of privacy. At the same time, those skilled in assessing social dynamics do not always have up to date information on the latest technical developments.
The aim of the symposium was to address the current lack of interaction by bringing together leading representatives of the various relevant expert constituencies. In particular, the meeting introduced some of the latest work being undertaken into the social dimensions of technology and privacy, and generated suggestions for research topics most likely to be of value to industry, business and government. The specific aim was to identify priority areas for future research and discussion.
Below are copies of the (one page) briefing materials supplied by our speakers: Gordon Drury (NDS Ltd); Professor David Mason (University of Plymouth); and Charles Raab (University of Edinburgh); and a short Overview of the discussion. The presentations and discussion were chaired by Dr Geoff Robinson (formerly Director of Technology, IBM).
2. Privacy: What Are The Basic Questions?
Gordon Drury (NDS Ltd)
The following lists some topics for discussion in a presentation. It is a summary of a paper written for the ITEC regulatory committee earlier this year and circulated to other groups within ITEC. The intent is to stimulate discussion of the issues and thereby gain some consensus and focus.
Given these issues, what requirements are placed upon technology for the implementation of private systems?
Discussion should be focussed on exploring the linkages between technological capabilities and social and related needs especially in terms of costs and benefits, both technical and social.
3. Technology, Work And Surveillance: Organisational Goals, Privacy And Resistance
Professor David Mason (University of Plymouth) and Professor Graham Button (Xerox EuroPARC, Cambridge)
Many discussions of the implications of new technologies for privacy concern systems designed specifically with surveillance functions in mind (such as CCTV in public spaces). Other discussions focus on data protection in the context of systems where personal data collection is the core activity (such as consumer behaviour surveying and associated marketing).
However, many technologies now used in the workplace are not specifically designed to have a surveillance function but they can, nevertheless, keep detailed records on the behaviour of employees. This is often a by-product of systems designed to assist such tasks as stock control, re-ordering of materials or the planning of work flows. What are the implications for employees' privacy of the surveillance capacity of these systems?
Research on new management techniques tends to stress the potentialities of modern electronic technologies to enhance the work of teams and individuals. It takes a broadly positive view of the future, focusing on the empowering capacity of new technological ways of supporting work. By contrast, research undertaken in the labour process tradition depicts a less benign view of the nature of the employment relationship - one which sees it as intrinsically oppositional and conflict-ridden. Employees are assumed to be likely to experience any enhanced surveillance capacity as an intrusion into their privacy and autonomy at work. As a result, it is argued, employees will respond by trying to undermine the technology in some way.
Yet it turns out that little research has actually been conducted on what employees think, how they respond or what they see as acceptable and unacceptable aspects of modern electronic technologies. The questions that need to be addressed include:
Each of these questions raises further possibilities. Thus it may be that employers will calculate that the disadvantages of utilising surveillance potential outweigh the advantages. This might be because the act of surveillance itself imposes a cost. Or perhaps the risk of undermining employee compliance is judged too great. For their part, employees may, in some circumstances, be suspicious of, and resistant to, surveillance, but on other occasions regard surveillance as legitimate (such as in health and safety matters). Many employees work in environments where levels of (non-technological) management and peer surveillance are already high. In this case, employees might perceive technologically enhanced surveillance capacity as a protection against the possibility of capricious or malicious accusations of dereliction. In yet other circumstances, employees may regard technological systems as offering other kinds of support in activities where they would otherwise be individually exposed. In all these cases, the wider climate and culture of the organisation is likely to be a crucial mediating factor.
4. Privacy Issues: What Do We Need to Know?
Mr Charles Raab (University of Edinburgh)
- the question of whether new technical capacities will inevitably have an impact on privacy, or whether the impact depends on many other factors, partly depends on:
  - how one looks at risk
  - what mix of precautionary or redress approaches/policy tools to apply to the impact
- we don't know enough about risk
- we don't know enough about the mix (or its individual ingredients) and its (their) efficacy
- need to look at the human and social dimensions of privacy, including the principles for its protection and the values to be safeguarded
- need to look at privacy-protection regimes - the 'mix' of laws, enforcement agencies, voluntary codes, privacy-enhancing technologies and market solutions
- in the 'mix', what is the (proper) role of government and policy? - more than just ensuring compliance, and possibly requiring a re-thinking of privacy-protection policy in terms of social policy
- in the 'mix', what is the role of industry? - in particular its creation and use of technologies with privacy implications, its understanding of the issues, and its development of self-regulation
- international and global dimensions of privacy protection, with particular reference to the European Data Protection Directive 95/46/EC, transborder data flows, and the Internet
- what is the role of academic research, and what are its priorities? Such research:
  - should be comparative (including cross-country learning) and global
  - should involve organisational and inter-organisational dimensions (e.g. designing regulatory systems)
  - should look at selected issues systematically and comprehensively, in collaboration with practitioners (e.g. smart cards, CCTV, transport monitoring and other geographic information system applications), asking what mix of laws, codes, principles, practices, technologies, organisations, public opinion, etc. relates to each
  - should aim to get a better grasp of questions of risk, trust, balance and equity
5. Overview of the Discussion
The following is a (necessarily selective) attempt to capture some of the main comments and opinions.
It was agreed that the complexity of questions about privacy required some careful "back to basics" thinking about the nature of individual and societal rights and responsibilities. There are some knotty issues here, including the extent to which privacy should be considered as a right of the community as much as a right of individuals. It was pointed out that the availability of technology did not in itself entail any new problems of privacy. Understanding the technology is often relatively simple; the difficult question is how and why people react to, appropriate or ignore technological options. Humans, not technology, determine whether or not there is a problem of privacy.

We do not yet know enough about how organisations make decisions about whether or not to implement the surveillance capacities of their information systems, nor about the ways in which employees respond. Much traditional thinking and research on this subject has been constrained by a combination of disciplinary boundaries, social prejudice and obsolete social and economic models. In general, we need to look more carefully at the extent and distribution of the risks of privacy intrusion, and at the workings of the mix of mechanisms for protection. A key question here is who should regulate this mix, and how?
It was suggested that government regulation is unlikely to work, one possible reason being that customers tend not to do what they are told. It is perhaps too early in our understanding of the specific issues and risks involved to fashion effective and sustainable regulation. There has in any case been little pressure on the government to deal with privacy, partly because the UK cultural context is characterised by rather little widespread concern about privacy: few people have ever bothered to check what information is held on them. In this situation there is a very real danger that the question of privacy might be hijacked by those with a narrow outlook on the problem.
Perhaps the more significant public concern is one of trust rather than privacy. One guideline often applied is that data, especially personal data, should only be used for the purposes for which it was gathered, and by those to whom it was directly supplied. But how can we guarantee the responsible interpretation of this ideal? From the point of view of increasing competitiveness, it could be argued that there are virtues in allowing organisations to exploit data from outside their own sector. But to what extent? And with what permissions and safeguards?
In sum, there are no answers! But we can note the following key themes for future discussion and debate:
1. The public concern is with trust rather than privacy.
2. Customers tend to look for both choice over privacy and trust arrangements that work.
3. There is a need for greater clarity on, and awareness of, the appropriate rules for passing personal data between organisations.
4. In practice there may need to be stronger protection mechanisms over privacy within the workplace than in the market place.
6. List of Participants
Mr Nigel Birch, EPSRC
Mr Richard Clayton, Demon Internet
Mr Richard Daniels, Office of Science & Technology
Mr Gordon Drury, NDS Ltd
Mr Keith Ferguson, National Westminster Bank
Mr Martin Freeth, Bristol 2000
Professor Rodger Hake, British Computer Society
Dr Richard Harper, Digital World Centre
Mr Nigel Hickson, CIID, DTI
Mr Roger James, Napp Pharmaceuticals Ltd
Mr John Leighfield, Birmingham Midshires Building Society
Professor David Mason, University of Plymouth
Professor John Midwinter OBE, University College
Professor Jim Norton, Future Unit, DTI
Mr Charles Raab, University of Edinburgh
Dr Geoff Robinson CBE
Mr Mike Rodd, IEE
Mr Peter Saraga, Philips Research Laboratories
Mr Roland Sinker, OFTEL
Mr Roderick Snell, Snell & Wilcox Ltd
Mr Geoff Vincent, Mediation Technology
Mr John Wagstaff, Association of British Insurers
Ms Barbara Walker, CBI
Mrs Peta Walmisley, British Computer Society
Dr Mark Wilkins, EPSRC
Mr Paul Williams, CIID, DTI
Professor Steve Woolgar, ESRC Virtual Society? Programme
Page developed and maintained by Christine Hine
Contents current at 15th October 1998