TECHNOLOGY AND PRIVACY: A SYMPOSIUM
Thursday 30 July 1998
Institute of Civil Engineers, 1 Great George Street, Westminster, London
1. Introduction and Rationale
The potential effects of the new electronic technologies on privacy are much debated. Among the issues frequently mentioned are:
Unfortunately, we have as yet little clear idea about how to assess these matters. Many suppose that new technical capacity - the new techniques of data-mining, new developments in information scanning and visual recognition - will inevitably impact on privacy. Others are more cautious, arguing that the effects are as yet unclear; that impact will depend on a whole variety of factors including market response, the rate of take-up and public resistance; and that there is nothing to suggest that the effects of these technologies will necessarily be experienced in terms of violations of privacy.
A major problem is that there is little interaction between the various constituencies with relevant expertise: technology developers, technology appliers in business and government, policy makers, and academic researchers. Those who are well informed about likely technical developments have had little access to frameworks for taking into account the human and social dimensions of privacy. At the same time, those skilled in assessing social dynamics do not always have up-to-date information on the latest technical developments.
The aim of the symposium was to address the current lack of interaction by bringing together leading representatives of the various relevant expert constituencies. In particular, the meeting introduced some of the latest work being undertaken into the social dimensions of technology and privacy, and generated suggestions for research topics most likely to be of value to industry, business and government. The specific aim was to identify priority areas for future research and discussion.
Below are copies of the (one page) briefing materials supplied by our speakers: Gordon Drury (NDS Ltd); Professor David Mason (University of Plymouth); and Charles Raab (University of Edinburgh); and a short Overview of the discussion. The presentations and discussion were chaired by Dr Geoff Robinson (formerly Director of Technology, IBM).
2. Privacy: What Are The Basic Questions?
Gordon Drury (NDS Ltd)
The following lists some topics for discussion in a presentation. It is a summary of a paper written for the ITEC regulatory committee earlier this year and circulated to other groups within ITEC. The intent is to stimulate discussion of the issues and thereby gain some consensus and focus.
As a result of the above, what are the requirements placed upon technology for the implementation of private systems?
Discussion should be focussed on exploring the linkages between technological capabilities and social and related needs, especially in terms of costs and benefits, both technical and social.
3. Technology, Work And Surveillance: Organisational Goals, Privacy And Resistance
Professor David Mason (University of Plymouth) and Professor Graham Button (Xerox EuroPARC, Cambridge)
Many discussions of the implications of new technologies for privacy concern systems designed specifically with surveillance functions in mind (such as CCTV in public spaces). Other discussions focus on data protection in the context of systems where personal data collection is the core activity (such as consumer behaviour surveying and associated marketing).
However, many technologies now used in the workplace are not specifically designed to have a surveillance function but they can, nevertheless, keep detailed records on the behaviour of employees. This is often a by-product of systems designed to assist such tasks as stock control, re-ordering of materials or the planning of work flows. What are the implications for employees' privacy of the surveillance capacity of these systems?
Research on new management techniques tends to stress the potentialities of modern electronic technologies to enhance the work of teams and individuals. It takes a broadly positive view of the future, focusing on the empowering capacity of new technological ways of supporting work. By contrast, research undertaken in the labour process tradition depicts a less benign view of the nature of the employment relationship - one which sees it as intrinsically oppositional and conflict-ridden. Employees are assumed to be likely to experience any enhanced surveillance capacity as an intrusion into their privacy and autonomy at work. As a result, it is argued, employees will respond by trying to undermine the technology in some way.
Yet it turns out that little research has actually been conducted on what employees think, how they respond or what they see as acceptable and unacceptable aspects of modern electronic technologies. The questions that need to be addressed include:
Each of these questions raises further possibilities. Thus it may be that employers will calculate that the disadvantages of utilising surveillance potential outweigh the advantages. This might be because the act of surveillance itself imposes a cost. Or perhaps the risk of undermining employee compliance is judged too great. For their part, employees may, in some circumstances, be suspicious of, and resistant to, surveillance, but on other occasions regard surveillance as legitimate (such as in health and safety matters). Many employees work in environments where levels of (non-technological) management and peer surveillance are already high. In this case, employees might perceive technologically enhanced surveillance capacity as a protection against the possibility of capricious or malicious accusations of dereliction. In yet other circumstances, employees may regard technological systems as offering other kinds of support in activities where they would otherwise be individually exposed. In all these cases, the wider climate and culture of the organisation is likely to be a crucial mediating factor.
4. Privacy Issues: What Do We Need to Know?
Mr Charles Raab (University of Edinburgh)
- the question of whether new technical capacities will inevitably have an impact on privacy, or whether the impact depends on many other factors, turns partly on:
  - how one looks at risk
  - what mix of precautionary or redress approaches/policy tools to apply to the impact
- we don't know enough about risk
- we don't know enough about the mix (or the individual ingredients) and its (their) efficacy
- need to look at the human and social dimensions of privacy, including the principles for its protection and the values to be safeguarded
- need to look at privacy-protection regimes - the 'mix' of laws, enforcement agencies, voluntary codes, privacy-enhancing technologies and market solutions
- in the 'mix', what is the (proper) role of government and policy? - more than just ensuring compliance, and possibly requiring re-thinking privacy-protection policy in terms of social policy
- in the 'mix', what is the role of industry? - in particular in regard to creating and using technologies with privacy implications, its understanding of the issues, and its development of self-regulation
- international and global dimensions of privacy protection, with particular reference to the European Data Protection Directive 95/46/EC, transborder data flows, and the Internet
- what is the role of academic research, and what are its priorities? Research:
  - should be comparative (including cross-country learning) and global
  - should involve organisational and inter-organisational dimensions (e.g. designing regulatory systems)
  - should look at selected issues systematically and comprehensively, in collaboration with practitioners (e.g. smart cards, CCTV transport monitoring/other geographic information system applications), and see what mix of laws, codes, principles, practices, technologies, organisations, public opinion, etc. relates to each
  - should aim to get a better grasp on questions of risk, trust, balance and equity
5. Overview of the Discussion
The following is a (necessarily selective) attempt to capture some of the main comments and opinions.
It was agreed that the complexity of questions about privacy required some careful "back to basics" thinking about the nature of individual and societal rights and responsibilities. There are some knotty issues here, including the extent to which privacy should be considered as a right of the community as much as a right of individuals. It was pointed out that the availability of technology did not entail any new problems of privacy. Understanding the technology is often relatively simple; the difficult question is how and why people react to, appropriate or ignore technological options. Humans, not technology, determine whether or not there is a problem of privacy. We do not yet know enough about how organisations make decisions about whether or not to implement the surveillance capacities of their information systems, nor about the ways in which employees respond. Much traditional thinking and research on this subject has been constrained by a combination of disciplinary boundaries, social prejudice and obsolete social and economic models. In general, we need to look more carefully at the extent and distribution of the risks of privacy intrusion, and at the workings of the mix of mechanisms for protection. A key question here is who should regulate this mix, and how?
It was suggested that government regulation is unlikely to work, one possible reason being that customers tend not to do what they are told. It is perhaps too early in our understanding of the specific issues and risks involved to fashion effective and sustainable regulation. There has in any case been little pressure on the government to deal with privacy, partly because the UK cultural context is characterised by rather little widespread concern about privacy. Rather few people have ever bothered to check what information is being held on them. In this situation there is a very real danger that the question of privacy might be hijacked by those with a narrow outlook on the problem.
Perhaps the more significant public concern is one of trust rather than privacy. One guideline often applied is that data, especially personal data, should only be used for the purposes for which it was gathered, and by those to whom it was directly supplied. But how can we guarantee the responsible interpretation of this ideal? From the point of view of increasing competitiveness, it could be argued that there are virtues in allowing organisations to exploit data from outside their own sector. But to what extent? And with what permissions and safeguards?
In sum, there are no answers! But we can note the following key themes for future discussion and debate:
1. The public concern is with trust rather than privacy.
2. Customers tend to be looking to have choice over privacy and trust that works.
3. There is a need for greater clarity on, and awareness of, the appropriate rules for passing personal data between organisations.
4. In practice there may need to be stronger protection mechanisms over privacy within the workplace than in the market place.
6. List of Participants
Mr Nigel Birch, EPSRC
Mr Richard Clayton, Demon Internet
Mr Richard Daniels, Office of Science & Technology
Mr Gordon Drury, NDS Ltd
Mr Keith Ferguson, National Westminster Bank
Mr Martin Freeth, Bristol 2000
Professor Rodger Hake, British Computer Society
Dr Richard Harper, Digital World Centre
Mr Nigel Hickson, CIID, DTI
Mr Roger James, Napp Pharmaceuticals Ltd
Mr John Leighfield, Birmingham Midshires Building Society
Professor David Mason, University of Plymouth
Professor John Midwinter OBE, University College
Professor Jim Norton, Future Unit, DTI
Mr Charles Raab, University of Edinburgh
Dr Geoff Robinson CBE FREng
Mr Mike Rodd, IEE
Mr Peter Saraga, Philips Research Laboratories
Mr Roland Sinker, OFTEL
Mr Roderick Snell, Snell & Wilcox Ltd
Mr Geoff Vincent, Mediation Technology
Mr John Wagstaff, Association of British Insurers
Ms Barbara Walker, CBI
Mrs Peta Walmisley, British Computer Society
Dr Mark Wilkins, EPSRC
Mr Paul Williams, CIID, DTI
Professor Steve Woolgar, ESRC Virtual Society? Programme
Page developed and maintained by Christine Hine
Contents current at 20th September 1999