Using IT effectively: a guide to technology in the social sciences
Using IT effectively: a guide to technology in the social sciences

Edited by
Millsom Henry
University of Stirling
© Millsom Henry 1998, the collection and introductory material
© The contributors for individual chapters 1998

This book is copyright under the Berne Convention. No reproduction without permission. All rights reserved.

First published in 1998 by UCL Press

UCL Press Limited
1 Gunpowder Square
London EC4A 3DE
UK

and

1900 Frost Road, Suite 101
Bristol
Pennsylvania 19007–1598
USA

This edition published in the Taylor & Francis e-Library, 2005.

“To purchase your own copy of this or any of Taylor & Francis or Routledge’s collection of thousands of eBooks please go to www.eBookstore.tandf.co.uk.”

The name of University College London (UCL) is a registered trade mark used by UCL Press with the consent of the owner.

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library.

Library of Congress Cataloging-in-Publication Data are available.

ISBN 0-203-98452-8 (Master e-book ISBN)
ISBN 1-85728-794-0 (Print Edition) HB
ISBN 1-85728-795-9 (Print Edition) PB

Cover design by Amanda Barragry
CONTENTS

FOREWORD  Professor Howard Newby  vi
LIST OF FIGURES AND TABLES  vii
NOTES ON CONTRIBUTORS  viii
EDITOR’S INTRODUCTION  Millsom Henry  xiii

SECTION ONE: NEW CHALLENGES FOR TEACHING AND LEARNING
1  EXPONENTIAL EDUCATION  Peter Cochrane  2
2  PEDAGOGY, PROGRESS, POLITICS AND POWER IN THE INFORMATION AGE  Stephen Heppell  12
3  TECHNOLOGY AND SOCIETY: AN MP’S VIEW  Anne Campbell  15
4  INFORMATION TECHNOLOGY: A CASE FOR SOCIAL SCIENTIFIC ENQUIRY  Adrian Kirkwood  19

SECTION TWO: DEVELOPING COURSEWARE FOR THE SOCIAL SCIENCES
5  EXPECTATIONS AND REALITIES IN DEVELOPING COMPUTER-ASSISTED LEARNING: THE EXAMPLE OF GraphIT!  Ruth Madigan, Sue Tickner, Margaret Milner  29
6  THE DATA GAME: LEARNING STATISTICS  Stephen Morris, Jill Szuscikiewicz  36
7  CONVERSION OF THE IDEOLOGIES OF WELFARE TO A MULTIMEDIA TEACHING AND LEARNING FORMAT  David Gerrett  47
8  DESIGNNET: TRANSNATIONAL DESIGN PROJECT WORK AT A DISTANCE  Stephen Scrivener, Susan Vernon  54

SECTION THREE: IMPLEMENTING COMPUTER-ASSISTED LEARNING IN THE SOCIAL SCIENCES
9  COMPUTER-AIDED LEARNING AS A LEARNING TOOL: LESSONS FROM EDUCATIONAL THEORY  Graham R. Gibbs, David Robinson  62
10  ANORAKS AND TECHIES: A CALL FOR THE INCORPORATION OF NON-TECHNICAL KNOWLEDGE IN TECHNOLOGICAL DEVELOPMENTS  Vernon Gayle  73
11  EVANGELISM AND AGNOSTICISM IN THE TAKE-UP OF INFORMATION TECHNOLOGY  Danny Lawrence, Ken Levine, Nick Manning  80
12  STANDARDS FOR THE NON-STANDARD: THE IMPACT OF NEW TECHNOLOGY ON THE NON-STANDARD STUDENT  Ann Wilkinson  91

SECTION FOUR: THE EFFECTIVENESS OF THE NEW TECHNOLOGIES IN TEACHING AND LEARNING ENVIRONMENTS
13  INFORMATION TECHNOLOGY AND TEACHING QUALITY ASSESSMENT: REFLECTIONS OF A SOCIOLOGIST  Chris Turner  98
14  WHY COSTS ARE IMPORTANT IN THE ADOPTION AND ASSESSMENT OF NEW EDUCATIONAL TECHNOLOGIES  David Newlands, Alasdair McLean, Fraser Lovie  106
15  USING MULTIMEDIA TECHNOLOGY FOR TEACHING: A CASE STUDY APPROACH  David Crowther, Neil Barnett, Matt Davies  115
16  INFORMATION TECHNOLOGY AND TEACHING THE SOCIAL SCIENCES: OBSTACLES AND OPPORTUNITIES  Duncan Timms  125

GLOSSARY  135
INDEX  137
FOREWORD
Professor Howard Newby Vice-Chancellor, University of Southampton
This book could not be more timely. The shape, structure and content of higher education in the UK are once more under intense public scrutiny, and the role of information technology (IT) in teaching and learning remains a key issue. “Using IT effectively” is central to this. There are those with influence over policy in Brussels, Whitehall and Westminster who still regard IT as some sort of panacea: put students in front of VDU screens, the argument runs, and mass higher education can be provided on the cheap. Such “technological fix” arguments are simply false. But so is their contrary, that new technology has no relevance to the future provision of higher education in this country. Clearly the nature of teaching and learning will be affected, sometimes in quite profound ways, by new developments in IT.

The key words in the title of this book are not so much “IT” (most developments over the next decade are known or knowable), but “using” and “effectively”. The technology will stand or fall by its use. Education is, and will for the foreseeable future remain, at heart a social process. New technology can assist in raising the quality of this process, but it must go with the grain of conventional pedagogy. Without this, the sheer quantity of information available via modern information technology will disable, rather than enable, participation in a genuinely educational process.

In the social sciences these issues are particularly pertinent. The social sciences thrive on debate. While the acquisition of empirical data is an important component of the understanding of society, the facts are never self-evident: they require interpretation. IT is therefore an essential tool in the social sciences, bringing to the desktop the capacity to assimilate and analyze information at a speed and a cost undreamt of less than a generation ago. But it is not a substitute for the informed understanding that also comes from debate and discussion. Social science is, in many respects, the epitome of education as a social process.

The contributions to the collection are not, therefore, narrowly concerned with the technology per se, startling though the advances in this continue to be. Most of the contributors are concerned with the organizational, social and pedagogical use of the new teaching and learning technologies. Relating these technologies to those contexts is crucial if the promise evinced by the new technologies is to be fulfilled. As we look forward to developing a higher education system attuned to the needs of a new century, these chapters remind us of the great care with which new technology must be handled if, indeed, it is to be used effectively.
LIST OF FIGURES AND TABLES

Figures
Figure 1.1  Children using computers: an on-line school session.  5
Figure 1.2  The surrogate head surgeon.  6
Figure 1.3  The virtual university.  7
Figure 4.1  Households in the UK with video recorder, home computer and audio CD player, 1985–94.  21
Figure 4.2  Proportion of UK households with selected media technologies, 1994.  22
Figure 4.3  Access to information and communications technologies in UK households, 1994.  23
Figure 6.1  A painless way to absorb basic information.  40
Figure 6.2  Repeated, varied and purposeful experimentation.  40
Figure 6.3  Explaining the pattern recognition principle of choosing a test.  41
Figure 6.4  Interactivity allows the student to follow their own line of interest.  41
Figure 6.5  Knowing which tests are appropriate.  42
Figure 6.6  More information on each test is available.  42
Figure 6.7  Self-testing enables students to monitor their learning in an exploratory environment.  42
Figure 7.1  Steps in the authoring process (A–M).  49
Figure 7.2  Example of a lesson in Ideologies of Welfare.  50
Figure 9.1  Correlation Explorer.  69
Figure 9.2  A screen from Inspiration.  70
Figure 9.3  The Polygraph from MacLaboratory.  70

Tables
Table 6.1  Areas covered by the software.  37
Table 6.2  Confidence and understanding before and after using the software.  42
Table 7.1  Steps in the authoring process (A–M).  48
NOTES ON CONTRIBUTORS
Neil Barnett is Lecturer in Public Sector Management at Leeds Metropolitan University. His research interests include local government structure and decentralization. He is interested in developing multimedia teaching/learning material for public sector managers and social science students in an interdisciplinary environment.

Anne Campbell is the first woman MP for Cambridge and the city’s third Labour MP. Since her election in April 1992, she has taken a special interest in science and technology, education and economic affairs. She was a member of the House of Commons Select Committee on Science and Technology from 1992 to 1997 and was Vice-Chair of the Parliamentary Office of Science and Technology. Anne Campbell worked with David Blunkett, Secretary of State for Education, on the future of information technology (IT) in education and research from 1995 to 1997. She is currently Parliamentary Private Secretary to John Battle, Minister for Science, Energy and Industry. She has also chaired a sub-group of Labour’s Policy Commission on the Information Superhighway. Educated at Newnham College, Cambridge, she taught mathematics at Cambridge secondary schools before becoming a Senior Lecturer in Statistics at Cambridgeshire College of Arts and Technology (now Anglia Polytechnic University). She was Head of the Statistics and Data Processing Department at the National Institute of Agricultural Botany from 1983 to 1992. She is a Fellow of the Institute of Statisticians, the Royal Statistical Society and the Royal Society of Arts.

Peter Cochrane is Head of Research at BT Laboratories. A graduate of Trent Polytechnic and Essex University, he is also a visiting professor to UCL, Essex and Kent universities. Peter Cochrane has published and lectured widely on technology and its implications for society. He received the IEE Electronics Division Premium in 1986; the Queen’s Award in 1990; and the Martlesham Medal for contributions to fibre optic technology, the Computing and Control Premium, and the IERE Benefactors Prize in 1994.

David Crowther is Lecturer in Management Accounting at Aston Business School, Aston University. His main area of research is corporate performance measurement and behavioural accounting. He is interested in teaching the use of technology, and in particular multimedia, as a teaching/learning tool.

Matt Davies, a qualified chartered accountant, is Lecturer in Financial and Management Accounting at Aston University. His main research interests are the use of shareholder value performance measures and the use of technology in the teaching process.

Vernon Gayle is Lecturer in Sociology at the University of Stirling and responsible for teaching research methods and data analysis to both undergraduates and postgraduates. His own research is mainly concerned with analyzing social and economic data using generalized linear models. He is a committed GLIM4 user and has published a paper that reflects his interests in ordered categorical data analysis. He has also conducted research into a complementary treatment regimen for cancer patients.

David Gerrett received his Bachelor of Pharmacy degree from Queensland University in 1977. After professional registration and working as a community and hospital pharmacist, he returned eight years later to full-time academia and read for a Masters in Hospital Pharmacy. Continuing part-time, he received his doctorate in 1995, which considered the role of community pharmacists as advisers on prescribed medication. In researching this role he was led towards a greater understanding of the literature on social policy and professionalism. An outlet for this understanding was provided by his role as course leader and primary author for the Postgraduate Programme in Social and Administrative Pharmacy, which uses multimedia as its sole teaching and learning method. The conversion of the Ideologies of Welfare was found to have generic appeal and is currently in use on three further postgraduate programmes.

Graham R. Gibbs is Principal Lecturer in Sociology at the University of Huddersfield. He has wide experience of teaching research methods using IT at undergraduate and postgraduate levels. His most recent research has focused on the use of CAL in teaching the social sciences, especially the use of computer-aided co-operative learning in teaching theoretical subjects.

Millsom Henry is the Deputy Director of SocInfo (see Glossary). She graduated from the universities of Durham and Stirling as a social scientist with specific teaching and research interests in the sociology of ethnicity and gender, especially in relation to culture/media and the social implications of the new technologies. She has presented numerous papers at international conferences, published over 12 articles and is editor of three regular publications. Millsom Henry was commissioned to edit three books and to write two chapters for publications due out in 1997. Finally, but not least, she somehow finds time to complete a part-time PhD on the identities of Caribbean women in Britain at the University of Stirling.

Stephen Heppell is Professor of Information Technology in the Learning Environment at Anglia Polytechnic University and Director of ULTRALAB. Stephen Heppell is a member of a number of public committees and acts as a consultant in both the public and private sectors; he also has a long list of television appearances and writes regularly for the popular press. He is on the editorial board of the Journal of Information Technology for Teacher Education and the Journal of Multimedia and has contributed many chapters in books and journals; a full list can be viewed at: http://www.ultralab.anglia.ac.uk/pages/ultralab/team/stephen/contents.html

Adrian Kirkwood is Head of the Programme on Learner Use of Media within the Open University’s Institute of Educational Technology. He undertakes research and evaluation studies related to access and applications of media and information technologies, both within and outside the Open University. He has been a consultant to organizations including British Telecom and the National Council for Educational Technology and has made invited contributions to international conferences. He co-authored Personal computers for distance education (Paul Chapman, 1992) and has published widely on the subject of using media in education and training.

Danny Lawrence is Senior Lecturer in Sociology in the School of Social Studies at the University of Nottingham. His early research in “race” and ethnic group relations led to Black migrants: white natives (Cambridge University Press, 1974; reprinted and reissued by Gregg, 1992) and he has subsequently published many articles in this field. He has since conducted research and published on transmitted deprivation; youth unemployment; the professional aspirations and changing circumstances of the occupational groups responsible for the delivery of careers guidance; the disestablishment of teacher careers; and, most recently, higher education and the labour market.
Ken Levine lectures at the School of Social Studies at Nottingham University. His main sociological research interests are adult literacy and architects as a professional group. He has gained considerable experience using computers with both undergraduate and postgraduate students in a variety of contexts, including courses on statistics and survey design and analysis, as well as introductions to networks, word processing, email and databases. He has collaborated with colleagues (including Danny Lawrence) on the production of CAL courseware (using Authorware Professional) designed for a module on “official” statistics. A long spell as the departmental Computing Officer attempting to meet the needs of staff and student users has taught him that, despite massive advances in technology, the gap between expectations and reality in educational IT is more or less unchanging.

Fraser Lovie is Research Assistant in the Department of Politics and International Relations at the University of Aberdeen. He is also working on the research project on the Internet delivery of international relations courses with Alasdair McLean.

Ruth Madigan is a Senior Lecturer in Sociology at the University of Glasgow. She has taught urban sociology and methods of social research, in particular data analysis using SPSS, for many years. Her article “Gender issues: teaching with computers in sociology” (SocInfo Journal I, 1995) arose directly out of this experience. The TLTP-TILT project (University of Glasgow) offered an opportunity to explore new approaches to computer-aided learning in the area of basic research statistics.

Alasdair McLean is Lecturer in the Department of Politics and International Relations at the University of Aberdeen and is also the Convenor of the Faculty IT User Group. He has considerable experience of distance education through audio-conferencing and leads a research project on the Internet delivery of international relations courses.

Nick Manning is Professor of Social Policy and Sociology, University of Nottingham. He has been interested in introducing IT into the social science curriculum since the late 1980s, both as a general environment for student learning and for specific courses. This has included both data analysis and the construction of lectures in hypertext. He has also developed postgraduate degrees combining social policy and IT, with European Union (EU) funding. His research work is mainly on eastern Europe. This started in the 1980s on social policy, changed to environmental and housing movements in the early 1990s, and is now on (un)employment policy and household experience in Russia. His other areas of work have included medical sociology and health policy, comparative social policy among various OECD countries, and general theories of social problems and social policy.

Margaret Milner is a Lecturer in Quantitative Methods in the Department of Accounting and Finance at the University of Glasgow. She has a keen interest in developing computer applications for the teaching of statistics and quantitative methods to accountancy students and also teaches MBA students and students interested in IT. Her research topics include investigating the distributional properties of accounting ratios and decision-making and report format choices. As a member of the team developing GraphIT!, a TLTP-TILT project, the strategic use of graphs and graphical analysis is also an important teaching and research interest.

Stephen Morris has been producing CAL for many years and was an original ITTI project holder for the production of CAL in medical education, for which the software Statistics for the Terrified was produced. He is one of the prime movers of the MIDRIB project, which will bring together a comprehensive database of peer-reviewed medical images to UK higher education over the Internet. He has also been the head of successful higher education computer units at St Bartholomew’s Hospital (1984–94) and, currently, St George’s Hospital Medical School.
Professor Howard Newby took office as Vice-Chancellor of the University of Southampton on 1 September 1994, moving from the Economic and Social Research Council, where he was first Chairman (1988–94) and then Chief Executive (1994). Professor Newby has a background of research and writing in rural affairs, including many books and articles on social change in rural England, and is a Rural Development Commissioner. He is also a member of a number of government bodies concerned with the funding of research in the UK, including the Dearing Committee Research Working Group; Chairman of the Centre for the Exploitation of Science and Technology; and a member of the executive council of the European Science Foundation. He is Vice-Chair of the Committee of Vice-Chancellors and Principals and serves on a number of its steering and sector groups.

David Newlands is Senior Lecturer in Economics at the University of Aberdeen. His principal research interests in economics include regional economics and the economics of the welfare state. He has also conducted research on new educational technologies and is directing a major project to examine the impact of such technologies on educational provision in Scotland.

David Robinson is Senior Lecturer in Psychology at the University of Huddersfield. His work has included involvement in the development and evaluation of a software system to support general practitioners. His current research interest is the relationship between autobiographical memory and fantasy.

Stephen Scrivener is Professor and Director of the Design Research Centre at the University of Derby. He has published four books and over 50 papers in learned journals, and has refereed conferences and made numerous presentations.

Jill Szuscikiewicz has worked on CAL with Stephen Morris since the original ITTI grant was obtained and, more recently at St George’s Hospital Medical School, has worked on projects with a more medical base such as Immunology Explained and Heartbeat. She is now project manager of MIDRIB, funded under the eLib programme as a joint project with Bristol University.

Sue Tickner is a Teaching and Learning Support Consultant at the University of Glasgow. Originally an English graduate, she worked as a teacher in Britain and Spain before returning to the UK for an MSc in IT and learning (by distance learning). After working as an independent developer/trainer and as an Open University tutor, Sue Tickner joined the University of Glasgow’s TLTP project as designer/co-ordinator for the Numerical Data Group. She has recently been conducting research into distance education and remote learning technologies.

Duncan Timms is Director of SocInfo (see Glossary) and Professor of Sociology in the Department of Applied Social Science in the University of Stirling. He is also Director of Project VARESTILE (the Value-Added Reuse at Stirling of Existing Technology in the Learning Experience), an institutional project funded under the Teaching and Learning Technology Programme. His teaching and research interests encompass two main areas: the social correlates of health and illness, and the social implications of information and communications technologies. He is also the Director of a series of Scottish-Nordic Winter Schools on Comparative Social Research funded by the EU under the Training and Mobility of Researchers Programme.

Chris Turner is Professor of Sociology, University of Stirling. His teaching and research interests include the social constructions of childhood; processes of transition from childhood to adulthood; the impact of state intervention in children’s lives; and state policies on children since 1945.

Susan Vernon is Director of Applied Arts at the University of Derby. Having studied at Gray’s School of Art in Aberdeen (jewellery and printmaking) and at the University of Central England, Birmingham (MA in Industrial Design), Susan regularly exhibits her work at an international level.
Ann Wilkinson is a Senior Research Fellow working as the coordinator of the CTI Centre for Human Services in the Department of Social Work Studies, University of Southampton. The Centre is also funded to provide information and advice to CTI Centres on antidiscriminatory practice. Her previous research includes a European-funded project to provide information on access to higher education for disabled students. This was published as an information system and forms part of her MPhil thesis.
EDITOR’S INTRODUCTION Millsom Henry
This book represents a selection of edited papers, most of which were presented at the first SocInfo (see Glossary) International Conference on Technology and Education in the Social Sciences (TESS) in September 1995 at the University of Stirling. This event provided a unique opportunity for social scientists to share ideas and experiences around the issues of innovation in teaching and learning. The proliferation of technology initiatives, the growth of more generic computer-assisted courseware for higher education, the rapid expansion of higher education, the emergence of the Internet and the Teaching Quality Assessment (TQA) exercises in the UK all provided the backdrop for many of the issues raised at the conference. As a result, the papers in this collection represent an attempt to document, as well as to stimulate, some critical debate on the impact of these technologies on teaching and learning.

Although studies are emerging in the social sciences on the varying forms of technology and its related sub-cultures, more work is needed on how technology affects everyday life. The social, political and economic implications of the new technologies should be a central concern for the social sciences. Interestingly, this was recognized by the Economic and Social Research Council (ESRC) with the launch in spring 1997 of a new research programme on the role of virtual technologies in society.1 This programme will pay attention to the wide-ranging implications of new technology, which to date have been missing in social scientific research. This must be good news. It is heartening to note that the themes surrounding the effective use of technology in education were also highlighted as key areas in the new ESRC programme.

1. ESRC Virtual Societies Programme directed by Professor Steve Woolgar at Brunel University. For more details refer to: http://www.esrc.ac.uk/programmes

Using IT effectively: a guide to technology in the social sciences examines in detail some of the major issues associated with the development, impact, implementation and assessment of technology, particularly within the social sciences. In UK higher education institutions, the teaching and learning process is currently undergoing a major revolution, and a more sustained examination of the development, implementation, assessment and impact of technology within that process is required. This book should be read as a basis for further investigations into both the positive and negative consequences of technology.

Section One contains four key contributions from business, politics, academia and the technological field on the changing effect of technology on the teaching and learning process. In Chapter 1 Peter Cochrane points out the significance of expanding the traditional way of looking at teaching and learning. As the Head of BT Research Laboratories, Cochrane contends that while the younger generation “are embracing IT and rapidly gaining skills, the teaching profession remains dominated by a population resisting, or unable to see the need for, change”. This has led to a gap between students and staff and is, according to Cochrane, a cause for concern. In addition, there is a failure to appreciate the significance of IT as an effective tool for teaching and learning. Cochrane insists that “IT is not an ‘instead of’ but an ‘as well as’ technology. It is unlikely to replace the teacher or the institution but it will change their nature. In future, education will have to be more available, just-in-time, and on-line as it becomes a continuous life-long process.” The key point of this chapter is that there is a dramatic shift in the way formal education is viewed, which has implications for staff, students and the nature of higher education disciplines. Social scientists should be interested in these shifts not only as an area for external enquiry, but also in order to investigate ways in which IT can be used more effectively as a tool within their own discipline areas.

In Chapter 2 Stephen Heppell, the Director of ULTRALAB, points out how developments in educational technology contrast sharply “between rapidly expanding advancing technological potential and slower pedagogical, social and political development”. This has led to a number of tensions “between the emergent capabilities both of the technology itself and of the ‘children of the information age’; the challenge that these capabilities pose for existing models of education and assessment; the challenges posed for public policy and the social implications of technology for work, gender, family and education”. These tensions are of central concern to the social science disciplines, yet to date we have failed to engage critically in the debate.

Chapter 3 focuses on the need to use the technologies effectively to encourage innovative research and to assist in the development of good teaching and learning skills.
Anne Campbell, the Labour MP for Cambridge, demonstrates how technology can be used to improve access to the political as well as the learning process by describing the design and management of her online political surgery, one of the first in the UK. Campbell highlights the issues of access and empowerment which are central to the debate about the implications of the new technologies, and insists that the information revolution should “not worsen the divisions in society, but is used to enable opportunity, equality and democracy”.

In the final chapter of this section, Adrian Kirkwood also picks up on some of the negative consequences of technological development and considers whether it will exacerbate social differences. The evidence to date suggests that, whether in the home or at work, technology has tended to reinforce existing social differences in relation to gender and class. In this regard, Kirkwood argues that these concerns represent a valid subject for social scientific enquiry.

In Section Two, a few examples of the development of courseware in the social sciences are highlighted. Ruth Madigan, Sue Tickner and Margaret Milner describe their work in Chapter 5 as part of an interdisciplinary team producing a basic statistics CAL program. As self-confessed “relative novices in courseware development”, the authors describe how the design and implementation of their program forced a fundamental re-evaluation of their own teaching methods. Their case study provides some salutary lessons about collaboration in courseware development and raises issues about the effective use of technology for teaching and learning.

In Chapter 6, Stephen Morris and Jill Szuscikiewicz also outline their attempts to develop, assess and implement a statistical program for students. The chapter demonstrates how to exploit the technology and introduce graphics and simulations to encourage practical experimentation. By employing these methods, Morris and Szuscikiewicz assert, students will gain a deeper appreciation of statistics.

In Chapter 7, David Gerrett describes how he developed a programme based on an existing core course. Gerrett utilized the literature about the ideologies of welfare in social and public policy to create what he called “one-to-one, non-judgemental tuition”.

Chapter 8 documents Stephen Scrivener’s and Susan Vernon’s vision of the future, in which collaborative group work by designers, as part of international teams supported by computer and electronically mediated communication and CSCW (computer-supported co-operative work) tools, will predominate. This
form of learning has implications for all discipline areas and may be particularly of use to the social science community who should be able to fully exploit the capability of computer-mediated communications. The contributions in Section Three focus more directly on the implementation of CAL programs in the social sciences. In Chapter 9, Graham R.Gibbs and David Robinson argue that attempts to replace the teacher with technology are unhelpful and relate to a much wider societal process of deskilling. Developments in CAL, they argue, should be used to enhance teaching skills by providing flexible learning tools rather than seeking to replace or deskill teachers. In Chapter 10, Vernon Gayle points out how the influx of IT in higher education has been poorly conceived and ineffectively implemented. Gayle argues that the inclusion of a sociological account of the teaching and learning environment serves to provide “an empirical account of the sociality of the teaching and learning environment and incorporates the knowledgeability held within the non-technical perspectives”. Nick Manning, Danny Lawrence and Ken Levine examine some of the reasons why academics have been reluctant to embrace technology in their teaching and research areas in Chapter 11. Paying particular attention to the attitudes, organizational ethos and context of academia, the authors maintain that a number of factors have hindered the successful development and use of IT. In response to Brackenbury’s & Hague’s (1995) article, Manning et al. argue that, rather than being dismissed as irrational, the actions of academics who do not embrace technology may actually be based on calculation. Ann Wilkinson’s contribution in Chapter 12 addresses how the implementation of technology affects groups defined as “non-standard” students. Wilkinson points out that educational technologies should be providing the opportunity to look at different approaches to teaching and learning that benefit all. 
In Section Four, Chris Turner reflects on his recent experience of TQA in Scotland by exploring both the current and potential uses of IT in the learning and teaching of sociology in higher education. Focusing in more detail on cost, David Newlands, Alasdair McLean and Fraser Lovie in Chapter 14 stress the importance of comparing the costs of different technologies, as well as examining the evidence on students' learning achievements and experiences. David Crowther, Neil Barnett and Matt Davies assess how the introduction of computer-based learning programs has been driven by the desire to achieve efficiency savings. However, as the authors point out, "efficient teaching may not represent efficient learning". Crowther, Barnett and Davies proceed to outline the general failure to exploit multimedia technology, particularly in the social sciences, before describing how to maximize both effective learning and efficient teaching. Finally, in Chapter 16, Duncan Timms examines the obstacles and the opportunities of IT teaching in the social sciences. According to Timms, the development of computer-based learning in the social sciences has been slow, with the exception of its use in data collection and analysis. Consequently, the teaching of IT in the social sciences has remained largely unchanged. The reasons for this are both general and specific, and as a result the pressures are also both positive and negative. The chapter ends with a brief look ahead to the direction that the social science community should take in relation to the pervading role of technology. The issues surrounding developments in technology are undoubtedly wide-ranging, as these chapters show. There are issues which still need to be identified as well as resolved. This book should be seen as an attempt to provide a working document on some of the issues raised.
It is clear that, as a research area, the social, political and economic implications of technology will expand over the next few years; and in education it is evident that, with the continuing expansion of higher education, issues of access, quality, effectiveness and choice will be central. Consequently, the pedagogic nature of disciplines, the structure of universities, the teaching and learning styles of both staff and students, and any partnership with industry, commerce and public policy must be strategically reviewed. This will undoubtedly involve a huge investment of time as well as financial and other resources. Nevertheless, for the first time, the social science community is well placed to take the lead and to shape policy in this area. The role of technology, then, must be an issue placed high on the social science agenda.
Section One NEW CHALLENGES FOR TEACHING AND LEARNING
Chapter 1 EXPONENTIAL EDUCATION Peter Cochrane
We are living at a time of unprecedented change, with technology advancing faster, and producing more new opportunities, than ever before (Lilley 1995). IT has created not only the mechanisms to do more with less, but also the means of storing, accessing and transporting information on a scale inconceivable just ten years ago (Emmot 1995). Technology feeding technology, with machines used to design better machines, is the evolutionary process responsible for the exponential capability growth now driving society. In contrast, our wetware (the brain between our ears) has seen no significant change during the past 150,000 years, and in evolutionary terms mankind is in stasis (Calvin 1991). So if we are to survive in a technologically driven world that is changing faster than we can biologically accommodate, we have to use the very technology that engendered our predicament to help us cope; it is our only course of action. Going back to earlier, and in many respects simpler, times is not an option—no matter how distortedly attractive it may appear (Bronowski 1973). The progress of our species has always been, and remains, irrevocably linked to innovation and technology—and it is one way only! We just could not support the world's population of over 5 billion without the technology we have come to take for granted (Toffler 1971).

Human-technology perspective

Only 2,000 years ago most of humankind lived in tribal communities of just a few hundred individuals, meeting and knowing fewer than 1,000 people in a lifetime. For this life we were well equipped, with all of the tribe's knowledge contained in the human brain, and passed on from father to son, mother to daughter. For most, all the information they ever required was within the tribe. Civilization, cities and trade changed all this, and in a period of less than 200 years the transition from a farming and rural existence to the Industrial Age was completed (Bronowski 1973).
During this transition, the ability to transport large quantities of goods and people across the planet emerged, creating a demand for good telecommunications. It is interesting to reflect that colonization and supremacy in war were the primary motives for the development of much of our industry, and have led directly to today's revolution in IT. More impressively, we have created a new era in much less than 100 years. When De Forest invented the thermionic triode valve in 1906 he could never have guessed the revolution that he was starting. The next major step was the invention of the transistor in 1947 by Shockley, Bardeen and Brattain, to be followed by the integrated circuit in 1958, the laser in 1960, and optical fibre in 1966. In the last 50 years we have seen the world become dominated by electronics (chips) and optical fibre. As a result, computers and communication are now ubiquitous, and we have created more information, achieved and understood more, than all of the past generations since we first discovered fire. This pace of change will not only continue, but accelerate: and the trajectory is now clear—it is exponential. Every year (or thereabouts) sees optical fibre transporting twice as much traffic (Cochrane & Heatley 1995), memory chips storing twice as much data, and computers running twice as fast (Emmot 1995). Many people consider English to be the planet's primary language, and speech to be the most sophisticated and dominant form of communication. Well, they are wrong. The dominant form is now binary, and it is between machines, which have more conversations per day than mankind has had in its entire existence (Drexler 1990). We can now wear more computing power in a wristwatch than was provided by a commercial computer the size of a domestic washing machine 30 years ago. In 10 years the PC will be around 1,000 times more powerful than today, and in 20 years nearly 1,000,000 times more. By about 2015 supercomputers will have reached human equivalence in terms of information storage and processing, and by 2025 that power will be available on our desks. About five years later (Calvin 1991, Regis 1991) computers will be wearing us! If we are to maintain a primary role on this planet, we must understand technology and use it to advance our own limited brain capacity. It is not possible to ignore these changes, for they are inexorable, and will promote even more change (Cochrane 1995b, Kennedy 1993). In short, you can opt out, but you cannot escape.

Antagonistic technology

There is absolutely no doubt that most IT interfaces seem to have been designed by people who feel we should all be computer scientists (Norman 1988). This is definitely the wrong approach. Most people have great difficulty driving a VHS video recorder, let alone a PC. Unless we humanize machines (make devices extremely user-friendly and easy to use), a society divided by its abilities with machines will be born. This would be a disastrous society of IT "haves and have-nots", full of tension, and sub-optimal for our own productivity, progress and survival.
It is vital, therefore, that technology is bent into people, and people are not bent further into technology (Emmot 1995). Today the primary interface is the button, switch, knob, mouse, keyboard and screen. This can only be viewed as archaic, and something that should not survive. Fortunately, technology is now reaching a point where voice control and command, and even limited conversations between people and machines, are possible (Cochrane & Westall 1995). This Star Trek vision is the first step in the journey to a symbiotic relationship between carbon (us) and silicon (chips) life forms. It is also the first example of directly linking the nervous systems of two different entities. The next extension will be our sense of touch, as fingertips and other sensory areas are coupled directly into machines (Drexler 1990). In the meantime we have to make do with sight and sound, head-mounted screens, cameras, microphones and earphones. But even with this limited technology we can achieve a tremendous expansion and change in our abilities and society as we increase the access to, and throughput of, information and experience (Earnshaw & Vince 1995).

Education

In the slow-moving world of the ancients, who wrote and drew in the sand, on clay and parchment, education followed the master-disciple model, whereby only a select few were chosen to be educated by a very few teachers. The world was a slow-moving place where innovation and technology were alternately promoted and constrained by war and religion (Bronowski 1973). With the invention of the printing press major changes evolved: this was a new means of propagating the written word and, more importantly, ideas. Mass education started to take off; formal systems, teachers and classes grew in size and number throughout the developing world. Up to, and throughout, the industrial revolution this "Sage-on-the-Stage" system of imparting knowledge was very effective. Regimented classes of 30–50 children, drilled by a single teacher, proved an efficient and essential means of educating the armies of people required to fuel the transition of society from agriculture and cottage industry to mass production and industrialization. Up to the end of this era, change was still relatively modest and within the grasp of the individual, and so was education. However, at the dawn of the information age, the system and individuals were beginning to creak under the pace of change and the demand for more diversity (Toffler 1971). Long-held wisdoms of science, technology, economics and commerce started to shift or became increasingly challenged. In contrast, other topics such as mathematics, history, sociology and law remained relatively stable for a further 30 years. Today, nothing is stable, nothing goes unchallenged, and certainly our accepted modes of education and training are under threat as the world accelerates into the information age (Emmot 1995). Let us examine this change in more detail. Thirty years ago the vast majority of children came from homes with few books and went to school for education. Today, unfortunately, the reverse is often true. Many children with top-end computers, CDs and network access at home see school as having little to offer. Interestingly, in numerous programmes with children, it has become abundantly clear that the primary impediment to progress is not the young people; it is the older generation, trying to impart their experience and knowledge, who present the key limitation (Cochrane 1995a). For the most part our society appears divided at about the age of 29, with those older computer illiterate, and those younger fully able. So it is not unusual to find a class dominated by a teacher who is IT illiterate, and who feels threatened by a class full of capability. This problem is compounded by the lifestyle of children, which is now partly governed by the games environment (Martyn, Vickers, Feeney 1990) of intuitive learning and a "crash and burn" culture.
They feel no inhibition in discovering by doing, and coming to grief in full public gaze, while the cultural background of their elders is the converse. It is perhaps not surprising to find that many youngsters view university, college and school as boring, where the teaching methods have not changed in aeons. These young people come from a world of instant gratification, of IT, of rapid access and experience, of new and dynamic skills learnt in new ways (Cochrane (ed.) 1994a).

Examples of new ways

Just two decades ago a young child would have learnt to tell the time on an analogue clock-face, and the digital form would have been unusual. If they later developed an interest in science, engineering or flying, they would come to grips with the vernier scale and the altimeter by a single-step analogy with telling the time. Today, the converse is often the case. Children learn about flying very early, and you cannot fly an F16 simulator if you do not learn about cockpit instrumentation. So telling the time on an analogue display involves analogical reasoning in the reverse direction from 20 years ago. Finding information has always been a social activity. The Dickensian library offered a degree of order and mapping that allowed a fair degree of success by the individual. However, much of the information-retrieval process of this old world involved finding knowledgeable people; teachers, friends and colleagues could usually help steer us in the right direction. In the IT world we now have search engines and Gofers or Agents that serve the same purpose (Milne & Montgomery 1994). We can also communicate electronically with vastly more people to gain their assistance and steer. With such devices, students can search, find, sort and assemble information hundreds of times faster than previous generations. Curiously, older people, especially teachers, often consider this as cheating.
There are now over 24,000 CD titles available for use with PCs, containing everything from classic books and whole-body interactive encyclopaedias to scientific experiments and university degree courses. The teaching of some difficult topics in science, statistics and engineering can now be enhanced significantly through computer animation and visual representation. Instead of static words and two-dimensional pictures on paper, students can interact with three-dimensional entities on the screen to experience cause and effect at first hand. There is a growing library of standard experiments and situations available, along with medical operations, Shakespearean plays and legal cases. In this regard, interactive multimedia is providing an often superior alternative (Martyn, Vickers, Feeney 1990) to individual teachers and books for large tranches of education. It is now possible to illustrate and explain immensely complex systems and situations with the technology of visualization and virtual reality (MacDonald & Vince 1994). Unlike Crick and Watson, students should not have to construct a model of DNA using cardboard and coathangers (Crick 1994). Access to mathematical representations in a visual form that is exciting, stimulating and edifying is now a given in modern industry. Leading manufacturers no longer construct prototypes, but build the real thing in virtual space, and then go straight to the production line with the finished product (Earnshaw & Vince 1995; Yates 1992). Education needs this technology too.

Figure 1.1 Children using computers: an on-line school session.

Shared experiences

With telepresence technology it is now possible for a one-to-many or one-to-one experience to be realized efficiently on a massive scale. The surrogate head is just one development, in which miniature television cameras above the eyes, and microphones above the ears, collect information in real space and time. This can then be transmitted and displayed on a screen, or a VR headset, to one or more people in any location on the planet. So a surgeon can perform an operation with a thousand students standing inside his or her head looking out. Conversely, when a protégé performs the same operation, the surgeon can stand inside and advise in the closest possible sense (Cochrane 1994).
Figure 1.2 The surrogate head surgeon.
Within the next 15 years the addition of touch to such systems will make this human experience almost complete. This might sound far-fetched, but it exists in the laboratory today, and has been used for real operations on humans over standard dial-up ISDN circuits (Cochrane, Heatley, Pearson 1995). This technology is applicable to a wide range of disciplines, and has the potential to change the education and training paradigm completely, to just-in-time.

Half-life education

In fast-moving areas of technology many degrees now have a half-life of less than five years. Moreover, the time when a single-discipline degree was sufficient for a lifetime of work has long gone (Gell & Cochrane 1995). For example, it is not unusual to find electrical engineers now concerned with biology, sociology and genetics. So it seems time to create a new form of degree that is much lower, broader, more generic, and able to equip people for a world that will change rapidly over a working lifetime. In addition, a series of higher degrees is necessary that can be rapidly acquired as technology and work practices change. However, as business life and industry also accelerate and demand increases, so does the pressure to hang on to the scarce well-trained resource that is key to the success of the very enterprise itself (Hague 1991).

Virtual university

It is partly in response to the above paradox that five years ago BT created a series of internal degree courses. Their organization and running were under the auspices of several conventional universities, banded together to create the desired profile and course content (Cochrane 1995b). Interestingly, this content is increasingly dynamic, as each year sees the course material change to meet the needs of a fast-moving business. Everyone wins: the students, who become empowered and capable; the company, which has the workforce it requires; and the universities, which gain access to key people and activities in industry.
Figure 1.3 The virtual university.
At first the courses were conventional, with students and teacher gathered in a lecture theatre for a few hours a week, followed by tutorials and assignments. More recently a new format began to unfold, whereby lecturers from North America and other regions were teleported into the lecture theatre by suitably mounted cameras and ISDN dial-up lines. They appear on a three-metre-square back-projected screen to give their lectures eye-to-eye. Only two years ago such a lecture cost £60 for the communication connection; today it is only £40, much less than the hotel charges for a real lecturer in a real hotel. There are those who would argue that this is not a real experience, and that it is not as good as the real thing. While this may be true, the choice is rather more stark: either you have the electronic experience, or none at all! On that basis, the students would sooner have world experts in front of them electronically than never have their presence. More recently the next step has been taken: teleporting the event to the desks of individual students, so that they no longer have to break away from work, and they do not have to crowd into a lecture theatre. They can now attend courses or tutorials, and interact with each other, directly on the screen. Within BT this experiment has now been ratified as the primary model for future company education and training. The key discovery has been that the downside of apparent isolation at the desk can be overcome by a series of short communal periods in which everyone on the course meets and works together. To date, technology presents a poor meeting environment for people: the images are small and distorted, often with sound and vision slightly disconnected. After a first real face-to-face meeting, however, these deficiencies tend to be overlooked and the participants just get on with working together.
In the not-too-distant future new display and audio technology will provide life-size images of near-zero distortion and daylight brightness. This is expected to extend this education and training regime significantly, and may totally remove the current need for real interaction. Experiments on breaking down the social barriers (Cooper 1994) and establishing trust and relationships will thus form a primary target in the next phase of development.

The critics

There are very few of us who look forward to, or enjoy, change on a large scale (Toffler 1971). This is certainly true of the education establishment and of many who are involved indirectly. The primary direction of criticism always seems to be: "That's not the way they did it in my day!" I suppose if we went back to the time of Archimedes and Aristotle, people were saying much the same thing about their methods of teaching. The reality is that just 50 years ago in British universities lecturing and teaching practice was totally different. Today you can still see the benches at the front of lecture theatres where experiments on a grand scale would be conducted in front of an enthralled class. This was real experience, and teaching in a manner that is now long lost. Why? Because education has been squeezed and changed continually. This has resulted in small universities with very small departments trying to do far too much in too short a time. Students are being asked to take in more and more information and experience in less time, while staff-student contact time continues to decline (Gell & Cochrane 1994). Ultimately, education is becoming impossible relative to the rate and breadth of change in a world of technologically driven progress (Ravitch 1995). Most active university staff have far too many research students, and far too many classes to teach. To exacerbate the problem, most university departments seriously lack the necessary number of people with the right abilities. No doubt all of the abilities required to create a suitably well-qualified, skilled and able department are available in the country. However, they are seldom, if ever, available in one location—a university (Hague 1991).
The virtual university, an ethereal space in the information world, overcomes this problem, and allows groups of people with the right interests and skills to come together to work and be proactive. The problem is that it does mean a different mind-set, and a different way of doing things (Lyons & Gell 1994). Unfortunately for the traditionalists, there is no other solution that will allow us to meet the challenge of technologically driven change in our society. It is, therefore, imperative that we embrace the technology, and experiment to find out what works and what doesn't (Handy 1990).

The virtual world today

On a Saturday morning I can struggle into Ipswich, park my car, walk across town, buy some software at a high price and pay VAT. Alternatively, I can go onto the Net, access the software directly from the supplier in the USA, pull it down the network, and pay for it without even leaving my machine. The advantage lies not only in the time and inconvenience saved but in the lower cost of a product that no longer requires a wholesaler, distributor, retail outlet or VAT. The same is true for the library and the bookstall, and potentially for all forms of "soft products". Such thoughts alarm many people when they ought to make them feel relaxed, for this virtual world is not an instead-of but an as-well-as technology. It opens opportunities for new ways of doing things, new forms of trading and enterprise, and new dynamic markets. Shopping, entertainment, education and training, from your desktop at home, at work, or wherever you happen to be, are now very real options (Heldman 1988).
Society and change

While technology changes our world irrevocably, there are some features of it that will remain for many decades to come—but not many. For example, consider such stable institutions as government, banking and the City of London. We currently have a governing mechanism that involves people sitting two sword-lengths apart, acting like demented schoolchildren in lengthy eyeball-to-eyeball debates. The decision-making processes of this system are orders of magnitude slower than their counterparts in the virtual (electronic) world. Similarly, financial institutions are being touched by technology in ways that are changing them, and impacting on our society. The vast majority of bank branches are no longer required; it is possible to run the entire operation from one location, or even no location at all. The same is true of the City, as it now deals primarily with information rather than money: gold has become an abstract concept, as are the pound and other currencies. If such solid institutions are being challenged (Drucker 1993) by technologically driven change, then the role of an education system is to prepare the population for the new world that will result. It is vital that the education and training sector produces the right people with the right skills. This will not happen by following the market; education (Gell & Cochrane 1994) has to get ahead. The difference between the old world and the new is exemplified by the typing pool. Only a decade ago most large organizations had such a resource, staffed by young women whose sole purpose was to take handwritten or spoken text and transcribe it onto the typed page. The process could take several days, depending on the queue length. The very thought is inconceivable today; who would operate in such a way? Things are now turned around in a matter of minutes, not days.
Modern companies operate with telephone, fax, e-mail and video conferencing; they have very low, flat structures (Lyons & Gell 1994), with people empowered to make local decisions and get on with the job fast. Any form of delay is just inviting the competition to take away your business and markets. The same is increasingly true in education and training—any school, college, university or training establishment that sits back and continues to use exclusively the old chalk-and-talk methods is destined for extinction (Gell & Cochrane 1995; Ravitch 1995). We have to move forward with the technology if we are going to keep up with a world that is changing ever faster.

The future

My father had a working life of 100,000 hours; I can now do his work in 10,000 hours; my son will be able to do it in 1,000 hours, and so on (Handy 1990). The work that took me a whole morning as a young engineer is now completed in less than 15 seconds by the power of computer-based automation. This level of progress is assured for at least another decade, as we can see all of the techniques, and all of the technologies, on the laboratory bench today. It is likely that this progress will continue for at least another two decades, and probably three, but after that we reach the ultimate limit of using sub-atomic particles as components (Drexler 1990). There is little doubt, as history shows, that our innate curiosity, creativity and inventiveness will generate even more technology (Bronowski 1973) and cause more change beyond silicon and silica. However, we are at a unique epoch, and there is a new proviso: for the first time in our entire history, we have to keep up with the technology. We have to stay ahead, stay educated and trained, and somehow understand things that currently defy our limited wetware. Tapping the exponential power of the technology itself (Pagels 1988) appears the only option if we are to live and prosper as individuals and as a society (Cochrane (ed.) 1994b).
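The quantitative claims made earlier in this chapter (a thousandfold gain in PC power over ten years, a millionfold over twenty, and a degree half-life of under five years) are all instances of simple compound doubling and halving. The following is an illustrative sketch only, assuming the chapter's round figures of an annual doubling period for computing power and a five-year half-life for a degree's currency:

```python
def growth_factor(years, doubling_period=1.0):
    """Multiplicative gain after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

def fraction_current(years, half_life=5.0):
    """Fraction of a degree's content still current after `years`."""
    return 0.5 ** (years / half_life)

# Annual doubling gives roughly 1,000x in a decade and 1,000,000x in two.
print(growth_factor(10))     # 1024.0
print(growth_factor(20))     # 1048576.0

# With a five-year half-life, only a quarter of a degree survives ten years.
print(fraction_current(10))  # 0.25
```

The sketch shows only the arithmetic behind the projections, not the plausibility of the assumed doubling period or half-life; on these assumptions, the human-equivalence dates quoted earlier are simply further points on the same curve.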
References

J. Bronowski, "The Ascent of Man" (television series) (London: BBC, 1973).
W. H. Calvin, The ascent of mind (New York: Bantam, 1991).
P. Cochrane, "Communications, care and cure", Telemed '94 conference, Hammersmith Hospital, London, 1–4 September 1994.
P. Cochrane (ed.), "The potential for multimedia, information technology and public policy", The Journal of the Parliamentary Information Technology Committee 13, 3, Summer 1994a.
P. Cochrane (ed.), Special Series on "The 21st Century", British Telecommunications Engineering 13, Pt 1, April 1994, continuing into 1996. Features a wide range of articles on technologies and applications concerned with education and training.
P. Cochrane, "Desperate race to keep up with children", The Times Educational Supplement, 23 June 1995a, p. 25.
P. Cochrane, "The virtual university", Business of Education 5, March 1995b.
P. Cochrane & D. J. T. Heatley, "Aspects of optical transparency", British Telecom Engineering 14, 1, April 1995, pp. 33–7.
P. Cochrane, D. J. T. Heatley & I. D. Pearson, "Who cares?", British Telecom Engineering 14, 3, October 1995, pp. 225–32.
P. Cochrane & F. Westall, "It would be good to talk!", paper presented at the Second Language Engineering Convention, London, October 1995.
M. Cooper, "Human factors in telecommunications engineering", special issue of British Telecom Engineering Journal 13, 1994.
F. Crick, The astonishing hypothesis: the scientific search for the soul (London: Simon & Schuster, 1994).
K. E. Drexler, Engines of creation: the coming era of nanotechnology (Oxford: Oxford University Press, 1990).
P. Drucker, Post-capitalist society (Oxford: Butterworth-Heinemann, 1993).
R. A. Earnshaw & J. A. Vince, Computer graphics: developments in virtual environments (London: Academic Press, 1995).
S. Emmot, Information superhighways: multimedia users and futures (London: Academic Press, 1995).
M. Gell & P. Cochrane, "Education and the birth of the experience industry", paper presented at the European Technology in Learning Conference, Birmingham, 16–18 November 1994.
M. Gell & P. Cochrane, "Turbulence signals a lucrative experience", The Times Higher Education Supplement, 10 March 1995, p. 11.
D. Hague, "Beyond universities: a new republic of the intellect", Hobart Paper, Institute of Economic Affairs, London, 1991.
C. Handy, The age of unreason (London: Arrow, 1990).
R. K. Heldman, ISDN in the information marketplace (Blue Ridge Summit, PA: TAB Books, 1988).
P. Kennedy, Preparing for the 21st century (London: HarperCollins, 1993).
R. Lilley, Future proofing (London: Radcliffe Press, 1995).
M. Lyons & M. Gell, "Companies and communications in the next century", British Telecommunications Engineering Journal 13, 2, 1994, p. 112.
L. MacDonald & J. Vince, Interacting with virtual environments (Chichester: Wiley, 1994).
J. Martyn, P. Vickers & M. Feeney, Information UK 2000, British Library Research (London: Bowker-Saur, 1990).
R. Milne & A. Montgomery, Proceedings of Expert Systems 94, British Computer Society (Oxford: Information Press, 1994).
D. A. Norman, The psychology of everyday things (New York: HarperCollins, 1988).
H. R. Pagels, The dreams of reason: the computer and the rise of the sciences of complexity (London: Bantam New Age Books, 1988).
D. Ravitch, "When school comes to you: the coming transformation of education and its underside", The Economist, 11 September 1995, pp. 53–5.
E. Regis, Great mambo chicken and the transhuman condition (London: Penguin Books, 1991).
A. Toffler, Future shock (London: Pan Books, 1971).
I. Yates, Innovation investment and survival (London: The Royal Academy of Engineering, 1992).
Chapter 2 PEDAGOGY, PROGRESS, POLITICS AND POWER IN THE INFORMATION AGE Stephen Heppell
The development of educational technology, from Skinnerian teaching machines onwards, has offered a contrast between rapidly advancing technological potential and slower pedagogical, social and political development. Initially this was not necessarily disadvantageous: in many cases technology failed to deliver on its potential, often embodying models of learning that owed more to convenience than to cognition. Technology's contribution was often touted with a triumph of hype over hope, and institutional learning's conservatism provided a pragmatic, and welcome, buffer against the tide of misplaced optimism. However, it is wrong to assume blithely that this will continue to describe the state of affairs in learning technology. Technology continues to advance its potential exponentially and, inexorably, we reach a point where rhetoric is eclipsed by reality and learning technology holds out the hope of simply better learning, whatever we mean by that. Unfortunately it is not as simple as "sitting and waiting" for progress to take root. Many systemic barriers to progress must be surmounted, and many confusions result from the mismatch of technological and pedagogical progress. Far from presenting a model of irredeemable technological determinism, the choices thrown up by these confusions are the stuff of political and social debate, with real choices to be made. For example, it could be suggested that increasingly affordable micro-technology liberates individuals from old forms of capital. In publishing and communications, for example, we have seen the economies-of-scale barriers to entry of new competition fall away rapidly, as everyone with a desktop micro and a laser printer becomes a publishing house, and the Internet has offered access to vast audiences for minimal capital outlay. But equally we could argue that, although the barriers are lower, lack of access to new communication technology has created a further disenfranchised techno-poor minority.
Similar political and social debate should surround concepts of what public service looks like in cyberspace, whether information is a new factor of production or a new form of capital, whether teleworking liberates or imprisons…and so on. Technological progress in this way is posing some fundamental questions for nations which are a long way from the simple grasping of “the white heat of technology”. Indeed, as telecommunications reduce our reliance on geographical proximity the concept of the nation state itself becomes challenged; will I vote and pay taxes with my geographical neighbour, or with the electronic community that I work, shop and socialize with? However, these are broad and general issues for future debate. This paper will reflect on more pressing concerns: first, the emergent capabilities both of technology itself and of the “children of the information age”, reflecting on the challenge that these capabilities pose for existing models of education and assessment. As I demonstrated in my conference plenary address,1 technology allows us to offer powerful support for small cultures whether they are linguistically determined (for example Catalan) or (like Deaf culture) based on some other common circumstances. We can dictate to our computers, they can reward our engagement with multiple media types—speech, text, graphics, aural ambience, video—a tapestry of cues, clues and primary information. This should not mean that we require our school students to be media eclectic, strong in every
media type that technology can support; already we disadvantage those not fluent in textual notation (for example dyslexics) by an insistence that we filter much of our children’s learning through the ability to represent it textually. Requiring them to be strong in all other media too would be to further narrow the corridor of success. What we are seeking is media redundancy where children can derive and represent meaning from a menu of media types, and this of course has profound implications for the assessment and examination system. Children too pose challenges to that assessment system. It is clear from research at ULTRALAB that children are adept performers with (and through) technology. Faced with new tasks and problems they adopt strategies (for example Observe, Question, Hypothesize, Test and Reflect), they represent Process to each other (“look at how I did this” rather than “look at what I did”), they work collaboratively and they multi-task. When asked for example to watch multiple television programmes simultaneously they adopted a strategy which reflected their own understanding of media (for example they used their knowledge of genre and of the role of aural information) which allowed them to answer detailed questions afterwards about minutiae (“what colour was the…”) and also successfully to tackle meta-level questions about character development and production decisions. They showed themselves to be highly media literate and yet much of our pedagogy and assessment fails to allow them to reflect this capability. Worse still, as we abandon (for good reasons) our reliance on norm-referenced testing in favour of criterion referencing we find that technology moves the criteria faster than we can pin them down, with the result that either the assessment model becomes an unacceptable drag on progress or we are uncertain about the quality of our assessment procedures. 
Ten years ago I could have gained a recognized qualification by typing at n words per minute on a manual Remington typewriter. Now “speech to text” technology lets me dictate to a portable computer faster and with fewer errors; do I still qualify for the certificate? Our constant problem with technology has been to look at its impact on an existing model of behaviour. Too often we make judgements from a deficiency model of both people and technology (“they can’t use it and it doesn’t work”). In 1939 the New York Times commented that “The problem with television is that people must sit and keep their eyes glued to the screen. The average American family doesn’t have the time for it”, which undervalued the ability of technology and of individuals to modify behaviours. In 1967 Chu & Schramm looked back on half a decade of research into the impact of colour television on learning. They concluded from the research data that “Where learning is concerned colour television has no distinct advantage over monochrome”, which was true in retrospect because at that point television companies had failed to grasp the new ways that colour might allow them to represent knowledge and entertainment. One of the first TV entertainment programmes to be converted into colour was The Black and White Minstrel Show, which shows how easy it is to miss both social and technological change. Similarly today much of the output of publishers on CD-ROM is simply in the form of electronic books and the resultant multimediocre both misrepresents the potential of the technology and undervalues the capability and new literacy of its users. A second important area for current debate centres around the shape of our media services and the institutional or public policy that attempts to keep pace with them. 
As computers put communication tools into more and more hands the national debate about preserving standards and about what is appropriate or inappropriate is reminiscent of the church’s rearguard action to preserve literacy for itself as printing began to impact on our social lives. From pirate radio onwards the democratization of communications has been characterized by stout defences of position by existing institutions. In the context of learning, schools and universities as

1. At the First International SocInfo Conference, Technology and Education in the Social Sciences (TESS), 5–7 September 1995. For details refer to: http://www.stir.ac.uk/socinfo
institutions too have worked to preserve and strengthen their role in formal learning. Schools even encourage parents to create little institutional microcosms in the home by sending students back with homework, while parents respond in some cases by setting up little school desks and trying to recreate the classroom in the bedroom (“you wouldn’t have the radio on in the classroom, would you?”). Suddenly, however, the learning industry looks a lot bigger than schools and universities and high-quality learning will be occurring through other channels like the digital annotative side channels offering parallel commentary to TV programmes. A huge challenge for educational institutions will be the way in which they respond to these new learning environments. There are already popular “project collaboration and exchange” areas available on the Internet and these can either be seen as an appropriate and imaginative use of technology or as cheating. Education’s response (and the way it addresses the issues of social equity raised) will determine its future significance in the learning industry, just as the church’s response to mass literacy was crucial in shaping its own destiny. For politicians looking to build policy in the information age it should be clear that alternative scenarios can and will result and the extrapolation of policies to build those scenarios will become a key differentiator of political perspectives. Not so long ago in US politics it was a universal tenet that “motherhood and apple pie” would always be a Good Thing, but now our understanding of the changing dynamics of the family, and of nutrition and diet, leaves many credible shades of opinion about just how “good”. Similarly our view of technology as a Good Thing needs to evolve levels of sophistication; the critical awareness that social sciences bring will be crucial to this process and education needs to be at the heart of the debate if it is not to be excluded. 
References
G. C. Chu & W. Schramm, Learning from television: what the research says (Stanford, California: Institute for Communication Research, 1967).
A. M. Guillaume & G. L. Rudney, “Student teachers’ growth towards independence: an analysis of their changing concerns”, Teaching and Teacher Education 9, 1993, pp. 65–80.
M. C. Heck, “The ideological dimension of media messages”, in Culture, Media, Language, S. Hall et al. (eds) (London: Hutchinson, 1980).
S. Papert, “Literacy and letteracy in the media ages”, Wired 1(2), May/June 1993, pp. 50–52.
Chapter 3 TECHNOLOGY AND SOCIETY: AN MP’S VIEW Anne Campbell
We are living through a period of intense technological change. The number of people employed in manufacturing industry is down from around 7 million in 1979 to 4 million today. These changes have left many scars and caused insecurity and unease among those who are still employed as well as those who have given up hope. Much of the population has been left with a feeling of deep suspicion. In many quarters an “anti-science” culture is developing, particularly among the young. The free market approach to technological development has left many people without access to the new technologies and consequently the gap between the “haves” and the “have-nots” has grown. What is required is more vision. Technology could be employed to open up the opportunities for education and research for millions of people. The government’s role must be to ensure that such access is available to everyone and that the information revolution is used to encourage opportunity, equality and democracy. Social scientists need to engage in this process and find ways to understand the implications of these technological advances. Using the example of my own constituency in Cambridge, this paper illustrates how, through free public access points across the city, citizens can access socially useful information about the city, council services and information from government agencies.

Science, technology and government policy

In May 1993, the Conservative Government produced a White Paper called “Realizing our potential: a strategy for science, engineering and technology”. Their strategy was to improve the nation’s competitiveness and quality of life by maintaining the excellence of science, engineering and technology. However, this was accompanied by a sharp reduction in the funds available for science and technology across government departments. 
The Government also expressed its concern that public money spent on science and technology might not always be directed in a way that best satisfied industrial needs. A Technology Foresight exercise was established in order to predict the future needs of society. Few would argue that this exercise has no merit. It is helpful for the Government, academics and the business community to sit down together and discuss issues of common interest. Nevertheless there are many concerns that focus on the way the results of this exercise might be used. Some parts of the White Paper certainly sent a chill through the scientific community. Many scientists believed that the Government would try to restrict its spending on science and technology to those areas most useful for industrial application. In fact, it has been proved on numerous occasions that attempts to pick industrial winners often meet with failure. The pressure group Save British Science sent out a Christmas card to MPs in 1993 which cited examples of research projects that had been turned down for funding in the past, because they did not appear to have any industrial application. Liquid crystal display technology, which was invented in the UK, was developed abroad because nobody in the grant-awarding
bodies believed that it was an idea which had any commercial future. Many scientists have argued that many of the best commercial ideas come from “blue skies research”. We risk missing the more innovative and exciting developments if the direction of our scientific effort is determined by society’s existing needs. It is important that scientists are given the freedom to continue with blue skies research to ensure that we do not extinguish future opportunities to improve our lifestyles. It is not too difficult to predict some of the areas in which scientists and engineers will develop the technology in the five or ten years which lie ahead, but it is much harder to imagine how people will adapt to the changes which it can bring. The role that social scientists can play here is therefore quite important. We need to understand the new social processes that are emerging with the developments in technology. Since technology can create new needs, we should not limit ourselves to merely doing more efficiently or more cheaply those things which we can do already. Word processors are not simply about typing letters more quickly; they allow people to think in a different and more flexible way. Mobile telephones are not just different ways of using the telephone; they enable people to be in touch anywhere at any time. The Internet is not only a means of downloading information; it encourages communication and interaction across national boundaries as well. All these changes have been supported and welcomed by the people who could afford to pay for them. Successful technological development is often a leap of faith. It depends on being able to predict the new needs which are generated by the scientific progress which has been made. Thirty years ago it was not anticipated that computers would be used to do anything other than high-speed mathematical calculations. Now we see them being used to organize information in a way that has revolutionized our lives. 
The ways in which the national communication networks are used in future will depend very much on human ingenuity and imagination.

The information gap

We must ask whether these advantages will benefit everyone or will leave us with an underclass of information poor. Will they increase employment so that everyone can afford to have shopping, entertainment, education, business, and so on, all available from home? From present trends that seems doubtful. Without intervention, the free market will force open the divisions in society even wider than they are at present. People who are employed will acquire the experience and up-to-date technological skills to flourish in the new age. The “haves” will rapidly accumulate more and the “have-nots” will have no relevant skills to pull themselves out of the poverty trap. When 80 per cent of the population are using electronic mail, what happens to those dependent on the daily mail deliveries when the postman disappears? Does electronic surveillance drive the homeless even further from the centres of civilization, to dark hidden corners where they become invisible? Will the corner shop disappear completely with the advent of teleshopping? How will the “have-nots” manage in those circumstances? There are some serious and difficult issues to do with access and equality. It may be easy if you have the necessary computer and modem and can afford to pay the subscription to an Internet provider, as well as the expensive phone bills which arrive when you have been surfing and forgotten the time. If you have never been able to afford a telephone, which is the situation for up to 75 per cent of households on some housing estates in Britain, then life is bound to be much more difficult. These issues are the cornerstone of social scientific research and more work should be done in this area.
The Cambridge experience

About a year ago, I decided to make use of the new technologies by giving my constituents the facility to contact me by e-mail. It is probably a more viable prospect in Cambridge than in most other constituencies, since about 30,000 of my 70,000 constituents already have access to e-mail. About 25 per cent of my constituency mail arrives this way. I also give my constituents the chance to contact me at an e-mail advice surgery. This specifies an exact time when I shall be sitting at a terminal ready to receive messages and I try to respond to them immediately. But for the other 40,000 of my 70,000 constituents, there is no access and probably little inclination. What is the point of connecting when you have never used a computer anyway and you just do not believe that there is anything on the Internet which would be of any conceivable use to you? In Cambridge we have attempted to tackle these problems by launching the Cambridge On-line City Project, with the aim of providing socially useful information and free network access for people to whom it would not normally be available. In its first phase, six public access points have been provided in public buildings such as libraries, community centres and council offices around the city. This has now increased to 17. The information is provided via a Website.1 This contains an A–Z of council services, an index of voluntary groups, advertisements of council leisure facilities, information on where to get benefits advice, and some links to job vacancy databases. We hope to add doctors’ lists, NHS dentists, council house exchange lists, chemists which are open late and many others. Many public service organizations have been consulted, and are enthusiastic about having their information provided over the Web. In these early stages, the project has relied on the generosity and goodwill of local companies and local councils. 
Cambridgeshire County Council and Cambridge City Council have contributed officer time, Cambridge Cable have provided telephone lines, and UUNET have given server space and technical support, with further support from CMS and Software AG. In order to progress, the project will need funds to employ a manager and to expand the system. The success of the recent lottery bid will ensure that this happens. It is also hoped to have an IT learning centre so that people can pick up the skills required in order to be comfortable with the technology. In the future, this facility could also be used to provide a feedback mechanism so that users can comment on council services and on other public services as well. In a properly developed framework, it could greatly increase the accountability of councillors, MPs and other elected public representatives, particularly if the comments were accessible in an open and public way. Another project that will link in to the Cambridge On-line City is the Cambridge Childcare Information Project, now called Opportunity Links. The purpose of this venture is to help parents get back to work by providing most of the information that they need in one location. There has been a very generous response from the organizations and firms which we approached about sponsoring this project. Initially, £10,000 was raised from commercial and public sources which has enabled us to employ a part-time project worker to collect the information. A Website was launched in 1996 to give parents some basic advice on childcare: the different kinds available, their relative costs, local nurseries and playgroups. Other information about ways in which to look for a job, how to find appropriate training, and benefits advice together with “better-off” calculations are also supplied. The continuation of this project will rely on local government and businesses donating sufficient funds to continue to employ project workers. The need for such a service is clearly there. 
In the future, these information services will revolutionize libraries and welfare information provision. It will be cost-effective to spend some public money but it is difficult to envisage government, local or central, being able to afford the expenditure to assure completely open access. There are commercial

1. http://www.worldserver.pipex.com/cambridge/
advantages for the private sector wishing to provide additional entertainment and leisure facilities. This could stimulate the development of public-private partnerships that will provide the systems and access facilities, giving benefit to both community and business.

Access and empowerment: some lessons

There are many issues that I have had to consider in my own personal use of the Net. A fundamental belief is that participation in the information society should be available to all, and not just the privileged few.2 There must be equality of access and we must seek to empower citizens both as participants and consumers, as well as providing equal access for the providers of services. The new networks must help to increase citizen participation in decision-making and contribute to the development of a more open society. At the same time, legislation should be framed which enables privacy to be respected and legitimate rights to the ownership of information to be acknowledged. Government itself can become more open and accessible through this process. The implications for education and research are central to issues of access.3 The opportunities to learn will undergo the same massive expansion as occurred when the first public libraries opened. But this is a new kind of active learning, since it will involve interaction with individuals and not just passive absorption of information. How much more, then, will learners need guidance through the maze of information, learning packages, electronic courses and offers of tuition. The role of teachers and lecturers will change for the better: there will be more individual direction and guidance, less bureaucratic record-keeping and fact-giving. Programmes can be tailored to the needs of individuals, but that individual will still want human contact and human input to learn in the most effective way. 
The national communication networks have the potential to open up learning channels for very many more people than those who benefit from further and higher education at present. It is through this new technology that we see “The Learning Society” within our grasp. It is vitally important that we take hold of these chances and use them to improve the quality of life for all our people. But that takes more than pure commercial development. This is not an area that we can safely leave to the scientists and the business community. It is one that requires government intervention and the social scientific community to ensure that the benefits are available to everyone. Let us make sure that the information revolution does not worsen the divisions in society, but is used to enable opportunity, equality and democracy.

References
Office of Science and Technology, Realizing our potential: a strategy for science, engineering and technology (London: HMSO, 1993).
Office of Science and Technology, Progress through partnership, 1–15 (London: HMSO, 1995).
2. The Labour Party held a Superhighway Policy Forum in 1995, a wide-ranging investigation into the effects of the new networks and how government can manage them to benefit the many, not just the few. Its findings were adopted by the Party conference in October 1995: Labour Party, Communicating Britain’s future (London: The Labour Party, 1995).
3. For papers and information on the development and effects of IT in education see ULTRALAB’s Website at: http://www.ultralab.anglia.ac.uk/pages/ultralab
Chapter 4 INFORMATION TECHNOLOGY: A CASE FOR SOCIAL SCIENTIFIC ENQUIRY Adrian Kirkwood
Although there has recently been a significant growth in the use of information technologies in the workplace, in education and in the home, there is little evidence to support the technologically deterministic predictions about IT becoming ubiquitous throughout society and about radical social changes that would follow. In Western countries, the impact of IT upon different groups in society has been varied, tending to reinforce rather than ameliorate existing inequalities. This chapter will examine some of the differences that exist between groups (primarily within the UK) in the extent of access to and use of IT in the home and in education. In particular, it will consider variations that exist in terms of gender, age and socio-economic group. As well as presenting the quantitative evidence for the existence of these differences, the chapter will consider whether IT is likely to exacerbate social differences.

Introduction

For at least two decades predictions have been made about the imminent ubiquity of IT throughout society and the radical social changes that would follow. Alvin Toffler’s view of a future society (1980) had at its centre the home; an “electronic cottage” in which not only paid work, but also leisure and service consumption, would be mediated through information and communication technologies. Although there has been a significant growth in the use of information technologies in the workplace, in education and even in the home, the technologically deterministic prediction of IT bringing about fundamental social change has failed to materialize. In Western countries, the impact of IT upon different groups in society has been varied, tending to reinforce existing inequalities (e.g. Forester 1988; Miles 1988). Those people with good access to IT and familiarity with its use often assume that their situation is typical. 
For example, Eliot Soloway (a professor at the University of Michigan, USA) introduced his keynote speech at an international conference in 1994 with these words: “There is no longer a problem of access to computers.” Perhaps access to IT is not a problem if, like Soloway, you are a male, white American in a middle-class professional occupation—if not, the situation might be viewed differently. In fact, the pattern of ownership and use of IT varies considerably between social groups. This chapter will examine some of the differences that exist between groups (primarily within the UK) in the extent of access to and use of IT in the home and in education. As well as presenting the quantitative evidence for the existence of these differences, the chapter will consider whether IT is likely to ameliorate or exacerbate social differences. It will also examine some of the social factors that tend to have been overlooked (or dismissed) by those who expound technological determinist predictions.
Access to information technology in the home

Computers are not accessible to all, even in the richer industrialized Western countries. Many of the claims made by computer manufacturing and marketing companies about the numbers of machines available in particular countries are based upon their measures of output, usually “deliveries to the trade”—i.e. machines shipped out from their own factories, assembly plants or warehouses to retailers or other distributors. Even the consumer sales figures of computer retailers or other distributors provide little or no indication of who the purchasers are and whether the machines are being sold into existing markets (i.e. additional or replacement equipment) or penetrating new markets (i.e. first-time buyers). National social surveys can provide independent information about the extent to which households have computers and other media technologies. The US Bureau of the Census (1993) reported that there was a computer available in about 45 per cent of households in the USA, but only about 10 per cent of the homes of blacks or Hispanics contain a computer. In the UK, data from the General Household Survey for 1994 (OPCS 1996) indicates that less than a quarter of households (24 per cent) contained a computer, compared with 77 per cent having a video recorder and 47 per cent an audio CD player. (This figure for computer access is in line with data from commercial market research.) Even more revealing is the extent to which access in the UK has changed over the last decade. Figure 4.1, below, uses data from successive General Household Surveys from 1985 to 1994 to reveal trends in access to computers and media technologies. The growth in access to these three domestic technologies exhibits strikingly different patterns over this period. Home access to a video recorder steadily rose to over three-quarters of households (increasing by almost two and a half times, from 31 per cent to 77 per cent). 
There was a similar (but slightly more rapid) rate of growth in access to an audio CD player over a shorter period; more than tripling, from 15 per cent in 1989 to 47 per cent in 1994. Over the whole period, access to a home computer increased, but only very gradually (from 13 per cent to 24 per cent, often increasing by only 1 per cent per year). But why has the computer failed to penetrate more than three-quarters of UK households despite the high-profile marketing campaigns of the 1980s and 1990s? One of the reasons why computers have not become as ubiquitous as video recorders in the home (a forecast that was commonly made throughout the 1980s) is that many people are uncertain about what the multi-function computer could usefully do for them in the domestic setting. A video recorder and an audio CD player have clearly defined and easily understood functions within a household. Both offer increased convenience to users (extending control over when and what TV programmes and films can be watched, or the quality of music reproduction) and also a degree of continuity—unlike home computers, they have not been subject to a rapid succession of changes that give rise to problems of technical incompatibility and obsolescence. Another factor must surely be the marketing and pricing policies of the hardware manufacturers, who prefer to increase the technical specification of computers on a regular basis rather than reduce the base price. Increasingly powerful machines are being marketed as “entry level” computers, but these still require a large financial outlay for many people. Software developers reinforce this strategy by frequently producing enhanced programs that require ever more computer memory to operate.

Information technology at home: differences between social groups

In the UK, domestic access to computing equipment is clearly not universal and the penetration of new households has been very slow. So who does have computing equipment at home? 
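The contrast between the three adoption curves discussed above can be checked directly from the endpoint figures quoted in the text. The sketch below is illustrative only: it uses the percentages cited in the preceding paragraphs rather than the full year-by-year survey series shown in Figure 4.1, and computes the growth multiple and the average percentage-point rise per year for each technology.

```python
# Endpoint household-access percentages for the UK, as quoted in the text
# (successive General Household Surveys; the CD player series starts in 1989).
access = {
    "video recorder":  (1985, 31, 1994, 77),
    "audio CD player": (1989, 15, 1994, 47),
    "home computer":   (1985, 13, 1994, 24),
}

for device, (y0, p0, y1, p1) in access.items():
    growth = p1 / p0                  # growth multiple over the period
    per_year = (p1 - p0) / (y1 - y0)  # average percentage-point rise per year
    print(f"{device}: {p0}% ({y0}) -> {p1}% ({y1}), "
          f"x{growth:.2f}, {per_year:.1f} points/year")
```

This reproduces the characterizations in the text: video recorder access grew almost two and a half times (x2.48), CD player access more than tripled (x3.13), while home computer access grew by only x1.85, at barely more than one percentage point per year on average.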
For more than a decade, much of the computer companies’ marketing effort has targeted families with children of school age: a computer at home was desirable, if not essential, as it would extend the educational opportunities for
Figure 4.1 Households in the UK with video recorder, home computer and audio CD player, 1985–94. Sources: OPCS, 1989 and 1996.
children and allow them to practise and consolidate the skills they learn in the classroom. Has this strategy had any effect?

Families with children of school age

There is evidence from the annual survey undertaken by the Independent Television Commission (ITC) that homes with children are more likely than others to contain domestic media technologies, including a computer (e.g. ITC 1995). Figure 4.2 shows differential rates of access to a range of technologies. Data from the General Household Survey 1994 (Central Statistical Office 1996) provides confirmation that households containing dependent children are more likely to possess a video recorder, audio CD player and home computer. So, households that include children are more likely than others to contain domestic media technologies. What other differences can be identified between groups in society?

Gender differences

Survey data on home access to technologies often fails to identify which members of the household make (or control) use of particular items of equipment. Even if there is a computer at home, it may not be equally accessible to all members of a household. Or, looked at another way, not all household members might choose to make use of a computer for leisure or entertainment, for educational purposes or for other domestic or business purposes. Market research surveys and studies of adult students, teenagers and children have consistently found that males are more likely than females to have access to a computer and to spend more time using a computer at home. For example, a study of 12-year-old children in England (Kirkman 1993) revealed that
22
INFORMATION TECHNOLOGY: A CASE FOR SOCIAL SCIENTIFIC ENQUIRY
Figure 4.2 Proportion of UK households with selected media technologies, 1994. Source: ITC, 1995.
55 per cent of the students in the sample used a computer at home—70 per cent of the boys compared with 38 per cent of the girls. The amount of time spent using a computer at home averaged 7.1 hours per week for the boys, but only 4.2 hours per week for girls. Another study of English secondary school students (Robertson et al. 1995) found that 47 per cent of students had access to a computer at home, but that ten times as many boys as girls had sole access to a home computer and that there was a significant difference in the extent to which they made use of them. In the USA, a study of high school students (Shashaani 1994) found 68 per cent of boys and 56 per cent of girls reporting the presence of a computer at home. When asked who used the home computer, two-thirds of the primary users identified were male. In a study of Norwegian undergraduate students, Busch (1995) found that more male students than female students had had a home computer before entering higher education (41 per cent compared with 24 per cent), and as college students the difference persisted, although not to the same extent. In many of these studies it has been found that the extent of use correlates with attitudes and perceptions about the potential value of computer-related activities and with performance on learning tasks involving the use of a computer. Home computers are more likely to be bought for the use of men and boys, and even when a machine is acquired as a family resource, the main users are very infrequently reported to be female. This might reflect the fact that computers were initially marketed for the male leisure industry (Haddon 1988). It is also associated with the greater control that tends to be exerted over domestic finances by men. 
Research undertaken with large numbers of adults studying with the Open University has consistently indicated that men are more likely than women to have access to a computer, either at home or at their place of work (Kirkwood & Kirkup 1991; Kirkwood et al. 1994; Taylor & Jelfs 1995; etc.). Furthermore, men were much more likely than women to have made the decision to acquire or upgrade home computing equipment and to make use of such equipment in the home. For example, in a large-scale survey conducted in 1995 (Taylor & Jelfs 1995), over 40 per cent of female students had no access to a computer (either at home or at work) compared with only 25 per cent of male students. When students with a computer at home were asked who in the household provided the main impetus to acquire computing equipment, 77 per cent of men, but only 41 per cent of women, answered “self”; 26 per cent of women indicated that their spouse or partner had been the main decision-maker, compared with only 4 per cent of men who answered that way. Patterns of use also favoured men. Half of the females with access to a computer at home reported that their spouse or partner made frequent use of the equipment, compared with only 26 per cent of male students.

Occupation and social class differences

The basic data on domestic access to media technologies also conceals social differences. Where the head of a household has a high-status occupation (i.e. classified as being in the categories “Professional” or “Employers and managers”) there is a greater likelihood that the home will contain a telephone, video recorder, audio CD player and home computer than if the occupation is classified as “Semi-skilled manual”, “Unskilled manual” or “Economically inactive” (OPCS 1996). Data from the 1994 General Household Survey is presented in Figure 4.3.

Figure 4.3 Access to information and communication technologies in UK households, 1994 (by economic status of head of household). Source: OPCS, 1996.

It is not just a matter of those using stand-alone computers: those in the higher socio-economic groups are more likely than others to participate in computer-mediated communication and have access to networks. A survey conducted by Continental Research in September 1995 (quoted in ITC 1996) found that less than 7 per cent of the UK population had access to the Internet, which was mainly available at the workplace. The user profile is biased towards younger men in the higher occupational categories. However, an earlier survey by the same organization (quoted in Screen Digest) indicated that 23 per cent of UK company executives had access (either at home or at work) in June 1995.
Is information technology bringing about fundamental social change?

So the evidence does not support the predictions of IT becoming ubiquitous in Western countries in the near future, particularly in terms of home access. But what of the fundamental social changes that were expected to be brought about through the widespread use of IT? Has IT made any contribution to changes in society and, if so, have these tended to ameliorate or exacerbate social differences? A number of aspects will be considered, paying particular attention to home-based activities.

Changing employment patterns

The overall pattern of employment in the UK has changed in recent years. Since the mid-1980s part-time working has become more common for both men and women, which has led to a rise in the number of women in paid employment (Central Statistical Office 1996). However, while the number of women in full-time work has also increased, full-time employment for men has declined.

Information and communication technologies have had both positive and negative effects upon the level of employment in the UK, as they have in most other developed countries. The impact of new technologies can be seen in the creation of new employment opportunities as well as the destruction of jobs in certain industries and services (Freeman 1995). Many of the new jobs made possible by greater use of information and communication technologies have involved changes in the geographical location of companies, particularly in the service sector. For example, the organization of banking, insurance and other financial services has been transformed in recent years, with a much greater emphasis on access to “remote” rather than “High Street” provision. But while the use of IT in the workplace has permeated a large proportion of companies and organizations, there is little evidence of significant changes in the practice of homeworking, an essential element of the predictions for a new electronic society.
Homeworking encompasses many categories of activity, including farmers, self-employed building and maintenance workers, those in creative fields (writers, designers, artists, etc.) as well as people undertaking unskilled or semi-skilled assembly jobs or other forms of piece-work. Few of these activities lend themselves to being IT-based. Although some people are engaged in “teleworking” (i.e. working from home with information and communication technologies rather than travelling to a place of work located elsewhere), much of this appears to be done as only part of the normal work pattern, or by people engaged in professional and creative occupations. Many companies would be reluctant to encourage or facilitate homeworking because it would necessitate a loss of control over employees’ time and the tasks they undertake. Furthermore, many homes are not suitable for teleworking. Using IT for homeworking requires not only appropriate facilities, but also space and arrangements that allow work to proceed without too much disruption being caused (both to the homeworker and to other members of the household).

Much of the growth in professional homeworking arises not so much from developments in IT as from economic changes that have brought about an increase in self-employment and home-based consultancy work. In 1995, more than three-quarters of all UK homeworkers owned their own business or worked on their own account (Central Statistical Office 1996).

Leisure and service consumption at home using information and communication technologies

It was predicted that information and communication technologies would bring about significant changes in the patterns of leisure and service consumption. IT would make it unnecessary for people to leave their homes for many forms of entertainment or to undertake activities such as shopping, banking or gaining access to information and advice on a wide range of topics.
The convergence of computing and digitized telecommunications services has made possible the development of an infrastructure that is often referred to as an Information Highway (or even Superhighway). This would comprise linked networks of high-capacity fibre optic (broadband) cables capable of conveying at high speed very large volumes of data (audio, text, video, etc.) to and from a very high proportion of business and domestic properties and institutions such as schools, libraries, hospitals, etc. A high level of investment has already been made in installing the necessary infrastructure, and this will continue for at least the next decade. The principal actors involved are the telecommunications providers (BT, Mercury, etc.) and the cable television companies.

In terms of the domestic market, cable television has not achieved a high degree of penetration of UK homes since it was established in 1984. Figure 4.2, above, showed that in 1994 only 7 per cent of UK households were connected to cable TV services (ITC 1995). These companies are seeking to achieve a target of 75 per cent of households being capable of being connected by the year 2000. It is frequently claimed that there is an enormous demand for consumer services using information and communication technologies, but to what extent are the services currently provided being used? To date, the limited number of services that have been offered have achieved only a modest amount of success.

In recent decades, leisure and recreation time has increasingly been spent in the home rather than in the public sphere. In Western societies attendance at public performances (e.g. cinema, concerts, theatre and attending sports events as spectators) has declined in favour of home consumption using audio-visual means (television, video, etc.). However, there are other activities which involve people going outside the home, for example to restaurants, shopping expeditions, day-trips, etc.
Increased leisure services using information and communication technologies are unlikely to replace the “outside the home” activities to any great extent: social contacts are important, and people do not want to remain at home unless they have no alternative; the new services are more likely to be in competition with other home-based leisure activities. “A new supply of information, communication and entertainment services is more likely to result in increased competition to win round the consumer. The idea that as a result of new services, new markets will open up is a distortion of the facts” (Punie 1995, p. 33).

Economic capacity as a limiting factor

Those who have predicted the ubiquity of home computers have tended to adopt a diffusion model, seeking to explain patterns of adoption and use by relating the characteristics of computers to the needs and attitudes of potential users. There has been a tendency to assume that people would perceive the benefits to be gained from the use of IT applications and acquire equipment for home use. If existing uses and applications were unable to convince the reluctant to become involved, then efforts were needed to develop a “killer application”, i.e. a service or use for IT that met so many needs for so many people that it was impossible to resist. The economic capacity of a household was largely overlooked, because such models “took it for granted that everybody was a potential computer owner and that the diffusion curve would follow other major innovations in domestic electronics, such as the television set, with adoption trickling steadily down the income scale” (Murdock et al. 1994, p. 271). Despite enormous promotional activities, home computer ownership remains concentrated within the professional and managerial groups, often increasing opportunities for those who already have them.
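The “diffusion curve” assumed by such models is conventionally a logistic S-curve: adoption starts slowly among early enthusiasts, accelerates, and then saturates as the technology trickles down the income scale. A minimal sketch of that assumption follows; the ceiling, midpoint and rate parameters are purely illustrative and are not drawn from any survey cited in this chapter.

```python
import math

def logistic_adoption(t, ceiling=1.0, midpoint=10.0, rate=0.5):
    """Fraction of households owning a technology at year t under a
    simple logistic diffusion model: slow start, rapid middle, saturation.
    All parameter values are illustrative, not empirical."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# The classic S-curve: adoption trickles at first, then accelerates,
# reaching half the ceiling at the midpoint year.
for year in (0, 5, 10, 15, 20):
    print(year, round(logistic_adoption(year), 3))
```

The chapter’s point is that home computers did not behave this way: household economic capacity acted as a ceiling well below saturation, so the curve flattened early instead of adoption trickling “steadily down the income scale”.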
Who’s using the Net?

The Internet enables communication between computers to be established for the purpose of data transfer, email, access to remote databases and information sources, etc. In a newspaper article Bowen (1996) sought to draw the attention of the business community to the commercial possibilities offered by the Internet: “It is not fanciful to compare the potential of the Internet with that of the motor car.” He used a “comparative history” of the motor car and the Internet to draw an analogy with early scepticism about the potential of the car. However, the historical “facts” about the motor car are very selective, with no mention whatsoever of the negative and detrimental effects brought about by the dominance of the motor trade in Western countries. For example, although one-third (32 per cent) of UK households were without a car in 1994 (OPCS 1996), transport policies make travel in rural and remote areas very difficult, while retail and leisure activities in very many town centres have declined as a result of the growth in out-of-town shopping and entertainment developments. It also ignores the economic and environmental effects of traffic congestion that render many journeys very time-consuming.

Conclusions

A number of sources have been used to provide evidence that significant differences exist between social groups in terms of access to and use of IT, particularly in the home. Some of those differences have been examined with a view to assessing the likely impact of IT. Little evidence has been found to support the idea that IT is bringing about fundamental changes to the existing social structures.

References

D. Bowen, “Is anybody out there?”, Independent on Sunday, 10 March 1996.
T. Busch, “Gender differences in self-efficacy and attitudes towards computers”, Journal of Educational Computing Research 12, 1995, pp. 147–58.
Central Statistical Office, Social Trends 26 (London: HMSO, 1996).
T. Forester, “The myth of the electronic cottage”, Futures, June 1988, pp. 227–40.
C. Freeman, “Unemployment and the diffusion of information technologies: the two-edged nature of technical change”, PICT Policy Research Paper no. 32, Programme on Information and Communication Technologies, Economic and Social Research Council, 1995.
L. Haddon, “The home computer: the making of a consumer electronic”, Science as Culture, no. 2, 1988, pp. 7–51.
Independent Television Commission, Television: the public’s view 1994 (London: Independent Television Commission, 1995).
Independent Television Commission, “Surfin’ UK”, Spectrum, Issue 19, 1996.
C. Kirkman, “Computer experience and attitudes of 12-year old students: implications for the UK national curriculum”, Journal of Computer Assisted Learning 9, 1993, pp. 51–62.
A. Kirkwood & G. Kirkup, “Access to computing for home-based students”, Studies in Higher Education 16, no. 2, 1991, pp. 199–208.
A. Kirkwood, A. Jelfs & A. Jones, “Computing access survey 1994: foundation level students”, Paper no. 51, Programme on Learner Use of Media, Institute of Educational Technology, The Open University, 1994.
I. Miles, “The electronic cottage: myth or near-myth?”, Futures, August 1988, pp. 355–66.
G. Murdock, P. Hartmann & P. Gray, “Contextualizing home computing: resources and practices”, in Information technology and society, N. Heap et al. (eds) (London: Sage, 1994).
Office of Population Censuses and Surveys, General Household Survey 1987 (London: HMSO, 1989).
Office of Population Censuses and Surveys, Living in Britain: results from the 1994 General Household Survey (London: HMSO, 1996).
Y. Punie, “Media use on the information highway: towards a new consumer market or towards increased competition to win round the consumer?”, paper presented at PICT International Conference on the Social and Economic Implications of Information and Communication Technologies, London, 10–12 May 1995.
S. I. Robertson, J. Calder, P. Fung, A. Jones & T. O’Shea, “Attitudes to computers in an English secondary school”, Computers and Education 24, 1995, pp. 73–81.
L. Shashaani, “Gender-differences in computer experience and its influence on computer attitudes”, Journal of Educational Computing Research 11, 1994, pp. 347–67.
E. Soloway, “Reading and Writing in the 21st Century”, keynote address to EDMEDIA 94, World Conference on Educational Multimedia and Hypermedia, Vancouver, Canada, 1994.
J. Taylor & A. Jelfs, “Access to new technologies survey (ANTS) 1995”, Report no. 62, Programme on Learner Use of Media, Institute of Educational Technology, The Open University, 1995.
A. Toffler, The Third Wave (London: Pan, 1980).
US Bureau of the Census, “Current Population Reports: Computer use in the United States, 1993”, Washington. The data is available at the following URL: http://www.census.gov/population/www/socdemo/computer.html
Section Two
DEVELOPING COURSEWARE FOR THE SOCIAL SCIENCES
Chapter 5
EXPECTATIONS AND REALITIES IN DEVELOPING COMPUTER-ASSISTED LEARNING: THE EXAMPLE OF GraphIT!
Ruth Madigan, Sue Tickner and Margaret Milner
Working with an interdisciplinary team to produce CAL courseware (a tutorial package introducing basic statistics) proved more difficult than anticipated. More training, organization and sustained teamwork were needed to establish a common language and mode of operation. The design and the use of CAL forced a fundamental re-evaluation of teaching methods. Defining how students learn may be as important as defining what they learn. This is necessary in order that academics (and other teachers) can come to terms with a new medium and its integration into the curriculum.

The aim of this paper is to pass on a few honest reflections on the problems encountered in developing a piece of interdisciplinary courseware under the umbrella of TLTP. We are taking a risk here, since we are focusing on our mistakes rather than our successes, but we are doing this in order to clarify our own thoughts and in the hope that others can learn from us. Despite our mistakes, we do believe we have produced a tutorial program which others will find useful.1

Our particular objective was to create an independent learning package (GraphIT!) which could serve as an introduction to basic statistics across a number of university departments: accounting and finance, sociology and statistics were represented on the development team.2 Introductory statistics appeared to be an appropriate area in which to make use of such a program. Many of the social sciences are essentially discursive and evaluative subjects, which do not generally lend themselves to a simple rehearsal of factual knowledge or established routines. Basic statistics, on the other hand, is an area which requires a certain amount of repetitive exercise to grasp its application and does produce some right and wrong answers. Moreover it is an area of study which many students (and staff!) find difficult, so any additional aid to learning would be welcomed.
The computer has the obvious advantage of an interactive dimension and the capacity for rapid calculation, so the drudgery is removed and the student can concentrate on the application. Moreover, the interactive, dynamic aspect of CAL can introduce an element of fun or play, which is often welcome in a subject which many experience as rather dry; a means to an end perhaps, rather than interesting in its own right (apologies to all those statisticians who evidently love their subject, but many teachers will recognize the problem).

Introductory statistics therefore seemed an appropriate area in which to develop a CAL package:

• At this introductory level at least, it rests on a well-defined paradigm.
• It can be presented as exercises which are susceptible to right and wrong answers.
• It is an area in which students are likely to find repetitive practice helpful.
• The fun/play element of CAL helps in an area of learning which many regard as necessary rather than popular.

Tutorial programs

Only some of our team had any experience of authoring systems and, as a consequence perhaps, some of us at the start had very little understanding or “feel” for what could be achieved with CAL. It was only later, when we discussed and read some of the literature about the role of IT in education, that we came to understand that the “drill-and-practice” tutorial has attracted a lot of criticism from CAL professionals, because it appears to rely on a rather old-fashioned approach to learning with built-in assumptions about a fixed body of knowledge and narrowly prescribed learning objectives. This is seen, understandably, as a rigid and non-exploratory approach to learning. “It is judged to offer poor approximations to what is itself a rather poor model of the teaching process in the first place (didactic encounters guided by the IRE3 pattern of dialogue)” (Crook 1994, p. 13).

In planning our own tutorial package we were happy to include some elements of “drill-and-practice” routines. None the less, our original conception aimed to be rather more discursive and adaptive (Laurillard 1993, pp. 94–5) than we finally achieved. We had hoped that GraphIT! would provide in effect a front end for Minitab (a commercially available statistical package) so that the student could access and analyze the data sets in a rather more creative, flexible way than has actually proved possible. A series of technical problems and the consequent pressure of time meant we had to abandon the direct use of Minitab, and as a consequence we lost the more exploratory dimension. We were also keen to retain a more interpretational dimension, to encourage students to realize, for example, that there is not always agreement about the best way of presenting or indeed interpreting data.

1. For those who are interested, a copy can be found at: http://www.elec.gla.ac.uk/TILT/cat-of-software/downloadGraphIT.html
The problem here is not just technical, but may also reflect the inexperience of the academics as scriptwriters, who found it difficult to think themselves into a new medium (see below). What we have produced, then, is a series of modules arranged in a hierarchy of learning (from the simple to the more complex). It is possible for the student to go back and forth at will, but it is essentially a linear tutorial program following a fairly traditional, didactic model of learning. This is less than we had originally envisaged, but it still has a useful role in many courses.

As Crook (1994) points out, this sort of tutorial is popular with teachers because it is easy to assimilate into prevailing patterns of teaching practice, and because the drill-and-practice approach is appropriate for some types of material and some forms of learning. We can recognize the value of this type of teaching technique in certain situations: “it need not presume a wholesale reduction of educational activity to the rehearsal of discrete subskills…. It needs to be made sense of rather than automatically disparaged” (Crook 1994, p. 14). Its value must depend to a great extent on how successfully the tutorial is integrated into the rest of the course and other complementary teaching methods.
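The drill-and-practice pattern described above (pose an exercise, accept a response, mark it right or wrong, and let the machine handle the arithmetic) can be reduced to a few lines. The sketch below is a hypothetical illustration of the genre, not code from GraphIT! itself; the function names and the choice of the mean as the target statistic are our own.

```python
import random

def make_drill_item(n=5, seed=None):
    """Generate one exercise: a small data set plus the correct mean.
    The computer does the 'drudgery' of calculation for the marker."""
    rng = random.Random(seed)
    data = [rng.randint(1, 20) for _ in range(n)]
    return data, sum(data) / len(data)

def mark(response, answer, tolerance=0.01):
    """Right/wrong marking: the closed feedback that drill items allow."""
    return abs(response - answer) <= tolerance

data, answer = make_drill_item(seed=7)
print("Calculate the mean of:", data)
# In a real tutorial the response would come from the student;
# here we simply exercise the marking logic.
print(mark(answer, answer), mark(answer + 1.0, answer))
```

The rigidity the CAL literature criticizes is visible even at this scale: the item admits exactly one correct answer, with no room for the interpretational questions (how best to present or read data) that the authors had hoped to retain.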
2. The courseware development was carried out within the context of a wider initiative within the University of Glasgow: Teaching with Independent Learning Technologies (TILT).
3. I-R-E: “verbal exchanges taking the form of a (teacher) Initiation, a (pupil) Response and a (teacher) Evaluation” (Crook 1994, p. 11).
Interdisciplinary work

At a purely practical level, we found an interdisciplinary project more difficult than anticipated. The academics in our group were originally located in four departments spread across the campus, between five and fifteen minutes’ walk from each other. All the academics involved had heavy teaching timetables and other commitments, so even when the number of participating departments was reduced to three (because one member moved department) it remained very difficult to get the whole group together on a regular basis. The RAs (research assistants) were located at some distance from the Chair of the group, who was responsible for administration and liaison with the centre (i.e. the TILT Steering Group overseeing all the subgroups). At the beginning of the project not everyone had access to e-mail.

It is easy to say we should have given more thought to these practical issues, but they arose as a result of the resources available, the fact that the central project administration was not yet established, and the interdisciplinary nature of the group. They were, however, very serious for the operation of the group. The fact that we found it so difficult to meet regularly as a group meant that we also had difficulty in developing a common language and were slow to pick up divergent views.

Part of our problem was also an intellectual one. We used the same words, the same statistical terms for instance, but we did not necessarily speak the same language: the relative importance of categorical versus interval data for different disciplines, for example, or what constitutes an attractive data set. At one level we knew these differences existed before we began, which is indeed why we have included interchangeable data sets so that to some extent the package can be customized to suit each subject area, but one can know these things without realizing their full implications.
With hindsight we should have spent more time at the planning stage (though in our defence, we worked through all the recommended stages of defining objectives, distinguishing our project from comparable software, creating a common framework, agreeing on key design features, etc.). The pressure of time, the fact that the RAs had already been recruited and the difficulty of getting together as a group encouraged us to subdivide the task of scriptwriting as soon as we had an agreed framework. This seemed a practical way to progress in the circumstances, but it had the unintended effect of reinforcing a disciplinary divide, the statisticians on the one hand and the social sciences/accounting on the other, and allowed two sets of interests to develop in isolation. This slowed down the development process considerably, as material then had to be rewritten and reshaped at a later stage. It left the RAs in a difficult situation, trying to reconcile the two groups.

Authoring and a new medium for academics

As well as interdisciplinary problems there were also problems of communication between those with experience of authoring software (in particular, though not exclusively, the RAs) and those without. Again this was a gap which tended to be reinforced, rather than reduced, by organizational arrangements. The RAs, across all the subgroups, not just ours, were appointed at an early stage in the project, before the central organization had really been established. This had advantages and disadvantages: they were in at the beginning and consequently able to make their own contribution to developments, but at the same time they were newcomers to an organization which had not yet established its own lines of communication and administration. Thrown on their own resources, the RAs developed a camaraderie and a lively network of working relationships right across the university.
This has been enormously beneficial in that RAs have been able to swap technical knowledge, offer each other support and spontaneously advance one of the aims of such a project, that is to evaluate the role of IT across a diversity of disciplines and teaching situations. The academic teaching staff were often marginal to this process and slow to benefit. Unlike the RAs, they were not newcomers: they already had an established niche in the university, and they worked to a different timetable and a different set of imperatives. Many of the academics were ignorant about authoring software, had never attended a CAL conference or read any of the debates about teaching with IT. They were experienced in teaching in a verbal medium (written and spoken) but had difficulty in envisaging the possibilities of the new medium of the computer. As Bunderson et al. suggest:

instruction has been trapped in a “lexical loop” perpetuated by print based media and methodology… the skill/knowledge of the expert [is translated] into a list of verbal abstractions descriptive of the critical tasks [and] given to students. The student is expected to translate the verbal abstraction back into the skills/knowledge of the expert. They are expected to create a model of the performance of the expert from the verbal abstraction. This then is the lexical loop (1981, p. 206).

The alternative approach suggested by Bunderson et al. is to provide working models in which the learner can perform. Computers are valuable because they can provide elements of simulation, but it requires imagination and experience to be able to take advantage of these possibilities. Yet at the outset of the project it was the RAs, not the academics, who were offered training (on the grounds presumably that they were the people entering a new situation). It seems obvious now that it was the academics who required the training and who needed the introduction to CAL philosophy and educational debates. It was they who were having to shift to a new medium of presentation and a new pedagogy. We learnt the hard way, through our own mistakes, and in the end that may be the only way to learn, but a bit of basic training would have speeded up the process and made for easier communication between the academics with mostly teaching experience and the RAs with mostly development experience.
Editorial function

Again with hindsight, we should have been much more specific about defining the working relationships within our subgroup. We had a good range of skills for a courseware development team (as defined by Laurillard 1993, p. 237), either within the subgroup or the wider TILT project. After one or two false starts, we established satisfactory procedures for dealing with accounts and routine administration. What we failed to do was to establish an appropriate editorial structure.

Academics are used to working within a broadly collegiate environment where, at least in theory, everyone contributes as an equal. As any social scientist will recognize, this is a rather naïve view of academic life, but it allowed us to believe that working relationships would develop organically. We had assumed that we could work with a division of labour (referred to above) in which different individuals and combinations of individuals went off and were responsible for writing different parts of the initial script, and then came together on a series of “Design Days” for collective approval, editorial decision and so on. This system failed, or at least worked only intermittently, for a variety of reasons already alluded to: the group found it difficult to come together on a regular basis, and the distance between the disciplines was greater than we had anticipated. As a consequence we were left with a very weak editorial decision-making structure.

The RAs were particularly affected by this. They would produce alternative designs and receive a range of comments and preferences, when what they needed was a decision. The irony is, of course, that the RAs, who had most direct experience of the working practices involved in a project of this sort, were the least able to dictate or change the group structure; they were the newcomers to the institution, part-time and less well paid, while the academics were the original instigators of the project.
The organic collegiate model tends to ignore these differences and pretend they do not exist. This can have its plus side if it embraces a genuine respect for
people’s expertise, but a more formal structure of decision-making is needed and can also, we think, be empowering. In our group there was sufficient goodwill that we did evolve ways of working together, but we would have done so more efficiently and with less frustration had we recognized at an earlier stage that our existing model of editorial control was not working as intended and needed to be replaced.

Evaluation

The university-wide TILT project was initially set up by inviting groups of academics throughout the university to submit ideas for projects in their area of work. These proposals were then combined into cognate areas (the subgroups), which in turn were combined into the single TILT programme. At the outset the academics, in our group at least, tended to be focused on their own project and cognate areas rather than on the TILT project as a whole, and tended to resent the demands made by the centre for information and for participation in activities which appeared to have more to do with the central project than with their own subgroup. In particular the role of evaluation, which was crucial to the overall project, caused a great deal of initial misunderstanding at the subgroup level. In fact the RAs, who were better integrated as a group across the university and closer to the centre than the academics, had a better understanding of the role of evaluation in the project as a whole.

This changed as the project progressed and it became clear that the subgroup with special responsibility for evaluation could offer something of value to the other subgroups, rather than seeming to intrude and demand more paperwork, more reports and so on. In the end we all came to appreciate the value of having a group of independent evaluators who had the time and expertise to design instruments (before-and-after questionnaires, observation schedules, video recordings) with which to evaluate the effectiveness of the software we were producing.
They carried out their evaluations in laboratory conditions with selected groups and in genuine classroom situations. Quite apart from anything else, positive feedback from such thorough external evaluation has done a great deal to restore confidence in moments of self-doubt. It is important, though, to recognize that our own subgroup also carried out evaluation exercises which were crucial at a formative stage. These tended to be smaller in scale and more informal, but allowed us to try out an unfinished piece of software which was still rough around the edges and could not therefore be used in a fully fledged teaching situation.

The classroom evaluations were extremely valuable in focusing our attention on the importance of locating such a package within the course structure (Laurillard 1993, p. 213). Different teachers will want to use the package in different ways, but it is extremely important that it is properly introduced to students and that it is clear what they are expected to do with it. There is a temptation for teachers everywhere, when given an independent learning package (equally video or film), to treat it as a "child-minder", something just to occupy a classroom hour or so. Courseware is only of interest if it promotes learning. However, to the extent that it does, it only does so in conjunction with the wider teaching context in which it is used: how it is supported by handouts, books and compulsory assessment, whether the teacher seems enthusiastic about it, support among learners as a peer group, and many other factors (Draper 1995).

Both formal and informal evaluation were found to be important as part of the formative development process and as part of the transition to the classroom. Summative evaluation is more difficult to accomplish. Our students, for example, were questioned and "tested" before and after classroom sessions using GraphIT!.
For the most part they reported that they had enjoyed using the tutorial package, they believed it to be useful, and the “tests” showed that they had acquired new knowledge or information (Henderson et al.
1995). At one level, then, this example of CAL courseware appears to be effective, but we cannot say whether it is more effective than other methods, because we did not compare our CAL tutorial with alternative, more conventional teaching and learning methods. We had always intended that the use of this courseware should be integrated with other coursework to supplement or reinforce, not to replace, conventional teaching, though it might obviously reduce the time spent on certain topics. In these circumstances it is very difficult to "pinpoint the precise variables that determine the superiority of a particular approach" (Booth et al. 1993, p. 83).

We believe that CAL is attractive because it adds to the diversity of teaching methods available and offers the student an additional source of independent learning. Whether it is cost-effective can be substantiated only in the longer term. GraphIT! has been very expensive to produce (two part-time staff working for three years) and, if we ignore the research and learning experience involved, could be justified only if it is widely adopted. So far we have received many expressions of interest, but only time will tell if it is widely used in practice.

Generally CAL has not been taken up as enthusiastically as its developers would like (Booth et al. 1993, p. 83). One of the problems with the evaluation of CAL is that it is often done by CAL professionals and enthusiasts, who are already committed to developing and expanding its use. They are faced with the issue of overcoming the conservatism of course teachers and ensuring that genuine opportunities for the constructive use of CAL are created. But the real evaluation must in the end come from the long-term patterns of usage which emerge, and we must allow for the possibility that in many areas of education these evaluations may be negative and the use of CAL will be rejected.
This is not an easy finding for a project like TLTP, which is committed to expanding the use of CAL, to contemplate objectively.

Conclusion

We have produced what we believe to be an attractive tutorial introduction to basic statistics and graphical presentation. The evaluation, from within our own institution, where it has been piloted in classroom situations, and from other institutions, where we have received teacher evaluation, has been most encouraging. We hope the final product, which has a teacher's editing facility so that data sets which have particular relevance to individual courses can be included as part of the exercise set, will also be well received. We have not achieved everything we set out to achieve: we were over-ambitious given our resources. The whole courseware development took much longer than we had anticipated; as a consequence the tutorial package is shorter than intended. We had planned a number of additional modules which would have taken the student on to a slightly more advanced level.

What we have learned:

• Keep talking to each other! It is not enough to identify learning objectives at the outset: the same words may mean different things to different people. This is particularly true where people are coming from different intellectual backgrounds.
• Defining how you want students to learn may be more important than exactly what you want them to learn.
• Although there is bound to be a division of labour and of expertise within the group, it helps to identify a minimum training scheme and/or literature review with which you expect everyone to be familiar.
• Do not fall into the trap of thinking that the practical day-to-day arrangements and working relationships will take care of themselves: they need regular review.
• No package is a "stand-alone", even if it is designed for independent learning. It has to be integrated into the rest of the course, and its success or failure will depend in part on how it is used.

References

J. Booth, J. Foster, D. Wilkie, K. Silber, "Evaluating CAL", Psychology Teaching Review 2, 1993, 2.
C. V. Bunderson, A. S. Gibbons, J. B. Olsen, G. P. Kearsley, "Work models: beyond instructional objectives", Instructional Science 10, 1981, pp. 205–15.
C. Crook, Computers and the collaborative experience of learning (London: Routledge, 1994).
S. W. Draper, "Two notes on evaluating CAL in HE", University of Glasgow, 1995. WWW URL: http://psy.gla.ac.uk/steve
F. P. Henderson, C. Duffy, L. Creanor, S. Tickner, "Teaching with Independent Learning Technology Project: University of Glasgow", paper presented at CAL Conference, Cambridge, 10–13 April 1995.
D. Laurillard, Rethinking university teaching: a framework for the effective use of educational technology (London: Routledge, 1993).
Chapter 6
THE DATA GAME: LEARNING STATISTICS
Stephen Morris and Jill Szuscikiewicz
Learning statistics is a perennial problem for students and research workers from non-mathematical backgrounds. The social sciences and medicine in particular rely on high-quality analysis and interpretation of data. However, the teaching of statistics throughout higher education assumes a high degree of mathematical competence even when the students are from non-mathematical disciplines. This is without doubt a major reason for students' perceived lack of statistical judgement (Jamart 1992). Clearly a different approach is called for. With imagination it is possible to convert difficult statistical problems into simpler problems of pattern recognition. Furthermore, with computerization the extra ingredient of interactivity can be added, enabling the student to manipulate the raw data while observing the changing patterns and thereby build up an intuitive understanding of how statistics works. In effect statistics is turned into a game, the data game. In this paper we describe this approach to the teaching of statistics.

The problem

Imagine learning to play chess with a set of principles and examples in a book, but without the board and pieces. Or learning music composition without sound. Statistics learnt solely from the pages of a book (or lectures) suffers the same problem, and unfortunately most students and researchers are expected to gain a practical grasp of the subject in exactly this way. A further problem is that much statistics teaching is based on mathematics, which is beyond the easy reach of most students. This makes it more like learning chess, without a board, in a foreign language. Finally, most statistics textbooks and courses take a few sets of data and work through them, which sounds acceptable but does not actually teach statistics. Students learn a handful of analyses instead: examples of statistics rather than statistics itself. Statistics is particularly in need of a new approach.
Its purpose is to present a large and probably complex body of raw data in a meaningful, summarized form. Everyone understands what raw data is, because they collect it and it is largely self-explanatory. Students accept the concept that you come to a conclusion, and that that is the end of the process—however, they do not understand what goes on in between. Statisticians work with a series of steps culminating in the table of test statistics, each step condensing the data and making it more manageable. This forms a kind of “Information Funnel”, large and raw at one end, and clear and informative at the other. Statistical analysis programs used by those in education and research every day emphasize the two ends of the funnel, and hide the intermediate steps. How is a student to understand what is happening when they enter a vast array of data and a moment later are presented with a few probabilities? Although pride of place appears to go to the raw data and the final statistics, educationally the in-between steps are of primary importance. Most of what we thoroughly know has been learnt by observation, trial and error. Statistics cannot be taught this way within the current mathematical framework using a selection of data sets, since the time
required (and the number of prepared analyses) would be far too great. However, most students of statistics are not interested in it as an academic discipline, and by approaching the subject instead as a tool to be used, a higher degree of teaching flexibility can be achieved. Mathematical proofs become irrelevant; when learning to ride a bicycle, a child needs no knowledge of angular velocity, frictional forces or gravitational pull; an intuitive understanding of all of them will be impressed on him/her more or less painfully. While deep mathematics can prove or disprove assertions, proof does not necessarily lead to enlightenment (Jamart 1992), and much of what we truly understand requires no proof at all but repeated, varied and directed observation. A deep appreciation of almost any subject comes after practical experimentation. This can be achieved in statistics teaching by making it into a game, after which it can be at least as interesting as chess, and certainly a lot more useful.

Now that the IT revolution has made faster, more powerful PCs available to university education at reasonable cost, their advantages to both teachers and students are widely recognized. When used with imagination, the increased interactivity of PC software is a powerful ally in the move away from didactic teaching; and the potential availability of networked software 24 hours a day enables students to work with complex concepts, at their own pace, whenever it suits them (Simpson 1995).

Our solution

With this in mind we have approached the problem of teaching statistics by creating and computerizing a series of challenges and games which the user plays by changing the data. This is a radical departure from traditional statistics teaching, where correct statistical practice is mirrored unnecessarily closely in making the data sacred and immutable.
It is still possible to impress on students that real-life data cannot be changed; and by giving them the opportunity in the classroom to experiment in a way impossible in real life, they become experienced in recognizing patterns and exploring strategies without danger. They are exposed to a wide variety of situations which might take a decade or more to accumulate through real research. Although traditional teaching styles may be able to show some variety to students, the interactivity of our approach involves them directly, making it more successful than a strictly didactic method. However, if the results of the interactivity remain complicated, the students will simply have a deeper understanding of their confusion. By transposing the data into a simple graphical representation, whether a fitted line or a set of normal plots, the results of the interactivity become clear and the student gains a genuine understanding. This gives an entirely natural representation of the middle stage of the Information Funnel, connecting the original data in a clear way to the otherwise slightly mysterious test results and conclusions.

The data points are the game pieces, which may be moved in any direction. The student can be guided through a number of scenarios within which they are encouraged to experiment and observe changes in the resulting test statistics, finally being challenged to generate particular outcomes. Through these exercises the students recognize that the processes directly connect the data to the results, and that they can be understood. Although they may not understand the processes at first, by the end a clear intuitive understanding will be established. We tested this teaching approach by producing a suite of PC-based game-playing scenarios designed to demystify a wide range of statistical concepts. These were incorporated into a comprehensive teaching package called Statistics for the Terrified.
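The idea of moving data points as game pieces while watching the middle of the Information Funnel respond can be sketched in a few lines. The code below is purely illustrative (the scores are invented and this is not an excerpt from Statistics for the Terrified): it condenses two groups step by step into a single t statistic, then "moves" the pieces and recomputes.

```python
import math

def two_sample_t(a, b):
    """The middle of the Information Funnel, made visible step by step:
    raw data -> group means -> variances -> standard error -> t statistic."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb                # condense to means
    va = sum((x - ma) ** 2 for x in a) / (na - 1)    # condense to spreads
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se = math.sqrt(va / na + vb / nb)                # Welch standard error
    return (ma - mb) / se                            # one informative number

control = [4.1, 5.0, 4.6, 5.2, 4.8]
treated = [5.9, 6.4, 5.8, 6.6, 6.1]

t_before = two_sample_t(treated, control)
print(round(t_before, 2))  # well beyond the usual |t| of about 2

# The game move: drag the treated scores down towards the controls
# and watch the evidence of a group difference melt away.
treated_moved = [4.9, 5.1, 4.6, 5.3, 4.8]
t_after = two_sample_t(treated_moved, control)
print(round(t_after, 2))
```

Because every intermediate quantity is visible, a student who drags a point can see which step in the funnel changes, rather than being shown only the final probability that a statistics package would print.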
Although the software does cover some quite advanced topics, it approaches everything in a basic, commonsense way. The areas covered by the software are listed in Table 6.1.

Table 6.1 Areas covered by the software.

How to choose a test
• The importance of groups in statistics
• Identifying appropriate tests by data layout
• When to use: Chi Square, Kruskall-Wallis, Mann-Whitney, oneway analysis of variance, paired and two-sample t-tests, Wilcoxon

Basic data description
• Descriptive statistics
• Advantages and disadvantages of median and mean, range and variance, and coefficient of variation
• The importance of the normal curve and how the mean and s.d. affect its shape and position
• Standard error and confidence intervals

Testing for differences between groups
• The differences and similarities between the two-sample t-test and oneway analysis of variance
• Role of the normal distribution
• The differences and similarities between the Mann-Whitney test and the Kruskall-Wallis test
• Role of box and whisker plots
• When and how to use oneway and twoway analysis of variance
• Developing the ability to visualize data in graph form

Uncovering hidden influences
• Uncovering influences!
• Reducing bias and variance
• Analysis of covariance (when the influence is a measurement)
• Twoway analysis of variance (when the influence is a category, such as gender)
• Matching groups to prevent bias
• Reduced variance enhances the likelihood of a significant result

Fitting lines to data
• Regression
• Judging the value of a linefit
• Using the fitted line
• Describing the line

Analyzing repeated measurements
• Before and after studies
• About the normal curve
• The paired t-test
• Why area under a curve?
• The differences between areas
• Different repeated measurement shapes

Analyzing 2 × 2 classification tables
• What are classification tables?
• Interpreting proportions
• Risk difference
• Relative risk and relating two proportions
• Constructing a hypothesis of no difference between the groups
• Issues surrounding the Chi Square test (Fisher's Exact Test)
• What does p