Prior to the introduction of computer technologies in the 1960s, the correspondence-course and independent-study models of distance education posed challenges to the teaching and learning process, contributing to a persistent credibility problem for the field. Telecourses (Verduin & Clark, 1991), developed in the 1970s, showed promise for minimizing some of these problems. Previously, television had been used primarily as an electronic blackboard and for the delivery of standardized content through lectures intended to reach wide audiences. The development of videotape allowed educators to customize the same content for different learning environments.
This medium also allowed increased flexibility: course content could be stored, delivered, and repeated at will, minimizing the time-dependency that had been a drawback of earlier televised courses. Despite these advantages, however, the cost and complexity of producing telecourses made them impractical for teaching large numbers of students. Around the same time, the “open university” concept was launched. The creation of universities open to all was driven by the need to provide alternative education for adults whose needs could not be met in the traditional classroom. The British Open University began in 1969 through video broadcasting of its weekly courses on the BBC. Over time and with the advent of new technologies, the British Open University’s model of distance learning evolved into a student-centered delivery system and administrative structure separate from a campus setting.
More economically practical than telecourses, this system envisioned each student as “a node in the network” (Granger, 1990, p. 189), receiving individualized instruction in a virtual classroom. Students have access to a virtual library, customizable to their particular learning style, and to collaborative tools that encourage discourse and critical thinking (Prewitt, 1998). By fostering a community of learners, this model overcomes part of the problem of isolation. During the 1970s, the capability of computers to automate tasks and deliver information made them invaluable tools for many companies, increasing the need for technologically competent workers. This prompted the inception of corporate training programs focused on technology literacy. In schools, word processors, spreadsheets, and database applications enhanced the productivity of educators and students, and the development of educational software offered interactive ways to deliver academic content.
As early as the 1960s, computers were being used with adult learners at the University of Illinois through an integrated learning system called Programmed Logic for Automatic Teaching Operations (PLATO). Integrated learning systems—computer-based instructional programs—were used to distribute educational content through a local network, usually to students in the same room or building. In the 1980s, owing to the rapid evolution of information technology and society's increasing dependence upon computers, the use of educational technologies expanded considerably. However, during these first few decades, information technologies were used more to automate traditional models of educational delivery than to develop new forms of pedagogy or to enhance learning across distance.