http://web.simmons.edu/~chen/nit/NIT%2793/93-173-ives.html

INFORMATION ACCESS IN THE 21ST CENTURY:
THEORY VS. REALITY
David J. Ives
University of Missouri
Columbia, MO 65201
Information Access, Access, Theory of Access, Reality of Access, Electronic Superhighway, Omni-Presence, Omni-Knowledge, Omni-Collaboration, Omni-Access, Universal Access, Restricted Access, Filtering, Comprehension, Synthesis, Access Control, Access Cost, Affordability, Repository, Censoring, NREN, National Research and Education Network, Information Superhighway.
Abstract: As we enter the 21st century's digital information age, it is important to examine the "electronic superhighway" model of information access in terms of the forces, variables, and factors we encounter daily. This paper does just that.

1. INTRODUCTION
The long-awaited and oft-predicted "electronic information superhighway" is actually better described as an electronic wish list, compiled so as to promise at least something for any and all possible users. In fact, this particular model has about it many of the trappings of a ritualistic construct -- it provides "an occasion for reflection and rationalization in the fact that what ought to have been done was not done, and what ought to have taken place did not occur" (Smith, 1982, p. 63). For the past two decades or more, the world's information specialists, librarians, and the general public have been exposed, on many occasions and in many formats, to this particular model of what the future might, or "will," hold for the broad arena of information access and use (e.g., Hays, 1970; Dordick et al., 1981; Dowlin, 1984; Lewis, 1985; Brownrigg, 1990; Fisher, 1992; McClure et al., 1992; Rossman, 1992; Benhamou, 1993; Elmer-DeWitt, 1993; Reynolds, 1993).
The time has come, indeed is long past, when this "electronic superhighway" model must be examined in terms of those forces, variables, and factors that we all must deal with every day. This is a time for examining the reality that is integral to the concepts of information access and use, not the ideality; and it is against this reality that the following discussion is presented.

2. ELECTRONIC SUPERHIGHWAY MODEL
The goals of this model (also known as the NREN, National Research and Education Network model) -- which I will call the "universal-access" model -- have been succinctly put forth by Kenneth M. King (Brownrigg, 1990) as follows:
* "Connect every scholar in the world to every other scholar and thus reduce the barriers to scholarly interaction of space, time, and cultures.
* Connect to the network all important information sources, specialized instruments, and computing resources worth sharing.
* Build databases that are collaboratively and dynamically maintained that contain all that is known on a particular subject.
* Create a knowledge management system on the network that will enable scholars to navigate through these resources in a standard, intuitive, and consistent way."
For purposes of this analysis and discussion, it is assumed that most NREN "electronic superhighway" proponents and supporters would agree with the gist, if not the letter, of these summarizing statements. These statements would seem to forecast a future that encompasses an information and scholarly "nirvana;" a time when all people will be able to access effortlessly all known existing information. As such, these aforementioned four summarizing statements will be designated as the principles of:
* Omni-Presence,
* Omni-Knowledge,
* Omni-Collaboration, and
* Omni-Access.
These principles are briefly discussed, seriatim and collectively, below.
The principle of Omni-Presence states:
"Connect every scholar in the world to every other scholar and thus reduce the barriers to scholarly interaction of space, time, and cultures."
This statement is based on a simple "if...then" construct -- if 'A' occurs, then 'B' will occur as a direct result. Unfortunately, there is no basis for assuming that this relationship exists vis-à-vis scholarly communication. Space and time currently are bridged by various means of communication: face-to-face conversations, mail, telephone, facsimile, and nodal (i.e., common communication loci such as bulletin boards). While near-instantaneous communications might, in a few instances, provide some real advantage over these other means, no empirical evidence exists to prove that any inherent good can be derived directly or indirectly from a near-instantaneous bridging of time and space. Any such statements based on the existence of an "if...then" relationship must be grounded in scientific inquiry and testing (Hempel, 1965; Davies, 1973, pp. 4-8, 65-70, 102; TenHouten and Kaplan, 1973, pp. 143-156), not on suppositions and unsupported and untested statements.
In fact, the opposite could just as easily be posited -- that instantaneous communications might be deleterious for those problems and those fields of study whose subject matter might require significant thought, internalization, or a synthesis of information, over a measurable period of time. It is proposed that many scholars would receive no substantive and verifiable benefits from being connected to every other scholar, in or outside of their field.
Even if the disappearance of existing space and time barriers were to be granted, there is no reason to believe that instantaneous communications would have any beneficial effects on differences that might be attributable to the "culture" of those participating in such communications. For example, would instantaneous communications among a Chinese sociologist, a Brazilian sociologist, and a Nigerian sociologist really reduce the "barriers to scholarly interaction" that are, or might be, based on the differing "cultures" and cultural backgrounds of these researchers?
The principle of Omni-Knowledge states:
"Connect to the network all important information sources, specialized instruments, and computing resources worth sharing."
While this seems, on the surface, a straightforward statement, it is not. Who will decide what is, or is not, an "information source"? How will it be decided which information sources are "important" and which are not, for whom are they important, and who will make these decisions? What will be the criteria for ascertaining which resources are "worth sharing" and which are not, and who will make these decisions? Until such significata are addressed in an objective and substantive manner, this principle (and all statements derived from it) must be considered arbitrary, obscure, and misleading.
The principle of Omni-Collaboration states:
"Build databases that are collaboratively and dynamically maintained that contain all that is known on a particular subject."
If it is assumed, and it is, that all fields of knowledge and subjects of endeavor are dynamic in nature, then it is impossible for any database ever to contain "all that is known" on any given subject. At any moment, some student or researcher is generating additional data or new knowledge for any, and every, given subject -- data and knowledge that will not be incorporated into existing databases, no matter what their nature or how frequently they are updated. In fact, there is evidence from at least one study that indicates that the "up-to-dateness" of information is only a minor real consideration among information users (Chen and Hernon, 1982, pp. 66-81).
The principle of Omni-Access states:
"Create a knowledge management system on the network that will enable scholars to navigate through these resources in a standard, intuitive, and consistent way."
How readily this principle can be accepted, for it is crucial to the successful operationalization of the "universal access" information model. Unfortunately, such a proposed "knowledge management system" would have to be a "standard" for all current and yet-to-come hardware and software platforms, and would have to be "intuitive" not only for scholars in every known area of study (and at every level, novice to expert), but it also would have to cross-connect all known rational intuition systems based on all known languages, cultures, and scientific constructs (e.g., inductive, hypothetico-deductive, abductive). It is proposed that such a "knowledge management system" would be impossible to generate and maintain -- due both to the dynamic nature and multiplexity of the technology and to the multifaceted nature of ongoing scientific inquiry.
3. INFORMATION RETRIEVAL AND USE
In fact, the "universal access" model glosses over or disregards a number of concrete requirements, costs, and restrictions that are, and that will continue to be, associated with information access and use. These factors are inherent in the various stages of information use -- access, filtering, comprehension, and synthesis.
* Access

Information access is neither free nor universal. Certain essential levels of access technology (e.g., pen, telephone lines, fax machine, or computer) must be made both affordable and readily available. The requisite information-access nodes or routes (e.g., telephone switches, Internet nodes, satellites, or anonymous FTP-access computers) also must be made affordable and easily available.

Censorship is a limiting factor that also must be considered; not all information may be accessible by all classes, groups, or organizational levels of users. Almost every organization today is practicing deliberate or de facto censorship with regard to limiting or restricting access of users to certain information.

Access also bespeaks the necessity of information archiving and storage -- which information will be stored and for how long, and who will make those decisions? In what format(s) will it be stored, and where will it be archived; and who will make these decisions? Who will pay the costs inherent in such long-term storage?
* Filtering

All information can be, or has been, filtered -- screened or refined either by the end-user or by any of the human and non-human nodes between the raw information itself and the end-user of that information (Norman, 1969, pp. 23-31). These filtering entities can include the person who entered the data into an electronic format, the program that is used to search for the information, and the person or the program that indexed the information. As noted above, censorship of various kinds also can act as a filter, even to the point of making the existence of certain information known or not known to a particular end-user.
* Comprehension and Synthesis
Information comprehension and synthesis is subject to a factor commonly known as "information overload" -- the inability to understand, utilize, or manipulate information due to its sheer quantity or complexity. This phenomenon already is obvious in many of today's organizations (Woodman, 1985, p. 99; Connors, 1993). When faced with a corpus of paper or electronic data or information, the user must be able to distinguish the relevant information (signal) from the non-relevant information (noise) (Travers, 1970, pp. 94-98).
Additionally, new information is constantly being generated by researchers or by the manipulation of previously existing data. With this consideration as a given, end-users may never be "current" in a field or subject area because they are attempting to assimilate and comprehend information while, simultaneously, new information is being generated or derived. As the end-user obtains access to more and more sources of information, the probability of such "information overload" increases.

Therefore, the "universal access" model of the future's information universe -- "all of the information all of the time for all of the people" -- founders on the shoals of the real costs, requirements, and restrictions of information access and use. And the "universal access" model fails to account successfully for one more factor, a factor that is becoming, and will increasingly become, the dominant force in information access. This factor can be labeled "information as commodity." It is proposed that information, and access to information, will become a commodity that must be purchased by the end-user. This transformation into a commodity already can be seen in microcomputer hardware and software and in some areas of information access, where different vendors make available the same database(s) at varying costs. Thus, both the information and the access to it can be considered a commodity in an open and competitive marketplace.
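The signal-versus-noise distinction above can be sketched as a toy filter. This is a minimal illustration, not a real relevance engine: the corpus, query terms, and matching rule are all hypothetical, and genuine relevance judgement is far harder than keyword overlap. The sketch only shows why some filtering stage is unavoidable as the volume of retrieved items grows.

```python
# Toy "signal vs. noise" filter over a small hypothetical corpus.
# Documents and query terms are invented for illustration only.
corpus = [
    "NREN funding and the national research network",
    "Recipe for corn chowder",
    "Network access charges for university libraries",
    "Celebrity gossip roundup",
    "Optic fiber bandwidth on the research network",
]

def filter_signal(documents, query_terms):
    """Keep only documents containing at least one query term (case-insensitive)."""
    terms = {t.lower() for t in query_terms}
    return [d for d in documents if terms & set(d.lower().split())]

signal = filter_signal(corpus, ["network", "NREN"])
print(len(signal))  # 3 of 5 documents survive the filter
```

Even this crude filter discards noise at the cost of possibly discarding signal (a relevant document that never uses the query's exact words is lost), which is precisely the overload dilemma the text describes.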
4. RESTRICTED ACCESS MODEL

Because the "universal access" model does not adequately consider numerous tangible factors and criteria, a competing model of future information access is proposed -- the "restricted access" model. Simply stated, this model predicts that future access to information will be based in large part on the end-users' budget for both information access and use; that technological expertise and capabilities will continue to be unevenly distributed; and that the great bulk of available information will become a commodity
-- the end result of these factors being both local and global information "haves" and "have nots." This viewpoint is not unique; others have published admonitions for a number of years (e.g., Boss, 1982; Kibirige, 1983, pp. 83-103; Coy, 1991a; Anthes, 1993; Botein, 1993; Macilwain, 1993; Schiller, 1993).
This "restricted access" information model is predicated on the following seven elements and
* Affordability

Only those who can afford to purchase, or obtain access to, the requisite hardware and software will be able to utilize electronic information.
The actual costs of Internet access and use are many. While often not obvious to the casual user, these costs are substantial and are not limited to financial units of measure. They may include:

- User account costs in dollars per connection unit time as well as per CPU use unit,
- Pertinent telecommunication line installation costs and use charges,
- Purchase costs of all requisite hardware,
- The maintenance, repair, and replacement costs for requisite hardware,
- Costs inherent in the purchase and upgrading of any requisite software,
- Costs of requisite printer paper, ribbons, or output page charges,
- Costs of the requisite electricity and climate control needs and usage,
- Any initial and ongoing staff and user training costs and all training materials costs, and
- The cost of the user's time while accessing and using the information source(s) -- for while they are doing so, their time does represent a real cost to their employing/sponsoring organization.
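The cost components listed above can be summed into a simple total-cost-of-access model. All component names and dollar figures below are illustrative assumptions, not data from the paper; the point is only that the user's own time enters the total alongside the direct charges.

```python
# Hypothetical annual cost-of-access model. Every figure here is an
# invented placeholder; substitute real local costs to use the model.
COSTS_PER_YEAR = {
    "account_and_cpu_charges": 480.0,
    "telecom_line_and_usage": 600.0,
    "hardware_amortized": 800.0,
    "hardware_maintenance": 200.0,
    "software_purchase_upgrades": 300.0,
    "consumables_and_output": 150.0,
    "electricity_climate_control": 120.0,
    "training_and_materials": 250.0,
}

def total_access_cost(components, hours_of_use, hourly_wage):
    """Direct component costs plus the opportunity cost of the user's time."""
    direct = sum(components.values())
    time_cost = hours_of_use * hourly_wage
    return direct + time_cost

# With these placeholder figures, 100 hours of use at $20/hour:
print(total_access_cost(COSTS_PER_YEAR, hours_of_use=100, hourly_wage=20.0))  # 4900.0
```

Note that in this example the time cost (2000.0) is a large fraction of the total, supporting the paper's point that access costs "are not limited to financial units of measure."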
* Data Repositories and Their Costs
The major electronic information repositories, and the access to them, will be held by either major research universities or private corporations, who will pass on the storage, access and overhead costs to organizational or individual users.
As more and more data are generated and made available via the Internet (or its future permutations) and other means, the need for massive and permanent data-storage facilities will grow. The two types of organizations that currently are able to fund, staff, and maintain such facilities are private corporations and a few major research universities. Universities may do so to service their own users, a consortium of users, or any user who may access their facility. Private corporations will do so for profit, rather than collegial, motives (e.g., DIALOG, OCLC, WestLaw, Lexis/Nexis, Dow Jones). While major "data repository" universities may have to charge access or usage fees, it is proposed that such fees usually will be less than those charged by profit-oriented private corporations. However, it also may be the case that these private corporations may be more responsive than universities to the changing needs and demands of their clients -- for these are the users and organizations whose fees help support the corporation itself.
* Proprietary Access
Competing or similar databases or data repositories will be accessed by proprietary means.
The major point of difference or competition among organizations that offer competing or similar databases (e.g., MedLine) is the thoroughness, speed, and ease-of-use of their searching and data-compiling software. Because this may be the only significant difference among such organizations, it behooves them to ensure the uniqueness and the proprietary nature of this "front- end" software. To do otherwise would be to make that database available on an open market, controlled solely by cost, with the organization/corporation with the lowest prices eventually driving out all competitors.
* Access Control and Cost
The next transmutation of today's "Internet" will be controlled by private corporations.
Any transmutation of today's Internet into the heralded "electronic superhighway" will require substantial funding -- on the order of tens of billions of dollars. This transmutation will involve, minimally: increasing the number of access points; rewiring all major and/or minor nodes with optic fiber; increasing the available transmission bandwidth; and, developing uncomplicated, consistent, and "intelligent" front-end user software. While Congress may pass ENABLING legislation at some point for the generation of this "electronic superhighway," any passage of the necessary APPROPRIATING legislation will be a much longer and considerably more hard-fought Congressional process.
Therefore, the requisite funding for any such "electronic superhighway" (in part or in toto) will have to come from private corporations and organizations -- the only non-governmental sources capable of supporting such massive costs (Coy, 1991b; Anonymous, 1992; Anonymous, 1993a; Anonymous, 1993b; Carlson, 1993; Roberts and Carnevale, 1993). And, it is extremely likely that these private entities will wish to recoup their initial, and ongoing, financial investments and even to turn a profit from them. The only way such costs could be recouped is through some sort of charge or assessment for access to and use of this "electronic superhighway," said assessment to be levied either directly on the end-user or on their sponsoring organization (e.g., public library, university, company).
* Information Censoring

Certain databases or data repositories will be selectively censored by the organizations that maintain them or maintain access to them; such censoring may or may not be made known to the end users.
Whether such censorship is based on access/storage costs, or on a "need to know" basis, or on the current software restrictions or capabilities, or on organizational internal politics, or on some manifestation of randomness is a moot point. The critical factor is the actual existence of the censorship process -- which results in the presenting, to end-users, of only a partial information universe rather than the user-assumed totality of that universe. Such censorship-like processes already exist, whether the individual user is aware of them or not
(Appendices I & II). And, any existence, or manner, of censorship flies in the face of those envisioning an "electronic superhighway" -- accessible by all, with all information being accessible.
* Information Limitations
Software and hardware errors, incompatibilities, and malfunctions will limit an end-user's knowledge of, access to, and use of the entire information universe -- in an unknowable random or non-random pattern.
Such software and hardware factors are inherent in the complex electronic nature of the user-to-source information architecture. In such an architecture, it is a truism that "a chain is only as strong as its weakest link." The current, or future, electronic chain contains numerous links: the end-user hardware and software; the data source hardware and software; the hardware and software of all intermediary nodes (e.g., retransmission sites, Veronica sites, Internet-access mainframes or LAN servers, Archie sites, Gopher or WAIS software, satellites and ground stations); and the multiple copper, optic fiber, or wireless transmission linkages that connect these requisite nodes. Should any of these links fail or diminish the robustness of the electronic chain, the result will be a failure to locate certain, or all, desired data and information. Any such reduction of the totality of the information universe is, of course, in addition to those reductions due to censorship and access costs.
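The "weakest link" argument can be made quantitative with a small sketch: if each link in the user-to-source chain works independently with some probability, the chance that the whole chain works is the product of the per-link probabilities, and so is always worse than the weakest single link. The link names and reliability figures below are illustrative assumptions, not measurements.

```python
from functools import reduce

# Illustrative per-link reliabilities (invented figures) for the
# user-to-source chain described in the text.
link_reliability = {
    "end_user_workstation": 0.99,
    "campus_LAN_server": 0.98,
    "Internet_access_mainframe": 0.97,
    "transmission_links": 0.95,
    "remote_data_source": 0.96,
}

def chain_success_probability(links):
    """Probability that every link functions, assuming independent failures."""
    return reduce(lambda acc, p: acc * p, links.values(), 1.0)

p = chain_success_probability(link_reliability)
print(f"{p:.3f}")  # prints 0.858 -- well below the weakest single link (0.95)
```

Five individually reliable links already drop end-to-end success to roughly 86%, which is why the text's point about a multiplicity of intermediary nodes matters.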
* The Fallacy of "All Data"

Access to all of the information and data available for any given subject is not knowably possible, nor is it necessarily a desirable goal.
At the present time, and for the foreseeable future, there does not exist a means which would enable a user to ascertain the quantity, nature, and location of ALL electronically-available information on any given subject. As previously noted, all information subject areas are dynamic in nature, constantly being enhanced by new or synthesized data and information. While it may be theoretically possible to define the totality of information in a particular subject area at a given instant of time, that totality is but a moving point on a continuum of unknown dimensions; it constantly changes. The inherent limitation of this totality of information also is dependent on previously mentioned factors as well: information censorship and information accessibility.

5. CONCLUSIONS
While the "universal access" or "electronic superhighway" model may present a tempting theoretical vision of future information access and use, it presents neither the only vision of that future, nor does it necessarily present the most probable vision of that future. The "universal access" model fails to address adequately such factors as: funding and the inherent costs of information access and use; data storage, its ownership and costs; deliberate or non-deliberate censorship of information; and the effects and consequences of information overload on the user -- both at the individual user level and at the more generalized level of how much, and how, information will and should be accessed and used.

It is proposed that a "restricted access" model does incorporate most such limiting and controlling factors. While this latter model portrays neither an idealistic nor an egalitarian future for information access and retrieval, it does portray a realistic future of information "haves" and "have nots," a future based on the very real human, fiscal, technological, and political variables with which all of us are faced daily
. The future of local and global information access and use must be based on reality, not ideality. The "restricted access" model of information access and use proposed above cannot be replaced by any "universal access" model unless those external and internal limiting and restricting factors that have been presented are addressed and appropriately modified or eliminated.

REFERENCES
Anonymous, "Venture with Motorola Set for Highway Technology," The Wall Street Journal, 25 August, 1992, p.A4.
Anonymous, "Bells Open With $100B Offer," Variety, 350 (12): 23 (1993). [1993a]
Anonymous, "BellSouth, GTE, Unit of Sprint Join to Build Information Highway," The Wall Street Journal, 11 May, 1993, p.A6. [1993b]
Anthes, Gary H., "Data Superhighway Plan Panned," Computerworld, 27,(28): 8 (1993).
Benhamou, Eric, "Let's Get Going on a Data Highway," The New York Times, 142 (3): F11 (14 March, 1993).
Boss, Richard W., "The Impact of Technology on Libraries: The Myth of the Paperless Society," in: Information Technology: Critical Choices for Library Decision-Makers, edited by Allen Kent and Thomas J. Galvin. New York: Marcel Dekker, Inc., 1982.
Botein, Michael, "It is Coming, But Don't Rush It," The New York Times, 142 (3): F11 (14 March, 1993).
Brownrigg, Edwin, "Developing the Information Highway: Issues for Libraries," in Library Perspectives on NREN, edited by Carol A. Parkhurst. Chicago, IL: LITA, 1990.
Carlson, Bob, "Public and Private Interests Pursue National Information 'Superhighway'," Computer, 26 (3): 93 (1993).
Chen, Ching-chih and Peter Hernon. Information Seeking: Assessing and Anticipating User Needs. Applications in Information Management and Technology Series. NY: Neal-Schuman Publishers, Inc., 1982.
Connors, Tom, "The Highway to a Short Circuit?" Journal of Commerce, 396 (27971): 8A (May 6, 1993).
Coy, Peter, "How Do You Build An Information Highway?" Business Week, (3231): 108-110, (September 16, 1991). [1991a]
Coy, Peter, "Why the Highway Won't Reach the Home Just Yet" Business Week, (3231): 112, (September 16, 1991). [1991b]
Davies, J. T. The Scientific Approach. London: Academic Press, 1973.
Dordick, Herbert S., Helen G. Bradley and Burt Nanus. The Emerging Network Marketplace. Norwood, NJ: Ablex Publishing Corporation, 1981.
Dowlin, Kenneth E. The Electronic Library: The Promise and the Process. New York: Neal-Schuman Publishers, Inc., 1984.
Elmer-DeWitt, Philip, "Electronic Superhighway," Time, 141 (15): 50-56 (1993).
Fisher, Mary Jane, "Health Care Electronic 'Highway' May Save Billions," National Underwriter Life & Health-Financial Services, 31: 28-29 (3 August, 1992).
Hays, David G., "The On-Line Society: With Liberty and Access for All," in The Social Impact of Information Retrieval: The Information Bazaar. edited by Alberta D. Berton. Philadelphia, PA: Medical Documentation Service, The College of Physicians of Philadelphia, 1970.
Hempel, Carl G., "Science and Human Values," in Aspects of Scientific Explanation. New York: The Free Press, 1965.
Kibirige, Harry M., "The Information Dilemma," in New Directions in Librarianship, No. 4. Westport, CT: Greenwood Press, 1983.
Lewis, Dennis, "Expanding Horizons," in Information Management: From Strategies to Action, edited by Blaise Cronin. London: Aslib, 1985.
McClure, Charles R. et al. The National Research and Education Network (NREN): Research and Policy Perspectives. Norwood, NJ: Ablex Publishing Corporation, 1992.
Macilwain, Colin, "US Electronic Highway Needs Consensus More Than Dollars," Nature, 362 (6421): 582 (1993).
Norman, Donald A. Memory and Attention. New York: John Wiley & Sons, Inc., 1969.
Reynolds, Larry, "Speeding Toward the Information Superhighway," Management Review, 82 (7): 61-63 (1993).
Roberts, Johnnie L. and Carnevale, Mary Lu, "Time Warner Plans Electronic 'Superhighway'," The Wall Street Journal, 27 Jan. 1993, p. B1.
Rossman, Parker. The Emerging Worldwide Electronic University: Information Age Global Higher Education. Westport, CT: Greenwood Press, 1992.
Schiller, Herbert I., "Public Way or Private Road? The 'Information Highway'," The Nation, 257 (2): 64-66 (1993).
Smith, Jonathan Z., "The Bare Facts of Ritual," in Imagining Religion: From Babylon to Jonestown. Chicago: University of Chicago Press, 1982, pp. 53-65.
TenHouten, Warren D. and Kaplan, Charles D. Science and Its Mirror Image. NY: Harper & Row, 1973.
Travers, Robert M. W. Man's Information System. Scranton, PA: Chandler Publishing Company, 1970.
Woodman, Lynda, "Information Management in Large Organizations," in Information Management: From Strategies to Action, edited by Blaise Cronin. London: Aslib, 1985.
APPENDIX I

It is an easily provable fact that a search of the available information universe for any number of topics will not generate consistent and reliable results. For example, a search of Internet gopher sites, using the Veronica search software, produced the indicated number of "hits" for each of the denoted keywords. The quality or redundancy of the data pointed to by each of these hits was not evaluated and is moot for the purposes of this example.
The cause(s) of the disparity among the number of hits per keyword per Veronica site is unknown, but could be due to some form of deliberate/non-deliberate censorship, or to in situ software/hardware problems or incompatibilities. It also must be noted, and emphasized, that the first search on a keyword on many occasions returned an error message or a message indicating that no information was available. Another search, initiated immediately afterwards on the same site, often would return a number of hits. Thus, the internal integrity and replicative abilities of any of these Veronica sites must be suspect. In any case, these data illustrate the varying results that can be obtained for any such broad-ranging search.
Keyword       scs.unr.edu   nysernet.org   serra.unipi.it   gopher.psi.com
Bondage            23             0               0
Bosnia            163            61             163
Euthanasia          9             3               9
New York          656           579             700
Overload            0            28              31
Terrorist           2             3               0
NOTE: Unbracketed dates = a search conducted 12 June 1993;
[Bracketed] dates = a search conducted 1 September 1993.
Only the scs.unr.edu, nysernet.org and serra.unipi.it Veronica sites were available on 12 June 1993; only the serra.unipi.it and gopher.psi.com Veronica sites were available on 1 September 1993. This, of course, introduces yet another variable in any attempt to locate "all" of the information on a given topic.
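The disparities in the table above can be summarized directly from the paper's own 12 June 1993 figures (the gopher.psi.com column is omitted because those values are not recorded here). The spread between the most and fewest hits for the same keyword is one crude measure of how inconsistent the "information universe" looked from site to site:

```python
# Hit counts per keyword from the 12 June 1993 Veronica searches, in the
# order (scs.unr.edu, nysernet.org, serra.unipi.it), taken from the table above.
hits = {
    "Bondage":    (23, 0, 0),
    "Bosnia":     (163, 61, 163),
    "Euthanasia": (9, 3, 9),
    "New York":   (656, 579, 700),
    "Overload":   (0, 28, 31),
    "Terrorist":  (2, 3, 0),
}

def disparity(counts):
    """Spread between the most and fewest hits the same keyword returned."""
    return max(counts) - min(counts)

for kw, counts in sorted(hits.items(), key=lambda kv: -disparity(kv[1])):
    print(f"{kw:12s} spread = {disparity(counts)}")
```

By this measure "New York" (spread 121) and "Bosnia" (spread 102) vary most, and every keyword returned zero hits at at least one site or date combination for three of the six terms, illustrating the replication problem described above.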
APPENDIX II

The following listing of electronic newsgroups is but a small example that illustrates the extant censorship of data and information, or the censoring of access to it. These newsgroups represent a non-random sampling of those that can, or cannot, be subscribed to at the University of Missouri, Columbia, Missouri (all of them, however, are listed on the campus IBM mainframe computer) as of 1 September 1993.
CAN ACCESS NEWSGROUP: CANNOT ACCESS NEWSGROUP:
The reason, or reasons, for this differentiation in data access and availability is neither known nor assumed; it merely exists, and no approbation of it should be assumed. Nor is it assumed that the University of Missouri is unique in this respect; in fact, quite the opposite is assumed. The point to be made is that many organizations undoubtedly censor the information (or access to it) that they make available to their users. The reason(s) for this censorship, and its very existence, may or may not be known to those users.