A seminar in the series "Is humanities computing an academic discipline?", held under the auspices of the Institute for Advanced Technology in the Humanities (IATH) at the University of Virginia, Guy Fawkes Day 1999. (Ver. 13/10/99; rev. 14/10, 15/10; 22/10.)
Before we get even that far, however, we need to understand why we are asking this question at all.
One answer is self-interest: we want our place in the sun, in order that we may be recognised academically, grow, become what a number of us know we can become without the loss of so many good people. We want to benefit the many fields of scholarship in the humanities, wherever the computer can improve conditions for teaching and research. Enlightened self-interest, if you will.
Another answer is that questioning disciplinarity, indeed questioning the raison d'être of higher education, is a feature of contemporary society, at least in N America and Europe. As Jonathan Culler argues in Framing the Sign, this questioning indicates the vitality of our academic cultures, but it also accompanies fundamental changes, not all of which are necessarily good. Universities are now reconfiguring themselves to adapt to changing conditions of societies that are no longer certain of what they're for; some of this reconfiguring seems manifestly silly. Disciplines are questioning identities formed around cultural assumptions no longer workable. We can see how computing is apt to have profoundly transformative effects within the academy, and we can observe the beginnings of these changes, some of which affect how we construct various disciplines, what kinds of work get done, who gets hired to do it and so forth. There are very good reasons for concern here. In asking, "is humanities computing a discipline?" we are in part asking how a disciplined understanding of our subject is to shape our institutional response to these changes. Perhaps it needs to be said that unless we respond intelligently we're likely to find ourselves in a world much less to our liking. So, it is very important to everyone that we ask our question – and so even more important that we get that question right.
The first assumption is the institutional context of knowledge, which is to say that what we know, how we know it and what we do with the knowledge is shaped by the institutions within which we work. Disciplines can of course be practised independently, by euphemistically named "independent scholars", but the context in which our question is being asked quite clearly implies an institutional definition. As Culler notes, the increasing professionalisation of the intellectual life in this century-almost-gone has made a disciplined pursuit of knowledge outside the academy anomalous. Institutionalised disciplinarity now determines not just how scholarship is done but what is recognised as scholarship.
The first new question, then, is what are institutions for?
The definition of the term "institution" in the Oxford English Dictionary suggests "regulation and instruction subservient to the needs of those whom the institution serves". In other words, we build our institutions; they in turn shape us. They embody an ideal or function that might otherwise perish, and they see to its survival beyond any individual life. To borrow a metaphor from Mircea Eliade, an institution is an enduring stone body for a mortal soul. Thus we ask the question posed by this Seminar because we want to see our work survive, and we would spare the next generation of computing humanists the difficulties we have had so that they may concentrate on more important matters than whether what we do exists or not.
By definition the embodied mental form of an institution directs, focuses and so limits the thinking of those inside it. Hence fundamental questioning requires the imagination to step outside the institution, think beyond it. People in some institutions may in fact be able to think more clearly than those in others not because they are more intelligent but because they are less inhibited. Take the specific case of humanities computing, for example. Consider in particular two institutional frameworks within which humanities computing has been done: the computing centre, which is as a rule predicated on its own relegation to providing subservient technical services; and the conventional academic department, whose scope of vision is necessarily constrained to its own set of interests. From either standpoint, humanities computing is very difficult to think about clearly.
Difficult, not impossible. Jonathan Culler points out that in talk of this kind, in which the word "context" is often used to reference the environment of an intellectual object, it is dangerously easy to take the environment as an unalterable, determinative given, to suppose, for example, that a particular institutional "context" necessarily has a particular effect (ix). Reference to context is thus often a reductive move, away from the complexity of competing tendencies and minds of varying independence. Better, he suggests, is "frame": it is something we make; it sometimes falsifies ("I've been framed"); it sets apart the thing framed, defines and signifies it, as a frame does a picture.
Let's say then that our knowledge is "framed", which is to say, mediated or filtered. Technological mediation is of course a popular topic now and part of what we study; attention to it has illuminated the role of the codex as well as more recent tools in the shaping of knowledge. We have paid considerably less attention to institutional mediation – a highly political, hence potentially more threatening subject, but one we must face if we are to understand how to bring about the changes we wish for. The topic of this seminar is very timely indeed: it raises the question of the sociology of knowledge, a whole area of research we need to know much more about. (So, here I flag a paper that I will not write but which needs to be written, and will be most welcome.)
The next and perhaps most crucially overlooked assumption in our question is the nature of disciplines. Again and again I have heard people address the issue of whether humanities computing is one without ever asking what one is. It seems to me that our primary task here is to relativise the notion of "discipline", in particular to bring its principled status into sharp question.
Again I start with the OED. With a nod to the well-known fallacy, I draw your attention to the etymological headnote, which tells us that
Etymologically discipline, pertaining to the disciple or scholar, is antithetical to doctrine, the property of the doctor or teacher; hence, in the history of the words, doctrine is more concerned with abstract theory, and discipline with practice or exercise.

There are other shades to the meaning, some of them unpleasant, but I suggest that the etymological sense is a good place to begin. It suggests that we could begin by deriving our notion of discipline inductively, from what is actually done, rather than deductively, from a set of first principles. You will notice that in discussions of disciplinary status, these supposed principles seem always to be off stage, behind a curtain, like the Wizard of Oz. Culler, surveying the history of literary criticism, notes that in English studies the myth of foundationalism (that is to say, the founding story of unified principles that members of a field subsequently tell themselves, for which see Chapter 2) does not hold up to close inspection. What you actually find are fundamental disagreements animated by a vigour that, Culler argues, is what we really need to pay attention to.
In the history of disciplines, we may ask, from what does this vigour originate? Is it entirely a matter of contingencies, or are there deeper principles involved? What in Platonic terms might a discipline be?
These are important questions, but fortunately for us the want of an answer need not detain us. (Again I flag research that needs to be done and communicated.) In practical terms the most important consequence of raising these questions is to forestall the assumption that "discipline" is a settled criterion to measure our field by. It most certainly is not.
A true interdiscipline is, however, not easily understood, funded or managed in a world already divided along disciplinary lines, despite the standard pieties. Properly so called, an interdiscipline is not just another administrative entity with its budget, chair and department members – difficult as this is to carve these days out of existing turf; it isn't an institutionally sanctioned kind of poaching. Rather it is an entity that exists in the interstices of the existing fields, dealing with some, many or all of them. It is the Phoenician trader among the settled nations. Its existence is enigmatic in such a world; the enigma challenges us to rethink how we organise and institutionalise knowledge. Such challenges have political consequences I need not spell out.
Northrop Frye somewhere comments that "each discipline is the centre of all knowledge". Not every object of knowledge is equidistant from the disciplinary centre, nor is every one equally relevant, but all such objects are defined in its terms. Hence we return to the earlier point that knowledge is mediated by institutional structures. As long as issues of ownership over source materials and approaches to them are clear, the politics can be relatively simple. Communication among disciplines is problematic, however. By nature humanities computing communicates among the disciplines; by nature it challenges issues of ownership, which is to say, reveals that many are held in common and there is much to be gained from sharing them. If its real potential is understood, humanities computing can be quite threatening to the status quo.
To the list of work to be done, then, is construction or adoption of a model that adequately represents the interrelationship of humanities computing with the disciplines it serves and draws upon for its research and teaching. Comparative literature might seem a candidate to supply one, but the examples I know suggest that it is a recent discipline along the old lines rather than an interdiscipline. Elsewhere I have suggested that the interlanguage model proposed by Peter Galison in Image and Logic: A Material Culture of Microphysics suits us well. In his book Galison conceptualises the interrelations among subfields in that discipline by focusing on their shared instrumentation. He adopts the anthropological-linguistic metaphor of a "trading zone" between radically divergent cultures, where people lacking a common speech cobble together a temporary, semantically restricted language, or "pidgin", to negotiate the exchange of objects. These objects, he points out, tend to have very different meanings and uses in the two cultures. From the holistic view of each culture, translation of the object into the other denudes it of its local, original meaning, but from the trader's view there is no loss, only change in how the shared object is put to use. The trader's concern is perforce with communication of a common object.
The computing humanist is, then, like the Phoenician trader whom I mentioned earlier in passing: moving from culture to culture, bringing techniques from one very different application to another. How this leads to genuine research problems is a matter I will return to shortly. I have also suggested elsewhere that in its specific application to humanities computing not all of the conventional disciplines have the same kind of relationship to our field. I have made the distinction between primary and secondary relationships, depending on whether we incorporate the ways and methods of a field into our own or just serve its interests and draw from it whatever common techniques might be of use elsewhere. I have proposed that we regard as primary the disciplines of history (especially the history of science and technology), philosophy and sociology, all the rest as secondary. I will not argue the matter here. Rather I wish to stress the point that we need to think first about a model for truly interdisciplinary work, then about how we may draw what we need from the disciplines that have something to teach us.
The latter point leads immediately to the question of what we put into a curriculum for the computing humanists of the future, which I suggest we begin to address by finding out what is currently taught world-wide under the rubric of humanities computing. More research I flag here to be done.
Computer science is a special case that is currently under discussion at least in the United States. Evidence from computer science departments suggests that our colleagues in CS tend strongly to have other interests than our own: there are very few departments that offer a humanities computing option, and then only under quite special circumstances. Courses for non-majors seldom overlap with anything that we would consider to be relevant, the skill of programming perhaps excepted. If, however, disciplines are as plastic as their history would lead us to think, then the question is an open matter. What might a mutually beneficial relationship be like? (Another question for which answers would be useful.)
What have we learned from the effort? A few tentative points.
First, we can observe widespread recognition that humanities computing is a specialised activity belonging to the arts and humanities rather than to the computing centre; second, that at least in the U.S. the majority response has been to interpret "humanities computing" as meaning the provision of electronic resources to academics in conventional departments; third, that electronic text centres and the like have tended to be incorporated into libraries, which increasingly provide electronic data alongside printed material; fourth, that it is not generally accepted as a variety of computer science.
The majority view, however, homogenises away telling differences. It would, perhaps, not be a majority at all were we to factor in the differences in respective population size for the various countries where activity in our field has been reported. More importantly, there are fundamental differences in the national academic cultures involved. These differences raise many basic questions that I cannot answer but which, I suggest, would greatly repay the investment of time and effort to study. (Another research flag.) I will return to a brief consideration of the two I know from personal experience, the Canadian-American and the British.
Perhaps the most immediate effect of the survey, however, is to reveal the amount of related activity to which the list of institutional departments, centres, programmes and functions attests. Clearly we have many, many bridges to build.
Anecdotal evidence may be all we have at the moment, but it is valuable. Again, elsewhere I have discussed my own. Indeed, my conception of humanities computing has come from the effort to make sense of my own experiences of teaching, research and advising.
Briefly, classroom teaching has shown me that there is a subject to be taught and that it consists of common methodologies; stewardship of Humanist, that a diverse, intellectually vital community of people is out there, however scattered; advising, that what is learned methodologically in one specific research situation can be carried into others; finally, research, that failure plays the key role in our work intellectually.
Let me focus all that onto the question of what might constitute scholarship in humanities computing.
Across the disciplines of application, there are from the computational perspective three fundamental approaches. These are normally found in combination; I distinguish them here as a matter of emphasis.
The first I call algorithmic: using it we apply mechanical processes, or algorithms, to source materials in order to analyse them. Often, though not always, these algorithms model an existing kind of analysis. For example, crudely equating words with ideas we get at meaning in a text through some statistical test, such as word-frequency, or by examining collocations within a defined span of words. If an algorithm does the analysis completely satisfactorily, then the phenomenon analysed has been shown to be trivial in the mathematician's sense, i.e. it no longer has any scholarly interest in itself. In other words – this is an extremely important point to which I will return – as scholars we are interested in algorithms that fail. Fortunately for us most of them do most of the time in some way or other.
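The kind of algorithmic analysis meant here can be sketched very simply. The following Python fragment is illustrative only: the sample text, the span size and the function names are inventions for this sketch, not any standard tool. It counts word frequencies and gathers collocates within a defined span of words:

```python
from collections import Counter

def word_frequencies(text):
    """Count occurrences of each word, crudely equating words with ideas."""
    words = text.lower().split()
    return Counter(words)

def collocates(text, node, span=4):
    """Collect words occurring within `span` words of each occurrence of `node`."""
    words = text.lower().split()
    found = Counter()
    for i, w in enumerate(words):
        if w == node:
            # Take the window of words on either side of the node word.
            window = words[max(0, i - span):i] + words[i + 1:i + span + 1]
            found.update(window)
    return found

text = "the winds tear the world and the winds rule the air"
print(word_frequencies(text).most_common(2))  # [('the', 4), ('winds', 2)]
print(collocates(text, "winds", span=2))
```

Even so crude a sketch illustrates the point about failure: equating words with ideas, it happily reports "the" as the dominant "idea" of the passage, and it is in just such failures that the scholarly interest lies.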
The second approach I call metatextual: using it we invest our own interpretative intelligence into the source material by adding tags that render chosen phenomena into a form software can handle reliably. The metatextual approach is, if you will, our wisely impatient response to algorithmic failures, since it allows us to get results while at the same time moving the locus of failure from the machine to our minds, where we can observe the details more easily. To put the matter another way, tagging has two immediate consequences for the research in question. The first is that we produce augmented source material useful with existing programs, and so can get on with our more or less conventional scholarship. (I observe that the more we do it well, the less we do it conventionally.) For this augmented material to be useful in a scholarly sense, however, we must undertake a discipline to specify in as completely explicit and consistent a manner as possible the phenomenon we are investigating. The second consequence of tagging, then, is the set of guidelines or rules – a descriptive grammar of the phenomenon, if you will – that ensures consistency. The grammar is consequential because, as Edward Sapir said in Language, all grammars leak, that is, fail – and failure is the scholar's gold. More on this in a moment.
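To make the metatextual approach concrete, here is a minimal illustration in Python. The element and attribute names are hypothetical, invented for this sketch rather than drawn from any actual encoding scheme; the point is only that explicit, consistent tags render an interpretative judgement into a form software can retrieve reliably:

```python
import xml.etree.ElementTree as ET

# A hypothetical fragment tagged according to an (illustrative) descriptive
# grammar: each judged personification is wrapped in a <persn> element.
fragment = """<line n="60">quin lanient mundum; tanta est
  <persn type="kinship">discordia fratrum</persn></line>"""

root = ET.fromstring(fragment)
# The tags let a program retrieve the interpreted phenomenon mechanically.
for p in root.iter("persn"):
    print(p.get("type"), "->", " ".join(p.text.split()))
```

Everything the program can now do depends on the consistency of the tagging, which is why the descriptive grammar, and its inevitable leaks, matter so much.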
The third approach I call presentational: using it we arrange and format data in order to help ourselves think more clearly, in order (in a sense) to see what is there. The KWIC concordance format has, for example, had such an enormous influence on how we think about language because of its presentational effects. Similarly charts or graphs are effective because they present data in a way that can make otherwise obscure patterns in them immediately obvious. Note that with the presentational approach our scholarship has even less to do with the strictly computational processes than it does when we use the metatextual approach. The computer becomes a "black box"; all we are interested in is the output of the box. If we notice that a particular transformation is especially effective, we may then begin to get engaged with the details of the algorithms involved, but otherwise not.
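A minimal KWIC routine shows how much of the presentational effect is simply alignment. The implementation below is an illustrative sketch under assumed conventions (whitespace tokenisation, a fixed column width), not the format's canonical definition:

```python
def kwic(text, keyword, width=15):
    """Keyword-in-context: align each occurrence of `keyword` in a centred column."""
    words = text.split()
    lines = []
    for i, w in enumerate(words):
        if w.lower() == keyword.lower():
            # Trim the left and right contexts to a fixed display width.
            left = " ".join(words[:i])[-width:]
            right = " ".join(words[i + 1:])[:width]
            lines.append(f"{left:>{width}}  {w}  {right}")
    return lines

sample = "so great is the discord of brothers and the discord of winds"
for line in kwic(sample, "discord"):
    print(line)
```

The centred keyword column is what makes recurrent patterns of collocation leap to the eye, which is precisely the presentational effect at issue.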
Let me return to the problem (or, we might say, achievement) of failure for a moment. The core issue is forced disambiguation. In the design or adjustment of an algorithm or in marking up a source, that is, we must specify completely the phenomenon under investigation. All ambiguities must be resolved, full stop. In other words, we must say exactly how we know what we think we know. Because in the normal circumstance of scholarship, e.g. with a poetic text, a painting or colloquial speech, we cannot say exactly, we raise the question, how do we know what we know? That question is at the root of all scholarship, its fundamental beginning, its fundamental goal. The real issue in this seminar, I submit to you, is whether humanities computing gets to this question in a particularly interesting, productive way. If so, then our field can stand tall and proud among the disciplines.
Let me give you a single example from my own work: a possible personification in the text of Ovid's Metamorphoses.
The poet is describing the creation of the world, in particular the behaviour of the winds (I translate the passage as literally as I can):
His quoque non passim mundi fabricator habendum
aera permisit; vix nunc obsistitur illis,
cum sua quisque regat diverso flamina tractu
quin lanient mundum; tanta est discordia fratrum. (Met 1.57-60)

To these [the winds] also the creator of the world not everywhere entrusted
possession of the air; hardly even so can they be prevented,
although each rules his own blasts in separate channels
from tearing the world to pieces; so great is discord of brothers.

Whether the venti "winds" are personified here turns in large measure on whether we read the final phrase, tanta est discordia fratrum, as a comparison of winds to brothers or as a statement that they are brothers, thus the Loeb translator's "So fiercely do these brothers strive together". In the foregoing clause, the verbal phrase quisque regat "each rules" and the entity to which it refers, sua… flamina "his own blasts", have already laid the groundwork for personification by attributing a possession to each wind and underscoring the relationship reflexively. As I read the passage, however, it is very much to the purpose in this section of the poem, where things are coming out of nothing and life arising from inert matter, that we not know whether the venti "winds" are persons. Furthermore, the poem as a whole is trivialised if we do not see that undermining ontological categories is central to its overall purpose. The ambiguity, in other words, is not to be resolved.
Yet if one's purpose is to catalogue personifications so as better to understand what the poet is up to across the poem, ambiguity is anathema. We could say that the troublesome passage is an instance of "ambiguous" personification, but that does not avoid the problem, since one only creates a new category and so, in effect, resolves the real ambiguity. Attaching a weight or "degree of certainty" has the same problem and brings with it the danger of seeming precision, raising the question of how one calculates this degree reliably. My point here is that however one pronounces on the case, however one marks it up, using whatever metalanguage, expressed or implied, disambiguation is inevitable.
The problem does not seem so great when reading a prose translation and may not seem a problem at all when reading some critic's argument. The computational approach, however, makes no sense and yields no useful results unless each instance or possible instance of the phenomenon (whatever it may be) is rendered explicitly and consistently, according to a descriptive grammar, as I noted earlier. In computational terms, this grammar and the tagging that it describes (or, should it prove possible, the equivalent algorithm) models the phenomenon under investigation. It is, according to the way physicists tend to think about modelling, a manipulable device by which we study something recursively, running the model, altering it to improve results, running it again to see what happens. To be more precise, it is what American physicists have called a "tinkertoy model", i.e., a very crude approximation knowingly used to get at a phenomenon that we cannot approach directly. Subatomic particles and poetic phenomena both qualify. The point of modelling is not to establish the truth directly, since models are never true; it is to achieve failure so as to raise and point the question of how we know what we know, as I said before.
In his argument concerning the now famous "Chinese Room" parable, John Searle comments that the nature of the refutation which it delivers "has nothing whatever to do with any particular stage of computer technology. It is important to emphasise this point", he notes,
because the temptation is always to think that the solution to our problems must wait on some as yet uncreated technological wonder. But in fact, the nature of the refutation is completely independent of any state of technology. It has to do with the very definition of a digital computer, with what a digital computer is. (Minds, Brains and Science, 30)

I wish to make a similar, even more radical point here. The scholarly problems we have as computing humanists are not such as can be solved by some future development in the technology. The problem I have just illustrated, for example, is at root the fundamental dilemma of categorisation, which (one might say) is coeval with analytic thought. Our computational methods give us a systematic way of pursuing the problem, which is to say, of raising better questions than those with which we began, not of solving it.
The outlines of a research programme for humanities computing begin to emerge from the three fundamental approaches (algorithmic, metatextual, presentational), the via negativa that underlies them and the broader questions that computing the humanities raises. As I have discussed elsewhere, if pursued systematically the algorithmic approach should lead to research into the basic mechanical primitives of scholarship in the humanities. I have suggested that were we able to identify a significant number of these, implement them and provide an environment in which the ordinary scholar could assemble and connect those he or she required, we would then tap into the collective intelligence of our community in a way considerably more productive than conducting yet another survey of what our colleagues think they want. The metatextual approach, largely through the interest provoked by the work of the Text Encoding Initiative, has certainly emerged as a major area for work. The area of cyberculture is often pursued by those with too little knowledge of computing, but it does suggest a host of research questions concerning the computer as a cultural object, the history of computing technology (or, more broadly, automata as a whole) and the sociology of knowledge as it is affected by computing. If we include study of the creative arts in our purview, then as Geoffrey Rockwell has said, we deal with questions not answered but acted on – the possibilities of computing realised in performance.
In 1900 David Hilbert delivered a pivotal lecture in Paris, "Mathematische Probleme", in which he set out what he saw as the research agenda for his discipline in the future. Perhaps by the time we celebrate the centenary of that lecture, we will have been able mutatis mutandis to do something like the same for ours.
Pursuit of knowledge comes immediately to mind, but that begs the question of what we consider knowledge to be in humanities computing. Clearly it cannot be empirical facts about the objects of study to which the computer is applied; those belong to the disciplines of application. Which software to use, if any, and the techniques for applying it constitute the know-how that qualifies us to advise colleagues, but on what knowledge is this know-how based? The immediate answer, it seems to me, is knowledge of the common methodologies – to quote the title of a song by The Temptations, it's "the way you do the things you do" that stirs our blood. Prior to the common techniques, I would suppose, are the questions of the research agenda I have just adumbrated. As I suggested earlier, the field makes no sense intellectually unless we forswear deferral to the future and face the crudity of our tools now; doing that leads to the core realisation that this crudity is precisely what makes the computer a cognitively powerful device. In the field of application the discrepancy between what we can do with it and what we somehow know raises pertinent questions about the object of knowledge; in humanities computing the task is to focus in on this discrepancy, to ask how such a via negativa functions. As Vannevar Bush almost said, we aspire to knowledge of "how we may think".
I take the history of humanities computing at the University of Toronto to have been typical up to the middle of the current decade. There humanities computing grew out of the computing centre over a period of several years thanks to the energetic activities of a group chiefly comprising tenured professors. This effort concluded eventually in the establishment of a service centre within the Faculty of Arts and Science under the direction of one of the professors. What I take as typical here is the creation of a space for humanities computing with essentially no disturbance to the existing power-structure – no tenure-track positions were created, no new department set up, nothing done that could not be undone, as in fact happened a decade later. Subsequently at a few other institutions within N America, such as the University of Virginia, academic centres have been created, making academic appointments that in every case of which I am aware have been under the aegis of an existing department or administrative unit, usually English. That is, in N America there have been no tenure-track or tenured appointments to date directly in humanities computing, though some appointments have been clearly designated for humanities computing.
I stress the point that on this continent our field has emerged historically from existing disciplines, usually English, in order to frame the notion that humanities computing is destined to become simply what everyone does in every department. This is what I call the dismissive "withering away of the state" argument against our field, used to turn aside the question of this Seminar. The argument is hardly cogent to those who think, for reasons I have already given, but it has considerable appeal because it costs nothing up front and requires no action. There are heavy long-term costs to those within departments who bear the brunt of keeping up with developments across the disciplines, and costs to their colleagues who will tend to get the partial view. This being a seminar, we need spend no more time on this argument, but we do need to recognise its soporific blandishments.
In a popular article published in the mid 1980s, a senior physicist noted that the term "personal computer", then newly coined, marked the return of computing from its temporary imprisonment behind the locked doors of computing "centres" to the users of computers. Originally, that is, computers were in the hands of those who built or programmed them in order to do their research. The advent of networking, the Internet in particular, has meant that now or very soon the only difference between the machine on my desk and machines in the computing centre is physical location. In other words, very soon the only function left to the computing centre will be running the intranet and the public-access workstations. My point is that what has withered away is the centralised ownership of the hardware and software; what remains is the intellectual function, i.e. what humanities computing lays claim to.
Historically at least libraries have not gone beyond the provision of resources, "electronic books" if you will, to the application of them in the pursuit of knowledge; such is not the role of the library and never has been. From the survey Matt Kirschenbaum and I have done, I suspect that nearly every library now has an "e-text" or multimedia centre, but as far as the question of this Seminar is concerned, that fact only marks the distinction between humanities computing and the resources with which to do it. I conclude that the library is in general not our natural home.
As I noted earlier, computer science isn't either. Might we usefully say that it is to humanities computing as mathematics is to physics?
Physics is, I think, a helpful though of course quite distant ally. What we can draw from it are parallels in the practical attitude toward data, shown in the tinkertoy modelling physicists and computing humanists share, and in the central role of instrumentation. Peter Galison's interlanguage metaphor works well for us because it is a cultural figure of thought, and culturally we are very much in the same situation, with the difference that we are taking up the instrumental role that in microphysics is not as well defined. I have suggested the additional metaphor of Phoenician merchant, whose intellectual contribution to the world, the alphabet, gets ours just about right, I think.
I say "the academy", but of course this is an abstraction. There are many academies, and even within the English-speaking world the differences among them are great and subtle. This, for example, is a very North American event based on a very North American question. For one thing, the existence of tenure in American institutions makes experimentation with new fields of knowledge quite difficult. The difficulty of making the multi-million dollar decision that a tenure-track position represents is easily escaped for humanities computing by posing the question of this Seminar, which so far has not had a satisfactory answer. Thus mostly the question functions as a dismissive red herring. Our ambition here, I would assume, is to change that smelly fish into a lasting banquet at which we have an equal place. I recognise that for a culture so characterised by explicit rules and procedures, as the North American is, making the change is absolutely crucial. In British academic society, mirabile dictu, the absolute class distinction you make between tenured and un-tenurable is not an issue, so we can grow a new field with comparatively less effort and fuss.
The survey that Kirschenbaum and I have done raises many other questions that begin to point up the differences between the national academic cultures. Again, I think we must raise these questions and work on answering them while at the same time not stopping our institution-building efforts to create a proper home for humanities computing. While our strength remains, I exhort us all to move the stones to make the enduring body to which the mental form we have helped to discover and now embody in ourselves can move when the time comes.
Allow me, however, not to end with intimations of mortality but with an admonitory parable that avoids any attempt to imprison the next generation in our stone house. This parable is told by Leonard Forster at the conclusion of his presidential address before the Modern Humanities Research Association, University College London, about 22 years ago, under the title "Literary Studies as a Flight from Literature?" "A sheep, rather more intelligent than the rest of the herd," he begins,
notices something interesting and moves purposefully towards it. Purposeful action tends to attract notice; gradually the attention of others is aroused and in a short time the whole herd is trotting along behind with loud baaing noises. Once it is under way there is no stopping it, until it comes upon a dead centre and waits for the process to be repeated. "In other words," he continues,
every system – as we know from other spheres too – falls a victim to its own systematic. What was at the start alive, vital, creative, flexible, tends to stiffen and ossify until it becomes a venerated idol or – if it still retains any life at all – it becomes a sacred cow. A sacred cowshed is soon built to accommodate the animal, where it feels happy and protected. Its withdrawal into the cowshed is a flight from freedom, which may have fatal results, for at a convenient distance behind the holy cowshed, tactfully concealed by ornamental shrubbery, there is, all too often, an unholy ideological slaughterhouse in which the dear creature is rapidly turned into mincemeat and served up, variously flavoured, to a hungry but usually uncomprehending public, in a large chain of restaurants and students' refectories at home and abroad…. "Another important member of the family of symbolical cattle," Forster notes,
is the idol, the golden calf, whose adepts dance around it…. What we need to beware of is not the new idea, not the pure milk of the gospel, but the little gospellers who exploit it and pervert it to their own purposes and in so doing make absolute claims for it. These claims are barren because they are without imagination and they are without imagination because the creative idea is received at second and third hand…. (xxix-xx). My final reason for refusing simply to answer the question is not to let slip the primacy of the creative idea from which our interdiscipline comes. In the moment before the question is built into an answer is when we can most easily see the purpose of the building.