

Frankenstein-Boris_Karloff.jpeg brick.jpg Gove-bw.png

A subsequent draft of this article, by John Wood, was published in the Journal of Writing in Creative Practice: Wood, J. (2012), ‘In the Cultivation of Research Excellence, is Rigour a No-Brainer?’, Journal of Writing in Creative Practice, 5:1, pp. 11-26, doi: 10.1386/jwcp.5.1.11_1.

Keywords

Academic rigour | REF | research | dyslexia

Introduction

Why is it that the nearer we get to the UK’s Research Excellence Framework (REF) evaluation, the more irritated I become? There are several things I don't like about it, but the one that really gets my goat is the term ‘academic rigour’. It is interesting that I should feel so passionate about a mere metaphor, especially when I published an article about it more than a decade ago. One reason for this is that I feel let down by a system that can influence the way UK academics conduct their business. Well, as they said in the movie about the shark: ‘this time it's personal’. This is a vexatious topic, so I will resist the temptation to deviate into critiques of other terms such as ‘research’, ‘scholarship’ and ‘skill’. Never mind; ‘academic rigour’ will provide ample fun. For a start, it is oxymoronic. Basically, ‘rigorous’ means ‘stiff’, as in the term ‘rigor mortis’, which commonly describes the condition of someone who has been dead for more than three hours. If we were to analyse it in semiotic terms we would probably think of funerals, steroids and Viagra. Thus, it is a splendid combination of manly potency, desperation and stubborn intransigence, which makes it a handy buzzword for dodgy political speeches. What I cannot quite understand is how it encourages academics in art and design to do better research. To be fair, the REF framework does not use it by itself, but as one of a trio of questionable terms (i.e. ‘originality’, ‘significance’ and ‘rigour’) that will stand for the ‘quality’ or ‘excellence’ of research in 2014. For me, although rigour might help novice researchers to achieve some basic results, this would not equate with what I had assumed to be ‘research excellence’. Good research very seldom comes from self-styled individuals whose dogged organizational consistency inspires a better world (see article challenging this myth). It is more often a creative, team-based activity that calls for imagination, self-reflexivity, empathetic awareness, ingenuity, resourcefulness, adaptability, clarity, insight, rationality, passion, scepticism, and perceptiveness. As I see it, ‘rigour’ is more relevant to death than to life. Even in bog-standard, value-for-money terms it is a no-brainer.

Dead on Arrival

When the medium-term survival of our species is in question, how can we orchestrate research for the common good? What is needed is an up-to-date approach that will address this question using 21st century thinking. Unfortunately, although the metaphor of rigour was ‘dead on arrival’ by the time it reached the early 20th century, it has refused to lie down. As I said, I can understand why politicians resurrect it when funding is tight, but I have less sympathy with senior academics who wheel it past wayward colleagues like a scary cadaver. It’s strange. Despite its name, academic rigour is a surprisingly scrawny, ill-defined creature that exercises its power through fear and supposition, rather than through arguments that withstand scrutiny in the light of day. Moreover, it is as contagious as it is anachronistic. Michael Jackson’s ‘Thriller’ video comes to mind here. By the time we have reached the appointed Day of Judgement (REF Day), ‘rigour’ will probably have replicated itself as numberless sclerotic forms that rise spontaneously from the ground. The fact that they are dead, yet still able to lurch towards us, will be taken as palpable proof that everything is just fine. Fear is a powerful master. Many years ago, before political correctness frowned upon liberal academics who found themselves using the word ‘objective’, lecturers in art and design became ill with anxiety, lest they be found to apply what, in QA circles, might be called ‘double standards’. So they met secretly in caves and taught themselves how to discuss the size, form, cost, weight, swankiness and ‘impact’ of what they did. They called their new activity ‘Research’, and did their best to conceal its glorious ambivalence and subjectivity from outsiders. But that was a long time ago, and the fear has been replaced by amnesia and a mild sense of unease. If we are to meet the needs of the REF in showing that our research is appropriate and effective, we will need to rethink what it means for those whose task it is to nurture and cultivate the finest practices of art and design. This, in turn, means understanding, then carefully re-languaging, the term ‘academic rigour’.

wikimedia-cerussite-hard.jpg
Hard lump of cerussite

Pedantry, wishful thinking and bureaucracy

What we know as the mindset of ‘rigour’ emerged at least several thousand years ago, with ideas from pre-Socratic philosophers such as Melissus (b. circa 470 BC) and Parmenides (b. circa 515 BC). Where Melissus believed that the universe was indestructible, indivisible, changeless, and motionless, Parmenides used similar tenets to justify an unwavering approach to reason and logic. This approach is exemplified in the idealism of Pythagoras and Plato, and in the categorical reasoning of Aristotle, all of whom, in different ways, inspired the notion that data, numbers, forms or sets might represent a mode of ‘reality’ that is higher than that of the sensory world. Many legal, bureaucratic and political systems still reflect this kind of alienation. Then came the analytical and convergent argumentation methods of William of Ockham, and the expedient reductionism of Leibniz. Their arguments became the bedrock from which Descartes, Kant, Galileo and Newton chiselled out some of the axioms of our modernist world. By the 19th century, ‘rigour’ had reached its apotheosis with Pierre-Simon Laplace, who hoped that Newton’s laws of motion would enable us to read all details of past events, and to predict the future, with absolute certainty (1825). If he had been correct, the term ‘rigour’ would have been applicable to a great deal of scientific endeavour. In the event, practicalities prevailed, and Laplace’s deterministic agenda crumbled to dust. Nonetheless, it continued to enjoy a special place in the Western imagination until a century ago, when scientists and philosophers finally drove a stake through its adamantine heart.


Newtons-Cradle.jpeg
Newton's Cradle - assumed to be predictable

Catching up with the 20th century

While, at the practical level of observation, Friedmann (1922) and Hubble (1929) showed that the universe was not rigid, similar ideas were being developed at the theoretical level. As Einstein noted: ‘Insofar as mathematics is true, it does not describe the real world. Insofar as it describes the real world, it is not true’ (Einstein, 1920). In 1931, Kurt Gödel went further, by formulating a theoretical proof that showed how no sufficiently powerful mathematical system can be both consistent and complete. This added to Heisenberg’s (1927) proposal that it is impossible to measure both the momentum and the position of a sub-atomic particle, with arbitrary precision, in a single act of observation. Of all these ideas, Einstein’s theory of Relativity (as interpreted by Bohm, 1980) was probably the most devastating attack on rigour. It showed that, if there is an upper limit to the speed of information, nothing is rigid, either theoretically or practically. By 1962, similar ideas were extended to the factors that sustain belief systems, when Thomas Kuhn explored the subjective nature of academic ‘objectivity’. Around the same time, Lorenz’s theory of chaos gave ‘rigour’ the coup de grâce by showing that wave-based phenomena (i.e. the living world) are incommensurable with the granular nature of numbers. While many would regard all these notions as non-transferable axioms of science, they have, nevertheless, changed the way we feel we can do research. It has meant that more systemic terms, such as ‘reflexivity’ or ‘feedback’, are perceived as more useful. In the light of this change, terms such as ‘robustness’ and ‘rigour’ should be used with more caution.

GPS60-1-Tripcomputer.jpeg
GPS trip computer

Even old chestnuts go soft

Readers may be forgiven for thinking that the previous two paragraphs are like a sledgehammer that is used to retrieve a peanut from its shell. Perhaps the author is having a bad day, and has underestimated the importance of shared deadlines, agreed timetables and unequivocal room numbers. But he would not be the first to criticise alphabetical writing for perpetuating the type of insensitivity and dumbness that, logically speaking, attends rigour. Plato recounted the story of an Egyptian ruler who refused to permit the introduction of alphabetical writing because he knew that it would ossify important relations and distort the tacit understandings within the culture. As he (Plato) noted, books are like ‘painted figures that seem to be alive, but do not answer a word to the questions they are asked’. This is less true of today’s ‘talking books’, some of which are able to respond and adapt to the reactions of their readers with a quickness that was impossible with paper publications. In the 21st century we no longer rely completely on clumsy clocks and tiny-print A-Z maps. While our library shelves and desk diaries keep the faith with the Cartesian and Newtonian paradigm, many of the newer technologies are confounding it. Skype meetings, GPS navigators, pictorial phone apps, Tweets and talking books are steering us into far more mercurial, fluent and interactive modes of discovery. How does ‘academic rigour’ figure in these new protocols of communication and signification? Some dictionary definitions of ‘rigour’ equate it with an extremely exhaustive and punctilious approach to accuracy. But this is paradoxical, because excellent research also requires diametrically opposite characteristics, and these are not mentioned. Perhaps we like to imagine we are harder than we are, for symbolic and rhetorical reasons.

Houdini_Gravesite_1024.jpeg
The gravestones of Harry Houdini

Adaptability & resilience

Harry Houdini (1874-1926) was a famous escapologist and illusionist whose stage act included the ability to withstand a pre-meditated stomach punch without flinching. But the irony of this kind of ‘rigour’ is that it requires alertness, suppleness, and split-second timing, rather than an immovable body. Towards the end of his life, someone confronted Houdini and, without warning, struck him with great force in the midriff. Taken by surprise, he did not have time to tense his body for the punch, and his internal injuries subsequently proved fatal. Was it his ‘lack of rigour’ that let him down so badly in the so-called ‘real world’, or his rigidity of response? A slab of rock has rigour but, like all forms of stasis, it is pretty useless in most research contexts. Compared with Houdini's ability to behave, for a moment, like a rock, it is doubly incompetent. It has a poor sense of timing because of its lack of consciousness, and a very limited ability to react, on account of its inflexibility. Houdini trained himself to resist injury through his simulation of rigour, not because he was innately rigorous, and this worked well in the controlled environment of the theatre. What this tells us is that, while ‘rigour’ is not sustainable for living beings, we can apply it on a temporary basis. It is relevant to our academic use of the terms ‘attack’ and ‘defence’ when used to describe a doctoral ‘viva voce’. My point is that, while consistency and self-discipline are needed at some stages of academic inquiry, these qualities can become self-serving, cynical or instrumentally rationalistic if we forget that they are the clumsiest, least relevant research tools we have at our disposal. However, while uncertainty and entanglement are intrinsic to our existence, the school curriculum teaches us how to be faithful to the rational logic of ‘projects’. This entails learning how to simulate full managerial control over our actions. In a design and technology lesson, it means pretending that we anticipated the outcome of a wicked design task. In effect, it teaches students to curb their curiosity and inventiveness by following tacit imperatives, mindless trajectories, exam-based commandments and assessment-oriented algorithms. This is inexcusable. Humans do not live as inertial objects in a Newtonian and Cartesian space. We are co-agents of an ineffable ecosystem in which we are but one of many trillions of living organisms.

GreatDaveCoventGarden.jpg
Juggler (Great Dave)

When clocks go soft

Noticing that designers often initiate actions without having a full, pre-meditated theory for them, Donald Schön (1983) coined the term ‘reflection-in-action’ as a distinctive feature of our way of working. Benjamin Libet (1996) added some qualifiers to this concept when he found empirical evidence that our brains lag 200ms behind the information arriving at our senses, and that brain responses account for a further lag of 350ms. This means that the motor signals of a bodily action take place around half a second before the brain catches up with them. He argues that we do not notice the delay, as it is an inalienable and time-honoured feature of being alive. It would appear that our innocent belief in what Heidegger called ‘now time’ depends on a neural mechanism that deceives us into merging what recently happened at the bodily level with the ‘reality’ of the thoughts we are having now. Libet calls this mechanism ‘backward referral in time’ (Libet, 1996). If we accept that the processes of delay are ubiquitous throughout the human mind-body, we may better understand the nature of those types of knowledge that we describe as 'implicit', 'unconscious', or 'tacit' (Ryle, 1949; Polanyi, 1969; Dreyfus & Dreyfus, 1986). It further confirms that human researchers would be ill-advised to aspire to rigour. One of the familiar characteristics of creative thinking is a flexibility concerning rules. This flexibility can, of course, be interpreted as a form of scepticism that adds rigour to the research practice. However, if scepticism is rigorously applied, it raises a paradoxical issue of whether it can only apply to processes other than itself. As Wittgenstein put it, ‘When I obey a rule, I do not choose. I do it blindly.’ In effect, the quest for rigour becomes self-defeating when applied within a research context. This raises questions about the politics surrounding research in the UK, and why anyone would want to compromise its efficacy in return for bureaucratic compliance and control.
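As a minimal arithmetic sketch (simply taking the two lags reported above at face value, and claiming nothing further about Libet's experimental protocol), the ‘half a second’ figure is just the sum of the sensory and response delays:

$$\Delta t \approx 200\,\text{ms} + 350\,\text{ms} = 550\,\text{ms} \approx 0.5\,\text{s}$$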

tetrad-ABCD-1.jpg
Is the mind tetrahedral?

Waking up to how we think

In 1949, Richard Buckminster Fuller asserted that the human mind is tetrahedral. By this he meant that our apparatus for thinking consists of four interdependent agents that juggle information from the rest of the body and use it to make operational decisions, and so on. It seems strange that Fuller chose four as the magic number. However, while he did not reveal much about his reasoning, it proved to be a pretty cool insight. Since 1956, many people have followed psychological research (cf. Miller, 1956) suggesting that the mind can only hold around seven or so interdependent factors. Since then, scientific research has reduced the magic number from seven down to four. Other evidence shows that, despite our pride in ‘multi-tasking’, humans can barely cope with two tasks at the same time, even if some people learn to juggle one pair of tasks with another pair, and then back again. The discovery of important differences between the way information is processed in different hemispheres of the brain has led to other important insights about human cognition and, in turn, about research. Where the right hemisphere generally produces a general, holistic sense of the world, the left hemisphere is better at processing the detail in a more analytical, atomistic way. It is our ability to orchestrate the actions of both hemispheres successfully that has enabled our species to survive in a complex, often hostile world. There are several explanations for this, including Karl Pribram’s (1991) notion, inspired by Dennis Gabor's (1947) theory of holography, that the two hemispheres operate like the illumination beam and the reference beam of laser light that combine to create a more-than-two-dimensional sense of reality.

Cerebral_lobes.png
Lobes of the brain

What happened to our obsession with economics?

In Iain McGilchrist’s (2010) account, the relationship between the hemispheres is effective when the processing in the right hemisphere sets the conditions for what the left hemisphere should do. However, he argues that this order became reversed in western thought. This can be seen in the acceptance of increasingly bureaucratic values, in which the data, or instruction sets, that pertain to a given situation are deemed to be more important than the original purpose that created them. Some critics of the REF have assumed that its covert purpose is to encourage a culture of enterprise and business, rather than a cultivation of lofty values and learning for its own sake. If so, this is another reason why the quest for ‘rigour’ is a no-brainer. A high proportion of successful entrepreneurs and wealth-creators appear as underachievers, failures or misfits within the current education system. The same is true of those whose difficulties are summarised under terms such as ‘learning difficulties’, ‘dyslexia’, ‘dyspraxia’ and ADHD, despite the fact that there is a significant correlation between the two groups. Obviously, this is a vastly complex topic that extends beyond the metaphor of rigour. However, I continue to be bemused by the scientific evidence that many dyslexics have ‘abnormally symmetrical’ (sic) brains. My somewhat playful explanation for this little detour is that the recent evolution of Homo sapiens led to the emergence of a bureaucratic gene (download article), which made the left hemisphere grow larger than the right hemisphere.

earth-apple.jpg

Ecological crisis - what crisis?

As our species now has a very short time in which to adapt to its global ecological habitat, my pedantic discussion of words may seem wilfully self-indulgent. However, their implications may be of practical importance. In this case, ‘rigour’ plays a role that is similar to that of the word ‘sustainable’. Both are square-bashingly reductionist, if not dangerously misleading. They are symptoms of a solipsistic and presumptuous social order that is in denial about its actions and beliefs. At a time when universities need to address increasingly complex and difficult issues, such as climate change and biodiversity depletion, a new vocabulary is required. Our collective stupidity is apparent, and we urgently need a ‘research’ culture that is more creative, eco-mimetic and inclusive. This calls for much wiser, more ‘joined-up’ educational policies in which survival, not economic advantage, is our main priority. It means re-connecting many different ways of knowing, thinking, imagining, acting, feeling, doing and making in new ways that make new sense. Society urgently needs to change itself at the level of lifestyle. Changing the paradigm is a highly complex task that requires coherence rather than consistency. This means orchestrating types of creative thinking that will permeate existing boundaries. It calls for openness to synergies that may be unthinkable within the current vernacular. An obdurate, target-based approach will not be helpful. Indeed, too much rigour would make this an impossible task. Ironically, it is often the well-meaning rigour of science, combined with a myopic political imagination, that has brought the Earth’s life into most danger. The abject failure of international fishing quotas in Europe is a recent example. Where scientists have rigidly maintained their role as ‘neutral’ observers and predictors of fishing stocks, politicians may delude themselves by pretending that their own country’s transgressions will be the only ones. Or they may find it easier to dismiss the scientific evidence as ‘academic’ or biased. Artists and designers tend to think in a very different way from both, because they work more directly and empirically with proximal events, materials and the ‘stuff’ of pragmatic reality. Where the academic legacy of science and scholarship has encouraged an over-emphasis on logical verification, taxonomies and the alphanumerical repeatability of truth-claims, artists and designers are usually more interested in the heuristics of immediate actions and the opportunities that may follow. These approaches are not polar opposites. Indeed, both are important to good practice. However, a bias towards ‘rigorous’ methods can lead to fatal errors of collective judgement. This was evident in the public debates surrounding climate change. While a huge emphasis was placed on the veracity of evidence-based truth claims (i.e. whether it is happening), far less importance was attached to the more ‘designerly’ discussion of possible futures (i.e. how best to act, in case it is happening).

Visions of a creative democracy

For more than twenty years, my own ‘research’ has explored ways in which design might be re-designed in order to cultivate wiser outcomes, rather than ‘cooler’, or ‘smarter’ products. Our attempts to devise forms of what we call ‘metadesign’ were intended to address highly complex concerns that are beyond the remit of ‘design’ as it is taught in universities. This taught us several things about the use of language within design practice. The first is that there has been a surprising lack of research into co-authorship. The second is that language is a more influential and auspicious part of the process than we thought. Both issues need some explanation. In general, designers underestimate the power of writing as a creative tool for reforming the world and, we believe, can learn much from the way that infants manage to sustain their survival through the co-creative act of speaking. By definition, this approach lacks rigour. Instead of memorizing grammatical rules, infants use a profoundly heuristic approach to discover what 'works' for them. The experience then guides the learning process by association, memory, creative interpretation and re-invention. This is a highly affirmative process that our researchers call ‘languaging’. By working to incorporate ‘languaging’ and co-authorship into metadesign, we believe that it may be possible to achieve outcomes that cannot be achieved by current methods. While the global population has now reached 7 billion, the rate of net consumption in wealthy nations continues to rise. What is needed is a new approach to the problem, as all previous attempts to solve it have met with very limited success. One of the issues is the inauspicious nature of the kinds of reasoning that we inherited from the Romans and the ancient Greeks. For example, our research has shown us that only a minority of people will attempt a task unless they believe it to be possible. Most people tend to assume that if something is ‘unthinkable’, it is probably impossible. It follows, therefore, that by developing aspects of the language that will make a hitherto ‘impossible’ thing thinkable, we reduce the magnitude of its perceived impossibility. Donella Meadows showed how governments very seldom achieve a significant change in behaviour because the methods they choose are the least effective. Where paradigm change would require actions that include a profound re-languaging of purpose, governments and NGOs tend to frame legislation, set targets, and sign treaties using the existing language. These processes are too indirect and bureaucratic to work. When we created the Writing-PAD Network, we wanted the ‘P’ (in the acronym ‘PAD’) to stand for ‘purposefully’. This was because we wanted to remind learners in Art and Design that it is important to ask oneself why one is doing a given thing (e.g. researching or writing) and, when necessary, to change one’s assumptions accordingly.

Re-languaging new opportunities

The word ‘purpose’ is uniquely powerful in that extant purposes seem to encapsulate a model of the prevailing paradigm which, in turn, determines the quality and effectiveness of what we do. It may be enough to catalyse a paradigm change simply by persuading society that the purpose of a given action has changed. However, the reframing of purpose can be a chaotic and troubling process. This is because we risk changing the basis upon which our prior assumption was made, and because new paradigms are unthinkable until we write, or vocalise, our ‘reality’ in a new form. This means that, for metadesigners, the purpose of writing may be to ‘re-language’ the status quo at a profound level. When Lord Young introduced the word ‘meritocracy’ in his book of 1958, he changed the course of British politics, but this approach has also proved effective at an international level, despite the problems of translation. The first time that a person was held to public account for systematically attacking the people of a specific tribe was in 1997. This would not have been possible or, arguably, thinkable before 1943, when Raphael Lemkin coined the word ‘genocide’. Previously, Lemkin had spent several years trying, unsuccessfully, to persuade the League of Nations to recognize the reality of what he believed to be a crime. Today, some experts are developing this approach with an experimental legal case that, in 2011, tested the new concept of ‘ecocide’ in a British high court. This would shift public awareness from the concern with killing individual animals or birds to the more important fact that we are eradicating particular species of living creatures.

The Law of Increasing Returns

The idea that metadesigners might be able to change the paradigm by attracting voluntary changes in lifestyle, and by ‘re-languaging’ the belief system that sustains it, is very ambitious. In order to encourage a creative shift from what is deemed ‘unthinkable’, to ‘thinkable’, to achievable, it is vital to assert the idea of ‘radical optimism’ (Wood, *). This challenges a longstanding pessimism that was part of the prevailing mindset within economics and science. The idea of a ‘Law of Diminishing Returns’ evolved within the economic theories of Ricardo (*), Marx (*) and others. At this time, there was a huge increase in the wholesale exploitation of non-renewable resources to service the growth of the industrial revolution. Seen from within a mining perspective, the ‘Law of Diminishing Returns’ is a compelling one that might, in the 18th and 19th centuries, have seemed like an inescapable axiom of nature. Whereas a mine’s production may be highly profitable at the beginning of its life, sooner or later its accessible resources will begin to diminish. As this happens, the miners will need to travel greater and greater distances to exploit less and less material. This will cause a rise in operating costs and a fall in profits. This pessimistic observation led the physicists Clausius (1865) and Kelvin (*) to invent the concept of ‘entropy’, which offered a general model of the universe in which energy constantly dissipates to distant regions that would require more and more energy to retrieve. In the model that evolved from this idea, the universe would ultimately suffer a heat death that would leave it cold and rigid. Fortunately, in the 20th century, a rather sunnier, Darwinian view began to emerge. Very loosely, this was based on the observation that the emergence of diversity within ecosystems enabled them to formulate adaptive synergies and to maintain their equilibrium. In the logic of the older physics paradigm, it meant they could ‘resist’, or even work against, entropy. In the course of the 20th century, this more optimistic view became crystallised in a ‘Law of Increasing Returns’ (Young, 1928; Romer, 1986; Arthur, 1996), which has proved invaluable to the development of the creative economy.
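As a brief formal aside (a standard textbook statement of Clausius's idea, not something specified in this article), entropy is defined through reversible heat transfer, and the ‘heat death’ extrapolation rests on the second law:

$$dS = \frac{\delta Q_{\text{rev}}}{T}, \qquad \Delta S_{\text{isolated}} \ge 0$$

That is, the entropy of an isolated system can never decrease, so usable energy gradients are progressively dissipated.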

New business models need flexibility

Our research into metadesign suggests that it can offer virtually unlimited growth in new synergies (Wood, *). It is ironic that politicians want academics to justify their ‘research’ in the context of innovation, cross-disciplinarity and new wealth creation, yet ask them, via the REF, to do so using methods that are counter-productive and anachronistic. Today, new configurations and opportunities for co-creative change and renewal are emerging all around us. In the democratic developed countries, paradigm change needs an orchestration, hybridization or amelioration of current top-down and bottom-up discourses. However, each has its own level of abstraction, syntax, metaphors and relational logic. Within a very short time we have become accustomed to sharing through Wikipedia, crowd-sourcing, Avaaz and Open Design, and these propose a new paradigm. This is easier for China than for 'democratic' nations, because the latter are not exclusively top-down. In this case, 1) the effectiveness of cradle-to-cradle and similar approaches is that they derive from a top-down capitalist mindset, and can therefore be attractive to those who currently hold the greatest fiscal and economic power. We can persuade them by noting that a circular economy is far more lucrative, because circles can 'grow' new tangents that make links and networks. Business is transacted at every point on the circumference of each of these circles. 2) In persuading the local stakeholders, we can show that micro-circles are far less leaky than large ones (e.g. global currencies). This means that local money is cheaper and less prone to inflation.

We need creative teamwork

If we are to help communities to deliver ethical and benign outcomes to a wide variety of stakeholders, what is needed is an auspicious framework for collective reasoning. It is surprising, then, that academia lacks an established methodology and rationale. Given that employment consists almost entirely of collaborative, rather than solo, actions, it is strange that co-authorship is not taught in schools and universities. This is reflected in the way that most examinations and research evaluations are conducted. While the writing of co-authored articles and papers is common practice within academic research, a great deal of it may not even be, in truth, co-authored. Many papers seem to have been the result of an executive editing process in which the research director makes ‘top-down’ decisions on behalf of the whole team. This process tends to stifle the potential for team-building and the co-evolutionary processes this offers. It is a missed opportunity, as genuine co-authorship is an effective way to reconcile different, or even seemingly contradictory, theories or models, and to develop new modes of shared practice. The effective co-production of new knowledge in art, design, or science calls for a reasonably high level of mutual and reflexive self-awareness of each author's personal strengths and weaknesses. Professionals who collaborate effectively must therefore acquire and develop sub-cognitive (e.g. intuitive) skills that are 'co-anticipatory', in order to guide the overall outcomes of collaboration. The ‘creative space’ of collaborative writing is a kind of intermediate zone that is constructed by, and resides ‘between’, the authors' intentions, rather than within them as individuals. Often, the dynamic process itself leads to an emergent outcome that can be evaluated only after it has been attained. Nonetheless, a common form that may otherwise be present in a single discipline is absent, and needs to be generated or agreed. Inevitably, these skills and strategies will be informed by the ideological, cultural, and cognitive preferences of the authors themselves. The process of interfacing (attaining the 'in-between-ness') is a reciprocal hermeneutic process in which both parties strive to reach a common aim or to find a deeper, joint understanding.
See Quantum Science article in New Scientist

