
Flat White

The future of learning and the independent scholar

29 July 2017

9:05 AM


When the British historian Christopher Dawson received the Christian Culture Award in 1951 from a Canadian university, he called his acceptance speech “Ploughing a Lone Furrow.” He wanted to draw attention to the tradition of private, independent study which he himself had followed, and which had characterised English research and writing in the past, complementing the work of professional historians.

Dawson noted that “there is no longer any room for this tradition in the modern world, where modern methods of coordinated research combine with social and economic conditions to make it impossible.” He confessed that, if in his lifetime he had had to follow his own line of studies and “plough a lone furrow,” it was not out of choice or because he could dispense with the help of other scholars, but because the subject to which he had devoted himself, the study of Christian culture, had no place in modern university studies.

In this essay I will do three things – first, sketch the contribution of the independent scholar to the world of learning in the past, and more broadly to the world of culture; secondly, reflect on changes in university and academic life that have affected the capacity, and even the existence, of the independent scholar; and thirdly, highlight the potential for independent scholarship in present-day culture, given that the university has now come to dominate the world of learning, and even of vocational training. I will be concentrating on recent centuries, particularly from the eighteenth century, when various scholars were working and writing outside of universities: towering figures such as Samuel Johnson and Edward Gibbon in the eighteenth century and Charles Darwin in the nineteenth, who were influenced by their Christian upbringing (even if they later disavowed it), and scholars of Jewish background, such as Karl Marx in the nineteenth century and Sigmund Freud in the twentieth.

In addition to these prominent figures, there was a second – and substantial – rank of writers and thinkers who would have carried the conventional title of “man of letters.” Evelyn Waugh once described his father as a “man of letters,” and noted that this category was “now almost extinct,” like that of the maiden aunt. A fascinating study in 1969 by John Gross on the man of letters revealed how this distinctive cultural figure emerged in the nineteenth century as a literary scholar and was later transmuted into an author in general, particularly in literature. The nineteenth and early twentieth centuries were the age of this sort of independent scholar, who combined literary composition and criticism with practical journalism and writing for a wider public. One thinks of such cultural critics as Thomas Carlyle and Matthew Arnold and Sir John Squire, or, at an even more popular level, writers like G. K. Chesterton (who, despite seeing himself as never more than a journalist, produced, in the midst of his vast output, some acclaimed works of literary criticism, particularly on Charles Dickens, as well as a penetrating study of The Victorian Age in Literature).

The term, “public intellectuals,” could today be applied to such writers, but this might convey a misleading idea of the cultural purpose which the man of letters served, and his role in the broad dissemination of learning and the shaping of communal understanding. As John Gross notes, in the past most English critics were fortified by the idea that they were the guardians and interpreters of one of the world’s great literary traditions. Such a notion, that of being the guardian of a tradition and seeing traditions as at once nourishing and sustaining in the life of a culture, would strike contemporary elites as not just quaint but outrageously wrong-headed, given the prevailing scepticism and even disgust that many seem to have for the traditions and values of Western civilisation. It is somewhat reminiscent of the description by John Ruskin, the nineteenth-century art critic, of the entire output of Fleet Street, the centre of British journalism at that time. Ruskin referred to this output as “so many square leagues of dirtily printed falsehood.” Some might think that little has changed in the hurly burly world of journalism today.

More recently, in the twentieth century, there have been some notable scholars whose works, often popular in their readership, earned academic respect as well: historians such as Lewis Mumford, an acknowledged authority on cities, and Barbara Tuchman, winner of two Pulitzer Prizes for books on World War I and American-China relations in wartime; and literary figures like Edmund Wilson and Paul Goodman. These independent scholars had no regular academic posts, and often no university connection at all. They were, in the main, independent like Christopher Dawson, whose only academic appointments were, in his early years, at Exeter University, and, in his closing years, at Harvard, where he became the inaugural Professor of Catholic Studies in the Harvard Divinity School.

As an independent scholar, Dawson himself contributed to historical understanding with pioneering works of scholarship and analysis. In 1932, for example, he published The Making of Europe, perhaps his best-known book, a work of groundbreaking importance. It focused on the Dark Ages as a period of silent growth that paved the way for the extraordinary cultural flowering of the twelfth and thirteenth centuries, a flowering that could not have occurred without this long period of painful, and unappreciated, preparation. Dawson probed the roots and the runners beneath the soil of the society. He saw, below the surface chaos, a creative process at work – the germination of a new way of life, a new Christian culture. He challenged the prevailing view of the Dark Ages, conditioned as it was by the rationalist Enlightenment, and expressed by Voltaire who believed that the Dark Ages presented the historian with “the barren prospect of a thousand years of stupidity and barbarism,” by comparison with the creations of the thirteenth century which “vindicate the greatness of the human spirit.” Dawson’s singular achievement was to show how these two periods of human history – what Dawson himself called “the long winter of the Dark Ages”, on the one hand, and the cultural spring of the twelfth and thirteenth centuries, on the other – were profoundly connected; not a simple black and white contrast, as if the renaissance of the twelfth and thirteenth centuries had come out of nowhere.

The Making of Europe was the product of years of quiet research and private reflection on Dawson’s part – the devotion of the individual, independent scholar. Despite his reservations about scholarly isolation, which I noted at the outset, the freedom that Dawson enjoyed, both intellectually and in the time and scope it gave him for concentrated work, played a decisive part in the originality of his perspectives – in two respects: first, it heightened his ability to penetrate the inner life of our culture and not merely recognise and record its external manifestations; and secondly, it gave him a greater capacity for synthesis, for bringing together ideas and insights in a way that academic specialisation now forbids. The independent scholar can strive for an integration of knowledge and a coherent worldview that are much harder to achieve in a present-day university setting.

Dawson’s analysis in The Making of Europe covered both the West and the East. He explored in a seminal way three crucial forces which had been underrated in the shaping of the civilisation of the West – first, the contribution of the barbarian peoples of Northern Europe, which provided its popular grounding, and the wellspring of its native, and later, national loyalties; secondly, the extent to which the universal spread and spirit of citizenship of Rome (“civis Romanus sum”) anticipated and prepared for the missionary outreach of the new Christian religion, which stressed a common spiritual citizenship that united all human beings, regardless of geography or race or sex or station in life; and thirdly, the way in which the different threads of cultural life – the Greek and Roman traditions, intellectual, social, institutional and organisational, and the contributions of the barbarian peoples – were brought together by the spiritual dynamism and direction of the Catholic Church, and forged into a cultural synthesis that found supreme expression in new philosophical, educational, artistic, and political forms, such as Aquinas’s medieval philosophical synthesis, the birth of the university, the building of the Gothic cathedrals, and the early and embryonic examples of parliamentary government.

In addition to these insights about Western civilisation, Dawson gave serious attention to the East – to the rise of Byzantine culture that contributed to the character of medieval society and the emergence of Islam and expansion of Muslim culture. Again he revealed an independent view that challenged the received scholarship of the time, notably expressed by Edward Gibbon. This view saw the culture of the East as essentially “a decadent survival from the classical past,” which deserved to be denigrated or at least dismissed. Dawson focused on the special strengths of Byzantine culture, which were in the realm of religion and art; but he also looked sympathetically at the political and social strengths of the Eastern Empire. He argued that the prevailing historical view had seen Byzantine culture only in secular terms, as an economic and political entity. As he noted:

The modern European is accustomed to look on society as essentially concerned with the present life, and with material needs, and on religion as an influence on the moral life of the individual. But to the Byzantine, and indeed to medieval man in general, the primary society was the religious one, and economic and secular affairs were a secondary consideration.

The independent scholar represented by Christopher Dawson and others has now all but vanished. This is not just a loss of independence, for independence is not an end in itself: its ultimate purpose is to foster the freedom to explore, and finally to embrace, the truth.

Occasionally the independent scholar still pops up in newspapers and magazines. One thinks, for example, of the Australian journalists Paul Kelly and Greg Sheridan, who venture at times outside their designated domain of politics and foreign affairs to reflect on issues of culture, religion and education; or Peter Craven, a highly knowledgeable literary critic, who writes frequently for a popular audience and not just academic journals.

Another variant of the modern independent scholar is one who has intermittent links with universities, occasionally as a writer-in-residence, such as Australia’s “poet-laureate,” Les Murray; or the American literary critic Joseph Pearce, who has written biographies of Tolkien, Chesterton, Belloc and others. There are also the refugees from universities, like the British philosopher Roger Scruton, who regards himself as no longer academically respectable and employable and yet continues to be a prolific writer in spite of – or perhaps because of – his self-imposed scholarly exile. There are also those who qualify as independent scholars – indeed, as “men of letters” – who have managed to be freelance writers most of their working life, such as the British author Piers Paul Read. Read is primarily a novelist, but he has also written authoritative histories and biographies, as well as a best-selling account of the survivors of a plane crash in the Andes in the 1970s, Alive. But such specimens are now few and far between.

An important mark of the passing of the independent scholar is the virtual disappearance of the academic eccentric, the eccentric scholar within universities as well as outside them. I entered university life in the early 1970s, and it was still possible to find the academic eccentric at that time – an individual who was distinctive, irredeemably untidy, incurably absent-minded, and peerlessly un-selfconscious. Indeed, that was part of the definition of the eccentric: he thought everyone else was unusual, not himself. As recently as 2016, for example, the Oxford historian Professor James Campbell died. He was renowned – if that is the word – for being an absent-minded professor, given to lighting his pipe and then, a few moments later, placing it in his pocket!

If we go back to the nineteenth century, we recall another Oxford don, the famously eccentric Dr William Spooner, who became renowned for his “Spoonerisms” – transposing the initial letters of two words – as when he knocked on the door of the Dean’s office and enquired: “Is the bean dizzy?” Or when he reprimanded a student in these terms: “You have hissed all my mystery lectures” [when he thought he had said “missed all my history lectures”]. “You were caught fighting a liar in the quad” [“lighting a fire”], and “having tasted two worms” [for “wasted two terms”], you will leave by “the next town drain” [actually “the next down train,” from Oxford to London].

Apart from his wonderful muddling of words, Spooner also displayed a legendary absent-mindedness, as when he invited an Oxford don to tea, to welcome Stanley Casson, “our new archaeology fellow.” “But sir,” the man replied, “I am Stanley Casson.” “Never mind,” Spooner said, “Come all the same.”

In the twentieth century, one of the most endearing of eccentric scholars was Sir John Squire. He was a poet, critic and founding editor of the London Mercury, a journal which showed a distinct boldness in providing an outlet for new writers. Squire had a strong influence on British culture outside of universities, as an independent voice, in opposing literary modernism between the World Wars, for which he earned the scorn of writers like Virginia Woolf and T.S. Eliot.


Squire was also a historian, but his approach was unconventional – the mark of an independent scholar. This was shown by his interest in historical speculation. He thought that conjecturing about the past – asking questions beginning with “if”, or “suppose”, or “if only” – was not at odds with what had actually happened. Rather it offered new ways of reflecting on the past and fostered intuitive understanding and insight. His interest in historical conjectures led him to edit a collection of essays of “alternative history”, called If It Had Happened Otherwise (1931). In this work, various contributors speculated on the course of history if certain events had turned out differently. Thus, Chesterton wondered what might have followed if Don John of Austria had married Mary Queen of Scots, thereby extinguishing Scottish Calvinism and, by serving jointly on the English throne, arresting the Reformation and keeping England a Catholic country. Squire himself contemplated the impact on English literature if, in 1930, it had been discovered that Bacon really did write Shakespeare.

No doubt these now sound simply facetious and frivolous, but such conjectures can be illuminating. They provoke the imagination to look at alternatives and probe the inner and underlying realities of history, not just the outward manifestations; the inner substance, not just the external evidence. They can help to counter the claim, often heard nowadays, that supporting a new and fashionable cause will ensure we end up “on the right side of history,” as if history is predestined and predictable, when it is so plainly unpredictable and seemingly arbitrary (except for those who, as Christians believe, discern in the unfolding of history a divine meaning and a providential purpose). The independent scholar is less likely to be seduced by such a notion as being “on the right side of history,” recognising that it injects a false and self-serving authority into any debate, and can be too easily enlisted in support of favoured social and political movements, such as the current same-sex marriage debate, which should rely on substantive arguments, not spurious historical summons.

Many of the stories about academic eccentrics may be apocryphal, but even so, the fact that they seem believable, and have often been retold, resonates with the need for the scholar to cultivate a certain detachment from the consuming concerns of everyday experience. In G. K. Chesterton’s famous comment, absence of mind is really only the presence of mind on something else. The point about the absent-minded professor is that he was absorbed in his own world of learning, undistracted by competing interests or the trivia of everyday life. In particular, he was largely immune to the insinuations of conformism.

Nor was the eccentricity of the scholar an end in itself. No doubt such a man – and it was often a man, though occasionally a woman, such as the Cambridge philosopher, Elizabeth Anscombe, a cigar-smoking mother of eight, who wore a monocle – provided moments of frustration for spouses, and for university administrators! But the Campbells and the Spooners and the Anscombes were not simply eccentrics: they were academic eccentrics; that is, their oddness was, first and foremost, of the mind, which translated comprehensively into the rest of their lives, most conspicuously their dress and demeanour. They were often impressive scholars and writers and teachers. Their eccentricity was the expression of a distinctive culture, a culture of curiosity, of absorption in learning – and learning that was self-propelled, not dictated by university committees or government bureaus or corporate entities. In the words of Samuel Johnson, “curiosity is, in great and generous minds, the first passion and the last; and perhaps always predominates in proportion to the strength of the contemplative faculties.” The seeming remoteness of scholars, their lack of social normality and conformity, has at once reflected and reinforced a spirit of intellectual detachment. There is indeed something unworldly about the academic eccentric, which has helped to detach the scholar from the obsessive fashions and absorptions of an over-organised society and protect a perspective of objectivity.

Unfortunately, this over-organisation now threatens to engulf academic life. It militates against the life of an independent scholar within universities, making far less likely the presence, and even the survival, of the academic eccentric. Various factors have registered an impact in the midst of this over-organisation, either contributing to it or flowing from it – factors which have made the present-day university less hospitable to genuine scholarship, and cast doubt on its educational value in its current institutional form.

These factors fall into two broad categories. Some are practical and organisational, affecting the university as an institution. Others are intellectual and cultural – and, I believe, spiritual, at their core – relating to what has happened to the academic mind and intellectual culture. Together I think they have conspired to erode confidence in the university as a centre of learning in our society. This development is rendered even more serious by the extent to which the world of learning has come to be, as mentioned earlier, almost entirely absorbed by the university, so that learning itself is now deeply institutionalised, and subject to cultural and political pressures and sanctions that have crucial implications for the world of culture as well as of learning. Universities seem no longer aware of the extent to which they need independent scholars and rely on the scholarly energy and insight outside of their walls to sustain a life of learning.

The first practical change to be highlighted is the clash of cultures that has taken place in universities in recent decades. This has arisen from the penetration of a scholarly culture by an alien culture, a culture of managerial supervision and accountability that dwells on structures and means, insisting on a uniformity of approach and the imposition of tests that are based, not on scholarly criteria and the search for truth and wisdom, but on the bureaucratic measurement of processes, the designated aim of which is compliance, even when it is disguised by the invoking of words such as “quality,” as in “quality audit” and “quality assurance.”

No doubt the university as an institution has always been subject to market and managerial influences, as in the Middle Ages when it was preparing people especially for ecclesiastical office; or, in the age of colonial expansion, for imperial leadership and service. The difference now is that the actual life of learning has been invaded, so that learning itself is thought to be dubious and indefensible without vocational direction or political regulation and manipulation. The politics of bureaucratic surveillance are supplanting the culture of intellectual appetite and scholarly responsibility.

A second practical change is the massive expansion of higher education in Australia in recent decades. This has intensified the pressures of time and teaching on academic staff, and reduced the capacity, not only for research but, at a more fundamental level, for thinking, so that the mental space for independent scholarship, and the leisure to carry it out – leisure, in the classical and medieval sense of a condition of intellectual freedom and reflection, rather than of utilitarian relaxation – have been greatly lessened. The expansion of universities has led to a proliferation of courses and degrees, which reflects both the intense specialisation of academic life and the claims of vocational preparation, and has produced a narrowing of intellectual focus that detracts from the breadth and integration of understanding that should characterise the scholar.

The main justification for mass education at the university level in Australia has been utilitarian: a necessary preparation for employment and earning potential. It has not been connected, in any articulated way, to a higher or wider purpose, such as to prepare people for citizenship and democratic participation, or to heighten the intellectual and cultural benefits of learning. Yet the assumptions governing the employment and income prospects arising from a university education are now coming under scrutiny, as the link between a university degree and the earning of an attractive income is no longer assured. Prior to the May 2017 Federal Budget, which foreshadowed a higher student share of university fees, there was debate about the growing rate of unemployment among graduates. A workforce economist, Ian Li from the University of Western Australia, commented:

The graduate degree premium has been eroding for some time. Private returns to higher education are still positive but they are no longer what they once were.

If these doubts grow, the mass popularity of universities may decline. We may begin, as a society, to question the social status of a university degree and the idea that universities are good for everyone. While the prospect of universal higher education, complementing education at the primary and secondary levels, might be flattering to egalitarian sensibilities and yearnings, it rests on the mistaken belief that treating people equally means treating them identically. (I speak as the father of a plumber as well as the father of a university lecturer.)

At the same time, it is important to recognise that a society needs well-formed elites, especially spiritual and intellectual elites, in order to cultivate qualities of vision and leadership and sustain the popular understanding and confidence necessary for the life of a culture. The poet James McAuley stressed this need in 1976, shortly before he died:

All societies depend on the presence of elites, which are – with whatever failures, limitations and delinquencies natural to the human condition – bodies of people with superior discipline, capability of responsibility and leadership, sources of morale and integrity.

The formation of elites, as McAuley noted, hinges on the influence of home and school rooted in tradition. He identified four matrices in Australia in which elites have been traditionally nurtured – three of them of religious origin (Anglican, non-conformist/evangelical, and Catholic) and the fourth from the humanist-rationalist tradition. These matrices, serving as fonts of intellectual insight, moral integrity, and cultural and political leadership, were also fundamental to the production of scholars, and especially independent scholars, for they gave them, not only the inspiration for intellectual work, but also the discipline and desire to serve wider communities.

Christopher Dawson was conscious of the historical significance of a spiritual elite, the priesthood, and how it functioned in various societies as a culture-building institution. The priesthood had an intellectual foundation. It constituted a learned class that had mastered a body of knowledge, and it provided the historical cradle for the intellectual elites of secular modernity. It had a stabilising influence on societies by its preservation of sacred rituals, which gave regular opportunities for the enactments of our spiritual nature in a material world; the saving of the world, as Christ’s Incarnation attested, by means of the stuff of the world.

Modern secular society shuts out any religiously inspired office, and would recognise no connection – in fact, only an antipathy – between learning and religious faith; but it finally has to reckon with the consequences of such an exclusion. The price of banning religious elites, with a different philosophy and scale of values from that of secular elites, is to open up a vast social and political void that has immense implications for individual health and cultural integrity. This is what happens with the slaying of a transcendental vision and value system.

Chesterton once said that poets are “those who rise above the people by understanding them.” An intellectual elite needs to be of the same disposition. This does not suggest, should not suggest, social superiority or political dominance (though we know these are not easy to avoid). It certainly should not imply superior behaviour. It suggests, rather, insight and sympathy, a profound affinity with the hungers and needs of ordinary people, and a capacity to capture these in the forms of the culture – its social institutions and laws and moral values, as well as its imaginative expressions, such as symbols and rituals, poetry, art and music. This capacity should be enhanced by the scholar’s freedom from institutional pressures, not only to bow to embedded elitist opinion but also to “publish or perish,” which imposes on scholars an imperative to write and speak continually, even when they might have little or nothing to say.

A final practical change affecting present-day universities is the pressure on scholarly publication, which is beginning to compromise its integrity and credibility. One form of pressure arises from the growth of electronic publishing as increasingly the main channel by which scholarly findings are reported. Traditional print publications, both books and journals, involved considerable time and production processes, which had the unintended effect of enforcing delays and providing inbuilt opportunities for review and safeguards of quality. By contrast, the electronic world allows anyone to publish anything, and to distribute it immediately – at times more quickly than planned, thanks to a prematurely hit Send button. At the high end of electronic publishing, these risks can be managed by paywalls and password access, but the intrinsic nature of the medium tends to impose a regime of haste that makes quality control more difficult.

On the positive side, however, the very ease and cheapness of electronic facilities represent a new advantage for the independent scholar. The electronic revolution has ushered in an era of self-publishing, and afforded a new freedom and flexibility in communication. While economic factors account for the demise of the traditional independent scholar – in that it is no longer financially viable for a person to pursue such a vocation without the support of an institution or, more rarely, a patron (such as the late Bill Leak enjoyed in the early years of his artistic and cartooning career) – the countervailing factor of personal communication, based on such devices as the iPhone and the iPad, has given scholars a degree of independence from institutions, and perhaps especially libraries, that was unimaginable a few decades ago. These conditions are proving vital for the many independent scholars who may be pre- or ex-academics, such as displaced PhD graduates unable to secure university employment, or retired academics, who can access various channels of online enquiry and communication, via blogs and other means, and disseminate research findings as well as scholarly (and even, at times, not so scholarly) opinion.

A second issue touching quality control is editorial and peer review, a mechanism that has traditionally ensured the dependability of scholarly publications. This is no longer as certain as it once was. There is an increasing incidence of corruption of the system, demonstrated by flawed research papers of various kinds, involving fabrication of data and of peer reviews, doctoring of images, and the citing of imaginary editorial boards. In early 2017, a UK parliamentary enquiry into the integrity of university research heard evidence of increasing scientific misconduct, which has given rise to a growing number of corrections and retractions.

In addition to the practical changes to universities and academic culture, which have implications for the independent scholar, there are institutional and cultural developments that are limiting and distorting intellectual independence.

One is the politicisation of the university, notably in academic programs in the fashionable areas of the humanities and social sciences that have become infected by identity politics. There is, among at least a vocal proportion of academic staff, a tight uniformity of political attitudes – a form of compliance, one might say – essentially of the cultural Left, which is hostile to the traditions and values of Western civilisation, stigmatising them as backward and oppressive. Universities have become constrained by an ideological straitjacket, which favours advocacy over the contest of ideas, and raises serious issues about their reputation for academic freedom.

A second development is the rising atmosphere of intimidation, mob-based and media-promoted, at times physically threatening, which has led to a new generation of campus protests. This is especially so in America, but it is now emerging in Australia as well, as was shown in 2015 at Sydney University when a visiting pro-Israeli speaker was interrupted by a violent protest led by the Director of the University’s Centre for Peace and Conflict Studies.

In the light of my own memory, conditioned by the campus protests of the 1960s, the present era seems rather restrained. The difference now, however, is that the rise of identity politics among intellectual and media elites reflects a new form of intellectual conditioning that is taking root in universities and elsewhere. It is not easy to hope for a recovery of what the Australian poet Vincent Buckley once called the “great tradition of intellectual chivalry” in Western culture. The approach now being adopted is a new version of Voltairean wisdom, once wryly expressed by Ronald Reagan, that modern liberals will “defend to the death your right to agree with them.”

Among present-day ironies, the old liberal value of free speech is now falling to conservatives to defend. While liberals lurch towards totalitarian controls, as though moral meaning can be legally imposed rather than drawn from transcendental sources, conservatives embrace a libertarian autonomy, relying on self-chosen identities at odds with a broad sense of community. Both these quests represent bleak prospects for Western society, and in pursuit of them we suffer from an extraordinary readiness to take offence; in fact, the “offence industry” may be responsible, in a perverse way, for a new form of “manufacturing” in Australia and other Western countries. As one Canadian journalist has commented:

In the new order, a high level of suggestibility and a low level of common sense will be important survival skills.

A more subtle – and insidious – form of intellectual conditioning now taking place is the notion of collective guilt, which is giving rise not only to apologies for the behaviour of our ancestors, but to the removal of any signs of their historical existence, such as changing the commemorative names of buildings in universities or public places. The aim is to rewrite history so that it is “cleansed,” free of all signs of what are now seen as objectionable biases. James V. Schall has argued that the desire to take on collective guilt – for a past in which we personally played no part – amounts to imposing the norms of one generation on another. It is being used to justify punishing those who are long dead, whom we can no longer personally accuse or arrest or put on trial – nor, of course, are they in a position to defend themselves. Schall asks: “How far back do we pursue our cleansing vengeance?”

To return to Sir John Squire: he was, in addition to being an independent scholar and an eccentric, a cricket “tragic.” He formed a team, aptly called The Invalids, which played occasional village games. Their cricketing performances, so quintessentially English, are immortalised in A.G. Macdonell’s social satire, England, Their England (1933), which includes a caricature of Squire. In one Invalids game, Squire’s side was fielding and the batsman at the crease skied a ball. Squire screamed out: “Leave it to Carstairs.” The ball rose in the sky and eventually came down, with a heavy thud on the grass. No one caught it. Squire then realised – Carstairs had died the year before!

We know that we cannot “leave it to Carstairs”! The independent scholar lives a life of relative isolation, but his survival is likely to depend on some measure of community and solidarity – with other scholars, and with intellectually sympathetic institutions and organisations; both in person and with electronic assistance. I am greatly heartened by the presence in Australia of the Christopher Dawson Centre in Hobart, and of Campion College in Sydney, for they offer a spiritual and intellectual home to independent scholarship and learning. As Dawson himself recognised as a historical reality:

Every advance in education has been prepared by a preliminary period in which the pioneers work outside the recognised academic cadres. This was so at the beginning of the European university and in the beginnings of humanism, while today the diffusion of leisure throughout the affluent society offers new opportunities for free intellectual activity.

At the same time, Dawson realised the importance of individual friendships and support. “Conversation is more than bread and meat to me,” he once confided to a friend. “I cannot exist without it.”

Karl Marx, that well-known independent scholar, put it well – or almost did – when rallying the proletariat to release themselves from their chains. With apologies to BBC Radio’s Frank Muir and Denis Norden, Marx came so close to saying:

Working scholars of the world, unite. You have nothing to lose but your brains!

Karl Schmude is a Founding Fellow of Campion College Australia and formerly University Librarian at the University of New England in Armidale NSW. This essay is based on a paper he delivered at a colloquium of the Christopher Dawson Centre in Hobart on June 30, 2017.
