Wednesday 5 July 2017

Some thoughts on a monograph - James Baker's The Business of Satirical Prints in Late-Georgian England



I was recently charged with saying a few words at a launch event for James Baker's new monograph, The Business of Satirical Prints in Late-Georgian England.  I am afraid that, as is the nature of academic hardback monographs, the volume is too expensive to actually buy, but the link to the publishers' page is here; and James has blogged extensively about writing the volume, and the 'soft' and 'hard' digital methodologies that went into it.  I am posting a version of what I said, because in working up a few enthusiastic words with which to toast the publication, it also became clear to me that this book - or perhaps just books in general - is changing in dialogue with the changing nature of historical research and publication.  While I have been profoundly frustrated by what has appeared to me to be the slow evolution of the monograph (and historians' attitudes to it as a form), this book suggested that I had missed a subtle change along the way.   So, for what it's worth, this is pretty much what I said.



I was asked a few weeks ago to say a few words to help launch The Business of Satirical Prints in Late-Georgian England, and I thought - why not?  Before I had read the book, I thought I sort of knew both James and Georgian England, and the kind of book it would be.  And at the same time, given that James Baker is seriously into the Digital Humanities, I thought I knew a little bit about that too.  And when I got my grubby hands on the book itself – I thought – OK – that looks like a book – and I know a little bit about those too.  I don't really like them very much, but I own a few and have written a few.   So, all in all, I thought this would be a walk in the park that was unlikely to challenge any of my hard-won prejudices or lazy assumptions.


I was wrong.  This book confounded pretty much everything I thought I knew about James, and books, and Georgian England and the digital humanities. 



My first shock came in the acknowledgements (and yes, that is the first place I looked) – where James very kindly acknowledged both me and a bunch of other people for their support.  But uniquely in my experience, he did so by listing our Twitter user names.  I thank God I chose the relatively innocuous @TimHitchcock – but what this brought home immediately was that this was a book created in self-conscious engagement with the digital humanities, and with the modern practice of academic history writing.  It may seem a simple thing, but it confounded and messed with the way we still represent books and book writing.  Despite the revolution in how we research and write books, we still pretend they are the product of old-school debate, musty research libraries, foolscap and ink - that they are written in some book-lined study, in the gaps between feasting at high table.  The acknowledgements said that this book was the product of a different technology, a different conversation, held in a different world.  As a result, though a very small thing, it felt remarkably radical – and that radicalism extended through the rest of the book.
 

Next came the introduction – in which James outlined the intellectual forces that brought him to the topic, and which informed his approach.  The list went from Fernand Braudel and Robert Darnton to EP Thompson – temporarily lulling me into a sense of the familiar – before moving on to Bruno Latour and Franco Moretti.


Anyone who knows the field of eighteenth-century British history will realise in a minute that this is not normal.  You can do Darnton and Thompson, or if you are under 35 and working on a literary topic you can do Latour or Moretti – but bringing them together in the same analysis forms a profound journey across intellectual boundaries more secure than any physical wall.  Here was a mixture of Thompson's Marxist literary approach with Braudel's social science and Darnton's cultural history; with Latour's anthropology of scientific practice; and with Moretti's tools-based explorations.  What looked initially like a book defined by its topic – the late Georgian satirical print – rapidly emerged as a book defined by its intellectual ambition.


In the end, it fulfils that ambition.  It brings together a genre of description that Thompson perfected, with Braudel's clear understanding of the materialities of life, and Darnton's sharp ear for cultural difference – and then throws into the mix Latour's beautiful engagement with the cultural practices of production, and Moretti's joy in deploying the tools of distant reading.


The chapter that seemed to me to epitomise the book – not because it used any of the tools of the digital humanities, but because it contained a breadth of approach and understanding that transcends normal history writing – is chapter three, on the mechanics of making prints.  It deals with that magic combination of copper and paper and ink; of engraving, etching and mezzotint deployed in pursuit of cultural impact.  It seemed to me that in that chapter, James captures perfectly the ambiguities of making – the extent to which every cultural act and every material act is a balance between purpose, materiality and constraint.  It felt to me that he could have been writing about the evolution of the home computer, the internet of things, or the materialities of code.  What he has nigh on perfected is a balance between cultures of materiality, and the limits to our ability to escape that materiality.



In the end, what I think I learned most fully from this book is that it is possible to practise digital history – to make new narratives, informed by new technologies, in direct engagement with old ones.  And that the outcome will look different to the kinds of books that historians have now been writing for over 200 years.  This still looks like a book, but actually it is something more than that - it is an encoding of a journey through data and tools; through history certainly, but also through the mechanics of the academy.

One of the complaints heard about the digital humanities – or indeed the digital revolution – is that it has not transformed how we do history, sociology, literature or anthropology – that somehow it has failed to fulfil the early hyperbole.  But this book suggests that a different kind of history is gradually emerging; and that while it will no doubt retain the form of the codex, it is nevertheless different.  By self-consciously using the conditions of the present to rework our inherited forms of history writing, this book represents a positive step forward.


In other words, it is a radical, self-conscious, and technically informed experiment in genre.  It is a practical intervention in the creation of digital history.  Buy it if you really want to subsidise the commercial academic presses, or figure out how to access it in other ways, but read it.

Friday 3 February 2017

Research Infrastructure and the Future of National Libraries



In my role as a member of the British Library Advisory Council, I was recently asked to present a few thoughts on how research infrastructure might change in response to the changing demands of academics.  This post records my notes for that discussion.  It does not record what I said to that committee on the day - and certainly does not imply that the British Library in any way endorses or subscribes to my views; but it does reflect what I believe is the necessary direction of travel in the provision of resources for academic research. 


I have been asked to speak for a few minutes about developments in academic research and the implications these might have for the British Library; and where I really wanted to start was with a quick appreciation of where we have come from.

It is important to remember that we are sitting at the centre of what was a 200-year project to create a comprehensive – divided, but universal – infrastructure for research and knowledge creation.  All you have to do is walk down Museum Row – from the Science Museum, to the Natural History Museum, to the V&A – each with active research staff equivalent to their Higher Education counterparts – to remember that we inherited a powerful, cross-disciplinary research infrastructure.  Or look back to the old round reading room of the British Library, with its 400-odd volumes of an ever-changing manuscript catalogue – seeking to encompass all of human knowledge.  Whatever your field, whatever your methodology, the nineteenth and early twentieth centuries created an infrastructure in stone and brick.
 
In the last fifty years much of this has either been transformed, or else become increasingly redundant – catering for an ever-shrinking body of old-school scholars – while much of the effective infrastructure that underpins research has moved elsewhere.  Arguably 'Science, Technology, Engineering and Medicine' (STEM) subjects, with their greater resources, have led the way in creating endless new data stores and distributed infrastructural kit.  And while buildings – like the Crick Institute, or CERN – represent a fragment of a constant, ongoing rebuilding of intellectual infrastructure, they are just the tip of a much larger transformation that has taken a new form.

Through repositories like CERN's Zenodo project; through the Genome project (with what was, at the time, a seemingly huge demand for data storage); with Gold Open Access science journals (built on commercial publishing models, and incorporating their own data stores); with GitHub; and with a collaborative, project-based approach to research, STEM has created a new distributed knowledge infrastructure – because the older one failed first for its disciplines.

In the process STEM has largely side-stepped the brick and stone infrastructure – in particular the British Library.  You will not find a physicist or an astronomer in any of the Library's reading rooms.  In other words, the hardest end of STEM seems to me to have cracked substantial elements of this conundrum, and left the other two thirds of the research landscape – from the softer end of STEM, to social science, business and economics, and the humanities – largely eating dust, and reliant on an increasingly creaky twentieth-century infrastructure.

So, in the first instance, it seems to me that we are challenged to rethink 'research' data and publication as a new form of infrastructure.  The Library – or somewhere – needs to create a context in which notes, files and data stores of all kinds can be shared and curated in digital form.   And this data, or data store, needs in turn to be tied directly to the public commentary – or publication – built upon it.

But in the process there is also something more subtle going on.  While STEM has led in a particular direction, it has brought with it a particular style of research organisation, which again changes the nature of the infrastructure required.  All you have to do is look at the evolution of Research Councils UK – from its shared services centre, to an ever-growing emphasis on inter-disciplinary funding, on large team projects, and on training Early Career Researchers to be 'leaders' (by which they mean project heads) – to see a direction of travel towards large, 'laboratory-ish' groups, fronted by media-friendly 'interpreters'.  And of course, this is all combined with a precipitate concentration of research funding on an ever-smaller number of ever more self-congratulatory institutions.

In other words, it seems to me that national research culture – and, in a more chaotic way, international infrastructure as well – is faced with a twofold change.  First, there is a fundamental transformation of the most significant core of the research 'infrastructure' from bricks and mortar to the online – to immediately accessible data, with the tools to use and 'publish' it.

And second, we are faced with a gradual, forced move towards larger and more 'laboratory-like' forms of research, in which collaboration – both virtual and face-to-face – is increasingly normal.
By way of a caveat, however, we are also faced with a multi-generational lag, in which every variety of lone and independent scholar will want the same old same old to be available, regardless of the cost.

All of which just leads me to believe that the evolving nature of research – mainly that based in Higher Education – needs urgent attention in the following areas.


  • The shared curation and storage of data and research materials – building on STEM models, but made friendlier to different data types.
  • We also need the tools to work with that data – and training that supports their use.
  • We need to explore different validation and authorisation models for 'publication'.  At the moment we are allowing a multi-billion pound business to be built on national expenditure, and we need to reclaim elements of this – through new models of peer review and distribution.  These in turn need to be tied to data – vertically integrated from data to experiment to commentary – and more amenable to collaboration – with a traceable development path through all of it.
  • We also need a clearer commitment to non-HE researchers.  We need to acknowledge that HE – as gatekeeper of research authority – forms part of the problem.  And we need to keep a weather eye on the boundaries around who can research.  The BL certainly needs to create an infrastructure for HE research, but it needs to be an infrastructure that is open to everyone.

In other words, and as usual, the British Library needs to remember that it is a national machine for research and learning, committed to access to all knowledge, for everyone who needs it; and to use these first principles to navigate a remarkably complex and rapidly changing landscape.

We also need to remember, as William Gibson said: ‘The future is already here – it’s just not very evenly distributed’.

Friday 20 January 2017

Humanities²

What follows is the lightly revised text of a 'Provocation' I presented at the launch event held to celebrate the creation of the Digital Humanities Institute from the Humanities Research Institute.  Held in Sheffield on the 17th of January 2017, the event perhaps required more celebration than critique (and the DHI deserves to be celebrated).  But I hope that the main point - that we need to move beyond the textual humanities to something more ambitious - comes through in what follows.


I have been working with what has up till now been the Humanities Research Institute for almost twenty years.   I have watched it grow with each project, and engaged with each new twist and turn in the remarkable story of the evolution of the digital humanities in the UK since the 1990s.   It has been a real privilege.


Of all the centres created in the UK in that time, the HRI has been the most successful and the most influential.  And it has been successful because, more than any other equivalent, it has created a sustainable model of online publishing of complex inherited materials, and done so in delicate balance with an ongoing exploration of the new things that can be done with each new technology – and in balance again with a recognition of the new problems the online presents.


I frequently claim that the UK is at the forefront of the digital humanities – not necessarily because the UK has been at the bleeding edge of technical innovation, or because its academics have won many of the intemperate arguments that pre-occupy critical theory.  Instead, it is at the forefront of worldwide developments because, following the HRI, the UK figured out early that the inexorable move to the online demanded both a clarity of purpose and a constant, ongoing commitment to sustainable publication.  The HRI, and now the DHI, represent that clear and unambiguous commitment to putting high quality materials online in an academically credible form; and an equally unambiguous commitment to measured innovation in search and retrieval, representation, and analysis.


But, while it is a moment to look back on a remarkable achievement, it is also a moment to grasp the nettle of change.  This re-foundation is a clear marker of that necessity and reflects a recognition both that the Humanities as a whole are on the move, and that the roles the DHI might play in that process are themselves changing.

For me, this sets a fundamental challenge.   And where I tend to start is with that label 'The Humanities'.  This category of knowing has never really sat very comfortably with me.  It has always seemed a rather absurd, portmanteau import from the land of Trump – a kind of Trumpery – used to give a sense of identity to the thousand small private universities that pock-mark the US, and to a collection of ill-assorted sub-disciplines brought together primarily in defence of their funding.  And it goes without saying, 'The Humanities' are always in crisis.
 
But, if you asked me to define the 'humanities' part of that equally awkward phrase – the Digital Humanities – I would say it has to encompass the process through which a society learns about itself; where it re-affirms its collective identity and values; where the past and the present work in dialogue.  And whether that is via history, or literature, philosophy or politics, or the cultural components of geography and sociology – the 'Humanities' is where a community is first created and then constantly redefined in argument with itself, and with its past.

For all the addition of the ‘digital’ to the equation, that underlying purpose remains, and remains uniquely significant to a working civil society. 


But, up until now – that conversation – that dialogue between the past and the present – has pre-eminently taken the form of text – the texts of history books and novels; long analytical articles and essays; aphorisms, poems and manifestos.  And even when you add the 'digital' to create the 'Digital Humanities', the dominance of 'text' remains constant.  Indeed, if you look at the projects that have been undertaken by the HRI over the last two decades, the vast majority have been about text, and the re-representation of inherited text in a new digital format.  You can, of course, point to mapping projects, and 3D modelling of historic buildings, but the core work of the 'digital humanities' to date has been taking inherited text, and making it newly available for search and analysis as a single encoded stream of data.


This is a fantastic thing – the digital humanities have given us new access to old text, and created several new forms of 'reading' along the way – distant, close, and everywhere in between.  It has, arguably, created a newly democratic world of knowledge – in which some 40% of all humans have access to the web and all the knowledge contained therein – all 3.5 billion of them.  That small-minded world many of us grew up in – of encyclopaedia salesmen peddling access to a world of information from which most of us were otherwise excluded by class and race and gender – is simply gone.  This is a very good thing.


But while the first twenty years of the web formed a place where the stuff of the post-Enlightenment dead needed to find a home, our hard work recreating this body of material also means that we have spent the last twenty years very much swimming against the tide of the 'humanities' as a set of contemporary practices.  We have reproduced an old-school library, but online – with better finding aids and notetaking facilities, and we have made it more democratic and hyper-available – for all the paywalls in the world.  But at the same time, we have also allowed ourselves to limit that project to a 'textual' humanities, when the civic and civil conversation that the 'humanities' must represent has itself moved from text to sound and from sound to image.  There is a sense in which we are desperately trying to represent a community – a conversation – made up of an ever-changing collection of voices in an ever-changing series of formats, but trying to do so via that single encoded stream of knowing: text.


This is where the greatest danger and the greatest opportunity for the 'digital humanities' lies – because if you look at 'data' in its most abstract forms, this equation between knowing and text is breaking down, and is certainly changing at a dramatic pace.


The greatest technological developments shaping the cultures of the twentieth century focussed on creating alternatives to text.  Whether you look to sound and voice, via radio and recording; or image and movement, via film and television – the first half of the twentieth century created a series of new forms of aural and visual engagement that gave to sound and image the same universal reach that, for the preceding four hundred years, had been provided by print.  The second half of the twentieth century, and the first decade of the twenty-first, was equally taken up with putting sound and image into our everyday lives – jostling for attention with, and pushing aside, text.


It is perhaps difficult to remember that the car radio only became commonplace in the 1950s; and that the transistor radio, making mobile music possible – on the beach and on the street – was a product of the same decade.  Instant photography and moving images were similarly only given the freedom to go walkabout in the 1970s and 1980s, with luggable televisions and backbreaking video cameras.


This trajectory of change – an ever-greater focus on the non-textual – has simply increased in pace with the advent of the smartphone and the tablet.  While, at the margins, the Kindle may have changed how we read Fifty Shades of Grey on public transport, it was the Walkman, the iPod, and the smartphone that have most fundamentally changed how we spend our time - what kinds of signals we are interpreting from minute to minute.   The most powerful developments of the last decade have involved maps and images – from Google Earth to Flickr and Pinterest.


Ironically, while the book and the journal article have remained stubbornly the same – even in their digital forms – and while much of the effort of the 'digital humanities' has been directed towards capturing a technology of text that had been largely invented by 1600, and has remained largely unchanged since, the content of our culture has been radically transformed by the creation of unlimited sound and image.

If you want proof of this, all you need do is reflect on the triggers of your imagination when contemplating the 1960s or 1980s – or the 2000s or 2010s.  We have become a world of sound and image.

Half the time we now narrate the past through discographies of popular music; and most of what we know about the past is delivered via image-rich documentaries and historical dramatizations – wholly dependent on film archives for their power and claim to authenticity.  Our conversation – that dialogue with the dead that forms the core of the humanities – has become increasingly multi-modal and multivalent.  A simple measure of this is that the percentage of text on the web has been declining steadily since the mid-2000s.  According to Anthony Cocciolo, text currently represents only some 27% of web content.


Over the last two decades the Digital Humanities has crafted a technology for the representation of text; but we now need to pay more attention to all that other data – the non-textual materials that increasingly comprise our cultural legacy, and the content of our humanities conversation.


And the digital humanities have a genuine opportunity to create something exponentially more powerful than the textual humanities.  What the digital side of all this allows, is the removal of the barriers between sound and image and text – between novel, song and oil painting.  Each of these is no more than just another variety of signal – of encoding – now, in the digital, divided one from the other by nothing more substantial than a different file format.
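
As a purely illustrative aside, the point can be made concrete in a few lines of Python.  This is a minimal sketch of my own, not drawn from any particular project: the file names ('page.txt', 'voice.wav', 'print.png') are hypothetical, and it assumes only the standard numpy, Pillow and wave libraries.  Whatever the medium, each file loads as nothing more than an array of numbers, differing only in shape.

    import wave

    import numpy as np
    from PIL import Image

    # Text: a one-dimensional stream of character codes.
    with open("page.txt", encoding="utf-8") as f:
        text_signal = np.frombuffer(f.read().encode("utf-8"), dtype=np.uint8)

    # Sound: a one-dimensional stream of amplitude samples
    # (assuming a 16-bit WAV recording).
    with wave.open("voice.wav", "rb") as w:
        sound_signal = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)

    # Image: a grid of pixel intensities (height x width x colour channels).
    image_signal = np.asarray(Image.open("print.png"))

    # Three media, three file formats, one underlying representation.
    for name, signal in [("text", text_signal),
                         ("sound", sound_signal),
                         ("image", image_signal)]:
        print(name, signal.shape, signal.dtype)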


If we can multiply sound by text – give each encoded word a further aural inflection, and each sound a textual representation of its meaning to the listener – we make the humanities stronger and more powerful.  By bringing text and image together, we create something that allows new forms of analysis, new layers of complexity, and new doubts and claims, to be heard among the whispering voices of that humanities conversation.  In part, this is a simple recognition that the physical heft of a book changes how you read it; and that doing so on a crowded tube train is different from reading even the very same physical book on a sunny beach.


Much has already been done to bring all these signals on to the same screen – to map texts, and to add image to commentary – but there is an opportunity to go much further, and to acknowledge, in a methodologically consistent way, that we can use sound and image, place, space and all the recoverable contexts of human experience to generate a more powerful, empathetic understanding of the past; to have a fuller, more compelling conversation with the dead.  To my mind, we need new methodologies that allow us to analyse and deconstruct multiple signals, multiple data streams – sound multiplied by text by image by space.  We need to recreate the humanities by multiplying its various strands, one against the other, to create something more powerful, more challenging, and more compelling.  Perhaps, the Humanities³.