Monday, 12 December 2011

Playing with Locating London's Past

With colleagues at the Universities of Sheffield and the IHR, we launched a new web resource this morning that allows you to map some seventeen different large-scale datasets related to 18th century London on to a GIS-compliant version of John Rocque's 1746 map of the capital - all in a Google Maps environment.  See www.locatinglondon.org.  I think it is very pretty and intuitive, but what I find most interesting about the site is that it allows you to explore a component of these datasets that we have hitherto done very little with - the spatial.  I don't know what is there yet, but I suspect I will have a good time finding out.

My first thought was to play with a nice dichotomy in the data for the Old Bailey Proceedings - the published trial accounts for London, 1674-1819 (they continue to be printed up till 1913 but only the 18th century elements are currently available for mapping).  

One aspect of the tagging we imposed on the Proceedings was a distinction between 'Crime Location' and 'Defendants' Home'.  This information is pretty consistently given in the text and tagged in the XML, and the 18th century trials include around 34,000 crime locations, and around 12,000 defendants' homes.  
 
A quick search for all 'Crime Locations' (34,427), when mapped on to 'Street' and displayed on to a blank screen, looks like this:

  
And an equally quick mapping of 12,031 Defendants' Homes looks like this:


 
When placed over the warped version of John Rocque's 1746 map of London, the result is:


I don't have an argument about this data, or even much of an observation.  The predominance of 'Defendants' Home' in the eastern part of the city seems pretty compelling, and could form the basis for an analysis of the relative access to justice in eighteenth-century London, or, when mapped against wealth, part of an argument about the nature of crime and its motivation.  But more importantly, the process of 'playing' with this data strikes me as central to a very different kind of research narrative than I am used to.  I am not formulating questions, and then using the data to answer them - I am throwing together visualisations in search of contrasts that stand out, and look weird.

I am very much looking forward to using the interactive elements of the Locating London's Past site to find anomalies and confusions that allow me to reformulate the questions I am asking.





Sunday, 23 October 2011

Academic History Writing and its Disconnects


This is the rough text of a short talk I am scheduled to deliver at a symposium on 'Future Directions in Book History' at Cambridge on the 24th of November 2011.


I am on the programme as talking briefly about the ‘Old Bailey Online and other resources’ (by which I assume is meant London Lives, Connected Histories, and Locating London’s Past, and the other websites I have helped to create over the last ten or twelve years).  But I am afraid I have no interest whatsoever in discussing the Old Bailey or the other websites.  The hard intellectual work that went into their creation was done between 1999 and 2010, and for the most part they have found an audience and a user base and will have their own impact, without me having to discuss them any further.  We know how to do this stuff, and anyone can read the technical literature, and I very much encourage you to do so.


Instead, I want to talk about how the evolution of the forms of delivery and analysis of text inherent in the creation of the online problematizes and historicises the notion of the book as an object, and as a technology; and in the process problematizes the discipline of history itself as we practise it in the digital present.


The project of putting billions of words of keyword-searchable stuff out there is now nearing completion.  We are within sight of that moment when all printed text produced between 1455 and 1923 (the point at which, the Disney Corporation determined, the needs of modern corporate capitalism trumped the Enlightenment ideal) will be available online for you to search and read.  The vast majority of that text is currently configured to pretend to be made up of ‘books’ and other print artefacts.  But, of course, it is not.  At some level it is just text – the difference between one book and the next a single line of metadata.  The hard leather covers that used to divide one group of words from another are gone; and every time you choose to sit comfortably in your office reading a screen, instead of going to a library or an archive, while kidding yourself that you are still reading a ‘book’, you are in fact participating in a charade.  We are swimming in deracinated, Google-ised, Wikipedia-ised text.


In other words, and let’s face it: the book as a technology for packaging and delivery, storing and finding text is now redundant.  The underpinning mechanics that determined its shape and form are as antiquated as moveable type.  And in the process of moving beyond the book, we have also abandoned the whole post-enlightenment infrastructure of libraries and card catalogues (or even OPACs), of concordances, and indexes and tables of contents.  They are all built around the book, and the book is dead.


If this all sounds rather doom-laden and apocalyptic – and no doubt we could argue about the rosy future and romantic appeal of the hard copy book – it shouldn’t.  At least as far as the ‘history of the book’ is concerned, these developments have been entirely positive.

First, they have allowed us to begin to escape the intellectual shackles that the book as a form of delivery imposed upon us.  If we can escape the self-delusion that we are reading ‘books’, the development of the infinite archive, and the creation of a new technology of distribution, actually allows us to move beyond the linear and episodic structures the book demands, to something different and more complex.  It also allows us to more effectively view the book as an historical artefact and now redundant form of controlling technology.  The 'book' is newly available for analysis.


The absence of books makes their study more important, more innovative, and more interesting.  It also makes their study much more relevant to the present – a present in which we are confronted by a new, but equally controlling and limiting technology for transmitting ideas.  By mentally escaping the ‘book’ as a normal form and format, we can see it more clearly for what it was.  And to this extent, the death of the book is a fantastic and liberating thing – the fascism of the format is beaten.


At the same time, I think we are confronted by a profound intellectual challenge that addresses the very nature of the historical discipline.  This transition from the ‘book’ to something new fundamentally undercuts what we do more generally as ‘historians’.  When you start to unpick the nature of the historical discipline, it is tied up with the technologies of the printed page and the book in ways that are powerful and determining.  Our footnotes, our post-Rankean cross-referencing and practices of textual analysis are embedded within the technology of the book, and its library.


Equally, our technology of authority – all the visual and textual clues that separate a CUP monograph from the irresponsible musings of a know-nothing prose merchant – is slipping away.  Meanwhile our professional identity – the titles, positions and honorifics – built again on the supposedly secure foundations of book publishing – is ever less compelling.  So the question then becomes: is history – particularly in its post-Rankean, professional and academic form - dead?  Are we losing that beautiful disciplinary character that allows us to think beyond the surface, and makes possible complex analyses that transcend mere cleverness?
 

And on the face of it, the answer is yes – the renewed role of the popular blockbuster, and an ever-growing and insecure emphasis on readership over scholarship, would suggest that it is.  In Britain we shy away from the metrics that would demonstrate ‘impact’ primarily because we fear that we may not have any.


Collectively we have put our heads in the sand, and our arses in the air, and seemingly invited the world to take a shot.  A single and self-evident instance that evidences a deeper malaise is our current failure to bother citing what we read.  We read online journal articles, but cite the hard copy edition; we do keyword searches, while pretending to undertake immersive reading.  We search 'Google Books', and pretend we are not.


But even more importantly, we ignore the critical impact of digitisation on our intellectual praxis.  Only 48% of the significant words in the Burney collection of eighteenth-century newspapers are correctly transcribed as a result of poor OCR.  This makes the other 52% completely un-findable.  And of course, from the perspective of the relationship between scholarship and sources, it is always the same 52%.  My colleague Bill Turkel describes this as the Las Vegas effect – all bright lights, and an invitation to instant scholarly riches, but with no indication of the odds, and no exit signs.  We use the Burney collection regardless – not even bothering to apply the kind of critical approach that historians have built their professional authority upon.  This is roulette dressed up as scholarship.

In other words, we have abandoned the rigour of traditional scholarship.  Provenance, edition, transcription, editorial practice, readership, authorship, reception – the issues we query in relation to books are left unexplored in relation to the online text we actually read.

And as importantly, the way we promulgate our ‘history’ has not kept up either.  I want television programmes with footnotes, and graphs with underlying spreadsheets and sliders.  Yes, I want narrative and analysis, structure, point and purpose.  I want to continue to be able to engage in the grand conversation that is history; but it cannot continue to be produced as a ragged and impotent ghost of a fifteenth century technology; and if we don’t do something about it, we might as well all go off and figure out how to write titillating tales of eighteenth-century sex scandals, because at least they sell.

The book had a wonderful 1,200-odd-year history, which is certainly worth exploring.  Its form self-evidently controlled and informed significant aspects of cultural and intellectual change in the West (and through the impositions of Empire, the rest of the world as well); but if, as historians, we are to avoid going the way of the book, we need to separate out what we think history is designed to achieve, and to create a scholarly technology that delivers it.


In a rather intemperate attack on the work of Jane Jacobs, published in 1962, Lewis Mumford observed that:


‘… minds unduly fascinated by computers carefully confine themselves to asking only the kind of question that computers can answer and are completely negligent of the human contents or the human results.’

I am afraid that in the last couple of decades, historians who are unduly fascinated by books, have restricted themselves to asking only the kind of questions books can answer.  Fifty years is a long time in computer science.  It is about time we found out if a critical and self-consciously scholarly engagement with computers might not now allow us to more effectively address the ‘human contents’ of the past.

Sunday, 19 June 2011

Culturomics, Big Data, Code Breakers and the Casaubon Delusion

Suddenly it seems as if 'big data' humanities is all the crack; with quantitative biologists and mathematicians diving in where previously only historians, literary critics and linguists dared to swim.  Digital humanists have been slowly engineering a new field from history and linguistics (aided and abetted by library science) for over a decade, gradually building new bodies of evidence, and road-testing new methodologies.  But in just the last year or so, the biologists and mathematicians, with Google's help, have stolen a march on all their puny efforts.  In particular, it seems that Science and Nature have fallen head over heels in love with 'culturomics' and the heady enthusiasms of Erez Lieberman Aiden and Jean-Baptiste Michel, and their Google ngram viewer.  To read the most recent issue of Nature is to be confronted with a heady mix of big science and gushing Hello Magazine prose that works to mythologise the new 'science' of culturomics and its creators.  It feels like the birth of a myth and of a brand.


This is all rather wonderful, and I am a huge fan of the Google ngram viewer, and the playful way it allows scholars and students to engage with the 'infinite archive' of inherited texts.  I think Aiden and Michel (and Google) have done the humanities a huge service.   But their real achievements do not quite explain the cloud of hyperbole that seems to be rising around them.


And this made me wonder: what is really at issue here?  What is it about culturomics that turns on the reporters from Nature?  At its heart, the use of word frequency with a reasonably sized (if problematic) data set simply provides one more form of evidence to be added to all the rest.  Knowing that the term 'electricity' peaks between 1870 and 1900 is useful evidence, but does not provide either an explanation for why, or a description of how it is being used.  Historians will no doubt look this particular gift horse in the mouth, and worry at the condition of its teeth; but they will also happily use the ngram viewer as one more component in a complex landscape of evidence.  This use may be delayed by the peculiar lack of any guidance on how to cite the results of a search, but it will be normalised in due course.
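The measure behind that kind of claim is straightforward: occurrences of a term per million words in each decade's corpus.  A minimal sketch, with counts invented purely for illustration (the real ngram data is of course vastly larger):

```python
# Toy corpora: total word counts and occurrences of 'electricity'
# per decade. All numbers here are invented for illustration only.
decade_totals = {1850: 1_000_000, 1870: 1_200_000, 1890: 1_500_000, 1910: 1_800_000}
electricity_hits = {1850: 40, 1870: 180, 1890: 300, 1910: 270}

def relative_frequency(term_hits, totals):
    """Occurrences per million words - the usual ngram-style measure."""
    return {d: term_hits[d] / totals[d] * 1_000_000 for d in totals}

freqs = relative_frequency(electricity_hits, decade_totals)
peak = max(freqs, key=freqs.get)
print(peak)  # -> 1890: the decade with the highest relative frequency
```

Normalising by corpus size matters: raw counts rise simply because more was printed, which is exactly the sort of artefact a historian would want flagged.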


But simply providing a new body of evidence is not what seems to get Nature going.  Instead, it is the claim that the ngram viewer lays the basis for a new 'science', and that the results make other forms of historical analysis redundant.  In the words of Aiden and Michel, somehow this data is uniquely available for 'scientific purposes', in contrast to other forms of evidence.

It is not, therefore, the mechanics of the ngram viewer that is at issue.  Instead it is the underlying intellectual paradigm that Aiden and Michel bring to its use.  They appear to claim to be able to read history from the patterns the ngram viewer exposes - to decipher significant patterns from the data itself.  Their great party tricks (and they are particularly impressive in live performance) include reducing the decline of irregular verbs to a describable mathematical pattern, an equation, and measuring the rise of 'celebrity' by the number of times an individual is mentioned in print.  These imply that all historical development can, like irregular verbs, be described in mathematical terms, and that 'human nature', like the desire for fame, can be used as a constant to measure the changing technologies of culture.

In some respects, we have been here before.  In the demographic and cliometric history so popular through the 1970s and 80s, extensive data sets were used to explore past societies and human behaviour.  The aspirations of that generation of historians were just as ambitious as are those of the parents of culturomics.  But, demography and cliometrics started from a detailed model of how societies work, and sought to test that model against the evidence; revising it in light of each new sample and equation.

The difference with culturomics is that there is no pretence to a model.  Instead, its practitioners simply seek to discover patterns in the entrails of human speech, hoping to find the inherent meanings encoded there.  What I think the scientific community finds so compelling is that, like quantitative biology and DNA analysis, Aiden and Michel are taking one of the controlling metaphors of 20th-century science, 'code breaking', and applying it to a field that has hitherto resisted the siren call of analytical positivism.


Since the 1940s the notion that 'codes' can be cracked to reveal a new understanding of 'nature' has formed the main narrative of science.  With the re-description of DNA as just one more code in the 1950s, wartime computer science became a peacetime biological frontier (cashing in on big-pharma, as military expenditure declined).  That Aiden comes from a background in DNA analysis should clue us to the fact that culturomics is an attempt to apply the same kind of code breaking to human society as a whole.



I strongly suspect that the project will fail, just as naive readings of DNA as a code for life have largely failed to fulfil their promise.  But much more importantly, this attempt to repurpose a 'scientific' approach to historical analysis simply misunderstands the function of history itself.  These large-scale visualisations of language may be the raw material of history, the basis for an argument, the foundation for a narrative, the evidence put in the appendix in support of a subtle point, but they do not serve as a work of history.

Historians interpret the past to the present.  They marshal evidence and use all the tools of genre writing to allow a modern reader to engage with the past.  And the questions they ask are not driven by the evidence, but by the needs of a modern society.  Gender history, the history of sexuality, and of race, have been created by two generations of historians not because the archives are groaning under the weight of relevant evidence, but because our society needs to understand the role of these forces in the present.  The fundamental flaw with culturomics is that it assumes that history is about the past; that what historians seek to achieve is an ever more accurate description of everything.  Instead, it is about the present.  Ironically, Aiden and Michel have rediscovered the 'Casaubon delusion'; and believe, like George Eliot's tragic figure, that they can create a new 'Key to all Mythologies'.   They need to listen to the Dorotheas of this world.

Friday, 1 April 2011

Towards a New History Lab for the Digital Past

The text of a talk delivered at the launch of Connected Histories, held at the Institute of Historical Research, London, on 31 March 2011.


When the Institute of Historical Research was first established in 1921, its purpose and object was described as: 

to become an index to historical knowledge, a focus of historical research, a clearing-house of historical ideas, and a historical laboratory open to students of all universities and all nations.
Institute of Historical Research leaflet, 1921

And those of you who know the IHR, as it has evolved through almost a century of change, will recognise in its seminars, in its unique open shelf library, and in its simple role as the centre of a community of historians, of students, and of the curious and argumentative, the continuing vibrancy of this original spirit and purpose.
In many respects, Connected Histories is a simple attempt to ensure that this spirit and objective continues to thrive online; that immediate access to 2 billion words and 150,000 images  - searchable at the click of a mouse, and sharable across time and space – will enhance that community, and the history it creates.


But Connected Histories is also a recognition that the nature of historical research has changed; that we are drowning in an infinite archive – an ever expanding world of information.  And that the secure sense of a discipline that knew how to judge quality, how to assess evidence, is challenged by the sheer number of sources we can interrogate for words – at least - if not yet for meaning.

Given the privilege of a few minutes with a powerful audience, I want to do a couple of things this afternoon.  First, I want to describe just what Connected Histories does and how it works.  And in the process say a bit about why it is designed the way it is, and what issues it is meant to address.  And second, I want to talk a bit about how it fits into a trajectory of changing research and publishing practice – to describe where it sits in a process of frighteningly rapid change, and to locate it alongside the other resource being introduced today – Mapping Crime.
Connected Histories is what is called a ‘federated’ search facility, and currently makes some eleven different web resources available – over two billion words of text, and 150,000 images, some free to access, others supported by a JISC license for use in British Higher Education, and others still, commercial sites designed for a wider audience of family and local historians.  

It includes:
  • British History Online
  • British Museum Images
  • British Newspapers, 1600-1800
  • Charles Booth Archive
  • Clergy of the Church of England Database 1540-1835
  • House of Commons Parliamentary Papers
  • John Johnson Collection of Printed Ephemera
  • John Strype's Survey of London Online
  • London Lives 1690-1800
  • Origins Network
  • The Proceedings of the Old Bailey Online, 1674-1913


And we are in the process of adding several more.

Underpinning its searches are indexes of every word in those eleven ‘distributed’ websites – each of which was chosen to represent large bodies of academically credible and relevant material.  As a part of this index, each word is associated with a web address, a URL, that allows you to click through to the original.  This creates a basic facility that, in response to a word or phrase search, will return tens of thousands of results, each associated with a snippet of text, and each linked to the full resource held elsewhere.

In other words, at its fundament and in its water, Connected Histories is simply a comprehensive index of words.  But in the process of creating that index, we also sought to assign meaning to some of them.  Using a methodology called natural language processing, we identified names and dates and places (to an accuracy of around 75%).  So, in addition to an index of all the words in these 11 resources, we have also created indexes of all the names and places and dates mentioned: all the names in the Burney Collection, and all the dates in the Parliamentary Papers (however they are expressed).  

In other words, what we have is not just one, but four indexes, and you are searching each of these 2 billion words, or the millions of names or dates or places, each time you enter a query – allowing you to combine keyword and name, date and place searches to find just what you want.
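The four-index idea can be sketched with a toy example.  The documents and hand-built entity lists below are invented for illustration; the real indexes are of course machine-derived (via natural language processing) and vastly larger:

```python
# Three invented 'documents' standing in for the eleven resources.
docs = {
    "doc1": "John Wilkes was tried at the Old Bailey in London in 1763",
    "doc2": "A riot in London was reported in 1780",
    "doc3": "John Wilkes addressed electors at Brentford",
}

def build_word_index(documents):
    """A plain inverted index: every word points back to document ids."""
    index = {}
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(doc_id)
    return index

word_index = build_word_index(docs)

# Hand-built name and place indexes standing in for the NLP-derived ones.
name_index = {"john wilkes": {"doc1", "doc3"}}
place_index = {"london": {"doc1", "doc2"}, "brentford": {"doc3"}}

# A combined query: documents mentioning the name AND the place.
hits = name_index["john wilkes"] & place_index["london"]
print(hits)  # -> {'doc1'}
```

The point of keeping the indexes separate is exactly the one made above: a query can intersect any combination of word, name, place and date sets before a single original document is fetched.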

That it works is a testimony to the hard work of the technical staff at the HRI and IHR, to Kathy Rogers and Bruce Tate in particular.  But also to Sharon Howard, who managed the project, and to the large team of people involved.

The starting point for this project was always an attempt to address what Digital Humanists tend to label the ‘silo effect’ – the idea that one of the problems with small-scale websites and resources of the sort so many of us have worked to create over the last fifteen years is that you tend to go along to one site – do a bit of research – before heading to another.  That, just as with traditional forms of research, most of us forget what we knew in the British Library during the short walk to the London Metropolitan Archives.

And in its most basic formulation the silos Connected Histories seeks to blow apart are the boundaries between websites.  You can now cross-search the British Museum image collection against Strype’s history of London, and associate images of specific locations with descriptions and commentary on them.  You can search the Parliamentary Papers in combination with the records of all the sessions papers of the county of Middlesex – bringing onto a single screen precept and practice.  There are sixty thousand settlement examinations that can now be cross-referenced against apprenticeship documents, and trial records.

But this blithe image of easy cross-searching fundamentally understates the complexity of the issue, and the precise reasons Connected Histories is designed in the way that it is.

One overwhelming, and very real, aspect of the ‘silo effect’, is that while many of the primary sources we need are freely available, many others are not.  The walls of some silos are much more difficult to breach than others.  While frustrating, this is not necessarily a bad thing.  Unless we can convince the state and the taxpayer to pay for universal digitisation, we can’t really complain if the cost of digital resources is being borne by the end users.  Early modern and modern Britain is quite simply the most digitised where and when in existence, because of the combined efforts of the academy, of the great cultural institutions of Britain, of individual scholars, and private publishing companies motivated by profit.  But, at the same time, as scholars and teachers, we need access to all those resources.  Or at the least we need to know what is in them, in order to make an informed decision about what we want out of them.

The model of a series of indexes to the original material is precisely designed to address this issue.  We don’t need to have direct access to all the products of ProQuest and Gale, of the Origins.net, or JISC funded projects like the John Johnson collection, if we have an index that tells us what they contain.  These materials can sit behind their paywalls, the intellectual property they contain safe from harm; while we can now interrogate them from a distance.

In other words, Connected Histories is designed to build a bridge between the academy and commercial publishers; it is designed to mess up the models of delivery, and the walls of division that keep us apart.  These ‘silos’ are almost philosophical in character, but even more than the technical ones that divide one website from another, they need to be breached; and that is part of what this project has been about.

But, the silo effect goes beyond even this.  It exists between our own ears as well.  At its best and most compelling, history is a community of scholars, sharing knowledge and effort in pursuit of a real and usable understanding of the past – it is a collective project.  At its worst, it is a collection of egomaniacs, desperate to be lauded as the great authority on this or that – however specialised and narrow that might be.  The much lamented lone scholar is as frequently a Casaubon, forever seeking and failing to find the key to all knowledge, as they are a Dorothea, driven by enthusiasm and a desire to share with others.  At some level, the ‘silo effect’ is inherent in the idea of ‘authorship’ (and the ‘authority’ it implies).  It is there when we decide one person’s work is literature, and another is art history; when we label by period or methodology, when we decide who to exclude from the conversation, and who to include.

We do not all need to work collaboratively, or to abandon our notions of intellectual property, but in the spirit of a ‘history lab’, we do need to share our work, and remember the common purpose of historical research.  And Connected Histories is again an attempt to address these particular silos.  

By creating individual workspaces that build into a new body of ‘connections’, by allowing users to link documents, and names, and stuff, across billions of words, and then pooling those links and allowing them to be explored by a wider public, Connected Histories is designed to build a new shared body of knowledge grounded in everyday practical scholarship.  It is designed to nudge the lone scholar to become a more sociable animal.

In many respects all these ‘silos’ are part of our inheritance from the Enlightenment.  They are inherent in every library catalogue, and in the practice of individual scholarship leading to named authorship.  They reflect the co-evolution of the academic community, in a symbiotic death grip, with commercial publishing; and they were imported without fanfare or thought, into what one might want to describe as Web 1.0 – that first iteration of the internet created in the image of older forms of scholarship and communication – with e-mail, e-spreadsheets, e-footnotes, e-everything – all mimicking an older intellectual technology.

In other words, there is a bigger ‘silo’ out there – a division that is more fundamental to the internet and the cultures of scholarship than the mere distance between the technical implementation of British History Online, on one hand, and The Burney Newspapers or the Old Bailey on the other.  

Yes, we want to consult this material in one go; and yes we need to overcome the boundaries, created by pay walls and subscription; and yes we want historians to work together in a common laboratory of ideas and connections.  But, what really needs to break down is the silo that suggests that information itself is something to be consulted and collected; that it is an unchanging object of study, rather than a pool of constantly changing stuff that can be interrogated from any angle, and pursued along any trajectory. 

The most fundamental silo Connected Histories is intended to address is between traditional forms of criticism and scholarship that assume we can contain data in an internally structured and divided, ‘library’; and the emerging world of text and data mining, that sees data as a process – something to be played with and analysed on a massive scale, across boundaries of genre and type.

The innovation at the heart of Connected Histories, the one I think is most interesting, is the methodology used to allow us to sit in London this afternoon, and locate the site and its gubbins in the IHR; while the indexes it interrogates sit on a server in Sheffield – which distributes pointers to eleven different servers around the country.

What has been created by the Institute and the Humanities Research Institute in Sheffield is a model that uses an ‘API’ as its core.  An API is an Application Programming Interface (the most widely used version of which is Google Maps), and it is designed to allow you to create a simple query that can address a dataset from a distance (in this case four indexes).  It is not a website, or a ‘front end’; it doesn’t need to exist as a visual or physical thing.  It is essentially a series of agreed conventions that allow anyone to address a web resource and ask it for a bit of the data it contains.  What Connected Histories does is locate the ‘front end’ in London, with information about the sources, with the workspace and connections, etc., but that front end’s main job is to address the API in Sheffield, to gather the data required from the indexes, to bundle it up into an XML file, and to present it in an attractive way to the end user, who can then navigate to the original sites, create links, and share searches.

In other words, the indexes in Sheffield have been created as a standardised and generic resource, which is then addressed by a specialised and bespoke search and save environment.
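That division of labour can be sketched in a few lines of code.  To be clear, the endpoint URL, parameter names, and XML shape below are all invented for illustration; the real Connected Histories API defines its own conventions:

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

# Hypothetical API endpoint and query parameters - not the real ones.
BASE = "https://api.example.org/search"
params = {"keyword": "settlement", "place": "Middlesex", "from": 1690, "to": 1800}
query_url = BASE + "?" + urlencode(params)  # what a front end would send

# The API would return results as XML of roughly this shape, which the
# front end then renders for the end user. This sample is invented.
sample_response = """<results>
  <hit url="https://example.org/doc/1" snippet="...a settlement examination..."/>
  <hit url="https://example.org/doc/2" snippet="...removed to Middlesex..."/>
</results>"""

root = ET.fromstring(sample_response)
hits = [(h.get("url"), h.get("snippet")) for h in root.findall("hit")]
print(len(hits))  # -> 2
```

Because the query and the response format are agreed conventions rather than a particular website, any number of front ends - or text-mining scripts - can address the same indexes.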

For most of us, this is a seamless process of little interest; but what it does is create a space between search and data that can now be occupied by anyone.  In other words, and unlike most free-standing websites, it is designed to be mash-upable.

With a bit of technical nous you can now generate a bit of code that will automatically select and download the contents of all the indexes, reflecting all the words in Connected Histories.  The text miners who do this will not be gaining access to the original resources – there is no intellectual property issue here (beyond that of this project).  There is no question of them being able to recreate the sites so laboriously constructed by whatever business or academic model was employed to create them in the first instance; but they will have access to what amounts to a detailed description of the contents of it all – the index of every word, and name and place and date.

Or to put it another way, the API architecture breaks down the structures of online resources into their component parts – separates out data from processing, from delivery – allowing each to be re-used and re-purposed.  At the moment it looks like a traditional website with a single front end and datastore, but that front end can address more than one datastore; and the datastore can be addressed by more than one front end.

The API architecture addresses that final wall, that silo that means that providers are on one side and data consumers, forced to query the data through an ever narrowing front end, are on the other.  Suddenly, we are all mixed up in the infinite archive.
 

To my way of thinking, this comes under the heading of an unalloyed good thing.  An outcome that liberates data, while protecting it; that makes for better history (whoever is writing it), and contributes to the democratisation of scholarship.

But it is only one step in a longer journey; and I want to spend the next few minutes pointing up three or four directions that I think Connected Histories helps make possible, or which seem to grow naturally from it.

And the first is to do with those academic text miners, suddenly empowered to access ridiculously large bodies of data.  What do you do with a 2 billion word index?

What I want to do is to begin the process of modelling what recorded language since Gutenberg looks like: how does vocabulary change; how do genres evolve; how are ideas passed from medical literature to political science, to novels; how are changing technology and a changing environment reflected in changing texts?  In a sense, half of the last century was taken up in worrying about whether text, words, reflected a knowable universe, or were themselves controlling discourses, leaving humans powerless to imagine something new or describe something real – held captive in words.

I like to describe what we can now access as ‘massive text objects’ – too large to read, too complex to be contained in traditional taxonomies.  But if we can begin to model them – if we can know both the absolute amount of language recorded, and how it changes from source to source and decade to decade – we can use it in a more sophisticated way to trace, first, the controlling forms of language, but also to more securely tie description to an underlying and knowable historical past.  If you know the shape and texture of what has survived, you can begin to think through how it might relate to Herbert Butterfield’s ‘...genuine relationship with the actual...’. [Herbert Butterfield, The Whig Interpretation of History (George Bell and Sons, 1950), p. 73.  See also Michael Eamon, ‘A "Genuine Relationship with the Actual": New Perspectives on Primary Sources, History and the Internet in the Classroom’, The History Teacher 39.3 (2006), accessed 6 September 2006.]
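As a toy illustration of what the most basic such model might look like, the Python sketch below counts a keyword's relative frequency decade by decade in an invented miniature corpus; a real massive text object would demand the same arithmetic at a vastly larger scale, and far subtler measures besides.

```python
# A toy sketch of decade-by-decade word frequency - the most basic way
# of "modelling" a text object over time. Texts and dates are invented.
from collections import defaultdict

corpus = [
    (1674, "a wicked theft of plate and linen"),
    (1688, "a riot and a theft most violent"),
    (1742, "an account of the trial for murder"),
    (1749, "theft of a silver watch, a common theft"),
]

def frequency_by_decade(docs, keyword):
    """Return {decade: keyword occurrences per 1,000 words}."""
    counts, totals = defaultdict(int), defaultdict(int)
    for year, text in docs:
        decade = (year // 10) * 10
        words = text.lower().split()
        totals[decade] += len(words)
        counts[decade] += words.count(keyword)
    return {d: 1000 * counts[d] / totals[d] for d in totals}

freq = frequency_by_decade(corpus, "theft")
```

Normalising by total words per decade, rather than reporting raw counts, is the crucial step: it is what lets you compare a thinly documented decade with a richly documented one.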
 
It is in text mining massive text objects that the hope of a new empiricism in historical analysis lies.

But for myself, I suspect there is also something else going on.  The urge to create new connections, to escape our inherited taxonomies, can already be seen in projects such as Mapping Crime – being demonstrated later today.  By tying material related to crime available through the John Johnson Collection of Printed Ephemera to other repositories and other genres, a reconstructed set of links begins to emerge that confounds the structures created by librarianship.  The data itself, in its newly digital form, seems to suggest the need for new connections.

And the API model at the heart of Connected Histories is itself an attempt to embed this idea and aspiration at the core of the design process.  It assumes that new connections are there to be made, and that they will inevitably cross boundaries of form and origin to encompass an ever expanding body of inherited artefact.

To take just a small example, we will soon be able to geo-reference at least a portion of the place names in Connected Histories, tying all that text to space in new ways.  By modelling maps in the way that we are beginning to model massive text objects, we can relate historical geography to present geography, to secure a further line between representation and a knowable past; and by using an API methodology we are ensuring that it is all mash-upable, with resources from wherever they come.  By September (if we keep to schedule), we will have the ability to mash up eighteenth-century London as found in the Old Bailey and in London Lives, in Google Maps, with a rectified version of Rocque’s 1746 map of London, in combination with around 3 million artefacts dating from the period dug up by the Museum of London.
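The first step in such geo-referencing can be sketched very simply: match the place names found in a text against a gazetteer of coordinates.  Everything in the Python snippet below – the names, the coordinates, the naive matching rule – is illustrative only, not the project's actual gazetteer or method.

```python
# Illustrative only: a toy gazetteer and a naive substring matcher.
# Real geo-referencing needs disambiguation, variant spellings, and
# coordinates tied to a rectified historical map.
GAZETTEER = {
    "cheapside": (51.514, -0.094),
    "spitalfields": (51.519, -0.075),
    "tyburn": (51.513, -0.159),
}

def georeference(text):
    """Return (name, lat, lon) for each gazetteer place found in text."""
    lowered = text.lower()
    return [(name, lat, lon)
            for name, (lat, lon) in GAZETTEER.items()
            if name in lowered]

places = georeference("taken from Cheapside and carried towards Tyburn")
```

Once each mention resolves to a coordinate pair, plotting it on a Google Maps overlay of the rectified Rocque map is the easy part; the hard scholarly work is in the gazetteer itself.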

This JISC funded project is in hand, and is in many ways the natural outcome of what has been described as the spatial turn in historical studies.  But by putting an API at the heart of the system, it will again facilitate the re-use and re-imagination of what we can do with a few billion lines of data.

And, of course, we can take the same approach to that other great inherited body of evidence: objects.  The historians and the museums will work together eventually (the logic is too ridiculously obvious to need re-enforcing), and at that point the ability to cross-reference maps and texts and objects will again begin to change how we can evidence the past.

And if we could add to the museum collections that other massive online record of surviving historical artefacts, that other massive resource digitised by accident – the auction catalogues – we would have created an entirely new resource, available in a new way.  Auction catalogues have been created as online, digital resources for over a decade, and already contain detailed descriptions and images of millions of objects: the record of what individuals have valued and preserved, on their own behalf, from the past.  And thousands more images and descriptions are added each month.

Again, these represent a massive lens through which we can observe the past, and a silo dividing related and cognate materials.  Connecting them to texts and maps and stuff will help us better understand the whole.

It is intended that Connected Histories will grow over time.  In its first update in September the National Archives ‘documents online’ will be added, as well as two key nineteenth-century resources: 65,000 digitised British Library books from the Jisc Historic Books Platform and the JSTOR collection of pamphlets on social and political issues.  Suggestions for additional content are welcome.  

But beyond more text, we are confronted with the challenge of integrating more different things.  And with each new variety of stuff, we move to a different kind of understanding, more sophisticated, better articulated, more firmly rooted in a clear model of what it is we are looking at; what we can securely see, and what we can’t.   

All in all, I think it is kind of cool.  

But I also think it remains part of that bigger project: 

to become an index to historical knowledge, a focus of historical research, a clearing-house of historical ideas, and a historical laboratory open to students of all universities and all nations.

Connected Histories.




Thursday, 13 January 2011

Urbanism Kolkata Style

I recently spent a week at a workshop in Kolkata organised by the British Library and the National Library of India, designed to lay the foundations for a project to digitise early Bengali Books (1778-1914).  In many respects it was a wonderful experience, and the project is incredibly worthwhile (though it will be difficult to implement).  But this was also my first visit to India, and my first visit to what might be called a 'mega-city', and the experience has forced me to revisit some aspects of how I have been conceptualising urban living, and in particular the nature of eighteenth and nineteenth century London.

A couple of years ago I visited Marrakesh and was struck by the essentially orderly character of what is a relatively poor and very crowded urban environment.  By contrast, what struck me in Kolkata was the extent to which urban growth seemed to have outstripped the city's ability to discipline behaviour.  At around seven million people at midday, and four million at midnight, Kolkata is both one of the world's most recently created cities, and a city occupied by a rapidly burgeoning population drawn primarily from its own rural hinterland.  The city lived up to many stereotypes - crowded streets, fearsome pollution, and traffic that worked like fairground dodgems.  And while there were fewer beggars (either children or the disabled) than I had expected, and while I saw little evidence of malnutrition, there were ragged shanties and street people around every corner.

But what surprised me was the nature of the rubbish.  It seemed to pile up along every roadside, untended and ignored.  And it seemed to be made up overwhelmingly of small bits of plastic, mixed with dust.  There was little evidence of large amounts of organic matter - no rotting fruit or vegetables, fly-specked bones or industrial by-products.  At one level it seemed the least varied or interesting rubbish I had ever seen - and it seemed to sit entirely unregarded, unmoved and unchanging.

My only explanation for this phenomenon is that, first, everything that could be recycled, re-used, turned to any account whatsoever, had been sifted from the pile and moved on.  And second, that there was no working system in place to remove the last, uneconomic residuum. 

As a historian of the urban environment, this seems to me to re-enforce the profound inter-relationship between the economics of city life (its wealth), and the need for cultural controls on the behaviour of each individual urban dweller in order to make a city work.  In essence, what it confirmed for me is the extent to which living in a city requires the kind of detailed social and cultural system that can remove the rubbish; that such cultural and bureaucratic systems can only be sustained in the face of measured growth; and that they can easily break down in the face of rapid migration.

When cities grow as quickly as Kolkata certainly has - when behaviours and systems of bodily maintenance fitted for smallholdings and low-density living are practised in a high-density urban environment - the result, in this instance, seemed to approach the unliveable; or at least a form of urban dystopia.

In relation to the history of London, this observation emphasises the extent to which the eighteenth-century city was able to maintain a culturally disciplined series of behaviours that ensured both that the rubbish was corralled to the correct place on the street, and that it was not allowed to stay there - poor neighbourhoods seldom tipped irredeemably into chaos.  It also suggests that the evolution of the nineteenth-century rookery formed an outpost of dysfunctional urbanism that can be mapped against rural in-migration.  But, most importantly, this experience has re-enforced a belief that 'urban living' can be conceptualised as a distinct cultural phenomenon that takes similar forms across the globe - that being a city dweller is first and foremost about sharing a cultural system built on specifically urban forms of behaviour.



 



Monday, 29 November 2010

The Conundrums of Assessment

To my chagrin, I recently realised that I have been assessing research proposals and grant applications for some twenty years, and have done so for most of the major humanities and social science funders in the UK, Europe and North America.  Over the last ten years I have also sat on numerous grant awarding panels, and helped to design the odd funding programme.  I have form in this particular small area of academic life.  So when a colleague faced with their first request of this sort asked me for advice about how to write an assessment, I felt obliged to offer some.  And since there is no independent body of advice about how the system works, or how the panels who arbitrate on the final decision use the reports drawn up by assessors, it seemed worthwhile posting that advice:

In the UK, funding bodies have made great efforts to train assessors and brief them on what is expected; and to all intents and purposes have created a system that seems transparent and clear, with apparently precise criteria laid out in straightforward prose.  A great deal of effort has also been put into the supporting documentation, in an attempt to ensure that assessments can be compared against one another, and that the process of decision making undertaken by the panels is speedy and uncontentious.  A real attempt has been made to eliminate special pleading, conflicts of interest, and the administratively perverse.  But, of course, all bureaucratic systems are also cultural systems, and there remain many unstated realities that effectively determine how a grant application, and the assessments written in response to it, are read by the people charged with eventually sifting the funded from the unfunded.

The single most important and determining factor that every assessor needs to keep in mind is that most funding programmes have a success rate of around 20-25% (some as low as 10%).  From a panel's perspective, four out of five applications must be rejected.  As a result, even small issues and problems in an otherwise exemplary application will be used to make a determination.  After reading perhaps thirty solid projects (and these days few applications are less than solid), and when faced with the need to find just five or six to fund, any panel, however well meaning or intelligent its individual members, will begin to reach for the smallest weakness.

As an assessor you need to be aware of this problem.  This does not mean that you simply laud your favoured project to the skies.  If you do, your assessment will be judged to be insufficiently critical, and therefore worthless.  Instead, it means that if you seriously think a project is likely to be better than 80% of the others, you need to act as a critical advocate, and to place yourself at the heart of the debate that the application you are assessing will inevitably generate.

There will always be one or two applications that sit on the top of the pile, and if you are assessing one of these you can give yourself the freedom to engage with the underlying ideas, and to simply discuss the project's importance for a wider field.  Even in this instance, you might want to suggest where small problems might exist, but have been effectively addressed.

But these few intellectually exciting and beautifully realised projects are rare, and are consistently funded.  All the debate will be focussed on the next tranche of projects, which usually comprises some 40% of the total.  In a panel meeting (and regardless of how they are organised) the grading schema inevitably breaks down, and this 40% of applications starts to bunch around the boundary grades.  Most panels give up on the whole-number grading the funders recommend, and end up using some form of 4.257, or 4-+(?) (if they are dominated by older academics from the Russell Group).  If you are asked to assess an application of this sort, the first decision you need to make is whether you think it should be funded; and having made that decision (assuming it is positive) you need to act as an advocate.

In other words, the first thing that any assessor needs to do is make an informed, over-arching judgement about the quality and importance of the application in front of them.  If you think the application you are assessing is compelling, although not so strong as to sit on the top of the pile, then you need to say so, and say why.  Alternatively,  if it is in the bottom 50%, for either technical or intellectual reasons (i.e. not exciting, or not practical, or just not well written) there is little point in expending your time struggling to find something good to say.  It will not affect the outcome, and it is likely to be fed back to the applicant in a way that just encourages them to resubmit something similar, instead of something better.

For myself, I start off with a basic question in my mind: am I really excited by the project?  Has it caught my imagination, and left me thinking that I would actually want to know the outcome?  Would I want to read the book?  Or, in a Knowledge Transfer context, would it make a significant social difference?


A minority of academic projects get past this hurdle.  As a result, the real problem comes with the next stage: while the intellectual case needs assessing (and if you are excited by the project, this is the easy bit), it is essentially all the things around it that will be used to exclude marginal projects.  If the project plan (with methodology and budgeting etc.) looks less than professional and doable, the application will be excluded on these grounds.  But if the fundamental idea is exciting and you decide you want to support it, regardless of its minor faults, you will need to deal with these issues directly and explicitly.  If you don't, the curmudgeon in the corner (and there is always one) will use these small issues to denigrate the project - generally as a stratagem to promote their own discipline, methodology or favoured project.  In this context, if you see a weakness, you need to address it directly and explain why it is not important to the success of the project.

My feeling is that you need to exercise an abstract academic and professional judgement and, in the round (regardless of the hoops and bureaucratic forms the funders want you to jump through), come to a conclusion about the worth of the project.  Once you do this, you are duty bound to do everything you can to ensure that the result is positive, in full recognition that most projects, including innumerable worthy ones, won't be funded.  This can lead to a rather instrumental and manipulative approach (which is a problem), but it at least has the advantage of allowing us to exercise the kind of judgement that is implied in peer review, even when the process feels like bureaucratic games playing.



Thursday, 18 November 2010

A Review to be published in the Economic History Review

Phil Withington, Society in early modern England: the vernacular origins of some powerful ideas. (Cambridge: Polity Press, 2010. Pp. xi + 298. 23 figs. 14 illus.  ISBN 9780745641300 Pbk. £16.99)

This volume does something new, remarkable and important.  It uses a quantitative approach to the evolution of sixteenth- and seventeenth-century language and word use as the basis for a re-analysis of the significance of corporatism and sociability in the creation of a ‘modern’, and more specifically ‘early modern’ society.  In the process, it attempts to re-integrate the economic and the cultural, the linguistic and the material.

Following an extended and sophisticated account of the development of the profession of economic and social history in Britain since the nineteenth century, organised around the evolution of the phrase ‘early modern’, Withington dives into an entirely innovative form of analysis.  The core of this study is a new mapping of the appearance of a series of key words in the titles of all the books that appear in the English Short Title Catalogue for the period up to 1700.  ‘Modern’, ‘Society’, ‘Company’, ‘Wit’, ‘Civil’, ‘Commonwealth’, among a host of related terms, have been trawled from the full title fields of the ESTC, and transformed into frequency graphs.  These graphs have then been used to illustrate, first, that terms signifying and labelling a specific kind of ‘modernity’ (based in a notion of what Withington terms the ‘sociable self’), became prominent in the latter half of the sixteenth century, and in particular during the 1570s and 80s.   And second, that terms like ‘commonwealth’, which referenced an older form of social ordering, went into relative decline (particularly after the Civil Wars).

Withington’s conclusions essentially re-enforce a growing consensus among historians about the importance of sociability and forms of corporatism in creating a transitional Res Publica, (in this context a kind of beer and skittles ‘public sphere’), that contributed to and resulted from both a newly decentralised but bureaucratic state, and the development of corporate capitalism (with a remarkably sociable scientific revolution thrown in for good measure).  In many respects, and in company with Keith Wrightson, Mike Braddick, Steve Hindle and Andy Wood, Withington is pushing back the origins of Jürgen Habermas’s ‘authentic public sphere’ from the 1690s to the 1570s, and attempting to articulate the relationship between ‘modernity’ (in both its statist and possessive individualist forms) and civic humanism.

As a description of early modern English and British culture, and the evolution of the state and the economy, this is entirely compelling.  Habermas’ chronology, based on coffee and newsprint, has always been suspect even if his overarching analysis of the role of public debate in the history of the nation state remains compelling.  More problematic is the methodology Withington uses to illustrate this new chronology.

As historians we are faced with an entirely new kind of evidence – mass digitised text - billions of words, retrievable through keyword searches.  To make sense of these new resources we need new tools; and this book is a laudable first attempt at creating precisely these.  Unfortunately the methodology used here is essentially unconvincing.  What appears on a title page of a book, on the colophon, and on the end papers, changed dramatically between the late fifteenth and the eighteenth centuries – the very nature of a book changed.  For all the heroic efforts of cataloguers and bibliographers to force early modern print objects into a single format, to tame their ‘slipperiness’, it is not credible to read the title of a book published in 1520 in the same way as one produced in 1690.  Nor is it necessary to take this aggressively reductionist approach.  Withington could, for instance, have translated the text of his titles into a formal corpus, and used the tools of quantitative linguistics to chart the rise and fall of his ‘keywords’ against more robust measures of textual density, variety and proximity.  By restricting himself to only the most basic statistical techniques, we are left with a series of graphs measuring change in a way that at first sight seems intuitively reasonable, indeed common-sensical, but which belies all the subtle complexity that historians have found in language through the many decades of the linguistic turn.  In essence, what Withington has produced is a clear and compelling narrative of the evolution of the ‘sociable self’, and an equally clear series of measures charting the development of the language of title pages, but he has not effectively related one to the other.

This will seem a harsh criticism, but it is not meant to be.  As a profession we are confronted with both the real challenge of dealing with massive electronic texts (produced in half a dozen different ways, and of hugely varying quality), and the need to create usable and intellectually credible tools that can both deal with words in their billions, and at the same time reflect our new understanding of the complexity hidden in a single phrase.  Dr Withington has taken an all-important first step in the direction of a new form of historical scholarship, and we should all look forward to the next.

University of Hertfordshire                                                                 TIM HITCHCOCK