This blog post is by Dave Lorimer, an LLM by Research candidate at the University of Aberdeen. In this post – the second this blog has hosted from a postgraduate student – he reflects on a recent interdisciplinary event at the University of Aberdeen.
The inaugural Granite Symposium on 25 April 2016 provided a good opportunity to present some postgraduate thoughts. The event allowed discussion of interdisciplinary issues, in particular where the social sciences meet technology, with a view to publication in a special edition of Granite Online Journal.
The keynote presentation by Prof Karen Kelsky was an enlightening review, albeit a very trans-Atlantic one, of where to go after ‘grad school’ and how to get there. Tricky questions for many of us. There was some thought-provoking advice about pacing the publication of research and how to tie it into an academic career.
For my part I presented on my latest numerical analysis of aspects of the criminal law. Having had an interesting and varied career as an engineer in the oil and gas industry, my early retirement (at a time when ‘bean-counting’, diminishing local reserves, standardization and ‘business modelling’ were taking much of the creative fun out of engineering) allowed me to pursue a life-long fascination with the law. It struck me in the course of my LLB that there were a number of analogous processes in the logic of the law and the logic of engineering. In fact, much of the work towards the end of my engineering career was primarily determined by a Law Lord.
The technical quality of Lord Cullen’s report on the likely chain of events leading to the Piper Alpha disaster would have made any experienced engineering professional proud. His review and recommendations on the use of Quantitative Risk Assessment (QRA) as a predictive tool in industrial safety management now resonate, for me at any rate, with a defeasible approach (as often used in artificial intelligence work) to assessing the ‘unknowable’ using the logic of numbers in a legal context. QRA predicts the likelihood of possible future events in order to identify the optimum approach to the best outcome – so why not seek to apply the same logic to the understanding of past events?

So far I’ve worked on four discrete numerical applications within the area of criminal law. I presented separately on the first three at Strathclyde and Aberdeen, namely:

- the risk-reducing nature of corroboration with respect to wrongful conviction;
- a numerical view of the criminal process as a chain of events; and
- an exculpatory assessment of defence witness reliability in a murder case study.

The Granite Symposium allowed the opportunity to present on the fourth: a numerical assessment of information transfer by witnesses at trial. Each time I present, not only do the ideas under consideration become better understood, but new facets are revealed in the process of receiving feedback from the audience, which ultimately strengthens and further develops the thinking.
Information transfer by witnesses at trial
At the heart of the witness accuracy model is some Enlightenment philosophy that I gleaned during the relatively brief study of McCoubrey and Whyte’s Jurisprudence as a law undergraduate, in conjunction with a curiosity about the history of philosophy as integrated with fiction by Jostein Gaarder and Robert Pirsig, plus a schoolboy interest in the writings of Jean-Paul Sartre. The Existentialist/Kantian view that what we see is not what we think we see, and that the real world is unknowable, is readily transferable to the analysis of the perceptive capacities of a jury. Information transfer from crime to witness to jury in the course of a trial may be broken down into seven stages, and High (> 95%), Medium (50% mid-range) and Low (< 5%) rates of information transfer accuracy can be applied at each stage to give an overview of the ultimate extent of ‘erosion of truth’ in the picture as perceived by the jury. The seven ‘Kantian’ stages can be broken down as follows:
Event occurs – Witness perceives – Witness reflects – Witness recalls – Witness testifies – Jury perceives – Jury reflects.
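The compounding effect of these stages can be illustrated with a short sketch. This is purely illustrative and not part of the model as presented: it assumes that a transfer-accuracy rate applies at each of the seven stages and that the rates combine multiplicatively, with the High/Medium/Low bands above standing in for whatever values an analyst might assign.

```python
# Illustrative sketch (an assumption, not the author's model): treat each of
# the seven stages as applying a transfer-accuracy rate, and compound the
# rates multiplicatively to estimate the overall 'erosion of truth'.

STAGES = [
    "Event occurs", "Witness perceives", "Witness reflects",
    "Witness recalls", "Witness testifies", "Jury perceives", "Jury reflects",
]

# Bands from the post: High (> 95%), Medium (50% mid-range), Low (< 5%).
RATES = {"high": 0.95, "medium": 0.50, "low": 0.05}

def cumulative_accuracy(stage_bands):
    """Multiply per-stage transfer rates to estimate overall accuracy."""
    total = 1.0
    for band in stage_bands:
        total *= RATES[band]
    return total

# Even uniformly 'high' accuracy erodes noticeably over seven stages:
print(round(cumulative_accuracy(["high"] * 7), 3))  # 0.95**7 ≈ 0.698
```

The point the sketch makes is the one drawn out in the post: even under the most optimistic assumption at every stage, roughly 30% of the original picture is lost by the time the jury reflects, and a single ‘low’ stage dominates everything else.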
The idea of ‘reflection’ after perception comes from David Hume (the philosopher, uncle of the institutional writer Baron David Hume) as do a number of other ‘Kantian’ concepts (the possibility that Kant believed he had a Scottish grandfather makes one wonder if he had read Burns too; ‘To see oursels as ithers see us’, ‘A man’s a man for a’ that’ and ‘That sense and worth o’er a’ the earth’ are well reflected in Kant’s moral philosophy; in this author’s humble view, they represent the Scotsman’s equivalent to Kant’s Categorical Imperative and its various formulations).
The surprising thing was that this ‘Kantian’ breakdown of stages generated the majority of the feedback after the Granite Symposium presentation, in particular from researchers variously asking about applications with regard to:
- advertising design development processes;
- studying the history of theology;
- computer science and the possibility of empirical work on the seven stages; and
- neuroscience research, raising the question of whether Magnetic Resonance Imaging (MRI) scanning of the brain (known as functional MRI or ‘fMRI’) could lead to real-time assessment of information transfer accuracy at any or each of the seven stages. The current size of MRI machines presents obvious practical issues, but the idea that one day witnesses (and even jury members?) could wear an MRI scanner as a hat may ultimately make polygraph (lie-detector) machines as obsolete and humour-inspiring as the wind-up gramophone.
The initial perception that this fourth and latest numerical application was of little more than scene-setting or background interest turned out to be wrong, at least as far as the Granite Symposium feedback is concerned. Some interesting ideas have been generated, and the strength of the analysis – as in many if not most numerical assessments of this type – is not necessarily in the final arbitrary or defeasible numbers generated. As with experience in numerical risk analysis in industrial projects, much of the real value is in the analytical process of categorising interrelated parameters and comparing a range of inputs from a cause-and-effect perspective, as well as in developing a deeper understanding of the overall process and identifying key issues and new ideas. This is at the heart of a reasoning process, and the numerical approach may be seen as a thread that binds or a link that chains – or even a kernel that continues to grow.
As far as academic presenting is concerned, any form of peer review – which includes presenting and discussing the issues with people of sound intellect and experience, plus any ‘digital dialectic (reasoning)’ such as blogging – also becomes part of the research and reasoning process. In fact, with respect to the research under consideration here, it seems at first thought perhaps a pity that jury members no longer get to question witnesses directly… but that’s another story.