This post is part of our series of summaries of works in progress presented at the 6th ADRRN Roundtable held in Dunedin in December 2017.
The aim of my project is to identify hidden knowledge about methods that do and don’t work in capturing data from end users (clients) of dispute resolution processes. If we are serious about measuring the quality, effect, and experience of DR, then we need to gather data from the people for whom these services are provided. There are, however, many challenges to gathering that data, which I explored in the earlier stage of my research project at the 2016 Roundtable. Some of these challenges are related to the DR process itself – an often stressful experience that clients may be (a) unwilling to speak about for research purposes or (b) unable to reflect upon dispassionately. There are also ethical barriers, reliance upon third parties to gather data, and, for service providers who gather data routinely, often limited resources to analyse that data systematically.
At the 2017 ADRRN Roundtable I reported the next stage of my project. Since the 2016 Roundtable I have engaged a research assistant to conduct a systematic literature review, reworked my proposed interview questions, and conducted some pilot interviews with DR researchers about their experiences gathering data from end users / clients. This post will focus upon what I learnt through the attempt to conduct a systematic literature review.
An account of a systematic literature review
I set out to gather relevant literature through a systematic approach designed to capture research conducted in Australia that involved gathering data from end users / clients of DR processes. I enlisted the assistance of my law librarian and met with her and my research assistant to design the systematic literature review. A variety of databases and search terms were used. I wanted to have confidence that this approach would identify all of the relevant research reports that already exist. The main problem I have faced is that the “systematic” review simply hasn’t identified all of the relevant literature. Some resources that I identified earlier in the project (through non-systematic searching), and that are also reported in the appendix of Tania Sourdin’s Alternative Dispute Resolution (Thomson Reuters, 5th ed, 2016), were not captured. My research assistant found that the search terms we had planned often failed to limit results to material meeting our criteria of Australian research in the DR area that included data gathered from clients / end users. He spent a lot of time wading through material that did not meet our research criteria. In the end, the review identified 43 relevant reports of research.
On reflection, there are a number of possible reasons why the “systematic” review conducted in accordance with the conventions of traditional legal academic research has not achieved the result that I hoped to achieve.
- The database searches privileged peer-reviewed journal articles. Not all DR research involving data from clients / end users is published in peer-reviewed journal articles. It is likely that most of the data gathered from DR clients / end users is not gathered or analysed by academics. Non-academics are unlikely to be motivated to publish in peer-reviewed journals, which mostly sit behind paywalls. Instead, open access self-publication, reports to funders, and internal reporting are likely to be frequent destinations for research. These kinds of publications were not captured by the systematic review.
- There is possibly a wealth of client / end user data being collected, but much of it is either not analysed at all or analysed only for confidential purposes. Most service providers conduct research to capture feedback from their clients. These data may never be systematically analysed, and even when analysis occurs, there may be no public output from the research. “In house” evaluations may be conducted for purposes of quality assurance, reflective practice, and performance management. These purposes are not enhanced by making research results publicly available, and commercial interests may be compromised by publishing client feedback data.
- Even where DR research is published in peer-reviewed journals, there are few discipline-specific publication destinations (particularly ones considered by universities to be prestigious), resulting in a scattering of publications. It may be difficult to locate relevant literature because DR researchers publish across a broad spectrum of outlets. Each journal has its own preferences in relation to the reporting of research method, language, and style. This could have affected the ability of the systematic approach to capture all relevant literature.
The purpose of locating existing research reports was so that I could review the methods researchers have used to recruit DR clients / end users and to capture data from them. In the literature I have identified so far, although relevant data were used as a foundation for the findings reported, the methods of recruiting participants and capturing data were not always explained. This possibly reflects the tradition in legal research of not reporting methods clearly, and the preferred style of some journals, which have strict word limits and may not value detailed accounts of research method. Often research reported in peer-reviewed journal articles is reported in greater detail in non-peer-reviewed reports; these were not always readily available when I tried to locate them.
It is clear that there is a vast amount of grey literature available that is not necessarily captured through subscribed databases. My next steps will involve new search strategies that will capture a broader range of literature. My pilot interviews and interviews with DR researchers about their experiences capturing data from end users will also be an opportunity to identify research reports that may not have come to my attention through my searches.
My reflections on the data that remain hidden within organisations have caused me to wonder how those of us in academia can better engage with industry. DR service providers are often able to achieve very high response rates that are difficult for independent researchers to match. I believe there are opportunities for academic researchers to build better working partnerships with industry, with all parties exploring the skills and resources that they can offer one another.
The experience has also highlighted the need for a comprehensive, well-funded clearing house of DR research reports, which would provide a portal through which prior DR research could be more readily located. A significant initial investment would need to be followed by funding for ongoing maintenance, but there could be great benefits for clients, practitioners, organisations, and researchers working in the DR field. I am percolating ideas about how to pursue this and would welcome any offers of assistance.