Between April and August 2017, we worked with Digirati and Pleiade Management and Consultancy to scope the creation of an open access (OA) dashboard for Jisc. Our analysis showed that building an OA dashboard would be a high-risk undertaking, with significant uncertainties over both the cost of implementation and the value which could be realised for the higher education sector. This led us to recommend that the dashboard not be pursued for the time being.
- Why develop an OA Dashboard? Higher education institutions have access to a vast amount of information on their OA outputs, from Jisc services and other sources. However, this information is held in different systems, which are often incompatible with one another. This presents an opportunity for Jisc to support them by creating a dashboard to combine and visualise data in an easy-to-understand way. The rationale for this would be to improve institutional workflows by providing easier access to information on OA.
- We followed a three-phase approach to scope the creation of an OA dashboard, in which we analysed five alternative dashboard options, created a prototype for one of them, and considered the business case for further development:
- Phase 1 involved the definition and prioritisation of a series of use cases, based on inputs from (i) a workshop with representatives of UK institutions and (ii) additional stakeholder interviews.
- Phase 2 focused on the development of a prototype of Dashboard A and its testing by institutional users through online demonstrations.
- Phase 3 investigated the business case for further development of the dashboard prototype, taking account of feedback from institutions and research funders.
Phase 1 – User requirements of a Jisc OA dashboard
- An exploratory study led to the development of five possible dashboards/use cases along with possible data sources:
- Informing understanding of OA policy effects by monitoring authors’ uptake of OA options
- Informing Green and Gold OA policy effectiveness by monitoring the usage, citations, and altmetrics of OA articles in comparison with non-OA articles
- Informing article processing charge (APC) financial implications and offsetting deals
- Reporting on/accounting for OA policy compliance
- Supporting repository management by combining institutional repository statistics, subject repository statistics, and cost information on running the repository
- Input from institutional representatives helped us select two preferred dashboards/use cases: discussions of the possible use cases led to the prioritisation of Dashboard A (monitoring OA articles) and Dashboard B (effectiveness of OA policy).
Phase 2 – Technical features of a dashboard prototype and user testing
- Constraints in the data sources led to the disqualification of Dashboard B: When looking at the data needed for Dashboards A and B, it became clear that Dashboard B depends on the information gathered for Dashboard A. However, Dashboard B also relies on a further set of data sources, which previous work (as part of the Library Data Labs project) indicates are very difficult to combine. In consequence, Phase 2 involved work on a prototype of Dashboard A only.
- Three data sources were selected to build a Dashboard A prototype: Although no options were ideal for this purpose, we chose Crossref, oaDOI, and Sherpa/RoMEO to obtain data on the universe of publications, licence information and OA status, and publishers’ policies on copyright and self-archiving, respectively.
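To illustrate how data from these sources might be combined, the sketch below classifies a publication’s OA status from an oaDOI-style record. The field names (`is_oa`, `best_oa_location`, `host_type`) follow the shape of the oaDOI v2 API response; the sample record itself is invented for illustration, and this is a simplified sketch rather than the project’s actual implementation:

```python
def classify_oa(record):
    """Classify a publication's OA status from an oaDOI-style record.

    Returns 'gold/hybrid' if the best OA copy is hosted by the publisher,
    'green' if it sits in a repository, and 'closed' if no OA copy exists.
    """
    if not record.get("is_oa"):
        return "closed"
    best = record.get("best_oa_location") or {}
    host = best.get("host_type")
    if host == "publisher":
        return "gold/hybrid"
    if host == "repository":
        return "green"
    return "unknown"

# Invented sample record, mimicking the shape of an oaDOI v2 response
sample = {
    "doi": "10.1234/example.5678",
    "is_oa": True,
    "best_oa_location": {"host_type": "repository", "license": "cc-by"},
}

print(classify_oa(sample))  # classifies the sample as a green OA article
```

In a real pipeline, the record would be fetched per DOI (with the DOIs themselves drawn from Crossref), and Sherpa/RoMEO would supply the publisher policy context.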
- Important issues in the data sources were highlighted: Our investigation showed that delivering both Dashboards A and B would require significant effort, with the need to obtain and normalise a large amount of information. Other key issues identified included:
- The ‘universe of publications’ for UK HEIs is very difficult to source from open data sources, which affects the completeness of the dashboard.
- None of the data sources selected were fully fit for use in a dashboard, due to intrinsic limitations. These arose from issues with their APIs, incompleteness of the information provided, and data quality.
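One routine part of the normalisation effort described above is reconciling identifiers that appear in different forms across sources. The helper below (the name `normalise_doi` is ours, for illustration) shows the kind of canonicalisation involved; DOIs are case-insensitive, so lowercasing is safe for matching:

```python
def normalise_doi(raw):
    """Reduce the various ways a DOI appears in source data to one form.

    Strips 'https://doi.org/', 'http://dx.doi.org/' and 'doi:' prefixes,
    trims whitespace, and lowercases (DOI matching is case-insensitive).
    """
    doi = raw.strip()
    for prefix in ("https://doi.org/", "http://doi.org/",
                   "http://dx.doi.org/", "doi:"):
        if doi.lower().startswith(prefix):
            doi = doi[len(prefix):]
            break
    return doi.lower()

print(normalise_doi("https://doi.org/10.1234/ABC.567"))  # 10.1234/abc.567
```

Even with identifiers aligned, the issues above (API limitations, incomplete records) mean that records often fail to match for reasons normalisation cannot fix.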
- Prospective institutional users appreciated the Dashboard A prototype but highlighted possible issues with data: While the dashboard prototype was seen as attractive and intuitive, concerns were raised over data quality and coverage. Profiling against other institutions was considered the most important feature of the prototype, followed by the availability of DOIs and the chance to measure the citation advantage of OA.
Phase 3 – Business case development
- We asked stakeholders to identify the value proposition of the Dashboard A prototype: The most valuable use cases highlighted were support with monitoring REF compliance, profiling against other institutions, and identifying items which ‘could have been green OA’.
- Interviewees struggled to quantify the dashboard’s contribution to streamlining workflows: Institutional users noted that the data shown in the Dashboard A prototype provide additional insights rather than replacing existing activities; they therefore felt that time savings would likely be limited.
- Building the foundations for a business case proved challenging: In addition to the limited efficiency added to institutional workflows highlighted above, the following observations were raised:
- Data would need to be more comprehensive and should be more robust, which indicates that proprietary sources might be required.
- Usage of the dashboard would likely be ad hoc, rather than regular.
- The dashboard has limited value as a standalone service and could be better marketed if it were embedded with other Jisc services.
- Funders have existing mechanisms to obtain some of the information presented in the prototype dashboard, and are pursuing other developments (e.g. via EuropePMC, Researchfish) to address known gaps.
Conclusions and recommendations
- We reached the conclusion that a full business case cannot be built at this time: The strength of the available evidence is, on average, low, and does not enable a strong case for further investment to be made.
- Although there is a gap in terms of analysing data on OA, open data sources are not mature enough to power a dashboard: Institutions wish to have better data on OA and its benefits; however, a dashboard with the features discussed in this report would not provide sufficiently robust evidence. The low quality and maturity of existing data sources would likely undermine the validity of dashboard outputs.
- Evidence indicates that an OA dashboard should not be pursued at the present time. We recommend that the project be put on hold and re-evaluated in the future. Meanwhile, Jisc could seek to improve the quality and availability of data sources to enable future efforts, by:
- Developing a comprehensive, open-source record of UK HEIs’ publication output;
- Ensuring that the terms and conditions for existing Jisc services permit re-use of relevant data in future services;
- Promoting greater uptake of institutional identifiers within key data sources;
- Continuing its support for ORCID;
- Improving internal consistency of Jisc data sources;
- Extending the Research Data Shared Service (RDSS) data model to include a Resource Type profile for a journal article; and
- Rebuilding the data model and API for SHERPA services.