Contained components


Post by Richard Wincewicz, Software Engineer for SafeNet at EDINA.

In the previous post we saw an overview of the whole SafeNet project. In this post I wanted to dig a little deeper into the technical side of the project.

Key Components

We are currently developing the SafeNet Service Interface component by extending the LOCKSS software (http://www.lockss.org/), a software platform which allows libraries to store and provide access to locally managed copies of electronic content such as e-journals. The LOCKSS software was originally designed to work from within an institution’s network and provide access only to users that are part of that network. A key component of the SafeNet service is to introduce a centrally-managed Private LOCKSS Network that can be used by UK HE institutions to provide assurances of continuing access to their subscribed content, without having to run a server locally. The SafeNet model will allow institutions to participate in a shared service offering but there are a number of challenges that need to be addressed for this to work at scale.

The first challenge is determining who can have access to what content. In standard LOCKSS deployments, access is restricted by IP ranges (e.g. the university network) and so all users can access the same content. With a centrally-managed service this is no longer the case and we need a mechanism to ensure that a user is entitled to access the content that they request. For this purpose, we are designing and deploying an Entitlement Registry which holds information about the subscriptions that institutions have for specific journals. The Entitlement Registry provides a REST API that allows a user or application to query its database. Some of this data may be made openly available, such as lists of publishers and titles, and some of it will be restricted, such as the journals that an institution is subscribed to. We are extending the LOCKSS software to include a query to the Entitlement Registry whenever a user requests journal content. As a prerequisite, a user will be required to identify themselves by logging in and providing the LOCKSS software with identifying information about their institution. Using this information, we can then determine whether a user is entitled to access the requested content.
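
To make the access check concrete, here is a minimal sketch of what a client-side query against the Entitlement Registry might look like. The endpoint URL, query parameters and response shape below are illustrative assumptions, not the registry's actual API.

```python
# Minimal sketch of an entitlement check against a hypothetical
# Entitlement Registry endpoint; URL, parameters and response format
# are assumptions for illustration only.
import requests

REGISTRY_URL = "https://entitlement-registry.example.ac.uk/api/entitlements"  # hypothetical

def is_entitled(institution_id: str, journal_issn: str, year: int) -> bool:
    """Ask the registry whether an institution may access a given journal year.
    Returns False on any error so that access fails closed."""
    try:
        resp = requests.get(
            REGISTRY_URL,
            params={"institution": institution_id, "issn": journal_issn, "year": year},
            timeout=5,
        )
        resp.raise_for_status()
    except requests.RequestException:
        return False
    # Assume the registry answers with JSON such as {"entitled": true}
    return bool(resp.json().get("entitled", False))

# The LOCKSS layer would call something like this after the user has
# identified their institution, before serving the requested content.
```

In practice such a check would sit between the user's login step and content delivery, so a failed or unanswered query results in access being refused rather than granted.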

The Entitlement Registry has broader potential value as a reference tool used by both libraries and publishers. To this end, we are also designing a user interface on top of the Entitlement Registry that allows users to interact with the entitlement data. Users will be able to view general information about titles and publishers as well as entitlement information specific to their institution. In addition to this, we are assessing use cases for external access to the Entitlement Registry API, so that other applications can make use of the data without having to collect and host it themselves.

Deployment Infrastructure

If a service is going to be successful it first needs to be reliable and responsive. These are two distinct requirements, but they can be addressed with similar approaches. Having redundant copies of a service in different locations allows one site to fail while users continue to access the service from the second site. This also helps when dealing with heavy traffic because there are now two servers able to handle requests. This approach works well up to a point, but if different parts of a service all start to require large amounts of resources then creating more copies of the whole service doesn’t help.

At this stage the architecture of the service becomes important. If the service consists of a single application then the only way to deal with increased load is to run the application on a more powerful server. If the service comprises many small applications that communicate with each other then copies of these components can be created independently of each other. This leads to much greater flexibility and allows the service to handle hardware and software failures as well as heavy traffic.

With the different components we wanted to make sure that each could run efficiently, scale well and be updated without disruption to the service. To achieve this we have built each component separately and given it its own environment to run in. Using Docker (https://www.docker.com/), each component runs in a container that isolates it from the rest of the processes running on the server. This means that we can have many different components running in the same place without worrying about how they will affect each other.

Using Docker also gives us a portable object, meaning we can create as many identical copies of a component as needed to provide resilience or to deal with load. These portable objects can be started and stopped very quickly, allowing us to deal with failed components or manage updates without affecting the running service.
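
As a rough illustration of how cheap it is to start and replace these portable objects, the sketch below uses the Docker SDK for Python; the image and container names are placeholders rather than our actual deployment configuration.

```python
# Sketch: starting and replacing identical copies of a containerised
# component with the Docker SDK for Python (pip install docker).
# The image and container names are placeholders.
import docker

client = docker.from_env()

def start_replicas(image: str, count: int = 2):
    """Start `count` identical containers from the same image."""
    return [
        client.containers.run(
            image,
            name=f"safenet-component-{i}",   # hypothetical naming scheme
            detach=True,
            restart_policy={"Name": "on-failure"},
        )
        for i in range(count)
    ]

def replace_replica(container, image: str):
    """Stop and remove a failed or outdated replica, then start a fresh one;
    because containers start in seconds, updates need not interrupt the service."""
    name = container.name
    container.stop()
    container.remove()
    return client.containers.run(image, name=name, detach=True)

replicas = start_replicas("example/safenet-component:latest")
```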

For this model to be successful we had to put some thought into the design of the components so that they work under these conditions. In particular, all of the information that an application uses is stored in an external database. Only a minimal amount of data is stored with the component, so it can be shut down or restarted without having to worry about what will happen to the data. At EDINA we are lucky enough to have access to two datacentres, meaning each of our services is spread across two sites. A load balancer takes each request and passes it on to an available server in one of the datacentres. If one of the servers is down then all requests are passed to the other available servers, ensuring that the service remains accessible.
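
The practical upshot for component design is that nothing a replica needs lives inside its container. A minimal sketch of that pattern is below, assuming a PostgreSQL database and environment-variable configuration; both the variable names and the schema are invented for illustration.

```python
# Sketch: a stateless component reads its configuration from the environment
# and keeps all data in an external database, so any replica behind the
# load balancer is interchangeable. Variable names and schema are invented.
import os
import psycopg2

def get_connection():
    return psycopg2.connect(
        host=os.environ["DB_HOST"],       # external database, shared by all replicas
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
    )

def titles_for_publisher(publisher: str):
    """Read-only lookup; the replica running this code can be stopped or
    replaced at any time because no data lives on it."""
    with get_connection() as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT title FROM journals WHERE publisher = %s", (publisher,))
            return [row[0] for row in cur.fetchall()]
```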

Now that we have the basic structure of the service set up, it is important that we continue to develop it in a way that maintains this reliability and resilience. Docker makes it easier to rapidly deploy multiple copies of an application in different locations, but it brings its own complexities. The goal now is to use Docker to make our lives easier rather than more complicated.

What we talk about when we talk about SafeNet

[Image: roads]

July 2015 marks the halfway point for the SafeNet project. A lot of progress has been made towards developing a service that will provide value to the HE community. As we look forward to the next reporting phase, one which involves significant outreach and negotiation efforts, our attention has focused on the need to produce a clear model of the service proposition in terms of its infrastructural components and stakeholders, with a demonstration of how those aspects will function and inter-relate.

In our last blog post, we introduced project personas that emerged via discussions with UK HE librarians. Those discussions regularly explored the issues around post cancellation access in close detail; however, interviewees found it harder to identify the shape of the tool or service that would address these problems. It was clear from these discussions that a more visual approach would be beneficial in explaining how the SafeNet service will provide content and how the components will work together to create a cohesive whole.

A recent work package has focused on the legal agreements required by the emergent SafeNet service, specifically in defining the publisher participation agreement that would underpin the supply and deposit of publisher content. The participation agreement outlines the commitments and responsibilities of those involved in supplying material and those operating the service. Along with these responsibilities, the agreement outlines the individual elements of the proposed service and the relationships of the main actors to the final product.

To this end the project team have spent time defining and illustrating who will do what, and why they will do it, as participants in the service. The diagram below visualises the service components and, at a high level, clarifies the responsibilities of the stakeholders involved in the project:

[Diagram: SafeNet service components and stakeholder responsibilities]

This is a simplified diagram to show the high-level relationships and interactions. We can see, for example, the project responsibilities of EDINA and Jisc and their anticipated responsibilities once in service mode. We will be refining this model and adding further details where relevant to assist with production of a toolkit that will be used to aid negotiation and promotion, describing how the service works in practice.

That said, some of the above components are well defined at this stage and some require further work and investigation. For example, while the responsibility for the service components and operation lies with EDINA, Jisc Collections will deal with publisher negotiations building on their considerable experience in this area. Publishers will provide the e-journal content archived nationally using a private LOCKSS network (PLN). The publisher will always remain the preferred supplier of access, and in the event that content from SafeNet is accessed the service will provide usage information back to the publisher.

The diagram also shows, in red, those components EDINA will manage, including two of the PLN nodes which are complemented by four co-located nodes. Establishing this national infrastructure and formalising the agreements to support this is something that will be progressed in the coming months.

Methods for gathering entitlement information are being closely examined at the moment. We hope to convene a second community meeting in the coming months to discuss approaches and consider challenges. The focus in developing the entitlement registry is currently centred on considering data sources and assessing the quality of information available. The KB+ team — Magaly Bascones in particular — have been instrumental in assisting our progress with this. The SafeNet project is also grateful to KB+ users at the universities of Huddersfield, Newcastle, East Anglia and Cambridge for access to their KB+ test profiles as we investigate the possibility of reusing information held there.

As we reach the halfway point the roadmap above shows where the project is headed. Upcoming landmarks include drafting service level definitions, testing data ingest and integrating components into the broader service architecture as shown above. There’s another year to navigate through with plenty of challenging diversions along the way.

Interviews, personas & perpetual access pain points. Oh my!

To help the SafeNet project team gain a better understanding of user needs, exploratory interviews were carried out with 19 serials librarians between the 19th of January and the 6th of March 2015.

As reported in an earlier post, Jisc Collections carried out a survey of its membership in order to understand the post cancellation access needs of the UK HE community. Part of the drive to do this came from earlier consultations with selected NESLi2 publishers. During these discussions it was clear that, for there to be buy-in from the publishing community, demand for a service based on the SafeNet project would need to be demonstrated by the UK HE library community.

The interviews also provided the basis for the identification of 9 distinct personas. The personas do not represent real individuals; they are composites of common themes identified during the interviews. These will be used to assist the project team in being mindful of the audience for the final service and identifying their needs in relation to perpetual access.

The interviews provided an opportunity to find out what concerns and particular pain points librarians experience in relation to perpetual access. The earlier PECAN project identified that, despite existing external service providers offering long-term digital preservation services, there are still concerns within the community about continuing access, and that arrangements require further improvement and investment. The SafeNet project team specifically wanted to explore user needs in relation to these concerns and how the potential service components of SafeNet could address these issues.

Key Findings

All interviewees noted that their main goal was for users to experience seamless access to content and, should access be lost, to rectify it as quickly as possible. The quantity of journal content available electronically to libraries means that proactively checking access to all subscribed material is not practical. Library staff reported that they have to be reactive to access issues when notified by users, and this can give a poor impression of their service. Many interviewees indicated that users don’t always see the distinction between the library catalogue and the content provider, which is often reflected in NSS and LibQUAL surveys.

Institutional engagement with the issue of post cancellation access (PCA) varied. In some cases a ‘belt and braces’ approach had been taken with institutions participating in both LOCKSS and Portico. In others there were no library-side arrangements and PCA was left to publisher provision.

Record keeping in relation to entitlements also varied. There were similarities in terms of storing physical and digital copies of licences but strategies for making this information usefully available ranged from using the library management system to spreadsheets to nothing at all.

A common theme throughout the interviews was the time constraints library staff face. It was not uncommon for interviewees to report that correspondence with publishers was often protracted and required significant investment of time to provide evidence to support assertions. Again, record keeping was an issue here. In one specific case it was reported that entitlement claims were not pursued because the library was unlikely to have the evidence to hand and the staff time spent investigating the loss of access would outweigh the cost of an inter-library loan.

Several interviewees reported that assurance of PCA was most pressing when moving from a print and electronic subscription to e-only. The SafeNet service, offering a level of national resilience for content, was viewed positively in this scenario as it was common for interviewees to report that they continued to receive print journals in conjunction with the e-version to act as an archive should the subscription be cancelled and electronic access lost. Many stated that these print copies were never made available to users. The proposed SafeNet archive was also welcomed by librarians who had experience of PCA clauses being fulfilled on CD-ROMs or hard drives but who lacked the local infrastructure to provide access to this content for their users.

Overall it was clear from the discussions that there was enthusiasm from librarians about the SafeNet project. The prospect of a service that would save time and provide a centralised, authoritative source of entitlements, should access (either current or post cancellation) become a problem, was viewed positively. The national infrastructure was seen as an extremely useful step on the road to providing more robust perpetual access to content which had been paid for.

The persona document is available for download.

If you have any comments or feedback please contact us at edina@ed.ac.uk

SafeNet: Nine months on

The SafeNet project has been officially underway for around 9 months. As SafeNet has begun to take shape, so too has Project Manager Adam Rusbridge’s son, who emerged into the world three weeks ago. The first project baby but, with another 15 months to go, there’s no guarantee he’ll be the last. Congratulations to you all; the gauntlet has been thrown down to the rest of the team.

The team have been productive in other ways since our last post on project activity. In January the SafeNet project group met at the Jisc offices in London for a face-to-face meeting that included colleagues from Jisc and EDINA as well as contributors to the project from RLUK and Stanford University.

The group converged to discuss work carried out and planning for the future. The team reviewed project activity that included, at that stage, consultations with publishers and the beginning of consultations with libraries around the pain points of post cancellation access. Consideration was also given to access triggers, content scope, community development and the eventual negotiations with publishers regarding the intended local load agreement.

Aims for the six month period following the meeting up to our next face to face in July 2015 included drafting and testing a publisher participation agreement for the service, planning the service infrastructure, and developing community engagement. These elements would be addressed in tandem with the practicalities of building a service platform.

The publisher participation agreement is in the final stages of revision and should be ready by July 2015 as planned. Setting up the service infrastructure is progressing and we are investigating options for hosting and co-location. In terms of community outreach, the first meeting of the advisory group took place in York, and we aim to take advantage of the input the group have to offer to ensure the resulting service meets the needs of the community.

Development of the Entitlement Registry has progressed. The Entitlement Registry now has a user interface which will be tested and refined over the coming months. Publisher and library test data has been kindly supplied for testing and Magaly Bascones of the KB+ service has been very helpful, providing insights into data held for NESLi2 deals. This data will form the basis for initial testing.

Finally, SafeNet has attracted international attention, resulting in conversations with both German and Italian colleagues who are also exploring the national hosting problem space. More information on these and similar initiatives will feature in a future post.

All Aboard: SafeNet Workshop, York, 25/3/15


The inaugural meeting for prospective members of the SafeNet Community Advisory Group took place at the National Railway Museum in York at the end of March. The CAG will provide guidance on community priorities and workflows as the project progresses to assist in the design of a valuable service.

John McColl (RLUK Chair and University Librarian, St Andrews) introduced the day by outlining the changes that have occurred as the shift from print to electronic journal content has become more prevalent. John spoke of the need for SafeNet within the higher education community as libraries increasingly find that they no longer retain the kind of archival access that physical material traditionally gave to readers.

Members of the SafeNet team provided overviews on the origins of SafeNet, project activity and current thinking about several issues in the problem space. Lorraine Estelle (CEO Jisc Collections) gave an insight into the involvement of Jisc Collections and the approaches they will take when negotiating with publishers to create a national archive of content.

In and around these presentations the group engaged in discussions about the project in relation to the community and their experience of the issues. Some of the key talking points are summarised below. The contributions from members of the group will prove valuable in meeting the needs of the community as the project moves forward.

If you would like more information about SafeNet or have an interest in contributing to this group please contact the project team.

SafeNet in general

There was a great deal of enthusiasm for the project, with vocal support for a solution to the problem of post cancellation access. The group provided insight into the day-to-day realities of resource management and were unafraid to pose provocative and challenging questions to the project team about larger issues. Discussion ranged from post cancellation access to related problems that a service based on SafeNet could attempt to address. John McColl argued for keeping the aim of SafeNet discrete, explaining that in trying to address a wide variety of problems it may well achieve nothing. Members of the group agreed that the tight focus of the project was a good thing because it is more likely to succeed in its stated aim.

There was a strong sense from the group that, while SafeNet is undoubtedly a welcome addition to the suite of Jisc library services, they would like to see explicit links with other siloed solutions like KB+ and JUSP. This was seen as very important because, for any new service that requires library data, users do not want to replicate information already held elsewhere.

Core titles, data sources & assertions

Members of the group highlighted that it may be difficult for libraries and publishers to determine which titles were core at what time. This related to a discussion around data sources for populating the Entitlement Registry in SafeNet.

Current thinking around data gathering begins with publisher data and then, if that is unavailable, looks to library assertions already held elsewhere (e.g. KB+) before moving to library-held data (e.g. catalogues or local records of entitlement). The group felt that there was likely to be a great deal of diversity in how well documented the subscription history of an institution is, in terms of both library and publisher records, and in how much effort will be required to retrieve this information.

Participants agreed that while it may be possible to recover historical data through data archaeology, the result is certain to be, in some cases, partial and the cost of recovering it very high. Again, this is likely to be the case for both libraries and publishers.

It was agreed that it would be important to separate collection of current subscriptions and renewal data from historical data. The collection and verification of historical information is likely to present a substantial challenge during the early stages of the resulting service.

Future entitlement data should present less of an issue. For future entitlement information SafeNet aims to establish an agreed process with publishers. Entitlement information would be provided by publishers, and possibly fed to multiple places, without the need for library supplied data.

The appropriate copy problem for PCA

Briefly, this refers to the fact that content for a journal can be served by a variety of providers. For example, current content may be served by the publisher while backfiles are served by an aggregator. Users looking for content from outside the library’s online environment will not necessarily find the appropriate copy. The route to SafeNet content will be seamless for some users – those in the library environment – but not for others.

It was agreed that this was a common issue which affects various library resources and is unlikely to be something that SafeNet can solve over the lifetime of the project. However, it is something for the project team to consider in relation to proposed workflows and how those reflect real world user interactions.

Two tails – the long and the short of it

In creating a national archive of content the initial intention is for Jisc Collections to approach publishers involved in the NESLi2 deals. The licences for these deals already include clauses on post cancellation access; members of the group agreed that using SafeNet to ensure compliance was a positive step. However, there was concern for publishers not involved in NESLi deals and, in particular, more specialist or foreign publishers with whom libraries have to negotiate post cancellation access themselves.

This reiterates an important aspect of SafeNet: post cancellation access is a title-level concern rather than an issue specific to a publisher or subject area. The group were quick to point out that the content most at risk, and in some cases most important to their users, can be specialised titles from small publishers that comprise the long tail of the problem. However, the publishers involved with NESLi deals provide a promising starting point for SafeNet to build from because of the existing licence clauses.

Next Steps

It is anticipated that the group will meet 3-4 more times over the remainder of the project. Members of the group and additional nominees will also be involved in a separate Entitlement Registry Development Group with a similar frequency of meetings.

Postcards from the Router

 


An exciting feature of the Jisc Publications Router is the built-in notification system.

Because the Router is able to parse publisher data to identify an author’s home institution, it can then determine the institutional repository or repositories the output should be deposited in. The postcard notification system will provide anyone who registers with daily updates on new content for the repositories they select to track.

Access to the postcard notification system requires a My EDINA account. Once registered you can look up the repository or repositories you’d like to receive notifications for and choose the format for them to appear in:

[Screenshot: postcard registration form]

Sign up for the Router’s postcard notification system at the following URL:

http://broker.edina.ac.uk/cgi/postcard_registration

It is possible to view content already held in the Router using the institutional or target repository browse features. Postcards will notify you of any new content added in the future.

Repository managers can use the postcard system to see what content is being provided to the Router for their institution. It is then possible to harvest open access content from the Router or sign up to receive a feed directly into the repository. Transferred content can be added at any stage of the repository workflow: to a specific user profile or the review file as well as the live archive.

For more information about the Jisc Publications Router contact the EDINA Helpdesk at Edina@ed.ac.uk.

New skin for the old Router

[Screenshot: the new Jisc Publications Router interface]

As discussed in our previous post, the Jisc Publications Router has had a long and interesting history as a project. Now, as it moves into a new life as a beta service, the team have been working to spruce things up.

All existing features are still available; documentation for suppliers and consumers can be viewed and users can still browse all content by organisation or repository. The browse views have been tidied up and the citations are now significantly clearer:

[Screenshot: citation display]

Repository managers can sign up for the postcard notification service, and the stats wheel (shown below), displaying a visualisation of the data, is still as mesmerising and clickable as ever.

[Image: stats wheel]

 

The service the Router provides remains the same regardless of how it looks. The Router parses the metadata of an article to determine the appropriate target repositories and transfers the publication to the registered repositories. It minimises efforts on behalf of potential depositors while maximising distribution and exposure of research outputs.

For more information about interacting with the Publications Router please contact the Edina Helpdesk or email Edina@ed.ac.uk.

See also: update from Jisc on their plans for OA services including support for the Router.

A challenging project but an essential one!


Guest post by Lorraine Estelle, CEO of Jisc Collections. Lorraine is Executive Director of Jisc Digital Resources and Divisional CEO of Jisc Collections, overseeing all of Jisc’s digital content and discovery related people, organisations, strategy, services and operations. Among her many successes at Jisc Collections, Lorraine was instrumental in setting up NESLi2 and devising a national consortium with an opt-in model. Lorraine sits on the EDINA management board and has been a member of the SafeNet project team since inception.

I can think of no other asset that an academic institution buys, but of which it has neither physical possession nor a recognised certificate of ownership. Electronic journals are unique in this respect. Academic libraries subscribe, at the cost of millions of pounds each year, to electronic journals under licences that grant them perpetual rights. This system works well provided that the publisher remains in business and the library continues to renew its journal subscription every year.

If an academic library is forced (usually through lack of funds) to cancel a subscription, the problem arises of how its users continue to have online access to the previously acquired journals, given that the content is generally only accessible behind paywalls on publishers’ websites.

Some publishers provide explicit information about such an occurrence and, for example, state that they will make a per-download charge for access to journals post cancellation. These proposed charges are equivalent to around one tenth of the current subscription charge. Other publishers are silent on this issue, meaning that a library cancelling a subscription would be required to enter into a negotiation with the publisher to agree an affordable access fee.

This situation is further complicated because an institution will typically only have perpetual rights to some of the journal titles in each publisher’s collection. In order to gain access, the library must claim its rights to the issues of journal titles to which it historically subscribed. The Entitlement Registry project, run by Jisc Collections in 2011, demonstrated how complex and time consuming these claims can be. Very often, library records and publishers’ records of entitlement do not agree. This is exacerbated when the publication of a journal title has transferred from one publisher to another, or when one publisher has acquired another and entitlement records are kept on different and often out-of-date legacy systems.

It is this messy landscape which the SafeNet project seeks to address, by building a nationally managed digital archive of journal content and a registry of entitlement. It will provide access to those UK academic institutions which have bought perpetual rights, following a number of trigger events, one of which is post-cancellation access.

Some may question why a national solution is required when global digital and archival solutions already exist. There are indeed some excellent technical solutions, but none quite meets the needs of UK academic institutions in the way that SafeNet will.  One such solution requires payment of annual fees (which may be unaffordable in an economic environment which forces the need to cancel journal subscriptions). CLOCKSS is a successful global solution, but one which does not allow for post-cancellation access. LOCKSS is another excellent solution, but one which is arduous for libraries to maintain. None of these solutions provides a registry of entitlement.

Our vision for SafeNet is that it will be a highly dependable and robust part of the national academic infrastructure. It will be a challenging project, not only from the technical perspective, but because publishers will be required to agree that SafeNet can load and preserve their content. The project team will need to advocate that a national academic archival solution is necessary to safeguard continued access to the journal content purchased by UK libraries. We will need to demonstrate to publishers that there is customer demand for such a service; and that the technical and governance structures of SafeNet will ensure access to each issue of a journal is only ever given to users in institutions that paid for it.

A challenging project but an essential one! The financial future is difficult to predict and a safety net is required in the event of severe economic pressures that would force UK academic libraries to cancel journal subscriptions. Jisc and EDINA, as trusted, non-commercial organisations, are well placed to safeguard the scholarly content in which academic libraries have so heavily invested.

eLife supports the Jisc Publications Router


 

The Publications Router team are delighted to welcome eLife as the first publisher to provide content for distribution via the Router.

The UK’s Research Excellence Framework Policy for Open Access requires that authors’ final peer-reviewed manuscripts be deposited in an institutional or subject repository within three months of acceptance for publication. The policy applies to research outputs (such as journal articles and conference proceedings) accepted for publication after 1 April 2016.

As an open-access publisher that makes articles immediately available online, eLife complies with this requirement by delivering all content at the point of final publication to PubMed Central (PMC) and, via PMC, Europe PMC. However, to ease compliance for individual UK institutions with the REF policy, eLife is going a step further and linking up with the Jisc Publications Router.

On joining, Melissa Harrison, eLife’s Head of production, stated:

The process to set up and supply our archive of content through the Jisc Publications Router was simple and involved minimal time and effort. We would be happy to support other publishers and institutions as they seek to become part of this important initiative. We’re pleased to support the Jisc Publications Router as an important step in facilitating compliance with the UK open-access policy, in particular, and in extending the infrastructure for open access in general.

For eLife, the Publications Router helps to:

  • Push content to further end points
  • Help content reach institutional repositories as soon as it is published in final format, faster than it’s available elsewhere, aside from the eLife site
  • Make available all formats of the content, including final typeset PDF, all figures, videos and supplementary files, and JATS XML
  • Support institutional compliance with the REF policy through just one relationship

To discuss eLife’s experience with the Router, email eLife’s Head of production, Melissa Harrison, at m.harrison@elifesciences.org.

For more information about the Jisc Publications Router  contact the EDINA Helpdesk at Edina@ed.ac.uk.

See also: eLife supports the Jisc Publication Router on eLife news.

 

History of the Router – it started on the back of an envelope

[Image: ‘back of an envelope’ concept sketch]

The Jisc Publications Router has its origins in the preceding Open Access Repository Junction (OA-RJ) project which itself continued on from the work carried out on the Depot.

The Depot bridged a gap for researchers before a specific local institutional repository was available to them. It aimed to make more content available in repositories and to make it easier for researchers to have research results exposed to a wider readership under open access. The Depot is still available and providing researchers with a repository at http://opendepot.org/

One of the objectives of the Depot was to devise an unmediated reception and referral service called the Repository Junction. The Junction collected information in order to redirect users to existing institutional repository services near them. Institutional affiliation of potential depositors was deduced through an IP lookup, and external directories were queried to find an appropriate location for deposit. This facilitated the redirection of a user to the most appropriate repository. If none of the suggested repositories were suitable for the researcher, they could still deposit in the Depot.
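
A toy version of that redirection logic is sketched below. The network ranges, institutions and repository URLs are invented placeholders, and the real Junction consulted external directories rather than a hard-coded table.

```python
# Sketch of the Repository Junction idea: map a depositor's IP address to an
# institution and suggest its repository, falling back to the Depot.
# All addresses and URLs here are invented for illustration.
import ipaddress

# Hypothetical directory: network range -> (institution, repository URL)
DIRECTORY = {
    ipaddress.ip_network("192.0.2.0/24"): ("Example University", "https://repository.example.ac.uk"),
    ipaddress.ip_network("198.51.100.0/24"): ("Another University", "https://eprints.another.ac.uk"),
}

FALLBACK = ("The Depot", "http://opendepot.org/")

def suggest_repository(client_ip: str):
    """Return (institution, repository URL) for a client IP, or the Depot
    when no institutional repository can be found."""
    addr = ipaddress.ip_address(client_ip)
    for network, target in DIRECTORY.items():
        if addr in network:
            return target
    return FALLBACK

print(suggest_repository("192.0.2.17"))   # matches Example University
print(suggest_repository("203.0.113.5"))  # no match, falls back to the Depot
```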

OA-RJ started as an investigation to improve the simplistic approach of the Repository Junction and provide a service within the Jisc information environment. After consultation with other technologists in the repository community it became clear that there were two workflows that should be addressed: firstly, the deposit object could be data-mined for additional information on author affiliation; secondly, the object itself could be deposited into repositories. This second workflow could solve the many-to-many problem of research publications with multiple authors from multiple institutions who require their publications be deposited in multiple locations. The aim was to minimise effort on behalf of potential depositors while maximising the distribution and exposure of research outputs.

The foundation for OA-RJ can be seen in the ‘back of an envelope’ diagram (above), born from a meeting between Theo Andrew, Jim Downing, Richard Jones, Ben O’Steen and Ian Stuart. With smoother edges, the diagram looks like this:

[Image: tidied concept diagram]

OA-RJ then split into discovery and delivery, providing services for each. The Repository Junction would discover repository targets while a standalone broker would enable content providers to make deposits with multiple recipients. OA-RJ became two distinct projects as part of the UK Repository Net+ (RepNet) infrastructure project: Organisation and Repository Identification (ORI) handled discovery, while the Repository Junction Broker (RJB) dealt with delivery. ORI is now an EDINA microservice providing APIs to access authoritative data on organisations and repositories. The latest phase of RJB is the Jisc Publications Router.

The Router is a service based on the RJB application. The Publications Router aims to deliver open access content in a format that can be understood by institutional repositories. Having evolved from the projects outlined above the Router automates the delivery of research publications from multiple suppliers (publishers, subject repositories) to multiple institutional repositories. The Router parses the metadata to determine the appropriate target repositories based on the authors responsible for the output and transfers the publication to the institutional repositories registered with the service. It is intended to minimise efforts on behalf of potential depositors in order to maximise the distribution and exposure of research outputs.
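
To make the routing step concrete, the sketch below matches author affiliations in JATS-style article metadata against a table of registered repositories. The XML handling is deliberately simplified and the registration table and endpoints are invented; it illustrates the idea rather than the Router's actual implementation.

```python
# Sketch: pull affiliation strings out of JATS-style article XML and match
# them to registered repositories. The sample XML, matching rule and
# endpoints are invented for illustration.
import xml.etree.ElementTree as ET

# Hypothetical registrations: affiliation keyword -> repository deposit endpoint
REGISTERED = {
    "university of example": "https://repository.example.ac.uk/deposit",
    "another university": "https://eprints.another.ac.uk/deposit",
}

SAMPLE_ARTICLE = """
<article>
  <front><article-meta><contrib-group>
    <aff>School of Informatics, University of Example, UK</aff>
    <aff>Department of Physics, Another University, UK</aff>
  </contrib-group></article-meta></front>
</article>
"""

def target_repositories(article_xml: str):
    """Return the deposit endpoints whose institutions appear in the
    article's affiliation strings."""
    root = ET.fromstring(article_xml)
    affiliations = [aff.text.lower() for aff in root.iter("aff") if aff.text]
    return {
        endpoint
        for keyword, endpoint in REGISTERED.items()
        if any(keyword in aff for aff in affiliations)
    }

print(target_repositories(SAMPLE_ARTICLE))  # both example endpoints match
```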

The envelope sketch is now a fully realised service.

You can view blog posts from the previous incarnations of the Router at https://oarepojunction.wordpress.com/ and we will highlight some of these older posts in the future.

If you have any queries about the Publications Router please contact the Edina Helpdesk or email Edina@ed.ac.uk.