
National Library of Medicine Recommendations on NLM Digital Repository Software



Appendix A - Master Evaluation Criteria Used for Qualitative Evaluation of Initial 10 Systems


NLM Digital Repository Master Evaluation Criteria

Updated August 13, 2007


Purpose

  • Provide a decision method for selecting 3-4 systems from the initial list of 10 digital repository candidate systems for installation and testing at NLM.



Context

  • The Digital Repository Evaluation and Selection Working Group (DRESWG) has begun evaluating the initial list of 10 candidate systems against a list of approximately 175 functional requirements specified in the NLM Digital Repository Policies and Functional Requirements Specification, March 16, 2007.

    • A weighted numerical scoring method is being used to compute a total score for each candidate system (a minimal illustrative sketch follows this list).

  • The Functional Requirements score is one of the master evaluation criteria.

  • Additional master evaluation criteria address other programmatic factors and risks that should be considered in the down-selection decision.
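
To make the weighted scoring concrete, here is a minimal sketch in Python. The requirement labels echo the specification's section numbering (7.1-7.6, referenced in Appendix B), but the weights and per-system scores are hypothetical placeholders, not actual DRESWG data.

    # Minimal sketch of a weighted numerical scoring method.
    # Weights and scores below are hypothetical, for illustration only.
    requirement_weights = {
        "7.1": 3,   # relative importance of each requirement group
        "7.2": 3,
        "7.3": 2,
        "7.4": 2,
    }
    candidate_scores = {
        "7.1": 2,   # evaluator's score for one candidate system
        "7.2": 3,
        "7.3": 1,
        "7.4": 2,
    }

    def weighted_total(weights, scores):
        """Total score = sum over requirements of (weight x score)."""
        return sum(w * scores.get(req, 0) for req, w in weights.items())

    print(weighted_total(requirement_weights, candidate_scores))  # -> 21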



Master Evaluation Criteria

  • Functionality – Degree of satisfaction of the requirements enumerated in the NLM Digital Repository Functional Requirements Specification.

    • Evaluation: weighted numerical score from the functional requirements evaluation (see Context above)

  • Scalability – Ability of the repository to scale to manage large collections of digital objects.

    • Evaluation: 0-3 assessment scale (see below)

  • Extensibility – Ability to extend the repository's functionality by integrating external tools, either via provided software interfaces (APIs) or by modifying the code base (open source software).

    • Evaluation: 0-3 assessment scale (see below)

  • Interoperability – Ability of the repository to interoperate with other repositories (both within and outside NLM) and with the NLM ILS.

    • Evaluation: 0-3 assessment scale (see below)

  • Ease of deployment – Simplicity of hardware and software platform requirements; simplicity of installation; ease of integration with other needed software.

    • Evaluation: 0-3 assessment scale (see below)

  • System security – How well does the system meet HHS/NIH/NLM security requirements?

    • Evaluation: 0-3 assessment scale (see below)

  • System performance – How well the system performs overall; response time (assessed via load testing); system availability (24x7, both internally and externally).

    • Evaluation: 0-3 assessment scale (see below)

  • Physical environment – Support for multiple instances for offsite recovery; ability to function with the NIH off-site backup facility (NCCS); ability for components to reside at different physical locations; support for separate development, testing, and production environments; capability for disaster recovery.

    • Evaluation: 0-3 assessment scale (see below)

  • Platform support – Operating system and database requirements. Are these already supported by OCCS? Is there staff expertise to deal with required infrastructure?

    • Preferable: O/S: Solaris 10 (container); Storage: on NetApp via NFS; DB: Oracle; Web: Java/Tomcat or other application-tier technology (OCCS will evaluate)

    • Acceptable: O/S: Windows 2003, Linux Red Hat ES; DB: MySQL; Web: (no constraints for now – OCCS will evaluate)

    • Evaluation: 0-3 assessment scale (see below)

  • Demonstrated successful deployments – Relative number of satisfied users (organizations).

    • Evaluation: 0-3 assessment scale (see below)

  • System support – Quality of documentation, and responsiveness of support staff or developer/user community (open source) to assist with problems.

    • Evaluation: 0-3 assessment scale (see below)


  • Stability of development organization – Viability of the company providing the software; or stability of the funding sources and organizations developing open source software.

    • Evaluation: 0-3 assessment scale (see below)

  • Strength of technology roadmap for the future – Technology roadmap that defines a system evolution path incorporating innovations and “next practices” that are likely to deliver value.

    • Evaluation: 0-3 assessment scale (see below)

To be considered only after the functional and technical criteria above are addressed:



  • Cost – Expected total cost of software deployment, including initial cost of software, plus cost of software integration, modifications, and enhancements.

    • Evaluation: 0 = highest cost, 3 = lowest cost


Assessment Scale

    • 0 – None

    • 1 – Low

    • 2 – Moderate

    • 3 – High
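
To show how these ratings might be rolled up for one candidate, here is a minimal Python sketch. The criterion names mirror the list above, but the ratings shown are illustrative placeholders, not the working group's actual assessments; cost is reported separately because it is considered only after the functional and technical criteria.

    # Minimal sketch: rolling up 0-3 master-criteria ratings for one candidate.
    # Ratings are illustrative placeholders, not actual evaluation results.
    SCALE = {0: "None", 1: "Low", 2: "Moderate", 3: "High"}

    ratings = {
        "Scalability": 2,
        "Extensibility": 3,
        "Interoperability": 2,
        "Ease of deployment": 1,
        "System security": 2,
        "System performance": 2,
        "Platform support": 3,
    }

    # Cost uses an inverted scale (0 = highest cost, 3 = lowest cost)
    # and is considered only after the criteria above.
    cost_rating = 1

    total = sum(ratings.values())
    print(f"Technical total: {total} of {3 * len(ratings)}")
    for criterion, value in sorted(ratings.items()):
        print(f"  {criterion}: {value} ({SCALE[value]})")
    print(f"Cost (considered last): {cost_rating} ({SCALE[cost_rating]})")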

Appendix B - Results of Qualitative Evaluation of Initial 10 Systems


Final Systems Evaluation Matrix (last updated September 25, 2007)




Each system below is described by the matrix columns: Type (open source, vendor); Advantages; Risks; For further investigation; Notes.

Top contenders

Fedora

  • Type: Open source
  • Advantages: Great flexibility to handle complex objects and relationships. Fedora Commons received a multi-million dollar award to support further development. Community is mature and supportive.
  • Risks: Complicated system to configure, according to our research and many users. Needs additional software for a fully functional repository.

DigiTool (Ex Libris)

  • Type: Vendor
  • Advantages: “Out-of-the-box” solution with known vendor support. Provides good overall functionality. Able to integrate and interact with other NLM systems.
  • Risks: Scalability and flexibility may be issues. NLM may become too dependent on one vendor for its library systems.
  • For further investigation: Ingest issues.

DSpace

  • Type: Open source
  • Advantages: “Out-of-the-box” open source solution. Provides some functionality across all functional requirements (7.1-7.6). Community is mature and supportive.
  • Risks: Planned re-architecture over the next year. Current version’s native use of Dublin Core metadata is somewhat limiting.
Further evaluation and discussion needed

DAITSS

  • Type: Open source
  • Advantages: Richest preservation functionality.
  • Risks: Back-end/archive system; must be used in conjunction with another repository or access system. Planned re-architecture over the next 2 years. Limited use and support; further development dependent on FCLA (and the FL state legislature).
  • For further investigation: If selected for testing, the code base needs examination for robustness.

Greenstone

  • Type: Open source
  • Advantages: Long history, with many users over the last 10 years. Strong documentation, with a commitment by the original creators to develop and expand. Considered “easy” to implement as a simple repository out of the box (library school students have used it to create projects); DL Consulting is available for more complex requirements. Compatible with most NLM requirements.
  • Risks: Program is being entirely rewritten (C++ to Java) to create Greenstone 3; delivery date unknown. Development community beyond the originators is not as rich as for other open-source systems. DL Consulting was recently awarded a grant “to further improve Greenstone’s performance when scaled up to very large collections”, which suggests it may not scale well currently. Core developers and consultants are in New Zealand.
  • For further investigation: If selected for testing, it is not entirely clear whether Greenstone 3 (in beta) or Greenstone 2 (robust but being phased out) would be the better version to test. Developers claim any system implemented in Greenstone 2 will be compatible with Greenstone 3. Should probably contact the Greenstone developers and/or DL Consulting with this question if we select it.

Keystone DLS

  • Type: Open source
  • Advantages: Some strong functionality.
  • Risks: Relatively small user population.
  • Notes: Evaluators felt it should be strongly considered only if the top 3 above are found inadequate.

No further consideration needed at this time

ArchivalWare (PTFS)

  • Type: Vendor
  • Advantages: Strong search capabilities.
  • Risks: Small user population. Reliability and development path of the vendor unknown.
  • Notes: Very low rating across all master criteria.

CONTENTdm (OCLC)

  • Type: Vendor
  • Advantages: Good scalability.
  • Risks: No interaction with third-party systems. Data is stored in a proprietary text-based database that does not accommodate Oracle. Development path of the vendor unknown.
  • Notes: Lower ratings across the majority of master criteria.

EPrints

  • Type: Open source
  • Notes: Lower ratings across the majority of master criteria.

VITAL (VTLS)

  • Type: Vendor
  • Advantages: Vendor support for Fedora add-ons.
  • Risks: Vendor-added functionality may conflict with the open-source nature of Fedora.
  • Notes: If full evaluation of Fedora is successful, VITAL may be considered as an add-on.




