
National Library of Medicine Recommendations on NLM Digital Repository Software



Appendix D – DigiTool Testing Results


Consolidated Digital Repository Test Plan

Last updated: June 13, 2008

Each test entry below lists the Test ID, Test Plan Element, Source Requirements, Subgroup (see Note 1), Test Procedure and Results (where recorded), Score (0-3; see Note 2), and Notes (where recorded). All tests were run against DigiTool 3.0.

7.1.1 Ingest - Receive Submission (Subgroup: T)










Test ID: 7.1.1.7
Test Plan Element: File types - Demonstrate that the system can ingest content in all the file formats listed as "supported" in Appendix B of the NLM DR Functional Requirements document (plus MP3 and JPEG2000), specifically: MARC, PDF, PostScript, AIFF, MPEG audio, WAV, MP3, GIF, JPEG, JPEG2000, PNG, TIFF, HTML, text, RTF, XML, MPEG.
Demonstrate that the system can ingest the following types of content: articles, journals, images, monographs, audio files, video files, websites, numeric data, text files, and databases.
Conduct this test element by ingesting the set of files listed in the Test File spreadsheet. (The files listed in this spreadsheet contain examples of all the file formats and all the content types identified above.)
Source Requirements: 7.1.1.7, 7.1.1.9
Subgroup: T
Test Procedure and Results: Can automatically create thumbnails when ingesting JPEG2000 and PDF. (Some initial PDF ingest tests failed to produce thumbnails due to an unusual PDF format.)
Score: 3
Notes: Answer was received to Question QT4.



Test ID: 7.1.1.1
Test Plan Element: Manual review - Demonstrate that the system has the capability to require that submitted content be manually reviewed before it is accepted into the repository.
Demonstrate that the system maintains submitted content in a staging area before it is accepted.
Demonstrate that the system notifies a reviewer when new content is ready for review.
(Also see tests for 7.1.4.1, 7.1.4.2, and 8.1.2.)
Source Requirements: 7.1.1.1
Subgroup: T
Score: 3




Test ID: 7.1.1.2
Test Plan Element: Review and acceptance workflow - Demonstrate that the system supports a workflow for the review and acceptance of submitted content. Demonstrate that the workflow includes the following functions:
a - Receive and track content from producers;
b - Validate content based on submitter, expected format, file quality, duplication, and completeness;
c - Normalize content by converting content into a supported format for final ingestion into the repository;
d - Human review of content;
e - Acceptance or rejection of content or file format.
Source Requirements: 7.1.1.2, 7.1.1.10
Subgroup: T
Test Procedure and Results: a - yes; b - no; c - no; d - yes; e - yes.
Score: 2




Test ID: 7.1.1.3
Test Plan Element: Reason for rejection - Demonstrate that the system records a set of identifying information or metadata that describes the reason for the rejection of submitted content. Demonstrate two cases: (1) automatic rejection, and (2) rejection by a human reviewer.
Source Requirements: 7.1.1.3
Subgroup: T
Test Procedure and Results: (1) No. (2) Possibly; needs further testing.
Score: 1




Test ID: 7.1.1.4
Test Plan Element: Rejection filter - Demonstrate that the system allows the creation of a filter that can be used to automatically reject submitted content. (This capability will eliminate the need for manual review of some submissions and resubmissions.)
Source Requirements: 7.1.1.4
Subgroup: T
Test Procedure and Results: This is not filter-based but rather template-based.
Score: 1
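The requirement above describes filter-style automatic rejection, which DigiTool handles through templates instead. Purely as an illustrative sketch of what a rule-based rejection filter could look like (the rule conditions, field names, and thresholds are hypothetical and are not part of DigiTool or the NLM requirements):

```python
from typing import Callable, List, Optional

# A rule inspects a SIP record (a dict of submission metadata) and returns a
# rejection reason, or None to let the submission pass to manual review.
RejectionRule = Callable[[dict], Optional[str]]

RULES: List[RejectionRule] = [
    lambda sip: "unsupported format" if sip.get("format") not in {"pdf", "tiff", "xml"} else None,
    lambda sip: "unknown submitter" if sip.get("submitter") not in {"producer-a", "producer-b"} else None,
    lambda sip: "file too large" if sip.get("size_bytes", 0) > 2_000_000_000 else None,
]

def auto_reject(sip: dict) -> Optional[str]:
    """Return the first matching rejection reason, or None to queue the SIP
    for manual review (compare 7.1.1.1)."""
    for rule in RULES:
        reason = rule(sip)
        if reason is not None:
            return reason
    return None
```

A SIP rejected by such a filter would then feed the rejection-metadata and notification requirements in 7.1.1.3 and 7.1.1.5.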




Test ID: 7.1.1.5
Test Plan Element: Rejection notification - Demonstrate that the system can notify the producer or donor when submitted content is rejected. Demonstrate two cases: (1) notification after immediate rejection by an automated process, and (2) notification after rejection by manual review.
Source Requirements: 7.1.1.5, 7.1.1.11
Subgroup: T
Test Procedure and Results: Failure indication (instead of "success") upon immediate rejection.
Score: 1



Test ID: (7.1.1.8)
Test Plan Element: Metadata types - Demonstrate that the system can ingest content with associated metadata in the following formats: all NLM DTDs, Dublin Core, MARC21, MARCXML, ONIX, MODS, EAD, TEI, PREMIS, METS. (NOTE: This test is covered by tests 8.1.1, 8.1.8, and 8.1.9.)
Source Requirements: 7.1.1.8, 8.1.1, 8.1.8, 8.1.9
Subgroup: M/T
Score: T=2.5, M=2.5

Test ID: 7.1.1.10
Test Plan Element: Format conversion - Demonstrate that the system has the capability to convert the format of a file being ingested to a desired supported format. As a test case, demonstrate that a WAV file can be converted to MP3 format when it is ingested. (An external tool may be needed to perform the conversion. If this is the case, demonstrate that the system can invoke the required external tool.)
Source Requirements: 7.1.1.10, 7.1.1.2
Subgroup: T
Test Procedure and Results: Can automatically create JPG and JP2 when ingesting TIFF. Can automatically create JPG when ingesting JP2. Can automatically create a JPG thumbnail when ingesting JP2 and PDF. Can add other external file converters.
Score: 2
Notes: Answer was received to Question QT1.
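The test case above calls for invoking an external converter when a WAV file must become MP3. DigiTool's own converter registration is not shown here; as a minimal sketch of the general approach, assuming an installed command-line tool such as ffmpeg (the file paths in the example are hypothetical):

```python
import subprocess
from pathlib import Path

def convert_wav_to_mp3(wav_path: str, out_dir: str) -> Path:
    """Convert a WAV file to MP3 by shelling out to ffmpeg.

    Illustrative only: assumes ffmpeg is installed and on the PATH; in the
    repository the conversion would be performed by whatever external
    converter is registered with the ingest workflow.
    """
    src = Path(wav_path)
    dst = Path(out_dir) / src.with_suffix(".mp3").name
    # -y overwrites any existing output; ffmpeg selects an MP3 encoder
    # from the .mp3 extension of the output file.
    subprocess.run(["ffmpeg", "-y", "-i", str(src), str(dst)], check=True)
    return dst

# Example: convert_wav_to_mp3("sip/audio001.wav", "staging/derivatives")
```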

Test ID: 7.1.1.12
Test Plan Element: Resubmission - Demonstrate that the system can ingest a SIP that is resubmitted after an error in the SIP was detected and corrected. Demonstrate two cases: the resubmission can occur after an error was detected in (1) the content of the SIP, and (2) the metadata of the SIP.
Source Requirements: 7.1.1.12
Subgroup: T
Test Procedure and Results: Failed ingests can be rolled back, edited, and reingested.
Score: 2




Test ID: 7.1.1.14
Test Plan Element: Versions - Demonstrate that the system can store, track, and link multiple versions of a file.
Source Requirements: 7.1.1.14
Subgroup: T
Test Procedure and Results: Alternate manifestations can be created, but there is no "versions" capability.
Score: 0




Test ID: 7.1.1.15a
Test Plan Element: Unique identifiers - Demonstrate that the system assigns a unique identifier to each object ingested. Demonstrate two cases: (1) a unique identifier assigned to a digital object, which may be comprised of a set of component files, and (2) a unique identifier assigned to each of the component files of a digital object.
Source Requirements: 7.1.1.15a, 7.1.1.15b
Subgroup: T
Score: 3




Test ID: 7.1.1.15b
Test Plan Element: Relationships - Demonstrate that the system can represent a parent-child relationship between content items. Demonstrate two cases: (1) an object having multiple components (e.g., a document having multiple pages, each in a separate file), and (2) an object having multiple manifestations (e.g., an image having both TIFF and JPEG files).
Source Requirements: 7.1.1.15b
Subgroup: T
Score: 3
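As an aside on the structure this test exercises, the two cases (multi-component objects and multiple manifestations) can be pictured with a small, purely hypothetical data model; DigiTool's actual object model is not shown here, and the class names and identifiers below are assumptions:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ComponentFile:
    identifier: str   # unique identifier of the file (case 2 of 7.1.1.15a)
    filename: str
    mime_type: str

@dataclass
class DigitalObject:
    identifier: str   # unique identifier of the parent object (case 1 of 7.1.1.15a)
    # case 1 of 7.1.1.15b: one file per page of a document
    components: List[ComponentFile] = field(default_factory=list)
    # case 2 of 7.1.1.15b: alternate renderings of the same content
    manifestations: List[ComponentFile] = field(default_factory=list)

# A two-page document with a JPEG manifestation of page 1:
doc = DigitalObject(
    identifier="nlm:obj-0001",
    components=[
        ComponentFile("nlm:obj-0001/p1", "page1.tif", "image/tiff"),
        ComponentFile("nlm:obj-0001/p2", "page2.tif", "image/tiff"),
    ],
    manifestations=[
        ComponentFile("nlm:obj-0001/p1.jpg", "page1.jpg", "image/jpeg"),
    ],
)
```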




Test ID: 7.1.1.16
Test Plan Element: Audit trail - Demonstrate that the system maintains an audit trail of all actions regarding receiving submissions (SIPs).
Source Requirements: 7.1.1.16
Subgroup: T
Score: 2.5




7.1.2 Ingest - Quality Assurance (Subgroup: T)










Test ID: 7.1.2.1
Test Plan Element: Virus checking - By design analysis, confirm that the system performs automatic virus checking on submitted content files.
Source Requirements: 7.1.2.1
Subgroup: T
Score: 0
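DigiTool showed no built-in virus checking (score 0). For illustration only, a pre-ingest check of a staging directory could wrap an external scanner such as ClamAV; the wrapper below is an assumption, not a DigiTool capability:

```python
import subprocess

def staging_area_is_clean(path: str) -> bool:
    """Scan a staging directory with ClamAV and report whether it is clean.

    Illustrative only: assumes ClamAV is installed.  clamscan exits 0 when
    no virus is found, 1 when a virus is found, and 2 on errors.
    """
    result = subprocess.run(
        ["clamscan", "--recursive", "--infected", path],
        capture_output=True, text=True,
    )
    if result.returncode == 0:
        return True
    if result.returncode == 1:
        print(result.stdout)  # clamscan lists each infected file
        return False
    raise RuntimeError(f"clamscan failed: {result.stderr}")
```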




Test ID: 7.1.2.2
Test Plan Element: Transmission errors - Demonstrate that the system uses MD5, CRC, checksums, or some other bit error detection technique to validate that each data file submitted is received into the repository staging area without transmission errors.
Source Requirements: 7.1.2.2
Subgroup: T
Test Procedure and Results: MD5 is created during ingest and is saved with the file. However, an MD5 generated pre-ingest cannot be compared with the DigiTool-created MD5 to verify that transmission errors have not occurred.
Score: 1
Notes: Answer was received to Question QT2.
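The gap noted in this result is the missing comparison between a producer-supplied MD5 and the MD5 computed at ingest. The comparison itself is simple; a sketch with Python's standard library (the file name and digest in the example are hypothetical):

```python
import hashlib

def md5_of(path: str) -> str:
    """Compute the MD5 digest of a file, reading it in 1 MB chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def received_without_errors(path: str, producer_md5: str) -> bool:
    """Compare a producer-supplied MD5 with one computed after transfer."""
    return md5_of(path) == producer_md5.strip().lower()

# Example: received_without_errors("staging/audio001.wav",
#                                  "9e107d9d372bb6826bd81d3542a419d6")
```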

Test ID: 7.1.2.3
Test Plan Element: Submission validation - Demonstrate that the system verifies the validity of submitted content based on the following criteria: submitter; expected file format; file quality (e.g., actual format of file matches the filename extension, and content of file is well-formed); duplication (e.g., existence of object in the repository); completeness of metadata; completeness of file set (e.g., all expected files are included in the submission).
Source Requirements: 7.1.2.3
Subgroup: T
Test Procedure and Results: No submission validation other than the JHOVE checksum. The checksum is computed at ingest, and there is no capability to compare an externally provided checksum with the one computed during ingest.
Score: 1
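One criterion in this test is that a file's actual format match its filename extension. As an illustrative sketch (not DigiTool functionality), this can be approximated by checking leading "magic" bytes; the table below is an assumption and covers only a handful of the supported formats:

```python
from pathlib import Path

# Leading byte signatures for a few of the supported formats.
MAGIC = {
    ".pdf":  [b"%PDF"],
    ".gif":  [b"GIF87a", b"GIF89a"],
    ".png":  [b"\x89PNG\r\n\x1a\n"],
    ".tif":  [b"II*\x00", b"MM\x00*"],
    ".tiff": [b"II*\x00", b"MM\x00*"],
    ".jpg":  [b"\xff\xd8\xff"],
    ".jpeg": [b"\xff\xd8\xff"],
}

def extension_matches_content(path: str) -> bool:
    """Return True when the file's leading bytes match its extension,
    or when the extension is not covered by the small table above."""
    signatures = MAGIC.get(Path(path).suffix.lower())
    if signatures is None:
        return True  # format not covered by this sketch
    with open(path, "rb") as fh:
        head = fh.read(16)
    return any(head.startswith(sig) for sig in signatures)
```

In practice a characterization tool such as JHOVE (already used by DigiTool for checksums) would perform this kind of format validation more thoroughly.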




Test ID: 7.1.2.4
Test Plan Element: QA UI - Demonstrate that the system allows NLM staff to perform manual/visual quality assurance on staged SIPs via a user-friendly interface.
Source Requirements: 7.1.2.4
Subgroup: T
Score: 1




Test ID: 7.1.2.5
Test Plan Element: Reaction to QA errors - Demonstrate that the system can react to specified QA errors in two ways: (1) request that the producer correct and resubmit the content, or (2) automatically modify the submission (e.g., converting to a supported format).
Source Requirements: 7.1.2.5
Subgroup: T
Test Procedure and Results: After a failed ingest the user can roll back, edit, and resubmit. No automatic modifications are performed.
Score: 1




Test ID: 7.1.2.6
Test Plan Element: File/batch accept/reject - Demonstrate that the system enables NLM staff to accept or reject submitted content (SIPs) at the file or batch level.
Source Requirements: 7.1.2.6
Subgroup: T
Score: 1.5




Test ID: 7.1.2.7b
Test Plan Element: Error reports - Demonstrate that the system generates error reports for ingest quality assurance problems.
Source Requirements: 7.1.2.7b
Subgroup: T
Score: 0




Test ID: 7.1.2.8
Test Plan Element: Adjustable level of manual QC - By design analysis, confirm that the system has the ability to adjust the level of manual ingest quality control needed, based on the origin of the file.
Source Requirements: 7.1.2.8
Subgroup: T
Score: 0




Test ID: 7.1.2.9
Test Plan Element: Audit trail - Demonstrate that the system maintains an audit trail of all actions regarding ingest quality assurance.
Source Requirements: 7.1.2.9
Subgroup: T
Score: 0




7.1.4 Ingest - Generate Descriptive Information / Metadata (Subgroup: M)










Test ID: 7.1.4.1
Test Plan Element: Additional metadata - Demonstrate the entry of additional metadata (e.g., subject headings, names, dates, "curatorial" descriptive metadata - evaluative information that explains why an object is important, whether it was part of a larger collection (e.g., an exhibit), etc.).
Source Requirements: 7.1.4.1
Subgroup: M
Score: 3




Test ID: 7.1.4.2
Test Plan Element: Validate metadata - Demonstrate ability to validate specified metadata elements.
Source Requirements: 7.1.4.2
Subgroup: M
Score: 1.5




Test ID: 7.1.4.4
Test Plan Element: Metadata storage - Demonstrate that metadata is stored in the database in a manner that conforms to repository reformatting and is linked to its corresponding objects via an identifier.
o Demonstrate that basic descriptive metadata is also stored with the objects (e.g., unique identifier, title, and date stored in the TIFF header) so that the objects can still be identified in the event that information in the database is corrupted.
o See Appendix D for examples of TIFF header metadata requirements.
(Use of an external tool is probable.)
Source Requirements: 7.1.4.4
Subgroup: M
Score: 3
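For the second point above (basic descriptive metadata stored in the TIFF header), the following is a minimal sketch of embedding an identifier, title, and date in standard TIFF tags using the Pillow library; the choice of library, the packing of values into ImageDescription, and the example values are assumptions, and an external tool such as ExifTool could serve the same purpose:

```python
from PIL import Image, TiffImagePlugin

def tag_tiff(src: str, dst: str, identifier: str, title: str, date: str) -> None:
    """Write a copy of a TIFF with identifier/title/date in standard header tags.

    Illustrative only: Pillow re-encodes the image when saving, and the
    packing of values into ImageDescription is a local convention.
    """
    ifd = TiffImagePlugin.ImageFileDirectory_v2()
    ifd[270] = f"id={identifier}; title={title}; date={date}"  # 270 = ImageDescription
    ifd[306] = date                                            # 306 = DateTime
    with Image.open(src) as im:
        im.save(dst, tiffinfo=ifd)

# Example: tag_tiff("page1.tif", "page1_tagged.tif", "nlm:obj-0001/p1",
#                   "Sample monograph, page 1", "2008:06:13 00:00:00")
```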




Test ID: 7.1.4.5
Test Plan Element: Required descriptive elements - Demonstrate the ability to recognize required descriptive elements.
Source Requirements: 7.1.4.5
Subgroup: M
Score: 3




Test ID: 7.1.4.7
Test Plan Element: Audit trail - Demonstrate the creation of an audit trail of all actions.
Source Requirements: 7.1.4.7
Subgroup: M
Score: 1




7.1.3 Ingest - Generate AIP (See Note 3; Subgroup: P)

7.1.5 Ingest - Coordinate Updates (See Note 3; Subgroup: P)

7.2.1 Archival Storage - Receive Data (See Note 3; Subgroup: P)

7.2.2 Archival Storage - Manage Storage Hierarchy (See Note 3; Subgroup: P)

7.2.3 Archival Storage - Replace Media (See Note 3; Subgroup: P)

7.2.4 Archival Storage - Error Checking and Disaster Recovery (See Note 3; Subgroup: P)

7.2.5 Archival Storage - Provide Data (See Note 3; Subgroup: P)

7.3.1 Data Management - Administer Database (See Note 3; Subgroup: P)










7.3.2 Data Management - Perform Queries (See Note 3; Subgroup: P)










7.3.3 Data Management - Generate Report (See Note 3; Subgroup: P)








