Experiences in deploying metadata analysis tools for institutional repositories

David M. Nichols, Gordon W. Paynter, Chu Hsiang Chan, David Bainbridge, Dana McKay, Michael B. Twidale, Ann Blandford

Research output: Contribution to journal › Article

Abstract

Current institutional repository software provides few tools to help metadata librarians understand and analyze their collections. In this article, we compare and contrast metadata analysis tools that were developed simultaneously, but independently, at two New Zealand institutions during a period of national investment in research repositories: the Metadata Analysis Tool (MAT) at The University of Waikato, and the Kiwi Research Information Service (KRIS) at the National Library of New Zealand. The tools have many similarities: they are convenient, online, on-demand services that harvest metadata using OAI-PMH; they were developed in response to feedback from repository administrators; and they both help pinpoint specific metadata errors as well as generate summary statistics. They also have significant differences: one is a dedicated tool whereas the other is part of a wider access tool; one gives a holistic view of the metadata whereas the other looks for specific problems; one seeks patterns in the data values whereas the other checks that those values conform to metadata standards. Both tools work in a complementary manner to existing Web-based administration tools. We have observed that discovery and correction of metadata errors can be quickly achieved by switching Web browser views from the analysis tool to the repository interface, and back. We summarize the findings from both tools' deployment into a checklist of requirements for metadata analysis tools.
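
The tools described in the abstract harvest repository metadata over OAI-PMH and turn it into error reports and summary statistics. As a purely illustrative sketch (this is not the MAT or KRIS implementation, and the repository base URL below is a hypothetical placeholder), the following Python script shows that general approach: harvest Dublin Core records from an OAI-PMH endpoint and count how many records lack commonly required fields.

# Minimal sketch of OAI-PMH harvesting plus a field-completeness summary.
# Not the authors' code; the endpoint URL is a placeholder for illustration.
from collections import Counter
from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

BASE_URL = "https://repository.example.org/oai"  # hypothetical OAI-PMH endpoint


def harvest_dc_records(base_url):
    """Yield a dict of Dublin Core fields per record, following resumption tokens."""
    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
    while True:
        with urlopen(base_url + "?" + urlencode(params)) as response:
            tree = ET.parse(response)
        for record in tree.iter(OAI + "record"):
            metadata = record.find(OAI + "metadata")
            if metadata is None:  # deleted records carry no metadata
                continue
            fields = {}
            for element in metadata.iter():
                if isinstance(element.tag, str) and element.tag.startswith(DC):
                    fields.setdefault(element.tag[len(DC):], []).append(element.text or "")
            yield fields
        token = tree.find(".//" + OAI + "resumptionToken")
        if token is None or not (token.text or "").strip():
            break
        params = {"verb": "ListRecords", "resumptionToken": token.text.strip()}


def missing_field_counts(records, required=("title", "creator", "date", "type")):
    """Summary statistics: how many records lack each required field entirely."""
    missing = Counter()
    for fields in records:
        for name in required:
            if not any(value.strip() for value in fields.get(name, [])):
                missing[name] += 1
    return missing


if __name__ == "__main__":
    print(missing_field_counts(harvest_dc_records(BASE_URL)))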

Original language: English (US)
Pages (from-to): 229-248
Number of pages: 20
Journal: Cataloging and Classification Quarterly
Volume: 47
Issue number: 3-4
DOI: 10.1080/01639370902737281
State: Published - Apr 1 2009

Keywords

  • Evaluation
  • Institutional repositories
  • Metadata quality

ASJC Scopus subject areas

  • Library and Information Sciences

Cite this

Nichols, D. M., Paynter, G. W., Chan, C. H., Bainbridge, D., McKay, D., Twidale, M. B., & Blandford, A. (2009). Experiences in deploying metadata analysis tools for institutional repositories. Cataloging and Classification Quarterly, 47(3-4), 229-248. https://doi.org/10.1080/01639370902737281

Experiences in deploying metadata analysis tools for institutional repositories. / Nichols, David M.; Paynter, Gordon W.; Chan, Chu Hsiang; Bainbridge, David; McKay, Dana; Twidale, Michael B.; Blandford, Ann.

In: Cataloging and Classification Quarterly, Vol. 47, No. 3-4, 01.04.2009, p. 229-248.

Research output: Contribution to journal › Article

Nichols, DM, Paynter, GW, Chan, CH, Bainbridge, D, McKay, D, Twidale, MB & Blandford, A 2009, 'Experiences in deploying metadata analysis tools for institutional repositories', Cataloging and Classification Quarterly, vol. 47, no. 3-4, pp. 229-248. https://doi.org/10.1080/01639370902737281
Nichols, David M. ; Paynter, Gordon W. ; Chan, Chu Hsiang ; Bainbridge, David ; McKay, Dana ; Twidale, Michael B. ; Blandford, Ann. / Experiences in deploying metadata analysis tools for institutional repositories. In: Cataloging and Classification Quarterly. 2009 ; Vol. 47, No. 3-4. pp. 229-248.
@article{e0d2ac3e6ba7436f975a2ad4f5b01ee3,
title = "Experiences in deploying metadata analysis tools for institutional repositories",
abstract = "Current institutional repository software provides few tools to help metadata librarians understand and analyze their collections. In this article, we compare and contrast metadata analysis tools that were developed simultaneously, but independently, at two New Zealand institutions during a period of national investment in research repositories: the Metadata Analysis Tool (MAT) at The University of Waikato, and the Kiwi Research Information Service (KRIS) at the National Library of New Zealand. The tools have many similarities: they are convenient, online, on-demand services that harvest metadata using OAI-PMH; they were developed in response to feedback from repository administrators; and they both help pinpoint specific metadata errors as well as generate summary statistics. They also have significant differences: one is a dedicated tool whereas the other is part of a wider access tool; one gives a holistic view of the metadata whereas the other looks for specific problems; one seeks patterns in the data values whereas the other checks that those values conform to metadata standards. Both tools work in a complementary manner to existing Web-based administration tools. We have observed that discovery and correction of metadata errors can be quickly achieved by switching Web browser views from the analysis tool to the repository interface, and back. We summarize the findings from both tools' deployment into a checklist of requirements for metadata analysis tools.",
keywords = "Evaluation, Institutional repositories, Metadata quality",
author = "Nichols, {David M.} and Paynter, {Gordon W.} and Chan, {Chu Hsiang} and David Bainbridge and Dana McKay and Twidale, {Michael B.} and Ann Blandford",
year = "2009",
month = "4",
day = "1",
doi = "10.1080/01639370902737281",
language = "English (US)",
volume = "47",
pages = "229--248",
journal = "Cataloging and Classification Quarterly",
issn = "0163-9374",
publisher = "Routledge",
number = "3-4",

}

TY - JOUR

T1 - Experiences in deploying metadata analysis tools for institutional repositories

AU - Nichols, David M.

AU - Paynter, Gordon W.

AU - Chan, Chu Hsiang

AU - Bainbridge, David

AU - McKay, Dana

AU - Twidale, Michael B.

AU - Blandford, Ann

PY - 2009/4/1

Y1 - 2009/4/1

N2 - Current institutional repository software provides few tools to help metadata librarians understand and analyze their collections. In this article, we compare and contrast metadata analysis tools that were developed simultaneously, but independently, at two New Zealand institutions during a period of national investment in research repositories: the Metadata Analysis Tool (MAT) at The University of Waikato, and the Kiwi Research Information Service (KRIS) at the National Library of New Zealand. The tools have many similarities: they are convenient, online, on-demand services that harvest metadata using OAI-PMH; they were developed in response to feedback from repository administrators; and they both help pinpoint specific metadata errors as well as generate summary statistics. They also have significant differences: one is a dedicated tool whereas the other is part of a wider access tool; one gives a holistic view of the metadata whereas the other looks for specific problems; one seeks patterns in the data values whereas the other checks that those values conform to metadata standards. Both tools work in a complementary manner to existing Web-based administration tools. We have observed that discovery and correction of metadata errors can be quickly achieved by switching Web browser views from the analysis tool to the repository interface, and back. We summarize the findings from both tools' deployment into a checklist of requirements for metadata analysis tools.

AB - Current institutional repository software provides few tools to help metadata librarians understand and analyze their collections. In this article, we compare and contrast metadata analysis tools that were developed simultaneously, but independently, at two New Zealand institutions during a period of national investment in research repositories: the Metadata Analysis Tool (MAT) at The University of Waikato, and the Kiwi Research Information Service (KRIS) at the National Library of New Zealand. The tools have many similarities: they are convenient, online, on-demand services that harvest metadata using OAI-PMH; they were developed in response to feedback from repository administrators; and they both help pinpoint specific metadata errors as well as generate summary statistics. They also have significant differences: one is a dedicated tool whereas the other is part of a wider access tool; one gives a holistic view of the metadata whereas the other looks for specific problems; one seeks patterns in the data values whereas the other checks that those values conform to metadata standards. Both tools work in a complementary manner to existing Web-based administration tools. We have observed that discovery and correction of metadata errors can be quickly achieved by switching Web browser views from the analysis tool to the repository interface, and back. We summarize the findings from both tools' deployment into a checklist of requirements for metadata analysis tools.

KW - Evaluation

KW - Institutional repositories

KW - Metadata quality

UR - http://www.scopus.com/inward/record.url?scp=73849090216&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=73849090216&partnerID=8YFLogxK

U2 - 10.1080/01639370902737281

DO - 10.1080/01639370902737281

M3 - Article

AN - SCOPUS:73849090216

VL - 47

SP - 229

EP - 248

JO - Cataloging and Classification Quarterly

JF - Cataloging and Classification Quarterly

SN - 0163-9374

IS - 3-4

ER -