Steven Duckworth


SAA Student Poster Re-Cap: “Reprocessing: The Trials and Tribulations of Previously Processed Collections”

Monday, August 25th, 2014

from the poster presented at the Society of American Archivists Annual Meeting, August 2014, Washington, D.C.

by Annalise Berdini, Steven Duckworth, Jessica Hoffman, Alina Josan, Amanda Mita, & Evan Peugh; Philadelphia Area Consortium of Special Collections Libraries (PACSCL)

OVERVIEW:

PACSCL’s current project, “Uncovering Philadelphia’s Past: A Regional Solution to Revealing Hidden Collections,” will process 46 high-research-value collections, totaling 1,539 linear feet, from 16 Philadelphia-area institutions that document life in the region. Since the start of processing in October 2013, the team has completed 31 collections at 13 repositories, totaling over 1,225 linear feet. Plans have evolved over the course of the project because many of the collections had already been processed in some way. As the processing teams tackled the collections, the solutions they devised for the challenges they encountered grew into a helpful body of knowledge about minimal processing. Future archivists and collaborators can use this knowledge to choose appropriate collections for minimal processing projects and to be prepared to handle unexpected challenges as they arise.

NOTED ISSUES:

  • Novice Archivists: Volunteers and novice archivists, while well-meaning, can make simple mistakes that lead to larger problems.
    • Learn about the previous processors: their background and their level of knowledge of the materials. Having a better idea of their relationship to the collection helps guide decisions in the new iteration of processing.
    • “Miscellaneous” is a very popular word, even with seasoned archivists. Describe the contents of a folder more accurately, such as “Assorted records” or “Correspondence, assorted,” followed by examples of record types or one to three names of individuals represented.
  • Losing Original Order: Processors with good intentions can disrupt original order through poor arrangement, item-level processing, and removing items for exhibits or other purposes.
    • Use what original order remains to influence arrangement in a way that might bring separated records back together.
    • Lone items may require more detailed description to provide links back to other documents.
    • Be aware of handwriting: Previous folder titling can serve as a clue for separated items and original order.
  • Item-Level Description: Item-level description can render the collection’s original order impossible to discern and greatly diminish access.
    • Gain a broad perspective of the collection in order to determine the most intelligible arrangement of materials with an awareness of grouping like with like.
    • For item-level reference materials, such as newspaper and magazine clippings, merge materials into larger subject files and include a rough date span.
    • Be cautious when merging other records, such as correspondence. Arrange materials into a loose chronological order and include in the folder title the names of recurring correspondents, if possible.
    • Make sure to account for the new arrangement in the arrangement note. Reuniting item-level materials and describing those materials at the new level of arrangement will greatly enhance access to the collection.
  • Legacy Finding Aids: It can be difficult to tell how accurate an existing finding aid is, and the decisions made on how much of it to preserve can be complicated.
    • Again, knowledge of the previous processors’ education and history with the collection will prove helpful.
    • Consider the fate of the legacy finding aid. If the collection will be entirely reprocessed, is anything in the legacy finding aid worth keeping? Should the old and new simply be linked or should parts of the old finding aid be incorporated into the new one?
    • Proofread! Anything retained from a legacy finding aid should be proofread very carefully.
    • Keep ideas of continuity in mind while creating new folder titles and dates.
    • Format can be a problem. Will the format (e.g., hardcopy only) prove problematic for import? Scanning and OCR can be a time-consuming process.
  • Collection Size and Type: Size and type of collection can have a drastic impact on processing speeds.
    • If possible, choose larger collections to economize on time and money. Multiple smaller collections require more effort than one larger one.
    • Institutional records average a faster processing speed than family or personal papers. Keep this in mind when choosing which collections to process.

OVERALL RECOMMENDATIONS:

  • Work closely with current staff; understand the history of the collection and the desired shape of its future.
  • Learn about previous processors to understand their training, background, and history with the records.
  • Edit and expand upon non-descriptive terms (e.g., miscellaneous) when possible. More detailed descriptions can assist in linking separated records back together.
  • Merge clippings and reference files together when feasible.
  • Make note of reprocessing decisions in the finding aid.
  • Proofread any reused documents or folder titles, keeping ideas of consistency in mind.
  • Be mindful of donor relationships in discussing past problems, especially in any public forum, such as a project blog.
  • Plan carefully from the outset. If possible, choose collections that best fit the project goals.
  • Remain flexible and be prepared to compromise.

FILES

"Reprocessing" poster for Society of American Archivists 2014 Annual Meeting

Poster for Society of American Archivists 2014 Annual Meeting

Processing speed by collection size graph

Average processing speed by collection size

Processing speed by collection type graph

Average processing speed by collection type

Minimal deaccessioning

Thursday, April 17th, 2014

The parameters of our Hidden Collections project generally preclude any deaccessioning efforts from being part of the process. We’re tasked with moving at a relatively swift pace – roughly twice the speed of “traditional” archival processing – which doesn’t leave much time to check whether some items could or should be removed from the collections. Additionally, as archival interlopers, fairly unfamiliar with the collections and procedures of our temporary homes, we err on the side of caution and leave the task of deaccessioning for another time and, usually, another archivist. However, I’ve found that, from time to time, some deaccessioning can take place with virtually no additional time added to the process.

Folders of publications.

A prime example of this came in the past couple of weeks with our collection at the Drexel University College of Medicine (DUCOM) Legacy Center Archives and Special Collections. At DUCOM, we are processing about 250 feet of materials in the Academic Affairs records group of Hahnemann University. This group is made up of many smaller collections of papers from administrators and faculty, as well as broader collections from academic units, assorted publications, and more. While processing each of these collections, we often noted files that we knew we had seen before and that were obvious duplicates, but due to time constraints and issues of provenance, we let that fact bother us momentarily and then moved on. But when it came to the series of publications, the rules changed a bit.

As the materials in the series came from a variety of smaller collections of publications, the aim was to file them all together, leaving issues of provenance out of the picture. And, as we decided to file them chronologically within four subseries, picking out the duplicates became quite simple during the final process of arranging and boxing. As can be seen in the accompanying pictures, duplicated publications were blatantly obvious.

Deaccessioned publications.

After a quick glance through each set of duplicates, we retained three copies of each, keeping the versions in the best condition or any annotated copies. The excess duplicates were removed from the collection and given to the main archivists, who will decide upon their ultimate fate. Though it may not seem like much in a collection of roughly 250 feet, we were able to remove over a foot of redundant material in this manner without slowing down our process. We consider this a win-win situation and recommend this idea of minimal deaccessioning when possible with future collections.

Saint Peter Claver Roman Catholic Church records

Friday, April 11th, 2014

The records of St. Peter Claver Roman Catholic Church of Philadelphia, one of the collections held at Temple University’s Special Collections Research Center, shed light on a unique aspect of Philadelphia history. The church was started in 1886 when African American Catholics in the region grew tired of the discrimination they faced at Catholic churches of the day (if they were allowed in at all). Members of three parishes united to form the Peter Claver Union with the goal of creating a “Church for Colored Catholics” in Philadelphia.

In 1889, they were officially recognized by the Archdiocese of Philadelphia, and in 1892, they moved into their new home at 12th and Lombard Streets (a former Presbyterian church). The church continued to function for almost a century until the Archdiocese suppressed it in 1985, stating that, due to the changing racial climate, a dedicated church for African Americans was no longer needed; the suppression removed the church’s parish status, as well as all of its records. From that point, the church continued to function as a community but could not offer most religious sacraments and services.

Steve processing at Temple University.

In processing this collection, one obvious drawback is the lack of most records from before 1985 (outside of the school records). Rather than focusing mainly on the administration and rituals of a church, the collection centers on the community outcry over the suppression of the parish; clippings and other subject files covering the African American community at the time; the church community’s struggle to remain vibrant in a neighborhood that had lost its African American majority; and many issues of racism (real or perceived) within the Catholic Church as a whole.

From a processing perspective, this was my favorite collection from our time at Temple, largely because it had not been previously processed. It was quite rewarding to take a box full of papers and create a logical order for the contents, rather than just relabeling folders or trying to figure out why someone had deemed certain records appropriate to folder together. This collection, though smaller than our previous ones, offered a chance to do some actual MPLP processing (a goal of this project), as well as to learn more about Philadelphia history. And while I’ll not comment on my personal views of the acts of the Catholic Church regarding St. Peter Claver’s, it is quite eye-opening to read about this time in Catholic history.

Processing up

Tuesday, April 8th, 2014

The Hebrew Sunday School Society (HSSS) collection at Temple University’s Special Collections Research Center contains roughly 35 linear feet of records that span two centuries (1802 to 2002) and document the history of the Society. HSSS was founded in 1838 by Rebecca Gratz (a Jewish philanthropist in Philadelphia and the basis for the character of Rebecca in Sir Walter Scott’s Ivanhoe) with the intention that all Jewish children could attend classes regardless of financial standing or synagogue affiliation. The collection consists of administrative records, papers and programs from school teachings and functions, some very cool artifacts (e.g., lantern slides, a large hand bell used for fire drills, books and other items originally belonging to Rebecca Gratz), and many photographs.

Hopping through the decades.

In working with the collection, my processing partner (Annalise Berdini) and I came across a somewhat frustrating issue – that of attempting to minimally process a collection that had been previously processed to a much more detailed level. This collection, which consists of no fewer than 17 different accessions, had been processed by various people and to varying levels. Additionally, a number of the more ‘eye-catching’ items had been used in an exhibit, so they had been somewhat separated from their contextual homes. Many folders were found to contain just one document, or perhaps a few. Others held a slew of records stretching back many decades, but hopscotching through the years like a child at play. It was not uncommon to find a date span such as “1877, 1882-1888, 1906, 1910-1913, 1930-1959, 1965-1985.”

Other folders seemed to summarize the entire collection, with one or two examples of each type of document from each series we’d constructed, leaving us frequently asking, “How do I label this and where does this go?” (Personally, I’m planning to petition for the word hodgepodge to be added as acceptable terminology, since miscellaneous is out of the question.) And then there were occasional appearances of spotty preservation work (though I can’t be sure when that occurred).

Spotty preservation practices.

The folder titles were sometimes helpful, but with any number of people having created the folders over those many, many accessions, they were inconsistent. Some had specific titles (some VERY specific); some were quite vague (my favorite from the collection being “Miscellaneous, etc.”). Some had dates (often inaccurate); most did not. All of this boiled down to a lot of folders needing to be refoldered, each of which had to be inspected for more accurate information, and that slowed down the process considerably. One day, I spent close to five hours making my way through just one linear foot of folders.

The takeaway from the HSSS records is that MPLP (or maximal processing, really, which is closer to what we’re doing in this project) is not suited to every collection. This collection, though not processed to our current standards, had been previously processed, and some sort of inventory did exist. As such, it was most likely not the best choice for this processing project (though we all enjoyed its content quite a bit). If a collection has already gone past minimal processing, it’s rather difficult to back that process up.