Minimal Processing


Stop and Smell the Roses

Friday, November 7th, 2014

Long ago, when it was snowing a lot and often, Alina Josan and I spent eight weeks at the Germantown Historical Society.  The majority of our time at GHS was spent with the records of the Germantown Theatre Guild, but at the outset we worked on a much smaller project, the (very) Small Manuscript Collection.

This collection was made up of a smattering of many things: full of treasures, but usually containing only one or a few records concerning any individual, family, or organization. The legacy decision had been to gather these items into a catch-all collection, and we were foldering the records and creating an inventory. As we worked our way through these disparate pieces, I remembered a question that Christiana Dobrzynski Grippe had asked during my interview for the Hidden Collections project, “Will you have any problem moving quickly through materials that interest you?” My answer was, “No problem—I’ve handled many interesting things in my career and I’ll be able to resist the urge.” Well, I’m here to report that it’s difficult to glide by some things without pausing to wonder! I’ll call it the inevitable minimal processing speed bump—you fly along, until you just have to stop and take a good look at something.

The Small Manuscripts collection had a number of speed bumps, especially since foldering meant, in many cases, item level labeling. I noticed that Alina was riveted by some materials relating to Thomas Meehan (the 19th century botanist and nurseryman), but I resisted temptation until I encountered the Billmeyer waste books. The manuscripts were dated 1795, and while very handsome in themselves, with elegant handwriting on fine paper, they were fairly mundane account books representing “the mony, goods or debts owed to me.” However, on further inspection I noticed something special: Anna Billmeyer—perhaps a daughter, or granddaughter—had co-opted one book at a later date and used it as a sketch diary. Careful sketches and notes pasted on top of the accounting records reveal this well-educated girl’s world view: several pencil sketches of “Mr. Chew’s House,” a series of watercolor vignettes, maps of the South Pole, Bolivia, and “Ethiopia Unexplored Region,” a botanical watercolor of “The Drooping Lily,” architectural renderings with simple perspective diagrams, and pressed violets are just some of the things that Anna chose to document and preserve in the recycled Waste Book.

I had to stop and admire. I had to stop and consider. I was drawn by the mixture of romance and smartness; the dreamy yet precise nature of this girl’s mind that was so clearly reflected in these drawings tucked into the back of an account book. So there you have it, no matter what your processing cruising speed or how short the deadline, sometimes the records simply deserve a pause, an extra few beats of inspection. No matter how many things you may have seen, no matter how many objects you’ve handled, you’re not immune to the allure of “The Drooping Lily.”

Giving 110%: Processing Completed!

Thursday, November 6th, 2014
Training the first cohort of processors.


I’m pleased to announce that the 12-month processing phase of our project has been completed successfully! Overall, the amazing project team efficiently processed 45 collections at 16 repositories in 50 weeks of work, totaling 1685.07 linear feet at an average speed of 3.4 hours per linear foot. We beat our goal of processing at a rate of 4 hours per linear foot, which allowed us to process 145.86 linear feet more than we originally anticipated and promised to CLIR in the same timeframe. That’s fulfilling 110% of our linear footage commitment to CLIR!
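For the curious, the arithmetic behind those figures can be sketched quickly. This is just a back-of-the-envelope check (the variable names are illustrative, and the committed footage is inferred from this post’s own numbers); the headline “110%” rounds the result up generously, in the spirit of the idiom.

```python
# Back-of-the-envelope check of the totals reported above.
# committed_feet is implied by this post's own figures
# (processed total minus the 145.86-foot surplus).
processed_feet = 1685.07
surplus_feet = 145.86
avg_hours_per_foot = 3.4    # achieved average speed
goal_hours_per_foot = 4.0   # original target rate

committed_feet = processed_feet - surplus_feet             # ~1539.21 feet
pct_of_commitment = 100 * processed_feet / committed_feet  # ~109.5%
total_hours = processed_feet * avg_hours_per_foot          # ~5729 staff hours
budgeted_hours = committed_feet * goal_hours_per_foot      # ~6157 hours planned

print(f"committed: {committed_feet:.2f} ft")
print(f"fulfilled: {pct_of_commitment:.1f}% of commitment")
print(f"actual work: ~{total_hours:.0f} h vs. ~{budgeted_hours:.0f} h budgeted")
```

The last line is the striking part: the team processed all the extra footage while still using fewer total hours than the original 4-hours-per-foot plan budgeted for the committed footage alone.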

I am immensely proud of the work that Jessica Hoffman, Annalise Berdini, Steven Duckworth, Megan Evans, Cortney Frank, Carey Hedlund, Alina Josan, Chase Markee, Amanda Mita, and Evan Peugh accomplished during our challenging and ambitious project. Please stay tuned to the blog in the coming weeks and months as I post some terrific reflections written by the team members, as well as my own project documentation, including discussing the challenges of the project, what we learned this time around, and recommendations. Be sure to check out our Flickr page for action photos of processing, too. It’s been a wonderful and productive 12 months!

 

I Love It When a Plan Comes Together!

Thursday, November 6th, 2014

One of the lessons I’ve learned during the course of this project is that often, despite your best efforts, processing will inevitably lead to snags that slow down your pace and extend processing time. When you’re aiming for 4 hours per linear foot in order to stay under the minimal processing time requirements, this can definitely cause some problems. While my partner, Steve, and I have had collections that matched the MPLP requirements closely enough to stay within that deadline, there have been times when it was a struggle to make the timeline work. Some forced us into item level processing. Some surprised us with accessions that had been completely removed from their original home or reordered for no apparent reason. These slowed down our processing time considerably.

How not to store blueprints.


But then there were the Hahnemann University Academic Affairs records at the Drexel University College of Medicine Legacy Center Archives and Special Collections. This collection has been by far the best match for the MPLP requirements at this point in the project, despite being the largest collection with which we’ve worked. It consists of 250 linear feet of Academic Affairs records from all the various iterations of Hahnemann University, including the Homeopathic College of Pennsylvania, Hahnemann University, and even a few records from its current incarnation as the Drexel University College of Medicine. The collection also came to us quite disjointed, with multiple accessions often originating from various faculty members’ offices or departments within the college, which made for a lot of overlap. Despite this small challenge, however, the records themselves were in great shape for MPLP. None had been previously processed (aside from one small “collection,” whose enterprising owner had taken all the records out of their folders and stacked them loosely in a Xerox box, destroying most of the original order). Additionally, because the records came from specific offices and departments, they were often far more consistently organized than personal papers, making it easier to find links between the contents and to figure out what certain folders contained without excessive detective work.

Because we did not have to focus on item-level processing or on re-working previously written folder titles, we were free to concentrate on carefully constructing DACS-compliant folder titles. Physical processing was also that much easier, as many of the separate “collections” were left intact and made into series or subseries. For example, Series II of this collection consists of administration and faculty records. We created subseries based on the faculty member or department from which the records came, which meant very little reorganization, since these records were already split this way.

A student from 1883 — what fine hair!


Because we spent less time on archival “detective” work, we were able to come up with some methods to streamline the process even further. My favorite of these arose when it came time to create the container list. Previously, we had done data entry first and written the scope note after all the physical arrangement had been completed. This time, we wrote the scope notes as we created our container list. It seems like common sense now: writing as we went gave us a fresher memory of what each series and subseries included, and we were able to make preservation and digitization notes along the way. It also helped us track some of the connections between series, and to double-check records that were especially unique within their series. I thought that looking over all those records again would extend the data entry process, but this time around Steve and I worked separately on different series, cutting data entry time in half and allowing us to become ‘experts’ on certain sections of the collection. This reinforced the knowledge we had already gathered while working on the collection and made creating the scope note that much easier as well.

Aside from how well suited the collection was to MPLP, Steve and I also divided our roles more efficiently this time around, splitting up data entry, re-boxing, and physical arrangement duties. Having more time in one institution was also helpful, although a variety of ‘snow days’ meant that, despite finishing about 8 weeks ahead of schedule, there were still a couple of wrenches thrown in that could have considerably stalled us had we been working with a less-ideal collection.

250 feet of beauty!


The takeaway here is that minimal processing works much better for some collections than for others. Repositories looking to get through their backlogs should carefully consider that not all collections will yield a 2-4 hour per linear foot result, regardless of how MPLP methods are applied. Previously processed collections in particular often make that result extremely difficult. If a processing archivist is given a previously processed, item-level collection with vague folder titles and no obvious original order, MPLP is probably not going to function as one might hope. However, when the right collection is chosen, the result can be a collection ready for researchers in a fraction of the time.

 

 

A challenge from the Superintendents

Wednesday, November 5th, 2014

When I first approached the Archdiocesan Superintendent of Schools records at the Philadelphia Archdiocesan Historical Research Center (PAHRC), I was concerned, to say the least.  In fact, I was panicked.  The collection, which documents the administrations of three superintendents spanning a period of thirty years, is all of 9.2 linear feet, which is small compared to most collections.  One hopes that a collection of this size can be dealt with quickly.  However, given that half of the collection consisted of loose, unsorted papers stuffed in document boxes and the other half had been processed by no fewer than 8 different LIS students at Villanova University in the 1960s, I assumed we would never meet our processing deadline.  Little did I know, at that first terrifying viewing, that the Superintendent records would be the first collection my partner and I completed in well under our 4 hours per linear foot limit.  When all was said and done, MPLP processing allowed us to transform twenty-three boxes of disarrayed records into an accessible and usable collection at a swift processing speed of 2.6 hours per linear foot.
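A small calculation (a sketch using only the figures above; the variable names are illustrative) shows just how much slack the 4-hour limit actually left on this collection:

```python
# Time budget vs. actual time for the Superintendent records,
# using the figures reported above.
collection_feet = 9.2
limit_rate = 4.0     # project ceiling, hours per linear foot
actual_rate = 2.6    # speed we achieved, hours per linear foot

budget_hours = collection_feet * limit_rate   # 36.8 hours allowed
actual_hours = collection_feet * actual_rate  # ~23.9 hours used
slack_hours = budget_hours - actual_hours     # ~12.9 hours to spare

print(f"budget {budget_hours:.1f} h, used {actual_hours:.1f} h, "
      f"spare {slack_hours:.1f} h")
```

In other words, a collection that looked like a deadline-breaker came in nearly thirteen hours under budget.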

My biggest concern while processing this collection was how my partner and I could responsibly process 9 boxes of loose, unsorted records and somehow meaningfully interfile those records into the arrangement we imposed upon the collection.  We did not have enough time to view every loose record individually.  Item-level review is not a luxury afforded to MPLP processors!  Therefore, we could not be entirely sure that we were spot-on with regard to chronology.  We also realized early on that some of those unsorted records were bound to be interfiled within the wrong series, since we were not processing at item level.  Bearing in mind Greene and Meissner’s principle of processing “good enough,” we allowed ourselves to become comfortable with the idea that a few records may be misplaced, which seemed well worth the sacrifice if it meant that the majority of those loose records would finally be given an intelligible arrangement and made accessible to researchers.  When faced with a predicament such as this, it is important to remember that whatever has been done according to the MPLP methodology can be undone.  Those potentially misplaced records would not be buried and lost forever and could very easily be repositioned according to a more refined arrangement at a future point in time!

MPLP is not a final solution.  MPLP is a step in the right direction, though not without its imperfections and limitations.  With MPLP, one should always assume slight imperfections, and collections that have been processed minimally should indeed be revisited and refined when more resources become available.  While easily corrected imperfections are a real possibility of MPLP processing, access is an absolute certainty.

SAA Student Poster Re-Cap: “Reprocessing: The Trials and Tribulations of Previously Processed Collections”

Monday, August 25th, 2014

from the poster presented at the Society of American Archivists Annual Meeting, August 2014, Washington, D.C.

by Annalise Berdini, Steven Duckworth, Jessica Hoffman, Alina Josan, Amanda Mita, & Evan Peugh; Philadelphia Area Consortium of Special Collections Libraries (PACSCL)

OVERVIEW:

PACSCL’s current project, “Uncovering Philadelphia’s Past: A Regional Solution to Revealing Hidden Collections,” will process 46 high research value collections, totaling 1,539 linear feet, from 16 Philadelphia-area institutions that document life in the region. Since the start of processing in October 2013, the team has completed 31 collections at 13 repositories, totaling over 1,225 linear feet. Plans have evolved over the course of the project due to previous processing in many collections. As the processing teams tackled the collections, the solutions devised for the various challenges they encountered developed into a helpful body of information regarding minimal processing. Future archivists and collaborators can use this knowledge to choose appropriate collections for minimal processing projects, and be prepared to handle unexpected challenges as they arise.

NOTED ISSUES:

  • Novice Archivists: Volunteers and novice archivists, while well meaning, can make simple mistakes that lead to larger problems.
    • Learn about the previous processors; their background and level of knowledge with the materials. Having a better idea of their relationship to the collection helps guide decisions in the new iteration of processing.
    • “Miscellaneous.” It is a very popular word, even with seasoned archivists. Attempts should be made to more accurately describe the contents of a folder, such as “Assorted records” or “Correspondence, assorted,” followed by examples of record types or 1 to 3 names of individuals represented.
  • Losing Original Order: Processors with good intentions can disrupt original order through poor arrangement, item-level processing, and removing items for exhibits or other purposes.
    • Use what original order remains to influence arrangement in a way that might bring separated records back together.
    • Lone items may require more detailed description to provide links back to other documents.
    • Be aware of handwriting: Previous folder titling can serve as a clue for separated items and original order.
  • Item-Level Description: Item-level description can render the collection’s original order impossible to discern and greatly diminish access.
    • Gain a broad perspective of the collection in order to determine the most intelligible arrangement of materials with an awareness of grouping like with like.
    • For item-level reference materials, such as newspaper and magazine clippings, merge materials into larger subject files and include a rough date span.
    • Be cautious when merging other records, such as correspondence. Arrange materials into a loose chronological order and include in the folder title the names of recurring correspondents, if possible.
    • Make sure to account for the new arrangement in one’s arrangement note. Reuniting item-level materials and describing those materials to the new level of arrangement will greatly enhance access to the collection.
  • Legacy Finding Aids: It can be difficult to tell how accurate an existing finding aid is, and the decisions made on how much of it to preserve can be complicated.
    • Again, knowledge of the previous processors’ education and history with the collection will prove helpful.
    • Consider the fate of the legacy finding aid. If the collection will be entirely reprocessed, is anything in the legacy finding aid worth keeping? Should the old and new simply be linked or should parts of the old finding aid be incorporated into the new one?
    • Proofread! Anything retained from a legacy finding aid should be proofread very carefully.
    • Keep ideas of continuity in mind while creating new folder titles and dates.
    • Format can be a problem. Will the format (e.g., hardcopy only) prove problematic for import? Scanning and OCR can be a time-consuming process.
  • Collection Size and Type: Size and type of collection can have a drastic impact on processing speeds.
    • If possible, choose larger collections to economize on time and money. Multiple smaller collections require more effort than one larger one.
    • Institutional records average a faster processing speed than family or personal papers. Keep this in mind when choosing which collections to process.

OVERALL RECOMMENDATIONS

  • Work closely with current staff; understand the history of the collection and the desired shape of its future.
  • Learn about previous processors to understand their training, background, and history with the records.
  • Edit and expand upon non-descriptive terms (e.g., miscellaneous) when possible. More detailed descriptions can assist in linking separated records back together.
  • Merge clippings and reference files together when feasible.
  • Make note of reprocessing decisions in the finding aid.
  • Proofread any reused documents or folder titles, keeping ideas of consistency in mind.
  • Be mindful of donor relationships in discussing past problems, especially in any public forum, such as a project blog.
  • Plan carefully from the outset. If possible, choose collections that best fit the project goals.
  • Remain flexible and be prepared to compromise.

FILES

"Reprocessing" poster for Society of American Archivists 2014 Annual Meeting



Average processing speed by collection size


Average processing speed by collection type

More Pragmatism, Less Protocol

Wednesday, May 28th, 2014


More Product, Less Process is a great way to put our archival workflow into perspective. Some tasks do not require a lot of detailed attention, and a cursory run-through along with some healthy description should suffice to make archival materials accessible. Other tasks may require a bit more work, but in the grand scheme of things will not suffer from the prioritization of other duties. MPLP is ideally suited to these types of situations, but when one is confronted with a stack of papers with no obvious relationship or readily determined content, more work is necessary. So what do we do when a collection contains records of both types?

Such was the case with the Martin I. J. Griffin Collection at the Philadelphia Archdiocesan Historical Research Center. Griffin was a Catholic historian during the late 19th and early 20th centuries, and thus left some interesting research files. Some of these files easily fit into categories and can thus be minimally processed, but others are almost unusable without further item-level description and conservation. Examples include scrolls of brittle paper and assorted research files, all written in Griffin’s cryptic handwriting; these materials cannot be described further within the time constraints of MPLP. We thus have a collection that is mostly processed, but I cannot call it complete until the miscellanea is dealt with.

Of course, we cannot simply abandon our working model every time we come across materials that are not suited to MPLP; we must press on! But in retrospect, what is best for such a collection is a synthesis of MPLP and standard item-level processing. Since there are two types of needs for these hybrid collections, we should use a hybrid working model. This type of synthesis does not come naturally in an administrative environment, however, since schedules are often designed around predictable processing rates.

Where does this leave us? I’m not sure, but I know that we must approach collections pragmatically, and address each collection’s specific need. MPLP, traditional processing, or both, we need to use whatever method is appropriate. This does not mean that processing projects will necessarily be designed to accommodate such circumstantial decision making. Nevertheless, within the confines of established procedure we can certainly try our hardest to act in the best interests of collections and vocalize our dissatisfaction when this proves insufficient.


Minimal deaccessioning

Thursday, April 17th, 2014

The parameters of our Hidden Collections project generally preclude any deaccessioning efforts from being part of the process. We’re tasked with moving at a relatively swift pace – roughly twice the speed of “traditional” archival processing – and this doesn’t leave a lot of time to check whether some items could or should be removed from the collections. Additionally, as archival interlopers fairly unfamiliar with the collections and procedures of our temporary homes, we err on the side of caution and leave the task of deaccessioning for another time and, usually, another archivist. However, I’ve found that, from time to time, some deaccessioning can take place with virtually no additional time taken for the process.

Folders of publications.


A prime example of this came in the past couple of weeks with our collection at the Drexel University College of Medicine (DUCOM) Legacy Center Archives and Special Collections. At DUCOM, we are processing about 250 feet of materials in the Academic Affairs records group of Hahnemann University. This group is made up of many smaller collections of papers from administrators and faculty, as well as broader collections from academic units, assorted publications, and more. While processing each of these collections, we often noted files that we knew we had seen before and were obviously duplications, but due to time constraints and issues of provenance, we let this fact bother us momentarily and then moved on. But when it came to the series of publications, the rules changed a bit.

As the materials in the series came from a variety of smaller collections of publications, the aim was to file them all together, leaving issues of provenance out of the picture. And, as we decided to file them chronologically within four subseries, picking out the duplicates became quite simple during the final process of arranging and boxing. As can be seen in the accompanying pictures, duplicated publications were blatantly obvious.

Deaccessioned publications.


After a quick glance through each set of duplicates, we retained three copies of each, keeping the versions in the best condition and any annotated copies. The excess duplicates were removed from the collection and given to the main archivists, who will decide upon their ultimate fate. Though it may not seem like much in a collection of roughly 250 feet, we were able to remove over a foot of redundant material in this manner without slowing down our process. We consider this a win-win situation and recommend using this idea of minimal deaccessioning when possible with future collections.

“Two Gun” Bessie and the case for better folder titles

Monday, April 14th, 2014

One of the issues with working with a legacy finding aid is that previous descriptions can easily fall short. Such is the case with the MOLLUS collection, and we tried to go back through folders with unclear titles to fix this problem. One such folder, titled “Front, 1941”, provides an excellent example of why accurate folder description is important.

Jessica and Evan process MOLLUS.


Upon further inspection, “Front, 1941” turned out to contain a series of newspaper clippings related to the sudden resignation of Dr. Bessie Burchett. Dr. Burchett, known as “Two Gun Bessie” for her tendency to carry two pistols to defend herself, was a Latin teacher at West Philadelphia High School who strongly opposed communism. She even wrote a book on the communist infiltration of American schools: Education for Destruction. In fact, Burchett’s opposition ran so deep that she was eventually revealed to have Nazi sympathies. When news of her political extremism broke, there was a cry of public outrage against her, and rather than awaiting her inevitable dismissal, Burchett elected to resign.

The case of Dr. Bessie Burchett provides an interesting snapshot of Philadelphia and the United States during an era of extreme political movements. But a researcher coming across the title “Front, 1941” could never be aware of the treasures inside without opening the folder, because the title provides so little useful information.

Processed MOLLUS at home in the new Union League secure vault.


This means that an archivist must choose between properly titled folders and item-level description, and when using MPLP the latter is out of the question. Folder titles should thus properly identify contents, and it is important to consider those titles conscientiously. For “Front, 1941” we had some difficulty coming up with a title that adequately captured the contents, but after a while we settled on “’Front:’ Clippings regarding Philadelphia school teacher Bessie Burchett, especially regarding anti-communism and Nazi sympathy, 1941”. This title is a much more accurate description of the folder contents.

So much for this folder, but how many other folders are out there that fail to describe their contents? How many more stories like Dr. Burchett’s are hiding in the crevices of archives, waiting to be discovered?

Saint Peter Claver Roman Catholic Church records

Friday, April 11th, 2014

The records of St. Peter Claver Roman Catholic Church of Philadelphia, one of the collections held at Temple University’s Special Collections Research Center, shed light on a unique aspect of Philadelphia history. The church was started in 1886, when African American Catholics in the region grew tired of the discrimination they faced at the Catholic churches of the day (if they were allowed in at all). Members of three parishes united to form the Peter Claver Union with the goal of creating a “Church for Colored Catholics” in Philadelphia.

In 1889, the congregation was officially recognized by the Archdiocese of Philadelphia, and in 1892, it moved into its new home at 12th and Lombard Streets (a former Presbyterian church). The church continued to function for almost a century until the Archdiocese suppressed it in 1985, stating that, due to the changing racial climate, a dedicated church for African Americans was no longer needed; the suppression removed the church’s parish status, as well as all of its records. From that point, the church continued to function as a community, but could not offer most religious sacraments and services.

Steve processing at Temple University.


One obvious drawback in processing this collection is the absence of most records from before 1985 (outside of the school records). Rather than finding records focused mainly on the administration and rituals of a church, this collection’s focus lies in the community outcry over the suppression of the parish, clippings and other subject files covering the African American community at the time, the church community’s struggle to remain vibrant in a neighborhood that had lost its African American majority, and many issues of racism (real or perceived) within the Catholic Church as a whole.

From a processing perspective, this was my favorite collection from our time at Temple, largely because it had not been previously processed. It was quite rewarding to take a box full of papers and create a logical order for the contents, rather than just relabeling folders or trying to figure out why someone had deemed certain records appropriate to folder together. This collection, though smaller than our previous ones, offered a chance to do some actual MPLP processing (a goal of this project), as well as to learn more about Philadelphia history. And while I’ll not comment on my personal views of the acts of the Catholic Church regarding St. Peter Claver’s, it is quite eye-opening to read about this time in Catholic history.

Processing up

Tuesday, April 8th, 2014

The Hebrew Sunday School Society (HSSS) collection at Temple University’s Special Collections Research Center contains roughly 35 linear feet of records that span two centuries (1802 to 2002) and document the history of the Society. HSSS was founded in 1838 by Rebecca Gratz (a Jewish philanthropist in Philadelphia and the basis for the character of Rebecca in Sir Walter Scott’s Ivanhoe) with the intention that all Jewish children could attend classes regardless of financial standing or synagogue affiliation. The collection consists of administrative records, papers and programs from school teachings and functions, some very cool artifacts (e.g., lantern slides, a large hand bell used for fire drills, books and other items originally belonging to Rebecca Gratz), and many photographs.

Hopping through the decades.


In working with the collection, my processing partner (Annalise Berdini) and I came across a somewhat frustrating issue: that of attempting to minimally process a collection that had been previously processed to a much more detailed level. This collection, which consists of no fewer than 17 different accessions, had been processed by various people, and to varying levels. Additionally, a number of the more ‘eye-catching’ items had been used in an exhibit, so they had been somewhat separated from their contextual homes. Many folders were found to contain just one document, or perhaps a few. Others had a slew of records stretching back many decades, but hopscotching through the years like a child at play. It was not uncommon to find a date span such as “1877, 1882-1888, 1906, 1910-1913, 1930-1959, 1965-1985.”
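Those hopscotch spans can at least be generated mechanically once the folder dates are known. Here is a minimal sketch (the function name and approach are illustrative, not any standard archival tool) that collapses a set of years into a discontinuous span string of the kind quoted above:

```python
def date_span(years):
    """Collapse years into a finding-aid-style discontinuous span,
    e.g. {1877, 1882, 1883, 1884} -> '1877, 1882-1884'."""
    years = sorted(set(years))
    if not years:
        return ""
    runs = []
    start = prev = years[0]
    for y in years[1:]:
        if y == prev + 1:           # extend the current consecutive run
            prev = y
            continue
        runs.append((start, prev))  # close the run at a gap
        start = prev = y
    runs.append((start, prev))
    return ", ".join(str(a) if a == b else f"{a}-{b}" for a, b in runs)

# The example span quoted above, reconstructed from its years:
years = ([1877] + list(range(1882, 1889)) + [1906] + list(range(1910, 1914))
         + list(range(1930, 1960)) + list(range(1965, 1986)))
print(date_span(years))  # -> 1877, 1882-1888, 1906, 1910-1913, 1930-1959, 1965-1985
```

The upside of generating spans this way is consistency: every folder gets the same notation, however scattered its contents.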

Other folders seemed to be making a summary of the entire collection, with one or two examples of each type of document from each series we’d constructed, leaving us frequently asking, “How do I label this and where does this go?” (Personally, I’m planning to petition for the word hodgepodge to be added as acceptable terminology since miscellaneous is out of the question.) And then there were the occasional appearances of spotty preservation work (though I can’t be sure when that occurred).

Spotty preservation practices.


The folder titles were sometimes helpful, but with any number of people having created the folders over those many, many accessions, they were inconsistent. Some had specific titles (some VERY specific); some were quite vague (my favorite from the collection being “Miscellaneous, etc.”). Some had dates (often inaccurate); most did not. This all boiled down to a lot of folders needing to be refoldered, each inspected for more accurate information, which slowed the process considerably. One day, I spent close to five hours making my way through just one linear foot of folders.

The takeaway from the HSSS records is that MPLP (or maximal processing, really, which is closer to what we’re doing in this project) is not suited to every collection. This collection, though not done to our current standards, had been previously processed, and some sort of inventory did exist. As such, it was most likely not the best choice for this processing project (though we all enjoyed the content of the collection quite a bit). Once a collection has already gone past minimal processing, it’s rather difficult to back that process up.