Archivist/Digital Resource Manager
Dear Lone Arrangers,
I follow our section’s listserv pretty religiously, and can attest to the fecundity and variety of the ongoing conversations (just take a look, for example, at our summary of Meg Miner’s post about policy language for access and reuse of analog materials in the archives in the current issue!). That being said, I also spy on a few other section lists, and in early January read an excellent post on the Manuscript Repositories Section Discussion List by Jane Gorjevsky, the Digital Assets Archivist at the Rare Book and Manuscript Library at Columbia University. Jane offered this thought-provoking problem regarding unprocessed collections:
“I am trying to compare the policies of different archives and manuscript repositories on exposing their collections that have not been processed (even minimally).
I am specifically interested whether your repository
a) creates public basic collection-level records upon accession
b) publishes a list of unprocessed collections on their websites
c) provides publicly accessible information about their existing unprocessed collections in any other way (please specify).”
Jane received 15 replies in total: 11 from institutions of higher education, 2 from state archives, and 2 from public libraries. She found that the overwhelming majority of respondents (11 of 13) provide, or intend to provide in the near future, publicly accessible information online via (a) their OPAC or (b) their website. This number includes all but one of the institutions of higher education; in that one case, the respondent indicated that they are planning to make their unprocessed collections more visible to researchers. Jane concluded that, in general, the “accessioning-as-processing” approach, and allowing access to unprocessed collections whenever practical and legally permitted, enjoys great popularity.
This got me wondering: How similarly are we Lone Arrangers doing things? Do we face unique challenges in providing access to unprocessed materials?
So, with Jane’s blessing, I would like to pose these same questions to the Lone Arranger community. I believe the encompassing issues affect archivists in a diversity of settings, and are especially salient for those of us working alone or in very small staff situations.
Please reply to this post in the comments section, as to whether your institution employs any of the following approaches to exposing unprocessed materials:
a) creates public basic collection-level records upon accession
b) publishes a list of unprocessed collections on their websites
c) provides publicly accessible information about their existing unprocessed collections in any other way (please specify).
Further, I would encourage respondents to this informal survey to highlight challenges unprocessed collections pose for lone arrangers specifically. Is it harder to respond to research interest in unprocessed collections with limited staff? Does your institution employ MPLP or another approach for on-demand access to unprocessed collections? Do researchers typically appreciate having access to unorganized materials? Has your institution pursued funding for processing specific closed or backlogged collections? Please comment below!
Archivist & Records Manager
Public Design Commission of the City of New York
As a “lone arranger,” I’m hesitant to provide tips and tricks for other lone arrangers. As we know, we each face challenges as varied and unique as our own archival collections. Our collections don’t always follow the rules, and we aren’t always able to follow professional “best practices” due to staffing or budget concerns. In my case, I was lucky to take up the reins in an archive that didn’t require a systematic structural or organizational overhaul. Moreover, I was lucky to join an agency that values its archival collection and recognizes its unique historical picture of how the public landscape of New York City has evolved over time.
The Public Design Commission was established as the Municipal Art Commission by the New York City Charter in 1898. The Commission was tasked with the oversight of all public artworks and monuments, but its scope quickly expanded to include public structures and open spaces. In 2008, the agency was renamed the Public Design Commission to better reflect its mission. The Commission reviews permanent works of architecture, landscape architecture, and art proposed on or over City-owned property. The Commission comprises 11 members and includes an architect, landscape architect, painter, sculptor, and three lay members, as well as representatives of the Brooklyn Museum, Metropolitan Museum of Art, New York Public Library, and the Mayor. The Commission also acts as caretaker and curator of the City’s public art collection, which is located throughout the city’s public buildings and open spaces. The Design Commission is a decidedly small city agency, with only 6 full time staff members.
The Commission maintains an extensive archive of projects reviewed by the Commission since 1902, documenting more than 7,000 sites throughout New York City and including tens of thousands of individual project records. The archive contains approximately 2,100 linear feet of records, including original documents, drawings, photographs, and architectural plans, and continues to grow by approximately 1-2 linear feet per month. The archive informs the Commission’s review of current projects and provides a valuable resource to researchers. In addition, the archive holds special collections that were acquired as reference material by Commission members and staff over our 119-year history.
Though the Commission functioned without an archivist for 115 years, staff thankfully took great care to maintain a common-sense filing system and preserve the records as best they could. In fact, the filing system used today dates back to 1902. Unlike many archives, this archive is very much alive. Everything submitted to and approved by the Commission is considered an active and permanent record. Our record series, each defined as a single location (for example, a building, park, or public artwork), continue to grow as new projects are proposed and approved at new and existing sites. Each location is assigned a series number. Each document that was reviewed and approved by the Commission for a public project is assigned a letter. Aside from oversized bound or rolled architectural drawings, everything can be logically located by its series number. The downside of this system is that the collection physically expands from within, requiring periodic shifting of boxes to create room to grow.
Public Design Commission records are legal public documents and are available by appointment to the public upon request on a first-come, first-served basis. I usually handle approximately 2 research requests per week either remotely or by appointment in the archive. The archive handles research requests from three distinct groups: staff, city agencies, and the public. Approximately 80-100 project records are requested by staff each month as reference material for new city projects. City agency staff also reviews our records for the same purpose, often filling in gaps in their own recordkeeping. Lastly, our archive supports research by outside architects, artists and designers, students, historians, and citizens interested in public projects.
The Design Commission hired its first archivist (yours truly) in 2013 to oversee and maintain archival records and provide research services for staff, other city agencies, and the public. Being the first archivist at an institution can be overwhelming, but thankfully I was brought into an agency that historically loved and maintained their archive. Therefore, from the outset, I was able to focus on promoting, preserving, and making the archive more inviting to researchers, instead of reinventing the file system wheel. In 2013, the Commission launched a long-term preservation project to digitize the oldest and most fragile materials in the collection, increasing staff and public access to these historic documents while preserving the originals. As of 2017, we have digitized over 16,000 individual documents with the help of staff, interns and two digitization grants awarded by the New York State Archives in 2014 and 2016. These records are available to the public upon request and are periodically posted on our Flickr, Tumblr, and Twitter pages. All digitized material is linked to our database and available remotely to staff. This digitization project has significantly reduced the handling of our oldest records which are still actively reviewed by staff.
The digitization project easily lent itself to promoting the archive and inviting researchers to use our records. By improving our public face on the Design Commission website and on social media, we created a more open and user friendly environment. In 2013 we received only 34 research requests, but by 2016 we received 87 research requests.
I continue to look for ways to promote the archive, including providing after hours tours. We’ve recently added a public portal for archive tours every other month and hope to expand to offer tours for city students. As a lone arranger, I hope to continue finding ways to highlight and disseminate the Commission’s singular holdings, a goal I imagine many of us share for our own unique materials.
I am a lone archivist/digital resource manager at Artifex Press, a New York City-based company dedicated to publishing digital catalogues raisonnés. I work with digital and analogue artwork photography, as well as a growing collection of digital audiovisual materials, and am in charge of administering a digital asset management system (DAMS), Extensis Portfolio; embedding metadata in digital photos; and digitizing, editing, and color correcting physical film photography, using a Microtek flatbed scanner and Adobe Photoshop/Bridge. Our digital assets encompass the intellectual property of artists, photographers, galleries, museums, etc., so I am also the copyright point person. Furthermore, I’m the de facto IT lead, so I also manage the company file server and back up all server data to Fuji LTO tapes (daily and monthly). I work on multiple photography digitization projects simultaneously, correlating to our several published and unpublished digital catalogues raisonnés, including Chuck Close: Paintings, 1967-Present; Jim Dine: Sculpture, 1983-Present; and Tim Hawkinson. Our published catalogues can be accessed through a free, limited-time subscription via the Artifex website, which is the public-facing, final product.
A Catalogue Raisonné?
A catalogue raisonné is the definitive, comprehensive, and annotated compilation of all the known works of art of an artist. Catalogues raisonnés are critical tools for researching the provenance, attribution, and exhibition and literature histories of an artist’s body of work. The information in a catalogue raisonné is constantly in flux, and conventional printed catalogues cannot achieve both completeness and accuracy. Digital catalogues raisonnés afford instantaneous editing and modification, and thus are more accurate and up-to-date than their traditional counterparts.
Faithful visual representation of artworks is crucial to maintain this accuracy, and to the overall production of a digital catalogue raisonné. In the process of creating its digital catalogues, Artifex provides access to visual resources associated with an artist’s body of work by centralizing artwork photography from disparate sources in individual artwork records. Therefore, Artifex effectively manages the intellectual property of various artists, museums, galleries, and photographers. Cataloging visual materials for the catalogue raisonné creates a number of challenges for description, including discerning the layered intellectual property rights (i.e. copyright) inherent in artwork photography.
Metadata is a set of data that describes and gives information about other data. Embedded photo metadata stays within an image file, and allows this information to be transferred with the image in a way that can be understood by other software and hardware.
Embedding metadata in digital visual materials as they are acquired ensures:
The correct copyright holders are credited, and this info is retained in the files themselves
The need for multiple spreadsheets/documents to fathom what an asset is and who the copyright holders are is minimized; embedded metadata centralizes this crucial info
Artifex staff can locate materials after ongoing use has ceased
Metadata is harnessed by a searchable digital asset management system, with less need to rely solely on file names for retrieval
Robust descriptive information is captured and retained for future projects
Artifex Press uses the IPTC Core metadata standard to describe and catalogue digital visual resources, due to its universal acceptance among a number of industries, including news agencies, photographers, libraries, and museums. The IPTC Core standard provides structured metadata fields that enable archivists to embed accurate data about images in the files themselves. This systematizes the way information is stored and transferred between images and institutions.
At Artifex, I catalogue born-digital visual materials at the item level to facilitate better searchability via our DAMS, Extensis Portfolio. For our multifaceted analogue collections, I use a hybrid item/collection-level cataloging approach. I discuss both methods below.
Chuck Close: Item-Level Cataloging
I often receive born-digital photos from various institutions with little-to-no metadata embedded. For example, the visual artist Chuck Close exhibited his new paintings at Pace Gallery in New York in the Fall of 2015. I received the above photos (Figure 1) from Pace, but with very little info embedded, save for a time stamp and camera make/model (Figure 2). Fortunately, I was able to use Adobe Bridge’s metadata templates to batch apply metadata values that all photos in this particular group have in common, such as creator, pictured exhibition, title, credit line, copyright info, city, state, and country. In this instance, I created and applied a metadata template I call “Chuck Close Pace Install,” which instantly fills the IPTC Core fields with general values I’ve set, allowing me to quickly embed the data all images have in common (see Figures 3 and 4). So, in a few keystrokes I’ve ensured intellectual control of a batch of born-digital files that originally carried very little embedded info. Time permitting, I then embed artwork- and photo-specific metadata in individual photos in the batch, to augment the number of search results for specific works via our DAMS (Figures 5 and 6). For larger accessions of digital images, applying a metadata template preserves at least a modicum of common descriptive info at accession, allowing for more granular cataloging down the road.
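Template-based batch embedding can also be scripted. Here is a rough sketch that builds ExifTool command lines from a shared template dictionary; the template values and file name are hypothetical, and the field names follow ExifTool’s IPTC tag names rather than Adobe Bridge’s field labels. It assumes the ExifTool CLI is installed if you actually run the commands.

```python
# Shared IPTC values every photo in a batch has in common (illustrative values only)
TEMPLATE = {
    "By-line": "Pace Gallery",                      # hypothetical creator credit
    "CopyrightNotice": "(c) Pace Gallery",          # hypothetical rights statement
    "City": "New York",
    "Province-State": "New York",
    "Country-PrimaryLocationName": "United States",
}

def exiftool_args(template, image_path):
    """Build the exiftool command line that embeds the template values in one image."""
    args = ["exiftool", "-overwrite_original"]
    for field, value in template.items():
        args.append(f"-IPTC:{field}={value}")
    args.append(image_path)
    return args

# To apply for real, loop over the batch:
#   import subprocess
#   for path in image_paths:
#       subprocess.run(exiftool_args(TEMPLATE, path), check=True)
```

Item-specific values (title, depicted work, etc.) can then be embedded per file in a second pass, mirroring the template-then-refine approach described above.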
Sol LeWitt Studio Collection: A Hybrid Approach
In contrast, I’ve described our multifaceted analog collections at the collection level, as item-level cataloging for these materials would prove too time consuming. Often I am digitizing photography from our analog collections – primarily the Sol LeWitt Studio Collection – and then creating item-level metadata for these materials at the time of digitization.
Sol LeWitt, a progenitor of the Conceptual Art movement, created a numerical series of ephemeral works he called Wall Drawings, and Artifex Press is conducting ongoing research towards the compilation of the LeWitt Wall Drawing catalogue raisonné. The LeWitt Studio Collection contains similarly numbered boxes of photos related to his Wall Drawings, from which the LeWitt team draws for research. I’ve created a traditional archival finding aid for the LeWitt Studio Collection, which provides a general (collection-level) description to ensure a minimum amount of intellectual control of the enclosed materials.
I devised a digitization workflow with the Sol LeWitt research team that combines digitization and research priorities:
The LeWitt research team conducts continuous research on LeWitt’s Wall Drawings, using the Studio Collection (I have the benefit of content experts supplying reliable metadata).
The LeWitt team selects (prioritizes) materials from the Studio Collection to digitize for publication.
The LeWitt team simultaneously discerns the copyright holders (i.e., the Estate of Sol LeWitt and the contributing institution) of selected materials.
I employ a Microtek flatbed film/transparency scanner to digitize the selected Wall Drawing photography, creating an unprocessed master TIFF file.
After digitization, I embed item-level IPTC Core metadata in the unprocessed TIFF file.
I create a copy of the unprocessed image, color correct and touch it up, and file it as a second, processed master TIFF file.
Artifex retains two master TIFF files for digitized visual materials, in case originals need to be consulted. Processed images are copies of raw scans (which retain the embedded metadata), with color correction and editing executed in Adobe Photoshop. Artifex Press attempts to represent accurate color of depicted artworks, to further enrich the digital catalogue raisonné.
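The dual-master arrangement might be modeled, as a minimal sketch with hypothetical file naming, like this: each scan yields an unprocessed (“raw”) master and a processed master, and a checksum of the raw file is recorded so the untouched original can be verified later.

```python
import hashlib
from pathlib import Path

def register_scan(raw_tiff_bytes, drawing_id, masters_dir):
    """File the raw master, note where the processed copy will live,
    and record a fixity checksum for the untouched original.
    Naming scheme here is illustrative, not Artifex's actual convention."""
    masters = Path(masters_dir)
    raw_path = masters / f"{drawing_id}_raw.tif"           # unprocessed master
    processed_path = masters / f"{drawing_id}_master.tif"  # edited copy (made in Photoshop)
    raw_path.write_bytes(raw_tiff_bytes)
    return {
        "id": drawing_id,
        "raw": str(raw_path),
        "processed": str(processed_path),
        "raw_md5": hashlib.md5(raw_tiff_bytes).hexdigest(),
    }
```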
I owe a huge debt of gratitude to the online Lone Arranger community for helping me fathom some of the above solutions over the last couple years. I hope my account of Artifex’s digital workflow can similarly assist other Lone Arrangers in their necessarily challenging and multifaceted roles.
Society of Mary (Marist Fathers and Brothers) in New Zealand
As we know, context is everything, so here’s mine:
The New Zealand Province of the Society of Mary – Marist Fathers and Brothers brought Roman Catholicism to NZ in 1838. The New Zealand Archives have been in existence since the 1960s, with a professional archivist employed since 2005. We are an in-house archives that cares for the records documenting the congregation’s temporal and spiritual affairs, as well as collecting the papers of members. The archives is open to external researchers of any religious belief.
The Archives’ operational budget (excluding capital expenditure and my salary) is approximately $15,000US/year.
In 2015 there was a reduction in staffing from 3 to 1 part-time archivist. I have a sole charge position reporting to a committee of a Provincial Council representative and an outside consultant.
To date the Archives has been a paper-based operation. Former staff members viewed born-digital material as artefacts – since we have the carrier, we have the material. The earliest born-digital material was transferred onto floppies in 2001.
In fact, the impetus for me to take action was seeing a naked hard drive arrive, be put on a shelf, and then being told, “You can deal with it in 10 years”! Receiving born-digital material this way is a growth area with an ever-aging congregation – in 2013 there was one naked drive; now there are 5! I received a scholarship in 2014 from the Ian McLean Wards Memorial Trust to work out how to start managing this.
The specific born-digital files entering the Archives, as opposed to whole drives, are photos, radio broadcasts, and documents.
At DigCCurr Professional Institute, Nancy McGovern posed the question “What is good enough in your situation?” So my focus has been to keep my processes simple and manageable. As far as the born-digital sphere goes, I’m setting everything up.
My workflows may appear very simple and basic, but I have demonstrated that it is possible for very small institutions to take charge of their born-digital material.
Before taking action, I needed to check that my storage was adequate. For what I have processed to date, yes. The server is replicated to a separate geographic location. We also back up to an external hard drive.
Digital acquisitions are kept in a separate folder on the network and a spreadsheet holds the metadata.
Digital forensics equipment
To follow the digital preservation principle of not creating any irreversible changes to the data, I use write-blockers.
A write blocker is a mechanism that prevents anything from being written to the media. There are software and hardware write blockers. I use hardware ones – for hard drives I have a Wiebetech Forensic Ultradock v5, and for USBs a Wiebetech inline USB write blocker.
To work with born-digital material on a carrier, I use an off-network laptop, loaded with the open-source tools introduced at the SAA Digital Forensics: Advanced course. The two programs used in particular are FTK Imager and BitCurator. Another useful tool is Droid from The National Archives (UK). In addition to the write blockers, I also use a USB floppy drive and an external 2TB hard drive. This equipment set-up cost approximately $1500US. Setting up a desktop as a processing station would bring the cost down.
The Archives has a very narrow collection scope and limited resources so all items need appraising.
For legacy material – digital forensic practices
To follow the same principle of not creating any irreversible changes to the data, I apply practices adopted from digital forensics, as used in e-discovery and law enforcement.
One practice is to create a disk image, i.e., an exact replica of the contents of the source medium – the data, structure, and size of the original media contained in a single file. I have tried both FTK Imager and BitCurator; each program has its own set of strengths.
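For intuition, the core of disk imaging (what imaging tools do, minus the forensic packaging and metadata) is a byte-for-byte copy verified by a checksum. Below is a bare-bones Python sketch, using an ordinary file path to stand in for a source device such as /dev/sdb; it is an illustration of the concept, not a substitute for FTK Imager or BitCurator.

```python
import hashlib

def make_image(source_path, image_path, chunk_size=1024 * 1024):
    """Copy the source to an image file byte-for-byte, reading in chunks,
    and return an MD5 of the data so the image can be verified later."""
    md5 = hashlib.md5()
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dst.write(chunk)
            md5.update(chunk)
    return md5.hexdigest()
```

Real imaging tools additionally capture unallocated space, record acquisition metadata, and embed the hash in the image container, which is why they remain the right tool for the job.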
Make working copies.
Forensic disk images are used in the first instance to appraise contents. The rule of thumb for me is not to retain the images, just the items selected from the appraisal.
Appraise by viewing in a hex editor, using the character area. I check the text content to see if it is worth looking at further. If a file does not have an extension (e.g. .docx), it can be difficult for a computer to open; using the hex editor means I can look for a file signature that indicates the file format. I prefer the hex editor in FTK Imager.
Generate Reports with BitCurator. These reports are used to check for personally identifiable information. A disk image is needed for this step.
Extract files selected for retention from the disk image.
Run an anti-virus check over selected files (I update the laptop before using it off-network).
If required, files are normalized, that is, a copy is created in a preservation format. The original format is also retained.
AVPreserve’s Exactly is used to bag and transfer the selected material to the network. This provides checksums for fixity.
Record information in spreadsheet.
Monitor the files for fixity by using AVPreserve’s Fixity.
When it has already been agreed that we will accept the files, there is no need to appraise them, as the donor has already informed us of their contents.
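The fixity side of the workflow above (checksums recorded at transfer, then periodically re-verified, in the spirit of Exactly and Fixity) can be sketched in a few lines. This is a minimal illustration of the checksum-manifest idea, not the actual format those tools use.

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Hash a file in chunks so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record_manifest(folder):
    """Checksum every file under folder at transfer time (path -> sha256)."""
    return {str(p): sha256_of(p) for p in sorted(Path(folder).rglob("*")) if p.is_file()}

def check_fixity(manifest):
    """Return the paths whose current checksum no longer matches the manifest
    (including files that have gone missing)."""
    return [path for path, digest in manifest.items()
            if not Path(path).is_file() or sha256_of(path) != digest]
```

Running the check on a schedule, and storing the manifest somewhere other than the folder it describes, gives a small institution early warning of silent corruption.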
For modern born-digital material on a physical carrier, and for born-digital material transferred via the internet
I use a write blocker to transfer files from the physical carrier, or Exactly when receiving files over the internet. Then the same workflow as for legacy media is followed, from the virus scan onwards.