Planet Code4Lib

ePADD / FOSS4Lib Updated Packages

Last updated July 3, 2015. Created by Peter Murray on July 3, 2015.

ePADD is a software package developed by Stanford University's Special Collections & University Archives that supports archival processes around the appraisal, ingest, processing, discovery, and delivery of email archives.

The software comprises four modules:

Appraisal: Allows donors, dealers, and curators to easily gather and review email archives prior to transferring those files to an archival repository.

Processing: Provides archivists with the means to arrange and describe email archives.

Discovery: Provides the tools for repositories to remotely share a redacted view of their email archives with users through a web server discovery environment. (Note that this module is downloaded separately).

Delivery: Enables archival repositories to provide moderated full-text access to unrestricted email archives within a reading room environment.

Amazon Crawl: part if / Open Library Data Additions

Part if of Amazon crawl.

This item belongs to: data/ol_data.

This item has files of the following types: Data, Metadata, Text

Preserving the Star-Spangled Banner / DPLA

The tune of the “Star-Spangled Banner” is one that will be played at picnics, fireworks displays, and other Fourth of July celebrations across the country this weekend. But the “broad stripes and bright stars” of the original flag that flew over Fort McHenry in 1814–inspiring Francis Scott Key to pen the iconic poem–have required some refreshing over the years. While recent conservation efforts have made the flag a centerpiece of the Smithsonian’s climate-controlled Flag Hall at the National Museum of American History, that wasn’t the only big upkeep project on the flag. Here’s the story behind the 1914 conservation effort spearheaded by a talented embroidery teacher to bring new life to an American icon.

Women at work repairing the Star-Spangled Banner, 1914. Courtesy of The New York Public Library.

The Star-Spangled Banner first came to the Smithsonian in 1907 and was formally gifted a few years later by the family of Lieutenant Colonel George Armistead. By the time it came to the museum, the flag had seen significant damage. In addition to the battle it survived at Fort McHenry, pieces of the flag had been given out as mementos by Armistead’s family to friends, war veterans, and politicians (legend has it even to Abraham Lincoln, though his rumored piece has never been found).

The original “Star-Spangled Banner.” Courtesy of The New York Public Library.

By the time the Smithsonian’s first conservation efforts began, the flag itself was 100 years old and in fragile condition. In 1914, the Smithsonian brought on embroidery teacher and professional flag restorer Amelia Fowler (who had experience fixing historic flags at the US Naval Academy) to undertake the Star-Spangled Banner project. Fowler, alongside her team of ten needlewomen, spent eight weeks in the humid early summer restoring the flag. The team took off a canvas backing that had been attached in the 1870s, when the flag was displayed at the Boston Navy Yard. Fowler attached a new linen backing, with approximately 1,700,000 stitches, in a unique honeycomb pattern–a preservation technique Fowler herself patented. For the project, Fowler was paid $500 and her team split an additional $500. The newly preserved flag was on display for the next fifty years.

Fowler’s flag restoration, which she said would “defy the test of time,” did last until 1999, when conservation efforts began again as part of the “Save America’s Treasures” preservation campaign. The extensive work that Fowler completed to revive the Star-Spangled Banner, those millions of stitches, took conservators almost two years to remove. The iconic flag remains on display in the Smithsonian’s National Museum of American History, inspiring new generations in “the land of the free and the home of the brave.”

Featured image, 1839 sheet music for “The Star-Spangled Banner,” courtesy of the University of North Carolina at Chapel Hill via North Carolina Digital Heritage Center.

“Dutch universities start their Elsevier boycott plan” / Jonathan Rochkind

“We are entering a new era in publications”, said Koen Becking, chairman of the Executive Board of Tilburg University in October. On behalf of the Dutch universities, he and his colleague Gerard Meijer negotiate with scientific publishers about an open access policy. They managed to achieve agreements with some publishers, but not with the biggest one, Elsevier. Today, they start their plan to boycott Elsevier.

Dutch universities start their Elsevier boycott plan



Characteristics of subjects in the DPLA / Mark E. Phillips

There are still a few things that I have been wanting to do with the subject data from the DPLA dataset that I’ve been working with for the past few months.

This time I wanted to take a look at some of the characteristics of the subject strings themselves and see if there is any information there that is helpful or useful as an indicator of quality for the metadata record associated with that subject.

I took a look at the following metrics for each subject string: length, percentage integer, number of tokens, length of anagram, anagram complexity, and number of non-alphanumeric characters (punctuation).

In the tables below I present a few of the more interesting selections from the data.

Subject Length

This is calculated by stripping whitespace from the ends of each subject, and then counting the number of characters that are left in the string.
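
The post doesn’t include the code behind this metric, but it is simple enough to sketch in Python (the function name here is mine, not from the original):

    def subject_length(subject):
        # Strip whitespace from both ends, then count the remaining characters.
        return len(subject.strip())

    subject_length("  Dogs--Juvenile literature.  ")  # returns 26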

Hub Unique Subjects Minimum Length Median Length Maximum Length Average Length stddev
ARTstor 9,560 3 12.0 201 16.6 14.4
Biodiversity_Heritage_Library 22,004 3 10.5 478 16.4 10.0
David_Rumsey 123 3 18.0 30 11.3 5.2
Digital_Commonwealth 41,704 3 17.5 3490 19.6 26.7
Digital_Library_of_Georgia 132,160 3 18.5 169 27.1 14.1
Harvard_Library 9,257 3 17.0 110 30.2 12.6
HathiTrust 685,733 3 31.0 728 36.8 16.6
Internet_Archive 56,910 3 152.0 1714 38.1 48.4
J._Paul_Getty_Trust 2,777 4 65.0 99 31.6 15.5
Kentucky_Digital_Library 1,972 3 31.5 129 33.9 18.0
Minnesota_Digital_Library 24,472 3 19.5 199 17.4 10.2
Missouri_Hub 6,893 3 182.0 525 30.3 40.4
Mountain_West_Digital_Library 227,755 3 12.0 3148 27.2 25.1
National_Archives_and_Records_Administration 7,086 3 19.0 166 22.7 17.9
North_Carolina_Digital_Heritage_Center 99,258 3 9.5 3192 25.6 20.2
Smithsonian_Institution 348,302 3 14.0 182 24.2 11.9
South_Carolina_Digital_Library 23,842 3 26.5 1182 35.7 25.9
The_New_York_Public_Library 69,210 3 29.0 119 29.4 13.5
The_Portal_to_Texas_History 104,566 3 16.0 152 17.7 9.7
United_States_Government_Printing_Office_(GPO) 174,067 3 39.0 249 43.5 18.1
University_of_Illinois_at_Urbana-Champaign 6,183 3 23.0 141 23.2 14.3
University_of_Southern_California._Libraries 65,958 3 13.5 211 18.4 10.7
University_of_Virginia_Library 3,736 3 40.5 102 31.0 17.7

My takeaway from this is that three characters is just about the shortest subject that one is able to include. That is not an absolute rule, but it is the low end for this data.

The average length ranges from 11.3 average characters for the David Rumsey hub to 43.5 characters on average for the United States Government Printing Office (GPO).

Put into a graph, you can see the average subject length across the Hubs a bit more easily.

Average Subject Length

The length of a field can be helpful for finding values that are outside the norm. For example, you can see that five Hubs have maximum subject lengths of over 1,000 characters. In a quick investigation of these values, they appear to be abstracts and content descriptions accidentally coded as subjects.

Maximum Subject Length

For The Portal to Texas History, which had a few subjects that came in at over 152 characters long, it turns out that these are incorrectly formatted subject fields in which a user included a number of subjects in one field instead of separating them out into multiple fields.

Percent Integer

For this metric I stripped whitespace characters, and then divided the number of digit characters by the number of total characters in the string to come up with the percentage integer.
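
A minimal Python sketch of this calculation, assuming digits are identified with str.isdigit() (the original code isn’t shown in the post):

    def percent_integer(subject):
        # Ignore whitespace, then compute digits as a percentage of all remaining characters.
        chars = [c for c in subject if not c.isspace()]
        if not chars:
            return 0.0
        digits = sum(1 for c in chars if c.isdigit())
        return 100.0 * digits / len(chars)

    percent_integer("World War, 1939-1945")  # 8 digits of 18 characters, about 44.4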

Hub Unique Subjects Maximum % Integer Average % Integer stddev
ARTstor 9,560 61.5 1.3 5.2
Biodiversity_Heritage_Library 22,004 92.3 2.2 11.1
David_Rumsey 123 36.4 0.5 4.2
Digital_Commonwealth 41,704 66.7 1.6 6.0
Digital_Library_of_Georgia 132,160 87.5 1.7 6.2
Harvard_Library 9,257 44.4 4.6 9.0
HathiTrust 685,733 100.0 3.5 8.4
Internet_Archive 56,910 100.0 4.1 9.4
J._Paul_Getty_Trust 2,777 50.0 3.6 8.0
Kentucky_Digital_Library 1,972 63.6 5.7 9.9
Minnesota_Digital_Library 24,472 80.0 1.1 5.1
Missouri_Hub 6,893 50.0 2.9 7.5
Mountain_West_Digital_Library 227,755 100.0 1.1 5.5
National_Archives_and_Records_Administration 7,086 42.1 4.7 9.4
North_Carolina_Digital_Heritage_Center 99,258 100.0 1.5 5.9
Smithsonian_Institution 348,302 100.0 1.1 3.6
South_Carolina_Digital_Library 23,842 57.1 2.3 6.5
The_New_York_Public_Library 69,210 100.0 12.0 13.5
The_Portal_to_Texas_History 104,566 100.0 0.4 3.7
United_States_Government_Printing_Office_(GPO) 174,067 80.0 0.4 2.4
University_of_Illinois_at_Urbana-Champaign 6,183 50.0 6.1 10.9
University_of_Southern_California._Libraries 65,958 100.0 1.3 6.4
University_of_Virginia_Library 3,736 72.7 1.8 6.8

Average Percent Integer

If you group these into the Content-Hub and Service-Hub categories you can see things a little better.

Percent Integer Grouped by Hub Type

It appears that the Content-Hubs on the left trend a bit higher than the Service-Hubs on the right. This probably has to do with the common practice of including dates in subject strings in bibliographic-catalog-based metadata, a practice that isn’t as consistent in metadata created for the more heterogeneous collections of content that we see in the Service-Hubs.

Tokens

For the tokens metric I replaced each punctuation character with a single space character and then used the nltk word_tokenize function to return a list of tokens. I then took the length of that resulting list for the metric.
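
Sketched in Python with NLTK, under the assumption that “punctuation” here means any non-alphanumeric, non-whitespace character (the post doesn’t spell out the exact replacement rule):

    import re
    from nltk import word_tokenize  # requires the NLTK "punkt" tokenizer data

    def token_count(subject):
        # Replace punctuation with spaces, then count the resulting word tokens.
        cleaned = re.sub(r"[^\w\s]", " ", subject)
        return len(word_tokenize(cleaned))

    token_count("United States--History--Civil War, 1861-1865")  # returns 7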

Hub Unique Subjects Maximum Tokens Average Tokens stddev
ARTstor 9,560 31 2.36 2.12
Biodiversity_Heritage_Library 22,004 66 2.29 1.46
David_Rumsey 123 5 1.63 0.94
Digital_Commonwealth 41,704 469 2.78 3.70
Digital_Library_of_Georgia 132,160 23 3.70 1.72
Harvard_Library 9,257 17 4.07 1.77
HathiTrust 685,733 107 4.75 2.31
Internet_Archive 56,910 244 5.06 6.21
J._Paul_Getty_Trust 2,777 15 4.11 2.14
Kentucky_Digital_Library 1,972 20 4.65 2.50
Minnesota_Digital_Library 24,472 25 2.66 1.54
Missouri_Hub 6,893 68 4.30 5.41
Mountain_West_Digital_Library 227,755 549 3.64 3.51
National_Archives_and_Records_Administration 7,086 26 3.48 2.93
North_Carolina_Digital_Heritage_Center 99,258 493 3.75 2.64
Smithsonian_Institution 348,302 25 3.29 1.56
South_Carolina_Digital_Library 23,842 180 4.87 3.45
The_New_York_Public_Library 69,210 20 4.28 2.14
The_Portal_to_Texas_History 104,566 23 2.69 1.36
United_States_Government_Printing_Office_(GPO) 174,067 41 5.31 2.28
University_of_Illinois_at_Urbana-Champaign 6,183 26 3.35 2.11
University_of_Southern_California._Libraries 65,958 36 2.66 1.51
University_of_Virginia_Library 3,736 15 4.62 2.84

Average number of tokens

Tokens end up looking very similar to the overall character length of a subject. If I were to do more processing I would probably divide the length by the number of tokens to get an average word length for the tokens in the subjects. That might be interesting.

Anagram

I’ve always found anagrams of values in metadata to be interesting, sometimes helpful and sometimes completely useless. For this value I folded the case of the subject string, converted letters with diacritics to their ASCII versions, and then created an anagram from the distinct letters that remained. I used the length of this anagram for the metric.
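
The post doesn’t define the anagram precisely, but the 26-letter maximum in the table below suggests it is the set of distinct letters in the string. A sketch under that assumption:

    import unicodedata

    def anagram(subject):
        # Case-fold, strip diacritics down to ASCII, and keep each distinct letter once.
        folded = unicodedata.normalize("NFKD", subject.lower())
        ascii_only = folded.encode("ascii", "ignore").decode("ascii")
        return "".join(sorted(set(c for c in ascii_only if c.isalpha())))

    len(anagram("Baseball--History"))  # "abehilorsty" has 11 distinct letters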

Hub Unique Subjects Min Anagram Length Median Anagram Length Max Anagram Length Avg Anagram Length stddev
ARTstor 9,560 2 8 23 8.93 3.63
Biodiversity_Heritage_Library 22,004 0 7.5 23 9.33 3.26
David_Rumsey 123 3 12 13 7.93 2.28
Digital_Commonwealth 41,704 0 9 26 9.97 3.01
Digital_Library_of_Georgia 132,160 0 9.5 23 11.74 3.18
Harvard_Library 9,257 3 11 21 12.51 2.92
HathiTrust 685,733 0 14 25 13.56 2.98
Internet_Archive 56,910 0 22 26 12.41 3.96
J._Paul_Getty_Trust 2,777 3 19 21 13.02 3.60
Kentucky_Digital_Library 1,972 2 14.5 22 13.02 3.28
Minnesota_Digital_Library 24,472 0 12 22 9.76 3.00
Missouri_Hub 6,893 0 22 25 11.09 4.06
Mountain_West_Digital_Library 227,755 0 7 26 11.85 3.54
National_Archives_and_Records_Administration 7,086 3 11 22 10.01 3.09
North_Carolina_Digital_Heritage_Center 99,258 0 6 26 11.00 3.54
Smithsonian_Institution 348,302 0 8 23 11.53 3.42
South_Carolina_Digital_Library 23,842 1 12 26 13.08 3.67
The_New_York_Public_Library 69,210 0 10 24 11.45 3.17
The_Portal_to_Texas_History 104,566 0 10.5 23 9.78 2.98
United_States_Government_Printing_Office_(GPO) 174,067 0 14 24 14.56 2.80
University_of_Illinois_at_Urbana-Champaign 6,183 3 7 21 10.42 3.46
University_of_Southern_California._Libraries 65,958 0 9 23 9.81 3.20
University_of_Virginia_Library 3,736 0 9 22 12.76 4.31

Average anagram length

I find it interesting that there are subjects in several of the Hubs (Digital_Commonwealth, Internet_Archive, Mountain_West_Digital_Library, North_Carolina_Digital_Heritage_Center, and South_Carolina_Digital_Library) that have a single subject instance containing all 26 letters. That’s just neat. I didn’t look to see if these are the same subject instances that were themselves 3,000+ characters long.

Punctuation

It can be interesting to see what punctuation is used in a field, so I extracted all non-alphanumeric characters from each string, which left me with the punctuation characters. I took the number of unique punctuation characters for this metric.
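
A possible Python version, assuming whitespace doesn’t count as punctuation (the post doesn’t say either way):

    def punctuation_count(subject):
        # Count the distinct non-alphanumeric, non-whitespace characters.
        return len(set(c for c in subject if not c.isalnum() and not c.isspace()))

    punctuation_count("Texas--History--Revolution, 1835-1836.")  # {"-", ",", "."} -> 3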

Hub Name Unique Subjects min median max mean stddev
ARTstor 9,560 0 0 8 0.73 1.22
Biodiversity Heritage Library 22,004 0 0 8 0.59 1.02
David Rumsey 123 0 0 4 0.18 0.53
Digital Commonwealth 41,704 0 1.5 10 1.21 1.10
Digital Library of Georgia 132,160 0 1 7 1.34 0.96
Harvard_Library 9,257 0 0 6 1.65 1.02
HathiTrust 685,733 0 1 9 1.63 1.16
Internet_Archive 56,910 0 2 11 1.47 1.75
J_Paul_Getty_Trust 2,777 0 2 6 1.58 0.99
Kentucky_Digital_Library 1,972 0 1.5 5 1.50 1.38
Minnesota_Digital_Library 24,472 0 0 7 0.42 0.74
Missouri_Hub 6,893 0 3 7 1.24 1.37
Mountain_West_Digital_Library 227,755 0 1 8 0.97 1.04
National_Archives_and_Records_Administration 7,086 0 3 7 1.68 1.61
North_Carolina_Digital_Heritage_Center 99,258 0 0.5 7 1.34 0.93
Smithsonian_Institution 348,302 0 2 7 0.84 0.96
South_Carolina_Digital_Library 23,842 0 3.5 8 1.68 1.41
The_New_York_Public_Library 69,210 0 1 7 1.57 1.12
The_Portal_to_Texas_History 104,566 0 1 7 0.84 0.91
United_States_Government_Printing_Office_(GPO) 174,067 0 2 7 1.38 0.99
University_of_Illinois_at_Urbana-Champaign 6,183 0 2 6 1.31 1.25
University_of_Southern_California_Libraries 65,958 0 0 7 0.75 1.09
University_of_Virginia_Library 3,736 0 5 7 1.67 1.58
63 0 2 5 1.17 1.31

Average Punctuation Characters

Again, on this one I don’t have much to talk about. I do know that I plan to take a look at which punctuation characters are being used by which Hubs. I have a feeling that this could be very useful in identifying problems with mapping from one metadata world to another. For example, I know there are character patterns in the subject values in the DPLA dataset that resemble sub-field indicators from a MARC record (‡, |, and —); how many there are is something to look at.

Let me know if there are other pieces that you think might be interesting to look at related to this subject work with the DPLA metadata dataset and I’ll see what I can do.

Let me know what you think via Twitter if you have questions or comments.

Just Released: “Where Does Europe’s Money Go? A Guide to EU Budget Data Sources” / Open Knowledge Foundation

The EU has committed to spending €959,988 million (roughly €960 billion) between 2014 and 2020. This money is disbursed through over 80 funds and programmes that are managed by over 100 different authorities. Where does this money come from? How is it allocated? And how is it spent?

Today we are delighted to announce the release of “Where Does Europe’s Money Go? A Guide to EU Budget Data Sources”, which aims to help civil society groups, journalists and others to navigate the vast landscape of documents and datasets in order to “follow the money” in the EU. The guide also suggests steps that institutions should take in order to enable greater democratic oversight of EU public finances. It was undertaken by Open Knowledge with support from the Adessium Foundation.

Where Does Europe's Money Go?

As we have seen from projects like Farm Subsidy and journalistic collaborations around the EU Structural Funds it can be very difficult and time-consuming to put together all of the different pieces needed to understand flows of EU money.

Groups of journalists on these projects have spent many months requesting, scraping, cleaning and assembling data to get an overview of just a handful of the many different funds and programmes through which EU money is spent. The analysis of this data has led to many dozens of news stories, and in some cases even criminal investigations.

Better data, documentation, advocacy and journalism around EU public money is vital to addressing the “democratic deficit” in EU fiscal policy. To this end, we make the following recommendations to EU institutions and civil society organisations:

  1. Establish a single central point of reference for data and documents about EU revenue, budgeting and expenditure and ensure all the information is up to date at this domain (e.g. at a website such as ec.europa.eu/budget). At the same time, ensure all EU budget data are available from the EU open data portal as open data.
  2. Create an open dataset with key details about each EU fund, including name of the fund, heading, policy, type of management, implementing authorities, link to information on beneficiaries, link to legal basis in Eur-Lex and link to regulation in Eur-Lex.
  3. Extend the Financial Transparency System to all EU funds by integrating or federating detailed expenditure data from Member States, non-EU members and international organisations. Data on beneficiaries should include, where relevant, a unique European company identifier and, when a project is co-financed, the exact amount of EU funding received and the total amount of the project.
  4. Clarify and harmonise the legal framework regarding transparency rules for the beneficiaries of EU funds.
  5. Support and strengthen funding for civil society groups and journalists working on EU public finances.
  6. Conduct a more detailed assessment of beneficiary data availability for all EU funds and for all implementing authorities – e.g., through a dedicated “open data audit”.
  7. Build a stronger central base of evidence about the uses and users of EU fiscal data – including data projects, investigative journalism projects and data users in the media and civil society.

Our intention is that the material in this report will become a living resource that we can continue to expand and update. If you have any comments or suggestions, we’d love to hear from you.

If you are interested in learning more about Open Knowledge’s other initiatives around open data and financial transparency you can explore the Where Does My Money Go? project, the OpenSpending project, read our other previous guides and reports or join the Follow the Money network.

Where Does Europe’s Money Go - A Guide to EU Budget Data Sources

Thursday Threads: New and Interesting from ALA Exhibits / Peter Murray

I’m just home from the American Library Association meeting in San Francisco, so this week’s threads are just a brief view of new and interesting things I found on the exhibit floor.

NOTE! Funding for my current position at LYRASIS ran out at the end of June, so I am looking for new opportunities and challenges for my skills. Check out my resume/c.v. and please let me know of job opportunities in library technology, open source, and/or community engagement.

Feel free to send this to others you think might be interested in the topics. If you find these threads interesting and useful, you might want to add the Thursday Threads RSS Feed to your feed reader or subscribe to e-mail delivery using the form to the right. If you would like a more raw and immediate version of these types of stories, watch my Pinboard bookmarks (or subscribe to its feed in your feed reader). Items posted to Pinboard are also sent out as tweets; you can follow me on Twitter. Comments and tips, as always, are welcome.

Book-Donations-Processing-as-a-Service


I didn’t get to talk to anyone at this booth, but I was interested in the concept. I remember donations processing being such a hassle — analyzing each book for its value, deciding whether it fits your collection policy, determining where to sell it, managing the sale, and so forth. American Book Drive seems to offer such a service. Right now their service is limited to California. I wonder if it will expand, or if there are similar service providers in other parts of the country.

Free Driver’s Ed Resources for Libraries

This exhibitor had a good origin story. A family coming to the U.S. had a difficult time getting their driver’s licenses, so they created an online resource for all 50 states that covers the details. They’ve had success with the business side of their service, so they decided to give it away to libraries for free.

Free Online Obituaries Service from Orange County Library

With newspapers charging more for printing obituaries, important community details are no longer being printed. The Epoch Project from the Orange County (FL) Library System provides a simple service with text and media to capture this cultural heritage information. Funded initially by an IMLS grant [PDF], they are now in the process of rounding up partners in each state to be ambassadors to bring the service to other libraries around the country.

HathiTrust Metadata / Open Library Data Additions

Metadata records from http://www.hathitrust.org/hathifiles.

This item belongs to: data/ol_data.

This item has files of the following types: Data, Metadata

The Evolving Scholarly Record Workshop — the San Francisco edition and the series wrap-up / HangingTogether

The report, The Evolving Scholarly Record, introduced a framework for discussing the changes in the scholarly record and in the roles of stakeholders.

ESR framework

Over the past year, OCLC has conducted a series of workshops to socialize the framework.   You can read about the first three Evolving Scholarly Record workshops on hangingtogether.org:  the Amsterdam workshop, the DC workshop, and the Chicago workshop.

For the fourth and final workshop in the series, we wanted to be more cumulative, so we took a different tack from the first three workshops. Instead of having guest speakers in the morning and small group breakout discussions in the afternoon, presentations by OCLC staff set the context for plenary discussions. I reviewed the ESR framework and recapped the three previous workshops, Constance Malpas previewed the report, Stewardship of the Evolving Scholarly Record: From the Invisible Hand to Conscious Coordination, and Jim Michalko talked about boundaries and internalizing and externalizing roles in managing scholarly outputs. Slides and videos of these presentations are available.

In the previous workshops, breakout discussions had focused around these four topics:  Selection, Support for the Researcher, Collaboration within the University, and Collaboration with External Entities.  Here are some of the takeaways from those discussions. (There were also discussions under the broad topic of technology, but those have been integrated with the other topics.)

Selection

  • Establish priorities: for example, institutional (local) materials, at-risk materials, materials most valued by researchers in specific disciplines.
  • Establish limits: what doesn’t need to be saved? What can be de-selected?
  • Establish clear selection criteria, especially for non-traditional scholarly outputs: for example, blogs, web sites.
  • Accept adequate content sampling.
  • Be aware of system-wide context: how do local selection decisions complement/duplicate stewardship activities elsewhere? Which local collections are considered “collections of record” by the broader scholarly community?

Support for Researchers

  • Offer expertise with reliable external repositories to help researchers make good choices in use of disciplinary repositories.  Provide a local option for disciplines lacking good external choices.
  • Use the dissertation as the first opportunity to establish a relationship with a researcher.  Mint an ORCID and/or ISNI and provide DOIs.  Offer profiling, bibliography, and resume services that save researchers time.  Find ways to ensure portability of research outputs throughout a researcher’s career.
  • Determine how to link various research materials to a project and define for each project what an object is and how to link related bits to the object.
  • Become an integral part of the grant proposal process to ensure that materials flow to the right places instead of needing to be rescued after the fact.
  • Agree on and be explicit about service levels and end-of-life provisions.

Collaboration within the University

  • Use service offerings to re-position the library in the campus community.  Decide where the library will focus; it can’t be expert in all things.
  • Make alliances on campus so you can integrate library services into the campus infrastructure. Help other parts of the university negotiate licensing of data from vendors.
  • Use policy and financial drivers (mandates, ROI expectations, reputation and assessment) to motivate a variety of institutional stakeholders.
  • Create statements of organizational responsibility about selection, services, terms, and which parts of the university will do what.
  • Coordinate to optimize expertise, minimize duplication, rebalance resources, and contain costs.

Collaboration with External Entities

  • Identify the things that can be done elsewhere and those that need to be done locally. Figure out what kinds of relationships are needed with external repositories.
  • Help researchers negotiate on IP rights, terms of use, privacy, and so forth.
  • Determine which external repositories are committed to preservation and which will collect the related materials from processes and aftermaths.  Rely on external services like JSTOR, arXiv, SSRN, and ICPSR, which are dependable delivery and access systems with sustainable business models.
  • Learn how to interoperate with systems such as SHARE. Employ persistent object identifiers and multiple researcher name identifiers to interoperate with other systems.
  • Consider centers of excellence; host one and rely on others.

It is clear that no single institution can hope to gather and manage all of—or even a significant share of—the scholarly record.  This is the starting point for the new report, Stewardship of the evolving scholarly record: From the invisible hand to conscious coordination.


In the fourth workshop we had discussions with all attendees present.  Having started with what came out of the previous workshops, it was easier for them to stretch a little bit beyond that.  Highlights from the plenary discussions are:

Things that institutions should consider doing:

  • Establish when research outputs should be archived locally.  In many cases a citation with a pointer to outputs archived elsewhere will be satisfactory.
  • Decide which materials merit application of preservation protocols.
  • Embed data capture requirements in the researchers’ workflow and see that metadata is created early in the flow.
  • Partner with the sponsored projects office to communicate about data lifecycle.
  • Explore with the Office of Academic Affairs if there are opportunities to work together on collecting assets for promotion and tenure.
  • Do ongoing analysis on Data Management Plans to provide fundamental planning data.
  • Use the library’s space and its “power to convene” to foster critical cross-campus conversations.
  • Develop practices for library assignment and management of ORCID, ISNI, DOI…  Identifiers are crucial.
  • When other units are licensing services (such as those from Elsevier), help with the negotiations and help to ensure that the various campus systems will interoperate.
  • Establish a relationship with HathiTrust and others who can share the stewardship workload.
  • Think about what else you will archive, beyond your institutional output.

Things to consider as a community:

  • Assemble case studies of successful faculty engagement.
  • Decipher and interpret impact calculations in different systems.
  • Develop models for above-campus infrastructure, with shared investment and governance.  For instance, instead of allowing commercial providers to mine and share our data, develop Open Source tools and retain our data and mine it ourselves.
  • Identify a way to coordinate selection decisions with those of other institutions.
  • Develop shared goals and criteria to influence vendors to improve tools:  Aggregate information about researcher workflow preferences and what the potential is for their tools interoperating with other systems.  Prioritize vendor metadata interoperability requirements for selected tools to allow machine-readable acquisition.
  • Assess the reliability of external repositories.
  • Develop best practices for agreement language, such as preservation commitments with repositories and exit plans with vendors.

The four workshops gave us a chance to not just socialize the framework, but to really hear about the concerns of libraries and other stakeholders, learn what is being done, and begin to think about what lies ahead.  In the near future, we’ll be synthesizing all this and considering next steps.

About Ricky Erway

Ricky Erway, Senior Program Officer at OCLC Research, works with staff from the OCLC Research Library Partnership on projects ranging from managing born digital archives to research data curation.

Collective Access - 1.5 / FOSS4Lib Recent Releases

Release Date: 
Thursday, June 11, 2015

Last updated July 1, 2015. Created by David Nind on July 1, 2015.

An exciting new version of CollectiveAccess is now available, with a number of new features and improvements!

  • Improved PDF reports
  • Additional external data sources such as Wikipedia and WorldCat
  • Media Annotation tools
  • Improved search functionality
  • Support for complex "interstitial" relationships
  • Collection management tools such as support for deaccessions and location tracking
  • "Check in/ Check out" library circulation module
  • Improvements to the data importer

...and more!

Jobs in Information Technology: July 1, 2015 / LITA

New vacancy listings are posted weekly on Wednesday at approximately 12 noon Central Time. They appear under New This Week and under the appropriate regional listing. Postings remain on the LITA Job Site for a minimum of four weeks.

New This Week

Digital Systems, Training, and Support Coordinator, University of Arkansas at Little Rock, Little Rock, AR

Systems and Digital Services Librarian, University of Arkansas at Little Rock, Little Rock, AR

Digital Library Data Curation Developer, University of Notre Dame, Notre Dame, IN

Visit the LITA Job Site for more available jobs and for information on submitting a job posting.

Revising Academic Library Governance Handbooks / In the Library, With the Lead Pipe

Original Image by Flickr user Sasquatch 1 (CC BY 2.0), with minimal modification by C. Strunk (10 June 2015).

In Brief

Regardless of our status (tenure track, non-tenure track, staff, and/or union), academic librarians at colleges and universities may use a handbook or similar document as a framework for self-governance. These handbooks typically cover rank descriptions, promotion requirements, and grievance rights, among other topics. Unlike employee handbooks used in the corporate world, these documents may be written and maintained by academic librarians themselves.1 In 2010, a group of academic librarians at George Mason University was charged with revising our Librarians’ Handbook. Given the dearth of literature about academic librarians’ handbooks and their revision, we anticipate our library colleagues in similar situations will benefit from our experience and recommendations.

 

Background and Context

There are three handbooks at George Mason University (Mason) governing individuals in various faculty positions: the Administrative/Professional Faculty Handbook, the Librarians’ Handbook, and the Mason Faculty Handbook, which covers instructional faculty.2 Librarians at Mason, a young institution founded in 1957, are classified as professional faculty, a non-tenured faculty classification. As such, librarians are governed by the University’s Administrative/Professional Faculty Handbook (A/P Handbook), as well as by the Librarians’ Handbook (Handbook), which became an appendix of the former in 2000. The Handbook contains provisions that apply only to professional faculty librarians. Although the history of the Handbook is not well documented, its precursor was an evaluation and promotion document that was used by library administration as early as the 1970s.

Librarians who hold a professional faculty position at Mason (~45) are members of the Librarians’ Council (Council), which plays a significant governance role by defining the standards for librarian rank, contract renewal, and promotion in the Librarians’ Handbook. Our Handbook differs significantly from the A/P Handbook because the Librarians’ Handbook includes a statement on academic freedom, a professional review process for contract renewal and promotion, professional ranks, and some aspects of the grievance and appeal processes. Consequently, our Handbook is more analogous to the Mason Faculty Handbook.

In late 2009, the Council voted to review and, as needed, revise the Handbook, especially sections related to professional peer review and librarian ranks. We began in the summer of 2010 with the appointment of an ad hoc handbook review committee that was selected by Council officers and approved by the library’s senior administrators. The A/P Handbook was under review at this same time, so revising the Librarians’ Handbook concurrently made sense. Our colleague representing the library in the A/P Handbook group was also appointed to chair the Council’s ad hoc committee, and her dual role proved to be most advantageous to our revision process.

Literature Review

Although there is a substantial body of literature regarding employee handbooks as a whole (most of it in Business and Human Resources), relatively little has been published on creating and revising faculty handbooks, let alone librarian faculty handbooks. Articles that do include library faculty tend to do so in a cursory fashion. One example is a 1985 Chronicle of Higher Education article “Writing a Faculty Handbook: a 2-Year, Nose-to-the-Grindstone Process” that provides a brief description of the recommended process from a legal standpoint and an outline for how to structure the resulting document. This outline includes librarians, along with other “Special Academic Staff and Categories,” but only as a line item (Writing 1985, 29).

One of the few articles describing the actual process of writing and revising faculty handbooks, James L. Pence’s “Adapting Faculty Personnel Policies” focuses solely on instructional faculty (Pence 1990). A more detailed faculty handbook outline that addresses material applicable to librarians is provided in Drafting and Revising Employment Policies and Handbooks. 2002 Cumulative Supplement (Decker et al. 2002, 456-511), but it is more of a prescriptive example from the standpoint of human resources law rather than a “how to” or case study.

Although it is unclear why library faculty are not more fully included in such articles, one reason may be the lack of conformity regarding librarian status in higher education institutions. As surveys such as Mary K. Bolin’s “A Typology of Librarian Status at Land Grant Universities” indicate, librarian status varies widely (Bolin 2008). For the purpose of her survey, Bolin grouped librarian statuses into “Professorial,” “Other ranks with tenure,” “Other ranks without tenure,” and “Non-faculty (Staff)” (Bolin 2008, 223). These statuses span a continuum, with “Professorial” being closest to instructional faculty who have tenure and research requirements and “Non-Faculty (Staff)” being the furthest. More variation may exist within those statuses; for instance, at some institutions, “Non-Faculty (Staff)” librarians are represented in Faculty Senate, while at others, they are not (Bolin 2008, 224).3

We are not aware of any research that reports how many academic librarians are covered by broader faculty handbooks. Given the wide disparity of librarian status, and the fact that librarians may or may not be part of their institution’s larger faculty handbook, it isn’t surprising that librarian handbooks have not received a lot of attention in the literature.

Process

Our review began in earnest in September 2010. We met weekly, which made it easier to maintain discussion continuity from one meeting to the next, and began with deciding on an approach and a tentative timeline. Our initial deadline was April 2011, which mirrored the working deadline for the A/P Handbook. Because both handbooks had to be approved by Mason’s Board of Visitors, we believed it would be advantageous to submit ours as part of the A/P Handbook.

To learn more about the history of pertinent sections, we talked to our library colleagues who worked on previous versions of the Handbook. We reviewed the Association of College and Research Libraries (ACRL) Standards for Faculty Status for Academic Librarians (2007) as well as librarian governance documents from other colleges and universities in Virginia.4 These document reviews confirmed that our handbook was already aligned with the published ACRL standards, provided insight into how governance was handled at other institutions, and gave us ideas to consider for our own handbook. For instance, we considered aligning the professional peer review process with the annual administrative review process and adjusting the professional review calendar.

We met with representatives from the University’s Human Resources Department, the Provost’s Office, and the Office of University Counsel several times to ask questions and learn more about the legal and administrative issues and policies affecting our Handbook. Information shared during these meetings indicated the roles of different faculty handbooks at Mason and how ours fits into the broader institutional picture.

Each committee member volunteered to revise specific sections based on their interest and experience. We reviewed sections as they were revised, rather than in any specific order. Several sections (e.g., Introduction, Professional Development) were revised quickly, whereas others involved deeper discussion. For example, we thought it was critical to discuss the sections on Librarian ranks and professional review together because they were so closely related. The complexity and sensitivity of this subject matter sparked discussions that spanned multiple meetings and content iterations.

Section discussions were often quite detailed, covering all possible aspects–from the overall intent and purpose of the content to the specific definitions of words and phrases. Decisions about the level of textual vagueness or detail desired had to be made. Proposed revisions were considered, modified, discussed, and modified again. We spent a lot of time on word choice to make the document more cohesive and minimize ambiguity. To enhance the Handbook’s professional appearance, we standardized punctuation, capitalization, and format, and removed references to specific web sites.

Throughout our work, we needed to share working documents easily with one another (there were seven committee members), which we accomplished using Dropbox. This practice alleviated some problems with version control, but edits were occasionally made to multiple versions of the same document that later needed to be reconciled. The “track changes” functionality within Microsoft Word was also critical to share changes we had made and add comments and questions. As the work progressed, the committee Chair compiled the revised sections into a final draft for review and created a list of major revisions to share with Council members and reviewers.

Feedback from our colleagues was critical, and we gathered it using online polls and surveys, and town hall meetings. We frequently presented reports at Council meetings to inform the larger body of our progress, receive verbal feedback, and address any questions or concerns.

Both the University Librarian and Vice President/CIO for Information Technology (CIO), the Libraries’ most senior administrators at that time5, were required to review and approve our revised Handbook prior to submission of the final draft to HR for integration with the A/P Handbook. To expedite this process, we gave the University Librarian revised sections as they were completed for his review and comments, and we met with him on several occasions to discuss his questions and concerns.

The revision schedule changed during the process, largely because it took us longer to revise some sections than we originally anticipated. Other delays occurred after we wisely decided to mirror the A/P Handbook revision schedule, which lagged behind ours. For example, protracted discussions of the grievance and termination sections of the A/P Handbook led to delayed revision of those same sections in the Librarians’ Handbook. We wanted to ensure that whatever modifications the A/P Handbook Committee made would not conflict with the rights conferred on librarians in our existing Handbook (e.g., grieving salary or filing a grievance as a group). We also chose to defer to the A/P Handbook for parts of the grievance and termination sections that were duplicative, which streamlined our document. A more flexible timeline meant our revision process took longer than it might have otherwise, but our revised text did not conflict with the revised A/P Handbook. As a result, HR was able to submit a combined, single document to the Board of Visitors at one time rather than in pieces.

We finished the Handbook revision in July 2011 and sent electronic copies to the CIO and the University Librarian for review and comment. The University Librarian provided his feedback in late November and our final revision was completed in February 2012, after which it was sent to Human Resources for integration into the newly revised A/P Handbook. Subsequently, the combined document was submitted to Mason’s Board of Visitors, who approved it on March 21, 2012.

Table 1: Table of Contents. George Mason University Librarians’ Handbook (George Mason University 2012b).

 

Professional Peer Review for Librarians

Because professional review is the most important aspect of self-governance defined in our Handbook and detailed in our Council’s Bylaws, a brief description of this process is in order.6 The Council’s Professional Review Committee (PRC), a standing committee, consists of seven elected members who serve staggered two-year terms; a librarian is eligible to serve on the committee after having gone through this peer review process at least once. The University Librarian, in consultation with the PRC Chair, designates subcommittees of three reviewers for each reappointment or promotion review. Librarians are permitted to request that a subcommittee member be recused if a potential conflict of interest exists.7

Based on a librarian’s hire date (see the Professional Review Calendar section below), their review begins with submission of an annotated curriculum vitae (CV) or a “dossier” to the PRC. The dossier is, in fact, a notebook containing the librarian’s CV and detailed documentation of all accomplishments (e.g., publications, presentations, awards, grants, offices held, etc.) achieved during a specific period of time. Mason librarians report progress in three areas: 1) professional competence; 2) scholarship and professional service; and, 3) service to the university and the community. However, only information related to scholarship and service are included in the dossier; content related to professional competence, as well as the required supervisor’s evaluation letter, are neither reviewed by nor available to the PRC subcommittee.

Points to Consider

During the revision process, we identified several major issues; we believe readers will benefit from learning how we handled, or did not handle, them. They are grouped into issues related to Handbook content and issues related to our revision process (see below).

Professional Review Calendar

A critical situation the committee wanted to rectify was the inequity in time newly hired librarians were allowed before their initial professional peer review. All librarians, regardless of rank, receive an initial two-year contract.8 Librarians hired before March in a calendar year must submit their CV or dossier for review the first January after they are hired. However, librarians hired later in a year may have up to twice as much time before their initial review (Table 2). Several adjustments to the calendar were considered, but ultimately, we could only incorporate minor changes due to limits imposed by the Provost’s and University Librarian’s schedules, which, in turn, are dictated by the University’s fiscal year.

*For a subsequent promotion, the dossier should cover all professional activities since the last promotion.
** Contract Term = Rank + 1 year

Table 2. Documentation Requirements by Librarian Rank and Review Type (George Mason University 2012b).

 

Librarian Ranks

Handbook content related to librarian rank required a lot of attention, with one example being the basic definition of a librarian. The Handbook defines a Mason librarian as a library employee with a professional faculty appointment and an ALA recognized degree9; this definition also confers Council membership. Table 3 details the basic criteria required for each librarian rank.

Table 3. Mason librarian rank criteria (George Mason University 2012b).

 

Recently, however, professional faculty positions formerly held by librarians have been filled by individuals without an MLS, thus disqualifying those individuals from becoming Council members. Likewise, individuals who hold an MLS or similar degree and are hired in classified staff positions are not eligible for Council membership. The revision committee and Council discussed retiring the library degree requirement, but ultimately did not change the definition primarily because these colleagues would not be subject to professional peer review. If the Council membership definition were changed, three repercussions may take place:

  1. the Librarians’ Council would no longer be a “Librarians’” Council;
  2. non-MLS professional faculty would be subject to peer review or there would be two systems of review; and/or
  3. the Librarians’ Council would be dissolved and all rights conferred by the Librarians’ Handbook terminated.

None of these possible repercussions appealed to Council members at the time.

We also discussed whether to require Librarian 1s to apply for promotion to the Librarian 2 rank as part of their initial reappointment. This idea was dismissed because we were unable to make the desired changes to the professional review calendar. Under the existing calendar, Librarian 1s with no previous experience going up for initial professional peer review might have as little as 18 months of experience in an academic library. This is insufficient experience for advancement to the rank of Librarian 2, which at Mason requires a minimum of three years.

External Reviewers

The composition of the Professional Review (PRC) subcommittee for individuals seeking promotion to Librarian 4 concerned the Handbook committee. Neither the Handbook nor the Council Bylaws require reviewers to be at or above the rank of the librarian under review for reappointment or promotion, even though making this a requirement would head off potential personnel problems (e.g., a negative review may be more easily challenged). Because no Mason librarians held the Librarian 4 rank when we revised the Handbook (and there still are none), we wanted to ensure promotion to this rank was conducted by reviewers holding at least the rank of Librarian 3, who could draw on the maximum years of experience possible.

One solution we considered was to invite an external reviewer to participate in a PRC promotion subcommittee. This reviewer would be selected from either the Mason community (non-library), or from another institution. Although external reviewers typically participate in instructional faculty promotion and tenure reviews, we decided this option would not work for us and sought another solution. Eventually, we concluded that all reviewers for a Librarian 4 promotion should be a Librarian 3 at a minimum. Because PRC members are elected on staggered terms, however, there is no way to predict how many Librarian 3s may be serving on the PRC in a given year, nor do we know who has decided to seek promotion until a month before the reviews begin.10

Consequently, to increase the pool of reviewers needed for a given year, we revised the Handbook to allow eligible library faculty not currently on the PRC to be appointed as a reviewer rather than hold an election. This change, which was incorporated into our Council’s Bylaws, ensured that all PRC subcommittee members who review a Librarian 4 promotion bring the experience of at least a Librarian 3 to the process. Furthermore, in years when there are a large number of reviews (15-20) to be conducted, the PRC now has the ability to appoint additional reviewers when needed.

Dossier Requirement for Review

Formerly, librarians being reviewed for each contract renewal and/or promotion submitted dossiers (i.e., often lengthy notebooks) to the PRC that documented their publications, presentations, service, and professional development activities. After much discussion, we proposed that librarians undergoing a second or later reappointment have the option to submit an annotated CV in lieu of a full dossier. Dossiers would continue to be required from Librarian 2s and higher undergoing their first contract renewal and from all librarians applying for promotion in rank. We thought this option was logical because annual evaluations are required of each librarian anyway, so an annotated CV would suffice for the purpose of contract renewal. Because an annotated CV is a synopsis of one’s professional activities, it requires less work and documentation than a dossier. The University Librarian approved this option.

Council Approval

Neither the Council’s Bylaws nor the Handbook require members to vote on a Handbook revision. We had to decide whether it was important to seek Council approval of the revised document. After much discussion, we chose to ask for a vote of endorsement before sending a complete draft revision to the University Librarian for his formal review. Council approved the draft by a substantial majority.

Follow Up on Implementation of Revisions

Once our ad hoc committee met its original charge of producing a revised and approved handbook, we were disbanded. We did not develop a plan to implement changes to the Handbook, and neither did the Librarians’ Council. As a result, three years after approval, much work remains to be done. Changes to the Handbook required revision of the Council’s Bylaws and procedural changes in the professional peer review process. The Bylaws were revised, but the Professional Review Committee has been slow to incorporate all the procedural changes and decisions described in the new Handbook into the PRC documents that are used to manage and guide the process. This has resulted in continuing confusion with the professional review process, ironically the primary reason we opened the Handbook for revision.

Recommendations for Revising Your Handbook

When we began this project, it seemed overwhelming. Early on, we discussed our revision strategy and made decisions about how to allocate and accomplish the work. Our plans changed over time, of course, and new approaches were proposed and adopted. Re-examination and adjustment of our workflow throughout the project contributed greatly to our success.

Library consolidation, changes in librarian status, and other factors are affecting even long-established academic libraries, public and private. These changes likely require modifications to documents governing librarians at these institutions. Despite our institution’s relative youth, we offer the following recommendations to other librarians embarking on a handbook or similar governing document revision.

Table 4. Recommended actions and resulting benefits when conducting a handbook revision.

 

Like most undertakings of this magnitude and importance, the Handbook revision project was extremely time-consuming and, at times, frustrating. Nevertheless, we successfully balanced our Council’s needs within the University’s framework. We were intent on working with our colleagues to create a more professional document that is applicable and fair to today’s members. Most importantly, we gained an intimate familiarity with our handbook—a responsibility all academic librarians with a similar governance structure should work toward. Even if librarians who have a handbook or similar governance document never have the opportunity or need to revise it, we believe it is vital for them to be knowledgeable about its content and ready to advocate for and promote the rights it confers with their colleagues and administrations.

Acknowledgements

The authors would like to thank our peer review editor, Vicki Sipe, Catalog Librarian at the University of Maryland, Baltimore County and our editors at In the Library with the Lead Pipe, Ellie Collier and Annie Pho.

References

Association of College and Research Libraries. Standards for Faculty Status for Academic Librarians. 2007. Accessed June 5, 2015. http://www.ala.org/acrl/standards/standardsfaculty.

Bolin, Mary K. “A Typology of Librarian Status at Land Grant Universities.” The Journal of Academic Librarianship. 2008. v. 34, issue 3, pp. 220-230. doi:10.1016/j.acalib.2008.03.005

Decker, Kurt. H. Drafting and Revising Employment Policies and Handbooks, 2nd ed., 2 volumes. New York, NY: Wiley, 1994.

Decker, Kurt. H., Louis R. Lessig, and Kermit M. Burley. Drafting and Revising Employment Policies and Handbooks. 2002 Cumulative Supplement. New York, NY: Panel Publishers, 2002.

George Mason University. Administrative/Professional Faculty Handbook, (2012a). Accessed December 22, 2014. http://hr.gmu.edu/policy/APFacultyHandbook.pdf.

George Mason University. “Librarians’ Handbook,” in Administrative/Professional Faculty Handbook, Appendix C (2012b): 18-32. Accessed December 22, 2014. http://hr.gmu.edu/policy/APFacultyHandbook.pdf.

George Mason University. Faculty Handbook, 2014. Accessed December 22, 2014. http://www.gmu.edu/resources/facstaff/handbook/.

Pence, James. L. “Adapting Faculty Personnel Policies.” New Directions for Higher Education. Fall 1990. 59-68. DOI: 10.1002/he.36919907108

“Writing a Faculty Handbook: a 2-Year, Nose-to-the-Grindstone Process.” Chronicle of Higher Education. October 2, 1985. v. 31, issue 5. p. 28


  1. Some librarians are governed by documents developed by Human Resources, faculty unions, or content within a Faculty Handbook. There is a dearth of available information regarding handbooks for academic librarians. See Bolin 2008.
  2. For the purposes of this article, we define “instructional faculty” as faculty in the more commonly accepted traditional sense (i.e., professors of English or Chemistry) as well as non-teaching research faculty, and term and adjunct faculty, who are also included in this faculty handbook.
  3. Professional library faculty at George Mason do not have elected representation in the Faculty Senate.
  4. In addition to the George Mason University Faculty Handbook, we read faculty handbooks from the following Virginia colleges and universities:  James Madison University, Radford University, University of Mary Washington, University of Virginia, Virginia Commonwealth University,  and Virginia Tech. Since our review, all of these handbooks have been revised except for Radford University and the University of Mary Washington.
  5. The University Libraries is now directly under the purview of the Provost.
  6. The professional peer review process is distinct from Mason’s professional faculty annual performance review, which takes place between a librarian and his/her supervisor and is required by the A/P Handbook.
  7. The standard  procedure is that librarians from the same department cannot review one another. This can cause problems for departments with large numbers of librarians.
  8. Mason librarians hold multi-year contracts, the duration of which is determined by an individual’s rank.
  9. According to the A/P Handbook, “Typical professional faculty positions are librarians, counselors, coaches, physicians, lawyers, engineers and architects…[that] require the incumbent to regularly exercise professional discretion and judgment and to produce work that is intellectual and varied and is not standardized” (George Mason University 2012a, 4).
  10. In 2014-2015, the PRC included three Librarian 2s and four Librarian 3s while the entire Council composition was: 2 Librarian 1s, 21 Librarian 2s and 18 Librarian 3s and no Librarian 4s.

An Interview With Emerging Leader Isabel Gonzalez-Smith / LITA

Tell us about your library job. What do you love most about it?

I am an Undergraduate Experience Librarian at the University of Illinois at Chicago’s Richard J. Daley Library where I focus on how the library can support the academic success of our undergraduates. It’s hard to pick a single thing I love about my job because it is really personal to me. As an alumna, serving UIC undergrads is like stepping back into my own undergraduate experience and constantly thinking about ways I can improve that of our current students. Collaboration is key to many of our library efforts and my current role at UIC Library allows me to meet campus partners with the same mission. It doesn’t hurt that I work with an inspiring team of librarians that constantly push me to be the best professional I can be.

Where do you see yourself going from here?

My greatest motivator is improving the experience of the communities we serve as librarians. It might be nerdy but I geek out about data-driven decision making, the iterative process of refinement, and holistic problem solving when it comes to both virtual and physical services. I’m hoping my next career move is in user experience and assessment.

Why did you apply to be an Emerging Leader? What are your big takeaways from the ALA-level activities so far?

It’s funny – I applied to the program several years ago when a previous EL and friend of mine encouraged me to but I wasn’t accepted. I remember feeling really bummed about it! Years later, I had other friends who became Emerging Leaders bring it up and motivate me to try again. I’m so glad I did! I have found the Emerging Leaders and ALA community very welcoming – people want to see you succeed. Being an Emerging Leader means having the tools and the encouragement to engage more directly with ALA – developing a true appreciation and understanding that it is YOUR organization.

What have you learned about LITA governance and activities so far?

LITA is such an awesome division. I am very grateful I was selected as the LITA-sponsored Emerging Leader because it has allowed me to get to know the members who make LITA happen. Members work so hard for each other and they're truly an innovative bunch. I had no idea how many groups of people worked towards different initiatives in committees, task forces, and interest groups, and I'm still learning about each of them. Governance takes a lot of people, and that is much clearer to me now that I have been more involved.

What was your favorite LITA moment? What would you like to do next in the organization?

Hands down – working with the search committee in selecting LITA’s next Executive Director. Special thanks to the LITA Board for inviting me to have a voice on the committee. It speaks volumes that LITA Board members embraced an early career librarian and allowed me the opportunity to have a say in LITA’s future. Very exciting moment!

UK Crime Data: Feeling is Believing / Open Knowledge Foundation

Latest crime data shows that the UK is getting significantly more ‘peaceful’. Last month, the Institute for Economics and Peace published the UK Peace Index, revealing UK crime figures have fallen the most of all EU countries in the past decade. Homicide rates, to take one indicator, have halved over the last decade.

Crime Scene by Alan Cleaver, Flickr, CC-BY

But the British public still feels that crime levels are rising. How can opening up crime data play a part in convincing us we are less likely to experience crime than ever before?

The ‘Perception Gap’

The discrepancy between crime data and perceptions of the likelihood of crime is particularly marked in the UK. Although it has been found that a majority of the public broadly trust official statistics, the figures are markedly lower for those relating to crime. In one study, 85% of people agreed that the Census accurately reflects changes in the UK, but only 63% said the same of crime statistics.

Credibility of Police Data

Police forces have been publishing crime statistics in the UK since 2008, using their own web-based crime mapping tools or via the national crime mapping facility (http://maps.police.uk/ and http://www.police.uk). This has purportedly been to improve engagement with local communities, alongside other policy objectives such as promoting transparency. But allegations of 'figure fiddling' on the part of the police have undermined the data's credibility, and in 2014 the UK Statistics Authority withdrew its gold-standard status from police figures, pointing to 'accumulating evidence' of unreliability.

The UK’s open data site for crime figures allows users to download street-level crime and outcome data in CSV format and explore the API containing detailed crime data and information about individual police forces and neighbourhood teams. It also provides Custom CSV download and JSON API helper interfaces so you can more easily access subsets of the data.

Crime map from data.police.uk
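
For a sense of what the API offers, here is a minimal sketch in Python that pulls a month of street-level crime for a point and tallies it by category (the street-level crimes endpoint is documented on the site; the coordinates and date below are illustrative):

import json
import urllib.request

# Street-level crimes near a point for one month, as documented
# at data.police.uk. The coordinates and date are examples only.
url = ("https://data.police.uk/api/crimes-street/all-crime"
       "?lat=51.5074&lng=-0.1278&date=2015-05")

with urllib.request.urlopen(url) as response:
    crimes = json.load(response)

# Tally incidents by category to get a feel for the data.
counts = {}
for crime in crimes:
    counts[crime["category"]] = counts.get(crime["category"], 0) + 1

for category, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(category, n)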

But the credibility of the data has been called into question. Just recently, data relating to stop-search incidents for children aged under 12 was proved 'inaccurate'. The site itself details many issues which call the accuracy of the data into question: inconsistent geocoding policies in police forces; "Six police forces we suspect may be double-reporting certain types of incidents"; 'siloed systems' within police records; and differing IT systems from force to force.

In summary, we cannot be sure the ‘data provided is fully accurate or consistent.’

The Role the Media Plays: If it Bleeds, it Leads

In response to persistent and widespread public disbelief, the policies of successive UK governments on crime have toughened: much tougher sentencing, more people in prison, more police on the streets. When the British public were asked why they think there is more crime now than in the past, more than half (57%) stated that it was because of what they see on television and almost half (48%) said it was because of what they read in newspapers [Ipsos MORI poll on Closing the Gaps]. One tabloid newspaper exclaimed just recently: "Rape still at record levels and violent crime rises" and "Crime shows biggest rise for a decade". As the adage goes, If it Bleeds, it Leads.

Crime Data and Mistrust of the Police

Those engaged in making crime figures meaningful to the public face unique challenges. When Stephen Lawrence was murdered in 1993, and the subsequent public inquiry found institutional racism to be at the heart of the Met police, public trust in the police was shattered. Since then, the police have claimed to have rid their ranks of racism entirely.

Police by Luis Jou García, Flickr, CC BY-NC 2.0

But many remain less than convinced. According to official statistics, in 1999-2000, a black person was five times more likely than a white person to be stopped by police. A decade later, they were seven times more likely. One criminologist commented: “Claims that the Lawrence inquiry’s finding of institutional racism no longer apply have a hollow ring when we look at the evidence on police stops.” [Michael Shiner reported in the Guardian].

The police, in turn, distrust the public. The murder of two young, female police officers in Manchester in 2012 ignited the long-rumbling debate over whether the police should be armed. So the divide between the police and the public is a serious one.

A Different Tack?

In 2011, the UK Statistics Authority undertook a review of crime data. Its recommendations included:

  • Improving the presentation of crime statistics to make them more authoritative
  • Reviewing the availability of local crime and criminal justice data on government websites to identify opportunities for consolidation
  • Sharing of best practice and improvements in metadata and providing reassurance on the quality of police crime records.

It's clear that the UK police recognise the importance of improving their publication of data. But it seems that opening data alone won't fix the shattered trust between the public and the police, even if the proof that Britons are safer than ever before is there in transparent, easily navigable data. We need to go further back in the chain of provenance and scrutinise, for instance, the reporting methods of the police.

But this is about forgiveness too, and the British public might just not be ready for that yet.

MarcEdit 6 Update / Terry Reese

Changes in this update:

6.1.21
* 6.1.21
** Bug Fix: Conditional Delete -- When selecting regular expressions, there were times when the process wasn't being recognized.
** Enhancement: Conditional Delete -- This function used to work only with the Regular Expression option. It now works for all options.
** Bug Fix: ValidateISBNs -- The process would only validate the first subfield. If the subfield to be processed wasn't the first one, it wouldn't be validated.
** Enhancement: ValidateISSN -- Uses the ISSN check-digit formula to validate ISSNs (a sketch of the formula follows this list).
** Bug Fix: Generate Fast Headings (stand-alone tool) -- LDR fields could be deleted.
** Enhancement: Working to make the global edit functions a little more fault tolerant around record formatting.
** Enhancement: Generate MARC Record from URL -- The program generates MARC records from webpages. If you pass it an LC URL, it will generate data from the MARCXML.
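
For the curious, the ISSN check digit is a weighted sum mod 11: the first seven digits are weighted 8 down to 2, and a remainder of 10 is written as 'X'. A minimal sketch of that validation in Python (not MarcEdit's own implementation) looks like this:

def valid_issn(issn):
    """Check an ISSN such as '0028-0836' against its check digit."""
    s = issn.replace("-", "").upper()
    if len(s) != 8 or not s[:7].isdigit():
        return False
    # Weight the first seven digits 8 down to 2 and sum them.
    total = sum(int(d) * w for d, w in zip(s[:7], range(8, 1, -1)))
    check = (11 - total % 11) % 11  # 0-10, where 10 is written as 'X'
    return s[7] == ("X" if check == 10 else str(check))

assert valid_issn("0028-0836")      # Nature: valid
assert not valid_issn("0028-0837")  # wrong check digit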

At this point, only the Windows and Linux downloads have been updated. I'll be replacing the Mac download with the first version of the native OS X build over the July 4th weekend. You can get the updates either via the automated update tool or from the website at: http://marcedit.reeset.net/downloads

--tr

JSTOR Workset Browser / Eric Lease Morgan

Given a citations.xml file, this suite of software — the JSTOR Workset Browser — will cache and index content identified through JSTOR’s Data For Research service. The resulting (and fledgling) reports created by this suite enable the reader to “read distantly” against a collection of journal articles.

The suite requires a hodgepodge of software: Perl, Python, and the Bash shell. Your mileage may vary. Sample usage: cat etc/citations-thoreau.xml | bin/make-everything.sh thoreau

“Release early. Release often”.

Zotero 4.0.27: Streamlined saving, easier bibliography language selection, and more / Zotero

Zotero 4.0.27, now available, brings some major new features, as well as many other improvements and bug fixes.

Streamlined saving (Zotero for Firefox)

In Zotero for Firefox, it’s now easier than ever to save items from webpages.

Zotero senses information on webpages through bits of code called site translators, which work with most library catalogs, popular websites such as Amazon and the New York Times, and many gated databases.

In the past, there have been two different ways of saving web sources to Zotero:

  • If Zotero detected a reference on a webpage, you could click an icon in the address bar — for example, a book icon on Amazon or a journal article icon on a publisher’s site — to save high-quality metadata for the reference to your Zotero library.
  • If a site wasn’t supported or a site translator wasn’t working, you could still save any webpage to your Zotero library by clicking the “Create Web Page Item from Current Page” button in the Zotero for Firefox toolbar or by right-clicking on the page background and choosing “Save Page to Zotero”. In such cases, you might need to fill in some details that Zotero couldn’t automatically detect.

In Zotero 4.0.27, we’ve combined the address bar icon and the “Create Web Page Item from Current Page” button into a single save button in the Firefox toolbar, next to the existing Z button for opening the Zotero pane.

The new save button on a New York Times article


(Don’t be confused by the book icon in the address bar in the top left — that’s a new Firefox feature, unrelated to Zotero.)

You can click the new save button on any webpage to create an item in your Zotero library, and Zotero will automatically use the best available method for saving data. If a translator is available, you’ll get high-quality metadata; if not, you’ll get basic info such as title, access date, and URL, and you can edit the saved item to add additional information from the webpage. The icon will still update to show you what Zotero found on the page, and, as before, you can hover over it to see which translator, if any, will be used.

This also means that a single shortcut key — Cmd+Shift+S (Mac) or Ctrl+Shift+S (Windows/Linux) by default — can be used to save from any webpage.

The new save button also features a drop-down menu for accessing additional functionality, such as choosing a non-default translator or looking up a reference in your local (physical) library without even saving it to Zotero.

Additional save options

(This functionality was previously available by right-clicking on the address bar icon, though if you knew that, you surely qualify for some sort of prize.) The new menu will be used for more functionality in the future, so stay tuned.

Prefer another layout? In addition to the new combined toolbar buttons, Zotero provides separate buttons for opening Zotero and saving sources that can be added using Firefox’s Customize mode.

Custom button layout

With the separate buttons, you can hide one or the other button and rely on a keyboard shortcut, move the buttons into the larger Firefox menu panel, or even move the new save button between the address bar and search bar, close to its previous position. (Since the new save button works on every page, it no longer makes sense for it to be within the address bar itself, but by using the separate buttons you can essentially recreate the previous layout.)

While all the above changes apply only to Zotero for Firefox for now, similar changes will come to the Chrome and Safari connectors for Zotero Standalone users in a future version. For now, Zotero Standalone users can continue to use the address bar (Chrome) or toolbar (Safari) icon to save recognized webpages and right-click (control-click on Macs) on the page background and choose “Save Page to Zotero” to save basic info for any other page.

Easier bibliography language selection

Making Zotero accessible to users around the world has always been a priority. Thanks to a global community of volunteers in the Zotero and Citation Style Language (CSL) projects, you can use the Zotero interface and also generate citations in dozens of different languages.

Now, thanks to community developers Rintze Zelle and Aurimas Vinckevicius, it’s much easier to switch between different languages when generating citations.

Previously, Zotero would automatically use the language of the Zotero user interface — generally the language of either Firefox or the operating system — when generating citations. While you’ve always been able to generate citations using a different language, doing so required changing a hidden preference.

You can now set the bibliography language at the same time you choose a citation style, whether you’re using Quick Copy, Create Bibliography from Selected Items, or the word processor plugins.

Choosing a bibliography language for Quick Copy


In the above example, even though the user interface is in English, the default Quick Copy language is being set to French. If an item is then dragged from Zotero into a text field, the resulting citation will be in French, using French terms instead of English ones (e.g., “édité par” instead of “edited by”).

The new language selector is even more powerful when using the word processor plugins. The bibliography language chosen for a document is stored in the document preferences, allowing you to use different languages in different documents — say, U.S. English for a document you’re submitting to an American journal and Japanese for a paper for a conference in Japan.

Note that, of the thousands of CSL styles that Zotero supports, not all can be localized. If a journal or style guide calls for a specific language, the language drop-down will be disabled and citations will always be generated using the required language. For example, selecting the Nature style will cause Zotero to use the “English (UK)” locale in all cases, as is required by Nature’s style guide.

Other changes

Zotero now offers an “Export Library…” option for group libraries, allowing the full collection hierarchy to be easily exported. If you find yourself facing many sync conflicts, you can now choose to resolve all conflicts with changes from one side or the other. For Zotero Standalone users, we’ve improved support for saving attachments from Chrome and Safari on many sites, bringing site compatibility closer to that of Zotero for Firefox. And we’ve resolved various issues that were preventing complete syncs for some people.

There’s too much else to discuss here, but see the changelog for the full list of changes.

Get it now

If you’re already using Zotero, your copy of Zotero should update to the new version automatically, or you can update manually from the Firefox Add-ons pane or by selecting the “Check for Updates” menu option in Zotero Standalone. If you’re not yet using Zotero, try it out today.

Links in Obergefell v. Hodges / Ed Summers

Last week’s landmark ruling from the Supreme Court on same sex marriage was routinely published on the Web as a PDF. Given the past history of URL use in Supreme Court opinions I thought I would take a quick look to see what URLs were present. There are two, both are in Justice Alito’s dissenting opinion, and one is broken … just four days after the PDF was published. You can see it yourself at the bottom of page 100 in the PDF.

If you point your browser at

http://www.cdc.gov/nchs/data/databrief/db18.pdf

you will get a page not found error:

Sadly even the Internet Archive doesn’t have a snapshot of the page available.

But notice it thinks it can still get a copy. That's because the Centers for Disease Control and Prevention's website is responding with a 200 OK instead of a 404 Not Found:

zen:~ ed$ curl -I http://www.cdc.gov/nchs/data/databrief/db18.pdf
HTTP/1.1 200 OK
Content-Type: text/html
X-Powered-By: ASP.NET
X-UA-Compatible: IE=edge,chrome=1
Date: Tue, 30 Jun 2015 16:22:18 GMT
Connection: keep-alive

At any rate, it's not the Internet Archive's fault that they haven't archived the webpage originally published in 2009, because the URL is actually a typo. Instead it should be

http://www.cdc.gov/nchs/data/databriefs/db18.pdf

which leads to:

So between the broken URL and the 200 OK for something not found, we've got issues of link rot and reference rot all rolled up into a one-character typo. Sigh.

I think a couple of lessons for web publishers can be distilled from this little story:

  • when publishing on the Web, include link checking as part of your editorial process (a minimal checker is sketched below)
  • if you are going to publish links on the Web use a format that’s easy to check … like HTML.
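
To make the first lesson concrete, here is a minimal link checker in Python. The soft-404 heuristic (flagging a .pdf URL answered with text/html, as in the CDC response above) is just one of many checks an editorial workflow might apply:

import urllib.error
import urllib.request

def check_link(url):
    """Report the HTTP status for a URL and flag likely soft 404s."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            ctype = resp.headers.get("Content-Type", "")
            # A .pdf URL answered with an HTML page smells like a soft 404.
            if url.endswith(".pdf") and "text/html" in ctype:
                return "%d OK, but Content-Type is %s: possible soft 404" % (resp.status, ctype)
            return "%d %s" % (resp.status, resp.reason)
    except urllib.error.HTTPError as e:
        return "%d %s" % (e.code, e.reason)

print(check_link("http://www.cdc.gov/nchs/data/databriefs/db18.pdf"))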

ALA appoints Jenny Levine next LITA Executive Director / LITA

The American Library Association is pleased to announce the appointment of Jenny Levine as the Executive Director of the Library and Information Technology Association, a division of the ALA, effective August 3, 2015. Ms. Levine has been at the American Library Association since 2006 as the Strategy Guide in ALA’s Information Technology and Telecommunications Services area, charged with providing vision and leadership regarding emerging technologies, development of services, and integration of those services into association and library environments. In that role she coordinated development of ALA’s collaborative workspace, ALA Connect, and provided ongoing support and documentation. She convened the staff Social Media Working Group and coordinated a team-based approach for strategic posting to ALA’s social media channels. In addition, she has been the staff liaison to ALA’s Games and Gaming Round Table (GameRT) and coordinated a range of activities, including the 2007 & 2008 Gaming, Learning, and Libraries Symposia and International Games Day @ your library. She developed the concept for and manages the Networking Uncommons gathering space at ALA conferences.

Prior to joining the ALA staff, Jenny Levine held positions as Internet Development Specialist and Strategy Guide at the Metropolitan Library System in Burr Ridge (IL), Technology Coordinator at the Grande Prairie Public Library District in Hazel Crest (IL), and Reference Librarian at the Calumet City Public Library in Calumet City (IL). She received the 2004 Illinois Library Association Technical Services Award and a 1999 Illinois Secretary of State Award of Recognition.

Jenny has an M.L.S. from the University of Illinois, Urbana-Champaign, and a B.S. in Journalism/Broadcast News from the University of Kansas, Lawrence. Within ALA, she is a member of LITA, GameRT, the Intellectual Freedom Round Table (IFRT), and the Gay, Lesbian, Bisexual, and Transgender Round Table (GLBTRT). She is also active outside ALA and belongs to the American Civil Liberties Union (ACLU), the Electronic Frontier Foundation (EFF), the ALA-tied Freedom to Read Foundation (FTRF), the Human Rights Campaign (HRC) and the Illinois Library Association (ILA).

Jenny Levine has been an active presenter and writer, including three issues of Library Technology Reports on Gaming & Libraries. Among the early explorers of Library 2.0 technologies, from the Librarians’ Site du Jour (the first librarian blog) to the ongoing The Shifted Librarian, she is active in a wide variety of social media.

Ms. Levine becomes executive director of LITA on the retirement of Mary Taylor, LITA executive director since 2001. Thanks go to the search committee for a thoughtful and successful process: Rachel Vacek, Thomas Dowling, Andromeda Yelton, Isabel Gonzalez-Smith, Keri Cascio, Dan Hoppe and Mary Ghikas.

Blaming the Victim / David Rosenthal

The Washington Post is running a series called Net of Insecurity. So far it includes:
  • A Flaw In The Design, discussing the early history of the Internet and how the difficulty of getting it to work at all and the lack of perceived threats meant inadequate security.
  • The Long Life Of A Quick 'Fix', discussing the history of BGP and the consistent failure of attempts to make it less insecure, because those who would need to take action have no incentive to do so.
  • A Disaster Foretold - And Ignored,  discussing L0pht and how they warned a Senate panel 17 years ago of the dangers of Internet connectivity but were ignored.
Perhaps a future article in the series will describe how successive US administrations consistently strove to ensure that encryption wasn't used to make systems less insecure, and that the encryption that was used was as weak as possible. They prioritized their (and their opponents') ability to spy over mitigating the risks that Internet users faced, and they got what they wanted, as we see with the compromise of the Office of Personnel Management and the possibly related compromise of health insurers including Anthem. These breaches revealed the kind of information that renders everyone with a security clearance vulnerable to phishing and blackmail. Be careful what you wish for!

More below the fold.

The compromises at OPM and at Sony Pictures have revealed some truly pathetic security practices at both organizations, which certainly made the bad guys' job very easy. Better security practices would undoubtedly have made their job harder. But it is important to understand that in a world where Kaspersky and Cisco cannot keep their own systems secure, better security practices would not have made the bad guys' job impossible.

OPM and Sony deserve criticism for their lax security. But blaming the victim is not a constructive way of dealing with the situation in which organizations and individuals find themselves.

Prof. Jean Yang of CMU has a piece in MIT Technology Review entitled The Real Software Security Problem Is Us that, at first glance, appears to make a lot of sense but actually doesn't. Prof. Yang specializes in programming languages and is a "cofounder of Cybersecurity Factory, an accelerator focused on software security". Writing:
we could, in the not-so-distant future, actually live in a world where software doesn’t randomly and catastrophically fail. Our software systems could withstand attacks. Our private social media and health data could be seen only by those with permission to see it. All we need are the right fixes.
A better way would be to use languages that provide the guarantees we need. The Heartbleed vulnerability happened because someone forgot to check that a chunk of memory ended where it was supposed to. This could only happen in a programming language where the programmer is responsible for managing memory. So why not use languages that manage memory automatically? Why not make the programming languages do the heavy lifting?
Another way would be to make software easier to analyze. Facebook had so much trouble making sense of the software it used that it created Hack and Flow, annotated versions of PHP and Javascript, to make the two languages more comprehensible.
...
Change won't happen until we demand that it happens. Our software could be as well-constructed and reliable as our buildings. To make that happen, we all need to value technical soundness over novelty. It's up to us to make online life as safe as it is enjoyable.
It isn't clear who Prof. Yang's "we" is, end users or programmers. Suppose it is end users. Placing the onus on end users to demand more secure software built with better tools is futile. There is no way for an end user to know what tools were used to build a software product, no way to compare how secure two software products are, no credible third-party rating agency to appeal to for information. So there is no way for the market to reward good software engineering and punish bad software engineering.

Placing the onus on programmers is only marginally less futile. No one writes a software product from scratch from the bare metal up. The choice of tools and libraries to use is often forced, and the resulting system will have many vulnerabilities that the programmer has no control over. Even if the choice is free, it is an illusion to believe that better languages are a panacea for vulnerabilities. Java was designed to eliminate many common bugs, and it manages memory. It was effective in reducing bugs, but it could never create a "world where software doesn’t randomly and catastrophically fail".
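
To see why managed memory matters here, consider a minimal sketch of the Heartbleed pattern: an echo service that trusts a sender-declared length field. The packet format below is invented for illustration; the point is that the bug was a single missing comparison, and that a memory-safe language fails loudly instead of leaking whatever sits next to the buffer:

def echo_payload(packet):
    """Echo back a payload whose length the sender declares up front
    (first byte = claimed length, remaining bytes = payload)."""
    claimed = packet[0]
    payload = packet[1:]
    # Heartbleed was the absence of this check: OpenSSL copied `claimed`
    # bytes even when the real payload was shorter, leaking adjacent memory.
    if claimed > len(payload):
        raise ValueError("declared length exceeds actual payload")
    return payload[:claimed]

print(echo_payload(bytes([5]) + b"hello"))  # b'hello'
# echo_payload(bytes([64]) + b"hi")         # raises instead of leaking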

Notice that the OPM compromise used valid credentials, presumably obtained through social engineering, so it would have to be blamed on system administrators rather than programmers, or rather on management's failure to mandate two-factor authentication. But equally, even good system administration couldn't make up for Cisco's decision to install default SSH keys for "support reasons".

For a more realistic view, read A View From The Front Lines, the 2015 report from Mandiant, a company whose job is to clean up after compromises such as the 2013 one at Stanford. Or Dan Kaminsky's interview with Die Zeit Online in the wake of the compromise at the Bundestag:
No one should be surprised if a cyber attack succeeds somewhere. Everything can be hacked. ...  All great technological developments have been unsafe in the beginning, just think of the rail, automobiles and aircrafts. The most important thing in the beginning is that they work, after that they get safer. We have been working on the security of the Internet and the computer systems for the last 15 years.
Yes, automobiles and aircraft are safer but they are not safe. Cars kill 1.3M and injure 20-50M people/year, being the 9th leading cause of death. And that is before they become part of the Internet of Things and their software starts being exploited. Clearly, some car crash victims are at fault and others aren't. Dan is optimistic about Prof. Yang's approach:
It is a new technology, it is still under development. In the end it will not only be possible to write a secure software, but also to have it happen in a natural way without any special effort, and it shall be cheap.
I agree that the Langsec approach and capability-based systems such as Capsicum can make systems safer. But making secure software possible is a long way from making secure software ubiquitous. Until it is at least possible for organizations to deploy a software and hardware stack that is secure from the BIOS to the user interface, and until there is liability on the organization for not doing so, blaming them for being insecure is beside the point.

The sub-head of Mandiant's report is:
For years, we have argued that there is no such thing as perfect security. The events of 2014 should put any lingering doubts to rest.
It is worth reading the whole thing, but especially their Trend 4, Blurred Lines, which starts on page 20. It describes how the techniques used by criminal and government-sponsored bad guys are becoming indistinguishable, making it difficult not merely to defend against the inevitable compromise, but to determine what the intent of the compromise was.

The technology for making systems secure does not exist. Even if it did it would not be feasible for organizations to deploy only secure systems. Given that the system vendors bear no liability for the security of even systems intended to create security, this situation is unlikely to change in the foreseeable future.

Announcing our AccessYYZ Binkley Lecturer: Molly Sauter! / Access Conference

The Access 2015 Organizing Committee is thrilled to announce that our speaker for the Dave Binkley Memorial Lecture is Molly Sauter!

Molly is a Vanier Scholar and PhD student in Communication Studies at McGill University in Montreal, Canada. She holds a master's degree in Comparative Media Studies from MIT, and is an affiliate researcher at the MIT Center for Civic Media at the MIT Media Lab and at the Berkman Center for Internet and Society at Harvard University. Molly has published widely on internet activism, hacker culture, and depictions of technology in the media. Her recent book, The Coming Swarm, examines the use of Distributed Denial of Service (DDoS) actions as a form of political activism.

You can find Molly online at oddletters.com and on twitter at @oddletters.


More about the Dave Binkley Memorial Lecture.

2015 LITA Forum, Registration Opens! / LITA

Registration Now Open!

2015 LITA Forum
Minneapolis, MN
November 12-15, 2015

Plan now to join us in Minneapolis, Minnesota, at the Hyatt Regency Minneapolis for the 2015 LITA Forum, a three-day educational event that includes 2 preconferences, 3 keynote sessions, more than 55 concurrent sessions, and more than 15 poster presentations.

The 2015 LITA Forum is the 18th annual gathering of technology-minded information professionals and a highly regarded annual event for those involved in new and leading-edge technologies in the library and information technology field. Registration is limited in order to preserve the important networking advantages of a smaller conference. Attendees take advantage of the informal Friday evening reception, networking dinners and other social opportunities to get to know colleagues and speakers. Comments from past attendees:

  • “Best conference I’ve been to in terms of practical, usable ideas that I can implement at my library.”
  • “I get so inspired by the presentations and conversations with colleagues who are dealing with the same sorts of issues that I am.”
  • “After LITA I return to my institution excited to implement solutions I find here.”
  • “This is always the most informative conference! It inspires me to develop new programs and plan initiatives.”

This Year's Featured Keynote Sessions

Mx A. Matienzo
As Director of Technology for the Digital Public Library of America, Matienzo focuses on promoting and establishing digital library interoperability at an international scale. Prior to joining DPLA, Matienzo worked as an archivist and technologist specializing in born-digital materials and metadata management, at institutions including the Yale University Library, The New York Public Library, and the American Institute of Physics.

Carson Block
Carson Block of Carson Block Consulting Inc. has led, managed, and supported library technology efforts for more than 20 years. He has been called "a geek who speaks English" and enjoys acting as a bridge between the worlds of librarians and hard-core technologists.

Lisa Welchman
President of Digital Governance Solutions at ActiveStandards. In a 20-year career, Lisa Welchman has paved the way in the discipline of digital governance, helping organizations stabilize their complex, multi-stakeholder digital operations. Her book Managing Chaos: Digital Governance by Design was published in February of 2015 by Rosenfeld Media.

The Preconference Workshops include

So You Want to Make a Makerspace: Strategic Leadership to support the Integration of new and disruptive technologies into Libraries: Practical Tips, Tricks, Strategies, and Solutions for bringing making, fabrication and content creation to your library.
Presenters:
Leah Kraus is the Director of Community Engagement and Experience at the Fayetteville Free Library.
Michael Cimino is the Technology Innovation and Integration Specialist at the Fayetteville Free Library.

Beyond Web Page Analytics: Using Google tools to assess searcher behavior across web properties
Presenters:
Robert L. Nunez, Head of Collection Services, Kenosha Public Library, Kenosha, WI
Keven Riggle, Systems Librarian & Webmaster, Marquette University Libraries

Visit http://litaforum.org for registration and additional information.

Join us in Minneapolis!

Link roundup June 29, 2015 / Harvard Library Innovation Lab

This and that for the end of June.

9 Themed Color Palette Collections to Inspire You | Mental Floss

Oxford Scientist Explains the Physics of Playing Electric Guitar Solos | Open Culture

Hyperlax

Islandora Show and Tell: Worthington Memory / Islandora

It's time for another Islandora Show and Tell. This time we're taking a look at a collection put together by a public library: Worthington Memory, a collection of photographs and other documents connecting the citizens of Worthington, Ohio to their local history.

This innovative Islandora site makes heavy use of Islandora Sync to seamlessly integrate objects in Fedora and nodes in Drupal into one easily browsable collection. The team made some tweaks to Sync to handle some complex MODS scenarios, such as taxonomy terms and field collections. They also altered the Large Image and Book viewers to use a colorbox display. Developer Stefan Langer has contributed back many of his tweaks to Sync - and if you are working with Islandora in a Windows server environment, you have him to thank for fixes in the code that make the process easier (we do not officially support Windows for Islandora, so contributions like Stefan's are all the more important). You can find out more about the underpinnings of the site on their Technical information page.

And now, the cats. Any collection with a large number of newspaper objects is bound to have some interesting results on a very generic search term, and Worthington Memory does not disappoint, although these articles themselves are tantalizingly out of reach on microfilm, stored online as an index in Drupal:

As for actual repo objects relating to cats, the field is a little thinner, coming down to a theatre program referencing the musical "Cats" - which you can browse through using their customized viewer:

To round out the Show and Tell, let's hear about Worthington Memory from its creators and curators. Kara Reuter answered some questions for our blog:

What is the primary purpose of your repository? Who is the intended audience?

 We want Worthington Memory to enrich people's experience of our community and to convey a sense of wonder. See our project and site goals.
 
Our audience spans the serious genealogical researcher to the third-grade student doing a school project.  We also want the site to be fun to browse for more casual visitors – we’re inspired by sites like Shorpy, Retronaut, and Cooper Hewitt.
 
Why did you choose Islandora?
 
In the eight years we’ve worked together at Worthington Libraries, Stefan Langer and I have been steadily migrating all our online systems from locally-developed to third-party.  We’ve been using Drupal for the library’s other Web properties since 2008 and we really appreciate how powerful and flexible it is.  When it finally came time to migrate Worthington Memory – we saved it for last – we looked at several other mainstream digital repository products, but Islandora’s Drupal connection made it a natural fit for us.
 
 Which modules or solution packs are most important to your repository?
 
Far and away, Islandora Sync.  In fact, when we were first investigating Islandora, we just assumed that the integration with Drupal meant it would create Drupal nodes for each object we ingest.  We were initially discouraged to discover otherwise, but quickly rebounded when we discovered Islandora Sync. 
 
Worthington Memory includes not just digitized objects, but also burial records from three local cemeteries and a newspaper index spanning 200 years of local newspapers – about 125,000 items/records in all. The previous incarnation of the site treated our collections of digitized objects, news index items and burial records as three separate databases, walled off from one another, even though there are significant areas of overlap. The burial records and news index items are all native Drupal nodes, so to intersperse the digitized objects among them, we needed to get the Islandora objects into Drupal. With a little bit of customization, Islandora Sync does the job.
 
What feature of your repository are you most proud of?
 
I love seeing the connections across all our content, especially for people.  If you’re searching for a particular burial record, you might just discover a photo of that person at the same time. For many individuals in our community, you can trace the major milestones in their lives through our collection.  For instance, Elmer Snouffer  – we have photos of him as a boy on a fishing trip with his brothers, on his high school basketball team and in middle age; we see newspaper articles announcing the birth of his children, the evolution of his business and his 25th and 50th wedding anniversaries; and finally we have his burial record and obituaries.  By using standard Drupal taxonomies to tag our content, these kinds of connections emerge naturally.
 
I also love showing off the less traditional ways to browse content, such as by location and by date.  Even people who are knowledgeable about our community’s history or familiar with our collection can make new connections by using these tools.
  
Who built/developed/designed your repository (i.e., who was on the team?)
 
Development was done by just the two of us: Stefan Langer, Web Developer, and Kara Reuter, Digital Library Manager. The visual design was done by Stacy Clark, Graphic Designer. Meredith Southard, Librarian, and Susan Allen, Director of Technology Services, are our subject matter experts and content authors.
  
Do you have plans to expand your site in the future?
 
Absolutely!  As far as content, we continue to index our local newspapers each week and our contacts at our local cemeteries continue to pass along burial records.  After a long period of dormancy, the redesign is spurring us to put out a call for submissions of new objects to digitize.  As far as functionality, we want to beef up the site search.  We also have what we think of as “secondary objects,” like transcripts of manuscripts or additional views of realia, that we want to display, attached to their primary objects.  And we have a backlog of around 100,000 subject and people terms that we need to apply to our newspaper articles.
 
 What is your favourite object in your collection to show off?
 
I love this group portrait from 1900. It doesn’t look like much at first, but the large image viewer really shows it off.  Every time I zoom in and pan around that image, I see something new. 
 
Stefan likes this advertisement for a meat replacement product from Worthington Foods: Choplets!

3 Tips for Tech Empathy / LITA

I recently participated in a training session about empathy, led by our wonderful Staff Development Specialist here at the Martin County Library System. The goal of this session was to define empathy and discuss how to show empathy for our patrons and co-workers. It got me thinking about empathy in regards to teaching technology. I frequently work with library patrons who are frustrated with technology. Many of these patrons are older adults who feel handicapped because they were not raised in the digital age.

I, on the other hand, was born in the digital age. I learned how to use a computer in elementary school and technology has been present in my life ever since. It's easy to forget this advantage and lose patience when you are teaching someone with a different background. In teaching classes and offering one-on-one technology help, I've picked up a few tips about how to empathize with your students.

Reflect: If you find your patience wearing thin, think of a time when you struggled to learn something. For me, it's learning to drive stick. I've tried several times and each attempt was more frustrating than the last. When I think about how nerve-wracking it is to be behind the wheel with my hand on the stick shift, I remember how scary it can be to learn something new. I often help patrons who have purchased a new device (iPad, smartphone, etc.) and they are terrified to do the wrong thing. Returning to my adventures with manual transmissions helps me understand where they're coming from.

Relate: I was teaching a class a few weeks back and one patron was really struggling to keep up with the group. I started to get irritated by her constant questions, until halfway through when I realized that she looked exactly like my aunt. This immediately snapped me back to reality. If my aunt walked into a library I would want her to receive the best customer service possible and be treated with the utmost respect. My patience was instantly renewed, and I've used this trick successfully several times since by comparing patrons to my grandparents, parents, etc. Empathy is often defined as putting yourself in the other person's shoes, but putting a loved one in the other person's shoes can also do the trick.

Listen: I often hear the same complaints from patrons who are frustrated, confused, or overwhelmed by technology. I'll admit it can be trying to listen to the same thing again and again, but I also recognize that listening to these grievances is very important. Sometimes it's best to get those frustrations out right off the bat in order to set them aside and focus on learning. Listening is one of our best tools, and acknowledging that someone's problem is valid can also be extremely helpful.

Do you have any tips for tech empathy?

We Did All That? NDSA Standards and Practices Working Group Project Recaps / Library of Congress: The Signal


Video Stumbling Blocks Survey bookmark, designed by Kara Van Malssen, AVPreserve

The end of the school year often finds me thinking about time gone by. What did I work on and what can I show for it? The NDSA Standards and Practices Working Group members were in the same frame of mind, so we recently did a survey of our projects and accomplishments since the NDSA launched in 2010. It's an impressive list (if we do say so ourselves), especially once you realize that these topics come from the interests of our diverse membership. As co-chair of the working group, I'd like to share with you all of the S&P-related blog posts to bring readers up-to-date with many of our topical and timely initiatives.

Video Survey
Video has been a hot topic in S&P recently. Several round-robin discussions led to a “Video Deep Dive” action team which developed and conducted the Stumbling Blocks to Preserving Video Survey to identify and rank issues that may hinder digital video preservation. The preliminary results led us to dig a little deeper in how we processed and analyzed the data so look for an update on this soon.

Preserving Digital and Software-Based Artworks
S&P hosted a two-part discussion with experts from four collecting institutions (San Francisco Museum of Modern Art, Museum of Modern Art, The Rose Goldsen Archive of New Media Art, and the Smithsonian Institution Time Based Media Art project) to share their experiences in both preserving and providing access to digital art works and other new media. These complex digital materials are increasingly part of collections outside of traditional museum environments, and cultural heritage institutions, including libraries and archives, will see more and more of this type of content in their collections.


Photo courtesy of Time Based Media Art at the Smithsonian from the Report on the Status and Need for Technical Standards in the care of Time-Based Media and Digital Art

PDF/A-3
S&P members contributed to a report that takes a measured look at the costs and benefits of the widespread use of the PDF/A-3 format, especially as it affects content arriving in collecting institutions. The report provides background on the technical development of the specification, identifies specific scenarios under which the format might be used and suggests policy prescriptions for collecting institutions to consider.

Staffing for Effective Digital Preservation Survey
S&P conducted a survey of 85 institutions with a mandate to preserve digital content about how they staffed and organized their preservation functions. In addition to an award-winning poster (PDF) at iPRES2012, S&P members produced a detailed report and deposited the raw data in ICPSR.


Icons for Archive and Checksum, Designed by Iconathon Los Angeles, California, US 2013. Public Domain

Fixity
Along with our colleagues in the NDSA Infrastructure Working Group, S&P members helped author the NDSA publication, “Checking Your Digital Content: What is Fixity and When Should I Be Checking It?” (PDF). This resource provides stewards of digital objects with information about implementing fixity concepts and methods in a way that makes sense for their organization based on their needs and resources. Topics covered include definitions of fixity and fixity information, general approaches to fixity check frequency and comparison of common fixity information-generating instruments.
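
As a minimal illustration of one fixity-information-generating instrument, here is a Python sketch that computes a SHA-256 digest for a file and compares it to a stored value (the file path and stored digest below are invented for the example):

import hashlib

def file_sha256(path, chunk_size=1 << 20):
    """Generate fixity information for a file as a SHA-256 digest,
    reading in chunks so large files need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# A fixity check is simply comparing a fresh digest to the stored one.
stored = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"  # illustrative
if file_sha256("archive/item-0001.tif") != stored:
    print("fixity check failed: content has changed or is corrupt")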

2015 National Agenda
S&P members also contributed significant input and informed actionable recommendations to the Organization Policies and Practice chapter of the NDSA 2015 National Agenda for Digital Stewardship.

Email
Issues with archiving email proved to be another rallying point for S&P members who participated in initiating an informal Email Interest Group to discuss issues, projects and workflows to preserve email.

Moving Forward
Compiling this review list makes me proud of how much S&P has done through our active and engaged membership. And I should mention that this post doesn't even cover all our projects – just the ones with blog posts! Even with all we've done so far, S&P still has many issues and practices to explore.

DSpace in Vietnam with Registered Service Provider D & L Technology Integration and Consulting / DuraSpace News

Winchester, MA: Efforts are increasing at institutions around the world to provide open access to global culture and scholarship, including theses, dissertations, journals, digitized materials, special collections, maps, videos, audio recordings and other types of data. D & L Technology Integration and Consulting, a new DuraSpace Registered Service Provider located in Hanoi, Vietnam, is part of that worldwide effort.

American Association of School Librarians Names DPLA a 2015 Best App for Teaching & Learning / DPLA

The Digital Public Library of America is extraordinarily grateful to be recognized as one of 2015's Best Apps for Teaching & Learning by the American Association of School Librarians (AASL). DPLA was chosen for its embodiment of AASL's learning standards and its support of the school librarian's role in implementing career and college readiness standards; this is DPLA's second "Best of" award from the prestigious education-oriented division of the American Library Association. DPLA was recognized as a Best Website for Teaching & Learning in 2013.

“This recognition from AASL means so much to us, since school librarians have been such great advocates for DPLA, especially as we strive to make our materials useful to students,” said Dan Cohen, DPLA’s Executive Director. “This second award from AASL highlights that DPLA is available in multiple formats, including apps, a website, and other websites that incorporate our extraordinary content from collections across the United States.”

The Best Apps for Teaching & Learning recognition honors apps of exceptional value to inquiry-based teaching and learning as embodied in the AASL’s Standards for the 21st-Century Learner. The recognized apps foster the qualities of innovation, creativity, active participation, and collaboration and are user-friendly to encourage a community of learners to explore and discover. The apps were announced during the 2015 ALA Annual Conference in San Francisco.

The AASL, a division of the American Library Association, promotes the improvement and extension of library services in elementary and secondary schools as a means of strengthening the total education program. AASL’s mission is to empower leaders to transform teaching and learning.

To find out more about DPLA’s efforts around education, read the DPLA’s three-year strategic plan, published in January 2015, and its Whiting Foundation-funded research paper on using large digital collections in education, published in April 2015.

Koha - Security and maintenance releases - v 3.20.1, 3.18.8, 3.16.12, 3.14.16 / FOSS4Lib Recent Releases

Package: 
Release Date: 
Tuesday, June 23, 2015

Last updated June 27, 2015. Created by David Nind on June 27, 2015.
Log in to edit this page.

Security and maintenance releases for Koha.

As these are security releases it is strongly recommended that you upgrade as soon as possible.

Special thanks also go to Raschin Tavakoli and Dimitris Simos from the Combinatorial Security Testing Team of SBA Research for finding and reporting the security bugs.

See the release announcements for the details:

ALA releases National Policy Agenda for Libraries / District Dispatch

Libraries are in a revolution fueled by rapid advances in technology, and thus the roles, capabilities, and expectations of libraries are changing rapidly. National public policy for libraries must reflect these changes. Today the American Library Association (ALA) released a National Policy Agenda (pdf) for Libraries to guide a proactive policy shift.

“Too often, decision makers do not yet understand the extent to which libraries can be catalysts for opportunity and progress,” said ALA President Courtney Young in a press release. “As a result, investments in libraries and librarians lag our potential to contribute to the missions of the federal government and other national institutions. We must take concerted action to advance shared policy goals.”

The agenda was developed in concert with major library organizations that serve on a Library Advisory Committee for the Policy Revolution! initiative and with input from a public comment period. Funding for this project is provided by the Bill & Melinda Gates Foundation as part of a three-year grant that also supports efforts to deepen national stakeholder engagement and increase library advocacy capacity.

“Libraries cannot wait to be invited to ‘the table.’ We need proactive, strategic and aligned advocacy to support national policies that advance the public’s interest in the digital age and support libraries as essential community assets,” writes Deborah Jacobs, director of the Global Libraries Program at the Bill & Melinda Gates Foundation, in a foreword (pdf) to the agenda (pdf).

The agenda flows out of library values and the imperative of “opportunity for all,” as well as within a context of national political, economic and demographic trends. It seeks to answer the questions “What are the U.S. library interests and priorities for the next five years that should be emphasized to national decision makers?” and “Where might there be windows of opportunity to advance a particular priority at this particular time?”

The agenda articulates two broad themes—building library capacity to advance national priorities and advancing the public interest. Among the areas for capacity building are education and learning, entrepreneurship, and health and wellness. Public interest topics include balanced copyright and licensing, systems for digital content, and privacy and transparency. The agenda also identifies specific populations for which there are significant demographic shifts or bipartisan opportunities to address specialized needs.

“National decision makers often don’t understand the roles or capabilities of modern libraries,” said Alan S. Inouye, director of ALA’s Office for Information Technology Policy and co-principal investigator of the Policy Revolution! initiative. “Thus, an underlying imperative of the agenda is communication about how modern libraries contribute to society. Progress on specific policy goals is significantly impeded if this broader understanding is lacking.”

“Sustainable libraries are essential to sustainable communities,” said Ken Wiggin, president of the Chief Officers of State Library Agencies (COSLA), which is a grant partner. “I believe this agenda will help unify and amplify our voices at the national level and can be customized for state-level action, as well.”

Using the Agenda, the ALA Washington Office will match priorities to windows of opportunity and confluence to begin advancing policy goals—in partnership with other library organizations and allies with whom there is alignment.

While initiated at different times, the Policy Revolution! initiative dovetails with the new proposed strategic framework and plan for the ALA, which focuses on three Strategic Directions: information policy, advocacy and professional and leadership development. “Taken together, along with a growing focus on transforming libraries, we are ‘connecting the dots’ across the profession and strengthening our collective voice,” said Larra Clark, deputy director of ALA’s Office for Information Technology Policy and co-principal investigator of the Policy Revolution! initiative.

Attendees at the ALA Annual Conference in San Francisco can learn more about the agenda and related advocacy at two programs. On Saturday, June 27, from 1 to 2:30 p.m., Alan Fishel, Policy Revolution! senior policy counsel and a partner at Arent Fox, will join Clark to lead an interactive program, Negotiating to Advocacy Success. On Sunday, June 28, from 3 to 4 p.m., ALA Incoming President-Elect Julie Todaro will join Inouye and Wiggin to discuss Dollars for Local Libraries. More information on the initiative also is available online at www.ala.org/oitp.

The post ALA releases National Policy Agenda for Libraries appeared first on District Dispatch.

Arts librarian receives 2015 Robert Oakley scholarship / District Dispatch

The American Library Association (ALA) this week awarded Kathleen DeLaurenti the 2015 Robert L. Oakley Memorial Scholarship. The Library Copyright Alliance, which includes ALA, established the Robert L. Oakley Memorial Scholarship to support research and advanced study for librarians in their early-to-mid-careers who are interested and active in public policy, copyright, licensing, open access and their impacts on libraries.

DeLaurenti serves as the arts librarian at the College of William and Mary, where she led a user-centered re-design of the Music Library, including adding new equipment, collections, and services. She also is the first librarian at William and Mary to receive a Creative Adaption Grant to begin a pilot project to help faculty incorporate Open Educational Resources into their courses. The Oakley scholarship will support DeLaurenti’s work in copyright education, focusing on students’ understanding of music licensing and copyright basics.

“The support of the Oakley Scholarship would allow me to not only continue the next phase of this project to create music copyright learning modules, but it would provide the resources to involve students in curricular development and module creation,” said DeLaurenti.

The Oakley Scholarship awards $1,000 to an individual or a team of individuals who meet the eligibility criteria, to encourage and expand interest in and knowledge of these aspects of librarianship, as well as to bring the next generation of advocates, lobbyists and scholars to the forefront with opportunities they might not otherwise have.

“The Oakley scholarship is intended to support librarians in non-administrative positions who are less likely to have the funds necessary to build on their copyright interests,” said Carrie Russell, program director of the ALA Program for Public Access to Information, in a statement. “DeLaurenti’s project will ultimately be helpful to any librarian who works with library users with music copyright questions. Music copyright is about licensing, it’s complex, and it has always been a topic of great interest to librarians.”

Robert Oakley

Law librarian and professor Robert Oakley was an expert on copyright law and wrote and lectured on the subject. He served on the Library Copyright Alliance representing the American Association of Law Libraries (AALL) and played a leading role in advocating for U.S. libraries and the public they serve at many international forums including the World Intellectual Property Organization (WIPO) and United Nations Educational Scientific and Cultural Organization (UNESCO). He served as the United States delegate to the International Federation of Library Associations (IFLA) Standing Committee on Copyright and Related Rights from 1997-2003.

Oakley testified before Congress on copyright, open access, library appropriations and free access to government documents and was a member of the Library of Congress’ Section 108 Study Group. A valued colleague and mentor for numerous librarians, Oakley was a recognized leader in law librarianship and library management who also maintained a profound commitment to public policy and the rights of library users.

The post Arts librarian receives 2015 Robert Oakley scholarship appeared first on District Dispatch.

Letting Theory Influence Practice / LITA

This spring, I taught a technology course for pre-service teachers. In addition to my MLS, I have a master’s degree in educational technology, a graduate certificate in online teaching and learning, and an undergraduate degree in education. My own schooling had taught me the importance of making pedagogically sound decisions and never using technology just for the sake of using technology. I quickly learned, though, that making those pedagogically sound decisions when looking into the eyes of students was a bit more challenging than I had originally thought.

Image made available under a Creative Commons Attribution 3.0 License from http://quality.ecampusalberta.ca/

As I reflected on my teaching after every class, I asked myself many questions including: How do we learn? How can I incorporate technology in a way that is beneficial for my students? How can I use technology in a seamless manner where the learning is not interrupted by inclusion of technology?

Once the spring semester ended and I was able to breathe, I started to think about how what I learned teaching a technology course could (and should) influence my work as a librarian. Overall, I think librarians do a pretty great job using technology, but I realized that many of the technology decisions I make in my day job as an academic librarian are not nearly as grounded in learning theory as I think they should be. When I was teaching a full course it was easier to think about theory and wrestle with these questions, but when I create LibGuides, build tutorials, make suggestions for the library website, and recommend new technology for the learning commons, how often do I first think about how we learn?

So here is my goal (I’m admitting it online and hoping the LITA community will support me in it), I want to start reading more books on learning theory and start using that knowledge to influence all aspects of my work, and specifically with the technology that I use since almost everything that I do is somehow connected to technology.

Current reading list:

What do you recommend that I read?  Do you have any tips for connecting learning theory to non-teaching library technology responsibilities?

DPOE Makes a Splash Down Under! / Library of Congress: The Signal

The following is a guest post by Barrie Howard, IT Project Manager at the Library of Congress.

The Digital Preservation Outreach and Education (DPOE) program is pleased to announce a successful outcome for two international Train-the-Trainer workshops. These workshops were recently held in Australia, and are the first of their kind to be held outside of the United States.

The first workshop (May 26-29, 2015) was hosted by the State Library Victoria in Melbourne, sponsored by a collaborative organization of public libraries in Victoria called the Public Libraries Victoria Network (PLVN). The second workshop (June 2-5, 2015) took place in Sydney at the State Library of New South Wales, sponsored by a ten member consortium of national, state and territory libraries of Australia and New Zealand, the National and State Libraries of Australasia (NSLA). In addition to these two international workshops, DPOE has previously delivered four domestic workshops, partnering with organizations across the nation.

Participants and trainers at the workshop in Sydney. Photo credit: State Library of New South Wales.

The aim of the DPOE workshop is to produce a corps of trainers, who are equipped to teach others the basic principles and practices of preserving digital materials. In this way, DPOE’s “teach-a-person-to-fish” model extends the benefits of a workshop well beyond only those who can attend. There are many examples of DPOE trainers working together across jurisdictional and organizational boundaries to meet the needs of cultural heritage institutions of all shapes and sizes. DPOE trainers go on to develop training events of their own, and have delivered many webinars and workshops in the Midwest, Pacific Northwest, and Southeast regions of the United States, which will be replicated in regions across Australia in the coming year. Some of these examples have been highlighted in previous blog posts.

Corin Haines, National Library of New Zealand; Lesley Sharp, State Library of South Australia; Alex Byrne, State Library of New South Wales, and instructors Jake Nadal, Mary Molinaro, and Amy Rudersdorf. Photo credit: State Library of New South Wales.

The DPOE Down Under workshops were well received due largely to the exceptional knowledge and leadership of three of the program’s anchor instructors: Mary Molinaro (University of Kentucky Libraries), Jacob Nadal (The Research Collections and Preservation Consortium), and Amy Rudersdorf (Digital Public Library of America). This extremely talented team has provided subject matter expertise to the program in the past. Over the last year, DPOE Program Manager George Coulbourne has convened two meetings of the core instructors to give the training curriculum a significant overhaul. The instructors worked with DPOE staff to review and revise training materials in anticipation of the back-to-back DPOE workshops in Australia, ensuring the curriculum is as relevant and up-to-date as ever.

The workshops are just one way that DPOE fosters outreach and education about digital preservation on a global scale. After a workshop, students graduate and enter into a vibrant network of practitioners, and continue to engage with each other–and the broader digital preservation community–online. DPOE supports this network by providing an email distribution list so practitioners can share information about digital preservation best practices, services, and tools, and to surface stories about their experiences in advancing digital preservation.

Additionally, DPOE maintains a training calendar as a public service to help working professionals discover continuing education, professional development, and training opportunities in the practice of digital preservation. The calendar is updated on a monthly basis, and includes training events hosted by DPOE trainers.

Updated 6/29/15 for typos.

Digital Public Library of America makes push to serve all 50 states by 2017 with $3.4 million from the Sloan and Knight foundations / DPLA

The Digital Public Library of America (DPLA) is on the way to connecting online collections from coast to coast by 2017 – an effort boosted by a new $3.4 million investment, comprising $1.9 million from the Alfred P. Sloan Foundation and $1.5 million from the John S. and James L. Knight Foundation. These two new awards, coupled with significant earlier support from the Institute of Museum and Library Services and the National Endowment for the Humanities, will allow DPLA to open new Service Hubs that provide a way for all cultural heritage organizations across the country to connect through one national collection.

The Digital Public Library of America brings together the riches of America’s libraries, archives and museums, and makes them freely available to the world. DPLA provides public access to more than 10 million items – including the written word plus works of art and culture – from 1,600 institutions.

“The Sloan and Knight foundations have been such generous contributors to DPLA’s success, from our planning phase to the rapid build-out of our national network,” said Dan Cohen, executive director of the Digital Public Library of America. “With these major grants, we will be able to bring online 16 new states, and approach completion of that network.”

This series of investments represents a significant milestone in the development and growth of DPLA’s Service Hubs. These Service Hubs are state or regional digital collaboratives that host, aggregate or otherwise bring together digital objects from libraries, archives, museums and other cultural heritage institutions in their state or region. At the library’s launch in 2013, DPLA represented a collaborative of 16 major partners, covering nine states. The number has since doubled to more than 20 states, and is on the way to 50 in the next two years. As thousands of digital collections have been brought together through DPLA’s platform, fascinating new projects and tools using America’s cultural heritage have emerged, including curated exhibitions on historical topics and eras, dynamic visualizations and other cutting-edge apps, community engagement opportunities at an international scale, and much more.

These new grants will accelerate the growth of the Hubs program so that all collections and item types in America can easily be a part of DPLA. The Sloan Foundation’s $1.9 million award will build on its continued support since DPLA’s launch to establish Service Hubs in eight uncovered states and to further explore how it might address e-books in the collection. The Knight Foundation’s $1.5 million award will facilitate the expansion of the DPLA’s hub network in another eight states where Knight Foundation invests.

“We are delighted to continue our founding support of DPLA with this $1.9 million grant to facilitate the completion of a nationwide Service Hub network—a unique state-by-state approach to aggregating and sharing the digital record of America’s cultural heritage—and to help pilot a modern ebook distribution system for libraries,” said Doron Weber, Vice President Programs and Program Director at the Sloan Foundation. “DPLA represents an historic, non-commercial, grass-roots network to collect, curate, innovate and disseminate a comprehensive catalog of every form of digital knowledge for the benefit of all under the highest standards of quality, stewardship and open access, and Sloan is proud to be a small part of this great undertaking with many wonderful and generous partners such as the Knight Foundation.”

“An informed and engaged public is a prerequisite of American democracy. Libraries – be they physical or digital – play a fundamental role in encouraging people to know more about and become involved in the places where they live. DPLA brings to life the unique items locked away in our nation’s libraries and archives while providing an invaluable opportunity to bring this information into people’s lives and homes – better connecting them to each other and their communities,” said Jorge Martinez, vice president and chief technology officer at Knight Foundation, which also announced today that in 2016 it will host an international call for ideas on innovating libraries, the second Knight News Challenge on Libraries.

“With this gracious, continued support from Sloan and Knight, we can continue to focus on our largest strategic effort, which is to expand the DPLA network and provide an on-ramp for all states to participate,” said Emily Gore, DPLA’s director of content. “By building out DPLA’s coverage of state and regional Service Hubs, new communities and organizations from across the country will have access to essential 21st century services and programs, further enriching the scale and availability of our shared national cultural heritage online.”

To find out more about DPLA’s efforts towards completing the map of state-based Service Hubs, in addition to other significant initiatives, read the DPLA’s three-year strategic plan, published in January 2015.

###

About the Digital Public Library of America

The Digital Public Library of America (http://dp.la) brings together the riches of America’s libraries, archives, and museums, and makes them freely available to the world. It strives to contain the full breadth of human expression, including the written word, works of art and culture, records of America’s heritage, and the efforts and data of science. DPLA’s ever-expanding collection includes over 10 million items from 1,600 institutions across the United States.

About the Alfred P. Sloan Foundation

The Sloan Foundation is a philanthropic, not-for-profit grantmaking institution based in New York. Established in 1934 by Alfred Pritchard Sloan Jr., then-president and chief executive officer of General Motors, the foundation makes grants in support of original research and education in science, technology, engineering, mathematics and economics. For more, visit sloan.org.

About the John S. and James L. Knight Foundation

Knight Foundation supports transformational ideas that promote quality journalism, advance media innovation, engage communities and foster the arts. We believe that democracy thrives when people and communities are informed and engaged. For more, visit knightfoundation.org.

ALA Annual 2015 schedule, with bonus mod_proxy hackery / Galen Charlton

My ALA Annual this year is going to focus on five hashtags: #mashcat, #privacy, #nisoprivacy, #kohails, and #evgils.

#mashcat is for Mashcat, which is an effort to build links between library systems and library metadata folks. We’ve had some recent success with Twitter chats, and I’ve made up some badge ribbons. If you’d like one, tweet at me (@gmcharlt)!

#privacy and #nisoprivacy are for patron privacy; my particular interest is in using our technology to better protect it. I’ll be running the LITA Patron Privacy Technologies Interest Group meeting on Saturday (where I look forward to Alison Macrina’s update on Let’s Encrypt). I’ll also be participating in the face-to-face meeting on Monday and Tuesday for the NISO project to create a consensus framework for patron privacy in digital library and information systems.

#kohails and #evgils are for Koha and Evergreen, both of which I hack on and which MPOW supports – so one of the things I’ll also be doing is wearing my vendor hat while boothing and meeting.

Here’s my conference schedule so far, although I hope to squeeze in a Linked Data program as well:

In the title of the post, I promised mod_proxy hackery. Not typical for an ALA schedule post? Well, the ALA scheduler website allows you to choose to make your schedule public. If you do that, you can embed the schedule in a blog post using an iframe.

Here’s the HTML that the scheduler suggests:

<iframe src="http://alaac15.ala.org/user/36364/schedule-embed"
 width="600" height="600"></iframe>

There’s a little problem with that suggestion, though: my blog is HTTPS-only. As a consequence, an HTTP iframe won’t be rendered by the browser.

What if I change the embedded URL to “https://alaac15.ala.org/user/36364/schedule-embed”? Still doesn’t work, as the SSL certificate returned is for https://connect.ala.org, which doesn’t match alaac15.ala.org. *cough*

Rather than do something simple, such as using copy-and-paste, I ended up configuring Apache to set up a reverse proxy. That way, my webserver can request my schedule from ALA’s webserver (as well as associated CSS), then present it to the web browser over HTTPS. Here’s the configuration I ended up with, with a bit of help from Stack Overflow:

# The ALA scheduler badly needs an SSL cert that matches its hostname;
# until it has one, fetch over plain HTTP and re-serve under my own HTTPS host.
# Requires mod_proxy, mod_proxy_http, mod_proxy_html, and mod_headers.
    ProxyPass /alaac15/ http://alaac15.ala.org/
    ProxyPassReverse /alaac15/ http://alaac15.ala.org/
    ProxyHTMLURLMap http://alaac15.ala.org /alaac15/

    <Location /alaac15/>
       ProxyPassReverse /
       # Rewrite links in the returned HTML so they stay on this proxy
       SetOutputFilter  proxy-html
       ProxyHTMLURLMap http://alaac15.ala.org /alaac15/
       ProxyHTMLURLMap / /alaac15/
       ProxyHTMLURLMap  /alaac15/ /alaac15/
       # mod_proxy_html can't parse gzipped responses, so ask for plain text
       RequestHeader    unset  Accept-Encoding
    </Location>
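
The proxy-html output filter comes from mod_proxy_html; that module, along with mod_proxy, mod_proxy_http, and mod_headers, isn’t always enabled (or even installed) out of the box. On a Debian-flavoured Apache — an assumption on my part, and on some releases mod_proxy_html is a separate package — turning everything on looks something like:

$ sudo a2enmod proxy proxy_http proxy_html headers
$ sudo apache2ctl configtest && sudo service apache2 reload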

This is a bit ugly (and I’ll be disabling the reverse proxy after the conference is over)… but it works for the moment, and also demonstrates how one might make a resolutely HTTP-only service on your intranet accessible over HTTPS publicly.

Onward! I look forward to meeting friends old and new in San Francisco!

For ALA 2015: Three Free OPAC Enhancements / LibraryThing (Thingology)


For a limited time, LibraryThing for Libraries (LTFL) is offering three of its signature enhancements for free!

There are no strings attached. We want people to see how LibraryThing for Libraries can improve your catalog.

  1. Check Library.

    The Check Library button is a “bookmarklet” that allows patrons to check whether your library has a book while they are on Amazon and most other book websites (see the sketch after this list for how bookmarklets work). Unlike other options, LibraryThing knows all of the editions out there, so it finds the edition your library has. Learn more about Check Library.

  2. Other Editions

    Let your users know everything you have. Don’t let users leave empty-handed when the record that came up is checked out. Other Editions links all your holdings together in a FRBR model—paper, audiobook, ebook, even translations.

  3. Lexile Measures

    Put MetaMetrics’ The Lexile Framework® for Reading in your catalog, to help librarians and patrons find material based on reading level. In addition to showing the Lexile numbers, we also include an interactive browser.
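
For anyone new to bookmarklets: a bookmarklet is an ordinary browser bookmark whose “address” is a snippet of JavaScript that runs against the page you are currently viewing. A minimal sketch of the general pattern only — the endpoint and parameter below are invented for illustration, and LTFL generates the real Check Library bookmarklet for your catalog (in practice the snippet is collapsed to one line when saved as a bookmark):

  javascript:(function () {
    // Hypothetical endpoint: hand the current page's URL to a catalog lookup
    location.href = "https://catalog.example.org/check?u=" +
      encodeURIComponent(location.href);
  })();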

Easy to Add

LTFL Enhancements are easy to install and can be added to every major ILS/OPAC system and most of the minor ones. Enhancements can be customized and styled to fit your catalog, and detailed usage reporting lets you know how they’re doing.

See us at ALA. Stop by booth 3634 at ALA Annual this weekend in San Francisco to talk to Tim and Abby and see how these enhancements work.

If you need a free pass to the exhibit hall, details are in this blog post.

Sign up

We’re offering these three enhancements free, for at least two years. We’ll probably send you links showing you how awesome other enhancements would look in your catalog, but that’s it.

Find out more at http://www.librarything.com/forlibraries or email Abby Blachly at abby@librarything.com.

Disenfranchising Language in Library Technology / LITA

Editor’s Note: This is a guest post by Justin M. White.

A post by the net librarian was making the rounds on Tumblr a while back and caught my eye. It was short, so I’ll quote most of it here:

As a public librarian, a lot of my job is writing. Copy for websites, computer class handouts, signage, etc. It’s critical that librarians know what language patrons understand. Unfortunately a lot of tech stuff doesn’t use accessible language.

There’s a copier in one of the libraries I work at which has an error message that pops up often which says “insert key counter”. I’m sure this is precise and accurate language to the programmer who wrote the error message, but it really doesn’t mean anything. After trial and error it means you forgot to put money in, so the copier won’t work. But how is the average patron supposed to figure that out?


I subsequently discovered that there’s a surprising lack of discussion about this in the library literature, but what does exist is very promising. Adriene Lim wrote “The Readability of Information Literacy Content on Academic Library Web Sites” back in 2010, which analyzed the readability of library website content that was designed to provide basic research instruction. While most of the libraries surveyed scored well in accessibility of language, some were far more complicated. This is of particular concern for librarians like myself who are working with large populations of ESL and first-generation students.

Here is an actual example of an error message an ESL student in my library had trouble with:

[Screenshot: a web form error message referring to a “Help Explanation” field.]

Note that there isn’t a field called “Help Explanation”, but rather a “Describe the kind of help” section. The error message was generated, in this instance, by a space being the first character in the field. As far as the student knew, there was some other field called Help Explanation that wasn’t being filled out, leading them to frantically search the page in vain.

The LibPunk podcast addressed the issues of communication between librarians and IT staff in its final episode. One important point brought up was the difference in focus: a fix from IT might be well done, but does it have the user in mind? Librarians can have the same blind spots: the example brought up was catalogers who make records without the user in mind.

Another article, “ESL Library Skills: An Information Literacy Program for Adults with Low Levels of English Literacy”, focused on the range of information literacy programs for ESL populations. Libraries are overwhelmingly in the ESL education business, and those users are going to require dependable and accessible technology as their English language skills grow.

Take note of the messages your library technology gives you. Are they indecipherable? Would they be accessible to an ESL student, or a student with below-average reading levels? Take a look at the messages you create for your library: the sticky note on the copier that explains some workaround. Is your note actually making things worse by putting a wall of text in front of the interface? Do you utilize non-text instructional materials in your LibGuides, or do the words tower over anxious ESL readers? Is your website content intuitive and clearly written out?

As librarians we push access as part of our professional goals. No librarian should be making their content and technology less accessible on purpose, but keeping the effect of the language we use in our minds as we go throughout our careers can lead to some very simple yet effective solutions.

Justin is an accidental technical services librarian at Hodges University in Florida. His interests usually revolve around library/archival technology, history, and information literacy, and reblogging photos of bunnies on all known social media outlets.

Become a Friend of The Public Domain Review / Open Knowledge Foundation

Open Knowledge project The Public Domain Review launches a major new fundraising drive, encouraging people to become Friends of the site by giving an annual donation.

For those not yet in the know, The Public Domain Review is a project dedicated to protecting and celebrating, in all its richness and variety, the cultural public domain. In particular, our focus is on the digital copies of public domain works, the mission being to facilitate the appreciation, use and growth of a digital cultural commons which is open for everyone.

We create collections of openly licensed works comprised of highlights from a variety of galleries, libraries, archives, and museums, many of whom also contribute to our popular Curator’s Choice series (including The British Library, Rijksmuseum, and The Getty). We also host a fortnightly essay series in which top academics and authors write about interesting and unusual public domain works which are available online.

Founded in 2011, the site has gone from strength to strength. In its four-plus years it has seen contributions from the likes of Jack Zipes, Frank Delaney, and Julian Barnes – and garnered praise from such media luminaries as The Paris Review, which called us “one of their favourite journals”, and The Guardian, which hailed us as a “model of digital curation”.

This is all very exciting but we need your help to continue the project into the future.

We are currently only bringing in around half of the base minimum required – the amount we need in order to tick along in a healthy manner. (And around a third of our ideal goal, which would allow us to pay contributors). So it is of urgent importance that we increase our donations if we want the project to continue.

Hence the launch of a brand new fundraising model through which we hope to make The Public Domain Review sustainable and able to continue into the future. Introducing “Friends of The Public Domain Review”: https://publicdomainreview.org/support/

Image 1: one of the eight postcards included in the inaugural postcard set. The theme is “Flight” and the set will be sent out to all Friends donating $30/£20/€27.50 or more before 8th July. Source: http://www.loc.gov/pictures/item/00650258

What is it?

This new model revolves around building a group of loyal PDR (Public Domain Review) supporters – the “Friends” – each of whom makes an annual donation to the project. This club of patrons will form the beating heart of the site, creating a bedrock of support vital to the project’s survival.

How can one become a Friend?

There is no fixed yearly cost to become a Friend – any annual donation will qualify you – but there is a guide price of $60 a year (£40/€55).

Are there any perks of being a Friend?

Yes! Any donation above $30 will make you eligible to receive our exclusive twice-a-year “postcard set” – 8 beautiful postcards curated around a theme, with a textual insert. Friends will also be honoured in a special section of the site and on a dedicated page in all PDR Press publications. They will also get first refusal in all future limited edition PDR Press creations, and receive a special end of year letter from the Editor.

How do I make my donation?

We’ve worked hard to make it as easy as possible to donate. You no longer have to use PayPal; you can now donate using your credit or debit card directly on the PDR site.

For more info, and to make your donation, visit: https://publicdomainreview.org/support/

Become a Friend before 8th July to receive the inaugural postcard set upon the theme of “Flight”

Image 2: one of the eight postcards included in the inaugural postcard set. The theme is “Flight” and the set will be sent out to all Friends donating $30/£20/€27.50 or more before 8th July. Source: http://www.loc.gov/pictures/item/2002722387/

Thursday Threads: Data Management Plans, Better Q/A Sessions, App for Bird Identification / Peter Murray


This week’s threads:

NOTE! Funding for my current position at LYRASIS runs out at the end of June, so I am looking for new opportunities and challenges for my skills. Check out my resume/c.v. and please let me know of job opportunities in library technology, open source, and/or community engagement.

Feel free to send this to others you think might be interested in the topics. If you find these threads interesting and useful, you might want to add the Thursday Threads RSS Feed to your feed reader or subscribe to e-mail delivery using the form to the right. If you would like a more raw and immediate version of these types of stories, watch my Pinboard bookmarks (or subscribe to its feed in your feed reader). Items posted to Pinboard are also sent out as tweets; you can follow me on Twitter. Comments and tips, as always, are welcome.

Where Should You Keep Your Data?

Federal funding agencies have made it clear that grant proposals must include plans for sharing research data with other scientists. What has not been clear is how and where researchers should store their data, which can range from sensitive personal medical information to enormous troves of satellite imagery. …

The good news is that formal policies — with recommendations for storage — are beginning to emerge from federal agencies. The bad news is that if you don’t comply with the new policies, you might be prohibited from receiving additional grant money.

Where Should You Keep Your Data?, by Karen M. Markin, The Chronicle of Higher Education, 23-Jun-2015

The even better news? Libraries are gearing up to help you. The article suggests searching for “data-management plan” in your university’s search engine. It also points to the “DMP Tool,” hosted by the University of California. It provides free, interactive forms that guide your preparation of data management plans.

Index Card-based Question and Answer Sessions

Here is the formula:

  1. Throw away the audience microphones.
  2. Buy a pack of index cards.
  3. Hand out the cards to the audience before or during your talk.
  4. Ask people to write their questions on the cards and pass them to the end of the row.
  5. Collect the cards at the end of the talk.
  6. Flip through the cards and answer only good (or funny) questions.
  7. Optional: have an accomplice collect and screen the questions for you during the talk.

Better yet, if you are a conference organizer, buy enough index cards for every one of your talks and tell your speakers and volunteers to use them.

Ban boring mike-based Q&A sessions and use index cards instead, by Valerie Aurora, 23-Jun-2015

I love this idea. It is a great way to get questions from people who aren’t confident enough (or quick enough) to get to the aisle microphones to ask questions. It also allows the speaker to get to the most interesting questions from the audience. A second optional suggestion: have another accomplice transcribe questions from Twitter for both in-person and livestream attendees.

What’s that Bird? There is an App for That!

Part of the mission of the Cornell Lab of Ornithology is to help people answer the question, “What is that bird?” And so, in collaboration with the Visipedia research project, they’ve designed Merlin, a free app available on iTunes and Google Play.

What Kind of Bird Is That?: A Free App From Cornell Will Give You the Answer, by Dan Colman, Open Culture, 10-Jun-2015

Our family tried this one out in the backyard, and it works! Here is a video created by Cornell that shows off the app.

Firefox privacy—updated / William Denton

Here’s an update to my post a month ago about the Firefox extensions I use on my laptop to increase privacy. I’m no expert, and it may do little or nothing against spy agencies, but it does confuse corporate tracking, which is also important. One key point, which does defend against spy agencies: use Tor more (but watch out). Every grain of sand we each throw in the gears is a help.

Remove Chromium

First, though, yesterday I deleted Chromium, the purportedly free and open browser that Google extends to make Chrome, its proprietary browser. I had used it as the only browser for using my Google accounts (which are purely for work) and for other web sites that had unreasonable demands for cookies and Javascript but that I had to use (such as for buying a ticket to a concert). Then I saw Rick Falkvinge’s Google Chrome listening in to your room shows the importance of privacy defense in depth, which reports that the browser was secretly downloading a binary that would let it listen for “ok google” in order to trigger searches.

[Screenshot: Chromium’s voice search status page, reporting “Hotword search enabled: no”]

The browser required a setting to be turned on to actually start listening, Google says, but nevertheless, without my knowing it, a blob of unknown code had been installed on my computer that could listen to my microphone. The links in Falkvinge’s piece are worth following to what developers thought of this and how Google handled it. The Guardian picked up the story with Google eavesdropping tool installed on computers without permission.

My response:

$ sudo apt-get purge chromium-browser

Of course, I am walking around with a perfectly constructed and highly sophisticated monitoring device in my pocket which must have its microphone turned on—because it’s my phone—but that’s another issue.

I’m not sure what browser I’ll now use for Google access.

Extensions I use

Now, the extensions. I deleted Adblock Plus and now use uBlock, thanks to a recommendation on Twitter. It’s under the GPL v3 and completely free and open, and isn’t, in effect, a protection racket, as described in Both AdBlock Plus and the media are worried about Safari’s upcoming features.

AdBlock Plus needs to work well for its parent company to make money. Not from its users, of course, but from the companies that want to pay the company to make sure their advertisements aren’t blasted into oblivion by the extension.

On my phone I’m still seeing lots of ads (and hence being tracked) but that’s another issue too.

I also installed NoScript, which I should have done a long time ago. It blocks Javascript (and some other things) on sites unless I enable it where I want it to run. It comes with a whitelist, but it’s editable.

This is my updated list of privacy extensions, with their licenses (MPL == Mozilla Public License, GPL == GNU Public License) and links to source if public:

Also:

Check in on Lightbeam every few days to see a nice visualization of how your information is being passed from one site to another. It’s incredible.

Blocking referrers

None of those do anything with the HTTP referer header, so I looked into how to handle that myself. Opening the (pseudo-)URL about:config in Firefox and searching for referer (an aged misspelling) showed this:

[Screenshot: about:config entries matching “referer”]

Finding the details of these is strangely difficult on the Mozilla Firefox site, but Improve online privacy by controlling referrer information documents them.

network.http.referer.XOriginPolicy

  • 0: always send referrer (default).
  • 1: only send if base domains match.
  • 2: only send if hosts match.

I set this to 1.

network.http.referer.spoofSource

  • false: send the referrer (default).
  • true: spoof the referrer and instead use the target URI

I set this to true.

network.http.referer.trimmingPolicy

  • 0: send full URI (default)
  • 1: send scheme, host, port and path
  • 2: send scheme, host and port

I set this to 2.

network.http.sendRefererHeader

  • 0. never send the referring URL
  • 1. send when following a link
  • 2. send when following a link or loading an image (default)

I set this to 1.
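
Since these four prefs travel together, they can also be pinned in a user.js file in the Firefox profile directory, which reapplies them at every startup. A minimal sketch using the values I chose above (user.js is standard Firefox behaviour, but back up your profile before experimenting):

user_pref("network.http.referer.XOriginPolicy", 1);  // only send if base domains match
user_pref("network.http.referer.spoofSource", true); // spoof the referrer with the target URI
user_pref("network.http.referer.trimmingPolicy", 2); // send scheme, host and port only
user_pref("network.http.sendRefererHeader", 1);      // send only when following a link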

This may stop some sites from working. I’ll wait and see.

Server-side

Eric Hellman’s Protect reader privacy with referrer meta tags told me about the new referrer (finally properly spelled) meta tag in HTML 5, which allows a site owner to suggest to a browser what referrer information it should pass on to a linked site. You’re viewing this over HTTPS, so this shouldn’t matter when linking to a non-secure HTTP site: 15.1.3 Encoding Sensitive Information in URI’s in RFC 2616 says, “Clients SHOULD NOT include a Referer header field in a (non-secure) HTTP request if the referring page was transferred with a secure protocol.” However, my URL would be passed on to HTTPS sites. To prevent this, I added this to my page template:

 <meta name="referrer" content="origin-when-cross-origin" />

This means that links leading off my site should pass on that the browser came from https://www.miskatonic.org/, without giving the particular page, but links staying inside my site will pass the full referring URL.
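
HTML 5 is also gaining a per-element referrerpolicy attribute for cases where you want to change the behaviour of a single link rather than the whole site. Browser support was still patchy last I checked, so treat this as a sketch rather than a guarantee:

 <a href="http://www.example.com/" referrerpolicy="no-referrer">an outbound link</a>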

Use Tor more

I’m using Tor more. I try to use it for as much regular anonymous browsing as I can: just reading the newspapers or looking at blogs or checking something on Wikipedia or any other normal behaviour. Using Tor like this has two wide advantages: it increases the overall traffic on the network, which helps confuse what everyone else is doing, and it means that more Tor traffic hits regular web sites, which also helps confuse what everyone else is doing. The more people that use it, the better for everyone.

It has its faults. It shows ads! Here’s what the Toronto Star home page looks like right now in Tor:

[Screenshot: the Toronto Star home page as rendered in Tor]

Not only do I see the ad at the top (which uBlock normally blocks for me), it’s messed up and ugly. But I don’t mind.

Using Tor (or ssh) means that you become a permanent target of the security agencies and everything you do will be logged and analyzed, so know what you’re doing.

Still, every grain of sand we each throw in the gears is a help.

DSpace User Group Highlights from OAI9 / DuraSpace News

From Bram Luyten, @mire  

Heverlee, Belgium: The OAI9 Conference held in Geneva shed light on a wide range of current open access developments. With over 138 countries represented, it was more than a success. As organizing committee member Jens Vigen stated: “Geneva is the place where people meet and particles collide”. We could not agree more.

University reputation and ranking – another way that Europe is not like US / HangingTogether

View of Union Square during the Research Library Partnership meeting, via Roy Tennant on Instagram

The OCLC Research Library Partnership Rep, Rank & Role meeting was held in San Francisco, California on 3-4 June 2015. It focused on the library’s contribution to university ranking and researcher reputation. A distinguished group of speakers provided a mix of perspectives. You can check out their slides (and videos) here. You can get a good sense of the conference from these but a few of us have decided to report here on hangingtogether.org on some of the major themes that shaped interests and interaction. The others will follow soon.

One of our meeting goals was to have these presentations spark discussion among attendees about how the library might advance university goals around reputation, assessment and recording of research. We had a nicely varied attendance that included a contingent from outside the USA. This trans-national dimension also characterized our speakers.

Most importantly, this perspective from outside the US shaped discussions and was one of the most important differentiators of perspective and focus among libraries contemplating their role in the management of information generated by research and related to the research process.

If you operate in a country with a national research assessment regime such as those implemented in the UK, the Netherlands and Australia you think differently about the reputation and ranking challenge, you feel differently about the ways in which research outputs get judged qualitatively, and you emphasize investment in services that respond to these regimes and judgments. And that’s not the way peers in the US think about those same matters. See Keith Webster’s presentation (slides, video) for a nice timeline of assessment emergence and his thoughtful take on the impact of global ranking, the effects on resource discovery and some of the implications for librarians.

[N.B. If you are unfamiliar with these national research assessment exercises you might want to look through this report:
MacColl, John. 2010. Research Assessment and the Role of the Library. Report produced by OCLC Research. Published online. (.pdf: 276K/13 pp.).

Granted, it is a few years old, but it provides a good introduction to what might be an unfamiliar process (see pages 5-8) and catalogs some of the impacts for libraries.]

Where there is a national assessment regime, it has been implemented to guide and, in some countries, dictate the award of research funding from the national government agencies that support university-based research within the country. The tie-in to funding changes the conversation. In the US there is concern about faculty motivation to participate in activities that contribute to university reputation and rank. Services are designed to deliver benefits to researchers. Successful participation is driven by offering a personal individual benefit – a bibliography suitable for website publication, advice about how to increase the readership or citation frequency of a faculty member’s work, introductions to other researchers with similar interests for collaboration and networking, etc. See Ginny Steel’s presentation (slides, video soon) that emphasizes the faculty-centered design of the UCLA reputation management system. Contrast that with Wouter Gerritsma’s view (slides, video soon) of his work at Wageningen University in the Netherlands, which began with data and moved on to service provision.

Where research assessment exercises rule there is no question about faculty motivation. It is a requirement. It determines personal reputation, shapes professional contribution and feeds into the local reward system of the home institution. It creates and sustains the more intense interest in bibliometrics and altmetrics that characterizes discussion of reputation and rank in these countries.

It also creates a much franker interest in and discussion of institutional rank. Institutions outside the US are much more likely to have set a public goal of climbing to a particular level in one or more of the global university ranking schemes (THE, QS World, USNews, etc.). Once you begin granular assessment of individual research output, it seems easier to be honest about the way that rolls up into institutional striving and ambition. See Jan Wilkinson’s presentation (slides, video soon) about the University of Manchester’s ambitions and her candid assessment of what it takes to move an institution’s ranking. This nicely complemented Jiro Kikuryo’s presentation (slides, video soon) about the challenges of getting non-English language research properly appreciated and linking university ranking ambitions to broader societal values and goals.

Some of the conference discussion made the point that the US is on the same trajectory and only behind in the adoption of these assessment and ranking objectives. The market penetration of the support systems and tools outside the US is deeper but a similar pattern of take-up may unfold in the US over the next few years.

There were at least two areas where those with assessment regimes and those without shared similar concerns. One was with the balance between compliance and service goals. Whether being driven by mandates (like those proceeding from the unfunded OSTP pronouncements in the US) or by funding allocations (like those dictated by the UK’s Research Excellence Framework) librarians worried that their support roles would result in them being viewed as ‘instruments of compliance’ by faculty and research staff. This is at odds with the long history of the library as service and support organization within the academy. There was some feeling that a convergence of the US faculty-oriented service development with the expertise outside the US in implementation and support of assessment tools would be the combination that allows librarians to be viewed as partners at the individual faculty level and as important contributors to university administrative ambitions. See David Seaman’s presentation (slides, video soon) for a candid view of the struggle to reconcile these conflicting goals while developing a new research and reputation management system at Dartmouth.

Another area of shared concern independent of assessment was how to round out the view of research beyond STEM disciplines. There are significant challenges in measuring the research contributions of humanities and social science disciplines. The metrics that have emerged around STEM disciplines don’t fit these disciplines and consequently they are undervalued in both university ranking as well as assessment judgments. There was a desire to bring library expertise to bear even while understanding the systemic and cultural complexity of this challenge. See Catherine Mitchell’s nuanced presentation (slides, video soon) about the humanities push back as she led the California Digital Library’s attempt to introduce a research information management system.

In short, all discussions about reputation and ranking need to be predicated on the presence or absence of national assessment regimes and should take into account the need to offer library services that provide individual benefit, contribute to institutional ambitions, and reflect the full range of impact that research has for the university community, the nation’s citizenry and global society.

About Jim Michalko

Jim coordinates the OCLC Research office in San Mateo, CA, and focuses on relationships with research libraries and on work that renovates the library value proposition in the current information environment.

ALA secures activism leaders for Washington Update session / District Dispatch

Golden Gate Bridge and Fort Point in San Francisco. Photo by Jon Sullivan.

For decades, the Electronic Frontier Foundation (EFF) and the American Library Association (ALA) have stood shoulder to shoulder on the front lines of the fight for privacy online, at the library and in many other spheres of our daily lives. Standing in for EFF Executive Director Cindy Cohn, the award-winning group’s Activism Director, Rainey Reitman, will discuss that proud shared history and the uncertain future of personal privacy during this year’s 2015 ALA Annual Conference in San Francisco.

The session, titled “Frenetic, Fraught and Front Page: An Up-to-the-Second Update from the Front Lines of Libraries’ Fight in Washington,” takes place from 8:30 to 10:00 a.m. on Saturday, June 27, 2015, at the Moscone Convention Center in room 2001 of the West building. Also speaking will be Jackson Bird, communications director and spokesperson for the Harry Potter Alliance (HPA) and noted YouTube video producer and blogger. HPA, an online activism community of hundreds of thousands of people, is dedicated to “changing the world by making activism accessible through the power of story.” Since 2005, it has engaged millions of fans through its work for equality, human rights, and literacy, and it is among the latest national advocacy organizations to actively collaborate with the American Library Association.

In addition to the featured speakers, Adam Eisgrau, managing director of the ALA Office of Government Relations in Washington, will provide up-to-the-minute insight from the congressional trenches of key federal privacy legislation “in play,” including the current status of efforts to reform the USA PATRIOT Act, the Freedom of Information Act (FOIA), as well as copyright reform, and federal library funding. Also, Larra Clark, deputy director of the ALA Office for Information Technology Policy, will update attendees on critical broadband policy issues and OITP’s multi-year Policy Revolution! initiative.

Participants will have the opportunity to pose questions to the speakers, which include Jackson Bird, communications director, Harry Potter Alliance; Larra Clark, deputy director, Office for Information Technology Policy,  American Library Association Washington Office; Adam Eisgrau, managing director, Office of Government Relations, American Library Association Washington Office; and Rainey Reitman, activism director, Electronic Frontier Foundation.

As the leader of the activism team at the Electronic Frontier Foundation, Reitman is especially interested in the intersection between personal privacy and technology, particularly social networking privacy, network security, web tracking, government surveillance, and online data brokers. Reitman is the chief operating officer and co-founder of the Freedom of the Press Foundation, a nonprofit organization that defends and supports unique, independent, nonprofit journalistic institutions. In 2013, she received the Hugh M. Hefner First Amendment Award in Journalism. Reitman also is a founder and steering committee member for the Chelsea Manning Support Network, a network of individuals and organizations advocating for the release of accused WikiLeaks whistleblower Private Chelsea Manning.

A speaker, video creator, and wizard activist, Bird began working with the Harry Potter Alliance as a volunteer in 2010 and was hired as the Communications Director and Spokesperson in 2013. He graduated from New York University in 2012 with a Bachelor’s in Comparative Literature and a concentration in Documentary Filmmaking. He regularly speaks about new media and fan activism at events including TEDx Women, MIT’s Futures of Entertainment, Book Expo America, and San Diego Comic-Con. Jackson produces the HPA’s online videos, which have been featured on Upworthy, the Huffington Post, BuzzFeed, Mashable, and more. He also writes the horoscopes for the HPA’s quarterly newsletter, The W.A.N.D., and stars on the organization’s YouTube channel as The Boy Who Vlogged. Jackson currently lives in New York City, where he co-runs the Giant Squidstravaganza fanblog, Cephanloblogcast, and can be found on YouTube answering the age-old question, “Will It Waffle?”

The post ALA secures activism leaders for Washington Update session appeared first on District Dispatch.

Domain Name Privacy / Coral Sheldon-Hess

An important piece of internet privacy is under attack, and I’m asking for help protecting it. The EFF explains it well, but short version: every domain (the part of a web address that comes after “www” and includes something like “.org” or “.com” — in the case of this blog, “sheldon-hess.org” is the domain) is registered by a person or a company. As part of that registration, the registrant’s physical address is recorded. By default, that address is available when someone looks up information about the domain, using a service called “WHOIS,” but it has become a fairly common practice to allow domain registrars (the companies that lease domains) to anonymize that information.

As a woman with opinions on the internet, you bet I avail myself of this service. Here’s the WHOIS on this domain (also visible in the banner of this post). The WHOIS on my other four domains look much the same.
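
If you want to see what a lookup exposes for yourself, the stock whois command-line client will print the same record the web lookups show (assuming whois is installed on your system; with anonymization turned on, the registrant block lists the privacy service rather than a home address):

$ whois sheldon-hess.org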

ICANN, the organization that makes policies around WHOIS (and domain names in general), is considering removing this option for websites deemed “commercial” — which could be construed as including any site that uses ads or talks about consulting services. (Like this one.) A lot of really important sites’ owners would suddenly lose this protection, all because the entertainment industry wants to have an easier time suing people.

If you don’t own any domains, or even if you do, I’m hoping you’ll take a minute to write to ICANN. I put a sample letter, similar to (but different from) the one I sent, below; modify it until you like it, and send it along, please. Also, please consider signing this petition, hosted at SaveDomainPrivacy.org.

If you do have a domain registered, will you please also ask your host/registrar to fight for anonymity? I offer a sample letter, much like the one I sent DreamHost, below.

A sample letter to send to ICANN

This should be sent to comments-ppsai-initial-05may15@icann.org

Anonymized WHOIS is a very important first step in the fight to protect free speech online, especially for marginalized voices. GamerGate and other online hate groups target people with whom they disagree and whose addresses they are able to find. This article is a good primer on the kind of damage they do: http://feministing.com/2015/01/16/things-have-happened-in-the-past-week-on-doxing-swatting-and-8chan/

Because “commercial” has been construed so broadly, in the past, preventing registrars from anonymizing WHOIS data would put the owners of many valuable and informative websites at risk.

While I understand that the entertainment industry wants to increase individual liability for unauthorized distribution of their property, the threat to domain owners and the chilling effect this would have on free speech cannot be overlooked.

A sample letter to send to your domain registrar or web host

One of the things I like very best about [your hosting service] is the private domain name registrations that you offer. Anonymized WHOIS is a very important first step in the fight to protect free speech online, especially for marginalized voices. You might be aware of GamerGate and other online hate groups and the real damage they do to people whose addresses they are able to find. If not, this is a good primer: http://feministing.com/2015/01/16/things-have-happened-in-the-past-week-on-doxing-swatting-and-8chan/

The ability to anonymize WHOIS data is under attack now, and I would like to know what [your company] is doing to help keep this important safety option available. https://www.eff.org/deeplinks/2015/06/changes-domain-name-rules-place-user-privacy-jeopardy

DreamHost’s response

I completely understand your concern about this, and will address the issue as candidly as I can. DreamHost has no official position with regard to this matter, but we have not abandoned our commitment to freedom of speech, and the avoidance of chilling effects on that speech, with regard to anonymous speech on the web.

As an ICANN accredited registrar, we are bound by whatever rules ICANN puts in place, and are closely watching this development. To the degree we are able we will lobby to keep the current policy in place.

That said, while I am an avid supporter of the EFF (and a monthly contributor), they were less than completely accurate in that post with regard to the facts of the current situation. All ICANN registrars are bound by section 3.7.7.3 of the ICANN Registrar Accreditation Agreement (see https://www.icann.org/resources/pages/ra-agreement-2009-05-21-en#3.7.7.3 and https://www.icann.org/en/system/files/newsletters/draft-advisory-raa-3773-14may10-en.pdf). As a practical matter, this effectively requires all ICANN Accredited Registrars to release to any third party (not just law enforcement or attorneys) identifying data for registrants upon receipt of “reasonable evidence of actionable harm”, and does not require a court order or subpoena. We address this in our Privacy Policy.

For this reason, while Whois Privacy does provide some protection from casual trolls or Internet miscreants, it is naive to assume that any registrant’s actual identifying data is truly private in the face of an articulated claim of wrongdoing supported by any reasonable evidence.

For most of our customers, the current reality is that the degree of privacy afforded by Whois Privacy Services is of value, but it should not be considered a general protection against being identified as the registrant of a domain.