Planet Code4Lib

Abby Smith Rumsey's "When We Are No More" / David Rosenthal

Back in March I attended the launch of Abby Smith Rumsey's book When We Are No More. I finally found time to read it from cover to cover, and can recommend it. Below the fold are some notes.

There are four main areas where I have comments on Rumsey's text. On page 144, in the midst of a paragraph about the risks to our personal digital information she writes:
The documents on our hard disks will be indecipherable in a decade.
The word "indecipherable" implies not data loss but format obsolescence. As I've written many times, Jeff Rothenberg was correct to identify format obsolescence as a major problem for documents published before the advent of the Web in the mid-90s. But the Web caused documents to evolve from being the private property of a particular application to being published. On the Web, published documents don't know what application will render them, and are thus largely immune to format obsolescence.

It is true that we're currently facing a future in which most current browsers will not render preserved Flash, not because they don't know how to but because it isn't safe to do so. But oldweb.today shows that the technological fix for this problem is already in place. Format obsolescence, were it to occur, would be hard for individuals to mitigate. Especially since it isn't likely to happen, it isn't helpful to lump it in with threats they can do something about by, for example, keeping local copies of their cloud data.

On page 148 Rumsey discusses the problem of the scale of the preservation effort needed and the resulting cost:
We need to keep as much as we can as cheaply as possible. ... we will have to invent ways to essentially freeze-dry data, to store data at some inexpensive low level of curation, and at some unknown time in the future be able to restore it. ... Until such a long-term strategy is worked out, preservation experts focus on keeping digital files readable by migrating data to new hardware and software systems periodically. Even though this looks like a short-term strategy, it has been working well  ... for three decades and more.
Yes, it has been working well and will continue to do so provided the low level of curation manages to find enough money to keep the bits safe. Emulation will ensure that, if the bits survive, we will be able to render them, and it does not impose significant curation costs along the way.

The aggressive (and therefore necessarily lossy) compression Rumsey envisages would reduce storage costs, and I've been warning for some time that Storage Will Be Much Less Free Than It Used To Be. But it is important not to lose sight of the fact that ingest, not storage, is the major cost in digital preservation. We can't keep it all; deciding what to keep and putting it someplace safe is the most expensive part of the process.

On page 163 Rumsey switches to ignoring the cost and assuming that, magically, storage supply will expand to meet the demand:
Our appetite for more and more data is like a child's appetite for chocolate milk: ... So rather than less, we are certain to collect more. The more we create, paradoxically, the less we can afford to lose.
Alas, we can't store everything we create now, and the situation isn't going to get better.

On page 166 Rumsey writes:
Other than the fact that preservation yields long-term rewards, and most technology funding goes to creating applications that yield short-term rewards, it is hard to see why there is so little investment, either public or private, in preserving data. The culprit is our myopic focus on short-term rewards, abetted by financial incentives that reward short-term thinking. Financial incentives are matters of public policy, and can be changed to encourage more investment in digital infrastructure.
I completely agree that the culprit is short-term thinking, but the idea that "incentives ... can be changed" is highly optimistic. The work of, among others, Andrew Haldane at the Bank of England shows that short-termism is a fundamental problem in our global society. Inadequate investment in infrastructure, both physical and digital, is just a symptom, and is far less of a problem than society's inability to curb carbon emissions.

Finally, some nits to pick. On page 7 Rumsey writes of the Square Kilometer Array:
up to one exabyte (10^18 bytes) of data per day
I've already had to debunk another "exabyte a day" claim. It may be true that the SKA generates an exabyte a day, but it could not store that much data. An exabyte a day is most of the world's production of storage. Like the Large Hadron Collider, which throws away all but one byte in a million before it is stored, the SKA actually stores only(!) a petabyte a day (according to Ian Emsley, who is responsible for planning its storage). A book about preserving information for the long term should be careful to maintain the distinction between the amounts of data generated and stored. Only the stored data is relevant.
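A back-of-the-envelope sketch in Python makes the scale concrete. The ~500 EB/year figure for worldwide storage production is my own mid-2010s ballpark assumption, not a number from the book or the SKA:

    EB = 10**18                        # bytes in an exabyte
    PB = 10**15                        # bytes in a petabyte

    generated_per_year = 365 * EB      # if the SKA stored 1 EB/day
    world_production = 500 * EB        # assumed annual storage manufactured

    # Storing 1 EB/day would consume most of a year's storage production...
    print(generated_per_year / world_production)   # ~0.73
    # ...while storing 1 PB/day instead is a thousandfold reduction.
    print(EB // PB)                                # 1000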

On page 46 Rumsey writes:
our recording medium of choice, the silicon chip, is vulnerable to decay, accidental deletion and overwriting
Our recording medium of choice is not, and in the foreseeable future will not be, the silicon chip. It will be the hard disk, which is of course equally vulnerable, as any read-write digital medium would be. Write-once media would be somewhat less vulnerable, and they definitely have a role to play, but they don't change the argument.

CC-BY / William Denton

I’ve changed the license on my content to CC-BY: Creative Commons Attribution 4.0.

UPDATE 25 May 2016: The feed metadata is now updated too. “We copy documents based on metadata.”

Evergreen 2.10.4 released / Evergreen ILS

We are pleased to announce the release of Evergreen 2.10.4, a bug fix release.

Evergreen 2.10.4 fixes the following issues:

  • Fixes the responsive view of the My Account Items Out screen so that Title and
    Author are now in separate columns.
  • Fixes an incorrect link for the MVF field definition and adds a new link to
    BRE in fm_IDL.xml.
  • Fixes a bug where the MARC stream authority cleanup deleted a bib
    record instead of an authority record from the authority queue.
  • Fixes a bug where Action Triggers could select an inactive event
    definition when running.
  • Eliminates the output of a null byte after a spool file is processed
    in the MARC stream importer.
  • Fixes an issue where previously-checked-out items did not display in
    metarecord searches when the Tag Circulated Items Library Setting is
    enabled.
  • Fixes an issue in the 0951 upgrade script where the script was not
    inserting the version into config.upgrade_log because the line to do so
    was still commented out.

Please visit the downloads page to retrieve the server software and staff clients.

Sharing journals freely online / John Mark Ockerbloom

What are all the research journals that anyone can read freely online?  The answer is harder to determine than you might think.  Most research library catalogs can be searched for online serials (here’s what Penn Libraries gives access to, for instance), but it’s often hard for unaffiliated readers to determine what they can get access to, and what will throw up a paywall when they try following a link.

Current research

The best-known listing of current free research journals has been the Directory of Open Access Journals (DOAJ), a comprehensive listing of free-to-read research journals in all areas of scholarship. Given the ease with which anyone can throw up a web site and call it a “journal” regardless of its quality or its viability, some have worried that the directory might be a little too comprehensive to be useful.  A couple of years ago, though, DOAJ instituted more stringent criteria for what it accepts, and it recently weeded its listings of journals that did not reapply under its new criteria, or did not meet its requirements.   This week I am pleased to welcome over 8,000 of its journals to the extended-shelves listings of The Online Books Page.  The catalog entries are automatically derived from the data DOAJ provides; I’m also happy to create curated entries with more detailed cataloging on readers’ request.

Historic research

Scholarly journals go back centuries.  Many of these journals (and other periodicals) remain of interest to current scholars, whether they’re interested in the history of science and culture, the state of the natural world prior to recent environmental changes, or analyses and source documents that remain directly relevant to current scholarship.  Many older serials are also included in The Online Books Page’s extended shelves courtesy of HathiTrust, which currently offers over 130,000 serial records with at least some free-to-read content.  Many of these records are not for research journals, of course, and those that are can sometimes be fragmentary or hard to navigate.  I’m also happy to create organized, curated records for journals offered by HathiTrust and others at readers’ request.

It’s important work to organize and publicize these records, because many of these journals that go back a long way don’t make their content freely available in the first place one might look.  Recently I indexed five journals founded over a century ago that are still used enough to be included in Harvard’s 250 most popular works: Isis, The Journal of Comparative Neurology, The Journal of Infectious Diseases, The Journal of Roman Studies, and The Philosophical Review.  All five had public domain content that sat behind paywalls at their official journal sites or JSTOR (with fees for access ranging from $10 to $42 per article) but was available for free elsewhere online.  I’d much rather have readers find the free content than be stymied by a paywall.  So I’m compiling free links for these and other journals with public domain runs, whether they can be found at HathiTrust, JSTOR (which does make some early journal content, including from some of these journals, freely available), or other sites.

For many of these journals, the public domain extends as late as the 1960s due to non-renewal of copyright, so I’m also tracking when copyright renewals actually start for these journals.  I’ve done a complete inventory of serials published until 1950 that renewed their own copyrights up to 1977.  Some scholarly journals are in this list, but most are not, and many that are did not renew copyrights for many years beyond 1922.  (For the five journals mentioned above, for instance, the first copyright-renewed issues were published in 1941, 1964, 1959, 1964, and 1964 respectively– 1964 being the first year for which renewals were automatic.)

Even so, major projects like HathiTrust and JSTOR have generally stopped opening journal content at 1922, partly out of a concern for the complexity of serial copyright research.  In particular, contributions to serials could have their own copyright renewals separate from renewals for the serials themselves.  Could this keep some unrenewed serials out of the public domain?  To answer this question, I’ve also started surveying information on contribution renewals, and adding information on those renewals to my inventory.  Having recently completed this survey for all 1920s serials, I can report that so far individual contributions to scholarly journals were almost never copyright-renewed on their own.  (Individual short stories, and articles for general-interest popular magazines, often were, but not articles intended for scientific or scholarly audiences.)  I’ll post an update if the situation changes in the 1930s or later. So far, though, it’s looking like, at least for research journals, serial digitization projects can start opening issues past 1922 with little risk.  There are some review requirements, but they’re comparable in complexity to the Copyright Review Management System that HathiTrust has used to successfully open access to hundreds of thousands of post-1922 public domain book volumes.

Recent research

Let’s not forget that a lot more recent research is also available freely online, often from journal publishers themselves.  DOAJ only tracks journals that make their content open access immediately, but there are also many journals that make their content freely readable online a few months or years after initial publication.  This content can then be found in repositories like PubMedCentral (see the journals noted as “Full” in the “participation” column), publishing platforms like Highwire Press (see the journals with entries in the “free back issues” column), or individual publishers’ programs such as Elsevier’s Open Archives.

Why are publishers leaving money on the table by making old but copyrighted content freely available instead of charging for it?  Often it’s because it’s what makes their supporters – scholars and their funders – happy.  NIH, which runs PubMedCentral, already mandates open access to research it funds, and many of the journals that fully participate in PubMedCentral’s free issue program are largely filled with NIH-backed research.  Similarly, I suspect that the high proportion of math journals in Elsevier’s Open Archives selection has something to do with the high proportion of mathematicians in the Cost of Knowledge protest against Elsevier.  When researchers, and their affiliated organizations, make their voices heard, publishers listen.

I’m happy to include listings for significant free runs of significant research journals on The Online Books Page as well, whether they’re open access from the get-go or after a delay.  I won’t list journals that only make the occasional paid-for article available through a “hybrid” program, or those that only have sporadic “free sample” issues.  But if a journal you value has at least a continuous year’s worth of full-sized, complete issues permanently freely available, please let me know about it and I’ll be glad to check it out.

Sharing journal information

I’m not simply trying to build up my own website, though– I want to spread this information around, so that people can easily find free research journal content wherever they go.  Right now, I have a Dublin Core OAI feed for all curated Online Books Page listings as well as a monthly dump of my raw data file, both CC0-licensed.  But I think I could do more to get free journal information to libraries and other interested parties.  I don’t have MARC records for my listings at the moment, but I suspect that holdings information– what issues of which journals are freely available, and from whom– is more useful for me to provide than bibliographic descriptions of the journals (which can already be obtained from various other sources).  Would a KBART file, published online or made available to initiatives like the Global Open Knowledgebase, be useful?  Or would something else work better to get this free journal information more widely known and used?
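As a concrete example of what the existing feed already enables, a minimal OAI-PMH harvest takes only a few lines of Python. This is a sketch using the standard library; the base URL below is my placeholder assumption, not the feed's documented endpoint, so check the site for the real one:

    from urllib.request import urlopen
    from urllib.parse import urlencode
    import xml.etree.ElementTree as ET

    BASE = "https://onlinebooks.library.upenn.edu/oai"   # hypothetical endpoint
    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = "{http://purl.org/dc/elements/1.1/}"

    # Ask for the Dublin Core records and print each title.
    params = urlencode({"verb": "ListRecords", "metadataPrefix": "oai_dc"})
    with urlopen(f"{BASE}?{params}") as resp:
        tree = ET.parse(resp)

    for record in tree.iter(f"{OAI}record"):
        title = record.find(f".//{DC}title")
        if title is not None:
            print(title.text)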

Issues and volumes vs. articles

Of course, many articles are made available online individually as well, as many journal publishers allow.  I don’t have the resources at this point to track articles at an individual level, but there are a growing number of other efforts that do, whether they’re proprietary but comprehensive search platforms like Google Scholar and Web of Science, disciplinary repositories like arXiv and SSRN, institutional repositories and their aggregators like SHARE and BASE, or outright bootleg sites like Sci-Hub.  We know from them that it’s possible to index and provide access to the scholarly knowledge exchange at a global scale, but doing it accurately, openly, comprehensively, sustainably, and ethically is a bigger challenge.   I think it’s a challenge that the academic community can solve if we make it a priority.  We created the research; let’s also make it easy for the world to access it, learn from it, and put it to work.  Let’s make open access to research articles the norm, not the exception.

And as part of that, if you’d like to help me highlight and share information on free, authorized sources for online journal content, please alert me to relevant journals, make suggestions in the comments here, or get in touch with me offline.


The Radcliffe Workshop on Technology & Archival Processing / Library of Congress: The Signal

This is a guest post from Julia Kim, archivist in the American Folklife Center at the Library of Congress.

Photo of keynote speaker Matthew Connelly at podium.

Professor Matthew Connelly delivering the keynote. Photo by Radcliffe Workshop on Technology and Archival Processing.

The annual meeting of the Radcliffe Technology Workshop (April 4th – April 5th, #radtech16) brought together historians, (digital) humanists and archivists for an intensive discussion of the “digital turn” and its effect on our work. The result was a focused and highly participatory meeting among professionals working across disciplinary lines with regard to our respective methodologies and codes of conduct. The talks and panels served as springboards for rich conversations addressing many of the big-picture questions in our fields. Added to this was the use of round-table small group discussions after panel presentations, something that I wish were more of a norm at professional events. This post covers only a small portion of the two days.

Matthew Connelly (Columbia University) asked “Will the coming of Big Data mean the end of history as we know it?” The answer was a resounding “yes.” Based on his years as a researcher at the National Archives and Records Administration (NARA), Connelly surveyed the history of government secrets, its inefficiencies, and the minuscule sample rate determining record retention and the resultant losses to the historical record of major world events. Part of his work as a researcher involved making use of these efforts to initiate the largest searchable collection of now de-classified government records with “The Declassification Engine” and the History Lab. In amassing and analyzing the largest data collection of declassified and unredacted records, their work uncovers secrets via systematic omission, for example. (Read more at Wired magazine.)

The next panel, “Connections and Context: A Moderated Conversation about Archival Processing for the Digital Humanities Generation,” was organized around archival processing challenges and included Meredith Evans (Jimmy Carter Presidential Library and Museum), Cristina Pattuelli (Pratt Institute), and Dorothy Waugh (Emory University).

  • Meredith Evans (Jimmy Carter Presidential Library and Museum) of “Documenting Ferguson,” discussed her work “Documenting the Now” and her efforts to push archivists outside of their comfort zone and into the community to collect documentation as events unfolded.
  • Cristina Pattuelli (Pratt Institute) presented on the Linked Jazz linked data pilot project, which pulls together tools into a single platform to create connections with jazz-musician data. The initial data, digitized oral history transcripts, is further enriched and mashed with other types of data sets, like discography information from Carnegie Hall. (Read the overview published on EDUCAUSE.)
  • Dorothy Waugh (Emory University) spoke to the researcher aspect — or more aptly, the lack of researchers — of born-digital collections. (I wrote a related story titled “Researcher Interactions with Born-Digital”.) Her work underlines the need to cultivate not only donors but also the researchers we hope will one day want to investigate time-date stamps and disk images, for example. While few collections are available for research, the lack of researchers using born-digital collections is also a problem. Researchers are unaware of collections and do not, in a sense, know how to approach using these collections. She is in the process of developing a pilot project with undergraduate students to remedy this.
  • Benjamin Moser, the authorized biographer of Susan Sontag, spoke of his own discomfort, at times, with a researcher’s abilities to exploit privileged knowledge in email. To Moser, email increased the responsibilities of both the archive and the researcher to work in a manner that is “tasteful” and underlined the need to define and educate others in what that may mean. (Read his story published in The New Yorker.)

Mary O’Connell Murphy introducing “Collections and Context” panel. Photo by Radcliffe Workshop on Technology and Archival Processing.

There were a number of questions and concerns that we discussed, such as: What course of action is necessary or right when community activists feel discomfort with their submissions? How can we make sure that these collections aren’t misused? How can we protect individuals from legal prosecution? What are our duties to donors, to the law, and to our professions, and how do individuals navigate the conflicts among their competing claims? How can we, across disciplines, develop a way of discussing these issues? If the archives are defined as an associated set of values and practices, how can we address the lack of consensus on how to (re)interpret them, in light of the challenges of digital collections?

Claire Potter (the New School) delivered a keynote entitled “Fibber McGee’s Closet: How Digital Research Transformed the Archive– But Not the History Department,” which underlined these new challenges and the need for history methodologies to shift alongside shifts in archival methodologies. “The Archive, of course, has always represented systems of cognition,” as Potter put it, “but when either the nature of the archive or the way the archive is used changes, we must agree to change with it.” Historians must learn to triage in the face of the increased volume, despite the slow pace at which educational and research models have moved. Potter called for archivists and historians to work together to support our complementary roles in deriving meaning and use from collections. “The long game will be, historians, I hope, will begin to see archives and information technology as an intellectual and scholarly choice.” The archives can be a teaching space and a research space. (Read the text of her full talk.)

“Why Can’t We Stand Archival Practice on Its Head?” included three case studies experimenting with forms of “digitization as processing”: Larisa Miller (Hoover Institution, Stanford University), Jamie Roth and Erica Boudreau (John F. Kennedy Presidential Library and Museum), and Elizabeth Kelly (Loyola University, New Orleans).

  • Larisa Miller (Hoover Institution, Stanford University) reviewed the evolution of optical character recognition (OCR) and its use as a processing substitute. In comparing finding aids to these capabilities, she noted that “any access method will produce some winners and some losers.” Miller underscored the resource decisions that every archive must account for: Is this about finding aids or the best way to provide access? By eliminating archival processing, many more materials are digitized and made available to users. Ultimately, what methods maximize resources to get the most materials out to end users? In addition to functional reasons, Miller was critical of some core processing tasks: “The more arrangement we do, the more we violate original order.” (Read her related article published in The American Archivist.)
  • Jamie Roth and Erica Boudreau (John F. Kennedy Presidential Library and Museum) implemented multiple modes to test against one another: systematic digitization, digitization “on-demand” and simultaneous digitization while processing. Their talks emphasized impediments to digitization for access, such as the need to comply with legal requirements for restricted material and the lack of reliability of OCR. Roth emphasized that poor description still leads to lack of access or “access in name only.” They also cited researchers’ strong preferences for the analog original, even when given the option to use the digitized version.
  • Elizabeth Kelly (Loyola University, New Orleans) also experimented with folder-level metadata in digitizing university photographs. The scanning resulted in significant resource savings but surveyed users found the experimentally scanned collection “difficult to search and browse, but acceptable to some degree.” (Her slides are on Figshare.)

A great point from some audience members was that these item-level online displays are not usable by data researchers. Item-level organization seems to be a carryover from the analog world that, once again, serves some users and not others.

“Going Beyond the Click: A Moderated Conversation on the Future of Archival Description” included Jarrett Drake (Princeton), Anne Wootton (PopUp Archive) and Kari Smith (Massachusetts Institute of Technology), but I’ll focus on Drake’s work. Drake, Smith, and Wootton all addressed the major insufficiencies in existing descriptive and access practices in different ways. Smith will publish a blog post with more information on MIT’s Engineering the Future of the Past this Friday, May 27.

  • Jarrett Drake (Princeton) spoke from his experiences at Princeton, as well as with “A People’s Archive for Police Violence in Cleveland.” He delivered an impassioned attack on foundational principles — such as provenance, appraisal and respect des fonds — as not only technically insufficient in a landscape of corporatized ownership in the cloud, university ownership of academic work and collaborative work, but also as unethical carryovers of our colonialist and imperialistic past. With this technological shift, however, he emphasized the greater possibility for change: “First, we occupy a moment in history in which the largest percentage of the world’s population ever possesses the power and potential to author and create documentation about their lived experiences.” (Read the full text of his talk.)

While I haven’t done justice to the talks and the ensuing conversation and debate, the Radcliffe Technology Workshop helped me to expand my own thinking by framing problems to include invested practitioners and theorists outside of the digital preservation sphere. To my knowledge it is also the only event of its kind.

Presidential campaigns weigh in on education & libraries / District Dispatch

Representatives from all three major Presidential campaigns are expected to participate in this week’s CEF Presidential Forum to be held May 26 in Washington. ALA will be participating in the half-day forum and encourages members to view and participate online.

Speech bubbles that say "Q & A"

Source: www.thisisamericanrugby.com

ALA members are invited to follow the Forum online as the event will be live streamed starting at 10:00 AM and running through 12:00 PM EST. ALA has submitted library-themed questions for the Presidential representatives, but you can participate in the event by submitting your questions at SubmitQ@cef.org or tweeting your questions using #CEFpresForum.

The Committee for Education Funding (CEF) is hosting the 2016 Presidential Forum, which will emphasize education as a critical domestic policy and the need for continuing investments in education. At the forum, the high-level surrogates will discuss in depth the education policy agendas of the remaining candidates. A second panel of education experts from think tanks will discuss the educational landscape that awaits the next administration.  CEF has hosted Presidential Forums during previous elections.

Candy Crowley, award-winning journalist and former Chief Political Correspondent for CNN, will moderate both panels.

The post Presidential campaigns weigh in on education & libraries appeared first on District Dispatch.

Introducing: MyData / Open Knowledge Foundation

This post was written by the OK Finland team.

What is MyData?

MyData is both an alternative vision and guiding technical principles for how we, as individuals, can have more control over the data trails we leave behind us in our everyday actions.

The core idea is that we, you and I, should have an easy way to see where data about us goes, specify who can use it, and alter these decisions over time. To do this, we are developing a standardized, open, and mediated approach to personal data management by creating “MyData operators.”

Standardised operator model

A MyData operator account would act like an email account for your different data streams. Like email, different parties can host an operator account, with different sets of functionalities. For example, some MyData operators could also provide personal data storage solutions; others could perform data analytics or work as an identity provider. The one requirement for a MyData operator is that it lets individuals receive and send data streams according to one interoperable set of standards.

What can “MyData” do?

The “MyData” model does a few things that the current data ecosystem does not.

It will let you re-use your data with a third party – For example, you could take data collected about your purchasing habits from a loyalty card of your favourite grocery store and re-use it in a financing application to see how you are spending your money on groceries.

It will let you see and change how you consent to your data use – Currently, different service providers and applications use complicated terms of service, where most users just check ‘yes’ or ‘no’ once, without being entirely sure what they agree to.

It will let you change services – With MyData you will be able to take your data from one operator to another if you decide to change services.
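To make the operator idea concrete, here is a hypothetical sketch of the kind of consent record an operator might keep for each data stream. Every name in it is invented for illustration; nothing here comes from a MyData specification:

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List, Optional

    @dataclass
    class ConsentGrant:
        source: str                 # e.g. "grocery-loyalty-card"
        recipient: str              # e.g. "budgeting-app"
        data_categories: List[str]  # e.g. ["purchases"]
        granted_at: datetime = field(default_factory=datetime.utcnow)
        revoked_at: Optional[datetime] = None

        def revoke(self) -> None:
            """Withdraw consent; the operator stops relaying this stream."""
            self.revoked_at = datetime.utcnow()

        @property
        def active(self) -> bool:
            return self.revoked_at is None

    # A user could list, inspect, and revoke grants at any time:
    grant = ConsentGrant("grocery-loyalty-card", "budgeting-app", ["purchases"])
    grant.revoke()
    print(grant.active)   # False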


Make it happen, make it right

The MyData 2016 conference will be held Aug 31st – Sep 2nd at the Helsinki Hall of Culture.

Right now, the technical solutions for managing your data according to the MyData approach exist. There are many initiatives, emerging out of both the public and private sectors around the world, paving the way for human-centered personal data management. We believe strongly in the need to collaborate with other initiatives to develop an infrastructure in a way that works with all the complicated systems at work in the current data landscape. Buy your tickets before May 31st for the early bird discount.

Follow MyData on social media for updates:

  • Twitter: https://twitter.com/mydata2016
  • Facebook: https://www.facebook.com/mydata2016/

Last week in appropriations / District Dispatch

The Appropriations process in Congress is a year-long cycle with fits and starts, and includes plenty of lobbying, grassroots appeals, lobby days, speeches, hearings and markups, and even creative promotions designed to draw attention to the importance of one program or another. ALA members and the Office of Government Relations continue to play a significant role in this process. Recently, for example, we’ve worked to support funding for major library programs like LSTA and IAL, as well as to address policy issues that arise in Congressional deliberations. Your grassroots voice helps amplify my message in meetings with Congressional staff.

The House and Senate Appropriations Committees have begun to move their FY2017 funding bills through the subcommittee and full committee process, sending the various spending measures toward the Floor and then the President’s desk. Last week was a big week for appropriations on Capitol Hill and I was back-and-forth to various Congressional hearings, meetings, and events. Here are a few of last week’s highlights:

Clock hands pointing to the words "time for review"

Source: csp_iqoncept

Tuesday – There’s another word for that    

The full House Appropriations Committee convened (in a type of meeting called a “markup”) to discuss, amend and vote on two spending bills: those for the Department of Defense and the Legislative Branch. A recently proposed change to Library of Congress (LC) cataloging terminology, having nothing to do with funding at all, was the focus of action on the Legislative Branch bill. Earlier in April, the Subcommittee Chair Tom Graves (R-GA14) successfully included instructions to the Library in a report accompanying the bill that would prohibit the LC from implementing changes to modernize the outdated, and derogatory, terms “illegal aliens” and “aliens.”

An amendment was offered during Tuesday’s full Committee meeting by Congresswoman Debbie Wasserman Schultz (D-FL23) that would have removed this language from the report (a position strongly and actively supported by ALA and highlighted during National Library Legislative Day). The amendment generated extensive discussion, including vague references by one Republican to “outside groups” (presumably ALA) that were attempting to influence the process (influence the process? in Washington? shocking!).

The final roll call vote turned out to be a nail biter as ultimately four Committee Republicans broke with the Subcommittee chairman to support the amendment. Many in the room, myself included, thought the amendment might have passed, and an audible gasp from the audience was heard upon the announcement that it had failed by just one vote (24 – 25). The Legislative Branch spending bill now heads to the Floor and another possible attempt to pass the Wasserman Schultz amendment … or potentially an effort to keep the bill from coming up at all.

Wednesday – Can you hear me now? Good.

In Congress, sometimes the action occurs outside the Committee rooms. It’s not uncommon, therefore, for advocates and their congressional supporters to mount a public event to ratchet up the pressure on the House and Senate. ALA has been an active partner in a coalition seeking full funding for Title IV, Part A of the Every Student Succeeds Act. On Wednesday, I participated in one such creative endeavor: a rally on the lawn of the US Capitol complete with a high school choir, comments from supportive Members of Congress, and “testimonials” from individuals who have benefited from Title IV funding.

This program gives school districts the flexibility to invest in student health and safety, academic enrichment, and education technology programs. With intimate knowledge of the entire school campus, libraries are uniquely positioned to assist in determining local needs for block grants, and for identifying needs within departments, grade levels, and divisions within a school or district. Congress authorized Title IV in the ESSA at $1.65 billion for FY17; however, the President’s budget requests only about one-third of that necessary level.

The cloudy weather threatened — but happily did not deliver — rain and the event came off successfully. Did Congress hear us? Well, our permit allowed the use of amplified speakers, so I’d say definitely yes!

Thursday – A quick vote before lunch

On Thursday, just two days after House Appropriators’ nail biter of a vote over Legislative Branch Appropriations, the full Senate Appropriations Committee took up their version of that spending bill in addition to Agriculture Appropriations. For a Washington wonk, a Senate Appropriations Committee hearing is a relatively epic thing to behold. Each Senator enters the room trailed by two to four staffers carrying reams of paper. Throughout the hearing, staffers busily whisper among themselves, and into the ears of their Senators (late-breaking news that will net an extra $10 million for some pet project, perhaps?).

While a repeat of Tuesday’s House fracas wasn’t at all anticipated (ALA had worked ahead of time to blunt any effort to adopt the House’s controversial Library of Congress provision in the Senate), I did wonder whether there had been a last minute script change when the Chairman took up the Agriculture bill first and out of order based on the printed agenda for the meeting. After listening to numerous amendments addressing such important issues as Alaska salmon, horse slaughter for human consumption (yuck?), and medicine measurement, I was definitely ready for the Legislative Branch Appropriations bill to make its appearance. As I intently scanned the room for any telltale signs of soon-to-be-volcanic controversy, the Committee Chairman brought up the bill, quickly determined that no Senator had any amendment to offer, said a few congratulatory words, successfully called for a voice vote and gaveled the bill closed.

Elapsed time, about 3 minutes! I was unexpectedly free for lunch…and, for some reason, craving Alaska salmon.

Epilogue – The train keeps a rollin’

This week’s activity by the Appropriations Committees of both chambers demonstrates that the leaders of Congress’ Republican majority are deliberately moving the Appropriations process forward. Indeed, in the House and Senate they have promised to bring all twelve funding bills to the floor of both chambers on time…something not done since 1994. Sadly, however, staffers on both sides of the aisle tell me that they expect the process to stall at some point. If that happens, once again Congress will need to pass one or more “Continuing Resolutions” (or CRs) after October 1 to keep the government operating. One thing is certain; there is lots of work to be done this summer to defend library funding and policies.

The post Last week in appropriations appeared first on District Dispatch.

Judiciary Committee Senators face historic “E-Privacy” protection vote / District Dispatch

More good news could be in the offing for reform of ECPA, the Electronic Communications Privacy Act. Senate Judiciary Committee Chairman Charles Grassley (R-IA) recently (and pleasantly) surprised reform proponents by calendaring a Committee vote on the issue now likely to take place this coming Thursday morning, May 26th.  The Committee, it is hoped, will take up and pass H.R. 699, the Email Privacy Act, which was unanimously approved by the House of Representatives, as reported in District Dispatch, barely three weeks ago.  (A similar but not identical Senate bill co-authored by Judiciary Committee Ranking Member Patrick Leahy [D-VT], S. 356, also could be called up and acted upon.)

Laptop chained up

Source: cksyme.com

Either bill finally would update ECPA in the way most glaringly needed: to virtually always require the government to get a standard, judicially-approved search warrant based upon probable cause to acquire the full content of an individual’s emails, texts, tweets, cloud-based files or other electronic communications. No matter which is considered, however, there remains a significant risk that, on Thursday, the bill’s opponents will try to dramatically weaken that core reform by exempting certain agencies (like the IRS and SEC) from the new warrant requirement, and/or by providing dangerous exceptions to law enforcement and security agencies acting in overbroadly defined “emergency” circumstances.

Earlier today, ALA joined a new joint letter signed by nearly 65 of its public and private sector coalition partners calling on Senators Grassley and Leahy to take up and pass H.R. 699 as approved by the House: in other words, “without any [such] amendments that would weaken the protections afforded by the bill” ultimately approved by 419 of the 435 House Members.

Now is the time to tell the Members of the Senate Judiciary Committee that almost 30 years has been much too long to wait for real ECPA reform. Please go to ALA’s Legislative Action Center to email your Senate Judiciary Committee Senators now!

The post Judiciary Committee Senators face historic “E-Privacy” protection vote appeared first on District Dispatch.

Welcome Jeff Depa! / SearchHub

We’re happy to announce another new addition to the Lucidworks team! Please welcome Jeff Depa, our new Senior Vice President of Worldwide Field Operations, who joined in May 2016 (full press release: Lucidworks Appoints Search Veterans to Senior Team).


Jeff will lead the company’s day-to-day field operations, including its rapidly growing sales, alliances and channels, systems engineering and professional services business. Prior to Lucidworks, Jeff spent over 17 years in leadership positions across sales, consulting, and systems engineering with companies such as Oracle, Sun, and most recently DataStax.

Jeff earned a B.S. in Biomedical Engineering from Case Western Reserve University and also holds a Masters in Management. Aside from a passion for enabling clients to unleash the power of their data, Jeff is an avid pilot and enjoys spending time with his family in Austin, TX.

We sat down with Jeff to learn more about his passion for search:

What attracted you to Lucidworks?

Lucidworks is at the forefront of unleashing the value hidden in the massive amount of data companies have collected across disparate systems. They have done a phenomenal job in driving the adoption of Apache Solr, but more importantly, building a platform in Fusion that allows enterprises from high volume ecommerce shops to healthcare to easily adopt and deploy a search solution that goes beyond the industry standard, and really focuses on providing the right information at the right time with unique relevancy and machine learning technologies.

What will you be working on at Lucidworks?

I’ll be focused on building on top of a solid foundation as we continue to drive the adoption of Fusion in the market and expand our team to capture the market opportunity with our customers and partners. I’m excited to be part of this journey.

Where do you think the greatest opportunities lie for companies like Lucidworks?

In today’s economy, value is driven from creating a unique, personalized and real-time experience for customers and employees. Lucidworks sits squarely in the middle of an enterprise’s disparate and rapidly evolving data sources and enables the transformation of data to information that can be used to improve the user experience. The ability to tie that information to a high impact customer result is a huge opportunity for Lucidworks.

Welcome to the team Jeff!

The post Welcome Jeff Depa! appeared first on Lucidworks.com.

Mindful Tech, a 2 part webinar series with David Levy / LITA

Mindful Tech: Establishing a Healthier and More Effective Relationship with Our Digital Devices and Apps
Tuesdays, June 7 and 14, 2016, 1:00 – 2:30 pm Central Time
David Levy, Information School, University of Washington

Register Now for this 2 part webinar

“There is a long history of people worrying and complaining about new technologies and also putting them up on a pedestal as the answer. When the telegraph and telephone came along you had people arguing both sides—that’s not new. And you had people worrying about the explosion of books after the rise of the printing press.

What is different is for the last 100-plus years the industrialization of Western society has been devoted to a more, faster, better philosophy that has accelerated our entire economic system and squeezed out anything that is not essential.

As a society, I think we’re beginning to recognize this imbalance, and we’re in a position to ask questions like “How do we live a more balanced life in the fast world? How do we achieve adequate forms of slow practice?”

David Levy – See more at: http://tricycle.org/trikedaily/mindful-tech/

Don’t miss the opportunity to participate in this well-known program by David Levy, based on his recent, widely reviewed and well-regarded book “Mindful Tech”. The popular interactive program, including its exercises and participation, has been re-packaged into a two-part webinar format. Both parts will be fully recorded so participants can return to them or work around varying schedules.

Register Now for the 2 part Mindful Tech webinar series

This two-part webinar series (90 minutes each) will introduce participants to some of the central insights of the work Levy has been doing over the past decade and more. By learning to pay attention to their immediate experience (what’s going on in their minds and bodies) while they’re online, people are able to see more clearly what’s working well for them and what isn’t, and based on these observations to develop personal guidelines that allow them to operate more effectively and healthfully. Levy will demonstrate this work by giving participants exercises they can do, both during the online program and between the sessions.

Presenter

David Levy

David M. Levy is a professor at the Information School of the University of Washington. For more than a decade, he has been exploring, via research and teaching, how we can establish a more balanced relationship with our digital devices and apps. He has given many lectures and workshops on this topic, and in January 2016 published a book on the subject, “Mindful Tech: How to Bring Balance to Our Digital Lives” (Yale). Levy is also the author of “Scrolling Forward: Making Sense of Documents in the Digital Age” (rev. ed. 2016).

Additional information is available on his website at: http://dmlevy.ischool.uw.edu/

Then register for the webinar and get full details.

Can’t make the dates but still want to join in? Registered participants will have access to both parts of the recorded webinars.

Cost:

  • LITA Member: $68
  • Non-Member: $155
  • Group: $300

Registration Information

Register Online page arranged by session date (login required)
OR
Mail or fax form to ALA Registration
OR
Call 1-800-545-2433 and press 5
OR
email registration@ala.org

Questions or Comments?

For all other questions or comments related to the webinar series, contact LITA at (312) 280-4269 or Mark Beatty, mbeatty@ala.org.

iCampBC - Instructors Announced! / Islandora

Islandora Camp is going back to Vancouver from July 18 - 20, courtesy of our wonderful hosts at the British Columbia Electronic Library Network. Camp will (as usual) consist of three days: One day of sessions taking a big-picture view of the project and where it's headed, one day of hands-on workshops for developers and front-end administrators, and one day of community presentations and deeper dives into Islandora tools and sites. The instructors for that second day have been selected and we are pleased to introduce them:

Developers

Mark Jordan has taught at two other Islandora Camps and at the Islandora Conference. He is the developer of Islandora Context, Islandora Themekey, Islandora Datastream CRUD, and the XML Solution Pack, and is one of the co-developers of the Move to Islandora Kit. He is also an Islandora committer and is currently serving as Chair of the Islandora Foundation Board. His day job is as Head of Library Systems at Simon Fraser University.

Rosie Le Faive started with Islandora in 2012 while creating a trilingual digital library for the Commission for Environmental Cooperation. With experience and - dare she say - wisdom gained from creating highly customized sites, she's now interested in improving the core Islandora code so that everyone can use it. Her interests are in mapping relationships between objects, and intuitive UI design. She is the Digital Infrastructure and Discovery librarian at UPEI, and develops for Agile Humanities.

Admins

Melissa Anez has been working with Islandora since 2012 and has been the Community and Project Manager of the Islandora Foundation since it was founded in 2013. She has been a frequent instructor in the Admin Track and developed much of the curriculum, refining it with each new Camp.

Janice Banser is the Systems Librarian at Simon Fraser University.  She has been working with Islandora, specifically the admin interface, for over a year now. She is a member of the Islandora Documentation Interest Group and has contributed to the last two Islandora releases. She has been working with Drupal for about 6 years and has been a librarian since 2005.

MarcEdit Update / Terry Reese

Yesterday, I posted a significant update to the Windows/Linux builds and a maintenance update to the Mac build that includes a lot of prep work to get it ready to roll in a number of changes that I’ll hopefully complete this week. Unfortunately, I’ve been doing a lot of travelling, which means that my access to my Mac setup has been pretty limited and I didn’t want to take another week getting everything synched together.

So what are the specific changes:

ILS Integrations
I’ve been spending a lot of time over the past three weeks head down working on ILS integrations. Right now, I’m managing two ILS integration scenarios – one is with Alma and their API. I’m probably 80% finished with that work. All the code is written; I’m just not getting back the expected responses from their bibliographic update API. Once I sort out that issue, I’ll be integrating this change into MarcEdit and will provide a YouTube video demonstrating the functionality.

The other ILS integration that I’ve been accommodating is working with MarcEdit’s MARC SQL Explorer and the internal database structure. This work builds on some work being done with the Validate Headings tool to close the authority control loop. I’ll likely be posting more about that later this week, as I currently have a couple of libraries testing this functionality to make sure I’ve not missed anything. Once they give me the thumbs up, this will make its way into the MarcEditor as well.

But as part of this work, I needed to create a way for users to edit and search the local database structure in a more friendly way. So, leveraging the ILS platform, I’ve included the ability for users to work with the local database format directly within the MarcEditor. You can see how this works here (https://www.youtube.com/watch?v=dMJ_pUxyoFc&feature=youtu.be): Integrating the MarcEditor with a local SQL store. I’m not sure what the ideal use case is for this functionality – but over the past couple of weeks, it had been requested by a couple of power users currently using the MARC SQL Explorer for some data edits, but hoping for an easier to use interface. This work will be integrated into the Mac MarcEdit version at the end of this week. All the prep work (window/control development) has been completed. At this point, it’s just migrating the code so that it works within the Mac’s Objective-C codebase.

Edit Shortcuts
I created two new edit shortcuts in the MarcEditor. The first, Find Records With Duplicate Tags, was created to help users look for records that have multiple instances of a tag or a tag/subfield combination within a set of records. This is work that can be done in the Extract Selected Records tool, but it requires a bit of trickery and knowledge of how MarcEdit formats data.


How does this work – say you wanted to know which records had multiple call number (050) fields. You would select this option, enter 050 in the prompt, and then the tool would create for you a jump list showing all the records that met your criteria.
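Under the hood the check is simple. Here is a rough sketch of the same logic in Python against MarcEdit's mnemonic text format (records separated by blank lines, fields beginning "=TAG"); this is my approximation of the feature, not MarcEdit's actual code:

    def records_with_duplicate_tag(mrk_text: str, tag: str):
        """Yield the positions of records containing the tag more than once."""
        for i, record in enumerate(mrk_text.strip().split("\n\n")):
            count = sum(1 for line in record.splitlines()
                        if line.startswith("=" + tag))
            if count > 1:
                yield i   # record position, akin to MarcEdit's jump list

    sample = ("=LDR  ...\n=050  00$aQA76\n=050  00$aZ665\n"
              "\n"
              "=LDR  ...\n=050  00$aPS3545")
    print(list(records_with_duplicate_tag(sample, "050")))   # [0]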

Convert To Decimal Degrees
The second Edit Shortcut function is the first Math function (I’ll be adding two more, specifically around finding records with dates greater than or less than a specific value), targeting the conversion of Degrees/Minutes/Seconds to decimal degrees. The process has been created to be MARC agnostic, so users can specify the field and subfields to process. To run this function, select it from the Edit Shortcuts menu as demonstrated in the screenshot below:

[screenshot: Edit Shortcuts menu]

When selected, you will get the following prompt:

[screenshot: field/subfield definition prompt]

This documents the format for defining the field/subfields to be processed. Please note, it is important to define all four potential values for conversion – even if they are not used within the record set.

Using this function, you can now convert a value like:
=034  1\$aa$b1450000$dW1250000$eW1163500$fN0461500$gN0420000
To:
=034  1\$aa$b1450000$d+125.0000$e+116.5800$f+046.2500$g+042.0000

This function should allow users to transition their cartographic data to a format that is much more friendly to geographic interpretation if desired.
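For reference, the underlying math is straightforward. This sketch is my own approximation, not MarcEdit's code, and it follows the common convention that west and south are negative, so its sign handling may differ from the 034 output shown above:

    def dms_to_decimal(value: str) -> float:
        """Convert a MARC-style coordinate like 'W1250000' (hemisphere + DDDMMSS)."""
        hemisphere, digits = value[0], value[1:]
        degrees = int(digits[0:3])
        minutes = int(digits[3:5])
        seconds = int(digits[5:7])
        decimal = degrees + minutes / 60 + seconds / 3600
        return -decimal if hemisphere in ("W", "S") else decimal

    print(dms_to_decimal("W1250000"))  # -125.0
    print(dms_to_decimal("N0461500"))  # 46.25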

Bug Fixes:
This update also addressed a bug in the Build New Field parser. If you have multiple arguments side-by-side within the same field grouping (i.e., {100$a}{100$b}{100$c}), the parser could become confused. This has been corrected.

Updates:
Included an update to the linked data rules file, updating the 7xx fields to include the $t in the processing. Also updated the UNIMARC translation to include a 1:1 translation for 9xx data.

Over the next week, I hope to complete the Alma integration, but I will focus the development work in my free time on getting the Mac version synched with these changes.

–tr

Find out What’s Inside Hydra-in-a-Box at Open Repositories 2016: PCDM, Design, Emerging Architecture, Repository Tooling / DuraSpace News

Austin, TX  It’s only three weeks away! If you will be attending the 11th Annual International Conference on Open Repositories (#OR2016), here are the sessions that will be of interest if you want to learn more about the Hydra-in-a-Box project:

Workshop: Modeling your Repository Objects with the Portland Common Data Model (PCDM)

Monday, June 13, 1:30-3:30 PM; 4:00-6:00 PM

97% of Research Library Searches Leak Privacy... and Other Disappointing Statistics. / Eric Hellman


...But first, some good news. Among the 123 members of the Association of Research Libraries, there are four libraries with almost secure search services that don't send clickstream data to Amazon, Google, or any advertising network. Let's now sing the praises of libraries at Southern Illinois University, University of Louisville, University of Maryland, and University of New Mexico for their commendable attention to the privacy of their users. And it's no fault of their own that they're not fully secure. SIU fails to earn a green lock badge because of mixed content issues in the CARLI service; while Louisville, Maryland and New Mexico miss out on green locks because of the weak cipher suite used by OCLC on their Worldcat Local installations. These are relatively minor issues that are likely to get addressed without much drama.

Over the weekend, I decided to try to quantify the extent of privacy leakage in public-facing library services by studying the search services of the 123 ARL libraries. These are the best funded and most prestigious libraries in North America, and we should expect them to positively represent libraries. I went to each library's on-line search facility and did a search for a book whose title might suggest to an advertiser that I might be pregnant. (I'm not!) I checked to see whether the default search linked to by the library's home page (as listed on the ARL website) was delivered over a secure connection (HTTPS). I checked for privacy leakage of referer headers from cover images by using Chrome developer tools (the sources tab). I used Ghostery to see if the library's online search used Google Analytics or not. I also noted whether advertising network "web beacons" were placed by the search session.
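The checks themselves are easy to approximate in code. Here is a crude sketch of an automated version; the survey itself was done by hand with browser tools and Ghostery, the domain list below is illustrative rather than the exact one used, and the URL is invented:

    import requests

    TRACKERS = {
        "google-analytics.com": "Google Analytics",
        "doubleclick.net": "DoubleClick beacon",
        "facebook.net": "Facebook beacon",
        "addthis.com": "AddThis beacon",
        "images.amazon.com": "Amazon-hosted cover images",
    }

    def check_search_page(url: str) -> None:
        resp = requests.get(url, timeout=10)
        # resp.url is the final URL after redirects.
        if not resp.url.startswith("https://"):
            print("not served over HTTPS: query visible to the network")
        for domain, label in TRACKERS.items():
            if domain in resp.text:
                print(f"possible privacy leak: {label}")

    check_search_page("http://catalog.example.edu/search?q=what+to+expect")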

72% of the ARL libraries let Google look over the shoulder of every click by every user, by virtue of the pervasive use of Google Analytics. Given the commitment to reader privacy embodied by the American Library Association's code of ethics, I'm surprised this is not more controversial. ALA even sponsors workshops on "Getting Started with Google Analytics". To paraphrase privacy advocate and educator Dorothea Salo, the code of ethics does not say:
We protect each library user's right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted, except for Google Analytics.
While it's true that Google has a huge stake in maintaining the trust of users in their handling of personal information, and people seem to trust Google with their most intimate secrets, it's also true that Google's privacy policy puts almost no constraints on what Google (itself) can do with the information they collect. They offer strong commitments not to share personally identifiable information with other entities, but they are free to keep and use personally identifiable information. Google can associate Analytics-tracked library searches with personally identifiable information for any user that has a Google account; libraries cannot be under the illusion that they are uninvolved with this data collection if they benefit from Google Analytics. (Full disclosure: many of the web sites I administer also use Google Analytics.)

80% of the ARL libraries provide their default discovery tools to users without the benefit of a secure connection. This means that any network provider in the path between the library and the user can read and alter the query, and the results returned to the user. It also means that when a user accesses the library over public wifi, such as in a coffee shop, the user's clicks are available for everyone else in the coffee shop to look at, and potentially to tamper with. (The Digital Library Privacy Pledge is not having the effect we had hoped for, at least not yet.)

28% of ARL libraries enrich their catalog displays with cover images sourced from Amazon.com. Because of privacy leakage in referer headers, a user's searches for library books are available for use by Amazon when Amazon wants to sell that user something. It's not clear whether libraries realize this is happening, or whether they simply don't know that their catalog enrichment service sources cover images from Amazon.
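To make the leak concrete, here is roughly what it looks like on the wire; the hostname, path, and query below are invented for illustration. When an insecure catalog results page embeds a cover image hosted by Amazon, the browser's request for that image carries the full catalog URL, query string and all, in the Referer header:

    GET /images/P/0123456789.01.MZZZZZZZ.jpg HTTP/1.1
    Host: images.amazon.com
    Referer: http://catalog.example.edu/search?q=what+to+expect+when+you%27re+expecting

This request alone doesn't name the user, but if it also carries Amazon's own cookies, the search can be tied to an identifiable Amazon account.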

13% of ARL libraries help advertisers (other than Google) target their ads by allowing web beacons to be placed on their catalog web pages. Whether the beacons are from Facebook, DoubleClick, AddThis, or ShareThis, advertisers track individual users, often in a personally identifiable way. Searches on these library catalogs are available to the ad networks to maximize the value of advertising placed throughout their networks.

Much of the privacy leakage I found in my survey occurs beyond the control of librarians. There are IT departments, vendor-provided services, and incumbent bureaucracies involved. Important library services appear to be unavailable in secure versions. But specific, serious privacy leakage problems that I've discussed with product managers and CTOs of library automation vendors have gone unfixed for more than a year. I'm getting tired of it.

The results of my quick survey for each of the 123 ARL libraries are available as a Google Sheet. There are bound to be a few errors, and I'd love to be able to make changes as privacy leaks get plugged and websites become secure, so feel free to leave a comment.

LITA announces the Top Tech Trends panel at ALA Annual 2016 / LITA

Kicking off the celebration of LITA's 50th year, the Top Technology Trends Committee announces the panel for the highly popular session at the 2016 ALA Annual in Orlando, FL.

Top Tech Trends
starts Sunday June 26, 2016, 1:00 pm – 2:30 pm, in the
Orange County Convention Center, Room W109B
and kicks off Sunday Afternoon with LITA.

This program features the ongoing roundtable discussion about trends and advances in library technology by a panel of LITA technology experts. The panelists will describe changes and advances in technology that they see having an impact on the library world, and suggest what libraries might do to take advantage of these trends. This year's panelist lineup is:

  • Maurice Coleman, Session Moderator, Technical Trainer, Harford County Public Library, @baldgeekinmd
  • Blake Carver, Systems Administrator, LYRASIS, @blakesterz
  • Lauren Comito, Job and Business Academy Manager, Queens Library, @librariancraftr
  • Laura Costello, Head of Research & Emerging Technologies, Stony Brook University, @lacreads
  • Carolyn Coulter, Director, PrairieCat Library Consortium, Reaching Across Illinois Library System (RAILS), @ccoulter
  • Nick Grove, Digital Services Librarian, Meridian Library District – unBound, @nickgrove15

Check out the Top Tech Trends web site for more information and panelist biographies.

Safiya Noble

Followed by the LITA Awards Presentation & LITA President’s Program with Dr. Safiya Noble
presenting: Toward an Ethic of Social Justice in Information
at 3:00 pm – 4:00 pm, in the same location

Dr. Noble is an Assistant Professor in the Department of Information Studies in the Graduate School of Education and Information Studies at UCLA. She conducts research in socio-cultural informatics, including feminist, historical and political-economic perspectives on computing platforms and software in the public interest. Her research is at the intersection of culture and technology in the design and use of applications on the Internet.

Concluding with the LITA Happy Hour
from 5:30 pm – 8:00 pm
location to be determined

This year marks a special LITA Happy Hour as we kick off the celebration of LITA’s 50th anniversary. Make sure you join the LITA Membership Development Committee and LITA members from around the country for networking, good cheer, and great fun! Expect lively conversation and excellent drinks; cash bar. Help us cheer for 50 years of library technology.

 

Open Knowledge International – our new name! / Open Knowledge Foundation

Notice something a little different? We have had a change of name!

As of today, we officially move from being called “Open Knowledge” to “Open Knowledge International (OKI)”.


“Open Knowledge International” is the name by which the community groups have referred to us for a couple of years, conveying our role in supporting the groups around the world, as well as our role within the broader open knowledge movement globally. We are excited to announce our new name that reflects this.

Open Knowledge International is registered in the UK, and this has sometimes led to assumptions that we operate in and for the benefit of this region. However, the UK is no more of a priority to Open Knowledge International than other areas of the world; in fact, we want to look more closely at ways we can be engaged at a global level, where efforts to push open knowledge are already happening and where we can make a difference by joining alongside the people making it happen. This is evident in our efforts to support the associated Open Knowledge Network, with a presence in more than 40 countries and cross-border Working Groups, as well as our support of international projects, such as the Global Open Data Index, that both blend open knowledge expertise and draw upon the global open data community. Finally, we are an international team, with staff based in nearly every region, collaborating virtually to promote openness online and on the ground.

By formalising Open Knowledge International as our name, both for the community groups associated with us and for the broader open knowledge movement, we are reflecting the direction we are striving to take, now and increasingly so in the future. We are grateful to have such a strong community behind us as we undertake a name change that better reflects our priorities and as we continue to seek new opportunities on a global scale.

We are also planning to transition from the domain name okfn.org for brand consistency and will begin that transition in the coming months. If you would like to discuss this change of name, and what it means, please join in on our forum – discuss.okfn.org.

For revised logos please see okfn.org/press/logos and please contact press@okfn.org if you have any questions about the use of this brand.

Improving e-Journal Ingest (among other things) / David Rosenthal

Herbert Van de Sompel, Michael Nelson and I have a new paper entitled Web Infrastructure to Support e-Journal Preservation (and More) that:
  • describes the ways archives ingest e-journal articles,
  • shows the areas in which these processes use heuristics, which makes them fallible and expensive to maintain,
  • and shows how the use of DOIs, ResourceSync, and Herbert and Michael's "Signposting" proposal could greatly improve these and other processes that need to access e-journal content.
It concludes with a set of recommendations for CrossRef and the e-journal publishers that would be easy to adopt and would not merely improve these processes but also help remedy the deficiencies in the way DOIs are used in practice that were identified in Martin Klein et al.'s paper in PLoS One entitled Scholarly Context Not Found: One in Five Articles Suffers from Reference Rot, and in Persistent URIs Must Be Used To Be Persistent, presented by Herbert and co-authors to the 25th International World Wide Web Conference.


Getting your color on: maybe there’s some truth to the trend / LITA

Coloring was never my thing; even as a young child, the amount of decision-making required in coloring was actually stressful to me. Hence my skepticism of this zen adult coloring trend: how could something so stressful for me be considered a thing of "zen"? I purchased a book and selected coloring tools about a year ago, coloring bits and pieces here and there but not really getting it. Until now.

While reading an article about the psychology behind adult coloring, I found this quote to be exceptionally interesting:

The action involves both logic, by which we color forms, and creativity, when mixing and matching colors. This incorporates the areas of the cerebral cortex involved in vision and fine motor skills [coordination necessary to make small, precise movements]. The relaxation that it provides lowers the activity of the amygdala, a basic part of our brain involved in controlling emotion that is affected by stress. -Gloria Martinez Ayala [quoted in Coloring Isn’t Just For Kids. It Can Actually Help Adults Combat Stress]

A page, colored by Whitni Watkins, from Color Me Stress Free by Lacy Mucklow and Angela Porter

As I was coloring this particular piece [pictured to the left], I started seeing the connection the micro process of coloring has to the macro process of managing a library and/or team building. Each coloring piece has individual parts that contribute to forming the outline of the full work of art. But it goes deeper than that.

For example, how you color and organize the individual parts can determine how beautiful or harmonious the picture will be. You have many different color options to choose from and incorporate into your picture, and some will work better than others. Did you know that in color theory, orange and blue is a perfect color combination ("harmonious color combinations use any two colors opposite each other on the color wheel" [7]), but that the combination of orange, blue, and yellow is not very harmonious?
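The "opposite on the color wheel" rule is easy to make concrete. A complement is just a half-turn around the 360-degree hue wheel; this little sketch is my illustration, and the hue values are approximate:

    def complement_hue(hue_degrees):
        """Return the hue directly opposite on a 360-degree color wheel."""
        return (hue_degrees + 180) % 360

    print(complement_hue(30))  # orange sits near 30 degrees -> 210.0, a blue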

Our lack of knowledge is a significant hindrance to creating greatness; knowing your options while coloring is incredibly important. Your color selection will determine what experience one has when viewing the picture: bland, chaotic, or pleasing, each part working together, contributing to the bigger picture. "Observing the effects colors have on each other is the starting point for understanding the relativity of color. The relationship of values, saturations and the warmth or coolness of respective hues can cause noticeable differences in our perception of color." [6] Color combinations that may seem unfitting to you may actually complement each other.

Note that some colors will be used more frequently and have a greater presence in the final product due to the qualities those colors hold, but remember that even the parts that have only a small presence are crucial to bringing the picture together in the end.

“Be sure to include those who are usually left out of such acknowledgments, such as the receptionist who handled the flood of calls after a successful public relations effort or the information- technology people who installed the complex software you used.”[2]

There may be other times when you don't use a certain color as much as it could and should have been used. The picture ends up fully colored and completed, but not nearly as beautiful (harmonious) as it could have been. When in the coloring process, ask yourself often, "What else do we need to consider here?" By doing so, "you allow perspectives not yet considered to be put on the table and evaluated." [2] Constant evaluation of your process will lead to a better final piece.

While coloring I also noticed that I color individual portions in a similar manner. I color triangles and squares by outlining and shading inwards; I color circular shapes in a circular motion, shading outwards. While coloring, we find the way that is most efficient for us but contained (within the lines), while simultaneously coordinating well with the other parts. It is important to note that the way you found to be efficient in one area may not work in another; you need to adapt, be flexible, and be willing to try other ways. Imagine coloring a circle the way you color a square or a triangle. You can take as many shortcuts as you want to get the job done faster, but you may regret them in the end. Cut carefully.

Remember while coloring: Be flexible. Be adaptable. Be imperturbable.

You can color however you see fit. You can choose which colors you want; the project will get done. You can be sure there will be moments of chaos, and there will be moments that lack innovation. Experiment, try new things, and the more you color the better you'll get. However, coloring isn't for everyone, and that's okay.

Now, go back and read again, this time substituting the word manage for color.

Maybe there is something to be said about this trend of the adult coloring book. 


References:
1. Coloring Isn’t Just For Kids. It Can Actually Help Adults Combat Stress http://www.huffingtonpost.com/2014/10/13/coloring-for-stress_n_5975832.html
2. Twelve Ways to Build an Effective Team http://people.rice.edu/uploadedFiles/People/TEAMS/Twelve%20Ways%20to%20Build%20an%20Effective%20Team.pdf
3. COLOURlovers: History Of The Color Wheel http://www.colourlovers.com/blog/2008/05/08/history-of-the-color-wheel
4. Smashing Magazine: Color Theory for Designers, Part 1: The Meaning of Color: https://www.smashingmagazine.com/2010/01/color-theory-for-designers-part-1-the-meaning-of-color/
5. Some Color History http://hyperphysics.phy-astr.gsu.edu/hbase/vision/colhist.html
6. Color Matters: Basic Color Theory http://www.colormatters.com/color-and-design/basic-color-theory
7. lifehacker: Learn the Basics of Color Theory to Know What Looks Good http://lifehacker.com/learn-the-basics-of-color-theory-to-know-what-looks-goo-1608972072
8. lifehacker: Color Psychology Chart http://lifehacker.com/5991303/pick-the-right-color-for-design-or-decorating-with-this-color-psychology-chart
9. Why Flexible and Adaptive Leadership is Essential http://challenge2050.ifas.ufl.edu/wp-content/uploads/2013/10/YuklMashud.2010.AdaptiveLeadership.pdf

Sktchy portrait / Patrick Hochstenbach

Filed under: portraits, Sketchbook Tagged: fountainpen, illustration, ink, Photoshop, portrait, sktchy

ALA briefs congress on critical impact of rural broadband access / District Dispatch

Launched in February of this year, the bipartisan Congressional Rural Broadband Caucus was founded "to facilitate discussion, educate Members of Congress and develop policy solutions to close the digital divide in rural America." At its most recent meeting, Marijke Visser of the ALA's Office for Information Technology Policy (OITP) and co-panelists from the public and private sectors briefed the Caucus, congressional staff and a general audience at a public session entitled "Strengthening Rural Economics through Broadband Deployment."

Her presentation highlighted that libraries currently play a pivotal role in providing broadband access in rural communities, addressing the "E's of Libraries®" across the country: employment and entrepreneurship, education, individual empowerment, and civic engagement. Noting broadly that "Libraries strengthen local economies through supporting small business development and entrepreneurship," Marijke went on to provide specific examples of how libraries have helped small businesses develop business plans, conduct market research, foster employee certification, use 3D printers, and even use library software programs to design and print creative menus for a restaurant.

Broadband tower seen over a field

Source: Consumer Affairs

She also spotlighted the growing importance of video conferencing availability to rural residents and communities, telling of how a new mother in Alaska received needed healthcare training via video conference at her local public library, thus avoiding a lengthy trip to Seattle, and how a business in Texas was able to secure a contract by using video conferencing through a local public library to expedite OSHA certification of 40 workers.

Marijke also emphasized that today’s libraries clearly are much more than book-lending facilities and places for children’s story time; they are one-stop community hubs, replete with maker spaces, digital production studios, video-conferencing capacity and more. In response to questions from Congressional staff, Marijke also highlighted various services libraries provide to veterans, including resume building, job application assistance, benefit application filing, and financial literacy training.

Marijke and her fellow panelists were welcomed to the Caucus' meeting by Caucus Co-Chairs Reps. Kevin Cramer (R-ND) and Mark Pocan (D-WI2), and Rep. Dave Loebsack (D-IA2). Membership in the Caucus currently stands at 34 Representatives. Its mission, as explained upon its launch by Rep. Bob Latta (R-OH5), is to "bring greater attention to the need for high-speed broadband in rural America, and help encourage and spur innovative solutions to address this growing consumer demand."

ALA thanks the Caucus for the opportunity to participate in its event, and both the Office of Government Relations and OITP look forward to continuing to work with its members to boost broadband capacity in libraries and homes across rural America.

The post ALA briefs congress on critical impact of rural broadband access appeared first on District Dispatch.

UNESCO PERSIST: A Global Exchange on Digital Preservation / Library of Congress: The Signal

This is a guest post by Robert R. Buckley, Technical Adviser at the National Archives of the UAE in Abu Dhabi and the Coordinator for the PERSIST Policy Working Group.

Photo of meeting.

UNESCO PERSIST meeting in Abu Dhabi. Photo courtesy of National Archives of the UAE.

Readers of this blog may have first seen mention of the UNESCO PERSIST project in The Signal last January, in a guest post on intellectual property rights related to software emulation. Dealing with IP rights is one of the known challenges of digital preservation. Dealing with the volume of digital content being generated is another, requiring decisions on what content to select and preserve for the benefit of society. These and other digital preservation activities typically depend on policies that influence decision-making and planning processes with a view to enabling sustainability. All these issues fall within the scope of the PERSIST project and were addressed at its recent meeting, held March 14-16 in Abu Dhabi.

The meeting was hosted by Dr. Abdulla El Reyes, Director General of the National Archives of the UAE and Chair of the UNESCO Memory of the World Program. PERSIST is part of the Memory of the World Program and a partnership between UNESCO, the International Council on Archives, and the International Federation of Library Associations and Institutions. (If it were an acronym, PERSIST would stand for Platform to Enhance and Reinforce the Sustainability of the Information Society Trans-globally.) It is a response to the UNESCO/UBC Vancouver Declaration, adopted at the 2012 Memory of the World in the Digital Age: Digitization and Preservation conference in Vancouver, where participants agreed on the pressing need to establish a road map proposing solutions, agreements and policies for implementation by all stakeholders, in particular governments and industry.

The focus of the PERSIST project is on providing these stakeholders, as well as heritage institutions, with resources to address the challenges of long-term digital preservation and the risks of losing access to part of our digital heritage through technology obsolescence. Fostering a high-level dialogue and joint action on digital preservation issues among all relevant stakeholders is a core objective of PERSIST. For example, during the UNESCO General Conference last November in Paris, PERSIST hosted an event that included Microsoft, Google and the ACM. This is the kind of thing UNESCO is well positioned to do and where it can add value on a global scale in the very active and fertile field of digital preservation.

The Abu Dhabi meeting was attended by over 30 experts, representing heritage institutions, universities and governmental, non-governmental and commercial organizations from a dozen countries spread across five continents. The meeting had an ambitious agenda that included formulating an operating plan for 2016-2017. The major outcomes of the meeting were organized around the work of the three task forces into which PERSIST was divided: Content, Technology and Policy.

Photo session at a meeting

Official launch of the UNESCO/PERSIST Selection Guidelines. Photo courtesy of National Archives of the UAE

First was the launch of the UNESCO/PERSIST Guidelines for the selection of digital heritage for long-term preservation, drafted by the Content Task Force. The selection process, in the form of a decision tree, takes a risk-assessment approach to evaluating significance, assessing sustainability and considering availability in dealing with the overwhelming volume of digital information now being created and shared. Written by a team of seven experts from the library, archives, and museum community, the Guidelines aim to provide an overarching starting point for heritage institutions when drafting their own policies on the selection of digital heritage for long-term sustainable digital preservation.

Second was the progress by the Technology Task Force on defining the PERSIST technology strategy and finding an organizational home that would maintain, manage and make available the legacy software platform for future access to digital heritage at risk due to software obsolescence. (PERSIST is in contact with the Software Preservation Network and will be presenting at the SPN Forum in August.)

The diagram illustrates the role of the UNESCO PERSIST project in the digital preservation ecosystem, including access to legacy software licenses. The organizational home, which we have been calling the UNESCO PERSIST Organization or UPO, would complement the work of the UNESCO PERSIST project. It would be a non-profit that would be able to enter into legal agreements with software vendors—a significant capability. Conversations are underway with a candidate organization about hosting the UPO.

Role of UNESCO PERSIST in the Digital Preservation Ecosystem. Diagram by Natasa Milic-Frayling.

Third was the formal creation of the Policy Task Force. In one way or another, its initial outputs are all related to the Recommendation concerning the preservation of, and access to, documentary heritage including in digital form, which was approved at the UNESCO General Conference in November 2015 and which requires action by UNESCO Member States. Besides contributing directly to the guidelines for implementing the digital part of the Recommendation, the task force also plans to take a community-based approach to developing supporting tools such as a Model National Digital Preservation Strategy and a Starter's Guide for policymakers. Already, the Selection Guidelines provide a tool for the identification of documentary heritage, as called for by the Recommendation. The Policy team will also work with the other task forces on strategic policy questions.

From here, there is still much to be done in disseminating the selection guidelines that would make the challenges of digital preservation more manageable, in developing and putting on a firm foundation the software technology platform that would enable access to legacy documents, and in establishing policy guidelines that would provide institutional and national frameworks where they are most needed for the preservation of digital documentary heritage.

You can hear more about PERSIST at the IFLA WLIC 2016 and the SPN Forum in August, the ICA Congress in September and iPRES 2016 in October. You can also read about PERSIST online, watch an introductory video and follow it on Twitter at #unescopersist.

Senate committee approves legislative branch funding without fireworks / District Dispatch

In stark contrast to Tuesday's full House Appropriations Committee markup – which, as previously reported, featured almost 30 minutes of hot debate over legislative report language intended to bar the Library of Congress (LC) from retiring the subject headings "Aliens" and "Illegal aliens" – the Senate Appropriations Committee took scarcely 3 minutes on Thursday to call up, "debate" and pass its version of the "Leg Approps" bill devoid of the House's controversial provision. As in the earlier debate, the presidents of ALA and ALCTS wrote to key Members of the Senate Committee prior to the vote asking them not to incorporate language like that adopted by the House.

Japanese macaque yawning

Photo source: Daisuke Tashiro

Both bills are now in line to be considered on the floors of their respective chambers, but no timetable has yet been set . . . and that could take some time. Even if both bodies pass their bills, House and Senate negotiators will then need to reconcile differences between them, hot-button LC subject heading report text included. Congressional insiders forecast that any such negotiations, if needed, won't begin until after November's elections. ALA and ALCTS will continue to educate all Members of Congress in the intervening months, however, about the House's folly in countermanding the Library of Congress' solidly reasoned, professional cataloging judgment.

The post Senate committee approves legislative branch funding without fireworks appeared first on District Dispatch.

Really slow rspec suite? Use the fuubar formatter! / Jonathan Rochkind

I am working on a ‘legacy’-ish app that unfortunately has a pretty slow test suite (10 minutes+).

I am working on some major upgrades to some dependencies, that require running the full test suite or a major portion of it iteratively lots of times. I’m starting with a bunch of broken tests, and whittling them down.

It was painful. I was getting really frustrated with the built-in rspec formatters — I'd see an 'F' in the output but wouldn't know which test had failed until the whole suite finished. Or I could control-C, or run with --fail-fast, to see the first failure (or some subset of failures) as it happened, but that interrupts the suite, so I'd never see the other, later failures.

Then I found the fuubar rspec formatter.  Perfect!

  • A progress bar makes the suite seem faster psychologically, even though it isn't. There's a reason a progress bar is considered good UI for a long-running task!
  • It outputs failed specs as they happen, but keeps running the whole suite. For a long-running suite, this lets me start investigating a failure as it happens without having to wait for the suite to finish, while still letting the suite run to completion so I can see the total picture of how I'm doing and what other sorts of failures I'm getting.

I recommend fuubar; it's especially helpful for slow suites. I had been wanting something like this for a couple of months, and wondering why it wasn't a built-in formatter in rspec — I just ran across it now in a reddit thread (started by someone else considering writing such a formatter who didn't know fuubar already existed!). So I'm writing this blog post to hopefully increase its exposure!
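If you want to try it, setup is minimal. The gem name and the formatter flag below come from fuubar's own documentation; putting the gem in the test group is just my habit:

    # Gemfile
    group :test do
      gem 'fuubar'
    end

    # .rspec -- make fuubar the project's default formatter
    --format Fuubar
    --color

For a one-off run without touching project defaults, rspec --format Fuubar works too.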


Filed under: General

Hydra Virtual Connect 2016 / FOSS4Lib Upcoming Events

Date: Thursday, July 7, 2016, 11:00 to 14:00

Last updated May 20, 2016. Created by Peter Murray on May 20, 2016.

Hydra Virtual Connect is a new opportunity for the Hydra community to 'connect' online in between face-to-face meetings, complementing the annual fall Hydra Connect conference and regional Hydra events. Presentations will include some of the Hydra talks given at Open Repositories 2016 for those who were unable to attend, plus reports from partners, community members, and interest groups.

For more information, see the Hydra Virtual Connect wiki page at https://wiki.duraspace.org/x/i4aUB

Happy to Announce that I’m Joining Index Data / Peter Murray

Index Data posted an announcement on their blog about how I will be joining them next month. Confirmed! I'll be working on the open source library service platform that was announced by EBSCO last month, and more specifically in a role as an organizer and advocate for people participating in the project. It feels like my career has been building to this role. And it also means getting re-engaged in the OLE project; I was part of the design effort in 2008-2009 and then drifted away as professional responsibilities took me in other directions. In the executive overview of the OLE design report, we said:

…the project planners produced an OLE design framework that embeds libraries directly in the key processes of scholarship generation, knowledge management, teaching and learning by utilizing existing enterprise systems where appropriate and by delivering new services built on connections between the library’s business systems and other technology systems.

That vision is as important to libraries today as it was then — even as the state of technology has advanced to make this vision harder (in some ways) and easier (in others) to achieve.

I will miss my colleagues and work at Cherry Hill. Cary, Jungleen, Justin and everyone else are great to work with, which is really important with a virtual organization. The client work has also been interesting and challenging with the opportunity to learn more about Drupal and relearn how to do operations in a cloud computing environment.

This job change also means that I'll be moving away from the Islandora and CollectionSpace open source communities. I've learned a lot from these groups that I'll be taking forward into my next adventure. (Anyone interested in creating a virtual gathering point for library open source project community managers?) I have a soft spot in my heart for these projects, and I'll be watching them as they grow.

OpenVIVO at OR2016 / DuraSpace News

Austin, TX – Join Michael Conlon, VIVO project director, and Graham Triggs, VIVO technical lead from DuraSpace, for an OpenVIVO poster presentation at Open Repositories 2016 next month in Dublin. OpenVIVO is a hosted VIVO for representing scholarly work that anyone with an ORCID iD can use.

Evergreen - 2.9.5 / FOSS4Lib Recent Releases

Package: Evergreen
Release Date: Wednesday, May 18, 2016

Last updated May 18, 2016. Created by gmcharlt on May 18, 2016.

Evergreen 2.9.5 fixes the following issues:

  • Email templates now properly encode header fields.
  • Fix an incorrect link field in the IDL.
  • Fix labels for Author and Title fields in Items Out of the OPAC when viewed on a small screen.

Jobs in Information Technology: May 18, 2016 / LITA

New vacancy listings are posted weekly on Wednesday at approximately 12 noon Central Time. They appear under New This Week and under the appropriate regional listing. Postings remain on the LITA Job Site for a minimum of four weeks.

New This Week

University of Rhode Island, Associate Professor, Librarian (Data Services), Kingston, RI

Reaching Across Illinois Library System, Cataloging and Database Supervisor-PrairieCat, Coal Valley, IL

e-Management, Senior Librarian, Silver Spring, MD

Visit the LITA Job Site for more available jobs and for information on submitting a job posting.

Evergreen 2.9.5 Released / Evergreen ILS

We are pleased to announce the release of Evergreen 2.9.5, a bug fix release.

Evergreen 2.9.5 fixes the following issues:

  • Email templates now properly encode header fields.
  • Fix an incorrect link field in the IDL.
  • Fix labels for Author and Title fields in Items Out of the OPAC when viewed on a small screen.

Please visit the downloads page to retrieve the server software and staff clients.

Jeffrey MacKie-Mason on Gold Open Access / David Rosenthal

I've written before about the interesting analysis behind the Max Planck Society's initiative to "flip" the academic publishing system from one based on subscriptions to one based on "gold" open access (article processing charges or APCs). They are asking institutions to sign an "Expression of Interest in the Large-scale Implementation of Open Access to Scholarly Journals". They now have 49 signatures, primarily from European institutions.

The US library community appears generally skeptical or opposed, except for the economist and Librarian of UC Berkeley, Jeffrey MacKie-Mason. In response to what he describes as the Association of Research Libraries':
one-sided briefing paper in advance of a discussion during the spring ARL business meeting on 27 January. (I say “one-sided” because support of gold OA was presented, tepidly, in just nine words — “the overall aim of this initiative is highly laudable” — followed by nearly a page of single spaced “concerns and criticisms”.)
he posted Economic Thoughts About Gold Open Access, a detailed and well-argued defense of the initiative. It is well worth reading. Below the fold, some commentary.

He summarizes the US research library community's concerns thus:
  • Will gold OA further strengthen the monopoly scholarly publishing firms?
  • Will there be a change in the current market model?
  • Will research-production-intensive institutions be made worse off?
  • Will gold OA hurt under-resourced institutions?
  • Will flipping to gold OA take too long and cost too much?
He starts by pointing out that although there isn't a monopoly in scholarly publishing there is very significant rent extraction:
articles that people want to read have scarcity value — you can only get them from one publisher, generally (this is why publishers care so much about obtaining and protecting copyrights). Whoever has copyright on an article that readers want to read can charge a scarcity rent (price > cost).

The market for publishing has evolved so that a small number of organizations control copyright on the most valuable articles (e.g., Elsevier, Springer, Wiley, Taylor & Francis, the American Chemical Society). They are able to charge prices well above incremental and average cost, so they are earning above-competitive profit margins. In recent years the profit margins of the largest for-profit scholarly publishers have been around 35% or higher; a competitive, risk-adjusted profit margin is probably closer to about 10%. ... So, on the order of 25% of what we're paying is not for the cost of publishing value added, but for excess (above-competitive, or monopolistic) profit.
This rent amounts to several billion dollars per year, which the incumbent publishers will fight tooth and nail to preserve. They have powerful weapons with which to do so, among which are co-opting librarians with slots on "advisory boards", and co-opting faculty with editorial board positions (apparently the care and feeding of the editorial board is the largest single cost at some journals).

MacKie-Mason's analysis of the sources of the publishers' ability to extract rent is acute:
  1. They have journals that have a reputation for prestige, and so authors want to submit their articles to be published in those journals, rather than in journals published by less monopolistic organizations.
  2. They have many such journals, which they can sell in “big deal” bundles that make it very difficult for purchasers (mostly libraries) to put competitive pressure on the publishers by dropping subscriptions to their weaker journals (that is, those for which there are reasonably good substitutes).
  3. (This is a big one, and will come back below.) The decision to commit resources to purchase a journal is for the most part made by someone different (usually a librarian) than the decision about where to submit an article for publication (made by the author). Even if authors realize in the abstract that by submitting to publishers that charge monopoly prices they are reinforcing the power those publishers have, which results in their university or research lab having to spend too much on subscriptions, we have a classic collective action problem: the decision of each individual author about where to publish does not directly affect the amount the author’s institution spends on subscriptions, but does affect his or her readership and prestige, so authors (for the most part) quite rationally ignore the monopoly power of the publishers to whom they submit.
He argues that these factors are greatly diminished in a gold open access world:
When subscription purchasing decisions are made at the level of the journal (bundle of articles) — and increasingly the “big deal” (bundle of journals) — the only possibility for competition is between journals or bundles of journals, and as I pointed out, different articles published in different journals are not very good substitutes for each other. But in a gold OA world there is competition at the level of the article submission, and before the article is accepted for publication, multiple journals can be very good substitutes for each other (for example, an economist generally gets as much readership and prestige whether her article is published in the American Economic Review, the Journal of Political Economy, or the Quarterly Journal of Economics).
Of course, this mechanism only works if the author's choice for a more expensive APC diverts resources from other uses the author values. To the extent that their institution dedicates funds to APCs and thus insulates the author from the consequences of their choice, they disable the mechanism. MacKie-Mason somewhat fudges this aspect:
campuses might offer only a fixed reimbursement, equal say to a reasonable estimate of the current resource cost (not price) of publishing an article — today around, say, $2000 — and if the author wants to publish in a journal with an above-competitive (monopolistic) APC she will have to come up with additional funds herself. (An elegant alternative that may seem less harsh to authors, suggested to me by Mark McCabe via his collaborator Mackenzie Smith, is to give authors a somewhat higher amount — say, average cost plus $1000 — per article, and let the author bank the difference between that amount and the APC in a research account.)
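To make the incentive mechanism concrete (the dollar figures here are my illustration, not MacKie-Mason's): suppose the campus grants each author average cost plus $1000, say $3000 per article. An author who publishes in a journal charging a $2000 APC banks $1000 in her research account; one who chooses a journal charging $4500 must find $1500 elsewhere. Either way, the price difference now falls on the person choosing the journal, which is exactly the competitive pressure the subscription model lacks.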
$1000 is not much compared to the perceived difference in value to an author of different journals. The money has to come from somewhere; the only place it can come from is library subscription budgets:
central campuses could reduce the budgets of libraries by the amount that we save by not having to pay for journal subscriptions.
and, importantly, these funds would have to become discretionary funds that authors could spend on whatever they need. The publishers need this not to happen, which is why they want to keep subscriptions stable and refund APCs to libraries:
Some advocates for gold OA argue that libraries — especially consortia — will be able to negotiate "offsets", that is, reductions in subscription payments to offset the amount of revenue the publishers collect in APCs. There has been some progress in this regard, for example in the UK, Austria, and the Netherlands.

I suspect that offsets will only partially cover the costs of transition to a gold OA world. Even where progress is being made, offset savings are lagging behind growing APC payments. And my economic arguments above suggest why offsets are unlikely to fully finance the costs of transition: the dominant publishers have substantial market power, and they are going to use that power to resist the transition to gold OA, trying to make sure we (research-producing institutions) find the transition to be costly.
The priority for both sides of the current system is to prevent the funds currently flowing through their hands from being diverted to other, less worthy hands. This makes what MacKie-Mason refers to as "merely a problem of the distribution of scholarly communication funds" extremely difficult.

It is important to note that the Max Planck initiative is different from the "hybrid" gold open access model behind these offsets. The publishers are happy with subscription journals in which, for an APC, authors can make their work open access. They can refund (later) these APCs but they don't lose the subscription income. The Max Planck proposal is that, instead of article-by-article open access, journals would "flip" to open access. No subscription, just APCs. Neither publishers nor libraries like this, it carries much higher risks for both.

Ready, set, #Readathon2016 / District Dispatch

Saturday, May 21, is National Readathon Day 2016 – an opportunity for readers of all ages to come together at their local libraries, schools and bookstores to celebrate reading and raise money for early literacy. Proceeds from the event will benefit ALA's Every Child Ready to Read initiative, which supports literacy skills among children from birth to age 5.

Leading up to and during National Readathon Day, you can proselytize the cause of early literacy by using the hashtag #Readathon2016 on your favorite social media platforms. You can also make a donation to the cause on the event’s Firstgiving Fundraising page.

Penguin Random House also is using #Readathon2016 to roll out its Library Awards for Innovation, which will afford libraries across the country the opportunity to apply for grants for community-based programs.

Be sure to visit the #Readathon2016 share page for shareable images and videos and information on local reading parties. #Readathon2016 is part of ALA's Libraries Transform campaign to raise awareness of all that libraries and library professionals do for their communities in the digital age.

Read on, everyone.

The post Ready, set, #Readathon2016 appeared first on District Dispatch.

The Frivolity of Making / LITA

Makerspaces have been widely embraced in public libraries and K-12 schools, but do they belong in higher education? Are makerspaces a frivolous pursuit?

When I worked at a public library there was very little doubt about the importance of making, and it felt like the entire community was ready for a makerspace. Fortunately, many of my current colleagues at Indiana University are equally curious and enthusiastic about the maker movement, but I can't help but notice a certain reluctance in academia towards making, playing, and having fun. From the moment I interviewed for my current position I've been questioned about my interest in makerspaces and, more specifically, my playful nature. I'm not afraid to admit that I like to have fun, and as librarians there's no reason why our jobs shouldn't be fun (at least most of the time). My mom is a nurse, and there are plenty of legitimate reasons why her job isn't fun a lot of the time. But it's not just about me or even librarians. In higher education we constantly question whether it's okay to have fun.

Things like 3D printing and digital fabrication are an easy sell in higher ed, but littleBits and LEGOs prove slightly more challenging. I recently demonstrated the MaKey MaKey, Google Cardboard, and Sphero robotic ball for 40 of my colleagues at our library’s annual “In-House Institute.” My session was called “Intro to Makerspaces” and consisted of a quick rundown of the what and why of the maker movement, followed by play time. I was surprised to see how receptive everyone was and how quickly they got out of their seats and started playing. As the excitement in the room grew, I noticed one of my colleagues sitting with a puzzled look on his face. “But, why?” he said. As in, “why are you asking me to play with toys?” A completely reasonable question to ask, especially if you’ve been working in higher ed for 40+ years.

For starters, we know that learning by doing can be very effective, but that’s only part of it. Tinkering with littleBits does not make you an electrical engineer and it’s not supposed to. Tools like these are meant to expose you to the medium and to spark ideas. Cardboard is a great introduction to virtual reality, MaKey MaKey opens up the world of electronics, and Sphero is a much friendlier intro to programming than a blank terminal window. Many of these maker-type tools are marketed towards kids, but I’m convinced that adults are the ones who really need them. We need to be reminded of how to play, tinker, and fail; actions that many of us have become completely removed from.

Making is also a great opportunity for peer-to-peer learning across disciplines. The 2015 Library Edition of the NMC Horizon Report makes a solid argument for makerspaces in libraries: “University libraries are in the unique position to offer a central, discipline-neutral space where every member of the academic community can engage in creative activities.” I refuse to believe that our music students are the only ones who can play music or that our fine arts students are the only ones who can draw. The library offers a safe and neutral zone for students to branch out from their departments and try something new.

Interacting with new technologies is another key selling point for makerspaces, and the best makerspaces are a blend of high-tech and low-tech. Our very own MILL makerspace in the School of Education has 3D printers alongside popsicle sticks and pom-poms. It’s tough to be intimidated by the laser engraver once you’ve seen a carton full of googly eyes. This type of low-stakes environment is a great way to explore new technologies and there are few instances like this in the modern academic institution.

So are makerspaces frivolous? On the surface, yes, they can be. Sometimes playing is nothing more than a mental break, but sometimes it’s a gateway to something greater. I’d argue that we owe our students opportunities to do both.

There are tons of resources about makerspaces out there, but here are just a few of my favorites if you're eager to learn more…

Beta Spaces as a Model for Recontextualizing Reference Services in Libraries / In the Library, With the Lead Pipe

In Brief:

Reference services are at a crossroads. While many academic libraries continue to offer reference services from behind a desk, others are moving to roving and embedded reference models. Meanwhile, libraries are also engaged in the development of collaborative learning spaces—often rich with technology, such as makerspaces and learning labs—but these spaces are often removed from the reference services environment. Beta spaces are another type of collaborative environment used in both public and academic libraries with the potential to infuse energy into the reference space and emphasize research support through experimentation, collaboration, and user contribution. Beta spaces are user-oriented environments with a focus on innovation and experimentation, much like a makerspace but with an emphasis on ideas over technology. A beta space model for reference services would enhance opportunities for active learning, help make the research process visible and tangible, and effectively demonstrate the value of reference.

Introduction

If the “library of the future” is an environment in which knowledge is created, not merely preserved and accessed (as Arizona State University Librarian James O’Donnell suggested recently in his keynote at the 2015 Charleston Conference), then reference services are positioned within this future library to foster that environment (Hawkins, 2015). In reality, traditional reference services are often questioned as an effective model for delivery of research support in academic libraries. The reference desk as a physical space was called into question by Barbara J. Ford in the mid 1980’s and Sonntag and Palsson boldly stated in 2007 that “it is unquestionably time to eliminate the reference desk and recognize that the services it originally provided have been replaced by course-integrated instruction and research assistance ‘on demand’” (Sonntag and Palsson, 2007). Whether located at a central services desk or compartmentalized as a series of services such as roving or embedded reference, the way we think about reference delivery and the role it plays in the facilitation of intellectual experimentation and student scholarship is under constant pressure to demonstrate relevancy, and it certainly faces competition from both within and outside libraries (Campbell, 1992; Campbell, 2008; O’Gorman and Trott, 2009).

Alternative “spaces” in libraries are not new, but they tend not be built around reference services. Beta spaces are defined by Jeff Goldenson and Nate Hill as “environments within a larger library ecosystem created to prototype and deploy new ventures” (Goldenson and Hill, 2013). While this often takes the form of makerspaces or digital labs in libraries, it also describes the work of student researchers (or any library user). Scholarship is a “new venture” and the reference space can be a safe place outside of the formal classroom where students can experiment, explore, and even fail without fear of negative consequences. In this article, I explore the concept of the beta space and think about the ways that reference as an activity is one that makes the most sense if delivered in a beta environment. The final section of this article is a narrative case study of my own attempt (which may or may not have been successful) to recontextualize reference services at my library into a collaborative, experimental environment designed to inspire, encourage user ownership of the space, and demonstrate the value of reference.

Beta Spaces: A Definition

The term “beta space” is not yet commonly used in library discourse, though the word “makerspace” is. Makerspaces have been around for over a decade, and according to MAKE Magazine, the term began being used widely in 2011 (Cavalcanti, 2013). According to the EDUCAUSE Learning Initiative (ELI), a makerspace is a “physical location where people gather to share resources and knowledge, work on projects, network, and build” (“7 Things You Should Know About Makerspaces”). Though this definition is broad, it emphasizes technology and the physical building of materials in a creative environment. Further along in its definition of makerspaces, ELI goes on to explain that “makerspaces owe a considerable debt to the hacker culture that inspired them, and many are still primarily places for technological experimentation, hardware development, and idea prototyping”. There are certainly elements of the makerspace in the beta space, but these terms (and these spaces) are not synonymous. The beta space is a prototyping space, but one that focuses more on ideas than technology. In a succinct definition from their article in Library Journal, Jeff Goldenson and Nate Hill describe beta spaces as “environments within a larger library ecosystem created to prototype and deploy new ventures.” Both Goldenson and Hill worked to co-develop two independent beta spaces at their respective institutions, The Harvard Graduate School of Design, and the Chattanooga Public Library. The emphasis for both of these projects was the development of a community that supported experimentation—not just with technology, but with ideas. (Goldenson and Hill, 2013).

Chattanooga’s project is called “The 4th Floor.” It evolved from the transformation of a 14,000 square foot storage area into a collective learning environment. The space is described as a “public laboratory and educational facility” with a focus on information, design, technology and applied arts. The space features computers with access to interactive online courses, a small collection of business and innovation periodicals, provides access to digital technology, and serves as an events space. Harvard’s project was called the Labrary. Occupying a vacant storefront in Harvard Square, it was conceived by students as part of the Library Test Kitchen, a course taught at the Harvard Graduate School of Design. Unlike the 4th Floor, which is a permanent space, the Labrary was a 37 day experiment, essentially a “pop-up” library designed to facilitate creative collaboration, exhibit student work, and try out new ideas from the Library Test Kitchen such as tables that play low ambient noise to stave off complete silence (Koerber, 2013).

Makerspaces, technology-rich labs, and the growth of digital humanities in the library space is not without controversy. The makerspace movement can be seen as part of a larger trend of applying a corporate mindset to library services, with a focus on technology and production rather than discourse. There are wider concerns that academic libraries are under pressure to adopt business strategies and focus on library users as “customers” (Nicholson, 2015) as well as on the creation of knowledge as a consumable product (Ward, 2012). A recent article in the Los Angeles Review of Books questions the neoliberal agenda of digital humanities in particular and specifically targets the “promotion of project-based learning and lab-based research over reading and writing” (Allington, Brouillette, and Golumbia, 2016). These concerns are legitimate and it is healthy to question the motivations behind the transformation of any library service. Library makerspaces—and by extension, beta spaces—are designed to support active learning through hands-on experiences. Kurti, Kurti and Fleming explain that “maker education is a branch of constructivist philosophy that views learning as a highly personal endeavor requiring the student, rather than the teacher, to initiate the learning process” (2014). I believe that beta spaces offer an opportunity to facilitate collaborative learning outside of the classroom in a way that does not negate the value of traditional scholarship, nor supplant traditional library services, but it does offer an opportunity to enhance them.

Out of their experimentations, Goldenson and Hill establish three “shared beliefs” or themes about beta spaces. For them, beta spaces:

  1. Facilitate real-time knowledge creation
  2. Are designed for experimentation, and
  3. Encourage community-driven innovation (Goldenson and Hill, 2013).

It is important to remember the participatory element of beta spaces. The creative activities taking place within beta spaces such as the 4th Floor and the Labrary are user-driven. The spaces themselves were designed by librarians, faculty (and, in Harvard's case, graduate students), but the work that goes on there is fueled by user inquiry, needs, and creative impulses. A participatory design approach to the development of beta spaces in libraries is therefore at the foundation of the concept. "Participatory design" was defined by the Council on Library and Information Resources (CLIR) in 2012 as "an approach to building spaces, services and tools where the people who will use those things participate centrally in coming up with concepts and then design the actual products" (Participatory Design in Academic Libraries). A beta space is nothing without the people who come into it to try out new ideas, whether through discussion, a more formal reference interview, the exhibition of user-created work, or even a creative response to a display prompt.

The Idea Box at the Oak Park Public Library in Illinois is an example of a beta space-type environment that is set up by library staff, but then powered by the creativity of the public who interact with and add value to the space through participation. The Idea Box is a 19’ x 13’ glass-walled storefront with regularly rotating displays that encourage people to come in, “tinker,” and experiment. The range of activities in this space has included magnetic poetry, advice sharing, dancing, and oral histories—all driven by user contribution. Staff may have painted the room with magnetic paint and populated it with word fragments, but the poems were created by visitors and it is the visitors who give this space meaning (Library as Incubator Project, 2013). With these examples in mind, a beta space can perhaps be summed up as: a space within the library environment designed to facilitate knowledge creation in real-time through user participation and experimentation. This is also what I consider to be the heart of reference services.

Beta Spaces and Reference Services

Reference as a library service can encompass a range of activities, depending on the type of library and its particular mission. I have attempted to identify a core definition of "reference services" from ALA's Reference and User Services Association (Definitions of Reference – Reference & User Services Association), but found only definitions for the components of this service: "Reference transactions" and "reference work." According to RUSA, reference transactions are "information consultations in which library staff recommend, interpret, evaluate, and/or use information resources to help others meet particular information needs." The makeup of this reference work "includes reference transactions and other activities that involve the creation, management, and assessment of information or research resources, tools, and services" (RUSA). These definitions were last approved in 2008 by the RUSA Board of Directors and describe a fairly straightforward exchange between library staff and user, one that emphasizes the transference of information from authority figure to knowledge seeker and explicitly excludes formal instruction. With this definition, it is easy to see why reference services are at a crossroads.

An informal survey of reference services mission statements and statements of philosophy shows a broader scope for reference and research support services in both academic and public libraries. The mission statement for Research and Information Services at the University of Illinois at Urbana-Champaign, for example, states:

“The Research and Information Services is the University Library’s central hub for research assistance, leading patrons to the discovery of library resources and expert help. We provide assistance to researchers working in all disciplines, help people to locate difficult to find items, and make referrals to subject specialists when appropriate. We support the educational mission of the university by approaching research support services from an instructional perspective, and by fostering user independence and the development of information literacy skills” (Mission Statement & Vision).

While couched in the practical (i.e., locating items and making referrals), this mission statement addresses the centrality of reference services to the overall mission of the library, and encompasses instruction in a way that the RUSA definition does not. Likewise, the philosophy of service at New Jersey’s Newark Public Library puts reference at the very center of what the library does. Newark goes so far as to say: “Reference service at The Newark Public Library is one of the most vital and visible expressions of the Library’s purpose and mission and is key to each of the Library’s four primary service roles: to serve as a center for information, formal education, research and independent learning” (Reference Services Policy – The Newark Public Library).

In many cases, however, reference services are not explicitly addressed in library mission statements, and the physical footprint of these services is being dismantled in some libraries. The news is often alarmist. In 2010, a Los Angeles Times article on libraries in the digital age opened with the declaration that a public library in the Denver area had replaced its reference desk in order to make space for patrons to play “Guitar Hero” (Sarno, 2010). In his 2013 study, “Shall We Get Rid of the Reference Desk?”, Dennis B. Miles found that a large percentage (66.4%) of academic libraries still use a physical desk to deliver reference services. But libraries are experimenting. In addition to desk-based services, librarians are engaged in roving reference, are consolidating service points (such as merging reference and circulation), and are offering more in-office consultations with students (Miles, 2013). At Sonoma State University, the reference desk has gone through a number of transitions in recent years, starting with a roving reference program in 2012 and the consolidation of the Reference and Circulation Desks. While the combined desk allowed for more efficient staffing, it also served as a central place for answering quick directional questions and contacting subject specialists on an on-call basis (Lawson & Kinney, 2014). In 2001, librarians at Northwest Missouri State University removed their traditional reference desk entirely and instead invested time in embedded instruction (both in the classroom and online), among other just-in-time services (Meldrem, Mardis, & Johnson, 2005). This model essentially disperses the research activities central to the work of the library across campus and within the online environment.

There is an uncomfortable tension between the stated value of reference services in library mission statements and the threat to their visible presence in the physical environment, whether through dispersal of services or through confinement of services behind a static desk. Many of the newest and most exciting spaces in libraries are technology-rich spaces such as makerspaces and digital labs, but these are often built out as separate classroom-like spaces. Even the 4th Floor at the Chattanooga Public Library, a great example of a successful beta space, is quite removed from the primary services desk. If a library does have a reference desk, its function is surely in question when the new super-star room filled with collaborative technology and innovative resources pops up down the hall. Instead of dispersing reference services, I see the potential in developing reference service spaces—such as a research or information commons—as beta spaces, where reference is integrated into the fabric of a creative, user-driven environment and a research consultation is not merely a “transaction.”

Goldenson and Hill’s three themes for beta spaces (real-time knowledge creation, experimentation, and community-driven innovation) are very much in line with the scope of reference services. Because users are actively engaged in the research process while using reference services—or have the potential to be while asking more directional questions at the reference desk—the beta reference space is an ideal environment for making research visible through collaborative inquiry, curated reference source collections, and interactive displays and exhibits like those in the Idea Box at Oak Park. Placing reference within the beta space helps to clarify the services offered, inspire other researchers about what is possible, and educate users on available resources. Focusing on knowledge creation within reference services validates the work of the library user and helps to establish a healthy symbiotic relationship between library staff and user.

A (Very Beta) First Foray

When I was hired in July 2014 as the information specialist overseeing reference services at the Pearson Library at California Lutheran University, the lines between reference services, circulation, and information literacy were fluid and confusing. The official home base for reference services was the “Information Commons” (IC Desk), a dilapidated desk with two office chairs for staff on one side and a bank of five public computer workstations on the other. Open every day from 10am to 10pm during the semester, the IC Desk was mainly staffed by students cross-trained in circulation, and the Circulation Desk served as a de facto reference desk during periods of understaffing. I needed to figure out what reference meant for us, how to ensure that it was meaningful to users and to the staff who worked there, and how to isolate it as a singular service in order to market it. In doing so, I unconsciously drew on my experience teaching at an art school, where the students spent hours working in an atelier environment, learning new techniques through trial and error. I was also greatly inspired by my very first library school class at San Jose State University in Spring 2015: “Innovation and Participatory Programming in Libraries,” taught by Monica Harris, to whom I owe a great deal of credit. This is where I learned about beta spaces.

Information Commons Desk with directional sign above

At the IC Desk, we were experiencing healthy patron interaction statistics. Based on internal statistics collected in Springshare’s LibAnswers Reference Analytics from Fall 2014, the IC Desk logged a monthly average of 101 in-person patron interactions, and through LibChat, desk staff engaged in an average of 116 online chats with patrons per month (serving an FTE of approximately 4,100—about 2,800 undergraduate and 1,300 graduate students). Most of the time, the desk was staffed with students while librarians were on call. The stats seemed good, but anecdotally there was a lack of awareness about reference services at the IC Desk and low morale among those who worked at the “Isolation Corner.” Many users came to Circulation to ask reference questions and were annoyed at being redirected to the desk behind them. As the reference coordinator, I found it challenging to staff the desk with librarians, who were heavily engaged with instruction and weren’t consistently able to commit to regular hours at the desk. For me, the Information Commons’ mission to “support research and learning by offering a conceptual, physical, and instructional space designated to deliver, instruct and gather information” was in question. It wasn’t really a commons. It was just a desk.
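
To put those figures in scale, here is a quick back-of-the-envelope computation (a Python sketch for illustration only; it treats each interaction as a distinct patron, which overstates reach):

```python
# Fall 2014 monthly averages quoted above, set against the FTE served.
in_person_per_month = 101
chats_per_month = 116
fte_served = 4100  # ~2,800 undergraduates + ~1,300 graduate students

total = in_person_per_month + chats_per_month
print(f"{total} interactions/month ≈ {total / fte_served:.1%} of FTE per month")
# Output: 217 interactions/month ≈ 5.3% of FTE per month
```

Roughly one student in twenty in a given month, which squares with the anecdotal lack of awareness just described.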

With the support of the library director, I set out to establish a strategic plan for reference services (as a service and as a space), and as I developed these plans, they began to solidify around the concept of the beta space. The strategic plan’s stated mission was to “inspire research by providing a variety of research services to best meet the needs of CLU students, faculty and staff by creating a scholarly environment that supports student learning across departments.” The plan was founded on five primary goals, stated without a fixed timeline and based on an outline Nina Simon developed in The Participatory Museum for evaluating success.

Goals for Reference Services at Pearson Library

  1. Establish an environment of intellectual curiosity and exploration at Pearson Library
  2. Raise campus awareness of research resources provided by Pearson Library
  3. Increase number of meaningful patron interactions at the reference services desk
  4. Increase opportunities for experiential learning at Pearson Library
  5. Increase opportunities for CLU community to share perspectives and experiences

An important part of setting up a framework for later evaluation was not just thinking about the goals for the service, but about how users would ideally be affected by interacting with us. These desired outcomes are admittedly lofty.

Desired User Behaviors and Outcomes

  1. Learn more about research resources and effective utilization of these resources
  2. Visit and use the Library more often, whether a student looking for research assistance, or faculty looking to support learning in their classroom
  3. Perceive Library as a center for rigorous scholarship on campus
  4. Perceive Library as a fun and approachable space for informal learning
  5. Develop academic confidence and intellectual curiosity that leads to a life of learning outside the classroom

The plan was to transform the IC Desk into the “Collaborative Research Commons” and to think of reference services as operating not just from a desk, but as encompassing all of the space around it—including the glass atrium, mobile furniture, and exhibit furniture that was already near the desk. The desk itself was due for an upgrade, so I proposed moving it to a slightly more central location (in fact just a few feet away) and adding truly collaborative furniture around a mobile, U-shaped central desk.

Proposed relocation of the “Collaborative Research Commons”

Proposed floor plan for the Collaborative Research Commons

The Collaborative Research Commons would both replace and enhance existing IC Desk services. Instead of sitting parallel to the Circulation Desk near the entrance to the library, the new Collaborative Research Commons would be located adjacent to the library’s open-air central atrium, which could be leveraged as exhibition and workshop space. This is where I imagined that research would truly become visible: students working in the space would be visible from all corners of the building, and completed work could be hung on the glass. Aligning the new desk to this Idea Box-like space was central to the renovation proposal. The new area would feature a more open, approachable (and mobile) desk facing patrons coming in through the front doors. Optionally, a second, back-facing desk could serve the general computer lab as a technology help point, a service that until then had no public-facing desk in the library. Surrounding the service desk(s) would be four round tables (on casters) with seating for four to six people each. These tables would serve small groups working together on projects, as well as longer research consultations with librarians. They would not have mounted computers (as in the existing IC Desk area), which impede mobility and effectively block communication between staff and patrons; laptops, tablets, and other technology could be brought into the space as needed from the existing mobile lab. These tables could also serve small classes coming in to do research together, which until then had relied on rows of desktop computers in the computer lab.

By transforming the Information Commons into a “Collaborative Research Commons,” a name that emphasizes the activity as opposed to the resource, we would be employing a “beta space” approach to reference services that encourages exploration of ideas and a cooperative learning environment based on social interactions and participatory practice. Incorporating collaborative tables and exhibition space into the overall research space is a way of “envision[ing]…boundaries in more porous ways” (Rogers and Seidl-Fox, 2011). The space would become a classroom outside the traditional classroom, allowing librarians to meet with students and faculty for research appointments not tucked away in back offices but, for example, at one of the round tables with a laptop.

Now for the reality check. During the 2014-2015 academic year, we were not able to realize the physical transformation of the space through renovation, though library administration approved of the ideas and design consultants were brought in. The money just wasn’t available that year. But we ran a series of successful programs that demonstrated the potential of the beta reference space and generated new energy around reference services as the central arm for outreach and research support (as opposed to access services or information literacy instruction). In our first event, in December 2014, the atrium windows were used for the first time as gallery space for a student exhibition of Islamic Calligraphy, and a small reception was held adjacent to the IC Desk. Students and faculty came to speak about their work learning a new language and artistic technique through experiential learning in this calligraphy course.

Hanging the Islamic Calligraphy Show on the atrium windows in 2014

The following semester, the space was used to display erasure poetry made from weeded library materials by staff, faculty, and students. The atrium hosted two creative writing classes in which students added their final products both to the atrium’s windows and to the April National Poetry Month display; reference staff assisted them as they navigated the exhibit space and selected source material for their poems. The IC Desk also served as a stop for a poetry prompt station (staffed by the same faculty member who brought her creative writing classes into the atrium), where library visitors had an opportunity to add their work to the display.

Students writing poetry inside and outside the atrium, our “Idea Box” of sorts at Pearson Library

Adding new work to the April National Poetry Month display

Students began to add their work to the atrium windows after writing poetry in a class session held in the atrium

The events we held that year in the atrium and the space around the IC Desk may or may not have happened regardless of the strategic plan for reference services. But we were intentional in making the (in this case creative) scholarship visible and in placing these creative activities around ongoing reference activities at the services desk. Physical transformation of the space as proposed would help to cement the connection between the process of research and the resulting scholarship on display. Going forward, we would need to expand beyond art and poetry in order to truly align reference activities with the generation of new scholarly ideas and demonstrate the value of reference services by highlighting evidence of learning outcomes, student accomplishments, and models for inspiring research projects. The official Collaborative Research Commons proposal included dozens of thematic starting points to generate research ideas and pull in work from ongoing courses with amenable faculty members in a range of disciplines. Ideas included inviting campus wellness center staff to serve as “reference librarian for the day” during Health Awareness Month in January, with interactive displays and resources on health topics; a “blind date with a book” display with a table set up for users to write Valentine’s Day love notes to their favorite books during February; and a mobile technology workshop with resources on creating short videos on tablets and iPads, with an option to play the films on screens mounted in the space.

The fact was that we tried these new programs in a fairly haphazard and difficult-to-assess way, by making connections with faculty willing to experiment with new library spaces. We started to collectively think of the IC Desk as something more than a little desk with a bank of computers: as the potential hub of the library. To truly assess the impact of the renovation and the actual utilization of a Collaborative Research Commons, the following methods of assessment were identified as part of the proposal:

  1. Measure and compare the length and type of patron interaction taking place at the reference desk before and after implementation of beta space project changes, using the LibAnswers Reference Analytics tool; align interactions with the ACRL Framework as is currently done with information literacy instruction (a minimal analysis sketch follows this list).
  2. Include questions about awareness and effectiveness of environment and services in library survey deployed annually to students.
  3. Offer short point-of-service surveys (such as reply cards) at time of patron interaction and/or program or event.
  4. Measure attendance at programs and workshops.
  5. As programs are designed, define intended learning goals; document evidence of student learning through collection of work and/or photographs of participation and work (such as collecting found poetry and taking pictures of exhibition space after participants have added their work to it).
  6. Add a question about reference space to course evaluation for classes that utilized the space during the semester.
  7. Over time, collaborate with the alumni affairs department to identify post-graduation activities of participants and their continued perceptions of the library after graduation.
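
As a concrete illustration of the first method above, here is a minimal Python sketch of a before/after comparison. It assumes the analytics can be exported to CSV; the file name, the column names (“date” and “duration_minutes”), and the cutoff date are hypothetical stand-ins, not the actual schema of a LibAnswers export.

```python
import csv
from datetime import date
from statistics import mean, median

# Hypothetical go-live date for the renovated reference space.
CUTOFF = date(2015, 8, 24)

def load_durations(path):
    """Split interaction durations into before/after the cutoff date."""
    before, after = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            when = date.fromisoformat(row["date"])   # e.g. "2015-02-11"
            minutes = float(row["duration_minutes"])
            (before if when < CUTOFF else after).append(minutes)
    return before, after

before, after = load_durations("reference_interactions.csv")
for label, durations in (("before", before), ("after", after)):
    print(f"{label}: n={len(durations)}, mean={mean(durations):.1f} min, "
          f"median={median(durations):.1f} min")
```

The same pre/post split extends naturally to interaction type or any other field the desk already records, so the comparison can feed directly into goal 3 (more meaningful patron interactions).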

Our work was decidedly beta. We tried something new and made some concrete proposals. Many libraries do the same kind of work and run the same kinds of programs that we did, but we placed these programs within the reference environment and linked the products of creative scholarship to the research process, both through physical association and by mindfully reconceptualizing the reference space as an informal learning environment founded upon experimentation. It is impossible to truly assess the success of what we did (beyond the communal feeling that we were on to something) because the Collaborative Research Commons proposal wasn’t actualized and the methods of assessment couldn’t be tested. I left the Pearson Library the following summer, and while I know creative work continues to be done there, I can’t know for sure how much of the original proposal will be supported in my absence.

Conclusion

Questions have been raised about the value of reference services in the 21st-century library. What value do they offer users? How are users engaging with reference spaces? Applying a participatory design model to reference services is an alternative to dismantling them altogether or dispersing them to the point of invisibility. As libraries design and develop collaborative learning commons, digital labs, makerspaces, and beta spaces, why not centralize them around reference services? If these are the places where users engaged in new ideas and technologies really want to be, then what better way to facilitate new learning and guide the process than the physical and conceptual merging of beta with reference? In his 2009 article, “Libraries and Learning: A History of Paradigm Change,” Scott Bennett wrote the following in regard to library learning spaces designed in the 1990s and early 2000s:

“Some features of a learning-centered design – with the generous provision of group study spaces and information and learning commons chief among them – are now regular features of library planning. It is far from clear that our concern with learning goes much beyond these features, however” (Bennett, 2009).

I believe that incorporating the ethos of the beta space into the library learning space, and placing reference services within this context, is a way to take this next step.

In my dream of dreams, academic reference librarians and subject specialists would not have offices deep in the back of the library building only to emerge for a couple of hours at the reference desk, but they would be permanently based out in the open—visible models for research and intellectual engagement in a user-driven, participatory environment like a beta space. Understaffing is likely to continue to be a problem for many libraries long into the future, and the beta space model is an experimental step towards blurring the lines between faculty office, classroom, scholar commons, and gallery. Librarians would not have static “shifts” out at a desk, or recede into the depths of an interior office to hold consultations. In the beta space model, there are opportunities to place librarians more permanently in public spaces by placing their offices within the space. This has the potential to relieve some of the pressure on reference librarians forced to bounce back and forth between office, desk, and classroom. It also has the potential to infuse reference services with subject expertise from teaching faculty, graduate students, and visiting scholars. It could be a place where faculty or graduate “subject experts” could hold public office hours or drop-in research sessions, groups could engage in collaborative research projects, and technology experts could triage technology questions. These alternative activities keep the space alive even if there isn’t a reference librarian on duty at a given time. Again, these things aren’t necessarily new to reference services, and these activities may be happening in other parts of the library or across campus, but I see an opportunity to centralize these key learning activities around reference services.

In terms of curating the products of knowledge creation, our work at Pearson Library captured primarily examples of creative work—poetry, art, and the like. Other examples could include collections of bound theses and dissertations, screens highlighting student and faculty work collected in institutional repositories, physical collections of student or community newspapers, campus journals, zines, or thematic user-curated displays of library materials. Whiteboards, chalkboard paint, over-sized sticky notes, and tables topped with whiteboard surfaces are just some of the ways to collect the ephemera of the research process within the reference space. In addition, these activities support a focus on student creation of knowledge as part of information literacy education, as described in the ACRL Framework, and have the potential to be expanded into a larger information literacy program in collaboration with other library departments (“Framework for Information Literacy for Higher Education”). Showcasing the work of student and faculty researchers provides a model, a shining light of what is possible. It is both inspiring and encouraging. When this is done in the active, participatory environment of the beta reference space, research is highlighted as both an end goal and a process.

The heart and soul of reference services is the personal interaction between librarian and user, which is itself an entry point into intellectual discourse. As technology evolves, this interaction takes on many forms—a telephone call, an online chat, an embedded classroom session, or a conversation while sitting in front of a computer workstation. Reference services are not bound by a desk, nor even by a room; by allowing the reference space to incorporate the elements of the beta space through display, participation, collaboration, and simple conversation, the “beta space” model positions reference services as essential to branding the library and to facilitating an environment of intellectual curiosity and exploration. This might not work for every library—not everyone has a central atrium, of course—but I truly believe in the value of reference services and in the value of the beta space. By merging the two, libraries offer a unique opportunity for user empowerment, demonstration of value, and research support.


I would like to offer my most sincere thanks to Ian Beilin and Kate Adler for their incredibly meaningful feedback and guidance during the peer review process. As the internal reviewer for In the Library With the Lead Pipe, Ian offered supportive, substantive commentary on multiple drafts and some keen editing skills. Kate made connections I wouldn’t have thought of on my own as external reviewer, and I have greatly valued her insights. Many thanks to Erin Dorney, publishing editor at In the Library With the Lead Pipe, for her support and guidance. I would also like to sincerely thank Monica Harris for first introducing me to the beta space. Thank you!


References

4th Floor – Chattanooga Public Library. (n.d.). Retrieved from http://chattlibrary.org/4th-floor

7 Things You Should Know About Makerspaces. (n.d.). Retrieved from https://net.educause.edu/ir/library/pdf/eli7095.pdf

Bennett, S. (2009). “Libraries and Learning: A History of Paradigm Change.” portal: Libraries and the Academy. 9(2), 181-197.

Campbell, J.D. (1992). “Shaking the Conceptual Foundations of Reference: A Perspective.” RSR: Reference Services Review, 20(4), 29-36.

Campbell, J.D. (2008). “Still Shaking the Conceptual Foundations of Reference: A Perspective.” The Reference Librarian, 48 (100), 21-24.

Cavalcanti, G. (2013, May 22). “Is it a Hackerspace, Makerspace, TechShop, or FabLab?” Make Magazine. Retrieved from http://makezine.com/2013/05/22/the-difference-between-hackerspaces-makerspaces-techshops-and-fablabs/

Definitions of Reference – Reference & User Services Association (RUSA). (n.d.). Retrieved from http://www.ala.org/rusa/resources/guidelines/definitionsreference

Ford, B. J. (1986). “Reference Beyond (and Without) the Reference Desk.” College And Research Libraries, 47(5), 491-94.

“Framework for Information Literacy for Higher Education” (2015, Feb 2). American Library Association. Retrieved from: http://www.ala.org/acrl/standards/ilframework

Goldenson, J., & Hill, N. (2013, May 16). “Making Room for Innovation.” Library Journal. Retrieved from http://lj.libraryjournal.com/2013/05/future-of-libraries/making-room-for-innovation/#_

Hawkins, D. (2015, Nov 5). “Star Wars in the Library.” Charleston Conference Blog. Against the Grain. Retrieved from: http://www.against-the-grain.com/2015/11/star-wars-in-the-library/

Kurti, S., Kurti, D., & Fleming, L. “The Philosophy of Educational Makerspaces: Part 1 of Making an Educational Makerspace.” Teacher Librarian, 41(5), 8-11.

Koerber, J. (2013, May 28). “Looking Through the Labrary Lens: Lessons from the Library Test Kitchen.” Library Journal. Retrieved from http://lj.libraryjournal.com/2013/05/buildings/lbd/looking-through-the-labrary-lens/#_

Library as Incubator Project. (2013, June 26). “Out of the Archives: Live Art & Community Participation in the Oak Park Public Library Idea Box.” Retrieved from http://www.libraryasincubatorproject.org/?p=5025

Meldrem, J., Mardis, L., & Johnson, C. (2005). “Redesign Your Reference Desk: Get Rid of It!” In Currents and convergence: Navigating the rivers of change: Proceedings of the Twelfth National Conference of the Association of College and Research Libraries, April 7-10, 2005, Minneapolis, Minnesota. Chicago: Association of College and Research Libraries.

Miles, D. (2013). “Shall We Get Rid of the Reference Desk?” Reference & User Services Quarterly, 52(4), 320-333. Retrieved from https://journals.ala.org/rusq/article/viewFile/2899/2972

Mission Statement & Vision. (About Research and Information Services). Retrieved from http://www.library.illinois.edu/rex/about/mission.html

O’Gorman, J., & Trott, B. (2009). “What Will Become of Reference in Academic and Public Libraries?” Journal of Library Administration, 49(4), 327-339. Retrieved from http://dx.doi.org/10.1080/01930820902832421

Nicholson, K. P. (2015). “The McDonaldization of Academic Libraries and the Values of Transformational Change.” College & Research Libraries, 76(3), 328-338.

Participatory Design in Academic Libraries: Methods, Findings, and Implementations. (2012, October). Council on Library and Information Resources. Retrieved from http://www.clir.org/pubs/reports/pub155/pub155.pdf

Reference Services Policy – The Newark Public Library. (n.d.). Retrieved from http://www.npl.org/pages/aboutlibrary/reference_policy.html

Sarno, D. (2010, Nov 12). “Libraries reinvent themselves as they struggle to remain relevant in the digital age.” Los Angeles Times. Retrieved from: http://articles.latimes.com/print/2010/nov/12/business/la-fi-libraries-20101112

Simon, N. (2010). The Participatory Museum. Santa Cruz, Calif.: Museum 2.0.

Sontag, G., & Palsson, F. (2007). “No Longer the Sacred Cow – No Longer a Desk: Transforming Reference Service to Meet 21st Century User Needs.” University of Nebraska–Lincoln Libraries. Retrieved from http://dialnet.unirioja.es/servlet/oaiart?codigo=2293893

Ward, S. C. (2012). Neoliberalism and the Global Restructuring of Knowledge and Education. New York: Routledge.

Plagiarism / Ed Summers

Because I wish I had said these things. I’m glad Samantha did.

VIDEO UPDATES from the DSpace New User Interface (UI) Initiative team / DuraSpace News

From Tim Donohue, DSpace Tech Lead

Austin, TX – In the weeks leading up to Open Repositories 2016 (and beyond), the DSpace New User Interface (UI) Initiative team will post regular, brief video updates on its progress to YouTube. Viewers are encouraged to send feedback or contribute to the initiative. Currently, the team consists of volunteers from Texas A&M, @mire, Cineca and DuraSpace. We welcome new volunteers.


038 – Tim Spalding / LibUX

In our interview with Tim Spalding (the founder of LibraryThing) we talk about the design and development of TinyCat, the poor usability of libraries’ critical software, and loads more.

Here’s what we talked about

  • 0:53 – You wouldn’t believe how big a feral iguana gets!
  • 1:15 – All about LibraryThing
  • 4:30 – We are critical about the OPAC experience

It’s a tragedy, it’s more than just broken software. Somehow we’ve stumbled into this world where libraries are not part of the normal web experience of people. When they’re your local library and you end up using their interface to their catalog, it’s terrible. The problem is sort of circular: terrible software has led to this situation, terrible choices have led to this situation, but it’s somewhere in between a catastrophe and a disgrace. Tim Spalding

  • 6:30 – TinyCat
  • 9:43 – What if Springshare teamed-up with LibraryThing … ?

I thought having something that’s like LibApps plus TinyCat would be an incredibly compelling product that comes in swinging around quite a big bat on this spectrum of library vendorware. Michael Schofield

  • 13:16 – Lazy use of stock photography
  • 14:51 – It’s impossible in the library world to have public pricing

Library software in general is far too expensive for what it actually is and not as good as it should be. There’s something fundamentally broken in the library software market that’s produced that situation. Tim Spalding

  • 15:30 – On the usability of the TinyCat interface
  • 20:00 – The promise and tragedy of the current open-source solutions

I’ve been in this game for so long and I’ve been pushing Evergreen and Koha because they’re open source and they can get better, they can transcend this market, but they haven’t. They’re not markedly better than other things, they’ve just kind of copied the dysfunctions. I still think people should go with an open-source solution but it hasn’t produced the sort of Nirvana that five years ago I thought it would. Tim Spalding

  • 22:18 – “Bibliocommons is sort of the library catalog I would have built.”
  • 23:00 – Speed vs. UI, and we come back around to mobile-first design – and its future being something more like API-first.

There was this dream that the library technology world would be given APIs, given open source, and just do these amazing things. I think that dream failed for a number of reasons. There aren’t enough library developers. Once you do it by language you have tiny little communities. Some of the best products – just to mention Bibliocommons again – they’re not something you can get inside … they’re controlling the interface. Largely that’s been a good thing. There’s so many library catalogs out there where someone’s been put in control of the interface and it’s dreadful. So there’s this kind of push/pull between openness and extensibility and people who create the right product …. I’ve lost the religion that I long held that if library programmers are finally given access to everything the world would be better. I think we’re moving into a somewhat darker world where everything is in the cloud, things are locked down, and they’re probably pretty good. Tim Spalding


If you like, you can download the MP3, or you can subscribe to LibUX on Stitcher, iTunes, Google Play Music, SoundCloud or plug our feed right into your podcatcher of choice. Help us out and say something nice. You can find every podcast right here on www.libux.co.


House appropriators narrowly vote to politicize LC subject heading choices / District Dispatch

Library of Congress Subject Headings

Source: Alibaba.com

The full House Appropriations Committee met earlier today to “mark up” (amend and vote on) legislation to fund the Legislative Branch for FY 2017. As previously reported and expected, language inserted in the official Report accompanying the bill at the Subcommittee level essentially instructing the Library not to implement proposed changes to the subject headings “aliens” and “illegal aliens” was hotly debated. Rep. Debbie Wasserman Schultz (D-FL23), the Ranking Member (most senior Democrat) of the Legislative Branch Subcommittee, spoke passionately and at length in support of her amendment to remove that language from the Report. She was joined by full Appropriations Committee Ranking Member Nita Lowey and by many other Members of the minority. The amendment also was supported by a joint letter, entered into the record, by the Chairs of the Hispanic, Black, and Asian Pacific American Caucuses.

Although four Members of the majority broke with their party and Committee leaders to support the measure — Reps. Diaz-Balart (R-FL25), Jolly (R-FL13), Valadao (R-CA21) and Young (R-IA03) — the amendment regrettably failed 24-25 (with two Members of the Committee absent or not voting). However, neither the Senate’s version of the legislation nor its report is expected to include similar instructions to the Library. Consequently, even if the bill and Report approved today by the Appropriations Committee are passed by the House, whether the instructions to the Library remain in the final “conferenced” version of the bill will be up to House and Senate negotiators. While clearly not the preferred outcome, today’s tightest of votes on the Wasserman Schultz amendment will make it much harder for House negotiators to successfully insist on retaining the objectionable Report text.

Indeed, it remains unclear whether the Legislative Branch appropriations bill, or any other such funding legislation, actually will make it all the way through this Congress before it adjourns at year’s end. In recent years, the appropriations process has stalled, forcing Congress to resort to a procedural measure called a Continuing Resolution (“CR”) to fund the government for a specified period of time at the previous fiscal year’s levels. Typically, report language like that debated today does not accompany a CR. Look for developments on that front, however, no sooner than September.


Reflections on DPLAfest from a DLF Cross-Pollinator / DPLA

This guest post was written by Nancy Moussa, Web Developer at University of Michigan Library, DPLA Community Rep, and DPLA + DLF ‘Cross-Pollinator.’ (Twitter: @nancy_moussa)

It has been a month now since I came back from DPLAfest in Washington, D.C. This year, over 450 people attended the fest, which was co-hosted by the Library of Congress, the National Archives, and the Smithsonian.

I attended this conference because of the generous award I received from the DPLA + DLF Cross-Pollinator travel grant. Over the two days at the fest, I communicated with various librarians, archivists, developers, and publishers.

The first time I learned about DPLA was at HathiTrust Camp in 2014. Since I was already passionate about public goods and working in public institutions, DPLA seemed like the perfect platform to be involved in. Therefore, I applied and was selected as a DPLA Community Rep for Canton, MI for 2016.

Now, back to DPLAfest. The energetic DPLA members inspired me. They were successfully collaborating and sharing their ideas and the best practices they had learned from their digital projects.

The sessions presented at the fest covered various topics, including the 100 Primary Source Sets for education and the implementation of service hubs in different institutions. There were also a few sessions about RightsStatements.org, which has standardized the way hubs describe the usage rights on materials.

Most of the sessions I attended were related to the use of DPLA in education, such as the 100 Primary Source Sets and the use of DPLA in academia. Since I am a Community Rep for DPLA, these sessions were a natural fit for me.

The Primary Source Sets project is one of DPLA’s important contributions to teachers and schools. There are 100 ready-made lessons with activities for teachers to use. Most of the primary source sets are targeted at students in grades 6-12 and at college students, in subjects like social studies and history. However, DPLA is looking for resources in STEAM as well.

Also, a hackathon session ran before the conference, where developers brainstormed ideas for building apps using the DPLA API.
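
For readers curious what a first experiment against the API looks like, here is a minimal Python sketch that queries DPLA’s public v2 items endpoint. The search term is an arbitrary example, and the api_key value is a placeholder for a key requested from DPLA.

```python
import json
import urllib.parse
import urllib.request

# Minimal query against DPLA's v2 items endpoint.
# "YOUR_API_KEY" is a placeholder; DPLA issues keys on request.
params = urllib.parse.urlencode({
    "q": "cooking",      # free-text search term (arbitrary example)
    "page_size": 5,      # how many records to return
    "api_key": "YOUR_API_KEY",
})
url = "https://api.dp.la/v2/items?" + params

with urllib.request.urlopen(url) as response:
    results = json.load(response)

print("Total matches:", results["count"])
for doc in results["docs"]:
    # sourceResource carries the descriptive metadata for each item
    print("-", doc["sourceResource"].get("title"))
```

A hackathon app would build on responses like these, paging through the docs and rendering the metadata in whatever interface the team dreams up.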

An interesting session was given by Assistant Professor Krystyna Matusiak about “Using DPLA for Teaching and Learning in Higher Education.” Matusiak had conducted a usability study of the academic use of DPLA among 21 subjects, including undergraduate students, graduate students, and faculty. The study is not published yet, but the feedback from participants was encouraging: the idea of a one-stop shop was appealing to everybody.

DPLAfest attendees check out the Constitution in the National Archives Rotunda.

Finally, the special event DPLA prepared at the Rotunda for the Charters of Freedom in the National Archives was impressive. The original documents of the Constitution of the United States, the Bill of Rights, and the Declaration of Independence are presented in this exhibit. I had the luxury of entering this exhibit and embraced the chance to see the original “Charters of Freedom” documents in their huge glass cases.

The DPLA so far has over 13 million items from 1,900 contributing institutions, and it is growing fast. I hope for more collaboration between DLF and DPLA in the future; the “cross-pollination” award is one way to strengthen this collaboration and provide the opportunity for other DLF members to attend future DPLAfests.

I am very excited to see the community of DPLA growing and getting stronger. My attendance at DPLAfest served to strengthen my passion to spread the word about DPLA.

Special thanks to the Digital Library Federation for making the DPLAfest Cross-Pollinator grant possible.
