College Football Top 10

Oops… I forgot to publish a top 10 last week. Sorry about that. I had one ready to go but wanted to check on a few things and never got it into the blog. There were a number of upsets in the “Top 25” this weekend. The only one I was really surprised about was South Carolina losing. I sort of thought Ohio State might lose since they hadn’t played anyone like Wisconsin yet and they were on the road. At least that will solve the potential of two Big 10(11) teams going through the whole season and not playing each other (what a mess that would be). FWIW: The “Prev” rankings are where I had the teams two weeks ago, since I didn’t publish last week.

  1. Boise State (6-0; Prev 1). Virginia Tech is rolling through the early part of their ACC schedule, so I still think that was a good win despite some questions about VaTech after their loss to James Madison. Although Oregon State lost in overtime to Washington last week to go to 3-3, I still think they are a good team. Losses to Boise, TCU, and Washington won’t get you in the top 10, but they are nothing to be ashamed of.
  2. Oregon (6-0; Prev 3). Oregon has scored at least 42 points in every game and has a great second-half defense. Although Tennessee isn’t the Tennessee that Peyton Manning played for, credit is due for playing them on the road, and a nice win against Arizona has them in my top 2.
  3. Oklahoma (6-0; Prev 4). Texas showed that they were better than advertised this week by beating Nebraska, making Oklahoma’s win against the Longhorns a bit more impressive.
  4. TCU (7-0; Prev 9). Texas Christian has out of conference wins against Pac-10 and Big-12 teams. And while SMU isn’t a top 10 (or even 20) caliber team, they aren’t chopped liver so that is another good out of conference win. I’m looking forward to the Utah-TCU game later this season.
  5. Utah (6-0; Prev Unranked). Utah’s opening win against Pitt might not be as impressive as it looked at the time with Pitt only being 3-3, but they still get credit for scheduling. That, combined with rolling over the rest of their schedule, moves Utah into my top 10. BTW: Wow, has BYU dropped down this year.
  6. Missouri (6-0; Prev Unranked). Missouri keeps rolling through the early part of the schedule, which included a Big 10 foe (Illinois) and some unimpressive out of conference schools. At least when they played McNeese State it was in the second week of the season.
  7. Auburn (7-0; Prev unranked). Only tough out of conference game was an overtime win at home against Clemson, but they put up 65 this week against Arkansas. I thought the SEC was supposed to have good defense?
  8. LSU (7-0; Prev 5). If you play McNeese State out of conference mid-season and they hang tough with you for a while, you drop. Especially since my #6 beat the same common opponent by 44. They would have dropped further, but I still give them credit for scheduling West Virginia.
  9. Michigan State (7-0; Prev Unranked). Only good out of conference win was against Notre Dame, but they have looked good in the Big 10(11) conference schedule thus far.
  10. Oklahoma State (6-0; Prev Unranked). The Cowboys are going to start facing much tougher competition next week when Nebraska comes to town.

A History of OCLC’s Ohio Tax Exemption Status

The Disruptive Library Technology Jester has an interesting look at A History of the OCLC Tax-Exemption Status. As the author points out, it is but one version of the history. However, it is the best one I have seen and worth a look if you are interested in these sorts of things.

Code4Lib Journal, Issue 11

Issue 11 of the Code4Lib Journal is now available. The contents are as follows:

Editorial Introduction – A Cataloger’s Perspective on the Code4Lib Journal

Kelley McGrath
On the Code4Lib Journal, technology, and the universe of library cataloging and metadata.

Interpreting MARC: Where’s the Bibliographic Data?

Jason Thomale
The MARC data format was created early in the history of digital computers. In this article, the author entertains the notion that viewing MARC from a modern technological perspective leads to interpretive problems such as a confusion of “bibliographic data” with “catalog records.” He explores this idea through examining a specific MARC interpretation task that he undertook early in his career and then revisited nearly four years later. Revising the code that performed the task confronted him with his own misconceptions about MARC that were rooted in his worldview about what he thought “structured data” should be and helped him to place MARC in a more appropriate context.

XForms for Libraries, An Introduction
Ethan Gruber, Chris Fitzpatrick, Bill Parod, and Scott Prater
XForms applications can be used to create XML metadata that is well-formed and valid according to the schema, and then saved to (or loaded from) a datastore that communicates via REST or SOAP. XForms applications provide a powerful set of tools for data creation and manipulation, as demonstrated by some projects related to library workflows that are described in this paper.

Why Purchase When You Can Repurpose? Using Crosswalks to Enhance User Access

Teressa M. Keenan
The Mansfield Library subscribes to the Readex database U.S. Congressional Serial Set, 1817-1994 (full-text historic reports of Congress and federal agencies). Given the option of purchasing MARC records for all 262,000 publications in the Serial Set or making use of free access to simple Dublin Core records provided by Readex, the library opted to repurpose the free metadata. The process that the Mansfield Library used to obtain the Dublin Core records is described, including the procedures for crosswalking the metadata to MARC and batch loading the bibliographic records complete with holdings information to the local catalog. This report shows that we successfully achieved our goals of dramatically increasing access to Serial Set material by exposing metadata in the local catalog and discusses the challenges we faced along the way. We hope that others tasked with the manipulation of metadata will be able to use what we learned from this project.
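The crosswalking step described in the abstract can be pictured with a small sketch. This is not the Mansfield Library’s actual workflow; the `DC_TO_MARC` field mapping below is a simplified, hypothetical one for illustration, and real crosswalks handle many more elements, indicators, and edge cases.

```python
# A minimal, illustrative Dublin Core -> MARC crosswalk sketch.
# The mapping below is hypothetical, not the Mansfield Library's.
DC_TO_MARC = {
    "title":   ("245", "a"),
    "creator": ("100", "a"),
    "date":    ("260", "c"),
    "subject": ("650", "a"),
}

def crosswalk(dc_record):
    """Convert a dict of Dublin Core elements into a sorted list of
    MARC-like (tag, subfield, value) tuples, one per repeated element."""
    fields = []
    for element, values in dc_record.items():
        if element not in DC_TO_MARC:
            continue  # skip elements with no mapping
        tag, subfield = DC_TO_MARC[element]
        if isinstance(values, str):
            values = [values]  # treat a single value like a list of one
        for value in values:
            fields.append((tag, subfield, value))
    return sorted(fields)

record = {
    "title": "Annual Report of the Secretary of War",
    "creator": "United States. War Dept.",
    "date": "1870",
    "subject": ["United States -- Armed Forces"],
}
print(crosswalk(record))
```

A real batch load would also generate holdings data and serialize the result as actual MARC (for example with a library such as pymarc), but the core idea is the same element-by-element mapping.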

Hacking Summon
Michael Klein
When the Oregon State University Libraries selected Serials Solutions’ Summon as its discovery tool, the implementation team realized that they had an opportunity to implement a set of “hacks” that would improve the overall user experience. This article will explore the space between Summon’s out-of-the-box user interface and full developer API, providing practical advice on tweaking configuration information and catalog exports to take full advantage of Summon’s indexing and faceting features. The article then describes the creation of OSUL’s home-grown open source availability service which replaced and enhanced the availability information that Summon would normally pull directly from the catalog.

Automatic Aggregation of Faculty Publications from Personal Web Pages

Najko Jahn, Mathias Lösch, and Wolfram Horstmann
Many researchers make their publications available on personal web pages. In this paper, we propose a simple method for the automatic aggregation of these documents. We search faculty web pages for archived publications and present their full text links together with the author’s name and short content excerpts on a comprehensive web page. The excerpts are generated simply by querying a standard web search engine.
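The harvesting approach the abstract describes — scanning faculty pages for full-text links — can be sketched with the standard library alone. The HTML below is made up for illustration, and a real aggregator would fetch live pages and do much more filtering.

```python
# A minimal sketch of harvesting publication links from a faculty
# web page: collect hrefs that look like full-text documents.
from html.parser import HTMLParser

class PubLinkParser(HTMLParser):
    """Collect href values that look like full-text publication links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.lower().endswith((".pdf", ".ps")):
                self.links.append(href)

# Made-up sample page standing in for a fetched faculty home page.
page = """
<html><body>
  <a href="/papers/metadata-aggregation.pdf">Metadata aggregation</a>
  <a href="/teaching/">Teaching</a>
  <a href="/papers/oai-pmh-notes.PDF">OAI-PMH notes</a>
</body></html>
"""

parser = PubLinkParser()
parser.feed(page)
print(parser.links)  # the two PDF links, in page order
```

The content excerpts the authors mention would then come from querying a standard web search engine for each harvested document, which is outside the scope of this sketch.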

Managing Library IT Workflow with Bugzilla
Nina McHale
Prior to September 2008, all technology issues at the University of Colorado Denver’s Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria’s IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.

ASIS&T 2010 early-bird registration ends tomorrow

The last day to register for ASIS&T 2010 at the early bird rate is tomorrow, September 17. I know many think of ASIS&T as a conference primarily for LIS faculty, and that may be true, but I find it very valuable as a practitioner. I think more people working in the field should attend: they would get a lot out of it, and the researchers would get some valuable real-world input into their projects. For more details see the ASIS&T 2010 program or Facebook page.

Call for chapters: Getting started with cloud computing: A LITA guide

Dear Librarian Colleagues:

Consider writing a chapter for the forthcoming book, “Getting started with cloud computing: A LITA guide”.

Edward Corrado and Heather Moulaison, editors, are looking for 8-12 page (double spaced standard font) chapters on either:

1. Applications and services used by librarians in the cloud and how they might be used in a variety of libraries, including information on:

a. The tool itself (what it does, why it could be of use to libraries)
b. Why librarians should know about this application or service

2. Descriptions of best practices/ok practices/not good practices in using cloud services, including information on:

a. The background to the project: Describe your library, your collection, your resources, or any other element that will be necessary to understand what you did and why

b. The project: Describe what you did, why you did it, who did what, and how, being sure to mention any special funding you needed or resources you used

c. The assessment: How have you assessed your project and what are the results of that assessment

Possible topics: Using Amazon S3 for backups/storage, Hosting Websites, blogs, wikis, etc., in the Cloud, Hosting Library Subject Guides in the Cloud, Using Google Docs and other Google Applications, etc.

Examples can focus on all kinds of libraries, including public, special, museum, academic, etc.

Projected deadline for chapter: Nov. 1, 2010.

Authors will receive a copy of the book as compensation.

If you are interested in submitting an idea for consideration, please send a rough outline of your proposed chapter to before Sept. 15, 2010. Clearly indicate in your email your name, contact information, and any other information the editors should take into
consideration about the context of your proposal.

Staying Free from “Corporate Marketing Machines”: Library Policy for Web 2.0 Tools

The presentation, Staying Free from “Corporate Marketing Machines”: Library Policy for Web 2.0 Tools, that Heather Lea Moulaison and I gave at the Marketing Libraries in a Web 2.0 World IFLA Satellite conference is available on codabox. Here is the citation:

Moulaison, Heather Lea and Corrado, Edward M. (2010) Staying Free from “Corporate Marketing Machines”: Library Policy for Web 2.0 Tools. In: Marketing Libraries in a Web 2.0 World, 7-8 August 2010, Stockholm University, Sweden. Available at

New Article: SkyRiver and Innovative Interfaces File Antitrust Suit Against OCLC

I just had an article, SkyRiver and Innovative Interfaces File Antitrust Suit Against OCLC, published as an Information Today NewsBreak. The introduction states:

SkyRiver Technology Solutions filed a complaint for Federal and State antitrust violations and unfair competition against OCLC in United States District Court, Northern Division of California on July 28. The suit [1] alleges that OCLC is “unlawfully monopolizing the bibliographic data, cataloging service and interlibrary lending markets and is attempting to monopolize the market for integrated library systems by anticompetitive and exclusionary agreements, policies and practices.” Innovative Interfaces, Inc. is listed as a co-plaintiff. OCLC released a statement on July 29 saying that it hadn’t reviewed the complaint yet and after it reviews the complaint and “have had an opportunity to review the allegations with its legal counsel, a statement in response will be forthcoming.” This suit could have major implications in the library software and technology services industry. If the suit is successful, OCLC may have to provide for-profit firms access to the WorldCat database and there could be implications for OCLC’s status as a non-profit cooperative.

Please go to the Information Today Web site to read the whole article.

SkyRiver files antitrust lawsuit against OCLC

When the SkyRiver bibliographic utility was first announced, I thought this would eventually lead to some sort of legal action. What I didn’t know is who would be the first to bring legal action and against whom. Well, now we know. SkyRiver, joined by Innovative Interfaces, has filed a lawsuit in federal court in San Francisco.

The likelihood of a lawsuit seemed more certain after the fees OCLC wanted to charge some of the first customers of SkyRiver, like Michigan State University and California State University, Long Beach, to upload holdings. According to SkyRiver’s press release about the lawsuit (pdf), OCLC quoted them a price increase of over 1100%. I’m not a legal scholar and don’t know any details of the actual filing, so I don’t know what will happen, but it certainly will be interesting and will be a game changer. I also don’t expect it to have a quick outcome.

I didn’t see a press release from Innovative Interfaces yet, but I am sure that one of the reasons the company joined the lawsuit was the new OCLC Web-scale Management Services, which directly competes with the traditional ILS.* Honestly, I was really surprised that the new OCLC system didn’t create a bigger buzz, because in my mind it is a game changer. OCLC, with control of so many bibliographic records created by members via its WorldCat platform, is in a position to leverage WorldCat and a tremendous amount of data in ways other vendors simply can’t, especially if SkyRiver’s antitrust claims are accurate. I also think the whole WorldCat record use policy fiasco over the last year or so has added to the factors leading to this lawsuit.

As far as I know, OCLC also hasn’t made a public response as of yet.

I plan on following this story closely because, as I mentioned earlier, however it turns out it will be a game changer. If OCLC prevails, startups like SkyRiver won’t have a fair chance. If SkyRiver prevails, we could see a major restructuring of the services that OCLC provides and possibly even a breakup of OCLC.

For information about the lawsuit from SkyRiver, check out the Web site they created about it, called Choice for Libraries.

* Yes, I know that SkyRiver and Innovative Interfaces are owned by the same people, but they are different companies.

E-mail Signature Files

I decided to update my work e-mail signature file to reflect my new job title and at the same time make it automatically attach via my e-mail clients (Mozilla Thunderbird and the Gmail interface). While doing so I decided to look at what the prevailing thought on e-mail signatures is. Using a Google Search I picked out about ten Web pages/blog posts to review on this subject and this is what I found.

Things that all or almost all of the posts agreed upon:

  1. Name (obvious, no?)
  2. Professional Title / Position
  3. Website URL (One or two people said it wasn’t needed but most thought this was good. Personally I think unless it is a small company with a minimal Web site you should include it. For example, finding the Library Web site on a large University site can sometimes take a while).
  4. Phone number (Possibly also Mobile and Fax numbers. Joshua Dorkin pointed out “If you’re not willing to include a phone number with an email, then who on earth can take you seriously?“).
  5. Keep the signature from 4 to 6 lines

Here are some things that some people thought were appropriate and others did not:

  1. E-mail Address (Some people said it is in the header, but others pointed out that some e-mail clients hide it and once the mail gets forwarded, the e-mail address may no longer be there. Personally, I decided to include it).
  2. Instant Messaging Names (I didn’t see anyone say not to include them, but only a few mentioned them. Nathan Jones pointed out you should only include one. I would just say if you use IM all the time it makes sense, but if you are a light or even moderate user, probably not).
  3. Mobile Note (As with IM, I didn’t see anyone say not to include it, but not everyone mentioned it. Nathan Jones writes “I think it’s a good idea to add a small note at the bottom of the signature that indicates that the email is being sent from your mobile phone.” The thought is that people will be more forgiving of small typos and short responses).
  4. Sig Separators (Again, no one said not to use them but I was surprised by how many didn’t mention them at all).

Here are some things with more disagreement, where the leaning was not to include the following:

  1. Business address (More people in my small sample didn’t like the idea of a street address than did, but it was up for debate. Joshua Dorkin wrote “While it helps to know where someone’s physical presence is, in the current day and age people aren’t using snail mail as often as they used to. Mailing addresses are great to have, but not 100% necessary.” Others thought it depended on how hard it would be to find out the address or if people are likely to want to come visit you. Personally, I included it because people may not know where Binghamton University is otherwise, and if I’m going to include “Binghamton NY, USA” I might as well add a P.O. Box and ZIP code. Besides, how often do I see complaints about job postings that don’t include addresses, or people getting schools with similar names confused?).
  2. Quotes, mottos, etc. (Judith at specifically pointed out not to “use inflammatory quotes in your signature file.” I see a lot of professional e-mail with quotes that might not be inflammatory, but definitely could turn some people off. On your personal e-mail to friends and family that is your choice but I don’t think it is appropriate for professional e-mail. I just say no to quotes in professional email signatures).
  3. Branding via color or images (Some thought minimal levels of branding such as fonts matching the organizations color or a small image are okay, but all agreed that too much is too much)
  4. Closing sentiment (Some posts mentioned that the “first line of an email signature should be a closing sentiment, such as ‘Thank you,’ or ‘Sincerely.’” Personally, I don’t agree. If I want a closing sentiment, I’ll type it myself and make sure it is appropriate for the situation).
  5. Formatting (Surprisingly, not too many people mentioned formatting. One person who did was Judith at who said you should “align your sig’s text with spaces rather than tabbing […] Also keep in mind that you want to keep your sig file to 70 characters or less, as that is the set screen width default for most email programs.” I think the 70-character-wide rule is a good one to keep in mind).
  6. Degrees (Most people thought listing things like MBA looked arrogant, if for no other reason than because it is uncommon – at least in the United States. However, these people didn’t work at universities as far as I could tell. I think the attitude in academia about this would be different than in corporations, so I see no harm in listing MLS, MBA, EdD, PhD, etc. in the library world. I chose not to list my MLS, but if I had a doctorate I might have chosen differently).

Anyway, if you are interested this is what I came up with…

Edward M. Corrado
Assistant Director for Library Technology
Binghamton University Libraries
P.O. Box 6012, Binghamton, NY 13902 USA
Phone: +1-607-777-4909 | Fax: +1-607-777-4848 |
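Two of the guidelines above are mechanical enough to check automatically: keep the signature 4 to 6 lines, and keep every line to 70 characters or less. Here is a quick sketch that tests my signature against them; the limits are the ones quoted in the post, not any formal standard.

```python
# Check a signature block against two guidelines from the post:
# 4-6 lines long, and no line wider than 70 characters.
SIGNATURE = """\
Edward M. Corrado
Assistant Director for Library Technology
Binghamton University Libraries
P.O. Box 6012, Binghamton, NY 13902 USA
Phone: +1-607-777-4909 | Fax: +1-607-777-4848"""

def check_signature(sig, min_lines=4, max_lines=6, max_width=70):
    """Return a list of guideline violations (empty means it passes)."""
    lines = sig.splitlines()
    problems = []
    if not (min_lines <= len(lines) <= max_lines):
        problems.append(f"{len(lines)} lines (want {min_lines}-{max_lines})")
    for i, line in enumerate(lines, 1):
        if len(line) > max_width:
            problems.append(f"line {i} is {len(line)} chars (max {max_width})")
    return problems

print(check_signature(SIGNATURE) or "signature passes")  # prints "signature passes"
```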

Preserving Electronic Records in Colleges and Universities workshop

On Friday, July 9 I went to a Preserving Electronic Records in Colleges and Universities workshop held at Cornell University and sponsored by the New York State Archives. The workshop was presented by Steve F. Goodfellow, President, Access Systems, Inc. The workshop was well organized and Steve Goodfellow did a good job presenting the material. In some respects, I can’t say I learned a whole lot, especially on the technology side, but the workshop was more than worthwhile, if only to have some of my thoughts on the issue reinforced by an expert. I did, however, learn about some policy considerations and retention schedules.

During a break I talked with Steve and we agreed that while the technology is important and there are technological challenges, electronic preservation is really more of a policy challenge than a technological one. If the policies are in place and carried out (which includes the proper funding), the technology can be worked out. That is not to say the technological solutions are always worked out properly. During the first part of the workshop we discussed cases where they weren’t. One example: a client of his had an old student records system and thought they had migrated everything. However, they kept the old system around for old lookups “just in case.” Well, a new CIO came in and asked when it was last used. The answer was not in a long time, so the old system was removed. Guess what happened? Not everything had been migrated, and now they didn’t have it anymore.

One of the big takeaways for me was the set of fundamental goals of an electronic records preservation system identified during the workshop. The three are:

  1. Readability of electronically stored records
  2. Authoritative & trustworthy process
  3. Maintain a secure and reliable repository

These are to a large degree obvious, but if you are embarking on an electronic preservation program, you should identify how you are accomplishing these goals.
