College Football Top 10

Here is my top 10 going into today’s action; again without comment….

1. Boise State (7-0; Prev 1)
2. Oregon (8-0; Prev 2)
3. TCU (9-0; Prev 3)
4. Utah (8-0; Prev 5)
5. Auburn (8-0; Prev 6)
6. Wisconsin (7-1; Prev 10)
7. Oklahoma (7-1; Prev 8)
8. Arizona (7-1; Prev 9)
9. Alabama (7-1; Prev Unranked)
10. Nebraska (7-1; Prev Unranked)

College Football Top 10

I made a college football top ten early last week, but I never got around to annotating it because I was away at a conference. So before today’s action, here is last week’s list without comment:

1. Boise State (6-0; Prev 1) (now 7-0 after winning Tuesday)
2. Oregon (7-0; Prev 2)
3. TCU (8-0; Prev 4)
4. Missouri (7-0; Prev 6)
5. Utah (7-0; Prev 5)
6. Auburn (8-0; Prev 7)
7. Michigan State (8-0; Prev 9)
8. Oklahoma (6-1; Prev 3)
9. Arizona (6-1; Prev Unranked)
10. Wisconsin (7-1; Prev Unranked)

College Football Top 10

Oops… I forgot to publish a top 10 last week. Sorry about that. I had one ready to go but wanted to check on a few things and never got it into the blog. There were a number of upsets in the “Top 25” this weekend. The only one I was really surprised about was South Carolina losing. I sort of thought Ohio State might lose, as they hadn’t played anyone like Wisconsin yet and they were on the road. At least that removes the potential of two Big 10(11) teams going through the whole season and not playing each other (what a mess that would be). FWIW: The “Prev” is where I had them two weeks ago, since I didn’t publish last week.

  1. Boise State (6-0; Prev 1). Virginia Tech is rolling through the early part of their ACC schedule, so I still think that was a good win despite some questions about VaTech after their loss to James Madison. Although Oregon State lost in overtime to Washington last night to go to 3-3, I still think they are a good team. Losses to Boise, TCU, and Washington won’t get you in the top 10, but they are nothing to be ashamed of.
  2. Oregon (6-0; Prev 3). Oregon has scored at least 42 points in every game and has a great second half defense. Although Tennessee isn’t the Tennessee that Peyton Manning played for, credit for playing them on the road is due, and a nice win against Arizona has them in my top 2.
  3. Oklahoma (6-0; Prev 4). Texas showed they were better than advertised this week by beating Nebraska, thus making Oklahoma’s win against the Longhorns a bit more impressive.
  4. TCU (7-0; Prev 9). Texas Christian has out of conference wins against Pac-10 and Big-12 teams. And while SMU isn’t a top 10 (or even 20) caliber team, they aren’t chopped liver so that is another good out of conference win. I’m looking forward to the Utah-TCU game later this season.
  5. Utah (6-0; Prev Unranked). Utah’s opening win against Pitt might not be as impressive as it looked at the time with Pitt only being 3-3, but they still get credit for scheduling them. That, combined with rolling over the rest of their schedule, moves Utah into my top 10. BTW: Wow, has BYU dropped down this year.
  6. Missouri (6-0; Prev Unranked). Missouri keeps rolling through the early part of the schedule, which included a Big 10 foe (Illinois) and some out-of-conference schools that are not impressive. At least when they played McNeese State it was in the second week of the season.
  7. Auburn (7-0; Prev unranked). Only tough out of conference game was an overtime win at home against Clemson, but they put up 65 this week against Arkansas. I thought the SEC was supposed to have good defense?
  8. LSU (7-0; Prev 5). If you play McNeese State out of conference mid-season and they hang tough with you for a while, you drop. Especially since my #6 beat the same common opponent by 44. They would have dropped further, but I still give them credit for scheduling West Virginia.
  9. Michigan State (7-0; Prev Unranked). Only good out of conference win was against Notre Dame, but they have looked good in the Big 10(11) conference schedule thus far.
  10. Oklahoma State (6-0; Prev Unranked). The Cowboys are going to start facing much tougher competition starting next week when Nebraska comes to town.

A History of OCLC’s Ohio Tax Exemption Status

The Disruptive Library Technology Jester has an interesting look at A History of the OCLC Tax-Exemption Status. As the author points out, it is but one version of the history. However, it is the best one I have seen and worth a look if you are interested in these sorts of things.

College Football Top 10

Welcome to the first installment of my college football top 10 for 2010. As a reminder, my rankings are based on what you did on the field, not whether you could beat X. If you think you can beat X, schedule them and beat them, and I’ll rank you higher. This does mean, though, that my early top 10 rankings are unlikely to look very similar to the ones at the end of the season. The top concern is wins, followed by whom you played, with an emphasis on out-of-conference games because that is what you can control. Until the SEC lets all teams join the SEC and share revenue equally, I don’t want to hear how you beat each other up in conference. It is your choice to be in there and your collective choice to keep other teams out. This is especially true when you have teams that lose at home to Jacksonville State. That is not to say in-conference games don’t matter, but I am more impressed when you go on the road out of conference to play a tough team instead of playing teams that aren’t even in an FBS conference. However, if you play good teams out of conference and your conference is stronger than someone else’s, you’ll rate favorably. Without further ado, here is my first top 10.

  1. Boise State (4-0). Virginia Tech’s loss to James Madison has me questioning how good of a win this was, but since VaTech has won three straight after that, including 2 ACC games, I think the James Madison loss was a hangover from losing the Boise State game. So I’ll give the Broncos credit for playing what was basically a road game against an AP top-10 ranked team to start the season. They also followed it up with an impressive win over the Pac-10’s Oregon State University.
  2. Alabama (4-0). I’m not impressed with Penn State, but give Alabama credit for scheduling them even if it was played in Alabama. A blowout of the ACC’s Duke isn’t that impressive but beating in-conference foe Florida counts for something.
  3. Oregon (5-0). Oregon has scored at least 42 points in every game and has a great second half defense. Although Tennessee isn’t the Tennessee that Peyton Manning played for, credit for playing them on the road is due. I’ll also give them a pass for scheduling Portland State since they are another Oregon school. New Mexico as the other out-of-conference game keeps them out of the top 2, however.
  4. Oklahoma (5-0). I really didn’t want to put the Sooners this high, as with the exception of the Texas game I haven’t been overly impressed by their scheduling (and beating) of Florida State and Cincinnati.
  5. LSU (5-0). Nice wins against North Carolina and West Virginia out of conference. Both were at home though and UNC was without a bunch of players, but a win is a win and credit for scheduling them is due.
  6. Nevada (5-0). Beat California out of the Pac-10 and BYU out of conference. Even though BYU isn’t as good as usual this year, that still counts for something.
  7. Nebraska (4-0). Only tough out of conference matchup was against Washington (who just beat USC) but that was a good win and at least San Diego State isn’t the Citadel.
  8. Arizona (4-0). I doubt Arizona will stay in my top 10 for long, but a good win against an AP top-10 Iowa gets them on my list. Wins over the Citadel and Toledo keep them from being higher.
  9. TCU (5-0). Texas Christian has out of conference wins against Pac-10 and Big-12 teams. And while SMU isn’t a top 10 (or even 20) caliber team, they aren’t chopped liver so that is another good out of conference win.
  10. Ohio State (5-0). Not really sure what to think of their one good win – an out-of-conference triumph over Miami (FL) – as I don’t know if Miami is as good as some people seem to think they are.

Code4Lib Journal, Issue 11

Issue 11 of the Code4Lib Journal is now available. The contents are as follows:


Editorial Introduction – A Cataloger’s Perspective on the Code4Lib Journal
Kelley McGrath
On the Code4Lib Journal, technology, and the universe of library cataloging and metadata.


Interpreting MARC: Where’s the Bibliographic Data?
Jason Thomale
The MARC data format was created early in the history of digital computers. In this article, the author entertains the notion that viewing MARC from a modern technological perspective leads to interpretive problems such as a confusion of “bibliographic data” with “catalog records.” He explores this idea through examining a specific MARC interpretation task that he undertook early in his career and then revisited nearly four years later. Revising the code that performed the task confronted him with his own misconceptions about MARC that were rooted in his worldview about what he thought “structured data” should be and helped him to place MARC in a more appropriate context.

XForms for Libraries, An Introduction
Ethan Gruber, Chris Fitzpatrick, Bill Parod, and Scott Prater
XForms applications can be used to create XML metadata that is well-formed and valid according to the schema, and then saved to (or loaded from) a datastore that communicates via REST or SOAP. XForms applications provide a powerful set of tools for data creation and manipulation, as demonstrated by some projects related to library workflows that are described in this paper.


Why Purchase When You Can Repurpose? Using Crosswalks to Enhance User Access
Teressa M. Keenan
The Mansfield Library subscribes to the Readex database U.S. Congressional Serial Set, 1817-1994 (full-text historic reports of Congress and federal agencies). Given the option of purchasing MARC records for all 262,000 publications in the Serial Set or making use of free access to simple Dublin Core records provided by Readex, the library opted to repurpose the free metadata. The process that the Mansfield Library used to obtain the Dublin Core records is described, including the procedures for crosswalking the metadata to MARC and batch loading the bibliographic records complete with holdings information to the local catalog. This report shows that we successfully achieved our goals of dramatically increasing access to Serial Set material by exposing metadata in the local catalog and discusses the challenges we faced along the way. We hope that others tasked with the manipulation of metadata will be able to use what we learned from this project.
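
As an aside, the kind of crosswalk Keenan describes can be prototyped with very little code. The sketch below is a minimal, hypothetical illustration (not the Mansfield Library’s actual workflow): it parses a simple Dublin Core XML record with Python’s standard library and maps a handful of elements to MARC-style tags. The field mapping and the dc_to_marc helper are my own inventions for the example.

```python
# Minimal sketch: map a few Dublin Core elements to MARC-style fields.
# Hypothetical example only; a real crosswalk (like the one described above)
# would handle many more elements, indicators, and edge cases.
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"

# Illustrative mapping of DC elements to MARC tag/subfield pairs (assumed).
DC_TO_MARC = {
    "title":   ("245", "a"),
    "creator": ("100", "a"),
    "date":    ("260", "c"),
    "subject": ("650", "a"),
}

def dc_to_marc(dc_xml):
    """Return a list of (tag, subfield, value) tuples for one DC record."""
    root = ET.fromstring(dc_xml)
    fields = []
    for element, (tag, code) in DC_TO_MARC.items():
        for node in root.findall(f"{{{DC_NS}}}{element}"):
            if node.text and node.text.strip():
                fields.append((tag, code, node.text.strip()))
    return fields

if __name__ == "__main__":
    sample = f"""<record xmlns:dc="{DC_NS}">
      <dc:title>Report of the Secretary of War</dc:title>
      <dc:creator>United States. War Department</dc:creator>
      <dc:date>1823</dc:date>
    </record>"""
    for tag, code, value in dc_to_marc(sample):
        print(f"{tag} ${code} {value}")
```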

Hacking Summon
Michael Klein
When the Oregon State University Libraries selected Serials Solutions’ Summon as its discovery tool, the implementation team realized that they had an opportunity to implement a set of “hacks” that would improve the overall user experience. This article will explore the space between Summon’s out-of-the-box user interface and full developer API, providing practical advice on tweaking configuration information and catalog exports to take full advantage of Summon’s indexing and faceting features. The article then describes the creation of OSUL’s home-grown open source availability service which replaced and enhanced the availability information that Summon would normally pull directly from the catalog.
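
The availability-service idea generalizes well beyond Summon. Purely as an illustration (this is not OSUL’s code and it does not touch the Summon API), the sketch below uses Python’s standard library to stand up a tiny web service that returns JSON availability for a record ID; a discovery layer could query something like this for real-time status instead of relying on stale catalog exports. The lookup_availability function and its sample data are placeholders.

```python
# Hypothetical sketch of a tiny availability web service, in the spirit of
# the home-grown service described above (not the actual implementation).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Placeholder data; a real service would query the ILS or its database.
FAKE_ITEM_STATUS = {
    "b1234567": {"location": "Main Library", "status": "Available"},
    "b7654321": {"location": "Main Library", "status": "Checked out"},
}

def lookup_availability(record_id):
    """Return availability info for a record ID (stubbed out here)."""
    return FAKE_ITEM_STATUS.get(record_id, {"status": "Unknown"})

class AvailabilityHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expect requests like /availability?id=b1234567
        query = parse_qs(urlparse(self.path).query)
        record_id = query.get("id", [""])[0]
        body = json.dumps(lookup_availability(record_id)).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), AvailabilityHandler).serve_forever()
```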


Automatic Aggregation of Faculty Publications from Personal Web Pages
Najko Jahn, Mathias Lösch, and Wolfram Horstmann
Many researchers make their publications available on personal web pages. In this paper, we propose a simple method for the automatic aggregation of these documents. We search faculty web pages for archived publications and present their full text links together with the author’s name and short content excerpts on a comprehensive web page. The excerpts are generated simply by querying a standard web search engine.
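
The aggregation approach described here lends itself to a small prototype. Below is a rough sketch, using only Python’s standard library, of the first step the authors mention: scanning a faculty page for links to archived publications (naively, anything ending in .pdf). The example URL and the PDF heuristic are my own assumptions, and the excerpt-generation step via a web search engine is omitted.

```python
# Rough sketch: collect full-text (PDF) links from a faculty web page.
# Illustration only; the paper above also generates content excerpts by
# querying a web search engine, which is omitted here.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PdfLinkCollector(HTMLParser):
    """Collect href values that look like links to archived publications."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.pdf_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        if href.lower().endswith(".pdf"):          # naive full-text heuristic
            self.pdf_links.append(urljoin(self.base_url, href))

def harvest_publications(page_url):
    """Return absolute URLs of candidate publications on a faculty page."""
    with urlopen(page_url) as response:
        html = response.read().decode("utf-8", errors="replace")
    collector = PdfLinkCollector(page_url)
    collector.feed(html)
    return collector.pdf_links

if __name__ == "__main__":
    # Hypothetical faculty page URL; replace with a real one to try it out.
    for link in harvest_publications("https://www.example.edu/~someone/"):
        print(link)
```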

Managing Library IT Workflow with Bugzilla
Nina McHale
Prior to September 2008, all technology issues at the University of Colorado Denver’s Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria’s IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.

ASIS&T 2010 early-bird registration ends tomorrow

The last day to register for ASIS&T 2010 at the early bird rate is tomorrow, September 17. I know many think of ASIS&T as a conference primarily for LIS faculty, and that may be true, but I find it very valuable as a practitioner. I think more people working in the field should attend: they would get a lot out of it, and the researchers would get some valuable real-world input into their projects. For more details see the ASIS&T 2010 program or Facebook page.

Call for chapters: Getting started with cloud computing: A LITA guide

Dear Librarian Colleagues:

Consider writing a chapter for the forthcoming book, “Getting started with cloud computing: A LITA guide”.

Edward Corrado and Heather Moulaison, editors, are looking for 8-12 page (double spaced standard font) chapters on either:

1. Applications and services used by librarians in the cloud and how they might be used in a variety of libraries, including information on:

a. The tool itself (what it does, why it could be of use to libraries)
b. Why librarians should know about this application or service

2. Descriptions of best practices/ok practices/not good practices in using cloud services, including information on:

a. The background to the project: Describe your library, your collection, your resources, or any other element that will be necessary to understand what you did and why

b. The project: Describe what you did, why you did it, who did what, and how, being sure to mention any special funding you needed or resources you used

c. The assessment: How have you assessed your project and what are the results of that assessment

Possible topics: using Amazon S3 for backups/storage; hosting websites, blogs, wikis, etc. in the cloud; hosting library subject guides in the cloud; using Google Docs and other Google applications; etc.

Examples can focus on all kinds of libraries, including public, special, museum, academic, etc.

Projected deadline for chapter: Nov. 1, 2010.

Authors will receive a copy of the book as compensation.

If you are interested in submitting an idea for consideration, please send a rough outline of your proposed chapter to ecorrado@ecorrado.us before Sept. 15, 2010. Clearly indicate in your email your name, contact information, and any other information the editors should take into consideration about the context of your proposal.

Little Things Matter

Dear Open Source Software developer,

Little things matter. If you want me to take your project seriously, doing the little things right can make a big difference. Here are just a few things you should try to do, IMHO. I know that some of these are not as fun as coding in your favorite programming language, but if they aren’t done, your project will be less likely to be chosen by me to use, and possibly contribute back to. Instead of making a long list, I will give you five of my pet peeves about Open Source projects. These are some little things that I have seen with a number of projects that could easily be rectified.

  1. Include complete install instructions. Don’t assume people know what you mean by things like “create a database and grant CREATE, UPDATE, DROP, etc.” Just tell people how to do it. People with more skills can read between the lines if they want to do it differently; you need to design documentation for people who are just learning. Also, don’t make me download the whole package to see the instructions. I tried to use one project that had instructions for installing on MySQL/Linux but was written on Solaris/Postgres. I naively assumed that they actually tested the Linux instructions. Wrong! After struggling with it for a day, I e-mailed the project’s list and learned they weren’t even tested and I should use Solaris/Postgres. I have nothing against Solaris/Postgres, but it would have been nice to know I should use it before wasting a day following instructions that weren’t even tested. I went elsewhere, even though I would probably have used the project on Solaris/Postgres if I had known that in the beginning.
  2. On your Website include (at least) basic operating instructions and screen shots. I recently saw an announcement for a new Open Source project that I thought sounded interesting, but when I went to the URL it was basically just a place to download the code. I at least want some idea of what it looks like and what it does besides being a statistics package, a time management package, or _____.
  3. Respond to posts on your forum/blog/e-mail list. Nothing screams dead project like questions and inquiries not being answered. It is possible they are answered in private, but someone investigating your project won’t know that. If it is asked in public, answer it in public (or at least post something saying you contacted the person privately).
  4. Keep people informed/updated. I went to an OSS project Web site last week and found a new version of the software to download, posted about a month ago, with absolutely no mention of how they got there or what was different about the new version from earlier versions. A quick post saying new files are available would have been nice. The same project’s home page says beta software for a new version is coming in Spring 2009 (it is now Summer 2010!). I think the beta version was actually version 1.1 and was posted in March, but who knows? Even if you are not actively releasing new code, a blog post with a hint or tip every now and then will add confidence.
  5. Treat users (even newbies) with respect. Not everyone is an expert with Open Source, but you should still treat them with respect. If they ask a dumb question, don’t attack them; ask them for more information in a friendly manner. Karl Fogel, in Producing Open Source Software, describes this as treating every user as a potential volunteer. He writes:

    A corollary of this is that developers should not express anger at people who file well-intended but vague bug reports. This is one of my personal pet peeves; I see developers do it all the time on various open source mailing lists, and the harm it does is palpable. Some hapless newbie will post a useless report:

    “Hi, I can’t get Scanley to run. Every time I start it up, it just errors. Is anyone else seeing this problem?”

    Some developer—who has seen this kind of report a thousand times, and hasn’t stopped to think that the newbie has not—will respond like this:

    “What are we supposed to do with so little information? Sheesh. Give us at least some details, like the version of Scanley, your operating system, and the error.”

    This developer has failed to see things from the user’s point of view, and also failed to consider the effect such a reaction might have on all the other people watching the exchange. Naturally a user who has no programming experience, and no prior experience reporting bugs, will not know how to write a bug report. What is the right way to handle such a person? Educate them! And do it in such a way that they come back for more:

    “Sorry you’re having trouble. We’ll need more information in order to figure out what’s happening here. Please tell us the version of Scanley, your operating system, and the exact text of the error. The very best thing you can do is send a transcript showing the exact commands you ran, and the output they produced. See http://www.scanley.org/how_to_report_a_bug.html for more.”

    This way of responding is far more effective at extracting the needed information from the user, because it is written to the user’s point of view.

These are just a few things that, if followed, I think will give a project a much better chance of being successful. Also, by doing this you may have a better chance of growing a community that can assist with things like documentation and answering questions from other users.

Sincerely,

Me

P.S. Anyone have their own pet peeves to share?

Staying Free from “Corporate Marketing Machines”: Library Policy for Web 2.0 Tools

The presentation, Staying Free from “Corporate Marketing Machines”: Library Policy for Web 2.0 Tools, that Heather Lea Moulaison and I gave at the Marketing Libraries in a Web 2.0 World IFLA Satellite conference is available on codabox. Here is the citation:

Moulaison, Heather Lea and Corrado, Edward M. (2010) Staying Free from “Corporate Marketing Machines”: Library Policy for Web 2.0 Tools. In: Marketing Libraries in a Web 2.0 World, 7-8 August 2010, Stockholm University, Sweden. Available at http://codabox.org/66/
