Looking back at Blacklight selection and implementation

In a couple weeks (cross fingers, knock on wood) we are going live with our Blacklight implementation, completely replacing the legacy OPAC in user-facing services. Blacklight is based on Rails and Solr.

A colleague at another institution asked a couple of questions about how we selected Blacklight and what our experience has been implementing it. I thought it would be useful to share my answers publicly. These answers are only my opinion; they do not represent the collective opinion of my institution, and other people at my institution may have different evaluations.

1. How did you decide on Blacklight as your next-gen catalog? What made it stand out over the alternatives?

We actually undertook the evaluation in which we decided to go with Blacklight a while ago, about two and a half years ago, if memory serves.

At that time, we evaluated WorldCat Local, Primo, AquaBrowser, VuFind, and Blacklight. Summon did not yet exist, and neither did Primo's aggregated-index Primo Central component.

At that time, it looked to us like the commercial/proprietary offerings would take _just as much_ local development and customization to give them the features we wanted (including integrating well into our existing web services infrastructure) as the open source offerings. So we’d be paying a bunch of money, but not actually saving any staff time.  At least that was true of Primo — with WorldCat Local it was unclear whether it was even possible to get the features and integration we wanted at all.  (Again, this was the state of those products ~2 years ago; they may not be the same now.)

So we decided we might as well go open source.

At that time Blacklight looked like the more flexible and powerful product, compared to VuFind, with a healthier development effort (at that time, it looked like VuFind had devolved into a bunch of separate local forks with little sharing of bugfixes or improvements; I think the VuFind development community has recovered somewhat since then).

So we chose to go with Blacklight.  Now, in retrospect, what we concluded from our evaluation may not have been completely accurate at the time (let alone accurate now 2-3 years later when all products involved have changed).  I’ll talk more about that later.

But note also that Summon did not exist then.  If Summon had existed, it would have made our choice somewhat harder.  It still would have been true, I think, that it would have taken just as much local development staff time to make Summon what we wanted (and some things we wanted would have been just plain impossible) — but Summon also offers something of very high value that none of the products we were evaluating at that time offered: “non-local metadata” well-integrated into the user’s search experience, primarily journal articles, but also HathiTrust, etc.  So it would have been much more difficult to decide what the right trade-off was, and this is still something we think about.  The same goes for WorldCat Local, which has increased its “non-local metadata” offerings, and for Primo Central.

2. How have you found maintenance? You’ve been through some upgrades for example, and have done a fair amount of customizing. Could you say something about what that’s been like, and what kind of resources are required?

So we didn’t expect it would take ~2 years to go live with our Blacklight implementation.  We’re about to go live in a couple weeks, completely replacing our legacy OPAC.

Blacklight was, at that time, not nearly as mature a product as we had thought it was; things we thought were already done, or would Just Work for us, or would require limited configuration/customization, ended up requiring extensive development instead.

Now, two years later, Blacklight is a LOT more mature.  In part that’s because, throughout the time I spent on development, I tried to push as much as possible back into Blacklight to more easily support the customizations and features we needed. Other Blacklight developers have been doing a lot too. So it definitely wouldn’t take you as long to get where we’ve gotten as it would have if you’d started two years ago; Blacklight’s come along since then.

And we are fairly happy with what’s resulted:  I think we have the features we wanted (including integration with patron-specific features from the ILS so tight that it appears to be one unified application), and I think it probably would have taken us just as long with a proprietary offering, and some of it would not have been possible at all. There are some features we’d like and still don’t have, like ‘browse search’, that also are not available in any proprietary offering I know about.  I’ve got ideas for how we could add them to Blacklight, time permitting; with a proprietary offering we’d just have to wait and hope on the vendor.

I’m not able to compare contemporary Blacklight to contemporary VuFind, though; they likely each have strengths and weaknesses, but I couldn’t say.

As far as resources required, it’s worth saying that I was essentially the only local staff member working on Blacklight (with some assistance from one local colleague who helped with some tasks but didn’t do much actual development).  And I’m also (pretty much solely) responsible for managing our legacy OPAC, SFX, and Metalib. You may be better staffed than we are, I’m not sure.  We were definitely pushing our capacity, and doing something with limited staff means a longer project time, for sure.

However, I suspect that it’s probably a general rule that customizations to a product like this will always take you longer than you expect — whether it’s a proprietary product or an open source product with no support vendor.  I think probably the only way to avoid that is to not do customizations or really minimize the customizations as much as possible, just use the out-of-the-box product — again whether proprietary or open source.

Back to Blacklight: as far as expertise for development and maintenance goes, in my experience you will need to develop a fair amount of local comfort/expertise with Solr.  If you’re going to do extensive customizations, you’ll definitely also need a fair amount of local comfort/expertise with Ruby and Rails.
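To give a concrete sense of the kind of Solr comfort I mean: under the hood, a Blacklight-style app ultimately issues HTTP requests against Solr’s select handler, and much of the customization work is reading and tweaking those query parameters. Here’s a rough sketch in plain Ruby of what such a request URL looks like — the field names (`title_t`, `author_t`, `format`) and boosts are hypothetical examples, not our actual index configuration:

```ruby
require 'uri'

# Sketch of the kind of query a Blacklight-style app sends to Solr's
# /select handler. All field names and boost values here are made up
# for illustration.
def solr_select_url(solr_base_url, user_query)
  params = {
    'q'           => user_query,
    'defType'     => 'dismax',                        # forgiving parser for user-entered queries
    'qf'          => 'title_t^100 author_t^50 text',  # query fields, with boosts
    'facet'       => 'true',
    'facet.field' => 'format',                        # build a "Format" facet sidebar
    'rows'        => 10,
    'wt'          => 'json'
  }
  "#{solr_base_url}/select?#{URI.encode_www_form(params)}"
end

puts solr_select_url('http://localhost:8983/solr/catalog', 'civil war')
```

Customizing relevance ranking, adding a facet, or debugging a surprising result set all come down to understanding parameters like these, which is why I say Solr comfort is non-negotiable for this kind of work.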

