I know, I know. There is no reason to be nasty, but Beckett’s site is a disaster. Everyone knows it, and even Beckett management made a public announcement today about future plans for its portal. It is shocking that Beckett’s web capabilities are as poor as they are. Beckett is very fortunate that no competitor has been able to capitalize on this long-standing problem.
The Beckett announcement focuses on the sheer amount of data the site must handle as well as a rapidly growing user base. I’m sure those are valid issues and major sore points. My problem with their site is how users apply filters to their searches. Each filter must be applied separately with a new search; just getting the correct year can require three separate clicks and filter changes. This goes for both the main Beckett site and their Marketplace. It is no wonder that they have traffic issues when their site forces users to apply multiple filters, each of which automatically triggers a separate page refresh. To perform one search, a user must hit the website repeatedly instead of just once. The same goes for display options: I prefer a list view with 128 items per page, and setting that costs two page refreshes instead of one. Simply redesigning search so that filters are batched into a single request would reduce traffic considerably.
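To make the point concrete, here is a minimal sketch of the batching idea. The endpoint and parameter names are hypothetical, not Beckett’s actual API; the only claim is that collecting every selected filter into one query string means one request per search instead of one request per filter change.

```python
from urllib.parse import urlencode

# Hypothetical search endpoint, for illustration only.
BASE = "https://example.com/search"

def search_url(filters: dict) -> str:
    """Combine all selected filters into a single query string,
    so the whole search costs one request instead of one per filter."""
    # Sorting keeps the URL deterministic, which also helps caching.
    return f"{BASE}?{urlencode(sorted(filters.items()))}"

# Year, view mode, and page size travel together in one request:
url = search_url({"year": 1989, "view": "list", "per_page": 128})
print(url)  # → https://example.com/search?per_page=128&view=list&year=1989
```

The same effect can be had without an API change by deferring the page refresh until the user clicks a single “Apply filters” button rather than refreshing on every individual selection.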
Of course, I wouldn’t be so aware of how many page refreshes are required if each refresh weren’t so slow.
As much as I’d like to complain more about the Beckett site, I have to admit that its database is awesome. I hope they can design a decent means of retrieving data from it.