At UW-Madison we have a mixed environment. Our campus has invested in a Tableau 
Server instance, so we participate in that environment. We have a single 
viz/dashboard describing our library instruction activities:

https://dataviz.wisc.edu/views/UW-MadisonLibrariesInfoLiteracyInstruction_0/HomePage?:embed=y&:showAppBanner=false&:showShareOptions=true&:display_count=no&:showVizHome=no

For our internal-facing usage statistics, we tend to host our own web pages
and apps that use the D3.js library for data visualization. Here are a few
examples (with a small code sketch after them):

Collection Statistics
https://web.library.wisc.edu/sp/cca/

* Basic stats mostly about the size of our collection and its use
* Data spans over a decade across two ILSes (Voyager and Alma)
* Some visualizations are used to tell our story to campus admins or were
even requested by a provost [1]
* Others take an inside-baseball view intended for library staff with
knowledge of operations [2]

Journal Statistics
https://journalusage.library.wisc.edu/journals/991021974405202122

* A focused look at serials use
* Combines print and electronic usage for macro trends
* Rails backend
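
To give a flavor of the D3.js approach, here is a minimal sketch of the kind
of bar chart these pages are built from. The data values, the "#chart"
element id, and the field names are hypothetical, not our actual code; it
assumes a recent D3 (v5.8+, for selection.join):

    import * as d3 from "d3";

    // Hypothetical yearly loan counts; real pages would pull from ILS data.
    const data = [
      { year: 2016, loans: 420000 },
      { year: 2017, loans: 398000 },
      { year: 2018, loans: 371000 },
    ];

    const width = 600, height = 300, margin = 40;
    const svg = d3.select("#chart").append("svg")
      .attr("width", width).attr("height", height);

    // Band scale for years on x, linear scale for loan counts on y.
    const x = d3.scaleBand<number>()
      .domain(data.map(d => d.year))
      .range([margin, width - margin])
      .padding(0.1);
    const y = d3.scaleLinear()
      .domain([0, d3.max(data, d => d.loans) ?? 0])
      .range([height - margin, margin]);

    // One rect per data point, positioned and sized by the scales above.
    svg.selectAll("rect")
      .data(data)
      .join("rect")
      .attr("x", d => x(d.year) ?? 0)
      .attr("y", d => y(d.loans))
      .attr("width", x.bandwidth())
      .attr("height", d => height - margin - y(d.loans));

    // Axes are drawn from the same scales, so they always agree with the bars.
    svg.append("g")
      .attr("transform", `translate(0,${height - margin})`)
      .call(d3.axisBottom(x));
    svg.append("g")
      .attr("transform", `translate(${margin},0)`)
      .call(d3.axisLeft(y));

The point is that everything above is plain text: it diffs cleanly, lives in
a Git repo, and can be reviewed like any other code.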

If you will permit a longer reflection on this space: personally, I think
there are aspects of using the newer GUI tools like Tableau (or Microsoft's
Power BI, etc.) that are not talked about enough. Data visualization is an
extremely complex process. It involves:

* formulation of good analytical questions, 
* sophisticated knowledge of data sources and their associated formats, 
* data collection and processing techniques, and 
* presentation techniques for quantitative data (including aspects like color 
theory).

Tableau is good, opinionated software that attempts to encourage good practices 
over bad ones. And it can generate some very impressive data viz. But from a 
software development perspective, there are some highly problematic elements,
akin to the difference between managing data and computation in a tool like
Excel vs., say, a Python project in a Git repo (à la the Software/Data
Carpentry practices).

For example, in other software engineering processes that are as complex as
data visualization, professionals use both test-driven development and
version control. These practices and their associated tools mitigate the
risk of bugs and foster transparent, reproducible processes. Given that the
tasks Tableau is used for are equally complex, it is dangerous to use
processes that lack these safeguards.
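
As a concrete illustration of the kind of safeguard I mean, here is a sketch
of a unit test pinning down one data-munging step before it feeds a viz
(think of merging print and electronic counts, as our journal stats site
does). The function and the figures are hypothetical, not from our codebase:

    import { deepStrictEqual } from "node:assert";

    // Merge print and electronic use counts (keyed by year) into one total.
    function combineUsage(
      print: Record<string, number>,
      electronic: Record<string, number>,
    ): Record<string, number> {
      const years = new Set([...Object.keys(print), ...Object.keys(electronic)]);
      const totals: Record<string, number> = {};
      for (const year of years) {
        totals[year] = (print[year] ?? 0) + (electronic[year] ?? 0);
      }
      return totals;
    }

    // A year present in only one source must still appear in the totals --
    // exactly the kind of edge case that silently breaks a chart otherwise.
    deepStrictEqual(
      combineUsage({ "2018": 120 }, { "2018": 300, "2019": 450 }),
      { "2018": 420, "2019": 450 },
    );

A check like this runs on every commit, and version control gives you history
and review on top of it; the equivalent calculated field buried in a Tableau
workbook has no comparable harness, as far as I can tell.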

In addition, seen from a time-investment perspective, many of the skills one
gains using Tableau are not transferable to other software or data processing
and analytics work. It has its own UI idioms and terminology that don't
always line up with the language of statistics or programming in other
domains.

I understand that writing code is no small ask, and D3.js, while amazing, is
non-trivial to learn. But it is worth considering all aspects of the
infrastructure and whether your investment is a long game. For us, data
visualization has been a long-term proposition and a modernization of the
kinds of data reporting we have always done. And the investment is beginning
to provide a foundation for selectors and bibliographers to start developing
models for collection analysis.

[1] https://web.library.wisc.edu/sp/cca/lc-classification.html
[2] https://web.library.wisc.edu/sp/cca/loan-to-volume-ratios.html#All

> On Jun 25, 2019, at 6:17 PM, Natasha Allen <natasha.al...@sjsu.edu> wrote:
> 
> Hi folks,
> 
> Hopefully a quick question I'm doing some information gathering on. Are
> there any libraries out there currently utilizing an internal data
> dashboard for visualizing library statistics? If so, what program are you
> using for this purpose?
> 
> Thanks,
> 
> Natasha
> 
> ---
> Natasha Allen (she/her)
> System and Fulfillment Coordinator, University Library
> San José State University
> 1 Washington Square
> San José, CA 95192
> natasha.al...@sjsu.edu
> 408-808-2655
