Monday, February 26, 2018

More Python, CSS, and some Classes

Finally got back to more of a systems focus this week, even though I spent a little time in the classroom.

  1. My newest Python code is pretty cool: it now automatically restores and reindexes the various collections in our eXist database, and if there are any errors you see them as well. Below are the basic guts of the restore process:

    import subprocess

    import config as cfg

    # restore (the list of collections), backup_date, restore_program, and the
    # report file handle f are all set up earlier in the script
    for directory in restore:

        directory_backup = backup_date + "/db/" + directory + "/__contents__.xml"
        arguments = ["-u", cfg.login['user'], "-p", cfg.login['password'], "-r", directory_backup]
        command = [restore_program]
        command.extend(arguments)
        # restore the directory, sending stdout and stderr to the report file
        try:
            subprocess.check_call(command, stdout=f, stderr=f)
        except subprocess.CalledProcessError:
            # a non-zero exit is already recorded in the report file,
            # so just move on to the next directory
            pass


    Basically it writes the standard output and errors to a file, which the script then emails upon completion (sketched below).
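
    The emailing step isn't shown in the snippet above, so here is a minimal sketch of how it might look, assuming a local SMTP relay; the function name, addresses, and host are placeholders, not the actual code:

    import smtplib
    from email.mime.text import MIMEText

    def email_report(report_path, from_addr, to_addr, smtp_host="localhost"):
        # read the report file the restore loop wrote stdout/stderr into
        with open(report_path) as report:
            msg = MIMEText(report.read())
        msg["Subject"] = "eXist restore report: " + report_path
        msg["From"] = from_addr
        msg["To"] = to_addr
        # hand the message off to the SMTP relay
        server = smtplib.SMTP(smtp_host)
        try:
            server.sendmail(from_addr, [to_addr], msg.as_string())
        finally:
            server.quit()

    The report file would need to be closed (or at least flushed) before it is read and sent.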

  2. I also spent an afternoon last week dropping in the new CSS code for the Alma mash_up that controls the Get It / View It windows in our Primo instance. Thank you to Paul Ojennus of Whitworth University for creating it. Below is what our Get It window now looks like, a serious upgrade from before:


  3. I also got to drop into two classes this week: a Museum Studies course that might use Omeka, and a Civic Communication & Media class where I showed the students how to use Zotero.

Wednesday, February 21, 2018

Instagram, Zotero Table, Omeka Class Session

A lot went on this week that was not directly related to library systems work, so my systems work has slipped a bit. Here are three non-systems highlights from the past week:


  1. Through my work on the Marketing committee, we now have a library Instagram account; we are using the strategy in this article as our approach.
  2. Wednesday of this last week was Valentine's Day, so I decided to set up a "Zotero loves students" sign at a table in Goudy. I got a couple of people interested, and I now plan to do this at least once a month.
  3. I did an introductory Omeka session for a history class that will be creating exhibits on the history of Willamette.

Monday, February 12, 2018

encodeURIComponent, ssh keys, and Gift Books


  1. This first one surprised me because no one had mentioned it before, but I think that is because most librarians go straight to Advanced Search for their catalog searches.

    If you searched our catalog from the main page with a search string containing an "&" in it, the string would get chopped and only the part before the ampersand would be passed on.

    I think we were okay with this until we switched to the new UI. To fix it, I just added a JavaScript call to encode the URI components in the query:

    query = encodeURIComponent(query);
  2. SSH keys are going to save my fingers some work. I used the technique described here: http://www.rebol.com/docs/ssh-auto-login.html to create public/private RSA key pairs, so now I can just type ssh bkelm@libtest and I'm directly connected. I also set up a bookmark shortcut to open an SSH connection, ssh://bkelm@libtest-1, so now I just choose that bookmark on my Mac and a shell opens.
  3. I have always wanted to at least return a link to the list of gift books by a given donor. I knew I could do this if I could grab the data from the PNX record, and thanks to Corinna Baksik at Harvard University, who shared some examples at ELUNA 2016, I was able to try out the following. First I add a component bound to prmBriefResultAfter, which uses a controller that has access to the data model of its parent, the PNX record. Very cool. The code (linked below) appends a link under our Gift Book line to look at all gift books from a donor.

    I added the script to our Github repository:

    https://github.com/hatfieldlibrary/primo-new-ui-components/blob/master/custom/js/7_gift_book.js

Monday, February 5, 2018

Python, eXist, Alliance Share the Work


1. Python

    I did a little work in Python this week to set up a script that we can run with one command to restore the various directories in our eXist database. It prompts the user for which backup to use and then iterates through the directories we have indicated we need to restore.

#!/usr/bin/env python

import subprocess

#Directories to restore list

directories = ["apps/METSALTO", "bulletincatalogs", "collegian", "commencement", "handbooks", "puritan", "scene", "scrapbooks", "wallulah", "system/config/db"]

#Prompt user for backup directory

backup_date = raw_input('Enter the backup directory: ')

#Note: to run by cron, set the date here and comment out the line above

#backup_date = ""

report_date = backup_date + ".txt"

rout = open(report_date, 'w')

program = "/var/lib/exist/eXist-db/bin/backup.sh"

for directory in directories :

    directory_backup = "/backup/" + backup_date + "/db/" + directory + "/__contents__.xml"
    arguments = ["-u", "admin", "-p", "XXXXX", "-r", directory_backup]
    command = [program]
    command.extend(arguments)

    #restore each directory and send output to file
    subprocess.call(command, stdout=rout)
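
For the cron variant mentioned in the comment above, the raw_input prompt could be replaced with a computed date. A minimal sketch, assuming the backup directories are named by date in YYYYMMDD form (the real naming scheme may differ):

from datetime import date, timedelta

# assume last night's backup directory is named like 20180204 (the format is a guess)
backup_date = (date.today() - timedelta(days=1)).strftime("%Y%m%d")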

2. eXist

    I wrote a simple XQuery that can be run in eXide and loops through the eXist collections we want to reindex after the restore above.

xquery version "3.0" encoding "UTF-8";
declare option exist:serialize "method=xhtml media-type=text/html indent=yes";

let $data-collection := '/db'
let $login := xmldb:login($data-collection, 'admin', 'XXXXXX')
for $directory in ("/bulletincatalogs/fulltext", "/bulletincatalogs/mets",
                   "/collegian/fulltext", "/collegian/mets",
                   "/handbooks/fulltext", "/handbooks/mets",
                   "/puritan/fulltext", "/puritan/mets",
                   "/scene/fulltext", "/scene/mets",
                   "/scrapbooks/fulltext", "/scrapbooks/mets",
                   "/wallulah/fulltext", "/wallulah/mets")
let $start-time := util:system-time()
let $collection := concat($data-collection, $directory)
let $reindex := xmldb:reindex($collection)
let $runtime-ms := ((util:system-time() - $start-time)
                    div xs:dayTimeDuration('PT1S')) * 1000
return
    <html>
        <head>
            <title>Reindex</title>
        </head>
        <body>
            <h1>Reindex</h1>
            <p>The index for {$collection} was updated in {$runtime-ms} milliseconds.</p>
        </body>
    </html>

    So this works just fine, but my supervisor would prefer that we not run it through the eXide interface over HTTP. So I finished the week by trying to write an Ant task to accomplish the same thing; I'll share my success or failure with that next week.

3. Alliance Sharing the Work

    On Friday morning I had a great call with the DUX leaders, Anne at UW and Molly at PSU, and Cassie from the Alliance offices. In my opinion, our group was doing too much hand-holding for the other libraries with each Primo upgrade. ExLibris puts out plenty of information at each upgrade, and in the past we had been doing a bunch of customization to that information, which in my opinion just was not necessary. Anyone who cares enough can go through the information from ExLibris and gather what they need from it.

    The folks on the DUX call agreed, and we also agreed to let people add their own issues to a shared spreadsheet for tracking each upgrade. If an issue is important to you, document it in the spreadsheet; there is no need to send it to me to document, since you can edit the spreadsheet just as easily as I can. If you care about the upgrade, you will do the testing and put your results and calls in the Google Doc folder that everyone can edit and read.
    I now get to share this with my group. Maybe I should have told them about it beforehand, but I have to think they will be for the change as well. Then I will present the changes on the Alliance Discovery call on the 15th.