Enhancing the Student Experience through Blind Grading Automation and Self-Service in the Law School
Blind grading has been used by law schools, and by some institutions outside of law, for many years to assure fairness and impartiality in the grading process. When blind grading is in place, students have confidence in the equality of the grading system and the faculty value the objective nature of the process. At Duquesne University in Pittsburgh, our School of Law has used blind grading for many years for what it brings to the academic experience of law students. However, conducting blind grading required a great deal of effort. Because the process lacked student self-service capabilities, it placed a burden on the students to manage their exam numbers and on the administration to intervene manually when assistance was needed.
Duquesne University is a mid-sized private liberal arts university with a Law School enrollment of around 400. Duquesne’s two-person Law School registrar staff found themselves overwhelmed by executing the highly manual blind grading process while also serving the many and varied needs of the law students. The manual processes supporting blind grading required the generation and distribution of random exam numbers, along with a number of background steps involving manual data entry into spreadsheets by staff and faculty. Students were responsible for keeping track of exam numbers and other key data. Once all of the manual processing was completed, the grades were entered into our Student Information System.
The Registrar felt that there had to be a way to automate the grading process and asked for help from the Computing and Technology Services (CTS) staff, the central IT department for Duquesne. After searching for possible purchased software solutions, and discussing the advantages and disadvantages of writing a standalone homegrown application, the Law School and CTS agreed that the best approach was to customize our Student Information System (SIS) to address these needs. Our SIS is the Ellucian Banner system. Banner is a large and complex ERP/SIS, and customizing it can be an extremely difficult task. While the approach we used to customize Banner is very specific to that ERP, the challenges we faced and our design approach are applicable to any SIS that allows customizations.
The first challenge was to enhance the student experience concerning their exam numbers. The goal was to make it easier to create the exam numbers, store them in the SIS on the student record, and make them accessible at any time to the student. The next challenge was the delivery of the number to the students. We decided to create a channel within our student portal to serve the Law School’s needs. Once the student logs into the portal, they receive information specific to their needs. This Law School channel would provide them with their exam numbers anytime they need them, eliminating the need to contact the Registrar when they lose their paper letter.
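The exam-number step described above can be sketched in code. This is a minimal illustration, not the actual Banner customization: the function name, the ID values, and the four-digit number range are all assumptions made for the example.

```python
import secrets

def assign_exam_numbers(student_ids, low=1000, high=9999):
    """Assign a unique random exam number to each student for one exam period.

    A dict stands in for the SIS student records here; in practice the
    mapping would be stored on the student record in the SIS.
    """
    pool = range(low, high + 1)
    if len(student_ids) > len(pool):
        raise ValueError("not enough exam numbers for the enrollment")
    # Use a cryptographically strong RNG so numbers cannot be predicted,
    # and sample without replacement so no two students share a number.
    rng = secrets.SystemRandom()
    numbers = rng.sample(pool, len(student_ids))
    return dict(zip(student_ids, numbers))
```

Once stored on the student record, the same mapping can back a portal channel lookup, so a student who misplaces their letter can simply log in and retrieve the number.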
“When blind grading is in place, students have confidence in the equality of the grading system and the faculty value the objective nature of the process”
In order to accommodate blind grading, we created a parallel copy of the grading application and customized the copy. This approach allows the blind grading customizations to be used by the Law School, and any other schools that desire it, while all others continue to use the standard Banner-delivered grading application. Second, and equally important, this approach preserves all of the University-wide Banner functions such as GPA calculation, transcript generation, registration, and advising. Functions such as these can continue to be upgraded with Ellucian-delivered code, which in turn limits reprogramming and testing to just those functions that had to be customized to accommodate blind grading. As we progressed through customizing the final grade entry form, we realized that we also had to customize the class roster functionality, as well as several other ancillary functions that would otherwise have given faculty members access to both a student’s identity and the newly created exam number. The class roster was tied directly to grading and all of the other class-related functionality. The Registrar needed a version of the class roster that showed the exam numbers assigned to each student in order to streamline their work, while faculty members still needed the original class roster without the assigned exam numbers. We had to document and walk through every Law School process that touched the student information system to make sure that any function that needed to be executed using exam numbers did so, while taking great care that none of the faculty functions allowed a correlation between the exam numbers and the student.
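The core design rule here, that no single faculty-facing view may join a student's identity to an exam number, can be sketched as a role-based roster filter. The field names and roles below are illustrative assumptions, not the Banner implementation:

```python
def class_roster(enrollments, role):
    """Return a roster view appropriate to the caller's role.

    `enrollments` is a list of dicts with 'name', 'student_id', and
    'exam_number' keys (hypothetical field names). Faculty see identities
    but never exam numbers; the Registrar sees both side by side.
    """
    if role == "faculty":
        # Strip the exam number so identity and number never appear together.
        return [{"name": e["name"], "student_id": e["student_id"]}
                for e in enrollments]
    if role == "registrar":
        # The Registrar's streamlined view keeps the full record.
        return [dict(e) for e in enrollments]
    raise ValueError(f"unknown role: {role}")
```

The same filter would have to be applied to every ancillary function that touches enrollment data, which is why walking through all of the Law School's SIS processes was necessary.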
Once all of the modifications were identified and completed, we encountered the last significantly manual function in the grading process, class ranking. In much the same fashion as the manual efforts that supported the blind grading, the Law School class ranking process required many steps, a great deal of time, and significant auditing and error checking. Automating the generation of the class rank and the delivery of that rank to the student was the final step in enhancing the student experience and the efficiency of the Law School Registrar’s Office. A great deal of time and effort went into designing the ranking process. The logic that the Registrar had followed via spreadsheets and decision trees had to be programmed into a single automated process that could apply ranks throughout the Law School. Once that was completed the rank could be stored on the student record and delivered to the student through the same Law School portal channel that provided the student with their exam numbers. With automated ranking and the secure portal channel delivery in place the Law School Registrar’s Office is now able to focus their time on serving the students.
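A ranking pass of the kind described above might look like the sketch below. The article does not specify the Law School's ranking rules, so this assumes a simple rank by GPA, highest first, with tied GPAs sharing a rank (a "1, 2, 2, 4" scheme); the actual decision-tree logic the Registrar followed was certainly more involved.

```python
def class_rank(gpas):
    """Compute class rank from a mapping of student ID to GPA.

    Ties share a rank and the following rank is skipped (standard
    competition ranking). The tie rule is an assumption for illustration.
    """
    ordered = sorted(gpas.items(), key=lambda kv: kv[1], reverse=True)
    ranks = {}
    prev_gpa, prev_rank = None, 0
    for position, (student_id, gpa) in enumerate(ordered, start=1):
        if gpa == prev_gpa:
            ranks[student_id] = prev_rank  # tie: reuse the earlier rank
        else:
            ranks[student_id] = position
            prev_gpa, prev_rank = gpa, position
    return ranks
```

Storing the computed rank on the student record then lets the portal channel deliver it alongside the exam numbers, with no per-semester spreadsheet work.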
We took advantage of the summer semester to implement the grading portion of this project with a goal of having all functions live for the fall 2015 semester. We are already anticipating the next round of customizations that will be needed when another version of our SIS arrives. This of course is one of the disadvantages of customizing vendor delivered applications, but it is one that we accepted after a great deal of analysis. As is often the case with large scale projects, the amount of work and the degree of complexity required for this project were both much greater than anticipated.
The improved efficiencies of the student and staff experiences were worth the effort. Institutions considering this type of project must be prepared for unforeseen complications and challenges.