*Jason Ellis is now a Ph.D. student in the GVU Center
at Georgia Tech. He can be reached at email@example.com.
© Copyright ACM 1997
One case management function that we identified as a candidate for the new system involves selecting the best program for a youth. These programs range from community-based drug treatment programs to secure residential facilities. Currently, DJJ searches through a 4-inch manual of about 250 programs to find the best one. Not only is this very time consuming, but it also risks biasing the choice toward the first suitable program found rather than the one that best fits the youth.
We believed that HCIL's earlier dynamic query (DQ) research could be applied to this problem. Most of DJJ's employees are novice computer users. Several have never worked in a graphical windows environment or even used a mouse, and only a few are familiar with query languages like the Structured Query Language (SQL). DQ applications allow users to make queries very quickly and easily by adjusting sliders and selecting buttons while the search results are continuously updated in a visual display (e.g., an x/y scatterplot or a map). DQ applications are particularly good for novice users since no special query language must be learned, invalid queries cannot be formed, and users see the results of adjusting a value immediately. The ProgramFinder was designed to allow DJJ to quickly and easily select the best program(s) for a youth from among all the programs matching user-specified attributes.
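The DQ loop described above can be sketched in a few lines: each "slider" holds a value range, and every adjustment immediately re-filters the visible results. The program records and attribute names below are hypothetical illustrations, not DJJ's actual data.

```python
# Minimal sketch of a dynamic query: sliders map attributes to ranges,
# and any slider change triggers an immediate re-query of the data set.
# All attribute names and values here are hypothetical.

programs = [
    {"name": "Camp A", "security_level": 2, "capacity": 40},
    {"name": "Group Home B", "security_level": 1, "capacity": 12},
    {"name": "Secure Facility C", "security_level": 5, "capacity": 80},
]

# Current slider settings: attribute -> (low, high), inclusive.
sliders = {"security_level": (1, 3), "capacity": (10, 50)}

def matches(program, sliders):
    """A program matches when every attribute falls inside its slider range."""
    return all(lo <= program[attr] <= hi for attr, (lo, hi) in sliders.items())

def query(programs, sliders):
    return [p["name"] for p in programs if matches(p, sliders)]

print(query(programs, sliders))      # -> ['Camp A', 'Group Home B']
sliders["security_level"] = (1, 1)   # user drags a slider; re-query at once
print(query(programs, sliders))      # -> ['Group Home B']
```

In a real DQ interface the `query` call would be wired to each slider's change event, so the scatterplot or map redraws continuously as the thumb moves.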
This paper describes the seemingly straightforward conversion of the ProgramFinder from a research prototype to a real product, and analyzes the interaction between the efforts of HCIL and DJJ and the amount of "buy-in" of DJJ staff and management (i.e., how excited they seemed to be about the prototype). We found that many levels of prototyping were still needed (five in all) and that the choice of the search attributes was the most time consuming (and the most conflict generating) task. A direct link to the workflow was also needed in the prototypes to generate the necessary "buy-in."
Figure 2. Final ProgramFinder prototype
The initial IVEE prototype was developed in a few hours to illustrate the ProgramFinder concept to DJJ. Once we had DJJ's go-ahead, the Initial Customization prototype was developed. This is when development started to focus more intently on DJJ's workflow, and as a result DJJ's "buy-in" increased dramatically. DJJ also began working harder on choosing the selection attributes.
Over time, it became obvious that workers had vastly different opinions from management. A comparison prototype was developed to illustrate the workers' ideas to management. After considerable debate, management decided on a set of attributes to use and a Testing prototype was developed so preliminary usability testing could be performed.
Once these attributes were chosen, the design effort gained steam as DJJ began reacting to the details in the prototype and requesting many modifications. The Testing prototype required increased implementation effort because there was little working functionality in the previous prototypes.
The major drawback of using IVEE was that it ran on Sun workstations and DJJ only uses PCs. For demonstration purposes, we resorted to using a slide show of IVEE screens in conjunction with a live demo of the HomeFinder to show the smooth DQ interaction. DJJ's initial reactions were positive and they asked us to continue.
DJJ's "buy-in" jumped dramatically when they were shown this prototype. They were very excited to see a DJJ document pop up when the "Send Packet..." button was pressed. They immediately started discussing how they could market ProgramFinder to other juvenile justice agencies. They also started envisioning other ways the ProgramFinder could be used. For example, they suggested adding a referral log so the acceptance and rejection patterns of different programs could be monitored. They also envisioned using the ProgramFinder to coordinate with other Maryland agencies, which would allow them to choose from over 2000 programs (compared to only 250 currently).
At this stage, DJJ's involvement moved from a casual exploratory effort to more serious product design work. We believe this was due primarily to the customization effort, even though it was relatively simple. Ironically, the ProgramFinder was not a tool DJJ anticipated needing initially.
The major effort in creating this prototype was defining the new attributes, which were significantly different from the attributes proposed by management. Instead of choosing a range of values, workers wanted to rank each value on a scale from "not important" to "required." Attributes marked not important would be ignored, while low-importance and high-importance attributes would be used to color-code programs. An "attribute ranker" widget was designed to facilitate these selections. Some other minor changes included reducing the number of program types in the legend, adding more fields to the details area, and creating a "Show Referral Log" button.
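The workers' ranking scheme can be sketched as a simple scoring pass: "required" attributes filter programs out, "not important" ones are ignored, and low/high importance contribute to a score that would drive the color coding. The attribute names, weights, and data below are hypothetical assumptions for illustration.

```python
# Sketch of the "attribute ranker" logic (hypothetical attributes and weights):
# required -> hard filter; not important -> ignored;
# low/high importance -> weighted score used to color-code matches.

IMPORTANCE = {"not important": 0, "low": 1, "high": 2}

def rank_programs(programs, youth_needs, rankings):
    """Return (program name, score) pairs for programs passing all required attributes."""
    results = []
    for p in programs:
        score = 0
        eligible = True
        for attr, level in rankings.items():
            met = p.get(attr) == youth_needs.get(attr)
            if level == "required" and not met:
                eligible = False   # a required attribute is unmet: exclude program
                break
            if level in ("low", "high") and met:
                score += IMPORTANCE[level]
        if eligible:
            results.append((p["name"], score))
    # Higher scores would map to "better" colors in the display.
    return sorted(results, key=lambda r: -r[1])

programs = [
    {"name": "Camp A", "drug_treatment": True, "education": True},
    {"name": "Group Home B", "drug_treatment": False, "education": True},
]
needs = {"drug_treatment": True, "education": True}
rankings = {"drug_treatment": "required", "education": "high", "location": "not important"}
print(rank_programs(programs, needs, rankings))   # -> [('Camp A', 2)]
```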
At this point, there were two significantly different prototypes that needed to be brought to some consensus. Essentially, the workers wanted a system that supported how they currently did their jobs while management was interested in redefining the program selection process. Programs are currently selected according to a "type of care" paradigm where each program provides a description of the services they provide. Management is interested in establishing a new method where programs are described by a continuum of services measured by levels of restrictiveness (custody) and intervention (treatment). This would allow DJJ to directly link a youth's risk assessment (restrictiveness required) and needs assessment (intervention needed), which are measured by the same type of levels, to the selection process.
Management was presented with both prototypes and the strengths and weaknesses of each were discussed. After a month of deliberation, management chose the Initial Customization prototype. Management's rationale was that the attributes in the Comparison prototype did not engage the users in the selection process as much as those in the Initial Customization prototype. They were concerned that users might ignore critical areas in the lengthy checklists, which would greatly affect the level of service a youth receives. They decided it was preferable to provide a few attributes with broad implications and ask users to consider all of them. They felt the new method would ultimately make things easier for workers and better programs would be chosen.
The decision not to use the workers' attributes (Comparison prototype) decreased their "buy-in" temporarily. While the ProgramFinder would still help them do their jobs, it had now become the vehicle by which their jobs were being redefined. Resistance to change is often encountered when new systems are introduced.
Management requested that the color coding not be included in the Testing prototype because they felt it would unduly bias the selection process. The concern was that workers might just select the highest ranked program (e.g., the one with the "best" color) and not take into account other suitable programs. DJJ wanted to avoid creating a tool that gives the "perfect answer." They wanted the ProgramFinder to narrow down the number of programs and then require the workers to examine each of the remaining programs in-depth.
HCIL's major effort in developing the Testing prototype (Figure 6) was implementation since there was still very little working functionality in the previous prototypes. Slider implementation required the most time. Similar controls are available in the public domain but none had all the functionality DJJ needed.
Figure 3. IVEE prototype
Figure 4. Initial Customization prototype
Figure 5. Comparison prototype
Figure 6. Testing prototype
Users' reactions were positive overall. They felt that the system would very likely help them select better programs for youths. They also thought the correct amount of information was being displayed, which was not surprising since they had been involved in the design from the early stages.
Users did complain that the characters on the screen were too small and that the display was somewhat confusing. The confusion was likely due to their lack of experience with graphical user interfaces and the fact that the selection attributes were new to them.
Additional usability issues emerged during testing, and were addressed by the final design (Figure 2):
1- Addition of Textual Display - Users noted that often the location of a program is not taken into account when placing a youth so we added a textual display (showing a list of programs and their details) as an alternative way to review the best matches. The textual display is better for displaying more details at one time but the map display can provide an overview of all the matches in one screen (without using a scroll bar).
2- Reinstate the "Best" Values - DJJ reversed its decision about color coding with respect to "best" values. Although they were initially concerned that ranking programs might bias the selection process, after using the system they realized the color coding could assist workers when there is no program that matches a youth's needs fully (which is often the case). Assigning "best" values would also provide a clearer picture of what sorts of programs are needed.
3- More Integrated Help - Users found it difficult to remember what the number on the range sliders meant. Each number actually corresponds to a lengthy description that cannot be summarized in a few words. Users in the first session recommended allowing users to make selections from the help facility which contains the descriptions. A sample screen illustrating how this might be done was presented to the users in the second session (Figure 7).
Figure 7. Help facility supporting range selection
The new interface met with mild approval, but users felt the range sliders would be more convenient once they learned how they work.
4- Attaching Notes - While the workers were using the ProgramFinder, they found that they wanted to record comments about their settings. A small icon above each slider was added that, when clicked, would display the portion of the placement paperwork related to that particular attribute.
5- Modifying Range Sliders - Several users expressed difficulty using the range sliders. They were especially frustrated using the sliders when they knew the exact range they wanted. One suggestion was to enhance the range sliders to allow users to select a range by dragging the mouse across the values shown below the slider. This would require only one action, as opposed to the two drags required by the standard range slider.
6- Reordering Sliders - The order and categorization of attributes was raised as an important issue. The decision was made to present the controls by workflow and allow users to redisplay them alphabetically if they choose.
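The one-drag range selection suggested in item 5 amounts to mapping the start and end positions of a horizontal drag onto slider values. A minimal sketch, with hypothetical slider geometry (pixel coordinates and value bounds are illustrative only):

```python
# Sketch of one-drag range selection: dragging across the value labels below
# a slider selects the whole range in a single gesture instead of two thumb
# drags. Geometry values are hypothetical.

def drag_to_range(x_start, x_end, slider_left, slider_width, vmin, vmax):
    """Map a horizontal drag (in pixels) onto an inclusive value range."""
    def to_value(x):
        frac = (x - slider_left) / slider_width
        frac = min(max(frac, 0.0), 1.0)        # clamp to the slider track
        return vmin + round(frac * (vmax - vmin))
    lo, hi = sorted((to_value(x_start), to_value(x_end)))
    return lo, hi

# Dragging from 25% to 75% across a 1..9 slider selects the range 3..7,
# regardless of drag direction.
print(drag_to_range(150, 250, 100, 200, 1, 9))   # -> (3, 7)
```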
Search attribute selection can be difficult - We initially anticipated that it would be a simple task, but choosing the search attributes required the highest level of effort and caused the most conflict within DJJ. The Comparison prototype was developed solely to explore alternative attributes.
Customization increases "buy-in" - We were surprised how much DJJ's "buy-in" increased after the Initial Customization prototype was developed. To us, it was merely a re-implementation of the IVEE prototype for PCs, and the customization added was very minor (a few buttons and scanned forms), but it had a dramatic impact on DJJ's ability to understand how the ProgramFinder could help them and prompted them to start planning for novel uses.
Interface design can initiate changes in work processes - In the case of the ProgramFinder, the selected set of attributes will significantly change how DJJ selects programs. This temporarily troubled workers but they soon came to understand how it could help them choose better programs for the youths.
Presentation of similar applications stimulates early interest - Even though it is less effective than building a customized prototype, showing "live" demos of similar systems (e.g., HomeFinder) helps focus user thinking and bootstrap management "buy-in."
Creating alternative designs helps engage users - Illustrating functional differences through the creation of several prototypes is a very powerful tool. Users who initially expressed no opinions came forward with strong ideas once concrete choices were presented.
Creating a dialog between users and management early can save time - Meeting with workers and management together earlier in the design process might have eliminated the need for the Comparison prototype. Joint meetings can also help alleviate the "us against them" syndrome.
Selecting the search attributes was the most time consuming and conflict generating task. Demonstrating similar applications early on and adding custom workflow hooks to the prototypes increased "buy-in." Alternative designs were presented to increase user involvement. This effort also served as the catalyst for DJJ to redesign their work practice.
2. Ahlberg, C., Wistrand, E. (1995) "IVEE: An Information Visualization & Exploration Environment," Proceedings of IEEE Visualization '95 (Atlanta, October 1995), 66-73.
3. Rose, A., Shneiderman, B., Plaisant, C. (1995) "An applied ethnographic method for redesigning user interfaces," ACM Proc. of DIS '95, Symposium on Designing Interactive Systems: Processes, Practices, Methods & Techniques (Ann Arbor, MI, Aug 23-25, 1995), 115-122.
4. Slaughter, L., Norman, K., Shneiderman, B. (1995) "Assessing Users' Subjective Satisfaction with the Information System for Youth Services (ISYS)," Proceedings of Third Annual Mid-Atlantic Human Factors Conference (Blacksburg, VA, March 26-28, 1995), 164-170.
5. Williamson, C., Shneiderman, B. (1992) "The dynamic HomeFinder: Evaluating dynamic queries in a real-estate information exploration system," Proceedings ACM SIGIR '92 (Copenhagen, June 21-24, 1992), 338-346. Also appears in Sparks of Innovation in Human-Computer Interaction, Shneiderman, B., Ed., Ablex (June 1993), 295-307.