Survey Results from the Computerized LSAT Prototype Testing at the 1999 Law School Recruitment Forums (CT-00-06)
by Kimberly A. Swygert, Jennifer A. Lawlor, and Kira Velikopolsky, Law School Admission Council

Executive Summary

This paper addresses one aspect of the second phase of a research study series concerning the development of an acceptable prototype for a computerized Law School Admission Test (LSAT). Phase one of the research consisted of the development of seven preliminary LSAT computerized testing (CT) prototypes, along with demonstrations of those prototypes at Law School Recruitment Forums from 1995 through 1998. The current phase builds upon the phase-one results and comprises the development and testing of an entirely new set of prototypes for the three existing item types: Reading Comprehension (RC), Logical Reasoning (LR), and Analytical Reasoning (AR). Usability test responses to these prototypes and survey data were collected from prospective law school students during the 1999 Forum season; this paper addresses the survey responses of those students.

The eight 1999 Forum locations were, in chronological order: Washington, DC; New York City; Atlanta; Houston; Boston; Los Angeles; Oakland; and Chicago. The students who attend these Forums might be considered somewhat self-selected, in that they are already preparing for law school. However, because admission to the Forums is free, the more ambitious candidates are balanced by those who simply want a feel for what law school might be like, making the candidate pool well suited to testing and surveying. The participants in the usability tests and survey collection were prospective law students who had registered for the Forum, along with the occasional curious admissions officer. The candidates appeared to enjoy being part of the subject pool in LSAT research and were excited at the prospect of getting an “advance view” of the CT. The opportunity to interact with the Law School Admission Council (LSAC) test development staff was also an attraction for the candidates.

The three prototypes that were used throughout 1999 were completely new (i.e., not built upon previous prototype code or structures) and were designed, developed, and programmed in-house. All three prototypes follow the same format: a passage/stimulus/set of conditions is presented along with five items, then a second passage/stimulus with a second set of five items, and finally a scoring screen. All prototypes allow users to move through each set of five items using tabs or arrow buttons. When an item is answered, a black circle appears beside the item number on the tab, a green box surrounds the option text box, and the option letter appears to be depressed. If the candidate thinks an option is incorrect, he or she may rule it out by pressing the X-button to the right of the option text box; the text is then grayed out and the X-button appears depressed. No tutorial, set of directions, or help file was provided for the candidates (this is standard procedure for usability tests).
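The answer and rule-out mechanics described above can be summarized as a small per-item state model. The TypeScript sketch below is purely illustrative; the names (ItemState, selectOption, toggleRuleOut) and the way selection interacts with ruling out an option are assumptions, not details of the actual LSAC prototype code.

    type OptionLetter = "A" | "B" | "C" | "D" | "E";

    interface ItemState {
      selected: OptionLetter | null;   // answered items show a black circle on their tab
      ruledOut: Set<OptionLetter>;     // ruled-out options are grayed out, X-button depressed
    }

    function newItemState(): ItemState {
      return { selected: null, ruledOut: new Set<OptionLetter>() };
    }

    // Choosing an option: the option letter appears depressed and a green box
    // surrounds its text box.
    function selectOption(state: ItemState, option: OptionLetter): ItemState {
      const ruledOut = new Set(state.ruledOut);
      ruledOut.delete(option);         // assumption: selecting an option clears any rule-out on it
      return { selected: option, ruledOut };
    }

    // Pressing the X-button to the right of an option toggles its ruled-out status.
    function toggleRuleOut(state: ItemState, option: OptionLetter): ItemState {
      const ruledOut = new Set(state.ruledOut);
      if (ruledOut.has(option)) {
        ruledOut.delete(option);
      } else {
        ruledOut.add(option);
      }
      // Assumption: ruling out the currently selected option also deselects it.
      const selected = state.selected === option ? null : state.selected;
      return { selected, ruledOut };
    }

Keeping selection and elimination as separate fields mirrors the description above, in which answering an item and ruling out an option are distinct candidate actions with distinct on-screen indicators.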

Forum data presented in this paper are in survey form, with two surveys available to the participants: short and long. The short survey that accompanied the 1999 prototypes is two pages long. Its first section contains four demographic questions, and its second section contains five questions about the participant’s computer usage and comfort with various types of hardware and software. The long survey is five pages long and is an extension of the short survey. In addition to the demographic questions and four of the computer usage/comfort questions from the short survey, it contains three additional computer access and comfort questions, followed by two new sections: Computerized Test/Computerized LSAT Information and Special Accommodation Issues. Which survey a participant completed depended on his or her participation in the usability test: usability test participants completed the short survey, while attendees who did not have time for the test completed the long survey.

A total of 281 short surveys and 71 long surveys were collected. The results show that the survey respondents included more females than males, and that the largest racial/ethnic group was Caucasian, followed by African American. An important question to ask about the demographic data is: How do these percentages compare to the demographic breakdowns of all Forum attendees and of LSAT test takers? The female-male ratio for the overall Forum attendee group is very similar to that of the survey group, while the test-taker population differs; it seems that a larger proportion of females visit the Forums and complete the survey than take the LSAT. A similar effect is seen in the racial/ethnic breakdowns: the results suggest that African Americans make up a much larger proportion of Forum attendees than of test takers, and they are correspondingly more likely to have been included in these surveys. The larger percentages of women and African Americans in the survey group may actually be a plus for the usability tests and survey collection, as it makes it easier to check for subgroup differences in attitude and computer usage.
The results show that survey respondents use computers very often, are very comfortable with them, and overwhelmingly have access to the internet. About one-third of the respondents had taken a computerized test before, and many of them found aspects specific to a computerized LSAT to be very desirable. However, a majority of the respondents report that they would prefer to take the paper-and-pencil version of the test, despite the high comfort levels they report for the computerized version.

These data are very useful for gauging computer literacy as well as attitudes toward, and experience with, computerized tests. They also demonstrate that the increasing use of computers in the candidate population does not necessarily mean that candidates are comfortable with the idea of a computerized LSAT. Concerns about navigating through reading passages and reading large amounts of text onscreen surface in the data, and these concerns are out of proportion to the overall reported level of comfort with computers. Reported comfort with a computerized LSAT remains lower than reported comfort with computer use in general, even when aspects specific to the computerized test are rated very highly, and more candidates would choose to take the LSAT in its paper-and-pencil format than in the computerized format. Future surveys should include questions that dig deeper into possible reasons for this continuing unease, in the hope that future prototypes and the eventual operational test may be modified to take these reasons into account.
