A spider’s web is intricate and fascinating, but it’s also a trap. Sound like another web you’ve encountered lately?

Your search starts out simple enough. You type a keyword in a search engine, which results in 50 matches. You click. Then you click again. And again. Before you know it, you’ve spent 20 minutes just clicking, tangled in a never-ending web.

Wouldn’t it be nice if there were some way you could preview a site before you actually linked to it? That’s one of the projects researchers in the School of Information and Library Science (SILS) are working on: helping users navigate the World Wide Web more efficiently.

“The idea is that a click is a terrible thing to waste,” says Gary Marchionini, professor of information and library science. “It’s a radical action. It’s not something you want people to do indiscriminately, because once they’ve clicked, they’ve lost their focus. They’re on a different window. We want to give people very dynamic opportunities to look ahead and see what is to come.”

Marchionini is one of the directors of SILS’s new Interaction Design Laboratory (IDL), which houses various computer projects, including the design, testing, and evaluation of Internet-searching tools. IDL staff also collaborate with other departments, helping them put together multimedia databases or interactive class web sites.

The lab, hidden behind book stacks on the fourth floor of Manning Hall, consists of two small rooms. As you walk in, there’s a usability station with video cameras and an eye-tracking station. The tools are high-tech, but the goal is to simplify things for people.

Researchers use these stations to find out, for instance, how long it takes users to process information and what they’re thinking as they do so. One of the things they’re interested in is how people navigate through web sites. Do they click back just one page, or do they always click back and start over from the beginning? “If they always go back to the beginning, perhaps that means there needs to be a better navigation system,” says Ben Brunk, a SILS graduate student who also manages the IDL.

As it is now, when you type in a keyword or two, search engines usually generate long lists of addresses with a title and limited text description. “This forces the searcher to make a decision based on very limited information that they can only verify by actually visiting each site to see what is there,” Brunk says.

What Brunk proposes is an animated preview tool that would allow users to get an overview of a web site before they link to it. To test his idea, Brunk put together a prototype for the SILS web pages. He took screen shots of every page on the web site and then fit the thumbnail images into a slide show users can control themselves. There’s a “play” button and a slide control to adjust the speed. When you get to the page you’re looking for, you can stop the show by placing the mouse over the thumbnail. If you want to go to that page, you can click on it. “If you were looking to see if a web site had a particular form, and you had a general idea of what the page looked like, you could scroll through the video until you find the form, click on it, fill it out, and get on with your life,” Brunk says. “That way you don’t have to click all over the place looking for it.”
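The article doesn’t describe how Brunk’s prototype is actually built, but its core behavior — a slide show of page thumbnails that plays at an adjustable speed, pauses when the mouse hovers over it, and navigates when clicked — can be sketched in a few lines of Python. All names here are hypothetical, chosen only to illustrate the interaction:

```python
class SitePreview:
    """Sketch of the animated preview tool's logic (hypothetical design;
    the actual prototype's implementation isn't described in the article)."""

    def __init__(self, pages, speed=1):
        # pages: list of (thumbnail, url) pairs, one per page on the site
        self.pages = pages
        self.speed = speed      # the "slide control" that adjusts playback
        self.index = 0
        self.playing = False

    def play(self):
        # The "play" button starts the slide show.
        self.playing = True

    def tick(self):
        # Called on a timer; advances the show while it is playing.
        if self.playing:
            self.index = (self.index + self.speed) % len(self.pages)

    def hover(self):
        # Placing the mouse over the thumbnail stops the show.
        self.playing = False

    def click(self):
        # Clicking the paused thumbnail navigates to that page.
        return self.pages[self.index][1]
```

In use, a searcher would press play, watch the thumbnails cycle until the page they want appears, hover to pause, and click to jump straight there — without visiting any of the other pages first.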

Feedback has been positive, says Brunk, who tested his prototype on a group of about 20 students with varying levels of Internet experience. The usability station allowed him to keep a record of the subjects’ responses. One camera recorded the subjects’ facial and vocal expressions, while another recorded their hand movements. In addition, a video scan converter captured what was happening on the computer screen. As each person worked through the preview tool, Brunk used a video mixer to combine the two camera feeds and the screen capture into a single recording. When you play the recording back on a television monitor, the different streams show up in separate quadrants of the screen.

Researchers in the IDL employ the usability station to test all types of projects. Another example is a video repository of digital images that Marchionini and Gary Geisler, a SILS graduate student, are putting together. “The goal is for people to be able to get a quick understanding of what’s in a video before they take the time to download it,” Marchionini says.

Suppose you’re a teacher and you want to put a lesson plan together for your biology classes. You want to show your classes a video, not the whole thing, just a few clips that demonstrate a particular concept. You go to a web site that allows you to download videos, and you click on a title. The video promptly starts to download. You’ve now roped yourself into 20 or more minutes of waiting, and you’re not even sure that’s the video you really want. “Watching movies and picking out the parts you want to show can be very time consuming,” Geisler says. “So what we’re trying to do is figure out ways people can see a preview before they download.”

One way is to create still frames of a video, so users can get an idea of what’s in the movie without downloading it. When someone moves the mouse over a movie title, representative frames from the movie are displayed one after the other. Another option is to show several frames from the movie all at once, like a storyboard. That’s what the usability station is for: to figure out which approach people like best.

Geisler and Marchionini will also use the eye-tracking station to test how long it takes people to view a screen of several images as opposed to a slide show of images. The technology, originally developed for hands-free targeting in military aircraft, is now used to measure a person’s pupil diameter and point of gaze on the computer screen. The system captures where the eye is looking 60 times per second. In combination with the eye tracker, a head tracker lets researchers see exactly where the eye is positioned on the screen. On a separate monitor, crosshairs show where the eye is focused.

Internet companies, for instance, use eye trackers to determine the most eye-catching spot to place advertisements. “But we don’t care about that. What we care about is how people are processing the information they’re seeing,” says Marchionini. “We can use the eye tracker to record a person’s eye movements and see precisely how much time people are spending on each frame, and that helps us determine how fast people are processing the information.”

And that’s what the IDL is all about: the human side of things. By studying how humans interact with computers, researchers can make better designs, so people have an easier time navigating the Internet or rifling through huge amounts of electronic information. “We don’t want to just make designs based on what we think looks cool or what we think is a good idea,” Marchionini says. “They’ve got to make sense to the people who will be using them.”

Catherine House was formerly a staff contributor for Endeavors.

Bert Dempsey, assistant professor, and Barbara Wildemuth, associate professor, are also directors of the IDL. Initial funding for the lab came from the UNC-CH Provost’s Office and Intel, Inc.; current funding comes from the Smallwood Foundation. The eye tracker is funded by the National Science Foundation.