The Netflix Effect: When Software Suggests Students’ Courses

Joshua Anderson for The Chronicle

Denley, of Austin Peay State U., hopes his course-picking software will
“open students’ eyes to courses that they were dimly aware of.”

By Jeffrey R. Young

When Netflix suggests movies based on how much previous renters liked
them, all that’s at stake is a night’s entertainment. Now a handful of
colleges have begun using similar recommendation systems to help
students pick their courses—a step that could change GPAs and career
paths.

Last week, undergraduates at Austin Peay State University were
invited to visit its new online recommendation system before meeting
with their academic advisers. When suggesting a course, the automated
system considers each student’s planned major, past academic
performance, and data on how similar students fared in that class. It
crunches this information to arrive at a recommendation. An early test
of the system found that it could lead to higher grades, officials say.
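The approach described above, weighing a student’s planned major and past grades against how similar students fared in a course, resembles a simple weighted nearest-neighbor prediction. Below is a minimal sketch in Python; the student records, course names, and similarity weights are all invented for illustration, since the actual Austin Peay system is not public:

```python
# Hypothetical sketch of a "similar students" grade predictor.
# All data and weights here are invented; the real system's
# method and inputs are not publicly documented.

# Each past student: (major, overall GPA, {course: grade points earned}).
PAST_STUDENTS = [
    ("business", 3.4, {"MIS 300": 3.7, "ACCT 201": 3.0}),
    ("business", 2.9, {"MIS 300": 3.0, "MKT 310": 3.3}),
    ("biology",  3.6, {"MIS 300": 2.7, "BIO 210": 4.0}),
]

def similarity(major_a, gpa_a, major_b, gpa_b):
    """Crude similarity score: shared major counts most, then GPA closeness."""
    major_match = 1.0 if major_a == major_b else 0.0
    gpa_closeness = 1.0 - min(abs(gpa_a - gpa_b) / 4.0, 1.0)
    return 0.6 * major_match + 0.4 * gpa_closeness

def predict_grade(major, gpa, course):
    """Predict a grade as the similarity-weighted average of the grades
    that comparable past students earned in the same course."""
    numerator = denominator = 0.0
    for past_major, past_gpa, grades in PAST_STUDENTS:
        if course in grades:
            weight = similarity(major, gpa, past_major, past_gpa)
            numerator += weight * grades[course]
            denominator += weight
    return numerator / denominator if denominator else None
```

Here a hypothetical business major with a 3.3 GPA asking about “MIS 300” would get a forecast drawn mostly from the two business students who took it, which is the kind of grade prediction the article describes students seeing.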

Human academic advisers usually don’t get five stars from students.
The quality of course recommendation at colleges is often about as
reliable as the level of movie advice you’d get at the local
video-rental store (if you can still find one). Sure, some clerks are
film buffs—Quentin Tarantino first worked in a video store, after
all—but you can’t count on it. Many professors who help students plan
their academic schedules have limited knowledge of courses outside
their discipline.

In contrast, colleges themselves have vast hard drives filled with data about academic requirements and student performance.

That makes the advising process a natural area to try a more
analytical approach to student services. If it works there, the
number-crunching techniques and suggestion engines could be put to other
purposes as well, pointing students toward majors, activities, and
campus resources.

Call it higher education’s Netflix Effect.

I have also heard worries—not surprisingly, from professors—that
students may interpret the suggestions from the software robot as
commands, and miss the more creative ideas that advisers say are more
likely to come out of a free-flowing discussion.

Absent Advisers

Margaret Suddarth, a junior majoring in business at Austin Peay, in
Clarksville, Tenn., says she has seen her academic adviser only once in
her time at the university, even though they are supposed to get
together to talk about her schedule every semester. “He’s hard to catch
up with,” she says.

Her usual ritual is to flip through the course booklet on her own,
then call or e-mail her adviser with her plans. He approves them and
gives her a code number required to register.

The new software robot, on the other hand, is available anytime, and
its ideas seemed on target to her. “The suggestions were all good,” she
says.
For the coming semester, the program advised her to take a course in
management-information systems, which reminded her that she could do
that to get a requirement out of the way. It even predicted she would
get a good grade.

Tristan Denley, Austin Peay’s provost and a former professor of
mathematics who designed the course-picking software, says he tested it
using students’ performance from past semesters. Those who took the
courses the software recommended, he found, earned GPAs that were half a
point higher than those who chose courses not suggested by the program.

One possible reason for the difference, he says, is that students
sometimes take advanced courses before they are ready, whereas the
system can guide them to material closer to their level. In some cases,
he argues, it might even guide them to subject matter they have a
propensity for but may not have realized.

To me, the software robot—which in the online system is represented
to students as suggestions from the university’s mascot, a monocled
figure called the Gov—seemed as if it could guide students to “gut”
courses rather than challenging ones.

“I don’t think the major thrust will be to push people to classes
that are sort of easy A’s,” argues Mr. Denley. “I hope the major effect
will be instead to open students’ eyes to courses that they were dimly
aware of.”

David Major, chairman of the Faculty Senate and a professor of
languages and literature, confesses to being “a little nervous” about
the system when he first heard about it. But so far he does not see it
elbowing humans out of the process. “I haven’t heard any grumbling”
among other faculty members, either, he says.

The reality is that students already go online to help pick their
courses. For years many have turned to unofficial online forums such as
Rate My Professors, where students anonymously describe professors
whose courses they’ve taken, including how strictly they grade and even
how attractive they are.

Professors say they like that Austin Peay’s system recommends
courses, not professors, basing its decisions on content rather than
teaching style.

Ms. Suddarth, the business major, uses both the official system and
Rate My Professors to design her schedule, though she
says her goal is to find the best teacher, not the easiest.

Mr. Denley hopes his software program will have an impact on
retention. Specifically, it may help some scholarship students keep
their GPAs high enough to maintain their awards. “The loss of those
scholarships for some students means they are no longer able to carry on
with their degrees,” he says.

If his calculations work, the number-crunching provost hopes to add a
tool to help students choose their majors as well. If that sounds like
too personal a decision for software, remember that these days, many
people use online dating services to find their spouses.

As Mr. Denley puts it, “if eHarmony works well, why not this?”

‘Personal Connections’

The University of Colorado at Boulder, however, which also added
online course-picking tools in the past few months, got a lesson in the
importance of old-fashioned advising.

Boulder set up an online service to help students plot course plans to satisfy their majors, though it doesn’t make suggestions.

One goal of building the online guide was to fight what Michael C.
Grant, associate vice chancellor for undergraduate education, says is
the mistaken impression that the university wants students to stay extra
semesters to take just one stray course, one that probably would have
fit into a previous term with proper planning.

“That perception is a big problem for us,” he told me. The online
tool makes it clearer to students what they need to take to stay on
track.

But a recent survey at the university found that despite the online
information, students are seeking personal advising more than ever. It
continues a trend at Boulder in which 115,000 contacts between students
and advisers were reported in the 2008-9 academic year, compared with
70,500 in 2005-6.

“It’s clear that the students oftentimes have pretty personal
connections with their advisers, and most of the time those are very
positive,” Mr. Grant says.

The Netflix Effect, though, suggests that a little number crunching can still be a powerful force.

Some two-thirds of movies rented on Netflix result from
recommendations made by the site, and users rate recommended films
half a star higher than those they find on their own. That’s according
to the book Super Crunchers: Why Thinking-by-Numbers Is the New Way to Be Smart.

I asked the book’s author, Ian Ayres, a Yale Law School professor, why he trusts machines with such personal decisions.

He says humans tend to have blind spots when handling tasks like
advising, which involve complex systems. People often give too much
weight to certain details based on personal preferences.

The question, from his point of view, is why colleges haven’t done
more quantitative experiments to test their own educational and business
practices. In a column he wrote for Forbes magazine, he
challenged elite colleges to admit a few applicants who don’t meet
admission criteria, so the colleges can track whether the students
really perform more poorly than other students do. No one has tried it.

“It’s kind of perplexing that they have this unvalidated reliance” on
tradition, Mr. Ayres says. “We’re willing to do randomized testing on
drugs, where people’s lives are at stake, but we’re not willing to do it
with people’s education.”

At another college system that set up a course-recommendation tool,
the South Orange County Community College District, in California,
leaders compare educational data with baseball statistics and note that
colleges are getting more sophisticated in how they measure success,
just as Major League Baseball has over the years.

“Right now this is kind of a zeitgeist thing,” says Robert S.
Bramucci, vice president for technology and learning services. “We’re
starting to see a lot of independent efforts to improve analytics.”