Surveying tens of thousands of college students on hundreds of campuses is a large undertaking. In 1992, when we published the first edition of our Best Colleges book, our student survey was a paper survey. We conducted it on campus, working with school administrators to get permission to set up tables in high-traffic locations where students filled out the surveys. To collect surveys from a wide range of students, freshmen to seniors, this process sometimes took place over several days and at various on-campus locations. Via that format and method, we were pleased to survey about 125 students per campus on average.

However, the launch of our online student survey made it possible to gather opinions from far more students per college than we had reached previously. It also made the process more efficient, secure, and representative. Now all of our surveys are completed online, and the average number of student surveys per campus upon which our ranking lists are annually tallied has jumped from about 125 to more than 400.

Our student survey is also now a continuous process. Students from every school in the book can submit their surveys online at any time (though our site will accept only one survey per student, per school, per academic year—it's not possible to "stuff the ballot box," as it were). In addition to the surveys we receive from students on an ongoing basis, we also officially conduct surveys of students at each school in the book at least once every three years.
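The one-survey-per-student limit described above amounts to a simple deduplication rule. The sketch below is purely illustrative and not the site's actual implementation; it assumes each submission carries a student identifier, a school, and an academic year, and that a repeat of the same three-part key within a year is rejected.

```python
# Hypothetical sketch of the one-survey-per-student-per-year rule.
# Assumption (not from the source): submissions are keyed on
# (student_id, school, academic_year).

accepted = set()  # keys already seen this cycle

def submit_survey(student_id, school, academic_year):
    """Accept a survey unless this student already submitted one
    for this school in this academic year."""
    key = (student_id, school, academic_year)
    if key in accepted:
        return False  # duplicate: "ballot-box stuffing" rejected
    accepted.add(key)
    return True

print(submit_survey("s1", "College A", 2024))  # True  (first submission)
print(submit_survey("s1", "College A", 2024))  # False (same year, rejected)
print(submit_survey("s1", "College A", 2025))  # True  (new academic year)
```

A real system would persist these keys in a database rather than an in-memory set, but the acceptance logic is the same shape.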

How do we conduct those official surveys? First, we notify our administrative contacts at the schools we plan to survey. We depend upon these contacts for assistance in informing the student body of our survey (although we also get the word out to students via other channels independent of the schools). In recent years, an increasing number of schools have chosen to email the entire student body about the availability of the online survey, which has yielded robust response rates. We've surveyed anywhere from the entire student body at Deep Springs College to more than 1,000 collegians at schools such as Florida State University, University of Massachusetts–Amherst, Oregon State University, United States Air Force Academy, Michigan State University, and Clemson University.

We conduct these official surveys more often than once every three years if the colleges request that we do so (and if we can accommodate that request) or if we deem it necessary for one reason or another. And of course, surveys we receive from students outside of their schools' normal survey cycles are always factored into the subsequent year's ranking calculations, so our pool of student survey data is continuously refreshed.

The survey has more than 80 questions in four main sections: "About Yourself," "Your School's Academics/Administration," "Students," and "Life at Your School." We ask about all sorts of things, from "How many out-of-class hours do you spend studying each day?" to "How do you rate your campus food?" Most questions offer answer choices on a five-point scale: students fill in one of five boxes on a grid with headers that vary by topic (e.g., a range from "Very satisfied" to "Very dissatisfied"). All 50 of our ranking lists are tallied from students' answers to one or more of these five-point questions. The five-point grid—called a Likert scale—is the most commonly used measurement for this type of survey research: consensus-based assessment. Statisticians consider it especially accurate because it presents an equal number of positive and negative answer positions.

Once the surveys have been completed and the responses stored in our database, we tally the results. The methodology and the math we use are quite simple. Each college is given a score (similar to a GPA) for its students' answers to each multiple-choice question. These scores enable us to compare student opinion from college to college and to tally the ranking lists. Most of the lists are based on students' answers to a single survey question; some are based on answers to several questions. But all of our lists are based on our student survey results. They are the sole factors that determine which schools make it onto our 50 ranking lists and at what rank. (Note: the rankings are not based on—nor do they reflect—our opinions of the schools or how we rate them.) See The Princeton Review's Rankings Methodology for information on each of our ranking lists and the survey questions we use to tally them.
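To make the "GPA-like score" idea concrete, here is a minimal sketch of how such a tally could work. The exact formula is not disclosed in the text; this sketch assumes the simplest interpretation, in which five-point Likert answers are coded 1 (most negative) through 5 (most positive) and a school's score on a question is the mean of its students' answers, with schools then ranked by that score. The school names and numbers are invented for illustration.

```python
# Illustrative sketch only: assumes answers coded 1-5 and a score
# equal to the mean response, analogous to a GPA on a 5-point scale.

def question_score(responses):
    """Mean of five-point Likert responses (1 = most negative, 5 = most positive)."""
    valid = [r for r in responses if 1 <= r <= 5]
    if not valid:
        raise ValueError("no valid responses")
    return sum(valid) / len(valid)

def rank_schools(scores_by_school):
    """Return school names ordered by score, highest first."""
    return sorted(scores_by_school, key=scores_by_school.get, reverse=True)

# Hypothetical "How do you rate your campus food?" responses:
campus_food = {
    "College A": question_score([5, 4, 4, 3, 5]),  # mean = 4.2
    "College B": question_score([3, 3, 4, 2, 3]),  # mean = 3.0
}
print(rank_schools(campus_food))  # ['College A', 'College B']
```

Lists built from several questions would combine the per-question scores (for example, by averaging them) before ranking, but the comparison step is the same.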

The findings from our student surveys also determine the "Students Say" lists we feature in the sidebars on our school profiles. These lists report the topics on which the students we surveyed at the school showed the highest consensus of agreement.

Some questions on our student survey are open-ended and invite students to answer in their own words. Our profiles are laced with comments from students that we cull from these open-ended questions (all text in quotes in the profiles comes directly from a surveyed student). Student quotations in our profiles are not chosen for their extreme nature, humor, or unique perspective. Rather, they are chosen because they represent the sentiments expressed by the majority of survey respondents from the college. In some cases, they illustrate one side of a mixed bag of student opinion, in which case a counterpoint opinion will also appear in the text.

After students have taken our survey, we ask them to review the information we published about their school in the previous edition of our book and to grade us on its accuracy and validity. Year after year, we've gotten high marks from the folks we consider the experts on those colleges—the students attending them. About 80 to 85 percent of students have indicated our profile on their school was "extremely" or "very" accurate.

One last note: to guard against producing a write-up that's off the mark for any particular college, before we publish our Best Colleges book, we send our administrative contact at each school an advance copy of the college's profile. The administrator then has ample opportunity to respond with corrections, comments, and/or outright objections. We take careful measures to review the school's suggestions against the student survey data we collected and make appropriate changes when warranted.

To all college-bound students and parents of applicants using our site: we hope all of the information we provide in The Best 388 Colleges—our unique ranking lists, school rating scores, and richly detailed school profiles—will help you identify, apply to, get admitted to, get financial aid from, and happily graduate from … the truly "best fit" school for you.

We wish you good luck in your applications and college years ahead.