Introduction
The 2011 Poets & Writers Magazine MFA rankings consist of individual rankings for both full-residency and low-residency programs. The full-residency programs are assessed on the basis of sixteen measures, half of which are ordered rankings and half of which are unranked listings of important program features. All eight of the full-residency rankings-based measures are unscientific, though all eight are predicated upon sufficient hard data to be substantially probative. A
scientific ranking of MFA programs is not presently possible, as more than half
of the nation's full- and low-residency programs have thus far declined to make
public the necessary data (see below).
Four of the eight full-residency rankings are based upon unscientific polling of a large sample of current MFA applicants. These rankings are discussed in significant detail throughout this article. The most important of the four is the ranking upon which the ordering of the programs in the chart is based, a ranking predicated upon individual fiction and poetry applicants' varying esteem for the nation's 148 full-residency MFA programs. The remaining three poll-based "genre" rankings are essentially subsets of this first ranking, each drawing on one segment of the overall polled cohort: fiction applicants, poetry applicants, and nonfiction applicants. Programs are ordered, as with the "overall" rankings, on the basis of the number of votes received by each MFA program in that category. Polled respondents cast a "vote" by stating a present or future intent to apply to the program in question. The top fifty "overall" vote-getters (with two programs tied for fiftieth) are listed in the rankings chart, also published in the September/October 2010 print edition of Poets & Writers Magazine, with the remaining 97 MFA programs listed in "The Additional Rankings of Full-Residency MFA Programs."
As to the genre rankings, programs ranking in the top fifty in poetry and fiction are noted in both the print and online rankings charts, as are programs ranking in the top twenty in nonfiction.
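To make the tallying procedure concrete, the following is a minimal sketch in Python of how votes might be counted and ordered under this approach; the data structure, field names, program names, and genres shown are hypothetical placeholders, not data from the actual poll.

```python
# Hypothetical sketch of the vote-tallying described above; not the
# magazine's actual code. Field names and sample data are invented.
from collections import Counter

def tally_votes(respondents, genre=None):
    """Count one 'vote' per program named on a respondent's application list.
    If a genre is given, only respondents in that genre are counted, which is
    how the genre rankings act as subsets of the overall poll."""
    counts = Counter()
    for r in respondents:
        if genre is not None and r["genre"] != genre:
            continue
        counts.update(set(r["programs"]))  # each program counted once per respondent
    return counts

def top_programs(counts, top_n=50):
    """Order programs by vote total, highest first, keeping ties at the cutoff
    (e.g., two programs tied for fiftieth)."""
    ordered = counts.most_common()
    if len(ordered) <= top_n:
        return ordered
    cutoff = ordered[top_n - 1][1]
    return [(program, votes) for program, votes in ordered if votes >= cutoff]

# Hypothetical usage:
respondents = [
    {"genre": "fiction", "programs": ["Program A", "Program B"]},
    {"genre": "poetry",  "programs": ["Program A", "Program C"]},
]
overall = top_programs(tally_votes(respondents))
fiction_only = top_programs(tally_votes(respondents, genre="fiction"))
```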
The four hard data-based rankings are as follows: total funding, annual funding, selectivity, and placement. These rankings are scientific to the extent that they rank programs on the basis of quantitative data publicly released by the programs themselves, though they are unscientific to the extent that not every program has released data for every category of assessment. The rankings therefore constitute an ordering of all publicly known data rather than an ordering of all extant data. A full complement of funding and admissions data is available for approximately half of the nation's full-residency MFA programs; the remaining programs are primarily smaller, newer, lightly advertised, or nondomestic programs, or else programs with a primarily regional applicant base. Because all of these programs maintain Web sites over which they exert exclusive control, however, the absence of any specific funding or selectivity data in their online promotional materials is taken, by the rankings, as an indication that these programs fully fund less than 33% of their students and do not have an acceptance rate low enough for inclusion in the top 50 in this category (currently, a program's yield-exclusive acceptance rate would need to be less than 11.7% for it to be included in the selectivity ranking). The rankings are based in part on the presumption that it would be counterintuitive for a program providing full funding to a substantial percentage of its student body not to indicate as much in its promotional materials. Program Web sites are regularly reviewed to determine whether a program has added information to its online profile; program administrators can also e-mail the author of this methodology article to draw attention to any substantive Web site changes.
Based on the data presently available, it is not anticipated that any of those programs without a full complement of funding and admissions data available in some form online would have ranked in the top 50 in either of the two funding categories. These programs, given the incompleteness of their promotional materials, are also much less likely to attract sufficient applications to be eligible for the selectivity rankings; a program must receive at least 100 applications annually to be considered eligible for the ranking in this category. As to the placement rankings, these do not rely on programs' promotional materials or their willingness to release internal data to individual applicants or groups of applicants, so all programs nationally, both full- and low-residency, were equally eligible for a top 50 ranking.
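As a shorthand for the screening rules just described, the sketch below encodes the stated thresholds (full funding for at least one-third of students, a yield-exclusive acceptance rate below 11.7 percent, and at least 100 annual applications); the field names and data layout are assumptions made for illustration only.

```python
# Hypothetical sketch of the stated screening rules; field names are invented.
SELECTIVITY_CUTOFF = 0.117   # yield-exclusive acceptance rate needed for a top 50 slot
MIN_APPLICATIONS = 100       # minimum annual applications for selectivity eligibility

def presumed_to_fully_fund_one_third(program):
    """Programs that publish no funding figures are presumed to fully fund
    fewer than one-third of their students, per the rule described above."""
    share = program.get("share_fully_funded")  # e.g., 0.5 means half of students fully funded
    return share is not None and share >= 1 / 3

def selectivity_eligible(program):
    """A program needs at least 100 annual applications and a published,
    sufficiently low yield-exclusive acceptance rate to be ranked here."""
    rate = program.get("acceptance_rate")        # yield-exclusive
    apps = program.get("annual_applications", 0)
    return rate is not None and apps >= MIN_APPLICATIONS and rate < SELECTIVITY_CUTOFF

# Hypothetical usage:
example = {"annual_applications": 450, "acceptance_rate": 0.06, "share_fully_funded": 1.0}
print(presumed_to_fully_fund_one_third(example), selectivity_eligible(example))  # True True
```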
The overlap between those
programs ranked in the top 50 overall and those programs ranked in the top 50
in the other seven categories subject to ranking is significant. Ninety-eight
percent of the overall top 50 programs ranked in the top 50 in one or both of
the fiction and poetry genres—and
the one top 50 program that failed to achieve this status missed the cut by one
vote. Forty-four of the overall top 50 (86%) ranked in the top 50 in both poetry and fiction. In nonfiction, 20 of the top 30 programs (67%) also ranked in the overall top 50.
Thirty-two (63%) of the overall top 50 ranked in the top 50 in funding, with another seven (14%) receiving an Honorable Mention (see below for definitions). In all, 77% of the top 50 full-residency programs ranked in the top 50 for funding or received an Honorable Mention in this measure of program quality. Forty-six (90%) of the top 50 programs ranked in the top 50 in selectivity, with 36 (71%) ranking in the top 50 in placement. Of the 29% of the top 50 MFA programs that did not rank in the top 50 for placement, nearly two-thirds were hampered by the fact that they were founded in the midst of the twelve-year assessment period for this measure. Programs disadvantaged in this way include the programs at University of Wyoming in Laramie, University of Mississippi in Oxford, University of Illinois in Urbana-Champaign, University of Nevada in Las Vegas, Vanderbilt University in Nashville, Tennessee, Louisiana State University in Baton Rouge, The New School in New York City, Virginia Polytechnic Institute [Virginia Tech] in Blacksburg, and Purdue University in West Lafayette, Indiana.
In view of the above, ordering programs on the basis of their overall vote totals also had the effect of placing a special emphasis, in the rankings, on those programs that placed highest in the four hard data rankings.
In reading the rankings and
this methodology article, several principles should be kept in mind: (1) MFA
programs are not for everyone, and many poets and writers will find their
energies better spent elsewhere as they attempt to explore and augment their
existing talents; (2) no poet or writer should feel compelled to attend an MFA program, whether for reasons of employment, networking, or personal artistic improvement and achievement; (3) MFA students must remain on guard against sacrificing their unique aesthetic, political, and cultural perspectives on the altar of consensus, as MFA programs are ideally forums for an exchange of diverse opinions, not hothouses for groupthink or aesthetic dogmatism; (4) an MFA in no
way guarantees one postgraduate employment, as the MFA is a nonprofessional,
largely unmarketable degree whose value lies in the time it gives one to write, not in any perceived (and illusory) advantage it may offer in the networking,
publishing, or employment arenas; (5) in view of the preceding, it is unwise to
go into any debt for an MFA degree; (6) holding an MFA degree does not, in
itself, make one more or less likely to be a successful poet or writer, nor
should those with MFA degrees consider themselves in any respect better
equipped, purely on the basis of their degree, for the myriad challenges of a
writing life; (7) the MFA, as an art-school degree, is not time-sensitive, and
many poets and writers will find the experience of an MFA more rewarding if
they have first pursued, for several years, other avenues of self-discovery and
civic engagement; (8) the MFA rankings are not intended to increase applicant
anxiety, reduce applicants' application and matriculation decisions to a
numbers game, or define prestige as a function of pedigree rather than program
factors that genuinely enrich the lives of real poets and writers (e.g.,
funding, a strong cohort, strong teaching, a vibrant and welcoming location and
community)—instead,
their aim is to maximize the information at applicants' fingertips.
The hope is that these
rankings will better position applicants to make an important life choice, one
which must finally be made using the rankings as only a secondary resource. Specifically, the goals of these rankings and their methodology are the following: less overall student debt among MFA graduates, more transparency in the promotional materials and public
disclosures of existing MFA programs, and greater access, for applicants, to
the wealth of conventional wisdom in the MFA applicant community about which
programs and which program features are most conducive to a memorable and
valuable MFA experience. Ideally, the MFA offers aspiring poets and writers
several years of funded time to write in a mutually inspiring community; to the
extent some may see in the MFA unresolved dangers for the future of American
poetry and fiction, these rankings are as committed—in
their own way—to
the avoidance of these dangers as are those who have argued passionately for
the abolition of the MFA degree altogether. A better-funded and more
transparent national MFA system will be of greater benefit to artists in the
long run than the wholesale termination and dismantling of the system.
Cohort
In the nine months between July 15, 2009, and April
15, 2010, 527 full-residency MFA applicants were polled on the
highest-trafficked MFA-related Web site on the Internet, The MFA Blog. Founded
on August 21, 2005, this Web site received 410,000 unique visitors during the polling period (276,000 first-time visitors and 134,000 returning visitors) and 706,000 page-loads. (The site's StatCounter.com stat-counter did not become operational until August 17, 2009; consequently, the actual Web traffic during the polling period was higher than is listed here.)
The MFA Blog is a free, public, moderated discussion blog whose only requirement for viewing is access to a computer; active participation on the board requires a Google account. The site is run by American novelist Tom Kealey and a team of more than twenty designated moderators, approximately five of whom are active at any one time. The author of this article was a moderator at The MFA Blog for a portion of the polling period. Kealey himself was not an active moderator during this period. The Web site has no stated agenda other than to provide accurate and timely information about MFA programs to current and prospective applicants.
Online polling conducted in 2009 using a Google-sponsored polling application suggests that the online MFA applicant community, including the community at The MFA Blog, subscribes to the current conventional wisdom (as first laid out in the 2005 edition of Kealey's Creative Writing MFA Handbook) regarding the most important considerations in applying to and matriculating at an MFA program. Specifically, polling of more than 250 current applicants to MFA programs revealed the following:
· Asked, "Which of these is most important to your decision about where to apply?", and given the options "Location," "Funding," "Faculty," "Reputation," "Selectivity," "Curriculum," or "None of the Above," with the option to select more than one answer, the top four answers were as follows: Funding, 56%; Reputation, 45%; Location, 32%; Faculty, 18%; and
· Asked, "Why do you want to get a graduate creative writing degree?", and given the options "Credential," "Employability," "Time to Write," "Mentoring," "Networking," "Community," "Validation," "Avoid Work," and "None of the Above," with the option to select more than one answer, the top three answers were as follows: Time to Write, 55%; Employability, 43%; and Mentoring, 36%.
The Poets & Writers Magazine rankings have not, to date, used the above polling data to create a weighting system for the overall rankings. There is a presumption that applicants' own application lists best reflect the extent to which they take into account funding, location, reputation, selectivity, faculty, curriculum, and other applicant-specific factors in choosing which programs to apply to and attend.
Were the above polling used to create a weighting system for the rankings, many of the nation's most prominent and popular programs would drop from the top 50 rankings altogether. The result would be a series of rankings that poorly reflected the present national consensus on program quality. For instance, under the rankings' current methodology a popular but largely unfunded MFA program in a major urban center might yet appear in the top 50 rankings because even a low standing in the funding, selectivity, and placement categories can be counterbalanced by a program's popularity due to location. The popularity of a program's location is best reflected by privileging applicants' application lists rather than a confluence of these lists and scientifically gathered, publicly accessible hard data. To redesign the overall rankings as something other than a direct reflection of current applicant mores would be to ensure that no program that is not fully funded and/or is located in a big city (with only one or two exceptions) would appear in the overall top 50 rankings.
While current trends suggest that program popularity going forward will be directly affected by a high or low standing in the funding, placement, and selectivity categories, the pace of this trend is slowed, rather than hastened, by the current ranking methodology. Whereas a weighted ranking system focusing on hard funding, selectivity, and placement data would remove most large-cohort urban programs from the national rankings immediately, the present methodology registers the relative decline or stagnation in the popularity of such programs while ensuring that these programs have sufficient time to improve their funding, selectivity, and placement statistics before they are removed, by applicant consensus, from the top 50 altogether.
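For readers who want to see the distinction in miniature, the sketch below contrasts the vote-based ordering actually used with a purely hypothetical weighted composite of the kind rejected above; the weights, field names, and normalized scores are invented for illustration and are not part of the published methodology.

```python
# Purely hypothetical contrast; the published rankings use raw vote totals only.
def vote_based_ordering(programs):
    """The approach actually used: order programs by applicant vote totals."""
    return sorted(programs, key=lambda p: p["votes"], reverse=True)

def hypothetical_weighted_ordering(programs):
    """An invented weighted composite (all inputs assumed normalized to 0-1);
    the 0.56/0.45/0.32 weights loosely echo the funding/reputation/location
    poll percentages cited earlier. Under a scheme like this, popular but
    largely unfunded urban programs would tend to fall out of the top 50."""
    def score(p):
        return 0.56 * p["funding"] + 0.45 * p["reputation"] + 0.32 * p["location"]
    return sorted(programs, key=score, reverse=True)
```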
Polling Locus
Spring 2010 Google Web
searches for the individual terms "creative + writing + MFA," "CW + MFA," "poetry
+ MFA," "fiction + MFA," "MFA + questions," "creative + writing + MFA + blog,"
and "MFA + blog" returned The MFA Blog as the top worldwide hit in each
instance. Several other contemporaneous searches returned the site among the top five worldwide hits: "MFA + program"; "MFA + applicant"; "MFA + application"; "nonfiction + MFA"; "MFA + resource"; and "MFA + response + times." Given the visibility of the site to MFA applicants researching programs online, the extended
duration of the polling period, and the regularity with which the polling
question regarding applicants' application lists was posed, a correlation is
presumed between that group of MFA applicants who used online research tools
during the 2009–10 application cycle, and that group of applicants at least
casually conversant with The MFA Blog. Tom Kealey, the proprietor of The MFA
Blog, is also the author of the top-selling MFA-related book in the United
States, per Amazon sales statistics recorded during the polling period. This
book, The Creative Writing MFA Handbook, prominently features the Web address for The MFA Blog. Consequently, even those who conducted their MFA research via print publications were
arguably likely to come across the Web address for The MFA Blog during the
course of their reading. Indeed, as Kealey's book is the only print publication
on the American or international market that profiles individual full-residency
MFA programs in detail, it has become nearly ubiquitous in the MFA applicant
community.
Individual users on The MFA Blog were distinguished by their user accounts, and substantial additional measures were taken to prevent duplicate submissions. During the polling period the number of individual accounts active on The MFA Blog was between 1,000 and 1,500, which suggests that the present polling's 527-person cohort represents between one-third and one-half of all active patrons on the site during the nine-month period in question. The high unique visitor count cited above is partly explained by the presence of an unknown number of nonposting members on the site, and partly by the fact that even the most respected stat-counter services will sometimes read returning users as first-time users, depending upon an individual user's privacy settings with respect to IP-recognition cookies.
Polled applicants were asked to list the programs to which they had applied or intended to apply, and were permitted to adjust these lists during the polling period. Fewer than 10% of poll respondents elected to do so.