I am on occasion asked why the nation's MFA programs were not simply contacted directly, by mail or telephone, to fill in what few gaps exist in the table of data that follows. Apart from the questions of time and resources involved, the important point to make here about MFA programs is that they have not traditionally been particularly forthcoming about program particulars—indeed, one reason for the rankings' reliance on programs' promotional materials is to encourage programs to view their own, self-generated online documentation as the best venue for informing prospective applicants of program features. And in the past three years the rankings have been extremely effective in this regard.
But efficiency and the promotion of transparency are not the only reasons for relying primarily on programs' online materials in collecting data; program Web sites are also, in the Internet age, the great equalizer among the nation's graduate creative writing programs. Every program has one and, just as important, every program has exclusive control over content. Telephone- and mail-based surveys necessarily advantage those programs that respond—and while that may seem like rough but fair justice to some, the purpose of these rankings is not merely to reward those programs that support the rankings through active engagement but also to encourage all programs everywhere to better serve their applicants and students.
My belief in this approach has only been confirmed by several fruitless attempts to contact individual programs regarding some omission or ambiguity on their Web sites; while from time to time I encounter a program responsive to such queries, more commonly they are met with either no response or a terse dismissal. The hundreds of e-mails I've received from MFA applicants over the past three years reveal that they've had similar experiences in trying to coax information from programs.
Only putting something at stake in a program's nonresponse—only letting the programs know that they are being assessed not merely on their quantifiable aspects but also on their transparency—has led to any significant movement on this score. To those among the faculty and staff of the nation's programs who question this methodology, the response, then, is clear enough: Adding missing data to online promotional materials has the triple benefit of being entirely within program control, being likely to benefit the program in the now-annual, highly popular MFA rankings, and being necessary to a better-informed and therefore more satisfied applicant pool and student body.
In the table that follows the individual programs are ranked on the basis of their votes in the online poll; these votes have then been broken down by genre, resulting in a genre-specific ranking for each program. Each program's placement in four feature-specific rankings is also included. Because there are 140 full-residency MFA programs in the United States, any school whose numerical ranking is in the top fifty in any of the ranked categories—the overall rankings; rankings in the poetry, fiction, or nonfiction genres; or the rankings by funding, selectivity, and postgraduate placement—should be considered exceptional in that category. Practically speaking, what this means is that all the programs listed in the table are exceptional programs in one or (frequently) many more than one area.
In rare instances, unavailable data, as well as data that falls beneath the cutoff for inclusion in the table, is denoted by special markings. For instance, fewer than half of all full-residency programs offer the nonfiction genre, and so in that category an asterisk (indicating that the program does not offer nonfiction) appears alongside many programs. In the funding, selectivity, and placement rankings, an asterisk indicates that a program is unranked in that category. The poetry, fiction, and funding categories include a special notation—a plus sign—to indicate programs that received honorable mention (programs ranked between fifty and sixty nationally).
Comments
jelhai replied on Permalink
Low-residency programs
Seth Abramson wrote: "Generally speaking, low-residency programs do not offer much if any financial aid, cannot offer teaching opportunities to students,...are less likely to be gauged on the basis of their locales (as applicants only spend the briefest of periods on campus), and, because their faculties are part-time, are more likely to feature star-studded faculty rosters."
Given that hundreds, surely thousands, of people DO apply to low-residency programs each year, doesn't that suggest that many of the qualities measured in these rankings are unimportant to a significant number of students? And what is the basis for asserting that low-residency faculties are more star-studded than others? Even if it were true, how would it matter?
Finally, don't rankings merely offer a lazy shortcut to school selection, perpetuating the myth that some programs are inherently better than others, when prospective students would benefit most by finding the program that is best suited to their individual aims and needs? You may not intentionally provide these rankings as a template for school selection, but you can bet that many people will foolishly use them that way, just as people use the US News & World Report rankings.
Seth Abramson replied on Permalink
Re:
Hi Jelhai,
You're absolutely right that the hundreds (not thousands; the national total is under 2,000) of aspiring poets and fiction writers who apply to low-residency programs annually are, generally speaking, a very different demographic than those who apply to full-residency programs: they tend to be older, they are more likely to be married and/or have children, they are more likely to be professionals (i.e., have a career rather than a job), they are more likely to be (only relatively speaking) financially stable, and they are more likely to have strong personal, financial, or logistical ties to their current location (hence the decision to apply to low-res programs, which require minimal travel and no moving). That's the reason this article did not contemplate low-res programs, in addition to the reasons already stated in the article. So when the article makes claims about MFA applicants, yes, it is referring to full-residency MFA applicants. Assessing low-residency programs and their applicants would be an entirely different project, requiring a different assessment rubric as well as--as the article implicitly acknowledges--a different series of first principles about applicant values.
As to the rankings that are here, keep in mind that what you're seeing is an abbreviated version. The full version, available either in the upcoming print edition or as an e-book (available for purchase on this site), includes data categories for each school: duration, size, funding scheme, cost of living, teaching load, and curriculum focus (studio or academic). These are some of the most important "individual aims and needs" the hundreds and hundreds of MFA applicants I've spoken with over the past three years have referenced. Indeed, I've even done polling (the first-ever polling of its kind) to ask applicants what they value most in making their matriculation decision: in a recent poll of 325 MFA applicants (where applicants could list more than one top choice), 59% said funding was most important, 44% said reputation (e.g., ranking), 34% said location, 19% said faculty, and much smaller percentages said "curriculum" and "selectivity."
These rankings (and the article above) specifically urge applicants to make their own decisions about location, but provide ample information about funding, reputation, curriculum, and selectivity--four of applicants' top six matriculation considerations. Needless to say, many applicants will have "individual aims and needs" that they need to consider in making their matriculation decision, and I always urge them to look to those needs with the same fervor they consider (as they do) funding, reputation, location, and so on. But to imply these rankings haven't done the necessary footwork to ask applicants what their primary aims and needs are is simply incorrect. In fact, in the poll referenced above applicants were given the opportunity to vote for "none of the above"--meaning, they were invited to say that their top consideration in choosing a school was something other than the six categories referenced above. Only 1% of poll respondents chose this option. So when we speak casually of "individual aims and needs," I think we need to remember that these aims and needs are no longer as unknowable as they once were--largely due to efforts like the one that produced these rankings. And again, for those who don't see their own aims and needs reflected in the data chart that accompanies this ranking (and which you haven't seen yet), I say--as I always say--that these rankings and this data should be used only as a starting point for making an intensely personal and particularized decision.
Take care,
Seth
Seth Abramson replied on Permalink
Re:
P.S. I should say, too, that the poll I mentioned above is just one of many. Another poll (of 371 applicants, where applicants could pick more than one first choice), showed that 57% of applicants have as their top "aim" getting funded "time to write," 42% say employability (i.e. the degree itself), 36% say mentoring (which causes them to primarily consider program size, as program size helps determine student-to-faculty ratio), 34% say "community" (which again causes applicants to consider program size, though it pushes many of these applicants to consider larger programs, i.e. larger communities), 19% say "the credential" (again, as represented by the degree itself, though this also pushes such applicants to favor shorter programs, with a lower time-to-degree), and much smaller percentages said that they wanted an MFA to validate themselves as writers or to avoid full-time employment (very similar to wanting "time to write," per the above, just as "validation" is intimately related to "mentoring" and "the credential"). Again, these polls were not intended to be exhaustive, though it's noteworthy that 0% of poll respondents chose "none of the above."
clairels replied on Permalink
Suspicious
A graduate of Harvard Law School and the Iowa Writers' Workshop
I'm not accusing anyone of anything, but you have to realize how suspicious this looks.
Seth Abramson replied on Permalink
Re:
Hi Clairels,
I'd respond to your comment, but honestly I have absolutely no idea what you mean to imply or what your concern is. I attended both those programs (J.D., 2001; M.F.A., 2009), and certainly don't regret either experience.
Take care,
S.
Seth Abramson replied on Permalink
P.S. I think it was the
P.S. I think it was the reference to HLS that threw me. If you're talking about my IWW affiliation (as I now see you might be), I don't know what to tell you except to say that you won't find a single person who's well-versed in the field of creative writing who's surprised by Iowa's placement in the poll--a poll that was taken publicly and with full transparency, and whose results are echoed in/by the 2007 poll, the 2008 poll, the (ongoing) 2011 poll, USNWR's 1996 poll, and the 2007 MFA research conducted by The Atlantic. Iowa has been regarded as the top MFA program in the United States since the Roosevelt Administration (1936). In three years of running MFA polls I'll say that I think you're the first person to suggest to me (even indirectly) that Iowa might have finished first in the poll for any reason other than that it finished first in the poll (to no one's surprise). So no, I can't say that I see my affiliation with the IWW--an affiliation I share with thousands of poets (Iowa graduates 250 poets every decade)--as "suspicious." --S.
sashanaomi replied on Permalink
Other factors: health insurance
Since Seth Abramson is considering cost of living and funding, I think he should consider another, really huge factor: Does the school offer health insurance? There are some very highly ranked CUNY programs. Yes, CUNY is cheap, but there is no health insurance. If you really want to commit to a writing program, you don't really have time for a full-time job with health benefits. Health insurance was a big factor in my selection, and I'm sure it is for many others as well.