Data Sources
For those program measures not subject to applicant polling, such as the rankings and recitations of admissions, curricular, and funding data, only data publicly released by the programs—whether to individual applicants or groups of applicants, in a program's promotional literature, or via a program Web site—have been included in the rankings chart. All data were updated regularly to reflect programs' most recent public disclosures.
Many of the nation's full- and low-residency MFA programs decline to publicly release internal data. In 2007, between 47% and 61% of the nation's MFA programs declined to answer individual questions on an AWP questionnaire seeking admissions and funding data from member programs. Specifically, 47% of programs declined to reveal how many assistantships they offered annually to incoming students; 61% declined to reveal the stipend offered to teaching assistants; 56% declined to reveal whether they offered a full tuition waiver to teaching assistants; 49% declined to reveal how many scholarships were offered to incoming students; 55% declined to reveal their annual number of applicants; and 52% declined to reveal the size of their annual matriculating class. Compounding the incompleteness of the AWP survey was the fact that the Association did not distinguish between low-residency and full-residency programs. Given that low-residency programs do not offer teaching assistantships (as low-residency students are only on campus during brief residencies), this omission was a critical one. Likewise, because AWP surveys are sent only to AWP members, and AWP has previously indicated in public disclosures that 33% of U.S. creative writing programs are not AWP members, the 2007 survey's polling cohort (142 MFA programs) was missing as many as 71 potential respondents.
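To make the final figure above concrete, the arithmetic can be sketched in a few lines of Python (a rough check only; the 142-program cohort and the 33% nonmembership rate are the inputs cited above, and the rounding is ours):

    # A rough check on the "as many as 71" figure above; inputs as cited, rounding ours.
    surveyed_members = 142    # MFA programs in the 2007 AWP survey's polling cohort
    nonmember_share = 0.33    # share of U.S. creative writing programs not in AWP

    # If the 142 surveyed programs represent the AWP-member share (67%),
    # the implied national total and the number of unreachable programs are:
    estimated_total = surveyed_members / (1 - nonmember_share)   # ~212 programs
    missing = estimated_total - surveyed_members                 # ~70 programs

    # Treating 33% as roughly one-third, nonmembers number half of members:
    print(round(missing), surveyed_members // 2)   # prints: 70 71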
Programs unable or unwilling to release data regarding their funding and admissions processes are necessarily disadvantaged by a ranking system that promotes and rewards transparency. Yet no program that fails to release these data for applicants' consideration can avoid being judged, by applicants and other observers, through the lens of such nondisclosures. Because research for these rankings is based entirely on publicly available, publicly verifiable data, (1) the accuracy of the data upon which the rankings are based can be readily confirmed by any party, and (2) programs can easily optimize their involvement in the rankings by ensuring their applicants have access to all of the data prospective students generally require in making application and matriculation decisions.
Programs were not contacted directly for these rankings for a variety of reasons: (1) as indicated above, past attempts by AWP, the national trade organization for creative writing programs, to secure even bare-majority participation by its member programs in a nationwide data-disclosure project were unsuccessful (and AWP member programs presumably owe more, not less, institutional fealty to AWP than to any independent nonprofit or freelance journalist); (2) the human resources required to track down internal admissions data for nearly two hundred MFA programs, many of which do not wish to release such data, would likely be prohibitive for any independent nonprofit organization or freelance investigative journalist; (3) to the extent the present rankings seek to actively promote program transparency, it would be counterproductive for the rankings to reward programs willing to selectively leak data to members of the media through private channels, but not, via publicly accessible channels, to the public at large; and (4) unless 100% compliance with a nationwide data-disclosure project could be ensured, any attempt to reach programs individually—rather than place the responsibility for disclosure of admissions, curricular, and funding data on the programs themselves—would necessarily favor those programs researchers were able to contact successfully. This would place the onus for proof of "equivalent due diligence" (as to each program) on researchers rather than where it belongs: on the programs themselves. The programs, not their assessors, are the "bearers of least burden" with respect to due diligence in the release of these data, as they only stand to benefit from increased transparency and are entirely in control of their internal data and program Web sites at all times.
Structure
Low-residency programs were measured in eight categories, six of which are rankings—four employing unscientific but probative polling of the sort described above, and two based upon publicly available hard data. Low-residency programs have not been assessed with respect to their funding packages because these programs generally offer little or no financial aid to incoming students; low-residency programs presume that their applicants will continue in their present employment during the course of their studies.
Cohort
Over the course of three successive application cycles, a total of 195 low-residency applicants were polled as to their program preferences, with these preferences exhibited in the form of application lists. The locus for this polling was the Poets & Writers online discussion board, The Speakeasy, widely considered the highest-trafficked low-residency community on the Internet. Three factors account for the relatively small cohort used for this polling: (1) the annual applicant pool for low-residency programs is approximately one-eighth the size of the full-residency applicant pool (see below); (2) low-residency applicants do not congregate online in the same way, or in the same numbers, that full-residency applicants do; and (3) low-residency programs are subject to a "bunching" phenomenon not evident with full-residency programs, with only eight programs nationally appearing on even 10% of poll respondents' application lists, and only three appearing on 20% or more. For this reason, only the top ten low-residency programs have been included in the rankings (also available in the September/October 2010 print edition of Poets & Writers Magazine); below this level it is difficult to draw distinctions between programs, as none received a significant number of votes over the three years polling was conducted.
One explanation for the bunching phenomenon described above may be that low-residency programs are less susceptible to comparison than full-residency programs, as many of the major considerations for full-residency applicants, including location, funding, cohort quality, class size, duration, and cost of living, are not major considerations for low-residency applicants due to the structure and mission of low-residency programs. Generally speaking, low-residency programs are assessed on the basis of their faculty and pedagogy, neither of which is conducive to quantification and ranking. That three programs have such a clear advantage in the rankings over the other 43 operating in the United States, Canada, the United Kingdom, and China is a function of both the relatively recent development of the low-residency model (with older programs tending to be more highly regarded, though none dates to before 1976) and the consensus that appears to have existed for years that three programs in particular are strongest in terms of faculty, selectivity, and placement. It is worth noting, too, that a significant number of the world's 46 low-residency MFA programs were founded within the last eight to ten years; applicant familiarity with these programs may still be relatively low.
The three-year low-residency polling described above has been further broken down into year-by-year poll results. The cohort for the 2009–10 annual ranking was 88, for the 2008–09 ranking 55, and for the 2007–08 ranking 52. If and when individual account-users applied to programs in more than one admissions cycle, their application lists from each cycle were treated as separate slates of votes; repeat applicants accounted for less than 10% of the polling cohort, however. Full-residency applicants on The MFA Blog who applied to one or more low-residency programs as part of their overall slate of target programs (see "Structure" and "Cohort" under the header "Full-Residency Rankings," above) were also included in the low-residency voting; due to the exceedingly small number of such votes, these entries were manually compared both to one another and to existing low-residency application lists to ensure duplicate lists were avoided.
While polls with larger cohorts are, all other things being equal, more reliable than those with smaller ones, the fact that the annual applicant pool for low-residency programs is likely between 400 and 500 (see below) suggests that even the 2007–08 single-year low-residency rankings polled a substantial percentage of all applicants nationally during that application cycle. Moreover, as is the case with the full-residency rankings, cross-checking applicant vote totals across a period of three years reveals substantial consistency in the results and quickly unearths any significant anomalies or outliers. Of the ten low-residency programs listed in this year's print rankings, eight (80%) ranked in the top 10 in all three years of polling, while another was in the top 10 for two of the three application cycles studied. All of the programs in the top 10 achieved at least an Honorable Mention (a ranking between 11 and 15) for all three of the years in which low-residency applicants were polled.
An "N/A" notation signifies that a program has not released the requisite data. An asterisk indicates that the program is unranked in that category. Only five low-residency programs achieved a positive score in the national placement ranking, which considered placement data for full- and low-residency programs in a single assessment: Vermont College of Fine Arts in Montpelier (#17 nationally); Warren Wilson College in Swannanoa, North Carolina (#38); Bennington College in Vermont (#41); University of Alaska in Anchorage (#46); and Queens University of Charlotte, North Carolina (#53). In order to better acknowledge the achievement, in the placement category, of these five low-residency programs relative to their low-residency peers, and in recognition of the fact that low-residency graduates are substantially less likely to seek postgraduate fellowships (largely because they do not give up their present employment when they matriculate), the rankings above have been re-constituted as low-residency-only: Vermont College of Fine Arts, #1; Warren Wilson College, #2; Bennington College, #3; University of Alaska, Anchorage, #4; and Queens University of Charlotte, #5.
Due to the still relatively small number of low-residency programs in the United States and abroad, only programs receiving top 10 placement in any category of assessment have received a special notation in either the print or online editions of the rankings.
National Low-Residency Applicant Pool
A realistic estimate of the annual number of low-residency MFA applicants is 400. This estimate is based in part on the fact that the five most-applied-to low-residency programs receive an average of 144 total applications per year; in contrast, the five most-applied-to full-residency programs receive an average of 1,137 applications per year in fiction and poetry alone. If this comparison is any guide, approximately eight times as many individuals apply to full-residency programs as to low-residency programs each year, suggesting a mean low-residency applicant pool, per year, of just over 400. This figure can then be cross-checked using the number of votes for Warren Wilson College in the present low-residency rankings (79), the total number of low-residency votes cast for the rankings (195), and Warren Wilson's publicly released annual applicant-pool size (200). Using these figures, one would expect an annual national low-residency applicant pool of 494. The only other low-residency programs for which all of these data are available and may be considered reliable are Bennington College (whose data suggest an estimated 488 annual low-residency applicants) and Lesley College (598).
In view of the above, the three-year, 195-person sample used for this year's low-residency rankings likely represents between one-third and one-half of an annual applicant cohort for this type of residency program.
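Both estimation methods described above reduce to one-line calculations. A minimal sketch in Python, using only the figures already cited (the 144 and 1,137 application averages, and Warren Wilson's 79 of 195 poll votes against its 200-applicant annual pool):

    # Sketch of the two low-residency applicant-pool estimates described above.

    # (1) Ratio method: full-residency vs. low-residency application volume.
    full_res_avg = 1137   # avg. fiction/poetry applications, five most-applied-to full-res programs
    low_res_avg = 144     # avg. total applications, five most-applied-to low-res programs
    ratio = full_res_avg / low_res_avg    # ~7.9, i.e., roughly eight to one

    # (2) Vote-share method: scale a program's known applicant pool by its poll share.
    def national_pool(program_pool, program_votes, total_votes):
        # If a program drew program_votes of total_votes poll votes and its own
        # annual pool is program_pool, the implied national pool is:
        return program_pool * total_votes / program_votes

    warren_wilson_estimate = national_pool(200, 79, 195)   # ~494, as stated above

    print(round(ratio, 1), round(warren_wilson_estimate))  # prints: 7.9 494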
Added to the adjusted mean for annual poetry, fiction, and nonfiction applicants, the estimate for the annual number of low-residency applicants suggests a total annual applicant pool to creative writing MFA programs—across all genres and types of residency, and gauging discrete applicants only—of somewhere between 4,000 and 5,000.
Cohort
Between July 15, 2009, and April 15, 2010, 346 fiction applicants were polled for the fiction-genre rankings, 141 poetry applicants were polled for the poetry-genre rankings, and 101 nonfiction applicants were polled for the nonfiction-genre rankings. The reason for the disparity between the total number of fiction and poetry applicants in the genre-specific polls (487) and the total number of votes in the overall fiction and poetry poll (527) is that 40 applicants, or 7.6% of the cohort polled in fiction and poetry, did not specify their genre—though it was clear from their application lists that the genre in which they applied could not have been nonfiction (because the majority of MFA programs do not offer nonfiction tracks, an applicant who specifies that he or she has applied in only one genre, but who lists certain programs on his or her application list, can be precluded from consideration as a nonfiction applicant). One consequence of this 7.6% nongenre-reporting population is that certain programs are tied in the overall rankings even though, by virtue of their rankings in the two major genres, this would seem to be a statistical impossibility.
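The vote bookkeeping in the preceding paragraph can be verified in a few lines of Python (a sketch using only the totals given above):

    # Reconciling the genre-specific and overall poll counts cited above.
    fiction_votes, poetry_votes = 346, 141
    overall_votes = 527    # votes in the overall fiction-and-poetry poll

    genre_specified = fiction_votes + poetry_votes    # 487 applicants
    unspecified = overall_votes - genre_specified     # 40 applicants
    share = unspecified / overall_votes               # ~0.076

    print(genre_specified, unspecified, round(100 * share, 1))  # prints: 487 40 7.6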
The cohort sizes used in this polling are roughly consistent with the national distribution of MFA applicants by genre, as revealed by those few programs which both (1) accept applicants in all three genres, and (2) release their internal admissions data for all three genres. The national distribution of fiction, poetry, and nonfiction applicants is approximately 6 to 3 to 2, respectively.
Due to the still relatively small number of nonfiction programs in the United States and abroad, only programs receiving top 20 placement in the genre have received a special notation in either the print or online editions of the rankings. No Honorable Mentions have been awarded, for the following reasons: (1) the relatively small number of votes for programs ranked beyond twentieth in the genre, all of which appeared on fewer than 10% of nonfiction applicants' application lists; (2) a bunching phenomenon in the nonfiction rankings, such that any presumptive Honorable Mention section of the nonfiction rankings (programs ranked between 21 and 25) would include nine programs, making the Honorable Mention section nearly half the size of the rankings proper; and (3) the fact that there would be little statistical distinction, that is, two votes or fewer, between the nine presumptive Honorable Mention programs and the six programs ranked behind them—a smaller disparity, out of a cohort of 101, than the three-vote difference between the top 50 and Honorable Mention sections in the 527-cohort full-residency rankings.
Programs without a nonfiction track are designated, in the top 50 rankings, with an em-dash (—).
Comments
morescotch replied:
Stop Publishing These Rankings
Dear Poets & Writers,
I can't believe Poets & Writers is going to keep publishing these ridiculous rankings. First of all, doesn't it occur to anyone that the values of a group of people who frequent an MFA blog might not be the same as the values of the general MFA community? There's no way to tell how good a program is going to be by staring at a hundred program websites and comparing their funding packages, which is what a group of people answering polls on a blog are doing. You shouldn't apply to an MFA program in order to become a person funded by an MFA program; you should apply to an MFA program to become a better writer. And this emphasis on "time to write" is flawed. I'm from Hartford, CT. You want time to write, move to Hartford. You can rent a one-bedroom for $250 a month and write all the time. Good teachers. A good community. These are what a person should look for in an MFA program, and Seth Abramson is never going to point you toward that. Please stop legitimizing his preposterous internet fetish. Let's go back to when we admitted that this was something you couldn't rank.
Samuel Amadon
sethabramson replied:
Hi Samuel,
You're absolutely right in thinking that the values of the (total) annual national applicant pool are not those of the nation's largest (or, really, any) online community of MFA applicants; the article above (pp. 1-2) emphasizes this point several times and in several different ways. The goal of the polling, which is only one portion of the ranking system as you know, is to measure only the attitudes of those who pool their resources and knowledge when applying to MFA programs by participating in a community of fellow applicants -- those less likely to do so are also less likely to enjoy positive outcomes with respect to the first of the primary goals of the rankings (pg. 1, above: "Specifically, the goals of these rankings and their methodology are the following: Less overall student debt among MFA graduates, more transparency in the promotional materials and public disclosures of existing MFA programs, and greater access, for applicants, to the wealth of conventional wisdom in the MFA applicant community about which programs and which program features are most conducive to a memorable and valuable MFA experience"). You're also absolutely right to say that polling can never offer a complete picture of program quality--that's why the article above says (pg. 1) that the matriculation decision "will finally be made, and must be made, using the rankings as only a secondary resource," why it does not attempt to measure "faculty and community" (two unmeasurables both you and the article agree are not quantifiable) directly but uses applicants' application decisions as an indirect reflection of word-of-mouth about both, and why a good portion of the rankings are assessments of publicly-announced, hard-data program features like funding, selectivity, and postgraduate placement. The first measure is aimed at helping applicants avoid unnecessary, crippling debt, which was rampant among applicants before programs' funding information received national release via a single ranking methodology, and the second two hard-data measures aim at helping applicants gauge prospective cohort quality (an imperfect science, one reason the rankings are often cited as "unscientific" in the article above; still, "cohort quality" being one vital element of "community," this does strike at the heart of what you've termed the key to the MFA application/matriculation decision). I know you went to Columbia, as I'm familiar with and enjoy your work, and I think the key for you, as for anyone, is to simply ask whether you enjoyed your experience there and found it, on balance, worthwhile--if so, and I've no reason to think or guess otherwise, the rankings are admittedly of no relevance, as they're not aimed at/toward current students or graduates but only future applicants whose MFA years may still lie ahead. The hope is that future applicants to Columbia (or anywhere else) will be able to use the rankings to get hard data on funding, selectivity, and postgraduate placement, even if they decide the polling portion of the rankings is not helpful to them--though as the article above details (pg. 1) the correlation between what the hard data tells us about program features that affect real lives, and what applicants are saying about where they want to apply, is intimately linked. This suggests that applicants are now able and inclined to use information to make application and matriculation decisions, rather than rumor and guesswork. 
I can't imagine willingly going back to a time when such an important decision was made without the benefit of even the "secondary resource" of information. If you (I mean the generic "you" here) didn't decide where to attend college without the benefit of information, why apply to an MFA that way, especially when it's an unmarketable degree that it's financially dangerous to go into debt for, unlike the B.A.? The response to the rankings among applicants has been overwhelming--more than 98% positive. Those who are not applicants may tend to misunderstand the rankings because, at base, the rankings are not geared toward meeting the needs or interests of those who are not applicants (i.e., whose futures in no way depend on or involve an MFA-related decision). It is much easier to dismiss all the research and information contained in the rankings when one does not need that research or information; those who do need it are saying, en masse, that it is enormously profitable for them to have it, and that's why it keeps getting national release. Again, read pg. 1 above if you have any additional questions about the underlying principles behind, and/or the aim of, the rankings. It's spelled out fairly explicitly there. In any case, I'm glad you wrote in, because these are important questions and concerns. And (side note) congratulations on your recent book! Best wishes, Seth
morescotch replied:
Also
AWP's got an even better explanation for why you shouldn't listen to this guy.
http://guide.awpwriter.org/rankings.php
sethabramson replied:
Samuel,
If you believe that the best writers always make the best teachers; that the aesthetics of a writer determine his or her in-class pedagogy; that an artist of one aesthetic inclination is temperamentally incapable of working productively with an aspiring artist of an entirely different bent; that applicants can conclusively determine, through sheer force of will, which poets and writers (all of whom are individuals they've never met) will be most helpful to their future development as artists... in that case, yes, David's argument might have some purchase. But we'd have to assume that you also cared little about accruing crippling student debt or attending a program with a strong cohort of artists, weren't at all interested in how large, how long, how student-teaching-intensive, how studio-intensive, and how focused on faculty teaching (cf. student-to-faculty ratio) your prospective program would be, and had time to research 200+ programs in grave detail rather than relying on massive online communities where others charitably contribute, for free, such intelligence. Granted, I don't know of any MFA applicant who fits this description--and I've had contact with literally thousands since 2006--but if I do come across any I will pass along the link. The point is, the rankings are the product of a community, and implicitly promote that community; David's comments mention some undoubtedly important considerations in choosing an MFA--and I endorse such considerations wholeheartedly--but nowhere can one find better discussions of such considerations than the polling locus used by the P&W rankings. It's not a coincidence. In any case, hopefully at some point in the future there'll be a possibility of discussing this more responsibly and decently (cf. "this guy"); David knows, I think, that the views he's attributed to me are not mine, and that I've said, from the start, and quite publicly, and repeatedly, that it would be foolish for any person to make an application or matriculation decision purely or largely on the basis of rankings. The difference between me and David is that I think artists are fiercely independent-minded enough to actually do this; meanwhile, David's concern on this score has somehow morphed into A) a categoric opposition to rankings (don't misunderstand his comments; at the time AWP vehemently opposed the very methodologies David's now implicitly endorsing, i.e. those of USNWR and The Atlantic), and B) a brand of advice -- as mystical as it is misleading -- which endows MFA applicants with powers of perception and prediction not even the best artists among us could possibly lay claim to. Be well, Seth
stovedore replied:
Unfortunate
It's unfortunate that P&W would continue to back flawed methodology and a writer whose logorrhea is well-documented (just check out Abramson's responses to pithy statements in this comment section). The first word in the title of the magazine is "Poets" (which Abramson professes to be!) but this ranking, and the sheer amount of insecure writing done to back it up (funding...funding...funding...), is so far removed from anything poetic, or even useful to a writer or human being. Yes, this article, this ranking, this comment thread will get the clicks and eyeballs that P&W wants (and probably needs), but is it worth it?
sethabramson replied:
Stovedore,
I'm sorry you feel that way, and sorry also for my long-windedness. This is a complicated issue, and I'll admit that I balk when folks approach it only superficially. Any good faith discussion of the subject would need to be more exhaustive than the sort of pith that finds favor in our drive-by online exchanges -- all too many of which, like your own note, are peppered with irrelevant personal attacks. (These don't help a single applicant.) If you're curious about my poetry I hope you'll check it out, it's readily available -- and I can assure you, from personal experience, that there's more than enough time and space in the world for both writing poetry and providing a public service for young, under-resourced applicants to MFA programs. Cheers,
S.
seelo replied:
The nonsense continues...And I really admired P&W at one time.
sethabramson replied:
Seelo,
Sorry you feel that way. Be well,
S.
sputnik replied:
MFA rankings
According to the magazine, somewhere on the website there's a complete listing of all MFA programs, domestic and international. Can't find it. What's the URL? Thx.
sethabramson replied:
Hi Sputnik,
These two links -- (http://www.pw.org/content/2011_mfa_rankings_the_top_fifty_0) and (http://www.pw.org/content/2011_mfa_rankings_the_additional_rankings_of_fullresidency_mfa_programs) -- together constitute the largest and most complete listing of domestic and international full-residency MFA programs (as opposed to M.St., M.A., or M.Phil programs) available online or in print. In fact, every full-residency MFA program domestically or internationally that advertises itself is believed to be contained somewhere on these two lists.
Best,
Seth
bretquinn replied:
rankings
Seth, I first found your rankings last year, and couldn't wait for this year's. They are an integral component of my MFA quest. Thank you so much for the time and effort put into the database. Having so much useful information gathered in one place is an inestimable help.
rgarciasr replied:
MFA's
There is no doubt that you do great work with Full Residency MFA Programs. What about Low Residency? Don't they deserve some attention as well?
sethabramson replied:
See here:
http://www.pw.org/content/2011_mfa_rankings_the_top_ten_lowresidency_programs
At that link, there's also a link to a listing of the additional 36 low-res programs in the U.S. and abroad. And if you read the methodology article (see sidebar) it covers low-res programs as well (there's a separate section). Plus these programs are mentioned in my articles in the print edition of the magazine. Hope you find them helpful! Best,
Seth
CarvingCarver replied:
Doing a Classical Argument on this subject
I am an undergrad, a really low undergrad (sophomore), and I want to get an MFA. Your rankings have helped me make a decision about where to apply and to know that I need an MA in something else. I am thinking of being an editor if I can't make it writing, because let's face it, few can. For an unbiased classical argument, I need as many facts as I can get, and your article helps. How much influence do you have on the rankings? This is a real help, and as someone who lives below the poverty level currently, I appreciate your rankings. Perhaps the naysayers have money that they can throw around, but I struggle, and your rankings have helped me decide. I also look for faculty and community. Those are my top criteria. But for anyone to say that they may dismiss P&W because of these rankings is missing the entire point of P&W. It is a side endeavor. And they should know poor folks like myself rely on such thoroughness.
CW Dad replied:
MFA & PhD versus MA & PhD
Seth - My son is a College Junior with a postgraduate goal of getting his PhD in Creative Writing. I have to admit I am somewhat confused about the benefits of an MFA along with a PhD. From what I've seen, both of these are thought of as terminal degrees. So my question is: is it advantageous to get the MFA over the MA if the intention is to get your PhD? Also, there are only about 35 colleges in the U.S. that offer a PhD with a Creative Dissertation. Are there any rankings of these schools?
Seth Abramson replied:
Hi CWD,
Unfortunately no ranking of CW Ph.D. programs has been possible thus far due to a lack of data, but I'm hoping that will change soon. Suffice to say that you can expect the programs at University of Southern California, University of Houston, Florida State University, University of Denver, and University of Illinois at Chicago to be in the top 10, and likely also (though with less definite assurance) University of Georgia, University of Missouri, and University of Utah. CW Ph.D. programs are slightly more likely to accept applicants with MFA degrees, I feel, so in that sense an MFA may be preferable to an M.A., but generally you're absolutely right--both are terminal degrees, and one doesn't need more than one terminal degree technically (though with today's CW job market it really couldn't hurt), so one could certainly get an M.A. if one wanted to go on and get a CW Ph.D (or as more and more folks are doing, get a terminal CW MFA and then a terminal non-CW English Lit Ph.D.). The question I'd ask, though, is this: Why get an M.A. over an MFA? Why not get the terminal degree instead, in the event something unexpected happens (for instance one hits one's own personal comfort "limit" as to student loan debt, one suddenly can't move from one's current location for personal/family reasons, etcetera)--that way, one would already have a terminal degree, whereas if all you're holding is an M.A. when additional schooling becomes impossible you now have zero terminal degrees. Also, graduate school admissions in CW work almost entirely off one's portfolio, and the MFA gives one more time, generally, to work on one's thesis (and thus, by extension, one's CW Ph.D. portfolio) than an M.A. does. So one's chances of ending up in a top CW Ph.D. are better, for that reason also, following an MFA. I think the reason many MFA grads get a CW Ph.D. is not because of some added practical value--there's no proof yet it really affects one's job prospects, and there are no signs the CW Ph.D. is becoming the new CW terminal degree as some say (there's been almost no growth in the number of such programs in the past decade, whereas there have been maybe 40 new MFA programs over that time)--but because it gives one more teaching experience, more time to write and publish, more time in a supportive community of fellow artists, and so on. And yes, in a "tie-breaker" employment-related situation it might break a tie between two job candidates. Hope this helps, and best of luck to your son! --S.