2013 MFA Index: Further Reading

by Seth Abramson
From the September/October 2012 issue of Poets & Writers Magazine

Avoidance of Respondent Bias

The most extensive program assessment system in the United States, the set of higher education surveys published annually by U.S. News & World Report, produces assessments almost exclusively from individuals with no firsthand experience attending or teaching in the programs they are being asked to assess. For the magazine’s much-lauded law school assessment, for instance, judges, lawyers, and law firm hiring coordinators are asked to assess the academic quality of programs that others have attended and that they themselves have encountered only to the same extent that an MFA applicant encounters the graduates of individual creative writing programs in the course of his or her in-genre reading (or, alternately, in a social or professional context). In fact, all of the program assessments published by U.S. News & World Report use the same basic methodology, as stated in the 2011 edition of the magazine containing its graduate school program assessments: “[These assessments] are based on the results of surveys sent to academics…[t]he individuals rated the quality of the program at each institution from marginal (1) to outstanding (5). Individuals who were unfamiliar with a particular school’s programs were asked to select ‘don’t know.’” This last provision merely ensures that survey respondents have some basic familiarity with the programs they are assessing; it does not ask or encourage respondents to submit an institutional (or personal) self-assessment.

As is the case with the methodology described above, national educational-institution assessment schemes have historically sought out unbiased observers to assess accredited degree programs, with self-reporting by interested observers implicitly or explicitly disallowed. The Poets & Writers Magazine 2013 MFA Index improves on this model by surveying individuals who not only are in a position to gauge the professional performance of individual programs’ graduates and professors (i.e., by reading their published work), but who also have access to, and a natural interest in, a large stock of hard data regarding the programs they are being asked to consider.

Overall assessments of program quality, in any field of study, are impossible, as such determinations differ depending upon the student, a large slate of unquantifiable program features, the period of time in which that student matriculates (faculty turnover ensures that a program’s character changes over time), and a host of chance-based factors that no methodology ever devised could hope, or even wish, to encapsulate. While the data provided in the Poets & Writers Magazine 2013 MFA Index should prove invaluable to applicants, especially given the historic opacity of graduate creative writing programs, it is no substitute for an individual applicant’s subtle, many-factored assessment of which program is best for him or her.

Survey Cohort Demographics

Online surveys conducted in 2010 using a Google-sponsored survey application suggest that the online MFA applicant community, including the community at The Creative Writing MFA Blog and the MFA Draft 2012 Facebook Group, subscribes to the current conventional wisdom (as first laid out in the 2005 edition of Tom Kealey’s Creative Writing MFA Handbook) regarding the most important considerations in applying to and matriculating at an MFA program. Asked, “If you are a current creative writing MFA applicant, which of the following program features are among your top five reasons for choosing to apply to a particular program?” and given the alphabetically listed options “Alumni,” “Cost of Living,” “Curriculum,” “Duration,” “Faculty,” “Funding,” “Internship Opportunities,” “Location,” “Postgraduate Placement,” “Reputation,” “Selectivity,” “Size,” “Student-to-Faculty Ratio,” “Teaching Opportunities,” and “Other,” 909 survey respondents provided the following responses:

1. Funding (68 percent)                
2. Reputation (61 percent)
3. Location (59 percent)
4. Faculty (50 percent)
5. Teaching Opportunities (41 percent)
6. Curriculum (28 percent)
7. Cost of Living (23 percent)
8. Alumni (21 percent)
9. Duration (19 percent)
10. Size (13 percent)
11. Selectivity (13 percent)
12. Postgraduate Placement (11 percent)
13. Student-to-Faculty Ratio (10 percent)
14. Internship Opportunities (6 percent)
15. Other (5 percent)

There is substantial similarity between these survey results and the results of a 2009 survey that asked applicants, "Which of these is most important to your decision about where to apply?" with the listed options being "Location," "Funding," "Faculty," "Reputation," "Selectivity," "Curriculum," or "None of the Above" (respondents were permitted to select more than one answer). The top four answers, out of hundreds of survey responses, were identical to the top four responses in 2010:

1. Funding (56 percent)
2. Reputation (45 percent)
3. Location (32 percent)
4. Faculty (18 percent)

These survey responses also closely correspond to the responses provided by MFA faculty members when Poets & Writers Magazine asked them, in 2011, to describe in narrative form the program features they considered most important for prospective students to consider. Drawing on more than forty responses, the top priorities of MFA faculty members, as reported by the faculty members themselves, were as follows (the number in parentheses is the percentage of surveyed faculty members who cited a given program feature):

1. Funding (71 percent)
2. Quality/Aesthetics of Faculty Work (69 percent)
3. Student Self-Reporting (60 percent)
4. Program Atmosphere (50 percent)
5. Faculty Accessibility (41 percent)
6. Teaching Opportunities (38 percent)
7. Location (36 percent)
8. Editing Opportunities (31 percent)

The remaining priorities were all cited by fewer than 30 percent of faculty respondents. Of the eight program features most frequently cited by faculty members as appropriate bases on which to evaluate a graduate creative writing program, five (Funding, Quality/Aesthetics of Faculty Work, Teaching Opportunities, Location, and Editing Opportunities) are as well known and accessible to, and as easily understood by, MFA program applicants as by current or former MFA students or their professors. The remaining three depend upon a phenomenon not yet common in the field of creative writing: For applicants to take into account Student Self-Reporting, Program Atmosphere, and Faculty Accessibility, programs would first need to routinely make current students and faculty available to non-admitted or pre-admission MFA applicants. As of this writing, none of the 224 creative writing MFA programs worldwide is known to regularly offer this courtesy to non-admitted or pre-admission applicants. It is worth noting, too, that of the next thirteen program features cited as important by current MFA faculty members, nine (Workshop Format, Curricular Flexibility, Alumni Publishing Success, Presence of a Reading Series, Curricular Intensity, Program Size, Cost of Living, Internship Opportunities, and Program Duration) are known or knowable to applicants at the time they make their application decisions, and two others (Program Visit and Alumni Self-Reporting) are within applicants’ own power to access. As of this writing, fewer than 5 percent of creative writing MFA programs worldwide were known to offer funds to applicants for pre- (or even post-) admission program visits.

When applicants were asked, in 2010, “If you are a current creative writing MFA applicant, how old will you be when you begin your program, assuming you’re admitted this year?” 1,929 survey respondents provided the following responses on The Creative Writing MFA Blog:

1. 23 or 24 (18 percent)
2. 25 or 26 (16 percent)
3. 21 or 22 (13 percent)
4. 27 or 28 (11 percent)
5. Older than 40 (10 percent)
6. 29 or 30 (8 percent)
7. 31 or 32 (6 percent)
8. 33 or 34 (5 percent)
9. 35 or 36 (4 percent)
10. 37 or 38 (2 percent)
11. 39 or 40 (2 percent)

These results are consistent with earlier online survey results, from 2009, suggesting that the median age of a creative writing MFA applicant is between twenty-six and twenty-seven.

Asked, “As part of your research into MFA programs, how many current or former MFA students or faculty have you spoken to?” 686 survey respondents provided the following responses:

1. 1 to 2 (34 percent)
2. 3 to 5 (27 percent)
3. 0 (25 percent)
4. 6 to 10 (7 percent)
5. 11 or more (4 percent)

Asked, “Have you received advice from an undergraduate creative writing faculty member in applying to MFA programs?” 860 survey respondents provided the following responses:

1. Yes (59 percent)
2. No (30 percent)
3. Not Yet, But I Plan To (10 percent)                

In 2011, the application lists of a random sample of three hundred 2010–2011 MFA applicants were analyzed to determine the frequency of different list sizes. The results were as follows (the first number is the number of programs on an applicant’s application list, the second is the number of such lists in the analyzed sample, and the third is the percentage of the total sample with an application list of the stated size; a brief computational summary of the distribution follows the list):

1: 10 (3 percent)
2: 6 (2 percent)
3: 10 (3 percent)
4: 18 (6 percent)
5: 23 (8 percent)
6: 30 (10 percent)
7: 26 (9 percent)
8: 31 (10 percent)
9: 31 (10 percent)
10: 29 (10 percent)
11: 24 (8 percent)
12: 15 (5 percent)
13: 14 (5 percent)
14: 14 (5 percent)
15: 7 (2 percent)
16: 4 (1 percent)
17: 2 (1 percent)
18: 4 (1 percent)
19: 0 (0 percent)
20: 0 (0 percent)
21: 1 (0 percent)
22: 1 (0 percent)
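
To summarize the distribution above, here is a minimal sketch in Python (not part of the original analysis; the frequencies are simply transcribed from the list) that reproduces the rounded percentages and derives the mean and median list size.

```python
# Frequency of application-list sizes among the 300 sampled 2010-2011 applicants,
# transcribed from the list above: {list size: number of applicants}.
list_size_counts = {
    1: 10, 2: 6, 3: 10, 4: 18, 5: 23, 6: 30, 7: 26, 8: 31, 9: 31, 10: 29,
    11: 24, 12: 15, 13: 14, 14: 14, 15: 7, 16: 4, 17: 2, 18: 4, 19: 0,
    20: 0, 21: 1, 22: 1,
}

total = sum(list_size_counts.values())  # 300 applicants in the sample

# Percentage of the sample at each list size, rounded to whole percents
# to match the presentation above.
percentages = {size: round(100 * n / total) for size, n in list_size_counts.items()}

# Mean list size: weighted average of list sizes by their frequencies.
mean_size = sum(size * n for size, n in list_size_counts.items()) / total

# Median list size: walk the cumulative counts until half the sample is covered.
def median_from_counts(counts):
    half = sum(counts.values()) / 2
    cumulative = 0
    for size in sorted(counts):
        cumulative += counts[size]
        if cumulative >= half:
            return size

print(f"sample size: {total}")                                       # 300
print(f"mean list size: {mean_size:.1f}")                            # ~8.5 programs
print(f"median list size: {median_from_counts(list_size_counts)}")   # 8 programs
```

Run as written, the sketch reports a mean of roughly 8.5 programs per application list and a median of 8, and the rounded percentages it produces match those shown in the list.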

Asked, on The Creative Writing MFA Blog in 2010, "Why do you want to get a graduate creative writing degree?" and given the options "Credential," "Employability," "Time to Write," "Mentoring," "Networking," "Community," "Validation," "Avoid Work," and "None of the Above," with the option to select more than one answer, the top three answers, among hundreds of responses, were as follows:

1. Time to Write (55 percent)
2. Employability (43 percent)
3. Mentoring (36 percent)

The Poets & Writers Magazine 2013 MFA Index does not use the above survey data to create a weighting system for the columns of information it provides. There is a presumption, instead, that applicants' own application lists best reflect the extent to which they take into account funding, location, reputation, selectivity, faculty, curriculum, and other applicant-specific factors in choosing which programs to apply to and attend.

Were the above data used to create a weighting system for the data presented in the Poets & Writers Magazine 2013 MFA Index, or were the applicant survey removed from the table altogether, many of the nation’s most prominent and popular programs would disappear from it, as programs widely admired by applicants (and by working poets and novelists) do not always perform superlatively in hard-data measures. A program assessment missing critical application-trend data would therefore poorly reflect the present national consensus on which programs are most popular among applicants and working authors alike. Under the applicant survey’s current methodology, for instance, a popular but largely unfunded MFA program in a major urban center can still appear in the top half of the one-year and four-year surveys, because even a relatively low standing in the funding, selectivity, student-to-faculty ratio, fellowship placement, and job placement categories can be counterbalanced by a program’s popularity owing to its location, faculty, and other unquantifiable factors. The popularity of a program’s location and faculty is best reflected by privileging applicants’ application lists rather than a blend of those lists and publicly accessible hard data.
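
To make the counterbalancing argument above concrete, here is a minimal, purely hypothetical sketch; the program names, scores, and the 50/50 weighting are invented for illustration and do not come from the Index or its underlying data. It shows how a popularity-only ordering keeps a well-liked but weakly funded program at the top, while a blended ordering drops it.

```python
# Hypothetical example only: invented programs, invented 0-100 scores,
# and an arbitrary 50/50 weight. It illustrates how a popularity-driven
# ordering can differ from one that blends popularity with hard-data measures.
programs = {
    "Urban Program A (popular, largely unfunded)":    {"popularity": 90, "hard_data": 35},
    "Rural Program B (well funded, less applied-to)": {"popularity": 55, "hard_data": 85},
    "Program C":                                      {"popularity": 70, "hard_data": 60},
}

def popularity_only(scores):
    # Ordering driven entirely by applicant-survey popularity.
    return scores["popularity"]

def blended(scores, popularity_weight=0.5):
    # Arbitrary equal weighting of popularity and a hard-data composite.
    return popularity_weight * scores["popularity"] + (1 - popularity_weight) * scores["hard_data"]

print("Ranked by popularity alone:")
for name, s in sorted(programs.items(), key=lambda kv: popularity_only(kv[1]), reverse=True):
    print(f"  {name}: {popularity_only(s)}")

print("Ranked by a 50/50 blend of popularity and hard data:")
for name, s in sorted(programs.items(), key=lambda kv: blended(kv[1]), reverse=True):
    print(f"  {name}: {blended(s):.1f}")
```

In this toy example the urban program ranks first on popularity alone but falls to last under the 50/50 blend, which is the sort of shift the paragraph above describes.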

Genre of Survey Respondents

Asked in 2010, using a Google-sponsored survey application, “What is your primary genre?” 701 respondents from The Creative Writing MFA Blog provided the following responses:

1. Fiction (53 percent)
2. Poetry (28 percent)
3. Nonfiction (15 percent)
4. Other (2 percent)

Isolating only the 578 poetry and fiction respondents to the above survey question, the results are as follows:

1. Fiction (65 percent)
2. Poetry (35 percent)

This suggests that the potential survey cohort at The Creative Writing MFA Blog is similar in its constitution, in terms of genre affiliation, to the national MFA-applicant cohort. Hard data from twenty MFA programs with available admissions data for both genres (constituting a total of twenty-four data-sets ranging in age from the 2008–2009 admissions cycle to the 2010–2011 admissions cycle) generates a total data-set of 12,368 applicants, 8,730 of them fiction applicants (roughly 70 percent) and 3,638 of them poetry applicants (roughly 30 percent). The genre breakdown for the one-year applicant survey published in the MFA Index in 2012 is identical to this breakdown: As between poets and fiction-writers, 70 percent of the members of the surveyed cohort were fiction-writers and 30 percent were poets. For the 2011 applicant survey published in the summer of 2011 by Poets & Writers Magazine, the genre breakdown was 63 percent fiction-writers and 37 percent poets. Some of the deviation between the two surveys is best explained by the fact that the present (2012-published) survey more closely approximates the actual genre demographics of the national full-residency applicant pool.
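
As a quick check of the arithmetic in the preceding paragraph, the aggregate genre shares can be recomputed from the counts given there; the following minimal sketch uses only the figures reported above.

```python
# Aggregate applicant counts reported above for the twenty programs
# (twenty-four data-sets, 2008-2009 through 2010-2011 admissions cycles).
fiction_applicants = 8730
poetry_applicants = 3638
total_applicants = fiction_applicants + poetry_applicants  # 12,368

fiction_share = 100 * fiction_applicants / total_applicants  # ~70.6 percent
poetry_share = 100 * poetry_applicants / total_applicants    # ~29.4 percent

print(f"total: {total_applicants}")
print(f"fiction: {fiction_share:.1f}%  poetry: {poetry_share:.1f}%")
```

The computed shares, roughly 70.6 percent fiction and 29.4 percent poetry, are consistent with the approximately 70/30 split the paragraph reports for the 2012-published survey cohort.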

Applicant survey respondents for the Poets & Writers Magazine 2013 MFA Index were self-selected. Just as a survey aimed at identifying popular car manufacturers might use a self-selecting cohort to compile only the responses of the best-researched car buyers (for instance, those who had spent time on websites that allow consumers to compare available car brands and styles), the one-year and four-year applicant popularity surveys do not attempt to sample a generic cohort of MFA applicants. Instead, their aim is to catalogue, primarily if not exclusively, the application decisions made by the best-researched MFA applicants, a class of applicants considerably more likely to be found in a massive, real-time applicant community in which scores of data points regarding individual programs are researched, shared, and discussed daily.


Comments

Many of the same flaws

I've so far put off commenting on this cosmetically altered version of the "rankings." So, apparently, have others. The people I've discussed this topic with haven't held off because they consider the problems with this barely different "methodology" solved; they've held off because they're weighing whether we should ignore this enterprise altogether.

I decided I shouldn't. The rankings rest on so many logical and empirical flaws that it's important, I think, for someone to address them (and I'm hardly alone in this opinion). So I'm gonna add my 2 cents in the next few weeks, when I have spare moments.

As I mentioned last year, my brother is a mathematical (as opposed to applied) statistician--which means he also understands the applications of stats. Having read half of the "methodology" so far (it's still so long), my brother raises the obvious question about sample size. Before I received a response from him, I'd raised with him the question of the Central Limit Theorem, and the question of when it does not apply. (I first encountered the theorem in Stat 101.) It does not apply well to this kind of sampling.

As I did early on last year, I won't read any of Seth's responses unless a friend tells me that I ought to because of some inaccuracy in what he's said about my claims or some other good detail I should consider. I've seen smart and fair questions raised in response to Mr. Abramson's claims: e.g., on (I believe) HTMLGiant, a woman raised the quite reasonable point that maybe it was "misleading" for Abramson to say that Iowa's program was the "best" MFA program in the years before any other program existed; after all, "better" and "best" imply that there's something to which the item in question can be compared. His response to her was, in my view, rude and unfair.

Besides, I'm not really writing to him anyway.

 

P.S.

P.S. Sorry for the typos in my first post! I find spell checks useful (though I never rely on grammar checks)--and for that reason, I miss the squiggly red line! (Those can catch clerical errors as well!)

The assumptions you cannot make in science

Bit by bit, I want to respond to several assumptions made by Mr. Abramson. What I'll say tonight:

He describes his respondents as "well researched," yet he provides no empirical evidence whatsoever to support this claim. Also, he states in his "methodology" that his enterprise isn't "scientific" because, he argues, not all programs have responded. The problem, however, is that even if every program WERE to provide all the data he's searching for, his "rankings" (or "index," or whatever P&W wants to call it this year) would STILL be unscientific, and here's part of the reason that's the case:

One cannot make assumptions about one's sample without supporting evidence; also, human subjects, when it comes to their OPINIONS or FEELINGS (as opposed to, say, tissue samples), are fraught with well-known interpretive difficulties that go beyond those found in the typical study in the natural or physical sciences.

Anyone who fully understands the scientific method understands at least this much: Making unsupported assumptions about your sample is NOT THE WAY SERIOUS SCIENCE IS DONE.

Anyway, more later...

 

Oh, and I'll add...

He argues that surveying MFA graduates would create a biased sample because graduates would tend to rank their own programs highly. Fair enough--at least in theory.

But his method of examining prospective applicants doesn't get rid of the problem of bias; it merely replaces one kind of bias with a set of other biases beyond funding: location, picking programs perceived as "easy" to get into, having a "connection" to a particular faculty member, etc., etc., etc.

Noticing such details isn't rocket science.

(Oh, by the way: I'm trying to figure out how to separate paragraphs with white space. It seemed easier last year!) 

 

I agree with Caterina

These rankings continue to be an absurd blemish on PW's otherwise superb support for the CW community. The whole debate seems very simple to me - the information is useful, so make it available. But ranking requires criteria, and no one has yet come up with sensible and generally applicable criteria for ranking MFA programs. Seth Abramson's criteria might work for him, and that's great. But putting PW's name on Abramson's ranking is silly (almost as silly as prorating the number of MFA programs founded in the 2010s on the basis of the number founded in 'the first thirty months' of the decade).

A few final (?) thoughts

I agree with you, TimO'M.

A few additional claims made by the poll's creator that I'd wanted to respond to, including statements made by him that I think of as "Sethisms"--a term I don't mean as denigrating but use because Seth has made a number of claims I'd never heard elsewhere but that he presents as if they should be believed merely because he made them (unless he thinks they carry some other obvious force: e.g., that they're self-evident or were handed down by the MFA Goddess and transcribed by Seth):

1) That MFA programs provide a "nonprofessional, largely unmarketable degree..." The problem with this claim is that it wasn't the case before the number of MFA programs mushroomed (I think there are too many MFA programs now, and I suspect that some of them were created as cash cows--little equipment required, but good salaries for the faculty and cheap labor from those students who do manage to get funding). Although most Harvard law grads probably do manage to find good-paying jobs in the profession, the same phenomenon, more or less, has happened with law schools--traditionally a professional and marketable degree: http://www.nytimes.com/2011/01/09/business/09law.html?pagewanted=all.

In fact, numerous law professors and members of the American Bar Association have questioned the ethics of this phenomenon.

2) That teaching is a relatively unimportant component in the MFA experience. While Seth is welcome to his opinion on this matter, that's all it is: his opinion. I earned my MFA from a program that, according to Seth, is associated with high post-grad employment. Why did I choose to apply there, though? A) the quality of the alumni; and B) the quality of the writers on the faculty. Most others I knew who had applied to harder-to-get-into programs considered the same two factors.

Although being a good writer doesn't guarantee that one will be a good teacher, I've had only one writing teacher who excelled at the former but not the latter. Most good writers are good readers. How can that help (enormously, I'll add) an MFA student? By being read by a nuanced reader who understands the art form--someone who isn't also in competition with you, by the way--you can learn what you're doing well and what you're not doing so well. (Many of us have witnessed or even experienced this phenomenon: one student will make, in workshop, a humane and fair-minded criticism of another student's piece, and then the latter will later say, as payback, something nasty about the former's work. It's childish but also human, and it's more likely to occur among peers.)

No precise scientific measure will ever be created for MFA rankings, and I suspect that's why Abramson treats, for example, the quality of the faculty as rather trivial. How would he be able to measure the quality of the writers on the faculty? By awards won? Which awards? As imperfect as it was, I find the old US News & World Report approach helpful in that a) a faculty respondent was unable to rank her own school, and b) faculty, who often guest-teach at other programs, have an idea of where the better students tend to be studying. Any more "scientific" a ranking seems highly unlikely to me.

3) That it's really one's classmates--peers--who determine the quality of one's experience in an MFA program. Again, Mr. Abramson is entitled to his opinion, but that's all it is. A talented poet, and perhaps the gentlest person in my class, left after the first year (of a four-year program) for what he said would just be a "leave." He never returned. One thing he told me before he left was that he'd found no "writing community" there. Others did. But let's face it, an MFA program can include a lot of back-biting among students. (A friend of mine who attended Iowa in the '80s said that a running joke there was that the Iowa Writers' Workshop kept the student counseling services plied with clients. Perhaps the environment there is more humane now. It's refreshing to see the current director publicly state that applicants she's strongly supported--based on their writing samples--have sometimes been, to her surprise, rejected by the votes of the rest of the faculty.)

4) This distinction between "studio" and "academic" MFA programs, terminology I hadn't encountered pre-Seth Abramson (though I'd done an enormous amount of research on programs before I applied). He's said that Iowa is one of the "least academic" programs. By what measure? That they don't give grades? I know someone who took, during his MFA program there, a seminar that covered classical Greek thought and was taught by James Alan McPherson: Pulitzer winner, Guggenheim and MacArthur fellow, and graduate of Harvard Law School before he attended the IWW. (Ever read any of his essays, often known for their intellectual, as well as emotional, nuance?) Not an "academic" program? (In contrast, my more "academic" program focused on reading literature as an art form; no postmodernist/post-structuralist/cultural studies-based lit-crit was involved. Otherwise, I wouldn't have attended.)

5) That the level of the writing of MFA students at Iowa (or similar programs) is exceptional (I wish I could find the reference to that--if I do, I'll include it)--another justification for the claim that teaching isn't all that important?

While, as an undergraduate, I was taking other kinds of courses at Iowa, I used to sneak over to the bin of fiction submissions for workshop (but only after the workshop had met) and steal the one or two leftovers (I wanted to write fiction but was also scared by the prospect). Some of the writing was exceptional. Some of it, though, was rough-hewn (it was a workshop, after all)--and occasionally it was relatively bad, even if the prose was pretty good. My friend once described to me Frank Conroy's response to such stories: "Beautiful prose in the service of what?" (i.e., where was the plot, the characterization, the conflict, the sensory detail...?) Yes, this is hearsay, but I've heard the same depiction from several other grads of the IWW.

Given all of these obvious questions in response to this "ranking" system, what is it that has convinced P&W to attach its name to it and give it such exposure?

I want to raise one more matter (one I consider at least as important as the above concerns I expressed), but it's getting rather late, so I'll sign off for now.

My feline pal, Caterina (one of three cats I live with), thanks you on my behalf for your indulgence--assuming you've made it this far into my comments.

Oh, just caught:

Forgot the end-parentheses in the second sentence above ("Sethisms").

While I'm thinking of it...

On the positive side: The application numbers are being called “popularity,” as they should be.

On the less positive side: It appears that Seth has still failed to distinguish “selectivity” from “acceptance rate.” As a Yale University administrator, whom I quoted last year, pointed out, the quality of the applicant pool makes a huge difference. In other words, a program that has a 25 percent acceptance rate might be more selective than some schools with, say, 10 percent acceptance rates. (And I have no bone to pick here: According to Mr. Seth’s own measures, the program I finished has a 4-5% acceptance rate.)

I’m, of course, in the above references, talking about Columbia (and some of the other NYC schools). For whatever reasons, Columbia’s MFA program has been associated with an exceptionally large number of fine writers. Tom Kealey and Seth Abramson were correct in alerting MFA applicants to the reality that funding is more available at some schools than at others, and that some of those latter schools are incredibly expensive if you don’t get funding. But it seems that Mr. Seth categorizes such schools as moral transgressions, even though some students get funding from them. (And anyway, if you’re living in NYC and you’ve got the money...)

I also wrote earlier about Seth’s distinction between “studio” and “academic” MFA programs in creative writing, a distinction that caught my attention because I’d never encountered it anywhere when I applied to programs in the ‘90s—which is why I came to call such terms “Sethisms.”

 Again: I have a friend who, during his MFA program at Iowa, took a seminar under James Alan McPherson--who also has a Harvard Law degree--on early classical Western thought. How is that not “academic”?? And why should we conclude that artistry and intellect are mutually exclusive? Since when? The idea that they're deeply different is a fairly recent distinction in the West.

 And one more time: In my own four-year program, we didn’t study Derrida or Foucault, etc., etc... So is that "academic" or not?

Oh, and I’ll add for good measure: I think Jorie Graham is, at least in her later work, a fantastically bad poet. Iowa (IWW) is lucky to be rid of her. And if we’re talkin’ intellectual stuff: Graham’s stupidly irrelevant references to obscure Latin botanical terms and to quantum theory say one thing she seems to want others to believe about her above all other possibilities: “I’m really really really really smarter than you!!”

(And I'll later post a small bit about Columbia.)