Avoidance of Respondent Bias
The most extensive program assessment system in the United States, the higher education surveys published annually by U.S. News & World Report, produces assessments almost exclusively from individuals with no firsthand experience attending or teaching in the programs they are asked to assess. For the magazine’s much-lauded law school assessment, for instance, judges, lawyers, and law firm hiring coordinators are asked to assess the academic quality of programs that others have attended and that they themselves have encountered only to the extent that an MFA applicant encounters the graduates of individual creative writing programs in the course of his or her in-genre reading (or, alternately, in a social or professional context). In fact, all of the program assessments published by U.S. News & World Report use the same basic methodology, as stated in the 2011 edition of the magazine containing its graduate school program assessments: “[These assessments] are based on the results of surveys sent to academics…[t]he individuals rated the quality of the program at each institution from marginal (1) to outstanding (5). Individuals who were unfamiliar with a particular school’s programs were asked to select ‘don’t know.’” This last provision merely ensures that survey respondents have some basic familiarity with the programs they are assessing; it does not ask or encourage respondents to submit an institutional (or personal) self-assessment.
As is the case with the methodology described above, national educational-institution assessment schemes have historically sought out unbiased observers to assess accredited degree programs, with self-reporting of interested observers implicitly or explicitly disallowed. The Poets & Writers Magazine 2014 MFA Index improves on this model by surveying individuals who not only are in a position to gauge the professional performance of individual programs’ graduates and professors (i.e., by reading their published work), but who also have access to—and a natural interest in—a large stock of hard data regarding the programs they are being asked to consider.
The 2014 MFA Index makes a further improvement on the U.S. News & World Report methodology by eschewing overall program quality assessments altogether, and by stating explicitly that neither its constituent surveys nor its hard-data listings in any sense constitute an overall assessment of program quality. Overall assessments of program quality—in any field of study—are impossible, as such determinations differ depending upon the student, a large slate of unquantifiable program features, the period of time in which that student matriculates (as faculty turnover ensures that a program's character changes over time), and a host of chance-based factors that no methodology ever devised could hope to or even wish to encapsulate. While the data provided in the 2014 MFA Index should prove invaluable to applicants—especially given the historic opacity of graduate creative writing programs—they are no substitute for an individual applicant’s subtle, many-factored assessment of which program is best for him or her. The 2014 MFA Index should be used as one tool among many.
Survey Cohort Demographics
Online surveys conducted in 2010 using a Google-sponsored survey application suggest that the online MFA applicant community, including the community at The Creative Writing MFA Blog and the MFA Draft 2012 Facebook Group, subscribes to the current conventional wisdom (as first laid out in the 2005 edition of Tom Kealey's Creative Writing MFA Handbook) regarding the most important considerations in applying to and matriculating at an MFA program. When asked, “If you are a current creative writing MFA applicant, which of the following program features are among your top five reasons for choosing to apply to a particular program?” and given the alphabetically listed options “Alumni,” “Cost of Living,” “Curriculum,” “Duration,” “Faculty,” “Funding,” “Internship Opportunities,” “Location,” “Postgraduate Placement,” “Reputation,” “Selectivity,” “Student-to-Faculty Ratio,” “Size,” “Teaching Opportunities,” and “Other,” 909 survey respondents provided the following responses (a brief sketch of how such multiple-selection responses are tallied follows the list):
1. Funding (68 percent)
2. Reputation (61 percent)
3. Location (59 percent)
4. Faculty (50 percent)
5. Teaching Opportunities (41 percent)
6. Curriculum (28 percent)
7. Cost of Living (23 percent)
8. Alumni (21 percent)
9. Duration (19 percent)
10. Size (13 percent)
11. Selectivity (13 percent)
12. Postgraduate Placement (11 percent)
13. Student-to-Faculty Ratio (10 percent)
14. Internship Opportunities (6 percent)
15. Other (5 percent)
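Because respondents could select up to five features, each percentage above is the share of respondents who chose that feature, so the column sums to well over 100 percent. The following minimal sketch, offered only as an illustration and not as the Google-sponsored survey application actually used, shows how such multiple-selection responses can be tallied (the function name and sample data are invented):

```python
from collections import Counter

def tally_multiselect(responses):
    """Tally multiple-selection survey responses into percentages.

    `responses` holds one collection of selected features per respondent
    (up to five each). Each percentage is the share of respondents who
    selected that feature, so the figures can sum to more than 100.
    """
    counts = Counter()
    for selections in responses:
        counts.update(set(selections))  # count each feature once per respondent
    total = len(responses)
    return sorted(
        ((feature, round(100 * n / total)) for feature, n in counts.items()),
        key=lambda item: (-item[1], item[0]),  # by share, then alphabetically
    )

# Hypothetical example with three respondents:
sample = [
    ["Funding", "Location", "Faculty"],
    ["Funding", "Reputation"],
    ["Funding", "Reputation", "Teaching Opportunities"],
]
print(tally_multiselect(sample))
# [('Funding', 100), ('Reputation', 67), ('Faculty', 33),
#  ('Location', 33), ('Teaching Opportunities', 33)]
```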
There is substantial similarity between these survey results and the results of a 2009 survey that asked applicants, “Which of these is most important to your decision about where to apply?” with the listed options being "Location," "Funding," "Faculty," "Reputation," "Selectivity," "Curriculum," or "None of the Above" (respondents were permitted to select more than one answer). The top four answers, out of hundreds of survey responses, were identical to the top four responses in 2010:
1. Funding (56 percent)
2. Reputation (45 percent)
3. Location (32 percent)
4. Faculty (18 percent)
These survey responses also closely correspond to the responses provided by MFA faculty members when Poets & Writers Magazine asked them, in 2011, to describe in narrative form the program features they considered most important for prospective students to weigh. Drawing on more than forty responses, the top priorities of MFA faculty members, as reported by the faculty members themselves, were as follows (the number in parentheses is the percentage of surveyed faculty members who cited a given program feature):
1. Funding (71 percent)
2. Quality/Aesthetics of Faculty Work (69 percent)
3. Student Self-Reporting (60 percent)
4. Program Atmosphere (50 percent)
5. Faculty Accessibility (41 percent)
6. Teaching Opportunities (38 percent)
7. Location (36 percent)
8. Editing Opportunities (31 percent)
The remaining priorities were all cited by fewer than 30 percent of faculty respondents. Of the eight program features most frequently cited by faculty members as being appropriate bases to evaluate a graduate creative writing program, five—Funding, Quality/Aesthetics of Faculty Work, Teaching Opportunities, Location, and Editing Opportunities—are as well-known and accessible to, and as easily understood by, MFA program applicants as by current or former MFA students or their professors. The remaining three depend upon a phenomenon not yet common in the field of creative writing: For applicants to take into account Student Self-Reporting, Program Atmosphere, and Faculty Accessibility, programs would first need to routinely make current students and faculty available to non-admitted or pre-admission MFA applicants. As of this writing, only one of the 171 full-residency creative writing MFA programs worldwide is known to regularly offer this courtesy to non-admitted or pre-admission applicants. It is worth noting, too, that of the next thirteen program features cited as important by current MFA faculty members, nine—Workshop Format, Curricular Flexibility, Alumni Publishing Success, Presence of a Reading Series, Curricular Intensity, Program Size, Cost of Living, Internship Opportunities, and Program Duration—are known or knowable to applicants at the time they make their application decisions, and two others are within the unique power of applicants to access (Program Visit and Alumni Self-Reporting). As of this writing, fewer than 5 percent of creative writing MFA programs worldwide were known to offer funds to applicants for pre- (or even post-) admission program visits.
When applicants were asked, in 2010, “If you are a current creative writing MFA applicant, how old will you be when you begin your program, assuming you’re admitted this year?” 1,929 survey respondents provided the following responses on The Creative Writing MFA Blog:
1. 23 or 24 (18 percent)
2. 25 or 26 (16 percent)
3. 21 or 22 (13 percent)
4. 27 or 28 (11 percent)
5. Older than 40 (10 percent)
6. 29 or 30 (8 percent)
7. 31 or 32 (6 percent)
8. 33 or 34 (5 percent)
9. 35 or 36 (4 percent)
10. 37 or 38 (2 percent)
11. 39 or 40 (2 percent)
These results are consistent with earlier online survey results, from 2009, suggesting that the median age of a creative writing MFA applicant is between twenty-six and twenty-seven.
Asked, “As part of your research into MFA programs, how many current or former MFA students or faculty have you spoken to?” 686 survey respondents provided the following responses:
1. 1 to 2 (34 percent)
2. 3 to 5 (27 percent)
3. 0 (25 percent)
4. 6 to 10 (7 percent)
5. 11 or more (4 percent)
Asked, “Have you received advice from an undergraduate creative writing faculty member in applying to MFA programs?” 860 survey respondents provided the following responses:
1. Yes (59 percent)
2. No (30 percent)
3. Not Yet, But I Plan To (10 percent)
In 2011, the application lists of a random sampling of three hundred 2010–2011 MFA applicants were analyzed to determine the frequency of different list sizes. The results were as follows, with a brief tallying sketch after the list (the first number is the number of programs on an applicant’s application list, the second is the number of such lists in the analyzed sample, and the third is the percentage of the total sample with an application list of the stated size):
1: 10 (3 percent)
2: 6 (2 percent)
3: 10 (3 percent)
4: 18 (6 percent)
5: 23 (8 percent)
6: 30 (10 percent)
7: 26 (9 percent)
8: 31 (10 percent)
9: 31 (10 percent)
10: 29 (10 percent)
11: 24 (8 percent)
12: 15 (5 percent)
13: 14 (5 percent)
14: 14 (5 percent)
15: 7 (2 percent)
16: 4 (1 percent)
17: 2 (1 percent)
18: 4 (1 percent)
19: 0 (0 percent)
20: 0 (0 percent)
21: 1 (0 percent)
22: 1 (0 percent)
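As a companion to the table above, here is a brief sketch of how such a frequency distribution of application-list sizes can be tallied. It uses an invented five-applicant sample rather than the actual three-hundred-list data, and the function and program names are illustrative:

```python
from collections import Counter

def list_size_distribution(application_lists):
    """Count how many applicants applied to 1, 2, 3, ... programs and
    express each count as a (rounded) percentage of the whole sample."""
    sizes = Counter(len(programs) for programs in application_lists)
    total = len(application_lists)
    return [
        (size, count, round(100 * count / total))
        for size, count in sorted(sizes.items())
    ]

# Hypothetical sample of five applicants' lists (program names invented):
sample = [
    ["Program A", "Program B", "Program C"],
    ["Program A", "Program D"],
    ["Program A", "Program B", "Program C", "Program E"],
    ["Program B", "Program C", "Program E"],
    ["Program A"],
]
for size, count, pct in list_size_distribution(sample):
    print(f"{size}: {count} ({pct} percent)")
# 1: 1 (20 percent)
# 2: 1 (20 percent)
# 3: 2 (40 percent)
# 4: 1 (20 percent)
```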
Asked, on The Creative Writing MFA Blog in 2010, "Why do you want to get a graduate creative writing degree?” and given the options "Credential," "Employability," "Time to Write," "Mentoring," "Networking," "Community," "Validation," "Avoid Work," and "None of the Above," with the option to select more than one answer, the top three answers, among hundreds of responses, were as follows:
1. Time to Write (55 percent)
2. Employability (43 percent)
3. Mentoring (36 percent)
The Poets & Writers Magazine 2014 MFA Index does not use the above survey data to create a weighting system for the columns of information it provides. There is a presumption, instead, that applicants' own application lists best reflect the extent to which they take into account funding, location, reputation, selectivity, faculty, curriculum, and other applicant-specific factors in choosing which programs to apply to and attend.
Were the above data used to create a weighting system for the data presented in the 2014 MFA Index, or were the applicant survey removed from the index altogether, many of the nation’s most prominent and popular programs would disappear from the table, as programs widely admired by applicants (and by working poets and novelists) do not always perform superlatively in hard-data measures. A program assessment missing critical application-trend data would poorly reflect the present national consensus on which programs are most popular among applicants and working authors alike. For instance, under the applicant survey’s current methodology a popular but largely unfunded MFA program in a major urban center might still appear in the top half of the one-year and five-year surveys, because even a relatively low standing in the funding, selectivity, student-faculty ratio, fellowship placement, and job placement categories can be counterbalanced by a program's popularity due to its location, faculty, and/or other unquantifiable factors. The popularity of a program’s location and faculty is best reflected by privileging applicants’ application lists rather than a confluence of those lists and publicly accessible hard data. To redesign the 2014 MFA Index to deprivilege current applicant mores would ensure that virtually no non-fully-funded and/or big-city programs (with only a handful of exceptions) would appear in the table, nor many (if any) non-fully-funded programs whose appeal lies largely in the composition of their faculty rosters.
While it’s fair to assume that program popularity going forward may be directly affected by a higher or lower relative placement in the funding, selectivity, student-faculty ratio, fellowship-placement, and job-placement categories, the pace of this trend is arrested, rather than hastened, by the current program assessment. The present methodology registers the relative decline or stagnation in the popularity of certain programs while allowing those programs to improve their funding, selectivity, student-faculty ratio, and placement statistics before losing their positions (in part as a result of applicant consensus) in the 2014 MFA Index altogether.
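To make the unweighted approach concrete, the sketch below ranks programs solely by the share of surveyed applicants whose application lists include them; no funding, selectivity, or placement figure enters the calculation. It is a simplification offered for illustration, with invented function and program names, and is not the magazine's actual tabulation code:

```python
from collections import Counter

def popularity_survey(application_lists):
    """Rank programs by the share of surveyed applicants who listed them.

    Each respondent contributes one application list; a program is counted
    at most once per respondent, and no hard-data measure is weighted in.
    """
    appearances = Counter()
    for programs in application_lists:
        appearances.update(set(programs))
    total = len(application_lists)
    return sorted(
        ((program, round(100 * n / total, 1)) for program, n in appearances.items()),
        key=lambda item: (-item[1], item[0]),  # by share, then alphabetically
    )

# Hypothetical lists from three respondents (program names invented):
lists = [["Program A", "Program B"], ["Program A"], ["Program B", "Program C"]]
print(popularity_survey(lists))
# [('Program A', 66.7), ('Program B', 66.7), ('Program C', 33.3)]
```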
Genre of Survey Respondents
Asked in 2010, using a Google-sponsored survey application, “What is your primary genre?” 701 respondents from The Creative Writing MFA Blog provided the following responses:
1. Fiction (53 percent)
2. Poetry (28 percent)
3. Nonfiction (15 percent)
4. Other (2 percent)
Isolating only the 578 poetry and fiction respondents to the above survey question, the results are as follows:
1. Fiction (65 percent)
2. Poetry (35 percent)
This suggests that the potential survey cohort at The Creative Writing MFA Blog is similar in its constitution, in terms of genre affiliation, to the national MFA-applicant cohort. Hard data from twenty MFA programs with available admissions data for both genres (constituting a total of twenty-four data-sets ranging in age from the 2008–2009 admissions cycle to the 2010–2011 admissions cycle) generates a total data-set of 12,368 applicants, 8,730 of these being fiction applicants (70 percent) and 3,638 poetry applicants (30 percent). The genre breakdown for the one-year applicant survey published in the 2014 MFA Index is nearly identical to these figures: Between poets and fiction writers, 67 percent of the members of the surveyed cohort were fiction writers and 33 percent were poets. For the 2013 applicant survey published in September 2012 by Poets & Writers Magazine, the genre breakdown was 70 percent fiction writers and 30 percent poets; for the 2011 applicant survey, the genre breakdown was 63 percent fiction writers and 37 percent poets.
Applicant survey respondents for the 2014 MFA Index were self-selected, and it is the particular and express design of the survey methodology that this survey cohort be self-selected. Just as a survey aimed at determining popular car manufacturers might use a self-selecting cohort to compile only the responses of the best-researched car buyers—for instance, those who had spent time on websites that allow consumers to compare various available car brands and styles—the one-year and five-year applicant popularity surveys do not intend to sample a generic cohort of MFA applicants. Instead, their aim is to catalogue, primarily if not exclusively, the application decisions made by the best-researched MFA applicants, a class of applicants considerably more likely to be found in a massive, real-time applicant community in which scores of data-points regarding individual programs are researched, shared, and discussed daily.
National Full-Residency Applicant Pool Size
The median estimate for the national full-residency fiction/poetry applicant pool (as calculated in 2011) is 2,797, the mean estimate is 3,253, and the adjusted mean is 3,042. The same series of calculations produced a median estimate, for the national nonfiction applicant pool, of 291, and a mean estimate of 345. The total size of the national full-residency applicant pool, across all three of the “major” genres of study, is therefore likely between 3,000 and 4,000. The two-genre, five-year, 2,519-respondent applicant survey that appears in the 2014 MFA Index consequently surveys the equivalent of 77 percent to 90 percent of an annual national two-genre applicant pool in the field of creative writing; the one-year surveys published annually by Poets & Writers Magazine survey between 13 percent and 23 percent of the three-genre national applicant pool for each admissions cycle.
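As a back-of-the-envelope check of the coverage claim, using only the figures stated above for the 2,519-respondent five-year cohort and the 2011 two-genre pool estimates:

```python
respondents_five_year = 2519            # two-genre, five-year survey cohort
pool_median, pool_mean = 2797, 3253     # 2011 estimates of the annual two-genre pool

low_coverage = 100 * respondents_five_year / pool_mean     # against the larger (mean) estimate
high_coverage = 100 * respondents_five_year / pool_median  # against the smaller (median) estimate
print(f"{low_coverage:.0f} to {high_coverage:.0f} percent of one annual two-genre pool")
# 77 to 90 percent of one annual two-genre pool
```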
Data Sources
For those program measures not subject to applicant surveys, such as recitations and ordered listings of admissions, curricular, placement, student-faculty ratio, and funding data, only data publicly released by the programs—either to individual applicants, to groups of applicants, in a program's promotional literature, or via a program website—have been included in the 2014 MFA Index. All data were updated regularly to reflect programs’ most recent public disclosures.
Many of the nation’s full- and low-residency MFA programs decline to publicly release internal data. Programs unable or unwilling to release data regarding their funding and admissions processes are necessarily disadvantaged by a program assessment that relies on transparency. Yet no program that fails to release this data for applicants' consideration can avoid being judged, by applicants and other observers, through the lens of such nondisclosures. Because research for these surveys and listings is based entirely on publicly available, publicly verifiable data, the accuracy of the data that make up the MFA Index can be readily confirmed by any party.
The Nonfiction Survey
Because fewer than half (47 percent) of full-residency MFA programs offer a dedicated nonfiction or creative nonfiction track—defined as a curricular track which permits a master’s thesis in the genre—nonfiction and creative nonfiction applicants have been surveyed separately from poetry and fiction applicants. These survey responses do not factor, in any sense, into either the one-year or five-year popularity surveys published in the 2014 MFA Index.
For the nonfiction/creative nonfiction survey, the designation “n/a” indicates that a given program does not offer a nonfiction track or concentration.
LOW-RESIDENCY SURVEY
Structure
Low-residency programs were assessed in twelve categories, nine of which are either applicant surveys or ordered listings of hard data—six employing the unscientific but probative surveying described above, and three based upon publicly available hard data. Low-residency programs have not been assessed with respect to their funding packages because these programs generally offer little or no financial aid to incoming students, on the presumption that their students will continue in their present employment during the course of their graduate studies.
Cohort
Over the course of six successive application cycles, a total of 304 low-residency applicants were surveyed as to their program preferences, with these preferences exhibited in the form of application lists. Between April 16, 2007, and April 15, 2011, the locus for this surveying was the Poets & Writers Magazine online discussion board, the Speakeasy Message Forum, widely considered the highest-trafficked low-residency community on the Internet; from April 16, 2011, to April 15, 2013, the survey locus was the MFA Draft 2012 and MFA Draft 2013 Facebook Groups described in detail above. Three factors account for the relatively small cohort used for this surveying: (1) the annual applicant pool for low-residency programs is approximately one-eighth the size of the full-residency applicant pool; (2) low-residency applicants do not congregate online in the same way, or in the same numbers, that full-residency applicants do; and (3) low-residency programs are subject to a "bunching" phenomenon not evident among full-residency programs, with only nine of the 53 eligible programs nationally appearing on even 10 percent of survey respondents' application lists, and only three appearing on 20 percent or more.
Comments
trois petits chats replied
MacArthur winner earned MFA at Columbia
Since the originator of these "rankings" has repeatedly denigrated Columbia and other NYC MFA programs (no, I neither got my MFA there nor applied to that program), I thought I would mention this 2013 MacArthur Fellow who earned her MFA at Columbia:
http://www.macfound.org/fellows/902/
Alumni of NYU's MFA program:
http://cwp.fas.nyu.edu/object/grad_alumni_publications.html
trois petits chats replied
Hunter College
Work by Hunter College Alumni:
http://www.hunter.cuny.edu/creativewriting/student.shtml
akboatwright replied
Columbia
I think people mistake all education with vocational training these days. They want a certificate and a job when they finish. Money in, money out. At Columbia, I was given time (and academic credit) for writing. There were no teaching fellowships and no one chased after you to give you career counseling, agent counseling. There was not one single lecture on marketing your work. Much of what I learned had to do with developing a certain angle of vision. Attitudes about my work and how to pursue my ideas with both faith and objectivity.
trois petits chats replied
"academic" versus "studio" MFA program distinction
I've read comments on a couple of sites claiming that the so-called studio-versus-academic distinction regarding MFA programs was created by the AWP. If so, that's too bad. It's an unfortunately misleading distinction.
What does "academic" mean? I attended one of those so-called academic programs, where certain courses were modeled after earlier courses at the Iowa Writers' Workshop, where one of the co-founders of my program studied under Donald Justice in the 1960s. Our "Form and Theory of Fiction" and "Form and Theory of Poetry" courses were modeled after courses in Iowa's MFA program, including "Form and Theory of Fiction" and its later versions under later names. In fact, here's an example:
http://www.slate.com/articles/arts/books/2012/11/kurt_vonnegut_term_paper_assignment_from_the_iowa_writers_workshop.html
I read a few years ago on the MFA blog a comment by one potential MFA applicant who stated that she would prefer a "studio" MFA program because she couldn't stand the thought of writing another lit crit paper.
After having read plenty of "critical theory," etc. on my own in an effort to figure out what all this jargon-laden prose by contemporary lit scholars was saying, I became determined to never write such a paper EVER. The sole reason I didn't major in English (I majored in "analytic" philosophy instead) is that I was appalled by the obscurantist writing that characterized so much of the scholarship I'd come across in literary theory, and I didn't think I would benefit from any course that would reward me for writing that badly.
And had I been expected to write such papers in my "academic" MFA program, I would have left the program after one semester. Fortunately, the focus was on craft, not Derrida or post-structuralism, etc., etc.
So in case anyone fears the more "academic" programs, rest assured that at least SOME of those programs won't torture you by making you write a Marxist or feminist or Foucauldian analysis of "Sense and Sensibility." (As physicist Alan Sokal demonstrated, one can be politically liberal, or "progressive," without embracing the ideas of the “academic left.”)
trois petits chats replied
the claim that creative writing can't be "taught"
I'd like to avoid altogether referring to Seth Abramson, the creator of this system of "rankings," but he's created such a world for himself around his views on MFA programs that it's impossible for me to avoid referring to his other comments on the topic--or impossible to avoid if I'm to again raise questions about the wisdom of his system and about the wisdom of Poets & Writers for advocating the system's worthiness.
Abramson has a habit of proclaiming that something is true and then assuming, as if through magical thinking (or so it seems to some of us), that the mere stating of the idea makes it true.
One of those truisms of his: that being a good writer has little or nothing to do with being a good writing teacher. Yet he provides utterly no evidence for that claim--not even his own anecdotal evidence.
Here's my own anecdotal (experience-based) evidence:
With one exception, all the good writers I had as writing teachers were very good writing teachers. True, it's not necessarily the case that a good writer would be a good writing teacher. But unless the teacher has an emotional problem, is self-centered (and, therefore, uninterested in students' needs), or has some other emotional/social/psychological reason she cannot communicate her ideas orally or in writing to students, it would make sense that good writers would tend to be good readers and good at expressing themselves in language about the art of (say) fiction writing and, therefore, would make good writing teachers.
Having an astute reader is vital to learning to write well.
Rust Hills wasn't a fiction writer but he was a great fiction editor (meaning, a smart fiction reader) and, therefore, a good writing teacher:
http://www.amazon.com/Writing-General-Short-Story-Particular/dp/0618082344
On the other hand, good fiction writers tend to think about what they're doing and--barring some bizarro problem with their ability to work with other human beings--tend to be (if they communicate even a tenth as well outside their writing as they do in their writing) perfect candidates for being good writing teachers.
I'd like to see us rid ourselves of this romantic/romanticized notion that writing teachers are pretty much irrelevant in these programs.
Oh, and by the way: I, like many other voracious readers when we were young, was able to read astutely long before I entered an MFA program (even though I didn't major in English!). Although the sprouting of more and more MFA programs would serve Abramson's purposes well, the idea that MFA programs should increase in number so that Americans can become better readers of literature is not only absurd when we look at literary history--including the history of readership--in the U.S. but also conspicuously self-serving on Abramson's part.
trois petits chats replied
Abramson's expertise on MFA programs and literature
Also, Abramson has claimed that he's acquired special expertise on MFA programs, and Poets & Writers editors have quickly supported/defended that claim. Yes, he's got some numbers down--though those numbers don't satisfy either a doctoral-level mathematician or statistician I've talked to about this.
The problem is that, in a broad range of areas, he doesn't display great expertise:
1) He shows an almost obsessive need to classify things: literature, writing teachers, periods in the history of poetry...
But the difficulty with a tendency to classify that intensely is that it veers increasingly toward over-classification--and, as many people realize, over-classification often leads to oversimplification.
Seth Abramson is trying to learn about the history of poetry, and I laud him for that effort, but he so often gets that history wrong. And his rather grandiose claims about the worthiness of MFA programs do next to nothing to elevate the status of MFA programs in the eyes of those who didn't attend one. Pre-Seth, we had Dana Gioia as the main detractor of MFA programs. It seems obvious to me that as more of these programs have sprouted, the more resistance I'm seeing among "literary" (and I mean that in the best sense) poets and writers who didn't get an MFA. And to be honest, had my first encounters with MFA programs been with Abramson's description of them, I would likely have regarded the whole phenomenon with much more suspicion.
As it is, Abramson sometimes seems like a kid who's just encountering a whole new history of poetry, but his reaction seems to be oversimplified thinking about that history.
His latest view on the future of poetry: Metamodernism is taking over literature--or lit crit? (It's an interesting conclusion for someone who supports the ideal of the "studio" program where no "analysis" takes place.)
http://www.huffingtonpost.com/seth-abramson/on-literary-metamodernism_b_3629021.html
And while this exchange is clever on the surface, it's also worth reading because it shows that literary history is messy and complex:
http://scarriet.wordpress.com/2013/07/23/metamodernism-lol/
By the way, the latest (as far as I can tell) fad in lit crit is "neuro lit crit." My favorite sentence from this particular article:
"Given that many philosophers saw critical theory as a way for English professors to do philosophy really badly, it should not come as a surprise to find that some with a keen understanding of neuroscience are deeply skeptical of this attempt to say something new about old books."
http://www.forbes.com/sites/booked/2010/04/01/neuroscience-and-literary-theory-a-match-made-in-nonsense/
trois petits chats replied
about the value of the MFA
I was in a dark hole-in-the-wall corner of a restaurant when I wrote my last post, but I nonetheless apologize for the typos, etc. therein. (If there's a spellcheck on this site's keyboard, I missed it).
A New Yorker piece from 2009 about the many attempts to define or explain the worth and purpose of the MFA program in creative writing:
http://www.newyorker.com/arts/critics/atlarge/2009/06/08/090608crat_atlarge_menand?currentPage=1
Based on the accounts of two people I know who got their MFAs at Iowa, former director Frank Conroy didn't appear to believe that faculty ought to just "get out of the way" of students and let things happen, creatively. (And WERE that truly the case for the faculty at Iowa's writing program, the university might want to consider putting those same faculty members' salaries toward another use.) Anyway, Conroy was known to sometimes say to a student (and in front of that student's classmates) some things that certain others in the class saw as emotionally damaging. In any case, Conroy was, apparently, never known for "getting out of the way" and leaving any discussion of a story's merits solely to the students in a particular workshop.
trois petits chats replied
Conroy
"Beautiful prose in the service of what?" That's the sentence one Iowa-alum friend of mine described Conroy as saying when the prose in a story that was being workshopped was lovely but nothing of consequence was actually happening in the story. My friend, who saw Conroy as sometimes very unkind to students, has still said, all these years later, that he "learned" a great deal about story-telling from being in Conroy's workshops.
Besides Stop-Time, his memoir (written before the memoir became hip and widely marketable), Conroy's work includes the short story Midair (first published in a collection by that name), and it's astonishingly good--one I've read three or four times in the past 15 years.
vivian replied
Advice
After reading your post, I would like to talk to you and get your advice. When it comes to an MFA and Critical Theory, NH Institute of Art is big on that....I applied and got in....now I am strongly wondering if this will help me.....I applied to Lesley as well. What are your thoughts about both programs? Which one is better???
trois petits chats replied
"selectivity"
What Seth still seems to fail to grasp is that what he calls "selectivity" is really just the school's acceptance rate. A school that generally draws less qualified applicants but has (or claims to have) a 5 or 10% acceptance rate is not going to be as "selective" as a school that attracts much more qualified applicants and has the same acceptance rates as the first school. He never makes that distinction.
Of course, it would be hard to compare the quality of current MFA students in one program with those in another on the basis of any clearly quantitative measurement.