On Tuesday and yesterday, I looked at the data about this year’s “best of” lists, as tallied from School Library Journal, Kirkus, Horn Book, Publishers Weekly, and Library Journal’s “Best YA for Adults.” I used almost the exact same metrics as I did in 2012, adjusting a bit for new categories and removing a couple I didn’t necessarily find that interesting or have enough data to pull together into anything worth looking at.
Because I used the same tally sheet and looked at so many of the same factors, I thought it would be worthwhile to compare what the “best of” lists in 2012 looked like against this year’s “best of” lists. Were there any notable differences between the two years? Were there more books considered “best” one year than the other? Was there a big difference in gender representation? What about other factors? If “best of” lists give a snapshot of a year in YA, then what will comparing two consecutive years say about preferences in “best” books? Again, this is all data and nothing conclusive can be said about it, but it is interesting to look and speculate.
In both 2012 and 2013, I used the same criteria to define a YA book. I didn’t look at non-fiction, and I didn’t include graphic novels in the final results. In both years, I also used Amazon’s age rating of a book as being for readers 12 and older as the standard for “YA fiction.”
Range and Spread of Titles Selected
The first thing that caught my attention when looking at the 2013 data was that it seemed like there were far fewer books being labeled “best of” than there were in 2012. Turns out, my suspicions were correct.
Note that this bar chart begins at 50 and Google won’t let me change it to begin at 0. But it shouldn’t matter, as it’s pretty clear there’s a difference in titles selected: last year, there were 89 unique titles on the “best of” lists. This year, there were only 55.
I decided to look at each publication and compare its number of unique choices last year against this year. Every publication selected more YA fiction last year than this year, except for Publishers Weekly, which picked 16 titles this year and only 11 last year. The biggest difference is in Kirkus’s number of choices: 82 selections last year versus 42 this year. Repeated titles were counted here, so long as a different journal selected each one (in other words, every instance of Far, Far Away counted as an individual title, as long as a different journal picked it).
Even accounting for the non-fiction and graphic novel selections — which were minimal this year, as well — there were definitely fewer books selected as “best of” this year.
Does the smaller number of titles being selected as “best of” suggest that maybe this was a weaker year for YA fiction? Or if that’s not the case, did fewer books stand out and resonate this year among editors tasked with selecting the bests? Most “best of” lists are decided by vote and by the editors of the journals, and I wonder if there’s any correlation between the number of “best of” titles selected and the number of starred reviews earned this year. In other words, did fewer books earn starred reviews in 2013 than in 2012?
Even with Kirkus’s more esoteric selections, as discussed yesterday, there seem to be surprisingly few bests this year. Is this a trend we’re going to continue to see in the coming years or will 2013 be sort of an outlier?
Author Gender and “Best of” Lists
I didn’t keep track of the gender of the main characters in 2012 the way I did in 2013 (partly because there were far more books on the 2012 list), but I did look at the gender of the authors on both sets of lists. For 2012, there were a total of 90 authors, and in 2013, there were a total of 55.
In 2012, there were 72 female authors and 18 male authors.
In 2013, there were 41 female authors and 14 male authors.
As can be seen, the percentage of female authors was smaller in 2013 than it was in 2012: 80% of the authors in 2012 were female, whereas about 75% were female in 2013.
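For anyone who wants to check the arithmetic, the shares above can be computed directly from the author counts reported in this post (the `female_share` helper is just my own illustration, not anything from the original tally sheets):

```python
# Author counts as reported above: female and male authors per year.
counts = {
    2012: {"female": 72, "male": 18},
    2013: {"female": 41, "male": 14},
}

def female_share(year):
    """Return the percentage of female authors for a given year, rounded to one decimal."""
    c = counts[year]
    total = c["female"] + c["male"]
    return round(100 * c["female"] / total, 1)

print(female_share(2012))  # 80.0
print(female_share(2013))  # 74.5
```

So the “about 75%” figure for 2013 is really 74.5%, a drop of roughly five and a half percentage points from 2012.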
Although there aren’t hard numbers to represent all of the YA books published in these years as categorized by author gender, it does make me wonder a little bit if there were fewer female authors in 2013. Or were there fewer female-written books that stood out as “best”? It’s a small percentage drop, of course, but it’s an interesting trend, especially when taken in light of the data about the New York Times gender split for their YA list.
Debut Novelists on the “Best of” Lists
Did debut novelists do better in 2012 than they did in 2013 when it comes to being on the “best of” lists? Let’s take a look.
There were a total of 18 debut novelists in 2012, which came to 20% of the total number of authors on the “best of” list.
Compare to 2013:
There were 11 debut novelists in 2013, which also came to 20% of the authors on this year’s “best of” lists. In other words, the share of debut novelists on the lists didn’t change from last year.
Genre Representation in “Best of” Lists
I mentioned that this year, realistic fiction appeared more frequently on the “best of” lists. I thought it was notable, as for the last couple of years there has been talk that realistic fiction would become “the next big thing,” and the “best of” lists at least suggest that realistic fiction caught more critical attention this year.
But was there a rise in realistic fiction this year as compared to last year? And if so, what was in abundance last year that maybe didn’t show itself as popular among the “best of” lists this year?
Here’s the 2012 breakdown:
Fantasy took up the largest portion of the “best of” lists, though realistic held its own. Last year, when I did the genre breakdown, I made “mystery and thriller” a separate category, which I did not do this year. I suspect if I were to reconsider categories, many of those books would end up under realistic fiction, thus making it about the same size as fantasy in terms of appearances on the list. Historical and science fiction followed in popularity.
There were more historical novels on this year’s “best of” lists than there were in 2012, with roughly 24% of the books falling under that genre. Compare to last year’s 14%. But what’s most notable is that fantasy dropped sharply this year, at roughly 19%, while in 2012, fantasy occupied almost 40% of the “best of” lists. There were also fewer novels categorized as science fiction that appeared in 2013 than in 2012.
Realistic fiction’s presence on the “best of” lists definitely increased, even if the mystery/thriller category is rolled into realistic fiction for 2012’s counts. This year, realistic fiction was nearly 44% of the “best of” lists.
Best of by List Frequency
Given that there were fewer books on this year’s “best of” lists than in 2012, as well as a bit of a shift in genre representation, I thought it would also be worth looking at the frequency of titles appearing across multiple lists. There were 5 lists total, and I was curious whether books appeared on multiple lists more often in 2012 or in 2013.
In 2012, here’s what the frequency of books on the “best of” lists looked like:
The vast majority of books only showed up on one list, though a good portion also showed up on two lists. Smaller numbers appeared on three and four lists, and there was a single book which appeared on all five of the lists (that went to Elizabeth Wein’s Code Name Verity). For the curious, the books which were on four lists each last year were Vaunda Nelson’s No Crystal Stair, Margo Lanagan’s Brides of Rollrock Island, AS King’s Ask the Passengers, John Green’s The Fault in Our Stars, and Libba Bray’s The Diviners.
Compare to 2013:
There was a smaller percentage of books appearing on a single list this year than last, but a pretty big increase in the percentage of books on two lists compared to 2012. That’s percentage-wise, though, of the total number of books across the five lists. Raw numbers show it was actually an increase of only one book appearing on two lists this year: 11 in 2013, rather than 10 in 2012. Both years saw a total of five books on three lists, though because of the smaller number of books overall on this year’s lists, the percentage appears larger.
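As a sketch of how these frequency counts come together: each title is tallied by how many journals picked it, and then the tallies themselves are tallied. The journal-to-titles mapping below is a toy example I made up for illustration, not the real data from the five lists:

```python
from collections import Counter

# Hypothetical picks: each journal's "best of" list mapped to the titles on it.
lists_2013 = {
    "SLJ": {"Eleanor & Park", "Far, Far Away"},
    "Kirkus": {"Eleanor & Park", "Some Other Title"},
    "Horn Book": {"Eleanor & Park"},
}

# How many lists each title appeared on.
appearances = Counter(
    title for picks in lists_2013.values() for title in picks
)

# How many titles landed on exactly n lists.
frequency = Counter(appearances.values())
print(frequency)  # Counter({1: 2, 3: 1})
```

With the real data, `frequency` would read off directly as the chart above: so many books on one list, so many on two, and so on.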
As mentioned in a previous data post, there were no books this year that ended up on all five of the “best of” lists (except for Boxers and Saints, which was excluded from all of the data because I categorized it as a graphic novel rather than YA fiction).
So What Does This All Mean?
In the big context of “best of” lists and accolades at the end of any given year in YA fiction, the data doesn’t really say a whole lot. It does, however, give us a picture of what a year in YA looks like. This year, it appears we have fewer female authors penning books considered “best of” (though still a larger percentage than male authors), and we have much more realistic fiction filling out the lists than other genres.
We have fewer books earning multiple spots on “best of” lists, but with fewer books overall, what does that say? Again, the question I keep circling back to and have from the beginning of looking at this data is how much one list impacts another list and how much marketing may influence these things.
This year felt like a noteworthy one when it came to books being sold to readers in a very big way. There appeared to be a lot more money spent on a lot fewer titles, and I wonder how much of that is reflected in these “best of” lists. The more a book is sold to us as a great book, how much more likely are we to believe that?
Even the most objective readers can’t avoid hearing and seeing the buzz about certain books. I’m not suggesting that editorial boards choosing their “best of” are swayed by this kind of marketing, but rather, this kind of marketing really did stick out this year more than other years. Which then leads me to another set of questions that seem to be the ones authors and creative types deal with themselves: do these “best of” list creators stick to their purely objective “best of” picks or do they feel at times pressured to bend to what the popular opinion of the “best of” books might be?
The most popular book among this year’s “best of” selections was Rowell’s Eleanor & Park. It was a good book.
But this was also a book that received spectacular marketing and publicity. It got a review in the New York Times by John Green, along with five starred reviews. That wasn’t lost on the book’s marketing, either: how many places was the book heralded as one that John Green himself loved and that other readers would, too? It was SMART. It helped a new YA author, who had previously published only one book, for the adult market, gain immense traction and attention very quickly (it didn’t hurt the attention Rowell’s second book out this year received, either, as the marketing there reminded us that Green loved her first book, too). Readers have fallen in love with Eleanor & Park over and over, and it showed up on nearly every list this year where adult readers were told it’s okay to read YA because of books like that.
Was it this year’s “best” book? Would this book be seen as this good were it not for all of the marketing behind it? What about without all of the adult praise it earned (you know, it’s a “YA book that is okay for grown ups to read”)? This book was impossible to avoid, whether you were a YA reader or you weren’t a YA reader.
It’s hard not to think about the other books that came out this year that were as good as Rowell’s. But what were they? Are they some of those books Kirkus called out, the ones I questioned yesterday as to why they were on the list in the first place? Have I become accustomed to thinking that outliers on these lists indicate a poor choice? Or is Kirkus on to something I’m unaware of because those books have yet to be sold and marketed to me as a reader (or more accurately, as a librarian who buys these books and then sells them to teen readers)?
The smaller the field of “bests,” the more I wonder what was overlooked simply because a few big titles had so much weight behind them.
Of course I have no answers. I just have a lot more questions, and they’re the kinds of questions I like to end a year with because they make me reevaluate my own reading, my own means of book recommendation, and my own personal “favorite” or “best of” lists. How much farther out do I want to reach to find hidden gems? How many of the big books should I make sure I do read because maybe I am missing something big there, too?
As of this writing, I haven’t yet seen the Booklist or BCCB “best of” lists, and I’m curious how those will stack up against these. Likewise, what will YALSA committees select as best books with their Printz this year, their Quick Picks, or their Best Fiction for Young Adults?
I’d love to hear thoughts and ideas regarding this year’s “best of” picks, especially as they compare to last year’s. Do you have any books you wish had seen time on the “best of” lists that didn’t show up? What about books that appeared on the lists that make you scratch your head a bit?