Fair warning: this post is long, but it is graphic-filled. Because I think these “best” lists are a nice slice of a year in the book world, looking at them numerically is fascinating — but note that nothing here is conclusive or proof of anything. These are all my thoughts and musings on the data. Also worth noting: this is my math, and while I am confident in my statistical skills, I’m also human. There is a chance there are errors, and I accept responsibility for them. I’m hopeful there are not, though.
The first thing I wanted to look at was whether books that published in the first half of the year — January through June — were represented on “best of” lists with more or less frequency than those that published in the second half of the year — July through December. This is of interest just in terms of access to titles, as well as the lasting impact of titles. If a book was good in January, is it still good compared to everything else published in the year? Or do books that were published in December get overlooked inadvertently? Last year, there was a slight preference toward books published in the second half of the year. What about this year?
There isn’t a huge difference in release dates and appearance on the “best of” lists, though this year’s numbers show a preference for titles published in the first half of the year. There were a total of 48 titles on these lists published January through June and 41 published July through December.
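For anyone curious how this tally works, here’s a minimal sketch in Python. The file name and the pub_month column are hypothetical stand-ins, not my actual spreadsheet (which is linked at the end of the post and is messier than this):

```python
from collections import Counter
import csv

# Hypothetical file and column names -- the real spreadsheet linked at the
# end of this post is messier than this.
with open("best_of_lists.csv", newline="") as f:
    months = [int(row["pub_month"]) for row in csv.DictReader(f)]

# Split into halves of the year (months 1-6 vs. 7-12) and tally.
halves = Counter("Jan-Jun" if m <= 6 else "Jul-Dec" for m in months)
print(halves)           # e.g. Counter({'Jan-Jun': 48, 'Jul-Dec': 41})
print(Counter(months))  # per-month breakdown, used for the chart below
```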
For kicks, I broke it down even further. Here’s when books on the “best of” lists were published this year:
Though they’re fairly evenly distributed, the summer and late fall/early winter months saw fewer of their titles land on the “best of” lists. September had the most books published that ended up on “best of” lists.
I looked at debut novelists yesterday in the post at The Hub, but I thought it would be interesting to see whether or not there was a better month to be a debut novelist. So, here is when the 18 debut novels that ended up on “best of” lists were published:
March and August had the highest showing of debut novel publications that then went on to “best of” lists. There were no debut novels published in May or November that went on to “best” lists, despite May being a bigger month for non-debut novels that ended up on a list. It’s pretty even in terms of first and second half of the year publication dates and appearance on a list.
I’m not done with debut novel analytics yet, though. I noted in my post yesterday a couple of important facts: first, the Kirkus “best of” list contained 100 titles (which were then judiciously weeded by me for the purposes of data gathering), which was a significant number. Second, and maybe more interesting to me, was the fact that the editor of Kirkus’s “best of” list is a member of this year’s Morris Award committee. All of the Morris finalists are on that list, as well as a number of other debut novels. I was curious whether, given the length of the list and the editor’s own reading over the year, one journal’s list would feature more debut novelists than the others.
Roughly 20% of all “best of” titles are debut novels.
For this data, I counted the number of “best of” titles from each list, then I counted up the number of those titles that were debuts. Enter some division, and I came up with the percentage of each list that was made up of debut novels:
The blue bar is the total number of selected titles, with the yellow bar being the debut novels selected. I tried to make this graph interactive, but that didn’t work well with Blogger, so apologies!
The raw numbers are as such: School Library Journal selected 4 debuts out of a total of 20 titles (20%); Kirkus selected 16 out of a possible 82 titles (19.5%); Library Journal selected 1 out of a possible 8 titles (12.5%); Publishers Weekly selected 1 out of 11 titles (9%); and finally, Horn Book did not pick any debuts for their “best of” list (0%). This was surprising — I expected the highest percentage to come from Kirkus, but it did not. It was neat, though, that Kirkus and School Library Journal each selected almost the same percentage of debut novels as the share of debuts across all of the lists combined (roughly 20%).
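For those who want to double-check the division, the whole calculation fits in a few lines. The numbers are copied straight from the paragraph above; Horn Book is omitted since its total isn’t needed to show a 0% debut share:

```python
# (debut titles, total titles) per journal, from the paragraph above.
lists = {
    "School Library Journal": (4, 20),
    "Kirkus": (16, 82),
    "Library Journal": (1, 8),
    "Publishers Weekly": (1, 11),
}

for journal, (debuts, total) in lists.items():
    print(f"{journal}: {debuts}/{total} = {debuts / total:.1%}")
# School Library Journal: 4/20 = 20.0%
# Kirkus: 16/82 = 19.5%
# Library Journal: 1/8 = 12.5%
# Publishers Weekly: 1/11 = 9.1%
```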
What was the distribution of books that were part of a series and those that were stand alone titles? Were there more series or stand alones?
Personally, I’m thrilled to see so much stand alone love. There were a total of 53 stand alone titles and 36 series titles. In determining what was and was not a stand alone, I did not include Code Name Verity in the series category, despite there being a companion in the works — the book was originally a stand alone title. Books like The Drowned Cities, though, were included in the series category. That choice was because it is labeled as “Shipbreaker #2.”
Which leads naturally to the next data set: where series books fell within a series. Were first in a series or last in a series more likely to make a “best of” list? Or were middle books the real winners here?
It wasn’t entirely surprising to see that most of the “best of” series titles were either the start of a series or the conclusion of one. There were 6 middle titles — which I defined as anything between the first and last, regardless of the number of books in the series. Within the last-in-series category, I did include sequels when a series included only two books (like Such Wicked Intent and Girl of Nightmares, for neither of which I could find definitive information about a future installment). There was one prequel to a series, and I marked The Drowned Cities as a companion title, rather than as a straight sequel or series finale.
I’m always curious whether earning starred reviews gives a book any better chance of appearing on a “best of” list. In other words, if a book earned 6 starred reviews, is it more likely to show up on multiple “best” lists? I don’t know if there is any connection or not, though titles on a given journal’s “best of” list do tend to have earned a star from that same publication (many of the “best” titles on Kirkus’s list earned starred reviews from Kirkus — and in many cases, those titles only earned stars from Kirkus).
Before that, let’s look at the distribution of starred titles. In other words, how many of the 89 books earned 6 stars vs. no stars at all:
The bulk of books earned only one starred review, followed by books earning either two or three stars. There was a good chunk of books that didn’t earn any starred reviews at all.
Does this translate, then, to frequency of a book’s appearance on a “best of” list? In other words, do books with more stars show up more often? Maybe.
The chart should be fairly self-explanatory, but just in case: the bottom labels indicate how many books earned that number of starred reviews (so there were six books that each earned 6 starred reviews). Stacked above are then the number of lists those books appeared on. So of the books with 6 starred reviews, only 1 fell on all 5 “best of” lists — that’s Elizabeth Wein’s Code Name Verity. There were then 2 books with six starred reviews that fell on four “best of” lists and 3 books with six starred reviews each that fell on three “best of” lists.
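To make the stacking concrete, here’s a tiny sketch of the underlying tally. The sample data covers only the six-star books described above; in practice, the pairs would come from the spreadsheet linked at the end of the post:

```python
from collections import Counter

# One (stars_earned, lists_appeared_on) pair per book. Shown here are just
# the six-star books described above; the rest would come from the
# spreadsheet linked at the end of the post.
books = [(6, 5), (6, 4), (6, 4), (6, 3), (6, 3), (6, 3)]

crosstab = Counter(books)
for (stars, n_lists), count in sorted(crosstab.items()):
    print(f"{count} book(s) with {stars} stars appeared on {n_lists} list(s)")
# 3 book(s) with 6 stars appeared on 3 list(s)
# 2 book(s) with 6 stars appeared on 4 list(s)
# 1 book(s) with 6 stars appeared on 5 list(s)
```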
The heavy majority of books earned placement on just one “best” list, and those were mostly one-starred titles (and often Kirkus picks, though not always). Kirkus was the only journal whose “best” list included books without a single starred review.
Here’s the raw data on that chart (you can blow this up to see it better):
Before delving into a couple other very data-heavy topics, I wanted to look at one that’s easier to graph (but one that’s so interesting to me, nonetheless). That’s publication type. In other words, are books that come out in hardcover more likely to appear on a list than books that come out as paperback originals? And what about split runs? Split runs have started becoming a thing at Simon and Schuster specifically, as Hannah Moskowitz talks about here. In case you don’t click over: a split run is when a book comes out in paperback and hardcover at the same time.
Here’s where visual data isn’t always the best. There were 84 books on the “best” lists published in hardcover. There were three paperback originals — A Breath of Eyre, Street Dreams, and The Assassin’s Curse. All three earned one star from Kirkus and all three only appeared on Kirkus’s “best” list. There were two split run titles — Aristotle and Dante Discover the Secrets of the Universe and The Chaos. The first appeared on two “best of” lists and earned two starred reviews, while the second appeared on one “best of” list and earned three starred reviews.
Worth noting: both titles are from Simon and Schuster. Also of interest is that the first title features an LGBTQ storyline and the second features a POC as the main character. Simon and Schuster’s split run on Hannah Moskowitz’s Gone, Gone, Gone also involves an LGBTQ storyline, and the other title of theirs that I know has a split run is Mindi Scott’s Live Through This, a solid contemporary title. If there are others, please let me know. There is something interesting in their choices for what gets a split run, what goes straight to hardcover, and what goes straight to paperback.
This data shows that hardcovers are, by far, the most frequent types of books to appear on “best of” lists.
I blogged earlier this year about how there are far more publishers out there than just the Big 6 (well, the soon-to-be Big 5). So I thought I’d take a look at the individual publishers represented on “best” lists, and I’ll follow it up with a breakdown of the publishers by whether they are Big 6 or non-Big 6. I’ve collapsed all imprints into their bigger house.
I put the Big 6 up first, and it’s clear they take up much of the list space, but there are plenty of mid-size to small publishers represented, too. Here’s an actual breakdown of the Big 6 against every other publisher on the lists:
I think it is neat that non-Big 6 publishers are taking up more than 1/3 of the lists, actually. I love, too, how Candlewick has four books represented — for what it’s worth, I think Candlewick is consistently putting out some of the best stuff. I won’t list those titles here because they will all be searchable in the spreadsheet linked at the end of the post.
There’s not a pretty graph for this next data set, which is something I was simply curious about. I’ll include the spreadsheet screen cap, though. I was interested in seeing what information I could find on print runs of titles that appeared on “best of” lists. This information, I should warn, isn’t always accurate or true (for a number of reasons), but I was able to track down quite a few print runs for titles appearing on the lists. I then looked at the genre of the books those runs were associated with, as well as the gender of the author. I wanted to know if there was any relationship between size of print run, gender, genre, appearances on “best of” lists, and starred reviews earned. I think the data sample is too small to make correlations, and the accuracy is speculative, but it’s interesting nonetheless. I’m not going to interpret this information.
Because of space issues and screen capping, I could not fit the column labels in with all 26 titles for which I was able to find this information. So, the columns, left to right, are PRINT RUN, LIST APPEARANCES, STARS EARNED, GENDER OF AUTHOR, and GENRE (as defined in my post at The Hub yesterday).
You’re not misreading this. There were books with 500,000 first printings and 200,000 first printings. If you look at my spreadsheet, you can see what they were. But just looking at the additional information in this image should allow you to ferret it out pretty well.
Are you still with me here? This post is never ending because there are a million ways to look at data. But this is the last big thing I wanted to look at, and it’s one I admit up front is subjective, one that will be riddled with arguments, and one where I show my own ignorance because of the limits of my reading this year.
That is representation of POC. I looked at the books that feature POC either as main characters or supporting characters, as well as books written by authors of color. I haven’t read all of these, and I am not intimately familiar with all of these authors. I asked for help with some of these. So what I am about to say in terms of numbers is possibly understating it. I do not think I am overstating it, though. But to be fair, I was loose in applying the “supporting characters” label. Basically, if a book described a supporting character’s race or ethnicity with some detail, it was fair game. Again, I collapsed book characters in with authors, so this number is the combination of the two; I did not double dip and count instances where the author was of color AND their character was, too.
Of 89 books, with 90 authors, I found instances of POC in 22 books/authors. Let’s take the bigger number of authors (90) and do a little math: 22 ÷ 90 ≈ 24%. Almost a quarter of the books on this list. I won’t say whether that’s great or whether it’s not great because I speak from a place of privilege as a white woman. But I am thrilled to see these books getting recognition because these authors and characters? They represent the teens I work with.
One more thing about this particular stat I wanted to note. But before I do that, go read this post on YALSA’s The Hub about whitewashing of book covers and then follow it up with this thoughtful response from Diana Peterfreund.
I’d love to do a breakdown of covers on these award lists and see what is and is not trending. But I simply can’t after looking at all of these numbers. So I did the next best thing. I created two Pinterest boards with just the covers. You can look at them here and here. Looking through those quickly, I found a total of 14 covers featuring a POC pretty obviously (I include Vaunda Nelson’s No Crystal Stair in that count, if you’re wondering). That’s roughly 16% of the total covers (14 of 89).
Again, all of the data above comes from my breakdown of the “best of” lists, with the context that they represent a year of published YA books. You better believe I’ll be revisiting this list when the Printz awards are announced. If you want to see my raw data — and I warn you it is messy and at times inconsistent in how it’s spelled out, though it is very thorough — you can look at my spreadsheets here. I do hope someone goes through those covers I shared on Pinterest and does a post on them. Some suggested interesting things to look at: body parts on covers (eyes and hands especially), girls and guys on covers, and original art vs. the use of stock images.
I think there are some interesting title trends worth noting, too, but I’ll be brief because this post has a lot of information in it and it’s getting excessively long.
Three Word Titles: There are 25, if you consider Catch & Release a three word title. Here’s the list, if you’re curious.
- Code Name Verity
- Ask the Passengers
- No Crystal Stair
- The Raven Boys
- The Drowned Cities
- Keeping the Castle
- A Certain October
- The Good Braider
- Never Fall Down
- Second Chance Summer
- Girl of Nightmares
- Call the Shots
- The Crimson Crown
- The Assassin’s Curse
- After the Snow
- Don’t Turn Around
- A World Away
- The Obsidian Blade
- Throne of Glass
- Be My Guest
- The Broken Lands
- Such Wicked Intent
- A Million Suns
- A Troublesome Boy
- Catch & Release
Titles That Sound Like Band Names (*And One Is): There are The List, The Disenchantments (which is the name of the band in the book), The Chaos, and The Diviners.
Single Word Titles: There are 16 this year. Here’s a list!
- Double
- Passenger
- Boy21
- Pandemonium
- Shadowfell
- Stormdancer
- Enchanted
- Pinned
- Seraphina
- Vessel
- Reached
- Above
- Cinder
- Son
- Bitterblue
- Dodger
Easily Confused Titles: Let’s meditate on Between You & Me and The Difference Between You and Me for a second.
Negative Connotations: There are a ton of titles that carry a negative connotation this year. I’m being very liberal in my usage, simply because I’ve been reading and rereading the same 89 titles for weeks now. But take a look at The FAULT in Our Stars, The MISEDUCATION of Cameron Post, In DARKNESS, Such WICKED Intent, Love and Other PERISHABLE Items, and so forth.
Gendered Titles: It’s interesting to see gendered terms in titles, and the impression you get when you read them all together. So, for the female side, there’s Dust Girl, The Girl with Borrowed Wings, The Girl Is Trouble, Mister Death’s Blue-Eyed Girls, The Girl in the Clockwork Collar, Girl of Nightmares, Me and Earl and the Dying Girl, and The Brides of Rollrock Island. For the male side, there’s A Troublesome Boy, Sons of 613, Boy21, A Confusion of Princes, Son, and The Raven Boys.
Royal and Divinely Inspired Titles: There are quite a few of them (and how many covers feature castles on them!)
- The Diviners (subjectively, of course — objectively, not so much)
- Keeping the Castle
- Devine Intervention
- The Crimson Crown
- Throne of Glass
- A Confusion of Princes
Day and Night: There are nine titles that talk about moon, sun, day, stars, and darkness.
- Beneath a Meth Moon
- A Million Suns
- In Darkness
- Radiant Days
- Have a Nice Day
- Every Day
- The Fault in Our Stars
- After the Snow
- Stormdancer
A Matter of Location: We know where these stories take place.
- Under the Never Sky
- Above
- Between You & Me
- A World Away
- Beneath a Meth Moon
- Passion Blue
- Mister Death’s Blue-Eyed Girls
- The Crimson Crown
- Black Heart
- Bitterblue
- Second Chance Summer
- Summer of the Mariposas
- A Certain October
Objects, not People: Okay, I lied. I looked at covers and noted, too, that 17 of the covers did not sport a single person on them.
If you’re as thoroughly exhausted as me, kudos. This is a lot of information. It’s a lot of information without a lot of context, too, which is why it’s tough to read through and digest. It’s interesting, nonetheless. Does it mean much? Maybe or maybe not.
I suspect people could look at the spreadsheet and find a million more ways to interpret the data. It could be interesting to look at gender and starred reviews, for example. I so wanted to look at gender of character, but it was simply too tough to do — I ultimately chose to delete that column from my list because, not having read all of the books, I couldn’t easily decide whether to count multiple-perspective stories individually or collectively, etc. Maybe someone else can look at this. It’s been done before (if you haven’t read this post before, do it).
All that said, there are books I am absolutely shocked saw no “best” representation. I won’t name them for many reasons, but I’m wondering if other people noticed some obviously missing titles. If you have, feel free to drop the title in the comments. I’m curious, of course, if any of those titles might see their time on YALSA lists at the start of next year.
Any thoughts? Any surprises in the data? Lay it on me!
* For some reason, the links are not working on my preview page. They may work when this post goes live, but in the event they do not, here is yesterday’s post at The Hub. From there, you can see the links to last year’s post, as well as each of the review journals’ “best of” lists.
Thank you so much to Liz Burns, Sarah Thompson, and everyone else who helped me out in looking at and calculating the data, as well as suggesting things worth looking at. Any mistakes here are mine and mine alone.
Stina Lindenblatt says
I'm amazed at all the work you put into this, but I love the breakdown. So the trick is to have a three word title for a book published by Macmillan in September for your stand alone novel and then you're gold. 😀
admin says
Now you know the trick. And it helps, too, if you're a woman 😉
Deb Marshall says
LOL, Stina!
Thanks for doing this, Kelly! So very interesting, am actually going to print it off, highlight what I've read and what I need to read yet.
admin says
There are some gems in these lists, for sure.
evanroskos says
again — great work. very very fascinating. some positive trends in the breakdowns. do you have a robot that can start comparing backwards as you do this every december for the rest of your life? ha!
admin says
DO I EVER WISH I HAD A ROBOT FOR THIS. But nah, it's all me. Fortunately, I think now that I have a system (this is the second year I've done a breakdown), it'll maybe get easier. The hard part's the data collecting. The fun part's the graph making.
It will be super interesting to see what the trends are when the YALSA awards are announced.
TeriBrownwrites says
This is amazing… thanks for all your hard work!
Alicia says
I did a similar (albeit much less formal) analysis of the "best of" lists to help me create my "best of 2012" presentations for students. I looked at NYTimes, Kirkus, SLJ, Publishers Weekly, National Book Award, YALSA Nonfiction Shortlist, YALSA Morris Award Shortlist, and Amazon, and I was surprised to find that there were so many books that made only 1 or 2 lists, and only 3 books (The Fault in Our Stars, Code Name Verity and Liar & Spy) that made at least 5 of the lists. Thanks for your very thoughtful analysis!
Caroline Starr Rose says
As always, amazing. Off to share!
Valerie Cole says
I love stats like these. Thanks for sharing, Kelly! This is a lot of work.