
Ranking the Re-Rankings of the Rankings

Discussion in 'Article Discussion' started by Melody Bot, Jan 14, 2020.

  1. Melody Bot

    Your friendly little forum bot. Staff Member

    This article has been imported from chorus.fm for discussion. All of the forum rules still apply.

    If there’s one thing we like doing on this website, it’s ranking things. We like ranking things so much that it’s become a meme in the forums just for how often it happens in threads. From albums to food, to the new sub-genre of brackets, ranking has become a core part of our little culture. It’s also part of what we do on a pretty regular basis here on the editorial side of things. We’ve got our yearly most anticipated lists and the mid-year and end-of-year “best of” lists. Back in the days of AbsolutePunk, we scored these lists using a basic scale that I think Thomas Nassiff originally came up with. When there were 30+ staff members all contributing, it worked pretty well to give a basic structure to which albums were the most popular amongst staff members. I never really gave it much thought, and it’s been passed down and continued to be used by the different contributors who help put together all of the various lists on the website. Last week I got the itch to re-think this process.

    I started by researching all the various ways you can vote for and rank things: from the multiple ways award shows do it, to election models, to all the math-stat-nerd stuff I could understand. My biggest takeaway was that there’s no perfect way to do this. Every method has pros and cons, and a lot of it comes down to which features are most important to you when devising a system to be most “fair.”

    With that in mind, I decided to create a new algorithm that we could use going forward on all of our contributor-fueled ranking lists. The downside of the method we were using was far more apparent with a much smaller staff pool: albums that were ranked at number one, but only appeared on a few lists, ended up having outsized point values. What I set out to do was create an algorithm that looked at consensus amongst staff members around albums, took into account where those albums landed on each staff member’s list, and ranked the albums based upon what the year looked like as a whole from contributors. So, in a year where there’s a lot of consensus, that would be taken into account, but in a year with a whole lot of unique albums, where fewer are showing up multiple times, it would take that into account as well. The result is a ranking system that still gives a lot of points to highly ranked albums, but can account for an album showing up on very few individual lists when other albums have more of a staff consensus around them. Therefore, the final list feels, to me at least, more representative of what the contributors were listening to and loving at that time.
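
    To illustrate the kind of skew I mean, here’s a rough, hypothetical example. The point values and ballots below are made up for the illustration; they aren’t the actual scale or data the old lists used. The point is just that, with a flat points-per-position table and a small staff, a couple of enthusiastic ballots can outscore an album that far more people had somewhere on their lists.

    # Hypothetical illustration of the old-style flat scoring problem.
    # The point values and ballots are invented for this example; they are
    # not the scale or data the old lists actually used.
    OLD_POINTS = {1: 30, 2: 25, 3: 21, 4: 18, 5: 16,
                  6: 14, 7: 12, 8: 10, 9: 8, 10: 7}

    ballots = {
        "Album A": [1, 2],            # near the top of only two lists
        "Album B": [6, 7, 8, 9, 10],  # lower down, but on five lists
    }

    for album, ranks in ballots.items():
        print(album, sum(OLD_POINTS[r] for r in ranks))
    # Album A 55
    # Album B 51 -- two enthusiastic ballots beat a five-list consensus.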

    What I’d like to do here is detail the process a little bit more, so that there’s a page to link to at the bottom of every ranking post in the future that explains, transparently and openly, what’s happening and is clear about the goals and design of this system. Then I’d like to share some of the results of the new algorithm on some of the data sets I ran it on. Not to “redo” any of the lists from past years; I’m not going to replace them. Instead, it’s fascinating (to me at least) to see the results and how different priorities in weighting and ranking can produce sometimes very similar, and sometimes very different, results.

    For each list, contributors come up with their favorite albums of the year on their own. We don’t sit around and talk about what records would best define the website for the year, or what should be where on the list, or even limit the list of albums that can be picked. It’s just a bunch of individual lists, ranked in a spreadsheet, indicating each individual’s favorite albums for the year. This is how we start. Then I take all of these lists and put them into the ranking system. This system begins by converting each of the “ranks” into a score based on a logarithmic scale. Lower-ranked albums are worth fewer points. Then bonus points are added based upon various factors, such as the album appearing on multiple lists, the average score, how many unique albums there were in a given year, how much consensus was built around albums as a whole, and a few other things. Albums higher on the lists and on more lists generally do the best.

    I most certainly over-engineered the shit out of this, but I think the end result is the fairest representation of my stated goal. I know nothing like this is “perfect,” and someone else may have written a ranking system with different priorities. But, for me, I like seeing which albums are most representative of the staff, correlated with how passionate they are about them. I think this system, prioritizing consensus a tad more while still taking into account how high an album is on a list, does a pretty good job of balancing that line and scales well independent of how many staff members are contributing.
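
    For the number-curious, the general shape of it looks something like the toy sketch below. This is a simplified, illustrative version under my own assumptions: the log scale, the consensus factor, and the score_ballots helper are stand-ins for the idea, not the actual constants or code used for the lists.

    from collections import defaultdict
    from math import log

    def score_ballots(ballots):
        """Toy sketch of the approach described above: each rank converts to a
        logarithmically decaying score, then a consensus bonus is added that
        scales with how many ballots an album appears on relative to how much
        overlap the year had overall. The constants and bonus terms here are
        illustrative assumptions, not the site's actual formula."""
        base = defaultdict(float)
        appearances = defaultdict(int)
        longest = max(len(b) for b in ballots)

        for ballot in ballots:  # each ballot is a list of albums, favorite first
            for rank, album in enumerate(ballot, start=1):
                # Logarithmic scale: #1 is worth the most; lower ranks decay
                # quickly but never drop to zero.
                base[album] += log(longest + 1) - log(rank)
                appearances[album] += 1

        # Rough measure of the year's overall consensus: total list slots
        # divided by the number of unique albums. More overlap, bigger bonus.
        total_slots = sum(len(b) for b in ballots)
        consensus_factor = total_slots / len(base)

        final = {}
        for album, pts in base.items():
            share = appearances[album] / len(ballots)  # fraction of ballots it's on
            final[album] = pts + share * consensus_factor
        return sorted(final.items(), key=lambda kv: kv[1], reverse=True)

    # e.g. score_ballots([["Album X", "Album Y"], ["Album Z", "Album Y"]]) puts
    # Album Y first: it is only #2 on each list, but it's the only album on both.

    The real system layers on a few more bonus terms (average score, how many unique albums the year had, and so on), but the sketch shows the core trade-off: how high an album sits on a list versus how many lists it shows up on.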

    In the future, I think I will end up sharing, in a separate post a few days later (or probably in my newsletter), the final raw data from this system for the number nerds who are curious about things like this. I think it’s fun to look at just how close specific calls were and see how the system determined the higher rank. (While I was making this, I pulled out some of the extra data so I could watch what it was doing and ended up finding that sort of stuff interesting.)

    Here are the results of the new algorithm on various data sets. Again, this is not meant to replace any of the previous lists; I’ll only be using this system going forward. However, I think it’s fun to see what changes the algorithm would have made versus our older model and just how close some of these albums were to each other.

    2019


    [Image: re-ranked 2019 list]

    The original list can be found here.

    Best of the Decade


    [Image: re-ranked Best of the Decade list]

    The original list can be found here.

    2018


    [Image: re-ranked 2018 list]

    The original list can be found here.

    2017


    [Image: re-ranked 2017 list]

    The original list can be found here.

    2016


    [Image: re-ranked 2016 list]

    The original list can be found here.

    I hope those who made it this far into the post found some of this information as interesting as I did. At the very least, I enjoyed the mental exercise, and I like the consistency we’ll have in the future when calculating and then publishing our never-ending-cascade-of-lists.

     
    jorbjorb and noxee like this.
  2. EntryLevelTank

    "Draft Day" is a good film Supporter

    Data nerd here. This stuff is fascinating to me.
     
  3. Thanks for sharing! I love this kinda stuff
     
    Nate_Johnson and anonimito like this.
  4. supernovagirl

    Poetic and noble land mermaid

    I’m glad I put so much labor into creating a discourse and got myself attacked over it only to have it basically boiled down to you “had an itch” lmao

    You know it’s possible to say “in an effort to respond to some valid criticism, I decided to take a look at other options for our algorithm”
     
    dylan likes this.
  5. This is really cool! I actually started thinking about this exact thing when looking over this year's list: the idea of someone's #1 ending up higher overall even if others didn't rank it. It never took anything away from the list itself; I always just enjoy reading the blurbs and whatnot, but it's really cool to see the old and updated versions compared. Oddly enough, seeing NF (an album I liked) so high on the 2019 list was surprising, but the new ranking almost seems more representative of where I'd expect it to fall on this site. Thanks for sharing, Jason!
     
  6. noxee

    Regular Prestigious

    Given how hostile you were with your "valid criticism", it's hard to understand what point you're trying to make, so I don't see why your behaviour should be validated.
     
  7. supernovagirl

    Poetic and noble land mermaid

    Ahh the ole “your critique regarding oppression isn’t wrapped in a docile enough bow for me to take seriously” excuse
    Love it. Fresh.
     
  8. noxee

    Regular Prestigious

    How does pointing out that you were being hostile in that thread imply that I expect you to be docile? My only expectation is that people be respectful and considerate when trying to have a discussion. It feels like you don't want to have any kind of constructive discussion.
     
    tyramail likes this.
  9. supernovagirl

    Poetic and noble land mermaid

    How does pointing out that I was hostile in another thread contribute to a constructive conversation in this thread?

    oh and what part of your original comment to me was respectful and considerate? Just curious.

    Maybe you should spend less time policing my tone and more time taking your own advice.

    I’ve literally only tried to interact and give feedback to staff members but for some reason users keep thinking it’s cool to attack me for it and yet paint me as the hostile asshole. It’s a really cool cycle and all but I’m over it.
    Not gonna interact with anyone about this further other than Jason who is the only person my original comment was directed at.
     
  10. derekjd

    Slow down, Quentin Supporter

    Looks like an awesome idea!
     
    Nate_Johnson and Jason Tate like this.
  11. tyramail

    Trusted Supporter

    This is dope.
     
    Nate_Johnson and Jason Tate like this.
  12. RileyWitiw

    more like absolutepop.net Supporter

    there is so much good music wow
     
    Jason Tate likes this.
  13. Bartek T.

    D'oh! Prestigious

    Lovely! We do like ranking stuff! I've been putting up my AOTY list the last 2-3 weeks and it's almost ready hahah.
     
    Jason Tate likes this.
  14. This is rad. Can’t wait to take some more time this weekend and really look at the differences
     
    Jason Tate likes this.