Science presents: the top 10 best and worst Mega Drive games

Discussion in 'General Sega Discussion' started by Black Squirrel, Apr 10, 2021.

  1. BSonirachi

    Opa-Opa takes flight! Wiki Sysop
    None of Beep! MegaDrive's ratings for Game Gear games have been put up on Sega Retro yet, and if this game got reviewed in that mag, then chances are it won't be the number one worst game once that review is documented.

    Akane usually handles the Japanese magazine ratings for all other systems, but has never done the Game Gear ones.
     
  2. Black Squirrel

    this is what KLF is about Wiki Sysop
    (Better here than clogging up the general information topic)

    "The science" in Sega Retro terms is a very simple "convert x rating system to percentages" affair which leads to a wide variety of problems. You would think a magazine who used 0-100 review scale would... use the scale, but in reality, there's probably only about 10 or 20 games on the wiki that have ever been consciously rated less than 10%. In fact, games that score that low usually do so because of politics, as in "reviewer hates his job" rather than "reviewer hates this game". I'd quite like to include some form of "weighting" to make things more relatable, but I genuinely don't know how best to do that, or whether it's just better to put a bazillion asterisks after the number pointing out how it will never be accurate.

    Also, the fluctuation in scores highlights something else - the cultural differences among games journalists. You can't really see this on Metacritic because most new scores come from a homogenised, English-speaking internet, but for early 90s magazine content, it's a bit more interesting:

    For example, take the critically acclaimed shooter Thunder Force IV. The average score is around 90, but there's a noticeable difference when you split it down country lines (there's a rough sketch of how that grouping works further down):

    - The US average is 95. I find that, particularly in the old days, US magazines seem to be overwhelmingly positive about video games as a whole. EGM and GamePro read like a 200-page advert - "this could be awesome", "this is awesome", "but this is even more awesome". Their teeth start coming out a few years in when they're faced with tat, but it's rare to see a game score lower (when converted to our scale) than 50%. Are they in the pockets of publishers? Well there's plenty of damn adverts, so maybe, but a less "professional" magazine like GameFan is equally positive on the medium as a whole, so I guess video games are just awesome, guys.

    - The French average is 94. French magazines from this era are overwhelmingly positive about video games too, with everything seemingly getting more than 70-75%. However, unlike the US, where it seems to be about not wanting to offend suits, my guess is that the French mags didn't actually bother to play the games they were praising - and if the game was actually made in France, you got top marks by default.

    - The German average is 88. Germany is a lot more demanding of its video games, and its bigger multi-platform magazines (ASM, Mega Fun, Play Time, Power Play and Video Games) were quite happy to award 10s and 20s if the product was crap (or if they felt like it). 80% is a good score in Germany, while it's fairly average in France. Anything over 90 is exceptional, and scoring over 95 just doesn't happen.

    - The UK average is 90. We have a lot of British magazines, and with gaming journalism going back decades, there's usually a bit of quality in the writing, even if the scores can be a bit loopy. Some publishers go for "70 is average", others are closer to 50, and "grown up" magazines like Edge tend to rate games lower than some of the (other) Future or EMAP publications. You get the broadest range of opinions from this period from the UK... which isn't all that helpful if you're using these things to help make decisions.

    - The Japanese average is 81. Japan doesn't tend to give games really low marks, but it's very hesitant to score things highly - it's all 50/60/70, and 80 on a good day. So in a way it's less about describing how good games are relative to each other, and more "buy this game" versus "don't buy this game". Mind you, I can't rule out an anti-Sega bias for the early 90s - Nintendo games do seem to get more attention and score higher(?), so maybe that's a thing... although you wouldn't expect that from the dedicated Sega magazines... right?

    (and this applies to most video games from this era, not just Thunder Force IV)


    We don't have enough reviews from other countries to make sweeping judgements there, but from what I've seen, Spain tends to be positive, Brazil and Italy use random number generators, and the rest usually fall in line with the average. By the late 90s, especially around the Dreamcast launch, some of that positive skew tends to go away as people start "expecting" things from their consoles, as opposed to being in constant awe that video games exist.
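    For anyone wondering, those country splits aren't anything fancy - just the converted percentages grouped by the country the magazine was published in and averaged. Something like this, where the magazine names and scores are invented for illustration (they're not the actual Thunder Force IV data):

    [CODE]
    # Sketch of the country split: group converted percentages by the
    # magazine's country and average them. The data below is invented.
    from collections import defaultdict
    from statistics import mean

    reviews = [
        # (magazine, country, converted percentage) - illustrative only
        ("US mag A", "US", 96), ("US mag B", "US", 94),
        ("FR mag A", "FR", 95), ("FR mag B", "FR", 93),
        ("DE mag A", "DE", 90), ("DE mag B", "DE", 86),
        ("UK mag A", "UK", 92), ("UK mag B", "UK", 88),
        ("JP mag A", "JP", 83), ("JP mag B", "JP", 79),
    ]

    by_country = defaultdict(list)
    for _magazine, country, score in reviews:
        by_country[country].append(score)

    for country, scores in sorted(by_country.items()):
        print(country, round(mean(scores), 1))
    [/CODE]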
     
  3. doc eggfan

    Are you pondering what I'm pondering? Wiki Sysop
    I'm not sure how you'd code it/automate it, but I would look at all the review scores for a particular year in a particular publication for a particular system, and then within that sub-group, the highest scoring game, whether it's 88 or 91 or 97, would get re-scored as 100%, and the worst game for the year, whether it's 33% or 51% or 69%, would be re-scored as 0%. Everything else within the sub-group would then be re-scored and distributed, in order from worst to best, over a normal distribution curve centred on 50%. Then you'd re-assess based on these new scores.
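    If I had to take a stab at it, it might look something like the sketch below. This is just my rough reading of the idea, not anything the wiki actually does - the width of the curve (sigma) is a fudge factor you'd have to pick, and I'm pinning the 0/100 endpoints by hand since a true normal curve would never actually reach them:

    [CODE]
    # Sketch of the rank-based re-scoring idea: within one
    # publication/system/year sub-group, the best game maps to 100, the
    # worst to 0, and everything in between is spread along a normal curve
    # centred on 50. Sigma is an arbitrary choice.
    from statistics import NormalDist

    def rescore_subgroup(scores, sigma=20):
        """scores: original review scores for one publication/system/year.
        Returns re-scored values (0-100) in the same order as the input."""
        n = len(scores)
        if n == 1:
            return [50.0]
        order = sorted(range(n), key=lambda i: scores[i])  # worst -> best
        dist = NormalDist(mu=50, sigma=sigma)
        new = [0.0] * n
        for rank, i in enumerate(order):
            if rank == 0:
                new[i] = 0.0        # worst game of the year
            elif rank == n - 1:
                new[i] = 100.0      # best game of the year
            else:
                p = rank / (n - 1)  # rank percentile within the sub-group
                new[i] = min(100.0, max(0.0, dist.inv_cdf(p)))
        return new

    # e.g. one magazine's Mega Drive scores for a single year:
    print(rescore_subgroup([88, 91, 97, 33, 51, 69]))
    [/CODE]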