(blogs let others gawk)

January 28, 2016

Video game review scoring vs. movies, music, etc…

Filed under: General,Historical Rant,Perspective,Videogaming Rant — Bryan @ 6:00 pm

Today I ran across the question of why, when searching through Metacritic, there are more high-scoring reviews for video games than for other entertainment mediums. It’s a good question at face value, but there’s actually more to the answer than you might think.

Say that you are writing a review of a game on a scale of 1-10. In 1986 a game like Ultima IV might have easily garnered a 9 or 10 because it was the pinnacle of its genre, remarkable as a video game in general and an overall exceptional game. It had many notable “new” features such as an exceptionally large game world, lots of NPC interactions for the time, a morality system of sorts, etc… It was also cutting edge in its use of audio technology (on the Apple it supported dual Mockingboards allowing 12-channel audio, which was simply unprecedented at the time, as most games of this era might only use a computer’s built-in speaker, if that, to generate clicks and buzzes).

Ultima IV, released today exactly as it was, might only garner anywhere from a 5 to a 7 as a new game, because while it is still a well-done game, it is now an overused concept and unoriginal by current standards and expectations.

In contrast, an exceptionally filmed movie from 1930 can still be just as visually compelling and artistically comparable to a contemporary film made now. Consider the movie Metropolis. Even today this movie is visually impressive and, story-wise, quite contemporary in its subjects of worker oppression, class elitism and, surprisingly… A.I. Granted, while the silent presentation and slower pacing may prove difficult for some to watch, it can be quite enjoyable for a modern viewer, and it is easy to both acquire and watch without much trouble. The only options for variation in experiencing this movie are between watching in a theater or on a TV. Granted, those experience differences can be significant, but they are typically not considered a factor in a review.

To carry our analogy further, Ultima IV may be enjoyable for modern players, but they must also overcome many significant technological barriers before they can even try to experience the game, and in the end the experience almost certainly will not be the same as it was 30 years ago. You can still potentially go to a movie theater and watch Metropolis with a live pianist. Finding a complete, working Apple //e with Mockingboards and functional game media is more of a challenge, and that’s if you decide you want to try to play the Apple version and not the MSDOS-PC or Commodore-64 ports (most people these days only play the PC port via DOS emulation). Which takes us to our next topic…

Ratings in video games, unlike any other medium, are highly context-sensitive to the technology used and the moment in time they were written for, which is why a review written in 1986 for Ultima IV is more relevant than a review written for that same game today. The prevailing attitude in video gaming culture is that there is simply no way a contemporary reviewer could write a review with the same level of enthusiasm, appreciation and recognition as a period reviewer. To that end, “retro reviews” are typically considered of lower value than period reviews. Another aspect of retro reviews to keep in mind is that many of them now are performed under emulation using non-standard controllers, which may affect the overall experience (e.g., NES games played on a PC via an emulator using a PS2-style gamepad controller are simply not the same experience). Even things like up-scaled pixel resolutions or the lack of scan lines on modern displays (an artifact of CRT-based display technology) can affect the visual experience of a game when the designer incorporated something about that legacy viewing system into the game’s art design.

Let’s consider another example… Stunt Race FX for the Super Nintendo. My magazine at the time gave this game a combined review score of 94.0/100 spread between four reviewers. I distinctly remember this game being visually impressive and I spent hours playing and enjoying the game.

Recently, based on those fond memories, I dug out the SNES, dusted off the controller, loaded up the game cartridge and tried to play it. I found the game almost impossible to view, let alone play. It was an incredibly jarring experience. If a game like that were released right now on a modern platform and I were reviewing it, I would probably tank it.

As I alluded to above, video game scores take into consideration aspects such as the player interface (e.g., the responsiveness of the controller, the screen resolution of the video output, etc…). For the most part movie reviewers do not consider popcorn quality or sticky floors a relevant element in a movie rating (granted, the quality of the camera and projection format may impact movie reviews, but that’s generally the exception, not the norm), yet in video games, interface elements of the user experience are generally intrinsic to a reviewer’s scoring.

Lastly, you can’t look at gaming scores as one undifferentiated spectrum the way you might with other mediums. You really need to segment your data, be it by era, platform, etc…, as those extra parameters are just as relevant to the nature of the score as the raw play experience itself. I suppose movies and music have similar strata, but the differences in technology between eras aren’t typically as critical to the content as they are with video games.
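To make the idea concrete, here is a minimal Python sketch of segmenting review scores by era before comparing them. The titles, years and scores are entirely made up for illustration, and the era cut-offs are my own arbitrary choices:

```python
from statistics import mean
from collections import defaultdict

# Hypothetical review data; every value here is invented for the example.
reviews = [
    {"title": "RPG A", "platform": "Apple II", "year": 1986, "score": 9.5},
    {"title": "RPG B", "platform": "SNES",     "year": 1994, "score": 9.0},
    {"title": "Racer", "platform": "SNES",     "year": 1994, "score": 9.4},
    {"title": "RPG C", "platform": "PC",       "year": 2015, "score": 7.0},
]

def era_of(year):
    """Bucket a release year into a coarse hardware era (arbitrary cut-offs)."""
    if year < 1990:
        return "8-bit"
    if year < 2000:
        return "16/32-bit"
    return "modern"

def mean_score_by_era(reviews):
    """Average scores within each era so comparisons stay era-relative."""
    buckets = defaultdict(list)
    for r in reviews:
        buckets[era_of(r["year"])].append(r["score"])
    return {era: round(mean(scores), 2) for era, scores in buckets.items()}

print(mean_score_by_era(reviews))  # one average per era, not one global average
```

The point of the sketch is simply that a 9.5 in the 8-bit bucket and a 7.0 in the modern bucket were never measured on the same yardstick, so averaging them together tells you very little.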

February 9, 2014

Single point of failure (or how important is your data?)

So, this is a story I don’t tell too often but in light of some recent conversations about performing backups following the news about the Iron Mountain fire, I felt it would be insightful to share.

Back in 1997/1998 I learned a very hard lesson about data loss and the publication I co-edited, Game Zero magazine.

First the back story to explain how this situation ended up the way it did.

We started our web presence near the end of 1994 with a user account at a local Arizona company named Primenet, which offered users the traditional array of features (WWW, POP mail, etc…). This worked out great except for a couple of problems. The first was that even though we had registered the domain gamezero.com for our site, Primenet’s server name resolution would sometimes flip a visitor’s browser to the primenet.com/team-0 URL while the person was traversing the site. This caused lots of people to create bookmarks and links to the site with the wrong URL (this comes into play later).

The second and later problem, although not a technical issue, was the cost associated with bandwidth for WWW visitors to the site. Towards the end of our time with Primenet we were hitting fees of a few hundred dollars a month for bandwidth from our 700,000+ page views a month. Fortunately we had designed our site to be incredibly light, which helped keep costs down, but traffic and fees were climbing. Ultimately I set my sights on moving us to the new “discount” hosting services which were becoming a thing in 1997. It was obvious we could save a significant amount of money by moving the site.

For backups, we had our production computer which housed all the original and developing web content, including the active mirror of the website and remote publishing tools as well as our POP e-mail client for all business e-mail. Additionally, we kept backups of web content and e-mails on a collection of Zip disks along with some limited content on a random assortment of floppies.

In 1997 hard drives were expensive! We’re talking a few hundred dollars for a 1GB drive. Our production PC had something like a 120MB drive, as I recall, so we had lots of data offloaded onto the Zip disks.

Also around this time we received word that the provider which had been handling our FTP-based video repository was getting out of the hosting business. I decided it best to roll the video content into the new web hosting arrangement, as the price would still be reasonable. We quickly migrated everything over, changed DNS entries, started sending out e-mails asking people who had the old primenet.com addresses to please update their links, etc… Following the migration we published only a few major updates on the new server, consisting of a couple of new videos and some articles which existed only on the website, our production system and our Zip drive backups.

Then problems started…

  1. Traffic tanked on the new server.
  2. My crawling the web looking for bad links suddenly made me aware of just how bad the linking issue was: a significant amount of traffic was still going to the old Primenet URL. Fortunately, right before we closed our Primenet account we set up a root page that linked to the proper URL along with a notice about the move, which Primenet was kind enough to leave up at no cost. But it wasn’t a full site-wide redirect, just the root pages.
  3. A few months into running on the new provider, their servers went dark. When I contacted them to find out what happened, I reached a voicemail informing me that they had filed for bankruptcy and closed business. Done, gone… No contact and no way to recover any of the data from the web server.
  4. We now had a domain name that didn’t respond, our old provider’s server was pointing traffic to that very same dead URL and since we had long since closed the Primenet account we had no ability to log in and change the redirect notice or make other modifications to redirect traffic someplace else.
  5. While scrambling to find new hosting, the hard drive on our production computer completely and utterly failed. 100% data loss.
  6. After getting a new hard drive I went to start rebuilding from our Zip disks and, to my horror, none of them would read. We had become a victim of what came to be known as the “click of death”. We lost some 20-30 Zip disks in total. Almost everything was gone except for a mirror of the website from before the migration to the new hosting and other random items scattered around. We also had a limited number of hard copies of e-mails and other documents.
  7. Lastly, while the Internet Archive is now a great way to recover website content, at this point in time it was still just getting started and its “Wayback Machine” had only taken a partial snapshot of our sites (in both the US and Italy). Par for this story, the lost content was pages that had not been crawled yet, except for the index pages for the missing videos. I could view the archive of the video pages… but the linked videos were too large at that time and were not mirrored.

Coming into this, I felt we had a pretty good data backup arrangement. But I learned the hard way that it wasn’t good enough. We lost all of the magazine’s e-mail archives including thousands of XBand correspondences as well as innumerable e-mails with publishers and developers. We lost two videos that had been produced and published. We lost a few articles and reviews. We also lost nearly all of the “in progress” content as well as a number of interviews.

At this point the staff agreed to stop spending money on the publication and formally end the magazine, especially since some of them were already making natural transitions into their careers and school. While we had stopped actively publishing at the end of 1996/start of 1997, if you were to ask me if there was a hard line for the true end of the magazine, this was it.

Ultimately I did get the site back up as an archive which you can still read today. But, that’s another story.

The lesson of this story is to remember that there is no fool-proof backup situation. Only you can be responsible for you (or your company’s) data and you must always be aware that no matter what your best efforts are, data loss is always a possibility.

99.9% guarantees are great except for that 0.1% chance, which is still a chance! And if someone is selling you a 100% guarantee, let me know, because I’ve got the title to this bridge in Brooklyn I might consider selling you for a deal.

What could I have done differently?

  1. Spread out our backups across more than one media type and more than one location. Simply having a duplicate set of Zip disks and a second drive off site, with no cross-mixing between the sets, would have made a huge difference here.
  2. More frequent backups of critical business data such as e-mail.
  3. Retained the master account with the old service provider until we were sure traffic migration had been completed.
  4. Isolated both the problematic media and the drive from use at the first sign of the Click of Death and looked for a second drive, since the damage propagated once it manifested. But nobody had enough information about the problem at the time, and the manufacturer kept denying the problem existed.

Granted, some of these would likely have added overhead cost, but the question is: would that cost balance against the value of the data lost? I don’t know. But since this happened I have been far more diligent in my data storage strategies, where I now weigh the value and importance of the data against the breadth and depth of the backup plan and go with the best possible solution I can devise.
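As a modern illustration of the first lesson, here is a rough Python sketch of a "copy to multiple destinations and verify" routine. This is a hypothetical example, not anything we actually ran back then; the function names and layout are my own, and a real setup would also want an off-site destination, not just two local folders:

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Checksum a file so copies can be verified, not just assumed good."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def replicate(source: Path, destinations: list[Path]) -> bool:
    """Copy `source` into every destination directory and verify each copy."""
    original = sha256_of(source)
    ok = True
    for dest_dir in destinations:
        dest_dir.mkdir(parents=True, exist_ok=True)
        target = dest_dir / source.name
        shutil.copy2(source, target)  # copy2 preserves timestamps
        if sha256_of(target) != original:
            ok = False  # a copy that can't be read back is not a backup
    return ok
```

The read-back verification is the part our Zip disk setup was missing: we wrote backups for years without ever confirming that the media could still be read.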

I have had only one significant data loss in the years since this happened. It was just last fall, while I was doing some data re-organization as part of a desktop upgrade. A USB drive I was using for temporary storage fell over and became damaged in such a way that it would no longer read the disk. I then discovered that the data on the drive hadn’t been synchronized with the backup repository for a couple of months for some reason. Fortunately it was non-critical, personal data (downloaded drivers and install packages that I was able to re-download from the Internet), so all in all the only loss was my time. But it was a reminder that even though I am far more careful than before, accidents can still happen.

June 24, 2013

What’s been going on (2013 edition)

Filed under: Historical Rant,Unloading — Tags: , , — Bryan @ 11:05 am


Ok, so to begin with: back on November 30, 2012, a Wikipedia editor made the final decision that the Game Zero magazine entry was not only irrelevant, but lacking enough citations to warrant reference in the wiki.

At least I was able to talk him into restoring the deleted page/talk page to a personal page on my profile, which you can find here.

I knew this was going to happen eventually. If you’ve been by my blog at all or seen my posts around the web, I have complained for years about the black hole of history that is 1994-1996 that relates to the Internet.

I’m really proud of the work we did at Game Zero and how we laid the groundwork for so much of what ended up being staples of web-based video game review sites. Sure, a lot of it was just common-sense outcomes for the medium, but we did it first and people appreciated it at the time.

Oh well… such is life eh? Maybe at some point I’ll get the decision reversed.

July 14, 2010

The industry is dead! Long live the industry! (Won’t someone think of the children!?!)

Since I briefly touched on the subject of media transition in my post about going mobile friendly, I think this is a good chance to highlight some historical hysteria regarding entrenched business models collapsing and being replaced by new ones.

Let’s specifically look at the history of music distribution over the last 100 years.

Going into the 1900’s, piano rolls and sheet music were the predominant methods of music distribution. Granted, there were also broadsides, but those were considered a medium for the working class and were typically lyric sheets with no music score, commonly notated with statements like “sung to the tune of -fill in the blank-”.

Even in a time when the average worker earned around $600 a year, 25-60 cents for a copy of sheet music was a premium purchase for many. That said, sheet music was big business, and when the phonograph came around, sheet music publishers saw the new medium as a threat and fought tooth and nail to kill it. “Oh! We can’t let this happen”, “This will destroy the traditional family gathering in song”, “Nobody will learn to play music”, “Someone think of the children”, etc…

But the reality on the ground was that pianos are expensive both to purchase and to maintain. On the other hand, phonographs were cheaper to produce, cheaper to maintain, easier to operate, and you didn’t have to be able to play music to enjoy them. In the end music producers actually sold more copies of music because they now had a larger audience, and the companies that adapted to the new business model profited greatly.

Most sheet music publishers that failed to adapt to the new medium died off. Granted, some of the fears were well founded; the days of families gathering around a piano to sing together for leisure were lost (if they truly were all that common to begin with). But sheet music is still produced and sold in most music stores. Granted, these days it’s mostly piano- and guitar-based, but those are the popular instruments for people learning to play music, so it only makes sense.

Let’s roll ahead to the next big jump: radio. When radio hit the scene, phonograph publishers went crazy. “Oh! We can’t let this happen”, “People will quit buying music when they can get it for free over the radio”, “This will make it impossible for musicians (sic, publishers) to get paid for their work”, etc…

Contrary to most concerns, radio actually increased sales, for two reasons: people were exposed to a larger variety of music, and they liked the convenience of hearing a song on demand, so naturally they went out to buy their favorite songs in order to have them available to listen to. Publishers that added value to their product (e.g., B-sides) saw even better profits among core fans.

Then over the next 60 years not much changed aside from improvements in recording and distribution. Granted, there were fights over the introduction of cassette tapes and fears that people would just copy music instead of buying it. The same thing happened with CD-based music. But don’t forget, it’s always been a steady lowering of the cost of entry into the world of listening to music, contrasted against the publishers’ desire to maximize profits from that same music. Also, people on the whole will more often than not pay for something when they feel it is being sold at a price they perceive as fair. Don’t believe me? Ask Trent Reznor (of the band Nine Inch Nails) or, for another example in a different entertainment industry, ask the video game publisher Stardock. Both have spoken at length about the success of this business model for their sales.

For who knows how long, the common perception (supported by much anecdotal evidence and statements from artists) has been that artists get paid little if anything for their work when they publish music through a publisher, and that publishers take all of the profit from sales. The big money for the artists, more often than not, is in concerts, live performances and endorsements. A popular song will generate larger ticket sales and everyone wins (hopefully). This situation set the stage for the next big crisis for publishers: the Internet combined with the modern computer.

In the mid-to-late 90’s the cost barrier to copying and sharing music finally broke down, and anyone with a computer and an internet connection suddenly found a plethora of methods to acquire music to listen to for free (sometimes pirated, sometimes not) that weren’t available before. And the net result? Sales increased! What’s that? That doesn’t make any sense. The music publishers told us that people stealing music was losing them money. Wrong. People downloading music was gaining them customers. The money they were supposedly losing was based on estimates of “if every single downloaded song on the Internet had instead been purchased, we would have profited this much”.

… publishers used the same logic with radio by the way.

The reality was that people who used to be pigeonholed into a particular music style suddenly had an inexpensive way to explore new music that they might not have been willing to pay for (on the risk that they might not like it). When they discovered a new artist or new genre they enjoyed, they frequently went down to the record store to find more of that music to purchase. You had punk rockers buying classical music and country lovers buying speed metal.

But the industry could only focus on the “lost sales”, not factoring in that these weren’t really lost sales; anecdotal evidence from the time indicates they were more like samples. To compare: yes, we know there is always going to be the guy who lives off samples at the grocery store for dinner, but most people actually buy their food, and the samples are good because they primarily encourage regular customers to try things they never tried before. The guy living off of them is a cost of doing business.

I always said at the time that music companies should have jumped on this immediately and put their entire catalogs online at a 56kbps bit rate (radio quality), with an easy click to purchase the higher-quality version priced as a convenience item. They would have made a killing, but instead they decided to fight their customers (and still do), effectively deciding to sue anyone who eats a square of cheese at the deli counter without buying the whole wedge.

When asked years later why they pursued this course of action, one executive answered that they were so scared of the changes happening, and so aware that they didn’t have a clue about what was going on, that they feared everyone was out to rob them and that even the consultants couldn’t be trusted. So they fell back on the only tool they could trust: their lawyers.

Sadly this fight is still playing out even to this day, but in the last year some significant changes have happened that probably mark the end for some of the large publishers in this space. The barrier of entry for recording equipment has vanished, and the number of bands frustrated with publishers and finding greater profitability by simply going solo on the Internet is increasing. When you strip away all of the fluff, YouTube is now the largest publisher of music on the Internet. So large that other publishers now effectively turn their content over to them just to get exposure for their artists.

The more things change, the more they stay the same. Business is always evolving, and those that learn and adapt quickly are well positioned to profit from their observation skills. Others are destined to dig in their heels and ultimately become a footnote to history.

….some references that helped in the creation of this article are listed below.

  • Media-Morphosis: How the Internet Will Devour, Transform, or Destroy Your Favorite Medium: http://www.internetevolution.com/document.asp?doc_id=171555&
  • http://www.econlib.org/library/Enc1/WagesandWorkingConditions.html
  • http://www.bls.gov/opub/cwc/cm20030124ar03p1.htm
  • Perspective: Radio/phonograph was going to destroy print: http://web.mit.edu/comm-forum/papers/murphy.html
  • Sheet music and broadsides…: http://popmusic.mtsu.edu/dbtw-wpd/textbase/broad/broadside_ex.htm and http://www.phonobooks.com/BirthRec.htm
  • http://cultureandcommunication.org/deadmedia/index.php/The_Victrola
  • Radio was going to destroy the records: http://en.wikipedia.org/wiki/History_of_radio#Legal_issues_with_radio (although the Internet distributed music has revolutionized the way records are sold, it still hasn’t destroyed them)

July 4, 2009

Lost frames of reference… Part 3: you put the phone where?

Filed under: General,Historical Rant,Technology Rant — Tags: , , , , — Bryan @ 10:37 am

Since we’re on a roll talking about telephone technology let’s branch out.

The most obscure phone device that more people have seen than used is the acoustic coupler. Did you see the movies WarGames or TRON (boy, talk about dated references, eh)? These days, those are probably the most likely places people will see an acoustic coupler in action. Back when you had to lease your phone from Ma Bell and most people still had rotary dial phones (instead of Touch-Tone phone systems), someone figured out a bright idea for how to connect two computers together over a phone line.

The acoustic coupler was a device that most commonly plugged into the serial port on your computer and would convert the data sent to it into audio tones. It would then pulse those tones over the phone line, where the recipient computer would record the tones and turn them back into data to be processed (in many ways it is analogous to data storage via audio cassette tape on early computers, but that’s a whole other subject). You would dial the phone manually and tell the person on the other end to put their receiver on their coupler and set their computer to receive. Then you put your handset on your coupler. Once the computers were connected, data transmission would start.
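For the curious, the tone-generation half of that process can be sketched in a few lines of Python. This is a simplified illustration of frequency-shift keying using the Bell 103 originate-side frequencies (1070 Hz for a 0 bit, 1270 Hz for a 1 bit), not a faithful modem implementation; real couplers also framed each byte with start and stop bits, which I've left out:

```python
import math

SAMPLE_RATE = 8000           # audio samples per second
BAUD = 300                   # bits per second, typical of early couplers
FREQ_0, FREQ_1 = 1070, 1270  # Bell 103 originate-side space/mark tones (Hz)

def byte_to_bits(b):
    """Split a byte into 8 bits, least-significant bit first."""
    return [(b >> i) & 1 for i in range(8)]

def modulate(data: bytes):
    """Turn bytes into audio samples: one sine-wave tone burst per bit."""
    samples = []
    samples_per_bit = SAMPLE_RATE // BAUD
    for byte in data:
        for bit in byte_to_bits(byte):
            freq = FREQ_1 if bit else FREQ_0
            for n in range(samples_per_bit):
                t = n / SAMPLE_RATE
                samples.append(math.sin(2 * math.pi * freq * t))
    return samples
```

The receiving side did the reverse: it measured which of the two frequencies was present in each time slice and rebuilt the bit stream from that, which is why a noisy phone line or a loose handset in the coupler's rubber cups could garble the data.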

After the AT&T/Bell break-up I mentioned previously, people eventually had the ability to plug devices other than a Bell telephone into the phone network. From there you saw modems that you just plugged the phone line directly into. These modems could also issue the dialing tones to initiate a call and could monitor the line for a ring in order to provide an unsupervised answer (this led to the dawn of home-computer-run Bulletin Board Systems). The last progression was to move the modem directly into the computer as a board that plugged into a slot. Fewer and fewer people are using modems now with the spread of broadband internet services, although in very remote locations where the only communication is an old-style telephone landline, some people still use modern acoustic couplers that run off the USB port. As cell phone tethering becomes more prevalent, this too shall likely pass.

Another soon-to-be-lost piece of once-common phone technology is the RF-based beeper or pager, and eventually their cellular-technology-based cousins. Before everyone had cellphones, someone working in a job that required being on call might carry a beeper. Initially beepers were tied to an operator, and then to a voice-mail system. Someone would call you and leave a message. Your beeper would buzz/beep. You would call your operator or voice-mail system and retrieve your message.

The first major upgrade of these featured a small display on the beeper that would display the phone number of the caller.

Following that, pagers got to the point where the display would show any number the caller punched in, so a message could include other numbers representing agreed-upon code systems, allowing you to get the gist without having to call the voice mail.

The last generation I saw in common use supported texting/SMS services like those on a cell phone.

Beepers used to be expensive and common in nearly all professions, but from personal experience, for IT workers in the 90’s having to carry a beeper for their employer was more of a curse than a benefit. It may have been this way in other industries as well, for all I know. The curse being that you had no excuse for missing that alert at 2am when the server actually crashed.

One real-world scenario here was:

  1. Your employer thought their systems were so important that they needed a 24/7 babysitter.
  2. But they were too cheap to actually pay staff to sit in a data center around the clock to monitor for problems.
  3. So now you get lovely false-alarm beeps waking you at 3am when a random Windows server reboots unexpectedly.

Additionally, for a number of years having a beeper (just like early cell phones) was used as a status symbol for the rich and famous to help make sure they looked important even if they never used it.

Some good extra reading if you’re up to it:


Or, jump back to Part 1 or Part 2
