Everything written, recorded, linked to, posted or stolen for this site represents the opinion of Josh Shabtai, not necessarily that of his employer or anyone else. Thank you.




Despite the flood of big videogaming announcements at this year's E3 -- the promise of as many as three revolutionary new control schemes, gaming consoles that finally bring convergence to the living room and, most importantly, that "Boy and His Blob" remake -- I can't help but mourn a long-deceased convention of the gaming industry.


The most noteworthy announcements related to videogaming -- and this will be the first of many times I use this phrase in this life: when I was growing up -- were about bits.

"Dude, videogame console XX is going to have DOUBLE the bits of console YY!!!"

"Dude, I heard Sega's coming out with a 32-bit system!"

"Dude, everyone knows that your TurboGrafx 16 is really only 8 bits and that makes you a complete pussy but when you're done with Legendary Axe you should come over and play Actraiser because it's totally awesome and truly 16-bit from the Mode 7 graphics to the stereophonic Koshiro score but you wouldn't know what those are because you're only playing a lame 8-bit system in your stupid bedroom that your stupid mom spent $400 on because she's stupid and so are you though that TurboExpress thing sounds pretty cool I mean whoever heard of 16-bit games I mean 8-bit games being able to be played on a handheld so bring it over when you come even though my Super NES is way better."

As a kid who knew nothing about cars but worshipped the SuperFX chip and spent many of his high school years playing Shadows of the Empire, bits were a more tangible metric of machismo and horsepower than... well, horsepower.  Every 4-5 years, a new generation of video game consoles would emerge, effectively doubling the bittage of the previous epoch -- and it was awesome.

Bit cycles meant as much to me as did the Calendar Round to a Mayan, a means for marking time, quantifying progress and even reflecting upon my age.  I'll never forget turning 16 and trying to deny my 'childish' excitement over the Nintendo 64, a battle that was lost in about 3 days because -- come on -- we were talking about 64 bits!

Now, I had no idea what an increase in bits actually did (and honestly, I still don't), though the lovingly designed diagram above clearly shows that graphics improved considerably along with the bits, so something had to be happening.  Regardless of what they actually did, however, bits remained a symbol of clean advancement, not unlike report cards and summer vacation.
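(For the curious: "bits" in console marketing loosely referred to the machine's word size -- how many binary digits it could process at once. A rough sketch, purely as an aside, of why each generational jump felt so dramatic:)

```python
# Each doubling of the bit width doesn't double the number of
# representable values -- it squares it. Hence the playground awe.
for bits in (8, 16, 32, 64):
    print(f"{bits}-bit: {2**bits:,} distinct values")
```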

1999 marked the end of bittage, when Sony's PlayStation 2 touted its 'bit-less' Emotion Engine. (That year, the 128-bit Sega Dreamcast became the last console to draw attention to its, um, bits.*)

We've now jumped several console generations and it's clear that the bittage conversation is over.  In its place is a murkier sense of advancement, trumpeted by motion-sensitive controllers, handheld consoles, online play and a blurring of media platforms.  In the end, it really shouldn't matter that much, but...

*Nearly 10 years later, the bit needle hasn't moved an inch: this generation's PlayStation 3 remains at only 128 bits.

Dude, we should totally be at, like, 1024 bits by now.


Sticking to Canon: The Watchmen Flash Game

If you're a fan of comics, intelligent marketing or 80s-style arcade beat-em-ups, you must check out the new Watchmen flash site, which treats visitors to a seedy, Double Dragon-style arcade cabinet situated within the story's alternate-universe-circa-1985 setting.

It's brilliant:  a marketing tie-in that extends the story's narrative in multiple ways (e.g. the game lets you play as a former generation of superheroes, filling in gaps within the film; it's marked as a product of Veidt Industries, etc.).

But in some ways, perhaps it's off.

[NOTE: Unless you're super-nerdy, you may want to leave now.]

Both Kotaku and my new Twitter friend Valentine Donegan point out that, in its opening screen, the Watchmen game claims to have been produced in 1977.  Fine, no problem; this would fit within the timeline.

But if the game were produced in 1977, it would look more like this:


...than this:

Which it doesn't!  Meaning that HOLY CRAP OMG! WHAT A BUNCH OF F- the game doesn't appear to fit into the larger Watchmen canon. Those bastards.

[Another note:  If you're still here but don't know why this matters, click this link; matters of narrative continuity in anything related to new Watchmen product are of serious concern to many of us... um, I mean, to many other people.]

But, after thinking about it for a few minutes (I'm reminded of this comic), I've realized that the game actually does fit into the larger Watchmen universe.  Here's my rationale:

Yes, it would have been virtually impossible for a "16-bit"-quality game to have come out in the arcades in 1977 at a reasonable cost in the universe as we know it.  But this isn't the universe as we know it; we're talking about an ALTERNATE 1977, one in which a large blue man has helped the U.S. win the Vietnamese conflict, superheroes run around in the streets (well, up until that pesky Keene Act), and people like PIRATE COMICS.  Everything you know is wrong.

Likewise, this alternate universe houses the world's smartest man, Adrian Veidt.  Shit, that dude can figure out what to name a cologne by watching 16 TV screens at once -- I'm sure he can figure out a way to pack more processing power into a standard arcade cabinet.  He probably invented the SuperFX chip years ahead of its time.  And I'm sure he had developed an early version of the UForce, only to realize it sucked before releasing it commercially.  See, those TVs come in handy.

So, long story short:  in the world of Watchmen, it is feasible that Adrian Veidt's company could have produced a Final Fight-quality game 12 years before that same feat was realized in our pathetic universe.

If Watchmen were a Marvel property, I'd have myself a No-Prize right about now.

Good night.  I'm going to go back to writing slash fiction about Doris Kearns Goodwin.
