I’ve called myself a gamer since I was twelve years old. Some years, particularly those in high-school, I had almost no other identity at all. There would be little exaggeration in saying that games have always been my drug; I often turn to virtual worlds for comfort whenever I’m depressed or bored.
But, like any addict, I’ve learned my drug of choice causes just as much trouble as it solves. My worst binges find me tearing through games, consuming them in giant gulps before moving on to another the moment I tire of the last. The feverish pace of play is worsened by announcements, arguments, and forum wars, which I read with a dedication I rarely grant a textbook or novel.
You’d think that such periods of intense gaming would go hand-in-hand with some of my most memorable experiences, but in fact the opposite is true. I often come away from games with more bile and loathing than ever. By digging into them so thoroughly, I destroy the very escape I sought, reducing everything to a rush of number-crunching and petty arguments, punctuated only by the hope that the next game will be different. But, of course, it never is; after an initial honeymoon it’s torn apart like all the rest, reduced to components to be endlessly picked apart and mocked.
The launch of a console generation always places a focus on specifications that, after launch, are quickly forgotten by most players in favor of talking about actual games. One such specification is the resolution at which games are rendered. A minor skirmish occurred between Microsoft and Sony at the launch of the last console generation over the fact that the Xbox 360 didn’t include 1080p support at release (this was later patched in). Phil Harrison, working for Sony at the time, even proclaimed that “HD doesn’t start until we’re on the market.”
A recent client asked me to put together a list of recommended laptops in several categories, one of which was desktop replacements. After reviewing it, they came back to me with some questions about the list. Compared to the last list they’d compiled, which was several years old, my selections were much less expensive. Was this truly representative of the market? What had changed? My client wanted to know.
Home theater enthusiasts have spent the last five years collectively scratching their heads over the failure of plasma television. The technology has many clear advantages over LCD derivatives; it offers deeper black levels (and thus better contrast), clearer motion, and doesn’t suffer from uniformity problems. Almost all enthusiasts agree that plasma is superior to LCD, and sometimes by no small margin – yet sales have repeatedly fallen short of expectations, and the technology’s most ardent supporter, Panasonic, struggles to make a profit even though its most serious competitor, Pioneer, abandoned the technology several years ago.
Many observers of this trend have concluded, in summary, that people are stupid. While I don’t entirely disagree with that rather cynical statement, it’s a position that onlookers, arm-chair analysts and pundits all too often retreat to, and in the case of plasma, I think it’s dead wrong.
People are not buying LCD televisions because they don’t know better, have been tricked by marketing, or fooled by in-store presentations. People are buying LCDs because they’re a better fit for many consumers. There’s a lesson of perspective to be learned from plasma’s hard times, and it applies not just to televisions, but to all of consumer electronics.
Part of my job is to think about the future of computing. I’m paid to write informative pieces about current events in technology, and that means I must think about what happens next. At the same time, however, I try not to indulge in excessive speculation so I don’t end up quoted by John Gruber. If something seems unlikely to occur within a few years, I keep my mouth shut. Mostly.
But speculation can be fun, and some recent events in technology have caused me to think about the future of computing even more than usual. There are several ideas in my head that I just can’t resist. So here they are. Take them with a grain of salt.
Consumer technology has witnessed a number of revolutionary changes in the last few decades, each of which opened up a new battleground. First came the desktop, where PCs sparred for attention, followed by the laptop, which made PCs mobile, and the smartphone, which took the war to our pockets. Now tablets have opened a new front in this ongoing match. But what’s next?
I think the next frontier is among the most familiar: the living room. At first glance this seems unlikely, as the living room has been a part of American life for years. What more could be done? Quite a lot, actually, and there’s good reason to think consumers might embrace change in one of today’s most mundane rooms. Here are the three core reasons why the living room is ripe for revolution.
Welcome to @matt_on_tech, a simple blog belonging to me, Matt Smith. I’m a writer, technology guru, gamer, and Senior Editor of the Computing section at Digital Trends. In my down-time I’m an avid fan of video games (particularly the strategy and racing genres), but I sometimes get away from my PC long enough to camp, hike and enjoy the Pacific Northwest.
Email me to get in touch or follow me on Twitter and Google+. I’m currently looking for gigs that cover gaming, iOS devices or display technology.
We now know the PlayStation 4 is $399. That price doesn’t include the Eye camera, however, which I wrapped into this prediction as if it were bundled; it’s instead sold as a $59 add-on. That puts the system, with the features I expected, at $459. I’m pretty happy with being just $40 off.
The lack of creativity in modern gaming is my biggest gripe about the hobby. Games offer the opportunity to interact with incredible worlds and situations that are far beyond everyday experience, yet most titles boil down to shooting or stabbing something in the face. Granted, most of us don’t spend much time murdering, so the allure is obvious – but there are other frontiers to explore.
So you can probably understand my excitement when I heard of Kerbal Space Program late last year (yes, I’m late to the party). A game about building and launching rockets? About exploration? About challenging what seems possible? Fuck, yeah!
In 1997 Bill Gates sat down with cable executives and gave them a proposal. His company would sell them a box running Windows CE that would integrate the television experience, putting all content in one place. Cable executives loved the idea, but there was one small problem: Gates wanted 50% of all new interactive-advertising revenue. The cable executives (literally, in one case) told him “no fucking way.”
The deal died, but Microsoft never gave up the dream of an all-in-one box running Windows. The company shifted its strategy to circumvent cable providers by targeting something that happened in the living room but wasn’t under their control: console gaming. And now, after 13 years of effort, the dream has become reality with the Xbox One.