Welcome to @matt_on_tech, a simple blog belonging to me, Matt Smith. I’m a freelance writer, technology guru, gamer and all-around geek. Currently, my time is taken up by hardware reviews for Digital Trends and articles for MakeUseOf, where I serve as both Assistant Answers Editor and Rewards Editor. In my down-time I’m an avid fan of video games (particularly the strategy and racing genres), but I sometimes get away from my PC long enough to camp, hike and enjoy the Pacific Northwest.
The above is a screen capture of my Google Authorship stats. As you can see, they’ve nose-dived. Suddenly I’m getting almost no impressions and, as of the last few days, zero clicks.
Except (thank god) this isn’t actually true. One of the sites I work for provides author stats, and they’re reporting just as they did before. Yet the stats I can view are screwed up – and why? Beats me. And I’ll almost certainly never hear of a solution from Google.
Microsoft’s recent announcement that it will ship a $399 version of the Xbox One without Kinect was met with mostly positive remarks, to no one’s surprise. Widely considered an inaccurate, unreliable and potentially insecure extra, Kinect still lacks a killer game despite years of development. Microsoft and its partners never produced Kinect’s answer to Wii Sports, a failure that’s doubly damning considering that Kinect is much larger, more complex and more expensive than the Wii’s rudimentary motion sensors.
The decision to drop mandatory Kinect represents a potential windfall for the Xbox One’s sales, which continue to lag the PlayStation 4. It’s a move that makes instant sense, as it brings the One in line with its competitor’s pricing and ditches the baggage of a hated peripheral. In a broader sense, however, dropping Kinect is a massive blow to Microsoft, and if the company seemed to drag its feet on the decision, it’s only because ditching its motion sensor means abandoning a potentially lucrative arena.
I’ve called myself a gamer since I was twelve years old. In some years, particularly those in high school, I had almost no other identity at all. There would be little exaggeration in saying that games have always been my drug; I often turn to virtual worlds for comfort whenever I’m depressed or bored.
But, like any addict, I’ve learned my drug of choice causes just as much trouble as it solves. My worst binges find me tearing through games, consuming them in giant gulps before moving on to another the moment I tire of the last. The feverish pace of play is worsened by announcements, arguments and forum wars, which I read with a dedication I rarely grant a textbook or novel.
You’d think that such periods of intense gaming would go hand-in-hand with some of my most memorable experiences, but in fact the opposite is true. I often come away from games with more bile and loathing than ever. By digging into them so thoroughly, I destroy the very escape I seek, reducing everything to a rush of number-crunching and petty arguments, punctuated only by the hope that the next game will be different. But, of course, it never is; after an initial honeymoon it’s torn apart like all the rest, reduced to components to be endlessly picked over and mocked.
The launch of a console generation always places a focus on specifications that, after launch, are quickly forgotten by most players in favor of talking about actual games. One such specification is the resolution at which games are rendered. A minor skirmish occurred between Microsoft and Sony at the launch of the last console generation over the fact that the Xbox 360 didn’t include 1080p support at release (this was later patched in). Phil Harrison, working for Sony at the time, even proclaimed that “HD doesn’t start until we’re on the market.”
Afterwards, the scuffle caused some backlash when gamers learned that while Sony’s console did support 1080p, no games would be rendered at that resolution upon release. Some news outlets speculated that this was merely a delay, but having lived through that history, we now know it was not. 1080p never became the standard for the outgoing console generation. In fact, some popular titles (like Diablo III) don’t even manage 720p.
A recent client asked me to put together a list of recommended laptops in several categories, one of which was desktop replacements. After reviewing it, they came back to me with some questions about the list. Compared to the last list they’d compiled, which was several years old, my selections were much less expensive. Was this truly representative of the market? What had changed? My client wanted to know.
Home theater enthusiasts have spent the last five years collectively scratching their heads over the failure of plasma television. The technology has many clear advantages over LCD derivatives; it offers deeper black levels (and thus better contrast), clearer motion, and doesn’t suffer from uniformity problems. Almost all enthusiasts agree that plasma is superior to LCD, and sometimes by no small margin – yet sales have repeatedly fallen short of expectations and the technology’s most ardent supporter, Panasonic, struggles to make a profit even though its most serious competitor, Pioneer, abandoned the technology several years ago.
Many observers of this trend have concluded, in essence, that people are stupid. While I don’t entirely disagree with that rather cynical statement, it’s a position that onlookers, armchair analysts and pundits all too often retreat to, and in the case of plasma, I think it’s dead wrong.
People are not buying LCD televisions because they don’t know better, have been tricked by marketing, or have been fooled by in-store presentations. People are buying LCDs because they’re a better fit for many consumers. There’s a lesson of perspective to be learned from plasma’s hard times, and it applies not just to televisions but to all of consumer electronics.
Part of my job is to think about the future of computing. I’m paid to write informative pieces about current events in technology, and that means I must think about what happens next. At the same time, however, I try not to indulge in excessive speculation so I don’t end up quoted by John Gruber. If something seems unlikely to occur within a few years, I keep my mouth shut. Mostly.
But speculation can be fun, and some recent events in technology have caused me to think about the future of computing even more than usual. There are several ideas in my head that I just can’t resist. So here they are. Take them with a grain of salt.
Consumer technology has witnessed a number of revolutionary changes in the last few decades, each of which opened up a new battleground. First came the desktop, where PCs sparred for attention, followed by the laptop, which sent PCs mobile, and the smartphone, which took the war to our pockets. Now tablets have opened a new front in this ongoing match. But what’s next?
I think the next frontier is among the most familiar: the living room. At first glance this seems unlikely, as the living room has been a part of American life for years. What more could be done? Quite a lot, actually, and there’s good reason to think consumers might embrace change in one of today’s most mundane rooms. Here are the three core reasons why the living room is ripe for revolution.
We now know the PlayStation 4 is $399. That’s without the Eye camera, however, which is sold as a $59 add-on and which I wrapped into this prediction as if it were included. That puts the system, with the features I expected, at about $459. I’m pretty happy with being just $40 off.