Welcome to @matt_on_tech, a simple blog belonging to me, Matt Smith. I’m a writer, technology guru, gamer, and editor of the Computing section at Digital Trends. In my down-time I’m an avid fan of video games (particularly the strategy and racing genres), but I sometimes get away from my PC long enough to camp, hike and enjoy the Pacific Northwest.
Reviews of the iPhone 6S are in, and unsurprisingly, they’re positive. I don’t envy the job of those tasked with reviewing them. How many different ways can you say “it’s the best, and you should buy it?”
This year, however, I noticed something that I haven’t before – the boredom of expectations utterly fulfilled.
Apple’s latest and greatest is hugely fast. GeekBench says it’s about as quick as the new MacBook, which is powered by Intel’s Core M processor. While I’m not sure GeekBench scores in the iOS environment are directly comparable to those from the same benchmark on other platforms, the point is clear. The iPhone 6S is very, very quick.
But not everyone agrees that this on-paper improvement translates to the everyday experience. Nilay Patel, writing for The Verge, calls the new A9 “the most powerful processor ever in a smartphone,” but follows up by saying “it kind of doesn’t matter.” Gareth Beavis, writing for TechRadar, notes the new chip is quick, “but in general day to day use, there doesn’t seem to be a lot of difference.” TechCrunch’s review, written by Matthew Panzarino, made the boldest statement by saying almost nothing about the processor at all – “A9” is found once, in reference to the better camera and ability to shoot 4K video. The iPhone 6S is good enough for almost anything, and that makes its specific performance merits more difficult to describe.
My iPad 3 died last night. Carelessly left to linger along the edge of a desk, it soon fell victim to the curious paws of my cat and landed directly on a wheel of my office chair. The sound an iPad makes when it breaks this way is like a car window shattering. For a moment I thought someone had tried to vault into (or out of) my apartment.
In a way this accident’s timing is convenient. Apple just debuted its new iPad Air 2, so if I buy a replacement I’ll be at the cutting edge of the release cycle. The new tablet is by all accounts the best ever made.
Yet as I stood in the Apple Store today with its svelte frame in my hands, contemplating a purchase, I was more annoyed than excited. I’d be lying if I said the shiny new model felt meaningfully quicker or drastically better in my hands. Pay $600? For this? I just couldn’t do it.
The moment my iPad broke didn’t change anything about my life. Without it I can still write, I can still play games, I can still visit web pages, I can still check my email. Absolutely everything that my tablet could do is already covered by another more necessary device (mostly my computer and smartphone, but game consoles as well).
For perhaps a year I felt that maybe Wired’s prophetic post-iPad article “How the Tablet Will Change the World” was not as overwrought as I’d thought. Maybe they would change the world. Maybe we would stop using PCs entirely. Maybe “a tablet in every backpack” would become the battlecry of a new era for education. Tablets were everywhere. Tablets could do everything. Tablets were cool.
But now the future isn’t as clear. Tablets aren’t going anywhere, but neither are PCs, or phones. There’s a dizzying range of devices available in nearly any size I’d like. Even Apple’s lineup is starting to become confusing. The expanding capabilities of each device are making the battlefronts between them longer, vaguer and harder to define.
If tablets aren’t going to take over, though, what is? I don’t think there’s a clear answer to that question. There isn’t anything in consumer technology that’s omg awesome! right now except 4K monitors – but I get the sense the general public isn’t on board with them yet. And they only improve, rather than change, how we use computers.
Maybe I will end up buying an iPad Air 2. Right now, though, I’m tempted to wait until something cool comes along.
Early last week we learned that Google is being sued by a group of celebrities whose nude photos were leaked to the Internet through some form of online attack that gained access to their iCloud photos.
The technology community’s reaction was swift and predictable. Its general tone was best captured at Ars Technica, which saw comments such as this:
When will these dinosaurs learn? You can’t scrub content, especially images, off the internet entirely.
People will say that I’m “blaming the victim” but really the only way to ensure naked photos from appearing on the internet is to never upload them to the internet in the first place.
Lawsuits to shut up people on the internet just don’t work.
If you took the name away from the company being sued and tried to guess it via reaction to the article, you’d think it was some helpless small business being victimized by a scummy lawyer and greedy celebrities.
But that, of course, is not what we’re talking about. We’re talking about Google, the world’s largest search company and one of the world’s largest advertisers. Surely they have the resources necessary to respond to takedown requests in a timely manner? Surely they have some responsibility to handle the content they display?
That is, in fact, the point being made by the celebrities’ lawyer, who notes (in a quote not often cited by technology websites covering this piece) that:
Ever since the hacked images first began to be posted on websites and blogs [...] we have been sending notices to various website operators and host providers [...]. The vast majority of those sites and ISPs/hosts, all of which are much smaller than Google, with fewer staff and resources, complied with their obligations under the DMCA and removed the Images within an hour or two of receiving our DMCA notice.
Google did not remove the images quickly. It still has not, as a Google Image Search for most of the celebrities victimized by the leak will turn up the leaked photos (along with a long list of photoshops). Some of the photos are hosted on domains that are literally the celebrity’s name followed by the word “nude,” and all of them are obviously indexed well enough to make them visible in Google’s search engine. The company knows how to find them. Yet they’re still visible, still obtainable from the top of Google’s search results eight days after this lawsuit was filed.
It’s incredible to me that so many people are happy to give Google a pass on this with the usual hand waving about openness and freedom. They are, in effect, elevating the “rights” of a set of search results above the rights of real human beings. This is the sci-fi dystopia where the value of data trumps even the most basic morality, and it’s happening right now.
At times like these I’m reminded of Jaron Lanier’s You Are Not A Gadget, a book that sharply points out that technology is not an unquestionable instrument delivered from the heavens but simply a tool built by humans with their own motivations. To quote him:
I fear that we are beginning to design ourselves to suit digital models of us, and I worry about a leaching of empathy and humanity in that process.
I worry about that, too. Not because technology is naturally de-humanizing, but because there’s always a wizard behind the curtain designing the models we conform to. And we’re becoming very, very good at refusing to acknowledge he exists.
“It’s an incredible opportunity for us to switch people from Android to iOS. So yes, this is epic. It is epic.”
This quote from Marco della Cava’s piece about Cook in USA Today shows Tim Cook at his most aggressive. While different from Jobs in many ways, Apple’s new CEO shares the unrelenting ambition of his predecessor. Cook wants Apple to explore new frontiers through innovation in both new categories and in existing, successful hardware.
But ambition is not enough on its own. What made Jobs a legend in consumer technology was the fact his ambition came second to his desire for perfection. The potential market for a new device wasn’t as important as the sanctity of its design. If he felt something was out of place, it wasn’t going to be sold, period.
By contrast, Cook comes across as a conqueror rather than a perfectionist. Why is the new iPhone larger? Because it’ll steal market share. He told Charlie Rose at PBS that “it’s [the new iPhone] been about making a better phone in every single way,” but as far as I can tell, he’s never specifically said why a larger screen is a better design.
Sometimes it’s hard being right.
Back in 2010, before I realized there wasn’t much money in writing about video games, I penned an opinion piece for The Escapist. The topic? Steam, and its current and/or impending monopoly status.
My point, in essence, was that what people think of Steam and Valve right now is irrelevant because, once the platform dominates, it’ll be free to disregard what’s in the best interests of consumers and developers. There will be nothing stopping Valve from making boneheaded moves out of ignorance, greed or stubborn idealism.
There’s an excellent blog post from Puppygames developer Cas making the rounds today. Though framed as a rant, it’s really a critical piece about the state of the game industry that takes out multiple sacred cows with a barrage of rhetorical cruise missiles.
The Oculus Rift and its upcoming rival, Sony’s Morpheus, have become the darlings of technology. Almost everyone has something great to say about them, and many seem convinced that once consumer versions of VR headsets hit the market there will be no going back. Monitors, televisions and everything else will all be instantly forgotten.
I’m not so sure.
There are many serious complaints that can be made about the current state of VR technology. The units are bulky, they don’t play well with glasses, the display panels aren’t good enough, the price is too high, and so on. These aren’t trivial points, but they’re not the reason I’m skeptical of virtual reality.
My negativity is pragmatic. Geeks often pretend that technology is like a force of god that cannot be resisted and overwhelms human desire, pointing us towards a new, more enlightened path. In truth, technology – and consumer technology in particular – thrives only when it conforms to its users. Most people don’t care if technology changes their lifestyle, but hate changing their lifestyle for technology.
The above is a screen capture of my Google Authorship stats. As you can see, they’ve nose-dived. Suddenly I’m getting almost no impressions and, as of the last few days, zero clicks.
Except (thank god) this isn’t actually true. One of the sites I work for provides author stats, and they’re reporting as they had before. Yet the stats I can view are screwed – and why? Beats me. And I’ll almost certainly never hear of a solution from Google.
Microsoft’s recent announcement that it will ship a $399 version of the Xbox One without Kinect was met with mostly positive remarks, to no one’s surprise. Widely considered an inaccurate, unreliable and potentially insecure extra, Kinect still lacks a killer game despite years of development. Microsoft and its partners never produced its Wii Sports, a failure that’s doubly damning considering that Kinect is much larger, more complex and more expensive than the Wii’s rudimentary motion sensors.
The decision to drop mandatory Kinect represents a potential windfall for the Xbox One’s sales, which continue to lag the PlayStation 4. It’s a move that makes instant sense, as it brings the One in line with its competitor’s pricing and ditches the baggage of a hated peripheral. In a broader sense, however, dropping Kinect is a massive blow to Microsoft, and if the company seemed to drag its feet on the decision, it’s only because ditching its motion sensor means abandoning a potentially lucrative arena.
I’ve called myself a gamer since I was twelve years old. Some years, particularly those in high-school, I had almost no other identity at all. There would be little exaggeration in saying that games have always been my drug; I often turn to virtual worlds for comfort whenever I’m depressed or bored.
But, like any addict, I’ve learned my drug of choice causes just as much trouble as it solves. My worst bouts find me tearing through games, consuming them in giant gulps before moving on to another the moment I tire of the last. The feverish pace of play is worsened by announcements, arguments and forum wars, which I read with dedication I rarely grant a textbook or novel.
You’d think that such periods of intense gaming would go hand-in-hand with some of my most memorable experiences, but in fact the opposite is true. Instead, I would often come away from games with more bile and loathing than ever. By digging into them so thoroughly, I destroy the very escape I sought, and reduce everything to a rush of number-crunching and petty arguments, punctuated only by the hope that the next game will be different. But, of course, it never is; after an initial honeymoon it’s torn apart like all the rest, reduced to components to be endlessly picked apart and mocked.