Republicans offered Democrats two more weeks before the doomsday shut-down. Democrats countered with four. Republicans held their ground. Democrats agreed to two.
This is what passes for compromise in our nation’s capital.
By Robert Reich.
For the first two years of the Obama administration, I kept waiting, giving them and Congressional Democrats the benefit of the doubt that they know what they’re doing. Surely, I thought, there must be good reasons I’m not seeing for what appears to be chronic inaction, constant giveaways to the Republicans, and complete lack of control over the political narrative and discourse of every issue. And we’re destined to start seeing positive change soon, right?
I’m not holding my breath.
Assuming the health of our country continues to decline, as it seems destined to, I don’t blame the Republicans. They’re doing a masterful job of using their financial influence and narrative techniques to accomplish their long-term goals and never meaningfully lose any power. I don’t see the Republicans facing many significant political losses in our lifetimes — they play the game ridiculously well, and they keep getting even better at it. I can’t blame them for trying to win and succeeding.
I blame the Democrats. It’s their fault for constantly losing, bickering so much amongst themselves that even when they’re technically in power, they’re really not.
Welcome to the United States of America. You can choose between the Republicans and the Republicans. One pretends to care about religion, and the other pretends to be a vague alternative to the first.
So why aren’t those who are criticizing Apple for taking a 30 percent cut of subscription revenue criticizing Amazon? My theory: everyone understands, intuitively, that the Kindle is a closed proprietary platform; but many people view iOS (incorrectly) as a platform like the Mac or Windows, where third parties are free to do what they want.
The root cause for so much of the subscription ruckus, I think, isn’t that 30% number — it’s that Apple pulled the rug out from under some major apps after the fact. And unlike nearly every App Store rule change in the past, this is a major change that developers couldn’t have been reasonably expected to anticipate, and it’s not based in any practical need for the health of the Store or the platform (malware, abuse, etc.).
Developers are being shown that their apps — and their months or years of hard work, and in many cases, their entire businesses — can be yanked at Apple’s whim at any time for reasons that they couldn’t have anticipated or avoided. This provokes fear and anger in many, and I think most of the “30% is too much” arguing is a misdirected side effect of this frustration: it’s aimed not at the number itself, but at the seemingly arbitrary and greedy nature of the rule.
Gruber is right to point out that Amazon’s terms to publishers, generally, are far worse. And their corresponding proprietary platform — the Kindle — doesn’t have much1 of an app platform.
But unlike Amazon, Apple is pushing iOS as a relatively “open” platform, where “open” means the ability to create and run software by people other than the platform vendor (Apple) itself, and the associated ability for developers to base significant businesses on this market. Apple’s own rhetoric has encouraged hundreds of thousands of developers, big and small, to put significant resources into making iOS apps. Amazon has never implied on any serious level that the Kindle should be considered such a software platform.
The extremely low barrier to entry for the iOS market, especially with its low cost and lack of much of a required business relationship with Apple, further differentiates this situation from that of “more proprietary” platforms such as video-game consoles. The more proprietary platforms also usually have some degree of pre-approval and lack any significant denials or removals of formerly approved software.2
Gruber’s main points on the subscription issue are valid: Apple doesn’t need to follow anyone’s requests or do what’s convenient for anyone but Apple. And there’s a clear business case for them to try to extract as much money as possible from their entire ecosystem.
I agree with Gruber that it’s incorrect to expect iOS to be as “open” to developers and their businesses as Mac or Windows. But I also think it’s reasonable to expect Apple to adhere to some common-sense standards on how they conduct the massive platform that many of us — upon Apple’s invitation to do exactly this — invested heavily in.
A very reasonable expectation, I think, is relative stability: that our apps, once approved, will continue to be permitted indefinitely unless there’s a justifiable reason to prohibit them later.
And if Apple breaks that expectation by changing an important rule in a way that we think isn’t justifiable, it’s perfectly reasonable for us to complain about it as loudly as possible in order to effect change.
There’s a Kindle SDK, but it has never been pushed much either to developers or Kindle customers. Like Linux on the PlayStation 2, I suspect it’s destined for permanent “beta” status and eventually a quiet death. ↩
Michael Jackson’s unfortunate Moonwalker game, in which he rescued little girls from closets in haunted houses by shoving his crotch in the air, was pulled from shelves due to poor timing with some of his child-molestation accusations. ↩
As recently as 1993, three kid-oriented genres—animated movies, movies based on comic books, and movies based on children’s books—represented a relatively small percentage of the overall film marketplace; that year they grossed about $400 million combined (thanks mostly to Mrs. Doubtfire) and owned just a single spot in the year’s top ten. In 2010, those same three genres took in more than $3 billion and by December represented eight of the year’s top nine grossers.
Let me posit something: That’s bad. We can all acknowledge that the world of American movies is an infinitely richer place because of Pixar and that the very best comic-book movies, from Iron Man to The Dark Knight, are pretty terrific, but the degree to which children’s genres have colonized the entire movie industry goes beyond overkill. More often than not, these collectively infantilizing movies are breeding an audience—not to mention a generation of future filmmakers and studio executives—who will grow up believing that movies aimed at adults should be considered a peculiar and antique art. Like books. Or plays.
The most compelling feature of the iPad 2 is its case.
One huge design win in the Kindle hardware, starting with the Kindle 2, was the case-clip mechanism, which uses mechanical clips to attach book-style cases at the inner “binding” edge. It allowed the Kindle’s cases to be far less bulky and fiddly than the Nook’s cases, as I illustrated last year. Like the iPad 2, the Kindle 2 and 3 were designed to accommodate cases from the start, since Amazon knew that most people would use cases with them.
With the iPad 2, Apple came to the same realization, and designed something that seems even better: the Smart Cover. If it’s even half as nice as it looks, this is going to be a big deal.
Screens on slate-style devices are extremely vulnerable to damage in transit. For pocket-sized devices like phones, you can just dedicate a pocket to them and keep them naked. But larger slate-type devices, including the Kindle and iPad, require cases in almost all practical uses.
So, in practice, the first iPad wasn’t a 1.5-pound device that cost $499. For most buyers, it was an iPad wrapped in Apple’s (kinda crappy) case: effectively a 2-pound device that cost $538 plus tax, didn’t look very good, and was so difficult to remove from its case that it usually stayed in it even when that was inconvenient.
The iPad 2 with a Smart Cover will be approximately1 a 1.5-pound device that will cost $538 in pastel rubber or $568 in nice leather, looks a lot nicer, and can be removed from its case instantly and easily whenever convenient.
That’s about a 25% weight savings2, a huge reduction in thickness, and a significantly better-looking and more versatile product in actual use.
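The 25% figure follows directly from those two estimates; a quick sanity check of the arithmetic (both weights are this post’s guesses, not official specs, since Apple hadn’t published the Smart Cover’s weight):

```python
# Weights are the article's own estimates, not official Apple specs:
# the Smart Cover's exact weight was unknown at the time of writing.
ipad1_with_apple_case = 2.0    # pounds: iPad 1 plus Apple's original case
ipad2_with_smart_cover = 1.5   # pounds: iPad 2 plus Smart Cover (estimated)

savings = (ipad1_with_apple_case - ipad2_with_smart_cover) / ipad1_with_apple_case
print(f"Weight savings: {savings:.0%}")  # → Weight savings: 25%
```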
Of all the iPad 2’s improvements, none will affect my daily use — or almost anyone else’s — as much as this.
I can’t find any mention of the Smart Cover’s weight, so I’m guessing. If you know it, please let me know. ↩
The weight reduction is even greater if you used a non-Apple case with the first iPad, since third-party cases tended to be much heavier. My preferred case, a nice Piel Frama folio, was 61% heavier than Apple’s. ↩
Apple’s not offering preorders for the iPad 2, giving everyone lots of time to agonize over which one to buy.1 There are 18 configurations: how do you choose?
We’ll start with the obvious question:
Should you get an iPad 2?
If you want an iPad but don’t already have one: Probably. You can argue that there might be another one coming out within the next 8-12 months… but that’s always going to be the case. And unlike a lot of other electronics, the retail price of any given Apple product will probably never change over its lifespan. Therefore, the best time to buy an Apple product is right when it’s released or updated.
If you already have an iPad 1, there’s not a lot in the iPad 2 that will make it worth upgrading, unless you can spare the money fairly comfortably, and you can either sell the old one for a good price or give it to a relative or spouse (and they won’t resent getting a hand-me-down instead of the new one).
As for the iPad 2’s improvements, I don’t expect the cameras to be used very often, but I bet the Smart Cover will be a very big deal in practice, making the entire package significantly thinner, lighter, and more fitting of its price and prestige.
The other big difference in day-to-day use will likely be the extra RAM, especially if you browse the web a lot in Safari: the RAM upgrade will allow the iPad 2 to hold more pages in memory before it kicks them out and requires you to reload them. And having more memory for multitasking means that fewer apps will need to manually rebuild their states on launch, so switching between apps will be faster in practice.
That’s really about it, though, for what you’d notice most of the time. If you’re happy with your iPad, there’s not a huge reason to upgrade. Casual gadget customers who already own iPads are probably better off waiting for the next version.
But for the sake of the rest of this post, suppose you’re buying an iPad 2.
Black or white?
This is mostly a personal preference. Keep in mind that it’s only the border around the screen — the back is brushed aluminum, and the case or cover is whatever color you want (and can always be changed later).
Black is the safest option if you’re unsure. I suggest seeing them in person before choosing white. While I haven’t seen them yet, I suspect that the white bezel is more likely to be distracting. You’re not supposed to notice the hardware while using it, and I bet the black will be easier to ignore than the white.
3G or not?
3G costs $129 extra on each model. There are two complicating factors:
Wi-Fi-only iPads don’t have GPS, and the GPS on 3G iPads works even without an active data plan.2 So if you ever want any kind of location-based feature, even offline mapping, it’s much more useful with the 3G iPad.
On the other hand, you might find, as I have, that mapping, both offline and online, is easier to just do on the iPhone.
Both AT&T and Verizon iPhones support the new Wi-Fi hotspot feature, which apparently works very well. For about the same price as the cheapest iPad data plans, you can just make your iPhone’s connection shareable to any Wi-Fi device, including a Wi-Fi-only iPad, negating the need for the 3G iPad. If you also own a laptop, this is a compelling feature — it replaces a standalone MiFi at a much lower monthly cost.
There’s a drawback to this method: your iPhone needs to be charged, on, and nearby, and the Wi-Fi hotspot feature needs to be on, whenever you want to use data on your iPad outside. If you get a 3G iPad with its own data plan, it’s always on and ready to go, just like an iPhone.
Aside from the higher purchase price, there’s no harm in getting 3G “just in case” you ever need it. But if you’re a gadget nut and plan to buy every iPad, you really only need to consider whether you’ll need it in the next year, not whether you’ll “ever” need it.
A 3G iPad may also have a more useful life after you’re done with it. If you want to give it to a friend or relative who’s starting from technology-zero and doesn’t have an internet connection, a 3G iPad can remove the need for a home broadband connection and crappy Wi-Fi routers that die every 8 months.
AT&T or Verizon?
If you go with 3G in the U.S., you have to choose the AT&T or Verizon model.
Their data plans are similar, so it’s mostly a network preference. The basic rule here is that AT&T is faster when it has great reception and coverage, but its coverage and quality are inconsistent and its speeds are often terrible in big cities, while Verizon is generally more reliable in most regions despite lower top speeds.
So if AT&T is great in the areas in which you’ll use the iPad, and you like their network better, go for theirs. Otherwise, in the U.S., the Verizon model is probably a better choice.
16, 32, or 64 GB?
Most iPad owners will be fine with the base 16 GB capacity. The most common uses for the iPad — web browsing, email, reading, most apps, and games — rarely come close to needing that much space.
Capacity is expensive — I really don’t think, for instance, that 64 GB is going to be worth the $200 premium to most users. And many people would find that the $129 for the 3G upgrade makes their iPad more “future-proof” than a $100 capacity bump.
The biggest reasons to go higher than 16 GB:
You plan to sync a lot of music and listen directly from the iPad. (Personally, I use my iPhone for this and rarely put any music on my iPad.)
You plan to store and watch a lot of videos on it, or you plan to capture or transfer a lot of videos to it for editing in iMovie.
You plan to sync a large photo library to it.
If any of those apply to you, you should consider the 32 GB version.
If you think you’ll need the 64 GB version, you should consider whether even that will be enough. If it won’t, and you’ll need to “edit” your collections to fit on the iPad anyway, consider whether editing them down to fit into 32 or 16 GB might be worthwhile.
Case, Cover, or other accessories?
Definitely try any cases or the Smart Covers in person, or wait for a lot of good reviews, before deciding. I expect to really like the leather Smart Covers.
If this is your first iPad, don’t jump right in and order the dock or keyboard. They’re pricey, and you probably won’t need them. If you find that you need them after using the iPad for a little while, you can always get them later.
Where to find the best stock?
Whenever there’s a new iPhone or iPad launch, everyone thinks they know of a secret low-traffic AT&T store or Best Buy that will all but guarantee a short line and day-one availability. They’re usually wrong.
Apple’s own stores always have the most stock on day one. And if you wait on3 line, you’ll almost always get one. Sales this year start at 5 PM on Friday, and you usually don’t need to get on line more than an hour or two ahead of time to secure a relatively decent spot.
You may not even need to wait on line. Often, you can just show up a few hours after sales begin, or the next day, and get one without waiting. But if you do this, there’s a bigger risk that it will be out of stock. And with 18 different iPads, constrained stock may be more annoying: they might still have iPads, but not the exact configuration you want.
Good luck. I’ll see you on line at the Apple Store, where I think I’ll be getting a 32 GB Verizon 3G model, in black, with a red, navy, or black leather Smart Cover.
They’re probably not taking preorders to ensure big, press-worthy lines on day one, as John Gruber speculated on The Talk Show #32. ↩
I just walked around my lawn at 11 PM holding two iPads, a Wi-Fi and a 3G, each running MotionX GPS, to test this.
Me: “I’ll be right back. I have to go test something with my iPads on the front lawn.”
These are the ridiculous things I do for science. ↩
If you aren’t in New York, you probably say “in line”. Your pizza might suck, too. Sorry. But hey, your real estate is affordable. Win some, lose some. ↩
But in my defense, heavy write loads seemed like the last thing Pinboard would ever face. It was my experience that people approached an online purchase of six dollars with the same deliberation and thoughtfulness they might bring to bear when buying a new car. Prospective users would hand-wring for weeks on Twitter and send us closely-worded, punctilious lists of questions before creating an account.
Great overview of the problems facing any notification system, and I love this suggestion at the end: regardless of how you want notifications to be delivered, reducing the quantity is probably a good idea.
I’ve never received a push notification.
Really. It’s great. Every time that box comes up from a new app asking permission to send me push notifications, I’ve said no, because none of them have ever been important enough to interrupt me at any time of day.
My phone will only vibrate in my pocket if I receive a phone call, a text message, or a calendar alarm. That’s it. And since I receive a low volume of all three, I treat them all seriously. If I get a text message, there’s a very good chance that an Instapaper server needs attention, so I pay attention.
My phone’s on, with volume up, right next to my head when I sleep at night, because I know that any reason for it to make noise in the middle of the night will be something very important that I should probably wake up for. Similarly, I always discreetly take my phone out of my pocket to see why it vibrated, even in situations where that’s non-ideal (like during a meal with friends).
If your notifications are so plentiful that you can’t treat them that way, are they really that important? Do you really need to be notified?
In the last decade, Apple introduced their first line of notebooks that didn’t have dial-up modems built in, because dial-up modems were on their way out and most people didn’t need them anymore. But for the few that still did, they offered a little USB modem to ease the transition:
Three years ago, Apple introduced the MacBook Air, their first notebook that abandoned the optical drive, because optical drives were (and still are) on their way out. But they also sold this USB optical drive to ease the transition:
One year ago, Apple introduced the iPad, their first “computer”, sort of, that didn’t have a keyboard. But since keyboards are required for a lot of productivity tasks, they also made it compatible with Bluetooth keyboards and released the Keyboard Dock:
Apple went out of their way to convince the world that the iPad was a legitimate productivity device:
Most of us tried to rationalize the iPad’s purchase by telling ourselves that it could often replace a laptop. The productivity apps and the Keyboard Dock support that view: that the iPad is a new kind of computer that might replace your traditional computer, and therefore, it’s rational to spend over $500 for one.
But I don’t think that’s what happened in practice.
The iPad isn’t really a great “office productivity” device, in the traditional PC sense. It can be used that way in some cases, but it’s rarely the best tool for the job.
I never liked the Keyboard Dock (or using a Bluetooth keyboard with the iPad). It looked like a temporary hack, like the USB dial-up modem: a bridge from the old to the new until people didn’t need it anymore. And it was clunky: not only was its protruding shape awkward and difficult to pack in thin bags, but using a physical keyboard with iOS was (and remains) half-baked, requiring users to constantly reach up and touch the screen. If you need a physical keyboard very often with the iPad, you’re probably better served by a laptop, especially now that the 11” MacBook Air exists.
It seems that Apple has discontinued the Keyboard Dock with the launch of the iPad 2, which confirms that they saw it as a temporary hack, too. And rather than issue a huge update to the iWork productivity apps, they branched out into different uses with iMovie and GarageBand, and beefed up the graphics processor more than any other component — an upgrade that most strongly benefits games.
I don’t think this was their plan from the start — I think Apple didn’t know any better than we did, a year ago, whether the iPad was going to end up as a productivity device in practice. They probably thought, like we did, that it would replace laptops a lot more often.
But, as often happens in technology, the iPad hasn’t “killed” the laptop at all — it has simply added a new role for itself. And that role doesn’t include office productivity for most of us.
Apple is now adapting to the market’s actual use by retreating somewhat from office productivity and pushing strongly into new territory — casual media creation — to see if that gets a stronger uptake in practice. I think it will be a lot more interesting than office productivity, but there’s still a lot of work that needs to be done in iOS to make it practical (especially regarding file transfers with computers).
Like Photo Booth on the Mac (and now also the iPad), casual iPad users will have fun playing around with GarageBand for a while. Maybe even iMovie once or twice.
I still don’t think Apple has found the sweet spot for the iPad’s usage: the ideal role it fills in personal computing. And I don’t think we, as developers or iPad owners, have found it, either. But I know that sweet spot exists, and for a computer category that has only existed for one year, we’re rushing towards it remarkably quickly.
This is why the iPad is truly exciting: we can see that it has great potential, and while we don’t quite know its nature yet, we’re pretty sure that it’s huge.
The drive-by technorati are well-informed, curious and always probing. They’re also hiding… hiding from the real work of creating work that matters, connections with impact and art that lasts. I love to hear about the next big thing, but I’m far more interested in what you’re doing with the old big thing.
The next Bon Jovi concert I’ll consider attending now will be one with a completely different set list of tracks that I like as much as the ones you released 20 years ago. All you have to do is start recording them, and I promise that my wife or I will purchase them. So will the rest of your fans. Until that happens, and other musicians start churning out great music by the album rather than the song, the industry’s going to be in trouble. And if it keeps blaming the system rather than itself, it will deserve its fate.
The Daily has been free since its introduction, on a “trial” basis.
Apparently the free trial will finally end next week. To keep The Daily after that, customers will need to pay $1/week or $40/year.
That’s a fair price for regularly updated content, but when people need to start paying for it, they’re going to be a lot more critical. The first question most people will ask themselves isn’t “Can I afford $1 per week?” or “Is that really a lot of money compared to everything else I spend money on every week or year?”
It’s not about the money. That’s not how people think about app purchases or upgrading from a free app to a paid app. Instead, the reasoning is more like “Do I really want this enough to pay for it?”, or worse, “Can I get rid of this without missing it?”
And I’m not sure The Daily will hold up to such a critical evaluation as well as it needs to for it to be anywhere near profitability.
Some homeopaths also use techniques that are regarded by other practitioners as controversial. These include paper ‘remedies’, where the substance and dilution are written on a piece of paper and either pinned to the patient’s clothing, put in their pocket, or placed under a glass of water that is then given to the patient, as well as the use of radionics to prepare ‘remedies’. Such practices have been strongly criticised by classical homeopaths as unfounded, speculative, and verging upon magic and superstition.
I love the idea of “homeopaths” criticizing other people’s superstitious, unfounded, speculative, pseudoscientific “medical” practices.
(I discovered this gem while looking up the details to explain to Tiff. If you’re not familiar with the details of how homeopathy is supposed to “work”, it’s a pretty good laugh. To borrow from South Park, “THIS IS WHAT HOMEOPATHIC ‘DOCTORS’ ACTUALLY BELIEVE.”)
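For anyone who hasn’t done the math behind that “pretty good laugh”: a common homeopathic potency, “30C”, means the active ingredient is diluted 1:100 thirty successive times. A back-of-the-envelope calculation (30C is a standard textbook example, not a figure from the quoted passage) shows how little could possibly survive:

```python
# A "30C" homeopathic remedy: thirty successive 1:100 dilutions,
# i.e. an overall dilution factor of 100**30 = 10**60.
AVOGADRO = 6.022e23  # molecules in one mole of the active ingredient

dilution_factor = 100 ** 30
molecules_left = AVOGADRO / dilution_factor

print(f"Dilution factor: 1e{len(str(dilution_factor)) - 1}")
print(f"Expected molecules remaining from a full mole: {molecules_left:.1e}")
# ~6e-37 molecules: you would need a sphere of water wider than the
# solar system to expect even one molecule of the original substance.
```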
Samsung presented some of the first significant competition to the iPod touch…
I’d call it “potential competition” — it’s not competition if it doesn’t exist yet. And when it does, it’s not really a competitor if it doesn’t sell very well. It’d be difficult to say, for instance, that the Zune was ever really providing “significant competition” to the iPod.
Both run Android 2.2 and will be upgradable to 2.3 in the future.
2.3 has been out for a few months already, and we know how good the Android device manufacturers are at getting updates issued after a device’s sale.
The two ship with 8GB of storage built-in and have microSDHC slots to take up to 32GB more.
They need to cut a lot of costs to get anywhere near Apple’s pricing for the iPod touch, so they’re making users expand the storage later at their own expense. (Anyone think it’ll compete with the 8 GB iPod touch’s $229 price?)
Samsung is keen to tout that, as Android devices, they don’t need a sync app to load content…
…because a good one doesn’t exist…
…and will work with drag-and-drop…
…users will need to “sync” everything manually because Samsung hasn’t written decent sync software…
…for those comfortable with manually loading their media.
Samsung unusually didn’t give a ship date or a price for either Galaxy Player beyond a spring release,…
…what a surprise…
though they’re expected to cost much less than a Galaxy S-based phone would off-contract.
…we assume (and hope) that an iPod touch competitor would be priced significantly below $700, although how far below that (and how close to the iPod touch’s $229 entry point) isn’t something Samsung is ready to announce yet.
Jeff Verellen, a barista and roaster at Caffenation in Antwerp, Belgium, is working on prototype weights in Carrara marble, iron and stone to place on top of the [AeroPress] so that you don’t have to do the work of applying your own pressure.
‘The different weights will be used to time the pressing,’ he says. ‘Depending on the grind and coffee, there are different times and pressures needed to extract the most out of each different coffee. This is also to achieve consistency in a commercial setting … and also to save time. The marble also looks nice.’
Twitter’s official iPhone app, formerly Loren Brichter’s Tweetie and an otherwise awesome client, drew a lot of negative reactions with the recent addition of the Quick Bar, a mandatory trending-topics banner on top of the tweet list. A lot of people really hate it, calling it the “dickbar” and often abandoning the Twitter app entirely because of it.
Its initial implementation as a floating overlay over anything you were doing in the app was far worse. Now, it’s just at the top of the main timeline, and it scrolls with the list. But it’s still offensive to most people who hated its debut, because making it scroll with the list didn’t solve the problem of it being there and being mandatory.
The reason Twitter added the Quick Bar was, presumably, to be able to feature ads, which show the “Promoted” badge:
If it only ever showed ads like this, I don’t think the response would be so negative. The bigger problem is that it’s showing a random “trending” topic or hash-tag most of the time. Here are a few of the topics I’ve seen in the last 24 hours:
It’s a news ticker limited to one-word items, lacking any context, broadcasting mostly topics that I don’t understand, recognize, or care about. It’s nonsensical. At worst, it can offend. At best, it will confuse.
If Twitter wants to run an ad at the top of the scrollview, Twitterrific-style, I’m all for it. It’s your platform. Monetize away. But the problem with the trend bar implementation is that I’m being subjected to what I find to be the poor taste of millions of mouth-breathing buffoons in my own timeline.
What’s worse is that it’s shown in a context — my Twitter timeline — that otherwise contains only content that I’ve (indirectly) chosen to put there. (I’ve chosen who to follow based on what I want to see in my timeline.) I’m not interested in sports or celebrities or middle-school survey trends, so I don’t follow people who overwhelm my timeline with those unwanted topics.
But now, my timeline looks like this:
Content that I’ve chosen to follow, and… Michigan. I don’t even know what that’s supposed to mean. Presumably, there’s some bit of news happening that’s relevant to the state of Michigan, and Twitter wants users to tap on this disembodied word for a reason that’s not made clear to us.
So I tapped on it.
When presented with this screen — which was important enough for Twitter to be worth alienating vast numbers of influential users with the mandatory Quick Bar — what am I supposed to do?
I see, from top to bottom: intentional spam, unintentional spam, and a random person’s frivolous, meaningless tweet about sports that I don’t care about. (I scrolled down and it only got worse.) I guess Michigan is a trending topic because something important happened with a Michigan sports team.
What am I supposed to do with this information?
Am I supposed to tweet about it? If so, why doesn’t the interface encourage that? Even if I hit the (effectively invisible) New Tweet button from this screen, my tweet isn’t prepopulated with “#michigan”, so whatever I say in response won’t be included here.
Am I supposed to save this search, which the interface does encourage, so I can see this topic again in a few days or weeks or months, when it’s presumably no longer coherent or useful? (Ignoring, for the moment, that it’s neither coherent nor useful now.)
Am I supposed to read these tweets? If so, why haven’t stronger anti-spam methods or human filtering mechanisms been employed to keep the stream somewhat readable? As-is, it’s a huge and easily exploited spam target, and it shows.
We don’t know Twitter’s true reason for adding the Quick Bar. Presumably, it’s part of a longer-term strategy. But today, from here, it looks like an extremely poorly thought-out feature, released initially with an extremely poor implementation, with seemingly no benefits to users.
This is so jarring to us because it’s so unlike the Twitter that we’ve known to date. Twitter’s product direction is usually incredibly good and well-thought-out, and their implementation is usually careful and thoughtful.
And in the context of this app, most of which was carefully and thoughtfully constructed by Loren Brichter before Twitter bought it from him, we’re accustomed to Brichter’s even higher standards, which won Tweetie an Apple Design Award in 2009. (I suspect he had little to no say in the Quick Bar’s existence, design, or placement, and it’s probably killing him inside.)
The Quick Bar isn’t offensive because we don’t want Twitter making money with ads, or because we object to changes in the interface.
It’s offensive because it’s deeply bad, showing complete disregard for quality, product design, and user respect, and we’ve come to expect a lot more from Twitter.
Blu-ray is just a bag of hurt. It’s great to watch the movies, but the licensing of the tech is so complex, we’re waiting till things settle down and Blu-ray takes off in the marketplace.
The implication is that Apple doesn’t believe that Blu-ray will ever “take off” enough for them to need to care about it, so they aren’t interested in supporting it.2
I think the last two years have confirmed that they were right.
Blu-ray is a pain in the ass, even for consumers. The major movie publishers finally got what they wanted in a home-movie medium: enough dynamic “multimedia” capabilities that they can boast “interactive” extras to sell more expensive “special editions”, enough copy protection to kill almost all casual piracy (including such innocent cases as ripping movies you own so you can play them on vacation on your iPad), and even more customer-hostile restrictions during playback to make sure that you watch every last preview, commercial, and piracy warning before viewing the movie that you “own”.
They use the word “own” when it’s convenient, like in commercials for movies that were just released on Blu-ray. “Own it today!” A marketing study probably concluded that this phrase gives people the idea that they’re paying for something concrete so they’ll pay more and won’t think to just go out and rent it. But when it comes to restrictions and copy control, they’re quick to point out that we don’t “own” anything on those discs.
While content is still restricted by DRM in the iTunes and Netflix worlds of online video distribution, the experience is far more permissive, with far fewer hassles.
Apple’s gamble paid off: iTunes presumably sells a good volume of HD movie rentals, and there’s very little demand for Blu-ray playback on Macs. I don’t think we’ll ever see it. And I don’t think most people will notice.
I have no idea which physical medium the major movie publishers will attempt to sell us after Blu-ray, but I’m not entirely confident that one will exist at all. And given the implementation of Blu-ray, that’s a very good thing.
I was reminded of this by a recent episode of Hypercritical (I think episode 8 or 9) when John Siracusa discussed it. Good show. ↩
Blu-ray drives can be added to Macs by customers, but Apple doesn’t offer any. If you add one, it works as a data-only drive, and Finder has no trouble reading and burning data BDs. But no Mac software exists to play commercial Blu-ray movies directly from their discs.
For the truly dedicated, Mac BlurayRipper Pro can dump the disc onto the hard drive and you can convert the directory with HandBrake if you’re willing to wait a long time. But it’s barely worthwhile.
Even burning data BD-Rs is barely worthwhile. External hard drives with a toaster are usually a better bet for bulk removable storage. ↩
“It’s no panacea, but this legislation will point us in the right direction. Looking at hard data, we know our children are struggling with a heck of a lot of the math, including the geometry incorporating pi,” Roby said. “I guarantee you American scores will go up once pi is 3. It will be so much easier.”
I think this is Amazon’s first step towards launching their own Amazon-branded (or perhaps Kindle-branded) Android devices, where the Amazon Appstore will be preinstalled, and the devices will ship from Amazon with your Amazon credentials already set up on the device (as with the Kindle hardware today).
Before LaunchBar, I used Quicksilver for launching apps and nothing for clipboard history. But now that I’ve integrated clipboard history into my workflow, I couldn’t go without it again.
My most common use is shuffling around two or three items in a stack. Select the first, Cmd-C, select the second, Cmd-C, go to the second item’s destination and hit Cmd-V, then go to the first item’s destination and hit Cmd-Backslash, Down. It’s invaluable when programming or juggling links for a blog post (as you can see by my stack’s contents in the screenshot).
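The shuffling described above is really just stack manipulation. LaunchBar’s internals aren’t public, so this is only a rough sketch of how a clipboard-history stack generally behaves (the class name, capacity, and promote-on-paste behavior are my assumptions, not LaunchBar’s documented design):

```python
class ClipboardHistory:
    """A minimal clipboard-history stack: the most recent copy sits on
    top, and pasting an older entry promotes it back to the front."""

    def __init__(self, capacity=40):
        self.items = []
        self.capacity = capacity

    def copy(self, item):
        # Cmd-C: push the new item on top, removing any older duplicate,
        # and trim the history to its capacity.
        if item in self.items:
            self.items.remove(item)
        self.items.insert(0, item)
        del self.items[self.capacity:]

    def paste(self, index=0):
        # Cmd-V pastes the top item; choosing a deeper entry (as with
        # Cmd-Backslash, Down) moves it back to the front.
        item = self.items.pop(index)
        self.items.insert(0, item)
        return item
```

So copying three items and pasting them back in a different order is just a series of pops and re-inserts on that list.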
Now, using a computer with “only” one pasteboard, like iOS devices or other people’s computers, feels primitive and constraining.
And honestly, I’m not a big LaunchBar fan otherwise. I think Quicksilver was much better at launching apps. More than a year after switching to LaunchBar because I thought Quicksilver was dead (which might not be the case anymore), I still make far more mistakes launching apps with LaunchBar and I’m still not fully accustomed to the nuances of typing into it.
But I’ve tried other clipboard-history apps (all of which cost far less than LaunchBar’s $35 price), and I like LaunchBar’s implementation best by far.
Maybe I’ll just turn off the app-launching feature and go back to Quicksilver for that, leaving LaunchBar running only as a very expensive (but good) clipboard-history tool.
Anyone that genuinely wants to improve the product or figure out why the app doesn’t work for them will use those support channels. If they take the effort to write in, I do my best to ensure they get a satisfactory response. Anyone who just stops in the [App Store] reviews to complain doesn’t really want to resolve the problem. They just want to let steam off.
— More awesomeness from Justin Williams. This whole post is ridiculously good, especially for iOS developers to read.
To date, I’ve relied only on my own speculation to estimate how many people used Instapaper on each iOS device, or with each iOS version. I didn’t think the exact numbers were worth the effort of tracking them — I just tested every version on my original iPhone before releasing it to make sure it was fast enough, and I’ve maintained 3.1.3 as my earliest supported iOS version (since original iPhones and iPod touches can’t run 4.x, and iPhone 3Gs really shouldn’t).
But there are so many new features and APIs that require 3.2 or 4.x that, at some point, I’m going to need to drop compatibility with iOS 3.1 and original iPhones.
In Instapaper 3.0 for iOS, I added device and OS-version reporting when the app communicates with the Instapaper service so I could get a better idea of who my customers were and when I can drop support for older devices and OSes.
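The real client is an Objective-C app and its protocol is private, but the idea is simple enough to sketch: piggyback a couple of extra fields onto a request the app already makes. Everything here (the hostname, endpoint, and parameter names) is hypothetical:

```python
# Hypothetical sketch of attaching device and OS fields to an existing
# API request; the endpoint and field names are made up for illustration.
from urllib.parse import urlencode

def build_update_request(username, device_model, os_version):
    params = {
        "username": username,
        "device": device_model,    # e.g. "iPad1,1"
        "os_version": os_version,  # e.g. "4.3"
    }
    return "https://example.com/api/update?" + urlencode(params)
```

Server-side, tallying those two fields per unique account is all it takes to produce the breakdowns below.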
This information might be useful or interesting to other developers, so I’ve decided to share it. Keep in mind that this sample does not represent all iOS owners. Specifically, it only represents people who:
Paid $5 for Instapaper’s full app, and…
Didn’t delete it before 2 weeks ago, and…
Keep their apps up to date enough to have a version that’s only two weeks old, and…
Launched it sometime during these two weeks, and…
Logged into the app with an account (or created an account from it), and…
Connected to the internet long enough for an update request to be received by the service.
So it’s very useful to me, but the usefulness to you will vary. With that out of the way:
I knew Instapaper sold well to iPad owners, but I didn’t know it was half of my business. (I also overestimated the iPod slice.) More specifically:
The biggest surprise here is how modern the devices are. It also looks like I don’t need to care as much about how the app performs on the first-generation CPUs:
I think this is a combination of factors:
Each successive model has sold more than its predecessor as prices have dropped and popularity has increased.
Devices that haven’t been sold for 2-3 years have had 2-3 more years to break and be replaced.
Instapaper appeals to geeks and heavy internet users who are more likely than average iOS owners to upgrade often.
Now, for the more important distinction: which versions of iOS are my customers running? Specifically, how many of my customers are running at least a given version that I’d like to target as my minimum supported version?
Keep in mind, of course, that the current minimum supported version is 3.1.3, so I can’t count anyone who’s still on an earlier version than that. (Hence, 100%.)
This was great news: it looks like I can comfortably raise the minimum to 4.0 whenever I want to start using a 4.0-only feature in a way that can’t be easily disabled when running on earlier versions.
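The “at least version X” question is just a cumulative sum over a version histogram, walked from newest to oldest. A sketch with made-up numbers (not Instapaper’s actual data):

```python
# Hypothetical share of update requests per iOS version, used only to
# show how "running at least version X" percentages are derived.
version_counts = {"3.1": 2, "3.2": 3, "4.0": 5, "4.1": 10, "4.2": 60, "4.3": 20}

def at_least(counts):
    total = sum(counts.values())
    # Sort versions numerically ("3.2" before "4.0") rather than as strings.
    versions = sorted(counts, key=lambda v: tuple(map(int, v.split("."))))
    running, result = 0, {}
    # Walk from newest to oldest, accumulating the share of users on
    # that version or later.
    for v in reversed(versions):
        running += counts[v]
        result[v] = 100.0 * running / total
    return result

shares = at_least(version_counts)
# With these numbers, shares["4.0"] == 95.0: 95% of the sample could
# follow a raised 4.0 minimum.
```

The oldest supported version always comes out to 100%, since anyone below it can’t report in at all.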
But 4.x wasn’t available on iPads for a few months after it was released for iPhones. What about OSes on each device class, separately?
This was a nice surprise — I mistakenly thought iPad owners would be slower to update their OSes since there’s less motivation to sync the iPad often. But the results say otherwise.
And, for fun, since they’re reported as different devices, how well is the iPad 2 selling in its three connectivity options?
Anyway, I hope these stats can be helpful to you. They’ve certainly been helpful to me, and many of my assumptions were proven wrong — all in good ways.
Apple now seemingly considers most Mac hardware updates boring enough to skip any potential press event and just feature the new models on the Apple website for a while. Event-worthy Mac updates are now limited to entirely new products or major redesigns.
This makes sense: most people don’t care when a new laptop looks the same as its predecessor and has all of the same general benefits and drawbacks, but has a faster CPU. These core-component updates barely warrant a press release, because for normal usage, the difference is barely noticeable.
When WWDC was announced the other day, some publications reported that this was going to be a software-only event — for the first time since the iPhone’s debut in 2007, a new iPhone would not be announced at WWDC. So far, this rumor is uncontested, and the supporting interpretation of Apple’s own language is sensible.
Suppose that there’s no iPhone 5 this summer. This raises an obvious question: when will the next iPhone be released?
This change in the release schedule certainly explains a lot about some recent inconveniences that we’ve mostly been ignoring because they didn’t fit our expectations of Apple:
Apple released a significant change to the iPhone 4, the Verizon CDMA model, mid-cycle. Why release such a major product that’s only intended to sell for five months?
Apple’s most recent statement on the white iPhone 4 was that they still intend to ship them “this spring”. We assumed that they were just stalling, hoping people would forget about the promised white iPhone 4 until the next iPhone is released. But assume it’s not just stalling — again, why manufacture and ship another product that’s intended to sell for only two or three months?
The annual fall iPod event is less important every year as traditional-iPod demand shifts to the Touch and iPhone, and the updates to the traditional iPod get less drastic and far less relevant. It’s nice to see the Touch’s updates at this event, but since it’s always the same core hardware as the iPhone announced a few months prior, it’s not particularly surprising or newsworthy.
It wouldn’t surprise me if updates to the iPod product line became as un-event-worthy as Mac CPU updates.
But couldn’t this happen to the iPhone and iPad as well?
Not every time, of course. Just like the Macs, what if the iPhone didn’t get massive updates on a regular annual schedule, but only got a significant redesign every few years, with minor mid-cycle updates as needed? Some wouldn’t even be announced at events at all, and others would be quick footnotes during otherwise software-focused events.
The “iPhone 5” might be more like the iPhone 3GS — itself a minor update to an existing design — in that it would essentially be the same design as the iPhone 4, but with a faster CPU based on the iPad 2’s A5, and maybe doubled storage capacity. Or an LTE chipset (which currently has coverage almost nowhere). Or a unified CDMA+GSM version with that great Qualcomm chip for easier supply-chain and retail management.
Would any of those justify an event?
iOS hardware advancement is reaching diminishing returns. As with the MacBook Pro, Apple has refined the iPhone and iPad designs almost to the point that it’s difficult to think of how they’d meaningfully improve them without major underlying changes (such as a breakthrough in storage or battery technology).
But the software is just getting started.
And when iOS itself gets better, everyone benefits. A major OS update can make a much bigger difference in everyday usage than an incremental hardware update. It might even create entirely new markets or give our devices significant new functionality.
So, with this year’s WWDC likely to focus on two major OS releases, Lion and iOS 5, I don’t think I’ll miss the formerly annual hardware update.
Here is a tip for all the non-developers out there. When you email your favorite developer with a feature request or bug report never, ever, ever use the word useless to describe their product. Useless is kryptonite to developers and puts us on the defensive instantly.
I’m also extremely sensitive to this word in emails or reviews. It’s the entire reason I was inspired to do my keyword analysis. And while I never got around to writing the follow-up post mentioned in the last paragraph, Justin’s makes similar points (and better) here.
“Useless” is in such a high proportion of negative feedback that it’s just a raw nerve for me. If you call my app “useless”, I stop reading right there and either hit Delete or keep scrolling.
Since Google’s business is advertising, shifting industries away from paying business models is in their interest. If people are willing to pay for email, mapping and documents, Google’s business model is limited. Thus, using the outsized revenues they make from advertising on search, Google gives away Gmail, Maps, Docs, navigation, translation, et cetera, so no one can compete in those areas—to make free the norm for these services. If Google is giving away a quite good service, it’s hard to compete with them in that area, and so the economics of that business shift away from paid services to advertising-supported. And if a business becomes dependent on advertising for revenue, that’s good for Google, because they’re better at it than everyone else.
Rather than continue to make changes to the QuickBar as it exists, we removed the bar from the update appearing in the App Store today. We believe there are still significant benefits to increasing awareness of what’s happening outside the home timeline. Evidence of the incredibly high usage metrics for the QuickBar support this. For now, we’re going back to the drawing board to explore the best possible experience for in-app notification and discovery.
This is a nice way to put it, but it’s obviously candy-coating the truth: that it sucked and should never have been released in the first place.
But good for them for owning up to the mistake and fixing it.
One little nitpick, though: of course Twitter’s metrics showed that it was used a lot. It was a large, new element at the top of the main screen. That doesn’t mean everyone who “used” it was happy about having done so, or clear about its purpose.
Tons of people must have invoked it just to figure out what it was, and given its size and position, I wouldn’t discount how many of the “uses” were probably from accidental taps. Usage metrics should attempt to account for this, for example, by separately tracking repeated invocations, or invocations that the user kept open for more than a couple of seconds.
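One way to do that accounting, sketched here with made-up thresholds (the two-second dwell time and the repeat-visit minimum are my assumptions, not anything Twitter has described):

```python
from collections import Counter

def deliberate_uses(events, min_seconds=2.0, min_repeats=2):
    """Filter a log of (user, seconds_open) invocations down to ones
    that look intentional: held open long enough, by a user who came
    back more than once rather than tapping out of curiosity."""
    counts = Counter(user for user, _ in events)
    return [
        (user, dwell) for user, dwell in events
        if dwell >= min_seconds and counts[user] >= min_repeats
    ]

# alice's two long invocations survive the filter; her accidental tap
# and bob's single quick tap don't.
events = [("alice", 0.4), ("alice", 5.0), ("alice", 3.1), ("bob", 0.3)]
```

Even a crude filter like this would separate “everyone poked at the big new bar once” from “people actually use this.”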
I imagine they know this, and the simple statement in their post is just part of the PR-ification of the rollback.