Filed as Apple bug (Radar) 27848317. The problem, in short:
AVFoundation, the low-level audio/video framework in iOS and macOS, does not accurately seek within variable-bitrate (VBR) MP3s, making VBR impractical to use for long files such as podcasts. Jumping to a timestamp in an hour-long VBR podcast can result in an error of over a minute, without the listener even knowing, because the displayed timecode shows the expected time.
VBR encoding is far more space-efficient and better-sounding than constant-bitrate (CBR) encoding. The advantage is especially pronounced in podcasts, where VBR makes most files 20–50% smaller AND better-sounding than the 64 kbps CBR encoding that most podcasters are forced to use today.
VBR could save podcast listeners massive amounts of data transfer over time. (And therefore money, and battery life, and precious storage space on phones.)
Without accurate seeking, streaming and web audio players don’t work properly, including share-at-timestamp links that are becoming key drivers of the sharing and spreading of podcasts.
Why can’t podcasters use it?
I explained how MP3s work, and why this is a problem, on Accidental Tech Podcast last week — see that? That’s a share-at-timestamp link, and if that file was VBR, it wouldn’t seek to the correct time.
See for yourself: here’s that same podcast in VBR. Note that the file is 25% smaller and the theme song (at 1:22:47 in the original file) sounds way nicer in the VBR version. But if you seek to the same timestamp as the above share link — 1:24:30 — you’ll hear the wrong audio. The player will say 1:24:30, but you’re actually hearing the audio at 1:25:16.
That’s 46 seconds off, and that’s enough to break timestamp sharing, and that’s enough to ensure that nobody ever uses VBR files, and podcasts keep transferring more bytes to sound worse for the foreseeable future.
We fixed this in the same year the Backstreet Boys released “I Want It That Way”
Three simple solutions to accurate VBR stream-seeking have existed for almost twenty years, each embedding a seek-offset table at the start of a VBR MP3 so players can map timestamps to byte positions precisely:
The Xing header: a 100-entry seek table (TOC) written into the first MPEG frame by most VBR encoders.
The VBRI header: Fraunhofer’s variant, with a larger and more precise seek table.
The MLLT tag: ID3v2’s “MPEG Location Lookup Table,” which can store time-to-byte mappings at arbitrary precision.
But AVFoundation supports none of them. VBRI and legacy Xing frames are read, but only the duration is used from each, not the seek table. MLLT tags are seemingly ignored.
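To make the unused data concrete, here’s a minimal sketch (mine, not Apple’s) of how a player could pull the Xing seek table out of the first MPEG frame. The fixed offsets below assume an MPEG-1 Layer III stereo frame; a real parser would derive them from the frame header.

```python
import struct
from typing import List, Optional

def parse_xing_toc(frame: bytes) -> Optional[List[int]]:
    """Return the 100-entry Xing seek table (TOC) from the first MPEG
    audio frame, or None if the frame carries no table.

    Offsets assume an MPEG-1 Layer III stereo frame: 4 bytes of frame
    header, then 32 bytes of side info, then the Xing/Info magic.
    """
    pos = 4 + 32
    if frame[pos:pos + 4] not in (b"Xing", b"Info"):
        return None
    pos += 4
    (flags,) = struct.unpack(">I", frame[pos:pos + 4])
    pos += 4
    if flags & 0x1:       # total frame count present; skip it
        pos += 4
    if flags & 0x2:       # total byte count present; skip it
        pos += 4
    if not flags & 0x4:   # no seek table in this header
        return None
    # toc[i] is the byte offset at i% of the duration, scaled to 0-255.
    return list(frame[pos:pos + 100])
```

Duration (from the frame count) lives in the same header, which is presumably why AVFoundation reads it; the TOC sits right next to it, unread.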
It appears that AVFoundation estimates byte offsets with the simple ratio (timestamp / duration) × totalBytes, which assumes a constant average bitrate over the whole file, an assumption that is incorrect and unsafe for VBR encoding. (ABR encoding does maintain a constant average bitrate over the whole file, but its size-to-quality ratio isn’t enough of an improvement over CBR to be worthwhile for most podcasts.)
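Here’s a sketch of the difference, in Python rather than anything AVFoundation actually runs. The linear formula is my reading of the observed behavior; the seek table is synthetic, describing a hypothetical file whose first half of the timeline is encoded at a quarter of the bytes:

```python
def naive_seek_offset(t: float, duration: float, total_bytes: int) -> int:
    """Linear estimate: assumes the file's average bitrate is constant."""
    return int(t / duration * total_bytes)

def toc_seek_offset(t: float, duration: float, total_bytes: int,
                    toc: list) -> int:
    """Xing-style estimate: toc[i] is the byte offset at i percent of
    the duration, scaled to 0-255; interpolate between entries."""
    pct = min(max(t / duration * 100.0, 0.0), 99.999)
    i = int(pct)
    a = float(toc[i])
    b = float(toc[i + 1]) if i + 1 < 100 else 256.0
    scaled = a + (b - a) * (pct - i)
    return int(scaled / 256.0 * total_bytes)

# Synthetic VBR profile: the first 50% of the time uses only 25% of the bytes.
toc = ([int(i * 64 / 50) for i in range(50)]
       + [64 + int((i - 50) * 192 / 50) for i in range(50, 100)])
```

With this table, seeking to the halfway timestamp of a 1,000,000-byte file yields byte 500,000 from the linear estimate but byte 250,000 from the seek table: the same class of drift behind the 46-second error above.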
Supporting either MLLT or VBRI at the AVFoundation level (therefore affecting Safari, HTML5 <audio>, Apple’s Podcasts app, and more) would instantly make VBR podcasts practical, allowing much smaller files and better sound without sacrificing shareability and stream-seeking.
I’ll be adding MLLT support to Overcast, but without a way to embed podcasts in the web player to preserve share-at-timestamp links, VBR files will continue to be practically unusable for podcasters.
Know anyone in engineering at Apple? I’d appreciate any attention you can draw to this issue, which I’ve filed as bug 27848317.
This week: why ARM Macs aren’t imminent, a huge rant on the ancient Mac lineup (especially the neglect of the Mac Pro and Mac Mini), dog rental, and my algorithm for syncing audio tracks and correcting drift.
Government guidelines for LED manufacturers require these control circuits to operate on frequencies between 30 and 300 MHz. By coincidence, most garage door opener remotes have been assigned frequencies between 288 and 360 MHz.
I was using a mediocre, no-name LED light bulb in my (very old) garage-door opener, so I switched it to an incandescent I had stashed in my Drawer Of Light, which promptly and poetically burned out later that same day.1
But that was weeks ago, and the problem hasn’t occurred since. It’s been 100% reliable since I removed the LED bulb, and even catches the signal from greater distance now.
I still haven’t gotten around to replacing it. It turned out not to be essential, and I’m a terrible home-repair slacker, which is why I tried to put LED bulbs everywhere in the first place so half of our light bulbs wouldn’t be burned out constantly. ↩︎
Apple has, for a while now, offered two separate additional security measures to protect your Macs, iOS devices, and iCloud account, but thanks to some inexpert nomenclature, it can be a little difficult to tell them apart.
I’m glad Dan Moren figured this out and wrote it up, because Apple sure didn’t make it easy to even know that there was a newer, better option than the original two-… uh, factor? Step? I already forgot which is the old one and which is the new one. Whichever it is, switch to the new one.
A few Tesla vehicles have had accidents with Autopilot enabled recently, and I’ve gotten countless questions about these incidents and the nature of Autopilot from people who aren’t Tesla owners. Tesla and the media haven’t clearly communicated what these features do (and don’t do) to the public, so I’ll try to help in whatever small way I can as someone who has owned a Model S for a few months.
I apologize in advance if I get any technical details wrong about these features. Authoritative information is hard to find, and these features change and evolve often.
Tesla’s autonomous features today, all somewhat grouped under or involved in “Autopilot”:
Automatic emergency braking: This always-on feature will sense if you’re approaching another car or obstacle too quickly and loudly alert you. If you don’t apply the brakes yourself, the car will automatically brake to some degree. This is a common feature in luxury cars today and seems to be a clear safety win.
Autopark: Reverses into parking spots on demand. This is also becoming a common feature on other cars, and seems reasonably safe as long as you watch out for pedestrians. I use it regularly for parallel parking and it works well.
Summon: This feature lets you command the car, from outside of it, to very slowly drive itself into or out of a garage or parking space. It’s disabled by default and requires multiple steps to enable and engage (nobody could do this accidentally). The potential damage from failures is likely limited to car body or garage damage, not major bodily harm, due to the very slow movement and ultrasonic parking sensors. I haven’t used it yet — I don’t think the small benefit is worth the risk.
Adaptive cruise control: Like normal cruise control, but with a forward radar (augmented by the camera) to maintain a safe distance from the car ahead of you, automatically slowing down or even stopping as necessary. It’s almost like automated driving, but you still steer, and you’re responsible for obeying signs and signals. This feature is also available on many luxury cars today, and Tesla’s is the best one I’ve used yet, so I use it all the time. It bears most of the same risks as any cruise control, but the chances of rear-ending the car ahead of you are greatly reduced, and it may even be safer than manual driving in low-speed stop-and-go traffic. I’m a huge fan of this feature.
Autosteer, which is probably what people mean by “Autopilot”: Really just one significant addition to adaptive cruise control: the car also steers itself, using the camera to detect lane markings painted on the road (a feature many other cars offer on its own) and steering automatically to keep you roughly centered in the lane.
Autosteer is a strange feeling in practice. It literally turns the steering wheel for you, but if you take your hands off for more than a few minutes, it slows down and yells at you until you put your hands back on the wheel. It’s an odd sensation: You’ve given up control of the vehicle, but you can’t stop mimicking control, and while your attention is barely needed, you can’t safely stop paying attention.
It’s automated enough that people will stop paying attention, but it’s not good enough that they can. You could say the same about cruise control, but cruise control feels like an acceptable balance to me, whereas Autosteer feels like it’s just over the line. History will probably prove me wrong on that, but it feels a bit wrong today.
Tesla, Elon Musk, and a lot of media coverage have set expectations too high for these features. People expect Autosteer to be fully autonomous, but today’s Tesla vehicles simply don’t have the hardware or software to safely and reliably self-drive on all roads, and such an advance doesn’t feel imminent.
There’s a huge gap between Autosteer and what most people expect from a “self-driving car”. For instance, Autosteer doesn’t see signs or traffic signals, so it will happily drive through red lights or stop signs if you let it.
Most critically, Autosteer has simply not been reliable enough yet for me on anything but wide-laned, gently turning, intersection-free highways with clearly painted lines in dry weather. In my experience, using it on any other type of road — even New York’s highway-like parkways — is dangerous and unsettling, often requiring manual corrections to avoid crossing center lines or getting dangerously close to lane edges and concrete barriers.
The most reliable, useful, and defensible parts of Tesla’s “Autopilot” features today are emergency braking, Autopark, and adaptive cruise control. I’d be just as happy with my Model S if it only had those, without Summon or Autosteer.
While I like using Autosteer on long highway trips, frankly, I’m amazed that it’s legal. I don’t think it’s a big enough advance over adaptive cruise control to be worth the risks in its current implementation. I’m scared for what will happen to Tesla and the progress of autonomous driving as more people use Autosteer in situations it’s not good at, or as a complete replacement for paying attention.
If Tesla updates the software to restrict Autosteer only to interstate highways, the yelling (and possible lawsuits) from existing owners would cause short-term pain, but I think it may save a lot of reputation damage — and possibly even people’s lives — in the long run.
The San Francisco Chronicle, in a very rare front-page editorial:
On one point we must all agree: The level and pervasiveness of homelessness in San Francisco is a disgrace. It is simply not acceptable to allow people to stay in the squalor of tent encampments or sleep in doorways, parks and freeway underpasses without attention to the underlying issues that prevent them from attaining shelter and stability in their lives. It’s bad for public safety, bad for public health, and bad as a matter of basic humanity.
Its reduction to the extent humanly possible should be this city’s No. 1 priority.
I only spend one week a year in San Francisco, and I’ve seen relatively little of the city. But every year, I’m increasingly struck by the widening class divide and the disturbing contrast as tech workers (including myself) briskly walk past so many people whom society has completely failed, pretending not to notice them, on our way to the offices and events of some of the richest companies in the world.
We can’t keep boasting about our industry’s “innovation” and how much we’re “changing the world” when we can’t even take care of people’s basic needs literally right outside these companies’ front doors.
This isn’t just a San Francisco or tech-industry problem, but there isn’t another place in America that illustrates the problem quite as clearly, sadly, and disturbingly.
Governments should be fixing this problem, but they have mostly failed due to public ignorance, judgment, and apathy. If you really want to be “disruptive” and have a meaningful impact on the world, disrupt the way our cities and citizens treat those less fortunate than the rich young people ordering overpriced burritos from their phones to avoid going outside.
This fall’s new iPhone is strongly rumored to have nearly the same physical design as the iPhone 6 and 6S, but with the headphone jack removed. Many have guessed at the possible justifications for such a move.
In short: There may be a great reason why the headphone jack must be removed on an iPhone that isn’t getting a noteworthy size change or battery-life increase, but we haven’t heard one yet.
There are clear benefits to Apple — minor savings in parts and internal complexity, some profit from adapters and Lightning licensing, and driving a big Beats upgrade cycle — but nobody has come up with any compelling benefits for customers that require removing the headphone jack and can’t already be done in today’s iPhones.
People already think Apple changes ports capriciously and slows down their phone with OS updates just to force upgrades and make more money, even when they actually have good reasons that benefit their products and customers. I suspect that the reaction to removing the headphone jack will be even more severe in this way than the Dock-to-Lightning transition.
Apple had better have very good benefits in store that customers will actually want, but none of the reports so far indicate any.
Combined with the disappointment sure to result from the same physical iPhone design for three years in a row — a mediocre one, at that — I fear for the public perception of this fall’s iPhone and Apple as a result.
It’s too late to change anything about this year’s iPhone hardware, but if this is true, I hope Apple at least reduces the perception damage by including a Lightning-to-3.5mm adapter in the box along with the new Lightning EarPods, and also selling the adapter separately for just $9.99. That would go a long way toward alleviating the problem.