Review: MacBook Pro 16" (late 2016, Earth II)

I told myself I wasn't going to do this.

Just under two years ago, I had been holding out for years for Apple to introduce a MacBook Pro that undid the sins of the late 2016 model: the Touch Bar, the butterfly-mechanism keyboard and the entirely Thunderbolt 3/USB-C I/O. With rumors swirling all through 2019, it took Apple until November 13th to introduce the first 16" model, which ditched the problematic butterfly switches, shrunk the Touch Bar to allow a physical Escape key and bumped the specifications.

With my then-current pre-Touch Bar model crumbling, I bit the bullet and upgraded. It cost, by far, the most I had ever spent on a Mac – dongles to enable connectivity and compatibility, and additional chargers, not included. But it was a necessary investment, budgeted to last four to five years.

What none of us knew at the time was that work was already proceeding on its successor: the first Apple Silicon full-sized, full-featured MacBook Pro made for professionals, made to include all the features we had been clamoring for all these years.

The keyboard

The keys are, from all I can tell, the same type as the ones used in the 2019 model. Some say they can feel a difference between the older MacBook Pro models and these, and side by side I can maybe make it out, but they are what I want in a key: snappy and distinct, with the right level of travel.

What's more, there are now 12 additional keys, taking the space of the Touch Bar, and stretching to fill a full-sized row, leaving the arrow keys (rightly) as the only half-height keys. After two years with a Touch Bar, being able to reach for and press a full-sized volume key feels almost ostentatiously luxurious.

Aesthetically, the keyboard well has been anodized a matching black, and it is a great decision that cleans up the visual clutter somewhat without sacrificing the legibility of the key shapes.

The function key system shortcuts are new, and the keyboard backlight controls are notably relegated, possibly as a message that it is time for the functionality to fade into the background and for the brightness to adjust itself. (You can still adjust the brightness in the Keyboard pane of System Preferences.)

As for the rest, Do Not Disturb gets its own key, as do Spotlight and Siri. I don't use Siri, so pressing its key (F5) is now permanently allocated to bringing up a window asking me if I'd like to change my mind. As with Spotlight, which I always invoke faster with Cmd+Space, I would like to be able to reassign these keys to different functions.

As with all Apple Silicon Macs, the fn key now pulls double duty as the "Globe" (🌐) key, which when pressed alone brings up the Emoji palette. This feature has been available since macOS 11, as a checkbox option, but the key is now also referred to as the Globe key in the user interface of some settings related to keyboard layouts.

The ports

Blissfully, the sides now feature many things besides just plain Thunderbolt/USB-C. Let's start from the left.

First, you can find the new MagSafe 3 port for magnetically attached charging, which now involves a plain, braided and still expensive MagSafe-to-USB-C cable, but which at least is available as a separate purchase. Additionally, the 3.5 mm headphone jack has moved back to the left edge (which I personally liked better), and it supposedly works better with high-impedance headphones.

Going around to the right, you will find an HDMI 2.0 port (not HDMI 2.1) and an SDXC slot capable of UHS-II, but not UHS-III. Neither is the absolute best, most up-to-date, most capable that could have appeared in its spot, which is frustrating considering the price and profile of the computer.

You will also find three Thunderbolt 4/USB 4 USB-C ports (two on the left, one on the right). What you won't find is even a single USB-A port (the traditional USB port). As someone who, even after years of accumulating USB-C devices, still needs to plug in USB-A devices, and still needs to do so much more often than use SD cards or HDMI, this is beyond frustrating. I would still have needed a dongle for the occasional Ethernet use or different video outputs, but I could have minimized my need to carry around, keep track of and remember to bring one. After all, during the introduction of the MacBook Pro, Apple itself claimed to heed the wishes of professionals to not need adapters. With that goal in mind, eschewing even a single USB-A port is a very odd trade-off, since it, unlike Ethernet (which many production professionals also need), would have fit on the side.

The display

Returning to 2019 for a moment: that year also saw the tremendously expensive Pro Display XDR, aimed at being a less expensive alternative to professional production-level reference monitors. Whereas the Pro Display XDR had 576 local dimming zones powered by LEDs, whose need for constant cooling informed the lattice pattern on the back of the display, the new MacBook Pro display is one of the first mini-LED displays, packing 2500 local dimming zones (with 4 mini-LEDs each) into an assembly that is millimeters thin, while still delivering the same color space, 1600 nits peak and 1000 nits sustained brightness specifications.

The new display also picks up 120 Hz and adaptive refresh rates (ProMotion), putting it still behind many PC laptops and desktop monitors (some of which go for 360 Hz), but matching this year's iPhone 13 Pro and 2017's iPad Pro, and besting the 60 Hz Pro Display XDR (whose audience is often reluctant to venture above film refresh rates at all).

In a sentence: this is probably the best display you've ever seen on a laptop. OLED-level inky deep blacks with sharp contrast and even higher resolution. The bezels are even narrower, with the top corners even following the curve of the casing. You may even call it…

A top notch display

Smack dab in the top middle of the display is a notch: an area where the display ceases, for the benefit of the camera and related sensors. I think a notch is an eyesore that at best you learn to ignore, but in this case I see a number of things working in its favor.

Without the notch, the display would be roughly equal in proportions and area to the prior 16" MacBook Pro, and would match a full 16:10 display. The additional area is extra space, and it houses the menu bar, which for me personally is sparsely populated in almost every application. In applications with a large number of menus (which many professionals could need to use), or in any application for people who use dozens of the top right "menu extras", you do start to run out of space, with menu items wrapping around or their titles being truncated. This is not a good look, and in that case it may be better to enable the quizzically titled Get Info option "Scale to fit below built-in camera" (which may as well be named "Move below the notch"), causing the display rows alongside the notch to be ignored.
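
For apps that need to reason about the notch themselves, macOS 12 exposes the geometry through additions to NSScreen. A minimal Swift sketch; the printing is purely illustrative:

    import AppKit

    // On a notched Mac running macOS 12+, safeAreaInsets.top reports the
    // height of the camera housing, and the auxiliary areas describe the
    // usable regions to either side of it.
    if let screen = NSScreen.main {
        let insets = screen.safeAreaInsets
        print("Notch height: \(insets.top)")  // 0 on notch-less displays
        print("Top-left area: \(String(describing: screen.auxiliaryTopLeftArea))")
        print("Top-right area: \(String(describing: screen.auxiliaryTopRightArea))")
    }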

On iPhones, full-screen widescreen video brings out the worst of the notch, making it cut an unsightly bite out of the display. Here, the aspect ratio of videos works to its advantage. Most are 16:9 or wider and are already centered in the display with letterboxing, putting them far away from the notch. The mini-LED nature of the display also leaves the letterboxing areas black and the notch barely distinguishable at all.

So is the notch needed from a technical standpoint? We've all seen phones; why can't you just fit the camera up in the bezel area? Because on phones, you still get 2-4x the depth to work with, and cameras can be small due to fitting all the circuitry behind the lens and the housing. Here, the module needs to be spread out instead, which means placing it entirely inside the bezel area probably did not work well at all. I'm sure you could fit some form of camera in the space allotted, but I'm not sure it would be one you'd want to use.

Given that the result is both a good camera and additional screen space, and given that I don't suffer from stuffed menu bar syndrome, I'm willing to forgive the notch. My main complaint is that it's not possible to, out of the box, set the menu bar to pitch black, which would hide the notch much better. This is not a guess; opening an application in full-screen does this, and it is indeed almost unnoticeable.

Of cores

I wanted to maintain my 64 GB of memory, which meant I had to pick the M1 Max, the only variant that allows for 64 GB. The heaviest pressure I put on the GPU in daily use is having way too many windows and tabs open. On the previous MacBook Pro, the GPU had 8 GB of memory to work with, and I would not be surprised if most of it was populated with texture caches for all the various surfaces to draw.

Historically, shared memory between a CPU and an integrated GPU has been a feature of low-end processors, or at least of energy-efficient processors. Those tend to work with minuscule amounts of RAM and over slow buses, and the circumstances have added up to a poor experience – a slow GPU stealing a static amount of memory from an already constrained CPU.

"Unified memory", as Apple terms it, on the most fundamental level is not different from these shared memory shenanigans. But as implemented on the M1 Pro and M1 Max, the balance shifts. The memory bandwidth is now 200 or 400 GB/s, which is in line with what purposely high-bandwidth directly-attached GPU memory provides. The access latency is very low, and the throughput is significant. Most of the GPU rendering is also probably being done by system frameworks using Metal, which is optimized to use representations for pixel buffers or textures that the GPU cores natively need, which means the CPU can produce an image and the GPU can use it without transforming it into another representation, and without even needing to copy it into another location. On the scale of a system, all of this adds up. Instead of two cooks trying to work separately in a galley the size of a closet, it ends up more being two cooks trying to cooperate, preparing and picking up from one another, in a sizable and well-kitted-out kitchen.

I have not used an M1 Mac, but have heard tales of a single core saturating the memory bandwidth, and one of the reasons I stayed away was the limited connectivity. The architectures of the M1 Pro and M1 Max, keeping the fundamental CPU cores themselves but expanding on everything else, show that these shortcomings were not intentional declarations of "good enough", but transitional pains that will eventually go away. Depending on the benchmark chosen, many things can be shown, including that the GPUs are far from competitive with desktop GPUs at gaming. But the collected performance of the CPU cores is competitive with many desktop CPUs from Intel and AMD, and this bodes well for the future.

All of the above, in an architecture that sips power in both an absolute and a relative sense – and thus both gives long battery life and minimizes fan noise and heat – makes it hard to stay calm and reasoned about the performance of the MacBook Pro.

Industrial design

In 2015, the 12" MacBook started a new era of industrial design and of priorities for Apple. For one and a half years, its impact on the upper end of the MacBook family was unknown, but the 2016 MacBook Pro brought in slimmer lines, the butterfly keyboard and entirely USB-C I/O.

As of last week, this era is over and buried, and the 2021 MacBook Pro, being the first major redesign, paints a new picture. The Pro products are allowed to be more substantial, in every sense of the word. Thicker and heavier are not absolute terms of revulsion, but axes along which the product can be measured. There are laptops three times the thickness of the 2021 MacBook Pro, and half the thickness as well. It is a reasonable size and weight for its intended use case, and its format allows it to pack in features that are important. Focusing is the process of choosing tradeoffs, more than just saying "no" as many times as you can to various impingements on a design dreamed up in isolation and divorced from purpose and mechanical realities.

Design is how it works, and the 2021 MacBook Pro, through its 100% recycled aluminum shell, through its comfortable keyboard and outstanding trackpad, through its sharp display, through its groundbreaking M1 Pro/Max system-on-chip, through even its camera notch and still-too-few ports, works very well for its intended use. It knows its market – the people who need to or who want to do a lot of things with their computers – and is designed for them.

It is also expensive. More than any high-end Mac in a decade, it provides a mix of features and capability that could actually command a premium, even if we weren't in a worldwide semiconductor crisis. And it finally catches up to high refresh rate displays.

In 2019, the 16" MacBook Pro told us: hey look, at least they're trying again. In 2021, the 14" and 16" MacBook Pro tell us: breathe a sigh of relief. The MacBook Pro has rediscovered what it is.

The hard part now will be going four years without getting the next one.

Good Things Take Time

Colin:

I spent part of 2019, all of 2020, and most of 2021 working on the new MacBook Pros announced today! I designed the M1 Pro and M1 Max dev boards. then I was the M1 Pro and M1 Max system integration lead, and new miniLED and camera system integration lead.

Reminder to myself and to other people: good things take time. Especially trying to make something brand new come to life for the first time.

Voicing complaints is easy. Building things is hard. Thank you, Colin, and all the people you work with, and every other person out there who sits down with a blank sheet of paper and slowly and steadily makes new things happen.

So Ordered

Clear, reasoned, calm thinking does not emanate within an hour of an announcement. I'll do my best, but let's instead call this a celebration.

  • MagSafe is back, at the cost of one Thunderbolt/USB-C port.

  • All Thunderbolt ports are now Thunderbolt 4, instead of the "USB 4/Thunderbolt 3" of the M1 Macs.

  • The Touch Bar is gone, with full-height function-row keys taking its place, and Touch ID left alone. Could it ever have worked? Maybe, but Apple was interested only in planting the flag and claiming victory instead of acknowledging its failures and working around them. The world is not full of successful touch surfaces that you use by feel, without looking, and whose contents change constantly. Considering its mention in the presentation, even Apple eventually thought getting back to what a keyboard is all about was a defensible move.

  • Out of the fog of "M1X" steps M1 Pro and M1 Max, née Jade C-Chop and C-Die. M1 Pro has 16 GPU cores and up to 32 GB memory; M1 Max has 32 GPU cores and up to 64 GB memory. I (and my wallet) might have settled for M1 Pro with 64 GB of memory, but that is not how these things work, and now with some technical justification.

  • M1 Max's GPU chops are apparently "comparable" to laptop-level discrete "Max-Q" chips (GeForce RTX 3080 Laptop, in their comparison). In other words, it's not a bloodbath in terms of maximum performance, and it is not yet a credible alternative to the highest performance discrete desktop GPUs. But if your needs are lower than this, it looks promising.

  • That said, the unified memory strategy means that, as long as you don't want to max out both CPU memory and GPU memory, the ability to address a ton of it over a fast (200 or 400 GB/s) bus is a sleeper hit of this architectural choice. (And even if you do: 32 GB of each does not come easily in most PCs.)

  • The more square-ish form factor reminds me of the aluminum PowerBook G4, and is probably great for airflow.

  • HDMI, SDXC, good. I could have done with a USB-A port or two; they and wired Ethernet are prime reasons not to throw away your dongle.

  • 120 Hz, ProMotion/adaptive refresh rate, mini-LED display, with higher pixel density and display brightness fit to deliver good HDR. I would have been happy to have two or three of these; a notch for the webcam is a negative, but much less of a negative on an OS with a menu bar (assuming the OS knows not to put anything underneath, of course).

  • With M1 Pro and M1 Max, the mirage of an entire product line supported by essentially the same M1 chip is now killed; arguably, it was killed by the M1 itself taking the place of A-series chips in the iPad Pro earlier this year.

At some point in the past, from all I hear, the PowerBook G4 was faster than the then-anemic Intel mobile chips despite its lower clocks. When I had mine, its own anemic 167 MHz front-side bus put a damper on any claims to supreme (or at times even acceptable) performance. The switch to Intel and the then-new Core Duo brought a welcome jolt, but at the expense of "just" being as good as the rest of the portable PC market.

Today, for the first time, Apple steps into the brave new world of not just terrific performance per watt but, assuming all the claims hold water, a level of laptop performance that is barely achievable with machines that are several times thicker, heavier and louder, and that last a fraction of the time on battery.

Today, with some caveats, Apple delivered a MacBook Pro that was for actual Pros again, for the people who need more ports, more performance, more capable hardware; who want their computers to be computers, instead of scaled-up app machines.

And today, at long last, Apple listened to its users once again, took their needs and their advice at their word, and made a better computer.

Windows 11's Start Menu

Lukas Mathis:

For application launchers, though, a spatial view is still the preferred approach. This is why Windows 11’s Start menu is so confusing to me.

This is what my Start menu looked like in Windows 10: (ed)

This is by far the best home screen experience any operating system currently offers. Better than the app launcher on OS X, better than Android, better than iOS, better than any Linux distro I’ve seen.

(There's only so much I can quote without quoting the whole article, so go read it, it's not long.)

Lukas is right – Windows 10's Start menu is not a place I prefer to spend time, but it does embody seldom-implemented humane sensibilities. Big things can be big, tiny things can be tiny, related things can be grouped together. You are allowed to paint the walls, arrange the furniture and inhabit it; to make it yours.

In contrast, the goal of Windows 11's Start menu is seemingly to run from the past, towards the colorless, odorless grey goo of conformity in the alphabetized list of junk. It seems to have started from the idea that Windows had gotten everything wrong, and that for the sake of everyone's sanity, it should just absorb the characteristics of other systems. There is some merit to this idea, and it allowed them to escape the morass of Metro and Fluent/"acrylic", but it doesn't serve Windows 11 well in these areas.

Yo Dawg, I Heard You Liked App Stores

MacRumors:

Two major Apple competitors, Google and Microsoft, now support alternate app installation options on their platforms, something that could potentially sway regulators working on antitrust legislation in the United States and other countries.

When I heard the original part of the Windows 11 announcement, I got the sense that the Microsoft Store would become either a store "platform", where other people could host Microsoft Store-shaped objects, or a store browser, through which other stores could be federated. Making the separate stores available for download through their store itself (which is mostly what it all comes down to) is a reasonable way to cut this Gordian knot.

But here's the thing. Alternative Android stores, alternative Windows stores — they already exist. They have existed for years, the technical platform has existed for years, all issues have been ironed out or known for years, and the major tide has been closing an open system (for Android, "anyone can do anything"; for Windows, "Microsoft can't build a regular application cooler than you can" (literally)).

For Apple, the starting point is the complete opposite. A closed system, including a secure (in theory) sandbox architecture, with isolation of resources, deep layering in the system from bootloading upwards, and with "entitlements" and a set of permissions to unlock or manage who gets access to what. And, at the top, an App Store to download new app bundles to the device, and to set ongoing organizational policy about who gets access to what.

A third-party app store (or package manager, like Cydia) which wants to use this system has to have the routes to playing the App Store piece cleared for it, and for all the dozens of policies about which apps get to do what, in each case where Apple is listed as the arbiter, there will need to be a decision whether to delegate it or not. For instance, a third-party app store could have a more laissez-faire approach to who gets to write a Network extension (VPN) or a primary web browser.

There's also the gooey technical middle ground – for example, if you want to allow multiple web rendering engines, is there also then a need to let apps themselves provide frameworks to other apps? Personally, I think apps being able to work together and provide extensions, opt-in mutual integration points in an XPC-like way, or export/import data in a way that uses the ask-for/get-granted permission system to its advantage, is long overdue and would leapfrog the current URL/Shortcuts-based workarounds.
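
As a sketch of the kind of integration point I mean – macOS-flavored, since NSXPCConnection is not public API on iOS, and with the protocol and service name being entirely hypothetical:

    import Foundation

    // One app vends a small, typed service; another connects to it,
    // ideally only after the user has granted permission.
    @objc protocol DocumentExporting {
        func export(documentID: String, reply: @escaping (Data?) -> Void)
    }

    // Consumer side: connect and call across the process boundary.
    // (The provider side would hold an NSXPCListener for the same name.)
    let connection = NSXPCConnection(machServiceName: "com.example.exporter")
    connection.remoteObjectInterface = NSXPCInterface(with: DocumentExporting.self)
    connection.resume()

    (connection.remoteObjectProxy as? DocumentExporting)?
        .export(documentID: "doc-1") { data in
            print("received \(data?.count ?? 0) bytes")
        }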

But that underlines the work involved - even if Apple was ordered by the Andorran consumer protection government agency to provide a shrinkwrap App Store that other people could instantiate, those other people could not extend the underlying system, could not build out extensions, could not provide new permissions or entitlements. On Windows and Android, that's not an issue because it's not required (and for permissions and entitlements on Android, the OS itself can be forked, or maybe there is a lighter-weight way for the vendor to maintain custom permissions that I am just unaware of). On macOS, for all the unease about clamping down, there are already other app stores like Steam and Setapp that continue to work. But on iOS, there are basically more pieces missing than are present.

Right now, Apple seems to be coasting by on the pious hope that the entities that are able to order them to change their business will continue to be amenable to lobbying that defines their interpretation of the current state of affairs as the correct one, or failing that, that they will have some level of understanding and therefore sympathy for the sweeping and foundational efforts that would have to be made to enable other app stores — I can guarantee that they won't.

As Regards Circles and Jerks

Daring Fireball, in linking to Google's Jony Ive-lampooning ad about having a headphone jack in the new Pixel 5a with 5G:

As of next month, 40 percent of all iPhone model years will have been headphone-jack-free. This feels about as relevant as mocking the original iMac for not having a floppy drive.

This makes a lot more sense if you strike out "relevant" and scribble in "timely". Even so, clearly, the practical implications make it difficult for the world to "move on" from what would otherwise just have been an infamously bad look for a PR department.

The reason the ad is resonating with people is because we all either know people who are affected, or are ourselves affected, by the headphone jack's removal. The comments are full of people who are happy to have a headphone jack, and who are happy that the Pixel is made by a company that can make fun of itself as it's reversing a past decision; Apple removing a jack is one thing, but the entire industry being dragged along with it is another, and came with its own fallout.

As previously featured, I recently got a pair of AirPods Pro and have been using them for six weeks or so. (That review may have started on an odd note, because I'm not all that fucking important, but I really just wanted to explain what I was doing and why. I had a strong opinion going in, but I still wanted to let the product unfold in actual use, the way I tried to see the good in the Touch Bar; my life would be a whole lot easier if the pixie dust had checked out completely and I just had to give it a try.)

Since I wrote that, I have found myself switching to wired headphones more often than I should have had to if they'd been a full-on replacement, including one conference call where people thanked me for "unwrapping the tinfoil around my microphone". All this with shipping production firmware, on a mature product that has barely been used. Add to this the developing problem with the Lightning contact in my phone, which causes the wired headphones to disconnect abruptly.

The point is: it still makes a difference in day-to-day life. Not to everyone, and not all the time, but dollars to donuts, my headphone and calling situation would be much better if the headphone jack hadn't gone away. I'm way past done having emotions on this subject. I just want something to happen that solves the issue. AirPods don't. Adapters don't. And unless the W2 chip also does voodoo and Apple doesn't like me whining about App Stores, I didn't do anything to any of my hardware to make this something that literally only happens to me.

Manage Your Expectations

If I go to your site and there's a big blockout modal alert with the only options being "Accept All" and "Manage Settings", your options are to fix the bullshit exploitation business model that's already withering on the vine as it loses public support and legal cover, or to wait until bankruptcy and bitch and moan about how "the market never sent you any signals for what to fix and then you suffocated under the weight of regulation".

If your core business idea is to vacuum data about people's lives and habits and then resell them for money, it doesn't deserve to survive. If it's ever legitimate, it's not workable as a pillar of society or a thing that every business can do.

Don't get stuck in the dogmas of the trackers, the miners, the hoarders and the advertisers. Do what it is that you do. Get the word out. Let people find it and use it or read it or listen to it or whatever it is you do. The world is a big place and has room for many successful businesses. What it doesn't have room for is bloated messes of corporations getting, insincerely, into every market they can enumerate, just to keep the numbers up, the growth up, just to be "seen" or "ranked highly" in one more place.

Surveillance Capitalism is a cancer on society. Venture Capital is a false high, leaving businesses strip-mined, founders broken, employees wrung out like towels, ideas that don't work accepted with billions behind them, and ideas that would benefit many ignored.

There is a place for keeping the space race alive and for running the engine of discovery, so that we can catch the ideas that no one expects to explode into society-altering upheaval, like movable type or electronics. There's no place for doing so off the backs of hundreds of thousands of employees not even allowed a workable, dignified job or a living wage.

We live in the future now. It was never the high technology that was the problem, it was the squishy humans, the poverty of their imaginations and their willingness to define the peak of human reasoning and achievement as an industrial age ironmaster.

Quinn Nelson: The MagSafe Battery is Trash

Speaking of avoiding range anxiety on Apple devices – Quinn reviews the iPhone MagSafe Battery, pronounces it an underbaked and confusing product and backs up his claims.

There's more to it than this appetizer, but extend that across all possible axes and it's not far off.

Review: AirPods Pro (2019)

I've come up with a set of rules that describe our reactions to technologies:

  1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
  2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
  3. Anything invented after you're thirty-five is against the natural order of things.

— Douglas Adams

The prologue

For years, I have taken contrarian perspectives on some recent innovations. I have decried app stores, refused to participate in social media and seen usability sprayed with shrapnel in the slaughter of skeuomorphic extravagance or simplification for its own sake. There are several distinct reasons why AirPods and I are not born friends, so why should I do this review?

The purpose of a review is not to be neutral, it is to be fair.

I have not notably changed my mind, but opinions are nothing without experience, and when I started considering getting Bluetooth headphones anyway for those situations where they are a good idea and have always excelled, I knew what I wanted to try.

A good review is not just an enumeration of facts, it is the report of a long experiment where the product is allowed to perform its function, and where its success and fitness for purpose is documented in view of expectations, competitors and suitability.

This review is based on four weeks of daily use.

The expectation

For over a decade, I have used in-ear wired headphones. Most commonly of the type sold by Apple since 2009 (I have owned somewhere north of six pairs, wearing them out as I go), but I have recently tried models from Sudio (Vasa Blå), Sennheiser (Momentum In-Ear) and Bang & Olufsen (Beoplay H3).

I don't do anything special – I listen to music and to podcasts from my computer and my phone, I occasionally take phone calls or participate in online meetings, and I pipe all computer audio out through them most of the time. While I notice distortion and can tell poor audio apart from good audio when able to compare, I am not an audiophile. I occasionally fall asleep listening to something, often podcasts. I enjoy a good microphone, and I like having the controls: volume up/down, play/pause, previous/next track, take call/hang up call.

The courage

The first models of AirPods came into this world on September 7, 2016, the same day Apple removed the 3.5 millimeter headphone jack from all new iPhone models. The move left three ways for users to attach headphones: use the bundled Lightning-to-3.5 mm adapter, use Lightning headphones (including the bundled EarPods), or use Bluetooth headphones (including the then-new AirPods). Apple did not leave anyone stranded, unless they wanted to simultaneously use the Lightning port for other things, including charging. But the adapter is now relegated to an accessory selling for $9, and it is famously fragile – in my experience, deservedly so.

Apple painted the move as having the courage to leave a long-lived connector behind. Considering the near-universal reaction from users, most of whom used wired headphones (if any at all), it wasn't made as a cheap, populist plea for acceptance. The presentation set AirPods up as the natural companion to every iPhone, delivering a sense of freedom and a superior listening experience.

The basics

AirPods Pro come in a small white charging case. You flip it open to light a green LED on the front, hold it close to an iPhone or iPad and tap the button on the card that pops up on screen. This names them and pairs them to that device (and all other devices with your Apple ID). You then pop each earphone in your ear, it gives off a distinctive muffled boom and you're off to the races.

When they need charging, you can pop them into the case, including one by one, and they charge from the case's battery. The case itself needs charging, and can be charged from a Lightning port or through standard Qi wireless charging. The earphones themselves do not charge standalone via Qi charging, only in the case (via pins in the stem).

When both earphones are worn, they can provide active noise cancellation through microphones picking up ambient noise. The cancellation has a mode where it can provide "transparency", imitating the sound you would have heard by not having the earphone in your ear at all.

The wire

Many reviews and impressions of any model of AirPods focus on the sensation of being untethered. Being able to move freely, without worrying about the wire, without dragging your laptop off its surface or making your head or ears suffer recoil. Although I expected more, I will admit that this is a good feeling. Being able to get up without popping out your headphones or rearranging the wire. Being able to go into another room without stopping what you're doing (as long as you keep in reasonable range of the connected device, that is).

The fit

The fit is, regardless of which of the three sized tips I use, pretty terrible. Even when I push them in as far as I can, they are at risk of falling out, and have fallen out numerous times during use.

The sound

I am ill-equipped to pose as knowledgeable. In my experience, they perform well, and do not provide worse sound than wired headphones. To my ears, they do not appear to provide earth-shatteringly crisper or deeper sound.

The noise cancellation works reasonably well, but also has a tendency to amplify background noise in a way that sounds like transmission noise. I don't know if the transparency mode always works like that, but it has put me off wanting to check.

The switching

As far as I can tell, they want to stick with the last device to which they were connected. On iPhone and iPad, it suffices to have one of them inserted and start using the device for them to automatically connect; on the Mac, a notification is shown offering to connect, or you can pull down the audio menu to connect (and also switch between the noise cancellation modes). On Apple TV, you have to switch manually by pulling down the info panel, switching to the right tab and selecting AirPods Pro, although in this year's tvOS edition, apparently a notification will appear where a remote button press will let you switch.

Most of the time, this works as advertised. Switching between a device where they automatically connect and a device where they don't is confusing in practice, but may settle in eventually. Wearing one earphone, taking out another earphone (waiting until it also plays the audio) and then putting the first earphone back into the charging case has a tendency to stop playback. Sometimes it resumes playback once it realizes that the other earphone is not coming back, but more than once it also resumed playback from the wrong device, playing back something entirely different.

The controls

Theoretically, you can control playback by pressing the stems of the earphones, but in practice, this has not worked well at all for me. The press does not register omnidirectionally; instead there are two grooves where your fingers are supposed to grip, and I still can't seem to find them quickly. The regular AirPods were mocked for looking like electric toothbrush brush heads, but as odd as it looked, I think they may have provided a better target.

The lack of tactility or haptic feedback also means you never quite know whether a press has registered, making the mix of "click", "press-and-hold" and "double-click" gestures on the same area awkward to perform. I miss having any form of volume control, but considering the current zoo of options (some of which can be reassigned or disabled), I don't miss having to properly distinguish them.

The storied curse of Harald Blåtand

As always, Bluetooth is unreliable, and AirPods are not safe from the laws of physics. I have heard things drop out and screw up, but they seem to be doing a good job of managing it.

The battery

The battery is the single sin for which I cannot forgive the AirPods Pro. They have introduced range anxiety into my life.

Using AirPods includes a contract to submit to always having battery life on your mind. I have seen this with some people who work their phones to the bone, having to always cart around extra chargers or battery banks and constantly topping them up to get through the day. If you use headphones in the way that I apparently do, you have to take a similar approach.

Each earphone lasts up to a few hours on a single charge, but in a way that seems to vary. The simple thing to do is to use them from full charge until they are drained, then place them in the charging case, wait for them to be juiced back up, and then resume. But this assumes that using headphones is an arbitrary luxury that you are fine enjoying for some time and then being without.

If this is not how you use earphones, you are instead driven to constantly devoting some part of your mind to doing the charging shuffle. Put one earphone in the case, wait for it to rise some number of percent, then take it out, put it in your ear, wait for the sound to bridge over, then take out your other earphone and put it in the case (and deal with resuming playback or reconnecting), then set a mental timer for however long is appropriate and do it again. Repeat this process every day for the rest of your life.

This breaks concentration, kills flow and reinforces, again and again and again, that this is a task you have to do or you will screw yourself in the future. If you are in a position to do something else or go somewhere, of course you can get a free reprieve by putting them both in the charging case. And if your usage pattern looks like this — brief (less than two hours) use, with at least 30 minutes or so in the case to allow recharging — then you probably have a much better experience than I do.

But I don't. This is my life now, and there is only using them or charging them. You can't do both.

The platonic ideal

Usually, there could be some hope. After all, battery technology is constantly advancing; charging is made quicker, capacities are improving and more efficient chemistries are being developed, with the world-class incentives of taking over power production from a dirty grid and making the transition to electric vehicles possible.

But I'm not all that hopeful that things will improve for the AirPods. Apple has a history of picking a "platonic" battery life figure, building a product around it and then maintaining that battery life. For the Apple Watch, it was 18 hours of occasional use, enough to retain some charge at the end of the day. For the iPod, 24 hours of music playback; iPad, 10 hours; iPhone, more or less lasting a day, scaling somewhat with the increasing demands of apps, networks and displays.

For those products, the platonic battery life is fine. For the Apple Watch and iPhones, as long as you can charge them as part of your day, it all works. But that's not the story for the AirPods, at least as I use them. They would have to last two or three times longer. The only time I've seen that kind of leap with Apple products is recently, when Apple switched Macs to their own silicon. But AirPods already use, and probably are only possible because of, the W series of chips powering them to begin with. And with the charging case, it's not impossible to keep them alive more or less indefinitely. It's just a much worse time.

The alternate universe

Let's rewind to September 7, 2016 and travel to Earth 2, where Apple instead introduces the iPhone 7 which, of course, still includes a headphone jack. They then go on to introduce the AirPods exactly as is, and they are remembered as a technical milestone, which took a concept started by products like the Bragi earphones and made it work more dependably than before.

Would I have been interested in this product? As a curiosity, sure. I would have been a whole lot less reluctant to try it out. But I likely would have run into the same issues with it and its successors over time.

The AirPods can be many things at once: one of the better products in a market segment; a product fighting physics at every step and having the consequences to show for it; a product scarred from birth with the expectation of being a clear technological improvement over wired headphones. No product with its features can be unambiguously better than wired headphones; too much gets lost in the shuffle. It's not a fair fight, and Apple did it (and itself) a disservice by sticking it with the unwarranted removal of the headphone jack.

The big red score in the bottom right of the last page

Reality is complex and nuanced. Even in this review, there are layers upon layers of expectations, specifications, philosophy and sociology. The truth is that I'm weary of all this, of the soundbite made religion, of the compression of factors that are situational and personal into n stars out of five, and any way I came out would be a judgement call of something trivial over something essential.

The AirPods Pro are not the first piece of technology to bring unexpected mental weight into your life on a cushion of marketing, the promise of looking less like an old relic and the worship of gadgetry; nor will they be the last. They perform an adequate job with some shackles removed, hoping your use case will be shaped in such a way that you will not notice the shackles they add. But if you are in that sweet spot, and honestly even if you aren't, they still are adorable, personable and functional, in their own way. They aren't made without care; nor are they the perfect manifestation of headphone technology.

I will keep using them, but I will also keep using my wired headphones; they are both excellent tools for their scenarios. And unless battery technology and Apple psychology both make enormous strides in coming years, I'd still like that headphone jack back.

And If You Don't Like Them

Daring Fireball:

What happens, for example, if China demands that it provide its own database of image fingerprints for use with this system — a database that would likely include images related to political dissent. Tank man, say, or any of the remarkable litany of comparisons showing the striking resemblance of Xi Jinping to Winnie the Pooh.

This slippery-slope argument is a legitimate concern. Apple’s response is simply that they’ll refuse.

This hinges on Apple doing the right thing and protecting the privacy of its users. John Gruber is right that Apple has a record of showing more spine than usual against demands from law enforcement, even in charged situations, but the problem is that Apple also has a record of bending to the PRC's will.

During a few days in October 2019, it pulled and then reinstated an app allowing Hong Kong democracy protesters to organize. (The story includes a statement by Apple CEO Tim Cook attempting to staple legitimacy to the takedown and features a quote from one John Gruber, who "called Cook’s explanation “both startling and sad,” adding, “I can’t recall an Apple memo or statement that crumbles so quickly under scrutiny.”")

I have no reason to believe that Apple is in a hurry to assent to PRC policies, or that it doesn't bite its tongue when forced to follow a directive from the CCP or Beijing. Apple also has a responsibility to keep its employees safe, and in a state that perpetrates genocide against its own citizens and pulls public figures from society, I can only imagine the many ways they wouldn't be.

In other words: Yes, I fully believe that Apple will refuse when asked, and I don't question their motives for why this feature should exist. The problem is that I don't believe it's remotely enough. Some states do not have a record of taking no for an answer, and when recent history shows impactful decisions, going against those same values and morals, that are the result of either successful pressure or regulatory capture, the situation recalls the words of a quite different Marx: "Those are my principles, and if you don't like them... well, I have others."

Charlie Harrington: Mario Paint Masterpiece

There's just no reason for this game to be this good. Sure, MS Paint on Windows was always a fun time-waster back in the day. But it certaintly wasn't weird. And Mario Paint is w-e-i-r-d. It's like Photoshop on Magic Mushrooms. Plus an animation studio. Plus a digital audio workstation (aka Garageband). And all this back when we were still recording our favorite songs from the radio on cassette tapes (if we were lucky enough to catch them, and even then usually missing the first few bars).

Don't talk to me about delightful UIs or UXs if you haven't played Mario Paint. Nothing makes sense at a glance. Instead, it's pure discovery. Click-and-see. The undo button is a dog's face. Why? Why not. The fill-paint animation is a break-dancing paint-brush with a smiley-face (that sentence had a lot of hyphens). I spent hours and hours clicking every button in Mario Paint, and just making weird shit.

Nintendo are (clearly) not the gods of UI, but in Mario Paint they are playing near the upper echelons of what "UX" can mean. Undodog is a character, and if you idle, he will start to walk around restlessly, hurrying back if you return, and sneezing in the background if you turn off the background music. (And Nintendo wouldn't be Nintendo if he didn't also reappear in Super Mario Maker, which shares a lot of conceptual DNA.) Clearing the screen is done by flying a rocket across it; saving involves a robot. Not to mention the music creation genre it inspired, or the mouse affinity tutorial turned mini-game.

I'm not saying all software should be designed this way. But when it comes to intentionally quirky and weird UIs, most drive you up the wall. Mario Paint manages to tell a story, encourage creativity and facilitate said creativity (within significant technical limitations, which it leans into). Look at the cover art – Mario is doing that thing where he measures proportions, but even this has a double meaning, where he winks, smiles and shows a thumbs up to what he's creating. The kid using Mario Paint is not being helped by a well-meaning, self-aware parent, overseeing the puerile paint splashing with portent; they are simply doing what Mario is also doing.

Many modern UIs see it as a goal in itself to strip everything down to its bare essentials, and then freak out when there are no more essentials to clear out or conceptual purity to further. The opposite is to really empower your users, by giving them tools that let them do things they probably want to do, instead of worrying that an errant distraction will ruin everything.

Right to Repair

When I see "Right to Repair", I see "Plug 'n Play". When Plug 'n Play was introduced as a term alongside Windows 95, the joke was that the Mac had never needed a term for it, since it was just the way things were supposed to work.

Right to Repair exists today. You can take your device to any random repair shop and, as long as they are reasonably technically competent, they can take the same manuals and replacement parts and software provided to the authorized locations and perform the same work. It is not in any way rocket science (beyond, at times, what the manufacturer inserts into the process); it has not stopped manufacturers from "innovating", whatever the hell that even means anymore; and it is not a rampant public or product safety hazard.

The only thing I'd add is that the industry I'm describing is the automotive industry. You can do this with multiple ton vehicles, often filled with tens of gallons of flammable propellant just to make things interesting, but also with separate computer networks, tight clearances and miniaturized components out the wazoo.

Why you should not be able to do this with mobile phones and tractors has only ever had one honest answer – but we'd like it if we could make more money at the expense of our customers' convenience – and even it is not valid.

RIP Near

Incredibly sad news: Near, née byuu, known for their long commitment to emulation accuracy and as the author and maintainer of several emulators focused on accuracy and fidelity of reproduction above all else, appears to have committed suicide over the weekend.

A terrible tragedy, by all accounts triggered by constant and unyielding online harassment (warning: chilling). A great, unique, inspiring, mind-bendingly influential mind with a life's work most of us can only aspire to — cut short by the worst elements of humanity.

Nick Heer: Safari 15 and Chickenshit Minimalism

I am not a fan of the new Safari design. I am not sure I hate it, and I think I get what Apple is trying to do by combining the tab and address bar into a single element and allowing it to inherit the colour of the page. But I do not think it makes sense yet and, worse, I am concerned about some bad design patterns that are emerging.

I have struggled a bit with what to say about WWDC, because there are so many things to say. For Safari especially, it gets complicated.

In one way, it is a design that breaks with the past and moves a bit further away from computer administrative debris (moving it all to the one flank closest to your thumb), and where you don't have to tap the Share button to Find something.

But in another way, it does it by placing nearly everything, including fundamentals like reload (or showing all tabs on iPad), practically in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying "Beware of the Leopard".

Like Nick says, it's chickenshit minimalism; but beyond that, it's also something ineffable that I can't name, something that goes beyond looks for the sake of looks and borders on the unexplainable. For example: the AirPods Max "handbag/bra" charging case, which looks like crap, is made out of military-grade Easystainium and doesn't provide any protection at all; or the unsightly iPhone battery case bumps.

You can counter this with the usual suspects: "great design stretches the tastes", "ideas are fragile", sometimes it's hard to see a flower's full bloom in its first month. These are all valid, but also refer to ideas that have underlying depth and that need to be protected as they are figured out or implemented.

Let's take an example: the iPhone X-era all-screen interaction model was probably worked on for years, but we first saw it after many iterations, when it was buttery smooth and paired with hardware to enable it (iPhone X had a 120 Hz touch substrate to increase responsiveness), and I have never noticed any hiccups or slowdowns with it as devices have slid out of favor.

From day 1, it required learning and was different, but it was obvious, dependable and clearly designed with the user as an active participant and fundamental driver of the interactions. It was different not for the sake of being different, but because it could be better, and it increased fluidity, intent and speed of navigation beyond the rigidity of a button press, letting the user interface merge movement and thought to the point where it really is intuitive, because it plays on the fundamental theme of what your brain does and how your body and your senses work. I am gushing, and I still can't get over how brilliant it is. The UI has been tweaked since, but I would have no problem subsisting indefinitely on the version that shipped out of the box on a launch day iPhone X.

But the three examples I mentioned before either are immutable or affect people right now – one is in beta and will hopefully be changed, but the others shipped, and there was never any saving grace discovered; just the Backstreet Boys defense. It boggles the mind.

It's like a desire to pick a controversial decision and, by sheer force of leaning into it hard enough, somehow make it palatable and right and true, without ever needing to tackle or confront the legitimate criticisms. I brought up the iPhone X example to show that it doesn't flow through everything Apple does, not even every big surprising change, but it sure does damage where it shows up, and it completely undoes any pretensions of having the platforms be well thought-out.

(As a footnote, the iPhone X is a good example of the two things mixing — no one requested the removal of the 3.5 mm jack either, and no one's life is improved by it. There were previous phones with 3.5 mm jacks and higher IP ratings.)

Old iMessages

John Gruber, in a footnote:

Unless I’m missing something, not one piece of communication entered into evidence — from either Apple or Epic — has been anything other than an email message. Not one message from iMessage or any other messaging service. I find that very surprising. Do Apple executives never use iMessage to discuss work?

I'm guessing they just can't get to them.

Becky Hansmeyer: A Few Thoughts on the Eve of WWDC

Caught in the middle of it all, then, are the lovely Apple employees we know (or are lightly acquainted with) and love. They show us their work with such deliberation and care, such passion and delight.

[..]

I’ve said this before, but I believe one of the single most important leadership qualities is humility, which by definition requires listening. If Apple executives listen to their employees and developers, decide their requests are not in line with the company’s core values, and say as much, that is one thing, because at least it’s honest. If, however, their requests or ideas align with the company’s values, but clash with its traditions or shareholder expectations (or simply aggravate the executives’ hubris) and they dig in their heels and tighten their grips, they are rightly deserving of criticism and, dare I say, scorn. And I think they’ll find, as the winds of change continue to blow, that they’ll eventually be caught in a storm they can’t escape, driven along on a course they did not chart for themselves.

Marco Arment: Developer relations

Without our apps, the iPhone has little value to most of its customers today.

[..]

[I]n the common case — and for most app installations, the much more common case — of searching for a specific app by name or following a link or ad based on its developer’s own marketing or reputation, Apple has served no meaningful role in the customer acquisition and “deserves” nothing more from the transaction than what a CDN and commodity credit-card processor would charge.

The idea that the App Store is responsible for most customers of any reasonably well-known app is a fantasy.

I pulled myself out of this in 2008 because I hate the idea of the App Store and have scarcely been able to shut up about it since. It's easy, or at least possible, to imagine that my unending grenades are just sour grapes or fantasies unmoored from reality. Marco ships Overcast, one of the most popular podcast apps in the world.

This is what it comes down to. Epic's inability to use another payment processor is just a symptom of the same disease. Beyond the mobility of huge companies, it affects the everyday lives of developers and of customers as users – this is where we live, and Apple are not being reasonable stewards of this community.

Ars Technica: Supreme Court limits reach of hacking law that US used to prosecute Aaron Swartz

The Supreme Court issued a ruling today that imposes a limit on what counts as a crime under the Computer Fraud and Abuse Act (CFAA).

[..]

"The parties agree that Van Buren accessed the law enforcement database system with authorization," the ruling said. "The only question is whether Van Buren could use the system to retrieve license-plate information. Both sides agree that he could. Van Buren accordingly did not 'excee[d] authorized access' to the database, as the CFAA defines that phrase, even though he obtained information from the database for an improper purpose. We therefore reverse the contrary judgment of the Eleventh Circuit and remand the case for further proceedings consistent with this opinion."

[..]

But as we wrote in our story on the oral arguments, the government's argument "seems hard to square with past CFAA cases. TicketMaster's website, for example, is available to the general public. People who purchase tickets there aren't 'akin to employees.' Yet people got prosecuted for scraping it. Similarly, JSTOR doesn't hand-pick who is allowed to access academic articles—yet [Aaron] Swartz was prosecuted for downloading them without authorization."

Swartz committed suicide in 2013 when he was being prosecuted under the CFAA for downloading over 4 million academic journal papers from JSTOR over MIT's computer network.

MacRumors: Phil Schiller on App Store Knockoffs in 2012: "Is No One Reviewing These Apps?"

"What the hell is this????" he asked. "How does an obvious rip off of the super popular Temple Run, with no screenshots, garbage marketing text, and almost all 1-star ratings become the #1 free app on the store?"

"Is no one reviewing these apps? Is no one minding the store?" he ranted on, before asking whether people remembered a talk about becoming the "Nordstrom" of App Stores in quality of service.

[.. later, in 2015:]

“[this scam app] is a great example of the stuff we should have automatic tools to find and kick out of the store. I can’t believe we still don’t.”

“and PLEASE develop a system to automatically find low rated apps and purge them!!”

Oh, spin me once again a yarn about how the App Store is inherently slathered in discerning curation; so discerning that low-effort scams climb to the top of the charts, and so discerning that automated processes are dreamed up to salvage the situation – with the automatically triggered removal of already approved applications, without consideration for due process or developer impact, as the inevitable and apparently desirable outcome.

The App Store: Pigheaded, dishonest, ineffective, capricious.

Simon Willison: One year of TILs

Just over a year ago I started tracking TILs, inspired by Josh Branchaud’s collection. I’ve since published 148 TILs across 43 different topics. It’s a great format!

TIL stands for Today I Learned. The thing I like most about TILs is that they drop the barrier to publishing something online to almost nothing.

If you think this site is just me complaining about stuff, Simon Willison's collected output is pretty much always the opposite — building new stuff and being open about what's going on and what he's learning, both of which are always interesting. (I stayed subscribed to his feed through a hiatus of several years in the hopes that he would return to writing, which he did a few years back.)

The M1 is the Core Duo

Apparently the "M2" has gone into production, hot off the heels of an ExtremeTech piece about how Apple's M1 Positioning Mocks the Entire x86 Business Model.

It is a curious phenomenon that the M1 is the same chip in the Mac mini, MacBook Air, MacBook Pro 13" and now the iMac (save for the binned 7-core GPU variant). It is curious how the chip can still, to some degree, whip even higher-end Intel-based models; and it is an open question which strategy Apple will use going forward for chip differentiation.

Following the iPhone/Apple Watch model, the idea is basically that all models get the latest chip, and then next year, last year's models slide down a tier. The iPad line, having more models, instead uses a wider spread of chips, although all of them pulled from chips that were once the best.

I can't claim to have a firm grip on the future strategy, but considering the break from the past, the relative performance and the architectural bummers, the M1 to me looks closest to the first Intel CPU Apple shipped in a product: the original Core Duo, codenamed "Yonah". It was the product of Intel's backpedaling from the Pentium 4 NetBurst architecture, the inevitable endgame of always chasing CPU frequency at any cost. It equaled or outdid the Pentium 4 in performance, and it ran laps around the G4 and G5, but it was also hopelessly 32-bit.

Yonah was the first shot across the bow as Intel recalibrated; it soon led to the Core 2 (with increasing core counts) and later to the i3, i5, i7 and recently i9 marques, not to mention the bleeding of the architecture into the Xeon line. It was hard at the time not to be impressed by the Intel Core Duo, but there was also a short while where it was everywhere simply because it was the first chip out of the gate.

I don't think Apple will make more chips than they have to, and I think they're likely to keep up their idea of making a chip "this" powerful, and then building a product around that level of performance, rather than providing tiers of increasing capability. But I also think having the same chip in the Mac mini as in the iMac as in the MacBook Pro 13" is a temporary state of affairs; a child of necessity. What they will do is highly dependent on how often they wish to rev their chips, how big those revisions will be and how likely they are to make customizations. Considering the wide-ranging SoC duties, it already seems unlikely that the M1 in the iMac is the exact same as the M1 in the Mac mini.

Every dime

Bloomberg:

Music-streaming service Spotify Technology SA and Match Group Inc., which operates online dating apps, accused Apple Inc. of squeezing software developers that depend on its App Store to reach customers by extracting monopoly profits and squashing competition.

[..]

Jared Sine, Match’s chief legal officer, told senators that a few years ago, the company wanted to make changes to its app in Taiwan aimed at boosting safety for users by instituting ID verification rules. Apple rejected the app, and when Sine contacted an executive at Apple about the decision, the person “disagreed with our assessment of how to run our business and keep our users safe.”

“He added that we just should be glad that Apple is not taking all of Match’s revenue, telling me: ‘You owe us every dime you’ve made,’” Sine said.

Every dime.

I don't know which is worse: someone going full Prosciutto in terms of intimidation, or someone actually believing this.

Apple Finally Introduces Long-Rumored Accessory "Air"

In a series of vignettes, set chiefly outdoors around Apple Park, Tim Cook and numerous other Apple executives and managers finally unveiled the long-rumored Air.

Long rumored to be in development, Air is thought to have been held up by the long process of oxidation as the Earth transformed from a primordial melting pot of chemical reactions to a venue suitable for carbon-based life forms. Sources close to the project, who declined to be named, pointed to the 2007 cancellation of 64-bit Carbon as a low point, one that left the product team scrambling to recontextualize their vision. (The ex-hailed AirPower was a product of the same team, which managed only to produce Power, claimed one source.)

The presentation, clearly choreographed to show off the interplay of Air with greenery and the upcoming Earth Day celebration, also featured a series of special Hermès leather pouches starting at $299 – all with special engraving but, in keeping with prior collaborations, functionality equivalent to the standalone Air, which will be available in mid-May for $29.

Air follows the recent introduction of Samsung's Galaxphyx. Neither has been made available to reviewers, but industry followers familiar with both companies' compositions suspect Samsung's offering may be missing vital elements of the Apple offering's user experience.

Gary Neville on European Super League proposals

To me, many of the US "franchise" leagues have always seemed like vain cosplays of athletic enterprise. It's not that there's no talent involved, because I'm sure there are great individual athletes and great coaches; it's that there are no stakes.

In most leagues, in most sports, if you don't do well enough, you fall down one rung of the ladder, or you have to play a qualifier against one of the best-performing teams in the division just below. This keeps the sport fresh, keeps the teams giving their all and keeps management and players from resting on their laurels. In a US-style franchise league, if a team collapses totally, they collapse totally, but that's all that happens. In a promotion and relegation system, by contrast, a game between two teams near the bottom of the table can be exciting because there are real stakes at hand. A team on the ascent can rise through the leagues quickly, and a team that doesn't know what it's doing can plummet through them.

Today's "European Super League" proposal has been met with near-universal scorn from fans of soccer all over the world. Every team benefits from the mobility of the current system. The top leagues all over Europe (and all over most of the world) are the topmost protrusions of a deep system of similar leagues. It's about advancement and setbacks, it's about solidarity, it's about improvement over time, having goals and meeting them. It's about matching the complementary skills of different players against a tableau of similar choices from the opposite team. It's about working your way to a better place. It's about dealing with the real world, where you get more than five consecutive non-TV-ad minutes, and where sometimes you go head to head for a long time and pour every ounce of energy you've got into the struggle, and the result is a tie. It is also, more than ever, about money, of course, but no one can buy their way out of the fundamental conditions of the game.

Gary Neville, legendary defender at Manchester United with a storied career in the England national team, explains better than I can in the linked clip what the specific problems are with the European Super League.

US Supreme Court decides Google v. Oracle for Google

The ruling.

In short: reimplementing an API for compatibility is fair use in the non-overlapping magisteria of common sense and US legal doctrine.

Copyright is a subject with vast scope and consequence. Oracle's tradition has always been to use aggressive lawyers as its primary source of innovation. (Java was more innovative than anything Oracle has ever done; all Oracle did was purchase it, which is not nothing, but which does not morally justify treating events before their ownership as a personal affront.) Their angle has been the Lion King angle: everything the light touches is our kingdom, indivisible and equally covered by copyright. Both the majority opinion and Justice Thomas' dissent attach to this train of thought by treating the entire code base as a pile of lines of fungible value.

Consider a copyright case of one affluent painter v. another, with the argument that the wood grain in the frame looks suspiciously familiar, never mind whether that's where the threshold of originality is met. Or one author v. another, about whether a point-by-point debunking of a crackpot theory should be seen as an unlawful derivative because of a similar-looking table of contents, never mind whether it is a pedagogical arrangement for the reader.

Software platforms, programming languages, APIs, SDKs, modules and more all have many aspects to their construction and to their use – angles from which the entire thing can be considered. So do most creative works, but here things are made even more complicated, not only by the distinction between using a library and producing one, but by the entirely separate class of consumer in the executing computer, which by nature requires similarity to achieve compatibility.
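
To make that last point concrete, here is a minimal, hypothetical sketch of the declaring/implementing distinction the case turned on. None of these names come from the actual dispute (which concerned declaring code from the Java SE API); com.example.text.Strings and isBlank are invented for illustration.

```java
package com.example.text; // hypothetical package, not from the case

public final class Strings {
    private Strings() {}

    // Declaring code: the part a compatible reimplementation must
    // reproduce exactly. Change the name, the parameter types or the
    // return type, and every existing call site stops compiling.
    public static boolean isBlank(String s) {
        // Implementing code: the part the reimplementer is free to
        // write from scratch, in any way they see fit.
        if (s == null) {
            return true;
        }
        for (int i = 0; i < s.length(); i++) {
            if (!Character.isWhitespace(s.charAt(i))) {
                return false;
            }
        }
        return true;
    }
}
```

A program compiled against the original Strings.isBlank(...) keeps working against the reimplementation precisely because the declaration is identical; the executing computer, unlike a human reader, accepts no paraphrase. That is the sense in which compatibility requires similarity.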

Going back to those books, would simply having the same outward dimensions, typeface and paper stock be enough to call one of them infringing on the other? Or if we're satisfying humans by convention instead of computers by requirement, could a restaurant chain bring another restaurant chain to court over similar room layout and serving flow?

The ruling managed to find its way to a reasonable outcome, but if these are the tools used to chisel fundamental conditions for developers, companies and people the world over, we are all in bad shape for the future.

← Earlier posts