Take


Podcasts in Big Sur

After some trepidation, I updated to Big Sur with 11.1, which may have been too early after all.

All other things notwithstanding, revisiting Podcasts is informative. Nearly all of it seems to still hold true. Someone is awake at the switch somewhere, since Cmd+L now does jump to the "show" of the current episode, but it doesn't select the episode and scroll it into view to let you pick neighboring episodes. Indeed, it first loads an empty list and then visibly populates it with data, leaving you at the top.

Furthermore, I am now thrilled to discover that hitting Space in Podcasts no longer toggles play/pause. The Controls menu lists this as Option+Space, which is not remappable via the Keyboard Shortcuts functionality in System Preferences, since such shortcuts require modifier keys (for good reason). This breaks with convention for basically any media playback application of any form where a keyboard is available – even the full-screen media player on iPadOS reacts the right way to Space. The Music app definitely still does.

There have been changes made to Catalyst to make it a less horrible choice for building Mac applications (like opting into native or at least native-seeming controls like buttons and checkboxes), and certainly the cavalcade of odd UI choices all across the OS make the particular ones in Catalyst or Podcasts seem less weird. But it still looks, feels and behaves more like a poorly written web app, a mélange of UI goo scraped out of a foreign metaphor and allowed to set without much customization or supervision.

And regardless of UI framework, it doesn't seem like the Podcasts team has any interest in going further than making it the weak not-quite-anything port that it is. The iOS Podcasts app is redesigned more years than not, with custom interactions, animations and flows. That the macOS version can't even get to a coherent, serviceable, purpose-appropriate app is bewildering.

Oh, and the aforementioned Controls menu, when opened, beachballs for a handful of seconds – significantly more time than to launch the entire application – and then presents the menu, because when a company has only been doing pull-down menus for 36 years there's only so much you can expect.

Cydia sues Apple for anticompetitive behavior

I had the original iPhone, and could not have used it if Cydia and jailbreaking hadn't been around. Apple doesn't imbue its developer community with creativity; it largely constrains it.

🍿

We Need to Talk About Nintendo Switch Sharing

The Nintendo Switch is a wonder at this point. Applying Gunpei Yokoi's Lateral Thinking with Withered Technology, Nintendo took an ARM SoC that was already on the market and made it into the platform they needed: a cheap, semi-portable platform that was good enough and easy to port games to. Right now it fills the niche of the mainstream, Wii-like console for people who don't want mobile games, don't want to play the exclusives lottery with the Xbox or PlayStation consoles, and can't afford to get both.

The OS – or in Nintendo's words, "System Software" – is a lot better on the Switch than on the Wii consoles. For the Wii U, much was made of an early slash of boot-up time to get it down to a half-minute delay, and most of the user interface felt like swimming through corn starch. But some of the OS is still the weakest link of the Switch.

v11.0 is a great example of what's wrong. It introduces two features I've been waiting for since launch – the ability to send screen captures (images and video) to a phone or tablet, and to a computer over USB. Both of the implementations have striking flaws.

Sending to a phone does not involve using any of the Switch apps available. Instead, it involves a two-part QR code process: First, you scan a QR code to join an ad-hoc hosted Wi-Fi network. Then, when it detects that you have connected, it shows a second QR code, which is a link to a locally hosted web server which has a page with the media. This is brilliant and inspired – but it's brilliant and inspired in a 24-hour hackathon, look-what-we-can-do, proof-of-concept kind of way. It's what you do when you can't do anything else, to show that anything is possible. But it's a thoroughly horrible user experience. To make matters worse, you can only do this with up to 10 screenshots or 1 video at a time.

Sending to a computer via USB is less concerning, since it involves connecting a USB cable between the Switch and the computer and lets you have at the entire contents. The problem here is that you can't use a USB cable in the Switch Dock. You have to pull out the Switch and use the USB-C port on the Switch itself. If you could use the Dock's port, you could leave a cable in there and just connect it to your computer when you wanted to take a look. Now, it's more involved.

Both of these are advances over the state of the Switch from launch day to just over a week ago, where you were resigned to posting to Facebook or Twitter, or powering down the console and removing the microSD card (inserted behind the kickstand), respectively. And the Switch was launched under infamous hard deadlines, because of preannouncements about an "NX" console to be launched within that fiscal year. But these are features that could have been added much sooner, and could have been done much better.

The idea that comes to mind for phone/tablet sharing is to let you select as many items as you like, establish a local network connection or maybe even Bluetooth, and send them over to a new section of one of the Switch apps. (There are a few variants on how to do this and what goes where – maybe you could argue that the gallery should be streamed over to the app and the device user should handle picking and saving – but it's hard to choose a model that isn't significantly simpler, more efficient and less disruptive than what shipped.) For USB connections, maybe design choices made during the construction of the Dock mean they really are prevented from making a direct connection through the Dock work, which would make it hard to patch after the hurried launch.

Either way, I hope the upcoming Switch refresh has spent more time thinking through these features. The Switch project wasn't rushed in all aspects, considering the intricate detail of its DRM, extending through the game cards. When it comes to basic features, responsiveness and loading times, though, Nintendo's habit of acting bumfuzzled has yet to wear off.

Chris Forsythe: Growl in Retirement

Growl is being retired after surviving for 17 years. With the announcement of Apple’s new hardware platform, a general shift of developers to Apple’s notification system, and a lack of obvious ways to improve Growl beyond what it is and has been, we’re announcing the retirement of Growl as of today.

Growl was famously hard to explain succinctly to people in my experience, but I think it speaks a lot to the community that before Mac OS X contained an infrastructure for this, people banded together and built something that was widely adopted. In this way it's not dissimilar from Internet Config by Quinn! the Eskimo et al or the External Editor protocol implemented by many FTP-like applications.

Growl made notifications bearable when Mac OS X was Mac OS X – thanks, Chris.

Rent-seeker's Advocate

When Apple announced the upcoming App Store Small Business Program, under which some businesses will have to give up 15% of their app's price to Apple instead of 30%, I posted something brash in response to it. It contained humorous allegations that Apple was acting like organized crime towards those so successful as to escape the bounds of the program, had it existed at the time they did so. (Realistically, the imputed partner would have exceeded the volume within days, if not hours.)

I still maintain that within the context of the joke, and knowing the background, it's more right than it is wrong, but that's not to say that people at Apple working with marketing or the App Store or developing the underlying infrastructure go to work with the intent to do harm, repress creativity or spoil livelihoods. I regret any discomfort that I have caused in those people, especially when there are both individuals and organizations that are explicitly hell-bent on exploitation. But that said, what are we to do when that's what the end-result nevertheless is?

Seen from one perspective, Apple heard the feedback from developers and launched the program with clear and pure intent to build a better relationship with them. Seen from almost every other perspective, it is one in a series of ploys intended to protect, tooth and nail, a monopoly from being taken from them – a monopoly which ostensibly provides a marketplace for the enjoyment of developers and users, but which consistently serves themselves above any other party. Apple is as sincere in its determination to go to war against anyone who would change that as it was in its previous "thermonuclear war" against Android as "a stolen product".

On the subject of accessibility, I have heard of a scale of "situational, temporary, permanent". Making colors contrast enough or text sizes bigger, for example, helps people with permanent vision problems, but it also helps those who are trying to work an outdoor touch screen with glasses fogged from mask-wearing, those whose medical issues may flare up and ruin their vision temporarily, or those who have misplaced their reading glasses. By improving an aspect of software for some, you also end up improving that aspect of it for many others. Or in other words: actions have consequences, and it's hard to wall them off.

Apple's desire for control is intended to stem some permanent behaviors seen as unseemly, like pornography, gambling or ethically murky business propositions. But the net cast by those controls also ends up catching the situational and the temporary. Do you want to release an application under a pen name? No, fat chance, you must have something to hide, go live your life on Android instead. Do you want to use a behavior or technique that, in the wrong hands, could be abused? Nuh-uh. Do you want to just plain not be subservient to the consequences of these rules and the capriciousness emergent in their application, or disagree with their definition of what is unseemly because you are not the lowest common denominator by which the rules are applied? Build your own damn platform.

Jamie Zawinski has long posted a message to Facebook employees to the effect that being at Facebook is being complicit in the consequences of the platform. While I wouldn't go that far: when I tried to disengage from Google (the search engine itself), I ended up using DuckDuckGo, which, although a privacy-conscious provider, also enlists the floundering search engine from Microsoft – which is also one of the companies trading blows for the world's highest market cap, which helps itself to various and sundry data from every Windows 10 user (and shows them Start menu advertisements even if they paid hundreds of dollars for an operating system license). This is Alice in Wonderland.

In short, it is becoming incredibly hard to not surrender some part of your life to five-to-ten technology companies. So, no, maybe going by a particular letter interpretation of a particular definition of a particular law, what Apple has is not a monopoly. But by any sane way of looking at the status quo, it's not a healthy, customer-serving market either. Apple is acting the way they do because they get away with it, and mix semi-serious fealty to community-minded activities and values with bog standard profit-maximizing-at-all-costs tax planning and only caring about their own house and their own situation.

The pragmatic way to go for Apple, if it still believes it can keep this house of cards standing, might be to allow App Stores themselves into the App Store. For example: get Steam, and you can download any game or app from Steam that Valve has their own approval process for, which may include games not otherwise allowed by the App Store, and neither Steam nor the Steam-sourced apps would have to use Apple's In-App Purchase platform. It would also presumably be heavily policed to ensure that it wasn't letting misleading, dangerous or fraudulent apps through – an idea so good, it is second only to the idea that the App Store itself might competently enforce those policies on the App Store's own listings. But speaking of that, this would also open for a pet theory of mine, that with other app stores allowed and available, the Apple App Store itself could be free to truly only allow the apps matching a higher standard.

Monopolist, Wary of Being Recognized As Such, and Convicted of Same, in Transparent Ploy to Confuse, Announces Plans to Collect Half as Much Rent That They Were Never Entitled To, Assuming That You Don't Magically Find Unexpected Success, In Which Case You Know What, We Think We're Gonna Help Ourselves to Some of That Rent Anyway

Ingredients:

  • 20% long overdue but underpowered adjustment
  • 30% confusing and capricious rules
  • 30% opportunistic PR
  • 20% Fuck Everyone Big Enough That They'd Be Able To Have An Effect In Public Opinion, You Know Who You Are, Epic Spat That Shit and You Didn't Kiss The Ring, Now Eat Shit You Bunch of Cocksuckers, Except For You League of Legends But We Already Have That Backroom Deal

Matthew Green: Ok Google: please publish your DKIM secret keys

An accident of the past few years is that this feature has been used primarily by political actors working in a manner that many people find agreeable — either because it suits a partisan preference, or because the people who got “caught” sort of deserved it.

But bad things happen to good people too. If you build a mechanism that incentivizes crime, sooner or later you will get crimed on.

Email providers like Google have made the decision, often without asking their customers, that anyone who guesses a customer’s email password — or phishes one of a company’s employees — should be granted a cryptographically undeniable proof that they can present to anyone in order to prove that the resulting proceeds of that crime are authentic. Maybe that proof will prove unnecessary for the criminals’ purposes. But it certainly isn’t without value. Removing that proof from criminal hands is an unalloyed good.

Well-argued post in favor of large mail providers (like Google) rotating their signing keys periodically and publishing the old secret keys after the fact. While it sounds insane or at least reckless, it removes a point of tension that right now is hurting people, and does not materially hurt the purpose for which DKIM was invented – to authenticate legitimacy as emails are being delivered. And as mentioned:

A paranoid reader might also consider the possibility that motivated attackers might already have stolen Google’s historical secret DKIM keys.

Cryptography, its applications and its consequences are very subtle, and it's hard to get one benefit in isolation. Rather, you buy into a set of behaviors, some of which are not immediately obvious. We are used to encryption establishing privacy, but deniability is also an aspect of privacy. I wonder if the initial threat modeling of DKIM foresaw this, or if it maybe saw the unintentional consequence as an unconditional net good.
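To make the deniability argument concrete, here's a toy sketch in Python. Real DKIM signs with RSA or Ed25519 and publishes only the public key in DNS; the stdlib hmac module stands in here purely as an illustration, since the argument only needs one property – whoever holds the signing key can mint valid signatures:

```python
import hashlib
import hmac


def sign(key: bytes, message: bytes) -> bytes:
    """Toy stand-in for a DKIM signature (real DKIM uses RSA/Ed25519)."""
    return hmac.new(key, message, hashlib.sha256).digest()


def verify(key: bytes, message: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, message), sig)


key = b"december-2020-selector-key"

# At delivery time the signature does its job: it authenticates the mail.
mail = b"From: ceo@example.com\r\n\r\nQuarterly numbers attached."
sig = sign(key, mail)
assert verify(key, mail, sig)

# Once the provider publishes the old key, anyone can sign anything with it,
# so a leaked mail-plus-signature no longer proves the provider sent it.
leaked_key = key  # now public
forgery = b"From: ceo@example.com\r\n\r\nPlease wire the funds immediately."
forged_sig = sign(leaked_key, forgery)
assert verify(key, forgery, forged_sig)  # indistinguishable from the real thing
```

Once the forgery verifies just as well as the original, a dumped mailbox stops being cryptographic evidence – which is the whole point of publishing the keys.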

Apple Silicon: Inaugural M1 Macs

To sum it up: Intel has been embarrassing itself for the past few years, Apple has been building up an impressive silicon know-how for the past decade, and ARM architecture chips have been inherently power-efficient since they were invented.

The M1 versions kick the Intel Core versions around the block a few times in most workloads, and the integrated GPU cores have been receiving tuning and tweaking as part of pipelines targeting Metal on iPad and iPhone for years. There's also no reason to doubt the statements about power efficiency and performance-per-watt increases. And having a completely fan-less MacBook Air in a way that is not just a dangerous fool's errand is a remarkable achievement.

There are still open questions, though:

  • The M1 leans into UMA, a "unified memory architecture" where the memory is shared between the GPU and CPU. That's great most of the time, but how is performance affected when both the GPU and CPU draw lots of resources?

    Will future M chips allow off-SoC memory in the first place, or off-chip GPUs? (eGPUs are listed as incompatible with M1. Update: I was mistaken; only the MacBook Air M1 is incompatible, probably because of the lack of thermal envelope. Update: Apple's page was mistaken, as now no M1-powered models are listed at all.) Being the fastest integrated GPU is impressive, but it's also like walking the fastest; there are plenty of chips out there that can run.

    The M1 tops out at 16 GB now and the previous Mac mini offered up to 64 GB; how far north will the included RAM go in the future?

  • Mac mini loses out on two additional Thunderbolt/USB-C ports and 10 Gbit Ethernet. Will this type of functionality simply drop off the radar?

  • Both M1 MacBooks use the ISP to clean up the image, but still use 720p webcams? What year is it? Has no one heard of "garbage in, garbage out", or are they betting that extra chip functionality will solve literally every problem?

The M1 is the first step of the transition; the opening salvo instead of the crescendo. There will be surprises and revelations ahead, and I look forward to seeing genuinely new designs within the next year, including next spring and during WWDC.

Don't forget: The Macs that were revved today were from the bottom part of the lineup. For all we know, the M2 or M1X containing both faster cores and the additional infrastructure to do fuller system architecture are still being finalized. It takes time to build out the chip capabilities from iPhone/iPad-level chips to something capable of handling all subsystems in all Macs. PCI Express (used by Thunderbolt/USB4) and virtualization support are two examples of features already present in the M1 that no previous Apple SoC has needed to support.

Apple Silicon: The Roads Not Taken

[Please note: This was written ahead of today's announcements.]

My track record on predicting things out of the blue is pretty spotty, so here are a few things I can imagine but that will probably not materialize.

  • "Apple Pi"

    Raspberry Pi-like, "tinkerer-friendly" Mac, for under $100.

    Compare the prices of most single-board computers: the x86 models are steadily either significantly more expensive, or running four-year-old Intel Atom CPUs, or both. Not only are ARM processors free of the issue of having to keep Intel afloat, but Apple itself has had experience putting out small SoCs in surprising places.

    If they were to do this, chances are they'd make it all about hosting stuff on iCloud and writing code in Swift (maybe using a connected iPad). I don't quite see how it can be both what the Raspberry Pi crowd likes and what Apple likes at the same time. Apple's not interested in enabling tinkering. It's interested in making kids code, but on a high-margin iOS device and up. With the way macOS has moved recently, there's little making this a Mac as such, but it's more a Mac than iOS/iPadOS.

  • "Mac nano"

    A Mac mini the size of the Apple TV, for $199, with 4GB RAM, 64/128 GB of iPhone-like storage, hardly any I/O, and probably an A12, A13 or A14. BYODKM – hook up the display with HDMI or USB-C, hook up keyboard and mouse wirelessly or with a USB-C hub/adapter.

    The old Steve Jobs quote was "we don't know how to make a $500 computer that's not a piece of crap", and Apple can now comfortably pack in the computational power for an okay enough experience for what people are likely to plug into it. As long as it runs the software well enough, it's a candidate to bring people over from Windows, and they're about to lose the fallback "if all else fails you could use it as a Windows PC"; it needs to be cheaper.

    ("Mac SE" was already taken.)

  • An affordable Mac mini

    Take the current Mac mini, make it a bit smaller and make it affordable. Again – the Intel tax is gone, and Apple, if they want to, can churn out silicon in large scales by themselves already. The first Mac mini was $499 – there's no reason the first ARM Mac mini can't be.

All of these products essentially are based on this: there's an Apple that makes iPhones for $399 with industry-leading performance, and there's an Apple that sells wheels for almost twice that price. It's up to Apple to define what they want to sell and how they want to market it, and heading into a transition where you drop a hardware partner for your own designs is a perfect time to choose a new tack.

Say what you want about whether Apple wanted to offer lower-end products before – the price-to-performance ratio with Intel never made much sense. And if a Celeron or Atom didn't exactly scream high enough performance, neither did PowerPC chips that were lower-end than the ones they put in their low-end Macs back in those days. In a way, Apple hasn't had the opportunity to tackle this head-on for at least 20 years or so, so we don't really know that the idea has been rejected by Apple rather than by circumstance.

Mister President-Elect

They just elected to tell the monkey "you're going to have to turn that machine gun in".

(I have a longer post with a lot more going on, but it's not coming together, and there's a limit to how late I can post this and still be reasonably timely. I also have other things to post about and I don't want to not have acknowledged this when I do.)

Sean Connery dead at 90

Much will be said about his defining tenure as James Bond, of his dialect, of how people are both shaken and stirred.

For me, Sean Connery redefined the nature of creativity. If you haven't already, watch Finding Forrester this weekend. Its teachings sit deeper with me than the ur-meme and subsequent genre that it birthed, although we may hopefully all instinctively understand it a bit better by now.

Oh Gee

First, Apple's come a long way from calling carriers "orifices". Without checking, I'd guess Verizon got more stage time than the lidar in the iPhone 12 Pro, where it assists autofocus and plays a big role in magically making photos work out even for people who have never knowingly 3D scanned something in their life.

But more importantly, the sense I've got is that 5G isn't a dud technology but that it really only provides its advantages in areas where it's really well built out. With mmWave, the intense half of the technology, that means literally if you have line of sight to a tower. I can see why they focused on stadiums and NFL, and why Apple picked a carrier who could say "we're now rolling this out for real", to get past any argument that it's been bad until now.

All of the presentation and indeed the product site itself are packed with demonstrations where all of these things will download completely before the touch debounces, before the foot hits the ground, before the next time the hummingbird flaps its wings. It's not just in the flashy montage video – Apple is setting the expectation that this is how life will be with 5G, and that requires that these speeds are realistic and omnipresent and dependable. Either they are right or it will be worse than this.

If they are right, it's a huge leap forward for 5G since it's not the known, measured opinion of anyone with access to a realistic 5G network today, never mind the people in the coverage map's vast shadow. If it's worse, are they going to hide behind people not having Verizon as a carrier, or not being one of the 200 million Americans ostensibly covered by the built-out non-mmWave network? The iPhone is a worldwide product, and even most people in the US don't use Verizon, and for all the US-centricity, these days the experience of using it rarely comes with great big asterisks.

If this all is a great big "fuck you" to AT&T over "5GE", though, good job.

Sony tears apart PlayStation 5

I've never owned a PlayStation and maybe never will, but this was an interesting overview. A crazy mix of conventional PC and bespoke architecture – and the CPU is cooled with liquid metal! (Which, from what I understand, requires extreme care because it's also electrically conductive.)

Home

Far too often, the soul of someone who happens to be a Mac user is seen through the lens of corporate communications — about crazy ones, misfits and round pegs in square holes.

I'm a Mac user and have been so for most of my life. Growing up, System 6 was a staple, but I also remember a Compaq portable with the mouse trackball at the right side of the screen and the mouse buttons on the lid, and eventually using Windows 95 and 98. I came back to Mac with Mac OS X when I left a really powerful PC for a computationally dinky Aluminum PowerBook G4, and have not owned a full-on desktop since, even if I do own a NUC and a few Raspberry Pis.

I have recently been in a mode of deep (Windows-based) user interface focus at work, and was describing Panic to someone when something came over me. I love these guys. I love the attention to detail in every large, small and medium thing, the time put into making an application that feels right and flows right, that's easy to use, that has just enough user interface that you can get done what you need to get done, that has style, function and whimsy.

Panic is just an example, but there are so many of their ilk that I can point to. Many Tricks and their excellent Witch; Ranchero's NetNewsWire; Omni and their myriad of productivity tools; Noodlesoft's Hazel; and so on, and so on, and so on.

I've used Windows every day for as long as I can remember in one way or another. I can find my way around there as well as on any other platform. But while on the world's biggest desktop OS, I still feel constrained by a meandering vision, by a lack of common conceptual ground, by the infuriating feeling that so much is built and left unfinished, unpolished and put out to rot.

The Mac gets a lot of flak from people who are nose deep in technical specifications and price matchups. What they don't see – or aren't interested in – is the intangible: the culture in which people with big dreams and small means have made the unconventional available, the complex seemingly simple and the advanced accessible. This culture doesn't live or die by Apple in particular, although the original Macintosh being a product of a similar mindset helped set the tone. This culture produces things that are hard to find elsewhere, not because it's technically impossible to do, but because the values that drive those other platforms produce different outcomes.

I am upset with a lot of recent technology, because it all seems intent on burying history as part of remaking the world. Not everything new is bad or worse than what came before, but so many important learnings are being thrown out. You can't make a web app without first filling it with a big framework to implement basic interactions, and most of them lose the tactility and the richness of most native interfaces of any platform. You can definitely build good web interfaces driven by snappy and well-thought-out engines, but it takes intense focus and hard work to do so, and it's easier (but not cheaper) to just throw in Bootstrap and work just as poorly as most other web sites. Electron takes all this and wraps it up in a computationally horrible footprint, under the insulting guise of "native".

iOS, iPadOS, watchOS, tvOS and now macOS with Big Sur – all the recent advancements seem to come at the expense of the wide berth that used to produce great results. The freedom that allowed a seamless experience is chopped up by security concerns ham-handedly and haphazardly applied, and on most platforms most of the time topped with having to pass the needle's eye of a trillion dollar enterprise's hungry bean-counting and control. All for the purpose of being a populous platform for its own sake; for having more apps that cost little, grey gruel instead of food; for padding a bottom line if you're cynical, or stroking a corporate self-image run amok.

The reason, all these things considered, that I haven't left these platforms yet is that there's still the feeling of being in a garden of my own cultivation. I can control every nut and bolt and swap out infinitesimal details and fundamental building blocks in Linux, that's true. But that means that people do, and you end up worrying about technological fundaments because of this uneven foundation. This soil does not bear great fruit, efforts by GNOME etc notwithstanding, and the culture lionizing the endless flexibility of the command line and the architectural purity of UNIX gives an easy escape hatch for any problem.

Windows is seemingly more stable in this aspect, but while I am able to live in that house, I am not able to make it my home, and it's not for a lack of trying. Microsoft's repeated wallpaper-stripping and ever-changing priorities make it feel like an enormous mansion under constant renovation, with uneven floors, studs poking through the walls and fundamental features left broken or half-finished since the last time they cared. (The less said about the impressionistic "Fluent" wing entirely in featureless acrylic, the better.)

The culture and the people and the shared values and what it all comes together to produce. That's why I'm still here. You can live in many houses, but not all of them will ever feel like home. I'm upset with the landlord and the building manager who ignores leaking pipes and oiled floors catching on fire while upping the rent and turning a blind eye to hustlers running Three-card Monte, but aside from that, I love the neighborhood, I love the surroundings, I love that they value the things I do and I love what it can build over time.

Joseph Gentle: "I was wrong. CRDTs are the future"

What’s the JSON equivalent for realtime editing that anyone can just drop in to their project? In the glorious future we’ll need high quality CRDT implementations, because OT just won’t work for some applications. You couldn’t make a realtime version of Git, or a simple remake of Google Wave with OT. But if we have good CRDTs, do we need good OT implementations too? I’m not convinced we do. Every feature OT has can be put in to a CRDT. (Including trimming operations, by the way). But the reverse is not true.

I have been looking sideways at technology like OT for the past 10 years or so, wondering if it'll ever be part of anything I do. CRDTs are refreshingly simple, and have less of that ominous "if you fuck this up, you fuck everything up" feeling, but they've been limited to simple, known-good constructions similar to concurrency primitives.

There is a layer of subtlety and complexity that will never go away with real-time editing, but I hold out hope that there can be CRDTs that encode larger patterns too, and can be used to compose larger state graphs in a way that keeps the choices you have to make right in front of you and avoids "trap doors".
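For a feel of why CRDTs read as refreshingly simple, here's a minimal sketch of a grow-only counter (G-Counter) in Python, one of those known-good constructions similar to concurrency primitives. The class and names are illustrative, not from any particular library:

```python
# A G-Counter: each replica increments only its own slot, and merge takes
# the element-wise max. Merge is commutative, associative and idempotent,
# so replicas converge no matter in what order (or how often) they sync.
class GCounter:
    def __init__(self, replica_id: str):
        self.replica_id = replica_id
        self.counts: dict[str, int] = {}

    def increment(self, n: int = 1) -> None:
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def value(self) -> int:
        return sum(self.counts.values())

    def merge(self, other: "GCounter") -> None:
        for rid, n in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), n)


# Two replicas diverge offline, then sync in either order and converge.
a, b = GCounter("a"), GCounter("b")
a.increment(3)
b.increment(2)
a.merge(b)
b.merge(a)
assert a.value() == b.value() == 5
```

The whole trick is that the merge function needs no history and no coordination – unlike OT, where correctness depends on transforming operations against every concurrent operation in the right way.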

Steaming

Three quick things:

  1. Why, when installing a Steam game on macOS, do I get the option to "Add to start menu"? macOS doesn't have one. (And on Windows, it's called the Start menu.)
  2. Why, when the Steam app has found an update and properly badges and bounces the Dock tile, does clicking the tile or switching to it by any other means hide the update window? My guess is that the switch-to-the-app trigger invokes "show the main window at all costs". The window also is not available in the Window menu for switching to, or seemingly visible in any other way (Witch doesn't seem to find it).
  3. Why the hell is Portal 2 (and the other handful of first-party Valve games) still not cross-compiled for 64-bit macOS, so that it is still playable now, and would be playable (at least for a few years) on future Apple Silicon Macs? This is not rocket science, and Valve wouldn't even have to do the technical work themselves.

The first two should have been found within the first week of actual use. All of them should have been taken seriously and fixed as soon as possible.

Jordan Rose: Objective-Rust

Yep, that’s Rust code with embedded Objective-C syntax, and it works. Why would you do such a thing?

Fast Company: "Facebook is quietly pressuring its independent fact-checkers to change their rulings"

The video was notable because it had been shared by Lila Rose, the founder of antiabortion group Live Action, who has upwards of five million Facebook followers. Rose, leery of restrictions on her page and handy with claims of Big Tech censorship, quickly launched a petition protesting what she alleged was bias by Facebook’s fact-checking partner, a nonprofit called Health Feedback. Soon, four Republican senators, including Josh Hawley of Missouri and Ted Cruz of Texas, wrote a letter to Zuckerberg condemning what they called a “pattern of censorship.” They demanded a correction, a removal of all restrictions on Lila Rose and her group, and a “meaningful” audit of Facebook.

Soon, the fact-check labels were gone. A Facebook spokesperson told BuzzFeed News at the time that the labels would be removed pending an investigation “to determine whether the fact checkers who rated this content did so by following procedures designed to ensure impartiality.”

How very independent.

MacRumors: "WordPress for iOS Was Blocked From Updating Unless it Agreed to Add In-App Purchases for .Com Plans"

This is getting stupid.

Either Apple are somehow trying to find one interpretation of their rules and letting it percolate through all apps as updates are processed, or they have gone stir-crazy, stark raving mad with greed and/or self-conscious paranoia and have whipped themselves into some sort of draft-dodger-hunting frenzy. Neither is a good look and neither serves the user.

(The same thing as always serves the user: letting people build the app they want, the way they want it and shipping it to their users and customers.)

This serves literally no one, not even Apple.

WSJ: "News Publishers Join Fight Against Apple Over App Store Terms"

MacRumors summarizes:

For Amazon Prime Video, Apple offered Amazon a special deal where it took just 15 percent of subscription revenue, and the publishers want the same deal. The letter asks Apple to "clearly define the conditions" that Amazon met to garner that agreement.

Clearly this must be some kind of mistake. All developers are treated equally by Apple, and no one ever gets to skirt the rules or get a better deal – at least according to congressional testimony under oath.

Appic

I've had trouble finding things to say about Apple v. Epic, wherein Epic launched a clearly choreographed and premeditated series of actions: adding a method of payment that was not allowed by the App Store guidelines (and some of their analogues on other platforms), getting this update taken down and Fortnite delisted, filing legal action against Apple for monopolistic behavior, being told that their Apple developer account for all platforms would be terminated in 14 days, appending more legal evidence, and finally being told by Apple by way of a public statement, more or less, that "it doesn't have to be like this".

This should be catnip for me (or "app nip", if you will). In some ways the same fight is being waged as before, but with high enough stakes that some damage may be done. But instead it just makes me uncomfortable.

Apple and Epic are two peas in a pod – both large companies staffed with many talented, capable people who provide an equally capable platform, on top of which many other people can build projects that do things they wouldn't do otherwise, and both companies that have mastered the ability to extract just enough money from this practice to not look like Oracle outright. Neither can plausibly look like a scrappy underdog, and instead of stumbling into this situation through emergent arbitrariness, it was instigated as a public relations operation which looks, sounds and smells like a public relations operation, and only really excels at pointing out the hypocrisy in the other 800 pound gorilla's public relations self-image. In short: it is an excellent point, fight and narrative, ruined by details, circumstance and participants.

But what has sat with me for a bit is the wording used in Apple's recent olive branch: "The problem Epic has created for itself is one that can easily be remedied if they submit an update of their app that reverts it to comply with the guidelines they agreed to and which apply to all developers." Combined with the power in Apple's grasp, their immense size by nearly every possible metric, and the policies used in the store today, theirs is the language of the person on the wrong side of history.

Imagine a digital company town, all profits feeding the owner and all salaries spent in the company store. Why, the worker may have created a kerfuffle with their funny-sounding, foreign ideas about "safety standards" and "unions" and "5 day work weeks", but they could simply drop these corrosive ideas and go back to being a worker at belt 33, and The Company will be magnanimous enough to forgive. Maybe.

Imagine an industry town racket, all shops and cafés and barbers and bakers feeling their hearts sink at the tap on the glass. It's time, have you forgotten? 30% of all profits. Hm, this seems a bit light, are you holding out on us? I'll let it be this time, but you better be playing us straight. This town is full of bad luck, and you're lucky to have someone like us to look out for you, making sure you don't end up in it. Remember what happened to Jimmy? Terrible.

Imagine an unkind world, denying to you the dignity of personal enumeration, of individual treatment, of suffrage, of education, because of the makeup of your chromosomes, the area of your origin, the color of your skin, the spirit and morals from which you channel strength. Why would you demand more – don't you know these are not innate rights, but benefits conferred to people who are the way people should be? Put down your petitions and your aspirations, and be of use, unilaterally and unconditionally, to us, and if you seek clemency and bother nobody of import, so may you find, in time, gracious recompense for your efforts. As long as you remember the source of your fortune.

The App Store is not any of those things, and Apple are not tyrants (and have indeed stood up against some tyranny, albeit in a curiously specific way). Apple are about as much of a net good in the world as a company that runs an App Store can still credibly be. Nor are Epic doing more than LARPing not being in control of their own ship. Only they know whether they actually care about other companies; I'm willing to entertain the notion that they do, but regardless they're not out on the corner with a paper cup, so why the above?

Because the proportions are so lopsided, even against Epic, that Apple, as one of the world's biggest companies with one of the world's biggest platforms, can't help but speak in the same tone. It's not a choice; it's the weight of circumstance, sprinkled with years of history. That's what monopolies do – they turn every opportunity and every proposal into a transaction that furthers the monopolist, transferring power and influence and means chiefly in one direction. Power corrupts, and a monopoly is a centralization of power, its presence permeating inescapably, even if you're plopped down in the position by mistake and not by malicious pursuit, even if you're good and well-meaning.

If that's what you spend 12 years doing and wish to run an operation that's not just benevolent on the surface — with extremely hand-picked scenarios that pretend the world of software is confined to the four companies whose work your keynote features and ignore the existence of the Internet while training people to accept a significantly lower price of purchase which maybe would work out if only more people could find your app — but actually a beneficial proposition for both parties, you have to fight with every fiber of your being, with every action along the road, to counteract the balance tipping towards you, and to empower the developers so as to empower users by the fruits of those labors. They not only haven't done that, they've thrown obstacle after obstacle in the way of the developers, in the cause of advancing some hare-brained strategy or apparent unity or surface simplicity. Was this by active, premeditated malevolence? Almost never, that I've been able to tell. Does it affect, move or excuse the outcome? Not that, either.

The App Store is a corrupt state and it deserves a revolution, but it can't be started by the cousin of the Minister of Energy who happens to own half the refineries, and it can't be given credibly sustained traction by the playbook that spends time aping Chiat-Day.

Lesszilla

Mozilla's PR statement:

As I shared in the internal message sent to our employees today, our pre-COVID plan for 2020 included a great deal of change already: building a better internet by creating new kinds of value in Firefox; investing in innovation and creating new products; and adjusting our finances to ensure stability over the long term. Economic conditions resulting from the global pandemic have significantly impacted our revenue. As a result, our pre-COVID plan was no longer workable. Though we’ve been talking openly with our employees about the need for change — including the likelihood of layoffs — since the spring, it was no easier today when these changes became real. I desperately wish there was some other way to set Mozilla up for long term success in building a better internet.

Michael Tsai reports that most of the Servo team – tasked with revamping, improving and rewriting the rendering engine in Rust, indeed the explicit purpose for which Rust was invented – has been laid off.

Mozilla was, and is still for an indeterminate amount of time, the check and balance on Apple and on Google, the two remaining browser engine competitors. Both have perverse incentives to turn the web into their own platform, to make the web not compete with their own platform, or to make the web look and behave like their own platform. Mozilla has often fronted technologies that advanced the web, be it adopting Microsoft's XMLHttpRequest as a native object or spearheading CSS Grid and WebAssembly. The success and failure of web APIs and new developments are dependent on what this triumvirate decides; it's worth keeping the most independent, most pro-user vendor of the three a strong alternative.

I have no interest in most of Mozilla's offshoots like the Pocket app or iOS Firefox, but I will likely switch to Firefox and find a way to support them as a manner of principle. I should have done it much sooner.

As the Swedish saying goes, man saknar inte kon förrän båset är tomt; you don't miss the cow until the stall is empty.

Armored

Apple enthusiasts are in the rare period where we know something new and big is definitely coming, in that an architectural shift is on its way but isn't here yet. And for all the talk about Apple Silicon and various ways it can shape future Mac models, I think there hasn't been enough time spent looking back on one of the winners of the previous transition: the original MacBook.

The original MacBook was the successor of the iBook: white, plastic, budget laptops that were still Macs, but that were almost comically incapable at every point I evaluated them. I got into Macs for my own use with an aluminum PowerBook G4, and the G4 had a famously slow "front-side [system] bus" at 167 MHz that bottlenecked it. Every iBook was further bottlenecked with smaller screens (including a choice of two screen sizes with the same resolution), limited optical drives and lower specs overall.

The MacBook, in contrast, announced in spring 2006 out of nowhere and without any fanfare, was a revelation. It boasted the same dual-core Intel Core Duo as the MacBook Pro (contrary to speculation that the iBook successor would surely get the Core Solo that eventually ended up only in the Mac mini), and at about the same price range and with a far more solid build, it was enough for me to upgrade without scare quotes from a reasonably full-featured PowerBook G4 to a base model MacBook. It also introduced the first chiclet-style laptop keyboard in Apple's lineup and a magnetic hook-less latch, as well as picking up MagSafe and a built-in webcam from its bigger sibling. The top-of-the-line laptop becoming markedly faster was huge news; the piddling mass-market model becoming a bona fide competitively priced speed demon may have made a difference to more people, and done a lot more to expand the Mac market base.

The Apple Silicon transition holds many potential effects, some of which don't bode well for extensibility, modularity and people's general day-to-day dependencies on the x86-64 architecture in practice. But Apple wouldn't make this transition if it couldn't bring this sort of overhaul or comparative before-after difference to at least two or three of its lines. It may not be the case that an Apple Silicon SoC can run circles around a 64-core, 128-thread AMD Threadripper, but it is both plausible and achievable that it can deliver a faster processor that runs cooler while sipping battery and break free of the oppressive thermal equation of recent Intel chips. When you suddenly have room to breathe, it becomes a platform to build a better product around. You can swap some of the battery space for other componentry, use the battery space to achieve significantly longer battery life – or (knowing Apple) just make the whole thing smaller/thinner, which is still a good choice for some products even if it isn't the best choice for every product.

Whether or not the new "original MacBook" will be the return of the MacBook (2015), only-this-time-it's-a-good-idea, is anyone's guess. But the only thing anyone saw coming about the original MacBook, working backwards from the already released original MacBook Pro, was the name.

Fran Allen dead at 88

An industry giant who got far too little air time and consideration.

Fran led a team at IBM that pioneered optimization in compilers, including automatic parallelization to feed a convoluted architecture, decades before UNIX or C. Her work represents a road not taken – one that many, myself included, regret.

The chapter about her in Coders at Work is recommended reading.

Apple: Phil Schiller advances to Apple Fellow

Apple Fellow positions are very rare and given to people who have made significant contributions. Phil is a "marketing" person by title, but – as evidenced by Jeff Williams, the current COO, having effectively run product ownership of Apple Watch – Apple, like many great creative companies, attracts "T-shaped", multiply-talented people, and in many ways allows them to exercise those talents and influence decisions without regard to job title.

Phil notably personally advocated to put a rotary/circular scroll wheel in the iPod, a decision that made it easier to scroll indefinitely through the long lists produced by putting 1000 songs in a device with a small screen, and which differed from the jog wheels and static buttons used at the time. I'm sure there are many other marks that those in the room will recognize and hopefully one day tell the tale of.

Congratulations to Phil — you've earned it.

← Earlier posts