Take


Matthew Green: Ok Google: please publish your DKIM secret keys

An accident of the past few years is that this feature has been used primarily by political actors working in a manner that many people find agreeable — either because it suits a partisan preference, or because the people who got “caught” sort of deserved it.

But bad things happen to good people too. If you build a mechanism that incentivizes crime, sooner or later you will get crimed on.

Email providers like Google have made the decision, often without asking their customers, that anyone who guesses a customer’s email password — or phishes one of a company’s employees — should be granted a cryptographically undeniable proof that they can present to anyone in order to prove that the resulting proceeds of that crime are authentic. Maybe that proof will prove unnecessary for the criminals’ purposes. But it certainly isn’t without value. Removing that proof from criminal hands is an unalloyed good.

Well-argued post in favor of large mail providers (like Google) rotating their signing keys periodically and publishing the secret keys after the fact. While it sounds insane, or at least reckless, it removes a point of tension that is hurting people right now, and it does not materially hurt the purpose for which DKIM was invented – authenticating email as legitimate while it is being delivered. And as mentioned:

A paranoid reader might also consider the possibility that motivated attackers might already have stolen Google’s historical secret DKIM keys.

Cryptography, its applications and its consequences are very subtle, and it's hard to get one benefit in isolation. Rather, you buy into a set of behaviors, some of which are not immediately obvious. We are used to encryption establishing privacy, but deniability is also an aspect of privacy. I wonder whether the initial threat modeling of DKIM foresaw this, or whether it treated the unintended consequence as an unconditional net good.
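To make the deniability point concrete, here's a minimal sketch of my own (using Python's cryptography package and a bare RSA keypair as a stand-in for a DKIM selector key; no real DKIM canonicalization): once the private half is published, anyone can mint a signature that verifies against the well-known public key, so a verifying signature stops proving anything about who wrote the message.

    # Illustrative sketch only: a bare RSA keypair standing in for a DKIM selector key.
    # Once the private key is published, *anyone* can produce a signature that verifies
    # against the public key that was in DNS, so old signatures stop being proof.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # The provider's signing key for some past selector, later published on purpose.
    published_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = published_private_key.public_key()  # what verifiers fetched from DNS

    # A forger holding the published key can sign any message they like...
    forged = b"From: ceo@example.com\r\nSubject: Totally real\r\n\r\nWire the money."
    signature = published_private_key.sign(forged, padding.PKCS1v15(), hashes.SHA256())

    # ...and it verifies exactly like a genuine one, which is the whole point.
    public_key.verify(signature, forged, padding.PKCS1v15(), hashes.SHA256())
    print("Verifies fine, and proves nothing about who actually wrote it.")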

Apple Silicon: Inaugural M1 Macs

To sum it up: Intel has been embarrassing itself for the past few years, Apple has been building up an impressive silicon know-how for the past decade, and ARM architecture chips have been inherently power-efficient since they were invented.

The M1 versions kick the Intel Core versions around the block a few times in most workloads, and the integrated GPU cores have been receiving tuning and tweaking as part of pipelines targeting Metal on iPad and iPhone for years. There's also no reason to doubt the statements about power efficiency and performance-per-watt increases. And having a completely fan-less MacBook Air in a way that is not just a dangerous fool's errand is a remarkable achievement.

There are still open questions, though:

  • The M1 leans into UMA, a "unified memory architecture" where memory is shared between the GPU and CPU. That's great most of the time, but how does performance hold up when the CPU and GPU both lean heavily on that shared memory at once?

    Will future M chips allow off-SoC memory in the first place, or off-chip GPUs? (eGPUs are noted as incompatible with M1. Update: I was mistaken; only the MacBook Air M1 is incompatible, probably because of its limited thermal envelope. Update: Apple's page was apparently mistaken, as now no M1-powered models are listed at all.) Having the fastest integrated GPU is impressive, but it's also like being the fastest walker; there are plenty of chips out there that can run.

    The M1 tops out at 16 GB now and the previous Mac mini offered up to 64 GB; how far north will the included RAM go in the future?

  • The M1 Mac mini loses two Thunderbolt/USB-C ports and the 10 Gbit Ethernet option of its Intel predecessor. Will this type of functionality simply drop off the radar?

  • Both M1 MacBooks use the ISP (image signal processor) to clean up the image, but still ship 720p webcams? What year is it? Has no one heard of "garbage in, garbage out", or are they betting that extra chip functionality will solve literally every problem?

The M1 is the first step of the transition; the opening salvo instead of the crescendo. There will be surprises and revelations ahead, and I look forward to seeing genuinely new designs within the next year, including next spring and during WWDC.

Don't forget: The Macs that were revved today were from the bottom part of the lineup. For all we know, an M2 or M1X, with both faster cores and the additional infrastructure for a fuller system architecture, is still being finalized. It takes time to build out the chip capabilities from iPhone/iPad-level chips to something capable of handling all subsystems in all Macs. PCI Express (used by Thunderbolt/USB4) and virtualization support are two examples of features already present in the M1 that no previous Apple SoC has needed to support.

Apple Silicon: The Roads Not Taken

[Please note: This was written ahead of today's announcements.]

My track record on predicting things out of the blue is pretty spotty, so here are a few things I can imagine but that will probably not materialize.

  • "Apple Pi"

    Raspberry Pi-like, "tinkerer-friendly" Mac, for under $100.

    Compare the prices of most single-board computers: the x86 models are consistently either significantly more expensive, running four-year-old Intel Atom CPUs, or both. Not only do ARM processors avoid the tax of keeping Intel afloat, Apple itself has experience putting small SoCs in surprising places.

    If they did do this, chances are they'd make it all about hosting stuff on iCloud and writing code in Swift (maybe using a connected iPad). I don't quite see how it can be what the Raspberry Pi crowd likes and what Apple likes at the same time. Apple's not interested in enabling tinkering. It's interested in making kids code, but on a high-margin iOS device and up. With the way macOS has moved recently, there's little that would make this a Mac as such, but it's still more of a Mac than iOS/iPadOS.

  • "Mac nano"

    A Mac mini the size of the Apple TV, for $199, with 4 GB RAM, 64/128 GB of iPhone-like storage, hardly any I/O, and probably an A12, A13 or A14. BYODKM (bring your own display, keyboard and mouse) – hook up the display with HDMI or USB-C, and hook up keyboard and mouse wirelessly or with a USB-C hub/adapter.

    The old Steve Jobs quote was "we don't know how to make a $500 computer that's not a piece of crap", and Apple can now comfortably pack in the computational power for a decent enough experience with whatever people are likely to plug into it. As long as it runs the software well enough, it's a candidate to bring people over from Windows, and since Macs are about to lose the fallback of "if all else fails you could use it as a Windows PC", it needs to be cheaper.

    ("Mac SE" was already taken.)

  • An affordable Mac mini

    Take the current Mac mini, make it a bit smaller and make it affordable. Again – the Intel tax is gone, and Apple, if they want to, can already churn out silicon at scale by themselves. The first Mac mini was $499 – there's no reason the first ARM Mac mini can't be.

All of these products essentially are based on this: there's an Apple that makes iPhones for $399 with industry-leading performance, and there's an Apple that sells wheels for almost twice that price. It's up to Apple to define what they want to sell and how they want to market it, and heading into a transition where you drop a hardware partner for your own designs is a perfect time to choose a new tack.

Say what you want about whether Apple wanted to offer lower-end products before; the price-to-performance ratio with Intel never made much sense. And if a Celeron or Atom didn't exactly scream high performance, neither did PowerPC chips that were lower-end than the ones they put in their low-end Macs back in those days. In a way, Apple hasn't had the opportunity to tackle this head-on for at least 20 years or so, so we don't really know that the idea has been rejected by Apple rather than by circumstance.

Mister President-Elect

They just elected to tell the monkey "you're going to have to turn that machine gun in".

(I have a longer post with a lot more going on, but it's not coming together, and there's a limit to how late I can post this and still be reasonably timely. I also have other things to post about and I don't want to not have acknowledged this when I do.)

Sean Connery dead at 90

Much will be said about his defining tenure as James Bond, of his dialect, of how people are both shaken and stirred.

For me, Sean Connery redefined the nature of creativity. If you haven't already, watch Finding Forrester this weekend. Its teachings sit deeper with me than the ur-meme and subsequent genre that it birthed, although we may hopefully all instinctively understand it a bit better by now.

Oh Gee

First, Apple's come a long way from calling carriers "orifices". Without checking, Verizon probably got more stage time than the lidar in the iPhone 12 Pro, where it assists autofocus and plays a big role in magically making photos work out even for people who have never knowingly 3D scanned something in their life.

But more importantly, the sense I've got is that 5G isn't a dud technology, but that it really only provides its advantages in areas where it's really well built out. With mmWave, the intense half of the technology, that means literally having line of sight to a tower. I can see why they focused on stadiums and the NFL, and why Apple picked a carrier who could say "we're now rolling this out for real", to get past any argument that it's been bad until now.

All of the presentation and indeed the product site itself are packed with demonstrations where all of these things will download completely before the touch debounces, before the foot hits the ground, before the next time the hummingbird flaps its wings. It's not just in the flashy montage video – Apple is setting the expectation that this is how life will be with 5G, and that requires that these speeds are realistic and omnipresent and dependable. Either they are right or it will be worse than this.

If they are right, it's a huge leap forward for 5G, because that is not the measured opinion of anyone with access to a realistic 5G network today, never mind the people in the coverage map's vast shadow. If it's worse, are they going to hide behind people not having Verizon as a carrier, or not being one of the 200 million Americans ostensibly covered by the built-out non-mmWave network? The iPhone is a worldwide product, most people even in the US don't use Verizon, and for all the US-centricity, these days the experience of using an iPhone rarely comes with great big asterisks.

If this all is a great big "fuck you" to AT&T over "5GE", though, good job.

Sony tears apart PlayStation 5

I've never owned a PlayStation and maybe never will, but this was an interesting overview. A crazy mix of conventional PC and bespoke architecture – and the CPU is cooled with liquid metal! (Which, from what I understand, takes extreme care because it's also electrically conductive.)

Home

Far too often, the soul of someone who happens to be a Mac user is seen through the lens of corporate communications — about crazy ones, misfits and round pegs in square holes.

I'm a Mac user and have been so for most of my life. System 6 was a staple of my childhood, but I also remember a Compaq portable with the mouse trackball at the right side of the screen and the mouse buttons on the lid, and eventually using Windows 95 and 98. I came back to the Mac with Mac OS X, when I left a really powerful PC for a computationally dinky Aluminum PowerBook G4, and I have not owned a full-on desktop since, even if I do own a NUC and a few Raspberry Pis.

I have recently been in a mode of deep (Windows-based) user interface focus at work, and was describing Panic to someone when something came over me. I love these guys. I love the attention to detail in every large, small and medium thing, the time put into making an application that feels right and flows right, that's easy to use, that has just enough user interface that you can get done what you need to get done, that has style, function and whimsy.

Panic is just an example, but there are so many of their ilk that I can point to. Many Tricks and their excellent Witch; Ranchero's NetNewsWire; Omni and their myriad of productivity tools; Noodlesoft's Hazel; and so on, and so on, and so on.

I've used Windows every day for as long as I can remember in one way or another. I can find my way around there as well as on any other platform. But while on the world's biggest desktop OS, I still feel constrained by a meandering vision, by a lack of common conceptual ground, by the infuriating feeling that so much is built and left unfinished, unpolished and put out to rot.

The Mac gets a lot of flak from people who are nose deep in technical specifications and price matchups. What they don't see — or aren't interested in — is the intangible: the culture in which people with big dreams and small means have made the unconventional available, the complex seemingly simple and the advanced accessible. This culture doesn't live or die by Apple in particular, although the original Macintosh being a product of a similar mindset helped set the tone. This culture produces things that are hard to find elsewhere, not because they are technically impossible to build, but because the values that drive those other platforms produce different outcomes.

I am upset with a lot of recent technology, because it all seems intent on burying history as part of remaking the world. Not everything new is bad or worse than what came before, but so many important lessons are being thrown out. You can't make a web app without first filling it with a big framework to implement basic interactions, and most of them lose the tactility and the richness of native interfaces on any platform. You can definitely build good web interfaces driven by snappy and well-thought-out engines, but it takes intense focus and hard work to do so, and it's easier (but not cheaper) to just throw in Bootstrap and end up working just as poorly as most other web sites. Electron takes all this and wraps it up in a computationally horrible footprint, under the insulting guise of "native".

iOS, iPadOS, watchOS, tvOS and now macOS with Big Sur – all the recent advancements seem to come at the expense of the wide latitude that used to produce great results. The freedom that allowed a seamless experience is chopped up by security concerns ham-handedly and haphazardly applied, and on most platforms, most of the time, topped with having to pass through the eye of the needle of a trillion-dollar enterprise's hungry bean-counting and control. All for the purpose of being a populous platform for its own sake; for having more apps that cost little, grey gruel instead of food; for padding a bottom line if you're cynical, or stroking a corporate self-image run amok.

The reason, all these things considered, that I haven't left these platforms yet is that there's still the feeling of being in a garden of my own cultivation. I can control every nut and bolt and swap out infinitesimal details and fundamental building blocks in Linux, that's true. But that means that people do, and you end up worrying about technological fundamentals because of this uneven foundation. This soil does not bear great fruit, efforts by GNOME and others notwithstanding, and the culture lionizing the endless flexibility of the command line and the architectural purity of UNIX gives an easy escape hatch for any problem.

Windows is seemingly more stable in this aspect, but while I am able to live in that house, I am not able to make it my home, and it's not for a lack of trying. Microsoft's repeated wallpaper-stripping and ever-changing priorities make it feel like an enormous mansion under constant renovation, with uneven floors, studs poking through the walls and fundamental features left broken or half-finished since the last time they cared. (The less said about the impressionistic "Fluent" wing entirely in featureless acrylic, the better.)

The culture and the people and the shared values and what it all comes together to produce. That's why I'm still here. You can live in many houses, but not all of them will ever feel like home. I'm upset with the landlord and the building manager who ignores leaking pipes and oiled floors catching on fire while upping the rent and turning a blind eye to hustlers running Three-card Monte, but aside from that, I love the neighborhood, I love the surroundings, I love that they value the things I do and I love what it can build over time.

Joseph Gentle: "I was wrong. CRDTs are the future"

What’s the JSON equivalent for realtime editing that anyone can just drop in to their project? In the glorious future we’ll need high quality CRDT implementations, because OT just won’t work for some applications. You couldn’t make a realtime version of Git, or a simple remake of Google Wave with OT. But if we have good CRDTs, do we need good OT implementations too? I’m not convinced we do. Every feature OT has can be put in to a CRDT. (Including trimming operations, by the way). But the reverse is not true.

I have been looking sideways at technology like OT for the past 10 years or so, wondering if it'll ever be part of anything I do. CRDTs are refreshingly simple, and have less of that ominous "if you fuck this up, you fuck everything up" feeling, but they've been limited to simple, known-good constructions similar to concurrency primitives.

There is a layer of subtlety and complexity that will never go away with real-time editing, but I hold out hope that there can be CRDTs that encode larger patterns too, and can be used to compose larger state graphs in a way that keeps the choices you have to make right in front of you and avoids "trap doors".
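As a taste of that simplicity, here's a minimal sketch of a grow-only counter, one of the textbook CRDTs (my own toy illustration, not code from the post): each replica only ever increments its own slot, merging takes the per-replica maximum, and because merge is commutative, associative and idempotent, replicas converge no matter how updates are ordered or duplicated.

    # Toy G-Counter (grow-only counter) CRDT. Each replica increments only its own
    # slot; merge takes the per-replica max. Merge is commutative, associative and
    # idempotent, so replicas converge regardless of delivery order or duplication.
    class GCounter:
        def __init__(self, replica_id: str):
            self.replica_id = replica_id
            self.counts: dict[str, int] = {}

        def increment(self, amount: int = 1) -> None:
            self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + amount

        def value(self) -> int:
            return sum(self.counts.values())

        def merge(self, other: "GCounter") -> None:
            for rid, count in other.counts.items():
                self.counts[rid] = max(self.counts.get(rid, 0), count)

    # Two replicas diverge, then sync in either order and agree on the total.
    a, b = GCounter("a"), GCounter("b")
    a.increment(3)
    b.increment(2)
    a.merge(b)
    b.merge(a)
    assert a.value() == b.value() == 5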

Steaming

Three quick things:

  1. Why, when installing a Steam game on macOS, do I get the option to "Add to start menu"? There is no start menu in macOS. (And on Windows, it's called the Start menu.)
  2. Why, when the Steam app has found an update and properly badges and bounces the Dock tile, does clicking the tile or switching to the app by any other means hide the update window? My guess is that the switch-to-the-app trigger invokes "show the main window at all costs". The window is also not listed in the Window menu, nor seemingly reachable in any other way (Witch doesn't seem to find it).
  3. Why the hell is Portal 2 (and the other handful of first-party Valve games) still not cross-compiled for 64-bit macOS, so that it is still playable now, and would be playable (at least for a few years) on future Apple Silicon Macs? This is not rocket science, and Valve wouldn't even have to do the technical work themselves.

The first two should have been found within the first week of actual use. All of them should have been taken seriously and fixed as soon as possible.

Jordan Rose: Objective-Rust

Yep, that’s Rust code with embedded Objective-C syntax, and it works. Why would you do such a thing?

Fast Company: "Facebook is quietly pressuring its independent fact-checkers to change their rulings"

The video was notable because it had been shared by Lila Rose, the founder of antiabortion group Live Action, who has upwards of five million Facebook followers. Rose, leery of restrictions on her page and handy with claims of Big Tech censorship, quickly launched a petition protesting what she alleged was bias by Facebook’s fact-checking partner, a nonprofit called Health Feedback. Soon, four Republican senators, including Josh Hawley of Missouri and Ted Cruz of Texas, wrote a letter to Zuckerberg condemning what they called a “pattern of censorship.” They demanded a correction, a removal of all restrictions on Lila Rose and her group, and a “meaningful” audit of Facebook.

Soon, the fact-check labels were gone. A Facebook spokesperson told BuzzFeed News at the time that the labels would be removed pending an investigation “to determine whether the fact checkers who rated this content did so by following procedures designed to ensure impartiality.”

How very independent.

MacRumors: "WordPress for iOS Was Blocked From Updating Unless it Agreed to Add In-App Purchases for .Com Plans"

This is getting stupid.

Either Apple are somehow trying to find one interpretation of their rules and let it percolate through all apps as updates are processed, or they have gone stir-fry, stark raving crazy with greed and/or self-conscious paranoia and have whipped themselves into some sort of draft-dodger-hunting frenzy. Neither is a good look and neither serves the user.

(The same thing as always serves the user: letting people build the app they want, the way they want it and shipping it to their users and customers.)

This serves literally no one, not even Apple.

WSJ: "News Publishers Join Fight Against Apple Over App Store Terms"

MacRumors summarizes:

For Amazon Prime Video, Apple offered Amazon a special deal where it took just 15 percent of subscription revenue, and the publishers want the same deal. The letter asks Apple to "clearly define the conditions" that Amazon met to garner that agreement.

Clearly this must be some kind of mistake. All developers are treated equally by Apple, and no one ever gets to skirt the rules or get a better deal – at least according to congressional testimony under oath.

Appic

I've had trouble finding things to say about Apple v. Epic, wherein Epic launched a clearly choreographed and premeditated series of actions: adding a method of payment that was not allowed by the App Store guidelines and (some of) their analogues, getting the update taken down and Fortnite delisted, filing legal action against Apple for monopolistic behavior, being told that their Apple developer account for all platforms would be terminated in 14 days, appending more legal evidence, and finally being told by Apple by way of a public statement, more or less, that "it doesn't have to be like this".

This should be catnip for me (or "app nip", if you will). In some ways the same fight is being waged as before, but with high enough stakes that some damage may actually be done. But instead it just makes me uncomfortable.

Apple and Epic are two peas in a pod - both are large companies staffed with many talented, capable people who provide an equally capable platform, on top of which many other people can build projects that do things they wouldn't do otherwise, and both have mastered the ability to extract just enough money from this practice to not look like Oracle outright. Neither can plausibly look like a scrappy underdog, and instead of stumbling into this situation through emergent arbitrariness, it was instigated as a public relations operation which looks, sounds and smells like a public relations operation, and only really excels at pointing out the hypocrisy in the other 800-pound gorilla's public relations self-image. In short: it is an excellent point, fight and narrative, ruined by details, circumstance and participants.

But what has sat with me for a bit is the wording used in Apple's recent olive branch: "The problem Epic has created for itself is one that can easily be remedied if they submit an update of their app that reverts it to comply with the guidelines they agreed to and which apply to all developers." Combined with the power in Apple's grasp, their immense size using nearly every possible metric, and the policies used in the store today, theirs is the language of the person on the wrong side of history.

Imagine a digital company town, where all profits feed the owner and all salaries are spent in the company store. Why, the worker may have created a kerfuffle with their funny-sounding, foreign ideas about "safety standards" and "unions" and "5 day work weeks", but they could simply drop these corrosive ideas and go back to being a worker at belt 33, and The Company will be magnanimous enough to forgive. Maybe.

Imagine an industry town racket, where every shop and café and barber and baker feels their heart sink at the tap on the glass. It's time, have you forgotten? 30% of all profits. Hm, this seems a bit light, are you holding out on us? I'll let it be this time, but you'd better be playing us straight. This town is full of bad luck, and you're lucky to have someone like us to look out for you, making sure you don't end up in it. Remember what happened to Jimmy? Terrible.

Imagine an unkind world, denying you the dignity of personal enumeration, of individual treatment, of suffrage, of education, because of the makeup of your chromosomes, the area of your origin, the color of your skin, the spirit and morals from which you channel strength. Why would you demand more - don't you know these are not innate rights, but benefits conferred on people who are the way people should be? Put down your petitions and your aspirations, and be of use, unilaterally and unconditionally, to us, and if you seek clemency and bother nobody of import, so may you find, in time, gracious recompense for your efforts. As long as you remember the source of your fortune.

The App Store is not any of those things, and Apple are not tyrants (and have indeed stood up against some tyranny, albeit in a curiously specific way). Apple are about as much of a net good in the world as a company that runs an App Store can still credibly be. Nor are Epic doing more than LARPing not being in control of their own ship. Only they know whether they actually care about other companies; I'm willing to entertain the notion that they do, but regardless they're not out on the corner with a paper cup, so why the above?

Because the proportions are so lopsided, even against Epic, that Apple, as one of the world's biggest companies with one of the world's biggest platforms, can't help but speak in that tone. It's not a choice; it's the weight of circumstance, sprinkled with years of history. That's what monopolies do – they turn every opportunity and every proposal into a transaction that furthers the monopolist, transferring power and influence and means chiefly in one direction. Power corrupts, and a monopoly is a centralization of power, its presence permeating inescapably, even if you're plopped into the position by mistake and not by malicious pursuit, even if you're good and well-meaning.

If that's what you spend 12 years doing and wish to run an operation that's not just benevolent on the surface — with extremely hand-picked scenarios that pretend the world of software is confined to the four companies whose work your keynote features and ignore the existence of the Internet while training people to accept a significantly lower price of purchase which maybe would work out if only more people could find your app — but actually a beneficial proposition for both parties, you have to fight with every fiber of your being, with every action along the road to counteract the balance tipping towards you, and to empower the developers so as to empower users by the fruits of those labors. They not only haven't done that, they've thrown obstacle after obstacle in the way of the developers, in the cause of advancing some hare-brained strategy or apparent unity or surface simplicity. Was this by active, pre-meditated malevolence? Almost never, that I've been able to tell. Does it affect, move or excuse the outcome? Not that, either.

The App Store is a corrupt state and it deserves a revolution, but it can't be started by the cousin of the Minister of Energy who happens to own half the refineries, and it can't be given credibly sustained traction by the playbook that spends time aping Chiat-Day.

Lesszilla

Mozilla's PR statement:

As I shared in the internal message sent to our employees today, our pre-COVID plan for 2020 included a great deal of change already: building a better internet by creating new kinds of value in Firefox; investing in innovation and creating new products; and adjusting our finances to ensure stability over the long term. Economic conditions resulting from the global pandemic have significantly impacted our revenue. As a result, our pre-COVID plan was no longer workable. Though we’ve been talking openly with our employees about the need for change — including the likelihood of layoffs — since the spring, it was no easier today when these changes became real. I desperately wish there was some other way to set Mozilla up for long term success in building a better internet.

Michael Tsai reports that most of the Servo team – tasked with revamping, improving and rewriting the rendering engine in Rust, indeed the explicit purpose for which Rust was invented – has been laid off.

Mozilla was, and is still for an indeterminate amount of time, the check and balance on Apple and on Google, the two remaining browser engine competitors. Both have perverse incentives to turn the web into their own platform, to keep the web from competing with their own platform, or to make the web look and behave like their own platform. Mozilla has often fronted technologies that advanced the web, be it adopting Microsoft's XMLHttpRequest as a native object or spearheading CSS Grid and WebAssembly. The success and failure of web APIs and new developments depend on what this triumvirate decides; it's worth keeping the most independent, most pro-user vendor of the three a strong alternative.

I have no interest in most of Mozilla's offshoots like the Pocket app or iOS Firefox, but I will likely switch to Firefox and find a way to support them as a matter of principle. I should have done it much sooner.

As the Swedish saying goes, man saknar inte kon förrän båset är tomt; you don't miss the cow until the stall is empty.

Armored

Apple enthusiasts are in the rare period where we know something new and big is definitely coming, in that an architectural shift is on its way but isn't here yet. And for all the talk about Apple Silicon and various ways it can shape future Mac models, I think there hasn't been enough time spent looking back on one of the winners of the previous transition: the original MacBook.

The original MacBook was the successor of the iBook; white, plastic, budget laptops that were still Macs, but that were almost comically underpowered at every point I evaluated them. I got into Macs for my own use with an aluminum PowerBook G4, and the G4 had a famously slow "front-side [system] bus" at 167 MHz that bottlenecked it. Every iBook was further bottlenecked with smaller screens (including a choice of two screen sizes with the same resolution), limited optical drives and lower specs overall.

The MacBook, in contrast, announced in spring 2006 out of nowhere and without any fanfare, was a revelation. It boasted the same dual-core Intel Core Duo as the MacBook Pro (contrary to speculation that the iBook successor would surely get the Core Solo that eventually ended up only in the Mac mini), and at about the same price range and with a far more solid build, it was enough for me to upgrade, no scare quotes needed, from a reasonably full-featured PowerBook G4 to a base model MacBook. It also introduced the first chiclet-style laptop keyboard in Apple's lineup and a magnetic hook-less latch, as well as picking up MagSafe and a built-in webcam from its bigger sibling. The top-of-the-line laptop becoming markedly faster was huge news; the piddling mass-market model becoming a bona fide, competitively priced speed demon may have made a difference to more people, and done a lot more to expand the Mac market base.

The Apple Silicon transition holds many potential effects, some of which don't bode well for extensibility, modularity and people's general day-to-day dependencies on the x86-64 architecture in practice. But Apple wouldn't make this transition if it couldn't bring this sort of overhaul or comparative before-after difference to at least two or three of its lines. It may not be the case that an Apple Silicon SoC can run circles around a 64-core, 128-thread AMD Threadripper, but it is both plausible and achievable that it can deliver a faster processor that runs cooler while sipping battery and break free of the oppressive thermal equation of recent Intel chips. When you suddenly have room to breathe, it becomes a platform to build a better product around. You can swap some of the battery space for other componentry, use the battery space to achieve significantly longer battery life – or (knowing Apple) just make the whole thing smaller/thinner, which is still a good choice for some products even if it isn't the best choice for every product.

Whether or not the new "original MacBook" will be the return of the MacBook (2015), only-this-time-it's-a-good-idea, is anyone's guess. But the only thing anyone saw coming about the original MacBook, working backwards from the already released original MacBook Pro, was the name.

Fran Allen dead at 88

An industry giant who got far too little air time and consideration.

Fran led a team at IBM that pioneered optimization in compilers, including automatic parallelization to feed a convoluted architecture, decades before UNIX or C. Her work represents a road not taken, to the regret of many, myself included.

The chapter about her in Coders at Work is recommended reading.

Apple: Phil Schiller advances to Apple Fellow

Apple Fellow positions are very rare and given to people who have made significant contributions. Phil is a "marketing" person by title, but as evidenced by Jeff Williams, the current COO, having essentially run product ownership of Apple Watch, Apple, like many great creative companies, attracts "T-shaped", multiply-talented people, and in many ways allows them to exercise those talents and influence decisions without regard to job title.

Phil notably advocated personally for putting a rotary/circular scroll wheel in the iPod, a decision that made it easier to scroll indefinitely through the long lists produced by putting 1000 songs in a device with a small screen, and which differed from the jog wheels and static buttons in use at the time. I'm sure there are many other marks that those in the room will recognize and hopefully one day tell the tale of.

Congratulations to Phil — you've earned it.

Filip Pizlo: Speculation in JavaScriptCore

Good, in-depth post from the WebKit team about how JavaScriptCore handles speculative compilation to optimize JavaScript execution.
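As a rough, entirely toy illustration of the shape of the idea (not JavaScriptCore's actual machinery): the engine profiles a value, compiles a fast path specialized on the observed type, guards that assumption at run time, and bails out to a generic path when the guard fails.

    # Toy sketch of type speculation: profile, assume, guard, and deoptimize to the
    # generic path when the assumption breaks. Not JavaScriptCore internals.
    def generic_add(a, b):
        # The fully general path: whatever "+" happens to mean for these operands.
        return a + b

    def make_speculative_add(profiled_type):
        def speculative_add(a, b):
            # Guard: the cheap check that protects the speculation.
            if type(a) is profiled_type and type(b) is profiled_type:
                return a + b          # fast path, specialized for the profiled type
            return generic_add(a, b)  # bail out ("deoptimize") to the generic path
        return speculative_add

    add = make_speculative_add(int)   # profiling has only ever seen ints here
    print(add(2, 3))                  # stays on the fast path
    print(add("java", "script"))      # guard fails; falls back to the generic path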

Paul Tozour: The Game Outcomes Project, Part 4: Crunch Makes Games Worse

Extended overtime (“crunch”) is a deeply controversial topic in our industry. Countless studios have undertaken crunch, sometimes extending to mandatory 80-100 hour work weeks for years at a time. If you ask anyone in the industry about crunch, you’re likely to hear opinions stated very strongly and matter-of-factly based on that person’s individual experience.

And yet such opinions are almost invariably put forth with zero reference to any actual data.

Deep and wide analysis of an area that must be incredibly difficult to test in a scientific manner. Includes preemptive consideration of a reasonable counterargument, that crunch is likely most needed in projects that are poorly run or managed to begin with, which risks skewing the data or entangling variables.

David Heinemeier Hansson's Statement to the House Antitrust Subcommittee

It’s worth noting here that we are already paying Apple for the privilege of having access to the App Store. All developers must pay $99/year for a developer license. Apple brags of having millions of developers, so they’re in essence already making hundreds of millions of dollars, just in licensing fees. Nobody is asking for a free ride here! If it costs Apple more than hundreds of millions of dollars to run the App Store, they can raise their prices. We’d gladly pay $199/year for a developer license.

But what Apple is asking for is a cut of revenues, at “highway robbery rates”, and it’s simply absurd. Imagine if the telcos demanded a cut of company revenues, since they provide the phone line that connects customers, in the heyday of their monopoly? Imagine if the railroads demanded a cut of company revenues from the goods shipped in the heyday of their monopoly?

roguelazer: "Etcd, or, why modern software makes me sad"

Popular modern technology is taken over by expats from a megacorp and made worse in the service of a hyper-specialized (and just plain over-hyped) orchestration platform. That's the world today. Anything that has a simple and elegant feature-set ends up coöpted by people who just want to build big ungainly architecture and ends up inheriting features from whatever megacorp the coöpters came from.

I have a tenuous relationship to containers, to orchestration, to automated infrastructure-as-a-service with Puppet, Ansible and so on. They make possible the dream of having a sea of computation, where your service isn't the individual wave or stream, but rather the sum of whatever needs to come into existence at the moment.

The problem for me is – at what cost do you get this? If you're Google or some other huge company, you need to have this anyway; the entire company-wide department (and/or one or two people in each related effort) is worth the expense and effort, and the complexity at least isn't added complexity so much as things you'd have to deal with one way or the other.

If you're not, well, you either have to spin up a layer of architectural and organizational complexity that Google can support or throw in with some sort of cloud solution that does it for you. Either is costly in one way or another, and handing someone else the keys never absolves you of all the new exciting ways in which something can break or fail to be tuned or proportioned correctly.

Which leaves the option of not doing it. I just received a message from my host that in August, the single server that hosts this place will go down for a service window that "should be about an hour". Leaving aside the question of how on earth this isn't handled by live-migrating these virtual machines to another server, this is a good opportunity to highlight the difference.

Right now, this site hosts everything on one server, with one application serving the pages and one SQLite database storing them. To avoid the downtime, I would need at least two servers and a load balancer, plus by necessity either a third server for the database (hoping it would never need to go down at any point) or a vendor-provided "database service" that comes with such guarantees, just like the load balancer.

The point isn't that these services are beyond me, or that it would be terribly expensive in real money, really, even if it would be a factor of 5-10x. The point is that the complexity needed to scale up comes in steep cliffs.

There's nothing wrong with understanding how to scale and decouple in the first place. The current lore and fascination with containers and orchestration invites you to entertain these investments, these aspects and these costs for everything you do. If what you do is a large scale system where every part is mission critical (either to a customer or to other infrastructure), they are justified. But what happens when our industry gets enthralled with this way of functioning, to the point when all technology works best like this?

What happens to the developer who only needed the simple solution, or to the many small shops that can scarcely afford the infrastructure or hosting costs, or to the also-numerous slightly larger shops where the resources are there, sure, but people end up spending much of their time monitoring and gardening the large, complex system and less time doing actual development? And, to roguelazer's point, what happens to the simple solution, which could have been used by developers and solutions of all shapes and sizes to solve smaller or less complex problems on their merits?

(This is effectively also a corollary to the related but not identical microservices debate, which is essentially the same argument on another plane. Swap out Google for Yelp.)

Jeff Glatt: COM in plain C

COM, Microsoft's 90s-era do-it-all layer of cross-library, cross-language, cross-machine interoperability, whose foundations underlie .NET in spirit and the Windows Runtime in actuality, is ridiculously complex, but in service of a great number of functions, and it provided "automation"-level support where everything could be called from scripts decades before PowerShell.

I was somehow tickled today to wonder what it would take to vend a COM interface in plain C (since most of the abstractions assume you're willing to touch C++ in the first place), and Jeff Glatt has an eight-part series of articles on CodeProject introducing the relevant concepts piecemeal. Recommended reading to gain a higher level of understanding of how much COM does and, in some cases, how many hoops it jumps through for you if you don't mind jumping through smaller, specific hoops yourself.

The Final Hours of Half-Life: Alyx

I've never been particularly interested in playing Half-Life games, having the hand-eye coordination of a partly paralyzed goldfish as I do and never liking first-person shooters or horror. But Half-Life as a series has left an indelible impression on the gaming industry, as have the Steam distribution platform (born out of horrible experiences with physical publishing gatekeepers) and Valve, the company that created both of them.

After a few years in which the "episodic" release schedule of the second Half-Life game ground to a halt on a climactic cliffhanger, any development on any Half-Life title was shrouded in mystical secrecy befitting a Salinger-esque recluse; the original Half-Life and the last episode of Half-Life 2 were released within a nine-year window, followed by 12 years of silence.

The Final Hours of Half-Life: Alyx is a compendious monolith of behind-the-scenes reporting from independent games journalist Geoff Keighley, who previously produced similar articles for the original Half-Life, Half-Life 2 and Portal 2. Not only does it tell the story of Half-Life: Alyx, the VR-only prequel finally announced at the end of 2019, but it reveals the series of false starts, development stalls and projects out of tune with their circumstances that occupied the enigmatic dozen-year sabbatical.

Valve is an incredibly successful company with vast resources, and its employees are driven, capable and afforded legendary levels of freedom, but in the end, game development remains a creative and human endeavor, susceptible to downturns, lulls or, as one employee dubs it, wandering off into a collectively shared wilderness. Stories like these are all too common, but in a "10x engineer" world of venture capitalists roving around for "rockstar developers", they are too seldom told.

(If you think you might be interested in playing Half-Life: Alyx at some point, stay far away – there's no way of reading this without being spoiled to high hell. Also, be prepared for a several-GB download on a reasonably recent Windows PC just to read an article.)
