Rauhul Varma: "Byte-sized Swift: Building Tiny Games for the Playdate"

I’m excited to share swift-playdate-examples, a technical demonstration of using Swift to build games for Playdate, a handheld game system by Panic.

I am terribly upset that at no point did Panic get the chance to charge Apple $99 for developer tools, scrutinize every byte in their executable and toss it back in their face after Review as being "redundant", since games already exist on the Playdate.

Of course I kid — this is a fantastic achievement and a great post. It's interesting how many programming languages have modes explicitly targeting embedded scenarios these days.

Apple: Getting ready for Web Distribution in the EU

To be eligible for Web Distribution, you must:

  • Be enrolled in the Apple Developer Program as an organization incorporated, domiciled, and/or registered in the EU (or have a subsidiary legal entity incorporated, domiciled, and/or registered in the EU that’s listed in App Store Connect). [..]
  • Be a member of good standing in the Apple Developer Program for two continuous years or more, and have an app that had more than one million first annual installs on iOS in the EU in the prior calendar year.

Apple of 2024: asymptotically approaching reasonableness.

To be honest, the remaining requirements are not as terrible. Sideloading should mean completely open, completely allowed (including from an airgapped computer), for anyone, by anyone. But notarization isn't the main issue, as long as it is purely technical (which it is on the Mac, but isn't in the current iOS DMA plan, where manual review is involved).

Let's list the additional changes that would make this offer something less than sideloading but still ultimately somewhat palatable.

  • No "freedom is for rich, successful people". No "2 continuous years of good standing" limits, no "1 million annual installs of a single app" requirements.

  • No Core Technology Fee.

    (Dude, you sell tens of millions of phones at $1,500+ and pride yourself on being the margin leader of the entire phone industry, sell millions of developer accounts at $99 apiece, and skim 30% (or sometimes 15%) off the entire collective App Store revenue. And you operate the entire company on one Profit/Loss statement. You're good.

    No "business unit" will imminently capsize, except possibly PR or Developer Relations from how badly you're screwing all this up so far; and if that damage is financial, it's not from lack of Core Technology Fee income, it's from people getting fed up with your bullshit and not feeling quite so happy about shoveling money into your gaping maw while you treat them as degenerate freeloaders.)

  • A purely technical notarization process without manual review, or at least a shortened process past the first release or so.

...you know what, that's basically it. That still wouldn't make it DMA compliant, since rights in the regulation are still being held ransom, but it's helpful in plotting their movement across the spectrum of possible options.

Steve Troughton-Smith on Phil Schiller in charge of the App Store

Putting Phil Schiller in charge of the App Store is going to be a hundred billion dollar mistake that all-told leaves Apple with a pile of legal, perhaps criminal, liability and a raft of draconian regulations around the world that massively compromise the iOS experience. This was clear years ago; it is unimaginable that he’s still calling the shots

Apple of 2024: repeatedly jamming a fork in the wall socket, proudly proclaiming their company policy of not being electrocuted.

Platforms

Steve Jobs, January 7th, 1997, 18 days after the Apple acquisition of NeXT:

We've got to get the spark back with the developers. [..] We've got to get the developers back. Now, how are we going to do that? What's our strategy to do that?

I'm not going to go back to the Apple ][ days, so we'll start off with DOS. If you're a developer, you can build an app, let's use a metaphor of the floors of a building, you can build an app that's three floors tall, but you can't build an app that's 10 floors tall, because you all read The Mythical Man-Month, what happens is as your software team starts getting bigger it kinda collapses under its own weight, like a building built of wood. You can't build a building built of wood that high. So, let's say you can build a three story app, well, that means if you start off with DOS you're on the first floor, you can deliver a four story app.

Well, what we did with the Mac was, we had an OS that was about the same as DOS, but then what we did was we put this thing called the Toolbox on top of it that lifted the developer up to the fifth floor, so that they could write an app like PageMaker and deliver an eight floor app on Mac, because they started at the fifth floor, that they could never, ever, ever deliver on DOS. And that's why all these wonderful apps propelled Apple into these exciting markets over the years.

But, there's a problem now. And the problem is a very simple one. It's called: Windows. The Mac didn't progress much beyond the fifth floor, and over ten years, Microsoft copied it. And now they can offer developers, you know, you squint your eyes, one's a little better than the other in some areas, but you squint your eyes and they're basically both fifth floor. That's not good for us. It's even a little worse. Because they've been a little ahead of us in getting a multi-threaded, multi-tasking operating system underneath Windows. And that's arguably even better for the developer.

So, here's what we have to do. What we have to do is bring out an operating system that's even more advanced than NT. And this is not easy. This is not easy to do because these operating systems are very complex. We forget many times that it's taken NT eight years to get where it is today. Eight years. So to do this, we can't do this overnight. Fortunately, we've got one that's been battle tested and is ready for the challenge. But on top of that, we're going to put something called OpenStep, and OpenStep lets you start developing your apps on the 20th floor. And the kinds of apps you can deliver are phenomenal.

What does it mean to be a platform?

A platform is both its own foundation, the starting point for everything built on top of it, and the collective embodiment of those things.

The original Mac and its line of successors did not invent Desktop Publishing. It provided an environment conducive to graphical applications, where you could put up bitmapped graphics and, through the metaphor of windows, zooming and moving around, render printer-quality typesetting. (Producing such typesetting had not been completely impossible before, but doing so while limited to a monospaced 80-column terminal was an arduous process.) The system provided the cultural foundation as well as the technical foundation to put you a few floors up, in Steve's metaphor, and the combination of production-quality, production-ready apps and more and more capable hardware invented and made the industry.

Similarly, the original Apple ][ a few years earlier did not invent computerized spreadsheets. It seems hard to grasp now, but the Apple ][ was one of the first instances of a computer and a "user terminal" inhabiting the same box. This made personally interactive applications possible to a much larger degree, and within two years Dan Bricklin had started work on VisiCalc, the first spreadsheet as we know it.

Science describes a Cambrian explosion when all of a sudden, due to an available niche, organisms adapt and change shape on a time scale that is, relatively speaking, overnight. When the right circumstances exist, great things will emerge.

Platforms understand this and seek this synergy. Using the capabilities and the culture and the code in the base, the application can solve problems that it is built to solve in the best way it knows how to. The platform becomes a world teeming with life, with new possibilities, with problems solved in unimaginable ways. Without the world, no friendly, helpful, earnest critters floating freely in unoccupied interstellar space. Without the organisms, the world is essentially dead.

Moreover, I advise that the iPhone software platform must be opened

Apple: Update on apps distributed in the European Union

Apple is sharing changes to iOS, Safari, and the App Store, impacting developers’ apps in the European Union (EU) to comply with the Digital Markets Act (DMA). These changes create new options for developers, including how they can distribute apps on iOS, process payments, use web browser engines in iOS apps, request interoperability with iPhone and iOS hardware and software features, access data and analytics about their apps, and transfer App Store user data.

There really is a lot to cover, and you could spend a lot of time trying to summarize it and still miss things. Here are some thoughts in no particular order.

  • This is part actually honestly well-meaning, part malicious compliance, part intricate machinery. Many components have all three parts present.

  • This has been a long time coming and I'm sure a lot of people have been working very hard for a long time to make these things happen. I appreciate their hard work; particularly I know that untangling assumptions built up over the past 15 years does not come without headaches. If Apple leadership had just designed a less ass-backwards platform, fewer changes, maybe even none, might have been required.

  • This is not sideloading by any commonly used definition. Particularly with the financial requirements and the Core Technology Fee, it is devolved monopoly. A third-party marketplace is required to have significant amounts of cash on hand, and is driven to recoup €0.50 per first annual install of its own marketplace app (which does not get the one million install "grace" window).

    On the other hand, there is a sliver of a legitimate point in requiring that aspiring actual marketplaces be able to provide all the things such a marketplace should provide. But marketplaces are the answer to a not-locked-down App Store. Sideloading is the answer to a not-locked-down Ad Hoc Distribution; that quadrant is as missing now as it ever was.

  • App Notarization is required and includes some form of abridged manual App Review where a bare selection of guidelines, relatively speaking, is applied. (It is possible that this lighter review process is a variant of TestFlight pre-release scanning.) Following notarization, the signed app is delivered to the marketplace backend directly.

  • Apple is digging in on its world view that no developer can ever be trusted, even directly by a user. Shipping a bug fix update straight to users, without having someone on Apple QA it, is not possible.

  • It is not ideal. It is a start. Even when the monopolist is trying to fight back, it is better than the monopoly.

I'll Know It When I See It

Apple App Store Review Guidelines:

We will reject apps for any content or behavior that we believe is over the line. What line, you ask? Well, as a Supreme Court Justice once said, “I’ll know it when I see it”. And we think that you will also know it when you cross it.

Brent Simmons:

Just like the sixth finger in an AI-rendered hand, Apple’s policies for Distributing apps in the U.S. that provide an external purchase link are startlingly graceless and a jarring, but not surprising, reminder that Apple is not a real person and not worthy of your love.

John Gruber:

After yesterday:

  • Apps that wish to link to — or, I think, even tell users about — web purchasing options from within their iOS apps must (a) still offer Apple’s IAP for those items; (b) pay Apple its adjusted 27/12 percent commissions on web sales that come from inside iOS apps; (c) send Apple sales data monthly and submit to audits of their sales; and (d) follow Apple’s stringent design edicts for these in-app links to the web.

It is possible to get absorbed in the question of whether or not Apple genuinely cares about its users' privacy, safety and overall experience. I still think so, because I still think there are bundles of humans with beating hearts working there.

But whether it's true or not, it is irrelevant. What makes more sense to talk about is: does Apple place a higher value on maintaining control? Ever since the App Store was conceptualized, the answer has been yes.

If the top concern was the privacy, safety and overall experience, the solution is straightforward, although not necessarily simple to implement in practice (you know, in much the same way as attempting to thoroughly, efficiently, effectively and productively review every single update of every single app): institute rules that protect the customer.

Allow some leeway, but have a mechanism where if you act in a way that defrauds or misleads the customer, you are liable to be booted off the App Store. With this in place and effectively administered, there would be no point in attempting to mislead the customer. Whether an abuse of In-App Purchases, a particularly malodorous third-party payment system or just shifty behavior in general, it could be chalked up to the same offense. Or, to focus on the positives, an opportunity to throw down the gauntlet and focus on reasoned, respectful behavior, building a community of trust, providing the rising tide that lifts all boats.

Instead, the focus is on the enshrined axiomatic supremacy of whatever the Apple payment solution is. If you find it wanting, and want to do something else, tough noogies. If you built up your own idea, your own product, your own network, your own offering, and wish to make reference to it – better do it under our rules, even when those rules have no parallel anywhere else and even when those rules do not make sense to the common person.

Instead, the focus is on the absence of trust, the framing of the developers who largely built the platform's identity, humanity and success as rogue agents incapable of contributing productively.

Apple already knows that it is a good idea to start out by providing warm fuzzy feelings and work backwards from there, rather than to aim for profit maximalism and sort out the details as you go. But, with apologies to Upton Sinclair, it seems to be difficult to get a company to remember what it knows to be right when its income depends on it not remembering it.

Erin Kissane: Untangling Threads

Less emotionally, I think it’s unwise to assume that an organization that has…

  • demonstrably and continuously made antisocial and sometimes deadly choices on behalf of billions of human beings and
  • allowed its products to be weaponized by covert state-level operations behind multiple genocides and hundreds (thousands? tens of thousands?) of smaller persecutions, all while
  • ducking meaningful oversight,
  • lying about what they do and know, and
  • treating their core extraction machines as fait-accompli inevitabilities that mustn’t be governed except in patently ineffective ways…

…will be a good citizen after adopting a new, interoperable technical structure.

Increasingly I'm of the mind that "social media" has been a net negative. Whatever else it has been, it has also always been a tool for propaganda, for harassment, for distortion. And if you think that's a bit rich of me to say, I invite you to read Erin's article, where the case is made that even if it can be good in the hands of the right steward, Meta/Facebook is not an unwitting or hapless one but an actively terrible one.

Humain't

The Humane Ai Pin has been announced, a phone alternative trying its best to not be a phone in any way. With Humane famously spearheaded by ex-Apple luminaries Imran Chaudhri (with large amounts of the iPhone and multi-touch user experience to his name) and Bethany Bongiorno (a Director of Software Engineering from the launch of the original iPad), and counting among its ranks Ken Kocienda (part of the initial Safari/WebKit team and designer of the first software keyboard and typing autocorrect), I'm finding myself wondering what I'm missing.

There's the ambition, the philosophical thrust behind the product itself: people bewitched by apps, addicted to constant impulses be they doom-scrolling or drip-feeding entertainment. The desire to break free of the neck-craning prison of the pocket rectangle is understandable. (It's also been used as a siren song for both Windows Phone and Apple Watch before.)

There's the technological moment in time. AI personal assistants have been available for years, and recent breakthroughs in some AI technology mean this is probably the first time this type of device could do what it does with convincing accuracy — remember things, relate them to the current location, time, context — well enough to be basically the only interface, the only contact surface. Not a parlor trick which you can ignore if you want; a small projected readout in your palm aside, talking to it, having it understand what you mean and having it do it is the entire interaction.

The Humane Ai Pin didn't happen by chance and was not lazily extracted from between the couch cushions. A lot of talented people spent a lot of time at it, clearly chasing a deep vision.

So why does it seem so terribly, undeniably off?

There is a precipitous cliff for anything beyond "talking to the magical AI", where viewing photos and videos and managing settings all happen by going to a ".Center" site in a browser. The product site features food delivery and messaging between friends, two things that are well handled by apps today and that look dreadful to handle via voice entry or the projected palm interface, more fit for haikus than menus. But the "cosmos" operating system is leaning into this, supposedly free of all types of apps. So much for growing pains.

I am not the first to react strongly to this, but I am probably uncommon in my intense dislike for personal assistant AIs, a dislike that obviously flares to new heights in a product so heavily focused on them. The Humane site harps on privacy and trust, but what is private about being forced to live your life out loud; to not be able to jot a thought down silently? Were these things even discussed on a fundamental level during the considerable ideation, or was anyone raising them just seen as a bearer of the bad old culture, steeped in the musky scent of old magics?

A little experience can be a dangerous thing. Having gone through a world-changing evolution of how most people interact with personal technology, I understand if people think "in the beginning, they will laugh at you and say that the keyboard will not work until the screen can deliver convincing tactile feedback echoing physical buttons, but look at what happened; the world adapted and we won". I understand if some of the people involved feel a strange mix of regret over what this new technology, and everything that happened in its wake, has wrought, as well as the professional and curious imperative to do it again by taking the next leap, to unwind the next impediment to the machine just knowing what you mean to do by interacting with it.

If walking around in the world but looking at a screen because you're reading something is being absorbed by something else and not being present, then tapping a pocket square and talking to a virtual assistant about the same thing you would accomplish if you had a screen is also not being present.

In the scale of things, what exists is a technological achievement (and taking the recent progress and extrapolating five more years, might be even more so). There's just no compelling reason for anyone to throw the things that already exist overboard to use it and only it. (Maybe if you believe so strictly in the mission of Humane that aligning with it overrides every practical concern and dresses the contortions up in adherence to a more enlightened existence.)

This doesn't sound like an insurmountable issue to me: let it be a purposeful, focused device, used in addition to other things and also free from the burden, whether catered to or not, of having to be all the other things. But I'm not sure the people who would found Humane would want to go down that road.

Macs Schreck

Space Black looks nice, but the one I'd want starts at $7,984 (Swedish price) and even an ~82% uplift in Xcode from the M1 Max is still not worth it when I hardly ever feel held back by this chip.

Apple Should Create a Handheld Game Console

I have no idea what's in store for Monday night's Scary Fast event, and although some have noted the overlap of the curiously late event with Japanese business hours (and yours truly's wee hours) and thus a gaming focus, this is not a prediction of what's to come.

That said, Apple should create a handheld game console.

Why? Because I think it would be a pretty good option.

The highest volume handheld console is the Nintendo Switch, slated for an upgrade before the end of next year, but as of yet running Nvidia's 2015 platform Tegra X1 (the Switch launched in 2017 and even the pre-launch NX rumors focused on a revised and upgraded Tegra X1), true to Gunpei Yokoi's lateral thinking with withered technology.

It has sufficient capability and beautiful games are still being put out for it, but it is also slightly comical that it can't output 4K, HDR or above 60 Hz, or often maintain a respectable, consistent framerate in 3D games. (Tears of the Kingdom was marred by what looks like AI upscaling artifacting in the Zonai shrines and some background set details being low-poly enough for the Nintendo 64.)

The exemplar for the rest of the handheld consoles is Valve's Steam Deck, running a mobile AMD Ryzen "APU" (CPU with integrated GPU) with RDNA graphics, being sufficiently top of the line sufficiently recently that it packs a believable punch. It runs Linux and Valve's Proton layer for DirectX emulation (for Windows titles; Linux-targeted titles run natively with Vulkan or OpenGL) and is able to support a generous portion of the Steam game library, especially now that developers see it as an option.

Beyond the Steam Deck lies a sea of similar portable-PCs-as-game-consoles-with-joysticks-and-buttons of varying capacities, capabilities and outcomes. The Aya brand seems to be at the top of the pack.

Because it's got everything we need

In 2007, during the introduction of the first iPhone, Steve Jobs explained the selection of "OS X" for the platform (deliberately leaving out the "Mac" part of "Mac OS X"), leading in with this:

Why would we want to run such a sophisticated operating system on a mobile device? Well, because it's got everything we need.

Apple itself is in a similar place for a handheld game console.

  • It has famously high-performing but power-sipping M-series chips, and a mature software platform to go with them.

  • It has a hardware-hugging graphics layer with Metal, along with a series of underdog but steadily improving GPUs that are now capable of ray tracing under mobile device conditions (power, heat).

  • It has unified memory, letting the GPU have access to a large amount of working memory, physically close, with a high bandwidth link and shared with the CPU.

  • It has a handful of honest-to-goodness AAA games, which as of recent developments are capable of delivering graphics on par with their current, high-end, living room console versions.

  • It has high-bandwidth, integrated solid state storage management. Doesn't everything with PCI Express and NVMe support have this? Both PlayStation 5 and Xbox Series X have solutions for working with data right from the SSDs in ways that bypass the file system and caching; loading data from optical media died this current console generation. Doing this from within the same chip as the GPU, there could be great possibilities.

  • It has a growing awareness that gaming, while enjoyable on a touchscreen if you do it right and with some games, is also measurably improved with the tactility, precision, feedback and haptics that come from a controller.

  • And finally, it does have Apple Arcade, some sort of "own", or at least exclusive, lineup of games. I am not personally acquainted enough with them to know if they're good, but it's a starting point.

Any pretender to the handheld gaming throne would chomp at the bit to have these parts at their disposal. Nintendo, Sony, Microsoft or Valve probably would not say no to integrating a thing or two, like the Apple Silicon architecture ARM cores, into their own pipelines.

Vision

What's left is just whether they will do it. Unless there are reveals coming on Monday to recontextualize the past few years, Apple doesn't get gaming. They don't have cultural credibility. From the non-console side, they are laughed at for not making PC tradeoffs (flexibility and performance). From the console side, Apple hasn't made a game.

But, in a handheld console, PC tradeoffs are either irrelevant or likely to hurt. And both Microsoft and Sony were in similarly culturally philistine positions when they embarked on the Xbox and PlayStation respectively. The bigger issue is the bumbling with which Apple has tried to get into gaming, always doing just the wrong thing: holding parties for getting traction with individual games and extrapolating a golden future that so far has not arrived, instead marking a precipitous drop from the Mac riding comparatively high a few years ago, now being passed on the sidelines by desktop Linux, of all platforms. We'll leave the open warfare against one of the two major game engines and one of the biggest game studios, and the mutual disenchantment with one of the two major GPU vendors, for another day.

What's more important is that their eyes are, as always, fixed on where the puck is going to be, not where it has been. So-called "spatial computing" (AR) and self-driving cars. But there's room for the eternal "hobby" Apple TV, the hardware device, that has also tried to be a living room console, despite not having any standout features other than "running tvOS, UIKit and Metal".

There might also be room for a widescreen handheld console, somewhere between the iPhone 15 Pro Max and iPad mini, with integrated control sticks and buttons, with better battery life than most handheld consoles, with better performance and graphics than any similarly built handheld console. In a marketplace where there are chunkier devices with worse battery life just to stream games from someplace else, this would be a standout device. And it would actually be something that "only Apple can do".

Of 15s, Pro

10 years ago, the first Apple Watch came out, and then, as again a year later when the second model came out, a common thought seemed to be: why would you spend "good watch" money on something that won't even last as long as a "good watch"? That won't be a trooper, an heirloom, an object that can serve you well, find its way into the hands (or onto the wrist) of someone else and serve them well before it even gets close to giving up the ghost.

Every now and then, you have to get a new phone. If your favorite prior iPhone model was the iPhone 5S, that doesn't matter – eventually, events will force you onward. There are practical concerns with updating the same thing in perpetuity in a way that doesn't apply to fundamental timekeeping. A modern pocket rectangle has more to concern itself with than the phases of the moon, timers, stopwatch and an hour-minute readout. But things change, looks change, designs change and perfection is fleeting.

So, in a world where all these things are immutable facts, consider this a flag planted at an extreme moment in time. I am holding the 15 Pro in my hand; it looks wonderful and it feels even better. Minimalism, the simple, the essential, used to mean as few geometric shapes, vertices, inflection points as possible. There is no diamond-cut, chamfered edge; there is a smooth, continuous curve, joining the improbable matchup of titanium and glass. It feels like a pebble. It feels organic, it feels natural, it feels as if water has worn off the harshness over decades.

Does it matter? Is it worth the exorbitant (Swedish) price? It doesn't make my heart (or wallet) do flip-flops. But if that's what you have to part with, it should at least be as nice as this to hold in your hand, since that's where it's going to be for such a long time. I don't know what the trends of fashion and the demands of competition will have done by the time this one has expired and will need to be replaced. Hope springs eternal that its successor will approximate the eternal – the eroded, smooth stone; the continuous curve; the mellowed, the gentle, the kind.

John Warnock, RIP

I have never known about John Warnock more than the occasional fact, like "involved in early Adobe" and "did things with early PostScript", but yesterday's news of his untimely passing gave me reason to look further, including reading the 2010 Knowledge at Wharton interview which casts a lot of light on his contribution.

It seems not just an industry executive but a pioneer in historically reverential typesetting (and its intersection with computer graphics) has left us. Sorry for not being curious sooner; pay your respects by marveling at his story and life's work and start with Michael Tsai's roundup, which is where I found the interview.

iA: Unraveling the Digital Markets Act

Point by point, what the provisions in the Digital Markets Act mean.

To the extent that is realistically possible, this is a piece of legislation that plucks the power bestowed upon a few actors from their hands and puts it back into the citizens', the customers', the owners'.

The world is complicated and there are a number of points where the law will force one trade-off to turn into another trade-off. For example, there are the actions affecting the ad market, where the light will fall and land on various actors curiously scurrying away – not the oligopolists themselves (mostly), but the exploitative, get-away-with-whatever-you-can, bonkers actors on the market they created. If they can't do what they do now, I'm not sure they will consign themselves to lives of quiet contemplation and community service. But worrying about whether the cure will be hell is no reason to put off fighting the disease any longer.

I view this as a cornerstone of civil rights and customer rights in the same vein as the GDPR. The EU does not get everything right and is not the foremost authority on how this all should work. But it is in the same place as the United States Government was before passing the Clean Air Act and Clean Water Act. When the corporations involved have decided that they don't feel like doing anything, what else is left to do?

The major technology companies affected by the DMA, to the letter, are acting in self-serving, customer-harming ways because they get away with it. Everyone knows it's unfair. Everyone knows it takes seconds to push a feature flag with the dark pattern or the monopolistic behavior and ages to prosecute. Everyone knows there's no one else between the App Review team and the developer. Everyone knows you can't realistically avoid having many of them in your life, or between you and your bank, friend, employer or government, to the point that anyone attempting a protest is labelled a kook by the same people cheerily asserting that "if they don't like it they should just use something else".

There was another way. This was not inevitable. They just chose not to.

Ars Technica: EU wants “readily removable” batteries in devices soon—but what does that mean?

Ars Technica also digs into the specifics of what "readily removable" means.

The European Parliament is bringing back replaceable phone batteries

...but people are getting it wrong.

TechSpot: "Sleek slabs could soon be a thing of the past"

Most batteries were standalone modular units that could be traded out by releasing a latch and sliding it out, kind of like the battery on cordless power tools today. For phones with "internal" batteries, you'd simply pop off the rear cover of the device, lift the battery out, put a fresh one in, and button it back up.

Manufacturers eventually moved away from easily swappable batteries in favor of "sealed" handsets sporting sleeker designs. Many consumers were vocal about the change but over time, most accepted it as the new norm and moved on. The EU's new rules could force manufacturers to open up the history books for ideas on how to move forward.

It's a reasonable first impulse given our history, but that's not what the actual legislation says.

Let's take a look at Article 11, section 1:

Any natural or legal person that places on the market products incorporating portable batteries shall ensure that those batteries are readily removable and replaceable by the end-user at any time during the lifetime of the product. That obligation shall only apply to entire batteries and not to individual cells or other parts included in such batteries.

A portable battery shall be considered readily removable by the end-user where it can be removed from a product with the use of commercially available tools, without requiring the use of specialised tools, unless provided free of charge with the product, proprietary tools, thermal energy, or solvents to disassemble the product.

Any natural or legal person that places on the market products incorporating portable batteries shall ensure that those products are accompanied with instructions and safety information on the use, removal and replacement of the batteries. Those instructions and that safety information shall be made available permanently online, on a publicly available website, in an easily understandable way for end-users.

In other words, it does not mean "every phone will, Nokia 3310-style, have to have a door that you can flip open and manually replace a battery, without any tools, just with your hands".

It does mean phones can keep looking the way they have been looking, but:

  • You can't use proprietary screws (unless the screwdrivers are sufficiently commercially available — Apple's Pentalobe screws may qualify now, but they wouldn't at the time of introduction when the purpose was to obfuscate and to prevent user repair).
  • For all practical purposes, disassembly or battery installation can't rely on steps that can only be done with factory methods or large proprietary tools.
  • Disassembly can't rely on heating the product up or dissolving adhesive.

As far as I can tell, a phone where you unscrew the screws at the bottom, which disengages the internal frame, where you then use a suction cup to separate the seal enough to then use prying tools to disengage clips enough to flip it open and then access the insides is fully compliant. Phones matching this description have already shipped in the hundreds of millions of units.

This legislation does not say "no more batteries that are not inside easily user-accessible latches". If anything, it says "no more load-bearing adhesive to get inside the product and at the battery".

I will leave the wisdom of legislators dictating technological decisions for another day. But let's agree on the wisdom of understanding the details of those dictates.

Andrew Tsai on Windows emulation inside the Metal "Game Porting Toolkit"

I watched Bring your game to Mac and was wondering what powered this.

The toolkit, announced in the WWDC keynote, wasn't available for download then but is now. And yes, it's true, it uses a combination of Wine, several open source projects, Rosetta translation of x86-64 binaries and recompilation of DirectX 12 shaders to Metal (via Vulkan, seemingly). It is also the only Apple-related project I know to require installing Homebrew.

Getting this to the point where it has significant feature coverage to begin with is a technological achievement. Having it be dependable "enough" for game developers to be able to accurately assess the work involved and not give up after a flurry of mistranslations or hokey support is basically a miracle.

Following Cider and Proton, getting games running on non-Windows platforms via Wine-based approaches is nothing new, but I didn't expect to see it used as a native development on-ramp in this way.

Marques Brownlee's impressions of Apple Vision Pro

Includes a bit at around 13:30 where he gets the same ick I get about 3D photos and "being that guy". Having a full picture of how you come off when using the gimmicky features of their hardware and software is not a strong suit for Apple.

Vision

The Apple Vision Pro now exists.

The iPod was a better MP3 player. The iPad was a better tablet. The Apple Watch was a better watch. The Vision Pro is a better... what, exactly?

It is clearly a technological achievement. It is clearly interesting. It also clearly makes you look like a dork, first when you plonk down $3499 (before the obligatory prescription inserts for the majority of the population that requires vision correction), then when you use it. They did what they could, but it still looks pretty terrible.

The first iPhone had no third-party apps (until the jailbreaking community stepped in and developed an SDK within months), no copy and paste, no GPS, no MMS, no front-facing camera, no good rear camera, and so on. There were phones you could buy with some of those features. It was ahead in software quality and interaction and behind in the details. People bought into it because it was a breath of fresh air and the rest would be filled in over time.

The Apple Vision Pro is possibly, battling the Purple project for the original iPhone or the extended meta-Multi-Touch project that spanned both iPhone and iPad, the largest development project in the company's history (to have spawned a product; hi Titan). It produced a product that is simultaneously not comparatively embarrassing (it stacks up reasonably well against untethered VR headsets), will still clearly get much better within a few years when both hardware is able to deliver more and software grows to be more capable, and lives in a category no one is (currently) interested in.

The success of Apple Vision Pro depends on Apple's ability to bewitch people into using it. There are a handful of practical arguments contrasting with alternative hardware, like getting "infinite display" or "a personal theater", but both are beset with technological limitations and a daunting cost calculus. (The best value appears to be as an immersive TV - but one that can only be used by one person.) Aside from that, it's all about being able to take in entirely new experiences. This, along with the iPad-esque note of "doing what you could do before but now much better", is the only reason for it.

At its core, there is a big source of tension in the product. It is supposed to, like iPhone and iPad before it, make interaction more direct. Bring things not into our periphery or into an indirect plane but right into our reality. But it can only do this when you are wearing it, and when you are wearing it you look like one of the evolutionary steps on the way to the people at the end of Wall-E.

Apple's materials try to counter this by showing a dad preparing breakfast and intercepting stray soccer balls from his sprog. But all the 3D photos and videos that you are supposed to enjoy are also taken by someone having had the product on their face. No doubt, Apple saw the problem with Google Glass "glassholes" and have been trying to cover up some aspects, like by letting the outward facing display show when you are taking a photo. But how anyone is supposed to be comfortable wearing this around in their life enough for the 3D photos and videos to be captured, or comfortable being around people constantly wearing them, is an open question. (Bet on iPhones being able to take 3D photos and videos soon, to give the adoption legs.)

All in all, what gives me hope about the Apple Vision Pro is that it is "good enough to criticize". If it doesn't eliminate the screen door effect (where you can see the gaps between pixels), it will soon. If it costs entirely too much today, it will cost at least marginally less in a few years, probably by having a lower tier which nonetheless will have more processing power.

And the thing about Apple and the thing about the advancement of technology is that this is as clunky as it will get. It will only get sleeker and more capable over time. (It is already a "Pro" - there's not going to be a clunkier, faster "Pro Pro" model.) Whether physical limitations will forever bind you to looking like a dork while wearing it is anyone's guess. But in three years, at $1299, I'm hoping to find out.

Eric Lippert: A long expected update

Today is [..] my last day at Facebook-now-Meta.

My team — Probabilistic Programming Languages — and indeed entire “Probability” division were laid off a couple weeks ago; the last three years of my work will be permanently abandoned.

The mission of the Probability division was to create small teams that applied the latest academic research to real-world at-scale problems, in order to improve other groups’ decision-making and lower their costs. New sub-teams were constantly formed; if they didn’t show results quickly then they were failed-fast; if they did show results then they were reorganized into whatever division they could most effectively lower costs.

We were very successful at this. [..]

We foolishly thought that we would naturally be protected from any layoffs, being a team that reduced costs of any team we partnered with. [..]

The whole Probability division was laid off as a cost-cutting measure. I have no explanation for how this was justified and I note that if the company were actually serious about cost-cutting, they would have grown our team, not destroyed it.

I'm looking forward to hearing about what Eric has been up to, and saddened but barely surprised at the ways bean-counting illusions about "what a company ought to be and do" force hundred-billion-dollar companies to run roughshod over great ideas and fantastic people while stabbing their future selves in the butt. (Significant personal fallout aside, I will try to contain my dismay that it wounds Meta.)

Bryan Cantrill: Coming Of Age

A long and thoughtful talk sparked by a tweet from what I assume is a venture capitalist. People in general but kids and young adults especially deserve more than to be cogs in some exploitative grind factory driven by Disney villain morals in search of "fuck you" money.

BBC News: Elon Musk: Twitter locks staff out of offices until next week

Twitter has told employees that the company's office buildings will be temporarily closed, effective immediately.

In a message seen by the BBC, workers were told that the offices would reopen on Monday 21 November.

It did not give a reason for the move.

The announcement comes amid reports that large numbers of staff were quitting after new owner Elon Musk called on them to sign up for "long hours at high intensity" or leave.

The signs have been there for ages but anyone still enchanted with Mr Musk at this point is simply not paying attention.

What do all these things say about him?

  • I have been a part of many successful ventures. I am technically literate and I know how to play the press. But I took away the wrong lessons about company building, about leadership and about trust.

  • I have survived, as every leader does, because I have let talented and hard-working people do what they wanted and needed to do, which is what every company wants and needs. But since I didn't build this company, since I swooped in at the back of a bad meme which I was too proud to get out of and since it and I are at the crunch of liquidity demands of my own making, all that is no longer relevant. Saving my bacon is relevant. Not having to put up my own money to defend my own folly is relevant.

  • Remember in the 80's when the businessmen bought up companies just to break them up and turn them into monetary assets? Well, no more. It's time to buy up companies and treat them like beleaguered startups. Let's cosplay: I'll be the single venture capitalist who lets half of you go, having built none of it but accusing all of you of sloth, incompetence and insufficient adherence to virtue; the dude-bro with skin an Ångström thick who can't tell productivity from activity; who thinks people working from home are hiding something, while putatively being the irreplaceable engine of more companies than there are weekdays, all at the same time.

  • I am the only judge of what is correct. I, or possibly hand-picked people who I have worked with previously, will grasp the nuance and necessity of everything, including things to which I have not been previously exposed. Chesterton's fence is for lesser men. On every team, no one would do productive work unless I was present to observe them or lead their efforts. In short, Edward Mike Davis had the right idea, but he was small-time.

Nilay Patel: "Welcome to hell, Elon"

Twitter, the company, makes very little interesting technology; the tech stack is not the valuable asset. The asset is the user base: hopelessly addicted politicians, reporters, celebrities, and other people who should know better but keep posting anyway. You! You, Elon Musk, are addicted to Twitter. You’re the asset. You just bought yourself for $44 billion dollars.

[..]

You can’t deploy AI at this problem: you have to go out and defend the actual First Amendment against the bad laws in Texas and Florida, whose taxes you like and whose governors you seem pretty fond of. Are you ready for what that looks like? Are you ready to sit before Congress and politely decline to engage in their content capture sessions for hours on end? Are you ready to do any of this without the incredibly respected policy experts whose leader you first harassed and then fired? This is what you signed up for. It’s way more boring than rockets, cars, and rockets with cars on them.

Babel Lecture 2022 with Stephen Fry

In defense of everyone's birthright to have fun with, develop, expand and own their language.

Connected

Pick up a book, read an article or watch a clip from the past 200 years or so centered on what people find admirable, and there they are. "Renaissance men" - people who know a lot about a lot. The Valve employee handbook put it differently: T-shaped people, "people who are both generalists and also experts".

But while the upsides of this broad mind have been extolled and somewhat substantiated over the years, it has mostly been left unsaid how to go about acquiring it. "Go to University!" and "Just go learn what you want to do and follow your nose!" are seemingly incompatible pieces of advice.

For all of the ills of social media, for all the ways a misleading fact, fake story, damaging, made-up rumor can lap Twitter while the truth is putting its shoes on, the influx of information in our lives means that you can engage with people's experience in a way that didn't use to be possible. And I mean experience, not mere "experiences".

Just this week, I somehow absorbed information about how "stroads" are a mismatched half-way point between streets and roads, leading small areas that should remain human-scale and personable to be torn apart in an effort to look like a big city; and how open shelves should be used for things you want to display and cabinets and other storage used to put stuff away. The first doesn't affect me all that much since I'm not an urban planner (although it seems neither are many of the purported urban planners), but it addresses a vague churning in my stomach I've had when I've been to some locations that just didn't feel right to me. And the second one seems almost too easy, but it introduced a distinction that I hadn't thought about. I can't find the link, but the architect in question pointed out that with so many naked open cubbies like the IKEA Kallax, it is a distinction that many people do not observe, and they live with cluttered furniture exposing incidental objects to dust, instead of storing those objects safely and showcasing the things they really care about.

Work hard to make money and spend it on objects and you may hear: "you can't take it with you". In a way, the same thing is true for ideas and knowledge outside of your domain and sphere of interest; if you amass a vertical collection of Zippo lighters, at least it will be left for someone when you're gone. Given that, what good is knowing the intricacies of performing Barrier Skip when you don't speedrun Wind Waker, of manufacturing Panko bread crumbs when you aren't that one company in Japan, or of aging a flat file cabinet when you haven't negotiated a band saw in more than a decade?

You have to feed the soul too. I'm a programmer, a developer, a problem solver. I like finding out about new domains, expanding my knowledge about them, slowly getting a grip on them, realizing there's so much there that someone else could be, and probably is, living their whole life within that domain and barely gets to call themselves an expert. If you can muscle your way past the worst parts of Dunning-Kruger, you may find an interesting spot where you simultaneously understand that there's a whole lot you don't know, and that you have a better understanding of a small part of it than you thought you ever would. That's invigorating to me.

The reason the T-shaped people are revered isn't because they had read more books than others. It's because often they could see the same thing from multiple angles. As the kids today might put it, they were full-stack - or at least multi-faceted. In a way, they were multiple people at once.

I have never quite gotten in the habit of reading books, which is ironic because of how much we all read now, all day, every day. But putting aside their use as a mechanism for control and indoctrination where it was used to narrow thinking, the traditional promise of books is to widen thinking, to carry the results of someone's research, someone's lived experiences, someone's deep thinking, through the ages, from before sewers to after personal meal delivery apps.

If you can manage to dodge the divisive, conspiratorial, resentful, regressive people who make their living telling people how much of a shame it is that it isn't the fifty years ago it never was, be those people celebrated authors in the before times or producers of sputtering self-centered video podcasts today, there are plenty of good things left.

I don't know about you, but I've been spending a lot of time worrying about the future, being crushed by increasing complexity privately, professionally and in current world events. There's "sharpening your saw" and becoming better at exactly what you do. There's "turning off" by vegetating to the cheap, the mass-produced, entry-level "SEO-optimized" or corporate-approved mulch. But aside from also allowing yourself rest and disconnection, how about recognizing that something that activates, engages and challenges you can also avoid feeling instinctually like work, like your responsibility, like something you ought to fix, like details you need to commit to memory or like noise you have to endure but that never means anything?

It can be entertaining because it catches your mind off guard, in a curious, open state where it doesn't have any notion of what's happening next and doesn't feel the urge to check phone notifications. It can be instructive and give you a lesson to tuck away for the future. It can jump into your mesh of neurons and trigger a connection for a problem you've been wrestling with for months. Or it can just be pleasing to listen to or watch in a world where your own mobility is limited.

Functional UI

I'm writing this article leaning against some nameless architectural mistake, and I am not writing the article on a Mac. I would, but my PowerBook is fresh out of power (funny notion, to name the thing after its only major shortcoming; it's rather like Greenland in that respect.) [emphasis editor's]

Douglas Adams, The Little Computer that Could, 1998

Linked in popular places today is Marcel Weiher's UIs Are Not Pure Functions of the Model - React.js and Cocoa Side by Side from 2018, which delves into why React's functional state muckery appears useless.

This is a subject I give a lot of thought from many angles, so there are many things to note:

  • The post appears snarky, but bases its attitude on the notion that Cocoa is a thought-out, capable, battle-tested, baked user interface framework. For example, for the point where the React side focuses on being able to provide fit-for-purpose lists, the Cocoa side highlights that nothing special needs to be done since, roughly, we know how to use our lists and we know how to design model classes to hook into them.

  • The functional nature of React, according to the origin story - retold many times, but here citing Pete Hunt in 2014's Rethinking Web App Development at Facebook beginning at around 24:00 - is that all the collective state changes boil down to one type of update, and if we could just handle that well, we could stop the complexity from being spread out across 500 minor pokes to the DOM from across as many JavaScript files. Pete says "we built a user interface library called React, designed to solve the hardest problem in this space, which is that data changing over time is the root of all evil".

  • The major, primordial feature of SwiftUI, as explained by Kyle Macomber in WWDC 2019's Introducing SwiftUI session beginning at around 24:00, is collapsing the number of reasons things can happen to views down to one. Kyle says "UI programming is hard - like 'lock free concurrency' hard" and refers to an "explosion" of possible orderings of events. These may not be identical arguments, but they are at least parallel arguments within their respective ecosystems.

  • The notion that Cocoa's views are baked and done and ready has also taken a hit in the last decade or so. As an example, when Mac table views stopped being primarily NSCell-based, they became NSView-based, and responding to selection and picking a contrasting color to the background is now a halfway magical process, when it used to be entirely managed for you. And of course, the desire to go to more custom and higher production value presentation requires even more code to stray from the default. (Even if collection views, for example, are excellent tools to get a grip on this complexity and focus on the behavior you want.)
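Setting the framework specifics aside, both of those arguments describe the same move: funnel every change through a single update path. Here is a minimal, framework-free sketch of that idea; the model, action and function names are mine for illustration, not React's or SwiftUI's API:

```typescript
// Every change to the app's state flows through one reducer function,
// rather than being spread across 500 minor pokes from as many files.
type Action =
  | { kind: "setTitle"; title: string }
  | { kind: "addItem"; item: string };

interface Model {
  title: string;
  items: string[];
}

function reduce(model: Model, action: Action): Model {
  switch (action.kind) {
    case "setTitle":
      return { ...model, title: action.title };
    case "addItem":
      return { ...model, items: [...model.items, action.item] };
  }
}

// The UI would then be re-derived from the model after each update:
// one reason for things to happen to views, not an explosion of orderings.
let model: Model = { title: "", items: [] };
model = reduce(model, { kind: "setTitle", title: "Groceries" });
model = reduce(model, { kind: "addItem", item: "Milk" });
```

The payoff both talks claim is the same: with one update path, "what can have changed, and in what order" stops being a combinatorial question.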

View models and functional UI look like solutions, and they are indeed effective ways of managing complexity by making all the constituent state visible and enumerated. But in my experience they also encourage a way of programming where you bind as much as possible, and the problem with that is that, as the title of the linked post notes, UIs are not pure functions of the models.

If you go from one place in your UI to another, you may want to be stopped because there are things that don't validate or don't fly. You may have pending changes that should neither automagically apply nor be lost. Both SwiftUI and React have state as a first-class concept and are theoretically well-equipped to handle this; the trouble is that we don't have a handle on it.

We don't know how to think in state. For the edited-but-not-saved text of a text field, sure. For the in-progress-but-not-committed changes in what is at least partially modal, somewhere in the UI? Hm, well, that sounds like a tree of Redux reducers - and therefore model data - or a bunch of nested view models to me. The SwiftUI talk mentions "sources of truth" a lot. Here, the source of truth for the hitherto unsaved data is nebulous. Living in 107 state variables? Living in provisionally updated properties of an ObservableObject that is kept uncommitted from the database or real source of truth?

The key to great user interfaces is that they work the way the person expects. That constantly - not always, but constantly - requires making the mental model of the user interface richer. When an item is dragged around in a list, there should be indications about where the new item would land if you dropped it and you should be able to cancel it. If the list is hierarchical, you should be able to have things open for you in the middle of the drag. When you type blindly into the list, the list should select the next best item depending on what you typed. When you have a list of items and the ability to close one of them, you should also be able to close all others. When you can select one item and edit some of its information, you should be able to select multiple items, see where the information agrees and edit the information en masse, applying it to all of them.

Great user interfaces take time and effort and forethought and respect for the user. Not everyone may wish to always create a great user interface for all things at all times. There's a time and place for all ends of the spectrum and all levels of ambition. But UI frameworks should think about how they can, to quote the most unintuitive thing I know, make simple things simple and hard things possible.

Because no matter how much support the framework provides, hard things remain. Thinking back to the not-yet-committed changes, before anything is saved, hitting Undo should Undo whatever is possible to Undo. After the save operation has happened, Undo should magically switch to either doing nothing or Undoing the entire change. Figuring that out requires thinking through what the model of the user interface needs to be to be closest to the user's mental model. After you have done that, the last thing you need is the framework fighting you because it did not anticipate your need for this type of control.
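As a sketch of the shape of that requirement - not of any framework's actual Undo machinery, and with all names invented - an undo stack can record fine-grained steps and then, on save, collapse everything recorded so far into one composite step:

```typescript
// An undo stack where individual edits are undoable before a save,
// and a save collapses everything before it into one composite step.
type Undoable = { label: string; undo: () => void };

class UndoStack {
  private steps: Undoable[] = [];

  push(step: Undoable): void {
    this.steps.push(step);
  }

  // On save, replace all recorded steps with a single step that
  // undoes them in reverse order: "Undo the entire change".
  collapseOnSave(label: string): void {
    const batch = this.steps.splice(0);
    if (batch.length === 0) return;
    this.steps.push({
      label,
      undo: () => {
        for (const step of batch.reverse()) step.undo();
      },
    });
  }

  undo(): string | undefined {
    const step = this.steps.pop();
    if (!step) return undefined;
    step.undo();
    return step.label;
  }
}
```

Whether the composite step should instead become a no-op after save is exactly the kind of decision the framework cannot make for you.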

Frameworks should be built by and for the people who want to create the great user interfaces, the ones that anticipate your needs by assuming that you want to have as much and as expedient control as possible over whatever the user interface is about. Where a list means being able to copy and paste the selected items, drag and drop the items to rearrange them, drag and drop the items holding a key to copy them, hold a key to check/uncheck all checkboxes, resize all columns to fit the width of their respective contents, sort columns by clicking the headers, drag and drop the columns to reorder them and, yes, type into the list blindly to jump to the item with the text prefix.

It is the case that many things are presented better on the web when they use, say, a multiple-row, "layouted" form of presentation. Where for good reason, things should not just look like a list. The chat messages example from the React talk should not look like a list. But the big failure of the web's style of user interfaces is that everything is custom. On the web you have to make a gargantuan effort to build everything from the ground up and work well on every device. On the desktop (and in some ways on the tablet or on the phone), you just have to make a concerted effort to use the features that are already there.

So, wrapping around. I don't think functional is the right idea. I don't think mutation from fifty angles is the right idea. I don't think controls or MVC guide you sufficiently towards how your code should be structured to best serve the user interface, without bugs and allowing interactivity and reactivity. My answer is that I have no answer, but to look out for seemingly perfect answers. Making everything look like the web is not a good idea. Making everything look like not the web, when you have people expecting the web, and who have never clicked to sort a table column in their life, is probably also not a good idea.

A pure function transforms an input value to an output value. If the idea was for a reader to go into this post with one idea and come out with another, this post is as impure as it gets. But maybe it, and 500 other random bits from other random ideas in other random places, will end up collectively giving you a clearer, more nuanced picture over time.

I consider that an imperative.
