Take

Humain't

The Humane Ai Pin has been announced, a phone alternative trying its best not to be a phone in any way. With Humane famously spearheaded by ex-Apple luminaries Imran Chaudhri (with large amounts of the iPhone and multi-touch user experience to his name) and Bethany Bongiorno (a Director of Software Engineering from the launch of the original iPad), and counting among its ranks Ken Kocienda (part of the initial Safari/WebKit team and designer of the first software keyboard and typing autocorrect), I find myself wondering what I'm missing.

There's the ambition, the philosophical thrust behind the product itself: people bewitched by apps, addicted to constant impulses be they doom-scrolling or drip-feeding entertainment. The desire to break free of the neck-craning prison of the pocket rectangle is understandable. (It's also been used as a siren song for both Windows Phone and Apple Watch before.)

There's the technological moment in time. AI personal assistants have been available for years, and recent breakthroughs mean this is probably the first time a device of this type could do what it promises — remember things, relate them to the current location, time and context — with convincing accuracy, well enough to be basically the only interface, the only contact surface. Not a parlor trick which you can ignore if you want; a small projected readout in your palm aside, talking to it and having it understand what you mean and do it is the whole product.

The Humane Ai Pin didn't happen by chance and was not lazily extracted from between the couch cushions. A lot of talented people spent a lot of time at it, clearly chasing a deep vision.

So why does it seem so terribly, undeniably off?

There is a precipitous cliff for anything beyond "talking to the magical AI", where viewing photos and videos and managing settings all happen by going to a ".Center" site in a browser. The product site features food delivery and messaging between friends, two things that are well handled by apps today and that look dreadful to handle via voice entry or the projected palm interface, more fit for haikus than menus. But the "cosmos" operating system leans into this, supposedly free of all types of apps. So much for growing pains.

I am not the first to react strongly to this, but I am probably uncommon in my intense dislike for personal assistant AIs, a dislike that obviously flares to new heights in a product so heavily focused on them. The Humane site harps on privacy and trust, but what is private about being forced to live your life out loud; to not be able to jot a thought down silently? Were these things even discussed on a fundamental level during the considerable ideation, or was anyone raising them just seen as the bearer of the bad culture, steeped in the musky scent of old magics?

A little experience can be a dangerous thing. Having gone through a world-changing evolution of how most people interact with personal technology, I understand if people think "in the beginning, they will laugh at you and say that the keyboard will not work until the screen can deliver convincing tactile feedback echoing physical buttons, but look at what happened; the world adapted and we won". I understand if some of the people involved feel a strange mix of regret at what this new technology, and everything that happened in its wake, has wrought, alongside the professional and curious imperative to do it again by taking the next leap, to unwind the next impediment to the machine just knowing what you mean to do when interacting with it.

If walking around in the world but looking at a screen because you're reading something is being absorbed by something else and not being present, then tapping a pocket square and talking to a virtual assistant about the same thing you would accomplish if you had a screen is also not being present.

In the scale of things, what exists is a technological achievement (and taking the recent progress and extrapolating five more years, might be even more so). There's just no compelling reason for anyone to throw the things that already exist overboard to use it and only it. (Maybe if you believe so strictly in the mission of Humane that aligning with it overrides every practical concern and dresses the contortions up in adherence to a more enlightened existence.)

This doesn't sound like an insurmountable issue to me; for it to be a purposeful, focused device, used in addition to other things and also free from the burden, whether catered to or not, of having to be all the other things. But I'm not sure the people who would found Humane would want to go down that road.

Macs Schreck

Space Black looks nice, but the one I'd want starts at $7,984 (Swedish price), and even an ~82% Xcode build uplift from the M1 Max is still not worth it when I hardly ever feel held back by this chip.

Apple Should Create a Handheld Game Console

I have no idea what's in store for Monday night's Scary Fast event, and although some have noted that the curiously late event overlaps with Japanese business hours (and yours truly's wee hours), hinting at a gaming focus, this is not a prediction of what's to come.

That said, Apple should create a handheld game console.

Why? Because I think it would be a pretty good option.

The highest-volume handheld console is the Nintendo Switch, slated for an upgrade before the end of next year but as yet running Nvidia's 2015 platform Tegra X1 (the Switch launched in 2017, and even the pre-launch NX rumors focused on a revised and upgraded Tegra X1), true to Gunpei Yokoi's lateral thinking with withered technology.

It has sufficient capability and beautiful games are still being put out for it, but it is also slightly comical that it can't output 4K, HDR or above 60 Hz, or often maintain a respectable, consistent framerate in 3D games. (Tears of the Kingdom was marred by what looks like AI upscaling artifacting in the Zonai shrines and some background set details being low-poly enough for the Nintendo 64.)

The exemplar for the rest of the handheld consoles is Valve's Steam Deck, running a mobile AMD Ryzen "APU" (CPU with integrated GPU) with RDNA graphics, being sufficiently top of the line sufficiently recently that it packs a believable punch. It runs Linux and Valve's Proton layer for DirectX emulation (for Windows titles; Linux-targeted titles run natively with Vulkan or OpenGL) and is able to support a generous portion of the Steam game library, especially now that developers see it as an option.

Beyond the Steam Deck lies a sea of similar portable-PCs-as-game-consoles-with-joysticks-and-buttons of varying capacities, capabilities and outcomes. The Aya brand seems to be at the top of the pack.

Because it's got everything we need

In 2007, during the introduction of the first iPhone, Steve Jobs explained the selection of "OS X" for the platform (deliberately leaving out the "Mac" part of "Mac OS X"), leading in with this:

Why would we want to run such a sophisticated operating system on a mobile device? Well, because it's got everything we need.

Apple itself is in a similar place for a handheld game console.

  • It has famously high-performing but power-sipping M-series chips, and a mature software platform to go with them.

  • It has a hardware-hugging graphics layer with Metal, along with a series of underdog but steadily improving GPUs that are now capable of ray tracing under mobile device conditions (power, heat).

  • It has unified memory, letting the GPU have access to a large amount of working memory, physically close, with a high bandwidth link and shared with the CPU.

  • It has a handful of honest-to-goodness AAA games, which are as of recent developments capable of delivering graphics on par with their current, high-end, living room console versions.

  • It has high-bandwidth, integrated solid state storage management. Doesn't everything with PCI Express and NVMe support have this? Both PlayStation 5 and Xbox Series X have solutions for working with data right from the SSDs in ways that bypass the file system and caching; loading data from optical media died this console generation. Doing this from within the same chip as the GPU, there could be great possibilities.

  • It has a growing awareness that gaming, while enjoyable on a touchscreen if you do it right and with some games, is also measurably improved with the tactility, precision, feedback and haptics that come from a controller.

  • And finally, it does have Apple Arcade, some sort of first-party or at least exclusive lineup of games. I am not personally acquainted enough with them to know if they're good, but it's a starting point.

Any pretender to the handheld gaming throne would champ at the bit to have these parts at their disposal. Nintendo, Sony, Microsoft or Valve probably would not say no to integrating a thing or two, like the Apple Silicon architecture ARM cores, into their own pipelines.

Vision

What's left is just whether they will do it. Unless there are reveals coming on Monday to recontextualize the past few years, Apple doesn't get gaming. They don't have cultural credibility. From the non-console side, they are laughed at for not making PC tradeoffs (flexibility and performance). From the console side, Apple hasn't made a game.

But, in a handheld console, PC tradeoffs are either irrelevant or likely to hurt. And both Microsoft and Sony were in similarly culturally philistine positions when they embarked on the Xbox and PlayStation respectively. The bigger issue is the bumbling with which Apple has tried to get into gaming, always doing just the wrong thing: holding parties for getting traction with individual games and extrapolating a golden future that so far has not arrived, instead marking a precipitous drop from a few years ago, when the Mac was comparatively riding high, to being passed on the sidelines by desktop Linux, of all platforms. We'll leave the open warfare against one of the two major game engines and one of the biggest game studios, and the mutual disenchantment with one of the two major GPU vendors, for another day.

What's more important is that their eyes are, as always, fixed on where the puck is going to be, not where it has been. So-called "spatial computing" (AR) and self-driving cars. But there's room for the eternal "hobby" Apple TV, the hardware device, that has also tried to be a living room console, despite not having any standout features other than "running tvOS, UIKit and Metal".

There might also be room for a widescreen handheld console, somewhere between the iPhone 15 Pro Max and iPad mini, with integrated control sticks and buttons, with better battery life than most handheld consoles, with better performance and graphics than any similarly built handheld console. In a marketplace where there are chunkier devices with worse battery life just to stream games from someplace else, this would be a standout device. And it would actually be something that "only Apple can do".

Of 15s, Pro

When the first Apple Watch came out eight years ago, and again a year later when the second model came out, a common thought seemed to be: why would you spend "good watch" money on something that won't even last as long as a "good watch"? That won't be a trooper, an heirloom, an object that can serve you well, find its way into the hands (or onto the wrist) of someone else and serve them well before it even gets close to giving up the ghost.

Every now and then, you have to get a new phone. If your favorite prior iPhone model was iPhone 5S, that doesn't matter – eventually, events will force you there. There are practical concerns with updating the same thing in perpetuity in a way that doesn't apply to fundamental timekeeping. A modern pocket rectangle has more to concern itself with than the phases of the moon, timers, stopwatch and an hour-minute readout. But things change, looks change, designs change and perfection is fleeting.

So, in a world where all these things are immutable facts, consider this a flag planted at an extreme moment in time. I am holding the 15 Pro; it looks wonderful and feels even better in the hand. The minimalism, the simple, the essential used to mean as few geometric shapes, vertices and inflection points as possible. There is no diamond-cut, chamfered edge; there is a smooth, continuous curve joining the improbable matchup of titanium and glass. It feels like a pebble. It feels organic, it feels natural, it feels as if water has worn away the harshness over decades.

Does it matter? Is it worth the exorbitant (Swedish) price? It doesn't make my heart (or wallet) do flip-flops. But if that's what you have to part with, it should at least be as nice as this to hold in your hand, since that's where it's going to be for such a long time. I don't know what the trends of fashion and the demands of competition will have done by the time this one has expired and will need to be replaced. Hope springs eternal that its successor will approximate the eternal – the eroded, smooth stone; the continuous curve; the mellowed, the gentle, the kind.

John Warnock, RIP

I have never known about John Warnock more than the occasional fact, like "involved in early Adobe" and "did things with early PostScript", but yesterday's news of his untimely passing gave me reason to look further, including reading the 2010 Knowledge at Wharton interview which casts a lot of light on his contribution.

It seems not just an industry executive but a pioneer in historically reverential typesetting (and its intersection with computer graphics) has left us. Sorry for not being curious sooner; pay your respects by marveling at his story and life's work and start with Michael Tsai's roundup, which is where I found the interview.

iA: Unraveling the Digital Markets Act

Point by point, what the provisions in the Digital Markets Act mean.

To the extent that is realistically possible, this is a piece of legislation that plucks the power bestowed upon a few actors from their hands and puts it back into the citizens', the customers', the owners'.

The world is complicated, and there are a number of points where the law will force one trade-off to turn into another. For example, there are the actions affecting the ad market, where the light will fall on various actors curiously scurrying away – not the oligopolists themselves (mostly), but the exploitative, get-away-with-whatever-you-can, bonkers actors on the market they created. If they can't do what they do now, I'm not sure they will consign themselves to lives of quiet contemplation and community service. But worrying about whether the cure will be hell is no reason to put off fighting the disease any longer.

I view this as a cornerstone of civil rights and customer rights in the same vein as the GDPR. The EU does not get everything right and is not the foremost authority on how this all should work. But it is in the same place as the United States Government was before passing the Clean Air Act and Clean Water Act. When the corporations involved have decided that they don't feel like doing anything, what else is left to do?

The major technology companies affected by the DMA, to the letter, are acting in self-serving, customer-harming ways because they get away with it. Everyone knows it's unfair. Everyone knows it takes seconds to push a feature flag with the dark pattern or the monopolistic behavior and ages to prosecute. Everyone knows there's no one else between the App Review team and the developer. Everyone knows you can't realistically avoid having many of them in your life, or between you and your bank, friend, employer or government, to the point that anyone attempting a protest is labelled a kook by the same people cheerily asserting that "if they don't like it they should just use something else".

There was another way. This was not inevitable. They just chose not to take it.

Ars Technica: EU wants “readily removable” batteries in devices soon—but what does that mean?

Ars Technica also digs into the specifics of what "readily removable" means.

The European Parliament is bringing back replaceable phone batteries

...but people are getting it wrong.

TechSpot: "Sleek slabs could soon be a thing of the past"

Most batteries were standalone modular units that could be traded out by releasing a latch and sliding it out, kind of like the battery on cordless power tools today. For phones with "internal" batteries, you'd simply pop off the rear cover of the device, lift the battery out, put a fresh one in, and button it back up.

Manufacturers eventually moved away from easily swappable batteries in favor of "sealed" handsets sporting sleeker designs. Many consumers were vocal about the change but over time, most accepted it as the new norm and moved on. The EU's new rules could force manufacturers to open up the history books for ideas on how to move forward.

It's a reasonable first impulse given our history, but that's not what the actual legislation says.

Let's take a look at Article 11, section 1:

Any natural or legal person that places on the market products incorporating portable batteries shall ensure that those batteries are readily removable and replaceable by the end-user at any time during the lifetime of the product. That obligation shall only apply to entire batteries and not to individual cells or other parts included in such batteries.

A portable battery shall be considered readily removable by the end-user where it can be removed from a product with the use of commercially available tools, without requiring the use of specialised tools, unless provided free of charge with the product, proprietary tools, thermal energy, or solvents to disassemble the product.

Any natural or legal person that places on the market products incorporating portable batteries shall ensure that those products are accompanied with instructions and safety information on the use, removal and replacement of the batteries. Those instructions and that safety information shall be made available permanently online, on a publicly available website, in an easily understandable way for end-users.

In other words, it does not mean "every phone will, Nokia 3310-style, have to have a door that you can flip open and manually replace a battery, without any tools, just with your hands".

It does mean phones can keep looking the way they have been looking, but:

  • You can't use proprietary screws (unless the screwdrivers are sufficiently commercially available — Apple's Pentalobe screws may qualify now, but they wouldn't at the time of introduction when the purpose was to obfuscate and to prevent user repair).
  • For all practical purposes, disassembly or battery installation can't rely on steps that can only be done with factory methods or large proprietary tools.
  • Disassembly can't rely on heating the product up or dissolving adhesive.

As far as I can tell, a phone where you unscrew the screws at the bottom, which disengages the internal frame, and where you then use a suction cup to separate the seal enough to use prying tools to disengage clips, flip it open and access the insides is fully compliant. Phones matching this description have already shipped in the hundreds of millions of units.

This legislation does not say "no more batteries that are not inside easily user-accessible latches". If anything, it says "no more load-bearing adhesive to get inside the product and at the battery".

I will leave the wisdom of legislators dictating technological decisions for another day. But let's agree on the wisdom of understanding the details of those dictates.

Andrew Tsai on Windows emulation inside the Metal "Game Porting Toolkit"

I watched Bring your game to Mac and was wondering what powered this.

The toolkit, announced in the WWDC keynote, wasn't available for download then but is now. And yes, it's true, it uses a combination of Wine, several open source projects, Rosetta translation of x86-64 binaries and recompilation of DirectX 12 shaders to Metal (via Vulkan, seemingly). It is also the only Apple-related project I know to require installing Homebrew.

Getting this to the point where it has significant feature coverage to begin with is a technological achievement. Having it be dependable "enough" for game developers to be able to accurately assess the work involved and not give up after a flurry of mistranslations or hokey support is basically a miracle.

Following Cider and Proton, getting games running on non-Windows platforms via Wine-based approaches is nothing new, but I didn't expect to see it used as a native development on-ramp in this way.

Marques Brownlee's impressions of Apple Vision Pro

Includes a bit at around 13:30 where he gets the same ick I get about 3D photos and "being that guy". Having a full picture of how you come off when using the gimmicky features of their hardware and software is not a strong suit for Apple.

Vision

The Apple Vision Pro now exists.

The iPod was a better MP3 player. The iPad was a better tablet. The Apple Watch was a better watch. The Vision Pro is a better... what, exactly?

It is clearly a technological achievement. It is clearly interesting. It also clearly makes you look like a dork, first when you plonk down $3499 (before the obligatory prescription inserts for the majority of the population that requires correction), then when you use it. They did what they could, but it still looks pretty terrible.

The first iPhone had no third-party apps (until the jailbreaking community stepped in and developed an SDK within months), no copy and paste, no GPS, no MMS, no front-facing camera, no good backside camera, and so on. There were phones you could buy with some of those features. It was ahead in software quality and interaction and behind in the details. People bought into it because it was a breath of fresh air and the rest would be filled in over time.

The Apple Vision Pro is possibly (rivaled only by the Purple project for the original iPhone, or the extended meta-Multi-Touch project that spanned both iPhone and iPad) the largest development project in the company's history to have spawned a product (hi, Titan). It produced a product that is simultaneously not comparatively embarrassing (it stacks up reasonably well against untethered VR headsets), will still clearly get much better within a few years when the hardware is able to deliver more and the software grows more capable, and lives in a category no one is (currently) interested in.

The success of Apple Vision Pro depends on Apple's ability to bewitch people into using it. There are a handful of practical arguments contrasting with alternative hardware, like getting "infinite display" or "a personal theater", but both are beset with technological limitations and a daunting cost calculus. (The best value appears to be as an immersive TV - but one that can only be used by one person.) Aside from that, it's all about being able to take in entirely new experiences. This, along with the iPad-esque note of "doing what you could do before but now much better", is the only reason for it.

At its core, there is a big source of tension in the product. It is supposed to, like iPhone and iPad before it, make interaction more direct. Bring things not into our periphery or into an indirect plane but right into our reality. But it can only do this when you are wearing it, and when you are wearing it you look like one of the evolutionary steps on the way to the people at the end of Wall-E.

Apple's materials try to counter this by showing a dad preparing breakfast and intercepting stray soccer balls from his sprog. But all the 3D photos and videos that you are supposed to enjoy are also taken by someone having had the product on their face. No doubt, Apple saw the problem with Google Glass "glassholes" and has been trying to mitigate some aspects, for instance by letting the outward-facing display show when you are taking a photo. But how anyone is supposed to be comfortable wearing this around in their life enough for the 3D photos and videos to be captured, or comfortable being around people constantly wearing them, is an open question. (Bet on iPhones being able to take 3D photos and videos soon, to give the adoption legs.)

All in all, what gives me hope about the Apple Vision Pro is that it is "good enough to criticize". If it doesn't eliminate the screen door effect (where you can see the cracks between pixels), it soon will. If it costs entirely too much today, it will cost at least marginally less in a few years, probably by way of a lower tier which will nonetheless have more processing power.

And the thing about Apple and the thing about the advancement of technology is that this is as clunky as it will get. It will only get sleeker and more capable over time. (It is already a "Pro" - there's not going to be a clunkier, faster "Pro Pro" model.) Whether physical limitations will forever bind you to looking like a dork while wearing it is anyone's guess. But in three years, at $1299, I'm hoping to find out.

Eric Lippert: A long expected update

Today is [..] my last day at Facebook-now-Meta.

My team — Probabilistic Programming Languages — and indeed entire “Probability” division were laid off a couple weeks ago; the last three years of my work will be permanently abandoned.

The mission of the Probability division was to create small teams that applied the latest academic research to real-world at-scale problems, in order to improve other groups’ decision-making and lower their costs. New sub-teams were constantly formed; if they didn’t show results quickly then they were failed-fast; if they did show results then they were reorganized into whatever division they could most effectively lower costs.

We were very successful at this. [..]

We foolishly thought that we would naturally be protected from any layoffs, being a team that reduced costs of any team we partnered with. [..]

The whole Probability division was laid off as a cost-cutting measure. I have no explanation for how this was justified and I note that if the company were actually serious about cost-cutting, they would have grown our team, not destroyed it.

I'm looking forward to hearing about what Eric has been up to, and I'm saddened but barely surprised at the ways bean-counting illusions about "what a company ought to be and do" force hundred-billion-dollar companies to run roughshod over great ideas and fantastic people while stabbing their future selves in the butt. (Significant personal fallout aside, I will try to contain my dismay that it wounds Meta.)

Bryan Cantrill: Coming Of Age

A long and thoughtful talk sparked by a tweet from what I assume is a venture capitalist. People in general but kids and young adults especially deserve more than to be cogs in some exploitative grind factory driven by Disney villain morals in search of "fuck you" money.

BBC News: Elon Musk: Twitter locks staff out of offices until next week

Twitter has told employees that the company's office buildings will be temporarily closed, effective immediately.

In a message seen by the BBC, workers were told that the offices would reopen on Monday 21 November.

It did not give a reason for the move.

The announcement comes amid reports that large numbers of staff were quitting after new owner Elon Musk called on them to sign up for "long hours at high intensity" or leave.

The signs have been there for ages but anyone still enchanted with Mr Musk at this point is simply not paying attention.

What do all these things say about him?

  • I have been a part of many successful ventures. I am technically literate and I know how to play the press. But I took away the wrong lessons about company building, about leadership and about trust.

  • I have survived, as every leader does, because I have let talented and hard-working people do what they wanted and they needed to do, which every company wants and needs. But since I didn't build this company, since I swooped in at the back of a bad meme which I was too proud to get out of, and since it and I are at the crunch of liquidity demands of my own making, all that is no longer relevant. Saving my bacon is relevant. Not having to put up my own money to defend my own folly is relevant.

  • Remember in the 80's when the businessmen bought up companies just to break them up and turn them into monetary assets? Well, no more. It's time to buy up companies and treat them like beleaguered startups. Let's cosplay, I'll be the single venture capitalist who lets half of you go, having built nothing of it but accusing all of you of sloth, incompetence and insufficient adherence to virtue; the dude-bro with Ångström-thin skin who can't tell productivity from activity; who thinks people working from home are hiding something, while putatively being the irreplaceable engine of more companies than there are weekdays, all at the same time.

  • I am the only judge of what is correct. I, or possibly hand picked people who I have worked with previously, will grasp the nuance and necessity of everything, including things to which I have not been previously exposed. Chesterton's fence is for lesser men. On every team, no one would do productive work unless I was present to observe them or lead their efforts. In short, Edward Mike Davis had the right idea, but he was small-time.

Nilay Patel: "Welcome to hell, Elon"

Twitter, the company, makes very little interesting technology; the tech stack is not the valuable asset. The asset is the user base: hopelessly addicted politicians, reporters, celebrities, and other people who should know better but keep posting anyway. You! You, Elon Musk, are addicted to Twitter. You’re the asset. You just bought yourself for $44 billion dollars.

[..]

You can’t deploy AI at this problem: you have to go out and defend the actual First Amendment against the bad laws in Texas and Florida, whose taxes you like and whose governors you seem pretty fond of. Are you ready for what that looks like? Are you ready to sit before Congress and politely decline to engage in their content capture sessions for hours on end? Are you ready to do any of this without the incredibly respected policy experts whose leader you first harassed and then fired? This is what you signed up for. It’s way more boring than rockets, cars, and rockets with cars on them.

Babel Lecture 2022 with Stephen Fry

In defense of everyone's birthright to have fun with, develop, expand and own their language.

Connected

Pick up a book, read an article, or watch a clip from the past 200 years or so centered on what people find admirable, and there they are: "Renaissance men" - people who know a lot about a lot. The Valve employee handbook put it differently: T-shaped people, "people who are both generalists and also experts".

But while the upsides of this broad mind have been extolled and somewhat substantiated over the years, it has mostly been left unsaid how to go about it. "Go to University!" and "Just go learn what you want to do and follow your nose!" are seemingly incompatible pieces of advice.

For all of the ills of social media, for all the ways a misleading fact, a fake story, or a damaging made-up rumor can lap Twitter while the truth is putting its shoes on, the influx of information in our lives means that you can engage with people's experience in a way that didn't use to be possible. And I mean experience, not mere "experiences".

Just this week, I somehow absorbed information about how "stroads" are a mismatched half-way point between streets and roads, leading small areas that should remain human-scale and personable to be torn apart in an effort to look like a big city; and how open shelves should be used for things you want to display, with cabinets and other storage used to put stuff away. The first doesn't affect me all that much since I'm not an urban planner (although it seems neither are many of the purported urban planners), but it addresses a vague churning in my stomach I've had at some locations that just didn't feel right to me. And the second one seems ridiculously easy, but it introduced a distinction that I hadn't thought about. I can't find the link, but the architect in question pointed out that with so many naked open cubbies like the IKEA Kallax around, it is a distinction that many people do not observe, and they live with cluttered furniture exposing incidental objects to dust instead of storing them safely and showcasing the things they really care about.

Work hard to make money and spend it on objects and you may hear: "you can't take it with you". In a way, the same thing is true for ideas and knowledge outside of your domain and sphere of interest; if you amass a vertical collection of Zippo lighters, at least it will be left for someone when you're gone. Given that, what good is knowing the intricacies of performing Barrier Skip when you don't speedrun Wind Waker, of manufacturing Panko bread crumbs when you aren't that one company in Japan, or of aging a flat file cabinet when you haven't operated a band saw in more than a decade?

You have to feed the soul too. I'm a programmer, a developer, a problem solver. I like finding out about new domains, expanding my knowledge about them, slowly getting a grip on them, realizing there's so much there that someone else could be, and probably is, living their whole life within this domain and barely getting to call themselves an expert. If you can muscle your way past the worst parts of Dunning-Kruger, you may find an interesting spot where you simultaneously understand that there's a whole lot you don't know, and that you have a better understanding of a small part of it than you ever thought you would. That's invigorating to me.

The reason the T-shaped people are revered isn't that they had read more books than others. It's that they could often see the same thing from multiple angles. As the kids today might put it, they were full-stack - or at least multi-faceted. In a way, they were multiple people at once.

I have never quite gotten in the habit of reading books, which is ironic because of how much we all read now, all day, every day. But putting aside their use as a mechanism for control and indoctrination, where they were used to narrow thinking, the traditional promise of books is to widen thinking, to carry the results of someone's research, someone's lived experiences, someone's deep thinking, through the ages, from before sewers to after personal meal delivery apps.

If you can manage to dodge the divisive, conspiratorial, resentful, regressive people who make their living telling people how much of a shame it is that it isn't the fifty years ago it never was, be those people celebrated authors in the before times or producers of sputtering self-centered video podcasts today, there are plenty of good things left.

I don't know about you, but I've been spending a lot of time worrying about the future, being crushed by increasing complexity privately, professionally and in current world events. There's "sharpening your saw" and becoming better at exactly what you do. There's "turning off" by vegetating to the cheap, mass-produced, entry-level, "SEO-optimized" or corporate-approved mulch. But aside from also allowing yourself rest and disconnection, how about recognizing that something that activates, engages and challenges you can also avoid feeling instinctually like work, like your responsibility, like something you ought to fix, like details you need to commit to memory or like noise you have to endure but that never means anything?

It can be entertaining because it catches your mind off guard, in a curious, open state where it doesn't have any notion of what's happening next and doesn't feel the urge to check phone notifications. It can be instructive and give you a lesson to tuck away for the future. It can jump into your mesh of neurons and trigger a connection for a problem you've been wrestling with for months. Or it can just be pleasing to listen to or watch in a world where your own mobility is limited.

Functional UI

I'm writing this article leaning against some nameless architectural mistake, and I am not writing the article on a Mac. I would, but my PowerBook is fresh out of power (funny notion, to name the thing after its only major shortcoming; it's rather like Greenland in that respect.) [emphasis editor's]

Douglas Adams, The Little Computer that Could, 1998

Linked in popular places today is Marcel Weiher's UIs Are Not Pure Functions of the Model - React.js and Cocoa Side by Side from 2018, which delves into why React's functional state muckery appears useless.

This is a subject to which I give a lot of thought from many angles, so there are many things to note:

  • The post appears snarky, but bases its attitude on the notion that Cocoa is a thought-out, capable, battle-tested, baked user interface framework. For example, for the point where the React side focuses on being able to provide fit-for-purpose lists, the Cocoa side highlights that nothing special needs to be done since, roughly, we know how to use our lists and we know how to design model classes to hook into them.

  • The functional nature of React, according to the origin story - retold many times, but here citing Pete Hunt in 2014's Rethinking Web App Development at Facebook beginning at around 24:00 - is that of all the collective state being changed boiling down to one type of update, and if we could just handle that well, we could stop the complexity from being spread out across 500 minor pokes to the DOM from across as many JavaScript files. Pete says "we built a user interface library called React, designed to solve the hardest problem in this space, which is that data changing over time is the root of all evil".

  • The major, primordial feature of SwiftUI, as explained by Kyle Macomber in WWDC 2019's Introducing SwiftUI session beginning at around 24:00, is collapsing the number of reasons things can happen to views down to one. Kyle says "UI programming is hard - like 'lock free concurrency' hard" and refers to an "explosion" of possible orderings of events. These may not be identical arguments, but they are at least parallel arguments within their respective ecosystems.

  • The notion that Cocoa's views are baked and done and ready has also taken a hit in the last decade or so. As an example, when Mac table views stopped being primarily NSCell-based, they became NSView-based, and responding to selection and picking a contrasting color to the background is now a halfway magical process, when it used to be entirely managed for you. And of course, the desire to go to more custom and higher production value presentation requires even more code to stray from the default. (Even if collection views, for example, are excellent tools to get a grip on this complexity and focus on the behavior you want.)
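The shared "one type of update" idea in the arguments above can be caricatured in a few lines: the whole view is re-derived from state, so every change funnels through a single path instead of hundreds of ad-hoc pokes spread across files. A toy sketch under assumed names, not actual React or SwiftUI code:

```typescript
// A Todo item in the model.
type Todo = { title: string; done: boolean };

// The view is a pure function of the model: same state in, same output out.
function render(todos: Todo[]): string {
  return todos
    .map(t => `[${t.done ? "x" : " "}] ${t.title}`)
    .join("\n");
}

// The single update path: compute new state, re-render everything.
// No code anywhere else ever mutates the rendered output directly.
function update(todos: Todo[], change: (ts: Todo[]) => Todo[]): string {
  return render(change(todos));
}

const output = update([{ title: "Write post", done: false }], ts =>
  ts.map(t => ({ ...t, done: true }))
);
// output === "[x] Write post"
```

The point both camps make is not that this tiny function is hard, but that funneling all change through it keeps the "explosion" of event orderings from leaking into view code.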

View models and functional UI look like solutions, and they are indeed effective ways of managing complexity by making all the constituent state visible and enumerated. But in my experience they also encourage a way of programming where you bind as much as possible, and the problem with that is that, as the title of the linked post notes, UIs are not pure functions of the models.

If you go from one place in your UI to another, you may want to be stopped because there are things that don't validate or don't fly. You may have pending changes that should neither automagically apply nor be lost. Both SwiftUI and React have state as a first-class concept and are theoretically well-equipped to handle this; the worse problem is that we ourselves don't have a handle on it.

We don't know how to think in state. For the edited-but-not-saved text of a text field, sure. For the in-progress-but-not-committed changes in what is at least partially modal, somewhere in the UI? Hm, well, that sounds like a tree of Redux reducers - and therefore model data - or a bunch of nested view models to me. The SwiftUI talk mentions "sources of truth" a lot. Here, the source of truth for the hitherto unsaved data is nebulous. Living in 107 state variables? Living in provisionally updated properties of an ObservableObject that is kept uncommitted from the database or real source of truth?

The key to great user interfaces is that they work the way the person expects. That constantly - not always, but constantly - requires making the mental model of the user interface richer. When an item is dragged around in a list, there should be indications of where the item would land if you dropped it, and you should be able to cancel the drag. If the list is hierarchical, you should be able to have things open for you in the middle of the drag. When you type blindly into the list, the list should select the next best item depending on what you typed. When you have a list of items and the ability to close one of them, you should also be able to close all the others. When you can select one item and edit some of its information, you should be able to select multiple items, see where the information agrees and edit the information en masse, applying it to all of them.
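The type-blindly-into-the-list behavior, for instance, boils down to a small prefix-matching policy. A sketch with a hypothetical helper; real implementations also handle timeouts between keystrokes and locale-aware collation:

```typescript
// Sketch of type-to-select: pick the first item whose text starts with
// what the user typed, falling back to the next item alphabetically.
function typeToSelect(items: string[], typed: string): string | null {
  const prefix = typed.toLowerCase();
  const sorted = [...items].sort((a, b) => a.localeCompare(b));

  // An exact prefix match wins.
  const match = sorted.find(s => s.toLowerCase().startsWith(prefix));
  if (match !== undefined) return match;

  // Otherwise the "next best" item: the first one sorting after the prefix.
  const next = sorted.find(s => s.toLowerCase() > prefix);
  return next ?? null;
}

const files = ["Notes", "Archive", "Budget", "Photos"];
// typeToSelect(files, "bu") === "Budget"
// typeToSelect(files, "c")  === "Notes"  (nothing starts with "c")
```

The logic is trivial; the point is that someone has to decide what "next best" means, and a good framework ships that decision so every list behaves the same way.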

Great user interfaces take time and effort and forethought and respect for the user. Not everyone may wish to always create a great user interface for all things at all times. There's a time and place for all ends of the spectrum and all levels of ambition. But UI frameworks should think about how they can, to quote the most unintuitive thing I know, make simple things simple and hard things possible.

Because no matter how much support the framework provides, hard things remain. Thinking back to the not-yet-committed changes, before anything is saved, hitting Undo should Undo whatever is possible to Undo. After the save operation has happened, Undo should magically switch to either doing nothing or Undoing the entire change. Figuring that out requires thinking through what the model of the user interface needs to be to be closest to the user's mental model. After you have done that, the last thing you need is the framework fighting you because it did not anticipate your need for this type of control.
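That Undo switch could be modeled with something like the sketch below. It is hypothetical and greatly simplified (real undo managers record commands, not whole snapshots): before a save, steps undo one at a time; the save collapses them so a single Undo reverts the entire committed change.

```typescript
// Sketch: an undo stack whose granularity changes when changes are saved.
class UndoStack<T> {
  private steps: T[] = [];

  constructor(private current: T) {}

  apply(next: T): void {
    this.steps.push(this.current);
    this.current = next;
  }

  // Undo one step; with nothing left to undo, the current value stands.
  undo(): T {
    const prev = this.steps.pop();
    if (prev !== undefined) this.current = prev;
    return this.current;
  }

  // On save, individual steps stop being meaningful: collapse them so one
  // Undo reverts the whole committed change, or there is nothing to undo.
  save(): void {
    if (this.steps.length > 0) this.steps = [this.steps[0]];
  }
}

const doc = new UndoStack("v0");
doc.apply("v1");
doc.apply("v2");
doc.save();
// one undo() now jumps straight back to "v0"
```

Ten lines of mechanism; the hard part is deciding that this is the behavior the user's mental model calls for, and a framework that hides the undo machinery makes that decision for you.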

Frameworks should be built by and for the people who want to create the great user interfaces, the ones that anticipate your needs by assuming that you want to have as much and as expedient control as possible over whatever the user interface is about. Where a list means being able to copy and paste the selected items, drag and drop the items to rearrange them, drag and drop the items holding a key to copy them, hold a key to check/uncheck all checkboxes, resize all columns to fit the width of their respective contents, sort columns by clicking the headers, drag and drop the columns to reorder them and, yes, type into the list blindly to jump to the item with the text prefix.

It is the case that many things are presented better on the web when they use, say, a multiple-row, "layouted" form of presentation. Where for good reason, things should not just look like a list. The chat messages example from the React talk should not look like a list. But the big failure of the web's style of user interfaces is that everything is custom. On the web you have to make a gargantuan effort to build everything from the ground up and work well on every device. On the desktop (and in some ways on the tablet or on the phone), you just have to make a concerted effort to use the features that are already there.

So, wrapping around. I don't think functional is the right idea. I don't think mutation from fifty angles is the right idea. I don't think controls or MVC guide you sufficiently towards how your code should be structured to best serve the user interface, without bugs and allowing interactivity and reactivity. My answer is that I have no answer, but to look out for seemingly perfect answers. Making everything look like the web is not a good idea. Making everything look like not the web, when you have people expecting the web, and who have never clicked to sort a table column in their life, is probably also not a good idea.

A pure function transforms an input value to an output value. If the idea was for a reader to go into this post with one idea and come out with another, this post is as impure as it gets. But maybe it, and 500 other random bits from other random ideas in other random places, will end up collectively giving you a clearer, more nuanced picture over time.

I consider that an imperative.

Don't Overextend Yourself

Guilherme Rambo:

However, Apple's documentation lacks crucial information on how to use these new APIs (FB10140097), and there were no WWDC sessions or sample code available in the weeks following the keynote.

Thanks to some trial and error, and some help from other developers, I was able to put together some sample code demonstrating how one can use ExtensionFoundation/ExtensionKit to define custom extension points for their Mac apps.

I spent about three hours doing the software development equivalent of gesticulating wildly trying to get ExtensionFoundation to work and will be looking at this. Vital pieces of information just weren't there at all, nor were any Xcode templates present to fill in the blanks. (Update: there apparently is a target for Generic extension, which I missed because it wasn't in the Multiplatform, iOS or Other categories.)

On the one hand, this technology now exists, likely took months to implement and years to bake and was someone's passion project.

On the other hand, it's just thrown out there. Here you go. I hope you understand how to use it. We will explain the thread of every shipped nut in detail; we will even provide an example of one sort of fully assembled cabinet. But we will abstain from providing actual instructions for vital parts of the process.

This is not the first time I'm getting these vibes from Apple frameworks; IOSurface and its ilk have been like that, and the sessions on ScreenCaptureKit were almost comical in their circuitous avoidance of describing what you actually do with the audio/video buffers you are handed.

The title of this post is a salty pun, but if history is any guide it's also an actual earnest hope for the individuals who put together the new technologies, stressed to the gills under time pressure. They have my admiration as always.

But – they also get the dubious pleasure of being the face of underdocumented software. Apple is big enough that it's time to grow beyond the scrappy image, respect developers both inside and outside the company and allocate time and opportunity to move beyond "no overview available".

WWDC 2022

  • Years back, I wanted a new type of system UI primitive that was like a widget, but for transient events, to track your food order or count down to a bus or train. More information is still forthcoming about Live Activities, but given the examples were an ongoing sports match and the status of an incoming ride, I am optimistic.

  • Heavens to Betsy, apps can themselves have extensions now with ExtensionKit. There's even ExtensionFoundation for non-UI extensions.

  • There are now extensions for streaming media to "third-party" devices within the AirPlay menu.

  • There are also App Intents, with the ability to pass data in and receive data out on the other side, along with optionally showing UI. They work with Siri and Shortcuts so far, and can be triggered via widgets - but tell me if this doesn't sound like macOS Services and wouldn't be ideal to trigger via context menus on selections.

  • There's also an iOS 16 "Developer Mode" device opt-in.

All of this is good stuff. The platform becomes more of a platform – things talking to each other, developers becoming more powerful and the user being in control. For many of the changes, it is impossible to conclusively separate whether Apple thought they were good ideas that needed to happen or whether they were only built under the gun. (I will note that they have been, and remain, good ideas for numerous years.)

And as for a gun, well, having a Developer Mode that "lowers security" would put them in alignment with Android if one day they lifted the restrictions to allow broader distribution without going through orifices.

The Grave Insult of Turning a Basic Task into a Complicated Nightmare

1.

First, a cautionary tale.

In 1976, a secret memo was sent from Swedish National Police Commissioner Carl Persson to Prime Minister Olof Palme. The memo documented the Minister of Justice Lennart Geijer's alleged regular contacts with prostitutes and the blackmail it could make him and others susceptible to.

In 1977, after Geijer's resignation the year before, journalist Peter Bratt wrote a flawed exposé of the memo's existence in Dagens Nyheter, the Swedish paper of record. This kicked up a lot of dust, mostly about the allegations but also about the slipshod coverage. Later the same day, the story was confirmed in the news program Ekot on state radio, separately sourced and with the egregious mistakes corrected.

Within days, Prime Minister Palme stood in parliament, vehemently denying the "lies" of Dagens Nyheter, comparing the unnamed author to a sewer rat with yellowed teeth. He did not mention the corrected and factual reporting of the same events and the same memo by Ekot.

Over the next few months, Dagens Nyheter would come to issue an apology, Swedish journalists and publicists would come to focus on outing the source of the original report and the popular understanding was left as: this did not happen. Even though it did happen, even though it had already been reported on correctly, even though that report had remained unassaulted. Holding either politician accountable turned into open season on the author and the source of the original report. Because the thing everyone read was flawed.

2.

John Gruber:

Last weekend The Verge ran a piece by Sean Hollister under the headline “Apple Shipped Me a 79-Pound iPhone Repair Kit to Fix a 1.1-Ounce Battery”. Sometimes I read an article that’s so absurdly and deliberately wrongheaded, I worry that I’m reading it wrong.

Louis Rossman:

It's just so missing the point, and the fact that this is getting reported by so many others, not just The Verge, is part of the problem with tech journalism in general. You don't have your eye on the ball. You're focusing on the things that are easy, rather than focusing on the things that matter.

John (again):

That sounds great, of course, but that’s not how modern mobile devices work. Apple isn’t an outlier in this regard — there are no popular modern mobile devices that are easily serviceable with simple tools. If it were possible for iPhones to be more easily repairable, without sacrificing their appearance, dimensions, performance, water-and-dust resistance, and cost, Apple would make them more easily repairable. That iPhones are not easily repairable is of no benefit to Apple whatsoever. What’s the theory otherwise? That $69 in-store battery replacements are highly profitable?

3.

The Verge can write a flawed article and miss the point, but that doesn't mean there isn't a point. There are several.

Before the iPhone, there were phones you could swap the battery in. Even in the weird, crazy-ass, design-experiment Nokia models, you could swap the battery. The iPhone changed a lot of things, this being one of them.

But this wasn't a change for Apple. Every single model of iPod had a non-user-replaceable battery. This was adhering to their design philosophy, in place since the early 80's, which can be succinctly summarized as: don't touch it, you'll only make it worse.

The iPhone being the first of the stereotypical modern smartphones to have a non-user-replaceable battery does not mean a modern smartphone has to have a non-user-replaceable battery. It means that when a company with Apple's design philosophy makes one, they will pay more attention to pretty much everything else than to practical maintainability concerns for the user.

Apple's design philosophy is focused on the story that only Apple can do this. Only Apple can secure the software, only Apple can repair the hardware, only Apple can replace the battery. From the lens of that story, it makes sense to, rather than allow the user to twist a knob, remove a plate or undo a screw and swap the battery, tell you to go to the store that Apple runs and have someone else do that for you.

4.

The instinctual, visceral rebellion from so many people has several constituent components.

From a user's point of view, from a customer's point of view, this is ridiculous. We all own products where the battery eventually goes bad and where we can replace it ourselves. We know that there are alternative solutions.

The roundabout nature of the repair speaks to the degree to which practicality was not a priority during the design process, with wasteful shipping of enormous machines being one of many side effects. (Note: putting aside the wisdom of choosing this design, shipping enormous machines to service centers is a different set of concerns since presumably those machines are acquired and will service hundreds to thousands of devices over years. But renting them and sending them across time zones, continuously charging an environmental cost that would otherwise be amortized over time, is a completely different sustainability calculus.)

It is so easy to transpose the arguments slightly and re-examine them. What would people say about a new Porsche with a sealed tire design, where an 18-wheeler-sized vehicle had to be called out from the nearest dealer to, carefully and gingerly, plasma cut, loosen and desolder the tire, then friction-stir weld a replacement tire on and get on the horn with a service representative over the telematics system to bond it to the vehicle electronics? If you fell into a coma, woke up ten years later and this was the standard, would it become any less wasteful? If you learned that this had been controversial, but that recently Porsche had started sending out these vehicles to people without a support contract and for a lower fee, would the construction become any better?

What would people say if, in late 2006, you wanted a phone with a big touch screen, with apps, with high-speed internet access, with fluid animations and a capable software stack? It's possible they would say "silly rabbit, that's just not how phones work, they run cheap-ass software written to a real-time OS, and if you sit nice, be very quiet and hand-type this URL on this T9 keypad, you can have a J2ME midlet application that's so maddeningly generic, the developer couldn't even tell on which side the softkey buttons are going to show up from platform to platform". But would that mean the current status quo was the only way things would ever go, or that you were objectively an idiot unmoored from reality for wanting them to be different?

From a user's point of view, it is true that a user-replaceable battery adds to the size and weight of the device - the easier the mechanism is to undo, the more it adds. It also adds to the lifetime and the resale value, because now you could do the swap with your hands in under a minute, or with a screwdriver and maybe some minor tools in under 15 minutes. So now you might actually do it, since you don't have to give up your primary technical support device for hours or days in a store that may not even be in the same city, or risk your warranty.

I can see why Apple or why publicly listed companies who live and die by quarterly earnings don't want to incentivize that. I can see why they would rather want people to get a new device, or failing that, inflate one of two notions that the device should only be handled by its maker or that its maker is a good and green company for disassembling and recycling a device that could still have done five more years of service out in the real world.

Just because servicing is not a "profit center" for a company doesn't mean that the company doesn't benefit from designing its devices to not be user-serviceable. And it doesn't mean that the company, at the end of the day, doesn't view your convenience as a customer with something on a dynamic scale from indifference to contempt. At least when that convenience means you might actually use the damn device for an inconveniently long time, when you should have run out to get an upgraded model, whose upgraded features you do not really want enough to justify the price of a new device.

I can also see why some people make the mistake of looking at the before and after and thinking: well, the iPhone won. But the alternative isn't an iPhone 13 Pro Max vs a Nokia 3310. This is a false comparison. The alternative is an iPhone 14 Pro Max that spends a fraction of its area on making battery removal possible without industrial tooling. And maybe that seems alien given the device landscape of today; maybe it seems alien given Apple's history in particular.

But what seems alien is a bad predictor of what's possible. It used to seem really fucking nuts to make a phone out of aluminium and glass. And the idea that made people think you were a propeller-head who installed NetBSD on your toaster wasn't that you wanted to swap your battery without involving suction cups, but that you wanted your phone, the thing you made calls with, sent SMS text messages or the odd email with and played Snake on, to be as capable as a computer.

5.

So, I don't know where to start and I don't know quite where to end either. But the idea that Apple isn't a grown-up company that can do whatever they want is stupid. They could focus a tenth of the effort that they put into designing a new iPhone into making it just a hair more repairable without industrial tooling.

Swapping a battery after 18-30 months, one to three times during the device's total lifetime, when it is significantly degraded, could be possible with a screwdriver and screwdriver-like tools. Note that batteries are now so good that no one is asking for the ability to flip open a hatch and swap between one battery and another you have on hand - just for a way of opening the thing up, getting to the battery and sealing the thing back down that is less of a science project. This would be a significantly less intrusive change than the "user-swappable" battery the old Nokias had (a mechanism the likes of which the iPhone already supports on a smaller scale for the SIM tray).

Apple has enormous assets and enormous expertise. They could do it. But they choose not to, and they should be held accountable for what they have chosen to do and not to do.

Yes, it is some flavor of nice that they are not hoarding the industrial tooling. Yes, it is on its face ridiculous that they have willingly painted themselves into a situation where such tooling is necessary to open a god damn high-volume mobile phone. And no, the reason Apple is where they are is not because they have somehow reached the practical limits of applicable technology, and nothing will ever be more easily repairable. The reason is because Steve Jobs wanted a perfect object that people didn't screw around inside of, and company culture is a hell of a thing.

Quinn Nelson: Using Apple’s Tools to Fix My iPhone

Quinn Nelson, who used to own an iPhone repair shop, takes the US-only iPhone Self Service Repair for a spin. They ship you 40+ kg of equipment, which you rent for 7 days. The entire experience is a confounding mix of thoughtful little touches and issues being solved with a ridiculous, over-the-top sledgehammer approach, more or less because Apple can afford it.

The iPhone as a product is designed, engineered and developed with a number of constraints on materials, fit and finish. It has to be manufacturable in the first place, it has to be torsionally rigid, it has to withstand atmospheric pressure, a person's grip and perspiration. The materials have to be responsibly and sustainably sourced, free from impurities and forbidden substances. It has to fit together well and look its best.

There are thousands of check boxes to tick, all of which have guided the design and build process. For a company capable of doing all this, only laziness, apathy or spite are stopping them from making it just as well put-together, just as representative of a deliberate design ethos, but more easily repairable.

Only laziness, apathy or spite.

Platforms

Over and over again, I hear the same argument. Someone (Apple, Google, Microsoft, Facebook, whoever) created a thing, and they are now and forever in charge of that thing, and can do whatever they want, and if people don't like it, tough.

This argument is a fair place to start and holds water to a point. If you make a sandwich in the comfort of your own home, you can put whatever you want on it. If you make a sandwich for your kid, your kid can't insist that it have jell-o on it instead of sprouts and cream cheese. (Although if you put dirt on it, social services may want to have a word.)

However, if you have somehow transformed into the civilized world's maker, keeper and provider of sandwiches, it is ridiculous to apply the same logic as if you're just making an afternoon snack for yourself. If applications run on your sandwich that, if they went down, would seriously impact society, we have graduated from the point where it's just your concern (and beyond this metaphor).

iOS and Android are a duopoly. That's not because it's technically impossible to create an alternate platform - although it is very difficult. It's because of the powerful network effects holding back any contender. At that point, your platform controls eating and breathing; the air, the soil and the water; livelihood, where the savings are deposited, how they are used. At that point, everything else is a joke. At that point, you have made yourself the world's platform.

I am not one to say that the government, the military or the angry concerned mob of civilians should wrest control of these creations from the private companies, and that this will solve everything. But I am one to say that you keep acting in the same self-interested, self-serving, self-aggrandizing way as you did when you were one of ten hopeful entrants in a crowded market at your own peril.

Monopolies eventually fall. If you're unlucky, they pull all the oxygen from the room first, smothering all counter-agents, setting us back years. Making us forget what it was like before everyone took as given the compromises we were forced to make; the selfish, enshrined as security, as utility, leaving a shriveled, ashen field.

Monopolies are endothermic and parasitic, claiming for themselves the value of their surroundings under the guise of their own importance, propped up and fueled by those who wish for eternal youth, ultimate power and endless resources. But wishing so does not change how thermodynamics work, how ecosystems work. If not dethroned, all power, all value, all energy flows to the monopoly, which assumes it, which forgets everything else, the spectre of endless growth the everpresent hum of the universe. And when it's all there...

The story only ends in one place - without a monopoly. Having fallen, first gradually, then suddenly.

I am not worried about monopolies lasting forever. I am worried about what they will consume before they are toppled. And I am worried about the power of grown adults who still insist on jell-o sandwiches.

Noah Grey's GoFundMe

I am late to this and only found out about it on Slashdot, but days ago, Noah Grey was in deep trouble.

Noah Grey is probably one of the most influential people in personal, automated Internet publishing that most people have never heard of. At the turn of the century he produced a remarkable piece of software called Greymatter, long before such software would be described as a "static site generator", "content management system" or "weblog engine", and used it to publish beautiful photography. His sensibilities ran deeply through the software and, from what I can tell, through everything he did.

The story of the current circumstances has a happy ending, at least in terms of account balances, but I took the opportunity to contribute anyway because of what people like Noah have meant to me personally and to the Internet that I love. Due to cynicism, commercialism, greed and politics, the Internet may not be what it was any longer, but that's hardly Noah's fault. It was stolen from him and from all of us, so let's steal it back.

Steve Troughton-Smith on Apple, App Store, developers and government meddling

Steve calls it "an incredibly-delicate, calculated 'truce' that relied on Apple not overstepping into abuse" and now thinks "Apple’s behavior has been so offensive over past few years that, at this point, all I’m thinking is ‘damn the consequences’".

I've never understood why people have been so bewitched by the potential positive outcomes. Purely by making the operating system, defining the APIs, bundling its own software, manufacturing its own hardware, tying the operating system to its hardware and vice versa and selling even more of its own software, Apple has plenty of opportunity to set good examples and draw people to the combined value proposition. Why the hell does it need any more control than that?

I have been against the App Store from day one from a philosophical standpoint, but also from a practical, realistic one. The ability for a developer to be a great developer is hampered, frustratingly and undeniably. The likelihood that a user will find a great app is lowered dramatically.

Mac OS has always had fewer apps than Windows, but it has also had great applications, sometimes good enough to keep people on the platform, and in earlier days good enough to keep people enduring an unstable foundation. Applications developed by small teams of people who nevertheless made great things, in some cases maintained for longer than the App Store has even existed.

iOS and the App Store have not been a truce. They have been a destructive, abusive, lopsided, mistrustful relationship; a relationship that has allowed access to a platform advantageous enough to make you close your eyes and think of multi-touch. It has been an insult to history, an exercise in attempting to redefine away the fundamental facts of the market and of their existing user base and developer ecosystem in a puff of malevolent marketing.

I don't see governments as the best arbitrators of required features in hardware or software, but they sometimes have a strong connection to what's fair for the customer, and ain't nothing about the App Store that's fair.

Like Steve, I resent the muddling connection to "industry interests", whose involvement just serves to hide the real issues. I am reluctantly interested in what's going on as a forcing function, knowing the lopsided unfairness could simply slide over to yet another party beyond end user, developer and Apple. That could happen.

If that happens, I will blame the Apple that woke up every morning since 2008 and chose to worship Mammon. I will blame the Apple that took the easy way out and chose the clammy, desperate grip of control. I will blame the Apple cowardly enough and uninspired enough to keep to its own, rather than to lead in the sense of empowering other people.

I will blame the Apple that, through this mismanagement, craps on the legacy of the people and divisions within the company that have done groundbreaking, innovative and empowering work in their own areas over the years, all the while it reaps the profits of their advancements and discards their spent husks when they're done.

And I will blame the Apple that, when people naturally react to their mismanagement, acts surprised, as if we don't know that they know best, because didn't we know that pure intent scrubs away all consequences?

The word "gaslighting" is thrown around a lot these days, and in its original form it seems to require a malicious intent to deceive. I really don't know what's there. I believe people at all ranks at Apple are intelligent, and I think very few of them have ill intent, so I really can't describe what motivates what they're doing. But they seem to have a radically different idea of what's going on than literally everyone else. And so maybe the idea of the "Cult of Apple", long brought out to explain why people would even use those smelly Macs and iPhones, just had it backwards; it is the company that is the cult.
