Take


MacRumors: Apple 'Surprised' By Developer Frustration With Its App Review Process

In September of last year, the Australian Competition and Consumer Commission (ACCC) launched an investigation into Apple's ‌App Store‌ and Google's Play Store to examine the experiences of consumers, suppliers, and developers in Australia.

[..]

In a submission to the commission, Apple says that it's "surprised to hear that developers have legitimate concerns about their ability to engage with Apple in the app review process," and that it "invests significant time and resources in engaging with developers directly" to ensure the quality of apps on the platform.

There are no words left.

Ars Technica: Facebook “Supreme Court” overrules company in 4 of its first 5 decisions

(It's hard to quote just one part; read all of it.)

An interesting experiment. Even if it is a "fig leaf", it might at least be a fig leaf with sharp edges, so to speak. It will not make a difference in most of the unjust actions taken by the moderators, but neither does every transgression in society turn into a court case, and courts are still effective.

Whether or not Facebook having this kind of power constitutes a monopoly, they have an outsized effect on the exposure of speech. Their platform thrives on turning gaming of the human mind into a clinical optimization problem; a "goal seek" that may have ended up radicalizing hundreds of millions of people, tearing apart families and communities, upsetting the causes of science and education and prolonging a pandemic, with death and suffering in its wake. And they make money, all their money, from this behavior and from its consequences.

They're not going to change, because this is all there is to Facebook. If the trickery and manipulation were stripped away, it wouldn't know what it was, because it quite literally never wasn't those things. But if this dulls the edge of their sword while documenting some of the faults, that might be worth something.

In Bruges

I had one movie (The Man From Earth) that I would recommend everyone see, and see knowing as little as possible going into it; after seeing In Bruges, I now have two.

Brad Cox, Creator of Objective-C, dies at 77

Objective-C is presumably the mainstream language with the most outsized influence. Along with Ruby and Squeak later on, it carried the values of Smalltalk into the modern programming era. Introspection and messages and dynamism, rather than C++ vtable optimization and trickery inventing seven kinds of memory management/ownership subtlety and delegating it all to the programmer. Getting things to work together in a coherent and easy way that befits a small system, rather than spending 90% of your attention making sure no performance is untowardly spilled on the floor.

Brad Cox was a virtual unknown, and Objective-C's origins have been shrouded in mystery to me aside from the words "Brad Cox" and "Stepstone". I wish I'd known more. I hope there are enough people out there who knew more, and who talked to him and wrote it down, and who can tell the story of how he invented a gem that unlocked so much possibility and so much imagination over the years. I hope he got to do many other things (the obituary mentions lecturing about object-orientation and later work on neural networks). I hope he was happy.

Know Your Current Events

Presumably, this site was created to facilitate and promulgate reactions, so why so few of them and why on such odd subjects?

2020 was a terrible year for reasons everyone knows. They already occupied my mind as much as they occupied other people's minds. I vowed from early on that I would spend as little time mentioning them as possible. Towards the end, it got difficult and would have strained credulity, and at some point I just decided to not post anything at all until a certain date in early 2021 had passed; a decision that, very recently, I was happy I'd made.

It's not that those things aren't interesting or possible to write about. It's that I wished to spend that energy writing about other things, and many other things started evaporating in lockstep with said energy.

At this point I also need to step away from, and look critically at, the idea that adding one's opinion just because it's possible is an unequivocal good. Maintaining it as a cultural value has consequences, and personally, striving to always make my own opinions heard to make myself feel validated is a side of me that I feel very icky about right now.

Round the Outside, Round the Outside, Round the Outside

MacRumors: Kuo: New MacBook Pro Models to Feature Flat-Edged Design, MagSafe, No Touch Bar and More Ports

The new MacBook Pro machines will feature a flat-edged design, which Kuo describes as "similar to the iPhone 12" with no curves like current models. It will be the most significant design update to the MacBook Pro in the last five years.

For my money, it's self-explanatory why they would do this.

Both MagSafe and other ports hinge on having the vertical real estate around the perimeter of the product. Apple has been reluctant to put ports on surfaces that aren't flat, and has been (let's call it) interested in keeping the products as thin as possible.

Look at the current MacBook Pro head-on, and a significant portion of the thickness is tapered, leaving only the minimal edge on which to put ports. (For all we know, this is why almost all ports were removed in the first place.)

As noted, laptops do need to not be complete bricks, lest you be unable to pick them up off a table without maddeningly pushing them around first. But there are ways to either beef up the rubber feet that still exist, or provide a slight bevel or less pronounced tapering to the side. (Or maybe they realized it wasn't going to kill them to make the thing a millimeter thicker and keep the current form factor, but that's not the current topic.)

It's also possible that, since the iPad Pro and recent iPad Air have the same flat-edged design and, unlike the flat-edged iPhones new and old, are too big to "wrap around to pick up", Apple is just going to use the same radius and go for it the way it is.

As for all the other features: Assuming this thing comes out and has an M2/M1X in it, I'm there. This 16" MacBook Pro is, relatively speaking, barely unpacked at 14 months, but the combination of voting with one's wallet to mark a step in the right direction and getting what appears to be a significant improvement in performance, battery life, thermal ergonomics and utility is very tempting. Chucking the Touch Bar and the Intel chips should provide enough savings to drop pricing to merely exorbitant.

Lukas Mathis: How User Tracking Devalues Ads

Why would Facebook take out a huge non-personalized ad to make the point that, for ads to really work, they need to be personalized? Why advertise in a newspaper if they think that personalized ads are so much more effective?

Podcasts in Big Sur

After some trepidation, I updated to Big Sur with 11.1, which may have been too early after all.

All other things notwithstanding, revisiting Podcasts is informative. Nearly all of it seems to still hold true. Someone is awake at the switch somewhere, since Cmd+L now does jump to the "show" of the current episode, but it doesn't select the episode and scroll it into view to let you pick neighboring episodes. Indeed, it first loads an empty list and then visibly populates it with data, leaving you at the top.

Furthermore, I am now thrilled to discover that hitting Space in Podcasts no longer toggles play/pause. The Controls menu lists this as Option+Space, and it is not remappable via the System Preferences Keyboard Shortcuts functionality, since such shortcuts require modifier keys (for good reason). This breaks with convention for basically any media playback application of any form where a keyboard is available – even the full-screen media player on iPadOS reacts the right way to Space. The Music app definitely still does.

There have been changes made to Catalyst to make it a less horrible choice for building Mac applications (like opting into native or at least native-seeming controls like buttons and checkboxes), and certainly the cavalcade of odd UI choices all across the OS makes the particular ones in Catalyst or Podcasts seem less weird. But it still looks, feels and behaves more like a poorly written web app, a mélange of UI goo scraped out of a foreign metaphor and allowed to set without much customization or supervision.

And regardless of UI framework, it doesn't seem like the Podcasts team has any interest in going further than making it the weak not-quite-anything port that it is. The iOS Podcasts app is redesigned more years than not, with custom interactions, animations and flows. That the macOS version can't even get to a coherent, serviceable, purpose-appropriate app is bewildering.

Oh, and the aforementioned Controls menu, when opened, beachballs for a handful of seconds – significantly more time than it takes to launch the entire application – and then presents the menu, because when a company has only been doing pull-down menus for 36 years there's only so much you can expect.

Cydia sues Apple for anticompetitive behavior

I had the original iPhone, and could not have used it if Cydia and jailbreaking hadn't been around. Apple doesn't imbue its developer community with creativity; it largely constrains it.

🍿

We Need to Talk About Nintendo Switch Sharing

The Nintendo Switch is a wonder at this point. Applying Gunpei Yokoi's Lateral Thinking with Withered Technology, Nintendo took an ARM SoC that was already on the market and made it into the platform they needed – a cheap, semi-portable platform that was good enough and easy to port games to, and that right now fills the niche as the mainstream, Wii-like console for people who don't want mobile games and don't want to play the exclusives lottery with the Xbox or PlayStation consoles and can't afford to get both.

The OS, or in Nintendo's words "System Software", is a lot better in the Switch than in the Wii consoles. For the Wii U, much was made of an early slash of boot-up time to get it down to a half-minute delay, and most of the user interface felt like swimming through corn starch. But some of the OS is still the weakest link of the Switch.

v11.0 is a great example of what's wrong. It introduces two features I've been waiting for since launch – the ability to send screen captures (images and video) to a phone or tablet, and to a computer over USB. Both of the implementations have striking flaws.

Sending to a phone does not involve using any of the available Switch apps. Instead, it involves a two-part QR code process: First, you scan a QR code to join an ad-hoc hosted Wi-Fi network. Then, when it detects that you have connected, it shows a second QR code, which is a link to a page with the media, served from a web server hosted locally on the console. This is brilliant and inspired – but it's brilliant and inspired in a 24-hour hackathon, look-what-we-can-do, proof-of-concept kind of way. It's what you do when you can't do anything else, to show that anything is possible. But it's a thoroughly horrible user experience. To make matters worse, you can only do this with up to 10 screenshots or 1 video at a time.
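
For a sense of how little machinery this flow needs, here's a minimal sketch of the two payloads in Python, using the qrcode package. The WIFI: payload format is the de-facto standard that phone cameras understand; the SSID, password, address and filenames are made up for illustration, and whether the Switch formats its codes exactly like this is an assumption on my part.

```python
# Sketch of a two-part QR hand-off like the one described above (assumptions:
# the console exposes an ad-hoc Wi-Fi network and a small local web server;
# all names and addresses below are invented for illustration).
import qrcode  # pip install "qrcode[pil]"

ssid, password = "switch_share_ab12cd", "s3cr3tjoin"

# Step 1: a QR code that phone cameras interpret as "join this Wi-Fi network".
join_payload = f"WIFI:T:WPA;S:{ssid};P:{password};;"
qrcode.make(join_payload).save("step1_join_network.png")

# Step 2: once the phone has joined, a second QR code pointing at a page
# served by the console itself, listing the captured media.
media_url = "http://192.168.0.1/index.html"
qrcode.make(media_url).save("step2_open_gallery.png")
```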

Sending to a computer via USB is less concerning, since it involves connecting a USB cable between the Switch and the computer and lets you have at the entire contents. The problem here is that you can't use a USB cable in the Switch Dock. You have to pull out the Switch and use the USB-C port on the Switch itself. If you could use the Dock's port, you could leave a cable in there and just connect it to your computer when you wanted to take a look. Now, it's more involved.

Both of these are advances over the state of the Switch from launch until just over a week ago, when you were resigned to posting to Facebook or Twitter, or to powering down the console and removing the microSD card (inserted behind the kickstand), respectively. And the Switch was launched under infamous hard deadlines, because of preannouncements about an "NX" console to be launched within that fiscal year. But these are features that could have been added much sooner, and could have been done much better.

The idea that comes to mind for phone/tablet sharing is to let you select as many items as possible, establish a local network connection or maybe even Bluetooth and send them over to a new department of one of the Switch apps. (There are a few variants on how to do this and what goes where, like maybe you can argue that the gallery should be streamed over to the app and the device user should handle picking and saving; but it's hard to choose a model that isn't significantly simpler, more efficient and less disruptive.) For USB connections, maybe design choices made during the construction of the Dock mean they really are prevented from making the connection directly to the Dock work, which means it's hard to patch after the hurried launch.

Either way, I hope the upcoming Switch refresh has spent more time thinking through these features. The Switch project wasn't rushed in all aspects, considering the intricate detail of its DRM, extending through the game cards. When it comes to basic features, responsiveness and loading times, though, Nintendo's habit of acting bumfuzzled has yet to wear off.

Chris Forsythe: Growl in Retirement

Growl is being retired after surviving for 17 years. With the announcement of Apple’s new hardware platform, a general shift of developers to Apple’s notification system, and a lack of obvious ways to improve Growl beyond what it is and has been, we’re announcing the retirement of Growl as of today.

Growl was famously hard to explain succinctly to people in my experience, but I think it says a lot about the community that before Mac OS X contained an infrastructure for this, people banded together and built something that was widely adopted. In this way it's not dissimilar from Internet Config by Quinn! the Eskimo et al or the External Editor protocol implemented by many FTP-like applications.

Growl made notifications bearable when Mac OS X was Mac OS X – thanks, Chris.

Rent-seeker's Advocate

When Apple announced the upcoming App Store Small Business Program, under which some businesses will have to give up 15% of their app's price to Apple instead of 30%, I posted something brash in response to it. It contained humorous allegations that Apple was acting like organized crime towards those so successful as to escape the bounds of the program, had it existed at the time they did so. (Realistically, the imputed partner would have exceeded the volume within days, if not hours.)

I still maintain that within the context of the joke, and knowing the background, it's more right than it is wrong, but that's not to say that people at Apple working with marketing or the App Store or developing the underlying infrastructure go to work with the intent to do harm, repress creativity or spoil livelihoods. I regret any discomfort that I have caused those people, especially when there are both individuals and organizations that are explicitly hell-bent on exploitation. But that said, what are we to do when that's what the end result nevertheless is?

Seen from one perspective, Apple heard the feedback from developers and launched the program with clear and pure intent to build a better relationship with them. Seen from almost every other perspective, it is one in a series of ploys intended to protect, tooth and nail, a monopoly from being taken from them – a monopoly which ostensibly provides a marketplace for the enjoyment of developers and users, but which consistently serves themselves above any other party. Apple is as sincere in its determination to go to war against anyone who would change that as it was in its previous "thermonuclear war" against Android for being "a stolen product".

On the subject of accessibility, I have heard of a scale of "situational, temporary, permanent". Making colors contrast enough or text sizes bigger, for example, helps people with vision problems, but it also helps those who are trying to work an outdoor touch screen with glasses fogged from mask-wearing, those whose medical issues may flare up and ruin their vision temporarily, or those who have misplaced their reading glasses. By improving an aspect of software for some, you also end up improving that aspect of it for many others. Or in other words: actions have consequences, and it's hard to wall them off.

Apple's desire for control is intended to stem some permanent behaviors seen as unseemly, like pornography, gambling or ethically murky business propositions. But the net cast by those controls also ends up catching the situational and the temporary. Do you want to release an application under a pen name? No, fat chance, you must have something to hide, go live your life on Android instead. Do you want to use a behavior or technique that, in the wrong hands, could be abused? Nuh-uh. Do you want to just plain not be subservient to the consequences of these rules and the capriciousness emergent in their application; or disagree with their definition of what is unseemly or not because you are not the lowest common denominator by which the rules are applied? Build your own damn platform.

Jamie Zawinski has long posted a message to Facebook employees to the effect that being at Facebook is being complicit in the consequences of the platform. While I wouldn't go that far: when I tried to disengage from Google (the search engine itself), I ended up using DuckDuckGo, which, although a privacy-conscious provider, also enlists the floundering search engine from Microsoft – itself one of the companies trading blows for the world's highest market cap, a company that helps itself to various and sundry data from every Windows 10 user (and shows them Start menu advertisements even if they paid hundreds of dollars for an operating system license). This is Alice in Wonderland.

In short, it is becoming incredibly hard to not surrender some part of your life to five-to-ten technology companies. So, no, maybe going by a particular letter interpretation of a particular definition of a particular law, what Apple has is not a monopoly. But by any sane way of looking at the status quo, it's not a healthy, customer-serving market either. Apple is acting the way they do because they get away with it, and mix semi-serious fealty to community-minded activities and values with bog-standard profit-maximizing-at-all-costs tax planning and caring only about their own house and their own situation.

The pragmatic way to go for Apple, if it still believes it can keep this house of cards standing, might be to allow app stores themselves into the App Store. For example: get Steam, and you can download any game or app from Steam that Valve has their own approval process for, which may include games not otherwise allowed by the App Store, and neither Steam nor the Steam-sourced apps would have to use Apple's In-App Purchase platform. It would also presumably be heavily policed to ensure that it wasn't letting misleading, dangerous or fraudulent apps through – an idea so good, it is second only to the idea that the App Store itself might competently enforce those policies on the App Store's own listings. But speaking of that, this would also open the door to a pet theory of mine: that with other app stores allowed and available, the Apple App Store itself could be free to truly only allow the apps matching a higher standard.

Monopolist, Wary of Being Recognized As Such, and Convicted of Same, in Transparent Ploy to Confuse, Announces Plans to Collect Half as Much Rent That They Were Never Entitled To, Assuming That You Don't Magically Find Unexpected Success, In Which Case You Know What, We Think We're Gonna Help Ourselves to Some of That Rent Anyway

Ingredients:

  • 20% long overdue but underpowered adjustment
  • 30% confusing and capricious rules
  • 30% opportunistic PR
  • 20% Fuck Everyone Big Enough That They'd Be Able To Have An Effect In Public Opinion, You Know Who You Are, Epic Spat That Shit and You Didn't Kiss The Ring, Now Eat Shit You Bunch of Cocksuckers, Except For You League of Legends But We Already Have That Backroom Deal

Matthew Green: Ok Google: please publish your DKIM secret keys

An accident of the past few years is that this feature has been used primarily by political actors working in a manner that many people find agreeable — either because it suits a partisan preference, or because the people who got “caught” sort of deserved it.

But bad things happen to good people too. If you build a mechanism that incentivizes crime, sooner or later you will get crimed on.

Email providers like Google have made the decision, often without asking their customers, that anyone who guesses a customer’s email password — or phishes one of a company’s employees — should be granted a cryptographically undeniable proof that they can present to anyone in order to prove that the resulting proceeds of that crime are authentic. Maybe that proof will prove unnecessary for the criminals’ purposes. But it certainly isn’t without value. Removing that proof from criminal hands is an unalloyed good.

Well-argued post in favor of large mail providers (like Google) rotating their signing keys periodically and publishing their secret keys post facto. While it sounds insane or at least reckless, it removes a point of tension that right now is hurting people, and does not materially hurt the purpose for which DKIM was invented – to authenticate legitimacy as emails are being delivered. And as mentioned:

A paranoid reader might also consider the possibility that motivated attackers might already have stolen Google’s historical secret DKIM keys.

Cryptography, its applications and its consequences are very subtle, and it's hard to get one benefit in isolation. Rather, you buy into a set of behaviors, some of which are not immediately obvious. We are used to encryption establishing privacy, but deniability is also an aspect of privacy. I wonder if the initial threat modeling of DKIM foresaw this, or if it maybe saw the unintended consequence as an unconditional net good.
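
To make the delivery-time role concrete, here's a minimal sketch of the check DKIM exists for, using the dkimpy package; the filename is hypothetical. Verification consults only the public key currently published in the sender's DNS while the message is in transit, which is why publishing an old private key after rotation doesn't undermine delivery – it only removes the ability to prove, later and to third parties, that a leaked message is authentic.

```python
# Minimal sketch of a delivery-time DKIM check (pip install dkimpy).
# "incoming.eml" is a hypothetical raw RFC 5322 message with a DKIM-Signature
# header; dkim.verify() looks up the signer's current public key in DNS.
import dkim

with open("incoming.eml", "rb") as f:
    raw_message = f.read()

if dkim.verify(raw_message):
    print("Signature verifies against the key currently published in DNS")
else:
    print("Signature missing or invalid")
```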

Apple Silicon: Inaugural M1 Macs

To sum it up: Intel has been embarrassing itself for the past few years, Apple has been building up impressive silicon know-how for the past decade, and ARM-architecture chips have been inherently power-efficient since they were invented.

The M1 versions kick the Intel Core versions around the block a few times in most workloads, and the integrated GPU cores have been receiving tuning and tweaking as part of pipelines targeting Metal on iPad and iPhone for years. There's also no reason to doubt the statements about power efficiency and performance-per-watt increases. And having a completely fan-less MacBook Air in a way that is not just a dangerous fool's errand is a remarkable achievement.

There are still open questions, though:

  • The M1 leans into UMA, a "unified memory architecture" where the memory is shared between the GPU and CPU. That's great most of the time, but how is performance affected when both the GPU and CPU draw lots of resources?

    Will future M chips allow off-SoC memory in the first place, or off-chip GPUs? (eGPUs are noted as incompatible with the M1. Update: I was mistaken; only the MacBook Air M1 is incompatible, probably because of the lack of thermal envelope. Update: Apple's page was mistaken, as now no M1-powered models are listed at all.) Being the fastest integrated GPU is impressive, but it's also like walking the fastest; there are plenty of chips out there that can run.

    The M1 tops out at 16 GB now and the previous Mac mini offered up to 64 GB; how far north will the included RAM go in the future?

  • The Mac mini loses the two additional Thunderbolt/USB-C ports and the 10 Gbit Ethernet option of the Intel model it replaces. Will this type of functionality simply drop off the radar?

  • Both M1 MacBooks use the ISP to clean up the image, but still use 720p webcams? What year is it? Has no one heard of "garbage in, garbage out", or are they betting that extra chip functionality will solve literally every problem?

The M1 is the first step of the transition; the opening salvo instead of the crescendo. There will be surprises and revelations ahead, and I look forward to seeing genuinely new designs within the next year, including next spring and during WWDC.

Don't forget: The Macs that were revved today were from the bottom part of the lineup. For all we know, the M2 or M1X, containing both faster cores and the additional infrastructure for a fuller system architecture, is still being finalized. It takes time to build out the chip capabilities from iPhone/iPad-level chips to something capable of handling all subsystems in all Macs. PCI Express (used by Thunderbolt/USB4) and virtualization support are two examples of features already present in the M1 that no previous Apple SoC has needed to support.

Apple Silicon: The Roads Not Taken

[Please note: This was written ahead of today's announcements.]

My track record on predicting things out of the blue is pretty spotty, so here are a few things I can imagine but that will probably not materialize.

  • "Apple Pi"

    Raspberry Pi-like, "tinkerer-friendly" Mac, for under $100.

    Compare the prices of most single-board computers and the x86 models are steadily either significantly more expensive, or running four-year-old Intel Atom CPUs, or both. Not only do ARM processors not have the issue of having to keep Intel afloat, Apple has itself had experience putting out small SoCs in surprising places.

    If they did do this, chances are they'd make it all about hosting stuff on iCloud, writing code in Swift (maybe using a connected iPad). I don't quite see how it can both be what the Raspberry Pi crowd likes and what Apple likes at the same time. Apple's not interested in enabling tinkering. It's interested in making kids code, but on a high-margin iOS device and up. With the way macOS has moved recently, there's little making this a Mac as such, but it's more a Mac than iOS/iPadOS.

  • "Mac nano"

    A Mac mini the size of the Apple TV, for $199, with 4GB RAM, 64/128 GB of iPhone-like storage, hardly any I/O, and probably an A12, A13 or A14. BYODKM – hook up the display with HDMI or USB-C, hook up keyboard and mouse wirelessly or with a USB-C hub/adapter.

    The old Steve Jobs quote was "we don't know how to make a $500 computer that's not a piece of crap", and Apple can now comfortably pack in the computational power for an okay enough experience for what people are likely to plug into it. As long as it runs the software well enough, it's a candidate to bring people over from Windows, and they're about to lose the fallback "if all else fails you could use it as a Windows PC"; it needs to be cheaper.

    ("Mac SE" was already taken.)

  • An affordable Mac mini

    Take the current Mac mini, make it a bit smaller and make it affordable. Again – the Intel tax is gone, and Apple, if they want to, can already churn out silicon at large scale by themselves. The first Mac mini was $499 – there's no reason the first ARM Mac mini can't be.

All of these products essentially are based on this: there's an Apple that makes iPhones for $399 with industry-leading performance, and there's an Apple that sells wheels for almost twice that price. It's up to Apple to define what they want to sell and how they want to market it, and heading into a transition where you drop a hardware partner for your own designs is a perfect time to choose a new tack.

Say what you want about whether Apple wanted to offer lower-end products before, the price-to-performance ratio with Intel never made much sense. And if a Celeron or Atom didn't exactly scream high enough performance, neither did PowerPC chips that were lower-end than the ones they put in their low-end Macs back in those days. In a way, Apple hasn't had the opportunity to tackle this head-on for at least 20 years or so, so we don't really know that the idea has been rejected by Apple rather than by circumstance.

Mister President-Elect

They just elected to tell the monkey "you're going to have to turn that machine gun in".

(I have a longer post with a lot more going on, but it's not coming together, and there's a limit to how late I can post this and still be reasonably timely. I also have other things to post about and I don't want to not have acknowledged this when I do.)

Sean Connery dead at 90

Much will be said about his defining tenure as James Bond, of his dialect, of how people are both shaken and stirred.

For me, Sean Connery redefined the nature of creativity. If you haven't already, watch Finding Forrester this weekend. Its teachings sit deeper with me than the ur-meme and subsequent genre that it birthed, although we may hopefully all instinctively understand it a bit better by now.

Oh Gee

First, Apple's come a long way from calling carriers "orifices". Without checking, Verizon probably got more stage time than the lidar in the iPhone 12 Pro, where it assists autofocus and plays a big role in magically making photos work out even for people who have never knowingly 3D scanned something in their life.

But more importantly, the sense I've got is that 5G isn't a dud technology but that it really only provides its advantages in areas where it's really well built out. With mmWave, the intense half of the technology, that means literally having line of sight to a tower. I can see why they focused on stadiums and the NFL, and why Apple picked a carrier who could say "we're now rolling this out for real", to get past any argument that it's been bad until now.

All of the presentation and indeed the product site itself are packed with demonstrations where all of these things will download completely before the touch debounces, before the foot hits the ground, before the next time the hummingbird flaps its wings. It's not just in the flashy montage video – Apple is setting the expectation that this is how life will be with 5G, and that requires that these speeds are realistic and omnipresent and dependable. Either they are right or it will be worse than this.

If they are right, it's a huge leap forward for 5G, since that's not the known, measured experience of anyone with access to a realistic 5G network today, never mind the people in the coverage map's vast shadow. If it's worse, are they going to hide behind people not having Verizon as a carrier, or not being one of the 200 million Americans ostensibly covered by the built-out non-mmWave network? The iPhone is a worldwide product, and even in the US most people don't use Verizon, and for all the US-centricity, these days the experience of using it rarely comes with great big asterisks.

If this all is a great big "fuck you" to AT&T over "5GE", though, good job.

Sony tears apart PlayStation 5

I've never owned a PlayStation and maybe never will, but this was an interesting overview. A crazy mix of conventional PC and bespoke architecture – and the CPU is cooled with liquid metal! (Which from what I understand takes extreme care, because it's also electrically conductive.)

Home

Far too often, the soul of someone who happens to be a Mac user is seen through the lens of corporate communications — about crazy ones, misfits and round pegs in square holes.

I'm a Mac user and have been so for most of my life. Growing up, System 6 was a staple, but I also remember a Compaq portable with the mouse trackball at the right side of the screen and the mouse buttons on the lid, and eventually using Windows 95 and 98. I came back to Mac with Mac OS X when I left a really powerful PC for a computationally dinky Aluminum PowerBook G4, and have not owned a full-on desktop since, even if I do own a NUC and a few Raspberry Pis.

I have recently been in a mode of deep (Windows-based) user interface focus at work, and was describing Panic to someone when something came over me. I love these guys. I love the attention to detail of every large, small and medium thing, the time put into making an application that feels right and flows right, that's easy to use, that has just enough user interface that you can get done what you need to get done, that has style, function and whimsy.

Panic is just an example, but there are so many of their ilk that I can point to. Many Tricks and their excellent Witch; Ranchero's NetNewsWire; Omni and their myriad of productivity tools; Noodlesoft's Hazel; and so on, and so on, and so on.

I've used Windows every day for as long as I can remember in one way or another. I can find my way around there as well as on any other platform. But while on the world's biggest desktop OS, I still feel constrained by a meandering vision, by a lack of common conceptual ground, by the infuriating feeling that so much is built and left unfinished, unpolished and put out to rot.

The Mac gets a lot of flak from people who are nose deep in technical specifications and price matchups. What they don't see — or aren't interested in — is the intangible: the culture in which people with big dreams and small means have made the unconventional available, the complex seemingly simple and the advanced accessible. This culture doesn't live or die by Apple in particular, although the original Macintosh being a product of a similar mindset helped set the tone. This culture produces things that are hard to find elsewhere, not because it's technically impossible to do, but because the values that drive those other platforms produce different outcomes.

I am upset with a lot of recent technology, because it all seems intent on burying history as part of remaking the world. Not everything new is bad or worse than what came before, but so many important learnings are being thrown out. You can't make a web app without first filling it with a big framework to implement basic interactions, and most of them lose the tactility and the richness of most native interfaces of any platform. You can definitely build good web interfaces driven by snappy and well-thought-out engines, but it takes intense focus and hard work to do so, and it's easier (but not cheaper) to just throw in Bootstrap and end up working just as poorly as most other web sites. Electron takes all this and wraps it up in a computationally horrible footprint, under the insulting guise of "native".

iOS, iPadOS, watchOS, tvOS and now macOS with Big Sur – all the recent advancements seem to come at the expense of the wide latitude that used to produce great results. The freedom that allowed a seamless experience is chopped up by security concerns ham-handedly and haphazardly applied, and on most platforms, most of the time, topped with having to pass through the eye of the needle of a trillion-dollar enterprise's hungry bean-counting and control. All for the purpose of being a populous platform for its own sake; for having more apps that cost little, grey gruel instead of food; for padding a bottom line if you're cynical, or stroking a corporate self-image run amok.

The reason, all these things considered, that I haven't left these platforms yet is that there's still the feeling of being in a garden of my own cultivation. I can control every nut and bolt and swap out infinitesimal details and fundamental building blocks in Linux, that's true. But that means that people do, and you end up worrying about technological fundaments because of this uneven foundation. This soil does not bear great fruit, efforts by GNOME et al notwithstanding, and the culture lionizing the endless flexibility of the command line and the architectural purity of UNIX gives an easy escape hatch for any problem.

Windows is seemingly more stable in this aspect, but while I am able to live in that house, I am not able to make it my home, and it's not for a lack of trying. Microsoft's repeated wallpaper-stripping and ever-changing priorities make it feel like an enormous mansion under constant renovation, with uneven floors, studs poking through the walls and fundamental features left broken or half-finished since the last time they cared. (The less said about the impressionistic "Fluent" wing entirely in featureless acrylic, the better.)

The culture and the people and the shared values and what it all comes together to produce. That's why I'm still here. You can live in many houses, but not all of them will ever feel like home. I'm upset with the landlord and the building manager who ignore leaking pipes and oiled floors catching on fire while upping the rent and turning a blind eye to hustlers running Three-card Monte, but aside from that, I love the neighborhood, I love the surroundings, I love that they value the things I do and I love what it can build over time.

Joseph Gentle: "I was wrong. CRDTs are the future"

What’s the JSON equivalent for realtime editing that anyone can just drop in to their project? In the glorious future we’ll need high quality CRDT implementations, because OT just won’t work for some applications. You couldn’t make a realtime version of Git, or a simple remake of Google Wave with OT. But if we have good CRDTs, do we need good OT implementations too? I’m not convinced we do. Every feature OT has can be put in to a CRDT. (Including trimming operations, by the way). But the reverse is not true.

I have been looking sideways at technology like OT for the past 10 years or so, wondering if it'll ever be part of anything I do. CRDTs are refreshingly simple, and have less of that ominous "if you fuck this up, you fuck everything up" feeling, but they've been limited to simple, known-good constructions similar to concurrency primitives.

There is a layer of subtlety and complexity that will never go away with real-time editing, but I hold out hope that there can be CRDTs that encode larger patterns too, and can be used to compose larger state graphs in a way that keeps the choices you have to make right in front of you and avoids "trap doors".
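
As a concrete example of one of those simple, known-good constructions, here's a minimal sketch (mine, not from the post) of a grow-only counter, one of the textbook state-based CRDTs: each replica only increments its own slot, and merging takes the element-wise maximum, so merges are commutative, associative and idempotent, and every replica converges to the same value regardless of ordering.

```python
# Grow-only counter (G-Counter), a state-based CRDT sketch.
class GCounter:
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}  # replica_id -> per-replica count

    def increment(self, n=1):
        # A replica only ever increments its own slot.
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def merge(self, other):
        # Element-wise max: commutative, associative, idempotent.
        for rid, count in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), count)

    def value(self):
        return sum(self.counts.values())

# Two replicas update independently, exchange state in any order, and agree.
a, b = GCounter("a"), GCounter("b")
a.increment(3)
b.increment(2)
a.merge(b)
b.merge(a)
assert a.value() == b.value() == 5
```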

Steaming

Three quick things:

  1. Why, when installing a Steam game on macOS, do I get the option to "Add to start menu"? In macOS it's not available. (And in Windows, it's called the Start menu.)
  2. Why, when the Steam app has found an update and properly badges and bounces the Dock tile, does clicking the tile or switching to it by any other means hide the update window? My guess is that the switch-to-the-app trigger invokes "show the main window at all costs". The window is also not available in the Window menu for switching to, or seemingly visible in any other way (Witch doesn't seem to find it).
  3. Why the hell is Portal 2 (and the other handful of first-party Valve games) still not cross-compiled for 64-bit macOS, so that it is still playable now, and would be playable (at least for a few years) on future Apple Silicon Macs? This is not rocket science, and Valve wouldn't even have to do the technical work themselves.

The first two should have been found within the first week of actual use. All of them should have been taken seriously and fixed as soon as possible.

Jordan Rose: Objective-Rust

Yep, that’s Rust code with embedded Objective-C syntax, and it works. Why would you do such a thing?

Fast Company: "Facebook is quietly pressuring its independent fact-checkers to change their rulings"

The video was notable because it had been shared by Lila Rose, the founder of antiabortion group Live Action, who has upwards of five million Facebook followers. Rose, leery of restrictions on her page and handy with claims of Big Tech censorship, quickly launched a petition protesting what she alleged was bias by Facebook’s fact-checking partner, a nonprofit called Health Feedback. Soon, four Republican senators, including Josh Hawley of Missouri and Ted Cruz of Texas, wrote a letter to Zuckerberg condemning what they called a “pattern of censorship.” They demanded a correction, a removal of all restrictions on Lila Rose and her group, and a “meaningful” audit of Facebook.

Soon, the fact-check labels were gone. A Facebook spokesperson told BuzzFeed News at the time that the labels would be removed pending an investigation “to determine whether the fact checkers who rated this content did so by following procedures designed to ensure impartiality.”

How very independent.
