Take


TikTok tracks everything, obfuscated out the wazoo

bangorlol on Reddit:

I reverse-engineered the app, and feel confident in stating that I have a very strong understanding for how the app operates (or at least operated as of a few months ago).

TikTok is a data collection service that is thinly-veiled as a social network. If there is an API to get information on you, your contacts, or your device... well, they're using it.

[..]

Here's the thing though.. they don't want you to know how much information they're collecting on you, and the security implications of all of that data in one place, en masse, are fucking huge. They encrypt all of the analytics requests with an algorithm that changes with every update (at the very least the keys change) just so you can't see what they're doing. They also made it so you cannot use the app at all if you block communication to their analytics host off at the DNS-level.

For what it's worth I've reversed the Instagram, Facebook, Reddit, and Twitter apps. They don't collect anywhere near the same amount of data that TikTok does, and they sure as hell aren't outright trying to hide exactly whats being sent like TikTok is.

Paints a fuller picture than "it helps itself to your clipboard", doesn't it?

On Tools and Developers

During the (excellent) WWDC episode of The Talk Show, there was this:

John Gruber: I've seen in, day one, there has been some - to me - misreading the message, but some coverage along the lines of: "Apple is moving the Mac to its own silicon to further lock in [insert either developers or users or both users and developers]", that this is to increase lock-in. And I just have to ask... I don't see it, because I've seen these announcements and I don't see where that's coming from in terms of any aspect that was announced.

Craig Federighi [SVP, Software Engineering, Apple]: I think those guys are being total tools, honestly.

This comes in the middle of a segment where both Craig and Greg "Joz" Joswiak (VP, Product Marketing) talk about the lack of respect they get for showing that their focus is on the Mac, and that they want the Mac to continue existing, not be subsumed by or replaced by or absorbed into the iPad and its OS, and so on and so forth.

To begin with: I agree with them to a point. As far as I've been able to tell, you will not be allowed to do less on macOS Big Sur running on Apple Silicon than on Intel processors. The limitations that are there have excellent technical motivations: it is hard to straight up virtualize a different processor architecture, and they have still provided good support for everything short of that, in the form of automatic binary translation of programs compiled for Intel.

I also empathize with the pressure on a personal level. Although they are well-compensated for it and should be accountable for the decisions they make, there's no need for incivility or personal attacks. Luckily, I do not have to stretch myself far to enumerate a number of reasons why this is the feedback Craig and his team get.

Apple's modus operandi is to find big changes, make big bets, and go from the current status quo to where they want to be over a series of small, incremental changes. Once in a blue moon, a big change is needed, or a new component or technology or device or market needs to be introduced, and this is seen by some, especially outside experts, as the core of Apple – to the point where if they don't do it regularly, they have "lost the innovative spark" and you would be a complete fool not to immediately short AAPL. But the core is the silent trudge, the long term mixed with the incremental.

Apple is judged by their actions, by their behavior and by their history, and in the absence of roadmaps and rationalizations, and in the recurring presence of re-contextualizations as new changes happen, the guessing game is the result. Every change turns into a proposed Chekhov's gun.

  • When Sandboxing is introduced, the logical conclusion is that at some point it will eventually be required for everything. Since Sandboxing is inconsistent, flaky and insufficient, and since many of the current applications many people depend on could not survive Sandboxing, this creeps me out. The conjecture is unproven – Sandboxing is not required for everything, but we are also not at the end of history. Office is now sandboxed, but also has loads of handcrafted exceptions. Sandboxing wasn't built to scale to accommodate all current app behavior, not even of just the legitimate apps; it was built to force developers into a box. For increased security, yes. At the cost of tremendous inconvenience and missing features for the apps that adopt it and their users: also yes.

  • When the App Store is introduced on Mac, with the bluster that all developers should find it manna from heaven because it unties knots many developers had never even run into in several years, the logical conclusion is that at some point everything will be required to be in the App Store. This theory has gone both back and forth, due to the introduction of Developer ID and Gatekeeper, and then the subsequent move towards making it harder to allow non-Developer ID apps.

    I stopped writing Mac apps largely because of this. I want to maintain a pseudonym, and Apple's assumption that it means I can't be trusted to make well-behaved software offends me personally. My software was used inside Apple, and I got bug reports when it broke on in-development OSes. The last updates I released, I signed with my own signature chain - I have no qualms with the security aspects, or with cryptographic signatures or even with blocking them after the fact.

  • When 32-bit Intel application support was dropped, it meant people couldn't run some applications any longer. There's now excellent justification to believe this happened to lessen the burden of the already capable Rosetta 2. But it still means applications people paid for, learned, loved, were productive with, possibly got a Mac for, stopped being usable. This is the opposite of user-friendly, and there's not even a security angle.

  • When Mac hardware consistently and steadily lost ports and user-accessible/user-serviceable parts over the years, they hardly ever came back. Additionally, some hardware was left to rot, and had a tendency to return at a steeper price. There are exceptions – worth noting precisely because they are exceptions.

    Many developers could use Mac Pros in the cheesegrater days, because it was recognized that you didn't have to be a filmmaker with a studio budget for modularity and customizability to be useful. And yet, even when the Mac Pro reverted from the "trash can" form factor to a full-on accessible desktop workstation setup, as it previously had been, the price was hiked significantly and the product was completely repositioned – in the same breath as Apple declared developers one of their most populous "Pro" groups; indeed, the announcement was made to a room full of developers. There's a wide array of displays, and if people don't need an HDR display, they don't need to buy the Pro Display XDR, but making an entire Mac model inaccessible is a different animal.

If you meet a person and they act a certain way, over time you learn to recognize that pattern in them. If you develop for Apple platforms and every year is a series of new inconveniences to manage just as much as it is new technology to consider adopting, you learn to assume a negative progression in convenience, utility and freedom, just as much as you have hopes for the advances in frameworks and hardware.

The "tools" Craig's talking about have all seen the beginning of, effectively, the closing of the Mac as a platform. We know that Apple doesn't like to dwell on the bets they make, and we know that Apple doesn't usually back out of things. We're waiting anxiously for the moment where the hammer drops. That means we assume that sweeping transitions will bring those changes. The small ratcheting moves have often happened without being announced, or by being announced with individual bullet points in presentations during the week of WWDC (the interview was recorded on Tuesday, by which point not even half of the presentations were available).

This bed is of Apple's own making. By never copping to imperfection, by never really listening to and answering the detractors who are in its own camp, by avoiding humility and the taking of other perspectives than its own, the only method of communication left is loud and clear dissent.

Satisfaction Dissatisfaction

You might think that for a person with so many opinions, I would love rating interactions and customer satisfaction. The truth is that I get stressed out at the prospect most times.

Most rating systems are used to form a single number, a single indicator saying "are we on track, are we doing a good job?" and there's nothing wrong with that, except that my data point will be "is representative X doing a good job?". Most of the time, people aren't rude and they aren't unreasonably applying the script or process in front of them, and there's no reason for me to give them a low rating for that. Almost all the time, I do have significant qualms with the process itself in one way or another, but there's no way for me to express that aside from putting representative X in hot water for "being so bad with customers", or nudging them down some well-meaning but counter-productive ladder of incentives.

For any company that really cares about their customers' actual satisfaction and their process, whatever it is, the question should be split into at least two questions: Was our representative helpful, supportive and informative during the execution of this process? and Are you satisfied with the process?

Phoronix: Perl 7 Announced As Evolving Perl 5 With Modern Defaults

This truly is the weirdest timeline.

WWDC 2020: iOS 14

Reflections:

  • Unusually light on features.
  • There are crash-on-use incompatibilities in the release notes, but so far this is notably solid for a first beta, maybe even iOS 12-level. Probably related to the previous point.
  • Moving most stuff I use some of the time to a second home screen and turning off the third-to-nth home screens knowing the apps are still all in the App Library is a very good feeling of freedom.
  • The only widget I use so far is the weather widget, which, when set to Current Location, occasionally teleports back to Cupertino (homesick?), after which it refuses to reflect the actual location unless you change to another location and back. This sounds like a consequence of the way widgets are pre-baked, UI-wise.
  • Update: some widget-related manipulation, and the automatic moving-aside of apps and folders it triggers, ended up straight up removing two folders and two apps, which had to be re-added from the App Library. (One could think that they would have slipped onto one of the app pages/home screens since hidden, but that doesn't seem to be the case.) These folders had been present in their current configuration for several years, iOS versions and indeed devices.
  • Finding an app in the App Library is easy enough, but adding it back to the home screen is inconsistent. As far as I can tell, you have to find it within one of the folder-like blobs in the App Library (as one of the three featured apps, or by expanding the fourth, stacked item) and drag it from there. You can pull up the list/search results from the App Library and drag the icon, but not the full row. And if you pull up the regular home screen/global search, you can't drag either the icon or the row. In none of these four cases does long pressing bring up a context menu.
  • Emoji search is a winner, although the always-there search field adds some height, which is robbed from the app itself.

WWDC 2020: Apple Silicon

Reflections:

  • First of all: it is going to be incredibly interesting to see where the architecture can go without the thickness of an iPad being a constraint. Shipping an existing iPad chip is basically as strong a statement as they can make that they're not showing their cards yet.
  • Making a big deal of virtualization still being there is necessary, but the way it was presented gave the (false) impression that virtualizing Intel from Apple Silicon was possible. To the "UNIX-and-Docker-using developers" that were mentioned, there's a hell of a difference between being able to use virtualization and containers for x86-64 vs for ARM: the dry technical capability is intact, but you miss out on the entire ecosystem of x86-64 containers and operating systems, which is most of the point. Were they trying to go through the keynote without using the word "ARM"?
  • Considering that the transition will be about six months old as the first hardware is being shipped, I'm guessing the two year length of the transition will be necessary to develop hardware, architecture or OSes for the Mac Pro end of the spectrum.
  • The unified memory architecture between the CPU and GPU is being touted and underlined as "modern" – I'm wondering where this leaves GPU support, even external. To a degree, even the Afterburner card seems dated by this framing, but maybe it'll be baked into the Mac Pro equivalent to begin with.
  • If the unified memory architecture is such a big deal – will any Mac even be able to have user-installable memory after the fact? The term of art used during the presentation was SoC – System-on-Chip – and not CPU and GPU, and for them at least in the current form factors, all direct RAM usable by the processing units is hooked up inside the die. I guess they can carve out an iMac Pro/Mac Pro-sized exception to allow plebeian DIMM modules in addition to the on-chip RAM too. This session may contain answers.
  • All demos were seemingly made on Pro Display XDRs, which use Thunderbolt 3 only for video signaling and not USB-C – if I'm not mistaken it runs two parallel DisplayPort streams to be able to fill up the display, leaving only enough pins and capability for USB 2.0 on its USB-C hub. But the tech specs of the Developer Transition Kit list only USB-C, USB-A, Gigabit Ethernet, and a single HDMI 2.0 port. Did they all use some screwball converter to the HDMI port, which would have to be enlightened to their peculiar multiplexed Thunderbolt-DisplayPort connection?
  • Update: Apple will continue to support Thunderbolt, with strong references to Apple and Intel having co-invented it, maybe to distance them from AMD which has had a famously hard time getting support. The statement doesn't include a reference to a version number, so it could be read as Thunderbolt 3 being supported by its inclusion in USB4.
  • OpenGL support will be present-but-deprecated from the start, which essentially means the full OpenGL stack (beyond OpenGL ES) is available.
  • Rosetta for PowerPC/Intel was barely able to run an Office+Photoshop demo convincingly and was labelled "fast (enough)" in the slides; with Rosetta 2, we saw recent-ish AAA games and 3D modeling software being labelled as "great" and "fluid" and "without hitches". Doing it up front surely helps, but they've raised the bar of expectations by a lot this time. If they're launching as soon as by the end of the year, they'll have to deliver.
  • Being able to use XPC to support Intel and ARM plugins separately is inspired. I do wonder how many applications in the target audience allow for such a platform-specific architecture though.
  • Depending on how things shake out, especially with the desktops, this could be the end of the Mac being a "PC"-family architecture. The screws will be put to anything new that has to be brought along, and many things are carried on only reluctantly and/or temporarily. The explicit mention of still being relevant to multiple-volume-multiboot-OS-external-drive-UNIX mavens is a strong signal they don't intend to go all the way, but whether it'll be enough for people who need something that's PC-like in its structure is anyone's guess.
  • No word on whether apps from unidentified developers will still be allowed. (Update: allowed, but notarization is required.) The UNIX mention is interesting, because cross-platform command line tools can't really be expected to be fully packaged as macOS-enlightened, including notarization.
  • Running iOS/iPad apps seems like a gimme and recontextualizes wanting to make Catalyst so badly, but it also seems like even more of a half-solution without a touch screen, which does not seem likely without an API to enable it – and that API would have been announced by now. Then again, iPadOS pointer support sprung up virtually overnight.
  • Having Office, Photoshop and Unity ported enough to be running from day one is far from "let's fly out the guy from Mathematica the week before". The Intel transition, even though the technical foundations had been laid for years, was famously closely held; I wonder how long this has been cooking?
  • Having prepared ports of open source components is also a sign of the times.

WWDC 2020: macOS Big Sur

Reflections:

  • The new UI style is not entirely "flat gone mad" – it allows for depth and shadows and materials. Look at the speech bubble in the Messages icon, the envelope in the Mail icon or the pencil in the Pages icon. Many instances look a bit over-the-top-for-the-sake-of-the-effect, though.
  • I am not a big fan of the continued slaughter of available-space-for-the-actual-title in the title bar, or similarly of cleanly draggable areas.
  • The frontmost/active window needs to have a much more prominent title bar. Just going by the traffic lights isn't going to cut it.
  • Going from poorly-delineated buttons to borderless buttons isn't a good idea when the button icons are just outlined shapes that only gain a button shape when hovered. Being in the unified toolbar-and-title-bar area is a sort of cue, but not as strong as just having a graphically richer icon to begin with.
  • Dear god, the just barely opaque menu bar is back, and it's just as horribly unreadable as a few years ago. Do we really need to keep doing this?
  • Control Center with modules that can be dragged into the menu bar – this I actually like. Coherent, rich presentation that is customizable, and where the customizability plays to the strengths and structure of macOS.
  • Catalyst better have grown some strengths, because the macOS Developer app released only a few days ago is a complete UI shit show that still feels neither like a Mac app nor an iOS/iPad app.
  • With all respect for the design upheaval and the technology changes – this is incredibly light for any macOS update, and choosing this version to round up to version 11 feels odd. The Intel transition didn't even get its own major version marketing-wise. Maybe it was chosen for semantic reasons?

Brent Simmons: The iOS App Store Brings Users Only Because It’s the Only Choice

This is a misconception that many people have — they think the App Store brings some kind of exceptional distribution and marketing that developers wouldn’t have on their own.

It’s just not true. It lacks even a grain of truth.

[..]

Build it (and upload it to the App Store) and they will not come.

Instead, you have to do marketing on your own, on the web and on social media, outside of the App Store. Just like always. The App Store brings nothing to the table.

Getting Purchase

But let's for a moment assume that Apple does not intend to shut down the App Store. There's something they can do that would have both good optics and be actually productive.

Strike the mentions of In-App Purchase from section 3.1, and replace it with a set of hard rules for qualifying services. The services must be as easy to unsubscribe from as they are to subscribe to, may not use misleading "dark patterns" to trick people into staying on longer than they intended to, and must be straightforward to understand. In other words, the kind of rules that would usually come from hard laws or region-level consumer protection practices.

Sprinkle on some APIs, implemented by the apps, to make it possible to get to the appropriate aspects of the other payment solutions from within Settings or the App Store subscription panels, and there's suddenly a whole lot more freedom without the risk of a precipitous drop in usability or confidence. The people who have been kicking up a stink about this for ages have been doing so because they've been trying to run businesses in a hostile environment – it's in their interest to handle this well.

I mention optics because Apple could still reasonably come out of this as a pragmatic, customer-focused leader. Not a terribly responsive one, mind you, but a leader nonetheless.

John Siracusa: The Art of the Possible

A hardline stance will not sway hearts and minds, and it has proven unable to change developers’ business models without sacrificing the user experience. Apple needs to decide if it wants to be “right,” or if it wants to be happy.

All We Want

One of the reasons I started Take was to make a clean break from my previous site Waffle, where I'd gotten into a rut. There's a whole world full of things to write about, and somehow I centered on the same subjects, with the same mentality and tone, and it wasn't very positive. Now, for good reason, the innate failures of the App Store are in the news again, and I'm writing about them again – and while it feels good to be topical, I do want to do something different and highlight the reasons I am so frustrated.

In October 2007, Steve Jobs posted an article to Apple's Hot News page announcing the development of an iPhone SDK. Before this, he was hounded by developers of all stripes and from all places (including inside Apple) who wanted to develop apps for the iPhone – including getting into a spat with John Carmack, who wanted to put games on the thing. The first iPhone was a watershed in hardware capability for a phone, in being able to deliver applications that weren't just "pocket" versions, to the point where Nokia spent the rest of the day of the original announcement in denial, saying that it must be fake. The demo was carefully choreographed along a working path, but it wasn't fake, and it shipped five months later.

Everyone saw the potential. There was a jailbreak of the then-current iPhone OS shortly after its release, which allowed for unlocking the carrier lock, and let me import one that fall. I wrote the beginnings of an iPhone app with the makeshift jailbreak SDK – it was not pretty, but it did work, and the Cydia package manager that quickly appeared and was bundled with jailbreaks contained many apps, some polished, others not quite so polished.

At the next WWDC, Apple announced the iPhone 3G and iPhone OS 2.0, which included the original App Store. If they hadn't done something to pull in developers – even if they'd included all the apps they make themselves – my guess is that iPhones would now occupy something like 4-7% of the market, instead of the 35-45% they currently do.

The iPhone's killer app was apps. Apps that could live and frolic on hardware and a platform that strived to be something more than the prototypical real-time embedded OS of the time.

For nine months in 2007, Steve Jobs told people that the reason there were no apps was that AT&T didn't want their cellular network to go down, which is straight-up bullshit. There is a resource management and security aspect to having apps run in a constrained environment with sensitive information, but the way you deal with that does not affect the carrier's network, any more than it dictates how those apps are distributed or what they actually do. More so than on Mac OS X, he wanted control, and the App Store gave him control.

Now, in 2020, the App Store is still the App Store. Apple is forced to justify contradictory and inconsistent stances on what is and isn't a valid app, to allow in apps that they probably do not really want to be associated with because otherwise there are consequences, and to deny apps that they probably would want to be associated with because of business decisions. And it has turned into Chekhov's gun: as a completely unnecessary funnel point, Apple can be pressured by dictatorships to drop apps that are politically inconvenient, for bullshit reasons.

I tore into Phil Schiller, Apple SVP of Worldwide Product Marketing and de facto boss of the App Store, because of the things he said in defense of the recent actions, and other responses have set the worst possible tone for Monday's WWDC opening. Every one of us who is reacting to this, everyone who's writing apps, every one of us (I'm told there are 20-30 or so) who is still holding off on writing iOS software because of the App Store in general... we don't want to do this.

If I had five minutes with Phil Schiller, I'd ask him if it was worth it. Is it so worth it to not renege on a silly compromise formed to placate a difficult, if often brilliant, man? Is it so worth it that you end up antagonizing the people who build on your life's work? Is it worth it to be remembered for being a politruk, an apparatchik, spouting an opaque party line that no one in view of the world and the facts agrees with, instead of for having the courage to see a solution that no longer works and take action to fix it? All the status quo gives him and gives Apple is money, at the cost of lost trust and long-fomented frustration.

We don't want to do this. What we want is the same we have wanted from the beginning, from before there was an App Store, from the moment we saw a pill with an arrow and the words "slide to unlock".

All we want is to write apps.

Real apps, good apps, innovative apps, apps that make money sustainably for their creators, apps that build on what's there to go further, apps that make life better, apps that respect the time of their users, apps that are great because they have paid attention to details.

Why won't you let us?

Now slightly less dumb

One of the lower-hanging fruits in the CMS powering this site was that, for each edit, new post or deleted post, it would reload the entire thing from the database. (To keep load times at a minimum, it keeps the HTML-ized state of all the posts entirely in memory – essentially frying each page, but entirely from the in-memory working set, on every page load.)

I wanted to reload only the changing parts, but between having two separate "tracks" of "all posts in chronological order" – depending on whether you only see the published posts, or you're me and you see everything – it was complicated enough that I had skipped it before. But tonight I tackled it, and quickly realized that the only changes that really happen are one post being created, changed or deleted. Limited to that, all that was needed was to make a copy of the list of all post information, find the post's next/prev neighbors for each of the all/only-published combinations, reload them together with the post itself, insert this reloaded information into the list copy, and create a new instance of the everything-about-all-posts object (which does all the lookups and even has the pre-baked JSON and Atom feeds) with the new list.

It now occurs to me that this way of doing it is in fact a bit more dumb because it introduces a flaw. If a post's published date is changed such that it is ordered differently, it will have a different set of neighbors before and after being saved. Reloading everything from scratch each time avoided this corner case. I suppose it's back to the drawing board to fix it – but not before documenting it, because I do find it interesting to write about these things. This site is about opinions and thinking, which often turns into criticizing or critiquing other people and the fruits of their labor. I don't talk much about my own work, so it's only fair to talk about something.

(Update: I fixed it by, after all the work is done, finding the next/prev neighbors again in the new state and comparing the ordered sequence of next/prev neighbors before and after. If there are differences, get all of the new neighbors and reload them too – then, between the two passes, all posts that could possibly have been affected will have been reloaded. This is skipped if a post is deleted, because there's no "after" state that can conflict. It is a bit inefficient because it does all the work of creating the everything-about-all-posts object again, but it only happens on a tiny fraction of all edits and it takes very little time and CPU in the first place, so it's not worth avoiding by complicating the code with tricky, bug-prone logic written only for this rare purpose.)
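To make the two passes concrete, here is a minimal sketch in TypeScript – not the CMS's actual code; Post, AllPosts and the load callback are hypothetical stand-ins – of the reload-the-neighbors-and-compare approach described above.

```typescript
// Minimal sketch of the incremental reload; Post, AllPosts and `load`
// are hypothetical stand-ins for the real CMS types.
interface Post {
  id: string;
  published: boolean;
  date: Date;   // publication date, which determines ordering
  html: string; // pre-rendered body
}

// The everything-about-all-posts object: lookups, pre-baked JSON/Atom feeds, etc.
class AllPosts {
  constructor(public readonly posts: Post[]) {}
}

// The next/prev neighbors of one post within one ordering.
function neighborIds(ordered: Post[], id: string): string[] {
  const i = ordered.findIndex(p => p.id === id);
  if (i < 0) return [];
  return [ordered[i - 1], ordered[i + 1]]
    .filter((p): p is Post => p !== undefined)
    .map(p => p.id);
}

// The two "tracks": all posts, and published posts only.
function orderings(posts: Post[]): Post[][] {
  const all = [...posts].sort((a, b) => a.date.getTime() - b.date.getTime());
  return [all, all.filter(p => p.published)];
}

function reloadOnePost(
  current: AllPosts,
  changedId: string,
  deleted: boolean,
  load: (id: string) => Post // re-reads one post from the database
): AllPosts {
  // Neighbors as they were before the change, across both tracks.
  const before = orderings(current.posts).flatMap(o => neighborIds(o, changedId));

  // Copy the list, then reload the changed post and its old neighbors.
  let posts = current.posts.filter(p => p.id !== changedId);
  if (!deleted) posts.push(load(changedId));
  posts = posts.map(p => (before.includes(p.id) ? load(p.id) : p));
  let next = new AllPosts(posts);

  // Second pass: if the post now sits elsewhere in the ordering, its new
  // neighbors also need reloading. Skipped for deletions (no "after" state).
  if (!deleted) {
    const after = orderings(next.posts).flatMap(o => neighborIds(o, changedId));
    if (after.join() !== before.join()) {
      next = new AllPosts(next.posts.map(p => (after.includes(p.id) ? load(p.id) : p)));
    }
  }
  return next;
}
```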

Another note: why this constant frying (computing the page on demand)? Why not just bake the posts (into static HTML files)? There are two reasons – one is that it would complicate things slightly with templating, and the other is that I want the pages to have the editing UI for me, but be "zero cost" to everyone else. That involves either having some sort of middleware trap to selectively serve something else, at which point I'm already taking a performance hit and might as well just serve it dynamically, or including a JavaScript for everyone in the baked page that checks for something and makes the UI show up – and I don't want the page to incur that cost for most of its readers.
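As a sketch of what that trade-off looks like in practice – with Express standing in, purely hypothetically, for whatever the site actually runs on, and with an invented author check – the frying approach boils down to a handler that serves the in-memory HTML to everyone and splices in the editing UI only for the author:

```typescript
// Hypothetical sketch: serve pre-rendered pages from memory, and add the
// editing UI only for the author, so regular readers pay no extra cost.
import express from "express";

const app = express();

// The pre-rendered HTML for every page lives in memory (the "working set").
const renderedPages = new Map<string, string>([
  ["/", "<html><body><article>…</article></body></html>"],
]);

const EDITING_UI = `<script src="/edit.js"></script>`; // hypothetical editing script

// Invented author check – the real authentication isn't described anywhere.
function isAuthor(req: express.Request): boolean {
  return req.headers.cookie?.includes("author-session=") ?? false;
}

app.get("*", (req, res) => {
  const page = renderedPages.get(req.path);
  if (!page) return res.status(404).send("Not found");
  const html = isAuthor(req) ? page.replace("</body>", `${EDITING_UI}</body>`) : page;
  res.type("html").send(html);
});

app.listen(3000);
```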

Crap Store

Say you don't care one whit about developer freedom, or owning the device you spent $1,449 on (a figure known in the industry as an arm and a wheel). The App Store is still a tremendously bad idea if you want good applications that work.

As detailed, Phil Schiller offers this excuse:

"You download the app and it doesn't work, that's not what we want on the store," says Schiller. This, he says, is why Apple requires in-app purchases to offer the same purchasing functionality as they would have elsewhere.

The amount of contortion this line of logic requires is unconscionable, and it is the kind of reasoning that makes people believe salespeople do not trigger automatic doors. The reason it doesn't work is that Apple's own damn guidelines prevent developers, reasonable people the lot of them, from including functionality that would make it work, and also feed their employees, kids and household pets. Apple allows it if you use a payment method that would require custom development and nick 30% off the top. Since this is so reasonable, I'm sure Apple wouldn't mind the principle of giving up similar amounts of money in the name of communal development of the many ecosystems they so wildly benefit from – not just in terms of personal talent, but in enabling people to purchase high-margin products in the first place. Oh, wait.

If we allow ourselves to tease apart the thinly-disguised organizational avarice from Mr Schiller's statement, a trace of a soupçon of a reasonable usability concern can be detected: it would just suck if apps weren't usable and weren't good. The problem with that is that the App Store, for all the relatively recent improvements to its storefront and editorial articles, still has a rampant shovelware problem. It's entirely possible to go looking for something you know you want and come out with something else, and it's entirely possible to find 25 apps with no thought or care put into them for every app that is even merely competent.

The App Store has been a crappy idea from the very start. It serves only the monopolist owner, and all compromises break in its favor and against the other party, be it the developer or the end user. The only winners aside from Apple are the few app developers that manage to beat the odds and achieve success (often fueled by loss-and-viability-insensitive venture capitalists who don't mind playing foul to grab market share), as well as the unscrupulous corner of humanity that always shows up to squeeze what it can from the naive, the weak and the misinformed. Users get left behind, and developers offer the same well-reasoned feedback year after year after year, without anything changing.

I have no idea if it will take the US or EU torching it for things to change, but it baffles me that the bundling of a web browser was considered a bigger problem than this. One way or another, it's way past due for the App Store and its sharecropping mockery of users and developers alike to burn clean to the ground.

Barack Obama, 35, reads from Dreams from My Father at Cambridge Public Library

The passage that I'm going to read right now takes place right after a party, and what's happened is that typically, when I went to parties in high school, oftentimes there were three or four black people in a room of 300. So finally a black friend of mine and myself decided to invite some white friends to a black party, at an army base, out in Schofield Barracks, one of the major army bases in Hawai'i.

And we immediately sense that they're a little uncomfortable, being in this minority situation. You know, they're sort of trying to tap their foot to the beat... you know, they're being extraordinarily friendly... and after a while, they decide, after about half an hour they say "well, Barack, let's get going", you know, "we're feeling kind of tired", we're feeling this or that. And suddenly, this sense that what I've had to put up with every day of my life, is something that they find so objectionable that they can't even put up with it for a day, and these are good friends of mine, and folks who have stood by me for many years... something is triggered in my head and I suddenly start seeing, as I say in this passage, a new map of the world.

HEY rejected from the App Store

Works the same as Basecamp (also from, uh, Basecamp), which has had an App Store app for years.

The two most compelling arguments against the iOS and iPhone user experience are the App Store policies and the sandboxing restrictions. Some of those choices are made in favor of privacy and speak for the user, but which users are better served by Apple getting its organized-crime-type cut than they would be by getting a functional application designed the way Basecamp had in mind?

Things hideously wrong with Apple Music

Non-exhaustively:

  • Searching for a song by name because you're looking for one version of it and then playing one by an artist that you've never heard of does not imply that I now forever want to know about everything that artist does, or am interested in their genre.
  • In general, playing a single track, or a handful of tracks once, by an artist does not imply that I like this artist or want to hear more like it. Chances are I just want to listen to see what it is, or to play it for someone else.
  • Asking for my favorite artists when Apple Music came out and now hiding the ability to change this list is a dick move.
  • Not being able to cull the recently played list is a dick move.
  • Not being able to play something less recently played in the recently played list to make it more recently played is a dick move.
  • Filing something that isn't German and something that isn't Pop under "German Pop" is inscrutable and US-centric, and reminds me of when Apple used to print Dutch text on Swedish packaging.
  • Not having a way to shuffle all songs by an artist from Apple Music, just the "top songs" as statically determined by someone else, is bullshit.

Michael Harriot on "black on black" crime

Including the poignant:

So, even if a different black person committed every murder, it means 99.989497% of black people DIDN'T commit a murder that year.

That also means 99.994923% of white people didn't commit a murder.

So it's not 13 percent committing half the murders. .010% of black people are murderers and .005 percent of white people. Which means, I'd have to meet 10,000 people before I met a black killer.

Statistics. 🤯

The Intercept: "New Facebook Tool Allows Employers to Suppress 'Unionize' in Workplace Chat"

Absolute best case, someone was strong-armed into building this and snuck this into the presentation to highlight what a terrible idea it is. But in all scenarios: these fucking guys.

ARM Wrestling

We're coming up on WWDC, and in an eerily similar way, the imminent announcement of a transition of the Mac line to a new processor architecture is being reported in a big mainstream newspaper. This time, we're talking ARM Macs in 2021.

There are many questions to ponder:

What about the Mac Pro?

Given that the Apple A series of processors are so ahead of other ARM processors that flagship Android devices released recently are having trouble keeping up with last year's Apple counterpart, I don't think there's an issue achieving performance in the abstract. The ARM architecture has always been famous for its power-sipping, to the point where during the first successful power-on, it was able to run without the power supply being properly hooked up, and Apple's work with thousands of independent power domains to power only the parts of the chips that are necessary at the moment doesn't exactly hurt.

This doesn't mean it's easy to scale up in a smart way that can compete with Xeons and especially modern AMDs. But the story of ARM over the years has been that it's much easier to make an ARM design go fast than to make an x86-family CPU power efficient.

PCI Express is strongly entangled with the idea of a modular PC, but the standard is its own island. There's no reason an ARM Mac Pro can't have just as many (or more) PCI Express slots or interfaces.

But a new CPU brings with it an entirely new environment. The boot environment of the new Macs is much more likely to be based on T2-style security. UEFI is available on ARM, but Apple is essentially starting over no matter what. On top of this, the software stack, the OS and the access allowed can be wildly different architecturally. Being able to stick in a PCI Express module doesn't mean all assumptions will hold about how to work with it. The compatibility story for GPUs will be especially interesting, since Apple's GPU chops are much more untested than their CPU chops.

Even if all of these are handled in the most inclusive way possible, unless there's some sort of extra bone thrown towards Mac Pro users – who have now seen a platform long-neglected, then ostensibly rebooted, twice, back-to-back – the future for the Mac Pro in the position it currently occupies is murky at best. Forming a Pro team and taking everybody out for a ride of gradually coming to terms with actual people's actual needs, only to decide that they are no longer a priority, would be unspeakably stupid. Unless Mac Pros will live on in the current form, there's more to this, although maybe not revealed immediately at this year's WWDC.

With USB4 subsuming Thunderbolt 3, it's not impossible that Mac Pro could just get AMD's best performing CPUs in them and gain an impressive boost. (Although there's other Intel technology to worry about, such as the wireless video standard one that powers Sidecar.)

What about the software story?

macOS is not going anywhere. It would be criminally incompetent to assume that only UIKit/Catalyst-based apps were welcome on ARM Macs. Apple would have to rewrite, to a first approximation, all of their own software for that to fly, including Finder, Safari and all Pro apps. Macs will still be Macs, at least in that regard.

What might happen is that, as has happened with 32-bit-to-64-bit transitions before, chunks of the platform that are being supported and maintained largely for compatibility are unceremoniously sloughed off. With the decade-long trend towards sandboxing, "Developer ID" and notarization, I'm feeling ill at ease about what could be made mandatory. I don't think it will be bad enough that the Mac App Store will turn into the only mode of distribution, but if they ever wanted to do it and anchor it in some technical credibility, now would be the time.

Of course, there's nothing inherent about ARM itself that makes this inevitable. Looking back, it's been more prevalent across ARM devices simply because they've been devices - which are expected to perform a set function first, and where some lockdown to ensure the stability of that function is at least straightforward to motivate. But one of the most widely known ARM devices today is the Raspberry Pi computer, which is the epitome of hackable and open and free from such lockdowns. A general-purpose computer is still closer on the scale to a Pi than an iPad. Being able to bring out the power tools when necessary isn't an obstruction of the function, but rather a part of the function itself.

What about all the little stuff?

The hardware "form factors" that can be brought to market using ARM are worth considering.

  • A Mac mini that looks more like, and develops about as much heat and fan noise as, an Apple TV; similar in size to a NUC, and much more similar in price.

  • A MacBook that out-MacBook (2015)s the MacBook (2015). Lower power consumption means less cooling and less thermal throttling, and much less battery – and much less weight and volume/thickness.

  • A Mac Pro mini, with the best performing chip they can put in there, an onboard AMD GPU, iMac-level SSD storage and user-replaceable RAM. Think the Mac Pro without the PCI Express slots, and likely the size of the G4 Cube. It's not a bad concept (it's just horrible when it's the complete extent of the platform). This is possible with Intel today, but cooling and power draw would likely make it too bulky to be tiny, or bring down the level of performance by a lot. Pro products never used the energy-efficient Core chips, and the G5 was a famously bad fit, so this might be the first time it's considered worth doing again by the powers that be.

Lori Lakin Hutcherson explains white privilege through examples

Concrete, heart-wrenching examples, making it hard to pick a quote. Just read all of it.

And since many people seem to blanch at the phrase itself, it's worth highlighting this part of the conclusion:

As to you “being part of the problem,” trust me, nobody is mad at you for being white. Nobody. Just like nobody should be mad at me for being black. Or female. Or whatever.

Surfaces

I have been lucky enough so far to avoid doing screen sharing and streaming, but I've been in meetings where other people are doing it, and I've seen plenty of streamed videos on all subjects where something is demoed. It is astonishing to me how much effort is put into moving stuff around the screen, and how the thing you want to show is always locked into what a window shows.

For example: The .NET Community Standups have regressed tremendously even as they got a professional studio with its own technology staff, but it still comes down to sharing a big dumb thing, with floating buttons that no one who's watching actually cares about getting in the way of the person. And this is just what one of the biggest companies in the world, in control of its own OS and several video conferencing solutions, can muster.

There is a big need for a new user-visible, user-manipulable primitive on the OS level: surfaces that an application can project. Skype or Slack or Zoom or Discord should be able to have a separate surface for just the video, laid out and with no UI, and a surface that's just the chat, and you should be able to go into your streaming software and grant the 75 OS-level mother-may-I permissions and say: I want the chat from Discord, the video from Skype, my own webcam in this corner and this window from this game or application.

All those things would be latently available, would render directly into some buffer when needed, would be easy to add support for and would cost nothing when not used. The word surface comes from many similar concepts, including IOSurface on Apple platforms where it would most certainly be used to implement this, but the thing I'm proposing is something concrete and user-level. People who care about streaming would know that a surface is available, could hook up and preview any surface from any app (subject to completion of the obligatory round of permission Twister), and would know to ask application authors to add support for them.
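To make the shape of the idea concrete, here is an entirely hypothetical sketch of what such an API could look like to applications. Nothing like this exists today; every name below is invented, and TypeScript is used only as convenient pseudocode for the interfaces.

```typescript
// Entirely hypothetical "surfaces" API sketch; no OS exposes this today.

/** One frame of pixels an app has rendered for one of its surfaces. */
interface Frame {
  width: number;
  height: number;
  pixels: Uint8Array; // e.g. RGBA, rendered straight into a shared buffer
}

/** What a producing app advertises: "I can project these views." */
interface SurfaceProducer {
  /** Stable identifiers like "discord.chat" or "skype.remote-video". */
  listSurfaces(): string[];
  /** Called by the OS only while someone is actually consuming the surface. */
  renderFrame(surfaceId: string): Frame;
}

/** What a consuming app (OBS, a conferencing tool) gets from the OS. */
interface SurfaceBroker {
  /** Every surface from every running app, gated by per-surface permission. */
  availableSurfaces(): { app: string; surfaceId: string }[];
  /** Start receiving frames; returns an unsubscribe function. */
  subscribe(app: string, surfaceId: string, onFrame: (f: Frame) => void): () => void;
}

// Usage, from the streaming side: compose the chat from one app with the
// video from another, without either app knowing about the other.
function composeStream(broker: SurfaceBroker) {
  const wanted = [
    { app: "Discord", surfaceId: "discord.chat" },
    { app: "Skype", surfaceId: "skype.remote-video" },
  ];
  for (const w of wanted) {
    broker.subscribe(w.app, w.surfaceId, frame => {
      // Draw this frame into its corner of the outgoing stream…
    });
  }
}
```

The point is the decoupling: the producing app never knows or cares who consumes its surfaces, and the consuming app composes them without touching anyone's windows.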

Answers to presumptive questions:

  • Doesn't this exist? For all I know, this does exist in some form, but since I've never heard about it as an application developer, and never heard of any video conferencing applications implementing something like it (but making enormous hacky workarounds for call recording to work), it effectively isn't what it could be.

  • Since OBS plugins and overlays exist, isn't this superfluous? OBS plugins and overlays exist and are great insofar as people can tweak and add in what they want, and they shouldn't go away, but it's not this, since it's not open-ended. Application A can't say "I've got views A, B and C if anyone is interested" and application B can't say "I'll take views B and C" without application A and application B having engaged in some form of blood pact beforehand.

  • Should this really be called "surfaces"? This could very well be named "views" to users, but it would be very difficult to call it that to developers who are used to "views" meaning controls in a window. Calling it "surfaces" has a similar issue but can be seen as raising some surfaces to the user level and making them accessible.

  • Okay, so how isn't this like "sources", then? What I'm asking for is pretty much like the interface that allows streaming applications (or any application) to take a stream from a webcam, which is why I think it's technically a solved problem. But by convention those all come from a hardware device, and, not being a hardware engineer or driver developer, my guess is that it's hard for a mere application to project one of these things. And besides, having a separate name could be good too: "please add a pseudo-webcam to your application" is ripe for confusion.

  • Why the OS specifically, couldn't this be polyfilled/provided by something else? Sure, and software like OBS is in a good position to do so if there is a good technical path to stream video to some sort of receptacle that can facilitate it. Making it suitable for real use means keeping it low-latency and efficient and preferably low on copying while respecting security, privacy and permissions. But, hell, if you can figure out a way for applications to advertise that "if you connect to me in this way I'll talk libretro to you and you do with that what you wish", we're most of the way there.

And the final one: Can't you just do this with windows?

Yes, almost, but the problem is that the window has to present a good UI within it. Something that's a good visual that you want to show – when what you want to show isn't literally a UI itself – doesn't have a bunch of buttons and crap on it to annoy, distract and occlude. It's the difference between just looking at the slideshow in PowerPoint, confined to a segment of the screen, with UI affordances and borders and grip handles and what have you, and entering presentation mode, except that surfaces could be even cleaner because no one but the presenter would see the little toolbar with back/forward buttons and drawing tools because they wouldn't be on the surface.

And on a technical level, now you're getting into syncing with the window manager and compositing and screen recording and god help you if someone moves a window or something shows in front, compared to the application saying "this is what you want to see" and filling a buffer, and that buffer being rendered. Recognizing that some things are more like screws than they are like nails, and putting away the hammer for a while.

Programming difficulty texture

As I wrote the previous entry, I realized that many programming languages have their own texture of difficulty.

C is difficult because it is an unsafe veneer over assembly, because it's so easy to fall off the edge and because almost nothing is included. That's how it had to be in the 70s if you wanted to write an operating system, but it's easy today to make dangerous mistakes that compilers still won't warn you about, and to see uninitialized memory. You're free to express a whole lot of things that just won't be compatible with reality. At every juncture you have to know exactly what you intend to do, and intimate details about what the code calling you and the code you call expect to do with values and memory, and tooling to enforce these things is a rare and hard-won commodity.

C++ is difficult, and I have always shied away from it more than any language, because of the mix of ambition and tricks to make things low-cost. Between Rule of the big four (and a half), rvalue references, move semantics and templating, it's the most immensely complicated programming language in the world, and despite all this extra machinery, much of which is invoked implicitly, if you slip up, you crash really badly.

Rust is difficult because of having so many new concepts, and in being so precise and exacting. As far as I can tell, it moves all of this complexity to the compilation stage, so you end up with things not building instead, although some things can still be surprises.

JavaScript is difficult because the standard library is straight up pants-on-head stupid. Never mind the language; the Date object alone, with its "add 1900 to the year", "months start at 0 but days start at 1" and incredibly imprecisely defined functions that have only recently been supplemented with sane alternatives, is a good demonstrative microcosm. Not to mention that its relative poverty of functionality, combined with having grown up in an unsophisticated environment, has led to a culture of many small ad-hoc polyfills.
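For anyone who hasn't been bitten by it, those Date quirks are easy to demonstrate (TypeScript here, where Date behaves exactly as it does in JavaScript):

```typescript
// A couple of the Date quirks mentioned above, verifiable in any JS runtime.
const d = new Date(2020, 5, 20); // year 2020, month index 5 = June, day 20

console.log(d.getMonth());    // 5    – months are 0-based…
console.log(d.getDate());     // 20   – …but days of the month are 1-based
console.log(d.getFullYear()); // 2020 – the sane, newer accessor
// The legacy accessor returns the year minus 1900 (deprecated, still around):
// (d as any).getYear() === 120
```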

PHP is difficult for the very many reasons this is still true. For heaven's sake, I read up on PHP 7 today, and I knew it was created because PHP 6 didn't get anywhere, but I hadn't realized until now that it meant they gave up on having an actual Unicode-compatible string data type.

Rust: Five Years of Rust

Rust is impressive for many reasons, not least because it was an ambitious long-term bet. It was researched for years, with the explicit aim of ruling out the safety issues Mozilla ran into all the time when maintaining Gecko, and it brought a new focus on combining safety with zero-cost abstractions that do not require runtime support. Not many other languages have a known and commonly targeted subset for functionality that does not require run-time allocation. With Mozilla's previous experience of writing Gecko in the first place to replace the Netscape 4 rendering engine (a famously drawn-out process), they wanted something that could support them in gradually renovating their browser subsystems, and so far the results have been encouraging.

I have looked into it, and in my experience so far the programming model has been very cumbersome. It requires you to think in new ways and specify the world, and this is a common refrain even from people much less dense than me. I'm not sure what could be done to help the ergonomics, but from what I understand, improvements in usability have already been made.

Cliff L. Biffle's Rewriting m4vgalib in Rust is a great capsule of the promise of Rust: prove that what you're doing is safe by mechanism and construction, and then not only avoid bugs but also avoid the slowdown of unnecessary synchronization and the like. (Quite the inversion of terms compared to the more popular approach of wrapping things in smart wrappers, and one I hope is the wave of the future.)

Joe Biden's "address on Civil Unrest Facing Communities Across America"

Biden's a documented gaffemaster, but he has survived to this point because he is a competent politician. He's not the one I'd choose of the hundreds of millions of qualified possible applicants, but he's got basic decency, a working mind, a reasonable grasp of his own limitations and the ability to prioritize the future before the present, the bigger picture before the attention-grabbing flashpoint detail and the greater good before his own personal interest. These are not usually unique or distinctive qualities among the two presumptive major party candidates, nor does the cost of forgoing them usually continue being highlighted so clearly each passing day.

(The assertion of basic decency is contingent on the allegations being made against Biden not holding water. If they do turn out to hold water, there are many similarly (and some even more) qualified candidates to take his place.)

Lynnell Mickelsen: We don't have a protest problem. We have a policing problem.

I didn't realize the full background and consequences of what's happening in Minneapolis until I read this. Coming from a country with a mostly competent and non-corrupt police force, the sheer audacity of what's going on in the US gets stranger and more flagrant every day.
