Take

Filip Pizlo: Speculation in JavaScriptCore

Good, in-depth post from the WebKit team about how JavaScriptCore handles speculative compilation to optimize JavaScript execution.
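
To make the core idea concrete, here is a tiny conceptual sketch in C (not JavaScriptCore code; every name in it is invented for illustration) of the speculate/guard/deoptimize shape the post describes: compile a fast path under an assumption about a value's type, protect it with a cheap check, and bail out to a generic slow path when the assumption fails.

    #include <stdint.h>
    #include <stdio.h>

    /* Conceptual sketch only: a fast path compiled under a type assumption,
       a cheap guard, and a "deoptimization" fallback to a generic slow path. */

    typedef struct {
        int is_int32;                      /* type tag for this toy value */
        union { int32_t i; double d; } as;
    } Value;

    static Value add_generic(Value a, Value b) {
        /* Slow path: handles any combination of number representations. */
        double x = a.is_int32 ? (double)a.as.i : a.as.d;
        double y = b.is_int32 ? (double)b.as.i : b.as.d;
        return (Value){ .is_int32 = 0, .as = { .d = x + y } };
    }

    static Value add_speculated(Value a, Value b, int *deoptimized) {
        /* Fast path emitted under the speculation "both operands are int32". */
        if (a.is_int32 && b.is_int32) {    /* the guard */
            return (Value){ .is_int32 = 1, .as = { .i = a.as.i + b.as.i } };
        }
        *deoptimized = 1;                  /* guard failed: bail to the slow path */
        return add_generic(a, b);
    }

    int main(void) {
        int deopt = 0;
        Value two   = { .is_int32 = 1, .as = { .i = 2 } };
        Value three = { .is_int32 = 1, .as = { .i = 3 } };
        Value sum = add_speculated(two, three, &deopt);
        printf("2 + 3 = %d (deoptimized: %d)\n", sum.as.i, deopt);
        return 0;
    }

The real engine does this across tiers of JIT-compiled code and recompiles when a speculation keeps failing; the sketch only shows the guard-and-fallback skeleton.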

Paul Tozour: The Game Outcomes Project, Part 4: Crunch Makes Games Worse

Extended overtime (“crunch”) is a deeply controversial topic in our industry. Countless studios have undertaken crunch, sometimes extending to mandatory 80-100 hour work weeks for years at a time. If you ask anyone in the industry about crunch, you’re likely to hear opinions stated very strongly and matter-of-factly based on that person’s individual experience.

And yet such opinions are almost invariably put forth with zero reference to any actual data.

Deep and wide analysis of an area that must be incredibly difficult to test in a scientific manner. Includes preemptive consideration of a reasonable counterargument, that crunch is likely most needed in projects that are poorly run or managed to begin with, which risks skewing the data or entangling variables.

David Heinemeier Hansson's Statement to the House Antitrust Subcommittee

It’s worth noting here that we are already paying Apple for the privilege of having access to the App Store. All developers must pay $99/year for a developer license. Apple brags of having millions of developers, so they’re in essence already making hundreds of millions of dollars, just in licensing fees. Nobody is asking for a free ride here! If it costs Apple more than hundreds of millions of dollars to run the App Store, they can raise their prices. We’d gladly pay $199/year for a developer license.

But what Apple is asking for is a cut of revenues, at “highway robbery rates”, and it’s simply absurd. Imagine if the telcos demanded a cut of company revenues, since they provide the phone line that connects customers, in the heyday of their monopoly? Imagine if the railroads demanded a cut of company revenues from the goods shipped in the heyday of their monopoly?

roguelazer: "Etcd, or, why modern software makes me sad"

Popular modern technology is taken over by expats from a megacorp and made worse in the service of a hyper-specialized (and just plain over-hyped) orchestration platform. That's the world today. Anything that has a simple and elegant feature-set ends up coöpted by people who just want to build big ungainly architecture and ends up inheriting features from whatever megacorp the coöpters came from.

I have a tenuous relationship to containers, to orchestration, to automated infrastructure-as-a-service with Puppet, Ansible and so on. They make possible the dream of having a sea of computation, where your service isn't the individual wave or stream, but rather the sum of whatever needs to come into existence at the moment.

The problem for me is – at what cost do you get this? If you're Google or some other huge company, you need to have this anyway; the entire company-wide department (and/or the one or two people in each related effort) is worth the expense and effort, and the complexity at least isn't added complexity so much as things you'd have to deal with one way or the other.

If you're not, well, you either have to spin up a layer of architectural and organizational complexity that Google can support or throw in with some sort of cloud solution that does it for you. Either is costly in one way or another, and handing someone else the keys never absolves you of all the new exciting ways in which something can break or fail to be tuned or proportioned correctly.

Which leaves the option of not doing it. I just received a message from my host that in August, the singular server that hosts this place will go down for a service window that "should be about an hour". Leaving aside the question of how on earth this isn't handled by live migrating these virtual machines to another server, this is a good opportunity to highlight the difference.

Right now, this site hosts everything on one server, with one application serving the pages and one SQLite database storing them. To avoid the downtime, I would need at least two servers and a load balancer, and by necessity either a third server for the database, hoping it wouldn't need to go down at any point, or a vendor-provided "database service" that could come with such guarantees, just like the load balancer.

The point isn't that these services are beyond me, or that it would be terribly expensive in real money, really, even if it would be a factor of 5-10x. The point is that the complexity needed to scale up comes in steep cliffs.

There's nothing wrong with understanding how to scale and decouple in the first place. The current lore and fascination with containers and orchestration invites you to entertain these investments, these aspects and these costs for everything you do. If what you do is a large-scale system where every part is mission-critical (either to a customer or to other infrastructure), they are justified. But what happens when our industry gets enthralled with this way of functioning, to the point where all technology works best like this?

What happens to the developer who only needed the simple solution, or to the many small shops that can scarcely afford the infrastructure or hosting costs, or to the just-as-many slightly larger shops where the resources are there, sure, but people end up spending much of their time monitoring and gardening the large complex system and less time doing actual development? And, to roguelazer's point, what happens to the simple solution, which could have been used by developers and solutions of all shapes and sizes to solve smaller or less complex problems in a meritorious way?

(This is effectively also a corollary to the related but not identical microservices debate, which is essentially the same argument on another plane. Swap out Google for Yelp.)

Jeff Glatt: COM in plain C

COM, Microsoft's 90's-era do-it-all layer of cross-library, cross-language, cross-machine interoperability whose foundations underlie .NET in spirit and the Windows Runtime in actuality, is ridiculously complex, but in service of a great number of functions, providing "automation"-level support where everything could be called from scripts decades before PowerShell.

I was somehow tickled today to wonder what it would take to vend a COM interface in plain C (since most of the abstractions assume you would touch C++ with a five-foot pole), and Jeff Glatt has an eight-part series of articles on CodeProject introducing the relevant concepts piecemeal. Recommended reading to gain a higher level of understanding of how much COM does and, in some cases, how many of the hoops it jumps through for you if you don't mind jumping through smaller, specific hoops.
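
As a taste of what the series builds up to, here's a minimal sketch of the central trick (my own illustration, not Glatt's code, and with simplified stand-in types instead of the Windows SDK's HRESULT, IID and IUnknown): a COM object in plain C is a struct whose first member points to a table of function pointers, with QueryInterface, AddRef and Release occupying the first three slots of every interface.

    #include <stdio.h>

    /* Illustrative sketch only; simplified stand-ins for HRESULT/IID/IUnknown. */

    typedef struct IExampleVtbl IExampleVtbl;

    typedef struct IExample {
        const IExampleVtbl *lpVtbl;  /* must be the first member of every COM object */
        long refCount;               /* per-object state lives after the vtable pointer */
    } IExample;

    struct IExampleVtbl {
        /* The three IUnknown methods always come first, in this order. */
        long (*QueryInterface)(IExample *self, const void *iid, void **out);
        long (*AddRef)(IExample *self);
        long (*Release)(IExample *self);
        /* Interface-specific methods follow. */
        long (*SayHello)(IExample *self);
    };

    static long Example_AddRef(IExample *self)  { return ++self->refCount; }
    static long Example_Release(IExample *self) { return --self->refCount; }

    static long Example_QueryInterface(IExample *self, const void *iid, void **out)
    {
        (void)iid;                   /* a real implementation compares GUIDs here */
        *out = self;
        Example_AddRef(self);
        return 0;                    /* S_OK */
    }

    static long Example_SayHello(IExample *self)
    {
        (void)self;
        puts("Hello from a COM-style object in plain C");
        return 0;
    }

    static const IExampleVtbl exampleVtbl = {
        Example_QueryInterface, Example_AddRef, Example_Release, Example_SayHello
    };

    int main(void)
    {
        IExample obj = { &exampleVtbl, 1 };
        /* Callers only ever see the interface pointer and call through the vtable,
           which is exactly the layout C++ generates behind the scenes. */
        obj.lpVtbl->SayHello(&obj);
        obj.lpVtbl->Release(&obj);
        return 0;
    }

Release here doesn't free anything because the object lives on the stack; a real component frees itself when the count hits zero, compares actual IIDs in QueryInterface and gets registered with the system, which is roughly the territory the rest of the series covers.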

The Final Hours of Half-Life: Alyx

I've never been particularly interested in playing Half-Life games, having the hand-eye coordination of a partly paralyzed goldfish as I do and never liking first person shooters or horror. But Half-Life as a series has left an indelible impression on the gaming industry, as has the Steam distribution platform (borne out of horrible experiences with physical publishing gatekeepers) and the Valve company that created both of them.

After a stretch of a few years during which the "episodic" nature of the second Half-Life game came to a halt on a climactic cliffhanger, any development on any Half-Life title has been shrouded in mystical secrecy befitting a Salinger-esque recluse; the original Half-Life and the last episode of Half-Life 2 were released within a 9-year window, and were followed by 12 years of silence.

The Final Hours of Half-Life: Alyx is a compendious monolith of behind-the-scenes reporting from independent games journalist Geoff Keighley, who previously produced similar articles for the original Half-Life, Half-Life 2 and Portal 2. Not only does it tell the story of Half-Life: Alyx, the VR-only prequel finally announced at the end of 2019, but it also reveals the series of false starts, development stalls and projects out of step with their moment that occupied the enigmatic dozen-year sabbatical.

Valve is an incredibly successful company with vast resources, and its employees are driven, capable and afforded legendary levels of freedom, but in the end, game development remains a creative and human endeavor, and is susceptible to downturns, lulls or, as one employee dubs it, wandering off into a collectively shared wilderness. Stories like these are all too common, but in a "10x engineer" world obsessed with venture capitalists roving around for "rockstar developers", they are too seldom told.

(If you think you might be interested in playing Half-Life: Alyx at some point, stay far away – there's no way of reading this without being spoiled to high hell. Also, be prepared for a several-GB download on a reasonably recent Windows PC just to read an article.)

Eric Lippert: Life, part 21

The aforementioned series continues, and today's episode includes notes from an email discussion (centered on basically manual bit-twiddling Huffman coding) involving several people whose algorithms Eric implemented over the past few weeks, after the author of one of them, David Stafford, got in touch with Eric about it.

These things happening is one of the reasons why the Internet is great.

Melissa Hillman on pronouncing "ask" "ax"

The pronunciation "ax" predates "ask" in the English language & has been in continuous usage for 1200 years. When English was formalized, upper class white English usage became "correct."

Ars Technica: We traced Namco’s “new” Pac-Man demake to its source: A 2008 fan ROMhack

Last month, Bandai Namco announced a special bonus for Switch players who invested in the new Namco Museum Archives Vol. 1. In addition to 10 emulated Namco classics, the game's official Nintendo store page notes it includes "a newly created 8-bit demastered version of Pac-Man Championship Edition" (emphasis added) [original emphasis maintained].

[..] As it turns out, though, the "newly created" part of the game's promotion isn't quite accurate. Bandai Namco has confirmed to Ars Technica that its much-lauded Championship Edition demake is actually based directly on an obscure NES/Famicom ROMhack created over a decade ago by a Japanese fan going by the handle Coke774.

"Throughout Pac-Man's 40-year history, he has inspired countless fans to take on game development as either a hobby or a career," the development team said in a statement provided to Ars Technica. "In the case of Coke774, his work was highly appreciated by our team and we worked with him officially to implement his design into our game."

LinkedIn Says iOS App Reading Clipboard With Every Keystroke is a Bug, Fix Coming

I would never use LinkedIn in the first place, and can't understand people who would, given their legendary, genre-defining platform-owner selfishness at the cost of the personal privacy and discretion of the people they claim to serve. I don't doubt that this is a bug, but it wouldn't surprise me if it weren't a bug, either.

Michael Flarup: The Comeback of Fun in Visual Design

It’s finally here. The thing I have been advocating for through my work, writing, videos and talks for years. A swing of the pendulum. A reemergence of fun in visual design. I have been waiting 7 years to write this.

InfoQ: Disabling Google 2FA doesn't need 2FA

[..] this attack was facilitated by the fact that the attackers were able to turn off 2 factor authentication on Google's password.google.com without needing to confirm by the 2 factor authentication mechanism, which defeats the point of enabling 2 factor authentication.

Whoops.

Wired: "Nuclear ‘Power Balls’ May Make Meltdowns a Thing of the Past"

Triso—short for “tristructural isotropic”—fuel is made from a mixture of low enriched uranium and oxygen, and it is surrounded by three alternating layers of graphite and a ceramic called silicon carbide. Each particle is smaller than a poppy seed, but its layered shell can protect the uranium inside from melting under even the most extreme conditions that could occur in a reactor.

[..] Most nuclear reactors today operate well below 1,000 degrees Fahrenheit, and even the next generation high-temperature reactors will top out at about 2,000 degrees. But during the INL tests, Demkowicz demonstrated that triso could withstand reactor temperatures over 3,200 degrees Fahrenheit. Out of 300,000 particles, not a single triso coating failed during the two-week long test.

Between triso, thorium and Bill Gates' vaunted "TerraPower" — where you recondition spent rods into a big heap that you essentially light on fire, lock up and walk away from for steady production over several decades — nuclear technology seems to exist that is able to manage the enormous downsides associated with its operating thesis.

No one is taking the inadequate first few generations of solar panels as indicators of what current photovoltaic technology is capable of; it's downright reckless to do so with all nuclear technology, considering the changes we're all going to have to make in the next few years.

List of Common Professions, Were They Labelled by The Same People Who Think "Content Creator" is a Fine, Dandy and Altogether Rather Unproblematic Descriptor

  • Protein-consumable assembler
  • Travel-time transitory hibernation domicile vendor
  • Junior organic pulp adherer-and-joiner assistant (in-training)
  • Hydraulic detritus disperser conveyance mechanism maintainer
  • Microbiome targeted development and exploitation specialist

˙ɹǝʞɐq 'ɹǝqɯnןd 'ǝɔıʇuǝɹddɐ ɹǝʇuǝdɹɐɔ 'ʇsıuoıʇdǝɔǝɹ ןǝʇoɥ 'ʞooƆ

Signal v. Noise: The evolution of Hey

Interesting story behind the shaping of Hey's design and general approach – it started by taking the "gnarliest threads" and building around that. Email is relatively painless for the simple cases, which everyone else is obsessed with adding trivial sparkles to.

Smithsonian Magazine: The Neuroscientist Who Discovered He Was a Psychopath

Knowing that it belonged to a member of his family, Fallon checked his lab’s PET machine for an error (it was working perfectly fine) and then decided he simply had to break the blinding that prevented him from knowing whose brain was pictured. When he looked up the code, he was greeted by an unsettling revelation: the psychopathic brain pictured in the scan was his own.

Speedrunner Link

Filed under the beauty of the Internet:

TikTok tracks everything, obfuscated out the wazoo

bangorlol on Reddit:

I reverse-engineered the app, and feel confident in stating that I have a very strong understanding for how the app operates (or at least operated as of a few months ago).

TikTok is a data collection service that is thinly-veiled as a social network. If there is an API to get information on you, your contacts, or your device... well, they're using it.

[..]

Here's the thing though.. they don't want you to know how much information they're collecting on you, and the security implications of all of that data in one place, en masse, are fucking huge. They encrypt all of the analytics requests with an algorithm that changes with every update (at the very least the keys change) just so you can't see what they're doing. They also made it so you cannot use the app at all if you block communication to their analytics host off at the DNS-level.

For what it's worth I've reversed the Instagram, Facebook, Reddit, and Twitter apps. They don't collect anywhere near the same amount of data that TikTok does, and they sure as hell aren't outright trying to hide exactly whats being sent like TikTok is.

Paints a fuller picture than "it helps itself to your clipboard", doesn't it?

On Tools and Developers

During the (excellent) WWDC episode of The Talk Show, there was this:

John Gruber: I've seen in, day one, there has been some - to me - misreading the message, but some coverage along the lines of: "Apple is moving the Mac to its own silicon to further lock in [insert either developers or users or both users and developers]", that this is to increase lock-in. And I just have to ask... I don't see it, because I've seen these announcements and I don't see where that's coming from in terms of any aspect that was announced.

Craig Federighi [SVP, Software Engineering, Apple]: I think those guys are being total tools, honestly.

This comes in the middle of a segment where both Craig and Greg "Joz" Joswiak (VP, Product Marketing) are talking about the lack of respect they get from showing that their focus is on the Mac, and that they want the Mac to continue existing, not be subsumed by or replaced by or absorbed into the iPad and its OS, and so on and so forth.

To begin with: I agree with them to a point. As far as I've been able to tell, you will not be allowed to do less on macOS Big Sur running on Apple Silicon than on Intel processors. The limitations that are there have excellent technical motivations: it is hard to straight up virtualize a different processor architecture, and they have still provided good support for everything up to that point in the form of automatic binary translation of programs compiled for Intel.

I also empathize with the pressure on a personal level. Although they are well-compensated for it and should be accountable for the decisions they make, there's no need for incivility or personal attacks. Luckily, I do not have to stretch far to enumerate a number of reasons why this is the feedback Craig and his team get.

Apple's modus operandi is to find big changes, make big bets, and go from the current status quo to where they want to be over a series of small, incremental changes. Once in a blue moon, a big change is needed, or a new component or technology or device or market needs to be introduced, and this is seen by some, especially outside experts, as the core of Apple, to the point where if they don't regularly do it, they have "lost the innovative spark" and you would be a complete fool not to immediately short AAPL. But the core is the silent trudge, the long term mixed with the incremental.

Apple is judged by their actions, by their behavior and by their history, and in the absence of roadmaps and rationalizations, and in the recurring presence of re-contextualizations as new changes happen, the guessing game is the result. Every change turns into a proposed Chekhov's gun.

  • When Sandboxing is introduced, the logical conclusion is that at some point it will eventually be required for everything. Since Sandboxing is inconsistent, flaky and insufficient, and since many of the current applications many people depend on could not survive Sandboxing, this creeps me out. The conjecture is unproven - Sandboxing is not required for everything, but we are also not at the end of history. Office is now sandboxed, but also has loads of handcrafted exceptions. Sandboxing wasn't built to scale to accommodate all current app behavior, not even that of just the legitimate apps; it was built to force developers into a box. For increased security, yes. At the cost of tremendous inconvenience and missing features for the apps that adopt it and their users: also yes.

  • When the App Store is introduced on Mac, with the bluster that all developers should find it manna from heaven because it unties knots many developers had never even run into over several years, the logical conclusion is that at some point everything will be required to be in the App Store. This theory has gone back and forth, due to the introduction of Developer ID and Gatekeeper, and then the subsequent move towards making it harder to allow non-Developer ID apps.

    I stopped writing Mac apps largely because of this. I want to maintain a pseudonym, and Apple's assumption that it means I can't be trusted to make well-behaved software offends me personally. My software was used inside Apple, and I got bug reports when it broke on in-development OSes. The last updates I released, I signed with my own signature chain - I have no qualms with the security aspects, or with cryptographic signatures or even with blocking them after the fact.

  • When 32-bit Intel application support was dropped, it meant people couldn't run some applications any longer. There's now excellent justification to believe this happened to lessen the burden of the already capable Rosetta 2. But it still means applications people paid for, learned, loved, were productive with, possibly got a Mac for, stopped being usable. This is the opposite of user-friendly, and there's not even a security angle.

  • When Mac hardware consistently lost ports and user-accessible/user-serviceable parts over the years, they hardly ever came back. Additionally, some hardware was left to rot, and had a tendency to return at a steeper price. There are exceptions – worth noting precisely because they are exceptions.

    Many developers could use Mac Pros in the cheesegrater days, because it was recognized that you didn't have to be a film maker with a studio budget for modularity and customizability to be useful. And yet, even when the Mac Pro reverted from the "trash can" form factor to a full-on accessible desktop workstation setup as it previously had been, the price was hiked significantly and the product was completely repositioned, in the same breath as Apple declared developers one of their most populous "Pro" groups; indeed, the announcement was made to a room full of developers. There's a wide array of displays, and if people don't need an HDR display, they don't need to buy the Pro Display XDR, but making an entire Mac model inaccessible is a different animal.

If you meet a person and they act a certain way, over time you learn to recognize that pattern in them. If you develop for Apple platforms and every year is a series of new inconveniences to manage just as much as it is new technology to consider adopting, you learn to assume a negative progression in convenience, utility and freedom, just as much as you have hopes for the advances in frameworks and hardware.

The "tools" Craig's talking about have all seen the beginning of, effectively, the closing of the Mac as a platform. We know that Apple doesn't like to dwell on the bets they make, and we know that Apple doesn't usually back out of things. We're waiting anxiously for the moment where the hammer drops. That means we assume that sweeping transitions will bring those changes. The small ratcheting moves have often happened without being announced, or by being announced with individual bullet points in presentations during the week of WWDC (the interview was recorded on Tuesday, by which point not even half of the presentations were available).

This bed is of Apple's own making. By never copping to imperfection, by never really listening to and answering the detractors who are in its own camp, by avoiding humility and the taking of other perspectives than its own, the only method of communication left is loud and clear dissent.

Satisfaction Dissatisfaction

You might think that for a person with so many opinions, I would love rating interactions and customer satisfaction. The truth is that I get stressed out at the prospect most times.

Most rating systems are used to form a single number, a single indicator saying "are we on track, are we doing a good job?" and there's nothing wrong with that, except that my data point will be "is representative X doing a good job?". Most of the time, people aren't rude and they aren't unreasonably applying the script or process in front of them, and there's no reason for me to give them a low rating for that. Almost all the time, I do have significant qualms with the process itself in one way or another, but there's no way for me to express that aside from putting representative X in hot water for "being so bad with customers", or nudging them down some well-meaning but counter-productive ladder of incentives.

For any company that really cares about their customers' actual satisfaction and their process, whatever it is, the question should be split into at least two questions: Was our representative helpful, supportive and informative during the execution of this process? and Are you satisfied with the process?

Phoronix: Perl 7 Announced As Evolving Perl 5 With Modern Defaults

This truly is the weirdest timeline.

WWDC 2020: iOS 14

Reflections:

  • Unusually light on features.
  • There are crash-on-use incompatibilities in the release notes, but so far this is notably solid for a first beta, maybe even iOS 12-level. Probably related to the previous point.
  • Moving most stuff I use only some of the time to a second home screen, and turning off the third-to-nth home screens knowing the apps are still all in the App Library, is a very good feeling of freedom.
  • The only widget I use so far is the weather widget, which when set to Current Location occasionally teleports back to Cupertino (homesick?) following which it refuses to reflect the actual location unless you change to another location and back. This sounds like a consequence of the way widgets are pre-baked, UI-wise.
  • Update: some widget-related manipulation, and the automatic moving-aside of apps and folders that goes with it, ended up straight up removing two folders and two apps, which had to be re-added from the App Library. (One could think that they would have slipped onto one of the since-hidden app pages/home screens, but that doesn't seem to be the case.) These folders had been present in their current configuration for several years, iOS versions and indeed devices.
  • Finding an app in the App Library is easy enough, but adding it back to the home screen is inconsistent. As far as I can tell, you have to find it within one of the folder-like blobs in the App Library (as one of the featured three, or by expanding the fourth item) and drag it from there. You can pull up the list/search results from the App Library and drag the icon, but not the full row. And if you pull up the regular home screen/global search, you can drag neither the icon nor the row. In none of these four cases does long pressing bring up a context menu.
  • Emoji search is a winner, although the always-there search field adds some height, which is robbed from the app itself.

WWDC 2020: Apple Silicon

Reflections:

  • First of all: it is going to be incredibly interesting to see where the architecture can go without the thickness of an iPad being a constraint. Shipping an existing iPad chip is basically as strong of a statement as they can make that they're not showing their cards yet.
  • Making a big deal of virtualization still being there is necessary, but the way it was presented totally gave the (false) impression that virtualizing Intel from Apple Silicon was possible. To the "UNIX-and-docker-using developers" that were mentioned, there's a hell of a difference between being able to use virtualization and containers for x86-64 vs for ARM, since the dry technical capability is intact but you miss out on the entire ecosystem of x86-64 containers and operating systems, which is most of the point. Were they trying to go through the keynote without using the word "ARM"?
  • Considering that the transition will be about six months old as the first hardware is being shipped, I'm guessing the two year length of the transition will be necessary to develop hardware, architecture or OSes for the Mac Pro end of the spectrum.
  • The unified memory architecture between the CPU and GPU is being touted and underlined as "modern" – I'm wondering where this leaves GPU support, even external. To a degree, even the Afterburner card seems dated by this framing, but maybe it'll be baked into the Mac Pro equivalent to begin with.
  • If the unified memory architecture is such a big deal – will any Mac even be able to have user-installable memory after the fact? The term of art used during the presentation was SoC – System-on-Chip – and not CPU and GPU, and for them, at least in the current form factors, all direct RAM usable by the processing units is hooked up inside the package. I guess they can carve out an iMac Pro/Mac Pro-sized exception to allow plebeian DIMM modules in addition to the on-chip RAM too. This session may contain answers.
  • All demos were seemingly made on Pro Display XDRs, which use Thunderbolt 3 only for video signaling and not USB-C – if I'm not mistaken it runs two parallel DisplayPort streams to be able to fill up the display, leaving only enough pins and capability for USB 2.0 on its USB-C hub. But the tech specs of the Developer Transition Kit list only USB-C, USB-A, Gigabit Ethernet, and a single HDMI 2.0 port. Did they all use some screwball converter to the HDMI port, which would have to be enlightened to their peculiar multiplexed Thunderbolt-DisplayPort connection?
  • Update: Apple will continue to support Thunderbolt, with strong references to Apple and Intel having co-invented it, maybe to distance themselves from AMD, which has had a famously hard time getting support. The statement doesn't include a reference to a version number, so it could be read as Thunderbolt 3 being supported by way of its inclusion in USB4.
  • OpenGL support will be present-but-deprecated from the start, which essentially means the full OpenGL stack (beyond OpenGL ES) is available.
  • The original PowerPC-to-Intel Rosetta was barely able to run an Office+Photoshop demo convincingly and was labelled "fast (enough)" in the slides; with Rosetta 2, we saw recent-ish AAA games and 3D modeling software being labelled as "great" and "fluid" and "without hitches". Doing it up front surely helps, but they've raised the bar of expectations by a lot this time. If they're launching as soon as the end of the year, they'll have to deliver.
  • Being able to use XPC to support Intel and ARM plugins separately is inspired. I do wonder how many applications in the target audience allow for such a platform-specific architecture though.
  • Depending on how things shake out, especially with the desktops, this could be the end of the Mac being a "PC"-family architecture. The screws will be put to anything new that has to be brought along, and many things are carried on only reluctantly and/or temporarily. The explicit mention of still being relevant to multiple-volume-multiboot-OS-external-drive-UNIX mavens is a strong signal they don't intend to go all the way, but whether it'll be enough for people who need something that's PC-like in its structure is anyone's guess.
  • No word on whether apps from unidentified developers will still be allowed. (Update: allowed, but notarization is required.) The UNIX mention is interesting, because cross-platform command line tools can't really be expected to be fully packaged as macOS-enlightened, including notarization.
  • Running iOS/iPad apps seems like a gimme and recontextualizes wanting to make Catalyst so badly, but also seems like even more of a half-solution without a touch screen, which does not seem likely without API to enable it, which would have been announced by now. Then again, iPadOS pointer support sprang up virtually overnight.
  • Having Office, Photoshop and Unity ported enough to be running from day one is far from "let's fly out the guy from Mathematica the week before". The Intel transition, even though the technical foundations had been laid for years, was famously close-hold; I wonder how long this has been cooking?
  • Having prepared ports of open source components is also a sign of the times.

WWDC 2020: macOS Big Sur

Reflections:

  • The new UI style is not entirely "flat gone mad" – it allows for depth and shadows and materials. Look at the speech bubble in the Messages icon, the envelope in the Mail icon or the pencil in the Pages icon. Many instances look a bit over-the-top-for-the-sake-of-the-effect, though.
  • I am not a big fan of the continued slaughter of available-space-for-the-actual-title in the title bar, or similarly of cleanly draggable areas.
  • The frontmost/active window needs to have a much more prominent title bar. Just going by the traffic lights isn't going to cut it.
  • Going from poorly-delineated buttons to borderless buttons isn't a good idea when the button icons are just outlined shapes that only gain a button shape on hover. Being in the unified toolbar-and-titlebar is a sort of cue, but not as strong as just having a graphically richer icon to begin with.
  • Dear god, the just barely opaque menu bar is back, and it's just as horribly unreadable as a few years ago. Do we really need to keep doing this?
  • Control Center with modules that can be dragged into the menu bar – this I actually like. Coherent, rich presentation that is customizable, and where the customizability plays to the strengths and structure of macOS.
  • Catalyst better have grown some strengths, because the macOS Developer app released only a few days ago is a complete UI shit show that still feels neither like a Mac app nor an iOS/iPad app.
  • With all respect for the design upheaval and the technology changes – this is incredibly light for any macOS update, and choosing this version to round up to version 11 feels odd. The Intel transition didn't even get its own major version marketing-wise. Maybe it was chosen for semantic reasons?

Brent Simmons: The iOS App Store Brings Users Only Because It’s the Only Choice

This is a misconception that many people have — they think the App Store brings some kind of exceptional distribution and marketing that developers wouldn’t have on their own.

It’s just not true. It lacks even a grain of truth.

[..]

Build it (and upload it to the App Store) and they will not come.

Instead, you have to do marketing on your own, on the web and on social media, outside of the App Store. Just like always. The App Store brings nothing to the table.
