Take

Programming difficulty texture

As I wrote the previous entry, I realized that many programming languages have their own texture of difficulty.

C is difficult because it is an unsafe veneer over assembly, because it's so easy to fall off the edge and because almost nothing is included. That's how it had to be in the 70s if you wanted to write an operating system, but today it's still easy to make dangerous mistakes that compilers won't warn you about, and to end up reading uninitialized memory. You're free to express a whole lot of things that just won't be compatible with reality. At every juncture you have to know exactly what you intend to do, and intimate details about what the code calling you and the code you call expect to do with values and memory, and tooling to enforce these things is a rare and hard-won commodity.

C++ is difficult, and I have always shied away from it more than from any other language, because of the mix of ambition and tricks to keep things low-cost. Between the Rule of the Big Four (and a half), rvalue references, move semantics and templates, it's the most immensely complicated programming language in the world, and despite all this extra machinery, much of which is invoked implicitly, if you slip up, you crash really badly.

Rust is difficult because it has so many new concepts, and because it is so precise and exacting. As far as I can tell, it moves all of this complexity to the compilation stage, so you end up with things not building instead, although some things can still be surprises.

JavaScript is difficult because the standard library is straight up pants-on-head stupid. Never mind the language; the Date object alone, with its "add 1900 to the year", its "months start at 0 but days start at 1" and its incredibly imprecisely defined functions that have only recently been supplemented with sane alternatives, is a good demonstrative microcosm. Not to mention that its relative poverty of functionality, combined with having grown up in an unsophisticated environment, has led to a culture of many small ad-hoc polyfills.
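To make the microcosm concrete, here is a quick tour of those documented quirks (run it in a browser console or Node):

    // All standard, long-specified Date behavior; none of this is a bug.
    const d = new Date(2020, 0, 1); // months are 0-based: this is January 1, 2020
    d.getMonth();    // 0    -- January
    d.getDate();     // 1    -- days of the month are 1-based, unlike months
    d.getYear();     // 120  -- the deprecated "year minus 1900"
    d.getFullYear(); // 2020 -- the later, sane alternative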

PHP is difficult for the very many reasons this is still true. For heaven's sake, I read up on PHP 7 today, and I knew it was created because PHP 6 didn't get anywhere, but I hadn't realized until now that it meant they gave up on having an actual Unicode-compatible string data type.

Rust: Five Years of Rust

Rust is impressive for many reasons, not least because it was an ambitious long-term bet. It was researched for years, with the explicit aim of ruling out the safety issues Mozilla ran into all the time when maintaining Gecko, and it brought a new focus on combining safety with zero-cost abstractions that do not require runtime support. Not many other languages have a known and commonly targeted subset of functionality that does not require run-time allocation. With Mozilla's previous experience of writing Gecko in the first place to replace the Netscape 4 rendering engine (a famously drawn-out process), they wanted something that could support them in gradually renovating their browser subsystems, and so far the results have been encouraging.

I have looked into it, and so far the programming model has been very cumbersome for me. It requires you to think in new ways and to specify the world, and this is a common refrain even from people much less dense than me. I'm not sure what could be done to help the ergonomics, but from what I understand usability improvements have already been made.

Cliff L. Biffle's Rewriting m4vgalib in Rust is a great capsule of the promise of Rust: prove that what you're doing is safe by mechanism and construction, and thereby not only avoid bugs but also minimize the slowdown from unnecessary synchronization and the like. (Quite the inversion of terms compared to the more popular approach of wrapping things in smart wrappers, and one I hope is the wave of the future.)

Joe Biden's "address on Civil Unrest Facing Communities Across America"

Biden's a documented gaffemaster, but he has survived to this point because he is a competent politician. He's not the one I'd choose of the hundreds of millions of qualified possible applicants, but he's got basic decency, a working mind, a reasonable grasp of his own limitations and the ability to prioritize the future before the present, the bigger picture before the attention-grabbing flashpoint detail and the greater good before his own personal interest. These are not usually qualities that distinguish one presumptive major party candidate from the other, nor is the cost of forgoing them usually highlighted so clearly with each passing day.

(The assertion of basic decency is contingent on the allegations being made against Biden not holding water. If they do turn out to hold water, there are many similarly (and some even more) qualified candidates to take his place.)

Lynnell Mickelsen: We don't have a protest problem. We have a policing problem.

I didn't realize the full background and consequences of what's happening in Minneapolis until I read this. Coming from a country with a mostly competent and non-corrupt police force, the sheer audacity of what's going on in the US seems stranger and more flagrant every day.

It's not about skin color

The common refrain from the proponents of the wrong side of history is: "well, okay, but why make this about your skin color? Shouldn't – ahem – all lives matter?"

It's not about skin color for the people who were shot, assaulted, demeaned or mistreated. For them it's the opening of wounds that can never be healed, sorrows that can never be forgotten or the taking of lives, senselessly, in vain.

It's about skin color for the people who were doing the assaulting. It's about dehumanization, relegation to second-class status, willful ignorance of all the individual properties that make up a person. It's about the overwhelming, overbearing legacy of systemic and systematic mistreatment reaching back centuries, and the scar it left.

Whether someone is indoctrinated, deeply mistaken (perhaps lured by formal fairness, or born of corrupted shame) or willfully negative, their position is ultimately one of choice, and can be reversed by humility, thought and compassion. No one is born with hatred in their heart, and no one need hold onto it to their death.

Raspberry Pi 4 with 8 GB RAM announced

I've been waiting for this ever since I caught the mention in the leaflet (as shown). Within years, many legitimate demands for full-on servers and desktops will be met by single-board computers like the Pi. The main obstacles now are the lack of properly hardware-accelerated graphics (work on which has started), widespread 64-bit compatibility across the ecosystem, and the move to Wayland.

Measured in computing power per dollar, ARM and Raspberry Pi-like computers have us living in an embarrassment of riches. If it weren't for the above reasons, and for SD cards being flakier and more prone to failure than other storage systems, there would be few reasons to build a NUC instead of a Pi unless you absolutely had to run games, Windows or video editing on it.

Steve Jobs on Consulting

Worth a listen. His main point is that if you're a consultant you get to see, touch and own only very little, with no agency.

As some people have noticed, this is not necessarily still true; I've seen big companies use consultants to do what's necessary but impolitic, or throw impossible projects at firms that are in too deep to do anything but attempt to deliver, or have them act as a wax-paper-thin "source of recommendations" when the recommendations are more or less dictated up front and laundered. Some consultancies represent what's worst about business, but even good ones can enable dysfunctional businesses to misbehave even further.

Jordan Rose: ROSE-8 on Mac OS 9

From a former Swift compiler engineer: a working game written in Swift for Mac OS 9, with the help of many bespoke parts, including a Swift runtime largely written in Swift. It follows on from getting a Swift program to run on OS 9 largely via C mappings.

(See also: UEFI Snake in C#, which falls somewhere in-between.)

Weave: part three

Two or three years ago, you couldn't swing a social media expert without hitting someone who believed in so-called "gamification". The core observation behind gamification is that when people are playing computer games, they become focused, driven and engrossed. They have a goal and are working towards it, the process is stimulating, and if they encounter resistance, they keep trying again until they can solve the problem.

The traditional definition of work that emerged after literacy and industrialism has been obsessed with efficiency, similarity and fungibility. A lot of similar things need to get done, and people need to have the discipline to buckle down and do them. This isn't "fun"; it's hard and grueling and repetitive, but the value of work comes from having put in the effort.

Viewed from this angle, it's easy to see why multi-billionaires tell people to live at the office and not go to the bathroom. Ostensibly, if grit is the singular name of the game, advancement and success come down to doing more of it. But by that logic, something that sounds like play should be mind-bogglingly unpopular, so how to explain the relative success of gamification?

Grit as a model of careerism works only insofar as everyone is willing to play Bloomberg's game. Among the several billion people on this planet, there's more than one set of priorities and more than one type of person. Aside from the people who just plain don't want to advance into the stratosphere, there are also the innumerable who take to "presenteeism". The word commonly refers to people who go to work when they would otherwise be home sick or caring for their parents or kids, but it also covers people who are coasting and "going through the motions": feigning work, or taking pains to act as a compliant cog in the organizational organism, without actually having to put in the effort to get stuff done.

I don't know whether adherents of gamification believe that it's supposed to shock the coasters into being productive. I don't know whether they have picked up on coasting at all, or whether they are just trying to use gamification as a newer, better-smelling version of Taylorism, a system wherein every minor step of manual labor in a factory was optimized to within fractions of a second.

If you're not careful, gamification tends to reinforce the constant striving towards "KPI" (Key Performance Indicator) maximization, which often maximizes the salary of an individual or the budget of a department at the cost of the quality of the work, or of the outcome the business exists to produce. In this way, corporate business and especially professional/commoditized gaming share an approach. The important thing isn't what you do, but that the numbers keep increasing, be they levels or EXP, or the number of veeblefetzers manufactured.

The perverse conclusion is that some of the old-world analysis was right. Some work does involve putting in grueling and uncomfortable effort in order to reach a good outcome, and that's the work of realizing that most people in the workforce today are dealing with problems that require creativity to solve and analytical thinking to reason through. The more cookie-cutter and danger-minimizing the approach to these problems, the worse they get solved and the less fulfilled people are, until failure, insufficiency and malicious compliance aren't so much an unfortunate result as a bog for well-intentioned people to tentatively survive in.

You have to trust people to do the right thing, you have to give them the opportunity to do the right thing, and you have to not have coasters on board who will take this freedom and squander it instead of doing their job better. And whether they know it or not, the school of management that takes grit for granted as the way to salvation is made up of the coasters of its own organizational level.

To pull an Amazon and reinstitute Taylorism isn't just dehumanizing, it's intellectually lazy; a brute-force application of a discredited method. And to run your business constrained by the ways you worry the laziest employees will abuse the system, instead of by the ways all of your employees could be empowered to do a better job, is just destructive cowardice.

Weave: part two

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.

Jamie Zawinski

Google Chrome is a web browser released after Google had already launched Gmail, and so it doesn't particularly need to read email. But its Chromium core is still one half of the Electron engine (together with Node), powering many applications today.

The gist of the previously linked Software disenchantment is that most of the architecture assembled to cumulatively support the sum of 30 years of web pages, including an Xbox 360 controller driver, is present and available and accounted for in the low-effort approach to desktop application development.

When Chrome runs several processes, it's for reasons of privacy, security, stability and decoupling; the sum of complex architectural deliberation, weighing backwards compatibility and power consumption. For better or for worse, this is what a browser is forced to do to browse the web these days. When Electron does the same, it's because Chromium was cribbed and duct-taped together with Node, in a feat that is truly technologically impressive, in the same way a chicken still running around with its head cut off is zoologically impressive.

I'm getting mean, but the point isn't to piss on the arduous and precise work necessary to make Electron work, done by people who want to make it easier to make desktop applications. The point is that what you end up with is gross, from top to bottom.

Here's Ryan Dahl, creator of Node.js, explaining the things he got wrong; he has since started the just-released Deno, which takes a different approach.

Chromium isn't itself bad, but see above. Swatting flies by poking them with nuclear warheads gets even more dangerous once you realize that you're asking the same hand that would have held the fly-swatter to hold the warhead instead.

And the act of putting together desktop applications based on web UI is also bad. Maybe it's fine on ChromeOS, but an application is one point in a landscape. Every possible way that an OS could be of service now depends on being piped through two layers of abstraction, maintained by two different development teams. Not to mention the cognitive load of constantly deciphering new models of interaction. For web sites, that's okay – you're there to read something, and maybe add a reply every now and then, or maybe play a game where the game itself is the interaction mechanism. But anything put through Electron aspires not to be a publication but an application.

It's not impossible to create good UIs in a web setting, but it sure as hell takes effort and time and practice. It takes constant vigilance to keep them up to date and well adjusted in the face of every device that can access them, and most of the frameworks that are there to help instead add more heft and complexity for the developer.

Many web-based apps deal with tables/grids in one way or another. How many of them have rearrangeable columns, resizable columns, sortable columns? In how many of them can you select several rows and use the copy shortcut to copy the text? Type the first few letters to jump the selection to the first row that starts with them? None of these things are there from the beginning. All of them have to be added. It's not that it's impossible to do these things; it's that it takes effort for the developer, and it erodes familiarity for users, who can't get their work done faster and can't build an intuition about how things work and what things are capable of.
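For a taste, here's roughly what just the type-to-jump behavior costs to hand-roll in a web grid. A sketch only: a real implementation also needs selection state, focus handling and accessibility work, and it assumes the page has a table that has been made focusable.

    // Jump to the first row whose text starts with the letters typed so far.
    const table = document.querySelector("table"); // assumed: a focusable grid
    let buffer = "";
    let lastKeyTime = 0;
    table.addEventListener("keydown", (event) => {
      if (event.key.length !== 1) return; // ignore arrows, modifiers, etc.
      const now = Date.now();
      buffer = now - lastKeyTime < 1000 ? buffer + event.key : event.key;
      lastKeyTime = now;
      const rows = Array.from(table.querySelectorAll("tbody tr"));
      const hit = rows.find((row) =>
        row.textContent.trim().toLowerCase().startsWith(buffer.toLowerCase()));
      if (hit) hit.scrollIntoView({ block: "nearest" });
    });

Native list controls give all of this away for free, which is exactly the familiarity that evaporates on the web.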

Of course Electron can be a good tool for someone with few resources or without extensive knowledge of desktop APIs. You've gotta start somewhere. But it's become the go-to tool even for large companies, under the debatable pretense that it solves more problems than it creates. And by redefining UI and usability to also include experience (which was necessary), we have opened the door to shifting the "experience" away from "did this work well for me?" and towards "was this UI's production value high enough to impress?". What gets thrown away? Familiarity, respect for the user's computer and battery life, respect for the user's desire to master a tool so that they can capably get work done.

Polarization is the enemy of reasoning. I'm not the first to rail against Electron or web-based UIs. But the answer isn't always to switch, tomorrow, to the lowest native stack available on all platforms, and code as if you're using your grandfather's Pentium Pro. It's to be aware of what you're doing, to understand what you're buying into and what you're giving up; to respect your users and respect yourself by doing the best job you can for them. Be informed, be deliberate; be proud of the work you do, the bugs you fix, the complexities you manage and master. Be willing to improve.

Weave: part one

Ever since I read Nikita Prokopov's Software disenchantment, I've been going down the rabbit hole. The article is about software quality going to crap, and how often it is linked with expediency.

It contains a mention of Jonathan Blow's language Jai, which made me look it up, and whose history and philosophy I gradually soaked up after watching a number of his archived streams on YouTube which are largely either progress reports or him making a change live. (The name, as far as I can tell, is never explained nor even said on camera, which is a rare feat given the screen time.)

The gist of this philosophy is basically: "speaking as a game developer with Braid and The Witness to my name, to whom details are important both in that they make my game what it is and in that they dictate whether given the previous details I will even be allowed to hit 60 fps, I am thoroughly disappointed with today's programming landscape, most languages and most current dogma on architecture and abstraction, so therefore I'm making my own programming language from scratch so I can make it do the things I think it should be able to do".

Jai is enthralling because of these perspectives. It has metaprogramming-that-is-also-compile-time-code-generation-which-feeds-into-runtime-introspection on a level that is rare for how close the language gets to the metal. Generics are turned into "polymorphic functions" which are fully mapped out and realized at compile time from actual use. Each top-level declaration or function is parsed and type-checked separately and fed into a "meta-program", with which you can choose to produce new code or alter the just-parsed code. There's no such thing as classes, and indeed no constructors or methods - but basic levels of "inheritance" are possible through struct containment and the keyword using, which amounts to syntactic sugar for going up the containment chain when a lookup fails - with the compiler performing the lookups, instead of going through virtual method tables.

The reason for all of these things is that they allow flexibility in the code while still not prescribing an implementation. You're supposed to make your own version of most things, because in Jonathan's world that's basically how it is anyway – the memory layout of the list of entities affects performance, so why wouldn't you roll your own, or at least want to be able to? Earlier versions contained support for turning an "array of structs" into a "struct of arrays" – i.e. being able to pack the base metadata of all entities in a denser form, and so on – while still using the same syntax.
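Jai's actual syntax aside, the layout transformation being automated can be sketched in JavaScript terms (the names are mine, for illustration):

    // "Array of structs": each entity is its own object, its fields
    // scattered around the heap.
    const aos = [
      { x: 0, y: 0, health: 100 },
      { x: 5, y: 2, health: 80 },
    ];

    // "Struct of arrays": each field is packed contiguously – denser, and
    // kinder to caches when a loop only touches one field across all entities.
    const soa = {
      x:      new Float32Array([0, 5]),
      y:      new Float32Array([0, 2]),
      health: new Float32Array([100, 80]),
    };

    // Summing health now walks one tight buffer instead of hopping
    // between objects.
    let total = 0;
    for (let i = 0; i < soa.health.length; i++) {
      total += soa.health[i];
    }

Jai's trick was letting you flip between the two layouts without rewriting every access; in most languages the change ripples through all the code that touches an entity.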

Having always carried the stereotype that people who cared so much about performance never aspired to neater code, or at least never let it bother them that they couldn't have it, I was disturbed at first by all this. Heaven knows how rudimentary code completion or "go-to-definition" type analysis would be implemented for this language, and that's probably why this road isn't often taken. Security experts would run from numerous aspects of the language and its model, and it doesn't attempt to make things easier for people who screw up memory safety in a pointer-oriented language (which at last count is still literally everyone).

There are numerous reasons to "not try this at home" and Jai is explicitly not designed for general use by everyone, but those concerns are not reasons to avoid looking at an unconventional language that throws off, at least temporarily, the yokes of soundness and theory and abstraction in exchange for actual utility and usefulness – and even other kinds of abstraction.

Optimizations and tweaks

The minimal CMS and engine behind Take has gotten a few improvements, along with small tweaks to the site's look.

Particularly important to me, there is now a step which minifies the HTML of all pages. The front page (which for me is slightly heavier since I'm logged in) now weighs in at 40 353 bytes, and contains 31 661 characters of text, for a 78.4% text-to-HTML density.

For comparison:

(Play along at home by running
[document.body.innerText.length, document.body.parentElement.outerHTML.length, (document.body.innerText.length / document.body.parentElement.outerHTML.length) * 100]
in the developer tools console of your browser.

Please be advised that this is in no way a useful representative measure of bloat since it completely disregards the potential freight train of externally included resources, and punishes links, tables and images. For me, now, it's a good measure of the whitespace-removing and HTML-simplifying minification process, and the rest is just in good fun.)

Eric Lippert: Life, part 1

Ever since John Conway passed away, Eric Lippert has had a good series on the search for an efficient Game of Life algorithm. It's still ongoing and has already managed to touch on both SIMD and Michael Abrash.

Michal Malewicz: Neumorphism in user interfaces

I could do without the name, the dogma about precisely which effect is used and the desire to be "fresh", but as a step in the right direction away from both flatness and "Material Design", it is refreshing.

(via: Uninvited Redesign's link to Sangeeta Baishya's neumorphic Spotify redesign, which mostly looks pretty good but partly capsizes in Photoshop layer effects.)

The Miracle Sudoku

You're about to spend the next 25 minutes watching a guy solve a Sudoku.

Not only that, but it's going to be the highlight of your day.

Ben Orlin is not wrong.

I don't have much aptitude for slowly unpicking crosswords or sudokus, but applying constraints and deductive reasoning and progressing towards an answer is a real high, and seeing how Simon solves this is riveting.

Microsoft: Windows Subsystem for Linux BUILD 2020 Summary

WSL was a strange beast in its first incarnation (the userspace bits from various distros interacting with a Linux translation layer in the NT kernel) and is stranger still in WSL2, where it turns into a Microsoft-blended Linux kernel shipped with Windows and maintained through Windows Update, atop which the userspace bits run in an essentially container-like fashion.

Graphical applications are coming too, using a layer where RDP talks to Wayland (luckily, rather than X). And interestingly, DirectX 12 and lots of GPU-related machinery is being ported to run on Linux as part of this effort; largely to enable WSL features, but going further wouldn't be the strangest technical consequence of this project.

I still wouldn't want to rely on this for Linux testing of either command line or desktop applications, but there's an odd symmetry of incendiary grump in that the same solution might provide both a decently modern command line on Windows and a decent desktop experience for Linux applications.

On Building

I used to write a bunch of software for what was then called Mac OS X. (It's a long story, but I'm personally responsible for the little controls you type a keyboard shortcut into sometimes being called "shortcut recorders".) Why did I stop?

I stopped because it was too much. I love the platform, I love the community, I love the sense that it's okay to care about details, and I love the combination of a strong foundation with a willingness to move forward and try new things. That gives you energy to do something, and it gives you ideas.

I've never been good at containing and managing that within a neat and healthy time allotment and a bounded investment of energy and emotion. Together with what my work asks of me, the things I do for my own personal use and the time I need to unwind from those other things, I had to drop it.

I've been reading about the pressures on open source maintainers for the past year or so, and I feel it. You want to put something out there to share, and some people – fine, good people with good intent – can turn it into a commitment that you only fulfill because you maintain an interest in it. If those pressures are reasonable, and if your life allows, you can bear that, but it risks being turned from one of the highlights of your day into a downer, or even both simultaneously.

Making something for yourself, for friends, for people in the same situation you are, for kindred spirits that need to do something that's just too hard or inconvenient or troublesome – it can be an expression of who you are. It can be a balled-up gift of toil and thought and soul. It can make someone's day brighter and easier, and it can make you happier and filled with purpose. And it can also make you heart-achingly vulnerable, like a kid with a sandcastle, staring nervously at the waves, judging the strength of the winds against your skin.

On Sonic Mania and Forgetting

I mentioned having an interest in gaming-related videos, and one of the biggest, strangest and most diverse subcultures is certainly the one around Sonic.

A few years ago something interesting happened. Christian Whitehead, known in the community as The Taxman, came to an agreement with SEGA to revamp and re-release Sonic CD on iOS and Android. He did this using his own Retro Engine, with which he'd been making Sonic fan games for a number of years. This led to remasters of the original Sonic the Hedgehog as well as Sonic 2, and eventually also to him pitching a new game to them. That game became Sonic Mania.

Sonic Mania essentially continued along the trajectory Sonic was on after the sibling installments of Sonic 3 and Sonic & Knuckles, and was better received than any Sonic title in recent memory – all the more impressive considering that SEGA's own attempt at a follow-up, the episodic Sonic 4, foundered, and recalling SEGA's walk in the wilderness with several cancelled projects after Sonic 3.

In both instances, SEGA tried to forget what had brought them there, or to take something new and retrofit it. In the years after Sonic 3, they were caught up in a wave of consoles hastily making the transition to 3D spaces using still-early technology, and decided that continuing a 2D approach wouldn't work. And with Sonic 4, they tried to retrofit the eventually, occasionally successful 3D Sonic to the 2D formula, without paying attention to the dynamics that came to define not only the player's experience and memories but also how everything else had to be designed to interact well with Sonic's controls and the player.

I am about as nostalgic as you could be, but ever since Sonic Mania was released I've been waiting for the announcement of whatever comes next. Sonic Mania was announced as a one-off to honor fans and the series' own history (a task it performs admirably; the attention to detail and lore runs deep), but it ended up demonstrating that the idea of the 2D Sonic games still worked (and, with the DLC/physical release Sonic Mania Plus, that it was open to expanded mechanics).

In a world that works the way it should work, the minds behind Sonic Mania are now working not on Sonic Mania 2, but on the next step the original series would have taken, a game that isn't retro, but also not just re-painted with "assets" from more contemporary 3D-line Sonic titles. A game that tries to treat today's technology as something to be used in the way the original games approached then-current 16-bit technology, instead of as a crutch for reproducing the same old painting. A game that perfects the foundation and uses it to build a new and wondrous thing. A game that gets back in the driver's seat and gives the fans what they didn't yet know they wanted, instead of just paying homage to their obsession.

In order to make Sonic as great as it used to be, they need to be courageous enough to stop making Sonic the way it used to be.

John Feminella: Not Even Wrong

Lots on the distinction between communication based in facts, numbers and statistics and communication based in storytelling, on how neither necessarily communicates truthfully or completely, and on how the brain is as yet ill-equipped to gel the two.

Much of the controversy and many of the opposing camps of the past few months have their source in one group being comfortable looking at things one way and not understanding another group's choice to look at them another way. My guess is that each group is made up mostly of people persuaded to that position, and that being able to communicate and connect on many planes makes a difference.

Today's site progress

Open Graph is a complete trainwreck, but is now a supported trainwreck.

An og:image is supposedly required for every page, but there's no defined or overwhelmingly common aspect ratio, and it's apparently not the end of the world if you leave it out for your front page, so I'm leaving it out for all pages since I wouldn't know what to put there. (I have ideas, but it would have to be customized to match the look of the site.)

Entry page loading should be just a little faster, since Markdown parsing and HTML conversion are now done once, when all entries are loaded, instead of with every page load.
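A minimal sketch of the idea (the names and the converter are assumptions for illustration, not the actual engine):

    // Convert each entry once at load time; serve cached HTML on every view.
    const { marked } = require("marked"); // any Markdown converter would do

    const htmlCache = new Map();

    function loadEntries(entries) {
      for (const entry of entries) {
        htmlCache.set(entry.id, marked(entry.markdown)); // pay the cost once
      }
    }

    function renderEntry(id) {
      return htmlCache.get(id); // each page load becomes a cache lookup
    }

The trade is the usual one: a little more memory held across requests in exchange for skipping repeated conversion work.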

Emily Shea: "Voice Driven Development: Who needs a keyboard anyway?"

Incredibly impressive voice coding demo using Talon. Starts with the infamous "let's code Perl using the Vista speech recognition" video for even more contrast.

(via: Accidental Tech)

Joseph Anderson's Super Mario Odyssey critique

This is another good example to illustrate the previous entry's point. In it, he takes issue with the ungodly number of "moon" collectibles and how they dilute the worth and meaning and joy associated with each one, which most fair-minded reviews would. Except that he also painstakingly catalogues and categorizes every single moon. This is the opposite of the rash, harsh, off-the-cuff commentary games often get – as recently explored by Mark Brown's Game Maker's Toolkit, another good watch.

Gaming critiques and deconstructions on YouTube

I have a complicated relationship with them.

On the one hand, it's easy to look at them and say: wow, here are hundreds of videos retreading a game like Super Mario 64, which most people who care about games have played, and coming to largely the same conclusions.

On the other hand, it's not wrong to like something, nor is it necessary for large catalogues of human endeavor to be deduplicated. For each of these videos there have always been thousands of people thinking similar thoughts or holding fond memories in their heads.

As an example, I have long appreciated how well designed the Baby Mario bubble dynamic in Super Mario World 2: Yoshi's Island is. The sound of the baby crying is gut-wrenching and speaks to your instincts; after recovering from being stunned, Yoshi seems to lurch in the right direction; and the flying toadies that come to collect him can be delayed or prevented from approaching the bubble even when the countdown timer is at 0.

Thanks to videos like the ones I have been watching over the years, I have come to appreciate those details and think about game design in a way I surely wouldn't unless I designed games myself, even though I've been playing games all my life.

For similar reasons, the ongoing and surely accelerating death of magazine journalism is not the death of gaming journalism. "Democratization" is a common and worn-out term, but it's also imprecise – it's not so much that everyone can write a review but that we are able to be seeded with ideas and foundations of knowledge now like never before.

Many of the English-language gaming magazines available to me when I grew up seemed more like rough-housing lad bibles, most entertaining to the ten people on staff who understood all the in-jokes. Meanwhile, there are now plenty of hour-long videos exploring the meaning, pacing, narrative, metaphysics, controls and so on of games; here's one about 2018's God of War. I don't mean to compare the message that fits in a 3-hour video against a two-page spread, but I do mean to point out that the desire for depth was often simply not there in the old magazines, and I'd rather trust the guy who made the video than most classically designated game journalists.

In a way this is the positive side of the wonder of the Internet. For good and for bad, it connects people and lets them communicate. It can let misinformation, stereotypes and conspiracy theories flourish, with many consequences. But it can also be a positive force in deepening interest and knowledge and passion, which makes us stronger and happier and more fulfilled, and helps us on the way to fight the bad sides.

(And wouldn't you know that there are thoughtful deconstructions of those things too.)

Brent Simmons: More on the Default Feeds Issue

I’m trying to figure out what bothers me. I think there are two things.

One is just that the App Store has always seemed rather arbitrary. The guidelines don’t even have to change for unseen policies to change, and it’s impossible to know in advance if a thing you’re doing will be okay and stay okay. (Recall that NetNewsWire has been doing the same thing with default feeds for 18 years.)

This gets really tiring, because every time we submit an app — even just a bug-fix release, like 5.0.1 is — I have to deal with the anxiety as I wonder what’s going to happen this time.

The other issue is a little harder to explain, but it goes like this:

If a site provides a public feed, it’s reasonable to assume that RSS readers might include that feed in some kind of discovery mechanism — they might even include it as a default. This is the public, open web, after all.

Now, if NetNewsWire were presenting itself as the official app version of Daring Fireball, for instance, then that would be dishonest. But it’s not, and that’s quite clear.

To nevertheless require documentation here is for Apple to use overly-fussy legal concerns in order to infantilize an app developer who can, and does, and rather would, take care of these things himself.

In other words: lay off, I want to say. I’m an adult with good judgment and I’ve already dealt with this issue, and it’s mine to deal with.

I have been complaining about the App Store since its inception, but it's worth repeating that it's not just dolts like me who don't like it. It's arbitrary in ways that defeat its purpose, demean its constituents and claim for itself the crown of sole responsible grown-up. Responsible grown-ups raise their kids and infuse them with values, yes, but they would walk on burning embers to give them the freedom to do with their lives what they want. Extracting new rules from the same list of information and applying them suddenly and unevenly is capricious.

It would take an incredible balancing act to actually run an app store well. Apple has done the best job of it so far, but it's still a tire fire that inhibits applications legitimate developers want to write. These events are not representative of every app review process ever, but they are representative of what happens when you have an app review process and live in the real world. It doesn't have to happen in most cases to be a disgrace and an impediment.

Saagar Jha: Why we at $FAMOUS_COMPANY Switched to $HYPED_TECHNOLOGY

A slightly less complimentary, but no less true, take on today's theme.

(Also, that decision to allow Markdown in titles is coming out in force right out of the gate.)
