Once upon a time, the idea that “only the code mattered” was sold as a way to be inclusive. No one would be shut out if their code was good.

But building software is more than code. It’s design. Planning. Discussion. It’s figuring out use cases, misuse cases, and failure modes. It’s interacting with people.

And if you allow some people to treat others like crap because only the code matters, you end up causing harm and driving people away.

Which obviously isn’t inclusive.

If you mistreat people or violate ethics to make your “technically perfect” software, those people have still been mistreated. Those ethics have still been violated. People have created marvels of engineering and fantastic art by abusing or exploiting others. People have done the same while abusing or exploiting people on the side. And people have created wonders while trying very hard not to abuse or exploit others.

The accomplishment doesn’t erase the exploitation or abuse. And if you can accomplish something incredible without mistreating others, it obviously doesn’t justify the mistreatment.

But the culture of “only the code matters” turned into a culture of tolerating assholes because they were good at their job. The ends justify the means. From trying to enhance freedom, to embracing Machiavelli.

It certainly didn’t help that 90s hacker culture had a significant BOFH element to it, with its built-in disdain for those with less technical knowledge. The Free part tended to prioritize programmers and sysadmins over “lusers.” It was Animal Farm with computer users. Sure, we tried to throw off the corporate overlords who were dictating how people could use their computers. But some computer users were more equal than others.

So a lot of people who could have become part of the Free Software community found a hostile environment and left in disgust. Or fear. And even if you don’t care about the harm done to them, consider their potential contributions. Free Software has always had a problem with coverage: Programmers work on problems that they find interesting or useful. The boring parts, the use cases that they personally don’t use, tend to fall by the wayside.

Yeah, your code is good…but the spec’s incomplete because you pushed away the people who would have pointed out a common use case, or just how easy it would be for a feature to be misused. You didn’t think they were worth listening to because they weren’t rockstar coders. But they also had information you didn’t.

Not that throwing off the corporate shackles has worked out all that well. Every platform now has its own walled garden. Microsoft is less dominant than it once was, but we have new mega-corps who’ve managed to leverage an internet built on Free/libre and open-source software into their own positions of dominance. And trying to maintain services for people who’ve come to expect free/gratis has brought us to the point where adware is the norm, and surveillance is everywhere…to better target those ads. And the majority of computing devices out there are locked down, preventing ordinary users from tinkering with them and developing that technical competence that might bring them into the fold…

If we’ll even let them join.

Kiddo’s been wanting to learn programming, with the ultimate goal of modding Minecraft. We’ve done some Ruby, but he’s impatient, so last night we started Java with a simple program that repeats a println X times.

He wanted to pass it the integer limit.
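For the curious, the program was something like this (a reconstruction from memory, so the class name and message are approximate):

// Print a line X times. We set X to the biggest value an int can hold.
public class Repeat {
    public static void main(String[] args) {
        int limit = Integer.MAX_VALUE; // 2,147,483,647
        for (int i = 0; i < limit; i++) {
            System.out.println("Creeper!");
        }
    }
}

Integer.MAX_VALUE is 2,147,483,647. Even at an optimistic million printlns per second, that loop would run for more than half an hour, and real console output is a lot slower.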

After a few minutes, I suggested we watch a movie and check back later.

After dinner, he decided to stop it and we timed some shorter runs.

I think he has a better understanding of scale now!

I’ve written about the trouble with using mobile apps in dead zones before, so I’m happy to see that I’m not the only one thinking about the problem. Hoodie wants to design for offline first, and is trying to start a broader discussion around the issue.

Offline reading is an obvious application. Most eBook readers handle it just fine, though that’s the easy case: you spend a long time in each book, so the reader doesn’t have to predict what you’ll want next. It would be great if Feedly would sync new articles for offline reading. Heck, I’d like it if Chrome on Android would let me re-open recent pages when the connection dies.

Beyond reading, many actions can be handled offline too. Kindle will sync your notes and highlights. GMail will let you read, write, label, archive, delete, and even send messages without a network connection. All your actions are queued up for the next sync.

There’s no reason this approach can’t be taken with other communications apps for messages that don’t require an immediate response, even with services like Facebook and Twitter. Short notes of the “don’t forget to pick up milk” variety. Observations. Uploads to Dropbox. Photos going to Instagram or Flickr. Buffer would be perfect for this, since you’re not expecting the post to go out immediately in the first place. It shouldn’t give you an “Unable to buffer” error; it should just save the post for later.
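The underlying pattern isn’t complicated. Here’s a minimal sketch of the queue-and-sync idea in Java (the class and method names are mine, not any real app’s API):

import java.util.ArrayDeque;
import java.util.Queue;

// Offline is treated as a normal state, not an error: actions taken while
// disconnected are saved locally, then replayed when the connection returns.
public class OfflineQueue {
    private final Queue<Runnable> pending = new ArrayDeque<>();

    public void submit(Runnable action, boolean online) {
        if (online) {
            action.run();        // connected: send it now
        } else {
            pending.add(action); // offline: save it for later, no error shown
        }
    }

    // Call this whenever connectivity comes back.
    public void flush() {
        while (!pending.isEmpty()) {
            pending.poll().run();
        }
    }
}

The design choice that matters: every action “succeeds” immediately from the user’s point of view, and delivery happens whenever it can.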

I’d like to be able to do work in a place where there’s no connection, have that work persist, and fire things off as I finish them instead of having to come back to all of them the next time I’m within range of a cell tower or a coffee shop with wifi. I’d also like to be able to post in the moment, hit “Send,” and move on with my life, instead of having to hang onto that extra context in my mind as I walk around.

Mozilla spent several months testing a new Firefox release that detects crashed or frozen plugins and shuts them down (instead of letting them take the whole browser with them), carefully making sure everything was working before releasing Firefox 3.6.4 last week.

Within days, they released an update. I couldn’t imagine what they might have missed in all the beta testing. Katie wondered if the beta testers hadn’t been testing the limits.

You want to know what convinced Mozilla to issue an update so quickly?

Farmville.

Apparently Firefox was detecting Farmville as frozen and closing it. It turns out that on many computers, Farmville regularly freezes up the browser for longer than 10 seconds, and its players just deal with it and wait for it to come back. Mozilla decided that the simplest thing to do would be to increase the time limit.
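If I remember right, that limit lives in a hidden preference that anyone can tweak in about:config:

dom.ipc.plugins.timeoutSecs

The update just raises the default from 10 seconds to something more Farmville-friendly (45, I believe), giving the plugin more slack before Firefox declares it hung.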

What this tells me is that the type of person willing to beta-test a web browser these days is not likely to be playing Farmville — or if they are, it’s likely to be on a bleeding-edge computer that can handle it without 10-second freezes.

In more practical terms: Mozilla needs to convince a wider variety of users to help test their software!

As the first major web browser to reach a double-digit version, Opera has been testing out alpha releases of version 10 for months now. One of the early problems they encountered was bad browser detection scripts that only looked at the first digit of a version number and decided that Opera 10 was actually Opera 1, and therefore too old to handle modern web pages.
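To see how a script makes that mistake, here’s a sketch of the naive check (my own reconstruction, in Java for illustration; real sniffers were typically JavaScript, but the bug is the same):

// Reads only the first digit after "Opera/", so version 10 looks like 1.
public class BadSniffer {
    public static void main(String[] args) {
        String ua = "Opera/10.00 (Macintosh; Intel Mac OS X; U; en)";
        int start = ua.indexOf("Opera/") + "Opera/".length();
        int major = Character.getNumericValue(ua.charAt(start)); // 1, not 10
        if (major < 6) {
            System.out.println("Please upgrade your browser!"); // oops
        }
    }
}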

After extensive testing, they’ve concluded that the best way to work around this is to pretend to be Version 9.80. From now on, all versions of Opera will identify themselves as “Opera/9.80” with the real version appearing later in the user-agent string.

For example:

Opera/9.80 (Macintosh; Intel Mac OS X; U; en) Presto/2.2.15 Version/10.00

This is similar to the way all Gecko-based browsers identify themselves as Mozilla/5.0, then list the real browser name and version number later on, which makes me wonder why they didn’t just stick with that increasingly irrelevant prefix — though I suppose any scripts looking specifically for Opera versions might have still picked up Opera/10 later on in the ID.

It’ll be some time before Firefox or Safari runs into this issue, but with Internet Explorer 8 in wide release, you have to wonder…what will Microsoft do when they get to IE 10?