I just caught a reference to Arve Bersvendsen’s EvilML file. What is it? It’s an HTML document designed to make use of the fact that HTML is, technically, SGML, which has all kinds of strange shortcuts you can use. Of course, no one has ever bothered to make a web browser that actually handles all these shortcuts.

It’s hard to describe. The code is barely readable. The first line of text looks like this: <body<h1<em>Emphasized</> in &lt;h1&gt;</>. No browser in existence is likely to display it correctly, and yet — amazingly enough — it validates…
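If I’m reading the SGML shorthand rules correctly (a best guess on my part, not something spelled out on the EvilML page), that line uses two of them: the unclosed start tag, where the < of the next tag implies the > of the previous one, and the empty end tag </>, which closes whichever element was opened most recently. Expanded into conventional HTML, it would be equivalent to something like:

```html
<body>
<h1><em>Emphasized</em> in &lt;h1&gt;</h1>
```

(The closing </body> is itself optional in HTML, which is part of why the original can still validate.)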

I already thought that moving to the more rigidly-defined XHTML was a good idea, but suddenly it makes a lot more sense!

Interesting note: I was looking at a subset of the web server logs, and noticed some hits from a program called webcollage. I was already familiar with it as a screen saver module: it reaches out to grab random images off the web and creates a collage on your screen.

What I did not know was that one of its sources is Altavista’s random link feature – and that feature isn’t so random. From what I can tell, AV picks a random site once a minute and sends everyone to the same place.

Whatever the reason, there were lots of times that several different computers grabbed the same page looking for images, and every time the hits were all clustered within a minute.

The big winners seemed to be our Comic Con photos (8 at once), the Chain Lightning Flashes (7 at once), and – not surprisingly – the entire Strange World category (11 at once).
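For anyone curious how clusters like these show up, the idea boils down to grouping hits by page and looking for bursts inside a one-minute window. A minimal sketch in Python – the function name and threshold are mine, purely for illustration:

```python
from datetime import datetime, timedelta

def clustered_hits(entries, window=timedelta(minutes=1), threshold=5):
    """entries: list of (datetime, path) pairs from a server log.
    Return the paths that received `threshold` or more hits within
    any single `window`-sized span of time."""
    by_path = {}
    for ts, path in entries:
        by_path.setdefault(path, []).append(ts)

    hot = []
    for path, times in by_path.items():
        times.sort()
        for i in range(len(times)):
            # Count hits falling in [times[i], times[i] + window].
            j = i
            while j < len(times) and times[j] - times[i] <= window:
                j += 1
            if j - i >= threshold:
                hot.append(path)
                break
    return hot
```

Run against synthetic log entries, a page hit five times in forty seconds is flagged while a page hit twice an hour apart is not.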

While looking for more ideas related to my earlier post on fighting link rot, I came across some interesting articles:

Web Sites that Heal [archive.org] considers some of the causes of linkrot, including: changing CMS systems (which I’ve dealt with here twice), poor structure (starting small and simple, then finding that as the site grows, the old design doesn’t work anymore), lack of testing, and plain apathy. More interesting are some of the reasons it becomes a problem, in particular the difficulty of setting up redirections and of informing other sites that you’ve moved. That’s something else I can relate to: my site hasn’t been on the UCI Arts server in four years, yet despite a massive attempt to get people to update their links, Altavista still shows 82 pages linking to my site’s old location.

Something I think the article leaves out is the number of sites – particularly free Geocities accounts set up back in the dot-com era – that just aren’t maintained anymore. The pages are there, but they’re six years out of date – and so are the links.

The article then proceeds to suggest an automated server-to-server system that will detect incoming links to a moved page, then contact the referring site, report the new location, and instruct it to update the link with no human intervention whatsoever. A great idea, though it will require people like me to drop the edit-locally-and-upload model of development.
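To make the idea concrete, here’s a toy sketch of the detection half, assuming the server keeps a table of moved URLs. Everything here – the names, the shape of the notification – is my own invention, not the article’s:

```python
# Hypothetical sketch of the "self-healing" idea: when a request for a
# moved page arrives, record the Referer so the linking site could be
# told the new location automatically.
MOVED = {"/old/photos.html": "/galleries/photos.html"}

def handle_request(path, referer=None):
    """Return (status, location, notification) for an incoming request.
    `notification` is the payload the article imagines being sent
    server-to-server, asking the referrer to update its link."""
    if path in MOVED:
        new_path = MOVED[path]
        note = None
        if referer:
            note = {"referer": referer, "old": path, "new": new_path}
        return 301, new_path, note
    return 200, path, None
```

The hard part, of course, isn’t this lookup – it’s getting every referring site to run software that accepts and acts on the notification.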

“Web Sites That Heal” referred to a Jakob Nielsen column on Linkrot. Nielsen’s advice is frequently useful, though not always applicable [archive.org]. Sadly, his recent columns have tended toward rehashing old ones or applying to ever more specialized niches, but sometimes his advice is spot-on. In this case, the article from six years ago still applies to today’s web: run a link validator on your site from time to time, and keep old URLs on your own site active (whether with actual content or with a redirect). The comments on this article are worth reading as well.

Lastly, I found a remark on Consequences of Linkrot [archive.org] as applied to weblogs. Most of the post is actually an excerpt from Idle Words, where the original author notes that the classic blog post – a single line linking to something of interest, or a series of the same – is particularly susceptible to linkrot. Without the original material, there’s nothing (or next to nothing) left. And it happens fast: The Web isn’t that old, and blogging is even younger, yet information is disappearing rapidly enough that you really have to wonder how much of what exists today will still be around – in any form – ten years from now. One of the key lessons DeLong takes from this article: it’s “critically important not just to link but to quote–and to quote extensively.”

The lesson is clear: The site you link to today may not be there tomorrow, and you may not have the time (or inclination) to go chasing it down. Quote it, summarize it, add context, write lots of commentary, whatever. Make sure what you post can stand on its own… just in case it has to.

On an ideal Web, pages would stay put and links would never change. Of course, anyone who has been on the Internet long enough knows just how far away this ideal is. Commercial sites go out of business, personal sites move from school to school to ISP to ISP, news articles get moved into archives or deleted, and so on.

There are two sides to fighting link rot. The first is to design your own site with URLs that make sense, that you won’t find yourself changing a few months or years down the road. If you have to move something, use a permanent redirect (HTTP 301) so that people and spiders alike will automatically reach the new location.
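Under Apache, for example, this can be done with mod_alias in the server config or an .htaccess file (the paths here are made up for illustration):

```apache
# Permanent (301) redirect for a single moved page:
Redirect permanent /old/photos.html http://www.example.com/galleries/photos.html

# Or for an entire directory that moved:
RedirectMatch permanent ^/oldsite/(.*)$ http://www.example.com/newsite/$1
```

A permanent redirect tells browsers and search spiders alike that the move is for good, so they can update their records instead of following the hop forever.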

The other side to the fight is periodically checking all the links on your site to make sure they still go where you expect.
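This part is easy to automate. Here’s a minimal link-checker sketch in Python using only the standard library – the structure is mine, and real validators do far more (retries, rate limiting, relative-URL resolution):

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def check_link(url, timeout=10):
    """Return the HTTP status code for url, or None if the
    request fails outright (DNS error, refused connection)."""
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code   # e.g. 404 for a rotted link
    except URLError:
        return None
```

Feed each of your pages through extract_links, call check_link on the results, and anything returning 404 (or None) goes on the to-fix list.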

So how do you handle online journals? Obviously they’re websites, so from that standpoint you should at least try to keep the links current. But on the blogging side, there are problems with this, in particular the school of thought that you should never revise a blog entry (also discussed in Weblog Ethics).

As one of the many working stiffs who can access the internet from work but has to share a connection, I would like to make a request of the corporate world at large:

STOP REQUIRING FLASH TO VIEW YOUR SITE!!!!!!

Everything I look at on the net while at work has to go through a server in northern CA, which doesn’t have Flash capability and probably never will: it would be even slower if the 250 people sharing it were allowed to view bandwidth-hogging all-Flash sites. With the economy being what it is, bandwidth costs being what they are, and connections needing to be shared at most offices, I’m not sure any company should be upping the ante this far in the name of pretty pictures. And the defense that people can look at it at home isn’t great either, since DSL is out of reach of more working stiffs than web geeks want to admit, and Deity-of-Your-Choice only knows when it might creep into affordability.

So, please do what you used to do, and keep your non-Flash site online after the upgrade, instead of routing us to a page extolling the wonders of Flash and attempting to bully us into downloading it. (Baaaa.) You’ll widen your audience with very little effort–and hey, aren’t non-Flash sites easier to maintain?