76 IEs? Not Likely.
Paul Irish believes that by 2020, web designers will be forced to support 76 variations of Internet Explorer, given the nearly annual release schedule IE has been on lately and the compatibility modes each of these IEs will have for previous versions of Internet Explorer.
IE6's entrenched position came from the fact that (a) it was the latest version of Internet Explorer for a huge length of time, and (b) it was the IE dead end for Win2k and below.
When IE7 came out, any company that still had any Win2k machines had to keep designing with IE6 in mind if they wanted their new apps to work on all their computers. (I'm making the assumption that if they were relying on IE previously, they couldn't just switch to Firefox or something).
Now, I think most people can accept that IE8 is going to end up in IE6's current place: it's the IE dead end for XP, a hugely popular OS. But IE7? The companies that don't upgrade never moved to IE7 in the first place, and home users who do upgrade will have moved on to IE8. So what are you left with? Unpatched Vista installations. These are much rarer than unpatched XP installations, simply because Vista had a shorter lifespan, and Vista to 7 is a sufficiently undramatic upgrade even for the types of people who would take years to go from XP to Vista.
So far we have:
- IE6 will drag on as long as XP does.
- IE7 won't last particularly long. While it's popular now, early Vista computers will be replaced in the near future (2-3 years), causing it to lose market share to IE8.
- IE8 will have a long lifespan, although probably not as long as IE6.
IE9? IE9 has never shipped by default with any version of Windows, which means anyone running it actively chose to upgrade. These users will likely keep upgrading, so in the future IE9 will be even more of a non-issue than IE7.
IE10 will likely also go the way of IE7. While it will be installed by default on Windows 8, the dramatic changes in Windows 8 will scare off many of the companies that are slow to upgrade.
So in 5 years' time, what versions of IE will you realistically need to support?
- IE6 (maybe - probably, hopefully, enterprise only at this stage)
- IE10 (enterprise will never use it because Win8 is scary and different to them, so home users only)
- IElatest-1 (so IE13 or something?)
- IElatest (IE14 or something)
Needing to support IE6 and IE10 will likely be mutually exclusive, so that's 4 versions for sites targeted at home users and 5 for sites aimed at both enterprise and home users. Still ugly, but far from 76. And all those versions will be dead in the timescale that the article is using. Insofar as IE6 will ever die, anyway.
IE6 for home users will be dead at that point. Most of those old early XP computers will be "broken" and replaced, even if "broken" just means slow and annoying. Using XP in five years will be like using Win98/Win2k today. Yes, people do use them. No, they aren't a large enough group for most of us to worry about. I even get a small number of hits from Netscape 6. I haven't a clue what my page looked like for them, and I don't care.
In theory, if even IE is aiming for at least yearly releases from now on, no future IE will end up in the position that IE6 is in, and that IE8 will find itself in, as upgrading your browser frequently becomes a fact of life. The compatibility modes will matter much less too: the shorter-lived the browser, the less likely its compatibility mode will ever be used.
This post was originally posted as a comment on HN. Check the thread for possible replies.
Not Invented Here and New Programmers
The general consensus is that one of the best ways to learn how to program better, beyond learning the basic syntax, is to just go ahead and write some programs on your own. Another consensus is that the best type of program to write is one that scratches your own itch. Yet another consensus is that it is best to avoid the "Not Invented Here" syndrome of writing everything from scratch, and instead reuse as much code as possible.
However, for someone learning to program today, large parts of any itch they could scratch have already been scratched by someone else - usually someone who has done a much better job of it than any newbie could. This means that if they follow the advice regarding NIH, they are reduced to writing glue code for quite some time, tying together libraries written by other people while programming relatively boring code.
The problem with this is twofold:
- The newbie doesn't really learn much about designing their own programs. Sure, they see how the (hopefully) well written libraries do it, but they don't get to see the thought process required, or any of the refactoring that removes earlier bad design decisions.
- Glue code is boring. How many people are driven away from programming by this experience?
One of the first big programs I wrote personally was a PHP social network. I did many parts of it from scratch - a database abstraction library, a templating system, even a primitive MVC system1, and so on. The code that resulted was horrible, and probably riddled with security problems, but I learned a lot from the process.
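The kind of from-scratch exercise I mean is small enough to sketch. Here's a deliberately naive templating engine (in Python rather than PHP, for brevity; the `{{name}}` placeholder syntax is my own invention for illustration, not any particular framework's):

```python
import re

def render(template, context):
    """Replace {{name}} placeholders with values from context.

    Deliberately naive: no escaping, no loops, no conditionals.
    Discovering why those matter is exactly what you learn the
    hard way by writing one of these yourself.
    """
    def lookup(match):
        key = match.group(1).strip()
        return str(context.get(key, ""))
    # Non-greedy match so "{{a}} and {{b}}" is two placeholders, not one.
    return re.sub(r"\{\{(.*?)\}\}", lookup, template)

page = render("Hello, {{name}}! You have {{count}} messages.",
              {"name": "Alice", "count": 3})
print(page)  # Hello, Alice! You have 3 messages.
```

Horrible next to a real templating system, but each of its flaws (what happens when a value contains HTML? when a key is missing?) is a design lesson the framework user never gets to have.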
How would a newbie do that nowadays? They'd install CakePHP or Django or Rails, getting a database abstraction layer, templating, and MVC all written for them. And sure, the resulting program will probably be cleaner and less buggy, as all the hard parts are handled by the framework. But the job of the programmer gets relegated to writing some models and a few views that are pretty much the same everywhere.
But then a lot of the benefit of trying to write a program themselves just isn't there. They don't get the benefit of finding out why some ideas don't work, as they just use the framework. Most of what they learn are the APIs of the framework, not any of the thought processes involved in creating the program. Not to mention that while people writing programs more complex than anything a newbie would attempt might be delighted to have those problems taken out of their hands, as they figure out how to make their database not fall over with 20k users, a newbie gains little from it.
The fun of programming is solving new problems. That's why Rails and co. are so popular: after you've written your second or third web app, these problems quickly become old problems, and having Rails handle them for you is really convenient. But a new programmer hasn't solved them before. Their programs will not be as ambitious (anything that ambitious will most likely be dismissed by them as too hard), so having Rails solve these problems will leave them without any problems to solve.
Of course, another benefit is that when they are finished, they can go look at the existing solution to see how it compares to their own. This helps them compare the thoughts that led to their own solution with the (presumably) better result of the existing one. If they had managed to find some itch that hadn't already been scratched by someone else, they wouldn't have a sample to compare their own work to.
In short, laying off the avoidance of "Not Invented Here" can, in many cases, help newbies learn quicker than being relegated to writing glue code would.
Before I knew what MVC was, so it ended up as more of a VC system
Android, the early days
So, for quite some time, I've been meaning to buy a smartphone to replace my current dumbphone and iPod Touch 2G combination. It had worked fairly well for most of the last few years, with a few exceptions, but I was getting tired of carrying around 2-3 devices with me (depending on whether I needed a camera or not), and iOS 4 was very problematic: I was left to choose between a stable, responsive device (downgrading to 3.1.2) and one which could continue to install most apps from the App Store. My iPod Touch 2G was also looking like it would soon fall completely to the onslaught of planned obsolescence breathing down its neck, as updates such as iOS 4.3 did not appear to support it, and many apps ran unacceptably slowly on it.
Initially I had hoped to buy a device in roughly the price range of a new iPod Touch, at around €300. This obviously ruled out the iPhone, but I reckoned that once I'd eliminated the Apple tax, I'd be able to get an acceptable smartphone for that price range.
As some of you already laughing at my naivety have guessed, I was wrong on that point. The sub-€300 price range was represented by a cheap Vodafone-branded phone, the obsolete-at-birth Sony Ericsson X8, and the Samsung i5500 and HTC Wildfire, both of which had a screen resolution identical to my 3-year-old €80 dumbphone, the Sony Ericsson K800i. Nice job, guys. I know it's the budget end, relatively speaking, but you're selling a smartphone that has fallen at the first hurdle to a 3-year-old phone at a third of the price.
So with the budget end ruled out, I waited a while until I had enough cash and decided to have a look at more expensive phones. The iPhone? €460 for the obsolete 3GS, €600 for the iPhone 4. Pass. Around the €400 mark, I noticed the HTC Desire. It was certainly well reviewed, and the 3.7" screen was roughly the size I wanted. It did, however, look a little dated, with the HTC Desire HD and HTC Desire Z having been released since.
None of the networks here sold the Desire Z, which limited my HTC choices to the Desire or the Desire HD. There was also the Samsung i9000 hanging around that price point, but the general consensus I got from people I knew was that HTC was the better of the two.
So, with the choices down to two, I agonised over the HTC Desire vs. the HTC Desire HD. While the HTC Desire HD was €50 dearer, that wasn't a major factor. Instead, I was worried that the 4.2" screen was too big for practicality.
However, that decision was made for me pretty quickly when HTC announced the Desire S, a definite replacement for the Desire. Not, as you might think, by making me wait for the Desire S (with the phone networks here, there's no telling how long that would be), but by pushing me to the Desire HD. So pretty quickly I went into the store to buy it. I know, I know. You bought it in store? What a caveman! However, the online discount of €30 was only slightly more than the €20 discount I got in store for switching from O2 to Vodafone, and I find it always helps to be able to shout at a specific store if things go wrong.
When I'd first bought my iPod Touch, one of my initial thoughts was how much better the screen looked than any other device I'd owned before. While people with Retina displays might scoff at the 800x480 screen on the Desire HD, that too managed to inspire the same reaction, being a huge improvement over my old iPod Touch's 320x480 screen.
Performance was a huge improvement too, although anything would have been, as iOS 4 completely destroyed any semblance of responsiveness on my iPod Touch. Still, it seemed faster than my iPod had been under 3.1.2, and faster than my friends' 3GSes (none of them had an iPhone 4 for me to compare with).
It's more stable than iOS from what I've seen so far. Safari used to crash on a fairly regular basis, occasionally requiring the entire iPod to be rebooted. That said, twice I've hit the home button only to be brought back to a Loading dialog for a minute instead of my home screen, as if the home screen app had been closed, so I'm not willing to make a final judgement until I've used the device a bit more.
Compared to an unjailbroken iOS device, the vanilla Android experience is extremely well featured. Notifications are handled much better, though the pull-down UI was a bit unusual. I like the letters typed appearing above the area occupied by my fingers, though on my first few uses, coming from iOS, I had to shrug off the feeling that I was hitting the wrong button. One annoyance when typing that I still mess up is that the close-keyboard button occupies the space taken by the button that brings up numbers and symbols on iOS devices. I've heard that this keyboard is HTC-specific, but I don't have access to any other Android device to check at the moment.
Applications are pretty good too. My must-have App Store apps on an iOS device are Twitter, Canabalt and Angry Birds. Far from a huge collection, obviously, as I got bored of most other apps and removed them eventually, and some (Call of Duty, SimCity, etc.) never worked that well on my iPod anyway. Angry Birds is on Android, free too, though the ads are somewhat annoying; if they offered a paid no-ads option, I would probably buy it again. Canabalt is not, but my device has a Flash player, so I can actually play it on Canabalt's website. I probably won't, because it's not the most practical, though it was a funny discovery.
Twitter was already included on the device, and my first few comments were about how it was inferior to Twitter for iPhone, but acceptable. A quick search of Twitter clients (Twidroyd, HTC Peep) failed to turn up anything better, so I stuck with Twitter for Android. However, I soon discovered that preinstalled apps don't auto-update, and manually installing it from the Market got me Twitter for Android 2, which is pretty close to the iPhone app, missing only the My Tweets, Retweeted functionality. On the plus side, it has no dickbar.
Moving on to jailbreak applications, my must-haves were sbsettings, MxTube, and... and... well, that was about it, really, by the end. sbsettings I do miss, mainly for the brightness control, as auto-brightness (on either device) never seems to set the brightness to the minimum, which makes using the device in the dark rather painful. MxTube had tons of equivalents on the Android Market, and the one I went for is TubeMate. It's a bit clunky, but it'll do.
I also found plenty of GBA emulators. There was one on the iPhone, gpSPhone, which I picked up when it was free and kept until I updated to iOS 4, on which it no longer worked. Since iOS 4 had performance problems, I never bothered with any emulators on it. There is also a PSX emulator, FPse. It's very nice, and unlike on my iPod Touch 2G, it runs games at an acceptable speed without ruining the sound. There is also psx4droid, which I tried, but putting a ROM on it took more than 15 minutes, so I missed the refund window before I discovered it was too slow.
An interesting fact about these emulators is that they are on the Android Market and work without rooting, unlike the iOS ones, which required Cydia and a jailbroken device. Likewise, other jailbreak features, such as custom themes and a slide-down app launcher (part of sbsettings), are built into the software. Anything that saves on potential warranty voiding is always good.
One of the nicest features of Android is how syncing works: take files and drag and drop them onto the device, either by plugging it into a computer or by using sftp via an app running on the device (I use SSHDroid, but it was literally the first one I found). No crapware programs, and none of the horribly painful bloat that is iTunes. On the negative side, this does make album art a bit more difficult to manage.
All in all, I'm fairly happy with my new phone, though there are still a few teething issues, such as the close-keyboard button placement and my tendency not to use the search key in a lot of cases where I should.
The Magic Layer
Any sufficiently advanced technology is indistinguishable from magic.
Arthur C. Clarke
As a programmer, and a computer nerd, I obviously know more about how computers work than a random person grabbed off the street. As such, it's always interesting to see how people with less knowledge of computers use them.1 To a lot of people, they click some buttons on the screen, the computer does something that may as well be magic, and finally results appear on screen.
However, it's not just those unfamiliar with technology who find that, at some level, the processes going on may as well be magic. Very few people understand the entirety of how things work from top to bottom. Moving on from the unfamiliar user and the buttons they click: many poorer developers type (or copy and paste) in some code they don't really understand, and somehow it all works. Their magic layer is lower than that of the user who doesn't understand how the program works at all. After all, they understand that code needs to be written, but they don't understand much of what goes on beyond that. Their magic layer is in the function calls, in the syntax of the language, in the meaning of the code. They may understand small sections, but the program as a whole is still magic to them.
Moving on from the poorer developers, we have novice developers, whose problem is not incompetence but simply knowledge they don't yet have. They understand, for the most part, how their code works. They know how to structure their functions, they know when to use objects (and when not to), and they can create programs from scratch. But they may not quite understand what is going on in the library routines, or what happens when the compiler creates a program from their code.
As you go further up in the skill of the developers, the magic level recedes. They realise that the library routines are just like functions they create themselves for the most part. Some might make system calls, but other than that, there isn't much magic there. They understand that the compiler reads their code, and produces machine code which is run by the CPU. The magic is banished from their code to the inner workings of the compiler/interpreter and the operating system.
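Interpreted languages make this particular layer easy to peek behind. CPython, for instance, compiles source to bytecode that you can inspect with the standard library's dis module - not machine code, but the same idea one level up (a small sketch; the exact opcodes shown in the comment vary between Python versions):

```python
import dis

def add(a, b):
    return a + b

# Show the bytecode CPython's compiler produced for add().
# Typical output (opcode names vary by Python version):
#   LOAD_FAST    a
#   LOAD_FAST    b
#   BINARY_OP    (+)   -- BINARY_ADD on older versions
#   RETURN_VALUE
dis.dis(add)
```

Half an hour poking at output like this does more to banish the "compiler is magic" feeling than any amount of reading about it.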
Finally, once the developer learns how those last few retreats of magic work, they can understand pretty much the whole picture, as far as software goes. However, this still isn't the end of the story. Sure, the magic might be gone from the software side, but there is still the hardware. How do those system calls translate to data being written to disk? How does the CPU know that MOV eax, ebx moves data between registers? At this point, you're in the hardware layer.
So, where is your magic layer? Personally, I'm at the "inner workings of the compiler/OS" stage, and to push it further back, I've found some useful online resources. As a self-taught programmer, most of the resources I've used up until now glossed over these areas, and it's my main aim for this year to push the magic layer back into the hardware. For compilers, I've found Jack Crenshaw's Let's Build a Compiler a useful starting point, and I'm currently reading through it. For operating systems, a bunch of useful articles have coincidentally popped up on Reddit and HN recently, though I'm still open to suggestions on other resources.
If you found this article interesting, why not follow me on Twitter, leave a comment, or subscribe to my RSS feed.
Some further discussion of this post can be found at Hacker News and on Reddit.
Except when they frustratingly do everything in the most convoluted way possible.
The AUR and its flaws
First of all, yes, I am picking on Arch again. But something to keep in mind is that although I've been complaining about Arch a lot over the past few months, I'm still using it, and I haven't been pushed hard enough to replace it. The big negatives I complain about are countered by big positives, such as having the latest up-to-date programs far ahead of distributions that make you wait for the next major release to get version 3.x+1 without installing extra packages (I'm looking at you, Ubuntu).
Anyway, with that disclaimer out of the way, here's today's complaint. The target for today is the AUR - the Arch User Repository. This is for those packages that haven't been confirmed stable enough to go into the main repositories - aka, most of the interesting ones.
My first complaint is that so many of its packages are just flat-out broken. More than once I've come across a package whose download URL is something along the lines of http://somesite.com/latest.zip . Since the package was written, latest.zip has changed to a newer version, with the result that it no longer passes the checksum, and you can't install it. Someone may have already flagged the package as out of date, but for a lot of them, that's no guarantee it will be updated.
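The failure mode is easy to demonstrate: the package pins a checksum of whatever latest.zip contained at packaging time, so the moment upstream replaces the file, verification fails. A rough sketch of the check (hypothetical file contents; Python's hashlib standing in for the package manager's verification step):

```python
import hashlib

def sha256_of(data):
    """Hex digest of a file's contents, as a checksum field would record it."""
    return hashlib.sha256(data).hexdigest()

# At packaging time, the maintainer records the hash of latest.zip as it was:
recorded = sha256_of(b"app version 1.0 contents")

# Later, upstream silently replaces latest.zip with a new release:
downloaded = b"app version 1.1 contents"

# The URL still resolves, but the pinned checksum no longer matches,
# so the install is (correctly, if annoyingly) refused.
if sha256_of(downloaded) != recorded:
    print("ERROR: file did not pass the validity check")
```

The fix is equally simple in principle - package a versioned URL like app-1.0.zip instead of latest.zip - but that relies on upstream offering one.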
The next thing that annoys me is source packages. Sure, they're fine for some little toy panel program, but when you try to install something major like a new version of Firefox or Chrome, the install time goes from 1 minute to 30 minutes. This is especially annoying when something goes wrong after the compilation, during the installation: say, the firebrand package declaring it needs firefox>=3.6, and your AUR helper waiting until after the compilation process to uninstall the old package, then deciding that actually, it can't do that. Sure, you may be installing Firefox 4, but firebrand needs Firefox now, damn it! I am aware it doesn't completely recompile from scratch, but it's still significantly slower. This problem probably also applies to BSD's ports system and Gentoo's Portage.
Another aspect is that so many packages are duplicated without any apparent reason. What exactly is the difference between epsxe-launcher-bash and epsxe-launcher-gentoo? They both have the same description and the same version number. And then some packages, while not duplicates, do need pruning. Why is there a firefox-qt at version 3.6.3 (outdated) and a firefox-qt-beta at 3.5b4 (even more outdated)? For that matter, why are there so many Firefox packages that seem to do the same thing (integrate with KDE)? We've got firefox-qt, firefox-qt-beta, firefox-branded-kde (also outdated) and firefox-kde-opensuse. Now, I'm picking on Firefox here because as a popular package, it will have the most variants on the AUR, but duplicates do crop up on lesser used packages too.
All in all, some of this is to be expected. After all, it wouldn't be possible to have perfectly up-to-date packages for everything, no matter how hard the packagers tried. But it would be nice if the AUR were to become as easy to use as the official repos.