Macha

The problem with tech "is dead" hysteria

May 01, 2010 | categories: Pcs, Gadgets

It seems every week there are people claiming that yet another area of computing is dead: IE6, Windows, netbooks, desktop apps, the list goes on. Yet nine times out of ten these proclamations are very premature, because while power users normally run the latest and greatest of everything, ordinary users don't. Ordinary users don't run n-1 either. Ordinary users run whatever was on their computer when they got it. Fine. Most people know that.

But then the problems start. How old is an old computer? One that came with Windows XP? (And these are non-technical users, so of course they're using Windows.) I'm sure most people reading this would agree that that counts as an old computer. No one would use anything older than that, surely?

People actually still use this.

Think again. As someone who regularly fixes the computers of "non-computer people", I frequently encounter PCs running Windows XP SP1, Windows ME, and even Windows 98. These are users who can't run the newest versions of apps. Or the versions before them. Or often even the versions before them. Surely they'd buy a new computer then? Nope, they're still happily using that copy of Office 2000 they were given with the PC. Now, I will admit that "computers needing to be fixed" will obviously over-represent the older machines still hanging around compared with picking some at random, but it can't invent old computers out of thin air, either.

So what happens when their computer goes up in smoke? Well, first they send it to their computer-nerd friends, or a PC repair store. If it still can't be fixed without spending real money, they will buy a new one, and that's one less old computer for us to deal with.

So, a new computer nowadays will come with Windows 7 preloaded. And despite all the new features, the basics of Windows usage haven't changed much since Windows 95. The Start menu button might be circular, but it's still in the bottom left. The text names of the programs in the taskbar have been replaced by pictures. The windows are a bit rounder and more colourful. Nothing that would trip any of us up.

But then... we have people like my dad. He "can't understand this Vista stuff", coming from XP. He still uses an old, slow Windows XP laptop in preference to the new Windows 7 desktop. I finally managed to move him away from IE6 to IE8. He still misses IE6's "easiness". His average use pattern consists of MS Word, Facebook and Outlook Express. He did an ECDL course as part of his job. ECDL, for those who haven't done it, is a computer course that teaches MS Office, Windows and IE.

Attempts at moving these people from their preferred apps are futile. They learned how to use one program, and they're going to continue using that one. You won't even get them using OpenOffice, which is as close to MS Office as you'll get without actually being it, so trying to get them onto iWork, for example, is a lost cause.

Now, moving on from my dad, and keeping with the Apple theme, let's look at a device that has spawned a lot of "is dead" articles: the iPad. Depending on who you ask, the iPad will be responsible for the demise of everything from the netbook all the way up to the entire personal computer industry. But on its release, the device was woefully underpowered for any heavy usage. So a theory came about: the iPad wasn't for us heavy computer users, it was for the average Joe. After all, it is far simpler, so it should be easy for people who aren't good with computers to use, right? Wrong. As I've already mentioned, the differences between XP and Vista are enough to confuse some people.

It's certainly less different than this, but apparently, everybody is going to use these.

Even apart from that, do you really think those people looked at the iPad and thought anything more than: "Oh, another gadget"? A significant number of people I know won't use anything other than "dumb" Nokias. Why? Because that's what they've always used, and that's what they are used to. Because of this, if you try to sell them the iPad as a sort of computer, they won't want it, because it's not Windows. If you try to sell it to them as a sort of phone, they won't want it, because it's not Nokia. If you try to sell it to them as an in-between device, they won't want it, because they'll see it as an expensive, unneeded gadget. The fact that Apple had to impose a two-iPads-per-customer limit during pre-orders shows exactly who is buying them, and the number of Apple fanboys and gadget nerds, even when combined, is still finite.

Oh, and you can play WoW and The Sims on netbooks. You can't do that on an iPad. Those two games have a very large number of players, many of whom use their computer for little other than those games and Facebook.

Yet people still use these, and even older models.

Let's look at another "is dead" example, that of desktop apps. The apparent killers here are web apps. After all, if you look at my typical day's computer usage, it includes Google Reader, Gmail, Brizzly, Google Calendar, Pidgin, Vim, Eclipse, Rhythmbox... oh look, the last four are desktop apps. Desktop apps with no web app of equal quality to match them. And web apps aren't likely to reach that quality for some time. Even Joe Hewitt, who quit iPhone development, believes native development to be better than web development. I wonder how much time and effort it would take to recreate programs like Photoshop or Visual Studio in the browser. The best online image editor I could find still doesn't achieve feature parity with Paint.NET, which it is obviously inspired by. Google Docs and what I've seen of Office Web Apps don't reach the features of OpenOffice, never mind MS Office on the desktop.

Huge Progress? Undeniably. Equivalent to a desktop app? Not at all.

Another problem with web apps is the web designers' old enemy, IE6, which is also allegedly dead, yet still has 20% of the market share. Where is it coming from? Those users with Windows 98, ME and 2000, who actually can't upgrade (even to Firefox or Chrome). And, of course, corporations that won't upgrade anything without everything being tested and examined three times over.

Some of these users have even been reached by the messages many web designers are putting on pages telling IE6 users to upgrade. One teacher, who had been seeing these more and more, asked me how he would go about upgrading his browser. The computer ran Windows ME. I had to give the advice that he would need to buy a new computer. I hate telling people that, because most of them don't want to spend €400 on a new computer, unsurprisingly.

I could just search the web for tech "is dead" posts and work through every widely used technology that has supposedly died, but people reading this would get bored around 2,000 words in, with only a fifth of the list covered. These few examples should hopefully be enough of a reminder that tech doesn't just die and go away that easily.

Arch follow up post

April 24, 2010 | categories: Pcs

So, after a few weeks of using Arch, I've gotten my system to a state where I don't instantly want to reinstall it. This post is a list of the problems I've encountered and what I did to fix them, for anyone who runs into the same things.

The first problem I came up against was during installation. To install, I needed wifi; to use the wifi, I needed some drivers from the AUR; and for the AUR, I needed an installed system with wifi. My wireless card is a Broadcom BCM4312, which needs the wl driver. Asking in #archlinux led to a user who advised me to use the latest testing version, which supported the card.

Of course, the card still needed the b43-firmware package, which had the same catch-22 issue. The solution, in the end, was to load the firmware onto a flash drive and copy it onto the system while running the LiveCD (and it had to be a LiveCD; for some reason it wouldn't work running from USB).
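
For anyone stuck in the same catch-22, the copy step boiled down to something like the following from the live environment. The device name, mount point and module name are assumptions; check what your own flash drive and card actually use.

  # Run from the LiveCD shell. /dev/sdb1 is an assumption; check fdisk -l
  # or dmesg to see what the flash drive actually shows up as.
  mkdir -p /mnt/usb
  mount /dev/sdb1 /mnt/usb

  # Copy the firmware into place, then reload the wireless module so it gets picked up.
  cp -r /mnt/usb/b43 /lib/firmware/
  modprobe -r b43
  modprobe b43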

That done, I installed away, and had to repeat the process to get wifi on the installed system.

My next problem was that the screen resolution was stuck at 1152x864. The problem was solved by using this guide from, of all places, Ubuntu (the general approach is sketched after this paragraph). After that, I had a problem where scrolling anything had all the pace of a dying snail. Turns out the answer was that I shouldn't have listened to all the people saying radeonhd was as good as the fglrx driver. It wasn't anywhere near up to the job on my ATI Radeon HD3650. To install the fglrx drivers, see this guide on the Arch Wiki. That fixed the scrolling issue straight away.
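
I can't reproduce that guide here, but the usual approach such guides took was to generate a modeline with cvt and feed it to xrandr, roughly as below. The output name and the numbers are illustrative; run cvt and xrandr -q yourself for the real values.

  # Ask cvt for a modeline matching the panel's native resolution.
  cvt 1440 900
  # It prints something like:
  #   Modeline "1440x900_60.00"  106.50  1440 1528 1672 1904  900 903 909 934 -hsync +vsync

  # Teach X about the mode, attach it to an output, and switch to it.
  # "VGA-0" is an assumption; xrandr -q lists the real output names.
  xrandr --newmode "1440x900_60.00" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
  xrandr --addmode VGA-0 "1440x900_60.00"
  xrandr --output VGA-0 --mode "1440x900_60.00"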

My next issue was installing Wine. To install it from the AUR on x86-64, you need the bin32-wine package. Of course, at the time I installed it (not sure if it's still true), its lib32-jack dependency on the AUR was set up wrong. I had to edit the pkgver and pkgrel variables in the PKGBUILD to match the version actually listed on the AUR itself.
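
Concretely, the fix was just bumping two lines near the top of the downloaded PKGBUILD so they matched the AUR page. The values below are placeholders, not the real version numbers.

  # In the lib32-jack PKGBUILD: set these to whatever the AUR page lists.
  pkgver=0.118.0
  pkgrel=1

  # Then rebuild and install as usual.
  makepkg -i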

My experience with Arch Linux

April 09, 2010 | categories: Pcs

So, I've been using Arch Linux for the past few days, after reading Sirupsen's post on it. He was quite positive about it, and he'd pointed out Vim to me, so I was prepared to take his advice again. Besides, what with the upgrades, downgrades, beta versions, etc., my Ubuntu installation was pretty much a wreck at this stage and needed to be reinstalled anyway. And the last x64 disc I had was for 8.10, which would have meant upgrading three times in the space of a week, something I wasn't very inclined to do.

And that brings me to the first issue with Ubuntu: the whole updates system. The fact that you have to reinstall your system every few months, or get stuck with old versions of software like Firefox and Wine, is a major nuisance, especially on a 1 meg connection, where a system update is 5 hours just for the download. (On that subject, I'm finally getting a 3 meg connection sometime in the next 2 weeks. Yay!) This unsurprisingly defeats the point of a package manager, and has led me to forgo it altogether for certain software, like Firefox and Eclipse, which have their own updating systems outside its control. So hearing that one of Arch's main goals is a rolling release that constantly has the latest programs sounded like a pleasant escape.

Another area where Ubuntu has always given me a problem is bloat. A default install of Ubuntu contains GIMP, OO.o, Firefox, Evolution, Empathy, Ubuntu One, Orca, etc., etc. Evolution, Firefox and Empathy all get instantly uninstalled in favour of other programs, GIMP is a good example of horrendous UI, and OO.o, while not bad, certainly could be improved. But Arch and I have very different ideas of bloat. I consider those bloat. Arch considers things like automated wifi config bloat. So while I'd like a minimalistic Linux, Arch goes too far and is completely spartan. To put it another way: Ubuntu is like a rented house full of furniture, some of it rather dated; my ideal would be an empty house; and Arch is a pile of bricks.

But then there is installation. A fresh install of Ubuntu, provided you have the latest ISO, is a 20-minute process. Most of it is relatively automatic, and it _usually_ just works. Unless you do something stupid like sudo apt-get remove evolution-*, which somehow removes GNOME, but hey, you can remove the wrong package on any distro. Arch, on the other hand, is a much more involved process, requiring manual editing of at least 10 config files and seemingly psychic knowledge of modules and of which kernel version has what for which device. The ISO on the downloads page would not work for me because I needed to install over wifi. Turns out I needed drivers that aren't on it. They are there in kernel 2.6.32, which is in the version of Arch linked from the forums, not the downloads page. Oh, and on top of that, I needed firmware, which is not on the disc; instead I had to stick it on a USB stick and copy it over before installation.
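
To give a flavour of what "knowing details about modules" means in practice, here is the sort of line the wifi setup hinges on in /etc/rc.conf. The module name is an assumption for my Broadcom card, and the daemons list is roughly the 2010 default; treat both as illustrative.

  # /etc/rc.conf, hand-edited during installation
  MODULES=(wl)                              # wireless driver module; card-specific
  DAEMONS=(syslog-ng network netfs crond)   # services started at boot, in order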

Part of this is the reality of what Arch is. It's a by-geeks-for-geeks distro, much like Linux as a whole used to be. Nothing wrong with that, except it's at a level beyond what I'm able to deal with. For example, I was trying to set my resolution the other day. The advice I got was "Read man xorg.conf". As I remarked on Twitter:

man xorg.conf makes the rather amusing assumption that you already know loads about both X.org and the internal workings of your monitor...

Or at least it's amusing until it means you are stuck looking at a display that looks like crap because it's stuck at 1152x864 when its native resolution is 1440x900. Further asking only got the advice "Look under Modes". Well, here is an excerpt of the Modes section in man xorg.conf:

Mode "name"
This is an optional multi-line entry that can be used to provide
definitions for video modes for the monitor. In most cases this
isn't necessary because the built-in set of VESA standard modes
will be sufficient. The Mode keyword indicates the start of a
multi-line video mode description. The mode description is
terminated with the EndMode keyword. The mode description
consists of the following entries:
DotClock clock
is the dot (pixel) clock rate to be used for the mode.
HTimings hdisp hsyncstart hsyncend htotal
specifies the horizontal timings for the mode.
VTimings vdisp vsyncstart vsyncend vtotal
specifies the vertical timings for the mode.
Flags "flag" ...
specifies an optional set of mode flags, each of which is a
separate string in double quotes. "Interlace" indicates
that the mode is interlaced. "DoubleScan" indicates a mode

Mode "name" This is an optional multi-line entry that can be used to provide definitions for video modes for the monitor. In most cases this isn't necessary because the built-in set of VESA standard modes will be sufficient. The Mode keyword indicates the start of a multi-line video mode description. The mode description is ter- minated with the EndMode keyword. The mode description consists of the following entries:
DotClock clock is the dot (pixel) clock rate to be used for the mode.
HTimings hdisp hsyncstart hsyncend htotal specifies the horizontal timings for the mode.
VTimings vdisp vsyncstart vsyncend vtotal specifies the vertical timings for the mode.
Flags "flag" ... specifies an optional set of mode flags, each of which is a separate string in double quotes. "Interlace" indicates that the mode is interlaced. "DoubleScan" indicates a mode

This is not helpful! Horizontal timings? Dot clocks? All I want to do is change the resolution. Man pages are for advanced technical details, not basic user information. And the wiki, their other reference, is in varying parts either as unashamedly complex as these man pages or only covers the optimal scenario, lacking any middle ground. On the other hand, Ubuntu provides this page, which explained far, far more. Also, while going through this process, I was informed that I shouldn't use GNOME because it's a bloated, buggy desktop environment, in the opinion of several members of #archlinux, despite it working fine for me for the last while. I'm looking for help, not conversion! I wonder what they think of KDE or Windows, then...

On the subject of not helpful, here is the recommended way to install a package that isn't in pacman's default repos on Arch, and believe me, that covers a lot of packages.

  1. Go to aur.archlinux.org
  2. Download the PKGBUILD
  3. Possibly make changes to the PKGBUILD
  4. Open a terminal to the folder containing the PKGBUILD
  5. Run makepkg -i
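
In terminal terms, that works out to roughly the following for each package. The tarball URL is an assumption about how the AUR lays out its download links; grab the real link from the package's page.

  # 1-2: fetch and unpack the build files for the package
  wget http://aur.archlinux.org/packages/bin32-wine/bin32-wine.tar.gz
  tar xzf bin32-wine.tar.gz
  cd bin32-wine

  # 3: inspect (and possibly edit) the PKGBUILD
  nano PKGBUILD

  # 4-5: build the package and install it
  makepkg -i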

Does this sound tedious and repetitive to you? It certainly does to me, so many members of the Arch community have built helpers to automate this, such as Clyde, as mentioned by Sirupsen the other day. But apparently this isn't the Arch way. So to install Wine, I use sudo clyde -S bin32-wine. But the AUR has the wrong version of one of its dependencies. Here's what #archlinux thinks of AUR helpers:

Macha, you should really read about abs and aur and stop using any aur helper

This is another area where I want a simple solution, and they go for the most spartan one possible. OK, not the most spartan; I could compile it and place the files in the right directories myself. But really, a five-step process for EVERY package, and for each of its dependencies? As opposed to sudo apt-get install foo, or sudo clyde -S foo? Why is this better?

I want to like Arch. I admire its aims of being up to date and simple. But I just can't. It's not for me, and on the final release of Ubuntu 10.04, I will be going back to Ubuntu on this laptop. I hear they've dropped GIMP now, and worked on getting it to boot faster, too...

Why Ubisoft's DRM screws me over, and why you can't presume everyone has decent internet

April 03, 2010 | categories: Gaming

One of the most annoying recent developments in gaming is that Ubisoft has implemented, and EA plans to implement, an always-on internet requirement as part of their DRM, including in single-player games. This means that even for a single-player game, you need to be online to play it. Which is very annoying, to say the least. Yet I frequently see people online saying that you must have an always-on internet connection, and presuming that if you don't, it's your fault for being cheap or lazy.

Most frequently, this seems to come from American users, who have become accustomed to having decent internet (as much as they complain about their standard of internet). There are countries other than America out there. Here in Ireland, I have a 1 megabit connection. An unreliable 1 megabit connection, with an ISP that works 9-5, Monday to Friday. This means that if my connection crashes at 6pm on Friday, I have no internet until I get home at 5pm on Monday. So long internet outages do happen. And, because of my ISP's hours, they tend to happen at times when I might be at home, wanting to, maybe, play some video games.

"Sure, it's your own fault, get a better ISP," I hear some of you say. But there is no better ISP here. I can get a more stable connection, alright. It's called dial-up. But then I'd be paying for usage rather than a flat monthly rate, rendering it (a) significantly dearer and (b) something I'd be online with even less.

And apart from the big outages, between the fact that I have a wi-fi network and that my internet connection is a rural satellite-dish-pointed-at-a-tower-on-a-hill affair, smaller ones, maybe a minute long, are par for the course. Do I want to be kicked out of my game EVERY. SINGLE. TIME? Nope.

"Oh, well, you're a minority, living rurally in a country with poor broadband." If I moved to a city (other than Dublin), I could get 7 meg broadband. And I've read Americans on forums complaining about the state of broadband in America (usually whilst comparing themselves to Sweden or Japan), but on the other hand, I've seen Americans scoff at 10 meg connections, which would leave the average Irish internet connection in the dust, even in a city.

There is another side to the poor internet connection, one that Ubisoft would be delighted to hear about. It renders me incapable of pirating a game, because of the sheer length of time it would take to download. Between unreliable internet corrupting downloads, the fact that I'm not home all day, and my slow speed, downloading a 5GB game would take around 2 weeks. And let's be honest, 5GB is a conservative estimate for the size of a modern game. By that stage, all my friends would have cleared the game and moved on to the next one, rendering it pointless.

I bought Assassin's Creed 2, on the 360, before this mess was announced. And now I regret it, having supported a company that does this.

On Twitter

March 16, 2010 | categories: Me

Twitter is, depending on who you ask, the greatest invention or the greatest timewaster of recent times. I certainly use it a lot, and it provides around 40% of the traffic to my blog, so evidently others do too. But whenever I use it, I have a tendency to wonder: is this really a valuable use of time? Most of the time I use it, I am just timewasting. While on Twitter, I might be putting off tidying, study, or even activities I enjoy such as coding or blogging.

Of course, pre-Twitter, the other two big timewasters were email and IM. I've never been a big user of email, and IM requires the other person to be on at the same time as you. In comparison, some of the people I follow on Twitter have a 12-hour time difference, meaning I'm never on at the same time as them. As well as that, because following people on Twitter isn't two-way, I can follow famous people like @stephenfry, or even just relatively well known programmers like @codinghorror (Jeff Atwood) or @shanselman (Scott Hanselman).

But... 90% of the famous people I follow on Twitter end up getting unfollowed within the next 2 weeks. Why? Because they tend to end up doing one of the following:

  1. They constantly promote whatever they are working on, to the point the tweets are little more than ads.
  2. They completely ignore why they are famous and tweet about their cats or other boring subjects.
  3. They try to keep up their reputation to the point where their tweets are little more than 140-character fortune-cookie sayings, with either absolutely no substance, or ones that make you go: "Well, duh..."

Another problem with Twitter is that there are loads of automated bots on it. They range from the mildly irritating, quickly blocked porn spambots that follow 1500 people in the hope someone will notice them, to those that retweet every tweet containing certain words. And of course, they retweet manually, so it shows up with the rest of your @replies. And the marketers, the annoying marketers.

People following me just to market their products
Yes, I'm sure that they really are interested in what I'm going to say, and the fact that all they post is ads for their products, and that they follow 10k people, won't affect that. Really.

Another problem with Twitter is part of its core functionality: every tweet is limited to 140 characters. While that is partly one of Twitter's strengths, it can cause significant problems. Some things just don't fit in 140 characters, as evidenced by the existence of sites such as Twitlonger. Others can be reduced to 140 characters, but end up losing all meaning. To quote one of my own tweets:

How many great possible blog posts, go out in a blaze of 140 chars, I wonder?
One of my biggest problems with this blog is thinking of things to write about. And lately, looking through some of my tweets, I wonder how many of them would have been better served by being expanded into a blog post rather than reduced to 140 characters. Others have evidently come to the same conclusion before I did, as you can see by the number of tweets that are just links to blog posts, but then this has the flip side of reducing some Twitter accounts to being inferior RSS feeds (worse because decent RSS feeds include the content as well).

One of the big features of Twitter is its API, which is easily the best one out there. Before OAuth it took all of 5 minutes to code your own Twitter client, using nothing more than the standard libraries included with your programming language. And it's led to many great clients like Twitterrific on the iPhone/iPod Touch and TweetDeck on PC, both of which I use a lot. The problem with these is that they make it even easier to waste time on Twitter.
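
To give an idea of how low the barrier was, the old basic-auth API was just authenticated HTTP; something like the following was the entire guts of a read/post client. The paths are from memory of the v1 API, so treat them as illustrative.

  # Fetch your home timeline as JSON (basic auth, pre-OAuth)
  curl -u username:password http://api.twitter.com/1/statuses/home_timeline.json

  # Post a tweet
  curl -u username:password -d "status=Hello from a five-minute client" http://api.twitter.com/1/statuses/update.json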


And I do waste a lot of time on Twitter. When my timeline froze the other day, rendering Twitter unusable for me, I got nearly twice as much work done.

The final problem with Twitter is deciding who to follow. This used to be much easier, as you could see @replies by people you follow to people you didn't. If someone you followed tended to tweet at one person a lot, maybe that person had similar interests, so you followed them too. Now the closest we get is retweets. Unless you want to manually look through the following lists of everyone you follow, of course.

Now, of course, it's not all bad, and if it were I'd have deleted my Twitter account already. In the future I will definitely be spending less time on Twitter. But if you'll excuse me, I'm off to tweet about this blog post.
