Macha

Arch follow-up post

April 24, 2010 | categories: Pcs

So, after a few weeks of using Arch, I've gotten my system to a state where I don't want to reinstall it instantly. This post is a list of the problems I encountered and what I did to fix them, for anyone who runs into the same things.

The first problem I came up against was during installation. To install, I needed wifi; to use the wifi, I needed drivers from the AUR; and for the AUR, I needed an installed system with wifi. My wireless card is a Broadcom BCM4312, which needs the wl driver. #archlinux led me to a user who advised me to use the latest testing version, which supported the card.

Of course, the card still needed the b43-firmware, which had the same catch-22 attached. The solution in the end was to load the firmware onto a flash drive and copy it onto the system while running the LiveCD (and it had to be a LiveCD; for some reason it wouldn't work running from USB).
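
For anyone attempting the same trick, the whole thing boils down to mounting the flash drive and dropping the firmware into /lib/firmware on whichever system needs it - the live environment first, then the installed system. A rough sketch, assuming the stick shows up as /dev/sdb1 and the firmware sits in a b43 directory on it (both will vary):

    # mount the flash drive (check dmesg or fdisk -l for the actual device name)
    mkdir -p /media/usb
    mount /dev/sdb1 /media/usb
    # copy the firmware into the firmware directory; for the installed system,
    # use the same path under its mount point instead
    cp -r /media/usb/b43 /lib/firmware/
    umount /media/usb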

That done, I got on with the install, and then had to repeat the whole process to get wifi working on the installed system.

My next problem was that the screen resolution was stuck at around 1152x864. That was solved by using this guide from, of all places, Ubuntu. After that, I had the problem where scrolling anything had all the pace of a dying snail. Turns out I shouldn't have listened to all the people saying radeonhd was as good as the fglrx driver; it wasn't anywhere near up to the job on my ATI Radeon HD 3650. To install the fglrx driver, see this guide on the Arch Wiki. That fixed the scrolling issue straight away.
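
If you're not sure which driver is actually in use before or after the switch, a couple of commands will tell you. This is only a rough sketch - the log path and output wording can differ between setups, and glxinfo comes from the mesa-demos package:

    # which kernel driver has claimed the card
    lspci -k | grep -A 3 VGA
    # which X driver was actually loaded
    grep 'drivers/' /var/log/Xorg.0.log
    # what is doing the 3D rendering
    glxinfo | grep -i renderer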

My next issue was installing Wine. To install it from the AUR on x86-64 you need the bin32-wine package. Of course, at the time of installation (not sure if it's still true), the lib32-jack package on the AUR was set up wrong. I had to edit the pkgver and pkgrel variables in the PKGBUILD to match the version listed on the AUR page itself.
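
For the record, the edit itself was trivial once I knew it was needed - just two lines in the PKGBUILD. The version numbers below are placeholders, not the real lib32-jack values from the time:

    # in the lib32-jack PKGBUILD (values here are only illustrative)
    pkgver=0.118.0   # change to whatever version the AUR page actually lists
    pkgrel=1         # likewise for the pkgrel

Once those match, makepkg picks up the corrected version and the build carries on as normal.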

My experience with Arch Linux

April 09, 2010 | categories: Pcs

So, I've been using Arch Linux for the past few days, after reading Sirupsen's post on it. He was quite positive about it, and he'd pointed out Vim to me, so I was prepared to take his advice again. Besides, what with the upgrades, downgrades, beta versions, etc., my Ubuntu installation was pretty much a wreck at this stage and needed to be reinstalled anyway. And the last x64 disc I had was for 8.10, which would have meant upgrading three times in the space of a week, something I wasn't very inclined to do.

And that brings me to the first issue with Ubuntu - the whole updates system. The fact that you have to reinstall your system every few months, or get stuck with old versions of software like Firefox and Wine, is a major nuisance, especially on a 1 meg connection, where a system upgrade is 5 hours just for the download. (On that subject, I'm finally getting a 3 meg connection sometime in the next 2 weeks. Yay!). This rather defeats the point of a package manager, and has led me to forgo it altogether for certain software, like Firefox and Eclipse, which have their own updating systems outside its control. So Arch's stated goal of a rolling release that constantly has the latest programs sounded like a pleasant escape.

Another point on which Ubuntu has always given me a problem is bloat. A default install of Ubuntu contains GIMP, OO.o, Firefox, Evolution, Empathy, Ubuntu One, Orca, etc., etc. Evolution, Firefox and Empathy all get instantly uninstalled in favour of other programs, GIMP is a good example of horrendous UI, and OO.o, while not bad, certainly could be improved. But Arch and I have very different ideas of bloat. I consider those bloat; Arch considers things like automated wifi config bloat. So while I'd like a minimalistic Linux, Arch goes too far and is completely spartan. To put it another way: Ubuntu is like a rented house full of furniture, some of it rather dated; my ideal would be an empty house; and Arch is a pile of bricks.

But then there is installation. A fresh install of Ubuntu, provided you have the latest ISO, is a 20 minute process. Most of it is relatively automatic, and it _usually_ just works. Unless you do something stupid like sudo apt-get remove evolution-*, which somehow removes GNOME, but hey, you can remove the wrong package on any distro. Arch, on the other hand, is a much more involved process, requiring manual editing of at least 10 config files, and seemingly psychic knowledge of modules and of which kernel version has what for which device. The ISO on the downloads page would not work for me, because I needed to install over wifi, and it turned out the drivers I needed weren't there. They are there in kernel 2.6.32, which is in the version of Arch linked from the forums, not the download page. Oh, and on top of that, I needed firmware, which is not on the disc either; instead I had to stick it on a USB stick and copy it over before installation.

Part of this is the reality of what Arch is. It's a by-geeks-for-geeks distro, much like Linux as a whole used to be. Nothing wrong with that, except it's at a level beyond what I'm able to deal with. For example, I was trying to set my resolution the other day. The advice I got was "Read man xorg.conf". As I remarked on Twitter the other day:

man xorg.conf makes the rather amusing assumption that you already know loads about both X.org and the internal workings of your monitor...

Or at least it's amusing until it means you're stuck looking at a display that looks like crap, because it's stuck at 1152x864 when its native resolution is 1440x900. Asking further only got the advice "Look under Modes". Well, here is an excerpt of the Modes section of man xorg.conf:

Mode "name"
This is an optional multi-line entry that can be used to provide
definitions for video modes for the monitor. In most cases this
isn't necessary because the built-in set of VESA standard modes
will be sufficient. The Mode keyword indicates the start of a
multi-line video mode description. The mode description is ter-
minated with the EndMode keyword. The mode description consists
of the following entries:
DotClock clock
is the dot (pixel) clock rate to be used for the mode.
HTimings hdisp hsyncstart hsyncend htotal
specifies the horizontal timings for the mode.
VTimings vdisp vsyncstart vsyncend vtotal
specifies the vertical timings for the mode.
Flags "flag" ...
specifies an optional set of mode flags, each of which is a
separate string in double quotes. "Interlace" indicates
that the mode is interlaced. "DoubleScan" indicates a mode

Mode "name" This is an optional multi-line entry that can be used to provide definitions for video modes for the monitor. In most cases this isn't necessary because the built-in set of VESA standard modes will be sufficient. The Mode keyword indicates the start of a multi-line video mode description. The mode description is ter- minated with the EndMode keyword. The mode description consists of the following entries:
DotClock clock is the dot (pixel) clock rate to be used for the mode.
HTimings hdisp hsyncstart hsyncend htotal specifies the horizontal timings for the mode.
VTimings vdisp vsyncstart vsyncend vtotal specifies the vertical timings for the mode.
Flags "flag" ... specifies an optional set of mode flags, each of which is a separate string in double quotes. "Interlace" indicates that the mode is interlaced. "DoubleScan" indicates a mode

This is not helpful! Horizontal timings? Dot clocks? All I want to do is change the resolution. Man pages are for advanced technical details, not basic user information. And the wiki, their other reference, is in places just as unashamedly complex as the man pages, and elsewhere only covers the optimal scenario; there's no middle ground. Ubuntu, on the other hand, provides this page, which explained far, far more. Also, while going through all this, I was informed that I shouldn't use GNOME because it's a bloated, buggy desktop environment, in the opinion of several members of #archlinux - despite it working fine for me for the last while. I'm looking for help, not conversion! I wonder what they think of KDE or Windows, then...
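
For what it's worth, the fix usually amounts to generating a modeline for the native resolution and feeding it to xrandr, rather than hand-crafting timings out of the man page. A rough sketch - the output name VGA1 is just an example (run xrandr with no arguments to see yours), and the modeline numbers are simply what cvt prints for 1440x900:

    # generate CVT timings for the panel's native resolution
    cvt 1440 900
    # create the mode from the modeline cvt printed, attach it, and switch to it
    xrandr --newmode "1440x900_60.00" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
    xrandr --addmode VGA1 "1440x900_60.00"
    xrandr --output VGA1 --mode "1440x900_60.00"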

On the subject of not helpful, here is the recommended way to install a package that isn't in pacman's default repos in Arch - and believe me, that is a lot of packages:

  1. Go to aur.archlinux.org
  2. Download the PKGBUILD
  3. Possibly make changes to the PKGBUILD
  4. Open a terminal to the folder containing the PKGBUILD
  5. Run makepkg -i
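
In practice, steps 2 to 5 look something like this, with foo standing in for whatever package you're after and the tarball being whatever the AUR page gives you to download:

    # unpack the snapshot downloaded from aur.archlinux.org
    tar xzf foo.tar.gz
    cd foo
    # read the PKGBUILD before building - you're about to execute it
    nano PKGBUILD
    # build the package, pulling repo dependencies (-s) and installing the result (-i)
    makepkg -si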

Does this sound tedious and repetitive to you? It certainly does to me, so many members of the Arch community have built helpers to automate it, such as Clyde, as mentioned by Sirupsen the other day. But apparently this isn't the Arch way. So to install Wine, I use sudo clyde -S bin32-wine. But the AUR has the wrong version of one of its dependencies. Here's what #archlinux thinks of those helpers.

Macha, you should really read about abs and aur and stop using any aur helper

This is another area where I want a simple solution, and they go for the most spartan one possible. Ok, not the most spartan - I could compile it and place the files in the right directories myself - but really, a 5 step process for EVERY package, and each of its dependencies? As opposed to sudo apt-get install foo, or sudo clyde -S foo. Why is this better?

I want to like Arch. I admire its aims of being up to date and simple. But I just can't. It's not for me, and on the final release of Ubuntu 10.04 I will be going back to Ubuntu on this laptop. I hear they've dropped GIMP now, and worked on getting it to boot faster, too...

You know, I kind of liked Vista

February 21, 2010 | categories: Pcs

Still doesn't make this any less funny.
<Snakeman^Engineer> Do I sense some hatred towards Windows Vista originating from your direction?
<Chrysalid^Revenge> Oh no, not at all
* Chrysalid^Revenge stands up in a medieval recitation pose
<Chrysalid^Revenge> "OS X for the Mac users, pretentious in their coffeeshops
<Chrysalid^Revenge> Gentoo for the nerd-lords in their mother's basement
<Chrysalid^Revenge> XP for the everyday user, bound to muck around with bloody settings and registry values they should damn well leave alone
<Chrysalid^Revenge> Then Vista from the Dark Lord behind his desk
<Chrysalid^Revenge> In the Microsoft office, where crappy programming is performed
<Chrysalid^Revenge> One OS to eat your RAM, One OS to spy on your digital media
<Chrysalid^Revenge> One OS to screw them all, and in frustration bind them
<Chrysalid^Revenge> In the Microsoft office, where crappy programming is performed"
<Sectoid^Authopsy> Whoa!

The wi-fi doesn't work. I give up. I'm going back to... Linux

February 16, 2010 | categories: Gaming, Pcs

Linux and wi-fi. They normally go together like a square peg and a round hole. Every wireless adapter I've had over the past few years has had at least some problems running under Linux, ranging from my USB WPN111 putting an end to my first foray into Linux about 2 years ago ("screw this - no internet, it's too much effort, I'm going back to Windows") to my current laptop's random DNS failures when I used WPA2.

So, I was pleasantly surprised when my new TP-Link WN821N worked straight away on Linux on my desktop PC. Given that it was a €20 Wireless N adapter, I didn't expect it to. All well and good. Then it came to Windows. Expecting it to be simple as usual, I installed the driver. The adapter showed up, I used Connect to a Network, and... no network.

Several minutes of googling later revealed that there is no Windows 7 driver (which is what Server 2008 R2 normally uses). Instead there is a Vista driver, which is no good in this case. So I've either:

  • Ran into one of those edge cases where the difference between Windows 7 and Windows Server 2008 R2 actually does matter.
  • Ran into one of those edge cases where the difference between Vista and 7 does matter.

or

  • Both

Which really sucks, especially when the majority of what I do on that desktop under Windows is gaming. (Yeah, yeah, gaming over wi-fi, tut tut...). Gaming sans multiplayer is kind of limited. I mean, sure, I can play Fallout 3 and Oblivion fine, but what about Team Fortress 2? Oh, hold on, that works in Wine. Actually, so does Fallout 3. And OpenTTD. And that's basically all I play in PC games lately.

Right, so why am I running Windows on this machine? My copy of Visual Studio is from DreamSpark (as is my copy of Windows, which is why I'm using Server 2008 R2 in the first place - it was that or XP, I've no Windows 7 yet, and my Vista disc is a Dell OEM disc), and I'm fairly certain it can only be installed once, and it's already on my laptop. uTorrent and Paint.NET both blow away their nearest competitors on Linux, but uTorrent is useless without an internet connection, and Paint.NET is only one program at the end of the day.

So, screw this - no internet, it's too much effort, I'm going back to Linux[1]. And that is something I never thought I'd say when I first started experimenting with Linux. Of course, most of this is TP-Link's fault. If some random outsider can write a working driver for Linux, I don't see why they shouldn't be able to write one for Windows 7. I won't be buying from them again.

[1] On this desktop, at least. On my laptop I still dual-boot, to have VS for programming in Windows languages and iTunes for syncing my iPod Touch.

Windows does partitions wrong, but so does Linux

July 28, 2009 | categories: Pcs

Back when I first installed Linux, when I said I had a 60GB partition to use, the advice I got was: match your swap partition to your RAM, use 10GB for /, and give the rest to /home.
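
To put numbers on that advice (the device names, filesystem and the 2GB of RAM below are hypothetical, just to illustrate the split), the 60GB ends up divided roughly like this:

  • /dev/sda5 - 10GB - ext3 - / (system, programs, /tmp, /var, ...)
  • /dev/sda6 - 2GB - swap (matched to the RAM)
  • /dev/sda7 - 48GB - ext3 - /home (my own files)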

Yesterday, I started getting errors everywhere. So I turned it off and on again. When that didn't work, I actually read the errors: "No Space Remaining". "WTF," I thought, "I've only used about 12GB."

So after spending a while looking at the disc usage program, I noticed something: it listed / as full, with 9GB of 9GB used. Then I realised the problem: my / partition was full (and it's presumably also where /tmp lives, hence the errors).
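
If you hit the same thing, a terminal gets you the answer faster than the disc usage program did. A minimal sketch - df confirms the partition is full, and du (kept to one filesystem with -x) shows where the space went:

    # confirm / really is full
    df -h /
    # sizes in MB for each top-level directory on the root filesystem only
    du -xm --max-depth=1 / | sort -n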

Luckily my swap partition was in between my / and /home partitions, so I deleted it, grew / into the freed space, and recreated swap at the end of the disc (which took a bloody long time, and required finding that LiveCD again).

This makes me wonder: is Windows' system of drive letters, rather than partitions with defined purposes (which is often a point of criticism), such a bad idea? When my C: drive filled up on Windows, I just needed to move files en masse to my D: drive. Still slow, but it didn't even require a reboot, much less depend on where my swap partition happened to be because of the order I used when installing the system.

For those of you wondering how my programs and system data came to be significantly larger than my own files, here's roughly how much storage I've used on programming:

  • Eclipse: 130MB
  • Eclipse plugins: at least 20MB, possibly as much as another 100MB
  • JDK/JRE: ~30MB
  • Apache/PHP/MySQL: ~100MB
  • Many other programming tools
  • The actual programs I'm writing now: < 20MB (older ones are stored on a network drive)

Representing partitions as drive letters is clearly wrong, because the file system is supposed to abstract the physical hard drives away, but dedicating each partition to a single purpose is also wrong. How can I predict, when I get a new computer, that I'll need x GB for data and y GB for programs?

An ideal OS would abstract all of this away, so you just have storage and don't have to deal with the actual drives you have, which partition a file goes on, predicting your disk usage, which partition is on which drive, etc.