Macha

The wi-fi doesn't work. I give up. I'm going back to... Linux

February 16, 2010 | categories: Gaming, Pcs

Linux and wi-fi. They normally go together like a square peg and a round hole. Every wireless adapter I've had over the past few years has had at least some problems running under Linux, ranging from my USB WPN111 putting an end to my first foray into Linux about two years ago ("screw this - no internet, it's too much effort, I'm going back to Windows") to my current laptop's random DNS failures when I used WPA2.

So, I was pleasantly surprised when my new TP-Link WN821N worked straight away on Linux on my desktop PC. Given that it was a €20 Wireless N adapter, I didn't expect it to. All well and good. Then it came to Windows. Expecting it to be as simple as usual, I installed the driver. The wi-fi thing showed up, I used Connect to a Network, and... no network.

Several minutes of googling later revealed there is no Windows 7 driver (which is what Server 2008 R2 normally uses). Instead there is a Vista driver, which is no good in this case. So I've either:

  • Run into one of those edge cases where the difference between Windows 7 and Windows Server 2008 R2 actually does matter
  • Run into one of those edge cases where the difference between Vista and 7 does matter

or

  • Both

Which really sucks. Especially when the majority of what I do on that desktop using Windows is gaming. (Yeah, yeah, gaming over Wi-Fi, tut tut...). Gaming sans multiplayer is kind of limited. I mean, sure, I can play Fallout 3 and Oblivion fine, but what about Team Fortress 2? Oh, hold on, that works under Wine. Actually, so does Fallout 3. And OpenTTD. And that's basically all I play in PC games lately.

Right, so why am I running Windows on this machine? My copy of Visual Studio is from DreamSpark, and I'm fairly certain it can only be installed once; it's already on my laptop. (My copy of Windows is from DreamSpark too, which is why I'm using Server 2008 R2 in the first place: it was that or XP, I've no Windows 7 yet, and my Vista disc is a Dell OEM disc.) uTorrent and Paint.NET both blow away their nearest competitors on Linux, but uTorrent is useless without an internet connection, and Paint.NET is only one program at the end of the day.

So, screw this - no internet, it's too much effort, I'm going back to Linux [1]. And that is something I never thought I'd say when I first started experimenting with Linux. Of course, most of this is TP-Link's fault. If some random outsider can write a working driver for Linux, I don't see why they shouldn't be able to write a driver for Windows 7. I won't be buying from them again.

[1] On this desktop at least; on my laptop I still dual-boot to have VS for programming in Windows languages, and iTunes for syncing my iPod Touch.

Building a PC

January 12, 2010 | categories: Me, Pcs

A while back I decided to build a PC from parts for the first time. Why didn't I ever do so before? (a) Lack of money, and (b) Worry I'd screw something up. Of course, what kind of computer nerd would I be if I never built my own?

So first things first, the specs (some components old, the rest ordered from komplett.ie):

  • MSI G41M-F Motherboard
  • Intel Core 2 Quad Q8300 (2.5GHz)
  • GeForce 210 (512MB)
  • 2GB RAM
  • 630W PSU
  • 500GB HDD
  • Nox Saphira Case
  • DVD±RW drive

Nothing astounding there. A decidedly below-par graphics card for a new system, but considering it was my first build, I didn't want to spend loads in case I screwed up. Plenty of room for expansion later, however.

The first part of the build went easily enough. I made a minor beginner's mistake in not checking which way I was putting the heatsink on beforehand, so I had to unwrap more of the CPU fan cable than otherwise needed (which meant I later had to cable-tie it to another cable to keep it away from the fan blades).

The system was assembled and wired up, hassle-free. Turning it on for the first time yielded spinning fans, a spinning hard drive, a running graphics card, no CD drive and no display. After a minor panic attack, it turned out I'd forgotten to plug in the extra 12V power cable for the motherboard. That solved, the system booted up fine.

System setup was the next step. The first OS to go on was Windows Server 2008 R2. Why Windows Server? Because I'm a cheapskate, and could get it for free from Microsoft DreamSpark. The installation started and everything looked fine, but it froze at "Expanding Windows files". A quick install of Fedora 10 proved the system was capable of running an OS, and even Windows XP installed fine. Then it dawned on me: the problem was the disc itself. I burned a new copy, and Server 2008 R2 installed fine.

A little tinkering was needed to get a decent desktop experience from the Server OS. This is common enough that there are whole websites dedicated to converting Server 2008 into a Vista-esque PC, and the same for Server 2008 R2 into 7.

The system runs fine, and can handle all my games without any difficulty, except Fallout 3. I don't know what's causing the Fallout 3 problem, but it freezes randomly within the first 5 minutes. A quick google reveals some people have the same problem with Windows 7, so it may just be an incompatibility; I'll have to hang on and wait for another patch.

In total (remember, some parts are from old systems), this build cost me just under €350. Could I have gotten a comparable system for €350? Checking on Dell, the same money would buy me a Pentium Dual Core E5300 and integrated graphics, so probably not.

Footnote: I've written this post while trying out Windows Live Writer. It seems cool. My one problem is it doesn't quite render my theme right in the edit section, but apart from that it is fine. Anyone know of any Linux programs with similar features?

Dear Adobe, Apple and Sun, your update is not important enough that you have to crash my game

January 06, 2010 | categories: Gaming, Pcs

Auto updaters are great. They keep Windows secure, they save me having to manually install each new version of Firefox, they're completely transparent with Chrome, and on Linux the whole system's updates are controlled through the one interface. Some are less ideal. Paint.NET prompts you to update on startup, usually when you've just opened it for a quick edit (and you can't use it while it downloads the updates - apparently this is fixed in the newest version). It's still better than nothing, however.

There is one class of updaters that isn't quite so great, however. These are the ones that decide they need to sit in your system, all running simultaneously. And when there is an update? It's so important that they need to pop up a window to alert you, even if you have no intention of going near said application for a week. Or maybe they are like Apple's. An update to iTunes includes Safari by default. Why?

The other day, I was two hours into a Supreme Commander LAN party. Just as I was about to start the final attack, what happens? The game minimizes, and a message pops up: a new update for Adobe Reader. The last time I opened a PDF was a week ago (against my will). On my system, once you minimize any of these games:

  • Oblivion
  • Supreme Commander
  • Call of Duty 5
  • Team Fortress 2
  • Unreal Tournament III
  • etc.

they aren't coming back up again. Say goodbye to your progress if there isn't a recent save. I tried in vain to bring the game back up. Click the taskbar entry. Up pops another window:

"Supreme Commander Application has stopped responding"

It closes. I end up disconnected from the LAN game, and up pops "Macha has been defeated" on the nearly defeated player's screen. All because some stupid cruddy app, for opening files created by those too lazy to make an actual web page, decided it needed to update itself right now. (Yes, I am aware Adobe Reader and PDFs are useful to some people in some situations. I am not one of them.)

While the Java updater was not the culprit this time, it has been at other times, with behavior similar to that of Adobe's updater.

That is one clear advantage of console gaming. The nearest equivalent is the 360's "update or be signed out of Xbox Live" policy applying to single-player games as well, and that's not nearly as bad.

I can remove these applications from startup of course, but somehow they seem to always make their way back there.

Source control and me - Why I use git and github

December 29, 2009 | categories: Pcs, Programming

The biggest change to programming for me this year (apart from spreading out from just PHP and JavaScript) is that I now use source control for most of my projects. I started out on Subversion because it was widespread and easy to use (with TortoiseSVN; I was still a Windows "everything must be GUI" user at the time), and there was a handy tutorial for it published on a blog I happened to read. I used that for a good while until a friend showed me git and github. I'd used SourceForge and Google Code for a while, because although my projects are too small for anyone else to be interested in adding to them, the benefits of having a remote source control service that's always accessible from any computer are huge for me. And it's handy for showing people the program in more detail if I need help on Stack Overflow.

While I was (and still remain) unconvinced about the benefits of distributed source control versus normal source control, and more specifically of git vs svn, the benefits of github versus google code (which I had been using at the time) were more than enough to convince me to make the switch. It also helped that at the time, Linux had recently become my main OS with Windows being relegated to usage for syncing my iPod Touch and gaming, and I had become much more familiar with CLI usage of the system, so the command line orientation was no longer the problem it once was.

Branch history of a small github project

Look at the pretty graphs! (Ignore Chromium for Linux's failure at positioning: my cursor is actually on that big green dot on my screen; it's just the screenshot gone wrong.)

Setting up git and github was simple. I could give you instructions on how to do it, but Sirupsen has already done it much better than I could for Linux users, and for Windows there is another tutorial hosted on github.
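If you're curious about the gist of what those tutorials cover, it's roughly this (the name, email address and repository below are made-up placeholders, not details from either tutorial):

    # one-off: tell git who you are
    git config --global user.name "Your Name"
    git config --global user.email "you@example.com"

    # generate an SSH key, then paste the contents of ~/.ssh/id_rsa.pub
    # into the SSH public keys section of your github account
    ssh-keygen -t rsa -C "you@example.com"

    # point an existing local repository at github and push it up
    git remote add origin git@github.com:yourname/yourproject.git
    git push origin master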

Usage of git with my github account is equally simple; there's a quick sketch of the commands after the list.

  • Change files
  • git commit -a
  • Type commit message into nano (or vim if you've changed your system editor, or notepad if you're on Windows)
  • git push origin master
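
Spelled out as a terminal session, that cycle looks something like this (the commit message is just an invented example):

    # commit every change to files git already tracks;
    # without -m this opens nano/vim/notepad for the commit message
    git commit -a

    # or give the message inline and skip the editor entirely
    git commit -a -m "Fix broken link on the about page"

    # send the new commits up to github
    git push origin master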

The only thing I missed when switching was that Subversion numbers revisions 1, 2, 3... 20000, while git uses SHA-1 hashes. But I can live with that.

Uploading a PDF is not putting information on the web

November 21, 2009 | categories: Pcs

One thing that always gets on my nerves is when I go to a website to read an article and the web page turns out to be just a summary of the article, with the full article in a PDF. 99 times out of 100, this results in me leaving the page and finding the information elsewhere. There are other variations of this: "See our website for information", where the information turns out to be a PDF file, is another common one.

If I did click on the PDF, what would happen? On Windows, Adobe Reader would open up (slowly, sometimes taking as long as 5 minutes), then the PDF file would start loading. If I was lucky, it'd be in a browser tab. If I was unlucky, it'd be in a whole new window. On Linux, it'd open in Document Viewer, which is faster. However, I would then be unable to click any links in the document.

Why would anyone put a page on the web that needs an external application running outside the web browser, and call that putting information online? Especially when said application has had security problems in the past. It just strikes me as being very lazy.
