Saturday, June 15, 2013

Fixing "mojo.run: No such file or directory" error

When trying to install something with a .mojo.run extension (for example, I was installing Dungeon Defenders from the Humble Bundle), if you get this error:

mojo.run: No such file or directory

The solution (thanks Kazade!) is to install the ia32-libs package. Then the installer should work.

(Posted this because it took me way too long searching to stumble across Kazade's post).

Update 20/10/2013:

I noticed that ia32-libs is no longer an installation target in 13.10. There doesn't seem to be an easy way to fix this, since some suggested mechanisms don't work when you've got a mojo.run file.

The only way I found to do it is detailed here: http://wiki.phoenixviewer.com/ia32-libs-in-ubuntu-13-10

The version of Synaptic I was using was slightly different from the one described there. In Step 4, it was "click New" rather than "Other Software -> Add", and in Step 5 the values are entered on four separate lines:

  • Dropdown box: Binary (deb)
  • URI: http://archive.ubuntu.com/ubuntu/
  • Distribution: raring
  • Section(s): main restricted universe multiverse
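For reference, those four fields are equivalent to a single line in /etc/apt/sources.list format (a sketch; note that once 13.04 falls out of support, the raring packages may move to old-releases.ubuntu.com):

deb http://archive.ubuntu.com/ubuntu/ raring main restricted universe multiverse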

Monday, June 10, 2013

Fix broken xfce4 desktop after uninstalling compiz

So, to test a theory on how to fix graphical tearing on a Xubuntu 12.04.1 install, I tried installing compiz.

Not only did it not help, but it also broke the desktop environment. So I uninstalled it (reversing the compiz installation steps I'd followed). Uninstalling doesn't clean up everything, though, and the desktop was still broken.

Fortunately, I found this post which suggested removing ~/.config and ~/.cache.

You don't need to zap those entire directories though. Inside each one there is a subdirectory that starts with "compiz". Those are the directories that need deleting. After that, I rebooted and the regular xfce4 desktop was working again.
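In other words, something like this (a sketch from memory; check the exact directory names with ls before deleting anything):

$ rm -rf ~/.config/compiz* ~/.cache/compiz*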

Friday, May 31, 2013

Review: Logitech K400 vs Shintaro Wireless Multimedia Keyboard

I was tossing up between a Logitech K400 and a Shintaro Multimedia (trackball) keyboard for my HTPC. I got the Logitech first for $35, but wasn't entirely happy with it so I got the Shintaro as well ($38+postage) for a comparison. These are their stories...

Shintaro Wireless Media (above) and Logitech K400

My initial reaction was that the Shintaro has a much more solid build than the Logitech. The mouse buttons were clicky and "alive", compared to the gummy feel of the K400 buttons, where you're never quite sure if you've actually pressed it.

The key layout was also much better, and almost for that alone I'd take the Shintaro. The placement of the right shift key and up arrow buttons on the Logitech continually annoyed me. The Shintaro is closer to a "normal" keyboard layout.

Shintaro positives:

  • Solid feel and nice clicky buttons, has a quality about it.
  • The keyboard layout is close to a regular desktop keyboard, so there's minimal chance of pressing the wrong key when reaching into the shift/enter/arrow key area.
  • I think I prefer the trackball to a touchpad.

Shintaro Wireless Media keyboard

Shintaro negatives:

  • Size of the USB receiver. That thing is enormous. I ended up connecting it via a USB extension cable from the back of the case, because it looked so precarious hanging out the front.
  • Wireless connectivity can perform really badly. Even with a direct line of sight and less than a metre distance, having the keyboard sitting in the wrong place on your knees can mean up to 80% dropped characters. When it was connected it was fine, but I still haven't quite worked out what positions will cause it to go bad. (Even a direct line of sight < 3m sometimes drops the occasional character when typing).
  • The board "goes to sleep" really quickly. Spinning the trackball doesn't wake it up either, you've got to press a key. I'm so used to bumping the mouse to wake up a computer, it takes a bit to get used to.
  • While the build quality is nice, it is quite bulky.
  • I sometimes had trouble getting into the BIOS with this keyboard.
  • It has a "sync" step you have to perform by pressing a button on the receiver. I seemed to lose sync occasionally, but this may have just been the wakeup problem noted above.
  • Takes four AA batteries, and with an estimated 3 month life, probably falls short of the Logitech in that respect.

Logitech positives:

  • Rock solid wireless connection. I was typing this in another room with no line of sight.
  • Tiny USB receiver and no need to sync. Just works.
  • Size-wise, the keyboard is nice and compact.

Logitech K400 wireless keyboard

Logitech negatives:

  • The flimsy build quality and gummy feel of the keys and buttons. It just felt really cheap, and was difficult to know when you'd clicked a button.
  • Keyboard layout was problematic. In particular, the Right Shift key is much smaller than normal, with the Up Arrow taking up the space. This means the Up Arrow is easily pressed when searching for the Shift key. Doing any command-line stuff during installation, this was infuriating.

Don't judge the result by list lengths: the Logitech K400's negatives and the Shintaro's positives were the really big factors in my decision, and I kept the Shintaro.

Saturday, May 25, 2013

Case Fan Review: Antec TrueQuiet Pro, Aerocool SharkFan and Coolermaster Sickleflow

I started on a vendetta to try and get my computer to run a bit quieter. Although I think the main culprit of noise levels is the stock CPU fan, the case fans in the Corsair 300R were making some noise, so I bought a few other fans to try out.

The fans were:

  • CoolerMaster SickleFlow 120mm ($8)
  • Aerocool SharkFan 120mm Blue Edition ($13)
  • Antec TrueQuiet Pro 120mm ($20)

CoolerMaster SickleFlow (left) and Aerocool SharkFan (right)

I don't have a dB meter -- quantitative testing isn't how I roll -- so I was just going by the ear test.

The CoolerMaster SickleFlow was the cheapest and was pretty loud, louder than the stock case fan.

The Aerocool was initially really loud as well, but then I realised the "extension lead" in the packet was a voltage reducer. With this fitted, the noise dropped to about the same as the stock fan.

The Antec TrueQuiet Pro had a physical switch for adjusting the speed. At full speed it too was louder than the stock fan, but when switched down to low speed it was quieter.

Spending the extra for something like the TrueQuiet Pro is worthwhile if noise levels are important.

Saturday, May 18, 2013

NFS Mount Hangs on Network Between Two Linux Machines

I was trying to set up NFS on my local network to transfer some stuff between two machines. I thought this would be pretty easy, but there seem to be a lot of guides out there that are either out of date or more complicated than they need to be (maybe they include some advanced features, not sure).

The main problem I had was that the mount command would hang when I tried to connect the client to the server. I tried everything I could think of, and in desperation tried reversing the client<-->server direction. At that point, it worked without a hitch. Still don't know exactly what the issue was (some conflict in the setup or configuration of my server machine?), but I was ecstatic at that point it worked at all.

Here are the steps (use ifconfig on each machine to find out its IP address, or use hostnames if you've set them up):

On the nominated "server" machine

  • $ sudo apt-get install nfs-common nfs-kernel-server
  • Edit /etc/exports and add the following line (assuming here that the client IP address is 192.168.1.1 and the directory to be made available is /tmp):
        /tmp	192.168.1.1(rw,sync,no_subtree_check)
    
  • $ sudo exportfs -ra
  • Check that the entry just added to the exports file is okay with: $ sudo exportfs
  • $ sudo service nfs-kernel-server restart

NFS server daemon processes should now be running.
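If you want to double-check, rpcinfo should show nfs and mountd registered (rpcinfo comes with the rpcbind package, I believe):

$ rpcinfo -p | grep -E 'nfs|mountd'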

On the nominated "client" machine

Assuming the server IP address is 192.168.1.7 and /files/remote is the directory which we will be mounting to:

  • $ sudo apt-get install nfs-common
  • $ sudo mkdir -p /files/remote   (or whatever your local mount point is)
  • $ sudo mount -t nfs 192.168.1.7:/tmp /files/remote
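If the mount hangs like mine did, one thing worth checking from the client is what the server thinks it's exporting (showmount comes with nfs-common, as far as I know):

$ showmount -e 192.168.1.7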

An entry to automatically mount can be put in /etc/fstab, but since I will only be using the NFS connection on an ad-hoc basis, I haven't done that at this stage.
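For reference, I believe the fstab entry would look something like this (untested, since I stuck with manual mounting):

192.168.1.7:/tmp  /files/remote  nfs  defaults  0  0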

Saturday, April 27, 2013

Create a Bootable USB on Xubuntu with unetbootin

To create a bootable Linux installation on a flash drive (using Xubuntu):

  • Insert the flash drive. After it mounts, use df to find the file system name (for example, /dev/sdd1).
  • Install unetbootin if you need to:

    $ sudo apt-get install unetbootin

  • Run unetbootin. *** Requires sudo access. Apparently this is a known issue with unetbootin.
  • I selected the "disk image" option because I'd already downloaded the ISO I wanted.
  • Press OK and it will expand the ISO onto the flash drive. *** Warning: This will delete everything on the drive.
  • Insert flash drive in target machine and boot away. You might have to go into the BIOS and select the USB drive as the bootable device, depending on the motherboard brand.

Update 10/6/2013:
If you want to get rid of the pesky ldlinux.sys off the flash drive that won't even let sudo delete it, do this:

$ sudo chattr -i ldlinux.sys

Then sudo rm ldlinux.sys. (Fix taken from here).

Update 19/10/2013:
If you have an Intel motherboard and boot from a USB drive and get the message "Boot error", it could be due to the BIOS settings. This forum post describes how to fix the problem, which worked on one of my old computers with an Intel motherboard.

Monday, April 22, 2013

Format Flash Drive for Big Files on Linux

By default, flash drives are formatted with the FAT32 file system. FAT32 has a file size limit of about 4.3GB.

To get around this, you can format with a file system that supports bigger files. I chose ext4; ext2, ext3, or others would also work.

Who's a big boy today?

Warnings:

  • You probably won't be able to use the flash drive on Windows machines (maybe this is what you want?)
  • Performance of flash drives under different file systems can apparently vary markedly. I didn't have any issues with mine using ext4.

Here are the commands:

$ df

Use df to find out which device is your flash drive, in my case it was /dev/sdd1. (Make sure you get this right, so you don't blat your hard drive or something).

$ umount /dev/sdd1
$ sudo mkfs.ext4 -L "BigFileDrive" /dev/sdd1

After reformatting, the drive mounted with root as the owner, so I did:

$ sudo chown ash /media/ash/BigFileDrive
$ chgrp ash /media/ash/BigFileDrive

And all was well.

Update 11 Sept 2013:

Trying to run this for NTFS (on kubuntu at least) can result in:

The program 'mkfs.ntfs' is currently not installed.
You can install it by typing:
sudo apt-get install ntfs-3g

But it says it's already installed. This is a known bug; a simple workaround is to just run mkntfs rather than mkfs.ntfs.
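So formatting the same drive as NTFS would be something like the following (-Q does a quick format and -L sets the label; double-check against the mkntfs man page):

$ sudo mkntfs -Q -L "BigFileDrive" /dev/sdd1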

Sunday, April 21, 2013

Canon MG6250 Scanning on Xubuntu 12.10

The other day a friend of mine challenged me (well, asked me) if I'd got scanning going on the Canon MG6250.

I had never tried it, and some research showed other people were having some issues as well.

Here were the steps I took to get it going:

  • Install xsane (sudo apt-get install xsane)
  • The sane man pages refer to "backendname" a lot. The project's documentation gives the backend name for the 6250 as "pixma"
  • man sane-pixma (there seems to be a man entry for each backend) tells you that network scanners should normally be detected, but if not, add them directly to /etc/sane.d/pixma.conf
  • Edit that file and add a line of the format:
    bjnp://<ip_address>
    IP address can be retrieved from the printer settings dialog, or from the options in the printer itself.
  • After adding an entry for the printer, save pixma.conf
  • If the sane daemon isn't running (some have reported that it is running, but I had to start it manually as per the next two steps):
    • Edit /etc/default/saned and set RUN=yes.
    • Then start the sane service: service saned start
  • Run xsane

Now xsane should discover the scanner, and instead of saying "no devices found" and dying, it should run up (brings up about 4 windows). All the default settings seem to work — just press "Scan".
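If you'd rather check from the command line that the pixma backend can actually see the scanner, scanimage (part of sane-utils) will list detected devices. It should report a pixma:bjnp device if the conf entry worked:

$ scanimage -L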

Saturday, April 6, 2013

Logitech C270 Webcam in Linux

TL;DR: Logitech c270 works with Linux; guvcview good.

I bought a Logitech C270 webcam for my Dad, but thought I'd try it out first and see how it works with Linux.

First thing was to install cheese (as per the suggestion at ubuntu forums):

$ sudo apt-get install cheese

(This required a restart).

Then I ran cheese from the command-line -- webcam started up great! First thing I tried was to do a video capture...and got this:

Cannot connect to server socket err = No such file or directory
Cannot connect to server request channel
jack server is not running or cannot be started
(cheese:2518): cheese-WARNING **: Jack server not found
(cheese:2518): cheese-WARNING **: Could not initialize supporting library.
Segmentation fault (core dumped)

So I tried to install jack and jackd, but this had no effect.

Then this bug suggested installing "gstreamer1.0-pulseaudio".

This worked. Had one core-dump after that, but mostly worked. Cheese complained about not being able to create thumbnails for the videos that were recorded, but I wasn't too concerned about that. By default, pictures go into ~/Pictures/Webcam, videos go in ~/Videos/Webcam.

Screen cap taken with cheese running

Videos recorded with cheese at 1280x960 and 960x720 looked awful. I don't know if this is a function of using webm or something else. Dialling down to 640x480 looked much better.

Only problem: the sound wasn't working.

Couldn't find a readily available solution, so I tried guvcview, after seeing it recommended in this askubuntu question.

$ sudo apt-get install guvcview

This looked like a neat little program. Had a lot more options than cheese. But still sound wouldn't work. Went through all the sound devices in the list. The device ID for the webcam (003 on my machine, as lsusb told me) wasn't in the list. Then I had this anti-brain fart where I recalled that some devices don't work so well in USB 3 ports, and I'd plugged the Webcam into the USB 3 on the front of the case.


640x480 screenshot taken with C270 and guvcview

Plugged it into a USB 2 in the back (restarted guvcview), and BAM! New audio device appears (a "USB Audio"). This worked just fine.

I had to give cheese another go, but still no sound. So I'm not sure what was going on there, but it felt like guvcview gave better control over the capture anyway.

(For the record, they are the Natural Confectionery Co. snakes).

Monday, April 1, 2013

Script to initialise Wacom Intuos 5

To round out the setup for my Wacom Intuos 5 tablet, this is the script I run to initialise it for left-handed use with an nVidia graphics card in Xubuntu 12.10:

#!/bin/bash

if [ -x /usr/bin/xsetwacom ]; then
    xsetwacom set "Wacom Intuos5 M Pen stylus" Rotate half
    xsetwacom set "Wacom Intuos5 M Pen eraser" Rotate half
    xsetwacom set "Wacom Intuos5 M Pen cursor" Rotate half
    xsetwacom set "Wacom Intuos5 M Pen pad" Rotate half
    # HEAD-0, HEAD-1 identify screens when using nVidia graphics.
    # Use xrandr output names for AMD, Intel, etc.
    xsetwacom set "Wacom Intuos5 M Pen stylus" MapToOutput HEAD-0
fi

Saturday, March 23, 2013

Configure Mouse Speed in Xubuntu

I found the default mouse acceleration to be way too fast (particularly when trying to click on the single-pixel window borders in xfce).

To slow it down, I followed Patrick Mylund's instructions. These are the results specific to the Logitech G400.

$ xinput --list --short

This shows the names/IDs of input devices. In my case, "Logitech Gaming Mouse G400".
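To confirm the deceleration property exists for your device (and see its current value), you can list the device's properties first:

$ xinput --list-props "Logitech Gaming Mouse G400"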

Now create a file ~/.xinput-mouse.sh, chmod it +x to make it executable, and edit it to include the following command:

xinput --set-prop "Logitech Gaming Mouse G400" "Device Accel Constant Deceleration" 4

Add a file xinput-mouse.desktop to ~/.config/autostart with the following contents:

[Desktop Entry]
Encoding=UTF-8
Version=0.9.4
Type=Application
Name=xinput-mouse
Comment=Slow the mouse acceleration
Exec=/home/<username>/.xinput-mouse.sh
OnlyShowIn=XFCE;
StartupNotify=false
Terminal=false
Hidden=false

Sunday, March 10, 2013

Asus GTX 650TI on Linux

So after suffering through the (self-inflicted) pain of trying to run an AMD GPU under Linux, I bought an nVidia-based GTX 650ti to try out next.

Asus GTX 650ti

The 650ti doesn't have the best reputation for value (with performance that is similar if not worse than the much cheaper HD7770), but it seemed to go okay in reviews and is realistically already overkill for anything I'm going to use it for.

In box

I narrowed the choices down to an Asus version versus the MSI Power Edition. While I'd heard good things about the MSI PE cards, in the end it came down to:

  • DVI-D ports (compared to DVI-I).
  • HDMI ports (compared to mini-HDMI. I have plenty of HDMI cables, but no mini ones).
  • Low noise level and temperatures in reviews (though the MSI is similar here anyway).

Ports

Installation

Compared to the troubles I had with the AMD, getting the nVidia card up and running was a breeze. Admittedly, I cheated this time and went straight to the proprietary driver. The open-source driver (nouveau) worked fine straight after installing the card, and I'll probably keep an eye on it (it's apparently made some recent advances in capability). But I was tired and just wanted something to work, so I cheated.

Plugged in to motherboard

To see which drivers are installed, you can use:

$ sudo lspci -v

For more (or heaps more) information, you can use -vv or -vvv to the above command. Running with sudo gives you a bit of extra output as well.

Initially you'll see a line something like this in the VGA controller section:

Kernel modules: nouveau, nvidiafb

These are the open-source drivers installed automatically in recent kernels.

To install the proprietary drivers, you can use aptitude to look up the possible targets:

$ aptitude search nvidia

To get going, all you need is to do the following:

$ sudo apt-get install nvidia-current nvidia-settings
$ sudo nvidia-xconfig
$ sudo reboot

The xconfig command above writes a default file to /etc/X11/xorg.conf. If all installs correctly, repeating the lspci command above will now output something like:

Kernel modules: nvidia_current, nouveau, nvidiafb

Results

Everything worked without a hitch. Running nvidia-settings lets you set up dual monitors. After rebooting, for the first time the login screen was actually two separate screens, rather than mirrored.

There was no obvious increase in noise levels with the Asus card, which is nice.

It's definitely far from perfect though. There are some noticeable artifacts when watching HD video, for example, and some minor tearing while running the game Dungeon Defenders. But at the moment I'm going to take "everything's working and it was easy to set up" over "pixel perfect display". Maybe if it annoys me enough I'll investigate some more, but I'm happy to run as is for the time being.

Update 15/6/2013:

Some months later, I've still been unable to resolve the tearing issue. It also occurs when scrolling up or down in web pages as well as in games/video, and it is frustrating beyond belief that such a trivial action causes screen tearing with vendor-provided drivers.

The issue occurs on multiple computers I've used, all with different hardware and distros, so I can only conclude that nVidia's driver is broken.

Update 3/7/2013:

Forgot to link to it, but I ended up fixing the tearing issue. Unfortunately, I found it only worked with Mint/Cinnamon rather than Xubuntu.

Splitting Large MP4 Files

A flash drive formatted with FAT32 has a ~4.3GB maximum file size. I wanted to use the drive to watch a movie on my TV, but the file wouldn't fit. To split a large mp4 file into more manageable chunks, I found this forum post on MP4Box. It works great.
$ sudo apt-get install gpac
$ MP4Box -split 3600 <filename>.mp4
The above splits the file into one hour chunks.
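MP4Box can also split by size rather than duration; if I'm reading its help right, -split-size takes the maximum chunk size in kilobytes, so chunks that stay under the FAT32 limit would be something like:

$ MP4Box -split-size 4000000 <filename>.mp4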

Monday, February 11, 2013

Linux SSD Setup

A solid state drive (SSD) is often the cheapest way to improve all-round performance in a computer.

There are many guides for setting up an SSD for each operating system. When I did a recent reinstall of Xubuntu, I looked through a few of the guides and picked what I felt were the most important things. To my mind, these are the two biggest things:

  • Ensure that the drive controller is running in AHCI mode. This is an option in the BIOS. (Note: With Windows, you need to change this mode before installing. I haven't tried changing it with a Linux install, but it will trash a Windows install).
  • Edit /etc/fstab to add "noatime,nodiratime,discard" options for the SSD partitions.
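
For example, an SSD root partition entry in /etc/fstab ends up looking something like this (the UUID is a placeholder; the other options are whatever the installer already put there):

    UUID=<your-ssd-uuid>  /  ext4  noatime,nodiratime,discard,errors=remount-ro  0  1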

Since SSDs have a lifetime measured in "number of writes", much of the tuning advice is aimed at reducing the number of unnecessary writes.

Here's a few other things I did.

WARNING: I did these based on the principles of "OS/applications on SSD, data/media on HDD" and "reduce unnecessary writes". Whether or not they are good things to do (particularly remapping /tmp to a different drive while the system was running), I have no idea. It worked for me, but I didn't base this activity on any existing guide.

System Setup

I have one SSD (Samsung 830 128GB) and one HDD (Seagate Barracuda 2TB). During installation, I partitioned the SSD into ~80GB for the "primary" OS and the remainder for "experimental" OS installs.

I allocated the swap space to the HDD, since this could cause a lot of writes if the system ever needs to swap out (probably rare, given the RAM available). I mounted the HDD as "/files".

Mapping Files in User Home Directory

By default /home contains all the user's files. When I was using Mint, it automatically created "Documents", "Downloads", "Pictures", "Videos" etc. in the user home directory. I maintained this with Xubuntu (can't remember if Ubuntu variants do this by default), but replaced the true directories with symbolic links to the equivalent directories in the HDD (in the /files partition).

For example:

    $ rmdir Documents
    $ ln -s /files/ashley/Documents Documents

This means that all these files are stored on the HDD, keeping the SSD free from associated writes.

Mapping Email and Browser Data

The user's home directory also contains email and browser data. So I symbolic linked my thunderbird mailbox to the HDD as well. This Firefox support post explains how to move the Firefox cache to another drive. In hindsight, I probably should have just linked the entire Firefox folder as well.

Remapping /tmp

I remapped /tmp from the SSD to the HDD. This one was a bit of an experiment. It could have killed my system, I suppose, but everything seemed okay, so I'll explain what I did.

The system uses the directory /tmp to store random runtime stuff as needed. It's a special directory in that anyone can write to it. You need to set the "sticky" flag for this. If you do an "ls -l" on it, you'll see something like this:

    drwxrwxrwt  2 user user 4096 Nov 19 20:13 tmp

The "t" character at the end of the first column indicates the sticky bit is set. This is what I did to move my /tmp directory:

    $ sudo mkdir /files/tmp
    $ sudo chmod 777 /files/tmp
    $ sudo chmod +t /files/tmp
    $ sudo rm -rf /tmp
    $ sudo ln -s /files/tmp /tmp

The reason I noticed I needed to set the sticky flag (the "chmod +t") is that without it, filename completion in the terminal stopped working. I imagine a whole heap of other stuff would have broken too.

Also, when I look at my tmp folder now, the "t" flag no longer appears (most likely because ls is showing the symlink itself; ls -ld /files/tmp should still show the sticky bit). Everything still seems to work.

So, there it is: cowboy setup for reducing SSD writes.

Tuesday, February 5, 2013

AMD HD7770 GPU in Linux


Update (Oct 2013): For anyone who stumbles across this, I've posted an update where I got the proprietary driver to work acceptably (but only on single screen).


After researching and asking around on forums about GPUs and Linux, I decided to get an AMD Gigabyte HD7770 1GB as an experiment. From what I could gather, AMD support in Linux was pretty bad, (or at least, more problematic than nVidia cards), so I went in with low expectations, willing to wear the difference of flogging it off secondhand if it didn't work out.

TL;DR: It wasn't all bad, but nothing "just works" (as was pointed out in forum responses). From my experience, I have to conclude that the AMD Linux drivers are effectively broken at the moment.

Rationale

I'd been warned away from AMD cards, but the 7770 is an entry-level gaming card and not overly expensive. It was also overkill for the games I expect to be playing (of which there isn't a huge selection on Linux yet anyway).

Since it wasn't too expensive ($119), I was willing to take the risk and try it out, partly for the chance to experiment, partly because AMD are reckoned to be good value for money in that price range, and partly because AMD at least supports an open source driver (for varying degrees of "supports").

Machine and OS Specs

Hooked into the i5 3470 system I built in December 2012, running LinuxMint 14 (Nadia) Cinnamon 64-bit.

On with what I found.

Card Installation

Quick and easy. Dropped right into the PCI-E slot without the firm coaxing that RAM and SATA cables usually need. I chose the second port because it looked like the USB 3.0 cable might get in the way of the card. After I put it in, I thought it probably wouldn't have been a problem, but I left it where it was.

Plug in the 6-pin power cable and away we go.

Note: If I'd got the slightly less powerful 7570 (which I was considering) I wouldn't have needed the power cable, but I got a beefy PSU and it seemed a shame not to try out some of those cables sitting around in the bottom of the case.

With the GPU plugged in, the on-board graphics are disabled. This is (I gather) expected behaviour.

Open Source AMD Video Driver

Single Screen

After plugging in the GPU and booting up, I didn't have to do anything in particular. LinuxMint 14 has drivers installed by default to drive the GPU.

First test I plugged into the DVI port. It started up fine at full resolution.

A command you can use to see how the system has detected your GPU is:

    lspci -nn | grep VGA

Mine displayed "[Radeon HD 7700 Series]"

Dual Screen

Then I added in the second monitor. If you've ever heard someone claim the dual-monitor support in Linux is limited, you can believe them.

Initially with both plugged in, only the HDMI monitor would work. The other monitor (plugged into the DVI, which was originally working), displays its "Current input timing not supported by monitor display" message.

I worked out how to get things going by using xrandr, a command-line tool that configures the displays.

By itself, executing xrandr outputs details about the detected displays. You need to know the "names" the system has given the monitors plugged into each port in order to control their output settings. In my case, the monitor plugged into the DVI port was "DVI-0", and the one plugged into the HDMI port "HDMI-0".

To make my main screen on the left and the second screen on the right, I used the following command:

    xrandr --output DVI-0 --auto --output HDMI-0 --auto --right-of DVI-0

This works, but leaves the taskbar on the right hand side screen (HDMI-0 in this case). To make the system use a particular screen as the "primary", use the following command:

    xrandr --output DVI-0 --preferred --auto --primary

This shifts the taskbar to the DVI-0 display.

Saving Dual Screen Setup -- Attempt 1

So, after using these commands to set up the displays as desired, I had to work out how to make the changes permanent. The "normal" way to do this (if that's the right word) is to put the configuration into the /etc/X11/xorg.conf file.

Using this answer, I adapted a minimal xorg.conf file and came up with this:
Section "Monitor"
  Identifier "First monitor"
  Option     "PreferredMode"   "1920x1200"
EndSection

Section "Monitor"
  Identifier "Second monitor"
  Option     "PreferredMode"   "1920x1080"
  Option     "RightOf"          "First monitor"
EndSection

Section "Device"
  Identifier  "Radeon HD 7700 Series"
  Driver      "radeon"
  Option      "DVI-0"   "First monitor"
  Option      "HDMI-0"   "Second monitor"
EndSection

Something went wrong when I rebooted with it in place, though. At least, the HDMI-0 monitor got a new name and became "HDMI-3". So it didn't quite work.

Saving Dual Screen Setup -- Attempt 2

Next I thought I'd try and set things up with xrandr, then go into the Preferences/Display dialog and use the "Keep current configuration" option to save the setup.

This creates a monitors.xml file in ~/.config with the current settings. However, it only applies after you login. The HDMI screen, if plugged in, always seems to be considered the "primary".

Saving Dual Screen Setup -- Give Up

I tried a couple of other wild and fancy things that I found on various sites to set up dual screens, but in the end just gave up. No matter which way I plugged the monitors in, if the smaller 23" LCD was attached, it was the one that "lights up" for the splash screen. After login the 24" came to life, but not before.

I couldn't work out how to get both screens going at the system level, rather than the user level, so I gave up.

Open Source Driver Performance

First test was running a fullscreen 1080p video. It ran with lots of glitches and jaggies when playing. The CPU went up to 50-55 degrees on all cores -- was it actually processing in software?

Then I downloaded the Phoronix Test Suite. The suite package itself is reasonably small, but the full test suite takes over 5GB of download and can take a few hours. I managed to run a few of them, but none of them ran very well. For example, Nexuiz looked like it was running at about 0.1 FPS.

It was clear that, unfortunately, the open source driver was not even close.

Proprietary AMD Video Driver

It seemed I could choose between the Catalyst driver from AMD's site, or one of the fglrx drivers that were listed in the software centre. I still haven't worked out what the difference between these two things is -- I think they're related in some way, but I'm not sure how.

So after looking at askubuntu.com/questions/142627, I did:

    sudo apt-get install fglrx-updates fglrx-amdcccle-updates

Bam! Upon reboot, the video was completely broken. I could see the motherboard post, and then nothing -- not even a prompt. The onboard video also wasn't working (at the time I didn't realise you have to unplug the GPU completely for the motherboard to activate the onboard video).

So, at a loss, I booted up and typed the following steps blind:

  • Reboot
  • Ctrl-Alt-F1 (to open a console, which wasn't visible at all)
  • Enter username
  • Enter password
  • sudo apt-get -y remove --purge fglrx-updates
  • Enter password again (for the sudo)
  • <wait>
  • sudo reboot

Fortunately, this fixed the problem. So I can't really recommend installing the fglrx-updates variant of the driver.

Next I tried installing the Linux Catalyst driver from AMD's website. This was version 12.11beta at the time (see upubuntu.com/2012/08/install-amd-catalyst-128-on-ubuntu.html for the steps I followed).

This driver seemed to work fine (it at least didn't black out my screen) but came with the "beta" watermark in the bottom right. To get rid of that, follow the instructions at askubuntu.com/a/216730.

Dual Screen -- Proprietary

Unfortunately, getting dual screens set up with the proprietary driver was, if anything, more difficult than with the open source driver.

I tried the following mechanisms:

  • Steps at unixmen.com/howto-install-ati-driver-in-ubuntu
  • Then superuser.com/questions/395927, but the Virtual 3820 caused tearing when the taskbar animated up
  • Then I used arandr to get the 2nd display going
  • Then I tried the amdcccle (admin) application to apply "Digital Monitor(2) -> Adjustments -> Scaling (0%)". This resulted in another refresh problem on the second monitor, where it would only update while the taskbar on the primary screen was animating during hover.

So I gave up on the dual screen idea in order to get some testing done for the driver. With the dodgy dual screen setup, running the test application fgl-glxgears got 500-600 FPS, but was very choppy.

Proprietary Driver Performance

To revert to a single screen, I entered the following at the command line:

    aticonfig --initial -f

Now running fgl-glxgears got around 2500-3000FPS. So, same driver but running with only a single screen instead of dual screen improved the performance of this benchmark application by 6x, and it no longer looked choppy. But this is a pretty simple little application, I wanted something with a bit more meat.

So I installed xonotic and ran it up fullscreen (1920x1200) with the highest settings I could set. Within the game it ran just fine (too fast for my old hands), but my old eyes couldn't discern any difference between the "default" versus the "high" settings, so I'm not sure if I did it right.

But, after running this (or, it seemed, any "fullscreen" game), the taskbar animation back in the desktop became choppy and caused tearing all over the screen.

Last Ditch

All this playing had taken a couple of weeks by this stage. Someone recommended running the sgfxi script at smxi.org (a "fix it all" graphics driver script). When I downloaded and ran it, it exited saying I needed to run it from outside X.

Which was fine, I thought, so I tried to Ctrl-Alt-F1 into a console. But I couldn't see the console at all. I could see the X session, but none of the TTY consoles (1 through 6) were working.

Somewhere along the way I'd borked those. By this time I'd had enough "experimentation".

Blew away Mint. I'm now running Xubuntu with the onboard video. Anyone want to buy a barely used Gigabyte HD7770?

Conclusion

My experience has led me to conclude that both the open source and proprietary drivers for AMD GPUs on Linux are broken. The Intel drivers for the onboard video aren't without their quirks (for example, I find it really hard to configure the login screen to start on a specific monitor at a specific resolution, and setting up dual monitors is still a bit of a chore). But for the most part they work.

With the AMD proprietary drivers I had crashes, no screen at all for a while, and dual screen setup was diabolical. With the open source driver I had tearing and I wasn't quite sure it was actually using the GPU to do anything.

Hopefully I'll get an nVidia card at some point to get another perspective, but I'll be starting with pretty low expectations.