Windows Vista on VirtualBox – the networking fix

We’ve written a bit about VirtualBox in the tech/business section of ArsGeek, so you’re probably familiar with this easy-to-use x86 virtualization tool. It supports Windows (NT 4.0, 2000, XP, Server 2003, Vista), DOS/Windows 3.x, Linux (2.4 and 2.6), OpenBSD and a bunch of other OSes as well.

Vista, however, can present a problem: there are no built-in network drivers for the virtual NIC that VBox implements. Without a network connection, it gets kind of hard to download and install the appropriate drivers. If you’re on Ubuntu or a Debian-based distro, here’s what to do.

First we’ll get the appropriate driver. Then we’ll have to unzip it (substituting the name of the zip file you downloaded):
unzip drivers.zip -d vista
We’ve just made a folder called ‘vista’ with all of the driver info needed. Lastly, we’ll create an ISO image from this folder.
mkisofs -o vista.iso -R vista
Once that’s complete, you can mount the ISO through VirtualBox and restart your virtual Vista install.


When you boot into Vista, right-click on Computer, choose Manage, and then update the network adapter’s driver with the software on the ‘cd’ you have mounted.



Adobe Acrobat Reader got you down?


EDIT: 7/12/07: This apparently breaks Acrobat Reader 8.1 – even removing the accessibility plugin causes Acrobat Reader to complain about an ‘invalid plugin’, then exit out. I put the plugin back in, and 8.1 does not seem to exhibit the same symptoms that 8.0 did (i.e.: no “please wait while the document is being prepared” crap). I’m going to play around a bit, but it seems Adobe fixed this in Reader 8.1

– Cavtroop (senior developer of ArsGeek)

I installed Acrobat Reader 8.0, and every single time I opened a document I was presented with a “please wait while the document is being prepared” message that would sometimes linger for minutes as it indexed the entire document. This caused problems both with day-to-day usability and when giving presentations.

To remove this ‘feature’, simply navigate to your %ProgramFiles%\Adobe\Reader 8.0\Reader\plug_ins folder and rename (or delete, or copy elsewhere) the ‘Accessibility.api’ file. The same file exists, in slightly different locations, in older versions of Acrobat Reader.

Poof! No more annoying messages preventing you from actually reading the file. Keep in mind, the ‘reader’ (read-aloud) functionality will no longer work, but who uses that anyway? The next time you open a file, Acrobat will complain that this version doesn’t have ‘reader’ functionality built in. Check the box, click OK, and that’s the last you’ll hear from it on that!


Ubuntu Tricks – 4 ways to run Root privileged processes without a password

Got Root? First, a note from the owner of the ArsGeek website. While some of these techniques may save a bit of time, and knowledge is good for its own sake, I DO NOT recommend using them often, if at all. I’ve had some requests for an article of this type and I’m happy to provide the information – it’s all available out on the great wide Internet anyway. However, running things as root means there are no safety nets involved, and should someone else gain access to your computer, they can do anything they want.

I’ve had three email requests now to write up how to execute things that normally require root privileges without having to enter a password. There are a number of ways to do this, and we’re going to look at four of them.

Method #1 is to run everything from your terminal session as root. To do this, simply open a terminal session, and type:
sudo su -
Then enter your password (assuming you have sudo rights). *POOF* You are now root. You can do things like “apt-get install zangband” and you won’t be prompted for a password. Once you end your terminal session, your root privileges vanish. Of course, while you’re still in your session there’s nothing stopping you from deleting your entire filesystem either, so be careful. There’s a reason why so many people use sudo.

Method #2 involves allowing root to log into your Gnome session. When you log in as root, you obviously won’t be prompted for a root/sudo password when you launch something that requires root privileges, such as changing the time or running Synaptic. However, this comes with the added danger of being able to destroy and/or mess up things without having to enter a password, and thus without that final reminder that what you’re doing could affect your entire system.

To enable root to log into your Gnome session in Dapper:
Go to System -> Administration -> Login Screen Setup, click on the Security tab and check the “Allow root to log into GDM” box.
To enable root to log into your Gnome session in Edgy:

Go to System -> Administration -> Login Window Preferences, click on the Security tab and check “Allow local system administrator login”.
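If you’d rather flip the switch from a config file, the same option lives in GDM’s configuration. The path and key name below are from GDM 2.x of that era; treat them as an assumption and check your own system before editing:

```
# /etc/gdm/gdm.conf (assumed location for GDM 2.x)
[security]
AllowRoot=true
```

Restart GDM (or reboot) for the change to take effect.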

Method #3 is by far the most dangerous. It gives you the ability to run sudo without being prompted for a password and without becoming root, which effectively makes you a root user. It also effectively makes anyone who knows or steals your username/password, or hacks into your account, root as well. That’s a very, very bad situation to be in, so again, keep this in mind if you choose to make this change to your system.

To do this, you’re going to have to use visudo to edit your sudoers file:
EDITOR=gedit sudo visudo
Now you’re going to change:
USERNAME ALL=(ALL) ALL
to (or add it if you don’t have it):
USERNAME ALL=(ALL) NOPASSWD: ALL
Replace USERNAME with your own username.
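If you only need one or two programs to run without a password, a somewhat safer variant of the same trick grants NOPASSWD per command instead of for everything. The command paths below are examples only; adjust them to whatever you actually run:

```
# /etc/sudoers -- always edit via visudo, never directly
USERNAME ALL=(ALL) NOPASSWD: /usr/bin/apt-get, /usr/sbin/synaptic
```

Everything else you run through sudo still prompts for a password as usual.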

Method #4 is much more selective. If you have a single process that you find changing all the time or a program installed in a location not owned by you that you launch all the time, you can change the owner to yourself. I use this one occasionally. If you look at my Songbird tutorial, you’ll see that I installed Songbird in my /opt directory, which is owned by root. However, I don’t want to be prompted for a password each time I launch it, so I used the chown command to give myself ownership of it. Since I own it there’s no password prompt.
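Method #4 boils down to a single chown. Here’s a small self-contained sketch using a scratch directory in /tmp as a stand-in for /opt/Songbird (on the real directory you’d need sudo for the initial chown, since it starts out owned by root):

```shell
# Stand-in for /opt/Songbird; the real command would be:
#   sudo chown -R $USER /opt/Songbird
mkdir -p /tmp/songbird_demo/bin
chown -R "$(id -un)" /tmp/songbird_demo   # no password needed on a tree you own
stat -c '%U' /tmp/songbird_demo           # prints your username
```

Once you own the tree, launching or modifying anything inside it needs no password at all.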

Again, I advise you think about what you’re doing before you make any of these changes.


How to back up and restore your Ubuntu machine

I've crashed and I can't get up!

There are many, many different ways to back up your Ubuntu system. Here we’re going to look at two of them, one of which is a full system backup and the other is a way to copy folders and files. The point of this article isn’t to be super inclusive of every method under the sun, but to provide a guide as to how I do this and why it works for me.

All of my backups are done to an external drive. In my case, this is a firewire drive that is mounted in my /media directory. There’s nothing stopping you from doing this to a network drive, a separate partition or even your primary partition. However, you do have to be cautious of your space limitations. Backing up a 3 GB install onto a 40 GB disk is fine, but backing up 63 GB of data to your 80 GB drive… not so good. This is one of the two reasons I use an external 200 GB drive. Lots of space. The other reason is that moving a backup file off of my primary partition after I’m done backing it up just seems like an extra step.

There are two types of backups that I do. The first is a backup of several key folders, not my entire system. This is in case I blow something away, or lose some data that I’d want to get back quickly.

I use the rsync command for this. Rsync is a simple and fast way to make an exact copy of something. That something can be a single file or a whole file system.

Now my external hard drive is a firewire drive, which Ubuntu thoughtfully mounts in /media for me with the wonderful name of ‘ieee1394disk’. That’s where I want to keep this backup copy. Let’s open up a terminal session and go backup some stuff.
cd /media/ieee*
Now I’m in my external drive. If you have a USB disk, chances are it’s under /media/usbdisk or /media/whatevertheheckyoucalledit. I’m going to make a folder to store this backup in because I’m something of a filesystem neat freak.
mkdir arsgeek_backup
cd arsgeek_backup

Now there are four directories that I back up on a regular basis. These are my /home directory, my /etc directory, my /opt directory and my mp3 collection. :) My mp3s are located on a FAT32 partition mounted in /media/sda5, in a folder called music. So here’s the command I use to copy all of these.
rsync -arvu /home /etc /opt /media/sda5/music .
Here’s what the switches after the rsync command mean: a= archive, r= recursive, v= verbose and u= update (only copy files that are newer than what’s already in the backup).

What I like about this is that while the first rsync does take some time to copy all of these files and folders the first time it’s run, the next time it’s run it only adds new stuff. So if I run this once a week and the only changes that were made was that I added several new mp3s to my music directory, it will only copy those new files.

If I accidentally deleted an mp3 that I wanted, I could easily (and through the GUI) go to my external drive and copy it back. Or if I accidentally deleted my /home directory (yikes!) I could rsync it back by reversing the command:
cd /home
rsync -arvu /media/iee*/arsgeek_backup/home .

I also plan on upgrading my laptop, which is my primary work computer, to Edgy Eft when it comes out on October 26th. (PLUG!) I’ve put a lot of work into getting my laptop just the way I like it, so I’m going to take a complete backup of the system before I do the upgrade. In fact, I’m doing a new backup while I type this howto. To do that, I use the tar command.

I’m going to back up all of the most important folders to me; however, I’m not going to back up certain parts of my install, like the /tmp directory, or the /sys directory, or anything mounted in /media like DVDs or the external disk that I’m backing up to! That would be messy. So we’ll use the tar command with some excludes built into it. It’s a bit long and ungainly looking, but it works like a charm.

First, I move into my external drive.
cd /media/iee*
Then I make another directory for my complete backup
mkdir arsgeek_wholeshebang
cd arsgeek_w*

Now I’m ready to back my machine up. This is going to take a while, so it’s a good idea to do it when you won’t need to power off your computer.
sudo tar cvpzf arsgeek.backup.tgz --exclude="/proc/*" \
--exclude="/lost+found/*" --exclude="/dev/*" \
--exclude="/mnt/*" --exclude="/media/*" --exclude="/sys/*" \
--exclude="/tmp/*" --exclude="/var/cache/apt/*" /

As you can see, that’s quite the command. Here’s how it breaks down. Tar is the program we’re using to make a backup copy.

The switches work out as follows: c= create, v= verbose, p= preserve permissions, z= gzip compression, f= file.

arsgeek.backup.tgz is the file we’ll end up with, a complete and compressed archive of my entire ext3 filesystem.

--exclude="/something" names a directory or file that you’re explicitly telling tar not to back up. If we were writing the archive into the same filesystem we were backing up, it would be important to exclude the arsgeek.backup.tgz file itself. Since we’re writing it to an external drive, however, we don’t have to worry about that.

The / at the end tells tar to start from the top level (or root) directory of my filesystem. It will start tarring at / and get everything that lives beneath it, except for those directories and files we told it not to get.
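A miniature, self-contained version of the same idea, run on a scratch tree in /tmp, shows the exclude behaviour without touching your real filesystem:

```shell
mkdir -p /tmp/tardemo/src/etc /tmp/tardemo/src/tmp
echo "keep" > /tmp/tardemo/src/etc/config
echo "skip" > /tmp/tardemo/src/tmp/cache
cd /tmp/tardemo
tar cpzf backup.tgz --exclude="src/tmp/*" src
tar tzf backup.tgz   # lists src/etc/config, but not src/tmp/cache
```

The src/tmp/ directory entry itself still appears in the archive; only its contents are skipped, just as /tmp itself survives in the full backup.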

This will chug along for quite some time until eventually we’re left with a massive file called arsgeek.backup.tgz. So if things go horribly, horribly wrong how do I restore my computer?

Here’s how I would do it. I’d first reinstall my laptop with a fresh Dapper install. No updates, same hard drive partitions as before. Then, I’d log in, attach my external drive and go to the backup file.
cd /media/iee*/arsgeek_w*
sudo tar xvpzf arsgeek.backup.tgz -C /

Be warned, however, that this will overwrite anything and everything in the directories you’ve tarred up. So /home will get completely overwritten with whatever’s in your tar file, and the same goes for everything else. Again, this will take some time.
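You can rehearse the restore step safely on a scratch tree, extracting into a target directory with -C instead of /:

```shell
mkdir -p /tmp/restoredemo/src /tmp/restoredemo/target
echo "data" > /tmp/restoredemo/src/file
cd /tmp/restoredemo
tar cpzf backup.tgz src
tar xpzf backup.tgz -C target   # same flags as the real restore, safer target
cat target/src/file             # prints "data"
```

Swap target for / (and add sudo) and you have the real restore command.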

Once that’s done (and note that you’re doing it from within a running OS! Neat!), simply log off and log back in again. This is how I back up my personal documents along with the ArsGeek business website. Phew! Glad you had a backup plan!