Error Establishing a Database Connection in WordPress

I am working on a project with a friend. We are developing a new WordPress website; I am doing the background IT work and he is designing the site.
I want to share my experience of handling the dreaded “Error Establishing a Database Connection” in WordPress. I did not find a similar case while researching the problem, and I want to share the false rabbit trails I went down while trying to solve it.

A few nights ago, about 10:00, I received a text message out of the blue from my partner, saying he was busy developing the website and, all of a sudden, when he went to view it he got a white screen saying “Error Establishing a Database Connection”. He wanted to know if I was doing something on the website. I was not. He then informed me that he had lost access to the “admin” side of the website as well.

I instantly got a knot in my stomach, as this website has to be up soon; we had been having difficulties for a while and had finally set up a new hosting account.

A little backstory:
  • I used the easy one-click WordPress install that many hosts have available. Why not? It’s easy!
  • I adjusted the PHP version to the latest and set some PHP parameters that are necessary for the website, like the upload size.
  • Uploaded the purchased theme we are using.
  • Installed and activated all the necessary plug-ins for the theme.
  • And set up an additional Admin user.

Perfect. Right?

My partner then went to work his magic and started designing the website. As I mentioned before, after several hours of work, at about 10:00 at night, I received a text saying “I am getting an ‘Error Establishing a Database Connection’. Are you doing anything?” My response: “No. Arrrg!!!” He then said he had just lost his admin login also.

The first thing on-line guides will tell you is to check the database credentials: database name, database user name, password.  I knew these hadn’t randomly changed.  I tried running a small script to test the database connection and got a 500 internal server error. Ouch.
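
If your host gives you shell access, you can run the same kind of sanity check without writing a script by using the mysql command-line client; the host, user, and database names below are placeholders, not anyone’s real credentials.

# Quick database connection test from a shell on the host.
# Replace the host, user, and database name with the values from wp-config.php.
mysql -h localhost -u wp_user -p wp_database -e "SELECT 1;"
# A password prompt followed by a "1" means the credentials and the MySQL
# server are fine; an access-denied or connection error means they are not.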

I logged into the hosting control panel and started fiddling around. I looked at the database, and it all looked OK… for a while. Then I lost the connection and the control panel asked me for the password for the database user. Searching the web for solutions turned up the suggestion that the MySQL server was down. That seemed likely to me, or at least that it was trying to go down, as things were intermittent. Little did I know, I was on the wrong rabbit trail. I fought with that for some time.

I looked into rebuilding the database within WordPress. There is a line that can be added to the end of wp-config.php; you then go to a certain URL within your website (Google it), and you should get a little menu that lets you check the database and fix things. I could not access that.
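
For reference, the mechanism I am talking about is WordPress’s built-in repair page. A rough sketch of enabling it from a shell, assuming a standard install (the path is a placeholder, and back up wp-config.php first):

# Add the repair flag to the end of wp-config.php.
echo "define( 'WP_ALLOW_REPAIR', true );" >> /path/to/wordpress/wp-config.php
# Then browse to http://your-site.example/wp-admin/maint/repair.php and use the
# repair menu. Remove the define afterwards, since the repair page is reachable
# without logging in while the flag is set.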

Using the control panel’s file manager, I manually disabled all the plug-ins. I thought maybe there was a rogue one in there somewhere. Ta da… I was able to access the database-fix URL. I did that. Then I had admin access. I also had limited access to the actual website, but some pages were blank. No error, just a white screen. I reactivated the plug-ins and boom, it went down again. I thought I would have to enable them one at a time until the nasty one showed its ugly side. I had to do the database rebuild thing each time to regain access. Again, without knowing it, I was on a rabbit trail I never intended to follow.
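
In case it helps anyone doing the same thing over ssh instead of a file manager, renaming the plugins folder is the usual blunt way to disable everything at once; the WordPress path below is a placeholder.

# Disable every plug-in in one shot by renaming the plugins directory.
cd /path/to/wordpress/wp-content
mv plugins plugins.disabled
# WordPress deactivates plug-ins it can no longer find. Rename the directory
# back to "plugins" (and reactivate in the dashboard) to restore them.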

In the meantime, my mind was going a hundred different ways. I went to our theme’s website to see if there was an update beyond the version we had.

There was one from a couple of days earlier. I looked at the version change log and there was nothing big in that update. BUT… I looked down the change log to previous versions and something caught my eye. An update a while back had made the theme compatible with PHP V7.0. Hmmmm, PHP V7.0. When I set everything up, I set the PHP version to the latest, which was….. V7.1! I went into the PHP configuration and set it from V7.1 down to V7.0. Bada bing bada boom! The site came up. PERFECTLY UP!
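
If you want to double-check which PHP version your site is actually being served with, a throwaway phpinfo file is one quick way; the path and site name below are placeholders, and the file should be deleted as soon as you have read the version.

# Create a temporary PHP info page in the WordPress root (delete it afterwards).
echo "<?php phpinfo();" > /path/to/wordpress/phpcheck.php
# Browse to http://your-site.example/phpcheck.php and read the version banner,
# then remove the file so it is not left exposed:
rm /path/to/wordpress/phpcheck.php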

The whole time, I had the PHP version wrong. While my partner was coding away, it must have been hours before he hit a feature that was not compatible with V7.1, and down it went.

Looking back, I never came across any website that pointed to this type of problem. So, if you are having an unsolvable problem with your WordPress site losing communication with its database, and all the usual fixes don’t seem to work, maybe… just maybe, you have the same problem we did. I hope this helps someone!

-John.

WordPress 3.9 Lost Functionality, Image Borders, and Margins

With the “Advanced Image Styles” plug-in, borders and margins can again be easily set.

I just updated to WordPress 3.9 and, with my first post, was disappointed. When an image is placed in a post, the text is smashed right up against the image.  In older versions, a margin of space could be put around the image, along with a border if wanted.  This needed functionality is now lost with WordPress 3.9.

Thankfully, one of the developers created a plug-in called “Advanced Image Styles”.  This returns some of the functionality lost with this version of WordPress.  In older versions there were also more options for scaling images that are gone now.  It is kind of interesting that one of their own developers wrote this plug-in to try to compensate for what they took out.

For now at least, install the “Advanced Image Styles” plug-in to get some of the functions back.  Maybe if enough people scream, they will put it back in.

**EDIT** It seems that when a caption is added to the image, as with the one above, the caption width is not adjusted to compensate for the width of the image plus the margin widths.  I manually adjusted the code to make it look right.  I left a comment on the plug-in’s development page about it.

**Another EDIT** The plug-in author responded to my comment on the plug-in’s development page and pointed out that the caption width didn’t work in the previous version of WordPress either.  In all fairness, I realize that I should not suggest this is something he should fix in his plug-in.  As Gregory pointed out,

Ideally, this flexibility would be added to core.

-John. K7JM

WordPress 3.9 Jetpack Update Failure

I use WordPress as my blogging software.  Lately, my software updates have been failing.  Blogging software and other web/Internet software needs to be updated when vulnerabilities are discovered, just like other software.  I also use something called Jetpack that provides statistics and other functions for my blog.  It also needed to be updated and was failing.

To fix this:

  • First, log into your website administration control panel however you do it.  For me, it is cPanel. Get into your file manager and find your plugins directory.  If you have ssh access, you can also do it via the command line (a sketch of the commands follows this list).
  • Find the jetpack directory inside your plugins directory and delete the whole jetpack directory.  The plugins directory is inside the wp-content directory.
  • Do a plugins update for other plugins if required.
  • Do a WordPress update.  It took quite a while for me.  If it asks you to do a database update, do it.
  • Reinstall the Jetpack plugin and activate it if necessary.
  • Have fun.
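
For the command-line route mentioned in the first step, the whole thing boils down to a couple of commands; the WordPress path below is a placeholder for wherever your install lives.

# Remove the failed Jetpack install from the command line over ssh.
cd /path/to/wordpress/wp-content/plugins   # plugins live inside wp-content
rm -rf jetpack                             # delete the whole jetpack directory
# Then run the WordPress and plug-in updates from the dashboard and reinstall
# Jetpack as described above.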

I hope this helps someone with a similar problem.

-John – K7JM

The Amateur Amateur

I’ve enjoyed reading Gary Hoffman’s (KB0H) “The Amateur Amateur” column on the ARRL (American Radio Relay League) website for years.  The ARRL no longer publishes Gary’s articles, but he still writes them.  In Gary’s own words:

The Amateur Amateur is a column about my experiences in ham radio. Since I have little technical expertise and not much knowledge of electronics, I make a lot of mistakes. I consider myself to be just an amateur amateur radio operator, but I keep pressing on and trying new things. This column details my triumphs – and foibles – and I try not to take myself too seriously. Whether you are an experienced ham or new to the hobby, I hope you find these chronicles of my efforts to be entertaining.

Gary Ross Hoffman, KB0H

Gary’s “The Amateur Amateur” columns are available HERE.
Enjoy! -John, K7JM

Repairing My Icom IC-706MKIIG Final Amp Board

Final amp board with old finals removed, and ready for board modifications.
IC-706MKIIG amplifier board with new final transistors mounted in modified circuit board.

A few YEARS ago, I noticed that my Icom IC-706MKIIG Amateur Radio Transceiver was very hot.  The problem was, it was NOT turned on!  I immediately unplugged it from its 12V power supply and pondered it for a long time.  Apparently, the final amplifier transistors in the IC-706MKIIG are not wired through the power switch; they get their power directly from the 12VDC line coming into the radio.  If all is working well and there is no drive to the transistors, there is no current drawn.  One of mine was drawing current, and a lot of it, all the time.  Ten amps of current, all the time.

My radio sat for a few years, with me occasionally looking at it and wondering what to do.  Finally, in late 2012 or early 2013, I contacted a repairman about my radio.  The ball was moving back to my court, because #1: the repairman was too busy to take additional jobs, and #2: the repairman informed me that my radio was of an “old” design.  It had SRF-J7044 MOSFET transistors for the HF finals.  They are no longer being made and are extremely hard to find.  An alternative was to “replace” the final amplifier board at a cost of over $400.  No thank you!

I did a lot of searching around and found Jose Gavila’s (EB5AGV) website. He replaced the transistors in one of these older Icom radios with newer transistors, and he documented it very thoroughly.  Being a hardware kind of guy, I decided I could do the repair following his steps.  And here are the results.  I videoed the process for your viewing pleasure.  If you have an older IC-706MKIIG with the same problem, you CAN repair it.

As a side note about the Icom IC-706MKIIG: apparently, damage can be done to the final transistors while the rig is off when it is installed in a vehicle.  I did have mine in a vehicle at one time.  Constant engine starts, heavy loads, and voltage fluctuations at the battery enter the radio and are applied to the finals because, as mentioned, they are not switched.  If my radio ends up in my vehicle again, I will have it switched, so I can make sure no power is applied to it while I start or stop the engine.

Most of this information was found at EB5AGV’s web site at:
http://jvgavila.com/ic706.htm

Thank you a thousand times to Jose, EB5AGV, for the nice details he provided on doing this mod.

–John, K7JM


Amazon Instant Video And Flash Update Fix

My family regularly views videos from Amazon’s Instant Video.  We went to rent a movie last night, and the Amazon Video screen said “Updating”; when it was finished, we received an error.

After a lot of searching on-line, I found a discussion about the topic on Amazon’s Customer Discussions website.  It appears that Amazon’s Instant Video, Flash, and Ubuntu all of a sudden don’t get along.  On the discussion page, there seem to be some fixes, mostly for 64-bit Ubuntu operating systems.  I, however, run 32-bit Ubuntu.  Thanks to one particular post by Erik, I found a solution that seems to work on 32-bit Ubuntu systems.

Here is my solution based on Erik’s advice.

# Install the required packages.
sudo apt-get install hal

# Remove cached junk from Adobe and Macromedia.
# Back these up if you're not sure what you're doing.
rm -r ~/.adobe ~/.macromedia

# Reboot the computer.

# Start Firefox.

# The Amazon player updates, and video rentals are now playable again.

Since I was trying all kinds of stuff, I didn’t really know what made it finally work, so to confirm my findings, I booted up the Alpha of Ubuntu 12.04 in VirtualBox, confirmed that Amazon Instant Video choked, and repeated the above procedure. It worked for me. I am running 32-bit Ubuntu.  I hope this helps someone else!

John – K7JM

How I Back Up My Ubuntu Linux Laptop

I am an ardent believer in backups. My desktop computer is backed up nightly to an external USB drive using a program called Back In Time. It uses the famous rsync program to do the backups. The neat thing about rsync is that it uses hard links to point to previously backed-up files that have not changed. What this means is that if a file does not change, it is not copied multiple times to the hard drive, but just once. This saves a very large amount of disk space while preserving the original directory structure. Back In Time is started through a cron job, and as long as the desktop computer is on, it will be backed up. By the way, my computers run the Ubuntu Linux operating system. These instructions are for Linux and will not work on Windows systems. My laptop is running Ubuntu 11.10, and my server is running Ubuntu 10.04.3 LTS. Check out Ubuntu Linux at ubuntu.com.
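
If the hard-link trick sounds like magic, you can see it for yourself by comparing inode numbers for the same unchanged file in two snapshots; the backup paths below are just placeholders.

# Compare inode numbers of an unchanged file in two different snapshots.
ls -li /backup/snapshot_A/home/john/somefile /backup/snapshot_B/home/john/somefile
# If the first column (the inode number) matches, both snapshots point at the
# same single copy on disk, which is why unchanged files cost almost no space.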

The setup for my desktop will not work for my laptop. My laptop is not always on, and is not always available on the home network to be backed up. I needed a system that would automatically back up my laptop whenever it is on, but only once a day. I have been doing manual backups, but that is a hassle. I may eventually use this backup method for my desktop so I don’t have to keep it on all the time either.

Here are a couple of problems I needed to overcome. If I mounted a server drive from my laptop and backed up remotely to the server, rsync would not preserve the hard links. This not only takes up a lot of disk space, but every file would need to be sent over to the server during every backup. Over wifi, a typical backup would take longer than a day. Not ideal for a daily backup. I therefore had to initiate the backup from the server. This meant sshing into the server, mounting the laptop as a drive on the server, and starting the backup from the server end.

I searched for solutions to these problems and came up with a multitude of answers that were not particularly ideal for me to implement. One solution was to use NFS to mount the drives in fstab. I did not want to put it in fstab because I did not want it to mount automatically when the server booted (or the laptop, for that matter). Another solution was to continuously poll for the existence of the laptop from the server and, when it ‘appeared on the network’, start a backup. I then would have to keep track of it to make sure it was only backed up once a day. None of these solutions suited me well.

What I did…

Through a bit more studying and searching, I discovered that a remote script can be run by ssh. I use ssh all the time to connect to my server, and finding that I was able to start a script on the server just by initiating an ssh session was a revelation to me.
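
The discovery boils down to this: the command you would normally type after logging in can simply be appended to the ssh command itself. A minimal sketch, with a made-up hostname and script path:

# Run a script on the server and come straight back, all in one command.
ssh john@server.example /home/john/Scripts/some_script.sh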

Another tool that I found was “anacron”. Anacron is kind of like cron, but it lets you run something every x days, weeks, or months. And if the computer is not running at midnight, anacron will make sure the program runs sometime during that day, as long as the computer is on at some point that day. Unlike cron, anacron does not let you run a program multiple times a day, or at exact times. This is exactly what I was looking for, and anacron was already installed on my laptop and already set up to run at boot.

This is not exactly a step by step tutorial, as you need to know how to get around your system, and run things as root.

I wanted the backup process to be automatic, so I needed to set up ssh keys. I had never set up ssh keys before. There are a lot of tutorials on the net to help you along, and I picked this one: http://news.softpedia.com/news/How-to-Use-RSA-Key-for-SSH-Authentication-38599.shtml

This is the command to generate the keys. I wanted unattended backups, so I did not enter a passphrase.

# ssh-keygen -t rsa

I also ran these commands to copy the files to the appropriate directories and files:

# scp .ssh/id_rsa.pub username@hostname.com:~
# cd $HOME
# cat id_rsa.pub >> .ssh/authorized_keys

I also had to copy the keys from the .ssh directory in my home directory to the .ssh directory in the /root directory.

*** EDIT 11/3/2012 ***  I updated my operating system, and neglected the above step.  The backups failed, and I really scratched my head wondering why.  Anacron (see below) is run as root, so it is imperative that the ssh keys be placed in the /root/.ssh/ folder.

*** EDIT 7/29/2023 ***  I still use this same routine to backup my laptop over a decade later.  A new command (to me) that I found is “ssh-copy-id username@remote_host”  This copies the public rsa key to the server and adds it to the “authorized_keys”.  If you have a root password on your server, you can use the command “ssh-copy-id root@remote_host”.  Otherwise use the commands listed above (copied here):

# scp .ssh/id_rsa.pub username@hostname.com:~

ssh into the server

# ssh username@hostname.com

# sudo su

# cat /home/username/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys

Read over the ssh tutorial cited above, and when you are done you should be able to ssh into your server with:

# ssh username@remote.hostname.com

I like to use sshfs to mount my remote directories.
If it is not installed on your server, install it by running:

# sudo apt-get install sshfs

or find it in the Ubuntu Software Center.

You will also need a recent version of Back In Time. I have v1.0.8 installed. The difference between the newest version and older ones is that the newer version can have multiple profiles. That is, you can have a different profile for each computer you want to back up from your common point (server).

I added the Back In Time stable repository PPA. This will give you the latest and greatest version of Back In Time. If you don’t need multiple profiles, you can install Back In Time from the Ubuntu Software Center and it will be fine. Run this command to add the repository:

# sudo apt-add-repository ppa:bit-team/stable

Then do:

# sudo apt-get update
# sudo apt-get install backintime-gnome

Some explanation is needed here that probably should have been mentioned sooner. My server has ubuntu-desktop installed on it (that is, a full GUI interface). The Back In Time profile configuration is easily done with the GUI. Then the actual backups can be done with the command line. This also means that if you don’t have a server, your desktop computer should work just fine to back up your laptop with these instructions. There is nothing really “servery” about the process.

OK, some preliminary work:

You need to make a mount point on your server where you want to mount your laptop drive. Create a blank directory where you want it mounted. I chose “/home/ubuntu-laptop” as my mount point.

Create directories on your laptop and on the server where you want the script files to go. On my laptop, I put my backup script in “/home/myhome/Scripts_and_Programs/server_backup”. You can put yours wherever you want. On my server I put the script in “/home/myhome/Scripts”.

I use static IP addresses for my home network devices. You can set up a static IP for your laptop and server in the network manager, but I chose to let the router do it. My router lets me pick a certain IP to be assigned by its DHCP server. It is based on the MAC address of my laptop wifi connection and my server Ethernet card. My laptop is assigned 192.168.1.5 and my server is 192.168.1.11. By letting the router assign the addresses, I am free to take my laptop somewhere else, and some other wifi router will be free to assign my laptop whatever address it needs to.

Test your connections. ssh into your server. If your ssh keys were set up correctly, a command like:

# ssh -X myname@192.168.1.11

should work, and no password is necessary. On your first connection, you may be asked to accept the key. Answer “yes” and proceed.

Now, try to mount your laptop drive from your server. My command to do this is:

# sshfs myname@192.168.1.5:/home /home/ubuntu-laptop

If you go to “/home/ubuntu-laptop” on your server, you should now see your laptop files. Again, the first time you do this, you may be asked to accept the ssh keys. You will also be asked for your laptop password. You could set up another set of ssh keys for your server if you would like, but the sshfs command has an option to get your password from somewhere else, and that is how I did it. That will be explained later.
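
As an aside, when you want to release a test mount like this later, fusermount is the command that undoes sshfs (it is the same command the server-side script further down uses):

# Unmount the sshfs-mounted laptop from the server when you are done testing.
fusermount -u /home/ubuntu-laptop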

If all is well up to this point, you can fire up the Back In Time GUI to configure your backup. On the command line, the command to enter is “backintime-gnome”. You should still be ssh’ed into your server, or be at a server terminal. Notice that my ssh command from above has the option -X in it. This allows X display forwarding so you can run a GUI program from your terminal box.

In your ssh’ed terminal box that is connected to your server, type:

# backintime-gnome

This should bring up the Back In Time GUI, ready to configure.  Click the “settings” icon on the Back In Time GUI. If you are using a newer version of Back In Time, you can select a “New” profile. Mine is called “ubuntu-laptop”. Then you need to decide where to put your backups. I have a USB drive mounted at /home/shared2. You need to browse to the point where you want your backups to go using the folder icon to the right of “Where to save snapshots”. Leave “Schedule” disabled, as this will be handled in one of the script files.

Click the “Include” tab and add the folders you want backed up. Remember that I have my laptop mounted at /home/ubuntu-laptop on my server. This is the location you need to browse to when adding the folders to include. Notice I have a separate entry for my Thunderbird email folders. This is because I have all hidden files (.*) excluded from being backed up, and my Thunderbird email folders are inside a hidden directory.

Click the “Exclude” tab. Add any files or folders you do not want backed up. As you can see, I added an entry to skip all my configuration files with .* . You need to decide what you want backed up. I also excluded my download directory, my virtual machines folder (large files for experimentation only), and any other directories where I might have other systems mounted.

Click the “Auto-remove” tab and decide how many backups you want to retain.

Click the “Options” tab. I added one checkmark here that was not already checked: I checked “Continue on errors (Keep incomplete snapshots)”. I did this because I had a snapshot fail because of some protected .files. I suggest you check this box, at least at first. Then after you make backups, you can look at the log file to see what failed. Otherwise, you will never get a backup if there is an error somewhere.

I did not adjust any of the “Expert Options”.

Click “OK” and this should bring you back to the main “Back In Time” window. If you like, you can make your initial backup by pressing the “take snapshot” icon. Your first backup may take a good amount of time, depending on your wifi connection and the size of your backup. Mine included many video files, and took well over a day and a half. Subsequent backups are quite quick, depending on what new or changed files you need backed up.

If all is successful to this point, you can close the “Back In Time” GUI. I suggest you wait until your initial backup is done; the backup will continue even after you shut the GUI, but this may cause problems when you try to set up the rest of the system.

Now the meat and potatoes:

This is the script “start_laptop_backup.sh” that is run on the laptop side, to initiate the backup.

#!/bin/sh
# Automated backup of a laptop, that is not always on, to a server that is always on.
# By: John McDougall
# Aug. 11, 2011
#
# This is a script that is run on the laptop. It starts a script on the server named backup_ubuntu-laptop.sh
#
# Is the server pingable?
# Try three pings, wait a maximum of one second for a reply, be quiet.
echo "Starting laptop side script"
echo "Testing to see if server is accessible"
if ping -qc 3 -W 1 192.168.1.11 > /dev/null; then
    echo "yes... Server is up. Will now attempt backup"
    echo "Connecting to server."
    ssh -X john@192.168.1.11 /home/john/Scripts/backup_ubuntu-laptop.sh
    echo "Server disconnected... Closing laptop side script"
else
    echo "no... Server is down. No backup done now"
fi

This is what this script does: first, I try to ping my server to see if it is available. If it passes that test, I connect to the server over ssh and run the script “backup_ubuntu-laptop.sh”. No password is needed if the ssh keys are set up properly.

Here is the script on the server side called “backup_ubuntu-laptop.sh”.

#!/bin/bash
# Automated backup of a laptop, that is not always on, to a server that is always on.
# By: John McDougall
# McDougallsHome.net and radio.McDougallsHome.net
# Aug. 11, 2011
#
# This is a script that is run on the server. It is started from a script run through ssh from the laptop to be backed up.
#
# Connect to ubuntu-laptop from server.
echo "Successfully connected to server... Running server side script"
echo "Attempting to mount ubuntu-laptop to server"
sshfs -o password_stdin john@ubuntu-laptop:/home /home/ubuntu-laptop < /home/john/Scripts/ubuntu_pwd
echo $1
for i in `cat /proc/mounts | cut -d' ' -f2`; do
    if [ "x/home/ubuntu-laptop" = "x$i" ]; then
        echo "laptop is mounted."
        # Start Backintime
        echo "Starting backintime"
        # backintime --backup
        backintime --profile ubuntu-laptop --backup
        echo "Backup finished"
        echo "Unmounting ubuntu-laptop"
        fusermount -u /home/ubuntu-laptop
        echo "ubuntu-laptop unmounted. Exiting backup script and disconnecting from server"
        exit
    fi
done
echo "laptop not mounted... Can not backup"
exit

This script first mounts the laptop via sshfs. Notice the “-o password_stdin” option. This enables the password to be read from another source. In this example, I have the password in a file called ubuntu_pwd in my Scripts directory on the server. As mentioned earlier, you could set up ssh keys for the server instead and eliminate the password-in-a-file thing if you feel that is unsafe.
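
For completeness, creating that password file is nothing more than putting the laptop password on a single line and locking the permissions down, since it sits on disk in clear text; the password below is obviously a placeholder.

# Create the password file the sshfs command reads, readable only by its owner.
echo "my-laptop-password" > /home/john/Scripts/ubuntu_pwd
chmod 600 /home/john/Scripts/ubuntu_pwd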

This script then confirms that the laptop is indeed mounted. It then initiates the backup with the command “backintime --profile ubuntu-laptop --backup”. After the backup is finished, the script unmounts the laptop and closes down.

To test all this, open a terminal window on your laptop, start the script “start_laptop_backup.sh”, and watch its progress. If all worked out, and you already made an initial backup, it should run relatively quickly.

Now to automate it all with anacron:

You can find a little tutorial on anacron at http://www.thegeekstuff.com/2011/05/anacron-examples/.
To configure anacron, you must modify the file /etc/anacrontab on your laptop. I did this in a terminal window using the command:

# sudo nano /etc/anacrontab

I added one line to the bottom of the anacrontab file (a sketch of it is below). The first parameter, “1”, means it should be run once a day. The second parameter, “13”, means it will start approximately 13 minutes after the machine is booted. You can change this to whatever you want. Set it to a delay that won’t interfere with the busy boot process, or anything else you might do as soon as you turn on your computer. The third parameter is a label you make up to identify this process. The fourth parameter is the command to execute. By placing “nice” before the command, if the system is very busy, the backup process will wait until the system is less busy. You can look up the manual page for “nice” by typing “man nice” in a terminal window.
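
Since the screenshot of my anacrontab is not reproduced here, this is roughly what the added line looks like, reconstructed from the description above; the label and script path are mine, and yours will differ.

# period  delay(min)  job-identifier  command
1         13          ubuntu-laptop   nice /home/myhome/Scripts_and_Programs/server_backup/start_laptop_backup.sh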

That about does it. If everything worked so far, you are ready to reboot your computer. Anacron should run and a backup should be initiated. Anacron will only run the job once a day, even if the laptop is turned on several times a day or left on for days. You will still only have one backup per day. You can look at the files in /var/spool/anacron. ubuntu-laptop (or whatever label you used in the anacrontab file) should be listed there. Open it up with nano or gedit, and it should show the last date that line in the anacrontab file was executed.
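
A quick way to check the same thing from a terminal; the label here matches the one used in the anacrontab line above.

# Show the last date anacron ran the backup job (the date is stored as YYYYMMDD).
cat /var/spool/anacron/ubuntu-laptop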

I hope this post helps you in backing up your laptop.  It might seem kind of complicated, but I enjoyed the process of learning and figuring it all out.  Have fun exploring Linux!

John – K7JM

Aquaria mouse pointer problem solved

I purchased the Humble Introversion Bundle for the Ubuntu Linux platform.  I had a problem getting the game Aquaria working on my laptop computer.  The mouse pointer would automatically travel up and to the left.  In certain menus, options were impossible to select.  It was very frustrating and made the game unusable.  After much searching, I came upon this page with the solution: “Aquaria mouse pointer problem solved” on the Bit Blot forums.  Thank you archmage for coming up with this easy solution.  The game is now playable, and my kids love it.

John – K7JM

I bought the Introversion Humble Bundle to play Aquaria on my 64-bit Ubuntu 11.04 system. Unfortunately the mouse did not work, to the point of making the game unplayable.

First of all, it was not possible to change the resolution, because the mouse pointer kept moving back to “no”. Far more seriously, while in game the character kept moving up and to the left.

A long search on the net provided no clear answer; it seems some people have the same problem on Windows and OS X too. On Ubuntu, some suggest disabling the package “unclutter” to solve the issue, but I did not have that installed at all.

My solution to the problem was to modify the ~/.Aquaria/preferences/usersettings.xml file. Using any text editor, change the line:

<JoystickEnabled on="1" />

to:

<JoystickEnabled on="0" />

The game is now playable.

How To Get Sound Working On Gridwars2

I am a fan of the Gridwars2 game. If you have problems installing Gridwars2 in Ubuntu, see my post about installing it HERE.  I have been successful in installing Gridwars2 in Ubuntu versions from 10.04 to 11.04.

After installing it on a test install of Ubuntu 11.04 with Unity, the sound would not work.  I found this solution that worked:

sudo apt-get install alsa-oss

Then, run Gridwars with this command:

aoss ./gridwars

If you are not running the command from within the Gridwars directory, adjust the path to point to the Gridwars executable.
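
For example, if the game were unpacked somewhere under your home directory (the path here is just an illustration, not where the installer necessarily puts it):

# Run Gridwars through the alsa-oss wrapper using a full path to the executable.
aoss ~/games/gridwars2/gridwars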

Have fun with Gridwars!

— John – K7JM

LaCrosse WS-2315 Wind Interference Solution

I recently put my weather station on a SheevaPlug computer. See my post detailing the process HERE. Since that time, I’ve noticed that the LaCrosse WS-2315 (the weather system I have) has very often been reporting wind gusts of 14 MPH. I doubted that the wind was really blowing 14 MPH all the time, so I searched the Internet for a solution. It seems that the LaCrosse systems are prone to interference on the wind sensor line. A very good article by Kenneth Lavrsen explained a good solution to the problem.  I also read about using toroidal cores to reduce the interference.  I have to admit, too, that the line from the wind sensor was loose and flapping in the wind.  That is not a good thing with the cold, dry, static-prone weather we have been having.  I tightened up the line and attached a ferrite noise reducer to it, and the problem went away.  Here is a screenshot of my 14 MPH wind.  Notice the correct readings after about 17:20.  Also, see a blurry picture of the solution.

— John – K7JM