2007 Honda Civic Efficiency Update

May 7, 2013 7:41 pm

Just an update on the fuel efficiency I’ve been getting with my 2007 Honda Civic (automatic transmission).  The orange line in the chart below represents the running average, which is currently hovering just above 30 MPG.  The EPA rating was 25/36, so I’m still doing pretty well.

It does look like the efficiency may have dropped a little bit starting around autumn 2011.  But it’s hard to say for sure since the gasoline formulations change regularly what with the summer/winter mixes and inconsistent ethanol levels.

[Chart: 2007 Civic fuel efficiency (MPG) over time, with the average shown in orange]

Automaton Simulator

March 23, 2013 5:20 pm

I posted about this on Google+ a while back, but I’ve updated the site and it’s now much cleaner.  I still have a few features I’d like to add in the future, but they don’t really impact the site’s purpose.

Anyway, I present AutomatonSimulator.com:

[Screenshot: AutomatonSimulator.com]

In Computer Science we study simple automata called finite-state machines.  They correspond to various useful classes of languages.  For example, Deterministic Finite Automata (DFAs) can recognize any Regular Language (i.e., regular expressions, which are infinitely useful).  And Push-Down Automata (PDAs) can recognize any Context-Free Language (the kind described by a Context-Free Grammar).

In the CS course I TA’d for as a student, CS 252, a chunk of the course is devoted to working with these concepts.  This usually means designing a working automaton that recognizes some desired language. For example, make a machine that will accept strings that alternate between “A” and “B”. Or, make a machine that will accept strings that have the same number of “A”s as “B”s.

They’re usually quite meaningless in and of themselves, but the point is to develop the skills necessary to understand how programming languages are created and why, as well as to hone the ability to logically analyze problems and build logically consistent solutions.

Well, we had to do all this work by hand.  Drawing out machines, tracing through their execution, finding bugs, and making sure they did what they were supposed to without doing things they weren’t supposed to.

As the TA I had to grade a lot of these messily drawn machines that often didn’t work.  It was tiring.  So to aid my grading I wrote a simple simulator in Python for each machine type.  Then I’d encode each student’s machine into my simulator, run a bunch of tests and figure out from there whether it worked and, if not, how badly it was wrong.
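To give a sense of how simple those simulators can be, here’s a rough sketch of the DFA case (a simplified illustration, not the actual code I used).  A DFA boils down to a start state, a set of accepting states, and a transition table; simulating one is just a loop over the input string.

# Minimal DFA simulator sketch (illustrative only, not the original grading code).

def accepts(dfa, string):
    """Run the DFA over the string; accept if it ends in an accepting state."""
    state = dfa["start"]
    for symbol in string:
        if (state, symbol) not in dfa["transitions"]:
            return False  # no transition defined for this input: reject
        state = dfa["transitions"][(state, symbol)]
    return state in dfa["accept"]

# Example machine from class: accept strings that alternate between "A" and "B".
alternating = {
    "start": "s",
    "accept": {"s", "a", "b"},   # the empty string and any alternating string
    "transitions": {
        ("s", "A"): "a", ("s", "B"): "b",
        ("a", "B"): "b",         # after an A, only a B may follow
        ("b", "A"): "a",         # after a B, only an A may follow
    },
}

# Bulk-test a handful of strings, much like the site's bulk test feature.
for s in ["", "A", "AB", "ABAB", "AA", "ABBA"]:
    print(repr(s), "->", "accept" if accepts(alternating, s) else "reject")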

AutomatonSimulator.com is a fully functional tool to visually create and test these types of machines.  I took my Python simulators, rewrote them in JavaScript, and built a lovely UI around them.

You can save/load machines from your browser’s local storage.  Or you can copy/paste machine descriptions to share with other people.  A small set of examples is included on the site.  You can debug a machine by stepping through an input and you can bulk test a large set of strings with a single button press.

I had fun creating the site and hopefully CS students will find it useful in developing their understanding of finite-state machines.

Something I’d like to do in the future is to build a simple game around the site.  It wouldn’t be very involved, but it would challenge the user to build a machine for a certain language and help them make the connections between these machines and regular expressions.  We’ll see if I get around to it someday.

XBMC LIRC XBox DVD Dongle

July 25, 2012 8:53 pm

Just sticking this here for the rest of the Internet to benefit from.

I was rebuilding the HTPC with an updated OS and needed to get LIRC working again for my remote using the original XBox DVD dongle connected via USB.

I found this post, which describes the process of building and using the lirc_xbox kernel module.  That worked fine in XBMCBuntu (based on Ubuntu 11.10).  But when I tried to follow the instructions in Linux Mint 13 XFCE I got the error “E: Unable to find a source package for lirc” when attempting the line “sudo apt-get build-dep lirc”.

I worked around the problem by running the same command on my laptop running Ubuntu 12.04, which gave me the correct list of build dependencies:

autotools-dev build-essential debhelper devscripts dh-apparmor diffstat dpkg-dev g++ g++-4.6 gettext html2text intltool-debian libasound2-dev libdpkg-perl libftdi-dev libftdi1 libgettextpo0 libice-dev libirman-dev libjack-dev libjack0 libportaudiocpp0 libpthread-stubs0 libpthread-stubs0-dev libsm-dev libstdc++6-4.6-dev libunistring0 libusb-dev libx11-dev libxau-dev libxcb1-dev libxdmcp-dev libxt-dev patch po-debconf portaudio19-dev quilt x11proto-core-dev x11proto-input-dev x11proto-kb-dev xorg-sgml-doctools xtrans-dev

I used that list in a “sudo apt-get install [long list of package names]”, which got the build dependencies installed.  From there I was able to follow the rest of Mr. Plow’s instructions and get my remote working properly.

Since I saw that a few other people were having the same issue, I thought I’d post this in case anyone else gets stuck on the problem.

Reducing power usage: SheevaPlug and Squeezebox Radio

July 7, 2012 12:36 pm

Since we’ve entered summer, our electricity rate tiers have switched to the summer levels.  This means the lower (cheaper) tiers are smaller, so you start running into the much more expensive tiers sooner.  Tiers 1 and 2 are both pretty cheap, 13 and 15 cents per kWh, but tier 3 jumps to 30 cents per kWh.  So whenever possible we try to avoid landing in tier 3.

A while back Mom gave me a Kill-A-Watt meter.  Our electricity bill informs us that we’re consistently using substantially more electricity than “similar homes in your area,” which seemed odd since we don’t obviously waste energy.  So I finally got around to checking on our electronics to find out what’s guzzling our energy.

Jess has an old stereo thing that we use to listen to music when going to bed.  I discovered that this stereo was drawing ~18 watts regardless of what it was doing, 24 hours a day, 7 days a week.  So just having this stereo plugged in was costing us somewhere between $20 and $45 a year depending on the tier.

I have a desktop computer which I use as a file server and media center (via XBMC).  It holds all of our DVDs so that the actual DVDs sit in a box somewhere out of the way.  It also holds all of our music files and I use it to download various things via bittorrent (Linux ISOs, games purchased via the Humble Bundle, perfectly legal things, of course).  Thus the computer was usually on 24/7 also.

So I was rather shocked and appalled to discover that it was drawing ~106 watts when running.  Keeping that machine on was costing us ~$100-$200 a year!  So the first thing I did was dig into configuration options and disconnect unused components.  Via this route I was able to bring its energy usage down to about 80 watts.  Better, but not great.

Enter the Logitech Squeezebox Radio and the SheevaPlug.

Logitech Squeezebox Radio

[Photo: Logitech Squeezebox Radio]

The Squeezebox Radio was something I’d wanted for a little while.  It’s a nifty device, and the very low power usage was just a nice bonus.  It’s essentially a music streaming device with a built-in speaker and wireless network connection.  So you just plug it in and you can listen to Internet radio stations, connect to a Pandora account (or most other music streaming services), and play music from a local server via the freely available Logitech Media Server.  Something I like about it is that all the software is open source and Logitech doesn’t make any attempt at locking down the software or hardware.

Anyway, I received the Squeezebox Radio for my birthday this year.  Part of its job was to replace Jess’ old stereo system.  It’s working great at that task and takes up less than a third of the space.  It’s small (smaller than I expected) and easy to move, so Jess often moves it to the living room during the day, into the bathroom for Heather’s bathtime, etc.

The Squeezebox Radio draws ~2.0 watts when running (~2.2 watts when the screen is on).  So that’s a big win over the stereo drawing 18 watts.  But it also contributes savings in other ways.  Instead of running the full-blown stereo system in the living room for music, Jess uses the Squeezebox Radio, so that’s going to count for something.

Overall I am very pleased with the Squeezebox Radio.

The Squeezebox Radio spends much of its time streaming music locally from the desktop (using the mentioned media server software).  And the computer was the big power hog.  So let’s address that now.

SheevaPlug

[Photo: SheevaPlug]

To try to reduce the power usage of the computer, I spent some time researching low-power computing options.  I looked at building an Intel Atom-based machine, an AMD Fusion-based machine, a dedicated NAS device, and a few other options.  But it looked like those systems would give me a lot more computing power than I needed while still pulling 20-30 watts.

I then turned my attention to the Plug Computer scene.  Plug computers are designed to be cheap, low power, plugged in somewhere out of the way, and mostly forgotten. They have a vibrant community built around them.

There are several plug computers to choose from.  I went with the SheevaPlug because it has a long history with many success stories and guides.  Its age means it’s a little less capable than some of the other offerings, but it looked like it would do what I needed just fine.

It’s small, about the size of 3 decks of cards.  It features a 1.2 GHz ARM processor, 512 MB of DDR2 RAM, an SD card slot, a USB 2.0 port, and a gigabit Ethernet jack.

I set it up with a 4 GB SD card and a 64 GB USB flash drive.  I had planned to use a 2 GB SD card, but it didn’t like the one I had and a 1 GB card was too small.

I used the 4 GB card to install the operating system (Debian Linux) and other necessary software (like the Logitech Media Server, Transmission [a bittorrent client], etc.).

The 64 GB USB flash drive is holding all the data I need.  It has our music library, backup files from the Board (the Board gets backed up nightly, previously to my desktop, now to the SheevaPlug), and any currently active torrent files.

Maybe some of the other plug computers are different, but setting up a SheevaPlug isn’t exactly for the novice.  I had to cobble together bits and pieces from various guides in order to get everything working correctly.  It requires a working knowledge of Linux, a comfortable familiarity with command lines and a basic understanding of memory addressing (well, if you want to have any idea what the commands you’re typing do, that is).

Here are the main resources I used:
http://www.cyrius.com/debian/kirkwood/sheevaplug/
http://www.cyrius.com/debian/kirkwood/sheevaplug/install.html
http://plug.noloop.net/sheevaplug-hacks/installing-debian/
http://wiki.slimdevices.com/index.php/SheevaPlug_Installation_guide
http://d-i.debian.org/daily-images/armel/20120705-08:35/kirkwood/netboot/marvell/sheevaplug/ (for the latest Debian installer images)
I also needed some decent Google skills to solve various issues along the way.

The SheevaPlug is up and running smoothly now.  Running full tilt it draws about 3.5 watts.  So the Squeezebox Radio and SheevaPlug together use about 6 watts, compared to the ~125 watts previously needed for the desktop and stereo.  So over the course of a year this setup will save us somewhere between $120 and $200 in electricity.  The SheevaPlug costs $99, so it will pay for itself within a year, and that’s ignoring the reduced cooling costs.
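For anyone who wants to check the arithmetic behind those dollar figures: the annual cost of a constant load is just watts × hours per year ÷ 1000 (to get kWh) × the rate per kWh.  Here’s a quick sketch using the summer tier rates mentioned above (flat tier-1 and tier-3 rates; a real bill blends tiers, which is why the ballpark ranges in this post are a bit different):

# Annual-cost arithmetic for a constant electrical load, using the Kill-A-Watt
# measurements and summer tier rates from this post.  Real bills blend tiers,
# so these flat-rate numbers are just rough lower and upper bounds.

HOURS_PER_YEAR = 24 * 365

def annual_cost(watts, rate_per_kwh):
    """Dollars per year to run a constant load at a flat rate."""
    return watts * HOURS_PER_YEAR / 1000.0 * rate_per_kwh

TIER1, TIER3 = 0.13, 0.30  # $/kWh

for name, watts in [("old stereo", 18),
                    ("old desktop", 106),
                    ("Squeezebox Radio + SheevaPlug", 2.0 + 3.5)]:
    low, high = annual_cost(watts, TIER1), annual_cost(watts, TIER3)
    print(f"{name}: {watts} W -> ${low:.0f} to ${high:.0f} per year")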

Bayes’ Theorem

May 12, 2012 3:38 pm

By request, here’s my attempt to explain Bayes’ theorem (cribbing heavily from Wikipedia).

Deriving Bayes’ Theorem

We start with the standard definition of conditional probability (for events A and B):

P(A|B) = P(A ∩ B) / P(B)

Which reads: The probability of event A given the known occurrence of event B is equal to the joint probability of events A and B (i.e., the probability of both events occurring) divided by the probability of event B (assuming the probability of B is not zero).  I’m not going to show the derivation of conditional probability.

Rearranging that definition gives us the multiplication rule for the joint probability of A and B:

P(A ∩ B) = P(A|B) P(B)

It tells us that the joint probability of A and B is equal to the probability of A given B multiplied by the probability of B.  All we’re talking about is the likelihood of events A and B both occurring.

Keep the following equivalence in mind; we’ll need it in a minute.  It simply says that the order of variables when writing the joint probability is irrelevant.  It should be fairly straightforward that the likelihood of events A and B both occurring is the same as the likelihood of events B and A both occurring.

P(A ∩ B) = P(B ∩ A)

Filling back in our definition of conditional probability, we have (with the understanding that P(B) is not 0):

P(A|B) P(B) = P(A ∩ B) = P(B ∩ A) = P(B|A) P(A)

P(A|B) P(B) = P(B|A) P(A)

P(A|B) = P(B|A) P(A) / P(B)

The third equation is the simplest form of Bayes’ theorem.  It wasn’t very hard to get to and the math, relatively speaking, is quite simple (we’re not talking about deriving the Schrödinger equation or anything absurd).  But its application, and understanding what it means, can be tricky.

P(A) is called the “prior,” representing our prior belief in the occurrence of event A.

P(A|B) is called the “posterior,” representing our belief in the occurrence of event A after (post) accounting for event B.

The remaining pieces, P(B|A) / P(B), are called the “support,” representing the support event B provides for the likelihood of event A occurring.

Applying Bayes’ Theorem

I’ll re-use the medical testing scenario from the previous post.

Substituting T to represent a positive test result and D to indicate the presence of the disease, our equation becomes:

P(D|T) = P(T|D) P(D) / P(T)

We expand P(T) into P(T|D)P(D) + P(T|¬D)P(¬D) which is to say: The probability of getting a positive test, P(T), is equal to [the probability of getting a positive test when the disease is present, P(T|D), multiplied by the probability that the disease is present, P(D)] plus [the probability of getting a positive test when the disease is not present, P(T|¬D), multiplied by the probability that the disease is not present, P(¬D)].
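Written out in full, that gives us:

P(D|T) = P(T|D)P(D) / [ P(T|D)P(D) + P(T|¬D)P(¬D) ]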

So now we just need to map the numbers we have about the disease and the test to the variables in our equation:

P(D), our prior, is 1 in 200 million, the disease occurrence rate in the general population.

P(T|D), the probability of getting a positive test when the disease is present, is derived from the false negative rate.  When the disease is present, our test will incorrectly say it is absent only 1 time out of 200,000; so P(T|D) is 199,999 out of 200,000.

P(T|¬D), the probability of getting a positive test when the disease is not present, is the false-positive rate: 1 in 100,000.

P(¬D), the probability of the disease not being present, is simply the other 199,999,999 out of 200 million.

So let’s plug these numbers in step by step:

P(D|T) = (199,999/200,000)(1/200,000,000) / [ (199,999/200,000)(1/200,000,000) + (1/100,000)(199,999,999/200,000,000) ]

       ≈ 0.000000005 / (0.000000005 + 0.00001)

       ≈ 0.0005

And we see that P(D|T), the probability that the disease is present given a positive test, is only about 0.05% (roughly 1 in 2,000), matching the figure from the previous post.
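If you’d rather let the computer do the plugging-in, here’s a quick numeric check of that result:

# Quick numeric check of the medical-test example above.
p_d = 1 / 200_000_000            # prior: disease rate in the general population
p_not_d = 1 - p_d
p_t_given_d = 199_999 / 200_000  # positive test when the disease is present
p_t_given_not_d = 1 / 100_000    # false positive rate

# Bayes' theorem, with P(T) expanded as in the text:
p_t = p_t_given_d * p_d + p_t_given_not_d * p_not_d
p_d_given_t = p_t_given_d * p_d / p_t

print(f"P(D|T) = {p_d_given_t:.6f}")  # ~0.000500, i.e. about 0.05%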

The car example I presented didn’t use hard numbers, but I’ll frame the concept in terms of Bayes’ theorem.  S will represent a car being stolen and H will represent a car being a Honda Civic:

P(S|H) = P(H|S) P(S) / P(H)

Which says, the probability that a car is stolen given that it’s a Honda Civic is equal to [the probability that a car is a Honda Civic given that it’s stolen] multiplied by [the probability of a car being stolen] divided by [the probability of a car being a Honda Civic].

The dealership was trying to push an insurance policy on me by only reporting P(H|S), but unless we account for the number of Honda Civics on the road in the first place, P(H), we aren’t getting the full story.

(Notice that we don’t have to actually care about P(S) if all we’re doing is comparing car make/models.  P(S) doesn’t change for the different make/models so it doesn’t affect the relative rankings.)
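To make that concrete, here’s a toy comparison with completely made-up numbers (the shape of the calculation is what matters, not the values):

# Toy illustration with made-up numbers: a popular car can top the raw
# "most stolen" counts (high P(H|S)) simply because there are so many of
# them on the road (high P(H)).
p_h_given_s = 0.08   # hypothetical: 8% of stolen cars are Honda Civics
p_h = 0.10           # hypothetical: 10% of all cars on the road are Civics

p_r_given_s = 0.02   # hypothetical: 2% of stolen cars are some rarer model
p_r = 0.01           # hypothetical: 1% of cars on the road are that model

# P(S|model) = P(model|S) * P(S) / P(model).  P(S) is the same for every
# model, so for ranking we only need the ratio P(model|S) / P(model).
print("relative theft risk, Civic:      ", p_h_given_s / p_h)  # 0.8
print("relative theft risk, rarer model:", p_r_given_s / p_r)  # 2.0
# Per car on the road, the rarer model is the riskier one, even though the
# Civic dominates the "most stolen" list.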