I followed the build-from-source instructions for Desktop-Full, installing dependencies as they came up (i.e., whenever CMake complained) and invoking catkin_make_isolated with the following flags: -DCMAKE_INSTALL_PREFIX=/opt/ros/groovy -DPYTHON_EXECUTABLE=/usr/bin/python2 -DPYTHON_INCLUDE_DIR=/usr/include/python2.7 -DPYTHON_LIBRARY=/usr/lib/libpython2.7.so -DSETUPTOOLS_DEB_LAYOUT=OFF
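For reference, the whole invocation looked roughly like this. Only the -D flags are from my actual build; the --install flag and the workspace-relative path to catkin_make_isolated are assumptions you may need to adjust:

```shell
# Build and install ROS Groovy from source (run from the catkin workspace root).
# --install and the script path are assumed; the -D flags are the ones above.
./src/catkin/bin/catkin_make_isolated --install \
    -DCMAKE_INSTALL_PREFIX=/opt/ros/groovy \
    -DPYTHON_EXECUTABLE=/usr/bin/python2 \
    -DPYTHON_INCLUDE_DIR=/usr/include/python2.7 \
    -DPYTHON_LIBRARY=/usr/lib/libpython2.7.so \
    -DSETUPTOOLS_DEB_LAYOUT=OFF
```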
Some of the most annoying oddities:
camera_calibration_parsers wants yaml-cpp 0.3 (from the AUR), not 0.5!
image_proc and stereo_image_proc use Boost’s lock_guard in their nodelet code but fail to compile because the BOOST_VERSION check doesn’t work. Passing -DBOOST_VERSION=105300 to make had no effect, so I just edited the source code. (In hindsight, I should probably have defined it for the compiler instead, e.g., by adding it to CXXFLAGS or via add_definitions() in the appropriate CMakeLists.txt; a -D flag passed to make never reaches the compiler.)
gscam needs gstreamer0.10-plugins installed.
The MOST annoying thing of all was waiting for catkin to sift through nearly 180 packages, over and over, before choking on one package after another. Even on eight hyperthreaded cores, this took a while. It would have been smarter to check ros.org for the packages that depended on each package I removed and prune them from src preemptively. Better still, I could probably have generated the list of dependent packages with rosdep and removed them all at once.
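A sketch of that cleanup, assuming a sourced ROS environment and that directory names under src match package names. The package names in the loop are just examples of ones I removed; rospack depends-on (rather than rosdep, which resolves system dependencies) lists the packages that depend on a given one:

```shell
# Hypothetical cleanup: for each package I dropped, find everything in the
# workspace that depends on it, then delete those from src/ as well so
# catkin_make_isolated never attempts them.
for pkg in camera_calibration_parsers image_proc stereo_image_proc; do
    rospack depends-on "$pkg"      # prints packages that depend on $pkg
done | sort -u | while read -r dep; do
    rm -rf "src/$dep"              # assumes directory name == package name
done
```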
Rode down Sickter Gnar (at Blackrock) for the second time, got overambitious with my speed, and went over the bars. Hit the ground with my right elbow and left pinky and noticed I no longer had a fifth knuckle. The area just under the head of the metacarpal is shattered and will need pins to fix. Surgery is on Wednesday. I’ve a temporary cast for now.
Initially, there was no pain and I vainly tried to pull it back myself, thinking it was a dislocation.
In retrospect, I should have kept my front wheel straight and leaned backward more. And maybe not tried to pull my finger back by myself.
What sucks more than not being able to bike for the next few weeks is that I can’t use my left hand for typing anymore. I have to supinate my arm uncomfortably to even hit a key with my index finger without mashing the keyboard with the cast. Doing any precision mechanical work is also going to be a challenge.
That said, I’m getting excited thinking of all sorts of mechanical additions to build onto the cast when I’m back in the lab.
Update 5/9/13: Got three pins inserted yesterday at 13:00. I was under general anesthesia and apparently woke up and talked with the doctors just after surgery was over, but I don’t remember any of it. All of this is a first for me, so it’s interesting. I remember waking up at 14:50, and I was thinking and talking coherently by 15:10.
Then they gave me hydrocodone and removed the IV (they have flexible needles!). The painkiller took full effect by 16:00, and I experienced the worst nausea I’d felt in a long while. Got home and vomited three times.
The nausea didn’t go away until 6:00 today when I woke up. That didn’t leave much time for me to study for the midterm exam I just took, which actually wasn’t that bad. I’ll probably get at least a C.
I forgot to mention last time that the nurse used an oscillating saw to cut my cast open. The oscillating blade cuts through hard material like the cast but not the soft cotton or skin underneath, which is cool.
The 900 MHz XBees I bought back in May 2012 came without antennas (not a mistake — the ones with whip antennas were out of stock, and I couldn’t wait). It turned out that making effective dipole antennas out of 50 ohm coax cable is fairly easy in practice:
DIY half-wave dipole antennas on 900 MHz XBee Pros.
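The lengths involved are easy to work out: a half-wave dipole at 900 MHz is about 333/2 ≈ 167 mm tip to tip, so each leg (the stripped center conductor and the folded-back shield) is a quarter wavelength, about 83 mm. In practice the legs end up a few percent shorter than the free-space figure because of end effects. A quick sanity check of the arithmetic:

```shell
# Free-space half-wave dipole dimensions at 900 MHz.
awk 'BEGIN {
  c = 299792458                 # speed of light, m/s
  f = 900e6                     # 900 MHz
  lambda = c / f                # wavelength in meters
  printf "wavelength: %.1f mm\n", lambda * 1000            # ~333 mm
  printf "each leg (lambda/4): %.1f mm\n", lambda / 4 * 1000  # ~83 mm
}'
```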
In the past, people have trained neural networks to guess the next character in a sequence of characters and, in doing so, generate fairly coherent sentences. This is similar to what search engines like Google do to autocomplete queries.
It would be a good exercise for me to implement such a neural network in Lisp, since I am currently learning both.
To make things even more exciting, I can train a program to (aurally) recognize appropriate pauses in people’s conversations (e.g., “umm”, “uhh”, “so…”) and politely chime in with verbal suggestions.
Since doing so audibly might become too intrusive, I might make some of the incoming ECEs build an LED matrix (or just buy one) on which the program can display the words.
This might become a software module for a more sophisticated AI bot I build in a future project.
UPDATE 10/22/12: The bot could also compose poems. And music. And draw things. In short, this bot could be generalized to complete anything.