As I continue to explore Hardy, I find things wrong.
As with any distro, all is not roses, yet.
I accidentally deleted some pictures, and discovered that there's no ~/.trash. Nor do I see a trashcan icon. Where did they go? Luckily, "locate trash" found ~/.local/share/Trash/files.
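For future reference, the layout follows the freedesktop.org trash spec, so you can poke at it (and restore things) from the shell. A sketch -- the file name here is just an example:
$ ls ~/.local/share/Trash/files          # the deleted files themselves
$ ls ~/.local/share/Trash/info           # matching .trashinfo records (original path, deletion date)
$ mv ~/.local/share/Trash/files/photo.jpg ~/Pictures/    # hand-restore one file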
The "world clock" (click on the calendar) seems to work, but doesn't display my weather, even though it's supposed to.
Wednesday, April 30, 2008
Monday, April 28, 2008
Sunday, April 27, 2008
Hardy Heron
I spent the last day upgrading to Hardy Heron (Ubuntu 8.04).
The upgrade takes about twelve hours. (I've done it twice, now.) This may speed up when the servers aren't getting hammered, but the second time, I just started it before I went to bed and let it run all night. By the time I got up the next morning, everything left to do (clicking to accept a handful of installation choices) was local.
On the first box, a desktop, tracker crashed and complained. I filed a bug. On my laptop, I turned off the pretty screen-saver because it kept freezing. Also, on the laptop, compiz fusion keeps opening my windows under the panel, so to move them, I have to use Alt-left-click. I filed another bug.
On the good side, the laptop, which I'd had to fuss around with some on Gutsy, just came up and worked. The new default settings are for full compiz-fusion eye candy, which is nice. I suspect the "freeze" problem is an interaction with the screen-saver, so I'm willing to live with the trade-off.
I haven't tried the wireless yet -- maybe it's improved, too. :-)
It all seemed pretty painless.
Friday, April 25, 2008
Github: A Git-Repository Hosting Site
I started poking at github this week. At first, I thought, "What's the point of a centralized hosting site for git, which is inherently distributed?" Then I remembered that I'm already using a git repository at kernel.org.
This is useful stuff. Even if I don't host anything important there, it helps me re-think some of my SCM assumptions and clear up my misconceptions.
Wednesday, April 23, 2008
Tuesday, April 22, 2008
This is Only a Test
Here and there, you'll see the syntax
if [ -f scriptname -a -x scriptname ]

or

if [ -f file1 -o -f file2 ]

These should, nowadays, be re-written like this:

if [ -f scriptname ] && [ -x scriptname ]

and

if [ -f file1 ] || [ -f file2 ]

Here's an opportunity for a little shell archaeology. In the old days, when men were men and giants walked the earth, test was a stand-alone executable. ( [ foo ] is a synonym for test foo ).
Actually, it still is:
$ ls -l /usr/bin/test
-rwxr-xr-x 1 root root 23036 2007-09-29 06:51 /usr/bin/test
In fact,

$ ls -l /usr/bin/[
-rwxr-xr-x 1 root root 25024 2007-09-29 06:51 /usr/bin/[

(Now you see how [ foo ] works in older shells.)
Old shells weren't nearly as powerful as newer ones, and in order to do even a simple test operation, you had to invoke a separate executable.
You can argue this is clean and minimalist, giving the shell only the roles that it has to have -- indeed, people make analogous arguments about microkernels -- but it's not efficient: to invoke the test sub-process, the kernel needs to fork() a subshell, and then exec() a new executable. To make things even messier, both these are kernel calls, which offers the risk of a process swap. All for a simple test.
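To put a rough number on that fork-and-exec cost, here's the kind of back-of-the-envelope comparison you can run yourself (a sketch, assuming bash and a coreutils /usr/bin/test; the absolute times don't matter, only the ratio):
$ time for i in {1..1000}; do [ -f /etc/passwd ]; done              # builtin: no forks
$ time for i in {1..1000}; do /usr/bin/test -f /etc/passwd; done    # a fork and exec per iteration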
Still, if you have to do that, it's more efficient to do both tests inside the new, test process than to return and invoke a second sub-process to do the second test. That is, [test1 -a test2 ] means "Do test1 and test2, and make me use an ugly syntax to tell you that while you're at it."
In fact, the syntax of test can get quite messy, but never mind that: you don't need it any more. The shell now offers its own, built-in versions of test, &&, and || .
$ type test
test is a shell builtin

Moreover, || and && both provide short-cut evaluation, just as in C.
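A quick illustration of that short-circuit behavior (the file names are just examples):
$ [ -f /etc/passwd ] && echo "passwd is a regular file"
passwd is a regular file
$ [ -f /no/such/file ] && echo "this never prints"
$ [ -f /no/such/file ] || echo "but this does"
but this does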
Monday, April 21, 2008
xclip: A Cut-and-Paste Aid
I'm forever cutting and pasting -- "snarfing-and-barfing," Jake Edge calls it. What's that you say? You too?
Do you know about xclip? I didn't.
I just saw a post somewhere that suggested using

cat big-file-name | pbcopy

to put a large file into the clipboard. "Ooooh!" I thought. "A command that captures stdin and slaps it into the clipboard? I could use that for a lot of things."
First, I tried apt-get install pbcopy. No such package. How about apt-cache search pbcopy? Nothing.
When I Googled "pbcopy ubuntu" I learned why: pbcopy is an OS X tool. For Ubuntu, the analogue is xclip. (And sudo apt-get install xclip works.)
By default, xclip puts its input into the X Windows clip buffer (duh), but I've also defined an alias to get me used to using pbcopy:
alias pbcopy='xclip -selection clipboard'

I've found that if I have a work-alike to a command in another distro, making an alias with the same name is a good didactic tool; it helps me get used to the command and makes it portable. Your mileage may vary.
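So, for example (the file name is made up):
$ cat notes.txt | pbcopy                    # stdin lands on the X clipboard
$ xclip -selection clipboard -o | wc -l     # and -o reads it back out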
So, fabulous stuff. I'll just write this post in vi, cat it into pbcopy, and use ^V to paste it into Blogger. Um. No. Neither that, nor a simple xclip seems to work. I can paste it into a text file, but not into blogger.
More investigation is needed.
Update:
Ah. xsel (from the package of the same name) works much better.
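For the record, the xsel versions of the same aliases would look something like this (a sketch -- same idea, different flags):
alias pbcopy='xsel --clipboard --input'
alias pbpaste='xsel --clipboard --output'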
Friday, April 18, 2008
Everybody Makes Mistmakes
There are mistakes I make over and over and over. One of them is forgetting that every command in a make rule is a separate shell.
Make is the tool that makes the most use of shell scripts. Makefiles are full of shell commands. In a make rule, though, every command line is executed by a separate subshell. This won't work:
foo:
for i in *
do
echo $i
done
The good news: you'll get an error message, because only the echo is a shell command by itself. You have to write it like this:
foo:
for i in *; do echo $i; done
Even when you lay it out nicely, you need the semicolons, because the shell has to think it's one line.
foo:
for i in *; \
do \
echo $i; \
done
The bad news is that if it's not a syntax error, you won't see an error message.
But wait. It gets worse. If I start in my home directory, and make with this Makefile:
default:
cd /tmp/foo.d
rm -rf *
One subshell will do the cd, but the rm -rf will be done in my home directory, by a separate subshell.
$ cd
$ make
/tmp
/home/jsh
$ ls
$ # Um. All my files are gone.
I should have written it like this:
default:
cd /tmp/foo.d; rm -rf *
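A slightly more defensive variant -- my own habit, not something from the original mistake -- uses && so the rm never runs at all if the cd fails:
default:
	cd /tmp/foo.d && rm -rf *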
I make this kind of mistake in crontabs, too. Enumerating the times I've made these was a primary motivation for Georg Cantor's invention of transfinite cardinals.
Thursday, April 17, 2008
${VAR:-}: Mystery Solved
I was mystified, yesterday, to find that nearly 5% of the lines in /etc/init.d/functions, in my Fedora 8 distro, had a construct that looked like this: ${VAR:-}.
I use ${VAR:-value} a lot; it provides a default value if $VAR is empty or unset. But without a value? "If it's unset or empty, make it empty."
Huh?
I posted to the Boulder Linux Users Group mailing list, and Sean Reifschneider, of tummy.com, jumped in promptly, with the answer.
If you run the script under bash -u (or with set -u), unset variables cause the script to fail. This construct ensures that the variables will be set, though to null, so the script completes.
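Here's the difference in action (NOT_SET is just a made-up, unset variable):
$ set -u
$ echo "$NOT_SET"
bash: NOT_SET: unbound variable
$ echo "${NOT_SET:-}"

$
The first echo is fatal under set -u; the second just prints an empty line and the script keeps going.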
The sound you hear is my palm smacking my forehead.
(Thanks, Sean!)
Wednesday, April 16, 2008
Things I Forget: Loop Variables Are Global, Persistent
In the shell, loop variables, like diamonds, are forever.
Even though I "just meant 'for this loop,'" the shell thinks I really wanted it. This lies in the large class of "things I always forget."
$ for i in {0..5}
> do
>   echo $i
> done
0
1
2
3
4
5
$ echo $i
5
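One way around it, if I cared, is to run the loop in a subshell, so the variable dies with it (a sketch):
$ unset i
$ ( for i in {0..5}; do echo $i; done )
0
1
2
3
4
5
$ echo $i

$
Nothing: the subshell's i never escapes.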
Tuesday, April 15, 2008
Setting Up a git Server
I set up my first git server with little trouble, yesterday.
There is no single, reasonable book on git, but there's a wiki. The design and implementation are more complicated than SVN or CVS, but it looks usable, and it's the sine qua non of kernel source code management.
In theory, my users could do it all themselves, since git is distributed. In practice, they can't and won't.
If they need it, it goes up. And now, they need it.
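For reference, the bare-bones version of such a setup is only a few commands (a sketch, not necessarily what I did; the host name and path are made up):
$ ssh gitbox 'mkdir -p /srv/git/project.git && cd /srv/git/project.git && git --bare init'
$ git remote add origin gitbox:/srv/git/project.git
$ git push origin master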
Monday, April 14, 2008
Setting the Default Printer
People at work come to me with their printouts. I have CUPS broadcast my printers -- understand, everyone actually shares the same printer, so this is just my printer setup -- and, until they configure their systems, mine are the ones they get when they print.
Their jobs come out with my banner page, which says, "Please bring to Jeff Haemer."
CUPS is still a little confused about setting default printers from the GUI, so even after they set up their own printers, they often get mine by default.
The fix is lpoptions -d, or a direct hand-edit of the file ~/.cups/lpoptions:
$ cat ~/.cups/lpoptions
Default Phaser_6120

The default printer is customizable, per-user, and comes up first in all printing menus.
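So, for example, to set their own printer as the default (the printer name here is just mine, as an illustration):
$ lpoptions -d Phaser_6120
$ lpstat -d
system default destination: Phaser_6120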
Friday, April 11, 2008
Jordan Crouse
OLPC fans and curiosity seekers turned out for Jordan Crouse at the Boulder Linux Users Group meeting last night. (The guy lounging next to an OLPC is Neal McBurnett.)
Jordan skipped right past "the children," Sugar, and other things you read about in the press, or watch on YouTube. He went right for the throat; his third overhead was a picture of the board. The rest of the talk was things like BIOSes (or the lack thereof) and power management.
The office he works for, in Fort Collins, is AMD's Center for Excellence for coreboot, the open-source Linux bootloader for x86 architectures: a good resource to know about.
Thursday, April 10, 2008
Mirroring Trees: rsync --archive vs. cp -a
I did a quick, informal test, and it looks like cp -a is actually faster than rsync --archive for copying hierarchies.
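By "quick, informal" I mean something on the order of this (a sketch -- the directory names are placeholders, and the numbers will depend on your disk and cache):
$ time cp -a src/ dst-cp/
$ time rsync --archive src/ dst-rsync/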
Even this works fine:
$ cd /dev
$ sudo cp -a zero foo
[sudo] password for jsh:
$ ls -l zero foo
crw-rw-rw- 1 root root 1, 5 2008-03-20 20:05 foo
crw-rw-rw- 1 root root 1, 5 2008-03-20 20:05 zero
$ head /dev/foo | od -c
0000000 \0 \0 \0 \0 \0 \0 \0 \0 \0 \0 \0 \0 \0 \0 \0 \0
*

Go figure. It's a long way from the old tar piped to tar.
Wednesday, April 9, 2008
Playing With My Router
My web access was particularly slow yesterday morning, so I took a look at my router. My neighbors were eating my bandwidth.
It's okay by me if my neighbors use my router -- I'm paying a flat rate -- but when it starts to slow me down, I guess I should do something.
I set up an access policy to block the offending box from the internet, and timed out the DHCP leases.
Last night, same thing. Half the boxes on my LAN were my neighbors. Okay, new policy: only our boxes can access the internet.
This morning, there was a neighbor's box on my LAN even though it can't get on the net. At 4:30 in the morning.
Time for bruter force. I added WPA2. The test? It immediately knocked my iBook off, and I was able to log back in with the password. I'll look again tonight, just to reassure myself that I don't have a cracker problem.
Looking on the bright side, I'm more familiar with my router. A year ago, I'd never have messed with it. Now, I do it without trepidation, and even know which security scheme to pick.
Tuesday, April 8, 2008
Firefox Day: Bookmark Keywords and Wedged Add-Ons
It was a Firefox day.
"Bookmark keywords" are convenient, little aliases that substitute their arguments for %s in the URL line. When I get tired of typing, I set them up. I typically use them for searches, and they come and go based on what I use frequently. They're more convenient for me than adding stuff to the search bar.
Yesterday, I cleaned up, and today I have two: Wikipedia, and a search for whether the public library has a book on the shelf that I want.
To set one up, I drill down to the page that I'm looking for, then do this:
I solved this by going into safe mode, invoking firefox --safe-mode, and removing/updating them there. Why's this work? I dunno, but it does. I got back up, re-started firefox the normal way, and all was well again.
"Bookmark keywords" are convenient, little aliases that substitute their arguments for %s in the URL line. When I get tired of typing, I set them up. I typically use them for searches, and they come and go based on what I use frequently. They're more convenient for me than adding stuff to the search bar.
Yesterday, I cleaned up, and today I have two: Wikipedia, and a search for whether the public library has a book on the shelf that I want.
To set one up, I drill down to the page that I'm looking for, then do this:
- ^L^C -- to capture the URL
- Bookmarks->Organize bookmarks->New Bookmark->Location->^V -- to paste it into the right place.
- Edit the URL to replace the specific things I was looking for with the parameter %s
- Type in a Name and Keyword
- Use them. For Wikipedia, I type wp in the URL bar; for library searches, I type bpl.
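Here, roughly, is what my Wikipedia one ends up looking like (the URL is just Wikipedia's standard search URL; yours may differ):

Name: Wikipedia
Location: http://en.wikipedia.org/wiki/Special:Search?search=%s
Keyword: wp

Typing wp coreboot in the location bar then searches Wikipedia for coreboot.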
The day's other problem was wedged add-ons. I solved this by going into safe mode, invoking firefox --safe-mode, and removing/updating them there. Why does this work? I dunno, but it does. I got back up, re-started firefox the normal way, and all was well again.
Monday, April 7, 2008
Even Nurseries Need Spreadsheets
I went to Harlequin Gardens yesterday, to buy an American plum and a gooseberry. (I ended up getting a tastiberry instead.) I just found Harlequin on the web, looking for a local nursery that specializes in things that grow here in our dry climate.
We rushed to get there before it closed, at 5:00, and ran into Eve Reshetnik. I hadn't seen Eve for years, but I've known her for decades -- she's a fine concertina player and one of my favorite singers.
As it turns out, Eve and her husband own Harlequin Gardens.
Eve was locked in a struggle with her computer, trying to figure out how to freeze row and column headers in her spreadsheet, so they wouldn't disappear when she scrolled to see data that wasn't on the screen.
I said, "Kristina just showed me how to do that. Have her show you." They disappeared into Eve's office while I bought plants. A few minutes later, they emerged, Eve announcing, "She's a genius!"
Friday, April 4, 2008
Setting Up a Yum Repository
My desktops and laptop are Ubuntu boxes, but I'm working on manufacturing a Fedora-8 based product, so I've spent a couple of weeks at work learning about various aspects of the Red Hat Package Manager (RPM) and the Yellow Dog Updater (Yum).
Documentation is sometimes spotty, sometimes out-of-date, and sometimes absent.
Here, though, is quite a nice post on how to set up a local yum repository.
It's missing a createrepo in the step on how to set up an updates directory, the directory that mirrors Fedora's new and updated packages. However, that step's completely analogous to the earlier step for the Everything directory, which holds all the RPMs from the original distro, and the error message I got when I left it out was helpful.
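(If you're following along, the missing step is just a createrepo run against that directory -- the path here is a placeholder:)
$ createrepo /var/www/repo/fedora/8/updates/i386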
Cron Doesn't Read Minds: Customizing My Crontab
Some mistakes I make over and over again.
In part, I automate so I can fix stuff once -- so I don't have to smack my forehead and say, "I knew that."
I (all too) often forget that cron jobs don't have my environment, and may not even have my shell or my email address.
I solve this with a template crontab header. My crontabs start like this:
$ crontab -l
## my crontab
SHELL=/bin/bash
PATH=/bin:/usr/bin:~/bin
BASH_ENV=/home/jsh/.bash_env
MAILTO=jeffrey.haemer+crontab@gmail.com
# minute (0-59)
# | hour (0-23)
# | | day of the month (1-31)
# | | | month of the year (jan-dec)
# | | | | day of the week (sun-sat, with 0=sun)
# | | | | | command
# | | | | | |
#
0 23 * * * crontab -l > ~/bin/crontab.txt 2>/dev/null
...
The column indicators are good reminders and visual indicators of what goes where.

Here's what the variable settings give me:
- $SHELL ensures I won't use cron's default: /bin/sh
- $PATH means I won't accidentally get command versions I don't want.
- My .bash_env is typically a link to .bashrc. Bash sources $BASH_ENV at startup when it's a non-interactive, non-login shell, so I get all the stuff I've come to assume.
- $MAILTO says where to send me error messages. I use gmail's plus addressing so I can search for my cron-job errors when I have to. I put this into root's crontabs on my boxes, too, so errors won't go to root's mailbox, which I never read.
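By the way, a quick way to see just how bare cron's default environment really is: drop in a throwaway entry like this, let it fire once, look at the output, then take it back out.
* * * * * env > /tmp/cron-env.txt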
Thursday, April 3, 2008
Pipelines Create Subshells: A Demo
Pipelines create subshells. This example shows that each end of the pipeline is in a subshell:
$ x=42
$ x=6 | x=9
$ echo $x
42
If you want to be fancier:

$ x=42
$ { echo $x>/dev/tty; x=6; } | { echo $x>/dev/tty; x=9; }
42
42
$ echo $x
42
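The place this bites people most often is the read-a-file-in-a-pipeline loop (a sketch):
$ count=0
$ cat /etc/passwd | while read line; do count=$((count+1)); done
$ echo $count
0
The while loop runs in a subshell, so its count never makes it back. Feed the loop with a redirection instead -- while read line; do ...; done < /etc/passwd -- and the subshell, and the surprise, go away.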
Tuesday, April 1, 2008
I Shoulda Stood in Bed
Some days, I can't win for losing.
Yesterday, a colleague who was doing a new installation of Fedora 8 came to ask me for help setting up his printing. Nothing else I was doing was working, so I thought, "Sure. At least I can do that. How hard can CUPS be?"
After struggling through SELinux, adding an unsupported printer, and working around the vagaries of naming and specifying URIs, I hand-hacked the PPD, and CUPS was finally willing to print a test page without complaint. We rushed to the printer in eager anticipation.
It was out of magenta toner.
The job's still sitting in the print queue, waiting for its chance to create a paper jam.