cities. physics. food. environment. fatherhood.


Backing up the blogs

Although this blog has been mostly dormant for quite some time now, my two other blogs–the Matthew Picture of the Day and the Peter Picture of the Day–have been plugging along, barely missing a beat. I do have all the photos on my home computer, of course, which I back up two ways (continuously via Time Machine and also to CD-ROM). But I’ve been a bit cavalier about backing up the blogs themselves. I’ve periodically done manual backups before major upgrades, but I’ve sort of naively trusted Bluehost not to lose my content. A year or so ago, Bluehost had a substantial hiccup and MPOD was offline for several hours. They did rebuild everything, and all was well, but the episode gave me pause. Also, I remember Photopoint.

So it was time to get serious about backing up the blogs. These were my criteria:

  1. The backup must be stored by a different hosting company than that which hosts the blog. After all, I want to be able to recover if Bluehost has a catastrophic failure.
  2. The entire site must be backed up. As of now, the MPOD is at about 800MB and the PPOD is about 500MB (until the debut of the PPOD, I uploaded only the small-size photos to the MPOD, but now I upload the full-sized originals to both, which is why the MPOD archive is not much larger). This is particularly important because PhotoQ, the photoblog plugin that I use, stores the photos in two directories of its own creation within wp-content. So the backup solution must be able to see and archive these.
  3. The backup must be automatic. I don’t want to be required to start each backup and I especially don’t want to have to use my home computer as an intermediary in transferring the archived files.

With these criteria, I evaluated four automatic backup plugins. Or rather, I installed and tried three, all of which failed in one way or another, before I found one that worked. My present solution is XCloner, with the archives stored on Amazon S3. Here are my experiences with the four plugins:

The first one I used was Automatic WordPress Backup. I do have to give this one credit: it was the first one I heard about that used Amazon’s S3 service to store the data. So it’s a pioneer of sorts. It has its own website and a slick demo video. Like many folks, though, I’d start a manual backup; it would finish, but nothing would have happened and there would be nothing stored on S3. Eventually I found the log file and discovered that it was running out of memory. This problem I could actually fix: I had inadvertently left about 10,000 spam comments in the WordPress database, and clearing those out allowed the plugin to finish. But seriously, 10,000 comments is not really all that many for a high-volume blog. So the first problem with this plugin was that it didn’t look like it could scale. (Note further that the flashy promotional videos and postings on the blog were not done by the programmer, and that the programmer is now working on a different plugin with similar functionality.)
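If you find yourself with the same out-of-memory problem, the spam cleanup can be done directly in MySQL. This is a minimal sketch, assuming the default wp_ table prefix (adjust it if your installation uses a different one), and you should back up the database before running the DELETE:

    -- WordPress marks spam with comment_approved = 'spam'; count them first
    SELECT COUNT(*) FROM wp_comments WHERE comment_approved = 'spam';

    -- then delete them outright
    DELETE FROM wp_comments WHERE comment_approved = 'spam';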

The second problem was that AWB backed up the database and a few fixed directories, but not arbitrary directories introduced by other plugins. So only the database structure of the photoblog was backed up, not the photos.

And it also seemed that, despite all the flashiness of the website, it was quickly becoming abandonware. There hasn’t been a new version since August 2010. I had posted a question on their product blog asking if there was a way to back up other directories, but it was forever stuck in moderation and has never appeared. I think they had expected more people to buy premium support (which is no longer for sale). They were also upset that WordPress’s rules prohibited them from automatically inserting a link to themselves in the footer of each of your blog pages. And they insisted that, instead of being controlled through the ‘plugins’ or ‘settings’ menus in WordPress, they needed to add their own menu to the main sidebar. And I think it lost compatibility somewhere around WordPress 3.2.

The next plugin I tried was wp Time Machine. It has a very simple interface and does back up all files. It’s designed so that you just type in the minimal information needed and then click “generate archive,” so it’s a little counterintuitive that you first need to click ‘Show Plugin Options’ to switch from the default Dropbox to Amazon S3. The instructions are found within the plugin interface, but it sends you to a blog post for the details of setting up the cron job needed to run the backups automatically. It also automatically includes, in the files it saves on Amazon S3, an instructions file for recovering after a crash.

I initially had file compression turned on while making the archives, and when adding the photo files to the archive, it would ingloriously quit running after about 330MB, leaving an incomplete and corrupt archive file. (Turns out, JPEG files are already pretty well compressed, and ZIP won’t make them much smaller.) With compression turned off, I was able to get the PPOD site successfully backed up–I saw the full file archives appear in the Amazon S3 console–but the plugin page still indicated that the backup was in progress. In fact, several times I left it for hours thinking it was still working on the backup, because that’s what the screen indicated, while in fact it had hung somewhere. Trying both PPOD and MPOD, I variously got failures to finish building the archive file, a finished archive file that failed to transfer to Amazon S3, and successful transfers to S3 that failed to indicate completion in the plugin dashboard. Perhaps it runs better via cron job, but this plugin seemed to have trouble with large numbers of files.

The third plugin was pressBackup. They sell their own cloud storage service, but their plugin is free to use with Amazon S3. It uses a wizard to walk you through the setup process in a completely intuitive manner. It just asks you what service you’re using, how often you want to back up, and how many previous copies you want to keep stored. But here’s the problem: it lets you back up the database, themes, plugins, and uploads–not arbitrary files and directories. So, like Automatic WordPress Backup, it wouldn’t save the photos. Also, its easy configuration is perhaps too easy: although you can have a daily backup, there’s no way to set the time at which the backup will execute; it will always run at the same time of day as your first backup. I imagine that servers are more idle and bandwidth is more plentiful in the middle of the night, which is when I would want to schedule backups. But at least you don’t need to configure your own cron jobs.

Which brings me to XCloner, which I’m currently using. It is tremendously flexible, and the configuration is both more involved and less obvious than with the other plugins. It has a PDF manual, which seems to be a few releases behind. It did take a bit of work to get up and running–it was far from obvious which configuration options needed to be set and which didn’t–but once I figured it out, configuring it for subsequent blogs was completely straightforward.

A few notes: you need to create your own directory structure, preferably in the blog’s top-level directory, for it to store the backup files (which will then get transferred elsewhere). In order to transfer to Amazon S3, it needs to be run from a cron job; a manual backup will only create and store the backup file. There are quite a few settings that one doesn’t need to bother with (for example, cron configuration names). And the plugin itself has a username and password that it wants you to configure right away–I suppose this is useful for a multi-author blog. If you want to do things like a nightly database-only backup stored locally plus a weekly backup of the whole site transferred to an FTP site, XCloner can do that. It didn’t have any trouble with the large number of files I needed to back up, but if it did, it has an incremental backup option that works on a smaller number of files at a time.
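For those who haven’t set one up before, the cron job amounts to a single crontab line. The sketch below is hypothetical: the path to XCloner’s cron script and the location of the PHP binary vary by installation, so take the exact command from the plugin’s manual.

    # Run the backup nightly at 3:15am, when the server is likely idle.
    # The script path here is illustrative, not from my actual setup.
    15 3 * * * /usr/bin/php /home/example/public_html/wp-content/plugins/xcloner-backup-and-restore/cloner.cron.php > /dev/null 2>&1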

So in the end, there are no circumstances in which I’d recommend Automatic WordPress Backup. If you don’t want to mess with cron jobs, and the limitations of pressBackup don’t deter you, I’d recommend it first. If you need more flexibility–say, over when the backup executes–and the size of your site doesn’t trip up wp Time Machine, I’d recommend that. And if you hate having things configured for you and want lots of settings available for tweaking–or if the other options won’t work for you–then XCloner will certainly get the job done, once you figure it out.

November 13, 2011   No Comments

Diagnostic post

This is a second diagnostic post, to see how the WordBooker plugin settings work. It’s supposed to show up as a status update on my Facebook.

November 11, 2011   No Comments

Blogging again?

Yes, this blog has been idle for many months now, but that doesn’t mean I don’t have dozens of potential blog posts milling around in my head. This is my second diagnostic post, to see if the WordBooker plugin works; it’s supposed to post my new blog posts to my Facebook wall. (The first diagnostic didn’t post to Facebook, so it’s been deleted. Sorry if that interfered with your RSS reader.)

November 11, 2011   No Comments

Announcing the Peter Picture of the Day

Peter Quinn Metcalf was born January 27th, 2011, at 8:56pm. I’m pleased to announce the Peter Picture of the Day, a photoblog featuring pictures of Peter. It’s modeled after his brother’s photoblog, the Matthew Picture of the Day, which has been running for three and a half years. Although the blogs look alike, there are a few subtle changes to the PPOD, some of which I hope to incorporate into the MPOD.

Most notably, the full-scale picture is now available: simply click on a picture and you will be able to download the original. This should make for much better prints for those who wish to print out PPOD pictures. Warning: if uncropped, these pictures are about 3MB apiece. That seemed much too large back when I established the MPOD, and so it has used only the display size, 720 pixels wide. But these days, a 3MB image doesn’t seem so unreasonable.

I’m also now posting the date that each photograph was taken (a feature I should be able to add soon to MPOD), and I’m putting the comments section on the main page, instead of in a pop-up window. As with MPOD, each commenter’s first comment must be approved by the moderator (me), to avoid spam postings.

I’ve loaded a couple of pictures to start things off, after which there should be one new picture each day, posted at about 6am Eastern Time.

January 29, 2011   No Comments

The New Sierra Club Executive Director

A recent (January 30, 2010) episode of Sierra Club Radio begins with an interview of the new executive director, Michael Brune, which was the first I’d heard about him in any detail. I found it quite encouraging, in part because of what Mr. Brune said, but more refreshingly because of his tone. The outgoing executive director, Carl Pope, periodically recorded commentaries for Sierra Club Radio, which I never really cared for. Mr. Pope’s tone was brash, smug, and confrontational, and his messages were needlessly political: hyping up small or incidental or Pyrrhic victories, spinning away the setbacks, never allowing that an issue might have subtleties and complications. There were good guys and villains, and the good guys were always winning, most likely thanks to the Sierra Club and its allies. Carl Pope’s commentaries always sounded like a slightly disingenuous pitch. Mr. Brune, by contrast, sounds very much like a thoughtful person.

That I picked up on this contrast is perhaps a bit ironic, as Mr. Brune as an activist was known for a rather confrontational style, as elucidated in a Living on Earth interview, a KQED Forum interview, and in a Grist article. Prior to the Sierra Club, Mr. Brune was executive director of Rainforest Action Network, where his most notorious stunt involved the campaign to get Home Depot to stop buying wood from endangered forests. Sympathetic Home Depot employees contacted Mr. Brune and clued him in to the code for the Home Depot intercom system, by which Rainforest Action Network activists could go into any Home Depot, find the intercom stations, and broadcast messages storewide about the source of the wood products for sale. This campaign worked, although I’m not really sure this is the sort of thing I’d like the Sierra Club to start doing more of.

Mr. Brune made what I think is a salient and subtle point in praising the Sierra Club for “evolving” over the past decade or so, of doing a good job of “holding onto its roots”–protecting wild places and the like–but at the same time “being responsive to the great threat of climate change.” This phrasing speaks to me–it signals an understanding that the environmental challenges we face, and our responses to them, are not identical to those of twenty or thirty years ago. Urban environmentalists often note a disconnect with what we might call “traditional” environmentalism, manifested as an insistence on saving every tree, on opposing every development, and on characterizing the principals behind any development as greedy. This holds even when the trees that would have to be cleared for a development would enable its future residents to live in ways–without cars, for example–that could drastically reduce their overall environmental impact, compared with what they might end up with if the traditional environmentalist’s protests succeeded and the developers chose to build instead on some further-flung plot of land less dear to said environmentalists. I’m oversimplifying the issue here, of course, and I don’t want to presume that when Mr. Brune says the Club is evolving he necessarily means it will evolve exactly the way I want it to. But it does seem to me that Mr. Brune is acknowledging the need to look at environmental challenges in a different way than has been considered traditional.

Two years ago, he wrote Coming Clean–Breaking America’s Addiction to Coal and Oil, published by Sierra Club Books. That energy was the topic on the mind of someone employed to save the rain forests is, itself, encouraging. He was interviewed on Sierra Club Radio on September 6, 2008, upon the release of the book, and this earlier interview is perhaps more insightful than the current one. He struck a thoughtful and diplomatic tone, giving respectfully detailed answers on complex topics. He discussed, at some length, the problems with biofuels, beginning with a remark that the idea of growing your own fuels is, no doubt, very alluring. He concludes that “biofuels can only at best be part of the solution,” further noting that if we were to turn every single last ear of corn produced in the United States into ethanol, it would provide a scant 12% of our fuel needs. He resists the temptation to simply classify biofuels as “good” or “bad,” and he uses a quantitative figure in a proper and meaningful way, which is more than can be said of much of what passes for environmental discourse these days. The urbanist will also note that he understands that biofuels are an attempt at a solution to what is in many respects the wrong question–instead of asking how we’re going to keep fueling our cars in a post-carbon age, we should also be asking whether we need so many cars to begin with. Brune mentions, several times, that “we need to promote ways of transportation that are not centered on the automobile.” When asked for ways in which individuals could get involved in breaking our oil addiction, he suggested getting involved with your local bicycle advocacy organization, to get more bike lanes and to encourage office buildings to offer bicycle parking.

It also appears–although not having read his book, I’m not certain–that he wants to play down the role of individuals greening their own lives and instead look toward action for large-scale, widespread institutional change. In the 2008 interview, the host specifically asks him about a claim in the book that individual actions, like turning down your thermostat and changing out your lightbulbs, won’t be sufficient to solve the climate change problem. And although he’s not as polemical as Mike Tidwell’s Washington Post op-ed, the sentiment is the same: to make the changes that matter, we need large-scale, collective action. And Brune makes clear in the current interview that he believes there is no organization better suited to lead this action than the Sierra Club.

Brune, I couldn’t help but notice, is only two years older than I am, and is probably as young as one can be while having enough experience to be considered a reasonable candidate for executive director of an organization with the size and stature of the Sierra Club. Carl Pope, I gather, is a few years younger than my parents. So there really is a transition here, a passing of the baton from one generation to the next. I’m optimistic about Brune, and will watch carefully to see where the Club goes.

February 8, 2010   1 Comment

Making sense of the March Meeting

This year’s March Meeting will be in Portland, Oregon. (See previous blog posts from 2009 and 2008, also here.) The March Meeting is the largest of the meetings put on by the American Physical Society; this year there will be 581 sessions and 818 invited speakers. Most time blocks–from Monday morning through Thursday mid-day–will have a full program of 42 parallel sessions. This is slightly larger than last year, when most time blocks had 41 parallel sessions, with a total of 562 sessions. There were 832 invited speakers last year, so this year has slightly fewer. As each session can have up to 15 contributed talks (10 minutes each, with 2 minutes for questions and changing speakers) or 5 invited talks (30 minutes each, with 6 minutes for questions and changing speakers), there could be almost 6800 talks all told, but most sessions aren’t completely programmed. This year, I am not giving a talk.

With such a mass of talks going on, planning your time at the conference and deciding how long to stay take some effort. In the past, the abstracts for all the talks, and for the 2000 or so posters that the meeting has each year, were printed in two volumes that resembled phone books, plus a pocket-sized book of session titles. These days, one gets a smaller book that lists only the titles of the talks, and of course the entire program is also available online. Nevertheless, over the years I’ve found the layout of the program to be a bit wanting, and so I put together some scripts that parse the schedule and author information and print it out in a form that I find easier to work with.

My scripts and their outputs have evolved over several years, and currently they produce three files. They take as input the Epitome, which is a chronological list of sessions, and the list of invited speakers. At present, both of these must be cut-and-pasted from the meeting website into text files for my program to read. (Maybe next year I’ll have them automatically grab the files from the APS website.)
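For flavor, here is a heavily simplified sketch, in Tcl, of the kind of parsing involved. It is not the actual march2tex.tcl, and the assumed line format (a session code like A14 followed by the session title) is an illustration; the real Epitome carries rooms, times, and chairs as well:

    # Simplified sketch: pull session codes and titles from a pasted
    # Epitome text file. Assumes each session line begins with a code
    # like "A14"; the real file has more fields to deal with.
    set fh [open "epitome.txt" r]
    set sessions {}
    while {[gets $fh line] >= 0} {
        if {[regexp {^([A-Z]\d+)[.:]?\s+(.+)$} $line -> code title]} {
            lappend sessions [list $code $title]
        }
    }
    close $fh

    # Emit one LaTeX table row per session (special characters in
    # titles, like & and %, would need escaping in a real script).
    set out [open "epitome_list.tex" w]
    foreach s $sessions {
        lassign $s code title
        puts $out [format {\texttt{%s} & %s \\} $code $title]
    }
    close $out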

The first output file is a version of the Epitome more suitable for browsing on a printed page than the official APS materials. It skips the non-talk sessions (like receptions and unit business meetings) and makes sure all sessions of each time block are together on a single page.

The second file serves to give a sense of the structure of the March Meeting: it is a grid of time blocks versus session numbers, with symbols indicating the number of invited talks in each session. It also has a list of room numbers associated with each session number: for the most part, all sessions of a certain number (such as A14, B14, D14, and so on) will be in the same room, but not always.
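To make that concrete, here is a minimal LaTeX sketch of the grid idea; the block labels, session numbers, and invited-talk counts below are invented for illustration:

    % Rows are time blocks, columns are session numbers; each entry
    % gives the number of invited talks in that session (data invented).
    \begin{tabular}{l|ccc}
             & 13 & 14 & 15 \\ \hline
    Block A  & 0  & 5  & 1  \\
    Block B  & 2  & 0  & 0  \\
    Block D  & 0  & 5  & 3  \\
    \end{tabular}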

The third file is a list of invited talks, sorted by session instead of by author last name. Because of the large number of parallel sessions and the high likelihood of schedule conflicts, I think it makes sense to look time block by time block.

Since this year I’ve actually got these files produced well before the meeting, I’m posting them here in case anyone else should like to use them too.

Here are the three PDF files I’ve generated:

Epitome_Mar10

Grid_Mar10

Invited_Mar10

If you like the information but want to fiddle with the formatting, here are the .tex files that generated the PDFs; they need to be run through LaTeX. Because of a quirk in WordPress, they’re all saved as .tex.txt. The APS online information already uses TeX formatting for accent marks in speaker names, and for super- and subscripts in talk titles. (There are occasional errors in APS’s TeX formatting–this year, the title of Philip Anderson’s talk is missing the math mode $ characters surrounding the ^3 superscript command. Despite my efforts to automate everything with these scripts, fixes like this still must be done by hand; an example follows the file list below.)

Epitome_Mar10.tex

Grid_Mar10.tex

Invited_Mar10.tex
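And here is the kind of by-hand fix that last parenthetical describes; the title fragment below is hypothetical, not Anderson’s actual title:

    % As exported: the superscript command sits outside math mode,
    % so LaTeX stops with a "Missing $ inserted" error.
    Superfluid He^3 ...

    % Fixed by hand: wrap the superscript in math mode.
    Superfluid He$^3$ ...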

Finally, here is the Tcl script that I use to parse the files and write the .tex files. It’s not very good code, having been mucked around with once a year for a few years and in general cobbled together from earlier scripts. It works, provided the settings file is appropriately edited, on March Meeting files back to 2006, when the invited speaker list was first published online. The bits that work on the Epitome also work on 2005 files, but the Epitome format in 2004 and earlier years was different. As with the .tex files, I’ve needed to upload these as .tcl.txt files here.

march2tex.tcl

march10_settings.tcl

In a future post, I hope to use the results of the scripts, particularly the grid, to analyze ways in which the March Meeting has changed over the years.

January 31, 2010   No Comments

Travels with our toddler

We recently took Matthew on his first overnight train trip; regular viewers of the Matthew Picture of the Day can expect a couple of shots from on board. We took the Capitol Limited all the way to and from Chicago, in a bedroom in a sleeping car. As national network trains go, this is quite a convenient one–although it takes 17 hours to travel the 780 rail miles, via Pittsburgh and Cleveland, most of that is at night, and once you factor in time to eat dinner and breakfast and time to get ready for bed and to get dressed, there’s not that much idle time left. Matthew did well, and the train again proved to be a civilized and relaxing way to travel. He’s old enough to get some fascination from looking out the train window, which is quite an improvement over his previous trip, when he was one and we went to New York to buy my Brompton. So all told, Matthew now has 2,012 Amtrak miles.

Matthew, though, has logged more mileage in the air than by any other means: to date, 29,063 miles in 26 segments. Most of this has gone well. We’ve always bought him a seat, even when he was young enough to travel as a “lap child.” In the past, it was common to travel with a young child as a “lap child” and then use an empty seat for him while aboard the airplane, but in recent years there is no such thing as an empty seat, and lap children must almost always actually be carried on a grown-up’s lap for the whole flight. My advice, then, is not to count on there being an empty seat, but rather to count on there not being one, and, if you can at all afford it, to buy the seat for the child.


January 24, 2010   No Comments

Our inadvertent pumpkin patch

Although you wouldn’t be able to tell by looking at it–in fact, you might be tempted to conclude the opposite–I really do want there to be a recognizable garden someday in what can only honestly be called our yard. I dream about growing flowers and vegetables, and a rain garden and maybe blueberries and an apple tree. But, for a variety of reasons, I haven’t done anything and can barely keep up with mowing and controlling the weeds.

But it turns out you can grow things using the lackadaisical approach, and in our case, it’s pumpkins.

pumpkin vine in back yard

This is how the pumpkin plant looked in September.

The truly amazing bit is that I didn’t ever plant any pumpkin seeds. The pumpkin vine grew out of a side vent in our compost bin, presumably from a pumpkin seed that germinated instead of decomposing while inside the bin.

pumpkin vine growing out of compost bin

Since I didn’t plant these, I don’t actually know which variety of pumpkin they are, but I presume they are the inedible jack-o’-lantern type from Halloween 2008. As with everything else in the yard, I didn’t tend to these, so they didn’t grow nearly as large as a proper jack-o’-lantern would. Indeed, the pumpkins feel solid, not hollow.

I “harvested” them recently, although too late for them to be a part of our Halloween decorations. But, for the record, here is our garden output for 2009:

pumpkin harvest 2009

November 10, 2009   2 Comments

Confounded smoke alarms

My electrician, who is safety-conscious above all else, has been bugging me for years now about smoke alarms. Sure, I have several battery-powered smoke alarms up, but from a safety improvement per dollar spent perspective, one really wants smoke alarms that are:

  • hard-wired, with
  • battery backup, and
  • interconnected

The batteries in battery-powered smoke alarms will run out. They do chirp to let us know it’s time to change the battery, but more often than not I won’t have a spare battery handy, or I won’t have a step stool nearby, or it will be the middle of the night, so instead of going back on the ceiling with a new battery like it’s supposed to, the alarm will sit around on a counter, battery-less, sometimes for weeks. Hard-wired smoke alarms solve the dead battery problem because they draw their power from the house electrical wiring. As it turns out, electrical fires that disrupt the power before smoke can be detected are really rare, and our power is pretty reliable, so the risk that the power is off when the alarm needs to sound is quite small–smaller than the risk that your battery-powered alarm will be sitting, battery-less, on the counter. And most hard-wired alarms also have battery backup, so you’re covered during power outages, too.

There are two smoke-detection technologies: ionization and photoelectric. Ionization sensors do well with small smoke particles from fast-burning fires, while photoelectric sensors do better with large smoke particles from smoldering fires. Most safety recommendations (including Consumer Reports) are reluctant to specify one as the better choice, and recommend both. So add to our wish list:

  • dual-sensor

Interconnection of smoke alarms means that when one alarm goes off, all of them sound. So if there’s a fire in the basement while you’re asleep, the alarm in your second-floor bedroom will also go off, giving you much more time to escape than waiting either for enough smoke to set off a second-floor alarm or for you to hear the far-away alarm. The interconnection is conventionally done with three-conductor wiring: all the smoke alarms need to be installed on the same circuit and the third wire is used as the alarm interconnection signal wire. This is easy in new construction but really hard to retrofit: getting a new circuit to the ceiling of every location for a smoke alarm would mean lots and lots of holes in the walls and ceilings.


June 7, 2009   1 Comment

To re-use plastic baggies

I get a fair number of yuppie housewares catalogs in the mail. I browse through them–I actually do like the style of much of their merchandise–but rarely do I actually buy anything. The catalogs want to sell you on the idea that simply buying a decorative plate will transform your whole dining room into something as stylish as the scene put together for the catalog shoot, and I understand that it won’t.

Of all the yuppie housewares catalogs, NapaStyle is one of the yuppiest, to the point of almost being a laughable self-parody. But I’m writing here about something I bought from them (a NapaStyle exclusive, even) that’s turned out to be quite a satisfying purchase: the Stemware & Plastic Baggie Dryer. I hate to throw away plastic Zip-Lock bags after just one use; far better to wash and re-use them. This device is a ring of eight wood rods that make excellent places to hang plastic baggies to dry.

Of course, one doesn’t need a drying rack to wash and re-use plastic baggies, but I wasn’t regularly doing so until I bought this one. The drying rack works very well for its task. It’s also a very unassuming product: it does not need its own box; it was simply placed inside the shipping box. It was not enclosed in a plastic bag, and it was not packed with custom-fit styrofoam. It was not tied to a piece of cardboard with twist-ties. It required no assembly. It came with no manual, no marketing survey disguised as a warranty card, and no safety warnings. It has no website. You can’t get on its email list for exciting product updates. It’s made almost entirely of wood. It was made in Canada.

I wish more products were like that.

May 11, 2009   3 Comments