1&1 hosting plans

I’ve been using 1&1 for web hosting for a very long time. Their reputation is mixed, but I’ve never had any huge problem with them. An occasional hiccup, but not that often really. I’m currently paying about $6.25 per month for my hosting plan. But I just got an email saying that they’re changing me over to their “1&1 Unlimited Plus” plan, which will cost me $11 per month. This supposedly includes an 8% discount off their normal rate, which I guess would make the normal rate $12.

Looking at their web site, it looks like new customers can get Unlimited Plus for $5/month for the first year, and $10/month after that. So I’m a little confused about how $11/month is a discounted rate. Maybe I’ll e-mail them about that. At any rate, it looks like the new plan might include a free SSL certificate, which I’m currently paying $50/year for, so that would mostly offset the price increase. (Of course, there are other ways of getting free SSL certificates these days, so I shouldn’t have to pay for SSL regardless.)
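Just to sanity-check my own arithmetic on this (using the prices above):

```ruby
# Checking the numbers: the monthly increase, annualized, versus the
# SSL savings, plus the "normal" rate implied by the 8% discount.
old_monthly = 6.25
new_monthly = 11.00
increase_per_year = (new_monthly - old_monthly) * 12
ssl_per_year = 50.00

puts increase_per_year                 # extra hosting cost per year
puts increase_per_year - ssl_per_year  # net change if SSL becomes free
puts new_monthly / 0.92                # implied undiscounted rate
```

So the increase is about $57/year, and a free SSL certificate claws back $50 of that.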

I don’t really have any intention to move off 1&1, but a price increase is always a motive to look around at alternatives.

more fun with Ruby

I’ve been making good progress through The Book of Ruby this week. I’ve continued to use the simple Ruby install on my ThinkPad, since it’s working fine, and I don’t yet need any of the fancy stuff; I’m still just working my way through language basics.
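For flavor, this is roughly the level I’m at: classes, attr_accessor, blocks and iterators. (The class and data here are just made up for the example.)

```ruby
# A basic class with auto-generated accessors.
class Album
  attr_accessor :title, :tracks

  def initialize(title, tracks)
    @title = title
    @tracks = tracks
  end
end

albums = [Album.new("Kind of Blue", 5), Album.new("A Love Supreme", 4)]

# Blocks and iterators -- the bread and butter of idiomatic Ruby.
total_tracks = albums.inject(0) { |sum, a| sum + a.tracks }
puts total_tracks
```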

But I stumbled across an interesting class on Coursera that was just starting up, called Web Application Architectures, which covers Ruby on Rails. Initially, I resisted the urge to sign up for it, since I don’t think I’m really ready to start messing with Rails yet, but my curiosity got the best of me, and I went ahead and registered for it.

So now I’m messing with the somewhat arcane process of setting up Rails. Initially, I looked at trying to get Rails to work with my existing Ruby install on the ThinkPad, but (to make a long story short) that didn’t work out. So I looked at a couple of other options for installing a Rails dev environment on Windows, including RailsInstaller and RailsFTW, but I had some problems with both of them and decided to go a different way, rather than try to resolve the issues. (Finding pages like this and this on Reddit pretty much convinced me that setting up Rails on a Windows machine was a bad idea.)

So I went ahead and installed Ubuntu 14.04 under VirtualBox and followed the instructions found here to set up Rails. That seems to have gone smoothly, but I won’t really know for sure until I’ve done some meaningful work. I’m still not entirely sure if I’m going to stick with it, or punt on the course for now and avoid the messy complications of Rails until I have a better grounding in basic Ruby, but I’m going to spend some more time on it this weekend and see how it goes.
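For what it’s worth, the first thing I did in the new VM was a quick sanity check along these lines (nothing here is specific to the particular guide I followed; it just reports what the environment actually has):

```ruby
# Report the basics of the freshly installed Ruby environment.
puts RUBY_VERSION
puts RUBY_PLATFORM
puts Gem::VERSION

# Confirm the rails gem is visible to this Ruby.
begin
  require "rails"
  puts Rails.version
rescue LoadError
  puts "rails gem not installed for this Ruby"
end
```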

Oh, and as a side note, it’s fun to be messing with Linux again. I haven’t really touched Linux in a while. Ubuntu was fairly easy to set up under VirtualBox, and it seems to be running fine. The desktop UI is attractive and reasonably fast, even in a VM with only 1.5 GB of RAM allocated to it. (I’ll probably have to bump that up to 2 GB if I get serious about the Rails stuff.)

Coherent

Coherent was a Unix clone operating system for PCs that was somewhat popular in the late 80s and early 90s. I have fond memories of buying a copy via mail order, probably from an ad in the back of Dr. Dobb’s, and probably for $99, and using it on my PC at home. In the days before Linux, Coherent was a great way for an individual with a modestly powered PC and a few bucks to spare to learn all about Unix. Coherent came with a huge manual. (An actual book, printed on paper. Not just a PDF or a bunch of text files.) And it was fairly well-written and well-organized. You could really learn a lot about Unix by reading through the introductory material in that book, then messing around with things on your PC, then going back to the book for reference.

Just recently, the sources and documentation for Coherent were published on the web, including that gigantic manual. I had held onto my copy of the manual for years after I’d stopped using Coherent, just because it was such a good general reference, but I finally threw it out some time ago. Well, now I have a nice PDF copy if I ever need to refer back to it again! I’m tempted to try and get Coherent running in a VM on my current PC, but it’s probably not worth the bother. It would be kind of fun though.

geocoding experiments

I wrote an initial blog post on Gisgraphy about a month ago. I wanted to write a follow-up, but hadn’t gotten around to it, what with all the other stuff I have going on. But I’m going to take a few minutes now and write something up.

The initial import process to get the Gisgraphy server up and running took about 250 hours. The documentation said it would take around 40 hours, but of course an estimate like that can’t account for the specific hardware of the server it’s installed on, or the environment it’s in. I’m guessing that, if I had more experience with AWS/EC2, I would have been better able to configure a machine properly for this project.

Once the import was complete, I started experimenting with the geocoding web service. I quickly discovered something that I’d overlooked when I was first testing against the author’s hosted web service. The geocoding web service takes a free-form address string as a parameter. It’s not set up to accept the parts of the address (street, city, state, zip) separately. It runs that string through an address parser, and here’s where we hit a problem. The address parser, while “part of the gisgraphy project,” is not actually open source. An installed instance of Gisgraphy, by default, calls out to the author’s web service site to do the address parsing. And, if you call it too often, you get locked out. At which point, you have to talk about licensing the DLL or JAR for it, or paying for access to it via web service.

Technically, the geocoder will work without the address parser, but I found that it returns largely useless results without it. For instance, it will happily return a result in California, given an address in New Jersey. I’m not entirely sure how the internal logic works, but it appears to just be doing a text search when it’s trying to geocode an un-parsed address, merely returning a location with a similar street name, for instance, regardless of where in the US it is.
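A toy illustration of the failure mode (made-up data, and this is not Gisgraphy’s actual algorithm — just my guess at roughly what a plain text match ends up doing):

```ruby
# Candidate records, as a naive text search might see them.
candidates = [
  { street: "Main St", city: "Trenton",     state: "NJ" },
  { street: "Main St", city: "Bakersfield", state: "CA" },
]

query = "100 Main St, Trenton, NJ 08601"

# With no parser, all you can really do is match text against the query...
hits = candidates.select { |c| query.include?(c[:street]) }

# ...and both records "match", so nothing stops the California result
# from coming back ahead of the New Jersey one.
puts hits.length
```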

While I don’t think the author is purposely running a bait-and-switch, I also don’t think he’s clear enough about the fact that the address parser isn’t part of the open source project, and that the geocoder is fairly useless without it. So, we shut down the EC2 instance for this and moved on to other things.

Specifically, we moved on to MapQuest Open, which I was going to write up here in this post, but I need to head out to work now, so maybe another time.

Gisgraphy

My boss stumbled across a project named Gisgraphy recently. A big part of what we do involves the need for geocoding. We have generally been using geocode.com for batch geocoding, but there’s a cost to that, and they only do US and Canada. There are many other geocoding services, but if you’re doing heavy volume, you’re generally excluded from free options, and the paid options can get expensive.

Gisgraphy is an open source project that you can set up on your own server. It will pull in data from freely-available sources, load it all into a local database, then allow you to use a REST web service to geocode addresses. A little testing with some US addresses leads me to believe that it’s generally accurate to street level, but not quite to house level. So, I’m not sure that we’ll want to use it for all of our geocoding, but it ought to be generally useful.
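Calling the service looks something like this. (The host, port, endpoint path, and parameter names here are from memory and the docs, so double-check them against your own install; a default instance listens on port 8080.)

```ruby
require "uri"
require "net/http"

# Build a geocoding request against a local Gisgraphy instance.
uri = URI("http://localhost:8080/geocoding/geocode")
uri.query = URI.encode_www_form(
  address: "100 Main St, Trenton, NJ 08601",  # one free-form string
  country: "US",
  format:  "json"
)
puts uri

# Uncomment to actually hit the server:
# response = Net::HTTP.get_response(uri)
# puts response.body
```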

We decided to set it up on an AWS EC2 instance. We started messing with EC2 VMs for another project, and it seemed like EC2 would be a good fit for this project too. I started out with a small instance Linux VM, but switched it to a medium instance, since the importer was really stressing the small instance. I will probably switch back to small after the import is done. That’s one nice thing about EC2: being able to mess with the horsepower available to your VM.

Gisgraphy uses several technologies that are outside my comfort zone. I’m primarily a Windows / .NET / SQL Server guy, with a reasonable amount of experience with Linux / MySQL / PHP. Gisgraphy runs on Linux (also on Windows, but it’s obviously more at home on Linux), so that’s ok. But it’s written in Java, and uses PostgreSQL as its back-end database. I have only very limited experience with Java and PostgreSQL. And, of course, I’m new to AWS/EC2 also.

So, setting this all up was a bit of a challenge. The instructions are ok, but somewhat out of date. I’m using Ubuntu 12.04 LTS on EC2, and many things aren’t found in the same places as they were under whatever Linux environment he based his instructions on. For the sake of anyone else who might need a little help getting the basic setup done under a recent version of Ubuntu, I thought I’d list out a few pointers, where I had to do things a bit differently than described in the Linux instructions:

  • Java: I installed Java like this: “sudo apt-get install openjdk-6-jdk openjdk-6-jre”.
  • JAVA_HOME should be /usr/lib/jvm/java-6-openjdk-i386/ or /usr/lib/jvm/java-6-openjdk-amd64/.
  • PostgreSQL: I installed the most recent versions of PostgreSQL and PostGIS like this: “sudo apt-get install postgresql postgresql-contrib postgis postgresql-9.1-postgis”.
  • Config files were in /etc/postgresql/9.1/main and data files were in /var/lib/postgresql/9.1/main.
  • PostGIS: In his instructions for configuring PostGIS, the “createlang” command wasn’t necessary.
  • The SQL scripts you need to run are /usr/share/postgresql/9.1/contrib/postgis-1.5/postgis.sql and spatial_ref_sys.sql.

That’s about it for now, I think. I want to write up another blog entry on Gisgraphy, once I’ve got it fully up & running. And there might be some value in a blog entry on EC2. But now I have to get back to finishing my laundry!

Windows 8, Mountain Lion, and Ubuntu 12

I have to do a 10pm web site rollout tonight, so I find myself at home with some time to kill. I haven’t gotten much of a chance to play around with Windows 8, so I decided to download the 90-day eval, and install it on my old laptop. I have the ISO downloaded and ready to go now. However, I had installed Ubuntu 11 on the laptop back in February. I haven’t really played around with it much since then, and I was ready to wipe it out, but when I turned it on, I got an update message letting me know that I could update it to Ubuntu 12.04 LTS. Well, I decided I’d rather upgrade the Ubuntu install on this laptop than wipe it out and start over with Windows 8. It’s running now, and seems to be chugging along smoothly.

I did a little searching, and it looks like 12.04.1 was only just released. There’s an article about it on ZDNet, dated yesterday. And I guess the original 12.04 release was a few months back, based on the date on this Lifehacker article.

There’s been a lot of OS-related news lately, with Mountain Lion just released and Windows 8 nearing general availability. My old 2007 MacBook can’t handle Mountain Lion, so I’m sticking with plain-old Lion on that for now. I’m tentatively planning to buy myself a new MacBook Pro early next year, but I’m not really that worried about it right now. And I’m curious about Windows 8, but not that enthusiastic about it, given what I already know. I read an interesting CNET article this morning, comparing Mountain Lion and Windows 8. I think I agree with the author’s conclusions, for the most part.

I will likely upgrade both my Windows desktop and laptop to Windows 8, when the consumer version is released, but I’m not that excited about it. Meanwhile, maybe I’ll play around with Ubuntu a bit more!

Linux

I decided to spend a little time today installing Ubuntu 11.10 on my old Dell Inspiron laptop.

It’s been a while since I messed around with Linux. Just for yuks, I went back and looked at some old notes and posts to see if I could piece together my history with Linux. I think the first Linux distro I ever used was this old one, which I think came on two 5 1/4″ floppies. That probably would have been in 1993. I had wanted to get some Unix experience, since the company I was working for at the time was being purchased by a company that used some flavor of Unix on the back end, and I wanted to be prepared for that. (In the end, I didn’t stick around for too long, so it didn’t matter much anyway.)

Past that, I remember using Red Hat 5, so that would have been 1997, and Corel Linux, probably in 2000. And I used several versions of Fedora at my previous job, for various purposes. And I can see that I was messing around with Ubuntu back in 2007.

The Ubuntu install finished up while I was typing this, and it looks like everything worked. In the past, any Linux install I did on a laptop would usually have at least one minor problem, either with video, audio, or networking. But so far, it looks like this one is fine. Now I need to sit down with this machine and see if I can get Apache, PHP, and MySQL all running, so I can mess around with Drupal in a Linux environment.

1&1 Linux

In anticipation of installing Drupal on my 1&1 account soon, I went into my control panel and poked around a bit. First, I found that my account was set to use PHP 4. It was pretty easy to switch it to PHP 5. A call to phpinfo() shows that I’m now at 5.2.17. That’s not quite up to date, but it’s probably close enough.

I also looked into the MySQL setup. Several years ago, I set up a MySQL database on my account. That database is still there, at MySQL 4, with a 100 MB limit. Just for the heck of it, I created a new database. The new one is MySQL 5, and has a 1 GB limit. So, that’s nice. (There doesn’t seem to be any way to upgrade the old MySQL 4 db to MySQL 5, but that’s fine, since it’s empty.)

I even went as far as uploading the Drupal 7 tar.gz file today, but the 1&1 web file browser can’t unzip tar.gz files, so I’m going to need to get to a command prompt to do that, and it’s a little late to get into that tonight.
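With shell access it’s just “tar -xzf” on the archive, but failing that, plain Ruby can unpack it using the tar reader that ships with rubygems. This is a rough sketch (the commented-out filename is just an example, not the actual archive name):

```ruby
require "rubygems/package"
require "zlib"
require "fileutils"

# Minimal .tar.gz extractor using only the standard library.
def untar_gz(archive, dest = ".")
  File.open(archive, "rb") do |file|
    Zlib::GzipReader.wrap(file) do |gz|
      Gem::Package::TarReader.new(gz) do |tar|
        tar.each do |entry|
          path = File.join(dest, entry.full_name)
          if entry.directory?
            FileUtils.mkdir_p(path)
          elsif entry.file?
            FileUtils.mkdir_p(File.dirname(path))
            File.open(path, "wb") { |f| f.write(entry.read) }
          end
        end
      end
    end
  end
end

# untar_gz("drupal-7.0.tar.gz")  # example filename
```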

Hello from Ubuntu

OK, well, I’ve got Ubuntu working now. Here are a few notes, in case they’re helpful to anyone else.

My machine is configured with one SATA drive and one older IDE drive. The SATA drive is my main drive, with Windows XP installed on it. I put Ubuntu on the IDE drive. The install went smoothly, but to get into Ubuntu, I had to go into my BIOS and change it so I’m now booting from the IDE drive rather than the SATA. The IDE drive now has GRUB on it, so that allows me to get into either Ubuntu or XP. And I customized GRUB to default to XP using Startup Manager, a nice little tool that I installed from the Add/Remove application.

I’ve got an ATI video card, which worked fine by default, but of course I had to mess with it. I installed an ATI driver, then got the eye candy working using the method described here. It seems to be working OK.

I also installed a couple of other things that are pretty much necessary: The “Ubuntu Restricted Extras” package has the stuff you need to play MP3s and DVDs. And, for some reason, emacs isn’t installed by default, so you have to pull that down.

Ubuntu

I decided to spend a little time this week playing around with Ubuntu 7.10. At work, I installed it on two old Dell Latitude notebooks that we had lying around. They’re pretty pathetic machines at this point. They’ve both got just 256 MB of RAM, which is the minimum you need to get Ubuntu up and running. And, at that level, you really can’t run the graphical installer. Rather, you need to run the text installer from the “alternate” install CD. Once I figured that out, though, the installs went pretty smoothly. My plan is to use these laptops for some device configuration and network troubleshooting when we move to our new office. The one thing these laptops have that our new ones don’t is a 9-pin serial port, which is pretty helpful for doing initial router configuration and stuff like that. And Linux is usually a bit better for general network troubleshooting than Windows.

Just for yuks, I’m now trying to install Ubuntu on my desktop XP machine, on a second drive. I’m hoping that I’ll be able to dual-boot cleanly, though I’ve sometimes had problems with that in the past, with other distros. I guess we’ll see how it works out.

Up until recently, I’d been kind of skeptical about Ubuntu. There are certainly a lot of Ubuntu fans out there, but I really didn’t think I needed to bother playing around with yet another distro. In the past, I’ve used Red Hat, Corel, Fedora, and probably three or four other distros, including some fairly oddball ones. I’m pretty fond of Red Hat and Fedora, mostly just because I’ve got the most experience with them. And I kind of liked the user experience on Corel Linux, but of course that got dropped after just one or two versions. Ubuntu definitely looks like a nice, user-friendly package. I’m looking forward to playing around with it.

Well, while I’ve been typing this post up, the install finished, and I rebooted the machine. It went straight into Windows XP, so I guess I need to do some research on the whole bootloader thing.