LaunchBar 6

I’ve been using LaunchBar on my Mac for quite some time. It hasn’t changed much over the last several years, and, for a while, I don’t think it was very popular, as Mac apps go. But LaunchBar 6 was just released, with a bit of an interface refresh, and it’s getting some attention, including a good review on Cult of Mac, and a lengthy and useful writeup by Shawn Blanc. I just installed it, and paid the $19 upgrade fee for it. That seems pretty reasonable, given that they haven’t done a paid upgrade since 2010. I’d recommend it to any Mac user who likes the idea of being able to quickly launch programs without having to use their mouse or trackpad.

Instacast

After my issue with podcasts that I posted about last week, I decided to switch from using iTunes and the Apple podcast app to Instacast. I bought both the Mac and iOS versions. After using both for about a week, I’m mostly satisfied, but there are definitely a few shortcomings.

First, on the plus side, Instacast hasn’t arbitrarily deleted a bunch of podcasts from my Mac, as iTunes did last week. Instacast actually has a pretty interesting way of dealing with podcast files. It’s not quite perfect for the way I’d like to do things, but it’s reasonable. Basically, you set a maximum amount of space that you’d like to use for podcasts, and Instacast deletes stuff once it reaches that limit. It’s pretty sensible about picking what to delete: if I understand it correctly, it goes first for episodes that you’ve already played and haven’t marked as favorites. I’ve set it to use up to 10 GB on my Mac, and 1 GB on my iPhone, so that should be good enough. I do wish, though, that you could flag certain podcasts to keep forever. There are a few, like Warren Ellis’ SPEKTRMODULE, that I’d like to hold on to permanently; with Instacast, I can’t really do that, so I guess I’d have to copy the files out of Instacast and into a separate folder.

Which brings up a separate point: Instacast does allow you to right-click on a given podcast episode and select “Show in Finder”, so that’s good. But, unlike iTunes, it doesn’t organize individual podcasts into their own folders, nor does it keep the original file names. Instead, it puts all of its files together in a single folder, and names them with (I assume) random GUIDs. So, to copy out all of the episodes of a given podcast, I’d have to do “Show in Finder” on each one individually, and copy them out one at a time. (And if I wanted the copied files to have reasonable names, I’d have to rename them too.) I’m not too happy about that, but it’s not a terribly big deal.

In terms of the actual functionality of the apps, let’s start with the Mac app. It seems to be a reasonably well-written Mac app, not taking up too much memory or CPU, and launching pretty quickly. (I wish that were something I could take for granted with a commercial Mac app, but alas, no…) I’m using the Mac app mostly to watch Tekzilla; if there were any other video podcasts I was interested in, I’d use it for those too, but that’s the only one I’m following right now. It does a good enough job there. Basically, it just plays the video and gets out of the way, which is what I want. It works fine for playing downloaded episodes, and it can also stream episodes that you haven’t downloaded, which is nice.

The iOS app also works reasonably well. I use it only for audio podcasts, and I follow a few of those. You can set it to download episodes only when on wifi, which is a good thing, as my Verizon data usage has been a problem lately. I’ve had it randomly stop playing a podcast twice so far, which is a bit puzzling. In both cases, I could start the podcast back up where I left off, no problem. I was driving both times, so I didn’t see what happened; I’m not sure if the app crashed or just stopped playing. And I think it was the same podcast file both times, so maybe there was just something wrong with that file. If this keeps happening, I’m going to get frustrated with it pretty quickly, but we’ll see what happens over time.

There’s a function built into both the Mac and iOS apps called “Up Next” that lets you create an on-the-fly playlist of a few random podcast episodes, so you can set yourself up if you’ve got a long drive. I used it today for my 90-minute drive down to a friend’s house, and it worked well. There doesn’t seem to be a way, though, to tell it to just continuously play consecutive episodes of a single podcast, which is a bit weird.

There are a few other things I could mention, but this post is long enough as-is, so I’ll leave it there, and just say that I don’t regret spending the $20 on the Mac app and $4 on the iOS app, but I’m still not sure if I’ll stick with it or switch to something else in the long term.

contact and calendar management

A few years back, I wrote up a couple of blog posts on my search for the “holy grail” of contact and calendar management. Back then, I had a BlackBerry, and I was hoping to find a good way to keep things in sync between the phone, my PC, and my Mac. I went through a few less-than-perfect options, which aren’t worth going into at this point.

Nowadays, I’ve got an iPhone, and I’ve found that iCloud does a fine job of keeping the iPhone, iPad, and Mac in sync. On the PC, I really don’t bother trying to keep a full set of contacts in Outlook anymore, nor do I keep my calendar there. I can always look anything up on icloud.com or on my iPhone. And, while I use Gmail for most of my mail, I don’t really feel a need to keep my Gmail contacts fully up-to-date either. There’s really only a small set of people who I e-mail regularly, and they’re all in my Google contacts, so there’s no problem there.

So, since everything’s working so well, of course I’m starting to mess around with it. I installed the vipOrbit app on my iPhone this week. It’s a program for managing contacts and calendars. Right now, the iPhone and iPad clients are free, the Mac desktop client is $30, and the sync service that I would need to subscribe to is $45/year. So I thought I’d start out by trying the iPhone app, and see if it was worth going any further with it. The app imported my contacts from the main iPhone contact app with no problems, but I found that it did not import all of the fields. In particular, it didn’t import birthdays or the free-form notes field from contacts. The app has several user-defined fields available, so maybe there was a way to map those and import the birthdays and notes into them, but it wasn’t obvious how. I played around with the app a bit, and, while I think it might be useful for a salesperson tracking leads and/or customers, it’s not useful enough for me to justify both the price and the inconvenience of keeping my contacts and calendar outside of the normal default iPhone apps.

Next, I may choose to try out fruux. Fruux is just a sync & backup service for contacts, calendars, and tasks. So, I’d keep using the default iOS apps, but would keep things in sync with fruux instead of iCloud. I honestly have no good reason to do this, except “just for the hell of it”. Or maybe so I can say I’m not 100% tied in to the Apple ecosystem.

Remapping Keystrokes in MorphX

The shortcut keys used in MorphX, the Dynamics AX code editor, are almost exactly the same as those used in Visual Studio. In fact, the code editor basically is the editor from Visual Studio, with somewhat reduced functionality, if I understand correctly.

The one thing that’s always bugged me about it is that the keystrokes for commenting and un-commenting code are different. In VS (and various other editors), they’re Ctrl-K,Ctrl-C and Ctrl-K,Ctrl-U. For no obvious reason, MorphX uses Ctrl-E,Ctrl-C and Ctrl-E,Ctrl-U. This isn’t too bad, until you start getting used to it; then you accidentally press Ctrl-E in SQL Management Studio and execute a block of SQL instead of commenting it out. After doing that a few times, I decided that I needed to fix MorphX.

Surprisingly, I couldn’t find any facility built into AX for changing keyboard shortcuts. So, I turned to AutoHotkey. It’s very easy to remap a single keystroke in AHK. For instance, I can just remap Ctrl-K to Ctrl-E with “^k::^e”. I went ahead and did that for a while, since there didn’t really seem to be any harm in it. But I wanted to figure out how to create a more targeted replacement, so that only the two specific commands would get remapped.

The snippet below does that. And, of course, it limits the remapping to the AX code editor.

; https://gist.github.com/andyhuey/5466566
; comment/uncomment, the way it was intended to be...
; Only active when the Dynamics AX client window is in the foreground.
#IfWinActive ahk_class AxMainFrame
^k::
; Chr(3) and Chr(21) are the control characters generated by Ctrl-C and Ctrl-U.
Transform, CtrlC, Chr, 3
Transform, CtrlU, Chr, 21
; Wait up to one second (T1) for one keystroke (L1), including modified keys (M).
Input, Key, L1 M T1
if Key = %CtrlC%
     Send ^e^c
if Key = %CtrlU%
     Send ^e^u
return
#IfWinActive

Hard drive crash

One of the hard drives on my work PC crashed a couple of days ago. My work PC is (or rather, was) configured with an SSD for a boot drive, and two regular SATA drives, in a RAID 0 configuration, for a secondary data volume. It was one of those SATA drives that failed. Since RAID 0 doesn’t have any redundancy built in, that killed the volume.

The only data I had on that volume were the files for my VM. The way we have developer machines configured here, we have general productivity stuff (Office, etc.) on the boot volume, and all the developer stuff on the VM. The setup for developing for Dynamics AX is fairly complicated, so it makes sense to do it on a VM.

Unfortunately, we don’t have any facility set up for backing up our VMs anywhere. Also, between the way AX stores source files and the way we have TFS set up, we don’t always check in code daily, nor do we have a simple way of backing up in-progress code changes that haven’t been checked in. So, the end result is that I lost about two days’ worth of work on my current project.

I had, at one point, written a backup script (using PowerShell and 7-Zip) to back up the My Docs folder on the VM to the My Docs folder on the physical machine, but I had never set it to run on a scheduled basis, so the backup file there was about a week old. That meant I also lost a few SQL files, some test spreadsheets, and one quickie VS 2010 project that I’d written to test a web service. Oh, and I was keeping the backup script itself (plus some other scripts) in a ‘util’ folder on the root of the VM’s C: drive, so those didn’t get backed up either, and were lost.

So the takeaway from all of this, of course, is that I need to do what I can to get around the limitations of the environment I’m working in, and set up some automated backup procedures.

In terms of backing up the My Docs folder, I rewrote my lost PowerShell script, and set it up in Task Scheduler to run at 6pm daily. It ran fine last night, and I think it’ll work fine on a continuing basis.
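The rewritten script isn’t anything fancy. For anyone curious, here’s a minimal sketch of the approach; the paths, archive naming, and retention count here are illustrative assumptions, not the details of my actual script:

# Minimal sketch of a daily "My Docs" backup using 7-Zip.
# All paths below are assumptions for illustration.
$sevenZip = "C:\Program Files\7-Zip\7z.exe"           # assumed 7-Zip install location
$source   = "C:\Users\andy\Documents"                 # assumed My Docs path on the VM
$destDir  = "\\HOSTPC\Users\andy\Documents\VMBackup"  # assumed path on the physical machine
$stamp    = Get-Date -Format "yyyyMMdd"
$archive  = Join-Path $destDir "mydocs-$stamp.7z"

# Create a compressed archive of the source folder.
& $sevenZip a -t7z $archive $source

# Prune old archives, keeping the seven most recent.
Get-ChildItem $destDir -Filter "mydocs-*.7z" |
    Sort-Object LastWriteTime -Descending |
    Select-Object -Skip 7 |
    Remove-Item

Pointing a daily Task Scheduler job at powershell.exe with the -File argument takes care of running it at 6pm.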

In terms of backing up in-progress work in AX, I extended the ‘startup projects’ class that I blogged about recently to also allow me to export all of my active projects. I have it exporting them to a folder under the My Docs folder, so, if I run the export at the end of the day, prior to the file-system backup, I should always have a backup of my current work, in a format that I can re-import into AX if need be.

There are still some big holes in this system, including the fact that I have to remember to run that export daily. But it’s a good start. I’d like to add some extra stuff to this process, including daily SQL backups, and maybe a push of certain backup files to the cloud. The SQL backups are kind of hard, since the AX test database is 70 GB. And my employer, for some reason, likes to block access to cloud-based backup & storage providers, so I can’t just copy stuff into a Dropbox folder; that part’s a little tricky too.

I’ve also considered setting up a local Mercurial or Git repo, checking in the AX export files every day, and pushing them up to a private Bitbucket repo. This would give me offsite backup, with the added benefit of increased granularity and visibility, but it would probably violate one or more corporate policies.
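
If I do go that route, the daily check-in itself would be easy to script. A rough sketch, assuming the export folder is already a Git repo with a remote named “backup” pointing at the private Bitbucket repo (the path here is made up):

# Rough sketch: commit the day's AX export files and push them offsite.
# Assumes an existing Git repo with a remote named "backup"; the path is made up.
Set-Location "C:\Users\andy\Documents\AxExports"

git add -A
git commit -m ("AX project exports, " + (Get-Date -Format "yyyy-MM-dd"))
git push backup master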

As a follow-up to this post, I’m going to write a few more posts about some of the scripts I’m using now.

geocoding experiments

I wrote an initial blog post on Gisgraphy about a month ago. I wanted to write a follow-up, but hadn’t gotten around to it, what with all the other stuff I have going on. But I’m going to take a few minutes now and write something up.

The initial import process to get the Gisgraphy server up and running took about 250 hours. The documentation said it would take around 40 hours, but of course there’s no way to be accurate about that kind of thing without knowing about the specific hardware of the server it’s being installed on, and the environment it’s in. I’m guessing that, if I had more experience with AWS/EC2, I would have been better able to configure a machine properly for this project.

Once the import was complete, I started experimenting with the geocoding web service. I quickly discovered something that I’d overlooked when I was first testing against the author’s hosted web service. The geocoding web service takes a free-form address string as a parameter; it’s not set up to accept the parts of the address (street, city, state, zip) separately. It runs that string through an address parser, and here’s where we hit a problem. The address parser, while “part of the Gisgraphy project,” is not actually open source. An installed instance of Gisgraphy, by default, calls out to the author’s web service site to do the address parsing. And, if you call it too often, you get locked out. At that point, you have to talk about licensing the DLL or JAR for it, or paying for access to it via web service.
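
To make that concrete, here’s roughly what a free-form geocoding call against a local install looks like. Treat this as a sketch: the localhost URL, the /geocoding/geocode path, and the address/country/format parameters are my assumptions from the documentation, and the address itself is made up.

# Illustration only: a free-form geocoding request against a local Gisgraphy instance.
# The host, port, endpoint path, and parameter names are assumptions.
$address = [uri]::EscapeDataString("1 Main St, Anytown, NJ 07000")  # made-up address
$url = "http://localhost:8080/geocoding/geocode?address=$address&country=US&format=json"

# Invoke-RestMethod parses the JSON response into objects for us.
$result = Invoke-RestMethod -Uri $url
$result

The response includes one or more candidate matches with latitude/longitude coordinates; when the parser isn’t reachable, this is where the odd results described below come from.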

Technically, the geocoder will work without the address parser, but I found that it returns largely useless results without it. For instance, it will happily return a result in California, given an address in New Jersey. I’m not entirely sure how the internal logic works, but it appears to just be doing a text search when it’s trying to geocode an un-parsed address, merely returning a location with a similar street name, for instance, regardless of where in the US it is.

While I don’t think the author is purposely running a bait-and-switch, I also don’t think he’s clear enough about the fact that the address parser isn’t part of the open source project, and that the geocoder is fairly useless without it. So, we shut down the EC2 instance for this and moved on to other things.

Specifically, we moved on to MapQuest Open, which I was going to write up here in this post, but I need to head out to work now, so maybe another time.

Gisgraphy

My boss stumbled across a project named Gisgraphy recently. A big part of what we do involves the need for geocoding. We have generally been using geocode.com for batch geocoding, but there’s a cost to that, and they only do US and Canada. There are many other geocoding services, but if you’re doing heavy volume, you’re generally excluded from free options, and the paid options can get expensive.

Gisgraphy is an open source project that you can set up on your own server. It will pull in data from freely-available sources, load it all into a local database, then allow you to use a REST web service to geocode addresses. A little testing with some US addresses leads me to believe that it’s generally accurate to street level, but not quite to house level. So, I’m not sure that we’ll want to use it for all of our geocoding, but it ought to be generally useful.

We decided to set it up on an AWS EC2 instance. We started messing with EC2 VMs for another project, and it seemed like EC2 would be a good fit for this project too. I started out with a small instance Linux VM, but switched it to a medium instance, since the importer was really stressing the small instance. I will probably switch back to small after the import is done. That’s one nice thing about EC2: being able to mess with the horsepower available to your VM.

Gisgraphy uses several technologies that are outside my comfort zone. I’m primarily a Windows / .NET / SQL Server guy, with a reasonable amount of experience with Linux / MySQL / PHP. Gisgraphy runs on Linux (also on Windows, but it’s obviously more at home on Linux), so that’s ok. But it’s written in Java, and uses PostgreSQL as its back-end database. I have only very limited experience with Java and PostgreSQL. And, of course, I’m new to AWS/EC2 also.

So, setting this all up was a bit of a challenge. The instructions are ok, but somewhat out of date. I’m using Ubuntu 12.04 LTS on EC2, and many things aren’t found in the same places as they were under whatever Linux environment the author based his instructions on. For the sake of anyone else who might need a little help getting the basic setup done under a recent version of Ubuntu, here are a few pointers on where I had to do things a bit differently than the official Linux instructions:

  • Java: I installed Java like this: “sudo apt-get install openjdk-6-jdk openjdk-6-jre”. JAVA_HOME should be /usr/lib/jvm/java-6-openjdk-i386/ or /usr/lib/jvm/java-6-openjdk-amd64/, depending on your architecture.
  • PostgreSQL: I installed the most recent versions of PostgreSQL and PostGIS like this: “sudo apt-get install postgresql postgresql-contrib postgis postgresql-9.1-postgis”. Config files were in /etc/postgresql/9.1/main, and data files were in /var/lib/postgresql/9.1/main.
  • PostGIS: In his instructions for configuring PostGIS, the “createlang” command wasn’t necessary, and the SQL scripts you need to run are /usr/share/postgresql/9.1/contrib/postgis-1.5/postgis.sql and spatial_ref_sys.sql.

That’s about it for now, I think. I want to write up another blog entry on Gisgraphy, once I’ve got it fully up & running. And there might be some value in a blog entry on EC2. But now I have to get back to finishing my laundry!

Windows 8, Mountain Lion, and Ubuntu 12

I have to do a 10pm web site rollout tonight, so I find myself at home with some time to kill. I haven’t gotten much of a chance to play around with Windows 8, so I decided to download the 90-day eval and install it on my old laptop. I have the ISO downloaded and ready to go now. However, I had installed Ubuntu 11 on the laptop back in February. I haven’t really played around with it much since then, and I was ready to wipe it out, but when I turned it on, I got an update message letting me know that I could update to Ubuntu 12.04 LTS. Well, I decided I’d rather upgrade the Ubuntu install on this laptop than wipe it out and start over with Windows 8. The upgrade is running now, and seems to be chugging along smoothly.

I did a little searching, and it looks like 12.04.1 was only just released. There’s an article about it on ZDNet, dated yesterday. And I guess the original 12.04 release was a few months back, based on the date on this Lifehacker article.

There’s been a lot of OS-related news lately, with Mountain Lion just released and Windows 8 nearing general availability. My old 2007 MacBook can’t handle Mountain Lion, so I’m sticking with plain-old Lion on that for now. I’m tentatively planning to buy myself a new MacBook Pro early next year, but I’m not really that worried about it right now. And I’m curious about Windows 8, but not that enthusiastic about it, given what I already know. I read an interesting CNET article this morning, comparing Mountain Lion and Windows 8. I think I agree with his conclusions, for the most part.

I will likely upgrade both my Windows desktop and laptop to Windows 8, when the consumer version is released, but I’m not that excited about it. Meanwhile, maybe I’ll play around with Ubuntu a bit more!

IPredator

I keep thinking that I ought to sign up for a third-party VPN service, so I can put all my traffic through an encrypted tunnel when I’m on public (or quasi-public) wifi. I meant to do something before I went off to San Diego, but I just didn’t get around to it. Some of the services I’ve seen are fairly expensive. These guys, for instance, are $15/month.

I just found one that’s reasonably simple and inexpensive: IPredator. It’s € 15 for 3 months, which comes out to about $22 US. So, about $7 per month. And it doesn’t auto-renew, so if I stop using it, I can just let the account go inactive until I decide to start using it again.

I have it set up on my Mac, iPhone, and iPad now. Setup was easy enough, and the speed seems reasonable. I need to do some more experimenting on that front.

I’m curious to see if it will work on the wifi at my office. We have a SonicWall security device on our network now, and it can be a bit aggressive about blocking stuff. I’m not sure if it will let the VPN traffic through or not.

no more iGoogle

I’ve had my home page set to iGoogle for several years now, on all of my home computers. (Prior to that, I was using my.yahoo.com.) I just found out that it’s set to be discontinued. I don’t really understand why they’d discontinue something that can’t be costing them much money, and that entices people to have a nice big Google search bar on their home page. It’s not scheduled to disappear until late next year, but I decided to switch over to something else now anyway. The only reasonable alternative I could find was Netvibes. If you go to their home page right now, they’re pushing their corporate dashboard stuff, but you can still sign up for a free account and use it like iGoogle. It’s pretty nice, though the page is slower to load than my iGoogle page.