Remapping Keystrokes in MorphX

The shortcut keys used in MorphX, the Dynamics AX development environment, are almost exactly the same as those used in Visual Studio. In fact, if I understand correctly, the code editor basically is the editor from Visual Studio, with somewhat reduced functionality.

The one thing that’s always bugged me about it is that the keystrokes for commenting and uncommenting code are different. In VS (and various other editors), it’s Ctrl-K,Ctrl-C and Ctrl-K,Ctrl-U. For no obvious reason, MorphX uses Ctrl-E,Ctrl-C and Ctrl-E,Ctrl-U. That isn’t too bad until you get used to it; then you accidentally press Ctrl-E in SQL Server Management Studio and execute a block of SQL instead of commenting it out. After doing that a few times, I decided that I needed to fix MorphX.

Surprisingly, I couldn’t find any facility built into AX for changing keyboard shortcuts, so I turned to AutoHotKey. It’s very easy to remap a single keystroke in AHK; for instance, I can remap Ctrl-K to Ctrl-E with “^k::^e”. I went ahead and did that for a while, since there didn’t seem to be any harm in it. But I wanted to figure out how to create a more targeted replacement, so that only the two specific commands would get remapped.

The snippet below does that. And, of course, it limits the remapping to the AX code editor.

; https://gist.github.com/andyhuey/5466566
; comment/uncomment, the way it was intended to be...
#IfWinActive ahk_class AxMainFrame
^k::
; Ctrl-C and Ctrl-U arrive as the ASCII control characters 3 and 21
Transform, CtrlC, Chr, 3
Transform, CtrlU, Chr, 21
; wait up to 1 second for the next single keystroke, allowing modified keys
Input, Key, L1 M T1
if Key = %CtrlC%
     Send ^e^c
if Key = %CtrlU%
     Send ^e^u
return
#IfWinActive

Hard drive crash

One of the hard drives on my work PC crashed a couple of days ago. My work PC is (or rather, was) configured with an SSD for a boot drive, and two regular SATA drives, in a RAID 0 configuration, for a secondary data volume. It was one of those SATA drives that failed. Since RAID 0 doesn’t have any redundancy built in, that killed the volume.

The only data I had on that volume were the files for my VM. The way we have developer machines configured here, we have general productivity stuff (Office, etc) on the boot volume, and all the developer stuff on the VM. The setup for developing for Dynamics AX is fairly complicated, so it makes sense to do it on a VM.
Unfortunately, we don’t have any facility set up for backing up our VMs anywhere. Also, between the way AX stores source files and the way we have TFS set up, we don’t always check in code daily, nor do we have a simple way of backing up in-progress code changes that haven’t been checked in. So, the end result is that I lost about two days’ worth of work on my current project.
I had, at one point, written a backup script (using PowerShell and 7-Zip) to back up the My Docs folder on the VM to the My Docs folder on the physical machine. But I had never set it to run on a schedule, so the backup file there was about a week old, which meant that I also lost a few SQL files, some test spreadsheets, and one quickie VS 2010 project that I’d written to test a web service. Oh, and I was keeping the backup script itself (plus some other scripts) in a ‘util’ folder on the root of the VM’s C: drive, so those didn’t get backed up either, and were lost.
So the takeaway from all of this, of course, is that I need to do what I can to get around the limitations of the environment I’m working in, and set up some automated backup procedures.
In terms of backing up the My Docs folder, I rewrote my lost PowerShell script and set it up in Task Scheduler to run daily at 6pm. It ran fine last night, and I think it’ll work fine on a continuing basis.
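For reference, the general shape of that script is something like the sketch below; the paths, the 7-Zip location, and the dated archive name are just placeholder examples, not the exact values from the real script.

# back up the VM's My Docs folder to the host as a dated 7-Zip archive
# all paths below are examples; adjust for your own machines
$sevenZip = "C:\Program Files\7-Zip\7z.exe"
$source   = "C:\Users\andy\Documents\*"
$destDir  = "\\HOSTPC\Users\andy\Documents\VMBackup"
$archive  = Join-Path $destDir ("MyDocs_{0:yyyyMMdd}.7z" -f (Get-Date))

if (!(Test-Path $destDir)) { New-Item -ItemType Directory -Path $destDir | Out-Null }

# "a" adds files to an archive; 7-Zip returns a non-zero exit code on failure
& $sevenZip a $archive $source
if ($LASTEXITCODE -ne 0) { Write-Error "7-Zip backup failed with exit code $LASTEXITCODE" }

The Task Scheduler job just runs powershell.exe with -File pointing at the script.
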
In terms of backing up in-progress work in AX, I extended the ‘startup projects’ class that I blogged about recently to also allow me to export all of my active projects. I have it exporting them to a folder under the My Docs folder, so if I run the export at the end of the day, prior to the file system backup, I should always have a backup of my current work in a format that I can re-import into AX if need be.
There are still some big holes in this system, including the fact that I have to remember to run that export daily, but it’s a good start. I’d like to add some extra pieces to this process, including daily SQL backups, and maybe a push of certain backup files to the cloud. The SQL backups are kind of hard, since the AX test database is 70 GB. And my employer, for some reason, likes to block access to cloud-based backup and storage providers, so I can’t just copy stuff into a Dropbox folder; that makes the cloud part a little tricky too.
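If I do tackle the SQL piece, it will probably be a compressed native backup kicked off from PowerShell, something like the sketch below. The server, database, and path names are made up for the example, and the compression option assumes a SQL Server edition that supports it.

# compressed native SQL backup; server, database, and path names are examples only
$server = ".\DYNAMICSAX"
$db     = "DynamicsAX_Test"
$bak    = "D:\Backups\{0}_{1:yyyyMMdd}.bak" -f $db, (Get-Date)

$sql = "BACKUP DATABASE [$db] TO DISK = N'$bak' WITH COMPRESSION, INIT, STATS = 10"
sqlcmd -S $server -E -Q $sql
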
I’ve also considered setting up a local Mercurial or Git repo, checking in the AX export files every day, and pushing them up to a private Bitbucket repo. This would give me offsite backup, with the added benefit of increased granularity and visibility, but it would probably violate one or more corporate policies.
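If I do go the repo route, the daily step after the AX export finishes would be something simple like the following (the folder and remote names here are hypothetical):

# commit and push the day's AX project exports; folder and remote names are hypothetical
Set-Location "C:\Users\andy\Documents\AXExports"
git add -A
git commit -m ("daily AX export {0:yyyy-MM-dd}" -f (Get-Date))
git push origin master
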
As a follow-up to this post, I’m going to write a few more posts, about some of the scripts I’m using now.

geocoding experiments

I wrote an initial blog post on Gisgraphy about a month ago. I wanted to write a follow-up, but hadn’t gotten around to it, what with all the other stuff I have going on. But I’m going to take a few minutes now and write something up.

The initial import process to get the Gisgraphy server up and running took about 250 hours. The documentation said it would take around 40 hours, but of course there’s no way to estimate that accurately without knowing the specific hardware and environment of the server it’s being installed on. I’m guessing that, if I had more experience with AWS/EC2, I would have been able to configure a machine better suited to this project.

Once the import was complete, I started experimenting with the geocoding web service. I quickly discovered something that I’d overlooked when I was first testing against the author’s hosted web service. The geocoding service takes a free-form address string as a parameter; it’s not set up to accept the parts of the address (street, city, state, zip) separately. It runs that string through an address parser, and here’s where we hit a problem. The address parser, while “part of the gisgraphy project,” is not actually open source. An installed instance of Gisgraphy, by default, calls out to the author’s web site to do the address parsing, and if you call it too often, you get locked out. At that point, you have to talk about licensing the DLL or JAR for it, or paying for access to it as a web service.
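
For what it’s worth, calling the geocoding service from PowerShell looks roughly like the sketch below. I’m writing the port, endpoint path, and parameter names from memory, so treat them as assumptions and check them against the docs for your installed version.

# query a local Gisgraphy instance with a free-form address string
# the port, endpoint path, and parameter names are assumptions -- verify against your install
$base    = "http://localhost:8080/geocoding/geocode"
$address = "520 Broad St, Newark, NJ 07102"

$result = Invoke-RestMethod -Uri $base -Method Get -Body @{
    address = $address
    country = "US"
    format  = "json"
}
$result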

Technically, the geocoder will work without the address parser, but I found that it returns largely useless results without it. For instance, it will happily return a result in California, given an address in New Jersey. I’m not entirely sure how the internal logic works, but it appears to just do a text search when it’s trying to geocode an unparsed address, returning a location with a similar street name, for instance, regardless of where in the US it is.

While I don’t think the author is purposely running a bait-and-switch, I also don’t think he’s clear enough about the fact that the address parser isn’t part of the open source project, and that the geocoder is fairly useless without it. So, we shut down the EC2 instance for this and moved on to other things.

Specifically, we moved on to MapQuest Open, which I was going to write up here in this post, but I need to head out to work now, so maybe another time.

Gisgraphy

My boss stumbled across a project named Gisgraphy recently. A big part of what we do involves the need for geocoding. We have generally been using geocode.com for batch geocoding, but there’s a cost to that, and they only do US and Canada. There are many other geocoding services, but if you’re doing heavy volume, you’re generally excluded from free options, and the paid options can get expensive.

Gisgraphy is an open source project that you can set up on your own server. It will pull in data from freely-available sources, load it all into a local database, then allow you to use a REST web service to geocode addresses. A little testing with some US addresses leads me to believe that it’s generally accurate to street level, but not quite to house level. So, I’m not sure that we’ll want to use it for all of our geocoding, but it ought to be generally useful.

We decided to set it up on an AWS EC2 instance. We started messing with EC2 VMs for another project, and it seemed like EC2 would be a good fit for this one too. I started out with a small-instance Linux VM, but switched it to a medium instance, since the importer was really stressing it. I will probably switch back to small after the import is done. That’s one nice thing about EC2: being able to mess with the horsepower available to your VM.

Gisgraphy uses several technologies that are outside my comfort zone. I’m primarily a Windows / .NET / SQL Server guy, with a reasonable amount of experience with Linux / MySQL / PHP. Gisgraphy runs on Linux (also on Windows, but it’s obviously more at home on Linux), so that’s ok. But it’s written in Java, and uses PostgreSQL as its back-end database. I have only very limited experience with Java and PostgreSQL. And, of course, I’m new to AWS/EC2 also.

So, setting this all up was a bit of a challenge. The instructions are OK, but somewhat out of date. I’m using Ubuntu 12.04 LTS on EC2, and many things aren’t found in the same places as they were under whatever Linux environment he based his instructions on. For the sake of anyone else who might need a little help getting the basic setup done under a recent version of Ubuntu, here are a few pointers covering the places where I had to do things a bit differently than in his Linux instructions:

  • Java: I installed Java like this: “sudo apt-get install openjdk-6-jdk openjdk-6-jre”. JAVA_HOME should be /usr/lib/jvm/java-6-openjdk-i386/ or /usr/lib/jvm/java-6-openjdk-amd64/, depending on your architecture.
  • PostgreSQL: I installed the most recent versions of PostgreSQL and PostGIS like this: “sudo apt-get install postgresql postgresql-contrib postgis postgresql-9.1-postgis”. Config files were in /etc/postgresql/9.1/main and data files were in /var/lib/postgresql/9.1/main.
  • PostGIS: In his instructions for configuring PostGIS, the “createlang” command wasn’t necessary. The SQL scripts you need to run are /usr/share/postgresql/9.1/contrib/postgis-1.5/postgis.sql and spatial_ref_sys.sql.

That’s about it for now, I think. I want to write up another blog entry on Gisgraphy, once I’ve got it fully up & running. And there might be some value in a blog entry on EC2. But now I have to get back to finishing my laundry!

Windows 8, Mountain Lion, and Ubuntu 12

I have to do a 10pm web site rollout tonight, so I find myself at home with some time to kill. I haven’t gotten much of a chance to play around with Windows 8, so I decided to download the 90-day eval and install it on my old laptop. I have the ISO downloaded and ready to go now. However, I had installed Ubuntu 11 on the laptop back in February. I haven’t really played around with it much since then, and I was ready to wipe it out, but when I turned it on, I got an update message letting me know that I could update it to Ubuntu 12.04 LTS. Well, I decided I’d rather upgrade the Ubuntu install on this laptop than wipe it out and start over with Windows 8. The upgrade is running now, and seems to be chugging along smoothly.

I did a little searching, and it looks like 12.04.1 was only just released. There’s an article about it on ZDNet, dated yesterday. And I guess the original 12.04 release was a few months back, based on the date on this Lifehacker article.

There’s been a lot of OS-related news lately, with Mountain Lion just released and Windows 8 nearing general availability. My old 2007 MacBook can’t handle Mountain Lion, so I’m sticking with plain-old Lion on that for now. I’m tentatively planning to buy myself a new MacBook Pro early next year, but I’m not really that worried about it right now. And I’m curious about Windows 8, but not that enthusiastic about it, given what I already know. I read an interesting CNET article this morning, comparing Mountain Lion and Windows 8. I think I agree with his conclusions, for the most part.

I will likely upgrade both my Windows desktop and laptop to Windows 8, when the consumer version is released, but I’m not that excited about it. Meanwhile, maybe I’ll play around with Ubuntu a bit more!

IPredator

I keep thinking that I ought to sign up for a third-party VPN service, so I can put all my traffic through an encrypted tunnel when I’m on public (or quasi-public) wifi. I meant to do something before I went off to San Diego, but I just didn’t get around to it. Some of the services I’ve seen are fairly expensive. These guys, for instance, are $15/month.

I just found one that’s reasonably simple and inexpensive: IPredator. It’s €15 for 3 months, which comes out to about $22 US, or roughly $7 per month. And it doesn’t auto-renew, so if I stop using it, I can just let the account go inactive until I decide to start using it again.

I have it set up on my Mac, iPhone, and iPad now. Setup was easy enough, and the speed seems reasonable. I need to do some more experimenting on that front.

I’m curious to see if it will work on the wifi at my office. We have a SonicWall security device on our network now, and it can be a bit aggressive about blocking stuff. I’m not sure if it will let the VPN traffic through or not.

no more iGoogle

I’ve had my home page set to iGoogle for several years now, on all of my home computers. (Prior to that, I was using my.yahoo.com.) I just found out that it’s set to be discontinued. I don’t really understand why they’d discontinue something that can’t be costing them much money, and that entices people to have a nice big Google search bar on their home page. It’s not scheduled to disappear until late next year, but I decided to switch over to something else now anyway. The only reasonable alternative I could find was Netvibes. If you go to their home page right now, they’re pushing their corporate dashboard stuff, but you can still sign up for a free account and use it like iGoogle. It’s pretty nice, though the page is slower to load than my iGoogle page is.

Windows backup weirdness

I hadn’t done a backup of my main home desktop PC in a while, so I decided to get one done today. I’ve previously used the built-in Windows 7 Backup and, more recently, CrashPlan. I’ve had problems with both, so I needed to find another backup program. I have a 1 TB drive, about 70% full, and two 500 GB drives that I can use for the backup, so I need a program that can split the backup across two drives, which turns out to be more of a limiting factor than you’d think it would be. I’m currently running a backup with Macrium Reflect Free, which *should* be able to split the backup between two drives, though I’m not sure yet whether it will.

The “weirdness” referenced in the title of this post is with regard to the speed of the backup. This is a desktop PC, and I’ve never really tweaked the power settings on it. I have the display set to blank after 10 minutes, but my assumption has always been that the PC will keep running at full speed if it’s doing something, like a backup. When I started the backup, it was running at about 300 Mb/s. That seemed like a good speed, and I expected it to get done fairly quickly. I’ve noticed, though, that if I check on it after it’s been running for a while, it shows about 100 Mb/s. If I sit in front of it for a few minutes, it gets back up to about 300 Mb/s, but if I step away for an hour and come back, it’s back down to 100 Mb/s. So, clearly, something is slowing it down after a certain period of keyboard/mouse inactivity. I’ve switched the power settings from “recommended” to “high performance”, thinking that maybe it was going into a low-power mode or something, but I don’t think that’s helped. That could mean some other background process is kicking in after a few minutes of keyboard/mouse inactivity and slowing things down. All very frustrating. We’ll see if I can manage to get a backup done before the NFC Championship game is over.
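
For the record, the power plan switch can also be scripted from an elevated prompt with powercfg. The GUID below is the stock High Performance scheme on Windows 7, but it’s worth confirming it on your own machine with the list command first.

# list the available power schemes, then switch to the built-in High Performance plan
powercfg /list
powercfg /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c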

laptop stuff

We’re in the middle of Hurricane Irene right now, but my part of Somerville is fine, and we haven’t lost power. I’ve been using this time to finish setting up my new ThinkPad, and to wipe my old Inspiron and Aspire One.
For the Aspire One, I uninstalled a few programs, let Windows apply a bunch of pending updates, then created a new account and wiped out my old one.  I gave that machine away yesterday, before the storm hit.
For the Dell Inspiron, I had too much stuff on there to easily clean up, so I just did a clean install of Windows 7 on that, created a user account, and ran updates to get it (mostly) current.  I think that’s ready to sell now.
On the ThinkPad, I’d done most of the quick installs already — Firefox, Notepad++, and a bunch of stuff like that. Yesterday, I took care of the two major installs: Office 2010 and Visual Studio 2010.  Now, I’m letting the system pull down and install updates for both of those programs.
Overall, I think I’ve probably pulled down 5 or 10 GB of updates over my internet connection this weekend. Thank god I don’t have a data cap on my Optimum Online account!