Weird Al on the presidential debate

I don’t usually post about politics, but I feel like I need to write something today, just to maybe mark a few odd items, for posterity:

  1. Weird Al’s reaction to last night’s presidential debate was silly and made me feel a little better about the whole thing. (Not a lot better, but a little.) The fact that it’s posted on the NY Times site makes it a little weirder, but somehow even better. It amuses me to think about the editorial process that led someone to decide that Weird Al was the right guy to go to for reaction to a presidential debate.
  2. In less amusing news, my House rep, Tom Malinowski, is getting death threats from QAnon, after a misleading press release and ad from Republicans. Here’s a Post story on the ad, and here’s an NJ.com opinion piece on his opponent’s refusal to address the issue at all. I’ve donated a few bucks to Malinowski’s campaign on a couple of occasions, and I really do think that he’s a good guy. I’m sure he’s not perfect, but he comes off as smart, serious, competent, and concerned about his constituents.

I thought I had an item #3 for that list, but I’m exhausted now, just thinking about the election.

NJ MVC, IFTTT, RSS, and other acronyms

This post may wind up covering a variety of barely-related topics. I have a bunch of stuff in my head today and I’m making connections between things that might not make much sense. But anyway…

I had to renew the registration for my car recently. I normally do that by mail, and I did that again this year, and got the registration card back, no problem. But then I got a letter saying that NJ MVC had undercharged me by $7 due to a computer glitch and I’d have to pay that. The letter didn’t really include any helpful information about how to submit that $7 to them. I couldn’t pay it online, and there was no indication that they’d accept it by mail. I definitely don’t want to go near an MVC office right now, since they’ve been mobbed ever since they reopened in July. (Apparently, the line at the Somerville MVC starts forming at 4 AM every day.) I really wasn’t sure what to do, but thankfully I found an article on nj.com today that explained the problem, confirmed that it was OK to mail in the $7, and gave the address to send it to.

NJ.com has been reasonably useful throughout the pandemic. They’ve run a lot of good, useful articles. (Mind you, they also run a lot of nonsense and clickbait.) They started asking people to pay $10/month to subscribe to the site at some point earlier this year, and I thought about doing it. But I couldn’t quite talk myself into it. First, there’s the aforementioned nonsense and clickbait. Then, there’s the worry that they won’t make it easy to cancel.

In the past, I’ve often used virtual credit card numbers when I’m subscribing to something that might be hard to cancel. Citi used to have a good program for virtual card numbers, including a Windows program that you could use to generate them on the fly and copy them into forms on web pages. But that program stopped working a while back. And the web-based version relied on Flash, and I don’t have any browsers left on any of my machines that still run Flash. So I kind of gave up on them.

I saw an announcement today from 1Password saying that they were going to start integrating with privacy.com to allow users to generate virtual card numbers right from 1Password. That sounded promising, but it draws from your bank account, and not from a real credit card. So it seems like there could be complications there. But that got me thinking about virtual card numbers again, so I checked Citi’s web site, and found that they’ve finally rewritten their virtual card functionality to work without Flash. (They’ve also eliminated the Windows program, which is a bummer, but I was expecting that.)

And I saw that NJ.com recently added an option to pay $100 for a full year, rather than $10/month. So I went ahead and paid for a year of NJ.com with a virtual card number. I figure their article explaining the $7 MVC mess was worth at least $20 to me. And somebody’s got to pay for all of their articles on pork roll sandwiches and ranked lists of 326 Bruce Springsteen songs, so it might as well be me. A year from now, I’ll figure out if I want to pay for another year.

Overall, I’ve been struggling with how to both consume and support local news during the pandemic. I generally watch NJTV News every night. Their newscast is pretty good and covers a lot of NJ news, but it’s mostly political state-level stuff. I don’t currently support NJTV or Thirteen, and I probably should. I watch enough stuff on PBS that I should toss them $5/month, at least.

I’ll also occasionally look at MyCentralJersey.com, which covers some local Somerset county news, but they’re hiding a lot of stuff behind a paywall now. They have a deal for $39 for one year, and I might go ahead and pay for that with a virtual card too.

I try to get a lot of my news through email and RSS. I use IFTTT to set up some email stuff, and The Old Reader to manage my RSS feeds (along with Reeder on iOS). IFTTT has recently introduced Pro subscriptions, and I would need to start paying for Pro to keep doing some of the stuff I’m currently doing with the service. I don’t really want to do that, so I’ve been looking at shifting more stuff into The Old Reader. But I hadn’t really looked too closely at IFTTT Pro. I just noticed a blog post from David Sparks that’s got me a little more interested in it. It sounds like they might be adding enough value to make it worth the minimum $2/month that you can pay for Pro under their current “set your own price” plan. It’s not quite clear, but maybe you can actually write code as part of Pro applets? That would be useful.

So, yeah, this is me going down a bunch of rabbit holes and thinking about spending a bunch of money. I should probably stop now.

performance tuning surprises

Here’s another blog post about the program I’m currently working on at my job. This is the same program I blogged about yesterday and a couple of weeks ago.

Today, I was trying to fix a performance issue. The app originally ran really fast. It just had to make a few API calls, filter and combine some data, then spit it back out in JSON format. It took less than a minute to run. But then, I was asked to add a new data element to one of the files. An element that I could only get by calling a new web service method repeatedly, once per order, for about 7000 orders. It shouldn’t be an expensive call, but the end result was that my 1 minute runtime was now up to 10 minutes.

The first thing I tried doing was adding some concurrency to those 7000 new API calls. I did that using the first technique described in this article, implementing a ConcurrentQueue. I wasn’t really optimistic that it would help much, but I thought it was worth a try. It didn’t really help at all. The program still took about 10 minutes to run. So I undid that change.
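Just to give a sense of what that looked like, here’s a rough sketch of the ConcurrentQueue approach. The names here (OrderDetailFetcher, GetOrderDetailAsync, FetchAllAsync, workerCount) are made up for illustration; this is the general shape of the technique, not the actual code from the app:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public static class OrderDetailFetcher
{
    // Stand-in for the real API call (one call per sales order).
    private static Task<string> GetOrderDetailAsync(string orderNum) =>
        Task.FromResult($"details for {orderNum}");

    public static async Task<IDictionary<string, string>> FetchAllAsync(
        IEnumerable<string> orderNums, int workerCount = 8)
    {
        var queue = new ConcurrentQueue<string>(orderNums);
        var results = new ConcurrentDictionary<string, string>();

        // A fixed number of workers, each draining the shared queue.
        var workers = Enumerable.Range(0, workerCount).Select(async _ =>
        {
            while (queue.TryDequeue(out var orderNum))
            {
                results[orderNum] = await GetOrderDetailAsync(orderNum);
            }
        });

        await Task.WhenAll(workers);
        return results;
    }
}

In hindsight, it’s not too surprising that this didn’t help, given where the real bottleneck turned out to be (see below).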

The next thing I did was to look and see if I was repeating any of the API calls. While I was processing 7000 records, there were some cases where the same sales order number was found on multiple records, so I was making extra unnecessary API calls. So I implemented a simple cache with a dictionary, saving the API call results and pulling them from cache when possible. That didn’t help much either. About 90% of the calls were still necessary, so I only got down from 10 minutes to 9 minutes. But that was at least worth doing, so I left that code in place.
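The cache itself is about as simple as it sounds. Here’s a minimal sketch of the idea, again with made-up names (SalesOrderCache, GetAsync, and a _fetch delegate standing in for the real API call):

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public class SalesOrderCache
{
    private readonly Dictionary<string, string> _cache = new Dictionary<string, string>();
    private readonly Func<string, Task<string>> _fetch;

    // _fetch is whatever actually calls the web service for one sales order.
    public SalesOrderCache(Func<string, Task<string>> fetch) => _fetch = fetch;

    public async Task<string> GetAsync(string salesOrderNum)
    {
        // Only call the API the first time we see a given sales order number.
        if (_cache.TryGetValue(salesOrderNum, out var cached))
            return cached;

        var result = await _fetch(salesOrderNum);
        _cache[salesOrderNum] = result;
        return result;
    }
}

A plain Dictionary is fine there because, after backing out the concurrency change, the calls all run sequentially; if I’d kept the concurrent version, it would have needed to be a ConcurrentDictionary instead.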

Then, finally, it occurred to me to look at how I was calling the API. This new API call was part of the WCF SOAP service that I’ve mentioned previously. Well, the way I wrote my wrapper code for the API, I was creating a new call context and service client for every call. I didn’t think that would be a huge issue, but I went ahead and refactored things so all the calls used the same call context and client. Well, that got the execution time back down to one minute. So really all of that extra time was spent in whatever overhead there is in spinning up the WCF client object (and I guess tearing it down when it goes out of scope).
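Conceptually, the refactor just moved the context and client out of the per-call wrapper and into something that gets created once and reused. A rough sketch, with a hypothetical wrapper class name, using the wsPingAsync call as a stand-in and assuming the GetBinding()/GetEndpointAddr() helpers from the WCF post further down this page:

// Before the refactor, every one of the ~7000 calls did roughly this:
//     var context = new CallContext { Company = "axcompany" };
//     var client = new XYZPurchInfoServiceClient(GetBinding(), GetEndpointAddr());
// After the refactor, the context and client are created once and reused:
public class PurchInfoWrapper
{
    private readonly CallContext _context;
    private readonly XYZPurchInfoServiceClient _client;

    public PurchInfoWrapper()
    {
        _context = new CallContext { Company = "axcompany" };
        // GetBinding() and GetEndpointAddr() are the same helpers shown in the
        // "Calling a SOAP WCF web service from .NET Core" post further down.
        _client = new XYZPurchInfoServiceClient(GetBinding(), GetEndpointAddr());
    }

    // wsPingAsync stands in for the per-order call here; the real wrapper has
    // a method for each service operation it needs.
    public async Task<string> PingAsync()
    {
        var rv = await _client.wsPingAsync(_context);
        return rv.response;
    }
}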

That was really unexpected. I hadn’t thought about it much, but I assumed the code behind the instantiation of the service client was just setting up a structure in memory. I guess that maybe it’s also establishing communication with the server? Theoretically, I could dig into it, but I don’t really have the time for that.

The moral of this story is that, when performance tuning, some of the stuff that you think will help, won’t, and some of the stuff that seems dubious, might actually make a huge difference!

Trying to debug a .NET Core app as a different user

I’m working on a .NET Core console app at work that, on one level, is pretty simple. It’s just calling a couple of web services, getting results back, combining/filtering them, and outputting some JSON files. (Eventually, in theory, it’ll also be sending those files to somebody via SFTP. But not yet.)

There have been a bunch of little issues with this project though. One issue is that one of the web services I’m calling uses AD for auth, and my normal AD account doesn’t have access to it. (This is the SOAP web service I blogged about last week.) So I have to access it under a different account. It’s easy enough to do that when I’m running it in production, but for testing and debugging during development, it gets a little tricky. I went down a rabbit hole trying to find the easiest way to deal with this, and thought it might be worthwhile to share some of my work.

In Visual Studio, I would normally debug a program just by pressing F5. That will compile and run it, under my own AD account, obviously. My first attempt at debugging this app under a different user account was to simply launch VS 2017 under that account. That’s easy enough to do, by shift-right-clicking the icon and selecting “run as different user”. But then there are a host of issues, the first being that my VS Pro license is tied to my AD/AAD account, so launching it as a different user doesn’t use my license, and launches it as a trial. That’s OK short-term, but would eventually cause issues. And all VS customization is tied to my normal user account, so I’m getting a vanilla VS install when running it that way. So that’s not really a good solution.

My next big idea was to use something like this Simple Impersonation library. The idea being to wrap my API calls with this, so they’d get called under the alternate user, but I could still run the program under my normal account. But the big warning in the README about not using impersonation with async code stopped me from doing that.
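For what it’s worth, the general pattern that sort of library wraps looks roughly like the sketch below. This is the underlying Windows impersonation API in .NET (LogonUser plus WindowsIdentity.RunImpersonated), not SimpleImpersonation’s actual API, and the class and method names here are mine:

using System;
using System.Runtime.InteropServices;
using System.Security.Principal;
using Microsoft.Win32.SafeHandles;

public static class ImpersonationSketch
{
    // Win32 LogonUser, used to get a token for the alternate account.
    [DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    private static extern bool LogonUser(string user, string domain, string password,
        int logonType, int logonProvider, out SafeAccessTokenHandle token);

    public static void RunAs(string domain, string user, string password, Action action)
    {
        const int LOGON32_LOGON_INTERACTIVE = 2;
        const int LOGON32_PROVIDER_DEFAULT = 0;

        if (!LogonUser(user, domain, password, LOGON32_LOGON_INTERACTIVE,
                LOGON32_PROVIDER_DEFAULT, out var token))
        {
            throw new InvalidOperationException("LogonUser failed.");
        }

        using (token)
        {
            // Impersonation only applies while this delegate is running on this
            // thread, so async work that continues after an await can easily
            // escape it. That's the gist of the README warning.
            WindowsIdentity.RunImpersonated(token, action);
        }
    }
}

Since all of my service calls are async, the actual network I/O could easily end up running outside the impersonated block, so I didn’t go down this road.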

So, at this point, I felt like I’d exhausted the ideas for actually being able to run the code under the VS debugger and dropped back to running it from a command-line. This means I’m back to the old method of debugging with Console.WriteLine() statements. And that’s fine. I’m old, and I’m used to low-tech debugging methods.

So the next thing was to figure out the easiest way to run it from the command-line under a different user account. I spent a little time trying to figure out how to open a new tab in cmder under a different account. It’s probably possible to do that, but I couldn’t figure it out quickly and gave up.

The next idea was to use this runas tool to run the program as the alternate user, but still in a PowerShell window running under my own account. I had a number of problems with that, which I think are related to my use of async code, but I didn’t dig too deeply into it.

So, eventually, I just dropped back to this:

Start-Process powershell -Credential domain\user -WorkingDirectory (Get-Location).Path

This prompts me for the password, then opens up a new PowerShell window, in the same folder I’m currently in. From there, I can type “dotnet run” and run my program. So maybe not the greatest solution, but I’d already spent too much time on it.

One more thing I wanted to be able to do was to distinguish my alternate-user PowerShell session from my normal-user PowerShell session. I decided to do that with a little customization of the PS profile for that user. I’d spent some time messing with my PowerShell profile about a month ago, and documented it here. So the new profile for the alternate user was based on that. I added a little code to show the user ID in both the prompt and the window title. Here’s the full profile script:

# Show the current user in the prompt, so it's obvious when I'm in an alternate-user session.
function prompt {
    $loc = $(Get-Location).Path.Replace($HOME,"~")
    $(if (Test-Path variable:/PSDebugContext) { '[DBG]: ' } else { '' }) +
    "[$env:UserName] " +
    $loc +
    $(if ($NestedPromptLevel -ge 1) { '>>' }) +
    $(if ($loc.Length -gt 25) { "`nPS> " } else { " PS> " })
}
# And show the user in the window title too.
$host.ui.RawUI.WindowTitle = "[$env:UserName] " + $host.ui.RawUI.WindowTitle

You can see that I’m just pulling in the user ID with $env:UserName. So that’s that.

I’m not sure if this post is terribly useful or coherent, but it seemed worthwhile to write this stuff up, since I might want to reference it in the future. I probably missed a couple of obvious ways of dealing with this problem, one or more of which may become obvious to me in the shower tomorrow morning. But that’s the way it goes.

Sunday stuff, and Legend of Korra

After taking a sick day on Friday, I felt a lot better yesterday, but was still taking it pretty easy. I read some more Buffy comics, and also started watching The Legend of Korra on DVD. I thought I was even better this morning, but then had to cut my usual Sunday walk short because I started getting tired at around the 15-minute mark. So I guess today will also be mostly Buffy and Korra and hot tea and maybe a nap or two.

Avatar and Korra both started streaming on Netflix recently, so there’s been a surge of interest and popularity for both series. I’d watched Avatar some time ago, when it was streaming… somewhere. I don’t really remember. I know I didn’t watch it when it first aired, and I know I don’t have the DVD set. But I never got around to watching Korra. I got interested in it early this year, and bought the DVD set from Amazon. At the time, it wasn’t streaming anywhere that I had access to, and the DVD set was less than $20.

When Netflix got both shows, I started seeing a lot of references to them on the web, so that reminded me that I hadn’t gotten around to Korra yet, and now seemed like a good time to start watching it. Generally, there’s a dearth of new shows coming out right now, for obvious reasons, so there have been a lot of articles and podcasts mining old ones. I listened to a Pop Culture Happy Hour episode on Avatar recently that was pretty good. I was going to link to a few more articles here, but one of them turned out to have a big spoiler in it for something I haven’t gotten to yet, so I think I’m going to avoid reading any more Korra articles until I’ve finished the series.

Anyway, I finished “book one” yesterday, and I’m really enjoying the series so far. It’s got a bit of a steampunk feel to it. Steampunk definitely had a peak at some point, maybe around the same time this was originally airing, so I might have rolled my eyes at it then, but now it seems kind of cool again. It’s kind of amazing what they did with this show, on what I assume must have been a fairly limited budget. I’m not sure how well it holds up over the next three seasons, but it’s well-respected enough that I assume it continues to be pretty good, at least.

sick day, more Buffy comics

I went on a Buffy comics kick back at the end of 2019, reading through Buffy seasons eight and nine, Angel & Faith season nine, and some miscellaneous one-shots and mini-series. I finished up in December, but now I’m getting back into it. At the time, the Dark Horse stuff was out of print, so it was a little hard to come by. I’d purchased a full run of Buffy season ten off eBay in December, but had just left it in my “to be read” pile, until now. Well, I’ve been feeling sick all week and finally gave in and took a sick day today. So I’m reading Buffy comics. And I’m really liking season ten.

The current holder of the Buffy license, Boom Studios, has finally started reprinting some of the Dark Horse stuff. So I can now buy Buffy seasons 11 and 12 right from Comixology. (Season 11 is on sale for 40% off right now, so I already bought that.) They’ve also reprinted Angel season 11, but, annoyingly, haven’t done Angel & Faith season 10 yet. Mind you, I have a ton of stuff from the Angel & Buffy Humble Bundle that I bought in 2016 that I still haven’t read yet, but I really want to read Angel & Faith season 10. Both the trade paperbacks and individual issues from that run are scarce and pricey. So I guess I’m going to have to wait for it to get reprinted. (It’s funny how sometimes I can have hundreds of comics and books to read, but the one thing I want to read is the one thing I can’t get…)

Anyway, finishing Buffy season 10 will probably take me a while, and getting through that may be enough to get me off my Buffy kick again. Meanwhile, I’ll keep an eye out for Angel & Faith back issues on eBay.

Oh, and back on the subject of sick days: one nice thing about the pandemic is that I haven’t gotten a cold in six months. I guess that long run has been broken, since I seem to have something very much like a cold right now. I often get one at the change in seasons, and this week is the week when I finally switched from shorts to long pants, so I guess this is the official transition from summer to fall. It makes sense.

Calling a SOAP WCF web service from .NET Core

I had a problem at work today that I’d previously solved, almost exactly a year ago. The project I was working on then got almost completely rewritten, so the current version of that code doesn’t have any reference to calling WCF web services at all. I kind of remembered that I’d written up a blog post about it, but couldn’t find it, since I was searching for SOAP instead of WCF. So I’m writing a new blog entry, with “SOAP” in the title, so if I have the same problem again, and I search for “SOAP” again, I’ll at least find this post, with a reference to the previous post. (Having a blog comes in handy, when your present-day self has to solve a problem that your past self has solved, but forgotten about…)

I don’t really have anything to add to that previous post. One thing I will do, though, is post the actual code here, rather than just embed a gist, since I now have a syntax highlighting solution that won’t garble it the way the previous setup did.

// https://gist.github.com/andyhuey/d67f78f6568548f66aabd20eadff8acf
// old way:
        public async Task RunAsync()
        {
            CallContext context = new CallContext();
            context.Company = "axcompany";
            string pingResp = string.Empty;
            var client = new XYZPurchInfoServiceClient();
            var rv = await client.wsPingAsync(context);
            pingResp = rv.response;
            Console.WriteLine("Ping response: {0}", pingResp);
        }
/* app.config:
    <system.serviceModel>
        <bindings>
            <netTcpBinding>
                <binding name="NetTcpBinding_XYZPurchInfoService" />
            </netTcpBinding>
        </bindings>
        <client>
            <endpoint address="net.tcp://myserver:8201/DynamicsAx/Services/XYZPurchInfoServices"
                binding="netTcpBinding" bindingConfiguration="NetTcpBinding_XYZPurchInfoService"
                contract="XYZPurchInfoSvcRef.XYZPurchInfoService" name="NetTcpBinding_XYZPurchInfoService">
                <identity>
                    <userPrincipalName value="myservice@corp.local" />
                </identity>
            </endpoint>
        </client>
    </system.serviceModel>
*/

// new way:
        public async Task RunAsync()
        {
            CallContext context = new CallContext();
            context.Company = "axcompany";
            string pingResp = string.Empty;
            var client = new XYZPurchInfoServiceClient(GetBinding(), GetEndpointAddr());
            var rv = await client.wsPingAsync(context);
            pingResp = rv.response;
            Console.WriteLine("Ping response: {0}", pingResp);
        }

        private NetTcpBinding GetBinding()
        {
            // No app.config in .NET Core, so the binding is built up in code.
            var netTcpBinding = new NetTcpBinding();
            netTcpBinding.Name = "NetTcpBinding_XYZPurchInfoService";
            netTcpBinding.MaxBufferSize = int.MaxValue;
            netTcpBinding.MaxReceivedMessageSize = int.MaxValue;
            return netTcpBinding;
        }

        private EndpointAddress GetEndpointAddr()
        {
            // The endpoint address and identity also move from app.config into code.
            string url = "net.tcp://myserver:8201/DynamicsAx/Services/XYZPurchInfoServices";
            string user = "myservice@corp.local";

            var uri = new Uri(url);
            var epid = new UpnEndpointIdentity(user);
            var addrHdrs = new AddressHeader[0];
            var endpointAddr = new EndpointAddress(uri, epid, addrHdrs);
            return endpointAddr;
        }

six months

I’ve been thinking a lot about how we’re just hitting the six month mark on this whole COVID-19 thing here in the US. March 12 was my last day in the office. I took March 13 as a vacation day. Then, we started working from home on Monday, March 16. And my company is still working from home, with plans now to return to the office in November, with managers coming in two days a week and the rest of us coming in only one day a week. This is probably the fifth “return to office” plan we’ve had. The last one had us all returning in October, two days a week. I’m not complaining or criticizing; no one is having an easy time figuring this thing out, and I’m glad my employer hasn’t forced us to come back too early. But it seems like, through the whole pandemic, we’re always just about a month away from returning to “normal,” but we never quite get there. I’m only now really starting to see that, and thinking about things I could be doing differently, treating work from home as “normal” rather than “temporary”. So that’s all a lot of wind-up to say that I finally broke down and ordered a new USB headset for home use. (There’s still one sitting on my desk at work, but they haven’t allowed us to go back in and clean out our desks. I put in a request to do that, but I guess it’s stalled somewhere, since I haven’t gotten a response to it.)

I ordered the headset from Best Buy, since they had it in stock to ship right away, while Amazon still has a one-month wait on USB headsets, at least for the model I wanted. That turned out to be a bit dangerous, since it also led to me poking around in their Blu-ray selection, and then buying four Miyazaki SteelBook Blu-rays, plus the 25th Anniversary Ghost in the Shell Blu-ray SteelBook, and pre-ordering the Weathering With You Blu-ray SteelBook. I’m not particularly attached to SteelBooks, but they do look nice, and the Miyazaki ones were only $18 each. (Apparently, they’re also $18 at Amazon right now.) The Ghost in the Shell one was also only $18. Weathering With You was more expensive, but still reasonable. So now I have a bunch of new anime discs to watch. (Even though most of them are movies I’ve already seen multiple times.)

Yesterday, I watched a bit of DC FanDome and some of NCSFest. This second day of FanDome was done differently from the first day, last month, where they actually had a schedule, kind of like a real con. This time, they just dumped all the video content out there at 1 PM Eastern time, and left it up until 1 PM today. So you could watch whatever you wanted, whenever you wanted. The content was really a hodgepodge of random stuff. I’m pretty sure most of it was recorded at least a month ago. And there wasn’t really much content around the actual comic books. But I did watch a nice panel discussion with Brian Bendis, Gene Yang, and Dan Jurgens, talking about Superman. The feeling I’m getting from DC Comics right now is that they’re really hitting the brakes and pulling back on a lot of things. I think that a lot of the FanDome content was prepared before the layoffs last month, so it’s not really reflecting the actual state of things at DC right now. I’m not really sure what DC is going to look like at the end of this year, but it’ll probably be a lot different from the way they looked at the beginning of this year, before Dan DiDio left, and COVID-19 hit, and all the layoffs happened.

NCSFest was done as one long live stream, via YouTube. They had a number of panels, interspersed with announcements of the Reuben awards. The whole nine-hour stream is available to watch on YouTube now. If you wanted to find a particular panel, I guess you could check the schedule, then do a little math and fast-forward to it. I watched a bit of the “From Panels to Publishing” panel, some of the Jim Davis panel, and some of the Mutts panel. All of them were fun to watch. Cartoonists are generally pretty cool, chill, funny people. NCS is really a professional organization, so the content isn’t necessarily geared towards fans or self-promotion; it’s more for actual cartoonists. For me, that makes it even more fun. But if you don’t want to hear Lynn Johnston and Patrick McDonnell talking about what brand of ink they use, then it’s probably not for you.

I guess that’s my ramblings for today. There’s maybe not much value in any of this for anyone else, but writing these posts helps me get through things. Tomorrow starts another week of sitting alone in my apartment, staring at a computer screen, trying to do my job without going nuts. If random blogging about comics and anime helps, I’m going to keep doing it.

tinkering with WordPress

Just for the sake of doing something useful today, I decided to tinker with my WordPress setup a bit. I’d upgraded to WordPress 5.5 in mid-August, then updated my version of PHP from 7.3.21 to 7.4.9. And, a little later, I switched to a new syntax highlighting plugin. I had one more major thing on my to-do list: upgrading to a newer version of MySQL.

My host, IONOS, makes it easy to switch PHP versions; you can do that right in their control panel. But you can’t just switch to a new MySQL version. You have to create a new database, export your data, import it to the new database, then edit your WordPress config file to point to the new database. So I did that first on my test database, and it worked fine, so I went ahead and did it with my production database too.

It had been a long time since I’d done anything even vaguely low-level with MySQL. The IONOS control panel lists your MySQL databases and gives you a link to get to a phpMyAdmin site for each of them. From there, you can back up, restore, run queries, and so on. On my first try, I forgot that you need to edit the backup SQL to remove the “create database” command, and edit the “use” command to point to the new database. But once I figured that out, I didn’t have any problems.

My old test WordPress install takes up 19 MB in the old database and 35 MB in the new one. I’m not sure why the new version is bigger than the old version. I could probably figure that out, but it’s not really important. The max size on a MySQL database in IONOS is 1000 MB, so I’m fine there. The production blog is 60 MB in the old database and 87 MB in the new one. So if you ever wondered how much space 20 years of blog entries takes up, it’s apparently 87 MB.

I did all this on my PC, rather than my Mac, and it turns out that I didn’t have an SFTP client installed. On my Mac, I generally use Commander One for SFTP file management. On the PC, I’ve recently started using Directory Opus as an Explorer replacement. Opus includes SFTP support, but it’s an add-on purchase, and I hadn’t bothered with it when I paid for my license a few months ago. I went ahead and enabled a trial of the FTP functionality today, and it worked fine. So I’ll probably pay for it when the trial expires. It’s only $10.

The first thing I did after switching to the new database was to run a WordPress site backup with UpdraftPlus. I’ve been using UpdraftPlus for a long time. I’ve stuck with the free version, which is good enough for me. The paid version is $70, plus $42/year after the first year. That’s not bad, I guess, but I don’t really need it.

The next thing on my WordPress “rainy day” list is to maybe look into switching to a new theme. The Stargazer theme that I’ve been using since 2014 hasn’t been updated in a couple of years and is being “phased out” by its developer. He’s replacing it with something called Exhale, for which he’s charging $50/year. I don’t have a problem with that, but I’d like to stick to a free theme, if I can. (If I was actually making money off this blog, I’d be more willing to spend money on subscription themes and backup services, but this is really just a little personal blog with no revenue stream, so I like to keep things simple.)

Labor Day

So, here it is, Labor Day. If you’d told me back in March that I’d still be working from home in September, and too afraid to take NJ Transit into NYC to visit a few museums on Labor Day, I’d… well, I’d be a little depressed but I probably wouldn’t be that surprised. I didn’t initially expect this thing to last so long, but there were good reasons to suspect that it would be around at least until the end of 2020, even back in March. I was ruminating in my last post about whether or not I could talk myself into going into NYC today; I’ve definitely given up on that idea.

Walking

I’ve been going out for a morning walk nearly every day since the beginning of this thing in March. That habit has been one of the bright spots of the last several months. I’ve gotten into the habit of taking a few photos on my walks and picking one of them to save to Day One, along with a short journal entry, usually just a sentence or two. Day One tracks streaks, if you post to it every day, and my current streak is nearing 200 days. (Looking back, it appears that my current streak started on March 10, just before the pandemic lockdown.) I post about other stuff in Day One too, but I almost always start the day with a photo from my walk.

Often I just walk a circuit down Main St, up one of the side streets, then down High Street, back to Main St, and back to my front door. I can get a good 20-minute, 1-mile walk out of that. On weekends, though, I’ll often walk along the Peters Brook Greenway. I can get some nice photos along there. I took the hibiscus photo in this post yesterday morning, just about one minute before getting bitten by a mosquito. That mosquito bite bothered me a lot more than it should have. I’d gotten so used to getting into a nice relaxed state on these walks, and it’s been so long since I’ve been bitten by a mosquito, that I took it as something of a personal affront. And of course my mind started playing out all sorts of nightmare scenarios. (Can you get COVID-19 from a mosquito bite? Almost definitely not. Whew.) But I got back out there this morning and did a nice 40-minute walk along the Greenway again. This time, though, I tried to keep a little further away from the water.

Reading

I really didn’t do much of anything useful yesterday, and I don’t plan on doing much today either. Yesterday, I read the “City of Bane” issues of Batman, which were the end of Tom King’s run on the title. My Goodreads review for the second half of that story nearly turned into a long essay on the Trump presidency, but I held myself back. Finishing that story got me thinking about King’s run on Batman, which started with the DC Rebirth event from the summer of 2016, which was of course just a few months before the 2016 election. That got me thinking about how much the world has changed over the last four years, and especially over the last six months. And how much it might change over the next, say, six months, or four years. I’d started reading monthly comics again in 2016 with the DC Rebirth event, and have kept reading them, though I’ve dropped Batman and Detective recently and I don’t have any titles on my current subscription list that were on it back in 2016. And I’m thinking of dropping monthly books entirely again. (But this is a subject I’ve blogged about too many times.)

Listening

Bandcamp has continued to do their Bandcamp Friday thing, where they waive their revenue share and give all the money from sales to the musicians. The last time I bought anything from them was in June, so I was overdue to spend some money on music. I wound up spending around $75 this past Friday, buying seven albums (all digital), including something called Good Music to Avert the Collapse of American Democracy, a compilation benefiting the voting rights group Fair Fight. So I hope my $20 helps, though I’m not optimistic about the future of American democracy, to be honest.