moving from ADAL to MSAL

I haven’t written a programming-related post in a while. I just had to rewrite some code that used ADAL to use MSAL, so I thought I’d write up a short post on that.

There’s a bunch of documentation around this on the Microsoft web site, but for the simple case I was interested in, it took some effort to track down.

Here are links to a couple of general articles:

What I needed was to rewrite a small block of code that calls a web API from a console app, with no user intervention. I have an Azure app registration set up to help with that, with a client ID and secret, and all that stuff. I have some links on how to set that up somewhere, but I’ll skip that for now.

The actual code I needed was something like the code here (to initialize MSAL) and here (to get a token). After I get the token, I just add it as a bearer token to the header for the request.

Here’s a bit of “before” and “after” code:

// before:
using Microsoft.IdentityModel.Clients.ActiveDirectory;

// get the parameters from a config file, or somewhere...
string clientId = ConfigurationManager.AppSettings["ClientId"];
string clientSecret = ConfigurationManager.AppSettings["ClientSecret"];
string authority = ConfigurationManager.AppSettings["Authority"];
string svcResourceId = ConfigurationManager.AppSettings["ServiceResourceId"];

AuthenticationContext authContext = new AuthenticationContext(authority);
ClientCredential clientCredential = new ClientCredential(clientId, clientSecret);
AuthenticationResult result = null;
try
{
	result = await authContext.AcquireTokenAsync(svcResourceId, clientCredential);
}
catch (AdalException ex)
{
	Console.WriteLine(String.Format(
		"An error occurred while acquiring a token\nTime: {0}\nError: {1}\n",
		DateTime.Now.ToString(), ex.ToString()));
	return;
}
//Console.WriteLine("Access Token: {0}", result.AccessToken);

HttpClient client = new HttpClient();
client.BaseAddress = new Uri(BaseAddr);
client.DefaultRequestHeaders.Accept.Clear();
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
// Add the access token to the authorization header of the request.
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", result.AccessToken);

// after: 
using Microsoft.Identity.Client;

IConfidentialClientApplication app = ConfidentialClientApplicationBuilder.Create(clientId)
	.WithClientSecret(clientSecret)
	.WithAuthority(authority)
	.Build();

AuthenticationResult authResult = null;
try
{
	// In the client-credential flow, the scope is the old resource ID plus "/.default".
	var scopes = new List<string> { svcResourceId + "/.default" };
	authResult = await app.AcquireTokenForClient(scopes).ExecuteAsync();
	//Console.WriteLine($"Access Token: {authResult.AccessToken}");
}
catch (MsalException ex)
{
	Console.WriteLine($"MSAL Error: {ex.Message}");
	return;
}

I’m not sure if anyone other than me will ever find this useful, but here it is, just in case.

Ephemeral Port Exhaustion

We’ve been having some trouble with our main web server at work over the last few months. It all boils down to ephemeral port exhaustion, which sounds kind of like a post-COVID side-effect, but is actually something that can happen to a Windows server when outbound connections use up the dynamic port range (by default, ports 49152–65535) faster than closed connections release them. The post linked above contains some useful troubleshooting information regarding this problem.

I actually think the best explanation of this issue is in a 2008 TechNet article titled Port Exhaustion and You. (That link goes to the original version of the article via archive.org. Here’s a link to its current location at Microsoft’s site.)

The basic issue is that you can run out of ports, at which point anything that relies on opening a new one fails, and you just need to reboot the server. So, not the end of the world, but not good for a production server. We’ve been working around it for a while. We had the server scheduled to reboot once a week, but upped that to twice a week when it seemed like once wasn’t enough. And now it’s gotten to the point where I really think we need to find the underlying issue and correct it.

In our case, the server is running a bunch of web services under IIS. There are more than a dozen separate services, written by various programmers, at various points in time. They’re all (probably) C# programs, but they’re written under various versions of .NET Framework and .NET Core. They’re grouped into three or four app pools.

The first thing that makes sense to look at here is how the individual programs are handling outgoing network connections. Normally, in C#, you’d use HttpClient for that. I wrote a blog post in 2018 about HttpClient and included a link to this article about how to properly use HttpClient without opening a bunch of unnecessary connections. I think I’ve got all of my own code using HttpClient correctly and efficiently, though I’m not sure about everyone else’s.
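The short version of that article's advice is to treat HttpClient as a long-lived, shared object rather than creating and disposing one per request. A minimal sketch of what that looks like (the class name and URL here are made up for illustration):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class ApiCaller
{
    // One shared instance for the lifetime of the app. Creating and disposing
    // an HttpClient per request leaves each connection's socket in TIME_WAIT,
    // which is exactly what burns through ephemeral ports under load.
    private static readonly HttpClient client = new HttpClient();

    public static async Task<string> GetOrderAsync(string orderId)
    {
        // Hypothetical endpoint, just to show the shared client in use.
        return await client.GetStringAsync($"https://example.com/api/orders/{orderId}");
    }
}
```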

It can be hard to tell what’s going on behind the scenes, though, if you need to rely on closed-source third-party libraries that also open up HTTP connections. I’ve got a few of those, and I think they’re not causing problems, but I don’t really know.

To try to monitor and track down port exhaustion issues, there are a few tools you can use. A number of the articles I’ve linked above mention “netstat -anob” or some variation of that, and I’ve found that helpful. One issue with that, if you’re running a lot of web services, is that you can’t easily see which service is causing a problem.

My big breakthrough yesterday was realizing that I could use “appcmd list wp” to get a list of the PIDs and app pool names associated with the various IIS worker processes. From that, you can tie the netstat output back to a specific app pool at least. (Of course, if you have ten web services under one app pool, then you’ve still got some more work to do.) See here for some info on appcmd.
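Putting those two together, the commands I've been running look roughly like this (from an elevated command prompt; the appcmd path can vary by IIS install):

```bat
REM Show all connections and listening ports, with owning PID and process name
netstat -anob

REM List IIS worker processes with their PIDs and app pool names,
REM so the w3wp.exe PIDs in the netstat output can be matched to app pools
%windir%\system32\inetsrv\appcmd list wp
```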

Anyway, we still haven’t quite got our problem solved, but we’re getting closer. For now, we’ll still just need to keep an eye on it and use the old IT Crowd solution: “Have you tried turning it off and on again?”

debugging

In a recent blog post, Mark Evanier included this quote from Maurice Wilkes, probably taken from his memoir:

By June 1949, people had begun to realize that it was not so easy to get a program right as had at one time appeared. It was on one of my journeys between the EDSAC room and the punching equipment that the realization came over me with full force that a good part of the remainder of my life was going to be spent in finding errors in my own programs.

(Emphasis mine.) Yep. Today, I spent too much time working on a bug that boiled down to something like this: I had a WHERE clause in some SQL that was originally “where X and Y.” I changed it to “where X and Y or Z.” It should have been “where X and (Y or Z).” Stupid parentheses.
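For anyone else who gets bitten by this: in SQL, AND binds more tightly than OR, so the unparenthesized version groups the wrong way:

```sql
-- What I wrote: parses as (X AND Y) OR Z
WHERE X AND Y OR Z

-- What I meant:
WHERE X AND (Y OR Z)
```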

Playing with Postman

Postman is a tool that I’ve been meaning to learn for years. I’m not sure when I first heard of it, but I’m pretty sure it was back when it was just a Chrome extension. So it might have been almost ten years ago. I didn’t really get serious about it until 2019, at which point I was doing enough REST API work that it seemed like I should take some time and see what all the fuss was about. At that time, I would have primarily been using Fiddler for API testing. Fiddler’s Composer tab is pretty good for basic API testing, but you can do a lot more with Postman.

Alas, when I tried setting up Postman on my development VM in 2019, I couldn’t get it to work. It would just hang every time I launched it. I went back and forth with support for a while, and tried a number of things, but I just couldn’t get it working. So I gave up and went back to Fiddler.

But I switched to a new VM a while back, so I thought I’d give Postman another try. I successfully installed it on my VM at some point last year, and poked around a bit, but never had time to actually learn it. So last week I had a bit of free time and decided to spend some of it figuring out Postman.

I started with this Postman 101 for Developers video on YouTube. The Postman YouTube channel has a bunch of useful videos. After that, I moved on to a couple of LinkedIn Learning videos:

  • Introducing Postman – This video is from Dave Fancher, and was created in 2019, so it’s a little out of date, but still useful. It’s about 90 minutes.
  • Postman Essential Training – This one is by Kristin Jackvony, and is from 2020, so it’s a little closer to up-to-date. It’s also about 90 minutes. It covers some more advanced testing stuff, like the collection runner and Newman.

Then, I moved on to a Pluralsight video: Postman Fundamentals, by Nate Taylor. That one is about 2.5 hours long, and gets a bit deeper into what you can do with JavaScript for testing API calls. I found it to be very useful for the kind of stuff I’m likely to be doing.

All three of these courses are old enough that they predate the new v8 Postman user interface, so it can occasionally be a little challenging to figure out where something is in the current version vs. where it was in 2019 or 2020. But it’s not too bad.

So I think I now have a pretty good grounding in the basics. Of course, now I’ve gotten busy again, and haven’t gotten back to Postman in the last few days. But I did at least set up a collection/workspace for one of the APIs that I work on, by importing the Swagger JSON for it. I need to clean it up a bit, but I can certainly use it for ad-hoc testing now.

Next, I need to find the time to maybe write some test scripts. My current “smoke tests” for the API are in C#. I have a number of console programs that exercise different aspects of the API. And I have a C# script that I run in LINQPad after every deployment that just does some quick non-destructive tests, to make sure the deployment didn’t break anything obvious. But I’d really like to have some more structured and exhaustive tests that I can run. I’m not 100% sure that I want to commit to Postman for that, since it does add some complexity. But it might be worth it. It was worth spending several hours learning about it, either way, and I think I’ll be using it for a lot of my ad-hoc testing now.

Online Learning Resources

I have a bunch of topics I’ve been meaning to blog about this year, and just haven’t had the time to get to many of them. A lot of my blogging recently has been more “getting stuff off my chest” blogging or “clearing out my head” blogging. But I have a few topics to cover that might be mildly useful to other people. Today’s topic is going to be an overview of online learning resources. I had to write up some notes on this stuff for work recently, since we’re doing a review of the training resources we make available in the IT department. So this post is basically repurposed from an email I sent to my boss.

I get a fair amount of use out of Pluralsight. I have my own subscription, but we also have a department subscription at work. Pluralsight is really good for .NET stuff and other Microsoft-specific programming topics. It does cover topics outside of the Microsoft ecosystem, but not as well. It’s all video training (no books).

We also have a Percipio account at work, and I’ve poked around in it a bit, but haven’t gotten much out of it. There are a lot of books and videos available, and it covers a much wider set of subjects than Pluralsight. There’s probably a lot of useful stuff in there, but it’s not that useful for me. (Percipio seems to be a rebranding of Skillsoft, which I also have access to via my ACM membership.)

I’ve also tried out LinkedIn Learning, which we have access to at work. This platform has a much wider breadth of material than Pluralsight or Percipio, and includes a lot of non-IT oriented stuff. I’m looking at the home page now, and I’m seeing stuff like “Life Mastery: Achieving Happiness and Success”. Basically, a lot of “soft skill” stuff. There’s plenty of content for programmers too though. Like Pluralsight, it’s all video (no books). From what I’ve watched, I’d say that the quality of stuff on this platform is pretty mixed. Some of it is really good, and some of it is more on the level of what you’d get from random YouTube videos. (And LinkedIn Learning is a rebranding of Lynda, which I still (I think) have access to via the Somerset County Library.)

Through my membership in ACM, I have access to the O’Reilly learning platform (formerly Safari), which is, I think, the best one out there for programming topics. They have basically every programming-related book that gets published by any of the major publishers (and some minor ones). It used to be just an ebook platform, but they’ve been adding a lot of video content too. And the ACM access used to cover just a subset of the full Safari library, but it’s now the full library, which is awesome. (See previous mention here.)

Outside of paid learning platforms, there’s a lot of free stuff out there now. Microsoft has a lot of stuff at Microsoft Learn and Channel 9. And all of their conferences went virtual (and free) in 2020. Both Build and Ignite had some good content last year. Ignite is already scheduled for March 2-4 this year, and will be free and virtual.

In terms of my own current online learning, I’m trying to finish a course on ASP.NET Core Fundamentals on Pluralsight. I’ve been really busy at work though, and haven’t watched any of it in more than a week. (And yes, I know, I could watch it at night or on the weekend, but I’ve been either tired and/or busy on weeknights and weekends lately too. But that’s a subject for an entirely different blog post.)

New Year’s Day 2021

I’ve been writing big New Year’s Day posts on this blog every year for the last several years. I might as well do one this year too. Obviously, last year was a doozy, and a lot of stuff has changed, and a lot is still in flux. I’m not even sure where to start. So I’ll start with links to the last few New Year’s posts:

And I guess I’ll follow a format not too different from previous years.

Health, Weight, and Sleep

My weight has been pretty steady at around 135 pounds this year. It dipped a bit in spring & summer, getting down to 130 briefly, but has rebounded back to 135. I dropped some weight at the beginning of the pandemic, probably because I wasn’t eating any take-out food. I’m still logging all of my meals with Lose It, which I’ve been using since 2013.

I’m also still using Sleep Cycle as an alarm clock and to log my sleep. I’ve been having some weird dreams this year, but apparently so has everyone else. My sleep quality has been mixed, I’d say. Some nights I’m fine, and some nights I’m not.

I was pretty good about exercise through the spring and summer. I did a lot of walking. I’ve cut back on the walks now, since it’s been getting colder. If I don’t go out for a morning walk now, I try to do ten minutes on my exercise bike instead. (I’m glad I didn’t get rid of that thing.) I need to be careful about not letting up too much through the rest of the winter.

On the meditation front, I’ve certainly done more meditation this year than I’d usually do. One of the reasons for that is that I’ve been working from home since March, so it’s easy to take a ten minute midday meditation break. Back when I was working in a cubicle, I was too self-conscious to meditate at work. (And, really, the office environment is too noisy for meditation anyway.) I was using Insight Timer for most of this year, but I switched to Calm in December, since I had a deal to get a free year of Calm Premium. I have enough opinions on meditation apps right now that I should probably hold them for another post. But overall, I’d say that meditation helped me get through this crazy year.

I did finally get my hearing checked this year, in March, just before the pandemic lockdown really kicked in. The results were pretty much what I expected: I’ve lost a lot of hearing in my left ear. My right ear is fine. The doctor said that I’m not really at the stage where a hearing aid would make sense. My hearing issues haven’t really mattered much this year, though. If I’m talking to anybody at work, it’s on my computer, and I can just turn up the volume as much as I need. And I’m never in a crowded restaurant with a lot of background noise, so that’s not a problem either.

Work and Professional Development

I’m feeling very lucky to have had a good, steady job this year, and to be able to work from home. My performance review for 2020 was very good. I didn’t really expect a raise this year, given the general state of the economy, but I got one. So that’s all good. There are going to be a lot of challenges ahead, going into 2021. Again, that’s probably a whole blog post of its own though.

On the professional development front, one nice thing to come out of 2020 was a lot of free virtual conferences. I didn’t participate in as many of those as I would have liked, but I did manage to watch some content from Microsoft Build and Microsoft Ignite. Most of my efforts at learning new stuff this year were centered around SharePoint Framework (SPFx) and Microsoft’s Power Platform stuff. I wasn’t really successful in getting any projects done with any of this new stuff in 2020 though. I have a couple of big projects at work that will really need to get done in 2021. I’m still not even sure if I’ll be using SPFx or Power Platform or something else though.

Looking at last year’s post, I see I was talking about trying to learn maybe Rust or Swift in 2020. I definitely didn’t do that. The one new general thing I tried to learn in 2020 was React. And that was mostly because I needed to learn it for SPFx.

Finance

I’m in pretty good shape, financially. Certainly better than most people, given the state of things. I’ve actually seen my checking account balance grow this year, presumably because I didn’t spend any money on travel, or on day trips to NYC, or even on a lot of little things like restaurant meals and gas for my car and Starbucks coffee. I expect 2021 will be similar. Given how little interest I make on my checking account, I really need to shunt some money over into my Merrill account and buy some more shares in an S&P 500 fund. The stock market (after a brief crash back in March) has done surprisingly well this year. And I probably need to sit down with a financial advisor at some point in 2021 and move some money around. There’s some stuff I want to do to simplify my finances a bit, but I can’t do it without figuring out the tax implications.

I opened two new credit card accounts this year, which is pretty unusual for me. I traded in the AmEx Green card I’d had since college for an AmEx EveryDay card. That was done mostly because the fee on the Green card had gone up to $150, so I wanted to replace it with a fee-free card. And I finally gave in and got an Apple Card. I’ve only used the Apple Card to buy my new Apple Watch, in November. I don’t really anticipate using it for anything other than Apple Store purchases.

I’ve also been thinking about getting an Amazon Prime credit card. I spent nearly $2000 at Amazon this year, so the 5% back could be as much as $100 for me. There’s really no reason for me not to get it, other than not wanting to add yet another card to my wallet.

Subscriptions

I’m always obsessing over subscriptions. The pandemic has caused me to pull the trigger on a few subscriptions that I’ve been holding out on for years. Partially because I have some extra money to spend (as noted above), and partially because I have some extra time to kill at home. So I might as well spend some money and time on nice stuff that’ll distract me from the horrible state of the world right now.

I finally subscribed to Apple Music. I signed up for a six-month free trial in October, so I don’t need to start paying for it until April. But I will likely keep it going when that happens. After years of trying to resist switching from CDs & MP3s to a subscription service, I’ve finally given in and embraced the new way of doing things.

I’ve also signed up for Disney+ and Hulu. I wanted Disney+ for The Mandalorian and Soul. And Hulu had a Black Friday deal where you could get the ad-supported tier for $2/month for a year, so that seemed worthwhile. I’m still resisting HBO Max, but I might give in on that one too eventually. If Wonder Woman 1984 had gotten better reviews, I’d probably have done it by now.

I might also sign up for the Apple One subscription bundle at some point in 2021. I don’t really need Apple TV+ or Apple Arcade, but if the pandemic keeps going, I’ll probably give in on that.

Books and Comics

According to Goodreads, I read 86 books in 2020. I’d set a goal of 100 books, and I didn’t reach it, but I’m OK with that. Most of those were comics, but (again) I’m fine with that.

For my Great American Read group, I didn’t really get through much, but I did finish Gone With The Wind in March, so that was a big one. I also read White Teeth, Invisible Man, and The Outsiders from the TGAR list. I’m still an admin in that group, and we’re still posting monthly group reads, but I’m not sure why I’m still bothering with that. The other admin is doing about half the work, so that’s good. I feel like we’re going to have to wind that group down in 2021, but I’m not in a hurry to do so.

My favorite comics of the year were probably the Resident Alien collections that I read back in May. And the Locke & Key series was also surprisingly good.

I’m still ordering a few titles from Westfield every month, but I think I’m going to wind that down over the next few months. I haven’t gotten on board with DC’s Future State thing, and I’m not reading any Marvel books. So that just leaves a few books from smaller publishers, and it’s probably best if I just switch to digital and/or trades for those. Also, my Comixology backlog is nearly 200 books (mostly collections, not single issues), so just working through that could take me a few years.

Movies

As I mentioned recently, I watched a lot of movies in 2020. Looking at Letterboxd, I see that I watched a total of 73. Probably my favorite film of the year (that actually came out in 2020) was Soul. My second favorite would have been Onward, so the year for me was bookended with solid Pixar films. I did a rewatch of all four Avengers films early in the year, and a rewatch of all the Daniel Craig Bond films just recently. Those were both fun distractions. I also tried to watch a bunch of Kurosawa films, but I only got through four. For 2021, I want to watch some more Kurosawa, and maybe rewatch a bunch of Miyazaki films. (I bought several of them on Blu-ray earlier this year, and haven’t watched any of those discs yet.)

Summary

I am kind of proud of myself for getting through 2020 in one piece, not too much worse for wear. I managed to avoid putting on weight, picking up a drinking habit, getting COVID, and losing my job. I think my mental health is reasonably OK, all things considered. I’m trying not to stress about things I didn’t do. I’d like to have spent more time on “enriching” activities and less on pure distraction, but I’m mostly OK with having watched 73 movies and lots of TV, and having read a lot more comics this year than novels or non-fiction books.

I’m expecting the first couple of months of 2021 to be pretty rough. I think the vaccine rollout will be slow. I don’t expect a change in the status quo on mask wearing and social distancing and working from home. Winter will probably still be in full force through to early March, so we’re not going to be able to do much outdoors. I think the current surge of COVID cases will continue through February, and not start to let up until March. I don’t see us all being able to return to anything like normality until very late in 2021, if at all. But, hopefully, by summer, we’ll have enough folks vaccinated and the political situation will have stabilized enough that we’ll start on the road to “normal.”

I’m thinking a lot about short-term strategies for getting through winter. Things like getting my groceries delivered, watching a lot more “comfort” TV, reading a lot of comics, working out on the exercise bike, meditating, blogging, journaling, whatever helps. I’m not making any resolutions for 2021. I’m going to take it day by day, and I think that’s what we’re all going to have to do.

 

the rest of the year

I still seem to be dealing with a lingering cold that I picked up last weekend. So this weekend has been pretty quiet. I finished reading The Outsiders, finished watching Young Wallander, and started watching Giri/Haji. I did my laundry yesterday, but that was it for productive work, really. I had my groceries delivered from Whole Foods, so I didn’t even get out to the grocery store. And I’ve mostly been living off leftovers from some takeout barbecue I got on Friday night.

Last week, I attended a remote workshop for Microsoft’s CSP program, and this week, I’m supposed to be attending a week-long class on Microsoft’s Power Platform. Last week’s workshop took up only about 3 hours each day, but this week’s class is supposed to run from 9:30 to 3:30 each day, so that’s going to take up most of the day. Normally, this would be an in-person class, but of course now it’s going to be delivered remotely. The CSP workshop was done over Teams and went pretty smoothly, but it wasn’t very interactive. I’m wondering about how the Power Platform class will go. I assume it’ll have to be more interactive than the CSP thing was. And I think it’s being done over Zoom, rather than Teams. For various reasons, I’m going to need to do the class directly on my work laptop, using only the laptop screen and keyboard. So it might get a little tough to follow along with the instructor while also working through examples on my own in a separate window. I wish I could get a multi-monitor setup going for that, but there’s no practical way to do that right now. So, anyway, it’s going to be an interesting week, trying to get through the class while also keeping up with anything else that comes up at work. (And, again, I’m very grateful to have a job right now, never mind a job that’s letting me work remotely, and paying for me to attend workshops and classes and whatnot. I’m very lucky.)

I’ve been thinking a lot about how the rest of the year is going to go. Thinking back to the summer, I guess I was vaguely aware that we might be going through a second wave at the end of the year, but it’s looking now like it’s going to be a doozy. I’ve been spending maybe too much time doomscrolling on Twitter, but there are a lot of reasonable people talking about how bad it can get if people aren’t careful around Thanksgiving and Christmas. So I’m trying to get into the proper lockdown mindset.

Since I’ve spent so little money this year on travel and other stuff like that, and since I’m going to be stuck inside a lot, I’m thinking that maybe I should pop for Disney+. It’s only $7 a month, and I keep hearing good things about The Mandalorian. Plus, the next Pixar film, Soul, is going straight to Disney+. (And it won’t cost $30 extra, like Mulan did, which is nice.) Disney+ has been around for just about a year now, and seems to be doing really well. So I guess I should give in and sign up. Eventually, I might even talk myself into canceling cable TV. But maybe I’ll keep that going until the end of the year, since (again) I’m going to be spending a lot of time indoors and I have enough disposable income to pay for both cable TV and streaming right now.

Stuck In The Mud With SPFx

I’ve been trying to make some progress with SharePoint Framework (SPFx) lately, but I keep getting stuck in the mud, so to speak. I started working on learning SPFx some time ago, but had to put it aside for other projects. Now, though, I have a little spare time to get back to it.

I set aside a few hours one day last week to work on it. But since I last worked on it, I’ve moved most of my work to a new dev VM. So step one was moving all of my SPFx projects over to the new VM. That shouldn’t have been a big deal. But of course each SPFx project has a node_modules folder of about 725 MB, across more than 100,000 files. So just copying everything over wasn’t going to work. So step 0.1 (let’s say) would be to delete the node_modules folders. Since I had less than a dozen work projects, I thought I’d use brute force for that, and just click each node_modules folder in Explorer and hit the delete key on my keyboard. Of course I then realized that asking Windows Explorer to move 100,000+ files to the recycle bin is a bad idea. So I started looking into writing a script to do it.

I found something called npkill that looked like it would do the trick without me even having to write a script, but I couldn’t get it working in Windows. (It’s probably possible to get it working in Windows, but I hit a snag and decided not to spend too much time on it.)

So I was back to writing a script. I started putting something together in PowerShell, but then I found rimraf, which looked promising and (according to at least one blog post I read) would be faster than doing the equivalent recursive delete natively in PowerShell. So I wrote a PowerShell script using rimraf. I wound up with this simple one-liner:

gci -name | % { echo "cleaning $_\node_modules..."; rimraf $_\node_modules }

I’m not sure if rimraf was actually faster than just using a native PowerShell command, but it worked. So that got me down to a manageable set of files that I could zip up and move to the new VM. (There was actually some trouble with that too, but I won’t get into that.) And that pretty much killed the time I’d put aside to work on SPFx for day one. Sigh.
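For anyone who'd rather skip the extra dependency, a plain PowerShell version of the same cleanup would look something like this (a sketch; try it on a copy of the folder first):

```powershell
# Delete the node_modules folder under each project directory,
# skipping straight past the recycle bin.
Get-ChildItem -Directory | ForEach-Object {
    $nm = Join-Path $_.FullName "node_modules"
    if (Test-Path $nm) {
        Write-Host "cleaning $nm..."
        Remove-Item $nm -Recurse -Force
    }
}
```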

For day two, I wanted to get back to a simple project that would just call a web service and return the result. I’d previously stubbed out the project with the Yeoman generator on my old VM, so now I just had to do “npm install” to get the node_modules folder back. Long story short, I got some unexpected errors on that which led me down some rabbit holes, chasing after some missing dependencies. That got me messing around with using yarn instead of npm, which someone had recommended to me. That didn’t really help, but after a bunch of messing around, I think I figured out that the missing dependencies weren’t really a problem. So just messing around with npm and yarn, and getting the project into a git repo, killed the time I’d set aside on day two.

For day three, I actually went into the project and added a web service call, to a local service I wrote, but immediately hit an error with the SPFx HttpClient not liking the SSL certificate on that web service. So that got me trying to figure out if you can bypass SSL certificate checking in the JavaScript HttpClient the same way you can in the .NET HttpClient. I got nowhere with that, but it did set me down the path of looking into that SSL cert, and realizing that it’s due to expire in January, but I didn’t have a reminder to renew it in Outlook. Which got me going through all of my SSL certs and Outlook reminders and trying to make sure I had everything covered for anything that might expire soon. And that sent me down a couple of other administrative side-paths that used up all the time I’d set aside on day three.

So after three days, I basically just had a sample SPFx project that makes one simple web service call, which fails. Sigh. I picked it back up today, trying to fix the call. I got past the SSL issue. But that led me down a couple of more rabbit holes, mostly regarding CORS. So, good news: I now understand CORS a lot better than I did this morning. Bad news: I spent most of the morning on this and can’t really spend most of the afternoon on it.

At some point, I’ll get over all these initial speed bumps and actually start doing productive work with SPFx. Maybe.

performance tuning surprises

Here’s another blog post about the program I’m currently working on at my job. This is the same program I blogged about yesterday and a couple of weeks ago.

Today, I was trying to fix a performance issue. The app originally ran really fast: it just had to make a few API calls, filter and combine some data, then spit it back out in JSON format. It took less than a minute to run. But then I was asked to add a new data element to one of the files: an element I could only get by calling a new web service method repeatedly, once per order, for about 7000 orders. It shouldn’t be an expensive call, but the end result was that my one-minute runtime was now up to 10 minutes.

The first thing I tried was adding some concurrency to those 7000 new API calls. I did that using the first technique described in this article, implementing a ConcurrentQueue. I wasn’t really optimistic that it would help much, but I thought it was worth a try. It didn’t help at all: the program still took about 10 minutes to run. So I undid that change.
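The basic shape of that technique is to load all the work items into a ConcurrentQueue and let a fixed number of worker tasks drain it, so several calls are in flight at once. Here’s a minimal sketch of the pattern; the order numbers and GetOrderDetailAsync are hypothetical stand-ins for the real service call, not my actual code:

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

class OrderDetailFetcher
{
    // Hypothetical stand-in for the real one-call-per-order service method.
    static async Task<string> GetOrderDetailAsync(int orderNumber)
    {
        await Task.Delay(1); // simulate network latency
        return $"detail-{orderNumber}";
    }

    // Load all the order numbers into a ConcurrentQueue, then let a fixed
    // number of workers drain it, so up to maxWorkers calls run at once.
    public static async Task<ConcurrentDictionary<int, string>> FetchAllAsync(
        IEnumerable<int> orderNumbers, int maxWorkers = 8)
    {
        var queue = new ConcurrentQueue<int>(orderNumbers);
        var results = new ConcurrentDictionary<int, string>();

        var workers = Enumerable.Range(0, maxWorkers)
            .Select(_ => Task.Run(async () =>
            {
                while (queue.TryDequeue(out int orderNumber))
                    results[orderNumber] = await GetOrderDetailAsync(orderNumber);
            }))
            .ToArray();

        await Task.WhenAll(workers);
        return results;
    }
}
```

As I found out, this only helps if the per-call latency is the bottleneck; if the time is going somewhere else (as it was in my case), running the calls in parallel doesn’t buy you much.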

The next thing I did was check whether I was repeating any of the API calls. Of the 7000 records I was processing, some shared the same sales order number, so I was making extra, unnecessary API calls. So I implemented a simple cache with a dictionary, saving the API call results and pulling them from the cache when possible. That didn’t help much either: about 90% of the calls were still necessary, so I only got down from 10 minutes to 9 minutes. But it was at least worth doing, so I left that code in place.
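The cache itself is nothing fancy, just a dictionary keyed on the sales order number, checked before making the call. A rough sketch (the class and method names here are made up for illustration; the call counter is only there to make the caching behavior visible):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

class CachingOrderClient
{
    private readonly Dictionary<string, string> _cache = new Dictionary<string, string>();

    // Counts actual service calls, just to show the cache working.
    public int ServiceCallCount { get; private set; }

    // Hypothetical stand-in for the real API call.
    private Task<string> CallServiceAsync(string salesOrderNumber)
    {
        ServiceCallCount++;
        return Task.FromResult($"result-for-{salesOrderNumber}");
    }

    // Only hit the service the first time we see a given sales order number.
    public async Task<string> GetOrderInfoAsync(string salesOrderNumber)
    {
        if (_cache.TryGetValue(salesOrderNumber, out string cached))
            return cached;

        string result = await CallServiceAsync(salesOrderNumber);
        _cache[salesOrderNumber] = result;
        return result;
    }
}
```

One caveat: a plain Dictionary is fine when the calls happen sequentially, as mine did after I backed out the concurrency change; if you combine this with the worker-task approach above, you’d want ConcurrentDictionary instead.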

Then, finally, it occurred to me to look at how I was calling the API. This new call was part of the WCF SOAP service that I’ve mentioned previously, and the way I’d written my wrapper code, I was creating a new call context and service client for every call. I didn’t think that would be a huge issue, but I went ahead and refactored things so all the calls shared the same call context and client. That got the execution time back down to one minute. So really, all of that extra time was overhead from spinning up the WCF client object (and, I guess, tearing it down when it goes out of scope).
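Structurally, the refactoring amounted to this before/after. The OrderServiceClient class below is a fake that just counts how many times it gets constructed, standing in for the real generated WCF client (whose constructor is where the expensive setup happens):

```csharp
using System.Threading.Tasks;

// Hypothetical stand-in for the generated WCF client. The real one does
// significant work on construction; this fake just counts constructions.
class OrderServiceClient
{
    public static int ConstructionCount;
    public OrderServiceClient() { ConstructionCount++; }

    public Task<string> GetOrderDetailAsync(string orderNumber) =>
        Task.FromResult($"detail-{orderNumber}");
}

class OrderServiceWrapper
{
    // Before: a new client per call. Over 7000 calls, the construction
    // overhead dominates the runtime.
    public static async Task<string> GetDetailSlowAsync(string orderNumber)
    {
        var client = new OrderServiceClient();
        return await client.GetOrderDetailAsync(orderNumber);
    }

    // After: one client shared across all calls, constructed once.
    private static readonly OrderServiceClient SharedClient = new OrderServiceClient();

    public static Task<string> GetDetailFastAsync(string orderNumber) =>
        SharedClient.GetOrderDetailAsync(orderNumber);
}
```

With a real WCF client you’d also want to close (or abort) the shared client when the program finishes, but the point here is just that the construction cost is paid once instead of 7000 times.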

That was really unexpected. I hadn’t thought about it much, but I’d assumed that instantiating the service client just set up a structure in memory. Apparently it does a fair amount of work up front; I’d guess it’s building the channel stack, and maybe also establishing communication with the server. Theoretically, I could dig into it, but I don’t really have the time for that.

The moral of this story is that, when performance tuning, some of the stuff that you think will help, won’t, and some of the stuff that seems dubious, might actually make a huge difference!

Trying to debug a .NET Core app as a different user

I’m working on a .NET Core console app at work that, on one level, is pretty simple. It’s just calling a couple of web services, getting results back, combining/filtering them, and outputting some JSON files. (Eventually, in theory, it’ll also be sending those files to somebody via SFTP. But not yet.)

There have been a bunch of little issues with this project though. One issue is that one of the web services I’m calling uses AD for auth, and my normal AD account doesn’t have access to it. (This is the SOAP web service I blogged about last week.) So I have to access it under a different account. It’s easy enough to do that when I’m running it in production, but for testing and debugging during development, it gets a little tricky. I went down a rabbit hole trying to find the easiest way to deal with this, and thought it might be worthwhile to share some of my work.

In Visual Studio, I would normally debug a program just by pressing F5. That will compile and run it, under my own AD account, obviously. My first attempt at debugging this app under a different user account was simply to launch VS 2017 under that account. That’s easy enough to do, by shift-right-clicking the icon and selecting “run as different user”. But then there’s a host of issues, the first being that my VS Pro license is tied to my AD/AAD account, so launching it as a different user doesn’t use my license, and runs it as a trial. That’s OK short-term, but would eventually cause issues. And all my VS customization is tied to my normal user account, so I get a vanilla VS install when running it that way. So that’s not really a good solution.

My next big idea was to use something like this Simple Impersonation library. The idea being to wrap my API calls with this, so they’d get called under the alternate user, but I could still run the program under my normal account. But the big warning in the README about not using impersonation with async code stopped me from doing that.

So, at that point, I felt like I’d exhausted the ideas for running the code under the VS debugger, and dropped back to running it from the command line. This means I’m back to the old method of debugging with Console.WriteLine() statements. And that’s fine. I’m old, and I’m used to low-tech debugging methods.

So the next thing was to figure out the easiest way to run it from the command-line under a different user account. I spent a little time trying to figure out how to open a new tab in cmder under a different account. It’s probably possible to do that, but I couldn’t figure it out quickly and gave up.

The next idea was to use this runas tool to run the program as the alternate user, but still in a PowerShell window running under my own account. I had a number of problems with that, which I think were related to my use of async code, but I didn’t dig too deeply into it.

So, eventually, I just dropped back to this:

Start-Process powershell -Credential domain\user -WorkingDirectory (Get-Location).Path

This prompts me for the password, then opens up a new PowerShell window, in the same folder I’m currently in. From there, I can type “dotnet run” and run my program. So maybe not the greatest solution, but I’d already spent too much time on it.

One more thing I wanted to be able to do was to distinguish my alternate-user PowerShell session from my normal-user PowerShell session. I decided to do that with a little customization of the PS profile for that user. I’d spent some time messing with my PowerShell profile about a month ago, and documented it here. So the new profile for the alternate user was based on that. I added a little code to show the user ID in both the prompt and the window title. Here’s the full profile script:

function prompt {
    $loc = $(Get-Location).Path.Replace($HOME,"~")
    $(if (Test-Path variable:/PSDebugContext) { '[DBG]: ' } else { '' }) +
    "[$env:UserName] " +
    $loc +
    $(if ($NestedPromptLevel -ge 1) { '>>' }) +
    $(if ($loc.Length -gt 25) { "`nPS> " } else { " PS> " })
}
$host.ui.RawUI.WindowTitle = "[$env:UserName] " + $host.ui.RawUI.WindowTitle

You can see that I’m just pulling in the user ID with $env:UserName. So that’s that.

I’m not sure if this post is terribly useful or coherent, but it seemed worthwhile to write this stuff up, since I might want to reference it in the future. I probably missed a couple of obvious ways of dealing with this problem, one or more of which may become obvious to me in the shower tomorrow morning. But that’s the way it goes.