Posts

Showing posts from 2021

AWS EC2 Can't Reach EC2 Metadata Service After Subnet Change

Just another black box day. I had to move an EC2 instance to a different subnet, so I created an AMI from it and launched it on the new subnet. The launch appeared to go well and the instance had no issue reaching the internet, but apparently not everything was fine. The AWS agents on the instance, such as the SSM agent and the CodeDeploy agent, stopped working. After checking the logs, I found they couldn't access the EC2 metadata service. Since this is a Windows Server 2019 instance, it also showed that Windows was not activated, which was strange. In the following article, I found out that my issue was due to the "Gateway Address doesn't match that of the current subnet": https://aws.amazon.com/premiumsupport/knowledge-center/waiting-for-metadata/ Running the suggested command fixed the issue: Import-Module c:\ProgramData\Amazon\EC2-Windows\Launch\Module\Ec2Launch.psm1 ; Add-Routes
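For reference, the fix from the AWS article as a PowerShell one-liner (Add-Routes re-adds the EC2Launch default link-local routes, including the route to the instance metadata service), followed by an optional check that only applies if the instance still allows IMDSv1:

    # Re-add the default routes so the instance can reach the metadata service again
    Import-Module c:\ProgramData\Amazon\EC2-Windows\Launch\Module\Ec2Launch.psm1 ; Add-Routes

    # Optional sanity check (works only when IMDSv1 is allowed)
    Invoke-RestMethod -Uri 'http://169.254.169.254/latest/meta-data/instance-id'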

App.config File Transformation in Azure DevOps

Utilizing the right tool for the right job makes life easier. That also means we have to keep learning and exploring new tools. Some, if not many, of us probably wish we could do config file transformation on App.config for various environments, just like Web.config. To do that locally, we can use something like SlowCheetah, but in my case, I want to do it before deploying to the server. So I used the Azure DevOps File Transform task. Learning from my experience with web.config, I added a couple of transformation files, such as App.Prod.config. It took a couple of tries for me to get it working right, and the following are the steps I took: Add the transformation file and set its Copy to Output Directory property to Always. Since App.config is usually renamed to <ApplicationName>.exe.config, and the File Transform task requires config files to follow a certain naming pattern (for example, App.<environment>.config can only be used to transform App.config), I set Copy to Output...
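For illustration, a minimal App.Prod.config sketch in XDT transform syntax (the appSettings key below is a made-up example, not something from my actual project):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <appSettings>
        <!-- Replace the value of this key when transforming for Prod -->
        <add key="ApiBaseUrl" value="https://api.example.com"
             xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
      </appSettings>
    </configuration>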

NullReferenceException on VB.NET Anonymous Type

As much as we talk about decoupling in the computing world, it is probably impossible to achieve completely. The best we can do is reduce coupling. But my point is actually about how fragile our code is nowadays. It seems like I have to keep relying on workarounds just to keep an application working. I had a working anonymous type, and there was no change on that particular line. It looks like the following: Dim theValue = SharedFunction.GetValue() Some changes were made to the project, but they had nothing to do with that particular line of code. I switched from VS2017 to VS2019 and updated NuGet packages without touching that code. The project builds successfully. But at runtime, that particular line threw a NullReferenceException. At first, I thought it was my shared function, but it worked fine when I ran it in the Immediate Window. A few other details: the project uses .NET Framework 4.6.2, and the problematic code is nested inside an If statement, which is itself nested inside a #If directive. ...
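For context, a minimal sketch of the structure described above (the compilation constant, condition, and surrounding method are hypothetical stand-ins for the real ones):

    ' Hypothetical sketch of the nesting described in the post
    Private Sub ProcessSomething(inputIsValid As Boolean)
    #If SOME_FEATURE Then
        If inputIsValid Then
            ' This is the line that threw the NullReferenceException at runtime
            Dim theValue = SharedFunction.GetValue()
        End If
    #End If
    End Sub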

Retain Web.Config Transformation Files in Azure DevOps

Documentation is never enough, and it will never be able to keep up with change. The only way to really find the answer to a problem is to experiment. I have a simple task: keep the web.config transformation files during the build pipeline in Azure DevOps so I can use them to transform web.config during the release pipeline and customize it per environment. By transformation files, I mean files such as web.Release.config. I had it working by adding a copy task in Azure DevOps, but that was just a workaround, so I tried to find a more elegant way of doing the same thing. At first, I only set the Build Action of each transformation file to Content, which is supposed to include it during deployment. However, after the build was done, I noticed the transformation files were discarded and thus not included. After some trial and error, the steps that work for me are: setting the Build Action to Content for each transformation file and removing the <DependentUpon> tag of each transformation file in the project fi...
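A hedged sketch of what the project-file entry for one transformation file might look like before and after those two steps (the exact entry in a real project may differ):

    <!-- Before: a typical generated entry, nested under Web.config and not marked as Content -->
    <None Include="Web.Release.config">
      <DependentUpon>Web.config</DependentUpon>
    </None>

    <!-- After: Build Action set to Content and the DependentUpon tag removed -->
    <Content Include="Web.Release.config" />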

VB.NET Property is of Unsupported Type

Backward compatibility is hard, and there is a saying that "The only constant is change". The error message this time made me scratch my head for an hour or so. I updated the CsvHelper package in one of my applications to 23.0.0 and immediately noticed errors. Looking at the change log ( https://joshclose.github.io/CsvHelper/change-log ), there is indeed a breaking change. I was aware of the change from parameters to a struct, as specified in the change log, and made the required change. For some reason, Visual Studio didn't like the configuration part, for example the PrepareHeaderForMatch delegate, in which it couldn't access the property of the struct argument. The error message says: Property 'CsvHelper.PrepareHeaderForMatchArgs.Header' is of unsupported type. This happens on Visual Studio 2017. So, I visited the GitHub repository: https://github.com/JoshClose/CsvHelper/blob/master/src/CsvHelper/Delegates/PrepareHeaderForMatch.cs and noticed the following code: ...
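To give an idea of the code involved, here is a rough VB.NET sketch of the kind of configuration that triggered the error; it is a simplified stand-in, not my actual code:

    Imports System.Globalization
    Imports CsvHelper.Configuration

    Module CsvConfigSketch
        Sub Main()
            ' VS2017 flagged the args.Header access below with:
            ' "Property 'CsvHelper.PrepareHeaderForMatchArgs.Header' is of unsupported type."
            Dim config = New CsvConfiguration(CultureInfo.InvariantCulture) With {
                .PrepareHeaderForMatch = Function(args) args.Header.ToLower()
            }
        End Sub
    End Module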

USERPROFILE Environment Variable Resolves to C:\windows\system32\config\systemprofile via AWS Systems Manager

Context is important, which is why different environments can and will produce different values. This time it happened when I ran a PowerShell script through AWS Systems Manager (SSM). I intended to reliably download a file to the Downloads directory through a PowerShell script and AWS Systems Manager. At first, it seemed straightforward: SSM Agent usually runs as ssm-user with administrator privileges, and the USERPROFILE environment variable usually resolves to C:\Users\<username>, at least locally. So $env:USERPROFILE\Downloads should work as intended. But that wasn't so in my particular case. Instead, it resolved to C:\windows\system32\config\systemprofile\Downloads, which of course doesn't exist, so the download failed. I also tried using $HOME, and it resolved to the same path as $env:USERPROFILE. Reading online, there are indications that this happens on some machines and not others, and that this behavior has been around for a while. Some solutions online suggest tweaking the ...
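A quick way to see the symptom is to run a small diagnostic through SSM Run Command and compare its output with the same script run in a local PowerShell session; this is just a sketch for checking the values, not the fix itself:

    # Print what the profile-related variables resolve to when run through SSM
    Write-Output "Running as : $(whoami)"
    Write-Output "USERPROFILE: $env:USERPROFILE"
    Write-Output "HOME       : $HOME"
    Write-Output "Target dir : $env:USERPROFILE\Downloads (exists: $(Test-Path "$env:USERPROFILE\Downloads"))"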

Error with No Exception Thrown When Transferring Data from Amazon Elasticsearch Service to Amazon DocumentDB

We all wish our applications performed as fast as possible, and to do that, sometimes we need to slow down. I have a project in which I have to get data from Amazon Elasticsearch, process the data, and save the result into Amazon DocumentDB. I processed the data asynchronously, so my processing application performed really fast. Since result accuracy is important, I deleted the result and reran the process just to make sure the same input would produce the same result. However, the results were different, and no error or exception was thrown. After a few hours of troubleshooting, I noticed that the data were not immediately available right after inserting them into DocumentDB. Since DocumentDB separates storage and compute, it takes a bit of time to store the data and make them available. In my code, I needed to query the inserted data immediately. That means sometimes the data were saved fast enough to be available, and sometimes they were not. So my solution is putting a sl...
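One possible way to handle this read-after-write gap (a sketch only, and not necessarily what the truncated solution above does) is to poll briefly for the freshly inserted document before continuing. A hypothetical Python/pymongo version:

    import time

    def wait_until_readable(collection, doc_id, timeout_seconds=5.0, poll_interval=0.2):
        """Poll until the just-inserted document can be read back, or give up."""
        deadline = time.monotonic() + timeout_seconds
        while time.monotonic() < deadline:
            if collection.find_one({"_id": doc_id}) is not None:
                return True
            time.sleep(poll_interval)  # slow down a little so DocumentDB can make the write visible
        return False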

Running Express JS Application via Plesk in Mocha Host Windows Hosting

Nowadays, running an application while balancing best practices and ease of use requires a ton of different technologies. That means the documentation is scattered too. I have an Express JS application that I need to host, and since I have an active plan with Mocha Host, I decided to host it there. My hosting server is Windows, and it supports Node.JS. Looking into the settings, it is as simple as enabling Node.JS. However, after managing to deploy the application, it didn't run as expected, returning 404 for known URLs. By the time I finally managed to solve the issue, there were quite a few steps I needed to configure to get my application working. Step 1: I'm using Express JS version 4 generated via express-generator, so the starting script is not app.js but /bin/www. So, following the instructions in the article below: https://www.plesk.com/blog/product-technology/node-js-plesk-onyx/ I created a new entry file called service.js. The content is as simple as the following: const app = require(...
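For illustration, a hypothetical sketch of what such a service.js entry file can look like (assuming the express-generator app exports its Express instance from ./app; the excerpt above is cut off, so this is not necessarily the post's exact file):

    // service.js - load the generated Express app and listen on the port the host provides.
    // Under iisnode on Windows, process.env.PORT is typically a named pipe.
    const app = require('./app');

    const port = process.env.PORT || 3000;
    app.listen(port);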