
Showing posts from 2019

Android Room Database Subsequent Insert Failed Due to Broken AutoIncrement

Room DB has managed to abstract away complicated SQL statements, which is pretty nice. But as with other new things, it takes a while to get used to. It all starts with an entity such as the following:

@Entity
data class Item(
    @PrimaryKey(autoGenerate = true) val id: Int = Int.MIN_VALUE,
    @ColumnInfo val name: String?
)

and my Dao has the following function:

@Insert(onConflict = OnConflictStrategy.IGNORE)
suspend fun insert(item: Item)

The first insert went well, but I noticed that my subsequent inserts failed. For some reason, Room tried to assign the same value for the id. After a bit of trial and error, I managed to fix it by changing the Int.MIN_VALUE default to 0. So the entity class becomes:

@Entity
data class Item(
    @PrimaryKey(autoGenerate = true) val id: Int = 0,
    @ColumnInfo val name: String?
)

"Specify one workspace" - Deleting Team Foundation Phantom Workspace

One day, my colleague discovered a duplicate Team Foundation workspace on his computer after rebooting his PC. The duplicate made him unable to find the projects tied to the remote repository he was working on. The duplicated workspace has exactly the same name, owner, and computer name as the other one, except it has no existing mappings. Worse, Visual Studio detected only one workspace, and deleting it didn't seem to work. After figuring out that workspaces can be managed from the command line, we started with tf.exe. Using the following command, we managed to get a list of all workspaces:

tf workspaces /collection:<domain>.visualstudio.com\<organization> /owner:*

But our attempt to delete the particular workspace using the following command failed with the message "Specify one workspace":

tf workspace /delete <workspace_name>;<domain>\<owner_name>

After browsing more, we found this thread:  https://developercommunity.visualstudio...

Moving ASP.NET Session State Server to Aurora/MySQL

Our database was on MS SQL Server and we were in the middle of moving to Aurora with MySQL compatibility. Obviously there are differences to be resolved, and one of them was that we were not sure how to move the ASP.NET session state. After several troubleshooting sessions (pun not intended), I finally managed to move the session state store to Aurora. The following two webpages have been very helpful, albeit the first one is outdated: https://www.codeproject.com/Articles/633199/Using-MySQL-Session-State-Provider-for-ASP-NET https://dev.mysql.com/doc/connector-net/en/connector-net-programming-asp-provider.html My steps are as follows:
1. Disable the current state server by commenting out/removing the sessionState tag under system.web in web.config.
2. Add the MySql.Web NuGet package (as of this post, the working one is version 8.0.17).
3. Add a new sessionState tag under system.web: <sessionState mode="Custom" customProvider="MySqlSessionStateStore">  ...

Install Previous Version of Nuget Package that is not Visible

Sometimes a new software version also introduces a new bug and we need to roll back. This time it was one of the NuGet packages that we use. When troubleshooting a MySql NuGet package issue, I noticed that I couldn't revert to the previous version using the version drop-down in the NuGet manager window in Visual Studio. A bit of searching reminded me that I can use the Package Manager Console to install a package; maybe I can specify the version and the server still has it. So I uninstalled the latest package and ran the following command:

Install-Package <package_name> -Version <version> -Source nuget.org

The -Source option is optional. In my case, it somehow defaulted to some other source, so I had to specify it in the command. Hit enter and I got the previous version installed. Problem solved!

Convert FAT32 to NTFS with No Data Loss in Windows

Filesystems are, as usual, pretty complicated. I happened to have a new external hard drive that for some reason was formatted as FAT32. Of course, I didn't notice until I had put a bunch of data on it. Then the time came when I needed to store a file larger than the 4GB limit of FAT32. Browsing around the internet, I found out that I can convert to NTFS without data loss and without third-party software, from https://www.tenforums.com/tutorials/85893-convert-fat32-ntfs-without-data-loss-windows.html . The steps that I took are:
1. Make sure the data is backed up somewhere else.
2. Close all software/applications that have the drive open. I closed File Explorer too.
3. Run Command Prompt as administrator.
4. Run the following command: convert <drive> /fs:ntfs. For example: convert D: /fs:ntfs
5. Restart the computer.
That's it!

Kernel not Updated on Ubuntu 14.04 in AWS EC2 Nitro-based Instance

Sometimes a simple thing that works for many others doesn't work for us, and this time it was updating the Linux kernel. It all started when we were trying to migrate an m1 instance to a t3. We were aware that t3 is a Nitro-based instance, so the NVMe and ENA modules have to be in place. Even after following the AWS documentation, the modules didn't seem to be installed properly, even after a reboot. Then the journey began. First, I ran the following command to check which kernel is actually loaded:

uname -r

In this particular case, it returned 3.13.0-45-generic, and I knew that was not the latest. So, as suggested by Amazon support, I ran the following commands one by one to check that the latest linux-aws package was properly installed and that the NVMe driver is built into the kernel (the NVME config options are set to 'y'):

sudo apt-cache policy linux-aws
ls -al /boot/
cat /boot/config-4.4.0-1044-aws |grep -i "nvme"

And the results were all as expected. The late...
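Not from the post, but two extra checks I would also run on the instance to see whether the ENA and NVMe drivers are actually available to the running kernel (a quick sketch, assuming the drivers ship as modules on your kernel):

modinfo ena                 # confirm the ENA driver is available to the running kernel
lsmod | grep -E "ena|nvme"  # see whether the ena/nvme modules are currently loaded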

ASP.NET Web Forms, Auth0 and OWIN

Sometimes supporting an old technology is much more troublesome, but when you manage to overcome the challenge, it feels that much more satisfying. My boss wanted to use Auth0 for authentication, but the application we needed to modify is in ASP.NET Web Forms; there is no Auth0 quickstart for ASP.NET Web Forms and I couldn't find an example online. I remembered seeing somewhere that we can use OWIN on ASP.NET Web Forms with a bit of tweaking. Following the awesome blog post below, I managed to get OWIN to work. https://tomasherceg.com/blog/post/modernizing-asp-net-web-forms-applications-part-2 Next, I followed the Auth0 quickstart for ASP.NET (OWIN): https://auth0.com/docs/quickstart/webapp/aspnet-owin/01-login#configure-auth0 One thing to note: the RedirectUri specified in the app has to be registered in Callback URLs in the Auth0 account. Then comes the customization. First, I needed to be able to secure pages. Reading online, I found out that simply adding the f...

Amazon Aurora Serverless "The provider did not return a ProviderManifestToken string" Error

We had a working application that connected to Amazon Aurora Serverless just fine, and recently it started to intermittently fail to connect with the error "The provider did not return a ProviderManifestToken string". The relevant specs of the application:
.NET Framework 4.6.2
MySQL.Data 6.10.9
MySQL.Data.Entity 6.10.9
EntityFramework 6.2.0 (EF6)
When I debugged the application, the inner exception had the message "Sequence contains more than one matching element". Upon more troubleshooting, I remembered that Aurora Serverless is a cluster and requires at least 2 subnets residing in 2 different Availability Zones. Amazon does not recommend using IP addresses and instead provides an endpoint to connect to the cluster. That means the endpoint might resolve to two IP addresses. So, I decided to see what a DNS lookup would show, and indeed, the endpoint resolves to two IP addresses. Hence, my suspect is the MySQLConnection w...

OpenVPN Client Save Connection (IP Address)

When I first used the OpenVPN client, it was set up for me, so I'm missing out on some configuration know-how. Now I need to set up OpenVPN on a new computer, and I need to save a connection so I can quickly connect to it the next time I log on. In this case, I'm using a v2.x.x OpenVPN client and I couldn't figure out how to do that. It turns out that I have to be disconnected from all connections and then select Import > From server.... Then I can enter my connection information and it will be saved and easily accessible just by hovering over it and selecting Connect...

Refresh System Environment Variables for IIS and Visual Studio

Environment variables can sometimes be a pain to deal with. Since environment variables are often cached or loaded only once, a change might not be immediately reflected in the application we intend it for. In my case, I needed to debug our ASP.NET application and needed the new environment variables. Restarting Visual Studio itself was not enough. I tried killing the worker process and that didn't help either. In the end, I tried the one thing that worked, which I got from a forum: enter the following command in a Command Prompt or PowerShell run in admin mode:

iisreset

A2 Hosting New Website ASP.NET Default Page Not Shown

This might affect other hosting providers too, but it just happened that I bumped into it in my A2 Hosting account. Basically, I created a new website and uploaded the files. I had the domain name servers updated. Everything looked good. But for some reason, my default page was not shown by default. I checked the DNS propagation and it was complete. I tried visiting one of the pages on my website and it worked great. But somehow, when entering only the domain name, it didn't bring up the default page. After several hours, I figured out that there was an index.html that had been put there when the directory was set up. Renaming it to something other than the entries in the list of default documents fixed the issue.

AWS Code Deploy Error: Make sure your AppSpec file specifies "0.0" as the version

I got this error when attempting to deploy using AWS Code Deploy. Checking the appspec.yml, it does have: version: 0.0. One of the suggestions on StackOverflow was that the line endings have to be Linux. However, that did not help in my case; since I'm deploying to Windows, the line endings have to be Windows. After a couple of rounds of trial and error, I found out that Visual Studio saved my appspec.yml with UTF-8 encoding, so I proceeded to change it to "Western European (Windows) - Codepage 1252" encoding and the code deploy worked flawlessly. To change the encoding, I used the following steps in VS2017:
1. Select the file in Solution Explorer.
2. Click the File menu in Visual Studio.
3. Select Save <filename> As...
4. In the pop-up, click the tiny arrow next to the Save button.
5. Select Save with Encoding...
6. Select Western European (Windows) - Codepage 1252 for the Encoding and Current Setting for the Line endings.

VB.NET Exit Sub Finally

I have a recurring VB.NET application that starts every 15 minutes. As a precaution, the next scheduled instance of the application will immediately exit if the previous instance is still running. To keep track of the status of the application, I put code in the Finally block that updates the status to stopped and saves it to the database. I noticed that somehow the status of the application was "stopped" even though the application was still running in Task Manager. There is no background worker, so it should exit when the status is updated. It turns out that the subsequent instance updated the status when it exited, because the Finally block is always executed, even on "Exit Sub".

AWS Aurora "Reading from the stream has failed" Error

We had a problem with Aurora throwing this error when it is in sleep (pause) mode. By default, it is set to sleep when idle for 5 minutes. After a few attempts, we managed to extend the timeout, which is the command timeout (not to be confused with the connection timeout), to 60s in our case, to prevent the error from happening. The timeout can be set in the connection string:

Server=server;Database=database;Uid=username;Pwd=password;Default Command Timeout=60

Reference: https://www.connectionstrings.com/mysql-connector-net-mysqlconnection/specifying-default-command-timeout/
Update 12/3/2019: The above somehow didn't work on our ASP.NET application that uses EntityFramework, so we had to specify it in our ApplicationDbContext constructor and increase it to 5 minutes (300s). The following is the VB.NET version:

Public Sub New(existingConnection As Common.DbConnection, contextOwnsConnection As Boolean)
    MyBase.New(existingConnection, contextOwnsConnection)
    Database.CommandTim...

Xamarin "java.exe exited with code 2" Error

Bumped into the error "java.exe exited with code 2" when building a Xamarin app today. The only thing that changed was that I added a Syncfusion NuGet package. I managed to solve it by enabling MultiDex. References: https://forums.xamarin.com/discussion/97803/getting-error-java-exe-exited-with-code-2 https://developer.android.com/studio/build/multidex

Salesforce CPQ Lightning Template Content Font Color

Somehow I got involved in trying to change the font color in Salesforce CPQ template content. It seems easy, but it was not working the way we wanted. We selected HTML as the content type, and it comes with a nice rich text editor. As we change the font color, it looks great on the page. However, when we attach the template content to the quote template and preview the quote with that template, a couple of things occur: the font color is gone and reverts back to black, and the next words, which are supposed to be a different color, are pushed onto a new line. We double-checked the HTML and couldn't find anything wrong with it. So I decided to check which engine generates the quote preview PDF and found out that it uses Apache FOP. The fun thing is that Apache FOP doesn't support HTML natively, but it does support XSL, so

Model Binding Issue ASP.NET Core in Ubuntu using Postman

I spent a fair amount of time troubleshooting my ASP.NET Core web server on Ubuntu. The issue started when I noticed a failed model binding on Ubuntu using the HTTP PUT method, even though it worked locally on my Windows machine. I also found out that it binds properly when I use HTTPS rather than HTTP. Looking into the server, I had configured Nginx as a reverse proxy, which sends a 301 redirect to HTTPS when the request is made over HTTP. I couldn't find any issue with the web application itself or with Nginx, so I decided to check Postman, which I used to generate the request. I found out that, by default, Postman always follows redirects. There is nothing wrong with that, except that the data seemed to be lost during the redirect. Eventually I found out that 301 is meant to be used with GET, and thus any POST/PUT body will be dropped during the redirect. Hence, it was the correct behavior all along.

Method not found System.Net.Http.HttpContentExtensions.ReadAsAsync

Bumped into this error when moving my web app to another server. It had happened to me before, but this time the cause was different. So, two things that worked for me:
1. Install the Microsoft.AspNet.WebApi.Client package. This provides System.Net.Http.Formatting.dll, which actually contains the ReadAsAsync method, and it fixed the issue for me before.
2. I found out that my System.Net.Http was not referenced properly because it depended on the DLL installed on the machine. So, installing the System.Net.Http NuGet package fixed the current issue for me.

Where Are My Environment Variables? Journey to Linux Service

OK, I had a .NET Core web app running on Ubuntu behind Nginx. Everything else was fine, except that I couldn't retrieve the values of the environment variables I put in /etc/environment. After hours of googling, it turns out the systemd service strips out almost everything except a few variables. Two ways to fix this:
1. Put the environment variable in the .service config file:
[Service]
Environment=MY_ENV_VAR=thevalue
2. Include /etc/environment in the service (I don't think this is a good idea, especially for my use case):
[Service]
EnvironmentFile=/etc/environment
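Not mentioned in the excerpt, but worth noting: after editing the unit file, systemd has to be told to pick up the change. A minimal sketch, assuming a hypothetical service name myapp.service:

sudo systemctl daemon-reload          # reload unit files after editing them
sudo systemctl restart myapp.service  # restart so the new environment is applied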

AWS SSM Linux Shell Script Closing Paren Expected Error

I ran my scripts through AWS SSM and received the "closing paren expected" error message. A quick check of my code showed I was missing something in two different situations:
1. I was missing the closing parenthesis \), so adding it solved the issue. My code was like: if [ \( <expr> ]; then <do this>; fi
2. My closing parenthesis was not preceded by a space, so adding a space fixed it. With the space added it looks like: if [ \( <expr> \) ]; then <do this>; fi

Linux Script Conditional Conditions

Like most people, I guess, I come from a Windows background. Just recently, I had projects exploring multiple flavors of Linux and spent a lot of time understanding how conditions work in bash and/or shell scripts on Linux. Then my code didn't work, and that took me on a journey. A long time ago, I tried to do something as simple as the following:

if (<expr1> or <expr2>) and (<expr3> or <expr4>) then <do this> fi

But things got complicated as the expressions involved the which and grep commands. For example, I was checking whether any Python is installed, so my first attempt was:

if [ "`which python`" = "" -a "`which python3`" = "" ]; then echo "no python"; fi

Then I realized that on RedHat, if there is no python installed, it returns a string containing "no python" instead of an empty string, so my code became:

if [ \( "`which python`" = "" -o "`which python | ...
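The excerpt is cut off, but the general pattern the post describes can be sketched like this (a minimal example of my own, not the post's final code; it uses command -v instead of which so the test doesn't trip over RedHat's "no python in ..." output):

# true only when neither python nor python3 is on the PATH
if [ \( -z "$(command -v python)" \) -a \( -z "$(command -v python3)" \) ]; then
    echo "no python"
else
    echo "python found"
fi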

LZMA SDK Compress Decompress

7z offers one of the best, if not the best, file compression available. Best of all, it is open source. The engine behind it is the LZMA compression method. I was integrating the SDK ( https://www.7-zip.org/sdk.html ) into my project but couldn't find a quick start on the compress/decompress process. After searching the internet, I figured out on a surface level how it all works, partially thanks to the question at https://stackoverflow.com/questions/7646328/how-to-use-the-7z-sdk-to-compress-and-decompress-a-file . So, below, I write down my basic understanding.
Compress: Basically, the compressed file will contain 3 things, with the first 2 being metadata:
The first 5 bytes are the compression properties
The next 8 bytes are the file size before compression
The compressed bytes

var encoder = new Encoder();
encoder.WriteCoderProperties(outStream); // Write properties
outStream.Write(BitConverter.GetBytes(inputFileSize), 0, 8); // Write uncompressed file size
encoder.Code(inStream, out...

Read-only File System Error in Linux

I was moving the contents of a CentOS boot drive to a new hard drive. CentOS has an MBR partition with an xfs file system. It worked great and booted fine, but when I tried to do a yum install, it barked that it couldn't do the install because the file system is read-only. After a decent amount of research, I found out that the problem lies in /etc/fstab. Because it is a new hard drive, the UUID is different, and grub2-mkconfig used the new UUID to configure grub.cfg. However, when the system booted, it checked /etc/fstab and found the discrepancy. Once I changed /etc/fstab to reflect the new UUID, the error went away.
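The post doesn't show the exact commands, but the fix boils down to something like the following (a rough sketch; adjust the mount point and editor to your setup):

sudo mount -o remount,rw /   # get a writable root first if it came up read-only
sudo blkid                   # list the current UUIDs of the attached partitions
sudo vi /etc/fstab           # replace the stale UUID entry with the new one, then reboot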

Understanding AWS Security Group

We were playing with AWS Aurora Serverless. After figuring out how to configure and start the database cluster, we were having trouble connecting to it from our EC2 instance. A few hours later, I tried tracing what was going on with Flow Logs on the subnet, and we realized it was a network and/or security issue. Checking all possible connections and security settings, we thought everything had been configured correctly. But only after trial and error did we find out that our understanding of security groups was incorrect. Because AWS security groups are promoted as stateful, we understood it to mean that if we specify an entry in the Inbound tab, we don't need to specify it in Outbound. Sadly, that is not what stateful means. Each entry specifies the allowed origin of a network request, and the response will be automatically allowed. For example, if we allow in Inbound only, a request can come from outside, be allowed to reach the EC2 instance, and its response can go back out, but a request initiated from the instance won't be allowed to go out of ...
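To make the stateful behavior concrete, here is an illustrative AWS CLI call (the group IDs are placeholders, not from the post): allowing inbound MySQL traffic on the database's security group from the application's security group is enough for the responses to flow back; no matching outbound rule is needed on the database side for those responses.

aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 3306 \
    --source-group sg-0fedcba9876543210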

Troubleshooting AWS Instance Profile, Role, and SSM Agent

During an AWS troubleshooting session, I happened to notice that a stale role can remain attached to an EC2 instance. My scenario is as follows: We launched a new EC2 instance and by default no role was attached to it. Then we programmatically created a role, let's call it EC2Role, which we then associated with a new Instance Profile. We then attached the Instance Profile to our new EC2 instance. In our case, the EC2Role gives the SSM Agent permission to run commands. For some reason, we decided to delete the EC2Role and programmatically recreate a new role with the same name, associated with a new Instance Profile. We noticed that when we don't detach the old role (which has the same name as the new one) from the EC2 instance, the old role will still be attached to the instance even though the old role itself has been deleted. Hence, we were confused about why the EC2 instance, which appeared to have the right role attached, would not run commands sent by SSM. The SSM Agent log keeps saying the token is invalid....
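For reference (not from the post), these are roughly the AWS CLI calls involved in checking and swapping the instance profile association; the instance ID, association ID, and profile name below are placeholders:

# see which instance profile is currently associated with the instance
aws ec2 describe-iam-instance-profile-associations \
    --filters Name=instance-id,Values=i-0123456789abcdef0
# detach the stale association, then attach the recreated profile
aws ec2 disassociate-iam-instance-profile --association-id iip-assoc-0123456789abcdef0
aws ec2 associate-iam-instance-profile \
    --instance-id i-0123456789abcdef0 \
    --iam-instance-profile Name=EC2RoleInstanceProfile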

AWS Systems Manager (SSM) Run Command Troubleshooting

I have been working with AWS SSM for a couple of months, but I found the troubleshooting documentation on their website lacking in straightforward answers. So here are the problems I encountered and the solutions, based on my experience.
Problem #1: The instance is not visible in the AWS Systems Manager console, although the documentation says the agent is installed by default.
Problem #2: The instance is visible, but "Run Command" takes too long and even times out.
Solution: The first thing I would check is whether the instance has a role attached to it. If so, make sure the role has the AmazonEC2RoleforSSM policy attached, since that permission is required for the agent to do its health checks. If all of the above has been confirmed, check whether the latest SSM agent has been installed and is running. If the SSM agent is at the latest version and running, check whether it is hibernating. The hibernation logic has exponential backoff, so it might not respond for a long time. If it is hibernating, we can sim...
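A couple of commands I find handy for these checks (not from the post; the service name and log path assume a systemd-based Linux instance):

# from a machine with AWS CLI access: is the instance registered with SSM and online?
aws ssm describe-instance-information
# on the instance itself: is the agent running, and what is it logging?
sudo systemctl status amazon-ssm-agent
sudo tail -n 50 /var/log/amazon/ssm/amazon-ssm-agent.log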