Friday, December 31, 2021

Configure HttpClient to consume a Windows-authenticated NTLM service in ASP.NET Core

Recently I migrated a system from IIS to a Linux-based Docker platform. The app calls another API that is protected by NTLM (Windows) authentication.

When hosted in IIS, that worked by simply setting Credentials on the HttpClient handler.

But when you run a dotnet app on Linux, that Windows authentication convenience is not available.

Below is how I managed to configure HttpClient with NTLM authentication using CredentialCache.


public void ConfigureServices(IServiceCollection services)
{
    services.AddCognitoIdentity();
    services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1);

    // configure a named HttpClient and consume it through IHttpClientFactory
    services.AddHttpClient("MyCompanyApiClient", c =>
    {
        c.BaseAddress = new Uri("https://mycompany/api/");
        c.DefaultRequestHeaders.Add("Accept", "application/json");
    }).ConfigurePrimaryHttpMessageHandler(() =>
    {
        return new HttpClientHandler()
        {
            // this is how you set NTLM auth
            Credentials = new CredentialCache
            {
                { new Uri("https://mycompany/api/"), "NTLM", new NetworkCredential("username", "password") }
            },
            // if there is a proxy
            Proxy = new WebProxy("http://companyproxy:8989")
        };
    });
}
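With the client registered, a consumer resolves it by name through IHttpClientFactory. A minimal sketch of that consumption side (the MyCompanyService class and the "books" endpoint are illustrative, not from the original post):

```csharp
using System.Net.Http;
using System.Threading.Tasks;

public class MyCompanyService
{
    private readonly IHttpClientFactory _httpClientFactory;

    public MyCompanyService(IHttpClientFactory httpClientFactory)
    {
        _httpClientFactory = httpClientFactory;
    }

    public async Task<string> GetBooksAsync()
    {
        // CreateClient returns the named client configured in ConfigureServices,
        // with the NTLM credentials already attached to its primary handler.
        var client = _httpClientFactory.CreateClient("MyCompanyApiClient");
        var response = await client.GetAsync("books");
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```

Because the credentials live on the handler, no per-request authentication code is needed in the service itself.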

Monday, May 24, 2021

How to update only one field using EF Core?

This method is useful when you want to update a record without first loading and tracking the entity.

If the entity is already tracked, EF is smart enough to update only the changed columns.

Below is the code.


var book = new Book() { Id = bookId, Price = price };
_dbContext.Books.Attach(book);
_dbContext.Entry(book).Property(x => x.Price).IsModified = true;
await _dbContext.SaveChangesAsync();
// then detach it so the stub entity does not linger in the change tracker
_dbContext.Entry(book).State = EntityState.Detached;

This generates SQL like the following:

UPDATE dbo.Book SET Price = @Price WHERE Id = @bookId
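On EF Core 7 or later, the same single-column update can be issued without attaching a stub entity at all, using ExecuteUpdateAsync. A sketch, assuming the same Book entity and _dbContext as above:

```csharp
// Translates directly to a single UPDATE statement; no change tracking
// is involved, so there is nothing to detach afterwards.
await _dbContext.Books
    .Where(b => b.Id == bookId)
    .ExecuteUpdateAsync(setters => setters.SetProperty(b => b.Price, price));
```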

Wednesday, January 27, 2021

Fix: OpenShift build pod was killed due to an out-of-memory condition

I was running a large build with the oc CLI in an OpenShift Dedicated cluster. The build has many steps in its Dockerfile and fails somewhere while copying blobs, with the error "The build pod was killed due to an out of memory condition."

I found two solutions for this. 

1. Increase the build pod's resources in the BuildConfig. Note that OOM kills are governed by the memory limit, so raise that too if one is set.
resources:
  requests:
    cpu: "100m"
    memory: "1024Mi"

2. Build locally using docker build and push the image to the image stream.

- Build locally (note the trailing dot for the build context)
docker build -t myapi .

- Tag the build for the internal registry
docker tag myapi default-route-openshift-image-registry.apps.ca-central-1.starter.openshift-online.com/apis/myapi

- Log in to the OpenShift registry
docker login default-route-openshift-image-registry.apps.ca-central-1.starter.openshift-online.com -u $(oc whoami) -p $(oc whoami -t)

- Push the image to the registry
docker push default-route-openshift-image-registry.apps.ca-central-1.starter.openshift-online.com/apis/myapi
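The four commands above all derive from three values: the registry route, the project (here apis), and the image name. A dry-run sketch that composes the exact commands from those values and echoes them instead of executing, so the substitution is visible (the values are the ones from the commands above; substitute your own, and drop the echo to run for real):

```shell
#!/bin/sh
# Assumed values taken from the commands above -- substitute your own
# cluster route, project (namespace), and image name.
REGISTRY=default-route-openshift-image-registry.apps.ca-central-1.starter.openshift-online.com
PROJECT=apis
IMAGE=myapi
TARGET="$REGISTRY/$PROJECT/$IMAGE"

# Echo each step instead of running it, to show the composed commands.
echo "docker build -t $IMAGE ."
echo "docker tag $IMAGE $TARGET"
echo "docker login $REGISTRY -u \$(oc whoami) -p \$(oc whoami -t)"
echo "docker push $TARGET"
```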