Cntlm and a corporate web proxy

When working in a corporate context, you are often confronted with a corporate web proxy. This can become very annoying when working with various command-line tools that have trouble with the authentication part of that web proxy.

Luckily, Cntlm can remove that friction: it runs a local proxy without authentication that handles the authentication against the actual corporate proxy for you.


  1. Download and install Cntlm:
    It will install itself under “C:\Program Files (x86)\Cntlm”.
  2. Edit “Cntlm.ini” and fill in your Username, Domain and Proxy. Remove the plain text password property and save the file.
  3. Use “cntlm -H” to generate a new password hash. Copy the PassNTLMv2 hash to the Cntlm.ini file.
  4. Start the Cntlm service using “net start cntlm”
  5. Now you can use your local proxy (without authentication) at http://localhost:3128/
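With the service running, you can point your tools at the local endpoint instead of the corporate proxy. A minimal sketch (the curl check and target URL are just examples):

```shell
rem Route command-line tools through the local Cntlm proxy
set HTTP_PROXY=http://localhost:3128
set HTTPS_PROXY=http://localhost:3128

rem Quick sanity check: fetch headers through the proxy explicitly
curl -x http://localhost:3128 -I https://example.com
```

Most modern CLI tools (git, npm, pip, curl, …) honour the HTTP_PROXY/HTTPS_PROXY environment variables, so setting these two is usually all you need.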


You only need 4 properties in the Cntlm.ini file to get Cntlm running in a secure way:

Username	testuser
Domain		corp-uk
Proxy		<your proxy host>:<port>
PassNTLMv2	<output from cntlm -H>


Getting started with Apache Kafka

Today I started with the excellent Pluralsight course “Getting Started with Apache Kafka”.

The course is focused on using an Ubuntu test server for Kafka and a Java development environment. This blog post is a list of resources that I found helpful while exploring Apache Kafka, with a focus on the Microsoft stack.

Hosting a Kafka environment

Multiple options exist for hosting your test environment. One option is to deploy Apache Kafka in an Azure HDInsight cluster (quickstart tutorial + Azure Friday). This option seemed a bit overkill for me, as I was looking for a “quick start” experience.

Another option is containers. While searching for Docker images, I stumbled upon the Bitnami Kafka Stack.
They offer both Docker and VM images that you can use to run Kafka locally on your machine or to deploy it to the Azure cloud (Bitnami Azure Marketplace).

But this, too, takes some time before you can connect to your Kafka test cluster. So in the end, I went for a Confluent Kafka cluster running in Azure. It took me less than 3 minutes to register a new Confluent account, create a new Kafka cluster and connect to it with Conduktor. This solution is of course not free. Luckily, it is possible to use the Confluent platform for 3 months with a 50 USD spending credit per month.
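For reference, connecting a client to such a Confluent Cloud cluster boils down to a handful of settings. A sketch using the Confluent.Kafka NuGet package; the bootstrap server and API key/secret below are placeholders that you copy from the Confluent portal:

```csharp
using Confluent.Kafka;

// Connection settings for a Confluent Cloud cluster; all values below
// are placeholders -- copy the real ones from the Confluent portal.
var config = new ClientConfig
{
    BootstrapServers = "<your-cluster>.confluent.cloud:9092",
    SecurityProtocol = SecurityProtocol.SaslSsl,
    SaslMechanism    = SaslMechanism.Plain,
    SaslUsername     = "<api-key>",
    SaslPassword     = "<api-secret>"
};
```

The same settings also work from Conduktor or any other Kafka client that speaks SASL/PLAIN over TLS.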

Connecting to Kafka from .NET

When it comes to connecting to Kafka from .NET, there is only one client you should use, and that is “confluent-kafka-dotnet”. All other packages are outdated and point you to that one.

As mentioned in the beginning, the Pluralsight course focuses on the Java client for Kafka. Not all concepts are the same when using the Confluent .NET client, so you should dig into the wiki on the GitHub project. It provides all the info needed to create your first simple producer and consumer application. The blog post ‘Designing the .NET API for Apache Kafka’ also gives some interesting backstory.
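To give an impression, a minimal producer with confluent-kafka-dotnet could look like the sketch below. The topic name and broker address are placeholders, and this is a sketch based on the project wiki rather than an official example; it needs a reachable broker to actually run:

```csharp
using Confluent.Kafka;

var config = new ProducerConfig { BootstrapServers = "localhost:9092" };

// Build a producer that sends string values without a key
using var producer = new ProducerBuilder<Null, string>(config).Build();

var result = await producer.ProduceAsync(
    "test-topic", new Message<Null, string> { Value = "hello kafka" });
Console.WriteLine($"Delivered to {result.TopicPartitionOffset}");
```

The consumer side follows the same builder pattern via ConsumerBuilder; the wiki walks through both.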

What’s next?

After completing the ‘getting started’ course on Pluralsight, I can recommend the following resources to continue learning:

Automate IBM MQ object creation with PCF in .NET using IKVM.NET

TL;DR – There is no decent .NET support for PCF in the IBM client, but we can use IKVM.NET to convert JARs to DLLs so we can still use .NET instead of Java for PCF.


Programmable Command Formats (PCFs) define command and reply messages that can be used to create objects (queues, topics, channels, subscriptions, …) on IBM WebSphere MQ. For my current project, we wanted to build a custom REST API to automate object creation based on our custom needs. The IBM MQ REST API was not a viable alternative at that moment in time.

The problem

The .NET PCF namespaces are not supported/documented by IBM and do not provide the possibility to inquire the existing subscriptions on a queue manager. All other tasks we wanted to automate are possible in .NET. Using Java seemed to be the only alternative if we wanted to build this custom REST API with all features.

Action				PCF Command		Result Info
Create Local/Alias Queue	MQCMD_CREATE_Q		OK

Being able to use the .NET platform was a requirement at that time, because the whole build and deployment pipeline was focused on .NET.


After some searching I stumbled upon IKVM.NET:

“IKVM.NET is a JVM for the Microsoft .NET Framework and Mono. It can both dynamically run Java classes and can be used to convert Java jars into .NET assemblies. It also includes a port of the OpenJDK class libraries to .NET.”

Based on this description it sounded like it could offer a possible solution!

Using IKVM.NET we should be able to convert the IBM JARs to .NET assemblies and use the supported and documented IBM Java Packages from a .NET application.

From JAR to DLL

Now I will briefly explain how we were able to put it all together. Using IKVM.NET is not that easy the first time you use it. The whole process basically consists of 3 steps:

  1. Download (and install) the IBM MQ redistributable client (in order to extract the JAR files)
  2. Convert the JARs to DLLs with IKVM.NET
  3. Copy the DLLs and reference them in your .NET project
    • Both the converted IBM JARs and the IKVM.NET runtime DLLs

Convert JAR to DLL

Download IKVM:
Extract the IKVM files (e.g. to c:\tools\IKVM)
I have the IBM client installed, so the JAR files are in their default installation location (C:\Program Files\IBM\MQ\java\lib).

Open up a Command Prompt:

set path=%path%;c:\tools\IKVM\bin
cd "C:\Program Files\IBM\MQ\java\lib"
ikvmc -target:library -sharedclassloader { } { } { } { }

(Replace each { } with the name of an IBM MQ JAR file you want to convert; -sharedclassloader keeps the converted assemblies able to see each other's classes.)

You will find the output in the source directory of the JAR files:

Add References…

Now that we have our DLLs, we can add them to our .NET project. This was less easy than I thought; I spent a lot of time figuring out which dependencies I needed. In the end, this was my result:

1 = The IBM JARs converted to DLLs
2 = The IKVM.NET runtime DLLs

MqPcfAutomation Sample

To help you get started, I added my sample proof-of-concept solution (.NET 4.6.1) to GitHub in the MqPcfAutomation repository.

The sample is a showcase of the functionality described in the table at the beginning of this post.
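To give an impression of what the converted classes look like from C#, here is a sketch of the subscription inquiry that drove this whole exercise. The class and constant names come from the documented IBM MQ Java PCF API (exposed as .NET namespaces by IKVM); the host, port and channel values are placeholders:

```csharp
using com.ibm.mq.headers.pcf;   // from the converted com.ibm.mq.headers.jar
using com.ibm.mq.constants;     // from the converted com.ibm.mq.jar

// Connect a PCF agent to the queue manager (placeholder connection values)
var agent = new PCFMessageAgent("mq.example.local", 1414, "SYSTEM.DEF.SVRCONN");

// Inquire all subscriptions on the queue manager
var request = new PCFMessage(CMQCFC.MQCMD_INQUIRE_SUBSCRIPTION);
request.addParameter(CMQCFC.MQCACF_SUB_NAME, "*");

PCFMessage[] responses = agent.send(request);
foreach (var response in responses)
{
    Console.WriteLine(response.getStringParameterValue(CMQCFC.MQCACF_SUB_NAME));
}
agent.disconnect();
```

Apart from the lowercase Java-style method names, this is the same code you would write in Java, which is exactly the point of the IKVM conversion.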


In the end, I am happy that I was able to build a solution for the problem. The question is whether this approach is advised…
I don’t think IBM approves of this approach, but it works for what we need. We have been using this solution for more than 6 months now without any issues. In the future we might be able to move to the IBM MQ REST API as more features are added.