We want to develop a simple REST web API for searching customers in Microsoft's Adventure Works DB. Imagine we first prototyped the underlying query in SQL Management Studio:
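The original query isn't reproduced here. As a sketch of what such a prototype against the AdventureWorksLT sample schema might look like (table and column names taken from the sample DB; the exact query in the article may differ):

```sql
-- Hypothetical reconstruction: search customers by a name fragment
-- and count their orders (AdventureWorksLT sample schema).
SELECT c.CustomerID, c.FirstName, c.LastName, c.CompanyName,
       COUNT(soh.SalesOrderID) AS OrderCount
FROM SalesLT.Customer AS c
LEFT JOIN SalesLT.SalesOrderHeader AS soh
       ON soh.CustomerID = c.CustomerID
WHERE c.LastName LIKE '%Lee%'
   OR c.FirstName LIKE '%Lee%'
   OR c.CompanyName LIKE '%Lee%'
GROUP BY c.CustomerID, c.FirstName, c.LastName, c.CompanyName
ORDER BY c.LastName, c.FirstName;
```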
As you can see, the query isn't trivial. For test purposes, I installed the Adventure Works sample DB in the smallest Microsoft Azure SQL Database (Basic pricing tier).
Microsoft Azure is an awesome platform for testing your applications. You can get ready-made VMs with the latest releases of Visual Studio (including the VS14 CTP). Just use your MSDN account and your Visual Studio in the cloud is even correctly licensed. During the workshop, I will do all of my demos based on Azure VMs and Azure SQL Databases.
The web API should be running in a self-hosted command line EXE using Owin/Katana. Customer searches are performed with URLs like http://localhost:12345/api/BasicSearch?customerName=Lee.
If you want to follow along, start by creating a command line EXE with the following NuGet packages:
Dapper (we will need that one later)
Next, create the startup code:
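The startup code isn't included here. A minimal Owin/Katana self-host sketch (assuming the Microsoft.AspNet.WebApi.OwinSelfHost package, which brings in Web API and the Katana host) could look like this:

```csharp
using System;
using System.Web.Http;
using Microsoft.Owin.Hosting;
using Owin;

namespace CustomerSearch
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Route requests like /api/BasicSearch to the matching controller.
            var config = new HttpConfiguration();
            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}");
            app.UseWebApi(config);
        }
    }

    public class Program
    {
        public static void Main(string[] args)
        {
            // Self-host on the port used in the sample URLs.
            using (WebApp.Start<Startup>("http://localhost:12345"))
            {
                Console.WriteLine("Listening on http://localhost:12345 ...");
                Console.ReadLine();
            }
        }
    }
}
```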
With that, we are ready to go. So let's create a very basic implementation:
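The original implementation isn't shown here. A sketch of such a naive first version, which opens a connection and runs the generated SQL on every request (`CustomerSearchSql.GetSql` is a hypothetical name standing in for the T4-generated SQL builder, and the connection string is a placeholder):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Web.Http;

public class BasicSearchController : ApiController
{
    // Placeholder; point this at your Adventure Works DB.
    private const string ConnectionString = "...";

    public IEnumerable<string> Get(string customerName)
    {
        var result = new List<string>();
        using (var conn = new SqlConnection(ConnectionString))
        {
            conn.Open();
            // The SQL text is generated per request from the search term.
            using (var cmd = new SqlCommand(
                CustomerSearchSql.GetSql(customerName), conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    result.Add(reader.GetString(0) + " " + reader.GetString(1));
                }
            }
        }
        return result;
    }
}
```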
Note that the algorithm shown above uses a T4 template to generate the SQL SELECT statement:
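The template itself isn't reproduced here. A minimal sketch of what such a runtime T4 template might look like follows; note that the search text is concatenated literally into the statement, so every distinct search term produces a different SQL text:

```
<#@ template language="C#" #>
<#@ parameter name="CustomerName" type="System.String" #>
SELECT c.CustomerID, c.FirstName, c.LastName
FROM SalesLT.Customer AS c
WHERE c.LastName LIKE '%<#= CustomerName #>%'
```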
Take a second and review our first implementation. Do you see any flaws? Do you have ideas for enhancing the algorithm? During the workshop, I demo the following topics; I encourage you to try them yourself when working through this article.
Create a Visual Studio Web and Load test to generate a standardized usage scenario (the test is in my GitHub repo, too)
Run the load test while profiling CPU in Visual Studio. Do we have a CPU problem?
Collect a SQL statement with Visual Studio IntelliTrace, run it in SQL Management Studio, and analyze it (how long does it take? What does the execution plan look like?)
Check if SQL Server is properly caching execution plans. Here is the query with which you can do that:
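The original query isn't included here. A common way to inspect the plan cache is via the execution-related DMVs, for example:

```sql
-- Inspect cached plans and their reuse counts. After running the load test,
-- many single-use ad-hoc plans for our search statement indicate that
-- SQL Server is compiling a fresh plan for every distinct search term.
SELECT cp.usecounts, cp.cacheobjtype, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%Customer%'
ORDER BY cp.usecounts DESC;
```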
You will probably find out that we do not have a CPU problem at all. The DB query is simply too slow. Additionally, the execution plan is not cached. So change the T4 template and the algorithm so that execution plans can be cached. That solves our performance problem to a certain degree.
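One possible fix (a sketch, not necessarily the article's exact change): emit a parameterized statement and pass the search text as a `SqlParameter`. The SQL text is then identical for every search, so SQL Server can reuse a single cached plan:

```csharp
using System.Data.SqlClient;

// Parameterized variant: the statement text no longer changes per search
// term, so one cached execution plan serves all requests.
var sql = @"SELECT c.CustomerID, c.FirstName, c.LastName
            FROM SalesLT.Customer AS c
            WHERE c.LastName LIKE @pattern";

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.AddWithValue("@pattern", "%" + customerName + "%");
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read()) { /* read result rows */ }
    }
}
```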
In our scenario, we assume that we cannot make the DB faster (in practice, it would take only a few mouse clicks thanks to Microsoft Azure SQL Database's different pricing tiers). So we have to rethink our approach. Let's just cache the result and search for customers in memory. Our first approach uses ADO.NET's DataView mechanism:
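The original code isn't shown here. A sketch of such a DataView-based cache (class and member names are illustrative, and the connection string is a placeholder):

```csharp
using System.Data;
using System.Data.SqlClient;

public static class CustomerCache
{
    // Placeholder; point this at your Adventure Works DB.
    private const string ConnectionString = "...";

    private static readonly DataTable Customers = LoadCustomers();

    private static DataTable LoadCustomers()
    {
        // Load the whole customer table once and keep it in memory.
        var table = new DataTable();
        using (var adapter = new SqlDataAdapter(
            "SELECT CustomerID, FirstName, LastName FROM SalesLT.Customer",
            ConnectionString))
        {
            adapter.Fill(table);
        }
        return table;
    }

    public static DataView Search(string customerName)
    {
        // RowFilter evaluates the search in memory; building and parsing
        // the filter expression on every request has its own cost.
        return new DataView(
            Customers,
            "LastName LIKE '%" + customerName.Replace("'", "''") + "%'",
            "LastName",
            DataViewRowState.CurrentRows);
    }
}
```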
Before you start profiling, ask yourself: do you expect the new approach to be faster? Does it still have flaws?
Seems that DataView is a performance problem, right? So let's replace it with LINQ. Experiment with the different search algorithms in the sample shown above (line 42 and following). During the workshop I use them for the following discussions:
How can you destroy LINQ's performance with poor programming?
Garbage Collector profiling with PerfView
Profiling Windows system calls with PerfView
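The LINQ variants themselves aren't reproduced here. A simplified sketch of an in-memory search over cached rows illustrates the point, including how easily allocations creep in (the data is made up for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Simplified in-memory list standing in for the cached customer data.
var customers = new List<(int Id, string FirstName, string LastName)>
{
    (1, "Alice", "Lee"), (2, "Bob", "Leeson"), (3, "Carol", "Smith")
};

// Allocation-friendly search: no temporary strings per row.
var hits = customers
    .Where(c => c.LastName.IndexOf("Lee",
        StringComparison.OrdinalIgnoreCase) >= 0)
    .ToList();

// An easy way to hurt performance: ToLower() allocates a new string for
// every row on every request, which shows up clearly in GC profiling.
var wasteful = customers
    .Where(c => c.LastName.ToLower().Contains("lee"))
    .ToList();
```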
Seems that caching ADO.NET data isn't very efficient, right? So let's change that to POCOs. Note the use of the light-weight OR mapper Dapper:
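The original code isn't included here. A sketch of the POCO-based cache using Dapper's `Query<T>` to materialize rows directly into objects (class and member names are illustrative, and the connection string is a placeholder):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public class Customer
{
    public int CustomerID { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public static class CustomerRepository
{
    // Placeholder; point this at your Adventure Works DB.
    private const string ConnectionString = "...";

    private static readonly List<Customer> Cache = Load();

    private static List<Customer> Load()
    {
        using (var conn = new SqlConnection(ConnectionString))
        {
            // Dapper maps the result rows straight into Customer POCOs.
            return conn.Query<Customer>(
                "SELECT CustomerID, FirstName, LastName FROM SalesLT.Customer")
                .ToList();
        }
    }

    public static IEnumerable<Customer> Search(string customerName) =>
        Cache.Where(c => c.LastName.Contains(customerName));
}
```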
Wow, the code looks much cleaner now, and it is shorter. But is it faster? Try it out yourself using the load test and a profiler of your choice.