- It's recommended to watch the Udemy course to grasp the basic concepts.
- Make sure you have Azure Data Studio with the SQL Server Profiler extension installed. Alternatively, you can use the SQL Server Profiler that comes with an on-premises SQL Server installation; to install it, re-run the SQL Server installer package and make sure the `Management Tools - Complete` option is selected.
- Adventure.Web project is created using the ASP.NET Core Web App (MVC) project template.
- Adventure.Model project is a .NET class library. The DB context is generated using the `Scaffold-DbContext` command out of a sample AdventureWorks Azure DB. For more information, check the scaffolding tutorial.
- Adventure.Logic project represents the aggregation summaries logic.
- Some additional project tuning is included: removal of the hardcoded connection string from the DB context itself, adding connection string logic to `Startup.cs`, and enabling lazy loading (to match EF 6).
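The connection-string tuning described above might look roughly like the following sketch of `Startup.cs` (the connection string name `AdventureWorks` is an assumption; lazy-loading proxies are configured in the context's `OnConfiguring` method, as covered later):

```csharp
// Startup.cs -- registering the scaffolded context with a connection string
// taken from configuration instead of the hardcoded one.
// The connection string name "AdventureWorks" is an assumption.
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllersWithViews();

    services.AddDbContext<AdventureDBContext>(options =>
        options.UseSqlServer(
            Configuration.GetConnectionString("AdventureWorks")));
}
```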
Clone the project.
If not yet configured, install the Git tools. Open a folder for the solution and execute `git clone https://github.com/denysdenysenko/ukad-ef-workshop`.
Update the connection string to point to the AdventureWorks sample DB. It can be your own DB installation or the common installation provided by the host. If time permits, the DB can be created during the workshop.
A sample Azure DB can be created using the following steps:
- Go to Azure Portal and open or create a resource group that will be used for a sample database.
- Click '+ New' and select SQL Database resource type.
- DB Name: Specify any name.
- Server: click 'Create new' and give it a unique name, server admin login, and password, and choose the region closest to your location. Take a note of the admin password.
- Compute + storage: click on 'Configure database' and choose 'Basic' plan. Click 'Apply'.
- Go to 'Networking' tab, change connectivity method to 'Public endpoint', switch 'Add current client IP address' to 'Yes'.
- Go to 'Additional settings' tab. Switch 'Use existing data' to 'Sample'.
- Click 'Review + create'.
- Wait for the deployment to complete, go to the resource, open the 'Connection strings' blade and copy the connection string. Replace the password with the one noted when creating the server.
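The copied connection string then typically goes into `appsettings.json`; a sketch with placeholder values (the key name and all placeholders below are assumptions, not the actual workshop values):

```json
{
  "ConnectionStrings": {
    "AdventureWorks": "Server=tcp:<your-server>.database.windows.net,1433;Initial Catalog=<your-db>;User ID=<admin-login>;Password=<admin-password>;Encrypt=True;"
  }
}
```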
Open solution. Go to 'Adventure.Web' project.
Notice the hardcoded static collection `_bestSalesStub`, consisting of 999 POCOs, in HomeController.cs.
Start the project by pressing F5.
Go to Diagnostic Tools, Memory Usage, and press 'Take Snapshot'. Revisit the memory objects and find the 999 stub items.
- While keeping the project running, go to https://localhost:44382/home/cache?top=500, then take another memory snapshot. Revisit the memory objects and find the 500 cached objects and supporting objects. https://i.imgur.com/YOHtZd2.png.
- Stop the project.
- Go to StatisticsService.cs and insert an `.AsNoTracking()` call in the `CacheProductModelProductDescription` method.
- Start the project and repeat step 1; notice the reduced number of memory objects. https://i.imgur.com/uBgswb7.png.
- Stop the project.
- Remove the `_bestSalesStub` variable declaration at the top of HomeController.cs.
- Replace the `_bestSalesStub` assignment within the `Index` action with the actual statistics service call: `vm.BestSalesPeople = await _statisticsService.GetBestSalesPeopleAsync(10);`.
- Set a breakpoint at the beginning of the `GetBestSalesPeopleAsync` method in the StatisticsService.cs file and start the project again.
- Once you hit the breakpoint, open the Output window, clear its content, press F10, and examine the newly added records. Stop the project.
- Go to the Startup.cs file, find the line with the `AdventureDBContext` configuration, add `.LogTo(_ => Debug.WriteLine(_), LogLevel.Information)` to the configuration method chain, and add the missing `using` statements. Start the project again and repeat the previous step.
- Examine the info message with the actual SQL query performed. Press F10 a few more times while keeping the Output window open to see how lazy loading triggers new SQL statements.
- Remove the breakpoint. Stop debugging.
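The two code changes from the steps above can be sketched as follows (only `.AsNoTracking()` and the `.LogTo(...)` call come from the steps themselves; the DbSet name and the surrounding query shape are assumptions):

```csharp
// Startup.cs -- log every generated SQL statement to the debug Output window.
// Requires: using System.Diagnostics; using Microsoft.Extensions.Logging;
services.AddDbContext<AdventureDBContext>(options =>
    options
        .UseSqlServer(Configuration.GetConnectionString("AdventureWorks"))
        .LogTo(_ => Debug.WriteLine(_), LogLevel.Information));

// StatisticsService.cs -- a read-only query marked as no-tracking, so the
// cached entities are not kept in the change tracker (fewer memory objects).
var cached = await _context.ProductModelProductDescriptions
    .AsNoTracking()
    .Take(top)
    .ToListAsync();
```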
The next section covers Azure Data Studio; see the section after it if you use SQL Server Profiler instead.
- Launch Azure Data Studio. Add a new connection using the connection string details from the #Setup step.
- Right-click server node and choose Launch Profiler or press Alt+P. Choose installation type (on-premise or Azure) and press Start button.
- In the top menu, in the select view dropdown, choose `TSQL_Duration View`, which will filter out system traces.
- Start the project.
- Revisit the Events window and notice the enormous number of SQL statements listed. To see the actual SQL script, click on a Text link at the bottom of the screen.
- Notice that the Profiler extension in Azure Data Studio can be used in many more cases, such as deadlock diagnostics, transaction durations, numbers of physical reads, statistics, etc.
- Clear Events window and keep profiler running.
Alternatively, you can use SQL Server Profiler with on-premises SQL Server installations.
- Launch SQL Server Profiler. Start a new trace and enter the connection details. Please note that elevated DB permissions might be required in order to collect traces.
- In the trace properties, under `Use the template:`, select `TSQL_Duration`. Press Run.
- Start the project.
- Revisit the Trace window and notice the enormous number of SQL statements listed.
- Notice that SQL Server Profiler can be used in many more cases, such as deadlock diagnostics, transaction durations, numbers of physical reads, statistics, etc.
- Clear trace window and keep profiler running.
- Start the project and collect a memory dump. Note the number of memory objects and the heap size.
- Open the AdventureDBContext.cs file, find the `OnConfiguring` method and comment out or remove the `UseLazyLoadingProxies` configuration.
- Open the StatisticsService.cs file and find the `GetBestSalesPeopleAsync` method. Modify the customers selection logic, extending it with additional include statements: `.Include(c => c.SalesOrderHeaders).ThenInclude(h => h.SalesOrderDetails)`. Optionally mark the query as non-tracking.
- Start the project again. Collect a memory dump again and notice the reduced number of objects created as well as the reduced heap size.
- Get back to the Profiler extension in Azure Data Studio (or SQL Server Profiler), notice a single SQL statement added to the traces window.
- Copy the SQL statement and execute it in Azure Data Studio as a new query. Alternatively, you can start SQL Server Management Studio, connect to the same DB, create a new query and execute the copied statement.
- Scroll through the results and notice the excessive customer data duplicated multiple times in order for the Include statements to work correctly. Find the customer `Metropolitan Bicycle Supply` and revisit the duplicated data in the `Customer` table columns.
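The eager-loading change above might look like this (the navigation names come from the step itself; the rest of the query shape is an assumption):

```csharp
// StatisticsService.cs -- eager loading: one SQL statement with JOINs
// replaces the many per-navigation queries issued by lazy-loading proxies.
var customers = await _context.Customers
    .AsNoTracking() // optional: skip change tracking for this read-only query
    .Include(c => c.SalesOrderHeaders)         // JOIN sales order headers
        .ThenInclude(h => h.SalesOrderDetails) // JOIN sales order details
    .ToListAsync();
```

The flip side, visible in the profiler results, is that every customer column is repeated once per joined detail row, which is why the result set contains the duplicated data examined above.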
- Start the application and open the developer tools, Network tab. Hit refresh a few times and see how long it takes to load the document.
- Stop the application, navigate to the HomeController.cs file, find the `Index` action, and replace the `GetBestSalesPeopleAsync` method call with a `GetBestSalesPeopleAggregateAsync` method call.
- Start the project again and collect a memory dump; notice the reduced heap size and overall number of objects created.
- Go to SQL Server Profiler and compare the new SQL statement with the previous one, then execute it in Azure Data Studio (or SQL Server Management Studio) to see how it affects the total amount of data transferred between SQL Server and the app.
- Open Dev Tools, hit refresh a few times and see how the update affected overall document load time.
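The workshop does not spell out the body of `GetBestSalesPeopleAggregateAsync`, but the idea it demonstrates — pushing the aggregation into SQL so only summary rows cross the wire — might be sketched like this (all entity, property, and model names here are hypothetical):

```csharp
// Hypothetical server-side aggregation: EF Core translates the GroupBy/Sum
// into a single GROUP BY query, so only small summary rows are transferred
// instead of whole customer/order/detail graphs.
public async Task<List<BestSalesPersonModel>> GetBestSalesPeopleAggregateAsync(int top)
{
    return await _context.SalesOrderHeaders
        .GroupBy(h => h.Customer.SalesPerson)
        .Select(g => new BestSalesPersonModel
        {
            SalesPerson = g.Key,
            TotalDue = g.Sum(h => h.TotalDue)
        })
        .OrderByDescending(m => m.TotalDue)
        .Take(top)
        .ToListAsync();
}
```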