Friday, June 17, 2016

Basic steps for dealing with application slowness due to the database

In this blog I will share my experience of improving web application performance. Application slowness is a common problem in large scale applications. It can happen due to multiple factors, and one of the major ones is bad database design.

Recently I got a chance to work on an application optimization task. The whole activity covered many areas that were reviewed and optimized, but the main performance gains came after the database optimizations were done. Below I am listing some of the major activities that were performed and proved very helpful in finding areas for performance improvement.

Is it really a database problem?

Get web server traces and find out whether there is a performance issue in the application itself. Before concluding that the DB is the only issue, double check by collecting and comparing the web server traces with the SQL Profiler traces. Confirm that most of the time taken by a request is spent in the database and not in the application itself.

Execute and check SQL Profiler traces

Run SQL Profiler and check which queries/stored procedures are taking most of the time. Look at both the duration and the read counts, as both play a role in application slowness.
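If running a full Profiler trace is not convenient, a rough sketch of the same idea is to query the plan cache through sys.dm_exec_query_stats (it only covers plans that are still cached, so treat the numbers as indicative):

-- Top 10 cached statements by total elapsed time, with read counts
SELECT TOP 10
    qs.total_elapsed_time / 1000 AS total_elapsed_ms,
    qs.total_logical_reads       AS total_reads,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY qs.total_elapsed_time DESC;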

Check for proper indexes

From the profiler traces one can easily identify the queries or stored procedures that are taking time. Once those are identified, the first thing to look at is the indexes on the tables used in those queries or stored procedures. Add appropriate indexes on the required columns, as in the sketch below.
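For illustration only (the table and column names here are made up), adding a nonclustered index to support a frequent lookup could look like this:

-- Hypothetical example: index on the column used in the WHERE clause,
-- with the selected columns included to avoid extra lookups
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
ON dbo.Orders (CustomerId)
INCLUDE (OrderDate, TotalAmount);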

Check query execution plans

I was not very familiar with this feature of Microsoft SQL Server before. The Actual Execution Plan of a particular query or SP is very helpful in finding out which specific part is taking the time. The execution plan can also suggest indexes that may speed up the execution. The suggested indexes can be applied on the tables and the impact re-checked by executing the query/SP again.
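In SSMS the actual plan can be enabled with "Include Actual Execution Plan" (Ctrl+M) before running the query. Alongside the plan, time and I/O statistics can also be switched on from the script itself; the query below is just a placeholder:

SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- placeholder query; run your own slow query/SP here instead
SELECT o.OrderId, o.OrderDate
FROM dbo.Orders o
WHERE o.CustomerId = 42;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;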

Use JOINs instead of IN on large data

Apart from indexes, also go through the statements to see whether they can be rewritten more efficiently. For example, if the IN clause has to deal with too much data, a JOIN usually performs better, as sketched below.
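A rough sketch of such a rewrite (table and column names are made up; the two forms return the same rows here because CustomerId is assumed unique in Customers):

-- Filtering with IN over a large subquery result
SELECT o.OrderId
FROM dbo.Orders o
WHERE o.CustomerId IN (SELECT c.CustomerId
                       FROM dbo.Customers c
                       WHERE c.Region = 'EU');

-- Same filter expressed as a JOIN, which often scales better on large data
SELECT o.OrderId
FROM dbo.Orders o
INNER JOIN dbo.Customers c ON c.CustomerId = o.CustomerId
WHERE c.Region = 'EU';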

Check for LOCKS

Talking specifically about MSSQL, it has different pre-defined SPs that can help in identifying locking information. For example, the sp_who2 and sp_lock SPs can be called to get locking details. Statements can get blocked when concurrent update/read hits are made on the same table in the database; one of the statements can be blocked until the other statement releases the table lock.
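For example, the calls below use the procedures mentioned above, plus a DMV-based query that is handy for spotting which session is blocking which:

EXEC sp_who2;  -- active sessions; the BlkBy column shows who is blocking whom
EXEC sp_lock;  -- lock details per session

-- Sessions that are currently blocked, and by whom
SELECT session_id, blocking_session_id, wait_type, wait_time, wait_resource
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;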

Use WITH (NOLOCK) wherever required

Do not use WITH (NOLOCK) everywhere; use it only where it is really required. I am not going to go into the details of using/not using WITH (NOLOCK) here, but I suggest reading some of the helpful articles written on this topic.
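Just to show the syntax (again with made-up names), a reporting-style query where slightly stale or dirty reads are acceptable might use it like this:

-- NOLOCK reads without waiting for locks, at the risk of dirty/inconsistent reads
SELECT COUNT(*) AS PendingOrders
FROM dbo.Orders WITH (NOLOCK)
WHERE Status = 'Pending';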

Check database statistics

Database statistics can be found in the sys.stats system view. The sp_updatestats command can be used to update database statistics. Rebuilding an index also updates its statistics. These activities are not mandatory, but if for some reason your database statistics or indexes go bad, you have to update/rebuild them. Bad database indexes can not only return wrong results but can also make the queries/SPs very slow to execute.
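A quick sketch of these maintenance commands (the index and table names are placeholders):

-- Update out-of-date statistics across the current database
EXEC sp_updatestats;

-- Rebuilding an index also refreshes the statistics associated with it
ALTER INDEX IX_Orders_CustomerId ON dbo.Orders REBUILD;

-- Or refresh statistics for a single table
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;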

rizzz86