Thursday, August 30, 2012

How to Improve the Performance of Database Operations in an Application

 Hello Guys.


Today I am going to share some tips I have picked up for improving the performance of a web application or portal with respect to database operations. We all know that no dynamic website, application, or portal can function without a database, so it is very important for us to work with the database effectively. The following are points we need to consider while designing and developing our application so that its performance stays good from the start, and we don't have to worry after development about how to optimize an application that performs poorly compared to others.




  • Return Multiple Resultsets


If the database code has request paths that go to the database more than once, then these round-trips decrease the number of requests per second your application can serve.


Solution


Return multiple resultsets in a single database request, so that you can cut the total time spent communicating with the database. You'll be making your system more scalable, too, as you'll cut down on the work the database server is doing managing requests.
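For example, with ADO.NET you can batch the SELECT statements into a single command and walk both resultsets with SqlDataReader.NextResult(). This is only a minimal sketch: the connection string, the Customers and Orders tables, and their columns are placeholders, not something taken from this article.

using System;
using System.Data.SqlClient;

class MultipleResultsetsDemo
{
    // Batches two SELECT statements into one round trip and walks both
    // resultsets with NextResult(). Connection string, tables and columns
    // are placeholders.
    public static void PrintCustomersAndOrders(string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "SELECT CustomerID, CompanyName FROM Customers; " +
            "SELECT OrderID, OrderDate FROM Orders;", connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())                       // first resultset: customers
                    Console.WriteLine("{0} - {1}", reader.GetString(0), reader.GetString(1));

                reader.NextResult();                        // second resultset, same round trip
                while (reader.Read())
                    Console.WriteLine("Order {0} on {1}", reader.GetInt32(0), reader.GetDateTime(1));
            }
        }
    }
}

The key point is that both SELECTs travel to the server in one request, and NextResult() simply moves the reader to the next resultset that is already on its way back.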




  • Use Connection Pooling


Connection pooling is a useful way to reuse connections across multiple requests, rather than paying the overhead of opening and closing a connection for each request. Pooling is handled implicitly by ADO.NET, and you get one pool per unique connection string. Make sure you call Close or Dispose on a connection as soon as possible. When pooling is enabled, calling Close or Dispose returns the connection to the pool instead of closing the underlying database connection.
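Here is a small sketch of the pattern. The connection string and the Customers table are placeholders, and the using blocks guarantee that Dispose is called even if an exception is thrown, so the connection goes back to the pool quickly.

using System.Data.SqlClient;

class ConnectionPoolingDemo
{
    // One pool is created per unique connection string, so keep the string
    // identical everywhere (this one is a placeholder).
    const string ConnectionString =
        "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";

    public static int GetCustomerCount()
    {
        // Open as late as possible and dispose as early as possible; with
        // pooling enabled, Close/Dispose returns the connection to the pool
        // instead of closing the physical connection.
        using (SqlConnection connection = new SqlConnection(ConnectionString))
        using (SqlCommand command = new SqlCommand("SELECT COUNT(*) FROM Customers", connection))
        {
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }
}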


Account for the following issues when pooling is a part of your design:


Share connections


Avoid per-user logons to the database


Do not vary connection strings


Do not cache connections




  • Use SqlDataReader Instead of DataSet Wherever Possible


If you are reading a table sequentially, use a DataReader rather than a DataSet. The DataReader object creates a read-only, forward-only stream of data, which improves your application's performance because only one row is in memory at a time.
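A minimal sketch of reading with a SqlDataReader; the Products table, its columns, and the connection string are placeholders.

using System;
using System.Data.SqlClient;

class DataReaderDemo
{
    // Streams rows one at a time in a forward-only, read-only fashion instead
    // of loading the whole table into a DataSet.
    public static void PrintProducts(string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "SELECT ProductID, ProductName FROM Products", connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())   // only the current row is held in memory
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
            }
        }
    }
}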




  • Keep Your Datasets Lean


Remember that the dataset stores all of its data in memory, and that the more data you request, the longer it will take to transmit across the wire.


Therefore, only put the records you need into the dataset.
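One way to keep the DataSet lean is to let the SELECT statement do the trimming: name only the columns you need and limit the rows before they ever reach the adapter. In this sketch the Orders table, its columns, and the TOP 50 limit are illustrative assumptions.

using System.Data;
using System.Data.SqlClient;

class LeanDataSetDemo
{
    // Fills the DataSet with only the columns and rows the page actually needs.
    public static DataSet GetRecentOrders(string connectionString, int customerId)
    {
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT TOP 50 OrderID, OrderDate FROM Orders " +
            "WHERE CustomerID = @CustomerID ORDER BY OrderDate DESC",
            connectionString))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@CustomerID", customerId);

            DataSet orders = new DataSet();
            adapter.Fill(orders, "Orders");   // only the lean result ends up in memory
            return orders;
        }
    }
}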




  • Avoid Inefficient Queries


How it affects performance


Queries that process and then return more columns or rows than necessary waste processing cycles that could best be used for servicing other requests.




  • Causes of Inefficient Queries


Too much data in your results is usually the result of inefficient queries.


The SELECT * query often causes this problem. You do not usually need to return all the columns in a row. Also, analyze the WHERE clause in your queries to ensure that you are not returning too many rows. Make the WHERE clause as specific as possible so that the fewest possible rows are returned.


Queries that do not take advantage of indexes may also cause poor performance.




  • Unnecessary round trips


How it affects performance


Round trips significantly affect performance. They are subject to network latency and to downstream server latency. Many data-driven Web sites heavily access the database for every user request. While connection pooling helps, the increased network traffic and processing load on the database server can adversely affect performance.


Solution


Keep round trips to an absolute minimum.




  • Too many open connections


Connections are an expensive and scarce resource, which should be shared between callers by using connection pooling. Opening a connection for each caller limits scalability.


Solution


To ensure the efficient use of connection pooling, avoid keeping connections open and avoid varying connection strings.




  • Use Transactions Appropriately


How it affects performance


If you select the wrong type of transaction management, you may add latency to each operation. Additionally, if you keep transactions active for long periods of time, the active transactions may cause resource pressure.


Solution


Transactions are necessary to ensure the integrity of your data, but you need to ensure that you use the appropriate type of transaction for the shortest duration possible and only where necessary.
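As an illustration, the sketch below keeps a SqlTransaction open only around two related writes and commits (or rolls back) immediately. The Stock table, its columns, and the parameter names are assumptions made up for the example.

using System.Data;
using System.Data.SqlClient;

class TransactionDemo
{
    // Start the transaction only after all input is ready, do the related
    // writes, and commit right away so locks are held as briefly as possible.
    public static void TransferStock(string connectionString, int fromId, int toId, int qty)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlTransaction transaction = connection.BeginTransaction(IsolationLevel.ReadCommitted))
            {
                try
                {
                    using (SqlCommand command = new SqlCommand(
                        "UPDATE Stock SET Quantity = Quantity - @Qty WHERE WarehouseID = @From; " +
                        "UPDATE Stock SET Quantity = Quantity + @Qty WHERE WarehouseID = @To;",
                        connection, transaction))
                    {
                        command.Parameters.AddWithValue("@Qty", qty);
                        command.Parameters.AddWithValue("@From", fromId);
                        command.Parameters.AddWithValue("@To", toId);
                        command.ExecuteNonQuery();
                    }
                    transaction.Commit();
                }
                catch
                {
                    transaction.Rollback();   // release locks immediately on failure
                    throw;
                }
            }
        }
    }
}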




  • Avoid Over-Normalized Tables


Over-normalized tables may require excessive joins even for simple operations. These additional joins can significantly affect the performance and scalability of your application, especially as the number of users and requests increases.




  • Reduce Serialization


DataSet serialization is implemented more efficiently in .NET Framework version 1.1 than in version 1.0, but it often still introduces performance bottlenecks.


You can reduce the performance impact in a number of ways:


Use column name aliasing


Avoid serializing multiple versions of the same data


Reduce the number of DataTable objects that are serialized




  • Do Not Use CommandBuilder at Run Time


How it affects performance


CommandBuilder objects such as SqlCommandBuilder and OleDbCommandBuilder are useful when you are designing and prototyping your application. However, you should not use them in production applications. The processing required to generate the commands affects performance.


Solution


Manually create stored procedures for your commands, or use the Visual Studio® .NET design-time wizard and customize them later if necessary.
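For example, instead of attaching a SqlCommandBuilder to the adapter, you can wire the adapter to explicit commands yourself. In this sketch, Customer_Select and Customer_Update are assumed stored procedures; the names, parameters, and column mappings are placeholders.

using System.Data;
using System.Data.SqlClient;

class ManualCommandsDemo
{
    // Explicit commands replace run-time command generation, so no extra
    // processing (or schema lookup) is needed when the adapter updates.
    public static SqlDataAdapter CreateCustomerAdapter(SqlConnection connection)
    {
        SqlDataAdapter adapter = new SqlDataAdapter();

        adapter.SelectCommand = new SqlCommand("Customer_Select", connection);
        adapter.SelectCommand.CommandType = CommandType.StoredProcedure;

        SqlCommand update = new SqlCommand("Customer_Update", connection);
        update.CommandType = CommandType.StoredProcedure;
        update.Parameters.Add("@CustomerID", SqlDbType.Int).SourceColumn = "CustomerID";
        update.Parameters.Add("@CompanyName", SqlDbType.NVarChar, 40).SourceColumn = "CompanyName";
        adapter.UpdateCommand = update;

        return adapter;
    }
}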




  • Use Stored Procedures Whenever Possible


Stored procedures are highly optimized tools that result in excellent performance when used effectively.


Set up stored procedures to handle inserts, updates, and deletes with the data adapter.


Stored procedures do not have to be parsed and compiled on every execution, and only the procedure name and its parameters are transmitted from the client, which cuts down on both network traffic and server overhead.


Be sure to use CommandType.StoredProcedure instead of CommandType.Text when executing them.
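A minimal sketch of calling a stored procedure through ADO.NET; GetOrdersByCustomer and its parameter are assumed names, not real objects from this article.

using System.Data;
using System.Data.SqlClient;

class StoredProcedureDemo
{
    // Only the procedure name and parameter values travel to the server,
    // instead of a full ad hoc SQL string. The connection is expected to be open.
    public static SqlDataReader GetOrders(SqlConnection connection, int customerId)
    {
        SqlCommand command = new SqlCommand("GetOrdersByCustomer", connection);
        command.CommandType = CommandType.StoredProcedure;   // not CommandType.Text
        command.Parameters.AddWithValue("@CustomerID", customerId);
        return command.ExecuteReader();
    }
}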




  • Avoid Auto-Generated Commands


When using a data adapter, avoid auto-generated commands. These require additional trips to the server to retrieve metadata and give you less control over how the commands interact with the database. While auto-generated commands are convenient, it is worth the effort to write the commands yourself in performance-critical applications.




  • Use Sequential Access as Often as Possible


With a data reader, use CommandBehavior.SequentialAccess. This is essential for dealing with BLOB data types, since it allows data to be read off the wire in small chunks. While you can only work with one piece of the data at a time, the latency of loading a large data type disappears. If you don't need to work with the whole object at once, using SequentialAccess will give you much better performance.
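For example, the sketch below streams a BLOB column to a file in 8 KB chunks using CommandBehavior.SequentialAccess and SqlDataReader.GetBytes. The Documents table and its Content column are placeholders.

using System.Data;
using System.Data.SqlClient;
using System.IO;

class SequentialAccessDemo
{
    // Reads the BLOB in small chunks instead of buffering the whole value,
    // so memory use stays flat regardless of the document's size.
    public static void SaveDocument(SqlConnection connection, int documentId, string path)
    {
        using (SqlCommand command = new SqlCommand(
            "SELECT Content FROM Documents WHERE DocumentID = @Id", connection))
        {
            command.Parameters.AddWithValue("@Id", documentId);

            using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
            using (FileStream output = File.Create(path))
            {
                if (reader.Read())
                {
                    byte[] buffer = new byte[8192];
                    long offset = 0;
                    long bytesRead;
                    while ((bytesRead = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                    {
                        output.Write(buffer, 0, (int)bytesRead);
                        offset += bytesRead;
                    }
                }
            }
        }
    }
}

The 8 KB buffer size here is just a reasonable default; the important part is that GetBytes pulls the column content in pieces as it arrives, rather than forcing the whole value into memory first.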