How do I make my ASP pages more efficient?

URL: http://www.aspfaq.com/show.asp?id=2424

If possible, put web root, temp, database, system and pagefile on separate partitions. If separate physical drives, even better. If you can use RAID 5 or RAID 0 + 1, by all means, do so. Keep your drives defragmented, and make sure you have a patched system. 
 
Don't build large strings in single variables -- repeated concatenation is very expensive. If you are looping through an array or recordset and building a string, only to dump it to the screen at the end of the loop, use Response.Write within the loop instead. Time it; you might be surprised how inefficient concatenation can be (the cost grows roughly with the square of the string's length, since every concatenation copies everything built so far). If you are looping through and building a string that will need to be used in multiple places later on, consider an array of smaller strings. VBScript's string buffer is only so large. 
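
For example, a minimal sketch (rs and the Name column are placeholders for whatever you are looping over):

do while not rs.EOF
    ' write each row as it is read, rather than appending to one giant string
    Response.Write Server.HTMLEncode(rs("Name")) & "<br>" & vbCrLf
    rs.movenext
loop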
 
Avoid storing too much data in session variables, and also make sure your session.timeout is reasonable. Session state can use up significant amounts of memory on the server, and session variables hang out long after the user closes the browser. In my testing, scenarios that used large numbers of session variables performed better when they were replaced with a single session variable (e.g. an IDENTITY value or a GUID) used to store / retrieve "session" data in a database. See our cookieless shopping cart for somewhat of an example. And of course, just about every object you will use in ASP should not be stored in session or application scope (see Article #2053). 
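
A rough sketch of that approach (CartID, dbo.CartItems and GetNewCartID are hypothetical names):

' keep one small key in the session; the bulky "session" data lives in the database
if isEmpty(Session("CartID")) then
    Session("CartID") = GetNewCartID()   ' e.g. an IDENTITY value from the database
end if

set rs = conn.execute("SELECT ProductID, Qty FROM dbo.CartItems " & _
    "WHERE CartID = " & clng(Session("CartID")))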
 
Try not to do too much work, especially interaction with a database or remote server(s), in a single script. Consider optimizing the code and perhaps spreading the work over multiple pages. I often see errors like this happening: 
 
<some error> 
/file.asp, line 1294
 
This seems to be *way* too many lines of ASP code to (a) manage and (b) expect to run efficiently. I've worked on some pretty big ASP applications and I don't recall ever having an ASP script more than 250 lines long. 
 
Try to avoid nested loops of any kind. 
 
Use Response.Buffer = true. 
 
Use Option Explicit. Yes, this can be a pain in the butt and slow down development slightly, but it forces you to use locally declared variables. As Eric Lippert [MS] explains in this Google post, using declared variables is more efficient. (As an aside, Option Explicit helps prevent seemingly simple typographical errors, which always turn out to be a big production because you can't figure out why myNumber is blank, when in actuality you spelled the initial declaration myNunber.) 
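
For example:

Option Explicit      ' must appear before any other statement in the page

Dim myNumber
myNumber = 5
' a misspelling such as myNunber would now raise "Variable is undefined"
' instead of silently evaluating to Empty
Response.Write myNumber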
 
Compare apples to apples! Instead of making the ASP engine implicitly cast values for you, force it. Also, instead of this: 
 
if clng(rs("whatever")) = 1 or clng(rs("whatever")) = 6 then
 
Store this variable locally... both to avoid multiple cast operations, and to avoid looking up the value from the rs object multiple times. So it would become: 
 
whatever = clng(rs("whatever")) 
if whatever = 1 or whatever = 6 then
 
Use readall() instead of while not AtEndOfStream / readLine. I've seen this kind of logic far too often: 
 
do while not fs.AtEndOfStream 
    localLine = fs.readLine() 
    wholeString = wholeString & vbCrLf & localLine 
loop
 
The following is much more efficient. You get the whole string in one call, and this allows you to close the handle on the file immediately, and do the processing afterward: 
 
wholeString = fs.readAll()
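
A fuller sketch, assuming the file is opened with the FileSystemObject (data.txt is a placeholder):

set fso = Server.CreateObject("Scripting.FileSystemObject")
set fs = fso.OpenTextFile(Server.MapPath("data.txt"), 1)   ' 1 = ForReading
wholeString = fs.readAll()
fs.close : set fs = nothing    ' release the file handle before processing
set fso = nothing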
 
Use server-side validation, as opposed to generating large client-side strings / arrays, to prevent duplicates and other problems in forms. You may recall that in SAMS' Active Server Pages 2.0 Unleashed, I wrote a section about using data from the server in client-side validation, to prevent duplicates and such WITHOUT requiring a round-trip. But this is only useful with smallish data sets. The client side can only handle so much JavaScript before it croaks. The Mac will crash and burn far earlier than the PC. I haven't seen this cause problems per se on the PC, only sluggishness. 
 
There are some other general recommendations at the following URLs: 
 
    25+ ASP Tips to Improve Performance and Style 
 
    Improving ASP Application Performance 
 
    ASP Guidelines

Database-specific

  • Make sure you're using MDAC 2.7 or better (see KB #300420 for one of the problems with 2.6) and the most recent SQL Server service pack. For information on keeping your server secure and healthy, see Article #2151.
     
  • I am currently using many of the suggestions described in the SQL Server 2000 Operations Guide. If you're using SQL Server 2000, please take a look at this excellent article. 
     
  • Use a DSN-less connection string (see Article #2126), and enforce TCP/IP (avoiding name resolution) by using Network=DBMSSOCN in the connection string (see Article #2082). Use ADO and OLEDB, not DAO or ODBC. Use the EXACT same connection string throughout your application, in order to make the best possible use of connection pooling. If you absolutely must use a DSN (e.g. because of a hard-coded application or COM object), be sure to use a System DSN. 
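
     A sketch of such a connection string (the server, database and credentials are placeholders):

     connString = "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDB;" & _
         "User ID=appUser;Password=secret;Network=DBMSSOCN"
     set conn = Server.CreateObject("ADODB.Connection")
     conn.open connString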
     
  • Open your connection just before needing it, and close it as soon as you're done with it. Your motto should always be "get in, get/save data, get out." Always close your ADO objects and set them to nothing. 
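
     For example (connString as above; dbo.Students is a hypothetical table):

     set conn = Server.CreateObject("ADODB.Connection")
     conn.open connString                   ' open just before you need it
     set rs = conn.execute("SELECT Name FROM dbo.Students WHERE StudentID = 5")
     if not rs.EOF then studentName = rs(0)
     rs.close : set rs = nothing            ' get in, get the data, get out
     conn.close : set conn = nothing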
     
  • Avoid adovbs.inc (see Article #2112). If you absolutely must use named constants, define the subset you need, or use the reference to the MDAC type library in global.asa. 
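
     For example, one line in global.asa exposes the ADO constants to every page without adovbs.inc (the msado15.dll path below is the typical MDAC location; adjust it for your server):

     <!--METADATA TYPE="typelib" FILE="C:\Program Files\Common Files\System\ado\msado15.dll"-->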
     
  • Avoid storing recordsets or connections in session or application variables (see Article #2053). 
     
  • Consider using GetRows() or GetString() instead of looping through a recordset (see Article #2467 for more information). 
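
     For example, GetRows() returns a two-dimensional array indexed (field, row), so you can close the recordset immediately and loop over the array instead:

     rows = rs.getRows()
     rs.close : set rs = nothing
     for i = 0 to uBound(rows, 2)
         Response.Write Server.HTMLEncode(rows(0, i)) & "<br>"
     next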
     
  • If your SQL queries take too long to run, consider moving the queries themselves into stored procedures, instead of using ad hoc queries (see Article #2201). Use indexes on any column(s) used in the WHERE or ORDER BY clauses (see Article #2231). Increasing timeouts is not a solution! It's more like painting your car bright orange to draw attention away from a door ding. 
     
  • Use SET NOCOUNT ON in all procedures: 
     
    CREATE PROCEDURE foo 
    AS 
    BEGIN 
        SET NOCOUNT ON 
        -- ... 
    END
     
     This prevents "1 row(s) affected." messages from being sent back over the wire, prevents SQL Server from working to obtain those counts, and prevents ADO from tripping over resultsets that aren't really recordsets. If you need to know the number of rows affected while debugging, add a PRINT @@ROWCOUNT on the line following each statement, or store the counts in a table and SELECT from it *after* all your other SELECT statements; that way you can view the results in Query Analyzer without interfering with your ASP code. 
     
  • Do not name stored procedures with the sp_ prefix. There is a performance hit with this prefix, as the database will first search master for a procedure with that name. sp_ is a nice and convenient prefix for a stored procedure, but you will do your application, database and users a huge favor by coming up with some other naming convention (if necessary) or just using functional prefixes (like get, set, etc). 
     
  • Always create and reference tables, views and other objects using the dbo. prefix (unless you are using specific owner-inheritance for some reason). There is a performance hit when the database has to search first for the object with that name owned by the calling user, and then for the object with that name owned by dbo. If you use the owner prefix explicitly, the database doesn't have to guess or grope around, and it goes directly to the correct object. 
     
  • Make sure your network connection between web server and database server is not the bottleneck. You can test this by sitting at (or terminal serving into) the database machine directly, and executing the query from its own Query Analyzer, and comparing the times with your ASP results (see Article #2092 and Article #2245 for some timing ideas). 
     
  • Limit resultsets when possible. For example, if you are returning a student directory, make the user restrict the results to those whose last name starts with G or M. This will reduce the work you are forcing your database and web server(s) to perform, and also make the interface more usable (it's very rare that the end user needs to see the entire table in one shot). You could also consider paging (see Article #2120) - breaking the recordset into chunks across multiple pages. 
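
     A sketch of such a restricted query (the table and column names are hypothetical):

     SELECT TOP 50 StudentID, LastName, FirstName
         FROM dbo.Students
         WHERE LastName LIKE 'G%'
         ORDER BY LastName, FirstName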
     
  • Do not nest recordsets in ASP! The database is faster at grouping/processing rows. If you think you need nested recordsets, consider a JOIN. If you are having problems with an appropriate JOIN strategy, please post your table structure(s), sample data and desired results, and others will help you get a correct query working. 
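
     For example, rather than opening an orders recordset inside a loop over a customers recordset, let the database do the grouping in one pass (table names are hypothetical):

     SELECT c.CustomerID, c.CustomerName, o.OrderID, o.OrderDate
         FROM dbo.Customers c
         INNER JOIN dbo.Orders o ON o.CustomerID = c.CustomerID
         ORDER BY c.CustomerID, o.OrderDate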
     
  • Do not use rs.MoveFirst on a default, forward-only recordset -- ADO has to re-execute the entire query just to get back to the first row. Set up SQL Profiler when doing this, and you will see the query being run again. 
     
  • Avoid extra ADODB objects unless necessary (see Article #2191). If you are using ADODB.Recordset or ADODB.Command to execute stored procedures or INSERT/UPDATE/DELETE rows in a table, consider using the connection object by itself. 
     
  • Use the adExecuteNoRecords and adCmdText constants for INSERT, UPDATE and DELETE queries: 
     
    conn.execute sql, , 129
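
     If you would rather not rely on the magic number (129 = adCmdText (1) + adExecuteNoRecords (128)), declare the two constants yourself:

     Const adCmdText = 1
     Const adExecuteNoRecords = 128
     conn.execute sql, , adCmdText + adExecuteNoRecords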
     
  • When designing your tables, use the narrowest columns possible. If you have a numeric value that will be from 1-10, use a tinyint, not a smallint or an int. If it can only be 0 or 1, use a BIT. If you have a column for a Social security number, use INT or CHAR(9) (or CHAR(11), if it has to be string formatted for some reason), as opposed to a BIGINT or VARCHAR(50). Choose character-based datatypes appropriately (see Article #2354). If second- and millisecond-accuracy are not important, use a smalldatetime instead of a datetime. 
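
     A hypothetical table illustrating these choices:

     CREATE TABLE dbo.Students
     (
         StudentID   INT IDENTITY(1,1) PRIMARY KEY,
         SSN         CHAR(9)       NOT NULL,   -- not VARCHAR(50)
         Rating      TINYINT       NOT NULL,   -- a 1-10 value fits in a TINYINT
         IsActive    BIT           NOT NULL,
         EnrolledOn  SMALLDATETIME NOT NULL    -- minute accuracy is enough
     )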
     
  • Make sure you index your tables, and choose your indexes wisely. It never hurts to throw a batch of hypothetical work at the Index Tuning Wizard and have it report to you what it thinks the best candidates for indexes would be. You don't have to follow all of its suggestions, but it may reveal things about your structure or data that will help you choose more appropriate indexes. 
     
  • In procedure code, avoid cursors if possible - most cursor solutions have a more efficient, set-based alternative. If you can't find a set-based solution, post your table structure, sample data, and desired results to microsoft.public.sqlserver.programming and someone will help you. If you still need to use a cursor, or if the cursor solution is faster (rare, but possible), make sure you close and deallocate the cursor when you are done with it. 
     
  • Only get the data you need - avoid unnecessary columns, frivolous JOINs, and the all-too-common SELECT * (see Article #2096). 
     
  • Avoid DISTINCT unless it is absolutely necessary. You can use GROUP BY, self-JOINs or sub-queries in many cases. 
     
  • Avoid NULLs unless they are necessary (see Article #2073). 
     
  • Test the use of derived tables in place of temp tables / table variables. 
     
  • Use local temp tables in place of global temp tables. Make sure you drop all #temp tables at the end of your procedures (otherwise you will find tempdb continuously growing). Use @table variables in SQL Server 2000, where possible (see the limitations in Article #2475). 
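
     For example, a table variable needs no explicit DROP; it simply goes out of scope when the batch ends (dbo.Orders is a hypothetical table):

     DECLARE @recent TABLE (OrderID INT, OrderDate SMALLDATETIME)

     INSERT @recent (OrderID, OrderDate)
         SELECT OrderID, OrderDate
         FROM dbo.Orders
         WHERE OrderDate >= '20040101'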
     
  • Choose constraints over triggers when given the choice. Assuming you have decent error-handling in place, this is a more efficient way to regulate your data and prevent invalid data from entering into your database. 
     
  • Test the performance differences of IN vs. EXISTS, ISNULL vs. COALESCE, each of which can provide the exact same results, but often the optimizer can choose a better plan in one case or the other. The performance can differ based on the datatype(s), size of table, number of relevant rows, and other factors. Using Query Analyzer, turn on Show Execution Plan and run two similar queries... you will see how much work each query took, as a percentage of the entire batch... and this can often indicate the better performer. 
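
     For example, these two queries return the same customers; run them together with Show Execution Plan turned on and compare the relative cost of each (table names are hypothetical):

     SELECT c.CustomerID FROM dbo.Customers c
         WHERE c.CustomerID IN (SELECT o.CustomerID FROM dbo.Orders o)

     SELECT c.CustomerID FROM dbo.Customers c
         WHERE EXISTS (SELECT 1 FROM dbo.Orders o WHERE o.CustomerID = c.CustomerID)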
     
  • Try to avoid non-sargable conditions in the WHERE clause, such as "IS NULL", "IS NOT NULL", "OR", "<>", "!=", "!>", "!<", "NOT", "NOT EXISTS", "NOT IN", "NOT LIKE", and "LIKE" with a leading wildcard (e.g. LIKE '%term')... any non-sargable argument will not use an index and will almost always result in a table scan (trust me, you don't like table scans). 
     
  • Make sure you are either committing or rolling back all T-SQL transactions. If a transaction is not fully committed or rolled back, you can exceed the concurrent query limit, and can cause blocking for all other SPIDs. Check @@TRANCOUNT within the SPID's connection, if possible, and issue a ROLLBACK if it is not equal to 0. 
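
     A defensive check you can place at the end of a batch or in an error path:

     IF @@TRANCOUNT > 0
         ROLLBACK TRANSACTION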
     
  • If you find that simple SELECT queries are blocking (you can determine this by issuing the following in Query Analyzer, and looking at the BlkBy column): 
     
    EXEC sp_who2 'active'
     
     And that these SELECT queries are able to perform a 'dirty read' (i.e. it's okay that the query does not wait for rows that are currently being added or modified), then you might consider adding the NOLOCK hint: 
     
    SELECT columns FROM table WITH (NOLOCK)
     
     To set this for the entire connection, instead of adding WITH (NOLOCK) to every single table reference, you can use the READ UNCOMMITTED isolation level, e.g.: 
     
     SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED 
    -- ...code...
     
  • If you find that SQL Server is running away with RAM, you can temporarily cap SQL Server's memory consumption: right-click the server name in Enterprise Manager, click Properties, and use the settings on the Memory tab to configure a fixed or maximum memory size. 
     
     
  • Similarly, you can place a throttle on processor usage (e.g. limiting SQL Server to one CPU) on none other than the Processor tab. Change these default settings only temporarily, until you can find the underlying problem; and even then, only attempt this approach if you understand the implications. 
     
  • For information on performing an 'audit' on your SQL Server's performance, see this article. Also, there are two KB articles for troubleshooting performance issues: KB #224587 and KB #243588.
     
  • There are several links at Microsoft's SQL Server site related to performance tuning. 
     
  • DatabaseJournal.com has an article outlining the areas where you can fine-tune SQL Server to yield the best performance. 
     
  • There are also some reference materials and resources listed in Article #2423.
     
  • For other tips on maximizing your data-driven ASP pages, see KB #176056 and KB #258939.
     
  • For tips on optimizing Access performance, see MSDN's Ways to optimize query performance.