Thursday, February 16, 2012

A question about index optimization

There are two main tables in my app. To optimize searches via scope conditions, I created many indexes on these two tables.

However, the same two tables are also used by my ETL app. Every day there are many thousands of rows to update or insert, and maintaining that many indexes is not good for large modifications. Any idea how to handle this?

Thanks in advance.

Moving to more appropriate forum...

Generally speaking, you usually don't want more than about 3-4 indexes on a table that is used for an OLTP workload. Rather than guessing, if you have SQL Server 2005, you can run the query below to see which indexes are being used. If you see indexes that have zero or a very low number of reads, then you should consider dropping those indexes.

-- Index Read/Write stats for a single table
DECLARE @dbid int;
SELECT @dbid = DB_ID();

SELECT objectname = OBJECT_NAME(s.object_id), indexname = i.name, i.index_id,
       reads = user_seeks + user_scans + user_lookups, writes = user_updates
FROM sys.dm_db_index_usage_stats AS s, sys.indexes AS i
WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
  AND s.object_id = i.object_id
  AND i.index_id = s.index_id
  AND s.database_id = @dbid
  AND OBJECT_NAME(s.object_id) = 'YourTableName'
ORDER BY OBJECT_NAME(s.object_id), writes DESC, reads DESC;
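
If the results show a nonclustered index with a large number of writes and zero (or nearly zero) reads, dropping it is usually worth considering. A minimal example of the syntax, using placeholder index and table names:

-- Example only: IX_UnusedIndex and dbo.YourTableName are placeholder names.
-- Drop a nonclustered index that the usage stats show is written but never read.
DROP INDEX IX_UnusedIndex ON dbo.YourTableName;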


Thank you very much.

So I can first tune the indexes, and then consider disabling them before the ETL run and rebuilding them afterwards.
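
That is, something along these lines (the index and table names below are just placeholders); this only applies to nonclustered indexes, since disabling a clustered index makes the table inaccessible:

-- Placeholder names: IX_YourIndex, dbo.YourTableName.
-- Disable the nonclustered index before the ETL load so inserts/updates skip index maintenance.
ALTER INDEX IX_YourIndex ON dbo.YourTableName DISABLE;

-- ... run the ETL inserts/updates here ...

-- REBUILD re-enables the disabled index and rebuilds it from the table data.
ALTER INDEX IX_YourIndex ON dbo.YourTableName REBUILD;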
