Hi, everyone
I have a big problem in SQL Server 2000: I am working with a very, very big table.
This is my story:
-
I have a very big table (36,000,000 rows, and every day we add 400,000 new rows), and I have a Classic ASP page that runs a stored procedure in the database that joins that table with other small tables (about 100,000 rows).
On this page I display some summary information returned by that stored procedure.
The problem is that any query run through this stored procedure takes from 1 to 3 hours.
How can I optimize this table to make queries run faster?
We can certainly help if we know the schema of the tables involved and the join conditions. Please post the DDL and the script so we can help you.
|||Hi,
To optimize this table try the following:
-Reduce the number of columns
Try to reduce the number of columns in the table. The fewer columns a table has, the less space it uses: more rows fit on a single data page, and less I/O overhead is required to access the table's data.
-Avoid unnecessary indexes
Avoid creating unnecessary indexes on the table in the hope that they will improve performance, and only join the tables you actually need.
-TEXT datatype
Unless you are storing really large values, avoid the TEXT datatype: it is inflexible to query, slow, and wastes a lot of space when used incorrectly. A VARCHAR will often handle your data better.
-Avoid NULLable columns
When possible, normalize your table and separate out the nullable columns. Each NULLable column consumes an extra byte per row and adds overhead when querying the data.
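As a rough illustration of the tips above (all table and column names here are made up, not from the original poster's schema): keep the table narrow, prefer VARCHAR over TEXT for moderately sized strings, and declare NOT NULL wherever the data allows it.

```sql
-- Illustrative DDL only; names are assumptions.
CREATE TABLE dbo.SaleNarrow (
    SaleID   INT          NOT NULL PRIMARY KEY,
    SaleDate DATETIME     NOT NULL,
    Amount   MONEY        NOT NULL,
    Note     VARCHAR(500) NOT NULL DEFAULT ''  -- instead of TEXT and NULL
);
```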
Regards,
Tarek Ghazali
SQL Server MVP
http://www.sqlmvp.com
|||
The number 1 way to improve performance where you only need summary information out of a query is to maintain summaries.
Also, if you are only gathering summary information from recent times you can partition the table.
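To make the partitioning idea concrete: SQL Server 2000 has no native table partitioning, so the usual approach is a partitioned view over per-period member tables. A minimal sketch, assuming a hypothetical `OrderDate` column and monthly member tables (the CHECK constraint on the partitioning column, which must be part of the primary key, is what lets the optimizer skip non-matching members):

```sql
-- Sketch only; all names are assumptions.
CREATE TABLE dbo.Orders_200701 (
    OrderID   INT NOT NULL,
    OrderDate DATETIME NOT NULL
        CHECK (OrderDate >= '20070101' AND OrderDate < '20070201'),
    Amount    MONEY NOT NULL,
    CONSTRAINT PK_Orders_200701 PRIMARY KEY (OrderID, OrderDate)
);
-- ...one such table per month...
GO
CREATE VIEW dbo.Orders AS
    SELECT OrderID, OrderDate, Amount FROM dbo.Orders_200701
    UNION ALL
    SELECT OrderID, OrderDate, Amount FROM dbo.Orders_200702;
GO
-- Queries that filter on OrderDate touch only the matching member tables:
SELECT SUM(Amount)
FROM dbo.Orders
WHERE OrderDate >= '20070101' AND OrderDate < '20070201';
```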
A tried and proven design approach to systems is to not perform redundant work.
If you are summarising the same detailed data many times to present it on a page you are doing redundant work.
Best Regards
Peter
|||The table is very big and has about 40 columns.
What about an index, a clustered index, or partitioning?
|||
Sorry to bring the bad news, but this problem can't be solved simply by posting it on a forum; far too many variables are in play. Looking at your problem, I think your queries are using a table scan to gather the summary info (assumption 1). I presume you don't need all the columns for the summary, so avoiding a table scan should be possible (assumption 2). If both assumptions are right, a covering index might (assumption 3) solve your problem.
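For instance, if the summary query were something like the hypothetical one in the comment below (none of these names come from the original poster), an index whose key contains every column the query touches lets SQL Server 2000 answer it from the index alone instead of scanning all 36 million rows. (SQL Server 2000 has no INCLUDE clause, so all covered columns go in the index key.)

```sql
-- Illustrative only; table and column names are assumptions.
-- Suppose the summary query is:
--   SELECT CustomerID, SUM(Amount)
--   FROM dbo.BigTable
--   WHERE SaleDate >= '20070101'
--   GROUP BY CustomerID;
-- Then this index covers it completely:
CREATE INDEX IX_BigTable_SaleDate_Cust
    ON dbo.BigTable (SaleDate, CustomerID, Amount);
```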
Another solution would be an extra column with the summary data (already mentioned). Update this data with triggers so you only have to do a SELECT and no further calculations. Yet another solution could be an indexed view with the calculation included.
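A minimal sketch of the indexed-view option (Enterprise/Developer editions of SQL Server 2000; all names are hypothetical). The view must be schema-bound, must use `COUNT_BIG(*)`, and the aggregated column is assumed NOT NULL; the unique clustered index is what materializes the pre-aggregated rows.

```sql
-- Sketch only; names are assumptions, Amount assumed NOT NULL.
CREATE VIEW dbo.vSalesSummary
WITH SCHEMABINDING
AS
    SELECT CustomerID,
           COUNT_BIG(*) AS RowCnt,      -- required in indexed views
           SUM(Amount)  AS TotalAmount
    FROM dbo.BigTable
    GROUP BY CustomerID;
GO
CREATE UNIQUE CLUSTERED INDEX IX_vSalesSummary
    ON dbo.vSalesSummary (CustomerID);
```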
I'm not sure if this helped, but this is not so easily answered. The answer depends on the table structure, the query you're running, the data involved, and even your hardware. If you supply the table structure, the query, and the query plans, maybe we can say something useful.
|||In order to verify your indexes, or to determine whether additional indexing will help, you may wish to explore the Index Tuning Wizard (SQL Server 2000's predecessor of the Database Tuning Advisor).
In general, those are not extraordinarily large tables for SQL Server. With tables containing over a billion rows, I still expect sub-second response to typical queries -even JOINs. It is a matter of anticipating the queries, creating summary tables if need be, proper indexing, and adequate hardware.
Reporting queries are a different 'animal', and may require very different data handling -even specialized servers.
|||I will do the following:
1- create a unique clustered index
2- minimize the number of columns to only those I need
3- create a summary table that will hold the summarized data
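A rough sketch of step 3: a permanent summary table refreshed by a scheduled SQL Agent job, so the ASP page reads pre-computed totals instead of aggregating 36 million rows per request. All names here are assumptions.

```sql
-- Sketch only; names are assumptions.
CREATE TABLE dbo.SalesSummary (
    CustomerID  INT   NOT NULL PRIMARY KEY,
    TotalAmount MONEY NOT NULL,
    RowCnt      INT   NOT NULL
);
GO
-- Run from a SQL Agent job, e.g. once per night:
TRUNCATE TABLE dbo.SalesSummary;
INSERT INTO dbo.SalesSummary (CustomerID, TotalAmount, RowCnt)
SELECT CustomerID, SUM(Amount), COUNT(*)
FROM dbo.BigTable
GROUP BY CustomerID;
```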