chih
Posting Yak Master
154 Posts
Posted - 2009-11-05 : 07:00:31
Hello everyone,

I've been asked to design a table which will potentially receive millions of records per day. This table is used to store customers' transactions, and customers must also be able to query their historical transactions from it.

As this is my first time building a table at this scale, I am not sure what best practice is. How do financial firms normally handle this kind of situation? I have been thinking about a partitioned table. Is that a good solution? Or should I break the table into multiple small tables, accessing a different table according to customer ID?

Thank you in advance.
AndrewMurphy
Master Smack Fu Yak Hacker
2916 Posts
Posted - 2009-11-05 : 10:30:13
Proper indexing will be a help, and so will good hardware. Different tables won't: you'd need different code per customer, which is fine for only 1-2 customers but not workable (from a code-management perspective) beyond that. Otherwise all your routines would have to use dynamic SQL, which will cost you in terms of performance (no plan caching) and security.

A different database per customer may help, and may also be desirable in terms of security, data rollbacks, hours of operation, isolation of problems, etc.

Can you tell us more, giving some volumes?
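[Editor's note: a minimal sketch of the indexing advice above, using the column names mentioned later in this thread; the table and index names are invented for illustration.]

```sql
-- Sketch: support "show me my history" by clustering each
-- customer's rows together, ordered by date.
CREATE TABLE dbo.CustomerTransaction (
    CustID int         NOT NULL,
    Price  money       NOT NULL,
    Code   varchar(20) NOT NULL,
    Dates  datetime    NOT NULL
);

-- One customer's history becomes a single contiguous range scan.
CREATE CLUSTERED INDEX IX_CustomerTransaction_Cust_Date
    ON dbo.CustomerTransaction (CustID, Dates);
```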
chih
Posting Yak Master
154 Posts
Posted - 2009-11-05 : 17:06:00
Thank you for the reply. This project is still at a very early stage; I was only told to design a table expecting possibly millions of transactions per day. The table will record customers' transactions; for example, the columns will be:

CustID, Price, Code, Dates

This is an online website, so the number of customers will increase. I was thinking of creating several tables to store different ranges of customers, for example:

customer ID 1-10000: go to table transaction_1
customer ID 10001-20000: go to table transaction_2
...

But the issue, as you mentioned, will be that this is hard to maintain. On the other hand, it would reduce deadlock/blocking issues, so I'm not sure what the best way is.

quote: Originally posted by AndrewMurphy
Proper indexing will be a help, and so will good hardware. Different tables won't: you'd need different code per customer ... Can you tell us more, giving some volumes?
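[Editor's note: to make the code-management cost of the range-per-table scheme concrete, here is a hypothetical sketch of what every data-access routine would end up looking like; the table naming scheme and variable names are invented.]

```sql
-- Hypothetical: route a lookup to dbo.transaction_1, dbo.transaction_2, ...
-- assuming 10,000 customer IDs per table as described above.
DECLARE @CustID int = 12345;
DECLARE @sql nvarchar(max) =
    N'SELECT Price, Code, Dates FROM dbo.transaction_'
    + CAST((@CustID - 1) / 10000 + 1 AS nvarchar(10))
    + N' WHERE CustID = @CustID ORDER BY Dates DESC';

-- Dynamic SQL: the plan for each literal table name is cached
-- separately, and the caller needs SELECT rights on every table.
EXEC sp_executesql @sql, N'@CustID int', @CustID = @CustID;
```

This is the dynamic-SQL tax AndrewMurphy describes: every routine pays it, for every one of the range tables.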
AndrewMurphy
Master Smack Fu Yak Hacker
2916 Posts
Posted - 2009-11-06 : 11:16:44
Design simply. Apply proper indices: work backwards from the most popular queries and see what supports them best. In a test environment, generate production-sized volumes and see if the hardware is suitable. You may need to investigate partitioning for "old" data, to keep a lid on "active" data volumes.

My suggestion re multiple databases only really applies if you were offering an ASP-type service, i.e. your customers are themselves operating as traders with their own data requirements.
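[Editor's note: a hedged sketch of the "partition old data" idea on SQL Server 2005+ table partitioning; the boundary dates, object names, and column types are invented for illustration.]

```sql
-- Monthly ranges on the transaction date; RANGE RIGHT means each
-- boundary date starts a new partition.
CREATE PARTITION FUNCTION pfTxnByMonth (datetime)
AS RANGE RIGHT FOR VALUES ('2009-10-01', '2009-11-01', '2009-12-01');

-- Map every partition to a filegroup (all to PRIMARY here;
-- in production old partitions could sit on cheaper storage).
CREATE PARTITION SCHEME psTxnByMonth
AS PARTITION pfTxnByMonth ALL TO ([PRIMARY]);

-- One logical table; the engine touches only the partitions a
-- date-ranged query needs, and old months can be switched out.
CREATE TABLE dbo.CustomerTransaction (
    CustID int         NOT NULL,
    Price  money       NOT NULL,
    Code   varchar(20) NOT NULL,
    Dates  datetime    NOT NULL
) ON psTxnByMonth (Dates);
```

Unlike the one-table-per-customer-range idea, this keeps a single table name in all application code while still bounding the "active" data each query scans.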