Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
rav3n0u5
Yak Posting Veteran
88 Posts
Posted - 2006-02-12 : 00:53:35
I have a table approaching 2 billion rows... and I am not happy about it. The primary key for the table is a bigint, and it is nearing its limit. How does one go about circumventing this limitation? I was thinking about extending the primary key with a new tinyint column, and then adding an INSTEAD OF INSERT trigger that increments the tinyint. This would of course require rewriting every freaking sproc to accommodate this two-part PK, but I don't know what the alternatives are. Of course, this primary key serves as the basis for a huge amount of data in the database. Thoughts?
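For what it's worth, a minimal sketch of that two-part key idea might look like the following. All of the names here (dbo.BigTable, epoch, payload) are made up for illustration, and the trigger is a simplified single-epoch sketch, not production code:

```sql
-- Hypothetical table: the tinyint "epoch" column extends the key space.
CREATE TABLE dbo.BigTable (
    id      bigint  NOT NULL,
    epoch   tinyint NOT NULL DEFAULT 0,
    payload varchar(100) NULL,
    CONSTRAINT PK_BigTable PRIMARY KEY (epoch, id)
);
GO

-- Sketch of an INSTEAD OF INSERT trigger that starts a new epoch
-- once the bigint range for the current epoch is exhausted. Every
-- proc that joins on id alone would then need the epoch column too.
CREATE TRIGGER trg_BigTable_ins ON dbo.BigTable
INSTEAD OF INSERT
AS
BEGIN
    DECLARE @epoch tinyint, @maxid bigint;

    SELECT @epoch = ISNULL(MAX(epoch), 0) FROM dbo.BigTable;
    SELECT @maxid = MAX(id) FROM dbo.BigTable WHERE epoch = @epoch;

    IF @maxid = 9223372036854775807   -- current epoch exhausted
        SET @epoch = @epoch + 1;

    INSERT dbo.BigTable (id, epoch, payload)
    SELECT i.id, @epoch, i.payload
    FROM inserted i;
END
```

Note the cost: every existing single-column join and lookup would have to be rewritten against the composite key, which is the rewriting-every-sproc pain described above.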
Kristen
Test
22859 Posts
Posted - 2006-02-12 : 02:24:46
Errmmm ... the INT limit is 2,147,483,647; BIGINT is 9,223,372,036,854,775,807.

Does that mean you've got another month or two before you have to worry about it?!

Kristen
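Those two limits are 2^31 - 1 and 2^63 - 1 respectively, so a quick back-of-the-envelope query (column names invented here) shows just how much headroom 2 billion rows still leaves:

```sql
-- INT tops out at 2^31 - 1; BIGINT at 2^63 - 1. A table at
-- 2 billion rows has consumed well under a billionth of the
-- bigint key space.
SELECT
    2147483647                       AS int_max,      -- 2^31 - 1
    9223372036854775807              AS bigint_max,   -- 2^63 - 1
    9223372036854775807 / 2000000000 AS times_bigger; -- ~4.6 billion
```

Even inserting a million rows per second, exhausting a bigint identity from zero would take roughly 292,000 years.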
blindman
Master Smack Fu Yak Hacker
2365 Posts
Posted - 2006-02-12 : 11:41:36

Not sure how big each record is, but I'm guessing his problem will be running out of drive space before he runs out of bigints.
rav3n0u5
Yak Posting Veteran
88 Posts
Posted - 2006-02-12 : 18:13:07

Getting my ints mixed up again. Doh.

Thanks for pointing that out.
madhivanan
Premature Yak Congratulator
22864 Posts
Posted - 2006-02-13 : 01:35:28

Also look for "Maximum Capacity Specifications" in the SQL Server help file.

Madhivanan

Failing to plan is Planning to fail
Kristen
Test
22859 Posts
Posted - 2006-02-13 : 10:36:19

"Getting my ints mixed up again"