 Database Design

Winston Smith
Starting Member

1 Post

Posted - 2006-06-09 : 17:48:37
Good Afternoon,

I am designing a database that will house more than 1,000,000 records, each at least 1.5 MB in size. Each record is a raw binary file, and the files will often be accessed in groups for processing. Starting from zero records, they will arrive in batches of 100 every 20 minutes or so.

I was wondering about the best way to design the table that stores the raw data records, and how best to speed up access to the records when they are needed.

Thanks for any pointers or best tips.

Cheers

MichaelP
Jedi Yak

2489 Posts

Posted - 2006-06-09 : 18:30:47
The generally accepted design pattern here is to NOT store the files in the database, but rather to store the paths to the files in the database.
Add the files to some sort of "queue" table with a status (processed, not processed, error, etc.) and the "path" to each file, and have a service that works that queue.
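A minimal sketch of what that queue table might look like in T-SQL. The table and column names here are hypothetical, not from the original post; the point is that only the path and status live in the database, never the binary data itself:

```sql
-- Hypothetical queue table: stores the path to each raw binary file
-- and its processing status -- the file contents stay on disk.
CREATE TABLE FileQueue (
    FileID     int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    FilePath   varchar(260)      NOT NULL,            -- UNC path to the raw binary file
    Status     tinyint           NOT NULL DEFAULT 0,  -- 0 = not processed, 1 = processed, 2 = error
    ReceivedAt datetime          NOT NULL DEFAULT GETDATE()
)

-- Index so the worker service can quickly find the oldest unprocessed files
CREATE INDEX IX_FileQueue_Status ON FileQueue (Status, ReceivedAt)
```

The worker service would then poll for rows with `Status = 0`, process the file at `FilePath`, and update the row's status when done.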

Hope that helps!
Michael

<Yoda>Use the Search page you must. Find the answer you will. Cursors, path to the Dark Side they are. Avoid them, you must. Use Order By NewID() to get a random record you will.</Yoda>

Opinions expressed in this post are not necessarily those of TeleVox Software, inc. All information is provided "AS IS" with no warranties and confers no rights.

Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)

7020 Posts

Posted - 2006-06-09 : 18:42:03
Since you didn't give any information about what will be stored, other than the fact that you are storing binary objects, I don't see how we can help with the table design.

I assume that you have some information about how you plan to access the data.

You will need a lot of disk. Based on the information you posted, I calculate the database will be around 1.5 terabytes. My rule of thumb with databases that large is that you should have storage of at least 5 times the size of the database to be able to properly manage it. (Database + 2 backups + restored backup for data recovery + copy of database for data recovery = 5x.)
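The arithmetic behind that estimate, using the figures from the original post (1,000,000 records at 1.5 MB each, and the 5x rule of thumb):

```python
# Back-of-envelope storage estimate from the figures in this thread.
records = 1_000_000        # expected record count
record_mb = 1.5            # size of each raw binary file, in MB

db_tb = records * record_mb / 1_000_000   # database size in TB (decimal units)
total_tb = db_tb * 5                      # 5x rule: db + 2 backups + restore + working copy

print(db_tb)     # 1.5
print(total_tb)  # 7.5
```

So the 5x rule implies planning for roughly 7.5 TB of storage, not just the 1.5 TB the data itself occupies.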



CODO ERGO SUM