mivey4 (Yak Posting Veteran, 66 posts), posted 2011-07-06 13:13:05
Does anyone know of any tools, or can anyone recommend the best way, to analyze large amounts of data to identify and tag potential dimensions?

Let me clarify the question. I have a spreadsheet of around 3,000 to 4,000 fields, along with descriptions of those fields and other metadata. As an iterative first pass, I need to identify which fields can potentially be flagged as dimensions.

For several hundred rows coming from another source, I have already managed the task by grouping the fields in the spreadsheet and then analyzing the descriptions to make the determinations. I now have a data source with thousands of records, and I can easily see that this can become a very complex task; architecting data isn't one of my strong areas.

So I was wondering whether any data architects in the forum could recommend a best practice for this, or whether anyone knows of software tools that might make the task less cumbersome.

Any responses at all would be highly appreciated! Thanks.

Oracle OCA
Adaptec ACSP
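One way to make the manual process described above (group fields, scan descriptions, flag likely dimensions) less cumbersome is to script a rough first pass. The sketch below is purely illustrative: the column layout, keyword lists, and scoring threshold are all assumptions that would need tuning against the real metadata spreadsheet, not a recommendation of any specific tool.

```python
# Heuristic first pass at tagging dimension candidates from field metadata.
# The field names, keyword sets, and threshold here are hypothetical and
# should be adjusted to match the actual spreadsheet's contents.

DIMENSION_KEYWORDS = {"type", "code", "category", "status", "region",
                      "name", "flag", "group", "class"}
DIMENSION_DATA_TYPES = {"varchar", "char", "nvarchar", "date", "bit"}
MEASURE_WORDS = ("amount", "total", "qty", "sum", "price")

def score_field(field_name, description, data_type):
    """Return a rough 0-3 score; higher means more likely a dimension."""
    score = 0
    words = set(field_name.lower().replace("_", " ").split())
    words |= set(description.lower().split())
    if words & DIMENSION_KEYWORDS:        # descriptive/categorical wording
        score += 1
    if data_type.lower() in DIMENSION_DATA_TYPES:
        score += 1
    if not any(w in words for w in MEASURE_WORDS):  # not measure-like
        score += 1
    return score

# Sample rows standing in for the metadata spreadsheet.
fields = [
    ("customer_region", "Sales region code for the customer", "varchar"),
    ("order_total", "Total order amount in USD", "decimal"),
    ("product_category", "Category the product belongs to", "varchar"),
]

for name, desc, dtype in fields:
    tag = "dimension?" if score_field(name, desc, dtype) >= 2 else "measure?"
    print(f"{name}: {tag}")
```

The point isn't that keyword matching is reliable on its own; it's that a scored first pass lets you triage thousands of fields and spend the manual review effort only on the ambiguous middle band.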