Lumbago
Norsk Yak Master
3271 Posts
Posted - 2006-08-29 : 08:12:22
We have a system based on AcuCobol and Vision files, and we need to dump the entire set of data files into text files to be imported into a SQL Server. We are using "vutil" to dump the data right now, but it creates a fixed-width file and we would *really* like it to be a delimited file of some sort. The command we are using right now is this:

    vutil -unload -t sourcefile destinationfile

I'll be surprised if anyone even has a clue what I'm talking about here (I sure don't!) and if anyone can even help...whoa...that would be totally far out...

--Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
khtan
In (Som, Ni, Yak)
17689 Posts
Posted - 2006-08-29 : 08:33:12
COBOL? Are you serious? I was using COBOL before switching over to SQL, but I was using MicroFocus COBOL, not AcuCobol.

KH
Lumbago
Norsk Yak Master
3271 Posts
Posted - 2006-08-29 : 08:48:22
quote: COBOL? Are you serious?

I'm afraid I am. Working for a financial company now, and our main business application was created in the early 80's using COBOL. It rules. Yeah...

--Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
eyechart
Master Smack Fu Yak Hacker
3575 Posts
Posted - 2006-08-29 : 08:57:58
Doesn't PeopleSoft use COBOL?

-ec
khtan
In (Som, Ni, Yak)
17689 Posts
Posted - 2006-08-29 : 09:08:40
I'll try to see if I can dig out an AcuCobol manual when I am back in the office tomorrow. We did evaluate AcuCobol once when moving from DOS to Windows, but decided to stick with MicroFocus.

KH
Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)
7020 Posts
Posted - 2006-08-29 : 09:31:30
I don't see any benefit of a delimited file over a fixed-width file. If you have something working, why mess with it?

CODO ERGO SUM
Lumbago
Norsk Yak Master
3271 Posts
Posted - 2006-08-29 : 09:42:29
I don't have anything working...I'm creating a new solution. The problem is maintenance: we are moving a few hundred Vision files over to SQL Server every day for reporting purposes, and when some of the files have over 100 columns it gets pretty nasty to keep control over how many characters each column is (I have to do this manually for each file). Adding delimiters to the flat files would simply rule out this problem, as the delimiter tells my importing application (Business Objects Data Integrator) exactly how big each column is...

--Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
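For a sense of the bookkeeping involved, here is a minimal Python sketch that turns one fixed-width vutil dump into a delimited file, given a hand-typed width list. The column names and widths are hypothetical; in practice one such list has to be maintained per file, which is exactly the per-file maintenance burden described above.

    # Minimal sketch: convert one fixed-width vutil dump into a delimited file.
    # COLUMNS is hypothetical and must be hand-typed per file from its record
    # description -- exactly the bookkeeping this post is complaining about.
    COLUMNS = [
        ("CUSTNO",   8),
        ("NAME",    30),
        ("BALANCE", 12),
    ]

    def fixed_to_delimited(src_path, dst_path, delim="|"):
        offsets, pos = [], 0
        for _name, width in COLUMNS:        # compute slice positions once
            offsets.append((pos, pos + width))
            pos += width
        with open(src_path) as src, open(dst_path, "w") as dst:
            for line in src:
                fields = [line[a:b].strip() for a, b in offsets]
                dst.write(delim.join(fields) + "\n")

    fixed_to_delimited("customers.dat", "customers.txt")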
Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)
7020 Posts
Posted - 2006-08-29 : 09:57:06
A delimiter implies variable-length data, so how will using a delimiter tell your importing application how big the column is? (Not to mention the data type and column name.)

CODO ERGO SUM
Lumbago
Norsk Yak Master
3271 Posts
Posted - 2006-08-30 : 04:49:54
A return question: you have 200 flat files which are all going into different tables with different structures in a SQL Server database. You have the record descriptions for all of these in separate files, in some obscure format that forces you to type in the record length for every single column manually. Some of the flat files have over 100 columns. Would you prefer to do this one-time job manually, typing the record length column by column and table by table, or would you prefer that each column length was computed automatically?

--Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
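If the record descriptions are, or can be converted into, COBOL copybook-style PIC clauses, the lengths could in principle be computed instead of typed. A rough Python sketch under that assumption; it handles only simple DISPLAY-format fields (PIC X(n), 9(n), with S and V taking no space) and ignores OCCURS, REDEFINES and COMP items entirely:

    import re

    # Rough sketch: derive column widths from simple COBOL PIC clauses.
    PIC_RE = re.compile(r"(\d+)\s+(\S+)\s+PIC\s+(\S+)", re.IGNORECASE)

    def pic_width(pic):
        # Expand shorthand like 9(5) to 99999, then count character positions.
        expanded = re.sub(r"([AX9])\((\d+)\)",
                          lambda m: m.group(1) * int(m.group(2)), pic)
        # S (sign) and V (implied decimal) occupy no space in DISPLAY fields.
        return sum(1 for ch in expanded if ch in "AX9")

    def widths_from_copybook(lines):
        cols = []
        for line in lines:
            m = PIC_RE.search(line)
            if m:
                cols.append((m.group(2), pic_width(m.group(3).rstrip("."))))
        return cols

    copybook = [
        "05 CUST-NO    PIC 9(8).",
        "05 CUST-NAME  PIC X(30).",
        "05 BALANCE    PIC S9(10)V99.",
    ]
    print(widths_from_copybook(copybook))
    # -> [('CUST-NO', 8), ('CUST-NAME', 30), ('BALANCE', 12)]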
spirit1
Cybernetic Yak Master
11752 Posts
Posted - 2006-08-30 : 09:24:48
Couldn't you somehow build a format file for each data file and use bcp to import the data?

Go with the flow & have fun! Else fight the flow
blog thingie: http://weblogs.sqlteam.com/mladenp
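For what that could look like: a bcp format file can describe fixed-width fields by giving each field a fixed data length and an empty terminator, so the flat file needs no delimiters at all. A minimal sketch, assuming a hypothetical three-column table dbo.Customers and the SQL Server 2000 (8.0) non-XML format-file layout:

    8.0
    3
    1   SQLCHAR   0   8    ""       1   CustNo     ""
    2   SQLCHAR   0   30   ""       2   CustName   ""
    3   SQLCHAR   0   12   "\r\n"   3   Balance    ""

The columns are: field order, host data type, prefix length, data length, terminator, server column order, server column name, collation. The load itself would then be something like:

    bcp MyDb.dbo.Customers in customers.dat -f customers.fmt -S MYSERVER -T

where MyDb, customers.dat and customers.fmt are placeholders. Note the format file still has to encode every column width, so this solves the import mechanics rather than the width-bookkeeping problem discussed above.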
Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)
7020 Posts
Posted - 2006-08-30 : 12:28:40
quote: Originally posted by Lumbago
A return question: you have 200 flat files which are all going into different tables with different structures in a SQL Server database. You have the record descriptions for all of these in separate files, in some obscure format that forces you to type in the record length for every single column manually. Some of the flat files have over 100 columns. Would you prefer to do this one-time job manually, typing the record length column by column and table by table, or would you prefer that each column length was computed automatically?
--Lumbago

I have no objection to less work, but my question still stands, because I don't see how having a delimited file accomplishes what you want.

I will say that there are a number of things to watch out for with data from COBOL files:
1. It is common to have multiple record types in the same file, and you may have to put these in different tables and relate them together.
2. It is easy to have repeating items, or repeating groups of items, in a COBOL file, and you will usually have to transform those into a normalized structure.
3. It is not uncommon for programmers to overload data elements with multiple meanings, depending on the context.
4. It is not uncommon to find data elements that violate the ordinary rules for valid data for that data element. What do you do with a date of 99999999?

I did a project a few years back to convert data from COBOL data files to SQL Server, and I found it necessary to write a custom conversion program for each COBOL file to extract to files in a normalized format to load into SQL Server.

If the only issue you have to deal with turns out to be having to define the format for fixed-width files, you can count yourself very, very lucky.

CODO ERGO SUM
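Point 4 usually needs explicit handling at load time. A small Python sketch of one common approach, using hypothetical sentinel values and mapping them to NULL rather than letting the load fail:

    # Sketch: map "impossible" sentinel dates to NULL before loading.
    # The sentinel values are hypothetical examples of point 4 above.
    from datetime import date

    SENTINELS = {"99999999", "00000000"}    # common "no date" markers

    def to_sql_date(yyyymmdd):
        if yyyymmdd in SENTINELS:
            return None                     # becomes NULL in SQL Server
        return date(int(yyyymmdd[:4]), int(yyyymmdd[4:6]), int(yyyymmdd[6:]))

    print(to_sql_date("20060829"))   # 2006-08-29
    print(to_sql_date("99999999"))   # None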
Lumbago
Norsk Yak Master
3271 Posts
Posted - 2006-08-31 : 02:32:18
spirit: this is basically what I'm doing with Business Objects Data Integrator (BODI), only it's using bulk insert. No overhead in the application whatsoever after the import statement has been prepared.

Michael: Now this is very valuable information! I have never migrated data from COBOL to SQL Server before, but I'm beginning to see that it's a big can of worms. In response to your comments:
1. I have not yet encountered multiple record types (not sure what you mean by this) in our flat files, but I have only processed two of them so I'm probably bound to find some!
2. I have had several repeating items, but the "vutil" thingy that does the export of data places these columns next to each other. So the 4-times-repeating item "Column" will come out as Column1, Column2, Column3, Column4, which works for me.
3. I'm positive that this is happening here also; luckily we have a DTS-based system that's already moving the same data that I can refer to (but that takes AGES to run, which is why I'm doing this).
4. I have no clue whatsoever right now what to do with these. Right now we have a negative number issue where the number -3256 comes out as 0000325} or 0000325N. It makes no sense!

And writing a custom conversion program for each file is just not an option. Ugh...I seriously hope I don't have to go there. We have sent a question to AcuCorp on how to achieve our objective, so I hope they will come up with something sensible...

--Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
Lumbago
Norsk Yak Master
3271 Posts
Posted - 2006-08-31 : 02:53:30
Bad news...just got the response from AcuCorp:

quote: The vutil utility will only do a direct dump from Vision files into flat files. It does not do any data conversion, and all numbers will be in the internal format. It will not create a file in the format that you require. The suggestion that I would normally make would be to use AcuODBC to load the data directly from the Vision files. The only other suggestion would be to write your own programs to dump the Vision files into flat files in the format that you require. From our 6.0.0 release there is the AcuXML interface that can be used to create files in XML format. This would still require writing a program to read through the Vision file to create the XML file.

We are using AcuODBC in the DTS packages I just mentioned, and they are working fine except for the fact that it's about as fast as watching my own hair grow. And upgrading costs millions, takes years, and anyone who has moved large amounts of data in XML format knows that this is just bad. I'll just go and dump myself into a river now...nice to have known you all...

--Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
SwePeso
Patron Saint of Lost Yaks
30421 Posts
Posted - 2006-08-31 : 03:25:01
Maybe this [url]http://www.trifox.com/news/vision.html[/url] will help. $200 does seem affordable.

There also seems to be an extract option in the vutil tool; see this page: [url]http://www.cosort.com/public/solutions/cosort/acucobol.htm[/url]. They also have a tool to extract data from Vision files.

Peter Larsson
Helsingborg, Sweden
khtan
In (Som, Ni, Yak)
17689 Posts
Posted - 2006-08-31 : 04:24:04
Darn! I can't find my AcuCobol manual. Hope nobody has sold it to the karung guni.

KH
Lumbago
Norsk Yak Master
3271 Posts
Posted - 2006-08-31 : 05:20:23
Omg Peso, this might actually be something! I'm sending them an email right now, and 200 bucks is not a problem whatsoever. Heck, even 8000 would be money well spent if this VORTEX baby performs like a champ.

--Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
SwePeso
Patron Saint of Lost Yaks
30421 Posts
Posted - 2006-08-31 : 05:33:34
If it works, do you want my account number, so you can send me the other 7800 bucks?

Peter Larsson
Helsingborg, Sweden
Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)
7020 Posts
Posted - 2006-08-31 : 07:32:57
quote: Originally posted by Lumbago
...Right now we have a negative number issue where the number -3256 comes out as 0000325} or 0000325N. It makes no sense!...

Actually, that looks like good data. What is happening is that the sign is carried in the right-most character. This is a very common format for signed decimal data. Wait till you run into packed decimal for a real WTF moment.

If the other utility doesn't work out, you may want to take a look at a tool called Data Junction. It understands all these internal COBOL data formats for most major COBOL versions and file systems.

Do you have the source code for the COBOL programs? If so, you may want to spend a little time with a COBOL manual, learning to read the contents of the DATA DIVISION of a program.

CODO ERGO SUM
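To make the sign-in-the-last-character idea concrete, here is a small Python sketch that decodes the common ASCII "overpunch" convention ('{'..'I' for +0..+9, '}'..'R' for -0..-9). The exact character table varies by compiler and character set, so treat this mapping as an assumption to verify against AcuCobol's documentation; notably, under this table 0000325N decodes to -3255 rather than -3256, which itself suggests AcuCobol's table may differ slightly.

    # Sketch: decode zoned-decimal "overpunch" signs, where the last character
    # carries both the final digit and the sign. The mapping below is the
    # common ASCII convention -- verify it against the actual AcuCobol output.
    POSITIVE = {c: i for i, c in enumerate("{ABCDEFGHI")}   # +0 .. +9
    NEGATIVE = {c: i for i, c in enumerate("}JKLMNOPQR")}   # -0 .. -9

    def decode_overpunch(s):
        last = s[-1]
        if last in POSITIVE:
            return int(s[:-1] + str(POSITIVE[last]))
        if last in NEGATIVE:
            return -int(s[:-1] + str(NEGATIVE[last]))
        return int(s)                # plain unsigned field

    print(decode_overpunch("0000325N"))   # -3255
    print(decode_overpunch("0000325}"))   # -3250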
Lumbago
Norsk Yak Master
3271 Posts
Posted - 2006-08-31 : 07:35:08
No probs buddy...anything for a helpful, friendly neighbour. After doing a bit of research and googling a little differently, I actually managed to find a few other options as well, so hope is not lost. This link, for example, is pretty cool after seriously considering the river: http://www.cobug.com/cobug/docs/fileconver0038.html

--Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
Lumbago
Norsk Yak Master
3271 Posts
Posted - 2006-08-31 : 07:42:38
Heck man, you gotta be crazy! Putting me in front of a COBOL manual is like putting a blonde in a nuclear physics class...I'm just not interested! COBOL is for old people...just like Delphi (sorry, I just had to say that, hehe).

But I'll definitely check up on that Data Junction product. I've had my share of WTF moments lately, and I'm open to anything that makes my life easier at this point.

--Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
SwePeso
Patron Saint of Lost Yaks
30421 Posts
Posted - 2007-08-23 : 17:05:13
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=88357

E 12°55'05.25" N 56°04'39.16"