
 NVARCHAR chop chop


SamC
White Water Yakist

3467 Posts

Posted - 2003-12-18 : 14:38:17
It's time to rewrite my email and message system to support longer messages.

I've got to slice and dice large NVARCHAR strings into 4000-character (8000-byte) chunks, inserting multiple rows into a storage table to get the whole Japanese response stored correctly.
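
Roughly, I'm picturing a storage table along these lines (just a sketch; the table and column names are placeholders I made up):

-- One email body split across many rows, 4000 Unicode characters per row
-- (NVARCHAR(4000) is the SQL Server 2000 limit: 4000 characters = 8000 bytes).
CREATE TABLE MessageChunk (
    MessageID int NOT NULL,            -- points at the message header row
    ChunkNo   int NOT NULL,            -- 1, 2, 3, ... reassembly order
    Body      nvarchar(4000) NOT NULL, -- one 4000-character slice
    CONSTRAINT PK_MessageChunk PRIMARY KEY (MessageID, ChunkNo)
)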

Seems to me there are more than a couple of challenges. Here's one: In ADO, what's the optimal way to pass Nx1000 bytes of NVARCHAR to a stored procedure? Lots and lots of Parms?

Is there some way to pass more than 8000 bytes in a single parameter?

Sam

robvolk
Most Valuable Yak

15732 Posts

Posted - 2003-12-19 : 08:19:00
Take a look here:

http://weblogs.sqlteam.com/davidm/posts/655.aspx

SamC
White Water Yakist

3467 Posts

Posted - 2003-12-19 : 09:28:40
Thanks Rob. I'd thought of using text to pass long strings to SQL using ADO. Seems like an idea to explore.

The trick in my case is that NVARCHAR must be supported. Users can enter Asian characters in web forms, and the ASP/ADO code submits the result to SQL. I'll need to test how ASP forms capture Unicode strings and how ADO / TEXT passes those strings to SQL.

Then there's the issue of converting SQL text to NVARCHAR in the stored procedure. In this case, the text data is actually 2-byte Unicode. To keep SQL from converting each byte of the text into its own double-byte NVARCHAR character, I need some help, ideas, or a miracle. I'm thinking an intermediate type like VARBINARY might work.
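
One half-formed thought on the VARBINARY angle (sketch only): if the Unicode arrives as raw bytes, casting the binary straight to NVARCHAR should treat it as 2-byte characters rather than converting byte by byte.

-- 0x4200 is the little-endian UCS-2 encoding of N'B', 0x5300 of N'S'.
-- Casting binary to nvarchar keeps the byte pairs together:
SELECT CAST(0x42005300 AS nvarchar(10))   -- returns N'BS', not four mangled characters
-- (A VARBINARY parameter still tops out at 8000 bytes, though.)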

Sam

robvolk
Most Valuable Yak

15732 Posts

Posted - 2003-12-20 : 17:36:58
Sorry, I should've pointed out that ntext should work too. You may (or may not) need to watch out for the double bytes when using SubString() (I've never used ntext so I don't know for sure). I'm pretty sure the functions will automatically handle unicode but you should test it anyway just to be certain.
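
A quick test would be something like this (just a sketch I haven't run, using a throwaway temp table):

CREATE TABLE #Scratch (Body ntext)
INSERT #Scratch (Body) VALUES (N'put some real double-byte text here')
-- BOL says the start/length arguments to SUBSTRING are character counts for ntext,
-- so a double-byte character shouldn't get split down the middle.
SELECT SUBSTRING(Body, 1, 5) AS FirstFive FROM #Scratch   -- comes back as nvarchar
DROP TABLE #Scratch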

SamC
White Water Yakist

3467 Posts

Posted - 2003-12-20 : 21:17:25
I stumbled on ntext in BOL today. I've never worked with either text or ntext, so I have a learning experience ahead. But ntext solves the problem of getting long Unicode strings to a stored procedure and into the database.

Lots of odd things about ntext/text. You can pass 'em as parameters, but you can't allocate a local text variable.
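
For my own notes, the insert proc will probably end up looking something like this (untested sketch, made-up names; it assumes a MessageChunk table of MessageID / ChunkNo / nvarchar(4000) rows):

CREATE PROCEDURE InsertMessageBody
    @MessageID int,
    @Body      ntext   -- fine as a parameter, just can't be declared as a local variable
AS
BEGIN
    DECLARE @Pos int, @ChunkNo int, @Chars int
    SET @Pos = 1
    SET @ChunkNo = 1
    SET @Chars = DATALENGTH(@Body) / 2   -- DATALENGTH is in bytes; ntext is 2 bytes per character

    WHILE @Pos <= @Chars
    BEGIN
        -- SUBSTRING on ntext takes character offsets and returns nvarchar,
        -- so each 4000-character slice drops straight into an nvarchar(4000) column.
        INSERT INTO MessageChunk (MessageID, ChunkNo, Body)
        VALUES (@MessageID, @ChunkNo, SUBSTRING(@Body, @Pos, 4000))

        SET @Pos = @Pos + 4000
        SET @ChunkNo = @ChunkNo + 1
    END
END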

This'll be fun - when I get a round tuit.

Sam

Merkin
Funky Drop Bear Fearing SQL Dude!

4970 Posts

Posted - 2003-12-21 : 08:07:29
Here is an idea, and it's just an off the top of my head idea....

For this, it might be easier to do some of it in your business layer. Entering a new email isn't a high-volume process, so it shouldn't be time-critical. You could:

1. Fire a "Create message" procedure and get back an ID (or whatever) -- see the sketch below.
2. Use ASP/VB to break up the message body and send it chunk by chunk.


Then use the reverse to rebuild the message prior to sending it.
Might be easier than trying to do it all inside one proc.
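
The T-SQL side of that could be a pair of small procs, something like this (rough sketch, all names made up; it assumes a Message header table with an identity key plus a MessageChunk table like the one Sam described):

CREATE PROCEDURE CreateMessage
    @Subject   nvarchar(400),
    @MessageID int OUTPUT
AS
    INSERT INTO Message (Subject) VALUES (@Subject)
    SET @MessageID = SCOPE_IDENTITY()
GO

CREATE PROCEDURE AppendMessageChunk
    @MessageID int,
    @ChunkNo   int,
    @Body      nvarchar(4000)   -- one chunk per call; nvarchar(4000) is as big as a regular parameter gets
AS
    INSERT INTO MessageChunk (MessageID, ChunkNo, Body)
    VALUES (@MessageID, @ChunkNo, @Body)
GO

Rebuilding is then just a SELECT ... ORDER BY ChunkNo and a concatenation loop back in ASP.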



Damian

SamC
White Water Yakist

3467 Posts

Posted - 2003-12-21 : 09:42:59
The idea of using multiple NVARCHAR rows to hold a single email doesn't sound bad. And I'm dealing with familiar entities.

When I get to this job, I may push the NTEXT idea until I hit a wall, then fall back to a multi-row NVARCHAR solution.

There are several other angles to this. One is that some emails sent by administrators are full of placeholder text which must be REPLACEd with User/Course/Client information. This will get ugly with either solution.
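
For example (sketch only; %UserName% and %CourseName% are made-up token names), the per-row version of the replace would be something like this:

DECLARE @MessageID int, @UserName nvarchar(100), @CourseName nvarchar(200)
SELECT @MessageID = 1, @UserName = N'Sam', @CourseName = N'Whitewater 101'

-- Replace tokens one chunk at a time.
UPDATE MessageChunk
SET Body = REPLACE(REPLACE(Body, N'%UserName%', @UserName),
                   N'%CourseName%', @CourseName)
WHERE MessageID = @MessageID

-- Ugly part #1: a token that straddles two rows ('%User' at the end of one
-- chunk, 'Name%' at the start of the next) never gets matched.
-- Ugly part #2: if the replacement text is longer than the token, a row that
-- is already near 4000 characters could end up truncated.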

Sam

Merkin
Funky Drop Bear Fearing SQL Dude!

4970 Posts

Posted - 2003-12-21 : 09:52:46
If you have to do a bunch of token replacing, I think you're definitely better off doing it outside of T-SQL.



Damian