
Why does UNLOAD insist on escaping commas when using the QUOTES OFF clause?

justin_willey
Participant

Using v17.0.10.5923, I am trying to create a simple way of unloading a number of (unspecified) tables to JSON. The FOR JSON AUTO option is ideal for the purpose. The one problem I have is that I want to use QUOTES OFF (or QUOTE ''), otherwise UNLOAD puts single quotes round the data.

When I do this, the commas between the JSON data value pairs are escaped as \x2c, rendering the JSON invalid. If I use ESCAPES OFF, this doesn't happen, but then things that do need escaping aren't escaped!

The problem doesn't seem to specifically relate to JSON data, as it can be illustrated with two very simple examples:

create variable blah2 long varchar;
set blah2 = '"Jane Doe","Town"';

running

unload select blah2 to 'c:\\tmp\\junk26.txt' quotes off;

gives

"Jane Doe"\x2c"Town"

whereas

unload select blah2 to 'c:\\tmp\\junk27.txt';

gives

'"Jane Doe","Town"'

Using xp_write_file() works, but has numerous disadvantages compared with UNLOAD (client-side files, character set control, etc.). Using iSQL and Save to File doesn't have the problem.

As a work-around I could strip out the excess quotes from the resulting file, but that makes a very simple process rather messier.
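For what it's worth, that post-processing step could look something like this (a minimal Python sketch of the quote-stripping idea; the file name is an assumption, and embedded quotes inside values would still need handling):

```python
def strip_outer_quotes(line: str) -> str:
    # Drop the wrapping single quotes that UNLOAD adds around each value.
    # Embedded quotes/escapes inside the value are NOT handled here.
    line = line.rstrip("\r\n")
    if len(line) >= 2 and line[0] == "'" and line[-1] == "'":
        return line[1:-1]
    return line

# Hypothetical usage against the unloaded file:
# with open(r"c:\tmp\junk27.txt") as src:
#     cleaned = [strip_outer_quotes(row) for row in src]
```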

Accepted Solutions (1)


johnsmirnios
Employee

The default column delimiter is a comma, so if a value contains a comma it must be escaped. UNLOAD tries to ensure (as best it can) that the data you UNLOAD can be given to a LOAD with the same options. Without escaping the comma, your output without quotes ("Jane Doe","Town") would represent two values if you attempted to LOAD it with the same options. ESCAPES OFF will likely give you what you want. Or, possibly, set your column delimiter to something that doesn't appear in your data.
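John's round-trip argument can be illustrated outside SQL; here is a rough Python sketch (the \x2c convention mirrors UNLOAD's output, but the split/unescape helpers are purely illustrative, not SQL Anywhere code):

```python
value = '"Jane Doe","Town"'   # ONE column value that happens to contain a comma

# QUOTES OFF + ESCAPES OFF: on reload, the raw line splits into two values
assert value.split(",") == ['"Jane Doe"', '"Town"']

# QUOTES OFF with escapes on: the delimiter inside the value is protected
escaped = value.replace(",", r"\x2c")
assert escaped == r'"Jane Doe"\x2c"Town"'

# a LOAD-style reader that honours the escape recovers the original value
def unescape(field: str) -> str:
    return field.replace(r"\x2c", ",")

assert unescape(escaped) == value
```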

justin_willey
Participant

Of course - my commas are inside the values as far as SQLA is concerned.

unload select blah2 to 'c:\\tmp\\junk26.txt' quotes off delimited by '';

works fine. (Interestingly, using the COLUMN DELIMITED BY variation gives "Syntax error near 'COLUMN'".)

johnsmirnios
Employee

The doc does appear to be incorrect. IIRC, the syntax has always been just "DELIMITED BY" and that syntax predated the introduction of custom row delimiters that were set via "ROW DELIMITED BY".

johnsmirnios
Employee

Like me, you probably prefer the old dcx. The SAP help system contains the correction (dcx is no longer updated): https://help.sap.com/viewer/93079d4ba8e44920ae63ffb4def91f5b/17.0/en-US/817fd0fd6ce21014a58ff727fbb7...

VolkerBarth
Contributor

Of course for those like you and me, a DCX comment that COLUMN DELIMITED BY is not supported would be worthwhile 🙂

Breck_Carter
Participant

> dcx is no longer updated

FWIW dcx is handy for creating A links to old doc topics (V10, etc), so even though it's not being updated, I hope the website remains up.

But... it's up to everyone to preserve everything they own since there is no guarantee cloud-based data won't disappear.

justin_willey
Participant

As they say: The Cloud - other people's computers 🙂

justin_willey
Participant

Many thanks John

VolkerBarth
Contributor

So how do I backup this Forum? 🙂

jack_schueler
Product and Topic Expert

There is a comment that COLUMN is not correct, added by yours truly, on 2016/10/19 13:44.

VolkerBarth
Contributor

Thanks Jack – apparently I had a look at v16 DCX, probably because I prefer even older docs 🙂

justin_willey
Participant

My UNLOAD DATABASE TO JSON now works fine - except when I hit a big table. It seems (unsurprisingly) that 1GB of data in a table turns into rather more than 4GB of JSON. Before this, I never expected to meet SQLCODE -1313 Maximum string length exceeded!

I'll just have to unload in blocks, maybe 100k records at a time.
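A small generator along these lines could emit the per-block statements (everything here is an assumption for illustration: the table/file names, the ORDER BY column, and the TOP ... START AT pagination, which should be checked against the SQL Anywhere docs):

```python
def chunked_unload_sql(table: str, out_dir: str, total_rows: int,
                       chunk: int = 100_000) -> list[str]:
    # Build one UNLOAD ... FOR JSON AUTO statement per block of rows.
    # out_dir is expected with SQL-style doubled backslashes, e.g. r"c:\\tmp".
    stmts = []
    for i, start in enumerate(range(1, total_rows + 1, chunk), 1):
        stmts.append(
            f"UNLOAD SELECT * FROM (SELECT TOP {chunk} START AT {start} * "
            f"FROM {table} ORDER BY 1) dt FOR JSON AUTO "
            f"TO '{out_dir}\\\\{table}_{i}.json' QUOTES OFF DELIMITED BY '';"
        )
    return stmts
```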

Answers (0)