on 2020 Aug 20 9:06 AM
Using v17.0.10.5923, I am trying to create a simple way of unloading a number of (unspecified) tables to JSON. The FOR JSON AUTO option is ideal for the purpose. The one problem I have is that I want to use QUOTES OFF (or QUOTE ''), otherwise UNLOAD puts single quotes round the data.
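For context, the kind of statement I'm running looks roughly like this (table and path names are just illustrative):

```sql
-- QUOTES OFF is needed so UNLOAD doesn't wrap the JSON text
-- in single quotes; without it the file is not valid JSON.
UNLOAD
  SELECT * FROM my_table FOR JSON AUTO
  TO 'c:\\tmp\\my_table.json'
  QUOTES OFF;
```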
When I do this, the commas between the JSON name/value pairs are escaped as \x2c, rendering the JSON invalid. If I use ESCAPES OFF, this doesn't happen, but then things that do need escaping aren't escaped!
The problem doesn't seem to specifically relate to JSON data, as it can be illustrated with two very simple examples:
create variable blah2 long varchar; set blah2 = '"Jane Doe","Town"';
running
unload select blah2 to 'c:\\tmp\\junk26.txt' quotes off;
gives
"Jane Doe"\x2c"Town"
whereas
unload select blah2 to 'c:\\tmp\\junk27.txt';
gives
'"Jane Doe","Town"'
Using xp_write_file() works, but has numerous disadvantages compared with UNLOAD (client side files, character set control etc). Using iSQL and Save to File doesn't have the problem.
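For comparison, the xp_write_file() route I tried looks something like this (a sketch only; the table name and path are illustrative, and it assumes the FOR JSON result can be captured into a variable):

```sql
-- Server-side only: write the JSON string directly,
-- bypassing UNLOAD's quoting and escaping entirely.
CREATE VARIABLE json_out LONG VARCHAR;
SELECT * INTO json_out FROM my_table FOR JSON AUTO;
CALL xp_write_file( 'c:\\tmp\\my_table.json', json_out );
```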
As a work-around I could strip out the excess quotes from the resulting file, but that makes a very simple process rather messier.
The default column delimiter is a comma, so if a value contains a comma it must be escaped. UNLOAD tries to ensure (as best it can) that the data you UNLOAD can be given back to a LOAD with the same options. Without escaping the comma, your unquoted output ("Jane Doe","Town") would represent two values if you attempted to LOAD it with the same options. ESCAPES OFF will likely give you what you want. Or, possibly, set your column delimiter to something that doesn't appear in your data.
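Concretely, the two suggestions above would look something like this (table and path names are illustrative):

```sql
-- Option 1: disable escaping as well as quoting. Note the output
-- is then not guaranteed to round-trip through LOAD TABLE.
UNLOAD
  SELECT * FROM my_table FOR JSON AUTO
  TO 'c:\\tmp\\my_table.json'
  QUOTES OFF ESCAPES OFF;

-- Option 2: keep escaping, but use a delimiter that cannot
-- appear in the data (here, a tab character).
UNLOAD
  SELECT * FROM my_table FOR JSON AUTO
  TO 'c:\\tmp\\my_table.json'
  QUOTES OFF DELIMITED BY '\x09';
```

For a single-column FOR JSON result there is nothing to delimit, so either option avoids the \x2c escaping of commas inside the JSON text.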
Like me, you probably prefer the old dcx. The SAP help system contains the correction (dcx is no longer updated): https://help.sap.com/viewer/93079d4ba8e44920ae63ffb4def91f5b/17.0/en-US/817fd0fd6ce21014a58ff727fbb7...
There is a comment that COLUMN is not correct, added by yours truly, on 2016/10/19 13:44.
My UNLOAD DATABASE TO JSON now works fine - except when I hit a big table. It seems (unsurprisingly) that 1GB of data in a table turns into rather more than 4GB of JSON. Before this, I never expected to meet SQLCODE -1313 Maximum string length exceeded!
I'll just have to unload in blocks, maybe 100k records at a time.
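A rough sketch of what that block-wise unload might look like (the table name, key column, and path are illustrative, and it assumes an ORDER BY on a key column gives a stable row order; EXECUTE IMMEDIATE is used so each block gets its own file name):

```sql
-- Unload a large table 100k rows at a time so no single FOR JSON
-- string exceeds the LONG VARCHAR limit.
BEGIN
  DECLARE n     INT;
  DECLARE batch INT;
  SET batch = 0;
  SELECT COUNT(*) INTO n FROM my_table;
  WHILE batch * 100000 < n LOOP
    EXECUTE IMMEDIATE STRING(
      'UNLOAD SELECT TOP 100000 START AT ', batch * 100000 + 1,
      ' * FROM my_table ORDER BY pk_col FOR JSON AUTO',
      ' TO ''c:\\tmp\\my_table_', batch, '.json''',
      ' QUOTES OFF ESCAPES OFF' );
    SET batch = batch + 1;
  END LOOP;
END;
```

Each block comes out as a separate JSON array, so stitching the files back into one document is left to whatever consumes them.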