
Problem with FOR JSON RAW and character sets

justin_willey
Participant

We are hitting a problem with the extremely useful FOR JSON RAW option:

CREATE TABLE Blah67(PK int default autoincrement, stuff char(50), primary key (PK));

INSERT INTO Blah67(Stuff) values ('£');
INSERT INTO Blah67(Stuff) values ('€');
INSERT INTO Blah67(Stuff) values ('é');

If I select the contents of the table as XML

select * from Blah67 for xml raw;

everything is fine:

forxml(1,'PK',xmltab."1",'stuff',xmltab."2")
====================================================================
<row PK="1" stuff="£"/><row PK="2" stuff="€"/><row PK="3" stuff="é"/>

but with JSON

select * from Blah67 for JSON raw;

it all goes haywire:

forjson(256,'PK',jsontab."1",'stuff',jsontab."2")
====================================================================
[{"PK":1,"stuff":"£"},{"PK":2,"stuff":"€"},{"PK":3,"stuff":"é"}]

Reproducible in 17.0.9.4913 and 16.0.0.2546.

I am using catalog collation 1252LATIN1 and character set windows-1252
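For anyone reproducing this, the database's CHAR and NCHAR character sets can be confirmed directly from SQL (a quick sketch using the standard DB_PROPERTY function):

```sql
-- Confirm the CHAR and NCHAR character sets of the current database.
-- On the database described above this should report windows-1252 for CHAR;
-- the NCHAR character set is always UTF-8 in SQL Anywhere.
SELECT DB_PROPERTY('CharSet')      AS char_charset,
       DB_PROPERTY('NcharCharSet') AS nchar_charset;
```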

VolkerBarth
Contributor

FWIW, if you change the Stuff column to NCHAR(50), the behaviour is identical; I had suspected that might make a difference.

"SELECT EXPRTYPE(...)" returns XML for both queries. The XML data type is CHAR-based, so that may be relevant here. JSON itself uses Unicode strings.
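The check described above can be run like this (a sketch; EXPRTYPE takes a query string and a 1-based result-column position):

```sql
-- Both queries describe their single result column as the XML type,
-- even though FOR JSON builds its string as NCHAR internally.
SELECT EXPRTYPE('SELECT * FROM Blah67 FOR XML RAW',  1) AS xml_result_type,
       EXPRTYPE('SELECT * FROM Blah67 FOR JSON RAW', 1) AS json_result_type;
```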

In summary: I don't have a clue.

Accepted Solutions (1)


johnsmirnios
Participant

That is a bit odd. I can see that, internally, FOR JSON always builds its strings as NCHAR, since JSON is always in Unicode according to RFC 4627.

But the server seems to describe FOR JSON values as the "XML" datatype, which appears to be equivalent to LONG VARCHAR. That would mean the UTF-8 string is treated as if it were in the CHAR charset and, assuming your terminal is using 1252LATIN1, it does not undergo character set translation on the way to the client. Unfortunately, I'm not very familiar with our XML datatype handling, but given that you appear to be getting a UTF-8 encoded string back (a string that should have been converted back to 1252LATIN1), we do appear to be treating FOR JSON as CHAR, which is incorrect.

In the end, you want your JSON to be in NCHAR (UTF-8) since valid JSON must always be encoded in Unicode according to the standard.

Given what I see, I think the only workaround to get truly valid JSON is to use a database with a CHAR encoding of UTF-8. And I will need to report a bug...
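The mechanism described above can be reproduced by hand (a sketch; it assumes the CSCONVERT system function and a windows-1252 CHAR charset, as in the original post):

```sql
-- '£' in windows-1252 is one byte (0xA3); in UTF-8 it is two bytes (0xC2 0xA3).
-- Converting the string to UTF-8 and then letting the client display those
-- bytes as windows-1252 reproduces the 'Â£' corruption seen in the
-- FOR JSON output above.
SELECT CSCONVERT('£', 'utf-8') AS utf8_bytes_shown_as_char;
```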

justin_willey
Participant

Many thanks for that John - will look forward to a fix in due course. We'll try the workaround.

VolkerBarth
Contributor

FWIW, here's an older answer from Nick on the peculiarities of XML being implemented as LONG VARCHAR...

johnsmirnios
Participant

The bug has not been fixed. Given the size of the change (effort involved, impact on apps by changes in datatypes, etc), I don't expect it to be addressed in the foreseeable future. I would recommend changing your database to using a UTF8-based encoding (UCA or UTF8BIN).

Answers (2)


Hi, I have the same problem as Justin. Is there any fix for this yet? I consume the JSON RAW result on a web server and can't get the characters displayed correctly. I tried casting the FOR JSON RAW result to LONG NVARCHAR, but that doesn't work.

justin_willey
Participant

Which version & build are you using?


16.0.0.2344

justin_willey
Participant

I think v16 was already out of support at the time of John's comment. The problem is still present in 17.0.10.5923. I'll check 6176 when I have it installed.

johnsmirnios
Participant

I can tell you that it is not fixed yet, and I cannot tell you when it will be. I recommend using a database CHAR collation that uses UTF-8.

justin_willey
Participant

I've tested the script above in a UTF8 database and can confirm that it works fine.

VolkerBarth
Contributor

Aside: Note that v16 has been out of support since the end of 2018 (as Justin has pointed out), but there are a number of newer v16 EBFs than 16.0.0.2344; the latest is 16.0.0.2798 from 01/2019. It does not fix the current issue, but probably fixes other ones...


Hmmm, is it easy to change the database from Latin-1 to UTF-8?

VolkerBarth
Contributor

Well, it's easy but requires a database reload as explained here - to quote:

You can change your database collation from one collation to another by performing an unload and reload into the collation of choice. Collations are chosen at database creation time and cannot be changed without rebuilding the database.
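The rebuild might look something like this (a hedged sketch: file names and credentials are placeholders, and the exact utility options should be checked against the documentation for your version):

```sh
# 1) Unload schema and data from the existing windows-1252 database
dbunload -c "UID=DBA;PWD=sql;DBF=old.db" unload_dir

# 2) Create a new database with a UTF-8 CHAR collation
dbinit -z UCA -ze UTF8 new.db

# 3) Reload the unloaded schema and data into the new database
dbisql -c "UID=DBA;PWD=sql;DBF=new.db" unload_dir/reload.sql
```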

justin_willey
Participant

Once you've done it, do check that any non-standard characters are displaying properly in your application - it may be expecting something else!

This workaround worked for me:

DECLARE vTemp LONG VARCHAR;
select * INTO vTemp from Blah67 for JSON raw;
SELECT CAST(CAST(vTemp AS BINARY) AS LONG NVARCHAR);

It should even work (do nothing) if the bug gets fixed some time in the future...
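If the cast is needed in several places, it can be wrapped in a small helper (a sketch: the function name is made up for this example, and CREATE OR REPLACE requires v17 or later):

```sql
-- Returns the FOR JSON RAW result reinterpreted as NCHAR (UTF-8),
-- undoing the missing character set translation described above.
CREATE OR REPLACE FUNCTION Blah67AsJson()
RETURNS LONG NVARCHAR
BEGIN
    DECLARE vTemp LONG VARCHAR;
    SELECT * INTO vTemp FROM Blah67 FOR JSON RAW;
    RETURN CAST(CAST(vTemp AS BINARY) AS LONG NVARCHAR);
END;
```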

VolkerBarth
Contributor

That's a nice workaround, since it also seems to work with Unicode characters that are not valid in the CHAR charset.

You might even reduce one conversion:

begin
   declare vTemp long binary;
   select * into vTemp from Blah67 for json raw;
   select to_nchar(vTemp);
end;