Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.
Error message DATA_OFFSET_LENGTH_TOO_LARGE with GUI_UPLOAD

Former Member

Hello experts,

I get a runtime error with the error message DATA_OFFSET_LENGTH_TOO_LARGE when using GUI_UPLOAD to upload a file.

I use a .txt file with '#' as the separator.

It seems to me that R/3 cannot detect the line end.

Is there a special line-end separator?

Or does anybody have a different solution?

Kind regards

Udo
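For reference, a minimal sketch of the upload being described, assuming a hypothetical file path and target tables (none of these names are from the thread). One common pattern that sidesteps the line-end question entirely is to upload the file as plain text lines and split each line at '#' afterwards:

```abap
* Sketch only: file path and variable names are assumptions.
DATA: lt_raw  TYPE TABLE OF string,   " raw lines from the file
      lt_cols TYPE TABLE OF string,   " columns of one line
      lv_line TYPE string.

* Upload the file as plain ASCII text, one line per table row.
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename = 'C:\temp\data.txt'
    filetype = 'ASC'
  TABLES
    data_tab = lt_raw
  EXCEPTIONS
    OTHERS   = 1.

IF sy-subrc = 0.
  LOOP AT lt_raw INTO lv_line.
    " Split the '#'-separated line into its columns.
    SPLIT lv_line AT '#' INTO TABLE lt_cols.
    " ...process lt_cols here...
  ENDLOOP.
ENDIF.
```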

8 REPLIES

Former Member

Hi

Why are you using a line end? Is there any special purpose?

In general, each line of the file is added as a new line to the internal table, so the line end is determined automatically by the system.

Try using the CONDENSE statement before appending the data to the internal table.

That should solve the problem.


Hello,

In my opinion I can't use CONDENSE, because I first have to upload the data from the file (with the function GUI_UPLOAD).

The error occurs within this GUI_UPLOAD call.

In the dump there is a hint that the line buffer is too large. That's why I asked about a special separator at the line end.

Kind regards

Udo

Former Member

Hi,

I believe the character '#' is the column separator and '##' is the line separator.

Regards,

Ramkumar.K


Hello,

I still get the same error, even when I try to add '##' at the line end.


Hi Dude,

Try this: include the following parameter under the EXPORTING section of your GUI_UPLOAD function module call:

HAS_FIELD_SEPARATOR = '#'.

Reward if useful.

Regards,

Lakshmanan
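A sketch of this suggestion in context (the file path is an assumption, not from the thread). Note that in the standard GUI_UPLOAD signature, HAS_FIELD_SEPARATOR is documented as a flag (usually 'X', marking the file as tab-separated) rather than as the separator character itself, so passing '#' may not set '#' as the separator:

```abap
* Sketch only: path and table name are assumptions.
DATA lt_data TYPE TABLE OF string.

CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename            = 'C:\temp\data.txt'
    filetype            = 'ASC'
    has_field_separator = '#'   " flag as suggested above; 'X' is the usual value
  TABLES
    data_tab            = lt_data
  EXCEPTIONS
    OTHERS              = 1.
```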

Former Member

Problem disappeared

Former Member

Hi

Instead of that, we can go for

HAS_FIELD_SEPARATOR = '#'.

Please reward points if useful.

Regards

Hemasekhara reddy S


The problem disappeared after I created a new file and copied the content into it.

Thanks for your support.