Additional Blogs by SAP

OK, I'll be honest here - I'll get into more detail concerning AJAX than in my Developing composite applications with PHP - Upgrading to Web 2.0, for the example I presented in Developing composite applications with PHP - consuming ESA Services of this series. But AJAX can get much more tricky and complex than what I'll talk about here, so don't worry - it won't get too complicated today :smile:

As promised, I'll first show a better way to store things like the client-side id I used than relying on the help of the server to remind us of it. Using this, there is an easy way to handle outdated requests. Afterwards I take the approach of doing the main work on the server side, which requires close collaboration between the server-side PHP script and the client-side JavaScript. Last, I'll spend a few thoughts on RIA versus old-style web.

JavaScript function closures

function receiveTableWrapper(win) {
    var f = function(originalRequest) {
        alert('Received result for tab ' + win);
    };
    return f;
}

var myAjax = new Ajax.Request(url, {
    method: 'get',
    onComplete: receiveTableWrapper(win)
});



The call to receiveTableWrapper is executed immediately, resulting in a new function instance (stored in f) which is not yet executed, but which inherits the context from the receiveTableWrapper execution instance - preventing its inner variable win from being freed until the reference to the newly created function instance is destroyed. But this instance is returned to the outside (which makes this a so-called 'function closure') and given as the value for the onComplete parameter. Now when the Ajax request returns, the function instance is executed while still 'magically' having access to its own instance of win. What a nice way of coding!

The url could be stored in the closure equally well, but I left it in so I could just reuse the sources from the last blog.

Invalidate outdated requests

There is another thing which should be noted when dealing with asynchronous requests. Since an XHR call is a fire-and-forget operation, we can neither stop responses from coming back in an arbitrary order or with an arbitrary delay, nor can we be sure they come back at all. What happens if the user executes a search for some term and then, before it returns, changes his mind and starts another search? How should we know that the returned result list is actually the most recent one? This is especially true for the loading of the basic data, which might take quite some time to finish; the user may refine (and resubmit) his query in the meantime.

Given the information above, this is luckily rather easy. We can't prevent the first (now outdated) request from

returning, but we can simply discard it. For this I added a global, unique token. This token is changed whenever

something happens which outdates running calls - in our example, whenever a new search is initiated. A copy of this

token is kept with every call (inside the function closure). When the call returns, the locally stored token is compared to the global one. If they don't match - i.e. the global token has been changed - the call is outdated and the function just returns, discarding the result and doing nothing.

Since this token is not cryptographically important and must only be unique I simply used a counter which is

incremented every time the user starts a new search request.
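The counter-token pattern described above can be sketched like this (the function and variable names are my own, not taken from the blog's sources; the request transport is passed in as a callback for clarity):

```javascript
// A global counter is bumped whenever a new search starts; every in-flight
// request keeps a copy of the counter inside its closure.
let currentToken = 0;
const results = [];

function startSearch(term, sendRequest) {
    currentToken++;                 // invalidates everything still in flight
    const myToken = currentToken;   // captured by the closure below

    sendRequest(term, function (responseText) {
        if (myToken !== currentToken) {
            return;                 // outdated call: discard silently
        }
        results.push(responseText); // only the latest search gets rendered
    });
}
```

In a real page, sendRequest would wrap an Ajax.Request; here it is a parameter so the discard logic is visible on its own.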

Live Table, server-side data storage

The list of returned customers was kept in an array local to the client, as I wrote in the last blog. For the tables in the CFS popup I used the other approach: the data is cached on the web server in a PHP session, not on the client in JavaScript, and sorting and pagination are also done on the server. There is not much logic on the client beyond the AJAX calls fired off when changing the display order or page; I created the actual HTML in PHP. So this time neither XML nor JSON is transferred, but HTML directly, which is then displayed in the corresponding div element of the tab.

The table data is stored on the web server in the $_SESSION variable, thus a session needs to be created. The session id is stored in a cookie, which is the PHP default. Additionally, each table needs its own id (separate from the session id, as one session holds multiple tables), which must be unique at least inside the same session. While a simple counter would perhaps suffice, I followed my habit of creating secure applications and used a token generator whose output can't easily be guessed:

$token = md5(uniqid(rand(), true));

Additional metadata is stored hardcoded inside PHP which defines the column types for each service. This is used

primarily for sorting (whether a column is sortable and what format the data is in), but also for the column header

names or special formatting (e.g. right-aligned numbers).
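The column-metadata idea can be sketched as follows; the blog keeps this metadata in PHP on the server, so the JavaScript below is only an illustration of the structure, and all names in it are assumptions:

```javascript
// Each column declares its type and sortability; the comparator for a sort
// request is then derived from the metadata rather than hardcoded per table.
const columnMeta = {
    CustomerName: { type: 'string', sortable: true },
    OrderValue:   { type: 'number', sortable: true, align: 'right' }
};

function sortRows(rows, column, descending) {
    const meta = columnMeta[column];
    if (!meta || !meta.sortable) return rows.slice(); // unsortable: unchanged copy
    const cmp = meta.type === 'number'
        ? (a, b) => a[column] - b[column]
        : (a, b) => String(a[column]).localeCompare(String(b[column]));
    const sorted = rows.slice().sort(cmp);
    return descending ? sorted.reverse() : sorted;
}
```

The same metadata record can also drive the header labels and the right-aligned formatting mentioned above, which is the point of centralizing it.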

The TableControl.php file holds all the logic - retrieving the data from the Web service(s), caching, sorting and

displaying the HTML result. The CFSpopup.php provides the basic data and the tabbed navigation and for each tab calls

the TableControl.php with the specific parameters.

Note: the OutboundDeliverySimpleByCustomer call returns a nested array. For display it is flattened to a single 2D array, taking care not to break the associations when sorting on the flattened columns.
Advantages: the client doesn't need to hold the complete set of data, and navigating is acceptably quick (though network latencies are now present).

Disadvantages: network activity is needed for every action (i.e. no offline work), the server-side cache might time out (loss of cached data), and it will cache longer than needed (as the data is still cached when the browser is closed).

The last two disadvantages can be reduced. All information required to call the Enterprise service can be kept on the client and retransmitted, so that the cache can be silently (or with confirmation, if needed) refreshed after a session timeout. Also, when changing pages or closing the browser, an onunload event is triggered which can try to inform the web server that the particular table is not needed anymore. However, this might not always work. I did implement the re-query in this demo (and a refresh button was made possible with this, too).
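The onunload notification could look roughly like this; the endpoint and parameter names are assumptions, not taken from the blog's sources:

```javascript
// Build the URL that tells the server a cached table can be dropped.
function buildReleaseUrl(tableId) {
    return '/TableControl.php?action=release&table=' + encodeURIComponent(tableId);
}

function releaseTable(tableId) {
    // Fire-and-forget via an Image request: no response handling is needed.
    // Delivery is not guaranteed, which is exactly why the server-side
    // session timeout must remain as the backstop.
    new Image().src = buildReleaseUrl(tableId);
}

// In the page one would hook it up like this:
//   window.onunload = function () { releaseTable(currentTableId); };
```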

Performance considerations

Concurrent AJAX calls can be thought of as separate threads, being fired off and asynchronously returning their results. JavaScript is not really preemptively threaded - so be careful how you implement a 'waiting loop' - but the considerations are the same.

The AJAX calls are subject to the browser's connection pool: there is a fixed number of simultaneous connections, and each call has to share them both with the other AJAX calls and with other items being loaded, like images. Vice versa, sending off a bunch of calls might delay the loading of an image, thus images - especially the waiting image - should be loaded before the calls are made. This limit is set to a rather low value, as described in the HTTP spec: "A single-user client SHOULD NOT maintain more than 2 connections with any server or proxy." While it might seem outdated in the times of Comet, it is still very valid today: both Firefox and Internet Explorer adhere to this limit and have an additional limit on the number of total connections to all servers (usually 10). This is a client-side setting which can be changed (try about:config in Firefox or read here), but only by the user, not by our application.

One obvious speedup would be to group several requests together in a single call. Especially the 'get basic data' calls for each customer can be grouped together. If all were grouped into a single call, there would be little difference to having no loop in JavaScript at all and just making the calls in PHP before returning anything. However, then we're back to a longer waiting time until the user sees the first results. Trading off 'time to first result' against 'total time for all results', one should experiment with a good number of requests grouped together. This might even be done dynamically depending on network speed/latency.
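A minimal sketch of such grouping, with the endpoint name and group size as assumptions:

```javascript
// Split a list of ids into groups of a tunable size.
function chunk(items, size) {
    const groups = [];
    for (let i = 0; i < items.length; i += size) {
        groups.push(items.slice(i, i + size));
    }
    return groups;
}

function fetchBasicData(customerIds, groupSize, sendRequest) {
    // One request per group instead of one per customer; each reply can be
    // rendered as it arrives, trading 'time to first result' against
    // 'total time for all results' via groupSize.
    chunk(customerIds, groupSize).forEach(function (group) {
        sendRequest('/basicData.php?ids=' + group.join(','));
    });
}
```

With groupSize 1 this degenerates to the original one-call-per-customer loop; with groupSize equal to the list length it becomes the single big call discussed above.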

Naturally coming to mind would be some kind of streaming: sending off all requests at once, getting the results back one after another in the same HTTP request, and displaying each one as soon as it is completely received. I've not done anything like this yet, so you're on your own here.

The other option, and this one is actually widely used, is to circumvent the restriction: the low number of concurrent connections is bound to the host part of the URL, so a server cluster using different host names can get two connections to each of the hosts. For smaller installations, just letting the different host names (FQDNs, to be precise) point to the same server works equally well. Keep in mind, though, that each additional connection is not only additional overhead but is also subject to the 'maximum of concurrent connections in total' as well.

There are other factors to be aware of, like the way garbage collection works in IE6 (see also here). If speed really matters, a much deeper dive must be taken, including profiling.

Graceful Degrading

Graceful degrading was a term coined when CSS was new and not every browser would support it. The basic idea was that without CSS support the whole nice design might be gone, but all the information and functionality still remains. An extreme test for this would be the use of a text-mode browser like Lynx. Now it is 2007 and Web 2.0 is everywhere, relying heavily on JavaScript. Is it possible to build graceful degrading for users who have JavaScript disabled? Should we care at all? Peter Michaux states that you shouldn't, and that you should rather pick a target audience and develop only for them (or, if really necessary, develop the same application twice).

I'm not sure how much is possible without JavaScript - all client-side interaction, AJAX loading, drag & drop, etc. are gone, and other things - like our sorted tables - would need to be developed twice to also work without JavaScript. But we can at least try to keep as much functionality as possible. In the end it's probably a choice of not supporting non-JavaScript browsers for rich applications, but providing as much degrading as possible for 'just' enhanced pages.
Does it happen at all? Are there still browsers without scripting capabilities? There might very well be some, but more importantly, every browser offers the option to actively disable it. And just about ALL security incident reports state exactly this as one possibility - or the only one - to secure yourself from the newest threat.

Components which are easily degradable include tables and tabs. In the CFS popup I use several tabs, of which only one is shown. Without JavaScript it's not possible to change between the tabs - and since they default to being hidden, they also can't be seen all at once. How could we do better?

One option would be to display them as visible and hide them via JavaScript. Unfortunately this would result in a visible flickering (especially when onLoad is used instead of the onDomReady approximations provided by some frameworks). A year ago, Ara Pehlivanian stated that graceful degrading is a myth and can't work. But now there are new ideas: here and here.

The best method (distilled from the above-mentioned pages and the many comments therein) seems to be to have separate stylesheet definitions. One could change the body class very early (e.g. directly after the body tag with inline JavaScript) and set the display: none only in the JavaScript version, or have separate stylesheet files altogether and use a CSS selection as described back in 2005 here.
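The early class-switch variant can be sketched like this; the class name and the CSS rules are assumptions for illustration:

```javascript
// The stylesheet hides inactive tabs only when body carries the 'js' class,
// so browsers without JavaScript show all tabs stacked but fully readable.
//
// CSS (assumed):
//   body.js div.tab        { display: none; }
//   body.js div.tab.active { display: block; }
//
// HTML: put an inline script directly after the <body> tag so no flicker
// occurs:  <script>document.body.className += ' js';</script>
function enableJsStyles(body) {
    body.className = (body.className + ' js').trim();
    return body.className;
}
```

Because the class is added before the tab markup is rendered, the display: none rules apply from the start and the flickering described above never appears.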

To be continued...

There is one more blog to come, but I'll leave what I'll do there as a surprise. I just promise you it will be a hyped thing and include more JavaScript 🙂

Also the files I made are finally approved and will be available here shortly. I'm sorry this took so long. The blog was not meant to be just theoretical but you'll get my full sources in a ready-to-use project (that is, if you have an SAP backend at hand) so play around with it.

Former Member
Hi Frederic:

Sorry -:) I'm not into AJAX...Still, I like and read your AJAX blog series -:) They're good.


Thanks 🙂

I got very little feedback, so I wondered whether I should continue at all. But perhaps it was just the missing files.

Don't you want to use AJAX in your applications or are you just not interested in all these little details I mentioned here? There is a wealth of libraries coming, and using AJAX components is getting easier day by day.

Active Contributor
Hi Frederic,

Your blogs are well written and very informative. Do continue.

Active Contributor
This is good stuff. Very technical but that's ok. Keep them coming.
Former Member
Actually I'm very interested in your blog series, because I love to improve my knowledge... But I'm not really interested in working with AJAX... Maybe if you keep your blogs coming I will change my mind -;)