Just thought that the techniques shown below might be of interest to some folk somewhere. It is a copy of a posting in a dBase newsgroup.
Subject: Horses for Courses - may be of interest !!

Good Afternoon Folks
 I am very, very reluctant to make this post - when you get to the end
you may understand why !!

 I seem to recall a recent thread about "performance" (or maybe it was
"response times") that seemed to me to have degenerated into what a
'Brit' would call a "slanging match". Please be aware that if this leads
to a similar thread, I will not be joining in.

 Those who know me will be aware that I am a great believer in trying
to select "horses for courses". That is how this started. The results
seemed worth the effort of telling you about them !!

 If it is of no interest, forgive me for wasting your time.

 Several years ago, (before dB2K) I wrote several serious data
processing applications that used Web Browsers as the user interface.

 Let us call this STEP ONE (and use it as the benchmark for the rest of
the steps).

 The technology used was:-

  Input to Browser ->
  http ->
  IIS (asp file) ->
  dBase 5.7 exe file ->
  back to the waiting ASP file ->
  Redirect to the file just written by dBase ->
  Web browser
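
 The relay above can be sketched roughly as follows - in Python rather
than the original classic ASP, with invented file names, and a plain
function standing in for the dBase 5.7 .EXE:

```python
import os
import time

def run_dbase_job(request_file, output_file):
    # Stand-in for the dBase 5.7 .EXE: read the request, write an HTML
    # page. (The real exe queried the dBase tables at this point.)
    with open(request_file) as f:
        key = f.read().strip()
    with open(output_file, "w") as f:
        f.write("<html><body>Record for key %s</body></html>" % key)

def asp_style_relay(key, workdir):
    """Mimic the ASP page: hand the request to the exe, wait for its
    output file to appear, then 'redirect' the browser to that file."""
    request_file = os.path.join(workdir, "request.txt")
    output_file = os.path.join(workdir, "response.html")
    with open(request_file, "w") as f:
        f.write(key)
    run_dbase_job(request_file, output_file)  # the real ASP shelled out here
    for _ in range(100):                      # wait for the exe to finish
        if os.path.exists(output_file):
            return output_file                # Response.Redirect equivalent
        time.sleep(0.01)
    raise TimeoutError("dBase job never produced its output file")
```

 The extra hops (write a file, run an exe, wait, redirect) are exactly
where the response time goes, which is why local .EXE files stay
"sub-second" and this does not.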

 The result (obviously) did not give the "sub-second" responses of local
dBase (or Delphi) .EXE files - but the users loved it !!

 STEP TWO

 This used the new dB2K technology (no ASP !!)

 The result (on the same server, using two side by side clients
simultaneously) was a reduction of response times to between 80% and 90%
of those in step one.

 Would the improvement justify rewriting several MBytes of code?
It didn't seem like it to me.

 STEP THREE

 The Technology was:-

  Input to Browser ->
  http ->
  IIS (asp file) using MSXMLdom to access XML files created from the
dBase tables ->
  Direct response to the browser from the ASP file ->
  Web browser

 The result (on the same server, using two side by side clients
simultaneously) was that response times tended towards infinity -
completely and totally unacceptable.

 Deep despair - about to throw in the towel.
 I had done so much clever programming - created indexes, fast binary
chopping to get the key for the node that I needed - most disappointing.
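
 For what it's worth, the "clever code" amounted to roughly this
pattern - sketched here in Python with made-up records, since the
original was ASP working against MSXMLdom nodes:

```python
import bisect

# Hypothetical records standing in for nodes in the XML file; the real
# index was built over MSXMLdom nodes, not Python dicts.
RECORDS = [
    {"key": "C003", "name": "Gamma Co"},
    {"key": "C001", "name": "Alpha Ltd"},
    {"key": "C002", "name": "Beta plc"},
]

def build_index(records):
    """Sorted (key, position) index, built once up front."""
    pairs = sorted((rec["key"], pos) for pos, rec in enumerate(records))
    return [k for k, _ in pairs], [p for _, p in pairs]

def chop_lookup(records, keys, positions, wanted):
    """Binary chop on the sorted keys, then jump straight to the node."""
    i = bisect.bisect_left(keys, wanted)
    if i < len(keys) and keys[i] == wanted:
        return records[positions[i]]
    return None
```

 The chop itself is fast; the cost, it turned out, was in the per-node
access into the DOM, not in the search.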

 STEP FOUR

 Decided just out of idle curiosity to abandon my "clever code" and do
the job by "serial reads" of the XML data.
 Scared me witless - the average size of the XML files is about 1.5
MBytes !!

 Surprise, surprise: the result was a reduction of response times to
between 10% and 25% of those in step one.

 Quite unbelievable - MSXMLdom is not fast at finding by key, but when
reading the whole file it flies !!
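
 The "serial read" idea is just this - one straight pass over the whole
document, shown here with Python's ElementTree standing in for MSXMLdom
(element and attribute names are invented for the sketch):

```python
import xml.etree.ElementTree as ET

# A tiny stand-in for one of the ~1.5 MByte XML files exported from the
# dBase tables.
SAMPLE = """<customers>
  <customer key="C001"><name>Alpha Ltd</name></customer>
  <customer key="C002"><name>Beta plc</name></customer>
  <customer key="C003"><name>Gamma Co</name></customer>
</customers>"""

def serial_lookup(xml_text, wanted_key):
    """No index, no binary chop - just visit every node in document
    order and keep the one whose key matches."""
    root = ET.fromstring(xml_text)
    for customer in root.iter("customer"):
        if customer.get("key") == wanted_key:
            return customer.findtext("name")
    return None
```

 Dumb as it looks, the parser streams through the file far faster than
repeated keyed dips into the DOM.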

 CONCLUSION

 This technology is worth pursuing - clients are "pressing money into my
hands" to migrate their big systems !!

 HOWEVER

 I am resisting the temptation to do this until I get MDAC upgraded to
the latest version, so that I can access dBase tables directly from the
ASP. That will be STEP FIVE - it could be better still !!

 BUT

 When clients say "is XML proprietary ?"

  and then "are dBase tables proprietary ?"

  what the devil do I say ???

I hope that it has interested you. Is dBase the best way to proceed for
good, fast Web based systems? (I have no doubt that it is for "local"
processing.)

Anyone wanting to know more, please ask - if I can tell you, I will.

Best Regards
Pete (Northolt UK)