HTML5 Data Bindings Extended Repeater Support Product Page


Performance Problems / load time with extended repeater

07 Oct 2014 19:12:04 Steve Skinner posted:
I have done some testing and I'm not too pleased with the results.

Here's a page that loads 25 records per page the original way (before the extended repeater was available), where the number of records per page is defined in the data source. Perfectly acceptable load time. Very fast.

Now, with the extended repeater, you're not supposed to set the records per page in the data source but in the extended repeater itself. Here's a page set up that way - same recordset, same paging amount as the link above, except the number of records is defined in the extended repeater (not the data source):

You'll notice that it takes 10 or more seconds to load, because it appears to be loading all records in the table first. What's even worse is that I had to restrict this second page to only 1,000 records in the extended repeater because it takes so long to retrieve records from the database. At 10-15 seconds to load 1,000 records, loading over 64,000 records would be out of the question.

There must be something wrong here, but I have no idea what. Everything seems to be working fine and I've set up everything based on your tutorials and instructions.


08 Oct 2014 09:54:08 Teodor Kuduschiev replied:
Hi Steve,
The data source itself actually loads really fast; the issue is in rendering the data on the page. We are working to find what causes the slow rendering.
13 Oct 2014 09:17:39 Teodor Kuduschiev replied:
Hi Steve,
Please check your email - I've sent you an updated data bindings JS file. Please put it in the scriptlibrary folder and this will improve the loading performance.
17 Oct 2014 22:27:54 Steve Skinner replied:
When working with a max of 1000 records, it's a big improvement. So that I could easily track which dmxDataBindings.js file is being used, I renamed the new one you emailed me to dmxDataBindings-rev1.js.

Compare the original (dmxDataBindings.js):

to the new version that uses the revised dmxDataBindings-rev1.js:

Perfectly zippy performance. However, most of the customers I work with have far more records than that.


Here's the newest version of this test...

New test 1: records per page defined in data source
The data source pulls all records from the table (over 65000), and is using the new dmxDataBindings.js file you gave me, although the paging is set the original way - using the data source, NOT the extended repeater. This all works fine (as expected - since only the extended repeater has serious performance issues). I just don't get to use any of the features with the extended repeater extension.

New test 2: records per page defined in extended repeater
No URL given
I also created a version of the same file using the extended repeater to define paging, leaving the max/page results in the data source empty. It doesn't work at all: no records are loaded, and it chokes the website (or server - I don't know which) for at least a full minute, if not more. For this second test, I can give you a copy of the code for the page, but I can't let you run it, since it takes such a toll on the database and slows down my customer's ecommerce site.

So *&^%$# frustrating...
20 Oct 2014 08:37:15 Teodor Kuduschiev replied:
Hi Steve,
The client-side paging works well as long as the results number up to about 1,000. The results returned this way get loaded into memory - that's why your browser hangs when you try to load 65k records at once!
If you have such a large data source, and you really need to load all of it every time, just use HTML5 Data Bindings without the extended repeater.
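A minimal sketch of the difference (TypeScript; the function and table names here are illustrative only, not the actual dmxDataBindings API): with client-side paging the whole recordset must cross the wire and sit in memory before page 1 can render, so cost grows with the table size, not the page size.

```typescript
type Row = { id: number };

// Client-side paging: the server returns ALL rows; the browser
// slices out the visible page. Memory and render cost scale with
// the full table (65k rows), not the 25 rows actually shown.
function clientSidePage(allRows: Row[], page: number, perPage: number): Row[] {
  return allRows.slice(page * perPage, (page + 1) * perPage);
}

// Server-side paging: only one page ever crosses the wire,
// e.g. via SQL LIMIT/OFFSET built into the data source query.
function serverSideQuery(page: number, perPage: number): string {
  return `SELECT * FROM products LIMIT ${perPage} OFFSET ${page * perPage}`;
}

const all: Row[] = Array.from({ length: 65000 }, (_, i) => ({ id: i }));
console.log(clientSidePage(all, 0, 25).length); // 25 rows shown...
console.log(all.length);                        // ...but 65000 held in memory
console.log(serverSideQuery(0, 25));            // one small query instead
```

This is why setting the page size in the data source stays fast on large tables: the database does the limiting, and the browser never sees more than one page.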
20 Oct 2014 15:23:57 Steve Skinner replied:
No problem. The regular HTML5 DB handles any volume of records quite well.

Most of my customers are either ecommerce or have some sort of content management backend for managing articles and records. Some of my older clients (10+ years) have huge databases like the one I've described in this thread. Most have smaller databases than that, but even in those cases 5,000 to 15,000 records is still quite normal.

You might consider putting some sort of notice or disclaimer on the extended repeater extension to let people know that it can't handle recordsets larger than 1,000 to 2,000 records. That would prevent others from wasting as much time as I just did trying to use it on sites with significantly more records than that, and bouncing back and forth with you on troubleshooting.

One way or the other, this information will get in front of potential customers of this extension, but it would look good for you to be up front about its limitations.
