JPAContainer 2.x - performance problems
I am using JPAContainer 2.1.0 and EclipseLink to access my MySQL database. I have one table with about 500k rows of data. When I load the data into a Vaadin Table, the first page (about 20 rows) comes up in a reasonable (albeit noticeable) amount of time. However, any attempt to scroll the table can cause the page to go blank and hang there for as long as 2-3 minutes. If I add a filter that reduces the data set down to 500 rows, it can still take a couple of minutes to scroll to the 200th row of the (reduced) data set.
Are there known performance issues with the JPAContainer? Are there things I can do to improve the performance? I saw some old threads that discussed container-related performance issues, but I don't see anything recent, although I may have missed other discussions. I really like what I can do with Vaadin, but this particular table is causing me serious grief. Any suggestions for speeding up the scrolling through and/or filtering of large datasets would be much appreciated.
I have heard of JPAContainer being used with similar row counts with rather good performance, so I guess this should be solvable. The first thing to ensure is that you are using a caching entity provider. A caching provider keeps already-fetched entities in memory instead of hitting the database on every access. I don't know your environment or the best method for setting it up there, but e.g. the JPAContainerFactory.make() methods should create one for you automatically.
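For reference, a rough sketch of both routes with the JPAContainer 2.x API. The `Person` entity and the persistence unit name "myPersistenceUnit" are made up; substitute your own, and note the cache size tuning is just an illustration:

```java
import javax.persistence.EntityManager;
import javax.persistence.Persistence;

import com.vaadin.addon.jpacontainer.JPAContainer;
import com.vaadin.addon.jpacontainer.JPAContainerFactory;
import com.vaadin.addon.jpacontainer.provider.CachingMutableLocalEntityProvider;

public class ContainerSetup {

    // Route 1: the factory should wire up a caching entity provider by default.
    public static JPAContainer<Person> viaFactory() {
        return JPAContainerFactory.make(Person.class, "myPersistenceUnit");
    }

    // Route 2: build the caching provider explicitly so you can tune the cache.
    public static JPAContainer<Person> viaExplicitProvider() {
        EntityManager em = Persistence
                .createEntityManagerFactory("myPersistenceUnit")
                .createEntityManager();
        CachingMutableLocalEntityProvider<Person> provider =
                new CachingMutableLocalEntityProvider<Person>(Person.class, em);
        provider.setEntityCacheMaxSize(1000); // how many entities to keep cached

        JPAContainer<Person> container = new JPAContainer<Person>(Person.class);
        container.setEntityProvider(provider);
        return container;
    }
}
```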
If you already have a caching provider in use, the next best thing would be to enable query logging and start debugging the queries JPA issues. Often adding an index to some column will fix slow queries.
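With EclipseLink, the usual place to turn on SQL logging is persistence.xml. A sketch, assuming a unit named "myPersistenceUnit" (adjust to yours):

```xml
<persistence-unit name="myPersistenceUnit">
  <properties>
    <!-- Log the generated SQL so slow queries become visible -->
    <property name="eclipselink.logging.level.sql" value="FINE"/>
    <!-- Include bound parameter values in the logged SQL -->
    <property name="eclipselink.logging.parameters" value="true"/>
  </properties>
</persistence-unit>
```

Once you can see the SQL, a missing index usually shows up as the same slow SELECT filtering or ordering on one column; in MySQL something like `CREATE INDEX idx_mycol ON mytable (mycol);` is the typical fix.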
BTW, with that many rows (as you get close to 1M) the lazy-loading Table component will start to have some headaches on the client side. If I remember right, IE will die first, but all browsers have their limits: browsers have some maximum for their DOM element counts. At some point you might need to go for old-school "paging" or always limit the number of displayed rows to something reasonable. It isn't really usable for end users to scroll through hundreds of thousands of rows anyway.
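A sketch of both workarounds with the standard Table/Container APIs. The "id" property and the threshold in the filter are hypothetical; use whatever bounds make sense for your data:

```java
import com.vaadin.addon.jpacontainer.JPAContainer;
import com.vaadin.data.util.filter.Compare;
import com.vaadin.ui.Table;

public class TableLimits {

    public static Table limitedTable(JPAContainer<Person> people) {
        Table table = new Table("People", people);
        // Keep the rendered viewport small; rows outside it load lazily.
        table.setPageLength(20);
        // Or cap the data set itself with a server-side filter, so the
        // container never ranges over the full 500k rows.
        people.addContainerFilter(new Compare.Greater("id", 499000));
        return table;
    }
}
```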
I am not currently using a caching entity provider, or at least I don't believe I am. I will verify that; it may be a large part of the problem.
As for the scrolling, my users don't want to scroll through 500k records either :) My original plan was to load the table and then let the users apply some filters to cut that number down to something manageable (a few hundred to perhaps a few thousand entries, max). But at the moment, even filtering down to a few hundred rows doesn't allow for useful scrolling. I will look at the entity provider and see if that helps, and I'll post a note here after I try it. Thanks very much!
I was DEFINITELY not using a caching entity provider. I just created one, and I think the performance is now going to be acceptable to everyone. I can scroll through the full 500k records, not just the filtered list. Much better - thanks very much!