Large dataset API design

~What is done cannot be taken back~
Nobody can completely break backward compatibility, but for new APIs and their evolution:

  1. It's a Good Thing to use Iterator/Iterable instead of Collection for large data sets (a lazy-iteration sketch follows this list).

For example: Container's method "Collection<?> getItemIds()" is evil, but we can live with the form "Iterator<?> getItemIds()".

  2. Two APIs, one for read/view and one for write/edit, is also a Good Thing (see the sketch after the wrapper example below).
    For example, Java's InputStream/OutputStream API is MUCH better than Borland Delphi's all-in-one TStream API.
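To illustrate the first point, here is a minimal sketch of an iterator-based getItemIds() backend that walks a large data set in chunks instead of materializing one huge Collection. The LazyItemIds class and its fetchIdsAfter() helper are hypothetical, not part of any Vaadin API:

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;

/** Hypothetical container backend that pages item IDs lazily from storage. */
public class LazyItemIds implements Iterable<Object> {

    private static final int PAGE_SIZE = 100;

    @Override
    public Iterator<Object> iterator() {
        return new Iterator<Object>() {
            private List<Object> page = new ArrayList<Object>();
            private int positionInPage = 0;
            private Object lastId = null;
            private boolean exhausted = false;

            @Override
            public boolean hasNext() {
                if (positionInPage < page.size()) {
                    return true;
                }
                if (exhausted) {
                    return false;
                }
                // Load the next chunk only when the caller actually needs it.
                page = fetchIdsAfter(lastId, PAGE_SIZE);
                positionInPage = 0;
                exhausted = page.size() < PAGE_SIZE;
                return !page.isEmpty();
            }

            @Override
            public Object next() {
                if (!hasNext()) {
                    throw new NoSuchElementException();
                }
                lastId = page.get(positionInPage++);
                return lastId;
            }

            @Override
            public void remove() {
                throw new UnsupportedOperationException();
            }
        };
    }

    /** Stub for a backend query, e.g. "SELECT id ... WHERE id > ? LIMIT ?". */
    private List<Object> fetchIdsAfter(Object lastId, int limit) {
        return new ArrayList<Object>(); // a real implementation would hit the database
    }
}

The caller just does "for (Object id : lazyItemIds)"; memory use stays bounded by PAGE_SIZE no matter how many rows the backend holds.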

Look at custom containers in add-ons. Many methods there look like this:

public Object addItem() throws UnsupportedOperationException {
    throw new UnsupportedOperationException("May not add items to a wrapper");
}



Apache Click (Vaadin's stateless, lightweight brother) has a good API example:


DataProvider


public interface DataProvider<T> extends Serializable {
    /**
     * Return the iterable collection of data items supplied by the data provider.
     *
     * @return the iterable collection of data items supplied by the data provider.
     */
    public Iterable<T> getData();
}
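A component bound to a DataProvider can stay lazy, because getData() returns an Iterable and nothing forces an eager copy. A minimal sketch of an implementation; Customer and CustomerService are assumed application classes, shown only for illustration:

/** Hypothetical domain class, present only so the sketch is self-contained. */
class Customer { }

/** Hypothetical service; a real one could return a lazily populated Iterable. */
interface CustomerService {
    Iterable<Customer> findAll();
}

/** Sketch: defers all loading until the component actually iterates the data. */
public class CustomerDataProvider implements DataProvider<Customer> {

    private final CustomerService service;

    public CustomerDataProvider(CustomerService service) {
        this.service = service;
    }

    @Override
    public Iterable<Customer> getData() {
        // No eager copy into a Collection; the service decides how to stream rows.
        return service.findAll();
    }
}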


PagingDataProvider


public interface PagingDataProvider<T> extends DataProvider<T> {
    /**
     * Return the total number of results represented by this DataProvider.
     *
     * @return the total number of results represented by this DataProvider
     */
    public int size();
}
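The paging variant maps naturally onto a limit/offset query. Another hedged sketch; PagedCustomerService with countAll() and findRange() is an assumed helper, not Click or Vaadin API:

import java.util.List;

/** Hypothetical paging queries; a real service would translate them to LIMIT/OFFSET SQL. */
interface PagedCustomerService {
    int countAll();
    List<Customer> findRange(int offset, int limit);
}

/** Sketch: the component asks size() once, then pulls only the rows of the visible page. */
public class PagedCustomerDataProvider implements PagingDataProvider<Customer> {

    private final PagedCustomerService service;
    private final int pageSize;
    private int firstRow; // offset of the page the component is currently showing

    public PagedCustomerDataProvider(PagedCustomerService service, int pageSize) {
        this.service = service;
        this.pageSize = pageSize;
    }

    public void setFirstRow(int firstRow) {
        this.firstRow = firstRow;
    }

    @Override
    public Iterable<Customer> getData() {
        // Only the visible page is loaded, e.g. "SELECT ... LIMIT :pageSize OFFSET :firstRow".
        return service.findRange(firstRow, pageSize);
    }

    @Override
    public int size() {
        // Total row count, so the component can render its pager or scrollbar.
        return service.countAll();
    }
}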