I am not an encryption expert, and I tend to become cautious after hearing some hearsay or heresy. So let me dispense my heresy, and you can play the psychiatric therapist who calms my nerves.
Let’s say we have an encryption key with entropy log2(number of possible keys) = 256 bits. Let me express that as something akin to putting one hydrogen atom in a confined space in a total vacuum, where it has 2^256 possibilities for its orientation, location, and so on. I am not a particle physicist either, so I am again operating on hypothetical hearsay/heresy; don’t fault me for my imprecision in molecular physics or chemistry. As I inject more hydrogen atoms into that space, the positional/orientational entropy of each atom is reduced.
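For concreteness, the entropy figure I am using above is just log2 of the number of equally likely keys: a key drawn uniformly from 2^256 possibilities has exactly 256 bits of Shannon entropy. A trivial sketch of that arithmetic (the method name is mine, not anybody's API):

```java
import java.math.BigInteger;

public class KeyEntropy {
    // Shannon entropy (in bits) of a uniform choice among 2^keyBits keys.
    // For an exact power of two, log2(N) is simply bitLength(N) - 1.
    static int uniformKeyEntropyBits(int keyBits) {
        BigInteger possibleKeys = BigInteger.TWO.pow(keyBits);
        return possibleKeys.bitLength() - 1;
    }

    public static void main(String[] args) {
        System.out.println(uniformKeyEntropyBits(256)); // prints 256
    }
}
```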
So I am looking at information entropy with a similar eye. Given the same key, would you not say that even though a user keys in a different attempted password each time, the user has injected more material into the space without any overall increase in entropy, and has thereby significantly reduced the expected entropy of each bit?
Let’s say a user attempts to set up a password using eight 16-bit characters. That is 128 bits per try, so after 10 tries, when the user’s proposal is finally accepted by my password criteria, 1,280 bits of password material have been transmitted. What about each request’s repeated HTTPS and HTML form overhead (the submit button’s name and value, etc.) and the returned "unacceptable password format" error message? Let’s say a total of 4 KB of information was transmitted. And because the hacker has also used the application, sniffs all these repeated bytes, and has encountered the same error message, JSON terminators, and button names/values, that reduces the expected entropy further: those bytes are like unchanging microscopic walls constraining the movement, position, and orientation of each hydrogen atom.
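Just to lay out the byte-counting in that scenario explicitly (this only reproduces my arithmetic above; the 4 KB figure and the split between password material and overhead are my own assumptions):

```java
public class EntropyBudget {
    static final int BITS_PER_CHAR = 16; // assuming 16-bit characters, as above
    static final int PASSWORD_CHARS = 8;
    static final int ATTEMPTS = 10;

    static int bitsPerAttempt() {
        return BITS_PER_CHAR * PASSWORD_CHARS; // 128 bits per try
    }

    static int totalPasswordBits() {
        return bitsPerAttempt() * ATTEMPTS; // 1,280 bits over 10 tries
    }

    // Of the assumed total transmitted, everything that is not password
    // material is repeated, sniffable protocol/form overhead.
    static int predictableOverheadBytes(int totalBytesTransmitted) {
        return totalBytesTransmitted - totalPasswordBits() / 8;
    }

    public static void main(String[] args) {
        System.out.println(bitsPerAttempt() + " bits per attempt");
        System.out.println(totalPasswordBits() + " password bits in total");
        System.out.println(predictableOverheadBytes(4 * 1024)
                + " bytes of predictable overhead out of 4 KB");
    }
}
```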
Now say I am on a mission to cloudify the engineering, research, financial, and manufacturing data analysis of a company (not that I am). There are thousands of fields that get verified, though no single user would use all of them within the same session. Now I am thinking further about the reduction in encryption entropy.
See, I now feel so threatened that I want each set of data accessed once and only once and placed in the local session, so that all field verification can be done locally, without traverse (or travesty) of the network, while the corresponding set of data sits on the server for the benefit of graph plotting and whatever other manipulation the user chooses. Do you think it is safe for me to let the user verify those fields over the network? This is not a banking operation with a paltry few pages, you know.
That is to say: whatever needs to be done on the server should stay on the server, and whatever needs to be done by the local session should stay in the local session, rather than transmitting a whole pile of data to the Window’s urihandler or parameterhandler and then having the Window transmit it back just because the application needs to find out the URI and parameters. Why should those handlers sit on the Window widget rather than on the Application servlet?
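The "keep it local" idea can be sketched in a few lines: field rules live in the local session, so every rejected candidate value stays on the user's machine and only the final, accepted value would ever cross the network. The class, field name, and format rule below are entirely invented for illustration; this is a minimal sketch, not anybody's actual API:

```java
import java.util.regex.Pattern;

// Hypothetical local field validator: checks run in the local session,
// with no network round trip per attempt.
public class LocalFieldValidator {
    // Made-up format rule for a made-up field.
    private static final Pattern PART_NUMBER = Pattern.compile("[A-Z]{2}-\\d{6}");

    static boolean isValidPartNumber(String value) {
        return value != null && PART_NUMBER.matcher(value).matches();
    }

    public static void main(String[] args) {
        // Rejected attempts never leave this machine, so a sniffer sees
        // none of the repeated error traffic described above.
        System.out.println(isValidPartNumber("AB-123456")); // true
        System.out.println(isValidPartNumber("ab-123"));    // false
    }
}
```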
Perhaps you will say: who wants to hack into a stupid manufacturing concern’s engineering data? Why would anybody want to do that?
What about the case of an insider telling a collaborator:
… between 10:00 and 10:05 am I will be accessing such pages, with multiple erroneous field entries. The tacit message is: for the rest of the session, the collaborator can sniff my transmissions and work out the critical parameter set that defines the secret of our new product, or judge for themselves whether to buy or dump the company’s stock …
all done within the plausible deniability of the insider. Is this a situation that needs to be mitigated, or am I being paranoid?
I am trying to start a business, so maybe I should spend more time (and money) on security issues. But in the meantime, I am doing my best, at the lowest cost to myself, to weed out possibilities. It is not that I know so much, but that I know so little about security, that I will take efforts to reduce, if not eliminate, every possibility. Part of that effort is to reduce the transmission of information and to reduce the reduction of encryption entropy, and I am evaluating whether Vaadin is suitable for such a goal, and how costly the mitigation would be if there are corners to be repaired. Perhaps I should just stick with vanilla GWT, where I know and understand the data transfer. But Vaadin is so seductive because of all the widgets it has.
If you feel I am paranoid, you will have to work out the entropy numbers to show me that I am, rather than saying, "Well, the banks have no worries, why should you?"