It may simply be done in C and other languages for completeness: with a half-open range, subSequence can return an empty CharSequence (start = end). If subSequence instead included the index end, then one way to get an empty result would be start = end + 1.
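As a quick illustration of the half-open case, start = end really does yield an empty sequence in Java (plain standard library, nothing assumed):

```java
public class EmptyRangeDemo {
    public static void main(String[] args) {
        CharSequence s = "hello";

        // Half-open range [start, end): start == end yields an empty sequence.
        CharSequence empty = s.subSequence(2, 2);
        System.out.println(empty.length()); // 0

        // A normal non-empty slice for comparison: [1, 4) -> "ell".
        System.out.println(s.subSequence(1, 4)); // ell
    }
}
```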
start = end + 1. In Java this would work perfectly fine most of the time since the only unsigned type is char (i.e. non-negative), and it's use as an unsigned number is unorthodox (basically, don't try to do this without a very good reason).
However, if you did happen to use chars in this manner (or use another language which supports true unsigned integer types), then you run into a mathematical issue when start = 0, because an empty result would require end = -1, which isn't representable with unsigned types.
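Here's a small sketch of that problem, using Java's char as an unsigned 16-bit index. Note that sliceInclusive is a hypothetical closed-end API invented for illustration, not how subSequence actually works:

```java
public class ClosedEndDemo {
    // Hypothetical closed-end slice: copies indices [start, end] inclusive.
    static String sliceInclusive(String s, char start, char end) {
        return s.substring(start, end + 1);
    }

    public static void main(String[] args) {
        String s = "hello";

        // The non-empty case works: [1, 3] inclusive -> "ell".
        System.out.println(sliceInclusive(s, (char) 1, (char) 3));

        // An empty slice at start = 0 would need end = -1, but char is unsigned:
        char end = (char) -1;          // wraps around to 65535
        System.out.println((int) end); // prints 65535, not -1
        // sliceInclusive(s, (char) 0, end) would throw, not return "".
    }
}
```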
It could be argued that calling subSequence in this peculiar way just to get an empty sequence is not very useful, but it is still possible, and computer scientists dislike incompleteness.
In Java's case it was most likely done just to make porting code easier, or purely out of habit. I personally almost always implement methods that deal with sequences to include the first index and exclude the last one, simply out of habit; the result usually looks like the sketch below.
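A minimal sketch of that half-open idiom (the method name and types are just for illustration):

```java
public class HalfOpenDemo {
    // Sums the elements in the half-open range [start, end).
    static int sumRange(int[] values, int start, int end) {
        int total = 0;
        for (int i = start; i < end; i++) { // strictly less than: end is excluded
            total += values[i];
        }
        return total;
    }

    public static void main(String[] args) {
        int[] values = {1, 2, 3, 4};
        System.out.println(sumRange(values, 1, 3)); // 2 + 3 = 5
        System.out.println(sumRange(values, 2, 2)); // empty range -> 0
        // A nice property: the length of [start, end) is simply end - start.
    }
}
```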
There could also be other reasons early programming languages did this, pertaining to some obscure optimization that made a significant difference back then; nowadays any such gain is either insignificant or handled by the compiler for you anyway.
Whatever the reason, it's there, so you might as well live with it (unless you want to go about creating your own language/libraries).