The LogSumExp (LSE) function is a smooth approximation to the maximum function, mainly used by machine learning algorithms. It is defined as the logarithm of the sum of the exponentials of the arguments:
$\mathrm{LSE}(x_1, \dots, x_n) = \log\bigl(\exp(x_1) + \cdots + \exp(x_n)\bigr)$
The LogSumExp function domain is $\R^n$, the real coordinate space, and its range is $\R$, the real line. The approximation to the maximum improves as the gap between the largest argument and the others grows. The LogSumExp function is convex, and is strictly monotonically increasing everywhere in its domain (but not strictly convex everywhere).
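In practice, LSE is usually evaluated by first subtracting the maximum from every argument, which avoids overflow in the exponentials and is exact because $\mathrm{LSE}(x_1 + c, \dots, x_n + c) = \mathrm{LSE}(x_1, \dots, x_n) + c$. A minimal sketch of this shift trick, assuming NumPy (the function name `logsumexp` here is only illustrative):

```python
import numpy as np

def logsumexp(x):
    """Numerically stable LSE(x_1, ..., x_n) = log(exp(x_1) + ... + exp(x_n))."""
    x = np.asarray(x, dtype=float)
    m = np.max(x)  # shift by the maximum so the largest exponent is exp(0) = 1
    return m + np.log(np.sum(np.exp(x - m)))

# The bounds max(x) <= LSE(x) <= max(x) + log(n) can be checked numerically:
x = np.array([1000.0, 1000.0, 999.0])  # naive log(sum(exp(x))) would overflow here
print(logsumexp(x))  # about 1000.86, between max(x) = 1000 and 1000 + log(3) ≈ 1001.10
```

Library implementations such as scipy.special.logsumexp use the same shift-by-the-maximum strategy.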
On the other hand, LSE itself can be well approximated by $\max\{x_1, \dots, x_n\}$, owing to the following tight bounds:
$\max\{x_1, \dots, x_n\} \leq \mathrm{LSE}(x_1, \dots, x_n) \leq \max\{x_1, \dots, x_n\} + \log(n)$
The lower bound is an equality only when $n = 1$, and is approached as one argument dominates the others, while the upper bound is met exactly when all the arguments are equal.
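Both bounds can be seen by comparing the sum of exponentials with its largest term. Writing $m = \max\{x_1, \dots, x_n\}$,

$\exp(m) \leq \sum_{i=1}^{n} \exp(x_i) \leq n \exp(m)$,

and taking logarithms of all three parts, which preserves the inequalities because $\log$ is monotonically increasing, gives $m \leq \mathrm{LSE}(x_1, \dots, x_n) \leq m + \log(n)$.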