Wiktionary
n. (mathematics) A type of convergence of a sequence of functions $\{f_n\}$, in which the speed of convergence of $f_n(x)$ to $f(x)$ does not depend on $x$.
Wikipedia
In the mathematical field of analysis, uniform convergence is a type of convergence stronger than pointwise convergence. A sequence $(f_n)$ of functions converges uniformly to a limiting function $f$ if the speed of convergence of $f_n(x)$ to $f(x)$ does not depend on $x$.
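Formally (a standard statement of the definition, using the notation above): $f_n \to f$ uniformly on a set $E$ means that for every $\varepsilon > 0$ there exists an $N$ such that $|f_n(x) - f(x)| < \varepsilon$ for all $n \geq N$ and all $x \in E\,\!$. The essential point is that the same $N$ works simultaneously for every $x$; pointwise convergence, by contrast, allows $N$ to depend on $x$.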
The concept is important because several properties of the functions $f_n$, such as continuity and Riemann integrability, are transferred to the limit $f$ if the convergence is uniform, but not necessarily if the convergence is only pointwise. For example, $f_n(x) = x^n$ on $[0,1]$ converges pointwise to a limit that is $0$ for $x < 1$ and $1$ at $x = 1$; each $f_n$ is continuous but the limit is not, so the convergence cannot be uniform.
Uniform convergence to a function on a given interval can be defined in terms of the uniform norm.
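Concretely (the standard sup-norm characterization): $f_n \to f$ uniformly on an interval $I$ if and only if $\|f_n - f\|_\infty = \sup_{x \in I} |f_n(x) - f(x)| \to 0$ as $n \to \infty\,\!$.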
For a class of predicates $H$ defined on a set $X$ and a set of samples $x = (x_1, x_2, \ldots, x_m)$, where $x_i \in X$, the empirical frequency of $h \in H$ on $x$ is $\widehat{Q_{x}}(h)=\frac{1}{m}|\{i:1\leq i\leq m,h(x_{i})=1\}|\,\!$. The Uniform Convergence Theorem states, roughly, that if $H$ is "simple" and we draw samples independently (with replacement) from $X$ according to a distribution $P$, then with high probability all the empirical frequencies will be close to their expectations, where the expectation of $h$ is given by $Q(h) = P\{y \in X : h(y) = 1\}\,\!$. Here "simple" means that the Vapnik–Chervonenkis dimension of the class $H$ is small relative to the size of the sample.
In other words, a sufficiently simple collection of functions behaves roughly the same on a small random sample as it does on the distribution as a whole.
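A minimal numerical sketch of this statement (an illustrative setup, not from the source): take $X = [0,1]$ with $P$ uniform, and let $H$ be the class of threshold predicates $h_t(y) = \mathbf{1}[y \leq t]$, a class of VC dimension 1. The true frequency of each predicate is $Q(h_t) = t$, so the worst-case gap $\sup_t |\widehat{Q_x}(h_t) - Q(h_t)|$ can be estimated directly.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_deviation(m, num_thresholds=200):
    """Draw m i.i.d. samples from P = Uniform[0,1] and return the largest
    gap between empirical and true frequencies over a grid of threshold
    predicates h_t(y) = 1[y <= t] (VC dimension 1, so a "simple" class)."""
    x = rng.uniform(0.0, 1.0, size=m)
    thresholds = np.linspace(0.0, 1.0, num_thresholds)
    # Empirical frequency of each h_t: the fraction of samples with x_i <= t.
    empirical = (x[:, None] <= thresholds[None, :]).mean(axis=0)
    # True frequency under P: Q(h_t) = P{y <= t} = t.
    return np.abs(empirical - thresholds).max()

for m in (10, 100, 1000, 10000):
    print(f"m = {m:5d}   max gap = {max_deviation(m):.4f}")
```

As $m$ grows, the worst-case gap shrinks (roughly like $1/\sqrt{m}$ for this class), illustrating that the whole family of predicates converges at once rather than each predicate individually.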