Wiktionary
n. (mathematics) A square matrix that is its own transpose, and is thereby symmetric about the main diagonal.
Wikipedia
In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally, matrix A is symmetric if
$$A = A^{\mathsf{T}}.$$
Because equal matrices have equal dimensions, only square matrices can be symmetric.
The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if the entries are written as $A = (a_{ij})$, then $a_{ij} = a_{ji}$ for all indices $i$ and $j$.
The following 3×3 matrix is symmetric:
$$\begin{bmatrix}
1 & 7 & 3\\
7 & 4 & -5\\
3 & -5 & 6\end{bmatrix}.$$
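As a quick illustration of the definition, the following sketch (using NumPy, which the text itself does not assume) checks that this matrix equals its transpose:

```python
import numpy as np

# The example matrix from above.
A = np.array([
    [1,  7,  3],
    [7,  4, -5],
    [3, -5,  6],
])

# A matrix is symmetric exactly when it equals its transpose,
# i.e. a_ij == a_ji for all indices i and j.
print(np.array_equal(A, A.T))  # True
```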
Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.
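Both observations can be checked directly; a minimal sketch, again assuming NumPy, shows that a diagonal matrix is unchanged by transposition and that forcing $A^{\mathsf{T}} = -A$ makes every diagonal entry its own negative, hence zero:

```python
import numpy as np

# Any square diagonal matrix equals its transpose.
D = np.diag([2.0, -1.0, 5.0])
print(np.array_equal(D, D.T))  # True

# Build a skew-symmetric matrix (A^T = -A) from an arbitrary square matrix;
# its diagonal entries satisfy a_ii = -a_ii, so they are all zero.
M = np.arange(9.0).reshape(3, 3)
S = M - M.T
print(np.array_equal(S.T, -S))  # True: S is skew-symmetric
print(np.diag(S))               # [0. 0. 0.]
```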
In linear algebra, a real symmetric matrix represents a self-adjoint operator over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one which has real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.
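One concrete example of such an accommodation, taking NumPy as a representative library (an assumption, not something the text names), is a dedicated eigensolver for symmetric/Hermitian matrices:

```python
import numpy as np

# A real symmetric matrix (the example from above).
A = np.array([
    [1.0,  7.0,  3.0],
    [7.0,  4.0, -5.0],
    [3.0, -5.0,  6.0],
])

# np.linalg.eigh is specialized for symmetric/Hermitian input: it reads only
# one triangle of A and returns real eigenvalues in ascending order, guarantees
# the general-purpose np.linalg.eig does not make.
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)

# For a real symmetric matrix the eigenvectors can be chosen orthonormal,
# so V^T V is the identity up to rounding error.
print(np.allclose(eigenvectors.T @ eigenvectors, np.eye(3)))  # True
```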