
# Orthogonal


"Orthogonal" redirects here. For the trilogy of novels by Greg Egan, see Orthogonal (novel).

In mathematics, orthogonality is the relation of two lines at right angles to one another (perpendicularity) and the generalization of this relation to n dimensions. The term also extends to a variety of mathematical relations thought of as describing non-overlapping, uncorrelated, or independent objects of some kind.

The concept of orthogonality has been broadly generalized in mathematics, science, and engineering, especially since the beginning of the 16th century. Much of this generalization has involved the concepts of mathematical functions, calculus, and linear algebra.

## Etymology

The word comes from the Greek ὀρθός (orthos), meaning "straight", and γωνία (gonia), meaning "angle". The ancient Greek ὀρθογώνιον orthogōnion (< ὀρθός orthos 'upright'[1] + γωνία gōnia 'angle'[2]) and classical Latin orthogonium originally denoted a rectangle.[3] Later, they came to mean a right triangle. In the 12th century, the post-classical Latin word orthogonalis came to mean a right angle or something related to a right angle.[4]

## Mathematics

### Definitions

• In geometry, two Euclidean vectors are orthogonal if they are perpendicular, i.e., they form a right angle.
• Curves or functions in the plane are orthogonal at an intersection if their tangent lines are perpendicular at that point.
• Two vectors, x and y, in an inner product space, V, are orthogonal if their inner product $\langle x, y \rangle$ is zero.[6] This relationship is denoted $x \, \bot \, y$.
• Two vector subspaces, A and B, of an inner product space, V, are called orthogonal subspaces if each vector in A is orthogonal to each vector in B. The largest subspace of V that is orthogonal to a given subspace is its orthogonal complement.
• A linear transformation, T : V → V, is called an orthogonal linear transformation if it preserves the inner product, and thus the angle between, and the lengths of, vectors. That is, for all pairs of vectors x and y in the inner product space V, $\langle Tx, Ty \rangle = \langle x, y \rangle$.
• A term rewriting system is said to be orthogonal if it is left-linear and is non-ambiguous. Orthogonal term rewriting systems are confluent.

A set of vectors is called pairwise orthogonal if each pairing of them is orthogonal. Such a set is called an orthogonal set. Nonzero pairwise orthogonal vectors are always linearly independent.

In certain cases, the word normal is used to mean orthogonal, particularly in the geometric sense as in the normal to a surface. For example, the y-axis is normal to the curve y = x² at the origin. However, normal may also refer to the magnitude of a vector. In particular, a set is called orthonormal (orthogonal plus normal) if it is an orthogonal set of unit vectors. As a result, use of the term normal to mean "orthogonal" is often avoided. The word "normal" also has a different meaning in probability and statistics.

A vector space with a bilinear form generalizes the case of an inner product. When the bilinear form applied to two vectors results in zero, then they are orthogonal. The case of a pseudo-Euclidean plane uses the term hyperbolic orthogonality: the axes x′ and t′ obtained by a hyperbolic rotation through angle φ are hyperbolic-orthogonal for any given φ.
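As a quick illustration of the bilinear-form case, the following minimal sketch (function and variable names are chosen here for illustration, not taken from the source) checks hyperbolic orthogonality under the Minkowski form B((x₁, t₁), (x₂, t₂)) = x₁x₂ − t₁t₂:

```python
import math

def minkowski_form(u, v):
    """Bilinear form B(u, v) = x1*x2 - t1*t2 on the pseudo-Euclidean plane."""
    (x1, t1), (x2, t2) = u, v
    return x1 * x2 - t1 * t2

# The rotated axes x' = (cosh phi, sinh phi) and t' = (sinh phi, cosh phi)
# are hyperbolic-orthogonal for any phi: B(x', t') = cosh*sinh - sinh*cosh = 0.
for phi in (0.0, 0.5, 1.3, 2.0):
    x_axis = (math.cosh(phi), math.sinh(phi))
    t_axis = (math.sinh(phi), math.cosh(phi))
    assert abs(minkowski_form(x_axis, t_axis)) < 1e-12
```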

### Euclidean vector spaces

In 2-D or higher-dimensional Euclidean space, two vectors are orthogonal if and only if their dot product is zero, i.e. they make an angle of 90°, or π/2 radians.[7] Hence orthogonality of vectors is an extension of the concept of perpendicular vectors into higher-dimensional spaces.
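A minimal sketch of this criterion, assuming nothing beyond the dot-product definition (the helper names are illustrative, not from any particular library):

```python
def dot(u, v):
    """Euclidean dot product of two same-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal(u, v, tol=1e-12):
    """Two Euclidean vectors are orthogonal iff their dot product is zero."""
    return abs(dot(u, v)) <= tol

print(is_orthogonal((1, 2), (-2, 1)))       # True: 1*(-2) + 2*1 = 0
print(is_orthogonal((1, 2, 3), (1, 1, 1)))  # False: 1 + 2 + 3 = 6
```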

In terms of Euclidean subspaces, the "orthogonal complement" of a line is the plane perpendicular to it, and vice-versa.[8]

Note, however, that there is no corresponding notion of perpendicular planes, because vectors in a subspace start from the origin (by the definition of a vector subspace).

In four-dimensional Euclidean space, the orthogonal complement of a line is a hyperplane and vice versa, and that of a plane is a plane.[8]

### Orthogonal functions

Main article: Orthogonal functions

It is common to use integral calculus to define the inner product of two functions f and g:

$\langle f, g\rangle_w = \int_a^b f(x)g(x)w(x)\,dx.$

Here we introduce a nonnegative weight function $w(x)$ in the definition of this inner product. In the simplest case, w(x) = 1 identically.

We say that these functions are orthogonal if that inner product is zero:

$\int_a^b f(x)g(x)w(x)\,dx = 0.$

We write the norm with respect to this inner product and the weight function as

$\|f\|_w = \sqrt{\langle f, f\rangle_w}$

The members of a set of functions { f_i : i = 1, 2, 3, ... } are:

• orthogonal on the closed interval [a, b] if
$\langle f_i, f_j \rangle = \int_a^b f_i(x) f_j(x) w(x)\,dx = \|f_i\|^2\delta_{i,j} = \|f_j\|^2\delta_{i,j}$
• orthonormal on the interval [a, b] if
$\langle f_i, f_j \rangle = \int_a^b f_i(x) f_j(x) w(x)\,dx = \delta_{i,j}$

where

$\delta_{i,j} = \begin{cases} 1 & \mathrm{if}\ i = j \\ 0 & \mathrm{if}\ i \neq j \end{cases}$

is the "Kronecker delta" function. In other words, any two of them are orthogonal, and the norm of each is 1 in the case of the orthonormal sequence. See in particular the orthogonal polynomials.

### Examples

• The vectors (1, 3, 2)T, (3, −1, 0)T, (1/3, 1, −5/3)T are orthogonal to each other, since (1)(3) + (3)(−1) + (2)(0) = 0, (3)(1/3) + (−1)(1) + (0)(−5/3) = 0, and (1)(1/3) + (3)(1) + (2)(−5/3) = 0.
• The vectors (1, 0, 1, 0, ...)T and (0, 1, 0, 1, ...)T are orthogonal to each other, since their dot product is 0. We can then make the generalization to consider the vectors in $\mathbb{Z}_2^n$:
$\mathbf{v}_k = \sum_{i=0 \atop ai+k < n}^{n/a} \mathbf{e}_{ai+k}$
for some positive integer a, and for 1 ≤ k ≤ a − 1, these vectors are orthogonal; for example, (1, 0, 0, 1, 0, 0, 1, 0)T, (0, 1, 0, 0, 1, 0, 0, 1)T, and (0, 0, 1, 0, 0, 1, 0, 0)T are orthogonal.
• Consider the functions 2t + 3 and 5t² + t − 17/9. These functions are orthogonal with respect to a unit weight function on the interval from −1 to 1 (a numerical cross-check appears in the sketch after this list). Their product is 10t³ + 17t² − (7/9)t − 17/3, and now,


\begin{align}
& \int_{-1}^1 \left(10t^3+17t^2-\frac{7}{9}t-\frac{17}{3}\right)\,dt \\[6pt]
&= \left[\frac{5}{2}t^4 + \frac{17}{3}t^3-\frac{7}{18}t^2-\frac{17}{3}t \right]_{-1}^1 \\[6pt]
&= \left(\frac{5}{2}+\frac{17}{3}-\frac{7}{18}-\frac{17}{3}\right)-\left(\frac{5}{2}-\frac{17}{3}-\frac{7}{18}+\frac{17}{3}\right) \\[6pt]
&= \frac{19}{9} - \frac{19}{9} = 0.
\end{align}

• The functions 1, sin(nx), and cos(nx) for n = 1, 2, 3, ... are orthogonal with respect to Riemann integration on the interval [0, 2π], [−π, π], or any other closed interval of length 2π. This fact is central to the theory of Fourier series.
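As referenced in the polynomial example above, the integral there can also be checked exactly in code. This sketch uses Python's fractions module for exact arithmetic; representing the polynomial as a coefficient list is my own choice, not from the source:

```python
from fractions import Fraction as F

def integrate_poly(coeffs, a, b):
    """Exactly integrate the polynomial sum(c_k * t**k) from a to b.
    coeffs[k] is the coefficient of t**k."""
    antideriv = lambda t: sum(c * t**(k + 1) / (k + 1)
                              for k, c in enumerate(coeffs))
    return antideriv(b) - antideriv(a)

# (2t + 3)(5t^2 + t - 17/9) = 10t^3 + 17t^2 - (7/9)t - 17/3
product = [F(-17, 3), F(-7, 9), F(17), F(10)]  # constant term first
print(integrate_poly(product, F(-1), F(1)))    # 0 -> the functions are orthogonal
```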

#### Orthogonal states in quantum mechanics

• In quantum mechanics, two eigenstates of a Hermitian operator, $\psi_m$ and $\psi_n$, are orthogonal if they correspond to different eigenvalues. This means, in Dirac notation, that $\langle \psi_m | \psi_n \rangle = 0$ unless $\psi_m$ and $\psi_n$ correspond to the same eigenvalue. This follows from the fact that Schrödinger's equation is a Sturm–Liouville equation (in Schrödinger's formulation) or that observables are given by Hermitian operators (in Heisenberg's formulation).
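For instance, the particle-in-a-box eigenfunctions ψ_n(x) = √2 sin(nπx) on [0, 1] correspond to distinct energies, so they should be mutually orthogonal. A rough numerical check (the discretization and names here are arbitrary illustrative choices):

```python
import math

def psi(n, x):
    """Particle-in-a-box eigenstate on [0, 1]: psi_n(x) = sqrt(2)*sin(n*pi*x)."""
    return math.sqrt(2) * math.sin(n * math.pi * x)

def overlap(m, n, samples=10_000):
    """Midpoint-rule estimate of <psi_m | psi_n> on [0, 1]."""
    h = 1.0 / samples
    return sum(psi(m, (i + 0.5) * h) * psi(n, (i + 0.5) * h)
               for i in range(samples)) * h

print(round(overlap(1, 2), 6))  # ~0: different eigenvalues, orthogonal states
print(round(overlap(2, 2), 6))  # ~1: normalized
```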

## Art

In art, the perspective (imaginary) lines pointing to the vanishing point are referred to as "orthogonal lines".

The term "orthogonal line" often has a quite different meaning in the literature of modern art criticism. Many works by painters such as [1]

## Computer science

Orthogonality in programming language design is the ability to use various language features in arbitrary combinations with consistent results.[9] This usage was introduced by Van Wijngaarden in the design of Algol 68:

The number of independent primitive concepts has been minimized in order that the language be easy to describe, to learn, and to implement. On the other hand, these concepts have been applied “orthogonally” in order to maximize the expressive power of the language while trying to avoid deleterious superfluities.[10]

Orthogonality is a system design property which guarantees that modifying the technical effect produced by a component of a system neither creates nor propagates side effects to other components of the system. Typically this is achieved through the separation of concerns and encapsulation, and it is essential for feasible and compact designs of complex systems. The emergent behavior of a system consisting of components should be controlled strictly by formal definitions of its logic and not by side effects resulting from poor integration, i.e., non-orthogonal design of modules and interfaces. Orthogonality reduces testing and development time because it is easier to verify designs that neither cause side effects nor depend on them.
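A concrete, if small, illustration of this property in an existing API: Python's built-in sorted takes a key function and a reverse flag that compose orthogonally, so every combination behaves consistently and neither option affects the other's meaning (the sample data is invented here):

```python
data = ["banana", "Apple", "cherry"]

# key and reverse are orthogonal features: all four combinations work,
# and each option keeps its meaning regardless of the other.
print(sorted(data))                               # case-sensitive, ascending
print(sorted(data, reverse=True))                 # case-sensitive, descending
print(sorted(data, key=str.lower))                # case-insensitive, ascending
print(sorted(data, key=str.lower, reverse=True))  # case-insensitive, descending
```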

An instruction set is said to be orthogonal if it lacks redundancy (i.e., there is only a single instruction that can be used to accomplish a given task)[11] and is designed such that instructions can use any register in any addressing mode. This terminology results from considering an instruction as a vector whose components are the instruction fields. One field identifies the registers to be operated upon and another specifies the addressing mode. An orthogonal instruction set uniquely encodes all combinations of registers and addressing modes.

## Communications

In communications, multiple-access schemes are orthogonal when an ideal receiver can completely reject arbitrarily strong unwanted signals from the desired signal using different basis functions. One such scheme is TDMA, where the orthogonal basis functions are nonoverlapping rectangular pulses ("time slots").

Another scheme is orthogonal frequency-division multiplexing (OFDM), which refers to the use, by a single transmitter, of a set of frequency-multiplexed signals with the exact minimum frequency spacing needed to make them orthogonal so that they do not interfere with each other. Well-known examples include the a, g, and n versions of 802.11 Wi-Fi; WiMAX; ITU-T G.hn; DVB-T, the terrestrial digital TV broadcast system used in most of the world outside North America; and DMT (Discrete Multi-Tone), the standard form of ADSL.

In OFDM, the subcarrier frequencies are chosen so that the subcarriers are orthogonal to each other, meaning that crosstalk between the subchannels is eliminated and intercarrier guard bands are not required. This greatly simplifies the design of both the transmitter and the receiver. In conventional FDM, a separate filter for each subchannel is required.
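The subcarrier orthogonality can be sketched numerically: over one symbol period, sampled complex exponentials at integer-multiple frequency spacings have zero inner product. The symbol length N and carrier indices below are arbitrary choices for illustration:

```python
import cmath

N = 64  # samples per OFDM symbol (illustrative)

def subcarrier(k):
    """One symbol period of subcarrier k: s_k[n] = exp(j*2*pi*k*n/N)."""
    return [cmath.exp(2j * cmath.pi * k * n / N) for n in range(N)]

def inner(u, v):
    """Discrete inner product sum(u[n] * conj(v[n]))."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

print(abs(inner(subcarrier(3), subcarrier(7))))  # ~0: distinct subcarriers
print(abs(inner(subcarrier(3), subcarrier(3))))  # N: a subcarrier with itself
```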

## Statistics, econometrics, and economics

When performing statistical analysis, independent variables that affect a particular dependent variable are said to be orthogonal if they are uncorrelated,[12] since the covariance forms an inner product. In this case the same results are obtained for the effect of any of the independent variables upon the dependent variable, regardless of whether one models the effects of the variables individually with simple regression or simultaneously with multiple regression. If correlation is present, the factors are not orthogonal and different results are obtained by the two methods. This usage arises from the fact that if centered by subtracting the expected value (the mean), uncorrelated variables are orthogonal in the geometric sense discussed above, both as observed data (i.e., vectors) and as random variables (i.e., density functions).

One econometric formalism that is alternative to the maximum likelihood framework, the Generalized Method of Moments, relies on orthogonality conditions. In particular, the Ordinary Least Squares estimator may be easily derived from an orthogonality condition between the explanatory variables and model residuals.
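The OLS orthogonality condition can be verified directly for simple regression: after fitting y = a + bx by least squares, the residuals are orthogonal to both the constant regressor (they sum to zero) and to x. A hedged sketch with invented sample data:

```python
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

# Closed-form simple OLS: b = cov(x, y) / var(x), a = mean(y) - b * mean(x).
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
a = my - b * mx

residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
print(round(sum(residuals), 10))                              # ~0
print(round(sum(xi * r for xi, r in zip(x, residuals)), 10))  # ~0: orthogonality
```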

## Taxonomy

In taxonomy, an orthogonal classification is one in which no item is a member of more than one group, that is, the classifications are mutually exclusive.

## Combinatorics

In combinatorics, two n×n Latin squares are said to be orthogonal if their superimposition yields all possible n2 combinations of entries.[13]
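A direct check of this definition; the squares below are a standard orthogonal pair for n = 3, and the helper name is my own:

```python
def are_orthogonal(L1, L2):
    """Two n x n Latin squares are orthogonal iff superimposing them
    yields all n^2 distinct ordered pairs of entries."""
    n = len(L1)
    pairs = {(L1[i][j], L2[i][j]) for i in range(n) for j in range(n)}
    return len(pairs) == n * n

A = [[0, 1, 2],
     [1, 2, 0],
     [2, 0, 1]]
B = [[0, 1, 2],
     [2, 0, 1],
     [1, 2, 0]]
print(are_orthogonal(A, B))  # True: all 9 ordered pairs appear exactly once
```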

## Chemistry

In synthetic organic chemistry, orthogonal protection is a strategy allowing the deprotection of functional groups independently of each other. In supramolecular chemistry, the notion of orthogonality refers to the possibility of two or more supramolecular, often non-covalent, interactions being compatible, each forming reversibly without interference from the others.

## System reliability

In the field of system reliability, orthogonal redundancy is the form of redundancy in which the backup device or method is completely different from the error-prone device or method it protects. The failure mode of an orthogonally redundant backup device or method does not intersect with, and is completely different from, the failure mode of the device or method in need of redundancy, safeguarding the total system against catastrophic failure.

## Neuroscience

In neuroscience, a sensory map in the brain which has overlapping stimulus coding (e.g. location and quality) is called an orthogonal map.

## Gaming

In board games such as chess which feature a grid of squares, 'orthogonal' is commonly used to mean "in the same row or column". In this context 'orthogonal' and 'diagonal' are considered opposites.[14]
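For example, a small illustrative helper (not standard chess-library code) that tests this usage on algebraic square names:

```python
def orthogonally_aligned(sq1, sq2):
    """True if two chessboard squares (e.g. "a1", "d1") share a file or rank,
    i.e. lie in the same column or row."""
    return sq1[0] == sq2[0] or sq1[1] == sq2[1]

print(orthogonally_aligned("a1", "a8"))  # True: same file (column)
print(orthogonally_aligned("a1", "h1"))  # True: same rank (row)
print(orthogonally_aligned("a1", "b2"))  # False: diagonal, not orthogonal
```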

## References

• The Art of Unix Programming