Recursive least squares ladder estimation algorithms have recently attracted much attention because of their excellent convergence behavior and fast parameter tracking capability compared to gradient-based algorithms. We present some recently developed square-root normalized exact least squares ladder form algorithms that have lower storage and computational requirements than the unnormalized ones. A Hilbert space approach to the derivation of the magnitude-normalized signal and gain recursions is presented. The normalized forms are expected to have even better numerical properties than the unnormalized versions. Other normalized forms, such as joint process estimators (e.g., the "adaptive line enhancer") and ARMA (pole-zero) models, are also presented. Applications of these algorithms to fast ("zero") startup equalizers, adaptive noise and echo cancellers, non-Gaussian event detectors, and inverse models for control problems are also mentioned.
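The abstract does not reproduce the recursions themselves, but the flavor of a magnitude-normalized ladder (lattice) form can be sketched. Below is a minimal illustration of the standard normalized lattice time- and order-updates, in which the forward and backward prediction errors are scaled to lie in [-1, 1] and each stage carries a single normalized reflection coefficient. The function name `normalized_lattice`, the soft-start constant `delta`, and the numerical clamping are illustrative assumptions, not the paper's exact algorithm.

```python
import math
import random


def normalized_lattice(x, order, lam=1.0, delta=1e-2):
    """Sketch of a magnitude-normalized adaptive lattice predictor.

    nu[m], beta[m] : normalized forward/backward prediction errors at stage m
    rho[m]         : normalized reflection (partial correlation) coefficient
    lam            : exponential forgetting factor (1.0 = exact growing window)
    delta          : small positive soft-start energy to avoid division by zero
    """
    rho = [0.0] * order
    beta_prev = [0.0] * (order + 1)  # beta_m(t-1), all stages
    R = delta                        # weighted input energy
    nu_hist = []                     # top-stage normalized forward error
    for xt in x:
        # Zeroth-order normalization: nu_0(t) = beta_0(t) = x(t)/sqrt(R(t))
        R = lam * R + xt * xt
        nu = [0.0] * (order + 1)
        beta = [0.0] * (order + 1)
        nu[0] = beta[0] = xt / math.sqrt(R)
        for m in range(order):
            # Complementary "cosine" factors; clamped for numerical safety
            cb = math.sqrt(max(1.0 - beta_prev[m] ** 2, 1e-12))
            cn = math.sqrt(max(1.0 - nu[m] ** 2, 1e-12))
            # Time update of the normalized reflection coefficient
            rho[m] = rho[m] * cn * cb + nu[m] * beta_prev[m]
            cr = math.sqrt(max(1.0 - rho[m] ** 2, 1e-12))
            # Order updates of the normalized errors
            nu[m + 1] = (nu[m] - rho[m] * beta_prev[m]) / (cr * cb)
            beta[m + 1] = (beta_prev[m] - rho[m] * nu[m]) / (cr * cn)
        beta_prev = beta
        nu_hist.append(nu[order])
    return rho, nu_hist
```

Because every quantity in the loop is a normalized inner product, all state variables stay bounded by one in magnitude regardless of the input scale, which is the source of the improved numerical behavior claimed for the normalized forms. For an AR(1) input `x(t) = 0.9 x(t-1) + w(t)`, the first reflection coefficient `rho[0]` settles near the first partial correlation of the process (about 0.9).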