Adaptive filters / Ali H. Sayed.
Material type: Text
Publisher: Hoboken, New Jersey : Wiley-Interscience, c2008
Distributor: [Piscataway, New Jersey] : IEEE Xplore, 2008
Description: 1 PDF (xxx, 786 pages) : illustrations
Content type: text
Media type: electronic
Carrier type: online resource
ISBN: 9780470374122
Subject(s):
- Adaptive filters
- Acoustics
- Adaptation model
- Adaptive algorithms
- Adaptive equalizers
- Additive noise
- Additives
- Algorithm design and analysis
- Antenna measurements
- Approximation algorithms
- Approximation methods
- Arrays
- Artificial intelligence
- Artificial neural networks
- Bayesian methods
- Bibliographies
- Binary phase shift keying
- Blind equalizers
- Books
- Channel estimation
- Chebyshev approximation
- Computational complexity
- Computational efficiency
- Computer aided software engineering
- Computers
- Context
- Convergence
- Correlation
- Cost function
- Covariance matrix
- Data models
- Data structures
- Decision feedback equalizers
- Delay
- Digital filters
- Discrete Fourier transforms
- Discrete cosine transforms
- Echo cancellers
- Eigenvalues and eigenfunctions
- Energy conservation
- Equalizers
- Equations
- Estimation
- Estimation error
- Estimation theory
- Extraterrestrial measurements
- Face
- Filter bank
- Filtering algorithms
- Filtering theory
- Finite impulse response filter
- Frequency domain analysis
- Gaussian distribution
- Gaussian noise
- Gaussian processes
- Geometry
- Histograms
- History
- Indexes
- Information filters
- Inspection
- Joints
- Kalman filters
- Lattices
- Lead
- Least squares approximation
- Linear algebra
- Linear matrix inequalities
- Linear systems
- Linearity
- Manifolds
- Materials
- Mathematical model
- Matrices
- Matrix decomposition
- Maximum likelihood estimation
- Measurement uncertainty
- Medical services
- Noise
- Noise measurement
- OFDM
- Optimization
- Oscillators
- Performance analysis
- Poles and zeros
- Polynomials
- Prediction algorithms
- Presses
- Probability density function
- Projection algorithms
- Qualifications
- Random processes
- Random variables
- Receivers
- Reflection
- Reliability theory
- Robustness
- Sections
- Signal to noise ratio
- Silicon compounds
- Simulation
- Speech
- Speech processing
- Stability analysis
- Stacking
- Steady-state
- Stochastic processes
- Symmetric matrices
- Technological innovation
- Terminology
- Time measurement
- Tin
- Transfer functions
- Transforms
- Transient analysis
- Transversal filters
- Vectors
- Weight measurement
- Writing
DDC classification: 621.3815/324
Includes bibliographical references (p. 758-774) and indexes.
Preface and Acknowledgments -- Notation and Symbols -- BACKGROUND MATERIAL -- A. Random Variables -- A.1 Variance of a Random Variable -- A.2 Dependent Random Variables -- A.3 Complex-Valued Random Variables -- A.4 Vector-Valued Random Variables -- A.5 Gaussian Random Vectors -- B. Linear Algebra -- B.1 Hermitian and Positive-Definite Matrices -- B.2 Range Spaces and Nullspaces of Matrices -- B.3 Schur Complements -- B.4 Cholesky Factorization -- B.5 QR Decomposition -- B.6 Singular Value Decomposition -- B.7 Kronecker Products -- C. Complex Gradients -- C.1 Cauchy-Riemann Conditions -- C.2 Scalar Arguments -- C.3 Vector Arguments -- PART I: OPTIMAL ESTIMATION -- 1. Scalar-Valued Data -- 1.1 Estimation Without Observations -- 1.2 Estimation Given Dependent Observations -- 1.3 Orthogonality Principle -- 1.4 Gaussian Random Variables -- 2. Vector-Valued Data -- 2.1 Optimal Estimator in the Vector Case -- 2.2 Spherically Invariant Gaussian Variables -- 2.3 Equivalent Optimization Criterion -- Summary and Notes -- Problems and Computer Projects -- PART II: LINEAR ESTIMATION -- 3. Normal Equations -- 3.1 Mean-Square Error Criterion -- 3.2 Minimization by Differentiation -- 3.3 Minimization by Completion-of-Squares -- 3.4 Minimization of the Error Covariance Matrix -- 3.5 Optimal Linear Estimator -- 4. Orthogonality Principle -- 4.1 Design Examples -- 4.2 Orthogonality Condition -- 4.3 Existence of Solutions -- 4.4 Nonzero-Mean Variables -- 5. Linear Models -- 5.1 Estimation using Linear Relations -- 5.2 Application: Channel Estimation -- 5.3 Application: Block Data Estimation -- 5.4 Application: Linear Channel Equalization -- 5.5 Application: Multiple-Antenna Receivers -- 6. Constrained Estimation -- 6.1 Minimum-Variance Unbiased Estimation -- 6.2 Example: Mean Estimation -- 6.3 Application: Channel and Noise Estimation -- 6.4 Application: Decision Feedback Equalization -- 6.5 Application: Antenna Beamforming -- 7. Kalman Filter.
7.1 Innovations Process -- 7.2 State-Space Model -- 7.3 Recursion for the State Estimator -- 7.4 Computing the Gain Matrix -- 7.5 Riccati Recursion -- 7.6 Covariance Form -- 7.7 Measurement and Time-Update Form -- Summary and Notes -- Problems and Computer Projects -- PART III: STOCHASTIC GRADIENT ALGORITHMS -- 8. Steepest-Descent Technique -- 8.1 Linear Estimation Problem -- 8.2 Steepest-Descent Method -- 8.3 More General Cost Functions -- 9. Transient Behavior -- 9.1 Modes of Convergence -- 9.2 Optimal Step-Size -- 9.3 Weight-Error Vector Convergence -- 9.4 Time Constants -- 9.5 Learning Curve -- 9.6 Contour Curves of the Error Surface -- 9.7 Iteration-Dependent Step-Sizes -- 9.8 Newton's Method -- 10. LMS Algorithm -- 10.1 Motivation -- 10.2 Instantaneous Approximation -- 10.3 Computational Cost -- 10.4 Least-Perturbation Property -- 10.5 Application: Adaptive Channel Estimation -- 10.6 Application: Adaptive Channel Equalization -- 10.7 Application: Decision-Feedback Equalization -- 10.8 Ensemble-Average Learning Curves -- 11. Normalized LMS Algorithm -- 11.1 Instantaneous Approximation -- 11.2 Computational Cost -- 11.3 Power Normalization -- 11.4 Least-Perturbation Property -- 12. Other LMS-Type Algorithms -- 12.1 Non-Blind Algorithms -- 12.2 Blind Algorithms -- 12.3 Some Properties -- 13. Affine Projection Algorithm -- 13.1 Instantaneous Approximation -- 13.2 Computational Cost -- 13.3 Least-Perturbation Property -- 13.4 Affine Projection Interpretation -- 14. RLS Algorithm -- 14.1 Instantaneous Approximation -- 14.2 Computational Cost -- Summary and Notes -- Problems and Computer Projects -- PART IV: MEAN-SQUARE PERFORMANCE -- 15. Energy Conservation -- 15.1 Performance Measure -- 15.2 Stationary Data Model -- 15.3 Energy Conservation Relation -- 15.4 Variance Relation -- 15.A Interpretations of the Energy Relation -- 16. Performance of LMS -- 16.1 Variance Relation -- 16.2 Small Step-Sizes -- 16.3 Separation Principle.
16.4 White Gaussian Input -- 16.5 Statement of Results -- 16.6 Simulation Results -- 17. Performance of NLMS -- 17.1 Separation Principle -- 17.2 Simulation Results -- 17.A Relating NLMS to LMS -- 18. Performance of Sign-Error LMS -- 18.1 Real-Valued Data -- 18.2 Complex-Valued Data -- 18.3 Simulation Results -- 19. Performance of RLS and Other Filters -- 19.1 Performance of RLS -- 19.2 Performance of Other Filters -- 19.3 Performance Table for Small Step-Sizes -- 20. Nonstationary Environments -- 20.1 Motivation -- 20.2 Nonstationary Data Model -- 20.3 Energy Conservation Relation -- 20.4 Variance Relation -- 21. Tracking Performance -- 21.1 Performance of LMS -- 21.2 Performance of NLMS -- 21.3 Performance of Sign-Error LMS -- 21.4 Performance of RLS -- 21.5 Comparison of Tracking Performance -- 21.6 Comparing RLS and LMS -- 21.7 Performance of Other Filters -- 21.8 Performance Table for Small Step-Sizes -- Summary and Notes -- Problems and Computer Projects -- PART V: TRANSIENT PERFORMANCE -- 22. Weighted Energy Conservation -- 22.1 Data Model -- 22.2 Data-Normalized Adaptive Filters -- 22.3 Weighted Energy Conservation Relation -- 22.4 Weighted Variance Relation -- 23. LMS with Gaussian Regressors -- 23.1 Mean and Variance Relations -- 23.2 Mean Behavior -- 23.3 Mean-Square Behavior -- 23.4 Mean-Square Stability -- 23.5 Steady-State Performance -- 23.6 Small Step-Size Approximations -- 23.A Convergence Time -- 24. LMS with non-Gaussian Regressors -- 24.1 Mean and Variance Relations -- 24.2 Mean-Square Stability and Performance -- 24.3 Small Step-Size Approximations -- 24.A Independence and Averaging Analysis -- 25. Data-Normalized Filters -- 25.1 NLMS Filter -- 25.2 Data-Normalized Filters -- 25.A Stability Bound -- 25.B Stability of NLMS -- Summary and Notes -- Problems and Computer Projects -- PART VI: BLOCK ADAPTIVE FILTERS -- 26. Transform Domain Adaptive Filters -- 26.1 Transform-Domain Filters -- 26.2 DFT-Domain LMS.
26.3 DCT-Domain LMS -- 26.A DCT-Transformed Regressors -- 27. Efficient Block Convolution -- 27.1 Motivation -- 27.2 Block Data Formulation -- 27.3 Block Convolution -- 28. Block and Subband Adaptive Filters -- 28.1 DFT Block Adaptive Filters -- 28.2 Subband Adaptive Filters -- 28.A Another Constrained DFT Block Filter -- 28.B Overlap-Add Block Adaptive Filters -- Summary and Notes -- Problems and Computer Projects -- PART VII: LEAST-SQUARES METHODS -- 29. Least-Squares Criterion -- 29.1 Least-Squares Problem -- 29.2 Geometric Argument -- 29.3 Algebraic Arguments -- 29.4 Properties of Least-Squares Solution -- 29.5 Projection Matrices -- 29.6 Weighted Least-Squares -- 29.7 Regularized Least-Squares -- 29.8 Weighted Regularized Least-Squares -- 30. Recursive Least-Squares -- 30.1 Motivation -- 30.2 RLS Algorithm -- 30.3 Regularization -- 30.4 Conversion Factor -- 30.5 Time-Update of the Minimum Cost -- 30.6 Exponentially-Weighted RLS Algorithm -- 31. Kalman Filtering and RLS -- 31.1 Equivalence in Linear Estimation -- 31.2 Kalman Filtering and Recursive Least-Squares -- 31.A Extended RLS Algorithms -- 32. Order and Time-Update Relations -- 32.1 Backward Order-Update Relations -- 32.2 Forward Order-Update Relations -- 32.3 Time-Update Relation -- Summary and Notes -- Problems and Computer Projects -- PART VIII: ARRAY ALGORITHMS -- 33. Norm and Angle Preservation -- 33.1 Some Difficulties -- 33.2 Square-Root Factors -- 33.3 Norm and Angle Preservation -- 33.4 Motivation for Array Methods -- 34. Unitary Transformations -- 34.1 Givens Rotations -- 34.2 Householder Transformations -- 35. QR and Inverse QR Algorithms -- 35.1 Inverse QR Algorithm -- 35.2 QR Algorithm -- 35.3 Extended QR Algorithm -- 35.A Array Algorithms for Kalman Filtering -- Summary and Notes -- Problems and Computer Projects -- PART IX: FAST RLS ALGORITHMS -- 36. Hyperbolic Rotations -- 36.1 Hyperbolic Givens Rotations -- 36.2 Hyperbolic Householder Transformations.
36.3 Hyperbolic Basis Rotations -- 37. Fast Array Algorithm -- 37.1 Time-Update of the Gain Vector -- 37.2 Time-Update of the Conversion Factor -- 37.3 Initial Conditions -- 37.4 Array Algorithm -- 37.A Chandrasekhar Filter -- 38. Regularized Prediction Problems -- 38.1 Regularized Backward Prediction -- 38.2 Regularized Forward Prediction -- 38.3 Low-Rank Factorization -- 39. Fast Fixed-Order Filters -- 39.1 Fast Transversal Filter -- 39.2 FAEST Filter -- 39.3 Fast Kalman Filter -- 39.4 Stability Issues -- Summary and Notes -- Problems and Computer Projects -- PART X: LATTICE FILTERS -- 40. Three Basic Estimation Problems -- 40.1 Motivation for Lattice Filters -- 40.2 Joint Process Estimation -- 40.3 Backward Estimation Problem -- 40.4 Forward Estimation Problem -- 40.5 Time and Order-Update Relations -- 41. Lattice Filter Algorithms -- 41.1 Significance of Data Structure -- 41.2 A Posteriori-Based Lattice Filter -- 41.3 A Priori-Based Lattice Filter -- 42. Error-Feedback Lattice Filters -- 42.1 A Priori Error-Feedback Lattice Filter -- 42.2 A Posteriori Error-Feedback Lattice Filter -- 42.3 Normalized Lattice Filter -- 43. Array Lattice Filters -- 43.1 Order-Update of Output Estimation Errors -- 43.2 Order-Update of Backward Estimation Errors -- 43.3 Order-Update of Forward Estimation Errors -- 43.4 Significance of Data Structure -- Summary and Notes -- Problems and Computer Projects -- PART XI: ROBUST FILTERS -- 44. Indefinite Least-Squares -- 44.1 Indefinite Least-Squares -- 44.2 Recursive Minimization Algorithm -- 44.3 Time-Update of the Minimum Cost -- 44.4 Singular Weighting Matrices -- 44.A Stationary Points -- 44.B Inertia Conditions -- 45. Robust Adaptive Filters -- 45.1 A Posteriori-Based Robust Filters -- 45.2 ε-NLMS Algorithm -- 45.3 A Priori-Based Robust Filters -- 45.4 LMS Algorithm -- 45.A H∞ Filters -- 46. Robustness Properties -- 46.1 Robustness of LMS -- 46.2 Robustness of ε-NLMS.
46.3 Robustness of RLS -- Summary and Notes -- Problems and Computer Projects -- REFERENCES AND INDICES -- References -- Author Index -- Subject Index.
Restricted to subscribers or individual electronic text purchasers.
Adaptive filtering is a topic of immense practical and theoretical value, with applications in areas ranging from digital and wireless communications to biomedical systems. This book gives readers a gradual and solid introduction to the subject, its applications to a variety of topical problems, its existing limitations, and extensions of current theories. The book consists of eleven parts, each containing a series of focused lectures and ending with bibliographic comments, problems, and computer projects with MATLAB solutions.
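For orientation, the LMS algorithm of Lecture 10 is representative of the computer projects mentioned above. The following minimal MATLAB sketch is purely illustrative and not taken from the book; the filter length, step-size, and noise level are assumed values chosen for demonstration.

% Illustrative LMS sketch for channel identification (assumed parameters).
rng(0);                              % reproducible input and noise
M   = 5;                             % filter length (assumed)
N   = 2000;                          % number of samples (assumed)
mu  = 0.01;                          % step-size, kept small for stability
w_o = randn(M,1);                    % "true" unknown FIR channel
u   = randn(N,1);                    % white input sequence
w   = zeros(M,1);                    % initial weight estimate
for i = M:N
    ui = u(i:-1:i-M+1);              % regressor of the M most recent samples
    d  = w_o.'*ui + 0.01*randn;      % desired signal plus measurement noise
    e  = d - w.'*ui;                 % a priori estimation error
    w  = w + mu*e*ui;                % LMS weight update
end
disp([w_o w])                        % compare true and estimated weights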
Also available in print.
Mode of access: World Wide Web
Description based on PDF viewed 12/21/2015.