Geometry for Machine Learning With Engineering Applications: Theory, Examples, and Python Implementations (Computational Mathematics Library)
Format: Hardcover
In stock
1.24 kg
Yes
New
Amazon
USA
A rigorous bridge between modern differential geometry and deployable machine learning systems. This graduate-level text unifies Riemannian optimization, Lie theory, and equivariant representations into a coherent toolkit for building stable, symmetry-aware learning algorithms. It is written for readers who want proofs and implementation details side by side, and who intend to use both.

What sets this book apart
- Mathematically rigorous and implementation-focused: from definitions and theorems to practical algorithms that run on GPUs.
- Unified geometric perspective: curvature, connections, group actions, and spectra inform both optimization landscapes and model architectures.
- Engineering-grade case studies: robotics (SE(3) pose), signal processing (Grassmann subspaces), medical imaging (SPD statistics), molecular modeling (E(3)-equivariant networks), vision on spheres (SO(3) harmonics), and more.
- Designed for research and production: emphasizes numerical stability, invariances, and reproducibility.

Structure of every chapter
- Theory: precise definitions, lemmas, propositions, and theorems; geometric intuition without sacrificing rigor.
- Examples: worked case studies connecting the chapter's mathematics to real engineering tasks.
- Python implementation: reference code illustrating core abstractions (manifolds, metrics, exp/log maps, retractions, transports), optimizers, and equivariant layers; integrates cleanly with NumPy/PyTorch/JAX workflows.

What you will learn
- Differential and Riemannian geometry for ML: metrics, connections, geodesics, curvature, Jacobi fields, and comparison geometry, and how they shape optimization and generalization.
- Matrix manifolds and quotients: Stiefel/Grassmann manifolds, SPD cones, fixed-rank geometries; submersions, homogeneous spaces, and symmetry reduction for identifiability and efficiency.
- Lie groups and representation theory: exponential/log maps, adjoint actions, Haar integration; Peter–Weyl theory, characters, and intertwiners for designing equivariant layers.
- Equivariant deep learning: steerable convolutions on groups and homogeneous spaces; gauge-equivariant networks on meshes and surfaces; sampling and quadrature on manifolds.
- Optimization on manifolds: first- and second-order methods, stochastic/online variants, proximal and splitting schemes, trust-region and quasi-Newton methods, with convergence guarantees.
- Probabilistic and statistical geometry: information geometry and natural gradients; Wasserstein/Otto calculus; diffusion/score models and HMC on manifolds and Lie groups.
- Spectral and hyperbolic geometry: Laplace–Beltrami operators, heat kernels, Hodge theory, nonpositive curvature, and hyperbolic embeddings for hierarchical data.
- Software practice: manifold-aware autodiff, numerically safe exp/log maps and transports, differentiating through eigendecompositions, and testing equivariance.
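To make the manifold abstractions mentioned above concrete (metrics, exp/log maps, retractions, and a Riemannian gradient step), here is a minimal NumPy sketch for the unit sphere. The class and function names (`Sphere`, `riemannian_step`) are illustrative assumptions, not the book's actual API; they show the kind of interface such reference code typically exposes.

```python
import numpy as np

class Sphere:
    """Unit sphere S^{n-1} embedded in R^n (illustrative sketch, not the book's API)."""

    def proj(self, x, v):
        # Orthogonally project an ambient vector v onto the tangent space at x.
        return v - np.dot(x, v) * x

    def exp(self, x, v):
        # Riemannian exponential: follow the geodesic from x with initial velocity v.
        nrm = np.linalg.norm(v)
        if nrm < 1e-12:
            return x.copy()
        return np.cos(nrm) * x + np.sin(nrm) * (v / nrm)

    def log(self, x, y):
        # Inverse of exp: the tangent vector at x whose geodesic reaches y.
        w = self.proj(x, y - x)
        nrm = np.linalg.norm(w)
        if nrm < 1e-12:
            return w
        angle = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
        return angle * (w / nrm)

    def retract(self, x, v):
        # Cheaper first-order surrogate for exp: step in the ambient space,
        # then renormalize back onto the sphere.
        y = x + v
        return y / np.linalg.norm(y)

def riemannian_step(manifold, x, egrad, lr=0.1):
    # One Riemannian gradient-descent step: project the Euclidean gradient
    # to the tangent space, then retract the update back to the manifold.
    return manifold.retract(x, -lr * manifold.proj(x, egrad))
```

A quick sanity check of the kind the book's "testing equivariance" theme suggests: `log` should invert `exp`, and every update should land back on the manifold (unit norm).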
EASY IMPORT
By purchasing this product you can deduct the VAT with your RUT number.
DOES NOT COUNT AGAINST YOUR IMPORT ALLOWANCE
If your cart contains only books or CDs, it does not count against your duty-free allowance and you can buy up to US$1,000 per year.