Satellite Imagery. From acquisition principles to processing of optical images for observing the Earth
This book was written for students and engineers wishing to understand the basic principles behind the acquisition of optical imagery for Earth observation and the ways in which the quality of the images can be optimised.
The book describes a very wide range of subjects from fundamental physics (radiation, electronics, optics) to applied mathematics (frequency analysis), geometry and technological issues.
Intended for both designers and downstream users, the book begins with a detailed explanation of the physical principles involved when a satellite acquires an optical image, then goes on to discuss image processing, its limits, and the ultimate performance obtained.
It also covers in depth the problems to be solved when designing and dimensioning observation systems, so that the reader becomes familiar with the various processes implemented for acquiring an optical image.
It draws on work done over many years by engineers from CNES (the French Space Agency), the IGN (the French National Geographic Institute) and ONERA (the French Aerospace Laboratory) in the field of satellite optical imagery.
Reference: 1036
Number of pages: 492
Format: 17 x 24 cm
Binding: Paperback
Author: CNES
I. INTRODUCTION
Philippe LIER (CNES), Christophe VALORGE (CNES)
I.1. Some history
I.2. What is remote sensing?
I.2.1. Definition
I.2.2. What is a 'digital image'?
I.2.3. What is 'Image Quality'?
I.2.4. Ground processing to correct for remote-sensing effects
I.3. Some examples of Earth observation applications
I.3.1. Meteorology
I.3.2. Mapping
I.3.3. Intelligence gathering
I.3.4. Monitoring natural disasters
I.3.5. Scientific applications
I.4. A panorama of several Earth observation missions
I.4.1. The KEYHOLE satellites of the CORONA programme
I.4.2. The Landsat family, example: Landsat 7
I.4.3. The SPOT family
I.4.4. PLEIADES
I.4.5. American commercial satellites
I.4.6. Vegetation
I.4.7. POLDER
I.4.8. ScaRaB
I.4.9. CALIPSO’s Infrared Camera
I.5. Scope of this book
II. IMAGE GEOMETRY
Jean Marc DELVIT (CNES), Daniel GRESLOU (CNES), Sylvia SYLVANDER (IGN), Christophe VALORGE (CNES)
II.1. Introduction
II.1.1. Chapter outline
II.1.2. Introduction to direct location
II.2. Prerequisites: Space and Time Reference frames
II.2.1. Stating the problem
II.2.2. Reference frames and object-centred coordinate systems
II.2.3. From the Earth to the stars
II.2.4. Space reference frames
II.2.5. The time references
II.2.6. Changing reference frames
II.3. Geometric principles of acquisition
II.3.1. The different types of sensor
II.3.2. Time-stamping images
II.3.3. Satellite orbits
II.3.4. Satellite attitude
II.4. Geometric modelling of the scene
II.4.1. General principle
II.4.2. Review of conic geometry
II.4.3. Physical modelling of the scene
II.4.4. Analytical modelling of the viewing geometry
II.4.5. Refining the geometric viewing model
II.5. Geometrical processing
II.5.1. Geometrical corrections
II.5.2. Image matching and correlation
II.5.3. 'Downstream' geometric processing
II.6. Geometric image quality
II.6.1. Introduction
II.6.2. User requirements and GIQ
II.6.3. In-flight geometric image quality
II.6.4. Summary of requirements and GIQ performance
II.7. Essential geometrical formulations
II.7.1. Notations
II.7.2. Basic formulae
II.7.3. Detector projection
II.8. Bibliographical references
III. RADIOMETRY
Alain BARDOUX (CNES), Xavier BRIOTTET (ONERA), Bertrand FOUGNIE (CNES), Patrice HENRY (CNES), Sophie LACHERADE (ONERA), Laurent LEBEGUE (CNES), Philippe LIER (CNES), Christophe MIESCH (ONERA), Françoise VIALLEFONT (ONERA)
III.1. Introduction
III.2. Measurement physics
III.2.1. Introduction
III.2.2. Definition of radiative parameters
III.2.3. Optical properties of surfaces
III.2.4. The atmosphere
III.2.5. Analysis of radiance at sensor level
III.3. Acquisition principle: description of the on-board imaging system
III.3.1. Introduction
III.3.2. Optics
III.3.3. Detector system
III.3.4. Electronic system
III.4. Mathematical model of the image acquisition system
III.4.1. Calculation of irradiance over the focal plane
III.4.2. Calculating the number of electrons produced
III.4.3. Calculating the output signal expressed in digital counts
III.5. Radiometric modelling of the image acquisition process
III.5.1. Introduction
III.5.2. Example 1: the CALIPSO IIR radiometric model
III.5.3. Example 2: the SPOT radiometric model
III.5.4. Example 3: the PLEIADES-HR radiometric model
III.5.5. Example 4: the POLDER radiometric model
III.6. Calibration and measurement of radiometric performance
III.6.1. Introduction
III.6.2. Relative calibration in the field of view, or 'normalisation'
III.6.3. Absolute calibration
III.7. Radiometric resolution
III.7.1. Introduction
III.7.2. Example: PLEIADES radiometric noise model
III.7.3. Estimation of instrument noise
III.8. Summary and future prospects
III.9. References
IV. IMAGE RESOLUTION
Sébastien FOUREST (CNES), Philippe KUBIK (CNES), Christophe LATRY (CNES), Dominique LEGER (ONERA), Françoise VIALLEFONT (ONERA)
IV.1. Introduction
IV.2. Image spot and MTF
IV.2.1. Review of the theory of stationary linear systems
IV.2.2. Imagers
IV.2.3. Expression of the image spot and MTF
IV.2.4. Overall model
IV.3. Sampling
IV.3.1. The effects of sampling
IV.3.2. Impact on system design
IV.4. Image interpolation
IV.4.1. General introduction
IV.4.2. Classical interpolation
IV.4.3. 1D interpolating filters
IV.4.4. 2D interpolating filters
IV.4.5. Interpolation in the Fourier domain
IV.5. Treatments for improving resolution
IV.5.1. Introduction
IV.5.2. Deconvolution
IV.5.3. Denoising
IV.5.4. Panchromatic/multispectral fusion
IV.6. In-flight methods of measuring MTF and focusing errors
IV.6.1. Introduction
IV.6.2. Methods for measuring focus error
IV.6.3. Methods for measuring MTF
IV.6.4. Conclusion
IV.7. Conclusion
IV.8. Annexe 1: The Fourier transform
IV.8.1. The continuous Fourier transform
IV.8.2. Going from the continuous to the discrete world: sampling
IV.8.3. A suitable tool for the sampled world: the Discrete Fourier transform
IV.8.4. The finite discrete Fourier transform
IV.8.5. Summary: from continuous Fourier transform to finite discrete Fourier transform
IV.8.6. FDFT properties
IV.8.7. Use of the FDFT
IV.8.8. Conclusion
IV.9. Annexe 2: wavelets and packets
IV.9.1. Limitations of the frequency representation
IV.9.2. Wavelets
IV.10. Annexe 3: Interpolation and B-splines
IV.10.1. Basic properties of interpolating functions
IV.10.2. Spline construction
IV.11. Bibliography
V. SYSTEM DIMENSIONING
Philippe KUBIK (CNES)
V.1. Objective and definitions
V.2. Dimensioning principles
V.2.1. Geometry
V.2.2. Radiometry
V.2.3. Resolution
V.3. Design examples
V.3.1. SPOT-type mission, 10 m
V.3.2. Satellite with metre-scale resolution
V.4. Conclusions
VI. IMAGE COMPRESSION
Catherine LAMBERT (CNES), Christophe LATRY (CNES), Gilles MOURY (CNES)
VI.1. Introduction
VI.2. General overview of image compression
VI.3. Compression and image quality
VI.3.1. Inadequacy of the usual criteria
VI.3.2. Consideration of the overall onboard/ground image system
VI.3.3. User application criteria
VI.4. Diversity of compression techniques in the space field
VI.4.1. Predictive coding techniques
VI.4.2. DCT encoding techniques
VI.4.3. Lapped Orthogonal Transform (LOT)
VI.4.4. Wavelet transform compression
VI.4.5. Future prospects
VI.4.6. Bibliography
VII. IMAGE SIMULATION
Philippe LIER (CNES), Christophe VALORGE (CNES)
VII.1. The purpose of image simulation
VII.1.1. Review: the concept of 'Image Quality'
VII.1.2. Simulation: a design tool
VII.1.3. Simulation: an interface tool
VII.2. General principles of image simulation
VII.2.1. Simulating the input scene, either for the sensor or for pre-processing
VII.2.2. Simulating the sensor
VII.2.3. Simulating the ground processing
VII.2.4. Summary
VII.2.5. Examples of how this processing system is used at CNES
VII.2.6. The limitations of 'traditional' simulation
VII.2.7. Comments
VII.3. Image synthesis and 3D simulation
VII.3.1. Reminder: modelling scenes in '2.5D'
VII.3.2. Modelling scenes in 3D
VII.3.3. Pre-processing in 3D
VII.3.4. 3D simulation
VII.4. Outlook for image simulation
VIII. CONCLUSION
Philippe LIER (CNES)
VIII.1. The resolution race
VIII.2. Other criteria
VIII.2.1. The revisit interval
VIII.2.2. The spectral bands
VIII.2.3. Stereoscopic imagery
VIII.2.4. Operational capability
VIII.3. High resolution imagery for everyday use?