Contents:
1. What is a kernel?
Part I. Upper Bounds:
2. Warm up
3. Inductive priorities
4. Crown decomposition
5. Expansion lemma
6. Linear programming
7. Hypertrees
8. Sunflower lemma
9. Modules
10. Matroids
11. Representative families
12. Greedy packing
13. Euler's formula
Part II. Meta Theorems:
14. Introduction to treewidth
15. Bidimensionality and protrusions
16. Surgery on graphs
Part III. Lower Bounds:
17. Framework
18. Instance selectors
19. Polynomial parameter transformation
20. Polynomial lower bounds
21. Extending distillation
Part IV. Beyond Kernelization:
22. Turing kernelization
23. Lossy kernelization
A complete introduction to recent advances in preprocessing analysis, or kernelization, with extensive examples using a single data set.
Fedor V. Fomin is Professor of Computer Science at the Universitetet i Bergen, Norway. He is known for his work in algorithms and graph theory. He has co-authored two books, Exact Exponential Algorithms (2010) and Parameterized Algorithms (2015), and received the EATCS Nerode Prize in 2015 and 2017 for his work on bidimensionality and Measure and Conquer.

Daniel Lokshtanov is Professor of Informatics at the Universitetet i Bergen, Norway. His main research interests are in graph algorithms, parameterized algorithms, and complexity. He is a co-author of Parameterized Algorithms (2015) and is a recipient of the Meltzer Prize, the Bergen Research Foundation young researcher grant, and an ERC Starting Grant on parameterized algorithms.

Saket Saurabh is Professor of Theoretical Computer Science at the Institute of Mathematical Sciences, Chennai, and Professor of Computer Science at the Universitetet i Bergen, Norway. He has made important contributions to every aspect of parameterized complexity and kernelization, especially to general-purpose results in kernelization and applications of extremal combinatorics in the design of parameterized algorithms. He is a co-author of Parameterized Algorithms (2015).

Meirav Zehavi is Assistant Professor of Computer Science at Ben-Gurion University. Her research interests lie primarily in the field of parameterized complexity. During her Ph.D. studies, she received three best student paper awards.
'Kernelization is one of the most important and most practical techniques coming from parameterized complexity. In parameterized complexity, kernelization is the technique of data reduction with a performance guarantee. From humble beginnings in the 1990s, it has now blossomed into a deep and broad subject with important applications and a well-developed theory. The time is right for a monograph on this subject. The authors are some of the leading lights in this area. This is an excellent and well-designed monograph, fully suitable for bringing both graduate students and practitioners to the state of the art. The authors are to be congratulated for this fine book.' Rod Downey, Victoria University of Wellington
'Kernelization is an important technique in parameterized complexity theory, in many cases supplying efficient algorithms for preprocessing an input to a problem and transforming it into a smaller one. The book provides a comprehensive treatment of this active area, starting with the basic methods and covering the most recent developments. This is a beautiful manuscript written by four leading researchers in the area.' Noga Alon, Princeton University, New Jersey and Tel Aviv University
'This book will be of great interest to computer science students
and researchers concerned with practical combinatorial
optimization, offering the first comprehensive survey of the
rapidly developing mathematical theory of pre-processing - a nearly
universal algorithmic strategy when dealing with real-world
datasets. Concrete open problems in the subject are nicely
highlighted.' Michael Fellows, Universitetet i Bergen, Norway
'The study of kernelization is a relatively recent development in algorithms research. With mathematical rigor, and giving the intuition behind the ideas, this book is an excellent and comprehensive introduction to this new field. It covers the entire spectrum of topics, from basic and advanced algorithmic techniques to lower bounds, and goes beyond these with meta-theorems and variations on the notion of kernelization. The book is suitable for students wanting to learn the field as well as for experts, both of whom will benefit from its full coverage of topics.' Hans L. Bodlaender, Universiteit Utrecht
'The book is well written and provides a wealth of examples to
illustrate concepts, while being succinct.' D. Papamichail,
Choice
'The book does a good job in several ways: it can serve as the first textbook on this flourishing area of research; it is also very useful for self-study, as it contains quite a number of exercises, with further pointers to the literature. In addition, it gives quite a good overview of the present state of the art and can therefore help researchers in the area discover results that they might have missed due to the speed with which the area has developed over the last decade.' Henning Fernau, MathSciNet
'This book studies the research area of kernelization, which consists of the techniques used for data reduction via pre-processing in order to speed up data analysis computations … the book explores very novel and complex ideas; it is well written, with attention to detail, and easy to follow. The book concludes with a useful list of relevant references.' Efstratios Rappos, zbMATH
'The book manages to present an incredible number of techniques, methods, and examples in its 528 pages. Each chapter ends with a bibliographic notes section, which often provides some brief historical context for the material covered. It also points to more current results and papers, although it does so very briefly. Together, this makes the textbook a valuable resource for researchers.' Tim Jackman and Steve Homer, SIGACT News