Forthcoming titles in the ACM Books Series are subject to change and will be published as they become available, with 25 titles in each Collection. Upon publication, each of the following books will appear in the ACM Digital Library and be accessible in both PDF and ePub formats to those with full-text access. Individual titles will be available for purchase at Amazon and Barnes & Noble. Please click on the title names below for more information.
Coming Soon
Aurèle Barrière
This book outlines a methodology for developing formally verified Just-in-Time compilers. Just-in-Time compilation is a technique for executing programs in which execution is interleaved with optimization of the program itself. These compilers often deliver fast execution, so much so that their use has grown considerably for dynamic programming languages. Most modern web browsers use Just-in-Time compilation to speed up the JavaScript programs they run.
However, the techniques used in Just-in-Time compilers can be particularly complex, and this complexity can be a source of bugs and vulnerabilities. How can you make sure that your Just-in-Time compiler is bug-free? For traditional ahead-of-time compilers, many techniques have been developed to prevent compilation bugs. One such technique is formally verified compilation, where the compiler itself comes with a proof that the semantics of the compiled program correspond to the semantics of the source program. But Just-in-Time compilers are more recent, less well understood, and have been the target of far fewer verification efforts.
To bring formal verification to Just-in-Time compilation, the book identifies a set of specific verification challenges and presents novel solutions for each of them. Such challenges include dynamic optimizations, speculative optimizations, deoptimizations, and the interleaving of interpretation and machine code generation. The author repurposes proof techniques from formally verified ahead-of-time compilers like CompCert. Following this methodology, readers can develop Just-in-Time compilers and formally prove that they behave as prescribed by the semantics of the program they execute. All proofs within the book have been mechanized in the Coq proof assistant.
Omar Alonso and Ricardo Baeza-Yates
In the last decade, deep learning and word embeddings have made significant impacts on information retrieval by adding techniques based on neural networks and language models. At the same time, certain search modalities like neural IR and conversational search have become more popular. This book, written by international academic and industry experts, brings the field up to date with detailed discussions of these new approaches and techniques. The book is organized into three sections: Foundations, Adaptations and Concerns, and Verticals.
Under Foundations, we address topics that form the basic structure of any modern IR system, where new techniques have been developed to augment indexing, retrieval, and ranking. Neural IR, recommender systems, evaluation, query-driven functionality, and knowledge graphs are covered in this section.
IR systems need to adapt to specific user characteristics and preferences, and techniques that were considered too niche a few years ago are now standard system design considerations. The Adaptations and Concerns section covers the following topics: conversational search, cross-language retrieval, temporal extraction and retrieval, bias in retrieval systems, and privacy in search.
While web search engines are the most popular information access point, there are cases where specific verticals provide a better experience in terms of content and relevance. The Verticals section describes eCommerce, professional search, personal collections, music retrieval, and biomedicine as examples.
In Development
Author(s): Ayan Mukhopadhyay and Yevgeniy Vorobeychik
Abstract:
The collection of essays on Artificial Intelligence and Society provides the first systematic and comprehensive coverage of how the field of Artificial Intelligence (AI) has impacted society. The book is centered on five sections: sustainability; health; smart cities and urban planning; privacy and security; and fairness, accountability, and transparency. Each section will consist of several tutorial-style essays discussing specific societal challenges, relevant background material (mathematical and algorithmic), a brief literature review, the core algorithmic content, and a description of how the societal challenge has been addressed. The ultimate aim of this edited volume is to enlighten students about how AI algorithms and data-driven modeling have impacted society from a diverse range of perspectives and impact areas.
Abstract:
Behavior Informatics is emerging as an important tool for discovering behavior intelligence and behavior insight. As a computational concept, behavior captures aspects of the demographics of behavioral subjects and objects. Accordingly, a behavior model captures the subject and object of a behavior or behavior sequence, the activities conducted by the subject on the objects, and the relationships between those activities. This book introduces some of the real-life applications of behavior informatics in core business, capital markets, and government services.
Abstract:
This book describes recent efforts in designing techniques for indexing and mining truly massive collections of data series, which will enable scientists to easily analyze their data, and presents a vision for the future of big sequence management research.
Author(s): Mark J. P. Wolf
Abstract:
This book is a comprehensive history of computer graphics in Hollywood cinema. As the first book of its kind, it is an essential reference for anyone interested in the history of cinema, visual effects, or computer graphics, and the industries of which they are a part.
The book begins with a look at the history behind the calculation of images, from weaving to screen imaging, and at the faux computer graphics used in movies before real computer graphics were available or affordable. Next, the author examines the rise of computer graphics and computer-animated films and the gradual introduction of computer-generated imagery into the cinema, moving chronologically through the development of computer-generated animation and its use for both cartoonlike and photorealistic imagery. The author discusses behind-the-scenes uses of computer graphics in the film industry and how these uses have shaped both the kinds of imagery made and the technologies by which they are made. The book also looks at how computer animation is combined with approaches such as procedural generation and simulation, and at the ways in which computers automate imagery.
Throughout the book, the histories of individuals, companies, films, and computer graphics techniques are explored in detail, as well as changes in the visual effects (VFX) industry itself over time.
Calculated Imagery: A History of Computer Graphics in Hollywood Cinema is for anyone interested in how CG changed the VFX industry, film history, and filmmaking overall, and the people, companies, and techniques that made it happen.
Author(s): Pat Helland
Abstract:
This book in the ACM Turing Award series focuses on the life and contributions of Jim Gray. Jim was dedicated to the mentoring, nurturing, and development of individuals and the scientific community, with a special emphasis on computer science education, and his curiosity and passion for learning led him into new and uncharted areas within systems computing. The book follows Jim’s exploration of eight major areas of systems computing: systems, transactions, databases, availability, performance, sort, scale, and eScience.
Abstract:
Historically, a lack of transparency and low standards for e-voting system implementations have resulted in a culture where open source code is seen as a gold standard; however, this ignores the issue of the comprehensibility of that code. This book makes concrete empirical recommendations based on experiences and findings from examining the source code of e-voting systems. It highlights that any solution used for significant elections should be well designed, carefully analyzed, deftly built, accurately documented, and expertly maintained.
Author(s): Rocco Servedio
Abstract:
ACM Books is pleased to announce the signing of a new book in our Turing Award series, Foundations of Computation and Machine Learning: The Work of Leslie Valiant, edited by Rocco Servedio of Columbia University.
Valiant received the prestigious ACM Turing Award in 2010 for "transformative contributions to the theory of computation, including the theory of probably approximately correct (PAC) learning, the complexity of enumeration and of algebraic computation, and the theory of parallel and distributed computing." The book will feature a short biography of Valiant, as well as analysis of his seminal works by today's leading computer scientists.
Abstract:
This book is an introduction to data structures and algorithms for functional languages, with a focus on proofs. It covers both functional correctness and running time analysis.
Abstract:
Aayush Jain is the recipient of the 2022 ACM Doctoral Dissertation Award for his dissertation “Indistinguishability Obfuscation From Well-Studied Assumptions,” which established the feasibility of mathematically rigorous software obfuscation from well-studied hardness conjectures. The mathematical object that Jain’s thesis constructs, indistinguishability obfuscation, is considered a theoretical “master tool” in the context of cryptography—not only in helping achieve long-desired cryptographic goals such as functional encryption, but also in expanding the scope of the field of cryptography itself.
Author(s): Uday Khedker
Abstract:
Pointer analysis provides information to disambiguate indirect reads and writes of data through pointers and indirect control flow through function pointers or virtual functions. The book focuses on fundamental concepts instead of trying to cover the entire breadth of the literature on pointer analysis; bibliographic notes point the reader to relevant literature for more details. Rather than being driven purely by the practical effectiveness of pointer analyses, the book develops the concepts from first principles based on language features, brings out the interactions of different abstractions at the level of ideas, and finally relates them to practical observations and the nature of practical programs.
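As a minimal illustration of the kind of question a pointer analysis answers, consider the hypothetical C fragment below (not taken from the book); the points-to facts noted in the comments are what a simple flow-insensitive analysis would be expected to report.

```c
#include <stdio.h>

/* Indirect data access: a flow-insensitive points-to analysis concludes
 * that p may point to { x, y }, so the write through *p may modify either
 * variable, and the compiler cannot assume x is still 1 at the printf
 * without disambiguation. */
void example(int flag) {
    int x = 1, y = 2;
    int *p = flag ? &x : &y;   /* p -> { x, y } */
    *p = 42;                   /* writes x or y through the pointer */
    printf("%d %d\n", x, y);
}

/* Indirect control flow: resolving which function fp may invoke requires
 * points-to facts for the function pointer. */
static void inc(int *v) { (*v)++; }
static void dec(int *v) { (*v)--; }

void dispatch(int up, int *counter) {
    void (*fp)(int *) = up ? inc : dec;   /* fp -> { inc, dec } */
    fp(counter);                          /* indirect call */
}

int main(void) {
    int c = 0;
    example(1);
    dispatch(1, &c);
    printf("counter = %d\n", c);
    return 0;
}
```

Disambiguating the write through `p` and resolving the indirect call through `fp` correspond to the two uses of points-to information mentioned above: indirect reads and writes of data, and indirect control flow.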
Author(s): Boelie Elzen and Donald MacKenzie
Abstract:
This book describes the development and use of supercomputers in the period 1960-1996, a time known as the Seymour Cray Era. For more than three decades, Cray’s computer designs were seen as the yardstick against which all other efforts were measured. Initially, this yardstick was sheer computing speed. However, the supercomputer world gradually became more complex and other factors became equally important.
The initial development of supercomputers was commissioned and financed by the Los Alamos and Lawrence Livermore National Laboratories, which had huge computational needs in connection with nuclear weapons development. The computers designed by Cray satisfied those needs and were also sold to a few dozen other big research organizations and weather agencies. In the 1970s and 1980s, a variety of companies started to compete with the Cray designs by offering supercomputers that used a new architectural approach, massively parallel processing (MPP). This new architecture, based on tens of thousands of simple microprocessors, subsequently began to dominate high-performance computing and marked the end of the Seymour Cray Era.
This book is important reading for anyone working in the area of high-performance computing, providing essential historical context for the work of a legendary pioneer and the computers he became famous for designing. It will also be valuable to students of computing history and, more generally, to readers interested in the history of science and technology. For advanced students, the book illustrates how innovation in its very essence is a socio-technical process: not just a matter of developing the “best technology,” but also of making appropriate choices concerning the interaction of human and technical factors in product design.