A leading expert in informal logic, Douglas Walton turns his attention in this new book to how reasoning operates in trials and other legal contexts, with special emphasis on the law of evidence. The new model he develops, drawing on methods of argumentation theory that are gaining wide acceptance in computing fields like artificial intelligence, can be used to identify, analyze, and evaluate specific types of legal argument. In contrast with approaches that rely on deductive and inductive logic and rule out many common types of argument as fallacious, Walton's aim is to provide a more expansive view of what can be considered "reasonable" in legal argument when it is construed as a dynamic, rule-governed, and goal-directed conversation. This dialogical model gives new meaning to the key notions of relevance and probative weight, with the latter analyzed in terms of pragmatic criteria for what constitutes plausible evidence rather than truth.
This book provides an overview of computer techniques and tools — especially from artificial intelligence (AI) — for handling legal evidence, police intelligence, crime analysis or detection, and forensic testing, with a sustained discussion of methods for modelling reasoning and forming an opinion about the evidence, methods for modelling argumentation, and computational approaches to dealing with legal, or any, narratives. By the 2000s, the modelling of reasoning on legal evidence had emerged as a significant area within the well-established field of AI & Law. An overview such as this one has never been attempted before. It offers a panoramic view of topics, techniques, and tools. It is more than a survey: topic after topic, the reader gets a closer view of approaches and techniques. One aim is to introduce practitioners of AI to the modelling of legal evidence. Another is to introduce legal professionals, as well as the more technically oriented among law enforcement professionals and researchers in police science, to information technology resources from which their respective fields stand to benefit. Computer scientists must not blunder into design choices that result in tools objectionable to legal professionals, so it is important to be aware of ongoing controversies. A survey is provided of argumentation tools and methods for reasoning about the evidence. Another class of tools considered here is intended to assist with organisational aspects of managing the evidence. Moreover, tools appropriate for crime detection, intelligence, and investigation include those based on link analysis and data mining. Concepts and techniques are introduced, along with case studies, as are areas in the forensic sciences. Special chapters are devoted to VIRTOPSY (a procedure for legal medicine) and FLINTS (a tool for the police). This is both an introductory book (possibly a textbook) and a reference for specialists from various quarters.
The use of argumentation methods in legal reasoning is a relatively new field of study. The book provides a survey of the leading problems and outlines how future research using argumentation-based methods shows great promise of leading to useful solutions. The problems studied include not only those of argument evaluation and argument invention, but also the analysis of specific kinds of evidence commonly used in law, such as witness testimony, circumstantial evidence, forensic evidence, and character evidence. New tools for analyzing these kinds of evidence are introduced.
In modern industrial democracies, the making of public policy depends on policy analysis--the generation, discussion, and evaluation of policy alternatives. Policy analysis is often characterized, especially by economists, as a technical, nonpartisan, objective enterprise, separate from the constraints of the political environment. However, says the eminent political scientist Giandomenico Majone, this characterization of policy analysis is seriously flawed. According to Majone, policy analysts do not engage in a purely technical analysis of alternatives open to policymakers, but instead produce policy arguments that are based on value judgments and are used in the course of public debate. In this book Majone offers his own definition of policy analysis and examines all aspects of it--from problem formulation and the choice of policy instruments to program development and policy evaluation. He argues that rhetorical skills are crucial for policy analysts when they set the norms that determine when certain conditions are to be regarded as policy problems, when they advise on technical issues, and when they evaluate policy. Policy analysts can improve the quality of public deliberation by refining the standards of appraisal of public programs and facilitating a wide-ranging dialogue among advocates of different criteria. In fact, says Majone, the essential need today is not to develop 'objective' measures of outcomes--the traditional aim of evaluation research--but to improve the methods and conditions of public discourse at all levels and stages of policy-making.
This monograph poses a series of key problems of evidential reasoning and argumentation. It then offers solutions achieved by applying recently developed computational models of argumentation made available in artificial intelligence. Each problem is posed in such a way that the solution is easily understood. The book progresses from confronting these problems and offering solutions to them, building a useful general method for evaluating arguments along the way. It provides a hands-on survey explaining how to use current argumentation methods and concepts, which are increasingly being implemented with greater precision in software tools for computational argumentation systems. It shows how the use of these tools and methods requires a new approach to the concepts of knowledge and explanation suitable for diverse settings, such as issues of public safety and health, debate, legal argumentation, forensic evidence, science education, and the use of expert opinion evidence in personal and public deliberations.
This book provides theoretical tools for evaluating the soundness of arguments in the context of legal argumentation. It deals with a number of general argument types and their particular use in legal argumentation. It provides detailed analyses of argument from authority, argument ad hominem, argument from ignorance, slippery slope argument and other general argument types. Each of these argument types can be used to construct arguments that are sound as well as arguments that are unsound. To evaluate an argument correctly one must be able to distinguish the sound instances of a certain argument type from its unsound instances. This book promotes the development of theoretical tools for this task.
Bridging the gap between applied ethics and ethical theory, Ethical Argumentation draws on recent research in argumentation theory to develop a more realistic model of how ethical justification actually works. Douglas Walton presents a new model of ethical argumentation in which ethical justification is analyzed as a defeasible form of argumentation considered in a balanced dialogue. Walton's new model employs techniques such as: asking the appropriate critical questions, probing accepted values, finding nonexplicit assumptions in an ethical argument, and deconstructing emotive terms and persuasive definitions. This book will be of significant interest to scholars and advanced students in applied ethics and theory.
This book examines the nature of evidence for character judgments, using a model of abductive reasoning called Inference to the Best Explanation. The book expands this notion based on recent work with models of reasoning drawn from argumentation theory and artificial intelligence. The aim is not just to show how character judgments are made, but how they should properly be made, based on sound reasoning that avoids common errors and superficial judgments.
This book presents a framework for translation-mediated forensic analysis, addressing problems that require special techniques, procedures, and methodologies not normally found in Forensic Linguistics, a recently developed branch of linguistics.