Improving the Dependability of Measurements, Calculations, Equipment, and Software
Author: I. R. Walker
Publisher: Cambridge University Press
Covering many techniques widely used in research, this book will help researchers in the physical sciences and engineering solve troublesome - and potentially very time-consuming - problems in their work. The book deals with technical difficulties that often arise unexpectedly during the use of various common experimental methods, as well as with human error. It provides preventive measures and solutions for such problems, thereby saving valuable time for researchers. Some of the topics covered are: sudden leaks in vacuum systems, electromagnetic interference in electronic instruments, vibrations in sensitive equipment, and bugs in computer software. The book also discusses mistakes in mathematical calculations, and pitfalls in designing and carrying out experiments. Each chapter contains a summary of its key points, to give a quick overview of important potential problems and their solutions in a given area.
A review of the empirical evidence shows that unreliability of research findings relating brain images to cognitive processes is widespread in cognitive neuroscience. Cognitive neuroscientists increasingly claim that brain images generated by new brain imaging technologies reflect, correlate with, or represent cognitive processes. In this book, William Uttal warns against these claims, arguing that, despite its utility in anatomic and physiological applications, brain imaging research has not provided consistent evidence for correlation with cognition. Uttal bases his argument on an extensive review of the empirical literature, pointing to variability in data not only among subjects within individual experiments but also in the new meta-analytical approach that pools data from different experiments. This inconsistency of results, he argues, has profound implications for the field, suggesting that cognitive neuroscientists have not yet proven their interpretations of the relation between brain activity captured by macroscopic imaging techniques and cognitive processes; what may have appeared to be correlations may have only been illusions of association. He supports the view that the true correlates are located at a much more microscopic level of analysis: the networks of neurons that make up the brain. Uttal carries out comparisons of the empirical data at several levels of data pooling, including the meta-analytical. He argues that although the idea seems straightforward, the task of pooling data from different experiments is extremely complex, leading to uncertain results, and that little is gained by it. Uttal's investigation suggests a need for cognitive neuroscience to reevaluate the entire enterprise of brain imaging-cognition correlational studies.
In the field of social work, qualitative research is starting to gain more prominence, as are mixed methods and various issues regarding race, ethnicity, and gender. These changes in the field are reflected and updated in The Handbook of Social Work Research Methods, Second Edition. This text covers meta-analysis and designs to evaluate treatment, and provides support to help students harness the power of the Internet. This handbook brings together leading scholars in research methods in social work.
Since the publication of the first edition of Content Analysis: An Introduction to Its Methodology, the textual fabric in which contemporary society functions has undergone a radical transformation -- namely, the ongoing information revolution. Two decades ago, content analysis was largely known in journalism and communication research, and, to a lesser extent, in the social and psychological sciences. Today, content analysis has become an efficient alternative to public opinion research -- a method of tracking markets, political leanings, and emerging ideas, a way to settle legal disputes, and an approach to explore individual human minds. The Second Edition of Content Analysis is a definitive sourcebook of the history and core principles of content analysis as well as an essential resource for present and future studies. The book introduces readers to ways of analyzing meaningful matter such as texts, images, and voices -- that is, data whose physical manifestations are secondary to the meanings that a particular population of people brings to them. Organized into three parts, the book examines the conceptual and methodological aspects of content analysis and also traces several paths through content analysis protocols. The author has completely revised and updated the Second Edition, integrating new information on computer-aided text analysis. The book also includes a practical guide that incorporates experiences in teaching and in advising academic and commercial researchers. In addition, Krippendorff clarifies the epistemology and logic of content analysis as well as the methods for achieving its aims.
Author Klaus Krippendorff discusses three distinguishing characteristics of contemporary content analysis: that it is fundamentally empirically grounded, exploratory in process, and predictive or inferential in intent; that it transcends traditional notions of symbols, contents, and intents; and that it has been forced to develop a methodology of its own, one that enables researchers to plan, execute, communicate, reproduce, and critically evaluate an analysis independent of the desirability of its results. Intended as a textbook for advanced undergraduate and graduate students across the social sciences, Content Analysis, Second Edition will also be a valuable resource for practitioners in a variety of disciplines.
"Covers a broad range of subjects that undergraduates in the discipline should be familiar and comfortable with upon graduation. From chapters on the scientific method and fundamental research concepts, to experimental design, sampling and statistical analysis, the text offers an excellent introduction to the key concepts of geographical research. The content is applicable for students at the beginning of their studies right through to planning and conducting dissertations. The book has also been of particular support in designing my level 1 and 2 tutorials which cover similar ground to several of the chapters." - Joseph Mallalieu, School of Geography, Leeds University "Montello and Sutton is one of the best texts I've used in seminars on research methodology. The text offers a clear balance of quantitative vs. qualitative and physical vs. human which I've found particularly valuable. The chapters on research ethics, scientific communication, information technologies and data visualization are excellent." - Kenneth E. Foote, Department of Geography, University of Colorado at Boulder This is a broad and integrative introduction to the conduct and interpretation of scientific research, covering both geography and environmental studies. 
Written for undergraduate and postgraduate students, it:
- Explains both the conceptual and the technical aspects of research, as well as all phases of the research process
- Combines approaches in physical geography and environmental science, human geography and human-environment relations, and geographic and environmental information techniques (such as GIS, cartography, and remote sensing)
- Combines natural and social scientific approaches common to subjects in geography and environmental studies
- Includes case studies of actual research projects to demonstrate the breadth of approaches taken
It will be core reading for students studying scientific research methods in geography, environmental studies and related disciplines such as planning and earth science.
"What a helpful book! This will be a 'friend' to many undergraduate students looking for clarification." - Helen Hazelwood, St Mary's University College "This is a great book that really helps the students understand research and the complex processes that can often daunt even the most intelligent students." - Phil Barter, Middlesex University "Few can bring research methods to life like Mike Atkinson. His breadth of research interests and experience mean he can introduce you to all you need to know and inspire you to get down to doing some research yourself." - Dominic Malcolm, Loughborough University This book systematically demonstrates the significance and application of research methods in plain language. Written for students, it contains the core methodological concepts, practices and debates they need to understand and apply research methods within the field of sport and exercise. It provides a comprehensive panoramic introduction which will reassure and empower students. Written by a leading academic and drawing on years of teaching experience, it includes carefully cross-referenced entries which critically engage with interdisciplinary themes and data. Each concept includes:
- clear definitions
- suggestions for further reading
- comprehensive examples
- practical applications
Pragmatic, lucid and concise, the book will provide essential support to students in sports studies, sport development, sport and exercise science, kinesiology and health.
Numerical software is used to test scientific theories, design airplanes and bridges, operate manufacturing lines, control power plants and refineries, analyze financial derivatives, identify genomes, and provide the understanding necessary to derive and analyze cancer treatments. Because of the high stakes involved, it is essential that results computed using software be accurate, reliable, and robust. Unfortunately, developing accurate and reliable scientific software is notoriously difficult. This book investigates some of the difficulties related to scientific computing and provides insight into how to overcome them and obtain dependable results. The tools to assess existing scientific applications are described, and a variety of techniques that can improve the accuracy and reliability of newly developed applications are discussed. Accuracy and Reliability in Scientific Computing can be considered a handbook for improving the quality of scientific computing. It will help computer scientists address the problems that affect software in general as well as the particular challenges of numerical computation: approximations occurring at all levels, continuous functions replaced by discretized versions, infinite processes replaced by finite ones, and real numbers replaced by finite-precision numbers. Divided into three parts, the book starts by illustrating some of the difficulties in producing robust and reliable scientific software. Well-known cases of failure are reviewed, and the what and why of numerical computations are considered. The second section describes diagnostic tools that can be used to assess the accuracy and reliability of existing scientific applications. In the last section, the authors describe a variety of techniques that can be employed to improve the accuracy and reliability of newly developed scientific applications. The authors of the individual chapters are international experts, many of them members of the IFIP Working Group on Numerical Software.
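As a minimal sketch of the kind of pitfall this blurb alludes to when real numbers are replaced by finite-precision numbers (the function and rearrangement here are standard illustrations, not taken from the book itself), consider evaluating (1 - cos x)/x^2 for small x. The direct formula subtracts two nearly equal numbers and loses almost all accuracy, while an algebraically equivalent form avoids the cancellation:

```python
import math

def unstable(x):
    # Direct formula (1 - cos x) / x^2: for small x, cos x is so
    # close to 1 that the subtraction cancels nearly every
    # significant digit (catastrophic cancellation).
    return (1.0 - math.cos(x)) / (x * x)

def stable(x):
    # Equivalent identity 2*sin^2(x/2) / x^2 avoids subtracting
    # nearly equal quantities, so full precision is retained.
    s = math.sin(x / 2.0)
    return 2.0 * s * s / (x * x)

x = 1.0e-8
# The true value tends to 1/2 as x -> 0; the direct formula
# returns a wildly inaccurate result, the rearranged one does not.
print(unstable(x))
print(stable(x))
```

Both functions are mathematically identical, yet in double precision the first loses the answer entirely for small arguments; detecting and repairing exactly this sort of silent accuracy loss is the subject matter the book's diagnostic and improvement techniques address.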
Essay from the year 2010 in the subject Politics - Methods, Research, grade: 1,3 (77%), University of Warwick (Politics and International Studies), course: Qualitative Research Methods, language: English, abstract: Validity and reliability as quality indicators have an uneasy standing in qualitative research and are subject to numerous debates. Researchers from different paradigmatic backgrounds expressed a variety of views, the extremes ranging from a complete denial of the possibility of valid and reliable qualitative research on one hand to the rejection of validity and reliability as meaningful quality indicators on the other. The following essay acknowledges the diverging assumptions underlying the different paradigms associated with quantitative and qualitative research. However, it denies that validity and reliability are inherently connected to predetermined ontological or epistemological assumptions and argues for their general use as quality indicators. To clarify this claim, a selection of different paradigms and the development of alternative quality indicators within them are highlighted. Since the usefulness of this multitude of indicators is questionable, reconciliation is attempted by consolidating them. The concepts of “core validity” and “core reliability”, which can be specified according to the researcher’s paradigm, are introduced for this task. These concepts underline the relevance and applicability of validity and reliability as quality indicators in qualitative research. Furthermore, qualitative research has developed strategies and methods, which enable the researcher to address negative influences on validity and reliability and achieve high degrees of both.
Kirk and Miller define what is -- and what is not -- qualitative research. They suggest that the use of numbers in the process of recording and analyzing observations is less important than that the research should involve sustained interaction with the people being studied, in their own language and on their own turf. Following a chapter on objectivity, the authors discuss the role of reliability and validity and the problems that arise when these issues are neglected. They present a paradigm for the qualitative research process that makes it possible to pursue validity without neglecting reliability.