Improving the Dependability of Measurements, Calculations, Equipment, and Software
Author: I. R. Walker
Publisher: Cambridge University Press
Covering many techniques widely used in research, this book will help researchers in the physical sciences and engineering solve troublesome, and potentially very time-consuming, problems in their work. The book deals with technical difficulties that often arise unexpectedly during the use of various common experimental methods, as well as with human error. It provides preventive measures and solutions for such problems, thereby saving valuable time for researchers. Some of the topics covered are: sudden leaks in vacuum systems, electromagnetic interference in electronic instruments, vibrations in sensitive equipment, and bugs in computer software. The book also discusses mistakes in mathematical calculations, and pitfalls in designing and carrying out experiments. Each chapter contains a summary of its key points, to give a quick overview of important potential problems and their solutions in a given area.
A review of the empirical evidence shows that unreliability of research findings relating brain images and cognitive processes is widespread in cognitive neuroscience. Cognitive neuroscientists increasingly claim that brain images generated by new brain imaging technologies reflect, correlate, or represent cognitive processes. In this book, William Uttal warns against these claims, arguing that, despite its utility in anatomic and physiological applications, brain imaging research has not provided consistent evidence for correlation with cognition. Uttal bases his argument on an extensive review of the empirical literature, pointing to variability in data not only among subjects within individual experiments but also in the new meta-analytical approach that pools data from different experiments. This inconsistency of results, he argues, has profound implications for the field, suggesting that cognitive neuroscientists have not yet proven their interpretations of the relation between brain activity captured by macroscopic imaging techniques and cognitive processes; what may have appeared to be correlations may have only been illusions of association. He supports the view that the true correlates are located at a much more microscopic level of analysis: the networks of neurons that make up the brain. Uttal carries out comparisons of the empirical data at several levels of data pooling, including the meta-analytical. He argues that although the idea seems straightforward, the task of pooling data from different experiments is extremely complex, leading to uncertain results, and that little is gained by it. Uttal's investigation suggests a need for cognitive neuroscience to reevaluate the entire enterprise of brain imaging-cognition correlational studies.
In the field of social work, qualitative research is gaining prominence, as are mixed methods and issues of race, ethnicity and gender. These changes in the field are reflected and updated in The Handbook of Social Work Research Methods, Second Edition. The text covers meta-analysis and designs for evaluating treatment, and provides support to help students harness the power of the Internet. This handbook brings together leading scholars in research methods in social work.
Since the publication of the first edition of Content Analysis: An Introduction to Its Methodology, the textual fabric in which contemporary society functions has undergone a radical transformation -- namely, the ongoing information revolution. Two decades ago, content analysis was largely known in journalism and communication research, and, to a lesser extent, in the social and psychological sciences. Today, content analysis has become an efficient alternative to public opinion research -- a method of tracking markets, political leanings, and emerging ideas, a way to settle legal disputes, and an approach to explore individual human minds. The Second Edition of Content Analysis is a definitive sourcebook of the history and core principles of content analysis as well as an essential resource for present and future studies. The book introduces readers to ways of analyzing meaningful matter such as texts, images, voices -- that is, data whose physical manifestations are secondary to the meanings that a particular population of people brings to them. Organized into three parts, the book examines the conceptual and methodological aspects of content analysis and also traces several paths through content analysis protocols. The author has completely revised and updated the Second Edition, integrating new information on computer-aided text analysis. The book also includes a practical guide that incorporates experiences in teaching and how to advise academic and commercial researchers. In addition, Krippendorff clarifies the epistemology and logic of content analysis as well as the methods for achieving its aims. Author Klaus Krippendorff discusses three distinguishing characteristics of contemporary content analysis: that it is fundamentally empirically grounded, exploratory in process, and predictive or inferential in intent; that it transcends traditional notions of symbols, contents, and intents; and that it has been forced to develop a methodology of its own, one that enables researchers to plan, execute, communicate, reproduce, and critically evaluate an analysis independent of the desirability of its results. Intended as a textbook for advanced undergraduate and graduate students across the social sciences, Content Analysis, Second Edition will also be a valuable resource for practitioners in a variety of disciplines.
"Covers a broad range of subjects that undergraduates in the discipline should be familiar and comfortable with upon graduation. From chapters on the scientific method and fundamental research concepts, to experimental design, sampling and statistical analysis, the text offers an excellent introduction to the key concepts of geographical research. The content is applicable for students at the beginning of their studies right through to planning and conducting dissertations. The book has also been of particular support in designing my level 1 and 2 tutorials which cover similar ground to several of the chapters." - Joseph Mallalieu, School of Geography, Leeds University "Montello and Sutton is one of the best texts I've used in seminars on research methodology. The text offers a clear balance of quantitative vs. qualitative and physical vs. human which I've found particularly valuable. The chapters on research ethics, scientific communication, information technologies and data visualization are excellent." - Kenneth E. Foote, Department of Geography, University of Colorado at Boulder This is a broad and integrative introduction to the conduct and interpretation of scientific research, covering both geography and environmental studies. Written for undergraduate and postgraduate students, it: Explains both the conceptual and the technical aspects of research, as well as all phases of the research process Combines approaches in physical geography and environmental science, human geography and human-environment relations, and geographic and environmental information techniques (such as GIS, cartography, and remote sensing) Combines natural and social scientific approaches common to subjects in geography and environmental studies Includes case studies of actual research projects to demonstrate the breadth of approaches taken It will be core reading for students studying scientific research methods in geography, environmental studies and related disciplines such as planning and earth science.
"What a helpful book! This will be a 'friend ' to many undergraduate students looking for clarification." - Helen Hazelwood, St Mary's University College "This is a great book that really helps the students understand research and the complex processes that can often daunt even the most intelligent students." - Phil Barter, Middlesex University "Few can bring research methods to life like Mike Atkinson. His breadth of research interests and experience mean he can introduce you to all you need to know and inspire you to get down to doing some research yourself." - Dominic Malcolm, Loughborough University This book systematically demonstrates the significance and application of research methods in plain language. Written for students, it contains the core methodological concepts, practices and debates they need to understand and apply research methods within the field of sport and exercise. It provides a comprehensive panoramic introduction which will reassure and empower students. Written by a leading academic and drawing on years of teaching experience, it includes carefully cross-referenced entries which critically engage with interdisciplinary themes and data. Each concept includes: clear definitions suggestions for further reading comprehensive examples practical applications Pragmatic, lucid and concise the book will provide essential support to students in sports studies, sport development, sport and exercise science, kinesiology and health.
Author: Jerome Kirk, Marc L. Miller
Kirk and Miller define what is -- and what is not -- qualitative research. They suggest that the use of numbers in the process of recording and analyzing observations is less important than that the research should involve sustained interaction with the people being studied, in their own language and on their own turf. Following a chapter on objectivity, the authors discuss the role of reliability and validity and the problems that arise when these issues are neglected. They present a paradigm for the qualitative research process that makes it possible to pursue validity without neglecting reliability.
Numerical software is used to test scientific theories, design airplanes and bridges, operate manufacturing lines, control power plants and refineries, analyze financial derivatives, identify genomes, and provide the understanding necessary to derive and analyze cancer treatments. Because of the high stakes involved, it is essential that results computed using software be accurate, reliable, and robust. Unfortunately, developing accurate and reliable scientific software is notoriously difficult. This book investigates some of the difficulties related to scientific computing and provides insight into how to overcome them and obtain dependable results. The tools to assess existing scientific applications are described, and a variety of techniques that can improve the accuracy and reliability of newly developed applications are discussed. Accuracy and Reliability in Scientific Computing can be considered a handbook for improving the quality of scientific computing. It will help computer scientists address the problems that affect software in general as well as the particular challenges of numerical computation: approximations occurring at all levels, continuous functions replaced by discretized versions, infinite processes replaced by finite ones, and real numbers replaced by finite precision numbers. Divided into three parts, it starts by illustrating some of the difficulties in producing robust and reliable scientific software. Well-known cases of failure are reviewed and the what and why of numerical computations are considered. The second section describes diagnostic tools that can be used to assess the accuracy and reliability of existing scientific applications. In the last section, the authors describe a variety of techniques that can be employed to improve the accuracy and reliability of newly developed scientific applications. The authors of the individual chapters are international experts, many of them members of the IFIP Working Group on Numerical Software.
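As a concrete illustration of two of those substitutions (standard textbook examples, not taken from the book itself), the following Python snippet shows how binary floating point misrepresents simple decimals and how subtracting nearly equal numbers destroys accuracy:

    import math

    # Real numbers replaced by finite precision: 0.1 has no exact
    # binary representation, so the rounded sum differs from 0.3.
    print(0.1 + 0.2 == 0.3)   # False
    print(0.1 + 0.2)          # 0.30000000000000004

    # Catastrophic cancellation: for tiny x, 1 - cos(x) subtracts two
    # nearly equal numbers and loses every significant digit, while the
    # algebraically equivalent form 2*sin(x/2)**2 stays accurate.
    x = 1e-8
    print(1.0 - math.cos(x))           # 0.0 -- all digits lost
    print(2.0 * math.sin(x / 2)**2)    # ~5e-17, the correct value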
Low-income Mothers' Empowerment Through Participation
Author: Barbara J. Peters
Publisher: Taylor & Francis
Category: Social Science
Drawing on her own experience many years ago and on interviews with more recent mothers of children in the Headstart program of a community in the upper Midwest, Peters explains how staff members can use the program to help parents become better at the task of parenting and to enhance the parents' self-esteem, so that they can effect change in their environment and eventually move out of poverty.
Statistics for Sport and Exercise Studies guides the student through the full research process, from selecting the most appropriate statistical procedure, to analysing data, to the presentation of results, illustrating every key step in the process with clear examples, case-studies and data taken from real sport and exercise settings. Every chapter includes a range of features designed to help the student grasp the underlying concepts and relate each statistical procedure to their own research project, including definitions of key terms, practical exercises, worked examples and clear summaries. The book also offers an in-depth and practical guide to using SPSS in sport and exercise research, the most commonly used data analysis software in sport and exercise departments. In addition, a companion website includes more than 100 downloadable data sets and work sheets for use in or out of the classroom, full solutions to exercises contained in the book, plus over 1,300 PowerPoint slides for use by tutors and lecturers. Statistics for Sport and Exercise Studies is a complete, user-friendly introduction to the use of statistical tests, techniques and procedures in sport, exercise and related subjects. Visit the companion website at: www.routledge.com/cw/odonoghue
Great Britain: Parliament: House of Commons: Science and Technology Committee
Report, Together with Formal Minutes, Oral and Written Evidence
Author: Great Britain: Parliament: House of Commons: Science and Technology Committee
Publisher: The Stationery Office
Category: Great Britain
This report finds that the oversight of research integrity in the UK is unsatisfactory. The Science and Technology Committee concludes that, in order to allow others to repeat and build on experiments, researchers should aim for the gold standard of making their data fully disclosed and publicly available. The report examines the current peer-review system as used in scientific publications and the related issues of research impact, data management, publication ethics and research integrity. The UK does not appear to have an oversight body for research integrity covering advice and assurance functions across all disciplines, and the Committee recommends the creation of an external regulator. It also says all UK research institutions should have a specific member of staff leading on research integrity. The report highlights concerns about the use of the journal Impact Factor as a proxy measure for the quality of research or of individual articles. Innovative ways to improve current pre-publication peer-review practices are highlighted in the report, including the use of pre-print servers, open peer review, increased transparency and online repository-style journals. The growth of post-publication peer review and commentary also represents an enormous opportunity for experimentation with new media and social networking tools, which the Committee encourages. There should also be greater recognition, by both publishers and employers, of the work carried out by reviewers, sometimes considered to be a burden. To that end, publishers need to have in place systems for recording and acknowledging the contribution of those involved in peer review.
This book provides a comprehensive overview of recent advances in the analysis and design of health management systems for cooperating unmanned aerial vehicles. Such systems rely upon monitoring and fault adaptation schemes. Motivation for their study comes from the fact that, despite the use of fault-tolerant control software and hardware embedded onboard air vehicles, overall fleet performance may still be degraded after the occurrence of anomalous events such as systems faults and failures. Cooperative health management (CHM) systems seek to provide adaptation to the presence of faults by capitalizing on the availability of interconnected computing, sensing and actuation resources. This monograph complements the proposed CHM concepts by means of case studies and application examples. It presents fundamental principles and results encompassing optimization, systems theory, information theory, dynamics, modeling and simulation. Written by pioneers in cooperative control, health management and fault-tolerant control for unmanned systems, this book is a unique source of information for designers, researchers and practitioners interested in the field.
Bruce G. Carruthers, Stephen L. Schensul, Jean J. Schensul, Margaret Diane LeCompte
This work illustrates research conducted over a ten-year timespan and addresses a fundamental issue in reliability theory. Reliability still appears to be an empirically disorganized field, and the book suggests employing a deductive base in order to develop it as a science. The study is in line with the fundamental work of Gnedenko. Boris Vladimirovich Gnedenko (1912 - 1995) was a Soviet mathematician who made significant contributions in various scientific areas. His name is especially associated with studies of dependability, for which he is often recognized as the 'father' of reliability theory. In the last few decades, this area has expanded in new directions such as safety, security, risk analysis and other fields, yet the book 'Mathematical Methods in Reliability Theory', written by Gnedenko with Alexander Soloviev and Yuri Bélyaev, still stands as a pillar of the field's identity. The present book proceeds in the direction opened by the Russian authors' project; in particular, it identifies different trends in hazard rate functions by means of deductive logic and demonstration, and it arrives at multiple results by means of the entropy function, an original mathematical tool in the reliability domain. As such, it will greatly benefit specialists in the field who are interested in unconventional solutions.
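For reference, the central object in this line of work is the hazard rate of a lifetime distribution, defined in standard reliability theory (this is background, not a result specific to the book) by

    h(t) = \frac{f(t)}{1 - F(t)}, \qquad
    R(t) = 1 - F(t) = \exp\!\left( -\int_0^t h(u)\,du \right),

where f and F are the density and cumulative distribution of the time to failure and R is the survival (reliability) function. A constant hazard h(t) = \lambda corresponds to the memoryless exponential lifetime, while increasing and decreasing hazards describe wear-out and early-failure regimes; trends of this kind in the hazard rate function are what the book classifies deductively.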
With the rapid growth of Cloud computing, the size of Cloud data is expanding at a dramatic speed. A huge amount of data is generated and processed by Cloud applications, putting a higher demand on Cloud storage. While data reliability is a baseline requirement, data in the Cloud also needs to be stored in a highly cost-effective manner. This book focuses on the trade-off between data storage cost and data reliability assurance for big data in the Cloud. Across the whole Cloud data lifecycle, four major features are presented: first, a novel generic model for describing data reliability in the Cloud; second, a minimum replication calculation approach for meeting a given data reliability requirement, to facilitate data creation; third, a novel cost-effective data reliability assurance mechanism for big data maintenance, which can dramatically reduce the storage space needed in the Cloud; fourth, a cost-effective strategy for facilitating data creation and recovery, which can significantly reduce the energy consumption during data transfer. The book captures data reliability with variable disk failure rates and compares virtual with physical disks; offers methods for reducing Cloud-based storage cost and energy consumption; and presents a minimum replication benchmark against which replication-based data storage approaches can be evaluated for a given data reliability requirement.
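To make the idea of a minimum replication calculation concrete, here is a minimal sketch in Python assuming independent replica failures; it is an illustration only, not the book's model, and the function name and parameters are hypothetical:

    def min_replicas(p_fail, reliability_target, max_n=20):
        # p_fail: probability that a single replica is lost in the
        # period of interest (failures assumed independent).
        # reliability_target: required probability that at least one
        # replica survives, e.g. 0.999999.
        for n in range(1, max_n + 1):
            if 1.0 - p_fail ** n >= reliability_target:
                return n
        raise ValueError("target not reachable with max_n replicas")

    # Example: 1% chance of losing any one replica, 'six nines' target.
    print(min_replicas(0.01, 0.999999))  # prints 3, since 0.01**3 = 1e-6

The trade-off the book studies follows directly: each additional replica multiplies storage cost while shrinking the loss probability geometrically, so the cost-minimal choice is the smallest n that still meets the reliability requirement.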
The Guest Editors have assembled an international list of top experts to present the most current information to pediatricians about patient safety. The issue has a primarily clinical focus with a few articles addressing the business and practice of patient safety. Articles are devoted to the following topics: Developing performance standards and expectations for safety; The role of CPOE in patient safety; The role of smart infusion pumps on patient safety; Abstracted detection of adverse events in children; The role of effective communication (including handoffs) in patient safety; Reducing mortality resulting from adverse events; Optimizing standardization of case reviews (morbidity and mortality rounds) to promote patient safety; Impact of (resident) duty work hours on patient safety; Role of simulation in safety; The role of diagnostic errors in patient safety; The role of collaborative efforts to reduce hospital acquired conditions; Patient safety in ambulatory care; Role of FDA and pediatric safety; and Patient safety through the eyes of a parent.
The Role of Interval Methods in Scientific Computing
Author: Ramon E. Moore
Perspectives in Computing, Vol. 19: Reliability in Computing: The Role of Interval Methods in Scientific Computing presents a survey of the role of interval methods in reliable scientific computing, including vector arithmetic, language description, convergence, and algorithms. The selection takes a look at arithmetic for vector processors, FORTRAN-SC, and reliable expression evaluation in PASCAL-SC. Discussions focus on interval arithmetic, optimal scalar product, matrix and vector arithmetic, transformation of arithmetic expressions, development of FORTRAN-SC, and language description with examples. The text then examines floating-point standards, algorithms for verified inclusions, applications of differentiation arithmetic, and interval acceleration of convergence. The book ponders on solving systems of linear interval equations, interval least squares, existence of solutions and iterations for nonlinear equations, and interval methods for algebraic equations. Topics include interval methods for single equations, diagnosing collinearity, interval linear equations, effects of nonlinearity, and bounding the solutions. The publication is a valuable source of information for computer science experts and researchers interested in the role of interval methods in reliable scientific computing.
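To make the book's central tool concrete, here is a minimal sketch of interval arithmetic in Python; it is an illustration only, and unlike the rigorous systems surveyed in the book it omits the directed (outward) rounding needed to keep enclosures valid in floating point:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Interval:
        lo: float
        hi: float

        def __add__(self, other):
            # [a, b] + [c, d] = [a + c, b + d]
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __mul__(self, other):
            # [a, b] * [c, d]: take the min and max over all endpoint
            # products, which covers every sign combination.
            p = (self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi)
            return Interval(min(p), max(p))

    # The exact result of the operation on any points drawn from the
    # input intervals is guaranteed to lie inside the output interval.
    x = Interval(1.0, 2.0)
    y = Interval(-3.0, 4.0)
    print(x + y)   # Interval(lo=-2.0, hi=6.0)
    print(x * y)   # Interval(lo=-6.0, hi=8.0)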