Fr. 75.00

Principles of Big Data - Preparing, Sharing, and Analyzing Complex Information

English · Paperback / Softback


Description


About the author

Jules Berman holds two Bachelor of Science degrees from MIT (in Mathematics and in Earth and Planetary Sciences), a PhD from Temple University, and an MD from the University of Miami. He was a graduate researcher at the Fels Cancer Research Institute (Temple University) and at the American Health Foundation in Valhalla, New York. He completed his postdoctoral studies at the US National Institutes of Health, and his residency at the George Washington University Medical Center in Washington, DC. Dr. Berman served as Chief of Anatomic Pathology, Surgical Pathology, and Cytopathology at the Veterans Administration Medical Center in Baltimore, Maryland, where he held joint appointments at the University of Maryland Medical Center and at the Johns Hopkins Medical Institutions. In 1998, he transferred to the US National Institutes of Health as a Medical Officer and as the Program Director for Pathology Informatics in the Cancer Diagnosis Program at the National Cancer Institute. Dr. Berman is a past President of the Association for Pathology Informatics and the 2011 recipient of the Association's Lifetime Achievement Award. He is a listed author of more than 200 scientific publications and has written more than a dozen books in his three areas of expertise: informatics, computer programming, and pathology. Dr. Berman is currently a freelance writer.

Blurb

Shows readers how to create and use Big Data safely and responsibly.

Summary

Helps readers avoid the common mistakes that endanger all Big Data projects. By stressing simple, fundamental concepts, this book teaches readers how to organize large volumes of complex data, and how to achieve data permanence when the content of the data is constantly changing.

List of contents

1. Big Data Moves to the Center of the Universe
2. Measurement
3. Annotation
4. Identification, De-identification, and Re-identification
5. Ontologies and Semantics: How information is endowed with meaning
6. Standards and their Versions
7. Legacy Data
8. Hypothesis Testing
9. Prediction
10. Software
11. Complexity
12. Vulnerabilities
13. Legalities
14. Social and Ethical Issues

Press reviews

"By stressing simple, fundamental concepts, this book teaches readers how to organize large volumes of complex data, and how to achieve data permanence when the content of the data is constantly changing. General methods for data verification and validation, as specifically applied to Big Data resources, are stressed throughout the book." --ODBMS.org, March 2014
"The book is written in a colloquial style and is full of anecdotes, quotations from famous people, and personal opinions." --ComputingReviews.com, February 2014
"The author has produced a sober, serious treatment of this emerging phenomenon, avoiding hype and gee-whiz cases in favor of concepts and mature advice. For example, the author offers ten distinctions between big data and small data, including such factors as goals, location, data structure, preparation, and longevity. This characterization provides much greater insight into the phenomenon than the standard 3V treatment (volume, velocity, and variety)." --ComputingReviews.com, October 2013
