Bayesian Networks

E-Book: An Introduction. Wiley Series in Probability and Statistics.

€80.99 (incl. VAT)
E-Book Download

Bibliographic Data
ISBN/EAN: 9781119964957
Language: English
Extent: 368 pp., 20.25 MB
Edition: 2nd edition, 2011
E-Book
Format: EPUB
DRM: Adobe DRM

Description

Bayesian Networks: An Introduction provides a self-contained introduction to the theory and applications of Bayesian networks, a topic of interest and importance for statisticians, computer scientists and those involved in modelling complex data sets. The material has been extensively tested in classroom teaching and assumes a basic knowledge of probability, statistics and mathematics. All concepts are carefully explained, with exercises provided throughout.

Features include:

- An introduction to the Dirichlet distribution, exponential families and their applications.
- A detailed description of learning algorithms and conditional Gaussian distributions using junction tree methods.
- A discussion of Pearl's intervention calculus, with an introduction to the notion of see and do conditioning.

All concepts are clearly defined and illustrated with examples and exercises. Solutions are provided online.
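As an illustration of the kind of material covered (conjugate Dirichlet–multinomial updating, developed in Sections 1.8–1.9 of the book), the core computation can be sketched in a few lines. This is our own illustrative example, not code from the book, and all function names are ours:

```python
# Illustrative sketch of Dirichlet-multinomial conjugate updating.
# With a Dirichlet(alpha) prior on category probabilities and observed
# multinomial counts n, the posterior is Dirichlet(alpha + n).

def dirichlet_update(alpha, counts):
    """Posterior Dirichlet parameters after observing multinomial counts."""
    return [a + n for a, n in zip(alpha, counts)]

def posterior_mean(alpha):
    """Posterior mean of each category probability under Dirichlet(alpha)."""
    total = sum(alpha)
    return [a / total for a in alpha]

# Uniform prior over three outcomes, then 10 observations with counts (6, 3, 1).
prior = [1.0, 1.0, 1.0]
posterior = dirichlet_update(prior, [6, 3, 1])
print(posterior)                  # [7.0, 4.0, 2.0]
print(posterior_mean(posterior))  # posterior means 7/13, 4/13, 2/13
```

The conjugacy means updating never leaves the Dirichlet family, which is what makes sequential Bayesian learning of the conditional probability tables in a network tractable.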

This book will prove a valuable resource for postgraduate students of statistics, computer engineering, mathematics, data mining, artificial intelligence, and biology.

Researchers and users of comparable modelling or statistical techniques such as neural networks will also find this book of interest.

About the Authors

Timo Koski, Professor of Mathematical Statistics, Department of Mathematics, Royal Institute of Technology, Stockholm, Sweden.

John M. Noble, Department of Mathematics, University of Linköping, Sweden.

Contents

Preface.

1 Graphical models and probabilistic reasoning.

1.1 Introduction.

1.2 Axioms of probability and basic notations.

1.3 The Bayes update of probability.

1.4 Inductive learning.

1.5 Interpretations of probability and Bayesian networks.

1.6 Learning as inference about parameters.

1.7 Bayesian statistical inference.

1.8 Tossing a thumb-tack.

1.9 Multinomial sampling and the Dirichlet integral.

Notes.

Exercises: Probabilistic theories of causality, Bayes rule, multinomial sampling and the Dirichlet density.

2 Conditional independence, graphs and d-separation.

2.1 Joint probabilities.

2.2 Conditional independence.

2.3 Directed acyclic graphs and d-separation.

2.4 The Bayes ball.

2.5 Potentials.

2.6 Bayesian networks.

2.7 Object oriented Bayesian networks.

2.8 d-Separation and conditional independence.

2.9 Markov models and Bayesian networks.

2.10 I-maps and Markov equivalence.

Notes.

Exercises: Conditional independence and d-separation.

3 Evidence, sufficiency and Monte Carlo methods.

3.1 Hard evidence.

3.2 Soft evidence and virtual evidence.

3.3 Queries in probabilistic inference.

3.4 Bucket elimination.

3.5 Bayesian sufficient statistics and prediction sufficiency.

3.6 Time variables.

3.7 A brief introduction to Markov chain Monte Carlo methods.

3.8 The one-dimensional discrete Metropolis algorithm.

Notes.

Exercises: Evidence, sufficiency and Monte Carlo methods.

4 Decomposable graphs and chain graphs.

4.1 Definitions and notations.

4.2 Decomposable graphs and triangulation of graphs.

4.3 Junction trees.

4.4 Markov equivalence.

4.5 Markov equivalence, the essential graph and chain graphs.

Notes.

Exercises: Decomposable graphs and chain graphs.

5 Learning the conditional probability potentials.

5.1 Initial illustration: maximum likelihood estimate for a fork connection.

5.2 The maximum likelihood estimator for multinomial sampling.

5.3 MLE for the parameters in a DAG: the general setting.

5.4 Updating, missing data, fractional updating.

Notes.

Exercises: Learning the conditional probability potentials.

6 Learning the graph structure.

6.1 Assigning a probability distribution to the graph structure.

6.2 Markov equivalence and consistency.

6.3 Reducing the size of the search.

6.4 Monte Carlo methods for locating the graph structure.

6.5 Women in mathematics.

Notes.

Exercises: Learning the graph structure.

7 Parameters and sensitivity.

7.1 Changing parameters in a network.

7.2 Measures of divergence between probability distributions.

7.3 The Chan-Darwiche distance measure.

7.4 Parameter changes to satisfy query constraints.

7.5 The sensitivity of queries to parameter changes.

Notes.

Exercises: Parameters and sensitivity.

8 Graphical models and exponential families.

8.1 Introduction to exponential families.

8.2 Standard examples of exponential families.

8.3 Graphical models and exponential families.

8.4 Noisy-or as an exponential family.

8.5 Properties of the log partition function.

8.6 Fenchel-Legendre conjugate.

8.7 Kullback-Leibler divergence.

8.8 Mean field theory.

8.9 Conditional Gaussian distributions.

Notes.

Exercises: Graphical models and exponential families.

9 Causality and intervention calculus.

9.1 Introduction.

9.2 Conditioning by observation and by intervention.

9.3 The intervention calculus for a Bayesian network.

9.4 Properties of intervention calculus.

9.5 Transformations of probability.

9.6 A note on the order of see and do conditioning.

9.7 The Sure Thing principle.

9.8 Back door criterion, confounding and identifiability.

Notes.

Exercises: Causality and intervention calculus.

10 The junction tree and probability updating.

10.1 Probability updating using a junction tree.

10.2 Potentials and the distributive law.

10.3 Elimination and domain graphs.

10.4 Factorization along an undirected graph.

10.5 Factorizing along a junction tree.

10.6 Local computation on junction trees.

10.7 Schedules.

10.8 Local and global consistency.

10.9 Message passing for conditional Gaussian distributions.

10.10 Using a junction tree with virtual evidence and soft evidence.

Notes.

Exercises: The junction tree and probability updating.

11 Factor graphs and the sum-product algorithm.

11.1 Factorization and local potentials.

11.2 The sum-product algorithm.

11.3 Detailed illustration of the algorithm.

Notes.

Exercise: Factor graphs and the sum-product algorithm.

References.

Index.
