**Subject Code/Name:** MATH3871 - Bayesian Inference and Computation **Contact Hours:** 2 hours of lecture, 1 hour of tutorial, 1 hour of laboratory

**Assumed Knowledge:** MATH2801 or MATH2901, but the latter is strongly recommended. (Apparently the lecturer was told that MATH2931 was also a prerequisite when it was not, but fortunately he kept the 2931 content minimal. Even though it isn't mandatory, MATH2931 is still helpful.)

**Assessment:**

- 20% Group Assignment
- 15% Individual Assignment
- 5% Class Participation (not too hard to get)
- 60% Final Exam

**Lecture Recordings?** Mostly yes - at times Zdravko used the whiteboard, but not frequently.

**Notes/Materials Available**: Lecture slides (plus notes for the MCMC section) and tutorial/lab exercises were provided, but that was it. It felt insufficient at first, but turned out to be fine - you just had to be able to redo the tutorial exercises.

**Textbook:** Statistical Modeling and Computation, D.P. Kroese and J.C.C. Chan, Springer, 2014. It was not necessary, but it was still a decent textbook.

Also provided was the Handbook of Monte Carlo Methods, D.P. Kroese, T. Taimre, Z. Botev - it included some helpful techniques.

**Lecturer(s):** Dr. Zdravko Botev

**Year & Semester/Trimester of completion:** 18 s2

**Difficulty:** 3.5/5

**Overall Rating:** 4.5/5

**Your Mark/Grade:** 92 HD

**Comments:** This is one of the third-year electives for a Statistics major. Completing this course along with the three core courses earns accreditation with the Statistical Society of Australia.

Bayesian inference stems from a probabilistic approach to inference - it falls straight out of Bayes' rule. In the classical frequentist approach, the parameters to be estimated are fixed, whereas Bayesian approaches treat the parameter itself as a random variable, which invokes many more probabilistic techniques (credible intervals, hypothesis tests, the expectation of the parameter, the predictive distribution, etc.).
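To make the "parameter as a random variable" idea concrete, here is a minimal sketch using a Beta-Binomial model - my own choice of example, not necessarily one from the course. The Beta prior is conjugate to the binomial likelihood, so Bayes' rule gives the posterior in closed form:

```python
# Beta(a, b) prior on a coin's success probability p, then observe
# k successes in n Bernoulli trials. Conjugacy means the posterior
# is again a Beta: Beta(a + k, b + n - k).
def beta_binomial_posterior(a, b, k, n):
    return a + k, b + (n - k)

# Posterior mean of p, i.e. the Bayesian point estimate E[p | data].
def posterior_mean(a, b):
    return a / (a + b)

# Uniform Beta(1, 1) prior, 7 heads in 10 flips.
a_post, b_post = beta_binomial_posterior(1, 1, 7, 10)
print(a_post, b_post)                  # 8 4
print(posterior_mean(a_post, b_post))  # 8/12 ≈ 0.667
```

Everything the review mentions (credible intervals, predictive distributions, the posterior expectation above) is then computed from that Beta(8, 4) posterior.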

This course also introduced simulation techniques. Basic methods (inverse transform, acceptance/rejection) were covered, but Markov chain Monte Carlo was treated in much greater depth.
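As a sketch of the simplest of those methods, inverse transform sampling works by pushing a Uniform(0, 1) draw through the inverse CDF; the exponential distribution below is my own illustrative choice:

```python
import math
import random

# Inverse transform sampling: if U ~ Uniform(0, 1) and F is a CDF,
# then F^{-1}(U) has distribution F. For Exp(rate), the inverse CDF
# is F^{-1}(u) = -log(1 - u) / rate.
def sample_exponential(rate, rng=random.random):
    u = rng()
    return -math.log(1.0 - u) / rate

random.seed(0)
draws = [sample_exponential(2.0) for _ in range(100_000)]
print(sum(draws) / len(draws))  # ≈ 1/rate = 0.5
```

The same recipe applies to any distribution whose CDF you can invert; when you can't, that is where acceptance/rejection and MCMC come in.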

The computations in this course are quite interesting. On one hand, some are fairly straightforward thanks to the shortcuts you're introduced to in weeks 1 and 2. At other times, though, they get completely chaotic, and it feels a bit like a war trying to fight through all of it (cough, Bayes factors). Part of the course was recognising distributions, because that helped you simplify nasty integrals (including multivariate ones).

Those tricks were extremely convenient, though - they trivialised pretty much half of the computations you saw in this course.
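As an illustration of the "recognise the distribution" trick (my own example, not one from the course materials): if you spot that an integrand is an unnormalised Gamma density, the integral is just its normalising constant, with no integration by parts needed:

```latex
\int_0^\infty \theta^{\alpha-1} e^{-\beta\theta}\,d\theta
  = \frac{\Gamma(\alpha)}{\beta^{\alpha}},
\qquad \alpha, \beta > 0,
```

because $\theta^{\alpha-1} e^{-\beta\theta}$ is the kernel of a $\mathrm{Gamma}(\alpha, \beta)$ density, which must integrate to one once divided by $\Gamma(\alpha)/\beta^{\alpha}$. The same move with Beta, Normal, and inverse-Gamma kernels is what trivialises so many posterior and marginal-likelihood computations.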

The simulations were examined by making you do a few computations in advance and then write pseudocode. For example, with standard rejection sampling you had to use high-school optimisation to find the optimal enveloping constant. Beyond that, you mostly just had to adapt your distributions/values/etc. to the algorithm itself to write out the pseudocode, and there was no strict style guide for it either.
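A minimal sketch of that workflow, with a target of my own choosing: sample from Beta(2, 2), whose density is f(x) = 6x(1-x) on [0, 1], using a Uniform(0, 1) proposal. The "high-school optimisation" step is finding the enveloping constant M = max f(x): f'(x) = 6 - 12x = 0 at x = 1/2, so M = f(1/2) = 1.5.

```python
import random

# Rejection sampling from Beta(2, 2) with a Uniform(0, 1) proposal
# g(x) = 1. The optimal enveloping constant M = max_x f(x)/g(x)
# comes from basic calculus: f'(x) = 0 at x = 1/2, giving M = 1.5.
M = 1.5

def f(x):
    # Target density: Beta(2, 2).
    return 6.0 * x * (1.0 - x)

def sample_beta22(rng=random.random):
    while True:
        x = rng()           # propose from Uniform(0, 1)
        u = rng()           # uniform for the acceptance test
        if u * M <= f(x):   # accept with probability f(x) / (M * g(x))
            return x

random.seed(1)
draws = [sample_beta22() for _ in range(50_000)]
print(sum(draws) / len(draws))  # ≈ mean of Beta(2, 2) = 0.5
```

A smaller M means fewer rejections (the acceptance rate is 1/M), which is why the exam asked for the optimal constant rather than any valid envelope.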

Much like with combinatorics last semester, I found I actually liked this course despite its various difficult concepts. It helped that the tutorials/assignments/exam were all made fairer by the new lecturer (this used to be a 5/5-difficulty course). It was still pretty easy to get lost in the lectures, though, because the lecture examples were much harder to grasp (a lot of multivariate computations).

To do well in the exam, you did need to know all the definitions, techniques and tricks the course teaches - a bit of everything was asked.