diff --git a/.nojekyll b/.nojekyll index 64309386..c72921c2 100644 --- a/.nojekyll +++ b/.nojekyll @@ -1 +1 @@ -f1989b81 \ No newline at end of file +f1410d11 \ No newline at end of file diff --git a/Aalto2024.html b/Aalto2024.html index 3020a93a..16e57996 100644 --- a/Aalto2024.html +++ b/Aalto2024.html @@ -359,7 +359,7 @@

Schedule overview

7. Hierarchical models and exchangeability BDA3 Chapter 5 2023 Lecture 7.1,
2023 Lecture 7.2,
2022 Project info,
Slides 7 -Old Assignment 7 +Assignment 7 2024-10-28 2024-11-10 @@ -579,7 +579,7 @@

7) BDA3 Ch 5
  • Read the additional comments for Chapter 5
  • Check R demos or Python demos for Chapter 5
  • -
  • Make and submit Old Assignment 7. Deadline Sunday 2024-11-10 23:59 (two weeks for this assignment)
  • +
  • Make and submit Assignment 7. Deadline Sunday 2024-11-10 23:59 (two weeks for this assignment)
  • TA sessions 2024-10-30 14-16, 2024-10-31 12-14,
  • Highly recommended, but optional: Make BDA3 exercises 5.1 and 5.2 (model solution available for 5.3-5.5, 5.7-5.12)
  • Start reading Chapters 6-7 and additional material, see instructions below.
  • diff --git a/assignments/search.json b/assignments/search.json index ec2a9420..dd1315b3 100644 --- a/assignments/search.json +++ b/assignments/search.json @@ -37,7 +37,7 @@ "href": "template7.html", "title": "Notebook for Assignment 7", "section": "", - "text": "1 General information\nThis assignment relates to Lecture 7 and Chapter 5.\nWe recommend using JupyterHub (which has all the needed packages pre-installed).\n\nReading instructions:\n\nThe reading instructions for BDA3 Chapter 5.\n\n\n\n\n\n\n\n\nGeneral Instructions for Answering the Assignment Questions\n\n\n\n\n\n\nQuestions below are exact copies of the text found in the MyCourses quiz and should serve as a notebook where you can store notes and code.\nWe recommend opening these notebooks in the Aalto JupyterHub, see how to use R and RStudio remotely.\nFor inspiration for code, have a look at the BDA R Demos and the specific Assignment code notebooks\nRecommended additional self study exercises for each chapter in BDA3 are listed in the course web page. These will help to gain deeper understanding of the topic.\nCommon questions and answers regarding installation and technical problems can be found in Frequently Asked Questions (FAQ).\nDeadlines for all assignments can be found on the course web page and in MyCourses.\nYou are allowed to discuss assignments with your friends, but it is not allowed to copy solutions directly from other students or from internet.\nDo not share your answers publicly.\nDo not copy answers from the internet or from previous years. We compare the answers to the answers from previous years and to the answers from other students this year.\nUse of AI is allowed on the course, but the most of the work needs to by the student, and you need to report whether you used AI and in which way you used them (See points 5 and 6 in Aalto guidelines for use of AI in teaching).\nAll suspected plagiarism will be reported and investigated. See more about the Aalto University Code of Academic Integrity and Handling Violations Thereof.\nIf you have any suggestions or improvements to the course material, please post in the course chat feedback channel, create an issue, or submit a pull request to the public repository!\n\n\n\n\n\nif (!require(tidybayes)) {\n install.packages(\"tidybayes\")\n library(tidybayes)\n}\n\nLoading required package: tidybayes\n\n\nWarning in library(package, lib.loc = lib.loc, character.only = TRUE,\nlogical.return = TRUE, : there is no package called 'tidybayes'\n\n\nInstalling package into '/home/runner/work/_temp/Library'\n(as 'lib' is unspecified)\n\n\nalso installing the dependencies 'svUnit', 'arrayhelpers'\n\nif (!require(brms)) {\n install.packages(\"brms\")\n library(brms)\n}\n\nLoading required package: brms\n\n\nLoading required package: Rcpp\n\n\nLoading 'brms' package (version 2.22.0). Useful instructions\ncan be found by typing help('brms'). 
A more detailed introduction\nto the package is available through vignette('brms_overview').\n\n\n\nAttaching package: 'brms'\n\n\nThe following objects are masked from 'package:tidybayes':\n\n dstudent_t, pstudent_t, qstudent_t, rstudent_t\n\n\nThe following object is masked from 'package:stats':\n\n ar\n\nif (!require(metadat)) {\n install.packages(\"metadat\")\n library(metadat)\n}\n\nLoading required package: metadat\n\n\nWarning in library(package, lib.loc = lib.loc, character.only = TRUE,\nlogical.return = TRUE, : there is no package called 'metadat'\n\n\nInstalling package into '/home/runner/work/_temp/Library'\n(as 'lib' is unspecified)\n\n\nalso installing the dependency 'mathjaxr'\n\nif(!require(cmdstanr)){\n install.packages(\"cmdstanr\", repos = c(\"https://mc-stan.org/r-packages/\", getOption(\"repos\")))\n library(cmdstanr)\n}\n\nLoading required package: cmdstanr\n\n\nThis is cmdstanr version 0.8.1\n\n\n- CmdStanR documentation and vignettes: mc-stan.org/cmdstanr\n\n\n- CmdStan path: /home/runner/.cmdstan/cmdstan-2.35.0\n\n\n- CmdStan version: 2.35.0\n\ncmdstan_installed <- function(){\n res <- try(out <- cmdstanr::cmdstan_path(), silent = TRUE)\n !inherits(res, \"try-error\")\n}\nif(!cmdstan_installed()){\n install_cmdstan()\n}\n\n\n\n2 Simulation warm-up\nHere is the function to simulate and plot observations from a hierarchical data-generating process.\n\nhierarchical_sim <- function(group_pop_mean,\n between_group_sd,\n within_group_sd,\n n_groups,\n n_obs_per_group\n ) {\n # Generate group means\n group_means <- rnorm(\n n = n_groups,\n mean = group_pop_mean,\n sd = between_group_sd\n )\n\n # Generate observations\n\n ## Create an empty vector for observations\n y <- numeric()\n ## Create a vector for the group identifier\n group <- rep(1:n_groups, each = n_obs_per_group)\n \n for (j in 1:n_groups) {\n ### Generate one group observations\n group_y <- rnorm(\n n = n_obs_per_group,\n mean = group_means[j],\n sd = within_group_sd\n )\n ### Append the group observations to the vector\n y <- c(y, group_y)\n }\n\n # Combine into a data frame\n data <- data.frame(\n group = factor(group),\n y = y\n )\n\n # Plot the data\n ggplot(data, aes(x = y, y = group)) +\n geom_point() +\n geom_vline(xintercept = group_pop_mean, linetype = \"dashed\")\n}\n\nExample using the function:\n\nhierarchical_sim(\n group_pop_mean = 50,\n between_group_sd = 5,\n within_group_sd = 1,\n n_groups = 10,\n n_obs_per_group = 5\n )\n\nError in ggplot(data, aes(x = y, y = group)): could not find function \"ggplot\"\n\n\n\n\n3 Sleep deprivation\nThe dataset sleepstudy is available by using the command data(sleepstudy, package = \"lme4\")\nBelow is some code for fitting a brms model. This model is a simple pooled model. 
You will need to fit a hierarchical model as explained in the assignment, but this code should help getting started.\nLoad the dataset\n\ndata(sleepstudy, package = \"lme4\")\n\nError in find.package(package, lib.loc, verbose = verbose): there is no package called 'lme4'\n\n\nSpecify the formula and observation family:\n\nsleepstudy_pooled_formula <- bf(\n Reaction ~ 1 + Days,\n family = \"gaussian\",\n center = FALSE\n)\n\nWe can see the parameters and default priors with\n\nget_prior(pooled_formula, data = sleepstudy)\n\nError: object 'pooled_formula' not found\n\n\nWe can then specify the priors:\n\n(sleepstudy_pooled_priors <- c(\n prior(\n normal(400, 100),\n class = \"b\",\n coef = \"Intercept\"\n ),\n prior(\n normal(0, 50),\n class = \"b\",\n coef = \"Days\"\n ),\n prior(\n normal(0, 50),\n class = \"sigma\"\n )\n))\n\n prior class coef group resp dpar nlpar lb ub source\n normal(400, 100) b Intercept <NA> <NA> user\n normal(0, 50) b Days <NA> <NA> user\n normal(0, 50) sigma <NA> <NA> user\n\n\nAnd then fit the model:\n\nsleepstudy_pooled_fit <- brm(\n formula = pooled_formula,\n prior = pooled_priors,\n data = sleepstudy\n)\n\nError: object 'pooled_formula' not found\n\n\nWe can inspect the model fit:\n\nsummary(pooled_fit)\n\nError: object 'pooled_fit' not found\n\n\n\n\n4 School calendar\nMeta-analysis models can be fit in brms. When the standard error is known, the se() function can be used to specify it.\nThe dataset dat.konstantopoulos2011 has the observations for the school calendar intervention meta-analysis.\n\ndata(dat.konstantopoulos2011, package = \"metadat\")\n\nAs mentioned in the assignment instructions, a unique identifier for school needs to be created by combining the district and school:\n\nschoolcalendar_data <- dat.konstantopoulos2011 |>\n dplyr::mutate(\n school = factor(school),\n district = factor(district),\n district_school = interaction(district, school, drop = TRUE, sep = \"_\")\n )\n\nThen the models can be fit\n\nschoolcalendar_pooled_formula <- bf(\n formula = yi | se(sqrt(vi)) ~ 1,\n family = \"gaussian\"\n) \n\nschoolcalendar_pooled_fit <- brm(\n formula = schoolcalendar_pooled_formula,\n data = schoolcalendar_data\n)\nPredictions for a new school can be made using the posterior_epred function:\n\nnew_school <- data.frame(\n school = factor(1),\n district = factor(1),\n district_school = factor(\"1_1\"),\n vi = 0 # the expectation of the prediction is not affected by the sampling variance, so this can be any number\n)\n \n\nschoolcalendar_post_epred <- posterior_epred(\n schoolcalendar_pooled_fit,\n newdata = new_school,\n allow_new_levels = TRUE\n )\nIt can be helpful to plot the posterior estimates. 
Here is a function that will do this:\n\nplot_school_posteriors <- function(fit, dataset) {\n tidybayes::add_predicted_draws(dataset, fit) |>\n ggplot(\n aes(\n x = .prediction,\n y = interaction(district, school, sep = \", \", lex.order = TRUE))) +\n tidybayes::stat_halfeye() +\n ylab(\"District, school\") +\n xlab(\"Posterior effect\")\n}\n\nAnd can be used as follows:\nplot_school_posteriors(\n fit = schoolcalendar_pooled_fit,\n dataset = school_calendar_data\n)", + "text": "1 General information\nThis assignment relates to Lecture 7 and Chapter 5.\nWe recommend using JupyterHub (which has all the needed packages pre-installed).\n\nReading instructions:\n\nThe reading instructions for BDA3 Chapter 5.\n\n\n\n\n\n\n\n\nGeneral Instructions for Answering the Assignment Questions\n\n\n\n\n\n\nQuestions below are exact copies of the text found in the MyCourses quiz and should serve as a notebook where you can store notes and code.\nWe recommend opening these notebooks in the Aalto JupyterHub, see how to use R and RStudio remotely.\nFor inspiration for code, have a look at the BDA R Demos and the specific Assignment code notebooks\nRecommended additional self study exercises for each chapter in BDA3 are listed in the course web page. These will help to gain deeper understanding of the topic.\nCommon questions and answers regarding installation and technical problems can be found in Frequently Asked Questions (FAQ).\nDeadlines for all assignments can be found on the course web page and in MyCourses.\nYou are allowed to discuss assignments with your friends, but it is not allowed to copy solutions directly from other students or from internet.\nDo not share your answers publicly.\nDo not copy answers from the internet or from previous years. We compare the answers to the answers from previous years and to the answers from other students this year.\nUse of AI is allowed on the course, but the most of the work needs to by the student, and you need to report whether you used AI and in which way you used them (See points 5 and 6 in Aalto guidelines for use of AI in teaching).\nAll suspected plagiarism will be reported and investigated. See more about the Aalto University Code of Academic Integrity and Handling Violations Thereof.\nIf you have any suggestions or improvements to the course material, please post in the course chat feedback channel, create an issue, or submit a pull request to the public repository!\n\n\n\n\n\nif (!require(tidybayes)) {\n install.packages(\"tidybayes\")\n library(tidybayes)\n}\n\nLoading required package: tidybayes\n\nif (!require(brms)) {\n install.packages(\"brms\")\n library(brms)\n}\n\nLoading required package: brms\n\n\nLoading required package: Rcpp\n\n\nLoading 'brms' package (version 2.22.0). Useful instructions\ncan be found by typing help('brms'). 
A more detailed introduction\nto the package is available through vignette('brms_overview').\n\n\n\nAttaching package: 'brms'\n\n\nThe following objects are masked from 'package:tidybayes':\n\n dstudent_t, pstudent_t, qstudent_t, rstudent_t\n\n\nThe following object is masked from 'package:stats':\n\n ar\n\nif (!require(metadat)) {\n install.packages(\"metadat\")\n library(metadat)\n}\n\nLoading required package: metadat\n\nif(!require(cmdstanr)){\n install.packages(\"cmdstanr\", repos = c(\"https://mc-stan.org/r-packages/\", getOption(\"repos\")))\n library(cmdstanr)\n}\n\nLoading required package: cmdstanr\n\n\nThis is cmdstanr version 0.8.1\n\n\n- CmdStanR documentation and vignettes: mc-stan.org/cmdstanr\n\n\n- CmdStan path: /home/runner/.cmdstan/cmdstan-2.35.0\n\n\n- CmdStan version: 2.35.0\n\ncmdstan_installed <- function(){\n res <- try(out <- cmdstanr::cmdstan_path(), silent = TRUE)\n !inherits(res, \"try-error\")\n}\nif(!cmdstan_installed()){\n install_cmdstan()\n}\n\n\n\n2 Simulation warm-up\nHere is the function to simulate and plot observations from a hierarchical data-generating process.\n\nhierarchical_sim <- function(group_pop_mean,\n between_group_sd,\n within_group_sd,\n n_groups,\n n_obs_per_group\n ) {\n # Generate group means\n group_means <- rnorm(\n n = n_groups,\n mean = group_pop_mean,\n sd = between_group_sd\n )\n\n # Generate observations\n\n ## Create an empty vector for observations\n y <- numeric()\n ## Create a vector for the group identifier\n group <- rep(1:n_groups, each = n_obs_per_group)\n \n for (j in 1:n_groups) {\n ### Generate one group observations\n group_y <- rnorm(\n n = n_obs_per_group,\n mean = group_means[j],\n sd = within_group_sd\n )\n ### Append the group observations to the vector\n y <- c(y, group_y)\n }\n\n # Combine into a data frame\n data <- data.frame(\n group = factor(group),\n y = y\n )\n\n # Plot the data\n ggplot(data, aes(x = y, y = group)) +\n geom_point() +\n geom_vline(xintercept = group_pop_mean, linetype = \"dashed\")\n}\n\nExample using the function:\n\nhierarchical_sim(\n group_pop_mean = 50,\n between_group_sd = 5,\n within_group_sd = 1,\n n_groups = 10,\n n_obs_per_group = 5\n )\n\nError in ggplot(data, aes(x = y, y = group)): could not find function \"ggplot\"\n\n\n\n\n3 Sleep deprivation\nThe dataset sleepstudy is available by using the command data(sleepstudy, package = \"lme4\")\nBelow is some code for fitting a brms model. This model is a simple pooled model. You will need to fit a hierarchical model as explained in the assignment, but this code should help getting started.\nLoad the dataset\n\ndata(sleepstudy, package = \"lme4\")\n\nError in find.package(package, lib.loc, verbose = verbose): there is no package called 'lme4'\n\n\nSpecify the formula and observation family:\n\nsleepstudy_pooled_formula <- bf(\n Reaction ~ 1 + Days,\n family = \"gaussian\",\n center = FALSE\n)\n\nWe can see the parameters and default priors with\n\nget_prior(pooled_formula, data = sleepstudy)\n\nWe can then specify the priors:\n\n(sleepstudy_pooled_priors <- c(\n prior(\n normal(400, 100),\n class = \"b\",\n coef = \"Intercept\"\n ),\n prior(\n normal(0, 50),\n class = \"b\",\n coef = \"Days\"\n ),\n prior(\n normal(0, 50),\n class = \"sigma\"\n )\n))\n\nAnd then fit the model:\n\nsleepstudy_pooled_fit <- brm(\n formula = pooled_formula,\n prior = pooled_priors,\n data = sleepstudy\n)\n\nWe can inspect the model fit:\n\nsummary(pooled_fit)\n\n\n\n4 School calendar\nMeta-analysis models can be fit in brms. 
When the standard error is known, the se() function can be used to specify it.\nThe dataset dat.konstantopoulos2011 has the observations for the school calendar intervention meta-analysis.\n\ndata(dat.konstantopoulos2011, package = \"metadat\")\n\nAs mentioned in the assignment instructions, a unique identifier for school needs to be created by combining the district and school:\n\nschoolcalendar_data <- dat.konstantopoulos2011 |>\n dplyr::mutate(\n school = factor(school),\n district = factor(district),\n district_school = interaction(district, school, drop = TRUE, sep = \"_\")\n )\n\nThen the models can be fit\n\nschoolcalendar_pooled_formula <- bf(\n formula = yi | se(sqrt(vi)) ~ 1,\n family = \"gaussian\"\n) \n\nschoolcalendar_pooled_fit <- brm(\n formula = schoolcalendar_pooled_formula,\n data = schoolcalendar_data\n)\n\nPredictions for a new school can be made using the posterior_epred function:\n\nnew_school <- data.frame(\n school = factor(1),\n district = factor(1),\n district_school = factor(\"1_1\"),\n vi = 0 # the expectation of the prediction is not affected by the sampling variance, so this can be any number\n)\n \n\nschoolcalendar_post_epred <- posterior_epred(\n schoolcalendar_pooled_fit,\n newdata = new_school,\n allow_new_levels = TRUE\n )\n\nIt can be helpful to plot the posterior estimates. Here is a function that will do this:\n\nplot_school_posteriors <- function(fit, dataset) {\n tidybayes::add_predicted_draws(dataset, fit) |>\n ggplot(\n aes(\n x = .prediction,\n y = interaction(district, school, sep = \", \", lex.order = TRUE))) +\n tidybayes::stat_halfeye() +\n ylab(\"District, school\") +\n xlab(\"Posterior effect\")\n}\n\nAnd can be used as follows:\n\nplot_school_posteriors(\n fit = schoolcalendar_pooled_fit,\n dataset = school_calendar_data\n)", "crumbs": [ "Templates", "Notebook for Assignment 7" @@ -378,7 +378,7 @@ "href": "template1.html", "title": "Notebook for Assignment 1", "section": "", - "text": "1 General information\nThe exercises here refer to the lecture 1/BDA chapter 1 content, not the course infrastructure quiz. This assignment is meant to test whether or not you have sufficient knowledge to participate in the course. The first question checks that you remember basic terms of probability calculus. The second exercise checks you recognise the most important notation used throughout the course and used in BDA3. The third-fifth exercise you will solve some basic Bayes theorem questions to check your understanding on the basics of probability theory. The 6th exercise checks on whether you recall the three steps of Bayesian Data Ananlysis as mentioned in chapter 1 of BDA3. The last exercise walks you through an example of how we can use models to generate distributions for outcomes of interest, applied to a setting of a simplified Roulette table.\nThis quarto document is not intended to be submitted, but to render the questions as they appear on Mycourses to be available also outside of it. The following will set-up markmyassignment to check your functions at the end of the notebook:\n\nlibrary(markmyassignment)\nassignment_path = paste(\"https://github.com/avehtari/BDA_course_Aalto/\",\n\"blob/master/tests/assignment1.yml\", sep=\"\")\nset_assignment(assignment_path)\n\nAssignment set:\nassignment1: Bayesian Data Analysis: Assignment 1\nThe assignment contain the following (3) tasks:\n- p_red\n- p_box\n- p_identical_twin\n\n\n\n\n2 1. Basic probability theory notation and terms\n\n\n3 2. Notation\n\n\n4 3. 
Bayes’ theorem 1\nIf you use pen and paper, it may help to draw pictures as follows (see also assignment_instructions#fig-workflow):\n\n\n\n\n\n\nFigure 1: Parts of Bayesian workflow\n\n\n\nSee Figure 1 for illustration of parts of Bayesian workflow.\n\n\n5 4. Bayes’ theorem 2\nThe following will help you implementing a function to calculate the required probabilities for this exercise. Keep the below name and format for the function to work with markmyassignment:\n\nboxes_test <- matrix(c(2,2,1,5,5,1), ncol = 2,\n dimnames = list(c(\"A\", \"B\", \"C\"), c(\"red\", \"white\")))\n\np_red <- function(boxes) {\n # Do computation here, and return as below.\n # This is the correct return value for the test data provided above.\n 0.3928571\n}\n\np_box <- function(boxes) {\n # Do computation here, and return as below.\n # This is the correct return value for the test data provided above.\n c(0.29090909,0.07272727,0.63636364)\n}\n\n\n\n6 5. Bayes’ theorem 3\nThe R functions below might help you calculating the requited probabilities.\n\nfraternal_prob = 1/125\nidentical_prob = 1/300\n\nKeep the below name and format for the function to work with markmyassignment:\n\np_identical_twin <- function(fraternal_prob, identical_prob) {\n # Do computation here, and return as below.\n # This is the correct return value for the test data provided above.\n 0.4545455\n}\n\n\n\n7 6. The three steps of Bayesian data analysis\n\n\n8 7. A Binomial Model for the Roulette Table\nIncomplete code can be found below.\n\n# Ratio of red/black\ntheta <- # declare probability parameter for the binomial model\n\n# Sequence of trials\n\ntrials <- seq(#start value of sequence,#end value of sequence,#value for spacing)\n\n# Number of simulation draws from the model\nnsims <- # number of of simulations from the binomial model\n\n# Helper function for getting the ratios\nbinom_gen <- function(trials,theta,nsims){\n df <- as.data.frame(rbinom(nsims,trials,theta)/trials) |> mutate(nsims = nsims,trials = trials)\n colnames(df) <- c(\"Ratios\",\"Nsims\",\"Trials\")\n return(df)\n}\n\n# Create a data frame containing the draws for each number of trials\nratio_60 <- do.call(rbind, lapply(trials, binom_gen, theta, nsims)) # lapply applies elements in trials column to binom_gen function, which is then rowbound via do.call\n\nNow plot a histogram of the computed ratios for 10, 50 and 1000 trials, using the code below\n\n# Plot the Distributions\nsubset_df60 <- ratio_60[ratio_60$Trials %in% c(#trial values), ] # Subset your dataframe\n\nsubset_df60 |> ggplot(aes(Ratios)) +\n geom_histogram(position = \"identity\" ,bins = 40) +\n facet_grid(cols = vars(Trials)) +\n ggtitle(\"Ratios for specific trials\")\n\nSuppose you are now certain that theta = 0.6, plot the probability density given 1000 trials using the code below.\n\nsize = # number of trials\nprob = # probability of success\n\nbinom_data <- data.frame(\n Success = 0:size,\n Probability = dbinom(0:size, size = size, prob = prob)\n)\n\nggplot(binom_data, aes(x = Success, y = Probability)) +\n geom_point() +\n geom_line() +\n labs(title = \"PMF of Binomial Distribution\", x = \"Number of Successes\", y = \"PDF\")\n\n\n\n\n\n\n\nmarkmyassignment\n\n\n\n\n\nThe following will check the functions for which markmyassignment has been set up:\n\nmark_my_assignment()\n\n✔ | F W S OK | Context\n\n⠏ | 0 | task-1-subtask-1-tests \n⠏ | 0 | p_red() \n✖ | 1 3 | p_red()\n────────────────────────────────────────────────────────────────────────────────\nFailure ('test-task-1-subtask-1-tests.R:21:3'): 
p_red()\np_red(boxes = boxes) not equivalent to 0.5.\n1/1 mismatches\n[1] 0.393 - 0.5 == -0.107\nError: Incorrect result for matrix(c(1,1,1,1,1,1), ncol = 2)\n────────────────────────────────────────────────────────────────────────────────\n\n⠏ | 0 | task-2-subtask-1-tests \n⠏ | 0 | p_box() \n✖ | 1 3 | p_box()\n────────────────────────────────────────────────────────────────────────────────\nFailure ('test-task-2-subtask-1-tests.R:19:3'): p_box()\np_box(boxes = boxes) not equivalent to c(0.4, 0.1, 0.5).\n3/3 mismatches (average diff: 0.0909)\n[1] 0.2909 - 0.4 == -0.1091\n[2] 0.0727 - 0.1 == -0.0273\n[3] 0.6364 - 0.5 == 0.1364\nError: Incorrect result for matrix(c(1,1,1,1,1,1), ncol = 2)\n────────────────────────────────────────────────────────────────────────────────\n\n⠏ | 0 | task-3-subtask-1-tests \n⠏ | 0 | p_identical_twin() \n✖ | 2 3 | p_identical_twin()\n────────────────────────────────────────────────────────────────────────────────\nFailure ('test-task-3-subtask-1-tests.R:16:3'): p_identical_twin()\np_identical_twin(fraternal_prob = 1/100, identical_prob = 1/500) not equivalent to 0.2857143.\n1/1 mismatches\n[1] 0.455 - 0.286 == 0.169\nError: Incorrect result for fraternal_prob = 1/100 and identical_prob = 1/500\n\nFailure ('test-task-3-subtask-1-tests.R:19:3'): p_identical_twin()\np_identical_twin(fraternal_prob = 1/10, identical_prob = 1/20) not equivalent to 0.5.\n1/1 mismatches\n[1] 0.455 - 0.5 == -0.0455\nError: Incorrect result for fraternal_prob = 1/10 and identical_prob = 1/20\n────────────────────────────────────────────────────────────────────────────────\n\n══ Results ═════════════════════════════════════════════════════════════════════\n── Failed tests ────────────────────────────────────────────────────────────────\nFailure ('test-task-1-subtask-1-tests.R:21:3'): p_red()\np_red(boxes = boxes) not equivalent to 0.5.\n1/1 mismatches\n[1] 0.393 - 0.5 == -0.107\nError: Incorrect result for matrix(c(1,1,1,1,1,1), ncol = 2)\n\nFailure ('test-task-2-subtask-1-tests.R:19:3'): p_box()\np_box(boxes = boxes) not equivalent to c(0.4, 0.1, 0.5).\n3/3 mismatches (average diff: 0.0909)\n[1] 0.2909 - 0.4 == -0.1091\n[2] 0.0727 - 0.1 == -0.0273\n[3] 0.6364 - 0.5 == 0.1364\nError: Incorrect result for matrix(c(1,1,1,1,1,1), ncol = 2)\n\nFailure ('test-task-3-subtask-1-tests.R:16:3'): p_identical_twin()\np_identical_twin(fraternal_prob = 1/100, identical_prob = 1/500) not equivalent to 0.2857143.\n1/1 mismatches\n[1] 0.455 - 0.286 == 0.169\nError: Incorrect result for fraternal_prob = 1/100 and identical_prob = 1/500\n\nFailure ('test-task-3-subtask-1-tests.R:19:3'): p_identical_twin()\np_identical_twin(fraternal_prob = 1/10, identical_prob = 1/20) not equivalent to 0.5.\n1/1 mismatches\n[1] 0.455 - 0.5 == -0.0455\nError: Incorrect result for fraternal_prob = 1/10 and identical_prob = 1/20\n\n[ FAIL 4 | WARN 0 | SKIP 0 | PASS 9 ]", + "text": "1 General information\nThe exercises here refer to the lecture 1/BDA chapter 1 content, not the course infrastructure quiz. This assignment is meant to test whether or not you have sufficient knowledge to participate in the course. The first question checks that you remember basic terms of probability calculus. The second exercise checks you recognise the most important notation used throughout the course and used in BDA3. The third-fifth exercise you will solve some basic Bayes theorem questions to check your understanding on the basics of probability theory. 
The 6th exercise checks on whether you recall the three steps of Bayesian Data Ananlysis as mentioned in chapter 1 of BDA3. The last exercise walks you through an example of how we can use models to generate distributions for outcomes of interest, applied to a setting of a simplified Roulette table.\nThis quarto document is not intended to be submitted, but to render the questions as they appear on Mycourses to be available also outside of it. The following will set-up markmyassignment to check your functions at the end of the notebook:\n\nlibrary(markmyassignment)\nassignment_path = paste(\"https://github.com/avehtari/BDA_course_Aalto/\",\n\"blob/master/tests/assignment1.yml\", sep=\"\")\nset_assignment(assignment_path)\n\nAssignment set:\nassignment1: Bayesian Data Analysis: Assignment 1\nThe assignment contain the following (3) tasks:\n- p_red\n- p_box\n- p_identical_twin\n\n\n\n\n2 1. Basic probability theory notation and terms\n\n\n3 2. Notation\n\n\n4 3. Bayes’ theorem 1\nIf you use pen and paper, it may help to draw pictures as follows (see also assignment_instructions#fig-workflow):\n\n\n\n\n\n\nFigure 1: Parts of Bayesian workflow\n\n\n\nSee Figure 1 for illustration of parts of Bayesian workflow.\n\n\n5 4. Bayes’ theorem 2\nThe following will help you implementing a function to calculate the required probabilities for this exercise. Keep the below name and format for the function to work with markmyassignment:\n\nboxes_test <- matrix(c(2,2,1,5,5,1), ncol = 2,\n dimnames = list(c(\"A\", \"B\", \"C\"), c(\"red\", \"white\")))\n\np_red <- function(boxes) {\n # Do computation here, and return as below.\n # This is the correct return value for the test data provided above.\n 0.3928571\n}\n\np_box <- function(boxes) {\n # Do computation here, and return as below.\n # This is the correct return value for the test data provided above.\n c(0.29090909,0.07272727,0.63636364)\n}\n\n\n\n6 5. Bayes’ theorem 3\nThe R functions below might help you calculating the requited probabilities.\n\nfraternal_prob = 1/125\nidentical_prob = 1/300\n\nKeep the below name and format for the function to work with markmyassignment:\n\np_identical_twin <- function(fraternal_prob, identical_prob) {\n # Do computation here, and return as below.\n # This is the correct return value for the test data provided above.\n 0.4545455\n}\n\n\n\n7 6. The three steps of Bayesian data analysis\n\n\n8 7. 
A Binomial Model for the Roulette Table\nIncomplete code can be found below.\n\n# Ratio of red/black\ntheta <- # declare probability parameter for the binomial model\n\n# Sequence of trials\n\ntrials <- seq(#start value of sequence,#end value of sequence,#value for spacing)\n\n# Number of simulation draws from the model\nnsims <- # number of of simulations from the binomial model\n\n# Helper function for getting the ratios\nbinom_gen <- function(trials,theta,nsims){\n df <- as.data.frame(rbinom(nsims,trials,theta)/trials) |> mutate(nsims = nsims,trials = trials)\n colnames(df) <- c(\"Ratios\",\"Nsims\",\"Trials\")\n return(df)\n}\n\n# Create a data frame containing the draws for each number of trials\nratio_60 <- do.call(rbind, lapply(trials, binom_gen, theta, nsims)) # lapply applies elements in trials column to binom_gen function, which is then rowbound via do.call\n\nNow plot a histogram of the computed ratios for 10, 50 and 1000 trials, using the code below\n\n# Plot the Distributions\nsubset_df60 <- ratio_60[ratio_60$Trials %in% c(#trial values), ] # Subset your dataframe\n\nsubset_df60 |> ggplot(aes(Ratios)) +\n geom_histogram(position = \"identity\" ,bins = 40) +\n facet_grid(cols = vars(Trials)) +\n ggtitle(\"Ratios for specific trials\")\n\nSuppose you are now certain that theta = 0.6, plot the probability density given 1000 trials using the code below.\n\nsize = # number of trials\nprob = # probability of success\n\nbinom_data <- data.frame(\n Success = 0:size,\n Probability = dbinom(0:size, size = size, prob = prob)\n)\n\nggplot(binom_data, aes(x = Success, y = Probability)) +\n geom_point() +\n geom_line() +\n labs(title = \"PMF of Binomial Distribution\", x = \"Number of Successes\", y = \"PDF\")\n\n\n\n\n\n\n\nmarkmyassignment\n\n\n\n\n\nThe following will check the functions for which markmyassignment has been set up:\n\nmark_my_assignment()\n\n✔ | F W S OK | Context\n\n⠏ | 0 | task-1-subtask-1-tests \n⠏ | 0 | p_red() \n✖ | 1 3 | p_red()\n────────────────────────────────────────────────────────────────────────────────\nFailure ('test-task-1-subtask-1-tests.R:21:3'): p_red()\np_red(boxes = boxes) not equivalent to 0.5.\n1/1 mismatches\n[1] 0.393 - 0.5 == -0.107\nError: Incorrect result for matrix(c(1,1,1,1,1,1), ncol = 2)\n────────────────────────────────────────────────────────────────────────────────\n\n⠏ | 0 | task-2-subtask-1-tests \n⠏ | 0 | p_box() \n✖ | 1 3 | p_box()\n────────────────────────────────────────────────────────────────────────────────\nFailure ('test-task-2-subtask-1-tests.R:19:3'): p_box()\np_box(boxes = boxes) not equivalent to c(0.4, 0.1, 0.5).\n3/3 mismatches (average diff: 0.0909)\n[1] 0.2909 - 0.4 == -0.1091\n[2] 0.0727 - 0.1 == -0.0273\n[3] 0.6364 - 0.5 == 0.1364\nError: Incorrect result for matrix(c(1,1,1,1,1,1), ncol = 2)\n────────────────────────────────────────────────────────────────────────────────\n\n⠏ | 0 | task-3-subtask-1-tests \n⠏ | 0 | p_identical_twin() \n✖ | 2 3 | p_identical_twin()\n────────────────────────────────────────────────────────────────────────────────\nFailure ('test-task-3-subtask-1-tests.R:16:3'): p_identical_twin()\np_identical_twin(fraternal_prob = 1/100, identical_prob = 1/500) not equivalent to 0.2857143.\n1/1 mismatches\n[1] 0.455 - 0.286 == 0.169\nError: Incorrect result for fraternal_prob = 1/100 and identical_prob = 1/500\n\nFailure ('test-task-3-subtask-1-tests.R:19:3'): p_identical_twin()\np_identical_twin(fraternal_prob = 1/10, identical_prob = 1/20) not equivalent to 0.5.\n1/1 mismatches\n[1] 0.455 - 0.5 == 
-0.0455\nError: Incorrect result for fraternal_prob = 1/10 and identical_prob = 1/20\n────────────────────────────────────────────────────────────────────────────────\n\n══ Results ═════════════════════════════════════════════════════════════════════\n── Failed tests ────────────────────────────────────────────────────────────────\nFailure ('test-task-1-subtask-1-tests.R:21:3'): p_red()\np_red(boxes = boxes) not equivalent to 0.5.\n1/1 mismatches\n[1] 0.393 - 0.5 == -0.107\nError: Incorrect result for matrix(c(1,1,1,1,1,1), ncol = 2)\n\nFailure ('test-task-2-subtask-1-tests.R:19:3'): p_box()\np_box(boxes = boxes) not equivalent to c(0.4, 0.1, 0.5).\n3/3 mismatches (average diff: 0.0909)\n[1] 0.2909 - 0.4 == -0.1091\n[2] 0.0727 - 0.1 == -0.0273\n[3] 0.6364 - 0.5 == 0.1364\nError: Incorrect result for matrix(c(1,1,1,1,1,1), ncol = 2)\n\nFailure ('test-task-3-subtask-1-tests.R:16:3'): p_identical_twin()\np_identical_twin(fraternal_prob = 1/100, identical_prob = 1/500) not equivalent to 0.2857143.\n1/1 mismatches\n[1] 0.455 - 0.286 == 0.169\nError: Incorrect result for fraternal_prob = 1/100 and identical_prob = 1/500\n\nFailure ('test-task-3-subtask-1-tests.R:19:3'): p_identical_twin()\np_identical_twin(fraternal_prob = 1/10, identical_prob = 1/20) not equivalent to 0.5.\n1/1 mismatches\n[1] 0.455 - 0.5 == -0.0455\nError: Incorrect result for fraternal_prob = 1/10 and identical_prob = 1/20\n\n[ FAIL 4 | WARN 0 | SKIP 0 | PASS 9 ]\n\nKeep trying!", "crumbs": [ "Templates", "Notebook for Assignment 1" diff --git a/assignments/template1.html b/assignments/template1.html index 9f95fb24..65b713b3 100644 --- a/assignments/template1.html +++ b/assignments/template1.html @@ -553,7 +553,9 @@

    8 7. A Binomial M [1] 0.455 - 0.5 == -0.0455 Error: Incorrect result for fraternal_prob = 1/10 and identical_prob = 1/20 -[ FAIL 4 | WARN 0 | SKIP 0 | PASS 9 ] +[ FAIL 4 | WARN 0 | SKIP 0 | PASS 9 ] + +Keep trying! diff --git a/assignments/template3_files/figure-html/unnamed-chunk-10-2.png b/assignments/template3_files/figure-html/unnamed-chunk-10-2.png index 72aa2212..94ca4cc8 100644 Binary files a/assignments/template3_files/figure-html/unnamed-chunk-10-2.png and b/assignments/template3_files/figure-html/unnamed-chunk-10-2.png differ diff --git a/assignments/template7.html b/assignments/template7.html index dd6afbac..dd70fa9b 100644 --- a/assignments/template7.html +++ b/assignments/template7.html @@ -381,21 +381,10 @@

    1 General informa
    Loading required package: tidybayes
    -
    -
    Warning in library(package, lib.loc = lib.loc, character.only = TRUE,
    -logical.return = TRUE, : there is no package called 'tidybayes'
    -
    -
    -
    Installing package into '/home/runner/work/_temp/Library'
    -(as 'lib' is unspecified)
    -
    -
    -
    also installing the dependencies 'svUnit', 'arrayhelpers'
    -
    -
    if (!require(brms)) {
    -    install.packages("brms")
    -    library(brms)
    -}
    +
    if (!require(brms)) {
    +    install.packages("brms")
    +    library(brms)
    +}
    Loading required package: brms
    @@ -421,28 +410,17 @@

    1 General informa ar -
    if (!require(metadat)) {
    -  install.packages("metadat")
    -  library(metadat)
    -}
    +
    if (!require(metadat)) {
    +  install.packages("metadat")
    +  library(metadat)
    +}
    Loading required package: metadat
    -
    -
    Warning in library(package, lib.loc = lib.loc, character.only = TRUE,
    -logical.return = TRUE, : there is no package called 'metadat'
    -
    -
    -
    Installing package into '/home/runner/work/_temp/Library'
    -(as 'lib' is unspecified)
    -
    -
    -
    also installing the dependency 'mathjaxr'
    -
    -
    if(!require(cmdstanr)){
    -    install.packages("cmdstanr", repos = c("https://mc-stan.org/r-packages/", getOption("repos")))
    -    library(cmdstanr)
    -}
    +
    if(!require(cmdstanr)){
    +    install.packages("cmdstanr", repos = c("https://mc-stan.org/r-packages/", getOption("repos")))
    +    library(cmdstanr)
    +}
    Loading required package: cmdstanr
    @@ -458,71 +436,71 @@

    1 General informa
    - CmdStan version: 2.35.0
    -
    cmdstan_installed <- function(){
    -  res <- try(out <- cmdstanr::cmdstan_path(), silent = TRUE)
    -  !inherits(res, "try-error")
    -}
    -if(!cmdstan_installed()){
    -    install_cmdstan()
    -}
    +
    cmdstan_installed <- function(){
    +  res <- try(out <- cmdstanr::cmdstan_path(), silent = TRUE)
    +  !inherits(res, "try-error")
    +}
    +if(!cmdstan_installed()){
    +    install_cmdstan()
    +}

    2 Simulation warm-up

    Here is the function to simulate and plot observations from a hierarchical data-generating process.

    -
    hierarchical_sim <- function(group_pop_mean,
    -                             between_group_sd,
    -                             within_group_sd,
    -                             n_groups,
    -                             n_obs_per_group
    -                             ) {
    -  # Generate group means
    -  group_means <- rnorm(
    -    n = n_groups,
    -    mean = group_pop_mean,
    -    sd = between_group_sd
    -  )
    -
    -  # Generate observations
    -
    -  ## Create an empty vector for observations
    -  y <- numeric()
    -  ## Create a vector for the group identifier
    -  group <- rep(1:n_groups, each = n_obs_per_group)
    -  
    -  for (j in 1:n_groups) {
    -    ### Generate one group observations
    -    group_y <- rnorm(
    -      n = n_obs_per_group,
    -      mean = group_means[j],
    -      sd = within_group_sd
    -    )
    -    ### Append the group observations to the vector
    -    y <- c(y, group_y)
    -  }
    -
    -  # Combine into a data frame
    -  data <- data.frame(
    -    group = factor(group),
    -    y = y
    -  )
    -
    -  # Plot the data
    -  ggplot(data, aes(x = y, y = group)) +
    -    geom_point() +
    -    geom_vline(xintercept = group_pop_mean, linetype = "dashed")
    -}
    +
    hierarchical_sim <- function(group_pop_mean,
    +                             between_group_sd,
    +                             within_group_sd,
    +                             n_groups,
    +                             n_obs_per_group
    +                             ) {
    +  # Generate group means
    +  group_means <- rnorm(
    +    n = n_groups,
    +    mean = group_pop_mean,
    +    sd = between_group_sd
    +  )
    +
    +  # Generate observations
    +
    +  ## Create an empty vector for observations
    +  y <- numeric()
    +  ## Create a vector for the group identifier
    +  group <- rep(1:n_groups, each = n_obs_per_group)
    +  
    +  for (j in 1:n_groups) {
    +    ### Generate one group observations
    +    group_y <- rnorm(
    +      n = n_obs_per_group,
    +      mean = group_means[j],
    +      sd = within_group_sd
    +    )
    +    ### Append the group observations to the vector
    +    y <- c(y, group_y)
    +  }
    +
    +  # Combine into a data frame
    +  data <- data.frame(
    +    group = factor(group),
    +    y = y
    +  )
    +
    +  # Plot the data
    +  ggplot(data, aes(x = y, y = group)) +
    +    geom_point() +
    +    geom_vline(xintercept = group_pop_mean, linetype = "dashed")
    +}

    Example using the function:

    -
    hierarchical_sim(
    -  group_pop_mean = 50,
    -  between_group_sd = 5,
    -  within_group_sd = 1,
    -  n_groups = 10,
    -  n_obs_per_group = 5
    -  )
    +
    hierarchical_sim(
    +  group_pop_mean = 50,
    +  between_group_sd = 5,
    +  within_group_sd = 1,
    +  n_groups = 10,
    +  n_obs_per_group = 5
    +  )
    Error in ggplot(data, aes(x = y, y = group)): could not find function "ggplot"
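    The error above occurs because ggplot2 is never attached in this notebook. A minimal fix, assuming the ggplot2 package is installed, is to attach it before calling the simulation function:

    library(ggplot2)  # hierarchical_sim() builds its plot with ggplot(), so ggplot2 must be attached
    hierarchical_sim(
      group_pop_mean = 50,
      between_group_sd = 5,
      within_group_sd = 1,
      n_groups = 10,
      n_obs_per_group = 5
    )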
    @@ -534,68 +512,53 @@

    3 Sleep deprivati

    Below is some code for fitting a brms model. This model is a simple pooled model. You will need to fit a hierarchical model as explained in the assignment, but this code should help you get started; a hedged hierarchical sketch follows the pooled-model walkthrough below.

    Load the dataset

    -
    data(sleepstudy, package = "lme4")
    +
    data(sleepstudy, package = "lme4")
    Error in find.package(package, lib.loc, verbose = verbose): there is no package called 'lme4'

    Specify the formula and observation family:

    -
    sleepstudy_pooled_formula <- bf(
    -  Reaction ~ 1 + Days,
    -  family = "gaussian",
    -  center = FALSE
    -)
    +
    sleepstudy_pooled_formula <- bf(
    +  Reaction ~ 1 + Days,
    +  family = "gaussian",
    +  center = FALSE
    +)

    We can see the parameters and default priors with

    -
    get_prior(pooled_formula, data = sleepstudy)
    -
    -
    Error: object 'pooled_formula' not found
    -
    +
    get_prior(sleepstudy_pooled_formula, data = sleepstudy)

    We can then specify the priors:

    -
    (sleepstudy_pooled_priors <- c(
    -  prior(
    -    normal(400, 100),
    -    class = "b",
    -    coef = "Intercept"
    -  ),
    -  prior(
    -    normal(0, 50),
    -    class = "b",
    -    coef = "Days"
    -  ),
    -  prior(
    -    normal(0, 50),
    -    class = "sigma"
    -  )
    -))
    -
    -
                prior class      coef group resp dpar nlpar   lb   ub source
    - normal(400, 100)     b Intercept                       <NA> <NA>   user
    -    normal(0, 50)     b      Days                       <NA> <NA>   user
    -    normal(0, 50) sigma                                 <NA> <NA>   user
    -
    +
    (sleepstudy_pooled_priors <- c(
    +  prior(
    +    normal(400, 100),
    +    class = "b",
    +    coef = "Intercept"
    +  ),
    +  prior(
    +    normal(0, 50),
    +    class = "b",
    +    coef = "Days"
    +  ),
    +  prior(
    +    normal(0, 50),
    +    class = "sigma"
    +  )
    +))

    And then fit the model:

    -
    sleepstudy_pooled_fit <- brm(
    -  formula = pooled_formula,
    -  prior = pooled_priors,
    -  data = sleepstudy
    -)
    -
    -
    Error: object 'pooled_formula' not found
    -
    +
    sleepstudy_pooled_fit <- brm(
    +  formula = sleepstudy_pooled_formula,
    +  prior = sleepstudy_pooled_priors,
    +  data = sleepstudy
    +)

    We can inspect the model fit:

    -
    summary(pooled_fit)
    -
    -
    Error: object 'pooled_fit' not found
    -
    +
    summary(sleepstudy_pooled_fit)
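
    As a starting point for the hierarchical model the assignment asks for, here is a minimal sketch. The object names and the varying intercept-and-slope structure per Subject are illustrative assumptions; adapt the grouping and priors as instructed in the assignment:

    sleepstudy_hier_formula <- bf(
      Reaction ~ 1 + Days + (1 + Days | Subject),  # assumed grouping: varying intercept and slope per Subject
      family = "gaussian",
      center = FALSE
    )

    sleepstudy_hier_fit <- brm(
      formula = sleepstudy_hier_formula,
      prior = sleepstudy_pooled_priors,  # priors for the group-level sds (class = "sd") can be added in the same way
      data = sleepstudy
    )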

    @@ -603,61 +566,65 @@

    4 School calendar

    Meta-analysis models can be fit in brms. When the standard error is known, the se() function can be used to specify it.

    The dataset dat.konstantopoulos2011 has the observations for the school calendar intervention meta-analysis.

    -
    data(dat.konstantopoulos2011, package = "metadat")
    +
    data(dat.konstantopoulos2011, package = "metadat")

    As mentioned in the assignment instructions, a unique identifier for school needs to be created by combining the district and school:

    -
    schoolcalendar_data <- dat.konstantopoulos2011 |>
    -  dplyr::mutate(
    -    school = factor(school),
    -    district = factor(district),
    -    district_school = interaction(district, school, drop = TRUE, sep = "_")
    -  )
    +
    schoolcalendar_data <- dat.konstantopoulos2011 |>
    +  dplyr::mutate(
    +    school = factor(school),
    +    district = factor(district),
    +    district_school = interaction(district, school, drop = TRUE, sep = "_")
    +  )

    Then the models can be fit; the pooled model is shown here, and a hedged hierarchical sketch appears at the end of this section:

    -
    
    -schoolcalendar_pooled_formula <- bf(
    -  formula = yi | se(sqrt(vi)) ~ 1,
    -  family = "gaussian"
    -)  
    -
    -schoolcalendar_pooled_fit <- brm(
    -  formula = schoolcalendar_pooled_formula,
    -  data = schoolcalendar_data
    -)
    +
    +
    schoolcalendar_pooled_formula <- bf(
    +  formula = yi | se(sqrt(vi)) ~ 1,
    +  family = "gaussian"
    +)  
    +
    +schoolcalendar_pooled_fit <- brm(
    +  formula = schoolcalendar_pooled_formula,
    +  data = schoolcalendar_data
    +)
    +

    Predictions for a new school can be made using the posterior_epred function:

    -
    
    -new_school <- data.frame(
    -  school = factor(1),
    -  district = factor(1),
    -  district_school = factor("1_1"),
    -  vi = 0 # the expectation of the prediction is not affected by the sampling variance, so this can be any number
    -)
    -  
    -
    -schoolcalendar_post_epred <- posterior_epred(
    -    schoolcalendar_pooled_fit,
    -    newdata = new_school,
    -    allow_new_levels = TRUE
    -  )
    +
    +
    new_school <- data.frame(
    +  school = factor(1),
    +  district = factor(1),
    +  district_school = factor("1_1"),
    +  vi = 0 # the expectation of the prediction is not affected by the sampling variance, so this can be any number
    +)
    +  
    +
    +schoolcalendar_post_epred <- posterior_epred(
    +    schoolcalendar_pooled_fit,
    +    newdata = new_school,
    +    allow_new_levels = TRUE
    +  )
    +
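
    A quick way to summarise these draws is shown below; this is only an illustrative summary, not something required by the assignment:

    # posterior_epred() returns a draws-by-newdata matrix; summarise the single new-school column
    quantile(schoolcalendar_post_epred, probs = c(0.05, 0.5, 0.95))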

    It can be helpful to plot the posterior estimates. Here is a function that will do this:

    -
    plot_school_posteriors <- function(fit, dataset) {
    -  tidybayes::add_predicted_draws(dataset, fit) |>
    -    ggplot(
    -      aes(
    -        x = .prediction,
    -        y = interaction(district, school, sep = ", ", lex.order = TRUE))) +
    -    tidybayes::stat_halfeye() +
    -    ylab("District, school") +
    -    xlab("Posterior effect")
    -}
    +
    plot_school_posteriors <- function(fit, dataset) {
    +  tidybayes::add_predicted_draws(dataset, fit) |>
    +    ggplot(
    +      aes(
    +        x = .prediction,
    +        y = interaction(district, school, sep = ", ", lex.order = TRUE))) +
    +    tidybayes::stat_halfeye() +
    +    ylab("District, school") +
    +    xlab("Posterior effect")
    +}

    It can be used as follows:

    -
    plot_school_posteriors(
    -  fit = schoolcalendar_pooled_fit,
    -  dataset = school_calendar_data
    -)
    +
    +
    plot_school_posteriors(
    +  fit = schoolcalendar_pooled_fit,
    +  dataset = schoolcalendar_data
    +)
    +
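    For the hierarchical variants the assignment asks for, a minimal sketch follows. The object names and the district/school grouping are illustrative assumptions; the exact structure is specified in the assignment:

    schoolcalendar_hier_formula <- bf(
      formula = yi | se(sqrt(vi)) ~ 1 + (1 | district) + (1 | district_school),  # assumed varying intercepts per district and per school within district
      family = "gaussian"
    )

    schoolcalendar_hier_fit <- brm(
      formula = schoolcalendar_hier_formula,
      data = schoolcalendar_data
    )

    plot_school_posteriors(
      fit = schoolcalendar_hier_fit,
      dataset = schoolcalendar_data
    )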
    @@ -1148,269 +1115,277 @@

    4 School calendar diff --git a/search.json b/search.json index 1e1cf8d4..448abf3e 100644 --- a/search.json +++ b/search.json @@ -326,7 +326,7 @@ "href": "Aalto2024.html#schedule-2024", "title": "Bayesian Data Analysis course - Aalto 2024", "section": "Schedule 2024", - "text": "Schedule 2024\nThe course consists of 12 lectures, 9 assignments, a project work, and a project presentation in periods I and II. It’s good start reading the material for the next lecture and assignment while making the assignment related to the previous lecture. There are 9 assignments and a project work with presentation, and thus the assignments are not in one-to-one correspondence with the lectures. The schedule below lists the lectures and how they connect to the topics, book chapters and assignments.\n\nSchedule overview\nHere is an overview of the schedule. Scroll down the page to see detailed instructions for each block. When you are working on assignment related to previous lecture, it is good to start reading the book chapters relaed to the next lecture and assignment. The schedule links to 2023 lecture videos until couple hours after the 2024 lecture has been recorded.\n\n\n\n\nReadings\nLectures\nAssignment\nLecture Date\nAssignment due date\n\n\n\n\n1. Introduction\nBDA3 Chapter 1\n2024 Lecture 1.1 Introduction, 2024 Lecture 1.2 Course practicalities, Slides 1.1, Slides 1.2\n2024 Assignment 1\n2024-09-02\n2024-09-15\n\n\n2. Basics of Bayesian inference\nBDA3 Chapter 1, BDA3 Chapter 2\n2023 Lecture 2.1 (2024 recording failed), 2024 Lecture 2.2, Slides 2\nAssignment 2\n2024-09-16\n2024-09-22\n\n\n3. Multidimensional posterior\nBDA3 Chapter 3\n2023 Lecture 3.1, 2023 Lecture 3.2Slides 3\nAssignment 3\n2024-09-23\n2024-09-29\n\n\n4. Monte Carlo\nBDA3 Chapter 10\n2024 Lecture 4.1, 2024 Lecture 4.2, Slides 4\nAssignment 4\n2024-09-30\n2024-10-06\n\n\n5. Markov chain Monte Carlo\nBDA3 Chapter 11\n2024 Lecture 5.1, 2024 Lecture 5.2, Slides 5\nAssignment 5\n2024-10-07\n2024-10-13\n\n\n6. Stan, HMC, PPL\nBDA3 Chapter 12 + extra material on Stan\n2023 Lecture 6.1, Lecture 6.2, 2024 Slides 6\nAssignment 6\n2024-10-14\n2024-10-27\n\n\n7. Hierarchical models and exchangeability\nBDA3 Chapter 5\n2023 Lecture 7.1, 2023 Lecture 7.2, 2022 Project info, Slides 7\nOld Assignment 7\n2024-10-28\n2024-11-10\n\n\n8. Model checking & cross-validation\nBDA3 Chapter 6, BDA3 Chapter 7, Visualization in Bayesian workflow, Practical Bayesian cross-validation\n2023 Lecture 8.1, 2023 Lecture 8.2, Slides 8a,Slides 8b\nStart project work\n2024-11-04\nN/A\n\n\n9. Model comparison, selection, and hypothesis testing\nBDA3 Chapter 7 (not 7.2 and 7.3), Practical Bayesian cross-validation\n2023 Lecture 9.1, 2023 Lecture 9.2, Slides 9\nOld Assignment 8\n2024-11-11\n2024-11-17\n\n\n10. Decision analysis\nBDA3 Chapter 9\n2023 Lecture 10.1, 2023 Lecture 10.2, Slides 10a, Slides 10b\nOld Assignment 9\n2024-11-18\n2024-11-24\n\n\n11. Variable selectio with projpred, project presentation example\nBDA3 Chapter 4\n2023 Lecture 11.1, 2023 Lecture 11.2, 2023 Lecture 11.3, Slides 11a, Slides Project Presentation, Slides 11 extra\nProject work\n2024-11-25\nN/A\n\n\n12. TBA\n\nOptional: \nProject work\n2024-12-02\nN/A\n\n\n13. 
Project evaluation\n\n\n\nProject presentations: 9.-13.12.\nEvaluation week\n\n\n\n\n\n1) Course introduction, BDA 3 Ch 1, prerequisites assignment\nCourse practicalities, material, assignments, project work, peergrading, QA sessions, TA sessions, prerequisites, chat, etc.\n\nLogin with Aalto account to the Zulip course chat with link in MyCourses\nIntroduction/practicalities lecture Monday 2024-09-02 14:15-16, hall C, Otakaari 1**\n\n2024 Lecture videos 1.1 and 1.2 in Panopto\nSlides 1.1, Slides 1.2\n\nRead BDA3 Chapter 1\n\nstart with reading instructions for Chapter 1 and afterwards read the additional comments in the same document\n\nThere are no R demos for Chapter 1\nMake and submit 2024 Assignment 1. Deadline Sunday 2024-09-15 23:59\n\nWe highly recommend to submit all assignments Friday before 3pm so that you can get TA help before submission. As the course has students who work weekdays (e.g. FiTech students), the late submission until Sunday night is allowed, but we can’t provide support during the weekends.\nthis assignment checks that you have sufficient prerequisite skills (basic probability calculus, and R or Python)\nGeneral information about assignments\n\nR markdown template for assignments\nFAQ for the assignments has solutions to commonly asked questions related RStudio setup, errors during package installations, etc.\n\n\nGet help in TA sessions 2024-09-04 14-16, Y342a, Otakaari 1 2024-09-05 12-14, Y429c-d, Otakaari 1\n\nin Sisu these are marked as exercise sessions, but we call them TA sessions\nthese are optional and you can choose which one to join\nsee more info about TA sessions\n\nHighly recommended, but optional: Make BDA3 exercises 1.1-1.4, 1.6-1.8 (model solutions available for 1.1-1.6)\nStart reading Chapters 1+2, see instructions below\n\n\n\n2) BDA3 Ch 1+2, basics of Bayesian inference\nBDA3 Chapters 1+2, basics of Bayesian inference, observation model, likelihood, posterior and binomial model, predictive distribution and benefit of integration, priors and prior information, and one parameter normal model.\n\nRead BDA3 Chapter 2\n\nsee reading instructions for Chapter 2\n\nLecture Monday 2024-09-16 14:15-16, hall T1, CS building\n\nSlides 2\nVideos: 2023 Lecture 2.1 (2024 recording failed), 2024 Lecture 2.2 on basics of Bayesian inference, observation model, likelihood, posterior and binomial model, predictive distribution and benefit of integration, priors and prior information, and one parameter normal model. BDA3 Ch 1+2.\n\nRead the additional comments for Chapter 2\nCheck R demos or Python demos for Chapter 2\nMake and submit Assignment 2. Deadline Sunday 2024-09-22 23:59\n\nTA sessions 2024-09-18 14-16, Y342a, Otakaari 1 2024-09-19 12-14, Y429c-d, Otakaari 1\n\nHighly recommended, but optional: Make BDA3 exercises 2.1-2.5, 2.8, 2.9, 2.14, 2.17, 2.22 (model solutions available for 2.1-2.5, 2.7-2.13, 2.16, 2.17, 2.20, and 2.14 is in course slides)\nStart reading Chapter 3, see instructions below\n\n\n\n3) BDA3 Ch 3, multidimensional posterior\nMultiparameter models, joint, marginal and conditional distribution, normal model, bioassay example, grid sampling and grid evaluation. BDA3 Ch 3.\n\nRead BDA3 Chapter 3\n\nsee reading instructions for Chapter 3\n\nLecture Monday 2024-09-23. 14:15-16, hall T1, CS building\n\nSlides 3\nVideos: 2023 Lecture 3.1 2023 Lecture 3.2 on multiparameter models, joint, marginal and conditional distribution, normal model, bioassay example, grid sampling and grid evaluation. 
BDA3 Ch 3.\n\nRead the additional comments for Chapter 3\nCheck R demos or Python demos for Chapter 3\nMake and submit Assignment 3. Deadline Sunday 2024-09-29 23:59\nTA sessions 2024-09-25 14-16, 2024-09-26 12-14,\nHighly recommended, but optional: Make BDA3 exercises 3.2, 3.3, 3.9 (model solutions available for 3.1-3.3, 3.5, 3.9, 3.10)\nStart reading Chapter 10, see instructions below\n\n\n\n4) BDA3 Ch 10, Monte Carlo\nNumerical issues, Monte Carlo, how many simulation draws are needed, how many digits to report, direct simulation, curse of dimensionality, rejection sampling, and importance sampling. BDA3 Ch 10.\n\nRead BDA3 Chapter 10\n\nsee reading instructions for Chapter 10\n\nLecture Monday 2024-09-30 14:15-16, hall T1, CS building\n\nSlides 4\nVideos: 2024 Lecture 4.1 on numerical issues, Monte Carlo, how many simulation draws are needed, how many digits to report, and 2024 Lecture 4.2 on Pareto-\\(\\hat{k}\\) diagnostic, direct simulation, rejection sampling, and importance sampling. BDA3 Ch 10.\n\nRead the additional comments for Chapter 10\nCheck R demos or Python demos for Chapter 10\nMake and submit Assignment 4. Deadline Sunday 2024-10-06 23:59\nTA sessions 2024-10-02 14-16, 2024-10-03 12-14,\nHighly recommended, but optional: Make BDA3 exercises 10.1, 10.2 (model solution available for 10.4)\nStart reading Chapter 11, see instructions below\n\n\n\n5) BDA3 Ch 11, Markov chain Monte Carlo\nMarkov chain Monte Carlo, Gibbs sampling, Metropolis algorithm, warm-up, convergence diagnostics, R-hat, and effective sample size. BDA3 Ch 11.\n\nRead BDA3 Chapter 11\n\nsee reading instructions for Chapter 11\n\nLecture Monday 2024-10-07 14:15-16, hall T1, CS building\n\nSlides 5\nVideos: 2024 Lecture 5.1 on Markov chain Monte Carlo, Gibbs sampling, Metropolis algorithm, and 2024 Lecture 5.2 on warm-up, convergence diagnostics, R-hat, and effective sample size.\n\nRead the additional comments for Chapter 11\nCheck R demos or Python demos for Chapter 11\nMake and submit Assignment 5. Deadline Sunday 2024-10-13 23:59\nTA sessions 2024-10-09 14-16, 2024-10-10 12-14,\nHighly recommended, but optional: Make BDA3 exercise 11.1 (model solution available for 11.1)\nStart reading Chapter 12 + Stan material, see instructions below\n\n\n\n6) BDA3 Ch 12 + Stan, HMC, PPL, Stan\nHMC, NUTS, dynamic HMC and HMC specific convergence diagnostics, probabilistic programming and Stan. BDA3 Ch 12 + extra material\n\nRead BDA3 Chapter 12\n\nsee reading instructions for Chapter 12\n\nLecture Monday 2024-10-14 14:15-16, hall T1, CS building\n\nSlides 6\nVideos: 2023 Lecture 6.1 on HMC, NUTS, dynamic HMC and HMC specific convergence diagnostics, and 2024 Lecture 6.2 on probabilistic programming and Stan. BDA3 Ch 12 + extra material.\nOptional: Stan Extra introduction recorded 2020 Golf putting example, main features of Stan, benefits of probabilistic programming, and comparison to some other software.\n\nRead the additional comments for Chapter 12\nRead Stan introduction article\nCheck R demos for RStan or Python demos for PyStan\nAdditional material for Stan:\n\nDocumentation\nRStan installation\nPyStan installation\nBasics of Bayesian inference and Stan, Jonah Gabry & Lauren Kennedy Part 1 and Part 2\n\nMake and submit Assignment 6. DeadlineSunday 2024-10-27 23:59 (two weeks for this assignment)\nTA sessions 2024-10-16 14-16, 2024-10-17 12-14,\nStart reading Chapter 5 + Stan material, see instructions below\n\n\n\n7) BDA3 Ch 5, hierarchical models\nHierarchical models and exchangeability. 
BDA3 Ch 5.\n\nRead BDA3 Chapter 5\n\nsee reading instructions for Chapter 5\n\nLecture Monday 2024-10-28 14:15-16, hall T2, CS building\n\nSlides 7\nVideos: 2023 Lecture 7.1 on hierarchical models, 2023 Lecture 7.2 on exchangeability.\n\nRead the additional comments for Chapter 5\nCheck R demos or Python demos for Chapter 5\nMake and submit Old Assignment 7. Deadline Sunday 2024-11-10 23:59 (two weeks for this assignment)\nTA sessions 2024-10-30 14-16, 2024-10-31 12-14,\nHighly recommended, but optional: Make BDA3 exercises 5.1 and 5.2 (model solution available for 5.3-5.5, 5.7-5.12)\nStart reading Chapters 6-7 and additional material, see instructions below.\n\n\n\n8) BDA3 Ch 6+7 + extra material, model checking, cross-validation\nModel checking and cross-validation.\n\nRead BDA3 Chapters 6 and 7 (skip 7.2 and 7.3)\n\nsee reading instructions for Chapter 6 and Chapter 7\n\nRead Visualization in Bayesian workflow\n\nmore about workflow and examples of prior predictive checking and LOO-CV probability integral transformations\n\nRead Practical Bayesian model evaluation using leave-one-out cross-validation and WAIC (Journal link)\n\nreplaces BDA3 Sections 7.2 and 7.3 on cross-validation\n\nLecture Monday 2024-11-04 14:15-16, hall T2, CS building\n\nSlides 8a, Slides 8b\nVideos: 2022 Lecture 8.1 on model checking, and 2023 Lecture 8.2 on cross-validation part 1. BDA3 Ch 6-7 + extra material.\n\nRead the additional comments for Chapter 6 and Chapter 7\nCheck R demos or Python demos for Chapter 6\nAdditional reading material\n\nCross-validation FAQ\n\nNo new assignment in this block\nStart the project work\nTA sessions 2024-11-06 14-16, 2024-11-07 12-14,\nHighly recommended, but optional: Make BDA3 exercise 6.1 (model solution available for 6.1, 6.5-6.7)\n\n\n\n9) BDA3 Ch 7, extra material, model comparison and selection\nPSIS-LOO, K-fold-CV, model comparison and selection. Extra lecture on variable selection with projection predictive variable selection.\n\nRead Chapter 7 (no 7.2 and 7.3)\n\nsee reading instructions for Chapter 7\n\nRead Practical Bayesian model evaluation using leave-one-out cross-validation and WAIC (Journal link)\n\nreplaces BDA3 Sections 7.2 and 7.3 on cross-validation\n\nLecture Monday 2024-11-11 14:15-16, hall T2, CS building\n\nSlides 9\nVideos: 2023 Lecture 9.1 and 2023 Lecture 9.2 on model comparison, selection, and hypothesis testing.\n\nAdditional reading material\n\nCV FAQ\n\nMake and submit Old Assignment 8. Sunday 2024-11-17 23:59\nTA sessions 2024-11-13 14-16, 2024-11-14 12-14,\nStart reading Chapter 9, see instructions below.\n\n\n\n10) BDA3 Ch 9, decision analysis + BDA3 Ch 4 Laplace approximation and asymptotics\nDecision analysis. BDA3 Ch 9. + Laplace approximation and asymptotics. BDA Ch 4.\n\nRead Chapter 9 and 4\n\nsee reading instructions for Chapter 9\nsee reading instructions for Chapter 4\n\nLecture Monday 2024-11-18 14:15-16, hall T2, CS building\n\nSlides 10a, Slides 10b\nVideos: 2023 Lecture 10.1 on decision analysis. BDA3 Ch 9, and 2023 Lecture 10.2 on Laplace approximation, and asymptotics, BDA3 Ch 4.\n\nMake and submit Old Assignment 9. 
Sunday 2024-11-24 23:59\nTA sessions 2024-11-20 14-16, 2024-11-21 12-14,\nStart reading Chapter 4, see instructions below.\n\n\n\n11) Variable selection with projpred, project presentation example, extra\n\nLecture Monday 2024-11-25 14:15-16, hall T2, CS building\n\nSlides 11a, Slides Project Presentation, Slides 11 extra\nVideos: 2023 Lecture 11.1 on variable selecion with projpred, 2023 Lecture 11.2 on project presentations, 2023 Lecture 11.3 on rest of BDA3, ROS, and Bayesian Workflow\n\nNo new assignment. Work on project. TAs help with projects.\nTA sessions 2024-11-27 14-16, 2024-11-28 12-14,\n\n\n\n12) TBA\n\nLecture Monday 2024-12-02 14:15-16, hall T2, CS building\n\nSlides 12\n\nTBA\nWork on project. TAs help with projects. Project deadline 1.12. 23:59\nTA sessions 2024-12-04 14-16, 2024-12-05 12-14,\n\n\n\n13) Project evaluation\n\nProject report deadline 1.12. 23:59 (submit to peergrade).\n\nReview project reports done by your peers before 6.12. 23:59, and reflect on your feedback.\n\nProject presentations 9.-13.12. (evaluation week)" +    "text": "Schedule 2024\nThe course consists of 12 lectures, 9 assignments, a project work, and a project presentation in periods I and II. It’s good to start reading the material for the next lecture and assignment while making the assignment related to the previous lecture. There are 9 assignments and a project work with presentation, and thus the assignments are not in one-to-one correspondence with the lectures. The schedule below lists the lectures and how they connect to the topics, book chapters and assignments.\n\nSchedule overview\nHere is an overview of the schedule. Scroll down the page to see detailed instructions for each block. When you are working on the assignment related to the previous lecture, it is good to start reading the book chapters related to the next lecture and assignment. The schedule links to 2023 lecture videos until a couple of hours after the 2024 lecture has been recorded.\n\n\n\n\nReadings\nLectures\nAssignment\nLecture Date\nAssignment due date\n\n\n\n\n1. Introduction\nBDA3 Chapter 1\n2024 Lecture 1.1 Introduction, 2024 Lecture 1.2 Course practicalities, Slides 1.1, Slides 1.2\n2024 Assignment 1\n2024-09-02\n2024-09-15\n\n\n2. Basics of Bayesian inference\nBDA3 Chapter 1, BDA3 Chapter 2\n2023 Lecture 2.1 (2024 recording failed), 2024 Lecture 2.2, Slides 2\nAssignment 2\n2024-09-16\n2024-09-22\n\n\n3. Multidimensional posterior\nBDA3 Chapter 3\n2023 Lecture 3.1, 2023 Lecture 3.2, Slides 3\nAssignment 3\n2024-09-23\n2024-09-29\n\n\n4. Monte Carlo\nBDA3 Chapter 10\n2024 Lecture 4.1, 2024 Lecture 4.2, Slides 4\nAssignment 4\n2024-09-30\n2024-10-06\n\n\n5. Markov chain Monte Carlo\nBDA3 Chapter 11\n2024 Lecture 5.1, 2024 Lecture 5.2, Slides 5\nAssignment 5\n2024-10-07\n2024-10-13\n\n\n6. Stan, HMC, PPL\nBDA3 Chapter 12 + extra material on Stan\n2023 Lecture 6.1, Lecture 6.2, 2024 Slides 6\nAssignment 6\n2024-10-14\n2024-10-27\n\n\n7. Hierarchical models and exchangeability\nBDA3 Chapter 5\n2023 Lecture 7.1, 2023 Lecture 7.2, 2022 Project info, Slides 7\nAssignment 7\n2024-10-28\n2024-11-10\n\n\n8. Model checking & cross-validation\nBDA3 Chapter 6, BDA3 Chapter 7, Visualization in Bayesian workflow, Practical Bayesian cross-validation\n2023 Lecture 8.1, 2023 Lecture 8.2, Slides 8a, Slides 8b\nStart project work\n2024-11-04\nN/A\n\n\n9.
Model comparison, selection, and hypothesis testing\nBDA3 Chapter 7 (not 7.2 and 7.3), Practical Bayesian cross-validation\n2023 Lecture 9.1, 2023 Lecture 9.2, Slides 9\nOld Assignment 8\n2024-11-11\n2024-11-17\n\n\n10. Decision analysis\nBDA3 Chapter 9\n2023 Lecture 10.1, 2023 Lecture 10.2, Slides 10a, Slides 10b\nOld Assignment 9\n2024-11-18\n2024-11-24\n\n\n11. Variable selection with projpred, project presentation example\nBDA3 Chapter 4\n2023 Lecture 11.1, 2023 Lecture 11.2, 2023 Lecture 11.3, Slides 11a, Slides Project Presentation, Slides 11 extra\nProject work\n2024-11-25\nN/A\n\n\n12. TBA\n\nOptional: \nProject work\n2024-12-02\nN/A\n\n\n13. Project evaluation\n\n\n\nProject presentations: 9.-13.12.\nEvaluation week\n\n\n\n\n\n1) Course introduction, BDA3 Ch 1, prerequisites assignment\nCourse practicalities, material, assignments, project work, peergrading, QA sessions, TA sessions, prerequisites, chat, etc.\n\nLog in with your Aalto account to the Zulip course chat using the link in MyCourses\nIntroduction/practicalities lecture Monday 2024-09-02 14:15-16, hall C, Otakaari 1\n\n2024 Lecture videos 1.1 and 1.2 in Panopto\nSlides 1.1, Slides 1.2\n\nRead BDA3 Chapter 1\n\nstart with reading instructions for Chapter 1 and afterwards read the additional comments in the same document\n\nThere are no R demos for Chapter 1\nMake and submit 2024 Assignment 1. Deadline Sunday 2024-09-15 23:59\n\nWe highly recommend submitting all assignments on Friday before 3pm so that you can get TA help before submission. As the course has students who work weekdays (e.g. FiTech students), late submission until Sunday night is allowed, but we can’t provide support during the weekends.\nthis assignment checks that you have sufficient prerequisite skills (basic probability calculus, and R or Python)\nGeneral information about assignments\n\nR markdown template for assignments\nFAQ for the assignments has solutions to commonly asked questions related to RStudio setup, errors during package installations, etc.\n\n\nGet help in TA sessions 2024-09-04 14-16, Y342a, Otakaari 1, and 2024-09-05 12-14, Y429c-d, Otakaari 1\n\nin Sisu these are marked as exercise sessions, but we call them TA sessions\nthese are optional and you can choose which one to join\nsee more info about TA sessions\n\nHighly recommended, but optional: Make BDA3 exercises 1.1-1.4, 1.6-1.8 (model solutions available for 1.1-1.6)\nStart reading Chapters 1+2, see instructions below\n\n\n\n2) BDA3 Ch 1+2, basics of Bayesian inference\nBDA3 Chapters 1+2, basics of Bayesian inference, observation model, likelihood, posterior and binomial model, predictive distribution and benefit of integration, priors and prior information, and one parameter normal model.\n\nRead BDA3 Chapter 2\n\nsee reading instructions for Chapter 2\n\nLecture Monday 2024-09-16 14:15-16, hall T1, CS building\n\nSlides 2\nVideos: 2023 Lecture 2.1 (2024 recording failed), 2024 Lecture 2.2 on basics of Bayesian inference, observation model, likelihood, posterior and binomial model, predictive distribution and benefit of integration, priors and prior information, and one parameter normal model. BDA3 Ch 1+2.\n\nRead the additional comments for Chapter 2\nCheck R demos or Python demos for Chapter 2\nMake and submit Assignment 2.
Deadline Sunday 2024-09-22 23:59\n\nTA sessions 2024-09-18 14-16, Y342a, Otakaari 1, and 2024-09-19 12-14, Y429c-d, Otakaari 1\n\nHighly recommended, but optional: Make BDA3 exercises 2.1-2.5, 2.8, 2.9, 2.14, 2.17, 2.22 (model solutions available for 2.1-2.5, 2.7-2.13, 2.16, 2.17, 2.20, and 2.14 is in course slides)\nStart reading Chapter 3, see instructions below\n\n\n\n3) BDA3 Ch 3, multidimensional posterior\nMultiparameter models, joint, marginal and conditional distribution, normal model, bioassay example, grid sampling and grid evaluation. BDA3 Ch 3.\n\nRead BDA3 Chapter 3\n\nsee reading instructions for Chapter 3\n\nLecture Monday 2024-09-23 14:15-16, hall T1, CS building\n\nSlides 3\nVideos: 2023 Lecture 3.1 and 2023 Lecture 3.2 on multiparameter models, joint, marginal and conditional distribution, normal model, bioassay example, grid sampling and grid evaluation. BDA3 Ch 3.\n\nRead the additional comments for Chapter 3\nCheck R demos or Python demos for Chapter 3\nMake and submit Assignment 3. Deadline Sunday 2024-09-29 23:59\nTA sessions 2024-09-25 14-16, 2024-09-26 12-14,\nHighly recommended, but optional: Make BDA3 exercises 3.2, 3.3, 3.9 (model solutions available for 3.1-3.3, 3.5, 3.9, 3.10)\nStart reading Chapter 10, see instructions below\n\n\n\n4) BDA3 Ch 10, Monte Carlo\nNumerical issues, Monte Carlo, how many simulation draws are needed, how many digits to report, direct simulation, curse of dimensionality, rejection sampling, and importance sampling. BDA3 Ch 10.\n\nRead BDA3 Chapter 10\n\nsee reading instructions for Chapter 10\n\nLecture Monday 2024-09-30 14:15-16, hall T1, CS building\n\nSlides 4\nVideos: 2024 Lecture 4.1 on numerical issues, Monte Carlo, how many simulation draws are needed, how many digits to report, and 2024 Lecture 4.2 on Pareto-\\(\\hat{k}\\) diagnostic, direct simulation, rejection sampling, and importance sampling. BDA3 Ch 10.\n\nRead the additional comments for Chapter 10\nCheck R demos or Python demos for Chapter 10\nMake and submit Assignment 4. Deadline Sunday 2024-10-06 23:59\nTA sessions 2024-10-02 14-16, 2024-10-03 12-14,\nHighly recommended, but optional: Make BDA3 exercises 10.1, 10.2 (model solution available for 10.4)\nStart reading Chapter 11, see instructions below\n\n\n\n5) BDA3 Ch 11, Markov chain Monte Carlo\nMarkov chain Monte Carlo, Gibbs sampling, Metropolis algorithm, warm-up, convergence diagnostics, R-hat, and effective sample size. BDA3 Ch 11.\n\nRead BDA3 Chapter 11\n\nsee reading instructions for Chapter 11\n\nLecture Monday 2024-10-07 14:15-16, hall T1, CS building\n\nSlides 5\nVideos: 2024 Lecture 5.1 on Markov chain Monte Carlo, Gibbs sampling, Metropolis algorithm, and 2024 Lecture 5.2 on warm-up, convergence diagnostics, R-hat, and effective sample size.\n\nRead the additional comments for Chapter 11\nCheck R demos or Python demos for Chapter 11\nMake and submit Assignment 5. Deadline Sunday 2024-10-13 23:59\nTA sessions 2024-10-09 14-16, 2024-10-10 12-14,\nHighly recommended, but optional: Make BDA3 exercise 11.1 (model solution available for 11.1)\nStart reading Chapter 12 + Stan material, see instructions below\n\n\n\n6) BDA3 Ch 12 + Stan, HMC, PPL, Stan\nHMC, NUTS, dynamic HMC and HMC specific convergence diagnostics, probabilistic programming and Stan.
BDA3 Ch 12 + extra material\n\nRead BDA3 Chapter 12\n\nsee reading instructions for Chapter 12\n\nLecture Monday 2024-10-14 14:15-16, hall T1, CS building\n\nSlides 6\nVideos: 2023 Lecture 6.1 on HMC, NUTS, dynamic HMC and HMC specific convergence diagnostics, and 2024 Lecture 6.2 on probabilistic programming and Stan. BDA3 Ch 12 + extra material.\nOptional: Stan Extra introduction recorded 2020 Golf putting example, main features of Stan, benefits of probabilistic programming, and comparison to some other software.\n\nRead the additional comments for Chapter 12\nRead Stan introduction article\nCheck R demos for RStan or Python demos for PyStan\nAdditional material for Stan:\n\nDocumentation\nRStan installation\nPyStan installation\nBasics of Bayesian inference and Stan, Jonah Gabry & Lauren Kennedy Part 1 and Part 2\n\nMake and submit Assignment 6. Deadline Sunday 2024-10-27 23:59 (two weeks for this assignment)\nTA sessions 2024-10-16 14-16, 2024-10-17 12-14,\nStart reading Chapter 5 + Stan material, see instructions below\n\n\n\n7) BDA3 Ch 5, hierarchical models\nHierarchical models and exchangeability. BDA3 Ch 5.\n\nRead BDA3 Chapter 5\n\nsee reading instructions for Chapter 5\n\nLecture Monday 2024-10-28 14:15-16, hall T2, CS building\n\nSlides 7\nVideos: 2023 Lecture 7.1 on hierarchical models, 2023 Lecture 7.2 on exchangeability.\n\nRead the additional comments for Chapter 5\nCheck R demos or Python demos for Chapter 5\nMake and submit Assignment 7. Deadline Sunday 2024-11-10 23:59 (two weeks for this assignment)\nTA sessions 2024-10-30 14-16, 2024-10-31 12-14,\nHighly recommended, but optional: Make BDA3 exercises 5.1 and 5.2 (model solution available for 5.3-5.5, 5.7-5.12)\nStart reading Chapters 6-7 and additional material, see instructions below.\n\n\n\n8) BDA3 Ch 6+7 + extra material, model checking, cross-validation\nModel checking and cross-validation.\n\nRead BDA3 Chapters 6 and 7 (skip 7.2 and 7.3)\n\nsee reading instructions for Chapter 6 and Chapter 7\n\nRead Visualization in Bayesian workflow\n\nmore about workflow and examples of prior predictive checking and LOO-CV probability integral transformations\n\nRead Practical Bayesian model evaluation using leave-one-out cross-validation and WAIC (Journal link)\n\nreplaces BDA3 Sections 7.2 and 7.3 on cross-validation\n\nLecture Monday 2024-11-04 14:15-16, hall T2, CS building\n\nSlides 8a, Slides 8b\nVideos: 2022 Lecture 8.1 on model checking, and 2023 Lecture 8.2 on cross-validation part 1. BDA3 Ch 6-7 + extra material.\n\nRead the additional comments for Chapter 6 and Chapter 7\nCheck R demos or Python demos for Chapter 6\nAdditional reading material\n\nCross-validation FAQ\n\nNo new assignment in this block\nStart the project work\nTA sessions 2024-11-06 14-16, 2024-11-07 12-14,\nHighly recommended, but optional: Make BDA3 exercise 6.1 (model solution available for 6.1, 6.5-6.7)\n\n\n\n9) BDA3 Ch 7, extra material, model comparison and selection\nPSIS-LOO, K-fold-CV, model comparison and selection.
Extra lecture on variable selection with projection predictive variable selection.\n\nRead Chapter 7 (not 7.2 and 7.3)\n\nsee reading instructions for Chapter 7\n\nRead Practical Bayesian model evaluation using leave-one-out cross-validation and WAIC (Journal link)\n\nreplaces BDA3 Sections 7.2 and 7.3 on cross-validation\n\nLecture Monday 2024-11-11 14:15-16, hall T2, CS building\n\nSlides 9\nVideos: 2023 Lecture 9.1 and 2023 Lecture 9.2 on model comparison, selection, and hypothesis testing.\n\nAdditional reading material\n\nCV FAQ\n\nMake and submit Old Assignment 8. Deadline Sunday 2024-11-17 23:59\nTA sessions 2024-11-13 14-16, 2024-11-14 12-14,\nStart reading Chapter 9, see instructions below.\n\n\n\n10) BDA3 Ch 9, decision analysis + BDA3 Ch 4 Laplace approximation and asymptotics\nDecision analysis. BDA3 Ch 9. + Laplace approximation and asymptotics. BDA3 Ch 4.\n\nRead Chapters 9 and 4\n\nsee reading instructions for Chapter 9\nsee reading instructions for Chapter 4\n\nLecture Monday 2024-11-18 14:15-16, hall T2, CS building\n\nSlides 10a, Slides 10b\nVideos: 2023 Lecture 10.1 on decision analysis, BDA3 Ch 9, and 2023 Lecture 10.2 on Laplace approximation and asymptotics, BDA3 Ch 4.\n\nMake and submit Old Assignment 9. Deadline Sunday 2024-11-24 23:59\nTA sessions 2024-11-20 14-16, 2024-11-21 12-14,\nStart reading Chapter 4, see instructions below.\n\n\n\n11) Variable selection with projpred, project presentation example, extra\n\nLecture Monday 2024-11-25 14:15-16, hall T2, CS building\n\nSlides 11a, Slides Project Presentation, Slides 11 extra\nVideos: 2023 Lecture 11.1 on variable selection with projpred, 2023 Lecture 11.2 on project presentations, 2023 Lecture 11.3 on the rest of BDA3, ROS, and Bayesian Workflow\n\nNo new assignment. Work on project. TAs help with projects.\nTA sessions 2024-11-27 14-16, 2024-11-28 12-14,\n\n\n\n12) TBA\n\nLecture Monday 2024-12-02 14:15-16, hall T2, CS building\n\nSlides 12\n\nTBA\nWork on project. TAs help with projects. Project deadline 1.12. 23:59\nTA sessions 2024-12-04 14-16, 2024-12-05 12-14,\n\n\n\n13) Project evaluation\n\nProject report deadline 1.12. 23:59 (submit to peergrade).\n\nReview project reports done by your peers before 6.12. 23:59, and reflect on your feedback.\n\nProject presentations 9.-13.12.
(evaluation week)" }, { "objectID": "Aalto2024.html#r", diff --git a/sitemap.xml b/sitemap.xml index c3f31b52..f0aed6d4 100644 --- a/sitemap.xml +++ b/sitemap.xml @@ -2,50 +2,50 @@ https://avehtari.github.io/BDA_course_Aalto/ta_info_gsu.html - 2024-10-28T10:01:34.950Z + 2024-10-28T10:12:01.676Z https://avehtari.github.io/BDA_course_Aalto/project.html - 2024-10-28T10:01:34.522Z + 2024-10-28T10:12:01.252Z https://avehtari.github.io/BDA_course_Aalto/gsu2023.html - 2024-10-28T10:01:34.522Z + 2024-10-28T10:12:01.252Z https://avehtari.github.io/BDA_course_Aalto/assignments_gsu.html - 2024-10-28T10:01:34.474Z + 2024-10-28T10:12:01.204Z https://avehtari.github.io/BDA_course_Aalto/FAQ.html - 2024-10-28T10:01:34.474Z + 2024-10-28T10:12:01.204Z https://avehtari.github.io/BDA_course_Aalto/Aalto2024.html - 2024-10-28T10:01:34.474Z + 2024-10-28T10:12:01.204Z https://avehtari.github.io/BDA_course_Aalto/Aalto2023.html - 2024-10-28T10:01:34.474Z + 2024-10-28T10:12:01.204Z https://avehtari.github.io/BDA_course_Aalto/BDA3_notes.html - 2024-10-28T10:01:34.474Z + 2024-10-28T10:12:01.204Z https://avehtari.github.io/BDA_course_Aalto/assignments.html - 2024-10-28T10:01:34.474Z + 2024-10-28T10:12:01.204Z https://avehtari.github.io/BDA_course_Aalto/demos.html - 2024-10-28T10:01:34.474Z + 2024-10-28T10:12:01.204Z https://avehtari.github.io/BDA_course_Aalto/index.html - 2024-10-28T10:01:34.522Z + 2024-10-28T10:12:01.252Z https://avehtari.github.io/BDA_course_Aalto/project_gsu.html - 2024-10-28T10:01:34.522Z + 2024-10-28T10:12:01.252Z