Hi Richard,
Initially I thought you were right, but on second thought, Maria actually did give us the model, i.e., the data-generating process, not just a data sample. By writing out that her data are conditionally normally distributed in each class, and that the covariance matrix is the same across classes, she's basically saying that the Bayes classifier (the classifier that minimizes the probability of misclassification) is LDA. See https://en.wikipedia.org/wiki/Linear_discriminant_analysis
library(MASS)  # mvrnorm lives in MASS
Sigma <- matrix(c(1, 0, 0, 3), 2)  # shared covariance across all three classes
x1 <- mvrnorm(50, mu = c(0, 0), Sigma = Sigma)
x2 <- mvrnorm(50, mu = c(3, 3), Sigma = Sigma)
x3 <- mvrnorm(50, mu = c(1, 6), Sigma = Sigma)
So now it's just a matter of plotting the linear boundaries; the code linked by @Max could probably be rewritten in ggplot2 without much trouble.
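In case it helps, here's a minimal sketch of one way to do it (this is my own grid-classification approach, not the code @Max linked): fit LDA with `MASS::lda()` on the samples above, classify a dense grid, and draw the class-label contours, which are exactly the linear boundaries.

```r
library(MASS)

# Stack the three samples and label them
d <- data.frame(rbind(x1, x2, x3),
                class = factor(rep(1:3, each = 50)))
names(d)[1:2] <- c("X1", "X2")

fit <- lda(class ~ X1 + X2, data = d)

# Classify a 200 x 200 grid covering the data range
grid <- expand.grid(X1 = seq(min(d$X1), max(d$X1), length.out = 200),
                    X2 = seq(min(d$X2), max(d$X2), length.out = 200))
grid$class <- predict(fit, grid)$class

plot(d$X1, d$X2, col = d$class, pch = 19, xlab = "X1", ylab = "X2")
# Contours between class labels 1/2 and 2/3 trace the linear boundaries
contour(unique(grid$X1), unique(grid$X2),
        matrix(as.numeric(grid$class), 200, 200),
        levels = c(1.5, 2.5), add = TRUE, drawlabels = FALSE)
```

The grid trick is convenient because it works unchanged for QDA or any other classifier; for LDA specifically you could instead solve for the boundary lines analytically from the discriminant coefficients.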