
Showing posts from November, 2018

8.4 due November 30

I don't quite understand Figure 8.4.3. It looks like the blue signal is the same in the two pictures and the red is different, but the caption says it should be the other way around, so I'm guessing the caption is wrong. This section wasn't too bad because I had done the Discrete Fourier Series lab today, and the section helped the lab make more sense. So I don't have anything to add.

7.3 due November 20

This section seemed pretty straightforward. The trickiest part was remembering the big-O costs of things. This section seems like it would fit with the tree material we did earlier. Maybe the sections we are skipping would explain why it is here in the book.

7.2 due November 19

The thing I didn't understand about this section is why we need to use a proposal distribution Q instead of just sampling uniformly (which I guess is also a distribution) when we are doing rejection sampling. I think it is because a proposal shaped like the target speeds things up: it will have fewer rejections than a flat rectangle would. This section feels like an extension of what we just learned. I understand most of it, but I'm not sure I would recognize when to use any of the techniques in this section.
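To convince myself how rejection sampling works, here is a minimal sketch of my own (not from the book): the target is a Beta(2,2) density on [0, 1], the proposal Q is just uniform, and the envelope constant M is the peak of the target. With this flat Q, the acceptance rate works out to 1/M ≈ 2/3; a Q shaped more like the target would let M be smaller and waste fewer draws.

```python
import random

def target(x):
    # density of Beta(2, 2) on [0, 1]: f(x) = 6 x (1 - x), peak 1.5 at x = 0.5
    return 6 * x * (1 - x)

def rejection_sample(n, seed=0):
    rng = random.Random(seed)
    M = 1.5  # envelope: target(x) <= M * q(x) for the uniform proposal q
    samples, tries = [], 0
    while len(samples) < n:
        x = rng.random()        # draw a candidate from the proposal Q (uniform)
        u = rng.random()        # uniform height for the accept/reject test
        tries += 1
        if u * M <= target(x):  # accept with probability target(x) / (M * q(x))
            samples.append(x)
    return samples, tries

samples, tries = rejection_sample(10_000)
# acceptance rate len(samples) / tries comes out near 1 / M = 2/3
```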

7.1 due November 16

The part I struggled with the most in this section was figuring out what a Monte Carlo method actually is. The section gives a general introduction and then an example, but never a precise definition. I figured it out, but it took me a bit to realize I needed to. I like the pi-estimation examples. I think my kids would, too.
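The pi example is short enough to write down from scratch; a minimal sketch of the standard dart-throwing version (my own code, not the book's): throw random points in the unit square and count how many land inside the quarter circle, whose area is pi/4.

```python
import random

def estimate_pi(n, seed=0):
    # Monte Carlo: the fraction of random points in the unit square
    # that fall inside the quarter circle estimates pi / 4
    rng = random.Random(seed)
    inside = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1:
            inside += 1
    return 4 * inside / n

estimate = estimate_pi(100_000)  # gets close to 3.14159, error shrinks like 1/sqrt(n)
```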

6.5 due November 12

The trickiest part of this section for me was keeping track of all the parts of the formulas; they were messier than most that we've done. I do think it is cool how the math works in every direction: you know the distribution and need to figure out what will happen, you know what happened and need the distribution, or you have some samples and need to estimate what will happen next.

6.3 due November 9

This section seems pretty straightforward. I don't follow every step of every proof (I don't have enough statistics background to know what we can and can't do), but generally it makes sense. It's always fun to study the parts of math and science that show order where we don't expect it. I love seeing structure and order built into the world.

6.2 due November 7

The hardest part for me is understanding the difference between the law of large numbers and Nota Bene 6.2.10. There is a lot in these sections that we seem to be glossing over, which leaves me feeling unsure of how well I understand the material.
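I can't reproduce Nota Bene 6.2.10 here, but the law of large numbers part I can at least check empirically. A toy sketch of my own (a fair die, not an example from the book): the sample mean of the rolls should settle toward the expected value 3.5 as the number of rolls grows.

```python
import random

def running_mean_error(n, seed=0):
    # average n fair-die rolls; the law of large numbers says the
    # sample mean approaches the expected value 3.5 as n grows
    rng = random.Random(seed)
    total = sum(rng.randint(1, 6) for _ in range(n))
    return abs(total / n - 3.5)

# the error shrinks (roughly like 1 / sqrt(n)) as n grows
errors = [running_mean_error(n) for n in (100, 10_000, 1_000_000)]
```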

6.1 due November 5

This section is tricky because it switches everything around. Now we know what our data is, but we don't know our probability function. I'm not quite sure why someone would use an estimator known to be biased. It might be easier to compute, but if it doesn't give you what you want, why use it?
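One answer I found for my own question: a biased estimator can be simpler and sometimes has lower overall error. The classic illustration (my own example, not necessarily the book's) is the variance estimator that divides by n, which is biased low, versus the unbiased one that divides by n - 1:

```python
def variance(xs, biased=True):
    # biased version divides by n (the maximum-likelihood estimator);
    # unbiased version divides by n - 1 (Bessel's correction)
    n = len(xs)
    mean = sum(xs) / n
    ss = sum((x - mean) ** 2 for x in xs)
    return ss / n if biased else ss / (n - 1)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
variance(data, biased=True)   # 4.0
variance(data, biased=False)  # 32/7, about 4.571
```

The biased version systematically underestimates, but it is the one that falls straight out of maximum likelihood, which I take to be part of the reason it still gets used.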

5.7 due November 2

I think the hardest part of this section is figuring out how to apply it all; it covers a lot of material without many examples. It is fun to see math/probability that is complicated enough to do a better job of describing the real world.