4.4 due October 12

This section made sense. At first the definition of average word length confused me, but the example helped a lot. (I kept trying to read it as the average length of an encoded word, not the average length of an encoded letter.)
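To check my understanding of "average length of an encoded letter," here is a small sketch (my own toy numbers, not from the section): the average is the expected codeword length, weighting each letter's code length by how often that letter occurs.

```python
# Toy source distribution and a prefix-free code (hypothetical values,
# chosen so the code lengths match the probabilities nicely):
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

# Average length per encoded LETTER: sum of p(letter) * len(code(letter)).
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(avg_len)  # 0.5*1 + 0.25*2 + 0.125*3 + 0.125*3 = 1.75 bits per letter
```

This is per letter, not per word: a long encoded message averages about 1.75 bits for each source letter, which is the quantity I kept confusing with the length of a whole encoded word.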

It was really nice to read this section that was mostly words after reading the 344 section that was mostly symbols.
