R Assignment - Neural Network Theory
Date: 2020-11-09
Create an RMarkdown file and turn in both the knitted version (as HTML) and the .Rmd file. Your file should answer the following questions.

Empirical Chain Rule (16 pts)

1. Create a function `p` with the following features:
   - `p` must calculate a degree 5 polynomial
   - `p` must have two local maxima and two local minima between x = -2 and x = 5
   - `p` must be appropriately vectorized
2. Create a function `q` with the following features:
   - `q` must calculate a degree 3 polynomial
   - `q` must have one maximum and one minimum between x = 0 and x = 1
   - `q` must have a range between y = -2 and y = 5 over the interval [0, 1]
3. Create a function `p.q` which is the composition p(q(x)).
4. Create the function `D` (see the notes) that approximates the derivative. Use h = 0.0001.
5. Use `D` to create the functions:
   1. `Dp`
   2. `Dq`
   3. `Dp.q`
6. Produce a graph with the following properties:
   - Must show `p` between x = -2 and x = 5 [updated]
   - Must show `Dp` between x = -2 and x = 5 [updated]
   - The ylim() should be enough to show both `p` and `Dp` clearly.
   - Add dotted vertical lines at the zeros of `Dp`.
   - The x-axis and y-axis should be appropriately labelled, and the graph should have a title (hint: `main=""`).
7. Produce a second graph with the following properties:
   - Must show `q` between x = 0 and x = 1
   - Must show `Dq` between x = 0 and x = 1
   - The ylim() should be enough to show both `q` and `Dq` clearly.
   - Add dotted vertical lines at the zeros of `Dq`.
   - The x-axis and y-axis should be appropriately labelled, and the graph should have a title (hint: `main=""`).
8. Produce a third graph with the following properties:
   - Must show `p.q` between x = 0 and x = 1
   - Must show `Dp.q` between x = 0 and x = 1
   - Add a curve for `Dq(x)*Dp(q(x)) + offset`, where `offset` is chosen to make it clear that the graph of `Dp.q` and your curve are the same shape.
   - The ylim() should be enough to show all details clearly.
   - Add dotted vertical lines at the zeros of `Dp.q`.
   - The x-axis and y-axis should be appropriately labelled, and the graph should have a title (hint: `main=""`).

Minimal R sketches for these questions appear at the end of this page.

Neural Network I (bounded activation functions) (5 pts)

A function is bounded if there is a lower limit and an upper limit to the possible values it can output. For example, the logistic function is bounded because it can only return values in the interval (0, 1). In one or two paragraphs, discuss why a neural net that contains a hidden layer whose every node uses a bounded activation function must have both an upper and a lower bound on the possible values of its output nodes. Explain why this makes it impossible for a neural network that uses the logistic function as the activation function for its hidden nodes to produce a perfect approximation to the function f(x) = x.

Neural Network II (ReLU) (5 pts)

Consider a neural network that uses ReLU for all of its hidden nodes. Explain why the function encoded by such a network is necessarily linear for sufficiently large input values. Explain the implications for emulating f(x) = x.
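The following is a minimal R sketch for questions 1-5, not a graded solution. The coefficients of `p` and `q` are illustrative placeholders that are intended to satisfy the stated constraints but should be verified (and tuned) against them, and `D` is written here as a simple forward-difference approximation with h = 0.0001; the course notes may define `D` differently.

```r
# Illustrative degree-5 polynomial; coefficients chosen so that p has
# two local maxima and two local minima on [-2, 5] (verify before use).
# Vectorized arithmetic makes p work on vectors automatically.
p <- function(x) 0.05 * x^5 - 0.3 * x^4 - 0.2 * x^3 + 2 * x^2 + 0.5 * x

# Illustrative degree-3 polynomial; intended to have one maximum and one
# minimum on [0, 1], with values staying between -2 and 5 on that interval.
q <- function(x) 40 * x^3 - 60 * x^2 + 22 * x

# Composition p(q(x))
p.q <- function(x) p(q(x))

# Forward-difference approximation of the derivative with step h = 0.0001
D <- function(f, h = 0.0001) {
  function(x) (f(x + h) - f(x)) / h
}

Dp   <- D(p)
Dq   <- D(q)
Dp.q <- D(p.q)
```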
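Below is a hedged sketch of the kind of plot question 8 asks for, assuming the functions from the sketch above. The value of `offset` and the method of locating zeros (sign changes on a fine grid) are illustrative choices, not the required approach; graphs 6 and 7 follow the same pattern with `p`/`Dp` on [-2, 5] and `q`/`Dq` on [0, 1].

```r
x      <- seq(0, 1, length.out = 1000)
offset <- 5   # illustrative vertical shift so the chain-rule curve is distinguishable

# Compute all three curves first so ylim covers everything
chain <- Dq(x) * Dp(q(x)) + offset
ylim  <- range(c(p.q(x), Dp.q(x), chain))

plot(x, p.q(x), type = "l", ylim = ylim,
     xlab = "x", ylab = "y", main = "p(q(x)), its derivative, and the chain rule")
lines(x, Dp.q(x), col = "blue")
lines(x, chain, col = "red", lty = 2)

# Dotted vertical lines at approximate zeros of Dp.q (sign changes on the grid)
zeros <- x[which(diff(sign(Dp.q(x))) != 0)]
abline(v = zeros, lty = 3)
```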
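As a small numerical illustration of the boundedness claim in the first essay prompt (the weights below are arbitrary, made-up values, and this is not a substitute for the written explanation): with logistic hidden activations every hidden value lies in (0, 1), so a single output node can never leave a fixed interval determined by its weights and bias, no matter how large the input is.

```r
sigmoid <- function(z) 1 / (1 + exp(-z))

w_hid <- c(1.5, -2.0, 0.7);  b_hid <- c(0.1, -0.3, 0.5)   # hidden-layer weights/biases
w_out <- c(2.0, -1.0, 3.0);  b_out <- 0.2                  # output-layer weights/bias

# One-hidden-layer network with logistic activations and a linear output node
net <- function(x) {
  h <- sigmoid(outer(x, w_hid) + matrix(b_hid, length(x), 3, byrow = TRUE))
  drop(h %*% w_out) + b_out
}

range(net(seq(-1000, 1000, by = 1)))                 # output stays in a fixed interval
c(b_out - sum(abs(w_out)), b_out + sum(abs(w_out)))  # the bound implied by the weights
```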