We know how to take the average of a group of numbers: add all the numbers and divide by how many there are.
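In symbols: if the numbers are a_1, a_2, ..., a_n, their average is

\[
\bar{a} = \frac{a_1 + a_2 + \cdots + a_n}{n}.
\]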
What would the "average value" of a function be?
A function like f(x) = x or f(x) = e^x takes on infinitely many values. We can't add infinitely many values and divide by ∞.
However, there is a reasonable way to define the average value of a function on an interval.
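For reference, the standard definition, which the discussion below builds up to, is

\[
\text{Average value of } f \text{ on } [a, b] \;=\; \frac{1}{b - a} \int_a^b f(x)\, dx.
\]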
First, we'll briefly revisit taking averages of numbers. We want to think about these averages in a specific way that will make it easier to understand what the average value of a function means.