Answer:

The range is defined as the difference between the highest value and the lowest value in a data set: Range = maximum - minimum. This statistic is used to measure the variability of a series of data because it tells you how far apart the values in one tail of the distribution are from the values in the other tail.

Imagine that you manufacture a type of spare part for cars that must have a measurement of 10 cm with a margin of error of 1 cm.

This is:

10 ± 1 cm

Then you expect your manufacturing process to produce pieces with nearly identical dimensions, that is, with little variability.

If you randomly select a sample of n pieces and measure them, you expect the variability to be low if your process is of good quality; in particular, you expect a small range, preferably less than 1 cm.

{10, 10.1, 10.5, 9.8, 9.6, 10.2} Range = 10.5 - 9.6 = 0.9 cm   low variability

But if instead you find a range as large as 8 cm, it means that not all pieces measure close to 10 cm: the variability of the measurements is high.

{14, 12, 11, 8, 7, 11, 12, 15} Range = 15 - 7 = 8 cm   high variability
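To make the two calculations above concrete, here is a minimal Python sketch (the language choice, the `sample_range` helper, and the 1 cm target are illustrative assumptions, not part of the original answer) that computes the range of each sample and checks it against the quality target:

```python
def sample_range(values):
    """Range = highest value - lowest value."""
    return max(values) - min(values)

# The two samples from the examples above (measurements in cm).
low_var = [10, 10.1, 10.5, 9.8, 9.6, 10.2]
high_var = [14, 12, 11, 8, 7, 11, 12, 15]

target = 1.0  # per the example, a quality process keeps the range below 1 cm

for label, sample in [("low variability", low_var),
                      ("high variability", high_var)]:
    r = sample_range(sample)
    verdict = "OK" if r < target else "too much spread"
    print(f"{label}: range = {r:.1f} cm ({verdict})")
```

Running this prints a 0.9 cm range for the first sample and an 8 cm range for the second, matching the arithmetic above.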

The range can help describe a data set by summarizing its overall extent in a single number, showing the spread within the data set, and allowing you to compare the spread between similar data sets. Simply put, it is the amount of variation from the lowest number to the highest and indicates the size of the statistical dispersion.
