> > The default values of the sampling parameters should be safe. They
> > should ensure that the number of sampled packets is no more than
> > 0.01% of the packets carried by the observed link, or that they do
> > not add up to more than 0.01% of the link capacity. These
> > constraints should hold over any 30-second time interval. A
> > configuration of the sampling function that samples no packets at
> > all is safe.
>
> Cristian, can you explain where these figures come from?

I just made them up; we could use any other value. I think this is a
subjective matter, and I don't see any objective way of deciding what
the right number should be. Maybe there is an objective way of deciding
whether averaging over 30 seconds is too little, too much, or just
right. It looks good to me, but those 30 seconds are also just a guess.
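To make the constraint concrete, here is a minimal sketch of the arithmetic behind the proposed default: over any 30-second window, the sampled packets must not add up to more than 0.01% of the bytes the observed link can carry. The function name and parameters are illustrative, not from any specification.

```python
def sampling_budget(link_bps, window_s=30, fraction=1e-4):
    """Byte budget for sampled traffic in one averaging window.

    link_bps: observed link capacity in bits per second (assumption:
    the 0.01% constraint is applied against raw link capacity).
    window_s: averaging interval (30 s in the proposed default).
    fraction: maximum sampled share (1e-4 = 0.01%).
    """
    link_bytes = link_bps / 8 * window_s  # bytes carried per window
    return link_bytes * fraction

# Example: a 1 Gbit/s link carries 3.75 GB in 30 s, so sampled
# packets may add up to at most 375 kB in that window.
budget = sampling_budget(1_000_000_000)
print(budget)  # 375000.0
```

A configuration that samples nothing trivially satisfies this budget, which is why it counts as safe.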
> I understand that they are just default values. I also understand
> that we are talking about two different links: the observed link,
> where no more than one packet in 10,000 should be picked by the
> sampling process, and the link that exports the sampled information,
> where no more than 1/10,000 of additional bandwidth should be allowed
> for each filter on an observed link.

Actually, I was thinking of something else. What I meant there is that
if someone decides to sample based on packet sizes, so that large
packets are more likely to be sampled (because one 1500-byte packet
matters more than a 40-byte one), then the sampled packets should not
add up to more than 0.01% of the link capacity.
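The size-weighted case can be sketched as follows. This is not a method from the thread, just one illustration of the idea: selection probability grows with packet size, while a hard byte-budget counter enforces the 0.01%-of-capacity cap regardless of what the probabilities do. The class name, the calibration constant, and the budget parameter are all hypothetical.

```python
import random

class SizeWeightedSampler:
    """Sketch: sample packets with probability proportional to size,
    but never let sampled bytes exceed a hard per-window budget."""

    def __init__(self, byte_budget, max_packet=1500, base_prob=1e-4):
        self.byte_budget = byte_budget   # e.g. 0.01% of window bytes
        self.sampled_bytes = 0
        self.max_packet = max_packet
        self.base_prob = base_prob       # probability for a max-size packet

    def offer(self, size):
        """Return True if this packet should be sampled."""
        if self.sampled_bytes + size > self.byte_budget:
            return False  # safety cap: would exceed the byte budget
        # Probability scales linearly with size: a 1500-byte packet
        # is ~37x more likely to be picked than a 40-byte one.
        if random.random() < self.base_prob * size / self.max_packet:
            self.sampled_bytes += size
            return True
        return False
```

The budget check runs before the random draw, so even a badly calibrated probability cannot push sampled bytes past the cap; resetting `sampled_bytes` every 30 seconds would give the per-window behavior described above.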