- Detect duplicate ERTs in a sample set and discard the set before downlink data rate calculations, so that every sample set used for rate estimation has unique ERTs
- Dropped frames/packets, duplicate ERTs, or incorrect ERT tags would skew estimated downlink rates, so run any consecutive filters first, then ERT filters, then data rate filters, so that the clearest error message is reported first. Run the Contact Filter last.
- Also move the frame length uniformity validation so it applies within a sample set, rather than across the entire query range
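The filter ordering above could be sketched as an ordered pipeline that reports the first failing check; `FilterPipeline`, `firstFailure`, and the filter names are illustrative assumptions here, not MMTC's actual API.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical sketch of the validation ordering: cheap, clearly reportable
// checks run first so the user sees the most actionable error message.
class FilterPipeline<T> {
    // LinkedHashMap preserves insertion order, which is the execution order.
    private final Map<String, Predicate<T>> filters = new LinkedHashMap<>();

    FilterPipeline<T> add(String name, Predicate<T> filter) {
        filters.put(name, filter);
        return this;
    }

    // Returns the name of the first failing filter, or null if all pass.
    String firstFailure(T sampleSet) {
        for (Map.Entry<String, Predicate<T>> e : filters.entrySet()) {
            if (!e.getValue().test(sampleSet)) {
                return e.getKey();
            }
        }
        return null;
    }
}
```

A caller would register filters in the order described above, e.g. `consecutive`, then `ert`, then `dataRate`, with `contact` last, and discard the sample set when `firstFailure` returns a non-null name.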
Further description:
Duplicate frames (e.g. in a test environment) can cause a divide-by-zero error during downlink data rate estimation, if that functionality is in use, when subsequent frames (after sorting) share the same ERT. Logging that error is confusing in this case; users would be better served by an explicit error stating that the sample set contains frames with identical ERTs and that the sample set is being discarded.
Make changes so that MMTC validates that a sample set has unique ERTs before the downlink data rate is calculated and set.
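The unique-ERT validation described above can be sketched as follows; the `Frame` record, `DataRateEstimator`, and method names are hypothetical stand-ins, not MMTC's actual types. The point is that the check runs before the division by the ERT span, so a duplicate ERT never produces a zero denominator.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.HashSet;
import java.util.List;
import java.util.OptionalDouble;
import java.util.Set;

class DataRateEstimator {
    // Hypothetical frame: an Earth Received Time plus a size in bits.
    record Frame(Instant ert, long sizeBits) {}

    // True iff every frame in the sample set carries a distinct ERT.
    static boolean hasUniqueErts(List<Frame> sampleSet) {
        Set<Instant> seen = new HashSet<>();
        for (Frame f : sampleSet) {
            if (!seen.add(f.ert())) {
                return false;
            }
        }
        return true;
    }

    // Validate unique ERTs before dividing by the ERT span; duplicate ERTs
    // at the span boundaries would otherwise cause a divide-by-zero.
    static OptionalDouble estimateDataRateBps(List<Frame> sortedSampleSet) {
        if (sortedSampleSet.size() < 2 || !hasUniqueErts(sortedSampleSet)) {
            // Clear, user-facing condition: the sample set is discarded.
            return OptionalDouble.empty();
        }
        Instant first = sortedSampleSet.get(0).ert();
        Instant last = sortedSampleSet.get(sortedSampleSet.size() - 1).ert();
        double spanSeconds = Duration.between(first, last).toNanos() / 1e9;
        long totalBits = sortedSampleSet.stream().mapToLong(Frame::sizeBits).sum();
        return OptionalDouble.of(totalBits / spanSeconds);
    }
}
```

Returning an empty `OptionalDouble` (rather than throwing from the division) keeps the failure mode explicit, so the caller can log the "duplicate ERTs, sample set discarded" message instead of a confusing arithmetic error.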