The most critical part of any test in the 3GPP 3G/4G/LTE/5G, IMS, WiFi Offload, or CIoT environments is determining the results. A comprehensive set of measurements should be an integral part of any test tool in order to gain a broader understanding of the potential shortcomings of the network entities under test. In addition, measurements should provide the necessary information for operators and vendors to judge network and node capacity so that they may adequately plan future requirements and upgrades.
Testing today’s sophisticated mobile networks requires measurements beyond the usual message and error counters. It is vital that the available set of measurements provides awareness of more complex concepts such as transaction duration, network latency, and successful transactions per second. Measurements such as message re-transmissions, duplicate messages, unsupported or malformed messages, and timeouts should also be available.
Furthermore, the granularity of the test measurements, both in terms of time and resources, must be fine enough to isolate problems. Measurements from protocol layers and system resources are essential when troubleshooting failed tests and issues in the device under test, in the test network, or with the test platform itself.
dsTest provides such sets of Operational Measurements (OMs) for each of its interface applications.
At the application layer, dsTest provides transaction attempts, successes, and failures along with minimum, maximum, and average transaction duration for each report interval. The causes of transaction failures can typically be quickly discerned using the error counters that record error indications received in messages. Application layer OMs also report the number of messages sent and received by message type as well as reporting any malformed messages.
dsTest’s transport layer OMs focus on physical connections and also record messages and bytes sent and received. Error OMs for this layer record re-transmissions, duplicate messages received, ordering issues, buffer overruns, and protocol violations.
The OM framework also provides measurements for dsTest features such as Traffic Profile, and custom measurements that record SmartEvents or SmartFlow state machine activities while also reflecting their configurations.
dsTest collects OM values at one-second intervals by default but can be configured with report intervals ranging from 200 milliseconds to 1 hour. It writes these values to a SQLite database, creating additional databases as needed to ensure that measurements from the entire test run remain available.
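Because the OM stores are standard SQLite files, they can be inspected with any SQLite client or script after a test run. The sketch below builds an in-memory database with an illustrative schema (the table and column names are assumptions for illustration, not the actual dsTest layout) and pulls one counter’s values across report intervals:

```python
import sqlite3

# Hypothetical schema -- the real dsTest OM database layout is not
# reproduced here; table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE om_values (interval_id INTEGER, om_name TEXT, value REAL)"
)
sample = [
    (1, "transaction_attempts", 100.0),
    (1, "transaction_failures", 2.0),
    (2, "transaction_attempts", 250.0),
    (2, "transaction_failures", 2.0),
]
conn.executemany("INSERT INTO om_values VALUES (?, ?, ?)", sample)

# Pull one counter's trajectory across report intervals.
rows = conn.execute(
    "SELECT interval_id, value FROM om_values "
    "WHERE om_name = ? ORDER BY interval_id",
    ("transaction_attempts",),
).fetchall()
print(rows)  # [(1, 100.0), (2, 250.0)]
```

The same pattern extends to joining multiple OMs per interval or filtering to a time frame of interest.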
dsTest and dsClient offer several ways to retrieve measurements, ranging from live OM sampling via dsClient Terminal to post-test statistical evaluation with dsClient Desktop’s SmartReport.
dsClient Desktop Reporting Features
dsClient Desktop provides the following feature-rich reporting options while your test is running and after it is complete.
- OM values are summarized by application type, node type, and interface type while the lowest-level details are always available.
- You can define persistent custom measurements that perform mathematical operations on one or more OMs. You can also use custom OMs in the calculations for other custom OMs to form more complex calculations.
- Three report modes allow you to view OM values natively (values accumulate over time), as incremental values in each reporting interval, or as a rate of change per second.
- An error indicator displays the sum of all error OM values and provides an option to add some or all of those OMs to your report.
- Export values for either a specified time frame or from the entire report to a CSV file.
- Save report configurations in your library for use with future tests.
Chart reports can contain multiple charts, each with its own set of OMs. Callouts display the latest values: either the most recent value in a live report or the final value in a database report. Hover your mouse over the interior of a chart to display the values for any report interval. Zoom in to an interesting time frame with a click and drag of the mouse, and all of the charts in your report will synchronize to the same time frame. You can choose how multiple charts are arranged (stacked, side-by-side, or tiled), and you can also synchronize Y-axis values. Export a chart image with the click of the snapshot button. You can even brand your charts with a watermark background image or configure custom plot colors and line styles if desired.
If you are more interested in actual values than trends, choose a Tabular Report. Report values are displayed in columns labeled with your choice of either the interval time or the interval identifier. Easily track error measurements as they appear in red. You can page through the report intervals, jump to the beginning or end of the test, or jump to a specific interval with the convenient navigation controls.
SmartReport is designed to provide test status at a glance, and it also gives you the tools to find the root cause of failed tests that are difficult to troubleshoot.
The SmartReport dashboard provides you with instant feedback on the status of your test based on your definition of a successful test. You can define multiple criteria profiles and choose the appropriate profile when you open the report. Within the SmartReport profile you can also define custom dashboards that evaluate the OM(s) you specify.
SmartReport produces, by default, a summary chart for each of the dashboards in your criteria profile. Every chart includes a menu that enables you to view ancillary charts that provide more detailed OMs. Drill down through the node or application paths or branch off to the diagnostic path at any level.
One of the most powerful troubleshooting tools dsTest offers is the statistical analysis available with SmartReport. When the root cause of a failed test is not apparent from the error OM indications, SmartReport’s statistical analysis can show you which OMs exhibited unusual variability during the time frame in which the test was struggling. That determination is made by examining changes in standard deviation from one interval to the next. In the example shown below we can see that transaction duration variance, maximum transaction duration, and delay time for updateLocationAnswer messages all change rapidly in the same time frame, indicating that the device under test may be overwhelmed.
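The idea behind that analysis can be sketched as a rolling standard deviation whose interval-to-interval jumps flag suspicious time frames. This is an illustrative approximation, not dsTest’s actual algorithm, and the window size and sample data are arbitrary:

```python
import statistics

def stddev_changes(values, window=5):
    """Rolling (population) standard deviation over a sliding window,
    then the absolute change between consecutive windows. Large jumps
    mark intervals where an OM's behavior suddenly became erratic."""
    devs = [
        statistics.pstdev(values[i:i + window])
        for i in range(len(values) - window + 1)
    ]
    return [abs(b - a) for a, b in zip(devs, devs[1:])]

# Hypothetical transaction durations (ms): steady at first, then erratic.
durations = [10, 10, 10, 10, 10, 10, 40, 5, 60, 8]
changes = stddev_changes(durations)
# The first change is 0 (both windows are steady); the largest jump
# appears where the erratic samples first enter the window.
```

Ranking OMs by the size of these jumps is one simple way to surface the measurements most worth investigating.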
dsTest and dsClient both offer reporting solutions for automated testing.
- dsClient Terminal supports measurement sampling with the OM command, which has various options to isolate or summarize OM values; the text response is printed to standard output.
- dsTest provides a utility that produces a series of CSV files containing values for all of the OMs and report intervals in a database.
- Run dsClient Desktop in headless mode to produce pass/fail test results. It will write the SmartReport dashboard sections to an HTML file. You define the template for that file, which may include any information you desire.