<?% config.freshness.reviewed = '2021-04-12' %?>

# PeerConnection Level Framework

## API

* [Fixture][1]
* [Fixture factory function][2]

## Documentation

The PeerConnection level framework is designed for end-to-end media quality
testing through the PeerConnection level public API. The framework uses the
*Unified plan* API to generate offers/answers during the signaling phase. The
framework also wraps the video encoder/decoder and injects them into
*`webrtc::PeerConnection`* to measure video quality, performing 1:1 frame
matching between captured and rendered frames without any extra requirements
on the input video. For audio quality evaluation the standard `GetStats()` API
from PeerConnection is used.

The framework API is located in the namespace *`webrtc::webrtc_pc_e2e`*.

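The 1:1 frame matching mentioned above can be sketched conceptually as follows.
This is a simplified, hypothetical Python illustration, not the actual C++
implementation: the real framework tags frames inside the injected
encoder/decoder wrappers.

```python
# Conceptual sketch of 1:1 frame matching: each captured frame gets a
# unique id that travels with it through the pipeline, so a rendered
# frame can be matched back to the exact captured frame it came from.

class FrameMatcher:
    def __init__(self):
        self._next_id = 0
        self._captured = {}  # frame id -> captured frame payload

    def on_frame_captured(self, frame):
        # Assign the next id to the captured frame and remember it.
        frame_id = self._next_id
        self._next_id += 1
        self._captured[frame_id] = frame
        return frame_id

    def on_frame_rendered(self, frame_id, rendered_frame):
        # Pop the matching captured frame; the pair is what a quality
        # analyzer would compare (e.g. PSNR/SSIM between the two).
        captured_frame = self._captured.pop(frame_id)
        return captured_frame, rendered_frame

    def frames_in_flight(self):
        # Frames captured but not yet matched to a rendered frame.
        return len(self._captured)
```

Because matching is by id rather than by content, no special pattern is needed
in the input video.
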
### Supported features

* Single or bidirectional media in the call
* RTC event log dump per peer
* AEC dump per peer
* Compatible with *`webrtc::TimeController`* for both real and simulated time
* Media
    * AV sync
* Video
    * Any number of video tracks from both the caller and callee sides
    * Input video from
        * Video generator
        * Specified file
        * Any instance of *`webrtc::test::FrameGeneratorInterface`*
    * Dumping of captured/rendered video into a file
    * Screen sharing
    * VP8 simulcast from the caller side
    * VP9 SVC from the caller side
    * Choice of video codec (name and parameters); multiple codecs can be
      negotiated to support codec-switching testing
    * FEC (ULP or Flex)
    * Forced codec overshooting (to emulate encoder overshoot on some mobile
      devices, where the hardware encoder can overshoot the target bitrate)
* Audio
    * Up to one audio track from each of the caller and callee sides
    * Generated audio
    * Audio from a specified file
    * Dumping of captured/rendered audio into a file
    * Parameterization of `cricket::AudioOptions`
    * Echo emulation
* Injection of various WebRTC components into the underlying
  *`webrtc::PeerConnection`* or *`webrtc::PeerConnectionFactory`*. You can see
  the full list [here][11]
* Scheduling of events that can happen during the test, for example:
    * Changes in network configuration
    * User statistics measurements
    * Custom defined actions
* User-defined statistics reporting via the
  *`webrtc::webrtc_pc_e2e::PeerConnectionE2EQualityTestFixture::QualityMetricsReporter`*
  interface

## Exported metrics

### General

* *`<peer_name>_connected`* - peer successfully established connection to the
  remote side
* *`cpu_usage`* - CPU usage excluding the video analyzer
* *`audio_ahead_ms`* - used to estimate how much audio and video are out of
  sync when the two tracks come from the same source. Stats are polled
  periodically during a call. The metric represents how much earlier audio was
  played out, on average over the call. If, during a stats poll, video is
  ahead, then audio_ahead_ms will be equal to 0 for this poll.
* *`video_ahead_ms`* - used to estimate how much audio and video are out of
  sync when the two tracks come from the same source. Stats are polled
  periodically during a call. The metric represents how much earlier video was
  played out, on average over the call. If, during a stats poll, audio is
  ahead, then video_ahead_ms will be equal to 0 for this poll.

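The clamping behavior of these two sync metrics can be illustrated with a
small sketch. `av_sync_samples` is a hypothetical helper, not framework code;
it only shows how each poll contributes to at most one of the two metrics.

```python
def av_sync_samples(audio_play_ms, video_play_ms):
    """Given playout timestamps from matching audio/video stats polls,
    return per-poll (audio_ahead_ms, video_ahead_ms) samples.

    Whichever track is behind gets 0 for its "ahead" sample, so each
    metric only accumulates the positive lead of its own track."""
    samples = []
    for audio_ms, video_ms in zip(audio_play_ms, video_play_ms):
        diff = video_ms - audio_ms  # > 0 means audio played out earlier
        audio_ahead = max(diff, 0)
        video_ahead = max(-diff, 0)
        samples.append((audio_ahead, video_ahead))
    return samples
```
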
### Video

See documentation for
[*`DefaultVideoQualityAnalyzer`*](default_video_quality_analyzer.md#exported-metrics)

### Audio

* *`accelerate_rate`* - when playout is sped up, this counter is increased by
  the difference between the number of samples received and the number of
  samples played out. If speedup is achieved by removing samples, this will be
  the count of samples removed. The rate is calculated as the difference
  between nearby counter samples divided by the sample interval.
* *`expand_rate`* - the total number of samples that are concealed samples
  over time. A concealed sample is a sample that was replaced with synthesized
  samples generated locally before being played out. Examples of samples that
  have to be concealed are samples from lost packets or samples from packets
  that arrive too late to be played out.
* *`speech_expand_rate`* - the total number of samples that are concealed
  samples minus the total number of inserted concealed samples that are
  "silent", over time. Playing out silent samples results in silence or
  comfort noise.
* *`preemptive_rate`* - when playout is slowed down, this counter is increased
  by the difference between the number of samples received and the number of
  samples played out. If playout is slowed down by inserting samples, this
  will be the number of inserted samples. The rate is calculated as the
  difference between nearby counter samples divided by the sample interval.
* *`average_jitter_buffer_delay_ms`* - average size of the NetEQ jitter
  buffer.
* *`preferred_buffer_size_ms`* - preferred size of the NetEQ jitter buffer.
* *`visqol_mos`* - proxy for the audio quality itself.
* *`asdm_samples`* - measure of how much acceleration/deceleration was in the
  signal.
* *`word_error_rate`* - measure of how intelligible the audio was (percent of
  words that could not be recognized in the output audio).

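The "difference between nearby counter samples divided by the sample interval"
calculation shared by accelerate_rate and preemptive_rate can be sketched as
follows. `rate_from_counter` is a hypothetical helper for illustration only.

```python
def rate_from_counter(counter_samples, sample_interval_samples):
    """Turn cumulative counter samples (e.g. accelerated or inserted
    sample counts polled periodically) into per-interval rates: the
    difference between nearby counter samples divided by the number of
    samples in one polling interval."""
    rates = []
    for prev, cur in zip(counter_samples, counter_samples[1:]):
        rates.append((cur - prev) / sample_interval_samples)
    return rates
```

For example, a counter that grows by 480 samples over a 4800-sample polling
interval yields a rate of 0.1 for that interval.
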
### Network

* *`bytes_sent`* - the total number of payload bytes sent on this
  PeerConnection, i.e., not including headers or padding.
* *`packets_sent`* - the total number of packets sent over this
  PeerConnection’s transports.
* *`average_send_rate`* - average send rate, calculated as bytes_sent divided
  by the test duration.
* *`payload_bytes_sent`* - total number of bytes sent for all SSRCs plus the
  total number of RTP header and padding bytes sent for all SSRCs. This does
  not include the size of transport layer headers such as IP or UDP.
* *`sent_packets_loss`* - packets_sent minus the corresponding
  packets_received.
* *`bytes_received`* - the total number of bytes received on this
  PeerConnection, i.e., not including headers or padding.
* *`packets_received`* - the total number of packets received on this
  PeerConnection’s transports.
* *`average_receive_rate`* - average receive rate, calculated as
  bytes_received divided by the test duration.
* *`payload_bytes_received`* - total number of bytes received for all SSRCs
  plus the total number of RTP header and padding bytes received for all
  SSRCs. This does not include the size of transport layer headers such as IP
  or UDP.

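The two derived metrics above are simple arithmetic over the raw counters.
A minimal sketch (hypothetical helpers, not framework code):

```python
def average_send_rate_bytes_per_sec(bytes_sent, test_duration_sec):
    # average_send_rate as described above: total bytes sent divided by
    # the test duration (average_receive_rate is computed the same way
    # from bytes_received).
    return bytes_sent / test_duration_sec


def sent_packets_loss(packets_sent, packets_received):
    # sent_packets_loss: packets sent by one side minus the
    # corresponding packets received by the other side.
    return packets_sent - packets_received
```
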
### Framework stability

* *`frames_in_flight`* - the number of frames that were captured but never
  seen on the receiver, such that no later frames were seen on the receiver
  either.
* *`bytes_discarded_no_receiver`* - total number of bytes that were received
  on network interfaces related to the peer, but whose destination port was
  closed.
* *`packets_discarded_no_receiver`* - total number of packets that were
  received on network interfaces related to the peer, but whose destination
  port was closed.

## Examples

Examples can be found in

* [peer_connection_e2e_smoke_test.cc][3]
* [pc_full_stack_tests.cc][4]

## Stats plotting

### Description

Stats plotting provides the ability to plot statistics collected during the
test. Right now it is used in the PeerConnection level framework and makes it
possible to see how video quality metrics changed during test execution.

### Usage

To make any metric plottable you need to:

1.  Collect metric data with [SamplesStatsCounter][5], which internally will
    store all intermediate points and the timestamps at which these points
    were added.
2.  Report the collected data with [`webrtc::test::PrintResult(...)`][6]. By
    using this method you will also specify the name of the plottable metric.

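A rough Python analogue of what [SamplesStatsCounter][5] stores (the real
class is C++; this hypothetical sketch only shows the idea of keeping every
point together with its timestamp so a time series can be plotted later):

```python
import time


class SamplesCounterSketch:
    """Keeps every added sample together with the time it was added, so
    both aggregate stats and a plottable time series can be produced."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._samples = []  # list of (timestamp, value) pairs

    def add_sample(self, value):
        # Record the value and the moment it was added.
        self._samples.append((self._clock(), value))

    def average(self):
        values = [v for _, v in self._samples]
        return sum(values) / len(values)

    def time_series(self):
        # The per-point data a plotter would consume.
        return list(self._samples)
```
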
After these steps it will be possible to export your metric for plotting.
There are several options for how you can do this:

1.  Use [`webrtc::TestMain::Create()`][7] as the `main` function
    implementation, for example use [`test/test_main.cc`][8] as the `main`
    function for your test.

    In this case your binary will have a `--plot` flag, where you can provide
    a list of metrics that you want to plot, or specify `all` to plot all
    available metrics.

    If `--plot` is specified, the binary will output metrics data into
    `stdout`. Then you need to pipe this `stdout` into the python plotter
    script [`rtc_tools/metrics_plotter.py`][9], which will plot the data.

    Examples:

    ```shell
    $ ./out/Default/test_support_unittests \
        --gtest_filter=PeerConnectionE2EQualityTestSmokeTest.Svc \
        --nologs \
        --plot=all \
        | python rtc_tools/metrics_plotter.py
    ```

    ```shell
    $ ./out/Default/test_support_unittests \
        --gtest_filter=PeerConnectionE2EQualityTestSmokeTest.Svc \
        --nologs \
        --plot=psnr,ssim \
        | python rtc_tools/metrics_plotter.py
    ```

    Example chart: 

2.  Use the API from [`test/testsupport/perf_test.h`][10] directly by invoking
    `webrtc::test::PrintPlottableResults(const std::vector<std::string>&
    desired_graphs)` to print plottable metrics to stdout. Then, as in the
    previous option, you need to pipe the result into the plotter script.

[1]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/api/test/peerconnection_quality_test_fixture.h;drc=cbe6e8a2589a925d4c91a2ac2c69201f03de9c39
[2]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/api/test/create_peerconnection_quality_test_fixture.h;drc=cbe6e8a2589a925d4c91a2ac2c69201f03de9c39
[3]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/test/pc/e2e/peer_connection_e2e_smoke_test.cc;drc=cbe6e8a2589a925d4c91a2ac2c69201f03de9c39
[4]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/video/pc_full_stack_tests.cc;drc=cbe6e8a2589a925d4c91a2ac2c69201f03de9c39
[5]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/api/numerics/samples_stats_counter.h;drc=cbe6e8a2589a925d4c91a2ac2c69201f03de9c39
[6]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/test/testsupport/perf_test.h;l=86;drc=0710b401b1e5b500b8e84946fb657656ba1b58b7
[7]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/test/test_main_lib.h;l=23;drc=bcb42f1e4be136c390986a40d9d5cb3ad0de260b
[8]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/test/test_main.cc;drc=bcb42f1e4be136c390986a40d9d5cb3ad0de260b
[9]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/rtc_tools/metrics_plotter.py;drc=8cc6695652307929edfc877cd64b75cd9ec2d615
[10]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/test/testsupport/perf_test.h;l=105;drc=0710b401b1e5b500b8e84946fb657656ba1b58b7
[11]: https://source.chromium.org/chromium/chromium/src/+/main:third_party/webrtc/api/test/peerconnection_quality_test_fixture.h;l=272;drc=484acf27231d931dbc99aedce85bc27e06486b96