Recently, we began working with a large Independent School District in Texas. The networking team had a few questions regarding our Application Performance Scoring feature and wanted to know how they could leverage it to monitor the performance students experienced while completing online testing from providers like Pearson, IXL and Zeal.
Here’s what we did:
The first key in any monitoring workflow is correct classification, so we started by creating an Application Object that correctly identifies the mix of http and https transport types used by the testing service.
To create an application that matches a combination of http and https attributes, build the Application Object as in the following example, based on the specific attributes of the testing service application.
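To make the idea concrete, here is a minimal sketch of how this kind of classification works conceptually. This is not Exinda's actual API; the `Rule`, `Flow`, and `matches` names, the hostname pattern, and the port-based http/https distinction are all illustrative assumptions.

```python
# Hypothetical sketch of application-object matching (not Exinda's API):
# an application object is a set of match rules; a flow belongs to the
# application if any rule matches its server host and port.
import fnmatch
from dataclasses import dataclass

@dataclass
class Rule:
    host_pattern: str   # e.g. "*.pearson.com" (illustrative)
    port: int           # 80 for http, 443 for https

@dataclass
class Flow:
    server_host: str
    server_port: int

def matches(app_rules, flow):
    """Return True if the observed flow matches any rule of the object."""
    return any(
        flow.server_port == rule.port
        and fnmatch.fnmatch(flow.server_host, rule.host_pattern)
        for rule in app_rules
    )

# One object covering both transport types of the testing service:
testing_app = [Rule("*.pearson.com", 80), Rule("*.pearson.com", 443)]
print(matches(testing_app, Flow("tn8.pearson.com", 443)))  # True
```

Grouping both ports under one object is what lets later reports treat http and https traffic as a single application.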
You can quickly verify traffic is matching an application object by using the Real-time Monitor.
Once we created the Application Object, our next step was to create a custom application report through Exinda’s Solution Center to start monitoring hosts, users, traffic volume by time series, and Application Performance Scores for the specific application.
The Solution Center Report quickly gives us insight into the application's bandwidth usage, who uses it, and the users' experience of the application. From the Solution Center we can also view and drill into the Application Performance Scores.
What is an Application Performance Score?
Application Performance Score (APS) is a technology provided by Exinda that monitors an application's network performance. It works by passively measuring several properties of a TCP conversation and combining them into a single overall score – the Application Performance Score.
APS can be used to replace traditional active methods of measuring network performance, such as ICMP echo (ping) or SNMP measurements of server load.
The metrics that make up the score are a combination of the following:
- Round-Trip Time (RTT)
- Network Delay
- Server Delay
- Network Jitter
- Transaction Delay
- Packets / Bytes Lost (Retransmissions)
- Connections Started
- Connections Aborted
- Connections Refused
- Connections Ignored
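As a rough illustration of how such metrics could be combined into one score, here is a minimal sketch. The weights, baseline thresholds, 0–10 scale, and decay formula are all illustrative assumptions, not Exinda's actual scoring algorithm.

```python
# Hypothetical sketch: combine passive TCP metrics into a 0-10 score.
# Baseline values and the scoring formula are illustrative assumptions.

BASELINE = {                 # "acceptable" value per metric (assumed)
    "network_delay_ms": 80.0,
    "server_delay_ms": 50.0,
    "jitter_ms": 10.0,
    "loss_pct": 1.0,
}

def metric_score(observed, acceptable):
    """Score one metric: 1.0 at or below baseline, decaying linearly
    toward 0.0 as the observed value worsens past the baseline."""
    if observed <= acceptable:
        return 1.0
    return max(0.0, 1.0 - (observed - acceptable) / (4 * acceptable))

def aps(observed):
    """Average the per-metric scores and scale the result to 0-10."""
    scores = [metric_score(observed[k], BASELINE[k]) for k in BASELINE]
    return round(10 * sum(scores) / len(scores), 1)

# A healthy measurement interval scores at the top of the scale:
print(aps({"network_delay_ms": 60, "server_delay_ms": 40,
           "jitter_ms": 5, "loss_pct": 0.2}))   # 10.0
```

The key property this sketch shares with APS is that no extra probe traffic is generated: every input is observable from the TCP conversation itself.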
APS has an auto-baselining feature that lets the administrator accept a specific time period as one in which the application was performing as expected.
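Conceptually, auto-baselining can be sketched as deriving per-metric thresholds from samples taken during the accepted known-good window. The mean-plus-two-standard-deviations rule below is an illustrative assumption, not Exinda's documented method.

```python
# Hypothetical sketch of auto-baselining: take metric samples from a
# window the administrator accepted as "performing as expected" and
# derive a per-metric baseline. The mean + 2*stddev rule is assumed.
import statistics

def build_baseline(samples):
    """samples: list of dicts mapping metric name -> observed value,
    all taken from the accepted time period."""
    keys = samples[0].keys()
    return {
        k: statistics.mean(s[k] for s in samples)
           + 2 * statistics.pstdev(s[k] for s in samples)
        for k in keys
    }

good_window = [{"network_delay_ms": 70}, {"network_delay_ms": 90},
               {"network_delay_ms": 80}]
print(build_baseline(good_window))
```

The point of accepting a window explicitly is that "normal" is defined by this network and this application, not by a vendor-wide default.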
APS can show the relative value of these scores to indicate when the user experience shifts, for example from tolerable to unsatisfied. The mix of server and WAN metrics offers great insight into finding a root cause for any period in time when the user experience was frustrating.
The networking team is now automatically notified when user application performance crosses from good to tolerable, unsatisfied, or frustrated.
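A minimal sketch of that alerting step might look like the following. The score cutoffs for each band and the notification wording are illustrative assumptions; only the band names come from the text above.

```python
# Hypothetical sketch: map an APS value to an experience band and notify
# on downward transitions. Cutoff values are illustrative assumptions.

BANDS = [(8.5, "good"), (7.0, "tolerable"),
         (5.0, "unsatisfied"), (0.0, "frustrated")]

def band(score):
    """Return the first band whose cutoff the score meets."""
    for cutoff, name in BANDS:
        if score >= cutoff:
            return name

def check(prev_score, new_score, notify=print):
    """Fire a notification when the score drops into a worse band."""
    old, new = band(prev_score), band(new_score)
    if new != old and new_score < prev_score:
        notify(f"APS dropped: {old} -> {new} ({new_score})")
    return new

check(9.2, 6.4)   # prints: APS dropped: good -> unsatisfied (6.4)
```

Because the check fires only on a band change rather than on every score fluctuation, the team is alerted to meaningful shifts in user experience instead of noise.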
This helps the team take a proactive approach to performance monitoring for these web services that are leveraged across the district.
Looking to improve the performance of all your critical learning applications? We've got you covered! Book a demo.