INTERNATIONAL JOURNAL OF LATEST TECHNOLOGY IN ENGINEERING,
MANAGEMENT & APPLIED SCIENCE (IJLTEMAS)

ISSN 2278-2540 | DOI: 10.51583/IJLTEMAS | Volume XIV, Issue VIII, August 2025

www.ijltemas.in Page 692

User Quality of Experience (QoE) in IEEE 802.11 WLAN: A
Comprehensive Review

Pawan Kumar Pant#, Mahesh Chandra Joshi$, Neeraj Tiwari*
# Department of Information Technology, Kumaun University, Nainital, Uttarakhand

$ Department of Mathematics & IT, Kumaun University, Nainital, Uttarakhand
*Department of Statistics, Soban Singh Jeena University, Almora, Uttarakhand

DOI: https://doi.org/10.51583/IJLTEMAS.2025.1408000086

Abstract— Quality of Experience (QoE) has become a central metric for evaluating user satisfaction in telecommunication
networks, yet most research has focused on LTE, GSM, and UMTS, leaving IEEE 802.11 WLANs, still the backbone of global
connectivity, comparatively unexplored from a QoE perspective. This paper presents a comprehensive review of QoE in WLANs, integrating
technical, regulatory, and business viewpoints. We examine QoE models and their interplay with IEEE 802.11 standards
(802.11n/ac/ax/be), and analyze application-level QoE for video streaming, VoIP, online gaming, live event broadcasting, AR/VR,
and web browsing. The mapping between Quality of Service (QoS) metrics and user-perceived QoE is discussed, along with
methodologies for measurement, simulation, and testbed evaluation. Special attention is given to challenges arising in dense,
moderate, and home deployments. Building on this analysis, we propose a priority-aware framework that links QoS indicators,
subjective user perception, and contextual application importance (e.g., emergency announcements, interactive gaming, or live
sports streaming). We introduce mathematical models for embedding QoE into WLAN operations and identify open challenges,
including ML-driven prediction, privacy-preserving telemetry, and QoE-aware scheduling. Our study concludes with
recommendations to guide future research and industry practices toward user-centric WLAN design.

Keywords— Quality of Experience (QoE), IEEE 802.11 WLAN, Wi‑Fi 6, Wi‑Fi 7, QoS–QoE Mapping, Machine Learning

I. Introduction

IEEE 802.11 Wireless Local Area Networks (WLANs) have become ubiquitous. The last two decades have witnessed exponential
growth in wireless connectivity, with Wi-Fi emerging as the de facto medium for broadband access in homes, enterprises, and public
spaces. According to Cisco’s Visual Networking Index, Wi-Fi continues to carry the majority of global IP traffic, surpassing both
wired and cellular technologies [1]. Traditionally, performance evaluation in WLANs has focused on Quality of Service (QoS)
metrics such as throughput, latency, jitter, and packet loss. While these parameters remain important for network operators and
engineers, they fail to fully capture the end-user’s perception. With the growth of streaming, real-time communications, and
interactive applications, Quality of Experience (QoE) has emerged as a complementary, user-centric paradigm that assesses how
services are perceived at the application level [2][3]. While earlier research has predominantly emphasized LTE, GSM, and UMTS
systems, IEEE 802.11 WLANs continue to serve as the backbone of global connectivity, supporting billions of devices in homes,
enterprises, and public spaces. With the rapid proliferation of Wi-Fi 6 (802.11ax), Wi-Fi 7 (802.11be), and beyond, understanding
QoE in WLANs is essential for both academia and industry.

A key driver of WLAN demand is the dominance of mobile devices. Smartphones and tablets from vendors such as Apple, Samsung,
OPPO, and others rely on Wi-Fi for HD video streaming, cloud services, and conferencing. Unlike desktops, these devices are
highly mobile within the coverage area, making handoff and roaming performance critical for seamless QoE [4]. In residential
environments, smart TVs, set-top boxes, and streaming dongles have become the main platforms for video delivery from services
such as Netflix and YouTube, requiring stable throughput and minimal buffering [5]. The shift to 4K/8K Ultra HD further stresses
consumer WLAN capacity. Gaming is another latency-sensitive category: online multiplayer titles on smartphones, PCs, and
consoles (PlayStation, Xbox) demand round-trip times below 50 ms, while cloud gaming platforms (Xbox Cloud Gaming, NVIDIA
GeForce Now) make WLAN performance even more critical [6]. Unlike video streaming, where buffering can mask impairments,
gaming QoE is tightly bound to jitter, packet loss, and latency spikes. Emerging AR/VR applications impose even stricter
requirements, with immersive headsets requiring <20 ms latency and sustained high data rates. Although Wi-Fi 6/6E and Wi-Fi 7
promise improvements through OFDMA, MU-MIMO, Multi-Link Operation (MLO), and wider channels, AR/VR traffic remains
highly sensitive to even short disruptions [7]. With features such as OFDMA, MU-MIMO, Target Wake Time (TWT), and MLO,
Wi-Fi 6 (802.11ax) and Wi-Fi 7 (802.11be) are expected to significantly impact QoE [8][9]. Given the centrality of WLANs for
mobile connectivity, streaming, gaming, AR/VR, and web browsing, a comprehensive review of QoE in IEEE 802.11 WLANs is
both timely and necessary.

This paper provides a comprehensive review of QoE in IEEE 802.11 WLANs, combining technical, regulatory, and business
perspectives. We survey existing QoE models, examine their application to Wi-Fi standards, and analyze methodologies for
evaluating user experience through measurements, simulations, and testbeds. Building on this foundation, we propose an integrated


framework that links QoS indicators, application-level priorities, and business policies to user-centric QoE outcomes. Finally, we
identify open challenges and outline future research directions to enable WLANs that are truly designed around user experience.

II. Methodological Framework

The study of Quality of Experience (QoE) in WLANs requires a multidisciplinary approach that integrates technical, regulatory,
and business perspectives. Unlike traditional Quality of Service (QoS) analyses, QoE research must capture both system-level
indicators and user-centric perceptions [10]. To achieve this, we adopt a design-science research methodology consisting of three
phases: problem definition, solution design, and evaluation [11].

A. Technical Dimension

At the technical level, QoE analysis involves mapping IEEE 802.11 MAC/PHY mechanisms to application-layer performance.
Recent standards such as Wi-Fi 6 (802.11ax) and Wi-Fi 7 (802.11be) introduce key innovations—OFDMA, MU-MIMO, Target
Wake Time (TWT), BSS coloring, and Multi-Link Operation (MLO)—that directly influence user-perceived quality [8][9]. By
evaluating throughput stability, latency distributions, packet loss, and jitter across different applications (video streaming, VoIP,
gaming, AR/VR), we establish a systematic link between network-layer metrics and subjective user satisfaction [12].

B. Regulatory Dimension

QoE in WLANs also operates within the constraints of spectrum regulation and net neutrality policies. As Wi-Fi uses unlicensed
spectrum (2.4 GHz, 5 GHz, and 6 GHz), coexistence with other technologies such as Bluetooth, Zigbee, and 5G unlicensed bands
can degrade user experience [13]. Regulatory frameworks, particularly those governing spectrum sharing and Quality of Experience
reporting, play a critical role in ensuring fair access and minimizing interference. Moreover, the Wi-Fi Alliance certification
programs enforce interoperability and compliance, indirectly shaping QoE outcomes [14].

C. Business and Service Dimension

From a business perspective, QoE is increasingly recognized as a strategic differentiator for service providers. Internet Service
Providers (ISPs) and enterprise network operators are moving toward Customer Experience Management (CEM) platforms, which
incorporate QoE metrics into network analytics and decision-making [15]. QoE-aware SLAs (Service Level Agreements) are being
explored to guarantee not only throughput but also end-user satisfaction in applications such as cloud gaming and enterprise
collaboration. Business adoption of QoE further motivates the need for standardized measurement frameworks and benchmarking
tools.

In summary, the methodological framework for WLAN QoE research requires the integration of technical models, regulatory
considerations, and business strategies. Only through such a holistic approach can WLAN systems be optimized for true user-
centric performance.

III. Mathematical Modelling & Discussions

QoE in WLANs can be expressed as a multi-dimensional construct combining technical network factors, regulatory compliance,
and business perception. In this section, we first present the baseline QoE model, and then extend it to incorporate application-level
priorities and integrated user perception frameworks (QoE-IFW). In this work, we define QoE-IFW (Integrated Framework for
WLANs) as a three-dimensional model combining technical, regulatory, and business factors. This allows us to extend traditional
QoE modelling (focused solely on throughput or delay) into a holistic framework aligned with real-world WLAN deployments.

At the most general level, QoE can be decomposed into three additive terms:

QoE-IFW = T + R + B    (1)

where:

T: Technical factors (throughput, delay, jitter, packet loss).

R: Regulatory factors (fair access, net neutrality, compliance).

B: Business-related factors (pricing, perceived value, brand image).

Equation (1) provides a conceptual baseline, where each term may be modelled separately and later integrated.

Technical QoE is often modelled as a utility maximization problem, where the network distributes capacity C among users i ∈ U:

T = max_{x_i} Σ_{i∈U} u_i(x_i)   s.t.   Σ_{i∈U} x_i ≤ C,   x_i ≥ 0    (2)

where:

xi: resource allocated to user i.

ui(xi): utility (QoE) function of user i.


C: total channel capacity.

Equation (2) provides a fair allocation, but it is application-agnostic, treating all flows equally regardless of their sensitivity to delay
or jitter.
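As a concrete illustration, the allocation problem (2) has a closed-form solution when the utilities are logarithmic (the proportional-fairness case). The sketch below uses that log-utility assumption, which is illustrative rather than mandated by the model:

```python
import math

def allocate_capacity(num_users: int, capacity: float) -> list[float]:
    """Solve T = max Σ_i u_i(x_i) s.t. Σ_i x_i <= C for u_i = log.

    With identical concave log utilities, the KKT conditions require
    equal marginal utilities (1/x_i = λ for every user i), so the
    optimum is simply an equal split of the channel capacity C.
    """
    return [capacity / num_users] * num_users

alloc = allocate_capacity(4, 100.0)           # four stations, C = 100 Mbps
aggregate_qoe = sum(math.log(x) for x in alloc)
```

Note that the resulting allocation is application-agnostic, which is precisely the limitation the priority layer below addresses.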

A. Application-Priority Layer

Let A be the set of application classes (e.g., {voice/RTC, live video, gaming, VOD, web, background}).
Assign each class a priority weight ω_a ∈ [0,1] (policy-driven; normalized so that Σ_a ω_a = 1). Example raw weights, prior to normalization:

ω_a = 1.00 for emergency/critical live announcements,

0.85 for live sports/news, 0.80 for interactive gaming/voice,

0.60 for VOD streaming, 0.35 for web, 0.05 for background updates.

Define the per-app QoE score Qₐ as a MOS-like utility derived from app-specific KPIs:

voice/RTC → E-model/POLQA (delay, jitter, loss),

live/VOD video → P.1203/VMAF (startup, rebuffer, bitrate stability),

gaming → utility of (low p95/p99 latency, low jitter, loss),

web → Core Web Vitals sensitivity to latency/jitter.

We aggregate to a priority-aware technical QoE:

T* = Σ_{a∈A} ω_a · Q_a    (2a)

where:

ω_a: priority weight of application a.

Q_a: QoE contribution of application a.

This replaces (or augments) the earlier T term. If the fairness-based planning/interference view is to be retained, the two can be combined:

T** = η · T + (1 − η) · Σ_{a∈A} ω_a · Q_a    (2b)

where η is a tunable parameter:

η=1: pure fairness (baseline).

η=0: strict priority enforcement.

Intermediate values allow trade-offs between fairness and priority.

B. Scheduling with Airtime Constraint

Let x_{i,a} be the airtime (or RU/time share) given to user i running application a. Let u_{i,a}(x) be the user-level utility mapped to QoE (concave
in x for fairness).

Maximize priority-weighted QoE under airtime budget:

max_{x} Σ_i Σ_{a∈A} ω_a · u_{i,a}(x_{i,a})   s.t.   Σ_i Σ_a x_{i,a} ≤ 1,   x_{i,a} ≥ 0    (2c)

To ensure strict protection of high-priority classes (e.g., live video, AR/VR), use a lexicographic or mixed objective:

max F = λ · min_{a∈A} Q_a + (1 − λ) · Σ_{a∈A} ω_a · Q_a,   λ ∈ [0,1]    (2d)

First term (min_a Q_a): guarantees a minimum QoE for critical services.

Second term: optimizes global weighted QoE.

λ: controls the trade-off between protection (λ→1) and efficiency (λ→0).

This protects live/critical streams first, then optimizes the global weighted QoE.
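Under a weighted logarithmic utility, the airtime problem (2c) again admits a closed form, and the mixed objective (2d) is a one-line computation. A minimal sketch, assuming log utilities and illustrative class weights of our own choosing:

```python
def weighted_airtime(weights: dict) -> dict:
    """Closed-form optimum of (2c) for u_{i,a} = log: maximizing
    Σ ω_a·log(x_a) under Σ x_a <= 1 yields x_a = ω_a / Σ_b ω_b."""
    total = sum(weights.values())
    return {a: w / total for a, w in weights.items()}

def mixed_objective(weights: dict, scores: dict, lam: float) -> float:
    """F = λ·min_a Q_a + (1 − λ)·Σ_a ω_a·Q_a  (Eq. 2d)."""
    protection = min(scores.values())               # worst-class guarantee
    efficiency = sum(weights[a] * scores[a] for a in weights)
    return lam * protection + (1 - lam) * efficiency

share = weighted_airtime({"live": 0.85, "gaming": 0.80, "web": 0.35})
f = mixed_objective({"live": 0.5, "web": 0.5}, {"live": 4.0, "web": 3.0}, lam=0.6)
```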

C. Event-aware priority boosts

To capture time-bounded events (e.g., a live political announcement or sports final), we introduce a context factor c_a(t) ∈ [0, Δ] for
each application class. The effective weight becomes:


ω̃_a(t) = ω_a · (1 + c_a(t)) / Σ_{b∈A} ω_b · (1 + c_b(t))    (2e)

Here, the denominator normalizes the boosted weights so that Σ_{a∈A} ω̃_a(t) = 1. During the event window, the live application class
receives a temporary weight increase, after which weights revert to their baseline values.
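The event-aware re-weighting of (2e) amounts to a multiplicative boost followed by renormalization. A small sketch with illustrative baseline weights:

```python
def boosted_weights(base: dict, boosts: dict) -> dict:
    """Event-aware re-weighting per Eq. (2e):
    ω̃_a(t) = ω_a·(1 + c_a(t)) / Σ_b ω_b·(1 + c_b(t))."""
    raw = {a: w * (1.0 + boosts.get(a, 0.0)) for a, w in base.items()}
    norm = sum(raw.values())
    return {a: v / norm for a, v in raw.items()}

base = {"live": 0.4, "gaming": 0.4, "web": 0.2}
during_event = boosted_weights(base, {"live": 0.5})   # live gets c_live(t) = 0.5
after_event = boosted_weights(base, {})               # reverts to the baseline
```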

D. Mapping to Wi-Fi QoS knobs (WMM/DSCP)

The theoretical priority weights ωa introduced in (2a)–(2e) must be mapped to practical QoS mechanisms available in WLAN
deployments. In IEEE 802.11, the Wi-Fi Multimedia (WMM) framework defines four access categories (ACs) that correspond to
differentiated MAC-layer handling. Similarly, in IP networks, the Differentiated Services (DiffServ) architecture provides traffic
classification through DSCP markings.

A practical mapping of application classes to QoS knobs is shown below:

Voice / Real-time communications (RTC): mapped to AC_VO with Expedited Forwarding (EF, DSCP 46).

Live video, cloud gaming, and interactive media: mapped to AC_VI, often using AF4x or CS5 DSCP values depending on operator
policy.

Web browsing and video-on-demand (VOD): mapped to AC_BE (best effort).

Background traffic (updates, downloads): mapped to AC_BK.

To ensure that the airtime allocations reflect the intended weights ωa without starving lower-priority flows, WLANs can incorporate
admission control and per-queue management (e.g., AQL, fq_codel, CAKE). These mechanisms enforce latency bounds and
fairness within each access category.

This mapping illustrates how abstract priority weights ωa can be grounded in standards-based QoS mechanisms, thereby enabling
the proposed QoE framework to be implemented in commodity WLAN devices. (References: Wi-Fi Alliance WMM Specification
[19]; IETF DiffServ RFC 4594 [20]; ITU-T E-model G.107 [21].)
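The class-to-AC/DSCP mapping described above can be captured as a lookup table. The DSCP code points follow RFC 4594 guidance (EF = 46, CS5 = 40, AF41 = 34, CS1 = 8), but the class names and the exact assignments are illustrative; actual policy varies by operator:

```python
# Illustrative mapping of application classes to WMM access categories
# and DSCP code points. Class names are examples, not standardized.
QOS_MAP = {
    "voice_rtc":  {"ac": "AC_VO", "dscp": 46},   # Expedited Forwarding
    "live_video": {"ac": "AC_VI", "dscp": 40},   # CS5
    "gaming":     {"ac": "AC_VI", "dscp": 34},   # AF41
    "vod":        {"ac": "AC_BE", "dscp": 0},    # best effort
    "web":        {"ac": "AC_BE", "dscp": 0},
    "background": {"ac": "AC_BK", "dscp": 8},    # CS1 / low priority
}

def classify(app_class: str) -> dict:
    """Fall back to best effort for unknown application classes."""
    return QOS_MAP.get(app_class, {"ac": "AC_BE", "dscp": 0})
```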

E. Integrated Model

We now extend the baseline framework (1) by replacing the technical term T with the priority-aware, event-boosted technical model
from (2a)–(2e). Integrating the above refinements, the complete QoE-IFW model can be written as:

QoE-IFW = Σ_{a∈A} ω̃_a(t) · Q_a + R + B    (3)

where:

ω̃_a(t) are time-varying, event-aware weights from (2e), Q_a are application-specific QoE scores, R captures regulatory dimensions
(fair access, neutrality, compliance), and B reflects business levers (pricing, premium tiers, managed Wi-Fi services).

Equation (3) therefore represents the QoE-IFW integrated framework, where the technical term is given by the event-aware priority
model of (2e).
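Putting the pieces together, Eq. (3) can be evaluated as below. Treating R and B as pre-computed scalar scores is a simplifying assumption; in practice each would come from its own sub-model:

```python
def qoe_ifw(base: dict, boosts: dict, scores: dict, R: float, B: float) -> float:
    """QoE-IFW = Σ_a ω̃_a(t)·Q_a + R + B (Eq. 3), with ω̃ from Eq. (2e)."""
    raw = {a: w * (1.0 + boosts.get(a, 0.0)) for a, w in base.items()}
    norm = sum(raw.values())                      # normalize boosted weights
    technical = sum((raw[a] / norm) * scores[a] for a in raw)
    return technical + R + B

score = qoe_ifw({"live": 0.6, "web": 0.4}, {"live": 0.5},
                {"live": 4.5, "web": 3.0}, R=0.2, B=0.1)
```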

F. Experimental Presets

The following presets illustrate how ωa and event factors ca(t) can be configured in practice for experimental validation:

Live announcement / public address: ωlive = 1.0, ��live(��) = +0.5, with strict minimum QoE guarantees
enforced via (2d).

Esports / gaming night:

ωgame = 0.85; configure low target queue latency (AQL) on AC_VI and allocate larger RU grants for small packets.

Family movie night (4K streaming):

ωvideo=0.7; target steady throughput for ITU-T P.1203 video quality stability while capping best-effort background traffic.

Work calls / enterprise collaboration:

ωvoice=0.8; enforce AC_VO queue shaping and configure roaming triggers for session continuity.

These presets demonstrate how abstract weights ωa can be mapped to practical WLAN configurations, bridging the gap between
mathematical modelling and deployable policies.
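The presets above could be expressed as declarative policy entries. In the sketch below, the field names and the minimum-QoE thresholds are hypothetical, chosen only to show the shape such a configuration might take:

```python
# Hypothetical preset definitions mirroring the scenarios above; field
# names and min_qoe thresholds are illustrative, not standardized.
PRESETS = {
    "live_announcement": {"omega": 1.00, "boost": 0.5, "ac": "AC_VI", "min_qoe": 4.0},
    "esports_night":     {"omega": 0.85, "boost": 0.0, "ac": "AC_VI", "min_qoe": 3.8},
    "movie_night":       {"omega": 0.70, "boost": 0.0, "ac": "AC_VI", "min_qoe": 3.5},
    "work_calls":        {"omega": 0.80, "boost": 0.0, "ac": "AC_VO", "min_qoe": 4.0},
}

def active_boost(preset_name: str) -> float:
    """Look up the event boost c_a(t) configured for a preset."""
    return PRESETS[preset_name]["boost"]
```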

Application-Level QoE in WLANs


QoE is highly application-dependent, since each service has unique sensitivity to throughput, delay, jitter, and loss. In WLANs,
common categories include video streaming, real-time voice/video, web browsing, online/cloud gaming, and emerging AR/VR
workloads.

A. Video Streaming

Video streaming services such as Netflix, YouTube, and Disney+ dominate WLAN traffic. QoE depends primarily on
startup delay, rebuffering events, average bitrate, and the stability of adaptive bitrate algorithms [22]. The ITU-T P.1203 model and
industry metrics such as VMAF provide objective approximations of user experience [5][22]. WLAN impairments such as
contention, interference, and hidden-node collisions directly affect buffer dynamics, leading to degraded QoE.

B. VoIP and Real-Time Video Conferencing

Applications like Zoom, Microsoft Teams, FaceTime, and WhatsApp are highly sensitive to one-way delay, jitter, and packet loss.
QoE is typically measured using the E-model (ITU-T G.107) and POLQA (P.863), which yield a Mean Opinion Score (MOS)
[2][21]. In WLANs, roaming delays and airtime contention often cause transient degradations. Features such as Fast BSS Transition
(802.11r) and Target Wake Time (TWT) can mitigate disruptions for real-time traffic [4].

C. Web Browsing

QoE for web browsing is influenced by page load time, responsiveness, and layout stability. Google’s Core Web Vitals (Largest
Contentful Paint – LCP, Interaction to Next Paint – INP, and Cumulative Layout Shift – CLS) are widely adopted QoE indicators
[23]. Wi-Fi queuing delays and uplink bufferbloat significantly impact these metrics. Active queue management (e.g., AQL,
fq_codel, CAKE) has been shown to improve browsing QoE [17].

D. Online and Cloud Gaming

Multiplayer and cloud gaming (e.g., Xbox Cloud Gaming, NVIDIA GeForce Now, PlayStation Network) are extremely latency
and jitter sensitive. Unlike video streaming, gaming cannot rely on buffering, so p95/p99 latency distributions are critical to
perceived experience [6][24]. WLANs often exacerbate variability due to contention and interference. To optimize gaming QoE,
application priority mapping (e.g., WMM AC_VI) and fair airtime scheduling are required [19][20].
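Since gaming QoE hinges on tail latency rather than the mean, p95/p99 statistics can be extracted from RTT samples as follows. This is a generic sketch, not tied to any particular measurement tool:

```python
import statistics

def tail_latency(rtt_ms: list[float]) -> dict:
    """Return p95/p99 round-trip-time percentiles from RTT samples.

    statistics.quantiles(n=100) returns the 99 cut points p1..p99,
    so index 94 is the 95th percentile and index 98 the 99th.
    """
    qs = statistics.quantiles(rtt_ms, n=100)
    return {"p95": qs[94], "p99": qs[98]}

samples = [20.0] * 95 + [80.0] * 5     # mostly 20 ms, with a 5% latency tail
tails = tail_latency(samples)
```

A flow with a low mean RTT can still deliver poor gaming QoE if its p99 value is high, which is exactly what this summary exposes.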

E. Live Event Streaming and Emergency Broadcasts

Live applications such as sports streaming, political speeches, and emergency announcements introduce a contextual priority
dimension. Users tolerate little delay, and synchronization across devices becomes important. Priority-aware QoE models (as
described in Section III) can assign higher weights to such applications to ensure uninterrupted delivery during peak demand.

F. Augmented Reality (AR) and Virtual Reality (VR)

AR/VR workloads demand very low latency (<20 ms) and high sustained throughput to avoid motion sickness and frame drops
[7][25]. Wi-Fi 6 and 7 features such as Multi-Link Operation (MLO), OFDMA, and wider channels are expected to support AR/VR
by providing deterministic latency and reduced jitter. However, these applications remain one of the most challenging QoE
scenarios in WLANs, requiring cross-layer optimization.

Measurement and Evaluation Methodologies

QoE assessment in WLANs requires both objective and subjective evaluation methods. Unlike purely technical QoS monitoring,
QoE methodologies attempt to capture the user’s perception of service quality.

A. Objective Metrics

Objective approaches map network-level KPIs (throughput, delay, jitter, loss) to application-level QoE models. Widely used models for
WLANs include: video QoE — ITU-T P.1203, Netflix VMAF, and the YouTube QoE model [22][26]; voice QoE — ITU-T G.107
(E-model) and POLQA P.863 [21]; web QoE — Core Web Vitals (LCP, INP, CLS) [23]; gaming QoE — latency/jitter distribution
models and p95/p99 delay analysis [24]; AR/VR QoE — frame loss, motion-to-photon latency (<20 ms), and head-tracking stability
[25]. In practice, WLAN-specific impairments (interference, roaming delays, hidden terminals) must be
incorporated into these models.
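For voice, the G.107 E-model reduces, in its commonly used simplified form, to a short computation from one-way delay and packet loss. The constants below assume a G.711 codec without loss concealment (Ie = 0, Bpl ≈ 4.3), a simplifying assumption rather than a full G.107 implementation:

```python
def emodel_mos(delay_ms: float, loss_pct: float) -> float:
    """Simplified E-model: R = 93.2 − Id − Ie_eff, then map R to MOS.

    Id uses the common piecewise delay approximation; Ie_eff follows
    the G.107 packet-loss formula with Ie = 0 and Bpl = 4.3 (G.711).
    """
    delay_impairment = 0.024 * delay_ms + (
        0.11 * (delay_ms - 177.3) if delay_ms > 177.3 else 0.0)
    loss_impairment = 95.0 * loss_pct / (loss_pct + 4.3)
    r = 93.2 - delay_impairment - loss_impairment
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

mos_good = emodel_mos(20.0, 0.0)    # low delay, no loss
mos_bad = emodel_mos(300.0, 5.0)    # WLAN under roaming delay and loss
```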

B. Subjective Studies

Subjective methods involve user surveys, MOS experiments, and controlled trials. Examples include lab-based and crowdsourced
experiments using standardized scales (1–5 MOS, Absolute Category Rating – ACR) [27]; field studies capturing real user
impressions during roaming, streaming, or gaming sessions [28]; and QoE panels that benchmark devices (smartphones, AR/VR
headsets, smart TVs) under standardized WLAN scenarios.


C. Simulation Frameworks

Network simulators such as ns-3, OMNeT++, and QualNet are commonly used to model WLAN performance. The ns-3 802.11ax
module supports OFDMA and MU-MIMO, enabling QoE-focused simulation of dense networks [29]. Cloud gaming and AR/VR
scenarios can be reproduced with packet-trace-driven traffic generators.

D. Testbed Evaluations

QoE must ultimately be validated on real testbeds. Common practices include using Wireshark, iPerf3, and tcpdump for KPI
capture, employing 802.11ax/11be access points in controlled environments, and integrating machine-learning telemetry to predict
QoE in real time [30].

E. Cross-Layer Monitoring

Recent research emphasizes cross-layer QoE monitoring, where PHY/MAC KPIs (RSSI, MCS, airtime share) are correlated with
app-level QoE. Machine learning models (e.g., random forests, deep neural networks) have been applied to map raw WLAN
telemetry into user satisfaction indicators [31].
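As a toy illustration of such a cross-layer mapping, a random forest can be fitted on synthetic telemetry. The feature set, the synthetic ground-truth function, and all numbers below are our own assumptions, not from a published dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
rssi = rng.uniform(-90.0, -30.0, n)      # received signal strength, dBm
airtime = rng.uniform(0.05, 1.0, n)      # fraction of airtime available
retries = rng.uniform(0.0, 0.4, n)       # MAC-layer retry ratio

# Synthetic ground truth: stronger signal and more airtime raise the
# MOS-like score; retries lower it. Clipped to the usual [1, 4.5] range.
mos = np.clip(1.0 + 3.5 * airtime * (rssi + 90.0) / 60.0
              - 4.0 * retries + rng.normal(0.0, 0.1, n), 1.0, 4.5)

X = np.column_stack([rssi, airtime, retries])
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, mos)
predicted = model.predict([[-55.0, 0.8, 0.05]])[0]   # one telemetry sample
```

Real deployments would replace the synthetic targets with MOS labels from subjective studies or objective models such as P.1203.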

Challenges in Dense, Moderate, and Home WLAN Deployments

WLAN deployments vary significantly across environments, ranging from highly dense public spaces (airports, stadiums,
universities) to enterprise and campus networks, and finally to residential home scenarios. Each setting introduces unique challenges
that directly impact the Quality of Experience (QoE).

A. Dense Deployments

Dense environments involve hundreds or even thousands of concurrent users competing for limited airtime.

Contention and Interference: Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) suffers in ultra-dense networks,
leading to higher collision probability and backoff delays.

Hidden/Exposed Nodes: QoE for delay-sensitive services such as video conferencing and AR/VR applications deteriorates
significantly due to overlapping Basic Service Sets (BSSs).

QoE Impacts: Increased latency and jitter impair gaming experience, while streaming services face frequent rebuffering events.

Mitigation: Features introduced in IEEE 802.11ax/802.11be such as BSS Coloring, OFDMA, and MU-MIMO partially alleviate
these issues by enabling parallel transmissions and improved spatial reuse [32].

B. Moderate Deployments (Enterprise, Campus, Small Business)

Moderate deployments are typical of offices, universities, and enterprise environments, where dozens of devices connect to each
access point.

Mobility and Roaming: Frequent handovers, for example during VoIP calls, can cause transient service interruptions.

Application Diversity: These environments support diverse workloads including video conferencing, cloud applications, and
browsing, each with different QoE sensitivities.

QoE Impacts: Users often report degraded satisfaction due to inconsistent roaming policies, load balancing issues, or interference
from neighboring APs, even when congestion is not severe.

Mitigation: The adoption of IEEE 802.11k/v/r amendments and AI-driven roaming prediction mechanisms enhances session
continuity and fairness [33].

C. Home Deployments

Home WLANs represent the most widespread use case, supporting smartphones, TVs, IoT devices, and gaming consoles.

Challenges: Spectrum limitations, poor AP placement, interference from household devices (e.g., microwaves, Bluetooth), and lack
of QoS configuration degrade performance.

QoE Impacts: Streaming on Smart TVs often conflicts with latency-sensitive applications like online gaming or AR/VR, while IoT
devices may create unexpected bursts of traffic.

Mitigation: Wi-Fi 6E/7 consumer routers increasingly integrate mesh networking, band steering, and application prioritization.
QoE-aware scheduling and intuitive monitoring tools further enhance the end-user experience [34].

D. Cross-Environment Challenges

Heterogeneity: QoE must account for diverse devices such as smartphones, smart TVs, AR/VR headsets, and IoT sensors.


Fairness vs. Priority: Prioritizing certain applications (as per Section III) may conflict with fairness principles and regulatory
constraints.

Real-World Complexity: Testbed evaluations reveal that theoretical gains often underperform in practice due to multipath fading,
dynamic interference, and device heterogeneity [35].

Future Directions

Despite notable advances in integrating Quality of Experience (QoE) into IEEE 802.11 WLANs, several open research challenges
remain. These gaps highlight the need for intelligent, adaptive, and user-centric WLAN design.

A. Machine Learning–Driven QoE Prediction

The growing availability of WLAN telemetry (RSSI, MCS, airtime, queue length) enables machine learning (ML) models to predict
QoE in real time. Key challenges include developing lightweight models deployable on access points, incorporating explainable AI
(XAI) for transparency, and extending prediction frameworks across diverse applications such as streaming, gaming, and AR/VR
[36].

B. QoE-Aware Resource Allocation

Future WLANs must balance fairness with application priorities. Building on priority-weighted frameworks, research should focus
on adaptive scheduling, dynamic airtime allocation, and integration of multi-AP coordination and multi-link operation (MLO).
Ensuring deterministic QoE for ultra-reliable low-latency communications (URLLC) workloads, including AR/VR, remains an
open challenge [37].

C. Privacy-Preserving QoE Telemetry

QoE data collection introduces privacy and compliance concerns. Emerging approaches such as federated learning and edge-based
analytics can enable collaborative QoE modelling without sharing raw user data. Designing privacy-preserving protocols aligned
with regulations (e.g., GDPR) and scalable crowdsourcing techniques are critical future directions [38].

D. Integration with 5G/6G and Beyond

As WLANs increasingly interoperate with cellular networks, seamless QoE-aware mobility will be required. Open problems include
unified QoE frameworks spanning heterogeneous access technologies, intelligent offloading strategies across Wi-Fi and 5G/6G,
and optimized service delivery over multi-access edge computing (MEC) infrastructures [39].

E. QoE for Emerging Applications

Emerging workloads demand new QoE models tailored to their unique requirements. Immersive AR/VR and metaverse applications
require ultra-low-latency transport with synchronized media delivery. Remote surgery and e-health rely on deterministic
performance, while industrial WLANs supporting IoT and robotics demand reliability- and safety-driven QoE metrics [40].

IV. Conclusion

This paper reviews Quality of Experience (QoE) in IEEE 802.11 WLANs from technical, regulatory, and business perspectives.
We map network-level QoS indicators to user-level QoE outcomes, analyze application requirements across streaming, VoIP,
gaming, live events, and AR/VR, and propose models that integrate application priorities into WLAN operations. Our survey shows
that QoE is environment-dependent: contention dominates dense networks, roaming and heterogeneity challenge enterprise
WLANs, while device diversity and interference constrain home setups. Fairness versus application priority emerges as a central
design trade-off. We emphasize robust measurement combining objective models (e.g., ITU-T P.1203, MOS, Core Web Vitals)
with subjective methods such as user studies and crowdsourcing, supported by simulations and testbeds for reproducibility. Looking
ahead, WLANs must adopt ML-driven QoE prediction, QoE-aware scheduling, privacy-preserving telemetry, and 5G/6G
integration. Emerging applications like AR/VR, cloud gaming, remote healthcare, and industrial IoT demand cross-layer, adaptive,
and user-centric solutions.

In conclusion, QoE should evolve from a metric to a design principle, embedded into standards, implementations, and business
models to ensure WLANs remain the backbone of global connectivity.

References

1. ITU-T Rec. P.800, “Methods for subjective determination of transmission quality,” 1996.
2. ITU-T Rec. G.107, “The E-model: a computational model for use in transmission planning,” 2015.
3. Cisco, “Cisco Annual Internet Report (2018–2023),” White Paper, 2020.
4. IEEE Std 802.11r-2008, “Amendment 2: Fast Basic Service Set (BSS) Transition,” IEEE, 2008.
5. Netflix Tech Blog, “VMAF: The Video Multi-Method Assessment Fusion,” 2016.
6. Laghari, A. A., Laghari, K., Memon, K. A., Soomro, M. B., Laghari, R. A., & Kumar, V. (2020). Quality of experience

(QoE) assessment of games on workstations and mobile. Entertainment Computing, 34, 100362.
https://doi.org/10.1016/j.entcom.2020.100362.


7. C. Keighrey, R. Flynn, S. Murray and N. Murray, "A QoE evaluation of immersive augmented and virtual reality speech
& language assessment applications," 2017 Ninth International Conference on Quality of Multimedia Experience
(QoMEX), Erfurt, Germany, 2017, pp. 1-6, doi: 10.1109/QoMEX.2017.7965656.

8. ITU-T Rec. P.1203, “Parametric Bitstream-Based Quality Assessment of Video Streaming,” 2016.
9. ETSI TS 126 114, “IP Multimedia Subsystem (IMS); Multimedia Telephony; Media handling and interaction,” 2018.
10. IEEE Std 802.11ax-2021, “IEEE Standard for Information Technology—Telecommunications and information exchange

between systems Local and metropolitan area networks—Specific requirements—Part 11: Wireless LAN Medium Access
Control (MAC) and Physical Layer (PHY) Specifications,” IEEE, 2021.

11. IEEE Std 802.11be/Draft 5.0, “EHT (Extremely High Throughput),” IEEE, 2024.
12. ITU-T Rec. P.863, “Perceptual Objective Listening Quality Assessment (POLQA),” 2018.
13. Skorin-Kapov, Lea, Martín Varela, Tobias Hoßfeld, and Kuan-Ta Chen. 2018. “A Survey of Emerging Concepts and

Challenges for QoE Management of Multimedia Services.” ACM Transactions on Multimedia Computing,
Communications, and Applications 14 (2s): Article 29, 1–29. https://doi.org/10.1145/3176648.

14. Agarwal, Trapty, Madhur Grover, Madhusmita Sahu, Varun Ojha, Sunil MP, and M. Saravanan. 2025. “QoE-Based
Resource Allocation in Wireless Networks: An Analysis.” In 2025 International Conference on Automation and
Computation (AUTOCOM), 690–695. IEEE. https://doi.org/10.1109/AUTOCOM64127.2025.10957456.

15. ITU-T Rec. P.911, “Subjective audiovisual quality assessment methods for multimedia applications,” 2021.
16. Sandvine, “Global Internet Phenomena Report,” 2022.
17. K. Nichols and V. Jacobson, “Controlling Queue Delay,” Communications of the ACM, vol. 55, no. 7, pp. 42–50, 2012, doi: 10.1145/2209249.2209264.
18. H. Susanto, B. Liu, and B.-G. Kim, “QoE-Based User-Regulated Congestion Control,” 2018 IEEE 38th International Conference on Distributed Computing Systems (ICDCS), Vienna, Austria, 2018, pp. 1612–1617, doi: 10.1109/ICDCS.2018.00181.

19. Wi-Fi Alliance, “WMM® Specification: Wi-Fi Multimedia QoS,” Wi-Fi Alliance, 2016 (and subsequent updates).
20. J. Babiarz, K. Chan, and F. Baker, “RFC 4594: Configuration Guidelines for DiffServ Service Classes,” IETF, 2006.
21. ITU-T Rec. G.107, “The E-model: a computational model for use in transmission planning,” 2015.
22. ITU-T Rec. P.1203, “Parametric Bitstream-Based Quality Assessment of Video Streaming,” 2016.
23. Google Web.dev, “Core Web Vitals,” 2023.
24. P. Juluri, V. Tamarapalli, and D. Medhi, “Measurement of Quality of Experience of Video-on-Demand Services: A Survey,” IEEE Communications Surveys & Tutorials, vol. 18, no. 1, pp. 401–418, 2016, doi: 10.1109/COMST.2015.2401424.

25. D. Concannon, R. Flynn, and N. Murray, “A Quality of Experience Evaluation System and Research Challenges for Networked Virtual Reality-Based Teleoperation Applications,” in Proceedings of the 11th ACM Workshop on Immersive Mixed and Virtual Environment Systems (MMVE ’19), New York: ACM, 2019, pp. 10–12, doi: 10.1145/3304113.3326119.

26. Netflix, “VMAF: The Video Multi-Method Assessment Fusion,” Netflix Tech Blog, 2016.
27. ITU-T Rec. P.910, “Subjective video quality assessment methods,” 2021.
28. S. Egger-Lampl, J. Redi, T. Hoßfeld, M. Hirth, S. Möller, B. Naderi, C. Keimel, and D. Saupe, “Crowdsourcing Quality of Experience Experiments,” in Evaluation in the Crowd: Crowdsourcing and Human-Centered Experiments, D. Archambault, H. Purchase, and T. Hoßfeld, Eds. Cham: Springer, 2017, pp. 154–190, doi: 10.1007/978-3-319-66435-4_7.

29. T. Henderson et al., “The ns-3 Network Simulator: Modeling 802.11ax,” Workshop on ns-3, ACM, 2020.
30. P. Casas et al., “Predicting QoE in cellular networks using machine learning and in-smartphone measurements,” 2017 Ninth International Conference on Quality of Multimedia Experience (QoMEX), Erfurt, Germany, 2017, pp. 1–6, doi: 10.1109/QoMEX.2017.7965687.

31. L. Skorin-Kapov et al., “Cross-Layer QoE Management in Wireless Networks,” Springer Handbook of QoE, 2020.
32. E. Khorov, A. Kiryanov, A. Lyakhov, and G. Bianchi, “A Tutorial on IEEE 802.11ax High Efficiency WLANs,” IEEE Communications Surveys & Tutorials, vol. 21, no. 1, pp. 197–216, First Quarter 2019, doi: 10.1109/COMST.2018.2871099.
33. A. Zubow et al., “Enabling Seamless Handover in Enterprise WLANs: The Role of IEEE 802.11k/v/r,” Computer Networks, vol. 180, 2020.
34. Wi-Fi Alliance, “Wi-Fi EasyMesh™: Multi-AP Wi-Fi for the Home,” White Paper, 2022.
35. E. Mozaffariahrar, F. Theoleyre, and M. Menth, “A Survey of Wi-Fi 6: Technologies, Advances, and Challenges,” Future Internet, vol. 14, no. 10, p. 293, 2022, doi: 10.3390/fi14100293.
36. M. Kulin, T. Kazaz, E. De Poorter, and I. Moerman, “A Survey on Machine Learning-Based Performance Improvement of Wireless Networks: PHY, MAC and Network Layer,” Electronics, vol. 10, no. 3, p. 318, 2021, doi: 10.3390/electronics10030318.

37. B. Bellalta, “IEEE 802.11ax: High-Efficiency WLANs,” IEEE Wireless Communications, vol. 23, no. 1, pp. 38–46, February 2016, doi: 10.1109/MWC.2016.7422404.

38. P. Kairouz, H. B. McMahan, B. Avent, A. Bellet, M. Bennis, A. N. Bhagoji, K. Bonawitz, Z. Charles, G. Cormode, R. Cummings, R. G. L. D’Oliveira, H. Eichner, S. El Rouayheb, D. Evans, J. Gardner, Z. Garrett, A. Gascón, B. Ghazi, P. B. Gibbons, …, and S. Zhao, “Advances and Open Problems in Federated Learning,” Foundations and Trends® in Machine Learning, vol. 14, no. 1–2, pp. 1–210, 2021, doi: 10.1561/2200000083.

39. M. Beshley, N. Kryvinska, and H. Beshley, “Energy-Efficient QoE-Driven Radio Resource Management Method for 5G and Beyond Networks,” IEEE Access, vol. 10, pp. 131691–131710, 2022, doi: 10.1109/ACCESS.2022.3228758.

40. J. Ruan and D. Xie, “A Survey on QoE-Oriented VR Video Streaming: Some Research Issues and Challenges,” Electronics, vol. 10, no. 17, p. 2155, 2021, doi: 10.3390/electronics10172155.