Wireless Technologies — Interview Questions & Answers

Practice interview questions on BLE, WiFi, LoRa, cellular IoT, mesh networking, and wireless technology selection.

BLE

Q: Explain the BLE protocol stack — GAP, GATT, ATT, and L2CAP

The BLE protocol stack is organized into layers that each serve a distinct role, much like the OSI model but tailored for low-energy, short-range communication. At the bottom sits the Physical Layer (2.4 GHz ISM band, 40 channels, 1 Mbps or 2 Mbps PHY) and the Link Layer, which handles advertising, scanning, connection management, and channel hopping. Above the Link Layer is L2CAP (Logical Link Control and Adaptation Protocol), which provides channel multiplexing — it routes data to either the ATT (Attribute Protocol) channel or the SMP (Security Manager Protocol) channel. L2CAP also handles fragmentation and reassembly of larger payloads into the Link Layer's maximum PDU size (typically 27 bytes without Data Length Extension, up to 251 bytes with DLE).

ATT defines a simple client-server model where one device (the GATT server) exposes a table of attributes — each identified by a 16-bit handle, a UUID (type), a value, and permissions (read, write, notify). The client can discover, read, write, and subscribe to these attributes using ATT operations. GATT (Generic Attribute Profile) adds structure on top of ATT by organizing attributes into Services (groupings of related data, identified by standard or custom UUIDs) and Characteristics (individual data points within a service, with properties like Read, Write, Notify, Indicate). For example, a Heart Rate Service contains a Heart Rate Measurement characteristic that supports Notify, and a Body Sensor Location characteristic that supports Read.
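
As a concrete illustration of the attribute table described above, here is a hypothetical C sketch of how a Heart Rate Service might be laid out. The struct names, handle values, and the `att_find` helper are invented for illustration (real stacks like Zephyr or the Nordic SoftDevice use their own structures); the UUIDs 0x180D, 0x2A37, 0x2A38, and 0x2902 are SIG-assigned.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Permissions bitmask for an attribute (illustrative subset). */
typedef enum {
    PERM_READ   = 1 << 0,
    PERM_WRITE  = 1 << 1,
    PERM_NOTIFY = 1 << 2,
} att_perm_t;

/* One row of the GATT server's attribute table. */
typedef struct {
    uint16_t handle;     /* unique 16-bit attribute handle */
    uint16_t uuid16;     /* 16-bit UUID identifying the attribute type */
    uint8_t  perms;      /* bitmask of att_perm_t */
    uint8_t  value[20];  /* attribute value storage */
    uint8_t  value_len;
} att_entry_t;

/* Heart Rate Service as consecutive attributes (handles are arbitrary). */
static const att_entry_t heart_rate_table[] = {
    { 0x0010, 0x2800, PERM_READ, { 0x0D, 0x18 }, 2 },        /* Primary Service: Heart Rate (0x180D) */
    { 0x0011, 0x2803, PERM_READ, { 0x10 }, 1 },              /* Characteristic declaration: Notify */
    { 0x0012, 0x2A37, PERM_NOTIFY, { 0x00 }, 1 },            /* Heart Rate Measurement value */
    { 0x0013, 0x2902, PERM_READ | PERM_WRITE, { 0, 0 }, 2 }, /* CCCD: client enables notifications */
    { 0x0014, 0x2A38, PERM_READ, { 0x02 }, 1 },              /* Body Sensor Location: wrist */
};

/* Handle lookup, as an ATT Read Request from the client would trigger. */
const att_entry_t *att_find(uint16_t handle) {
    for (size_t i = 0; i < sizeof heart_rate_table / sizeof heart_rate_table[0]; i++)
        if (heart_rate_table[i].handle == handle)
            return &heart_rate_table[i];
    return NULL;
}
```

Note how the service declaration (0x2800), characteristic declaration (0x2803), and value attributes are all just rows in one flat table — GATT's hierarchy is a convention layered over ATT's flat handle space.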

GAP (Generic Access Profile) sits at the top and defines how devices discover each other and establish connections. It defines four roles: Broadcaster (advertise only), Observer (scan only), Peripheral (advertise and accept connections), and Central (scan and initiate connections). GAP controls advertising parameters (interval, data payload, scan response), connection parameters (connection interval, slave latency, supervision timeout), and bonding/pairing procedures. A common interview mistake is conflating GAP and GATT — GAP handles the "how do I find and connect to you" question, while GATT handles "what data do you have and how do I access it." Understanding this separation is essential for designing BLE firmware, because advertising data (GAP level) is limited to 31 bytes and is broadcast, while GATT data flows over an established connection with acknowledgment and flow control.
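
The 31-byte advertising limit mentioned above is easy to hit in practice, because the payload is assembled from length-type-value "AD structures" that each cost 2 bytes of overhead. A minimal sketch (the `adv_append` helper is hypothetical; the AD types 0x01 Flags and 0x09 Complete Local Name are standard):

```c
#include <stdint.h>
#include <string.h>

#define ADV_MAX_LEN 31  /* legacy advertising payload limit */

/* Append one AD structure (length | type | data) to an advertising
 * payload. Returns the new payload length, or -1 if the structure
 * would exceed the 31-byte legacy limit. */
int adv_append(uint8_t *buf, int len, uint8_t ad_type,
               const uint8_t *data, uint8_t data_len) {
    if (len + 2 + data_len > ADV_MAX_LEN)
        return -1;
    buf[len++] = 1 + data_len;  /* length field counts type byte + data */
    buf[len++] = ad_type;
    memcpy(&buf[len], data, data_len);
    return len + data_len;
}
```

A Flags entry plus a 16-character device name already consumes 21 of the 31 bytes, which is why manufacturer-specific data in advertisements is usually kept to a handful of bytes.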

Q: What is the difference between BLE advertising mode and connected mode?

Advertising mode is a broadcast-based, connectionless state where a BLE peripheral transmits short packets on three dedicated advertising channels (37, 38, 39) at a configurable interval (typically 20 ms to 10.24 s). Any scanner within range can receive these packets without establishing a connection. Advertising packets carry up to 31 bytes of payload (extended to 255 bytes with BLE 5.0 Extended Advertising), and a scan response can provide an additional 31 bytes if a scanner sends a scan request. This mode is ideal for beacons, broadcasting sensor readings to multiple observers, or making a device discoverable. The power consumption depends heavily on the advertising interval — a 1-second interval might draw 10-20 microamps average on an nRF52, while a 100 ms interval increases that by roughly 10x.

Connected mode establishes a dedicated, bidirectional, acknowledged link between exactly two devices (Central and Peripheral). The Central sends a connection request during an advertising event, and both devices switch to data channels (0-36) using an adaptive frequency hopping scheme that avoids channels with interference. Data is exchanged in connection events at a regular connection interval (7.5 ms to 4 s). The Peripheral can use slave latency to skip a configurable number of connection events without responding, saving power when there is no data to send — for example, a slave latency of 4 with a 30 ms connection interval means the peripheral only needs to wake every 150 ms if it has nothing to transmit.
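
The slave-latency arithmetic above reduces to a one-liner worth remembering (function name is illustrative):

```c
/* Effective peripheral wake interval: with slave latency L, the
 * peripheral may skip L consecutive connection events when it has
 * no data, so it only must wake every (L + 1) intervals. */
unsigned effective_wake_ms(unsigned conn_interval_ms, unsigned slave_latency) {
    return conn_interval_ms * (slave_latency + 1);
}
```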

The tradeoffs are significant for system design. Advertising mode is one-to-many but unreliable (no acknowledgment, no retransmission), limited in payload, and cannot support bidirectional communication. Connected mode is one-to-one but provides reliable delivery, larger effective throughput (up to 1.4 Mbps with BLE 5.0 2M PHY and DLE), security (encryption via AES-128-CCM after pairing), and features like notifications and indications with acknowledgment. A common design pattern combines both: use advertising for initial discovery and broadcasting a device identifier, then establish a connection for configuration, firmware updates, or streaming sensor data. The choice between the two directly impacts battery life, latency, and maximum number of simultaneously communicating devices.

Q: How do you optimize BLE for battery life on a wearable device?

BLE power optimization is a multi-layered problem that spans radio parameters, firmware architecture, and hardware design. The single most impactful parameter is the connection interval — increasing it from 7.5 ms to 500 ms can reduce radio power consumption by 50x or more, since the radio only wakes for a brief window at each connection event. For a wearable that sends heart rate data once per second, a connection interval of 500 ms with slave latency of 0 is sufficient and keeps average radio current well under 50 microamps. Pair this with slave latency — setting slave latency to 9 means the peripheral can sleep through 9 consecutive connection events, waking only on the 10th, effectively turning a 100 ms connection interval into a 1-second wake cycle while still allowing the peripheral to respond within 100 ms if it has urgent data.
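
A back-of-the-envelope model of average radio current under these parameters might look like the following sketch. The function and all numbers are illustrative assumptions, not measured values:

```c
/* Average current estimate: sleep-floor current plus the charge
 * consumed per connection event, amortized over the effective wake
 * period (connection interval scaled by slave latency). */
double avg_current_ua(double sleep_ua, double event_charge_uc,
                      double conn_interval_ms, unsigned slave_latency) {
    double wake_period_s = conn_interval_ms * (slave_latency + 1) / 1000.0;
    return sleep_ua + event_charge_uc / wake_period_s;
}
```

With an assumed 1.5 uA sleep floor, roughly 15 uC per connection event, and a 500 ms interval at latency 0, this gives about 31.5 uA average — consistent with the "well under 50 microamps" figure above, and it shows why shortening the interval to 7.5 ms would blow the budget.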

At the firmware level, minimize the time the radio is active per connection event. Use Data Length Extension (DLE) to send up to 251 bytes per PDU instead of the default 27 bytes — this amortizes the radio wake-up overhead over more data, dramatically improving energy per byte. Batch sensor readings and send them in a single burst rather than one reading per connection event. Prefer Notifications over Indications for data that can tolerate occasional packet loss, since Indications require an application-layer acknowledgment that extends the radio-on time. On the advertising side, increase the advertising interval to the maximum acceptable for your use case (1-2 seconds is common for wearables) and minimize scan response data to avoid the extra TX/RX cycle when a scanner requests it.

At the system level, ensure the MCU enters the deepest sleep mode between connection events — on Nordic nRF52, this is System ON sleep with RAM retention at approximately 1.5 microamps. Disable unused peripherals, use the lowest possible clock source for the RTC (32.768 kHz crystal consumes roughly 0.5 microamps), and configure the DC-DC converter if available (nRF52 internal DC-DC reduces radio current from 5.3 mA to 3.7 mA at 0 dBm TX). Measure actual power consumption with a tool like the Nordic Power Profiler or a shunt resistor and oscilloscope — calculated estimates often miss wake-up transients, crystal settling times, and flash read currents that add 10-30% overhead. A well-optimized BLE wearable can run for months on a 100 mAh coin cell, but a poorly configured one (short connection interval, no slave latency, excessive advertising) might drain the same battery in days.
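
The months-versus-days claim at the end follows directly from capacity over average current; a quick sketch:

```c
/* Battery life in days from capacity (mAh) and average draw (uA).
 * Ignores battery self-discharge and capacity derating, which
 * shorten real-world life. */
double battery_life_days(double capacity_mah, double avg_current_ua) {
    double hours = capacity_mah * 1000.0 / avg_current_ua;
    return hours / 24.0;
}
```

At 30 uA average, a 100 mAh coin cell lasts roughly 139 days; at 1 mA average (a badly tuned configuration), the same cell is gone in about four days.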

WiFi & Cellular

Q: WiFi vs BLE vs LoRa — how do you choose for an IoT product?

The choice between WiFi, BLE, and LoRa depends on four primary factors: data rate requirements, range, power budget, and network infrastructure. WiFi (802.11b/g/n/ax) delivers the highest throughput (tens of Mbps), operates over existing infrastructure (routers, cloud connectivity), and supports TCP/IP natively — making it ideal for devices that need internet connectivity, stream video or audio, or transfer large payloads like firmware images. The cost is power: a WiFi SoC like ESP32 draws 80-240 mA during TX/RX, making it impractical for battery-powered devices that must last months. WiFi works best for line-powered products (smart plugs, cameras, displays) or devices with large rechargeable batteries and infrequent transmission windows.

BLE is the sweet spot for short-range, low-power personal devices: wearables, fitness trackers, medical sensors, smart locks, and phone-connected accessories. Range is typically 10-30 meters indoors (up to 100+ meters line-of-sight with BLE 5.0 Long Range coded PHY), data rate is modest (1-2 Mbps PHY, roughly 100-700 kbps application throughput), and power consumption is micro-amp-level between connection events. BLE requires a smartphone or gateway for cloud connectivity, which is acceptable for consumer products but adds complexity for industrial deployments. BLE Mesh extends the topology to many-to-many but adds latency and complexity.

LoRa occupies the long-range, ultra-low-power, low-data-rate niche: agricultural sensors, utility meters, asset trackers, environmental monitoring — any device that sends small payloads (tens of bytes) infrequently (once per hour or less) over distances of 2-15 km. LoRa operates in sub-GHz ISM bands (868 MHz in EU, 915 MHz in US), which penetrate buildings and foliage far better than 2.4 GHz. The tradeoff is throughput: classic LoRa tops out at a few tens of kbps (roughly 22 kbps at SF7 with 500 kHz bandwidth, up to about 50 kbps with LR-FHSS), and typical LoRaWAN deployments use much less. The decision framework in an interview should consider: Can the device be plugged in? If yes, WiFi. Does it need phone connectivity and short range? BLE. Does it need multi-kilometer range with tiny payloads on battery? LoRa. Real products often combine two radios — for example, BLE for configuration and local connectivity plus LoRa for long-range data reporting.
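
The decision framework above can be sketched as a first-pass chooser. The thresholds and the function itself are illustrative — real selection also weighs cost, certification, and ecosystem maturity:

```c
#include <stdbool.h>

typedef enum { RADIO_WIFI, RADIO_BLE, RADIO_LORA } radio_t;

/* First-pass radio selection following the decision questions above:
 * plugged in -> WiFi; phone-connected and short range -> BLE;
 * kilometers of range with tiny payloads -> LoRa. */
radio_t pick_radio(bool line_powered, bool needs_phone_link,
                   double range_km, unsigned payload_bytes) {
    if (line_powered)
        return RADIO_WIFI;
    if (needs_phone_link && range_km < 0.1)
        return RADIO_BLE;
    if (range_km >= 1.0 && payload_bytes <= 64)
        return RADIO_LORA;
    return RADIO_BLE;  /* default to the lowest-power short-range option */
}
```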

Q: What is the power cost of WiFi on a battery-powered device?

WiFi is fundamentally power-hungry because the protocol was designed for always-on, AC-powered devices. A typical WiFi SoC like the ESP32 draws approximately 80 mA in receive mode, 120-240 mA during transmission (depending on TX power and modulation), and 20-60 mA during active scanning. Even the association process — scanning for the AP, authenticating, completing the DHCP handshake — can take 2-8 seconds and consume several hundred milliamp-seconds of charge. For a battery-powered sensor that must report data once per minute, each wake-up cycle including WiFi association, DNS lookup, TLS handshake, HTTP POST, and disconnection might draw 200-400 mA for 3-5 seconds — roughly 0.2-0.6 mAh of charge per transmission. On a 1000 mAh battery, that limits the device to a few thousand transmissions, or only a few days at one-per-minute reporting.
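
A sketch of the battery arithmetic above, using the assumed averages from the text:

```c
/* Charge consumed by one connect-report-disconnect cycle, in mAh. */
double charge_per_tx_mah(double avg_current_ma, double active_time_s) {
    return avg_current_ma * active_time_s / 3600.0;
}

/* How many such cycles a battery can sustain. Ignores sleep current
 * and battery derating, which shorten this further in practice. */
double tx_per_battery(double capacity_mah, double avg_current_ma,
                      double active_time_s) {
    return capacity_mah / charge_per_tx_mah(avg_current_ma, active_time_s);
}
```

At 300 mA for 4 s, each cycle costs about 0.33 mAh, so a 1000 mAh battery supports roughly 3,000 transmissions — about two days at one-per-minute reporting.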

The 802.11 power-save modes help but do not fundamentally solve the problem. Legacy Power Save (PS-Poll) lets the station sleep between beacon intervals (typically 100 ms) and wake to check the TIM field in the beacon; if data is buffered at the AP, the station sends a PS-Poll to retrieve it. This reduces idle current but still requires the radio to wake every 100 ms. WMM Power Save (U-APSD) improves on this by allowing trigger-based delivery. The most aggressive optimization is Target Wake Time (TWT), introduced in WiFi 6 (802.11ax), which allows the station to negotiate a specific schedule with the AP — for example, waking once every 10 seconds for a 2 ms window. TWT can reduce average WiFi current to the hundreds-of-microamps range for infrequent data exchange, making battery operation feasible.

Practical strategies for battery-powered WiFi devices include: keeping the WiFi association alive (avoiding the expensive reconnect cycle) using light-sleep with periodic beacon listening; batching multiple sensor readings and transmitting them in a single burst; using UDP instead of TCP to avoid the handshake overhead; storing the WiFi channel and BSSID in RTC memory to skip the scanning phase on wake-up (reduces connection time from seconds to hundreds of milliseconds on ESP32); and using deep sleep between transmissions with the WiFi radio completely powered down. Even with all optimizations, WiFi battery life is measured in weeks to months for infrequent reporting — compared to years for BLE or LoRa. An interviewer asking this question wants to hear that you understand the fundamental energy cost and know both the protocol-level mitigations and the system-level workarounds.

Q: Compare NB-IoT and LTE-M for cellular IoT applications

NB-IoT (Narrowband IoT, 3GPP Cat-NB1/NB2) and LTE-M (LTE for Machines, 3GPP Cat-M1) are both cellular LPWAN technologies designed for IoT, but they occupy different design points. NB-IoT uses a narrow 180 kHz carrier (fitting within one 200 kHz GSM channel), delivers peak data rates of a few tens of kbps (roughly 26 kbps downlink and 66 kbps uplink for Cat-NB1), and is optimized for stationary devices that send very small payloads infrequently — think smart meters, soil moisture sensors, and parking spot detectors. LTE-M uses a wider 1.4 MHz bandwidth, delivers peak rates of approximately 1 Mbps in both directions, supports voice (VoLTE) and full mobility with cell handover, and targets devices that need moderate throughput and movement — wearables, asset trackers, connected health devices, and point-of-sale terminals.

The power and coverage tradeoffs are nuanced. Both technologies support PSM (Power Saving Mode) and eDRX (extended Discontinuous Reception), which allow the modem to enter deep sleep for minutes to hours between scheduled paging windows, reducing average current to single-digit microamps. In PSM, the device is unreachable from the network until it wakes — acceptable for sensors that only report data, problematic for devices that need to receive commands. NB-IoT achieves deeper coverage (up to 20 dB better link budget than LTE, reaching underground basements and deep indoor locations) through extreme repetition of transmissions, but at the cost of higher latency (seconds to tens of seconds for data delivery). LTE-M has lower latency (10-100 ms typical), supports handover between cells (essential for moving assets), and enables real-time applications like voice calls.

The selection criteria in practice: choose NB-IoT for stationary, ultra-low-power devices with tiny payloads and no real-time requirements, especially in challenging coverage environments. Choose LTE-M for mobile assets, devices that need faster data rates, lower latency, or bidirectional communication, and applications where the device must be reachable by the network. Cost differences are narrowing but NB-IoT modules tend to be slightly cheaper. Both require a cellular subscription, which adds per-device monthly cost (typically $0.50-$2 for IoT plans). A key interview point: NB-IoT does not support handover, so a device moving between cells must re-attach, causing connection drops and increased power consumption — this makes NB-IoT unsuitable for vehicle tracking or anything that moves faster than walking speed.

LoRa & Mesh

Q: What is LoRaWAN and how does it differ from raw LoRa?

LoRa (Long Range) is a physical layer modulation technique based on Chirp Spread Spectrum (CSS), developed by Semtech. It defines how bits are encoded onto radio waves in the sub-GHz ISM bands (868 MHz EU, 915 MHz US, 433 MHz Asia). LoRa provides configurable tradeoffs between range, data rate, and power through parameters like Spreading Factor (SF7-SF12), Bandwidth (125/250/500 kHz), and Coding Rate (4/5 to 4/8). A raw LoRa radio is essentially a modem — you can build point-to-point or star links, define your own packet format, and implement custom networking logic. Raw LoRa gives maximum flexibility and is used in proprietary industrial systems, agriculture, and custom sensor networks where you control both endpoints.

LoRaWAN is a MAC layer and network architecture specification maintained by the LoRa Alliance that sits on top of the LoRa PHY. It defines a complete network stack: device classes (A, B, C), packet formats with headers and frame counters, AES-128 encryption (network session key and application session key), Adaptive Data Rate (ADR) to automatically select the best SF/BW for each device, over-the-air activation (OTAA) and activation by personalization (ABP), and a star-of-stars topology where end devices communicate with gateways that forward packets to a centralized Network Server via IP backhaul. The Network Server deduplicates packets received by multiple gateways, manages device sessions, and routes application data to the appropriate Application Server.

The key differences: raw LoRa is peer-to-peer with no network management, no standardized security, and no cloud integration — you build everything yourself. LoRaWAN provides a managed, secure, scalable network with built-in device management, but imposes constraints: Class A devices (the most power-efficient) can only receive downlink data in two short windows immediately after an uplink transmission, limiting bidirectional communication. LoRaWAN also enforces regional duty cycle limits (1% in EU868, frequency hopping in US915) that restrict how often a device can transmit. For an interview, emphasize that LoRa is the radio, LoRaWAN is the network — and that choosing raw LoRa over LoRaWAN makes sense when you need low latency, custom protocols, or cannot deploy gateway infrastructure, while LoRaWAN makes sense when you need to scale to thousands of devices with centralized management and standard security.

Q: Explain the Thread and Matter protocol stack for smart home

Thread is a low-power, IPv6-based mesh networking protocol designed for smart home and building automation. It operates on IEEE 802.15.4 radio (2.4 GHz, 250 kbps), the same physical layer as Zigbee, but uses a completely different network stack. Thread's key innovation is bringing standard IP networking to constrained devices: it uses 6LoWPAN for IPv6 header compression, MLE (Mesh Link Establishment) for neighbor discovery and maintaining the routing topology, a distance-vector routing protocol carried in MLE advertisements (not RPL) for mesh routing, and DTLS over UDP for secure communication. Thread devices are assigned IPv6 addresses and can communicate with any IP device through a Border Router that bridges the Thread mesh to WiFi or Ethernet networks. The mesh is self-healing — devices automatically reroute around failed nodes, and the network supports more than 250 devices per partition.

Matter (formerly Project CHIP) is an application layer protocol that runs on top of multiple transports — WiFi, Thread, and Ethernet. Matter defines a standardized data model for smart home devices: a Node contains Endpoints, each Endpoint implements Clusters (like On/Off, Level Control, Temperature Measurement), and each Cluster has Attributes, Commands, and Events. This is conceptually similar to BLE's GATT model but applied to home automation. Matter uses a standard commissioning flow (involving QR codes or NFC), certificate-based device authentication (Device Attestation Certificates issued by the Connectivity Standards Alliance), and supports multiple simultaneous controllers (so a device can be controlled by Apple Home, Google Home, and Amazon Alexa simultaneously via Multi-Admin).
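
The Node → Endpoint → Cluster → Attribute hierarchy can be pictured as nested structures. This is a hypothetical sketch — real Matter SDKs generate the data model from cluster definitions rather than hand-writing structs; the On/Off cluster ID 0x0006 is from the specification, the rest is illustrative:

```c
#include <stddef.h>
#include <stdint.h>

/* Leaf: a cluster attribute such as OnOff (attribute ID 0x0000). */
typedef struct { uint32_t attribute_id; int32_t value; } cluster_attr_t;

/* A cluster groups related attributes, commands, and events. */
typedef struct { uint32_t cluster_id; cluster_attr_t *attrs; int n_attrs; } cluster_t;

/* An endpoint implements one or more clusters (e.g. a light's main endpoint
 * implements On/Off 0x0006 and Level Control 0x0008). */
typedef struct { uint16_t endpoint_id; cluster_t *clusters; int n_clusters; } endpoint_t;

/* A node is the whole device: one or more endpoints. */
typedef struct { endpoint_t *endpoints; int n_endpoints; } node_t;

/* Controller-side lookup: find a cluster on an endpoint by ID. */
cluster_t *find_cluster(endpoint_t *ep, uint32_t cluster_id) {
    for (int i = 0; i < ep->n_clusters; i++)
        if (ep->clusters[i].cluster_id == cluster_id)
            return &ep->clusters[i];
    return NULL;
}
```

The parallel to GATT is direct: Endpoint ~ Service, Cluster ~ Characteristic group, Attribute ~ Characteristic value — which is why engineers with BLE experience usually find the Matter data model familiar.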

The relationship between Thread and Matter is complementary, not competing: Thread provides the mesh network transport for battery-powered devices (door sensors, motion detectors, smart locks), while Matter provides the application protocol that ensures interoperability between manufacturers and ecosystems. A Matter-over-Thread device uses Thread for mesh networking and IP connectivity, and Matter for standardized device behavior. A Matter-over-WiFi device uses WiFi as the transport instead. The interview insight is that Thread solved the networking problem (reliable, low-power mesh with IP connectivity) while Matter solved the interoperability problem (one protocol understood by Apple, Google, Amazon, and Samsung). Together they replace the fragmented landscape of proprietary Zigbee profiles and Z-Wave command classes with a single, open standard.

Q: What is the maximum data rate and range of LoRa?

LoRa's data rate and range are not fixed — they are configurable tradeoffs controlled by the Spreading Factor (SF), Bandwidth (BW), and Coding Rate (CR). The fastest classic LoRa mode, SF6 with 500 kHz bandwidth (not used in LoRaWAN), reaches approximately 37.5 kbps; SF7 at 500 kHz with CR 4/5 delivers roughly 22 kbps, and the newer LR-FHSS modulation reaches about 50 kbps. At SF12 with 125 kHz bandwidth and CR 4/8, the data rate drops to roughly 183 bps — some 200x slower — but receiver sensitivity improves by approximately 20 dB, dramatically extending range. Each step increase in spreading factor doubles the time-on-air for the same payload, halving the data rate but adding roughly 2.5 dB of link budget (approximately 30-40% more range in free space).

In terms of range, real-world performance depends heavily on environment, antenna design, and RF conditions. In urban environments with buildings and multipath interference, typical LoRa range is 2-5 km. In suburban or rural areas with clear line-of-sight, 10-15 km is achievable. Record-breaking experiments have demonstrated LoRa links exceeding 200 km in line-of-sight conditions (balloon-to-ground or mountain-to-valley), and 766 km has been achieved in extreme high-altitude balloon experiments. For practical system design, assume 2-5 km urban, 5-10 km suburban, and budget for gateway density accordingly. Sub-GHz frequencies (868/915 MHz) penetrate walls and foliage significantly better than 2.4 GHz (WiFi/BLE), which is LoRa's primary advantage for outdoor IoT deployments.

The tradeoff between data rate and range has direct implications for system design. A soil moisture sensor reporting a 10-byte reading every hour can use SF12 to maximize range — the transmission takes on the order of 1-1.5 seconds at SF12 (preamble and header overhead on top of the 183 bps raw rate), which is perfectly acceptable. A livestock tracking device sending GPS coordinates every 5 minutes needs higher throughput and might use SF9 or SF10, accepting reduced range but keeping transmission time under 100 ms to conserve battery. LoRaWAN's Adaptive Data Rate (ADR) algorithm automatically adjusts the spreading factor based on link quality — devices close to the gateway use SF7 for speed and efficiency, while devices at the edge of coverage use SF12 for reliability. An important interview point: higher spreading factors increase time-on-air, which increases power consumption per packet and reduces the effective duty cycle — at SF12 in EU868 with a 1% duty cycle, a device can only transmit approximately 25 packets per hour with a 10-byte payload.
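
The duty-cycle arithmetic at the end is straightforward. Time-on-air is taken as an input here — computing it exactly requires the full LoRa ToA formula with preamble and header terms:

```c
/* Maximum packets per hour under a regional duty-cycle limit.
 * time_on_air_s: per-packet airtime (roughly 1.4 s for a 10-byte
 * payload at SF12/125 kHz); duty_cycle: e.g. 0.01 for EU868's 1%. */
int max_packets_per_hour(double time_on_air_s, double duty_cycle) {
    return (int)(3600.0 * duty_cycle / time_on_air_s);
}
```

With 1.4 s airtime and a 1% duty cycle the budget is 36 s of airtime per hour, i.e. about 25 packets — the figure quoted above. The same payload at SF7 (tens of milliseconds of airtime) would allow hundreds of packets per hour.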

Selection & Design

Q: How do you select a wireless technology for a new embedded product?

Wireless technology selection is a system-level decision that must be made early in the product definition phase, as it impacts PCB layout, antenna design, power architecture, regulatory certification, and software stack. The decision framework starts with four primary criteria: data rate requirements (bytes per second of application data), range (indoor, outdoor, urban, rural), power budget (line-powered, rechargeable battery with size constraints, coin cell for years), and network topology (point-to-point, star, mesh, internet connectivity needed).

Map your requirements to the technology landscape: if you need internet connectivity and are line-powered, WiFi is the default choice — it leverages existing infrastructure, has mature software stacks (TCP/IP, TLS, HTTP/MQTT), and users already have routers. If you need short-range communication with a smartphone and low power, BLE is the standard — every phone supports it, the ecosystem is mature, and power consumption in the tens-of-microamps range enables coin-cell operation. If you need long range (kilometers) with small payloads and extreme battery life, evaluate LoRa (ISM band, no subscription) versus NB-IoT/LTE-M (licensed spectrum, carrier subscription, better reliability and coverage). If you need mesh networking for home or building automation, Thread/Matter or Zigbee are purpose-built. If you need high-precision ranging, UWB provides centimeter-level accuracy.

Beyond the primary criteria, evaluate secondary factors: regulatory requirements by target market (FCC, CE, TELEC — each has different rules for ISM bands, duty cycle, and TX power), module availability and cost (a certified WiFi/BLE combo module like ESP32 costs under $2, while a cellular module costs $10-30 plus subscription fees), ecosystem maturity (BLE has excellent smartphone SDKs; LoRaWAN has standardized network servers; Thread/Matter has growing but still maturing tooling), antenna size constraints (sub-GHz antennas are physically larger than 2.4 GHz), coexistence with other radios on the same board, and the engineering team's experience. In an interview, demonstrate that you think about the full product lifecycle — not just the radio specification sheet, but the certification cost, manufacturing test strategy, field update mechanism, and end-user provisioning experience.

Q: What factors affect wireless range in real-world deployments?

The theoretical range of any wireless technology is calculated from the link budget: TX power + TX antenna gain - path loss + RX antenna gain must exceed the receiver sensitivity. In free space, path loss increases with the square of the distance and the square of the frequency (Friis equation). But real-world deployments rarely achieve free-space conditions, and understanding the degradation factors is critical for embedded engineers designing reliable wireless products.

Building materials are the primary range-killer for indoor deployments. Drywall attenuates 2.4 GHz signals by 3-5 dB, concrete by 10-15 dB, and reinforced concrete with rebar by 15-25 dB. A single concrete wall can cut range by 50-70%. Metal surfaces (filing cabinets, refrigerators, elevator shafts, metal studs) cause reflections and shadowing. Multipath — signals bouncing off surfaces and arriving at the receiver with different delays and phases — causes constructive and destructive interference, creating "dead spots" where signal strength drops by 20-30 dB within centimeters. This is why a device that works perfectly during bench testing fails in the field at the same distance. Body absorption at 2.4 GHz (the human body attenuates signals by 5-10 dB) particularly affects wearables — a BLE device worn on the wrist has its signal partially blocked by the body in certain orientations.

Environmental factors include interference from other devices on the same frequency band (2.4 GHz is shared by WiFi, BLE, Zigbee, Thread, and microwave ovens), vegetation (trees and foliage attenuate sub-GHz signals by 5-15 dB depending on density and moisture), and weather (rain attenuation is negligible below 10 GHz but humidity affects long-range sub-GHz links). Antenna design is often underestimated: a chip antenna saves PCB space but has 3-5 dB less gain than a properly tuned PCB trace antenna, and both are sensitive to nearby ground plane geometry, battery placement, and enclosure materials. A plastic enclosure with metallic paint, a battery positioned next to the antenna, or a missing ground plane clearance zone can each degrade range by 30-50%. For reliable product design, always measure range in the actual enclosure with all components populated, in the actual deployment environment, and add a 10-15 dB fade margin to handle worst-case interference and multipath.

Q: How do you handle BLE and WiFi coexistence on the same chip?

BLE and WiFi both operate in the 2.4 GHz ISM band (2.400-2.4835 GHz), which creates a fundamental coexistence challenge when both radios are active on the same chip or the same PCB. WiFi channels 1, 6, and 11 (the non-overlapping channels in 2.4 GHz, each 20 MHz wide) occupy most of the band, while BLE uses 40 channels of 2 MHz each across the same spectrum. When both radios attempt to transmit simultaneously, the WiFi signal (at 15-20 dBm TX power) overwhelms the BLE receiver (sensitivity around -95 dBm), and the BLE transmission interferes with WiFi reception. Without coexistence management, packet loss rates of 20-50% are common, leading to BLE connection drops and WiFi throughput degradation.

Combo chips like the ESP32, Nordic nRF7002 (WiFi companion to nRF5340 BLE), and CYW43455 (Cypress/Infineon) implement hardware-level coexistence arbitration. The most common mechanism is the Packet Traffic Arbitration (PTA) or coex grant/request protocol: when one radio needs to transmit, it asserts a request signal; the coexistence arbiter decides which radio gets access based on priority rules and timing. Time-critical BLE connection events (especially the anchor point where the first packet must be exchanged) get high priority to prevent connection drops, while WiFi bulk data transfers can tolerate brief pauses. On ESP32, the coexistence module uses a combination of time-division (alternating WiFi and BLE access in microsecond-granularity slots) and frequency-domain awareness (BLE's adaptive frequency hopping avoids the channels currently used by WiFi).
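
A drastically simplified PTA decision might look like the following sketch. Real arbiters use hardware request/priority/grant lines with microsecond timing and per-packet priority levels; everything here is illustrative:

```c
#include <stdbool.h>

typedef enum { GRANT_BLE, GRANT_WIFI } coex_grant_t;

/* Simplified PTA arbitration: time-critical BLE activity (connection
 * anchor points, events near the supervision timeout) wins the air;
 * otherwise WiFi keeps it, since WiFi bulk traffic suffers more from
 * fragmentation than BLE suffers from a deferred retry. */
coex_grant_t pta_arbitrate(bool ble_request, bool ble_time_critical,
                           bool wifi_request) {
    if (ble_request && ble_time_critical)
        return GRANT_BLE;
    if (wifi_request)
        return GRANT_WIFI;
    return ble_request ? GRANT_BLE : GRANT_WIFI;
}
```

The essential idea is the asymmetry: a skipped BLE anchor point risks a dropped connection, while a paused WiFi transfer merely loses throughput, so priority rules are tilted toward BLE's timing-critical events.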

From a firmware perspective, you can improve coexistence by: (1) increasing the BLE connection interval so BLE needs fewer time slots, giving WiFi more airtime; (2) using BLE's adaptive frequency hopping to avoid the three WiFi channels in use; (3) scheduling WiFi data transfers (like firmware OTA) during periods when BLE activity is low; (4) reducing WiFi TX power if range permits, decreasing the desensitization of the BLE receiver; and (5) on dual-chip designs, using an external coexistence interface (3-wire or 4-wire PTA signals between the WiFi and BLE chips) to coordinate access. Testing coexistence requires measuring both WiFi throughput and BLE connection stability simultaneously under load — a common failure mode is that everything works during individual testing but breaks when both radios are active under realistic application conditions.