An Enhanced End-to-End Framework for Drone RF Signal Classification

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Smart RF jamming relies on long-term spectrum prediction, which requires accurate, high-resolution RF detection and classification over extended observation periods. Detecting and classifying drone RF signals is particularly challenging due to short dwell times, high hopping rates, and narrow instantaneous bandwidths. This paper presents an enhanced end-to-end framework designed to meet these requirements for smart RF jamming, delivering high-resolution and precise detection and classification. We demonstrate that our Residual Neural Network (ResNet)-based You Only Look Once (YOLO) model effectively detects and extracts RF features from previously unseen drone signals with high accuracy, even when trained solely on a synthetic RF dataset. Furthermore, our ResNet classifier outperforms existing models, achieving 99.29% accuracy at 0 dB signal-to-noise ratio (SNR) for drone RF signals.
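The abstract does not spell out the network internals, but the classifier it describes is ResNet-based, whose defining feature is the identity "skip" connection. As a minimal, hypothetical sketch (dense layers standing in for the 2-D convolutions a real spectrogram classifier would use), the residual computation can be written as:

```python
import numpy as np

def relu(x):
    """Element-wise rectified linear unit."""
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Minimal fully-connected residual block: y = relu(x + W2 @ relu(W1 @ x)).

    The identity shortcut (adding x back before the final activation) is the
    core ResNet idea; an actual RF-signal classifier would apply such blocks
    with 2-D convolutions over time-frequency spectrograms instead.
    """
    out = relu(w1 @ x)       # first transform + nonlinearity
    out = w2 @ out           # second transform
    return relu(out + x)     # identity shortcut, then activation

# Toy forward pass on an 8-dimensional feature vector.
rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)
w1 = rng.standard_normal((d, d)) * 0.1
w2 = rng.standard_normal((d, d)) * 0.1
y = residual_block(x, w1, w2)
print(y.shape)  # (8,)
```

With both weight matrices set to zero, the block reduces to `relu(x)`, which makes the shortcut path easy to verify in isolation.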

Original language: English
Title of host publication: 2025 IEEE International Mediterranean Conference on Communications and Networking, MeditCom 2025
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798331529659
DOIs
Publication status: Published - 2025
Event: 2025 IEEE International Mediterranean Conference on Communications and Networking, MeditCom 2025 - Nice, France
Duration: 7 Jul 2025 - 10 Jul 2025

Publication series

Name: 2025 IEEE International Mediterranean Conference on Communications and Networking, MeditCom 2025

Conference

Conference: 2025 IEEE International Mediterranean Conference on Communications and Networking, MeditCom 2025
Country/Territory: France
City: Nice
Period: 7/07/25 - 10/07/25

Keywords

  • Deep Learning (DL)
  • Spectrum Sensing
  • Unmanned Aerial Vehicles (UAVs)
