Kumoh National Institute of Technology
Networked Systems Lab.

S. H. Kim, J. W. Kim, V. S. Doan and D. S. Kim, "Lightweight Deep Learning Model for Automatic Modulation Classification in Cognitive Radio Networks," IEEE Access, vol. 8, pp. 197532-197541, Oct. 2020. IF: 3.745.
By: S. H. Kim
Date: 2020-09-23

Abstract
Automatic modulation classification (AMC) used in cognitive radio networks is an important class of methods apt to utilize spectrum resources efficiently. However, conventional likelihood-based approaches have high computational complexity. Thus, this paper proposes a novel convolutional neural network architecture for AMC. A bottleneck and an asymmetric convolution structure are employed in the proposed model to reduce the computational complexity, and the skip connection technique is used to mitigate the vanishing gradient problem and improve the classification accuracy. The DeepSig:RadioML dataset, which comprises 24 modulation classes, is used for the performance analysis. Simulation results show that the proposed model outperforms MCNet, the best-performing conventional model, in classification accuracy over the signal-to-noise ratio (SNR) range from -4 dB to 20 dB, achieving improvements of 5.52% and 5.92% at SNRs of 0 dB and 10 dB, respectively. In terms of computational complexity, the proposed model not only reduces the number of trainable parameters by more than 67% but also shortens the prediction time for a signal by more than 54.4% compared with MCNet.

DOI: 10.1109/ACCESS.2020.3033989
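For readers curious about the architectural ideas mentioned in the abstract, the sketch below is a minimal, hypothetical PyTorch block combining a 1x1 bottleneck, asymmetric (1x3/3x1) convolutions, and a skip connection. It illustrates the general technique only; the class name, channel sizes, kernel shapes, layer ordering, and input dimensions are assumptions and do not reproduce the exact architecture of the paper.

```python
# Hypothetical sketch (not the paper's exact architecture): a residual block
# with a 1x1 bottleneck, asymmetric 1x3/3x1 convolutions, and a skip
# connection, in the spirit described in the abstract.
import torch
import torch.nn as nn


class BottleneckAsymmetricBlock(nn.Module):
    def __init__(self, channels: int, bottleneck: int):
        super().__init__()
        # 1x1 "bottleneck" convolution shrinks the channel dimension,
        # which is where most of the parameter savings come from.
        self.reduce = nn.Conv2d(channels, bottleneck, kernel_size=1)
        # An asymmetric pair (1x3 followed by 3x1) approximates a 3x3
        # convolution with fewer parameters.
        self.conv_1x3 = nn.Conv2d(bottleneck, bottleneck, kernel_size=(1, 3), padding=(0, 1))
        self.conv_3x1 = nn.Conv2d(bottleneck, bottleneck, kernel_size=(3, 1), padding=(1, 0))
        # 1x1 convolution restores the channel dimension for the skip addition.
        self.expand = nn.Conv2d(bottleneck, channels, kernel_size=1)
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.act(self.reduce(x))
        y = self.act(self.conv_1x3(y))
        y = self.act(self.conv_3x1(y))
        y = self.bn(self.expand(y))
        # Skip connection: adding the input eases gradient flow and helps
        # avoid the vanishing-gradient problem mentioned in the abstract.
        return self.act(x + y)


if __name__ == "__main__":
    # Toy intermediate feature map: 32 channels over a 2 x 1024 (I/Q x time)
    # grid, as might follow an initial convolution on a RadioML frame
    # (all sizes here are assumptions).
    x = torch.randn(8, 32, 2, 1024)
    block = BottleneckAsymmetricBlock(channels=32, bottleneck=8)
    print(block(x).shape)  # torch.Size([8, 32, 2, 1024])
```

The parameter savings in such a block come from shrinking the channel dimension before the spatial convolutions and from replacing a square kernel with an asymmetric pair, while the additive skip path keeps gradients flowing through deep stacks of blocks.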

Final review comments
Reviewer: 1

Recommendation: Accept (minor edits)

Comments:
The authors have addressed the comments of this reviewer and the paper can be accepted for publication.

Additional Questions:
1) Does the paper contribute to the body of knowledge?: Yes

2) Is the paper technically sound?: Yes

3) Is the subject matter presented in a comprehensive manner?: Yes

4) Are the references provided applicable and sufficient?: Yes

5) Are there references that are not appropriate for the topic being discussed?: No

5a) If yes, then please indicate which references should be removed.:


Reviewer: 2

Recommendation: Reject (do not encourage resubmit)

Comments:
There is no novelty in this work.
The authors merely apply a convolutional neural network to modulation classification.

Additional Questions:
1) Does the paper contribute to the body of knowledge?: No. The authors merely apply a convolutional neural network to modulation classification.

2) Is the paper technically sound?: It needs more improvement.

3) Is the subject matter presented in a comprehensive manner?: No. A new section titled "Literature Review" should be added to present related work.

4) Are the references provided applicable and sufficient?: Yes.

5) Are there references that are not appropriate for the topic being discussed?: No

5a) If yes, then please indicate which references should be removed.:


Reviewer: 3

Recommendation: Accept (minor edits)

Comments:
Thanks for their,

Additional Questions:
1) Does the paper contribute to the body of knowledge?: Yes.

2) Is the paper technically sound?: Yes.

3) Is the subject matter presented in a comprehensive manner?: Yes.

4) Are the references provided applicable and sufficient?: Yes.

5) Are there references that are not appropriate for the topic being discussed?: No

5a) If yes, then please indicate which references should be removed.:


Reviewer: 4

Recommendation: Accept (minor edits)

Comments:
I have a minor comment for the authors:

- Please modify the colors used in Fig. 3. The current choice is very confusing; use similar colors (but different line styles) for the validation curves and, likewise, similar colors for the training curves.

Additional Questions:
1) Does the paper contribute to the body of knowledge?: Yes.

2) Is the paper technically sound?: Yes.

3) Is the subject matter presented in a comprehensive manner?: Yes.

4) Are the references provided applicable and sufficient?: Yes.

5) Are there references that are not appropriate for the topic being discussed?: No

5a) If yes, then please indicate which references should be removed.:


Reviewer: 5

Recommendation: Accept (minor edits)

Comments:
First of all, I would like to point out that I was not involved in the previous evaluation round of this submitted work. My opinion is that this revised paper is interesting, well written, and represents a valuable contribution to this journal.

For this reason, I have only the following minor comments that I would like to see addressed by the authors:

1) Abstract – Automatic modulation classification (AMC) used in cognitive radio networks is an important method to utilize spectrum resources efficiently -> Automatic modulation classification (AMC) used in cognitive radio networks is an important class of methods apt to utilize spectrum resources efficiently

2) Abstract – However, a conventional likelihood-based approach has high computational complexity. -> However, conventional likelihood-based approaches have high computational complexity.

3) Abstract – It is a little strange that, while the abstract states the authors use the real dataset DeepSig:RadioML, it then refers to "Simulation results". Please revise this part accordingly.

4) When introducing the benefits of DL to communications and networking, the following related works may be mentioned for the sake of a wider introduction to the topic:

"Mobile encrypted traffic classification using deep learning: Experimental evaluation, lessons learned, and challenges." IEEE Transactions on Network and Service Management 16.2 (2019): 445-458.

"Learning to detect." IEEE Transactions on Signal Processing 67.10 (2019): 2554-2564.

"Toward Effective Mobile Encrypted Traffic Classification through Deep Learning." Neurocomputing (2020).

5) In Sec. I, when the authors review the different DL-based approaches for AMC, it would be useful to place a summarizing table that categorizes them all according to their main distinctive features. This would be beneficial to the general reader.

6) Please add a notation paragraph at the end of Sec. I.

7) In Fig. 7, please also add, for the sake of completeness, the confusion matrices pertaining to the baseline that reports the highest performance on the considered dataset (e.g., VGG [37]).

8) In Tab. 4, I assume the authors are reporting the time elapsed for prediction. Still, for completeness, it would also be useful to report the overall training runtime for each model.

9) Conclusions should be enriched with a brief paragraph highlighting the future directions of research based on the contributions of this study.

Additional Questions:
1) Does the paper contribute to the body of knowledge?: Yes

2) Is the paper technically sound?: Yes

3) Is the subject matter presented in a comprehensive manner?: Yes

4) Are the references provided applicable and sufficient?: see comments below

5) Are there references that are not appropriate for the topic being discussed?: Yes

5a) If yes, then please indicate which references should be removed.: