Cyclically Equivariant Neural Decoders for Cyclic Codes

Authors: Xiangyu Chen, Min Ye

ICML 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive simulations with BCH codes and punctured Reed-Muller (RM) codes show that our new decoder consistently outperforms previous neural decoders when decoding cyclic codes.
Researcher Affiliation | Academia | Data Science and Information Technology Research Center, Tsinghua-Berkeley Shenzhen Institute, Tsinghua Shenzhen International Graduate School, Shenzhen, China.
Pseudocode | No | The paper describes its decoding algorithms with mathematical equations and step-by-step prose (e.g., Section 4: 'Step 1: Prepend a dummy symbol L0 = 0 to the LLR vector.'), but it contains no structured pseudocode or explicitly labeled algorithm block. A minimal sketch of the quoted step appears below the table.
Open Source Code | Yes | Code available at github.com/cyclicallyneuraldecoder
Open Datasets | No | No concrete access information (link, DOI, repository, or formal citation) is provided for a publicly available training dataset. The paper works with 'BCH codes and punctured RM codes' and trains using 'the all-zero codeword', implying that data is generated from the code structure rather than loaded from a named external dataset; a sketch of such a generator appears below the table.
Dataset Splits | No | The paper states 'For all codes, we train with a batch size of 160 samples...' and 'We use 10^5 samples for testing.' but gives no details about a validation split or its size.
Hardware Specification | No | No specific hardware details such as CPU/GPU models, processor types, or memory amounts used for running the experiments are provided in the paper.
Software Dependencies | No | The paper does not list software dependencies with version numbers (e.g., 'Python 3.8' or 'PyTorch 1.9').
Experiment Setup | Yes | In our simulations, the number of BP iterations is 5 for all the methods. For all codes, we train with a batch size of 160 samples, among which we produce 20 samples from each of the following 8 SNR values: 1 dB, 2 dB, ..., 8 dB. A sketch of this batch composition appears below the table.
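
To illustrate the step quoted in the Pseudocode row, here is a minimal sketch of 'Step 1' of the paper's list decoding procedure. This is not the authors' implementation: the function name is hypothetical, NumPy is an assumed dependency, and the sketch relies on the standard reading that an LLR of 0 represents an uninformative (erased) symbol.

```python
import numpy as np

def prepend_dummy_llr(llr: np.ndarray) -> np.ndarray:
    # Hypothetical helper for Section 4, Step 1: prepend a dummy symbol
    # L0 = 0 to the LLR vector. An LLR of 0 carries no channel information,
    # so the new position acts as an erasure, extending a length-n received
    # word to length n + 1.
    return np.concatenate(([0.0], llr))

# Usage: extend a length-7 LLR vector to length 8.
llr = np.random.default_rng(0).standard_normal(7)
extended = prepend_dummy_llr(llr)  # shape (8,)
```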
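
The Open Datasets row notes that the training data is generated rather than downloaded. A common way to realize this for neural decoders, sketched below under the assumption of BPSK modulation over an AWGN channel, is to transmit the all-zero codeword mentioned in the paper and hand the resulting channel LLRs to the decoder; the function name, signature, and SNR convention (Eb/N0 in dB) are assumptions, not the authors' code.

```python
import numpy as np

def all_zero_awgn_llrs(n, rate, snr_db, batch, rng=None):
    # Hypothetical generator: the all-zero codeword maps to the all-ones
    # BPSK sequence. Gaussian noise is added at the requested Eb/N0, and
    # the channel outputs are converted to LLRs, the decoder's input.
    rng = rng or np.random.default_rng()
    sigma = np.sqrt(1.0 / (2.0 * rate * 10.0 ** (snr_db / 10.0)))
    y = 1.0 + sigma * rng.standard_normal((batch, n))
    return 2.0 * y / sigma**2
```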
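
Finally, the batch composition quoted in the Experiment Setup row can be expressed with the generator sketched above: 20 samples at each of the 8 SNR values 1 dB through 8 dB, giving the stated batch size of 160. Again, this is a sketch, not the authors' training loop.

```python
import numpy as np

def mixed_snr_batch(n, rate, per_snr=20):
    # One training batch as described in the paper: per_snr = 20 samples
    # from each SNR in {1 dB, ..., 8 dB}, i.e., 8 * 20 = 160 in total.
    parts = [all_zero_awgn_llrs(n, rate, snr_db, per_snr)
             for snr_db in range(1, 9)]
    return np.concatenate(parts, axis=0)  # shape (160, n)
```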