Online Markov Decoding: Lower Bounds and Near-Optimal Approximation Algorithms

Authors: Vikas Garg, Tamar Pichkhadze

NeurIPS 2019

Each reproducibility variable below is listed with its extracted result and the supporting LLM response:

Research Type: Experimental
LLM Response: We describe the results of our experiments on two real datasets. We first compare the performance of our methods with the state-of-the-art Online Step Algorithm (OSA) [24], which also provides theoretical guarantees for first-order Markov decoding under latency constraints. ... We experimented with the Glycerol TraSH genome data [35] pertaining to M. tuberculosis transposon mutants.
(See the latency-constrained decoding sketch after this table.)

Researcher Affiliation: Academia
LLM Response: Vikas K. Garg (MIT, vgarg@csail.mit.edu) and Tamar Pichkhadze (MIT, tamarp@alum.mit.edu).

Pseudocode: No
LLM Response: We outline an efficient procedure, underlying Theorem 2, in the supplementary material.

Open Source Code: No
LLM Response: The paper does not provide concrete access to source code for the methodology it describes.

Open Datasets: Yes
LLM Response: We experimented with the Glycerol TraSH genome data [35] pertaining to M. tuberculosis transposon mutants.

Dataset Splits: Yes
LLM Response: The corpus is not divided into separate train and test sets. Therefore, we formed 5 random partitions, each with 80% train and 20% test sentences.
(See the partitioning sketch after this table.)

Hardware Specification: No
LLM Response: The paper does not provide specific hardware details (exact GPU/CPU models, processor speeds, or memory amounts) used to run its experiments.

Software Dependencies: No
LLM Response: The paper does not name the ancillary software (e.g., libraries or solvers with version numbers, such as Python 3.8 or CPLEX 12.4) needed to replicate the experiments.

Experiment Setup: No
LLM Response: We used the parameter settings suggested by [35] for decoding with an HMM. ... We varied the OSA hyperparameter λ ∈ {10⁻⁴, 10⁻¹, …, 10⁴} under both the entropy and the expected classification error measures suggested by [24] to tune for L (as noted in [24], large values of λ penalize latency).
(See the λ-sweep sketch after this table.)

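For context on the setting the Research Type entry quotes, here is a minimal sketch of latency-constrained (fixed-lag) Viterbi decoding for a first-order HMM. This is not the paper's algorithm or OSA; the function name, interface, and the commit-after-L-steps policy are illustrative assumptions about what decoding under a latency budget looks like operationally.

```python
# Illustrative only: fixed-lag Viterbi decoding for a first-order HMM.
# This is NOT the paper's method or OSA; it shows what decoding "under a
# latency constraint L" means: the state for step t is emitted once step
# t + L has been observed.
def fixed_lag_viterbi(obs, log_pi, log_A, log_B, L):
    """Decode `obs` online, committing each state L steps after it occurs.

    obs    : list of observation indices
    log_pi : (S,) NumPy array of log initial-state probabilities
    log_A  : (S, S) NumPy array, log_A[i, j] = log p(state j | state i)
    log_B  : (S, V) NumPy array, log_B[s, o] = log p(obs o | state s)
    L      : latency budget (hypothetical parameter name)
    """
    delta = log_pi + log_B[:, obs[0]]      # best log-score ending in each state
    backptrs = []                          # backptrs[j]: best predecessor of step j+1
    out = []
    for t in range(1, len(obs)):
        scores = delta[:, None] + log_A    # scores[i, j]: come from i, land in j
        backptrs.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + log_B[:, obs[t]]
        if t >= L:                         # commit the decision that is L steps old
            s = int(delta.argmax())
            for psi in reversed(backptrs[t - L:]):
                s = int(psi[s])
            out.append(s)
    tail = [int(delta.argmax())]           # flush the last (up to) L positions
    for psi in reversed(backptrs[max(len(obs) - L, 0):]):
        tail.append(int(psi[tail[-1]]))
    out.extend(reversed(tail))
    return out
```

A larger L lets the decoder see more future context before committing, trading latency for accuracy; L greater than the sequence length recovers full offline Viterbi.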
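The Dataset Splits entry describes forming 5 random 80%/20% train/test partitions of a corpus. A minimal sketch of that procedure follows; the random seed and the sentence-level granularity are assumptions.

```python
# Sketch of the 5 random 80%/20% train/test partitions described above.
# The seed value is an assumption; the report does not state one.
import random

def make_partitions(sentences, n_partitions=5, train_frac=0.8, seed=0):
    rng = random.Random(seed)
    partitions = []
    for _ in range(n_partitions):
        idx = list(range(len(sentences)))
        rng.shuffle(idx)
        cut = int(train_frac * len(idx))
        partitions.append(([sentences[i] for i in idx[:cut]],   # 80% train
                           [sentences[i] for i in idx[cut:]]))  # 20% test
    return partitions
```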
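Finally, the Experiment Setup entry mentions sweeping the OSA hyperparameter λ to tune the latency L. A hedged sketch of such a grid search is below; `run_osa` and `score` are hypothetical stand-ins for the authors' pipeline, and the grid of consecutive powers of ten spanning 10⁻⁴ to 10⁴ is an assumption about the exact set.

```python
# Hypothetical grid search over λ for OSA latency tuning. `run_osa` and
# `score` stand in for the authors' pipeline; the exact λ grid is assumed.
def sweep_lambda(run_osa, score, data, exponents=range(-4, 5)):
    results = {}
    for k in exponents:
        lam = 10.0 ** k                        # larger λ penalizes latency [24]
        decoded, latency = run_osa(data, lam)  # decode under this λ
        results[lam] = (score(decoded), latency)
    return results
```

The sweep would be repeated under both tuning measures mentioned above (entropy and expected classification error), selecting the λ that best trades accuracy against latency.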