Near Optimal Reconstruction of Spherical Harmonic Expansions
Authors: Amir Zandieh, Insu Han, Haim Avron
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Section 5, Numerical Evaluation: Noise-free Setting. For a fixed q, we generate a random function f(σ) = Σ_{ℓ=0}^{q} c_ℓ P_ℓ^d(⟨σ, v⟩) where v ∼ U(S^{d−1}) and the c_ℓ's are i.i.d. samples from N(0, 1). Then, f is recovered by running Algorithm 1 with s random evaluations of f on S^{d−1}. We count the number of failures among 100 independent random trials with different choices of d ∈ {3, 4}, q ∈ {5, …, 22}, and s ∈ {40, …, 2400}. The empirical success probabilities for d = 3 and 4 are reported in Fig. 1(a) and Fig. 1(b), respectively. |
| Researcher Affiliation | Academia | Amir Zandieh Independent Researcher amir@zed512.gmail.com Insu Han Yale University insu.han@yale.edu Haim Avron Tel Aviv University haimav@tauex.tau.ac.il |
| Pseudocode | Yes | Algorithm 1 (Efficient Spherical Harmonic Expansion). 1: Input: accuracy parameter ε > 0, integer q ≥ 0. 2: Set s = c·(β_{q,d} log β_{q,d} + β_{q,d}/ε) for a sufficiently large fixed constant c. 3: Sample i.i.d. random points w_1, w_2, …, w_s from U(S^{d−1}). 4: Compute K ∈ ℝ^{s×s} with K_{i,j} = Σ_{ℓ=0}^{q} α_{ℓ,d}/(s·\|S^{d−1}\|) · P_ℓ^d(⟨w_i, w_j⟩) for i, j ∈ [s]. 5: Compute f ∈ ℝ^s with f_j = (1/√s)·f(w_j) for j ∈ [s]. 6: Solve the regression by computing z = K⁺f. 7: Return: y ∈ H^{(q)}(S^{d−1}) with y(σ) := Σ_{ℓ=0}^{q} α_{ℓ,d}/(√s·\|S^{d−1}\|) Σ_{j=1}^{s} z_j P_ℓ^d(⟨w_j, σ⟩) for σ ∈ S^{d−1}. |
| Open Source Code | No | The paper does not provide any explicit statements about open-source code availability or links to code repositories. |
| Open Datasets | No | The paper describes generating random functions and sampling points from them for experiments, rather than using a publicly available or open dataset with concrete access information: "For a fixed q, we generate a random function f(σ) = Σ_{ℓ=0}^{q} c_ℓ P_ℓ^d(⟨σ, v⟩) where v ∼ U(S^{d−1}) and the c_ℓ's are i.i.d. samples from N(0, 1)." |
| Dataset Splits | No | The paper does not explicitly provide details about training, validation, or test dataset splits (e.g., percentages, sample counts, or references to standard splits). It describes generating sampled points for evaluation. |
| Hardware Specification | No | The paper does not specify any particular hardware used for its experiments (e.g., specific GPU/CPU models or cloud instances). It only mentions runtime complexity in terms of 'ω < 2.3727 is the exponent of the fast matrix multiplication algorithm'. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | Yes | We count the number of failures among 100 independent random trials with different choices of d ∈ {3, 4}, q ∈ {5, …, 22}, and s ∈ {40, …, 2400}. The empirical success probabilities for d = 3 and 4 are reported in Fig. 1(a) and Fig. 1(b), respectively. |
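Since no open-source code accompanies the paper, the extracted Algorithm 1 and the noise-free experiment can be sketched in Python for the d = 3 case, where P_ℓ^d is the Legendre polynomial, α_{ℓ,3} = 2ℓ + 1, and \|S²\| = 4π. This is an illustrative reconstruction, not the authors' code: the function names are ours, the 1/√s scalings and the pseudoinverse in the regression step are our reading of the extracted pseudocode, and `np.linalg.pinv` stands in for whatever solver the authors used.

```python
import numpy as np
from numpy.polynomial import legendre

def sphere_samples(s, rng):
    """Draw s i.i.d. points from U(S^2) by normalizing Gaussian vectors."""
    w = rng.standard_normal((s, 3))
    return w / np.linalg.norm(w, axis=1, keepdims=True)

def legendre_kernel(A, B, q, scale):
    """Entrywise sum_{l=0}^{q} (2l+1)*scale * P_l(<a_i, b_j>)."""
    G = np.clip(A @ B.T, -1.0, 1.0)          # Gram matrix of inner products
    c = (2 * np.arange(q + 1) + 1) * scale   # Legendre series coefficients
    return legendre.legval(G, c)

def reconstruct(f, q, s, rng):
    """Algorithm 1 for d = 3: recover a degree-q expansion from s samples."""
    W = sphere_samples(s, rng)                           # step 3
    K = legendre_kernel(W, W, q, 1 / (s * 4 * np.pi))    # step 4
    fvec = f(W) / np.sqrt(s)                             # step 5
    z = np.linalg.pinv(K) @ fvec                         # step 6: z = K^+ f
    # step 7: y(sigma) = sum_l (2l+1)/(sqrt(s)*4pi) sum_j z_j P_l(<w_j, sigma>)
    def y(sigma):
        return legendre_kernel(sigma, W, q, 1 / (np.sqrt(s) * 4 * np.pi)) @ z
    return y

# Noise-free experiment from Section 5: f(sigma) = sum_l c_l P_l(<sigma, v>)
rng = np.random.default_rng(0)
q, s = 5, 600
v = sphere_samples(1, rng)[0]
coeffs = rng.standard_normal(q + 1)
f = lambda W: legendre.legval(np.clip(W @ v, -1.0, 1.0), coeffs)

y = reconstruct(f, q, s, rng)
fresh = sphere_samples(50, rng)           # held-out evaluation points
err = np.max(np.abs(y(fresh) - f(fresh)))
```

With s = 600 well above β_{q,3} = (q + 1)² = 36, recovery should be exact up to floating-point error; shrinking s toward β_{q,3} should start producing failures, mirroring the phase transition reported in Fig. 1(a).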