Bhattacharyya, A and Gadekar, A and Rajgopal, N (2018) Improved learning of k-parities. In: 24th International Conference on Computing and Combinatorics (COCOON 2018), 2-4 July 2018, Qingdao, China, pp. 542-553.
Abstract
We consider the problem of learning k-parities in the online mistake-bound model: given a hidden vector $x \in \{0,1\}^n$ whose Hamming weight is $k$, and a sequence of "questions" $a_1, a_2, \dots \in \{0,1\}^n$, to each of which the algorithm must reply with $\langle a_i, x \rangle \pmod{2}$, what is the best trade-off between the number of mistakes made by the algorithm and its time complexity? We improve the previous best result of Buhrman et al. [BGM10] by an $\exp(k)$ factor in the time complexity. Next, we consider the problem of learning k-parities in the PAC model in the presence of random classification noise of rate $\eta \in (0, 1/2)$. Here, we observe that even in the presence of classification noise of non-trivial rate, it is possible to learn k-parities in time better than $\binom{n}{k/2}$, whereas the current best algorithm for learning noisy k-parities, due to Grigorescu et al. [GRV11], inherently requires time $\binom{n}{k/2}$ even when the noise rate is polynomially small.
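To make the online protocol in the abstract concrete, here is a minimal, illustrative sketch (not the paper's algorithm) of the classical halving baseline for k-parities: it achieves the near-optimal mistake bound of roughly $\log_2 \binom{n}{k} \approx k \log(n/k)$, but enumerates all $\binom{n}{k}$ candidate supports, which is exactly the kind of time complexity that results such as [BGM10] and this paper trade off against. All function and variable names below are hypothetical.

```python
# Illustrative halving baseline for learning k-parities in the mistake-bound model.
# Assumption: the learner sees the true label after each prediction, as in the
# standard online mistake-bound setting described in the abstract.
import math
import random
from itertools import combinations
from collections import Counter

def parity(support, a):
    """Label of question a under the parity with the given support: <a, x> mod 2."""
    return sum(a[i] for i in support) % 2

def halving_learner(n, k, questions, true_support):
    """Keep every weight-k parity consistent with the answers so far and predict by
    majority vote. Makes at most log2(C(n, k)) mistakes, but uses ~C(n, k) time and
    space per round -- the brute-force end of the mistakes-vs-time trade-off."""
    version_space = list(combinations(range(n), k))  # all candidate supports
    mistakes = 0
    for a in questions:
        votes = Counter(parity(s, a) for s in version_space)
        prediction = max(votes, key=votes.get)        # majority prediction
        answer = parity(true_support, a)              # true label <a, x> mod 2
        if prediction != answer:
            mistakes += 1
        # Keep only candidates that agree with the revealed answer.
        version_space = [s for s in version_space if parity(s, a) == answer]
    return mistakes

if __name__ == "__main__":
    n, k, rounds = 12, 3, 200
    rng = random.Random(0)
    x_support = tuple(sorted(rng.sample(range(n), k)))  # hidden weight-k vector x
    qs = [[rng.randint(0, 1) for _ in range(n)] for _ in range(rounds)]
    m = halving_learner(n, k, qs, x_support)
    bound = math.log2(math.comb(n, k))
    print(f"mistakes: {m}  (halving mistake bound: log2 C({n},{k}) ~ {bound:.1f})")
```

On a mistaken round the majority of remaining candidates is eliminated, which is where the $\log_2 \binom{n}{k}$ mistake bound comes from; the paper's contribution is algorithms that avoid maintaining this exponentially large version space explicitly.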
Item Type: | Conference Paper |
---|---|
Publication: | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
Publisher: | Springer Verlag |
Additional Information: | The copyright for this article belongs to Springer Verlag. |
Keywords: | Combinatorial mathematics; Economic and social effects; Hamming weights; Hidden vectors; Mistake bound models; Noise rate; Non-trivial; Time complexity; Trade off; Learning algorithms |
Department/Centre: | Division of Electrical Sciences > Computer Science & Automation |
Date Deposited: | 13 Sep 2022 09:46 |
Last Modified: | 13 Sep 2022 09:56 |
URI: | https://eprints.iisc.ac.in/id/eprint/76043 |