PAC learning under helpful distributions

François Denis; Rémi Gilleron

RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications (2001)

  • Volume: 35, Issue: 2, pages 129-148
  • ISSN: 0988-3754

Abstract

A PAC teaching model, under helpful distributions, is proposed that introduces the classical ideas of teaching models within the PAC setting: a polynomial-sized teaching set is associated with each target concept; the criterion of success is PAC identification; an additional parameter, namely the inverse of the minimum probability assigned to any example in the teaching set, is associated with each distribution; and the running time of the learning algorithm takes this new parameter into account. An Occam's razor theorem and its converse are proved. Some classical classes of Boolean functions, such as decision lists and DNF and CNF formulas, are proved learnable in this model. Comparisons with other teaching models are made: learnability in the Goldman and Mathias model implies PAC learnability under helpful distributions. Note that decision lists and DNF formulas are not known to be learnable in the Goldman and Mathias model. A new simple PAC model, where “simple” refers to Kolmogorov complexity, is introduced. We show that most learnability results obtained within previously defined simple PAC models can be simply derived from more general results in our model.
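The abstract packs the whole model into one sentence, so the following minimal LaTeX sketch may help; it is our reading of the abstract, and the symbols T(c) (teaching set) and alpha_D (the additional parameter) are illustrative shorthand rather than the paper's own notation.

% Hedged sketch of the "helpful distribution" setting, reconstructed from
% the abstract alone; T(c) and \alpha_D are illustrative names, and the
% paper's formal definitions may differ in detail.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Fix a concept class $\mathcal{C}$ over an example space $X$ and associate
with each target concept $c \in \mathcal{C}$ a polynomial-sized teaching
set $T(c) \subseteq X$. A distribution $D$ on $X$ is \emph{helpful} for $c$
if $D(x) > 0$ for every $x \in T(c)$, and the additional parameter is
\[
  \alpha_D \;=\; \Bigl(\min_{x \in T(c)} D(x)\Bigr)^{-1},
\]
the inverse of the minimum probability assigned to any example in the
teaching set. The learner must, for every helpful $D$, output with
probability at least $1-\delta$ a hypothesis $h$ whose error satisfies
$D(h \,\triangle\, c) \le \varepsilon$, in time polynomial in
$1/\varepsilon$, $1/\delta$, the size of $c$, and $\alpha_D$.
\end{document}

Read this way, the connection to simple PAC models is natural: under a distribution that favors examples of low Kolmogorov complexity, as in Li and Vitányi [20], a teaching set consisting of simple examples keeps alpha_D polynomially bounded, which is one way learnability results from the earlier simple PAC models can drop out of the more general model.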

How to cite

Denis, François, and Gilleron, Rémi. "PAC learning under helpful distributions." RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications 35.2 (2001): 129-148. <http://eudml.org/doc/92658>.

@article{Denis2001,
abstract = {A PAC teaching model, under helpful distributions, is proposed that introduces the classical ideas of teaching models within the PAC setting: a polynomial-sized teaching set is associated with each target concept; the criterion of success is PAC identification; an additional parameter, namely the inverse of the minimum probability assigned to any example in the teaching set, is associated with each distribution; and the running time of the learning algorithm takes this new parameter into account. An Occam's razor theorem and its converse are proved. Some classical classes of Boolean functions, such as decision lists and DNF and CNF formulas, are proved learnable in this model. Comparisons with other teaching models are made: learnability in the Goldman and Mathias model implies PAC learnability under helpful distributions. Note that decision lists and DNF formulas are not known to be learnable in the Goldman and Mathias model. A new simple PAC model, where “simple” refers to Kolmogorov complexity, is introduced. We show that most learnability results obtained within previously defined simple PAC models can be simply derived from more general results in our model.},
author = {Denis, François and Gilleron, Rémi},
journal = {RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications},
keywords = {PAC learning; teaching model; Kolmogorov complexity; PAC teaching model; learnability},
language = {eng},
number = {2},
pages = {129-148},
publisher = {EDP-Sciences},
title = {PAC learning under helpful distributions},
url = {http://eudml.org/doc/92658},
volume = {35},
year = {2001},
}

TY - JOUR
AU - Denis, François
AU - Gilleron, Rémi
TI - PAC learning under helpful distributions
JO - RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications
PY - 2001
PB - EDP-Sciences
VL - 35
IS - 2
SP - 129
EP - 148
AB - A PAC teaching model, under helpful distributions, is proposed that introduces the classical ideas of teaching models within the PAC setting: a polynomial-sized teaching set is associated with each target concept; the criterion of success is PAC identification; an additional parameter, namely the inverse of the minimum probability assigned to any example in the teaching set, is associated with each distribution; and the running time of the learning algorithm takes this new parameter into account. An Occam's razor theorem and its converse are proved. Some classical classes of Boolean functions, such as decision lists and DNF and CNF formulas, are proved learnable in this model. Comparisons with other teaching models are made: learnability in the Goldman and Mathias model implies PAC learnability under helpful distributions. Note that decision lists and DNF formulas are not known to be learnable in the Goldman and Mathias model. A new simple PAC model, where “simple” refers to Kolmogorov complexity, is introduced. We show that most learnability results obtained within previously defined simple PAC models can be simply derived from more general results in our model.
LA - eng
KW - PAC learning; teaching model; Kolmogorov complexity; PAC teaching model; learnability
UR - http://eudml.org/doc/92658
ER -

References

  [1] D. Angluin, Learning Regular Sets from Queries and Counterexamples. Inform. and Comput. 75 (1987) 87-106. Zbl 0636.68112; MR 916360.
  [2] D. Angluin, Queries and Concept Learning. Machine Learning 2 (1988) 319-342.
  [3] G.M. Benedek and A. Itai, Nonuniform Learnability, in ICALP (1988) 82-92. Zbl 0649.68080; MR 1023628.
  [4] A. Blumer, A. Ehrenfeucht, D. Haussler and M.K. Warmuth, Occam’s Razor. Inform. Process. Lett. 24 (1987) 377-380. Zbl 0653.68084.
  [5] R. Board and L. Pitt, On the Necessity of Occam Algorithms. Theoret. Comput. Sci. 100 (1992) 157-184. Zbl 0825.68544; MR 1171438.
  [6] N.H. Bshouty, Exact Learning Boolean Functions via the Monotone Theory. Inform. and Comput. 123 (1995) 146-153. Zbl 1096.68634; MR 1358974.
  [7] J. Castro and J.L. Balcázar, Simple PAC learning of simple decision lists, in ALT 95, 6th International Workshop on Algorithmic Learning Theory. Springer, Lecture Notes in Comput. Sci. 997 (1995) 239-250.
  [8] J. Castro and D. Guijarro, PACS, simple-PAC and query learning. Inform. Process. Lett. 73 (2000) 11-16. MR 1741500.
  [9] F. Denis, Learning regular languages from simple positive examples. Machine Learning (to appear); Technical Report LIFL 321, 1998; http://www.lifl.fr/denis. Zbl 0983.68104.
  [10] F. Denis, C. D’Halluin and R. Gilleron, PAC Learning with Simple Examples, in 13th Annual Symposium on Theoretical Aspects of Computer Science. Springer-Verlag, Lecture Notes in Comput. Sci. 1046 (1996) 231-242.
  [11] F. Denis and R. Gilleron, PAC learning under helpful distributions, in Proc. of the 8th International Workshop on Algorithmic Learning Theory (ALT-97), edited by M. Li and A. Maruoka. Springer-Verlag, Berlin, Lecture Notes in Comput. Sci. 1316 (1997) 132-145. Zbl 0887.68084; MR 1707552.
  [12] E.M. Gold, Complexity of Automaton Identification from Given Data. Inform. and Control 37 (1978) 302-320. Zbl 0376.68041; MR 495194.
  [13] S.A. Goldman and M.J. Kearns, On the Complexity of Teaching. J. Comput. System Sci. 50 (1995) 20-31. Zbl 0939.68770; MR 1322630.
  [14] S.A. Goldman and H.D. Mathias, Teaching a Smarter Learner. J. Comput. System Sci. 52 (1996) 255-267. Zbl 1152.68451; MR 1393993.
  [15] T. Hancock, T. Jiang, M. Li and J. Tromp, Lower Bounds on Learning Decision Lists and Trees. Inform. and Comput. 126 (1996) 114-122. Zbl 0856.68121; MR 1391107.
  [16] D. Haussler, M. Kearns, N. Littlestone and M.K. Warmuth, Equivalence of Models for Polynomial Learnability. Inform. and Comput. 95 (1991) 129-161. Zbl 0743.68115; MR 1138115.
  [17] C. de la Higuera, Characteristic Sets for Polynomial Grammatical Inference. Machine Learning 27 (1997) 125-137. Zbl 0884.68107.
  [18] M. Kearns, M. Li, L. Pitt and L.G. Valiant, Recent Results on Boolean Concept Learning, in Proc. of the Fourth International Workshop on Machine Learning (1987) 337-352.
  [19] M.J. Kearns and U.V. Vazirani, An Introduction to Computational Learning Theory. MIT Press (1994). MR 1331838.
  [20] M. Li and P.M.B. Vitányi, Learning simple concepts under simple distributions. SIAM J. Comput. 20 (1991) 911-935. Zbl 0751.68055; MR 1115658.
  [21] M. Li and P. Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, 2nd Edition. Springer-Verlag (1997). Zbl 0866.68051; MR 1438307.
  [22] H.D. Mathias, DNF: If You Can’t Learn ’em, Teach ’em: An Interactive Model of Teaching, in Proc. of the 8th Annual Conference on Computational Learning Theory (COLT’95). ACM Press, New York (1995) 222-229.
  [23] B.K. Natarajan, Machine Learning: A Theoretical Approach. Morgan Kaufmann, San Mateo, CA (1991). MR 1137519.
  [24] B.K. Natarajan, On Learning Boolean Functions, in Proc. of the 19th Annual ACM Symposium on Theory of Computing. ACM Press (1987) 296-304.
  [25] J. Oncina and P. Garcia, Inferring regular languages in polynomial update time, in Pattern Recognition and Image Analysis (1992) 49-61.
  [26] R. Parekh and V. Honavar, On the Relationships between Models of Learning in Helpful Environments, in Proc. of the Fifth International Conference on Grammatical Inference (2000). Zbl 0974.68165.
  [27] R. Parekh and V. Honavar, Learning DFA from simple examples, in Proc. of the 8th International Workshop on Algorithmic Learning Theory (ALT-97), edited by M. Li and A. Maruoka. Springer, Berlin, Lecture Notes in Artificial Intelligence 1316 (1997) 116-131. Zbl 0887.68083; MR 1707551.
  [28] R. Parekh and V. Honavar, Simple DFA are polynomially probably exactly learnable from simple examples, in Proc. of the 16th International Conference on Machine Learning (1999) 298-306.
  [29] R.L. Rivest, Learning Decision Lists. Machine Learning 2 (1987) 229-246.
  [30] K. Romanik, Approximate Testing and Learnability, in Proc. of the 5th Annual ACM Workshop on Computational Learning Theory, edited by D. Haussler. ACM Press, Pittsburgh, PA (1992) 327-332.
  [31] S. Salzberg, A. Delcher, D. Heath and S. Kasif, Learning with a Helpful Teacher, in Proc. of the 12th International Joint Conference on Artificial Intelligence, edited by J. Mylopoulos and R. Reiter. Morgan Kaufmann, Sydney, Australia (1991) 705-711. Zbl 0748.68065.
  [32] R.E. Schapire, The Strength of Weak Learnability. Machine Learning 5 (1990) 197-227. Zbl 0747.68058.
  [33] A. Shinohara and S. Miyano, Teachability in Computational Learning. New Generation Computing 8 (1991). Zbl 0712.68084.
  [34] L.G. Valiant, A Theory of the Learnable. Commun. ACM 27 (1984) 1134-1142. Zbl 0587.68077.
