On the finite convergence of the NN classification learning on mistakes

Bibliographic Details
Date: 2022
Author: Norkin, V.I.
Format: Article
Language: English
Published: Видавничий дім "Академперіодика" НАН України (Akademperiodyka Publishing House of the NAS of Ukraine), 2022
Journal: Доповіді НАН України
Online access: http://dspace.nbuv.gov.ua/handle/123456789/184927
Cite as: On the finite convergence of the NN classification learning on mistakes / V.I. Norkin // Доповіді Національної академії наук України. — 2022. — No. 1. — P. 34-38. — Bibliogr.: 10 titles. — In English.

Repositories

Digital Library of Periodicals of National Academy of Sciences of Ukraine
Description
Abstract: The paper establishes an analog of the well-known Novikoff theorem on the finite convergence of the perceptron learning algorithm for linearly separable classes. We obtain a similar result for the nearest neighbor classification algorithm in a general metric space in the case of non-intersecting compact classes. The learning process consists of gradual modification of the algorithm whenever it misclassifies an example, and it is studied in a deterministic setting. Classes are understood as compact sets in a complete metric space, and class separation is defined as the non-intersection of these compacts. The number of learning steps is bounded by the number of elements in an ε-net for the considered classes.
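
This record does not spell out the paper's exact update rule, only that the nearest neighbor classifier is modified on each misclassification. The sketch below is a hypothetical illustration of such mistake-driven 1-NN learning, assuming the common variant in which every misclassified example is stored as a new prototype; the class name MistakeDrivenNN, the Euclidean metric, and the toy data stream are illustrative assumptions, not the paper's construction.

```python
import math

def euclidean(a, b):
    # Metric on the feature space; the paper works in a general metric space,
    # so any metric could be substituted here.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class MistakeDrivenNN:
    """1-NN classifier trained on mistakes: a prototype is stored only when
    the current prototype set misclassifies an example (hypothetical
    reconstruction of the learning rule described in the abstract)."""

    def __init__(self, metric=euclidean):
        self.metric = metric
        self.prototypes = []  # list of (point, label) pairs

    def predict(self, x):
        if not self.prototypes:
            return None  # no information yet, so the first example is a forced mistake
        return min(self.prototypes, key=lambda p: self.metric(x, p[0]))[1]

    def fit_online(self, stream):
        # Process (point, label) pairs one by one, modifying the classifier
        # only on mistakes. Under the abstract's assumptions (non-intersecting
        # compact classes), the mistake count is bounded by the size of an
        # eps-net of the classes, so learning terminates in finitely many steps.
        mistakes = 0
        for x, y in stream:
            if self.predict(x) != y:
                self.prototypes.append((x, y))  # store the misclassified point
                mistakes += 1
        return mistakes

# Toy usage on two well-separated clusters:
stream = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
          ((3.0, 3.0), "B"), ((2.9, 3.1), "B")]
clf = MistakeDrivenNN()
print(clf.fit_online(stream))    # 2: one forced mistake per class
print(clf.predict((0.05, 0.1)))  # "A"
```

In this sketch the prototype set plays the role of the gradually modified algorithm: correct predictions leave it unchanged, mirroring the perceptron analogy drawn in the abstract.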