On the finite convergence of the NN classification learning on mistakes

Bibliographic Details
Date: 2022
Main Author: Norkin, V.I.
Format: Article
Language: English
Published: Publishing House "Akademperiodyka" of the NAS of Ukraine, 2022
Series: Доповіді НАН України (Reports of the NAS of Ukraine)
Online Access:http://dspace.nbuv.gov.ua/handle/123456789/184927
Cite this: On the finite convergence of the NN classification learning on mistakes / V.I. Norkin // Доповіді Національної академії наук України. — 2022. — № 1. — P. 34-38. — Bibliogr.: 10 titles. — In English.

Institution

Digital Library of Periodicals of National Academy of Sciences of Ukraine
Description
Summary: The paper establishes an analog of the well-known Novikoff theorem on the finite convergence of the perceptron learning algorithm for linearly separable classes. We obtain a similar result for the nearest neighbor classification algorithm in the case of non-intersecting compact classes in a general metric space. The learning process consists of gradually modifying the algorithm whenever it misclassifies an example, and it is studied in a deterministic setting. Classes are understood as compact sets in a complete metric space, and class separation is defined as the non-intersection of these compacts. The number of learning steps is bounded by the number of elements in some ε-net for the considered classes.
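
The summary describes the learning mechanism only in words: the nearest-neighbor rule is modified each time it makes a mistake. The following is a minimal Python sketch of how such mistake-driven learning could be organized, under assumptions of my own (points as numeric tuples, the Euclidean metric standing in for the paper's general metric, and the hypothetical names nn_classify and learn_on_mistakes), not the paper's actual construction:

    import math

    def nn_classify(prototypes, x):
        # Label of the stored prototype nearest to x. Euclidean distance is
        # used here; the paper works in an arbitrary complete metric space.
        _, label = min(prototypes, key=lambda p: math.dist(p[0], x))
        return label

    def learn_on_mistakes(stream, seed):
        # Mistake-driven learning: a sample joins the prototype set only
        # when the current nearest-neighbor rule misclassifies it.
        prototypes = list(seed)                  # one labeled point per class
        for x, y in stream:
            if nn_classify(prototypes, x) != y:  # misclassification occurred
                prototypes.append((x, y))        # modify the classifier
        return prototypes

    # Usage: two non-intersecting classes on the real line (1-D tuples).
    seed = [((1.0,), "A"), ((3.0,), "B")]
    stream = [((0.9,), "A"), ((3.1,), "B"), ((1.4,), "A"), ((2.6,), "B")]
    model = learn_on_mistakes(stream, seed)

Under this reading, every mistake forces a new prototype into a region the current prototype set does not yet cover, which is roughly why the number of learning steps can be bounded by the size of an ε-net when the compact classes do not intersect, as the summary states.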