A k-Nearest Neighbor Based Multi-Instance Multi-Label Learning Algorithm. Kobe Chai, email@example.com. Data Mining, Academic College of Engineering, Jerusalem.
Multi-label learning. What are multi-label objects? Example: a natural scene image (labels: Mountains, Trees, Lake).
A k-Nearest Neighbor Based Multi-Instance Multi-Label Learning Algorithm
example: natural scene image
(a) Traditional supervised learning
(b) Multi-instance learning
(c) Multi-label learning
(d) Multi-instance multi-label learning
In multi-instance multi-label learning (MIML), each example is not only represented by multiple instances but also associated with multiple labels.
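This bag-of-instances-with-multiple-labels representation can be sketched as follows (the class and field names are illustrative, not from the paper):

```python
import numpy as np

# A minimal sketch of a MIML example: a "bag" of instance feature
# vectors paired with a set of labels.
class MIMLExample:
    def __init__(self, instances, labels):
        self.instances = np.asarray(instances, dtype=float)  # shape: (n_instances, n_features)
        self.labels = set(labels)                            # multiple labels per example

# A natural scene image might yield one instance per image region,
# while the image as a whole carries several scene labels at once.
scene = MIMLExample(
    instances=[[0.2, 0.8], [0.5, 0.1], [0.9, 0.4]],
    labels={"mountains", "trees", "lake"},
)
print(len(scene.instances), sorted(scene.labels))
```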
Example: a movie scene with multiple candidate labels: a character's name, Sport? Fans? Football? Training?
The degeneration process may lose useful information encoded in the training examples and therefore harm the learning algorithm's performance.
Information loss during the degeneration process!
Example: an instance of a car that the MIML algorithm cannot recognize as this specific car, because of the change shown below.
MIML-kNN will consider the citer (the car with the blue door) and will learn the new car.
MIML-kNN is proposed for MIML by utilizing the popular k-nearest neighbor technique.
Given a test example, MIML-kNN not only considers its neighbors, but also considers its citers which regard it as their own neighbors.
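The neighbor/citer distinction can be sketched as follows. The max Hausdorff distance is used here as the bag-level metric since it is a common choice for comparing bags of instances; the paper's exact metric may differ:

```python
import numpy as np

def hausdorff(bag_a, bag_b):
    # Max Hausdorff distance between two bags of instances
    # (a common bag-level metric; a sketch, not necessarily the paper's choice).
    d = np.linalg.norm(bag_a[:, None, :] - bag_b[None, :, :], axis=2)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def neighbors_and_citers(bags, idx, k):
    # Neighbors: the k bags closest to bags[idx].
    # Citers: the bags that count bags[idx] among *their* k nearest neighbors.
    n = len(bags)
    dist = np.array([[hausdorff(bags[i], bags[j]) for j in range(n)]
                     for i in range(n)])
    knn = [set(np.argsort(dist[i])[1:k + 1]) for i in range(n)]  # rank 0 is self
    neighbors = knn[idx]
    citers = {i for i in range(n) if i != idx and idx in knn[i]}
    return neighbors, citers

# Three single-instance bags on a line: bag 1's only neighbor is bag 0,
# yet both bag 0 and bag 2 cite bag 1 as their nearest neighbor.
bags = [np.array([[0.0, 0.0]]), np.array([[1.0, 0.0]]), np.array([[10.0, 0.0]])]
nbrs, cits = neighbors_and_citers(bags, 1, 1)
```

This illustrates why citers add information: an example's citers are not necessarily its neighbors, so MIML-kNN sees a wider context than the plain k-NN set.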
How does it work?
Learns one binary classifier for each label
Outputs the union of their predictions
Can do ranking if classifier outputs scores
Limitation: does not consider label relationships
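The one-classifier-per-label scheme above (binary relevance) can be sketched with hypothetical per-label scorers standing in for the trained base classifiers:

```python
# A minimal sketch of "one binary classifier per label" (binary relevance).
# Each label's classifier scores the input independently; the prediction is
# the union of labels whose score clears the threshold, and the raw scores
# also induce a ranking over labels.
def predict_labels(scorers, x, threshold=0.0):
    scores = {label: f(x) for label, f in scorers.items()}
    predicted = {label for label, s in scores.items() if s > threshold}
    ranking = sorted(scores, key=scores.get, reverse=True)  # most relevant first
    return predicted, ranking

# Hypothetical linear scorers, one per label (illustrative only).
scorers = {
    "desert":    lambda x: x[0] - 0.5,
    "mountains": lambda x: x[1] - 0.5,
    "trees":     lambda x: x[0] + x[1] - 2.0,
}
pred, rank = predict_labels(scorers, [0.9, 0.7])
```

Because each label is scored independently, correlations between labels (e.g. "trees" co-occurring with "mountains") are ignored, which is exactly the limitation noted above.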
The performance of MIML-kNN is compared with MIMLBOOST
and MIMLSVM on two real-world MIML tasks.
The scene classification data contains 2,000 natural scene images.
All the possible class labels are desert, mountains, sea, sunset and trees.
The average number of labels per image is 1.24 ± 0.44.