Nông Thị Hoa và Đtg Tạp chí KHOA HỌC & CÔNG NGHỆ 113(13): 61 - 65
AN IMPROVED LEARNING ALGORITHM OF BAM
Nong Thi Hoa1,*, Bui The Duy2
1College of Information Technology and Communication – TNU
2Human Machine Interaction Laboratory – Vietnam National University, Hanoi
SUMMARY
Artificial neural networks, characterized by massive parallelism, robustness, and learning capacity,
have many applications in various fields. Bidirectional Associative Memory (BAM) is a neural
network that is extended from Hopfield networks to make a two-way associative search for a
pattern pair. The most important advantage of BAM is recalling stored patterns from noisy inputs.
The learning process of previous BAMs, however, is not flexible. Moreover, orthogonal
patterns are recalled better than other patterns, so some important patterns cannot be
recalled. In this paper, we propose a learning algorithm for BAM that learns from
training data more flexibly and improves the recall of non-orthogonal patterns. In our
learning algorithm, associations of patterns are updated flexibly over a few iterations
by modifying parameters after each iteration. Moreover, the proposed learning algorithm
ensures that all patterns are recalled similarly well, which is enforced by the stopping
condition of the learning process. We have conducted experiments on five datasets to
demonstrate the effectiveness of BAM with the proposed learning algorithm (FBAM -
Flexible BAM). Experimental results show that FBAM recalls better than other BAMs in
auto-association mode.
Keywords: Bidirectional Associative Memory, Associative Memory, Learning Algorithm, Noise
Tolerance, Pattern Recognition.
INTRODUCTION
Artificial neural networks, characterized by
massive parallelism, robustness, and learning
capability, effectively solve many problems
such as pattern recognition, controller
design, and data clustering. BAM [1] is
built from two Hopfield neural networks
to perform a two-way associative search
over pattern pairs. An important advantage
of BAM is recalling stored patterns from
noisy or partial inputs. Moreover, BAM
possesses two attributes that set it apart
from other neural networks. First, BAM is
unconditionally stable. Second, BAM
converges to a stable state in synchronous
mode. Therefore, BAM is easy to apply in
real applications.
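To make the two-way associative search concrete, here is a minimal sketch of Kosko's original BAM (the baseline model, not the FBAM algorithm proposed in this paper; the patterns are illustrative values chosen for the example):

```python
import numpy as np

# Patterns are bipolar (+1/-1); the weight matrix is the Hebbian
# sum of outer products over all stored pattern pairs.
X = np.array([[1, 1, 1, -1, -1, -1],
              [1, 1, -1, -1, 1, 1]])    # input-layer patterns
Y = np.array([[1, -1, 1, -1],
              [-1, 1, 1, -1]])          # associated output patterns

W = X.T @ Y                             # W = sum_i x_i^T y_i

def bsign(v):
    # Bipolar threshold; ties at zero are broken toward +1 for simplicity.
    return np.where(v >= 0, 1, -1)

def recall(x, W, iters=5):
    """Two-way associative search: alternate x -> y -> x until stable."""
    for _ in range(iters):
        y = bsign(x @ W)
        x = bsign(y @ W.T)
    return x, y

# One flipped bit in the first stored pattern is corrected.
noisy = np.array([1, 1, 1, -1, -1, 1])
x, y = recall(noisy, W)   # x -> [1,1,1,-1,-1,-1], y -> [1,-1,1,-1]
```

The alternating update is what makes the search bidirectional: the noisy input first retrieves its associated output pattern, which in turn retrieves the cleaned-up input pattern.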
Studies on BAM models fall into two
categories: BAMs without iterative
learning and BAMs with iterative learning
(BAMs with a multiple-training strategy).
BAMs with iterative learning recall more
effectively than BAMs without iterative
learning. Iterative learning in BAMs takes
two forms. The first uses the minimum
number of training times for pairs of
patterns (MNTP). The BAMs in [2, 3, 4]
employed a multiple-training strategy that
guaranteed perfect recall of orthogonal
patterns. However, their learning process is
not flexible because the MNTP is fixed. The
second form learns pairs of patterns over
many iterations. These BAMs learned pairs
of patterns sequentially over many iterations
to guarantee the perfect recall of orthogonal
patterns [5, 6, 7, 8]. Additionally, new
association weights depend directly on old
weights, so modifying the weights takes a
long time when the old weights are far from
their desired values. In short, previous
BAMs recall non-orthogonal patterns poorly
and learn inflexibly. In this paper, we propose an
iterative learning algorithm of BAM, which
learns more flexibly and improves the
recall of non-orthogonal patterns.
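The general shape of such iterative learning can be pictured with a simplified scheme (a generic stand-in for illustration, not the authors' FBAM rule): pairs that the current weight matrix fails to recall get their outer product reinforced, and training stops once every pair is recalled.

```python
import numpy as np

def bsign(v):
    # Bipolar threshold; ties at zero broken toward +1.
    return np.where(v >= 0, 1, -1)

def train_iteratively(X, Y, max_iters=100):
    W = X.T @ Y                      # start from the Hebbian matrix
    for _ in range(max_iters):
        bad = [i for i in range(len(X))
               if not np.array_equal(bsign(X[i] @ W), Y[i])]
        if not bad:                  # stop condition: every pair recalled
            break
        for i in bad:                # reinforce only the failing pairs
            W += np.outer(X[i], Y[i])
    return W

# Illustrative patterns: the third pair is correlated with the first,
# so the plain Hebbian matrix fails on it until reinforcement fixes it.
X = np.array([[1, 1, 1, -1, -1, -1],
              [1, 1, -1, -1, 1, 1],
              [1, 1, 1, -1, -1, 1]])
Y = np.array([[1, -1, 1, -1],
              [-1, 1, 1, -1],
              [1, 1, -1, 1]])

W = train_iteratively(X, Y)
ok = all(np.array_equal(bsign(X[i] @ W), Y[i]) for i in range(3))
```

Repeating a pair's outer product is equivalent to raising its training count, which is the intuition behind the MNTP-style multiple-training strategies discussed above.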
We use the MNTP to realize the multiple-training
strategy. In the proposed learning rule,