The fundamentals of an interface for including prior knowledge in learning problems
are presented.
This is done in the framework of regularization theory,
working not with a restricted parameterization
but directly with the function values themselves.
Technically,
individual components of prior knowledge are represented
by so-called quadratic prior concepts.
The error functional to be minimized is
obtained by combining prior concepts
through probabilistic or fuzzy-logical operations.
Commonly used regularization approaches correspond
to a probabilistic implementation of AND, yielding convex error surfaces.
The paper presents an approach
that goes beyond such classical regularization functionals
by including OR-like combinations of prior concepts.
Because these result in non-convex error functionals,
non-linear stationarity equations have to be solved.
A great variety of corresponding learning algorithms
can be constructed.
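The contrast between the two kinds of combination can be illustrated with a minimal numerical sketch for a single function value. The two quadratic concepts with templates at -2 and 2, and the finite-difference convexity probe, are illustrative assumptions, not details taken from the paper: a probabilistic AND multiplies the likelihoods exp(-E_i), giving a sum of quadratics (convex), while a probabilistic OR adds them, giving a generally non-convex functional with competing minima.

```python
import numpy as np

# Two quadratic "prior concepts" for one function value phi,
# E_i(phi) = 0.5 * (phi - t_i)**2, with hypothetical templates t1, t2.
t1, t2 = -2.0, 2.0

def concept(phi, t):
    return 0.5 * (phi - t) ** 2

# Probabilistic AND: product of likelihoods exp(-E_i)
# -> sum of quadratics, the classical convex regularization functional.
def e_and(phi):
    return concept(phi, t1) + concept(phi, t2)

# Probabilistic OR: sum of likelihoods
# -> -log(exp(-E1) + exp(-E2)), non-convex for well-separated templates.
def e_or(phi):
    return -np.log(np.exp(-concept(phi, t1)) + np.exp(-concept(phi, t2)))

def second_diff(f, x, h=1e-3):
    """Central second difference as a numerical convexity probe."""
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

phis = np.linspace(-3.0, 3.0, 601)
print(all(second_diff(e_and, p) > 0 for p in phis))  # AND: convex everywhere
print(any(second_diff(e_or, p) < 0 for p in phis))   # OR: concave region exists
```

Between the two templates the OR functional bends downward (near phi = 0 its curvature is negative), so gradient-based minimization can get trapped in either of the two local minima; this is the situation that motivates the annealing-type algorithms mentioned in the keywords.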
The paper concentrates on the fundamental definitions.
Keywords:
prior information, regularization,
annealing methods,
Landau-Ginzburg model.
Joerg Lemm
2000-09-22