Overview of bibliographic record no. 1252296
Implementing monotonic constrained neural network layers using complementary activation functions
(2023)
CROSBI ID: 1252296
Title
Implementing monotonic constrained neural network layers using complementary activation functions
Authors
Runje, Davor ; Shankaranarayana, Sharath Makki
Patent number
US 11,551,063
Year
2023
Patent date
January 10, 2023
Assignee
AIRT Technologies d.o.o., Zagreb, Croatia
Abstract
A facility for generating monotonic fully connected layer blocks for a machine learning model is described. The facility receives an indication of a convex constituent monotonically increasing activation function and a concave constituent monotonically increasing activation function for a monotonic layer. The facility generates a composite monotonic activation function made up of the convex and concave constituent activation functions. The facility receives an indication of a monotonicity indicator vector for the monotonic dense layer block. The facility determines one or more selector weights for the composite activation function. The facility initializes a sign for each weight of one or more kernel weights included in the monotonic layer and initializes a bias vector. The facility generates the monotonic dense layer block based on the composite activation function, the monotonicity indicator vector, the selector weights, the sign for each kernel weight, and the bias vector.
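The following is a minimal sketch, in NumPy, of the kind of construction the abstract describes: kernel weight signs are constrained according to a monotonicity indicator vector, and output units are routed through a convex constituent activation, its derived concave counterpart, or a combination of the two according to selector weights. The function and argument names (make_monotonic_dense_block, selector_weights as output-unit fractions), the ReLU default, and the averaged combined activation are illustrative assumptions, not the patented formulation.

```python
import numpy as np

def relu(x):
    # Assumed convex, monotonically increasing constituent activation.
    return np.maximum(x, 0.0)

def make_monotonic_dense_block(kernel, bias, monotonicity_indicator,
                               convex_activation=relu,
                               selector_weights=(0.5, 0.25)):
    """Build a monotonic dense layer block (illustrative sketch).

    kernel                 : (n_in, n_out) unconstrained weight matrix
    bias                   : (n_out,) bias vector
    monotonicity_indicator : (n_in,) entries in {-1, 0, 1}
    selector_weights       : hypothetical fractions of output units routed to
                             the convex and concave constituents; the rest use
                             a combined activation
    """
    t = np.asarray(monotonicity_indicator).reshape(-1, 1)

    # Enforce a sign on each kernel weight:
    #   t =  1 -> non-negative weight (output increases with this input)
    #   t = -1 -> non-positive weight (output decreases with this input)
    #   t =  0 -> weight left unconstrained
    w = np.where(t == 1, np.abs(kernel),
        np.where(t == -1, -np.abs(kernel), kernel))

    convex = convex_activation
    concave = lambda x: -convex_activation(-x)   # concave counterpart

    n_out = kernel.shape[1]
    n_convex = int(round(selector_weights[0] * n_out))
    n_concave = min(int(round(selector_weights[1] * n_out)), n_out - n_convex)

    def block(x):
        z = x @ w + bias
        out = np.empty_like(z)
        out[:, :n_convex] = convex(z[:, :n_convex])
        out[:, n_convex:n_convex + n_concave] = concave(
            z[:, n_convex:n_convex + n_concave])
        # Remaining units use a combined activation; a simple average of the
        # two constituents is used here purely for illustration.
        rest = z[:, n_convex + n_concave:]
        out[:, n_convex + n_concave:] = 0.5 * (convex(rest) + concave(rest))
        return out

    return block

# Example use (hypothetical shapes): monotonically increasing in input 0,
# decreasing in input 1, unconstrained in input 2.
rng = np.random.default_rng(0)
layer = make_monotonic_dense_block(
    kernel=rng.normal(size=(3, 8)),
    bias=np.zeros(8),
    monotonicity_indicator=[1, -1, 0],
)
y = layer(rng.normal(size=(4, 3)))   # (4, 8) output
```

Because every constituent activation is monotonically increasing and the weight signs are fixed by the indicator vector, the resulting block is monotone in the constrained inputs by construction; stacking such blocks preserves the property.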
Original language
English
Scientific fields
Computer science, Information and communication sciences