Activation Functions
- riscv_nmsis_nn_status riscv_nn_activation_s16(const int16_t *input, int16_t *output, const int32_t size, const int32_t left_shift, const riscv_nn_activation_type type)
- void riscv_nn_activations_direct_q15(q15_t *data, uint16_t size, uint16_t int_width, riscv_nn_activation_type type)
- void riscv_nn_activations_direct_q7(q7_t *data, uint16_t size, uint16_t int_width, riscv_nn_activation_type type)
- void riscv_relu6_s8(int8_t *data, uint16_t size)
- void riscv_relu_q15(int16_t *data, uint16_t size)
- void riscv_relu_q7(int8_t *data, uint16_t size)
- group Acti
Perform activation layers, including ReLU (Rectified Linear Unit), sigmoid and tanh.
Functions
-
riscv_nmsis_nn_status riscv_nn_activation_s16(const int16_t *input, int16_t *output, const int32_t size, const int32_t left_shift, const riscv_nn_activation_type type)
s16 neural network activation function using direct table look-up.
Supported framework: TensorFlow Lite for Microcontrollers. This activation function must be bit-exact with the corresponding TFLM tanh and sigmoid activation functions.
- Parameters
input – [in] pointer to input data
output – [out] pointer to output
size – [in] number of elements
left_shift – [in] bit-width of the integer part, assumed to be smaller than 3.
type – [in] type of activation functions
- Returns
The function returns RISCV_NMSIS_NN_SUCCESS.
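A minimal usage sketch follows. The enum value RISCV_SIGMOID and the header name riscv_nnfunctions.h are assumptions (by analogy with CMSIS-NN's ARM_SIGMOID and arm_nnfunctions.h); verify both against your NMSIS-NN headers.

```c
#include <stdint.h>
#include "riscv_nnfunctions.h" /* assumed NMSIS-NN public header */

/* Apply an int16 sigmoid to a small buffer.
 * RISCV_SIGMOID is an assumed enum value (cf. CMSIS-NN's ARM_SIGMOID). */
void run_sigmoid_s16(void)
{
    const int16_t input[4] = {-16384, -1024, 1024, 16384};
    int16_t output[4];

    riscv_nmsis_nn_status status = riscv_nn_activation_s16(
        input, output,
        4,              /* size: number of elements              */
        2,              /* left_shift: integer bits, must be < 3 */
        RISCV_SIGMOID); /* or the tanh enum value for tanh       */

    (void)status; /* RISCV_NMSIS_NN_SUCCESS on success */
}
```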
-
void riscv_nn_activations_direct_q15(q15_t *data, uint16_t size, uint16_t int_width, riscv_nn_activation_type type)
Q15 neural network activation function using direct table look-up.
Note
Refer to the header file for details.
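A minimal in-place usage sketch, assuming a RISCV_TANH enum value (mirroring CMSIS-NN's ARM_TANH) and the riscv_nnfunctions.h header:

```c
#include <stdint.h>
#include "riscv_nnfunctions.h" /* assumed header; provides q15_t */

/* In-place q15 tanh over an activation buffer.
 * RISCV_TANH is an assumed enum value. */
void run_tanh_q15(void)
{
    q15_t data[6] = {-32768, -8192, -256, 256, 8192, 32767};

    riscv_nn_activations_direct_q15(
        data,
        6,           /* size: number of elements                */
        3,           /* int_width: integer bits of the Q format */
        RISCV_TANH);
}
```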
-
void riscv_nn_activations_direct_q7(q7_t *data, uint16_t size, uint16_t int_width, riscv_nn_activation_type type)
Q7 neural network activation function using direct table look-up.
This is the direct table look-up approach.
The integer part of the fixed-point representation is assumed to be <= 3. A wider integer part adds nothing: after saturation, any of these activation functions produces the same result.
- Parameters
data – [inout] pointer to input
size – [in] number of elements
int_width – [in] bit-width of the integer part, assumed to be smaller than 3.
type – [in] type of activation functions
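For illustration, a hedged sketch of calling the q7 variant: with int_width = 0 the buffer is interpreted as Q0.7 (values in [-1, 1)), while int_width = 3 would mean Q3.4. RISCV_SIGMOID is again an assumed enum name.

```c
#include <stdint.h>
#include "riscv_nnfunctions.h" /* assumed header; provides q7_t */

/* q7 sigmoid applied in place; int_width = 0 treats the buffer as Q0.7. */
void run_sigmoid_q7(void)
{
    q7_t data[4] = {-128, -32, 32, 127};

    riscv_nn_activations_direct_q7(data, 4, 0 /* int_width */, RISCV_SIGMOID);
}
```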
-
void riscv_relu6_s8(int8_t *data, uint16_t size)
s8 ReLU6 function.
- Parameters
data – [inout] pointer to input
size – [in] number of elements
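A reference-semantics sketch of what ReLU6 does, assuming the clamp is applied directly in the quantized s8 domain (as in CMSIS-NN's arm_relu6_s8); the library function is the optimized equivalent.

```c
#include <stdint.h>

/* Illustrative plain-C equivalent of riscv_relu6_s8:
 * clamp each element to [0, 6] in place. */
static void relu6_s8_ref(int8_t *data, uint16_t size)
{
    for (uint16_t i = 0; i < size; i++) {
        int8_t v = data[i];
        if (v < 0) v = 0;   /* ReLU: drop negatives        */
        if (v > 6) v = 6;   /* cap at 6 (quantized domain) */
        data[i] = v;
    }
}
```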
-
void riscv_relu_q15(int16_t *data, uint16_t size)
Q15 ReLU function.
- Parameters
data – [inout] pointer to input
size – [in] number of elements
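For clarity, an illustrative plain-C equivalent of the q15 ReLU, max(x, 0) applied in place; the library function is the optimized version of this.

```c
#include <stdint.h>

typedef int16_t q15_t; /* normally provided by the library's types header */

/* Illustrative in-place q15 ReLU: negatives become 0, the rest pass through. */
static void relu_q15_ref(q15_t *data, uint16_t size)
{
    for (uint16_t i = 0; i < size; i++) {
        if (data[i] < 0) {
            data[i] = 0;
        }
    }
}
```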
-
void riscv_relu_q7(int8_t *data, uint16_t size)
Q7 ReLU function.
- Parameters
data – [inout] pointer to input
size – [in] number of elements
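A minimal in-place usage sketch (riscv_nnfunctions.h is the assumed public header):

```c
#include <stdint.h>
#include "riscv_nnfunctions.h" /* assumed NMSIS-NN public header */

/* Apply the q7 ReLU in place to an activation buffer. */
void run_relu_q7(void)
{
    int8_t act[6] = {-100, -1, 0, 1, 64, 127};

    riscv_relu_q7(act, 6); /* act becomes {0, 0, 0, 1, 64, 127} */
}
```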