Configuration class for Dropout module.
Provides a type-safe fluent interface for configuring Dropout modules.
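For illustration, a minimal usage sketch of the fluent interface (the include path shown is an assumption, not documented here):

    // Sketch only: the header path below is an assumption for illustration.
    #include <Mila/Dnn/DropoutConfig.h>

    using Mila::Dnn::DropoutConfig;

    DropoutConfig makeDropoutConfig() {
        // Each with*() setter returns DropoutConfig&, so calls can be chained.
        DropoutConfig config = DropoutConfig()
            .withProbability( 0.1f )               // zero 10% of elements during training
            .withScalingDuringInference( false )   // dropout fully disabled at inference
            .withSameMaskPerBatch( false );        // fresh mask for every batch element

        config.validate();                         // throws std::invalid_argument on bad settings
        return config;
    }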
◆ DropoutConfig() [1/2]
Mila::Dnn::DropoutConfig::DropoutConfig ( )    [default]
◆ DropoutConfig() [2/2]
Mila::Dnn::DropoutConfig::DropoutConfig ( float probability )    [inline, explicit]
Constructor with dropout probability.
- Parameters
  - probability: The dropout probability (0.0 to 1.0)
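Equivalently, the probability can be supplied at construction; since the constructor is explicit, it must be passed directly rather than via implicit conversion:

    Mila::Dnn::DropoutConfig config( 0.2f );   // p = 0.2; explicit, so no implicit float-to-DropoutConfig conversion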
◆ getProbability()
float Mila::Dnn::DropoutConfig::getProbability ( ) const    [inline]
Get the configured dropout probability.
- Returns
- float The dropout probability
◆ scalesDuringInference()
bool Mila::Dnn::DropoutConfig::scalesDuringInference ( ) const    [inline]
Check if scaling during inference is enabled.
- Returns
- bool Whether scaling during inference is enabled
◆ usesSameMaskPerBatch()
bool Mila::Dnn::DropoutConfig::usesSameMaskPerBatch ( ) const    [inline]
Check if the same mask is used for all elements in a batch.
- Returns
- bool Whether the same mask is used per batch
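A short sketch reading the configured state back through the three accessors above:

    void inspectConfig() {
        Mila::Dnn::DropoutConfig config;
        config.withProbability( 0.3f ).withScalingDuringInference( true );

        float p        = config.getProbability();          // 0.3f
        bool  scales   = config.scalesDuringInference();   // true
        bool  sameMask = config.usesSameMaskPerBatch();    // false (default)
    }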
◆ validate()
void Mila::Dnn::DropoutConfig::validate ( ) const    [inline, virtual]
Validate configuration parameters.
- Exceptions
  - std::invalid_argument: If validation fails
Reimplemented from Mila::Dnn::ComponentConfig.
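A sketch of handling a validation failure; which checks validate() performs is not spelled out here beyond the documented (0.0 to 1.0) probability range, so the out-of-range value below is an assumed trigger:

    #include <iostream>
    #include <stdexcept>

    void checkConfig() {
        Mila::Dnn::DropoutConfig config;
        config.withProbability( 1.5f );      // outside the documented (0.0 to 1.0) range

        try {
            config.validate();               // expected to throw std::invalid_argument
        } catch ( const std::invalid_argument& e ) {
            std::cerr << "Invalid dropout configuration: " << e.what() << '\n';
        }
    }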
◆ withProbability()
DropoutConfig & Mila::Dnn::DropoutConfig::withProbability ( float probability )    [inline]
Configure the dropout probability.
- Parameters
  - probability: The probability of zeroing elements (0.0 to 1.0)
- Returns
- DropoutConfig& Reference to this for method chaining
◆ withSameMaskPerBatch()
DropoutConfig & Mila::Dnn::DropoutConfig::withSameMaskPerBatch ( bool use_same_mask_per_batch )    [inline]
Configure whether to use the same dropout mask for all elements in a batch.
- Parameters
  - use_same_mask_per_batch: Whether to use the same mask for the entire batch
- Returns
- DropoutConfig& Reference to this for method chaining
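As a conceptual illustration (not Mila's internal kernel): sharing a mask means one Bernoulli draw per feature is broadcast across the whole batch, instead of an independent draw for every batch element.

    #include <random>
    #include <vector>

    // Conceptual sketch, not Mila's implementation: build a keep-mask that is either
    // drawn once and shared by the whole batch, or drawn per batch element.
    std::vector<float> makeMask( std::size_t batch, std::size_t features,
                                 float p, bool sameMaskPerBatch, std::mt19937& rng ) {
        std::bernoulli_distribution keep( 1.0 - p );
        const std::size_t rows = sameMaskPerBatch ? 1 : batch;   // one shared row vs. one row per sample
        std::vector<float> mask( rows * features );
        for ( auto& m : mask ) m = keep( rng ) ? 1.0f : 0.0f;
        return mask;                                             // a shared row is broadcast when applied
    }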
◆ withScalingDuringInference()
DropoutConfig & Mila::Dnn::DropoutConfig::withScalingDuringInference ( bool scale_during_inference )    [inline]
Configure whether to apply scaling during inference.
When true, outputs during inference will be scaled by 1/(1-p) to maintain the same expected value between training and inference. When false, dropout is completely disabled during inference.
- Parameters
  - scale_during_inference: Whether to apply scaling during inference
- Returns
- DropoutConfig& Reference to this for method chaining
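A conceptual sketch of the behaviour described above (not Mila's kernel): with p = 0.25, enabling scaling multiplies inference outputs by 1 / (1 - 0.25) = 1.333... so their expected value matches training; with scaling disabled, activations pass through unchanged.

    #include <vector>

    // Conceptual sketch of inference-time behaviour, not Mila's implementation.
    void applyDropoutAtInference( std::vector<float>& activations, float p, bool scaleDuringInference ) {
        if ( !scaleDuringInference ) return;        // dropout is a no-op at inference when disabled
        const float scale = 1.0f / ( 1.0f - p );    // e.g. p = 0.25 -> scale = 1.333...
        for ( auto& a : activations ) a *= scale;   // keeps the expected output consistent with training
    }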
◆ probability_
float Mila::Dnn::DropoutConfig::probability_ { 0.5f }    [private]
The probability of zeroing elements.
◆ scale_during_inference_
bool Mila::Dnn::DropoutConfig::scale_during_inference_ { false }    [private]
Whether to apply scaling during inference.
◆ use_same_mask_per_batch_
bool Mila::Dnn::DropoutConfig::use_same_mask_per_batch_ { false }    [private]
Whether to use the same mask for the entire batch.
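Taken together, the member initializers above give the defaults of a default-constructed config, as in this small sketch:

    #include <cassert>

    void checkDefaults() {
        Mila::Dnn::DropoutConfig config;              // default-constructed
        assert( config.getProbability() == 0.5f );    // probability_ { 0.5f }
        assert( !config.scalesDuringInference() );    // scale_during_inference_ { false }
        assert( !config.usesSameMaskPerBatch() );     // use_same_mask_per_batch_ { false }
    }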