Sum Modules¶
Sum¶
Weighted sum over input modules with learnable log-space weights.
- class spflow.modules.sums.Sum(inputs, out_channels=None, num_repetitions=1, weights=None)[source]¶
Bases:
Module
Sum module representing mixture operations in probabilistic circuits.
Implements mixture modeling by computing weighted combinations of child distributions. Weights are normalized to sum to one, maintaining valid probability distributions. Supports both single input (mixture over channels) and multiple inputs (mixture over concatenated inputs).
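The weighted combination described above can be sketched in plain Python for a single evaluation point. This is an illustrative log-space computation, not SPFlow's tensor implementation; the function name is hypothetical:

```python
import math

def mixture_log_likelihood(child_lls, log_weights):
    # log sum_k w_k * p_k(x) = logsumexp(log w_k + log p_k(x)),
    # computed with the max-subtraction trick for stability.
    terms = [lw + ll for lw, ll in zip(log_weights, child_lls)]
    m = max(terms)
    return m + math.log(sum(math.exp(t - m) for t in terms))

# Two equally weighted components with likelihoods 0.2 and 0.4:
ll = mixture_log_likelihood(
    [math.log(0.2), math.log(0.4)],
    [math.log(0.5), math.log(0.5)],
)
# 0.5 * 0.2 + 0.5 * 0.4 = 0.3, so exp(ll) == 0.3
```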
- inputs¶
Input module(s) to the sum node.
- Type:
Module
- weights¶
Normalized weights for mixture components.
- Type:
Tensor
- logits¶
Unnormalized log-weights for gradient optimization.
- Type:
Parameter
- __init__(inputs, out_channels=None, num_repetitions=1, weights=None)[source]¶
Create a Sum module for mixture modeling.
Weights are automatically normalized to sum to one using softmax. Multiple inputs are concatenated along dimension 2 internally.
- Parameters:
inputs (Module | list[Module]) – Single module or list of modules to mix.
out_channels (int | None, optional) – Number of output mixture components. Required if weights is not provided.
num_repetitions (int | None, optional) – Number of repetitions for structured representations. Inferred from weights if not provided.
weights (Tensor | list[float] | None, optional) – Initial mixture weights. Must have a shape compatible with inputs and out_channels.
- Raises:
ValueError – If inputs is empty, out_channels < 1, or weights has an invalid shape or values.
InvalidParameterCombinationError – If both out_channels and weights are specified.
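The softmax normalization described above can be sketched as follows (an illustrative scalar version; SPFlow applies it to tensor-valued logits):

```python
import math

def softmax(logits):
    # Normalize unconstrained logits into positive weights that sum to one.
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

weights = softmax([0.0, 1.0, -1.0])
# All weights are positive and sum to one.
```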
- expectation_maximization(data, bias_correction=True, cache=None)[source]¶
Perform expectation-maximization step.
- log_likelihood(data, cache=None)[source]¶
Compute log likelihood P(data | module).
Computes log likelihood using logsumexp for numerical stability. Results are cached for parameter learning algorithms.
- Parameters:
data – Input data to evaluate.
cache – Optional cache for storing intermediate results used by parameter learning algorithms.
- Returns:
- Log-likelihood of shape (batch_size, num_features, out_channels)
or (batch_size, num_features, out_channels, num_repetitions).
- Return type:
Tensor
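The logsumexp trick mentioned above avoids underflow when child log-likelihoods are very negative. A minimal sketch (illustrative pure Python, not SPFlow's tensor code):

```python
import math

def logsumexp(xs):
    # Subtracting the max keeps at least one exponent at 0,
    # so the sum never underflows to zero.
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

lls = [-1000.0, -1001.0]
stable = logsumexp(lls)  # finite, close to -1000
# A naive log(sum(exp(x) for x in lls)) would underflow to log(0) = -inf.
```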
- maximum_likelihood_estimation(data, weights=None, cache=None)[source]¶
Update parameters via maximum likelihood estimation.
For Sum modules, this is equivalent to EM.
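The EM update for sum-node weights averages component responsibilities over the data. A scalar sketch for a single sum with K components (the function name is illustrative, not SPFlow API):

```python
import math

def em_weight_update(child_lls_per_sample, log_weights):
    # E-step: responsibility of component k for each sample;
    # M-step: new weight_k = mean responsibility over the data.
    K = len(log_weights)
    resp_sums = [0.0] * K
    for lls in child_lls_per_sample:
        terms = [lw + ll for lw, ll in zip(log_weights, lls)]
        m = max(terms)
        denom = sum(math.exp(t - m) for t in terms)
        for k in range(K):
            resp_sums[k] += math.exp(terms[k] - m) / denom
    n = len(child_lls_per_sample)
    return [r / n for r in resp_sums]

# Two samples, each strongly explained by a different component:
data_lls = [
    [math.log(0.9), math.log(0.1)],
    [math.log(0.1), math.log(0.9)],
]
new_weights = em_weight_update(data_lls, [math.log(0.5)] * 2)
# Symmetric data with uniform priors yields weights [0.5, 0.5].
```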
- sample(num_samples=None, data=None, is_mpe=False, cache=None, sampling_ctx=None)[source]¶
Generate samples from sum module.
- property feature_to_scope: ndarray¶
Mapping from output features to their respective scopes.
- Returns:
- 2D-array of scopes. Each row corresponds to an output feature,
each column to a repetition.
- Return type:
np.ndarray[Scope]
ElementwiseSum¶
Element-wise summation over multiple inputs with the same scope.
- class spflow.modules.sums.ElementwiseSum(inputs, out_channels=None, weights=None, num_repetitions=None)[source]¶
Bases:
Module
Elementwise sum operation for mixture modeling.
Computes weighted combinations of input tensors element-wise. Weights are automatically normalized to sum to one. Uses log-domain computations.
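The element-wise log-domain combination can be sketched over flat lists (illustrative only; SPFlow broadcasts over batched tensors):

```python
import math

def elementwise_log_mixture(input_lls, log_weights):
    # Mix several inputs element by element, entirely in log-space.
    # input_lls: one list of log-likelihoods per input module,
    # all covering the same features (same scope, same length).
    out = []
    for elems in zip(*input_lls):
        terms = [lw + ll for lw, ll in zip(log_weights, elems)]
        m = max(terms)
        out.append(m + math.log(sum(math.exp(t - m) for t in terms)))
    return out

# Two inputs over the same two features, mixed 0.5 / 0.5:
a = [math.log(0.2), math.log(0.8)]
b = [math.log(0.4), math.log(0.6)]
mixed = elementwise_log_mixture([a, b], [math.log(0.5)] * 2)
# Feature 0: 0.5*0.2 + 0.5*0.4 = 0.3; feature 1: 0.5*0.8 + 0.5*0.6 = 0.7
```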
- logits¶
Unnormalized log-weights for gradient optimization.
- Type:
Parameter
- unraveled_channel_indices¶
Mapping for flattened channel indices.
- Type:
Tensor
- __init__(inputs, out_channels=None, weights=None, num_repetitions=None)[source]¶
Initialize elementwise sum module.
- Parameters:
inputs (list[Module]) – Input modules (same features, compatible channels).
out_channels (int | None) – Number of output nodes per sum. Note that this results in a total of out_channels * in_channels (input modules) output channels, since the sum is taken over the list of modules.
weights (Tensor | None) – Initial weights; randomly initialized if None.
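The channel layout implied by the note above can be illustrated with a hypothetical index-unraveling helper. Neither the function name nor the ordering is taken from SPFlow; the actual layout stored in unraveled_channel_indices may differ:

```python
def unravel_channel_indices(num_modules, in_channels):
    # Hypothetical helper: map each flat channel index to a
    # (module, channel) pair, mirroring how concatenated inputs
    # could be indexed after flattening.
    return [(m, c) for m in range(num_modules) for c in range(in_channels)]

# 2 input modules x 3 channels each -> 6 flat channel indices:
pairs = unravel_channel_indices(2, 3)
```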
- maximum_likelihood_estimation(data, weights=None, cache=None)[source]¶
MLE step (equivalent to EM for sum nodes).
- sample(num_samples=None, data=None, is_mpe=False, cache=None, sampling_ctx=None)[source]¶
Generate samples by choosing mixture components.