Sum Modules¶
Sum¶
Weighted sum over input modules with learnable log-space weights.
- class spflow.modules.sums.sum.Sum(inputs, out_channels=1, num_repetitions=1, weights=None)[source]¶
Bases:
Module
Sum module representing mixture operations in probabilistic circuits.
Implements mixture modeling by computing weighted combinations of child distributions. Weights are normalized to sum to one, maintaining valid probability distributions. Supports both single input (mixture over channels) and multiple inputs (mixture over concatenated inputs).
- inputs¶
Input module(s) to the sum node.
- Type:
Module
- weights¶
Normalized weights for mixture components.
- Type:
Tensor
- logits¶
Unnormalized log-weights for gradient optimization.
- Type:
Parameter
- __init__(inputs, out_channels=1, num_repetitions=1, weights=None)[source]¶
Create a Sum module for mixture modeling.
Weights are automatically normalized to sum to one using softmax. Multiple inputs are concatenated along dimension 2 internally.
- Parameters:
inputs (Module | list[Module]) – Single module or list of modules to mix.
out_channels (int, optional) – Number of output mixture components. Defaults to 1.
num_repetitions (int | None, optional) – Number of repetitions for structured representations. Inferred from weights if not provided.
weights (Tensor | list[float] | None, optional) – Initial mixture weights. Must have compatible shape with inputs and out_channels.
- Raises:
ValueError – If inputs empty, out_channels < 1, or weights have invalid shape/values.
InvalidParameterCombinationError – If both out_channels and weights are specified.
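The softmax normalization described above can be sketched in plain Python. This is a minimal illustration of the operation, not the library's internal implementation (which works on tensors of logits):

```python
import math

def normalize_weights(logits):
    """Softmax: map unnormalized log-weights to weights summing to one."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

weights = normalize_weights([0.0, 1.0, -1.0])
# weights are strictly positive and sum to one
```

Storing unnormalized logits and normalizing on the fly keeps gradient-based optimization unconstrained while the effective weights always form a valid distribution.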
- log_likelihood(data, cache=None)[source]¶
Compute log likelihood P(data | module).
Computes log likelihood using logsumexp for numerical stability. Results are cached for parameter learning algorithms.
- Parameters:
data (Tensor) – Input data of shape (batch_size, num_features). NaN values indicate missing values to marginalize over.
cache (Cache | None, optional) – Cache for intermediate computations. Defaults to None.
- Returns:
Log-likelihood of shape (batch_size, num_features, out_channels) or (batch_size, num_features, out_channels, num_repetitions).
- Return type:
Tensor
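The logsumexp computation mentioned above can be sketched in pure Python for a single data point with scalar child log-likelihoods (the actual module operates on batched tensors):

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x)))."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def mixture_log_likelihood(child_lls, log_weights):
    """log p(x) = logsumexp_k(log w_k + log p_k(x)) for one data point."""
    return logsumexp([lw + ll for lw, ll in zip(log_weights, child_lls)])

# Two components with equal weight 0.5 and densities 0.2 and 0.4:
ll = mixture_log_likelihood([math.log(0.2), math.log(0.4)],
                            [math.log(0.5), math.log(0.5)])
# equals log(0.5 * 0.2 + 0.5 * 0.4) = log(0.3)
```

Working entirely in log space avoids the underflow that multiplying many small probabilities would cause.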
ElementwiseSum¶
Element-wise summation over multiple inputs with the same scope.
- class spflow.modules.sums.elementwise_sum.ElementwiseSum(inputs, out_channels=None, weights=None, num_repetitions=None)[source]¶
Bases:
Module
Elementwise sum operation for mixture modeling.
Computes weighted combinations of input tensors element-wise. Weights are automatically normalized to sum to one. Uses log-domain computations.
- logits¶
Unnormalized log-weights for gradient optimization.
- Type:
Parameter
- unraveled_channel_indices¶
Mapping for flattened channel indices.
- Type:
Tensor
- __init__(inputs, out_channels=None, weights=None, num_repetitions=None)[source]¶
Initialize elementwise sum module.
- Parameters:
inputs (list[Module]) – Input modules (same features, compatible channels).
out_channels (int | None) – Number of output nodes per sum. Note that this yields out_channels * in_channels (number of input modules) output channels in total, since the sum runs over the list of modules.
weights (Tensor | None) – Initial weights (randomly initialized if None).
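The element-wise, log-domain mixing can be sketched as follows. `elementwise_log_mix` is a hypothetical helper for illustration only, operating on aligned plain lists of log-probabilities rather than on tensors:

```python
import math

def elementwise_log_mix(input_lls, log_weights):
    """Weighted log-domain sum of aligned log-probability lists, per element."""
    n = len(input_lls[0])
    out = []
    for i in range(n):
        # combine the i-th entry of every input with its module's log-weight
        terms = [lw + lls[i] for lw, lls in zip(log_weights, input_lls)]
        m = max(terms)  # stable logsumexp
        out.append(m + math.log(sum(math.exp(t - m) for t in terms)))
    return out

a = [math.log(0.2), math.log(0.8)]
b = [math.log(0.6), math.log(0.4)]
mixed = elementwise_log_mix([a, b], [math.log(0.5), math.log(0.5)])
# element-wise: log(0.5*0.2 + 0.5*0.6) = log(0.4), log(0.5*0.8 + 0.5*0.4) = log(0.6)
```

Unlike Sum, which mixes across channels of a concatenated input, this variant combines corresponding elements of its inputs position by position.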
RepetitionMixingLayer¶
A specialized sum layer used to sum over repetitions.
- class spflow.modules.sums.repetition_mixing_layer.RepetitionMixingLayer(inputs, out_channels=1, num_repetitions=1, weights=None)[source]¶
Bases:
Sum
Mixing layer for RAT-SPN region nodes.
Specialized sum node for RAT-SPNs. Creates mixtures over input channels. Extends Sum with RAT-SPN specific optimizations.
- __init__(inputs, out_channels=1, num_repetitions=1, weights=None)[source]¶
Initialize mixing layer for RAT-SPN.
SignedSum¶
Linear combination node that allows negative, non-normalized weights.
- class spflow.modules.sums.signed_sum.SignedSum(inputs, out_channels=1, num_repetitions=1, weights=None)[source]¶
Bases:
Module
Linear-combination node that allows negative, non-normalized weights.
This node is not a probabilistic mixture node. It represents a real-valued linear combination of input channels:
y = Σ_j w_j * x_j
where weights may be negative and do not need to sum to one.
Notes
SignedSum does not implement log_likelihood() because its output may be negative (log is undefined). Use SOCS or signed evaluation utilities for inference.
sample() is only supported when all weights are non-negative and no evidence is present, in which case it behaves like an unnormalized mixture over inputs.
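The linear combination y = Σ_j w_j * x_j can be illustrated directly; this small sketch (plain Python, not the module's tensor implementation) also shows why a log-likelihood is unavailable:

```python
def signed_sum(weights, xs):
    """Real-valued linear combination; weights may be negative and unnormalized."""
    return sum(w * x for w, x in zip(weights, xs))

y = signed_sum([1.0, -0.5], [0.3, 0.8])  # 1.0*0.3 - 0.5*0.8 = -0.1
# y is negative, so log(y) is undefined -- hence no log_likelihood()
```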
- log_likelihood(data, cache=None)[source]¶
Compute log likelihood P(data | module).
Computes log probability of input data under this module’s distribution. Uses log-space for numerical stability. Results should be cached for efficiency.
- Parameters:
data (Tensor) – Input data of shape (batch_size, num_features). NaN values indicate missing values to marginalize over.
cache (Cache | None, optional) – Cache for intermediate computations. Defaults to None.
- Returns:
Log-likelihood of shape (batch_size, out_features, out_channels).
- Return type:
Tensor
- Raises:
ValueError – If input data shape is incompatible with module scope.
- marginalize(marg_rvs, prune=True, cache=None)[source]¶
Structurally marginalize out specified random variables from the module.
Computes a new module representing the marginal distribution by integrating out the specified variables from the structure. For data-level marginalization, use NaNs in the log_likelihood inputs.
- Parameters:
marg_rvs (list[int]) – Random variable indices to marginalize out.
prune (bool, optional) – Whether to prune unnecessary modules during marginalization. Defaults to True.
cache (Cache | None, optional) – Cache for intermediate computations. Defaults to None.
- Returns:
Marginalized module, or None if all variables are marginalized out.
- Return type:
Module | None
- Raises:
ValueError – If marginalization variables are not in the module’s scope.
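The effect of marginalizing out a variable can be illustrated on a tiny discrete joint distribution. This is a conceptual sketch only; the module performs marginalization structurally on the circuit rather than on probability tables:

```python
def marginalize_out(joint, axis):
    """Sum a 2-D joint probability table over one variable's axis."""
    if axis == 0:  # sum out the row variable
        return [sum(joint[i][j] for i in range(len(joint)))
                for j in range(len(joint[0]))]
    return [sum(row) for row in joint]  # sum out the column variable

# Joint over (X, Y): rows index X values, columns index Y values.
joint = [[0.1, 0.2],
         [0.3, 0.4]]
p_y = marginalize_out(joint, axis=0)  # marginal of Y: [0.4, 0.6]
p_x = marginalize_out(joint, axis=1)  # marginal of X: [0.3, 0.7]
```

Structural marginalization produces a smaller module that computes such marginals directly, without touching the data, which is why it returns None once every variable in scope has been marginalized out.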