LinMeasurementLayerA¶
Role: A learnable mapping from a flattened field vector to per-cell measurement probabilities in \([0, 1]\), implemented as an MLP ending in a sigmoid layer. Location:
Q_Sea_Battle.lin_measurement_layer_a.LinMeasurementLayerA
Constructor¶
| Parameter | Type | Description |
|---|---|---|
| n2 | int, constraint \(n2 > 0\) | Field vector length (number of cells). Used as the output dimension of the final Dense layer and validated against the input last dimension in call. |
| hidden_units | Sequence[int], elements constraint each \(u\) castable to int | Hidden layer widths for the MLP; each element produces one tf.keras.layers.Dense(u, activation="relu") in build. Default: (64,). |
| name | Optional[str], constraint either None or any string | Layer name passed to tf.keras.layers.Layer. Default: "LinMeasurementLayerA". |
| **kwargs | dict[str, Any], not specified | Forwarded to tf.keras.layers.Layer superclass constructor. |
Preconditions: n2 > 0.
Postconditions: self.n2 is set to int(n2); self.hidden_units is set to tuple(int(u) for u in hidden_units); _mlp is initialized empty and _built_mlp is False.
Errors: Raises ValueError if n2 <= 0.
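The constructor contract above (validation, postconditions) can be sketched as follows. This is a minimal illustrative stand-in consistent with the documented pre/postconditions, not the actual source of `Q_Sea_Battle.lin_measurement_layer_a`:

```python
import tensorflow as tf


class LinMeasurementLayerA(tf.keras.layers.Layer):
    """Sketch of the documented constructor contract."""

    def __init__(self, n2, hidden_units=(64,), name="LinMeasurementLayerA", **kwargs):
        super().__init__(name=name, **kwargs)
        if n2 <= 0:
            # Precondition: n2 > 0 (field vector must have at least one cell).
            raise ValueError(f"n2 must be positive, got {n2}")
        self.n2 = int(n2)
        self.hidden_units = tuple(int(u) for u in hidden_units)
        self._mlp = []            # sublayers are created lazily in build
        self._built_mlp = False   # guards against duplicate construction
```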
Example

```python
import tensorflow as tf
from Q_Sea_Battle.lin_measurement_layer_a import LinMeasurementLayerA

n2 = 100
layer = LinMeasurementLayerA(n2=n2, hidden_units=(64, 64))
x = tf.zeros((8, n2), dtype=tf.float32)
y = layer(x, training=True)
assert y.shape == x.shape
```
Public Methods¶
build¶
- Signature: build(self, input_shape) -> None
- Parameters:
  - input_shape: Unknown, not specified; passed through to tf.keras.layers.Layer.build.
- Returns: None.
- Behavior: Lazily constructs the MLP exactly once; for each u in self.hidden_units, appends a tf.keras.layers.Dense(u, activation="relu"), then appends a final tf.keras.layers.Dense(self.n2, activation="sigmoid"); sets _built_mlp = True and calls super().build(input_shape).
- Errors: Not specified.
call¶
- Signature: call(self, fields, training: bool = False)
- Parameters:
  - fields: tf.Tensor-like, dtype any numeric; accepted shapes (B, n2) or (n2,); converted via tf.convert_to_tensor(fields) and cast to tf.float32 if not floating.
  - training: bool, constraint any boolean; passed as training=training to each Dense layer call.
- Returns: tf.Tensor, dtype float32, shape (B, n2) if input rank is 2, else shape (n2,) if input rank is 1; values are in \([0, 1]\) due to the sigmoid output activation.
- Errors: Raises ValueError if input rank is not 1 or 2; raises ValueError if the last dimension is known and not equal to self.n2.
Data & State¶
- n2: int, constraint \(n2 > 0\); field vector length and final output width.
- hidden_units: tuple[int, ...], each element an int derived from the hidden_units argument; defines the hidden Dense layers.
- _mlp: list[tf.keras.layers.Layer]; empty (len(_mlp) == 0) before build; after build, contains len(hidden_units) + 1 Dense layers in order.
- _built_mlp: bool; False before build, True after the first successful build.
Planned (design-spec)¶
- Not specified.
Deviations¶
- Not specified.
Notes for Contributors¶
- build is intentionally idempotent via _built_mlp; modifications to layer construction should preserve that behavior to avoid duplicate sublayers on repeated builds.
- call supports rank-1 inputs by temporarily expanding to rank-2 and then squeezing back; maintain the squeeze/unsqueeze contract if changing shape handling.
Related¶
- tf.keras.layers.Layer
- tf.keras.layers.Dense
Changelog¶
- 0.1: Initial implementation of a learnable measurement layer mapping field vectors to per-cell probabilities via an MLP with sigmoid output.