GSoC Week 11 & 12: Training Refactors, Protocols, and Documentation Updates
Hi everyone! Here’s my update for Weeks 11 and 12 of my GSoC 2025 journey with the sbi project. These weeks were centered on refactoring trainer methods, introducing new protocols, and improving the documentation.
What I Worked On
- Opened a PR to add the `PosteriorParameters` dataclasses to the Sphinx documentation, ensuring these important classes are now visible in the API docs. I also updated the class template to display shorter titles so that long names are no longer cut off when rendering full paths, making the docs easier to navigate.
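For illustration, shortening titles in an autosummary class template usually comes down to rendering the unqualified name instead of the full dotted path. A hedged sketch of a `_templates/autosummary/class.rst` override (not the exact template from the PR):

```rst
{{ name | escape | underline }}

.. currentmodule:: {{ module }}

.. autoclass:: {{ objname }}
   :members:
   :show-inheritance:
```

Using `{{ name }}` rather than `{{ fullname }}` in the title line keeps long module paths from being cut off, while `autoclass` still documents the fully qualified object.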
- Focused heavily on refactoring the training method for inference classes in sbi, which previously contained a lot of duplicated logic that made maintenance and extension harder. I started by identifying patterns in the shared training logic across classes, then moved on to implementation, opening PR #1651. The key changes include:
  - Moving the main training loop (previously repeated in each trainer) into the `NeuralInference` base class and consolidating it into a single, reusable method.
  - Adding four new abstract methods that subclasses must now override, ensuring a cleaner and more consistent design.
  - Splitting out epoch-level logic into dedicated methods inside the `NeuralInference` base class, making the code easier to follow and extend.
  - Adding training duration summaries that were previously missing for `NLE` and `NRE` subclasses.
Overall, this refactor makes the trainer code more modular, easier to maintain, and simpler to extend with new inference classes in the future.
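The shape of this refactor is essentially the template-method pattern: the base class owns the loop, and subclasses fill in the per-epoch hooks. A minimal sketch under that assumption; the method names (`_train_epoch`, `_converged`, etc.) are illustrative, not the actual sbi API:

```python
from abc import ABC, abstractmethod


class NeuralInference(ABC):
    """Base trainer owning the single, shared training loop."""

    def train(self, max_epochs: int = 100) -> None:
        self._setup_training()               # hypothetical setup hook
        epoch = 0
        while epoch < max_epochs and not self._converged(epoch):
            self._train_epoch(epoch)         # epoch-level logic lives in hooks
            self._validate_epoch(epoch)
            epoch += 1
        self._summarize_training(epoch)      # e.g. training duration summary

    @abstractmethod
    def _setup_training(self) -> None: ...

    @abstractmethod
    def _train_epoch(self, epoch: int) -> None: ...

    @abstractmethod
    def _validate_epoch(self, epoch: int) -> None: ...

    @abstractmethod
    def _converged(self, epoch: int) -> bool: ...

    def _summarize_training(self, epochs: int) -> None:
        print(f"Training finished after {epochs} epochs.")


class ToyTrainer(NeuralInference):
    """A subclass only overrides the per-epoch pieces, not the loop."""

    def _setup_training(self) -> None:
        self.losses: list[float] = []

    def _train_epoch(self, epoch: int) -> None:
        self.losses.append(1.0 / (epoch + 1))  # dummy "loss"

    def _validate_epoch(self, epoch: int) -> None:
        pass  # no validation in this toy example

    def _converged(self, epoch: int) -> bool:
        return epoch >= 3
```

With this split, a new inference class implements only the four abstract hooks, and the loop, convergence handling, and duration summary come for free from the base class.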
- Opened a PR to update the `RatioEstimator` class to be a subclass of `ConditionalEstimator` (PR #1652). This change improves the overall class hierarchy and reduces duplication.
  - As part of this effort, I also removed the `RatioEstimatorBuilder` and `VectorFieldEstimatorBuilder` protocols, since their functionality can be represented more generally with the `DensityEstimatorBuilder` protocol (PR #1633). This simplifies the estimator-building interface and avoids maintaining multiple overlapping protocols.
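The reason one protocol can replace several is that protocols are structural: any callable with the right signature satisfies them, regardless of what kind of estimator it builds. A simplified sketch, assuming a callable-style builder protocol (the actual `DensityEstimatorBuilder` signature in sbi differs, e.g. it works on tensors and returns an estimator module):

```python
from typing import Any, Protocol, runtime_checkable


@runtime_checkable
class DensityEstimatorBuilder(Protocol):
    """Structural type: any callable building an estimator from data."""

    def __call__(self, theta: Any, x: Any) -> Any: ...


def build_ratio_estimator(theta: Any, x: Any) -> Any:
    # A ratio-estimator factory satisfies the same protocol,
    # so no separate RatioEstimatorBuilder is needed.
    return {"kind": "ratio", "n": len(theta)}  # dummy estimator


def build_flow_estimator(theta: Any, x: Any) -> Any:
    return {"kind": "flow", "n": len(theta)}  # dummy estimator


# Both builders are accepted wherever a DensityEstimatorBuilder is expected.
builders: list[DensityEstimatorBuilder] = [
    build_ratio_estimator,
    build_flow_estimator,
]
```

Since the per-estimator protocols added no extra structure beyond this signature, keeping them around only multiplied the interfaces users had to learn.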
Weeks 11 and 12 were focused on deep refactoring of training methods, making them easier to extend and maintain, while also ensuring consistency across inference classes. Documentation was improved with the inclusion of `PosteriorParameters` in Sphinx, and the estimator hierarchy was streamlined by consolidating protocols under `DensityEstimatorBuilder`. Together, these updates strengthen the architecture and prepare the codebase for future extensibility.