Conversation
@kctezcan commented Sep 30, 2025

Description

This is an additional PR on top of a previous PR: #961

The previous PR introduced a new function to embed the cells for the targets. This PR instead uses the existing embed_cells() function to embed the target tokens. The purpose is to reduce duplicated code and prevent potential "code rot".
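To illustrate the idea, here is a minimal, hypothetical sketch of the refactor: instead of keeping a separate, nearly identical method for the target tokens, both source and target tokens go through the same embed_cells() call. The class, method signatures, and tensor shapes below are assumptions for illustration and do not reflect the actual WeatherGenerator code.

```python
import torch
import torch.nn as nn


class Embedder(nn.Module):
    """Hypothetical embedder illustrating the reuse of embed_cells()."""

    def __init__(self, token_dim: int, latent_dim: int):
        super().__init__()
        self.proj = nn.Linear(token_dim, latent_dim)

    def embed_cells(self, tokens: torch.Tensor) -> torch.Tensor:
        # Embed a batch of cell tokens into the latent space.
        return self.proj(tokens)

    # Before this PR (per #961): a separate, near-duplicate method embedded the
    # target tokens. After this PR: the target tokens are passed through the
    # same embed_cells() call, so only one code path has to be maintained.
    def forward(self, source_tokens: torch.Tensor, target_tokens: torch.Tensor):
        source_latents = self.embed_cells(source_tokens)
        target_latents = self.embed_cells(target_tokens)  # reuse, no duplicate
        return source_latents, target_latents
```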

I have tested both training and inference with this.

Issue Number

Closes #941

Checklist before asking for review

  • I have performed a self-review of my code
  • My changes comply with basic sanity checks:
    • I have fixed formatting issues with ./scripts/actions.sh lint
    • I have run unit tests with ./scripts/actions.sh unit-test
    • I have documented my code and I have updated the docstrings.
    • I have added unit tests, if relevant
  • I have tried my changes with data and code:
    • I have run the integration tests with ./scripts/actions.sh integration-test
    • (bigger changes) I have run a full training and I have written in the comment the run_id(s): launch-slurm.py --time 60
    • (bigger changes and experiments) I have shared a HedgeDoc in the GitHub issue with all the configurations and runs for these experiments
  • I have informed and aligned with people impacted by my change:
    • for config changes: the MatterMost channels and/or a design doc
    • for changes of dependencies: the MatterMost software development channel
