
Commit a65363e

Correction of documentation on KeyedJaggedTensor implied batch size (#3197)

Summary: Pull Request resolved: #3197

TorchRec's "concepts" page incorrectly declared that the implied batch size for a `KeyedJaggedTensor` is the number of features divided by the length of the `lengths` tensor. It is in fact the length of `lengths` divided by the number of keys. This diff corrects that.

Reviewed By: iamzainhuda
Differential Revision: D78375136
fbshipit-source-id: 10bd20f4328c2deae0a65aa059c38853b2f3b4d1
1 parent 6ea8189 commit a65363e

File tree

1 file changed (+4, -3 lines)


docs/source/concepts.rst

Lines changed: 4 additions & 3 deletions
@@ -96,9 +96,10 @@ is the data type used in ``forward`` of ``EmbeddingBagCollection`` and
 ``EmbeddingCollection`` as they are used to represent multiple features
 in a table.

-A ``KeyedJaggedTensor`` has an implied batch size, which is the number
-of features divided by the length of ``lengths`` tensor. The example
-below has a batch size of 2. Similar to a ``JaggedTensor``, the
+A ``KeyedJaggedTensor`` has an implied batch size, which is the length
+of ``lengths`` tensor divided by the number of keys. The example
+below has a batch size of 2 (4 lengths divided by 2 keys).
+Similar to a ``JaggedTensor``, the
 ``offsets`` and ``lengths`` function in the same manner. You can also
 access the ``lengths``, ``offsets``, and ``values`` of a feature by
 accessing the key from the ``KeyedJaggedTensor``.
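
As a minimal sketch of the corrected rule, the snippet below builds a TorchRec `KeyedJaggedTensor` and derives the implied batch size as len(lengths) / number of keys; the keys, values, and lengths are invented for illustration and are not taken from the docs page.

import torch
from torchrec.sparse.jagged_tensor import KeyedJaggedTensor

# Two keys and four entries in lengths:
# implied batch size = len(lengths) / len(keys) = 4 / 2 = 2,
# not the number of features divided by len(lengths).
kjt = KeyedJaggedTensor(
    keys=["product", "user"],
    values=torch.tensor([1, 2, 1, 5, 6, 7, 1]),  # sum(lengths) == 7 values
    lengths=torch.tensor([3, 1, 2, 1]),
)

batch_size = kjt.lengths().numel() // len(kjt.keys())
print(batch_size)  # 2

If memory serves, `kjt.stride()` reports the same batch size, but the arithmetic above follows directly from the corrected wording in the docs.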
