Replies: 1 comment
I forgot about this discussion; it's pretty outdated and I think mostly implemented, so I'll close it. We can open a new discussion if needed.
To-do list for splitting off `NDTensors.BlockSparseArrays` as a separate registered package, BlockSparseArrays.jl:

- Remove the `AnyAbstractBlockSparseArray` type union in favor of an `@derive` macro, similar to `Moshi.@derive`, the Rust derive attribute for implementing traits, and `ArrayLayouts.@layoutmatrix` and related macros in ArrayLayouts.jl. This would basically automatically define `getindex`, `map!`, etc. as `blocksparse_getindex`, `blocksparse_map!`, etc. on a specified type or wrapper.
- Audit the NDTensors.jl sub-modules `BlockSparseArrays` depends on and either remove those dependencies or assess what we need to do to split off those libraries into packages as well. For example:
  - `SparseArraysBase` is a major dependency, so we will have to release that first; see the `SparseArraysBase.jl` release to-do list, SparseArraysBase.jl#1.
  - `BroadcastMapConversion`, which converts broadcast calls to map calls (and is heavily inspired by the broadcasting code logic in Strided.jl). That library is also used in other sub-modules of NDTensors.jl, such as `BlockSparseArrays` and `NamedDimsArrays`.
  - `GradedAxes` is being used for things like `dual` and to provide some generic block axis slicing functionality that works for both graded and non-graded unit ranges.
  - `TypeParameterAccessors` for generically accessing type parameters; in particular, it uses functionality for generically getting the type of the parent of a wrapper type. We've been planning to split that off for a while, though I think there are still some type instability issues and interface questions to decide on, so I'm not sure how comfortable I am doing that right now.
  - `NestedPermutedDimsArrays` will be used as the output of `blocks(::PermutedDimsArray)`.
- Handle interoperability with `GradedAxes` and `TensorAlgebra` for compatibility with those libraries.
- Finalize the interface of `BlockSparseArrays`, particularly in light of any changes we decide to make to `SparseArraysBase`, which are being discussed in the `SparseArraysBase.jl` release to-do list, SparseArraysBase.jl#1.
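To illustrate the `@derive` idea from the first item, here is a minimal self-contained sketch. All names besides `Base.getindex` are hypothetical and this is not the actual BlockSparseArrays.jl API; it just shows how a macro can forward selected `Base` functions on a given type to correspondingly named `blocksparse_*` implementations, instead of relying on a type union for dispatch.

```julia
# Hypothetical sketch of a derive-style macro: forward Base functions on a
# given type to same-named `blocksparse_*` implementations. Names here are
# illustrative, not the real BlockSparseArrays.jl API.
macro derive(T, funcs...)
    defs = map(funcs) do f
        impl = Symbol(:blocksparse_, f)
        :(Base.$f(a::$(esc(T)), args...) = $(esc(impl))(a, args...))
    end
    return Expr(:block, defs...)
end

# A toy array type with a stand-in "block sparse" getindex implementation:
struct ToyBlockArray
    data::Matrix{Float64}
end
blocksparse_getindex(a::ToyBlockArray, i...) = a.data[i...]

# One macro call wires up `Base.getindex` for the type:
@derive ToyBlockArray getindex

a = ToyBlockArray([1.0 2.0; 3.0 4.0])
a[1, 2]  # dispatches to blocksparse_getindex
```

In a real implementation the macro would presumably also handle keyword arguments, wrapper types, and a larger method list (`map!`, `similar`, etc.), but the forwarding pattern is the same.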
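The broadcast-to-map conversion that `BroadcastMapConversion` performs can be sketched with plain Base Julia (this is the general idea only, not that library's actual code): a lazy `Broadcasted` object carries a function and its arguments, and for same-shape arguments it can be evaluated as an equivalent `map!` call.

```julia
using Base.Broadcast: broadcasted

# A lazy broadcast call, like the one produced by `x .+ y`:
bc = broadcasted(+, [1, 2, 3], [10, 20, 30])

# For same-shape arguments, evaluate the broadcast as a map! call instead:
dest = similar(bc.args[1])
map!(bc.f, dest, bc.args...)
```

The appeal for sparse and block sparse arrays is that `map!` can be overloaded to iterate only over stored blocks, so routing broadcasting through `map!` gives sparsity-aware broadcasting for free.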
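As an example of the kind of functionality `TypeParameterAccessors` is used for, here is a hand-written sketch (the name `parenttype` is hypothetical here, not a confirmed API) of reading the parent array type out of one particular wrapper's type parameters; the point of the package is to do this generically for any wrapper type rather than per-wrapper by hand.

```julia
# Hypothetical sketch: extract the parent array type stored in the last type
# parameter of the `PermutedDimsArray` wrapper. TypeParameterAccessors aims
# to do this generically; this hand-written method is illustration only.
parenttype(::Type{PermutedDimsArray{T,N,perm,iperm,P}}) where {T,N,perm,iperm,P} = P

a = PermutedDimsArray(zeros(2, 3), (2, 1))
parenttype(typeof(a))  # Matrix{Float64}
```

Doing this at the type level (rather than via `parent(a)`) matters for defining `similar`, `blocks`, and promotion rules without needing a value of the array in hand, which is also where the type-stability questions mentioned above come in.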