Key | Value |
---|---|
FileName | ./usr/lib/i386-linux-gnu/cmake/dlpack/dlpackConfig.cmake |
FileSize | 1351 |
MD5 | C407146C90011FD1E30CD86015814384 |
SHA-1 | 5FF89C180B9E8A42D92DB22491FC75F29DD69F92 |
SHA-256 | 3AAE536B9D0AE5197A560D35668BBC67F4DD760388496FB3DD3392A5718E08FF |
SSDEEP | 24:73aKk6GFLDEZv6gkeUgJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VeUOh2FZXMbGrzbk2t3 |
TLSH | T1BF2132B427991C609397D2507296B41F08C6547FBEB34440FA8FD28923DD1B05AC33FA |
hashlookup:parent-total | 2 |
hashlookup:trust | 60 |
The searched file hash is included in 2 parent files belonging to packages known and seen by metalookup. Both parent files are listed below:
Key | Value |
---|---|
FileSize | 6352 |
MD5 | AF4C402EB0FF60333E969408F17967ED |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: * easier sharing of operators between deep learning frameworks; * easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; * quick swapping of backend implementations, such as different versions of BLAS; * for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.0~git20200217.3ec0443-2 |
SHA-1 | 3F22F58834B6385F75FE8AA8C3E5AF2E94C7140D |
SHA-256 | 02AABE3089F3EDC2C2057F4C2205B9D2338B65C66BDF2D169DF3B850676FE822 |
Key | Value |
---|---|
FileSize | 7076 |
MD5 | 353F70E50BFF9123407A6DE01810C667 |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: * easier sharing of operators between deep learning frameworks; * easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; * quick swapping of backend implementations, such as different versions of BLAS; * for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.6-1+b1 |
SHA-1 | 41B0DC9F55E5CE37958E0F6BE653AD103649EED1 |
SHA-256 | 797D9B1C2FE1A3C11B6C910E5342E12D81DCF27FBD9479C464B3042FD119718D |
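The package description above characterizes DLPack as a small, framework-neutral C struct that frameworks exchange by pointer rather than by copying. The snippet below is a minimal illustrative sketch, not code taken from the package: it wraps an existing CPU buffer in a DLTensor. Field names assume the dlpack 0.6 header layout shipped by the 0.6-1+b1 package listed above; the older 0.0~git20200217 snapshot names the device field `DLContext ctx` instead of `DLDevice device`.

```c
/*
 * Minimal sketch (assumptions noted above): expose an existing CPU buffer
 * as a DLTensor so another framework could consume it without copying.
 */
#include <dlpack/dlpack.h>   /* header provided by libdlpack-dev */
#include <stdint.h>
#include <stdio.h>

int main(void) {
    static float data[2][3] = {{1, 2, 3}, {4, 5, 6}};
    static int64_t shape[2] = {2, 3};

    DLTensor t;
    t.data = data;                    /* borrowed pointer, not owned */
    t.device.device_type = kDLCPU;    /* plain host memory */
    t.device.device_id = 0;
    t.ndim = 2;
    t.dtype.code = kDLFloat;          /* 32-bit float, 1 lane */
    t.dtype.bits = 32;
    t.dtype.lanes = 1;
    t.shape = shape;
    t.strides = NULL;                 /* NULL means compact, row-major */
    t.byte_offset = 0;

    /* A consuming framework reads the metadata and aliases t.data. */
    printf("ndim=%d shape=[%lld,%lld] elem_bits=%d\n",
           t.ndim, (long long)t.shape[0], (long long)t.shape[1], t.dtype.bits);
    return 0;
}
```

A consumer only needs dlpack.h to interpret the metadata; ownership and lifetime are negotiated separately (for example via DLManagedTensor and its deleter callback), which is what keeps the bridge zero-copy.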