Key | Value |
---|---|
FileName | ./usr/lib/powerpc64le-linux-gnu/cmake/dlpack/dlpackConfig.cmake |
FileSize | 1358 |
MD5 | 55E857F20C35C8C5B470FECB32E7FD71 |
SHA-1 | D48CA5FA216954BF9CDCA8CFFCAE30B0F7C26AEA |
SHA-256 | F92315D7854501F0264344DAC32F45F22C25A5BD741158CD149E8AEA39D08136 |
SSDEEP | 24:73aKk6GFLDEZv6gkeBfgJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VelOh2FZXMbGrzbk2t3 |
TLSH | T10A2110B4269918608397C150729AB51F08CA547FBEB34440FA8FD28923DD2B05A932BA |
hashlookup:parent-total | 2 |
hashlookup:trust | 60 |
The searched file hash is included in 2 parent files, each belonging to a package known to and seen by metalookup. Both parent files are listed below:

Key | Value |
---|---|
FileSize | 7092 |
MD5 | CC441BFA808FC70BF3CA511143034B26 |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. It enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.6-1+b1 |
SHA-1 | 7229C7161E19EC3AFE4AE71E530E0FCBCED0DD0D |
SHA-256 | 27C95D3F35835633D2EBBD0295184131AA62DE145137C106B3347A04D1BABE60 |
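The package description above characterizes DLPack as a plain C struct shared across frameworks rather than a tensor library of its own. A minimal sketch of that idea, assuming the layout of the DLPack 0.6 header (`DLTensor`, `DLDevice`, `kDLCPU`, `kDLFloat`; older releases such as the 2020 git snapshot below use `DLContext`/`ctx` instead of `DLDevice`/`device`):

```c
/* Minimal sketch: describe an existing float buffer as a DLTensor so another
 * framework can consume it without copying. Field and enum names follow the
 * DLPack 0.6 header (dlpack/dlpack.h) and may differ in older versions. */
#include <dlpack/dlpack.h>
#include <stddef.h>
#include <stdint.h>

int main(void) {
    static float data[2][3] = {{1, 2, 3}, {4, 5, 6}};
    static int64_t shape[2] = {2, 3};

    DLTensor t;
    t.data = data;
    t.device.device_type = kDLCPU;   /* plain host memory */
    t.device.device_id = 0;
    t.ndim = 2;
    t.dtype.code = kDLFloat;         /* 32-bit float elements */
    t.dtype.bits = 32;
    t.dtype.lanes = 1;
    t.shape = shape;
    t.strides = NULL;                /* NULL means compact row-major layout */
    t.byte_offset = 0;

    /* A consuming framework would read t.shape and t.dtype and wrap t.data
     * in its own tensor type without copying the buffer. */
    return 0;
}
```

The dlpackConfig.cmake file catalogued above is the CMake package-config script that a consuming project would typically locate via find_package(dlpack) to obtain the include path for this header; the exact exported target name depends on the packaging.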

Key | Value |
---|---|
FileSize | 6356 |
MD5 | D1D605F026504F6750E01C2A10522993 |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. It enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.0~git20200217.3ec0443-2 |
SHA-1 | E67AAA4F81D1C910506895C8E9E58C9765093B40 |
SHA-256 | 9B1B4A680913B018EB4A70CDE461374EB91AD043B050AB54DDE7D77AF7748294 |