Key | Value |
---|---|
FileName | ./usr/lib/arm-linux-gnueabihf/cmake/dlpack/dlpackConfig.cmake |
FileSize | 1356 |
MD5 | DF70A015E0B42706453A7471796168A7 |
SHA-1 | 8C2D65C522F28984153623E1073D2274A1F139B0 |
SHA-256 | 690CEA98050E798C3FD5AF5071830103545D33232152EDA7027A619013445701 |
SSDEEP | 24:73aKk6GFLDEZv6gkeAjUgJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VeAwOh2FZXMbGrzbk2t3 |
TLSH | T16B2132B427991C608397D15072A6B41F08C6497FBEB34840FA8FD28923DD1B05A833FA |
hashlookup:parent-total | 2 |
hashlookup:trust | 60 |
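The cryptographic digests above can be rechecked locally. A minimal sketch in Python, assuming the file is installed at the listed path on an armhf Debian system; SSDEEP (a fuzzy hash) and TLSH (a locality-sensitive hash) require separate bindings such as ssdeep and py-tlsh and are omitted here.

```python
# Minimal sketch: recompute the cryptographic digests listed above and compare
# them to the published values. Path, size, and expected digests come from the
# table above (armhf Debian install layout assumed).
import hashlib
from pathlib import Path

PATH = Path("/usr/lib/arm-linux-gnueabihf/cmake/dlpack/dlpackConfig.cmake")
EXPECTED = {
    "md5": "DF70A015E0B42706453A7471796168A7",
    "sha1": "8C2D65C522F28984153623E1073D2274A1F139B0",
    "sha256": "690CEA98050E798C3FD5AF5071830103545D33232152EDA7027A619013445701",
}

data = PATH.read_bytes()
print("FileSize matches:", len(data) == 1356)
for name, expected in EXPECTED.items():
    digest = hashlib.new(name, data).hexdigest().upper()
    print(f"{name}: {'OK' if digest == expected else 'MISMATCH'} ({digest})")
```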
The searched file hash is included in 2 parent files, which belong to packages known to and seen by metalookup. Both parent records are shown below, preceded by a short lookup sketch:
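This parent relationship is what the hashlookup:parent-total and hashlookup:trust fields above express. A minimal sketch of fetching the same record over HTTP, assuming the public CIRCL hashlookup instance at hashlookup.circl.lu (the prose above calls the service metalookup; the instance URL and endpoint are assumptions):

```python
# Minimal sketch: query a hashlookup-style service for the record shown above.
# Assumes the public CIRCL hashlookup instance; adjust the base URL for a
# self-hosted instance.
import json
import urllib.request

SHA1 = "8C2D65C522F28984153623E1073D2274A1F139B0"
url = f"https://hashlookup.circl.lu/lookup/sha1/{SHA1}"

with urllib.request.urlopen(url, timeout=10) as resp:
    record = json.load(resp)

print("trust:", record.get("hashlookup:trust"))
print("parents:", record.get("hashlookup:parent-total"))
```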
Key | Value |
---|---|
FileSize | 6360 |
MD5 | 8CD3A27CB75DDEBEBEA7F75C2EDF3487 |
PackageDescription | Open in-memory tensor structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.0~git20200217.3ec0443-2 |
SHA-1 | 9B05B89ADF9A715F6AE3F44CF449BE389DAFDF9E |
SHA-256 | FB842C3A0DA44A77DBADC62C3A62B16363442D262B53D0AB9903B0A7FA100FCB |
Key | Value |
---|---|
FileSize | 7068 |
MD5 | 82232B5B34179A293C04FFA126A348EA |
PackageDescription | Open in-memory tensor structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.6-1+b1 |
SHA-1 | 290DC8B80510067FEBFDCD6E81E83F57A0AECDD3 |
SHA-256 | E889ED7FEC62F52C340B53E87626F8D95BE1F0B9A674BA1C2E5672119C7C30D2 |
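The libdlpack-dev description above presents DLPack as a common in-memory bridge for exchanging tensors between frameworks rather than a tensor or operator library of its own. A minimal sketch of that exchange through the DLPack protocol in Python, assuming NumPy >= 1.23 and PyTorch >= 1.10 are available (neither is shipped by libdlpack-dev, which only provides the C header and CMake files such as dlpackConfig.cmake):

```python
# Minimal sketch: hand a tensor from PyTorch to NumPy (and back) through the
# DLPack protocol without copying the underlying buffer.
import numpy as np
import torch

t = torch.arange(6, dtype=torch.float32).reshape(2, 3)

# NumPy consumes the tensor via its __dlpack__ capsule; no data is copied,
# so both objects view the same memory.
a = np.from_dlpack(t)

t[0, 0] = 42.0
print(a[0, 0])   # 42.0: the NumPy view sees the in-place change

# The reverse direction works the same way.
t2 = torch.from_dlpack(a)
print(t2.shape)  # torch.Size([2, 3])
```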