Result for 8C2D65C522F28984153623E1073D2274A1F139B0

Query result

Key                       Value
FileName                  ./usr/lib/arm-linux-gnueabihf/cmake/dlpack/dlpackConfig.cmake
FileSize                  1356
MD5                       DF70A015E0B42706453A7471796168A7
SHA-1                     8C2D65C522F28984153623E1073D2274A1F139B0
SHA-256                   690CEA98050E798C3FD5AF5071830103545D33232152EDA7027A619013445701
SSDEEP                    24:73aKk6GFLDEZv6gkeAjUgJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VeAwOh2FZXMbGrzbk2t3
TLSH                      T16B2132B427991C608397D15072A6B41F08C6497FBEB34840FA8FD28923DD1B05A833FA
hashlookup:parent-total   2
hashlookup:trust          60
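The MD5, SHA-1, and SHA-256 values in the record can be recomputed locally to check whether a candidate file matches this entry. A minimal sketch using Python's standard `hashlib` (the file path is illustrative):

```python
import hashlib

def file_digests(path: str) -> dict:
    """Compute MD5, SHA-1 and SHA-256 of a file, as uppercase hex like the record above."""
    hashes = {
        "MD5": hashlib.md5(),
        "SHA-1": hashlib.sha1(),
        "SHA-256": hashlib.sha256(),
    }
    # Read in chunks so large files do not need to fit in memory.
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 16), b""):
            for h in hashes.values():
                h.update(chunk)
    return {name: h.hexdigest().upper() for name, h in hashes.items()}

# Usage (path is hypothetical):
# digests = file_digests("/usr/lib/arm-linux-gnueabihf/cmake/dlpack/dlpackConfig.cmake")
# digests["SHA-1"] == "8C2D65C522F28984153623E1073D2274A1F139B0"
```

All three digests are computed in a single pass over the file, so verifying against every hash in the record costs no more I/O than checking one.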

Network graph view

Parents (Total: 2)

The searched file hash is included in 2 parent files, which are packages known and seen by metalookup. A sample of each is included below:

Key                  Value
FileSize             6360
MD5                  8CD3A27CB75DDEBEBEA7F75C2EDF3487
PackageDescription   Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops, but instead serves as a common bridge to reuse tensors and ops across frameworks.
PackageMaintainer    Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName          libdlpack-dev
PackageSection       science
PackageVersion       0.0~git20200217.3ec0443-2
SHA-1                9B05B89ADF9A715F6AE3F44CF449BE389DAFDF9E
SHA-256              FB842C3A0DA44A77DBADC62C3A62B16363442D262B53D0AB9903B0A7FA100FCB
Key                  Value
FileSize             7068
MD5                  82232B5B34179A293C04FFA126A348EA
PackageDescription   Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops, but instead serves as a common bridge to reuse tensors and ops across frameworks.
PackageMaintainer    Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName          libdlpack-dev
PackageSection       science
PackageVersion       0.6-1+b1
SHA-1                290DC8B80510067FEBFDCD6E81E83F57A0AECDD3
SHA-256              E889ED7FEC62F52C340B53E87626F8D95BE1F0B9A674BA1C2E5672119C7C30D2
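A record like the one above can be retrieved programmatically. The sketch below assumes the public CIRCL hashlookup service and its `/lookup/sha1/<hash>` REST endpoint (which returns the same fields shown here as JSON, including the `hashlookup:parent-total` and `hashlookup:trust` keys and a list of parent SHA-1s); if this result came from a different deployment, substitute its base URL.

```python
import json
import urllib.request

# Assumption: the record comes from the public CIRCL hashlookup service.
BASE_URL = "https://hashlookup.circl.lu"

def sha1_lookup_url(sha1_hex: str) -> str:
    """Build the lookup URL for a SHA-1 digest (hex, case-insensitive)."""
    return f"{BASE_URL}/lookup/sha1/{sha1_hex.lower()}"

def lookup_sha1(sha1_hex: str) -> dict:
    """Fetch the JSON record for a SHA-1. An HTTP 404 means the hash is unknown."""
    with urllib.request.urlopen(sha1_lookup_url(sha1_hex)) as resp:
        return json.load(resp)

# Usage (requires network access):
# record = lookup_sha1("8C2D65C522F28984153623E1073D2274A1F139B0")
# record["FileName"], record["hashlookup:trust"], record.get("parents")
```

Lowercasing the digest before building the URL keeps lookups consistent regardless of how the hash was copied.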