Result for D48CA5FA216954BF9CDCA8CFFCAE30B0F7C26AEA

Query result

Key                      Value
FileName                 ./usr/lib/powerpc64le-linux-gnu/cmake/dlpack/dlpackConfig.cmake
FileSize                 1358
MD5                      55E857F20C35C8C5B470FECB32E7FD71
SHA-1                    D48CA5FA216954BF9CDCA8CFFCAE30B0F7C26AEA
SHA-256                  F92315D7854501F0264344DAC32F45F22C25A5BD741158CD149E8AEA39D08136
SSDEEP                   24:73aKk6GFLDEZv6gkeBfgJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VelOh2FZXMbGrzbk2t3
TLSH                     T10A2110B4269918608397C150729AB51F08CA547FBEB34440FA8FD28923DD2B05A932BA
hashlookup:parent-total  2
hashlookup:trust         60

Network graph view

Parents (Total: 2)

The searched file hash is included in 2 parent files, which belong to packages known and seen by metalookup. A sample is included below:
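This parent listing can also be retrieved programmatically. The `hashlookup:` keys above suggest a CIRCL hashlookup-style service; the base URL and the `/parents/sha1/{sha1}` endpoint with `count`/`cursor` parameters below are assumptions based on that API, not something this report confirms.

```python
import json
import urllib.request

BASE = "https://hashlookup.circl.lu"  # assumed hashlookup instance

def parents_url(sha1: str, count: int = 2, cursor: str = "0") -> str:
    """Build the URL listing parent files that embed the given SHA-1."""
    return f"{BASE}/parents/sha1/{sha1.upper()}?count={count}&cursor={cursor}"

def fetch_parents(sha1: str) -> dict:
    """Fetch and decode the parent listing (requires network access)."""
    with urllib.request.urlopen(parents_url(sha1)) as resp:
        return json.load(resp)

# Example (live network call; output depends on the current database):
# fetch_parents("D48CA5FA216954BF9CDCA8CFFCAE30B0F7C26AEA")
```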

Key                 Value
FileSize            7092
MD5                 CC441BFA808FC70BF3CA511143034B26
PackageDescription  Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks.
PackageMaintainer   Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName         libdlpack-dev
PackageSection      science
PackageVersion      0.6-1+b1
SHA-1               7229C7161E19EC3AFE4AE71E530E0FCBCED0DD0D
SHA-256             27C95D3F35835633D2EBBD0295184131AA62DE145137C106B3347A04D1BABE60
Key                 Value
FileSize            6356
MD5                 D1D605F026504F6750E01C2A10522993
PackageDescription  Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks.
PackageMaintainer   Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName         libdlpack-dev
PackageSection      science
PackageVersion      0.0~git20200217.3ec0443-2
SHA-1               E67AAA4F81D1C910506895C8E9E58C9765093B40
SHA-256             9B1B4A680913B018EB4A70CDE461374EB91AD043B050AB54DDE7D77AF7748294