Result for 0687415200D4026BAE2C741C85C3BEBE379AA74F

Query result

Key                        Value
FileName                   ./usr/lib/s390x-linux-gnu/cmake/dlpack/dlpackConfig.cmake
FileSize                   1352
MD5                        5845F86294BC4B86D28D061813EE91A5
SHA-1                      0687415200D4026BAE2C741C85C3BEBE379AA74F
SHA-256                    A7B17749660CAF971D727F65AF104D5968F28BE1A4C02A1D9AD012EEE094900A
SSDEEP                     24:73aKk6GFLDEZv6gke5gJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6Ve5Oh2FZXMbGrzbk2t3
TLSH                       T1032132B427991C608397C1507296B41F08C6547FBEB34440FA8FD28923DD1B05A833FA
hashlookup:parent-total    2
hashlookup:trust           60
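
This record can be reproduced programmatically. Below is a minimal sketch, assuming the public CIRCL hashlookup API at https://hashlookup.circl.lu with its documented /lookup/sha1/<hash> and /parents/<hash> endpoints, and the Python requests library; the response field names mirror the keys shown above.

    # Minimal sketch: query the public CIRCL hashlookup API for this SHA-1.
    # Assumes https://hashlookup.circl.lu is reachable and `requests` is installed;
    # the /lookup/sha1 and /parents endpoints are taken from the public API documentation.
    import requests

    BASE = "https://hashlookup.circl.lu"
    SHA1 = "0687415200D4026BAE2C741C85C3BEBE379AA74F"

    # Direct lookup returns the key/value record shown above.
    record = requests.get(f"{BASE}/lookup/sha1/{SHA1}", timeout=10).json()
    print(record.get("FileName"), record.get("hashlookup:parent-total"))

    # Parent traversal lists the packages that ship this file.
    parents = requests.get(f"{BASE}/parents/{SHA1}", timeout=10).json()
    for parent_sha1 in parents.get("parents", []):
        print("parent:", parent_sha1)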

Parents (Total: 2)

The searched file hash is included in 2 parent files, which correspond to packages known and seen by metalookup. Samples of both parents are included below:

Key                   Value
FileSize              6352
MD5                   D4BC6FFA379BB4F5816A8AD537C70C9A
PackageDescription    Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks.
PackageMaintainer     Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName           libdlpack-dev
PackageSection        science
PackageVersion        0.0~git20200217.3ec0443-2
SHA-1                 D5E6030C8D7C371B0D015007A36E44E2C2D030C1
SHA-256               151C664E086E33D813CF567A528FE12922A9124E37CC2D23CE45B40CD4D8728E

Key                   Value
FileSize              7064
MD5                   BD1FC769EFDCE940358381EFF1BB90B4
PackageDescription    Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks.
PackageMaintainer     Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName           libdlpack-dev
PackageSection        science
PackageVersion        0.6-1+b1
SHA-1                 F19347CFFFC7DE9F6CEE5BEC6B821CFB437DEF5E
SHA-256               A955AC951AB274990479474AACEEE9A113D4D9F5D07F2369DA6AF9F85BC6E704
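
For context on the package description above: DLPack is the in-memory tensor exchange protocol whose headers and CMake config (including the dlpackConfig.cmake file looked up here) are shipped by libdlpack-dev. Below is a minimal sketch of the cross-framework sharing the description refers to, assuming NumPy (>= 1.23) and PyTorch are installed; neither framework is implied by the lookup result itself.

    # Minimal sketch of DLPack tensor sharing between frameworks.
    # Assumes NumPy >= 1.23 and PyTorch are installed; these are illustrative
    # consumers of the protocol, not something stated by the lookup result.
    import numpy as np
    import torch

    # Produce a tensor in one framework.
    t = torch.arange(6, dtype=torch.float32).reshape(2, 3)

    # Consume it in another via the DLPack protocol (zero-copy for CPU tensors).
    a = np.from_dlpack(t)

    # The two objects share the same buffer: a change made through the
    # producer is visible through the consumer.
    t[0, 0] = 42.0
    print(a[0, 0])  # 42.0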