Result for 5FF89C180B9E8A42D92DB22491FC75F29DD69F92

Query result

Key                       Value
FileName                  ./usr/lib/i386-linux-gnu/cmake/dlpack/dlpackConfig.cmake
FileSize                  1351
MD5                       C407146C90011FD1E30CD86015814384
SHA-1                     5FF89C180B9E8A42D92DB22491FC75F29DD69F92
SHA-256                   3AAE536B9D0AE5197A560D35668BBC67F4DD760388496FB3DD3392A5718E08FF
SSDEEP                    24:73aKk6GFLDEZv6gkeUgJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VeUOh2FZXMbGrzbk2t3
TLSH                      T1BF2132B427991C609397D2507296B41F08C6547FBEB34440FA8FD28923DD1B05AC33FA
hashlookup:parent-total   2
hashlookup:trust          60
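
The MD5, SHA-1, and SHA-256 digests above can be reproduced with Python's standard hashlib module. The sketch below assumes a local copy of the file at the recorded path; SSDEEP and TLSH are not part of the standard library and would need the third-party ssdeep and py-tlsh bindings.

    import hashlib

    # Path taken from the FileName field of the record; adjust it to wherever
    # the local copy of dlpackConfig.cmake actually lives (assumption).
    path = "/usr/lib/i386-linux-gnu/cmake/dlpack/dlpackConfig.cmake"

    # Expected digests copied from the query result above.
    expected = {
        "md5": "C407146C90011FD1E30CD86015814384",
        "sha1": "5FF89C180B9E8A42D92DB22491FC75F29DD69F92",
        "sha256": "3AAE536B9D0AE5197A560D35668BBC67F4DD760388496FB3DD3392A5718E08FF",
    }

    with open(path, "rb") as f:
        data = f.read()

    for algo, want in expected.items():
        got = hashlib.new(algo, data).hexdigest().upper()
        print(f"{algo:6} {'OK' if got == want else 'MISMATCH'} {got}")

A matching output confirms the local file is byte-identical to the one indexed here; a mismatch indicates a different build or a modified file.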

Parents (Total: 2)

The searched file hash is included in 2 parent files, which correspond to packages known and seen by metalookup. The samples are shown below:

Key                  Value
FileSize             6352
MD5                  AF4C402EB0FF60333E969408F17967ED
PackageDescription   Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge to reuse tensors and ops across frameworks.
PackageMaintainer    Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName          libdlpack-dev
PackageSection       science
PackageVersion       0.0~git20200217.3ec0443-2
SHA-1                3F22F58834B6385F75FE8AA8C3E5AF2E94C7140D
SHA-256              02AABE3089F3EDC2C2057F4C2205B9D2338B65C66BDF2D169DF3B850676FE822
Key                  Value
FileSize             7076
MD5                  353F70E50BFF9123407A6DE01810C667
PackageDescription   Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge to reuse tensors and ops across frameworks.
PackageMaintainer    Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName          libdlpack-dev
PackageSection       science
PackageVersion       0.6-1+b1
SHA-1                41B0DC9F55E5CE37958E0F6BE653AD103649EED1
SHA-256              797D9B1C2FE1A3C11B6C910E5342E12D81DCF27FBD9479C464B3042FD119718D
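
The two parent records can also be retrieved programmatically. The sketch below is a minimal example assuming the data comes from an instance of the CIRCL hashlookup API (https://hashlookup.circl.lu), where /lookup/sha1/<hash> returns the record as JSON together with a "parents" list of SHA-1 hashes; the endpoint and field names are assumptions based on that public service and may differ for the metalookup deployment referenced above.

    import json
    import urllib.request

    # Base URL of a hashlookup-style service; the public CIRCL instance is
    # assumed here. Swap in the actual metalookup endpoint if it differs.
    BASE = "https://hashlookup.circl.lu"

    def lookup_sha1(sha1: str) -> dict:
        """Fetch the JSON record for a single SHA-1 hash."""
        with urllib.request.urlopen(f"{BASE}/lookup/sha1/{sha1}") as resp:
            return json.load(resp)

    record = lookup_sha1("5FF89C180B9E8A42D92DB22491FC75F29DD69F92")
    print(record.get("FileName"), record.get("hashlookup:trust"))

    # Each entry in "parents" is itself a SHA-1 that resolves to a
    # package-level record such as the two libdlpack-dev entries above.
    for parent_sha1 in record.get("parents", []):
        parent = lookup_sha1(parent_sha1)
        print(parent_sha1, parent.get("PackageName"), parent.get("PackageVersion"))

The same service also exposes md5 and sha256 lookup variants, so either of the other digests in the record could be used in place of the SHA-1.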