Low-rank matrix models have been broadly useful in numerous applications, from classical system identification to modern matrix completion in signal processing and statistics. The Schatten-1 norm, also known as the nuclear norm, has served as a convex surrogate for low-rankness since it induces low-rank solutions to inverse problems. While the Schatten-1 norm for low-rankness has a natural analogy with the ℓ1 norm for sparsity through the singular value decomposition, other matrix norms also induce low-rankness. In particular, when a matrix is interpreted as a linear operator between Banach spaces, various tensor product norms generalize the role of the Schatten-1 norm. Inspired by recent work on max-norm-based matrix completion, we provide a unified view of a class of tensor product norms and their interlacing relations on low-rank operators. Furthermore, we derive entropy estimates between the injective and projective tensor products of a family of Banach space pairs and demonstrate their applications to matrix completion and decentralized subspace sketching.
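As a minimal numerical illustration of the ℓ1-norm analogy mentioned above (a sketch using NumPy; the random test matrix and its sizes are hypothetical choices, not from this work), the Schatten-1 norm of a matrix is exactly the ℓ1 norm of its singular value vector, which is why it promotes low rank the way the ℓ1 norm promotes entrywise sparsity:

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a rank-2 matrix as a product of thin factors (hypothetical sizes).
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))

# Schatten-1 (nuclear) norm: the sum of singular values,
# i.e. the l1 norm of the singular value vector.
s = np.linalg.svd(A, compute_uv=False)
nuclear = s.sum()

# The analogy with l1/sparsity holds exactly at the level of singular values.
assert np.isclose(nuclear, np.linalg.norm(s, 1))
# Only rank(A) = 2 singular values are numerically nonzero.
assert np.sum(s > 1e-10) == 2
```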