totally bounded
Recently Published Documents

TOTAL DOCUMENTS: 81 (FIVE YEARS: 8)
H-INDEX: 10 (FIVE YEARS: 1)

2021 ◽ Vol 18 (6)
Author(s): Rovshan A. Bandaliyev, Przemysław Górka, Vagif S. Guliyev, Yoshihiro Sawano

Abstract: We study a characterization of the precompactness of sets in variable exponent Morrey spaces on bounded metric measure spaces. Totally bounded sets in variable exponent Morrey spaces over metric measure spaces are characterized from several points of view. The characterization is new even in the case of constant exponents.
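For orientation, here is a minimal sketch of the textbook notion underlying such results; it recalls only the general definition of total boundedness and its link to precompactness, not the paper's specific Morrey-space criteria. A subset $S$ of a metric space $(X,d)$ is totally bounded when, for every $\varepsilon > 0$, it admits a finite $\varepsilon$-net:

$$
S \subseteq \bigcup_{k=1}^{N(\varepsilon)} B(x_k, \varepsilon) \quad \text{for some finite set } x_1, \dots, x_{N(\varepsilon)} \in X .
$$

In a complete metric space a set is precompact (has compact closure) exactly when it is totally bounded in this sense, so precompactness criteria in function spaces amount to exhibiting such finite $\varepsilon$-nets uniformly over the set.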


Author(s): Simon Puchert

Abstract: We consider infinite graphs and the associated energy forms. We show that a graph is canonically compactifiable (i.e., all functions of finite energy are bounded) if and only if the underlying set is totally bounded with respect to every intrinsic metric associated with a finite measure. Furthermore, we show that a graph is canonically compactifiable if and only if the space of functions of finite energy is an algebra. These results answer questions raised in a recent work of Georgakopoulos, Haeseler, Keller, Lenz, and Wojciechowski.
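As background, the following are the standard definitions of the energy form and of an intrinsic metric in the weighted-graph setting; they are recalled here only for context and may differ in detail from the conventions of the paper. For a graph with vertex set $X$, symmetric edge weights $b(x,y) \ge 0$, and a measure $m$ on $X$, the energy of a function $f$ and the condition for a pseudo metric $\sigma$ to be intrinsic with respect to $m$ read:

$$
Q(f) = \frac{1}{2} \sum_{x,y \in X} b(x,y)\,\lvert f(x) - f(y) \rvert^2,
\qquad
\sum_{y \in X} b(x,y)\,\sigma(x,y)^2 \le m(x) \ \text{ for all } x \in X .
$$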


2019 ◽ Vol 265 ◽ pp. 106825
Author(s): Kourosh Nourouzi, Faezeh Zahedi, Donal O'Regan

2019 ◽ Vol 263 ◽ pp. 350-371
Author(s): Elisa Hartmann

2019 ◽ Vol 259 ◽ pp. 110-123
Author(s): Salvador Hernández, F. Javier Trigos-Arrieta

Entropy ◽ 2018 ◽ Vol 20 (9) ◽ pp. 640
Author(s): Jorge Silva, Milan Derpich

This work demonstrates a formal connection between density estimation under a data-rate constraint and the joint objective of fixed-rate universal lossy source coding and model identification introduced by Raginsky in 2008 (IEEE TIT, 2008, 54, 3059–3077). Using an equivalent learning formulation, we derive a necessary and sufficient condition on the class of densities for the achievability of the joint objective. The learning framework used here is the skeleton estimator, a rate-constrained learning scheme that yields achievability results for the joint coding and modeling problem by optimally adapting its learning parameters to the specific conditions of the problem. The results obtained with the skeleton estimator significantly extend the context in which universal lossy source coding and model identification can be achieved, moving from the known case of parametric collections of densities satisfying certain smoothness and learnability conditions to the rich family of non-parametric $L^1$-totally bounded densities. In addition, in the parametric case we are able to remove one of the assumptions that constrain the applicability of the original result, obtaining similar performance in terms of distortion redundancy and per-letter rate overhead.
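A brief worked definition may help make the covering-number viewpoint explicit; it recalls only the general notion of $L^1$-total boundedness, under the hedged reading that the skeleton plays the role of a finite $\varepsilon$-net of the class. A family of densities $\mathcal{F}$ is $L^1$-totally bounded when, for every $\varepsilon > 0$, there exist finitely many densities $f_1, \dots, f_{N(\varepsilon)}$ with

$$
\sup_{f \in \mathcal{F}} \; \min_{1 \le k \le N(\varepsilon)} \, \lVert f - f_k \rVert_{L^1} \le \varepsilon ,
$$

so a rate-constrained scheme can describe an element of the net with roughly $\log_2 N(\varepsilon)$ bits, which is what makes a finite-rate formulation meaningful for such non-parametric families.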


Author(s): Simon Fong, Peter Tino

This paper aims to describe the geometrical structure and explicit expressions of a family of finitely parametrized probability densities over a smooth manifold $M$. The geometry of the family of probability densities on $M$ is inherited from probability densities on Euclidean spaces $\left\{U_\alpha \right\}$ via bundle morphisms induced by orientation-preserving diffeomorphisms $\rho_\alpha:U_\alpha \rightarrow M$. The current literature inherits densities on $M$ from tangent spaces via the Riemannian exponential map $\exp: T_x M \rightarrow M$; densities on $M$ are then defined locally on the region where the exponential map is a diffeomorphism. We generalize this approach to an arbitrary orientation-preserving bundle morphism; we show that the dualistic geometry of the family of densities on $U_\alpha$ can be inherited by the family of densities on $M$. Furthermore, we provide explicit expressions for parametrized probability densities on $\rho_\alpha(U_\alpha) \subset M$. Finally, using the component densities on $\rho_\alpha(U_\alpha)$, we construct parametrized mixture densities on totally bounded subsets of $M$ and describe the inherited mixture product dualistic geometry of the family of mixture densities.
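For readers unfamiliar with the pushforward mechanism, the following change-of-variables sketch (a standard single-chart identity, with densities read in local coordinates; it is not the paper's bundle-morphism construction or its dualistic geometry) shows how a parametrized density $p_\theta$ on $U_\alpha \subseteq \mathbb{R}^n$ induces a density on $\rho_\alpha(U_\alpha) \subset M$, and how such component densities combine into a mixture:

$$
q_\theta(x) = p_\theta\!\bigl(\rho_\alpha^{-1}(x)\bigr)\,\bigl|\det D\rho_\alpha^{-1}(x)\bigr|, \quad x \in \rho_\alpha(U_\alpha),
\qquad
q_{\mathrm{mix}}(x) = \sum_{k} \pi_k\, q_{\theta_k}(x), \ \ \pi_k \ge 0, \ \sum_k \pi_k = 1 .
$$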

