Functions of Two Variables with Large Tangent Plane Sets

1998
Vol 220 (2)
pp. 562-570
Author(s):
Zoltán Buczolich
2016
Vol 2016
pp. 1-12
Author(s):
Kun-Lin Wu
Ting-Jui Ho
Sean A. Huang
Kuo-Hui Lin
Yueh-Chen Lin
...  

In this paper, mobile robot navigation on a 3D terrain with a single obstacle is addressed. The terrain is modelled as a smooth, complete manifold with well-defined tangent planes, and the hazardous region is modelled as an enclosing circle, with a radius tuned by hazard grade, representing the obstacle projected onto the terrain; this allows efficient path-obstacle intersection checking. To resolve intersections along the initial geodesic, we draw on geodesic ideas from the differential geometry of surfaces and manifolds and present a geodesic-based planning and replanning algorithm as a new method for obstacle avoidance on a 3D terrain that does not require boundary following on the obstacle surface. The replanning algorithm generates two new paths, each a composition of two geodesics connected at critical points whose locations depend heavily on exploration of the terrain via directional scanning on the tangent plane at the first intersection point of the initial geodesic with the circle. An advantage of this geodesic path replanning procedure is that the traversability of the terrain crossed by the detour path can be assessed at the planning stage using the local Gauss-Bonnet theorem on the geodesic triangle. A simulation demonstrates the practicality of the analytical geodesic replanning procedure for navigating a constant-speed point robot on a 3D hill-like terrain.
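The hazard-region model above reduces the path-obstacle test to intersecting the projected path with a circle. A minimal planar sketch of that check follows; the segment endpoints, circle centre, and radius are illustrative stand-ins for the projected geodesic and the hazard-grade-tuned enclosing circle, not the paper's actual implementation.

```python
import numpy as np

def segment_circle_intersect(p0, p1, center, radius):
    """Check whether the 2D segment p0->p1 intersects the hazard circle.

    Solves |p0 + t*(p1 - p0) - center|^2 = radius^2 for t and tests
    whether any root lies in the parameter range [0, 1] of the segment
    (or the segment passes straight through the circle).
    """
    p0, p1, center = map(np.asarray, (p0, p1, center))
    d = p1 - p0          # segment direction
    f = p0 - center      # start point relative to circle centre
    a = d @ d
    b = 2.0 * (f @ d)
    c = f @ f - radius ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return False     # line misses the circle entirely
    sqrt_disc = np.sqrt(disc)
    t1 = (-b - sqrt_disc) / (2.0 * a)
    t2 = (-b + sqrt_disc) / (2.0 * a)
    return bool((0 <= t1 <= 1) or (0 <= t2 <= 1) or (t1 < 0 < t2))
```

In a replanning loop of the kind the abstract describes, a `True` result at this stage would trigger the directional scan on the tangent plane at the first intersection point.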


2021
Vol 0 (0)
Author(s):
Julián Pozuelo
Manuel Ritoré

Abstract We consider an asymmetric left-invariant norm $\|\cdot\|_{K}$ in the first Heisenberg group $\mathbb{H}^{1}$ induced by a convex body $K\subset\mathbb{R}^{2}$ containing the origin in its interior. Associated to $\|\cdot\|_{K}$ there is a perimeter functional that coincides with the classical sub-Riemannian perimeter in case $K$ is the closed unit disk centered at the origin of $\mathbb{R}^{2}$. Under the assumption that $K$ has $C^{2}$ boundary with strictly positive geodesic curvature, we compute the first variation formula of perimeter for sets with $C^{2}$ boundary. The localization of the variational formula in the non-singular part of the boundary, composed of the points where the tangent plane is not horizontal, allows us to define a mean curvature function $H_{K}$ off the singular set. In the case of non-vanishing mean curvature, the condition that $H_{K}$ be constant implies that the non-singular portion of the boundary is foliated by horizontal liftings of translations of $\partial K$ dilated by a factor of $\frac{1}{H_{K}}$. Based on this we can define a sphere $\mathbb{S}_{K}$ with constant mean curvature 1 by considering the union of all horizontal liftings of $\partial K$ starting from $(0,0,0)$ until they meet again at a point of the vertical axis. We give some geometric properties of this sphere and, moreover, we prove that, up to non-homogeneous dilations and left-translations, these spheres are the only solutions of the sub-Finsler isoperimetric problem in a restricted class of sets.
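For orientation, the classical sub-Riemannian perimeter that this functional reduces to when $K$ is the closed unit disk can be written in the standard variational form below. This formulation is not quoted from the abstract; it is the usual definition of horizontal perimeter in $\mathbb{H}^{1}$, with $X_1, Y_1$ denoting the standard left-invariant horizontal vector fields.

```latex
P_{\mathbb{H}}(E) \;=\; \sup\Big\{ \int_{E} \operatorname{div} X \, d\mathcal{L}^{3}
  \;:\; X = f X_{1} + g Y_{1},\;\; f, g \in C^{1}_{c}(\mathbb{H}^{1}),\;\;
  \sqrt{f^{2} + g^{2}} \le 1 \Big\}
```

The sub-Finsler perimeter of the paper replaces the Euclidean constraint on the horizontal vector field by one adapted to the norm $\|\cdot\|_{K}$.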


2004
Vol 11 (4)
pp. 753-758
Author(s):
A. Kharazishvili

Abstract For a given σ-ideal of sets, the notion of a generalized stepfunction is introduced and investigated in connection with the problem of sup-measurability of certain functions of two variables, regarded as superposition operators.


2015
Vol 38
pp. 57-86
Author(s):
Rafael Martínez-Planell
Maria Trigueros Gaisman
Daniel McGee

2006
Vol 18 (10)
pp. 2509-2528
Author(s):
Yoshua Bengio
Martin Monperrus
Hugo Larochelle

We claim and present arguments to the effect that a large class of manifold learning algorithms that are essentially local and can be framed as kernel learning algorithms will suffer from the curse of dimensionality, at the dimension of the true underlying manifold. This observation invites an exploration of nonlocal manifold learning algorithms that attempt to discover shared structure in the tangent planes at different positions. A training criterion for such an algorithm is proposed, and experiments estimating a tangent plane prediction function are presented, showing its advantages with respect to local manifold learning algorithms: it is able to generalize very far from training data (on learning handwritten character image rotations), where local nonparametric methods fail.
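The "essentially local" estimators the abstract contrasts with can be caricatured by local PCA: the tangent plane at a query point is estimated from its nearest neighbours alone, so the estimate carries no information far from the training data. A minimal sketch of such a local estimator follows (the function name and parameters are illustrative, not from the paper):

```python
import numpy as np

def local_tangent_plane(points, query, k=10, dim=2):
    """Estimate a dim-dimensional tangent plane at `query` by local PCA.

    Takes the k nearest neighbours of `query` among `points`, centers
    them, and returns the leading right-singular vectors -- an
    orthonormal basis of the estimated tangent plane. This is the kind
    of purely local, neighbourhood-bound estimator that the paper
    argues suffers from the curse of dimensionality.
    """
    pts = np.asarray(points, dtype=float)
    dists = np.linalg.norm(pts - query, axis=1)
    nbrs = pts[np.argsort(dists)[:k]]          # k nearest neighbours
    centered = nbrs - nbrs.mean(axis=0)
    # Leading right-singular vectors span the best-fit dim-plane.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:dim]  # shape (dim, ambient_dim), rows orthonormal
```

A non-local algorithm of the kind the paper proposes would instead train a single function mapping any point to a predicted tangent plane, sharing structure across neighbourhoods rather than refitting from scratch at each query.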

