On the spectral radius of block graphs having all their blocks of the same size

Author(s): Cristian M. Conde, Ezequiel Dratman, Luciano N. Grippo

1989, Vol 15 (1), pp. 275
Author(s): NADKARNI, ROBERTSON

2017, Vol 60 (2), pp. 411-421
Author(s): Luchezar Stoyanov

Abstract We prove a comprehensive version of the Ruelle–Perron–Frobenius Theorem with explicit estimates of the spectral radius of the Ruelle transfer operator and of various other quantities related to the spectral properties of this operator. The novelty here is that the Hölder constant of the function generating the operator appears only polynomially, not exponentially as in previously known estimates.
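
For orientation (this definition is standard background, not quoted from the paper), the Ruelle transfer operator generated by a potential $f$ on a shift space with shift map $\sigma$ acts on functions $h$ by

\[
  (\mathcal{L}_f h)(x) \;=\; \sum_{\sigma(y) = x} e^{f(y)}\, h(y).
\]

For Hölder continuous $f$ on a mixing subshift of finite type, the classical Ruelle–Perron–Frobenius Theorem gives a simple, positive, maximal eigenvalue of $\mathcal{L}_f$ equal to its spectral radius; the estimates in the paper quantify this picture, and the Hölder constant referred to in the abstract is that of $f$.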


2021, Vol 9 (1), pp. 1-18
Author(s): Carolyn Reinhart

Abstract The distance matrix $\mathcal{D}(G)$ of a connected graph $G$ is the matrix containing the pairwise distances between vertices. The transmission of a vertex $v_i$ in $G$ is the sum of the distances from $v_i$ to all other vertices, and $T(G)$ is the diagonal matrix of transmissions of the vertices of the graph. The normalized distance Laplacian, $\mathcal{D}^{\mathcal{L}}(G) = I - T(G)^{-1/2}\mathcal{D}(G)T(G)^{-1/2}$, is introduced. This is analogous to the normalized Laplacian matrix, $\mathcal{L}(G) = I - D(G)^{-1/2}A(G)D(G)^{-1/2}$, where $D(G)$ is the diagonal matrix of degrees of the vertices of the graph and $A(G)$ is the adjacency matrix. Bounds on the spectral radius of $\mathcal{D}^{\mathcal{L}}$ and connections with the normalized Laplacian matrix are presented. Twin vertices are used to determine eigenvalues of the normalized distance Laplacian. The distance generalized characteristic polynomial is defined and its properties established. Finally, $\mathcal{D}^{\mathcal{L}}$-cospectrality and lack thereof are determined for all graphs on 10 and fewer vertices, providing evidence that the normalized distance Laplacian has fewer cospectral pairs than other matrices.
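
A minimal computational sketch (not from the paper) of forming $\mathcal{D}^{\mathcal{L}}(G)$ from the definition above and reading off its spectral radius; the choice of graph and the use of networkx's floyd_warshall_numpy for the distance matrix are illustrative assumptions.

import numpy as np
import networkx as nx

G = nx.petersen_graph()                        # any connected graph will do
Dist = nx.floyd_warshall_numpy(G)              # distance matrix D(G)
t = Dist.sum(axis=1)                           # transmissions: row sums of D(G)
T_inv_sqrt = np.diag(1.0 / np.sqrt(t))         # T(G)^{-1/2}
DL = np.eye(len(t)) - T_inv_sqrt @ Dist @ T_inv_sqrt   # normalized distance Laplacian

eigs = np.linalg.eigvalsh(DL)                  # DL is symmetric, so eigvalsh applies
print("spectral radius of D^L:", np.max(np.abs(eigs)))

Since $\mathcal{D}^{\mathcal{L}}$ is symmetric and positive semidefinite, its spectral radius is simply its largest eigenvalue; $0$ is always an eigenvalue, with eigenvector $T(G)^{1/2}\mathbf{1}$.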


2021, Vol 2021 (1)
Author(s): Adisorn Kittisopaporn, Pattrawut Chansangiam, Wicharn Lewkeeratiyutkul

Abstract We derive an iterative procedure for solving a generalized Sylvester matrix equation $AXB + CXD = E$, where $A, B, C, D, E$ are conforming rectangular matrices. Our algorithm is based on gradients and the hierarchical identification principle. We convert the matrix iteration process to a first-order linear difference vector equation with a matrix coefficient. The Banach contraction principle reveals that the sequence of approximated solutions converges to the exact solution for any initial matrix if and only if the convergence factor belongs to an open interval. The contraction principle also gives the convergence rate and the error analysis, governed by the spectral radius of the associated iteration matrix. We obtain the fastest convergence factor so that the spectral radius of the iteration matrix is minimized. In particular, we obtain iterative algorithms for the matrix equation $AXB = C$, the Sylvester equation, and the Kalman–Yakubovich equation. We give numerical experiments of the proposed algorithm to illustrate its applicability, effectiveness, and efficiency.
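
A minimal sketch of a gradient-based iteration of this general type for $AXB + CXD = E$: a plain gradient-descent step on the Frobenius-norm residual, with a deliberately conservative step size. It is not the authors' exact algorithm, and in particular it does not use their optimized (fastest) convergence factor.

import numpy as np

rng = np.random.default_rng(0)
m, n, p, q = 4, 3, 3, 5
A, B = rng.standard_normal((m, n)), rng.standard_normal((p, q))
C, D = rng.standard_normal((m, n)), rng.standard_normal((p, q))
X_true = rng.standard_normal((n, p))
E = A @ X_true @ B + C @ X_true @ D          # build a consistent right-hand side

# Conservative convergence factor: 1 / (||A|| ||B|| + ||C|| ||D||)^2 bounds the
# Lipschitz constant of the residual gradient from above, so the iteration is
# monotone; the paper instead derives the factor minimizing the spectral
# radius of the iteration matrix.
mu = 1.0 / (np.linalg.norm(A, 2) * np.linalg.norm(B, 2)
            + np.linalg.norm(C, 2) * np.linalg.norm(D, 2)) ** 2

X = np.zeros((n, p))                         # any initial matrix
for k in range(5000):
    R = E - A @ X @ B - C @ X @ D            # current residual
    X = X + mu * (A.T @ R @ B.T + C.T @ R @ D.T)   # gradient step

print("final residual norm:", np.linalg.norm(E - A @ X @ B - C @ X @ D))

For a consistent system the residual shrinks toward zero as the iteration proceeds; the interval of admissible convergence factors and the fastest factor are part of the paper's analysis and are not reproduced by this sketch.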

