Analysis of sensitivity of reciprocal matrices (Q798731)

From MaRDI portal
Property / full work available at URL: https://doi.org/10.1016/0096-3003(83)90044-9
Property / OpenAlex ID: W1973529367
Property / cites work: Q3849175
Property / cites work: Symmetry and the crossing number for complete graphs
Property / cites work: A scaling method for priorities in hierarchical structures
Property / cites work: Q3044280


Language: English
Label: Analysis of sensitivity of reciprocal matrices
Description: scientific article

    Statements

    Analysis of sensitivity of reciprocal matrices (English)
    1983
    The author is concerned with the relationship between the principal right eigenvector of a given reciprocal matrix and that of its perturbation by Hadamard (elementwise) multiplication with another reciprocal matrix. Earlier studies considered additive perturbations. A reciprocal matrix A is one whose elements satisfy \(a_{ij}=1/a_{ji}\). A reciprocal matrix is consistent if its elements satisfy the further restriction \(a_{ij}a_{jk}=a_{ik}\) for all i, j, k. The Perron-Frobenius theorem applies to reciprocal matrices (their entries are positive), so each has a unique (up to scaling) right eigenvector corresponding to the largest positive real eigenvalue. The author's main result is paraphrased in the following Theorem. Let A denote a consistent reciprocal matrix, P a reciprocal matrix, and \(A\circ P\) their Hadamard product. If a and p are the principal right eigenvectors of A and P respectively, then the principal right eigenvector of \(A\circ P\) is \(a\circ p\). If A and P are \(3\times 3\) matrices, the same result holds for arbitrary reciprocal A. To establish this, the author first proves the following Theorem. Let A be a reciprocal matrix. Then A can be uniquely decomposed as \(A=H\circ E\), where H is a consistent reciprocal matrix with the same principal right eigenvector as A, and E is a reciprocal matrix with the same largest eigenvalue as A. These results are illustrated by examples and extended to sequences of perturbations.
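    A quick numerical check of the main theorem can be sketched as follows. This is not from the paper; the weight vector and the perturbation matrix P below are made-up examples, and the eigenvectors are computed numerically with NumPy. A consistent reciprocal matrix A is built from a positive weight vector w via \(a_{ij}=w_i/w_j\), so its principal right eigenvector is proportional to w.

```python
import numpy as np

def principal_eigvec(M):
    """Right eigenvector for the eigenvalue of largest real part
    (the Perron root for a positive matrix), normalized to sum 1."""
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    v = np.abs(vecs[:, k].real)
    return v / v.sum()

# Consistent reciprocal matrix A from a (made-up) weight vector w: a_ij = w_i / w_j.
w = np.array([1.0, 2.0, 4.0])
A = np.outer(w, 1.0 / w)

# An arbitrary reciprocal (not necessarily consistent) perturbation P: p_ij = 1 / p_ji.
P = np.array([[1.0, 2.0, 1/3],
              [0.5, 1.0, 5.0],
              [3.0, 0.2, 1.0]])

a = principal_eigvec(A)
p = principal_eigvec(P)
ap = principal_eigvec(A * P)          # NumPy's * is the Hadamard product A ∘ P

# The theorem predicts the principal eigenvector of A ∘ P is (a ∘ p), up to scaling.
predicted = (a * p) / (a * p).sum()
print(np.allclose(ap, predicted))     # True
```

    The underlying reason is visible in the construction: with \(D=\mathrm{diag}(w)\), the matrix \(A\circ P\) equals \(DPD^{-1}\), so it is similar to P and its Perron eigenvector is \(w\circ p\).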
    analysis of sensitivity
    analytic hierarchy process
    reciprocal matrix
    Hadamard product
    consistency
    principal right eigenvectors
