Discontinuity at fixed points with applications (Q2295716)
Language | Label | Description | Also known as
---|---|---|---
English | Discontinuity at fixed points with applications | scientific article |
Statements
Discontinuity at fixed points with applications (English)
14 February 2020
Let \((X,d)\) be a complete metric space and \(T:X\to X\) a self-map. For \(u,v\in X\), define
\[N(u,v)=\max\{d(u,v),\,d(u,Tu),\,d(v,Tv),\,A(u,v),\,B(u,v)\},\]
where
\[A(u,v)=\frac{d(v,Tv)\,(1+d(u,Tu))}{1+d(u,v)},\qquad B(u,v)=\frac{d(u,Tu)\,(1+d(v,Tv))}{1+d(u,v)}.\]
The main result of the paper is the following.

Theorem. Suppose there exists a function \(\phi:\mathbb{R}_+\to \mathbb{R}_+\) with \(\phi(t)< t\) for each \(t> 0\) such that

(1) \(d(Tu,Tv)\le \phi(N(u,v))\) for all \(u,v\in X\);
(2) for each \(\varepsilon > 0\) there exists \(\delta> 0\) such that \(\varepsilon< N(u,v)< \varepsilon +\delta\) implies \(d(Tu,Tv)\le \varepsilon\).

Then

(i) \(T\) has a unique fixed point \(u^*\in X\), and \(T^nu\to u^*\) for each \(u\in X\);
(ii) \(T\) is discontinuous at \(u^*\) if and only if \(\lim_{u\to u^*} N(u,u^*)\ne 0\).

A variant of this result for complex-valued metric spaces is also established, and an application is given to discontinuous activation functions in real- and complex-valued neural networks.
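To see the theorem at work, here is a minimal Python sketch built around a toy map of the kind common in this literature; the map, the metric, and the choice of \(\phi\) are illustrative assumptions of this summary, not an example taken from the paper. Take \(X=[0,2]\) with the usual metric and \(Tx=1\) for \(x\le 1\), \(Tx=0\) for \(x>1\); one can check that conditions (1) and (2) hold with, e.g., \(\phi(t)=(t+1)/2\) for \(t>1\) and \(\phi(t)=t/2\) otherwise, while \(T\) is discontinuous at its unique fixed point \(u^*=1\).

```python
def d(x: float, y: float) -> float:
    """Usual metric on the real line (an assumed concrete choice of (X, d))."""
    return abs(x - y)

def T(x: float) -> float:
    """Hypothetical discontinuous self-map of X = [0, 2]; unique fixed point u* = 1."""
    return 1.0 if x <= 1.0 else 0.0

def N(u: float, v: float) -> float:
    """N(u,v) = max{d(u,v), d(u,Tu), d(v,Tv), A(u,v), B(u,v)} as in the review."""
    A = d(v, T(v)) * (1 + d(u, T(u))) / (1 + d(u, v))
    B = d(u, T(u)) * (1 + d(v, T(v))) / (1 + d(u, v))
    return max(d(u, v), d(u, T(u)), d(v, T(v)), A, B)

# Conclusion (i): the Picard iterates T^n(u0) reach u* = 1 from any start.
for u0 in (0.0, 0.3, 1.7, 2.0):
    u = u0
    for _ in range(5):
        u = T(u)
    print(f"T^5({u0}) = {u}")               # 1.0 in every case

# Conclusion (ii): T is discontinuous at u* = 1, and correspondingly N(u, u*)
# does not tend to 0 as u -> u* (from the right, N(u, 1) >= d(u, Tu) = u > 1).
for u in (1.1, 1.01, 1.001):
    print(f"N({u}, 1) = {N(u, 1.0):.4f}")   # stays near 1, not 0
```

The two loops exhibit the theorem's two conclusions side by side: the iterates settle at \(u^*=1\) from every starting point, while \(N(u,1)\) stays near \(1\) as \(u\to 1^+\), matching criterion (ii) for discontinuity at the fixed point.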
fixed point
fixed circle
discontinuity
activation function
neural network