Privacy preserving multi-party computation delegation for deep learning in cloud computing
From MaRDI portal
Publication:2198099
Recommendations
- Privacy-preserving distributed deep learning based on secret sharing
- Multiparty homomorphic machine learning with data security and model preservation
- SPEED: secure, private, and efficient deep learning
- Maliciously secure matrix multiplication with applications to private deep learning
- An optimal \((\epsilon, \delta)\)-differentially private learning of distributed deep fuzzy models
Cites work
- scientific article; zbMATH DE number 6378127 (no title available)
- scientific article; zbMATH DE number 2009971 (no title available)
- A public key cryptosystem and a signature scheme based on discrete logarithms
- Differential privacy and robust statistics
- Fully homomorphic encryption using ideal lattices
- ML confidential: machine learning on encrypted data
- Monte Carlo Methods for Index Computation (mod p)
- New publicly verifiable computation for batch matrix multiplication
- On-the-fly multiparty computation on the cloud via multikey fully homomorphic encryption
- Preserving differential privacy in convolutional deep belief networks
- Public Integrity Auditing for Shared Dynamic Cloud Data with Group User Revocation
- Short signatures from the Weil pairing
- Theory of Cryptography
- Two round multiparty computation via multi-key FHE
- Verifiable Computation over Large Database with Incremental Updates
Cited in (16 documents)
- Enhancing privacy preservation and trustworthiness for decentralized federated learning
- Multiparty homomorphic machine learning with data security and model preservation
- Survey on privacy preserving techniques for machine learning
- SecureBiNN: 3-party secure computation for binarized neural network inference
- Privacy-preserving distributed deep learning based on secret sharing
- Privacy-preserving computation in cyber-physical-social systems: a survey of the state-of-the-art and perspectives
- Verifiable inner product computation on outsourced database for authenticated multi-user data sharing
- SPEED: secure, private, and efficient deep learning
- An optimal \((\epsilon, \delta)\)-differentially private learning of distributed deep fuzzy models
- On-demand privacy preservation for cost-efficient edge intelligence model training
- Preserving differential privacy in deep neural networks with relevance-based adaptive noise imposition
- Preserving differential privacy in convolutional deep belief networks
- A mechanism design approach for multi-party machine learning
- Differential privacy protection method for deep learning based on WGAN feedback
- Privacy-preserving distributed machine learning based on secret sharing
- SecureTLM: private inference for transformer-based large model with MPC