Page 82 - ITU Journal, ICT Discoveries, Volume 3, No. 1, June 2020 Special issue: The future of video and immersive media
[12] G. Castellano, A. M. Fanelli, and M. Pelillo. An iterative pruning algorithm for feedforward neural networks. IEEE Transactions on Neural Networks, 8(3):519–531, 1997.

[13] J. Chen, X. Pan, R. Monga, S. Bengio, and R. Jozefowicz. Revisiting distributed synchronous SGD. arXiv preprint arXiv:1604.00981, 2016.

[14] X. Chen, J. Ji, C. Luo, W. Liao, and P. Li. When machine learning meets blockchain: A decentralized, privacy-preserving and secure design. In 2018 IEEE International Conference on Big Data (Big Data), pages 1178–1187. IEEE, 2018.

[15] Y. Chen, T. Chen, Z. Xu, N. Sun, and O. Temam. DianNao family: Energy-efficient hardware accelerators for machine learning. Communications of the ACM, 59(11):105–112, 2016.

[16] Y. Chen, L. Su, and J. Xu. Distributed statistical machine learning in adversarial settings: Byzantine gradient descent. Proceedings of the ACM on Measurement and Analysis of Computing Systems, 1(2):44, 2017.

[17] Y. Choi, M. El-Khamy, and J. Lee. Towards the limit of network quantization. arXiv preprint arXiv:1612.01543, 2016.

[18] Y. Choi, M. El-Khamy, and J. Lee. Universal deep neural network compression. arXiv preprint arXiv:1802.02271, 2018.

[19] T. Choudhary, V. Mishra, A. Goswami, and J. Sarangapani. A comprehensive survey on model compression and acceleration. Artificial Intelligence Review, pages 1–43.

[20] R. Creemers. Cybersecurity law of the People's Republic of China (third reading draft). China Copyright and Media, 2016.

[21] C. M. De Sa, C. Zhang, K. Olukotun, and C. Ré. Taming the wild: A unified analysis of Hogwild-style algorithms. In Advances in Neural Information Processing Systems, pages 2674–2682, 2015.

[22] J. Dean and S. Ghemawat. MapReduce: Simplified data processing on large clusters. Communications of the ACM, 51(1):107–113, 2008.

[23] C. Dwork, A. Roth, et al. The algorithmic foundations of differential privacy. Foundations and Trends® in Theoretical Computer Science, 9(3–4):211–407, 2014.

[24] R. C. Geyer, T. Klein, and M. Nabi. Differentially private federated learning: A client level perspective. arXiv preprint arXiv:1712.07557, 2017.

[25] O. Goldreich. Secure multi-party computation. Manuscript. Preliminary version, 78, 1998.

[26] A. Graves and J. Schmidhuber. Framewise phoneme classification with bidirectional LSTM and other neural network architectures. Neural Networks, 18(5-6):602–610, 2005.

[27] S. Han, H. Mao, and W. J. Dally. Deep compression: Compressing deep neural networks with pruning, trained quantization and Huffman coding. arXiv preprint arXiv:1510.00149, 2015.

[28] S. Han, J. Pool, J. Tran, and W. Dally. Learning both weights and connections for efficient neural network. In Advances in Neural Information Processing Systems, pages 1135–1143, 2015.

[29] B. Hassibi and D. G. Stork. Second order derivatives for network pruning: Optimal brain surgeon. In Advances in Neural Information Processing Systems, pages 164–171, 1993.

[30] G. Hinton, L. Deng, D. Yu, G. E. Dahl, A.-r. Mohamed, N. Jaitly, A. Senior, V. Vanhoucke, P. Nguyen, T. N. Sainath, et al. Deep neural networks for acoustic modeling in speech recognition: The shared views of four research groups. IEEE Signal Processing Magazine, 29(6):82–97, 2012.

[31] G. Hinton, O. Vinyals, and J. Dean. Distilling the knowledge in a neural network. arXiv preprint arXiv:1503.02531, 2015.

[32] B. Hitaj, G. Ateniese, and F. Perez-Cruz. Deep models under the GAN: Information leakage from collaborative deep learning. In Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security, pages 603–618. ACM, 2017.

[33] K. Hwang. Cloud Computing for Machine Learning and Cognitive Applications. MIT Press, 2017.

[34] M. Ibnkahla. Applications of neural networks to digital communications – a survey. Signal Processing, 80(7):1185–1215, 2000.

[35] N. Ivkin, D. Rothchild, E. Ullah, V. Braverman, I. Stoica, and R. Arora. Communication-efficient distributed SGD with sketching. arXiv preprint arXiv:1903.04488, 2019.

[36] Y. Jiang, J. Konečnỳ, K. Rush, and S. Kannan. Improving federated learning personalization via model agnostic meta learning. arXiv preprint arXiv:1909.12488, 2019.

[37] P. Kairouz, H. B. McMahan, B. Avent, A. Bellet, M. Bennis, A. N. Bhagoji, K. Bonawitz, Z. Charles, G. Cormode, R. Cummings, et al. Advances and open problems in federated learning. arXiv preprint arXiv:1912.04977, 2019.
© International Telecommunication Union, 2020