Generative Adversarial Neural Networks and Deep Learning: Successful Cases and Advanced Approaches


  • Oleksandr Striuk
  • Yuriy Kondratenko



Keywords: generative adversarial network, neural networks, deep learning, machine learning, artificial intelligence


Abstract

Cross-domain artificial intelligence (AI) frameworks are key to amplifying progress in science. Cutting-edge deep learning methods offer novel opportunities for retrieving, optimizing, and improving different data types, and AI techniques provide new ways of enhancing and refining the models used in applied sciences. Recent breakthroughs in generative adversarial neural networks (GANNs/GANs) and deep learning make it possible to drastically increase the quality of diverse graphic samples obtained with research equipment. These innovative approaches can be combined into a unified academic and technological pipeline that radically elevates and accelerates scientific research and development. The authors analyze a number of successful cases of GAN and deep learning applications in applied scientific fields (including observational astronomy, health care, materials science, deepfakes, bioinformatics, and typography) and discuss advanced approaches for increasing GAN and DL efficiency in terms of performance calibration using modified data samples, algorithmic enhancements, and various hybrid methods of optimization.
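The adversarial setup surveyed here rests on the minimax value function of Goodfellow et al. (cited below), V(D, G) = E[log D(x)] + E[log(1 − D(G(z)))]. A toy numerical check, not taken from the paper (the Gaussian data and the two hand-picked discriminators are illustrative assumptions), shows the known equilibrium property: once the generator matches the data distribution, no discriminator beats the constant D(x) = 1/2, which attains V = −log 4.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def value(d, real, fake):
    """Empirical GAN value V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))]."""
    return np.mean(np.log(d(real))) + np.mean(np.log(1.0 - d(fake)))

# A generator that already matches the data: both sets are N(3, 1) samples.
real = rng.normal(3.0, 1.0, size=50_000)
fake = rng.normal(3.0, 1.0, size=50_000)

blind = lambda x: np.full_like(x, 0.5)                  # D(x) = 1/2 everywhere
confident = lambda x: 1.0 / (1.0 + np.exp(-(x - 3.0)))  # logistic D centered on the mean

v_blind = value(blind, real, fake)          # exactly -2 log 2 ~ -1.386
v_confident = value(confident, real, fake)  # strictly lower: guessing cannot help

print(round(v_blind, 3), v_confident < v_blind)
```

The second line of output being true is the point: at p_g = p_data the "confident" discriminator only loses value, which is why GAN training drives D toward 1/2 as the generator improves.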


References

S. Russell, P. Norvig, Artificial Intelligence: A Modern Approach, 3rd ed., Upper Saddle River, New Jersey: Prentice Hall, 2009.

P. Barmby, Astronomical observations: a guide for allied researchers, 2019, [Online]. Available at:

E. C. Sutton, Observational Astronomy. Techniques and Instrumentation, Cambridge University Press, 2011.

J. Lee, P. L. Freddolino, Y. Zhang, Ab initio protein structure prediction, in: D. J. Rigden (ed.), From Protein Structure to Function with Bioinformatics, 2017, pp. 1–33.

DeepMind, “AlphaFold: Using AI for scientific discovery,” Nature, vol. 577, pp. 706–710, 2020.

S. Roberts, A. McQuillan, S. Reece, S. Aigrain, “Astrophysically robust systematics removal using variational inference: application to the first month of Kepler data,” Monthly Notices of the Royal Astronomical Society, vol. 435, pp. 3639–3653, 2013.

K. Schawinski, C. Zhang, H. Zhang, L. Fowler, G. K. Santhanam, “Generative adversarial networks recover features in astrophysical images of galaxies beyond the deconvolution limit,” Monthly Notices of the Royal Astronomical Society: Letters, vol. 467, issue 1, pp. 110–114, 2017.

P. Magain, F. Courbin, S. Sohy, “Deconvolution with Correct Sampling,” The Astrophysical Journal, pp. 472–477, 1998.

I. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, Y. Bengio, “Generative Adversarial Networks,” Proceedings of the International Conference on Neural Information Processing Systems (NIPS), 2014, pp. 2672–2680.

A. D’Isanto, K. L. Polsterer, “Photometric redshift estimation via deep learning. Generalized and pre-classification-less, image based, fully probabilistic redshifts,” Astronomy & Astrophysics, vol. 609, A111, pp. 1–16, 2018.

G. Lample, N. Zeghidour, N. Usunier, A. Bordes, L. Denoyer, M. Ranzato, “Fader networks: Manipulating images by sliding attributes,” Proceedings of 31st Conference on Neural Information Processing Systems, USA, 2017, pp. 1–10.

K. Schawinski, M. D. Turp, C. Zhang, “Exploring galaxy evolution with generative models,” Astronomy & Astrophysics, vol. 616, L16, pp. 1–4, 2018.

M. J. Smith, J. E. Geach, “Generative deep fields: arbitrarily sized, random synthetic astronomical images through deep learning,” Monthly Notices of the Royal Astronomical Society, vol. 490, issue 4, pp. 4985–4990, 2019.

N. Jetchev, U. Bergmann, R. Vollgraf, “Texture Synthesis with Spatial Generative Adversarial Networks,” Proceedings of Workshop on Adversarial Training, NIPS’2016, Barcelona, Spain, 2016, pp. 1–11.

R. Hausen, B. Robertson, Morpheus: A Deep Learning Framework for Pixel-Level Analysis of Astronomical Image Data, 2019, [Online]. Available at:

O. Ronneberger, P. Fischer, T. Brox, “U-Net: Convolutional networks for biomedical image segmentation,” Proceedings of the 18th International Conference Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, October 5-9, 2015, pp. 1–8.

A. Zhavoronkov, Y. A. Ivanenkov, A. Aliper et al., “Deep learning enables rapid identification of potent DDR1 kinase inhibitors,” Nature Biotechnology, vol. 37, pp. 1038–1040, 2019.

O. Méndez-Lucio, B. Baillif, D.-A. Clevert, D. Rouquié, J. Wichard, De novo generation of hit-like molecules from gene expression signatures using artificial intelligence, 2020.

J. Schmidt, M. R. G. Marques, S. Botti et al., Recent advances and applications of machine learning in solid-state materials science, 2019.

A. Nouira, N. Sokolovska, J.-C. Crivello, “CrystalGAN: Learning to discover crystallographic structures with generative adversarial networks,” Proceedings of the AAAI Spring Symposium: Combining Machine Learning with Knowledge Engineering, Stanford University, USA, March 25-27, 2019, pp. 1–9.

T. T. Nguyen, C. M. Nguyen, D. T. Nguyen, D. T. Nguyen, S. Nahavandi, Deep Learning for Deepfakes Creation and Detection: A Survey, 2020, [Online]. Available at:

L. Guarnera, O. Giudice, S. Battiato, “DeepFake detection by analyzing convolutional traces,” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 2020, pp. 1–10.

L. Lan, L. You, Z. Zhang, Z. Fan, W. Zhao, N. Zeng, Y. Chen, X. Zhou, Generative Adversarial Networks and its Applications in Biomedical Informatics, 2020.

J.-Y. Zhu, T. Park, P. Isola, A. A. Efros, Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks, 2020, [Online]. Available at:

J. M. Wolterink et al., Deep MR to CT Synthesis Using Unpaired Data, in: S. Tsaftaris, A. Gooya, A. Frangi, J. Prince (eds), Simulation and Synthesis in Medical Imaging, SASHIMI 2017, Lecture Notes in Computer Science, vol. 10557, Springer, Cham, 2017, pp. 14–23.

W. Li, Y. Wang, Y. Cai, C. Arnold, E. Zhao, Y. Yuan, Semi-supervised Rare Disease Detection Using Generative Adversarial Network, 2018, [Online]. Available at:

M. Marouf, P. Machart, V. Bansal, C. Kilian, D. S. Magruder, C. F. Krebs, S. Bonn, Realistic in silico generation and augmentation of single-cell RNA-seq data using generative adversarial networks, 2020.

O. Striuk, Y. Kondratenko, I. Sidenko, A. Vorobyova, “Generative adversarial neural network for creating photorealistic images,” Proceedings of the 2020 IEEE 2nd International Conference on Advanced Trends in Information Theory, Kyiv, Ukraine, November 27, 2020, pp. 1–4.

M. C. Chan, J. P. Stott, “Deep-CEE I: Fishing for galaxy clusters with deep neural nets,” Monthly Notices of the Royal Astronomical Society, vol. 490, pp. 5770–5787, 2019.

Z. L. Wen, J. L. Han, F. S. Liu, “A Catalog of 132,684 clusters of galaxies identified from sloan digital sky survey III,” The Astrophysical Journal Supplement, vol. 199, issue 2, article id. 34, pp. 1–12, 2012.

L. Fussell, B. Moews, “Forging new worlds: high-resolution synthetic galaxies with chained generative adversarial networks,” Monthly Notices of the Royal Astronomical Society, vol. 485, issue 3, pp. 3203–3214, 2019.

T. Karras, M. Aittala, J. Hellsten, S. Laine, J. Lehtinen, T. Aila, “Training generative adversarial networks with limited data,” Proceedings of the 34th Conference on Neural Information Processing Systems (NeurIPS 2020), Vancouver, Canada, June 2020, pp. 1–37.

H. Zhang, T. Xu, H. Li, S. Zhang, X. Wang, X. Huang, D. Metaxas, “StackGAN: Text to photo-realistic image synthesis with stacked generative adversarial networks,” Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 2017, pp. 5908–5916.

E. D. Cubuk, B. Zoph, J. Shlens, Q. V. Le, RandAugment: Practical automated data augmentation with a reduced search space, 2019, [Online]. Available at:

N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, R. Salakhutdinov, “Dropout: A Simple Way to Prevent Neural Networks from Overfitting,” Journal of Machine Learning Research, vol. 15, pp. 1929–1958, 2014.

D. P. Kingma, J. L. Ba, “Adam: A method for stochastic optimization,” Proceedings of the 3rd International Conference for Learning Representations, San Diego, USA, 2015, pp. 1–15.

R. Leizerovych, G. Kondratenko, I. Sidenko, Y. Kondratenko, “IoT-complex for monitoring and analysis of motor highway condition using artificial neural networks,” Proceedings of the 2020 IEEE 11th International Conference on Dependable Systems, Services and Technologies, DESSERT 2020, Kyiv, Ukraine, 14–18 May 2020, article no. 9125004, pp. 207–212.

K. Ivanova, G. Kondratenko, I. Sidenko, Y. Kondratenko, “Artificial intelligence in automated system for web-interfaces visual testing,” CEUR Workshop Proceedings, vol. 2604, 4th International Conference on Computational Linguistics and Intelligent Systems, COLINS 2020, Lviv, Ukraine, 2020, pp. 1019–1031.

V. M. Kuntsevich et al. (Eds), Control Systems: Theory and Applications. Series in Automation, Control and Robotics, River Publishers, 2018, 146 p.

Y. Kondratenko, D. Simon, Structural and parametric optimization of fuzzy control and decision making systems, In: Zadeh L., Yager R., Shahbazova S., Reformat M., Kreinovich V. (eds), Recent Developments and the New Direction in Soft-Computing Foundations and Applications. Studies in Fuzziness and Soft Computing, Springer, Cham., vol. 361, 2018, pp. 273–289.

Z. Gomolka, E. Dudek-Dyduch, Y. P. Kondratenko, “From homogeneous network to neural nets with fractional derivative mechanism,” Proceedings of the International Conference on Artificial Intelligence and Soft Computing, ICAISC-2017, Rutkowski, L. et al. (Eds), Part I, Zakopane, Poland, 11-15 June, 2017, LNAI 10245, Springer, Cham, 2017, pp. 52–63.

J. Brownlee, 18 Impressive Applications of Generative Adversarial Networks (GANs), 2019, [Online]. Available at:




How to Cite

Striuk, O., & Kondratenko, Y. (2021). Generative Adversarial Neural Networks and Deep Learning: Successful Cases and Advanced Approaches. International Journal of Computing, 20(3), 339–349.