Julia language in machine learning: algorithms, applications, and open issues. (English) Zbl 1478.68285

Summary: Machine learning is driving development across many fields of science and engineering. A simple and efficient programming language could accelerate the application of machine learning in these fields. Currently, the programming languages most commonly used to develop machine learning algorithms include Python, MATLAB, and C/C++. However, none of these languages balances efficiency and simplicity well. The Julia language is a fast, easy-to-use, open-source programming language, originally designed for high-performance computing, that achieves this balance. This paper summarizes related research work and developments in applications of the Julia language in machine learning. It first surveys the popular machine learning algorithms developed in the Julia language. It then investigates applications of machine learning algorithms implemented in the Julia language. Finally, it discusses the open issues and potential future directions that arise from the use of the Julia language in machine learning.
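The balance of efficiency and simplicity claimed for Julia rests on its design: generic, dynamically typed functions are compiled just-in-time through LLVM into machine code specialized for each concrete argument type. A minimal, hypothetical sketch (not taken from the paper surveyed here) of that idea:

```julia
# One generic definition; Julia's JIT compiles a separate, type-specialized
# native-code method instance the first time each argument type is seen.
relu(x) = max(zero(x), x)   # zero(x) keeps the result in x's own type

relu(-2)                    # Int method instance
relu(3.5)                   # Float64 method instance
relu.([-2.0, 0.5, 3.0])     # dot syntax broadcasts it elementwise over an array
```

The same source thus serves both rapid prototyping (MATLAB/Python-style syntax) and performance-critical loops, without rewriting hot paths in C.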


68T05 Learning and adaptive systems in artificial intelligence
68N15 Theory of programming languages
68T07 Artificial neural networks and deep learning
68-02 Research exposition (monographs, survey articles) pertaining to computer science
Full Text: DOI arXiv


[1] Jordan, M. I.; Mitchell, T. M., Machine learning: Trends, perspectives, and prospects, Science, 349, 6245, 255-260 (2015) · Zbl 1355.68227
[2] Deo, R. C., Machine learning in medicine, Circulation, 132, 20, 1920-1930 (2015)
[3] Domingos, P., A few useful things to know about machine learning, Commun. ACM, 55, 10, 78-87 (2012)
[4] Riley, P., Three pitfalls to avoid in machine learning, Nature, 572, 7767, 27-29 (2019)
[5] LeCun, Y.; Bengio, Y.; Hinton, G., Deep learning, Nature, 521, 7553, 436-444 (2015)
[6] Mjolsness, E.; DeCoste, D., Machine learning for science: State of the art and future prospects, Science, 293, 5537, 2051 (2001)
[7] Serrano, E.; Garcia Blas, J.; Carretero, J.; Abella, M.; Desco, M., Medical imaging processing on a big data platform using Python: Experiences with heterogeneous and homogeneous architectures, (2017 17th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGRID) (2017)), 830-837
[8] Voulgaris, Z., Julia for Data Science (2016), Technics Publications, LLC
[9] Dinari, O.; Yu, A.; Freifeld, O.; Fisher, J., Distributed MCMC inference in Dirichlet process mixture models using Julia, (2019 19th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGRID) (2019)), 518-525
[10] Bezanson, J.; Edelman, A.; Karpinski, S.; Shah, V. B., Julia: A fresh approach to numerical computing, SIAM Rev., 59, 1, 65-98 (2017) · Zbl 1356.68030
[11] Perkel, J. M., Julia: Come for the syntax, stay for the speed, Nature, 572, 7767, 141-142 (2019)
[12] Julia Micro-Benchmarks. URL https://julialang.org/benchmarks/.
[13] Lattner, C.; Adve, V., LLVM: A compilation framework for lifelong program analysis and transformation, (International Symposium on Code Generation and Optimization (CGO) (2004)), 75-86
[14] Huo, Z.; Mei, G.; Casolla, G.; Giampaolo, F., Designing an efficient parallel spectral clustering algorithm on multi-core processors in Julia, J. Parallel Distrib. Comput., 138, 211-221 (2020)
[15] Besard, T.; Foket, C.; De Sutter, B., Effective extensible programming: unleashing Julia on GPUs, IEEE Trans. Parallel Distrib. Syst., 30, 4, 827-841 (2018)
[16] R. Huang, W. Xu, Y. Wang, S. Liverani, A.E. Stapleton, Performance comparison of Julia distributed implementations of Dirichlet process mixture models, in: Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019, pp. 3350-3354. http://dx.doi.org/10.1109/BigData47090.2019.9005453.
[17] Ruthotto, L.; Treister, E.; Haber, E., jInv - A flexible Julia package for PDE parameter estimation, SIAM J. Sci. Comput., 39, 5, S702-S722 (2017) · Zbl 1373.86013
[18] Penny, W. D.; Kilner, J.; Blankenburg, F., Robust Bayesian general linear models, Neuroimage, 36, 3, 661-671 (2007)
[19] Frigola, R., Bayesian Time Series Learning with Gaussian Processes (2015), University of Cambridge
[20] Strickland, C.; Burdett, R.; Mengersen, K.; Denham, R., PySSM: A Python module for Bayesian inference of linear Gaussian state space models, J. Stat. Softw. Artic., 57, 6, 1-37 (2014)
[21] Mertens, U. K.; Voss, A.; Radev, S., ABrox - A user-friendly Python module for approximate Bayesian computation with a focus on model comparison, PLoS One, 13, 3 (2018)
[22] Toosi, H.; Moeini, A.; Hajirasouliha, I., BAMSE: Bayesian model selection for tumor phylogeny inference among multiple samples, BMC Bioinformatics, 20 (2019)
[23] Luttinen, J., BayesPy: Variational Bayesian inference in Python, J. Mach. Learn. Res., 17, 1-6 (2016), https://dl.acm.org/doi/10.5555/2946645.2946686 · Zbl 1360.62101
[24] Patil, A.; Huard, D.; Fonnesbeck, C. J., PyMC: Bayesian stochastic modelling in Python, J. Stat. Softw., 35, 4, 1-81 (2010)
[25] Vanhatalo, J.; Riihimäki, J.; Hartikainen, J.; Jylänki, P.; Tolvanen, V.; Vehtari, A., Bayesian modeling with Gaussian processes using the MATLAB toolbox GPstuff (v3.3), Statistics (2012)
[26] Zhang, L.; Agravat, S.; Derado, G.; Chen, S.; McIntosh, B. J.; Bowman, F. D., BSMac: A MATLAB toolbox implementing a Bayesian spatial model for brain activation and connectivity, J. Neurosci. Methods, 204, 1, 133-143 (2012)
[27] Cusumano-Towner, M. F.; Mansinghka, V. K., A design proposal for Gen: Probabilistic programming with fast custom inference via code generation, (Proceedings of the 2nd ACM SIGPLAN International Workshop on Machine Learning and Programming Languages (MAPL) (2018)), 57
[28] Cox, M.; van de Laar, T.; de Vries, B., A factor graph approach to automated design of Bayesian signal processing algorithms, Internat. J. Approx. Reason., 104, 185-204 (2019) · Zbl 07025914
[29] Bergstra, J.; Komer, B.; Eliasmith, C.; Yamins, D.; Cox, D. D., Hyperopt: A Python library for model selection and hyperparameter optimization, Comput. Sci. Discovery, 8, 1 (2015)
[30] NearestNeighbors.jl. URL https://github.com/KristofferC/NearestNeighbors.jl.
[31] Breiman, L., Random forests, Mach. Learn., 45, 5-32 (2001) · Zbl 1007.68152
[32] Ho, T. K., Random decision forests, (Proceedings of 3rd International Conference on Document Analysis and Recognition, Vol. 1 (1995)), 278-282
[33] Zhou, Y.; Gallins, P., A review and tutorial of machine learning methods for microbiome host trait prediction, Front. Genet., 10, 579 (2019)
[34] Upadhyay, A.; Shetty, A.; Kumar Singh, S.; Siddiqui, Z., Land use and land cover classification of LISS-III satellite image using KNN and decision tree, (2016 3rd International Conference on Computing for Sustainable Global Development (INDIACom) (2016)), 1277-1280, https://ieeexplore.ieee.org/document/7724471
[35] Keck, T., FastBDT: A speed-optimized and cache-friendly implementation of stochastic gradient-boosted decision trees for multivariate classification (2016), arXiv:1609.06119
[36] Yang, F.; Han, X.; Lang, J.; Lu, W.; Liu, L.; Zhang, L.; Pan, J., Commodity recommendation for users based on e-commerce data, (Proceedings of the 2018 2nd International Conference on Big Data Research (2018)), 146-149
[37] Seiferling, I.; Naik, N.; Ratti, C.; Proulx, R., Green streets - Quantifying and mapping urban trees with street-level imagery and computer vision, Landsc. Urban Plan., 165, 93-101 (2017)
[38] Vapnik, V., The Nature of Statistical Learning Theory (2013), Springer Science & Business Media · Zbl 0934.62009
[39] Shalev-Shwartz, S.; Singer, Y.; Srebro, N.; Cotter, A., Pegasos: primal estimated sub-gradient solver for SVM, Math. Program., 127, 1, 3-30 (2011) · Zbl 1211.90239
[40] Chang, C.; Lin, C., LIBSVM: A library for support vector machines, ACM Trans. Intell. Syst. Technol. (TIST), 2, 3, 27 (2011)
[41] Kebria, P. M.; Khosravi, A.; Salaken, S. M.; Nahavandi, S., Deep imitation learning for autonomous vehicles based on convolutional neural networks, IEEE-CAA J. Autom. Sin., 7, 1, 82-95 (2020)
[42] Parmar, Y.; Natarajan, S.; Sobha, G., DeepRange: deep-learning-based object detection and ranging in autonomous driving, IET Intell. Transp. Syst., 13, 8, 1256-1264 (2019)
[43] LIBSVM.jl. URL https://github.com/mpastell/LIBSVM.jl.
[44] Gwak, J.; Jung, J.; Oh, R.; Park, M.; Rakhimov, M. A.K.; Ahn, J., A review of intelligent self-driving vehicle software research, Ksii Trans. Internet Inf. Syst., 13, 11, 5299-5320 (2019)
[45] Abraham, A.; Pedregosa, F.; Eickenberg, M.; Gervais, P.; Mueller, A.; Kossaifi, J.; Gramfort, A.; Thirion, B.; Varoquaux, G., Machine learning for neuroimaging with scikit-learn, Front. Neuroinform., 8, 14 (2014)
[46] Jovic, A.; Brkic, K.; Bogunovic, N., An overview of free software tools for general data mining, (2014 37th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO) (2014)), 1112-1117
[47] Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; Vanderplas, J.; Passos, A.; Cournapeau, D.; Brucher, M.; Perrot, M.; Duchesnay, E., Scikit-learn: Machine learning in Python, J. Mach. Learn. Res., 12, 2825-2830 (2011) · Zbl 1280.68189
[48] Demsar, J.; Curk, T.; Erjavec, A.; Gorup, C.; Hocevar, T.; Milutinovic, M.; Mozina, M.; Polajnar, M.; Toplak, M.; Staric, A.; Stajdohar, M.; Umek, L.; Zagar, L.; Zbontar, J.; Zitnik, M.; Zupan, B., Orange: Data mining toolbox in Python, J. Mach. Learn. Res., 14, 2349-2353 (2013) · Zbl 1317.68151
[49] Demsar, J.; Zupan, B.; Leban, G.; Curk, T., Orange: From experimental machine learning to interactive data mining, (Boulicaut, J. F.; Esposito, F.; Giannotti, F.; Pedreschi, D., Knowledge Discovery in Databases: PKDD 2004, Proceedings, Lecture Notes in Artificial Intelligence, vol. 3202 (2004)), 537-539
[50] Shan, Y. H.; Lu, W. F.; Chew, C. M., Pixel and feature level based domain adaptation for object detection in autonomous driving, Neurocomputing, 367, 31-38 (2019)
[51] Arnold, E.; Al-Jarrah, O. Y.; Dianati, M.; Fallah, S.; Oxtoby, D.; Mouzakitis, A., A survey on 3D object detection methods for autonomous driving applications, IEEE Trans. Intell. Transp. Syst., 20, 10, 3782-3795 (2019)
[52] Bishop, C., Neural Networks for Pattern Recognition (1995), Oxford University Press
[53] Raja, Y.; McKenna, S.; Gong, S., Segmentation and tracking using colour mixture models, Lecture Notes in Comput. Sci., 1351, 607-614 (1998)
[54] Bruneau, M.; Mottet, T.; Moulin, S.; Kerbiriou, M.; Chouly, F.; Chretien, S.; Guyeux, C., A clustering package for nucleotide sequences using Laplacian Eigenmaps and Gaussian mixture model, Comput. Biol. Med., 93, 66-74 (2017)
[55] Holoien, T. W.S.; Marshall, P. J.; Wechsler, R. H., EmpiriciSN: Re-sampling observed supernova/host galaxy populations using an XD Gaussian mixture model, Astron. J., 153, 6 (2017)
[56] GmmFlow.jl. URL https://github.com/AmebaBrain/GmmFlow.jl.
[57] GaussianMixtures.jl. URL https://github.com/davidavdav/GaussianMixtures.jl.
[58] Srajer, F.; Kukelova, Z.; Fitzgibbon, A., A benchmark of selected algorithmic differentiation tools on some problems in computer vision and machine learning, Optim. Methods Softw., 33, 4-6, 889-906 (2018) · Zbl 1453.65050
[59] Douzas, G.; Bacao, F.; Last, F., Improving imbalanced learning through a heuristic oversampling method based on k-means and SMOTE, Inform. Sci., 465, 1-20 (2018)
[60] Yu, S.; Tranchevent, L. C.; Liu, X. H.; Glanzel, W.; Suykens, J. A.K.; De Moor, B.; Moreau, Y., Optimized data fusion for kernel k-means clustering, IEEE Trans. Pattern Anal. Mach. Intell., 34, 5, 1031-1039 (2012)
[61] Zhang, K.; Zhu, Y. X.; Leng, S. P.; He, Y. J.; Maharjan, S.; Zhang, Y., Deep learning empowered task offloading for mobile edge computing in urban informatics, IEEE Internet Things J., 6, 5, 7635-7647 (2019)
[62] Corpet, F., Multiple sequence alignment with hierarchical clustering, Nucleic Acids Res., 16, 22, 10881-10890 (1988)
[63] Johnson, S. C., Hierarchical clustering schemes, Psychometrika, 32, 3, 241-254 (1967) · Zbl 1367.62191
[64] Karypis, G.; Han, E. H.; Kumar, V., Chameleon: Hierarchical clustering using dynamic modeling, Computer, 32, 8, 68-75 (1999)
[65] Jaeger, D.; Barth, J.; Niehues, A.; Fufezan, C., PyGCluster, a novel hierarchical clustering approach, Bioinformatics, 30, 6, 896-898 (2014)
[66] Muellner, D., Fastcluster: Fast hierarchical, agglomerative clustering routines for R and Python, J. Stat. Softw., 53, 9, 1-18 (2013)
[67] Kabzan, J.; Hewing, L.; Liniger, A.; Zeilinger, M. N., Learning-based model predictive control for autonomous racing, IEEE Robot. Autom. Lett., 4, 4, 3363-3370 (2019)
[68] Datta, S., Hierarchical stellar clusters in molecular clouds, (DeGrijs, R.; Lepine, J. R.D., Star Clusters: Basic Galactic Building Blocks Throughout Time and Space, IAU Symposium Proceedings Series (2010)), 377-379
[69] Kerr, G.; Ruskin, H. J.; Crane, M.; Doolan, P., Techniques for clustering gene expression data, Comput. Biol. Med., 38, 3, 283-293 (2008)
[70] Jacques, J.; Biernacki, C., Model-based co-clustering for ordinal data, Comput. Statist. Data Anal., 123, 101-115 (2018) · Zbl 1469.62086
[71] Pessia, A.; Corander, J., Kpax3: Bayesian bi-clustering of large sequence datasets, Bioinformatics, 34, 12, 2132-2133 (2018)
[72] scikit-learn documentation: sklearn.decomposition.PCA. URL https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.PCA.html.
[73] ALGLIB documentation: Principal component analysis. URL https://www.alglib.net/dataanalysis/principalcomponentsanalysis.php.
[74] OpenCV documentation: Introduction to principal component analysis (PCA). URL https://docs.opencv.org/master/d1/dee/tutorial_introduction_to_pca.html.
[75] MultivariateStats.jl. URL https://github.com/JuliaStats/MultivariateStats.jl.
[76] Vorobyov, S.; Cichocki, A., Blind noise reduction for multisensory signals using ICA and subspace filtering, with application to EEG analysis, Biol. Cybernet., 86, 4, 293-303 (2002) · Zbl 1066.92037
[77] Ren, X.; Hu, X.; Wang, Z.; Yan, Z., MUAP extraction and classification based on wavelet transform and ICA for EMG decomposition, Med. Biol. Eng. Comput., 44, 5, 371-382 (2006)
[78] Assi, E. B.; Rihana, S.; Sawan, M., Kmeans-ICA based automatic method for ocular artifacts removal in a motor imagery classification, (2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (2014)), 6655-6658
[79] scikit-learn documentation: sklearn.decomposition.FastICA. URL https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.FastICA.htm.
[80] MATLAB Central File Exchange: PCA and ICA package. URL https://www.mathworks.com/matlabcentral/fileexchange/38300-pca-and-ica-package?s_tid=prof_contriblnk.
[81] Fey, M.; Lenssen, J. E., Fast graph representation learning with PyTorch geometric (2019), arXiv:1903.02428
[82] Shen, J.; Nguyen, P.; Wu, Y.; Chen, Z.; Chen, M. X.; et al., Lingvo: a modular and scalable framework for sequence-to-sequence modeling (2019), arXiv:1902.08295
[83] Tang, Y.-P.; Li, G.-X.; Huang, S.-J., ALiPy: Active learning in Python (2019), arXiv:1901.03802
[84] Tang, J.; Ericson, L.; Folkesson, J.; Jensfelt, P., GCNv2: Efficient correspondence prediction for real-time SLAM, IEEE Robot. Autom. Lett., 4, 4, 3505-3512 (2019)
[85] Huang, Z.; Huang, L.; Gong, Y.; Huang, C.; Wang, X., Mask scoring R-CNN (2019), arXiv:1903.00241
[86] Frazier-Logue, N.; Hanson, S. J., Dropout is a special case of the stochastic delta rule: faster and more accurate deep learning (2018), arXiv:1808.03578
[87] Hanson, S. J., A stochastic version of the delta rule, Phys. D, 42, 265-272 (1990)
[88] Luo, L.; Xiong, Y.; Liu, Y.; Sun, X., Adaptive gradient methods with dynamic bound of learning rate (2019), arXiv:1902.09843
[89] Bloice, M. D.; Stocker, C.; Holzinger, A., Augmentor: An image augmentation library for machine learning (2017), arXiv:1708.04680
[90] Krizhevsky, A.; Sutskever, I.; Hinton, G. E., ImageNet classification with deep convolutional neural networks, Commun. ACM, 60, 6, 84-90 (2017)
[91] Liu, Y.-R.; Hu, Y.-Q.; Qian, H.; Yu, Y.; Qian, C., ZOOpt: Toolbox for derivative-free optimization (2017), arXiv:1801.00329
[92] Knet.jl. URL https://github.com/denizyuret/Knet.jl.
[93] Flux.jl. URL https://github.com/FluxML/Flux.jl.
[94] TensorFlow.jl. URL https://github.com/malmaud/TensorFlow.jl.
[95] G. Ditzler, J.C. Morrison, Y. Lan, G.L. Rosen, Fizzy: Feature subset selection for metagenomics, BMC Bioinformatics, 16 (2015) 358. http://dx.doi.org/10.1186/s12859-015-0793-8.
[96] Wicht, B.; Fischer, A.; Hennebert, J., DLL: A fast deep neural network library, (Pancioni, L.; Schwenker, F.; Trentin, E., Artificial Neural Networks in Pattern Recognition, ANNPR 2018, Lecture Notes in Artificial Intelligence, vol. 11081 (2018)), 54-65
[97] Rackauckas, C.; Innes, M.; Ma, Y.; Bettencourt, J.; White, L.; Dixit, V., DiffEqFlux.jl - A Julia library for neural differential equations (2019), arXiv:1902.02376
[98] DifferentialEquations.jl. URL https://github.com/JuliaDiffEq/DifferentialEquations.jl.
[99] Huang, G. B.; Zhu, Q. Y.; Siew, C. K., Extreme learning machine: A new learning scheme of feedforward neural networks, (2004 IEEE International Joint Conference on Neural Networks (IJCNN), Vols 1-4, Proceedings (2004)), 985-990
[100] Kasun, L. L.C.; Zhou, H.; Huang, G.-B.; Vong, C. M., Representational learning with ELMs for big data, IEEE Intell. Syst., 28, 6, 31-34 (2013)
[101] Tang, J.; Deng, C.; Huang, G.-B., Extreme learning machine for multilayer perceptron, IEEE Trans. Neural Netw. Learn. Syst., 27, 4, 809-821 (2015)
[102] Ouyang, Z. C.; Niu, J. W.; Liu, Y.; Guizani, M., Deep CNN-based real-time traffic light detector for self-driving vehicles, IEEE Trans. Mob. Comput., 19, 2, 300-313 (2020)
[103] Lee, I.; Lee, K., The Internet of Things (IoT): Applications, investments, and challenges for enterprises, Bus. Horiz., 58, 4, 431-440 (2015)
[104] Gubbi, J.; Buyya, R.; Marusic, S.; Palaniswami, M., Internet of Things (IoT): A vision, architectural elements, and future directions, Future Gener. Comput. Syst., 29, 7, 1645-1660 (2013)
[105] Atzori, L.; Iera, A.; Morabito, G., The Internet of Things: A survey, Comput. Netw., 54, 15, 2787-2805 (2010) · Zbl 1208.68071
[106] Mei, G.; Xu, N.; Qin, J.; Wang, B.; Qi, P., A survey of Internet of Things (IoT) for geo-hazards prevention: Applications, technologies, and challenges, IEEE Internet Things J., 1 (2019)
[107] Mohammadi, M.; Al-Fuqaha, A.; Sorour, S.; Guizani, M., Deep learning for IoT big data and streaming analytics: A survey, IEEE Commun. Surv. Tutor., 20, 4, 2923-2960 (2018)
[108] Mahdavinejad, M. S.; Rezvan, M.; Barekatain, M.; Adibi, P.; Barnaghi, P.; Sheth, A. P., Machine learning for Internet of Things data analysis: A survey, Digit. Commun. Netw., 4, 3, 161-175 (2018)
[109] Invenia blog. URL https://invenia.github.io/blog/.
[110] Julia Computing case study: Fugro Roames ML. URL https://juliacomputing.com/case-studies/fugro-roames-ml.html.
[111] Grys, B. T.; Lo, D. S.; Sahin, N.; Kraus, O. Z.; Morris, Q.; Boone, C.; Andrews, B. J., Machine learning and computer vision approaches for phenotypic profiling, J. Cell Biol., 216, 1, 65-71 (2017)
[112] Naik, N.; Kominers, S. D.; Raskar, R.; Glaeser, E. L.; Hidalgo, C. A., Computer vision uncovers predictors of physical urban change, Proc. Natl. Acad. Sci. USA, 114, 29, 7571-7576 (2017)
[113] Patel, A. K.; Chatterjee, S., Computer vision-based limestone rock-type classification using probabilistic neural network, Geosci. Front., 7, 1, 53-60 (2016)
[114] Gopalakrishnan, K.; Khaitan, S. K.; Choudhary, A.; Agrawal, A., Deep convolutional neural networks with transfer learning for computer vision-based data-driven pavement distress detection, Constr. Build. Mater., 157, 322-330 (2017)
[115] Cha, Y. J.; Chen, J. G.; Buyukozturk, O., Output-only computer vision based damage detection using phase-based optical flow and unscented kalman filters, Eng. Struct., 132, 300-313 (2017)
[116] Metalhead.jl. URL https://github.com/FluxML/Metalhead.jl.
[117] ImageProjectiveGeometry.jl. URL https://github.com/peterkovesi/ImageProjectiveGeometry.jl.
[118] Hirschberg, J.; Manning, C. D., Advances in natural language processing, Science, 349, 6245, 261-266 (2015) · Zbl 1355.68275
[119] Poria, S.; Cambria, E.; Gelbukh, A., Aspect extraction for opinion mining with a deep convolutional neural network, Knowl.-Based Syst., 108, 42-49 (2016)
[120] Young, T.; Hazarika, D.; Poria, S.; Cambria, E., Recent trends in deep learning based natural language processing, IEEE Comput. Intell. Mag., 13, 3, 55-75 (2018)
[121] Gimenez, M.; Palanca, J.; Botti, V., Semantic-based padding in convolutional neural networks for improving the performance in natural language processing. A case of study in sentiment analysis, Neurocomputing, 378, 315-323 (2020)
[122] Liu, W. B.; Wang, Z. D.; Liu, X. H.; Zengb, N. Y.; Liu, Y. R.; Alsaadi, F. E., A survey of deep neural network architectures and their applications, Neurocomputing, 234, 11-26 (2017)
[123] LightNLP.jl. URL https://github.com/hshindo/LightNLP.jl.
[124] Liu, H. F.; Han, X. F.; Li, X. R.; Yao, Y. Z.; Huang, P.; Tang, Z. M., Deep representation learning for road detection using Siamese network, Multimedia Tools Appl., 78, 17, 24269-24283 (2019)
[125] Cuenca, L. G.; Puertas, E.; Andres, J. F.; Aliane, N., Autonomous driving in roundabout maneuvers using reinforcement learning with Q-learning, Electronics, 8, 12, 13 (2019)
[126] Desjardins, C.; Chaib-draa, B., Cooperative adaptive cruise control: A reinforcement learning approach, IEEE Trans. Intell. Transp. Syst., 12, 4, 1248-1260 (2011)
[127] Samsi, S.; Gadepally, V.; Hurley, M.; Jones, M.; Kao, E.; Mohindra, S.; Monticciolo, P.; Reuther, A.; Smith, S.; Song, W.; Staheli, D.; Kepner, J., Static graph challenge: Subgraph isomorphism, (2017 IEEE High Performance Extreme Computing Conference (HPEC) (2017)), 1-6
[128] LightGraphs.jl. URL https://github.com/JuliaGraphs/LightGraphs.jl.
[129] Uengtrakul, B.; Bunnjaweht, D., A cost efficient software defined radio receiver for demonstrating concepts in communication and signal processing using Python and RTL-SDR, (2014 Fourth International Conference on Digital Information and Communication Technology and Its Applications (2014)), 394-399
[130] Gideon, K.; Nyirenda, C.; Temaneh-Nyah, C., Echo state network-based radio signal strength prediction for wireless communication in Northern Namibia, IET Commun., 11, 12, 1920-1926 (2017)
[131] Srivastava, P.; Kang, M.; Gonugondla, S. K.; Lim, S.; Choi, J.; Adve, V.; Kim, N. S.; Shanbhag, N., PROMISE: An end-to-end design of a programmable mixed-signal accelerator for machine-learning algorithms, (2018 ACM/IEEE 45th Annual International Symposium on Computer Architecture (2018)), 43-56
[132] Bishop, C. M., Pattern Recognition and Machine Learning (2006), Springer · Zbl 1107.68072
[133] Milewski, R.; Govindaraju, V., Binarization and cleanup of handwritten text from carbon copy medical form images, Pattern Recognit., 41, 4, 1308-1315 (2008)
[134] Anand, R. S.; Stey, P.; Jain, S.; Biron, D. R.; Bhatt, H.; Monteiro, K.; Feller, E.; Ranney, M. L.; Sarkar, I. N.; Chen, E. S., Predicting mortality in diabetic ICU patients using machine learning and severity indices, AMIA Jt. Summits Transl. Sci. Proc., 2017, 310-319 (2018)
[135] GLM.jl. URL https://github.com/JuliaStats/GLM.jl.
[136] Languages.jl. URL https://github.com/JuliaText/Languages.jl.
[137] Julia blog: 2019 Julia user and developer survey. URL https://julialang.org/blog/2019/08/2019-julia-survey.
[138] Rong, H.; Park, J.; Xiang, L.; Anderson, T. A.; Smelyanskiy, M., Sparso: Context-driven optimizations of sparse linear algebra, (2016 International Conference on Parallel Architecture and Compilation Techniques (2016)), 247-259
[139] Plumb, G.; Pachauri, D.; Kondor, R.; Singh, V., SnFFT: A Julia toolkit for Fourier analysis of functions over permutations, J. Mach. Learn. Res., 16, 3469-3473 (2015) · Zbl 1351.68006
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.