Please use this identifier to cite or link to this item: http://elibrary.kdpu.edu.ua/xmlui/handle/123456789/7033
Title: Using spreadsheets as learning tools for neural network simulation
Authors: Semerikov, Serhiy
Teplytskyi, Illia
Yechkalo, Yuliia
Markova, Oksana
Soloviev, Vladimir
Kiv, Arnold
Keywords: computer simulation
neural networks
spreadsheets
neural computing
early network models
Anderson's Iris
cloud-based learning tools
Issue Date: 30-Sep-2022
Publisher: Academy of Cognitive and Natural Sciences
Citation: Semerikov S. Using spreadsheets as learning tools for neural network simulation / Serhiy Semerikov, Illia Teplytskyi, Yuliia Yechkalo, Oksana Markova, Vladimir Soloviev, Arnold Kiv // Ukrainian Journal of Educational Studies and Information Technology. – 2022. – Vol. 10. – Iss. 3. – P. 42–68. – DOI: 10.32919/uesit.2022.03.04
Abstract: The article substantiates the need for training methods for computer simulation of neural networks in a spreadsheet environment and systematically reviews the use of spreadsheets for simulating artificial neural networks. The authors distinguish the basic approaches to teaching neural network computer simulation in spreadsheets: joint use of spreadsheets and dedicated neural network simulation tools; application of third-party spreadsheet add-ins; development of macros in the embedded languages of spreadsheets; use of standard spreadsheet add-ins for non-linear optimization; and construction of neural networks in the spreadsheet environment without add-ins or macros. The article then discusses ways of building neural network models in Google Sheets, a cloud-based spreadsheet. The model is based on the problem of classifying multidimensional data presented in R. A. Fisher's "The Use of Multiple Measurements in Taxonomic Problems". Edgar Anderson's role in collecting and preparing these data in the 1920s and 1930s is discussed, along with some peculiarities of the data selection. The article also considers Anderson's method of displaying multidimensional data in the form of ideographs, regarded as one of the first effective techniques of data visualization.
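To make the kind of model described in the abstract concrete, the following is a minimal sketch (an illustration only, not the authors' spreadsheet workbook) of a small feed-forward network trained by gradient descent on Fisher's Iris data, i.e. the computation that the article reproduces with cell formulas and a spreadsheet optimizer. The use of numpy and scikit-learn's bundled copy of the data set is an assumption made here for self-containment.

import numpy as np
from sklearn.datasets import load_iris

# Fisher's Iris data: 150 flowers x 4 measurements, 3 species.
X, y = load_iris(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize, as one would with sheet formulas
T = np.eye(3)[y]                           # one-hot target matrix

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(4, 5))    # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(5, 3))    # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(2000):
    H = sigmoid(X @ W1)                    # hidden activations
    Y = sigmoid(H @ W2)                    # network outputs
    E = Y - T                              # error under a squared-error loss
    # Gradient descent via backpropagation: the same chain-rule products that a
    # Solver-style non-linear optimizer approximates in the spreadsheet setting.
    dW2 = H.T @ (E * Y * (1 - Y)) / len(X)
    dW1 = X.T @ (((E * Y * (1 - Y)) @ W2.T) * H * (1 - H)) / len(X)
    W2 -= lr * dW2
    W1 -= lr * dW1

pred = np.argmax(sigmoid(sigmoid(X @ W1) @ W2), axis=1)
print("training accuracy:", (pred == y).mean())

Each line of this script has a direct spreadsheet counterpart: the data occupy a cell range, the weight matrices occupy two further ranges, and the forward pass and weight updates are array formulas recalculated on every iteration.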
Description: Abelson, H., Sussman, G. J., & Sussman, J. (1996). Structure and Interpretation of Computer Programs (2nd ed.). Cambridge: MIT Press.
Abraham, T. H. (2002). (Physio)logical circuits: The intellectual origins of the McCulloch-Pitts neural networks. Journal of the History of the Behavioral Sciences, 38(1), 3–25. DOI: https://doi.org/10.1002/jhbs.1094
Anderson, E. (1928). The Problem of Species in the Northern Blue Flags, Iris versicolor L. and Iris virginica L. Annals of the Missouri Botanical Garden, 15(3), 241–332. DOI: https://doi.org/10.2307/2394087
Anderson, E. (1935). The Irises of the Gaspe Peninsula. Bulletin of the American Iris Society, 59, 2–5.
Anderson, E. (1936). The Species Problem in Iris. Annals of the Missouri Botanical Garden, 23(3), 457–469+471–483+485–501+503–509. DOI: https://doi.org/10.2307/2394164
Anderson, E. (1952). Plants, Man and Life. Boston: University of California Press. DOI: https://doi.org/10.1525/9780520312548
Ayed, A. S. (1997). Master thesis. Memorial University.
Buergermeister, J. J. (1990). Using Computer Spreadsheets for Instruction in Cost Control Curriculum at the Undergraduate Level. In D. W. Dalton (Ed.), Proceedings of the 32nd Annual International Conference of the Association for the Development of Computer-Based Instructional Systems, San Diego, California, October 29–November 1, 1990 (pp. 214–220). Columbus: ADCIS International.
Chernoff, H. (1973). The Use of Faces to Represent Points in k-Dimensional Space Graphically. Journal of the American Statistical Association, 68(342), 361–368. DOI: https://doi.org/10.1080/01621459.1973.10482434
Cowan, J. D. (1998). Interview with J. A. Anderson and E. Rosenfeld. In J. A. Anderson & E. Rosenfeld (Eds.), Talking nets: An oral history of neural networks (pp. 97–124). Cambridge: MIT Press.
Cull, P. (2007). The mathematical biophysics of Nicolas Rashevsky. BioSystems, 88(3), 178–184. DOI: https://doi.org/10.1016/j.biosystems.2006.11.003
Eberhart, R. C. & Dobbins, R. W. (1990). Background and History. In R. C. Eberhart & R. W. Dobbins (Eds.), Neural Network PC Tools: A Practical Guide (pp. 9–34). San Diego: Academic Press. DOI: https://doi.org/10.1016/B978-0-12-228640-7.50007-6
Fisher, R. A. (1936). The Use of Multiple Measurements in Taxonomic Problems. Annals of Eugenics, 7(2), 179–188. DOI: https://doi.org/10.1111/j.1469-1809.1936.tb02137.x
Freedman, R. S., Frail, R. P., Schneider, F. T., & Schnitta, B. (1991). Expert systems in spreadsheets: modeling the Wall Street user domain. In Proceedings First International Conference on Artificial Intelligence Applications on Wall Street (pp. 296–301). DOI: https://doi.org/10.1109/AIAWS.1991.236586
Hegazy, T. & Ayed, A. (1998). Neural Network Model for Parametric Cost Estimation of Highway Projects. Journal of Construction Engineering and Management, 124(3), 210–218. DOI: https://doi.org/10.1061/(ASCE)0733-9364(1998)124:3(210)
Hewett, T. T. (1985a). Teaching Students to Model Neural Circuits and Neural Networks Using an Electronic Spreadsheet Simulator. Behavior Research Methods, Instruments, & Computers, 17(2), 339–344. DOI: https://doi.org/10.3758/BF03214406
Hewett, T. T. (1985b). Using an Electronic Spreadsheet Simulator to Teach Neural Modeling of Visual Phenomena. Philadelphia: Drexel University.
Householder, A. S. (1940). A neural mechanism for discrimination: II. Discrimination of weights. Bulletin of Mathematical Biophysics, 2(1), 1–13. DOI: https://doi.org/10.1007/BF02478027
Householder, A. S. (1941). A theory of steady-state activity in nerve-fiber networks I: Definitions and Preliminary Lemmas. Bulletin of Mathematical Biophysics, 3(2), 63–69. DOI: https://doi.org/10.1007/BF02478220
Householder, A. S. & Landahl, H. D. (1945). Mathematical Biophysics of the Central Nervous System. Bloomington: Principia Press.
James, W. (1890). The Principles of Psychology. New York: Henry Holt and Company. DOI: https://doi.org/10.1037/10538-000
James, W. (1892). Psychology. New York: Henry Holt and Company.
Johnston, S. J. (1991). InfoWorld, 13(7), 14. DOI: https://doi.org/10.1016/0958-2118(91)90104-3
Kendrick, D. A., Mercado, P. R., & Amman, H. M. (2006). Computational Economics. Princeton: Princeton University Press. DOI: https://doi.org/10.1515/9781400841349
Landahl, H. D. (1947). A matrix calculus for neural nets: II. Bulletin of Mathematical Biophysics, 9(2), 99–108. DOI: https://doi.org/10.1007/BF02478296
Landahl, H. D., McCulloch, W. S., & Pitts, W. (1943). A statistical consequence of the logical calculus of nervous nets. Bulletin of Mathematical Biophysics, 5(4), 135–137. DOI: https://doi.org/10.1007/BF02478260
Landahl, H. D. & Runge, R. (1946). Outline of a matrix calculus for neural nets. Bulletin of Mathematical Biophysics, 8(2), 75–81. DOI: https://doi.org/10.1007/BF02478464
Markova, O., Semerikov, S., & Popel, M. (2018). CoCalc as a Learning Tool for Neural Network Simulation in the Special Course "Foundations of Mathematic Informatics". CEUR Workshop Proceedings, 2104, 204. Retrieved from http://ceur-ws.org/Vol-2104/paper_204.pdf. DOI: https://doi.org/10.31812/0564/2250
Markova, O. M., Semerikov, S. O., Striuk, A. M., Shalatska, H. M., Nechypurenko, P. P., & Tron, V. V. (2019). Implementation of cloud service models in training of future information technology specialists. CEUR Workshop Proceedings, 2433, 499–515. Retrieved from http://ceur-ws.org/Vol-2433/paper34.pdf. DOI: https://doi.org/10.55056/cte.409
McCulloch, W. S. & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5(4), 115–133. DOI: https://doi.org/10.1007/BF02478259
Mitchell, T. M. (2017). Key Ideas in Machine Learning. Retrieved from http://www.cs.cmu.edu/%7Etom/mlbook/keyIdeas.pdf
Permiakova, O. S. & Semerikov, S. O. (2008). Zastosuvannia neironnykh merezh u zadachakh prohnozuvannia (The use of neural networks in forecasting problems). In Materials of the International Scientific and Practical Conference "Young scientist of the XXI century", KTU, Kryviy Rih, 17–18 November 2008.
Pitts, W. (1942a). Some observations on the simple neuron circuit. Bulletin of Mathematical Biophysics, 4(3), 121–129. DOI: https://doi.org/10.1007/BF02477942
Pitts, W. (1942b). The linear theory of neuron networks: The static problem. Bulletin of Mathematical Biophysics, 4(4), 169–175. DOI: https://doi.org/10.1007/BF02478112
Pitts, W. (1943a). A general theory of learning and conditioning: Part I. Psychometrika, 8(1), 1–18. DOI: https://doi.org/10.1007/BF02288680
Pitts, W. (1943b). A general theory of learning and conditioning: Part II. Psychometrika, 8(2), 131–140. DOI: https://doi.org/10.1007/BF02288697
Pitts, W. (1943c). The linear theory of neuron networks: The dynamic problem. Bulletin of Mathematical Biophysics, 5(1), 23–31. DOI: https://doi.org/10.1007/BF02478116
Pitts, W. & McCulloch, W. S. (1947). How we know universals: the perception of auditory and visual forms. Bulletin of Mathematical Biophysics, 9(3), 127–147. DOI: https://doi.org/10.1007/BF02478291
Rashevsky, N. (1933). Outline of a physico-mathematical theory of excitation and inhibition. Protoplasma, 20(1), 42–56. DOI: https://doi.org/10.1007/BF02674811
Rashevsky, N. (1945a). Mathematical biophysics of abstraction and logical thinking. Bulletin of Mathematical Biophysics, 7(3), 133–148. DOI: https://doi.org/10.1007/BF02478314
Rashevsky, N. (1945b). Some remarks on the boolean algebra of nervous nets in mathematical biophysics. Bulletin of Mathematical Biophysics, 7(4), 203–211. DOI: https://doi.org/10.1007/BF02478425
Rashevsky, N. (1946). The neural mechanism of logical thinking. Bulletin of Mathematical Biophysics, 8(1), 29–40. DOI: https://doi.org/10.1007/BF02478469
Rienzo, T. F. & Athappilly, K. K. (2012). Introducing Artificial Neural Networks through a Spreadsheet Model. Decision Sciences Journal of Innovative Education, 10(4), 515–520. DOI: https://doi.org/10.1111/j.1540-4609.2012.00363.x
Ruggiero, M. (1993). U.S. Patent No. 5,241,620.
Ruggiero, M. A. (1997). Cybernetic Trading Strategies: Developing a Profitable Trading System with State-of-the-Art Technologies. New York: John Wiley & Sons.
Schwab, K. & Davis, N. (2018). Shaping the Fourth Industrial Revolution. London: Portfolio Penguin.
Semerikov, S. O. & Teplytskyi, I. O. (2018). Metodyka uvedennia osnov Machine learning u shkilnomu kursi informatyky (Methods of introducing the basics of Machine learning in the school course of informatics). In Problems of informatization of the educational process in institutions of general secondary and higher education. Ukrainian scientific and practical conference, Kyiv, October 09, 2018 (pp. 18–20). Kyiv: Vyd-vo NPU imeni M. P. Drahomanova.
Semerikov, S. O., Teplytskyi, I. O., Yechkalo, Yu. V., & Kiv, A. E. (2018). Computer Simulation of Neural Networks Using Spreadsheets: The Dawn of the Age of Camelot. CEUR Workshop Proceedings, 2257, 14. Retrieved from http://ceur-ws.org/Vol-2257/paper14.pdf. DOI: https://doi.org/10.31812/123456789/2648
Shimbel, A. & Rapoport, A. (1948). A statistical approach to the theory of the central nervous system. Bulletin of Mathematical Biophysics, 10(2), 41–55. DOI: https://doi.org/10.1007/BF02478329
Stebbins, G. L. (1978). Edgar Anderson 1897–1969. Washington: National Academy of Sciences.
Sussman, G. J. & Wisdom, J. (2015). Structure and Interpretation of Classical Mechanics (2nd ed.). Cambridge: MIT Press.
Teplytskyi, I. O. (2010). Elementy kompiuternoho modeliuvannia (Elements of computer simulation) (2nd ed.). Kryvyi Rih: KSPU.
Teplytskyi, I. O., Teplytskyi, O. I., & Humeniuk, A. P. (2008). Simulation environments: from replacement to integration. New computer technology, 6, 67–68.
Wei, T. (1948). On matrices of neural nets. Bulletin of Mathematical Biophysics, 10(2), 63–67. DOI: https://doi.org/10.1007/BF02477433
Werbos, P. J. (1989). Maximizing long-term gas industry profits in two minutes in Lotus using neural network methods. IEEE Transactions on Systems, Man, and Cybernetics, 19(2), 315–333. DOI: https://doi.org/10.1109/21.31036
Young, G. (1941). On reinforcement and interference between stimuli. Bulletin of Mathematical Biophysics, 3(1), 5–12. DOI: https://doi.org/10.1007/BF02478102
Zaremba, T. (1990). Case Study III: Technology in Search of a Buck. In R. C. Eberhart & R. W. Dobbins (Eds.), Neural Network PC Tools: A Practical Guide (pp. 251–283). San Diego: Academic Press. DOI: https://doi.org/10.1016/B978-0-12-228640-7.50018-0
URI: https://uesit.org.ua/index.php/itse/article/view/393
https://doi.org/10.32919/uesit.2022.03.04
http://elibrary.kdpu.edu.ua/xmlui/handle/123456789/7033
ISSN: 2521-1234
Appears in Collections: Department of Informatics and Applied Mathematics

Files in This Item:
File: admin,+4+-+Semerikov+et+al+-+42-68.pdf
Size: 1.09 MB
Format: Adobe PDF


All items in the electronic repository are protected by copyright; all rights reserved.