Neuroscience and Network Dynamics Toward Brain-Inspired Intelligence

Abstract: This article surveys interdisciplinary research across neuroscience, network science, and dynamical systems, with emphasis on the emergence of brain-inspired intelligence. A practical way to replicate brain intelligence is to reconstruct cortical networks together with the dynamic activities that support brain functions, rather than relying only on artificial computing networks. The survey provides a complex-network and spatiotemporal-dynamics (abbreviated as network dynamics) perspective for understanding the brain and cortical networks, and further develops integrated approaches combining neuroscience and network dynamics toward building brain-inspired intelligence with learning and resilience functions. Fundamental concepts and principles of complex networks, neuroscience, and hybrid dynamical systems are presented, along with relevant studies of the brain and intelligence. Other promising research directions, such as brain science, data science, quantum information science, and machine behavior, are also briefly discussed with a view toward future applications.
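To make the network-dynamics perspective concrete, the sketch below (not taken from the article) simulates Kuramoto phase oscillators coupled over a random network and reports a global synchronization measure; the network size, edge probability, coupling strength, and time step are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions only): spatiotemporal dynamics on a
# complex network, modeled with Kuramoto phase oscillators on a random graph.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 0.1                             # nodes (e.g., cortical regions), edge probability
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T                                 # symmetric adjacency matrix, no self-loops

omega = rng.normal(0.0, 1.0, n)             # natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, n)      # initial phases
K, dt, steps = 0.5, 0.01, 5000              # coupling strength, time step, iterations

for _ in range(steps):
    # Kuramoto update: each node is pulled toward the phases of its neighbors.
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + K * coupling)

# Order parameter r in [0, 1]: the degree of global synchronization.
r = np.abs(np.exp(1j * theta).mean())
print(f"synchronization order parameter r = {r:.3f}")
```

Sweeping the coupling strength K and re-running the loop reproduces the classic transition from incoherence to synchronization, a simple stand-in for how collective dynamics emerge on cortical-like networks.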
EXISTING SYSTEM:
• In contrast, in the mechanism- and structure-oriented approach, brain-inspired intelligent robotics is developed to mimic, apply, and integrate internal mechanisms existing in humans.
• Existing as a part of an organic whole, individual components of collective robots must cooperate with each other to achieve harmonious behavior.
• When one becomes exposed to a new perception or experience that challenges existing schemas, a process of reorganization and adaptation occurs, leading to new schemas.
• Existing fixed-base industrial robots will no longer satisfy the demands of modern markets.
DISADVANTAGE:
• Our brains simultaneously process various streams of temporal information, allowing us to solve challenging real-world problems that ensure our survival.
• Deep learning provides algorithmic blueprints for organizing large neural networks into suitable function approximators that flexibly solve diverse real-world problems.
• Surrogate gradient learning avoids such problems by using neuron-specific feedback signals, as in standard BP, and by dispensing with stochasticity in the forward pass (a minimal sketch follows this list).
• These findings suggest either that explicit recurrence is not necessary for many problems, or that ignoring explicit recurrence in gradient computations does not create a major impediment to successful learning, even when such recurrent connections are present.
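The following sketch shows one common way surrogate gradient learning is implemented in practice: a hard spike threshold in the forward pass and a smooth surrogate derivative in the backward pass. It is a hedged illustration in PyTorch; the fast-sigmoid surrogate shape and the steepness constant `beta` are assumptions, not the exact choices of the surveyed work.

```python
# Hedged sketch of surrogate gradient learning for a spiking nonlinearity (PyTorch).
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0).float()        # binary spike, non-differentiable

    @staticmethod
    def backward(ctx, grad_output):
        (u,) = ctx.saved_tensors
        beta = 10.0                                     # steepness (illustrative assumption)
        surrogate = 1.0 / (beta * u.abs() + 1.0) ** 2   # fast-sigmoid derivative
        return grad_output * surrogate

spike_fn = SurrogateSpike.apply

# Usage: gradients flow through the hard threshold via the surrogate.
u = torch.randn(5, requires_grad=True)                  # toy membrane potentials
spike_fn(u).sum().backward()
print(u.grad)                                           # non-zero surrogate gradients
```

Because the backward pass uses neuron-specific, deterministic feedback (the surrogate derivative of each unit's own membrane potential), this matches the description above of BP-style credit assignment without stochasticity in the forward pass.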
PROPOSED SYSTEM:
• We expect that this work will stimulate novel research on developing efficient brain-inspired machine learning methods.
• To continue to take advantage of this architecture, we have proposed a hybrid brain-inspired computing (BIC) architecture that accommodates and integrates the complexity of the temporal, spatial, and spatiotemporal domains into a single platform.
• We have also proposed the design of a morphable, intelligent, and collective robot operating system (micROS) for the management, control, and development of collective robots.
• We have proposed a framework and components of such a system on the basis of previous research on multiscale spatiotemporal scene perception and understanding.
ADVANTAGE:
• This temporal restriction reduces computational complexity, but severely restricts learning performance on tasks that require long time horizons.
• We then systematically compared the resulting network performance of the two approaches with that of networks without any explicit recurrent connections.
• The external computation of updates poses a significant performance bottleneck, which often renders this strategy too slow for real-time or accelerated learning.
• Neuromorphic engineering has taken on the challenge of approaching such efficiency by building scalable, low-power systems that mirror the brain’s essential architectural features.
• To achieve this feat, deep learning optimizes loss functions with gradient descent, whose gradients can be computed efficiently with the Back-Propagation (BP) algorithm (a minimal sketch follows this list).
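To illustrate the last point, the sketch below minimizes a loss with gradient descent, where all gradients are obtained by back-propagation. It stays in PyTorch for consistency with the earlier sketch; the toy data, network sizes, learning rate, and step count are illustrative assumptions.

```python
# Minimal sketch: gradient descent on a loss function, with gradients computed by
# back-propagation (BP). All data and hyperparameters are illustrative assumptions.
import torch

torch.manual_seed(0)
x = torch.randn(64, 10)                     # toy inputs
y = torch.randn(64, 1)                      # toy regression targets

model = torch.nn.Sequential(                # small feed-forward function approximator
    torch.nn.Linear(10, 32),
    torch.nn.ReLU(),
    torch.nn.Linear(32, 1),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)

for step in range(200):
    loss = torch.nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()                         # BP computes gradients for every parameter
    optimizer.step()                        # gradient-descent update

print(f"final training loss: {loss.item():.4f}")
```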
