Advanced Stage of AI Development: Brain-Machine Integration for Human-Machine Harmony
With the continuous application of technologies such as 5G, 6G, cloud computing, artificial intelligence (AI), and big data across many fields, the perceptual understanding, rational cognition, analytical judgment, search and capture, comprehensive induction, and autonomous decision-making capabilities of AI robots are steadily improving. This will promote the harmonious coexistence of humans and machines and turn the imagined future society of human-machine integration into reality.

The term “artificial intelligence” (AI) was first proposed at the Dartmouth Conference in 1956. Since then, researchers have developed robots, speech recognition, image recognition, natural language processing, intelligent expert systems, and new digital products such as chatbots and broadcasting robots, achieving a basic transition from AI 1.0 to 2.0.

AI robots are robots that apply artificial intelligence technology. Compared with ordinary robots, they have a central computer control system that functions as a relatively advanced “brain,” are equipped with various internal and external information sensors, and can understand human language and communicate directly with operators in it, replacing humans in tasks such as generating text in any style.

AI robots were initially born as human assistants: lacking consciousness and thought, they served as work-assurance platforms, completing various support tasks according to fixed, pre-programmed routines. Later, with the assistance of intelligent expert systems generated by computer systems, AI robots began to calculate and transmit information like humans, helping people store and rapidly process massive amounts of data. They possessed simple human capabilities but lacked human perceptual cognition and still depended on human control.
Today, AI robots possess simple cognitive intelligence, perceptual cognition, the ability to understand and analyze, and autonomous decision-making capabilities, helping humans efficiently complete tasks related to “seeing” and “hearing”; they can act according to human control programs or make autonomous decisions without human commands.

1. AI Competitions Drive the Development of Robot “Brains” Towards Human Brain Thinking

Currently, in AI competitions held in various countries, although the computational thinking ability of AI systems cannot yet match the human brain, researchers have embedded the latest algorithms into AI systems and trained and guided them through “deep learning” to compete with humans.

In January 2016, AlphaGo shocked the world by defeating the European Go champion in a formal match. It subsequently defeated Go master Lee Sedol in their first peak showdown in Seoul. In 2022, an AI-generated work called “Space Opera” won first place in the “Digital Art/Digitally Altered Photography” category at an art competition held in Colorado, USA. On March 15, 2023, the AI research company OpenAI announced the release of GPT-4, the latest in its series of AI language models, which supports applications such as ChatGPT and the new Bing. Compared with the model behind ChatGPT, GPT-4 can process image content in addition to text and offers improved response accuracy. According to OpenAI, GPT-4 is a large multimodal model; although its capabilities in many real-world scenarios still fall short of humans, it performs better than most humans on many professional tests. On this basis, OpenAI has announced the development of GPT-5, sparking a new wave of AI competition that is changing the modern lifestyle, which has traditionally been human-centric with intelligent technology as an auxiliary.
Intelligent societies, smart factories, intelligent industries, smart supermarkets, intelligent workshops, and intelligent editorial departments are all emerging.

During the 2023 National Two Sessions, the ultra-realistic virtual host “AI Crown” appeared alongside its real-life counterpart in the program “Crown Investigates the Two Sessions” launched by Central Video, communicating anytime and anywhere and opening a new way to report on hot topics. In a preview of the new “AI Crown 2.0” reporting mode, “AI Crown” expressed interest in the field of technological innovation, stating, “If my computing power improves further, my performance will surpass yours.” During the Two Sessions, “AI Crown” performed commendably, expressing information accurately and clearly and showing strong hosting control. Supported by deep-learning algorithms, “AI Crown” can continuously learn and iterate on its functions.

Besides Google, foreign giants such as IBM, Microsoft, Facebook, Yahoo, and Amazon, as well as domestic platforms such as Baidu, Tencent, and Alibaba, are all investing in the AI robot industry. AI writing robots are accelerating their deployment across many media and information websites and have been integrated into the digital news reporting industry.
Various new-media live-broadcast tools such as VR, AR, and H5 are advancing rapidly, and AI writing robots, powered by big data and cloud computing, have boarded the express train of cloud reporting, profoundly changing the human-centric editing and reporting methods of integrated media. Meanwhile, some industry practitioners are developing a fear of being replaced by AI.

Take ChatGPT, for instance. Some practitioners have indeed felt uneasy after seeing the reports, worried that the dominant position of the human brain will be replaced and marginalized by AI robots and that ChatGPT will take away their jobs, while others look to exploit ChatGPT for their own gain… In short, ChatGPT has been mythologized, and humans have developed a sense of crisis.

Undeniably, as a pioneer in the AI competition, the AI robot ChatGPT has the advantage of quickly capturing and processing large amounts of information to complete various text-related tasks. It can also serve as a “virtual assistant,” engaging in human-like interactive dialogue. However, ChatGPT is not an all-powerful application, and it carries potential risks and challenges. Moreover, ChatGPT is controlled by human brains: although it seems to know astronomy and geography, it is ultimately a program written by humans, still relies on human control, and serves as an auxiliary tool for improving the quality and output of human work. What distinguishes humans is human thought, which means that smarter individuals will be better able to utilize ChatGPT. It can therefore be concluded that while ChatGPT may replace many human jobs, it will not replace the human brain.
No matter how intelligent AI robots become, the decisive role and status of humans in an intelligent society remain unchanged; rather, human dependence on AI robots is increasing, and the requirements for human digital-technology literacy are growing ever higher. In the near future, AI robots will be able to interact with users, engage in human-machine dialogue, and begin to understand, analyze, think, and make decisions like humans, even possessing superhuman abilities, gradually developing from human control and human-machine interaction to thought control and human-machine integration.

2. The Movie “Avatar” Sparks the Development of “Brain-Machine” Interface Thought-Controlled Robot Technology

Anyone who has watched the movie “Avatar” will remember this scene: on the planet Pandora, the paraplegic former Marine Jake Sully lies in a sealed pod and, through a complex device on his head, controls a hybrid avatar with his thoughts to carry out various tasks. Of course, Pandora and the avatar are fictional creations of director James Cameron, and controlling an avatar in this way is naturally impossible. What you may not know, however, is that controlling objects with thoughts is no longer a human fantasy.

Currently, brainwave-related technologies are continuously being applied in the medical field, for example in treating patients with epilepsy and other brain diseases. In addition, several research institutions claim to have developed prosthetics that can be controlled by thought, though most are still in the laboratory phase. Although multi-modal brain-feature extraction has been achieved, inherent limitations of brain-machine interfaces, such as slow speed and insufficient accuracy, remain a major obstacle in international research on brain-control technology.
To address this issue, the School of Electronic Information at Northwestern Polytechnical University in China proposed the concept of “brain-machine integration,” which has significantly advanced the practical application of brain-control technology. Today, exploiting the fact that brainwaves change with emotional fluctuations, humans have developed “brain-machine” interface technology for thought-controlled robots.

A brain-machine interface, sometimes called a brain port or brain-machine fusion perception, establishes a direct connection between the brain (or cultured brain cells) of a human or animal and an external device. In a unidirectional brain-machine interface, the computer either receives commands from the brain or sends signals to the brain (such as video reconstruction), but cannot do both at once. A bidirectional brain-machine interface allows two-way information exchange between the brain and external devices.

On February 21, 2012, the brain-machine interface research team at Zhejiang University announced that it had successfully extracted and decoded the neural signals in a monkey’s brain corresponding to four hand gestures: grasp, hook, grip, and pinch, allowing the monkey’s “thoughts” to directly control external machinery. In 2012, a thought-controlled system developed by Zhejiang University students won third place in the National College Student Virtual Instrument Competition; it selected system functions through blinking and executed them by concentrating attention, achieving thought control of robotic hands, audio and video playback, internet information transmission, wheelchair models, and gaming. In November 2013, Zhejiang University students developed a thought-controlled video car based on Wi-Fi communication, in which the car’s speed was controlled through attention and the camera’s video was transmitted in real time to PC and mobile terminals.
This design won first place in the National College Student Measurement Control and Instrument Design Competition for its innovation and technical merit and received a national patent. In January 2014, Zhejiang University formed the Qingmang Innovation and Entrepreneurship Team to develop thought-controlled fan lights, racing cars, and flying devices, which were exhibited at the Zhejiang Science and Technology Museum. With the thought-controlled fan light, users could control the fan’s rotation with their attention; once spinning, the fan could display preset text. With the thought-controlled racing car, users could control the car’s starting, stopping, and speed through attention. With the thought-controlled flying device, users could control its altitude with attention: the more focused the user, the higher it flew. In March 2014, Zhejiang University developed a thought-controlled vehicle-mounted robotic arm based on the Emotiv headset, allowing users to control the arm’s movement through motor imagery and trigger specific actions through facial expressions.

In June 2015, the School of Mechanical and Power Engineering at Shanghai Jiao Tong University successfully used human thoughts to remotely control live cockroaches. Under the command of the human brain, a cockroach completed tasks such as walking S-shaped and Z-shaped trajectories.

At the release conference for the purely thought-controlled artificial neural rehabilitation robot system, jointly held by Tianjin University and Tianjin People’s Hospital, Ms. Dong, who suffered hemiplegia after a stroke, was able to command her previously immobile limbs to perform corresponding movements simply by thinking, realizing the dream of paralyzed patients to move at will, with thought and action in unison.
This is the latest research achievement of the neural engineering team at Tianjin University and Tianjin People’s Hospital, named “Shengong No. 1.” Compared with the brain-controlled mechanical exoskeleton that debuted at the World Cup, “Shengong No. 1” better demonstrates pure thought control: the experience is not a simple control command but a method of brain-region activation, in which imagined movements correspond to actual limb movements, achieving true synchronization between cortical activity and muscle movement. The system comprises six modules: a non-invasive EEG sensing module, an imagined-motion feature detection module, a motion-intention recognition module, an instruction-encoding interface module, a stimulation-information conditioning module, and a stimulation-current output module. The user wears an EEG detector with electrodes on the head and has electrodes attached to the muscles of the affected limb; connected to “Shengong No. 1,” they can then control previously immobile limbs with their thoughts. The purely thought-controlled artificial neural rehabilitation robot “Shengong No. 2,” also developed by Tianjin University, was released in Yantai, Shandong, and has entered clinical use, enabling some paralyzed patients to regain movement.

In 2008, scientists at the University of Pittsburgh enabled monkeys to control the movement of a robotic arm with their thoughts. In 2009, at the Consumer Electronics Show, the world’s largest toy manufacturer, Mattel, launched MindFlex, a toy based on brainwave technology. MindFlex lets players levitate a small ball by will alone: the more focused the player’s thoughts, the higher the ball floats. Using auxiliary manual controls, players can then navigate the ball through various obstacles.
Just five weeks after launch, the first batch of MindFlex products sold out, and its runaway performance made it Amazon’s top toy on the 2009 Christmas shopping list. In October 2011, scientists at Duke University Medical Center published an article in Nature announcing that they could not only have monkeys move a virtual palm with their thoughts but also have them feel tactile signals when the virtual palm touched objects.

The Russian “Future Research Foundation” has mastered brain-machine interface technology for thought-controlled machines. British researchers have developed a brain-machine interface device for controlling a spacecraft simulator; worn on a test subject’s head, it can successfully control the flight of a model spacecraft, with the potential to merge manned and unmanned systems.

3. Brain-Machine Integration Allows Soldiers to Use Thoughts to Remotely Control Their “Avatar” Stand-ins in Combat

Brain-machine integration refers to combining the brain’s intelligence with computer-based artificial intelligence through a brain-machine interface, using the brain as a neural center in the computer control system to create a new system that unites the flexibility and intelligence of the brain with the speed and capacity of computers for controlling various devices and systems. Control rests neither entirely with the “brain” nor entirely with the computer. For example, in future unmanned ground vehicle applications, the vehicle will mainly be operated by a computer system driven by artificial intelligence technology; however, the “intelligence” of an AI system depends on the battlefield situations it has experienced, and real battlefields are complex and variable. Conventional intelligent systems cannot adapt to changes on the fly, which is why unmanned vehicles sometimes fail to operate as humans intend.
By adding brain control, alerts and mode-switching commands can be issued promptly in emergencies, avoiding accidents. The neural information team at Northwestern Polytechnical University in China has made breakthroughs in thought control of entire drone formations.

Since 2004, the U.S. Defense Advanced Research Projects Agency (DARPA) has invested heavily in research on “thought-controlled robots” across six U.S. laboratories, including the Duke University Neuroengineering Center. Although this “ultimate goal” is still far off, scientists have achieved breakthroughs. In 2008, scientists in North Carolina had a monkey walk upright on a treadmill, retrieved neural signals from electrodes implanted in its brain, and sent these signals along with video to a laboratory in Japan, ultimately allowing the monkey in the U.S. to successfully “control” a robot in the Japanese laboratory with its thoughts. Building on more than a decade of animal experiments, early implantation devices for human use have been designed and manufactured by the U.S. military, which has begun research on the brain-machine interface (BCI) technology depicted in the movie “Avatar,” intending to create giant “mechanical warriors” so that soldiers can remotely control their “Avatar” stand-ins in combat.

In June 2013, a team of Chinese scientists at the University of Minnesota showcased their thought-control research. Unlike earlier thought-control technologies that required electrodes implanted in the brain, this latest technology is completely non-invasive: users only wear a cap fitted with electrodes that record their brainwaves. The EEG scanning cap carries 64 electrodes pressed closely against the scalp, which monitor the brain’s electrical activity and transmit the signals (or signal interruptions) to a computer.
The computer processes these data and converts them into control signals, which are transmitted via Wi-Fi to the flying device’s receiver, thereby controlling its flight. The team demonstrated using thought alone to control a model helicopter’s flight, dive, and ascent. Through brain-machine interface technology, not only drones and other flying devices but also various unmanned ground equipment can be controlled: single-electrode brainwave collection devices with advanced brainwave-processing chips and high-precision gyroscopes gather parameters such as head position, user attention, relaxation level, and raw brainwaves, and use them to control the devices’ actions.

In the future, an individual soldier may control dozens of battlefield robots, assigning specific tasks to each, granting them a degree of autonomy within the task framework, and commanding nearby robots with thought. Robots may also operate independently according to battlefield conditions or collaborate autonomously with manned troops and other unmanned combat units. Commanders and operators could issue commands to organized robot forces from command posts while observing the robots’ actions on screen; soldiers wearing thought-control smart helmets could not only rotate camera and sensor platforms to observe robot actions but also steer robot movements with their thoughts; human pilots could command small groups of nearby drones, or platforms launching multiple small drones, through thought-controlled smart helmet displays.

Of course, while being controlled by human thought, AI robots can also act autonomously, potentially displaying superior intelligence and superhuman capabilities.
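The non-invasive control loop described above (scalp electrodes, feature extraction, command generation, wireless link to the vehicle) can be sketched in a few lines. This is only a minimal illustration under stated assumptions, not the Minnesota team’s actual software: all names (`read_attention`, `Drone`) are hypothetical placeholders, and it assumes a single attention score in [0, 100] of the kind consumer single-electrode headsets report, rather than a decoded 64-channel EEG.

```python
# Minimal sketch of an attention-threshold EEG control loop.
# All names here (read_attention, Drone) are hypothetical placeholders;
# a real BCI would decode multi-channel EEG, not a single attention score.

import random


def read_attention() -> float:
    """Stand-in for an EEG feature extractor: returns an attention
    score in [0, 100], as consumer brainwave headsets report."""
    return random.uniform(0, 100)


class Drone:
    """Stand-in for a Wi-Fi-connected flying device."""

    def __init__(self) -> None:
        self.altitude = 0.0

    def send(self, command: str, step: float = 0.5) -> None:
        # In a real system this would be a Wi-Fi packet to the receiver.
        if command == "ascend":
            self.altitude += step
        elif command == "descend":
            self.altitude = max(0.0, self.altitude - step)
        # "hover" leaves the altitude unchanged


def control_step(drone: Drone, attention: float,
                 up_threshold: float = 70.0,
                 down_threshold: float = 30.0) -> None:
    """Map the attention score to a flight command: the more focused
    the user, the higher the device flies (as with MindFlex)."""
    if attention >= up_threshold:
        drone.send("ascend")
    elif attention <= down_threshold:
        drone.send("descend")
    else:
        drone.send("hover")


if __name__ == "__main__":
    drone = Drone()
    for _ in range(100):  # 100 control ticks
        control_step(drone, read_attention())
    print(f"final altitude: {drone.altitude:.1f} m")
```

In a real brain-machine interface, the feature extractor would band-pass filter the EEG, compute band-power features (a beta/theta ratio is one common attention proxy), and smooth them over time before thresholding; the command channel would also need a watchdog so the vehicle fails safe when the signal drops.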
As the intelligent brains of unmanned systems gradually evolve toward human-level intelligence, they will increasingly break through the boundaries of human military imagination, and the day when intelligent machines surpass human intelligence with superhuman wisdom may no longer be a fantasy. In August 2020, during a simulated air-combat exercise, an AI defeated a top F-16 fighter pilot, with one human pilot admitting, “What we do as fighter pilots is not effective.” In an exercise on April 6, 2023, the U.S. military tested collaboration between AI pilots and human operators in beyond-visual-range missions, marking the first use of low-Earth-orbit satellite communication (LEO SATCOM) in drone history. Human operators sent commands through a hands-on throttle and stick (HOTAS) over the LEO satellite link, which relayed them to the AI pilots; the AI pilots then autonomously tracked targets and maneuvered dynamically, sending updated information back over the same link. During this test, ground operators could quickly train and deploy AI pilots while the drone was airborne, demonstrating GA-ASI’s capability to update AI pilots within minutes.

In early June 2021, the U.S. Navy’s Surface Development Squadron 1 remotely controlled “Ranger,” the second unmanned surface vessel of the U.S. military’s “Ghost Fleet” testing project, from the unmanned operations center at Naval Base San Diego, California, sailing 4,421 nautical miles through the Gulf of Mexico and the Panama Canal into the Pacific Ocean, with 98% of the journey in autonomous navigation mode. The U.S. Navy’s unmanned warship “Sea Hunter” has also completed its first test voyage, sailing thousands of miles without onboard personnel.

The U.S. Navy’s future F/A-XX sixth-generation fighter will participate in carrier operations as a “quarterback” (the on-field commander in American football), using manned/unmanned teaming to provide greater lethality and survivability. The U.S.
company Austal has officially delivered the expeditionary fast transport ship “Apalachicola” to the U.S. Navy, the 13th ship of the Spearhead class and expected to become the largest unmanned ship in the U.S. Navy fleet. The autonomy-capable “Apalachicola” was launched in November 2021 and, during acceptance and unmanned-control tests on September 12, 2022, sailed 678 nautical miles from Mobile, Alabama, to Miami, Florida, spending 85% of the time in autonomous mode. The U.S. Navy has announced plans to build, within the next five years, a fleet of ten large unmanned surface vessels for independent operations or joint operations with surface forces.

In 2015, after first deploying robotic units on the Syrian battlefield, Russia went on to use a robotic legion consisting of six Platform-M tracked robots, four Phantom wheeled robots, one Acacia automated artillery group, and several drones in air-ground coordinated human-machine joint operations against terrorist organizations, achieving striking combat results. In February 2021, Russian media first disclosed footage of the “Orion” drone in combat in Syria, publicly showing its precision strikes against terrorist targets on the ground.

In October 2022, the Ukrainian Navy used suicide drones to attack the Russian Black Sea Fleet base at Sevastopol, an innovative use of such unmanned combat equipment. On February 10, 2023, the Russian military retaliated by using unmanned boats to attack a strategic bridge connecting Ukraine and Moldova. Western media claimed this was the first use of unmanned boats by the Russian military; the Kremlin did not comment.

The Russian Navy is also testing underwater drones (unmanned submersibles) that can monitor the underwater environment, detect approaching enemies, guard ships and underwater facilities, and search for mines, protecting ports against intrusion and sabotage.
Underwater drones can be operated by an operator or act autonomously in designated areas. For instance, the “Gavroche-1R” was Russia’s first serviceable heavy autonomous underwater vehicle, passing navigation tests in the Far East in 2005-2006 and operating in the Arctic in 2007; the “Warrior-D” autonomous underwater vehicle can operate in the deepest parts of the world’s oceans, having been tested in 2019 and descended to the bottom of the Mariana Trench in May 2020, recording a depth of over 10 kilometers.

As a country that places special emphasis on reducing soldier casualties, Israel was the first to launch the modernized “Protector” unmanned surface vessel program, used for patrolling the Lebanese coast and monitoring Hezbollah’s activities and defenses. In 2024, Israel will begin producing a new autonomous drone named “Kizilelma” and plans to conduct formation flight tests with other drones.

In 2016, the British Royal Navy tested an unmanned fleet, including unmanned vessels and drones, off the coasts of Scotland and West Wales. The Royal Navy hopes to field fully automated, unmanned warships equipped with lasers and guided missiles within the next decade, replacing manned frigates and destroyers. It is also exploring the feasibility of using unmanned systems technology on the “Queen Elizabeth” class aircraft carriers and is currently testing the “Banshee” fixed-wing drone, developed by QinetiQ, aboard HMS Prince of Wales.

Iran has fielded multiple series of drones and in May 2022 publicly unveiled its first underground drone base, the country’s first dedicated underground combat facility for drones, housing hundreds of large, medium, and small reconnaissance and strike drones, including the “Swallow-5,” “Migrator-6,” “Fallen Angel,” and “Kaman-22” series.
On July 15, 2022, the Iranian Navy announced the establishment of its first naval drone division.

The development from 5G to 6G will enable unmanned system platforms and combat personnel to connect and communicate across broader spaces, allowing military commanders to delegate more command and decision-making authority to new artificial intelligence algorithms. Commanders will only need to sit in front of a computer, watching video transmitted from cameras, and will be able to control robotic soldiers or unmanned platforms through thoughts and expressions. By then, the systematic deployment of unmanned forces on the battlefield will have become a reality. It can be predicted that future armies will be combined human-machine forces: squad leaders will gradually be replaced by robots, and intelligent command posts, intelligent mock enemies, and unmanned military camps will all emerge.