#new #hollywood #hemispheres #hippocampus #and #cerebrum #consciousness #knowledge #sessions #facilitated #by #the #legend #team
Hollywood Hemispheres
Hippocampus and Cerebrum Consciousness Knowledge Sessions.
Each month, the Literary Arts Correspondent for Legend Magazine and Hollywood Hemispheres and Cerebrum Consciousness Lecturer will provide a piece of knowledge in the form of visual poetry, a poetry thought prompt, auditory poetry, or written information.
The knowledge will focus on a range of scientific subjects, e.g. the anatomy and physiology of the body, health studies, health research, nutrition and digestion, along with other scientific information and research from around the world.
The cerebrum is the part of a person that feels, thinks, perceives, wills, and especially reasons, and it is responsible for conscious thought, reasoning, memory and emotions.
The largest part of the brain is divided into two hemispheres, or halves, called the cerebral hemispheres. Areas within the cerebrum control muscle functions and also control speech, thought, emotions, reading, writing, and learning.
What part of the brain is responsible for knowledge?
The cerebrum sits at the topmost part of the brain and is the source of intellectual activities. It holds your memories, allows you to plan, enables you to imagine and think. It allows you to recognize friends, read books, and play games.
Hemispheres of The Brain
In general, the left hemisphere controls speech, comprehension, arithmetic, and writing. The right hemisphere controls creativity, spatial ability, artistic, and musical skills. The left hemisphere is dominant in hand use and language in about 92% of people.
We look forward to expanding and exploring your knowledge with Hollywood Hemispheres
Hippocampus and Cerebrum Consciousness Sessions.
© 2023 Carol Natasha Diviney
Literary Arts Correspondent Legend Magazine and Hollywood Hemispheres and Cerebrum Consciousness Lecturer.
Facilitated by The Legend Team.
September/October 2023 Post:
Instant evolution: AI designs new robot from scratch in seconds
First AI capable of intelligently designing new robots that work in the real world
Date:
October 3, 2023
Source:
Northwestern University
Summary:
Researchers developed the first AI to date that can intelligently design robots from scratch by compressing billions of years of evolution into mere seconds. It's not only fast but also runs on a lightweight computer and designs wholly novel structures from scratch — without human-labeled, bias-filled datasets.
Northwestern University researchers have developed the first artificial intelligence (AI) to date that can intelligently design robots from scratch.
To test the new AI, the researchers gave the system a simple prompt: Design a robot that can walk across a flat surface. While it took nature billions of years to evolve the first walking species, the new algorithm compressed evolution to lightning speed -- designing a successfully walking robot in mere seconds.
But the AI program is not just fast. It also runs on a lightweight personal computer and designs wholly novel structures from scratch. This stands in sharp contrast to other AI systems, which often require energy-hungry supercomputers and colossally large datasets. And even after crunching all that data, those systems are tethered to the constraints of human creativity -- only mimicking humans' past works without an ability to generate new ideas.
The study will be published on Oct. 3 in the Proceedings of the National Academy of Sciences.
"We discovered a very fast AI-driven design algorithm that bypasses the traffic jams of evolution, without falling back on the bias of human designers," said Northwestern's Sam Kriegman, who led the work. "We told the AI that we wanted a robot that could walk across land. Then we simply pressed a button and presto! It generated a blueprint for a robot in the blink of an eye that looks nothing like any animal that has ever walked the earth. I call this process 'instant evolution.'"
Kriegman is an assistant professor of computer science, mechanical engineering and chemical and biological engineering at Northwestern's McCormick School of Engineering, where he is a member of the Center for Robotics and Biosystems. David Matthews, a scientist in Kriegman's laboratory, is the paper's first author. Kriegman and Matthews worked closely with co-authors Andrew Spielberg and Daniela Rus (Massachusetts Institute of Technology) and Josh Bongard (University of Vermont) for several years before their breakthrough discovery.
From xenobots to new organisms
In early 2020, Kriegman garnered widespread media attention for developing xenobots, the first living robots made entirely from biological cells. Now, Kriegman and his team view their new AI as the next advance in their quest to explore the potential of artificial life. The robot itself is unassuming -- small, squishy and misshapen. And, for now, it is made of inorganic materials. But Kriegman says it represents the first step in a new era of AI-designed tools that, like animals, can act directly on the world.
"When people look at this robot, they might see a useless gadget," Kriegman said. "I see the birth of a brand-new organism."
Zero to walking within seconds
While the AI program can start with any prompt, Kriegman and his team began with a simple request to design a physical machine capable of walking on land. That's where the researchers' input ended and the AI took over.
The computer started with a block about the size of a bar of soap. It could jiggle but definitely not walk. Knowing that it had not yet achieved its goal, the AI quickly iterated on the design. With each iteration, the AI assessed its design, identified flaws and whittled away at the simulated block to update its structure. Eventually, the simulated robot could bounce in place, then hop forward and then shuffle. Finally, after just nine tries, it generated a robot that could walk half its body length per second -- about half the speed of an average human stride.
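The paragraph above describes an iterative assess-and-update loop. The sketch below is only a toy illustration of that kind of loop in Python, not the authors' actual design algorithm (which is detailed in the PNAS paper): the voxel "design", the scoring function and the mutation step are all made-up stand-ins for a real physics simulation and design-update rule.

import random

# Toy illustration of an assess-and-update design loop.
# NOT the authors' algorithm: the voxel "design", the scoring function and
# the mutation step are stand-ins for a physics simulation and a real
# design-improvement rule.

def simulate_walking_speed(design):
    """Stand-in for a physics simulation: score a design (a list of 0/1
    voxels) by the fraction of voxels switched on."""
    return sum(design) / len(design)

def mutate(design):
    """Stand-in update step: flip one randomly chosen voxel."""
    i = random.randrange(len(design))
    new_design = list(design)
    new_design[i] = 1 - new_design[i]
    return new_design

def design_walker(n_voxels=20, target_speed=0.5, max_iterations=9):
    design = [0] * n_voxels                        # start from a featureless block
    best_speed = simulate_walking_speed(design)
    for iteration in range(1, max_iterations + 1):
        candidate = mutate(design)                 # propose a structural change
        speed = simulate_walking_speed(candidate)  # assess the new design
        if speed > best_speed:                     # keep it only if it improves
            design, best_speed = candidate, speed
        print(f"iteration {iteration}: best speed {best_speed:.2f} body lengths/s")
        if best_speed >= target_speed:
            break                                  # goal reached
    return design

if __name__ == "__main__":
    design_walker()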
The entire design process -- from a shapeless block with zero movement to a full-on walking robot -- took just 26 seconds on a laptop.
"Now anyone can watch evolution in action as AI generates better and better robot bodies in real time," Kriegman said. "Evolving robots previously required weeks of trial and error on a supercomputer, and of course before any animals could run, swim or fly around our world, there were billions upon billions of years of trial and error. This is because evolution has no foresight. It cannot see into the future to know if a specific mutation will be beneficial or catastrophic. We found a way to remove this blindfold, thereby compressing billions of years of evolution into an instant."
Rediscovering legs
The robot has three legs, fins along its back, a flat face and is riddled with holes.
"It's interesting because we didn't tell the AI that a robot should have legs," Kriegman said. "It rediscovered that legs are a good way to move around on land. Legged locomotion is, in fact, the most efficient form of terrestrial movement."
To see if the simulated robot could work in real life, Kriegman and his team used the AI-designed robot as a blueprint. First, they 3D printed a mold of the negative space around the robot's body. Then, they filled the mold with liquid silicone rubber and let it cure for a couple hours. When the team popped the solidified silicone out of the mold, it was squishy and flexible.
Now, it was time to see if the robot's simulated behavior -- walking -- was retained in the physical world. The researchers filled the rubber robot body with air, making its three legs expand. When the air was released from the robot's body, the legs contracted. As air was repeatedly pumped in and then released, the robot expanded and contracted -- producing slow but steady locomotion.
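As a rough illustration of the inflate/deflate gait described above, here is a toy Python control loop. The Pump class and the timings are hypothetical stand-ins, not the hardware interface or code the team actually used.

import time

# Toy control loop for the inflate/deflate gait described above.
# The Pump class and the timings are hypothetical stand-ins, not the
# hardware interface or code the researchers used.

class Pump:
    """Pretend pump driver; a real setup would command a pump/valve board."""
    def inflate(self):
        print("pumping air in  -> the three legs expand")
    def deflate(self):
        print("venting air out -> the legs contract")

def walk(pump, steps=5, inflate_seconds=1.0, deflate_seconds=1.0):
    for _ in range(steps):
        pump.inflate()
        time.sleep(inflate_seconds)   # legs expand and push against the ground
        pump.deflate()
        time.sleep(deflate_seconds)   # legs contract, body settles forward

walk(Pump())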
Unfamiliar design
While the evolution of legs makes sense, the holes are a curious addition. AI punched holes throughout the robot's body in seemingly random places. Kriegman hypothesizes that porosity removes weight and adds flexibility, enabling the robot to bend its legs for walking.
"We don't really know what these holes do, but we know that they are important," he said. "Because when we take them away, the robot either can't walk anymore or can't walk as well."
Overall, Kriegman is surprised and fascinated by the robot's design, noting that most human-designed robots either look like humans, dogs or hockey pucks.
"When humans design robots, we tend to design them to look like familiar objects," Kriegman said. "But AI can create new possibilities and new paths forward that humans have never even considered. It could help us think and dream differently. And this might help us solve some of the most difficult problems we face."
Potential future applications
Although the AI's first robot can do little more than shuffle forward, Kriegman imagines a world of possibilities for tools designed by the same program. Someday, similar robots might be able to navigate the rubble of a collapsed building, following thermal and vibrational signatures to search for trapped people and animals, or they might traverse sewer systems to diagnose problems, unclog pipes and repair damage. The AI also might be able to design nano-robots that enter the human body and steer through the blood stream to unclog arteries, diagnose illnesses or kill cancer cells.
"The only thing standing in our way of these new tools and therapies is that we have no idea how to design them," Kriegman said. "Lucky for us, AI has ideas of its own."
Story Source:
Materials provided by Northwestern University. Original written by Amanda Morris. Note: Content may be edited for style and length.
Journal Reference:
David Matthews, Andrew Spielberg, Daniela Rus, Sam Kriegman, Josh Bongard. Efficient automatic design of robots. Proceedings of the National Academy of Sciences, 2023; 120 (41) DOI: 10.1073/pnas.2305180120
July 2023 Post.
How does the human brain learn to read?
Our brains aren't pre-wired to translate letters into sounds. We learn to read by repurposing parts of the brain meant to do other things — visual processing, language comprehension, and speech production. Researchers have studied these areas using a type of brain imaging called functional MRI (fMRI).
Is the parietal lobe responsible for reading?
One of the primary roles of the parietal cortex lies in the integration of somatosensory and visual information that is needed for movement planning and control. In addition, specific areas of the parietal lobe are relevant for cognitive processes such as reading comprehension, and logical and mathematical thinking.
What are the lobes of the cerebral hemisphere?
The cerebrum consists of two cerebral hemispheres, each made up of an outer layer of gray matter called the cortex and an inner core of white matter. The cortex is divided into four lobes: the frontal, parietal, temporal and occipital lobes.
Which part of the brain is deficient in dyslexics?
In dyslexia, the rear brain systems of the left hemisphere fail to function properly during reading. Furthermore, many people with dyslexia show greater activation in the lower frontal areas of the brain (cited from Reading Rockets).
Date:
June 27, 2023
Source:
University of Cambridge
Summary:
Professor Barbara Sahakian from the Department of Psychiatry at the University of Cambridge said: “Reading isn’t just a pleasurable experience – it’s widely accepted that it inspires thinking and creativity, increases empathy and reduces stress. But on top of this, we found significant evidence that it’s linked to important developmental factors in children, improving their cognition, mental health, and brain structure, which are cornerstones for future learning and well-being.”
Children who begin reading for pleasure early in life tend to perform better at cognitive tests and have better mental health when they enter adolescence, a study of more than 10,000 young adolescents in the US has found.
In a study published in Psychological Medicine, researchers in the UK and China found that 12 hours a week was the optimal amount of reading, and that this was linked to improved brain structure, which may help explain the findings.
Reading for pleasure can be an important and enjoyable childhood activity. Unlike listening and spoken language, which develop rapidly and easily in young children, reading is a taught skill and is acquired and developed through explicit learning over time.
During childhood and adolescence, our brains develop, making this an important time in which to establish behaviours that support our cognitive development and promote good brain health. However, until now it has been unclear what impact – if any – encouraging children to read from an early age will have on their brain development, cognition and mental health later in life.
To investigate this, researchers from the universities of Cambridge and Warwick in the UK and Fudan University in China looked at data from the Adolescent Brain and Cognitive Development (ABCD) cohort in the US, which recruited more than 10,000 young adolescents.
The team analysed a wide range of data including from clinical interviews, cognitive tests, mental and behavioural assessments and brain scans, comparing young people who began reading for pleasure at a relatively early age (between two and nine years old) against those who began doing so later or not at all. The analyses controlled for many important factors, including socio-economic status.
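For readers unfamiliar with the phrase "controlled for", the toy sketch below shows the general idea: the factor (here, socio-economic status) is entered into the model as a covariate alongside early reading, so its influence is estimated separately. The data and model are invented for illustration and are not the study's actual analysis.

import numpy as np

# Toy illustration of "controlling for" a covariate; NOT the study's analysis.
# We regress an invented cognition score on early reading while also including
# socio-economic status (SES) in the model, so the reading coefficient is
# estimated with SES held constant.

rng = np.random.default_rng(0)
n = 1000
early_reading = rng.integers(0, 2, n)          # 1 = began reading for pleasure early (toy data)
ses = rng.normal(0.0, 1.0, n)                  # invented SES score
cognition = 0.5 * early_reading + 0.8 * ses + rng.normal(0.0, 1.0, n)

X = np.column_stack([np.ones(n), early_reading, ses])   # intercept, reading, SES
coefs, *_ = np.linalg.lstsq(X, cognition, rcond=None)
print(f"adjusted early-reading effect (toy data): {coefs[1]:.2f}")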
Of the 10,243 participants studied, just under a half (48%) had little experience of reading for pleasure or did not begin doing so until later in their childhood. The remaining half had spent between three and ten years reading for pleasure.
The team found a strong link between reading for pleasure at an early age and positive performance in adolescence on cognitive tests measuring such factors as verbal learning, memory and speech development, as well as better academic achievement at school.
These children also had better mental wellbeing, as assessed using a number of clinical scores and reports from parents and teachers, showing fewer signs of stress and depression, as well as improved attention and fewer behavioural problems such as aggression and rule-breaking.
Children who began reading for pleasure earlier also tended to spend less time on screens – for example watching TV or using a smartphone or tablet – during the week and at weekends in adolescence, and also tended to sleep longer.
When the researchers looked at brain scans from the adolescent cohort, they found that those participants who had taken to reading for pleasure at an early age showed moderately larger total brain areas and volumes, including in particular brain regions that play critical roles in cognitive functions. Other brain regions that were different among this group were those that have been previously shown to relate to improved mental health, behaviour and attention.
Professor Barbara Sahakian from the Department of Psychiatry at the University of Cambridge said: “Reading isn’t just a pleasurable experience – it’s widely accepted that it inspires thinking and creativity, increases empathy and reduces stress. But on top of this, we found significant evidence that it’s linked to important developmental factors in children, improving their cognition, mental health, and brain structure, which are cornerstones for future learning and well-being.”
The optimal amount of reading for pleasure as a young child was around 12 hours per week. Beyond this, there appeared to be no additional benefits. In fact, there was a gradual decrease in cognition, which the researchers say may be because heavy readers spend more time sedentary and less time on other cognitively enriching activities, such as sports and social activities.
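To make the "optimal amount" idea concrete, the toy curve below has the inverted-U shape described above, with its peak placed at 12 hours per week to match the article; the scale of the curve is invented and is not the study's estimate.

import numpy as np

# Toy inverted-U curve peaking at 12 hours/week, as described above.
# Only the peak location comes from the article; the rest is invented.
hours = np.arange(0, 25)
toy_benefit = 1.0 - (hours - 12) ** 2 / 144.0   # rises to a peak at 12, then declines

peak = hours[np.argmax(toy_benefit)]
print(f"toy curve peaks at {peak} hours of reading per week")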
Professor Jianfeng Feng from Fudan University in Shanghai, China, and the University of Warwick, UK, said: “We encourage parents to do their best to awaken the joy of reading in their children at an early age. Done right, this will not only give them pleasure and enjoyment, but will also help their development and encourage long-term reading habits, which may also prove beneficial into adult life.”
Funders included Wellcome, the National Institute for Health & Care Research (UK) and the National Natural Science Foundation of China.
Journal Reference:
Yun-Jun Sun, Barbara J. Sahakian, Christelle Langley, Anyi Yang, Yuchao Jiang, Jujiao Kang, Xingming Zhao, Chunhe Li, Wei Cheng, Jianfeng Feng. Early-initiated childhood reading for pleasure: associations with better cognitive performance, mental well-being and brain structure in young adolescence. Psychological Medicine, 2023; 1 DOI: 10.1017/S0033291723001381