Know The Scientist
Shuji Nakamura – the ‘Edison’ of the blue LED revolution
Shuji Nakamura’s journey to inventing the blue LED is an example of a ‘high risk, but high reward’ strategy for scientific innovation. It also offers a glimpse of how scientific research unfolds in a corporate environment.

If there is one inspiring story about perseverance, it is Thomas Edison’s much-retold development of a commercially viable incandescent light bulb in 1879.
In response to questions regarding ‘missteps’ with developing the bulb, Edison famously said, “I have not failed 10,000 times—I’ve successfully found 10,000 ways that will not work.”
But scientific research comes with its own complexities and challenges. Edison, arguably, had ample resources at his disposal. What does it take to chart a breakthrough as a scientist in a corporate environment, when all odds are stacked against you?
Today’s EP Know the Scientist profiles a scientist who strove in an environment of desperation and prevailed despite the odds. Rivaling Edison’s stature a century later came the Japanese engineer and physicist Shuji Nakamura.
Nakamura invented the blue light-emitting diode (LED) in 1993, while he was the chief engineer at Nichia Corporation in Japan. The invention of the blue LED set off a second lighting revolution in the mid-1990s.

Shuji Nakamura holding a blue-LED. Credit: Ladislav Markus / Wikimedia
The blue LED had long eluded the electronics industry, which needed it to complete the coveted trio of red, green and blue LEDs. By superimposing and tuning these three colors, the entire visible light spectrum can be covered.
Smartphone screens, digital signage, traffic lights and spotlights all came to depend on LEDs when they were rolled out.
LED bulbs paved the way for lighting that was cheaper and simultaneously energy efficient. They can emit visible light of distinct colors while consuming at least 75% less electricity on average than incandescent light bulbs, whose bulk energy output is wasted as heat, with only about 2% emitted as visible light.
As the world battles for cleaner energy resources, LEDs set off their revolution just as a consensus was forming that we need to be more energy efficient.
The ‘high risk, but high reward’ strategy
Nichia Corporation had been seeing losses in the LED market for years, despite selling red and infrared LEDs. That’s when Shuji Nakamura, the chief designer developing these LEDs, came under pressure to deliver a new product of the company’s own. Nakamura pitched the idea of actually researching and inventing a blue LED – a last-minute gamble to save Nichia’s future. It sounded ludicrous that Nakamura would aim for blue LEDs when semiconductor physicists of far higher repute, working in well-equipped laboratories, had not succeeded.
Physicists had narrowed the search to two materials whose band gaps could make a semiconductor emit blue light upon stimulation: zinc selenide and gallium nitride. Gallium nitride was deemed too difficult to grow into a near-perfect crystal lattice, so physicists pushed their resources towards unlocking zinc selenide instead. Playing his high-risk, high-reward strategy, Nakamura decided to dedicate his time to researching gallium nitride!
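The color an LED emits follows directly from the semiconductor’s band gap: a photon carries energy E = hc/λ, so a wider gap means bluer light. A minimal sketch of the arithmetic (the band-gap values below are textbook approximations, not figures from this article):

```python
# Photon wavelength from a semiconductor band gap, via E = hc/lambda.
# Rule of thumb: lambda (nm) ~ 1240 / E_gap (eV).

PLANCK_EV_S = 4.135667e-15        # Planck constant, eV*s
LIGHT_SPEED_NM_S = 2.99792458e17  # speed of light, nm/s

def emission_wavelength_nm(band_gap_ev: float) -> float:
    """Wavelength (nm) of a photon whose energy equals the band gap."""
    return PLANCK_EV_S * LIGHT_SPEED_NM_S / band_gap_ev

# Pure gallium nitride has a band gap of about 3.4 eV -> near-ultraviolet;
# practical blue LEDs use indium-gallium-nitride layers with a gap
# near 2.7 eV -> roughly 460 nm, squarely in the blue.
print(round(emission_wavelength_nm(3.4)))  # about 365 nm
print(round(emission_wavelength_nm(2.7)))  # about 459 nm
```

This is why the band gap, not the bulb’s coating, decides an LED’s color: zinc selenide and gallium nitride were the only two candidates with gaps wide enough for blue.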

Gallium nitride crystal. Credit: Opto-p / Wikimedia
The reason was partly that he had a contingency plan in place. In Japan at the time, a researcher could be awarded a PhD on the strength of five published papers. Nakamura was banking on continuing his research until he got those five papers published and was assured a PhD. That doesn’t mean he was desperate enough to lose sight of his original goal of researching gallium nitride. By pursuing this semiconductor, he could avoid competition from physicists everywhere else, and he had a higher chance of reporting something novel enough to merit a paper.
Building academic networks
There were other challenges too. A healthy research environment wasn’t assured, despite backing from Nobuo Ogawa, then Nichia’s boss. His son, Eiji Ogawa, who succeeded him, had friction with Nakamura. Despite Eiji’s repeated attempts to get him to pursue zinc selenide instead, Nakamura remained as stubborn as ever.
As much as Nakamura’s story sounds like that of a prodigious lone engineer making a discovery that escaped many, it isn’t quite true. Nakamura did occasionally collaborate with other researchers for leads while working on gallium nitride.
Japan, for one, had some of the brightest minds in the world at the forefront of this research. So did the US.

Representative image of a chemical vapor deposition (CVD) reaction chamber. Credit: NASA / Wikimedia
Nakamura kept abreast of developments on gallium nitride, traveling to Nagoya University in Japan, where Hiroshi Amano and Isamu Akasaki had found a way to grow a near-perfect crystal lattice.
In the US, Nakamura learnt the art of designing a ‘metal-organic chemical vapor deposition (MOCVD) reactor’. This technique is used across materials physics to grow a thin layer of a material over a substrate.
Back in Japan, Nakamura built extra modifications into the MOCVD reactor. By adding an extra nozzle, he created the ‘two-flow MOCVD’ technique that at last solved the blue-LED conundrum. And the rest is history.
Nobel Prize and his 70th birthday
Nakamura shared the 2014 Nobel Prize in Physics with Amano and Akasaki of Nagoya University, “for the invention of efficient blue light-emitting diodes which has enabled bright and energy-saving white light sources.”
Nichia’s fortunes grew like never before. From gross receipts of $200 million in 1993, the company reached $800 million by 2001 – with 60% of it contributed by sales of Nakamura’s blue-LED technology. Nichia has since supplied LEDs to big clients, including Apple and other electronics companies, and remains a market leader in Japan.
Nakamura himself, however, received no favors from Nichia, which barely increased his paycheck after the invention that set the precedent for lighting across the world. He decided to leave for the US, where he got much better opportunities and a salary. But he left only after fulfilling his backup plan of publishing five papers, earning an engineering PhD from the University of Tokushima in 1995.
Nichia followed him even to the US.
A decade ago, Nichia filed a lawsuit against him, alleging misuse of its blue-LED intellectual property at a company Nakamura worked with. However, Nakamura prevailed and received about $8 million in a settlement.
Nakamura, who turns 70 this year, holds a position as a professor of engineering at the University of California, Santa Barbara, US. Even at 70, he hasn’t retired.
An engineer by training who won a physics prize, his is a tale that isn’t told often enough: a story of indomitable resilience, unwavering in the face of every challenge.
Know The Scientist
Pierre Curie: The precision of a scientific pioneer
Pierre Curie is perhaps best known for his work on magnetism

Pierre Curie (1859–1906) was a man whose legacy has shaped the course of modern science, yet his name is often overshadowed by that of his famous wife, Marie Curie. Despite this, Pierre’s contributions to physics, particularly in the field of magnetism and the discovery of radioactivity, were revolutionary and continue to influence scientific research today.
Early Life and Education
Born in Paris on May 15, 1859, Pierre Curie grew up in an intellectually stimulating environment. His father, Eugene, was a physician, and his mother, Sophie, was a teacher, which cultivated in Pierre a deep passion for learning. From an early age, Pierre showed an exceptional aptitude for mathematics and physics, subjects that would later define his career.
By the time Pierre was 16, he had completed his bachelor’s degree, and at 18 he earned a licence in physical sciences from the prestigious Sorbonne in Paris. This early foundation in scientific inquiry laid the groundwork for his future innovations.
Innovative Work in Magnetism and Crystallography
Pierre Curie is perhaps best known for his work on magnetism. In 1880, together with his brother Jacques Curie, he discovered piezoelectricity in crystals; later, in his 1895 doctoral research, he identified the Curie point – the temperature at which certain magnetic materials lose their magnetism. This work, foundational in the study of thermodynamics and magnetism, continues to be a key concept in modern physics.
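Quantitatively, the Curie point appears in the Curie–Weiss law, a standard textbook relation (not spelled out in this article) for the magnetic susceptibility of such a material above its transition temperature:

```latex
% Curie–Weiss law: the susceptibility \chi diverges as the temperature T
% approaches the Curie point T_C from above; C is the Curie constant.
\chi = \frac{C}{T - T_C}
```

As T falls toward T_C, the susceptibility grows without bound, signaling the onset of spontaneous magnetic order.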
Additionally, Pierre Curie’s research in crystallography and his study of the magnetic properties of materials played a pivotal role in the development of solid-state physics. His work laid the foundation for understanding the relationship between a material’s structure and its magnetic properties, which remains essential in materials science today.
The Discovery of Radioactivity
However, Pierre Curie’s most significant contribution came from his work on radioactivity, which would forever alter the understanding of matter itself. In the late 19th century, the mysterious rays emitted by certain substances, like uranium, intrigued scientists. Working alongside his wife, Marie Curie, Pierre embarked on a series of experiments to better understand this phenomenon.
Their work, starting in 1898, led to the discovery of two new elements: polonium and radium. Marie Curie coined the term “radioactivity” to describe the spontaneous emission of radiation from these elements, but it was Pierre’s precise experimental methods and scientific rigor that helped bring clarity to the phenomenon. Their discovery of radium, in particular, was a breakthrough that would lead to numerous advancements in medical treatments, including cancer therapy.
Nobel Recognition and Collaboration with Marie Curie
In 1903, Pierre Curie, together with Marie Curie and Henri Becquerel, was awarded the Nobel Prize in Physics for their joint work on radioactivity. The recognition marked the first time a Nobel Prize had been awarded to a couple. However, what makes this achievement particularly notable is that Pierre Curie insisted that Marie be included in the award, a gesture that demonstrated not only his scientific partnership with his wife but also his support for women in science, a rare stance in the male-dominated field of the time.
Pierre Curie’s dedication to scientific rigor and his ability to work collaboratively with Marie, his wife and fellow scientist, was vital to their success. Their work would not only earn them the Nobel Prize but also set the stage for later advancements in nuclear physics and medicine.
Tragic Loss and Enduring Legacy
Tragically, Pierre Curie’s life was cut short in 1906 when he was killed in a street accident at the age of 46. His death was a blow to both the scientific community and his family. However, his legacy continued through his wife, Marie, who carried on their groundbreaking work and became the first woman to win a second Nobel Prize.
Today, Pierre Curie is remembered as a visionary physicist whose discoveries were instrumental in shaping the fields of physics, chemistry, and medicine. His contributions to magnetism, crystallography, and radioactivity remain foundational to scientific inquiry. His work continues to inspire scientists across disciplines and serves as a reminder of the power of precision, collaboration, and dedication in the pursuit of knowledge.
Know The Scientist
The ‘Godfather of AI’ has a warning for us
The speed with which large language models such as ChatGPT have come to the fore has re-invigorated serious discussion about AI ethics and safety among scientists and humanities scholars alike.

The quest to develop artificial intelligence (AI) in the 20th century had entrants coming in from various fields, mostly mathematicians and physicists.
Geoffrey Hinton, famously known today as the ‘godfather of AI’, at one point dabbled in cognitive psychology as a young undergraduate at Cambridge. Lured by the nascent field of AI in the 1970s, Hinton did a PhD at Edinburgh, where he helped revive the idea of artificial neural networks (ANNs). ANNs mimic neuronal connections in animal brains and have been the staple of mainstream AI research. Hinton, a British-born Canadian, has since moved to the University of Toronto, where he is a professor of computer science.
In 2018, Hinton’s contributions to computer science and AI caught up with him: he was awarded a share of the coveted Turing Award, popularly known as the ‘Nobel Prize of computing’. His 1986 work on ‘back propagation’ helped provide the blueprint for how machines learn, also earning him recognition as one of the ‘fathers of deep learning’.
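The core of backpropagation is pushing errors backwards through a network, layer by layer, to compute gradients, then nudging the weights downhill. A toy sketch of the idea on the classic XOR problem (illustrative only, not Hinton’s actual code; the architecture and learning rate here are arbitrary choices):

```python
# Backpropagation sketch: a 2-4-1 sigmoid network trained on XOR,
# a problem no single-layer network can solve.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros((1, 4))  # input -> hidden
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros((1, 1))  # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward():
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

def mse(pred):
    return float(np.mean((pred - y) ** 2))

loss_before = mse(forward()[1])

lr = 1.0
for _ in range(5000):
    h, out = forward()                    # forward pass
    d_out = (out - y) * out * (1 - out)   # error at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # error pushed back to hidden layer
    W2 -= lr * h.T @ d_out                # gradient-descent weight updates
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

loss_after = mse(forward()[1])
print(loss_before, "->", loss_after)  # the error shrinks as the network learns
```

The ‘backward’ lines are just the chain rule applied layer by layer – the insight the 1986 paper popularized.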
The last two years saw artificial intelligence become commonplace in public discourse on technology. Leading the charge was OpenAI’s ChatGPT, as large language models (LLMs) found use in a whole host of settings across the globe. OpenAI, Google, Microsoft and their likes are engaged in upping the ante.
But this sudden spurt has alarmed many and is re-invigorating a serious discussion about AI ethics and safety. Last year, Elon Musk was among the signatories of a letter requesting a temporary halt to AI research, fearing the ever-increasing odds that sentient AI may be on the horizon. But some sociologists believe this risk is overplayed by billionaires so that the real-world problems posed by AI get swept under the carpet. For example, job losses will occur for which there is no solution in sight about how to compensate those who lose their work.
Computer scientists like Hinton, however, have come to the fore to make their views explicitly clear. In fact, Hinton ended his decade-long association with Google last year to speak freely about what he saw as a competition between technology companies to climb upon each other’s advances. He, like many computer scientists, believes humanity is at a ‘turning point’ with AI, especially with LLMs like ChatGPT at the fore.
“It’s [LLMs] very exciting,” said Hinton in a Science article. “It’s very nice to see all this work coming to fruition. But it’s also scary.”
One research study suggests these LLMs are anything but ‘stochastic parrots’ that merely output what they have been instructed to. This doesn’t mean AI is anywhere close to being sentient today. However, Hinton and other computer scientists fear humanity may unwittingly run the real risk of creating one. In fact, Hinton was one of several signatories to an open letter requesting policymakers to consider the existential risk of AI.
What a sentient AI, or artificial general intelligence (AGI) as it’s technically called, would be varies in definition with the scientists researching it. None exists today, and it’s safe to say nobody knows what one would look like. In popular lore, it can simply mean Skynet from the Terminator movies becoming ‘self-aware’. Hinton is of the opinion that AI has already surpassed biological intelligence in some ways. However, it must be borne in mind that today’s AI is no more sentient than it is a mere stochastic parrot. Hinton doesn’t say more powerful AI would make all humans redundant. But AI could do many routine tasks humans currently do, and thus replace them in those roles in time. Navigating these changes requires views that are transdisciplinary.
Know The Scientist
The astrophysicist who featured in TIME’s most influential personality list
Priyamvada Natarajan’s contributions to astronomy have shed light on two major research interests in contemporary astrophysics – the origins of supermassive black holes, and mapping dark matter in the universe.

For Priyamvada Natarajan, the earliest exposure to scientific research arose from a childhood passion for making star maps. Her love for maps never abated, and it shaped her career as a theoretical astrophysicist. In the media, she’s the famous ‘cosmic cartographer’, who featured in TIME magazine’s list of the 100 most influential personalities this year.
“I realise what an honour and privilege this is,” said Natarajan to The Hindu. “It sends a message that people working in science can be seen as influential, and that is very gratifying.”
The Indian-American’s claim to fame arises from her pathbreaking research into dark matter and supermassive black holes.
She devised a mathematical technique to chart dark matter clumps across the universe. Though dark matter is invisible and elusive to astronomers, it is thought to make up most of the universe’s matter. Dark matter clumps act as ‘scaffolding’, in Natarajan’s words, over which galaxies form. When light from background galaxies passes through the gravitational influence of dark matter clumps, it bends as it would through a lens. Natarajan exploited this effect, called gravitational lensing, to map dark matter clumps across the universe.
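The bending that lensing relies on can be illustrated with the textbook general-relativity formula for light deflected by a point mass, α = 4GM/(c²b), where b is how closely the ray passes (this worked example is an illustration, not Natarajan’s technique, which deals with extended dark matter distributions):

```python
# Deflection of light by a point mass: alpha = 4GM / (c^2 * b).
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def deflection_arcsec(mass_kg: float, impact_parameter_m: float) -> float:
    """Deflection angle (arcseconds) of a light ray passing a point mass."""
    alpha_rad = 4 * G * mass_kg / (C**2 * impact_parameter_m)
    return alpha_rad * 180 / 3.141592653589793 * 3600

# Light grazing the Sun (radius ~6.96e8 m) bends by ~1.75 arcseconds,
# the value famously confirmed during the 1919 solar eclipse.
print(round(deflection_arcsec(M_SUN, 6.96e8), 2))  # 1.75
```

For a galaxy-cluster-scale clump of dark matter, the same physics distorts background galaxies into the arcs Natarajan’s method inverts to reconstruct the invisible mass.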

Simulation of dark matter clumps and gas forming galaxies. Credit: Illustris Collaboration
Natarajan reflected on her passion for mapping in a TEDx talk at Yale University, where she’s a professor of physics and astronomy. Though an ‘armchair’ cartographer, in her own description, she has tackled another major puzzle in astronomy – nailing down the origins of supermassive black holes.
Black holes generally form from dying stars that collapse under their own weight due to gravity. These black holes then swallow gas from their environment to grow heavier. However, there also exist supermassive black holes in the universe, millions of times heavier than any star or stellar-sized black hole, whose formation can’t be explained by the dying-star collapse theory. One example is Sagittarius A* at the center of the Milky Way, a whopping four million times more massive than our sun.

First direct image of Sagittarius A* at the Milky Way center. Credit: EHT
The origins of these behemoths remained in the dark until Natarajan and her collaborators shed some light on them. In their theory, massive clumps of gas in the early universe collapse under their own weight to directly form a ‘seed’ supermassive black hole, which then grows like its stellar-massed counterparts by swallowing gas from its environment. In 2023, astronomers found compelling evidence to validate her theory: they reported a supermassive black hole powering the ancient quasar UHZ1, at an epoch too early for any black hole to have grown so massive by ordinary means.
These observations came nearly two decades after Natarajan’s first paper on the idea in 2005. In a 2018 interview to Quanta, she said she would be content with her contributions to astrophysics even if her theory was never experimentally verified in her lifetime, satisfied that her ideas resonated with astronomers enough for them to go searching for her black holes. “I’m trying to tell myself that even that would be a supercool outcome,” she said in that interview. “But finding [the supermassive black hole ‘seed’] would be just so awesome.”
Beyond science, Natarajan is a well-sought-after public speaker with pursuits in the humanities. At Yale University, she’s the director of the Franke Program in Science and the Humanities, which fosters links between the two disciplines. Her connection to the humanities goes back to MIT, where she earned degrees in physics and mathematics before taking a three-year hiatus from science to explore her interest in the philosophy of science. She returned to astronomy soon thereafter, enrolling as a PhD student at Cambridge, where she worked under the noted astronomer Martin Rees on black holes in the early universe – work that seeded her success in later years.