
Know The Scientist

Paul Dirac – the mystifying genius with a doleful past

Meet the enigmatic British physicist who combined quantum mechanics with Einstein’s theory of special relativity.

Karthik Vinod

Credit: Jijin M.K. / EdPublica

The Nobel-winning contribution

Quantum mechanics presented a radical new picture of the world, one that defied intuition: in this theory, reality can no longer be predicted with certainty. That clashed directly with the determinism underpinning classical physics, from Newtonian mechanics to Einstein’s theories of relativity.

The theory didn’t develop overnight, though. It went through years of iteration as key principles were added one by one. Nor, given its intuition-defying predictions, did it win acceptance overnight. One famous detractor was Albert Einstein, who declared, “God does not play dice,” at the 1927 Solvay Conference. Niels Bohr, the other great physicist of the era, retorted somewhat humorously, “Einstein, don’t tell God what to do!”

Despite quantum theory’s successes in the 1920s, its proponents knew it was incompatible with Einstein’s special theory of relativity. Einstein’s theory postulates that nothing in the universe can cross the speed-of-light barrier; any object with mass would need unlimited energy to reach the speed of light. The postulate holds as true as ever and is a fundamental tenet of modern physics. Quantum theory, however, which until then lacked strong mathematical rigor, did not incorporate it. At least, not until Paul Dirac arrived in 1928 with his eponymous equation.

Dirac began as a skeptic of quantum theory, sharing Einstein’s sentiments. His biographer, Graham Farmelo, recalled how Dirac, as a young PhD student, derided Heisenberg’s breakthrough work, scribbling “Nature can’t be this complicated” across the top of the paper and brushing it aside.

But not for long. In what Dirac fondly recalled as one of his most profound pieces of work, he proved that Heisenberg’s and Schrödinger’s mathematical descriptions of quantum theory predicted essentially the same physical phenomena. He became a convert at last, championing the theory he had once dismissed.

But Dirac’s magnum opus came when he developed the relativistic wave equation that finally stitched quantum mechanics together with Einstein’s special theory of relativity.

Dirac was obsessed with the mathematical rigor he knew quantum theory so desperately needed. Until then, experiments had typically led the way, with theoretical models constructed afterwards to match reality. The Dirac equation, by contrast, predicted novel phenomena hitherto undreamt of. It accounted for puzzling phenomena already attributed to quantum mechanics, including intrinsic spin and magnetism, and it predicted antimatter: particles with the same mass as ordinary matter but opposite electric charge. For these achievements, Dirac shared the 1933 Nobel Prize in Physics with Erwin Schrödinger; at 31, he was then the youngest theoretician to have won it.
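
In modern notation, and in natural units where ħ = c = 1, the equation Dirac arrived at can be written compactly; this is the standard textbook form, not Dirac’s original 1928 notation:

```latex
% Dirac equation for a spin-1/2 particle of mass m (natural units)
(i\gamma^{\mu}\partial_{\mu} - m)\,\psi = 0
```

Here ψ is a four-component spinor and the γ^μ are 4×4 matrices; it is this four-component structure that forces intrinsic spin, and the negative-energy solutions that imply antimatter, into the theory.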

The reticent genius

Richard Feynman (right) chatting with his ‘hero’ Paul Dirac (left) at the sidelines of a conference in July 1962. Credit: Marek Holzman / Wikimedia

Despite these achievements, Paul Dirac was not one to bask in the limelight. In Britain, where he grew up, he is ranked the greatest physicist since Newton, yet he remains barely known to the public there. Among physicists entering the field, however, Dirac’s influence was immense. Richard Feynman called him his ‘hero’, having fallen in love with the mathematical rigor and clarity of his work.

“He made a new breakthrough, a new method of doing physics,” said Feynman. “He had the courage to simply guess at the form of an equation, the equation we now call the Dirac equation, and to try to interpret it afterwards.”

But to his immediate colleagues and well-wishers alike, he was about as shy and reticent as a person could be. At times he could come across as cold, almost schizoid. In a sense, he fit the stereotype of the scientist who was hopeless at socializing but a genius in intellectual acumen.

There could, however, be a more genuine reason: his outward manner may have been shaped by severe childhood trauma. That, at least, is what one can construe from Graham Farmelo, whose 2009 biography of Dirac sheds light in great detail on Dirac’s difficult childhood relationship with his father. The young Dirac grew up in an abusive and neglectful environment, with little chance to develop his own identity under the constant intrusions of a father who wanted him engrossed in his education. It is pure speculation, but physics may have given him respite from having to deal with the outside world.

Niels Bohr once remarked, ‘Paul Dirac must be the strangest fellow to ever visit my institute.’ It wasn’t meant as condescension, but rather to highlight Dirac’s eccentric nature.

In some respects, Dirac was as enigmatic and mysterious a man as the quantum theory he helped conceive.

Karthik is a science writer, and co-founder of Ed Publica. He writes and edits the science page. He's also a freelance journalist, with words in The Hindu, a prominent national newspaper in India.


Pierre Curie: The precision of a scientific pioneer

Pierre Curie is perhaps best known for his work on magnetism


Pierre Curie image source: Wikimedia Commons

Pierre Curie (1859–1906) was a man whose legacy has shaped the course of modern science, yet his name is often overshadowed by that of his famous wife, Marie Curie. Despite this, Pierre’s contributions to physics, particularly in the field of magnetism and the discovery of radioactivity, were revolutionary and continue to influence scientific research today.

Early Life and Education

Born in Paris on May 15, 1859, Pierre Curie grew up in an intellectually stimulating environment. His father, Eugene, was a physician, and his mother, Sophie, was a teacher, which cultivated in Pierre a deep passion for learning. From an early age, Pierre showed an exceptional aptitude for mathematics and physics, subjects that would later define his career.

By the time Pierre was 16, he had already completed his studies in mathematics and physics, earning a degree from the prestigious Sorbonne University in Paris. This early foundation in scientific inquiry laid the groundwork for his future innovations.


Innovative Work in Magnetism and Crystallography

Pierre Curie is perhaps best known for his work on magnetism. In his doctoral research, completed in 1895, he identified the Curie point—the temperature at which certain magnetic materials lose their magnetism. (Earlier, in 1880, he and his brother Jacques had discovered piezoelectricity.) This work, foundational in the study of thermodynamics and magnetism, continues to be a key concept in modern physics.
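
The behavior near the Curie point admits a compact description. Above the transition temperature T_C, the magnetic susceptibility of a ferromagnet follows the textbook Curie–Weiss law, rooted in Curie’s measurements:

```latex
% Curie–Weiss law: susceptibility above the Curie temperature T_C
\chi = \frac{C}{T - T_{C}}
```

where C is a material-specific constant. As the temperature falls toward T_C, the susceptibility diverges, marking the onset of spontaneous magnetization.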

Additionally, Pierre Curie’s research in crystallography and his study of the magnetic properties of materials played a pivotal role in the development of solid-state physics. His work laid the foundation for understanding the relationship between a material’s structure and its magnetic properties, which remains essential in materials science today.

The Discovery of Radioactivity

However, Pierre Curie’s most significant contribution came from his work on radioactivity, which would forever alter the understanding of matter itself. In the late 19th century, the mysterious rays emitted by certain substances, like uranium, intrigued scientists. Working alongside his wife, Marie Curie, Pierre embarked on a series of experiments to better understand this phenomenon.

Their work, starting in 1898, led to the discovery of two new elements: polonium and radium. Marie Curie coined the term “radioactivity” to describe the spontaneous emission of radiation from these elements, but it was Pierre’s precise experimental methods and scientific rigor that helped bring clarity to the phenomenon. Their discovery of radium, in particular, was a breakthrough that would lead to numerous advancements in medical treatments, including cancer therapy.

Nobel Recognition and Collaboration with Marie Curie

In 1903, Pierre Curie, together with Marie Curie and Henri Becquerel, was awarded the Nobel Prize in Physics for their joint work on radioactivity. The recognition marked the first time a Nobel Prize had been awarded to a couple. However, what makes this achievement particularly notable is that Pierre Curie insisted that Marie be included in the award, a gesture that demonstrated not only his scientific partnership with his wife but also his support for women in science, a rare stance in the male-dominated field of the time.


Pierre Curie’s dedication to scientific rigor and his ability to work collaboratively with Marie, his wife and fellow scientist, were vital to their success. Their work would not only earn them the Nobel Prize but also set the stage for later advancements in nuclear physics and medicine.

Tragic Loss and Enduring Legacy

Tragically, Pierre Curie’s life was cut short in 1906 when he was killed in a street accident at the age of 46. His death was a blow to both the scientific community and his family. However, his legacy continued through his wife, Marie, who carried on their groundbreaking work and became the first woman to win a second Nobel Prize.

Today, Pierre Curie is remembered as a visionary physicist whose discoveries were instrumental in shaping the fields of physics, chemistry, and medicine. His contributions to magnetism, crystallography, and radioactivity remain foundational to scientific inquiry. His work continues to inspire scientists across disciplines and serves as a reminder of the power of precision, collaboration, and dedication in the pursuit of knowledge.



The ‘Godfather of AI’ has a warning for us

The speed with which large language models such as ChatGPT have come to the fore has re-invigorated serious discussion about AI ethics and safety among scientists and humanities scholars alike.

Karthik Vinod


Credit: Jijin M.K. / Ed Publica

The quest to develop artificial intelligence (AI) in the 20th century drew entrants from various fields, mostly mathematicians and physicists.

Geoff Hinton, famously known today as the ‘godfather of AI’, studied cognitive psychology as a young undergraduate at Cambridge. Allured by the nascent field of AI in the 1970s, Hinton did his PhD at Edinburgh, where he helped revive the idea of artificial neural networks (ANNs). ANNs mimic the neuronal connections of animal brains and have since been the staple of mainstream AI research. Hinton, a British-born Canadian, later moved to the University of Toronto, where he is currently a professor of computer science.

In 2018, recognition of Hinton’s contributions to computer science and AI caught up with him: he was awarded a share of the coveted Turing Award, popularly known as the ‘Nobel Prize in Computing’. His 1986 work on backpropagation provided the blueprint for how machines learn, earning him recognition as one of the ‘fathers of deep learning’ as well.

The last two years saw artificial intelligence become commonplace in public discourse on technology. Leading the charge was OpenAI’s ChatGPT, as large language models (LLMs) found use in a whole host of settings across the globe. OpenAI, Google, Microsoft and their like are engaged in upping the ante.

But this sudden spurt has alarmed many and is re-invigorating serious discussion about AI ethics and safety. Last year, Elon Musk was among the signatories of a letter requesting a pause on AI research, fearing the ever-increasing odds that sentient AI may be on the horizon. But sociologists believe this risk is overplayed by billionaires so that the real-world problems posed by AI get swept under the carpet. Job losses, for example, will occur, and there is no solution in sight for compensating those who lose their work.

Computer scientists like Hinton, however, have come to the fore to make their views explicitly clear. In fact, Hinton ended his decade-long association with Google last year so that he could speak freely about what he saw as a race between technology companies to climb upon each other’s advances. He, like many computer scientists, believes humanity is at a ‘turning point’ with AI, especially with large language models like ChatGPT at the fore.

“It’s [LLMs] very exciting,” said Hinton in a Science article. “It’s very nice to see all this work coming to fruition. But it’s also scary.” 

One research study suggests these LLMs are anything but ‘stochastic parrots’ that merely output what they have been instructed to. This doesn’t mean AI is anywhere close to being sentient today. However, Hinton and other computer scientists fear humanity may unwittingly run the real risk of creating one. In fact, Hinton was one of several signatories of an open letter requesting that policymakers consider the existential risk of AI.

What counts as sentient AI, or artificial general intelligence (AGI, as it’s technically called), varies with the scientists researching it. No AGI exists today, and it is safe to say nobody knows what one would look like. In popular lore, it simply means Skynet from the Terminator movies becoming ‘self-aware’. Hinton is of the opinion that AI has already surpassed biological intelligence in some ways. It must be borne in mind, though, that today’s AI is no more sentient than it is a mere stochastic parrot. Hinton doesn’t say more powerful AI would make all humans redundant. But AI could take over many routine tasks humans do today, and in time replace them in those roles. Navigating these risks requires transdisciplinary perspectives.



The astrophysicist who featured in TIME’s most influential personality list

Priyamvada Natarajan’s contributions to astronomy shed light on two major research interests in contemporary astrophysics – the origins of supermassive black holes, and the mapping of dark matter in the universe.

Karthik Vinod


Credit: Jijin M.K. / EdPublica

Priyamvada Natarajan’s earliest exposure to scientific research arose from a childhood passion for making star maps. Her love for maps never abated, and it shaped her career as a theoretical astrophysicist. In the media she is the famous ‘cosmic cartographer’, and she featured in TIME magazine’s list of the 100 most influential people this year.

“I realise what an honour and privilege this is,” said Natarajan to The Hindu. “It sends a message that people working in science can be seen as influential, and that is very gratifying.”

The Indian-American’s claim to fame arises from her pathbreaking research into dark matter and supermassive black holes.

She devised a mathematical technique to chart dark matter clumps across the universe. Though dark matter is invisible and elusive to astronomers, it is thought to make up most of the matter in the universe. Dark matter clumps act as ‘scaffolding’, in Natarajan’s words, over which galaxies form. When light from background galaxies passes through the gravitational influence of dark matter clumps, it bends as it would when passed through a lens. Natarajan exploited this effect, called gravitational lensing, to map dark matter clumps across the universe.
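
The strength of this bending can be illustrated with the textbook point-mass result from general relativity; real dark matter clumps are extended, so this formula is only an idealization of what lensing maps actually invert:

```latex
% Deflection angle of light passing a point mass M at impact parameter b
\alpha = \frac{4GM}{c^{2}b}
```

where G is the gravitational constant and c the speed of light; doubling the lens mass doubles the deflection.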

Simulation of dark matter clumps and gas forming galaxies. Credit: Illustris Collaboration

Natarajan reflected on her passion for mapping in a TEDx talk at Yale University, where she is professor of physics and astronomy. Though an ‘armchair’ cartographer by her own description, she has also tackled another major puzzle in astronomy: pinning down the origins of supermassive black holes.

Black holes generally form from dying stars that collapse under their own weight due to gravity. These black holes then swallow gas from their environment and grow in mass. However, there also exist supermassive black holes in the universe, millions of times heavier than any star or stellar-sized black hole, whose formation the dying-star collapse theory cannot explain. One example is Sagittarius A* at the center of the Milky Way, a whopping four million times more massive than our Sun.

First direct image of Sagittarius A* at the Milky Way center. Credit: EHT

The origins of these behemoths remained in the dark until Natarajan and her collaborators shed some light on them. In their theory, massive clumps of gas in the early universe collapse under their own weight to directly form a ‘seed’ supermassive black hole, which then grows like its stellar-mass counterparts by swallowing gas from its environment. In 2023, astronomers found compelling evidence for her theory: they reported a supermassive black hole powering the ancient quasar UHZ1 at an epoch when no stellar-born black hole could possibly have grown to such a massive size.

These observations came nearly two decades after Natarajan’s first paper on the idea in 2005. In a 2018 interview with Quanta, she said she would be content with her contributions to astrophysics even if her theory were never verified within her lifetime; it would be enough to have her ideas resonate with astronomers and send them searching for her black holes. “I’m trying to tell myself that even that would be a supercool outcome,” she said in that interview. “But finding [the supermassive black hole ‘seed’] would be just so awesome.”

Beyond science, Natarajan is a well-sought-after public speaker with pursuits in the humanities. At Yale University, she directs the Franke Program in Science and the Humanities, which fosters links between the two disciplines. Her connection with the humanities began at MIT, where she took degrees in physics and mathematics before a three-year hiatus from science to explore her interest in the philosophy of science. She soon returned to astronomy, enrolling as a PhD student at Cambridge, where she worked under the noted astronomer Martin Rees on black holes in the early universe – work that seeded her success in later years.

