
A former president of the Royal Society: if young people want to do research well, it is best not to be too smart

Sometimes I worry that many people who would truly enjoy scientific careers are put off by the scientific community's narrow and outdated self-image. The word “scientist” still conjures up a stereotype: either an Einstein figure (an elderly man) or a young geek. There is still too little ethnic and gender diversity among scientists. But scientific careers span a wide variety of knowledge structures and social settings, encompassing speculative theorists, solitary experimentalists, ecologists collecting data in the field, and quasi-industrial teams running giant particle accelerators or other big-science projects.

Everyone agrees that scientists have a distinctive way of thinking: following the so-called “scientific method”. More precisely, scientists follow the methods of rational reasoning: like lawyers or detectives, they classify phenomena, construct hypotheses, and test them against the evidence. From this a related and damaging misconception has emerged: that scientists, with their first-rate reasoning, must be exceptionally smart people. In fact, academic ability is just one facet of the broader intellectual competence that the best journalists, lawyers, engineers, and politicians share.

The great naturalist E. O. Wilson believed that, to be effective in some fields of science, it is best not to be too smart. He did not discount the intuition and inspiration that occasionally flash through a scientist's working life (such moments are rare). But as the world's leading expert on ants, Wilson knew that research involves decades of hard slog: flashes of brilliance are not enough, and scientists must be willing to risk boredom and tedium. He was right that people with short attention spans and restless minds are more likely to find happy (but less valuable) work on Wall Street as “millisecond traders.”

Nor is there any reason to look down on purely applied work. Applying scientific concepts to practical goals can be more challenging than making original discoveries. One of my engineer friend's favorite cartoons shows two beavers looking up at a giant hydroelectric dam. One beaver says to the other, “I didn't build the dam, but it was based on my idea.” I would also mention the Swedish engineer Gideon Sundback: in inventing the zipper, he made a bigger intellectual leap than most of us theorists ever do. By the middle of this century the world's population will reach 9 billion, and providing them with clean energy, better health, and enough food may be the most inspiring goal of all.

Aspiring scientists do their best work when they choose the field and the type of research (field work, say, or computer modeling) that suit their personality, skills, and tastes. It is a particular advantage to be in a field that is moving fast, where you can access the latest techniques, more powerful computers, or larger datasets. Moreover, there is no need to stay in the same field for an entire research career, or even to remain a scientist for life. Progress in a typical field comes in surges, interspersed with periods of relative stagnation, and the most dynamic fields often cross the boundaries of traditional disciplines.

One more point: only geniuses (or cranks) should head straight for the grandest and most fundamental problems without looking back. The rest of us should weight the importance of a problem by the probability of solving it, and maximize that expected payoff. Aspiring scientists should not swarm into unifying the cosmos with the quantum, even though that field is clearly the intellectual pinnacle we aspire to. They should recognize that the great challenges of cancer research and brain science must be tackled piecemeal, not head-on.
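The heuristic above can be made concrete with a toy calculation (my own illustration, with invented problem names and numbers, not from the essay): multiply each problem's importance by your chance of solving it, and work on whichever has the largest product.

```python
# Toy sketch of the "importance x probability" heuristic from the text.
# The problems and numbers below are invented purely for illustration.
problems = {
    "unify quantum theory and gravity": {"importance": 100, "p_solve": 0.001},
    "one tractable cancer sub-problem": {"importance": 20, "p_solve": 0.30},
}

def expected_payoff(problem):
    """Expected scientific payoff: importance weighted by the odds of success."""
    return problem["importance"] * problem["p_solve"]

best = max(problems, key=lambda name: expected_payoff(problems[name]))
print(best)  # the modest but tractable problem wins: 20 * 0.30 > 100 * 0.001
```

On these made-up numbers, the tractable sub-problem's expected payoff (6.0) dwarfs that of the grand problem (0.1), which is the essay's point: importance alone should not drive the choice.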

Strange as it may seem, the problems that baffle us most are sometimes those closest to home, while some of the best-understood phenomena lie in the distant universe. Astronomers can confidently explain black-hole mergers billions of light-years away. Yet on everyday matters that concern us all, such as diet and child care, we have little to go on, and the “experts” change their advice from year to year. There is no contradiction, though, between understanding exotic cosmic phenomena and being confused by everyday affairs. What really challenges us is complexity, not size. Even the smallest insect, structurally far more complex than any star or galaxy, poses deeper mysteries.

Conventional wisdom holds that scientists, especially theorists, do not grow more successful with age, but instead run out of inspiration. Wolfgang Pauli once quipped of physicists past 30, “so young, and already so unknown” (I wish older scientists were not so resigned). Although there are some late-blooming exceptions, few academics do their best work at the end of their careers. In this, science differs from many of the arts. Artists, like scientists, absorb the culture and styles of their youth, but they can then deepen and enrich their work from inner resources alone. Scientists, by contrast, must constantly absorb new concepts and techniques if they want to stay at the cutting edge. That does not mean productivity cannot continue into old age: John Goodenough, co-inventor of the lithium-ion battery, was still working at 97 when, in 2019, he became the oldest Nobel laureate.

There is another path that we should avoid, though some of the greatest scientists have been tempted down it: ill-advised, overconfident forays into other fields, attempting to blossom in all directions at once. Those who take this road still see themselves as “doing science,” still wanting to understand the world and the universe, but they no longer find satisfaction in conventional piecemeal research: their egos inflate, sometimes to the embarrassment of their admirers.

Arthur Eddington was perhaps the leading astrophysicist of his time. In his later years (the 1930s), he developed a “fundamental theory” that claimed to predict the exact number of atoms in the universe through subtle mathematical calculations. At one of Eddington's lectures in the Netherlands, a young scientist in the audience asked an older colleague, “Do all physicists go crazy as they get older?” “No,” the elder replied, “a genius like Eddington may go mad, but the likes of you just get more and more stupid.” For the non-genius scholar, this is at least some comfort.

Scientists tend to be harshly critical of other people's research. They are more motivated than anyone to spot errors, because in their careers the greatest respect goes to those whose unexpectedly original contributions overturn a consensus. But they should be equally critical of their own research: they should not become obsessed with a favored theory, nor be swayed by wishful thinking. Unsurprisingly, many find this difficult. Someone who has devoted most of a lifetime to a research project will insist on its importance, and to see all that effort come to nothing is a devastating blow. Brutal reality destroys seductive theories. Only findings robust enough to survive rigorous testing become part of public knowledge: the link between smoking and lung cancer, for example, or between HIV and AIDS. The great sociologist Robert Merton described science as “organized skepticism.”

The road to accepted scientific understanding is often bumpy, with many dead ends along the way. Occasionally the mavericks are right, and we all love it when they are, but such cases are far rarer than commonly assumed, and rarer still than media coverage suggests. Sometimes a previous consensus is overturned; but most scientific advances transcend and generalize prior concepts rather than contradict them. Einstein, for example, did not “overthrow” Newton; he went beyond Newton to offer new perspectives, giving broader and deeper insights into space, time, and gravity.

When competing theories go head to head, there is usually only one winner, and sometimes a single key piece of evidence makes the difference. Such was the case for the Big Bang theory in 1965, when a faint microwave signal filling the entire sky was discovered, with no more plausible explanation than the afterglow of a hot, dense “origin.” The same thing happened again in that era: the discovery of seafloor spreading convinced almost all geophysicists of continental drift.

In other cases, an idea gains the upper hand only gradually: rival views are slowly marginalized, sometimes only once their main proponents have died. And sometimes the research front simply moves on, and what was once seen as the question of the age is bypassed or set aside.

The cumulative progress of science requires new technologies and instruments, in symbiosis, of course, with theories and insights. Some instruments still fit on a tabletop; at the other extreme, the Large Hadron Collider at CERN in Geneva, Switzerland, is among the most elaborate scientific facilities ever built. Likewise, astronomical facilities are operated by multinational consortia, some of them truly global projects, such as the Atacama Large Millimeter/submillimeter Array (ALMA) radio telescope in Chile, in which Europe, the United States, and Japan all participate.

But even those of us doing research in small, local teams benefit from the fact that science is a truly global culture. Our skills, unlike lawyers', transfer across borders. This is why scientists, more readily than other professionals, cross the boundaries of nationality and ideology to solve theoretical and practical problems together. For many of us, this is an important advantage of the career.

The best labs, like the best startups, should be incubators for original ideas and young talent. But, to be fair, a hidden demographic trend is eroding this wonderfully creative atmosphere.

Fifty years ago, my generation benefited from the exponential growth of science, riding the expansion of higher education. The young then outnumbered the old, and retirement around 65 was the norm (and usually mandatory). But academia, at least in the West, is no longer expanding rapidly (many fields are saturated), and there is no mandatory retirement age. Decades ago, the ambition to lead a team in your early thirties was realistic; in the U.S. biomedical community today, it is unusual to win your first research grant before 40. This is not a good sign. Science will always attract some single-minded devotees unsuited to other professions, and many lab scientists now spend much of their time writing grant applications that are often unsuccessful.

But science also needs to attract nimble minds, people ambitious to make their mark in their thirties. Some will leave academia, perhaps to try entrepreneurship, if they judge the prospects bleak. That path brings great satisfaction and public good, and many should take it. In the long run, though, it also matters that some of these people devote themselves to basic frontier research. Advances in IT and computing trace back to basic research at top universities, much of it done decades ago. Medicine's stumbling blocks stem from shaky foundations in basic science: the frustrating lack of effective drugs for Alzheimer's disease shows how little we understand about brain function, and why a renewed focus on basic science is needed. (Editor's note: This article was first published in January 2020, when no Alzheimer's disease drug had been approved in Europe or the United States.)

But I hope this impasse is temporary, and that new opportunities will open up for ambitious scientists. Growing wealth and leisure, coupled with the connectivity of the Internet, will give millions of educated amateurs and citizen scientists around the world more scope than ever to pursue the topics that interest them. These trends may even allow top researchers to do cutting-edge work outside traditional academies and government labs. If such options multiply, they will erode the supremacy of the research university and restore the independent scientist to something like pre-twentieth-century importance, perhaps fostering a flowering of original ideas for a sustainable future. That is the kind of thinking the world needs.
