The annual ranking of Colleges and Universities has been announced.

Today’s post will be a bit out of the ordinary because I will not be discussing science or engineering so much as the places where our scientists and engineers receive their education. I’m talking about the Colleges and Universities of the world. I was prompted to write this post by the release of the annual Times Higher Education survey of the world’s best Colleges and Universities.

Why do we humans always feel the need to rank everything? (Credit: UCLA)

Now which University was chosen as the best, which schools made it into the top 10, or which country had the most universities in the top 100 is really nothing more than a competitive exercise of no actual importance. What is important is whether or not new institutes of higher learning are being founded, and whether existing universities are getting better. Still it’s worth taking a quick look at some of the annual survey’s results in order to get an idea of what is going on in the world of higher education.

At the very top of the Times Higher Education list for the fifth straight year is Oxford University in the United Kingdom. Britain also claims another spot in the top ten, with Cambridge University coming in at number six. All of the other spots in the top ten belong to Universities in the United States, from Stanford University at number two to the University of Chicago at number ten. Indeed the highest ranked University outside the US and UK is ETH Zurich in Switzerland at number 14, with the University of Toronto in Canada at #18 and Tsinghua University in China at #20 also appearing in the top 20.

For the fifth straight year Oxford University was chosen as the world’s best place of higher education. (Credit: University of Oxford)
Stanford University took the second spot in the listing. (Credit: Class Central)

Now I’m not trying to brag, and neither should these results be a great surprise. The US and UK have pretty much dominated the world of higher education since the end of World War 2, when most of the world’s other Universities lay in ruins.

During the years when the USSR was pushing education as the way to demonstrate the superiority of Communism several Russian Universities were recognized as among the best schools in the world. However the current Russian government appears to prefer to keep its population ignorant and gullible, so the quality of Russian education has declined noticeably.

Lomonosov Moscow State University is Russia’s top school at #194 but just three years ago it placed at #161! (Credit: Schlinder)

Instead it’s now China whose institutes of higher learning are gaining the most ground. In addition to Tsinghua University, Peking University received a high ranking of 23 making China the only country other than the US and UK to place two universities in the top 25. China in fact succeeded in doubling its number of schools in the top 100 from three last year to six this year.

Tsinghua University is China’s top ranked school. The obvious newness of the buildings is an indication of the support education is receiving from the Chinese Government. (Credit: Hotels Combined)

Most of these Chinese Universities are in fact relatively new, babies when compared to Oxford or Cambridge. It’s a sign of China’s growing middle class who want a good education for their children, and are willing to pay for it. It’s also a sign that the Chinese government recognizes that a larger, better educated middle class will actually make China a stronger more powerful nation.

Other Asian nations are also working hard to improve the quality of the education they provide to their people. Sixteen Asian universities placed within the top 100, the most since the Times Higher Education list began.

The University of Tokyo is another important place of learning. (Credit: Britannica)

Of course the improvement in higher education in Asia doesn’t have to mean that education in the west is slipping. In the years to come the world is going to need all of its college and university graduates if we’re going to overcome the tremendous challenges facing our planet.

The Main Building at Drexel University. Drexel has grown so much in the years since I first attended class there! (Credit: The College Post)

By the way, my old Alma Mater Drexel University came in at 351. Not great, but not bad considering that’s 351st in the entire world.

Space News for September 2020.

There are a number of small but nevertheless important items that have happened over the last month which deal with NASA’s Artemis program. So let’s get started.

 If NASA’s Artemis program is going to successfully put Americans back on the Moon by 2024, or indeed ever, it is going to need a big rocket to put all of that hardware into space. The big rocket that NASA has been building now for nine years is called the Space Launch System (SLS) and although it may look superficially like the old Saturn V it is in fact a completely new design based on Space Shuttle hardware.

Artist’s impression of NASA’s Space Launch System (SLS). (Credit: NASA)

In fact the SLS employs four shuttle main engines in its first stage and in addition has two shuttle solid fuel boosters attached. Since the SLS is making use of a fair amount of existing components you’d think that the design cost and schedule would be reasonable compared to those for a completely new large launch vehicle, say Space X’s Falcon 9.

The first core stage of the SLS nearing completion. Those four big engines are identical to the engines used on the Space Shuttle. (Credit: NASA Spaceflight.com)

Well you’d be wrong, in fact the original cost of the central core first stage of the SLS was estimated at $6 billion. That amount was already ‘readjusted’ back in 2017 to $7.17 billion and now NASA has quietly increased it to $9.1 billion. And as to schedule, the original launch date for an unmanned flight of the SLS was supposed to be back in 2017, a date that was pushed back first to December of 2019 and then to June of 2020. Needless to say June has come and gone and the current schedule for the first, unmanned launch of the SLS is November of 2021.

Even that is not certain however, because the SLS still has quite a lot of testing to finish first. In fact one big test, a static firing of one of the big solid fuel boosters, was carried out successfully on 2 September. During the test the 53m long booster burned for the full 126 seconds required for an actual flight. See image below. While the data from the test is still being analyzed the initial results indicate that everything went as planned.

Test of the SLS solid booster rocket, also based on Space Shuttle technology. Currently all indications are that the test was a complete success. (Credit: Spaceflight101)

The biggest test still remaining before next year’s unmanned flight is called ‘Green Run Hot Fire’ and may occur as early as next month, in October. For the Green Run Hot Fire test the entire rocket, except for the solid boosters, is held down on a test stand and the four main engines are fired for eight minutes, the duration of a normal launch. Although all of the different subsystems of the SLS have been tested separately this will be the first time the entire rocket will be tested together.

Testing status for the core section of the SLS as of 10 July 2020. Test 8 could occur as early as late October. (Credit: NASA)

If any problems occur during the Green Run Hot Fire test it would almost certainly cause yet another delay in that first unmanned test flight. And if that first test flight gets pushed back any further there’s little hope of Artemis reaching the Moon by 2024. In fact, because some members of Congress are getting sick and tired of the delays and cost overruns associated with the SLS, it might just mean the end of the Artemis program entirely.

Mission plan for the unmanned Artemis 1 flight to the Moon. (Credit: NASA)

Thankfully there’s a bit of better news for Artemis. One of the aerospace companies preparing bids for the contract to build the Lunar lander that will actually take the Artemis astronauts down to the Moon’s surface is Blue Origin, the other two being Space X and Dynetics. In late August Blue Origin delivered a full-scale model of their planned lander to NASA’s Johnson Space Center in Houston.

Mock up of Blue Origin’s planned Lunar lander is delivered to NASA. (Credit: Tech Explorist)

The model is 12 meters in height and consists of both a planned descent and ascent stage. Although the mock-up is not in any sense functional it will allow NASA astronauts to simulate getting down from the crew cabin in the ascent stage to the ground with all of their equipment, and back again. This sort of ergonomic testing is important at this stage because it will not only allow the astronauts to become familiar with the vehicle but will also mean that any design flaws discovered during these tests can be corrected before construction of the first lander begins.

Artist’s impression of the Blue Origin Lander on the Moon. (Credit: Blue Origin)

Although Blue Origin will be the prime contractor should they win the contract, the lander design will actually be a team effort including Lockheed Martin, Northrop Grumman and Draper. While Blue Origin concentrates its efforts on the descent stage it is Lockheed Martin who will be primarily responsible for the ascent stage. The team members hope that splitting up the work in this way will speed up the design and development of the separate components.

So work is progressing, however slowly, on the hardware needed to get Americans back to the Moon, but what about the equipment they’ll be using while on the Moon? For example the old Apollo astronauts had a small Lunar rover vehicle that allowed them to explore more of the Moon’s surface than they could on foot. Are there any plans for an updated Lunar Rover?

The last three Apollo missions, 15-17, took a small Lunar Rover along with them. This is Apollo 15’s. (Credit: The Detroit Bureau)

Well it turns out that it’s the Japan Aerospace Exploration Agency (JAXA) that has been given the task of developing the rover as a part of their contribution to the Artemis program. As you might guess JAXA turned to a Japanese company well known for their expertise in motor vehicles, Toyota, for help in developing an initial Lunar rover design.

Artist’s impression of Toyota’s concept for a new Lunar Rover. Looks a lot more comfortable! (Credit: Space News)

Named the Lunar Cruiser after Toyota’s famous Land Cruiser, the proposed rover would be considerably larger than the Apollo rover. Equipped with a pressurized cabin so that the astronauts can remove their spacesuits while driving across the Moon’s surface, the rover will be powered by hydrogen fuel cells and is expected to have a range of as much as 10,000 kilometers.

Currently all of these design specifications are preliminary; after all we still have a lot of work to do just getting back to the Moon. The eventual goal of the Artemis program is to establish a permanent base on the Moon and that’s when the Lunar Cruiser would become an important piece of equipment.

In 15-20 years will we have a Moon Base resembling this artist’s impression? (Credit: European Space Agency)

Still it is nice to speculate about what kind of Lunar Base we may have in about another ten years. I do hope that NASA gets the Artemis program on track. It’s been almost 50 years since the last human set foot on the Moon; when Artemis succeeds in getting us back I hope this time it’s for good.

Scientists are making strides in developing a direct connection between a human brain and a computer.

For decades now one of the dreams of science fiction has been the development of technology that would allow a direct connection between the human brain and an electronic computer, both a dream and a nightmare. The possibilities that such a technology could open up are beyond imagining. Just consider being able to access all of the stored knowledge on the Internet simply by thinking about it. Or how about being able to see, in your mind, the images from cameras anywhere in the world. Such technology might even make the age-old dream of telepathy real as two brains could speak to each other through a computer.

Robocop was an extreme example of a Human – Machine Interface. Would such a degree of integration even be possible? (Credit: Den of Geek)

On the other hand, could that same technology be used to access your most private thoughts and opinions without your permission, so that the very idea of privacy no longer existed? Or what if advertisers could, on a regular schedule, implant an ad directly into your brain? No changing the channel or going to the bathroom during commercials, they’re inside you!

Such developments are at least decades away. Right now the major goal of Human-Machine-Interface (HMI) technology is to develop methods for people with advanced prosthetics to control them directly from their brains, like Luke Skywalker’s robotic hand in Star Wars.

Great progress is currently being made in the technology that will allow disabled persons to control their prosthesis directly from their brain. (Credit: Business Insider)

Currently one of the major problems facing researchers in HMI is finding the right materials for the interface. Nearly all electronic circuits use copper as a conductor but when copper is implanted into living flesh, which is mostly a salty liquid, it corrodes very quickly, degrading if not actually blocking the performance of the circuit. And those corroded metals in turn cause scarring of the flesh, which will irritate if not actually harm the person who had the circuit implanted in them. So scientists and engineers have been searching for an organic conductor that will not only give good electrical performance but will not react in any harmful way when inside the body.

Recently scientists at the University of Delaware have announced that they have found such a material. The team, led by Doctor David Martin, has been investigating a class of organic materials known as conjugated polymers that are able to conduct electric current. The material that they identified is known as Pedot and is commercially available as an anti-static coating for electronic displays; I actually think I’m familiar with it.

Professor David Martin and his team of researchers at the University of Delaware. (Credit: University of Delaware)

During testing Pedot showed all of the characteristics needed in an interface between electronics and living tissue without any sign of scarring. In other testing Pedot was even able to be infused with dopamine as a possible treatment for addiction, making it a candidate for other medical procedures as well.

Some of the electrical characteristics of Pedot. To me these plots speak volumes. (Credit: Science Advances)
Disks of Pedot infused with other chemicals. (Credit: Researchgate)

So if scientists have found a material that will allow them to interface electronics directly to the human brain, what kind of electronics will be the first to be implanted? Well Elon Musk of Space X and Tesla fame has funded a small bio-tech company, Neuralink, that is developing a chip-sized device that reads brain impulses and transmits them via Bluetooth to a smartphone or other computerized device. Last year Musk showcased a model of Neuralink that was implanted behind the ear of a patient and picked up brain impulses by means of thin wired electrodes laid along the top of the skull. This year’s model has just been announced and consists of a coin-sized disk implanted directly onto the skull.

Last year’s model of Neuralink (left) and this year’s (right). (Credit: Dezeen)

Initial testing of this year’s model consisted of implanting the interface onto the skulls of three pigs, directly over the portion of the brain that deals with signals from the animal’s snout. Now a pig’s snout is one of its main sensory organs and when the pigs were given food or other objects to smell and rummage through, a display screen showed the firing of the neurons in the animals’ brains as they used their snouts.

One of the test subjects for this year’s testing of Neuralink. The disk was implanted onto the pig’s skull and could detect signals sent from the pig’s snout. (Credit: Medium)

Neuralink now hopes to begin testing on human volunteers sometime this year. The plan is to implant the device in patients with severe spinal cord injuries in the hope that a second device implanted below the injury would enable the patient’s brain signals to bypass the injury and therefore allow them to once again control their arms and legs.

We will soon be faced with the question, how far do we want to go? (Credit: Pinterest)

The future possibilities of such technology belong more in science fiction novels, for now. Right now however the biggest problem the engineers at Neuralink have is that their rather delicate thin-wire electrodes don’t last long inside the patient’s body. They degrade over time because of the corrosive chemicals in the body.

What do you want to bet that the people at Neuralink are contacting the team at the University of Delaware right now?

Three new inventions that may help to save our environment.

Everybody knows that our environment is in trouble. The waste and pollution generated by eight billion human beings is choking the planet, producing changes that have already caused the extinction of hundreds of species, and may lead to our own. If we are going to preserve the environment we cannot just return to a simpler, less polluting level of technology, let’s say the 18th century as an example. As I said there are eight billion of us now and horsepower, waterwheels and ox-drawn plows will not sustain such a large population. Instead we must use our technology to develop solutions to the problems that ironically we used technology to cause.

Going back to the days of Currier and Ives may seem attractive to some people but the world could only support about half a billion people back then. What happens to the other seven and a half billion people living today? (Credit: Granger Art on Demand)

Recently there have been three new technological breakthroughs, inventions if you like, that may play an important role in saving our planet. At least I hope so.

In many ways plastic is actually harmless. It’s neither poisonous nor cancer-causing. In fact it has many excellent qualities; it has countless uses and it’s so cheap that we use it everywhere. Ironically it is the fact that plastic is so useful, and so cheap, that makes it so great a danger. We manufacture so much of it and, despite what the plastics manufacturers tell us, we don’t recycle more than a very few percent of what we make. The truth is that, aside from plastic 2-liter bottles, most single use plastic items, like plastic bags, utensils and straws, are not even made of the right kind of plastic to be recycled. All of those items, and many others, just accumulate in our waste dumps which, since plastics don’t decay, are becoming an ever bigger problem on both the land and in the sea.

Every day we produce more plastic than this. Since it doesn’t bio-degrade what do we do with it? (Credit: Rolling Stone)

To solve that problem chemists have for many years been searching for a kind of plastic, technically a polymer, that can easily and cheaply be broken back down into its constituent parts, chemically known as monomers. These reconstituted monomers could then be used to create new polymers, new pieces of plastic, over and over again.

Just a few of the kinds of plastics and other polymers chemistry has developed. Notice words like ‘Chain Growth’ or ‘Polyaddition’ or ‘Step Growth’. In many ways developing a polymer is like playing with Legos! (Credit: Polymer Database)

A team of researchers from the United States, China and Saudi Arabia has recently announced the development of just such a polymer plastic, which they call PBTL. According to the announcement, which appeared in the journal Science Advances, PBTL has all of the desirable qualities of current plastics but in the presence of a catalyst PBTL breaks down readily into its original monomers. After testing through multiple build ups and breakdowns the team concluded that there was no reason the cycle could not be carried out over and over again; they have succeeded in developing a plastic that is designed to be recycled.

PBTL in action, being broken down into its component monomers. (Credit: Innovation Toronto)

Of course there is one caveat: in order to make optimal use of PBTL’s reusability it must be separated not only from non-plastic waste but from all other kinds of plastic. That means more sorting, more manpower required in the recycling effort, and that means more cost. What’s needed therefore is some recognizable way to distinguish PBTL from everything else. It would also be helpful if all plastic items were manufactured from PBTL but that may be difficult to accomplish since there are so many plastic manufacturers.

Sorting plastics into their various types is time consuming and expensive, the reason why so little plastic is actually recycled. (Credit: Living on Earth)

Still it is a step in the right direction. With PBTL we now can recycle all of our plastics, if we have the will to do so.

As bad as the problem of plastics is, an even greater threat to our planet must surely be the enormous amounts of CO2 that we have been releasing into the atmosphere. And to make matters worse, at the same time we are cutting down the Earth’s forests, which are the best way of removing that CO2 from the air. The resulting buildup of greenhouse gasses is the direct cause of global warming and the attendant changes in climate.

Oh human activity can’t be the cause of environmental change, (cough) (cough)!!!! (Credit: Science Alert)
And even as we pour CO2 into the air we are cutting down all of the trees that could absorb it! (Credit: Medical News Today)

So if forests and other vegetation are one way of getting CO2 out of the atmosphere shouldn’t we be planting more trees and other plants? Of course there are people trying to do just that, however those efforts have so far been unable to even keep pace with deforestation, let alone bring down the level of greenhouse gasses.

While planting trees to get CO2 out of the air is a good idea it’s only a drop in the bucket. We need to do a lot more! (Credit: American Forests)

So scientists have been trying to develop an ‘Artificial Leaf’ which, like a real leaf, would use sunlight and water to convert CO2 into a usable fuel. Such a technology would mimic photosynthesis and in large scale operations could provide the energy we use, reducing if not eliminating our dependence on fossil fuels.

The goal of artificial photosynthesis is to develop a process that will use sunlight to remove CO2 from the air while producing organic fuels. (Credit: ResearchGate)

Some of the most advanced research toward an artificial leaf has come from the Department of Chemistry at Cambridge University, where Professor Erwin Reisner leads a team of chemists who last year succeeded in producing a device that converted CO2 into syngas, a fuel that is unfortunately not easy to store for long periods of time. Another problem with the device was that it was constructed from materials similar to those in ordinary solar cells, making it expensive to scale up into a large power plant.

Now the team at Cambridge has developed a new artificial leaf that is manufactured on a photocatalyst sheet, a technology that is capable of being scaled up much more easily, and therefore more cheaply. Also the end fuel produced by the new ‘leaf’ is formic acid, which is easier to store and can be converted directly into hydrogen, as in a hydrogen fuel cell.

University of Cambridge ‘Artificial Leaf’, powered by sunlight it takes CO2 out of the air and produces usable fuel. (Credit: University of Cambridge)

The Cambridge team still has more work ahead of them; the efficiency of the entire system needs to be improved for one thing. However it is quite possible that in just a few years we may have a new form of solar technology that not only produces energy but actually removes CO2 from the atmosphere.

Of course we already have both solar and wind technologies that are producing a sizable fraction of our electricity. One big problem that has limited the usefulness of both solar and wind power is that the energy they generate varies significantly. When it’s a sunny day or if there’s a good breeze they produce a lot of energy that somehow has to be stored so it can be used at night or on a calm day. Most often that energy is stored in old-fashioned chemical batteries, a technology that has hardly improved in the last 100 years.

Alessandro Volta’s original battery. It’s really not that different from the battery in your car. (Credit: Wikipedia)

As any owner of an electric car will tell you batteries absorb their energy slowly, taking a long time to charge up. Not only that but batteries tend to be heavy, costly and have a limited useful lifespan, a long list of problems for such a critical component in modern technologies.

One of the biggest drawbacks to owning an electric car is simply the time it takes to charge the batteries. (Credit: Car Magazine)

There is another energy-storing electronic device that is cheap, lightweight, can be charged and discharged thousands of times, and can absorb or discharge its energy very quickly. They are called capacitors, descendants of the old Leyden jar, and even if you’ve never heard of them you own hundreds of them in your cell phones, TVs, computers and other electronics. Capacitors, the very name comes from their capacity to store electricity, are superior to chemical batteries in every way except one: they can’t store nearly as much electrical energy as a battery can.

In the 18th Century a Leyden jar, an early capacitor, was high-tech. (Credit: Wired)

As you might guess there are engineers working on capacitor technologies in the hope of increasing the amount of energy they can store. One such group is working out of Lawrence Berkeley National Laboratory and is headed by Lane Martin, a Professor of Materials Science at the University of California at Berkeley. Taking a common type of commercially available capacitor known as a ‘Thin Film’ capacitor, Martin and his associates introduced defects into the material of the thin film, known as a ‘relaxor ferroelectric’.

Lane Martin — professor, materials science and engineering (at left). With Karthik Jambunathan, graduate researcher (center); and Vengadesh Mangalam, postdoctoral researcher (at right). (Credit: Phys.org)

Now ferroelectric materials are non-conductive, which allows the capacitor to hold positive charges on one side of the film and negative charges on the other; that’s how the energy is stored. The higher the voltage across the thin film the more energy is stored, but if the voltage gets too high the film breaks down, the energy is released and the capacitor is destroyed.

Just a sample of the many varieties of film capacitors. (Credit: Wikipedia)

The engineers at Lawrence Berkeley hoped that by adding the defects to the thin films they could increase the voltage the capacitor could withstand without breaking down. Doubling the voltage, by the way, would actually increase the energy stored by a factor of four. The team used an ion beam to bombard the ferroelectric material, creating isolated defects in the film, and the first results of testing have shown a 50% increase in the capacitor’s efficiency along with a doubling of the energy storage capacity.
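For anyone who wants to check that factor of four, it follows directly from the standard formula for the energy stored in a capacitor; a quick sketch of the arithmetic:

```latex
E = \tfrac{1}{2} C V^{2}
\qquad\Rightarrow\qquad
\frac{E_{2V}}{E_{V}} = \frac{\tfrac{1}{2}C(2V)^{2}}{\tfrac{1}{2}CV^{2}} = 4
```

Here C is the capacitance and V the voltage across the film; because the stored energy grows as the square of the voltage, any improvement in breakdown voltage pays off handsomely.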

By bombarding the ferroelectric films with an ion beam the researchers produced defects that doubled the energy storing ability of the capacitors. (Credit: Phys.org)

As with the other two new inventions described in this post, capacitors that can store twice as much energy are not going to solve all of our environmental problems, but they’ll help. That’s the takeaway from all of the technology developments I’ve discussed; each one is a step towards solving our energy and pollution problems. We have the scientists who can find the solutions, do we have the will to use their work and save our planet before it’s too late?

Paleontological News for August 2020.

Like every other science, paleontology began with big discoveries, the existence of the dinosaurs being one example. As time passed paleontologists began to fill in a few of the big details, such as the fact that some dinosaurs walked on two legs. As more and better preserved specimens were unearthed finer and finer details were uncovered, like the fact that some dinosaurs actually had a covering of fine feathers to help keep them warm. Finding the kind of pristine fossils needed to fill in those gaps in our knowledge however requires a lot of patience, hard work and, let’s be honest, luck.

Yes, the evidence is becoming clear that the famous T rex probably had a light coating of feathers to help keep it warm! (Credit: Animals / How Stuff Works)

Some of the best preserved fossils in recent years have come from amber deposits in the country of Myanmar, see my posts of 16 December 2016 and 1 June 2019. Now a new study in the journal ‘Current Biology’ by scientists at the New Jersey Institute of Technology, the University of Rennes in France and the Nanjing Institute of Geology and Paleontology has announced the discovery of a new fossil from Myanmar that answers a lot of questions about a unique group of extinct insects known as ‘Hell-ants’.

Drawing of a Hell Ant. Did those piercing jaws move up and down? (Credit: Live Science)

In the fossil record hell ants are one of the earliest known groups of ants, with 14 different known species from the Cretaceous period. They appear to have become totally extinct in the same disaster that killed off the dinosaurs. Recognizing a hell ant is quite easy: they all have two very sharp, dagger-like mandibles extending out and curving upwards from their lower jaw. In addition most species have a horn-like structure at the top of their heads. The whole configuration strongly suggests that the hell ants attacked their prey by sweeping it up in the dagger-like mandibles, trapping it against the horn structure.

The evolutionary ‘clade’ of ants. Hell Ant heads are the top row while living ants occupy the lower two rows. (Credit: NJIT News)

There’s a problem with that idea however: those ants that exist today, like virtually all insects, have mouthparts that move not up and down as ours do but horizontally. That’s one of the reasons why close up movies of insects look so icky, their mouthparts move side to side. The idea that hell ants somehow moved their jaws upward was quite controversial; many paleontologists refused to believe it until they saw it.

The jaws of all modern ants, like the one in this close up image, move side to side not up and down as did the Hell Ant’s. (Credit: Fine Art America)

Well they believe it now, for a piece of amber from Myanmar has recently been discovered that encases a hell ant caught in the act of attacking its prey. Looking at the image below it is obvious that the hell ant, a new species that has been given the name Ceratomyrmex ellenbergeri, has grabbed its victim, an immature specimen of an ancient relative of the cockroaches called Caputoraptor elegans, from beneath with those dagger-like mandibles, capturing it in a fashion that could only be accomplished if those mandibles moved up and down.

The actual fossil of the Hell Ant attack is on the left. On the right is an artist’s impression for clarity. (Credit: USA Today)

Fossils like the hell ant from Myanmar that answer specific questions are of course rare; even the best researchers can spend years of their career looking for one. Just as often scientists make discoveries by using the latest technology to examine fossils in new ways and answer important questions about the history of life.

One such question deals with the first appearance of the sense of sight in the fossil record, the first animals to have eyes. While paleontologists agree that the compound eyes of the ancient arthropods called trilobites were the first eyes to evolve, there are still many questions about them. How exactly did they function and were they as advanced as the compound eyes of modern arthropods like insects or crustaceans? In other words how good was the vision of a trilobite?

The compound eyes of the trilobites were the first eyes to evolve. (Credit: Littlefoot’s Anthro Blog)

Now paleontologists at the University of Cologne and the University of Edinburgh have employed a high-tech digital microscope to examine the eye of a particularly well preserved specimen of a 429 million year old trilobite, Aulacopleura koninckii, from the Czech Republic. What the scientists found was that the trilobite’s eye was constructed from a honeycomb structure of 200 cells, with each cell having its own lens and providing the animal with one pixel of information. The vision of this 429 million year old animal was therefore equivalent to a modern digital image with 200 pixels, vague and imprecise but still the best in the world at that time.

The actual trilobite fossil used in the study by Cologne and Edinburgh Universities. (Credit: Phys.Org)

Such an eye is also virtually identical to that of a modern bee or dragonfly, the only difference being the number of cells; a dragonfly’s compound eye for instance can have as many as 30,000 cells. The fact that the arthropod eye has remained so stable for so long is a testament both to the simplicity and versatility of the compound eye and to the conservatism of evolution. If you have an organ that is doing a job quite well it can exist for many millions of years with only superficial changes.
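To put those cell counts in perspective, here’s a bit of back-of-the-envelope arithmetic of my own, treating each lens of a compound eye as one pixel of a square digital image:

```python
import math

# Each cell (ommatidium) contributes roughly one 'pixel' of information,
# so a compound eye can be compared to a square digital image.
for animal, cells in [("trilobite (Aulacopleura)", 200),
                      ("dragonfly", 30_000)]:
    side = math.isqrt(cells)  # side length of the equivalent square image
    print(f"{animal}: {cells} cells, roughly a {side} x {side} pixel image")

# trilobite (Aulacopleura): 200 cells, roughly a 14 x 14 pixel image
# dragonfly: 30000 cells, roughly a 173 x 173 pixel image
```

A 14 by 14 pixel image is crude indeed, but 429 million years ago it was the state of the art.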

As a final example of how, if you wait long enough, the fossil record will provide amazing evidence of how creatures lived long ago, a recent fossil of an ichthyosaur was unearthed in China with the remains of its last meal still recognizable in its stomach. Now ichthyosaurs were aquatic reptiles that lived during the age of the dinosaurs, see my posts of 28 October 2017 and 18 April 2018, and the fossil ichthyosaur found in China was dated to about 200 million years ago.

Ichthyosaurs (top) swimming with one of their usual prey, ammonites.
The actual fossil ichthyosaur. The bump in the middle of the fossil is its last meal still recognizable. (Credit: Yahoo)

According to the paper published in the journal iScience the skeleton of the ichthyosaur, a member of the genus Guizhouichthyosaurus, was nearly complete and measured about 5 meters in length. The big surprise was inside however: the partial skeleton of another marine reptile known as a thalattosaur.

In life the thalattosaur would have been about 4 meters in length, making this find the earliest known example of one large predator feeding on another. Although the thalattosaur’s head and tail were missing the rest of the skeleton was intact, the four limbs still connected to the body. While the researchers cannot be certain, they consider the intact condition of the body to be evidence that what they have discovered is a case of predation, not scavenging. In either case it is a remarkable find, two fossils for the price of one, telling a story from long ago.

The ichthyosaur shown with the thalattosaur for comparison. Only the middle section of the thalattosaur was consumed by the ichthyosaur. (Credit: Daily Mail)

Bit by bit paleontologists are filling in the gaps in our knowledge of the history of life here on Earth. Using the trilobite’s eye as a metaphor, our image of the past started out with only a small number of pixels, vague and imprecise. Each new fossil discovery adds one more pixel to that image and while we may not yet have reached the level of high-definition, our view of the past is becoming clearer all the time.

Gelatin and pectin, what are they, how do they work and how are they different?

Many of the foods that we buy in the supermarket are made more appetizing and longer lasting by the addition of a thickening agent to give them more body and volume. Thickeners work by increasing the viscosity of a liquid, normally without altering its taste or colour, and one of the most common forms of thickening agent is known as a Gel. In chemistry a gel is defined as a liquid contained by surface tension within a cross-linked solid network of molecules that prevents the liquid from being able to flow. In some respects a gel acts almost like a sponge, a lattice of fibers that holds in a liquid.

Making Jams and Jellies at home is a popular hobby, one that requires the use of pectin! (Credit: Amazon.com)
Gelatin desserts, on the other hand, are thickened by gelatin. (Credit: Dr. Oetker shop)

Commercially the two most common types of gels are pectin and gelatin, from which the word gel is derived. Both pectin and gelatin form their cross-linked networks from long chains of molecules, technically called polymers. The primary difference between the two chemicals is that in pectin the chains are made up of sugar molecules while in gelatin they are composed of proteins. Those differences stem from the sources of the two classes of chemicals, with pectin being derived from plant tissue while gelatin is produced from animal tissue.

Whether you’re talking about a sugar or a protein (this is a protein), you’re describing an extremely complex molecule formed as a chain of smaller molecules. (Credit: Biochemantics)

In plants pectin consists of a large number of compounds derived from sugars, technically polysaccharides, which serve as structural components in the cell wall. Pectin serves not only to strengthen the cell walls of the non-woody parts of plants but also to allow for cell growth while at the same time holding plant cells together. The softening of fruit as it ripens is caused by the breakdown of pectin through the action of enzymes, as is the rotting of the leaves of deciduous trees.

Fruit becomes overripe when the pectin in the cell walls begins to break down. (Credit: Women’s Health)

Historically pectin has been used in food production for many centuries, just how many is not precisely known. It was in 1825 however that the chemist Henri Braconnot first succeeded in isolating pectin. Although pectin can be obtained from numerous fruits and vegetables, modern commercial production of pectin relies primarily on the peels of citrus fruits.

Today pectin is commercially produced from the peels of citrus fruit. (Credit: NDTV Food)

Pectin is perhaps best known in food preparation for the production of jellies and jams, indeed without pectin your favourite jelly would be nothing but a sweet juice. In addition to jellies pectin is also used to provide a bit of substance in low fat baked goods and health drinks. Being both colourless and tasteless Pectin does not interfere with the natural appearance or flavour of the food it is adding body to.

Raw pectin ready to be used. When dissolved in a sweet juice pectin has neither a taste nor colour. (Credit: Wikipedia)

Pectin is also frequently used in non-food products, being added to cosmetics and drugs such as time-release drug capsules. In fact the increase in viscosity and volume provided by pectin has led to it being prescribed as a medication for constipation and diarrhea.

As mentioned earlier, unlike pectin gelatin is produced from animal parts, specifically the protein collagen obtained from the hooves, bones and skins of pigs and cows. Despite being chemically so different from pectin, gelatin behaves in much the same fashion and is often used in much the same way.

Raw gelatin can come in crystal form as shown here, or as cubes or even sheets, to be dissolved in water. (Credit: Alibaba.com)

Everyone is familiar with Jell-O, the brand name for fruit flavoured gelatin desserts. Other food products that obtain their firmness from gelatin include marshmallows, gummy candies, ice cream, cream cheese and even margarine. Like pectin, gelatin has no taste or colour and can be added to almost any food in order to give it some firmness without altering the flavour or appearance.

There’s always room for Jell-O. Sorry I just had to say that once. (Credit: Amazon.com)

Gelatin also has a large number of non-food uses including a variety of glues and other adhesives. In photography gelatin is by far the most commonly used material for holding the light-sensitive silver halide crystals onto the film or paper. Other uses include the capsules of some powdered drugs as well as the binder for matches and sandpaper. Ballistics gelatin is commonly used in measuring the performance of firearms and their projectiles.

The chemicals that react to light are held onto photographic film by gelatin! (Credit: Wiki-Camera)
Ballistics Gel, made from gelatin, is used in the testing of firearms because its density and thickness is very close to that of animal flesh. (Credit: Wikipedia)

The earliest known uses of gelatin come from 15th century England, where the hooves of cows were boiled to obtain it. Modern gelatin production comes from the hides of pigs and cows (75%) or their bones (25%), although some cooks still produce their own gelatin at home from animal bones and cuts of meat containing a good deal of cartilage.

Because it is made from animals the consumption of gelatin may be taboo for some people for religious or ethical reasons. Jews and Muslims are forbidden from eating gelatin made from pigs while Hindus are forbidden from eating gelatin made from cows. A vegan of course would refuse to consume gelatin of any kind. Pectin on the other hand, being produced from plants, doesn’t raise any such moral conflicts.

Almost everyone has some kind of food they will not eat but many religions require people to refrain from certain foods. (Credit: Slide Player)

Subjective opinions aside, pectin and gelatin are two very different classes of chemicals that nevertheless are used in very similar ways in the production of food. Providing another reminder that cooking is really just chemistry.

Meteorologists update their estimate for the 2020 Hurricane season. Hold onto your hat, stormy weather’s a headin’ our way.

It was only three months ago, on 27 May 2020, that I published a post discussing how the 2020 hurricane season was shaping up to be a very active one. The official estimates at that time, from both Colorado State University’s Tropical Meteorology Project and the National Oceanic and Atmospheric Administration (NOAA), were predicting around 15-20 named storms with 8-10 becoming hurricanes and 3-5 turning into major hurricanes.

Beautiful but deadly. Hurricane Florence in 2018 as seen from Earth orbit. (Credit: Spectrum News)

Those predictions have already been proven to be conservative. We are still not through August and there have already been 12 tropical storms, five of which have developed into hurricanes. As I write these words Tropical Storm Marco, downgraded from a Cat 1 hurricane, is battering the state of Louisiana while Hurricane Laura, just upgraded to Cat 3, is headed for almost the exact same area of the gulf coast. And the next month and a half is usually the busiest part of the Atlantic hurricane season.

Even as Marco (upper left) dumps rain on the Gulf States Laura is headed towards the very same region for a one-two punch that’s unprecedented. (Credit: CNN.com)

It’s not surprising therefore that the same institutes that made those predictions three months ago have reevaluated their estimates and are now publishing the most dire forecast in the history of hurricane studies. The meteorological team at Colorado State University now estimates that the 2020 hurricane season will consist of 24 named tropical storms of which 12 are likely to become hurricanes with 5 developing into major hurricanes.

If that forecast turns out to be accurate it would make the 2020 season the second most active in recorded history, surpassed only by the 2005 season which saw 28 named storms, including hurricanes Katrina and Wilma. And since hurricane forecasters only select 21 names for storms each year (the letters Q, U, X, Y and Z are not used to name storms), if there are 24 named storms meteorologists will be forced to use Greek letters to identify the last three, something that has only happened once before, back in 2005.
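For the curious, the naming scheme is easy to express in a few lines of code. This is just an illustrative sketch, with placeholder names standing in for the official alphabetical list:

```python
# Atlantic storm naming, sketched: 21 alphabetical names per season
# (Q, U, X, Y and Z are skipped), then the Greek alphabet takes over.
GREEK = ["Alpha", "Beta", "Gamma", "Delta", "Epsilon", "Zeta"]

def storm_name(n, season_names):
    """Return the name of the n-th named storm of a season (1-indexed)."""
    if n <= len(season_names):
        return season_names[n - 1]
    return GREEK[n - len(season_names) - 1]  # 22nd storm -> "Alpha"

season_names = [f"Name{i:02d}" for i in range(1, 22)]  # placeholder list
print(storm_name(24, season_names))  # -> "Gamma", the third Greek letter
```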

So why is this year already so active, and what conditions are the meteorologists seeing that made them redo their forecasts? Well one factor that often inhibits the formation of hurricanes is a strong El Niño in the eastern Pacific Ocean. The strong winds developed by El Niño can produce wind shear that disperses low-pressure systems in the Atlantic before they can even develop into tropical storms. This year however there is absolutely no trace of El Niño, a condition that will allow storms to grow unchecked.

A strong El Nino in the Pacific sends winds into the Atlantic that can break up hurricanes before they become dangerous. There is no El Nino this year. (Credit: NOAA)

At the same time low-pressure systems moving westward off of North Africa are stronger than usual because of exceptional rainfall amounts in the Sahel region of Africa between the Sahara Desert and the Congo rainforest. This region generates almost 90% of the low-pressure systems that develop into tropical Atlantic storms and this year the excess rainfall is making them particularly intense.

Pressure waves moving west off of the Sahel region of North Africa often grow into Atlantic hurricanes. Sahel waves are especially strong this year. (Credit: The Weather Channel)
Those pressure waves often become the seeds that turn into deadly hurricanes like Laura! (Credit: The Weather Channel)

But of course the biggest factor in generating tropical storms is simply the temperature of the waters of the Atlantic Ocean and Gulf of Mexico both of which are at or even beyond historic levels. Warm tropical waters evaporate more quickly, putting not only more water but more energy into those low-pressure systems coming from Africa, leading to more, and more powerful storms. 

Surface water temperature in the Gulf of Mexico is becoming higher every year thanks to Global Warming! This increase in energy is reflected in more and more powerful Gulf hurricanes. (Credit: ResearchGate)

And what is it that’s making the waters of the Atlantic and Gulf warmer than ever observed before? Well if you haven’t already guessed Global Warming, where have you been for the last 20 years? Seriously, the conditions caused by our continued, reckless emissions of carbon dioxide have grown beyond the point of causing ‘Slightly Higher Averages’ so that now nearly every year is noticeably hotter than those just 20 years ago.

This increase is occurring both globally and locally. For example, here in Philadelphia this past winter was very warm with no snow accumulation at all, and right now we are enduring our 34th day of +32ºC (+90ºF) temperatures; our average is 22 days. The entire west coast of the US is currently suffering through a heat wave on a scale never seen before, hundreds of all-time high temperature records are expected to be broken while the wildfire season is already turning out to be especially destructive. In fact the National Weather Service has for the first time ever issued a warning for ‘Fire-Nadoes’, tornadoes generated by the extreme winds in a massive wildfire. Meanwhile the temperature in Death Valley was recently measured at 54.4ºC (130ºF), the highest reliably recorded temperature ever in the entire world.
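If you want to double check those temperature readings, the standard Celsius-to-Fahrenheit conversion is all you need; a quick sketch:

```python
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit: F = C * 9/5 + 32."""
    return c * 9.0 / 5.0 + 32.0

print(c_to_f(32.0))  # 89.6, Philadelphia's string of ~90 F days
print(c_to_f(54.4))  # 129.9, the record Death Valley reading
```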

This summer in Philadelphia has been excessively hot, with stronger than normal storms added in. (Credit: 6ABC)
Meanwhile the western US is suffering under an historic heat wave. (Credit: CBS News)

Globally, last year, 2019, was the second hottest year ever recorded, coming in only slightly below 2016. In fact according to NOAA 9 of the 10 hottest years ever recorded have come in the last 15 years. A team of climatologists working in Greenland have recently announced that, in their opinion, the glaciers there are beyond the point of no return.

Greenland is melting while we watch, and all of that water is raising sea levels! (Credit: YouTube)

So I guess the question is how much more destruction the environment is going to have to cause before we finally start to pay attention. Personally I’m beginning to fear that even a disaster on the scale of the sinking of Miami might not be enough; after all Katrina and New Orleans in 2005 weren’t. Currently millions of Americans are doing everything they can to ignore the worst epidemic to hit this country in 100 years. I’m almost certain they can find excuses to keep on ignoring climate change even as it’s blowing down their homes!

Ancient Navan Fort in Northern Ireland may just be the top layer of a vast complex of Bronze and Iron Age structures.

According to ancient Irish history and myth the high kings of Ulster, who ruled from about 1000 BCE to 500 CE, resided at a place known as Emain Macha. Today that site is called Navan Fort and is located just outside of the town of Armagh in Northern Ireland.

Navan Fort, ancient Emain Macha, in what is today Northern Ireland. (Credit: Smithsonian Magazine)

Just how much of those ancient records is history, and how much is myth, is often difficult to tell; that’s why the discoveries made by archaeology are so important in helping us to separate fact from fiction. Now a new study of Navan Fort is giving preliminary indications that there are more structures hidden in the soil than previous studies had found, that we’ve literally only just scratched the surface of the archaeological remains at Emain Macha.

Ancient manuscript telling the Irish myth of Cul Dreimhe (The Wild Geese). How much of these stories is based on actual events is unknown. (Credit: The Wild Geese)

The study was conducted by Queen’s University Belfast, but because the site is a well known historical and tourist attraction none of the usual digging associated with archaeology was carried out. Instead the researchers, led by study authors James O’Driscoll, Patrick Gleeson and Gordon Noble, surveyed the site using high tech, non-invasive tools such as those I described in my post of 27 June 2020.

The scientists began their work with an aerial mapping of the site conducted by a technique known as LiDAR. The LiDAR instrument uses laser beams similar to those in a bar code scanner to sweep the ground from an airplane in order to construct a point-by-point 3D contour model of the entire site. LiDAR scans are so precise and accurate that small bumps and gullies that are imperceptible on the ground become clearly visible in a LiDAR generated plot. See image below.

(Credit: O’Driscoll, Gleeson and Noble)
Airborne LiDAR scans the ground with a laser beam generating an extremely accurate 3D model of the surface. (Credit: Abhipedia)
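The geometry behind each LiDAR return is simple enough to sketch in a few lines. This is a toy calculation of my own; a real airborne survey also folds in GPS and inertial data for the aircraft’s exact position and attitude:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lidar_return(round_trip_s, scan_angle_deg, aircraft_alt_m):
    """Toy single-return calculation: pulse round-trip time gives range,
    range plus scan angle gives ground offset and surface elevation."""
    rng = C * round_trip_s / 2.0                        # one-way distance
    theta = math.radians(scan_angle_deg)
    offset = rng * math.sin(theta)                      # metres off the flight line
    elevation = aircraft_alt_m - rng * math.cos(theta)  # height of the surface
    return offset, elevation

# A pulse that takes ~6.67 microseconds to return from straight down
# at 1000 m altitude comes from a surface at roughly 0 m elevation:
print(lidar_return(6.67e-6, 0.0, 1000.0))
```

Millions of such returns, each one a point, build up the 3D contour model of the site.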

Having obtained a LiDAR survey of the entire site the researchers followed up their high altitude study with ground level Magnetic Gradiometry and soil Electric Resistance measurements. See images below.

(Credit: O’Driscoll, Gleeson and Noble)
(Credit: O’Driscoll, Gleeson and Noble)

When assembled into one map of the entire site the multiple readings reveal a large number of hidden, subsurface structures within the main outer ring of the fort. See image below.

(Credit: O’Driscoll, Gleeson and Noble)

Of particular interest to the archaeologists were a number of structures that appear to be two circular buildings overlapping each other and forming a figure eight shape. While round houses are typical throughout the ancient Celtic world the purpose of these overlapping figure eights is currently unknown. That’s the problem with high tech, non-invasive techniques: while they may show the location and general shape of what’s hidden under the ground they can’t reveal exactly what those structures are nor precisely when they were built and occupied. To answer those questions it is necessary to do some actual digging.

During the Iron Age a typical Irish farmstead would consist of a roundhouse surrounded by a wooden fence. The new survey of Navan Fort has uncovered evidence of many such structures. (Credit: Virtual Visit Tours)

This is not the first time that Navan Fort has been surveyed by archaeologists; excavations carried out during the 1960s and 1990s discovered the largest known building dating from prehistoric Ireland, a 40m diameter roundhouse. Like Stonehenge in England however, Navan Fort is important both historically and culturally so any actual digging that takes place there must be carried out sparingly and carefully.

Like Stonehenge, Navan Fort is an archaeological site that is so famous and culturally important that extra special permission is required to disturb it in any way. (Credit: The New York Times)

It was in the hopes of acquiring funding for actual excavations that the archaeologists at Queen’s University conducted their high tech examination. By first using non-destructive instruments to locate hidden structures the scientists can concentrate on the most interesting areas, hoping to get not only the most bang for their buck but the most discoveries for each shovelful of dirt.

The stories associated with Navan Fort are rooted deep in Irish culture. Most famously the location plays a prominent part in the Ulster Cycle of stories as the home of the hero Cú Chulainn, of Conchobar mac Nessa, the king of Ulster, and of Deirdre, the most beautiful woman in Ireland.

Known as the Hound of Ulster (Cú is Irish for hound), Cú Chulainn is the hero of a large cycle of myths central to Irish culture. (Credit: Twitter)

Whether or not archaeology can ever provide evidence that those legendary characters actually lived is questionable, but it is the only way we have of learning something about how the people of those times lived.

Space News for August 2020.

Without doubt the big news for this month is the successful conclusion of the Space X crew demo 2 manned mission. The mission of astronauts Bob Behnken and Doug Hurley began back on the 30th of May 2020 as their Space X Dragon capsule blasted off from Kennedy Space Center in Florida. The very next day the spacecraft followed up its successful launch by docking at the International Space Station (ISS). For the last two months Behnken and Hurley have served as regular members of the ISS crew with Behnken even participating in two EVAs.

Splashdown of Space X’s demo 2 capsule. The safe return of astronauts Behnken and Hurley completes the first manned space mission by a commercial company. (Credit: Space News)
The demo 2 capsule safely aboard the recovery ship. Space X intends to reuse their capsules as a part of their program to reduce the cost of space travel. (Credit: Space.com)

The mission of Space X crew demo 2 however was to demonstrate the ability of the Dragon capsule to take astronauts into, and back from, space. So in order to complete their mission, on the first of August Behnken and Hurley climbed back aboard their capsule and undocked from the ISS. The next day the Dragon fired its retro-rockets to slow its orbital speed so that it could reenter the atmosphere.

The whole operation went without a hitch; the capsule endured its fiery descent caused by friction with the atmosphere before first a pair of drogue parachutes and then four big main chutes brought the capsule’s velocity down to less than 15 kph. The most notable part of the whole reentry procedure was that this was the first American manned splashdown in 45 years. (The space shuttle, you may recall, landed like an airplane on a runway.)

So what’s next for the Space X Crew Dragon spacecraft? Well remember this mission was actually the last of the demonstration missions required by NASA to qualify the Dragon for taking their astronauts back and forth to the ISS. The next flight will officially begin NASA’s Commercial Crew Program with a mission to the ISS. That launch, the commercial crew 1 mission, is currently scheduled for 23 October 2020. NASA astronauts Michael Hopkins, Victor Glover and Shannon Walker will be joined by Japanese astronaut Soichi Noguchi for a full six-month tour aboard the ISS.

Official poster for the upcoming Space X commercial Crew 1 mission to the ISS. (Credit: Reddit)

And NASA and Space X have also just announced the crewmembers for the commercial crew 2 mission scheduled for the spring of 2021. NASA astronauts Shane Kimbrough and Megan McArthur will serve as mission commander and pilot respectively. Japanese astronaut Akihiko Hoshide and European astronaut Thomas Pesquet will join Kimbrough and McArthur as mission specialists.

And the Space X corporation has even more news to celebrate: on August the 4th the SN5 prototype of Space X’s planned Starship rocket successfully completed its first short powered flight. Now Space X has had its share of problems in previous attempts at this first test flight. While one of the earlier prototypes simply collapsed under its own weight, several others actually exploded in spectacular fashion. But engineering is trial and error, and eventually Space X got it right. This first test was only a short 150m hop, but if you follow the link to the youtube video below you’ll see that the rocket was under complete control for the entire flight.
https://www.youtube.com/watch?v=s1HA9LlFNM0

It sure doesn’t look much like a rocket, but it flew with more control than any rocket ever! (Credit: New Atlas)

Still, this is only the beginning. The final Starship rocket envisioned by Space X founder Elon Musk is projected to be 120m in height, four times that of the SN5 prototype. So there’s still a lot of work to do before Space X can even begin its long term plans for using the Starship rocket to colonize the Moon and Mars.

Artist’s impression of Space X’s eventual Starship rocket for travel to the Moon and Mars. (Credit: Techcrunch)

Believe it or not there is some space news that doesn’t deal with Space X. On July 30th NASA launched its latest rover on a mission to Mars. Perseverance will reach the red planet in February of 2021, landing in the Martian crater Jezero. Perseverance is the first rover designed to look for signs of ancient life on Mars. The rover also carries a small helicopter as a technology demonstration which, if successful, would become the first man-made aircraft to fly anywhere outside of the Earth.

Launch of the Perseverance rover on its way to Mars. (Credit: BBC.com)

Finally, there’s good news in the preparations of the Lucy space probe for its mission to the Trojan asteroids, scheduled for launch in October of 2021. Despite problems caused by the Covid-19 virus, on 27 July the mission planners passed their System Integration Review. This now allows assembly of the space probe to begin at Lockheed Martin’s Space Systems facility in Littleton, Colorado, where all of the mission systems are to be integrated onto the spacecraft’s main bus. Once assembly is completed, testing of the entire probe can begin.

NASA has to have a patch to commemorate every mission; here’s Lucy’s. (Credit: Twitter)

The schedule is tight; the mission planners are counting on gravity-assist flybys of Earth to speed Lucy on its 12 year mission, so if they miss their October 2021 deadline they’ll have to wait another two years to launch. During its mission Lucy will visit as many as seven different asteroids, making it in many ways the most complex mission ever attempted.
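How does a gravity-assist flyby ‘speed up’ a spacecraft? In the planet’s own reference frame the encounter only turns the spacecraft’s velocity vector, but seen from the Sun that turn can add real speed, stolen from the planet’s orbital motion. Here’s a minimal two-dimensional sketch; all of the numbers are illustrative assumptions, not actual Lucy mission values.

```python
import math

# Toy 2-D gravity assist. In the planet's frame the encounter only
# rotates the spacecraft's velocity; in the Sun's frame speed changes.
v_planet = 30.0              # planet's heliocentric speed, km/s (~ Earth's)
vx_in, vy_in = 25.0, 5.0     # heliocentric velocity before flyby (assumed)

# Switch to the planet's frame (planet moves along +x)
vx, vy = vx_in - v_planet, vy_in
v_rel = math.hypot(vx, vy)

# The flyby rotates the relative velocity by the turn angle,
# which is set by how close the spacecraft passes (assumed here)
delta = math.radians(-60.0)
vx2 = vx * math.cos(delta) - vy * math.sin(delta)
vy2 = vx * math.sin(delta) + vy * math.cos(delta)

# Back to the Sun's frame: relative speed unchanged, total speed boosted
v_in = math.hypot(vx_in, vy_in)
v_out = math.hypot(vx2 + v_planet, vy2)
print(f"speed relative to planet (unchanged): {v_rel:.2f} km/s")
print(f"heliocentric speed: {v_in:.2f} km/s -> {v_out:.2f} km/s")
# With this geometry the flyby adds about 7 km/s for free;
# flip the sign of the turn angle and it becomes a braking maneuver.
```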

Axions, the elementary particles that ought to exist, but do they?

The sub-atomic physics that I was taught in high school was pretty simple. Atoms were made up of Neutrons, Protons and Electrons. The Neutrons and Protons stayed in the atom’s nucleus, and are given the name nucleons for that reason, while the Electrons orbited around the nucleus. We also learned that the Protons had a positive charge, the Electrons a negative charge, while the Neutrons were electrically neutral.

The Bohr model of an atom of Nitrogen. This is about as sophisticated as high school science classes will get. (Credit: SlidePlayer)

That was about all you’d learn in class; if you wanted to learn any more you’d have to do outside reading on your own, which I did plenty of. It was from books like George Gamow’s “Thirty Years that Shook Physics” that I learned about other particles like the neutrino, muon, pion, lambda and delta particles. (Although the science may be rather outdated, I still highly recommend Gamow’s book as a history of Quantum Mechanics!!!) Oh, and I also learned that every one of those particles had an anti-particle, identical in every way to its partner except for having the opposite electrical charge.

Thirty Years that Shook Physics by George Gamow. Highly Recommended!!! (Credit: Flickr)

But even as I was attending high school physicists were digging deeper. In fact it was in 1964 that physicist Murray Gell-Mann proposed the quark theory of nucleons. Gell-Mann’s idea was that the Proton and Neutron were composed of three smaller particles called quarks: two up quarks and a down quark made a proton, while a neutron was two downs and an up. Unusual particles like the lambda were also composed of three quarks, but for them one of the quarks was a strange quark, a name given to the particle to indicate how little physicists understood it at the time. In Gell-Mann’s theory the pion was also composed of quarks, but of a quark anti-quark pair. Meanwhile the electron, muon and neutrino were not made of quarks; they remained elementary, fundamental particles that cannot be decomposed into smaller pieces.

The Gell-Mann model of nucleon structure. Three quarks make a proton or neutron while a quark anti-quark pair make a pion! (Credit: Lumen Learning)
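The electric charges work out neatly in Gell-Mann’s scheme: the up quark carries +2/3 of the proton’s charge and the down quark −1/3, so the right combinations reproduce the familiar charges exactly. Here’s a quick sketch of that bookkeeping; the charge assignments are standard, the code itself is just an illustration.

```python
from fractions import Fraction

# Electric charge of each quark flavor, in units of the proton charge
charge = {
    "up": Fraction(2, 3),
    "down": Fraction(-1, 3),
}
# Anti-quarks carry the opposite charge
anti_charge = {q: -c for q, c in charge.items()}

def total_charge(quarks, antiquarks=()):
    """Sum the charges of a bound state's quarks and anti-quarks."""
    return sum(charge[q] for q in quarks) + sum(anti_charge[q] for q in antiquarks)

print("proton  (uud):", total_charge(["up", "up", "down"]))     # +1
print("neutron (udd):", total_charge(["up", "down", "down"]))   #  0
print("pi+ (u anti-d):", total_charge(["up"], ["down"]))        # +1
```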

It took physicists more than 20 years to work out the ramifications of Gell-Mann’s theory, but by the early 1990s they had a framework called ‘The Standard Model’ that was able to broadly describe the interactions between the particles they saw in their high energy ‘atom smasher’ experiments. The final piece of the standard model was the discovery in 2012 of the Higgs boson, the particle that gives all of the other particles their mass.

Peter Higgs standing in front of a photograph of some of the equipment needed to finally discover his Higgs boson. (Credit: Science / How Stuff Works)

The standard model doesn’t answer all our questions however. For example, while the Higgs boson does give other particles their mass, we don’t understand why those particles have the masses they do. The up and down quarks have roughly the same mass, just a few times that of an electron, but the other quarks have much larger masses. At the same time we know that the neutrino also has a mass, but one so small that we haven’t been able to measure it accurately yet; it’s less than one millionth that of the electron. What sets the masses of all these different particles? We just don’t know.

The masses, in millions of electron volts (MeV) of some elementary particles. What makes all of these masses what they are is completely unknown! (Credit: Semantic Scholar)
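To put those differences in perspective, here’s a small sketch comparing rough, commonly quoted rest masses to the electron’s. The values are approximate, rounded figures, good enough only to show the bizarre spread the standard model leaves unexplained.

```python
# Approximate rest masses in MeV (rounded, commonly quoted values)
masses_mev = {
    "electron": 0.511,
    "up quark": 2.2,
    "down quark": 4.7,
    "strange quark": 95.0,
    "charm quark": 1270.0,
    "bottom quark": 4180.0,
    "top quark": 173000.0,
}

electron = masses_mev["electron"]
for particle, m in masses_mev.items():
    print(f"{particle:14s} {m:>10.3f} MeV  ~ {m / electron:>8.0f} x electron")
# The top quark outweighs the electron by a factor of roughly 340,000 --
# and nothing in the standard model explains why.
```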

One of the problems not addressed by the standard model is that, according to theory, the neutron should possess a strong electric dipole moment; it should act like a positive and a negative charge brought close together, a property that would be easily detected, yet no experiment has ever found it. To solve this dilemma, known as the strong CP problem (CP standing for Charge conjugation / Parity), in 1977 the physicists Roberto Peccei and Helen Quinn, soon followed by Frank Wilczek and Steven Weinberg, proposed a new particle called the axion. This new particle would have a very low mass, like the neutrino on the order of one millionth that of an electron, and would hardly interact with other types of particles.

One interesting experiment that might discover the axion. (Credit: Universe-Review.ca)

Even while particle physicists were trying to make sense of the concept of the axion, astrophysicists and cosmologists heard about the particle and realized that the axion, if it existed, could be a major component of Dark Matter. With its low mass the axion would have been created in enormous numbers during the Big Bang, and since axions hardly interact with other particles they would still exist today. Could the axion be the dark matter that the astrophysicists were searching for?

Now predicting new particles is a risky business. If you’re right you’ll become famous like Wolfgang Pauli with the neutrino or Peter Higgs and his boson. On the other hand there are dozens of ‘predicted particles’ that have never been found. And it often takes decades for experimentalists to develop the technology needed to prove that a particle exists. Pauli predicted the neutrino in 1930 and it wasn’t proven to exist until 1956. The same was true for the Higgs boson: Peter Higgs wrote his original paper in 1964 but the particle was only officially discovered in 2012.

Discovering the axion is what researchers at the Gran Sasso National Laboratory in Italy hope to accomplish with their XENON1T experiment. The experiment consists of a 3.2 metric ton tank of liquid Xenon in what is known as a Time Projection Chamber. Photomultiplier tubes inside the tank detect the tiny flashes of light produced by particle interactions, and the entire apparatus was constructed deep beneath the Gran Sasso Mountain in order to shield the experiment from false signals due to cosmic rays.

Main detector of the XENON1T experiment being readied for installation. (Credit: LNGS-Infn)

After two years of operation the XENON1T team has now announced what may be the first measured evidence for the existence of axions. At a news conference on June the 17th the XENON1T physicists presented their data showing an excess number of flashes in the low energy region. This was exactly the sort of signal that would be expected for interactions with axions produced in the interior of the Sun. According to the announcement the amount of data collected was sufficient for a 3.5 sigma confidence level in the discovery.

Layout of the Gran Sasso Laboratory, now the world’s largest underground physics laboratory. (Credit: Nature)

That 3.5-sigma level is the problem; statistically 3.5-sigma means that there are only about two chances in 10,000 that the excess flashes are simply a matter of luck. Like rolling a pair of dice and getting boxcars three times in a row: it only happens very rarely, but it does happen. The physics community has agreed that in order to really announce a ‘Discovery’ an experiment must achieve a confidence level of 5-sigma, which means that there is only about one chance in 3.5 million that the data is just a statistical fluke.

Scientists express their ‘confidence level’ by using the normal, or bell curve, distribution. The larger the sigma value, the more of the bell is contained, and the more confident you are! (Credit: Dummies.com)
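Turning a sigma value into a probability is just a matter of integrating the tail of that bell curve, which Python’s standard library can do directly. A minimal sketch, assuming the one-sided tail convention that particle physicists typically use:

```python
import math

def tail_probability(sigma: float) -> float:
    """One-sided Gaussian tail probability for a given significance."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for sigma in (3.5, 5.0):
    p = tail_probability(sigma)
    print(f"{sigma}-sigma: p = {p:.2e}  (about 1 chance in {1 / p:,.0f})")

# Prints roughly 1 chance in 4,300 for 3.5-sigma
# and roughly 1 chance in 3.5 million for 5-sigma.
```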

So what do experimental physicists do when their experiment looks like it’s found something but the data is too sparse to be certain? Build a bigger, more sensitive experiment of course. The scientists at XENON1T are already doing just that, upgrading their equipment to an 8 metric ton container of Xenon for a new 5-year run that should be able to cross the magic 5-sigma threshold.
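Why should a bigger tank and a longer run settle the question? Under the simplest counting assumptions the statistical significance of a fixed signal grows roughly as the square root of the exposure (detector mass times run time). Here’s a crude sketch of that scaling, treating the published 3.5-sigma figure as the baseline; the square-root rule is a simplifying assumption, not the collaboration’s own projection.

```python
import math

# Rule of thumb: significance ~ sqrt(exposure) when the signal and
# background rates per ton-year stay fixed (a simplifying assumption).
old_exposure = 3.2 * 2.0   # XENON1T: 3.2 metric tons for ~2 years
new_exposure = 8.0 * 5.0   # upgrade:  8 metric tons for a 5-year run

old_sigma = 3.5
new_sigma = old_sigma * math.sqrt(new_exposure / old_exposure)

print(f"projected significance ~ {new_sigma:.1f} sigma")
# ~ 8.8 sigma, comfortably past the 5-sigma discovery threshold --
# IF the excess is a real signal and not contamination.
```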

So has the axion been found? Well, some other physicists are already criticizing the whole setup; the same signal could be produced by the detector being contaminated with tritium, a radioactive isotope of hydrogen. It takes time to be certain, so we’re all just going to have to wait. Making a discovery is what every scientist dreams of, but as they all know, it’s more important to be right than to be first!