Cyborgs now have their own Olympics, the Cybathlon

This story actually dates from almost a year ago, but I only just heard about it myself, and since it relates directly to one of my previous posts (11Mar17) I hope you’ll still find it interesting.

In October of 2016 the first ever athletic competition for cyborgs, yes you heard me right, cyborgs, was held in Zurich, Switzerland. The competition was conceived by Robert Riener of the Swiss Federal Institute of Technology as a way to promote the development of computerized assistive technologies for the physically disabled. The organizers also hoped the event would encourage the engineers working on assistive technologies to interact more with their potential users in order to improve their designs.

The competition consisted of six events, ranging from paralyzed individuals using exo-skeleton suits to walk through obstacle courses to amputees grasping and using everyday objects with powered prosthetic limbs. There was also a ‘virtual marathon’ where tetraplegics, people with little or no use of any of their limbs, used only their brains to control computerized avatars.

The exo-skeleton obstacle course was designed to replicate everyday life, with walking up ramps and stairs along with maneuvering around corners. In other words the competition was not so much an athletic event as a side by side test of locomotive technologies to determine which was best suited to improving the lives of the disabled. The pictures below show two of the obstacle courses used in the Cybathlon.

Cybathlon Obstacle Course (Credit: Alessandro Della Bella)
Cybathlon Obstacle Course (Credit: Nicola Pitaro)

The same is true of the events for powered prosthetics, which were intended to test ranges of motion as well as the ability to manipulate everyday objects. The pictures below show some of the innovative technologies in action.

Prosthetic Arm (Credit: Nicola Pitaro)
Prosthetic Arm (Credit: Alessandro Della Bella)

If those contests seem like something out of a Sci-Fi novel, the virtual marathon using only a brain-computer interface is unquestionably futuristic. Electrodes in skullcaps were used to detect and measure brain waves, which were fed into a software algorithm that ‘decoded’ them into commands for the avatars to either run, jump or slide as required for the virtual race. Interestingly, the race also had sections where the avatar was required to do nothing. This forced the contestants to control their brain waves in order to give no commands, a situation that would occur frequently in real life.

Developing the software that decoded the brain waves required a long iterative process of matching a brain wave to an intended action. At the same time the contestants also had to train themselves to frame their thoughts properly, giving the program a better chance of decoding them accurately. The pictures below show the skullcap used by one team of contestants along with an illustration of how the ‘virtual marathon’ was conducted.
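To give a feel for what ‘decoding’ means here, the toy sketch below maps a single brain-signal feature to an avatar command. Everything in it, the feature, the thresholds and the function name, is invented for illustration; real race decoders are classifiers trained on each contestant’s own brain waves over many calibration sessions.

```python
def decode_command(band_power):
    """Map a single EEG band-power reading (0 to 1) to an avatar command.

    The feature and thresholds are purely illustrative, not any
    Cybathlon team's actual algorithm.
    """
    if band_power < 0.2:
        return "none"   # resting state: deliberately issue no command
    elif band_power < 0.5:
        return "run"
    elif band_power < 0.8:
        return "jump"
    else:
        return "slide"

# The 'do nothing' sections of the race amount to holding the signal
# below the lowest threshold.
print(decode_command(0.1), decode_command(0.6))
```

The tricky part the contestants trained for is exactly that lowest band: producing a signal weak enough to register as no command at all.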

Skullcap (Credit: Erik Tham)
Virtual Marathon (Illustration: James Provost)


Technology competitions of this kind have proven to be great spurs to development. In a previous post (17Jun17) I wrote about how DARPA’s road race challenge for robotic vehicles played a significant role in the development of the driverless cars now taking their first tentative ‘drives’ on our highways. The same can be said for the X Prize competitions for space technology.

There are already plans for the next Cybathlon, scheduled to take place in 2020. With the advances in assistive technologies taking place in laboratories around the world, Cybathlon 2020 may really be science fiction come true. If you’d like to read more about the Cybathlon click on the link below to be taken to the event’s official website.



Is Global Warming Responsible for the Increased Number and Strength of Hurricanes?

The hurricane season for 2017 is just past its halfway point and already this year has proven to be abnormally deadly and destructive. Hurricane Harvey inundated southeast Texas with over a meter of rain while Hurricane Irma wrecked several Caribbean islands before cutting a trail of destruction the length of the Florida peninsula. By some measurements Irma was the strongest Atlantic storm ever seen, remaining a category 5 storm longer than any on record, with the second highest wind speed ever measured. For a short time Irma and Jose were both powerful hurricanes at the same time, a rare instance of two such intense storms existing in the Atlantic at once. Plus, I just heard on the news that Jose is now officially one of the longest-lived hurricanes on record.

Even now there are three powerful storms in the Atlantic. Jose has been downgraded to a category 1 but is still a possible threat to the US east coast. Maria has strengthened to a category 5 and is expected to strike Puerto Rico today and then perhaps hit the Carolina coast. Finally there is tropical storm Lee, so far out in the Atlantic that we don’t yet know what it’s going to do. And hurricane season still has two months to go! The picture below is of Irma taken from space, and while beautiful you can still feel something of the storm’s power in the image. By the way, the small brown object to the left of the storm is Puerto Rico, which gives an idea of just how big this storm was.

Hurricane Irma from the Space Station (Credit: NASA)

The number and destructive power of these storms force us to ask the question, could global warming be responsible? Has all the carbon dioxide and methane we’ve been pouring into the atmosphere increased storm activity in the Atlantic?

First of all there is simply no doubt that carbon dioxide and methane are greenhouse gasses. Any college freshman chemistry lab is capable of making the necessary measurements. I know that because I did it way back in the 1970s!

Secondly, we know with great accuracy the amount of those gasses produced by the burning of fossil fuels in our vehicles and power plants. Yes, I know the Earth’s atmosphere is huge, but over 30 trillion kilograms of pollution every year is also an enormous amount; many cities throughout the world have smog problems and air pollution is a major health concern.

Thirdly, we can measure the rise in temperature over the last 50 years of both the atmosphere, 0.6 degrees Celsius, and the oceans, 0.32 degrees. While these may seem like small changes, when you consider the size of the world’s oceans that represents an enormous amount of energy. The graph below from the National Oceanic and Atmospheric Administration (NOAA) shows the increase in the amount of energy in the oceans due to global warming. The total is about 15×10²² joules, and to give you an idea of just how much energy that is, it’s more than the energy in 35 million one-megaton nuclear bombs. That’s right, the increase in energy is more than 35 million nuclear bombs!!!
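The bomb comparison is easy to check for yourself. A minimal sketch, assuming the standard convention that one megaton of TNT equals 4.184×10¹⁵ joules:

```python
# Back-of-the-envelope check of the ocean heat figure quoted above.
OCEAN_HEAT_GAIN_J = 15e22       # increase in ocean heat content (from the NOAA graph)
JOULES_PER_MEGATON = 4.184e15   # standard energy of a one-megaton explosion

bomb_equivalents = OCEAN_HEAT_GAIN_J / JOULES_PER_MEGATON
print(f"about {bomb_equivalents / 1e6:.1f} million one-megaton bombs")
```

The division gives roughly 36 million one-megaton bombs, which matches the ‘more than 35 million’ claim.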

Increase in Oceanic Heat Content (Credit: NOAA)

So even if only a small fraction of that energy increase gets into the storms that form over the oceans it would certainly be enough to significantly amplify the number and power of those storms. So, what are the numbers? Has there been an increase in the number of tropical storms and hurricanes in the Atlantic?

The table below shows the average yearly number of both tropical storms and hurricanes by decade for the 1970s, 80s, 90s and 2000s, along with 2010 to 2016. The most obvious increase is between the 1990s and the 2000s, a more than 40% jump, but the increase from the 80s to the 90s is not insignificant either. Now, climatologists like to look at long term trends; to them even a decade is a short period of time. Nevertheless, over the last 16-17 years there has been an undeniable increase in both the number and strength of Atlantic storms.

Yearly Average of Tropical Storms in decades (Credit: R. A. Lawler)

Now, I’ve only been talking about tropical storms in the Atlantic. The Pacific Ocean has also seen an uptick in activity, along with an increase in tornadoes across North America and a general increase in rainfall throughout the world. All of this is a strong indication that global warming is producing more powerful, more violent weather everywhere.

The time for debate is past; the effects of climate change are already upon us. There’s much worse to come unless we seriously reduce the amount of polluting gasses we generate. Sea level rise combined with increased hurricane activity could soon lead to much greater destruction than we’ve seen so far. Quick and decisive action is required before it’s too late.

Here we go Again. A Recent Paper by a Group of Cosmologists raises doubts about the very Existence of Dark Energy.

We’ve all heard the old saying ‘Two steps forward, one step back’. Well, when it comes to Cosmology, the study of the Universe as a whole, it seems like we take a step forward, another sideways, close our eyes and spin, take two steps back, etc., etc., you get the idea. The Universe is so large, the measurements so difficult to make and the theories so complex that progress in cosmology has always been slow, with many wrong turns. So hang on folks, today’s post is going to be a little trip with Alice into Wonderland.

Today the best model we have for the basic nature of the Universe is that it consists of billions of Galaxies like our Milky Way; that the Universe is expanding, all those Galaxies moving away from each other; and that the expansion is not being slowed by the gravity of the Galaxies. In fact the expansion is accelerating. This basic model is outlined in the image below.

Big Bang Model (Credit: NASA)

It was Edwin Hubble, back in the 1920s and 30s, who discovered that the Universe was made of Galaxies and that it was expanding. The acceleration of the Universal expansion was discovered in the 1990s by two groups of astronomers led by Saul Perlmutter and Adam Riess.

The cause of this acceleration was completely unknown and quickly given the name ‘Dark Energy’, although cosmologists prefer the name ‘Vacuum Pressure’. Today we know almost nothing about ‘Dark Energy’ and it ranks as one of the greatest mysteries in all of science.

Now a recent paper published by Lawrence H. Dam, Asta Heinesen and David L. Wiltshire of the University of Canterbury in New Zealand may be about to throw the whole science of cosmology into a state of confusion. According to Professor Dam and his colleagues there is no such thing as Dark Energy, it simply doesn’t exist. Cosmologists only think there’s Dark Energy because they’re trying to fit their measurements to an incorrect mathematical model of the Universe.

To understand what Professors Dam, Heinesen and Wiltshire are saying we need to talk a little bit about the mathematical ideas we use to describe the Universe, and of course we start with Albert Einstein. When Einstein published his General Theory of Relativity, also known as his theory of gravity, it was quickly realized that, since it is gravity that holds the Universe together, Einstein’s theory was the best tool with which to study the Universe. The full Einstein equation for gravity is shown below; it’s the lambda (Λ) symbol that relates to Dark Energy.

Einstein’s Field Equation
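For reference, in standard notation the field equation in the caption above is written:

```latex
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

It’s the Λ (lambda) term, the cosmological constant, that corresponds to Dark Energy; the left side describes the curvature of space-time while the right side describes the matter and energy filling it.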

A trio of physicists, Alexander Friedmann, Howard Robertson and Arthur Walker, used Einstein’s theory to develop an exact set of equations for a Universe where matter is spread smoothly (homogeneous) and looks the same in every direction (isotropic). The physicist and priest Georges Lemaitre expanded the FRW model to include the expansion of the Universe, thereby creating the ‘Big Bang Theory’, although technically it is referred to as the FLRW model.
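Under those two assumptions Einstein’s equations reduce to a single equation for the expansion rate of the Universe, usually called the Friedmann equation. In one common modern form:

```latex
\left(\frac{\dot{a}}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho - \frac{k c^{2}}{a^{2}} + \frac{\Lambda c^{2}}{3}
```

Here a is the ‘scale factor’ measuring the size of the Universe, ρ is the average density of matter and energy, k describes the curvature of space and Λ is the same cosmological constant associated with Dark Energy.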

Now remember the two assumptions of the FLRW model: that the matter in the Universe is smoothly distributed with no preferred direction, i.e. it is homogeneous and isotropic. At first glance however the Universe sure doesn’t look smooth; it’s got the Galaxies, enormous clusters of stars, with a whole lot of empty space between them. However, the idea was that when you considered the whole Universe, with its tens of billions of Galaxies, everything would even out.

Except that they don’t. Another important astronomy project of the last twenty years has been the Sloan Digital Sky Survey (SDSS), an ambitious attempt to map the positions of nearly a million Galaxies, and what the Sloan team has discovered is that the Universe actually looks more like Swiss cheese or soap bubbles, with regions that are quite dense surrounding immense empty voids. The image below shows a sample of the results of the SDSS and clearly illustrates the ‘lumpiness’ of the Universe.

Results of Sloan Digital Sky Survey (Credit: SDSS)

So the basic assumptions of the FLRW model aren’t quite right, and Professors Dam, Heinesen and Wiltshire say that a new mathematical model, which they call the Timescape model, must be used instead. It’s in this mathematical model that the measurements made by Perlmutter and Riess fit without the need for anything like Dark Energy.

Now there’s a long way to go before the Timescape model is generally accepted, if it ever is. Chances are that this theory will not stand the test of close examination and Dark Energy will continue to be a mystery that needs to be solved. You never know though, every time we look further into the Universe it just seems to get stranger and stranger.

I realize that this post was rather long and heavy and dealt with some strange and difficult topics. However I hope that it wasn’t too abstract. The intersection between math and measurement is central to the advance of science, and after all, we are talking about the basic structure of the Universe as a whole!

End of the Cassini Mission to Saturn

Two days from now, on Friday the 15th of September 2017, the Cassini spacecraft will make its final orbit of Saturn. After exploring Saturn and its moons for the past 13 years Cassini is now nearly out of fuel and NASA has decided to plunge the probe into the atmosphere of the ringed planet. The reason for ending the probe’s mission in this fiery fashion is to make certain that Cassini does not crash onto one of Saturn’s moons and possibly contaminate it with micro-organisms from Earth.

The Cassini-Huygens mission began almost 20 years ago on the 15th of October 1997 when the spacecraft was launched from Kennedy Space Center aboard a Titan IV-Centaur rocket. The probe’s long seven year journey to Saturn required gravity boosting flybys from Venus (twice), Earth and Jupiter in order to gain enough energy to reach the outer Solar System. The image below shows the Cassini-Huygens spacecraft prior to its launch.

Cassini-Huygens Spacecraft (Credit: NASA)

Entering orbit around Saturn on the first of July 2004, Cassini released the Huygens landing module on the 25th of December 2004, and Huygens successfully landed on the moon Titan on January the 14th, 2005. The lander, which was built by the European Space Agency, operated for 90 minutes, sending back images and instrument measurements from the surface of the second largest moon in the Solar System. The first image below is a picture of the Huygens lander and the second is an image from the surface of Titan.

Huygens Lander (Credit: David Monniaux)
The Surface of Titan (Credit: ESA-NASA-JPL)

During its short operating life the Huygens lander found that the atmosphere of Titan is denser than Earth’s by about 45%, with a composition of 95% nitrogen and 5% methane, at a temperature of 93.8 Kelvin (-179.3 degrees Celsius). The rocks in the surface image above are actually water ice, at a temperature so cold that it is as hard as rock!

The discoveries made on Titan by Huygens were augmented by those of the Cassini orbiter, which found both large lakes of liquid methane and signs of channels and other indications of erosion caused by flowing liquid methane. Between them Cassini and Huygens portray Titan as a world very similar to our own, except that it is so cold methane has replaced water.

Since then the Cassini orbiter has continued its mission, making several major discoveries. One of the most important has been the eruptions of water ‘volcanoes’ spewing out of the moon Enceladus. The energy causing these eruptions is thought to be generated by the tidal forces of Saturn and its other moons, and is similar to the process that heats Jupiter’s moons Io and Europa. After discovering the ‘volcanoes’ the Cassini spacecraft was even flown through the plumes in an attempt to analyze their composition. The plumes are indeed water, but with a small amount of complex hydrocarbons mixed in. Are these complex molecules a sign of the beginnings of life on Enceladus? Only time, and a lander sent to the moon, will tell. The picture below is an artist’s idea of the hydro-thermal activity on Enceladus.

Hydro-Thermal Activity on Enceladus (Credit: NASA-JPL-Caltech)

As far as I’m concerned however the best part of any mission to Saturn is simply the images of the planet and its system of rings. So without further ado I’ll just add a few below.

Saturn by Cassini (Credit: NASA-JPL)
Saturn’s North Pole (Credit: NASA-JPL)
Saturn’s Rings (Credit: NASA-JPL)

NASA has actually set up live internet coverage of Cassini’s final moments, although we can’t be quite certain exactly when that will happen. The show will start at 7AM Eastern Daylight Time on Friday the 15th and if you’d like to tune in, I will be, click on the link below to be taken to NASA’s Cassini End of Mission Activities website.

5.7 Million Year Old Footprints Discovered on Crete. Did a Bipedal Ape inhabit Europe Millions of Years Earlier than Previously Thought?

Not long ago (My Post of June 10th 2017) I complained that important finds of human and hominid fossils are too often reported in the press as ‘Shocking new Discoveries that will rewrite Human Pre-History’. Well I may have to eat my words this time because the recent unearthing at Trachilos on the island of Crete of 5.7 million year old fossil footprints could indeed rewrite human pre-history.

The footprints of human beings and of the human-like, upright walking apes called hominids are different from those of any other kind of creature. First of all, like all of the primates we have no claws, and our inner toes are substantially larger than the others, hence the ‘big toe’. Unlike our cousins the apes, however, our big toe does not stick out at a right angle the way our thumb does. These characteristics combine to make hominid footprints truly distinctive.

For the past 40 years the earliest known fossil hominid footprints were those discovered by Mary Leakey at Laetoli in Tanzania, which were dated to 3.66 million years ago. These footprints are thought to have been made by a member of the species Australopithecus afarensis, the same species as the famous fossil Lucy. The images below show the fossils of Lucy and a reconstruction of what she may have looked like.

(Skeleton of Lucy: Credit Getty Images)
Reconstruction of A. afarensis (Credit: Cleveland Museum of Natural History)

The footprints at Laetoli, along with fossil remains like Lucy, are some of the key evidence for the ‘East African Cradle’ model of human evolution. The basic idea is that about 4.5 million years ago our ancestors moved from the jungle onto the East African grasslands. Adapting to their new environment by 3.5 million years ago our ancestors had become fully bi-pedal like Lucy and the makers of the Laetoli footprints. All subsequent hominid species, including us, are descended from those early walkers.

The footprints discovered on Crete could require a significant extension of, if not an almost complete rewrite of, that theory. Not only are they two million years older than the prints at Laetoli but they are on a different continent!

The Trachilos footprints were discovered and have been studied by a group of paleo-anthropologists led by Matthew Robert Bennett of Bournemouth University in the UK and Per Ahlberg of Uppsala University in Sweden. The footprints, several of which are shown in the image below, have been dated very precisely by the presence of fossil shells of marine micro-organisms called foraminifera. The shells of these tiny single-celled creatures evolved very quickly, making foraminifera very useful fossils for dating the sediments in which they’re found.

Crete Footprints (Credit: Matthew R. Bennett)

Doctors Bennett and Ahlberg point out that at the time the footprints were made the sea level in the Mediterranean was much lower. Back then Crete was not an island but rather a part of the Greek mainland. In fact the size and depth of the Mediterranean Sea has varied greatly over the past 10 million years and it is quite possible that groups of early or even pre-hominid apes may have wandered around the eastern Mediterranean basin with one of them making the footprints on Crete.

Of course it is also possible that we have simply misidentified the footprints. Precise identification of any fossil is a hard thing to do and trace fossils, such as footprints or burrows, can be the hardest of all.

One thing is certain: if Greece or Sicily or the east coast of the Mediterranean was inhabited by groups of early hominids then there are more fossils out there to be found, more evidence that could lead to a more complete picture of human evolution, perhaps an ‘East Africa and Eastern Mediterranean Cradle’ model. If you’d like to read more about the footprints discovered at Trachilos, Crete, click on the link below.



Voyager. The Longest Journey

Today is the 40th anniversary of the launch of the Voyager 1 spacecraft. On September the fifth of 1977 a Titan IIIE-Centaur rocket took off from Kennedy Space Center carrying a 773 kilo spacecraft that has completely changed the way we see our solar system and even now is exploring the space between the stars themselves. How many things do you know of that are still working after 40 years? The picture below shows Voyager and the various parts of the spacecraft.

Voyager Spacecraft (Credit: NASA)

The original concept for the Voyager missions was a ‘Grand Tour’ of the outer solar system, with flybys of the four planets Jupiter, Saturn, Uranus and Neptune. Indeed, the Voyager 2 spacecraft did succeed in visiting them all, giving us our first close up views of Uranus and Neptune.

The discoveries made by these two robot explorers are too numerous to mention; I can only list a few: Jupiter’s rings, the volcanoes on Io, the ice covering of Europa, the shepherd moons in Saturn’s rings, the dense hydrocarbon atmosphere of Titan, Uranus’s broken moon Miranda and the Great Dark Spot on Neptune. Before Voyager all of these places were at best hazy smudges in a telescope; it was Voyager that turned them into worlds for us. The mosaic picture below shows some of the images taken by the Voyager spacecraft.

Mosaic of Voyager Images (Credit: Don Davis, NASA)

Because their mission was planned to take them so far from the Sun the Voyagers could not be powered by solar cells as most spacecraft are. Instead, each of the two probes carries three Radioisotope Thermoelectric Generators (RTGs). An RTG is basically a rod of radioactive material surrounded by thermocouples that convert its heat into electricity. The three generators produced a combined 470 watts of power at launch and even today, after 40 years, they are still generating about half that amount. That is still enough power to enable the Voyager probes to remain in contact with Earth, although most of each probe’s instruments, such as the cameras, have been turned off to conserve power. Only the magnetometer and the low and high energy particle detectors continue to operate, continuing to give us information about the space through which the Voyagers still journey.
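That ‘about half’ figure can be sanity checked. The sketch below considers only the radioactive decay of the plutonium-238 fuel, assuming its commonly quoted half-life of about 87.7 years; the remaining shortfall in real life comes mostly from the gradual degradation of the thermocouples themselves, which this sketch ignores.

```python
# How much of the launch power survives 40 years of Pu-238 decay alone?
POWER_AT_LAUNCH_W = 470.0   # combined output of the three RTGs at launch
HALF_LIFE_YEARS = 87.7      # approximate half-life of plutonium-238
years_elapsed = 40.0

decay_fraction = 0.5 ** (years_elapsed / HALF_LIFE_YEARS)
power_from_decay = POWER_AT_LAUNCH_W * decay_fraction
print(f"decay alone would leave about {power_from_decay:.0f} W of the original 470 W")
```

Decay alone would leave roughly 340 watts; the fact that the probes actually produce closer to half the launch power shows how much the aging thermocouples matter.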

Today Voyager 1 has entered interstellar space, the first object made by mankind to do so. When the Voyagers were launched 40 years ago no one had any idea what the edge of the solar system would be like, let alone where it might be. It was Voyager 1 that showed how the solar wind, pushing out from the Sun, is finally brought to a halt at a boundary known as the heliopause. Beyond that boundary the low energy particles from the Sun disappear while the magnetic field shifts to that of the Milky Way.

Voyager 1 entered interstellar space in August of 2012 and Voyager 2 will soon join it. NASA estimates that the power sources on the spacecraft will allow them to remain in contact with Earth until sometime around 2030.

When that contact is lost the long mission of these explorers will finally be over, but only for human beings! You see the scientists and engineers who built Voyager knew that their creation could travel between the stars for thousands if not millions of years and there was the remote but still exciting possibility that one of the Voyagers might someday be found by non-human intelligences.

So the people who built Voyager included a greeting to any aliens that might find it. A golden record was stored away on each Voyager spacecraft. This record contains some of the sounds of Earth, music and greetings, along with images of life on Earth. The cover protecting the record has instructions for playing it and even a stylus to be used in the playback. It is possible that the messages carried on the Voyagers may one day be the only record of our ever having existed!

In 40,000 years Voyager 1 will pass about 1.6 light years from the star Gliese 445 while at the same time Voyager 2 will pass about 1.7 light years away from the star Ross 248 (both are red dwarf type stars). Even then the Voyagers will continue on and where their journey will end no one can say.

As of this morning Voyager 1 was 20,884,724,316 kilometers from the Sun and getting 16.995 kilometers farther away every second. Voyager 2 was 17,178,385,861 kilometers from the Sun and moving at a velocity of 15.374 kilometers per second.
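To put those distances in perspective, here’s a quick conversion of the Voyager 1 figure into astronomical units and light-hours. The conversion constants are standard values, not from the post:

```python
# Convert Voyager 1's distance into more intuitive units.
KM_PER_AU = 1.495978707e8        # one astronomical unit (Earth-Sun distance) in km
SPEED_OF_LIGHT_KM_S = 299792.458

v1_distance_km = 20_884_724_316  # figure quoted above

distance_au = v1_distance_km / KM_PER_AU
light_hours = v1_distance_km / SPEED_OF_LIGHT_KM_S / 3600
print(f"Voyager 1: about {distance_au:.0f} AU, or {light_hours:.1f} light-hours, from the Sun")
```

That works out to roughly 140 times the Earth-Sun distance; a radio signal from Voyager 1 takes over 19 hours to reach us.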

If you’d like to know more about the Voyager spacecraft NASA has two websites. The first deals with the entire Voyager mission while the second is for ‘Voyager the Interstellar Mission’. Click on the links below to be taken to those sites.

Nuclear Fusion: Has MIT Found the right Recipe?

For over half a century now Hydrogen Fusion has been the Holy Grail of energy production for the human race. Fusion is the energy source that powers the stars themselves and the potential for Fusion power plants to provide cheap, inexhaustible, pollution free energy was never in doubt. The question has always been whether the extreme conditions necessary for Fusion to occur could be controlled and maintained, whether a reliable Fusion reactor was possible or, like the Holy Grail, just a dream.

Let me take a moment to provide a little background. The chemical elements we’re all familiar with from high school run from simple ones like Hydrogen, just a proton and an electron, to extremely complex ones such as Uranium, with 92 protons and 92 electrons along with 146 neutrons.

Now it turns out that you can release energy either by splitting big atoms like Uranium, which is called Fission, or by fusing small ones together, as when 4 Hydrogen atoms combine to form one Helium atom. The pictures below show the two different types of reactions.

Fission of a Uranium Nucleus (Public Domain)
Fusion of Hydrogen into Helium (Public Domain)
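The energy released by the Hydrogen-to-Helium reaction comes from the small difference in mass between four Hydrogen atoms and one Helium atom. A back-of-the-envelope check, using standard atomic masses:

```python
# Mass defect of fusing four hydrogen atoms into one helium-4 atom.
M_HYDROGEN_U = 1.007825   # atomic mass of hydrogen-1, in unified mass units
M_HELIUM4_U = 4.002602    # atomic mass of helium-4
MEV_PER_U = 931.494       # energy equivalent of one mass unit, in MeV

mass_defect_u = 4 * M_HYDROGEN_U - M_HELIUM4_U
energy_mev = mass_defect_u * MEV_PER_U
print(f"about {energy_mev:.1f} MeV released per helium nucleus formed")
```

About 0.7% of the original mass, some 26.7 MeV per helium nucleus, is converted directly into energy, which is why Fusion packs so much more punch per kilo of fuel than chemical burning.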

We all know that Atomic Fission reactors have been producing energy for over 50 years, but they’re dangerous; even after you’ve shut them down they remain hot, and if the reaction gets out of control a tremendous amount of harmful radiation can be released, as at Chernobyl or Fukushima. Another problem with Fission reactors is that the leftover fuel rods are also highly radioactive, and storing them safely is a very difficult problem.

Fusion reactors on the other hand would have a number of clear advantages. First of all, Hydrogen Fusion simply produces more energy per kilo of fuel. More important, however, is the fact that Fusion would produce zero dangerous waste. Also, the conditions needed to produce Fusion are difficult to initiate and maintain, so difficult in fact that if anything were to go wrong the reaction would simply stop instantly, with no chance of a meltdown or release of radiation.

So if Fusion is such a better form of energy production why aren’t we building Fusion reactors by the hundreds in order to satisfy our need for pollution free energy? Well, as I said, the conditions needed for Fusion are difficult to initiate and maintain, so difficult in fact that the world’s best scientists have been unable to maintain a Fusion reaction for more than a fraction of a second.

A recent advance may help to change that however. Scientists at MIT’s Plasma Science and Fusion Center have been experimenting with a new recipe for the fuel in their Alcator C-Mod Tokamak reactor. A Tokamak is a doughnut shaped vacuum chamber that uses intense magnetic fields to confine a plasma, a gas of atoms that have been stripped of one or more electrons. Producing and heating a plasma to extremely high temperatures and pressures is how you initiate a Fusion reaction. The picture below shows the MIT Tokamak.

Inside MIT’s Tokamak (Credit: Bob Mumgaard-Plasma Science and Fusion Center)

For the last several decades the fuel recipe that researchers have used has consisted of about 5% ordinary Hydrogen and 95% Deuterium, Hydrogen with a neutron attached to the proton. Microwaves then heat the ordinary Hydrogen and Fusion occurs as the superheated Hydrogen slams into the Deuterium. As I said earlier no one has succeeded in keeping this reaction going for more than a second.

Now the team at MIT has added a trace, less than 1%, of Helium-3 to the mixture (Helium-3 is an atom of Helium lacking a neutron). When the Helium-3 was heated by microwaves the researchers were able to accelerate its ions to energy levels ten times greater than previously seen!

The results obtained by the MIT team were so exciting that they quickly shared their results with colleagues at the Joint European Torus (JET in Culham UK), which is the world’s largest experimental Fusion reactor. The JET team soon confirmed MIT’s results and so now both teams are fine-tuning the recipe in order to get the highest energy levels possible.

Whether or not this breakthrough will soon lead to practical Nuclear Fusion only time will tell. It is possible however that before too long humanity may possess an almost limitless supply of pollution free energy.

If you’d like to read more about the research at MIT click on the link below to be taken to the Plasma Science and Fusion Center webpage.


Concussions in Sports, the Danger Everyone is Trying to Ignore.

Football season is upon us once again and there are already stories during the sports segments on the news about players suffering injuries. Despite the players wearing the best protective gear sports science can provide, Football is just such a highly physical, even violent sport that it is rare for a player to go an entire season without missing some action because of an injury.

(Note: When I refer to Football in this post I am speaking about American Football, the one where the ball is hardly ever touched by a foot. The game the rest of the world calls Football I will refer to as Futball.)

Now Football certainly isn’t alone in posing health risks to its athletes. Hockey, Rugby, even Baseball and Futball all have their share of injuries. However it is Football that has become notorious for one kind of injury: concussions, repeated head injuries whose long-term health effects are severely impacting the lives of former players. We’re not talking about feeling woozy after a hard hit or ‘seeing stars’; this is major damage to the brain caused by multiple head injuries.

The condition has been given the name Chronic Traumatic Encephalopathy (CTE), and the symptoms of this disease generally don’t begin to appear until 8-10 years after the repeated injuries that trigger it. The first signs of CTE are actually similar to the initial effects of a concussion: dizziness, disorientation and headaches. As the disease progresses new symptoms begin to develop that can include memory loss, poor judgment and erratic, sometimes violent behavior. In its final stages CTE can cause dementia, speech difficulty, tremors and thoughts of suicide.

One of the biggest problems in treating CTE is that at present a diagnosis of the disease cannot be confirmed without a physical examination of the brain, something that cannot take place until after the patient is dead! Techniques are being developed using Magnetic Resonance Imaging (MRI) and Diffusion Tensor Imaging (DTI) but more research needs to be conducted before these techniques can be considered reliable diagnostic tools. The photo below shows a healthy brain and the brain of someone who died of CTE.

Normal Brain and Brain with CTE comparison (Credit: Boston University Center for the Study of Traumatic Encephalopathy)

A recent study of deceased NFL players whose relatives allowed autopsies to be performed revealed that 110 out of 111 of the subjects had CTE. Now the researchers who conducted the study caution that the deceased players had all shown symptoms of CTE while still alive so the high percentage of confirmed cases made sense. However it is still a horrific glimpse into the extent of the disease.

However the biggest difficulty in dealing with CTE is simply money. Pro-Football and the other sports with high rates of player concussions are big business and nobody wants to see the hard hits in Football eliminated, not the owners, not the fans and not even the players themselves.

Think about it: if you asked a group of 20-year-old boys whether they'd be willing to risk their health in order to play a game they love for millions of dollars a year, oh, and you get to be famous and admired as well, how many do you think would say, 'Nah, I'd rather be an accountant'? That is a part of the paradox here. CTE is a voluntary disease; you choose to risk getting it, just like drug addiction or lung cancer from smoking.

The NFL has agreed to establish a fund of $765 million to help with the medical costs of retired players with CTE. However, this agreement came only after several years of legal fights in court. Nevertheless, simply paying the medical bills of people who get sick by working for you is not a solution.

Technology is not going to make CTE go away either; the protective gear worn by Football players is already the best in any pro sport, but it obviously isn't nearly enough. I don't know what the final solution will be, but this present situation cannot continue.

And before I go I want to state once again that this disease is not confined to pro-Football; many cases have been diagnosed in people who only played college or even high school Football. Nor is it confined to Football; cases have been found in every contact sport. If you'd like to read more about CTE click on the link below to be taken to the 'Concussion Foundation's' webpage.

The Great American Eclipse of 2017, that’s a Big One off of my Bucket List.

Did you see it? Did you get to see the eclipse? The place I chose to travel to in order to see the eclipse was Sweetwater Tennessee and boy did I pick the right spot. Two minutes and thirty-seven seconds of totality in an absolutely cloudless sky. Those are two minutes and thirty-seven seconds that I will never forget.

Now Sweetwater is a pretty little town just about thirty miles south of Knoxville, Tennessee with a population of 5,764. The townspeople knew this was going to be a big event for them and they made sure that they were ready. The main street of the town had been blocked off for the eclipse and a small park across the street was laid out with food vendors, people selling souvenirs and artists with their goods.

Every available parking spot had been opened up for visitors at reasonable rates, I paid $20, and the money that was collected for parking mostly went to local charities. The pictures below show the main street and park before the crowd really started coming.

Sweetwater, Tenn. Main Street (Credit: R.A.Lawler)
Sweetwater, Tenn. Park Area (Credit: R.A.Lawler)

I wanted to be certain to arrive early so I got to Sweetwater at 8AM; that's when the pictures were taken. Finding myself a nice spot in the shade of a cafe to wait for the show to start, I quickly made friends with a father and son, both named Glenn, from Houston and Baton Rouge, who had actually arrived in Sweetwater at 2AM. They really wanted to be sure to get a good spot! I also met people from Pittsburgh, Detroit and New York along with several from nearby Knoxville and Chattanooga. The town hasn't yet published any estimate of the number of visitors, if they do I'll add it later, but I'd say that at least 15,000 people came.

In the early morning there were no clouds of any kind, so with the bright August Sun the day quickly became fairly hot. Soon anyone who wasn't actually buying something was staying in the shade, where a nice breeze made it fairly comfortable. A few clouds started rolling in about noon, and by around 1PM, as the partial eclipse was starting, you could hear a few people whisper, 'I hope it doesn't get any worse'. Well, it got better: by 2:30 and the start of totality there was an absolutely clear sky. Perfect viewing for something I've wanted to see my whole life.

I did take my solar telescope and managed to get some decent videos of the partial eclipse. The videos are all too large to embed so I'll just have to add a single frame image from one video. (If you look closely on the Sun's left side you can see a small Sunspot.)

Partial Eclipse through Solar Telescope (Credit: R.A.Lawler)

Once totality started, however, I didn't want to waste time fiddling with the solar telescope and just took a few images with the same camera I used to take the pictures of Sweetwater. The best image is below.

Total Eclipse of the Sun (Credit: R.A.Lawler)

As I said, Sweetwater got two minutes and thirty-seven seconds of totality, that’s just five seconds less than the maximum time for the eclipse anywhere in the US. That was enough time for me to find the four planets, Mercury, Venus, Mars and Jupiter that became visible as the Sunlight was blocked by the Moon. Think of it, seeing four planets arching across the sky at 2:30 in the afternoon!

I have no doubt you can find better images of the eclipse very easily on the internet, I’ve never been much of a photographer and surely millions of people were taking pictures. These are mine however, and mean more to me than I can say. Yes I spent four days traveling to Tennessee and back and yes the traffic jam after the eclipse was the worst I have ever experienced. Nevertheless, I will always remember the town of Sweetwater because it was there I saw my first total eclipse of the Sun.



Science and Science Fiction Celebrates its First Birthday / My Patent

This week 'Science and Science Fiction' reaches the one-year milestone and I'd like to begin this anniversary post by thanking everyone who has visited my blog over the past year, but especially you, my regular readers. I seriously could not have imagined a year ago that this blog would be averaging over 300 visitors a day, that over 4,000 people would become registered blog subscribers, or that over 400 of you would leave some very flattering comments. I can only once again give you my thanks.

To celebrate Science and Science Fiction’s first birthday I hope you don’t mind if I’m a little egotistical and use this post to talk about some of my own research, my small contribution to progress. I’m going to talk about my patent.

It's called a 'Graded Resistance Solid State Current Control Circuit' and it's registered at the US Patent Office (Reg. US Pat Off: when I was a kid I always wondered what that meant) as US 2012/0243137.

The invention is a circuit that is part of a design for a new ultra-fast electrical circuit breaker system. Now circuit breakers are needed in any electrical power system because, if there's ever a problem causing a short circuit, the circuit breaker will open, eliminating the huge current rush that you get with a short circuit. The picture below shows the breadboard model of the entire circuit breaker. My patent is on the three green circuit boards clustered around the central metal can.

Breadboard Model of Graded Resistance Solid State Current Control Circuit (Credit: R.A.Lawler)

But there's a problem: large amounts of current don't want to just stop; electrons have inertia just like any other kind of matter. In fact, trying to stop a large current quickly will generate a large voltage. The larger the current and the faster it's eliminated, the higher the voltage that is generated. This phenomenon is known as an inductive voltage spike.

The voltage that is generated when a circuit breaker opens can easily be thousands of volts causing huge electrical arcs that can be both dangerous and damaging. I have seen circuit breakers destroyed by doing their job but fortunately I’ve never seen a person injured.
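The size of that spike comes from the basic relation V = L x di/dt: the inductance of the circuit times how fast the current changes. A quick back-of-the-envelope calculation shows why abrupt interruption is so violent. The inductance, current and interruption times below are illustrative numbers I've chosen for the sketch, not values from the patent or the Navy system.

```python
# Inductive voltage spike: V = L * di/dt
# All values here are illustrative assumptions, not figures from the patent.
L_line = 1e-3        # circuit inductance in henries (1 mH, assumed)
current = 1000.0     # fault current in amperes (assumed)

# An abrupt mechanical break forces the current to zero in ~100 microseconds.
dt_fast = 100e-6
v_spike = L_line * current / dt_fast
print(f"Abrupt interruption spike: {v_spike:.0f} V")      # roughly 10 kV

# A controlled ramp-down over 10 milliseconds is 100 times gentler.
dt_slow = 10e-3
v_controlled = L_line * current / dt_slow
print(f"Controlled ramp-down spike: {v_controlled:.0f} V")
```

Even with these modest assumed values, the abrupt break produces a spike a hundred times larger than the slow ramp, which is exactly the arc-producing voltage a breaker has to survive.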

My idea was simple enough: with modern high-power solid-state switches, instead of just breaking the circuit as old-fashioned mechanical circuit breakers do, we can insert a small resistance, first to control the rapid increase in the current, then add a little more resistance to control the current itself. Finally we insert enough resistance to eliminate the current entirely, hence a 'Graded Resistance Solid State Current Control Circuit'.
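The graded idea can be sketched numerically. Each time a resistance step is switched in, the voltage across the breaker jumps to I x R, and the current then decays exponentially with time constant L/R before the next, larger step is inserted. The inductance, dwell time and resistor values below are hypothetical numbers I picked to illustrate the principle; they are not the values used in the actual design.

```python
import math

# Hypothetical values chosen only to illustrate the graded-resistance idea.
L_line = 1e-3                  # circuit inductance (H), assumed
i = 1000.0                     # fault current at interruption (A), assumed
step_time = 2e-3               # dwell time at each resistance step (s), assumed
steps = [0.5, 2.0, 8.0, 32.0]  # increasingly larger resistances (ohms), assumed

r_total = 0.0
for r in steps:
    r_total += r
    v_peak = i * r_total       # voltage spike when this step is inserted
    print(f"R = {r_total:5.1f} ohm, peak {v_peak:8.1f} V, current {i:8.1f} A")
    # The current decays exponentially with tau = L / R_total during the dwell.
    i *= math.exp(-r_total * step_time / L_line)
```

Because each step is inserted only after the current has already decayed, every individual voltage spike stays bounded, in the spirit of line 1702 in the test figure, instead of one enormous spike from breaking the full fault current at once.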

The figure below shows a simplified schematic of the circuit and how it fits into the entire circuit breaker. The object at the top is an ultra-fast mechanical contact that is opened by the same mechanism that is being used in the US Navy's new rail gun (it's also the central can in the picture above). My patent is the parts in the figure numbered 1603 to 1610. The even numbers, 1604, 1606, 1608 and 1610, are the high power transistors while the odd numbers, 1603, 1605, 1607 and 1609, are the increasingly larger valued resistors.

Graded Resistance Solid State Current Control Circuit schematic (Credit: R.A.Lawler)

The breadboard model of the entire system has been tested with results similar to those shown in the figure below, which shows the voltage spikes generated as each increase in resistance is inserted. The slanted line 1702 is the maximum voltage that the system can withstand as a function of time. It is easy to see that the voltage remains well below that line, well within safe limits.

Measured Test Data (Credit R.A.Lawler)

So that’s my little invention. The design is being finalized for production and research into improvements is still ongoing. Every engineer wants to invent something completely new and different during their career, and to have it recognized as such. You’ve just had a brief look at mine!

Now tomorrow I’ll be heading down to Tennessee for the total eclipse. I hope the weather’s good so I can grab some images to show you!