Friday, March 9, 2018

Goodstein Sequences and Hereditary Base Notation

In mathematics, Goodstein sequences are certain sequences of natural numbers. Though they are fairly easy to define, their properties have important consequences in logic. Before investigating these, however, we give the definition. It depends on the concept of expressing numbers in different bases (well-known examples in addition to normal base-10 representations include binary, base 2, and hexadecimal, base 16). Recall that when writing a number, such as 4291, what we mean is 4 thousands plus 2 hundreds plus 9 tens, plus 1 one, alternatively expressed as

4291 = 4*10^3 + 2*10^2 + 9*10^1 + 1.

This decomposition uses 10 as a base. Note that the numbers multiplying the powers of 10 always vary between 0 and 9. Base 2, for example, could be used just as easily, with only digits 0 and 1 as coefficients. Expressing 4291 as powers of 2 yields

4291 = 1*4096 + 0*2048 + 0*1024 + 0*512 + 0*256 + 1*128 + 1*64 + 0*32 + 0*16 + 0*8 + 0*4 + 1*2 + 1*1
= 1*2^12 + 0*2^11 + 0*2^10 + 0*2^9 + 0*2^8 + 1*2^7 + 1*2^6 + 0*2^5 + 0*2^4 + 0*2^3 + 0*2^2 + 1*2^1 + 1.

Therefore, 4291 is typically expressed in binary as the sequence of coefficients 1000011000011. However, for our purposes, it is more convenient to explicitly express the powers of the base involved, although it will simplify matters to drop those terms with coefficient 0 since they have no contribution to the sum. The equation above then becomes

4291 = 1*2^12 + 1*2^7 + 1*2^6 + 1*2^1 + 1.
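This decomposition can be computed mechanically by repeated division by the base. The following Python sketch (the function name base_terms is my own, chosen for this post) lists the nonzero coefficient-exponent pairs of a number in any base:

```python
def base_terms(n, b):
    """Nonzero terms (coefficient, exponent) of n written in base b,
    highest power first."""
    terms = []
    e = 0
    while n > 0:
        n, d = divmod(n, b)   # d is the digit at the current power of b
        if d:
            terms.append((d, e))
        e += 1
    return terms[::-1]

# base_terms(4291, 2) -> [(1, 12), (1, 7), (1, 6), (1, 1), (1, 0)]
# base_terms(4291, 10) -> [(4, 3), (2, 2), (9, 1), (1, 0)]
```

Each returned pair (c, e) corresponds to a term c*b^e in the sum, with the zero-coefficient terms already dropped.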

The system described above is known as ordinary base notation, but the definition of Goodstein sequences requires a slightly modified version, hereditary base notation. This involves taking the exponents themselves and subjecting them to the same base decomposition as the original number. Since 12 = 1*2^3 + 1*2^2, 7 = 1*2^2 + 1*2^1 + 1, and 6 = 1*2^2 + 1*2^1, the integer 4291 now becomes

4291 = 1*2^(1*2^3 + 1*2^2) + 1*2^(1*2^2 + 1*2^1 + 1) + 1*2^(1*2^2 + 1*2^1) + 1*2^1 + 1.

This expression is quite complicated, but the process is not quite finished yet! The exponents 2 and 3 within the exponents are not yet in base-2: 3 = 1*2^1 + 1 and 2 = 1*2^1. Making the necessary replacements finally gives 4291 in hereditary base-2 notation:

4291 = 1*2^(1*2^(1*2^1 + 1) + 1*2^(1*2^1)) + 1*2^(1*2^(1*2^1) + 1*2^1 + 1) + 1*2^(1*2^(1*2^1) + 1*2^1) + 1*2^1 + 1.

In the general case, there may be many iterations of this process, which motivates the name "hereditary"; a base-2 decomposition is applied to the original integer and then the exponents that result, and then their exponents, and so on. The end result has only 2's as bases of exponents and only 1's as coefficients. The interested reader can verify that this type of process may be repeated for any positive integer in any base (using as coefficients positive integers less than the base), and that for a fixed number and base, the representation thus obtained is unique. The stage is now set for the definition of Goodstein sequence.
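The iterated rewriting described above is tedious by hand but easy to automate. Here is a short Python sketch (the function name hereditary is my own) that builds the hereditary base-b representation of a number as a string in the same style used in this post:

```python
def hereditary(n, b):
    """String form of n > 0 in hereditary base-b notation.
    Exponents greater than 1 are recursively rewritten in base b."""
    terms = []
    e = 0
    while n > 0:
        n, d = divmod(n, b)
        if d:
            if e == 0:
                terms.append(str(d))
            elif e == 1:
                terms.append(f"{d}*{b}^1")
            else:
                # Recursively express the exponent itself in base b
                terms.append(f"{d}*{b}^({hereditary(e, b)})")
        e += 1
    return " + ".join(reversed(terms))

# hereditary(4, 2)  -> "1*2^(1*2^1)"
# hereditary(18, 2) -> "1*2^(1*2^(1*2^1)) + 1*2^1"
```

Running it on 4291 reproduces the full hereditary base-2 expression derived above.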

A Goodstein sequence is simply a sequence of nonnegative integers. We may choose any number 1, 2, 3,... to begin the sequence. Next, whatever this number is, we express it in hereditary base-2 notation, just as we did with the example 4291 above. To generate the next member of the sequence, simply change every 2 in the hereditary base-2 representation to a 3, and then subtract 1 from the resulting number. This is the second member of the sequence. After that, express this second number in hereditary base-3 notation, change the 3's to 4's, and subtract one to get the third, and so on. We denote the nth member of the Goodstein sequence beginning with m by Gm(n). The first few sequences Gm die out quickly: if the seed is 1 (whose hereditary base-2 representation is just 1), there are no 2's to change to 3's, so we simply subtract 1 to find G1(2) = 0. If a sequence reaches 0, we end it there, so that the sequence

G1 = {1,0}.

G2 is scarcely more interesting: G2(1) = 2 = 1*2^1, so changing the single 2 to a 3 and subtracting 1 yields G2(2) = 1*3^1 - 1 = 2. Recall that coefficients 0-2 are allowed in hereditary base-3 notation, so 2 in this notation is simply 2. There are no 3's to change to 4's, so we subtract 1 to get G2(3) = 1. There are no 4's to change to 5's, so G2(4) = 0 and the sequence is finished:

G2 = {2,2,1,0}.

Beginning with 3 leads to a nearly identical sequence: the reader may try calculating it. The end result is G3 = {3,3,3,2,1,0}. However, at m = 4, new behavior emerges. 4 = 1*2^(1*2^1), so both 2's must be replaced by 3's to get G4(2) = 1*3^(1*3^1) - 1 = 27 - 1 = 26. In hereditary base-3, 26 = 2*3^2 + 2*3^1 + 2, so G4(3) = 2*4^2 + 2*4^1 + 2 - 1 = 41. For the next step, we get G4(4) = 2*5^2 + 2*5^1 + 1 - 1 = 60. Note that the units digit is reduced by one in each step even as the sequence increases. When it hits zero, as in this step, the coefficient of the next-higher power is decreased by one: G4(5) = 2*6^2 + 2*6^1 - 1 = 83 = 2*6^2 + 1*6^1 + 5. However, the new units digit becomes one less than the base, namely 5, so it takes more steps for this to reach zero than previously. After another five steps, we arrive at G4(10) = 2*11^2 + 1*11 = 253. Changing to base 12 at the next step, we obtain G4(11) = 2*12^2 + 11 = 299. The units digit again decreases for the next 11 steps, until G4(22) = 2*23^2 = 1058.
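The base-bumping step used in these calculations lends itself to a recursive implementation: rewrite n in base b, bump the base in every exponent recursively, and evaluate with base b+1. A short Python sketch (the function names bump_base and goodstein are my own) that reproduces the values computed above:

```python
def bump_base(n, b):
    """Value of n's hereditary base-b representation after every b
    is replaced by b+1 (including recursively in the exponents)."""
    if n == 0:
        return 0
    total, e = 0, 0
    while n > 0:
        n, d = divmod(n, b)
        if d:
            total += d * (b + 1) ** bump_base(e, b)
        e += 1
    return total

def goodstein(m, max_terms):
    """First terms of the Goodstein sequence starting at m,
    stopping early if the sequence reaches 0."""
    seq, base = [m], 2
    while seq[-1] > 0 and len(seq) < max_terms:
        seq.append(bump_base(seq[-1], base) - 1)
        base += 1
    return seq

# goodstein(3, 10) -> [3, 3, 3, 2, 1, 0]
# goodstein(4, 5)  -> [4, 26, 41, 60, 83]
```

Be warned that for seeds of 4 and above the terms quickly become too large to compute this way, exactly as the discussion below describes.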

The next step starts to indicate why Goodstein sequences can increase for so long: G4(23) = 2*24^2 - 1 = 1151 = 1*24^2 + 23*24^1 + 23. Since the base is 24, we get two new coefficients of 23. Each time the units digit reaches zero, the value at which it must restart roughly doubles. The square term in the base representation does not vanish until the base reaches 402653184. And at this point the sequence has barely begun. The largest value it reaches is 3*2^402653210 - 1 at base 3*2^402653209, after which the sequence remains stable for a while before finally declining to zero. This maximum value is so astronomically large that if the digits of the number were printed at a typical font size, front and back, they would fill a stack of paper over 10 feet tall! And this is just G4. Goodstein sequences with higher initial values increase much, much faster.

If we start with 18, for instance, since 18 = 1*2^(1*2^(1*2^1)) + 1*2^1, replacing all the 2's with 3's gives G18(2) = 1*3^(1*3^(1*3^1)) + 1*3^1 - 1 = 7625597484989. The third term is G18(3) = 1*4^(1*4^(1*4^1)) + 2 - 1 ≈ 10^154. The values this sequence reaches quickly become difficult even to write down. However, Reuben Goodstein himself, after whom the sequences are named, proved in 1944 a statement that became known as Goodstein's Theorem. His remarkable result showed that no matter how incalculably large the sequences become, they always terminate at 0. That is, after some finite, though possibly immense, number of steps, each sequence stops increasing and decreases to 0.

The theorem's proof has significance beyond demonstrating this surprising fact about Goodstein sequences. For more, see the next post (coming March 26).


Monday, February 12, 2018

Black Holes and Information

Black holes, with their extreme gravity and ability to profoundly warp space and time, are some of the most interesting objects in the universe. However, in at least one precisely defined way, they are also the least interesting.

According to general relativity, black holes are nearly featureless. Specifically, there is a result known as the "no-hair theorem" that states that stationary black holes have exactly three features that are externally observable: their mass, their electric charge, and their angular momentum (direction and magnitude of spin). There are no other attributes that distinguish them (these additional properties would be the "hair"). It follows that if two black holes are exactly identical in mass, charge, and angular momentum, there is no way, even in principle, to tell them apart from the outside.

This in and of itself is not a problem. As usual, problems arise when the principles of quantum mechanics are brought to bear in circumstances where both gravity and quantum phenomena play a large role. At the heart of the formalism of quantum mechanics is the Schrödinger equation, which governs the time-evolution of a system (at least between measurements). Fundamentally, the evolution may be computed both forwards and backwards in time. Therefore, at least the mathematical principles of quantum mechanics hold that information about a physical system cannot be "lost", that is, we may always deduce what happened in the past from the present. This argument does not take the measurement process into account, but it is believed that these processes do not destroy information either. Black holes provide some problems for this paradigm.

It may seem that information is lost all the time. If a book is burned, for example, everything that was written on its pages is beyond our ability to reconstruct. In principle, however, some omniscient being could look at the state of every particle of the burnt book and surrounding system and deduce how they must have been arranged. As a result, the omniscient being could say what was written in the book. At first, the situation appears no different for black holes. If a book falls into a black hole, outside observers cannot recover the text on its pages, but this poses no problem for our omniscient being: complete knowledge of the state of all particles in the universe of course includes those in the interiors of black holes as well as outside them. The book may be beyond our reach, but its information is still conserved in the black hole interior.

The real problem became evident in 1974, when physicist Stephen Hawking argued for the existence of what is now known as Hawking radiation. This quantum mechanism allows black holes to shed mass over time, requiring a modification to the conventional wisdom that nothing ever escapes black holes.

The principles of quantum mechanics dictate that the "vacuum" of space is not truly empty. Transient so-called "virtual" particles may spring in and out of existence. Pairs of such particles may emerge from the vacuum (a pair with opposite charges, etc. is required to preserve conservation laws) for a very short time; due to the uncertainty principle of quantum mechanics, short-lived fluctuations in energy that would result from the creation of particles do not violate energy conservation. In the presence of very strong gravitational fields, such as those around a black hole, the resulting pairs of particles sometimes do not come back together and annihilate each other (as the closed virtual pairs above do). Instead, the pairs "break" and become real particles, taking with them some of the black hole's gravitational energy. When this occurs on the event horizon, one particle may form just outside and the other just inside, so that the one on the outside escapes to space. This particle emission is Hawking radiation.

Theoretically, therefore, black holes have a way of shedding mass (through radiation) over time. Eventually, they completely "evaporate" into nothing! This process is extremely slow: black holes resulting from the collapse of stars may take tens of billions of years (more than the current age of the Universe!) to evaporate. Larger ones take still longer. Nevertheless, a theoretical puzzle remains: if the black hole evaporates and disappears, where did its stored information go? This is known as the black hole information paradox. The only particles actually emitted from the horizon were spontaneously produced from the vacuum, so it is not obvious how these could encode information. Alternatively, the information could all be released in some way at the moment the black hole evaporates. This runs into another problem, known as the Bekenstein bound.

The Bekenstein bound, named after physicist Jacob Bekenstein, is an upper limit on the amount of information that may be stored in a finite volume using finite energy. To see why this bound arises, consider a physical system as a rudimentary "computer" that stores binary information (i.e. strings of 1's and 0's). In order to store a five-digit string such as 10011, there need to be five "switches," each of which has an "up" position for 1 and a "down" position for 0. Considering all possible binary strings, there are therefore 2^5 = 32 different physical states (positions of switches) for our five-digit string. This is a crude analogy, but it captures the basic gist: the Bekenstein bound comes about because a physical system of a certain size and energy can only occupy so many physical states, for quantum mechanical reasons. This bound is enormous; every rearrangement of atoms in the system, for example, would count as a state. Nevertheless, it is finite.

The mathematical statement of the bound gives the maximum number of bits, or the length of the longest binary sequence, that a physical system of mass m, expressed as a number of kilograms, and radius R, a number of meters, could store. It is I ≤ 2.5769*10^43 mR.
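To get a feel for the scale of this limit, we can plug everyday numbers into the formula. A minimal Python sketch (the function name is my own):

```python
def bekenstein_bound_bits(mass_kg, radius_m):
    """Maximum number of bits a physical system of the given mass (kg)
    and radius (m) can store, per the Bekenstein bound I <= 2.5769e43 * m * R."""
    return 2.5769e43 * mass_kg * radius_m

# A 1 kg object of radius 10 cm: at most ~2.6e42 bits,
# vastly beyond any conceivable storage device.
book_limit = bekenstein_bound_bits(1.0, 0.1)
```

The result is astronomically large for ordinary objects, which is why the bound never matters for technology, yet it is finite, which is why it matters for an evaporating black hole.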

This is far, far greater than what any existing or foreseeable computer is capable of storing, and is therefore not relevant to current technology. However, it matters for black holes: if a black hole holds its information until the moment of evaporation, it will have shrunk to a minuscule size while still retaining all the information it had at its largest. This hypothesis for resolving the black hole information paradox therefore seems at odds with the Bekenstein bound.

In summary, there are many possible avenues for study in resolving the black hole information paradox, nearly all of which require the sacrifice of at least one physical principle. Perhaps information is not preserved over time, due to the "collapse" of the quantum wavefunction that occurs with measurement. Perhaps there is a way for Hawking radiation to carry information. Or possibly, there is a way around the Bekenstein bound for evaporating black holes. These possibilities, as well as more exotic ones, are current areas of study. Resolving the apparent paradoxes that arise in the most extreme of environments, where quantum mechanics and relativity collide, would greatly advance our understanding of the universe.


Monday, January 22, 2018

Neutrinos and Their Detection 2

This is the second part of a two part post. For the first part, see here.

The discovery of neutrinos led to a rather startling realization concerning the omnipresence of these particles. Scientists have known since the early 20th century that stars such as the Sun generate energy through nuclear fusion, especially of hydrogen into helium. In addition to producing radiation that eventually leads to what we see as sunlight, every one of these reactions releases neutrinos. As a result, the Earth is continually bathed in a stream of neutrinos: every second, billions of neutrinos pass through every square centimeter of the Earth's surface. The vast majority of these pass through the planet unimpeded and resume their course through space, just as discussed in the previous post. As we will see, studying the properties of these solar neutrinos later led to a revolutionary discovery.
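The quoted flux of billions of neutrinos per square centimeter per second can be estimated from the Sun's luminosity: each fusion of four hydrogen nuclei into one helium nucleus releases about 26.7 MeV of energy along with two neutrinos. A back-of-the-envelope Python sketch (variable names are my own):

```python
import math

L_SUN = 3.846e26                # solar luminosity, watts
E_FUSION = 26.7e6 * 1.602e-19   # ~26.7 MeV per 4H -> He fusion, in joules
NEUTRINOS_PER_FUSION = 2        # two neutrinos released per fusion
AU = 1.496e11                   # Earth-Sun distance, meters

fusions_per_second = L_SUN / E_FUSION
neutrinos_per_second = NEUTRINOS_PER_FUSION * fusions_per_second
# Spread the output over a sphere of radius 1 AU; 1 m^2 = 1e4 cm^2.
flux_per_cm2 = neutrinos_per_second / (4 * math.pi * AU**2 * 1e4)
# flux_per_cm2 comes out on the order of 6e10 neutrinos per cm^2 per second
```

This rough estimate of tens of billions per square centimeter per second agrees with the figure in the text.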

In 1967, an experiment began that had much in common with many of the neutrino experiments to come. Known as the Homestake experiment after its location, the Homestake Gold Mine in South Dakota, its main apparatus was a 100,000-gallon tank of perchloroethylene (a common cleaning fluid) located deep underground, nearly a mile below the Earth's surface. The experiment was conducted underground to minimize the influence of cosmic rays, which would react with the perchloroethylene and produce experimental noise. Cosmic rays do not penetrate deep underground, while neutrinos do. The immense volume of liquid was necessary to obtain statistically significant data from the small rate of neutrino interactions. Neutrinos occasionally convert a chlorine atom in the fluid into an argon atom; the number of argon atoms produced was measured to determine how many reactions were occurring.

Simultaneously, physicists made theoretical calculations using knowledge of the Sun's composition, the process of nucleosynthesis, the Earth's distance from the Sun, and the size of the detector to estimate what the rate of interactions should have been. However, the results were not consistent with the data collected from the experiment. Generally, theoretical estimates were around three times as large as the actual results. Two-thirds of the expected reactions were missing! This disagreement became known as the "solar neutrino problem."

The models of the Sun were not at fault. In fact, the cause of the problem was an incorrect feature of the otherwise quite powerful Standard Model of particle physics, namely its assumption that neutrinos are massless. As far back as 1957, Italian physicist Bruno Pontecorvo considered the implications of neutrinos having mass.

He and others realized that neutrinos with mass would undergo what is known as neutrino oscillation when traveling through space. For example, an electron neutrino emitted from nuclear fusion would become a "mix" of all three flavors of neutrinos: electron, muon, and tau. When a solar neutrino reaches Earth and interacts with matter, it only has roughly a 1 in 3 chance of "deciding" to be an electron neutrino. This would explain the observed missing neutrinos, since the Homestake detector only accounts for electron neutrinos.
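For illustration, the standard two-flavor vacuum oscillation formula shows how the probability of detecting a given flavor varies with distance traveled. The real solar case involves all three flavors and matter effects, so this is only a sketch (the function name and the example parameters are my own):

```python
import math

def transition_probability(theta, dm2_eV2, L_km, E_GeV):
    """Two-flavor vacuum oscillation: probability that a neutrino
    produced in one flavor is detected as the other after distance L.
    Standard formula: P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with dm2 in eV^2, L in km, and E in GeV."""
    return (math.sin(2 * theta) ** 2
            * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2)

# At the source (L = 0), no oscillation has occurred yet.
p0 = transition_probability(0.6, 7.5e-5, 0.0, 0.01)
```

Averaged over many oscillation wavelengths, the survival probability of the original flavor tends toward 1 - 0.5*sin^2(2*theta), so a large mixing angle can hide a large fraction of the neutrinos a detector is tuned to see.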

For the remainder of the 20th century, several more experiments were performed to investigate whether neutrino oscillation was in fact the solution to the solar neutrino problem. One experiment that was crucial in conclusively settling the matter was Super-Kamiokande, a neutrino observatory located in Japan. Like the Homestake experiment, it was located deep underground in a mine and consisted of a large volume of liquid (in this case, water).

When neutrinos interact with the water molecules in the detector, charged particles are produced that propagate through the chamber. These release Cherenkov radiation, which is amplified and recorded by the photomultipliers that surround the water tank on every side. The large number of photomultipliers allows a detailed analysis of this radiation, yielding the energy and direction of origin for each neutrino interaction. It was this added precision that helped to resolve the solar neutrino problem: neutrinos indeed have mass and undergo oscillation. This discovery led to Japanese physicist Takaaki Kajita (who worked on the Super-Kamiokande detector as well as its predecessor, the Kamiokande detector) sharing the 2015 Nobel Prize in Physics.

The exact masses of the different flavors of neutrinos are not yet known, nor do we completely understand why they have mass. However, despite the mysteries of particle physics that remain, further applications of neutrino detection continue in a different field: astronomy. The use of neutrinos to observe extraterrestrial objects is known as neutrino astronomy. In theory, if one could accurately measure the direction from which every neutrino arrives at Earth, the result would be an "image" of the sky highlighting neutrino sources. In reality, the scattering that occurs in detectors such as Super-Kamiokande when incoming particles hit and change direction limits angular resolution, and so few interactions occur that there are insufficient samples to construct such an image. In fact, only two extraterrestrial sources have ever been detected through neutrino emissions: the Sun, and a nearby supernova known as SN 1987A after the year in which it took place. Theoretical calculations indicate that sufficiently bright supernovae may be located with reasonable accuracy using neutrino detectors in the future.

There is one major advantage to using neutrinos as opposed to light in making observations: neutrinos pass through nearly all matter unimpeded. The above discussion indicated that the Sun is a neutrino source. This is true, but not fully precise; the solar core is the source of the neutrinos, as it is where fusion occurs, and its radius is only about a quarter of the Sun's. There is no way to see the light emanating from the core because it interacts with other solar particles. However, we can see the core directly through neutrino imaging. In fact, the data from the Super Kamiokande experiment should be enough to approximate the radius in which certain fusion reactions take place. Future detectors could tell us even more about the Sun's interior.

Neutrino astronomy is still a nascent field and we do not yet know its full potential. Further understanding and detection of neutrinos will tell us more about the fundamental building blocks of matter, allow us to peer inside our own Sun, and measure distant supernovae.


Monday, January 1, 2018

Neutrinos and Their Detection

Neutrinos are a type of subatomic particle known both for their ubiquity and their disinclination to interact with other forms of matter. They have zero electric charge and very little mass even compared to other fundamental particles (though not zero; more on this later), so they are not affected by electromagnetic forces and only slightly by gravity.

Since neutrinos are so elusive, it is not surprising that their existence was first surmised indirectly. In 1930, while studying a type of radioactive decay known as beta decay, physicist Wolfgang Pauli noticed a discrepancy. Through beta decay (shown above), a neutron is converted into a proton. This is a common process by which unstable atomic nuclei transmute into more stable ones. It was known that an electron was also released in this process. However, Pauli found that this left some momentum unaccounted for. As a result, he postulated the existence of a small, neutral particle (these properties eventually led to the name "neutrino"). The type emitted in this sort of decay is now known as an electron antineutrino (all the types will be enumerated below).

However, neutrinos remained speculative for some decades before a direct detection occurred in 1956 in the Cowan-Reines Neutrino Experiment, named after physicists Clyde Cowan and Frederick Reines.

The experiment relied upon the fact that nuclear reactors were expected to release a large flux of electron antineutrinos during their operation, providing a concentrated source with which to experiment. The main apparatus of the experiment was a volume of water that electron antineutrinos emerging from the reactor would pass through. Occasionally, one would interact with a proton in the tank, producing a neutron and a positron (or anti-electron, denoted e+) through the reaction shown on the bottom left. This positron would quickly encounter an ordinary electron and the two would mutually annihilate to form gamma rays (γ). These gamma rays would then be picked up by scintillators around the water tanks. To increase the certainty that these gamma ray signatures in fact came from neutrinos, the experimenters added a second layer of detection by dissolving cadmium chloride (CdCl2) in the water. The capture of a neutron (the other product of the initial reaction) by the isotope Cd-113 creates an excited state of Cd-114, which releases a gamma ray after a period of a handful of microseconds. Thus, the detection of two gamma rays simultaneously and then a third after a small delay would definitively indicate a neutrino interaction. The experiment was very successful and the rate of interactions, about three per hour, matched the theoretical prediction well. The neutrino had been discovered.

The Standard Model of particle physics predicted the existence of three "generations" of neutrinos corresponding to three types of particles called leptons.

The above diagram shows the three types of leptons and their corresponding neutrinos. In addition to this, every particle type has a corresponding antiparticle which in a way has the "opposite" properties (though some properties, such as mass, remain the same). The electron antineutrino discussed above is simply the antiparticle corresponding to the electron neutrino, for example. The discoveries of the others occurred at particle accelerators, where concentrated beams could be produced: the muon neutrino in 1962, and the tau neutrino in 2000. These results completed the expected roster of neutrino types under the Standard Model. In its original form, though, the Standard Model predicted that all neutrinos would have exactly zero mass. Note that this hypothesis (though later proved incorrect) is not disproven by the fact that neutrinos account for the "missing momentum" Pauli originally identified; massless particles, such as photons (particles of light), can still carry momentum and energy.

All of the neutrino physics described so far concerns artificially produced particles. However, these discoveries were only the beginning. Countless neutrinos also originate in the cosmos, motivating the area of neutrino astronomy. For more on this field and its value to both astronomy and particle physics, see the next post (coming January 22).


Wednesday, December 20, 2017

2017 Season Summary

The 2017 Atlantic hurricane season had above-average activity, with a total of

18 cyclones attaining tropical depression status,
17 cyclones attaining tropical storm status,
10 cyclones attaining hurricane status, and
6 cyclones attaining major hurricane status.

Before the beginning of the season, I predicted that there would be

15 cyclones attaining tropical depression status,
15 cyclones attaining tropical storm status,
6 cyclones attaining hurricane status, and
3 cyclones attaining major hurricane status.

The average numbers of named storms, hurricanes, and major hurricanes for an Atlantic hurricane season (over the 30-year period 1981-2010) are 12.1, 6.4, and 2.7, respectively. The 2017 season was well above average in all categories, especially hurricanes and major hurricanes. In addition, there were several intense and long-lived hurricanes, inflating the ACE (accumulated cyclone energy) index to 223. This value, which takes into account the number, duration, and intensity of tropical cyclones, was the highest since 2005. 2017 was also the first year on record to have three storms exceeding 40 ACE units: Hurricane Jose, with 42, Hurricane Maria, with 45, and Hurricane Irma, with 67.

The ENSO oscillation, a variation in the ocean temperature anomalies of the tropical Pacific, often plays a role in Atlantic hurricane development. At the beginning of the 2017 season, these temperatures were predicted to rise, signaling a weak El Niño event and suppressing hurricane activity. However, this event did not materialize. Though anomalies did rise briefly in the spring, they returned to neutral and even negative by the early fall, when hurricane season peaks. This contributed to the extremely active September. In addition, conditions were more favorable for development in the central Atlantic than they had been for several years, allowing the formation of long-track major hurricanes. Due to these factors, my predictions significantly underestimated the season's extreme activity.

The 2017 Atlantic hurricane season was the costliest ever recorded, with Hurricanes Harvey, Irma, and Maria contributing the lion's share to this total. Among the areas most affected were southeastern Texas (by Harvey), the Leeward Islands (from Irma and Maria), and Puerto Rico and the Virgin Islands (from Maria). Some other notable facts and records for the 2017 season include:
  • Tropical Storm Arlene formed on April 20, one of only a small handful of April storms; it also had the lowest pressure ever recorded for an Atlantic tropical cyclone in April
  • The short-lived Tropical Storm Bret formed off the coast of South America and made landfall near the northern tip of Venezuela, becoming the southernmost-forming June Atlantic cyclone since 1933
  • The remnants of Hurricane Franklin regenerated in the eastern Pacific after crossing Mexico and received a new name: Jova
  • Hurricane Harvey was the first major hurricane to make landfall in the U.S. since 2005, and the strongest to do so in Texas since 1961; the peak rainfall accumulation of 51.88" in Cedar Bayou, Texas was the largest tropical cyclone rain total ever for the continental U.S.
  • Hurricane Irma spent a total of 3.25 days as a category 5 hurricane, the most in the Atlantic since 1932, and maintained 185 mph winds for 37 hours, longer than any other tropical cyclone recorded worldwide
  • When Hurricanes Irma, Jose, and Katia were all at category 2 strength or above on September 8, it marked only the second such occurrence since 1893
  • Hurricane Maria reached a minimum pressure of 908 mb, then the tenth lowest ever for an Atlantic hurricane, and the lowest since Dean in 2007
  • Becoming a major hurricane near the Azores Islands, Hurricane Ophelia was the easternmost major hurricane ever to form in the Atlantic
  • All ten named storms from Hurricane Franklin to Ophelia became hurricanes, the first time ten consecutive names have done so in the Atlantic since 1893

Overall, the 2017 Atlantic hurricane season was exceptionally active and damaging, especially for parts of the Caribbean.


Tuesday, November 7, 2017

Tropical Storm Rina (2017)

Storm Active: November 6-9

On November 3, a weak area of low pressure developed in the central tropical Atlantic, well away from any land areas. It moved slowly north and north-northeast over the following days and became better organized on November 5. Early in the morning on the 6th, the disturbance was organized enough to be classified as Tropical Depression Nineteen. The system was moving over marginal sea surface temperatures in an area of shear that was not too high, so modest strengthening occurred over the next day and the depression became Tropical Storm Rina overnight.

Rina began to accelerate northward on the 7th, passing the latitude of Bermuda almost 1000 miles to the east. Though sea surface temperatures were declining, the system's maximum winds increased somewhat as it took on some subtropical characteristics. Rina reached its peak intensity of 60 mph winds and a pressure of 997 mb on November 8. Later that day it turned toward the north-northeast and began weakening as it transitioned to an extratropical system; it completed this transition during the morning of November 9. The system then turned eastward, eventually impacting the UK as a weak low before dissipation.

This image shows Tropical Storm Rina shortly after formation.

Rina did not affect any land areas during its lifetime.

Sunday, October 29, 2017

Tropical Storm Philippe (2017)

Storm Active: October 28-29

On October 23, a broad area of low pressure formed in the southwestern Caribbean Sea. Since this is a favorable area for late-season development, it was monitored closely. The broadness of the low made organization quite slow, despite plenty of moist air and fairly favorable atmospheric conditions. In addition, the circulation spent the next few days in close proximity to the coast of Nicaragua, where it dropped heavy rains. As a result, it was not until October 28 that the disturbance became Tropical Depression Eighteen. By the time it formed, the cyclone was already accelerating toward the north and northeast under the influence of a trough over the United States. Conditions were still favorable though, and the system strengthened into Tropical Storm Philippe as it passed over western Cuba. The rain bands of Philippe extended well to the north and east of the center, so Cuba and Florida had already been experiencing heavy rains. Early on October 29, the storm crossed south Florida and emerged into the Atlantic.

As Philippe approached the cold front to its north, upper-level winds increased dramatically. The system quickly became elongated from north to south and dissipated that afternoon; its remnants then merged with a developing extratropical system off the coast of the Carolinas. Tropical moisture from Philippe contributed to an already powerful developing nor'easter, enhancing rainfall over many of the northeast and mid-Atlantic states. The storm ultimately brought heavy snowfall to parts of eastern Canada before dissipating.

Philippe was a disorganized but large tropical storm that brought heavy rainfall to Cuba and Florida.

While Philippe's time as a tropical cyclone was short-lived, it contributed to a large storm that affected the northeast U.S.

Monday, October 9, 2017

Hurricane Ophelia (2017)

Storm Active: October 9-15

Around October 6, an area of low pressure began to form along a stationary frontal boundary located over the eastern Atlantic. The next day, the system began to separate from the remainder of the frontal boundary, although it still displayed a long, curved, front-like band of convection emanating from the center. Slowly, it developed some subtropical characteristics as it drifted in the northeast Atlantic, moving little. By October 8, the low was on the verge of tropical or subtropical cyclone status and satellite data indicated gale-force winds near the center. Overnight, a region of shower activity persisted just east of the center. Available information pointed to winds just below tropical storm strength, so the system was designated Tropical Depression Seventeen. Banding features started to appear during the morning of the 9th, and the depression was upgraded to Tropical Storm Ophelia.

Initially, Ophelia was moving slowly to the north and northeast in a region of weak steering currents. Over the next day, however, a mid-level ridge built in northwest of the cyclone and it turned toward the southeast. Meanwhile, shear was diminishing and convection was able to wrap around the center, which allowed for some strengthening. The largest inhibitor to development was some dry air inside the circulation. Interaction with this dry air caused intensity fluctuations on October 10, but deeper convection completely enclosed the center that night. At the same time, an eye feature formed, and Ophelia strengthened more rapidly. During the afternoon of October 11, Ophelia reached hurricane strength, becoming the tenth consecutive tropical cyclone of the 2017 season to develop into a hurricane.

Steering currents collapsed later that day as well, leaving the system to drift slowly eastward overnight. The appearance of deeper convection near the center suggested that additional strengthening had occurred. Sea surface temperatures remained only lukewarm, but unusually cool upper atmospheric temperatures created a steep enough gradient to support intensification. On October 12, Ophelia reached category 2 status, an unprecedented achievement for a hurricane so far northeast so late in the hurricane season. The cyclone began to gradually accelerate east-northeast overnight, reaching an intensity of 105 mph winds and a pressure of 970 mb. The eye clouded over briefly on the morning of the 13th, but this was a short-lived trend. Later that day the eye cleared out and became even better defined, with deep convection completely surrounding the center. As a result, Ophelia maintained its remarkable category 2 status even farther north and east.

The system was not finished, however. A final burst of intensification on October 14 brought Ophelia to major hurricane strength, and it reached a peak intensity of 115 mph winds and a pressure of 960 mb. In doing so, it became the easternmost major hurricane ever recorded. The gap between it and its predecessors was even more impressive at its latitude, where it tracked some 900 miles farther east than any previous major hurricane. Finally, early on October 15, much colder waters and higher shear began to weaken Ophelia and induce extratropical transition. Later that day, the storm became extratropical as it sped toward Ireland. The center made landfall in southwest Ireland during the morning of October 17, bringing damaging hurricane-force winds. Since the system was moving at over 40 mph, it quickly passed over Ireland and the UK. The post-tropical cyclone brought gale-force winds all the way to Scandinavia before finally dissipating.

Hurricane Ophelia is shown above as a major hurricane near the Azores, an unprecedented event in Atlantic hurricane records dating back to 1851.

Ophelia was no longer a tropical cyclone when it reached Europe (triangle points) but it still brought hurricane-force winds to many parts of Ireland.

Wednesday, October 4, 2017

Hurricane Nate (2017)

Storm Active: October 4-9

During the last week of September, a tropical wave tracked across the Atlantic Ocean. As the season progressed into its later stages, conditions were less favorable over the open Atlantic, but the wave continued into the Caribbean. On October 3, its southern end began interacting with a broad area of cyclonic rotation known as a monsoonal gyre in the southwestern Caribbean, just north of Panama. This interaction led to increased spin, and a combination of low wind shear and warm waters supported further development. During the morning of October 4, Tropical Depression Sixteen formed east of Nicaragua. Over the next day, it moved slowly northwest. Lacking an inner core, the system was not able to develop quickly, but it did become Tropical Storm Nate early on October 5.

Shortly after, the center made landfall in Nicaragua, bringing heavy rainfall to the country as well as neighboring Honduras. Late that night, it reentered the Caribbean, prompting some intensification as convection increased. Meanwhile, Nate was accelerating toward the north-northwest. Its rapid speed hampered strengthening somewhat, but parts of an eyewall appeared on the 6th in the southern and western quadrants, and the system became a strong tropical storm. That evening, it passed just east of the Yucatan Peninsula. Fortunately, the part of the circulation over land was the weaker western side, minimizing damage. However, the lack of significant land interaction allowed Nate to continue to intensify steadily and become a hurricane around midnight. Conditions were still favorable in the Gulf of Mexico, but the fast-moving system had difficulty assembling a complete eyewall. Winds were still increasing to the north and east of the center, though, and the central pressure continued to fall. At midmorning, Nate reached its peak intensity of 90 mph winds and a pressure of 981 mb.

Slight weakening occurred over the next few hours, but the system still made landfall as a hurricane that evening near the Mississippi River delta. Once inland, it moved quickly north-northeast and weakened. Nate was at tropical depression strength by midday on the 8th and became post-tropical early the next morning as it sped toward the mid-Atlantic states. The system brought precipitation to a large swath of the eastern U.S., but its rapid motion mitigated flooding. The remnants of Nate dissipated soon after.

The above image shows Hurricane Nate near peak intensity shortly before its Gulf coast landfall. The system's rapid motion prevented further organization.

Though Nate reached hurricane strength only in the Gulf of Mexico, its worst impacts occurred in central America, where prolonged heavy rains caused extensive flooding.

Sunday, September 17, 2017

Hurricane Maria (2017)

Storm Active: September 16-30

Maria formed from a tropical wave that first left Africa around September 10 or 11. At first, conditions did not support development of the broad system, but they steadily improved over the next several days. On September 15, the disturbance appeared much more organized on satellite imagery, and some rotation became evident. By the morning of the 16th, only a well-defined center of circulation separated it from tropical cyclone status. It cleared this hurdle during the afternoon, becoming Tropical Depression Fifteen. From this point, its maximum winds increased almost immediately, and the depression was upgraded to Tropical Storm Maria shortly thereafter.

The tropical storm was already quite large, though gaps remained in the satellite presentation of the cyclone between rain bands. Despite this, the inner core strengthened fairly swiftly, and Maria became a strong tropical storm by the morning of September 17. That afternoon, a large burst of convective activity ignited near the center of circulation, overcoming the dry slot that had been hampering intensification. Soon, Maria had a well-formed eyewall and was upgraded to a hurricane. The outer bands were starting to affect the Lesser Antilles, and the system continued west-northwestward toward the islands. September 18 saw incredibly rapid strengthening of Maria. In the morning, it strengthened into a major hurricane; while an eye was apparent on radar, it had not yet cleared out on satellite imagery. The clearing came that afternoon: a very small "pinhole" eye developed, indicating a small core but extremely intense winds. Its intensity shot up through category 4, and Maria achieved category 5 intensity with 160 mph winds and a pressure of 924 mb that evening. The eye then made landfall on the small island of Dominica in the Lesser Antilles.

Though small, the island is mountainous, and it briefly disrupted Maria's core, bringing the intensity down slightly to category 4. However, as the storm moved west-northwest into the Caribbean, its central pressure began to drop again, and the hurricane regained category 5 status early in the morning of September 19. Remarkably, the cyclone was not done intensifying: it became more symmetric on satellite imagery that day, and thunderstorm activity around the center grew even further. That evening, Maria reached a peak intensity of 175 mph winds and a central pressure of 908 mb, one of the ten lowest pressures ever recorded in an Atlantic hurricane at the time, even though its maximum winds were slightly weaker than those of Hurricane Irma a few weeks earlier.

By this time, the center was approaching Puerto Rico and the U.S. Virgin Islands. As is typical with powerful hurricanes, a secondary eyewall then formed and the inner one weakened somewhat, causing a decrease in maximum winds. When Maria made landfall in Puerto Rico early on September 20, it was a high-end category 4 hurricane with maximum winds of 155 mph, but the area of maximum winds had expanded in the wake of the eyewall replacement. Regardless, it was the strongest cyclone to make landfall in Puerto Rico since 1928. The hurricane brought extremely strong winds and damaging flooding rains to the island, causing several rivers to exceed their previous record stages. Nevertheless, land interaction took a significant toll on Maria and it quickly weakened over the next several hours. After traversing much of Puerto Rico from east-southeast to west-northwest, the center emerged over water early in the afternoon. The system had dropped to high-end category 2 strength, but reorganization began as it moved further northwest. A ragged eye developed by the evening and the circulation recovered some overnight, bringing Maria back up to major hurricane strength early on September 21. The southern portion of the circulation brought widespread tropical storm conditions and occasional hurricane conditions to the Dominican Republic that day.

The cyclone then veered northwest, moving away from Hispaniola. Some modest strengthening ensued, though the eye of the hurricane was quite unstable and actually clouded over that night as wind shear out of the southwest disrupted the system. Early on September 22, the center passed just east of the Turks and Caicos Islands. Following the weakness left in the ridge to its north by Jose before it, Maria moved north-northwest that day. No significant changes in strength occurred through September 23, although the pressure and winds fluctuated: the central pressure decreased, but the maximum winds in the eyewall were not as strong as before. As a result, Maria was downgraded to a category 2 on September 24. Later that day, Maria's structure took a more significant hit as the center moved over the cold wake left by Jose and the northwestern eyewall collapsed. Consequently, the hurricane weakened to a category 1 overnight as it continued slowly northward. The center was nearly exposed late on the 25th, but the storm maintained minimal hurricane strength.

Maria finally weakened to a tropical storm on September 26, as the outer edge of its tropical storm wind field brushed the Outer Banks of North Carolina. Since almost all thunderstorm activity was displaced to the north and east of the center, there were few land impacts. Winds actually increased for a brief period on September 27, and the storm regained hurricane strength. This was short-lived, though; it was a tropical storm again the next morning. After moving north at a crawl for several days, Maria finally began to turn eastward and accelerate as a cold front approached the U.S. east coast. The next day, the heading shifted to east-northeast. Shear also increased significantly as the circulation encountered colder waters, beginning extratropical transition. On September 30, Maria became post-tropical.

The above image shows Maria as a category 5 hurricane in the Caribbean Sea.

Hurricane Maria brought devastating damage to Dominica, the U.S. Virgin Islands, Puerto Rico, and other parts of the Lesser Antilles.