The auroras—the aurora borealis (or northern lights) in the Northern Hemisphere, and the aurora australis (the southern lights) in the Southern Hemisphere—are brilliant natural spectacles that can be seen in the evening sky, especially at higher latitudes. Unlike other phenomena of the night sky, such as meteors and comets, which originate beyond Earth, the auroras are atmospheric phenomena. But what causes them? Although auroras appear in the atmosphere, they are the result of extraterrestrial forces; however, these forces are not particularly alien. The Sun’s corona—the outermost region of the Sun’s atmosphere, consisting of plasma (hot ionized gas)—drives the solar wind (a particle flux of protons and electrons) away from the Sun. Some of these high-energy particles strike Earth’s magnetic field and follow magnetic field lines down into Earth’s atmosphere at the North and South magnetic poles. Earth’s atmosphere is mostly made up of nitrogen and oxygen. Once the solar particles reach Earth’s atmosphere, they collide with atoms of nitrogen and oxygen, stripping away their electrons to leave ions in excited states. These ions emit radiation at various wavelengths, creating the auroras’ characteristic colors. Collisions of solar particles with oxygen produce red or green light; collisions with nitrogen produce blue and purple light. During periods of low solar activity, when the Sun has fewer sunspots, fewer of these high-energy particles are emitted from the Sun, and the shimmering sheets of color that characterize Earth’s auroral zones shift poleward. When the Sun is more active and larger amounts of plasma are erupting from the Sun’s surface, more particles reach Earth’s atmosphere, and the auroras occasionally extend to the middle latitudes. For example, the aurora borealis has been seen as far south as 40° latitude in the United States. The auroras typically occur at altitudes of about 100 km (60 miles); however, they may occur anywhere between 80 and 250 km (about 50 to 155 miles) above Earth’s surface.
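The connection between collisions and colors can be made concrete with a little arithmetic: each emission line corresponds to a photon of energy E = hc/λ. The following sketch (in Python) converts the standard auroral wavelengths, oxygen’s green 557.7-nm and red 630.0-nm lines and ionized nitrogen’s blue 427.8-nm line, into photon energies in electron volts.

```python
# Photon energies of the principal auroral emission lines, E = h*c / lambda.
# The wavelengths are standard textbook values for atomic oxygen and
# ionized molecular nitrogen; the script is illustrative only.

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron volt

lines = {
    "oxygen, green (557.7 nm)": 557.7e-9,
    "oxygen, red (630.0 nm)": 630.0e-9,
    "nitrogen ion, blue (427.8 nm)": 427.8e-9,
}

for name, wavelength in lines.items():
    energy_ev = H * C / wavelength / EV
    print(f"{name}: {energy_ev:.2f} eV")
```

The energies come out between roughly 2 and 3 eV, which is exactly the span of visible light; that is why the de-excitation of these atmospheric ions is visible to the naked eye.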
Deoxyribonucleic acid, better known as DNA, is crucial to life on Earth. The discovery of DNA’s double-helix structure is credited to the researchers James Watson and Francis Crick, who, with fellow researcher Maurice Wilkins, received a Nobel Prize in 1962 for their work. Many believe that Rosalind Franklin should also be given credit, since she produced the X-ray diffraction image of DNA (known as Photo 51) that revealed its helical structure and that was used as evidence without her permission. Gene editing today is mostly done through a technique called Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR), adapted from a bacterial mechanism that can cut out specific sections of DNA. One use of CRISPR is the creation of genetically modified crops. Recombinant DNA technology is the joining together of DNA molecules from two different species. The recombined DNA molecule is inserted into a host organism to produce new genetic combinations that are of value to science, medicine, agriculture, and industry. Since the focus of all genetics is the gene, the fundamental goal of laboratory geneticists is to isolate, characterize, and manipulate genes. Recombinant DNA technology is based primarily on two other technologies, cloning and DNA sequencing. Cloning is undertaken in order to obtain the clone of one particular gene or DNA sequence of interest. The next step after cloning is to find and isolate that clone among other members of the library (a large collection of clones). Once a segment of DNA has been cloned, its nucleotide sequence can be determined, and knowledge of the sequence has many uses. Unlike most other organelles, chloroplasts and mitochondria have small circular chromosomes known as extranuclear DNA. Chloroplast DNA contains genes that are involved with aspects of photosynthesis and other chloroplast activities. It is thought that chloroplasts are descended from free-living cyanobacteria (and mitochondria from free-living aerobic bacteria), which could explain why these organelles possess DNA that is distinct from the rest of the cell’s.
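Because recombinant DNA work depends on enzymes that cut DNA only at specific recognition sequences, a small illustration may help. The sketch below (in Python; the recognition site GAATTC belongs to the well-known restriction enzyme EcoRI, while the sample sequence is invented for demonstration) locates such sites in a sequence and shows why they are called palindromic: each reads the same on the complementary strand.

```python
# Finding restriction sites in a DNA sequence -- a toy illustration of how
# enzymes that cut DNA at specific sequences enable recombinant DNA work.
# EcoRI's recognition site (GAATTC) is a real, well-known example; the
# sample sequence below is invented for demonstration.

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    """Return the reverse complement of a DNA sequence."""
    return seq.translate(COMPLEMENT)[::-1]

def find_sites(seq: str, site: str) -> list[int]:
    """Return the 0-based positions where the recognition site occurs."""
    return [i for i in range(len(seq) - len(site) + 1)
            if seq[i:i + len(site)] == site]

sequence = "ATGGAATTCCGATTAGCGAATTCAT"  # made-up example sequence
site = "GAATTC"                         # EcoRI recognition site

print("EcoRI sites at:", find_sites(sequence, site))
# A palindromic site reads the same on the complementary strand,
# which is why the enzyme can cut both strands at the same spot.
print("Site is palindromic:", reverse_complement(site) == site)
```

Cutting two different DNA molecules with the same enzyme leaves matching ends, which is what allows fragments from different species to be joined into a single recombined molecule.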
Are you a logical, precise thinker, or would you say that you’re more free-spirited and artistic? If you’re the former, somebody’s probably told you at some point that you’re a left-brained person, and if you’re the latter, right-brained. The notion that the right half of the brain is the creative half and the left half is the analytical half and that our individual traits are determined by which half is dominant is widespread in popular psychology. There’s even a small industry devoted to this idea. There are self-help books, personality tests, therapies, and educational materials that claim to help you optimize the functions of the stronger half of your brain, get in touch with the weaker half, or even make the two halves stop their (supposedly) incessant battling inside your skull so you can finally get some peace and quiet. The idea that there are right-brained and left-brained people is a myth. Although we all obviously have different personalities and talents, there’s no reason to believe these differences can be explained by the dominance of one half of the brain over the other half. Recent research using brain imaging technology hasn’t found any evidence of right or left dominance. One of the myth’s fatal flaws is that it relies on vague conceptions of the abilities it purports to explain. Math, for example, requires logical thought and, thus, is generally said to reside in the left brain, far away from all those artsy right-brain abilities. But mathematics is a profoundly creative endeavor in addition to being a logical one. So would a gifted mathematician be a right-brained person or a left-brained person? Likewise, artistic creativity isn’t just unbridled emotion. Many of the greatest works of art are products of rigorous, precise thought.
Large Hadron Collider (LHC), the world’s most powerful particle accelerator. The Large Hadron Collider (LHC) was constructed by the European Organization for Nuclear Research (CERN) in the same 27-km (17-mile) tunnel that housed its Large Electron-Positron Collider (LEP). The tunnel is circular and is located 50–175 metres (165–575 feet) belowground on the border between France and Switzerland. The LHC ran its first test operation on September 10, 2008. An electrical problem in a cooling system on September 18 resulted in a temperature increase of about 100 °C (180 °F) in the magnets, which are meant to operate at temperatures near absolute zero (−273.15 °C, or −459.67 °F). Early estimates that the LHC would quickly be repaired turned out to be overly optimistic. It restarted on November 20, 2009. Shortly thereafter, on November 30, it supplanted the Fermi National Accelerator Laboratory’s Tevatron as the most powerful particle accelerator when it boosted protons to energies of 1.18 teraelectron volts (TeV; 1 × 10¹² electron volts). In March 2010 scientists at CERN announced that a problem with the design of superconducting wire in the LHC required that the collider run only at half-energy (7 TeV). The LHC was shut down in February 2013 to fix the problem and was restarted in April 2015 to run at its full energy of 13 TeV. A second long shutdown, during which the LHC’s equipment was upgraded, began in December 2018 and ended in July 2022. The heart of the LHC is a ring that runs around the circumference of the LEP tunnel; the ring is only a few centimetres in diameter, evacuated to a higher vacuum than deep space, and cooled to within two degrees of absolute zero. In this ring, two counterrotating beams of heavy ions or protons are accelerated to speeds within one-millionth of a percent of the speed of light. (Protons belong to a category of heavy subatomic particles known as hadrons, which accounts for the name of this particle accelerator.) At four points on the ring the beams can intersect, and a small proportion of the particles crash into each other. At maximum power, collisions between protons take place at a combined energy of up to 13 TeV, about seven times greater than had been achieved previously. At each collision point are huge magnets weighing tens of thousands of tons and banks of detectors to collect the particles produced by the collisions. The project took a quarter of a century to realize; planning began in 1984, and the final go-ahead was granted in 1994. Thousands of scientists and engineers from dozens of countries were involved in designing, planning, and building the LHC, and the cost of materials and manpower was nearly $5 billion; this does not include the cost of running experiments and computers. One goal of the LHC project is to understand the fundamental structure of matter by re-creating the extreme conditions that occurred in the first few moments of the universe according to the big-bang model. For decades physicists have used the so-called standard model of fundamental particles, which has worked well but has weaknesses. First, and most important, it does not explain why some particles have mass. In the 1960s British physicist Peter Higgs postulated a particle that had interacted with other particles at the beginning of time to provide them with their mass. The Higgs boson had never been observed, however, since it could be produced only by collisions in an energy range that was not available to experiments before the LHC.
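The beam-speed figure can be checked with textbook special relativity: a proton of total energy E has Lorentz factor γ = E/(m_pc²), and at large γ its speed falls short of the speed of light by 1 − β ≈ 1/(2γ²). The sketch below (in Python, using the standard proton rest energy of about 938.272 MeV) runs the numbers for a 6.5-TeV beam, half of the 13-TeV collision energy.

```python
# Check the beam-speed claim with special relativity.
# A proton with total energy E has Lorentz factor gamma = E / (m_p c^2),
# and for large gamma the speed deficit is 1 - beta ~= 1 / (2 gamma^2).

PROTON_REST_ENERGY_GEV = 0.938272  # proton rest energy m_p c^2, standard value

def speed_deficit(beam_energy_gev: float) -> float:
    """Return 1 - v/c for a proton of the given total energy."""
    gamma = beam_energy_gev / PROTON_REST_ENERGY_GEV
    return 1.0 / (2.0 * gamma**2)

# Each beam carries half of the 13-TeV collision energy.
deficit = speed_deficit(6500.0)
print(f"gamma = {6500.0 / PROTON_REST_ENERGY_GEV:.0f}")
print(f"1 - v/c = {deficit:.2e}  ({deficit * 100:.2e} percent)")
# Roughly 1e-8, i.e., about one-millionth of a percent below c.
```

The Lorentz factor works out to nearly 7,000, and the speed deficit to about 10⁻⁸, which is the “within one-millionth of a percent of the speed of light” quoted above.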
After a year of observing collisions at the LHC, scientists there announced in 2012 that they had detected an interesting signal that was likely from a Higgs boson with a mass of about 126 gigaelectron volts (billion electron volts). Further data definitively confirmed that the signal was indeed that of the Higgs boson. Second, the standard model requires some arbitrary assumptions, which some physicists have suggested may be resolved by postulating a further class of supersymmetric particles; these might be produced by the extreme energies of the LHC. Finally, examination of asymmetries between particles and their antiparticles may provide a clue to another mystery: the imbalance between matter and antimatter in the universe.
Hallucinogens are a class of psychoactive drugs that produce temporary mental changes, including distorted sensory perception and dreamlike states of consciousness. For thousands of years, hallucinogenic substances have been linked with profound mystical experience; the Rigveda mentions a plant substance known as Soma, which, when ingested, produced brilliant visions of paradise. Most scholars believe the drug was a hallucinogen, although the plant it was made from has never been identified. The Eleusinian Mysteries, an ancient Greek ritual that persisted for nearly 2,000 years, were likewise centered on a beverage, known as kykeon, that was capable of producing altered states of consciousness. In the Americas, the Aztecs used a variety of hallucinogenic substances in religious and social rituals. In the 1950s and ’60s, hallucinogens were the subject of serious scientific study. One of the most famous studies was the so-called Good Friday Experiment, in which 20 theology students were given either the hallucinogen psilocybin or a placebo during a Good Friday church service. The students who received psilocybin reported having intense religious experiences. The apparent link between hallucinogen use and spiritual experience led some researchers to investigate the possible uses of hallucinogens as a treatment for psychiatric problems such as addiction, anxiety, and depression. The use of hallucinogens in scientific research was paralleled by their proliferation in the counterculture as recreational drugs. The widespread recreational use of hallucinogens in “hippie” culture provoked a cultural and political backlash that ultimately led to the criminalization of such drugs under the Controlled Substances Act of 1970; this legislation also had the effect of suppressing most scientific research concerning hallucinogens. After a long absence, hallucinogens began to reappear in scientific research in the late 1990s. The new studies, which have investigated the therapeutic applications of hallucinogens for a variety of conditions, have been conducted with greater methodological rigor and attention to patient safety than their predecessors in the 1960s. Most of the studies have been small, since hallucinogens remain tightly controlled and the U.S. government does not recognize any legitimate medical uses and thus does not offer any funding for research. But researchers have generally characterized their initial results as very promising. For example, studies of patients with terminal illnesses found that the mystical experiences induced by psilocybin produced stronger and longer-lasting improvements in patients’ symptoms of depression and anxiety than conventional treatments.
nanotechnology, the manipulation and manufacture of materials and devices on the scale of atoms or small groups of atoms. The “nanoscale” is typically measured in nanometres, or billionths of a metre (nanos, the Greek word for “dwarf,” being the source of the prefix), and materials built at this scale often exhibit distinctive physical and chemical properties due to quantum mechanical effects. Although usable devices this small may be decades away (see microelectromechanical system), techniques for working at the nanoscale have become essential to electronic engineering, and nanoengineered materials have begun to appear in consumer products. For example, billions of microscopic “nanowhiskers,” each about 10 nanometres in length, have been molecularly hooked onto natural and synthetic fibres to impart stain resistance to clothing and other fabrics; zinc oxide nanocrystals have been used to create invisible sunscreens that block ultraviolet light; and silver nanocrystals have been embedded in bandages to kill bacteria and prevent infection. Possibilities for the future are numerous. Nanotechnology may make it possible to manufacture lighter, stronger, and programmable materials that require less energy to produce than conventional materials, that produce less waste than conventional manufacturing, and that promise greater fuel efficiency in land transportation, ships, aircraft, and space vehicles. Nanocoatings for both opaque and translucent surfaces may render them resistant to corrosion, scratches, and radiation. Nanoscale electronic, magnetic, and mechanical devices and systems with unprecedented levels of information processing may be fabricated, as may chemical, photochemical, and biological sensors for protection, health care, manufacturing, and the environment; new photoelectric materials that will enable the manufacture of cost-efficient solar-energy panels; and molecular-semiconductor hybrid devices that may become engines for the next revolution in the information age. The potential for improvements in health, safety, quality of life, and conservation of the environment is vast. At the same time, significant challenges must be overcome for the benefits of nanotechnology to be realized. Scientists must learn how to manipulate and characterize individual atoms and small groups of atoms reliably. New and improved tools are needed to control the properties and structure of materials at the nanoscale; significant improvements in computer simulations of atomic and molecular structures are essential to the understanding of this realm. Next, new tools and approaches are needed for assembling atoms and molecules into nanoscale systems and for the further assembly of small systems into more-complex objects. Furthermore, nanotechnology products must provide not only improved performance but also lower cost. Finally, without integration of nanoscale objects with systems at the micro- and macroscale (that is, from millionths of a metre up to the millimetre scale), it will be very difficult to exploit many of the unique properties found at the nanoscale.
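To give a sense of how small the nanoscale is, a quick back-of-the-envelope calculation can help. The sketch below (in Python; the 0.25-nm interatomic spacing and the 80-micrometre hair width are rough illustrative assumptions, not measured figures for any particular material) counts how many atoms span one of the 10-nanometre nanowhiskers described above and how many such whiskers would fit across a human hair.

```python
# Putting the nanoscale in perspective. The atomic spacing and hair width
# used here are rough, typical values assumed for illustration only.

ATOM_SPACING_NM = 0.25          # rough typical interatomic distance, assumed
whisker_length_nm = 10.0        # the "nanowhiskers" described above
human_hair_width_nm = 80_000.0  # a hair is roughly 80 micrometres wide, assumed

print(f"Atoms spanning a {whisker_length_nm:.0f}-nm nanowhisker: "
      f"~{whisker_length_nm / ATOM_SPACING_NM:.0f}")
print(f"Nanowhiskers spanning a human hair's width: "
      f"~{human_hair_width_nm / whisker_length_nm:,.0f}")
```

On these assumptions a nanowhisker is only about 40 atoms long, and some 8,000 of them laid end to end would be needed to cross a single hair, which is why manipulating matter at this scale demands entirely new tools.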