It's pretty obvious why 0:00/12:00am is called "midnight". What is the history behind setting the 0 time in the middle of the night as opposed any other "time" of the day?
Some societies use sunset as the end of one day and the start of the next: this is recorded in Genesis, chapter 1; the Athenians reckoned days this way, and so does the Jewish civil day.
Some traditional agricultural societies start the day with dawn, but Roman civil society defined the day as beginning at midnight. The day was divided into ante meridiem (am) and post meridiem (pm), where the meridian refers to local noon. Twelve hours later is midnight, the sixth hour of the night, and the beginning of the next day.
Telling Time in Ancient Rome
Additional information is available at Roman Time Keeping, including the calendar. The Romans borrowed their system from the Greeks, who in turn had learned it from the Babylonians.
Western European timekeeping conventions come directly from Roman practices, and have spread around the world in recent times, but are not universal.
As pointed out in the comments, we used ship's bells when I was a sailor, though the official time was kept on a 24-hour clock, both GMT and local.
The natural clock on which all systems of time reckoning are based is the Sun. Noon is an astronomically defined event (does not depend on any convention): it is the upper culmination of the Sun. Midnight is similarly defined, it is the lower culmination. The lower culmination is not a visible event in most latitudes, most of the time. So there are two natural choices of the beginning of the day. Both were used. It is clear why noon is inconvenient for practical purposes. This is why we use midnight.
The day hasn't always started at midnight.
Up to late 1805 the Royal Navy used three days: nautical, civil (or "natural"), and astronomical. A nautical day entered in a ship's log as 10 July, for example, in fact commenced at noon on 9 July civil reckoning, PM therefore coming before AM. The astronomical day of 10 July, on the other hand, commenced at noon of 10 July civil reckoning, and ended at noon on 11 July. The astronomical day was brought into use following the introduction of the Nautical Almanac in 1767, and the British Admiralty issued an order ending the use of the nautical day on 11 October 1805. The US did not follow suit until 1848, while many foreign vessels carried on using it until the 1880s.
The clock runs whenever the ball is in play. It stops whenever the ball goes out of bounds, a foul is called, free throws are being shot, or during time outs. When the ball is inbounded, the clock starts once a player touches it.
In the NBA the clock stops after a made shot during the last two minutes of the game and overtime. For college it stops during the last minute of the game and overtime.
If the game is tied after regulation time, there will be overtime. Overtime is 5 minutes long in most leagues. Additional overtimes will be added until one team ends up on top.
Not all states have a shot clock for high school. Where they do, it generally follows the NCAA rules.
30 second time out signal
In order to give your team some rest, call a play, or just stop the game for a while, teams can call a time out. There are different rules on time outs for different leagues:
High School - Players on the floor or the coach can call a time out. There are five time outs per game including three 60-second time outs and two 30-second time outs.
NCAA College - There are a different number of time outs depending on whether the game is on TV or not. This is because during a TV game there are media time outs so the TV channel can show ads. For a TV game each team gets one 60-second time out and four 30-second time outs. For a non-TV game each team has four 75-second and two 30-second time outs.
NBA - In the NBA each team has six full time outs per game plus one 20-second time out per half. Only a player in the game can call a time out.
Ancient Egyptian obelisks, constructed about 3,500 B.C., are also among the earliest shadow clocks. The oldest known sundial, from Egypt, dates back to around 1,500 B.C. Sundials have their origin in shadow clocks, which were the first devices used for measuring the parts of a day.
An early prototype of the alarm clock was invented by the Greeks around 250 BC. The Greeks built a water clock, called a clepsydra, where the rising waters would both keep time and eventually hit a mechanical bird that triggered an alarming whistle.
Clepsydras were more useful than sundials—they could be used indoors, during the night, and also when the sky was cloudy—although they were not as accurate. Greek water clocks became more accurate around 325 B.C., and they were adapted to have a face with an hour hand, making the reading of the clock more precise and convenient.
How Time Works
In the modern calendar, we label all years with B.C. (before Christ) or A.D. (anno domini, or "in the year of our lord"). There is no "zero" year -- in this system, the year Christ was born is 1 A.D., and the year preceding it is 1 B.C.
This practice was first suggested in the sixth century A.D., and was adopted by the pope of that time. It took quite a while for it to become a worldwide standard, however. Russia and Turkey, for example, did not convert to the modern calendar and year scheme until the 20th century.
One interesting side note: Because of a variety of changes and adjustments made to the calendar during the middle ages, it turns out that Jesus was most likely born in what we now think of as 6 B.C., and likely lived until 30 A.D.
Besides B.C. and A.D., some people use B.C.E. (for "before common era") and C.E. (for "common era").
As for "love," the word has been used since the 1700s to mean "nothing" and is also used in a variety of other games, from racket sports to cards (including bridge and whist). But how it came to mean this is also unexplained.
One often-repeated theory traces the etymology to the French l'oeuf, meaning egg, an object the same shape as the number 0. But there is no indication the French ever used l'oeuf in relation to tennis scoring, writes American tennis player Malcolm D. Whitman in his 1932 book Tennis: Origins and Mysteries, and they didn't write scores down, so the visual association wouldn't cue the egg comparison. Gillmeister also writes that "love" is not how that type of loan word would be modified into English: Latin's bovem became the French boeuf and turned into beef in English, so l'oeuf would likely have become something sounding more like leaf if that theory had held true. Gillmeister has a different loan-word idea. Perhaps it's from the Dutch or Flemish lof, meaning honor, which would have made sense if players saw a tennis match as akin to a battle. ("Deuce" is a clearer loan word, from deux, French for "two," but the mechanism and timing of that transition are less clear.)
Or maybe it's not a loan word at all: phrases along the lines of "neither for love nor money" had already entered the lexicon, according to Gillmeister. So the idea that a person with "love" had no money could be a plausible explanation for why that became the word for having no points in a game that was a frequent subject of wagers.
What is Greenwich Mean Time?
How did local clock time in Greenwich, London change the world?
The Royal Observatory Greenwich is the home of Greenwich Mean Time (GMT). But what is GMT and why is it so important?
What does GMT mean?
Greenwich Mean Time is the yearly average (or ‘mean’) of the time each day when the Sun crosses the Prime Meridian at the Royal Observatory Greenwich.
Essentially, mean time is clock time rather than solar (astronomical) time.
Solar time varies throughout the year, as the time interval between the Sun crossing a set meridian line changes.
But each day measured by a clock has the same length, equal to the average (mean) length of a solar day. It’s a way of standardising and regularising time so we can all know exactly what time it is for our (or anyone’s) location.
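The varying gap between solar time and mean time is known as the equation of time. As a rough illustration (not from this article), a common empirical approximation can be sketched in a few lines of Python; the formula is a standard textbook fit, accurate to within a minute or so, not exact astronomy:

```python
import math

def equation_of_time_minutes(day_of_year: int) -> float:
    """Approximate solar time minus mean clock time, in minutes,
    using a widely quoted empirical formula (accuracy ~1 minute)."""
    b = math.radians(360 * (day_of_year - 81) / 365)
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

# Around early November the sundial runs roughly a quarter-hour ahead
# of the clock; in mid-February it lags by a similar amount.
print(round(equation_of_time_minutes(307), 1))  # early November: ~ +16 min
print(round(equation_of_time_minutes(46), 1))   # mid-February:   ~ -15 min
```

The swing of roughly half an hour over the year is exactly why "mean" time had to be invented: a pendulum clock cannot follow the Sun's uneven pace.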
Today GMT is reckoned from one midnight to the next.
What does GMT stand for?
GMT stands for Greenwich Mean Time, the local clock time at Greenwich. From 1884 until 1972, GMT was the international standard of civil time. Though it has now been replaced by Coordinated Universal Time (UTC), GMT is still the legal time in Britain in the winter, used by the Met Office, Royal Navy and BBC World Service. Greenwich Mean Time is also the name of the time zone used by some countries in Africa and Western Europe, including Iceland, where it is used all year round.
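GMT's role as Britain's winter time can be seen directly by inspecting the Europe/London zone with Python's standard zoneinfo module (this sketch assumes the IANA tz database is available on the system, which it is on most Linux and macOS installs):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

london = ZoneInfo("Europe/London")

winter = datetime(2023, 1, 15, 12, 0, tzinfo=london)
summer = datetime(2023, 7, 15, 12, 0, tzinfo=london)

# In winter London keeps GMT (UTC+0); in summer it switches to BST (UTC+1).
print(winter.tzname(), winter.utcoffset())
print(summer.tzname(), summer.utcoffset())
```

The zone abbreviations GMT and BST come from the tz database itself, reflecting exactly the legal arrangement described above.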
How did Greenwich Mean Time begin?
It wasn’t until the invention of the pendulum clock in the 1650s that it was possible to work out the relationship between mean (clock) time and solar time.
John Flamsteed came up with the formula for converting solar time to mean time, and published a set of conversion tables in the early 1670s. Soon after, he was appointed as the first Astronomer Royal and moved into the new Royal Observatory in Greenwich.
Here he had the best pendulum clocks installed and set them to the local time. This was Greenwich Mean Time, or the average time when the Sun crossed the meridian at Greenwich. At first though, Greenwich time was only really important to astronomers.
GMT and the quest for longitude
In the 1700s, the fifth Astronomer Royal Nevil Maskelyne brought Greenwich Mean Time to a wider audience.
In 1767 Maskelyne introduced the Nautical Almanac as part of the great 18th century quest to determine longitude.
These were tables of ‘lunar distance’ data based on observations at Greenwich and using GMT as the time standard. This data enabled navigators to find their position at sea.
GMT was also crucial to the other great solution to the ‘longitude problem’, represented by John Harrison’s famous timekeepers.
British mariners started keeping at least one chronometer set to GMT. This meant they could calculate their longitude from the Greenwich meridian (longitude 0° by convention).
These two solutions would help pave the way for GMT to become the worldwide time standard a century later.
How did railways lead to GMT becoming the UK time standard?
Until the mid-19th century, almost every town kept its own local time, defined by the Sun. There were no national or international conventions which set how time should be measured.
This meant there were no standard timings for when the day would begin and end, or what length an hour might be. As well as Greenwich Mean Time, for example, there were also Bristol Mean Time (10 minutes behind GMT) and Cardiff Mean Time (13 minutes behind GMT).
However, the 1850s and 1860s saw the expansion of the railway and communications networks, and the need for a national time standard became imperative.
British railway companies started introducing a single standard time across their networks, designed to make their timetables less confusing, and it was mostly Greenwich Mean Time that they used. GMT was adopted across Great Britain by the Railway Clearing House in December 1847, officially becoming known as 'Railway Time'.
By the mid-1850s, almost all public clocks in Britain were set to Greenwich Mean Time and it finally became Britain’s legal standard time in 1880.
How did Greenwich Mean Time become the international standard?
In 1884 the Greenwich Meridian was recommended as the Prime Meridian of the World.
There were two main reasons for this. The first was that the USA had already chosen Greenwich as the basis for its own national time zone system. The second was that in the late 19th century, 72% of the world's commerce depended on sea-charts which used Greenwich as the Prime Meridian.
The recommendation was based on the argument that naming Greenwich as Longitude 0º would be of advantage to the largest number of people.
As the reference for GMT, the Prime Meridian at Greenwich therefore became the centre of world time and the basis for the global system of time zones.
The Airy Transit Circle, a telescope designed by Astronomer Royal George Biddell Airy and located at the Royal Observatory Greenwich, became the instrument defining the Prime Meridian of the World.
It was recommended that the meridian line through this telescope indicate 0° longitude, and this meridian therefore also became the start of the Universal Day. The line is marked by the cross-hairs in the Airy Transit Circle's eyepiece.
The first clock to show GMT to the public
The Shepherd gate clock can be seen at the gates to the Royal Observatory. It was the first clock ever to show Greenwich Mean Time directly to the public. It is a 'slave' clock, connected to the Shepherd master clock which was installed at the Royal Observatory in 1852.
From that time until 1893, the Shepherd master clock was the heart of Britain's time system. Its time was sent by telegraph wires to London, Edinburgh, Glasgow, Dublin, Belfast and many other cities. By 1866, time signals were also sent from the clock to Harvard University in Cambridge, Massachusetts via the new transatlantic submarine cable.
In terms of the distribution of accurate time into everyday life, it is one of the most important clocks ever made.
The first thing you notice about the clock is that it has 24 hours on its face rather than the usual 12. That means at 12 noon the hour hand is pointing straight down rather than straight up.
The clock originally indicated astronomical time, in which the counting of the 24 hours of each day starts at noon. The clock was changed in the 20th century to indicate Greenwich Mean Time, in which the counting of the 24 hours of each day starts at midnight. It continues to show Greenwich Mean Time and is not adjusted for British Summer Time.
Thrift wasn’t the only reason for saving daylight
In 1895, George Hudson, an entomologist from New Zealand, came up with the modern concept of daylight saving time. He proposed a two-hour time shift so he’d have more after-work hours of sunshine to go bug hunting in the summer.
Seven years later, British builder William Willett (the great-great grandfather of Coldplay frontman Chris Martin) independently hit on the idea while out horseback riding. He proposed it to England’s Parliament as a way to prevent the nation from wasting daylight. His idea was championed by Winston Churchill and Sir Arthur Conan Doyle—but was initially rejected by the British government. Willett kept arguing for the concept up until his death in 1915.
In 1916, two years into World War I, the German government started brainstorming ways to save energy.
“They remembered Willett’s idea of moving the clock forward and thus having more daylight during working hours,” explains David Prerau, author of Seize the Daylight: The Curious and Contentious Story of Daylight Saving Time. “While the British were talking about it year after year, the Germans decided to do it more or less by fiat.”
Soon, England and almost every other country that fought in World War I followed suit. So did the United States: On March 9, 1918, Congress enacted its first daylight saving law—and it was a two-fer: In addition to saving daylight, the Standard Time Act defined time zones in the U.S.
In those days, coal power was king, so people really did save energy (and thus contribute to the war effort) by changing their clocks.
Angles and ancient astronomy
In the 24th century B.C., the Sumerians were conquered by the Akkadians, who then fell to the Amorites, who rose to power and built the nation-state of Babylon, which peaked in the 18th century B.C. The Babylonians invented the degree and defined a circle as having 360 degrees. There are a couple of theories as to why they chose 360:
- The Babylonians understood a year as having close to 360 days; hence the Sun "moves" along the ecliptic approximately 1 degree per day.
- The side of a hexagon inscribed in a circle equals the circle's radius, dividing the circle into six equilateral triangles, so a sixth of a circle forms a natural angle measure. In the numerals inherited from the Sumerians, a number's sexagesimal value was inferred from context, so six was "spelled" the same way as 360.
Babylonian astronomers began cataloging stars in the 14th century B.C. Astronomy flourished as they developed a deep understanding of sun and moon cycles, and even predicted eclipses. Babylonian star catalogs served as the basis of astronomy for more than a thousand years despite the boom and bust of the Middle Assyrian Empire, the Neo-Assyrian Empire, the Neo-Babylonian Empire and the Achaemenid Empire.
The myth of the eight-hour sleep
We often worry about lying awake in the middle of the night - but it could be good for you. A growing body of evidence from both science and history suggests that the eight-hour sleep may be unnatural.
In the early 1990s, psychiatrist Thomas Wehr conducted an experiment in which a group of people were plunged into darkness for 14 hours every day for a month.
It took some time for their sleep to regulate but by the fourth week the subjects had settled into a very distinct sleeping pattern. They slept first for four hours, then woke for one or two hours before falling into a second four-hour sleep.
Though sleep scientists were impressed by the study, among the general public the idea that we must sleep for eight consecutive hours persists.
In 2001, historian Roger Ekirch of Virginia Tech published a seminal paper, drawn from 16 years of research, revealing a wealth of historical evidence that humans used to sleep in two distinct chunks.
His book At Day's Close: Night in Times Past, published four years later, unearths more than 500 references to a segmented sleeping pattern - in diaries, court records, medical books and literature, from Homer's Odyssey to an anthropological account of modern tribes in Nigeria.
Much like the experience of Wehr's subjects, these references describe a first sleep which began about two hours after dusk, followed by waking period of one or two hours and then a second sleep.
"It's not just the number of references - it is the way they refer to it, as if it was common knowledge," Ekirch says.
During this waking period people were quite active. They often got up, went to the toilet or smoked tobacco, and some even visited neighbours. Most, though, stayed in bed, where they read, wrote and often prayed. Countless prayer manuals from the late 15th Century offered special prayers for the hours in between sleeps.
And these hours weren't entirely solitary - people often chatted to bed-fellows or had sex.
A doctor's manual from 16th Century France even advised couples that the best time to conceive was not at the end of a long day's labour but "after the first sleep", when "they have more enjoyment" and "do it better".
Ekirch found that references to the first and second sleep started to disappear during the late 17th Century. This started among the urban upper classes in northern Europe and over the course of the next 200 years filtered down to the rest of Western society.
By the 1920s the idea of a first and second sleep had receded entirely from our social consciousness.
He attributes the initial shift to improvements in street lighting, domestic lighting and a surge in coffee houses - which were sometimes open all night. As the night became a place for legitimate activity and as that activity increased, the length of time people could dedicate to rest dwindled.
When segmented sleep was the norm
- "He knew this, even in the horror with which he started from his first sleep, and threw up the window to dispel it by the presence of some object, beyond the room, which had not been, as it were, the witness of his dream." Charles Dickens, Barnaby Rudge (1840)
- "Don Quixote followed nature, and being satisfied with his first sleep, did not solicit more. As for Sancho, he never wanted a second, for the first lasted him from night to morning." Miguel Cervantes, Don Quixote (1615)
- "And at the wakening of your first sleepe You shall have a hott drinke made, And at the wakening of your next sleepe Your sorrowes will have a slake." Early English ballad, Old Robin of Portingale
- The Tiv tribe in Nigeria employ the terms "first sleep" and "second sleep" to refer to specific periods of the night
In his new book, Evening's Empire, historian Craig Koslofsky puts forward an account of how this happened.
"Associations with night before the 17th Century were not good," he says. The night was a place populated by people of disrepute - criminals, prostitutes and drunks.
"Even the wealthy, who could afford candlelight, had better things to spend their money on. There was no prestige or social value associated with staying up all night."
That changed in the wake of the Reformation and the counter-Reformation. Protestants and Catholics became accustomed to holding secret services at night, during periods of persecution. If earlier the night had belonged to reprobates, now respectable people became accustomed to exploiting the hours of darkness.
This trend migrated to the social sphere too, but only for those who could afford to live by candlelight. With the advent of street lighting, however, socialising at night began to filter down through the classes.
In 1667, Paris became the first city in the world to light its streets, using wax candles in glass lamps. It was followed by Lille in the same year and Amsterdam two years later, where a much more efficient oil-powered lamp was developed.
London didn't join their ranks until 1684 but by the end of the century, more than 50 of Europe's major towns and cities were lit at night.
Night became fashionable and spending hours lying in bed was considered a waste of time.
"People were becoming increasingly time-conscious and sensitive to efficiency, certainly before the 19th Century," says Roger Ekirch. "But the industrial revolution intensified that attitude by leaps and bounds."
Strong evidence of this shifting attitude is contained in a medical journal from 1829 which urged parents to force their children out of a pattern of first and second sleep.
"If no disease or accident there intervene, they will need no further repose than that obtained in their first sleep, which custom will have caused to terminate by itself just at the usual hour.
"And then, if they turn upon their ear to take a second nap, they will be taught to look upon it as an intemperance not at all redounding to their credit."
Today, most people seem to have adapted quite well to the eight-hour sleep, but Ekirch believes many sleeping problems may have roots in the human body's natural preference for segmented sleep as well as the ubiquity of artificial light.
This could be the root of a condition called sleep maintenance insomnia, where people wake during the night and have trouble getting back to sleep, he suggests.
The condition first appears in literature at the end of the 19th Century, at the same time as accounts of segmented sleep disappear.
"For most of evolution we slept a certain way," says sleep psychologist Gregg Jacobs. "Waking up during the night is part of normal human physiology."
The idea that we must sleep in a consolidated block could be damaging, he says, if it makes people who wake up at night anxious, as this anxiety can itself prevent sleep and is likely to seep into waking life too.
Russell Foster, a professor of circadian [body clock] neuroscience at Oxford, shares this point of view.
"Many people wake up at night and panic," he says. "I tell them that what they are experiencing is a throwback to the bi-modal sleep pattern."
But the majority of doctors still fail to acknowledge that a consolidated eight-hour sleep may be unnatural.
"Over 30% of the medical problems that doctors are faced with stem directly or indirectly from sleep. But sleep has been ignored in medical training and there are very few centres where sleep is studied," he says.
Jacobs suggests that the waking period between sleeps, when people were forced into periods of rest and relaxation, could have played an important part in the human capacity to regulate stress naturally.
In many historic accounts, Ekirch found that people used the time to meditate on their dreams.
"Today we spend less time doing those things," says Dr Jacobs. "It's not a coincidence that, in modern life, the number of people who report anxiety, stress, depression, alcoholism and drug abuse has gone up."
So the next time you wake up in the middle of the night, think of your pre-industrial ancestors and relax. Lying awake could be good for you.
Craig Koslofsky and Russell Foster appeared on The Forum from the BBC World Service.
Why is a minute divided into 60 seconds, an hour into 60 minutes, yet there are only 24 hours in a day?
In today's world, the most widely used numeral system is decimal (base 10), a system that probably originated because it made it easy for humans to count using their fingers. The civilizations that first divided the day into smaller parts, however, used different numeral systems, specifically duodecimal (base 12) and sexagesimal (base 60).
Thanks to documented evidence of the Egyptians' use of sundials, most historians credit them with being the first civilization to divide the day into smaller parts. The first sundials were simply stakes placed in the ground that indicated time by the length and direction of the resulting shadow. As early as 1500 B.C., the Egyptians had developed a more advanced sundial. A T-shaped bar placed in the ground, this instrument was calibrated to divide the interval between sunrise and sunset into 12 parts. This division reflected Egypt's use of the duodecimal system--the importance of the number 12 is typically attributed either to the fact that it equals the number of lunar cycles in a year or the number of finger joints on each hand (three in each of the four fingers, excluding the thumb), making it possible to count to 12 with the thumb. The next-generation sundial likely formed the first representation of what we now call the hour. Although the hours within a given day were approximately equal, their lengths varied during the year, with summer hours being much longer than winter hours.
Without artificial light, humans of this time period regarded sunlit and dark periods as two opposing realms rather than as part of the same day. Without the aid of sundials, dividing the dark interval between sunset and sunrise was more complex than dividing the sunlit period. During the era when sundials were first used, however, Egyptian astronomers also first observed a set of 36 stars that divided the circle of the heavens into equal parts. The passage of night could be marked by the appearance of 18 of these stars, three of which were assigned to each of the two twilight periods when the stars were difficult to view. The period of total darkness was marked by the remaining 12 stars, again resulting in 12 divisions of night (another nod to the duodecimal system). During the New Kingdom (1550 to 1070 B.C.), this measuring system was simplified to use a set of 24 stars, 12 of which marked the passage of the night. The clepsydra, or water clock, was also used to record time during the night, and was perhaps the most accurate timekeeping device of the ancient world. The timepiece--a specimen of which, found at the Temple of Ammon in Karnak, dated back to 1400 B.C.--was a vessel with slanted interior surfaces to allow for decreasing water pressure, inscribed with scales that marked the division of the night into 12 parts during various months.
Once both the light and dark hours were divided into 12 parts, the concept of a 24-hour day was in place. The concept of fixed-length hours, however, did not originate until the Hellenistic period, when Greek astronomers began using such a system for their theoretical calculations. Hipparchus, whose work primarily took place between 147 and 127 B.C., proposed dividing the day into 24 equinoctial hours, based on the 12 hours of daylight and 12 hours of darkness observed on equinox days. Despite this suggestion, laypeople continued to use seasonally varying hours for many centuries. (Hours of fixed length became commonplace only after mechanical clocks first appeared in Europe during the 14th century.)
Hipparchus and other Greek astronomers employed astronomical techniques that were previously developed by the Babylonians, who resided in Mesopotamia. The Babylonians made astronomical calculations in the sexagesimal (base 60) system they inherited from the Sumerians, who developed it around 2000 B.C. Although it is unknown why 60 was chosen, it is notably convenient for expressing fractions, since 60 is the smallest number divisible by the first six counting numbers as well as by 10, 12, 15, 20 and 30.
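That divisibility claim is easy to check mechanically. A short sketch (the function and variable names are my own, for illustration):

```python
def divisible_by_all(n: int, divisors) -> bool:
    """True if n is divisible by every number in divisors."""
    return all(n % d == 0 for d in divisors)

# 60 should divide evenly by the first six counting numbers
# as well as by 10, 12, 15, 20 and 30 -- and be the smallest such number.
targets = [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30]
smallest = next(n for n in range(1, 1000) if divisible_by_all(n, targets))
print(smallest)  # -> 60
```

In other words, 60 is the least common multiple of 1 through 6, which is what made sexagesimal fractions so convenient for Babylonian computation.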
Although it is no longer used for general computation, the sexagesimal system is still used to measure angles, geographic coordinates and time. In fact, both the circular face of a clock and the sphere of a globe owe their divisions to a 4,000-year-old numeric system of the Babylonians.
The Greek astronomer Eratosthenes (who lived circa 276 to 194 B.C.) used a sexagesimal system to divide a circle into 60 parts in order to devise an early geographic system of latitude, with the horizontal lines running through well-known places on the earth at the time. A century later, Hipparchus normalized the lines of latitude, making them parallel and obedient to the earth's geometry. He also devised a system of longitude lines that encompassed 360 degrees and that ran north to south, from pole to pole. In his treatise Almagest (circa A.D. 150), Claudius Ptolemy explained and expanded on Hipparchus' work by subdividing each of the 360 degrees of latitude and longitude into smaller segments. Each degree was divided into 60 parts, each of which was again subdivided into 60 smaller parts. The first division, partes minutae primae, or first minute, became known simply as the "minute." The second segmentation, partes minutae secundae, or "second minute," became known as the second.
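Ptolemy's degree-minute-second subdivision is exactly how angles and coordinates are still broken down today. A small illustrative sketch of the conversion (the function name is my own; the sample coordinate is approximately the latitude of the Royal Observatory Greenwich):

```python
def to_dms(angle: float):
    """Split a decimal angle into Ptolemy's subdivisions:
    degrees, partes minutae primae (minutes) and
    partes minutae secundae (seconds)."""
    degrees = int(angle)
    remainder = (angle - degrees) * 60   # first division into 60 parts
    minutes = int(remainder)
    seconds = (remainder - minutes) * 60  # second division into 60 parts
    return degrees, minutes, round(seconds, 2)

# Approximate latitude of the Royal Observatory Greenwich.
print(to_dms(51.4779))
```

Each step simply peels off another base-60 digit, which is why the same routine works unchanged for hours, minutes and seconds of time.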
Minutes and seconds, however, were not used for everyday timekeeping until many centuries after the Almagest. Clock displays divided the hour into halves, thirds, quarters and sometimes even 12 parts, but never by 60. In fact, the hour was not commonly understood to be the duration of 60 minutes. It was not practical for the general public to consider minutes until the first mechanical clocks that displayed minutes appeared near the end of the 16th century. Even today, many clocks and wristwatches have a resolution of only one minute and do not display seconds.
Thanks to the ancient civilizations that defined and preserved the divisions of time, modern society still conceives of a day of 24 hours, an hour of 60 minutes and a minute of 60 seconds. Advances in the science of timekeeping, however, have changed how these units are defined. Seconds were once derived by dividing astronomical events into smaller parts, with the International System of Units (SI) at one time defining the second as a fraction of the mean solar day and later relating it to the tropical year. This changed in 1967, when the second was redefined as the duration of 9,192,631,770 energy transitions of the cesium atom. This recharacterization ushered in the era of atomic timekeeping and Coordinated Universal Time (UTC).
Interestingly, in order to keep atomic time in agreement with astronomical time, leap seconds occasionally must be added to UTC. Thus, not all minutes contain 60 seconds. A few rare minutes, occurring at a rate of about eight per decade, actually contain 61.